Thanks for the thoughtful comments, Chad. Your approach is consistent with that of some others in modeling and with Sam Wang's committee-written book, Verification and Validation of 3D Free-Surface Flow Models. I served on that committee (though not as an author of the book) and thought the other members had been convinced to abandon the calibration-validation two-step terminology. Either I was mistaken, or they later changed their minds.
I believe the two-step mindset leads to inferior results. First, if the second observed data set doesn't represent significantly different boundary conditions (discharge, etc.), the second comparison isn't meaningful. Second, if it does represent different boundary conditions and the comparison is unsatisfactory (cases where agreement is still good are rare), what's the next step? Most modelers adjust parameters to obtain better agreement with data set 2 and then use the first observation set as the test case, iterating among the available observed data sets until they get the best overall agreement, as described in MOP 156.
My experience in physical and numerical modeling has been that good practice consists of first adjusting so that water levels are correct, next adjusting to achieve good velocity reproduction and back-checking to ensure that water levels are still okay, then adjusting for transport and back-checking again, all iteratively. That process must be repeated for each observed data set, followed by a back-check of prior validation data sets. The reality is that two observed data sets are never enough, but sometimes client budgets or schedules limit us to only one. Good modelers like you recognize the need for an iterative approach, but inexperienced modelers hear the two-step terminology and fall into the trap of practicing it. I encounter that mistake rather too often.
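To make the iteration concrete, here is a minimal sketch of that adjust-and-back-check loop in Python. It is an illustration only: run_model, adjust, the variable names, and the tolerances are hypothetical placeholders, not any agency's procedure or any particular package's interface.

    def rmse(modeled, observed):
        """Root-mean-square error between paired modeled and observed values."""
        return (sum((m - o) ** 2 for m, o in zip(modeled, observed)) / len(observed)) ** 0.5

    def iterative_validation(run_model, adjust, datasets, tolerances, max_rounds=10):
        """Adjust for stage, then velocity, then transport, back-checking every
        quantity and every observed data set before accepting one parameter set."""
        params = {}
        for _ in range(max_rounds):
            for obs in datasets:
                # Order matters: water levels first, then velocities, then transport.
                for variable in ("stage", "velocity", "deposition"):
                    params = adjust(params, variable, obs)
            # Back-check: a single parameter set must satisfy every data set.
            if all(
                rmse(run_model(params, obs)[v], obs[v]) <= tolerances[v]
                for obs in datasets
                for v in tolerances
            ):
                return params  # best overall agreement across all data sets
        raise RuntimeError("Tolerances not met; revisit the model setup or collect more data.")

The point of the structure is that no single data set is ever "finished" in isolation; acceptance requires the same parameter set to satisfy all available observations.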
Original Message:
Sent: 07-12-2024 11:16 AM
From: Chad Ballard
Subject: Model Validation
Hey Bill,
Just adding to the discussion, my background is primarily 1D/2D hydraulic flood modeling, so terms might be used a little differently. We used similar definitions for a document we worked on in the EWRI Computational Hydraulics Technical Committee; the document is still in the works, but we had an interesting discussion on the topic.
What I typically use is calibration when I compare modeled results to observed data and then adjust model parameters to better match the observed data. Validation is comparing model results to observed data (using a dataset independent of the calibration) without adjusting any model parameters, an independent "check" of the calibration. Verification is running the model against standard datasets (e.g., the UK 2D Hydraulic Benchmark dataset) to check that the governing equations are being solved correctly and that any modeling methods implemented are valid. Typically this is done by the model developer and not the model user. The standard datasets are specifically designed to test various components of the model governing equations and solver methods (e.g., flood wave propagation, low-depth stability over adverse slopes, dam-failure numerical shock, 1D/2D linking, etc.).
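To put numbers on that independent check, a hold-out comparison boils down to agreement statistics computed without touching any parameters. The sketch below is only an illustration with made-up values; it is not taken from any guidance document.

    def nash_sutcliffe(modeled, observed):
        """Nash-Sutcliffe efficiency: 1.0 is perfect; 0.0 is no better than the observed mean."""
        mean_obs = sum(observed) / len(observed)
        num = sum((o - m) ** 2 for o, m in zip(observed, modeled))
        den = sum((o - mean_obs) ** 2 for o in observed)
        return 1.0 - num / den

    observed_stage = [2.10, 2.45, 3.02, 2.88, 2.31]  # hypothetical gauge readings, m
    modeled_stage = [2.05, 2.52, 2.95, 2.97, 2.20]   # model output at the same times
    print(f"NSE = {nash_sutcliffe(modeled_stage, observed_stage):.2f}")  # about 0.95 for these values

If the statistics are acceptable, the calibration stands; if not, any re-adjustment means the hold-out set is no longer an independent check.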
I often see verification and validation used interchangeably in practice. I am not the engineering dictionary when it comes to these terms; this is just what I use and notice others using, and I thought I would share.
Chad
------------------------------
Chad Ballard P.E., M.ASCE
Project Manager
Flower Mound TX
Original Message:
Sent: 07-11-2024 03:59 PM
From: Christopher Seigel
Subject: Model Validation
Hey Bill,
It does seem important that, as a community, we all retain the same definitions of certain important words and not let their meanings, and the expectations of the work attached to them, begin to slip.
In my own experience, I can attest that I still see flow monitors being calibrated during the data collection process by means of field measurements.
Subsequently, model validation continues to be an iterative process against this data.
I am less well informed on how we use the word verification. It feels somewhat circular and points back to our first word, calibration. I've mostly heard it used when someone is describing what size storm events a given model was validated against, usually with the intent of pointing out that the model may not be trustworthy when used to predict the response to a storm event of a drastically different size.
------------------------------
Christopher Seigel P.E., M.ASCE
Civil Engineer
Original Message:
Sent: 07-05-2024 12:48 PM
From: William McAnally
Subject: Model Validation
The recently published ASCE Manual of Practice 156, Navigation Channel Sedimentation Solutions, provides the following definitions:
Calibration: Establishment of a one-to-one correspondence between the output of a measuring instrument's sensor, such as a turbidity sensor, and the desired unit of measurement, such as NTU.
Validation: The process of ensuring that a model satisfactorily reproduces observed data (e.g., velocities, sediment deposition and erosion). Typically involves adjusting the model to fit several observed data sets representing a variety of flows and sedimentation processes.
Verification: The process of ensuring that a Numerical Model Program solves its equations correctly.
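As a hypothetical illustration of the calibration definition above (not an example from MOP 156), establishing that one-to-one correspondence for a turbidity sensor usually means fitting its raw output against standards of known NTU, for example with an ordinary least-squares line:

    # Hypothetical raw sensor counts read against formazin standards of known NTU.
    raw_counts = [120, 480, 950, 1890]
    standard_ntu = [5.0, 20.0, 40.0, 80.0]

    n = len(raw_counts)
    mean_x = sum(raw_counts) / n
    mean_y = sum(standard_ntu) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(raw_counts, standard_ntu))
             / sum((x - mean_x) ** 2 for x in raw_counts))
    intercept = mean_y - slope * mean_x

    def counts_to_ntu(counts):
        """Apply the fitted calibration: raw sensor counts -> NTU."""
        return slope * counts + intercept

    print(f"NTU at 700 counts: {counts_to_ntu(700):.1f}")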
These definitions exclude the two-step validation terminology used by some modelers and limit the term "calibration" to meters. When the two-step calibration-verification terminology came into use in the 1970s, it was opposed by the U.S. Army Engineer Waterways Experiment Station (WES) Hydraulics Laboratory, which had used a different terminology since the 1920s. In that lexicon, instruments such as flow meters were calibrated but models were verified and validated. WES asserted that the validation process should be iterative among all available data sets until the best overall agreement with field observations was achieved. Multiple data sets under a wide range of conditions were strongly encouraged, since many waterways exhibit characteristics unique to the circumstances. As just one simple example, boundary roughness coefficients often change with flow conditions.
Your thoughts?
------------------------------
William McAnally Ph.D., P.E., BC.CE, BC.NE, F.ASCE
ENGINEER
Columbus MS
------------------------------