Discussion: View Thread

  • 1.  Model Validation

    Posted 15 days ago

    The recently published ASCE Manual of Practice 156, Navigation Channel Sedimentation Solutions, provides the following definitions:

    Calibration: Establishment of a one-to-one correspondence between the output of a measuring instrument's sensor, such as a turbidity sensor, and the desired unit of measurement, such as NTU.

    Validation: The process of ensuring that a model satisfactorily reproduces observed data (e.g., velocities, sediment deposition and erosion). Typically involves adjusting the model to fit several observed data sets representing a variety of flows and sedimentation processes.

    Verification: The process of ensuring that a Numerical Model Program solves its equations correctly.

    These definitions exclude the two-step validation terminology used by some modelers and limit the term "calibration" to meters. When the two-step calibration-verification terminology came into use in the 1970s, it was opposed by the U.S. Army Engineer Waterways Experiment Station (WES) Hydraulics Laboratory, which had used a different terminology since the 1920s. In that lexicon, instruments such as flow meters were calibrated but models were verified and validated. WES asserted that the validation process should be iterative among all available data sets until the best overall agreement with field observations was achieved. Multiple data sets under a wide range of conditions were strongly encouraged, since many waterways exhibit characteristics unique to the circumstances. As just one simple example, boundary roughness coefficients often change with flow conditions.
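    The closing point – iterating across multiple data sets until the best overall agreement is achieved – can be made concrete with a small, hypothetical sketch. It assumes Manning's equation for a wide channel and purely illustrative numbers, and searches for the single roughness coefficient giving the best overall fit across several flow conditions rather than tuning to any one data set:

    ```python
    # Hypothetical sketch of iterative, multi-dataset validation:
    # find the single Manning's n giving the best *overall* agreement
    # across several observed flow conditions. Numbers are illustrative.

    def simulated_velocity(n, slope, depth):
        """Manning's equation for a wide channel (R ~ depth), SI units."""
        return (1.0 / n) * depth ** (2.0 / 3.0) * slope ** 0.5

    # (slope, depth, observed velocity) for three flow conditions
    observations = [
        (0.0005, 2.0, 1.10),   # low flow
        (0.0005, 4.0, 1.85),   # medium flow
        (0.0005, 6.0, 2.30),   # high flow
    ]

    def total_error(n):
        """Sum of squared velocity errors over all data sets."""
        return sum((simulated_velocity(n, s, d) - v) ** 2
                   for s, d, v in observations)

    # brute-force search over plausible Manning's n values (0.020 .. 0.050)
    candidates = [0.020 + 0.001 * i for i in range(31)]
    best_n = min(candidates, key=total_error)
    print(f"best overall n = {best_n:.3f}")
    ```

    In this toy setup no single n fits all three flows exactly, which is the point: the roughness that best matches low flow is not the one that best matches high flow, so the modeler must settle for the best overall compromise.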

    Your thoughts?

    William McAnally Ph.D., P.E., BC.CE, BC.NE, F.ASCE
    Columbus MS

  • 2.  RE: Model Validation

    Posted 12 days ago

    Hey Bill,

    It does seem important that, as a community, we continue to hold the same definitions of certain important words and not let their meaning, or the expectations of the work attached to them, begin to slip. 

    In my own experience, I can attest that I still see flow monitors being calibrated during the data collection process by means of field measurements. 

    Subsequently, model validation continues to be an iterative process against this data. 

    I am less well informed on how we use the word verification. It feels somewhat circular, and points back to our first word - calibration. I've mostly heard it used when someone describes the size of storm events a given model was validated against, usually to point out that the model may not be trustworthy when used to predict the response to a storm event of a drastically different size. 

    Christopher Seigel P.E., M.ASCE
    Civil Engineer

  • 3.  RE: Model Validation

    Posted 11 days ago
    Edited by Tirza Austin 8 days ago

    Hey Bill,

    Just adding to the discussion: my background is primarily 1D/2D hydraulic flood modeling, so terms might be used a little differently. We used similar definitions for a document we worked on in the EWRI Computational Hydraulics Technical Committee; the document is still in the works, but we had an interesting discussion on the topic.

    What I typically use: calibration is comparing modeled results to observed data and then adjusting model parameters to better match the observed data. Validation is comparing model results to observed data (using a dataset independent of the calibration) without adjusting any model parameters - an independent "check" of the calibration. Verification is running the model against standard datasets (e.g., the UK 2D hydraulic benchmark dataset) to check that the governing equations are being solved correctly and that any modeling methods implemented are valid. Typically this is done by the model developer, not the model user. The standard datasets are specifically designed to test various components of the model's governing equations and solver methods (e.g., flood wave propagation, low-depth stability over adverse slopes, dam-failure numerical shock, 1D/2D linking, etc.).
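    The calibration/validation split described above can be sketched in a few lines. This is a hypothetical toy, not a real hydraulic model: a placeholder linear "model" is tuned against one data set, then checked against an independent data set with no further adjustment.

    ```python
    # Hypothetical sketch of the calibration/validation split: tune a
    # parameter against one data set, then check it against an
    # independent data set without further adjustment. The "model" here
    # is a stand-in, not a real hydraulic model.
    import math

    def model(param, forcing):
        return param * forcing  # placeholder for a hydraulic model run

    def rmse(param, data):
        """Root-mean-square error of the model against (forcing, observed) pairs."""
        return math.sqrt(sum((model(param, f) - o) ** 2 for f, o in data) / len(data))

    calibration_data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # (forcing, observed)
    validation_data  = [(1.5, 3.0), (2.5, 5.1)]              # independent check

    # Calibration: adjust the parameter to best match the observed data.
    candidates = [1.8 + 0.01 * i for i in range(41)]          # 1.80 .. 2.20
    calibrated = min(candidates, key=lambda p: rmse(p, calibration_data))

    # Validation: no adjustment allowed -- just report the fit.
    print(f"calibrated param = {calibrated:.2f}, "
          f"validation RMSE = {rmse(calibrated, validation_data):.3f}")
    ```

    If the validation RMSE came out much worse than the calibration fit, that would flag overfitting to the calibration data set - which is exactly why the check must use an independent data set.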

    I often see verification and validation used interchangeably in practice. I am not the engineering dictionary when it comes to these terms; this is just what I use and notice others using, and I thought I would share.


    Chad Ballard P.E., M.ASCE
    Project Manager
    Flower Mound TX

  • 4.  RE: Model Validation

    Posted 8 days ago

    Bill, I have not seen the cited ASCE manual; I couldn't access it. You have pointed out something very important that is not limited to model simulation of navigation channel sedimentation but extends to other areas of water modeling as well. As far as I know, the Verification & Validation (V&V, as it is commonly known) terminology dates back to the 1998 AIAA definitions.

    • As proposed and outlined in Water Modeling, these different terms and definitions all belong to the Model-Reality Conformity Assurance Processes – the MRCAP envelope. The AIAA definitions have been used in many notable subsequent publications, e.g., Oberkampf et al. 2002 and ASME 2020, including the 2024 NAP Publication #27747.

    • The V&V way of describing MRCAP does not include any exclusive mention or discussion of 'calibration'. I am surprised to learn that the ASCE manual followed the V&V suite. Surprised because, in my judgment, computational water modeling – waves, hydrodynamics, sediment transport and morphology, and their dynamic coupling – is fundamentally different from other fields. This becomes clear if one examines the governing equations, which embed the reactive forces of solid-boundary resistance and of turbulence and eddies. These reactive forces are parameterized in the mathematical model and call for tuning and fine-tuning through calibration exercises to achieve an acceptable level of model performance.

    • But it is reasonable to assume that the term 'validation' implicitly includes 'calibration'. The reason is that both belong to the Computational Model ↔ Reality phase of MRCAP. The two-way arrow indicates an iterative MRCAP, and iteration continues until model performance reaches the desired confidence level of equilibrium. Your quote of the ASCE manual suggests the same.

    • As for the ASCE use of the term calibration exclusively for measurements, here is what I would like to add, again from what is in Water Modeling: models are a tool for replicating a certain physical reality. As with developing any tool – and before being assured as a validated product – they need experimentation, refinement, and calibration to satisfy the governing laws and equations. Yet models are a soft tool that carries uncertainties of different sorts. The same is true of any physical reality, the quantitative nature of which can only be understood through measurements or sampling (more in Uncertainty and Risk). So, as you see, the term 'calibration' applies to both measuring and computational tools.

    • In the V&V terminology, the verification definition represents the Mathematical Model ↔ Computational Model phase of MRCAP. But in computational water-modeling practice, the modeler uses the term in a different sense than the AIAA framework describes. To avoid confusion, let us term it 'Model Results Verification'. This step is an essential confidence-building exercise of MRCAP, done by simulating a scenario that differs in time and space from the calibration scenarios. Your quote of the ASCE definition basically underlines the same.

    • Finally, in the context of navigation channel sedimentation, here is what I would like to add, following Harbor Sedimentation and Managing Coastal Inlets. These channels, by their very nature, are dominated by different populations of sediment texture. For example, close to high-energy wave and current environments, sand particles dominate; as one moves farther from such energy sources, the dominance shifts from suspended particulates to aggregated fines. What does this mean for model validation? It means that a dynamically coupled model needs validation statements saying, for example, that it is valid for such-and-such areas. Such a statement may become necessary when one takes account of the totality of MRCAP: Reality ↔ Mathematical Model ↔ Computational Model ↔ Reality.



    Dr. Dilip K Barua, Ph.D
