Title:
Effective groundwater model calibration : with analysis of data, sensitivities, predictions, and uncertainty
Publication Information:
Hoboken, NJ : Wiley-Interscience, 2007
ISBN:
9780471776369

Available:

Item Barcode: 30000010128055
Call Number: GB1001.72.M35 H54 2007
Material Type: Open Access Book
Item Category 1: Book

On Order
Summary

Methods and guidelines for developing and using mathematical models

Turn to Effective Groundwater Model Calibration for a set of methods and guidelines that can help produce more accurate and transparent mathematical models. The models can represent groundwater flow and transport and other natural and engineered systems. Use this book and its extensive exercises to learn methods to fully exploit the data on hand, maximize the model's potential, and troubleshoot any problems that arise. Use the methods to perform:

- Sensitivity analysis to evaluate the information content of data
- Data assessment to identify (a) existing measurements that dominate model development and predictions and (b) potential measurements likely to improve the reliability of predictions
- Calibration to develop models that are consistent with the data in an optimal manner
- Uncertainty evaluation to quantify and communicate errors in simulated results that are often used to make important societal decisions

Most of the methods are based on linear and nonlinear regression theory.
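For orientation, the weighted least-squares objective function introduced in Section 3.1.1 (diagonal weight matrix) has the form S(b) = sum over i of w_i [y_i - y'_i(b)]^2, where y_i are observations, y'_i(b) are the corresponding simulated values for parameter set b, and w_i are weights that reflect observation error. The short Python sketch below is a minimal, generic illustration of that formula only; it is not part of the book or its software (MODFLOW-2000, UCODE_2005, or PEST), and the function name and the observation, simulated-value, and weight numbers are hypothetical.

import numpy as np

def weighted_least_squares(observed, simulated, weights):
    # S(b) = sum_i w_i * (y_i - y'_i(b))**2 for a diagonal weight matrix;
    # weights are typically 1 / variance of the observation error.
    residuals = np.asarray(observed, dtype=float) - np.asarray(simulated, dtype=float)
    return float(np.sum(np.asarray(weights, dtype=float) * residuals ** 2))

# Hypothetical values: three hydraulic-head observations (m), their
# simulated equivalents, and weights equal to 1 / sigma^2.
observed = [100.2, 98.7, 101.5]
simulated = [100.0, 99.1, 101.0]
weights = [1.0 / 0.01, 1.0 / 0.04, 1.0 / 0.01]
print(weighted_least_squares(observed, simulated, weights))  # 33.0

In the book's exercises, computations of this kind are carried out by the codes discussed in Chapter 2 rather than by hand-written scripts.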

Fourteen guidelines show the reader how to use the methods advantageously in practical situations.

Exercises focus on a groundwater flow system and management problem, enabling readers to apply all the methods presented in the text. The exercises can be completed using the material provided in the book, or as hands-on computer exercises using instructions and files available on the text's accompanying Web site.

Throughout the book, the authors stress the need for valid statistical concepts and easily understood presentation methods required to achieve well-tested, transparent models. Most of the examples and all of the exercises focus on simulating groundwater systems; other examples come from surface-water hydrology and geophysics.

The methods and guidelines in the text are broadly applicable and can be used by students, researchers, and engineers to simulate many kinds of systems.


Author Notes

Mary C. Hill, PhD, is Adjunct Professor at the University of Colorado at Boulder and the Colorado School of Mines.
Claire R. Tiedeman, MS, is a Research Hydrologist at the U.S. Geological Survey.


Table of Contents

Preface, p. xvii
1 Introduction, p. 1
1.1 Book and Associated Contributions: Methods, Guidelines, Exercises, Answers, Software, and PowerPoint Files, p. 1
1.2 Model Calibration with Inverse Modeling, p. 3
1.2.1 Parameterization, p. 5
1.2.2 Objective Function, p. 6
1.2.3 Utility of Inverse Modeling and Associated Methods, p. 6
1.2.4 Using the Model to Quantitatively Connect Parameters, Observations, and Predictions, p. 7
1.3 Relation of this Book to Other Ideas and Previous Works, p. 8
1.3.1 Predictive Versus Calibrated Models, p. 8
1.3.2 Previous Work, p. 8
1.4 A Few Definitions, p. 12
1.4.1 Linear and Nonlinear, p. 12
1.4.2 Precision, Accuracy, Reliability, and Uncertainty, p. 13
1.5 Advantageous Expertise and Suggested Readings, p. 14
1.6 Overview of Chapters 2 Through 15, p. 16
2 Computer Software and Groundwater Management Problem Used in the Exercises, p. 18
2.1 Computer Programs MODFLOW-2000, UCODE_2005, and PEST, p. 18
2.2 Groundwater Management Problem Used for the Exercises, p. 21
2.2.1 Purpose and Strategy, p. 23
2.2.2 Flow System Characteristics, p. 23
2.3 Exercises, p. 24
Exercise 2.1 Simulate Steady-State Heads and Perform Preparatory Steps, p. 25
3 Comparing Observed and Simulated Values Using Objective Functions, p. 26
3.1 Weighted Least-Squares Objective Function, p. 26
3.1.1 With a Diagonal Weight Matrix, p. 27
3.1.2 With a Full Weight Matrix, p. 28
3.2 Alternative Objective Functions, p. 28
3.2.1 Maximum-Likelihood Objective Function, p. 29
3.2.2 L₁ Norm Objective Function, p. 29
3.2.3 Multiobjective Function, p. 29
3.3 Requirements for Accurate Simulated Results, p. 30
3.3.1 Accurate Model, p. 30
3.3.2 Unbiased Observations and Prior Information, p. 30
3.3.3 Weighting Reflects Errors, p. 31
3.4 Additional Issues
3.4.1 Prior Information, p. 32
3.4.2 Weighting, p. 34
3.4.3 Residuals and Weighted Residuals, p. 35
3.5 Least-Squares Objective-Function Surfaces, p. 35
3.6 Exercises, p. 36
Exercise 3.1 Steady-State Parameter Definition, p. 36
Exercise 3.2 Observations for the Steady-State Problem, p. 38
Exercise 3.3 Evaluate Model Fit Using Starting Parameter Values, p. 40
4 Determining the Information that Observations Provide on Parameter Values using Fit-Independent Statistics, p. 41
4.1 Using Observations, p. 42
4.1.1 Model Construction and Parameter Definition, p. 42
4.1.2 Parameter Values, p. 43
4.2 When to Determine the Information that Observations Provide About Parameter Values, p. 44
4.3 Fit-Independent Statistics for Sensitivity Analysis, p. 46
4.3.1 Sensitivities, p. 47
4.3.2 Scaling, p. 48
4.3.3 Dimensionless Scaled Sensitivities (dss), p. 48
4.3.4 Composite Scaled Sensitivities (css), p. 50
4.3.5 Parameter Correlation Coefficients (pcc), p. 51
4.3.6 Leverage Statistics, p. 54
4.3.7 One-Percent Scaled Sensitivities, p. 54
4.4 Advantages and Limitations of Fit-Independent Statistics for Sensitivity Analysis, p. 56
4.4.1 Scaled Sensitivities, p. 56
4.4.2 Parameter Correlation Coefficients, p. 58
4.4.3 Leverage Statistics, p. 59
4.5 Exercises, p. 60
Exercise 4.1 Sensitivity Analysis for the Steady-State Model with Starting Parameter Values, p. 60
5 Estimating Parameter Values, p. 67
5.1 The Modified Gauss-Newton Gradient Method, p. 68
5.1.1 Normal Equations, p. 68
5.1.2 An Example, p. 74
5.1.3 Convergence Criteria, p. 76
5.2 Alternative Optimization Methods, p. 77
5.3 Multiobjective Optimization, p. 78
5.4 Log-Transformed Parameters, p. 78
5.5 Use of Limits on Estimated Parameter Values, p. 80
5.6 Exercises, p. 80
Exercise 5.1 Modified Gauss-Newton Method and Application to a Two-Parameter Problem, p. 80
Exercise 5.2 Estimate the Parameters of the Steady-State Model, p. 87
6 Evaluating Model Fit, p. 93
6.1 Magnitude of Residuals and Weighted Residuals, p. 93
6.2 Identify Systematic Misfit, p. 94
6.3 Measures of Overall Model Fit, p. 94
6.3.1 Objective-Function Value, p. 95
6.3.2 Calculated Error Variance and Standard Error, p. 95
6.3.3 AIC, AICc, and BIC Statistics, p. 98
6.4 Analyzing Model Fit Graphically and Related Statistics, p. 99
6.4.1 Using Graphical Analysis of Weighted Residuals to Detect Model Error, p. 100
6.4.2 Weighted Residuals Versus Weighted or Unweighted Simulated Values and Minimum, Maximum, and Average Weighted Residuals, p. 100
6.4.3 Weighted or Unweighted Observations Versus Simulated Values and Correlation Coefficient R, p. 105
6.4.4 Graphs and Maps Using Independent Variables and the Runs Statistic, p. 106
6.4.5 Normal Probability Graphs and Correlation Coefficient R_N², p. 108
6.4.6 Acceptable Deviations from Random, Normally Distributed Weighted Residuals, p. 111
6.5 Exercises, p. 113
Exercise 6.1 Statistical Measures of Overall Fit, p. 113
Exercise 6.2 Evaluate Graphs of Model Fit and Related Statistics, p. 115
7 Evaluating Estimated Parameter Values and Parameter Uncertainty, p. 124
7.1 Reevaluating Composite Scaled Sensitivities, p. 124
7.2 Using Statistics from the Parameter Variance-Covariance Matrix, p. 125
7.2.1 Five Versions of the Variance-Covariance Matrix, p. 125
7.2.2 Parameter Variances, Covariances, Standard Deviations, Coefficients of Variation, and Correlation Coefficients, p. 126
7.2.3 Relation Between Sample and Regression Statistics, p. 127
7.2.4 Statistics for Log-Transformed Parameters, p. 130
7.2.5 When to Use the Five Versions of the Parameter Variance-Covariance Matrix, p. 130
7.2.6 Some Alternate Methods: Eigenvectors, Eigenvalues, and Singular Value Decomposition, p. 132
7.3 Identifying Observations Important to Estimated Parameter Values, p. 132
7.3.1 Leverage Statistics, p. 134
7.3.2 Influence Statistics, p. 134
7.4 Uniqueness and Optimality of the Estimated Parameter Values, p. 137
7.5 Quantifying Parameter Value Uncertainty, p. 137
7.5.1 Inferential Statistics, p. 137
7.5.2 Monte Carlo Methods, p. 140
7.6 Checking Parameter Estimates Against Reasonable Values, p. 140
7.7 Testing Linearity, p. 142
7.8 Exercises, p. 145
Exercise 7.1 Parameter Statistics, p. 145
Exercise 7.2 Consider All the Different Correlation Coefficients Presented, p. 155
Exercise 7.3 Test for Linearity, p. 155
8 Evaluating Model Predictions, Data Needs, and Prediction Uncertainty, p. 158
8.1 Simulating Predictions and Prediction Sensitivities and Standard Deviations, p. 158
8.2 Using Predictions to Guide Collection of Data that Directly Characterize System Properties, p. 159
8.2.1 Prediction Scaled Sensitivities (pss), p. 160
8.2.2 Prediction Scaled Sensitivities Used in Conjunction with Composite Scaled Sensitivities, p. 162
8.2.3 Parameter Correlation Coefficients without and with Predictions, p. 162
8.2.4 Composite and Prediction Scaled Sensitivities Used with Parameter Correlation Coefficients, p. 165
8.2.5 Parameter-Prediction (ppr) Statistic, p. 166
8.3 Using Predictions to Guide Collection of Observation Data, p. 170
8.3.1 Use of Prediction, Composite, and Dimensionless Scaled Sensitivities and Parameter Correlation Coefficients, p. 170
8.3.2 Observation-Prediction (opr) Statistic, p. 171
8.3.3 Insights About the opr Statistic from Other Fit-Independent Statistics, p. 173
8.3.4 Implications for Monitoring Network Design, p. 174
8.4 Quantifying Prediction Uncertainty Using Inferential Statistics, p. 174
8.4.1 Definitions, p. 175
8.4.2 Linear Confidence and Prediction Intervals on Predictions, p. 176
8.4.3 Nonlinear Confidence and Prediction Intervals, p. 177
8.4.4 Using the Theis Example to Understand Linear and Nonlinear Confidence Intervals, p. 181
8.4.5 Differences and Their Standard Deviations, Confidence Intervals, and Prediction Intervals, p. 182
8.4.6 Using Confidence Intervals to Serve the Purposes of Traditional Sensitivity Analysis, p. 184
8.5 Quantifying Prediction Uncertainty Using Monte Carlo Analysis, p. 185
8.5.1 Elements of a Monte Carlo Analysis, p. 185
8.5.2 Relation Between Monte Carlo Analysis and Linear and Nonlinear Confidence Intervals, p. 187
8.5.3 Using the Theis Example to Understand Monte Carlo Methods, p. 188
8.6 Quantifying Prediction Uncertainty Using Alternative Models, p. 189
8.7 Testing Model Nonlinearity with Respect to the Predictions, p. 189
8.8 Exercises, p. 193
Exercise 8.1 Predict Advective Transport and Perform Sensitivity Analysis, p. 195
Exercise 8.2 Prediction Uncertainty Measured Using Inferential Statistics, p. 207
9 Calibrating Transient and Transport Models and Recalibrating Existing Models, p. 213
9.1 Strategies for Calibrating Transient Models, p. 213
9.1.1 Initial Conditions, p. 213
9.1.2 Transient Observations, p. 214
9.1.3 Additional Model Inputs, p. 216
9.2 Strategies for Calibrating Transport Models, p. 217
9.2.1 Selecting Processes to Include, p. 217
9.2.2 Defining Source Geometry and Concentrations, p. 218
9.2.3 Scale Issues, p. 219
9.2.4 Numerical Issues: Model Accuracy and Execution Time, p. 220
9.2.5 Transport Observations, p. 223
9.2.6 Additional Model Inputs, p. 225
9.2.7 Examples of Obtaining a Tractable, Useful Model, p. 226
9.3 Strategies for Recalibrating Existing Models, p. 227
9.4 Exercises (optional), p. 228
Exercises 9.1 and 9.2 Simulate Transient Hydraulic Heads and Perform Preparatory Steps, p. 229
Exercise 9.3 Transient Parameter Definition, p. 230
Exercise 9.4 Observations for the Transient Problem, p. 231
Exercise 9.5 Evaluate Transient Model Fit Using Starting Parameter Values, p. 235
Exercise 9.6 Sensitivity Analysis for the Initial Model, p. 235
Exercise 9.7 Estimate Parameters for the Transient System by Nonlinear Regression, p. 243
Exercise 9.8 Evaluate Measures of Model Fit, p. 244
Exercise 9.9 Perform Graphical Analyses of Model Fit and Evaluate Related Statistics, p. 246
Exercise 9.10 Evaluate Estimated Parameters, p. 250
Exercise 9.11 Test for Linearity, p. 253
Exercise 9.12 Predictions, p. 254
10 Guidelines for Effective Modeling, p. 260
10.1 Purpose of the Guidelines, p. 263
10.2 Relation to Previous Work, p. 264
10.3 Suggestions for Effective Implementation, p. 264
11 Guidelines 1 Through 8: Model Development, p. 268
Guideline 1 Apply the Principle of Parsimony, p. 268
G1.1 Problem, p. 269
G1.2 Constructive Approaches, p. 270
Guideline 2 Use a Broad Range of System Information to Constrain the Problem, p. 272
G2.1 Data Assimilation, p. 273
G2.2 Using System Information, p. 273
G2.3 Data Management, p. 274
G2.4 Application: Characterizing a Fractured Dolomite Aquifer, p. 277
Guideline 3 Maintain a Well-Posed, Comprehensive Regression Problem, p. 277
G3.1 Examples, p. 278
G3.2 Effects of Nonlinearity on the css and pcc, p. 281
Guideline 4 Include Many Kinds of Data as Observations in the Regression, p. 284
G4.1 Interpolated "Observations", p. 284
G4.2 Clustered Observations, p. 285
G4.3 Observations that Are Inconsistent with Model Construction, p. 286
G4.4 Applications: Using Different Types of Observations to Calibrate Groundwater Flow and Transport Models, p. 287
Guideline 5 Use Prior Information Carefully, p. 288
G5.1 Use of Prior Information Compared with Observations, p. 288
G5.2 Highly Parameterized Models, p. 290
G5.3 Applications: Geophysical Data, p. 291
Guideline 6 Assign Weights that Reflect Errors, p. 291
G6.1 Determine Weights, p. 294
G6.2 Issues of Weighting in Nonlinear Regression, p. 298
Guideline 7 Encourage Convergence by Making the Model More Accurate and Evaluating the Observations, p. 306
Guideline 8 Consider Alternative Models, p. 308
G8.1 Develop Alternative Models, p. 309
G8.2 Discriminate Between Models, p. 310
G8.3 Simulate Predictions with Alternative Models, p. 312
G8.4 Application, p. 313
12 Guidelines 9 and 10: Model Testing, p. 315
Guideline 9 Evaluate Model Fit, p. 316
G9.1 Determine Model Fit, p. 316
G9.2 Examine Fit for Existing Observations Important to the Purpose of the Model, p. 320
G9.3 Diagnose the Cause of Poor Model Fit, p. 320
Guideline 10 Evaluate Optimized Parameter Values, p. 323
G10.1 Quantify Parameter-Value Uncertainty, p. 323
G10.2 Use Parameter Estimates to Detect Model Error, p. 323
G10.3 Diagnose the Cause of Unreasonable Optimal Parameter Estimates, p. 326
G10.4 Identify Observations Important to the Parameter Estimates, p. 327
G10.5 Reduce or Increase the Number of Parameters, p. 328
13 Guidelines 11 and 12: Potential New Data, p. 329
Guideline 11 Identify New Data to Improve Simulated Processes, Features, and Properties, p. 330
Guideline 12 Identify New Data to Improve Predictions, p. 334
G12.1 Potential New Data to Improve Features and Properties Governing System Dynamics, p. 334
G12.2 Potential New Data to Support Observations, p. 335
14 Guidelines 13 and 14: Prediction Uncertainty, p. 337
Guideline 13 Evaluate Prediction Uncertainty and Accuracy Using Deterministic Methods, p. 337
G13.1 Use Regression to Determine Whether Predicted Values Are Contradicted by the Calibrated Model, p. 337
G13.2 Use Omitted Data and Postaudits, p. 338
Guideline 14 Quantify Prediction Uncertainty Using Statistical Methods, p. 339
G14.1 Inferential Statistics, p. 341
G14.2 Monte Carlo Methods, p. 341
15 Using and Testing the Methods and Guidelines, p. 345
15.1 Execution Time Issues, p. 345
15.2 Field Applications and Synthetic Test Cases, p. 347
15.2.1 The Death Valley Regional Flow System, California and Nevada, USA, p. 347
15.2.2 Grindsted Landfill, Denmark, p. 370
Appendix A Objective Function Issues, p. 374
A.1 Derivation of the Maximum-Likelihood Objective Function, p. 375
A.2 Relation of the Maximum-Likelihood and Least-Squares Objective Functions, p. 376
A.3 Assumptions Required for Diagonal Weighting to be Correct, p. 376
A.4 References, p. 381
Appendix B Calculation Details of the Modified Gauss-Newton Method, p. 383
B.1 Vectors and Matrices for Nonlinear Regression, p. 383
B.2 Quasi-Newton Updating of the Normal Equations, p. 384
B.3 Calculating the Damping Parameter, p. 385
B.4 Solving the Normal Equations, p. 389
B.5 References, p. 390
Appendix C Two Important Properties of Linear Regression and the Effects of Nonlinearity, p. 391
C.1 Identities Needed for the Proofs, p. 392
C.1.1 True Linear Model, p. 392
C.1.2 True Nonlinear Model, p. 392
C.1.3 Linearized True Nonlinear Model, p. 392
C.1.4 Approximate Linear Model, p. 392
C.1.5 Approximate Nonlinear Model, p. 393
C.1.6 Linearized Approximate Nonlinear Model, p. 393
C.1.7 The Importance of X and X, p. 394
C.1.8 Considering Many Observations, p. 394
C.1.9 Normal Equations, p. 395
C.1.10 Random Variables, p. 395
C.1.11 Expected Value, p. 395
C.1.12 Variance-Covariance Matrix of a Vector, p. 395
C.2 Proof of Property 1: Parameters Estimated by Linear Regression are Unbiased, p. 395
C.3 Proof of Property 2: The Weight Matrix Needs to be Defined in a Particular Way for Eq. (7.1) to Apply and for the Parameter Estimates to have the Smallest Variance, p. 396
C.4 References, p. 398
Appendix D Selected Statistical Tables, p. 399
D.1 References, p. 406
References, p. 407
Index, p. 427