Available:
Library | Item Barcode | Call Number | Material Type | Item Category 1 | Status |
---|---|---|---|---|---|
 | 30000010297771 | QA276.8 C444 2011 | Open Access Book | Book | On Order |
Summary
A comprehensive introduction to bootstrap methods in the R programming environment
Bootstrap methods provide a powerful approach to statistical data analysis, as they have more general applications than standard parametric methods. An Introduction to Bootstrap Methods with Applications to R explores the practicality of this approach and successfully utilizes R to illustrate applications of the bootstrap and other resampling methods. This book provides a modern introduction to bootstrap methods for readers who do not have an extensive background in advanced mathematics. Emphasis throughout is on the use of bootstrap methods as an exploratory tool, including their value in variable selection and other modeling environments.
The authors begin with a description of bootstrap methods and their relationship to other resampling methods, along with an overview of the wide variety of applications of the approach. Subsequent chapters cover improved confidence set estimation, estimation of error rates in discriminant analysis, and a wide variety of hypothesis testing and estimation problems drawn from the pharmaceutical industry, genomics, and economics. To inform readers of the limitations of the method, the book also exhibits counterexamples to the consistency of bootstrap methods.
An introduction to R programming provides the needed preparation to work with the numerous exercises and applications presented throughout the book. A related website houses the book's R subroutines, and an extensive listing of references provides resources for further study.
Discussing the topic at a remarkably practical and accessible level, An Introduction to Bootstrap Methods with Applications to R is an excellent book for introductory courses on bootstrap and resampling methods at the upper-undergraduate and graduate levels. It also serves as an insightful reference for practitioners working with data in engineering, medicine, and the social sciences who would like to acquire a basic understanding of bootstrap methods.
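The resampling idea described above can be sketched in a few lines. The book's own examples are in R; the following is an illustrative Python sketch of a nonparametric bootstrap percentile confidence interval (the percentile method treated in Chapter 3), with made-up data values and parameter defaults that are not taken from the book.

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric bootstrap percentile confidence interval for stat(data)."""
    rng = random.Random(seed)
    n = len(data)
    # Draw n_boot resamples of size n with replacement and
    # recompute the statistic on each resample.
    boots = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    # Take the empirical alpha/2 and 1 - alpha/2 quantiles of the
    # bootstrap distribution as the interval endpoints.
    lo = boots[int((alpha / 2) * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [2.1, 3.4, 2.8, 3.9, 2.5, 3.1, 2.9, 3.6]
low, high = bootstrap_ci(sample)
print(low, high)
```

Passing a different `stat` (for example `statistics.median`) bootstraps that statistic instead; the bias-corrected and BCa refinements the book covers adjust these raw percentile endpoints.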
Author Notes
Michael R. Chernick, PhD, is Manager of Biostatistical Services at Lankenau Institute for Medical Research, where he conducts statistical design and analysis for pharmaceutical research. He has more than thirty years of experience in the application of statistical methods to such areas as medicine, energy, engineering, insurance, and pharmaceuticals. Dr. Chernick is the author of Bootstrap Methods: A Guide for Practitioners and Researchers, Second Edition and The Essentials of Biostatistics for Physicians, Nurses, and Clinicians, and the coauthor of Introductory Biostatistics for the Health Sciences: Modern Applications Including Bootstrap, all published by Wiley.
Robert A. LaBudde, PhD, is President of Least Cost Formulations, Ltd., a mathematical software development company that specializes in optimization and process control software for manufacturing companies. He has extensive experience in industry and academia and currently serves as Adjunct Associate Professor in the Department of Mathematics and Statistics at Old Dominion University.
Table of Contents
Preface | p. xi |
Acknowledgments | p. xv |
List of Tables | p. xvii |
1 Introduction | p. 1 |
1.1 Historical Background | p. 1 |
1.2 Definition and Relationship to the Delta Method and Other Resampling Methods | p. 3 |
1.2.1 Jackknife | p. 6 |
1.2.2 Delta Method | p. 7 |
1.2.3 Cross-Validation | p. 7 |
1.2.4 Subsampling | p. 8 |
1.3 Wide Range of Applications | p. 8 |
1.4 The Bootstrap and the R Language System | p. 10 |
1.5 Historical Notes | p. 25 |
1.6 Exercises | p. 26 |
References | p. 27 |
2 Estimation | p. 30 |
2.1 Estimating Bias | p. 30 |
2.1.1 Bootstrap Adjustment | p. 30 |
2.1.2 Error Rate Estimation in Discriminant Analysis | p. 32 |
2.1.3 Simple Example of Linear Discrimination and Bootstrap Error Rate Estimation | p. 42 |
2.1.4 Patch Data Example | p. 51 |
2.2 Estimating Location | p. 53 |
2.2.1 Estimating a Mean | p. 53 |
2.2.2 Estimating a Median | p. 54 |
2.3 Estimating Dispersion | p. 54 |
2.3.1 Estimating an Estimate's Standard Error | p. 55 |
2.3.2 Estimating Interquartile Range | p. 56 |
2.4 Linear Regression | p. 56 |
2.4.1 Overview | p. 56 |
2.4.2 Bootstrapping Residuals | p. 57 |
2.4.3 Bootstrapping Pairs (Response and Predictor Vector) | p. 58 |
2.4.4 Heteroscedasticity of Variance: The Wild Bootstrap | p. 58 |
2.4.5 A Special Class of Linear Regression Models: Multivariable Fractional Polynomials | p. 60 |
2.5 Nonlinear Regression | p. 60 |
2.5.1 Examples of Nonlinear Models | p. 61 |
2.5.2 A Quasi-Optical Experiment | p. 63 |
2.6 Nonparametric Regression | p. 63 |
2.6.1 Examples of Nonparametric Regression Models | p. 64 |
2.6.2 Bootstrap Bagging | p. 66 |
2.7 Historical Notes | p. 67 |
2.8 Exercises | p. 69 |
References | p. 71 |
3 Confidence Intervals | p. 76 |
3.1 Subsampling, Typical Value Theorem, and Efron's Percentile Method | p. 77 |
3.2 Bootstrap-t | p. 79 |
3.3 Iterated Bootstrap | p. 83 |
3.4 Bias-Corrected (BC) Bootstrap | p. 85 |
3.5 BCa and ABC | p. 85 |
3.6 Tilted Bootstrap | p. 88 |
3.7 Variance Estimation with Small Sample Sizes | p. 90 |
3.8 Historical Notes | p. 94 |
3.9 Exercises | p. 96 |
References | p. 98 |
4 Hypothesis Testing | p. 101 |
4.1 Relationship to Confidence Intervals | p. 103 |
4.2 Why Test Hypotheses Differently? | p. 105 |
4.3 Tendril DX Example | p. 106 |
4.4 Klingenberg Example: Binary Dose-Response | p. 108 |
4.5 Historical Notes | p. 109 |
4.6 Exercises | p. 110 |
References | p. 111 |
5 Time Series | p. 113 |
5.1 Forecasting Methods | p. 113 |
5.2 Time Domain Models | p. 114 |
5.3 Can Bootstrapping Improve Prediction Intervals? | p. 115 |
5.4 Model-Based Methods | p. 118 |
5.4.1 Bootstrapping Stationary Autoregressive Processes | p. 118 |
5.4.2 Bootstrapping Explosive Autoregressive Processes | p. 123 |
5.4.3 Bootstrapping Unstable Autoregressive Processes | p. 123 |
5.4.4 Bootstrapping Stationary ARMA Processes | p. 123 |
5.5 Block Bootstrapping for Stationary Time Series | p. 123 |
5.6 Dependent Wild Bootstrap (DWB) | p. 126 |
5.7 Frequency-Based Approaches for Stationary Time Series | p. 127 |
5.8 Sieve Bootstrap | p. 128 |
5.9 Historical Notes | p. 129 |
5.10 Exercises | p. 131 |
References | p. 131 |
6 Bootstrap Variants | p. 136 |
6.1 Bayesian Bootstrap | p. 137 |
6.2 Smoothed Bootstrap | p. 138 |
6.3 Parametric Bootstrap | p. 139 |
6.4 Double Bootstrap | p. 139 |
6.5 The m-Out-of-n Bootstrap | p. 140 |
6.6 The Wild Bootstrap | p. 141 |
6.7 Historical Notes | p. 141 |
6.8 Exercises | p. 142 |
References | p. 142 |
7 Special Topics | p. 144 |
7.1 Spatial Data | p. 144 |
7.1.1 Kriging | p. 144 |
7.1.2 Asymptotics for Spatial Data | p. 147 |
7.1.3 Block Bootstrap on Regular Grids | p. 148 |
7.1.4 Block Bootstrap on Irregular Grids | p. 148 |
7.2 Subset Selection in Regression | p. 148 |
7.2.1 Gong's Logistic Regression Example | p. 149 |
7.2.2 Gunter's Qualitative Interaction Example | p. 153 |
7.3 Determining the Number of Distributions in a Mixture | p. 155 |
7.4 Censored Data | p. 157 |
7.5 P-Value Adjustment | p. 158 |
7.5.1 The Westfall-Young Approach | p. 159 |
7.5.2 Passive Plus Example | p. 159 |
7.5.3 Consulting Example | p. 160 |
7.6 Bioequivalence | p. 162 |
7.6.1 Individual Bioequivalence | p. 162 |
7.6.2 Population Bioequivalence | p. 165 |
7.7 Process Capability Indices | p. 165 |
7.8 Missing Data | p. 172 |
7.9 Point Processes | p. 174 |
7.10 Bootstrap to Detect Outliers | p. 176 |
7.11 Lattice Variables | p. 177 |
7.12 Covariate Adjustment of Area Under the Curve Estimates for Receiver Operating Characteristic (ROC) Curves | p. 177 |
7.13 Bootstrapping in SAS | p. 179 |
7.14 Historical Notes | p. 182 |
7.15 Exercises | p. 183 |
References | p. 185 |
8 When the Bootstrap is Inconsistent and How to Remedy It | p. 190 |
8.1 Too Small of a Sample Size | p. 191 |
8.2 Distributions with Infinite Second Moments | p. 191 |
8.2.1 Introduction | p. 191 |
8.2.2 Example of Inconsistency | p. 192 |
8.2.3 Remedies | p. 193 |
8.3 Estimating Extreme Values | p. 194 |
8.3.1 Introduction | p. 194 |
8.3.2 Example of Inconsistency | p. 194 |
8.3.3 Remedies | p. 194 |
8.4 Survey Sampling | p. 195 |
8.4.1 Introduction | p. 195 |
8.4.2 Example of Inconsistency | p. 195 |
8.4.3 Remedies | p. 195 |
8.5 m-Dependent Sequences | p. 196 |
8.5.1 Introduction | p. 196 |
8.5.2 Example of Inconsistency When Independence Is Assumed | p. 196 |
8.5.3 Remedy | p. 197 |
8.6 Unstable Autoregressive Processes | p. 197 |
8.6.1 Introduction | p. 197 |
8.6.2 Example of Inconsistency | p. 197 |
8.6.3 Remedies | p. 197 |
8.7 Long-Range Dependence | p. 198 |
8.7.1 Introduction | p. 198 |
8.7.2 Example of Inconsistency | p. 198 |
8.7.3 A Remedy | p. 198 |
8.8 Bootstrap Diagnostics | p. 199 |
8.9 Historical Notes | p. 199 |
8.10 Exercises | p. 201 |
References | p. 201 |
Author Index | p. 204 |
Subject Index | p. 210 |