Availability
Library | Item Barcode | Call Number | Material Type | Item Category 1 | Status |
---|---|---|---|---|---|
 | 30000010102702 | QA278.2 T34 2006 | Open Access Book | Book | On Order |
Summary
An easy-to-grasp introduction to nonparametric regression
This book's straightforward, step-by-step approach provides an excellent introduction for newcomers to nonparametric regression. Introduction to Nonparametric Regression clearly explains the basic concepts underlying nonparametric regression and features:
* Thorough explanations of various techniques, which avoid complex mathematics and excessive abstract theory to help readers intuitively grasp the value of nonparametric regression methods
* Statistical techniques accompanied by clear numerical examples that further assist readers in developing and implementing their own solutions
* Mathematical equations accompanied by clear explanations of how each equation was derived
The first chapter leads with a compelling argument for studying nonparametric regression and sets the stage for more advanced discussions. In addition to covering standard topics, such as kernel and spline methods, the book provides in-depth coverage of the smoothing of histograms, a topic generally not covered in comparable texts.
With a learning-by-doing approach, each topical chapter includes thorough S-Plus examples that allow readers to reproduce the results described in the chapter. A separate appendix is devoted to the conversion of S-Plus objects into R objects. In addition, each chapter ends with a set of problems that test readers' grasp of key concepts and techniques and prepare them for more advanced topics.
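The following is a minimal sketch, in base R rather than S-Plus, of the kind of kernel-smoothing and smoothing-spline example the book works through for data with an equispaced predictor; the data, bandwidth, and function choices here are illustrative assumptions, not material taken from the text.

```r
## Minimal sketch (base R, not from the book): smoothing noisy data
## observed at an equispaced predictor.
set.seed(1)
x <- seq(0, 10, length.out = 200)          # equispaced predictor
y <- sin(x) + rnorm(length(x), sd = 0.3)   # noisy observations of a smooth trend

## Nadaraya-Watson-type kernel estimate with a normal kernel;
## the bandwidth of 1 is an arbitrary illustrative choice.
fit <- ksmooth(x, y, kernel = "normal", bandwidth = 1)

## A smoothing spline fitted to the same data, for comparison;
## by default the smoothing parameter is chosen by generalized cross-validation.
sp <- smooth.spline(x, y)

plot(x, y, col = "grey", xlab = "x", ylab = "y")
lines(fit$x, fit$y, lwd = 2)               # kernel smooth
lines(predict(sp, x), lty = 2)             # smoothing spline
```

The book's own examples are written in S-Plus, with Appendix B providing the corresponding R conversions.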
This book is recommended as a textbook for undergraduate and graduate courses in nonparametric regression. Only a basic knowledge of linear algebra and statistics is required. In addition, this is an excellent resource for researchers and engineers in such fields as pattern recognition, speech understanding, and data mining. Practitioners who rely on nonparametric regression for analyzing data in the physical, biological, and social sciences, as well as in finance and economics, will find this an unparalleled resource.
Author Notes
Kunio Takezawa, PhD, is a Specific Research Scientist in the Department of Information Science and Technology at the National Agricultural Research Center, Japan. He is also an Associate Professor in the Cooperative Graduate School System at the Graduate School of Life and Environmental Sciences at the University of Tsukuba, Japan.
Reviews
Choice Review
A very useful book clearly presenting basic concepts of nonparametric regression and applications to various real-life situations, this English edition (versus the Japanese) lacks chapters on wavelets, neural networks, and tree-based models, but includes problem sets and an additional appendix with a bibliography. Users are expected to have basic knowledge of statistics and linear algebra. Rather than using complex mathematical and theoretical derivations, Takezawa (National Agricultural Research Center, Japan) provides an appreciation and a better understanding of nonparametric regression techniques by using illustrative numerical examples and other intuitive arguments. Of seven chapters, the first details the importance and practical advantages of nonparametric regression. Other chapters discuss smoothing data involving one predictor (equispaced, non-equispaced) and many predictors, and techniques for predictors that appear as distributions, in which case neighboring predictors have more or less similar effects on the target variable. Smoothing of histograms and nonparametric density functions is presented in chapter 6, and the last chapter covers the use of nonparametric regression techniques in pattern recognition. Access to data sets used in this book is found at . There are useful chapter reference lists, problem sets, and thorough S-Plus examples; a separate appendix converts S-Plus objects to R objects. Summing Up: Highly recommended. Upper-division undergraduates through faculty. D. V. Chopra, Wichita State University
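As a rough illustration of the histogram smoothing and nonparametric density estimation the review attributes to chapter 6, here is a short base R sketch on made-up data; the sample and bin settings are assumptions for illustration only, not material from the book.

```r
## Minimal sketch (base R, not from the book): a histogram of a sample
## overlaid with a kernel density estimate of the same data.
set.seed(2)
z <- c(rnorm(150, mean = 0), rnorm(50, mean = 3))   # illustrative bimodal sample

hist(z, breaks = 20, freq = FALSE,                   # density-scaled histogram
     main = "Histogram and kernel density estimate", xlab = "z")
lines(density(z), lwd = 2)                           # nonparametric density estimate
```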
Table of Contents
Preface | p. xi |
Acknowledgments | p. xvii |
1 Exordium | p. 1 |
1.1 Introduction | p. 1 |
1.2 Are the moving average and Fourier series sufficiently useful? | p. 4 |
1.3 Is a histogram or normal distribution sufficiently powerful? | p. 8 |
1.4 Is interpolation sufficiently powerful? | p. 13 |
1.5 Should we use a descriptive equation? | p. 16 |
1.6 Parametric regression and nonparametric regression | p. 18 |
2 Smoothing for data with an equispaced predictor | p. 23 |
2.1 Introduction | p. 23 |
2.2 Moving average and binomial filter | p. 23 |
2.3 Hat matrix | p. 32 |
2.4 Local linear regression | p. 43 |
2.5 Smoothing spline | p. 51 |
2.6 Analysis on eigenvalue of hat matrix | p. 55 |
2.7 Examples of S-Plus object | p. 79 |
References | p. 94 |
Problems | p. 95 |
3 Nonparametric regression for one-dimensional predictor | p. 103 |
3.1 Introduction | p. 103 |
3.2 Trade-off between bias and variance | p. 105 |
3.3 Index to select beneficial regression equations | p. 114 |
3.4 Nadaraya-Watson estimator | p. 130 |
3.5 Local polynomial regression | p. 139 |
3.6 Natural spline and smoothing spline | p. 151 |
3.7 LOESS | p. 185 |
3.8 Supersmoother | p. 191 |
3.9 LOWESS | p. 195 |
3.10 Examples of S-Plus object | p. 197 |
References | p. 225 |
Problems | p. 226 |
4 Multidimensional smoothing | p. 231 |
4.1 Introduction | p. 231 |
4.2 Local polynomial regression for multidimensional predictor | p. 232 |
4.3 Thin plate smoothing splines | p. 234 |
4.4 LOESS and LOWESS with plural predictors | p. 237 |
4.5 Kriging | p. 238 |
4.6 Additive model | p. 258 |
4.7 ACE | p. 270 |
4.8 Projection pursuit regression | p. 283 |
4.9 Examples of S-Plus object | p. 287 |
References | p. 318 |
Problems | p. 319 |
5 Nonparametric regression with predictors represented as distributions | p. 325 |
5.1 Introduction | p. 325 |
5.2 Use of distributions as predictors | p. 326 |
5.3 Nonparametric DVR method | p. 335 |
5.4 Form of nonparametric regression with predictors represented as distributions | p. 337 |
5.5 Examples of S-Plus object | p. 344 |
References | p. 354 |
Problems | p. 355 |
6 Smoothing of histograms and nonparametric probability density functions | p. 359 |
6.1 Introduction | p. 359 |
6.2 Histogram | p. 360 |
6.3 Smoothing a histogram | p. 363 |
6.4 Nonparametric probability density function | p. 379 |
6.5 Examples of S-Plus object | p. 385 |
References | p. 405 |
Problems | p. 406 |
7 Pattern recognition | p. 409 |
7.1 Introduction | p. 409 |
7.2 Bayes' decision rule | p. 410 |
7.3 Linear discriminant rule and quadratic discriminant rule | p. 414 |
7.4 Classification using nonparametric probability density function | p. 418 |
7.5 Logistic regression | p. 421 |
7.6 Neural networks | p. 427 |
7.7 Tree-based model | p. 429 |
7.8 k-nearest-neighbor classifier | p. 434 |
7.9 Nonparametric regression based on the least squares | p. 435 |
7.10 Transformation of feature vectors | p. 439 |
7.11 Examples of S-Plus object | p. 444 |
References | p. 470 |
Problems | p. 471 |
Appendix A Creation and applications of B-spline bases | p. 473 |
A.1 Introduction | p. 473 |
A.2 Method to create B-spline basis | p. 474 |
A.3 Natural spline created by B-spline | p. 479 |
A.4 Application to smoothing spline | p. 482 |
A.5 Examples of S-Plus object | p. 484 |
References | p. 500 |
Appendix B R objects | p. 501 |
B.1 Introduction | p. 501 |
B.2 Transformation of S-Plus objects in Chapter 2 | p. 501 |
B.3 Transformation of S-Plus objects in Chapter 3 | p. 502 |
B.4 Transformation of S-Plus objects in Chapter 4 | p. 507 |
B.5 Transformation of S-Plus objects in Chapter 5 | p. 517 |
B.6 Transformation of S-Plus objects in Chapter 6 | p. 517 |
B.7 Transformation of S-Plus objects in Chapter 7 | p. 523 |
B.8 Transformation of S-Plus objects in Appendix A | p. 526 |
Appendix C Further readings | p. 529 |
Index | p. 532 |