Title:
Introduction to nonparametric regression
Personal Author:
Takezawa, Kunio
Series:
Wiley series in probability and statistics
Publication Information:
Hoboken, NJ : Wiley-Interscience, 2006
ISBN:
9780471745839

Available:

Item Barcode: 30000010102702
Call Number: QA278.2 T34 2006
Material Type: Open Access Book
Item Category 1: Book

Summary

An easy-to-grasp introduction to nonparametric regression

This book's straightforward, step-by-step approach provides an excellent introduction to the field for newcomers to nonparametric regression. Introduction to Nonparametric Regression clearly explains the basic concepts underlying nonparametric regression and features:
* Thorough explanations of various techniques, which avoid complex mathematics and excessive abstract theory to help readers intuitively grasp the value of nonparametric regression methods
* Statistical techniques accompanied by clear numerical examples that further assist readers in developing and implementing their own solutions
* Mathematical equations accompanied by clear explanations of how each equation was derived


The first chapter leads with a compelling argument for studying nonparametric regression and sets the stage for more advanced discussions. In addition to covering standard topics, such as kernel and spline methods, the book provides in-depth coverage of the smoothing of histograms, a topic generally not covered in comparable texts.

With a learning-by-doing approach, each topical chapter includes thorough S-Plus examples that allow readers to reproduce the results described in the chapter. A separate appendix is devoted to the conversion of S-Plus objects to R objects. In addition, each chapter ends with a set of problems that test readers' grasp of key concepts and techniques and prepare them for more advanced topics.
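A sense of what these hands-on examples involve can be had from the following minimal R sketch. It is not taken from the book; the simulated data, the bandwidth, and the choice of the base R functions ksmooth() and smooth.spline() are illustrative assumptions standing in for the book's own S-Plus code, which treats the Nadaraya-Watson estimator and the smoothing spline among other methods.

## Illustrative sketch only (not from the book): smooth simulated data with
## two of the estimators covered in Chapters 2-3, using base R.
set.seed(1)                                 # reproducible simulated data
x <- sort(runif(100, 0, 10))                # one-dimensional predictor
y <- sin(x) + rnorm(100, sd = 0.3)          # target = smooth signal + noise

## Nadaraya-Watson estimator: kernel-weighted local averaging.
nw <- ksmooth(x, y, kernel = "normal", bandwidth = 1)

## Smoothing spline: roughness-penalized fit; with cv = FALSE the smoothing
## parameter is chosen by generalized cross-validation.
ss <- smooth.spline(x, y, cv = FALSE)

## Overlay both estimates on the data to compare their behaviour.
plot(x, y, col = "grey")
lines(nw, lwd = 2)                          # Nadaraya-Watson fit
lines(predict(ss, x), lwd = 2, lty = 2)     # smoothing spline fit

Varying the bandwidth in ksmooth() or the smoothing parameter in smooth.spline() gives a direct feel for the bias-variance trade-off discussed in Chapter 3.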

This book is recommended as a textbook for undergraduate and graduate courses in nonparametric regression. Only a basic knowledge of linear algebra and statistics is required. In addition, this is an excellent resource for researchers and engineers in such fields as pattern recognition, speech understanding, and data mining. Practitioners who rely on nonparametric regression for analyzing data in the physical, biological, and social sciences, as well as in finance and economics, will find this an unparalleled resource.


Author Notes

Kunio Takezawa, PhD, is a Specific Research Scientist in the Department of Information Science and Technology at the National Agricultural Research Center, Japan. He is also an Associate Professor in the Cooperative Graduate School System at the Graduate School of Life and Environmental Sciences at the University of Tsukuba, Japan.


Reviews

Choice Review

A very useful book clearly presenting basic concepts of nonparametric regression and applications to various real-life situations, this English edition (versus the Japanese) lacks chapters on wavelets, neural networks, and tree-based models, but includes problem sets and another appendix with a bibliography. Users are expected to have basic knowledge of statistics and linear algebra. Rather than using complex mathematical and theoretical derivations, Takezawa (National Agricultural Research Center, Japan) provides an appreciation and a better understanding of nonparametric regression techniques by using illustrative numerical examples and other intuitive arguments. Of seven chapters, the first details the importance and practical advantages of nonparametric regression. Other chapters discuss smoothing data involving one predictor (equispaced, non-equispaced) and many predictors, and techniques when predictors appear as distributions, in which case neighboring predictors have more or less a similar effect on the target variable. Smoothing of histograms and nonparametric density functions is presented in chapter 6, and the last chapter covers use of nonparametric regression techniques in pattern recognition. Access to data sets used in this book is provided online. There are useful chapter reference lists and problem sets and thorough S-Plus examples; a separate appendix converts S-Plus objects to R objects. Summing Up: Highly recommended. Upper-division undergraduates through faculty. D. V. Chopra, Wichita State University


Table of Contents

Preface p. xi
Acknowledgments p. xvii
1 Exordium p. 1
1.1 Introduction p. 1
1.2 Are the moving average and Fourier series sufficiently useful? p. 4
1.3 Is a histogram or normal distribution sufficiently powerful? p. 8
1.4 Is interpolation sufficiently powerful? p. 13
1.5 Should we use a descriptive equation? p. 16
1.6 Parametric regression and nonparametric regression p. 18
2 Smoothing for data with an equispaced predictor p. 23
2.1 Introduction p. 23
2.2 Moving average and binomial filter p. 23
2.3 Hat matrix p. 32
2.4 Local linear regression p. 43
2.5 Smoothing spline p. 51
2.6 Analysis on eigenvalue of hat matrix p. 55
2.7 Examples of S-Plus object p. 79
References p. 94
Problems p. 95
3 Nonparametric regression for one-dimensional predictor p. 103
3.1 Introduction p. 103
3.2 Trade-off between bias and variance p. 105
3.3 Index to select beneficial regression equations p. 114
3.4 Nadaraya-Watson estimator p. 130
3.5 Local polynomial regression p. 139
3.6 Natural spline and smoothing spline p. 151
3.7 LOESS p. 185
3.8 Supersmoother p. 191
3.9 LOWESS p. 195
3.10 Examples of S-Plus object p. 197
References p. 225
Problems p. 226
4 Multidimensional smoothing p. 231
4.1 Introduction p. 231
4.2 Local polynomial regression for multidimensional predictor p. 232
4.3 Thin plate smoothing splines p. 234
4.4 LOESS and LOWESS with plural predictors p. 237
4.5 Kriging p. 238
4.6 Additive model p. 258
4.7 ACE p. 270
4.8 Projection pursuit regression p. 283
4.9 Examples of S-Plus object p. 287
References p. 318
Problems p. 319
5 Nonparametric regression with predictors represented as distributions p. 325
5.1 Introduction p. 325
5.2 Use of distributions as predictors p. 326
5.3 Nonparametric DVR method p. 335
5.4 Form of nonparametric regression with predictors represented as distributions p. 337
5.5 Examples of S-Plus object p. 344
References p. 354
Problems p. 355
6 Smoothing of histograms and nonparametric probability density functions p. 359
6.1 Introduction p. 359
6.2 Histogram p. 360
6.3 Smoothing a histogram p. 363
6.4 Nonparametric probability density function p. 379
6.5 Examples of S-Plus object p. 385
References p. 405
Problems p. 406
7 Pattern recognition p. 409
7.1 Introduction p. 409
7.2 Bayes' decision rule p. 410
7.3 Linear discriminant rule and quadratic discriminant rule p. 414
7.4 Classification using nonparametric probability density function p. 418
7.5 Logistic regression p. 421
7.6 Neural networks p. 427
7.7 Tree-based model p. 429
7.8 k-nearest-neighbor classifier p. 434
7.9 Nonparametric regression based on the least squares p. 435
7.10 Transformation of feature vectors p. 439
7.11 Examples of S-Plus object p. 444
References p. 470
Problems p. 471
Appendix A Creation and applications of B-spline bases p. 473
A.1 Introduction p. 473
A.2 Method to create B-spline basis p. 474
A.3 Natural spline created by B-spline p. 479
A.4 Application to smoothing spline p. 482
A.5 Examples of S-Plus object p. 484
References p. 500
Appendix B R objects p. 501
B.1 Introduction p. 501
B.2 Transformation of S-Plus objects in Chapter 2 p. 501
B.3 Transformation of S-Plus objects in Chapter 3 p. 502
B.4 Transformation of S-Plus objects in Chapter 4 p. 507
B.5 Transformation of S-Plus objects in Chapter 5 p. 517
B.6 Transformation of S-Plus objects in Chapter 6 p. 517
B.7 Transformation of S-Plus objects in Chapter 7 p. 523
B.8 Transformation of S-Plus objects in Appendix A p. 526
Appendix C Further readings p. 529
Index p. 532