Part of the book series: Springer Texts in Statistics (STS)

Abstract

In this chapter we will study nonparametric regression, also known as "learning a function" in the jargon of machine learning. We are given n pairs of observations \( (x_1, Y_1), \ldots, (x_n, Y_n) \) as in Figures 5.1, 5.2 and 5.3. The response variable Y is related to the covariate x by the equations

$$ Y_i = r(x_i) + \epsilon_i, \qquad \mathbb{E}(\epsilon_i) = 0, \qquad i = 1, \ldots, n $$
(5.1)

where r is the regression function. The variable x is also called a feature. We want to estimate (or "learn") the function r under weak assumptions. The estimator of r(x) is denoted by \( \widehat{r}_n(x) \). We also refer to \( \widehat{r}_n(x) \) as a smoother. At first, we will make the simplifying assumption that the variance \( \mathbb{V}(\epsilon_i) = \sigma^2 \) does not depend on x. We will relax this assumption later.
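To make the setup in (5.1) concrete, the sketch below fits one common smoother, a Nadaraya-Watson kernel estimator, to simulated data. The Gaussian kernel, the bandwidth h = 0.05, and the example regression function r(x) = sin(4x) are illustrative choices for this sketch, not prescriptions from the chapter.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Kernel (Nadaraya-Watson) estimate of r at each point of x_grid.

    x, y : observed covariates and responses (1-d arrays of length n)
    h    : bandwidth controlling the amount of smoothing
    """
    # Gaussian kernel weights K((x0 - x_i) / h) for every grid point / data point pair
    d = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * d**2)
    # Estimated r(x0): weighted average of the responses
    return (w @ y) / w.sum(axis=1)

# Simulated example satisfying (5.1): r(x) = sin(4x), constant-variance noise
rng = np.random.default_rng(0)
n = 100
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(4 * x) + 0.3 * rng.normal(size=n)

grid = np.linspace(0.0, 1.0, 200)
r_hat = nadaraya_watson(grid, x, y, h=0.05)  # the smoother evaluated on a grid
```

The bandwidth h plays the same role here as it does throughout the chapter: small h tracks the data closely (low bias, high variance), large h averages over many points (high bias, low variance).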


5.14 Bibliographic Remarks

  • Fan, J. and Gijbels, I. (1996). Local Polynomial Modelling and Its Applications. Chapman and Hall. New York, NY.


  • Hall, P. (1987). On Kullback-Leibler loss and density estimation. The Annals of Statistics 15 1491–1519.


  • Hastie, T., Tibshirani, R. and Friedman, J. H. (2001). The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer-Verlag. New York, NY.


  • Ruppert, D., Wand, M. and Carroll, R. (2003). Semiparametric Regression. Cambridge University Press. Cambridge.


  • Scott, D. W. (1992). Multivariate Density Estimation: Theory, Practice, and Visualization. Wiley. New York, NY.


  • Silverman, B. W. (1986). Density Estimation for Statistics and Data Analysis. Chapman and Hall. New York, NY.


  • Simonoff, J. S. (1996). Smoothing Methods in Statistics. Springer-Verlag. New York, NY.




Copyright information

© 2006 Springer Science+Business Media, Inc.

Cite this chapter

(2006). Nonparametric Regression. In: All of Nonparametric Statistics. Springer Texts in Statistics. Springer, New York, NY. https://doi.org/10.1007/0-387-30623-4_5
