The regression estimation problem has a long history. As early as 1632, Galileo Galilei used a procedure that can be interpreted as fitting a linear relationship to contaminated observed data. Such fitting of a line through a cloud of points is the classical linear regression problem. A solution of this problem is provided by the famous principle of least squares, which was discovered independently by A. M. Legendre and C. F. Gauss and published in 1805 and 1809, respectively. The principle of least squares can also be applied to construct nonparametric regression estimates, where one does not restrict the class of possible relationships, and it will be one of the approaches studied in this book. Linear regression analysis, based on the concept of a regression function, was introduced by F. Galton in 1889, while a probabilistic approach in the context of multivariate normal distributions had already been given by A. Bravais in 1846. The first nonparametric regression estimate of local averaging type was proposed by J. W. Tukey in 1947. The partitioning regression estimate he introduced, by analogy to the classical partitioning (histogram) density estimate, can be regarded as a special least squares estimate.
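The two estimates mentioned above can be illustrated briefly. The following is a minimal sketch (not taken from the book): a least squares line fit through a cloud of points, and a one-dimensional partitioning regression estimate that predicts, for a query point, the average of the responses falling in the same cell of a fixed partition. The function names and the synthetic data are illustrative assumptions.

```python
import numpy as np


def least_squares_line(x, y):
    """Fit y ~ a*x + b by minimizing the sum of squared residuals."""
    a, b = np.polyfit(x, y, deg=1)  # returns (slope, intercept)
    return a, b


def partitioning_estimate(x, y, edges):
    """Local-averaging (partitioning) estimate on a fixed 1-D partition.

    The prediction at a query point t is the mean of all observed y_i
    whose x_i fall into the same cell of the partition as t.
    """
    cell_of_x = np.digitize(x, edges)

    def predict(t):
        t = np.asarray(t, dtype=float)
        out = np.empty(t.shape, dtype=float)
        for j, cell in enumerate(np.digitize(t, edges)):
            mask = cell_of_x == cell
            out[j] = y[mask].mean() if mask.any() else np.nan
        return out

    return predict


# Synthetic data: noisy observations of the line y = 2x + 1.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 200)

a, b = least_squares_line(x, y)
m = partitioning_estimate(x, y, np.linspace(0.0, 1.0, 11))
```

On this data the fitted slope and intercept lie close to 2 and 1, and the partitioning estimate is a step function that approximates the line cell by cell.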
Content Level »Research
Keywords »Kernel - Martingale - neural networks - probability - probability theory