After the preparations of the previous chapters, we are ready to confront the practical performance of the various estimators in nonparametric regression problems. In particular, we report on simulation studies to judge the efficacy of the estimators and present more thorough analyses of the data sets introduced in § 12.1.
In this chapter, we study nonparametric regression estimators based on sieves. Here, a sieve is taken to be a nested sequence of finite-dimensional subspaces of the ambient L2 space. This is somewhat different from the alternative interpretation of a sieve as a nested sequence of compact subsets of the L2 space; see § 12.2. Either way, a sieved estimator is defined as the solution...
Having studied three widely differing nonparametric regression estimators, it is perhaps time for a comparative critique. In the authors’ view, the strengths of the smoothing spline and sieved estimators derive from the maximum likelihood and/or minimum principles. A weakness is that the estimators are constructed in a global manner, even though the estimators are essentially local (as they should...
We continue the study of the nonparametric regression problem, in which one wishes to estimate a function fo on the interval [0, 1] from the data y1, n, y2, n, …, yn, n following the model yi, n = fo(xi, n) + di, n, i = 1, 2, …, n. Here, dn = (d1, n, d2,...
In this chapter, we return to smoothing splines of arbitrary order but now for general, nonparametric regression problems with random designs. One goal is indeed to rework parts of Chapter 13, but we implement it as a byproduct of a more ambitious project regarding “asymptotically equivalent” (or just “equivalent”) kernel approximations to smoothing splines. We interpret “equivalence” in the strict...
In this volume, we study univariate nonparametric regression problems. The prototypical example is where one observes the data yi = fo(xi) + εi, i = 1, 2, …, n, where ε1, ε2, …, εn are independent normal random variables with mean 0 and unknown variance σ2. The object is to estimate the (smooth) function fo and construct inferential procedures regarding the model...
In this section, we begin the study of nonparametric regression by way of smoothing splines. We wish to estimate the regression function fo on a bounded interval, which we take to be [0, 1], from the data y1, n, …, yn, n, following the model yi, n = fo(xi, n) + di, n, i = 1, 2, …, n. Here, the xi, n are design points...
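As a rough, self-contained sketch of the setup just described (not the authors' implementation: the design, the "true" function, the noise level, and the smoothing value below are all invented for illustration), a cubic smoothing spline can be fit with scipy:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))     # design points x_{i,n} in [0, 1]
f0 = lambda t: np.sin(2 * np.pi * t)      # hypothetical "true" regression function
y = f0(x) + rng.normal(0.0, 0.2, n)       # data y_{i,n} = f0(x_{i,n}) + noise

# Cubic smoothing spline (k=3); s balances fidelity to the data against
# roughness of the fit, analogous to the smoothing parameter in the text.
# Here s is set to n * sigma^2, a common rule of thumb.
spline = UnivariateSpline(x, y, k=3, s=n * 0.2**2)
fit = spline(x)
rmse = np.sqrt(np.mean((fit - f0(x)) ** 2))
```

Decreasing s toward 0 recovers an interpolating spline; increasing it drives the fit toward the least-squares polynomial, which is the usual under/over-smoothing trade-off.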
In this chapter, we discuss the computation of the various nonparametric regression estimators encountered in the previous chapters (with kernel estimators being superseded by local polynomials).
In this chapter, we give an account of some Bayesian aspects of spline smoothing for nonparametric regression. The Bayesian view leads to concepts that do not arise in the distinctly non-Bayesian presentation of the previous chapters. Indeed, novel solutions appear to which non-Bayesians cannot object; e.g., the various developments culminating in the Kalman filter for computing spline estimators...
We start the treatment of the selection of the smoothing parameter in nonparametric regression with a discussion of optimality criteria. In the remainder of this chapter, we discuss their implementation for linear least-squares problems, in particular for smoothing spline estimators, the polynomial sieve, and local polynomials.
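One such criterion is generalized cross validation (GCV). As a hedged illustration applied to the polynomial sieve, where the "smoothing parameter" is the sieve dimension (the data and true function below are invented, and this is a sketch rather than the book's procedure):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 150
x = np.sort(rng.uniform(0.0, 1.0, n))
f0 = lambda t: np.cos(3 * np.pi * t)      # hypothetical truth
y = f0(x) + rng.normal(0.0, 0.3, n)

def gcv_score(deg):
    # Least-squares fit in the polynomial sieve of dimension deg + 1.
    X = np.vander(x, deg + 1, increasing=True)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    p = deg + 1                           # trace of the hat matrix for OLS
    # GCV(deg) = (RSS/n) / (1 - tr(S)/n)^2, with tr(S) = p for least squares.
    return (rss / n) / (1.0 - p / n) ** 2

degrees = range(1, 16)
best = min(degrees, key=gcv_score)        # dimension minimizing the GCV score
```

The denominator penalizes large sieve dimensions, so GCV trades residual fit against effective degrees of freedom without requiring knowledge of the noise variance.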
The previous chapters dealt mostly with the Gauss-Markov model in the smooth space setting fo ∈ Wm, 2(0, 1) for some integer m ≥ 1. Recall that in the Gauss-Markov model the noise dn = (d1, n, …, dn, n)T satisfies E[dn] = 0 and E[dn dnT] = σ2 I. We revert...
This is the second volume of a text on the theory and practice of maximum penalized likelihood estimation. It is intended for graduate students in statistics, operations research and applied mathematics, as well as for researchers and practitioners in the field. The present volume deals with nonparametric regression. The emphasis in this volume is on smoothing splines of arbitrary order, but other...