Lazy learning for local regression
By combining constant, linear, and quadratic local models,
lazy estimates the value of an unknown multivariate function on
the basis of a set of possibly noisy samples of the function itself.
This implementation of lazy learning automatically adjusts the
bandwidth on a query-by-query basis through leave-one-out
cross-validation.
formula: A formula specifying the response and some numeric predictors.
data: An optional data frame within which to look first for the response, predictors, and weights (the latter will be ignored).
weights: Optional weights for each case (ignored).
subset: An optional specification of a subset of the data to be used.
na.action: The action to be taken with missing values in the response or predictors. The default is to stop.
control: Control parameters: see lazy.control.
...: Control parameters can also be supplied directly.
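A minimal call sketch exercising the arguments above, assuming the lazy package is installed from CRAN; the dist ~ speed formula on the built-in cars data is illustrative, not prescribed by the package:

```r
library(lazy)
data(cars)

# formula and data as described above; subset restricts the cases used
fit <- lazy(dist ~ speed, data = cars, subset = cars$speed > 5)
class(fit)   # an object of class "lazy"
```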
For one or more query points,
lazy estimates the value of
an unknown multivariate function on the basis of a set of possibly
noisy samples of the function itself. Each sample is an input/output
pair where the input is a vector and the output is a number. For each
query point, the estimation of the function is obtained by combining
different local models. Local models considered for combination by
lazy are polynomials of zeroth, first, and second degree that
fit a set of samples in the neighborhood of the query point. The
neighbors are selected according to either the Manhattan or the
Euclidean distance. It is possible to assign weights to the different
directions of the input domain for modifying their importance in the
computation of the distance. The number of neighbors used for
identifying local models is automatically adjusted on a query-by-query
basis through a leave-one-out cross-validation of models, each fitting
a different number of neighbors. The local models are identified using
the recursive least-squares algorithm, and the leave-one-out
cross-validation is obtained through the PRESS statistic.
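The PRESS shortcut mentioned in the last sentence can be checked in a few lines of base R: for a least-squares fit, the leave-one-out residual at case i equals the ordinary residual divided by one minus the i-th leverage, so cross-validation costs no refitting. A sketch with a plain lm on the built-in cars data (not the lazy package itself):

```r
# PRESS statistic for a least-squares fit, without refitting n times:
# the leave-one-out residual at case i is e_i / (1 - h_ii),
# where h_ii is the i-th diagonal element of the hat matrix.
fit <- lm(dist ~ speed, data = cars)
loo_residuals <- residuals(fit) / (1 - hatvalues(fit))
press <- sum(loo_residuals^2)

# The same quantity computed the slow way, by actually leaving
# each case out and refitting:
press_loop <- sum(sapply(seq_len(nrow(cars)), function(i) {
  f <- lm(dist ~ speed, data = cars[-i, ])
  (cars$dist[i] - predict(f, cars[i, , drop = FALSE]))^2
}))
```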
As the name
lazy suggests, this function does not do
anything... apart from checking the options and properly packing
the data. All the actual computation is done when a prediction is
requested for a specific query point, or for a set of query points: see
predict.lazy.
An object of class lazy.
Mauro Birattari and Gianluca Bontempi
D.W. Aha (1997) Editorial. Artificial Intelligence Review, 11(1–5), pp. 1–6. Special Issue on Lazy Learning.
C.G. Atkeson, A.W. Moore, and S. Schaal (1997) Locally Weighted Learning. Artificial Intelligence Review, 11(1–5), pp. 11–73. Special Issue on Lazy Learning.
W.S. Cleveland, S.J. Devlin, and E. Grosse (1988) Regression by Local Fitting: Methods, Properties and Computational Algorithms. Journal of Econometrics, 37, pp. 87–114.
M. Birattari, G. Bontempi, and H. Bersini (1999) Lazy learning meets the recursive least squares algorithm. Advances in Neural Information Processing Systems 11, pp. 375–381. MIT Press.
G. Bontempi, M. Birattari, and H. Bersini (1999) Lazy learning for modeling and control design. International Journal of Control, 72(7/8), pp. 643–658.
G. Bontempi, M. Birattari, and H. Bersini (1999) Local learning for iterated time-series prediction. International Conference on Machine Learning, pp. 32–38. Morgan Kaufmann.
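An end-to-end sketch, assuming the lazy package is installed from CRAN. Fitting merely packs the data; prediction at a grid of query points triggers the actual local fitting. The $h component is assumed to hold the combined local estimates returned by predict:

```r
library(lazy)
data(cars)

# Fit: nothing is really computed yet, the data are just packed
cars.lazy <- lazy(dist ~ speed, data = cars)

# Prediction at a grid of query points triggers the local fitting;
# $h is assumed to hold the combined estimates
pred <- predict(cars.lazy, data.frame(speed = seq(5, 25, by = 1)))
pred$h
```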