Sieve-package | R Documentation
Performs multivariate nonparametric regression and classification by the method of sieves (using an orthogonal basis expansion). The method is suitable for moderately high-dimensional features (dimension < 100). The l1-penalized sieve estimator, a nonparametric generalization of the Lasso, adapts to the feature dimension with provable theoretical guarantees. The package also includes a nonparametric stochastic gradient descent estimator, Sieve-SGD, for online or large-scale batch problems. Details of the methods can be found in: <arXiv:2206.02994>, <arXiv:2104.00846>, <arXiv:2310.12140>.
Author: Tianyu Zhang
Maintainer: Tianyu Zhang <tianyuz3@andrew.cmu.edu>
References: Tianyu Zhang and Noah Simon (2022) <arXiv:2206.02994>
xdim <- 5        # feature dimension
basisN <- 1000   # number of basis functions
type <- 'cosine' # type of orthogonal basis

# non-linear additive truth; half of the features are truly associated with the outcome
TrainData <- GenSamples(s.size = 300, xdim = xdim,
                        frho = 'additive', frho.para = xdim/2)

# noise-free testing samples
TestData <- GenSamples(s.size = 1e3, xdim = xdim, noise.para = 0,
                       frho = 'additive', frho.para = xdim/2)

# evaluate the basis functions at the training features
sieve.model <- sieve_preprocess(X = TrainData[, 2:(xdim + 1)],
                                basisN = basisN, type = type,
                                interaction_order = 2)

# fit the l1-penalized sieve estimator
sieve.model <- sieve_solver(sieve.model, TrainData$Y, l1 = TRUE)

# predict on the test set
sieve_model_prediction <- sieve_predict(testX = TestData[, 2:(xdim + 1)],
                                        testY = TestData$Y, sieve.model)
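The description above also mentions a Sieve-SGD estimator for online or large-scale problems. A minimal sketch of that workflow, assuming the package exports a `sieve.sgd.preprocess` / `sieve.sgd.solver` / `sieve.sgd.predict` interface (function names and default arguments here are assumptions, not verified against the installed package) and reusing the simulated data from the example above:

```r
library(Sieve)  # assumed to provide the sieve.sgd.* functions used below

xdim <- 5
# simulated training and noise-free testing data, as in the batch example
TrainData <- GenSamples(s.size = 300, xdim = xdim,
                        frho = 'additive', frho.para = xdim/2)
TestData <- GenSamples(s.size = 1e3, xdim = xdim, noise.para = 0,
                       frho = 'additive', frho.para = xdim/2)

# set up the Sieve-SGD model, then fit it by stochastic gradient descent;
# the solver processes the training samples sequentially, so the same call
# pattern applies to streaming/online data
sgd.model <- sieve.sgd.preprocess(X = TrainData[, 2:(xdim + 1)])
sgd.model <- sieve.sgd.solver(sieve.model = sgd.model,
                              X = TrainData[, 2:(xdim + 1)],
                              Y = TrainData$Y)

# predictions at the test features
sgd.prediction <- sieve.sgd.predict(sieve.model = sgd.model,
                                    X = TestData[, 2:(xdim + 1)])
```

This is a sketch of intended usage rather than a verified script; consult the package's help pages for the exact argument names and return values of the `sieve.sgd.*` functions.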