Description
Methods available for optimization include the Hooke-Jeeves derivative-free minimization algorithm (hjk) and the BFGS method (a modified Quasi-Newton algorithm, QN). The algorithm performs variable selection by shrinking the coefficients towards zero using the combined penalty CL2 = (1-w)L0 + wL2.
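To make the combined penalty concrete, here is a minimal sketch (in Python, for illustration only; the package itself is in R). It assumes the common convention that the L0 term counts nonzero coefficients, the L2 term is the sum of squared coefficients, and the tuning parameter lamda scales the whole penalty; the function name `cl2_penalty` is hypothetical and not part of the package.

```python
import numpy as np

def cl2_penalty(beta, lam, w):
    # Combined penalty CL2 = lam * ((1-w)*L0 + w*L2), where
    # L0 counts the nonzero coefficients (sparsity term) and
    # L2 is the sum of squared coefficients (ridge-type term).
    beta = np.asarray(beta, dtype=float)
    l0 = np.count_nonzero(beta)
    l2 = np.sum(beta ** 2)
    return lam * ((1 - w) * l0 + w * l2)

# With w close to 1 the penalty behaves like ridge (L2);
# with w close to 0 it behaves like best-subset selection (L0).
print(cl2_penalty([3, 2, 0, -1], lam=2, w=0.6))
```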
Usage

optimPenaLikL2(Data, lamda, w, standardize = TRUE, algorithms = c("QN",
  "hjk"))
Arguments

Data
  a data frame with the following structure: the first column must be the response variable y.

lamda
  the tuning penalty parameter.

w
  the weight parameter for the combined penalty (1-w)L0 + wL2.

standardize
  logical flag for x variable standardization, prior to fitting the model sequence. The coefficients are always returned on the original scale. Default is standardize=TRUE.

algorithms
  select between the Quasi-Newton ("QN") and Hooke-Jeeves ("hjk") algorithms.
Details

It is recommended to use the tuneParam function to tune the parameters lamda and w prior to using the optimPenaLikL2 function.
Value

A list with the shrunken coefficients and the names of the selected variables, i.e. those variables with an estimated coefficient different from zero.
Examples

## Not run:
# use the optimPenaLikL2 function on a simulated dataset, with given lamda and w.
set.seed(14)
beta <- c(3, 2, -1.6, -1)
noise <- 5
simData <- SimData(N=100, beta=beta, noise=noise, corr=TRUE)

# example with Quasi-Newton:
before <- Sys.time()
PenalQN <- optimPenaLikL2(Data=simData, lamda=2, w=0.6,
                          algorithms=c("QN"))
after <- Sys.time()
after - before
PenalQN

## End(Not run)