View source: R/SteepestDescend.R
SteepD {R Documentation}
Implements the steepest descent method to find the coefficients \mjseqn{\beta} that minimize the loss function

\mjsdeqn{L(\beta) = \|X\beta - Y\|^2}

In this implementation, the stepsize is updated at each iteration using the gradient

\mjsdeqn{\nabla L(\beta) = 2X^T X\beta - 2X^T Y}

and the Hessian matrix

\mjsdeqn{H_L(\beta) = 2X^T X}
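The update rule described above can be sketched in a few lines of R. This is a minimal illustration, not the package's actual source: the function name `steepest_descent_sketch` is invented, and the exact-line-search stepsize \mjseqn{\alpha_k = g^T g / (g^T H g)} is an assumed way of "updating the stepsize at each iteration" from the gradient and Hessian, since the page does not spell out the formula.

```r
# Sketch of steepest descent for the least-squares loss ||Xb - Y||^2.
# Assumption: the stepsize alpha is chosen by exact line search,
#   alpha = (g'g) / (g' H g),  with g = 2X'Xb - 2X'Y and H = 2X'X.
steepest_descent_sketch <- function(X, Y, init = NULL,
                                    tol = 1e-4, maxit = 1000L) {
  b <- if (is.null(init)) rep(0, ncol(X)) else init
  XtX <- crossprod(X)        # X'X
  XtY <- crossprod(X, Y)     # X'Y
  H <- 2 * XtX               # Hessian of the quadratic loss
  for (k in seq_len(maxit)) {
    g <- 2 * (XtX %*% b) - 2 * XtY                     # gradient at b
    alpha <- drop(crossprod(g) / (t(g) %*% H %*% g))   # exact line search
    b_new <- b - alpha * g
    # stop when subsequent iterates are closer than tol
    if (sqrt(sum((b_new - b)^2)) < tol) {
      return(list(Beta_hat = b_new, Num_iter = k))
    }
    b <- b_new
  }
  list(Beta_hat = b, Num_iter = maxit)
}
```

For a quadratic loss this exact-line-search variant converges linearly, at a rate governed by the condition number of \mjseqn{X^T X}.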
SteepD(data, init = NULL, tol = 1e-04, maxit = 1000L, verb = F, check_loss = F)
data: list containing the data; its elements must be named.

init: vector of initial guesses for the parameter of interest. If …

tol: numeric; must be strictly positive. The tolerance on the error between subsequent iterations, used in the stopping criterion.

maxit: integer; must be strictly positive. The maximum number of iterations.

verb: bool; if …

check_loss: bool; if …
list composed of:

Beta_hat: the \mjseqn{\beta} coefficients of interest.

Minimum: the value of the loss function at the convergence point (only if verb = TRUE).

Final_error: the value of the error at the convergence point.

Num_iter: the number of iterations the function used to reach the minimum.

Time: the time elapsed to perform the optimization (increased by 2 seconds to make it traceable even with small data).
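Putting the arguments and the returned list together, a call might look like the following. This is a hypothetical usage sketch: the named elements of `data` (here `X` and `Y`) are an assumption, since the page does not state which names the list must contain.

```r
# Hypothetical call; the element names "X" and "Y" inside `data`
# are assumed, not documented on this page.
dat <- list(X = cbind(1, rnorm(50)), Y = rnorm(50))
fit <- SteepD(dat, init = NULL, tol = 1e-4, maxit = 1000L,
              verb = TRUE, check_loss = FALSE)
fit$Beta_hat    # estimated coefficients
fit$Num_iter    # iterations needed to converge
fit$Minimum     # loss at the convergence point (returned since verb = TRUE)
```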