cqr.admm | R Documentation
Composite quantile regression (CQR) finds the estimated coefficients that minimize the check loss (weighted absolute error) summed over a set of quantile levels. The problem is well suited to distributed convex optimization and is solved with the Alternating Direction Method of Multipliers (ADMM) algorithm.
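For reference, the composite quantile regression objective of Zou and Yuan (2008) sums the check (pinball) loss over all quantile levels, with a separate intercept per level and a common slope vector. A minimal R sketch of that objective (for illustration only; this is not the package's internal ADMM implementation) is:

    ## Check (pinball) loss at quantile level tau
    check_loss <- function(u, tau) u * (tau - (u < 0))

    ## Composite objective: a separate intercept b[k] for each quantile level
    ## tau[k], with a single slope vector beta shared across all levels.
    cqr_objective <- function(beta, b, X, y, tau) {
      total <- 0
      for (k in seq_along(tau)) {
        r <- as.vector(y) - as.vector(X %*% beta) - b[k]  # residuals at level tau[k]
        total <- total + sum(check_loss(r, tau[k]))
      }
      total
    }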
cqr.admm(X, y, tau, rho, beta, maxit, toler)
X | the design matrix
y | the response variable
tau | vector of quantile levels
rho | the augmented Lagrangian parameter
beta | initial values of the estimated coefficients (default: a naive guess obtained by least squares estimation)
maxit | maximum number of iterations (default 200)
toler | tolerance criterion for stopping the algorithm (default 1e-3)
A list with components:
beta | the vector of estimated coefficients
b | the intercept
cqr.admm(x, y, tau) works properly only if the least squares estimate used for initialization is good; see the sketch below for supplying a starting value explicitly.
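One way to check this is to compute the least squares start yourself and pass it through the beta argument. A sketch, assuming x, y and tau as generated in the Examples section below (rho = 1 is an arbitrary illustrative value, not a documented default):

    ## Inspect the least squares start and pass it explicitly via `beta`
    ## (x, y, tau as in the Examples below; rho = 1 chosen only for illustration)
    ls_fit <- lm(as.vector(y) ~ x)                  # ordinary least squares fit
    beta0  <- matrix(coef(ls_fit)[-1], ncol = 1)    # drop intercept, keep slopes
    cqr.admm(x, y, tau, rho = 1, beta = beta0)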
S. Boyd, N. Parikh, E. Chu, B. Peleato and J. Eckstein (2010). Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers. Foundations and Trends in Machine Learning, 3(1), 1–122.
Hui Zou and Ming Yuan (2008). Composite Quantile Regression and the Oracle Model Selection Theory. The Annals of Statistics, 36(3), 1108–1126.
set.seed(1)
n = 100
p = 2
a = rnorm(n * p, mean = 1, sd = 1)
x = matrix(a, n, p)
beta = rnorm(p, 1, 1)
beta = matrix(beta, p, 1)
y = x %*% beta - matrix(rnorm(n, 0.1, 1), n, 1)
tau = 1:5/6
# x is a 100 x 2 matrix, y is a 100 x 1 vector, beta is a 2 x 1 vector
cqr.admm(x, y, tau)
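The components documented under Value can then be extracted from the returned list. A short continuation of the example above, assuming the result is a plain list with elements beta and b:

    ## Extract the documented components from the fitted object
    fit <- cqr.admm(x, y, tau)
    fit$beta   # vector of estimated coefficients
    fit$b      # intercept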