gic.ncpen: extracts the generalized information criterion (GIC)
View source: R/ncpen_cpp_wrap.R
Description

The function selects the regularization parameter lambda based on the generalized information criterion (GIC), which includes the AIC and BIC as special cases.
Usage

gic.ncpen(fit, weight = NULL, verbose = TRUE, ...)
Arguments

fit        (ncpen object) fitted ncpen object.
weight     (numeric) the weight factor for various information criteria.
           Default is BIC, i.e. weight=log(n), if weight=NULL.
verbose    (logical) whether to plot the GIC curve.
...        other graphical parameters passed to plot.
Details

Users can supply various weight values (see references). For example, weight=2, weight=log(n), weight=log(log(p))log(n) and weight=log(log(n))log(p) correspond to the AIC, the BIC (fixed dimensional model), the modified BIC (diverging dimensional model) and the GIC (high dimensional model), respectively; see the sketch after this paragraph.
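As an illustration, these choices could be passed as follows (a minimal sketch, not part of the package documentation; it assumes a fitted ncpen object fit and the sample size n=200 and dimension p=20 used in the Examples below):

n = 200; p = 20                                                 # match the simulated data
gic.ncpen(fit, weight = 2, verbose = FALSE)                     # AIC
gic.ncpen(fit, weight = log(n), verbose = FALSE)                # BIC
gic.ncpen(fit, weight = log(log(p)) * log(n), verbose = FALSE)  # modified BIC
gic.ncpen(fit, weight = log(log(n)) * log(p), verbose = FALSE)  # GIC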
Value

gic         the GIC values.
lambda      the sequence of lambda values used to calculate the GIC.
opt.beta    the optimal coefficients selected by the GIC.
opt.lambda  the optimal lambda value.
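For example, the returned list can be inspected as follows (a minimal sketch, assuming fit is the ncpen object fitted in the Examples below):

out = gic.ncpen(fit, verbose = FALSE)   # compute the GIC without plotting
out$opt.lambda                          # the lambda value minimizing the GIC
out$opt.beta[out$opt.beta != 0]         # nonzero coefficients of the selected model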
Author(s)

Dongshin Kim, Sunghoon Kwon, Sangin Lee
References

Wang, H., Li, R. and Tsai, C.L. (2007). Tuning parameter selectors for the smoothly clipped absolute deviation method. Biometrika, 94(3), 553-568.

Wang, H., Li, B. and Leng, C. (2009). Shrinkage tuning parameter selection with a diverging number of parameters. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 71(3), 671-683.

Kim, Y., Kwon, S. and Choi, H. (2012). Consistent model selection criteria on high dimensions. Journal of Machine Learning Research, 13, 1037-1057.

Fan, Y. and Tang, C.Y. (2013). Tuning parameter selection in high dimensional penalized likelihood. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 75(3), 531-552.

Lee, S., Kwon, S. and Kim, Y. (2016). A modified local quadratic approximation algorithm for penalized optimization problems. Computational Statistics and Data Analysis, 94, 275-286.
Examples

### linear regression with scad penalty
sam = sam.gen.ncpen(n=200,p=20,q=5,cf.min=0.5,cf.max=1,corr=0.5)
x.mat = sam$x.mat; y.vec = sam$y.vec
fit = ncpen(y.vec=y.vec,x.mat=x.mat)
gic.ncpen(fit,pch="*",type="b")
### multinomial regression with classo penalty
sam = sam.gen.ncpen(n=200,p=20,q=5,k=3,cf.min=0.5,cf.max=1,corr=0.5,family="multinomial")
x.mat = sam$x.mat; y.vec = sam$y.vec
fit = ncpen(y.vec=y.vec,x.mat=x.mat,family="multinomial",penalty="classo")
gic.ncpen(fit,pch="*",type="b")
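To relate the plotted GIC curve to the returned values, the minimizer can be located directly (a short sketch using the list described under Value):

out = gic.ncpen(fit, verbose = FALSE)
which.min(out$gic)                # index of the GIC minimizer
out$lambda[which.min(out$gic)]    # the same value as out$opt.lambda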
Example output

## linear regression example:
$gic
[1] 423.7131 423.7131 418.8934 404.4903 391.7829 380.5626 370.6469 361.8766
[9] 354.1122 347.2320 341.1294 335.7110 327.0331 318.3402 311.0836 305.0588
[17] 300.0863 300.7567 291.6703 279.2499 274.4702 281.1054 267.6913 250.3692
[25] 231.6707 220.5904 200.9133 196.6216 196.0883 195.6429 205.4020 204.5306
[33] 203.7699 203.1055 202.3547 201.7137 201.1789 211.2348 210.6936 210.2471
[41] 209.8689 209.5100 214.2556 213.9065 213.6196 218.6705 218.4366 223.4167
[49] 228.2994 227.9980 226.7421 226.7282 232.0183 232.0008 237.2324 237.1238
[57] 237.0276 242.1892 242.0149 247.0367 246.9431 246.8322 246.6012 246.5916
[65] 246.5836 246.5796 246.5787 246.5787 246.5787 246.5787 246.5787 246.5787
[73] 246.5787 246.5787 246.5787 246.5787 246.5787 246.5787 251.8765 257.1733
[81] 257.1722 257.1713 257.1705 257.1699 257.1683 257.1670 257.1673 257.1676
[89] 257.1676 257.1676 257.1676 257.1676 257.1676 257.1676 257.1676 257.1676
[97] 257.1676 257.1676 257.1676 257.1676
$lambda
[1] 0.9395993318 0.8762734815 0.8172155817 0.7621379866 0.7107724371
[6] 0.6628687537 0.6181936182 0.5765294374 0.5376732829 0.5014359031
[11] 0.4676408015 0.4361233766 0.4067301207 0.3793178718 0.3537531168
[16] 0.3299113407 0.3076764205 0.2869400595 0.2676012599 0.2495658306
[21] 0.2327459289 0.2170596322 0.2024305395 0.1887873987 0.1760637598
[26] 0.1641976517 0.1531312795 0.1428107438 0.1331857776 0.1242095020
[31] 0.1158381973 0.1080310905 0.1007501565 0.0939599332 0.0876273482
[36] 0.0817215582 0.0762137987 0.0710772437 0.0662868754 0.0618193618
[41] 0.0576529437 0.0537673283 0.0501435903 0.0467640802 0.0436123377
[46] 0.0406730121 0.0379317872 0.0353753117 0.0329911341 0.0307676421
[51] 0.0286940060 0.0267601260 0.0249565831 0.0232745929 0.0217059632
[56] 0.0202430540 0.0188787399 0.0176063760 0.0164197652 0.0153131280
[61] 0.0142810744 0.0133185778 0.0124209502 0.0115838197 0.0108031090
[66] 0.0100750157 0.0093959933 0.0087627348 0.0081721558 0.0076213799
[71] 0.0071077244 0.0066286875 0.0061819362 0.0057652944 0.0053767328
[76] 0.0050143590 0.0046764080 0.0043612338 0.0040673012 0.0037931787
[81] 0.0035375312 0.0032991134 0.0030767642 0.0028694006 0.0026760126
[86] 0.0024956583 0.0023274593 0.0021705963 0.0020243054 0.0018878740
[91] 0.0017606376 0.0016419765 0.0015313128 0.0014281074 0.0013318578
[96] 0.0012420950 0.0011583820 0.0010803109 0.0010075016 0.0009395993
$opt.lambda
[1] 0.1242095
$opt.beta
intercept x1 x2 x3 x4 x5
0.03904388 -0.90624053 0.72448899 -0.60804090 0.53945700 -0.51361994
x6 x7 x8 x9 x10 x11
0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 0.00000000
x12 x13 x14 x15 x16 x17
0.00000000 0.00000000 0.00000000 0.00000000 0.00000000 -0.06476728
x18 x19 x20
0.00000000 0.00000000 0.00000000
## multinomial regression example:

$gic
[1] 439.1201 439.1201 439.1201 413.3054 413.3054 413.3054 413.3054 413.3054
[9] 413.3054 399.1233 399.1233 399.1233 399.1233 385.3695 385.3695 362.5073
[17] 362.5073 362.5073 362.5073 362.5073 348.3072 348.3072 348.3072 348.3072
[25] 348.3072 348.3072 348.3072 348.3072 349.0423 349.0423 349.0423 349.0423
[33] 349.0423 349.0423 357.8026 357.8026 357.8026 357.8026 361.4760 361.4760
[41] 367.3735 367.3735 367.3735 370.5104 374.4214 374.4214 389.0989 389.0989
[49] 389.0989 389.0989 389.0989 402.5196 411.6900 416.2593 421.0574 421.0574
[57] 421.0574 421.0574 429.0568 429.0568 429.0568 429.0568 429.0568 429.0568
[65] 429.0568 429.0568 429.0568 429.0568 439.1463 439.1463 439.1463 444.2339
[73] 444.2339 444.2339 444.2339 444.2339 444.2339 444.2339 454.5880 454.5880
[81] 454.5880 454.5880 454.5880 454.5880 459.8262 459.8262 459.8262 459.8262
[89] 465.0843 465.0843 465.0843 470.3805 470.3805 470.3805 470.3805 470.3805
[97] 470.3805 475.6751 475.6751 475.6751
$lambda
[1] 0.195038755 0.186173943 0.177712050 0.169634763 0.161924601 0.154564878
[7] 0.147539666 0.140833760 0.134432649 0.128322477 0.122490023 0.116922662
[13] 0.111608347 0.106535575 0.101693369 0.097071248 0.092659210 0.088447706
[19] 0.084427621 0.080590256 0.076927304 0.073430839 0.070093294 0.066907446
[25] 0.063866399 0.060963573 0.058192684 0.055547737 0.053023007 0.050613029
[31] 0.048312589 0.046116707 0.044020632 0.042019826 0.040109960 0.038286901
[37] 0.036546702 0.034885598 0.033299994 0.031786457 0.030341714 0.028962636
[43] 0.027646240 0.026389675 0.025190224 0.024045289 0.022952394 0.021909172
[49] 0.020913366 0.019962822 0.019055481 0.018189380 0.017362644 0.016573485
[55] 0.015820195 0.015101143 0.014414772 0.013759599 0.013134204 0.012537234
[61] 0.011967397 0.011423461 0.010904247 0.010408632 0.009935543 0.009483958
[67] 0.009052897 0.008641429 0.008248663 0.007873748 0.007515874 0.007174266
[73] 0.006848185 0.006536924 0.006239811 0.005956202 0.005685483 0.005427069
[79] 0.005180401 0.004944943 0.004720188 0.004505648 0.004300860 0.004105379
[85] 0.003918783 0.003740668 0.003570649 0.003408358 0.003253443 0.003105569
[91] 0.002964416 0.002829678 0.002701065 0.002578298 0.002461110 0.002349249
[97] 0.002242472 0.002140548 0.002043257 0.001950388
$opt.lambda
[1] 0.0769273
$opt.beta
class1 class2 class3
[1,] -0.1792295 -0.07667944 0
[2,] 1.3909870 1.14765763 0
[3,] 0.0000000 0.00000000 0
[4,] 0.0000000 0.61681522 0
[5,] 1.2761182 1.63523645 0
[6,] 0.4424590 0.00000000 0
[7,] 0.0000000 0.00000000 0
[8,] 0.0000000 0.00000000 0
[9,] 0.0000000 0.00000000 0
[10,] 0.0000000 0.00000000 0
[11,] 0.0000000 0.00000000 0
[12,] 0.0000000 0.00000000 0
[13,] 0.0000000 0.00000000 0
[14,] 0.0000000 0.00000000 0
[15,] 0.0000000 0.00000000 0
[16,] 0.0000000 0.00000000 0
[17,] 0.0000000 0.00000000 0
[18,] 0.0000000 0.00000000 0
[19,] 0.0000000 0.00000000 0
[20,] 0.0000000 0.00000000 0
[21,] 0.0000000 0.00000000 0