coef.xgb.Booster    R Documentation
Description

Extracts the coefficients from a 'gblinear' booster object, as produced by xgb.train() when using the parameter booster = "gblinear".

Note: this function will error out if passed a booster model that is not of the 'gblinear' type.
Usage

## S3 method for class 'xgb.Booster'
coef(object, ...)
Arguments

object    A fitted booster of 'gblinear' type.

...       Not used.
Value

The extracted coefficients:

If there is only one coefficient per column in the data, they will be returned as a vector, potentially carrying the feature names if available, with the intercept as the first element.

If there is more than one coefficient per column in the data (e.g. when using objective = "multi:softmax"), they will be returned as a matrix with dimensions equal to [num_features, num_cols], with the intercepts as the first row. Note that the column dimension (classes in multi-class classification) will not be named.
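As an illustration of the matrix-shaped return, a minimal sketch along these lines should work (this is not one of the package's own examples; it assumes the built-in 'iris' dataset and that xgb.params() accepts the 'objective' and 'num_class' parameters):

library(xgboost)
data(iris)

# Three-class problem: labels must be integers 0, 1, 2 for "multi:softmax"
y_multi <- as.integer(iris$Species) - 1
x_multi <- as.matrix(iris[, 1:4])
dm_multi <- xgb.DMatrix(data = x_multi, label = y_multi, nthread = 1)

params_multi <- xgb.params(
  booster = "gblinear",
  objective = "multi:softmax",
  num_class = 3,
  nthread = 1
)
model_multi <- xgb.train(data = dm_multi, params = params_multi, nrounds = 2)

# One column per class, intercepts in the first row,
# feature names (if available) on the row dimension; the class dimension is unnamed
coef(model_multi)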
Details

The intercept returned here includes the 'base_score' parameter (unlike the 'bias' or the last coefficient in the model dump, which does not have 'base_score' added to it), hence one should get the same values from calling predict(..., outputmargin = TRUE) and from performing a matrix multiplication with model.matrix(~., ...) (a rough numerical check of this appears after the examples below).

Be aware that the coefficients are obtained by first converting them to strings and back, so there will always be some very small loss of precision compared to the actual coefficients used by predict.xgb.Booster.
Examples

library(xgboost)

data(mtcars)

# Single-output regression: the response is 'mpg', the remaining columns are features
y <- mtcars[, 1]
x <- as.matrix(mtcars[, -1])
dm <- xgb.DMatrix(data = x, label = y, nthread = 1)

# Fit a linear booster for two rounds
params <- xgb.params(booster = "gblinear", nthread = 1)
model <- xgb.train(data = dm, params = params, nrounds = 2)

# Named coefficient vector, with the intercept first
coef(model)
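Continuing from the example above, the following rough check sketches the relationship described under the details: because the intercept returned by coef() includes 'base_score', the hand-computed linear predictor should match the margin predictions up to the small string-conversion error noted there.

# Coefficient vector: intercept first, then one entry per feature
beta <- coef(model)

# Hand-computed linear predictor: prepend a column of ones for the intercept
manual_margin <- drop(cbind(1, x) %*% beta)

# Margin predictions from the booster itself
xgb_margin <- predict(model, x, outputmargin = TRUE)

# Should agree up to a very small tolerance
all.equal(unname(manual_margin), unname(xgb_margin), tolerance = 1e-5)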