multinom_reg: Multinomial regression

View source: R/multinom_reg.R

multinom_reg                R Documentation

Multinomial regression


multinom_reg() defines a model that uses linear predictors to predict multiclass data using the multinomial distribution. This function can fit classification models.


More information on how parsnip is used for modeling is available on the tidymodels website.


Usage

multinom_reg(
  mode = "classification",
  engine = "nnet",
  penalty = NULL,
  mixture = NULL
)



Arguments

mode

A single character string for the type of model. The only possible value for this model is "classification".


engine

A single character string specifying what computational engine to use for fitting. Possible engines are listed below. The default for this model is "nnet".


penalty

A non-negative number representing the total amount of regularization (specific engines only). For keras models, this corresponds to purely L2 regularization (aka weight decay), while for the other models it can be a combination of L1 and L2 (depending on the value of mixture).


mixture

A number between zero and one (inclusive) giving the proportion of L1 regularization (i.e., lasso) in the model.

  • mixture = 1 specifies a pure lasso model,

  • mixture = 0 specifies a ridge regression model, and

  • 0 < mixture < 1 specifies an elastic net model, interpolating lasso and ridge.

Available for specific engines only.
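As a sketch, here is how penalty and mixture might be combined in a specification; the "glmnet" engine and the particular values shown are illustrative assumptions, not defaults:

```r
library(parsnip)

# An elastic net multinomial model: penalty sets the total amount of
# regularization, and mixture = 0.5 requests an equal blend of
# lasso (L1) and ridge (L2) penalties.
spec <- multinom_reg(penalty = 0.01, mixture = 0.5) %>%
  set_engine("glmnet")
```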


Details

This function only defines what type of model is being fit. Once an engine is specified, the method to fit the model is also defined. See set_engine() for more on setting the engine, including how to set engine arguments.

The model is not trained or fit until the fit() function is used with the data.
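For illustration, a minimal specify-then-fit sequence with the default nnet engine might look like the following (the built-in iris data is an assumed multiclass example):

```r
library(parsnip)

# Defining the specification estimates nothing; training happens
# only when fit() is called with a formula and data.
spec <- multinom_reg(mode = "classification", engine = "nnet")
fitted <- fit(spec, Species ~ ., data = iris)
```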

Each of the arguments in this function other than mode and engine are captured as quosures. To pass values programmatically, use the injection operator like so:

value <- 1
multinom_reg(argument = !!value)

This model fits a classification model for multiclass outcomes; for binary outcomes, see logistic_reg().

References

Tidy Modeling with R; searchable table of parsnip models.

See Also

set_engine(), fit(), logistic_reg()





parsnip documentation built on Aug. 18, 2023, 1:07 a.m.