nbeats_fit_impl: GluonTS N-BEATS Modeling Function (Bridge)


View source: R/parsnip-nbeats.R

Description

GluonTS N-BEATS Modeling Function (Bridge)

Usage

nbeats_fit_impl(
  x,
  y,
  freq,
  prediction_length,
  id,
  epochs = 5,
  batch_size = 32,
  num_batches_per_epoch = 50,
  learning_rate = 0.001,
  learning_rate_decay_factor = 0.5,
  patience = 10,
  minimum_learning_rate = 5e-05,
  clip_gradient = 10,
  weight_decay = 1e-08,
  init = "xavier",
  ctx = NULL,
  hybridize = TRUE,
  context_length = NULL,
  loss_function = "sMAPE",
  num_stacks = 30,
  num_blocks = list(1),
  widths = list(512),
  sharing = list(FALSE),
  expansion_coefficient_lengths = list(32),
  stack_types = list("G")
)
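A minimal call might look like the sketch below. This is a hypothetical example, not from the package documentation: `df`, its column names, and the chosen horizon are all illustrative assumptions.

```r
# Hypothetical data frame `df` with a date column, a series identifier
# "item_id", and a numeric outcome "value" (all names are assumptions).
model <- nbeats_fit_impl(
  x    = df[, c("date", "item_id")],  # exogenous regressors / metadata
  y    = df$value,                    # numeric vector of values to fit
  freq = "D",                         # daily frequency (Pandas offset alias)
  prediction_length = 30,             # forecast 30 steps ahead
  id   = "item_id"                    # column tracking GluonTS "item_id"
)
```

All remaining arguments keep the defaults shown in Usage (generic mode, 5 epochs, CPU context).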

Arguments

x

A dataframe of xreg (exogenous regressors)

y

A numeric vector of values to fit

freq

A pandas timeseries frequency such as "5min" for 5-minutes or "D" for daily. Refer to Pandas Offset Aliases.

prediction_length

Numeric value indicating the length of the prediction horizon

id

A quoted column name that tracks the GluonTS FieldName "item_id"

epochs

Number of epochs that the network will train (default: 5).

batch_size

Number of examples in each batch (default: 32).

num_batches_per_epoch

Number of batches at each epoch (default: 50).

learning_rate

Initial learning rate (default: 10^-3).

learning_rate_decay_factor

Factor (between 0 and 1) by which to decrease the learning rate (default: 0.5).

patience

The patience to observe before reducing the learning rate, nonnegative integer (default: 10).

minimum_learning_rate

Lower bound for the learning rate (default: 5 x 10^-5).

clip_gradient

Maximum value of the gradient. The gradient is clipped if it exceeds this value (default: 10).

weight_decay

The weight decay (or L2 regularization) coefficient. Modifies the objective by adding a penalty for large weights (default: 10^-8).

init

Initializer of the weights of the network (default: "xavier").

ctx

The mxnet CPU/GPU context. Refer to using CPU/GPU in the mxnet documentation. (default: NULL, uses CPU)

hybridize

Increases efficiency by using symbolic programming. (default: TRUE)

context_length

Number of time units that condition the predictions. Also known as the 'lookback period'. Default is 2 * prediction_length.

loss_function

The loss function (also known as metric) to use for training the network. Unlike other models in GluonTS, this network does not use a distribution. One of the following: "sMAPE", "MASE" or "MAPE" (default: "sMAPE", as shown in Usage).

num_stacks

The number of stacks the network should contain. Default and recommended value for generic mode: 30. Recommended value for interpretable mode: 2.

num_blocks

The number of blocks per stack. A list of ints of length 1 or 'num_stacks'. Default and recommended value for generic mode: 1. Recommended value for interpretable mode: 3.

widths

Widths of the fully connected layers with ReLU activation in the blocks. A list of ints of length 1 or 'num_stacks'. Default and recommended value for generic mode: list(512). Recommended value for interpretable mode: list(256, 2048).

sharing

Whether the weights are shared with the other blocks per stack. A list of logical values of length 1 or 'num_stacks'. Default and recommended value for generic mode: list(FALSE). Recommended value for interpretable mode: list(TRUE).

expansion_coefficient_lengths

If the type is "G" (generic), then the length of the expansion coefficient. If the type is "T" (trend), then it corresponds to the degree of the polynomial. If the type is "S" (seasonal), then it is not used. A list of ints of length 1 or 'num_stacks'. Default value for generic mode: list(32). Recommended value for interpretable mode: list(3).

stack_types

One of the following values: "G" (generic), "S" (seasonal) or "T" (trend). A list of strings of length 1 or 'num_stacks'. Default and recommended value for generic mode: list("G"). Recommended value for interpretable mode: list("T", "S").
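The interpretable-mode recommendations scattered across the arguments above combine into a single configuration. The sketch below is hypothetical (data frame `df` and its columns are assumptions); the argument values follow the recommended interpretable-mode settings stated in this page.

```r
# Hypothetical interpretable-mode fit: 2 stacks (trend + seasonal),
# 3 blocks per stack with shared weights, per the recommendations above.
model <- nbeats_fit_impl(
  x    = df[, c("date", "item_id")],   # assumed columns
  y    = df$value,                     # assumed outcome vector
  freq = "D",
  prediction_length = 30,
  id   = "item_id",
  num_stacks  = 2,                     # interpretable mode: 2
  num_blocks  = list(3),               # interpretable mode: 3
  widths      = list(256, 2048),       # one width per stack
  sharing     = list(TRUE),            # share weights within each stack
  expansion_coefficient_lengths = list(3),  # polynomial degree for "T"
  stack_types = list("T", "S")         # trend stack, then seasonal stack
)
```

Note that `widths` and `stack_types` are given per stack here (length 'num_stacks'), while `num_blocks`, `sharing`, and `expansion_coefficient_lengths` use length-1 lists that apply to both stacks.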


modeltime.gluonts documentation built on Jan. 8, 2021, 2:23 a.m.