- `smis()`: fixed calculation. The function compared the prediction interval with the prediction instead of with the actual values. The argument `data` now requires the train set only, and `actual` the actual values corresponding to the lower/upper bounds. The arguments `forecast` and `h` are therefore removed.
- `mase()`: fixed
calculation. The scaling is now correct for forecast horizons not immediately following the train set, e.g. when evaluating only on h = 5:6. The arguments therefore changed to `data` for the train set, `actual` for the actual values (the equivalent of the forecast values), and `forecast` for the predictions. This change carries over to `cv_arima()`, `cv_baselines()` and `tune_keras_rnn_eval()`.
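The corrected scaling can be illustrated with a small sketch of the standard MASE definition (a numpy translation for illustration only — the package itself is R, and the argument names here merely mirror the new `data`/`actual`/`forecast` interface):

```python
import numpy as np

def mase(train, actual, forecast, m=1):
    """Mean Absolute Scaled Error.

    The scaling denominator uses only the in-sample (train) naive
    forecast errors, so the evaluated horizons may start anywhere
    after the train set (e.g. only h = 5:6).
    """
    train = np.asarray(train, dtype=float)
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    # in-sample (seasonal) naive errors: |y_t - y_{t-m}|
    scale = np.mean(np.abs(train[m:] - train[:-m]))
    return np.mean(np.abs(actual - forecast)) / scale

train = [10., 12., 11., 13., 12., 14.]
# evaluating only two horizons that need not directly follow the train set
print(mase(train, actual=[15., 16.], forecast=[14., 15.]))  # → 0.625
```

The key point of the fix is that `scale` depends on the train set alone, so it no longer matters which horizons are evaluated.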
- `tune_keras_rnn_eval()`: now incl. internal argument checks, unit testing and examples.
- Changed the argument `multiple_h` to `h` in the following functions: `cv_arima()`, `cv_baselines()`, `tune_keras_rnn_eval()`.
- Fixed `smis()` calculation for differing forecast horizons. This directly affects the output of `cv_arima()`, `cv_baselines()` and `tune_keras_rnn_eval()`.
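For reference, a sketch of the scaled mean interval score in its standard (M4-style) form, scored against the actual values as the fix requires. This is a hedged numpy illustration, not the package's R implementation; the `lower`/`upper` argument names are illustrative:

```python
import numpy as np

def smis(train, actual, lower, upper, alpha=0.05, m=1):
    """Scaled Mean Interval Score.

    The interval score penalises actual values (not point predictions)
    that fall outside [lower, upper]; the mean score is scaled by the
    in-sample naive forecast error, as for MASE.
    """
    train, actual = np.asarray(train, float), np.asarray(actual, float)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    mis = np.mean(
        (upper - lower)                                      # interval width
        + (2.0 / alpha) * (lower - actual) * (actual < lower)  # below-interval penalty
        + (2.0 / alpha) * (actual - upper) * (actual > upper)  # above-interval penalty
    )
    scale = np.mean(np.abs(train[m:] - train[:-m]))
    return mis / scale

train = [10., 12., 11., 13., 12., 14.]
# both actuals fall inside their intervals, so only the widths count
print(smis(train, actual=[15., 16.], lower=[13., 14.], upper=[16., 17.]))  # → 1.875
```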
- Added `parse_tf_version()` to get the TensorFlow version.
- Added `check_cv_setting()` for an automated check of `cv_setting` in functions including cross-validation, the `keras_rnn()` function family in particular.
- `predict_baselines()`, `predict_arima()`, `tune_keras_rnn_bayesoptim()`, `tune_keras_rnn_predict()`: all of these functions checked the `id` column for class numeric; fixed the bug so that they check for class character.
- `tune_keras_rnn_bayesoptim()`:
changed the number of Bayes optimization iterations from 50 to 30.
- `tune_keras_rnn_bayesoptim()` and `tune_keras_rnn_predict()`: fixed resample naming once the purrr loop has finished.
- Split `tune_keras_rnn()` into:
  - `tune_keras_rnn_bayesoptim()`: tuning process by Bayesian optimization
  - `tune_keras_rnn_predict()`: train and forecast based on the tuning parameters
  - `tune_keras_rnn_eval()`: evaluation of the trained models
- `tune_keras_rnn()`: finished development of the grid search for the PI dropout rate.
- `py_dropout_model()`: fixed a slight bug.
- `keras_rnn()`: enhanced by an `optimizer` argument.
- `predict_arima()`
and `predict_baselines()`: now using `stats::window()` to subset `ts` objects.
First (official) release of package tsRNN (former fcf) for time series recurrent neural network training and estimation:

- `check_acf()`: Check for autocorrelation (Ljung-Box tests for 8, 12 and 16 lags)
- `forecast_baseline()`: Baseline models to get benchmark results for advanced models
- `keras_rnn()`: Train a Recurrent Neural Network with Gated Recurrent Unit (GRU) or Long Short-Term Memory (LSTM) using the Keras framework
- `acd()`: Absolute coverage difference (ACD)
- `smis()`: Scaled Mean Interval Score (SMIS)
- `mase()`: Mean Absolute Scaled Error (MASE)
- `mape()`: Mean Absolute Percentage Error (MAPE)
- `smape()`: symmetric Mean Absolute Percentage Error (sMAPE)
- `plot_baselines()`: Plot forecasts by baseline methods
- `plot_baselines_samples()`: Plot cross-validated samples of forecasts by baseline methods
- `plot_prediction()`: Plot time series and forecast for a single split and company
- `plot_prediction_samples()`: Plot multiple splits from a list with forecast results
- `predict_arima()`: Forecast the next n steps by ARIMA for each split
- `predict_baselines()`: Prediction and evaluation for baseline models including cross-validation
- `predict_keras_rnn()`: Prediction and evaluation for RNN models
- `py_dropout_model()`: Change the dropout rate in a recurrent layer or dropout layer of a trained model (Python wrapper)
- `ts_nn_preparation()`: Time series data preparation for neural network Keras models
- `ts_normalization()`: Normalize a univariate time series
- `tune_keras_rnn()`: Tune a recurrent neural network with the Keras functional API and Bayes optimization and select the best performing model
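`check_acf()` is built on the Ljung-Box test. A minimal numpy sketch of the underlying Q statistic, for illustration only — in R, `stats::Box.test(x, lag = h, type = "Ljung-Box")` provides the full test with p-values:

```python
import numpy as np

def ljung_box_q(x, h):
    """Ljung-Box Q statistic for lags 1..h.

    Q = n (n + 2) * sum_{k=1}^{h} acf_k^2 / (n - k).
    Under the null of no autocorrelation, Q is approximately
    chi-squared distributed with h degrees of freedom, so the
    decision compares Q against a chi-squared quantile.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    xc = x - x.mean()
    denom = np.sum(xc ** 2)
    q = 0.0
    for k in range(1, h + 1):
        acf_k = np.sum(xc[k:] * xc[:-k]) / denom  # lag-k sample autocorrelation
        q += acf_k ** 2 / (n - k)
    return n * (n + 2) * q

rng = np.random.default_rng(0)
resid = rng.normal(size=200)          # white-noise residuals: Q should stay small
for h in (8, 12, 16):                  # the lags tested by check_acf()
    print(h, ljung_box_q(resid, h))
```

A strongly autocorrelated series (e.g. a linear trend) yields a very large Q at every tested lag, which is what the autocorrelation check flags.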