View source: R/trans_sae_encode_decode.R
sae_encode_decode | R Documentation
Description

Creates a deep learning stacked autoencoder to encode and decode a sequence of observations. The autoencoder layers are based on the DAL Toolbox Vanilla Autoencoder. It wraps the pytorch library.
Usage

sae_encode_decode(
  input_size,
  encoding_size,
  batch_size = 32,
  num_epochs = 1000,
  learning_rate = 0.001,
  k = 3
)
Arguments

input_size | input size
encoding_size | encoding size
batch_size | size for batch learning
num_epochs | number of epochs for training
learning_rate | learning rate
k | number of AE layers in the stack
Value

A sae_encode_decode object.
Examples

# See example at https://nbviewer.org/github/cefet-rj-dal/daltoolbox-examples
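The sketch below is not an official example. It assumes that sae_encode_decode follows the usual DAL Toolbox fit()/transform() workflow, that the object is exported by a DAL Toolbox package attached via library(daltoolbox), and that a Python environment with pytorch is available (the model wraps the pytorch library). The toy data frame is purely illustrative.

  # assumption: adjust to the package that actually exports sae_encode_decode
  library(daltoolbox)

  # toy data: 100 observations with 5 numeric features
  set.seed(1)
  data <- as.data.frame(matrix(runif(100 * 5), ncol = 5))

  # stacked autoencoder that compresses 5 inputs into a 3-dimensional encoding
  # and reconstructs them back to the original 5 dimensions
  auto <- sae_encode_decode(input_size = 5, encoding_size = 3,
                            batch_size = 32, num_epochs = 100,
                            learning_rate = 0.001, k = 3)

  # assumption: fit() trains the stack and transform() returns the reconstruction
  auto <- fit(auto, data)
  reconstruction <- transform(auto, data)
  dim(reconstruction)  # expected to match the input shape: 100 x 5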