Description

The graph convolutional network (GCN), recurrent neural network (RNN), convolutional neural network (CNN), and multilayer perceptron (MLP) are used as encoders. The last layer of each encoder is a fully connected layer. The units and activation arguments can be given as vectors, in which case the length of the vector determines the number of layers.
Usage

gcn_in_out(max_atoms, feature_dim, gcn_units, gcn_activation,
           fc_units, fc_activation)

rnn_in_out(length_seq, fingerprint_size, embedding_layer = TRUE,
           num_tokens, embedding_dim, rnn_type, rnn_bidirectional,
           rnn_units, rnn_activation, fc_units, fc_activation)

cnn_in_out(length_seq, fingerprint_size, embedding_layer = TRUE,
           num_tokens, embedding_dim, cnn_filters, cnn_kernel_size,
           cnn_activation, fc_units, fc_activation)

mlp_in_out(length_seq, fingerprint_size, embedding_layer = TRUE,
           num_tokens, embedding_dim, fc_units, fc_activation)
Arguments

max_atoms           maximum number of atoms for the GCN
feature_dim         dimension of atom features for the GCN
gcn_units           dimensionality of the output space in the GCN layer
gcn_activation      activation of the GCN layer
fingerprint_size    length of a fingerprint
embedding_layer     use the embedding layer if TRUE (default: TRUE)
embedding_dim       a non-negative integer for the dimension of the dense embedding
length_seq          length of input sequences
num_tokens          total number of distinct strings
cnn_filters         dimensionality of the output space in the CNN layer
cnn_kernel_size     length of the 1D convolution window in the CNN layer
cnn_activation      activation of the CNN layer
rnn_type            "lstm" or "gru"
rnn_bidirectional   use the bidirectional wrapper for the RNN if TRUE
rnn_units           dimensionality of the output space in the RNN layer
rnn_activation      activation of the RNN layer
fc_units            dimensionality of the output space in the fully connected layer
fc_activation       activation of the fully connected layer
Value

input and output tensors of the encoders
Author(s)

Dongmin Jung
See Also

keras::layer_activation, keras::bidirectional, keras::layer_conv_1d, keras::layer_dense, keras::layer_dot, keras::layer_embedding, keras::layer_global_average_pooling_1d, keras::layer_input, keras::layer_lstm, keras::layer_gru, keras::layer_flatten
Examples

gcn_in_out(max_atoms = 50,
           feature_dim = 50,
           gcn_units = c(128, 64),
           gcn_activation = c("relu", "relu"),
           fc_units = c(10),
           fc_activation = c("relu"))
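The other encoders can be called in the same way. The calls below are a minimal sketch built only from the argument list above; all values (sequence length, fingerprint size, token count, layer sizes) are illustrative assumptions, not package defaults.

# Minimal sketches of the sequence-based encoders; every value below is an
# illustrative assumption, not a package default.
# fingerprint_size is presumably only used when embedding_layer = FALSE.

rnn_in_out(length_seq = 100,
           fingerprint_size = 1024,
           embedding_layer = TRUE,
           num_tokens = 30,
           embedding_dim = 16,
           rnn_type = "lstm",
           rnn_bidirectional = TRUE,
           rnn_units = c(64, 32),
           rnn_activation = c("tanh", "tanh"),
           fc_units = c(10),
           fc_activation = c("relu"))

cnn_in_out(length_seq = 100,
           fingerprint_size = 1024,
           embedding_layer = TRUE,
           num_tokens = 30,
           embedding_dim = 16,
           cnn_filters = c(64),
           cnn_kernel_size = c(5),
           cnn_activation = c("relu"),
           fc_units = c(10),
           fc_activation = c("relu"))

mlp_in_out(length_seq = 100,
           fingerprint_size = 1024,
           embedding_layer = TRUE,
           num_tokens = 30,
           embedding_dim = 16,
           fc_units = c(32, 10),
           fc_activation = c("relu", "relu"))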
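The return value holds the input and output tensors of the encoder, which are meant to be passed on when assembling a larger keras model. The element names are not documented on this page, so the sketch below simply inspects the structure of the result.

# Inspect how the input and output tensors are organized in the returned
# object (element names are not documented here, hence str()).
enc <- gcn_in_out(max_atoms = 50,
                  feature_dim = 50,
                  gcn_units = c(128, 64),
                  gcn_activation = c("relu", "relu"),
                  fc_units = c(10),
                  fc_activation = c("relu"))
str(enc, max.level = 1)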