layer_edge_conv: EdgeConv


View source: R/layers_conv.R

Description


An Edge Convolutional layer as presented by Wang et al. (2018).

Mode: single, disjoint.

This layer expects a sparse adjacency matrix.

This layer computes, for each node $i$:

$$ Z_i = \sum_{j \in \mathcal{N}(i)} \textrm{MLP}\big( X_i \,\|\, X_j - X_i \big) $$

where $\textrm{MLP}$ is a multi-layer perceptron and $\|$ denotes concatenation.
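A minimal sketch of this aggregation in base R, for intuition only: the toy graph, the feature matrix, and the single linear map W standing in for the MLP are all hypothetical, and this is not the layer's actual implementation.

set.seed(1)
n_nodes    <- 4
n_features <- 3
channels   <- 2

X <- matrix(rnorm(n_nodes * n_features), n_nodes, n_features)  # node features
A <- matrix(c(0, 1, 1, 0,                                      # adjacency matrix
              1, 0, 1, 0,
              1, 1, 0, 1,
              0, 0, 1, 0), n_nodes, n_nodes, byrow = TRUE)
W <- matrix(rnorm(2 * n_features * channels), 2 * n_features, channels)  # stand-in for the MLP

Z <- matrix(0, n_nodes, channels)
for (i in seq_len(n_nodes)) {
  for (j in which(A[i, ] == 1)) {
    # concatenate X_i with (X_j - X_i), transform, and sum over neighbours
    Z[i, ] <- Z[i, ] + c(X[i, ], X[j, ] - X[i, ]) %*% W
  }
}
Z  # one row of `channels` outputs per node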

Input

node features of shape (n_nodes, n_node_features); binary adjacency matrix of shape (n_nodes, n_nodes).

Output

node features with the same shape as the input, but with the last dimension changed to channels.

Usage

layer_edge_conv(
  object,
  channels,
  mlp_hidden = NULL,
  mlp_activation = "relu",
  activation = NULL,
  use_bias = TRUE,
  kernel_initializer = "glorot_uniform",
  bias_initializer = "zeros",
  kernel_regularizer = NULL,
  bias_regularizer = NULL,
  activity_regularizer = NULL,
  kernel_constraint = NULL,
  bias_constraint = NULL,
  ...
)
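A hedged usage sketch, assuming that rspektral layers compose with keras tensors in the usual pipe style; the layer_input() shapes, the sparse adjacency input, and the surrounding model code are illustrative assumptions rather than documented API.

library(keras)
library(rspektral)

n_node_features <- 3  # illustrative

# Node features and a sparse adjacency matrix as model inputs (assumed shapes)
x_in <- layer_input(shape = n_node_features)
a_in <- layer_input(shape = list(NULL), sparse = TRUE)

# EdgeConv with a two-layer MLP of 64 units each and 32 output channels
out <- list(x_in, a_in) %>%
  layer_edge_conv(
    channels = 32,
    mlp_hidden = list(64, 64),
    mlp_activation = "relu",
    activation = "relu"
  )

model <- keras_model(inputs = list(x_in, a_in), outputs = out)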

Arguments

channels

integer, number of output channels

mlp_hidden

list of integers, number of hidden units for each hidden layer in the MLP (if NULL, the MLP has only the output layer)

mlp_activation

activation for the MLP layers

activation

activation function to use

use_bias

bool, add a bias vector to the output

kernel_initializer

initializer for the weights

bias_initializer

initializer for the bias vector

kernel_regularizer

regularization applied to the weights

bias_regularizer

regularization applied to the bias vector

activity_regularizer

regularization applied to the output

kernel_constraint

constraint applied to the weights

bias_constraint

constraint applied to the bias vector

