layer_appnp: APPNP


View source: R/layers_conv.R

Description


A graph convolutional layer implementing the APPNP operator, as presented by Klicpera et al. (2019).

This layer computes:

Z^{(0)} = \textrm{MLP}(X); \qquad Z^{(K)} = (1 - \alpha) \, \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} \, Z^{(K-1)} + \alpha Z^{(0)},

where \alpha is the teleport probability and \textrm{MLP} is a multi-layer perceptron.
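To make the recurrence concrete, here is a minimal plain-R sketch of the propagation rule on dense matrices. The names normalize_adj and appnp_propagate are hypothetical helpers for illustration only, not part of the package; the layer performs this computation internally (typically on a sparse adjacency).

normalize_adj <- function(A) {
  # \hat{A} = A + I (add self-loops), then symmetric normalization
  # \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2}
  A_hat <- A + diag(nrow(A))
  d_inv_sqrt <- 1 / sqrt(rowSums(A_hat))
  A_hat * outer(d_inv_sqrt, d_inv_sqrt)
}

appnp_propagate <- function(A, Z0, alpha = 0.2, K = 10) {
  # Z0 plays the role of Z^{(0)} = MLP(X), assumed precomputed
  A_norm <- normalize_adj(A)
  Z <- Z0
  for (k in seq_len(K)) {
    Z <- (1 - alpha) * (A_norm %*% Z) + alpha * Z0
  }
  Z
}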

Mode: single, disjoint, mixed, batch.

Input

Node features of shape ([batch], n_nodes, n_node_features);

Modified Laplacian (the symmetrically normalized adjacency \hat{D}^{-1/2} \hat{A} \hat{D}^{-1/2} from the formula above) of shape ([batch], n_nodes, n_nodes).

Output

Node features with the same shape as the input, but with the last dimension changed to channels.

Usage

layer_appnp(
  object,
  channels,
  alpha = 0.2,
  propagations = 1,
  mlp_hidden = NULL,
  mlp_activation = "relu",
  dropout_rate = 0,
  activation = NULL,
  use_bias = TRUE,
  kernel_initializer = "glorot_uniform",
  bias_initializer = "zeros",
  kernel_regularizer = NULL,
  bias_regularizer = NULL,
  activity_regularizer = NULL,
  kernel_constraint = NULL,
  bias_constraint = NULL,
  ...
)
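
A minimal usage sketch follows. It assumes the layer composes like other keras layers in R, taking a list of input tensors (node features and a pre-normalized adjacency) as object; the names x_in and a_in and all dimensions are hypothetical and shown only for illustration.

library(keras)
library(rspektral)

n_nodes <- 2708          # hypothetical graph size
n_node_features <- 1433  # hypothetical feature dimension
n_classes <- 7           # hypothetical number of classes

# Node features and pre-normalized adjacency as model inputs
x_in <- layer_input(shape = n_node_features)
a_in <- layer_input(shape = n_nodes, sparse = TRUE)

out <- layer_appnp(
  list(x_in, a_in),
  channels = n_classes,
  alpha = 0.1,
  propagations = 10,
  mlp_hidden = list(64),
  dropout_rate = 0.5,
  activation = "softmax"
)

model <- keras_model(inputs = list(x_in, a_in), outputs = out)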

Arguments

channels

number of output channels

alpha

teleport probability during propagation

propagations

number of propagation steps

mlp_hidden

list of integers, number of hidden units for each hidden layer in the MLP (if NULL, the MLP has only the output layer)

mlp_activation

activation for the MLP layers

dropout_rate

dropout rate for Laplacian and MLP layers

activation

activation function to use

use_bias

boolean, whether to add a bias vector to the output

kernel_initializer

initializer for the weights

bias_initializer

initializer for the bias vector

kernel_regularizer

regularization applied to the weights

bias_regularizer

regularization applied to the bias vector

activity_regularizer

regularization applied to the output

kernel_constraint

constraint applied to the weights

bias_constraint

constraint applied to the bias vector
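
As an illustration of how the MLP-related arguments interact, the sketch below configures the layer with a two-hidden-layer MLP and L2 weight regularization. x_in and a_in stand for input tensors as in the usage sketch above, and the specific values are arbitrary, not recommendations.

out <- layer_appnp(
  list(x_in, a_in),
  channels = 16,
  alpha = 0.1,
  propagations = 10,
  mlp_hidden = list(64, 64),    # MLP with two hidden layers of 64 units each
  mlp_activation = "relu",
  dropout_rate = 0.5,           # applied to Laplacian and MLP layers
  kernel_regularizer = keras::regularizer_l2(5e-4)
)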

