nn_utils_weight_norm | R Documentation |
Applies weight normalization to a parameter in the given module.
\eqn{\mathbf{w} = g \dfrac{\mathbf{v}}{\|\mathbf{v}\|}}
Weight normalization is a reparameterization that decouples the magnitude of a weight tensor from its direction. This replaces the parameter specified by name (e.g. 'weight') with two parameters: one specifying the magnitude (e.g. 'weight_g') and one specifying the direction (e.g. 'weight_v').
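The reparameterization itself is plain arithmetic and can be illustrated in base R without torch. The sketch below is only an illustration of the formula above, not the package's implementation:

```r
# w = g * v / ||v||: rebuild the effective weight w from a
# direction parameter v and a magnitude parameter g.
v <- c(3, 4)                  # direction (any non-zero vector)
g <- 10                       # magnitude (a scalar here)
w <- g * v / sqrt(sum(v^2))   # normalize v, then rescale to length g
w                             # same direction as v, Euclidean norm g
```

During training, gradients flow to g and v separately, which is what decouples magnitude from direction.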
The original module with the weight_v and weight_g parameters.
new()
nn_utils_weight_norm$new(name, dim)
name
(str, optional): name of weight parameter
dim
(int, optional): dimension over which to compute the norm
compute_weight()
nn_utils_weight_norm$compute_weight(module, name = NULL, dim = NULL)
module
(Module): containing module
name
(str, optional): name of weight parameter
dim
(int, optional): dimension over which to compute the norm
apply()
nn_utils_weight_norm$apply(module, name = NULL, dim = NULL)
module
(Module): containing module
name
(str, optional): name of weight parameter
dim
(int, optional): dimension over which to compute the norm
call()
nn_utils_weight_norm$call(module)
module
(Module): containing module
recompute()
nn_utils_weight_norm$recompute(module)
module
(Module): containing module
remove()
nn_utils_weight_norm$remove(module, name = NULL)
module
(Module): containing module
name
(str, optional): name of weight parameter
clone()
The objects of this class are cloneable with this method.
nn_utils_weight_norm$clone(deep = FALSE)
deep
Whether to make a deep clone.
In PyTorch, weight normalization is implemented via a hook that recomputes the weight tensor from the magnitude and direction before every forward() call. Since torch for R does not yet support hooks, the weight recomputation must be done explicitly inside the forward() definition through a call to the recompute() method. See examples.
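Concretely, a custom module can call recompute() at the top of its forward(). The wrapper below is a hedged sketch of that pattern (it assumes the torch package is installed; the module name and the dim = 2 choice, matching the example on this page, are illustrative):

```r
library(torch)

# Hypothetical weight-normalized linear layer: because R torch has no
# forward hooks, recompute() is called manually on every forward pass.
wn_linear <- nn_module(
  initialize = function(in_features, out_features) {
    self$linear <- nn_linear(in_features, out_features)
    self$weight_norm <- nn_utils_weight_norm$new(name = "weight", dim = 2)
    self$weight_norm$apply(self$linear)  # splits weight into weight_g / weight_v
  },
  forward = function(x) {
    # rebuild linear$weight from weight_g and weight_v before using it
    self$weight_norm$recompute(self$linear)
    self$linear(x)
  }
)
```

Without the recompute() call, the forward pass would keep using a stale weight tensor rather than one derived from the current weight_g and weight_v.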
By default, with dim = 0, the norm is computed independently per output channel/plane. To compute a norm over the entire weight tensor, use dim = NULL.
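As a rough base-R analogy (not torch code), the difference between a per-output-channel norm and a whole-tensor norm for a matrix-shaped weight can be seen as:

```r
w <- matrix(1:6, nrow = 2)  # toy weight: 2 output channels x 3 inputs

# one norm per output channel (row), analogous to the per-channel default
per_channel <- apply(w, 1, function(row) sqrt(sum(row^2)))

# a single norm over the entire tensor, analogous to dim = NULL
whole <- sqrt(sum(w^2))
```

With the per-channel variant, weight_g holds one magnitude per output channel; with dim = NULL it holds a single scalar for the whole tensor.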
References: https://arxiv.org/abs/1602.07868
if (torch_is_installed()) {
  x <- nn_linear(in_features = 20, out_features = 40)
  weight_norm <- nn_utils_weight_norm$new(name = 'weight', dim = 2)
  weight_norm$apply(x)
  x$weight_g$size()  # magnitude parameter
  x$weight_v$size()  # direction parameter, same shape as the original weight
  x$weight
  # the recompute() method recomputes the weight using g and v. It must be
  # called explicitly inside `forward()`.
  weight_norm$recompute(x)
}