neuralop.layers.skip_connections.SoftGating
- class neuralop.layers.skip_connections.SoftGating(in_features, out_features=None, n_dim=2, bias=False)[source]
Applies soft-gating by weighting the channels of the given input
Given an input `x` of size `(batch-size, channels, height, width)`, this returns `x * w` where `w` is of shape `(1, channels, 1, 1)`.
- Parameters:
  - in_features : int
  - out_features : None
      This is provided for API compatibility with nn.Linear only.
  - n_dim : int, default is 2
      Dimensionality of the input (excluding batch-size and channels). `n_dim=2` corresponds to having Module2D.
  - bias : bool, default is False
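The channel-wise gating described above can be pictured with the following minimal sketch. It is not the library's implementation: the class name, the initialization of the weights to ones, and the additive bias behavior are assumptions for illustration only.

```python
import torch
from torch import nn

class SoftGatingSketch(nn.Module):
    """Minimal soft-gating: scale each channel of the input by a learnable weight."""

    def __init__(self, in_features, out_features=None, n_dim=2, bias=False):
        super().__init__()
        # One weight per channel, broadcast over batch and spatial dims,
        # e.g. shape (1, channels, 1, 1) when n_dim=2.
        self.weight = nn.Parameter(torch.ones(1, in_features, *(1,) * n_dim))
        self.bias = nn.Parameter(torch.ones(1, in_features, *(1,) * n_dim)) if bias else None

    def forward(self, x):
        # Channel-wise gating: x * w (+ b), broadcasting over the spatial dimensions.
        if self.bias is not None:
            return x * self.weight + self.bias
        return x * self.weight
```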
Methods

forward(x)
    Applies soft-gating to a batch of activations
- forward(x)[source]
Applies soft-gating to a batch of activations
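A short usage sketch, following the constructor signature documented above; the tensor sizes are arbitrary examples:

```python
import torch
from neuralop.layers.skip_connections import SoftGating

gate = SoftGating(in_features=32, n_dim=2)  # gates 32 channels of a 2D input
x = torch.randn(8, 32, 64, 64)              # (batch-size, channels, height, width)
y = gate(x)                                 # same shape as x, channels rescaled
assert y.shape == x.shape
```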