neuralop.layers.channel_mlp.ChannelMLP
- class neuralop.layers.channel_mlp.ChannelMLP(in_channels, out_channels=None, hidden_channels=None, n_layers=2, n_dim=2, non_linearity=<built-in function gelu>, dropout=0.0)
Multi-layer perceptron applied channel-wise across spatial dimensions.
ChannelMLP applies a series of 1D convolutions and nonlinearities to the channel dimension of input tensors, making it invariant to spatial resolution. This is particularly useful in neural operators where the spatial dimensions may vary but the channel processing should remain consistent.
The implementation uses 1D convolutions with kernel size 1, which perform a linear transformation on the channel dimension at every spatial position while preserving spatial structure. This is more efficient than flattening the spatial dimensions and applying fully connected layers, since it avoids explicit reshaping and transposition of the input tensor.
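As a sanity check on this design, the following minimal PyTorch sketch (the module names here are illustrative, not part of ChannelMLP) shows that a kernel-size-1 Conv1d computes the same channel-wise linear map as an nn.Linear applied at every spatial position:

>>> import torch
>>> import torch.nn as nn
>>> conv = nn.Conv1d(in_channels=4, out_channels=8, kernel_size=1)
>>> linear = nn.Linear(4, 8)
>>> with torch.no_grad():  # share weights: conv weight is (out, in, 1), linear is (out, in)
...     _ = linear.weight.copy_(conv.weight.squeeze(-1))
...     _ = linear.bias.copy_(conv.bias)
>>> x = torch.randn(2, 4, 100)  # (batch, channels, points)
>>> torch.allclose(conv(x), linear(x.transpose(1, 2)).transpose(1, 2), atol=1e-5)
True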
- Parameters:
- in_channels : int
Number of input channels.
- out_channels : int, optional
Number of output channels. If None, defaults to in_channels.
- hidden_channels : int, optional
Number of hidden channels in intermediate layers. If None, defaults to in_channels.
- n_layers : int, optional
Number of linear layers in the MLP, by default 2.
- n_dim : int, optional
Spatial dimension of the input (unused but kept for compatibility), by default 2.
- non_linearity : callable, optional
Nonlinear activation function applied between layers, by default F.gelu.
- dropout : float, optional
Dropout probability applied after each layer except the last. If 0, no dropout is applied, by default 0.0.
Methods
forward(x)
Forward pass through the channel MLP.
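Because the layer only mixes channels, a single instance can be reused across spatial resolutions. A sketch of this, assuming forward accepts (batch, channels, height, width) tensors and internally flattens and restores the spatial dimensions as the resolution-invariance described above suggests:

>>> import torch
>>> from neuralop.layers.channel_mlp import ChannelMLP
>>> mlp = ChannelMLP(in_channels=3, out_channels=16, hidden_channels=32, n_layers=2)
>>> mlp(torch.randn(2, 3, 32, 32)).shape    # coarse grid
torch.Size([2, 16, 32, 32])
>>> mlp(torch.randn(2, 3, 128, 128)).shape  # finer grid, same module
torch.Size([2, 16, 128, 128])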