neuralop.layers.normalization_layers.BatchNorm

class neuralop.layers.normalization_layers.BatchNorm(n_dim: int, num_features: int, **kwargs)[source]

Dimension-agnostic batch normalization layer for neural operators.

BatchNorm normalizes data across the entire batch, computing a single mean and standard deviation for all samples combined. This is the most common form of normalization and is effective when batch statistics are a good approximation of the overall data distribution.
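The per-channel statistics described above can be sketched by hand with NumPy (an illustration of the math, not the library's implementation):

```python
import numpy as np

# Toy batch: 8 samples, 3 channels, a 16x16 spatial grid (NCHW layout).
rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=5.0, size=(8, 3, 16, 16))

# Batch norm computes one mean/std per channel, pooled over the
# batch axis and all spatial axes.
mean = x.mean(axis=(0, 2, 3), keepdims=True)
std = x.std(axis=(0, 2, 3), keepdims=True)
x_hat = (x - mean) / std

# Each channel of the normalized output has ~zero mean and unit std.
print(np.allclose(x_hat.mean(axis=(0, 2, 3)), 0.0, atol=1e-6))  # True
print(np.allclose(x_hat.std(axis=(0, 2, 3)), 1.0, atol=1e-6))   # True
```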

For more than 3 spatial dimensions, the layer automatically flattens the spatial dimensions and uses BatchNorm1d, as PyTorch does not provide a batch norm layer for inputs with more than three spatial dimensions.

Parameters:
n_dim : int

Spatial dimension of input data (e.g., 1 for 1D, 2 for 2D, 3 for 3D). Determined by FNOBlocks.n_dim. If n_dim > 3, spatial dimensions are flattened and BatchNorm1d is used.

num_features : int

Number of channels in the input tensor to be normalized.

**kwargs : dict, optional

Additional parameters to pass to the underlying batch normalization layer. Common parameters include:

  • eps : float, optional

    Small value added to the denominator for numerical stability. Default is 1e-5.

  • momentum : float, optional

    Value used for the running_mean and running_var computation. Default is 0.1.

  • affine : bool, optional

    If True, apply learnable affine transformation. Default is True.

  • track_running_stats : bool, optional

    If True, track running statistics. Default is True.
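To make the eps and affine parameters concrete, here is a minimal NumPy sketch of the standard batch-norm formula they control (an illustration under the documented defaults, not PyTorch's actual implementation):

```python
import numpy as np

def batch_norm(x, weight, bias, eps=1e-5):
    # Normalize per channel over batch + spatial axes (2D spatial data here),
    # then apply the learnable affine transform: y = weight * x_hat + bias.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)  # eps keeps the denominator away from zero
    return weight.reshape(1, -1, 1, 1) * x_hat + bias.reshape(1, -1, 1, 1)

rng = np.random.default_rng(2)
x = rng.normal(size=(4, 3, 8, 8))
weight = np.array([1.0, 2.0, 0.5])  # learnable per-channel scale (affine=True)
bias = np.array([0.0, -1.0, 3.0])   # learnable per-channel shift
y = batch_norm(x, weight, bias)
print(y.shape)  # (4, 3, 8, 8)
```

With affine=False the scale and shift are fixed at 1 and 0, so the layer outputs the normalized tensor directly; momentum and track_running_stats only affect the running statistics used at evaluation time.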

Methods

forward(x)

Apply batch normalization to the input tensor.

forward(x)[source]

Apply batch normalization to the input tensor.