neuralop.layers.differential_conv.FiniteDifferenceConvolution
- class neuralop.layers.differential_conv.FiniteDifferenceConvolution(in_channels, out_channels, n_dim, kernel_size=3, groups=1, padding='periodic')[source]
Finite Difference Convolution Layer
This is the finite difference convolution layer introduced in [1].
It computes a finite difference convolution on a regular grid, which converges to a directional derivative as the grid is refined.
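Why this converges (a brief sketch of the standard Taylor-expansion argument; the notation w_j for the kernel entries, j for the stencil offsets, and h for the grid spacing is introduced here for illustration and is not from [1]):

\frac{1}{h}\sum_{j} w_j\, u(x + j h)
  = \frac{1}{h}\Big(\sum_{j} w_j\Big)\, u(x) + \sum_{j} w_j\, \big(j \cdot \nabla u(x)\big) + O(h)

If the kernel has zero mean, \sum_j w_j = 0, the zeroth-order term vanishes and the expression converges to the directional derivative \big(\sum_j w_j\, j\big) \cdot \nabla u(x) as h \to 0.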
- Parameters:
- in_channels : int
Number of input channels
- out_channels : int
Number of output channels
- n_dim : int
Number of dimensions in the input domain
- kernel_size : int, optional
Odd kernel size of the convolutional finite difference stencil, by default 3
- groups : int, optional
Number of groups into which the channels are split for a grouped convolution, by default 1
- padding : literal {‘periodic’, ‘replicate’, ‘reflect’, ‘zeros’}, optional
Mode of padding applied to the input, by default ‘periodic’. See torch.nn.functional.pad.
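A minimal usage sketch (assuming the layer is called like a standard torch.nn.Module; the batch size, grid resolution, and grid_width below are illustrative assumptions):

import torch
from neuralop.layers.differential_conv import FiniteDifferenceConvolution

# 2D layer mapping 3 input channels to 8 output channels with a 3x3 stencil
diff_conv = FiniteDifferenceConvolution(
    in_channels=3, out_channels=8, n_dim=2, kernel_size=3, padding='periodic'
)

# input sampled on a 64x64 regular grid over the unit square
x = torch.randn(4, 3, 64, 64)      # (batch, in_channels, d_1, d_2)
grid_width = 1.0 / 64              # spacing between adjacent grid points
out = diff_conv(x, grid_width)     # (4, 8, 64, 64)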
Methods
forward(x, grid_width): FiniteDifferenceConvolution's forward pass.
References
[1]: Liu-Schiaffini, M., et al. (2024). “Neural Operators with Localized Integral and Differential Kernels”. ICML 2024, https://arxiv.org/abs/2402.16845.
- forward(x, grid_width)[source]
FiniteDifferenceConvolution’s forward pass. Alternatively, one could center the convolution kernel by subtracting its mean from each kernel entry:
conv(x, kernel - mean(kernel)) / grid_width
- Parameters:
- x : torch.Tensor
input tensor, shape (batch, in_channels, d_1, d_2, …, d_n)
- grid_width : float
discretization size of the input grid
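As a sketch of the kernel-centering alternative described above, here is a 2D version written against raw tensors rather than the layer's internal parameters (the helper name centered_fd_conv2d, the explicit weight argument, and the use of 'circular' padding to stand in for 'periodic' are illustrative assumptions):

import torch
import torch.nn.functional as F

def centered_fd_conv2d(x, weight, grid_width):
    # subtract each filter's mean over its spatial stencil so the kernel sums
    # to zero, then scale by 1/grid_width; on a regular grid this acts as a
    # finite difference approximation of a directional derivative
    centered = weight - weight.mean(dim=(-2, -1), keepdim=True)
    x = F.pad(x, (1, 1, 1, 1), mode='circular')  # periodic padding for a 3x3 stencil
    return F.conv2d(x, centered) / grid_width

# e.g. weight of shape (out_channels, in_channels, 3, 3)
y = centered_fd_conv2d(torch.randn(4, 3, 64, 64), torch.randn(8, 3, 3, 3), 1.0 / 64)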