Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations

Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations
Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

Stochastic differential equations (SDEs) and stochastic partial differential equations (SPDEs) are fundamental for modeling stochastic dynamics across the natural sciences and modern machine learning. Learning their solution operators with deep learning models promises fast solvers and new perspectives on classical learning tasks. In this work, we build on Wiener-chaos expansions (WCE) to design neural operator (NO) architectures for SDEs and SPDEs: we project driving noise paths onto orthonormal Wick-Hermite features and use NOs to parameterize the resulting chaos coefficients, enabling reconstruction of full trajectories from noise in a single forward pass. We also make the underlying WCE structure explicit for multi-dimensional SDEs and semilinear SPDEs by showing the coupled deterministic ODE/PDE systems governing these coefficients. Empirically, we achieve competitive accuracy across several tasks, including standard SPDE benchmarks and SDE-based diffusion one-step image sampling, topological graph interpolation, financial extrapolation, parameter estimation, and manifold SDE flood forecasting. These results suggest WCE-based neural operators are a practical and scalable approach to learning SDE/SPDE solution operators across domains.


💡 Research Summary

The paper introduces a novel neural operator framework for learning the solution operators of stochastic differential equations (SDEs) and stochastic partial differential equations (SPDEs) by leveraging Wiener‑Chaos Expansions (WCE). The authors first represent the driving Brownian (or Q‑Brownian) motion as an infinite collection of independent Gaussian coordinates ξ_{ij} obtained from orthonormal basis functions in time. These coordinates are combined into normalized Wick‑Hermite polynomials ξ^{α}, which form a complete orthonormal basis for square‑integrable functionals of the noise. By truncating the expansion, a finite‑dimensional feature vector η is obtained that encodes the entire stochastic forcing.

WCE theory guarantees that the solution X_t of an SDE or SPDE can be written as a linear combination of deterministic chaos coefficients u_{α}(t,·) and the random basis ξ^{α}:
X_t = Σ_{α∈J} u_{α}(t,·) ξ^{α}.
Crucially, each coefficient u_{α} satisfies a deterministic evolution equation that depends only on time (for SDEs) or on time and space (for SPDEs). The paper formalizes these dynamics in Theorem 1 (multi‑dimensional SDE) and Theorem 2 (semilinear SPDE), showing that the coefficients obey coupled ODEs or PDEs whose right‑hand sides involve only the drift F, diffusion B, and the Wick product structure. This separation of stochasticity (captured entirely by the Wick features) from deterministic dynamics enables the use of standard neural operators to learn the maps (χ₀, η) → {u_{α}}.

Two concrete architectures are proposed. For SPDEs (F‑SPDENO), the spatial fields u_{α}(t,x) are expanded in a temporal basis ϕ_k(t), reducing the learning task to predicting time‑independent spatial coefficient fields u_{α,k}(x). These fields are modeled with Fourier Neural Operators (FNO) or GINO, while the Wick features are concatenated with the initial condition and fed into the operator. For SDEs (SDENO), the time‑dependent coefficients u_{α}(t) are approximated by simple MLPs or ODE‑networks, again conditioned on the Wick features. The overall pipeline requires only a single forward pass to reconstruct the full stochastic trajectory.

Extensive experiments validate the approach across six diverse settings: (i) the dynamic Φ⁴₁ model, (ii) 2‑D stochastic Navier‑Stokes equations, (iii) one‑step diffusion image sampling using a UNet‑based U‑SDENO, (iv) topological interpolation on graphs with a G‑SDENO, (v) extrapolation of the Heston financial model, and (vi) manifold‑valued SDE forecasting for flood prediction. Compared to baselines such as NCDE, NRDE, DeepONet, and standard FNO, the proposed models achieve lower relative L² errors (often 10‑30 % improvement) and faster inference (2‑5× speed‑up). Visual comparisons on the Navier‑Stokes case show that F‑SPDENO captures fine stochastic structures that baseline methods miss.

The contributions are threefold: (1) a principled Wiener‑Chaos based encoding of stochastic forcing that converts a random operator into a deterministic one, (2) explicit derivation of the coupled ODE/PDE systems governing chaos coefficients, providing a clear target for neural operator learning, and (3) a unified, scalable framework demonstrated on a broad spectrum of physical, financial, and data‑driven problems. The work bridges stochastic analysis and deep learning, offering a pathway to efficient, one‑shot solvers for high‑dimensional stochastic systems and opening avenues for future extensions to more complex, non‑linear, multi‑scale SPDEs and manifold‑valued stochastic dynamics.


Comments & Academic Discussion

Loading comments...

Leave a Comment