A coupled Kolmogorov-Arnold Network and Level-Set framework for evolving interfaces
Moving boundary problems are ubiquitous in physical systems, yet their numerical solution is notoriously challenging. Kolmogorov-Arnold Networks (KANs) require significantly smaller architectures than multilayer perceptron (MLP)-based approaches, while retaining expressive power through spline-based activations. We propose a shallow KAN framework combined with a level-set formulation that directly approximates the temperature distribution $T(\mathbf{x},t)$ and the moving interface $\Gamma(t)$, enforcing the governing PDEs, phase equilibrium, and the Stefan condition through physics-informed residuals. Numerical experiments in one and two dimensions show that the framework accurately reconstructs both temperature fields and interface dynamics, highlighting the potential of KANs as a compact and efficient alternative for moving boundary PDEs. We first validate the model against semi-infinite analytical solutions; the model is then extended to 2D using a level-set formulation for interface propagation, solved within the KAN framework. This work demonstrates that KANs can solve complex moving boundary problems without the need for measurement data.
💡 Research Summary
The paper introduces a novel physics‑informed neural network framework that couples Kolmogorov‑Arnold Networks (KANs) with a level‑set formulation to solve moving‑boundary (Stefan) problems without any measurement data. KANs replace the fixed scalar weights of conventional multilayer perceptrons with learnable univariate spline functions on each edge, dramatically reducing the number of trainable parameters while preserving expressive power and mitigating spectral bias. The moving interface is represented implicitly by a signed‑distance level‑set function ϕ(x,t), allowing seamless handling of complex topological changes in two or more dimensions.
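To make the learnable-spline idea concrete, the sketch below implements a single KAN edge as a linear combination of cardinal cubic B-spline basis functions with trainable coefficients. This is a minimal illustration only: the class name `SplineEdge`, the uniform knot grid, and the initialization are assumptions, and the paper's exact basis and parameterization may differ.

```python
import numpy as np

def cubic_bspline(u):
    """Cardinal cubic B-spline kernel with support |u| < 2."""
    u = np.abs(u)
    out = np.zeros_like(u)
    m1 = u < 1
    m2 = (u >= 1) & (u < 2)
    out[m1] = (4 - 6 * u[m1] ** 2 + 3 * u[m1] ** 3) / 6
    out[m2] = (2 - u[m2]) ** 3 / 6
    return out

class SplineEdge:
    """One KAN edge: a learnable univariate function
    phi(x) = sum_i c_i * B3((x - t_i) / h) on a uniform knot grid."""
    def __init__(self, lo=-1.0, hi=1.0, n_knots=8, rng=None):
        rng = rng or np.random.default_rng(0)
        self.h = (hi - lo) / (n_knots - 1)
        # extend the grid by two knots on each side so [lo, hi] is fully covered
        self.knots = lo + self.h * np.arange(-2, n_knots + 2)
        self.coef = rng.normal(0.0, 0.1, self.knots.size)  # trainable coefficients

    def basis(self, x):
        return cubic_bspline((x[:, None] - self.knots[None, :]) / self.h)

    def __call__(self, x):
        return self.basis(x) @ self.coef
```

A full KAN layer would stack one such edge function per input-output pair and sum edge outputs at each node; replacing scalar weights with these spline edges is what reduces parameter count relative to an MLP of comparable expressivity.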
Three KAN subnetworks approximate the solid temperature u_s(x,t), liquid temperature u_ℓ(x,t), and the level‑set field ϕ(x,t). The total loss aggregates several physics‑informed terms: masked diffusion residuals for each phase using smooth characteristic functions H_s(ϕ) and H_ℓ(ϕ); an interface loss that enforces temperature continuity (u_s = u_ℓ = T_m) localized around ϕ = 0 with a Gaussian weighting; the Stefan condition that provides the normal interface velocity V_n, which is extended off the interface to define a velocity field F used in the level‑set advection equation ϕ_t + F|∇ϕ| = 0; an Eikonal regularization enforcing |∇ϕ| = 1; and weakly imposed boundary and initial conditions.
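The loss ingredients above can be sketched as pointwise residual and weighting functions. The specific mollifiers chosen here, a tanh-smoothed Heaviside and a Gaussian bump, and the widths `eps`/`sigma` are illustrative assumptions, not necessarily the paper's exact choices:

```python
import numpy as np

def smooth_heaviside(phi, eps=0.05):
    """Smoothed liquid-phase indicator H_l(phi); the solid mask is H_s = 1 - H_l."""
    return 0.5 * (1.0 + np.tanh(phi / eps))

def interface_weight(phi, sigma=0.05):
    """Gaussian weight localizing the interface losses around the zero level set phi = 0."""
    return np.exp(-(phi / sigma) ** 2)

def eikonal_residual(grad_phi):
    """Pointwise residual of the regularization |grad phi| = 1,
    given gradient components of shape (..., d)."""
    return np.linalg.norm(grad_phi, axis=-1) - 1.0

def advection_residual(phi_t, F, grad_phi):
    """Pointwise residual of the level-set transport phi_t + F |grad phi| = 0."""
    return phi_t + F * np.linalg.norm(grad_phi, axis=-1)
```

In training, these residuals would be evaluated at collocation points, with `phi_t` and `grad_phi` obtained by automatic differentiation of the level-set KAN, and the total loss formed as a weighted sum of mean-squared terms over the phase-masked diffusion residuals, the interface-weighted continuity terms, and the Eikonal and advection residuals.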
Training data are generated adaptively: uniform collocation points over the space‑time domain, additional points concentrated near the interface, and separate sets for boundary and initial conditions. The AdamW optimizer with weight decay, learning‑rate scheduling, gradient clipping, and periodic resampling of collocation points ensures stable convergence.
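The adaptive sampling described above can be sketched as follows for a 1D-in-space problem, with uniform points over the (x, t) domain plus extra points rejection-sampled from a band around the current interface estimate. The band width `band` and the level-set surrogate `phi_fn` are illustrative assumptions:

```python
import numpy as np

def sample_collocation(n_uniform, n_interface, phi_fn,
                       domain=((0.0, 1.0), (0.0, 1.0)), band=0.1, rng=None):
    """Uniform (x, t) collocation points plus points concentrated near the
    interface, obtained by rejection sampling on |phi| < band."""
    rng = rng or np.random.default_rng(0)
    (x0, x1), (t0, t1) = domain
    uniform = rng.uniform([x0, t0], [x1, t1], size=(n_uniform, 2))
    near = np.empty((0, 2))
    while near.shape[0] < n_interface:
        cand = rng.uniform([x0, t0], [x1, t1], size=(4 * n_interface, 2))
        near = np.vstack([near, cand[np.abs(phi_fn(cand)) < band]])
    return uniform, near[:n_interface]
```

During training, a routine like this would be called periodically, with `phi_fn` given by the current level-set network, to refresh the collocation set; boundary- and initial-condition points are sampled separately on the corresponding faces of the space-time domain.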
Numerical experiments include a one‑dimensional two‑phase Stefan problem with an analytically known solution and a two‑dimensional radially symmetric problem with a circular interface. In 1D, three shallow KANs (