Infinite-dimensional generative diffusions via Doob's h-transform


This paper introduces a rigorous framework for defining generative diffusion models in infinite dimensions via Doob’s h-transform. Rather than relying on time reversal of a noising process, a reference diffusion is forced towards the target distribution by an exponential change of measure. Compared to existing methodology, this approach readily generalises to the infinite-dimensional setting, hence offering greater flexibility in the diffusion model. The construction is derived rigorously under verifiable conditions, and bounds with respect to the target measure are established. We show that the forced process under the changed measure can be approximated by minimising a score-matching objective and validate our method on both synthetic and real data.


💡 Research Summary

The paper proposes a mathematically rigorous framework for constructing generative diffusion models directly in infinite‑dimensional Hilbert spaces, bypassing the traditional “noising‑denoising” paradigm that relies on time‑reversal of a forward SDE. The authors observe that in infinite dimensions the lack of a Lebesgue measure makes the definition of marginal densities problematic, and that the forward noising process may fail to converge to a tractable reference distribution within a finite time horizon. Consequently, time‑reversed diffusion models can become unstable or biased, especially when the prescribed noising time is insufficient.

To overcome these issues, the authors employ Doob's h-transform, a classical change-of-measure technique that conditions a diffusion on a future event. They start from a general stochastic differential equation on a separable Hilbert space, schematically of the form

$$\mathrm{d}X_t = b(X_t)\,\mathrm{d}t + \mathrm{d}W_t,$$

where $W$ is a (cylindrical) Wiener process, and then tilt its law by an exponential change of measure so that the conditioned process is forced towards the target distribution at the terminal time.

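To make the h-transform mechanism concrete, the sketch below illustrates it in the simplest finite-dimensional toy case rather than the paper's infinite-dimensional setting: a one-dimensional Brownian motion conditioned on its terminal value. Here $h(t,x) = p(T-t, x, y)$ is the heat kernel, and the transform adds the drift $\nabla_x \log h(t,x) = (y-x)/(T-t)$, recovering the classical Brownian bridge. The function name and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def h_transform_bridge(x0, y, T=1.0, n_steps=1000, seed=0):
    """Euler-Maruyama simulation of a Brownian motion conditioned on
    X_T = y via Doob's h-transform.

    The conditioning adds the drift grad_x log h(t, x), where
    h(t, x) = p(T - t, x, y) is the Gaussian heat kernel, which
    evaluates to (y - x) / (T - t): the Brownian-bridge drift.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        t = k * dt
        drift = (y - x[k]) / (T - t)  # = grad_x log h(t, x[k])
        x[k + 1] = x[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return x

# The conditioned path starts at x0 and is pulled to y at time T.
path = h_transform_bridge(x0=0.0, y=2.0)
```

In the paper's setting the same idea is applied with an infinite-dimensional reference diffusion and an h-function tied to the target measure, with the intractable drift learned by minimising a score-matching objective.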
