Generator-based Graph Generation via Heat Diffusion

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the [Original Paper Viewer] below or the Original ArXiv Source.

Graph generative modelling has become an essential task due to the wide range of applications in chemistry, biology, social networks, and knowledge representation. In this work, we propose a novel framework for generating graphs by adapting the Generator Matching (arXiv:2410.20587) paradigm to graph-structured data. We leverage the graph Laplacian and its associated heat kernel to define a continuous-time diffusion on each graph. The Laplacian serves as the infinitesimal generator of this diffusion, and its heat kernel provides a family of conditional perturbations of the initial graph. A neural network is trained to match this generator by minimising a Bregman divergence between the true generator and a learnable surrogate. Once trained, the surrogate generator is used to simulate a time-reversed diffusion process to sample new graph structures. Our framework unifies and generalises existing diffusion-based graph generative models, injecting domain-specific inductive bias via the Laplacian, while retaining the flexibility of neural approximators. Experimental studies demonstrate that our approach captures structural properties of real and synthetic graphs effectively.


💡 Research Summary

This paper introduces a novel framework for graph generative modelling called G³ (Generator‑based Graph Generation). The authors adapt the recently proposed Generator Matching paradigm (Holderrieth et al., 2024) to the domain of graphs by exploiting the graph Laplacian and its associated heat kernel. The key idea is to define a continuous‑time diffusion directly on the matrix representation of a graph (typically the adjacency matrix) using the symmetric heat kernel Hₛ = exp(‑sL), where L is the combinatorial Laplacian. For an initial graph Y₀ = A, the forward diffusion is Yₛ = Hₛ Y₀ Hₛ, which satisfies the matrix‑valued heat equation dYₛ/ds = ‑(LYₛ + YₛL). This equation admits a closed‑form infinitesimal generator G(Y) = ‑(LY + YL).
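The forward diffusion and its generator can be verified numerically. The sketch below is illustrative only: the example graph (a 4-cycle), the time point, and the finite-difference step are arbitrary choices, not taken from the paper.

```python
# Sketch of the forward heat diffusion Y_s = H_s Y0 H_s and its generator,
# using an arbitrary 4-cycle graph as the example (not from the paper).
import numpy as np
from scipy.linalg import expm

# Adjacency matrix of a 4-cycle: edges 0-1, 1-2, 2-3, 3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian L = D - A

def heat_diffuse(Y0, s):
    """Closed-form forward diffusion: Y_s = H_s Y0 H_s with H_s = exp(-s L)."""
    H = expm(-s * L)
    return H @ Y0 @ H

def generator(Y):
    """Infinitesimal generator G(Y) = -(L Y + Y L)."""
    return -(L @ Y + Y @ L)

# Check that the closed form satisfies the matrix heat equation
# dY_s/ds = -(L Y_s + Y_s L) via a central finite difference.
s, eps = 0.3, 1e-5
Ys = heat_diffuse(A, s)
dY_numeric = (heat_diffuse(A, s + eps) - heat_diffuse(A, s - eps)) / (2 * eps)
print(np.max(np.abs(dY_numeric - generator(Ys))))  # close to 0
```

Because `L` commutes with `expm(-s * L)`, differentiating `H_s Y0 H_s` in `s` pulls out one factor of `-L` on each side, which is exactly the generator stated above.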

Generator Matching differs from the more common Flow Matching approach by targeting the infinitesimal generator of a Markov process rather than a deterministic vector field. The authors choose linear test functionals f_A(X)=⟨A,X⟩, which reduces the Bregman‑divergence loss to a state‑space form: L_GM(θ) = E_{t∼U[0,1], Yₜ∼pₜ}[ D(G(Yₜ), G_θ(Yₜ)) ], where D is the chosen Bregman divergence, so the network G_θ is trained to regress the closed‑form generator directly in state space.
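The state-space loss can be sketched as a Monte-Carlo estimate. The sketch below is a minimal illustration, assuming the squared Frobenius norm as the Bregman divergence; the tiny example graph and the linear surrogate parametrised by a single matrix `theta` are placeholder choices, not the paper's neural architecture.

```python
# Hedged sketch of the state-space Generator Matching loss under the squared
# Frobenius-norm Bregman divergence. The surrogate -(M Y + Y M) is a toy
# stand-in for the paper's neural network.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

# Arbitrary 3-node example graph (a path/star on nodes 0-1, 0-2).
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian

def true_generator(Y):
    return -(L @ Y + Y @ L)

def surrogate_generator(Y, theta):
    # Placeholder surrogate: a learnable matrix theta playing the role of L.
    return -(theta @ Y + Y @ theta)

def gm_loss(theta, n_samples=8):
    """Monte-Carlo estimate of E_t ||G(Y_t) - G_theta(Y_t)||_F^2."""
    total = 0.0
    for _ in range(n_samples):
        t = rng.uniform(0.0, 1.0)
        H = expm(-t * L)
        Yt = H @ A @ H                  # sample from the forward diffusion
        diff = true_generator(Yt) - surrogate_generator(Yt, theta)
        total += np.sum(diff ** 2)
    return total / n_samples

print(gm_loss(np.zeros_like(L)))        # untrained surrogate: positive loss
print(gm_loss(L))                       # theta = L matches the generator: 0.0
```

Setting `theta = L` makes the surrogate coincide with the true generator, driving the loss to zero, which is the fixed point a trained network would approximate.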

