A Likely Geometry of Generative Models
The geometry of generative models serves as the basis for interpolation, model inspection, and more. Unfortunately, most generative models lack a principled notion of geometry without restrictive assumptions on either the model or the data dimension. In this paper, we construct a general geometry, compatible with different metrics and probability distributions, for analyzing generative models without additional training. We consider curves analogous to geodesics, constrained to a suitable data distribution so that they target the high-density regions learned by generative models. We formulate this as a (pseudo-)metric and prove that it corresponds to a Newtonian system on a Riemannian manifold. We show that shortest paths in our framework are characterized by a system of ordinary differential equations and locally correspond to geodesics under a suitable Riemannian metric. Numerically, we derive a novel algorithm to efficiently compute shortest paths and generalized Fréchet means. Quantitatively, we show that curves under our metric traverse regions of higher density than baselines across a range of models and datasets.
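The paper's algorithm itself is not reproduced here. As a minimal, hypothetical sketch of the underlying idea — shortest paths under a density-weighted metric that favor high-density regions — the snippet below discretizes a curve energy of the (assumed) conformal form \(S(x)\,\lVert\dot{x}\rVert^2\) and minimizes it by finite-difference gradient descent. The names `neg_log_density`, `curve_energy`, and `density_weighted_path` are illustrative, and the toy two-mode density stands in for a learned model:

```python
import numpy as np

def neg_log_density(x):
    """Toy stand-in for S(z) = -log p(z): a two-mode Gaussian mixture.
    In the paper's setting, S would come from a trained generative model."""
    centers = np.array([[-2.0, 0.0], [2.0, 0.0]])
    d2 = ((x[None, :] - centers) ** 2).sum(axis=1)
    p = np.exp(-0.5 * d2).sum() / (2.0 * 2.0 * np.pi)
    return -np.log(p + 1e-12)

def curve_energy(points, S):
    """Discretized energy of a polygonal curve under the (assumed)
    conformal metric S(x) * I: S at each segment midpoint times the
    squared segment length, summed over segments."""
    mids = 0.5 * (points[:-1] + points[1:])
    seg2 = ((points[1:] - points[:-1]) ** 2).sum(axis=1)
    return float(sum(S(m) * s for m, s in zip(mids, seg2)))

def density_weighted_path(a, b, S, n=16, steps=200, lr=1e-2, eps=1e-4):
    """Gradient-descend the discrete energy over the interior points,
    starting from the straight line between the fixed endpoints a, b."""
    pts = np.linspace(a, b, n)
    for _ in range(steps):
        grad = np.zeros_like(pts)
        for i in range(1, n - 1):          # endpoints stay fixed
            for j in range(pts.shape[1]):
                up = pts.copy(); up[i, j] += eps
                dn = pts.copy(); dn[i, j] -= eps
                grad[i, j] = (curve_energy(up, S) - curve_energy(dn, S)) / (2 * eps)
        pts[1:-1] -= lr * grad[1:-1]
    return pts
```

The resulting curve lowers the discrete energy relative to the straight-line initialization, bending toward regions where \(S\) is small, i.e., where the density is high.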
💡 Research Summary
The paper addresses a fundamental limitation of modern generative models: the lack of a principled geometric notion that respects the learned data distribution without imposing restrictive assumptions on model architecture or data dimensionality. To overcome this, the authors introduce a general “likely geometry” built on a pseudo‑metric that blends the ambient Riemannian metric of the latent or data space with a scalar field derived from the model’s probability density (typically the negative log‑likelihood).
Formally, given a Riemannian manifold \((M, g)\) with metric tensor \(G(x)\) and a scalar function \(S : M \to \mathbb{R}\) bounded from below (e.g., \(S(z) = -\log p(z)\)), they define a pseudo-inner product in each tangent space.
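The summary's defining equation did not survive extraction. One form consistent with the surrounding description — a conformal scaling of the ambient metric by the bounded-below scalar \(S\) — would be (an assumption for illustration, not necessarily the paper's exact definition):

\[
\langle u, v \rangle^{S}_{x} \;=\; S(x)\, u^{\top} G(x)\, v,
\qquad u, v \in T_{x}M .
\]

Under such a scaling, curves are cheap where \(S\) is small (the density is high), and by the Maupertuis principle geodesics of Jacobi-type metrics of this kind correspond to trajectories of a Newtonian system, which would match the abstract's claim.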