Extremal Alexandrov estimates: singularities, obstacles, and stability


The classical Alexandrov estimate controls the oscillation of a convex function by the mass of its associated Monge-Ampère measure and yields, for two convex functions of $n$ variables with the same boundary values, a sup-norm bound with exponent $1/n$ in the measure discrepancy. We show that this exponent is not optimal in the small-discrepancy regime once one of the functions is non-degenerate in the sense of having Monge-Ampère density bounded above and below by two positive constants. We prove sharp quantitative estimates comparing two convex functions by the total variation of the difference of their Monge-Ampère measures: in dimensions $n\ge 3$ the optimal dependence is quadratic in the natural mass scale, while in dimension $n=2$ the optimal dependence contains a logarithmic correction. These rates are shown to be optimal for all small discrepancies. A key structural ingredient is a characterization of extremizers. We identify the pointwise minimizers and maximizers in the admissible class and prove that they are realized, respectively, by solutions to Monge-Ampère equations with an isolated singularity and by solutions to Monge-Ampère equations with a linear obstacle. This extremal description reduces the sharp estimates to a precise asymptotic analysis of these two model configurations. Assuming further that the domain and the non-degenerate reference function are $C^{2,\alpha}$ and uniformly convex, we obtain sharp pointwise two-sided asymptotics at interior points with explicit leading constants. Finally, in dimensions $n\ge 3$ we establish a stability phenomenon: if the pointwise estimate is nearly saturated, then the measure discrepancy must concentrate near the point at the natural scale, quantifying rigidity of almost-extremal configurations.


💡 Research Summary

The paper revisits the classical Alexandrov estimate, which bounds the oscillation of a convex function by the total mass of its Monge‑Ampère measure, and shows that the familiar exponent 1/n is not optimal when the discrepancy between two convex functions is small and the reference function is non‑degenerate (i.e., its Hessian determinant is bounded above and below by positive constants). The authors consider two convex functions u and φ on a bounded convex domain Ω with the same boundary values and study the quantity a = (ω_n^{-1} n |M_u – M_φ|(Ω))^{1/n}, which measures the size of the total variation of the difference of their Monge‑Ampère measures. Their main contributions are:

  1. Sharp quantitative estimates: For dimensions n≥3 they prove that ‖u−φ‖_{L^∞(Ω)} ≤ C a^2, while for n=2 the optimal bound carries a logarithmic correction, namely ‖u−φ‖_{L^∞(Ω)} ≤ C a^2 (|log a|+1). These estimates improve the classical a^{1/n}-type bound and are shown to be optimal for all sufficiently small a.
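Written side by side (note that, with a as defined above, the classical Alexandrov comparison is linear in a, since a^n is proportional to the total mass), the sharpened bounds read:

```latex
% Classical Alexandrov-type comparison: exponent 1/n in the mass,
% i.e. linear in the normalized scale a:
%   \|u-\varphi\|_{L^\infty(\Omega)} \le C\, a
% Sharpened small-discrepancy rates proved in the paper:
\|u-\varphi\|_{L^\infty(\Omega)} \;\le\; C\, a^{2}
  \qquad (n \ge 3),
\qquad
\|u-\varphi\|_{L^\infty(\Omega)} \;\le\; C\, a^{2}\bigl(|\log a| + 1\bigr)
  \qquad (n = 2).
```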

  2. Extremal configurations: The paper identifies the extremal (maximizing and minimizing) configurations for the pointwise comparison problem. The lower extremizer is a solution of a Monge‑Ampère equation with an isolated Dirac singularity at the point of interest, i.e. det D^2 w = det D^2 φ + ω_n a^n δ_{x0}. The upper extremizer is a solution of a Monge‑Ampère obstacle problem where a linear supporting plane of φ is raised by a constant, creating a non‑trivial coincidence set. These two model solutions are dual to each other (the latter is the Legendre transform of the former in the quadratic case) and fully capture the optimal constants.
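In display form, the two model configurations described above can be summarized as follows; the singular equation is taken verbatim from the summary, while the obstacle formulation is schematic (the constant lift c_a and the complementarity conditions are notation assumed here, with the precise admissible class specified in the paper):

```latex
% Lower extremizer: Monge-Ampere equation with an isolated Dirac mass at x_0
\det D^2 w \;=\; \det D^2 \varphi \;+\; \omega_n\, a^n\, \delta_{x_0}
  \quad \text{in } \Omega,
  \qquad w = \varphi \ \text{on } \partial\Omega .

% Upper extremizer (schematic): a supporting plane \ell of \varphi at x_0,
% raised by an assumed constant c_a > 0, acts as an obstacle from below,
% producing a coincidence set \{ v = \ell + c_a \}:
v \;\ge\; \ell + c_a
  \quad \text{in } \Omega,
  \qquad
\det D^2 v \;=\; \det D^2 \varphi
  \quad \text{on } \{ v > \ell + c_a \},
  \qquad v = \varphi \ \text{on } \partial\Omega .
```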

  3. Pointwise asymptotics: Assuming φ∈C^{2,α}_+(Ω) and ∂Ω∈C^{2,α}, the authors obtain precise two‑sided asymptotics at any interior point x0. In dimension two the deviation u(x0)−φ(x0), normalized by the scale a^2|log a|, is controlled from above and below with explicit leading constant of the form −½ λ_0^{−1} (with λ_0 determined by φ at x0), up to lower‑order corrections; in dimensions n≥3 the analogous asymptotics hold at the scale a^2. Moreover, in dimensions n≥3 a stability result shows that near‑saturation of the pointwise estimate forces the measure discrepancy to concentrate near x0 at the natural scale.

