Sample complexity for divergence regularized optimal transport with radial cost
We prove a new sample complexity result for divergence regularized optimal transport. Our bound holds for probability measures on~$\mathbb{R}^d$ with exponential tail decay and for radial cost functions that satisfy a local Lipschitz condition. It is sharp up to logarithmic factors, and captures the intrinsic dimension of the marginal distributions through a generalized covering number of their supports. Examples that fit into our framework include subexponential and subgaussian distributions, together with radial cost functions $c(x,y)=|x-y|^p$ for $p\ge 1$ regularized by the logarithmic entropy or a polynomial $\alpha$-divergence.
💡 Research Summary
The paper establishes new sample‑complexity bounds for divergence‑regularized optimal transport (ROT) when the underlying probability measures have exponential‑type tails and the cost function is radial and locally Lipschitz. The authors consider probability measures μ and ν on ℝᵈ that satisfy a tail condition of the form μ(Bᶜ_r) ≤ exp(−c_μ r^{α_μ}) (and similarly for ν), covering sub‑Gaussian (α=2), sub‑exponential (α=1) and more general Orlicz‑type tails. The cost is assumed to be of the form c(x,y)=h(‖x−y‖) with h(0)=0 and a local Lipschitz condition |h(t)−h(s)| ≤ C_p |t−s| (t∨s)^{p−1}, which includes the standard cost |x−y|ᵖ for p≥1. The regularizer is a φ‑divergence with φ strictly convex, φ(1)=0, φ(∞)=∞, and a conjugate ψ that is C¹ and satisfies a growth condition ψ″(x) ≤ C_ψ |x|^{γ} for large x. This framework encompasses entropic OT (EOT) with φ(x)=x log x and polynomial α‑divergences (including quadratic OT, QOT, when α=2).
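The local Lipschitz condition on h can be checked directly for the power cost. Taking h(t)=t^p as an illustrative instance, the mean value theorem gives |t^p − s^p| = p ξ^{p−1}|t−s| for some ξ between s and t, so the condition holds with C_p = p. A minimal numerical sanity check of this bound (the choice p=3 and the sampling range are arbitrary, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 3.0      # exponent of the radial cost h(t) = t^p (illustrative choice)
C_p = p      # mean value theorem: |t^p - s^p| <= p |t - s| max(t, s)^(p-1)

t = rng.uniform(0.0, 10.0, size=100_000)
s = rng.uniform(0.0, 10.0, size=100_000)

lhs = np.abs(t ** p - s ** p)
rhs = C_p * np.abs(t - s) * np.maximum(t, s) ** (p - 1)

# the local Lipschitz bound holds pointwise (tolerance for float rounding)
assert np.all(lhs <= rhs + 1e-9)
print("bound verified on", t.size, "random pairs")
```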
The main result (Theorem 2.5) provides an explicit upper bound on the expected absolute error between the empirical ROT cost C_ε(μ_n, ν_n) (computed from n i.i.d. samples of μ and ν) and the population ROT cost C_ε(μ, ν). The bound scales as
E |C_ε(μ_n, ν_n) − C_ε(μ, ν)| ≤ C n^{−1/p}
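To make the quantities in the bound concrete, here is a minimal sketch of the plug‑in estimator C_ε(μ_n, ν_n) in the entropic case φ(x)=x log x with cost |x−y|ᵖ, computed via standard log‑domain Sinkhorn iterations. This is only an illustration of the object being analyzed: the paper studies the estimator's statistical error, not any particular solver, and the Gaussian sample distributions below are an arbitrary sub‑Gaussian example.

```python
import numpy as np

def eot_cost(x, y, eps=1.0, p=2.0, iters=300):
    """Plug-in estimate of the entropic ROT cost C_eps(mu_n, nu_n) for the
    radial cost c(x, y) = |x - y|^p, via log-domain Sinkhorn iterations
    (a sketch; any EOT solver would do)."""
    n, m = len(x), len(y)
    C = np.abs(x[:, None] - y[None, :]) ** p   # cost matrix
    loga, logb = -np.log(n), -np.log(m)        # uniform empirical weights
    f, g = np.zeros(n), np.zeros(m)
    for _ in range(iters):
        # alternating dual updates, written with log-sum-exp for stability
        f = -eps * np.logaddexp.reduce(logb + (g[None, :] - C) / eps, axis=1)
        g = -eps * np.logaddexp.reduce(loga + (f[:, None] - C) / eps, axis=0)
    # at convergence the dual value equals the primal EOT cost
    return f.mean() + g.mean()

rng = np.random.default_rng(1)
mu_n = rng.normal(0.0, 1.0, size=400)   # n samples from mu (sub-Gaussian)
nu_n = rng.normal(2.0, 1.0, size=400)   # n samples from nu
print("empirical EOT cost:", eot_cost(mu_n, nu_n, eps=1.0, p=2.0))
```

Repeating this for growing n and comparing against a large‑sample reference would trace out the convergence rate that Theorem 2.5 controls.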