Bayesian Quantile Estimation and Regression with Martingale Posteriors
Quantile estimation and regression within the Bayesian framework are challenging, as the choice of likelihood and prior is not obvious. In this paper, we introduce a novel Bayesian nonparametric method for quantile estimation and regression based on the recently introduced martingale posterior (MP) framework. The core idea of the MP is that posterior sampling is equivalent to predictive imputation, which allows us to break free of the stringent likelihood-prior specification. We demonstrate that a recursive estimate of a smooth quantile function, subject to a martingale condition, is entirely sufficient for full nonparametric Bayesian inference. We term the resulting posterior distribution the quantile martingale posterior (QMP), which arises from an implicit generative predictive distribution. Associated with the QMP is an expedient, MCMC-free and parallelizable posterior computation scheme, which can be further accelerated with an asymptotic approximation based on a Gaussian process. Furthermore, the well-known issue of monotonicity in quantile estimation is naturally alleviated through increasing rearrangement, due to the connections to the Bayesian bootstrap. Finally, the QMP has a particularly tractable form that allows for comprehensive theoretical study, which forms a main focus of the work. We demonstrate the ease of posterior computation in simulations and real data experiments.
💡 Research Summary
This paper introduces a novel Bayesian nonparametric framework for quantile estimation and quantile regression called the Quantile Martingale Posterior (QMP). Traditional Bayesian approaches to quantile analysis rely on specifying a likelihood (most commonly the asymmetric Laplace distribution) and a prior over quantile functions, which is both technically demanding and computationally intensive due to the need for MCMC. Moreover, constructing priors that respect the monotonicity of quantile functions is non-trivial, and the resulting posterior often suffers from quantile crossing.
The authors build on the recently proposed Martingale Posterior (MP) framework, which replaces the likelihood-prior paradigm with a sequence of predictive distributions ({P_n}). In MP, posterior sampling is equivalent to “predictive resampling”: one repeatedly draws future observations (Y_{n+1}, Y_{n+2}, \dots) from the predictive distributions conditioned on the data seen so far, and then computes the quantity of interest from the enlarged dataset. The key requirement for MP is a martingale coherence condition, (E[P_{n+1}(A) \mid Y_{1:n}] = P_n(A)) for every measurable set (A), which ensures that the sequence of predictives is coherent and that the quantity of interest converges almost surely as the imputation horizon grows.
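The predictive-resampling loop above can be sketched concretely. The snippet below is an illustrative toy, not the paper's QMP: it uses the simplest coherent predictive, the Pólya-urn / Bayesian-bootstrap predictive (each future observation is drawn uniformly from the current pool of observed plus imputed points), and reads off a quantile from the enlarged dataset. Repeating the forward simulation independently yields posterior draws, which is what makes the scheme MCMC-free and parallelizable. The function name and parameters are our own for illustration.

```python
import numpy as np

def predictive_resample_quantile(y, tau=0.5, n_forward=1000, rng=None):
    """One martingale-posterior draw of the tau-quantile via predictive resampling.

    Illustrative sketch using the Polya-urn / Bayesian bootstrap predictive,
    under which each future Y_{n+i} is drawn uniformly from the current
    (observed + imputed) sample. This is not the paper's smoothed QMP
    predictive, but it satisfies the same martingale coherence condition.
    """
    rng = np.random.default_rng(rng)
    sample = list(y)
    for _ in range(n_forward):
        # Draw Y_{n+i} from the current predictive (uniform over the pool);
        # appending it updates the predictive for the next step.
        sample.append(sample[rng.integers(len(sample))])
    # Quantity of interest computed on the enlarged dataset.
    return float(np.quantile(sample, tau))

# Independent forward simulations give independent posterior draws
# (each run is embarrassingly parallel).
rng = np.random.default_rng(0)
data = rng.normal(size=50)
draws = [predictive_resample_quantile(data, tau=0.5, rng=s) for s in range(200)]
print(np.mean(draws), np.std(draws))  # posterior mean and spread of the median
```

Because every imputed point is a copy of an existing one, this toy predictive concentrates the posterior on quantiles of reweighted data, which is exactly the Bayesian-bootstrap connection the paper exploits for monotone rearrangement.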