Bayesian Conformal Prediction as a Decision Risk Problem

Notice: This research summary and analysis were automatically generated using AI technology. For absolute accuracy, please refer to the original arXiv source.

Bayesian Conformal Prediction (BCP) uses Bayesian posterior predictive densities as non-conformity scores and Bayesian quadrature to estimate and minimise the expected prediction set size. Operating within a split conformal framework, BCP provides valid coverage guarantees and maintains reliable empirical coverage under model misspecification. Across regression and classification tasks, including distribution-shifted settings such as ImageNet-A, BCP yields prediction sets comparable in size to those of split conformal prediction, while exhibiting substantially lower run-to-run variability in set size. In sparse regression at a nominal coverage level of 80 percent, BCP achieves 81 percent empirical coverage under a misspecified prior, whereas Bayesian credible intervals under-cover at 49 percent.
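To make the reported numbers concrete: "empirical coverage" is simply the fraction of held-out labels falling inside their prediction sets, compared against the nominal level. The sketch below is illustrative only; the intervals and data are stand-ins (an assumed 80 percent Gaussian interval), not the paper's BCP or credible intervals.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical test labels and prediction intervals; in practice the
# intervals would come from BCP or a Bayesian credible interval on a
# held-out test split.
y_test = rng.normal(size=1000)
lower = -1.28 * np.ones(1000)  # assumed lower bounds of 80% intervals
upper = 1.28 * np.ones(1000)   # assumed upper bounds

# Empirical coverage: fraction of labels inside their interval,
# to be compared against the nominal level 1 - alpha = 0.8.
covered = (lower <= y_test) & (y_test <= upper)
print(covered.mean())
```

A well-calibrated method prints a value near the nominal 0.8; a misspecified Bayesian model can drift far below it (the 49 percent figure above), while conformal methods retain the guarantee by construction.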


💡 Research Summary

The paper introduces Bayesian Conformal Prediction (BCP), a novel framework that integrates Bayesian posterior predictive densities as non‑conformity scores with Bayesian quadrature (BQ) to directly minimise the expected size of prediction sets while preserving finite‑sample coverage guarantees. Operating within the split‑conformal setting, BCP treats the conformal threshold λ as a decision variable and formulates a constrained optimisation problem: minimise the expected set size E_X[|C_λ(X)|] subject to the coverage constraint P(Y ∈ C_λ(X)) ≥ 1 − α.
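The split-conformal mechanics behind this setup can be sketched in a few lines: score calibration points by their negative posterior predictive density, take the appropriate empirical quantile as the threshold λ, and return as the prediction set all labels whose density clears it. Everything model-specific here is an assumption for illustration (a fixed Gaussian predictive with mean 2x and unit variance); the paper's actual contribution, choosing λ via Bayesian quadrature to minimise expected set size, is not implemented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative "posterior predictive": N(2x, 1). A stand-in for a real
# Bayesian model's predictive density, not the paper's construction.
def predictive_density(x, y):
    return norm.pdf(y, loc=2.0 * x, scale=1.0)

# Calibration split; non-conformity score = negative predictive density,
# so surprising (low-density) points get high scores.
x_cal = rng.uniform(-2, 2, size=500)
y_cal = 2.0 * x_cal + rng.normal(0.0, 1.0, size=500)
scores = -predictive_density(x_cal, y_cal)

# Split-conformal threshold lambda: the ceil((n+1)(1-alpha))-th smallest
# score, giving finite-sample coverage >= 1 - alpha.
alpha = 0.2
n = len(scores)
k = int(np.ceil((n + 1) * (1 - alpha)))
lam = np.sort(scores)[k - 1]

# Prediction set at a new input: all y whose score is <= lambda,
# i.e. whose predictive density is high enough (a highest-density set).
y_grid = np.linspace(-10.0, 10.0, 2001)
pred_set = y_grid[-predictive_density(0.5, y_grid) <= lam]
print(pred_set.min(), pred_set.max())
```

Because the score is a negative density, the resulting set is a highest-predictive-density region, which is why tying λ to the predictive density makes minimising expected set size a natural objective.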

