Adaptive Importance Sampling in General Mixture Classes


In this paper, we propose an adaptive algorithm that iteratively updates both the weights and component parameters of a mixture importance sampling density so as to optimise the importance sampling performances, as measured by an entropy criterion. The method is shown to be applicable to a wide class of importance sampling densities, which includes in particular mixtures of multivariate Student t distributions. The performances of the proposed scheme are studied on both artificial and real examples, highlighting in particular the benefit of a novel Rao-Blackwellisation device which can be easily incorporated in the updating scheme.


💡 Research Summary

The paper introduces a novel adaptive importance sampling (AIS) algorithm that simultaneously updates the mixture weights and the component parameters of a proposal distribution, extending the scope of Population Monte Carlo (PMC) methods. Traditional PMC schemes adjust only the mixture weights while keeping the component densities fixed, which limits their ability to adapt to complex target distributions, especially in high‑dimensional settings.

The authors consider a general mixture proposal
 q_{α,θ}(x)=∑_{d=1}^{D} α_d q_d(x;θ_d)
where each component q_d belongs to a flexible family (e.g., multivariate Student‑t). The performance criterion is the Kullback‑Leibler (KL) divergence between the target density π (known up to a normalising constant) and the proposal:
 E(π, q_{α,θ}) = ∫ π(x) log( π(x) / q_{α,θ}(x) ) dx,
which the algorithm minimises jointly over the weights α and the component parameters θ.
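The resulting update has a closed, EM-like form for exponential-family components: each adaptation step draws from the current mixture, computes self-normalised importance weights, and re-estimates the weights and parameters from Rao-Blackwellised component responsibilities (the full mixture density is used in the weights, not just the sampled component). The sketch below illustrates this scheme in one dimension with a Gaussian mixture proposal rather than the paper's Student-t components; the toy target and all function names are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalised log-density of a toy bimodal target:
    # equal mixture of N(-2, 1) and N(2, 1).
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def ais_step(alpha, mu, sigma, n=5000):
    """One adaptation step: sample from the mixture proposal, compute
    self-normalised importance weights, and update (alpha, mu) by
    importance-weighted, Rao-Blackwellised moment matching."""
    D = len(alpha)
    comp = rng.choice(D, size=n, p=alpha)                # component labels
    x = mu[comp] + sigma[comp] * rng.standard_normal(n)  # proposal draws
    # Per-component densities at every sample (n x D); the importance
    # weight uses the full mixture density (Rao-Blackwellisation).
    dens = (alpha * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
            / (sigma * np.sqrt(2 * np.pi)))
    q = dens.sum(axis=1)
    w = np.exp(log_target(x)) / q
    w /= w.sum()                                         # self-normalise
    rho = dens / dens.sum(axis=1, keepdims=True)         # responsibilities
    new_alpha = (w[:, None] * rho).sum(axis=0)           # weight update
    new_mu = (w[:, None] * rho * x[:, None]).sum(axis=0) / new_alpha
    return new_alpha / new_alpha.sum(), new_mu

# Deliberately mis-placed initial proposal; the means migrate to ±2.
alpha = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
sigma = np.array([2.0, 2.0])
for _ in range(20):
    alpha, mu = ais_step(alpha, mu, sigma)
```

The updates new_alpha = E_w[ρ_d(x)] and new_mu = E_w[ρ_d(x) x] / new_alpha are the importance-sampling estimates of the integrated-EM fixed point that decreases the KL criterion above; extending them to covariances (and to Student-t degrees of freedom) follows the same weighted moment-matching pattern.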

