A Kullback-Leibler divergence test for multivariate extremes: theory and practice

Notice: This research summary and analysis were generated automatically using AI. For full accuracy, please refer to the original arXiv source.

Testing whether two multivariate samples exhibit the same extremal behavior is an important problem in various fields including environmental and climate sciences. While several ad-hoc approaches exist in the literature, they often lack theoretical justification and statistical guarantees. On the other hand, extreme value theory provides the theoretical foundation for constructing asymptotically justified tests. We combine this theory with Kullback-Leibler divergence, a fundamental concept in information theory and statistics, to propose a test for equality of extremal dependence structures in practically relevant directions. Under suitable assumptions, we derive the limiting distributions of the proposed statistic under null and alternative hypotheses. Importantly, our test is fast to compute and easy to interpret by practitioners, making it attractive in applications. Simulations provide evidence of the power of our test. In a case study, we apply our method to show the strong impact of seasons on the strength of dependence between different aggregation periods (daily versus hourly) of heavy rainfall in France.


💡 Research Summary

The paper addresses the problem of testing whether two multivariate samples share the same extremal dependence structure, a question that arises in climate attribution, systemic risk, hydrology, and other fields where extreme events are of interest. Existing approaches either rely on simple summary measures such as the extremal correlation χ(v) – which often lack power – or on full copula modelling, which is computationally demanding and requires substantial expertise.

The authors propose a novel two‑sample test that combines multivariate regular variation (MRV) theory with the Kullback‑Leibler (KL) divergence. First, each original vector X̃ and Ỹ is transformed component‑wise to Pareto margins using the marginal distribution functions (known or estimated). This standardisation preserves the underlying copula, so the extremal dependence can be examined on the transformed vectors X and Y.

A homogeneous risk functional r then identifies the extreme observations: those with r(X) above a high threshold are retained, corresponding to the practically relevant direction in which the extremal dependence structures of the two samples are compared.
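To make this concrete, here is a hedged sketch of the extreme-selection step with a 1-homogeneous risk functional. The sup-norm used as the default `r` is only an illustrative choice (the paper may use other functionals); the function keeps the exceedances of a high empirical quantile of r and returns their self-normalised "angular" parts, which carry the extremal dependence information.

```python
import numpy as np

def extremes_by_risk(X, r=lambda x: x.max(axis=1), q=0.95):
    """Keep observations whose risk r(X) exceeds the empirical q-quantile
    and return their angular parts X / r(X). `r` should be positively
    homogeneous; the sup-norm default is an illustrative assumption."""
    risk = r(X)
    u = np.quantile(risk, q)        # high empirical threshold
    exc = X[risk > u]               # extreme observations
    return exc / r(exc)[:, None]    # self-normalised directions

# Example on simulated standard-Pareto data.
rng = np.random.default_rng(1)
P = 1.0 / rng.uniform(size=(2000, 3))   # independent standard Pareto margins
angles = extremes_by_risk(P, q=0.95)
```

A divergence between the angular samples obtained from the two data sets is then what a KL-based two-sample statistic would compare.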

