Model-free Analysis of Scattering and Imaging Data with Escort-Weighted Shannon Entropy and Divergence Matrices


We demonstrate a model-free data analysis framework that leverages escort-weighted Shannon entropy and several divergence matrices to detect phase transitions in scattering and imaging datasets. By establishing a connection between physical entropy and informational entropy, this approach provides a sensitive method for identifying phase transitions without an explicit physical model or order parameter. We further show that pairwise divergence matrices, including the Kullback-Leibler divergence, Jeffrey divergence, Jensen-Shannon divergence, and antisymmetric Kullback-Leibler divergence, provide more comprehensive measures of statistical change than scalar entropy alone. Our approach successfully detects the onset of both long- and short-range order in neutron and X-ray scattering data, as well as a non-trivial phase transition in magnetic skyrmion lattices observed through Lorentz transmission electron microscopy. These results establish a framework for automated, model-free analysis of experimental data with broad applications in materials science and condensed matter physics.


💡 Research Summary

The authors present a fully model‑free framework for detecting phase transitions directly from scattering and imaging data by exploiting concepts from information theory. They begin by postulating a one‑to‑one mapping between the physical state distribution of a material and the probability distribution of measured intensities (or pixel values) in an experiment. The standard Shannon entropy, S = −∑p_i ln p_i, quantifies the overall disorder of a dataset but is often too coarse to reveal subtle structural changes associated with phase transitions. To increase sensitivity, the authors introduce an “escort distribution” defined as p_i^β / ∑p_j^β, where the exponent β (also denoted n) acts as an artificial temperature that controls the weighting of each intensity. β = 1 recovers the ordinary Shannon entropy; β > 1 emphasizes high‑intensity pixels, while β < 1 smooths the distribution. By tuning β, background noise can be suppressed without discarding physically relevant features.
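As a minimal sketch (not code from the paper), the escort-weighted entropy described above can be computed from a normalized intensity histogram; the function name and the toy histogram are illustrative:

```python
import numpy as np

def escort_entropy(p, beta=1.0):
    """Shannon entropy of the escort distribution p_i^beta / sum_j p_j^beta.

    beta = 1 recovers the ordinary Shannon entropy; beta > 1 emphasizes
    high-intensity bins, while beta < 1 flattens the distribution.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # drop empty bins to avoid log(0)
    esc = p**beta / np.sum(p**beta)    # escort (reweighted) distribution
    return -np.sum(esc * np.log(esc))

# Toy histogram of pixel intensities (hypothetical data)
hist = np.array([0.7, 0.2, 0.05, 0.05])
S1 = escort_entropy(hist, beta=1.0)   # ordinary Shannon entropy
S2 = escort_entropy(hist, beta=2.0)   # sharpened: weak bins suppressed
```

Because β > 1 concentrates weight on the strongest bins, S2 is smaller than S1 for any non-uniform histogram, which is exactly the background-suppression effect the authors exploit.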

Relative entropy between two experimental conditions (e.g., temperatures T₁ and T₂) is quantified using the Kullback-Leibler divergence (KLD): D_KL(P‖Q) = ∑p_i ln(p_i/q_i). Because the KLD is asymmetric, the authors also compute symmetric counterparts: the Jeffrey divergence JD(P‖Q) = D_KL(P‖Q) + D_KL(Q‖P) and the Jensen-Shannon divergence JSD(P‖Q) = ½D_KL(P‖M) + ½D_KL(Q‖M), where M = ½(P + Q) is the average of the two distributions.
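A brief sketch of these divergences and the pairwise matrix construction, under the assumption that each experimental condition is summarized by a normalized intensity histogram (function names are illustrative, not from the paper):

```python
import numpy as np

def kld(p, q, eps=1e-12):
    """Kullback-Leibler divergence D_KL(P||Q); eps regularizes empty bins."""
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return np.sum(p * np.log(p / q))

def jeffrey(p, q):
    """Symmetrized KLD: JD = D_KL(P||Q) + D_KL(Q||P)."""
    return kld(p, q) + kld(q, p)

def jsd(p, q):
    """Jensen-Shannon divergence via the mixture M = (P + Q) / 2."""
    m = 0.5 * (np.asarray(p, float) + np.asarray(q, float))
    return 0.5 * kld(p, m) + 0.5 * kld(q, m)

def divergence_matrix(hists, div=jsd):
    """Pairwise divergence matrix over a list of histograms
    (e.g., one histogram per measured temperature)."""
    n = len(hists)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = div(hists[i], hists[j])
    return D
```

In this picture, rows and columns of the matrix index experimental conditions, and a phase transition shows up as a block structure or an abrupt change in the off-diagonal entries.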

