Testing time order and Leggett-Garg inequalities with noninvasive measurements on public quantum computers


We demonstrate the first violation of the Leggett-Garg inequality and time-order noninvariance on public quantum computers using genuine noninvasive measurements. By gathering sufficiently large statistics, we have been able to violate the Leggett-Garg inequality and time-order invariance. A detailed analysis of the data on 10 qubit sets from 5 devices available on IBM Quantum and one on IonQ reveals violations beyond 5 standard deviations in almost all cases. We implemented our protocols using fractional gates, newly available on the IBM Heron devices, allowing us to benchmark them in application to weak measurements. The noninvasiveness is supported by qualitative and quantitative agreement with the model of weak disturbance. Moreover, our data expose statistically significant deviations from theoretical predictions that exceed declared device error rates, establishing weak measurement protocols as a sensitive benchmark for quantum hardware. These advances transform public quantum computers into practical testbeds for probing foundational questions of realism and temporal order with unprecedented accessibility and precision.


💡 Research Summary

This paper reports the first experimental violation of the Leggett‑Garg (LG) inequality and of time‑order invariance on publicly accessible quantum computers, achieved through genuinely non‑invasive weak measurements. The authors exploit IBM Quantum’s Heron‑generation devices and IonQ’s trapped‑ion platform to implement weak‑measurement protocols built from the fractional rotation gates newly available on Heron, together with standard two‑qubit gates such as the echoed cross‑resonance (ECR) gate. By coupling system qubits to ancilla “meter” qubits with a tunable interaction strength λ (realized as sin θ via fractional gates), they obtain a measurement disturbance that scales as λ², in agreement with the theoretical weak‑measurement model derived from Kraus operators.
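
The λ² scaling of the disturbance can be illustrated with the standard Kraus‑operator model of a weak measurement. The sketch below assumes an idealized weak Z measurement of strength λ = sin θ (a textbook model, not the paper's exact circuit) and shows that the loss of ⟨X⟩ coherence for a |+⟩ input grows as λ²/2:

```python
import numpy as np

def weak_z_kraus(lam):
    """Kraus operators of an idealized weak Z measurement with strength
    lam = sin(theta): outcome 0 barely disturbs, outcome 1 is rare."""
    c = np.sqrt(1 - lam**2)                      # cos(theta)
    return np.diag([1.0, c]), np.diag([0.0, lam])

X = np.array([[0., 1.], [1., 0.]])
plus = np.array([[0.5, 0.5], [0.5, 0.5]])        # |+><+|, with <X> = 1

for lam in (0.05, 0.1, 0.2):
    K0, K1 = weak_z_kraus(lam)
    rho = K0 @ plus @ K0.T + K1 @ plus @ K1.T    # unconditional post-measurement state
    disturbance = 1 - np.trace(rho @ X)          # loss of <X> coherence
    print(f"lam={lam:.2f}  disturbance={disturbance:.5f}  lam^2/2={lam**2 / 2:.5f}")
```

The disturbance is exactly 1 − √(1 − λ²), which agrees with λ²/2 to leading order, so a weak (small‑λ) measurement perturbs the state only at second order in the coupling.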

The experimental design uses three dichotomic observables A, B, and C. A and B are measured weakly in two possible temporal orders (A→B→C and B→A→C), while C is measured projectively. Theoretical analysis shows that for a specific choice of initial state |ψ⟩ = (|+⟩ + |−⟩)/√2 and observables defined by phase‑shifted Pauli‑X rotations, the quantum predictions are ⟨A⟩ = ⟨B⟩ = 1/√2, ⟨AB⟩ = 0, and ⟨ABC⟩ = −⟨BAC⟩ = ½. These values violate the LG inequality ⟨A⟩ + ⟨B⟩ − ⟨AB⟩ ≤ 1 (since √2 > 1) and demonstrate a clear time‑order asymmetry (⟨ABC⟩ ≠ ⟨BAC⟩).
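
The quoted predictions can be checked with a few lines of arithmetic, using only the values stated above:

```python
# Quantum predictions quoted in the summary.
A = B = 1 / 2**0.5        # <A> = <B> = 1/sqrt(2)
AB = 0.0                  # <AB> = 0
ABC, BAC = 0.5, -0.5      # <ABC> = -<BAC> = 1/2

K = A + B - AB            # LG combination, classically bounded by 1
print(f"K = {K:.4f}")     # sqrt(2) ≈ 1.4142 > 1: LG inequality violated
print(f"<ABC> - <BAC> = {ABC - BAC}")  # 1.0: time-order asymmetry
```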

To test these predictions, the authors identified ten three‑qubit groups across five IBM devices and one IonQ device that were most likely to show a violation, based on noise‑aware simulations. For each group they executed millions of circuit shots, collecting sufficient statistics to reduce sampling error well below the expected signal. The weak measurement is implemented by a controlled‑rotation (C‑RX, C‑RZ, or ECR) between the system qubit and its ancilla, with the rotation angle θ setting the measurement strength λ = sin θ. Two protocols were explored: (I) a single controlled‑X gate with a fractional rotation, and (II) a pair of standard two‑qubit gates combined with a fractional rotation, both achieving comparable λ values.
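
The relation λ = sin θ follows directly from the system–ancilla coupling. The sketch below uses a controlled‑RY rotation as a stand‑in for the paper's C‑RX/C‑RZ/ECR couplings (an illustrative choice; the Kraus structure is the same up to local rotations) and extracts the measurement operators seen by the system when the ancilla is read out:

```python
import numpy as np

def c_ry(theta):
    """Controlled-RY(2*theta) on |system, ancilla>, system qubit as control.
    Illustrative stand-in for the paper's C-RX / C-RZ / ECR couplings."""
    c, s = np.cos(theta), np.sin(theta)
    U = np.eye(4, dtype=complex)
    U[2:, 2:] = [[c, -s], [s, c]]        # rotate ancilla only when system is |1>
    return U

theta = 0.3
U = c_ry(theta)

# Ancilla starts in |0>; reading it out in outcome a yields a Kraus operator
# on the system: K_a[j, i] = <j, a| U |i, 0>, basis order |00>,|01>,|10>,|11>.
K = [np.array([[U[a, 0], U[a, 2]],
               [U[2 + a, 0], U[2 + a, 2]]]) for a in (0, 1)]

lam = abs(K[1][1, 1])                    # measurement strength
print(np.round(K[0].real, 4))            # diag(1, cos(theta)): weak back-action
print(f"lam = {lam:.4f} = sin(theta) = {np.sin(theta):.4f}")
```

Turning the fractional rotation angle θ thus tunes the strength continuously from λ = 0 (no measurement) to λ = 1 (fully projective).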

Statistical analysis shows that in virtually all groups the LG inequality is violated by more than five standard deviations, with an average significance of about ten sigma. The time‑order asymmetry is likewise observed with comparable confidence. Crucially, when the same circuits are run with strong projective measurements (λ → 1, i.e., θ → π/2) the correlations revert to the classical bounds, confirming that the observed violations stem from the weak‑measurement regime and are not artifacts of gate errors alone.
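
A sketch of how the significance is estimated from shot statistics: sample ±1 outcomes at the quoted quantum values, propagate binomial standard errors into the LG combination, and divide the excess over the classical bound by the combined error. This is an ideal‑statistics simulation, not the paper's data; on real hardware, noise shrinks the violation toward the ~5–10 sigma figures quoted above.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1_000_000   # shots per correlator; the experiments collected millions of shots

def sample(mean, n):
    """Simulate n shots of a +/-1 observable with the given mean; return
    the sample mean and its binomial standard error."""
    p = (1 + mean) / 2                       # P(outcome = +1)
    x = rng.binomial(1, p, n) * 2 - 1
    return x.mean(), x.std(ddof=1) / np.sqrt(n)

a, se_a = sample(1 / np.sqrt(2), N)
b, se_b = sample(1 / np.sqrt(2), N)
ab, se_ab = sample(0.0, N)

K = a + b - ab
se_K = np.sqrt(se_a**2 + se_b**2 + se_ab**2)  # independent runs: errors in quadrature
print(f"K = {K:.4f} +/- {se_K:.4f}  ({(K - 1) / se_K:.0f} sigma above the LG bound)")
```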

An unexpected but important finding is that the observed deviations from the ideal theoretical predictions exceed the error rates reported by IBM and IonQ for gate infidelity and readout error. This suggests that weak‑measurement protocols are highly sensitive probes of subtle hardware imperfections, providing a novel benchmark that can detect noise sources not captured by standard calibration metrics. The authors propose the term “weak‑measurement‑induced benchmark” for this capability and discuss its potential use in hardware diagnostics, error mitigation, and the development of more accurate noise models.

The paper’s contributions can be summarized as follows: (1) Demonstration of genuine non‑invasive weak measurements on public quantum processors, a first in the field. (2) Simultaneous experimental verification of LG inequality violation and time‑order non‑invariance using the same hardware. (3) Introduction of fractional‑gate‑based control of measurement strength, enabling precise tuning of λ. (4) Identification of weak measurements as a sensitive diagnostic tool for quantum hardware, revealing error contributions beyond declared specifications.

In the discussion, the authors outline several avenues for future work: extending the protocol to multiple weak measurements and higher‑order temporal correlations, applying the method to other platforms such as superconducting circuits with different connectivity, refining statistical methods for non‑invasiveness certification, and integrating weak‑measurement data into hardware calibration and error‑correction pipelines. By turning publicly available quantum computers into testbeds for foundational quantum physics, this work bridges the gap between quantum information technology and fundamental studies of realism and temporal causality, and opens the door for reproducible, community‑wide investigations of quantum foundations.

