Adaptive monotonicity testing in sublinear time


Modern large-scale data analysis increasingly faces the challenge of achieving computational efficiency as well as statistical accuracy, as classical statistically efficient methods often fall short in the first regard. In the context of testing monotonicity of a regression function, we propose FOMT (Fast and Optimal Monotonicity Test), a novel methodology tailored to meet these dual demands. FOMT employs a sparse collection of local tests, strategically generated at random, to detect violations of monotonicity scattered throughout the domain of the regression function. This sparsity enables significant computational efficiency, achieving sublinear runtime in most cases, and quasilinear runtime (i.e., linear up to a log factor) in the worst case. In contrast, existing statistically optimal tests typically require at least quadratic runtime. FOMT’s statistical accuracy is achieved through the precise calibration of these local tests and their effective combination, ensuring both sensitivity to violations and control over false positives. More precisely, we show that FOMT separates the null and alternative hypotheses at minimax optimal rates over Hölder function classes of smoothness order in $(0,2]$. Further, when the smoothness is unknown, we introduce an adaptive version of FOMT, based on a modified Lepskii principle, which attains statistical optimality and meanwhile maintains the same computational complexity as if the intrinsic smoothness were known. Extensive simulations confirm the competitiveness and effectiveness of both FOMT and its adaptive variant.
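The core idea of randomly placed local tests can be sketched in a few lines. The following is a simplified illustration, not the paper's exact calibrated procedure: it draws random pairs of adjacent windows, standardizes the drop between their means, and uses a Bonferroni-style threshold over the sampled tests. All parameter names and defaults here are assumptions for demonstration.

```python
import numpy as np

def fomt_sketch(y, sigma=1.0, num_tests=200, alpha=0.05, rng=None):
    """Illustrative sketch of sparse random local monotonicity tests.

    Compares means of adjacent windows at random locations and scales;
    a standardized decrease beyond a union-bound threshold is taken as
    evidence against monotonicity. Not the paper's actual calibration.
    """
    rng = np.random.default_rng(rng)
    n = len(y)
    # Bonferroni-style threshold over the num_tests local statistics.
    z = np.sqrt(2 * np.log(num_tests / alpha))
    for _ in range(num_tests):
        h = int(rng.integers(2, max(3, n // 4)))   # random window width
        i = int(rng.integers(0, n - 2 * h))        # random location
        left, right = y[i:i + h], y[i + h:i + 2 * h]
        # Standardized decrease between consecutive window means.
        stat = (left.mean() - right.mean()) / (sigma * np.sqrt(2.0 / h))
        if stat > z:
            return True    # local violation detected: reject monotonicity
    return False           # no violation found among the sampled tests
```

Because only a sparse set of windows is examined, each call touches far fewer than all O(n^2) window pairs, which is the intuition behind the sublinear runtime claimed for FOMT.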


💡 Research Summary

The paper addresses the problem of testing monotonicity of a regression function in a nonparametric setting where observations follow $Y_i = f(x_i) + \varepsilon_i$, $i=1,\dots,n$, with equidistant design points $x_i = i/n$ and i.i.d. Gaussian noise of known variance. The unknown function $f$ is assumed to belong to a Hölder class $\Sigma(\beta, L)$ with smoothness $\beta \in (0,2]$. The null hypothesis is that $f$ is monotone increasing on $[0,1]$.
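This observation model is easy to simulate. A minimal sketch, where the choices $n = 500$, $\sigma = 0.1$, and $f(t) = t + 0.1\sin(2\pi t)$ (a smooth, monotone function) are assumptions for illustration:

```python
import numpy as np

# Simulate Y_i = f(x_i) + eps_i on the equidistant design x_i = i/n.
# The specific f, n, and sigma below are illustrative assumptions.
n, sigma = 500, 0.1
x = np.arange(1, n + 1) / n                       # design points x_i = i/n
f = lambda t: t + 0.1 * np.sin(2 * np.pi * t)     # monotone, smooth f
rng = np.random.default_rng(0)
y = f(x) + sigma * rng.standard_normal(n)         # i.i.d. Gaussian noise
```

Under the null hypothesis, data generated this way come from a monotone $f$; replacing $f$ with a function containing local decreases produces alternatives that a monotonicity test should detect.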

