Markov chains in random environment with applications in queueing theory and machine learning
We prove the existence of limiting distributions for a large class of Markov chains on a general state space in a random environment. We assume suitable versions of the standard drift and minorization conditions. In particular, the system dynamics should be contractive on average with respect to the Lyapunov function, and large enough small sets should exist with large enough minorization constants. We also establish that a law of large numbers holds for bounded functionals of the process. Applications to queueing systems, machine learning algorithms and autoregressive processes are presented.
💡 Research Summary
The paper “Markov chains in random environment with applications in queueing theory and machine learning” develops a comprehensive ergodic theory for Markov chains evolving on a general (possibly uncountable) state space while being driven by a stationary random environment. The authors relax the stringent contraction requirement that appears in earlier works (e.g., requiring the one-step contraction factor γ(y) to be uniformly less than one) and replace it with an average-contraction condition: the geometric mean of the random contraction factors must be strictly smaller than one, i.e. \(\bar\gamma = \limsup_{n\to\infty} \bigl(E\prod_{k=1}^n \gamma(Y_k)\bigr)^{1/n} < 1\). This allows the model to admit environments where γ(y) ≥ 1 for some y, greatly broadening the class of admissible systems.
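To make the average-contraction condition concrete, here is a minimal Monte Carlo sketch using a hypothetical two-state i.i.d. environment (the states, probabilities and contraction factors below are illustrative choices, not taken from the paper): the environment is sometimes expanding, γ(y) = 1.2 ≥ 1, yet the geometric-mean contraction rate \(\bar\gamma\) stays below one.

```python
import random

def gamma(y):
    # Hypothetical contraction factor: expanding (>= 1) in environment
    # state 1, contracting in state 0.
    return 1.2 if y == 1 else 0.5

def gbar_estimate(n=20, trials=20000, p=0.3, seed=0):
    """Monte Carlo estimate of (E prod_{k=1}^n gamma(Y_k))^(1/n)
    for i.i.d. environment states Y_k with P(Y_k = 1) = p."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        prod = 1.0
        for _ in range(n):
            y = 1 if rng.random() < p else 0
            prod *= gamma(y)
        total += prod
    return (total / trials) ** (1.0 / n)

# For i.i.d. Y the limit equals E[gamma(Y_1)] = 0.3*1.2 + 0.7*0.5 = 0.71 < 1,
# so average contraction holds even though gamma(y) exceeds 1 on {y = 1}.
print(gbar_estimate())
```

For dependent (merely stationary) environments the products no longer factorize and \(\bar\gamma\) can differ from E[γ(Y₁)], which is why the condition is stated via the limsup of n-th roots.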
The core technical framework is built on a Lyapunov drift condition (Assumption 2.2) involving a measurable function V: X→ℝ₊ and two environment-dependent functions K(y)≥1 and γ(y)>0 such that the conditional expectation of V after one step satisfies \(E\bigl[V(X_{n+1}) \mid X_n = x,\, Y_{n+1} = y\bigr] \le \gamma(y)\,V(x) + K(y)\).
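A standard family of examples satisfying such a drift condition is the random-coefficient AR(1) process X_{n+1} = a(Y_{n+1})X_n + ε_{n+1} with V(x) = |x|, where γ(y) = |a(y)| and K(y) absorbs the noise term. The sketch below (with an illustrative coefficient a(y) = 1.1 and standard Gaussian noise, both assumptions of ours rather than choices from the paper) checks the one-step drift inequality empirically:

```python
import math
import random

def drift_check(a_y, x, trials=100_000, seed=1):
    """Empirically compare E[V(X_1) | X_0 = x, Y_1 = y] against
    gamma(y) V(x) + K(y) for X_1 = a(y) X_0 + eps, eps ~ N(0,1), V = |.|."""
    rng = random.Random(seed)
    # Monte Carlo estimate of the left-hand side E|a(y) x + eps|.
    lhs = sum(abs(a_y * x + rng.gauss(0, 1)) for _ in range(trials)) / trials
    gamma_y = abs(a_y)                       # contraction factor, may exceed 1
    K_y = max(1.0, math.sqrt(2 / math.pi))   # K(y) >= 1, dominates E|eps|
    rhs = gamma_y * abs(x) + K_y
    return lhs, rhs

# An expanding environment state (gamma(y) = 1.1 > 1): the one-step
# drift bound still holds by the triangle inequality.
lhs, rhs = drift_check(a_y=1.1, x=3.0)
print(lhs <= rhs)
```

The inequality here is deterministic (triangle inequality plus E|ε| ≤ K(y)); average contraction then requires that the expanding states y with |a(y)| ≥ 1 are rare enough, in the geometric-mean sense of the previous paragraph, for the process to remain stable.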