ChronoRAN: Analyzing Latency in 5G Systems
This paper presents ChronoRAN, a mathematical framework for accurately computing one-way latency (uplink and downlink) in the 5G RAN across diverse system configurations. ChronoRAN models latency sources at every layer of the Radio Access Network (RAN), pinpointing system-level bottlenecks (such as radio interfaces, scheduling policies, and hardware/software constraints) while capturing their intricate dependencies and stochastic nature. ChronoRAN also includes a configuration optimizer that uses its mathematical models to search through hundreds of billions of configurations and find settings that meet latency-reliability targets under user constraints. We validate ChronoRAN on two open-source 5G RAN testbeds (srsRAN and OAI) and a public commercial 5G network, demonstrating that it closely matches empirical latency distributions and significantly outperforms prior analytical models and widely used simulators (MATLAB 5G Toolbox, 5G-LENA). It can also find system configurations that meet Ultra-Reliable Low-Latency Communications (URLLC) targets, enabling network operators to efficiently identify the best setup for their systems.
💡 Research Summary
ChronoRAN is a comprehensive mathematical framework designed to accurately predict one‑way latency (both uplink and downlink) within the 5G Radio Access Network (RAN). The authors begin by highlighting the gap between URLLC specifications—0.5 ms one‑way latency with 99.999 % reliability—and the performance of real‑world deployments, which often fall short due to insufficient modeling of the many interacting latency sources. ChronoRAN classifies latency contributors into three categories: processing latency (CPU, OS scheduling, protocol stack traversal), protocol latency (scheduling request, grant procedures, TDD slot alignment, grant‑free vs. grant‑based access), and radio latency (RF front‑end conversion, bus interfaces, over‑the‑air transmission).
Using a detailed "ping" example, the paper decomposes the uplink path into eight sequential steps, from the UE's generation of a scheduling request (SR) to gNB processing and eventual forwarding to the UPF, and the downlink path into a symmetric sequence. Each step is modeled as a random variable whose distribution is obtained from empirical measurements on three testbeds: the open-source srsRAN, OpenAirInterface (OAI), and a commercial standalone 5G network. Non-deterministic components such as OS jitter or PCIe latency are captured through Monte Carlo sampling of measured histograms.
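The Monte Carlo approach can be sketched as follows: draw one value per step from its measured histogram and sum the draws to obtain one end-to-end latency sample. The step names, histogram values, and weights below are invented placeholders, not the paper's actual measurements.

```python
import random

# Hypothetical per-step latency histograms in ms (values, weights),
# standing in for the empirically measured distributions.
step_histograms = {
    "ue_processing":  ([0.05, 0.08, 0.12], [0.7, 0.2, 0.1]),
    "sr_wait":        ([0.0, 0.25, 0.5],   [0.4, 0.4, 0.2]),
    "ota_tx":         ([0.036, 0.071],     [0.5, 0.5]),
    "gnb_processing": ([0.10, 0.15, 0.30], [0.6, 0.3, 0.1]),
}

def sample_uplink_latency(rng: random.Random) -> float:
    """Draw one end-to-end latency sample by summing one draw per step."""
    return sum(
        rng.choices(values, weights=weights, k=1)[0]
        for values, weights in step_histograms.values()
    )

rng = random.Random(0)
samples = sorted(sample_uplink_latency(rng) for _ in range(100_000))
p999 = samples[int(0.999 * len(samples))]  # empirical 99.9th percentile
```

Repeating this for enough samples yields an empirical end-to-end distribution whose tail percentiles can be read off directly, which is what a latency-reliability target such as "0.5 ms at 99.999 %" asks about.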
ChronoRAN builds analytical expressions for several scenarios: (1) size‑1 packets that fit within the initial grant, (2) size‑2 packets requiring additional slots, (3) cases where radio latency exceeds the TDD period, (4) packet trains with RLC buffering modeled as a Markov chain, (5) grant‑free uplink, (6) downlink latency, (7) mini‑slot configurations, (8) FDD operation, and (9) multi‑UE contention. The models explicitly account for the interplay between TDD patterns, numerology (sub‑carrier spacing μ), slot duration, and the timing of SR/Grant exchanges. For instance, when radio latency is larger than a TDD period, the framework adds a full‑pattern waiting term, accurately reproducing the observed latency spikes.
A key contribution is the configuration optimizer. The latency models become objective functions subject to constraints such as frequency band (FR1/FR2), allowed TDD patterns, scheduling policy, and hardware limits. Although the search space spans billions of configurations, ChronoRAN prunes infeasible regions using dependency analysis and employs a lattice‑based grid search to locate optimal settings within minutes—far faster than exhaustive simulation.
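The prune-then-search idea can be sketched on a toy configuration space. Everything here is illustrative: the dimensions, the feasibility constraint, and the latency model are stand-ins for ChronoRAN's actual dependency analysis and analytical expressions.

```python
from itertools import product

# Hypothetical discrete configuration space (invented for illustration).
space = {
    "mu": [0, 1, 2, 3],
    "tdd_pattern": ["DDDSU", "DDSU", "DSU"],
    "access": ["grant_based", "grant_free"],
}

def feasible(cfg: dict) -> bool:
    """Toy dependency constraint used to prune the space up front."""
    return not (cfg["mu"] == 3 and cfg["tdd_pattern"] == "DDDSU")

def predicted_latency_ms(cfg: dict) -> float:
    """Stand-in for the analytical latency model: one TDD pattern of
    slots, plus an SR/grant penalty for grant-based access."""
    slot = 1.0 / (2 ** cfg["mu"])
    base = len(cfg["tdd_pattern"]) * slot
    return base + (0.0 if cfg["access"] == "grant_free" else 2 * slot)

def best_config(target_ms: float):
    keys = list(space)
    candidates = (dict(zip(keys, vals)) for vals in product(*space.values()))
    pruned = [c for c in candidates if feasible(c)]  # drop infeasible regions
    meeting = [c for c in pruned if predicted_latency_ms(c) <= target_ms]
    return min(meeting, key=predicted_latency_ms, default=None)
```

Even in this toy model the optimizer surfaces the expected trend: the best configuration under a tight target uses high numerology, a short pattern, and grant-free access.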
Experimental validation shows that ChronoRAN’s predicted latency distributions achieve Wasserstein distances of 0.003–0.035 compared with real measurements, a 40‑fold improvement over MATLAB 5G Toolbox and ns‑3 5G‑LENA. Mean absolute errors for minimum and maximum latency are reduced by 21× and 35× respectively. The optimizer reveals practical insights: in sub‑6 GHz bands, no grant‑based configuration meets the 0.5 ms target at 99.99 % reliability, while about 1 % of grant‑free configurations satisfy a relaxed 1 ms target with the same reliability. Moreover, shortening TDD patterns can be counter‑productive when radio latency dominates, confirming the need for holistic modeling.
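For one-dimensional latency samples, the Wasserstein-1 distance used in this comparison has a simple empirical form: sort both samples and average the absolute differences of matched order statistics. The equal-sample-size restriction below is a simplification for clarity (the general definition allows unequal sizes).

```python
def wasserstein_1d(xs, ys):
    """Empirical Wasserstein-1 distance between two equal-size 1-D samples:
    mean absolute difference of matched order statistics."""
    assert len(xs) == len(ys), "this sketch assumes equal sample sizes"
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)
```

A distance of 0.003 ms between predicted and measured distributions therefore means the two latency CDFs are nearly indistinguishable on average.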
In summary, ChronoRAN provides the first mathematically rigorous, measurement‑backed tool for 5G RAN latency analysis and configuration optimization. It bridges the gap between theoretical URLLC requirements and operational reality, enabling network operators to quickly identify bottlenecks, evaluate trade‑offs, and deploy configurations that approach ultra‑low‑latency goals. The framework also lays groundwork for future 6G research, where sub‑millisecond latency will become a baseline expectation.