Bridging Visual and Wireless Sensing: A Unified Radiation Field for 3D Radio Map Construction
The emerging applications of next-generation wireless networks (e.g., immersive 3D communication, low-altitude networks, and integrated sensing and communication) necessitate high-fidelity environmental intelligence. 3D radio maps have emerged as a critical tool for this purpose, enabling spectrum-aware planning and environment-aware sensing by bridging the gap between physical environments and electromagnetic signal propagation. However, constructing accurate 3D radio maps requires fine-grained 3D geometric information and a profound understanding of electromagnetic wave propagation. Existing approaches typically treat optical and wireless knowledge as distinct modalities, failing to exploit the fundamental physical principles governing both light and electromagnetic propagation. To bridge this gap, we propose URF-GS, a unified radio-optical radiation field representation framework for accurate and generalizable 3D radio map construction based on 3D Gaussian splatting (3D-GS) and inverse rendering. By fusing visual and wireless sensing observations, URF-GS recovers scene geometry and material properties while accurately predicting radio signal behavior at arbitrary transmitter-receiver (Tx-Rx) configurations. Experimental results demonstrate that URF-GS achieves up to a 24.7% improvement in spatial spectrum prediction accuracy and a 10x increase in sample efficiency for 3D radio map construction compared with neural radiance field (NeRF)-based methods. This work establishes a foundation for next-generation wireless networks by integrating perception, interaction, and communication through holistic radiation field reconstruction.
💡 Research Summary
The paper addresses the pressing need for high‑fidelity environmental intelligence in next‑generation wireless networks, where accurate three‑dimensional (3D) radio maps are essential for spectrum‑aware planning, interference management, and emerging applications such as low‑altitude networks and immersive 3D communication. Existing approaches either treat visual (optical) and wireless information as separate modalities, rely on oversimplified probabilistic models, require costly deterministic ray‑tracing with perfect geometry, or use black‑box learning methods that lack physical interpretability and suffer from high computational complexity. Moreover, most prior work focuses on 2D maps and cannot generalize to arbitrary transmitter‑receiver (Tx‑Rx) configurations in 3D space.
To overcome these limitations, the authors propose URF‑GS (Unified Radio‑Optical Radiation Field for Gaussian Splatting), a novel framework that jointly learns a unified radiation field from both visual images and wireless channel measurements. The core of URF‑GS is built on 3D Gaussian splatting (3D‑GS), which represents a scene as a collection of 3D Gaussian primitives. Each primitive encodes spatial position, scale, orientation, color, and a learned radiance (or radiation) pattern, enabling simultaneous modeling of optical radiance and electromagnetic (EM) field propagation.
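The per-primitive parameterization described above can be sketched as a small data structure. This is an illustrative guess at how a unified primitive might be organized, not the authors' actual implementation; the field names (e.g. `radiation_feature`) and the 16-dimensional feature size are assumptions.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class UnifiedGaussianPrimitive:
    # Spatial parameters shared by the optical and wireless modalities
    position: np.ndarray            # (3,) center in world coordinates
    scale: np.ndarray               # (3,) per-axis extent of the Gaussian
    rotation: np.ndarray            # (4,) unit quaternion for orientation
    opacity: float = 1.0
    # Optical attribute: base color (full 3D-GS uses spherical-harmonic coeffs)
    color: np.ndarray = field(default_factory=lambda: np.zeros(3))
    # Wireless attributes (hypothetical parameterization): a learned EM
    # radiation feature plus the material parameters the paper mentions
    radiation_feature: np.ndarray = field(default_factory=lambda: np.zeros(16))
    conductivity: float = 0.0
    permeability: float = 1.0

# A toy "scene" of four primitives; both rendering paths would query
# the same list, which is what lets geometry transfer across modalities.
scene = [
    UnifiedGaussianPrimitive(
        position=np.random.randn(3),
        scale=np.ones(3) * 0.1,
        rotation=np.array([1.0, 0.0, 0.0, 0.0]),
    )
    for _ in range(4)
]
```

Because a single primitive carries geometry, color, and EM attributes, updating its position from an RGB loss automatically improves the wireless render too.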
Training is performed via physics‑informed inverse rendering. Visual observations provide pixel‑wise RGB loss, while wireless observations (multi‑path components such as path loss, delay, AoD/AoA) are rendered through a differentiable EM propagation model that incorporates material properties (conductivity, permeability) and antenna radiation patterns. The loss jointly minimizes image reconstruction error (e.g., L2, SSIM) and spectrum reconstruction error, forcing the network to infer accurate geometry, material parameters, and EM radiation patterns. Because the same Gaussian primitives serve both modalities, the learned representation naturally generalizes to unseen Tx‑Rx placements without retraining.
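The joint objective can be illustrated with a minimal sketch: an L2 + SSIM image term combined with an L2 spectrum term. The weights and the single-window SSIM simplification are assumptions for illustration (real 3D-GS training uses a local windowed SSIM), not the paper's exact loss.

```python
import numpy as np

def global_ssim(x, y, c1=1e-4, c2=9e-4):
    # Simplified SSIM computed over the whole image as one window;
    # production 3D-GS pipelines use an 11x11 Gaussian-windowed SSIM.
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx**2 + my**2 + c1) * (vx + vy + c2))

def joint_loss(rgb_pred, rgb_gt, spec_pred, spec_gt,
               w_l2=0.8, w_ssim=0.2, w_spec=1.0):
    # Image reconstruction term: L2 plus an SSIM penalty
    l_img = w_l2 * np.mean((rgb_pred - rgb_gt) ** 2) \
          + w_ssim * (1.0 - global_ssim(rgb_pred, rgb_gt))
    # Spectrum reconstruction term on the rendered "spectrum image"
    l_spec = np.mean((spec_pred - spec_gt) ** 2)
    return l_img + w_spec * l_spec

rgb = np.random.rand(8, 8, 3)
spec = np.random.rand(8, 8)
perfect = joint_loss(rgb, rgb, spec, spec)   # near zero for exact match
```

Minimizing both terms over the same primitives is what forces geometry and material estimates to stay consistent with physics in both modalities.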
Experiments are conducted on a high‑resolution 60 GHz indoor replica of a NIST “Bistro” scene. Channel measurements are generated with the Sionna ray‑tracing simulator for multiple Tx locations and a dense set of Rx positions, then projected into “spectrum images” via equirectangular mapping. Three training regimes are evaluated: (1) full‑training with 3,200 samples for a primary Tx, (2) few‑shot with only ~10 samples for a new Tx, and (3) zero‑shot with no samples for a completely new Tx. Baselines include NeRF‑2 (NeRF‑based spectrum synthesis), RF‑3DGS (radio field reconstruction with 3D‑GS), and WRF‑GS+ (a hybrid GS method that separates large‑ and small‑scale components).
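The equirectangular "spectrum image" construction mentioned above can be sketched as follows: azimuth maps to image columns, elevation to rows, and each multipath component deposits its gain at the corresponding pixel. The image resolution and the max-pooling of overlapping paths are assumptions, not the paper's exact procedure.

```python
import numpy as np

def to_spectrum_image(azimuth_deg, elevation_deg, path_gain_db,
                      width=90, height=45):
    # Equirectangular mapping: azimuth in [-180, 180) -> columns,
    # elevation in [-90, 90) -> rows. Where two paths share a pixel,
    # keep the strongest one (an assumption for this sketch).
    img = np.full((height, width), -np.inf)
    u = ((np.asarray(azimuth_deg) + 180.0) / 360.0 * width).astype(int) % width
    v = ((np.asarray(elevation_deg) + 90.0) / 180.0 * height).astype(int) % height
    for row, col, g in zip(v, u, path_gain_db):
        img[row, col] = max(img[row, col], g)
    return img

# Two multipath components arriving from different directions
img = to_spectrum_image([0.0, 45.0], [0.0, 30.0], [-60.0, -75.0])
```

Rendering wireless observations as images like this is what lets the same splatting machinery and image losses apply to both modalities.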
Quantitative results show that URF‑GS achieves the highest PSNR (17.38 dB) and SSIM (0.701) and a competitive LPIPS (0.334) in the full‑training scenario, outperforming NeRF‑2 by up to 24.7% in SSIM. In the few‑shot and zero‑shot settings, URF‑GS still leads, delivering up to 10× better sample efficiency than NeRF‑2 and markedly higher perceptual quality than the other baselines. Visualizations confirm that URF‑GS faithfully reproduces fine‑grained multipath effects and shadowing, even with minimal data.
Two practical case studies demonstrate the framework’s utility. In Wi‑Fi access‑point (AP) deployment, URF‑GS predicts average received power for 25 unseen Tx positions across 386 Rx locations, matching ground‑truth trends and enabling data‑driven ranking of candidate AP sites without exhaustive site surveys. In robot path planning, the unified radiation field is used to anticipate radio‑dead zones and select collision‑free, energy‑efficient trajectories that respect both geometric obstacles and wireless quality constraints.
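The AP-deployment case study reduces to ranking candidate Tx sites by predicted average received power over an Rx grid. A minimal sketch, using a toy log-distance model as a stand-in for querying the trained URF-GS field (the `predict_power` interface and the toy model are assumptions):

```python
import numpy as np

def rank_ap_candidates(predict_power, tx_candidates, rx_grid):
    # Average predicted received power (dBm) over all Rx locations,
    # then rank candidate AP sites best-first.
    scores = [np.mean([predict_power(tx, rx) for rx in rx_grid])
              for tx in tx_candidates]
    order = np.argsort(scores)[::-1]
    return [tx_candidates[i] for i in order], [scores[i] for i in order]

def toy_power(tx, rx):
    # Stand-in log-distance path loss; URF-GS would replace this with a
    # query of the learned radiation field at the given Tx-Rx pair.
    d = np.linalg.norm(np.asarray(tx) - np.asarray(rx)) + 1e-3
    return -40.0 - 20.0 * np.log10(d)

rx_grid = [(x, y, 1.0) for x in range(5) for y in range(5)]
candidates = [(0.0, 0.0, 3.0), (2.0, 2.0, 3.0)]
ranked, scores = rank_ap_candidates(toy_power, candidates, rx_grid)
```

With a generalizable field model, this loop replaces an exhaustive site survey: each candidate evaluation is just a batch of forward renders.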
Overall, URF‑GS contributes three key innovations: (1) a joint visual‑wireless representation that unifies optical radiance and EM radiation within a single volumetric model, (2) physics‑based inverse rendering that explicitly learns material and antenna parameters, and (3) the use of 3D Gaussian splatting to achieve high‑quality, computationally efficient reconstructions. These advances enable accurate, generalizable 3D radio map construction and open new avenues for perception‑aware network planning, immersive experiences, and intelligent robotics in future wireless ecosystems.