CFARNet: Learning-Based High-Resolution Multi-Target Detection for Rainbow Beam Radar

Notice: This research summary and analysis were automatically generated using AI technology. For full accuracy, please refer to the original arXiv source.

Millimeter-wave (mmWave) OFDM radar equipped with rainbow beamforming, enabled by phase-time arrays (PTAs), provides wide-angle coverage and is well-suited for fast real-time target detection and tracking. However, accurate detection of multiple closely spaced targets remains a key challenge for conventional signal processing pipelines, particularly those relying on constant false alarm rate (CFAR) detectors. This paper presents CFARNet, a learning-based processing framework that replaces CFAR with a convolutional neural network (CNN) for peak detection in the angle-Doppler domain. The network predicts target subcarrier indices, which guide angle estimation via a known frequency-angle mapping and enable high-resolution range and velocity estimation using the MUSIC algorithm. Extensive simulations demonstrate that CFARNet significantly outperforms a baseline combining CFAR and MUSIC, especially under low transmit power and dense multi-target conditions. The proposed method offers superior angular resolution, enhanced robustness in low-SNR scenarios, and improved computational efficiency, highlighting the potential of data-driven approaches for high-resolution mmWave radar sensing.


💡 Research Summary

This paper addresses the challenge of detecting multiple closely spaced targets in millimeter‑wave (mmWave) OFDM radar that employs rainbow beamforming through phase‑time arrays (PTAs). Traditional processing pipelines first compute an angle‑Doppler spectrum from the received echoes, then apply a two‑dimensional constant false alarm rate (CFAR) detector to locate peaks, and finally use the MUSIC algorithm for high‑resolution range and velocity estimation. While effective for well‑separated targets, CFAR degrades when targets are closely spaced in angle or Doppler, when their reflectivities differ markedly, or when clutter statistics are non‑stationary. The authors propose CFARNet, a convolutional neural network (CNN) that replaces the CFAR stage entirely.
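For context on the stage being replaced, a minimal 1‑D cell‑averaging CFAR can be sketched as below. The paper's baseline uses a 2‑D variant over the angle‑Doppler map; the guard/training sizes and threshold scale here are illustrative assumptions:

```python
import numpy as np

def ca_cfar(x, guard=2, train=8, scale=3.0):
    """1-D cell-averaging CFAR: flag cells whose magnitude exceeds
    `scale` times the mean of the surrounding training cells.
    guard/train/scale are illustrative, not the paper's settings."""
    n = len(x)
    det = np.zeros(n, dtype=bool)
    half = guard + train
    for i in range(half, n - half):
        # Training cells on both sides, excluding the guard cells.
        window = np.r_[x[i - half:i - guard], x[i + guard + 1:i + half + 1]]
        det[i] = x[i] > scale * window.mean()
    return det

# Two well-separated peaks in Rayleigh noise are detected reliably;
# closely spaced peaks can mask each other because each one inflates
# the other's local noise estimate -- the failure mode CFARNet targets.
rng = np.random.default_rng(0)
x = rng.rayleigh(1.0, 256)
x[60] += 30.0
x[180] += 30.0
mask = ca_cfar(x)
```

The masking effect in the comment is exactly why dense multi-target scenes are hard for sliding-window statistics.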

CFARNet takes the log‑magnitude of the angle‑Doppler spectrum as input, thereby compressing the dynamic range and making weak targets more visible. The network architecture consists of four 2‑D convolutional blocks that progressively increase channel depth (1→64→128→256→512) while reducing the Doppler dimension, followed by max‑pooling, two 1‑D convolutional blocks, and a final 1‑D convolution that outputs a logit for each of the M=2048 subcarriers. The output vector indicates the likelihood that a given subcarrier corresponds to a target peak. Training is formulated as a multi‑label binary classification problem using binary cross‑entropy with logits, but instead of hard one‑hot labels the authors employ Gaussian label smoothing around each true peak. This provides richer gradient information and improves robustness to small localization errors. The network is trained with AdamW (lr = 1e‑4, weight decay = 1e‑5), a cosine‑annealing learning‑rate schedule, and early stopping based on validation loss. Separate models are trained for each transmit power level (45, 50, 55, 60 dBm) to handle varying SNR conditions.
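The Gaussian label-smoothing scheme can be sketched as follows. The bump width `sigma` and the max-combination of overlapping bumps are illustrative assumptions; the summary states only that hard one-hot labels are replaced by Gaussian bumps around each true peak:

```python
import numpy as np

def gaussian_labels(peak_indices, n_subcarriers=2048, sigma=2.0):
    """Soft multi-label target vector for BCE-with-logits training:
    a Gaussian bump centered on each true peak subcarrier instead of
    a hard one-hot label. Overlapping bumps are combined with max
    (an assumption) so labels stay in [0, 1]."""
    grid = np.arange(n_subcarriers)
    labels = np.zeros(n_subcarriers)
    for k in peak_indices:
        labels = np.maximum(labels, np.exp(-0.5 * ((grid - k) / sigma) ** 2))
    return labels

y = gaussian_labels([100, 700])
```

The soft targets give nonzero gradients to subcarriers near a peak, which is what makes the loss tolerant of small localization errors.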

During inference, the K subcarrier indices with the highest logits are selected as predicted peak locations. The known frequency‑angle mapping of the rainbow beam (derived from the PTA phase and delay settings) converts each predicted subcarrier index into an angle estimate via a closed‑form expression. Around each predicted subcarrier, a narrow sub‑band (±10 subcarriers) of the angle‑Doppler spectrum is extracted, and the MUSIC algorithm is applied separately for range and radial velocity. For range, a covariance matrix across subcarriers is formed, its noise subspace is computed, and a spatial spectrum is evaluated using steering vectors that depend on range. For velocity, a temporal covariance matrix is built across OFDM symbols, and a similar spectral search yields the velocity estimate.
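The range-MUSIC step over an extracted sub-band can be sketched as below, under a simplified signal model with illustrative carrier/subcarrier parameters (the paper's exact covariance construction and parameters may differ):

```python
import numpy as np

c = 3e8
f0, df = 28e9, 120e3           # illustrative carrier and subcarrier spacing
M = 21                         # sub-band: predicted peak +/- 10 subcarriers
freqs = f0 + df * np.arange(M)

def steering(r):
    # Round-trip phase progression across the sub-band subcarriers
    # for a target at range r.
    return np.exp(-1j * 4 * np.pi * freqs * r / c)

def music_range(snapshots, n_targets, r_grid):
    """snapshots: (M, N) complex observations (e.g. across OFDM symbols).
    Forms the subcarrier covariance, takes the noise subspace, and
    returns the range maximizing 1 / ||E_n^H a(r)||^2."""
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    _, vecs = np.linalg.eigh(R)           # eigenvalues in ascending order
    En = vecs[:, : M - n_targets]         # noise-subspace eigenvectors
    spec = [1.0 / np.linalg.norm(En.conj().T @ steering(r)) ** 2
            for r in r_grid]
    return r_grid[int(np.argmax(spec))]

# Synthetic single-target sanity check at 60 m.
rng = np.random.default_rng(1)
N = 64
sig = steering(60.0)[:, None] * rng.standard_normal((1, N))
noise = 0.01 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
r_grid = np.linspace(35.0, 100.0, 651)
est = music_range(sig + noise, 1, r_grid)
```

The velocity estimate follows the same pattern with a temporal covariance across OFDM symbols and Doppler-dependent steering vectors.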

The authors generate extensive synthetic datasets to evaluate performance. Target parameters (angle, range, velocity) are drawn uniformly within realistic bounds (angles ± 60°, ranges 35–100 m, velocities 1–10 m/s). Datasets are created for five minimum angular separations (Δϕ_min = 1°, 1.5°, 3°, 5°, 10°) with 50 000 samples each; the most challenging set (Δϕ_min = 1°) is split into 35 k training, 7.5 k validation, and 7.5 k test samples.
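The scene generation can be sketched as rejection sampling under the minimum-separation constraint; the exact sampling procedure is not stated in the summary, so the rejection loop is an assumption:

```python
import numpy as np

def sample_targets(k, dphi_min, rng):
    """Draw k target triplets (angle deg, range m, velocity m/s)
    uniformly within the paper's bounds, redrawing angles until every
    pair is at least dphi_min apart (rejection step is an assumption)."""
    while True:
        ang = rng.uniform(-60.0, 60.0, k)
        if k == 1 or np.min(np.diff(np.sort(ang))) >= dphi_min:
            break
    rng_m = rng.uniform(35.0, 100.0, k)
    vel = rng.uniform(1.0, 10.0, k)
    return np.stack([ang, rng_m, vel], axis=1)

rng = np.random.default_rng(0)
scene = sample_targets(3, 1.0, rng)   # one 3-target scene, 1 deg separation
```

For small Δϕ_min over a 120° field of view the rejection rate is low, so this scales easily to 50 000 scenes per dataset.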

Results show that CFARNet dramatically outperforms the baseline (CFAR + MUSIC). In the hardest scenario (Δϕ_min = 1°, transmit power = 45 dBm), the 90th‑percentile error in angle, range, and velocity is reduced by roughly 3 dB compared with the baseline. As transmit power increases, CFARNet’s error approaches zero, while the baseline remains limited by CFAR’s resolution. When the number of targets K grows from 1 to 5, CFARNet maintains low error growth, whereas the baseline’s error escalates sharply, especially at low power. Visualizations of a three‑target case (angles 5°, 6°, 7°) illustrate that CFARNet’s predicted peaks align precisely with ground truth, while CFAR misplaces peaks and even misses one target.

Beyond detection accuracy, the computational analysis indicates that CFARNet’s peak‑prediction stage scales linearly with the number of subcarriers (O(M)), whereas traditional 2‑D CFAR requires O(M · N_s) operations due to sliding‑window statistics across the Doppler dimension. This reduction in complexity makes CFARNet attractive for real‑time implementation on embedded radar processors.
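The asymptotic gap can be made concrete with back-of-the-envelope operation counts; the Doppler-bin count and per-cell window size below are illustrative assumptions, not figures from the paper:

```python
# Rough operation counts for the peak-search stage only (not benchmarks).
M = 2048        # subcarriers (from the paper)
Ns = 64         # Doppler bins -- illustrative
cells = 21 ** 2 # training/guard cells visited per 2-D CFAR test -- assumed

cnn_peak_ops = M                 # one logit per subcarrier: O(M)
cfar_ops = M * Ns * cells        # sliding-window statistic at every cell
ratio = cfar_ops // cnn_peak_ops # how many times more work CFAR does
```

Even ignoring the window constant, the O(M·N_s) sweep grows with the Doppler dimension while the CNN's output stage does not.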

In summary, the paper demonstrates that leveraging the deterministic frequency‑angle relationship inherent in rainbow‑beam PTA radars, combined with a data‑driven CNN for peak detection, can overcome the fundamental limitations of statistical CFAR detectors. The proposed CFARNet achieves higher detection probability, finer angular resolution, and robust performance under low‑SNR, dense‑target conditions while also offering computational savings. Future work is suggested on hardware validation, robustness to clutter and multipath, extension to joint range‑Doppler‑angle super‑resolution, and integration with communication waveforms for true ISAC (integrated sensing and communication) systems.

