Deep Generative Modeling with Spatial and Network Images: An Explainable AI (XAI) Approach
This article addresses the challenge of modeling the amplitude of spatially indexed low-frequency fluctuations (ALFF) in resting-state functional MRI as a function of cortical structural features and a multi-task coactivation network in the Adolescent Brain Cognitive Development (ABCD) Study. It proposes a generative model that integrates the effects of spatially varying inputs and a network-valued input using deep neural networks to capture complex non-linear and spatial associations with the output. The method models spatial smoothness, accounts for subject heterogeneity and complex associations between network and spatial images at different scales, enables accurate inference of each image's effect on the output image, and allows prediction with uncertainty quantification via Monte Carlo dropout, contributing one of the first Explainable AI (XAI) frameworks for heterogeneous imaging data. The model scales to high-resolution data without the heavy pre-processing or summarization often required by Bayesian methods. Empirical results demonstrate its strong performance relative to existing statistical and deep learning methods. Applied to the ABCD data, the XAI model revealed associations between cortical features and ALFF throughout the brain; it matched existing methods in predictive accuracy while providing superior uncertainty quantification and faster computation, demonstrating its effectiveness for large-scale neuroimaging analysis. Open-source Python software for XAI is available.
💡 Research Summary
The paper introduces a novel two‑stage deep generative model designed to predict the amplitude of low‑frequency fluctuations (ALFF) measured from resting‑state fMRI, using both cortical structural MRI features and a multi‑task functional co‑activation network as inputs. The work is motivated by the Adolescent Brain Cognitive Development (ABCD) Study, which provides a large cohort of 9‑10‑year‑old children with high‑resolution structural and functional imaging. The authors note that while ALFF is an important biomarker of brain functional integrity, prior studies have examined its relationship with either structural or functional data in isolation, and that a unified framework has been lacking that can handle heterogeneous, high‑dimensional image inputs while offering interpretable inference and principled uncertainty quantification.
Data and Scientific Objectives
The outcome variable is the log‑transformed variance of the BOLD signal (BOLD VAR) at 148 cortical regions of interest (ROIs) defined by the Destrieux atlas, serving as a proxy for ALFF. Two spatial predictors are used for each ROI: cortical thickness (CT) and gray‑white matter intensity contrast (GWMIC). In addition, a 12 × 12 multi‑task co‑activation matrix is constructed for each subject by averaging pairwise correlations among ROIs across three task paradigms, yielding a network image that captures coordinated activation across brain lobes and hemispheres. The scientific goals are: (a) accurate whole‑brain prediction of ALFF, (b) interpretable estimation of spatially varying regression coefficients for CT and GWMIC, and (c) uncertainty quantification for both model components and predictions.
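The network input described above can be illustrated with a short sketch: for each subject, a region-by-region correlation matrix is computed per task paradigm and the three matrices are averaged into a single co-activation network image. This is a minimal NumPy sketch under assumed inputs; the function name, array shapes, and the use of raw Pearson correlation are illustrative, not the paper's exact pipeline.

```python
import numpy as np

def coactivation_matrix(task_timeseries):
    """Average pairwise-correlation matrices across task paradigms.

    task_timeseries: list of (T_k, R) arrays, one per task paradigm,
    holding activation time series for R regions (here R = 12 lobe/
    hemisphere groupings, as in the ABCD application).
    Returns an (R, R) symmetric network image for one subject.
    """
    # One correlation matrix per task; columns are regions.
    corrs = [np.corrcoef(ts, rowvar=False) for ts in task_timeseries]
    # Element-wise average across the task paradigms.
    return np.mean(corrs, axis=0)

# Illustrative usage with synthetic time series (3 tasks, 100 volumes, 12 regions).
rng = np.random.default_rng(0)
Z = coactivation_matrix([rng.standard_normal((100, 12)) for _ in range(3)])
```

Averaging across tasks yields one network-valued predictor per subject, which is what the first modeling stage consumes.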
Methodology – Stage 1: Latent Network Effects
The first stage models each edge of the undirected network matrix \(Z_i\) as an interaction between latent vectors associated with the two incident nodes. Formally, \(z_{i}(v,v') \approx u_{i,v}^\top w_{i,v'}\), where \(u\) and \(w\) are low‑dimensional embeddings learned jointly across subjects. This parsimonious probabilistic formulation captures complex inter‑node dependencies while keeping the number of parameters manageable. Estimation is performed by maximum likelihood with an L2 regularizer, producing subject‑specific latent node effects.
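The penalized low-rank factorization described above can be sketched with plain gradient descent on the squared reconstruction error plus an L2 penalty. This is a stand-in for the paper's estimator, not a reproduction of it: the function name, learning rate, penalty weight, and iteration count are all illustrative.

```python
import numpy as np

def fit_latent_effects(Z, d=2, lam=0.01, lr=0.01, n_iter=2000, seed=0):
    """Fit z(v, v') ≈ u_v^T w_v' by L2-penalized least squares.

    Z: (R, R) symmetric network matrix for one subject.
    Returns latent node effects U, W of shape (R, d).
    """
    R = Z.shape[0]
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((R, d))
    W = 0.1 * rng.standard_normal((R, d))
    for _ in range(n_iter):
        resid = U @ W.T - Z           # (R, R) reconstruction error
        gU = resid @ W + lam * U      # gradient of 0.5*||resid||^2 + penalty
        gW = resid.T @ U + lam * W
        U -= lr * gU
        W -= lr * gW
    return U, W
```

In the two-stage workflow, the fitted latent node effects (rather than the raw network matrix) are passed downstream as subject-specific inputs to the second-stage regression.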
Methodology – Stage 2: Explainable AI Regression
In the second stage, the outcome image \(y_i(s_{v,j})\) at location \(s_{v,j}\) (ROI \(j\) within node \(v\)) is modeled as the sum of spatially varying regression effects of the image predictors and a deep neural network mapping of the Stage‑1 latent node effects, schematically of the form

\[
y_i(s_{v,j}) = x_i(s_{v,j})^\top \beta(s_{v,j}) + g(u_{i,v}) + \epsilon_i(s_{v,j}),
\]

where \(x_i(\cdot)\) stacks the CT and GWMIC images, \(\beta(\cdot)\) are spatially varying coefficients, \(g(\cdot)\) is a dropout‑equipped deep neural network acting on the latent node effects, and \(\epsilon_i\) is residual noise. Keeping dropout active at prediction time yields Monte Carlo uncertainty estimates for both components.
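The Monte Carlo dropout mechanism used for uncertainty quantification can be shown in isolation: dropout stays active at prediction time, and repeated stochastic forward passes give a predictive mean and standard deviation. This NumPy sketch uses a single assumed hidden layer; the paper's actual architecture and parameterization are not reproduced here.

```python
import numpy as np

def mc_dropout_predict(x, params, p=0.2, T=100, seed=0):
    """Monte Carlo dropout prediction with a one-hidden-layer network.

    x: (n, d_in) inputs; params: (W1, b1, W2, b2) fitted weights.
    Returns (mean, std) over T stochastic forward passes, each of
    shape (n, d_out); std serves as the predictive uncertainty.
    """
    W1, b1, W2, b2 = params
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(T):
        h = np.maximum(0.0, x @ W1 + b1)      # hidden layer (ReLU)
        mask = rng.random(h.shape) > p        # Bernoulli dropout mask
        h = h * mask / (1.0 - p)              # inverted-dropout scaling
        preds.append(h @ W2 + b2)
    preds = np.stack(preds)
    return preds.mean(axis=0), preds.std(axis=0)

# Illustrative usage with random (untrained) weights.
rng = np.random.default_rng(2)
params = (rng.standard_normal((4, 16)), np.zeros(16),
          rng.standard_normal((16, 1)), np.zeros(1))
mu, sd = mc_dropout_predict(rng.standard_normal((5, 4)), params)
```

Averaging over stochastic passes approximates the predictive mean, while the spread across passes provides the uncertainty bands reported alongside the point predictions.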