Analysis of Dirichlet Energies as Over-smoothing Measures
We analyze the distinctions between two functionals often used as over-smoothing measures: the Dirichlet energies induced by the unnormalized graph Laplacian and the normalized graph Laplacian. We demonstrate that the latter fails to satisfy the axiomatic definition of a node-similarity measure proposed by Rusch et al. By formalizing fundamental spectral properties of these two definitions, we highlight critical distinctions necessary to select the metric that is spectrally compatible with the GNN architecture, thereby resolving ambiguities in monitoring the smoothing dynamics.
💡 Research Summary
The paper conducts a rigorous comparison of two Dirichlet‑energy based over‑smoothing measures commonly employed in graph neural networks (GNNs): one induced by the unnormalized graph Laplacian Δ and the other by the symmetric normalized Laplacian Δ_norm. Starting from the axiomatic framework introduced by Rusch et al. (2021), the authors recall two essential axioms for a node‑similarity measure μ: (1) μ(X)=0 if and only if all node embeddings are identical (i.e., X_i = c for every node i), and (2) sub‑additivity (the triangle inequality). A function satisfying both is deemed a valid over‑smoothing metric.
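The "zero-iff-constant" axiom is easy to check numerically for the unnormalized energy E_Δ = (1/|V|) tr(XᵀΔX): since Δ𝟙 = 0, any constant embedding yields zero energy. A minimal numpy sketch, using an illustrative 3-node path graph (not taken from the paper):

```python
import numpy as np

# Toy graph: a 3-node path (non-regular: degrees 1, 2, 1).
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))
L = D - A                                # unnormalized Laplacian Δ = D − A

def dirichlet_energy(X, L):
    """E_Δ(X) = (1/|V|) tr(Xᵀ Δ X)."""
    return np.trace(X.T @ L @ X) / L.shape[0]

X_const = 3.0 * np.ones((3, 2))          # identical embedding on every node
X_var = np.array([[0.0], [1.0], [2.0]])  # non-constant signal

print(dirichlet_energy(X_const, L))      # 0.0: axiom 1 holds for E_Δ
print(dirichlet_energy(X_var, L) > 0)    # True: non-constant → positive energy
```

Because Δ annihilates constant vectors regardless of node degrees, E_Δ (and hence √E_Δ) vanishes exactly on constant embeddings.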
The unnormalized Dirichlet energy E_Δ = (1/|V|) tr(XᵀΔX) and its square root √E_Δ are shown to satisfy both axioms, confirming that they are legitimate over‑smoothing measures. In contrast, the normalized Dirichlet energy E_Δnorm = tr(XᵀΔ_normX) does not satisfy axiom 1 on non‑regular graphs. The authors demonstrate that even when every node carries the same constant signal c, the energy reduces to a sum of terms proportional to (c/√d_i − c/√d_j)², which remains positive whenever neighboring nodes have different degrees. Consequently, E_Δnorm fails the “zero‑iff‑constant” condition.
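The failure of axiom 1 can be reproduced with a few lines of numpy. The 3-node path graph below is an illustrative non-regular example (degrees 1, 2, 1), not one taken from the paper; the edge-wise identity tr(XᵀΔ_normX) = Σ_{(i,j)∈E} (x_i/√d_i − x_j/√d_j)² is checked against the matrix form:

```python
import numpy as np

# Non-regular toy graph: path on 3 nodes, degrees 1, 2, 1.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
d = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_norm = np.eye(3) - D_inv_sqrt @ A @ D_inv_sqrt   # Δ_norm = I − D^{-1/2} A D^{-1/2}

c = 1.0
x = np.full((3, 1), c)                             # constant signal on every node

E_norm = float(np.trace(x.T @ L_norm @ x))
# Edge-wise form: sum over undirected edges of (c/√d_i − c/√d_j)².
edge_sum = sum((c / np.sqrt(d[i]) - c / np.sqrt(d[j])) ** 2
               for i, j in [(0, 1), (1, 2)])

print(E_norm)                        # ≈ 0.172 > 0 despite the constant signal
print(np.isclose(E_norm, edge_sum))  # True: matrix and edge-wise forms agree
```

Here E_Δnorm = 3 − 2√2 ≈ 0.172 on a perfectly constant signal, so the normalized energy cannot certify that over-smoothing has occurred on non-regular graphs.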
To explain why the two Laplacians behave differently, the paper proves that Δ and Δ_norm generally do not commute on non-regular graphs; since both matrices are symmetric, this means they do not share a common eigenbasis, so the two energies measure smoothness with respect to genuinely different spectral decompositions of the signal.
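The non-commutativity is straightforward to verify numerically. The sketch below uses the same illustrative 3-node path graph, and contrasts it with the triangle K₃, a 2-regular graph on which Δ_norm = Δ/2 and the commutator vanishes:

```python
import numpy as np

def laplacians(A):
    """Return the unnormalized and symmetric normalized Laplacians of A."""
    d = A.sum(axis=1)
    L = np.diag(d) - A
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_norm = np.eye(len(d)) - D_inv_sqrt @ A @ D_inv_sqrt
    return L, L_norm

# Non-regular graph: 3-node path (degrees 1, 2, 1).
A_path = np.array([[0, 1, 0],
                   [1, 0, 1],
                   [0, 1, 0]], dtype=float)
L, L_norm = laplacians(A_path)
comm_path = np.linalg.norm(L @ L_norm - L_norm @ L)

# Regular graph: triangle K₃ (2-regular), where Δ_norm = Δ/2.
A_tri = np.ones((3, 3)) - np.eye(3)
L_r, L_norm_r = laplacians(A_tri)
comm_tri = np.linalg.norm(L_r @ L_norm_r - L_norm_r @ L_r)

print(comm_path > 0)   # True: Δ and Δ_norm do not commute on the path
print(comm_tri)        # ≈ 0: on a d-regular graph Δ_norm = Δ/d, so they commute
```

On any d-regular graph Δ_norm is just Δ scaled by 1/d, so the two energies agree up to a constant there; the distinction between the measures only arises once degrees vary.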