Graph Neural Networks as Gradient Flows: understanding graph convolutions via energy
Abstract
Gradient flows are differential equations that minimize an energy functional and constitute the main descriptors of physical systems. We apply this formalism to Graph Neural Networks (GNNs) to develop new frameworks for learning on graphs as well as provide a better theoretical understanding of existing ones. We derive GNNs as a gradient flow equation of a parametric energy that provides a physics-inspired interpretation of GNNs as learning particle dynamics in the feature space. In particular, we show that in graph convolutional models (GCN), the positive/negative eigenvalues of the channel mixing matrix correspond to attractive/repulsive forces between adjacent features. We rigorously prove how the channel-mixing can learn to steer the dynamics towards low or high frequencies, which allows the model to deal with heterophilic graphs. We show that the same class of energies is decreasing along a larger family of GNNs; albeit not gradient flows, they retain their inductive bias. We experimentally evaluate an instance of the gradient flow framework that is principled, more efficient than GCN, and achieves competitive performance on graph datasets of varying homophily, often outperforming recent baselines specifically designed to target heterophily.
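The core idea of the abstract, evolving node features by a gradient flow of a quadratic energy whose channel-mixing matrix is symmetric, can be illustrated numerically. The following is a minimal NumPy sketch, not the paper's actual implementation: the graph, the step size `tau`, and the simplified energy (a single adjacency-coupling term, without the paper's full parametric form) are all illustrative assumptions. With a symmetric channel-mixing matrix `W`, the update is a true gradient descent step, so the energy is non-increasing along the discretised flow for a small enough step size.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small graph: 5 nodes, undirected, given by a symmetric adjacency.
A = np.array([
    [0, 1, 1, 0, 0],
    [1, 0, 1, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
], dtype=float)
deg = A.sum(axis=1)
A_norm = A / np.sqrt(np.outer(deg, deg))  # symmetric normalisation D^{-1/2} A D^{-1/2}

d = 4  # number of feature channels
X = rng.standard_normal((5, d))

# Symmetric channel-mixing matrix: per the abstract, its positive/negative
# eigenvalues act as attractive/repulsive forces between adjacent features.
M = rng.standard_normal((d, d))
W = 0.5 * (M + M.T)

def energy(X):
    # Simplified quadratic energy E(X) = -1/2 tr(X^T A_norm X W); its gradient
    # flow is a linear graph-convolutional update (an illustrative special case
    # of the paper's parametric energy, not its full form).
    return -0.5 * np.trace(X.T @ A_norm @ X @ W)

def gradient_flow_step(X, tau=0.05):
    # Explicit Euler step of dX/dt = -grad E(X) = A_norm @ X @ W;
    # symmetry of W is what makes this a genuine gradient flow.
    return X + tau * (A_norm @ X @ W)

energies = [energy(X)]
for _ in range(10):
    X = gradient_flow_step(X)
    energies.append(energy(X))

# The energy decreases monotonically along the flow (for small enough tau).
assert all(e1 <= e0 for e0, e1 in zip(energies, energies[1:]))
```

Repulsive interactions (negative eigenvalues of `W`) amplify high-frequency components of the features, which is the mechanism the abstract invokes for handling heterophilic graphs.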
 Publication:
 arXiv e-prints
 Pub Date:
 June 2022
 DOI:
 10.48550/arXiv.2206.10991
 arXiv:
 arXiv:2206.10991
 Bibcode:
 2022arXiv220610991D
 Keywords:
 Computer Science - Machine Learning;
 Statistics - Machine Learning
 E-Print:
 First two authors equal contribution