On Dimensionality of Feature Vectors in MPNNs

Bravo C.; Kozachinskiy A.; Rojas C.

Abstract

We revisit the result of Morris et al. (AAAI'19) that message-passing graph neural networks (MPNNs) are equal in their distinguishing power to the Weisfeiler-Leman (WL) isomorphism test. Morris et al. prove their result using the ReLU activation function and O(n)-dimensional feature vectors, where n is the size of the graph. Recently, by introducing randomness into the architecture, Aamand et al. (NeurIPS'22) improved this bound to O(log n)-dimensional feature vectors, although at the expense of guaranteeing perfect simulation only with high probability. In all these constructions, to guarantee equivalence to the WL test, the dimension of the feature vectors in the MPNN has to increase with the size of the graphs. However, architectures used in practice have feature vectors of constant dimension. Thus, there is a gap between the guarantees provided by these results and the actual characteristics of architectures used in practice. In this paper we close this gap by showing that, for any non-polynomial analytic activation function (such as the sigmoid), feature vectors of dimension d = 1 suffice to guarantee that MPNNs are equivalent to the WL test, independently of the size of the graphs.
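For context, the MPNNs in this line of work update node features by aggregating over neighborhoods. A standard update of the kind studied by Morris et al. is (the exact parametrization below is illustrative):

x_v^(t) = σ( W_1 x_v^(t-1) + W_2 Σ_{u ∈ N(v)} x_u^(t-1) ),

where x_v^(t) ∈ R^d is the feature vector of node v after round t, N(v) is the set of neighbors of v, W_1 and W_2 are learned matrices, and σ is the activation function. The dimension d of these feature vectors is the quantity the paper pins down.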
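The WL test referenced throughout is 1-WL color refinement: nodes start with a uniform color and repeatedly recolor themselves by their current color together with the multiset of their neighbors' colors; two graphs are distinguished exactly when their stable color multisets differ. Below is a minimal Python sketch for illustration only; it is not the authors' construction, and the adjacency-dict encoding and tuple-valued colors are our own implementation choices.

    from collections import Counter

    def wl_color_multiset(adj, rounds=None):
        # adj: dict mapping each node to a list of its neighbors.
        # Colors are canonical nested tuples, so the returned multisets
        # are directly comparable across different graphs.
        rounds = len(adj) if rounds is None else rounds  # n rounds always suffice
        colors = {v: () for v in adj}  # uniform initial coloring
        for _ in range(rounds):
            # New color = (old color, sorted multiset of neighbors' colors).
            colors = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                      for v in adj}
        return Counter(colors.values())

    # 1-WL (hence any MPNN) cannot distinguish a 6-cycle from two triangles:
    cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
    triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
    assert wl_color_multiset(cycle6) == wl_color_multiset(triangles)

    # ...but it does distinguish a 4-path from a 4-cycle:
    path4 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
    cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
    assert wl_color_multiset(path4) != wl_color_multiset(cycle4)

The paper's result says that an MPNN whose feature vectors simulate this refinement can do so with d = 1 when σ is non-polynomial analytic.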

More information

Title according to WOS: On Dimensionality of Feature Vectors in MPNNs
Journal title: ALGORITHMIC LEARNING THEORY
Volume: 235
Publisher: JMLR-JOURNAL MACHINE LEARNING RESEARCH
Publication date: 2024
Language: English
Notes: ISI