Graph neural networks (GNNs) are traditionally categorised into message-passing neural networks (MPNNs) and spectral graph neural networks, representing distinct approaches within machine learning and signal processing. Antonis Vasileiou from RWTH Aachen University, Juan Cervino from Massachusetts Institute of Technology, and Pascal Frossard from École Polytechnique Fédérale de Lausanne, alongside Kanatsoulis et al., demonstrate that this separation is largely unfounded and impedes advancement in the field. Their work presents a novel perspective, framing both MPNNs and spectral GNNs as differing parametrisations of permutation-equivariant operators applied to graph signals. This reveals that many existing architectures possess comparable expressive capabilities, with significant differences emerging only under specific conditions. Furthermore, the researchers highlight the complementary strengths of each approach, noting that MPNNs excel in discrete structure and expressivity analysis, while spectral GNNs offer tools for understanding smoothing and stability. Ultimately, this research advocates for a unified theoretical framework to accelerate progress in graph learning by recognising the fundamental connections between these two GNN paradigms.
Message-passing and spectral methods exhibit equivalence as permutation-equivariant operators on graphs
Researchers have revealed a unifying perspective on graph neural networks, demonstrating that commonly separated approaches, message-passing and spectral methods, are fundamentally linked. This work posits that the perceived divide between these two traditions is largely artificial, hindering advancements in the field of graph learning.
By framing both message-passing neural networks and spectral graph neural networks as different ways to represent permutation-equivariant operators acting on graph signals, scientists have identified a surprising degree of equivalence in their expressive power. This new viewpoint suggests that many popular architectures are more similar than previously thought, with genuine differences emerging only under specific conditions.
The study highlights that message-passing networks excel at capturing discrete structure and facilitating expressivity analysis through tools borrowed from logic and graph isomorphism research. Conversely, spectral graph neural networks offer principled methods for understanding crucial graph properties such as smoothing, bottlenecks, stability, and community structure.
This complementary strength suggests that progress in graph learning will accelerate by acknowledging these similarities and differences, and by fostering a unified theoretical framework. Rather than treating these approaches as competing paradigms, the research advocates for a cohesive understanding of their shared foundations.
Specifically, the work demonstrates that the aggregation of information in message-passing layers can often be interpreted as the localized action of a graph shift operator. Similarly, the global actions of spectral filters, central to spectral GNNs, can frequently be implemented or approximated by sufficiently deep message-passing networks.
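This equivalence can be checked directly for polynomial filters. The sketch below is a minimal NumPy illustration (the toy graph, signal, and filter taps are invented for the example): the same filter is applied once as a sequence of one-hop aggregation steps, and once in the eigenbasis of the shift operator.

```python
import numpy as np

# Toy undirected graph: adjacency as the graph shift operator S (an assumption;
# any permutation-equivariant shift, e.g. a Laplacian, would work the same way).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, -2.0, 3.0, 0.5])          # one scalar signal per node
coeffs = [0.5, -0.3, 0.1]                    # filter taps h_0, h_1, h_2

# "Message-passing" view: localized aggregation, one hop per filter tap.
def filter_by_message_passing(S, x, coeffs):
    out = np.zeros_like(x)
    power = x.copy()                          # S^0 x
    for h in coeffs:
        out += h * power
        power = S @ power                     # one round of neighbour aggregation
    return out

# "Spectral" view: the same polynomial applied to the eigenvalues of S.
def filter_by_spectrum(S, x, coeffs):
    lam, V = np.linalg.eigh(S)                # S = V diag(lam) V^T for symmetric S
    response = sum(h * lam**k for k, h in enumerate(coeffs))
    return V @ (response * (V.T @ x))

y_mp = filter_by_message_passing(A, x, coeffs)
y_sp = filter_by_spectrum(A, x, coeffs)
assert np.allclose(y_mp, y_sp)               # identical up to floating point
```

Both views compute the same operator h(S) = Σₖ hₖ Sᵏ; only the parametrisation differs.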
This finding underscores the potential for cross-pollination of ideas and techniques between the two communities. The research further proposes a clear separation between spectral filtering and spectral positional encodings, suggesting the latter should be considered a design choice applicable to both types of GNNs, potentially enhancing their performance.
Ultimately, this position paper argues for a conceptual shift in how graph neural networks are understood and developed. By providing a unifying language and clarifying the relationships between message-passing and spectral methods, the researchers aim to guide the field towards a more principled and effective approach to learning from graph data, with implications for diverse applications including optimisation and weather forecasting. This work establishes a foundation for future research focused on leveraging the combined strengths of both paradigms.
Permutation Equivariance and Expressive Power of Graph Neural Networks
Researchers posit that the distinction between message-passing graph neural networks (MPNNs) and spectral graph neural networks (GNNs) is largely artificial, hindering progress within the field of graph learning. This work frames both MPNNs and spectral GNNs as different parametrizations of permutation-equivariant operators acting on graph signals, establishing a unifying perspective.
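The permutation-equivariance property at the heart of this framing is easy to verify numerically. The following sketch uses a hand-rolled sum-aggregation layer as a stand-in for either GNN family (all sizes and weights here are arbitrary assumptions) and checks that relabelling the nodes before the layer matches relabelling them after.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 3
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = np.triu(A, 1)
A = A + A.T                                  # random undirected adjacency
X = rng.standard_normal((n, d))              # node features
W = rng.standard_normal((d, d))              # shared weight matrix

def gnn_layer(A, X, W):
    # A simple sum-aggregation layer: each node mixes its own features
    # with its neighbours', then applies a shared linear map.
    return np.tanh((X + A @ X) @ W)

# A random permutation P of the node set.
perm = rng.permutation(n)
P = np.eye(n)[perm]

# Equivariance: f(P A P^T, P X) = P f(A, X).
lhs = gnn_layer(P @ A @ P.T, P @ X, W)
rhs = P @ gnn_layer(A, X, W)
assert np.allclose(lhs, rhs)
```

Any layer built from the shift operator and elementwise nonlinearities satisfies this identity, which is why both GNN families fall under the same umbrella.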
Investigations reveal that many commonly used architectures exhibit equivalent expressive power, with genuine differences emerging only under specific conditions. The study highlights the complementary strengths of each approach, noting that MPNNs offer a natural framework for analysing discrete structure and expressivity using tools from logic and graph isomorphism research.
Conversely, the spectral perspective provides principled methods for understanding smoothing effects, identifying bottlenecks, assessing stability, and characterising community structure within graphs. This analysis involved a theoretical re-evaluation of existing architectures, demonstrating their relationships through the lens of permutation-equivariant operators.
Furthermore, the research emphasizes the potential for accelerated progress by recognising the key similarities and differences between these two GNN types. By advocating for a unified theoretical and conceptual framework, rather than treating them as competing paradigms, the study aims to foster innovation in graph learning. This work does not introduce new architectures but instead provides a novel viewpoint for interpreting and comparing existing methods, ultimately promoting a more cohesive understanding of the field.
Permutation Equivariance Unifies Message Passing and Spectral Graph Neural Networks
Researchers detail a framework for understanding both message-passing neural networks and spectral graph neural networks as parametrizations of permutation-equivariant operators on graph signals. This work establishes that many popular architectures possess equivalent expressive power, with genuine differences arising only in specific operational regimes.
Message-passing neural networks offer a natural language for discrete structure and expressivity analysis, utilising tools from logic and graph isomorphism research. This allows for a principled understanding of structural properties that these networks can and cannot capture, leading to the development of more expressive architectures based on the k-dimensional Weisfeiler-Leman algorithm and subgraph counts.
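As a concrete illustration of these expressivity tools, the sketch below implements 1-dimensional Weisfeiler-Leman colour refinement (the k = 1 base case) and exhibits a standard pair of graphs it cannot tell apart, which is exactly the kind of limitation that motivates the more expressive k-dimensional variants.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-WL colour refinement; adj maps each node to its list of neighbours."""
    colors = {v: 0 for v in adj}                       # uniform initial colour
    for _ in range(rounds):
        # New colour = (own colour, multiset of neighbour colours).
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        relabel = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: relabel[sigs[v]] for v in adj}
    return Counter(colors.values())                    # colour histogram

# Two 6-node graphs 1-WL cannot distinguish: a 6-cycle vs. two triangles.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
assert wl_colors(cycle6) == wl_colors(two_triangles)   # same colour histograms
```

Both graphs are 2-regular, so every node keeps the same colour forever; an MPNN whose power is bounded by 1-WL therefore cannot separate them, while subgraph (triangle) counts can.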
The study highlights the seamless extension of message-passing neural networks to directed graphs and link prediction tasks through well-understood constructions, contrasting with the more cumbersome development of spectral GNNs for similar applications. Furthermore, the local update mechanism of message-passing neural networks aligns closely with established graph algorithms, demonstrating their capability to express and learn algorithms from data, including solving linear programs and approximating hard maximum constraint satisfaction problems.
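One such construction can be sketched as follows: aggregate along and against the edge direction with separate weight matrices, then score candidate links from the resulting node embeddings. The layer and scoring function below are illustrative assumptions in NumPy, not the specific architectures discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 4
A = (rng.random((n, n)) < 0.3).astype(float)  # random directed adjacency
np.fill_diagonal(A, 0)
X = rng.standard_normal((n, d))               # node features
W_out = rng.standard_normal((d, d))           # weights for out-neighbour messages
W_in = rng.standard_normal((d, d))            # weights for in-neighbour messages

def directed_mp_layer(A, X):
    # Aggregate along the edge direction (A) and against it (A^T) with
    # separate weights, a common way to extend MPNNs to directed graphs.
    return np.tanh(A @ X @ W_out + A.T @ X @ W_in)

# Two rounds of directed message passing produce node embeddings.
H = directed_mp_layer(A, directed_mp_layer(A, X))

def link_score(H, u, v):
    # Score a candidate directed edge u -> v from the learned embeddings.
    return float(H[u] @ H[v])
```

Because direction is handled purely by which adjacency matrix carries each message, no spectral machinery (which presumes a symmetric or normal operator) is needed.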
From a systems perspective, the explicit aggregation and update structure of message-passing neural networks facilitates large-scale, distributed graph learning, particularly in databases and recommender systems. Spectral graph neural networks provide a unique abstraction for understanding information smoothing, bottlenecks, and community structure within graphs.
Oversmoothing, observed as the contraction of node embeddings, is characterised as spectral contraction, where repeated low-pass filtering collapses features into the span of the slowest-decaying eigenvectors. The frequency response of spectral filters explicitly reveals the contraction rate and the role of the graph’s spectrum, exposing trade-offs between depth and contraction. Bottlenecks, manifesting as low algebraic connectivity or poor conductance, are identified through measures such as balanced Forman curvature, effective resistance, and commute time, providing computable proxies for oversquashing.
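This spectral-contraction view can be demonstrated in a few lines. In the sketch below (a lazy normalised-adjacency filter is chosen here, an assumption made so that the dominant eigenvalue is unique), repeated filtering of random features on a small path graph collapses all feature energy onto the span of the dominant eigenvector.

```python
import numpy as np

# Path graph on 6 nodes; its lazy normalised adjacency has a unique eigenvalue
# at 1 (eigenvector proportional to sqrt(degree)) and all others strictly below.
n = 6
A = np.diag(np.ones(n - 1), 1)
A = A + A.T
deg = A.sum(axis=1)
S = 0.5 * (np.eye(n) + A / np.sqrt(np.outer(deg, deg)))  # lazy low-pass filter

X = np.random.default_rng(2).standard_normal((n, 3))     # random node features
top = np.sqrt(deg) / np.linalg.norm(np.sqrt(deg))        # slowest-decaying mode

def residual(X):
    # Feature energy outside the span of the dominant eigenvector.
    return np.linalg.norm(X - np.outer(top, top @ X))

before = residual(X)
for _ in range(200):
    X = S @ X                                 # repeated low-pass filtering
assert residual(X) < 1e-6 * before            # spectral contraction in action
```

The contraction rate is governed by the second-largest eigenvalue of the filter, which is exactly the trade-off between depth and contraction that the frequency response makes visible.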
Equivalence of Message Passing and Spectral Graph Neural Networks
Researchers demonstrate that the distinction between message-passing graph neural networks (MPNNs) and spectral graph neural networks is largely artificial and impedes progress within the field. This work proposes a unified perspective, framing both types of GNNs as different ways to represent permutation-equivariant operators acting on graph signals.
Many commonly used architectures exhibit equivalent expressive power, with genuine differences emerging only under specific conditions. MPNNs excel at representing discrete structures and allow for expressivity analysis using tools from logic and graph isomorphism research. Conversely, the spectral perspective offers principled methods for understanding smoothing, bottlenecks, stability, and community structure within graphs.
Generalisation analyses for both MPNNs and spectral GNNs rely on similar assumptions, such as bounded node features and Lipschitz-continuous functions, leading to comparable bounds on performance. Spectral positional encodings, often linked to spectral GNNs, are presented as input representation choices independent of the underlying network architecture.
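That independence is easy to make concrete: the encodings are computed once from the graph and concatenated to the node features, with no commitment to any particular downstream network. A minimal sketch follows, using the combinatorial Laplacian as an assumed choice; note that eigenvector signs and degenerate eigenspaces are only defined up to symmetry, a well-known caveat of these encodings.

```python
import numpy as np

def laplacian_pe(A, k=2):
    """Spectral positional encodings: the k nontrivial Laplacian eigenvectors
    with smallest eigenvalues, treated purely as extra input features."""
    L = np.diag(A.sum(axis=1)) - A            # combinatorial Laplacian D - A
    lam, V = np.linalg.eigh(L)                # eigenvalues in ascending order
    return V[:, 1:k + 1]                      # skip the constant eigenvector

# 6-cycle with one scalar feature per node.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
X = np.ones((n, 1))

# Concatenate the encodings to the inputs; the downstream network,
# message-passing or spectral, is free to use or ignore them.
X_aug = np.concatenate([X, laplacian_pe(A, k=2)], axis=1)
assert X_aug.shape == (6, 3)
```

Since the encodings live in the input representation, swapping the downstream architecture leaves them untouched, which is precisely the separation between filtering and positional encoding the paper argues for.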
Acknowledging limitations, the authors note that their analysis relies on assumptions regarding operator behaviour and graph characteristics. Future research should focus on further unifying these perspectives within a common theoretical framework, rather than treating them as competing paradigms. This integrated approach promises to accelerate advancements in graph learning by leveraging the complementary strengths of both message-passing and spectral methods, ultimately improving model performance and understanding of graph-structured data.
🗞 Position: Message-passing and spectral GNNs are two sides of the same coin
🧠 ArXiv: https://arxiv.org/abs/2602.10031
