Pyramidal Graph Echo State Networks
Claudio Gallicchio, Alessio Micheli
2020-01-01
Abstract
We analyze graph neural network models that combine iterative message passing, implemented by a function with untrained weights, with graph pooling operations. In particular, we alternate randomized neural message passing with graph coarsening operations, which provide multiple views of the underlying graph. The representation of each view is concatenated to build a graph embedding for graph-level classification. The main advantage of the proposed architecture is the speed, further improved by the pooling, with which graph-level representations are computed. Results obtained on popular graph classification benchmarks, comparing different topological pooling techniques, support our claim.
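The pipeline the abstract describes — untrained (echo-state-style) message passing alternated with graph coarsening, with a graph-level readout of each view concatenated into the final embedding — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random-weight scaling, the pairwise coarsening rule, and the sum readout are simplifying assumptions standing in for the paper's actual reservoir construction and topological pooling operators.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_message_passing(A, X, hidden=16, iters=4):
    """Echo-state-style message passing: fixed random weights, no training."""
    W_in = rng.uniform(-1, 1, (X.shape[1], hidden))
    W_hat = rng.uniform(-1, 1, (hidden, hidden))
    # Rescale the recurrent matrix for stability (assumed spectral radius 0.9).
    W_hat *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_hat)))
    H = np.zeros((A.shape[0], hidden))
    for _ in range(iters):
        H = np.tanh(X @ W_in + A @ H @ W_hat)
    return H

def coarsen(A, X):
    """Toy coarsening: merge consecutive node pairs.

    A placeholder for the topological pooling methods compared in the paper
    (e.g. graph-clustering-based coarsening).
    """
    n = A.shape[0]
    groups = np.arange(n) // 2
    k = groups.max() + 1
    S = np.zeros((n, k))          # hard assignment matrix
    S[np.arange(n), groups] = 1.0
    A_c = S.T @ A @ S             # coarsened adjacency
    X_c = S.T @ X / S.sum(0)[:, None]  # mean-pooled node features
    return A_c, X_c

def pyramidal_embedding(A, X, levels=3):
    """Alternate random message passing and coarsening; concatenate readouts."""
    views = []
    for _ in range(levels):
        H = random_message_passing(A, X)
        views.append(H.sum(axis=0))   # graph-level readout of this view
        A, X = coarsen(A, H)          # next, coarser view of the graph
    return np.concatenate(views)      # multi-resolution graph embedding
```

The concatenated embedding can then be fed to any standard (trained) classifier, which is where the speed advantage comes from: only the final readout layer requires training.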