
Pyramidal Graph Echo State Networks

Claudio Gallicchio; Alessio Micheli
2020-01-01

Abstract

We analyze graph neural network models that combine iterative message passing, implemented by a function with untrained weights, with graph pooling operations. In particular, we alternate randomized neural message passing with graph coarsening operations, which provide multiple views of the underlying graph. The views are concatenated to build a graph embedding for graph-level classification. The main advantage of the proposed architecture is the speed with which it computes graph-level representations, further improved by the pooling. Results obtained on popular graph classification benchmarks, comparing different topological pooling techniques, support our claim.
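To illustrate the architecture the abstract describes, here is a minimal sketch of alternating untrained (echo-state style) message passing with graph coarsening and concatenating the per-level readouts. All function names, the toy pairwise coarsening, and the mean readout are illustrative assumptions, not the authors' implementation; the paper compares actual topological pooling methods.

```python
import numpy as np

def random_message_passing(A, X, hidden_dim, iters=10, rho=0.9, seed=0):
    """Untrained message passing: node states evolve under a fixed random
    recurrent map over the graph (echo-state style); no weights are trained."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-1, 1, (X.shape[1], hidden_dim))
    W = rng.uniform(-1, 1, (hidden_dim, hidden_dim))
    # Rescale recurrent weights so the map is contractive (spectral radius < 1).
    W *= rho / max(abs(np.linalg.eigvals(W)))
    H = np.zeros((A.shape[0], hidden_dim))
    for _ in range(iters):
        H = np.tanh(X @ W_in + A @ H @ W)
    return H

def coarsen(A, X, factor=2):
    """Toy coarsening: merge consecutive node pairs (a hypothetical stand-in
    for a topological pooling method); features are averaged per cluster."""
    n = A.shape[0]
    groups = np.arange(n) // factor
    k = groups.max() + 1
    S = np.zeros((n, k))          # cluster-assignment matrix
    S[np.arange(n), groups] = 1.0
    A_c = (S.T @ A @ S > 0).astype(float)      # coarsened adjacency
    X_c = S.T @ X / S.sum(axis=0)[:, None]     # cluster-mean features
    return A_c, X_c

def pyramidal_embedding(A, X, levels=3, hidden_dim=16):
    """Alternate random message passing and coarsening; concatenate the
    mean-pooled node states of every level into one graph embedding."""
    views = []
    for _ in range(levels):
        H = random_message_passing(A, X, hidden_dim)
        views.append(H.mean(axis=0))   # graph-level readout of this view
        A, X = coarsen(A, H)           # coarser graph for the next level
    return np.concatenate(views)
```

In this setup only a linear classifier on the concatenated embedding would need training, which is where the claimed speed advantage comes from: the message-passing weights stay random, and each coarsening level shrinks the graph the next stage operates on.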
ISBN: 978-2-87587-073-5


Use this identifier to cite or link to this document: https://hdl.handle.net/11568/1072576

Citations
  • Scopus 2