Graph and geometric deep learning
Graph and geometric deep learning are machine learning fields that combine graph-based data representations with learning algorithms, so as to exploit the inductive bias arising from functional dependencies among data points. In this area, IDSIA is currently exploring two main research directions: learning high-order graph structures from data and investigating graph state-space dynamical systems.

Learning high-order graph structures from data

This research line aims to advance representation learning techniques that encode relational information of any order and, at the same time, retrieve that relational structure from data. We identify three main research tasks:

  1. Graph learning. This activity aims to develop scalable methods for making inferences from multivariate data by exploiting relational inductive biases. Target domains range from physical and virtual sensor networks to knowledge graphs and point clouds.
  2. Statistical assessment of (hyper)graph estimators. This research involves investigating suitable statistical tools and developing hypothesis tests to assess the significance of a learned graph, as well as studying conditions under which learnability is guaranteed.
  3. Quasi-invertible graph embeddings. Processing data in a latent space is convenient, but often implies losing the explicit relational information. This investigation aims to exploit techniques and theoretical results developed in the graph learning framework to design embedding methods tailored to solving this decoding problem.
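As a minimal illustration of tasks 1 and 2 (a sketch under simplifying assumptions, not the actual methodology under development), one can estimate a graph over multivariate sensor data by thresholding pairwise correlations, and then assess each candidate edge with a permutation test. All thresholds, data, and function names below are hypothetical choices for the example.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def learn_graph(series, threshold=0.5):
    """Naive graph learning: add edge (i, j) whenever the absolute
    correlation between variables i and j exceeds the threshold."""
    n = len(series)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(series[i], series[j])) > threshold}

def edge_pvalue(x, y, n_perm=500, seed=0):
    """Permutation test for one edge: p-value under the null hypothesis
    that x and y are unrelated, obtained by shuffling y and re-measuring
    the correlation."""
    rng = random.Random(seed)
    observed = abs(pearson(x, y))
    y = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y)
        if abs(pearson(x, y)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Toy data: x1 drives x0; x2 is independent noise.
rng = random.Random(1)
x1 = [rng.gauss(0, 1) for _ in range(200)]
x0 = [v + 0.1 * rng.gauss(0, 1) for v in x1]
x2 = [rng.gauss(0, 1) for _ in range(200)]
series = [x0, x1, x2]

edges = learn_graph(series)
print(edges)                       # {(0, 1)}: only the dependent pair
print(edge_pvalue(x0, x1) < 0.05)  # True: the edge is significant
```

Real methods in this line replace the correlation score with learned, possibly higher-order dependency measures, but the two-stage structure (estimate, then statistically validate) is the same.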

Graph state-space neural models

The research aims to build theories, methodologies, and tools for (hyper)graph-based predictive models that extend traditional state-space representations.
The main research challenges are:

  • Enable comprehensive modelling of dynamical graph systems through graph representations for inputs, states and, possibly, outputs.
  • Design advanced neural architectures for processing data in the space of graphs. Affordable computation is of primary relevance whenever large graphs are considered: it follows that computational complexity must be a guideline when designing the computing architecture.
  • Scalability and learning. As the complexity of the architecture and the size of the data grow, sound model selection techniques and performance evaluation criteria are needed to ensure proper fitting. This crucial step is too often carried out improperly in graph processing, due to the lack of statistical tools such as optimality criteria for graph predictors and guarantees of unbiasedness and consistency for the related graph-state estimators.


Selected publications

  • F. M. Bianchi, D. Grattarola, L. Livi, C. Alippi, Graph Neural Networks with Convolutional ARMA Filters, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2021
  • D. Grattarola, L. Livi, C. Alippi, Learning Graph Cellular Automata, NeurIPS 2021