Infoscience
conference paper

Partition and Code: learning how to compress graphs

Bouritsas, Giorgos
•
Loukas, Andreas  
•
Karalias, Nikolaos  
•
Bronstein, Michael
2021
Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, NeurIPS 2021
35th Conference on Neural Information Processing Systems (NeurIPS 2021)

Can we use machine learning to compress graph data? The absence of ordering in graphs poses a significant challenge to conventional compression algorithms, limiting their attainable gains as well as their ability to discover relevant patterns. On the other hand, most graph compression approaches rely on domain-dependent handcrafted representations and cannot adapt to different underlying graph distributions. This work aims to establish the necessary principles a lossless graph compression method should follow to approach the entropy storage lower bound. Instead of making rigid assumptions about the graph distribution, we formulate the compressor as a probabilistic model that can be learned from data and generalise to unseen instances. Our "Partition and Code" (PnC) framework entails three steps: first, a partitioning algorithm decomposes the graph into subgraphs, then these are mapped to the elements of a small dictionary on which we learn a probability distribution, and finally, an entropy encoder translates the representation into bits. All the components (partitioning, dictionary and distribution) are parametric and can be trained with gradient descent. We theoretically compare the compression quality of several graph encodings and prove, under mild conditions, that PnC achieves compression gains that grow either linearly or quadratically with the number of vertices. Empirically, PnC yields significant compression improvements on diverse real-world networks.

Details
Type
conference paper
Author(s)
Bouritsas, Giorgos
Loukas, Andreas  
Karalias, Nikolaos  
Bronstein, Michael
Date Issued
2021
Published in
Advances in Neural Information Processing Systems 34: Annual Conference on Neural Information Processing Systems 2021, NeurIPS 2021
ISBN of the book
9781713845393
Number of pages
33
URL
https://proceedings.neurips.cc/paper/2021/file/9a4d6e8685bd057e4f68930bd7c8ecc0-Paper.pdf
Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
LTS2  
Event name
35th Conference on Neural Information Processing Systems (NeurIPS 2021)
Event place
[Virtual]
Event date
December 6-14, 2021

Available on Infoscience
January 10, 2023
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/193641