In TDS Archive, by Michael Bronstein: Graph Neural Networks as gradient flows. GNNs derived as gradient flows minimising a learnable energy that describes attractive and repulsive forces between graph nodes. Oct 14, 2022
In TDS Archive, by Michael Galkin: Graph Machine Learning @ ICML 2022. Recent advancements and hot trends, July 2022 edition. Jul 25, 2022
In TDS Archive, by Biswa Sengupta: Where there is matter, there is geometry… Gauge theories, symmetries, machine learning and all that. Jan 7, 2021
In TDS Archive, by Michael Bronstein: Over-squashing, Bottlenecks, and Graph Ricci curvature. A concept from differential geometry called Ricci curvature helps us understand the phenomena of over-squashing and bottlenecks in GNNs. Nov 30, 2021
In TDS Archive, by Michael Bronstein: Towards Geometric Deep Learning I: On the Shoulders of Giants. In a new series of posts, we discuss how the geometric ideas of symmetry underpinning Geometric Deep Learning have emerged through history. Jul 4, 2022
In TDS Archive, by Michael Galkin: GraphGPS: Navigating Graph Transformers. Recipes for cooking the best graph transformers. Jun 14, 2022
In TDS Archive, by Michael Bronstein: A new computational fabric for Graph Neural Networks. Are graphs the right computational fabric for GNNs? A recent line of papers challenges this assumption. Jun 10, 2022
In TDS Archive, by Michael Bronstein: Geometric foundations of Deep Learning. Geometric Deep Learning is an attempt to unify a broad class of ML problems from the perspectives of symmetry and invariance. Apr 28, 2021
In TDS Archive, by Michael Galkin: Inductive Link Prediction in Knowledge Graphs. Starting a new Inductive Link Prediction Challenge 2022. Mar 24, 2022
By Xinyu Chen (陈新宇): Temporal Matrix Factorization for Multivariate Time Series Forecasting. Evaluating the model on fluid dynamics data and showing the forecasting results with heatmaps. Mar 21, 2022
In TDS Archive, by Michael Bronstein: Learning on graphs with missing features. Feature Propagation is a simple and surprisingly efficient solution for learning on graphs with missing node features. Feb 3, 2022
In TDS Archive, by Anas AIT AOMAR: Notes on graph theory — Centrality measures. Notes on different centrality measures: definitions and tradeoffs. Aug 1, 2020
In TDS Archive, by Michael Bronstein: Graph Neural Networks through the lens of Differential Geometry and Algebraic Topology. New perspectives on old problems in Graph ML. Nov 18, 2021
In TDS Archive, by Andreas Maier: Graph Deep Learning — Part 1. Spectral Convolutions. Aug 17, 2020
In Cantor’s Paradise, by Sergei Ivanov: The Easiest Unsolved Problem in Graph Theory. Graph theory has a long history of problems being solved by amateur mathematicians. Would you like to try to become one of them? Feb 25, 2021
In TDS Archive, by Vijay Prakash Dwivedi: Graph Transformer: Generalization of Transformers to Graphs. We generalize Transformers to arbitrary graphs by extending key design aspects of attention and positional encodings from NLP to graphs. Mar 4, 2021
In TDS Archive, by Michael Bronstein: Graph Neural Networks as Neural Diffusion PDEs. Graph neural networks are intimately related to partial differential equations governing information diffusion on graphs. Jun 18, 2021
In TDS Archive, by Michael Bronstein: Temporal Graph Networks. A new neural network architecture for dynamic graphs. Jul 27, 2020