Learning on Graphs Conference
Xavier Bresson - Integrating Large Language Models and Graph Neural Networks - LoG 2024 Keynote
Learning on Graphs Conference, Day 3 - Nov 28, 2024
Keynote talk: Integrating Large Language Models and Graph Neural Networks
Speaker: Xavier Bresson
Abstract: Language models pre-trained on large-scale datasets have revolutionized text-based applications, enabling new capabilities in natural language processing. When documents are connected, they form a text-attributed graph (TAG), such as the Internet, Wikipedia, social networks, scientific literature networks, biological networks, scene graphs, and knowledge graphs. Key applications for TAGs include recommendation (web), classification (node, link, graph), text- and visual-based reasoning, and retrieval-augmented generation (RAG). In this talk, I will introduce two approaches that integrate Large Language Models (LLMs) with Graph Neural Networks (GNNs). The first method demonstrates how LLMs’ reasoning capabilities can enhance TAG node features. The second approach introduces a pioneering technique called GraphRAG, which grounds LLM responses in a relevant sub-graph structure. This scalable technique regularizes the language model, significantly reducing incorrect responses, a.k.a. hallucinations.
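For concreteness, the following is a minimal, hypothetical sketch of the grounding pattern described in the abstract above: retrieve a small sub-graph around entities relevant to a question, serialize it as text, and condition the LLM's answer on it. This is not the speaker's implementation; the helper names (retrieve_subgraph, graph_to_text, grounded_answer) and the injected llm_generate callable are placeholders for illustration only.

# Minimal sketch of the GraphRAG idea -- NOT the speaker's actual implementation.
# All helper names and the llm_generate callable are hypothetical placeholders.
import networkx as nx

def retrieve_subgraph(graph: nx.Graph, seed_nodes, hops: int = 1) -> nx.Graph:
    """Induced sub-graph around the seed nodes via simple k-hop expansion."""
    nodes = set(seed_nodes)
    for _ in range(hops):
        for n in list(nodes):
            nodes.update(graph.neighbors(n))
    return graph.subgraph(nodes)

def graph_to_text(subgraph: nx.Graph) -> str:
    """Serialize the sub-graph as edge triples so it can sit in the LLM prompt."""
    return "\n".join(
        f"{u} -[{d.get('relation', 'related_to')}]-> {v}"
        for u, v, d in subgraph.edges(data=True)
    )

def grounded_answer(question: str, graph: nx.Graph, seed_nodes, llm_generate) -> str:
    """Answer a question with the LLM conditioned on a relevant sub-graph,
    which constrains the model and helps reduce hallucinated responses."""
    context = graph_to_text(retrieve_subgraph(graph, seed_nodes))
    prompt = (
        "Answer using only the facts below.\n"
        f"{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return llm_generate(prompt)

The actual technique presented in the talk is more involved; this only illustrates the general pattern of grounding a response in a retrieved sub-graph.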
Bio: Xavier Bresson is an Associate Professor in the Department of Computer Science at the National University of Singapore (NUS). His research focuses on Graph Deep Learning, a new framework that combines graph theory and neural networks to tackle complex data domains. He received the USD 2M NRF Fellowship, the largest individual grant in Singapore, to develop this new framework. He was also awarded several research grants in the U.S. and Hong Kong. He co-authored one of the most cited works in this domain (10th most cited paper at NeurIPS) and has contributed significantly to maturing these emerging techniques. He has organized several conferences, workshops and tutorials on graph deep learning, such as the IPAM'23 workshop on “Learning and Emergence in Molecular Systems”, the IPAM'23'21 workshops on “Deep Learning and Combinatorial Optimization”, the MLSys'21 workshop on “Graph Neural Networks and Systems”, the IPAM'19 and IPAM'18 workshops on “New Deep Learning Techniques”, and the NeurIPS'17, CVPR'17 and SIAM'18 tutorials on “Geometric Deep Learning on Graphs and Manifolds”. He has been a regular invited speaker at universities and companies to share his work. He has also been a speaker at the NeurIPS'22, KDD'21’23, AAAI'21 and ICML'20 workshops on “Graph Representation Learning”, and the ICLR'20 workshop on “Deep Neural Models and Differential Equations”. He has taught undergraduate and graduate courses on Deep Learning and Graph Neural Networks since 2014.
Views: 917

Videos

Designing Antibodies to Target Disease with Generative AI - Andreas Loukas - LoG 2023 Keynote
712 views · 1 year ago
Learning on Graphs Conference 2023: logconference.org/ Keynote talk: Designing Antibodies to Target Disease with Generative AI Speaker: Andreas Loukas (Prescient Design, Genentech, Roche)
Tutorial: Graph Rewiring: From Theory to Applications in Fairness
1.5K views · 2 years ago
Organizers: Adrian Arnaiz-Rodriguez, Francisco Escolano, and Nuria Oliver Abstract: Graph Neural Networks (GNNs) have been shown to achieve competitive results on graph-related tasks, such as node and graph classification, link prediction, and node and graph clustering, in a variety of domains. Most GNNs use a message-passing framework and hence are called MPNNs. Despite their promising re...
Tutorial: Exploring the practical and theoretical landscape of expressive graph neural networks
1.9K views · 2 years ago
Organizers: Fabrizio Frasca, Beatrice Bevilacqua, and Haggai Maron Abstract: In an effort to overcome the expressiveness limitations of Graph Neural Networks (GNNs), a multitude of novel architectures has been recently proposed, aiming to balance expressive power, computational complexity, and domain-specific empirical performance. Several directions and methods are involved in this recent su...
Tutorial: Neural Algorithmic Reasoning
5K views · 2 years ago
Organizers: Petar Velickovic, Andreea Deac, and Andrew Dudzik Abstract: Neural networks that are able to reliably execute algorithmic computation may hold transformative potential to both machine learning and theoretical computer science. On one hand, they could enable the kind of extrapolative generalisation scarcely seen with deep learning models. On another, they may allow for running classi...
Tutorial: Parallel and Distributed Graph Neural Networks: An In-Depth Concurrency Analysis
1.3K views · 2 years ago
Tutorial: Scaling GNNs in Production: A Tale of Challenges and Opportunities
2.2K views · 2 years ago
Tutorial: Graph Neural Networks in TensorFlow: A Practical Guide
5K views · 2 years ago
Tutorial: Complex Reasoning Over Relational Databases
1.4K views · 2 years ago

COMMENTS

  •  28 days ago

    Is there any overlap between the Graph encoder and the graph textualization? The description of the sub-graph by a text -> tokens after tokenization ~ sub-graph encoding even taking into account the non-uniqueness of the textualization? Also the graph encoding is task-dependent whereas the graph textualization is task-agnostic, right?

  •  28 days ago

    I am confused here when GLEM is compared to GPT-3.5... it is comparing a highly specific architecture trained for paper classification to a generic LLM which could have been exposed to the dataset but was not trained for the specific task of article classification... Regarding the small performance delta, I would not be so confident....😅

  • @atoshdustosh2762
    @atoshdustosh2762 1 month ago

    The sound volume is too low ...

  • @tsadigov1
    @tsadigov1 11 months ago

    Where can I find the notebook?

  • @zoratopa
    @zoratopa 1 year ago

    Timestamps:
    01:39 - Introduction / Challenge (Anton Tsitsulin)
    05:19 - Background (Sami Abu-El-Haija)
    15:47 - TF-GNN High Level Overview (Sami Abu-El-Haija)
    20:19 - End-to-end Tutorial to Run TF-GNN on One Machine
    33:34 - TF-GNN Advanced Modelling Guide (Neslihan Bulut)
    54:55 - GNN: Hands On Tutorial (Teacher Version) (Neslihan Bulut)

  • @mooncop
    @mooncop 1 year ago

    WE ARE SO BACK!!1!!!

  • @chaitjo
    @chaitjo 1 year ago

    Timestamps:
    00:33:17 - Opening remarks
    00:45:45 - Keynote, Kristof Schütt (Pfizer), Graph Learning for Chemical Discovery
    01:49:22 - Keynote, Jure Leskovec (Stanford & Kumo AI), Relational Deep Learning
    04:47:01 - Oral presentations

  • @shinaorca7760
    @shinaorca7760 1 year ago

    wow

  • @parasetamol6261
    @parasetamol6261 1 year ago

    Give me the Colab notebook please ❤

  • @valentinussofa4135
    @valentinussofa4135 1 year ago

    Thank you very much, guys. From Indonesia. Great learning content. I subscribed to this YouTube channel. 🙏

  • @chinmay.prabhakar
    @chinmay.prabhakar 1 year ago

    Thank you so much for uploading the video. Can you please publish the slides as well?

    • @beabevi
      @beabevi 1 year ago

      You can find all the material here drive.google.com/drive/folders/1Q3PjpuP7FoY38Ik79R21k67tiA5p8uey?usp=sharing

    • @chinmay.prabhakar
      @chinmay.prabhakar 1 year ago

      @@beabevi Thank you so much for the quick response and sharing the resources :)

  • @cuneyt1992
    @cuneyt1992 2 years ago

    I have a serious question. Did someone put a gun to Maron's head to present? Because you cannot do such a bad job otherwise.

  • @prof_shixo
    @prof_shixo 2 years ago

    Many thanks for sharing this very interesting tutorial! Amazing work from all organizers👏

  • @chester-tan
    @chester-tan 2 years ago

    I think this abstract isn't the right one?

  • @arnaiztech
    @arnaiztech 2 years ago

    00:00 - Introduction
    00:01:40 - Motivation
    00:09:45 - Introduction to Graph Spectral Theory
    00:35:28 - Transductive Rewiring: Diffusion and Curvature
    00:54:34 - Inductive Graph Rewiring: Lovász bound, Commute Times and Directional Graph Networks
    01:08:05 - Inductive Graph Rewiring: CT-Layer
    01:22:25 - [Hands-on Code] Inductive Graph Rewiring: CT-Layer
    01:30:16 - Inductive Graph Rewiring: CT, Cheeger Constant, Curvature and Node Classification
    01:35:27 - Inductive Graph Rewiring: GAP-Layer
    01:40:35 - Graph Fairness
    02:04:12 - Panel discussion. Moderators: Nuria Oliver and Adrián Arnaiz-Rodriguez. Participants: Marinka Zitnik, Petar Veličković, Francesco Di Giovanni, Francesco Fabbri.
    02:48:03 - Wrap up and thanks

  • @sy422326
    @sy422326 2 years ago

    Dear LOG, could you please release the Colab notebook link in the tutorial? It will help a lot, thanks!

    • @beabevi
      @beabevi 1 year ago

      You can find all the material here drive.google.com/drive/folders/1Q3PjpuP7FoY38Ik79R21k67tiA5p8uey?usp=sharing The folder contains the practical sessions as .ipynb notebooks. Direct links to the Colab notebooks are in the slides :)

    • @sy422326
      @sy422326 1 year ago

      ​@@beabevi Thank you so much for your excellent tutorial and notebooks!

  • @dmytronikolaiev8992
    @dmytronikolaiev8992 2 years ago

    00:00 - Preparation
    19:53 - Keynote: Graph AI to Enable Precision Medicine (Marinka Zitnik)
    1:20:06 - You Can Have Better Graph Neural Networks by Not Training Weights at All: Finding Untrained GNNs Tickets (Tianjin Huang et al.)
    1:41:25 - Influence-Based Mini-Batching for Graph Neural Networks (Johannes Gasteiger et al.)
    1:59:44 - The Multi-Orbits Skew Spectrum: Boosting Permutation-Invariant Data Representations (Armando Bellante et al.)
    2:19:02 - Sponsor Talk: Graph Representation Learning for Drug Discovery (Djork-Arné Clevert)
    3:17:45 - Closing Remarks

  • @dmytronikolaiev8992
    @dmytronikolaiev8992 2 years ago

    1:07 - Keynote: Random graphs and graph neural networks (Soledad Villar)
    1:04:25 - Taxonomy of Benchmarks in Graph Representation Learning (Renming Liu et al.)
    1:27:53 - An Analysis of Virtual Nodes in Graph Neural Networks for Link Prediction (EunJeong Hwang et al.)
    1:48:12 - Graph Learning Indexer: A Contributor-Friendly Platform for Better Curation of Graph Learning Benchmarks (Jiaqi Ma et al.)

  • @dmytronikolaiev8992
    @dmytronikolaiev8992 2 years ago

    00:00 - Preparation
    29:00 - Keynote: Equivariance, Naturality & Causality (Taco Cohen)
    1:35:22 - GARNET: Reduced-Rank Topology Learning for Robust and Scalable Graph Neural Networks (Chenhui Deng et al.)
    1:56:36 - Transductive Linear Probing: A Novel Framework for Few-Shot Node Classification (Zhen Tan et al.)
    2:15:26 - Shortest Path Networks for Graph Property Prediction (Ralph Abboud et al.)

    • @mooncop
      @mooncop 1 year ago

      by Mother Stigmergy! these are the equivariant droids I've been looking for!

  • @dmytronikolaiev8992
    @dmytronikolaiev8992 2 years ago

    00:00 - Preparation
    29:10 - Introduction
    59:19 - Keynote: Graph Neural Networks for Molecular Systems (Stephan Günnemann)
    2:00:08 - Not too little, not too much: a theoretical analysis of graph (over)smoothing (Nicolas Keriven)
    2:20:09 - Neighborhood-aware Scalable Temporal Network Representation Learning (Yuhong Luo, Pan Li)
    2:37:54 - A Generalist Neural Algorithmic Learner (Borja Ibarz, Petar Veličković, et al.)