An Introduction to Graph Neural Networks: Models and Applications

  • Published 14 May 2024
  • MSR Cambridge, AI Residency Advanced Lecture Series
    "Graph Neural Networks (GNNs) are a general class of networks that operate over graphs. By representing a problem as a graph - encoding the information of individual elements as nodes and their relationships as edges - GNNs learn to capture patterns within the graph. These networks have been used successfully in applications such as chemistry and program analysis. In this introductory talk, I will do a deep dive into neural message-passing GNNs and show how to create a simple GNN implementation. Finally, I will illustrate how GNNs have been used in applications."
    www.microsoft.com/en-us/resea...
  • Science & Technology
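The neural message passing described in the abstract boils down to a few lines: each node sends a transformed copy of its state along its outgoing edges, incoming messages are summed at the target, and each node's state is updated from its old state plus the aggregate. A minimal illustrative sketch (all names, shapes, and the tanh update are assumptions, not the speaker's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

num_nodes, state_dim = 4, 8
# Directed edges (source, target); messages flow source -> target.
edges = [(0, 1), (1, 2), (2, 0), (3, 2)]

H = rng.normal(size=(num_nodes, state_dim))          # node states
W_msg = rng.normal(size=(state_dim, state_dim))      # message function (one linear layer)
W_upd = rng.normal(size=(2 * state_dim, state_dim))  # update function

for _ in range(3):  # a few propagation steps
    M = np.zeros_like(H)
    for src, dst in edges:
        M[dst] += H[src] @ W_msg  # send a transformed state along each edge, sum at target
    # Update each node from its previous state concatenated with its aggregated messages.
    H = np.tanh(np.concatenate([H, M], axis=1) @ W_upd)

print(H.shape)  # one updated state vector per node
```

After the propagation steps, `H` holds one learned representation per node, which a readout layer could then map to node- or graph-level predictions.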

COMMENTS • 116

  • @leixun
    @leixun 3 years ago +120

    *My takeaways:*
    1. Background 0:48
    2. Graph neural networks (GNN) and neural message passing 6:35
    - Gated GNN 26:35
    - Graph convolutional networks 29:27
    3. Expressing GGNNs as matrix operations 33:36
    4. GNN application examples 41:25
    5. Other models as special cases of GNNs 47:53
    6. ML in practice 49:28

    • @shawnz9833
      @shawnz9833 2 years ago +1

      Can you help with this: what is an MLP in "you multiply it with a single-layer MLP" @23:29?

    • @leixun
      @leixun 2 years ago +2

      @@shawnz9833 Multilayer perceptron

    • @shawnz9833
      @shawnz9833 2 years ago +1

      @@leixun Cool, thank you mate!

    • @leixun
      @leixun 2 years ago +1

      @@shawnz9833 You're welcome mate. I'm doing some deep learning research as well; you're welcome to check out the research talks on my YouTube channel.

  • @mehmetf.demirel8647
    @mehmetf.demirel8647 4 years ago +124

    Great talk! The audience questions were helpful, but I felt there were a bit too many of them, to the point that they negatively affected the flow of the talk.

  • @susmitislam1910
    @susmitislam1910 2 years ago +9

    So GNNs are basically something like calculating word embeddings in NLP. We have a dataset describing the relationships between pairs of words (nodes), and we want a vector representation that reflects how often they co-occur (the weight of the edge between the nodes), i.e., how related the two words are. Once we have such vectors, we can build a vanilla, recurrent, or convolutional neural net to learn a mapping between the vectors and the output we desire.

  • @rembautimes8808
    @rembautimes8808 2 years ago +1

    "Spherical cow" - funniest analogy yet for neural net layers. Great talk.

  • @iltseng
    @iltseng 4 years ago +48

    At 34:11, the product of matrix A and matrix N should be [ b + c ; c ; 0 ]

    • @MingshanJia
      @MingshanJia 4 years ago +2

      Actually, what is used here is the incoming edges (see 14:55), but it is true that the slide is confusing about that.

    • @michaelkovaliov8877
      @michaelkovaliov8877 3 years ago +3

      It seems the mistake is in the graph adjacency matrix, because the result vector is correct given the drawing of the graph.

    • @PenguinMaths
      @PenguinMaths 3 years ago +15

      That is a mistake in the slide; A should be transposed to describe the incoming edges instead of the outgoing ones.

    • @khwajawisal1220
      @khwajawisal1220 3 years ago +1

      It's using Einstein notation, not the usual one.

    • @kristofneys2349
      @kristofneys2349 3 years ago +3

      There are many mistakes and confusing comments in this presentation; no wonder the audience keeps asking questions. Not a good talk at all...
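The transpose fix discussed above can be checked numerically on a toy graph. With the convention A[src, tgt] = 1 for a directed edge, A @ N aggregates over outgoing edges, while A.T @ N aggregates over incoming ones and reproduces the [b + c; c; 0] the commenters expect (the graph below is a hypothetical reconstruction consistent with that result, not necessarily the slide's exact example):

```python
import numpy as np

# Hypothetical directed graph: edges 1->0, 2->0, 2->1,
# encoded as A[src, tgt] = 1.
A = np.array([[0, 0, 0],
              [1, 0, 0],
              [1, 1, 0]])
a, b, c = 1.0, 2.0, 3.0
N = np.array([a, b, c])  # one scalar state per node, for readability

print(A @ N)    # [0. 1. 3.]          -> sums over *outgoing* edges
print(A.T @ N)  # [5. 3. 0.] = [b+c, c, 0] -> sums over *incoming* edges
```

So either the adjacency matrix on the slide needs transposing, or its entries need flipping across the diagonal; the result vector itself matches incoming-edge aggregation.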

  • @syllogismo
    @syllogismo 3 years ago +12

    Don't know why people are criticizing this video and the audience. Great introduction to graph neural networks!

  • @runggp
    @runggp 3 years ago +4

    Awesome talk! The MSR audience asked quite a few questions, which are actually helpful, e.g. what GNNs are, how they work and update, and why they are created and designed this way.

    • @shawnz9833
      @shawnz9833 2 years ago

      Can you help with this: what is an MLP in "you multiply it with a single-layer MLP" @23:29?

  • @alphaO27
    @alphaO27 2 years ago +2

    At 16:51, I think he meant "for each node connected to n" (instead of n_j), because from the expression we take all nodes n_j connected to n in order to compute h_t^n, the new state of node n.

  • @Peaceluvr18
    @Peaceluvr18 3 years ago +21

    Wow, what an excellent presentation, from someone with an ML background. Explains the basics a bit but also covers deep concepts. Super clear graphics! Seriously, whoever made the graphics for this - can I hire you to do my slide graphics? And I thought it was very cool that the lecture attendees were bold enough to ask so many questions! Wish people asked more questions during my lectures and talks.

  • @heejuneAhn
    @heejuneAhn 3 years ago +3

    He explains using time progression, which caused some confusion for the audience and me.

  • @NewtonInDaHouseYo
    @NewtonInDaHouseYo 2 years ago

    Excellent introduction, thanks a lot!

  • @soniagupta2851
    @soniagupta2851 1 year ago

    Excellent explanation

  • @ashutoshukey3803
    @ashutoshukey3803 3 years ago +22

    The presenter's ability to explain, and his use of high-level diagrams, were phenomenal. Questions from the audience definitely messed up the flow of the presentation quite a bit, though.

  • @arjitarora8418
    @arjitarora8418 2 months ago

    Great Introduction!

  • @Exhora
    @Exhora 7 months ago

    29:35 About GCNs: he said you multiply the sum of the messages with your own state, but in the equation it is a sum. I didn't get which one is correct.

  • @klaimouad740
    @klaimouad740 3 years ago

    Can we propagate messages depending on the edge features? For example, if the distance from node n to m is greater than their distance to p, then we propagate the message first to p and then perform the propagation to the other node m.

  • @sm_xiii
    @sm_xiii 4 years ago +5

    Where can we get the slide deck, please?

  • @2000sunnybunny
    @2000sunnybunny 3 years ago

    Great session!

  • @halilibrahimakgun7569
    @halilibrahimakgun7569 8 months ago

    Can you share the slides, please? I like them.

  • @hamishhall5423
    @hamishhall5423 4 years ago

    What is the dimension M for msg_to_be_sent, received_messages, etc.? I get that D is the dimension of the node representation, N the num_nodes, etc.

  • @bibiworm
    @bibiworm 2 years ago +5

    Is there any way to get access to the slides? Great talk! Thanks.

  • @jonfe
    @jonfe 3 months ago +1

    I don't understand why a GRU is used. The input to the GRU is a (nodes x features) matrix - where is the temporal dimension?
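A frequent point of confusion with gated GNNs: the GRU is unrolled over message-propagation steps, not over a temporal dimension. At each step the aggregated incoming message plays the role of the GRU input, and the previous node states play the role of the hidden state. A minimal sketch under assumed shapes, with a hand-rolled GRU cell (illustrative only, not the talk's code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, W, U, b):
    """One GRU update applied row-wise: x = aggregated messages, h = previous states."""
    z = sigmoid(x @ W[0] + h @ U[0] + b[0])              # update gate
    r = sigmoid(x @ W[1] + h @ U[1] + b[1])              # reset gate
    h_tilde = np.tanh(x @ W[2] + (r * h) @ U[2] + b[2])  # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
num_nodes, D = 3, 4
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])                     # hypothetical graph, A[src, tgt] = 1
H = rng.normal(size=(num_nodes, D))           # initial node states
W_e = rng.normal(size=(D, D)) / np.sqrt(D)    # message weights (one edge type here)
W = rng.normal(size=(3, D, D)) / np.sqrt(D)
U = rng.normal(size=(3, D, D)) / np.sqrt(D)
b = np.zeros((3, D))

for _ in range(4):                 # 4 propagation steps play the role of 4 "time" steps
    M = A.T @ (H @ W_e)            # aggregate messages over incoming edges
    H = gru_cell(M, H, W, U, b)    # same GRU weights reused at every step

print(H.shape)  # (3, 4)
```

Reusing one GRU across steps is just weight tying across propagation rounds, exactly as an RNN ties weights across time.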

  • @BapiKAR
    @BapiKAR 3 years ago +2

    Could you please post the slides here? Thanks

  • @mansurZ01
    @mansurZ01 4 years ago +3

    35:40 I think the dimensionality of M should be (num_nodes x D), unless D == M.
    EDIT: from what follows, it should be M = HE, and D can be different from M.
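The shape bookkeeping the comment above settles on can be made concrete. Assuming (as the EDIT concludes) H is the num_nodes x D matrix of node states and E is a D x M projection from states to messages, then H @ E is num_nodes x M, and the message dimension M need not equal the state dimension D (the names here are assumptions, not the slide's notation):

```python
import numpy as np

num_nodes, D, M = 5, 8, 12   # message dimension M may differ from state dimension D
rng = np.random.default_rng(0)

H = rng.normal(size=(num_nodes, D))  # node states, one D-dim row per node
E = rng.normal(size=(D, M))          # projects a D-dim state to an M-dim message

msgs = H @ E
print(msgs.shape)  # (num_nodes, M)
```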

  • @instantpotenjoyer
    @instantpotenjoyer 4 years ago +13

    Discussion of the actual topic starts at ~6:40

  • @codewithyouml8994
    @codewithyouml8994 2 years ago

    The person asking at 40:00 was right... I was also really confused: all the matrix operations seemed to be invalid unless swapped... lol, what kind of inverted conventions are these...

  • @mohammedamraoui4147
    @mohammedamraoui4147 2 months ago

    And for edge classification?

  • @sebamurgui
    @sebamurgui 3 years ago +46

    OH MY GOD that audience

    • @miketurner3461
      @miketurner3461 3 years ago +3

      I had to stop watching because of them

    • @kellybrower301
      @kellybrower301 3 years ago

      I wanted to believe it wasn’t true.

  • @MobileComputing
    @MobileComputing 3 years ago +73

    The audience questions were mildly irritating (to put it mildly) - bombarding the speaker during his intro with questions that could reasonably be expected to be answered over the course of an hour-long talk. Then again, why would the speaker give a talk on one of the most advanced neural network architectures to an audience without any machine learning background?

    • @ziranshuzhang6831
      @ziranshuzhang6831 3 years ago +11

      You are right. I was expecting to pick up the GNN concepts quickly, but the audience keeps asking irritating questions, so I constantly have to hit the skip-ahead button.

    • @ohmyumbrella
      @ohmyumbrella 3 years ago

      I agree. I do see the point of giving this lecture to an audience without previous exposure to ML, if the purpose is to attract them to the subject, but in that case there should have been another video of the same lecture without so much interruption. It would take extra time and effort, but for people who are trying to learn GNNs effectively and have some basic ML knowledge, these questions are very annoying and hinder the learning experience.

  • @sherazbaloch1642
    @sherazbaloch1642 1 year ago

    Need more tutorials on GNNs

  • @pielang5524
    @pielang5524 3 years ago

    At 35:51, the adjacency matrix A should also depend on the edge type k, IMO.

    • @pielang5524
      @pielang5524 3 years ago

      OK... the presenter confirmed this shortly after.

  • @pielang5524
    @pielang5524 3 years ago +3

    Let the speaker talk!

  • @sunaxes
    @sunaxes 2 years ago

    So a GNN is just message passing on a graph, or did I miss something? That has been around since way back, hasn't it?

  • @marcolerena456
    @marcolerena456 3 years ago +39

    The audience is very inquisitive, but they're grilling the speaker way too much

    • @vibhor0202
      @vibhor0202 9 months ago

      There is one smartass

  • @eljangoolak
    @eljangoolak 2 years ago +7

    Very good presentation, but it's very difficult to follow with all the interrupting questions

  • @pharofx5884
    @pharofx5884 4 years ago +91

    dat cough frequency suspiciously high. Distance thyself socially sir.

    • @maloxi1472
      @maloxi1472 3 years ago +5

      Don't worry, it's from November 2019

    • @jmplaza4947
      @jmplaza4947 3 years ago +5

      @@maloxi1472 COVID was already spreading then, right? I hope he's ok... wherever he is now...

    • @danielliu9616
      @danielliu9616 3 years ago +5

      He is maybe THE patient zero

    • @yanzexu6912
      @yanzexu6912 3 years ago

      bless him

    • @stackexchange7353
      @stackexchange7353 3 years ago

      @@danielliu9616 oh shii

  • @FsimulatorX
    @FsimulatorX 3 years ago +2

    OMG! For a second I thought he looked like the CEO of Google, and I was wondering to myself: why would the CEO of Google do a presentation about neural networks AT MICROSOFT!!

  • @giannismanousaridis4010
    @giannismanousaridis4010 3 years ago +2

    I found the slides for everyone who asked: miltos.allamanis.com/files/slides/2020gnn.pdf
    (Idk if I'm allowed to post the link here; if not, sorry for that - just let me know and I'll delete my comment.)

  • @CodingwithIndy
    @CodingwithIndy 3 years ago

    Is that SPJ I hear in the audience at 9:18?

  • @samuisuman
    @samuisuman 3 years ago +4

    There is no intuitive explanation, but it's quite informative

  • @arashjavanmard5911
    @arashjavanmard5911 3 years ago +4

    It would be great if you could also publish the slides!

    • @mayankgolhar8761
      @mayankgolhar8761 3 years ago +5

      Slides from the presenter's website: miltos.allamanis.com/files/slides/2020gnn.pdf

    • @arashjavanmard5911
      @arashjavanmard5911 3 years ago +1

      @@mayankgolhar8761 Thanks

  • @r.dselectronics3349
    @r.dselectronics3349 3 years ago

    I am a researcher. The video contains beautiful concepts; I like it very much :) - especially the binary classification part. I am so excited about these concepts!

  • @losiu998
    @losiu998 3 years ago +2

    Great presentation. But I have to point something out: I have no idea why you would use Einstein notation instead of simple matrix multiplication. It raises unnecessary confusion and it's not related to GNNs.

  • @yb801
    @yb801 2 years ago

    Is this a 2016 talk?

  • @minghan111
    @minghan111 4 years ago +22

    Too many questions - please just wait for the speaker.

    • @pharofx5884
      @pharofx5884 4 years ago +6

      In formulating their questions they re-explained what was going on an order of magnitude better than the speaker. That's kinda sad.

    • @MobileComputing
      @MobileComputing 3 years ago +1

      @@pharofx5884 I just watched the version from 2 years ago. Only 18 minutes long, but almost identical in content, yet much clearer. Really sad to see.

  • @kushalneo
    @kushalneo 3 years ago

    👍

  • @RARa12812
    @RARa12812 3 years ago

    How to turn off questions

  • @sirisaksirisak6981
    @sirisaksirisak6981 3 years ago

    At least it saves time in doing strategy.

  • @blanamaxima
    @blanamaxima 3 years ago +1

    Is this related to bread baking?

  • @aichemozenerator8446
    @aichemozenerator8446 1 year ago

    good

  •  4 years ago +1

    Where are the inputs and outputs?

    • @miketurner3461
      @miketurner3461 3 years ago

      Clearly you would've been one of the people asking foolish questions they could have answered using Google

  • @peter-holzer-dev
    @peter-holzer-dev 1 year ago +1

    I'm pretty sure this is a great talk, but unfortunately all the questions in between disturb the flow a lot (also because most of them are hard to understand acoustically).

  • @arisioz
    @arisioz 2 years ago +3

    Are those actual MS employees in the crowd? They're worse than first-year CS students

  • @losiu998
    @losiu998 3 years ago

    34:22 - Is that a vector-matrix multiplication? If so, the result is wrong, I guess.
    @edit: Matrix A should have ones below the diagonal, not above - then the result is as presented.

  • @BruceChen27
    @BruceChen27 3 years ago +5

    Seems several people were not healthy

  • @WeeeAffandi
    @WeeeAffandi 8 months ago

    You can tell this lecture was recorded during peak Covid from the constant coughing of the audience (and the speaker)

  • @_engid
    @_engid 3 years ago +1

    I hear Simon Peyton Jones in the audience

    • @rherrmann
      @rherrmann 3 years ago

      Easily recognizable indeed!

  • @lifelemons8176
    @lifelemons8176 3 years ago

    Horror crowd. This is something I see in every Microsoft talk.

  • @hfkssadfrew
    @hfkssadfrew 3 years ago +1

    34:46 - It is NOT A * N, it is N' * A...

  • @patrickadjei9676
    @patrickadjei9676 3 years ago +3

    People are not happy about the many questions. However, I'm kind of sad that he doesn't restate the questions before answering :( like, why?

  • @goblue1011
    @goblue1011 1 year ago +2

    Honestly, some of the audience members who raised questions have quite big egos and no idea what they are talking about.

  • @fatcat4791
    @fatcat4791 2 years ago

    It should be (A^T)*N

  • @rocking4joy
    @rocking4joy 2 years ago +1

    I don't understand the praise in the comment section; I actually found it kind of sloppy, with typos. But the audience and the questions are really great.

  • @khwajawisal1220
    @khwajawisal1220 3 years ago +8

    That's why software engineers should not teach: you assume everything is a design/modeling detail when in reality it's part of the mathematics behind it. And I seriously miss those old days when professors taught with chalk and a board.

  • @MrArmas555
    @MrArmas555 4 years ago

    ++

  • @eaglesofmai
    @eaglesofmai 3 years ago

    Are GNNs patented? Does anyone know if using a particular ANN construct can be subject to litigation?

  • @carloderamo
    @carloderamo 1 year ago +1

    Some good person should take this video and remove all the awful questions from the audience

  • @guest1754
    @guest1754 10 months ago

    Wondering how many people had covid in that recording...

  • @jebinjames6317
    @jebinjames6317 3 years ago

    People need to remember they're watching a free video on YouTube... it's not your advanced ML private tutoring session...

  • @yiweijiang
    @yiweijiang 3 years ago +1

    So that's machine learning! Haha, lol

  • @yanyipu4029
    @yanyipu4029 2 years ago

    Great video but annoying audience

  • @barriesosinsky9566
    @barriesosinsky9566 2 years ago

    An aromatic ring is not a "single bond" next to a "double bond." The bonds are a resonance form in a single state. Treating them with graph theory is not supported by current models.

  • @kutilkol
    @kutilkol 2 years ago +2

    RIP ears. WTF with the coughing? At least use a compressor on the vocal audio, omg.

  • @aglaiawong8058
    @aglaiawong8058 2 years ago

    The interruptions are so annoying...

  • @iSpades0
    @iSpades0 2 years ago

    That audience was pretty annoying, tbh

  • @thevanisher4609
    @thevanisher4609 3 months ago

    Horrible audience, great talk!

  • @KeshavDial
    @KeshavDial 2 years ago

    The audience ruined this presentation. I have never felt worse for a presenter.

  • @miketurner3461
    @miketurner3461 3 years ago +2

    The audience needs to take a freaking ML 101 class before asking stupid questions

  • @user-ku8ph5hs7o
    @user-ku8ph5hs7o 3 years ago

    Kind of confusing for me. And the audience is very annoying.

  • @robodoc1446
    @robodoc1446 2 years ago

    Appalling talk! It shows why coders are terrible at public speaking and often fail to explain things transparently. Before explaining how message passing is done in an end-to-end learning architecture, he jumped to Gated GNNs, leaving the impression that the GRU may be an essential part of a GNN. This is one of the reasons he got so many questions, and so much confusion surrounds his lecture... What is h_t? "Well, this is not something that changes"... seriously, Microsoft!

  • @williamashbee
    @williamashbee 3 years ago

    The coughing totally ruined the presentation.

  • @Tyomas1
    @Tyomas1 3 years ago +2

    Awful introduction

  • @TheDavidlloydjones
    @TheDavidlloydjones 2 years ago

    Word salad. A hopeless mess of talking at and around a topic without actually touching it.
    Take it down. Tell the guy to try again.

  • @saulrojas2679
    @saulrojas2679 1 year ago

    Slides can be found at: miltos.allamanis.com/files/slides/2020gnn.pdf