Deep learning on graphs: successes, challenges | Graph Neural Networks | Michael Bronstein
- Published 3 Jun 2024
- Conference Website: saiconference.com/IntelliSys
Deep learning on graphs and network-structured data has recently become one of the hottest topics in machine learning. Graphs are powerful mathematical abstractions that can describe complex systems of relations and interactions in fields ranging from biology and high-energy physics to social science and economics. In this talk, I will outline the basic methods, applications, challenges and possible future directions in the field.
About the Speaker: Michael Bronstein is a professor at Imperial College London, where he holds the Chair in Machine Learning and Pattern Recognition, and is Head of Graph Learning Research at Twitter. Michael received his PhD from the Technion in 2007. He has held visiting appointments at Stanford, MIT, Harvard, and Tel Aviv University, and has also been affiliated with three Institutes for Advanced Study (at TU Munich as a Rudolf Diesel Fellow (2017-), at Harvard as a Radcliffe fellow (2017-2018), and at Princeton (2020)). Michael is a recipient of five ERC grants, a Fellow of IEEE, IAPR, and ELLIS, an ACM Distinguished Speaker, and a World Economic Forum Young Scientist. In addition to his academic career, Michael is a serial entrepreneur and founder of multiple startup companies, including Novafora, Invision (acquired by Intel in 2012), Videocites, and Fabula AI (acquired by Twitter in 2019). He has previously served as Principal Engineer at Intel Perceptual Computing and was one of the key developers of the Intel RealSense technology.
An excellent talk on the emerging topic of geometric deep learning - this could bring topological data analysis, which we have been doing for decades, to new importance!
This talk provides a helpful understanding of the intuition behind the speaker's work on geometric deep learning. Great talk! Thank you.
Amazing talk! I learned a lot and got some ideas for my research. Cheers!
great talk, such a generous share
Man, this is a Masterpiece.
Thank you for sharing.
Fantastic talk! Thanks for sharing!
Nice lecture!
“Attention is all you need.” - Is that also true for GNNs?
Can dynamic graph networks be used to predict chemical reaction outcomes or yields or retrosynthetic pathways?
The secret sauce of AutoGluon Tabular is bagging, stacking, and distillation. How would one apply distillation to graph neural networks?
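In case it helps: the usual recipe is to train a big teacher GNN, then train a small student to match the teacher's temperature-softened predictions per node. Here's a toy sketch of just the distillation loss term (all names and the scalar setup are mine, not from AutoGluon or any specific GNN paper):

```python
import math

def softmax(logits, T=1.0):
    # temperature-scaled softmax; higher T gives softer targets
    m = max(logits)
    exps = [math.exp((l - m) / T) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

def distill_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened logits for one node.

    In a GNN setting you'd average this over nodes (or graphs) and mix it
    with the usual supervised cross-entropy on the labeled part of the data.
    The T*T factor keeps gradient magnitudes comparable across temperatures.
    """
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q)) * T * T
```

The loss is zero when the student already matches the teacher and grows as their softened distributions diverge.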
Well, regarding attention, I would look up GAT (Graph Attention Networks).
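To make the GAT idea concrete, here's a rough single-head sketch of the attention-weighted neighborhood aggregation, with scalar features and two scalar parameters standing in for the learned attention vector from the paper (the function names and simplifications are mine):

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def gat_attention(h, neighbors, a_src, a_dst):
    """Single-head GAT-style aggregation over scalar node features.

    h:         dict node -> feature (a scalar here, for simplicity)
    neighbors: dict node -> list of neighbors (include a self-loop)
    a_src, a_dst: toy scalar stand-ins for the learned attention vector
    """
    out = {}
    for i, nbrs in neighbors.items():
        # unnormalized scores e_ij = LeakyReLU(a_src*h_i + a_dst*h_j)
        scores = [leaky_relu(a_src * h[i] + a_dst * h[j]) for j in nbrs]
        # softmax over the neighborhood -> attention coefficients alpha_ij
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        alphas = [e / z for e in exps]
        # attention-weighted aggregation of neighbor features
        out[i] = sum(a * h[j] for a, j in zip(alphas, nbrs))
    return out
```

Since the coefficients are a softmax, each output is a convex combination of the neighborhood's features, so it always stays within their range.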
Awesome lecture thanks!!
One thing that has always confused me: it seems (at least in every example I've found) that with node/link prediction the whole dataset is a single graph. Is it possible to train/predict with different graphs?
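Yes - the common trick is to merge several small graphs into one big disconnected graph (a block-diagonal adjacency) plus an index saying which graph each node belongs to; libraries such as PyTorch Geometric batch this way. A minimal sketch of that batching, with my own made-up representation:

```python
def batch_graphs(graphs):
    """Merge several small graphs into one big disconnected graph.

    graphs: list of (num_nodes, edge_list) pairs, edges as (src, dst)
            with node indices local to each graph.
    Returns (total_nodes, merged_edges, batch_index), where
    batch_index[i] says which original graph node i came from, so
    per-graph pooling can still be done after message passing.
    """
    offset = 0
    merged_edges, batch_index = [], []
    for g_id, (n, edges) in enumerate(graphs):
        # shift local node indices into the global index space
        merged_edges += [(s + offset, d + offset) for s, d in edges]
        batch_index += [g_id] * n
        offset += n
    return offset, merged_edges, batch_index
```

Message passing never crosses between components, so running one GNN on the merged graph is equivalent to running it on each graph separately.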
Great talk!
Thanks, super powerful.
Gold
Yup, food is medicine, many of us overlook this fundamental property. Eat right and the likelihood of illness diminishes exponentially
41:45 It was a bunny mesh iirc
I love how carefully picking samples expedites the training time! Great idea that may be applicable in other situations!
oh I was wondering how all those 3d AI things worked
Hi
I didn't even need to see the bunny to be pretty sure it must be a bunny. Most animal meshes in academia are bunnies.
Applying the naive Bayes rule, eh?😜
This is who you want to be getting your geometry from hahahahaha