*DeepMind x UCL | Deep Learning Lectures | 10/12 | Unsupervised Representation Learning*
*My takeaways:*
*1. Plan for this lecture **0:57*
-In this lecture, unsupervised learning also refers to self-supervised learning 1:23
*2. What is unsupervised learning **2:13*
-In this lecture, supervised learning refers to both supervised learning and reinforcement learning
2.1 Do we need it? Clustering; Dimensionality reduction 4:13
2.2 How do we evaluate it? 5:45
*3. Why is it important **6:51*
3.1 History of representation learning 7:30
3.2 Shortcomings of supervised learning 9:46
-Data efficiency; Robustness; Generalization; Transfer; "Common sense"
3.3 What Geoff Hinton, Yann LeCun and Yoshua Bengio have said: Unsupervised Representation Learning 15:10
*4. What makes a good representation **16:41*
*5. Evaluating the merit of representations **34:13*
*6. Techniques & applications **42:47*
- Downstream tasks to evaluate the representation quality: semi-supervised learning; reinforcement learning; model analysis 44:13
6.1 Generative modelling 49:22
6.2 Contrastive learning 1:23:06
6.3 Self-supervision 1:34:38
*7. Future **1:42:38*
When the presentation starts getting a little confusing and esoteric, you know we're reaching the edges of our current knowledge 😁
could be yours too :)
Thank you so much, Irina Higgins and Mihaela Rosca and DeepMind for giving intuition on Unsupervised Representation Learning : )
I found it completely impossible to understand anything without first reading the linked papers (or at least watching detailed talks about them). Once I know the paper, however, this lecture provides valuable high-level commentary on how that paper fits into the overall research.
Also, I'm not sure why, when describing ways to learn representations, the talk didn't start with the simplest one: learn a classification model, then use the penultimate layer as your representation (a minimal sketch of this baseline follows the replies below).
I agree; complex equations are presented with little explanation, so it's very hard to learn from them unless you already know what they are.
That's not really the simplest approach. KMeans, agglomerative clustering and many other clustering algorithms are simpler.
Agree. Poor explanation of the equations, and too much knowledge packed into too short a time.
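Since the thread above debates this baseline, here is a minimal sketch of what "use the penultimate layer as your representation" means in practice. This is not from the lecture; the PyTorch/torchvision setup and the pretrained ResNet-18 are illustrative assumptions standing in for any supervised classifier.

```python
# Minimal sketch (illustrative, not from the lecture): reuse a trained
# classifier's penultimate-layer activations as a representation.
import torch
import torchvision

# A classifier pretrained on ImageNet stands in for "learn a classification
# model"; any supervised training would do here.
model = torchvision.models.resnet18(weights="IMAGENET1K_V1")
model.eval()

# Drop the final fully connected head; the remaining layers map an image
# to its penultimate-layer features.
feature_extractor = torch.nn.Sequential(*list(model.children())[:-1])

with torch.no_grad():
    images = torch.randn(4, 3, 224, 224)   # stand-in batch of 4 RGB images
    features = feature_extractor(images)   # shape: (4, 512, 1, 1)
    representation = features.flatten(1)   # shape: (4, 512)

print(representation.shape)  # torch.Size([4, 512])
```

These 512-dimensional vectors could then be handed to a linear probe or a clustering algorithm, which is the kind of downstream evaluation the lecture discusses at 44:13.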
Excellent lecture on neural networks, physics, and math. Reinforcement learning and DeepMind research, thank you very much.
Great lecture. Very interesting topic. Thx Irina and thx Mihaela!
Thank you so much. Every day you find out there's more you don't know.
Brilliant! 2 months, 2 weeks....
I love the diversity at DeepMind
you'll go far, buddy. good luck
@tractatusviii7465 thank you 🙏
Thank you for sharing the research. Is there any paper that you will recommend?
Please publish a list of the referenced papers in the description of the video! These yellow boxes are hard to read!
The boxes are more readable if you change the streaming quality to HD. You could also check out the slides directly here: bit.ly/3eqYlyt
very important lecture
those are cool robots drawings!
This is great! Although, why does Irina's voice creep me out as if it's AI-generated 😱
I'd say Mihaela's voice is not bad either :)
Damn, this machine learning is cool. I've been doing machine learning for a few years and I'm really good, woo!
disabling comments is weak