MIT Introduction to Deep Learning (2022) | 6.S191
- Published June 14, 2024
- MIT Introduction to Deep Learning 6.S191: Lecture 1
Foundations of Deep Learning
Lecturer: Alexander Amini
For all lectures, slides, and lab materials: introtodeeplearning.com/
Lecture Outline
0:00 - Introduction
6:35 - Course information
9:51 - Why deep learning?
12:30 - The perceptron
14:31 - Activation functions
17:03 - Perceptron example
20:25 - From perceptrons to neural networks
26:37 - Applying neural networks
29:18 - Loss functions
31:19 - Training and gradient descent
35:46 - Backpropagation
38:55 - Setting the learning rate
41:37 - Batched gradient descent
43:45 - Regularization: dropout and early stopping
47:58 - Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us on @MITDeepLearning on Twitter and Instagram to stay fully-connected!! - Science & Technology
Super excited to learn! Thank you, MIT folks, for open-sourcing your lectures for less fortunate folks like us to learn and grow.
What else could one ask as a weekend treat?!
Alexander is a master at presenting super complex things in a simple way. Making such lectures public helps a lot; I personally have benefited a lot.
I’m citing you in my high school project! Thank you for making these lectures public I literally can’t thank you enough
I studied ML years ago and watched most of the available MOOC content out there. Doing a refresher. This guy is the best teacher I’ve come across!
Never thought that someday I would be able to learn from a lecture happening at MIT but here you are.
Thank you so much
Just watched the lecture and I'm amazed at how "easy" it seems to be, which says a lot about the knowledge and teaching technique of the Professor. It sounds all doable even for people who have no contact with ML and DL, like myself. Well done and a big thank you for making this available worldwide!
What are the prerequisites for this? Kindly reply.
@@usrehman5046 I'm not sure what will be covered in this course, but it wouldn't hurt to be familiar with the mathematics required for AI.
@@ThriveUp1 Thanks a lot for replying. Will look into it.
Gg
It's really amazing that we have access to such high-quality content for free. Thank you, and I will be looking forward to the upcoming lectures.
Hey, kissan bhai, how are you doing?
Year number 3 now that I am following 6.S191....
and I am still eagerly awaiting these lecture series. More power to you and Dr Ava..
1. Dot product
2. Add bias
3. Apply a non-linearity
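The three steps in that summary can be sketched in a few lines of Python. This is a minimal illustration, not code from the lecture; the sigmoid activation and the example weights are assumptions for the sake of the demo:

```python
import math

def perceptron(x, w, b):
    """Forward pass of a single perceptron."""
    z = sum(wi * xi for wi, xi in zip(w, x))  # 1. dot product of weights and inputs
    z += b                                    # 2. add bias
    return 1.0 / (1.0 + math.exp(-z))         # 3. apply a non-linearity (sigmoid)

# Example with two inputs; the output is squashed into (0, 1) by the sigmoid.
y = perceptron(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1)
print(y)
```

Any other non-linearity (tanh, ReLU, etc.) would slot into step 3 the same way.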
I have to say, this is one of the best classes. I have a Deep Learning subject at my uni with very good information as well, just like this lecture. Thanks for the recap.
Thank you, Professor! It's really great to watch lectures from class while WFH.
This is one of the best crash courses of deep learning I've ever seen, thanks for the good stuff! Please keep sharing!
This is amazing man. Thank you for the lectures. You have no idea how informative these lectures are for me.
Thanks for making such an awesome series of lectures available for free. Really loving this course and DEEP LEARNING.
I also teach deep learning and when I see classes like this it teaches me how easy it can be to explain complex things like deep learning. Thanks!!
Can you explain something? At 42:38, the algorithm shown says to pick a single data point i, and this step is inside a loop. If we can reach the minimum using the gradient, why is this step inside the loop? Don't we just need one random point?
@@abhinavmishra9323 Great question, as it may not have been clear in the video. My understanding of stochastic gradient descent is that you randomly pick one data point at each iteration, pulled from the complete data set. Basically, it's not the same 'i' each time, unless the randomness is broken and keeps choosing the same 'i' over and over, which has a probability very close to 0.
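That reply can be made concrete with a toy sketch of stochastic gradient descent. This is an illustration only, assuming a made-up 1-D least-squares problem (fitting y = w·x, true weight 3) with an arbitrary learning rate; it is not the algorithm from the slide at 42:38:

```python
import random

# Toy data generated from y = 3x, so the true weight is 3.0.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

w = 0.0
lr = 0.1
random.seed(0)
for _ in range(1000):
    x, y = random.choice(data)   # a NEW random point each iteration
    grad = 2 * (w * x - y) * x   # d/dw of the single-point loss (w*x - y)^2
    w -= lr * grad               # gradient step

print(w)  # converges near 3.0
```

The key point from the thread: `random.choice` runs inside the loop, so each iteration uses a freshly sampled point rather than one fixed 'i'.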
An excellent introduction to deep learning. Crisp and clear. Thank You!
It may be the first time that I understand the methods and definitions so easily. Great presentation.
Good work Alexander. Keep it up. I'll be watching your whole series this year.
A-MA-ZING. Looking forward to the rest of the class. Thank you! :)
Many thanks for breaking down complex subject matter into easily graspable blocks !!!
That was the best introductory lecture on neural networks!!! Thanks for open sourcing lectures!!!!
Thanks to your high quality teaching of Deep learning! it really helps a lot to understand it!
Finally understood how neural networks work and some basic concepts ooof!!!
Thank you.
Super excited to be here, and a great opportunity to learn more through open sourcing.
Hey, thanks Alexander for this, totally worth every minute I watched it.
I appreciate the way you draw the neural network model, it's not cluttered with lines like some people draw, but in the course you haven't explained the topic of hyperparameters. thanks again!
OMFG! I had been waiting for a while and was thinking I should just take last year's session, but now this is coming!
Thank you, Alex! I had been waiting for this since the new year.
This was really really helpful. Thank you MIT team. Keep up your good work.
The mathematical explanation is very clear.
I've already learned these concepts, but this introduction gave me a deeper understanding of them.
This is unbelievable 🔥 how good could a course ever be!!
Hi , Alexander , you just explained deep learning in a very easy and intuitive way .
Very comprehensive and concise. Thank you very much, excellent explanation!
waited for this and now watching this on the night before my undergraduate practicals :)
I thank you so much for putting these online.
Wow! That was a rollercoaster for mind. Best show ever!
thank you so much MIT, just got the mail and dah I am here, I have been waiting for this
Love the last course, really excited and thank you :D
thank you for posting this lecture series!
Fantastic 1 lecture intro! Thank you very much!!
Hi Alexander! I've been a fan of you and Ava since this series in 2020! Looking forward to the new updates this season. Just wondering, would AlphaFold get a snippet to be introduced in detail?
Thanks for providing these great lectures! Are the assignments also available?
Thank you for this engaging lecture presentation. It helps me a lot.
Working with deep learning on my master thesis even though I have no background in computer science😅, this was a fantastic introduction thanks Alexander!!
Thanks , finally wait is over 😊
thank you guys for open sourcing this treasure!!!
This is actually one of the better videos to understand this.
Thank you thank you, I’ve been waiting.
I'm really excited to learn THANK YOU
Awesome content, thanks for creating and keeping it free, unlike others :)
Set the reminder on. I'm waiting! 🙂
Most awaited video of year
Amazing!
Thanks from Brazil!
Wooohooo!!! Let's go for another year!
Thanks for open sourcing the course @MIT and @Alexander Amini and @Ava Soleimany
Wow.Just awesome. Glad to learn from the nerds at MIT.
The start of the course is amazing.
Anticipating!!!!!!!😊😊
Good lecture. Enjoyed it immensely.
Well done, this is a really good course. Greetings to you.
Thank you so much for the amazing lecture.
I was just wondering when will this be out yesterday!
Special thanks from South Korea 🎉
Super excited to learn.. Luv from INDIA
thank you for the series :-)
23:00 Dense Layer
Awesome, I just set the reminder.😋
nice lecture and relatable, cool teaching skills
I'm super excited to learn deep learning model ...
Just 5 min of this video > whole engineering course I had in college.
Thanks for the wonderful course
such a great lecture. thanks
I feel these lectures teach complex problems in an understandable way, with only basic knowledge of programming needed.
Absolutely fantastic.
Thank you for the course :D
Greatly appreciated
Thank you for your content
what an amazing lecture
Very Informative
19:55 - At this point it would help if you explained that you've chosen the first activation function, which is monotonic with g(0) = 0.5.
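For anyone following along, the activation with g(0) = 0.5 referred to in that comment appears to be the sigmoid (this is an assumption from the g(0) = 0.5 property, not confirmed by the comment itself). A quick check:

```python
import math

def sigmoid(z):
    """Sigmoid activation: g(z) = 1 / (1 + e^{-z})."""
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0.0))                    # exactly 0.5, since e^0 = 1
print(sigmoid(-2.0) < sigmoid(2.0))    # monotonically increasing
```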
Thank you professor💙
excellent!
me: excited,
forwarded to students
students: excited
🍻👍
Thanks for the good work
So excited to be part of this cohort. I'm new to Deep Learning and looking for fellow enthusiasts; if anyone wishes to collaborate, feel free to hit me up.
@@mohammadolaimat1063 Sure
Thank you very much!
Good morning, nerds! I'm pursuing my degree in computer science with an interest in this sort of thing; this will be epic.
Thank you !
Thanks you are good!
Thank you! ♥
Thanks a bunch !!
Sir great work
Thank you so much for your explanation. The slides links are not working.
nice lecture!
well explained.
Hi Alex! Does this have the required technical rigor? What are the prerequisites, then?
Many thanks....
Wonderful
I am super glad to follow this class, thank you. I can't access the slides; I got a "404 File not found" error. Please kindly look into the link. Thank you.
Thanks for letting me know, I'm working on fixing that ASAP and getting the slides published
Great lecture. Thanks for sharing! At 26:30, what does n_(k-1) represent in the computation of z_(k,i)?
I think 'k' here represents the position of the hidden layer in the deep neural network, and 'n' is the number of perceptrons in that layer. So n_(k-1) is the number of perceptrons in the layer preceding the k-th layer. This is just what I understood, though I may be wrong.
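Under that reading, z_(k,i) is the pre-activation of unit i in layer k, formed by summing over the n_(k-1) outputs of the previous layer plus a bias. A minimal sketch of that computation (the weights, biases, and inputs below are illustrative, not values from the lecture):

```python
def layer_preactivations(a_prev, W, b):
    """Compute z_{k,i} = b_i + sum_j W[j][i] * a_prev[j] for each unit i.

    a_prev: outputs of layer k-1 (length n_{k-1})
    W:      weight matrix; W[j][i] connects unit j of layer k-1 to unit i of layer k
    b:      biases for the n_k units of layer k
    """
    n_prev, n_units = len(W), len(b)
    return [b[i] + sum(W[j][i] * a_prev[j] for j in range(n_prev))
            for i in range(n_units)]

# Layer k-1 has n_{k-1} = 3 units; layer k has 2 units.
z = layer_preactivations(a_prev=[1.0, 0.5, -1.0],
                         W=[[0.2, -0.1], [0.4, 0.3], [0.1, 0.6]],
                         b=[0.0, 0.1])
print(z)
```

So n_(k-1) simply bounds the sum: each z_(k,i) aggregates one weighted term per perceptron in the preceding layer.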