I really enjoyed this talk by Vladimir. Here's the outline:
0:00 - Introduction
0:46 - Overview: Complete Statistical Theory of Learning
3:47 - Part 1: VC Theory of Generalization
11:04 - Part 2: Target Functional for Minimization
27:13 - Part 3: Selection of Admissible Set of Functions
37:26 - Part 4: Complete Solution in Reproducing Kernel Hilbert Space (RKHS)
53:16 - Part 5: LUSI Approach in Neural Networks
59:28 - Part 6: Examples of Predicates
1:10:39 - Conclusion
1:16:10 - Q&A: Overfitting
1:17:18 - Q&A: Language
My master's thesis was forecasting using SVM. That was the first time I fell in love with machine learning and even Math. Thank you Vladimir for living.
I'm studying SVM in my MCS program. I was so surprised to find this video with Dr. Vapnik. We live in such blessed times to have easy access to this level of high-quality content. Thank you!
@Lex Fridman 1.5 years ago I listened to your first podcast with Prof. Vapnik and was blown away. Great man, great story. I love it. The funny thing is that while pursuing machine learning and deep learning myself, I just hit the subject of learning curves, cross-validation, and other methods for learning more efficiently, and remembered the podcast in which he mentioned his Complete Statistical Theory. As a former math major I appreciate his approach so much. Thanks for this opportunity.
His concept of predicates is intriguing: everything can be deconstructed to see what it consists of - the basic building blocks. With that, only one step is left: analyzing the structure. Excellent concept!
Amazing talk and amazing contributions to the field of statistical learning theory. This is definitely a piece of the puzzle that I feel is very underrepresented today.
Thanks for noticing that. I just paid for English captions to be created. It should be done in 30-40 hours. I'll update the video then. *Update:* The completed English captions are now added to the video.
That's because YouTube probably doesn't use Vapnik's algorithms :) BTW I'm a native Russian speaker, and for me his English is much clearer than American or British English :)
@@thewiseturtle you can support him in other ways. It is likely easier for Lex to manage this in a transactional way as opposed to managing all the drama that goes with community involvement. And even though he is paying for a service, he is likely supporting an individual's small business, which is a wonderful way to share wealth with hard workers.
Arriving here from the podcast. Must say that horizontal expansion will give us the models that we need, and yes, even after that it would be an imitation. Intelligence seems to be far from our reach as of now.
Huge respect for the gentleman (he is a legend for us AI Masters students in Ireland ;) Thank you for uploading to YouTube.
I feel privileged to have the opportunity to watch this video. Thank you very much @Lex Fridman
Lex, we are so grateful for the amazing lectures and conversations you provide to the Internet all assembled in one place, thank you!
I think this lecture broke my mind. Legend!
This evokes great memories of my university days! Working in applied ML is seldom as elegant as VC theory lol
Thank you very much for this video. Watching a lecture from this gentleman is such an honor.
It's an honor to see one of the living legends of theoretical machine learning and the father of Statistical Learning Theory in the flesh!
Wow, being taught by the man himself, what an honor
Legend in statistical learning ❤️
Thanks a lot!!! Very Informative!!!! And thanks for making all of this happen!!!!
Thank you for offering us this possibility.
Hi Lex :) Can you do a series/playlist on NLP research and where NLP is going after 2020? That'd be really helpful!
Funnily, YouTube detects the language of the video as Russian for subtitles.
@@lexfridman Thank you!
This video will be extremely difficult for someone without prior math knowledge to caption. God bless whoever is able to finish it.
He is a hero 😊
I love you Vapnik!
i use his invention every single day !
Such clear explanations. Thanks.
0:04 "co-inventor of supported vector machines". Lex invented the unsupported ones.
What a legend !!!
How does RBMK reactor....
Thank you very much for the seminar and the lecture.
Katerina, you have interesting playlists. I have similar interests and I also love watching the lectures you find interesting. Let's be friends? :)
Thanks. I don't understand what predicates are formally and when we use them.
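A rough sketch of how I understood it from the talk: each predicate ψ defines a "statistical invariant" that the trained rule f should preserve — the statistic ψ extracts from the model's outputs should match the one it extracts from the labels. The predicate, data, and functions below are my own illustrative choices, not Vapnik's actual formulation:

```python
# Statistical invariants (as I understood the talk): for a chosen predicate
# psi, a trained decision rule f should reproduce the statistic that psi
# extracts from the training labels:
#   (1/n) * sum_i psi(x_i) * f(x_i)  ≈  (1/n) * sum_i psi(x_i) * y_i

def invariant_gap(psi, f, X, y):
    """Absolute violation of the invariant defined by predicate psi."""
    n = len(X)
    model_stat = sum(psi(x) * f(x) for x in X) / n           # under the model
    data_stat = sum(psi(x) * yi for x, yi in zip(X, y)) / n  # in the data
    return abs(model_stat - data_stat)

# Toy 1-D problem: the label is the sign of x.
X = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
y = [-1, -1, -1, 1, 1, 1]

f_good = lambda x: 1 if x >= 0 else -1   # matches the data
f_bad = lambda x: 1                      # ignores the data

psi = lambda x: x                        # first-moment predicate (illustrative)
print(invariant_gap(psi, f_good, X, y))  # 0.0 -> invariant holds
print(invariant_gap(psi, f_bad, X, y))   # 7/6 -> invariant violated
```

In the LUSI setting these invariants are added as constraints during training; here I only measure how badly a given rule violates one.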
Great talk, but for you guys out there, let's hope it gets released in English.
Thx
2 views and 6 upvotes! that is what i am talking about!
YouTube does not update the view count in real time, and it's updated later than the upvotes, I guess.
OK, and now, how to program it in Python?
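Not his full LUSI method, but here is a minimal linear SVM in pure Python, trained with Pegasos-style stochastic subgradient descent on the hinge loss. The toy data and hyperparameters are my own choices, just a sketch:

```python
import random

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM without bias, trained by stochastic subgradient descent
    on the regularized hinge loss (Pegasos-style).
    X: list of feature vectors, y: labels in {-1, +1}."""
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    w = [0.0] * d
    t = 0
    for _ in range(epochs):
        for i in rng.sample(range(n), n):   # one pass in random order
            t += 1
            eta = 1.0 / (lam * t)           # decreasing step size
            margin = y[i] * sum(w[j] * X[i][j] for j in range(d))
            w = [(1.0 - eta * lam) * wj for wj in w]  # shrink (regularizer)
            if margin < 1:                  # hinge-loss subgradient step
                w = [w[j] + eta * y[i] * X[i][j] for j in range(d)]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1

# Toy separable data, symmetric around the origin so no bias term is needed.
X = [[2, 2], [3, 2], [2, 3], [-2, -2], [-3, -2], [-2, -3]]
y = [1, 1, 1, -1, -1, -1]
w = train_linear_svm(X, y)
print([predict(w, x) for x in X])  # [1, 1, 1, -1, -1, -1]
```

In practice you'd reach for scikit-learn's `sklearn.svm.SVC` or `LinearSVC` rather than rolling your own.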
Privileged to see him. Legendary persons are messengers of God.
Can anyone tell me the prerequisite maths for this book?
Great opening joke!😆
Marvin Minsky says statistical learning won't work to build AGI.
...but it still can be a building block in a society-of-mind system.
I thought it was just me but apparently his accent might be just a little bit hard to understand.
all part of the university experience lol
OMG is Vladimir Vapnik our Valentine!?
rip i have no idea what's going on
I think I would like to speak of AI. I m a sim ple man tho is there real ly such a thing... I think not.... so AM I !
First🙌🙌🙀😃😃😃😃
real soviet man, not your regular russkii :-P
Can't understand a fooking thing he says.
anyone else didn't understand it at all?
welcome to STEM university, my friend.
Heavy russian accent is hard to tolerate, while the math is pretty basic