The KL Divergence : Data Science Basics
- Published 2 Jun 2024
- understanding how to measure the difference between two distributions
Proof that KL Divergence is non-negative : • Jensen's Inequality : ...
My Patreon : www.patreon.com/user?u=49277905
0:00 How to Learn Math
1:57 Motivation for P(x) / Q(x)
7:21 Motivation for Log
11:43 Motivation for Leading P(x)
15:59 Application to Data Science
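For reference, the chapters above assemble the standard discrete KL divergence piece by piece: the P(x)/Q(x) ratio, the log, and the leading P(x):

```latex
% KL divergence of P from Q: the expectation under P of log(P(x)/Q(x))
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
```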
Wow... 😳 I've never seen a more genius, easy, and intuitive explanation of KL-div 😳👏👏👏👏👏 Big thanks, good man! ❤️
Glad you liked it!
I agree. Your bottom-up (instead of top-down) approach that you mentioned at the beginning of the video would be really great to see for all kinds of different concepts!
Great idea!
That was the best description of why we use log that I have ever seen. Good work, man.
I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.
I don't think I'm ever going to forget this. Thanks so much.
Amazing. The pace you have explained, the approach...everything is just top-notch.
This video is absolutely mind-blowing! The way it breaks down such a complex concept into an intuitive understanding is truly remarkable.
Thank you!
I'm in the middle of a $2,500 course, BUT → UA-cam → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.
That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.
Glad it was helpful!
Thank you for this, the best explanation of KL divergence that I have seen. Love how you approach it building gradually, really inspiring for how to learn math.
Fantastically clearly explained, congrats.
I think your channel and teaching style are brilliant. I wish I had known about this channel when I was doing my undergrad.
Great explanation! This is the first time I'm learning about KL divergence, and it was very easy to grasp because of the way you taught it.
I love how you approach the KL divergence!
This is mind-blowing... I love the way you go from the problem to the solution; it's a clever way to understand KL divergence.
thanks!
Best Math Teacher ever. So clearly explained the design and the thinking process behind how the algo comes out. Many videos just explain the formula, which left me confused about why we do it this way... Thank you!
You've really made my day with your explanation. Thank you so much :D
In the 'Motivation for Log' section, you said that taking a simple average is not the right way to go, and then you look for a function that makes f(4) and f(1/4) have opposite signs. That means two very different distributions could end up with the smallest possible distance (the terms canceling each other out), which contradicts what we expect: we expect the distance between them to be large.
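The opposite signs are the point of the log step, and the cancellation worry is resolved by the next step in the video, the leading P(x): the terms are averaged under P rather than uniformly, and that weighted sum is non-negative by Jensen's inequality (see the proof linked in the description), hitting zero only when P = Q. A minimal NumPy sketch, with made-up distributions, illustrating both halves:

```python
import numpy as np

# Two deliberately different distributions over the same four outcomes.
p = np.array([0.4, 0.4, 0.1, 0.1])
q = np.array([0.1, 0.1, 0.4, 0.4])

# The raw log ratios do have opposite signs, and a simple average cancels:
log_ratios = np.log(p / q)   # [ 1.386,  1.386, -1.386, -1.386]
print(log_ratios.mean())     # ~0.0 -- the cancellation the comment worries about

# But KL(P||Q) weights each term by P(x), so outcomes where P puts more
# mass dominate, and the sum stays non-negative (zero only when P == Q):
kl = np.sum(p * np.log(p / q))
print(kl)                    # ~0.832 > 0 for these very different distributions
```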
Outstanding. Really helping me through this info retrieval course!
Amazing video, love the format!
That was great! Not just dumping the formula on you but walking you through its logic with simple steps. Loved it! ❤
Wow!!! This approach to explaining was mind "opening". I got it! Thanks so much
Another amazing video! Please keep them coming!
One of the BEST tutorials for sure
Great work! I've been a fan of your material for some time, and in this video you have truly mastered your craft.
Wow, thank you!
Superb... I believe this is the best explanation I have ever come across for KL Divergence. Thanks a tonne.
The best explanation I've ever seen about KL divergence ❤
Wow. This is the best explanation of KL-divergence I've ever heard. So much over-complicated stuff out there, but yours is absolutely genius.
Glad it was helpful!
Excellent intuitive explanation!
This is the perfect math video. Love it. Shared it with all my readers.
Every time I have a math question, your channel is my first choice! Amazing ✅ Thanks a million 🎉
Let's celebrate a new video on this amazing channel!!! Love your work!
🎉
Your videos are great, just keep going. I've watched you for a few years already.
I've found this professor is very good at explaining every tough concept! Respect and much appreciation!
Excellent way to explain it. Makes maths sound logical and approachable 🎉
Thank you as always for sharing your brilliant teachings, Ritvik. Could you please do a video on the Gram-Schmidt process and how orthonormal basis matrices are relevant to data science?
That was one of the best explanations I have ever heard! Great job and many thanks!
Thanks!!
This was incredibly illustrative!
dude, the explanation is so good, you rock!
Glad it helped!
Even for me, not knowing some of the fundamentals, your explanation made a lot of sense, and I felt I understood the concept well. I am willing to watch your videos more often.
amazing explanation. not many can do this. well done.
Thanks, exactly the explanation I have been looking for!
very nice explanation. Thanks for the work.
This is an amazing explanation, thanks!
Glad it was helpful!
I have never seen complex math explained this well. Thank you very much!
Best video I've seen in a while!
Thanks!
I recently got interested in learning machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image generation AI.
That's where I encountered the KL divergence. The more I try to understand it, the more complicated concepts and formulas are thrown at me.
I managed to find some videos that explain how to derive it, but none of them explained why the hell the logarithm is present in it, for God's sake!
And here you are, explaining every detail missing from other videos and blog posts, in a way that a person who knows very little about the subject can understand, in a very satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.
Thanks, and godspeed on your journey through machine learning!
Awesome explanation. Thanks for this video.
Glad it was helpful!
That was amazing. Thank you so much!
Thank you for the great explanation! I totally agree that math is not given from above but invented by people. And showing how the invention can be done is the best way to teach new concepts. Thanks a lot!
You just got a subscriber. Thank You! 😊
Wahh... I am studying for a computer science master's degree. Your video really helps me a lot! Please keep doing such great work for us!
Awesome! Very intuitive
Thank you so much for this explanation and also got a new insight about the log :)
Happy to help!
Amazing teaching. It helps a lot with my covariate shift detection project. Thanks.
Glad it was helpful!
Thank you. As usual, great and very intuitive explanation.
No problem !
Blew my mind, I wanted to understand what kl divergence is to understand the recent Gen AI papers and couldn't. This video helped me a lot.
thank you for the clear explanation.
Very very very very well explained. Thanks!
Thank you! This is the best explanation of KL divergence that I've seen.
Glad it was helpful!
God level explanation thank you!!!
The thing you said in the first minute is something I've been saying for a while now. As students, we aren't told what problem drove scientists or engineers to construct new formulas or ways of thinking.
Amazing explanation!
Very good explanation
Wonderful man. Thank you so much.
Great stuff... Learning a way to teach maths to my kid... A constructivist method... While learning about stats... I really appreciate your work.
Glad it was helpful!
Thank you for the best explanation on this topic.
Great explanation
mad respect for Ritvik from Ritwik for acing the subtle art of intuitive explanation:)) If only professors could master the same art.
You are great at explaining this! Thanks!
Glad it was helpful!
Wow. Just wow! This is brilliant🤩
Thanks!
Thank you, this really helped!!
It was the easiest explanation I’ve ever seen.
Thank you SO much! God bless you Sir, keep up the great work 😊
You are very welcome
Bro is a legend
Thank you very much for your valuable videos!!
Glad you like them!
Taking the MITx Stats class, but I find that you explain the concepts so much better!
Glad to hear!
You are way better than my school's professor. Thank you!
This was awesome. Really helpful to think through it backwards and “redevelop” our own function
Thanks a lot for sharing the underlying motivation behind the K-L divergence! I really needed such deep insights! JAJAKALLAH...
You're so welcome!
The comments didn't lie, you actually explained this so well. I watched the ads all the way through, btw.
Thanks for the lecture, your work is always so intuitive.
You are very welcome
Great explanation 👏
Glad you think so!
great explanation!
Glad you think so!
Thank you so much for the explanation. it was really helpful👍👍
Thanks!
Thank you :) for valuable content
Amazing explanation
Glad you liked it
Super clear !
Thank you very much! Besides the "nominal" categories in your example, I am wondering if this can also be used with "ordinal" categories. For example, if I make a questionnaire from "dislike" to "like very much" and poll 2 groups, can I use the KL divergence to calculate the difference between these 2 groups? And is there an even better way to describe this difference, for example, that group 2 shows "significantly" higher interest than group 1?
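Mechanically, yes: KL divergence applies to any two discrete distributions, so you can compute it between the two groups' response distributions. Note, though, that it treats the answer categories as unordered, so it tells you how different the groups are, not which one likes the product more; for a directional claim like "group 2 shows significantly higher interest", a rank-based test such as Mann-Whitney U is the more usual tool. A minimal sketch with made-up poll counts, using scipy.stats.entropy (which returns the KL divergence when given two distributions):

```python
import numpy as np
from scipy.stats import entropy

# Hypothetical response counts on a 5-point scale, "dislike" ... "like very much".
group1 = np.array([30.0, 25.0, 20.0, 15.0, 10.0])
group2 = np.array([10.0, 15.0, 20.0, 25.0, 30.0])

# Normalize the counts into probability distributions.
p = group1 / group1.sum()
q = group2 / group2.sum()

# entropy(p, q) returns KL(P||Q) = sum(p * log(p/q)); note it is asymmetric.
print(entropy(p, q))   # > 0: the two groups answer differently
print(entropy(q, p))   # generally a different number
```

Because of that asymmetry, report which direction you computed, or use a symmetric variant such as the Jensen-Shannon divergence.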
Love problem solving perspective
You're just phenomenal
Thanks!
That was really awesome
Subscribed! just from this one video
Woo welcome!
Great job
Great video!
Glad you enjoyed it!
Really well explained, thank you!
It would be interesting to have a video on how you study to understand a topic, what resources you use, and what materials you look for.
Love this homie! Better than university.
😊
Good Job 👍
Math should all be taught this way, and to go one step further, we should teach people how to make sense of math themselves in the long run.
Thanks for the explanation of KL divergence though ;)