The KL Divergence : Data Science Basics

  • Published 2 Jun 2024
  • understanding how to measure the difference between two distributions
    Proof that KL Divergence is non-negative : • Jensen's Inequality : ...
    My Patreon : www.patreon.com/user?u=49277905
    0:00 How to Learn Math
    1:57 Motivation for P(x) / Q(x)
    7:21 Motivation for Log
    11:43 Motivation for Leading P(x)
    15:59 Application to Data Science
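
    For reference, the formula those chapters assemble piece by piece — the ratio P(x)/Q(x), the log, and the leading P(x) — is the KL divergence:

    ```latex
    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log \frac{P(x)}{Q(x)}
    ```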

COMMENTS • 220

  • @szymonk.7237
    @szymonk.7237 1 year ago +99

    Wow... 😳 I've never seen a more ingenious, easy, and intuitive explanation of KL divergence 😳👏👏👏👏👏 Big thanks, good man! ❤️

  • @murkyPurple123
    @murkyPurple123 1 year ago +48

    Your bottom-up (instead of top-down) approach, which you mentioned at the beginning of the video, would be really great to see for all kinds of different concepts!

  • @DS-vu5yo
    @DS-vu5yo 9 months ago +9

    That was the best description of why we use log that I have ever seen. Good work, man.

  • @zafersahinoglu5913
    @zafersahinoglu5913 5 months ago +5

    I am a research scientist. You provide a clear and concise treatment of KL-Divergence. The best I have seen to date. Thanks.

  • @varadpuntambekar8895
    @varadpuntambekar8895 1 month ago +1

    I don't think I'm ever going to forget this. Thanks so much.

  • @tanvirazhar
    @tanvirazhar 10 months ago

    Amazing. The pace of your explanation, the approach... everything is just top-notch.

  • @asimosman3428
    @asimosman3428 10 months ago +2

    This video is absolutely mind-blowing! The way it breaks down such a complex concept into an intuitive understanding is truly remarkable.
    Thank you!

  • @user-li5lh1qs6s
    @user-li5lh1qs6s 1 year ago +3

    I'm in the middle of a $2,500 course, BUT → YouTube → your video... 👏🏻👏🏻👏🏻👏🏻👏🏻 Thank you for starting with the "why", and appealing to my brain's desire to understand, not just do.

  • @JBoy340a
    @JBoy340a 1 year ago +16

    That was great. I have struggled to understand certain aspects of KL Divergence, and this is a great way to think about it without getting bogged down in symbology.

  • @markozege
    @markozege 1 year ago

    Thank you for this, the best explanation of KL divergence that I have seen. Love how you approach it building gradually, really inspiring for how to learn math.

  • @andrashorvath2411
    @andrashorvath2411 1 year ago +1

    Fantastically clearly explained, congrats.

  • @mrcaljoe1
    @mrcaljoe1 1 year ago +1

    I think your channel and teaching style are brilliant. I wish I had known about this channel when I was doing my undergrad.

  • @anujadassanayake6202
    @anujadassanayake6202 1 year ago

    Great explanation. This is the first time I'm learning about KL divergence, and it was very easy to grasp because of the way you taught it.

  • @tampopo_yukki
    @tampopo_yukki 8 months ago +1

    I love how you approach the KL divergence!

  • @trungphan9137
    @trungphan9137 1 year ago +5

    This is mind-blowing... I love the way you go from the problem to the solution; it's a clever way to understand this KL divergence.

  • @eagermage3157
    @eagermage3157 11 months ago +3

    Best math teacher ever. You so clearly explained the design and the thinking process behind how the algorithm comes about. Many videos just explain the formula, which left me confused about why we should do it this way... Thank you!

  • @trentbolt2006
    @trentbolt2006 9 months ago +2

    You've really made my day with your explanation. Thank you so much :D

  • @yhoang6674
    @yhoang6674 11 months ago +1

    In 'Motivation for Log', you said that taking a simple average is not the right way to go, and then you look for a function that makes f(4) and f(1/4) have opposite signs. That means you are trying to make two very different distributions have the smallest distance possible (canceling each other out), which is the opposite of what we expect: we expect the distance to be large.
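
    The leading P(x) (the 11:43 chapter) is what resolves this worry: the opposite-signed log terms are weighted by P rather than averaged symmetrically, so they cancel out only when the two distributions actually match. A minimal numeric sketch, using the same 4 and 1/4 ratios in two hypothetical three-outcome distributions:

    ```python
    import numpy as np

    def kl_divergence(p, q):
        """KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x))."""
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum(p * np.log(p / q)))

    # Hypothetical distributions whose pointwise ratios are 4, 1/4, 1/4.
    p = [0.8, 0.1, 0.1]
    q = [0.2, 0.4, 0.4]

    # log(p/q) = [log 4, log 1/4, log 1/4] has mixed signs, but the
    # P(x) weights stop the terms from canceling each other out:
    print(kl_divergence(p, q))  # ~0.83, strictly positive
    print(kl_divergence(p, p))  # 0.0 -- zero only when P equals Q
    ```

    Per the non-negativity proof linked in the description, this P-weighted sum is zero exactly when P = Q and positive otherwise.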

  • @paigecarlson1742
    @paigecarlson1742 2 months ago

    Outstanding. Really helping me through this info retrieval course!

  • @0hexe
    @0hexe 1 year ago

    Amazing video, love the format!

  • @julianwebb9222
    @julianwebb9222 1 month ago

    That was great! Not just dumping the formula on you but walking you through its logic with simple steps. Loved it! ❤

  • @joesavage9077
    @joesavage9077 1 day ago

    Wow!!! This approach to explaining was mind "opening". I got it! Thanks so much

  • @danscherb4130
    @danscherb4130 1 year ago

    Another amazing video! Please keep them coming!

  • @tayyibulhassan6227
    @tayyibulhassan6227 10 months ago +1

    One of the BEST tutorials for sure

  • @midnightwanders5876
    @midnightwanders5876 1 year ago

    Great work! I've been a fan of your material for some time, and in this video you have truly mastered your craft.

  • @somdubey5436
    @somdubey5436 5 months ago

    Superb... I believe this is the best explanation I have ever come across for KL Divergence. Thanks a ton.

  • @godlyradmehr2004
    @godlyradmehr2004 1 month ago

    The best explanation I've ever seen about KL divergence ❤

  • @marka5968
    @marka5968 1 year ago +7

    Wow. This is the best explanation of KL divergence I've ever heard. So much over-complicated stuff out there, but yours is absolutely genius.

  • @seansullivan6986
    @seansullivan6986 16 days ago

    Excellent intuitive explanation!

  • @ChocolateMilkCultLeader
    @ChocolateMilkCultLeader 1 year ago

    This is the perfect math video. Love it. Shared it with all my readers.

  • @Mars.2024
    @Mars.2024 4 months ago

    Every time I have a math question, your channel is my first choice! Amazing ✅ Thanks a million 🎉

  • @shamarbauyrzhan7997
    @shamarbauyrzhan7997 1 year ago +3

    Let's celebrate a new video on this amazing channel!!! Love your work!

  • @momcilomrkaic2214
    @momcilomrkaic2214 3 months ago

    Your videos are great, just keep going. I've watched you for a few years already.

  • @thankyouthankyou1172
    @thankyouthankyou1172 6 months ago

    I've found that this professor is very good at explaining every tough concept! Respect and much appreciation!

  • @Hobbies_forkids
    @Hobbies_forkids 1 year ago

    Excellent way to explain it. Makes math sound logical and approachable 🎉

  • @vzinko
    @vzinko 1 year ago +3

    Thank you as always for sharing your brilliant teachings, Ritvik. Could you please do a video on the Gram-Schmidt process and how orthonormal basis matrices are relevant to data science?

  • @kasyaci
    @kasyaci 1 year ago

    That was one of the best explanations I have ever heard! Great job and many thanks!

  • @hpp496videos
    @hpp496videos 1 year ago

    This was incredibly illustrative!

  • @Justin-zw1hx
    @Justin-zw1hx 1 year ago

    dude, the explanation is so good, you rock!

  • @user-vb1no5lq1e
    @user-vb1no5lq1e 4 months ago

    Even though I didn't know some of the fundamentals, after listening to your explanation it made a lot of sense, and I felt I understood the concept well. I am willing to watch your videos more often.

  • @sandipmehta2950
    @sandipmehta2950 9 months ago +1

    Amazing explanation. Not many can do this. Well done.

  • @manducchuc915
    @manducchuc915 8 months ago

    Thanks, exactly the explanation I have been looking for!

  • @akhileshpandey123
    @akhileshpandey123 1 year ago

    Very nice explanation. Thanks for the work.

  • @yb801
    @yb801 1 year ago

    This is an amazing explanation, thanks!

  • @mehmetozkan1479
    @mehmetozkan1479 7 months ago

    I have never seen complex math explained this well. Thank you very much!

  • @razgaon3680
    @razgaon3680 1 year ago

    Best video I've seen in a while!

  • @brandonkim4675
    @brandonkim4675 1 year ago

    I recently got interested in machine learning and stumbled upon Stable Diffusion, the current state-of-the-art open-source image-generation AI.
    That's where I encountered the KL divergence. The more I tried to understand it, the more complicated concepts and formulas were thrown at me.
    I managed to find some videos that explain how to derive it, but none of them explained why on earth the logarithm is present in it!
    And here you are, explaining every detail missing from other videos and blog posts, in a way that a person who knows very little about the subject can understand, in a very satisfying and easy-to-follow way. Hats off to you, sir. I wish every teacher were like you.

    • @ritvikmath
      @ritvikmath  1 year ago

      Thanks, and godspeed on your journey through machine learning!

  • @s.prakash7869
    @s.prakash7869 1 year ago

    Awesome explanation. Thanks for this video.

  • @sparkgin
    @sparkgin 28 days ago

    That was amazing. Thank you so much!

  • @vorushin
    @vorushin 4 months ago

    Thank you for the great explanation! I totally agree that math is not given from above, but invented by people. And showing how the invention can be done is the best way to teach the new concepts. Thanks a lot!

  • @rishi2504
    @rishi2504 8 months ago

    You just got a subscriber. Thank You! 😊

  • @tom199520000
    @tom199520000 2 months ago +1

    Wow... I am studying for a master's degree in computer science. Your video really helps me a lot! Please keep doing such great work for us!

  • @vaibhavnakrani2983
    @vaibhavnakrani2983 5 months ago

    Awesome! Very intuitive

  • @PrajwalSingh15
    @PrajwalSingh15 1 year ago

    Thank you so much for this explanation; I also got a new insight about the log :)

  • @Andy-qi5nh
    @Andy-qi5nh 1 year ago

    Amazing teaching. It helps a lot with my covariate shift detection project. Thanks.

  • @aliksargsyan2035
    @aliksargsyan2035 1 year ago

    Thank you. As usual, great and very intuitive explanation.

  • @RakshithReddy5555
    @RakshithReddy5555 6 months ago

    Blew my mind. I wanted to understand KL divergence in order to follow the recent GenAI papers and couldn't. This video helped me a lot.

  • @keyvan4680
    @keyvan4680 6 months ago

    Thank you for the clear explanation.

  • @seyyedmahdihoseini3084
    @seyyedmahdihoseini3084 6 months ago

    Very, very well explained. Thanks!

  • @user-co6pu8zv3v
    @user-co6pu8zv3v 1 year ago

    Thank you! This is the best explanation of KL divergence that I've seen.

  • @kaanefe4266
    @kaanefe4266 7 months ago

    God-level explanation, thank you!!!

  • @andrew-qf4xl
    @andrew-qf4xl 5 months ago

    The thing you said in the first minute is something I've been saying for a while now. As students, we aren't told what problem drove scientists or engineers to construct new formulas or ways of thinking.

  • @aisniper4095
    @aisniper4095 1 year ago

    Amazing explanation!

  • @jtkklb
    @jtkklb 1 month ago

    Very good explanation

  • @devindoinmonkmode
    @devindoinmonkmode 2 months ago

    Wonderful man. Thank you so much.

  • @fh3652
    @fh3652 1 year ago

    Great stuff... Learning a way to teach maths to my kid... A constructivist method... While learning about stats... I really appreciate your work.

  • @barbaraalexandrova6680
    @barbaraalexandrova6680 10 days ago

    Thank you for the best explanation on this topic.

  • @talsveta
    @talsveta 9 months ago

    Great explanation

  • @jackritwik09
    @jackritwik09 6 months ago

    Mad respect for Ritvik from Ritwik for acing the subtle art of intuitive explanation :)) If only professors could master the same art.

  • @sikalee415
    @sikalee415 1 year ago

    You are great at explaining this! Thanks!

  • @mariusschmidt6883
    @mariusschmidt6883 1 year ago

    Wow. Just wow! This is brilliant🤩

  • @afsanarabeya4417
    @afsanarabeya4417 1 year ago

    Thank you, this really helped!!

  • @mantische
    @mantische 1 year ago +1

    It was the easiest explanation I’ve ever seen.

  • @s.m.tahsinzaman_2720
    @s.m.tahsinzaman_2720 1 year ago +1

    Thank you SO much! God bless you Sir, keep up the great work 😊

  • @sunset6109
    @sunset6109 10 months ago +3

    Bro is a legend

  • @ingenierocivilizado728
    @ingenierocivilizado728 2 months ago

    Thank you very much for your valuable videos!!

  • @gingerderidder8665
    @gingerderidder8665 2 months ago

    Taking the MITx Stats class, but I find that you explain the concepts so much better!

  • @ringo8530
    @ringo8530 9 months ago

    You are way better than my school's professor. Thank you.

  • @winstongraves8321
    @winstongraves8321 1 year ago +1

    This was awesome. Really helpful to think through it backwards and “redevelop” our own function

  • @komuna5984
    @komuna5984 10 months ago

    Thanks a lot for sharing the underlying motivation behind the KL divergence! I really needed such deep insights! JAJAKALLAH...

  • @SSJVNN
    @SSJVNN 6 months ago

    The comments didn't lie; you actually explained this so well. I watched the ads all the way through, btw.

  • @qiguosun129
    @qiguosun129 1 year ago

    Thanks for the lecture, your work is always so intuitive.

  • @shadabalam2122
    @shadabalam2122 2 months ago

    Great explanation 👏

  • @orenkoriat
    @orenkoriat 1 year ago +1

    Great explanation!

  • @abironnoy3115
    @abironnoy3115 1 year ago

    Thank you so much for the explanation. It was really helpful 👍👍

  • @DataScienceAI-rf4kx
    @DataScienceAI-rf4kx 3 months ago

    Thank you :) for the valuable content.

  • @chandrashekaravula1292
    @chandrashekaravula1292 1 year ago

    Amazing explanation

  • @yorker0507
    @yorker0507 3 months ago

    Super clear!

  • @chenqu773
    @chenqu773 1 year ago

    Thank you very much! Besides the "nominal" categories in your example, I am wondering if this can also be used with "ordinal" categories. For example, if I make a questionnaire from "dislike" to "like very much" and get poll results from 2 groups, can I use the KL divergence to calculate the difference between these 2 groups? And is there an even better way to describe this difference, for example, that group 2 shows "significantly" higher interest than group 1?
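
    A minimal sketch of that calculation, with made-up poll counts for the two groups (all numbers below are hypothetical). KL(P || Q) does quantify how different the two response distributions are; note, though, that it treats the answer options as unordered labels, so the dislike-to-like ordering carries no weight in the result:

    ```python
    import numpy as np

    # Hypothetical 5-option poll counts, "dislike" ... "like very much".
    group1 = np.array([30.0, 25.0, 20.0, 15.0, 10.0])
    group2 = np.array([10.0, 15.0, 20.0, 25.0, 30.0])

    # Normalize the counts into probability distributions.
    p = group1 / group1.sum()
    q = group2 / group2.sum()

    # KL(P || Q): positive here, since the groups answered differently.
    print(np.sum(p * np.log(p / q)))
    ```

    Whether group 2 shows "significantly" higher interest is a separate question; a measure that uses the answer ordering (for example, one built on the cumulative distributions) may describe that better than KL, which ignores it.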

  • @nanthawatanancharoenpakorn6649
    @nanthawatanancharoenpakorn6649 7 months ago

    Love the problem-solving perspective.

  • @anonymousiguana168
    @anonymousiguana168 1 year ago

    You're just phenomenal

  • @RAHUDAS
    @RAHUDAS 1 year ago

    That was really awesome

  • @MuzammilZahidJahangiri
    @MuzammilZahidJahangiri 1 year ago

    Subscribed! Just from this one video.

  • @physicsfaith
    @physicsfaith 10 months ago

    Great job

  • @jijie133
    @jijie133 1 year ago

    Great video!

  • @shuweiPeng-id4xv
    @shuweiPeng-id4xv 3 months ago

    Really well explained, thank you.

  • @TheFirebolt2010
    @TheFirebolt2010 2 months ago

    It would be interesting to have a video on how you study to understand a topic, what resources you use, and the materials you look for.

  • @ResilientFighter
    @ResilientFighter 1 year ago +2

    Love this, homie! Better than university.

  • @msfasha
    @msfasha 1 year ago

    Good Job 👍

  • @chaochaisit
    @chaochaisit 1 year ago

    Math should all be taught this way, and to go one step further, we should teach people how to make sense of math themselves in the long run.
    Thanks for the explanation of KL divergence, though ;)