Entropy (for data science) Clearly Explained!!!

  • Published 17 Jun 2024
  • Entropy is a fundamental concept in Data Science because it shows up all over the place - from Decision Trees, to similarity metrics, to state-of-the-art dimension reduction algorithms. It's also surprisingly simple, but often poorly explained. Traditionally, the equation is presented with the expectation that you memorize it without thoroughly understanding what it means and where it came from. This video takes a very different approach by showing you, step-by-step, where this simple equation comes from, making it easy to remember (and derive), understand, and explain to your friends at parties. (A short code sketch of the surprise-and-entropy calculation follows the chapter list below.)
    For a complete index of all the StatQuest videos, check out:
    statquest.org/video-index/
    If you'd like to support StatQuest, please consider...
    Buying my book, The StatQuest Illustrated Guide to Machine Learning:
    PDF - statquest.gumroad.com/l/wvtmc
    Paperback - www.amazon.com/dp/B09ZCKR4H6
    Kindle eBook - www.amazon.com/dp/B09ZG79HXC
    Patreon: / statquest
    ...or...
    YouTube Membership: / @statquest
    ...a cool StatQuest t-shirt or sweatshirt:
    shop.spreadshirt.com/statques...
    ...buying one or two of my songs (or go large and get a whole album!)
    joshuastarmer.bandcamp.com/
    ...or just donating to StatQuest!
    www.paypal.me/statquest
    Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
    / joshuastarmer
    0:00 Awesome song and introduction
    1:28 Introduction to surprise
    4:34 Equation for surprise
    6:09 Calculating surprise for a series of events
    9:35 Entropy defined for a coin
    10:45 Entropy is the expected value of surprise
    11:41 The entropy equation
    13:01 Entropy in action!!!
    #StatQuest #Entropy
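To make the derivation sketched in the chapter list concrete, here is a minimal Python sketch (my own illustration, not code from the video; the loaded-coin probabilities are made up): surprise is log2(1/p), and entropy is the probability-weighted average (the expected value) of the surprise.

```python
import math

def surprise(p):
    # Surprise of an outcome with probability p: log2(1 / p).
    # Rare outcomes (small p) are very surprising; a sure thing (p = 1) has surprise 0.
    return math.log2(1.0 / p)

def entropy(probabilities):
    # Entropy is the expected value of surprise:
    # the sum of p * surprise(p) over every possible outcome.
    return sum(p * surprise(p) for p in probabilities if p > 0)

# Hypothetical loaded coin: heads 90% of the time, tails 10% of the time.
loaded_coin = [0.9, 0.1]
fair_coin = [0.5, 0.5]

print(entropy(loaded_coin))  # ~0.47 bits: the outcomes rarely surprise us
print(entropy(fair_coin))    # 1.0 bit: the maximum entropy for a coin
```

With log base 2, the fair coin reaches the maximum entropy of 1 bit, while the loaded coin's outcomes are rarely surprising, so its entropy is lower.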

COMMENTS • 1.2K

  • @statquest
    @statquest  2 years ago +37

    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

    • @yakubsadlilseyam5166
      @yakubsadlilseyam5166 1 year ago

      Sir, have you included entropy in your book? I couldn't find it.

    • @statquest
      @statquest  1 year ago +1

      @@yakubsadlilseyam5166 No, it's not in the book because, while it's nice to know about, it's not essential since there are other, easier-to-understand options that you can use.

    • @sandeepgill4282
      @sandeepgill4282 1 year ago

      @@statquest could you please provide sources to gain understanding of all types of entropies?

    • @statquest
      @statquest  1 year ago

      @@sandeepgill4282 en.wikipedia.org/wiki/Entropy_(information_theory)

    • @rikodewaner
      @rikodewaner 1 year ago +1

      It's a helpful introduction, Josh....👍

  • @luischinchilla-garcia4840
    @luischinchilla-garcia4840 2 years ago +779

    This is quite possibly the best explanation of entropy I've ever seen. This is even better than Shannon's own paper!

    • @statquest
      @statquest  2 years ago +117

      BAM! :)

    • @DarXPloit
      @DarXPloit 2 years ago +18

      @@statquest Big Bang BAM

    • @SaffaGains
      @SaffaGains 2 years ago +18

      @@statquest DOUBLE BAM

    • @adipurnomo5683
      @adipurnomo5683 2 years ago

      When did Shannon write that paper?

    • @statquest
      @statquest  2 years ago +8

      @@adipurnomo5683 I believe it was 1948

  • @felixvanderspek1293
    @felixvanderspek1293 2 years ago +318

    "Simplicity is the ultimate sophistication." - Leonardo da Vinci
    Thanks for explaining in such simple terms!

  • @sade922
    @sade922 1 year ago +54

    9 minutes of your video explained everything better than 2 hours of my professor giving a lecture... Thank you!!!

  • @CellRus
    @CellRus 2 years ago +9

    Absolutely amazing. I always come back to your videos from time to time for simple (but absolutely useful) explanations of the complicated concepts that I find in papers. They have all helped me a lot, and I feel I'm better at communicating these concepts to other researchers too.

    • @statquest
      @statquest  2 years ago +1

      Hooray!!! I'm glad the videos are so helpful.

  • @deepakmehta1813
    @deepakmehta1813 2 years ago +77

    Amazing video on entropy, Josh. Thank you. I am certainly more addicted to StatQuest than Netflix. I really liked the way you introduced the notion of surprise and how you used it to pedagogically explain entropy. It is certainly now easy to think about and remember the definition of entropy.

    • @statquest
      @statquest  2 years ago +2

      Awesome, thank you!!!!

    • @shahf13
      @shahf13 2 years ago +2

      @@statquest As a heavy YouTube addict subscribed to 40 channels, you are the only one I put the bell on for.

    • @statquest
      @statquest  2 years ago +3

      @@shahf13 BAM! :)

  • @dvo66
    @dvo66 2 years ago +38

    Best entropy explanation. I took a 500-level ML class last spring in my master's, and this is a better explanation than my prof's (no disrespect to him, he is amazing too).

  • @nataliatenoriomaia1635
    @nataliatenoriomaia1635 2 years ago +6

    Awesome as always, Josh! Thank you for continuing to share high quality content with us. You’re a very talented teacher. I wish you all the best!

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @wolfgangi
    @wolfgangi 2 years ago +9

    I freaking love these videos; Josh has a gift for explaining things so vividly.

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @saidisha6199
    @saidisha6199 8 months ago +5

    One of the best explanations of entropy. I had been struggling with this concept for a while, and there was no intuitive way I could understand and remember the formula; your video made it possible. Great video!

  • @eduardoh.m2072
    @eduardoh.m2072 2 years ago +14

    You, sir, are the very first person to actually explain this subject and not just repeat some random definition without giving any thought to it. I'm amazed by the number of people who confuse rambling on about the topic with actually explaining it. Thank you!

  • @danielpaul65
    @danielpaul65 1 year ago +2

    Starting the video with a message declaring that we can understand Entropy is the best starting line I have ever seen from any teacher in my life. Great work!!!

  • @hemanthhariharan5105
    @hemanthhariharan5105 10 months ago +3

    I'm truly amazed by the power of simplicity and intuition. Hats off Josh!

  • @loay1844
    @loay1844 1 year ago +4

    Wow, I'm so impressed. For real! I've spent a week trying to understand entropy, and I really thought I was never going to understand it. This video is arguably the best video on YouTube! Not only about entropy, but overall!! Thank you so much.

  • @emmydistortion3997
    @emmydistortion3997 2 years ago +36

    Awesome world-class teaching...
    Thank you!

  • @shahedmahbub9013
    @shahedmahbub9013 2 years ago +8

    Thanks for all your efforts in creating a smart, funny and most importantly CLEAR explanation. This was awesome.

  • @sammitiyadav6914
    @sammitiyadav6914 3 months ago +1

    I follow most of your videos, not sure how I missed this gold! This is just the best entropy video I’ve ever seen.

  • @tranquil123r
    @tranquil123r 2 years ago +7

    Loved it. The best explanation of entropy I've come across. Thanks, Josh!

  • @OwenMcKinley
    @OwenMcKinley 2 years ago +22

    😊😊15:45 hahaha “psst… the log of the inverse of the probability...”
    Josh this was a fantastic tutorial. Love how I can just wake up and see content like this fresh in my YT recommendations
    We all appreciate it

    • @statquest
      @statquest  2 years ago +3

      Thank you very much! :)

    • @jeffnador9594
      @jeffnador9594 2 years ago

      You can also say "negative log of the probability": since 1/(x^c) = x^(-c), taking c = 1 gives log(1/x) = log(x^(-1)) = -log(x) (typeset below this thread).

    • @statquest
      @statquest  2 years ago +2

      @@jeffnador9594 Yep. But, to me, that form makes it just a little bit harder to see what's going on.

    • @jeffnador9594
      @jeffnador9594 2 years ago

      @@statquest Agreed! But, the more you can keep people guessing, the higher the surprise value of the statement...

    • @lbognini
      @lbognini 2 years ago

      @@jeffnador9594 The point is to get an intuition and derive the formula, not to manipulate mathematical terms.
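For readability, here is the identity from the reply above, typeset (same content as the comment, nothing new):

```latex
\text{Surprise}(p) \;=\; \log_2\!\left(\frac{1}{p}\right) \;=\; \log_2\!\left(p^{-1}\right) \;=\; -\log_2(p)
```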

  • @jaysonl3685
    @jaysonl3685 1 year ago +2

    Absolutely amazing and intuitive explanation Josh! I couldn't have understood it without you, huge thanks :D

  • @gustavorm5686
    @gustavorm5686 7 months ago +3

    The best explanation of entropy I've seen after browsing through tens of videos. Well done, prof!!

  • @bagavathypriya4628
    @bagavathypriya4628 1 year ago +3

    You are the BEST teacher!! Thanking God that you exist.

  • @bjornnorenjobb
    @bjornnorenjobb 2 years ago +2

    I love your videos; you make a wonderful effort to explain difficult topics in an easy way. Huge thanks!

  • @TuNguyen-ox5lt
    @TuNguyen-ox5lt 2 years ago +1

    This is definitely the most intuitive way to really grasp the idea of entropy. You're just wonderful. Thank you so much.

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @Shionita
    @Shionita 2 years ago +22

    I feel so happy because I just learned something new. Thanks as always!! 😁

  • @shamshersingh9680
    @shamshersingh9680 1 year ago +3

    How can it be!! How can you simplify such complex topics into such simple explanations? Hats off, man. I seriously wish I could have had a maths teacher like you back in school. I have become a fan of your videos. Your videos are the first and last stop for all my doubts. Thanks, Josh. You are a boon to learners like us. Impressed.

  • @chaitu2037
    @chaitu2037 2 years ago +1

    "This is by far the best explanation for entropy that I have ever come across", thanks so much!

  • @tomashernandezacosta9715
    @tomashernandezacosta9715 2 years ago +1

    This is THEE single BEST explanation for Entropy that I have ever heard. After this video I bought your book instantly. TRIPLE BAM!!

    • @statquest
      @statquest  2 years ago

      Wow! Thank you very much for your support!

  • @eliyahubasa9401
    @eliyahubasa9401 2 years ago +3

    Thanks, I’d been waiting for a good explanation of entropy for a long time. Thanks :)

  • @GokulAchin
    @GokulAchin 2 years ago +9

    Please give Josh a Nobel prize for not getting a single dislike on many of his videos and for his contribution to the ML and stats community. I have to forgive myself for not finding this channel way back when I first got interested in data science.
    You are definitely inspiring me to teach many people the same content you taught us.

    • @statquest
      @statquest  2 years ago +5

      Thank you!!! However, this video actually has 13 dislikes. For some reason YouTube no longer displays the number of dislikes, but with 3,402 likes, that means 99.6% of people liked this video, which is pretty good.

    • @GokulAchin
      @GokulAchin 2 years ago +6

      @@statquest DOUBLE BAM!!!!

    • @buihung3704
      @buihung3704 6 months ago +2

      @@statquest The entropy, i.e. the expected surprise when you randomly pick a reaction from your like/dislike pool, is pretty low :))) (see the sketch after this thread)

    • @statquest
      @statquest  6 months ago

      @@buihung3704 bam!

    • @matthewfox1561
      @matthewfox1561 5 months ago

      ua-cam.com/video/ULc6z3EsXUs/v-deo.htmlsi=wtYPowFO-N_iDqdS bazinga
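Picking up on the numbers quoted in this thread (3,402 likes, 13 dislikes), here is a minimal Python sketch (my own illustration, not from the video) of just how low that entropy is:

```python
import math

# Like/dislike counts quoted in the reply above.
likes, dislikes = 3402, 13
total = likes + dislikes

p_like = likes / total        # ~0.996
p_dislike = dislikes / total  # ~0.004

# Entropy = expected surprise, with surprise(p) = log2(1 / p).
entropy = sum(p * math.log2(1.0 / p) for p in (p_like, p_dislike))
print(entropy)  # ~0.036 bits: a randomly picked reaction is almost never a surprise
```

Almost all of that entropy comes from the rare dislikes, whose large surprise is weighted down by their tiny probability.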

  • @nakul___
    @nakul___ 2 years ago +1

    Been looking forward to this one for a while and was not disappointed at all - thanks!

  • @pietronickl8779
    @pietronickl8779 1 year ago +1

    Thanks for these super clear explanations - you really manage to break down complex concepts until they seem simple and (almost) intuitive. Also really appreciate the pace, i.e. the patience of going step by step and not making any crazy leaps two minutes from the end 👏👏👏

  • @dylansatow3315
    @dylansatow3315 9 months ago +3

    Wow this was amazing. I've never seen entropy explained this clearly before

    • @statquest
      @statquest  9 months ago

      Glad to hear it!

  • @user-cc6ro9kv1k
    @user-cc6ro9kv1k 9 months ago +3

    This is such a fascinating video. I'm learning the theory of ML, and I can certainly say you are a gifted person. Your perfect understanding of probability, math, and ML gives you the ability to explain them in the best way in the entire world. I'm amazed by your explanation skills.

  • @DanishArchive
    @DanishArchive 10 months ago +2

    I was awestruck when I finally understood what on earth entropy is. In most algorithms, I hear that entropy must be low, and I felt that it was some weird value the model gives that we have to tune to reduce. But sitting here watching this video felt like an eye-opener. What a simple and beautiful way to explain complicated concepts. You truly are amazing, Josh!!! Super BAMMM!!

  • @jsebdev1539
    @jsebdev1539 1 year ago +1

    I'm so happy this channel exists! Hurray!!!

  • @michaelgeorgoulopoulos8678
    @michaelgeorgoulopoulos8678 2 years ago +3

    The inverse probability is a much better way of putting it than the minus sign. It was in front of me all this time and I didn't notice. Thank you!

  • @AtiqurRahman-uk6vj
    @AtiqurRahman-uk6vj 2 years ago +33

    Your self-promotion is not shameless; it's a gift to humanity. Free content that explains things way better than paid content on Coursera. Thanks for helping out poor guys like us, Josh.

  • @danielmcleod6535
    @danielmcleod6535 2 years ago +2

    Hi Josh, you're always my go-to for stats videos. Thank you, they're amazing. Would love to hear you talk through chi-squared? I couldn't see anything on your channel on it thus far.

    • @statquest
      @statquest  2 years ago +2

      I'd like to cover that one day.

  • @tianyiluo0105
    @tianyiluo0105 2 years ago +2

    This is a great explanation! Thank you Josh and StatQuest!

  • @lan30198
    @lan30198 2 years ago +3

    Fuck, I never understood entropy before watching this video. You are amazing.

  • @RoRight
    @RoRight 2 years ago +1

    I was NOT surprised by the high quality of this video given StatQuest's high probability of producing awesome videos.

  • @felixlaw8377
    @felixlaw8377 5 months ago +1

    Being able to derive entropy and show it to us simply, in a funny way, is just mind-blowing... Hats off to you, sir!!

  • @ceseb23
    @ceseb23 2 years ago +3

    Hello, thanks for this video, it's really helpful as always :D.
    Quick question: why not use Surprise = 1 - P(x), since it also scales inversely with the probability and the surprise of a sure event is 0, as required? (A small sketch comparing the two options follows this thread.)

    • @statquest
      @statquest  2 years ago +1

      Maybe it doesn't make sense for Surprise to be 1 when the probability is 0.

    • @AAA-tc1uh
      @AAA-tc1uh 2 years ago

      @@statquest I would expect a deeper answer than that, since the [0,1] range can be scaled by any constant to give an arbitrarily large surprise value to probability 0. It's just that the function would be linear.

    • @statquest
      @statquest  2 years ago

      @@AAA-tc1uh Well, then you're stuck with figuring out what that constant should be. Infinity? But that kind of opens another can of worms, because anything times infinity is infinity. Thus, another advantage of using the log function is that the limit of log(x) as x goes to zero is -infinity, so the surprise, log(1/p), naturally goes to infinity as the probability goes to zero.

    • @AAA-tc1uh
      @AAA-tc1uh 2 years ago

      @@statquest Sure, I understand, but my rebuttal would be: we already use the [0,1] range for the probability distribution, with 0 = "would never happen" and 1 = "always happens" (not entirely correct, I know, e.g. continuous distributions), so in the same way we could treat a Surprise value of 0 as "no surprise at all" and 1 as "maximum surprise". And we would have a nice, well-behaved range with no infinities or undefined behavior.
      Skimming Shannon's original paper, I see he argues for the use of the logarithmic function in the opening paragraphs but never provides a really deep reason other than convenience and practicality in engineering usage (another point for the linear function suggested above). Edit: the real reason is the characterization of such a function, see en.wikipedia.org/wiki/Entropy_(information_theory)#Characterization, which is apparently only satisfied by the entropy function in this form, using logarithms, as proved by Shannon.

    • @statquest
      @statquest  2 years ago

      @@AAA-tc1uh Nice!
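To complement this thread, here is a minimal Python sketch (my own illustration, not from the video or from Shannon's paper) of one property that favors log(1/p) over 1 - p as a definition of surprise: with the log version, the surprise of two independent events is the sum of their individual surprises; with the linear version, it is not.

```python
import math

def log_surprise(p):
    # Surprise as defined in the video: log2(1 / p).
    return math.log2(1.0 / p)

def linear_surprise(p):
    # The alternative proposed in the question above: 1 - p.
    return 1.0 - p

# Two independent events, e.g. heads twice in a row with P(heads) = 0.5 each.
p1, p2 = 0.5, 0.5
p_both = p1 * p2  # probability that both events happen

# The log surprise of "both" equals the sum of the individual surprises...
print(log_surprise(p_both), log_surprise(p1) + log_surprise(p2))           # 2.0 2.0
# ...but the linear version does not add up.
print(linear_surprise(p_both), linear_surprise(p1) + linear_surprise(p2))  # 0.75 1.0
```

This additivity for independent events is essentially the property in the characterization linked above that forces the logarithmic form.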

  • @varuntejkasula748
    @varuntejkasula748 11 months ago +1

    Absolutely clear. Can't expect a clearer explanation than this.

    • @statquest
      @statquest  11 months ago

      Glad you think so!

  • @DubZenStep
    @DubZenStep 2 years ago +2

    The world needs an army of people like you, man. This explanation is outstanding. A triple bam.

  • @cls8895
    @cls8895 2 years ago +1

    WOW, it's SUPER EASY and well explained!! I had only known about entropy in physics, but now I can see how entropy is calculated. THANK YOU for your hard work on making it easy to understand, from S. Korea!

    • @statquest
      @statquest  2 years ago

      Hooray! I'm glad the video was helpful! :)

  • @eusha54
    @eusha54 2 months ago +1

    You are really great, man!
    You explained this so easily. Loved it!

  • @dohasan5053
    @dohasan5053 1 year ago +1

    Hello! Your videos are amazing, and you do a great job of simplifying difficult concepts and making them easier to understand. Thank you so much.

  • @mohammedlabeeb
    @mohammedlabeeb 2 years ago +1

    Really great video. Right to the point. I met with one of my coworkers, who is very seasoned in data science, to help me work on a project and use entropy for the first time. After one hour I was as confused as I could be, but this video really helped. I wish I had seen it before my meeting.

  • @jiaweizhang6189
    @jiaweizhang6189 2 years ago +1

    I have studied ML for a long time, but I didn't clearly know what cross-entropy or entropy was until now. This is the best explanation of entropy!

  • @crackedatcurry
    @crackedatcurry 6 days ago +1

    This man deserves a prize for how well he taught this. BAMMMM!!!!

  • @user-su7id4xe2d
    @user-su7id4xe2d 10 months ago +1

    The most complete video I've ever seen on this platform : ) I subscribed😎

  • @forresthu6204
    @forresthu6204 2 years ago +1

    This is the BEST version of the explanation about entropy.

  • @KleineInii
    @KleineInii 2 years ago +1

    Thank you so much for sharing this great explanation with us! I stopped the video after you derived the formula and then derived it again on my own. It makes complete sense!!!
    I am giving a talk at a conference in 2 weeks, and in my presentation there is a formula using mutual information. I was asked to explain it in my practice presentation and was not able to. Now, after seeing your video, I am clear about the concept of entropy and feel much more confident about explaining it :)

    • @statquest
      @statquest  2 years ago

      Good luck with your presentation! BAM! :)

  • @michaelzavin969
    @michaelzavin969 2 years ago +1

    Just wow!
    I've watched my prof's lecture (1.5 h long) 3 times and did not understand anything,
    and then you come along with a 15-minute video and BAM and medium BAM!! and I finally got it.
    THANK YOU!!!

  • @kaushaljani6769
    @kaushaljani6769 2 years ago +1

    Bam!!! Hats off, man, that was the easiest explanation I've ever come across...
    Thanks for making these kinds of tutorials.

  • @thegt
    @thegt 10 months ago +1

    Simply amazing... I have been using cross-entropy for months, and only now do I understand where the word "entropy" in "cross-entropy" comes from.

  • @emsdy6741
    @emsdy6741 2 years ago +1

    DOUBLE BAM! Thanks for the video.
    I liked how you derived the formula of entropy, and now it is easier to understand.

  • @devrus265
    @devrus265 1 year ago +1

    This is by far the best explanation of entropy I've heard.

  • @shahf13
    @shahf13 2 years ago +1

    YAY I have been waiting for a good explanation of entropy for a long time

  • @Stilzel
    @Stilzel 6 months ago +1

    Josh, thank you so much for your videos, you are a GREAT teacher. I wish you all the best, and thank you again!

    • @statquest
      @statquest  6 months ago

      Thank you very much! :)

  • @muskanmahajan04
    @muskanmahajan04 2 years ago +1

    By far the best explanation I've seen. You are a true saviour.

  • @SonSon-rq5dj
    @SonSon-rq5dj 2 years ago +1

    Solid video, solid explanation. Best channel out there for your data mining needs

  • @anibaldk
    @anibaldk 2 years ago +1

    Priceless channel for anyone interested in statistics. Just BRILLIANT.

  • @shaahinfazeli9095
    @shaahinfazeli9095 2 years ago +1

    You are truly amazing at simply explaining complicated things!

  • @viethoalam9958
    @viethoalam9958 2 months ago +1

    This is so smooth and easy to understand, connecting "surprise", an emotion, with a mathematical quantity, a number.

  • @jinyunghong
    @jinyunghong 2 years ago +2

    The best explanation of entropy! Thank you so much as always :)

  • @leassis91
    @leassis91 1 year ago +1

    This is gold, bro. One of your best videos.

  • @guliyevshahriyar
    @guliyevshahriyar 4 months ago +1

    This is phenomenal work for the ENTIRE field of DATA SCIENCE! Thank you a lot.

    • @statquest
      @statquest  4 months ago

      You're very welcome!

  • @GregSom
    @GregSom 4 months ago +1

    Simple, elegant and funny. Just perfect. Thanks!

  • @bushraw66
    @bushraw66 2 months ago +1

    I can't believe this guy made entropy fun and understandable. The intro song really lowered my anxiety about passing my exam. Thank you so much for your content.

  • @pradyumnagupta3989
    @pradyumnagupta3989 2 years ago +1

    Awesome, awesome video. The best explanation of entropy I have seen so far!

  • @sumitmishra8449
    @sumitmishra8449 2 years ago +2

    Thank you, Josh, you literally are the best teacher out there. I got a job as a Data Analyst, and I only watched your videos for all the explanations and understanding. Made a lot of notes as well. Sincerely, thank you.
    PS: The first thing I'm gonna do with my salary is buy a membership!! Infinite Bam!!

    • @statquest
      @statquest  2 years ago +2

      Congratulations!!!! TRIPLE BAM!!! :)

  • @ArmanAli-ww7ml
    @ArmanAli-ww7ml 2 years ago +1

    I can’t thank you enough for your efforts and your way of teaching.

  • @azamatbagatov4933
    @azamatbagatov4933 2 years ago +1

    I am surprised how easily understandable entropy is! Thanks!

  • @vaibhavnakrani2983
    @vaibhavnakrani2983 6 months ago +1

    I bow down to you sir. It was truly amazing in a simple way.

  • @kuchuksary
    @kuchuksary 6 months ago +1

    This is the beeeeeeeeeest explanation of entropy! THANK YOU!!

  • @nirmithrjain6265
    @nirmithrjain6265 10 months ago +1

    Seriously, you are the best teacher I have ever had

  • @victorcoulon9375
    @victorcoulon9375 2 years ago +2

    An outstanding explanation built on simplicity. Huge thanks!

    • @statquest
      @statquest  2 years ago +1

      Thank you very much! :)

  • @surendrabarsode8959
    @surendrabarsode8959 2 years ago +1

    This is the best explanation of entropy I have ever seen!! The real surprise is the totally innovative idea of 'surprise'! Thanks, with entropy of zero!!!

  • @minhuc-08tran90
    @minhuc-08tran90 11 months ago +1

    What a useful video, I really like the layout and content of the video. It's easy to understand. This channel is really cool ❤❤

    • @statquest
      @statquest  11 months ago

      Thank you so much!

  • @ralphchien184
    @ralphchien184 1 year ago +1

    This is the most excellent explanation I have ever seen. Impressive! Impressive! Impressive! Three times I must give it. Thanks a lot!

  • @namesurname1040
    @namesurname1040 1 year ago +1

    It was just amazing. Thank you for that video! It really helped me.

  • @pacco2012
    @pacco2012 2 years ago +1

    Wow! Your explanation saved my butt big time! Probably the best one on this platform.

  • @alialthiab7527
    @alialthiab7527 1 year ago +1

    You are awesome. I finally understood the entropy concept without any equations.
    Big love😍

  • @erv993
    @erv993 2 years ago +2

    This is a brilliant explanation! It all makes sense now!

    • @statquest
      @statquest  2 years ago

      Thank you very much! :)

  • @deschryverjan
    @deschryverjan 2 years ago +1

    No surprise here: yet another amazing video! Great explanation!

  • @alinazem6662
    @alinazem6662 8 months ago +1

    Your videos are as straightforward as Y = mX. Thanks, Josh.

    • @statquest
      @statquest  8 months ago

      Glad you like them!

  • @dabestpilot4157
    @dabestpilot4157 3 months ago +1

    Wonderful video, sending this to my group that's supposed to be giving a presentation on this next week.

    • @statquest
      @statquest  3 months ago

      Good luck with the presentation!

  • @dafeiwang7797
    @dafeiwang7797 2 years ago +1

    The easiest-to-understand teacher I have ever seen 👨‍🏫 Great

  • @hemanbassi12
    @hemanbassi12 2 years ago +1

    Sweet and perfect explanation, this was so insightful. thank you.

  • @christiansiemes9902
    @christiansiemes9902 2 years ago +1

    Great video! For the first time, I understood what entropy means. Thanks!

  • @pingmelody8609
    @pingmelody8609 1 year ago +1

    You are so talented! I'm so thankful the YouTube recommendation system guided me to your videos; it's a whole new world! Every data scientist should watch your videos!!!! Bam!!

  • @lourencopintodasilva4321
    @lourencopintodasilva4321 2 years ago +1

    The best explanation of anything I've ever seen. Just subscribed to the channel.

  • @cristianofroes4681
    @cristianofroes4681 2 years ago +1

    Hello from Brazil, thank you very much for this (the best so far) explanation.... I'm really impressed by how simply you explain the whole thing. Keep going.... God bless you.

    • @statquest
      @statquest  2 years ago +1

      Thank you very much!

  • @sumitpaliwal1540
    @sumitpaliwal1540 2 years ago +1

    StatQuest: super easy explanations to complex problems. BAM!!!

  • @chenmarkson7413
    @chenmarkson7413 4 months ago +1

    You might like to know that I am sharing this video with my whole class of CSC311 Introduction to Machine Learning at the University of Toronto.
    You are doing phenomenal work explaining concepts in such an intuitively understandable way! Hugest thanks!

    • @statquest
      @statquest  4 months ago

      Thank you very much! I'm so happy the video is helpful! :)

  • @PedroRibeiro-zs5go
    @PedroRibeiro-zs5go 2 years ago

    Hi Josh, you're the BEST!!! Would love a subsequent video on the KL-Divergence!

    • @statquest
      @statquest  2 years ago

      Me too. Hopefully soon.

  • @amisadaisanchez1897
    @amisadaisanchez1897 1 year ago +1

    Best video ever, never saw entropy that way.