Support Vector Machines Part 3: The Radial (RBF) Kernel (Part 3 of 3)

  • Published 3 Nov 2019
  • Support Vector Machines use kernel functions to do all the hard work and this StatQuest dives deep into one of the most popular: The Radial (RBF) Kernel. We talk about the parameter values, how they calculate high-dimensional coordinates and then we'll figure out, step-by-step, how the Radial Kernel works in infinite dimensions.
    NOTE: This StatQuest assumes you already know about...
    Support Vector Machines: • Support Vector Machine...
    Cross Validation: • Machine Learning Funda...
    The Polynomial Kernel: • Support Vector Machine...
    ALSO NOTE: This StatQuest is based on...
    1) The description of Kernel Functions and associated concepts on pages 352 to 353 of An Introduction to Statistical Learning (with Applications in R): faculty.marshall.usc.edu/garet...
    2) The derivation of the infinite dot product is based on Matthew Bernstein's notes: pages.cs.wisc.edu/~matthewb/pa...
    For a complete index of all the StatQuest videos, check out:
    statquest.org/video-index/
    If you'd like to support StatQuest, please consider...
    Buying The StatQuest Illustrated Guide to Machine Learning!!!
    PDF - statquest.gumroad.com/l/wvtmc
    Paperback - www.amazon.com/dp/B09ZCKR4H6
    Kindle eBook - www.amazon.com/dp/B09ZG79HXC
    Patreon: / statquest
    ...or...
    UA-cam Membership: / @statquest
    ...a cool StatQuest t-shirt or sweatshirt:
    shop.spreadshirt.com/statques...
    ...buying one or two of my songs (or go large and get a whole album!)
    joshuastarmer.bandcamp.com/
    ...or just donating to StatQuest!
    www.paypal.me/statquest
    Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
    / joshuastarmer
    #statquest #SVM #RBF
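As a quick companion to the description above, here is a minimal sketch (not from the video; the gamma value is an arbitrary choice) of the Radial Kernel the video discusses, K(a, b) = exp(-gamma * (a - b)^2), which behaves like a similarity score between two observations:

```python
import math

def radial_kernel(a, b, gamma=0.5):
    """Radial (RBF) kernel value between two 1-D observations.
    Identical observations score 1; distant observations score near 0."""
    return math.exp(-gamma * (a - b) ** 2)

print(radial_kernel(1.0, 1.0))  # identical points -> 1.0
print(radial_kernel(1.0, 4.0))  # distant points -> near 0
```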

COMMENTS • 595

  • @statquest
    @statquest  2 years ago +7

    Support StatQuest by buying my book The StatQuest Illustrated Guide to Machine Learning or a Study Guide or Merch!!! statquest.org/statquest-store/

  • @priyangkumarpatel9317
    @priyangkumarpatel9317 3 years ago +197

    You make statistics and machine learning so much fun. Your channel is binge-watch worthy. Keep spreading good education in a fun way. :)

  • @hawaiicashew3237
    @hawaiicashew3237 2 months ago +9

    Holy cow, this one is flying high. The guy who figured out all the maths must have been on fire!

  • @yuqing3667
    @yuqing3667 2 years ago +3

    Now we can eat snacks! Thank you so much, your visual explanation makes things so much easier to understand.

    • @statquest
      @statquest  2 years ago +1

      Glad it was helpful!

  • @chemgreec
    @chemgreec 3 years ago +41

    Excellent! There is a higher-dimensional space where this video is linearly separable from anything else on youtube. What I love is that you use both math and intuition in good measure. You don't sacrifice intuition over math or math over intuition like most other attempts. This balance you've got here is excellent.

  • @ayoubmarah4063
    @ayoubmarah4063 4 years ago +131

    I bet there is a place in heaven named statquest where you're gonna live an eternal life

    • @statquest
      @statquest  4 years ago +3

      Thank you very much!!!! :)

    • @philipkopylov3058
      @philipkopylov3058 3 years ago +16

      * a place between heaven and earth with the biggest margin possible

    • @bnglr
      @bnglr 1 year ago

      @@philipkopylov3058 psst, in a flat affine subspace of dimension 2

  • @EtherealMetal
    @EtherealMetal 4 years ago +158

    This channel is so amazing. For the past few months I have been trying to catch up on concepts in statistics that my university never taught, so that I have enough knowledge to go into the data science and machine learning fields.
    The way you teach concepts in clear, concise and short videos is extremely valuable. I have learned much in such a short time from just watching your videos and taking handwritten notes. Thank you for all the hard work you have put into delivering this invaluable knowledge! Please continue making videos!

    • @statquest
      @statquest  4 years ago +8

      Thank you very much! :)

    • @arunavsaikia2678
      @arunavsaikia2678 4 years ago +1

      @@statquest Hey Josh, what would be an intuitive way to understand how SVM uses the high-dimensional relationship between each pair of points to make the actual classification?

    • @statquest
      @statquest  4 years ago +8

      @@arunavsaikia2678 This is a good question. The dot product between two points, which we use to create the "high-dimensional relationships," can be interpreted in a geometric way that includes the cosine of the angle between the two points multiplied by their magnitudes (the distances from the origin). With that in mind, check out slide 18 in this PDF: web.mit.edu/6.034/wwwbob/svm.pdf
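The geometric reading in the reply above can be checked with a tiny numeric example (the vectors below are made up for illustration):

```python
import math

a = [3.0, 4.0]
b = [4.0, 3.0]

dot = sum(x * y for x, y in zip(a, b))        # 3*4 + 4*3 = 24
mag_a = math.sqrt(sum(x * x for x in a))      # distance of a from the origin: 5
mag_b = math.sqrt(sum(x * x for x in b))      # distance of b from the origin: 5
cos_theta = dot / (mag_a * mag_b)             # cosine of the angle between a and b

# dot product = |a| * |b| * cos(angle between a and b)
print(dot, mag_a * mag_b * cos_theta)
```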

    • @Illinoise888
      @Illinoise888 4 years ago +1

      My university skimmed over RBF, but then had a 15-marker about it on the midterm. Now I'm studying for finals, and wish I had this video for the midterm.

    • @This_isNotAThing
      @This_isNotAThing 3 years ago

      Yes please

  • @RM-zx9ee
    @RM-zx9ee 2 years ago +27

    This video deserves an Oscar. Seriously, that was incredible. Infinite BAM!

  • @SometimesMagii
    @SometimesMagii 1 year ago +5

    The mathematical reasoning behind the radial kernel has been plaguing me for so long; finally, after many tries, it starts to click and my mind can better visualize what is happening and why. Thank you so much :)

  • @hoangtrunghaipham5999
    @hoangtrunghaipham5999 4 years ago +13

    I have gone through 85% of the full list and found this series extremely useful. The instructions are simple to understand and give a sufficient overview of machine learning. Highly recommended for starters like me. Looking forward to the advanced parts, e.g. deep learning. Many thanks!

    • @statquest
      @statquest  4 years ago +6

      SVM and the Radial is actually pretty advanced, so you've made huge progress. The current series, on XGBoost, is also very advanced. After this, I'll do deep learning and neural networks.

  • @kk___kk___kk
    @kk___kk___kk 1 year ago +5

    I find some beep-boop sounds a bit cringe, but it's crazy how good you are at explaining and showing things step by step. Thank you so much!

  • @tampopo_yukki
    @tampopo_yukki 10 months ago +2

    When I was struggling to understand intuitively what kernels are for, I found this video in the StatQuest series. Now it seems to have fixed my shaky comprehension of the kernel! Your video series is one of my favorite explanations of the basics of ML. I'd be so glad if you kept making these kinds of interesting videos at your own pace. BAM!

    • @statquest
      @statquest  10 months ago

      Thank you very much! :)

  • @kartikbhanot4692
    @kartikbhanot4692 3 years ago +4

    I was finding it hard to understand the concept of RBF and this video helped me immensely. Thank you Josh for the amazing work that you are doing.

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @akrylic_
    @akrylic_ 4 years ago +12

    What an interesting application of the Taylor Series. Such a beautiful explanation, thank you!

    • @statquest
      @statquest  4 years ago +1

      This is actually the 3rd place I've seen the Taylor Series in Machine Learning - so it's a super useful trick.
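For readers curious how the Taylor Series trick plays out in code, this sketch (illustrative values; the feature map Φ(x) = (1, x, x²/√2!, x³/√3!, ...) follows the video's derivation) shows a truncated version of the infinite-dimensional dot product converging to e^(ab):

```python
import math

def phi(x, n_terms=20):
    # truncated infinite-dimensional feature map:
    # (1, x, x^2 / sqrt(2!), x^3 / sqrt(3!), ...)
    return [x ** n / math.sqrt(math.factorial(n)) for n in range(n_terms)]

a, b = 0.7, 1.2
dot = sum(pa * pb for pa, pb in zip(phi(a), phi(b)))
print(dot, math.exp(a * b))  # the truncated dot product closely matches e^(ab)
```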

  • @jeongwonkim247
    @jeongwonkim247 4 years ago +1

    The best machine learning statistics video. I came here confused from a Coursera data science course taught by U-Mich faculty, and this video does it 100000x better. Thank you so much!

    • @statquest
      @statquest  4 years ago

      Awesome!!! I'm glad my video was helpful. :)

  • @yulinsun8873
    @yulinsun8873 2 years ago +10

    Infinite Bam! This is the most understandable ML video I ever watched. Thank you for sharing this.

  • @nabeelhasan6593
    @nabeelhasan6593 3 years ago +3

    Your videos are like magic, making such a difficult derivation look so easy. God bless you

  • @ketkiambekar7607
    @ketkiambekar7607 3 years ago +4

    Thank you for making one of the best videos out there for understanding SVM (and log-likelihood maximization, and countless other concepts). I am going to make a good contribution to your Patreon once I start earning, because you so deserve it, omigosh.

  • @sinkseeker
    @sinkseeker 8 months ago +1

    The way you explain the math is astounding! I hope you'll continue making videos like this!

    • @statquest
      @statquest  8 months ago +1

      Thanks, will do!

  • @gautamdawar5067
    @gautamdawar5067 3 years ago +3

    A beautiful video, I had tears of joy after watching this. Sir you are amazing!

  • @tonysand69
    @tonysand69 3 years ago +2

    Oh man, thank you for your videos. I mean, you're really awesome. You don't only explain the concepts, you also keep it real and fun. I have learned a lot from you; when I have money I will donate every penny of it.

  • @hashiska.5358
    @hashiska.5358 4 years ago +1

    Nobody explains the concepts better than you do. I have to study ML for a project and I haven't found a channel better than yours. That is why I have a request: please make a video on Support Vector Regression.

  • @tanmaymehta9696
    @tanmaymehta9696 1 year ago +2

    Thank you so much for making all these ML and stats terms so understandable! Great work!

  • @mustufakerawala1796
    @mustufakerawala1796 4 years ago +3

    I love your videos!!! I understand this content better than from my data science lecture at uni. I hope you keep up the great work; I'm officially gonna get some StatQuest merch to support this channel.

  • @dome8116
    @dome8116 4 years ago +3

    Thanks for creating this amazing video. After watching the lecture on RBF from Caltech I was so lost and felt so bad, since it was the first concept that I didn't understand at all. Your video gave some good intuition for why it works and how. Thank you Statquest :D

  • @jiayiwu4101
    @jiayiwu4101 3 years ago +2

    Such a nice, crystal-clear explanation!! Awesome job!!!

  • @clapdrix72
    @clapdrix72 4 years ago +1

    One of the most clearly explained proofs I've seen in a while

    • @statquest
      @statquest  4 years ago

      Thank you very much! :)

  • @wangwenyu9052
    @wangwenyu9052 4 years ago +3

    Thanks for the wonderful video! It really helps both in forming the intuition and in connecting key math concepts together!

  • @mypure
    @mypure 2 years ago +2

    Watching Josh feels like being the Flash of statistical concepts. Every concept skipped in stats class seems to become crystal clear here.

  • @xlmentx
    @xlmentx 3 years ago +4

    Damn, good job dude. At first I felt like I was being talked down to, but eventually grew to like it lol. You're way better at teaching this stuff than my professor is.

    • @statquest
      @statquest  3 years ago +1

      Glad you like it. I try to teach the way I want to be taught myself. I'm not super good at this stuff, so I try to keep it simple.

  • @prashant4814
    @prashant4814 4 years ago +4

    wow, god bless you
    we need good teachers like you

    • @statquest
      @statquest  4 years ago +1

      Thank you very much! :)

  • @cheshtagupta7491
    @cheshtagupta7491 3 years ago +1

    this video put an instant smile on my face

  • @user-ee3qt2fb7i
    @user-ee3qt2fb7i 1 year ago +1

    Every time I watch your visualized explanations, I'm just amazed

  • @akashjain35
    @akashjain35 4 years ago +3

    This video and in fact the whole playlist of machine learning is so amazing. Your way of teaching makes it so easy to understand the mathematics behind these concepts. Don't ever stop making these videos!

    • @statquest
      @statquest  4 years ago +1

      Thank you very much!!!! :)

  • @andersoneduardo749
    @andersoneduardo749 1 year ago +2

    you are the best math professor I've ever had.. thanks a lot!!

  • @richf.9211
    @richf.9211 2 months ago +2

    This man just answered questions I didn’t even know I had!😂 Excellent job thank you for the videos!

  • @skelgamingyt
    @skelgamingyt 6 months ago +4

    Please take our professor's job. We need you.

  • @winstonzhang6352
    @winstonzhang6352 2 years ago +1

    Amazing video, this saved me for my ML midterm. THANK YOU.

  • @user-zy8sf7tv2f
    @user-zy8sf7tv2f 2 years ago +1

    Hey man, I have to thank you a lot for describing these things so well!
    Thank you very much!

  • @lauriesteveescoses590
    @lauriesteveescoses590 1 month ago +1

    holy shit, i didn't expect a series expansion to come in at the end. so cool

  • @maverickop4134
    @maverickop4134 3 years ago +1

    This explanation made it look almost too easy. Good job. Thanks for making this video.

  • @harithagayathri7185
    @harithagayathri7185 4 years ago +1

    Very clear explanations, and far better than the videos on Udemy!!

  • @Hermioneswand1
    @Hermioneswand1 2 years ago +1

    I love this channel. Much love! :)

  • @muskankhajuria1038
    @muskankhajuria1038 3 years ago +1

    Amazing teaching! Thank you sooo much!

    • @statquest
      @statquest  3 years ago +1

      Glad it was helpful!

  • @tymothylim6550
    @tymothylim6550 3 years ago +2

    Thank you very much for this video! I learnt a lot from this step-by-step math guide! Great to eat snacks too!

  • @ilyabykov2437
    @ilyabykov2437 1 year ago

    If I ever get a job in data science, it'll be thanks to this guy.

  • @jinyunghong
    @jinyunghong 2 years ago +1

    This is such a great lecture!!

  • @trenton7
    @trenton7 3 years ago +2

    The initial singing and the double, triple, quadruple bams grow on you. I didn't like them much at first, but they are now an essential part of the learning experience for me.

  • @samyadeep906
    @samyadeep906 4 years ago +1

    This was really good, thank you ♥️

  • @KiranDhakal8848
    @KiranDhakal8848 2 months ago +1

    As I continue watching your video the satisfaction of understanding BAMSS EXPONENTIALLLY!!!

  • @feynmanc303
    @feynmanc303 2 years ago +1

    thank you .. you make things super easy to understand.. amazingly good

  • @peterng.
    @peterng. 1 year ago +1

    worth spending time on! thank you Josh!

  • @buh357
    @buh357 3 years ago +5

    wow
    wow
    wow,
    the relationship between two objects in infinite dimension.
    absolutely beautiful and amazing. thanks for ML and you :)

  • @perstarke1295
    @perstarke1295 3 years ago +1

    Wow. Just WOW. Hella good explanation!

  • @sakshambali3040
    @sakshambali3040 3 years ago +1

    Just amazing stuff, man. God bless you, love from India..!!

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @abir95571
    @abir95571 3 years ago +1

    Not gonna lie, I have read a few other books trying to understand how RBF computes the relationship between data points in infinite dimensions... none of them is as simple and comprehensive as your video.
    Thanks a lot

  • @umeshphadke6745
    @umeshphadke6745 3 years ago

    This channel is absolute gold! Thanks for your help, mate, and you should also consider teaching mathematics.

    • @statquest
      @statquest  3 years ago +1

      I'll keep that in mind! :)

  • @fatihboyar9697
    @fatihboyar9697 1 year ago +1

    the calculation noises are so realistic, and it's a horizon-widening experience

  • @devindoinmonkmode
    @devindoinmonkmode 8 months ago +1

    3-hour lectures in 15 minutes, and it's super funny. Super BAM for StatQuest

  • @siddharthpilli62
    @siddharthpilli62 3 years ago +1

    I love the videos you make. Keep up the good work!! BAM!!

  • @ritshpatidar
    @ritshpatidar 6 months ago +1

    The guy who came up with RBF is a genius.

  • @toxic_narcissist
    @toxic_narcissist 3 years ago +2

    but how underrated is this video

  • @parthsharma8269
    @parthsharma8269 1 year ago +1

    What are you, Josh? Clear - Done. Concise - Done. Amazing - Done. Infinite BAM!!

  • @boranzhou937
    @boranzhou937 4 years ago +1

    thanks, man! this is really helpful!

  • @MsIHateMiley
    @MsIHateMiley 3 years ago +2

    god bless, this channel is amazing

  • @shauryavatsa595
    @shauryavatsa595 7 months ago +1

    I wish I could be taught by you in person. I know nothing about machine learning and I am going through some of the topics for my internship, and I cannot tell you how easy you are making things for me. Quadruple BAMM!!

    • @statquest
      @statquest  7 months ago

      Bam! I'm glad you enjoy the videos. :)

  • @61_shivangbhardwaj46
    @61_shivangbhardwaj46 3 years ago +1

    No words for you sir
    You are great!

  • @filipfolkesson3865
    @filipfolkesson3865 11 months ago +1

    This was so funny and educational, thanks man

  • @xuemeiwang1881
    @xuemeiwang1881 3 years ago +1

    GREAT MAN, GREAT CHANNEL.

    • @statquest
      @statquest  3 years ago

      Thank you so much 👍

  • @williamobando4159
    @williamobando4159 3 years ago +1

    This blew my mind. Love you Josh, ty

  • @henryatehortua6149
    @henryatehortua6149 3 years ago +1

    I don't understand much English, but I can tell you have a lot of fun teaching.

    • @statquest
      @statquest  3 years ago

      Thank you very much! :)

  • @ahming123
    @ahming123 4 years ago +2

    OK, got what the dimension relationship means now. You're the best

  • @vncoolestguy
    @vncoolestguy 6 months ago +1

    +3 views, thanks for the awesome tutorials, Josh.

  • @jasonfaustino8815
    @jasonfaustino8815 2 years ago +4

    Can’t believe I used my knowledge on Taylor series expansion. Thanks for not wasting precious brain space for that

    • @statquest
      @statquest  2 years ago +1

      BAM! The Taylor series actually pops up a bunch in machine learning (Gradient Boost and XGBoost etc.)

  • @haochang4293
    @haochang4293 4 years ago +1

    I benefit so much from your videos. From a Chinese Ph.D.

  • @RH-zp1ry
    @RH-zp1ry 3 years ago +28

    15:18 I would say "INFINITE BAM!!!"

    • @statquest
      @statquest  3 years ago +7

      YES!

    • @paulovinicius5833
      @paulovinicius5833 3 years ago

      I was SO hoping for that to happen! hahaha I was expecting this part to be the largest BAM he ever did hahah

    • @statquest
      @statquest  3 years ago

      @Eyal Barazan I would recommend starting with the first video in this series: ua-cam.com/video/efR1C6CvhmE/v-deo.html

  • @himanshu1056
    @himanshu1056 2 years ago +1

    Awesome explanation 👍

  • @dougIas1
    @dougIas1 2 years ago +1

    That's soo good :'). Thanks man!!!

    • @statquest
      @statquest  2 years ago +1

      Glad you like it! :)

  • @yavdhesh
    @yavdhesh 4 years ago +1

    Namaste, you are the best person.

  • @hustler212
    @hustler212 2 years ago +1

    You're a life saver.

  • @mariafirulyova5020
    @mariafirulyova5020 4 years ago +2

    Dear Josh Starmer, thanks for all the amazing videos! Your channel is really helpful for me.
    Could you please explain how UMAP works? Also, a comparison of UMAP with t-SNE would be nice )

    • @statquest
      @statquest  4 years ago +2

      UMAP is on the to-do list.

  • @RanjeetSingh-pp4uu
    @RanjeetSingh-pp4uu 3 years ago +1

    LOVED IT! THANKS!!

  • @justgame5508
    @justgame5508 3 years ago +1

    Ahhh, thank you electronic engineering for having difficult mathematics. It makes it easy to branch out into more statistical domains such as machine learning and still keep up, and it also equips me with other techniques, such as Fourier and Laplace transforms, which can be useful in data analysis and feature extraction. Great derivation btw

  • @sabrinapatania7810
    @sabrinapatania7810 4 years ago

    Beautiful video! Thank you so much! I have a question: why can we use the Taylor series assuming a = 0? Why can we make this assumption?

  • @ManojGupta-tb3ei
    @ManojGupta-tb3ei 8 months ago +1

    Well explained
    Thank you....

  • @user-bz8nm6eb6g
    @user-bz8nm6eb6g 3 years ago +1

    Best explanation! Wow wow

  • @amishgoel2377
    @amishgoel2377 4 years ago +2

    Awesome video and clear explanations! I had one doubt. At the end, when forming the dot product for the RBF kernel, you used s to multiply the two dot products. But s is a function of both a and b. In the dot product of a pair of observations in a high dimension, each term in the product should be a function of one observation, since it corresponds to a high-dimensional feature of that observation. I think one should multiply e^(-0.5a^2) into the first term and e^(-0.5b^2) into the second term of the dot product.

    • @statquest
      @statquest  4 years ago +1

      All 's' is doing is scaling the dot-product by a constant value. For more details, see: pages.cs.wisc.edu/~matthewb/pages/notes/pdf/svms/RBFKernel.pdf
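The algebra behind that reply can be checked numerically: exp(-(a - b)²/2) = e^(-a²/2) · e^(-b²/2) · e^(ab), so the scaling constant s itself splits into one factor that depends only on a and one that depends only on b (the values below are arbitrary):

```python
import math

a, b = 1.5, -0.4

radial = math.exp(-0.5 * (a - b) ** 2)

# s factors into a part depending only on a and a part depending only on b
s = math.exp(-0.5 * a ** 2) * math.exp(-0.5 * b ** 2)

print(radial, s * math.exp(a * b))  # the two values match
```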

  • @jeevanraajan3238
    @jeevanraajan3238 4 years ago +9

    There is so much effort put into making these videos, and it has come out so well!!
    When you die... you'll leave behind a legacy and will be known as a legend!!

    • @statquest
      @statquest  4 years ago +1

      Thank you very much! :)

  • @joxa6119
    @joxa6119 2 years ago +1

    I just regret not finding your channel during my degree.

    • @statquest
      @statquest  2 years ago

      better late than never! :)

  • @DinaEl-Kholy--
    @DinaEl-Kholy-- 4 years ago +2

    Thank you! You made it so easy and just saved my course project 😂❤

  • @95019124
    @95019124 3 years ago +1

    you saved my grades in data mining and machine learning courses

  • @taotaotan5671
    @taotaotan5671 2 years ago +1

    I learned RBF from the Gaussian process, and it seems the idea of a "kernel" has numerous applications!

  • @leanneZzz08
    @leanneZzz08 3 years ago +2

    Thanks for the amazing video. Just one quick question. When calculating the relationship between two observations, the larger the distance between them, the smaller the high-dimensional relationship. You say it is because there is less influence between the two. But, as in the example, the red observations are spread on both sides of the green observations, even though they are in the same classification. According to the distance rule mentioned above, the high-dimensional relationship between two red observations lying on either side of the green observations will also be very small. How do you explain the weak relationship between observations in the same classification? Is that also because of weaker influence?

    • @statquest
      @statquest  3 years ago +2

      It's OK for two items from the same category to have a low value. The goal is to find a linear classifier in a high dimensional space that separates the two categories. If that means we have multiple clusters that represent the same category, that is OK as long as we can separate them from the other category.
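A small numeric illustration of that reply (made-up 1-D values, gamma = 1): the two red clusters barely "see" each other in kernel terms, and that is fine as long as each cluster is separable from the greens:

```python
import math

def radial(a, b, gamma=1.0):
    # Radial kernel value between two 1-D observations
    return math.exp(-gamma * (a - b) ** 2)

reds = [1.0, 1.2, 9.0, 9.2]   # two red clusters flanking the greens
greens = [5.0, 5.2]

print(radial(1.0, 1.2))   # within a red cluster: close to 1
print(radial(1.0, 9.0))   # across the two red clusters: effectively 0
print(radial(1.0, 5.0))   # red vs green: also tiny, so separation still works
```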

  • @wolfisraging
    @wolfisraging 4 years ago +2

    Just can't wait for more SVMs

    • @statquest
      @statquest  4 years ago

      Unfortunately, my next series of videos will be on XGBoost. If there is demand, I'll return to SVMs as soon as I can.

    • @gauravgogia9939
      @gauravgogia9939 4 years ago

      @@statquest When are you planning to upload XGBoost Video, your videos are awesome!!

    • @statquest
      @statquest  4 years ago +1

      XGBoost videos should start coming out in the next few weeks. The first step is to learn how regression trees are traditionally pruned. XGBoost uses a different method and we need to learn the traditional way to appreciate how XGBoost does it.

  • @harshitlamba155
    @harshitlamba155 2 years ago +1

    You are the Richard Feynman of this era!

  • @sy3002
    @sy3002 2 years ago +1

    @15:12, you should have a nuclear BAM!!!!!!! for such revelations. Awesome series, loved every part. Thanks for your good work.

  • @savvasnikolaosanastasiadis5245
    @savvasnikolaosanastasiadis5245 2 years ago +1

    you deserve 100m subscribers

  • @MominSaadAltafnab
    @MominSaadAltafnab 1 year ago +3

    "pipipu pipipu" hits every time 😂

  • @well....7751
    @well....7751 4 years ago +5

    I wanted to know what relationship we get after doing the dot product between the values, and could you do a video on support vector regression or give some links??
    (Great video though)

    • @adarshs6571
      @adarshs6571 3 years ago +1

      Even I have the same doubt about how the relationship is used further in classification.. please help

  • @ashishtiwari1912
    @ashishtiwari1912 4 years ago

    Thank you for the videos. I have learned a lot of things from your channel. What I would like to know is the scenarios where the SVM algorithm will fail. How do we make a relative comparison when choosing between different classification algorithms?

    • @statquest
      @statquest  4 years ago +1

      We can always use 10-fold cross validation to compare how different models perform: ua-cam.com/video/fSytzGwwBVw/v-deo.html
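For anyone who wants the mechanics of the k-fold split itself, here is a minimal pure-Python index splitter (a sketch only; the model fitting and scoring steps covered in the linked video are left out):

```python
def k_fold_indices(n_samples, k=10):
    """Yield (train, test) index lists for k-fold cross validation."""
    # spread any remainder across the first few folds so every sample is used
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        test_set = set(test)
        train = [i for i in range(n_samples) if i not in test_set]
        yield train, test
        start += size

folds = list(k_fold_indices(25, k=10))
print(len(folds))                            # 10 folds
print(sum(len(t) for _, t in folds))         # 25: every sample is tested exactly once
```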