That was awesome. Everyone always jumps to neural networks, but often SVMs do the same classification work with no huge iterative training required and are simply a better solution much of the time. Thanks for explaining the mathematical basis behind their power, and particularly the Gaussian kernel function: so simple and yet such amazing results.
It hurts when you don't even understand this kind of explanation 😭
Don't feel ashamed; it's not for someone who doesn't know SVMs at all.
yeah I'm crying rn
Best video on YouTube about this topic! You explained everything that others pushed to the side with "this is beyond the scope of this video"!
Thanks
This is the only one I found that tries to explain it mathematically, thanks a lot!
So concise, so intuitive, yet mathematically in-depth as well! Tysm!
I understood the whole SVM concept from just one video, and with visualisations too.
Wow! Definitely one of the clearest intuitions for SVMs available on the internet. Thank you, sir.
I love you, man! I went through several explanations, books, guides but couldn't understand the underlying concept. You made it crystal clear! Thank you very much!
This is one of the clearest explanations in 15 minutes!
First video on YouTube about SVM that gathers together everything you need to know about this algorithm.
This video fully covers the scope of the SVM lecture at my university, something other videos don't even come close to. Perfect!
I did not understand this video the first time I watched it, but then I took a pen and paper and did the proof along with him. Now I understand the maths behind it. Thank you.
This is an amazing video. Usually these YouTube videos leave out the math and focus on the animation. But clearly the mathematician/animator of this video is very talented.
I really love the graphics and all the animation going on; it helped a LOT with my ADHD, and it gives me hope seeing people like you go this far to make learning enjoyable!
This is the most amazing explanation of SVMs on the entire internet.
Thank you ❤️
You deserve like 1 million subscribers for your content, I'm telling you!
Thanks!!
The math is soooo well explained. I think the pacing is hard without prior knowledge, but you explained it so damn elegantly! Thank you!
This was the best and quickest SVM video.
Oh my god, this video is so well made it's actually mind blowing.....
All of that in 15 minutes. My mind is blown 😲 Thanks for the video
The best SVM explanation video I have ever watched
Getting even better with every upload!
Thanks!
Intuitive Machine Learning, do you have some further recommendations for learning about ML? Thanks!
Stay tuned to our channel 😊
HATS OFF TO YOU. THE BEST VIDEO I HAVE EVER SEEN ABOUT SVM. THANK YOU
The best machine learning channel!
Thanks! 😊
I am glad I found this channel
After watching the Support Vector Machine lecture from MIT Professor Patrick Winston, it makes so much sense! One of the best explained videos on SVM!
Although I think there is a typo in the Lagrangian equation: the sign of the first term SUM(alpha_i) should be "+" instead of "-". Otherwise perfect!
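For reference (a reconstruction, assuming the video follows the standard hard-margin formulation): expanding the constraint term of the primal Lagrangian does give the standalone sum of multipliers a plus sign,
L(\mathbf{w}, b, \boldsymbol{\alpha}) = \tfrac{1}{2}\|\mathbf{w}\|^2 - \sum_i \alpha_i \big[ y_i(\mathbf{w}\cdot\mathbf{x}_i + b) - 1 \big] = \tfrac{1}{2}\|\mathbf{w}\|^2 - \sum_i \alpha_i\, y_i(\mathbf{w}\cdot\mathbf{x}_i + b) + \sum_i \alpha_i .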
underrated video, too good thank you so much
That was epic!! Please make more of these
Thanks very much!
I hate that this channel is underrated
Great job. It must have taken a lot of time. Thanks a lot!
Great explanation, sir, I love the way of teaching. Kindly make videos on all ML and deep learning algorithms 🤩
I love that you did not leave out the math part
This is one of the best videos ever
This is super cool! Best explanation!
Thanks very much!
This is Great, so simple explanation. Thank you
thanks!
Extremely high quality content! I've already shared it on LinkedIn... By the way, what software do you use for graph animations? Keep it up!
We use Adobe Premiere for video editing
@IntuitiveMachineLearning Adobe Premiere for graph animations too?
You are doing great man👍
Keep going ❤
This video helps me sleep at night.
Thank you for the nice video. I wanna contribute to the world like you do. Wishing you happiness!
Fantastic explanation!! Such a great representation of the concepts. Thank you so much!
I love this channel!
Thanks!
Clear explanations. I encourage you to produce more content. :)
Bro u r doing great keep it up
thanks
This is a very high quality video.
Well done Sir, thanks for the effort 🤘
Very helpful video. Thank you!
Way to go.. It's simply awesome..
thanks!
Thank you for making this video
I felt lucky after watching this video
Please teach the maths behind ML. Thank you
Thanks for the brilliant video!
giau
Amazing, I have to say: thank you very much!
Love this video! Only one question. Is the partial derivative ∂L/∂w the same as the gradient ∇w L?
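A short note on that, assuming the usual notation: yes, ∂L/∂w here denotes the gradient of L with respect to the vector w, i.e. ∇_w L. For the hard-margin Lagrangian, setting it to zero gives the stationarity condition
\nabla_{\mathbf{w}} L = \mathbf{w} - \sum_i \alpha_i y_i \mathbf{x}_i = 0 \;\Longrightarrow\; \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i .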
Amazing, thanks, buddy!
At 9:07, I think there's a problem with the signs? Or probably missing brackets.
Simply Brilliant!!!
Great job!
Good content
I LOVE THE VIDEO. It's better than my professor LOL
Great potential, but throwing around important mathematical concepts (the ones that actually build the intuition) as if they were trivial is careless at best. Either the channel should then be called non-intuitive ML, or you have to expand on things like "for mathematical convenience", that is, convexity plus computational/numerical optimization, which are far more important to emphasize for intuition-building than the technical use of Lagrange multipliers.
At 9:12, how come −Σ α_i · (−1) = −Σ α_i?
I think there are multiple places where the signs are not correct around that time mark. One more thing: around these derivations, I think it is better not to use so much animation, or to run the animation more slowly; it is too difficult to follow clearly. But the content of the video is very high quality. Thanks a lot for sharing it.
How do you extend support vector machines to multi-label classification?
Thanks a lot!
Great video, but at the beginning you say that SVM is a nonlinear model when it is a linear one.
The art style is similar to a comic I've read. Does anyone know what it is?
It's a great video, but I didn't understand how (2/||w||) is equal to 0.5*||w||^2. TIA
Those are not equal. We want to maximize 2/||w|| (this is the distance between H1 and H2). Maximizing this is equivalent to minimizing ||w||, which is equivalent to minimizing 0.5*||w||^2. :)
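Spelled out (assuming the hard-margin setup from the video): since \|\mathbf{w}\| \ge 0 and t \mapsto \tfrac{1}{2}t^2 is strictly increasing on [0, \infty), the three problems
\max_{\mathbf{w}, b} \frac{2}{\|\mathbf{w}\|} \;\Longleftrightarrow\; \min_{\mathbf{w}, b} \|\mathbf{w}\| \;\Longleftrightarrow\; \min_{\mathbf{w}, b} \tfrac{1}{2}\|\mathbf{w}\|^2
share the same optimal w; the 1/2 and the square are the "mathematical convenience" that turns the objective into a smooth, convex quadratic.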
Awesome !!
Thx
you are the best
What is the unknow vector u?
thanks mate
The best
Excellent!
Can I use your PPT, sir? I have my report on SVM tomorrow. Please, can someone help me and send me a PPT of this?
If I don’t understand this explanation….maybe I’m the problem
I've accepted my fate of failing my machine learning test
This was a really good video, but this dude's English was so hard to understand at some points that I had to keep subtitles on 😭
This is good, but I think you are going too deep into the math, which is unnecessarily making this video very long.
A.K.A.*
Nobody explains what w is, where it comes from, or what b (the intercept) is and how to calculate it. Terrible video.
Good explanation, but I really have trouble understanding your accent.
Man, you gotta skip explaining things mathematically and do it in other ways; a lot of us students don't even understand math.
I'm so glad you made this video; the StatQuest guy can't teach shit
I lost it at the first equation itself 🥲. Even Transformers are easier to understand.