I almost never comment on youtube videos, but this is hands down one of the best videos I have seen on any topic ever! This is my first time watching any of your videos. Thanks for what you do!👏👏👏
Thank you Anthony, I'm so glad you like the videos! And thank you so much for your really kind donation, I really appreciate it! Please let me know if there are any topics you'd like to see a video on, I'm always looking for suggestions. Have a wonderful day!
You are one of the best educators. I hope you read this message. I hope you understand the positive impact you are having on all of us around the world. Thank you.
Wow - better than my 2-hour lecture on unsupervised learning, but can one really visualise a dendrogram with several dimensions and millions of records? It sounds insane.
I came here looking for the differences between one model and the other but... not only did I understand the differences, I also realized I hadn't understood either of them. A thousand thanks.
Hey, thanks for the explanation. I had a doubt about feature scaling when clustering. In the first example for k-means, age and engagements are on different scales, so if you don't scale the data you're making the assumption that an age difference of a year has the same 'weight' as 1 engagement a week for k-means's sum-of-squares optimisation. Some sources recommend scaling or normalising your data before clustering and others don't. How does one make the choice of scaling the data before clustering? Is it a business/use-case decision (what do you want to give weight to), do you always leave the data as is (like in the example), or do you always scale the data (equal weighting)?
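The trade-off in the question above can be seen in a few lines of pure Python. This is only a sketch with hypothetical (age, engagements-per-week) numbers, not from the video, but it shows how z-score standardization changes which users k-means would consider "close":

```python
def standardize(values):
    """Z-score a list: subtract the mean, divide by the std dev."""
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

def dist(a, b):
    """Euclidean distance between two equal-length points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# (age in years, engagements per week) -- hypothetical users
ages = [18, 22, 45, 50]
engagements = [6, 1, 5, 2]

raw = list(zip(ages, engagements))
scaled = list(zip(standardize(ages), standardize(engagements)))

# Unscaled: user 0 (18, 6) looks closest to user 1 (22, 1),
# because the year units dwarf the engagement units.
print(dist(raw[0], raw[1]), dist(raw[0], raw[2]))

# Scaled: user 0 is now closer to user 2 (45, 5), the other
# heavy engager, because both features carry equal weight.
print(dist(scaled[0], scaled[1]), dist(scaled[0], scaled[2]))
```

So leaving the data raw is itself an implicit weighting choice (one year = one engagement); scaling makes the weighting explicit and equal.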
This is a great presentation. I've learned a lot. I was wondering whether there was a connection between hierarchical clustering and decision trees. They look and feel similar, but I'm not sure whether that's mathematically evident.
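One way to see the resemblance mentioned above: agglomerative clustering's merge history is itself a binary tree (the dendrogram), built bottom-up and unsupervised, whereas a decision tree is built top-down from labels. A minimal single-linkage sketch in pure Python, on hypothetical 1-D points, records that tree:

```python
def single_linkage(points):
    """Repeatedly merge the two closest clusters; return the merge log.
    Each log entry (left, right, d) is one internal node of the dendrogram."""
    clusters = [[p] for p in points]
    merges = []
    while len(clusters) > 1:
        # Find the pair of clusters with the smallest minimum distance.
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(abs(a - b) for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        d, i, j = best
        merges.append((clusters[i][:], clusters[j][:], d))
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return merges

for left, right, d in single_linkage([1, 2, 9, 10, 25]):
    print(f"merge {left} + {right} at distance {d}")
```

Reading the log top to bottom reconstructs the dendrogram from the leaves up, which is where the visual similarity to decision trees comes from.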
Fantastic explanation and presentation. I just finished reading 49 pages of a textbook and got more out of this video in 16 minutes than from my textbook... thank you sir I truly appreciate it.
Excellent explanation. I'm writing to you in Spanish because I just looked at your biography a bit and I see you're from Colombia; I'm from Peru. I'm studying for my master's in data science at NJIT university in New Jersey, and I hope you can help me with a few things: I am exploring the ALLSTATE CLAIMS SEVERITY dataset; it has 12 numeric variables. My question is: to apply clustering, do I need to calculate the distance over all numeric variables in one step? I hope you can answer my question.
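For the question above: yes, with several numeric variables the distance between two records is computed over all columns at once. A minimal sketch (Euclidean distance on two hypothetical 12-feature rows, not real Allstate data):

```python
def euclidean(row_a, row_b):
    """Distance across ALL numeric columns in a single step."""
    return sum((a - b) ** 2 for a, b in zip(row_a, row_b)) ** 0.5

# Two hypothetical records with 12 numeric features each.
claim_a = [0.7, 1.2, 3.0, 0.1, 5.5, 2.2, 0.9, 4.1, 1.0, 0.3, 2.8, 6.0]
claim_b = [0.6, 1.0, 2.5, 0.4, 5.0, 2.0, 1.1, 4.0, 1.2, 0.5, 3.0, 5.5]

print(euclidean(claim_a, claim_b))
```

In practice the 12 variables would usually be standardized first so that no single column dominates the sum.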
Thank you so much Luis! Your explanations have been so important for me to learn key concepts whilst in my DS/ML bootcamp. The way you present initial intuition of the algorithms without delving into any of the math and then slowly introducing the principles is brilliant! Other courses can't wait to throw all the math on you from the very beginning, getting you lost before you could ever start to learn. Your animations are also much appreciated!
Thanks for the videos! Do you have a source for some mock datasets to play with? I always want to pull up a notebook and try these algorithms, but don't have any good data.
How you do your explanation is amazing. An earth-shattering simplification of the two methods delivered by a great cool voice!
I took a loan to pay for my college tuition to be taught this, and what's worse is that they didn't teach it half as well as you do. Thanks a ton, Luis!
Thank you so much for your kind message. :) I love teaching these concepts, and it's very rewarding to hear that you're enjoying the videos. :)
I always start watching your video with liking it. Thank you so much!
Luis, I have been viewing multiple videos to find someone who could explain these concepts with clarity and you nailed it! Thank you
Thank you for the most clear explanation I’ve heard so far.
I don't have words to express my views. I can simply say Awesomeeeeeeee. Keep uploading videos on other techniques too.
Interesting video content. I am a newbie to the area of data mining but it gave me a clear insight about clustering. Cheers!
This is how you teach! Basic theory followed by an immediate example.
The best explanation of clustering which I have seen!
One of the best visual ways to explain K-means. Thanks, Luis :)
Best explanation so far. Thank you, sir.
Great explanation, really helped my understanding of hierarchical clustering and dendrograms!
Absolutely simplified explanation... thanks
thank you for this explanation Luis
Great Explanation and illustration
Next Level Explanation.........................
Dude, I love your lectures....Best in the world...
I am crying looking at the level of simplicity. It feels like people waste thousands of dollars at uni... thanks
you are amaaaaaaaaaaaaaazing, i was really confused during the class
Excellent .....It will help me to better explain the application of optimization (locator-allocator) theory to my colleagues in Public Health
great content, clear explanation. Thank you!
Thank you so much for putting out this content! Really well explained. Much appreciated.
YOU'RE JUST AMAZING SIR, I WISH YOU FLOWERS AND JOY AND ETERNAL GLORY! THANK YOU
excellent way of explaining difficult concepts !!! Keep it up and thank you
Another great course like every time, thankiiies!
Fantastic amazing explanation, subscribed!
If all my teachers were like you, I'd have a Nobel prize by now!
thank u so much!! it was very helpful and easy to understand :)
Amazing explanation Luis
Great explanation!
Thanks Luis. It is really great and vivid!
Epic explanation! Like always !
Awesome video. Thank you very much
Great explanation! Very clear.
how do you create such wonderful animation? Can you please tell me what software you use for this?
Great Serrano ...
loved the content Luis
This is fantastic, thanks a lot
11:33 Hierarchical Clustering
You are my best, really appreciate you
Best video to serve the purpose :D
helps so much, thank you!!
This is the simplest one on K-means and Hierarchical Clustering
Very good keep up good work
Thank you so much!!!!!
This put a lot of perspectives and clarity into my confusion. Thank you Luis
Hats off, sir
Just perfect.
Very well explained! Thank you Luis. +1 sub!
Great. But I didn't understand this: 6 has the lowest diameter, yet we chose 3 with the elbow method. On what basis did we make this decision?
Great question! 3 was where the corner appeared, since the diameter went down drastically from 2 to 3, but then just a little bit from 3 to 4.
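The reply above can be sketched in a few lines of Python. The diameter values below are hypothetical (not the video's exact numbers): the "corner" is the k after which adding one more cluster stops helping much.

```python
def elbow(diameters):
    """diameters[k] = largest cluster diameter when using k clusters
    (consecutive integer keys assumed). Returns the k where the
    marginal improvement drops off most sharply."""
    ks = sorted(diameters)
    # Improvement gained by going from k-1 clusters to k clusters.
    drops = {k: diameters[k - 1] - diameters[k] for k in ks[1:]}
    best_k, best_gap = None, float("-inf")
    for k in ks[1:-1]:
        gap = drops[k] - drops[k + 1]  # how sharply improvement slows at k
        if gap > best_gap:
            best_k, best_gap = k, gap
    return best_k

# Hypothetical: diameter shrinks a lot up to k=3, then barely changes.
diameters = {1: 10.0, 2: 7.0, 3: 2.0, 4: 1.8, 5: 1.7, 6: 1.6}
print(elbow(diameters))  # -> 3
```

So k=6 having the smallest diameter is expected (more clusters always shrink it); the elbow picks the point of diminishing returns instead of the minimum.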
Do a least square algorithm video
Awesome!!!
Thanks!
Thank you so much for your kind donation, Anthony!
Guy, you are a legend!
Sir, did you do these animations??
Yes, all in Keynote
@SerranoAcademy Sir, which software? Can you guide me?
Awesome
Hi Luis, do you by any chance plan to make the videos in Spanish?
Hi namesake! Yes, in the next few days I'm going to post some in Spanish! I'll let you know.
Why call it k-means? Why not j-means?
You make dreadful theories amazingly simple! Thank you very much for the great explanations AND the super-cool animations!!! Keep up!
You're really good at explaining concepts
Easily the clearest explanation of the two clustering concepts. Thank you!
Simple, concise and informative video. All educational videos should be like this. I have subscribed.
Your explanations are just so easy to understand and brilliant..........
I can't believe how simple and easy to understand you made clustering. Thank you Luis!
I have watched a lot of videos about hierarchical, your video is epic! thanks...
You explained these theories by examples which are super clear! Thanks a lot! Keep updating more videos. Really helpful!
Great. It is very informative. Thanks Luis.
Amazing video. No one can beat your approach to explaining concepts.
Very well explained!
Brilliant video - such a clear and understandable explanation! Thank you so much :)
Great explanations! Thank you. I’d love to see a video explaining the use of silhouette scores and plots for picking the best number of clusters.
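Since the comment above mentions silhouette scores: a minimal pure-Python sketch of the score for a single point, on hypothetical 1-D data with two clusters (assuming no duplicate values within a cluster), just to show the formula:

```python
def silhouette_point(point, own_cluster, other_cluster):
    """s = (b - a) / max(a, b), where
    a = mean distance from the point to the rest of its own cluster,
    b = mean distance to the nearest other cluster.
    Scores near 1 mean the point is well clustered."""
    a = sum(abs(point - p) for p in own_cluster if p != point) / (len(own_cluster) - 1)
    b = sum(abs(point - p) for p in other_cluster) / len(other_cluster)
    return (b - a) / max(a, b)

# Point 2 sits snugly in its own cluster [1, 2, 3], far from [10, 11].
print(silhouette_point(2, [1, 2, 3], [10, 11]))
```

Averaging this score over all points, for each candidate k, gives another way (besides the elbow) to pick the number of clusters.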
This is by far the best explanation of the elbow method I've seen. Thank you!
Just want to leave a comment so that more people could learn from your amazing videos! Many thanks for the wonderful and fun creation!!!
Amazingly composed, a much simpler and understandable version, Hats off Luis, loved it, thanks.
Underrated channel, great explanation.
This is a very helpful video Luis. Thank you. (PS - This is also my very first UA-cam comment)
Your explanation was just out of this world
I'm taking a course in data science by HK university and I didn't know what the point of clustering was, but now I know. You got a new subscriber.
I subscribed just by reading the comments, lol. But actually I will watch the video later and I hope I will get some useful information from it :)
amazing video! there's just one thing I didn't understand: what the hell is a pizza 'parlor'?? ;)
Great simplification in explaining the clustering principles!
Wow. This channel is a godsend. I understand the fundamentals more clearly now. Thank you.
Can you do a series on data analytics or data mining etc. ?
Thank you so much for making the video. Your explanation is very clear and easy to understand
Excellent! Excellent!! Excellent!!! Explanation. Many Thanks!
Thank you so much! Super helpful
I really like the way you teach these topics. thank you 🤩
Thank you for the application oriented explanation. :)
The best explanation I've seen/heard so far!
Awesome explanation. Very intuitive.