PCA 1: curse of dimensionality
- Published 19 Jun 2024
- Full lecture: bit.ly/PCA-alg
The number of attributes in our data is often a lot higher than the true dimensionality of the dataset. This means we have to estimate a large number of parameters, which are often not directly related to what we're trying to learn. This creates a problem, because our training data is limited. - Science & Technology
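A minimal sketch (not from the lecture) of why estimating over many attributes with limited data is hard: if we discretize each attribute into a fixed number of bins, the number of grid cells we would need training examples for grows exponentially with the number of attributes.

```python
def cells_needed(n_attributes, bins_per_attribute=10):
    """Number of distinct cells in a grid over the attribute space.

    With limited training data, most of these cells stay empty as
    n_attributes grows, which is the curse of dimensionality.
    """
    return bins_per_attribute ** n_attributes

for d in (1, 2, 3, 10):
    print(d, cells_needed(d))
```

With just 10 attributes we would already need on the order of ten billion examples to see each cell once.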
Excellent teaching style, Victor. I haven't watched all the videos, but it seems that you are able to explain both the concepts and the maths in a very simple way. Well done! Keep on posting, please!
I have followed so many tutorials and lectures but nothing is as explanatory as yours. The way you have explained it is so intuitive that high school students will get the gist of PCA. Thanks a lot, Victor. I wish I was a student of yours and could attend the lectures. Nevertheless, it is great to have your precious work available online.
I haven't watched all the videos but I've seen several videos trying to understand PCA. This is definitely the best one! Great job explaining the concepts and leading with the example. Thank you very much :)
Wow. You are really good at explaining the important concepts of PCA !! Thanks for posting these videos. I wish you had lectures on partial least square (PLS) as well.
Thank you so much, Victor.
I really enjoy watching your videos. They are clear and funny.
btw I love your articulation.
Great lectures. You explained the concept very well
Great video, curse of high dimensionality clearly explained!
Enjoyed it
Your lectures are great. Hats off.
This has really helped me, Thank you!
Great lectures. Thank you.
That was an amazing explanation, thanks!
Your teaching style is really very good! I like how you introduce the problem with some good examples before elaborating. Thank you so much for this informative lecture!
Thanks! Happy to know you found this useful.
Very informative lecture. Can you send the MATLAB code for PCA to imalik860@gmail.com? Thank you.
The best explanation I have seen, Please keep producing more stuff like this in machine learning...
Thanks for this!
Very useful. Thanks!
I struggle with Statistics, this is extremely helpful for supplementary information for my Multivariate Statistics class. Thank you :)
Thanks, happy to know this is helpful.
Awesome video. Thanks.
Best explanation of how pairwise distances between points increase with increasing dimension.
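The effect mentioned in the comment above can be checked with a quick simulation (this sketch is mine, not from the lecture): sample random points in the unit hypercube and watch the average pairwise Euclidean distance grow as the dimension increases.

```python
import math
import random

random.seed(0)

def mean_pairwise_distance(dim, n_points=50):
    """Average Euclidean distance between random points in the unit hypercube."""
    pts = [[random.random() for _ in range(dim)] for _ in range(n_points)]
    dists = [
        math.dist(p, q)
        for i, p in enumerate(pts)
        for q in pts[i + 1:]
    ]
    return sum(dists) / len(dists)

for d in (2, 10, 100, 1000):
    print(d, round(mean_pairwise_distance(d), 2))
```

The mean distance grows roughly like the square root of the dimension, so in high dimensions all points end up far from each other.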
Wow! Thank you very much, Sir.
Can anyone please tell me which book the prof. is following?
what is the book you have referenced?
***** "Data Mining" by Witten and Frank: www.cs.waikato.ac.nz/ml/weka/book.html
the best explanations EVER!
thank u so goddamn much.
Wait a sec. How is the dimensionality 2^400? Shouldn't it be just 400?
The input feature vector is just an array of 400 pixels, where each pixel can take either a 0 or 1 value.
The total number of possible combinations is 400.
Please correct me if I am wrong.
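One way to see the distinction the question above is asking about (assuming the lecture's example is a 20x20 binary image, which is my reading of it): the *dimensionality* of the feature vector is 400, while 2^400 is the number of *distinct instances* that vector can represent.

```python
# Hypothetical 20x20 binary image, as I understand the example:
n_pixels = 20 * 20          # dimensionality of the feature vector: 400 attributes
n_images = 2 ** n_pixels    # number of distinct images that vector can encode

print(n_pixels)             # the space has 400 dimensions
print(n_images)             # 2**400 possible points in that space
```

So both numbers are correct; they just count different things.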
Hahaha... okay, "urefu" is length in Swahili and "kimo" is height, but "urefu" is occasionally used for height.
Thumbs up for the urefu joke.