PCA 1: curse of dimensionality

  • Published 19 Jun 2024
  • Full lecture: bit.ly/PCA-alg
    The number of attributes in our data is often a lot higher than the true dimensionality of the dataset. This means we have to estimate a large number of parameters, which are often not directly related to what we're trying to learn. This creates a problem because our training data is limited (a small illustrative sketch follows this description).
  • Science & Technology
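
To make the description's point concrete, here is a minimal illustrative sketch (not taken from the lecture; the 400-attribute, 3-dimensional setup, the noise level, and all variable names are assumptions chosen for illustration). It generates data with many recorded attributes but few underlying degrees of freedom, and shows how PCA's variance spectrum reveals the low true dimensionality.

```python
# Illustrative sketch: 400 observed attributes, only 3 underlying degrees of
# freedom. All numbers and names here are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_attributes, true_dim = 500, 400, 3

# Latent 3-dimensional signal, linearly mixed into 400 observed attributes
# with a little measurement noise.
latent = rng.normal(size=(n_samples, true_dim))
mixing = rng.normal(size=(true_dim, n_attributes))
X = latent @ mixing + 0.01 * rng.normal(size=(n_samples, n_attributes))

# PCA via SVD of the centered data matrix: fraction of total variance
# explained by each principal component.
X_centered = X - X.mean(axis=0)
singular_values = np.linalg.svd(X_centered, compute_uv=False)
explained = singular_values**2 / np.sum(singular_values**2)

# Nearly all the variance sits in the first 3 components, even though
# the data has 400 attributes.
print(explained[:5].round(4))
```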

COMMENTS • 30

  • @MrAndreaCaso
    @MrAndreaCaso 9 years ago +7

    Excellent teaching style, Victor. I haven't watched all the videos, but it seems that you are able to explain both the concepts and the maths in a very simple way. Well done! Keep on posting, please!

  • @anonymous_anonymity
    @anonymous_anonymity 5 years ago +2

    I have followed so many tutorials and lectures, but nothing is as explanatory as yours. The way you have explained it is so intuitive that high school students will get the gist of PCA. Thanks a lot, Victor. I wish I were a student of yours and could attend the lectures. Nevertheless, it is great to have your precious work available online.

  • @zops5
    @zops5 7 years ago

    I haven't watched all the videos but I've seen several videos trying to understand PCA. This is definitely the best one! Great job explaining the concepts and leading with the example. Thank you very much :)

  • @younghwanchae1422
    @younghwanchae1422 8 years ago

    Wow. You are really good at explaining the important concepts of PCA!! Thanks for posting these videos. I wish you had lectures on partial least squares (PLS) as well.

  • @nikolahuang1919
    @nikolahuang1919 6 years ago +1

    Thank you so much, Victor.
    I really enjoy watching your videos. They are clear and funny.
    btw I love your articulation.

  • @yajiren1
    @yajiren1 8 years ago +1

    Great lectures. You explained the concept very well

  • @romyli5159
    @romyli5159 4 years ago

    Great video, curse of high dimensionality clearly explained!
    Enjoyed it

  • @anonymous_anonymity
    @anonymous_anonymity 5 years ago

    Your lectures are great. Hats off.

  • @DrKhan-hd4cd
    @DrKhan-hd4cd 5 years ago

    This has really helped me. Thank you!

  • @vivekvikramsingh7685
    @vivekvikramsingh7685 8 years ago

    Great lectures. Thank you.

  • @supergization
    @supergization 4 years ago

    That was an amazing explanation, thanks!

  • @bhaavamritdhaara
    @bhaavamritdhaara 9 years ago +8

    Your teaching style is really very good! I like how you introduce the problem with some good examples before elaborating. Thank you so much for this informative lecture!

    • @vlavrenko
      @vlavrenko 9 years ago

      Thanks! Happy to know you found this useful.

    • @MuhammadImran-rn9rg
      @MuhammadImran-rn9rg 7 years ago

      Very informative lecture. Can you send the MATLAB code for PCA to imalik860@gmail.com? Thank you.

  • @subhashtn2
    @subhashtn2 5 years ago +2

    The best explanation I have seen. Please keep producing more stuff like this in machine learning...

  • @abhishekeaswaran3315
    @abhishekeaswaran3315 6 years ago +1

    Thanks for this!

  • @ClassicContent
    @ClassicContent 6 years ago

    Very useful. Thanks!

  • @Cassielball
    @Cassielball 9 years ago

    I struggle with statistics, and this is extremely helpful supplementary information for my Multivariate Statistics class. Thank you :)

    • @vlavrenko
      @vlavrenko 9 years ago

      Thanks, happy to know this is helpful.

  • @inteligenciaartificiuau
    @inteligenciaartificiuau 3 years ago

    Awesome video. Thanks.

  • @snackbob100
    @snackbob100 4 years ago

    Best explanation of how the distance between pairs of points increases with increasing dimensions.
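
The effect this comment refers to is easy to reproduce. Below is a rough sketch (an editorial illustration, not code from the lecture; it assumes points drawn uniformly in the unit hypercube) showing that as the number of dimensions grows, pairwise distances grow and also become more uniform, so nearness loses contrast.

```python
# Rough sketch: mean pairwise distance and distance contrast for random
# points in the unit hypercube, as the number of dimensions grows.
import numpy as np

rng = np.random.default_rng(0)
n_points = 100

for d in (2, 10, 100, 1000):
    points = rng.uniform(size=(n_points, d))
    # All pairwise Euclidean distances between distinct points.
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    dists = dists[np.triu_indices(n_points, k=1)]
    contrast = (dists.max() - dists.min()) / dists.min()
    print(f"d={d:4d}  mean distance={dists.mean():6.2f}  (max-min)/min={contrast:5.2f}")
```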

  • @carlsoftsln1183
    @carlsoftsln1183 5 years ago

    Wow! Thank you very much, Sir!

  • @EjazAhmed-pf5tz
    @EjazAhmed-pf5tz 1 year ago

    Can anyone please tell me which book the professor is following?

  • @kirillb.9322
    @kirillb.9322 9 years ago

    What is the book you have referenced?

    • @vlavrenko
      @vlavrenko 9 years ago +1

      ***** "Data Mining" by Witten and Frank: www.cs.waikato.ac.nz/ml/weka/book.html

  • @randomguy75
    @randomguy75 7 years ago

    The best explanations EVER!
    Thank you so goddamn much.

  • @abhishekbhatia6092
    @abhishekbhatia6092 5 years ago

    Wait a sec. How is the dimensionality 2^400? Shouldn't it be just 400?
    The input feature vector is just an array of 400 pixels, where each pixel can take a value of either 0 or 1.
    The total number of possible combinations is 400.
    Please correct me if I am wrong.
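
For context on the arithmetic behind this question (an editorial note, not a reply from the lecturer): a vector of 400 binary pixels does have 400 dimensions, but the number of distinct images such a vector can represent is 2^400, which is presumably the figure quoted in the lecture. A tiny sketch of the count:

```python
# A 400-pixel binary image lives in a 400-dimensional feature space, but the
# number of distinct images it can encode is 2**400 -- far more than any
# training set could ever cover.
n_pixels = 400
n_possible_images = 2 ** n_pixels
print(len(str(n_possible_images)))  # 121 digits, i.e. about 2.6e120 images
```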

  • @tomkamikaze
    @tomkamikaze 4 years ago +1

    Hahaha... okay, "urefu" is "length" in Swahili and "kimo" is "height", but "urefu" is occasionally used for "height".

  • @SamuelKupferschmid
    @SamuelKupferschmid 6 years ago

    Thumbs up for the urefu joke.