PCA 4: principal components = eigenvectors

  • Published 18 Jan 2014
  • Full lecture: bit.ly/PCA-alg
    We can find the direction of the greatest variance in our data from the covariance matrix. It is the vector that does not rotate when we multiply it by the covariance matrix. Such vectors are called eigenvectors, and they have corresponding eigenvalues. The eigenvectors with the largest eigenvalues will be the principal components (the new dimensions of our data). A short code sketch of this procedure follows below.
  • Science & Technology
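
  • A minimal NumPy sketch of the procedure the description outlines (the toy
    data and variable names are ours, not the lecture's):

        import numpy as np

        # Principal components = eigenvectors of the covariance matrix,
        # ordered by decreasing eigenvalue.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 2)) @ np.array([[3.0, 1.0], [0.0, 0.5]])

        Xc = X - X.mean(axis=0)               # center the data
        C = np.cov(Xc, rowvar=False)          # covariance matrix
        eigvals, eigvecs = np.linalg.eigh(C)  # C is symmetric, so eigh works
        order = np.argsort(eigvals)[::-1]     # largest eigenvalue first
        components = eigvecs[:, order]        # columns = principal components
        projected = Xc @ components           # data in the new dimensions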

COMMENTS • 108

  • @d.youtubr
    @d.youtubr 2 years ago +1

    Probably THE BEST explanation of PCA on the internet.
    Thank you.

  • @3446
    @3446 1 year ago

    Finally someone who is able to explain the relationship between PCA and eigenvectors. Thank you!

  • @michael_bryant
    @michael_bryant 11 months ago

    This is the first time that PCA has actually made sense mathematically. Great video

  • @deebo7276
    @deebo7276 1 year ago

    Without this video I don't think I'd ever intuitively understand the inherent link between PCA and covariance eigenvectors. And to think that it only took 7 minutes of my time. THANK YOU

  • @faustoblascopisador571
    @faustoblascopisador571 5 years ago +1

    Just a couple of minutes of this video, where you explain how the covariance matrix acts on any vector, are the glue for all the different pieces of the PCA puzzle in my mind. Thanks for such an excellent explanation.

  • @AdrianVrabie
    @AdrianVrabie 9 years ago +42

    AWESOOOME! :D Now I understand not only PCA but also what eigenvectors are

    • @vlavrenko
      @vlavrenko  9 years ago +6

      Thank you!

    • @iSohrab
      @iSohrab 8 years ago

      Exactly!

    • @user-vf1cg5vs8e
      @user-vf1cg5vs8e 5 years ago

      Me Too !!!!!!!!!!!!!

  • @leonidsdreams3919
    @leonidsdreams3919 3 years ago

    Even without any fancy visualization, you managed to explain things in an intuitive and clear manner! Amazing!

  • @scares009
    @scares009 5 years ago

    Possibly the best explanation of eigenvectors, in the context of PCA, I've seen. Great work!

  • @puttatidam.1819
    @puttatidam.1819 2 years ago

    3 years... I did data science without this clear understanding of PCA. I have used it, I know how to calculate everything, but I never really got to see the essence of it. This is such a great lecture. Thank you.

  • @milosmladenovic7822
    @milosmladenovic7822 6 years ago

    This is amazing, thank you so much! This series of videos, combined with the one you have on eigenvectors and eigenvalues, helped me to completely and thoroughly understand PCA and the math behind it!

  • @abhishekkumarsrivastava3480
    @abhishekkumarsrivastava3480 8 years ago +3

    Only today did I understand the exact meaning of PCA... I was struggling for more than a month.
    No match for these lectures... great job.

  • @usefbob
    @usefbob 7 years ago +16

    You're an insanely good teacher! Really appreciate you putting in time to make these videos, thanks :D

    • @georgalem3310
      @georgalem3310 4 years ago

      He was a professor at this point actually.

  • @chronobiologierug7111
    @chronobiologierug7111 7 years ago

    This is the best explanation of finding the eigenvectors in PCA I have seen so far, and it is easy to understand. Thank you very much.

  • @JulianHarris
    @JulianHarris 5 years ago +3

    "How do you extract these dimensions of greatest spread of the data?"
    Really love the plain-English intuitive explanation of eigenvectors and principal component analysis. Thanks Victor!

  • @Marcus-ok2jy
    @Marcus-ok2jy 1 year ago

    I have been really confused about why eigenvectors were used in PCA for the longest time! Love the explanation.

  • @nishkumar5298
    @nishkumar5298 7 years ago +1

    Wow! Such an elegant explanation of the covariance matrix and eigenvectors.

  • @amalalmuarik5160
    @amalalmuarik5160 1 month ago

    THANKS, you've answered a lot of questions in my mind with your amazing explanation!!!!

  • @noemi9351
    @noemi9351 6 years ago

    Great video! I have always used PCA but this is the first time I actually understand the underlying principle. Thank you

  • @pussyripper1
    @pussyripper1 8 years ago

    Went through countless videos to understand why we try to find vectors which don't change their direction when multiplied by the covariance matrix. Finally found the answer here. Thanks for posting.

  • @seoulitguy
    @seoulitguy 4 years ago +1

    I don't comment on videos very often, but this is an amazing explanation of PCA. Thank you! Helped a lot :)

  • @jaideepcsamuel
    @jaideepcsamuel 2 years ago

    I have a feeling that most teachers do not themselves have a sound understanding and will confuse the heck out of you to cover their own inadequacies. Thank you for explaining this so clearly - the part where you mention that multiplying the covariance matrix by any vector rotates it towards the principal component is superb, and understandable even by a novice.

  • @yuvrajkhanna5841
    @yuvrajkhanna5841 3 years ago

    Dude, I always wondered why you find eigenvectors when finding principal components. Now I understand. Thanks man, really appreciated.

  • @michellemaher1300
    @michellemaher1300 6 years ago

    You are a great teacher Victor, thank you for being able to explain this so well.

  • @smamorti
    @smamorti 4 years ago

    Your videos are doing an amazing job helping me understand these subjects better. Thanks a lot

  • @ericyoung6420
    @ericyoung6420 2 years ago

    Very clear lecture! Well-organized content!

  • @piecoos1367
    @piecoos1367 8 years ago

    Thank you for the clear explanation. The extra work on the colours and graphs helped.

  • @yu-linchen281
    @yu-linchen281 4 years ago

    Thanks for the video! Now I get why we calculate the eigenvectors of the covariance matrix to obtain the PCs. Really great explanation.

  • @HarshitSharma-lu3uf
    @HarshitSharma-lu3uf 5 years ago

    Thanks a lot for this video; this is the first time I understood the covariance matrix, eigenvectors and eigenvalues... superb video.

  • @andrewpham1566
    @andrewpham1566 4 years ago +1

    One of the best explanations I could find!! Thank you.

  • @ryanb616
    @ryanb616 7 years ago

    Excellent explanation. Finding information on this exact process proved difficult - thank you for posting.

  • @stvgigi
    @stvgigi 5 years ago

    Brilliant explanation. Very good presenter and orator.

  • @RS-el7iu
    @RS-el7iu 6 years ago

    You are simply wonderful... this stuff must be taught everywhere.

  • @waelfarah8347
    @waelfarah8347 8 years ago +1

    Thank you for the instructional and helpful videos! Is there any reference for what you said about multiplying the covariance matrix by any vector yielding a vector that points more towards the direction of maximum variance?
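
    The behaviour asked about here is the idea behind power iteration:
    multiplying almost any vector by the covariance matrix, then normalizing,
    rotates it toward the direction of greatest variance. A minimal sketch,
    assuming a toy 2x2 covariance matrix rather than the one in the video:

        import numpy as np

        C = np.array([[2.0, 0.8],
                      [0.8, 0.6]])  # hypothetical covariance matrix
        v = np.array([1.0, -1.0])   # an arbitrary starting vector
        for _ in range(10):
            v = C @ v                  # rotates v toward the top eigenvector
            v = v / np.linalg.norm(v)  # keep the length fixed
        print(v)  # ~ the top eigenvector: the direction of greatest variance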

  • @joaoneto4533
    @joaoneto4533 6 years ago

    Thank you for the explanation, Mr. Robot!

  • @tsniranjan
    @tsniranjan 5 years ago

    Amazing explanation of eigenvectors and PCA, helped me a lot, Thank you.

  • @quessar
    @quessar 3 years ago

    Great explanation in such a short video!!!

  • @prudvim3513
    @prudvim3513 4 years ago

    Awesome! I understood the connection between covariance matrix and principal components now.

  • @abhinavchauhan2737
    @abhinavchauhan2737 7 years ago

    WoW!!! Finally I can understand the intuition behind using eigenvalues/eigenvectors for PCA... Thank you so much!!!!! More power to you :-) \m/

  • @domenicodifraia7338
    @domenicodifraia7338 6 years ago

    MAN YOU ARE GREAT! My brain lit up thanks to you!

  • @aliceh7898
    @aliceh7898 9 years ago

    The best intuitive explanation of PCA. Thank you!

  • @abdulateek1137
    @abdulateek1137 7 years ago +3

    Your videos really helped me. I have a doubt about this video: how do we know that multiplying the covariance matrix with the vector (the red one in the video) will rotate it towards the principal component?

  • @jdm89s13
    @jdm89s13 4 years ago

    Awesome explanation of principal components.

  • @jayasuryam8575
    @jayasuryam8575 3 years ago

    Excellent Explanation! Thank you so much

  • @atineshsingh3382
    @atineshsingh3382 7 years ago

    Saw so many videos, but this one cleared my doubt. Thanks for the awesome explanation.

  • @muhammedafifi6388
    @muhammedafifi6388 5 years ago

    More exciting than watching a movie!

  • @tulasijamun3234
    @tulasijamun3234 6 years ago

    Gained a lot from this video! Thanks Victor!

  • @sk8erbyern
    @sk8erbyern 7 years ago

    Ohh, so that's the idea. Great explanation, thanks very much. It is so simple, yet no one explains it properly like that.

  • @jiangxu3895
    @jiangxu3895 3 years ago

    Thanks for your video, it solved my puzzle.

  • @jschoi9290
    @jschoi9290 4 years ago

    This was very helpful. Thank you!

  • @guillermoguijarrorodriguez252
    @guillermoguijarrorodriguez252 6 years ago

    Impressive explanation. Thank you very much

  • @xiaolinwang639
    @xiaolinwang639 9 years ago +1

    This is really helpful and easy to understand! Thank you!

    • @vlavrenko
      @vlavrenko  9 years ago

      Thank you, happy to know you find it useful.

  • @soojinlee6191
    @soojinlee6191 10 years ago

    Really great lecture! Thanks!

  • @tanvirtanvir6435
    @tanvirtanvir6435 1 year ago

    4:10 multiply by the covariance matrix; slopes converging; that is the PCA
    5:50 e2 * cov: it will NOT rotate
    6:48 choose the eigenvectors that have the BIGGEST eigenvalues

  • @greenfairy5125
    @greenfairy5125 6 years ago

    Thank you for the video! But may I ask you something? How can you find a slope when the eigenvector is of size 17x1 and not 2x2? My initial dataset has 17 variables of 4 observations each (a 4x17 matrix), so the covariance matrix that derives from it is 17x17, and we have 17 eigenvalues and 17 eigenvectors.
    Thank you in advance!

  • @IIAndersII
    @IIAndersII 4 years ago

    incredible explanation

  • @birendrakathariya3517
    @birendrakathariya3517 8 years ago

    Awesome explanation, very very intuitive.......

  • @hosseinpourghaemi4600
    @hosseinpourghaemi4600 7 years ago

    Great explanation. Thank you.

  • @vanlalhruaiiswansi7305
    @vanlalhruaiiswansi7305 7 years ago

    Awesomeness...thank you Prof !!

  • @nambs7122
    @nambs7122 5 years ago

    Deep understanding!

  • @rafimohammad5213
    @rafimohammad5213 3 years ago

    Excellent explanation

  • @fadinader3468
    @fadinader3468 4 years ago

    For the covariance matrix, shouldn't the diagonal be equal to 1, as it holds the covariance of each feature with itself?

  • @11maxed11
    @11maxed11 6 years ago

    dude this lecture is awesome

  • @inteligenciaartificiuau
    @inteligenciaartificiuau 3 years ago

    Awesome. Thank you very much.

  • @artagainstknivesthelab512
    @artagainstknivesthelab512 5 years ago

    Great explanation - Is there any intuition for why the covariance matrix tends to transform vectors towards the eigenvectors? Or to put this another way: what is the reason that the eigenvectors are not rotated by the linear transformation arising from the covariance matrix?

  • @peterg7643
    @peterg7643 8 years ago

    This is great. Thank you!

  • @glitchAI
    @glitchAI 19 days ago

    Why does the covariance matrix rotate vectors towards the direction of greatest variance?

  • @danajoffe8621
    @danajoffe8621 5 years ago

    thank you! amazing explanation

  • @leconstruxviyan7909
    @leconstruxviyan7909 6 years ago

    great explanation !! double thumbs up :-)

  • @littlerainyone
    @littlerainyone 7 years ago

    How do you plot vectors in real time on top of a document? Are you using a Python notebook, by chance?

  • @madhousetoobah
    @madhousetoobah 9 years ago

    Great lecture - wonderful explanation

  • @aimonallouache7993
    @aimonallouache7993 6 years ago

    Great video, thanks man

  • @tighthead03
    @tighthead03 8 years ago

    Hi Victor, thanks for the great videos they're a fantastic study aid. For PCA some people recommend using the correlation matrix over the covariance matrix, can you explain why you would use one over the other? Are there any advantages of one vs. the other?

    • @SaschaFroelich
      @SaschaFroelich 8 years ago

      +John Garrigan Chiming in. You can use both; the correlation is just the normalized form of the covariance. The correlation of a dataset always lies between -1 and 1, while the covariance can take on any value from negative infinity to infinity. Specifically, the covariance is scale-dependent.
      So the correlation is a better measure if you wish to compare different systems, because it does not depend on the scale of the systems.
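
      A small sketch of this point, on assumed toy data: the correlation
      matrix is just the covariance matrix of the standardized data, so it
      does not depend on the features' scales.

          import numpy as np

          rng = np.random.default_rng(1)
          X = rng.normal(size=(100, 3)) * np.array([1.0, 10.0, 1000.0])

          cov = np.cov(X, rowvar=False)        # grows with the features' scales
          corr = np.corrcoef(X, rowvar=False)  # always between -1 and 1

          Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize
          print(np.allclose(np.cov(Z, rowvar=False), corr))  # True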

  • @osamahabdullah3715
    @osamahabdullah3715 6 years ago

    Such a wonderful explanation. Please, how can I get the slides?

  • @taotaotan5671
    @taotaotan5671 4 years ago

    Appreciate it man.

  • @mightyduckyo
    @mightyduckyo 9 months ago

    Step 1 is to center; should we also scale so that the variance = 1?

  • @darchcruise
    @darchcruise 8 years ago +1

    Are eigenvectors:
    1) the ones that "don't change" when you multiply (vector * co-var)
    2) or the ones that "do change" when you multiply (vector * co-var) ?

    • @SaschaFroelich
      @SaschaFroelich 8 years ago

      +darchz Neither. Eigenvectors are vectors that do not change their direction when multiplied by the matrix (matrix * vector); they might, however, change their length. (A small numeric check follows below this thread.)

    • @robinranabhat3125
      @robinranabhat3125 6 years ago

      The eigenvectors of the covariance matrix DON'T CHANGE direction when it multiplies them.
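
      A tiny numeric check of the definition above (the matrix here is a
      hypothetical symmetric one, not the video's): for an eigenvector v of
      a matrix A, A @ v equals v scaled by its eigenvalue, so the direction
      is unchanged.

          import numpy as np

          A = np.array([[2.0, 1.0],
                        [1.0, 2.0]])  # hypothetical symmetric matrix
          eigvals, eigvecs = np.linalg.eig(A)
          v = eigvecs[:, 0]           # an eigenvector of A
          print(A @ v)                # same as eigvals[0] * v: direction kept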

  • @MawaMaverick
    @MawaMaverick 7 years ago

    Thank you sir.

  • @TheAkshaykher
    @TheAkshaykher 5 years ago

    How are we getting the vector (-2.5, -1)? Victor said to multiply again, but by multiplying the covariance matrix with (-1.2, -0.2) I do not get (-2.5, -1). Could anybody explain this?

    • @sumitshukla3689
      @sumitshukla3689 4 years ago

      Hi Akshay, it's already late to reply, but I'm answering for others. You need to multiply the covariance matrix with the new vector (-1.2, -0.2) to get (-2.5, -1), and then you follow the same process.

  • @mk5863
    @mk5863 9 years ago

    very useful thanks a lot.

    • @vlavrenko
      @vlavrenko  9 years ago

      Thanks, glad it worked for you!

  • @tmrmbx5496
    @tmrmbx5496 4 years ago

    Thanks

  • @bhupensinha3767
    @bhupensinha3767 5 years ago

    Good video... but it is difficult to follow what is 'here' and 'there'.

  • @sameerpurwar4836
    @sameerpurwar4836 8 years ago

    man u r awesome

  • @jonathanr6391
    @jonathanr6391 8 years ago

    Awesome

  • @vijayendrasdm
    @vijayendrasdm 9 years ago

    why do we subtract the mean from each attribute?

    • @vlavrenko
      @vlavrenko  9 years ago

      v9 The pragmatic reason -- it makes all the maths a lot simpler: covariances become dot products, and the maths in parts 7 and 8 are a lot easier. (A small sketch of this follows below this thread.)
      The conceptual reason -- PCA is all about preserving variance; it is not intended to model the mean of the data, so we factor out the mean, model what remains (the variance around that mean), and then add the mean back (if we need to reconstruct some semblance of the original data from the low-dimensional representation -- see: ua-cam.com/video/_lY74pXWlS8/v-deo.html for an example).

    • @vijayendrasdm
      @vijayendrasdm 9 years ago

      Victor Lavrenko Thank you.
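
      A small sketch of the "pragmatic reason" above, on assumed toy data:
      after subtracting each attribute's mean, the covariance matrix is just
      a scaled dot product of the data matrix with itself.

          import numpy as np

          rng = np.random.default_rng(2)
          X = rng.normal(loc=5.0, size=(50, 2))

          Xc = X - X.mean(axis=0)            # subtract the mean per attribute
          C_dot = Xc.T @ Xc / (len(Xc) - 1)  # covariances as dot products
          print(np.allclose(C_dot, np.cov(X, rowvar=False)))  # True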

  • @jospijkers1003
    @jospijkers1003 11 months ago

    SVD
    3:08

  • @peterkovgan993
    @peterkovgan993 5 years ago

    Hugs and kisses!!!!

  • @arulpraveen
    @arulpraveen 9 years ago

    Wow, so this is what eigenvectors really are...

    • @vlavrenko
      @vlavrenko  9 years ago

      Glad I could help :-)

  • @choubro2
    @choubro2 1 year ago

    British Gale from Breaking Bad.

  • @bikinibottom2100
    @bikinibottom2100 1 year ago

    Andrew Tate doing statistics.