PCA 6 - Relationship to SVD

  • Published 15 May 2020
  • Full video list and slides: www.kamperh.com/data414/
    Errata:
    1:35 - Both the rows and columns of U are actually orthonormal.
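The erratum can be checked numerically. A minimal numpy sketch (not from the video): for a full SVD, U is square and orthogonal, so both its rows and its columns are orthonormal.

```python
import numpy as np

# Any rectangular matrix will do for the check.
X = np.random.default_rng(1).normal(size=(5, 3))

# Full SVD: U is 5x5 and orthogonal (the thin SVD would give a 5x3 U,
# whose columns are orthonormal but whose rows are not).
U, s, Vt = np.linalg.svd(X, full_matrices=True)

assert np.allclose(U.T @ U, np.eye(5))   # columns orthonormal
assert np.allclose(U @ U.T, np.eye(5))   # rows orthonormal
```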

COMMENTS • 26

  • @ex-pwian1190
    @ex-pwian1190 3 days ago

    The best explanation!

  • @shuddR
    @shuddR 3 years ago +15

    This fills in so many gaps I had in my knowledge on PCA and SVD. Thank you

    • @kamperh
      @kamperh  3 years ago +2

      Very big pleasure! :)

  • @kobi981
    @kobi981 23 days ago

    Thanks! Great video

  • @RaksohOap
    @RaksohOap 2 years ago +1

    Thank you very much for this video. I spent a lot of time on StackExchange trying to understand this relationship, but you explain it really well in a short video. Thank you so much.

  • @johnjung-studywithme
    @johnjung-studywithme 1 year ago

    the best explanation I've heard on this subject!

  • @RaviChoudhary_iitkgp
    @RaviChoudhary_iitkgp 6 months ago

    The best explanation of PCA and how it relates to SVD and to eigenvalues and eigenvectors 🙌

  • @atrayeedasgupta2872
    @atrayeedasgupta2872 1 year ago

    Excellent explanation. I could join all the dots regarding my understanding! Thanks a lot :)

  • @m.preacher2829
    @m.preacher2829 10 months ago

    a great video. I have learned a lot from your machine learning series. 😁

  • @janlukasr.2141
    @janlukasr.2141 3 years ago +2

    Really nice explanation, thanks for this! :)

  • @davidzhang4825
    @davidzhang4825 1 year ago

    Great video! This clears up my doubts

  • @user-yc3op1vo8e
    @user-yc3op1vo8e 3 months ago

    Oh man, your channel is amazing. I am from Brazil and I liked your explanation of this content.. congratulations.

    • @kamperh
      @kamperh  3 months ago

      Thanks a ton for the encouragement!! :D

  • @edgar_benitez
    @edgar_benitez 3 years ago +3

    At last a good explanation... Thanks.

  • @mvijayvenkatesh
    @mvijayvenkatesh 3 years ago +4

    Excellent video.. always had some doubt about why we take the covariance matrix when computing the eigendecomposition. Thank you; this video clarified a lot of my doubts.

    • @kamperh
      @kamperh  3 years ago +1

      Huge pleasure! :) I also struggled with this stuff the first (many) times I looked at it.

  • @Trubripes
    @Trubripes 3 months ago +1

    Brilliant. You forgot to center the data, but that's a trivial oversight.

  • @RS-el7iu
    @RS-el7iu 4 years ago +2

    thanks a lot... very nice and concise explanation.

  • @RitikaSarkar-vn1mr
    @RitikaSarkar-vn1mr 2 months ago

    Awesomeeeeeeee

  • @RitikaSarkar-vn1mr
    @RitikaSarkar-vn1mr 2 months ago

    awesomeee

  • @bilalbayrakdar7100
    @bilalbayrakdar7100 2 years ago

    thx dude, it was a clear explanation

  • @zenchiassassin283
    @zenchiassassin283 1 year ago

    Nice video! Note that X has to be centered (I don't know the exact definition of a design matrix, though) so that we get the correct covariance matrix
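The point above can be checked numerically. A minimal numpy sketch (not from the video): X^T X / (n-1) only matches the covariance matrix when X is centered first.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(loc=5.0, size=(100, 3))   # data with a clearly nonzero mean
n = X.shape[0]

cov = np.cov(X, rowvar=False)            # numpy centers internally
Xc = X - X.mean(axis=0)                  # explicitly centered copy

assert np.allclose(Xc.T @ Xc / (n - 1), cov)       # centered: matches
assert not np.allclose(X.T @ X / (n - 1), cov)     # uncentered: does not
```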

  • @xerocool2109
    @xerocool2109 1 year ago +2

    At 8:00 you say that "we can get all the eigenvectors and eigenvalues by just doing the SVD of X' * X (the sample covariance matrix)", but I think what you mean is simply doing the eigendecomposition of X' * X, or ultimately the SVD of X and taking the V values. Am I correct?

    • @halihammer
      @halihammer 1 year ago

      I assume so too! I was confused a bit, but this is what I have now:
      X = design matrix with the variables in the columns and the observations in the rows; the data must be centered and scaled: X(ij) = (X(ij) - avg(j))/sqrt(n-1), where n is the number of observations
      A = covariance matrix of X
      D = diagonal matrix containing the eigenvalues of A
      S = diagonal matrix containing the singular values of X
      X = USV^T (SVD of the design matrix)
      A = VDV^T (eigendecomposition of the covariance matrix)
      Now: X^TX = A = (VS^TU^T)(USV^T) = VS^T(U^TU)SV^T = VS^TSV^T = VDV^T, since U is orthogonal (U^TU = I) and S is diagonal (S^TS = S^2 = D)
      So the square roots of the eigenvalues of the covariance matrix are the singular values of the design matrix
      Or, the other way around: the squares of the singular values are the same as the eigenvalues of the covariance matrix
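The derivation above can be verified numerically. A minimal numpy sketch (not from the video; the random data is illustrative): the squared singular values of the centered and scaled design matrix equal the eigenvalues of the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
raw = rng.normal(size=(50, 4))           # 50 observations, 4 variables
n = raw.shape[0]

# Centre and scale as in the comment above, so that A = X^T X is the
# sample covariance matrix of the raw data.
X = (raw - raw.mean(axis=0)) / np.sqrt(n - 1)
A = X.T @ X
assert np.allclose(A, np.cov(raw, rowvar=False))

U, s, Vt = np.linalg.svd(X, full_matrices=False)   # SVD of design matrix
eigvals, V = np.linalg.eigh(A)                     # eigendecomposition of A

# eigh returns eigenvalues in ascending order; singular values come
# back in descending order, so reverse one of them before comparing.
assert np.allclose(s**2, eigvals[::-1])
```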

  • @prachimehta9839
    @prachimehta9839 2 years ago

    Amazing video !

    • @kamperh
      @kamperh  2 years ago +1

      Thanks Prachi!! :)