Thank you very much for this video. I spent a lot of time on Stack Exchange trying to understand this relationship, but you explained it really well in a short video. Thank you so much.
Excellent video... always had some doubt about why we take the covariance matrix when computing the eigendecomposition. Thank you, a lot of my doubts were clarified by this video.
At 8:00 you say that "we can get all the eigenvectors and eigenvalues by just doing the SVD of X' * X (the sample covariance matrix)", but I think what you mean is by simply doing the eigendecomposition of X' * X, or ultimately the SVD of X' and taking the V values, am I correct?
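To see that both routes give the same principal directions, here is a minimal NumPy sketch (with a small hypothetical random data matrix, not the video's data): the eigenvectors of X'X and the right singular vectors from the SVD of X agree up to sign.

```python
import numpy as np

# Hypothetical centered data matrix: 8 observations, 3 variables.
rng = np.random.default_rng(1)
X = rng.normal(size=(8, 3))
X -= X.mean(axis=0)

# Route 1: eigendecomposition of X^T X.
w, V_eig = np.linalg.eigh(X.T @ X)   # eigenvalues ascending
V_eig = V_eig[:, ::-1]               # reorder columns to descending eigenvalues

# Route 2: SVD of X itself; rows of Vt are the right singular vectors.
_, _, Vt = np.linalg.svd(X, full_matrices=False)

# Eigenvectors are only defined up to sign, so compare absolute dot products:
# each column of V_eig should align with the matching row of Vt.
agree = np.allclose(np.abs(np.sum(V_eig * Vt.T, axis=0)), 1.0)
print(agree)  # True
```

So either decomposition recovers the same principal axes; the SVD route just skips forming X'X explicitly.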
The best explanation!
This fills in so many gaps I had in my knowledge on PCA and SVD. Thank you
Very big pleasure! :)
Thanks ! great video
the best explanation I've heard on this subject!
the best explanation of pca and how it relates to svd and the eigen value and eigen vectors 🙌
Excellent explanation. I could join all the dots regarding my understanding! Thanks a lot :)
a great video. I have learned a lot from your machine learning series. 😁
Really nice explanation, thanks for this! :)
Great video ! This clears my doubts
Oh man, your channel is amazing. I'm from Brazil and I liked your explanation of this content... congratulations!
Thanks a ton for the encouragement!! :D
At last a good explanation... Thanks.
Huge pleasure! :) I also struggled with this stuff the first (many) times I looked at it.
Brilliant. You forgot to center the data, but that's a trivial oversight.
Thanks a lot... very nice and concise explanation.
Awesomeeeeeeee
awesomeee
Thanks dude, it was a clear explanation.
Nice video! Note that X has to be centered (I don't know the exact definition of "design matrix" though) so that we get the correct covariance matrix.
I assumed so too! I was a bit confused, but this is what I've got now:
X = design matrix with the variables in the columns and the observations in the rows; the data must be centered and normalized: X(ij) = (X(ij) - avg(j)) / sqrt(n-1), where n is the number of observations
A = covariance matrix of X
D = diagonal matrix containing the eigenvalues of A
S = diagonal matrix containing the singular values of X
X = USV^T SVD of the design matrix
A = VDV^T Eigendecomposition of the covariance matrix
Now: X^TX = A = (VS^TU^T)(USV^T) = VS^TSV^T = VDV^T, since S is diagonal (S^TS = S^2 = D) and U is orthogonal (U^T = U^-1)
So the square roots of the eigenvalues of the covariance matrix are the same as the singular values of the design matrix.
Or the other way around: the squares of the singular values are the same as the eigenvalues of the covariance matrix.
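The derivation above can be checked numerically. This is a minimal sketch with a small hypothetical random data matrix: after centering and dividing by sqrt(n-1), the squared singular values of X match the eigenvalues of the covariance matrix X^T X.

```python
import numpy as np

# Hypothetical data matrix: 6 observations (rows), 3 variables (columns).
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(6, 3))

n = X_raw.shape[0]
# Center each column and scale by sqrt(n-1), as in the comment above,
# so that X^T X equals the sample covariance matrix.
X = (X_raw - X_raw.mean(axis=0)) / np.sqrt(n - 1)

A = X.T @ X                             # sample covariance matrix
eigvals = np.linalg.eigvalsh(A)         # eigenvalues of A, ascending

s = np.linalg.svd(X, compute_uv=False)  # singular values of X, descending

# Squared singular values equal the covariance eigenvalues (S^2 = D).
print(np.allclose(np.sort(s**2), eigvals))  # True
```

Note the sqrt(n-1) in the scaling: dividing the centered columns by (n-1) itself would make X^T X the covariance divided by an extra factor of (n-1), and the S^2 = D identity would no longer hold against the covariance matrix.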
Amazing video !
Thanks Prachi!! :)