Amazing video, I was not sure about the relationship between scores and loading vectors, and finally, I found this video which corroborates my idea. Thanks a lot
This is my very first approximation to PCA, which I need to learn by myself for my thesis. I found it extremely useful!! Very well explained, thank you so much :-)
Thank you. Certainly one of the better tutorials on PCA here.
Very useful breakdowns in Parts I and II; rare and clear explanations.
Very good explanation. Now I understand PCA. Thank you!!!
Excellent breakdown of PCA. Thanks!
Are you going to be doing any more?
Thanks Steve!!! I love that you use R!
Great video! My question concerns the slope of the PCs. For two variables the slope is calculated as pc1.slope = my.eigen$vectors[1,1] / my.eigen$vectors[2,1]. How do you calculate the slope when there are more than two variables, and therefore more eigenvectors (for instance, three or four)? Thanks!
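A sketch of one way to handle this (not from the video; the data below are random placeholders): with more than two variables, each eigenvector has one component per variable, so there is no single 2D slope to draw. The usual move is to project the data onto the first two eigenvectors and plot the scores.

```r
set.seed(1)
x <- matrix(rnorm(400), ncol = 4)        # 100 observations, 4 variables
x.scaled <- scale(x)                     # center and scale each column
my.eigen <- eigen(cov(x.scaled))         # one eigenvector per variable (4 here)
scores <- x.scaled %*% my.eigen$vectors  # project onto all principal components
plot(scores[, 1], scores[, 2], xlab = "PC1", ylab = "PC2")
```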
Prof,
I also find your examples quite easy to follow.
Kindly do a talk on Markov chain Monte Carlo (MCMC) in R.
And if you oblige, the Kalman filter in R as well.
Once again, thank you for the intuitive, ingenious postings on R.
God bless you.
Thank you for the great tutorial! So if there were data on characteristics of these students, could PCA be used to find out which characteristics contribute most to PC2?
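If such data existed, yes: the loadings (prcomp's $rotation matrix) show how strongly each variable contributes to each component. A sketch with made-up student characteristics (the variable names and values here are invented for illustration):

```r
set.seed(2)
students <- data.frame(
  hours.studied = rnorm(50, mean = 10, sd = 2),
  sleep.hours   = rnorm(50, mean = 7,  sd = 1),
  coffee.cups   = rnorm(50, mean = 3,  sd = 1)
)
pca <- prcomp(students, center = TRUE, scale. = TRUE)
pca$rotation[, "PC2"]                                # loading of each variable on PC2
sort(abs(pca$rotation[, "PC2"]), decreasing = TRUE)  # biggest contributors first
```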
Thank you for all the videos!!! Could you upload a video on LDA? It would be so helpful!!!!
Thanks for the video.
My question is: how do you do a PCA when more than two variables are involved, without using prcomp or princomp?
For example, with chemical analyses of water or soil samples including SiO2, Al2O3, Fe2O3, MgO, MnO, etc.
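The eigen-based workflow from the video extends unchanged to any number of columns; only the drawing of a single slope stops making sense. A sketch using the commenter's oxide names with random placeholder values:

```r
set.seed(3)
oxides <- data.frame(
  SiO2  = rnorm(30, 60, 5),
  Al2O3 = rnorm(30, 15, 2),
  Fe2O3 = rnorm(30, 6, 1),
  MgO   = rnorm(30, 3, 0.5),
  MnO   = rnorm(30, 0.1, 0.02)
)
oxides.scaled <- scale(oxides)                    # center and scale each oxide
my.eigen <- eigen(cov(oxides.scaled))             # works for any number of columns
scores <- oxides.scaled %*% my.eigen$vectors      # component scores, one column per PC
round(my.eigen$values / sum(my.eigen$values), 3)  # proportion of variance per PC
```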
I recommend using prcomp instead of princomp. If you look in the help pages for princomp you will see the following: "The calculation is done using eigen on the correlation or covariance matrix, as determined by cor. This is done for compatibility with the S-PLUS result. A preferred method of calculation is to use svd on x, as is done in prcomp." Hence my decision to focus on prcomp.
Sad that I can only click thumbs up once. Great videos!
Hello,
Fantastic explanation of PCA, appreciate it...
Babak
Great video. Could you demo the princomp R function, please?
Thank you, clear and well explained!
"lengthlwd" is not a graphical parameter
Good stuff; recommended to anyone interested in PCA.
Great tutorial. However, I believe line 49 is incomplete. The calculation of the singular value "sd" should include an adjusting factor. The line should be written as follows:
sd = sqrt(my.eigen$values) * sqrt(nrow(my.scaled.classes) - 1)
The covariance approach divides by this factor, so it has to be restored when calculating the singular values. I used the singular value decomposition approach to double-check the singular values and found the discrepancy in the results.
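The relationship this comment describes can be checked directly (the random stand-in data below replaces the video's my.scaled.classes): for a centered n-row matrix X, cov(X) = t(X) %*% X / (n - 1), so each singular value equals sqrt(eigenvalue) * sqrt(n - 1).

```r
set.seed(4)
x <- scale(matrix(rnorm(200), ncol = 2))   # centered, scaled 100 x 2 matrix
my.eigen <- eigen(cov(x))
sd.from.eigen <- sqrt(my.eigen$values) * sqrt(nrow(x) - 1)
sd.from.svd <- svd(x)$d                    # singular values computed directly
all.equal(sd.from.eigen, sd.from.svd)      # TRUE
```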
Very interesting ❤
Hi, just one question: how can I change the x-axis of a DCA? Thanks!
Prof,
I have found that your "pc1.slope" is calculated as "my.eigen$vectors[1,1]/my.eigen$vectors[2,1]". Why is the slope calculated as such? Should the slope be the inverse of the value you used (i.e., my.eigen$vectors[2,1]/my.eigen$vectors[1,1])? Thanks in advance.
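For what it's worth, the two ratios are just reciprocals of each other, and which one is the slope depends on which variable sits on the x-axis. A quick numerical check (random data, not the video's):

```r
set.seed(5)
x <- scale(matrix(rnorm(100), ncol = 2))
my.eigen <- eigen(cov(x))
v <- my.eigen$vectors[, 1]        # first eigenvector: (v[1], v[2])
slope.a <- v[2] / v[1]            # rise over run if variable 1 is on the x-axis
slope.b <- v[1] / v[2]            # the same line if the axes are swapped
all.equal(slope.a, 1 / slope.b)   # TRUE: each is the other's reciprocal
```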
Thanks for the great tutorial, prof.
In part 1, there's a command I didn't follow:
text(xy, 12, 10). I don't understand the origin of the 12 and 10.
Kindly explain.
Otherwise, I don't have many problems following the commands in this part 2!
God bless.
Very good video. Recommended.
That guy has 2542 unread emails... Must be a professor
Is it the same to run prcomp on a correlation matrix for a dataset as on the raw dataset with both scale and center set to TRUE?
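The two agree in the variance sense: eigendecomposing the correlation matrix yields the same per-component variances as prcomp on the raw data with center = TRUE and scale. = TRUE (note that prcomp itself expects raw data, not a correlation matrix). A quick check on random data:

```r
set.seed(6)
x <- matrix(rnorm(150), ncol = 3)               # raw data: 50 observations, 3 variables
pca <- prcomp(x, center = TRUE, scale. = TRUE)  # PCA on the raw data
my.eigen <- eigen(cor(x))                       # eigendecomposition of the correlation matrix
all.equal(pca$sdev^2, my.eigen$values)          # TRUE: same variance per component
```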
This video really helped me. Thank you!!!
Amazing! Can you provide me with the code?
Can we get the scripts please?
Check the "more information" section for the URL
Hi. Where can I access the code? Thanks .
OH MY GOD!!! THANK YOU!!!
Thank you very much
Thanks a lot.