Principal Components Analysis Using R - P2

  • Published 2 Nov 2024

COMMENTS • 35

  • @danielalejandrocordovadela657 • 3 years ago

    Amazing video, I was not sure about the relationship between scores and loading vectors, and finally, I found this video which corroborates my idea. Thanks a lot

  • @sternblume • 11 years ago

    This is my very first introduction to PCA, which I need to learn by myself for my thesis. I found it extremely useful!! Very well explained, thank you so much :-)

  • @timerwentoff • 11 years ago

    Thank you. Certainly one of the better tutorials on PCA here.

  • @toletasah • 10 years ago +1

    Very useful breakdowns, parts I and II; rare and clear explanations.

  • @ofdomejean • 9 years ago

    Very good explanation. Now I understand PCA. Thank you!!!

  • @Actanonverba01 • 5 years ago

    Excellent breakdown of PCA, thanks.
    Are you going to be doing any more?

  • @careyduryea6010 • 9 years ago

    Thanks Steve!!! I love that you use R!

  • @alkobut6642 • 9 years ago

    Great video! My question concerns the slope of the PCs. For two variables the slope is calculated thus: pc1.slope = my.eigen$vectors[1,1]/my.eigen$vectors[2,1]. How do you calculate the slope when there are more than two variables and therefore more eigenvectors (for instance, three or four)? Thanks!
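
    One way to think about this (a sketch using made-up three-variable data, not the video's class dataset): with more than two variables, each eigenvector is a direction in p-dimensional space, so there is no single slope. A slope only exists for the projection of that direction onto a chosen pair of plotted variables, taken as (y-axis loading)/(x-axis loading).

```r
# Hypothetical 3-variable example: with p > 2 variables, PC1 is a direction
# in p-space, so a "slope" only exists per pair of plotted axes.
set.seed(42)
x <- scale(matrix(rnorm(300), ncol = 3))
my.eigen <- eigen(cov(x))

# Slope of PC1 in the plane of variable 2 (y-axis) vs variable 1 (x-axis):
slope.21 <- my.eigen$vectors[2, 1] / my.eigen$vectors[1, 1]
# Slope of PC1 in the plane of variable 3 (y-axis) vs variable 1 (x-axis):
slope.31 <- my.eigen$vectors[3, 1] / my.eigen$vectors[1, 1]
```

    The ratio flips if the two variables swap axes, which is why the 2-variable formula can be written either way depending on the plot's orientation.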

  • @meshackamimo1945 • 10 years ago

    Prof,
    I also find your examples quite easy to follow...
    Kindly do a talk on Markov chain Monte Carlo (MCMC) in R...
    And if you oblige, the Kalman filter... in R.
    Once again, thank you for the intuitive/ingenious postings on R.
    God bless you.

  • @PhucNguyen-fx1vg • 8 years ago

    Thank you for the great tutorial! So if there were data on characteristics of these students, could PCA be used to find out which characteristics contribute most to PC2?
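
    Yes: in prcomp output the rotation matrix holds the loadings, and variables with the largest absolute loadings on PC2 contribute most to it. A quick sketch using R's built-in USArrests data as a stand-in for a table of student characteristics:

```r
# Which variables contribute most to PC2? Inspect the loadings (rotation).
# USArrests (built-in) stands in for a student-characteristics table.
pca <- prcomp(USArrests, scale. = TRUE)
pc2.contrib <- sort(abs(pca$rotation[, "PC2"]), decreasing = TRUE)
print(pc2.contrib)   # largest absolute loading = biggest contributor to PC2
```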

  • @ysiquita65 • 12 years ago

    Thank you for all the videos!!! Do you think you could upload a video on LDA? It would be so helpful!!!!

  • @ncamposa • 12 years ago

    Thanks for the video.
    My question is: how do you do a PCA when more than two variables are involved, without using prcomp or princomp?
    For example, using chemical analyses of water or soil samples including SiO2, Al2O3, Fe2O3, MgO, MnO, etc.
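
    The eigen-decomposition approach from the video scales directly to any number of columns: standardize the variables, take eigen() of the covariance matrix of the scaled data (which equals the correlation matrix), and multiply to get scores. A sketch with built-in data standing in for a table of oxide concentrations:

```r
# Manual PCA with eigen(), no prcomp/princomp; works for any number of
# columns. USArrests stands in for a table like SiO2, Al2O3, Fe2O3, ...
x <- scale(USArrests)                 # center and scale each variable
e <- eigen(cov(x))                    # cov of scaled data = correlation matrix
scores <- x %*% e$vectors             # one score column per component
var.explained <- e$values / sum(e$values)
round(var.explained, 3)               # proportion of variance per component
```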

  • @StevePittard • 12 years ago

    I recommend using prcomp instead of princomp - If you look in the help pages for princomp you will see the following: "The calculation is done using eigen on the correlation or covariance matrix, as determined by cor. This is done for compatibility with the S-PLUS result. A preferred method of calculation is to use svd on x, as is done in prcomp." Hence my decision to focus on prcomp.
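
    For anyone who wants to check this, the SVD-based prcomp and the eigen-on-correlation-matrix route give matching component variances, and matching loadings up to an arbitrary sign flip per component. A quick sketch with built-in data:

```r
# prcomp (SVD-based) and eigen on the correlation matrix agree on the
# component variances; eigenvector signs are arbitrary and may differ.
p <- prcomp(USArrests, scale. = TRUE)
e <- eigen(cor(USArrests))
all.equal(p$sdev^2, e$values)               # variances match
all.equal(abs(unclass(p$rotation)), abs(e$vectors),
          check.attributes = FALSE)         # loadings match up to sign
```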

  • @JosefHabdank • 10 years ago

    Sad that I can only click thumbs up once. Great videos!

  • @bubiloo11 • 11 years ago

    Hello,
    Fantastic explanation of PCA, appreciate it...
    Babak

  • @RefUser • 12 years ago

    Great video. Could you demo the princomp R function please?

  • @ehsannajafi635 • 7 years ago

    Thank you, clear and well explained!

  • @alejandrogonzaleztrevino8990 • 10 years ago

    "lengthlwd" is not a graphical parameter

  • @ssniegula81 • 12 years ago

    Good stuff; recommended to anyone interested in PCA.

  • @corcho1956 • 9 years ago

    Great tutorial. However, I believe line 49 is incomplete. The calculation of the singular value "sd" should include an adjusting factor. The line should be written as follows:
    sd = sqrt(my.eigen$values)*sqrt(nrow(my.scaled.classes)-1). The Covariance approach uses this factor, which has to be removed when calculating the singular values.
    I used the Singular Value Decomposition approach to double check the singular values and found the discrepancy in the results.
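
    This relationship is easy to verify: for a centered and scaled matrix X with n rows, the singular values satisfy s_i = sqrt(lambda_i * (n - 1)), where lambda_i are the eigenvalues of cov(X). A sketch with built-in data standing in for my.scaled.classes:

```r
# Verify s_i = sqrt(lambda_i * (n - 1)) on a stand-in for my.scaled.classes.
x  <- scale(USArrests)
n  <- nrow(x)
ev <- eigen(cov(x))$values         # eigenvalues of the covariance matrix
sv <- svd(x)$d                     # singular values of the data matrix
all.equal(sv, sqrt(ev * (n - 1))) # TRUE: the sqrt(n - 1) adjusting factor
```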

  • @abdourakibmama4274 • 6 months ago

    Very interesting ❤

  • @m.noelserra5551 • 8 years ago

    Hi, just one question: how can I change the x axis of a DCA? Thanks!

  • @shaoxinluan3391 • 7 years ago

    Prof,
    I have found that your "pc1.slope" is calculated as "my.eigen$vectors[1,1]/my.eigen$vectors[2,1]". Why is the slope calculated as such? Should the slope be the inverse of the value you used (i.e., my.eigen$vectors[2,1]/my.eigen$vectors[1,1])? Thanks in advance.
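
    Which ratio is correct depends on which variable sits on which plot axis: the slope dy/dx of the PC1 line is (loading of the y-axis variable) / (loading of the x-axis variable), so the video's ratio corresponds to plotting the first column's variable on the y-axis. A sketch using two columns of built-in data:

```r
# Slope of the PC1 line is dy/dx = v[y, 1] / v[x, 1], where v's rows follow
# the column order of the data. With Murder on x and Assault on y:
v <- eigen(cor(USArrests[, c("Murder", "Assault")]))$vectors
slope <- v[2, 1] / v[1, 1]   # Assault loading over Murder loading
# Swapping which variable is plotted on x inverts this ratio.
```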

  • @meshackamimo1945 • 10 years ago

    Thanks for the great tutorial, prof.
    In part 1, there's a command I didn't follow...
    text(xy, 12, 10)... I don't understand the origin of 12 and 10....
    Kindly explain.
    Otherwise, I don't have many problems following the commands in this part 2!
    God bless.

  • @Mrnka218 • 12 years ago

    Very good video. Recommended.

  • @JermanJackass • 9 years ago +5

    That guy has 2542 unread emails... Must be a professor

  • @therimalaya • 11 years ago

    Is it the same to run prcomp on a "correlation matrix for a dataset" as on the "raw dataset with scale and center both TRUE"?
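
    Broadly yes for the variances and loadings, with one caveat: prcomp expects a data matrix, so passing it a correlation matrix directly is not meaningful; princomp is the function that accepts one via its covmat argument. A sketch comparing the two routes:

```r
# Route 1: prcomp on the raw data with centering and scaling.
p <- prcomp(USArrests, center = TRUE, scale. = TRUE)
# Route 2: princomp given the correlation matrix directly via covmat=.
q <- princomp(covmat = cor(USArrests))
# Same component variances (both report sdev; square and compare):
all.equal(unname(p$sdev^2), unname(q$sdev^2))
# Note: with covmat= there are no scores, since no data were supplied.
```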

  • @ysiquita65 • 12 years ago

    This video really helped me... thank you!!!

  • @abdullahallah6231 • 3 years ago

    Amazing! Can you provide me the code?

  • @abdourakibmama4274 • 6 months ago

    Can we get the scripts please?

  • @StevePittard • 11 years ago +1

    Check the "more information" section for the URL

  • @ferniize • 11 years ago

    Hi. Where can I access the code? Thanks.

  • @jesbuddy07 • 10 years ago

    OH MY GOD!!! THANK YOU!!!

  • @DoiLamung • 8 years ago

    Thank you very much.

  • @eak092 • 11 years ago

    Thanks a lot.