PCA 5: finding eigenvalues and eigenvectors

  • Published 18 Jan 2014
  • Full lecture: bit.ly/PCA-alg
    To find the eigenvectors, we first solve the determinant equation for the eigenvalues. We then solve for each eigenvector by plugging the corresponding eigenvalue into the linear system. Remember that eigenvectors must have unit length. (A short code sketch of these steps follows this description.)
  • Science & Technology
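
The description compresses the whole procedure: solve det(C − λI) = 0 for the eigenvalues λ, then for each eigenvalue solve the linear system (C − λI)e = 0, and finally divide each solution by its Euclidean length so it has unit length. A minimal numpy sketch of those steps, using a made-up 2×2 covariance matrix (the numbers are illustrative, not taken from the lecture):

    import numpy as np

    # Hypothetical 2x2 covariance matrix (illustrative values only).
    C = np.array([[1.0, 0.8],
                  [0.8, 1.0]])

    # Step 1: eigenvalues are the roots of det(C - lambda*I) = 0.
    # For a 2x2 matrix this is lambda^2 - trace(C)*lambda + det(C) = 0.
    tr, det = np.trace(C), np.linalg.det(C)
    disc = np.sqrt(tr**2 - 4 * det)          # always >= 0 for a symmetric matrix
    eigenvalues = [(tr + disc) / 2, (tr - disc) / 2]

    # Step 2: for each eigenvalue, solve (C - lambda*I) e = 0 and rescale
    # to unit length by dividing by the Euclidean norm.
    eigenvectors = []
    for lam in eigenvalues:
        a, b = (C - lam * np.eye(2))[0]      # first row of the singular system
        e = np.array([-b, a])                # satisfies a*e1 + b*e2 = 0 (row assumed non-zero)
        eigenvectors.append(e / np.linalg.norm(e))

    # Cross-check with the library routine (columns of vecs are unit eigenvectors).
    vals, vecs = np.linalg.eigh(C)
    print(eigenvalues, eigenvectors)
    print(vals, vecs)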

COMMENTS • 36

  • @evawilhelmsson3469
    @evawilhelmsson3469 8 years ago +8

    I'm so happy and proud. Just solved my first PC by hand, with help from this video. Thanks!!

  • @pavelkonovalov8931
    @pavelkonovalov8931 7 years ago

    Thank you very much! Your video helped me get the idea behind PCA and also the philosophy behind the calculation.

  • @tiarcandrawardaya7610
    @tiarcandrawardaya7610 3 years ago

    Thank you very much. I've been looking for that Euclidean length as a divisor for what feels like forever.

  • @liwang248
    @liwang248 6 years ago +3

    Hi Victor,
    This is the best PCA explainer video I have found online. Could you also provide a link to the PowerPoint used in your video?
    Thank you very much!
    Li Wang

  • @rahelrapazzini7229
    @rahelrapazzini7229 5 years ago

    Thank you so much for this very clear and step by step explanation! Saved me a lot of time and helped me prepare for my exam!

  • @Nissearne12
    @Nissearne12 6 years ago

    Thanks, very helpful.
    I have also seen the "power method", but that seems to find only the first (dominant, strongest) eigenvector. Is there a way to get several eigenvectors (in order from the dominant one down) using the power method?
    Best regards,
    Olle Welin
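
One standard way to get more than the dominant eigenvector out of the power method is deflation: after converging to the dominant eigenpair of a symmetric matrix, subtract that rank-one component and run the power method again for the next one. A rough sketch under that assumption (the function names and example matrix below are made up):

    import numpy as np

    def power_iteration(A, num_iters=500):
        """Dominant eigenvalue/eigenvector of a symmetric matrix A."""
        v = np.random.default_rng(0).standard_normal(A.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(num_iters):
            v = A @ v                  # repeatedly apply the matrix...
            v /= np.linalg.norm(v)     # ...and renormalise to unit length
        return v @ A @ v, v            # Rayleigh quotient, unit eigenvector

    def top_eigenpairs(A, k):
        """k largest eigenpairs of a symmetric PSD matrix via Hotelling deflation."""
        A = A.astype(float).copy()
        pairs = []
        for _ in range(k):
            lam, v = power_iteration(A)
            pairs.append((lam, v))
            A -= lam * np.outer(v, v)  # deflate: remove the component just found
        return pairs

    # Example on a hypothetical 3x3 covariance-like matrix.
    C = np.array([[2.0, 0.6, 0.2],
                  [0.6, 1.0, 0.3],
                  [0.2, 0.3, 0.5]])
    for lam, v in top_eigenpairs(C, 3):
        print(round(lam, 4), np.round(v, 4))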

  • @moganraj2833
    @moganraj2833 5 years ago

    Really, thanks prof, it's very useful for my work. Thank you very much.

  • @rishijoshi1977
    @rishijoshi1977 5 years ago +2

    Thank you so much for such a great explanation. I was losing my mind over this.

  • @cookie920211
    @cookie920211 7 years ago

    Thanks very much. Your video is very inspiring!!!

  • @jiashenglai5509
    @jiashenglai5509 6 years ago

    This is simply excellent.

  • @yeah6732
    @yeah6732 4 months ago

    Great tutorial! But why are the slopes of the two eigenvectors expected to be the same?!

  • @ogunyemisegun4909
    @ogunyemisegun4909 2 years ago

    I don't seem to understand the final step. What did you divide by to get those final PCs?

  • @manuarteteco6153
    @manuarteteco6153 3 years ago

    thanks, this was really helpful!

  • @zamanmakan2729
    @zamanmakan2729 2 years ago

    Is using a Lagrange multiplier an alternative or just another step?

  • @mmuuuuhh
    @mmuuuuhh 9 years ago +1

    Doing SVD (by calling the routine in Matlab ^^) is just a quick alternative algorithm for the calculations you presented for finding the eigenvectors, isn't it?
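
The connection behind that comment, spelled out: for a centered data matrix X with n rows, the eigenvectors of the covariance matrix XᵀX/(n − 1) are the right singular vectors of X, and the eigenvalues are the squared singular values divided by (n − 1). A small numpy sketch of that equivalence (the data here is randomly generated for illustration):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 2)) @ np.array([[2.0, 0.0],
                                                  [1.2, 0.5]])   # made-up data
    Xc = X - X.mean(axis=0)              # center the data first
    n = Xc.shape[0]

    # Route 1: eigendecomposition of the covariance matrix (as in the video).
    C = Xc.T @ Xc / (n - 1)
    eigvals, eigvecs = np.linalg.eigh(C)

    # Route 2: SVD of the centered data matrix.
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Same spectrum (up to ordering); the rows of Vt match the columns of
    # eigvecs up to sign.
    print(np.sort(eigvals))
    print(np.sort(s**2 / (n - 1)))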

  • @jaredsalinasgonzalez9899
    @jaredsalinasgonzalez9899 3 years ago

    Amazing video!!!

  • @MehmetAltuntasmehmetcc
    @MehmetAltuntasmehmetcc 6 years ago

    Dude you totally rock!

  • @affectionlifeaffliction
    @affectionlifeaffliction 6 years ago

    What is the Euclidean distance? I did not get how he solved for the final eigenvector when e1 = 1 and e2 = 1.

    • @JoaoVitorBRgomes
      @JoaoVitorBRgomes 3 years ago

      Euclidean distance can be the distance of a point from the origin (which gives the length of the vector) or the distance between two points. Here it is the length of the vector. He wants to rescale it to unit length, so he divides by that length.

  • @saurabh75prakash
    @saurabh75prakash 6 years ago

    Excellent tutorial on PCA and eigenvalues/eigenvectors. How does a d×d matrix have d eigenvalues/eigenvectors?

    • @JoaoVitorBRgomes
      @JoaoVitorBRgomes 3 years ago +1

      It is because the number of eigenvalues/eigenvectors equals the number of columns of a square matrix.
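
Concretely, for a d×d symmetric matrix numpy returns d eigenvalues and a d×d array whose columns are the corresponding unit-length eigenvectors (the matrix below is just an illustration):

    import numpy as np

    A = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.0, 0.3],
                  [0.1, 0.3, 0.7]])       # d = 3
    vals, vecs = np.linalg.eigh(A)
    print(vals.shape)   # (3,)   -> d eigenvalues
    print(vecs.shape)   # (3, 3) -> one unit-length eigenvector per eigenvalue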

  • @aman6487
    @aman6487 2 years ago

    What if we get two independent eigenvectors for one eigenvalue?

  • @abidahmed4182
    @abidahmed4182 6 years ago +1

    How did you find the Euclidean distance?

    • @scares009
      @scares009 5 years ago +2

      You would take the coordinates of the vector, square them, and add them together, then take the square root. That gives you its distance from the origin, the actual length you would get by measuring the vector with a ruler. :)
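
In code, that recipe (square the coordinates, add them, take the square root, then divide) looks like this, using the (2.2, 1) vector discussed elsewhere in these comments:

    import numpy as np

    e = np.array([2.2, 1.0])
    length = np.sqrt(np.sum(e**2))   # Euclidean length: sqrt(2.2^2 + 1^2) ≈ 2.4166
    unit_e = e / length              # rescale to unit length
    print(length, unit_e)            # ≈ 2.4166, [0.91 0.41]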

  • @chan-holee7031
    @chan-holee7031 6 years ago

    Is the first symmetric matrix the covariance matrix (of the raw data)??

    • @JoaoVitorBRgomes
      @JoaoVitorBRgomes 3 years ago

      Yes, he uses the covariance matrix to find the eigenvalues and eigenvectors.

  • @FarhanAli-zj3rb
    @FarhanAli-zj3rb 5 years ago

    Thank you

  • @arulpraveen
    @arulpraveen 9 years ago

    But the point (2.2, 1) doesn't fit the second equation and so cannot be considered a solution. In that case, what do we do??

  • @AhmadTalkss
    @AhmadTalkss 3 years ago

    At 4:00 how did you go from (2.2, 1) to (0.91, 0.41)?

    • @JoaoVitorBRgomes
      @JoaoVitorBRgomes 3 years ago

      He divides by the Euclidean distance.

    • @JoaoVitorBRgomes
      @JoaoVitorBRgomes 3 years ago +3

      d = sqrt(2.2^2 + 1^2) ≈ 2.4166; then divide: 2.2 / 2.4166 ≈ 0.91 (and 1 / 2.4166 ≈ 0.41).

  • @dominicdannies7482
    @dominicdannies7482 6 years ago

    Can somebody explain 3:03?

  • @dmitriidatsenko
    @dmitriidatsenko 7 years ago +2

    0.8*0.8 = 0.64

  • @aiinabox1260
    @aiinabox1260 1 year ago

    Fantastic job... one-stop shop!