19. Principal Component Analysis

  • Published Aug 27, 2024
  • MIT 18.650 Statistics for Applications, Fall 2016
    View the complete course: ocw.mit.edu/18-...
    Instructor: Philippe Rigollet
    In this lecture, Prof. Rigollet reviewed linear algebra and talked about multivariate statistics.
    License: Creative Commons BY-NC-SA
    More information at ocw.mit.edu/terms
    More courses at ocw.mit.edu

COMMENTS • 78

  • @bowenzheng8580
    @bowenzheng8580 4 years ago +9

    If you find this lecture challenging, it might be because you forgot some basic linear algebra. Don't be discouraged by the somewhat trivial algebraic calculations. The prof does a very good job of explaining the intuition and the statistical foundation for doing PCA. PCA is so commonly used in psychology studies, yet no one in my psych department seems to have a clue where PCA comes from.

  • @gouravkarmakar9606
    @gouravkarmakar9606 4 years ago +7

    extremely helpful with building the basics and then moving forward

  • @linkmaster959
    @linkmaster959 3 years ago +4

    Gave me some insight, thanks. I liked the part about how u^TSu is the variance of the X's along the u direction. Good to know as an alternative viewpoint to singular value decomposition as PCA.
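The identity mentioned here is easy to check numerically. A minimal NumPy sketch (the data and direction are arbitrary illustrations): u^T S u equals the variance of the samples projected onto the unit direction u, under the same 1/n convention.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))        # n = 500 samples in d = 3 dimensions

u = np.array([1.0, 2.0, -1.0])
u /= np.linalg.norm(u)               # unit-length direction

S = np.cov(X, rowvar=False, ddof=0)  # empirical covariance (divide by n)
proj = X @ u                         # projection of every sample onto u

# u^T S u is exactly the variance of the projected samples
assert np.isclose(u @ S @ u, proj.var(ddof=0))
```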

  • @tilohauke9033
    @tilohauke9033 3 years ago +8

    A good opportunity to burn calories would be to wipe the blackboard properly.

  • @user-ff5sx6pg3d
    @user-ff5sx6pg3d 2 months ago

    To the people complaining that the lecture is too hard or that they can't follow: I think you don't have the prerequisites for the course. His lecture illustrates PCA from the statistical perspective; anybody who is serious about data science should know that statistics and linear algebra share a lot of the same ideas from different perspectives.

  • @Mau365PP
    @Mau365PP 3 years ago +3

    Audio starts at 1:15

  • @vegeta23121992
    @vegeta23121992 5 years ago +6

    I was fast-forwarding like crazy until I heard something and was thinking "Damn, not just the first minute without audio". Only to realize my sound was muted.

  • @zhenhuahu9814
    @zhenhuahu9814 5 years ago +3

    H is an n by n matrix, and v is a d-element column vector. H cannot multiply v.

    • @brucereinhold9564
      @brucereinhold9564 5 years ago +3

      He corrected the idea but didn't clean up the board. v is n-dim

    • @danishmahajan6901
      @danishmahajan6901 2 months ago

      But I have a doubt here: n is the number of examples and d is the number of dimensions of the space, so v should be of size (d x 1), so Hv should not be feasible?

    • @danishmahajan6901
      @danishmahajan6901 2 months ago

      Can someone please answer this?
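For what it's worth: as the reply above notes, v here is an n-vector, because H = I_n - (1/n)11^T is the n x n centering matrix, so it acts on vectors of length n (one feature observed across all samples), not on d-vectors. A small NumPy check of its basic properties (the numbers are arbitrary):

```python
import numpy as np

n = 5
H = np.eye(n) - np.ones((n, n)) / n      # centering matrix H = I_n - (1/n) 1 1^T

v = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # an n-vector

# Hv subtracts the mean from every entry of v
assert np.allclose(H @ v, v - v.mean())

# H is an orthogonal projection: symmetric, idempotent, and kills the 1 vector
assert np.allclose(H, H.T)
assert np.allclose(H @ H, H)
assert np.allclose(H @ np.ones(n), 0.0)
```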

  • @user-ff9oq6jg9c
    @user-ff9oq6jg9c 4 years ago +2

    Nice example of seeing matrices from a statistical perspective.

  • @StatelessLiberty
    @StatelessLiberty 4 years ago +4

    Shouldn't the empirical covariance matrix be divided by n-1 and not n?
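Both conventions are used in practice: dividing by n gives the (slightly biased) maximum-likelihood estimator, while dividing by n-1 gives the unbiased one; for large n the difference is negligible. NumPy exposes both through the `ddof` argument (the data below is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=2.0, size=10)

biased   = ((x - x.mean()) ** 2).sum() / len(x)        # divide by n   (MLE)
unbiased = ((x - x.mean()) ** 2).sum() / (len(x) - 1)  # divide by n-1 (unbiased)

# numpy's ddof ("delta degrees of freedom") selects the convention
assert np.isclose(x.var(ddof=0), biased)
assert np.isclose(x.var(ddof=1), unbiased)
```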

  • @shashanksharma1498
    @shashanksharma1498 23 days ago

    Wonderful teacher and everything, but what's with the horrible chalk erasing?

  • @MrPolinesso3
    @MrPolinesso3 11 months ago +2

    Clean the blackboard PROPERLY

  • @yasmineguemouria9099
    @yasmineguemouria9099 3 years ago +12

    49 min in and still hoping he'll get to PCA soon hahaha... great lecture though

  • @danishmahajan6901
    @danishmahajan6901 2 months ago

    Can anyone please help me with how the prof arrived at the final result from the multiplication Hv? I am a little confused by the steps.

  • @rw7154
    @rw7154 2 months ago

    Is he deliberately making the writing hard to read by making the blackboards so poorly erased and specifically writing on those poorly erased boards instead of the nice black ones?

  • @zknolz
    @zknolz 5 years ago +2

    Concept of Eigenvector at 1:02

  • @MsKouider
    @MsKouider 1 year ago

    How to prove that the eigenvectors are the columns of the projection matrix?
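One way to make the question concrete: stack the top-k unit eigenvectors of the covariance matrix S as the columns of a matrix P; then PP^T is the orthogonal projection onto the principal subspace. A sketch (the data and the choice k = 2 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 4))
S = np.cov(X, rowvar=False)

# eigh handles symmetric matrices; eigenvalues come back in ascending order
eigvals, eigvecs = np.linalg.eigh(S)
P = eigvecs[:, -2:]                    # columns = top-2 eigenvectors of S

proj = P @ P.T                         # orthogonal projector onto their span
assert np.allclose(proj @ proj, proj)  # idempotent
assert np.allclose(proj, proj.T)       # symmetric
assert np.allclose(proj @ P, P)        # leaves the eigenvectors fixed
```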

  • @aungkyaw9353
    @aungkyaw9353 4 years ago +2

    Great lecture. Thanks so much, Professor.

  • @pedagil4570
    @pedagil4570 4 years ago +1

    He is pretty good actually

  • @tomasjurica9624
    @tomasjurica9624 5 years ago

    At 1:08:10, shouldn't those lambdas be eigenvalues of Sigma (the covariance matrix)?

  • @sAAgit
    @sAAgit 4 years ago

    47:25 bottom left: How is Var(u^TX) defined? What does the "variance" of a random vector mean? Thank you so much.

    • @zongmianli9072
      @zongmianli9072 4 years ago

      X is a vector, not a matrix, in this case. So u^TX is just a scalar.

    • @quantlfc
      @quantlfc 1 year ago

      Try working backwards from the result U^TxU

  • @wnualplaud2132
    @wnualplaud2132 3 years ago

    Can anyone explain how he got the term in the parentheses at 39:07? Why does Transpose(v)[1] = Transpose([1])v?

    • @asterkleel3409
      @asterkleel3409 3 years ago +1

      They are transposes of each other, and since each is a scalar (and you are going to take the expectation of them anyway), they are the same.

  • @milindyadav7703
    @milindyadav7703 4 years ago

    Can anyone explain how he is multiplying the identity matrix I_d, which is d x d, with the all-ones matrix, which is n x n?

    • @milindyadav7703
      @milindyadav7703 4 years ago

      Never mind... he clears it up around 40:00; it was a gigantic mess.

  • @ahmadmousavi495
    @ahmadmousavi495 4 years ago +1

    Absolutely precious! Excellent in explaining details! Thank you.

  • @user-vy7sl9ki4o
    @user-vy7sl9ki4o 5 years ago

    I don't know why he lets I_d rather than I_n denote the n by n identity matrix from 32:10 on.

  • @thedailyepochs338
    @thedailyepochs338 3 years ago +1

    For such an important concept, you would think MIT would've fixed this issue by now.

  • @huzefaghadiyali5886
    @huzefaghadiyali5886 2 years ago

    The only bothersome thing in this video is the dirty blackboard.

  • @viniciusviena8496
    @viniciusviena8496 3 years ago +2

    wtf, why is he writing on such a messed-up board???

  • @user-qd1cw5yy9m
    @user-qd1cw5yy9m 4 years ago +1

    Absolutely great. If you have trouble getting this, maybe read a book first.

  • @uploaddeasentvideo402
    @uploaddeasentvideo402 1 year ago

    Can you share your slides, please?

    • @mitocw
      @mitocw  1 year ago +2

      The lecture slides are available on MIT OpenCourseWare at: ocw.mit.edu/18-650F16. Best wishes on your studies!

  • @lazywarrior
    @lazywarrior 4 years ago +1

    Horrible. Don't max out your volume; there's nothing until you get a huge surprise at 1:15.
    One of the cameras tracks the lecturer's movement, and it makes me dizzy. The view of the blackboard is enough. Even in 2016, the camera operator at OCW still hadn't mastered how to record good video lectures.

  • @melihaslan9509
    @melihaslan9509 3 years ago +1

    I understand nothing...

  • @pranavchat141
    @pranavchat141 4 years ago +2

    Rather watch one of the lectures on PCA by Prof Ali Ghodsi.

    • @NphiniT
      @NphiniT 4 years ago

      Link please

    • @pranavchat141
      @pranavchat141 4 years ago

      @@NphiniT ua-cam.com/play/PLehuLRPyt1Hy-4ObWBK4Ab0xk97s6imfC.html
      This is full playlist.

  • @mswoonc
    @mswoonc 7 years ago +5

    He makes PCA way more complicated than it should be, wow...

    • @brucereinhold9564
      @brucereinhold9564 5 years ago +4

      Most of what he is doing is introducing the linear operator formalism. The gravy here is that side material, not the minimalist way to explain PCA.

    • @joelwillis2043
      @joelwillis2043 1 year ago

      no

  • @luluW199005
    @luluW199005 5 years ago

    No sound?

    • @mitocw
      @mitocw  5 years ago +1

      It has sound... it's just really low. Sorry!

  • @user-nq2fi4qm6y
    @user-nq2fi4qm6y 5 years ago +16

    He should learn how to teach from Gilbert Strang.

    • @aazz7997
      @aazz7997 4 years ago +10

      Bit rude.

    • @MrJ691
      @MrJ691 1 year ago

      ​@@aazz7997but true

    • @freeeagle6074
      @freeeagle6074 10 months ago

      Lectures from both professors are awesome. It may help to complete the prerequisite courses (18.600, 18.06, 18.100, etc.) before taking this one. It may also help to study the slides before listening to the lectures.

  • @Tyokok
    @Tyokok 5 years ago

    Thanks for the video! Question: can someone explain the difference between big Sigma and S? One is the covariance matrix, one is the sample covariance matrix. Are they not the same thing? Thanks!

    • @MrTdib
      @MrTdib 4 years ago +10

      Big Sigma is for the whole population. S is when selecting a sample from the population. S is an estimate of Sigma. If the sample is big enough, S would approach Sigma, but may not be exactly equal to the population parameter. I hope this is clear!
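That reply can be illustrated by drawing samples from a known Sigma and watching the sample covariance S get closer to it as n grows (the specific Sigma and sample sizes below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])    # true (population) covariance

errors = []
for n in (50, 50_000):
    X = rng.multivariate_normal(mean=[0.0, 0.0], cov=Sigma, size=n)
    S = np.cov(X, rowvar=False)   # sample covariance estimated from n draws
    errors.append(np.linalg.norm(S - Sigma))

# with 1000x more data, the estimate lands far closer to the truth
assert errors[1] < errors[0]
```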

  • @highartalert6927
    @highartalert6927 4 years ago +7

    Man this video is such a torture! :D

  • @srikanthmyskar5610
    @srikanthmyskar5610 1 year ago

    Good lecture, but bad handling by the cameraman.

  • @user-cc7fo5ml1m
    @user-cc7fo5ml1m 4 years ago

    11:00 There is a ghost on the board, in its lower right corner.

  • @Illinoise888
    @Illinoise888 4 years ago +1

    I see why this was made free.

  • @out_aloud
    @out_aloud 3 years ago +1

    He doesn't even care to erase the board properly 😅😅😅😅

  • @MeAlireza
    @MeAlireza 1 year ago +1

    Too much unnecessary complication. In fact this is not only unnecessarily complicated but also confusing and counterproductive. However, I should still say that the other lecturers up to this one were better in terms of presentation.

  • @djangoworldwide7925
    @djangoworldwide7925 1 year ago

    my computer is so smart

  • @jivillain
    @jivillain 4 years ago +2

    This guy is so cute

  • @chaitanya.teja.g
    @chaitanya.teja.g 5 years ago +2

    Is this really MIT?

  • @WowIan2012
    @WowIan2012 4 years ago +5

    This is really not the quality I expected from MIT; pretty sloppy instructor.

  • @danielketterer9683
    @danielketterer9683 4 years ago

    Dude needs better erasers

  • @quantlfc
    @quantlfc 1 year ago

    Insane :)

  • @georgeivanchyk9376
    @georgeivanchyk9376 4 years ago

    1:13:51 v1 ZULUL

  • @carloszg8398
    @carloszg8398 4 years ago

    this guy is a complete mess...

  • @AndresSalas
    @AndresSalas 7 years ago +2

    He looks rather insecure.

  • @zbynekba
    @zbynekba 4 years ago

    Why have you published such a mess? Shame on you!

  • @looploop6612
    @looploop6612 5 years ago

    Terrible

  • @2357y1113
    @2357y1113 6 years ago +8

    Audio starts at 1:14