Singular Value Decomposition (SVD): Matrix Approximation

  • Published Sep 29, 2024

COMMENTS • 211

  • @greenpumpkin172
    @greenpumpkin172 4 years ago +209

    This channel is so underrated; your explanations and overall video presentation are really good!

    • @dombowombo3076
      @dombowombo3076 4 years ago +3

      Don't know why you think it's underrated...
      Everyone who is watching these videos knows how great they are.

  • @AkshatJha
    @AkshatJha 1 year ago +1

    What a wonderful way to simplify a complicated topic such as SVD--I wish more people in academia emulated your way of teaching, Mr. Brunton.

  • @NickKingIII
    @NickKingIII 4 years ago +4

    Wonderful explanation, clear and easy to understand. Thank you very much

  • @liuhuoji
    @liuhuoji 3 years ago

    Love the video; well explained and aesthetically pleasing.

  • @hugeride
    @hugeride 3 years ago

    Just an amazing explanation.

  • @maipyaar
    @maipyaar 3 years ago

    Thank you for this video series.

  • @eeeeee132
    @eeeeee132 3 years ago

    When you compare the vector x with the water flow, it is very clear that 'U' contains the eigenflows and 'V' describes how the eigenflows evolve in time. Likewise, for the faces it is very clear that 'U' contains the eigenfaces, but I wonder what the 'V' vectors would be?

  • @warrior_1309
    @warrior_1309 3 years ago

    I wish I had been his student during my college days; my grades would have improved.

  • @amirhamzekhoshnam4755
    @amirhamzekhoshnam4755 5 months ago

    What if we change our norm? Does this remain the best approximation or not?

  • @Dominus_Ryder
    @Dominus_Ryder 3 years ago

    I'm confused by the V-transpose notation. Does the SVD of X give U*Sigma*V, so that you have to transpose V yourself afterwards, or does the SVD of X give you U*Sigma*V-transpose, such that V is already transposed for you when the calculation completes?
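
    For what it's worth, conventions differ by library here: MATLAB's svd returns V itself (you reconstruct with U*S*V'), while NumPy's np.linalg.svd hands back V already transposed, so no manual transpose is needed. A minimal NumPy sketch with random placeholder data:

        import numpy as np

        X = np.random.rand(5, 3)
        U, S, Vt = np.linalg.svd(X, full_matrices=False)  # Vt is already V-transpose

        # Rebuild X directly; no extra transpose of Vt required
        X_hat = U @ np.diag(S) @ Vt
        print(np.allclose(X, X_hat))  # True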

  • @sohanaggarwal8770
    @sohanaggarwal8770 4 years ago

    Are the u vectors organized in descending order?

  • @ayushsaraswat866
    @ayushsaraswat866 4 years ago +132

    This series is by far the best explanation of SVD that I have seen.

  • @ris2043
    @ris2043 4 years ago +47

    The best explanation of SVD. Your videos are excellent. Thank you very much!

  • @AdityaDiwakarVex
    @AdityaDiwakarVex 4 years ago +35

    SVD was at the very end of my college LinAlg class, so I never got a very good understanding of it before the final. This is truly amazing; you say "thank you" at the end of every video, but it should be us saying it to you. Keep doing your thing! I'm loving it.

  • @smilebig3884
    @smilebig3884 4 years ago +38

    The best thing about your lectures is that you do the coding implementation along with the heavy math. That makes you different from the rest of the traditional instructors. Kudos to you!!!

  • @alek282
    @alek282 4 years ago +12

    Amazing lectures, immediately bought the book, thank you!

    • @LusidDreaming
      @LusidDreaming 3 years ago

      The book is great, but relatively terse for someone like me who needs to brush up on his linear algebra. These video lectures are an excellent complement to the book and really help drive home the concepts.

  • @eugeneteoh1
    @eugeneteoh1 4 years ago +9

    I just finished my exam and I see this lmaoo 😭😭

  • @sachingarg4385
    @sachingarg4385 4 years ago +1

    Part 2 of the Eckart-Young theorem is that this video is the best explanation of the theorem's part 1 :P

  • @tobyleung96
    @tobyleung96 3 years ago +1

    @14:52
    No Steve, thank YOU!

  • @kansasmypie6466
    @kansasmypie6466 3 years ago +5

    Can you do a series on QR decomposition as well? This is so useful!

  • @johnberry5275
    @johnberry5275 3 years ago +1

    _"I've been assuming the whole time in these lectures that_ *n* _is_ *much much larger* _than_ *m,* _meaning I have many more_ *entries* _in each column than I have_ *columns."*
    [question]: To agree with his choice of wording, didn't he actually mean to say that _m_ (the number of rows; you could also say the number of entries down each column) is much much larger than _n_ (the count of columns)? I think he got his _m_ and _n_ mixed up when he wrote *n>>m* . I think, instead, he meant to write *m>>n* .

    • @VainBrilliance
      @VainBrilliance 3 years ago

      n -- is the number of entries in each column
      m -- is the number of columns (can be interpreted as number of samples/individual observations)
      so, I think in your interpretations, after the [question], you have swapped m and n. m is the number of columns, not the number of rows

    • @JJChameleon
      @JJChameleon 2 years ago

      I believe in this video he makes m = number of columns and n = number of rows. It is standard to have it the other way around, and this confused me as well.

  • @wackojacko1997
    @wackojacko1997 1 year ago +1

    Not an engineer/student, but I'm watching this to get a better understanding of PCA in statistics. I'm going to check the book and research this, but my only (nit-picky) complaint is trying to tell the difference between "M" and "N" when Steve speaks, which I know refer to the number of rows or columns of the matrix. But really, this was great, and I am thankful that this is something I can study on my own. Much appreciated.

  • @omniscienceisdead8837
    @omniscienceisdead8837 2 years ago +3

    You explain math in such a way as not to make someone feel stupid, but to make them feel like they're taking steps toward understanding a larger concept, and the tools they need are the ones we already have. Big ups.

  • @athanasiospliousis2654
    @athanasiospliousis2654 4 years ago +4

    Very, very nice explanation and presentation. Thank you!

  • @Multibjarne
    @Multibjarne 2 years ago +2

    Explanations like this, for a dummy like me, make my life so much easier

  • @douglasespindola5185
    @douglasespindola5185 2 years ago +3

    Gosh, what a class! As Mr. Ayush said, this was indeed by far the best SVD explanation I've seen. You've made such a complicated subject way more approachable! I wish you all the best, Steve! Greetings from Brazil!

    • @Eigensteve
      @Eigensteve  2 years ago +2

      Thanks so much! That is great to hear!!

  • @skilambi
    @skilambi 4 years ago +3

    Please keep making these high quality lectures. They are some of the best I have seen on YouTube, and that goes a long way because I watch a lot of lectures online.

  • @saitaro
    @saitaro 4 years ago +3

    It was a pleasure to watch. You should do more educational videos, Mr. Brunton.

  • @andrezabona3518
    @andrezabona3518 3 years ago +1

    What about m > n? (For example, what happens if my dataset is composed of 5000 images of 32x32?)

  • @carlossouza5151
    @carlossouza5151 4 years ago +3

    You are a very very gifted teacher! Thank you for sharing this! :)

  • @mourenlee5403
    @mourenlee5403 4 years ago +6

    How can he write in mirror image? Amazing!

    • @NG-lx1kx
      @NG-lx1kx 4 years ago

      I also feel amazed by that

    • @parrychen4738
      @parrychen4738 4 years ago

      @@NG-lx1kx Remember that image processing is one of the applications of SVD. Now think, how does image processing relate to this video?

    • @douglashurd4356
      @douglashurd4356 3 years ago

      Post processing he flips left-right. Notice which hand has the wedding ring.

  • @Phi1618033
    @Phi1618033 1 year ago

    This all sounded like gibberish until I started to think of the first term of the expansion (Sigma1*U1*V1T) as the (strongest) "signal" and the rest of the terms as ever decreasing amounts of "signal" and ever increasing amounts of "noise". So the last term (Sigmam*Um*VmT) is essentially all background "noise" in the data. Thinking of it that way, it all makes perfect sense.

  • @MrFurano
    @MrFurano 2 years ago

    In case anyone didn't know, there's an entire playlist for SVD: ua-cam.com/video/gXbThCXjZFM/v-deo.html

  • @dewinmoonl
    @dewinmoonl 3 years ago

    others: "wow, great explanation, thanks for the lesson"
    me: how is this man writing perfectly backwards onto thin air?
    Mr. Brunton: I'm glad the green screen, glass board, and $1000 of Adobe products really paid off

  • @rajkundaliya7796
    @rajkundaliya7796 2 years ago +1

    It doesn't get better than this. I am so thankful to you. I don't know how to repay this help.... And yes, this is a highly underrated channel

  • @4096fb
    @4096fb 1 year ago

    Thank you for this great explanation. I just lost you on one point: why does this matrix multiplication equal sig1*U1*V1T + sig2*U2*V2T + ... + sigm*Um*VmT?
    Can someone explain how this completes the entire matrix multiplication? I somehow get lost in the columns of U and rows of V.
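
    A sketch of the standard column-times-row argument (general linear algebra, not specific to this video): because Sigma is diagonal, it scales the k-th column of U by sigma_k, and that scaled column only ever multiplies the k-th row of V^T. In LaTeX notation:

        U \Sigma V^T
          = \begin{bmatrix} u_1 & \cdots & u_m \end{bmatrix}
            \,\mathrm{diag}(\sigma_1, \ldots, \sigma_m)\,
            \begin{bmatrix} v_1^T \\ \vdots \\ v_m^T \end{bmatrix}
          = \sum_{k=1}^{m} \sigma_k \, u_k v_k^T

    Each term sigma_k u_k v_k^T is a rank-1 matrix of the same shape as X, and summing all m of them reproduces every entry of the full product.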

  • @videofountain
    @videofountain 4 years ago

    ua-cam.com/video/xy3QyyhiuY4/v-deo.html I am not sure what .. [increasingly improves] means. The singular values were stated to be decreasing. I was expecting something such as ..[[improves]].

  • @kevinle6122
    @kevinle6122 2 years ago

    @2:39 "the first column sigma1 u1 only multiplies the v1 transpose column, the 2nd column only multiplies the v2 transpose column and so forth" did he mean 'the first column sigma1 u1 only multiplies the v1 transpose ROW', like his hand motion shows? When I multiply the matrices by hand, it seems to be sigma1*u1 times the v1 transpose row.

  • @momoh6696
    @momoh6696 10 months ago

    Hello, do we pass U, S, or VT as input into the SINDy algorithm, or do we pass in the approximation of X obtained using U, S, and VT?

  • @bernardoolisan1010
    @bernardoolisan1010 2 years ago

    Hey, nice tutorial. So are you saying that if we pass an incomplete EDM as input, we can find the complete EDM with this approximation?

  • @momoh6696
    @momoh6696 10 months ago

    Hello once again (sorry, this will be the last one, I think). Is there somewhere I can get some pictures like the waveform you showed several timesteps of (to be processed by SINDy, I think), along with the PDE of the waveform? I want to use images with a known PDE to see if my compressed images will give something the same or similar :)

  • @非常大的圆白菜
    @非常大的圆白菜 2 years ago

    I don't understand why we are multiplying the columns of U by the rows of V... shouldn't it be the opposite?

  • @maciejmikulski7287
    @maciejmikulski7287 1 year ago

    The assumption n >> m is contrary to what we quite often have in data science. In many problems, the number of samples (here m) is bigger than the number of features (here n). In such a case, do we just take the transpose and keep going the same way? Or are there some additional considerations (apart from swapping the interpretations of the eigenvectors, etc.)?

  • @murphp151
    @murphp151 2 years ago

    This is so excellent. I just have one question: you keep saying that you will multiply a column of U with a row of V. But matrix multiplication is row by column. So how are you doing it the other way around?
    What am I missing?

  • @fabou7486
    @fabou7486 3 years ago +1

    One of the best channels I have ever followed, appreciate it so much!

  • @eveninrose
    @eveninrose 4 years ago +3

    Just started watching this playlist, excellent explanations and a great way to promote while sharing knowledge; bought your book and can't wait to revisit w/the text!

  • @LyndaCorliss
    @LyndaCorliss 1 year ago

    Top rate education, I'm happily learning a lot.
    Nicely done. Thank you

  • @momoh6696
    @momoh6696 10 months ago

    Hello, I can't find a way to compress all my images into one X matrix; the only examples I've seen do SVD on one image at a time. How do I make one X matrix from many images, please?
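
    One common approach, sketched below with placeholder data (the image list and the 32x32 size are assumptions for illustration, not from the video): flatten each image into a column vector and stack the columns side by side, so X is n x m with n pixels per image and m images.

        import numpy as np

        # Placeholder: 100 grayscale images, all the same shape (32 x 32)
        images = [np.random.rand(32, 32) for _ in range(100)]

        # Each flattened image becomes one column of X
        X = np.column_stack([im.ravel() for im in images])
        print(X.shape)  # (1024, 100): n = 1024 pixels, m = 100 images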

  • @raven5165
    @raven5165 11 months ago

    In the example, m is the number of images you have and n is their pixel values. So m doesn't have to be enough to represent your data.
    It is like saying: if you have 2 photos, 2 dimensions are enough to represent the image features.

  • @takumimatsuzawa1774
    @takumimatsuzawa1774 3 years ago

    I’m coding up the exercises in his book in Python but somebody must have done this before. Does anybody know?

  •  3 years ago

    Shouldn't the u vectors be row vectors? How do you multiply a column vector with another column vector in the other matrix?

  • @mkhex87
    @mkhex87 2 years ago

    To the point. Answers all the important questions. I mean, you should come to the party knowing some linear algebra, but it's great for the intermediate level.

  • @YasserHawass
    @YasserHawass 1 year ago

    It took me some time to accept the conclusion at 1:22, since, if I understood you right, we have an n-dimensional vector space in which our data can live, so it should be okay to use all n vectors of U as the new basis, unless we want dimensionality reduction and not just a matrix decomposition. Or am I just missing something?

  • @yutengfeng377
    @yutengfeng377 3 years ago

    For the last point, (U~)*transpose(U~) is still an identity matrix, but it is an n by n matrix instead of an r by r matrix. Am I right?
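
    Not quite, if I read the question right: the tall economy U~ satisfies transpose(U~) * U~ = I, but U~ * transpose(U~) is an n x n orthogonal projection onto the column space, not the identity. A quick numerical check with random data:

        import numpy as np

        X = np.random.rand(10, 3)  # n = 10 >> m = 3
        U, S, Vt = np.linalg.svd(X, full_matrices=False)  # economy: U is 10 x 3

        print(np.allclose(U.T @ U, np.eye(3)))  # True: the 3 x 3 identity
        P = U @ U.T                             # 10 x 10 matrix of rank 3
        print(np.allclose(P, np.eye(10)))       # False: not the identity
        print(np.allclose(P @ P, P))            # True: P is a projector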

  • @nikosips
    @nikosips 4 years ago +2

    Thank you very much for those videos; they are very explanatory. Keep up the good work, we need your lessons for our academic improvement.

  • @ShaidaMuhammad
    @ShaidaMuhammad 4 years ago +1

    Waww... 202 likes, 0 dislikes.
    It shows the quality of the content.

  • @sabarathinamsrinivasan8832
    @sabarathinamsrinivasan8832 2 months ago

    Very nice lecture and clearly understandable... Thanks Steve... 🤗

  • @nicholashawkins1017
    @nicholashawkins1017 2 years ago

    Lightbulbs are finally going off when it comes to SVD. Can't thank you enough!

  • @alexpujoldartmouth
    @alexpujoldartmouth 3 years ago +2

    You have a talent for taking complicated topics and breaking them down into digestible pieces. That's the sign of a good teacher. Thank you.

  • @isaaclakes-trapbeatspianob5955
    @isaaclakes-trapbeatspianob5955 3 years ago

    What if U is a 2x2 matrix and V a 3x3 matrix? Then how would you calculate that outer product UV?

  • @jonathanschwartz7256
    @jonathanschwartz7256 4 years ago +1

    Watch out, Khan Academy, Steve Brunton is coming for ya! Seriously though, these videos are fantastic :)

  • @Catwomen4512
    @Catwomen4512 2 years ago

    Is the economy SVD not also an approximation of X? (Since we lose some columns of U)
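
    For what it's worth, the economy SVD still reproduces X exactly: the columns of U that get dropped are only ever multiplied by zero rows of Sigma, so they contribute nothing to the product. A quick check with random data:

        import numpy as np

        X = np.random.rand(10, 3)
        U, S, Vt = np.linalg.svd(X, full_matrices=False)  # economy SVD
        print(np.allclose(X, U @ np.diag(S) @ Vt))        # True: exact, not an approximation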

  • @omidbahar
    @omidbahar 3 years ago

    I could take a nap during each topic and still not miss any important notes; you should prepare before you start presenting.

  • @guitar300k
    @guitar300k 2 years ago

    I like your series; also, the dark background makes my eyes feel more at ease than the white backgrounds other channels use.

  • @sonilshrivastava1428
    @sonilshrivastava1428 3 years ago +1

    One of the best videos on singular value decomposition. It not only covers the math but also the intuition. Thanks!

  • @YYchen713
    @YYchen713 2 years ago +2

    Thank you for making linear algebra less boring and really connecting it to data science and machine learning. This series is so much more interpretable than what my professor explains.

    • @PunmasterSTP
      @PunmasterSTP 1 year ago

      Hey I know it's been nine months but I just came across your comment and was curious. How'd the rest of your class go?

  • @es8426
    @es8426 1 year ago

    I do not understand why the sigma values are in descending order and why the first sigma values are more important than the latter ones.

  • @psycheguy503
    @psycheguy503 6 months ago

    How do you determine the truncation criterion, please? It might depend on the specific case, but is there a general guideline for it?
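
    One common heuristic, sketched below with random placeholder data (the 99% threshold is an arbitrary illustrative choice): keep the smallest rank r whose singular values capture a fixed fraction of the "energy", i.e. of the squared Frobenius norm. For noisy data there is also a more principled recipe, the Gavish-Donoho optimal hard threshold, which, if I recall correctly, Brunton covers later in this series.

        import numpy as np

        X = np.random.rand(500, 100)  # placeholder data matrix
        U, S, Vt = np.linalg.svd(X, full_matrices=False)

        # Cumulative fraction of squared Frobenius norm captured by the first r values
        energy = np.cumsum(S**2) / np.sum(S**2)
        r = int(np.searchsorted(energy, 0.99)) + 1  # smallest r reaching 99%

        X_r = U[:, :r] @ np.diag(S[:r]) @ Vt[:r, :]  # rank-r truncation of X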

  • @douglashurd4356
    @douglashurd4356 3 years ago +1

    Superlative production! Lighting, sound, set, rehearsals, material, these videos are among the best productions on UA-cam. Even I understood some of it! :-)

  • @nathannguyen2041
    @nathannguyen2041 2 years ago +1

    This was, by far, the most comprehensible explanation of what the SVD is, mathematically and visually. The SVD is an incredible algorithm! Amazing how little you can keep and still understand the original system.

  • @peymanzirak5400
    @peymanzirak5400 2 years ago +1

    I find everything about these courses, even the way the board is arranged, just great. Many, many thanks for this wonderful explanation and all your effort to make it understandable and yet complete.

  • @colingillespie7635
    @colingillespie7635 3 years ago

    Anybody else unable to shake the fact that this guy looks like Dr. Harrison Wells?

  • @zrmsraggot
    @zrmsraggot 2 years ago

    Is truncating at 'r' similar to filtering out the highest frequencies in the FFT?

  • @prabalghosh2954
    @prabalghosh2954 2 years ago

    Very, very nice explanation and presentation. Thank you!

  • @matthewprestifilippo7673
    @matthewprestifilippo7673 2 years ago

    I found this hard to follow without it teaching any ways to think about this intuitively.

  • @Aditya-ne4lk
    @Aditya-ne4lk 4 years ago +4

    Just in time for the new semester!

  • @wmafyouni
    @wmafyouni 1 year ago

    Why is the X matrix n by m, but your SVD written out for X as an m by n matrix?

  • @arbitrandomuser
    @arbitrandomuser 1 year ago

    If the U columns after m don't matter, why is U unique?

  • @ARSHABBIR100
    @ARSHABBIR100 4 years ago +1

    Excellent explanation. Thank you very much.

  • @TheLOLon10
    @TheLOLon10 4 years ago +1

    I didn't get why your Sigma matrix (X in the eigenvector basis) has so many rows of zeros. Is it because X is non-square, or due to the precision limitations of the computer? I guess you just cut off at some point?

    • @TheLOLon10
      @TheLOLon10 4 years ago +1

      Should have waited a bit :P No explanation needed anymore.

  • @alwaysaditi2001
    @alwaysaditi2001 5 months ago

    Thank you so much for this easy to understand explanation. I was really struggling with the topic and this helped a lot. Thanks again 😊

    • @Eigensteve
      @Eigensteve  5 months ago

      Glad it was helpful!

  • @zepingluo694
    @zepingluo694 3 years ago +1

    Thank you for giving us such an amazing experience learning about SVD!

  • @jamesalesi1305
    @jamesalesi1305 7 months ago

    What are you, some kind of eigengenius?

  • @mvincent00
    @mvincent00 2 years ago

    Help me out; are you right or left-handed?

  • @khim2970
    @khim2970 1 year ago

    Really appreciate your efforts. Wish you all the best.

  • @Klompe2003
    @Klompe2003 1 year ago

    @11:08 his face when he said Frobenius norm xD

  • @asmafarid2939
    @asmafarid2939 4 years ago +1

    I need to work on SVD where n < m.

    • @Eigensteve
      @Eigensteve  4 years ago +5

      Absolutely, this is no problem. The SVD should work for any size matrix, I just considered the case where n>>m for simplicity. But if you want to use the notation in this lecture series, you can just transpose your matrix so that n>>m, compute the SVD, and transpose the result.
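
      A quick sketch of that recipe in NumPy (random placeholder data): take the SVD of the transpose, then transpose the factorization back, noting that the singular vectors swap roles.

          import numpy as np

          X = np.random.rand(5, 100)  # n = 5 << m = 100
          U, S, Vt = np.linalg.svd(X.T, full_matrices=False)  # SVD of X^T

          # X = (U S Vt)^T = Vt^T S U^T, so U and V trade places for X itself
          X_hat = (U @ np.diag(S) @ Vt).T
          print(np.allclose(X, X_hat))  # True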

  • @borna6403
    @borna6403 3 years ago

    what drove you to watch this channel?
    me: money

  • @ParijatKar
    @ParijatKar 2 years ago

    I am interested in knowing what software he is using to write.

  • @oscarsmith-b3x
    @oscarsmith-b3x 1 year ago

    Really, really nice explanation! You are a great teacher!

  • @florawoflour4501
    @florawoflour4501 1 year ago

    Thank you so much, sir; very helpful.

  • @arthurlee8961
    @arthurlee8961 2 months ago

    That's so helpful! Thank you!

  • @hchoudhary92
    @hchoudhary92 4 years ago

    I am trying to identify dominant modes/coherent structures and the inner-outer interaction in turbulent wall jets using PIV images. Can you give any suggestions?

  • @aryanshrajsaxena6961
    @aryanshrajsaxena6961 5 months ago

    Thank You Professor. Respects from India

  • @robinfreeman3223
    @robinfreeman3223 4 years ago

    write everything mirrored? WOW....

  • @pilmo11
    @pilmo11 1 year ago

    Super informative series on the SVD

  • @SACHCHIDANANDSINGH-h3c
    @SACHCHIDANANDSINGH-h3c 2 days ago

    Awesome explanation

  • @mohammedal-khulaifi7655
    @mohammedal-khulaifi7655 2 years ago

    You are at the tip-top; I like your explanation.

  • @jayives9008
    @jayives9008 4 years ago +1

    I got confused at about 2:40. Shouldn't it be a row of the left matrix (U) times a column of the right matrix (V transposed)?

    • @phamhongvinh550
      @phamhongvinh550 4 years ago +1

      Yes, it is. But there are different ways to visualize matrix multiplication. In your visualization, row i of U times column j of V gives a scalar at position ij of the result matrix. In his visualization, each column of U times the corresponding row of V gives a rank-1 matrix, and the result is the sum of those rank-1 matrices.

    • @Eigensteve
      @Eigensteve  4 years ago +1

      This can be a bit tricky to visualize, especially since there is a diagonal matrix between U and V. So I would recommend actually writing this out and checking that it makes sense.
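
      Following that suggestion, a minimal numerical check (random placeholder data) that the sum of column-times-row outer products equals the full product U Sigma V^T:

          import numpy as np

          X = np.random.rand(6, 3)
          U, S, Vt = np.linalg.svd(X, full_matrices=False)

          # sigma_k * (k-th column of U) * (k-th row of Vt), summed over k
          X_sum = sum(S[k] * np.outer(U[:, k], Vt[k, :]) for k in range(len(S)))
          print(np.allclose(X, X_sum))  # True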

  • @vitorsousa5390
    @vitorsousa5390 3 years ago

    Please, could you tell me how I can run the economy SVD in Python? I always use up all my memory when I use the function "np.linalg.svd()". My matrix has 100 thousand rows and 27 columns. Thanks!
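
    In case it helps someone else: np.linalg.svd computes the full SVD by default, which for a 100,000 x 27 matrix means allocating a 100,000 x 100,000 U. Passing full_matrices=False computes the economy SVD instead, whose U is only 100,000 x 27. A minimal sketch with random data:

        import numpy as np

        X = np.random.rand(100_000, 27)
        U, S, Vt = np.linalg.svd(X, full_matrices=False)  # economy SVD
        print(U.shape, S.shape, Vt.shape)  # (100000, 27) (27,) (27, 27)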