Randomized Singular Value Decomposition (SVD)

  • Published 28 Jan 2020
  • This video describes how to use recent techniques in randomized linear algebra to efficiently compute the singular value decomposition (SVD) for extremely large matrices.
    Book Website: databookuw.com
    Book PDF: databookuw.com/databook.pdf
    These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
    Amazon: www.amazon.com/Data-Driven-Sc...
    Brunton Website: eigensteve.com
    This video was produced at the University of Washington
  • Science & Technology
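The algorithm the video describes — sketch the column space of a large matrix with a random projection, orthonormalize the sketch with QR, then take the SVD of the small projected matrix — can be outlined in Python roughly as below. This is a minimal sketch, not the book's exact code; the function name `rsvd` and the defaults for the oversampling parameter `p` and power-iteration count `q` are illustrative choices.

```python
import numpy as np

def rsvd(X, r, q=1, p=5):
    """Randomized SVD sketch (illustrative, not the book's exact code).

    X : (m, n) data matrix
    r : target rank
    q : power iterations (sharpens the sketch when singular values decay slowly)
    p : oversampling beyond the target rank
    """
    m, n = X.shape
    # Step 1: random Gaussian projection sketches the column space of X
    P = np.random.randn(n, r + p)
    Z = X @ P
    # Optional power iterations: Z <- X X^T Z
    for _ in range(q):
        Z = X @ (X.T @ Z)
    # Step 2: orthonormal basis Q for the sketched column space
    Q, _ = np.linalg.qr(Z, mode='reduced')
    # Step 3: SVD of the small matrix Y = Q^T X, then lift U back up
    Y = Q.T @ X
    UY, S, Vt = np.linalg.svd(Y, full_matrices=False)
    U = Q @ UY
    return U[:, :r], S[:r], Vt[:r, :]
```

For a matrix whose numerical rank is at most `r`, the reconstruction `U @ diag(S) @ Vt` matches `X` to near machine precision; the cost is dominated by the two passes over `X` rather than a full dense SVD.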

COMMENTS • 26

  • @user-hl5sk1oj1m
    @user-hl5sk1oj1m 4 years ago +4

    I can't believe that I understood this. Thank you so much Prof.

  • @chienthan12345
    @chienthan12345 4 years ago +3

    Very easy to understand by watching your explanation. Thank you

  • @chrisw3327
    @chrisw3327 1 year ago +1

    Thanks Steve!!!! This is a very important approach for my desire to analyse high resolution climate model data.

  • @nazniamul
    @nazniamul 3 years ago +4

    Dear Prof.
    Your videos are fantastic and help me a lot. I just want to request that you explain (or give a numerical proof of) why the Sigma and V of the Y matrix should be the same as those of the original data matrix X. Are there any references or documents I can go through? Your response would be a great help for all of us.

  • @zoloojagaa9198
    @zoloojagaa9198 4 years ago +2

    Thank you. You just saved my life

  • @zelexi
    @zelexi 3 years ago +2

    this is a fantastic explanation

  • @andrezabona3518
    @andrezabona3518 3 years ago +1

    You are the best! I love your videos!!

  • @dragosmanailoiu9544
    @dragosmanailoiu9544 4 years ago +4

    This prof is cool he’s interested in complex systems and fractals

  • @mahan1598
    @mahan1598 3 years ago +2

    Thank you Steve! I have a question: Suppose we have the SVD of the current matrix A. If we add a new snapshot, is there any quick way to find the SVD of the new matrix? It would be useful when we want to find the POD of time-consuming LES or DNS simulations while calculating the flow field.

  • @johnl4885
    @johnl4885 4 years ago +1

    Good explanation.... not clear why it works as well as it does. Gives us some homework to get to the bottom of these ideas. Thank you

  • @Falangaz
    @Falangaz 4 years ago +1

    Thanks Prof!

  • @yasserothman4023
    @yasserothman4023 3 years ago

    Thank you, but can you point out how we can compute the SVD using the QR algorithm? I am referring to 5:29.

  • @MohammedFarag81
    @MohammedFarag81 3 years ago +2

    Thanks for this interesting explanation.
    I have one question: how do you determine the rank r, which requires pre-knowledge of the optimal threshold obtained from the Sigma matrix we are trying to compute with the rSVD? Has this point been raised in related work in the literature?

    • @nami1540
      @nami1540 2 years ago +1

      Watch the videos before this one in the series

  • @ondrejkotaba
    @ondrejkotaba 4 years ago +1

    Thanks!

  • @sexsex1980
    @sexsex1980 4 years ago +2

    Great!

  • @butette
    @butette 3 years ago

    Stupid question but what is the system you use to write on the screen like that? It looks very cool.

    • @nazniamul
      @nazniamul 3 years ago

      He uses a transparent glass panel as a writing board and keeps the background dark, then mirrors the whole video so that we see the flipped version of everything he writes. You can imagine the whole process as a transformation of space (or a matrix manipulation). The whole idea is just amazing.

  • @brunoisy
    @brunoisy 4 years ago +2

    Why must the projection be random?
    From what distribution should it be generated?
    Couldn't I use the same projection over and over again, which means it wouldn't actually be random?

    • @Eigensteve
      @Eigensteve 4 years ago +3

      In principle, you could pregenerate a big random matrix and then use it over and over. Eventually, you might run into some bias problems, but it should be okay. For example, if I built a camera on these principles, my guess is that my random measurement matrix C might not change... but that would be fine, since it is highly unlikely that it would be a "bad" matrix for any real-world signals.
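The reply above — that one pregenerated random matrix can be reused across many inputs — can be illustrated with a short sketch. The names `P_fixed` and `rsvd_fixed` are hypothetical; the point is that the projection is drawn once from a Gaussian distribution and then held fixed, and the factorization remains accurate for each new low-rank matrix.

```python
import numpy as np

# Pregenerate one Gaussian sketch matrix and reuse it for every input
# (illustrative of the reply above; sizes are arbitrary upper bounds).
rng = np.random.default_rng(42)
P_fixed = rng.standard_normal((1000, 25))  # supports n <= 1000, rank <= 25

def rsvd_fixed(X, r, P=P_fixed):
    """Randomized SVD that reuses the stored projection P instead of
    drawing a fresh random matrix on every call."""
    Z = X @ P[:X.shape[1], :]                 # sketch the column space
    Q, _ = np.linalg.qr(Z, mode='reduced')    # orthonormal basis
    UY, S, Vt = np.linalg.svd(Q.T @ X, full_matrices=False)
    return (Q @ UY)[:, :r], S[:r], Vt[:r, :]
```

A fixed Gaussian matrix is "bad" only for a measure-zero set of inputs, which is why reusing it is generically safe for real-world signals, as the reply notes.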

  • @karthikr3977
    @karthikr3977 4 years ago +1

    kool stuff

  • @Erotemic
    @Erotemic 4 years ago +2

    I realize it's postprocessing but it bugs me that it looks like his writing appears backwards from his perspective.

    • @mahan1598
      @mahan1598 3 years ago

      Look at his ring on his right hand! It is reversed!

  • @nami1540
    @nami1540 2 years ago +1

    But what is Q and R? You never explained ...

    • @cretinobambino
      @cretinobambino 1 year ago +1

      Look for QR decomposition in your favorite linear algebra book.
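To make the pointer above concrete: the QR decomposition factors any matrix A into Q, whose columns are orthonormal, and R, which is upper triangular. In the randomized SVD, Q from the sketch supplies the orthonormal basis for the approximate column space. A minimal NumPy illustration, with an arbitrary example matrix:

```python
import numpy as np

# QR decomposition: A = Q R, Q has orthonormal columns, R is upper triangular.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
Q, R = np.linalg.qr(A, mode='reduced')   # Q: 3x2, R: 2x2

assert np.allclose(Q @ R, A)             # the factorization reproduces A
assert np.allclose(Q.T @ Q, np.eye(2))   # columns of Q are orthonormal
assert np.allclose(R, np.triu(R))        # R is upper triangular
```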