Randomized Singular Value Decomposition (SVD)
- Published 28 Jan 2020
- This video describes how to use recent techniques in randomized linear algebra to efficiently compute the singular value decomposition (SVD) for extremely large matrices.
Book Website: databookuw.com
Book PDF: databookuw.com/databook.pdf
These lectures follow Chapter 1 from: "Data-Driven Science and Engineering: Machine Learning, Dynamical Systems, and Control" by Brunton and Kutz
Amazon: www.amazon.com/Data-Driven-Sc...
Brunton Website: eigensteve.com
This video was produced at the University of Washington.
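The two-stage method described in the video (randomly sample the column space of X, then do a small SVD in the projected coordinates) can be sketched in NumPy. The function name `rsvd` and the oversampling/power-iteration parameters below are illustrative choices, not the book's exact code:

```python
import numpy as np

def rsvd(X, r, p=10, q=1):
    """Sketch of randomized SVD: target rank r, oversampling p, q power iterations."""
    # Stage A: sample the column space of X with a random projection
    P = np.random.randn(X.shape[1], r + p)
    Z = X @ P
    for _ in range(q):                  # power iterations sharpen the sketch
        Z = X @ (X.T @ Z)
    Q, _ = np.linalg.qr(Z)              # orthonormal basis for the sampled range
    # Stage B: project X into the small subspace and take an inexpensive SVD
    Y = Q.T @ X
    UY, S, Vt = np.linalg.svd(Y, full_matrices=False)
    U = Q @ UY                          # lift left singular vectors back up
    return U[:, :r], S[:r], Vt[:r, :]

# Usage: approximate a tall low-rank matrix
X = np.random.randn(1000, 10) @ np.random.randn(10, 200)
U, S, Vt = rsvd(X, r=10)
err = np.linalg.norm(U @ np.diag(S) @ Vt - X) / np.linalg.norm(X)
```

Because X here is exactly rank 10, the sketch captures its range and the reconstruction error is at floating-point level.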
I can't believe that I understood this. Thank you so much Prof.
Very easy to understand by watching your explanation. Thank you
Thanks Steve!!!! This is a very important approach for my desire to analyse high resolution climate model data.
Dear Prof., your videos are fantastic and help me a lot. Could you explain (or give a numerical demonstration of) why the Sigma and V of the projected matrix Y should be the same as those of the original data matrix X? Are there any references or documents I can go through? Your response would be a great help for all of us.
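One way to build intuition (a numerical experiment, not a proof): if the columns of Q span the column space of X, then X = Q(QᵀX) = QY, so X and Y share their nonzero singular values and right singular vectors, and only U gets rotated by Q. A quick NumPy check, with all sizes chosen purely for illustration:

```python
import numpy as np

np.random.seed(0)
X = np.random.randn(500, 40) @ np.random.randn(40, 100)   # exactly rank-40 matrix
P = np.random.randn(100, 50)                              # random test matrix
Q, _ = np.linalg.qr(X @ P)                                # basis containing range(X)
Y = Q.T @ X                                               # small projected matrix

_, SX, VtX = np.linalg.svd(X, full_matrices=False)
_, SY, VtY = np.linalg.svd(Y, full_matrices=False)

# Nonzero singular values of Y match those of X
match = np.allclose(SX[:40], SY[:40])
# Each leading right singular vector of X lies in the span of Y's
overlap = np.linalg.norm(VtX[:40] @ VtY[:40].T, axis=1)
```

Here `match` comes out True and every entry of `overlap` is 1 to machine precision, consistent with the claim in the lecture.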
Thank you. You just saved my life
this is a fantastic explanation
You are the best! I love your videos!!
This prof is cool; he's interested in complex systems and fractals.
Thank you Steve! I have a question: suppose we have the SVD of the current matrix A. If we add a new snapshot, is there any quick way to find the SVD of the new matrix? It would be useful when we want to find the POD of time-consuming LES or DNS simulations while computing the flow field.
Good explanation, though it's not clear why it works as well as it does. That gives us some homework, to get to the bottom of these ideas. Thank you.
Thanks Prof!
Thank you, but can you point out how we can compute the SVD using the QR algorithm? I am referring to 5:29.
Thanks for this interesting explanation.
I have one question: how do you determine the rank r, which seems to require pre-knowledge of the optimal threshold obtained from the Sigma matrix we are trying to compute with the rSVD? Has this point been raised in related work in the literature?
Watch the videos before this one in the series
Thanks!
Great!
Stupid question but what is the system you use to write on the screen like that? It looks very cool.
He uses a transparent glass panel as a writing board and keeps the background dark, then mirrors the whole video so that we see the flipped version of everything he writes. You can imagine the whole process as a transformation of space (or a matrix manipulation). The whole idea is just amazing.
Why must the projection be random?
From what distribution should it be generated?
Couldn't I use the same projection over and over again, which means it wouldn't actually be random?
In principle, you could pregenerate a big random matrix and then use it over and over. Eventually, you might run into some bias problems, but it should be okay. For example, if I built a camera on these principles, my guess is that my random measurement matrix C might not change... but that would be fine, since it is highly unlikely that it would be a "bad" matrix for any real-world signals.
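That reuse idea is easy to try numerically: generate the projection once (e.g. with a fixed seed) and apply the same matrix to different data sets. The helper name `sketch` and all sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)        # fixed seed: the "random" projection is frozen
P = rng.standard_normal((100, 20))     # pregenerated once, reused for every matrix

def sketch(X, P):
    """Orthonormal basis for the range sampled by a fixed projection (hypothetical helper)."""
    Q, _ = np.linalg.qr(X @ P)
    return Q

# Two different low-rank "signals", one shared projection
X1 = np.random.randn(500, 10) @ np.random.randn(10, 100)
X2 = np.random.randn(500, 10) @ np.random.randn(10, 100)
Q1, Q2 = sketch(X1, P), sketch(X2, P)

# The same P captures each matrix's column space almost perfectly
res1 = np.linalg.norm(X1 - Q1 @ (Q1.T @ X1)) / np.linalg.norm(X1)
res2 = np.linalg.norm(X2 - Q2 @ (Q2.T @ X2)) / np.linalg.norm(X2)
```

Both residuals are at floating-point level, supporting the comment that a single pregenerated matrix is very unlikely to be "bad" for generic real-world signals.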
Cool stuff!
I realize it's postprocessing, but it bugs me that his writing appears to be backwards from his perspective.
Look at the ring on his right hand! It is reversed!
But what are Q and R? You never explained...
Look for QR decomposition in your favorite linear algebra book.
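In short, the QR decomposition factors a matrix as A = QR, where Q has orthonormal columns and R is upper triangular. A minimal NumPy illustration (sizes chosen arbitrarily):

```python
import numpy as np

A = np.random.randn(6, 3)
Q, R = np.linalg.qr(A)                  # reduced QR: Q is 6x3, R is 3x3

orthonormal = np.allclose(Q.T @ Q, np.eye(3))   # columns of Q are orthonormal
triangular = np.allclose(np.triu(R), R)         # R is upper triangular
reconstructs = np.allclose(Q @ R, A)            # A = QR
```

All three checks come out True; in the randomized SVD, the QR step is what turns the random sketch into an orthonormal basis Q for the sampled column space.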