Wow! Breathtaking transparency, making a black box turn into a yummy white box!
Wow, a math tutorial with good audio quality, good handwriting, and a calm and soothing voice that I can understand. And jokes? A+
One of the best videos on SVD out there!
Thank you! I'm so glad you found it!
Agree 100%.
@Aaron and @Mimi - this is an excellent walk-through of the SVD computation -- one of the top explanations I have found out of _over 30 tutorials_ on PCA and SVD I have studied. Especially you show the 'why' of each computation: that is, given that we know that eigenvalues and eigenvectors will be valuable for matrix data analysis, you show why each step in the matrix computations relates to the eigenvalues and eigenvectors (you explain both the concepts and the matrix computation mechanics), and you show how each step drives to the spectral decomposition of the matrix. And you use the industry standard notation for the SVD computation (with U, Sigma, and V), which helps when we compare your notes to other sources. I see an immense amount of careful preparation you have done for this seminar. *Thank you and much respect to you both.*
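For anyone reading along, the standard notation mentioned above is the factorization A = U Σ V' (stated here as the usual convention, not as a quote from the video): U and V have orthonormal columns, Σ is diagonal with the singular values on its diagonal, the columns of V are eigenvectors of A'A, and the singular values are the square roots of its eigenvalues. That is the link between the SVD and the eigendecomposition that the comment points to.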
I have watched more than 100 SVD videos so far, but this one is the best and most complete one!!!
I'm long retired from teaching now but this is the kind of introduction I would have really enjoyed giving.
Surely a lot of viewers will find that this lucid and engaging presentation has broken the logjam.
I learned a lot, my thanks to Aaron & Mimi.
Amazing video! Wish the channel made more of these!
Being great is not being great, being great is being simple.... A fantastic lecture that has been sequenced with such elegance and such simplicity that it is really great. Let your work continue to contribute knowledge to everyone.... Thanks a lot...... great!
I think this video is actually going to save me on my final exam. Excellent video, thank you so much!!!
What a great explanation of SVD. As simple and as clear as it gets! WELL DONE
Wow, great teaching, a simple way to explain tough concepts. Thanks!
Best svd video I've found!
This is such a nice video! I really like your style of teaching!
Really, awesome explanation!
Hey, just want to say. This is a really quality video! Thanks a ton! With all the complicated (and often non-concrete) math involved in linear algebra its easy to forget what you are doing and why you are doing it... and this video does a great job of covering all the bases (pun intended)!
I'm so glad you found it helpful! It was a great learning experience to make, too.
God bless this kind girl. Math lectures often forget to remind us why we should care about new formulas and techniques and what they really mean at heart, so they jump out unexpectedly in front of your eyes and just as quickly disappear from memory. Unlike this video.
Congrats! Great video with a nice explanation!
excellent way of teaching. She is absolutely fantastic.
Very nice explanation, was very helpful. The bit about Taylor Series blew my mind! Thank you
Really beautiful illustration of a complex subject. Very nice madam. Thank you very much for taking the time to make this video. God bless you.
Thanks for simplifying SVD. Very helpful video.
Very well explained 👏
This is a vivid explanation of SVD... I can say only one word: "Awesome". Please make videos on some other mathematical concepts, like Affine and Perspective transformations.
It was really helpful, thank you!
SVD is also used for Principal Component Analysis, very often in topic modeling, and basically any time you want to reduce the number of parameters.
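As a rough sketch of that point (not taken from the video; the data matrix X and the rank k below are made up for illustration), a truncated SVD in NumPy reduces a 20-feature data set to a k-dimensional representation, which is exactly the mechanism behind PCA once the data are centered:

```python
import numpy as np

# Toy data matrix: 100 samples, 20 features (made up for illustration).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))

# Thin SVD: X = U @ diag(s) @ Vt, with orthonormal U and Vt.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# Keep only the k largest singular values -> best rank-k approximation of X.
k = 5
X_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# k-dimensional representation of each sample (this is PCA if X was centered first).
Z = U[:, :k] * s[:k]

print("approximation error:", np.linalg.norm(X - X_k))
print("reduced shape:", Z.shape)  # (100, 5) instead of (100, 20)
```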
this video explained SVD really clearly :)
Thanks!
That is awesome. Finally I got it
Awesome explanation!
Thanks a lot for this tutorial. I got a clear picture of SVD thanks to it. Can you please do the same for the CVR decomposition? It would be a great help. Thanks in advance.
Thank you!! This was very helpful :)
Wow, thank you so much! Super helpful
I came here expecting Socratically Verbose Dragons but all I got was the maths
Best video on SVD
One of the best videos ever!!!!
very good explanation!!
This is insane thanks so much...
Fuck!
That video was exactly what I needed!
really helpful. thanks!
Wonderful!!! Thank you so much :)
Your explanation is really awesome..... amazing
Thanks! I'm happy it was helpful!
Do the U and V matrices always have to be orthonormal? If yes, how do we handle the case of repeated eigenvalues?
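Just a quick numerical check on that question (not the presenters' answer): in the SVD the columns of U and V are orthonormal by construction, and they stay orthonormal even when singular values repeat; what changes is that the individual vectors are no longer unique, since any orthonormal basis of the repeated singular value's subspace works (for example one produced by Gram-Schmidt). The 3x3 matrix below is made up so that it has a repeated singular value:

```python
import numpy as np

# A matrix with a deliberately repeated singular value (two values equal to 3).
A = np.diag([3.0, 3.0, 1.0])

U, s, Vt = np.linalg.svd(A)
print("singular values:", s)              # [3. 3. 1.]

# Columns of U and rows of Vt are still orthonormal: U^T U = I, Vt Vt^T = I.
print(np.allclose(U.T @ U, np.eye(3)))    # True
print(np.allclose(Vt @ Vt.T, np.eye(3)))  # True
```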
I'm learning machine learning and don't understand what SVD is. Thanks for the good video.
That's awesome, I wish you all the best in your studies!
Do you get the same eigenvector multiple times when an eigenvalue has an algebraic multiplicity greater than one?
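A small illustration of that question (again not from the video, and the toy matrix is made up): for the symmetric matrices that appear in the SVD, such as A'A, a repeated eigenvalue does not mean you get the same eigenvector twice; an eigensolver returns distinct, mutually orthogonal eigenvectors that together span the eigenspace.

```python
import numpy as np

# Symmetric matrix with eigenvalue 2 of algebraic (and geometric) multiplicity 2.
M = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])

w, V = np.linalg.eigh(M)
print("eigenvalues:", w)                   # [2. 2. 5.]

# Two different, orthogonal eigenvectors for eigenvalue 2 span its eigenspace.
print(V[:, 0], V[:, 1])
print(np.isclose(V[:, 0] @ V[:, 1], 0.0))  # True
```

For a general non-symmetric matrix a repeated eigenvalue can be defective (fewer independent eigenvectors than its multiplicity), but A'A is always symmetric, so that case does not arise in the SVD.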
To show that AA' or A'A is symmetric, you could've just taken the transpose. It would have been immediate.
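Spelled out, the one-line argument being suggested is (AA')' = (A')'A' = AA', so AA' equals its own transpose and is therefore symmetric; the same computation works for A'A.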
Very nice :)
Thanks!