Solving Systems of Differential Equations with Eigenvalues and Eigenvectors
- Published 21 Jul 2024
- We now show how to solve a generic matrix system of linear ordinary differential equations (ODEs) using eigenvalues and eigenvectors. This is one of the most powerful techniques in linear systems theory, with applications in stability theory and control.
Code examples are given in Python and Matlab.
Playlist: • Engineering Math: Diff...
Course Website: faculty.washington.edu/sbrunto...
@eigensteve on Twitter
eigensteve.com
databookuw.com
This video was produced at the University of Washington
%%% CHAPTERS %%%
0:00 Overview and Recap of Eigenvalues and Eigenvectors
2:58 Eigenvalues in Matlab
4:40 Eigenvalues in Python
5:55 Setting up the Problem
15:25 The Full Solution
17:00 Intuitive Interpretation

Category: Science & Technology
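A minimal Python sketch of the approach described above — diagonalize A, solve the decoupled system, and map back. The matrix `A` and initial condition `x0` here are made-up illustrative values, not taken from the video:

```python
import numpy as np

# A hypothetical 2x2 stable system x' = A x (illustrative values).
A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])
x0 = np.array([1.0, 0.0])

# Eigendecomposition: A = T D T^{-1}, with D = diag(eigenvalues).
lam, T = np.linalg.eig(A)

def x(t):
    # x(t) = T e^{Dt} T^{-1} x(0); since D is diagonal, e^{Dt} is just
    # the diagonal matrix of elementwise exponentials e^{lambda_i t}.
    return T @ np.diag(np.exp(lam * t)) @ np.linalg.inv(T) @ x0

print(x(0.0))  # recovers the initial condition, approx [1, 0]
```

In Matlab the analogous call is `[T, D] = eig(A)`, which returns the eigenvalues already arranged in a diagonal matrix.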
I'm going to speak for everyone here: we all really appreciate the effort you put into these videos, especially considering they are probably at best tertiary to your teaching and research. Thank you, Steve.
definitely
This is truly a piece of art. Years of frustration healed in three videos. A hero of education.
So grateful for YouTube and EigenSteve! I barely earned a physics degree back in the '90s, but never really internalized math. Thanks to Dr. Kutz's lecture on SVD, and your series on the same, I have been hooked on math videos for years now. In particular, thank you for not skipping steps, and for spelling things out for the non-geniuses among us. After all, if we didn't need things spelled out, we wouldn't need someone to show us in the first place. After many years of struggle, I am finally starting to understand the language of math. Thanks to YouTube, you have left a legacy for thousands to benefit from. I know it takes patience to keep from skipping steps… I can see you struggle… but from your YouTube students' point of view, it's well worth it!
When I learned this the first time, my advanced control systems professor absolutely butchered the explanation. He robbed me of such an amazing discovery about eigenvalues and eigenvectors; I never understood how or why this works.
Thank you for explaining so clearly, Steve, this series is amazing!
You know that it is going to be a great lecture series when Eigensteve is teaching you about eigenvalues and eigenvectors
this lecture is soooo good. I never really understood it when I was self-studying this from a ODE book recently. Now I have a clear and intuitive understanding. Thanks!
It is time to get stirred though 😂😂
What I like about these videos is that they are much better and concise version of the material I learned in college. My notes weren't very good so I'm really glad there is a quick and easily accessible reference to this kind of material.
It is just under 50 years since I covered this. On the Monday, one lecturer boringly introduced us to eigenvalues; on the Thursday, a control engineer took us from the "equations of motion" of a plane through the matrix representation and on to eigenvalues. I have never forgotten the second approach. Your videos are great.
My top list of YouTube math teachers:
1) Linear Algebra - Pavel Grinfeld
2) Engineering Math - Steve Brunton
3) Diff Geo - Keenan Crane
Very understandable explanation; it's amazing how you can simplify things by taking a detour through eigenspaces.
I feel old; those kinds of lectures were years ago for me.
Great video!
One of the best professors in the world!!! thanks!
Absolutely amazing. Thanks for the video!
Beautiful. Really wish the best for you!!! Thank you for sharing such wonderful knowledge
The last explanation was AMAZING!
Thank you Dr. Brunton., amazing series!
Just a side note: I believe it would have been nice if showed how the D power series in 14:59 goes to e^(Dt) by showing that each diagonal entry of the matrices in the power series, was its own power series for each diagonal element. This would prove why e^(Dt) equals a matrix with exponentials in the diagonals.
Very cool — I had never learned this in my life. My university taught it to me in a very strange way, and the way you teach is much simpler and more intuitive than just doing automatic calculations with ready-made formulas. Incredible video.
Suggestion: You could include the lecture number in the video title.
Your comment is 11 days old when the video was uploaded only 4 hours back? 😮
@@TNTsundar That is because the entire video playlist is available on the channel, but the individual publish dates are in future time (t). To know when the next video will come out, you will have to solve the y(t) [youtube] diff eqn, or you can change the coordinate system by going to the playlist vector and watch all the videos from the future.
Huge thanks, sir @Steve Brunton. Future generations of engineers will have real motivation to study eigenvalues and other fun stuff if they don't waste their time on PubG or TikTok. I wish these lectures had existed during my college days. All I heard in class for 4 semesters was the Cauchy-Riemann theorem and eigenvalues and eigenvectors, without actually understanding the crux of the matter.
This is so beautiful. Now I can tell where eigenvalues and eigenvectors are actually used.
Thanks for this excellent lecture!
I wonder what the physical meaning is of a system of ODEs whose matrix is not diagonalizable? After all, not all matrices can be transformed into a diagonal matrix via eigenvalues and eigenvectors.
Steve, thank you for your work; I'm eternally thankful for this video series. The most difficult part of this topic for me is the notorious Jordan form for non-diagonalizable matrices. It would be most kind of you to make a video explaining this topic.
I'm a donkey. There is already a video that elaborates on this🙃
THIS VIDEO IS REALLY, REALLY COOL!
It was brilliant 🎉
Amazing!
thanks a lot
Very good
I know this has been mirrored since the beginning, but it just occurred to me to look up your university page, and confirm that your hair is in fact flipped here.
:)
How different is it for discrete systems? And how does the limit dt -> 0 give the same results as the continuous system?
Also, I have noticed the general solution of the differential equation has the form y = c1*v1*exp(\lambda1*t) + c2*v2*exp(\lambda2*t), while the P matrix is {v1, v2} — so where is inv(P)?
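On the inv(P) question: it is hidden in the coefficients — c = inv(P) x(0) is what matches the general solution to the initial condition. A small sketch with an assumed example matrix:

```python
import numpy as np

# Illustrative system (values assumed, not from the video).
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 1.0])

lam, P = np.linalg.eig(A)  # columns of P are the eigenvectors v1, v2

# inv(P) enters through the coefficients: c = inv(P) @ x(0).
# (np.linalg.solve is the numerically preferred way to apply inv(P).)
c = np.linalg.solve(P, x0)

def x(t):
    # General solution: x(t) = c1*v1*exp(lam1*t) + c2*v2*exp(lam2*t).
    return sum(c[i] * P[:, i] * np.exp(lam[i] * t) for i in range(2))

print(x(0.0))  # approx x0, since P @ c = x(0) by construction
```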
When is the book coming out ?
Good lecture, but what about using this solution when we have boundary conditions and not initial conditions?
Hmm but what if the coefficients are not just simple constants, but functions of `t` instead?
Then we won't have just simple numbers in our matrix `A`, but functions of `t` :q
Which I suppose that it means that the directions of eigenvectors will be moving (rotating) with time, perhaps even along with their centres (fixpoints), am I right?
How can we deal with such equations?
Steve, I really like your content, but I think the text of the equations and the font size of the code are too small, even when watching on a big monitor...
Why is only one T and T inverse pair canceled when matrix A is squared?
Matrix multiplication is not commutative. You cannot rearrange the terms in a product as you please.
So only the terms which touch each other directly will cancel.
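A quick numerical illustration of this point (the matrix here is an assumed example): squaring A = T D T^-1 cancels only the inner T^-1 T pair, leaving T D^2 T^-1.

```python
import numpy as np

# Assumed example matrix with a real eigendecomposition.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, T = np.linalg.eig(A)
D = np.diag(lam)
Tinv = np.linalg.inv(T)

# A^2 = (T D T^-1)(T D T^-1): the adjacent T^-1 T in the middle cancels,
# giving T D^2 T^-1. The outer T and T^-1 cannot be moved together,
# because matrix multiplication is not commutative.
lhs = A @ A
rhs = T @ D @ D @ Tinv
print(np.allclose(lhs, rhs))  # True
```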
It’s raining Eigen vectors! 😂
Note for thinking) I think we don't have to expand e^At.
- We can just directly transform z(t)=e^(Dt)z(0) to x(t)=Te^(Dt)T^(-1)x(0), since x=Tz and thus T^(-1)x=z.
Note 2) Oh, and the professor forgot to show us why x(t) = e^(At)x(0) = Te^(Dt)T^(-1)x(0) is a solution of x' = Ax. Just calculate x' and Ax, and, using the fact that A = TDT^(-1), we see it is the solution! Very trivial, but needed for logical completeness :)
This confuses me. I think x(t) = e^(At)x(0) works only if A is diagonal.
ua-cam.com/video/O85OWBJ2ayo/v-deo.html
This video answered my question.
You do need to expand for the proof. How else would you show that e^(At) = Te^(Dt)T^-1?
@@APaleDot Thanks for the comment. Well I think it depends on how we think. If we want to express the solution of z'=Dz as the form of "matrix exponential", z=e^(Dt)z(0), then it includes the concept of the expansion of e^(Dt) which is simpler than expanding e^(At) since D is diagonal, and we don't have to directly show that e^(At) = Te^(Dt)T^-1, anyway. But A=TDT^-1 and everything is connected so... it's up to our viewpoint :)
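The identity under discussion can also be checked numerically. Here `scipy.linalg.expm` computes e^(At) directly, and we compare it against the route through the eigendecomposition (the matrix is an assumed example with complex eigenvalues):

```python
import numpy as np
from scipy.linalg import expm

# Assumed example matrix with complex-conjugate eigenvalues.
A = np.array([[0.0, 1.0],
              [-1.0, -1.0]])
t = 1.3

lam, T = np.linalg.eig(A)

# e^{At} computed directly vs. via e^{At} = T e^{Dt} T^{-1},
# where e^{Dt} = diag(e^{lam_i t}).
direct = expm(A * t)
via_eig = (T @ np.diag(np.exp(lam * t)) @ np.linalg.inv(T)).real

print(np.allclose(direct, via_eig))  # True
```

Note that the intermediate arithmetic is complex, but the imaginary parts cancel because the eigenvalues come in conjugate pairs, so the final `.real` is safe here.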
Fascinating stuff! 😂
No, it doesn't.
At 11:40, does the multiplication between the matrices follow transpose(A)*A, instead of A*A? I wonder if A*A is valid.