22. Diagonalization and Powers of A

  • Published Feb 7, 2025

COMMENTS • 350

  • @bigfrankgaming2423
    @bigfrankgaming2423 2 years ago +154

    This man single-handedly saved my university algebra course: my teacher was just reading notes, while he's actually explaining things in a very clear manner.

  • @kayfouroneseven
    @kayfouroneseven 12 years ago +69

    This is a bazillion times more straightforward and clear than the lectures I pay for at my university. :( I appreciate this being online.

    • @bsmichael9570
      @bsmichael9570 1 year ago +2

      He tells it like a story. It’s like he’s taking us all on a journey. You can’t wait to see the next episode.

  • @BaranwalAYUSH
    @BaranwalAYUSH 2 months ago +8

    This man has really reignited my passion for mathematics. Thank You Professor Strang for such amazing lectures.

  • @Tutkumsdream
    @Tutkumsdream 11 years ago +83

    Thanks to him, I passed Linear Algebra! I watched his videos for 4 days before the final exam and got a 74 on the final. If I couldn't have watched Dr. Strang's lectures, I would probably have failed...

    • @snoefnone9647
      @snoefnone9647 1 year ago +3

      For some reason I thought you were saying Dr. Strange's lectures!

  • @charmenk
    @charmenk 11 years ago +132

    A good professor with the good old blackboard-and-chalk teaching method. This is way better than all the fancy PowerPoints that many teachers use nowadays.

  • @BirnieMac1
    @BirnieMac1 1 year ago +2

    You know you're in for some shenanigans when they pull out the "little trick".
    Professor Strang is an incredible teacher; I struggled with eigenvalues and eigenvectors in a previous course, and this series of lectures has really helped me understand them better.
    Love your work, Professor.

  • @christoskettenis880
    @christoskettenis880 1 year ago +1

    This professor's explanations of all those abstract theorems and opaque methodologies are simply brilliant.

  • @jollysan3228
    @jollysan3228 9 years ago +219

    I agree.
    > Just one small correction at 32:30: It should have been S * LAMBDA^100 * c instead of LAMBDA^100 * S * c.
    (A quick numerical check appears after this thread.)

    • @Slogan6418
      @Slogan6418 5 years ago +4

      thank you

    • @ozzyfromspace
      @ozzyfromspace 4 years ago +13

      The sad thing was, a few moments later he was struggling to explain things because, even though he hadn't pinned down the error, he somehow knew that something wasn't quite right. But he obviously had the core idea nailed.

    • @alexandresoaresdasilva1966
      @alexandresoaresdasilva1966 4 years ago +6

      Thank you so much, I was about to post asking about this.

    • @吴瀚宇
      @吴瀚宇 4 years ago +13

      I was stuck on this for like 10 minutes, until I saw the comments here...

    • @maitreyverma2996
      @maitreyverma2996 4 years ago +5

      Perfect. I was about to write the same.
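
A quick numerical check of the correction above (a minimal NumPy sketch; the Fibonacci matrix from the lecture is used, and the variable names are ours):

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 0.]])              # Fibonacci matrix from the lecture
    w, S = np.linalg.eig(A)               # columns of S are eigenvectors of A
    u0 = np.array([1., 0.])               # u_0 = (F_1, F_0)
    c = np.linalg.solve(S, u0)            # expand u_0 = S c in the eigenbasis

    uk_direct = np.linalg.matrix_power(A, 100) @ u0
    uk_right  = S @ np.diag(w**100) @ c   # S * Lambda^100 * c  (corrected order)
    uk_board  = np.diag(w**100) @ S @ c   # Lambda^100 * S * c  (as on the board)

    print(np.allclose(uk_direct, uk_right))   # True
    print(np.allclose(uk_direct, uk_board))   # False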

  • @albertacristie99
    @albertacristie99 15 years ago +17

    This is magnificent!! I have no words to express how thankful I am that this video is out here.

  • @apocalypse2004
    @apocalypse2004 8 years ago +248

    I think Strang leaves out a key point in the difference equation example, which is that the n independent eigenvectors form a basis for R^n, which is why u0 can be expressed as a linear combination of the eigenvectors.

    • @alessapiolin
      @alessapiolin 7 years ago +1

      thanks!

    • @wontbenice
      @wontbenice 7 years ago +6

      I was totally confused until you chimed in. Thx!

    • @seanmcqueen8498
      @seanmcqueen8498 6 years ago +1

      Thank you for this comment!

    • @arsenron
      @arsenron 6 years ago +17

      In my opinion it is so obvious that it is not worth dwelling on.

    • @dexterod
      @dexterod 6 years ago +14

      I think Strang assumed that A has n independent eigenvectors, since most matrices do not have repeated eigenvalues.

  • @eye2eyeerigavo777
    @eye2eyeerigavo777 5 years ago +55

    Math surprises you every time...🤔 Never thought that the connections between growth rates in system dynamics, the Fibonacci series, and diagonalization with independent eigenvectors would finally boil down to the GOLDEN RATIO as an eigenvalue at the END! 😳

  • @georgesadler7830
    @georgesadler7830 3 years ago +6

    From this latest lecture, I am learning more about eigenvalues and eigenvectors in relation to the diagonalization of a matrix. Dr. Strang continues to increase my knowledge of linear algebra with these amazing lectures.

  • @go_all_in_777
    @go_all_in_777 11 months ago +1

    At 28:07, u_k = (A^k)u_0 can also be written as u_k = S*(Lambda^k)*(S^-1)*u_0. Also, we can write u_0 = S*c, as explained at 30:00. Therefore, u_k = S*(Lambda^k)*(S^-1)*S*c = S*(Lambda^k)*c.

    • @jeanpierre-st7rl
      @jeanpierre-st7rl 10 months ago

      Hi, at 29:46, u_0 = c1x1 + c2x2 + c3x3 + ... Is u_0 a vector? If so, how can you split this u_0 into a combination of eigenvectors? What is c_i? If you have any info please let me know. Thanks.

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +3

    For the curious:
    F_100 = (a^99 - b^99) * b/sqrt(5) + a^99, where a = (1 + sqrt(5))/2 and b = (1 - sqrt(5))/2 are the two eigenvalues of our system of difference equations.
    Numerically, F_100 ≈ 3.542248482 * 10^20 ... it's a very large number that grows like ~1.618^k 😲
    Overall, great lecture Professor Strang! Thank you for posting, MIT OCW ☺️
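
A short Python sketch to confirm that figure (the exact integer from iterating the recurrence, and the same value from the two eigenvalues via Binet's formula):

    # Exact F_100 by iterating the recurrence (Python ints are arbitrary precision).
    a, b = 0, 1
    for _ in range(100):
        a, b = b, a + b
    print(a)                    # 354224848179261915075 ≈ 3.542248482e20

    # The same number from the two eigenvalues.
    phi = (1 + 5 ** 0.5) / 2    # dominant eigenvalue, the golden ratio
    psi = (1 - 5 ** 0.5) / 2    # the decaying eigenvalue
    print((phi ** 100 - psi ** 100) / 5 ** 0.5)   # ≈ 3.542248481792631e+20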

  • @eroicawu
    @eroicawu 14 years ago +5

    It's getting more and more interesting when differential equations are involved!

  • @coreconceptclasses7494
    @coreconceptclasses7494 4 years ago +5

    I got 70 out of 75 on my linear algebra final exam. Thanks, MIT...

  • @meetghelani5222
    @meetghelani5222 1 year ago

    Thank you for existing, MIT OCW and Prof. Gilbert Strang.

  • @kanikabagree1084
    @kanikabagree1084 4 years ago +2

    This teacher made me fall in love with linear algebra, thank you ❤️

  • @Huayuan-p4z
    @Huayuan-p4z 1 year ago +1

    I learned about the Fibonacci sequence in high school, and it is so good to have a new perspective on this magical sequence. I think the significance of learning lies in the collection of new perspectives. 😀

  • @syedsheheryarbokhari2780
    @syedsheheryarbokhari2780 1 year ago +2

    There is a small writing mistake at 32:30 by Prof. Strang. He writes (eigenvalue matrix)^100 multiplying (eigenvector matrix) multiplying c's (constants). It ought to be (eigenvector matrix) multiplying (eigenvalue matrix)^100 multiplying c's.
    At the end of the lecture Professor Strang does narrate the correct formula, but it is easy to miss.

    • @clutterbrainx
      @clutterbrainx 1 year ago

      Yeah, I was confused for a very long time there.

  • @Zumerjud
    @Zumerjud 10 years ago +24

    This is so beautiful!

  • @neoneo1503
    @neoneo1503 3 years ago +2

    A*S = S*Lambda (using the linear combination view of matrix multiplication, column by column: A x1 = lambda1 x1, and so on). That is brilliant and clear! Thanks!

    • @neoneo1503
      @neoneo1503 3 years ago

      Also expressing the states u_0 through u_k as linear combinations of the eigenvectors (at 30:00 and 50:00).

    • @Wabbelpaddel
      @Wabbelpaddel 3 years ago

      Well, if you interpret it that way, it's just a change of basis from the standard basis onto the eigenvector basis (up to isomorphism; you then just additionally multiply by the transforms of the alternate basis).
      Provided, of course, that either the characteristic polynomial factors into distinct roots, or that the geometric and algebraic multiplicities match (because then the eigenspaces span the whole vector space up to isomorphism; if they didn't, you'd only have a subspace as a generating system).
      For anyone who wanted one more run-through.

    • @neoneo1503
      @neoneo1503 3 years ago +1

      @@Wabbelpaddel Thanks! =)

  • @sathviktummala5480
    @sathviktummala5480 4 years ago +5

    44:00 well that's an outstanding move

  • @SimmySimmy
    @SimmySimmy 5 years ago +6

    Through a single matrix transformation, the whole subspace expands or shrinks at the rate of the eigenvalues, in the directions of the eigenvectors. Suppose you can decompose a vector in this subspace into a linear combination of the eigenvectors; then after many repetitions of the same transformation, a random vector will ultimately line up with the eigenvector having the largest eigenvalue.
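
That is exactly the idea behind power iteration. A minimal sketch, reusing the lecture's Fibonacci matrix (the starting vector is an arbitrary choice of ours):

    import numpy as np

    A = np.array([[1., 1.],
                  [1., 0.]])
    v = np.array([0.3, 0.7])           # arbitrary start with a component along x1
    for _ in range(50):
        v = A @ v
        v = v / np.linalg.norm(v)      # renormalize so the iterate doesn't blow up

    w, S = np.linalg.eig(A)
    x1 = S[:, np.argmax(np.abs(w))]    # eigenvector of the dominant eigenvalue
    print(v, x1)                       # same direction (up to sign)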

  • @alexspiers6229
    @alexspiers6229 11 months ago

    This is one of the best in the series

  • @uzferry5524
    @uzferry5524 1 year ago

    Bruh, the Fibonacci example just blew my mind. Crazy how linear algebra just works like that!!

  • @ItsKhabib
    @ItsKhabib 2 months ago +1

    True masterpiece!

  • @abdulghanialmasri5550
    @abdulghanialmasri5550 2 years ago

    The best math teacher ever.

  • @RolfBazuin
    @RolfBazuin 11 years ago

    Who would have guessed: when this guy explains it, it almost sounds easy! You, dear Dr. Strang, are a master at what you do...

  • @ccamii__
    @ccamii__ 1 year ago

    Absolutely amazing! This lecture really helped me better understand the ideas about linear algebra I already had.

  • @atomatik_x
    @atomatik_x 14 years ago +5

    The lecture and the teacher of my life!

  • @nguyenbaodung1603
    @nguyenbaodung1603 3 years ago +5

    I read something on SVD without even knowing about eigenvalues and eigenvectors, then watched a YouTube video explaining that V is actually built from the eigenvectors of A^T A. Which felt insane when I got to this video, oh my goodness. Now, even without having watched your SVD lecture, I can tell the precise concept of it. Oh my goodness, math is so perfect!!

  • @florianwicher
    @florianwicher 7 years ago +3

    Really happy this is online! Thank you Professor :)

  • @eren96lmn
    @eren96lmn 8 years ago +56

    43:36 that moment when your professor's computational abilities go far beyond standard human capabilities

    • @BalerionFyre
      @BalerionFyre 8 years ago

      Yeah wtf? How did he do that in his head?? lol

    • @BalerionFyre
      @BalerionFyre 8 years ago +45

      Wait a minute! He didn't do anything special. 1.618... is the golden ratio! He just knew the first 4 digits. Damn that's a little anticlimactic. Bummer.

    • @AdrianVrabie
      @AdrianVrabie 8 years ago +2

      +Stephen Lovejoy Damn! :D Wow! AWESOME! I have no words! Nice spot! I actually checked it in Octave and was amazed the prof could do it in his head. But I guess he knew that Fibonacci is related to the golden ratio.

    • @IJOHN84
      @IJOHN84 6 years ago +6

      All students should know the solution to that golden quadratic by heart.

    • @ozzyfromspace
      @ozzyfromspace 4 years ago +2

      Fun fact, since we're all talking about the golden ratio: the Fibonacci sequence isn't that special. The recurrence F_(k+2) = F_(k+1) + F_(k), for any seeds F_0 = a and F_1 = b (as long as the seed has a component along the dominant eigenvector), generates a sequence that grows at the rate (1+sqrt(5))/2 ... your golden ratio. Another fun way to check this: take the limit of the ratio of successive terms of your arbitrary sequence with your preferred software :)
      edit: that's a great excuse to write a bit of code lol
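
Taking up that excuse, a tiny sketch (arbitrary seeds; the ratio of successive terms converges to the golden ratio):

    # Fibonacci-like sequence with arbitrary seeds F_0, F_1.
    a, b = 7, -2
    for _ in range(60):
        a, b = b, a + b
    print(b / a)    # ≈ 1.618033988749895, the golden ratio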

  • @Zesty_Soul
    @Zesty_Soul 4 days ago

    32:41 With all due respect to the legendary Prof. Strang, he may have meant to write the vector u_100 as:
    u_100 = (A^100)u_0 = S(Λ^100)c
    [rather than (Λ^100)Sc].
    And likewise, for the initial vector u_0:
    Au_0 = SΛc
    [and not ΛSc],
    given u_0 = Sc.
    Must have been an unintended slip.

  • @muyuanliu3175
    @muyuanliu3175 5 months ago

    32:42, it should be S Λ^100 c. Great lecture; this is the 3rd time I've learned this.

  • @cuinuc
    @cuinuc 15 years ago +19

    I love professor Strang's great lectures.
    Just one small correction at 32:30: It should have been S * LAMBDA^100 * c instead of LAMBDA^100 * S * c.

    • @starriet
      @starriet 2 years ago

      Nice catch!

    • @jeffery777
      @jeffery777 2 years ago

      haha I think so

    • @eyuptarkengin816
      @eyuptarkengin816 1 year ago

      Yeah, I thought of the same thing and scrolled down the comments for confirmation. Thanks, mate :D

    • @ranabhatashim
      @ranabhatashim 1 month ago

      There is no mistake. The thing about c1x1 is that c1 is a number while x1 is a vector; keep this in mind. We have u0 = c1x1 + ... + cnxn. Multiply both sides by A: Au0 = Ac1x1 + ... + Acnxn. Bring c1 to the front since it's a number: c1Ax1 + c2Ax2 + ...; now, since x1...xn are eigenvectors, Ax1 = lambda1 x1, so Au0 = c1*lambda1*x1 + ... We can now factor out the lambdas as an eigenvalue matrix, so Au0 = lambdamatrix(c1x1 + c2x2 + ... + cnxn). Remember that x is a vector and c is a number? Therefore we can write c1x1 + ... + cnxn as Xc, where X is the eigenvector matrix, which is S, and c is the vector (c1, c2, ...). Therefore Au0 = Lambda * S * c.

  • @rolandheinze7182
    @rolandheinze7182 5 years ago +3

    A hard lecture to get through, personally, but it does illustrate some of the cool machinery for applying eigenvectors.

  • @zyctc000
    @zyctc000 1 year ago

    If anyone ever asks you why Fibonacci and the golden ratio phi are connected, point them to this video.
    Thank you, Dr. Strang.

  • @dwijdixit7810
    @dwijdixit7810 2 years ago +1

    33:40 Correction: the eigenvalue matrix should multiply S from the right. It appears that way in the book; it probably just slipped past Prof. Strang in the flow.

  • @kunleolutomilayo4018
    @kunleolutomilayo4018 6 years ago +1

    Thank you, Prof.
    Thank you, MIT.

  • @jasonhe6947
    @jasonhe6947 5 years ago

    Absolutely a brilliant example of how to apply eigenvalues to a real-world problem.

  • @eugenek951
    @eugenek951 1 year ago

    He is my linear algebra superhero! 🙂

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +6

    Did we ever prove that if the eigenvalues are distinct, the eigenvectors are linearly independent? I ask because at ~32:00, taking u_0 = c1*x1 + c2*x2 + ... + cn*xn requires the eigenvectors to form a basis for an n-dimensional vector space (i.e., to span the column space of an invertible matrix). It feels right, but I have no solid background for how to think about it.

    • @roshinis9986
      @roshinis9986 1 year ago

      The idea is easy in 2D. If you have two distinct eigenvalues and their corresponding eigenvectors, you don't just have one eigenvector per eigenvalue: the whole span of each vector (its multiples, forming a line) consists of eigenvectors for that eigenvalue. If the original eigenvectors were dependent, they would lie on the same line, making it impossible for them to be scaled by two distinct eigenvalues simultaneously. I haven't yet been able to extend this intuition to 3 or higher dimensions, though, as dependence there need not mean lying on the same line.

    • @jeanpierre-st7rl
      @jeanpierre-st7rl 10 months ago

      @roshinis9986 Hi, at 29:46, u_0 = c1x1 + c2x2 + c3x3 + ... Is u_0 a vector? If so, how can you split this u_0 into a combination of eigenvectors? What is c_i? If you have any info please let me know. Thanks.

  • @benzhang7261
    @benzhang7261 4 years ago +4

    Master Yoda passes on what he has learned, by way of Fibonacci and 1.618.

  • @Afnimation
    @Afnimation 11 years ago +1

    Well, I was impressed at the beginning, but when he stated the second eigenvalue I realized it's just the golden ratio... That doesn't diminish him; he's great!

  • @Mohamed1992able
    @Mohamed1992able 13 years ago +1

    A big thanks to this prof for his efforts to give us courses about linear algebra.

  • @shadownik2327
    @shadownik2327 1 year ago

    Now I get it. It's like breaking the thing (vector or matrix or system, really) we want to transform into little parts and then transforming them individually, because that's easier: each part gets transformed along its own fixed direction. Then we add all the pieces back up. Eigenvectors tell us how to make the pieces, and eigenvalues tell us what the transformation does to each piece. Wow, thanks! Something clicked in my mind and became very simple.
    Basically, this is finding the easiest way to transform.
    Thanks to @MIT and Professor Strang for making this available online for free.

  • @dennisyangji
    @dennisyangji 15 years ago

    A great lecture showing us the wonderful secret behind linear algebra

  • @maoqiutong
    @maoqiutong 1 year ago +1

    32:41 There is a slight error here. The result Λ^100 * S * C may be wrong. I think it should be S * Λ^100 * C.

  • @jojowasamanwho
    @jojowasamanwho 1 year ago

    19:21 I would sure like to see the proof that if there are no repeated eigenvalues, then there are certain to be n linearly independent eigenvectors. (A sketch follows below.)
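
A standard argument, sketched briefly (the classic proof; the notation is ours): suppose A x_i = λ_i x_i with all λ_i distinct, and suppose c_1 x_1 + ... + c_n x_n = 0. Apply the matrix (A - λ_2 I)(A - λ_3 I)···(A - λ_n I) to both sides: every term except the first is annihilated, leaving c_1 (λ_1 - λ_2)(λ_1 - λ_3)···(λ_1 - λ_n) x_1 = 0. The scalar factors are nonzero because the eigenvalues are distinct, and x_1 ≠ 0, so c_1 = 0. Repeating the argument for each index gives every c_i = 0, so the n eigenvectors are linearly independent.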

  • @gomasaanjanna2897
    @gomasaanjanna2897 4 years ago +4

    I am from India, I love your teaching.

  • @bastudil94
    @bastudil94 11 years ago +76

    There is a MISTAKE in the formula at minute 32:31. It must be S(Λ^100)c in order to work as intended. Still, it is an excellent lecture; thanks a lot. :)

    • @YaguangLi
      @YaguangLi 10 years ago +3

      Yes, I was also confused by this mistake.

    • @sammao8478
      @sammao8478 9 years ago

      Yaguang Li, agree with you.

    • @AdrianVrabie
      @AdrianVrabie 8 years ago

      +Bryan Astudillo Carpio why not S(Λ^100)S^{-1}c ???

    • @apocalypse2004
      @apocalypse2004 8 years ago +4

      u0 is Sc, so the S inverse cancels with the S.

    • @daiz9109
      @daiz9109 7 years ago

      You're right... it confused me too...

  • @tomodren
    @tomodren 13 years ago

    Thank you for posting this. These videos will allow me to pass my class!

  • @sharmabu
    @sharmabu 6 months ago

    Absolutely beautiful.

  • @amyzeng7130
    @amyzeng7130 3 years ago +1

    What a brilliant lecture !!!

  • @praduk
    @praduk 15 years ago

    Fibonacci numbers being solved for as an algebraic equation with linear algebra was pretty cool.

  • @cecilimiao
    @cecilimiao 15 years ago +2

    @cuinuc
    I think they are actually the same, because LAMBDA is a diagonal matrix; you can try it.

  • @АлександрСницаренко-р4д

    MIT, thank you!

  • @phononify
    @phononify 1 year ago

    Very nice discussion about Fibonacci... great!

  • @dalisabe62
    @dalisabe62 4 years ago +8

    The golden ratio arose from the Fibonacci sequence and has nothing to do with eigenvectors or eigenvalues. The beauty of using the eigenvectors and eigenvalues of a matrix, though, is that it limits the effect of the transformation to a change in magnitude only. That lets dynamic systems such as population growth, which are functions of several variables, be encoded as a matrix computation without worrying about the rotation typically associated with a matrix transformation. Since eigenvectors and eigenvalues change only the magnitude of the parameter vector, the idea of employing the eigen-transformation is quite ingenious. The same technique could be used in any dynamic system that can be modeled as a matrix transformation, provided it produces a change in magnitude only.

    • @Arycke
      @Arycke 1 year ago +1

      Hence the title of his *example* as "Fibonacci Example." Nowhere was it stated explicitly that the golden ratio didn't arise from the Fibonacci sequence, so I don't see where you got that from. The example has a lot to do with eigenvalues and eigenvectors by design, and it uses a simple recurrence relation to show a use case. The Fibonacci sequence isn't unique anyway.

  • @starriet
    @starriet 2 years ago +1

    Notes for future reference:
    (7:16) There are _some_ matrices that do _NOT_ have n independent eigenvectors, but _most_ of the matrices we deal with do.
    (17:14) If all eigenvalues are different, there _must_ be n independent eigenvectors. But if some eigenvalues repeat, there may be _no_ set of n independent eigenvectors. (The identity matrix is an example of repeated eigenvalues that still come with n independent eigenvectors.)
    * Also, the positions of Lambda and S should be swapped (32:36). You can see why just by thinking through the matrix multiplication, and also by knowing A^100 = S*Lambda^100*S^-1 and u_0 = S*c.
    Thus it should be S*Lambda^100*c, and this can also be thought of as a 'transformation' between two different bases, one of which is the set of eigenvectors of A.
    * Also, (43:34) how could Prof. Strang calculate that?? Actually, that number, _1.618033988749894..._, is called the 'golden ratio'.
    * (8:15) Note that A and Lambda are 'similar'. (And S and S^-1 transform the coordinates... both A and Lambda can be thought of as the same "transformation" expressed in different bases, and S (or S^-1) translates coordinates between those two worlds.)

    • @shadowByte99
      @shadowByte99 2 years ago

      I spent a few hours on the second point before figuring it out :(

  • @shamsularefinsajib7778
    @shamsularefinsajib7778 12 years ago

    Gilbert Strang, a great math teacher...

  • @wendywang4232
    @wendywang4232 12 years ago +2

    Something is wrong in this lecture at 32:39: A^100 u_0 = S M^100 c, where I use M for the diagonal eigenvalue matrix. The professor wrote A^100 u_0 = M^100 S c, which is not correct.

  • @Hindusandaczech
    @Hindusandaczech 13 years ago

    Bravo!!! Very much the best and premium stuff.

  • @mike-yj5mm
    @mike-yj5mm 4 years ago +2

    I don't understand, at 11:25, why A squared can be written the way it is on the blackboard. I think A^2 should be (S Lambda S^-1)^T (S Lambda S^-1), and that result differs from the one on the blackboard. Could someone explain this?

    • @mike-yj5mm
      @mike-yj5mm 4 years ago +2

      Okay, I figured it out. S is an orthogonal matrix under the n-independent-eigenvector assumption, and its inverse equals its transpose.

    • @APaleDot
      @APaleDot 2 years ago

      @mike-yj5mm
      No, it doesn't require S to be an orthogonal matrix. Having n independent eigenvectors ≠ having n orthogonal eigenvectors of unit length, which is what would be required to make S orthogonal.
      At this point in the lecture we've already proven that A = S Λ S^-1, and therefore it follows immediately that A^2 = AA = (S Λ S^-1)(S Λ S^-1) = S Λ^2 S^-1, because the inner S^-1 S cancels. All the matrices are square, so there is no conflict in their dimensions.
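
A quick numerical check of that identity (a sketch with a generic random matrix, which is almost surely diagonalizable but not symmetric, so no transposes are involved):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))           # generic, non-symmetric
    w, S = np.linalg.eig(A)                   # A = S Λ S^-1
    A2 = S @ np.diag(w**2) @ np.linalg.inv(S)
    print(np.allclose(A2, A @ A))             # True: A² = S Λ² S^-1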

  • @mospehraict
    @mospehraict 13 years ago +1

    @PhilOrzechowski he does it to turn the second-order difference equation into a first-order system.

  • @veronicaecheverria594
    @veronicaecheverria594 4 years ago

    What a great professor!!!

  • @iebalazs
    @iebalazs 2 years ago

    At 32:32 the expression is actually S*Lambda^100*c, and not Lambda^100*S*c.

  • @rambohrynyk8897
    @rambohrynyk8897 1 year ago

    It always gets to me how quickly the students clamor to get out of the class…. How are you not absolutely dumbfounded by the profundity of what this great man is laying down!!!!

  • @ranabhatashim
    @ranabhatashim 1 month ago

    There is no mistake at 32:00. The thing about c1x1 is that c1 is a number while x1 is a vector; keep this in mind. We have u0 = c1x1 + ... + cnxn. Multiply both sides by A: Au0 = Ac1x1 + ... + Acnxn. Bring c1 to the front since it's a number: c1Ax1 + c2Ax2 + ...; now, since x1...xn are eigenvectors, Ax1 = lambda1 x1, so Au0 = c1*lambda1*x1 + ... We can now factor out the lambdas as an eigenvalue matrix, so Au0 = lambdamatrix(c1x1 + c2x2 + ... + cnxn). Remember that x is a vector and c is a number? Therefore we can write c1x1 + ... + cnxn as Xc, where X is the eigenvector matrix, which is S, and c is the vector (c1, c2, ...). Therefore Au0 = Lambda * S * c.

    • @keremdirlik
      @keremdirlik 1 month ago

      Nope, be careful: it's S * Lambda * c.

  • @SamSarwat90
    @SamSarwat90 7 years ago

    I love you professor !!!

  • @PaulHobbs23
    @PaulHobbs23 13 years ago +2

    @lolololort
    1/2(1 + sqrt(5)) is also the golden ratio! Math is amazing =] I'm sure the professor knew the answer and didn't calculate it in his head on the spot.

  • @LAnonHubbard
    @LAnonHubbard 13 years ago +1

    I've only just learned about eigenvalues and eigenvectors from Khan Academy and Strang's Lecture 21, so a lot of this went whooosh over my head, but I managed to find the first 20 minutes useful. Hope to come back to this when I've looked at differential equations (which AFAIK are very daunting), etc., and understand more of it.

    • @rolandheinze7182
      @rolandheinze7182 5 years ago

      Don't think you need diff EQ at all to understand the algebra; maybe for the applications.

  • @technoshrink
    @technoshrink 9 years ago +6

    U0 == "you know it"
    First time I've heard his Boston accent c:

  • @iDiAnZhu
    @iDiAnZhu 11 years ago +4

    At around 32:45, Prof. Strang writes Lambda^100*S*c. Notation-wise, shouldn't this be S*Lambda^100*c?

  • @dexterod
    @dexterod 8 years ago +24

    I'd say if you play this video at speed 1.5, it's even more awesome!

  • @khanhdovanit
    @khanhdovanit 4 years ago

    15:02 interesting information hiding inside a matrix: its eigenvalues

  • @theshreyansjain
    @theshreyansjain 1 year ago

    Is there an error at 32:30? Shouldn't S be multiplied before (lambda matrix)^100?

  • @jamesmcpherson3924
    @jamesmcpherson3924 4 years ago

    I had to pause to figure out how he got the eigenvectors at the end. Plugging in phi works, but it wasn't until I watched again that I noticed he was pointing to the lambda^2 - lambda - 1 = 0 relationship to reveal the vector.

  • @MaproXiZ
    @MaproXiZ 10 years ago

    I don't understand why the eigenvectors are [lambda_1, 1] and [lambda_2, 1] at 49:19...
    since it is NOT true that ((1 - lambda) * (lambda)) + 1 is lambda^2 - lambda - 1 ... or is it? What is happening?

    • @youcefyahiaoui1465
      @youcefyahiaoui1465 10 years ago +1

      He's just using the original definition of the eigenvalues. We already have the characteristic equation lambda^2 - lambda - 1 = 0 as the polynomial whose solutions are the two eigenvalues. Then he recognized that multiplying A - lambda*I by the vector [lambda, 1] generates this same characteristic equation, which is zero. Hence the eigenvectors are just [lambda, 1].

  • @lastchance8142
    @lastchance8142 2 years ago

    Clearly Prof. Strang is a master, and his lectures are brilliant. But how do the students learn without Q&A? Is this standard procedure at MIT?

  • @blondii0072
    @blondii0072 12 years ago +1

    Beautiful lecture. Thanks.

  • @ricardocesargomes7274
    @ricardocesargomes7274 8 years ago

    Thanks for uploading!

  • @dadadada2367
    @dadadada2367 12 years ago

    the best of the best

  • @thomassun3046
    @thomassun3046 8 months ago

    Here comes a question: how is u_0 equal to c1x1 + c2x2 + ... + cnxn at 29:50? Confused; could anyone explain it to me?

    • @Tman1000-be7op
      @Tman1000-be7op 4 months ago

      He is just writing the initial condition as a combination of the n independent eigenvectors.

  • @Moriadin
    @Moriadin 4 months ago

    I'm blown away by how he calculated 1/2 (1 + sqrt(5)) to 3 decimal places IN HIS HEAD! Jesus Christ lol.

  • @zionen01
    @zionen01 15 years ago

    Great stuff. I was able to do my homework with this lecture. I will definitely be getting Strang's book.

  • @davidsfc9
    @davidsfc9 12 years ago

    Great lecture !

  • @ax2kool
    @ax2kool 13 years ago

    That was amazing and awe-inspiring. :)

  • @sammatthew7
    @sammatthew7 5 months ago +1

    GOLD

  • @noorceen
    @noorceen 13 years ago +2

    Thank you :))
    You are amazing.

  • @pelemanov
    @pelemanov 13 years ago +1

    @LAnonHubbard You don't really need to know about differential equations to understand this lecture. Just watch lessons 1 to 20 as well ;-). Takes you only 15h :-D.

  • @Zoro3120
    @Zoro3120 8 years ago +1

    In the computation of the eigenvalues of A², he used A = SΛS⁻¹ to derive that Λ² is its eigenvalue matrix. However, this can be true only if S is invertible for A², which need not always be true.
    For example, for the matrix below (say A), the eigenvalues are 1 and -1 (refer to the previous lecture). This would imply that A² has only one eigenvalue, 1. That would imply that S has 2 columns which are the same (if it had only one column it would no longer be square, and the inverse wouldn't apply) and hence is non-invertible. This implies that this proof cannot be used for all cases of the matrix A.

        A = [ 0  1 ]
            [ 1  0 ]

    Is there something I'm missing here?

    • @hinmatth
      @hinmatth 8 years ago +2

      Please check 17:32

  • @niraj_ds
    @niraj_ds 2 years ago

    @44:00 why does the sum of the two eigenvalues equal 1?? Have I missed some concept behind this?? :(

    • @ElectricTeaCup
      @ElectricTeaCup 2 years ago +1

      Yes: "the sum of the n eigenvalues equals the sum of the n diagonal entries" (the trace). The sum of the diagonal entries here is 1.

  • @Stoikpilled
    @Stoikpilled 15 years ago

    Awesome!! Greetings from Peru

  • @joe130l
    @joe130l 4 years ago +1

    So it seems like the professor emphasized the importance of the eigenvalues here, which is nice. But are the eigenvectors of any importance? What's a good example of using the eigenvectors?

  • @alijoueizadeh8477
    @alijoueizadeh8477 6 years ago

    Thank you.

  • @NisargJain
    @NisargJain 6 years ago +3

    Never in my entire life would I have been able to convert the Fibonacci sequence into matrix form. Until he did it.

    • @thedailyepochs338
      @thedailyepochs338 4 years ago

      Can you explain how he did it?

    • @NisargJain
      @NisargJain 4 years ago +1

      @thedailyepochs338 Sure. To understand that, we must first understand what the Fibonacci sequence is: every term is the sum of the previous two terms (given the first two terms, starting from 0 and then 1). So the 3rd term is F3 = F2 + F1 = 1 + 0 = 1, and in general F(k+2) = F(k+1) + F(k). Now let U(k) be the 2-dimensional vector whose first entry is F(k+1) and whose second entry is F(k), so U(k+1) = [F(k+2), F(k+1)]. The first entry of U(k+1), namely F(k+2), is the sum of the two entries of U(k), and the second entry of U(k+1), namely F(k+1), is just the first entry of U(k). Hence we get the matrix

          A = [ 1  1 ]
              [ 1  0 ]

      If you multiply A by U(k),

          [ 1  1 ] [ F(k+1) ]   [ F(k+1) + F(k) ]
          [ 1  0 ] [ F(k)   ] = [ F(k+1)        ]

      the first entry is the sum of the entries of U(k) and the second entry is just the first entry of U(k), which is exactly U(k+1). (A tiny code sketch follows this thread.)

    • @thedailyepochs338
      @thedailyepochs338 4 years ago

      @NisargJain thanks, really appreciate it
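
The tiny sketch promised above: iterating u_{k+1} = A u_k reproduces the Fibonacci numbers (object dtype keeps exact Python integers):

    import numpy as np

    A = np.array([[1, 1],
                  [1, 0]], dtype=object)
    u = np.array([1, 0], dtype=object)   # u_0 = (F_1, F_0)
    fibs = [0, 1]
    for _ in range(10):
        u = A @ u                        # u_k = (F_{k+1}, F_k)
        fibs.append(u[0])
    print(fibs)   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89]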

  • @cooperxie
    @cooperxie 11 years ago

    Agree!
    That's what I plan to use in my teaching.