10. The Four Fundamental Subspaces

  • Published Dec 1, 2024

COMMENTS • 330

  •  10 years ago +279

    Thank you MIT, thank you Prof Strang.

  • @PhucLe-qs7nx
    @PhucLe-qs7nx 3 years ago +106

    00:00 Error from last lecture, row dependent.
    04:28 4 Fundamental subspaces.
    08:30 Where are those spaces?
    11:45 Dimension of those spaces.
    21:20 Bases for those spaces.
    30:00 N(A^T) "Left nullspace"?
    42:10 New "matrix" space?

    • @lokahit6940
      @lokahit6940 11 months ago

      I'm asking you because yours is the most recent comment.
      1) At 9:15, how is the column space R^m? For an m×n (m rows × n columns) matrix there are n columns, so there are n column vectors, so it should be R^n, right?

    • @aarongreenberg159
      @aarongreenberg159 8 months ago +1

      @@lokahit6940 Because each vector in the column space has m components. Yes, there are n vectors, but the number of components of a vector determines the dimension of the space it's in.
      This is different once you get to a basis, where the number of vectors describes its dimension, but even that is a subspace of R^(# of components). So a two-vector basis where each vector has 5 components is a 2D subspace of R^5.

    • @deveshbhatt4063
      @deveshbhatt4063 5 months ago +1

      @@aarongreenberg159 Thanks for the clarification.
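The count discussed above (n column vectors, each with m components, so the column space sits inside R^m) can be checked with a tiny sketch; the 3×2 matrix here is a hypothetical example, not one from the lecture:

```python
# A hypothetical 3x2 matrix: m = 3 rows, n = 2 columns.
A = [[1, 0],
     [0, 1],
     [1, 1]]

m, n = len(A), len(A[0])
columns = [[A[i][j] for i in range(m)] for j in range(n)]

print(len(columns))      # 2 column vectors ...
print(len(columns[0]))   # ... each with 3 components, so C(A) sits inside R^3
```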

  • @GavinoFelix
    @GavinoFelix 10 years ago +359

    "But, after class - TO MY SORROW - a student tells me, 'Wait a minute that [third vector] is not independent...'"
    I love it. What other professor brings this kind of passion to linear algebra? This is what makes real, in-the-flesh lectures worthwhile.

    • @xoppa09
      @xoppa09 6 years ago +6

      Give that brave student a medal.

    • @fanzhang3746
      @fanzhang3746 6 years ago +31

      xoppa09 I think here it is the Professor who's honorable. He elaborated on his mistake, which is understandably embarrassing for him, and made important concepts clear. I think most others would just correct it, apologize, and move on. You can see his embarrassment when he used words like 'bury', and his reaction when he accidentally uncovered the board again later.

    • @andersony4970
      @andersony4970 3 years ago +6

      @@fanzhang3746 I don't think he is much embarrassed. He talked about doing math in class in the first video of this series, if you've watched that. He said that it might be inevitable to make mistakes, and that it's great to go through the whole process with the students, including making errors and correcting them.

    • @NazriB
      @NazriB 2 years ago

      Lies again? FAS FUS Sheng Siong

    • @sahil0094
      @sahil0094 2 years ago

      What's so passionate about accepting and correcting your own mistake?

  • @corey333p
    @corey333p 7 years ago +377

    "No mathematics went on there; we just got some vectors that were lying down to stand up."

    • @corey333p
      @corey333p 7 years ago +8

      Gotta know the bases for the spaces.

    • @why6447
      @why6447 4 years ago

      AHAHHAHHAHAHHAHAH

    • @delta_sleepy
      @delta_sleepy 11 months ago

      😂

  • @matthewsarsam8920
    @matthewsarsam8920 11 months ago +3

    Can't lie, being able to pause the video and ponder the ideas is so nice to have. Goes to show how much work those students had to put in.

  • @juansepardo2020
    @juansepardo2020 1 year ago +41

    I am a 4th-year double-engineering student re-learning linear algebra so I can have a stronger basis for ML, DL and AI. Never in my college classes, or in independent studying, have I been as amazed by the way a concept is introduced as I was when Prof. Strang got to the computation of the left null space. The way this man teaches is just astonishing, thank you very much.

    • @reganmian
      @reganmian 10 months ago +2

      Have you checked out his newest book "Linear Algebra and Learning from Data"? That plus "Introduction to Statistical Learning", given a foundation in programming, probability, and statistical inference, is a killer combo. I'm a statistics graduate student wanting to specialize in ML. I've been watching these at 2x speed as a review.

    • @itsnotthattough7588
      @itsnotthattough7588 6 months ago +1

      OMG I'm literally the same. I jumped into ML and AI early in my 2nd year, but could not understand any concepts thoroughly. Now I really feel the need to relearn the basics, and Prof. Strang is like a savior for me.

  • @DanielCoutoF
    @DanielCoutoF 9 years ago +175

    I am so fascinated by the way professor G. Strang gives his lectures; he does it in such a great way that even a 5-year-old boy could understand. Meanwhile, teachers at my university make the subject so complicated that even highly above-average students struggle to understand the concepts properly.

    • @JadedForAlways
      @JadedForAlways 9 years ago +10

      +Daniel Couto Fonseca What about a 5 year old girl?

    • @DanielCoutoF
      @DanielCoutoF 9 years ago +20

      Only 5 years old WHITE BOYS I would say

    • @JadedForAlways
      @JadedForAlways 9 years ago +3

      Are you joking? I can't tell

    • @DanielCoutoF
      @DanielCoutoF 9 years ago +28

      I guess it's funnier if you don't

    • @Bm23CC
      @Bm23CC 8 years ago +10

      +Daniel Couto Fonseca I challenge you to teach a 5-year-old linear algebra. Good luck with that.

  • @davidwilliam152
    @davidwilliam152 4 years ago +17

    What a perfect thing, to be a great mathematician and a great teacher at the same time! Especially being a great teacher, which is priceless!

  • @duqueng
    @duqueng 14 years ago +11

    The best teacher ever. I really admire this act by MIT, like the phrase on its website: "Unlocking Knowledge, Empowering Minds."

  • @jonathanoneill3464
    @jonathanoneill3464 8 years ago +45

    These lectures are saving my bachelors in Engineering. Thanks MIT!

    • @rohanmalik895
      @rohanmalik895 6 years ago +20

      Woah, your icon image shows very precisely that you survived engineering after all... wish me luck

  • @KaveriChatra
    @KaveriChatra 5 years ago +108

    "I see that this fourth space is getting second-class citizen treatment... it doesn't deserve it"

    • @NG-we8uu
      @NG-we8uu 5 years ago +4

      Kaveri Chatra by coincidence I read this exactly when he said it

    • @alenjose3903
      @alenjose3903 4 years ago +1

      @@NG-we8uu me too, i just read this while i was listening to it 😂

    • @MrGameWWE
      @MrGameWWE 4 years ago

      Me too 😂😂

  • @PyMoondra
    @PyMoondra 5 years ago +4

    The end portion really showed how matrix algebra theory can be applied to computer vision; really glad he added that in.

  • @xiaohanwang3885
    @xiaohanwang3885 9 years ago +71

    For the first time I envy students at MIT, because they have such brilliant lectures to attend.

    • @NostraDavid2
      @NostraDavid2 1 year ago +2

      I don't. I've got it better. No time pressure to watch the lectures, I don't NEED to make the exercises, nor the exams. It's great! 😁

    • @swatejreddy216
      @swatejreddy216 1 year ago +3

      @@NostraDavid2 and no hefty tuition either. So yeah.

  • @bobmike828
    @bobmike828 5 years ago +9

    Correct me if I'm wrong, but Strang was introducing abstract algebra at the end. Once you have all of these linear transformations transforming other linear transformations, you have an even greater transformation of space. Absolutely love this man

    • @usozertr
      @usozertr 4 years ago +1

      Bob Mike yes, and in an earlier lecture he was talking about how n x n permutation matrices form a group

    • @pubgplayer1720
      @pubgplayer1720 1 year ago +1

      Yes, abstract vector spaces are quite important in linear algebra

  • @Upgradezz
    @Upgradezz 3 years ago +5

    It's my honor to have met you even virtually, sir!

  • @yanshudu9370
    @yanshudu9370 2 years ago +7

    Conclusion: the four fundamental subspaces of an m×n matrix A:
    1. The column space, the span of the column vectors, which lives in R^m; notation C(A), dimension r.
    2. The nullspace of A, spanned by the special solutions from the free variables, which lives in R^n; notation N(A), dimension n-r.
    3. The row space, the span of the row vectors, which lives in R^n; notation C(A'), dimension r.
    4. The left nullspace of A, the nullspace of A', which lives in R^m; notation N(A'), dimension m-r.
    Other conclusions: dim(C(A')) + dim(N(A)) = n, and dim(C(A)) + dim(N(A')) = m.
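The dimension bookkeeping summarized above can be spot-checked numerically; the 3×4 matrix here is a made-up example (its third row repeats its first):

```python
# Numerical check of the four-subspace dimension count:
# dim C(A) = dim C(A^T) = r;  dim N(A) = n - r;  dim N(A^T) = m - r.
import numpy as np

A = np.array([[1, 2, 3, 1],
              [1, 1, 2, 1],
              [1, 2, 3, 1]])   # row 3 repeats row 1, so r < m

m, n = A.shape
r = np.linalg.matrix_rank(A)

assert r == np.linalg.matrix_rank(A.T)   # row rank == column rank
print(r, n - r, m - r)   # dim of row/column space, N(A), N(A^T)
```

Here r = 2, so dim N(A) = 4 − 2 = 2 and dim N(Aᵀ) = 3 − 2 = 1, matching the two sum rules.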

  • @easterPole
    @easterPole 7 years ago +94

    I'm into the fifth minute and wondering whether he made that mistake in the last lecture knowingly

    • @sachidanandprajapati9446
      @sachidanandprajapati9446 4 years ago +18

      man, exactly. Due to this error, I came to know that if a matrix is non-invertible, the columns must be linearly dependent

    • @eduardoschiavon5652
      @eduardoschiavon5652 4 years ago +9

      40:54 There's no one in the class...

    • @ManishKumar-xx7ny
      @ManishKumar-xx7ny 4 years ago +1

      Same thought, and maybe he did. Great chance

    • @matthieugrosrenaud1777
      @matthieugrosrenaud1777 3 years ago +10

      @@eduardoschiavon5652 nah, it's because they reduced the rows of the class; what we see are the rows of zeros.

    • @GiovannaIwishyou
      @GiovannaIwishyou 3 years ago +6

      I'm actually pretty sure he did this on purpose to trick the audience. Since the first two rows are identical, it's too obvious once you learn that a matrix must have the same number of linearly independent columns and rows (and it's a GREAT introduction to the lecture).

  • @YufanZhou
    @YufanZhou 2 months ago

    This lecture about the four subspaces is the most beautiful Linear Algebra lecture I have ever had.

  • @antoniolewis1016
    @antoniolewis1016 8 years ago +63

    This man has dedication!
    Also, that girl in the beginning must have been a sharp genius.

    • @ispeakforthebeans
      @ispeakforthebeans 5 years ago +17

      Bruh it's MIT, they got gods in there and you talk about sharp

    • @akmalsultanov9801
      @akmalsultanov9801 4 years ago +17

      well, when you have an intuition for the row space and column space and the connection between them, it's quite obvious, and you don't have to be a genius to recognize the dependency of those row vectors. In fact, the first half of linear algebra is relatively simple.

    • @sreenjaysen927
      @sreenjaysen927 4 years ago +13

      I think the professor just made that up and intentionally did it wrong in the previous lecture just to introduce the row space.
      The professor just planned it, like in "Money Heist"

    • @leophysics
      @leophysics 2 years ago

      @@sreenjaysen927
      I agree

  • @jingyiwang5113
    @jingyiwang5113 1 year ago +3

    I am really grateful for your wonderful explanation about the four fundamental subspaces. My mathematics exam is tomorrow. It is a wonderful source for me to learn and refresh my memory. Thank you so much!

  • @Cyraxsify
    @Cyraxsify 8 years ago +12

    At t = 38:00, Strang shows a way that expedites finding L: find E, then solve [E | I] to get E inverse, which = L. Now we can quickly decompose A into LU if we do Gaussian elimination only--not Gauss-Jordan elimination--from the beginning.
    At t = 43:00, he defines a vector space out of 3x3 matrices; call it M_33.
    At t = 47:00, he covers the dimensions of subspaces of M.
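A rough sketch of the bookkeeping behind the 38:00 trick, under the assumption that we row-reduce the augmented block [A | I]: whatever elimination turns A into R gets recorded in the identity half, producing E with E·A = R. The function name and the 2×3 example are hypothetical, with exact arithmetic via fractions:

```python
from fractions import Fraction

def rref_augmented(A):
    """Row-reduce [A | I]; return (R, E) with E*A = R (exact arithmetic)."""
    m, n = len(A), len(A[0])
    # Build the augmented matrix [A | I] with exact fractions.
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(m)]
         for i, row in enumerate(A)]
    pivot_row = 0
    for col in range(n):
        # Find a pivot in this column at or below pivot_row.
        pr = next((r for r in range(pivot_row, m) if M[r][col] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        piv = M[pivot_row][col]
        M[pivot_row] = [x / piv for x in M[pivot_row]]
        for r in range(m):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    R = [row[:n] for row in M]   # left block: rref of A
    E = [row[n:] for row in M]   # right block: accumulated row operations
    return R, E

# Hypothetical rectangular example: the identity block records every row
# operation, so E is exactly the matrix with E*A = R.
A = [[1, 2, 3], [1, 1, 2]]
R, E = rref_augmented(A)
print(R)   # [[1, 0, 1], [0, 1, 1]]
print(E)   # [[-1, 2], [1, -1]]
```

In the invertible square case R = I, so this E is A⁻¹, which is the connection the comment is drawing.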

  • @trevandrea8909
    @trevandrea8909 10 months ago +2

    Thank you so much!! Your explanation is so amazing! Now I finally get why the column spaces of A and R are different, and why the row spaces of A and R are the same!! Btw, I'm saving 24:00 for the explanation of the subspaces of A and R.

  • @serg303
    @serg303 13 years ago +149

    I want to write on that chalkboard with that chalk.

    • @vabez00
      @vabez00 4 years ago +6

      It seems quite satisfying indeed

    • @Lets_MakeItSimple
      @Lets_MakeItSimple 3 years ago +1

      the chalk looked like a big stone

    • @yolobot2920
      @yolobot2920 2 months ago

      Nah man u got sm severe autism

  • @navs8603
    @navs8603 5 years ago +3

    Thank you MIT for enabling us to enjoy these treats. And Prof. Strang is just pure genius

  • @bfl9075
    @bfl9075 3 years ago +2

    I was totally astonished by the idea of computing left nullspace!
    Thank you Dr. Gilbert.

  • @maoqiutong
    @maoqiutong 6 years ago +58

    The second time we see nobody in the classroom. The cameraman is really happy to be a VIP student, I believe.

    • @phil97n
      @phil97n 1 year ago +3

      How can you tell? He seemed to be talking to an audience.

    • @klartraum8495
      @klartraum8495 4 months ago

      @@phil97n the cameraman avoids pointing at the chairs, and at the end you don't hear the usual chatter, just silence

  • @DeLuini985
    @DeLuini985 1 year ago

    Thank God for Dr. Strang. I am understanding concepts that have eluded me for over a decade.

  • @gavinresch1144
    @gavinresch1144 4 years ago +5

    It is amazing how he can do these lectures in front of no students and still be so engaging. In a way he is a great actor.

  • @yufanzhou9948
    @yufanzhou9948 4 years ago +6

    The mistake Professor Strang made turned into a great connection to the new topic. That's why he is a genius

  • @LAnonHubbard
    @LAnonHubbard 12 years ago +2

    Loved the bit at the end where he showed that upper triangular or symmetric or diagonal matrices form a subspace.
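The subspace claims from the end of the lecture can be checked numerically; this sketch (hypothetical random examples, not the lecture's) verifies closure for symmetric 3×3 matrices and counts a basis for the diagonal ones:

```python
# Symmetric 3x3 matrices form a subspace of the matrix space M: any linear
# combination of symmetric matrices is symmetric. Diagonal matrices (the
# intersection of symmetric and upper triangular) have dimension 3.
import numpy as np

rng = np.random.default_rng(1)
S1 = rng.integers(-3, 4, (3, 3)); S1 = S1 + S1.T   # a symmetric matrix
S2 = rng.integers(-3, 4, (3, 3)); S2 = S2 + S2.T   # another symmetric matrix

combo = 2 * S1 - 3 * S2
assert np.array_equal(combo, combo.T)   # still symmetric: closed, a subspace

# A basis for the diagonal matrices: one matrix per diagonal slot.
diag_basis = [np.diag(row) for row in np.eye(3)]
print(len(diag_basis))   # 3, the dimension of the diagonal subspace
```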

  • @lokeshkumar-ub9bb
    @lokeshkumar-ub9bb 9 years ago +43

    At 3:15 - 3:20, instead of looking at the row picture to realize the dependence, we may also see that 2*(column 2) - (column 1) gives (column 3) :)

    • @jacobm7026
      @jacobm7026 6 years ago +17

      This is correct, but his mistake actually illuminates the importance of understanding independence from both the row space and the column space. Most matrices won't be this easy for spotting column-space independence, so conceptualizing both of those spaces will give you a deeper, richer understanding of vector spaces in general

    • @dhruvg550
      @dhruvg550 5 years ago +2

      He explains in the first three minutes why you didn't even have to look at the columns. The girl who pointed this out was quick!

    • @京城五
      @京城五 5 years ago +3

      @@dhruvg550 I think the girl was Gilbert Strang himself

  • @yourroyalhighness7662
    @yourroyalhighness7662 8 months ago +1

    My, I feel so... dense. What a sense of humor this brilliant man must have to have penned a book entitled "Linear Algebra for Everyone".
    Sir, I can't even subtract!

  • @georgesadler7830
    @georgesadler7830 3 years ago +2

    Incorporating MATLAB commands in the lecture is a great way for students to learn about matrices and linear algebra in context. The overall lecture is another classic by Dr. Gilbert Strang.

  • @Afnimation
    @Afnimation 11 years ago +2

    It's interesting that he constantly remarks on the fact that he presents things without proving them, but in fact I think he explains things so clearly and understandably that he doesn't need to prove them, because we can see them almost axiomatically.

    • @robertcarhart4168
      @robertcarhart4168 1 year ago

      Strang proves things without you even realizing that you've just experienced a 'proof.' He makes it very conversational and intuitive.

  • @stefanfarier7384
    @stefanfarier7384 1 year ago

    I really like how he talks. He sounds so friendly in his explanations.

  • @ispeakforthebeans
    @ispeakforthebeans 5 years ago +20

    "Poor misbegotten fourth subspace"
    -Gilbert Strang, 1999
    Remember when Elizabeth Sobeck decided to give GAIA feelings? These guys gave math feelings. And I love him for that. I didn't even know that was possible.

  • @All_Kraft
    @All_Kraft 9 months ago +1

    That was a great performance! Thank you MIT.

  • @chuckhei
    @chuckhei 4 years ago +1

    I really don't know what to say..... Satisfying? Grateful? OMG I just love it!!!!

  • @gavilanch
    @gavilanch 15 years ago +1

    So?
    This can mean a lot of things, and one of them is that they couldn't tape this class, and Strang had to repeat it in front of the cameras, and they didn't pay people to just sit there so that people like you would stop commenting on that fact.
    Great classes. I do not speak English as a native language, but certainly this is awesome; I really appreciate it.
    So much thanks to MIT and Professor Strang!!

  • @HamizAhmed-uk4de
      @HamizAhmed-uk4de 4 days ago

    The four fundamental subspaces are the column space, null space, row space, and left null space. The dimensions of these spaces are related to the rank of the matrix, with the sum of the dimensions of the null space and row space equaling the number of columns, and the sum of the dimensions of the column space and left null space equaling the number of rows.
    Highlights:
    00:10 The lecture focuses on correcting errors from the previous lecture and introducing the concept of four subspaces associated with a matrix, including column space, null space, row space, and the left null space.
    -Explanation of the error correction process from the previous lecture and the significance of having different bases for spaces in linear algebra.
    -Introduction and explanation of the row space as a fundamental subspace, its basis, and its connection to the rows of a matrix through combinations.
    -Discussion on transposing matrices to work with column vectors, leading to the column space of the transposed matrix and the null space of the transposed matrix.
    07:52 Understanding the four spaces in linear algebra - null space of A, column space of A, row space of A, and null space of A transpose - is crucial as they provide insights into the properties of matrices and their dimensions.
    -The importance of the four spaces in linear algebra and their relation to matrices' properties and dimensions.
    -The process of determining bases and dimensions for each of the four spaces, providing a systematic approach to understanding and analyzing matrices.
    -Explanation of the dimensions of the column space, row space, and null space of A transpose, highlighting their significance in understanding linear algebra concepts.
    15:47 Understanding the dimensions of row space, column space, and null space in a matrix is crucial. The row space and column space have the same dimension, while the null space dimension is determined by the number of free variables.
    -The relationship between row space, column space, and null space dimensions. The row space and column space have the same dimension, while the null space dimension is determined by the number of free variables.
    -Determining the basis and dimensions of the null space. Special solutions from free variables form a basis for the null space, with the dimension being n-r, where n is the total variables and r is the number of pivot variables.
    -Exploring the dimensions of the left null space. The left null space dimension is m-r, where m is the number of rows of A (the number of columns of A transpose). A transpose follows similar rules as the original matrix in terms of dimensions.
    23:24 Understanding row space and column space in matrix operations is crucial. The row space and column space of a matrix can have different bases, but the row space basis can be identified by the matrix's rows.
    -Difference between row space and column space in matrix operations. Identifying the basis for the row space using the matrix's rows.
    -Exploring the concept of basis for the row space and its significance in matrix transformations. The importance of independence in determining the basis for the row space.
    -Understanding the left null space of a matrix and its relation to the null space of the matrix's transpose. Exploring the concept of vectors in the null space of A transpose.
    31:06 Understanding the left null space involves transforming A to R using row reduction, resulting in a matrix E. In the invertible square case, E is the inverse of A, but for rectangular A, E connects A to R.
    -Explanation of left null space and its connection to row reduction and matrix E in transforming A to R.
    -Comparison of E in the invertible square case to the case of a rectangular A, where E does not represent the inverse of A.
    38:58 Understanding the concept of subspaces in linear algebra is crucial. The video discusses row space, null space, column space, and left null space, emphasizing their dimensions and relationships in a matrix. It also introduces a new vector space using three by three matrices.
    -Exploring the dimensions and relationships of row space, null space, column space, and left null space in a matrix is essential in linear algebra.
    -Introducing a new vector space using three by three matrices and discussing the rules that define vectors within this space.
    -Discussing subspaces within the matrix space, such as upper triangular matrices and symmetric matrices, and how the intersection of subspaces forms a subspace.
    46:36 The dimension of different subspaces of matrices can be determined by finding a basis. Diagonal matrices have a dimension of three and can be spanned by three independent matrices.
    -Understanding the concept of dimension in linear algebra and how it relates to subspaces of matrices.
    -Exploring the basis of diagonal matrices and how they form a subspace.
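A small numeric illustration of the left-nullspace idea in the summary above, using a made-up matrix whose third row repeats its first:

```python
# If row 3 of A equals row 1, then y = (-1, 0, 1) combines the rows of A to
# the zero row, so y lies in the left nullspace N(A^T).
import numpy as np

A = np.array([[1, 2, 3],
              [2, 5, 8],
              [1, 2, 3]])   # row 3 == row 1

y = np.array([-1, 0, 1])
print(y @ A)               # [0 0 0], i.e. y^T A = 0, so A^T y = 0

m = A.shape[0]
r = np.linalg.matrix_rank(A)
print(m - r)               # dim N(A^T) = m - r = 1
```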

  • @archilzhvania6242
    @archilzhvania6242 6 years ago +1

    He makes everything look so clear.

  • @maximliankremsner633
    @maximliankremsner633 5 years ago +3

    Thank you so much for this lecture series. This helps a lot! Great professor with great and easy to understand explanations.

  • @MAGonzzManifesto
    @MAGonzzManifesto 11 years ago +1

    Thank you Dr. Strang and MIT. These videos are amazing and keeping me afloat in my class.

  • @shivamkasat6334
    @shivamkasat6334 5 years ago

    A mathematician with a great sense of humour, Mr. Strang!

  • @DerekWoolverton
    @DerekWoolverton 4 years ago +1

    I was nodding my head, keeping up just swimmingly, it all made perfect sense. He wrapped up the diagram and it seemed like we were done. Then he stepped over to the far board and replaced vectors with matrices and just turned everything upside down. Didn't see that coming.

  • @markymark443
    @markymark443 8 years ago +3

    lol, funny that I'm first watching this today and it was posted exactly 7 years ago xD
    Thanks for the video, really helpful! I was struggling with this concept for my current Linear Algebra 2 course, since I took the non-specialist version of Linear Algebra 1 which didn't really test us on proofs at all. I think I have a better understanding of the four fundamental subspaces now! :)

  • @kaiding3322
    @kaiding3322 1 year ago +1

    I believe Prof. Strang deliberately made the mistake at the end of Lec 9, in order to transition the focus from column space to row space. The transition was too smooth for this to be an accident. This is also a great show of humility that he didn't mind being perceived making a mistake!

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +7

    *Question:* what is the relationship between rank(A) and rank(A^T)? Does rank(A) = rank(A^T) in general?
    The professor seems to be hinting at this, but rref(A) only preserves the column space, so it doesn't seem so trivial to me. Any insight is highly appreciated.
    Edit: I found the answer. rank(A) = rank(A^T) by virtue of the fact that linear independence of the columns implies linear independence of the rows, even for non-square matrices. I proved this for myself this evening. The main idea of the proof (at least how I did it) is that if you have two linearly dependent rows, one above the other say, row reduction kills the lower one (reducing the number of possibly independent rows). Killing off the row (making the row all zeros) also makes it so that that row can't have a pivot. Thus, we've reduced the number of potential pivot columns by one. That's the relationship in a nutshell. The math is only slightly more involved.

    • @ostrodmit
      @ostrodmit 2 years ago +1

      rref(A) does not preserve the column space, only the null and row spaces. It does preserve the dim(Row(A)) however, which suffices to prove that the row and column ranks are equal.
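The thread's question can at least be spot-checked numerically (not a proof, just a sanity check over small random rectangular matrices):

```python
# Empirical spot-check of rank(A) == rank(A^T) on random rectangular matrices.
import numpy as np

rng = np.random.default_rng(0)
for _ in range(100):
    m, n = rng.integers(1, 6, size=2)          # random shape, 1..5 each
    A = rng.integers(-3, 4, size=(m, n))       # small integer entries
    assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)
print("rank(A) == rank(A^T) held on all samples")
```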

  • @onatgirit4798
    @onatgirit4798 3 years ago +3

    If all YouTube content were deleted today, the most upsetting thing for me would probably be losing this series of lessons.

  • @ozzyfromspace
    @ozzyfromspace 4 years ago

    Worth mentioning: if row-reduction of the matrix generates the most natural row space basis without much effort, we can also generate the most natural basis of the column space of said matrix by doing row-reduction on the transpose of the matrix. This is all so incredibly fascinating!
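A sketch of the comment above, in exact arithmetic: row-reduce Aᵀ and keep the nonzero rows as a basis for C(A). The `rref` helper and the 3×2 matrix are hypothetical:

```python
# The nonzero rows of rref(A^T) give a clean basis for the column space of A.
from fractions import Fraction

def rref(M):
    """Plain Gauss-Jordan reduction with exact fractions."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, p = len(M), len(M[0]), 0
    for c in range(cols):
        pr = next((r for r in range(p, rows) if M[r][c] != 0), None)
        if pr is None:
            continue
        M[p], M[pr] = M[pr], M[p]
        M[p] = [x / M[p][c] for x in M[p]]
        for r in range(rows):
            if r != p and M[r][c] != 0:
                M[r] = [a - M[r][c] * b for a, b in zip(M[r], M[p])]
        p += 1
    return M

A = [[1, 2], [2, 4], [3, 7]]          # columns live in R^3
At = [list(col) for col in zip(*A)]   # rows of A^T are the columns of A
basis = [row for row in rref(At) if any(row)]
print(basis)   # [[1, 2, 0], [0, 0, 1]]: a basis for C(A)
```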

  • @Mike-mu3og
    @Mike-mu3og 5 years ago +2

    45:26 transforming an exclamation mark into an M. Brilliant!

  • @georgipopov2754
    @georgipopov2754 2 years ago

    Brilliant. This lecture connects the complex puzzle.

  • @williamss4277
    @williamss4277 1 month ago

    Beginning from the natural numbers N, then the integers Z, then the real numbers, then the complex numbers, which are just 2-dimensional numbers, we arrive at vectors, which are n-dimensional numbers. A vector space of vectors is like the range of N, or Z, or the real numbers; with vector spaces, a confident definition of vectors can be obtained. For calculation, linear algebra uses computers. The key is to find the algorithm by studying examples of low-dimensional vectors and matrices; with the algorithm, the calculation can scale to vectors and matrices of high dimension. Obviously a vector is a much more expressive number than any preceding kind of number, so vectors, as well as linear algebra, become very powerful math tools in many applications.

  • @ozzyfromspace
    @ozzyfromspace 4 years ago +1

    It's not that she found a numerical error; it was the power of her reasoning for it. I'm shook. Whoever that girl is, she's clearly brilliant.

    • @webstime1
      @webstime1 3 years ago

      He made that story up to drive home a point

  • @serenakillion7008
    @serenakillion7008 5 years ago +1

    Thank you MIT and Professor Strang!

  • @dariopl8664
    @dariopl8664 1 year ago

    min 18:50 If it's helpful for anybody: the dimension of the null space is the number of basis vectors that span the null space. Just as the dimension of the column space (the rank) is the number of linearly independent columns, the dimension of the null space is the number of linearly independent special solutions, i.e. the number of basis vectors that span the null space.
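The count described above (dimension of N(A) = number of special solutions = n − r) can be checked on a made-up example:

```python
# dim N(A) = n - r: number of free variables = number of special solutions.
import numpy as np

A = np.array([[1, 2, 2, 2],
              [2, 4, 6, 8],
              [3, 6, 8, 10]])   # row 3 = row 1 + row 2, so the rank is 2

n = A.shape[1]
r = np.linalg.matrix_rank(A)
print(n - r)   # 2: two free variables, two special solutions, dim N(A) = 2
```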

  • @brogcooper25
    @brogcooper25 13 years ago +1

    He is not only a master lecturer, he is a master of writing on a chalkboard. I swear, it looks like he is using a paint pen.

  • @karthik3685
    @karthik3685 3 years ago

    There is a problem with Dr. Strang's lectures.
    The problem is, he makes it so intuitive that I'm literally nodding in agreement the entire lecture. I've now watched the lectures once, read the book chapters, and watched the lectures a second time. And while I have a good grasp of everything discussed so far, it all sort of blends together. I couldn't list the things I learnt one by one for these 10 lectures. :D (Well, I sort of can.)

    • @johnk8174
      @johnk8174 2 years ago +1

      ya gotta do the problems. That pulls it together in your head.

  • @AlexanderList
    @AlexanderList 10 years ago +8

    Class is crowded these days, no worries. Don't know why no one was attending back in 2005!

    • @omega7377
      @omega7377 7 years ago +2

      It was actually recorded in 2000 but uploaded to the web in Spring 2005. The dates in video titles are dates of upload, not dates of recording.

  • @ucrclxl
    @ucrclxl 14 years ago +3

    What fascinates me are some stats you can find below this video. Maybe it's a bug, but YouTube tells us that this video is most popular among:
    1) men 45-54 years old
    2) men 35-44
    3) men 25-34
    Which I find really strange, because I'd have thought that most viewers would be actual students.
    Also, popularity by region is an interesting stat.

    • @agarwaengrc
      @agarwaengrc 8 years ago +3

      where exactly can you find these stats? When I click on statistics I just get a viewcount graph

  • @ermomusic
    @ermomusic 4 years ago

    You could also argue that it isn't a basis because -1 times the first vector plus 2 times the second vector gives us the third vector... You really dropped the ball there, professor G. hahahaha just kidding, this man is the best thing that ever happened to Linear Algebra, right after Gauss

  • @yojansh
    @yojansh 4 years ago +4

    Just when I thought he'd run out of blackboard to write on, he moves to the right and, lo and behold, there are more of them

  • @fanggladys9986
    @fanggladys9986 2 years ago

    He is lecturing to an empty classroom if you look at 40:53!! Even more wonders!

  • @ramkrishna3256
    @ramkrishna3256 4 years ago +2

    For finding a basis for N(A), why can't we use an approach similar to finding the basis for the left nullspace?
    1) trans(A) ----> RREF
    2) E' × trans(A) = RREF
    3) find the basis from E'

    • @MultiRNR
      @MultiRNR 9 months ago

      Yes, I have the same question, and this way sounds more mechanical (programmable) than the earlier way
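A sketch of the approach in the question, assuming it means: row-reduce [Aᵀ | I] to get E′ with E′·Aᵀ = rref(Aᵀ); the rows of E′ that pair with zero rows of the rref multiply Aᵀ to zero, hence lie in N(A). The helper and example are hypothetical, in exact arithmetic:

```python
from fractions import Fraction

def reduce_with_E(M):
    """Gauss-Jordan on [M | I]; returns (R, E) with E*M = R."""
    m, n = len(M), len(M[0])
    W = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(m)]
         for i, row in enumerate(M)]
    p = 0
    for c in range(n):
        pr = next((r for r in range(p, m) if W[r][c] != 0), None)
        if pr is None:
            continue
        W[p], W[pr] = W[pr], W[p]
        W[p] = [x / W[p][c] for x in W[p]]
        for r in range(m):
            if r != p and W[r][c] != 0:
                W[r] = [a - W[r][c] * b for a, b in zip(W[r], W[p])]
        p += 1
    return [row[:n] for row in W], [row[n:] for row in W]

A = [[1, 2, 3],
     [2, 4, 6]]                       # rank 1, so dim N(A) = 3 - 1 = 2
At = [list(col) for col in zip(*A)]   # A^T is 3x2
R, E = reduce_with_E(At)

# Rows of E' opposite zero rows of rref(A^T) satisfy row @ A^T = 0 -> N(A).
nullspace_basis = [E[i] for i in range(len(R)) if not any(R[i])]
print(nullspace_basis)                # [[-2, 1, 0], [-3, 0, 1]]
```

So yes, this is programmable, and it mirrors the lecture's left-nullspace computation applied to Aᵀ.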

  • @aymensekhri2133
    @aymensekhri2133 5 years ago +1

    Thank you Prof. Strang

  • @ghsjgsjg53chjdkhjydhdkhfmh74
    @ghsjgsjg53chjdkhjydhdkhfmh74 4 years ago +4

    😖😖 He's the best professor I know, and yet my brain doesn't get it all at once 😂

    • @nonconsensualopinion
      @nonconsensualopinion 3 years ago +2

      That's fine. All at once doesn't matter. What matters is "forever and always". Do what you must to understand it deeply so that you will know it the rest of your life. It may take watching the video many times and will probably require writing down some matrices and doing them yourself. Math is a subject which is hard to learn by observation; it really depends on participation. Remember, the students in the audience were MIT students, so they had proven they were quite talented. Those students saw what you saw in the video. Those students had the ability to talk to this professor after class. Those students had homework practice. Still, when the quiz was administered, I guarantee the average score was below 100%. Even after all that help, some students didn't quite get it all. They didn't get it "all at once". How can you expect yourself to do better than that, especially if you demand it happen "all at once"?

  • @p.z.8355
    @p.z.8355 5 months ago

    Why is he such a good lecturer? My prof used to just read from the textbook.

  • @marcuschiu8615
    @marcuschiu8615 4 years ago +1

    this is mind-blowing
    i don't fully understand it
    but i know it's mind-blowing

  • @abdelaziz2788
    @abdelaziz2788 3 years ago +1

    40:50
    is the best plot twist, awesome

  • @alsah-him1571
    @alsah-him1571 4 years ago +4

    9:45
    Professor Strang subtly integrates class consciousness into his lecture of the Four Fundamental Subspaces.
    Truly a genius.

    • @bokumo7063
      @bokumo7063 3 years ago

      Last hired First fired?

  • @mohammedtarek9544
    @mohammedtarek9544 4 years ago +1

    I'm just gonna write this for people like me.
    If you are like me (didn't finish high school yet, first year), you will have to put some effort into studying this. You'll have to search for some basics you may not have been taught yet, and it will take a lot of time and a lot of searching, but just try, and take the reason you are studying this so early as motivation, because some stuff will be frustrating and hard to understand since you haven't mastered some basics yet. But I know for sure it's worth it.
    For me, I'm studying this to learn machine learning; I also want to do physics engines and stuff like that to help people, and I found that everything I want to do is related to linear algebra, and once I get it, and get deep into it, I can do whatever I want with just some basic research.
    Trust me, it's hard but worth it ❤

  • @magdaamiridi7090
    @magdaamiridi7090 6 years ago +6

    Hello! Does anybody know any other lecturers like Dr. Strang with such passion in fields like convex optimization, detection estimation or probability theory?

    • @q44444q
      @q44444q 5 years ago

      Look up lectures by Steven Boyd. "Stanford Engineering Everywhere" is like Stanford's version of OCW and has some great courses in convex optimization: EE263 and EE364A. They aren't quite as good as Strang's lectures, but he's hard to beat!

    • @nonconsensualopinion
      @nonconsensualopinion 4 years ago +2

      John N. Tsitsiklis has great probability lectures on MIT open courseware here on UA-cam. Highly recommended.

  • @SandeepSingh-hc3no
    @SandeepSingh-hc3no 2 years ago

    It's like an enlightenment moment when he says, "she said, it's got two identical rows"

  • @fuahuahuatime5196
    @fuahuahuatime5196 10 years ago +3

    25:06 So performing row eliminations doesn't change the row space but changes the column space?
    So to get the basis for the column space, would you have to do column elimination for matrix [A]? Or could you take the transpose, do row elimination, and just use that row basis for [A] transpose as the column basis for [A]?

    • @readap427
      @readap427 8 years ago +1

      +Pablo P That's what I was thinking as I watched that part of the video. It seems that approach would work. Before this lecture, it's the approach I probably would have used, but now that I see the tie-in to pseudo-Gauss-Jordan, I think I prefer pseudo-Gauss-Jordan.
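A pure-Python sketch of the transpose-then-row-reduce idea from this thread, using the lecture's example matrix. The `rref` and `transpose` helpers are my own illustration, not code from the course:

```python
from fractions import Fraction

def rref(M):
    """Reduce a matrix (list of rows) to reduced row echelon form."""
    M = [[Fraction(x) for x in row] for row in M]
    nrows, ncols = len(M), len(M[0])
    pivot_row = 0
    for c in range(ncols):
        # Find a pivot in column c at or below pivot_row.
        pr = next((r for r in range(pivot_row, nrows) if M[r][c] != 0), None)
        if pr is None:
            continue
        M[pivot_row], M[pr] = M[pr], M[pivot_row]
        piv = M[pivot_row][c]
        M[pivot_row] = [x / piv for x in M[pivot_row]]
        for r in range(nrows):
            if r != pivot_row and M[r][c] != 0:
                f = M[r][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
    return M

def transpose(M):
    return [list(col) for col in zip(*M)]

# The lecture's example matrix (rows 1 and 3 are identical).
A = [[1, 2, 3, 1],
     [1, 1, 2, 1],
     [1, 2, 3, 1]]

# Row operations preserve the row space: the nonzero rows of rref(A)
# are a basis for C(A^T).
row_basis = [r for r in rref(A) if any(r)]

# They do NOT preserve the column space, so for C(A) transpose first:
# the nonzero rows of rref(A^T), read as column vectors, are a basis for C(A).
col_basis = [r for r in rref(transpose(A)) if any(r)]
```

Both bases come out with r = 2 vectors, matching the rank of the lecture's matrix.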

  • @JohnPaul-di3ph
    @JohnPaul-di3ph 3 years ago

    My mind got blown when I realized you could get the basis for the left null space from the row transformations. I mean, it seems completely obvious after he points it out, but I never thought much of it until then.
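A pure-Python sketch of that trick, tracking the lecture's elimination steps on the augmented block [A | I] for the example matrix (the variable names and `matmul` helper are mine):

```python
from fractions import Fraction

# Row-reduce the augmented block [A | I]: the same row operations that
# turn A into R turn I into E, so that E A = R.
A = [[1, 2, 3, 1],
     [1, 1, 2, 1],
     [1, 2, 3, 1]]
m, n = len(A), len(A[0])
aug = [[Fraction(v) for v in A[i]] + [Fraction(int(i == j)) for j in range(m)]
       for i in range(m)]

# The elimination steps for this particular matrix:
aug[1] = [a - b for a, b in zip(aug[1], aug[0])]      # row2 -= row1
aug[2] = [a - b for a, b in zip(aug[2], aug[0])]      # row3 -= row1
aug[1] = [-a for a in aug[1]]                         # make row2's pivot +1
aug[0] = [a - 2 * b for a, b in zip(aug[0], aug[1])]  # row1 -= 2*row2

R = [row[:n] for row in aug]   # rref(A)
E = [row[n:] for row in aug]   # the recorded elimination matrix

def matmul(X, Y):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

assert matmul(E, A) == R   # E A = R
# R's third row is zero, so E's third row combines A's rows to zero:
# it is a basis vector for the left nullspace N(A^T).
assert R[2] == [0, 0, 0, 0]
assert E[2] == [-1, 0, 1]
```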

  • @encheng1136
    @encheng1136 8 years ago

    There are no students sitting there, but the lecture is still so good.

  • @shavuklia7731
    @shavuklia7731 7 years ago

    Oh cool. I've never computed the nullspace of the row space before. Initially, I thought of computing the nullspace of the columnspace of the transpose, but the method he provides (calculating E) is so easy once you've already done all the work computing the other subspaces.

  • @christophercrawford2883
    @christophercrawford2883 1 year ago

    Nice lecture. Would like to have seen that N(A) and C(A^T) are independent (or even orthogonal!)

  • @xiemins
    @xiemins 4 years ago

    May I say that the vectors in R span the same space as vectors in A after row operation because you can do a reverse ROW operation and construct the same vectors in A from R? It can't be true for column space because after row operations you most likely can't reverse and reconstruct the original column vectors from R through COLUMN combinations.

  • @timelordyunt7696
    @timelordyunt7696 5 years ago

    Taking another look at the list... for the first time I'm glad to see so many lectures left unwatched.

  • @s4mwize
    @s4mwize 6 years ago +2

    Here's a paper by prof. Strang related to this lecture.
    web.mit.edu/18.06/www/Essays/newpaper_ver3.pdf

  • @sauravparajuli4988
    @sauravparajuli4988 4 years ago

    The twist at the end was better than GOT's.

  • @anikislamdu
    @anikislamdu 13 years ago

    Great lecture. I am so grateful to Prof. Gilbert Strang.

  • @Afnimation
    @Afnimation 11 years ago

    I think that to figure out what a subspace looks like, you must ask yourself what a space looks like, because a subspace is in fact a space inside another space. So just imagine that instead of being in a three-dimensional space, we are in a 3-dimensional subspace inside some bigger-dimensional space; the null-space is then the set of all vectors in that bigger-dimensional space that, combined with any array of 3 basis vectors of our "subspace R^3", nullify our space (make it 0).

  • @miladaghajohari2308
    @miladaghajohari2308 3 years ago

    I love these lectures

  • @guptaji_uvach
    @guptaji_uvach 15 years ago +1

    Thanks Dr. Strang

  • @redthunder6183
    @redthunder6183 1 year ago

    God, dude, my school combined multivariable calc and linear algebra into one class, so this entire lecture was only one of four parts of my most recent lecture.

  • @Afnimation
    @Afnimation 11 years ago

    I don't know if I made my point, but if you see the other lectures you will understand better... in fact, I just realized I made a grammar mistake at the end... next time I'll review before posting anything.

  • @ozzyfromspace
    @ozzyfromspace 4 years ago

    At 14:00, a subtlety: the number of basis vectors of C(A) is rank(A), yes, and the pivot columns of the original A form a basis, but the corresponding columns of rref(A) do not in general.
    Doing rref(A) means we're taking linear combinations of the rows of A, so there's no reason to believe the column space will be preserved by the row-reduction operations. In general, C(A) is not equal to C(rref(A)).
    Let A^T stand for the transpose of matrix A. Then the following is always true: *C(A) = C( ( rref(A^T) )^T )*. He's kind of implied this already in previous lectures without using the rote notation; I'm just connecting the dots like a game.
    Edit:
    A simple example is A = [1,2; 2,4]. rref(A) = [1,2; 0,0]. The column space of rref(A) is the x-axis, whereas the column space of A is the line y = 2x. It should be clear that aside from the zero vector, they have no other vectors in common. Further, rref(A) has one pivot column, so the dimension of C(A) should be 1: 2*[1;2] = [2;4] (the columns are linearly dependent, so we really only have *1* vector to scale). Hence, dimension of C(A) = r = 1, as predicted earlier 😊
    Hopefully this helps someone. Professor Strang is one of the best educators out there; I see why people admire MIT! This playlist in linear algebra is the perfect way to prepare for the vectorized implementations of machine learning (if you actually want to understand what you're doing, that is). Best wishes to you all. 👨🏽‍🏫
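A tiny pure-Python check of the A = [1,2; 2,4] example in this comment (a sketch; the helper name is mine, not from the lecture):

```python
def transpose(M):
    """Return the columns of M as a list of lists."""
    return [list(col) for col in zip(*M)]

A = [[1, 2],
     [2, 4]]
R = [[1, 2],   # rref(A): subtracting 2*row1 from row2 zeroes row2
     [0, 0]]

cols_A, cols_R = transpose(A), transpose(R)

# Every column of R has second entry 0, so C(R) lies on the x-axis,
# while every column of A satisfies y = 2x -- the column space changed.
assert all(c[1] == 0 for c in cols_R)
assert all(c[1] == 2 * c[0] for c in cols_A)
```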

  • @flyLeonardofly
    @flyLeonardofly 8 years ago

    I thought he broke with the usual convention of an m × n matrix being m rows, n columns... which confused me, but great lecture anyway...
    edit: I was wrong
    9:12 ... I misunderstood what he said there; of course the column space has m components, because columns go m (row-many) components down... thanks Robert Smits

    • @TheRsmits
      @TheRsmits 8 years ago +2

      He keeps the convention. His example A=[[1 2 3 1], [1 1 2 1], [1 2 3 1]] has 3 rows and 4 columns. The 1st row, [1 2 3 1], has 4 components and is thus an element of R^4.

    • @Neme112
      @Neme112 7 years ago

      He doesn't but it is definitely a convention that confuses me and I have to think twice about the coordinates every time. Usually in mathematics (and in programming) the dimensions are X (meaning horizontal offset) and then Y - here it's reversed. If somebody tells me coordinates [100, 1] I expect it to be far to the right, not way down.

    • @kristiantorres1080
      @kristiantorres1080 5 years ago

      I was confused too! So I scrolled down in the comments looking for a lost soul like me xD Thank you for the explanation Robert!

    • @user-wm8xr4bz3b
      @user-wm8xr4bz3b 5 years ago

      thanks Luis .. you cleared my doubt !! ^^

  • @davidmurphy563
    @davidmurphy563 11 months ago

    3:30 Funny, I was looking at the column space and noticed that -1*C1 + 2*C2 = C3, completely missing the far more obvious fact that R1 = R2. Hey ho.
    Target fixation. It's what sunk the Kaga and Akagi.

  • @souravghosh7558
    @souravghosh7558 5 years ago +1

    Somehow, after some thought, I figured out why the prof says the upper triangular matrices form a subspace of the 3×3 matrices: if we add any two upper triangular matrices we are still in that space, and likewise if we multiply any upper triangular matrix by a scalar. The same goes for the symmetric and diagonal matrices. The prof assumes that all students will decipher this and doesn't spell it out. Otherwise it's a privilege to hear from a genius!
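As a quick sanity check of the closure argument in this comment, here is a small pure-Python sketch (the helper names are mine, not from the lecture) that verifies the two subspace requirements for random 3×3 upper triangular matrices:

```python
import random

def is_upper_triangular(M):
    """True if every entry below the main diagonal is zero."""
    return all(M[i][j] == 0 for i in range(3) for j in range(i))

def add(M, N):
    return [[M[i][j] + N[i][j] for j in range(3)] for i in range(3)]

def scale(c, M):
    return [[c * M[i][j] for j in range(3)] for i in range(3)]

def random_upper():
    return [[random.randint(-9, 9) if j >= i else 0 for j in range(3)]
            for i in range(3)]

# Closed under addition and scalar multiplication -> a subspace
# of the vector space of all 3x3 matrices.
for _ in range(100):
    U, V = random_upper(), random_upper()
    assert is_upper_triangular(add(U, V))
    assert is_upper_triangular(scale(random.randint(-9, 9), U))
```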

  • @jrkirby93
    @jrkirby93 13 years ago +2

    great that we can E=AR this lecture.

    • @OneZombieTrain
      @OneZombieTrain 7 years ago

      Guess you could say the students who understood that were all E=ARs

  • @BVaibhav-mt8jx
    @BVaibhav-mt8jx 3 years ago

    He is so damn good at explaining! I love him!!!!!!!!!!!

  • @apaksoy
    @apaksoy 6 years ago

    I watched this video hoping to learn what the row space and left null space were good for and learned nothing new. This lecture recounts only the definitions of the four fundamental subspaces and their dimensions and works through an example of finding their respective bases. Like in his book "Introduction to Linear Algebra", Dr. Strang takes so many shortcuts and skips over the precise definitions of many concepts, proofs, and important theorems that I find his lectures (and book as well) useful only to a limited extent. I appreciate the effort, but I believe there should be a better way of teaching linear algebra.

    • @syedsaad6929
      @syedsaad6929 6 years ago

      I felt that too, both his lectures and his book lack in rigor and depth. Do you know some other resource?

    • @apaksoy
      @apaksoy 6 years ago

      ​@@syedsaad6929 It has significant shortcomings as well but I found Dr. van de Geijn's course on edX, "Linear Algebra: Foundations to Frontiers", to suit better to my taste, having a better balance of theory vs applications. The fall class has just ended but they have a new one starting 16 January 2019 (totally free if you like) and the professors themselves answer the questions during the course! The course also provides downloadable pdf notes along with the class.
      Having made the above criticism against Dr. Strang's way of teaching linear algebra, I have to acknowledge that nearly all of the (worked) examples I have studied and quite a few of the exercises in his book "Introduction to Linear Algebra" were excellent. Though I have reservations about his approach to teaching linear algebra, I still recommend studying his book but not as the only source.

    • @apaksoy
      @apaksoy 6 years ago

      Also check out these resources which I found helpful at times: 1) Linear Algebra Done Right ( www.linear.axler.net ), 2) immersive linear algebra ( immersivemath.com/ila/index.html ), 3) A First Course in Linear Algebra ( linear.ups.edu/html/fcla.html ).
      The first one is the site for Dr. Sheldon Axler's book which refers to videos based on his book. Videos only provide a summary of his book but still helpful like the abridged version of the book. Unfortunately, you may need his unabridged book to make the most out of his teachings but it is not (legally) freely available. Axler's approach is totally different from than that of Strang and more suitable for math majors than other science and engineering majors but it is so clean and fundamental. It is a good source whenever you want to understand some basic linear algebra concept deeply.

    • @mitocw
      @mitocw 6 years ago +1

      @@apaksoy There are also a number of other courses and resources available for linear algebra on MIT OpenCourseWare. We recommend you check out Herb Gross' "Calculus Revisited: Complex Variables, Differential Equations, and Linear Algebra" (ocw.mit.edu/RES18-008 and/or UA-cam playlist: ua-cam.com/play/PLD971E94905A70448.html ) To see the complete listing of courses related to linear algebra, visit our Course Finder: ocw.mit.edu/courses/find-by-topic/#cat=mathematics&subcat=linearalgebra. Best wishes on your studies!

    • @syedsaad6929
      @syedsaad6929 6 years ago

      @@apaksoy I agree, I do follow the book. Its good for applications of linear algebra which is what I need, but not what satisfies me.

  • @arteks2001
    @arteks2001 3 years ago +1

    Correction of error from previous lecture 0:43
    Introduction to the four fundamental subspaces
    (column space, null space, row space, left null space) 4:20
    Basis and dimension of each fundamental subspace 11:44
    Basis and dimension of the column space 12:50
    Dimension of the row space (it is the rank) 14:41
    Basis and dimension of the null space 17:05
    Dimension of the left null space (m - rank) 19:41
    Basis of the row space (nonzero rows in the rref) 21:08
    Basis of the left null space 29:48
    Review of the four fundamental subspaces 42:09
    A new vector space of all 3 by 3 matrices 42:32

  • @marverickbin
    @marverickbin 6 years ago

    vector spaces of matrices! mindblow!

  • @thejasonchu
    @thejasonchu 9 years ago +2

    thanks Prof and MIT

  • @ChandanKumar-ct7du
    @ChandanKumar-ct7du 6 years ago

    Thank you, Prof. Strang...

  • @habenbelai7420
    @habenbelai7420 4 years ago +1

    36:24
    ...SPORTS. IT'S IN THE GAME!