Matrices, matrix multiplication and linear transformations | Linear algebra makes sense

  • Published 19 Jul 2018
  • Brilliant.org: brilliant.org/LookingGlassUni...
    Previous video on vectors and bases (watch this first): • Linear Algebra makes s...
    Next video:
    • A simple condition for...
    Matrices are often presented to students as useful bookkeeping/computation tools, but there’s much more to them. When you understand what a matrix really is, many parts of linear algebra become completely obvious to you… including the formula for matrix multiplication and the fact that matrices don’t commute. So here's the big secret: a matrix is a linear transformation that eats a vector and outputs another vector.
    Homework questions:
    Not all sizes of matrices can be multiplied together. Think about it in terms of them representing transformations from one space to another, figure out which sizes of matrices can be multiplied, and explain why in the comments.
    Consider a transformation that takes a 3D vector and adds some fixed vector k to it. Say k is the vector (7, 3, 3). Is this a linear transformation or not? brilliant.org/practice/linear...
    Imagine you have a matrix A that multiplies the first basis vector by 2, and the second basis vector by 6. How do you write A in this basis? brilliant.org/practice/linear...
    Music: Epidemic sound, Summer nights 2
    This video is an Introduction to Matrices but could be useful revision for school/university. If you have an exam, good luck!
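
A quick way to see the description's central claim (a matrix is a linear transformation that eats a vector and outputs another vector) is to try it numerically. This is just an illustrative sketch; the matrix M and the vectors below are made up and are not from the video:

```python
import numpy as np

# A 2x2 matrix: a machine that eats a 2D vector and outputs a 2D vector.
M = np.array([[2.0, 1.0],
              [0.0, 3.0]])

v = np.array([1.0, 2.0])
w = np.array([4.0, -1.0])
a, b = 5.0, -2.0

# Applying the matrix is just M @ vector.
print(M @ v)                    # the transformed vector

# Linearity: transforming a linear combination gives the same result
# as transforming the vectors first and then combining them.
print(np.allclose(M @ (a * v + b * w), a * (M @ v) + b * (M @ w)))  # True
```
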
  • Science & Technology

COMMENTS • 88

  • @ScienceAsylum
    @ScienceAsylum 5 years ago +31

    8:59 "no. I don't expect anything anymore" ...this is likely the best plan for understanding quantum mechanics.

  • @casperTheBird
    @casperTheBird 5 years ago +5

    I really appreciate these linear algebra videos. I took a class on this math last year in college and it was pretty aggravating, to say the least, that while everyone could explain to me how to work through matrix multiplication and even very elaborate proofs, NOBODY, not the professor, TA, or tutor, could explain to me what a matrix even was. At which point I was like, if you don't even know what it is, then how in the world could you ever apply this anywhere?

  • @adityakhanna113
    @adityakhanna113 5 years ago +11

    I really like the idea of matrix dimensions representing the dimensions of the input and output vectors/space. I feel dumb for never noticing that before.
    Anyway. Homework! (Look I am excited. Weird, right?)
    1) So, considering the dimension argument, the number of rows represents the output dimensions and the number of columns represents the number of input dimensions. So, like composition of functions, in which the range of one must be the domain of the other, basically, what one produces is what the other must consume. Or in other words, the output of one is the input of the other. Or ("It's enough, Aditya. She gets it." "Okay.")
    So, basically the number of rows (output) of the first transformation (conventionally on the right) should match the number of columns (input) of the other one.
    2) I hope I am not wrong about this, but this is the way I look at it. In the case of linear transformation, T: R^n -> R^n, T(x+y) = T(x) + T(y) must hold true [x and y are vectors in R^n]. Suppose, x = y = 0 (vector), then,
    T(0+0) = T(0) + T(0)
    T(0) = 2*T(0)
    T(0) = 0.
    Thus, any transformation that is a linear transformation must preserve the origin. Fancy!
    Given a translation, that is, T(x) = x + k, we can say that it is not a linear transformation, as it does not preserve the origin.
    Does this work? Yay?
    Also also, comments on the video: I like how you are unafraid of silence; there are parts where you just let the video convey whatever is necessary. I would have thought that silence drives viewers away. Apparently not!
    And like always, I feel bad laughing at the self-deprecating jokes but they are half the reason why I am here, so...
    And, regarding the mathematical education, I feel like a failed product of my system, because I ended up liking Math. But something needs to be done, and I have no idea what.
    Anyway, keep up the good work!
    DFTBA?
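
The argument in the comment above (a linear map must send the zero vector to the zero vector, so a translation cannot be linear) is easy to check numerically. A minimal sketch, using the k = (7, 3, 3) from the homework question; everything else is illustrative:

```python
import numpy as np

k = np.array([7.0, 3.0, 3.0])   # the fixed vector from the homework question

def T(x):
    """Translation by k: T(x) = x + k."""
    return x + k

zero = np.zeros(3)
print(T(zero))                  # [7. 3. 3.], not the zero vector, so T can't be linear

# Additivity also fails: T(x + y) != T(x) + T(y) because k gets added twice.
x = np.array([1.0, 0.0, 2.0])
y = np.array([0.0, 5.0, 1.0])
print(np.allclose(T(x + y), T(x) + T(y)))   # False
```
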

  • @JorgeGarcia-jt4kq
    @JorgeGarcia-jt4kq 5 years ago +2

    OMG!!! I've been bumping my head against the wall thinking I would never understand linear algebra until I saw these videos. This is amazing!!! It is shocking to realize how many people manage to turn such neat and clear ideas into some awful algorithmic torture. Really, thanks a lot for this!!! By far this is the best youtube channel! Keep up the excellent work! ;)

  • @e.s.r5809
    @e.s.r5809 1 year ago +1

    I find it helpful to think of matrix notation like functions. g(f(x)) would say, do f(x) first then do g to the result, so ABv would be like A(B(v)) with the little nested brackets, "Do B to the vector first then A to the result". Then it's clearer they don't commute, the wording is just confusing.
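
The composition reading of ABv described above is exactly what matrix multiplication implements, and it also makes the order-dependence concrete. A small sketch with two made-up matrices (a 90-degree rotation and a projection, chosen only for illustration):

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])    # rotate 90 degrees counter-clockwise
B = np.array([[1.0, 0.0],
              [0.0, 0.0]])     # project onto the x-axis

v = np.array([3.0, 4.0])

# (A B) v is the same as A (B v): "do B to the vector first, then A".
print(np.allclose((A @ B) @ v, A @ (B @ v)))   # True

# But A B and B A are different transformations.
print(A @ B)   # project, then rotate
print(B @ A)   # rotate, then project -- not the same matrix
```
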

  • @louiebafford1346
    @louiebafford1346 5 years ago

    Amazing video the intuitive explanation was so well done

  • @cycklist
    @cycklist 5 years ago +4

    Wonderful sense of humour!

  • @adarshkishore6666
    @adarshkishore6666 3 years ago

    Your way of teaching and sense of humour is great!

  • @adityakhanna113
    @adityakhanna113 5 years ago +4

    You're not lazy, you're resourceful!

  • @ohyouresilly7366
    @ohyouresilly7366 5 years ago

    Yes, part two! And - as usual - you've done an excellent job explaining a concept that I doubt I'd ever be able to understand otherwise (although I'll have to wait until I'm home to do the examples since it won't work for me on mobile.) I agree with what you noted in this video around the 12 minute mark though; there isn't much point in knowing something without also understanding it. I was taught most of this stuff in high school about a decade ago and it just never stuck, yet here I am successfully learning how it actually works from a youtube channel... Thanks US educational system! :P
    But in seriousness, I'm really glad you're making vids at a regular rate these days. I can't wait to see more from you! In the meantime, have fun down under!
    Sincerely, a fan who is most definitely "a bit weird" ;)

  • @steffahn
    @steffahn 5 years ago +8

    I love how mathematically on point your video is. It communicates precisely what matrices are / what they are good for. I can’t comment on whether this presentation is optimal for helping people who didn’t understand the topic before. Probably it *only* fits those who have worked with matrices before and just need more intuition behind what it’s all for. I guess if someone is completely new to the topic this is only going to feel confusing, as the details are missing in places.

    • @AidanRatnage
      @AidanRatnage 5 years ago

      I got lost quite early on and I have worked with matrices before.

  • @caioreis350
    @caioreis350 5 years ago

    I'm glad you are back

  • @PraveenYoga
    @PraveenYoga 5 years ago

    Wow such teaching, much learning

  • @mayukh3556
    @mayukh3556 5 years ago +17

    You changed the way I look at math

  • @seanziewonzie
    @seanziewonzie 5 years ago +1

    >I don't understand why some courses insist on making linear algebra awful and algorithmic. What's the point of 'knowing' how to multiply matrices if you don't understand it.
    From my experience, it's because a lot of students will whinge if you try and explain what things "mean". Professors often get stuck in a "you can't win with everyone" situation and decide to (or are instructed to) err towards teaching algorithms so we can at the very least guarantee that every engineer who passes the class knows how to multiply matrices. I disagree with this practice in the modern era -- MatLab exists, so the only challenge left is knowing which matrices to choose, and understanding the meaning of matrices is the only way to make this process straightforward!
    Some schools have made the decision to make TWO "intro to linear algebra" courses. To anyone reading this who hopes to go into physics, I advise you to go into the more mature of the two. It is not harder, or more work; it is designed with a different ethos. The meanings of the objects of linear algebra are explained as well as they are in this series. Key words to look out for in the course descriptions are "proof-based" or "abstract vector spaces". There is probably no better example of the conceptual power of abstract vector spaces than states in QM.

  • @nachannachle2706
    @nachannachle2706 5 years ago

    I like the way you try to "teach" an intuitive approach.
    As someone who naturally gravitates towards spotting patterns and schemata behind anything that I do and learn, I found this very difficult to stick to. It might be because there is a bit of "memorisation" and "manipulation" involved here, something I specifically suck at.
    Regardless, this proves to me, once again, that teaching and learning maths is a question of personal preferences: whatever method/approach works for one person to a certain degree might work for others but to a lesser extent. :)

  • @Hecatonicosachoron
    @Hecatonicosachoron 5 years ago

    I'm surprised that there are no row operations, haha!
    Interesting vids, it's always nice to see different presentations on the basics :)

  • @electromorphous9567
    @electromorphous9567 5 years ago

    *YOU'RE ALIVE!!!*

  • @ThePCxbox
    @ThePCxbox 4 years ago +1

    You're literally better than Khan Academy and 3Blue1Brown (if you know who that is) combined! Please never stop making videos, they don't even need to be animated

  • @TheViolaBuddy
    @TheViolaBuddy 5 years ago +1

    I do have to say that AB != BA is not quite as obvious as you make it seem. Especially since we just talked about linear transformations, whose definition involves doing operations in different orders and getting the same thing. But of course many lin. alg. classes don't go into the intuition of why it's not commutative, which is what you were presenting here. Speaking of intuition, I guess it really is true that all matrices are transformations - at least all that I can think of. I knew that they _can_ be, but I never really thought of that as being the fundamental identity of a matrix.
    As for the homework questions, they're less proof-y this week! I answered the multiple choice questions, but as for the first one, "When can you multiply an (n x m) matrix by a (p x q) one?" the answer is of course "m = p": the p x q matrix takes in a q-dimensional vector and spits out a p-dimensional one. You now need a matrix that can take in a p-dimensional vector (regardless of what it spits out), so our second (n x m) matrix has to accept a p-dimensional vector; that is, m = p.
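
The shape rule worked out in the comment above shows up directly in numpy, which refuses the product when the inner dimensions don't match. The sizes here are arbitrary, purely for illustration:

```python
import numpy as np

A = np.ones((4, 3))   # n x m: takes a 3D vector, outputs a 4D vector
B = np.ones((3, 5))   # p x q: takes a 5D vector, outputs a 3D vector

print((A @ B).shape)  # (4, 5): works because m == p == 3

C = np.ones((2, 5))   # outputs a 2D vector, but A expects a 3D input
try:
    A @ C
except ValueError as e:
    print("incompatible shapes:", e)
```
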

  • @OnTheThirdDay
    @OnTheThirdDay 5 years ago

    Hello. Thank you for the videos.
    Question: Should we fill in the survey/poll questions if we are already familiar with linear algebra? (I am and did....)

  • @Erioch
    @Erioch 5 years ago

    Very nice video! loved the comments here and there :D

  • @viniciuslambardozzi4358
    @viniciuslambardozzi4358 5 years ago

    Is this actually a recording of your hand as you write? It's so smooth, it almost seems like a tool for animating over strokes.

  • @mranonymous5268
    @mranonymous5268 5 years ago +3

    Also, I just noticed, these are stopmotion vids right? Damn, how long does it take you to just make the footage???

  • @avanishpadmakar5897
    @avanishpadmakar5897 5 years ago

    1) The number of linear transformation rules for basis vectors = the number of basis vectors.
    For some multiplication AB this means
    number of columns in A = number of rows in B.
    2) In some basis [|1⟩, ..., |N⟩] let |v⟩ = Σ_n a_n|n⟩ be an arbitrary vector, and let
    T(|v⟩) = |v⟩ + |k⟩.
    Then
    Σ_n a_n T(|n⟩) = Σ_n (a_n|n⟩ + a_n|k⟩)
                   = Σ_n a_n|n⟩ + |k⟩ Σ_n a_n
                   = |v⟩ + |k⟩ Σ_n a_n,
    which is not T(|v⟩) = |v⟩ + |k⟩ in general, hence T is not linear.
    3) For some basis [u, v],
    A[u, v] = [xu, yv].
    If u = [1, 0] and v = [0, 1], we get for x = 2 and y = 6
    A = 2 0
        0 6

  • @mc4444
    @mc4444 5 years ago

    I remember our teaching assistant had us write "matrix multiplication is NOT commutative" in red pen, then underline it and then circle it. It was effective but I got a much better understanding of matrices only later, in a computer graphics course.

  • @abhishekbhattacharjee495
    @abhishekbhattacharjee495 5 years ago

    And yes, we can also enjoy research-type lectures (with minimal animation if you want)... of course it'll consume a lot of your time, but I think quality videos will enhance ideas for both...

  • @kauhanen44
    @kauhanen44 5 years ago +2

    I think you can only multiply two matrices together if the output dimension (height) of the matrix applied first (the one on the right) is equal to the input dimension (width) of the other one. Otherwise it makes no sense, because you can't transform an n-dimensional vector space with a matrix meant to transform m-dimensional vector spaces where n != m.

  • @mranonymous5268
    @mranonymous5268 5 years ago

    Homework:
    1) Two matrices can only be multiplied if the first (rightmost) has the same number of rows as the second (leftmost) has columns. That is because if the first one spits out, say, two-dimensional vectors (n rows = n dimensions, this may be too simple), and the second one takes in three-dimensional ones, well, that doesn't make sense.
    2) No. Example (horizontal vectors because there's no other option that I know of): the transformation adds vector v = (3 3).
    Basis vectors: i = (1 0); j = (0 1). To keep it simple, I just add i and j, which gives me (1 1), then add v and I end up with (4 4).
    On the other hand, i + v = (4 3); j + v = (3 4). Adding these two gives me (7 7), which is clearly not the same as (4 4).
    3) Here's my best try at making a matrix...
    ( 2 0 )
    ( 0 6 )
    not quite, but it works I suppose :)
    Awesome video! Indeed, I hope that somebody realizes at some point that teaching the underlying principles is the way to go. I honestly don't see this problem the way it is described by many (YouTubers), but hey, I haven't even graduated (the Dutch version of) high school yet :)

  • @shivChitinous
    @shivChitinous 5 years ago +1

    So there are constraints on which vectors can be acted upon by a transformation. You can’t apply a matrix with 6 columns to a vector with 3 components, because each column basically tells you what to do to one of the basis vectors. And clearly a matrix spits out a vector that has the same no. of components as the matrix has rows. So an mxn matrix will give you a vector with m components. And this vector can only be operated on by a pxq matrix B if q=m.

  • @Pyriphlegeton
    @Pyriphlegeton 5 years ago +30

    Is that AB=/=BA Thumbnail a psychological Trick to garner views from people hyped for Mamma Mia 2?

    • @Pyriphlegeton
      @Pyriphlegeton 5 years ago +5

      And...apparently their boyfriends overthinking Thumbnails?
      Because at least one of them worked! :D

    • @Andoresu96
      @Andoresu96 5 years ago

      mathbait confirmed

  • @sarafranco43
    @sarafranco43 4 years ago +2

    O.o I still don't get why at 7:55 the matrix cannot be
    (1 0)
    (1 0)
    I thought that since we cannot change the first basis vector, the left column (which affects that first basis vector) should keep its components unaltered, therefore multiplying them by 1.
    What am I missing here?

    • @biblebot3947
      @biblebot3947 3 years ago

      Because the columns of the matrix are the components of the new basis, the second basis vector becoming the zero vector will collapse a part of the space (the second dimension)

    • @Trucmuch
      @Trucmuch 2 years ago

      You're right that the first vector should be unaltered, and that's exactly what her matrix does.
      If the basis vectors were called i and j, the column
      (1
       1)
      would not mean "leave i's components unaltered"; it would mean "transform i into 1 i + 1 j".
      On the other hand,
      (1
       0)
      means i becomes 1 i + 0 j (i.e. leave i unaltered).
      Hope it helps.
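
One way to see the point of this reply numerically: multiplying a matrix by the first basis vector picks out its first column, so the columns really are the images of the basis vectors. The matrices below are illustrative (the second one is not necessarily the answer in the video):

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

wrong = np.array([[1.0, 0.0],
                  [1.0, 0.0]])    # columns are (1,1) and (0,0)
print(wrong @ e1)   # [1. 1.]  -> e1 is sent to i + j, not left alone
print(wrong @ e2)   # [0. 0.]  -> e2 collapses to the zero vector

keep_e1 = np.array([[1.0, 0.0],
                    [0.0, 0.0]])  # first column (1,0): e1 stays put
print(keep_e1 @ e1)  # [1. 0.]  -> e1 really is unaltered
```
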

  • @roxes787b
    @roxes787b 2 years ago

    At 8:55 AB is not equal to BA. Is that directly the reason commutators were invented, to capture how much the order matters?

  • @avanishpadmakar5897
    @avanishpadmakar5897 5 years ago

    For the question at 7:26, were we supposed to assume the basis vectors were [1,0] and [0,1]? Would we in general, for basis vectors [a,b] and [c,d], get
    M = a 0
        b 0 ?

  • @GauravGandhiOfficial
    @GauravGandhiOfficial 4 years ago

    Why isn't the determinant of a 3×2 matrix defined? The input space is a 2D plane and output space is also a 2D plane in 3 dimensions. So our unit square in the input space has converted into a parallelogram on that plane in 3 dimensions. Thinking about it this way, shouldn't the determinant be defined?

  • @billylee5624
    @billylee5624 2 years ago

    Nice geometric interpretation of the non-commutative matrices multiplied. Do you physicists do similar calculations with matrices like economists? I know a common formula is something like this z(XPP'X')^-1'(XPz)'=z(P'X'XP)^-1P'X'z' in econometrics.

  • @dylanparker130
    @dylanparker130 5 years ago

    wait, in that first poll question, i thought you were asking about a matrix that effectively made a dot product style projection. or even just taking the cosine of the first vector's angle in obtaining the 2nd vector?
    that's non-linear, surely?

    • @guidogaggl4020
      @guidogaggl4020 5 years ago

      It is the cosine of the angle times the length of the vector, and it's linear.
      By the way, the dot product with ONE fixed vector (for example the x-unit vector, as in the example) is also linear, no matter which fixed vector.

  • @justpaulo
    @justpaulo 5 years ago

    Wait, if you go to Australia how do I go to your office to clear up some doubts??!
    And believe me, I have several... I feel my brain just transformed into some hot noodle mess!! My mind got blown away on the one hand, but on the other hand I am confused...
    Maybe I was just overexposed to new knowledge haha :). I guess I'll have to play your video on repeat till it sinks in...
    Linear algebra confused me in college and it seems it hasn't lost any of its powers over the years! haha (it actually gives me some strange feeling in my gut to think about it)
    Anyway, have a good one among the kangaroos!

  • @joaquinbadillogranillo8252
    @joaquinbadillogranillo8252 3 years ago

    A vector that's perpendicular to the projection line becomes null when projected, and rotating it afterwards leaves it null. However, if (in R^2) we rotate it first, it gains a component along the projection line, so projecting it afterwards won't give the null vector :)

    • @joaquinbadillogranillo8252
      @joaquinbadillogranillo8252 3 years ago

      The number of columns represents the input space dimension and the number of rows represents the output space dimension. Since the second matrix in the multiplication is the transformation we apply first, the output of that matrix should have the same dimension as the input of the first matrix in the multiplication. That is, p should be equal to m.

    • @joaquinbadillogranillo8252
      @joaquinbadillogranillo8252 3 years ago

      On the second one, I think it's not a linear transformation. This is because if we take two random vectors, x and y, and apply the transformation to their sum we get:
      T(x + y) = (x + y) + k
      However, if we took the transformations separately and add them:
      T(x) + T(y) = x + k + y + k = (x + y) + 2k
      Therefore:
      T(x + y) != T(x) + T(y)
      Hence it is not a linear transformation.

    • @joaquinbadillogranillo8252
      @joaquinbadillogranillo8252 3 years ago

      For the third one let's consider the basis B = {e_1, e_2}.
      The matrix A does the following to the basis vectors:
      A(e_1) = 2 e_1
      Since it is a basis vector we should write it as e_1 = (1,0)
      Then A(e_1)=(2,0)
      Moreover:
      A(e_2) = 6 e_2
      Since it is a basis vector we should be able to write it as e_2 = (0,1)
      Then A(e_2)=(0,6)
      The matrix takes these transformed basis vectors as its columns:
      A = 2 0
      0 6
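
The recipe in the reply above (stack the images of the basis vectors as columns) can be written out directly. The scale factors 2 and 6 come from the homework question; the code itself is just an illustrative sketch:

```python
import numpy as np

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Where the transformation sends each basis vector:
A_e1 = 2 * e1      # (2, 0)
A_e2 = 6 * e2      # (0, 6)

# Stack those images as columns to get the matrix in this basis.
A = np.column_stack([A_e1, A_e2])
print(A)
# [[2. 0.]
#  [0. 6.]]

# Sanity check on an arbitrary vector: A acts as "scale x by 2, y by 6".
print(A @ np.array([1.0, 1.0]))   # [2. 6.]
```
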

  • @satyabrata3622
    @satyabrata3622 2 years ago

    wow creative .....

  • @Dr_LK
    @Dr_LK 1 year ago

    Vectors and transformations in geometry are only a small application area of matrices.

  • @KalpitaDas-eq9kg
    @KalpitaDas-eq9kg 9 months ago

    Can anyone suggest a good book for visualisation of vectors, covectors, dual bases and matrices? By the way, I'm a beginner (physics background) and really want to visualise these mathematical tools.

  • @LoganDunbar
    @LoganDunbar 1 year ago

    At 5:10 I think you made a mistake by saying v_3 = -1.5v_1 - 1.5v_2. When you flip a vector the tail should remain in the same place, but you flipped from the head, so to me it looks like it should be v_3 = -1.5v_1 - 0.5v_2. (i.e. v_2' is only half a green grid line from the origin, not 1.5 times)

  • @abhishekbhattacharjee495
    @abhishekbhattacharjee495 5 years ago

    Hi LGU, this is a good video for undergrads, but I am a physics student doing my MSc and these types of videos (the last two) were not very useful to me, unlike your views regarding quantum mechanics and mechanics, which I really enjoyed. I would suggest you cover some topics which are useful to postgrads, like nonlinear dynamics and group theory, along with some of their applications. I'll be TRULY GRATEFUL.

  • @heyandy889
    @heyandy889 5 years ago +7

    Maybe it's because I don't understand vectors, matrices, and linear combinations very well yet, but I found the frequent meta-comments pretty distracting.

    • @adityakhanna113
      @adityakhanna113 5 years ago

      They are, but you get used to them over time

  • @culwin
    @culwin 5 years ago +3

    Vectas

  • @guidogaggl4020
    @guidogaggl4020 5 years ago

    Wow, the 1st homework was really brilliant. Before this video my answer would be: well, because the matrix multiplication formula wouldn't work.
    But now it's like, of course: you can't make a transformation from n to m dimensions if your vector space doesn't have n dimensions after the first transformation (aka the first matrix).
    It all makes sense now.

    • @LookingGlassUniverse
      @LookingGlassUniverse 5 years ago

      I'm so so happy to hear this! I remember when I realised this too and it all finally clicked for me. It's a good feeling.

  • @kemubadap3643
    @kemubadap3643 3 years ago

    0:48 Ignotum per ignotum

  • @patrickwienhoft7987
    @patrickwienhoft7987 5 years ago

    3:14
    Really small criticism: Try to make your voiceover order match what you're writing. You say "first do a lin. comb. and then multiply by 3" but at the same time you write the 3 first, then do the lin. comb. This can really throw some people off.
    I imagine you do the visuals first and then do the audio? Do you have the audio written out when recording the visuals or do you only have a general outline but no concrete sentences?

  • @casperTheBird
    @casperTheBird 5 years ago

    I like to think of matrix multiplication more like applying a function to a value. "Multiplying" is pretty much a flat out lie, if anything

  • @saththiyambharathiyan8175
    @saththiyambharathiyan8175 4 years ago

    Why are vectors written column-wise.....?

  • @electrikshock2950
    @electrikshock2950 5 years ago

    *sees the thumbnail*
    And that's why we have commutators

  • @timanderson5717
    @timanderson5717 5 years ago

    All nonzero vectors will be different if you do the transformation in a different order.

    • @michaelsommers2356
      @michaelsommers2356 5 years ago

      That's not true. If A and B are inverses of each other, then AB = BA, for example if A rotates 90 degrees clockwise and B rotates 90 degrees counter-clockwise. Or, more generally, any two rotations can be done in either order: rotation by 20 degrees and then by -10 degrees is the same as rotation by -10 degrees and then by 20 degrees. It isn't hard to come up with examples that don't involve rotations, too.

    • @timanderson5717
      @timanderson5717 5 years ago

      Using the example transformations given in the video, my statement holds.

    • @michaelsommers2356
      @michaelsommers2356 5 years ago

      You didn't restrict your statement to the examples in the video. Your statement is not true in general, which is how you expressed it.
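
The examples in this thread are easy to check numerically: two plane rotations commute with each other, while a rotation and a projection do not. A sketch using the angles mentioned above (the projection is only an illustrative counterexample):

```python
import numpy as np

def rotation(deg):
    """2D rotation matrix for an angle given in degrees."""
    t = np.radians(deg)
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

R20  = rotation(20.0)
Rm10 = rotation(-10.0)
print(np.allclose(R20 @ Rm10, Rm10 @ R20))   # True: rotations commute

P = np.array([[1.0, 0.0],
              [0.0, 0.0]])                    # projection onto the x-axis
print(np.allclose(R20 @ P, P @ R20))          # False: here the order matters
```
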

  • @brynwhitehead1731
    @brynwhitehead1731 5 years ago

    Mama Mia that's crazy.

  • @LG-qj9xv
    @LG-qj9xv 5 years ago

    The comments are entertaining but very distracting

    • @steffahn
      @steffahn 5 years ago +1

      Well, psychologically speaking (not an expert though) distraction might even help with understanding, because it forces you to focus more to be able to follow along. More focus = better retention.

    • @LG-qj9xv
      @LG-qj9xv 5 years ago

      Frank Steffahn Very good point

  • @tpb2
    @tpb2 1 month ago

    Good, but I find the visual style really distracting and unhelpful.

  • @tungom8752
    @tungom8752 5 years ago

    No. A matrix is an array of numbers and a vector is a column/line of numbers. See the definition. Anything else is just visualization tricks.

    • @tungom8752
      @tungom8752 5 years ago

      Confusing things and their applications is not a good idea at all.

  • @tokajileo5928
    @tokajileo5928 5 years ago

    The background music is really annoying and reduces concentration. Why must videos explaining things always have music in the background? What is this obsession with background music when the important thing is what the person says? I can never understand that.

  • @ManojKumar-cj7oj
    @ManojKumar-cj7oj 3 years ago

    1 yes
    2 no

  • @jonasdaverio9369
    @jonasdaverio9369 5 years ago

    Actually, a matrix is really an array of numbers but a vector is not. A vector in a basis could be represented by a column matrix. A matrix is a basis-dependent representation of a linear transformation.
    That's the formal aspect, but everything is OK if you look at a matrix and see a linear transformation; just pick any basis you want.

    • @46pi26
      @46pi26 5 years ago +1

      Jonas Daverio what a vector or a matrix "is" is totally up to interpretation. 3blue1brown's Essence of Linear Algebra series describes this more in depth, but it's kind of like defining the Gamma function. Some people say that its factorial form is the definition, and the integral just so happens to work, whereas others say the integral is the definition, and the factorial just so happens to work. So really, no one's wrong about their definition of vectors and matrices, as long as they obey the right axioms.
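
One way to make the "basis-dependent representation" point above concrete: the same linear transformation is written as different arrays of numbers in different bases, related by the standard change-of-basis formula P^{-1} A P. The map and the basis below are made up purely for illustration:

```python
import numpy as np

# The transformation "scale x by 2, scale y by 6", written in the standard basis.
A = np.array([[2.0, 0.0],
              [0.0, 6.0]])

# A different basis: columns of P are the new basis vectors (in standard coordinates).
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# The same transformation, now expressed in the new basis.
A_new = np.linalg.inv(P) @ A @ P
print(A_new)                         # a different array of numbers...

# ...but it represents the same map: transforming a vector's new-basis
# coordinates and converting back agrees with applying A directly.
v = np.array([3.0, -1.0])            # a vector in standard coordinates
v_new = np.linalg.inv(P) @ v         # its coordinates in the new basis
print(np.allclose(P @ (A_new @ v_new), A @ v))   # True
```
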

  • @nicolasvallee5992
    @nicolasvallee5992 5 years ago

    3b1b

  • @chicchi1682
    @chicchi1682 5 years ago

    ok this one lost me

  • @bigdx5059
    @bigdx5059 5 years ago +4

    Can you do a face reveal pleeease???

    • @bigdx5059
      @bigdx5059 5 years ago +3

      DeadLink 404 no

  • @schitlipz
    @schitlipz 5 years ago

    Why are your lovely hands doing high school math now?
    D'oh! I meant, "A hand!!!"

    • @schitlipz
      @schitlipz 5 years ago

      Aw don't shrink away, Alice. Just trying to give some levity. But, actually, matrices and vectors were high school math back in the day... not today, unfortunately. Speaking of Wonderland, can Ant-Man tech ever become a reality? :)