Tensors for Beginners 1: Forward and Backward Transformations (REMAKE)

  • Published Nov 29, 2022
  • Tensors for Beginners playlist: • Tensors for Beginners
    Leave me a tip: ko-fi.com/eigenchris
    I made a mistake in the original version of this video that has been confusing people for years. Super late, but trying to make amends.

COMMENTS • 95

  • @thedorantor
    @thedorantor 1 year ago +152

    The fact that after so many years you still bother to answer questions from comments under old videos and even remake a 4 year old video... You're an amazing teacher, nothing but respect for you!

  • @Jonas-Seiler
    @Jonas-Seiler 1 year ago +31

    I have never seen the visualisation of vectors you show at the very end of the video before. This is basically an epiphany for me. Taking the basis to be rows while taking the components to be columns suddenly makes everything I have heard about vectors and covectors make sense. This video has already been truly invaluable to me.

    • @eigenchris
      @eigenchris  1 year ago +16

      I didn't actually realize that's what you're supposed to do until half-way through making this video series. I think a viewer pointed it out to me. Part of the reason I did a "REMAKE" of this video was to state that fact. It's a bit shocking it's not more common.
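The row-basis/column-components convention described above can be checked numerically. A minimal NumPy sketch, using the forward matrix from the video (its columns, [2, 1] and [-1/2, 1/4], are the new basis vectors written in the old basis):

```python
import numpy as np

# Forward transformation from the video: columns are the new basis
# vectors written in the old basis, i.e.
#   e~1 = 2 e1 + 1 e2,   e~2 = -1/2 e1 + 1/4 e2.
F = np.array([[2.0, -0.5],
              [1.0,  0.25]])
B = np.linalg.inv(F)  # backward transformation

# Convention: basis vectors form a ROW, components form a COLUMN.
# The basis transforms with F ("covariantly"): [e~1 e~2] = [e1 e2] F,
# while components transform with B = F^(-1) ("contravariantly").
v_old = np.array([[2.0], [1.0]])  # components of a vector v in the old basis
v_new = B @ v_old                 # components of the same v in the new basis

# The vector itself is unchanged: (e F)(B v) = e (F B) v = e v.
print(v_new.ravel())        # [1. 0.] -- this particular v is exactly e~1
print((F @ v_new).ravel())  # [2. 1.] -- back to the old-basis components
```

The opposite transformation rules for basis rows and component columns are exactly why the combination (row of basis vectors) x (column of components) is basis-independent.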

  • @nstrisower
    @nstrisower 1 year ago +4

    WHAT THE FUCK HOW DID I NEVER MANAGE TO LEARN ABOUT THAT WAY OF MULTIPLYING MATRICES
    that alone makes this video a literal godsend tbh thank u so much

  • @atzuras
    @atzuras 1 year ago +5

    This remake is very welcome. One of the best introductory courses I've found on this platform.

  • @linuxp00
    @linuxp00 1 year ago +17

    I'd like to thank you for your care, even on such early lessons like that. Personally, your lectures helped me in a presentation about Schwarzschild's solution to Einstein's Field Equations

  • @dalisabe62
    @dalisabe62 1 year ago +2

    Excellent presentation. It assumes that the student knows very little or nothing. This shows experience in teaching and good common sense on the part of the instructor.

  • @michaelzumpano7318
    @michaelzumpano7318 1 year ago +5

    I watched your series on general relativity. It was awesome! You make these subjects so navigable. I hope you never stop.

  • @etiennecameron7783
    @etiennecameron7783 1 year ago +5

    Thank you Chris. You answered the exact question I had yesterday in your tensor calculus series. I didn't understand why the Jacobian matrix was not transposed. Basis vectors are covariant. Covariant vectors are covectors. Covectors multiply from the left. Thank you.

  • @jonpritzker3314
    @jonpritzker3314 1 year ago +2

    Heaping on the deserved laudations. Your effort is noble and nearly brings me to tears. Thank you.
    I've studied row and column multiplication, and I started Arfken's Math for Physicists, where one of the opening problems is rotating a 2D basis, and I've heard people say "never mind that torque is a pseudo-bivector," but I did not know that linear algebra and that Arfken stuff were baby tensor math. Your presentation has got me excited, because it might not be as scary as I thought :D
    Second smiley for emphasis :D

  • @mMaximus56789
    @mMaximus56789 1 year ago

    Hopefully you updated/continue this series!

  • @msontrent9936
    @msontrent9936 2 months ago

    Excellent explanation. Thank you for taking the time ...

  • @Michallote
    @Michallote 1 year ago +1

    Thank you for the remake!

  • @quantumspark343
    @quantumspark343 1 year ago

    Excellent quality as always

  • @MrMilesfinn
    @MrMilesfinn 1 year ago +1

    Thanks from an experimental physicist--great work.

  • @phijiks
    @phijiks 1 year ago +6

    I was reading quite a few books on tensors for beginners, and then I came across your Tensors for Beginners series (many years ago, when you had just started making those initial videos). Initially it felt like the best source for learning tensors, but after the error in this particular video, and then the correction video you made earlier, it all felt so confusing that I tried many times, on multiple occasions, to clear things up and make coherent sense of all the books I have and your (this particular) video, but I couldn't. Ultimately, each time I had to drop this topic and move on, only to get stuck later because I didn't know the basics well.
    I really hope that this time, when I give it a new try with this video, I really can make sense of all the books and your videos and finally understand the proper basics.
    Meanwhile, all these years I watched you move ahead, continuing your series all the way up to general relativity, and it made me wonder if you could revisit your older videos and make a correction for people like me, so I could reach the advanced videos with you as well. In hopes, I always had your notifications ON. And wow, finally you did that today. I am so, so grateful to you for that. I'm a high school physics teacher, but tensors have always been something I really, really want to understand and learn properly, and you are the best source I have found to date. Again, thank you for this correction video.

    • @eigenchris
      @eigenchris  1 year ago +6

      Yeah, I'm really sorry about the error. I was pretty inexperienced when I made this and had no idea so many people would watch it. I thought the correction video would help, but I realized it just confused some people more. I'm late to the party fixing this, but hopefully it will help future people who watch.
      Also, my relativity series covers tensors from the beginning in a much more understandable way (I think). I'd suggest watching the Relativity 102 videos if you are interested.

    • @phijiks
      @phijiks 1 year ago +2

      @@eigenchris Woah, this is the second time I got your reply 🥺🥺 ... And yes, the correction video created more confusion 😅. I understand your concern that you were learning at that time, but even then your videos made much more sense than mere mathematical definitions of tensors. And yes, better late than never; a huge shoutout to your effort in remaking this video. I will see your other series as well (on relativity); first I'm going to resume this series again and see if I can get through this time. Heartfelt thank you for this remake, Chris 🙂

  • @johnsimspon8893
    @johnsimspon8893 1 year ago

    Kudos, man. That mistake caused me much difficulty for years.

  • @omnipotentpotato2436
    @omnipotentpotato2436 6 months ago

    Great video chris!

  • @tomgraupner171
    @tomgraupner171 1 year ago

    Thanks a lot. Still hope to see you doing stuff on Dirac spinors, QM and QFT!

    • @tomgraupner171
      @tomgraupner171 1 year ago

      That's so cool - I wished - You made it .... Wonderful life

  • @robertlouis9083
    @robertlouis9083 1 year ago

    Wow, I just found this video for the first time and haven't watched it yet, but I'm sure glad I missed whatever mistake was in the original one, because by the sound of these comments it was a doozy.

  • @jonnymahony9402
    @jonnymahony9402 1 year ago

    To understand all of this you really need to have this notation and the linear algebra manipulations under your belt; the same is true for quantum field theory. I'm often lost in notation 😂

  • @manfredbogner9799
    @manfredbogner9799 5 months ago

    very good

  • @Mikey-mike
    @Mikey-mike 1 year ago +7

    Thanks, EigenChris.
    Actually, your original video on this subject, with your mistake, was a good pedagogic device for pointing out the matrix entries of the individual equations.
    You are one of a kind, an excellent teacher.

  • @martin2ostra
    @martin2ostra 1 month ago

    Thanks

  • @vincenzocotrone4370
    @vincenzocotrone4370 2 months ago

    Good morning,
    Could you please suggest a book which explains the subject in a way similar to yours?
    With exercises as well.
    Thank you very much in advance.

  • @aditya_a
    @aditya_a 1 year ago +3

    Hey, thanks for this series! Just a question - In the very beginning, you say that the matrix representing the "forward transformation" eats vectors in the old, blue basis and outputs vectors in the new, red basis. But this doesn't quite sit right with me. If I feed the vector v with components (1, 0) into this matrix via F*v, the result is (2, 1). So clearly, what seems to be happening is F eats vectors in the red, what you call "new" basis, such as (1, 0), and outputs vectors in the blue, what you call "old" basis...

    • @jongraham7362
      @jongraham7362 1 year ago

      I think you are correct, if you are multiplying from the right, Aditya. To go from the new basis to the old, you write the "new basis elements" in terms of the old basis. So the first column is [2, 1] and the second column is [-1/2, 1/4]; again, these are the coordinates of the new basis elements {e˜1, e˜2} in the old basis {e1, e2}. These coordinates make up the columns of the matrix. This will take you from the new basis to the old basis. Plugging in something on the right with coordinates in the new basis will give you that point in the old basis. For instance, plugging in (1, 0) from the new basis will give you (2, 1) in the old basis. Plugging in (0, 1) in the new basis will give you (-1/2, 1/4) in the old basis. He is not multiplying on the right; he is multiplying on the left. It makes much more sense to me multiplying on the right as a transformation; it is not clear to me why he is multiplying on the left. I am hoping, though, that this will not cause issues further along, because he is someone who fleshes out issues with tensors that I struggle with, which others seem to skim over.

    • @jongraham7362
      @jongraham7362 1 year ago

      In fact, if you go on to the next video, you will discover that Chris understands that F goes from the new basis to the old basis: plug in something with new-basis coordinates and you get that point in the old-basis coordinates, and B goes from the old to the new. This makes sense for B because it is the old basis elements (1, 0), (0, 1) written in the new-basis coordinates. He describes it as being backwards, but that is because he has it backwards in this video. No worries though; I think his videos are well done.
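The direction question in this thread can be checked numerically. A small sketch, assuming the matrix from the video with columns [2, 1] and [-1/2, 1/4]:

```python
import numpy as np

# Columns of F are the new basis vectors expressed in the old basis.
F = np.array([[2.0, -0.5],
              [1.0,  0.25]])

# Acting on COLUMNS of components, F converts new-basis components to
# old-basis components: the new-basis components (1, 0) -- "one unit of
# e~1" -- come out as the old-basis coordinates of e~1 (column 1 of F).
print(F @ np.array([1.0, 0.0]))  # [2. 1.]
print(F @ np.array([0.0, 1.0]))  # [-0.5  0.25]

# Acting on the ROW of basis vectors, the same F converts the old basis
# to the new one -- so both descriptions in the thread are consistent.
```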

  • @sdsa007
    @sdsa007 1 year ago

    Awesome! I get that they are inverses of each other... it seems like it would be easy to directly derive one from the other, just by reversing the left-right vertical and flipping the sign on the right-left vertical, but I'm not sure how to prove that.

    • @eigenchris
      @eigenchris  1 year ago +2

      If you google "2x2 matrix inverse formula", you'll find a formula for how to convert a 2x2 matrix into its inverse.
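That formula is short enough to spell out. A minimal sketch (the numeric values are the video's forward matrix; `inv2` is a hypothetical helper name):

```python
import numpy as np

def inv2(m):
    """Invert a 2x2 matrix [[a, b], [c, d]] via the standard formula:
    inverse = 1/(ad - bc) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = m
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return np.array([[d, -b], [-c, a]]) / det

F = np.array([[2.0, -0.5],
              [1.0,  0.25]])
B = inv2(F)  # the backward transformation
print(B)                              # [[ 0.25  0.5 ]
                                      #  [-1.    2.  ]]
print(np.allclose(F @ B, np.eye(2)))  # True
```

Note that for this particular F the determinant is exactly 1, so the inverse is just the swap-and-negate pattern with no rescaling.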

  • @abhisheksuretia
    @abhisheksuretia 8 months ago

    Sir, can you recommend a reference book for tensors?

  • @Vickipirate12
    @Vickipirate12 2 months ago

    How did you change the summation signs at 8:47?
    Is there some property that allows it?
    IDK

  • @peronianguy
    @peronianguy 2 months ago

    Great video! Two questions/comments from a real beginner: in the linear algebra video series by 3b1b, the vector was to the right and the transformation to the left, echoing f(x). In matrix multiplication, it means that we first apply the transformation to the right and then the transformation to the left. Why is the order different here?
    The second comment is that although the building of the new basis vectors out of the unit vectors looks visually pretty intuitive, the opposite is not true. In fact, it is not clear at all whether you're applying the same "visual" procedure and why you get, for example, a -1.

    • @eigenchris
      @eigenchris  2 months ago +1

      The way 3b1b does it involves the vector components (I show this quickly near the end of the video). In this video I'm dealing with basis vectors, which must be written as rows if you want the multiplication rules to work out properly.
      This style of writing basis vectors in a row is pretty unique to me. I haven't seen it anywhere else on youtube or textbooks.
      As for figuring out which vectors belong in the linear combination, I agree it can involve some trial and error.

    • @peronianguy
      @peronianguy 2 months ago

      @@eigenchris I see! Thank you very much for your answer :)

  • @kilianwilhelm3184
    @kilianwilhelm3184 4 months ago

    Why do we notate basis vectors as row vectors? Is it because basis vectors transform like covectors? Would you then notate basis covectors as column vectors?

  • @ilikegeorgiabutiveonlybeen6705

    thanks

  • @wjrasmussen666
    @wjrasmussen666 1 year ago

    is there going to be a playlist for this?

    • @eigenchris
      @eigenchris  1 year ago

      There already is a playlist from 5 years ago. Just search "Tensors for Beginners" in the search bar and it should pop up.

  • @anangelsdiaries
    @anangelsdiaries 3 months ago

    Is there a specific reason why we write the vectors as row vectors and multiply from the left instead of column vector multiplied on the right?

    • @eigenchris
      @eigenchris  3 months ago +2

      It's arbitrary. But almost everyone writes vector components as columns, so I stick to that convention.

  • @viaprenestina3894
    @viaprenestina3894 6 months ago

    shouldn't the backward transformation be the inverse of the forward one?

  • @sethapex9670
    @sethapex9670 1 year ago

    Is the fact that tensors are coordinate system invariant the reason they are used in relativity, since then it would not matter what frame of reference we are operating in?

    • @eigenchris
      @eigenchris  1 year ago +1

      Yes. All important equations in relativity should be written with tensors, so that they are the same for all reference frames.

  • @Benjatastic
    @Benjatastic 1 year ago +1

    Is there a deep reason this video series multiplies vectors from the left like xA instead of from the right? Or is it just the convention you felt was pedagogically superior?

    • @Benjatastic
      @Benjatastic 1 year ago +1

      To elaborate on what I mean by a "deep reason," are the vectors rows because they represent linear functionals or come from a dual space?

  • @JL-jc5fj
    @JL-jc5fj 4 months ago

    Sir,how do we know that we need 1/4 of e1 tilde to construct e1

    • @eigenchris
      @eigenchris  4 months ago

      It's just from looking at the picture and doing the measurement.

  • @danieleba.9924
    @danieleba.9924 10 months ago

    If I understand correctly, the backward matrix is equal to the inverse of the forward matrix? (I do not speak English so well; if I made a mistake, I'm sorry 😅)

    • @eigenchris
      @eigenchris  10 months ago +1

      Yes, that's right.

  • @pratik9056
    @pratik9056 1 month ago

    But e_i = \sum_i e_i? Then the summation here only represents the values of i taken into consideration?

  • @maryamhasan2618
    @maryamhasan2618 11 months ago

    Please clear up one confusion for me: here, "inverse" means that when we go from the old basis to the new basis, we have to take the inverse of the new basis. Am I correct?

    • @eigenchris
      @eigenchris  11 months ago

      Yes. If the F matrix takes us from the old basis to the new basis, then F-inverse takes us from the new basis to the old basis.

  • @sampson4844
    @sampson4844 1 year ago +1

    Is [e1 e2] a covector? Or is it informal to write it this way? Thx

    • @eigenchris
      @eigenchris  1 year ago +2

      It's a pair of basis vectors written as a row. They transform covariantly, but they are not a covector. It's a pretty informal way to write it--I haven't seen anyone else write them like this.

    • @sampson4844
      @sampson4844 1 year ago

      @@eigenchris thx,😃

  • @fredrickcampbell8198
    @fredrickcampbell8198 3 days ago

    3:10
    Using the forward matrix:
    [1, 0][2, -1/2; 1, 1/4] = [2 1]
    Since the forward matrix is supposed to transform the basis vectors from the one without a ~ to the one with a ~, isn't this actually the backwards matrix since it transforms the basis vectors from the one with a ~ to the one without a ~?

  • @yoavboaz1078
    @yoavboaz1078 2 months ago

    What was the mistake?

  • @thevegg3275
    @thevegg3275 8 months ago

    At 6:50 you say how to determine the coefficients for a summation.
    I do not understand your choice of linear-term coefficients to express e1 tilde and e2 tilde.
    You have a myriad of terms to select the components from, but you pick F12 and Fn1. I do not understand why those two. Thanks

    • @eigenchris
      @eigenchris  8 months ago

      I was just using those two as an example to explain what the subscripts mean. The 1st F subscript is attached to an "e" basis vector on the right of the = sign, and the 2nd F subscript is attached to an "e~" basis vector on the right of the = sign.

  • @thevegg3275
    @thevegg3275 8 months ago

    As far as vector spaces go, by the definition it seems like all vectors in the entire universe are in the same vector space, since all vectors have the same (V, S, +, ·) collection. Is that correct? Based on that, it seems vector spaces never have a vector outside of the one and only vector space.

    • @eigenchris
      @eigenchris  8 months ago +2

      The main thing that differentiates one vector space from another is the dimension (how many independent directions you can define).

    • @thevegg3275
      @thevegg3275 8 months ago

      @@eigenchris Chris, I will pay you if you can help me go from the graphical representation of contravariant and covariant components to how they get attached to a tensor. It will probably have to be done through a phone call. Let me know if you're willing to do this and I will somehow give you my contact information. This is super important to me.

  • @StefanoBusnelliGuru
    @StefanoBusnelliGuru 1 year ago

    Writing vectors as row vectors means that those are not vectors but 1-forms?

    • @eigenchris
      @eigenchris  1 year ago +3

      1-forms have components written as rows and basis-covectors written as columns.
      The fact that basis vectors are written as a row just means that the basis vectors are covariant.

  • @dhruvsharma8430
    @dhruvsharma8430 10 months ago

    Bro, at 8:48 how did you change the position of the sigmas without checking the limits of the variables?

    • @eigenchris
      @eigenchris  10 months ago

      I don't understand the question. Can you explain what you mean?

    • @roadtogod6556
      @roadtogod6556 9 months ago

      Doesn't matter; these are finite sums.

    • @dksmiffs
      @dksmiffs 4 months ago

      @dhruvsharma8430, the swapped sigmas at 8:48 also weren't intuitive to me, so I took the time to expand the sums written both ways. I concluded that associativity of addition allows this swap; @eigenchris, please correct me if I'm wrong.
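That conclusion is right: a finite double sum adds up exactly the same set of terms in either order, just grouped differently (commutativity and associativity of addition). A quick numeric check with arbitrary values:

```python
# sum_i sum_j a[i][j] == sum_j sum_i a[i][j] for any finite array,
# because both orders add the same finite set of terms.
a = [[1, 2, 3],
     [4, 5, 6]]

s_ij = sum(sum(a[i][j] for j in range(3)) for i in range(2))  # sum_i sum_j
s_ji = sum(sum(a[i][j] for i in range(2)) for j in range(3))  # sum_j sum_i
print(s_ij, s_ji)  # 21 21
```

The same argument covers every index-swap in the video, since all the sums there run over a fixed finite number of dimensions. (For infinite sums the swap needs extra conditions, but that never arises here.)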

  • @thevegg3275
    @thevegg3275 8 months ago

    Is there something fundamental about the blue old basis? It seems like they are unit basis vectors.
    For the sake of argument, what if the old basis was the red one and you wanted to transform to the new blue basis... then you'd still need to use the forward transformation.
    It almost seems like you can always just use the forward transformation, as long as you switch which one is old and which one is new.

    • @eigenchris
      @eigenchris  8 months ago +2

      You're right: the one you label "old" and the one you label "new" is arbitrary. I just needed to call them something.

    • @thevegg3275
      @thevegg3275 8 months ago

      @@eigenchris Thank you, I've been waiting a long time to ask this question.
      I cannot seem to make a connection between a graphical representation of contravariant components and covariant components, and how they relate to tensors as indices. If I have a vector with contravariant components of five and three, and this can be represented as covariant components of, say, 13 and seven, how do these numbers actually populate anything in the tensor? I'm just trying to understand the connection between the graphical representation of contravariant and covariant components and tensors. Thanks for any help you can give.

  • @Oylesinebiri58
    @Oylesinebiri58 1 year ago

    👍

  • @edwardlee7898
    @edwardlee7898 7 months ago

    Fkj? Do you mean Fki ?

  • @eqwerewrqwerqre
    @eqwerewrqwerqre 1 year ago +1

    If you made a text book I would buy 10

  • @ShadowZZZ
    @ShadowZZZ 1 year ago

    well that's unexpected to see

    • @eigenchris
      @eigenchris  1 year ago +4

      I still get people in the comments confused about it, so I just had to bite the bullet and upload a new one, even if it's an old video.

    • @abstractnonsense3253
      @abstractnonsense3253 1 year ago

      @@eigenchris Thank you for doing that

  • @DavidSartor0
    @DavidSartor0 1 year ago

    The audio sounds worse. People usually get better audio, not worse, so I'm probably doing something wrong.

    • @eigenchris
      @eigenchris  1 year ago

      I figured the audio sounded much better. What's wrong with it?

    • @DavidSartor0
      @DavidSartor0 1 year ago

      @@eigenchris Your voice sounds deeper. Probably it just deepened in real life.

  • @b43xoit
    @b43xoit 1 year ago

    Leopold Kronecker, German.

  • @thesigmaenigma9102
    @thesigmaenigma9102 1 year ago

    Tensors are just objects that behave like tensors

  • @mohsinshah6857
    @mohsinshah6857 1 year ago

    Sir, why are you not regular in uploading the videos?

    • @linuxp00
      @linuxp00 1 year ago

      These topics are not easy, man. Chris was learning while teaching. In fact, he has already finished this course; now this is just a correction.

    • @jonpritzker3314
      @jonpritzker3314 1 year ago

      I like that format. Sir, why u not explain me everything pls?

    • @francisherman8982
      @francisherman8982 1 year ago

      You're assuming he's got a queue of videos finished and ready to post. More likely he posts as he finishes them, and makes them as time allows. I don't know if he's monetized these videos at all, but even if he has, I doubt it's enough to quit his day job. A bit less entitlement, a bit more appreciation!

  • @wrog268
    @wrog268 11 months ago

    Fji

  • @edwardlee7898
    @edwardlee7898 7 months ago

    Sorry Fkj is right

  • @seriktabussov5892
    @seriktabussov5892 4 months ago

    It's more like bragging, not teaching.

  • @angeldude101
    @angeldude101 1 year ago

    What do you mean by "summarizes ... nice and simply" when that just looks like an ordinary matrix multiplication? It doesn't seem to be saying anything that isn't simpler than ẽᵢ = eⱼF. (The fact that the shorter version is completely representable in a YouTube comment just hurts the more verbose version more.)
    Similarly, δ is a lot harder to type than I, is introduced much later, but seems to mean the exact same thing. Iv = v is one of the first things that you learn about matrices, alongside MM¯¹ = M¯¹M = I. I'm just failing to see the reasoning for using δ instead of just the identity matrix, or using explicit summation over the matrix (and probably later tensor) product. (Actually, I'm pretty sure the tensor product isn't a direct extension of the matrix product like how the matrix product is an extension of the dot product, but that just begs the question of why there isn't such an extension to begin with.)
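On the δ-versus-I point raised above: written out in components, the Kronecker delta is exactly the identity matrix, which a short check confirms (the index notation just generalizes better once objects have more than two indices):

```python
import numpy as np

# Kronecker delta: delta_ij = 1 if i == j else 0 -- i.e. the entries
# of the identity matrix.
n = 3
delta = np.array([[1.0 if i == j else 0.0 for j in range(n)]
                  for i in range(n)])
print(np.array_equal(delta, np.eye(n)))  # True

# Iv = v: contracting delta_ij with components v_j returns v_i.
v = np.array([2.0, -0.5, 1.0])
print(delta @ v)  # [ 2.  -0.5  1. ]
```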