Tensors for Beginners 14: Tensors are general vector/covector combinations

  • Published 23 Feb 2018

COMMENTS • 95

  • @spencermoran2970
    @spencermoran2970 4 years ago +40

    Students taking GR or diff geo for the first time are spoiled to have this! I remember tearing my hair out trying to reason about all of this.

    • @mr.es1857
      @mr.es1857 3 years ago +5

      Is this a great start before taking diff geo?

    • @tasnimulsarwar9189
      @tasnimulsarwar9189 3 years ago +2

      @@mr.es1857 yes, definitely for sure.

    • @korwi7373
      @korwi7373 3 years ago

      TRUESSS

    • @chenlecong9938
      @chenlecong9938 1 year ago +1

      especially in undergrad… even though most undergrad programs offer GR, no one really understands the intuition behind it.

  • @rcsz229
    @rcsz229 4 years ago +25

    Wow! I was taught this years ago and it never really sunk in because I never used it. I suddenly discovered that I needed tensor algebra about 40 years later. This is really a 20 hour course as one has to work through each step but it is really clear and informative. Excellent

  • @klong4128
    @klong4128 3 years ago +12

    Very good introduction to tensors using matrices (a non-conventional/traditional way). When I watched video series 1 to 11, nothing was new; it was just lower-secondary-school matrix material (so why must students wait until a Master's/PhD degree to learn tensors?). As the series continued past videos 12, 13, etc., I began to understand why some genius primary/secondary school pupils are able to jump classes to university level: they were exposed early to advanced knowledge until their minds were enlightened one day. Your way of teaching tensors is slow and steady. It is absolutely different from most university professors who jump straight into contravariant/covariant components and a bunch of Einstein notation, leaving confused beginners/laymen seeing stars. Too many abstract symbols until many students get scared and give up!! Good job and keep it up!!!

  • @jamescook5617
    @jamescook5617 4 years ago +14

    When you say tensors are formed by combining vectors and covectors using tensor products it is important to understand that "combined" is meant to include linear combinations of tensor products. I think this was made clear in an earlier video, and perhaps even here, I could have missed it. Anyway, as usual, nice work.

  • @guglielmofratticioli7773
    @guglielmofratticioli7773 3 years ago +2

    Instant knowledge has never been so fast

  • @rcsz229
    @rcsz229 4 years ago +5

    Looking at some of the comments, I hope that your next series will make manifolds clearer. I have always assumed that they are a local linearisation of a multi-variable function, which allows local solution of PDEs. I think it's a difficult concept, so I'm hoping that when I "transition" to the tensor calculus series, things will be clearer.

    • @eigenchris
      @eigenchris  4 years ago +5

      Unfortunately I don't cover manifolds. I don't understand them that well and the definition seems very heavy and not needed for the basics of general relativity (which is why I made these videos). I just use the hand-wavey definition of "curved space that looks flat if you zoom in really close".

  • @user-hh5te1fr4w
    @user-hh5te1fr4w 4 years ago +8

    Something is wrong at 6:32. It should be Q^1_22 in the upper right corner.

    • @bahtree2385
      @bahtree2385 4 months ago +1

      You are right, otherwise it breaks the pattern.

  • @JgM-ie5jy
    @JgM-ie5jy 5 years ago +3

    I really love your clear explanations. One thing missing : the motivation for these two new tensors. The letters D and Q seem to suggest something specific, as opposed to say something like X and Y.

    • @eigenchris
      @eigenchris  5 years ago +5

      D and Q were just random letters, honestly. If you want to see "real" higher-order tensors, you can look at the Riemann Curvature Tensor, which I believe is a (1,3) tensor. The Torsion Tensor is another example, but I don't see it used as much. These pop up in geometry and General Relativity.

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago +1

    I will watch the last few lectures over and work harder to understand. It got hard for me around 3:40. :-)

  • @andreutormos7210
    @andreutormos7210 3 years ago +4

    How many indices you want?
    Algebra: yes
    Jokes aside, thank you for the videos, they are really helpful

  • @Adityarm.08
    @Adityarm.08 11 months ago

    Thank you!

  • @dennisbrown5313
    @dennisbrown5313 5 years ago

    A little fast; I assume the Kronecker delta function converted the indices around the 1-to-2-minute mark

  • @syedzaheerabbas4691
    @syedzaheerabbas4691 6 years ago +3

    Thanks for your nice explanations.

    • @eigenchris
      @eigenchris  6 years ago +2

      You're welcome. I was really worried about this video in particular. I was feeling I didn't do a great job. Glad you liked it.

    • @syedzaheerabbas4691
      @syedzaheerabbas4691 6 years ago

      Actually I am using tensors as a tool for information geometry, and your videos revealed the secrets to me.
      Can you share your email please?

    • @syedzaheerabbas4691
      @syedzaheerabbas4691 6 years ago

      thanks

    • @TheBigBangggggg
      @TheBigBangggggg 6 years ago

      I love the honest comments you make during the process of making these videos.

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago +3

    at 0:57, when I see three "e's" together, what is that in terms of the standard tensor product notation? Would that use two circle-mult symbols?

    • @dustloopspace
      @dustloopspace 4 years ago +1

      For the tensor Q made up of 1 vector and 2 covectors, textbooks would normally write it out in full as Q = Q^i_jk e_i ⊗ ϵ^j ⊗ ϵ^k. Either that or they would omit the basis altogether and simply write the components Q^i_jk.
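
The index pattern in that expansion can be sketched numerically. A minimal pure-Python sketch (values and the helper name `outer3` are made up for illustration) of forming the components Q^i_jk of a pure tensor product v ⊗ a ⊗ b:

```python
# Components Q^i_jk = v^i * a_j * b_k of the tensor product v ⊗ a ⊗ b,
# where v plays the role of a vector and a, b play the role of covectors.
def outer3(v, a, b):
    return [[[v[i] * a[j] * b[k] for k in range(len(b))]
             for j in range(len(a))]
            for i in range(len(v))]

Q = outer3([1, 2], [3, 4], [5, 6])
# e.g. Q[0][1][0] = 1 * 4 * 5 = 20
```

Note that a general (1,2) tensor is a linear combination of such pure products, so not every Q arises from a single outer product like this.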

  • @vibinpain
    @vibinpain 2 years ago +2

    6:27 I think that in the last array the first element should be Q^1_22

  • @admiralhyperspace0015
    @admiralhyperspace0015 2 years ago +1

    At 6:13 I saw it as a row of columns of columns:
    I saw one row in which there were two columns, and two columns in each of those columns.
    But you said "row of rows of columns", which I think isn't correct. You get rows from covectors, because they are rows with many columns, and columns from vectors.
    I am still confused but think I am correct. Anyone?

  • @francescocannistra7915
    @francescocannistra7915 3 years ago +1

    In the previous videos you explain how to interpret vector-covector and covector-covector pairs, thus giving general insight into how to interpret tensors made up of a combination of 1 vector and multiple covectors (as a multi-linear map) and of a combination of just multiple covectors (as a multi-linear form).
    However, you don't explain what vector-vector pairs mean, so it isn't clear how to interpret a general tensor made up of a combination of more than one vector. For example, in the video it is clear how to interpret the general tensor Q (as a bilinear map from VxV -> V) while it's not completely clear how to interpret the tensor D.
    Anyway, your videos are great!

    • @eigenchris
      @eigenchris  3 years ago +1

      Usually covectors can be thought of as "actions"/"functions", and vectors are the "things" that covectors act on. D, being a tensor product of 2 vectors, is a "thing" that can be acted on by any function that accepts 2 vectors. For example, since Q is a bilinear map from V x V -> V, you could feed D into Q to get a vector from V, since D is a vector-vector pair.
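
The contraction eigenchris describes can be sketched in components (a hypothetical numeric example, not from the video): feeding D^jk into Q^i_jk sums over both lower index pairs and leaves a single upper index, i.e. a vector.

```python
# r^i = sum over j,k of Q^i_jk * D^jk : a (1,2) tensor eating a (2,0) tensor.
def feed(Q, D):
    return [sum(Q[i][j][k] * D[j][k]
                for j in range(len(D))
                for k in range(len(D[0])))
            for i in range(len(Q))]

Q = [[[1, 0], [0, 0]],
     [[0, 0], [0, 1]]]          # components Q^i_jk (made-up values)
D = [[2, 3], [4, 5]]            # components D^jk (made-up values)
r = feed(Q, D)                  # r == [2, 5], a plain vector in V
```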

    • @francescocannistra7915
      @francescocannistra7915 3 years ago +1

      Not sure this makes complete sense. Suppose a simplified Q without vectors in the tensor product. Then Q is a (0,2) tensor, i.e. a bilinear form with domain VxV: it acts on pairs of vectors and returns a scalar. D is definitely a richer object than a pair of vectors: it is not an element of VxV but an element of the tensor product of V with itself (in some sense it is an N-vector).
      I am afraid there are no insightful shortcuts to make sense of (n,0) tensors as abstract entities besides their characterization as linear functions that act on the dual space of V. Of course, this comes at the expense of stealing some clarity from the direct interpretation of (1,0) tensors as vectors, because they must then be seen as linear functions that act on covectors.

  • @warrenchu6319
    @warrenchu6319 2 years ago

    So the point of this video is: tensors expressed in Einstein notation are unambiguous, whereas expressing them in the abstract or as arrays is ambiguous. Is that right?

  • @jasonbroadway8027
    @jasonbroadway8027 3 years ago +1

    Nevermind, I see the error in my logic. Thanks for the videos.

  • @prateekgupta3936
    @prateekgupta3936 6 years ago +1

    Great explanation! Will it be possible to get slides for future reference?

    • @eigenchris
      @eigenchris  6 years ago +2

      I can look into uploading the slides for this video on the weekend. If you want the slides for every video, I can do that too, but it will take a while since some of them have mistakes I need to fix.

    • @prateekgupta3936
      @prateekgupta3936 6 years ago +1

      Thanks! I think the summary slides will be very helpful. Those are some neatly explained concepts.

    • @eigenchris
      @eigenchris  6 years ago +4

      You can download them from here: github.com/eigenchris/MathNotes
      Click "Clone or Download" and click "Download as ZIP".
      Let me know if you run into any problems.

  • @mr.es1857
    @mr.es1857 4 years ago +2

    At 2:12, in the "Q" tensor, how do we know that we can send "B" up in front in the transformation rule from old to new? I mean
    Qnew = BQFF, but how do we know that "B" should move to the front?
    Thanks for all of your work!
    Also, at 4:38, how can we distribute the e's in whichever way we want?

    • @IlshatAliev
      @IlshatAliev 4 years ago +1

      >> Qnew = BQFF, but how do we know that "B" should move to the front?
      My understanding is that if you look at it as a sum of products, the order should not matter (taking into account that upper/lower indices correspond to row/column indices).
      In terms of matrix multiplication we want to multiply row x column, i.e. iterating over a column index (lower index) and a row index (upper index): B^i_a * Q^a_bc for fixed i, b and c.
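
IlshatAliev's point, that each term is just a product of numbers so only the index pairing matters, can be sketched with a toy 2x2 example (values and the helper name `transform` are made up, not from the video):

```python
# Q_new^i_jk = B^i_a * F^b_j * F^c_k * Q^a_bc, summed over a, b, c.
# Scalar multiplication commutes, so writing B "in front" is purely cosmetic;
# what matters is which indices are paired up.
def transform(B, F, Q):
    n = len(Q)
    return [[[sum(B[i][a] * F[b][j] * F[c][k] * Q[a][b][c]
                  for a in range(n) for b in range(n) for c in range(n))
              for k in range(n)]
             for j in range(n)]
            for i in range(n)]

I2 = [[1, 0], [0, 1]]
Q = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
# With identity forward/backward matrices the components come back unchanged:
# transform(I2, I2, Q) == Q
```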

  • @mahdiqaryan6962
    @mahdiqaryan6962 5 years ago +5

    First, thank you so much for your great effort in preparing this awesome series on tensors.
    I had a question as well. In defining the two new tensors at 1:09, you introduced the D tensor, made up of two vectors, as a (2,0) tensor, and Q, made up of one vector and two covectors, as a (1,2) tensor. I think the order of the numbers in these two cases is reversed with respect to what you explained before in Video 9, at 13:26. Actually, I was thinking, for example, of the D tensor, which is built from two basis vectors obeying covariant transformation rules, and as I understood it the second number stands for covariants. Please let me know if I am right or wrong.

    • @azgnmaymun2500
      @azgnmaymun2500 4 years ago

      Wrong. Regarding what you just said, implying (2,0) should be (0,2) and (1,2) should be (2,1): remember that (0,2) tensors are basically bilinear forms by definition

    • @user-kn6eq1cf5s
      @user-kn6eq1cf5s 6 months ago

      I was confused myself. This is how I understood it after some searching and thinking.
      Conclusion first: the notation at 1:09 ("Q is a (1,2) tensor") is correct.
      1. First it is helpful to clarify what the indices mean.
      en.m.wikipedia.org/wiki/Tensor#Examples
      "This table shows important examples of tensors on vector spaces and tensor fields on manifolds. The tensors are classified according to their type (n, m), where n is the number of contravariant indices, m is the number of covariant indices, and n + m gives the total order of the tensor."
      2. This means that the (n,m) notation, which is (1,2) in the case of Q, refers to the number of indices of Q^a_bc - NOT the number of covectors and vectors.
      This was the root of my confusion, since if we count the latter, there are two covectors (epsilons), which are contravariant, and one vector (e), which is covariant, so it seems it should be a (2,1) tensor.
      3. 2:10 makes this clear. We can see there that the change-of-basis transformation rules for the Q indices are given by two forward matrices (thus two covariant transformations) and one backward matrix (thus one contravariant transformation), giving us a (1,2) tensor.
      We can see here why the (n,m) for Q is the reverse of the number of vectors and covectors used.

  • @kimchi_taco
    @kimchi_taco 9 months ago

    The covector basis and vector components have similar notation, which is confusing. I think the covector basis needs a hat, tilde, or arrow decoration.

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago

    at time 0:48 if using the standard notation, should the tensor product have brackets around it? The two epsilons?

    • @eigenchris
      @eigenchris  6 years ago

      Usually it would be written with brackets, yes.

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 років тому

    At time 0:26 you show L = L e ϵ. Is this a tensor product? That is, could the e and the epsilon next to each other be written with the circle-mult symbol? I am confused. Is this correct?

    • @eigenchris
      @eigenchris  6 years ago

      Yes. You could write it with the circle-times symbol. It is a tensor product.

  • @GeneralAblon
    @GeneralAblon 6 years ago

    Great videos so far. Could you maybe at some point discuss manifolds? I am following a course on tensors and I am stuck.

    • @eigenchris
      @eigenchris  6 years ago +1

      I will be starting videos on tensor calculus later in March. I still need to complete videos 15, 16, 17.
      What particular part are you stuck on? Is it something I could help with through text?

    • @GeneralAblon
      @GeneralAblon 6 years ago

      eigenchris I mostly just need to understand precisely what a manifold is. I do not understand what it is. Still, I have the exam on April 12th, so there should still be time for me to grasp the concept; maybe you will have made a video on it by then. Where did you learn about tensors? Some of the notation differs from what my teacher uses.

    • @eigenchris
      @eigenchris  6 years ago +7

      A manifold is basically an N-dimensional "surface" where, if you "zoom in" far enough, it looks flat (that is, it locally looks like R^N). The surface of the earth is a 2D manifold. To us tiny people walking around, the earth looks flat, like R^2.
      Differential geometry started with Gauss. He did not study abstract manifolds... instead he studied 2D surfaces like spheres, cylinders, and the torus. These 2D surfaces are simple enough to describe with one function. We can also make more complicated 2D surfaces by "stitching" together simple surfaces, such as stitching the open end of a half-torus onto the open end of a cylinder. This is basically the inspiration for manifold "charts", if you have heard of those.
      When we move to 3D or 4D, manifolds tend to be described more abstractly rather than using explicit functions, but they are still ultimately "curved surfaces" that locally look like R^3 and R^4.
      I mostly learned about tensors from random internet videos and PDF files... some notation I made up myself because it helped me understand, but maybe it is confusing you. Which parts are you not understanding?

    • @GeneralAblon
      @GeneralAblon 6 years ago

      Okay, seems reasonable enough.
      I do not find your notation confusing. Well, maybe a bit at first, but now I understand. From your videos, I believe I understand everything.
      I also find it really impressive that you learned all this from videos and PDFs.

    • @jonasdaverio9369
      @jonasdaverio9369 5 years ago

      What you said about manifolds is only true about differentiable manifolds, isn't it?

  • @davidhand9721
    @davidhand9721 3 years ago +2

    I think the common notation for tensors makes them 1000x more difficult. With basis included I get it just fine. Dropping the basis confuses the crap out of me.

    • @eigenchris
      @eigenchris  3 years ago

      Yeah, I think you'll find this is even more true if you continue to study tensor calculus. Many concepts (especially the covariant derivative) seem incomprehensible if you leave out the basis, but it becomes common sense if you keep the basis in.

  • @aanandbadwaik7698
    @aanandbadwaik7698 6 years ago +1

    First of all, thanks for this amazing video series. In this video, is D a linear-map type of tensor? Can linear maps also be vector-vector pairs or covector-covector pairs?

    • @eigenchris
      @eigenchris  6 years ago

      Yes, it is a linear map. You just need to keep in mind that it takes a covector and produces a vector, so it's a map from V* -> V, as opposed to V -> V. Covector-covector pairs would be linear maps V -> V*.

    • @aanandbadwaik7698
      @aanandbadwaik7698 6 years ago

      Got it !! Thanks.

    • @IlanTarabula
      @IlanTarabula 4 years ago

      @@eigenchris I don't understand why. Could you please re-explain? In the previous videos, you said that linear maps are vector-covector pairs. But covector-covector pairs are bilinear forms? Am I wrong?!?

    • @adamsperry2248
      @adamsperry2248 1 year ago

      @@IlanTarabula covector-covector pairs are both bilinear forms (bilinear maps from V x V -> R) and linear maps from V -> V*. It’s just a different interpretation of the same tensor. For the bilinear map, you can think of each covector eating each vector input and giving a real number times a real number (which is a real number), but if you give it just one vector, only one covector eats the vector and you’re left with a covector (or more specifically, a linear combination of the dual basis covectors).
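
adamsperry2248's two interpretations can be put side by side in a toy pure-product sketch (values and helper names are made up): the same (0,2) components T_ij act either on two vectors (bilinear form) or on one vector (a map V -> V*).

```python
a, b = [1, 2], [3, 4]
T = [[a[i] * b[j] for j in range(2)] for i in range(2)]   # T_ij = a_i * b_j

def as_form(T, v, w):   # T(v, w) = sum T_ij v^i w^j, a real number
    return sum(T[i][j] * v[i] * w[j] for i in range(2) for j in range(2))

def as_map(T, v):       # T(v, .) has components sum_i T_ij v^i, a covector
    return [sum(T[i][j] * v[i] for i in range(2)) for j in range(2)]

v, w = [1, 0], [0, 1]
# Feeding one vector leaves a covector; feeding that covector's
# components the second vector recovers the bilinear-form value:
# as_form(T, v, w) == sum(as_map(T, v)[j] * w[j] for j in range(2))
```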

    • @IlanTarabula
      @IlanTarabula 1 year ago

      @@adamsperry2248 Thank you sir, got it !

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago

    I am sorry to trouble you, Chris. I have spent a great deal of time on these videos and I want to understand... but again... at time 1:50 the transformation rule derivation ends with two tilde e's next to each other. This is your tensor product notation?

    • @eigenchris
      @eigenchris  6 years ago +1

      Yes. I'm sorry you find my notation confusing, but anytime two e's or e-tildes are next to each other, there is an implied tensor product.

  • @admiralhyperspace0015
    @admiralhyperspace0015 2 years ago

    Sometimes you pull the coefficients out and put them on the left side, and sometimes you keep them on the right side. Is there a reason for it that I am too dumb to realize? Or is it just a symmetry you like, backwards on one side and forwards on the other?

    • @Stobber1981
      @Stobber1981 2 years ago

      He mentioned it in an earlier video, but it's easy to have missed or forgotten. Convention says that when a lower and an upper index match (and should be summed after multiplication), the tensor with the lower index is written first.

  • @drlangattx3dotnet
    @drlangattx3dotnet 6 years ago +1

    "e" epsilon epsilon three basis vectors . Is this some kind of three term tensor product? then you have an epsilon "acting on" an "e" to get a Kronecker Delta. But is the epsilon acting on e ...is that a tensor product?

    • @eigenchris
      @eigenchris  6 years ago

      It's a tensor product. If an epsilon acts on an e, I'll use the function notation with brackets, "epsilon(e)".

    • @drlangattx3dotnet
      @drlangattx3dotnet 6 years ago +1

      Thanks for not giving up on me. I will keep working. BTW, it has been 40 years since I had a math class. The highest I got was calculus, elementary linear algebra, and one course on DEs. Then I dropped out for various reasons.

  • @carlosantoniogaleanorios4580
    @carlosantoniogaleanorios4580 4 years ago

    Is there a missing \vec{e_i} on the sixth line of your left column at 3:06, or am I lost?

    • @luisaim27
      @luisaim27 4 years ago

      I have the same question, and I think that you are right.

    • @liuhenryc
      @liuhenryc 4 years ago

      You're strictly right, but often the basis is left off and things are written with only the components

  • @TheBigBangggggg
    @TheBigBangggggg 6 years ago

    Is it just a habit, or is there a reason why you place the forward-transformation matrices on the right of a tensor component and the backward ones on the left?

    • @eigenchris
      @eigenchris  6 years ago

      There's no particular mathematical reason for doing this--the order of terms in an Einstein summation doesn't matter. I was just trying to keep things clean so people can clearly see the number of F's and B's. In other videos I may have written things out differently.

    • @TheBigBangggggg
      @TheBigBangggggg 6 years ago

      Okay, thanks.

  • @nedisawegoyogya
    @nedisawegoyogya 4 years ago +1

    Hate to notice, but at 6:35 the upper-right Q entry should be Q^1_22, right?

    • @eigenchris
      @eigenchris  4 years ago +2

      That's right. Word of advice if you ever make math videos: never copy+paste anything ever.

    • @nedisawegoyogya
      @nedisawegoyogya 4 years ago

      @@eigenchris Your content is very great btw. I had been searching for an explanation of tensors (with understanding GR as motivation), but only found the first definition, which, as you say, is not helpful. And I wasn't anticipating that notation is this important in college math lol. Thank you so much.

  • @papetoast
    @papetoast 4 months ago

    I have watched all the videos before this, but I don't understand this video. I get that the 3D cube notation loses information, but I couldn't wrap my head around the Q(D) thing and why there is more than one way to multiply. Like, if the Einstein notation uniquely determines a multiplication rule, and the other notations don't, then there must be some information lost in the other representations?? But I don't see what information got lost.
    I guess I will read other sources on this.

  • @hieudang1789
    @hieudang1789 3 years ago

    Hey Chris, what about bivectors, or multivectors in general? Wikipedia says that bivectors are tensors of (2,0) type, but that means bivectors are tensor products of vectors or combinations of them. But bivectors by definition are exterior products of vectors, which isn't the same as the tensor product. So maybe Wikipedia is wrong? All I know is that bivectors are tensors because they are independent of the coordinate system. But I'm not sure which type they are. Or maybe they belong to a different vector space.

    • @eigenchris
      @eigenchris  3 years ago

      I think I incorrectly used the word "bivector" in this video. You are correct that "bivector" means the exterior product of 2 vectors, not the tensor product.

  • @chenlecong9938
    @chenlecong9938 1 year ago +1

    4:07, that whole slide over there - I thought the order of the tensor product does matter because it's generally non-commutative?

    • @eigenchris
      @eigenchris  1 year ago +1

      Yes, the order of the tensor product does matter, but it's possible to define the "summation" between tensors in multiple different ways.
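
eigenchris's point, that the same arrays admit multiple "summations", can be illustrated with a toy sketch (values and the helper name `contract` are made up): two equally legal contractions of the same Q^i_jk with the same D^jk give different vectors, and only the index notation distinguishes them.

```python
Q = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]   # components Q^i_jk (made-up)
D = [[1, 0], [2, 1]]                       # components D^jk (made-up)

def contract(Q, D, swap=False):
    # swap=False: r^i = Q^i_jk D^jk     swap=True: r^i = Q^i_jk D^kj
    return [sum(Q[i][j][k] * (D[k][j] if swap else D[j][k])
                for j in range(2) for k in range(2))
            for i in range(2)]

# contract(Q, D)            -> [11, 27]
# contract(Q, D, swap=True) -> [9, 25]   (same arrays, different pairing)
```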

    • @chenlecong9938
      @chenlecong9938 1 year ago

      @@eigenchris You mean, as in, which indices to sum over first? Essentially the order of the sum?

    • @amedeonannini2332
      @amedeonannini2332 1 year ago

      If I have understood correctly, this is why Einstein summation is "better" for higher tensor multiplication than tensor or array notations

  • @juanjoromero2730
    @juanjoromero2730 1 year ago

    At 5:40 the rank-3 tensor could have had a different array shape if the order of the Kronecker product had been different. How do we know the specific order of multiplication of a tensor?

    • @eigenchris
      @eigenchris  1 year ago +1

      I somewhat regret trying to create "array shapes" for any tensors beyond rank 2. I was interested in it at the time of making this video, but it serves no purpose and is ambiguous, as you point out. I'd stick with the summation notation for tensors of rank 3 and above.

    • @juanjoromero2730
      @juanjoromero2730 1 year ago

      @@eigenchris Thank you so much for your answer!!!

  • @danielribastandeitnik9550
    @danielribastandeitnik9550 5 years ago

    Hi, I think you are getting the notation for the type of the tensors wrong. The book I use (and I just confirmed by googling that it's commonplace notation) defines the type of a tensor (r,s) as a C-valued (C is the set of scalars) multilinear function that maps r vectors and s dual vectors (so VxV...xVxV*xV*x...xV*, V r times and V* s times) to C. So in your video, for example at 1:15, you say a tensor Q that clearly takes as input 2 vectors and 1 dual vector is of type (1,2), but it's of type (2,1). You've been inverting the numbers inside the type for some videos.

    • @jonasdaverio9369
      @jonasdaverio9369 5 years ago

      Your definition is wrong. Wikipedia says an (r,s) tensor is a function that maps r covectors and s vectors to a member of the field K you work with.

    • @danielribastandeitnik9550
      @danielribastandeitnik9550 5 years ago

      @@jonasdaverio9369 Hold your horses. I know it's strange, but it makes no sense to say that a definition is wrong, because a definition is not a theorem or anything like that; it's just a convention you're creating. You can only say that a definition is wrong if it contradicts another definition you've made earlier or there's a contradiction inside the definition itself; beyond that, it cannot be wrong just because of what a Wikipedia article says (btw, what a nice reference you chose). It's like the Laplacian: some books define it as nabla squared, others as just a nabla (hate those haha), but you can't say that one is right and another is wrong. So the book I learned tensor algebra from uses the notation I gave above; it defines it like that and uses it like that for the whole book, never contradicting itself. So, Jonas Daverio, no, the definition is not wrong; it's apparently just not common.

    • @danielribastandeitnik9550
      @danielribastandeitnik9550 5 years ago

      OK, I admit that I myself said that I thought the notation, i.e. the definition used in the video, was wrong; now I see that the book I used defined it in reverse to everyone else.

    • @jonasdaverio9369
      @jonasdaverio9369 5 years ago

      @@danielribastandeitnik9550 Ahaha, good to know. I simply said that the convention you stated was not the one used by others.

  • @abnereliberganzahernandez6337
    @abnereliberganzahernandez6337 1 year ago +2

    The only bad thing about your videos is that they are finite and must have an end. So sad. I think I have swapped series on Netflix for series on math.