Great video. Just a nitpick: the purple box at 45:10 is wrong, it contradicts what's at 48:47. The indexes should be the other way around, alpha up, beta down.
That's right, my apologies...
I wanted to plug in the transformation law directly the way we have derived it in the basis and transforms video to make it more clear, so I put the basis vector indices up when they should be down (so vector components would have indices up), then it seems I messed up the rest accordingly!
@@MoreinDepth thanks. Nevertheless it's a great video, I appreciate you making content like this. Please keep at it, not everyone has a talent for explaining things, and you definitely do.
you can reply with an n-fold tensor product over a module. 😅
I really appreciate how you gradually introduced the topic, making it feel like having a companion throughout the journey. Good job!
A brilliant mind who is the one behind this magnificent video - someone I am proud to call my friend from our undergraduate courses on group theory and tensor algebra - recently asked me to critique his video introducing the concept of tensors. I’ve watched it multiple times and, to be honest, there is very little to criticize. His grasp of the subject is, as expected, flawless. I already knew this before watching the video. Some might suggest that a more physical introduction could precede the linear algebra portion, but anyone in the STEM fields should already be familiar with linear algebra. Well done, my friend. I’m sorry I couldn’t find more to critique, but that’s your fault for creating a video so close to perfection. Keep up the great work!
This is by far the best tensor explanation on YouTube. Other explanations either directly start with the geometrical viewpoint or run through vectors -> matrices -> tensors without any mathematical geometrical motivation. I like that you spend time on the prerequisite covector tooling. You also make it very clear how tensors are a geometric object in their own right (after normalising out the components).
Disagree strongly. Xylyxylyx has a series called "what is a tensor" and it's far more detailed and starts with the tensor basis, i.e. reciprocal spaces, from which all other aspects flow intuitively.
Simple, straightforward, yet not oversimplified. As things should be taught! I'm sharing this video with anyone who needs it!
That means a lot, I'm glad you liked it!
Thank you! This is by far the best motivated explanation I have come across. Really made something click.
The part about covectors was really helpful
Fantastic presentation. I’m self taught and I always learn little tidbits of info that pulls everything together tighter. Looking forward to more videos from you.
36:30 I believe the cross product is not a tensor but a pseudotensor, because it changes sign under an orientation-non-preserving change of coordinates
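To see the commenter's point concretely, here is a small numerical sketch (arbitrary example vectors, numpy assumed) of how the cross product picks up a det(R) factor under an orientation-reversing orthogonal transform, which is exactly the pseudotensor behavior:

```python
import numpy as np

# Arbitrary example vectors, chosen just for illustration.
a = np.array([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.5, 2.0])

# An orientation-reversing orthogonal transform: reflection through the xy-plane.
R = np.diag([1.0, 1.0, -1.0])
assert np.isclose(np.linalg.det(R), -1.0)

# An ordinary vector's components transform as v -> R @ v, but the
# cross product acquires an extra factor of det(R):
#   (Ra) x (Rb) = det(R) * R (a x b)
lhs = np.cross(R @ a, R @ b)
rhs = np.linalg.det(R) * (R @ np.cross(a, b))
assert np.allclose(lhs, rhs)

# So under this reflection the cross product flips sign relative to how a
# true vector would transform -- hence "pseudovector" / "pseudotensor".
assert np.allclose(lhs, -R @ np.cross(a, b))
```

Under rotations (det = +1) the extra factor is invisible, which is why the distinction only shows up for orientation-reversing coordinate changes.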
Wow - thank you!
So glad there are people like you who understand this stuff well enough to combine concepts and provide a high-level summary without the details and rigor that obscure many of the important concepts. Well done! I kept hoping you would work my favorite subjects in there - the Fourier transform and the SVD (singular value decomposition) - but hopefully you can do a video on how tensors, Fourier transforms, and the SVD are related someday? Keep up the great work!
When tensors first "click" in your mind they are the most amazing thing.
I tend to tense up when I’m around a tensor; and well, I think that’s a completely normal reaction.
This video is awesome. Thank you, I'm a complete novice at this and could finally understand what tensors are and how they work thanks to your explanation, thank you!
im going to share this to my friends thanks!
59:23 by construction you can see that vector product C is a covector (lower indices). To map it to scalars you need a *vector* (not a covector). So, cross product is (0,3) tensor with an open slot for a *vector*, not (1,2) with covector slot open.
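The (0,3) picture in this comment can be sketched numerically (arbitrary example vectors, numpy assumed, everything written in an orthonormal basis where upper and lower indices look the same): filling two slots of the Levi-Civita symbol gives the cross product's components, and filling the third slot with a vector gives the scalar triple product.

```python
import numpy as np

# Build the Levi-Civita symbol epsilon_{ijk} in 3D.
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0   # even permutations
    eps[i, k, j] = -1.0  # odd permutations

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])

# Filling two of the three slots leaves the cross product's components:
#   c_k = epsilon_{ijk} a^i b^j   (index k still open)
c = np.einsum('ijk,i,j->k', eps, a, b)
assert np.allclose(c, np.cross(a, b))

# Filling the remaining slot with a third vector yields a scalar,
# the triple product -- the machine eats three vectors in total.
triple = np.einsum('ijk,i,j,k->', eps, a, b, w)
assert np.isclose(triple, np.dot(np.cross(a, b), w))
```

Whether the leftover index is read as "up" or "down" is precisely the (1,2)-vs-(0,3) question being debated; numerically, in an orthonormal basis, the two bookkeepings coincide.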
Great video, probably one of the best on this topic! Very much appreciated :)
Great work! Concise explanation and fantastic presentation. I hope your channel grows (:
Thanks for the clear explanation, some notations also got cleared 👌.
I wish more videos were like yours, amazing teaching thank you very much
I'm a hobbyist watcher of maths videos 🤓 and I've been interested to learn what a tensor is for years. Decades in fact: the word 'tensor' was mysteriously mentioned once, in passing, when I was studying 📚
I'm watching this (22:00 in) and it looks like this video is going to deliver on its title. You've got my sub.
The videos you make are awesome with great explanations! I really appreciate this.
This is a really well made video👍
Love the focus on making stuff not scary
A monad is just a monoid in the category of endofunctors.
I think "identity" is usually used for the neutral element of multiplication, not addition, but perhaps this is not formal :) very educational!
tensors were hocus pocus before. minus the correction that is pinned, this is properly motivated, well defined, and step by step. a lot of tensor presentations are not worth the time or effort because they skip steps and are defined so abstractly that they lose the intuition of the math. the term mapping clicked for me because i can visualize practical applications such as: a geographical map overlaid on traffic, overlaid on satellite; an mri scan of skin, digestive system, nervous system, etc.; financial mapping for clients, creditors, banks, etc. i'm sure there are tons of applications but at least now i have a good idea how tensors work or can be applied
You really do have to eventually swallow the transformation stuff if you want to get the full applicability of tensors.
Very nice video. Thank you for sharing.
A tensor is something that transforms like a tensor.
Excellent video !
The transformation law isn't something mysterious, it's just saying that tensors are "vectors of vectors", i.e. they are vectors, each of whose components are vectors. The linear map stuff follows immediately from this and the idea of invariance. It is important not to start with linear maps, because there are redundancies in the map notation caused by not having indices.
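The invariance this comment appeals to can be checked in a few lines (random example components, numpy assumed): transform each slot of a bilinear form with one factor of the basis-change matrix, transform the vector components the inverse way, and the scalar output is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Components of a bilinear form T(u, v) = u^T T v in some starting basis.
T = rng.standard_normal((3, 3))
u = rng.standard_normal(3)
v = rng.standard_normal(3)

# An invertible change of basis: new basis vectors are the columns of B,
# so vector *components* transform with the inverse of B.
B = rng.standard_normal((3, 3))
Binv = np.linalg.inv(B)

# The (0,2) transformation law, one factor of B per slot:
#   T'_{ab} = B^i_a B^j_b T_{ij}
T_new = np.einsum('ia,jb,ij->ab', B, B, T)

u_new = Binv @ u
v_new = Binv @ v

# Invariance: the scalar T(u, v) does not depend on the basis chosen.
assert np.isclose(u_new @ T_new @ v_new, u @ T @ v)
```

The "vectors of vectors" reading corresponds to transforming each index of the component array separately, which is exactly what the einsum above does.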
I respect your fine and obviously expert efforts to make tensorial objects much more approachable to newcomers. I would merely caution, as a physicist, "speaking" for budding physicists, to not let the vector-eating-machine picture of tensors be the only concept of tensors that students are left with (although I like it a lot, after having been myself partly introduced to general relativity using Misner, Thorne and Wheeler's similar approach in their magnum opus "Gravitation" circa 1976.)
The student should recognize that a tensor - which could, by the way, be just a (0,1) [vector] or (1,0) [one-form] object in your notation - has a physical interpretation in its own right. Your definition of the dot product draws on the larger meaning of the metric tensor as a local description of the shape of the manifold in which an area or distance is calculated, or in which other tensors (like vectors) are "multiplied". Objects like the stress-energy tensor (T_u_v) featured in the Einstein equation, or even the generalized Faraday tensor (F_u_v), have intrinsic meanings in themselves, as local energy densities or EM field intensities, respectively, in some local Minkowski frame.
Thanks for your efforts nevertheless. DKB
Thank you!
you have a great style of teaching and taste in topic. do you plan to build on these topics?
Yes! I'm glad you've enjoyed it, there's much more to come!
beautifully explained, thank you.
finally someone understands that not everyone wants to be a fckng mathematician and reinvent and rediscover already known things, holy fck.
The best introduction to geometric objects called TENSORS.
The Xylyxylyx explanation is still the best because it discusses what a tensor _basis_ is, which is what I was always missing because _everyone else_ just pretends it doesn't exist. The series is "What is a tensor?" on the channel xylyxylyx. Much better and much more detailed.
Thank you for sharing, it looks excellent and it inspires me to see such great content!
In my own defense, of course a series with 40 videos is much better and detailed than my one video :=)!
small error: at 11:58 it should be (alpha dot beta) circle-dot v = alpha circle-dot (beta circle-dot v)
great video
Simple and straightforward, you say! But what is a tensor used for in the real world?
I'd say it depends on the world! :=)
5:12 huh??? the cross product is not a real number. will you explain later??
This is GREAT video
A Tensor is linear over the functions. This is the most bare bone, straight forward definition.
Bravo
Very understandable!
Thank you for this video
Really excellent! Question: any good book about tensors in line with this presentation?
Yes! My references are in the description
@@MoreinDepth Sorry ! I haven't seen them.
Can a tensor flow?
well that's something where my mind goes blank in a few minutes, especially since i don't know what a tensor is at all. the math sounds solid but i have my problems with the handwriting..
Multi-linearity isn't what makes a tensor, a tensor, is it? A simple mapping of a vector by its dual is also a tensor.
Did you perhaps watch the lectures given by prof. Frederic Schuller?
Of course, I've learned about math, and learned even more about ranting about coordinates from him! :=)
The lecture series Gravity and light are linked in the description for anyone looking for more details!
@@MoreinDepth I knew your way of speaking about these seemed familiar immediately haha
When people ask me what a tensor is I say maths.
Classical Physics by Kip Thorne is actually 5 books, which one explicitly is the source?
The full title of the book is "Modern Classical Physics: Optics, Fluids, Plasmas, Elasticity, Relativity, and Statistical Physics"
Yeah, if you're trying to grok tensors without knowing about vectors and vector spaces then you're doing it in the wrong order.
the creator of this video seems confused about the difference between a 'tensor field' (which is where a transformation rule such as that shown on the first slide is relevant) and a 'tensor' (a purely linear algebraic object).
at least, that's what it seemed from the first few minutes. i'm undecided whether i want to give the rest a try, is this addressed later on?
sounds a ... little bit strange... if you mean that the result of the scalar multiplication of two vectors is a rank-0 tensor - that's ok.
But my opinion is that a rank-0 tensor is a value (a scalar) that exists at some point of a space and doesn't depend on the basis we choose.... IMHO you made a little bit of a ...strange definition
This is really rough start.
👍
Here 38:05 NOT CLEAR AT ALL!!
multilinear form - that's all
Your voice, accent are identical with @Let's Talk Religion 🤔
Your sloppiness is a very bad start. Calling a multilinear transformation linear is absurd. It just isn’t true. It makes no sense. It introduces contradictions into mathematics; don’t do it.
Ok, so you're describing a second order covariant tensor, and declaring it a function of two vectors is fair. What then, is a second order contravariant tensor? For that matter, what is a vector? Ultimately you have to come around to the mathematical definitions. Of course, you're already coming to mathematical definitions when you talk about linearity and so on. You're just choosing a different set of mathematical words that turn out to be equivalent to the ones you rejected at the outset.
The fact that you find one set of words more satisfying than the other really just says more about your math familiarity than anything else.
They're not axioms of a vector space, they're the definition of a vector space. We prove a particular structure is a vector space by showing that it satisfies the definition.
Hi dear 🎉❤
I joined just now
Hope to find guides and answers for my questions 😊😊
Few things have I ever hated as much as the Einstein sum convention. So much extra work for students so the teacher can spare a few strokes of the chalk. Everything becomes a mini puzzle you have to decipher in order to make sense of the expression, on top of everything else. I liked the video overall, but the entire topic feels like physics gone wrong; the example with the vector product looks incredibly sketchy: define a map by not plugging an argument in, really?
But the point there is that the new vector (or covector) that you define by leaving a slot open is also a map. It has one slot open into which you can plug in a covector (or vector). So leaving a slot open means you have contracted the filled slots, the result is a smaller piece of that map. This is also done in multivariable calculus, say you have a function f = f(x,y). If you want to only consider the part of the map that takes in x, you would plug in y=0 which "projects" the map onto the f=f(x) axes. So f = f(_ , 0) with a constant y plugged in and the x slot open gives you another map, which is a 2D slice of the more general 3D map.
And that is precisely where the Einstein summation convention comes in handy, it doesn't only save chalk, the index notation makes these maps more clear and concise, albeit it takes some getting used to.
Hey I'd say when Einstein does physics in some way it is rarely ever the wrong way to do it! ;)
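The "open slot" idea from this thread can be sketched in a few lines (hypothetical function f and arbitrary components, numpy assumed): fixing one argument of a two-argument function leaves a map of the remaining argument, and contracting one index of a matrix with a vector leaves a map with one index still free.

```python
import numpy as np

# Curry-style "leaving a slot open" in plain Python:
# fix y = 0 and what remains is a map of x alone.
def f(x, y):
    return x**2 + 3 * x * y

g = lambda x: f(x, 0)   # the slice f(_, 0)
assert g(2) == 4

# The same idea in index notation: contracting M^i_j with a vector v^j
# fills the j slot; the free index i says a vector-valued map remains.
M = np.arange(9.0).reshape(3, 3)
v = np.array([1.0, 0.0, 2.0])

w = np.einsum('ij,j->i', M, v)   # M^i_j v^j -> w^i, the i slot still open
assert np.allclose(w, M @ v)

# Filling the last slot (here with the components of a covector alpha_i,
# in an orthonormal basis) finally produces a scalar.
alpha = np.array([1.0, 1.0, 1.0])
scalar = np.einsum('i,i->', alpha, w)
assert np.isclose(scalar, alpha @ M @ v)
```

This is one reading of why the index notation earns its keep: the repeated indices say exactly which slots are contracted, and the leftover free indices say what kind of map remains.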
Very clear if you know what a tensor is, as indicated by the comments. Else, completely useless.
Not the type of explanation I'm ever able to understand; even knowing what tensors are didn't help lol.
If you want to give a quick intuitive explanation for tensors, don't start with a lecture on linear algebra. Give us the explanation in a concise form. If you haven't heard of linear algebra, you don't click on a video with this title. And if you do, the content goes over your head no matter your introduction.
Thanks for your feedback! That was very much my intention, it's why I've started with examples of tensors instead of going right into the formalism. That being said there is a price to pay to make the video as self-contained as it can be, and it comes in the form of going over the basics!
Haven't you learned it in hs tho? I am currently in 11th grade and I can easily grasp everything
@@MoreinDepth I also think that going through the foundations is the way to go about this. I've watched other videos explaining tensors by starting from the Einstein notation; you actually do get a bit of a notion of how it works, but not as much deep intuition.
I think you're amazing for making everything so concise. To fit it into around 1 hour, you managed to make the concepts so meaningful for such a topic as this one.
Obviously the viewer needs to be already somewhat familiar with some topics; there are a bunch of concepts that need to be well understood before diving in here
You do realize you can skip with the bar under the video right?
Actually the relationship between vectors and covectors was one that I never had really grasped before this, but my primary knowledge of Lin Alg is at least four decades old now, so *I* can't even remember what I once knew. I stumbled into a lot of related areas of mathematics through work I did in CS on types & category theory and this actually connected quite a few dots for me.
Most people are rude, ignorant, nasty morons. That is so true
Another bad tensor video
I think you can do better than this. Just try a little harder 😊
Not very clear presentation.
Your feedback is important as I'm very new at this! I'd appreciate if you would evaluate on that :)
@@MoreinDepth now that's mature. Bravo!
it does not seem clearly exposed which real number corresponds to the tensor application ... there is confusion between matrices and applications.
Aren't the scalar entries in the matrix the result of the tensor application?