This is one of the best videos that I have found on the internet.
????
just what I was looking for. Thanks!
1:52 "Euclidean space, dual spaces are isomorphic." WHAT?
4:20 The cross product of 2 perpendicular vectors produces a new vector which is also perpendicular to the 2 previous vectors. Does that mean vector ω has the same direction as vector L?
Yes, the vectors are collinear, at least for the point particle.
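If you want to check that numerically, here is a minimal sketch (assuming NumPy, with r chosen perpendicular to ω as in the circular-motion case; the numbers are only illustrative):

import numpy as np

m = 2.0
omega = np.array([0.0, 0.0, 3.0])   # angular velocity along z
r = np.array([1.0, 0.0, 0.0])       # position perpendicular to omega

v = np.cross(omega, r)              # v = omega x r
L = m * np.cross(r, v)              # L = m (r x v)

print(L)                            # [0. 0. 6.] = m*|r|^2 * omega
print(np.cross(L, omega))           # [0. 0. 0.] -> L is parallel to omega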
8:43 Why does the off-diagonal term (the integral of xy) measure the distance to the line x = y??? There isn't even a z component here!
Very nice job!
4:20 What's the logic of relating a change in momentum to a constant velocity, instead of either both a changing velocity and mass, or the traditional change in velocity?
1:56 "Euclidean space, dual spaces are isomorphic." WHAT? Repeated this a million times and still don't know what she means.
When using the Einstein summation convention, we usually sum over equal indices when one index is lowered, while the other index is raised.
The lowered index represents a covariant component, and a raised index represents a contravariant component. For convenience, let’s say that the tensor is uniquely defined by a single index, meaning that it can be thought of as a vector. Let’s also say that the vector lives in a vector space (let’s call that vector space V), and can be written as a linear combination of basis vectors with contravariant components (raised index). The covariant (lowered index) components can then be found by finding the corresponding dual vector. The dual vector lives in another vector space, called the dual space of V. The dual space is often denoted as V^*. However, for a Euclidean space V, its dual space V^* will be isomorphic to it, essentially meaning that these two vector spaces are the same, just represented with different basis vectors. When this is the case, there is no real need to distinguish between covariant and contravariant components. Thus, we do not need to have one raised and one lowered index for the Einstein summation convention. Hence, the inner product between two vectors a and b can be written as a • b = a_i*b_i. Hope this helps to clear some confusion.
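If it helps to see that last point numerically, here is a minimal sketch (assuming NumPy; the vectors a and b are just illustrative) showing that summing over the repeated index reproduces the ordinary dot product in Euclidean space:

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# Einstein summation: the repeated index i is summed over, a_i * b_i
inner = np.einsum('i,i->', a, b)

# In Euclidean space this is just the ordinary dot product
print(inner, np.dot(a, b))   # 32.0 32.0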
@mathiasfjsne8854 Name checks out Math-ius. Thank you, fellow traveler of this universe.
When and why would the dual space be different? Why are we using Euclidean space?
Hello! May I ask a question? If I want to transform the moment of inertia tensor from Cartesian to spherical coordinates, how do I do it? Thanks!
Very good
Didn’t need to read out the whole expression 😭 LMAO
Which standard is this topic for??
This is taught in college/university. Or maybe you can do this for JEE Advanced (Indian entrance exam for engineering).
@@absolutedesi5899 Yes, I'm preparing for JEE only, but this is not in our syllabus; we only have scalars and vectors.
I am not a native English speaker. What do off-diagonal elements mean??
The elements inside a square matrix that do not lie on the main diagonal are called off-diagonal terms. If you write down a square matrix on a piece of paper, and draw a line through the matrix connecting the top left corner to the bottom right corner, all the elements that this line passes through are called diagonal elements. All the other elements are called off-diagonal elements. If you are familiar with matrix indices, then the diagonal elements are all those elements that have the same row and column number (i = j). And the off-diagonal elements are all those elements that have i not equal to j. Hope that clarifies 😊😊
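A tiny sketch of that index rule (just an illustrative 3x3 example, assuming NumPy):

import numpy as np

M = np.arange(9).reshape(3, 3)   # any square matrix

# Diagonal elements: row index equals column index (i = j)
diag = [M[i, i] for i in range(3)]

# Off-diagonal elements: i not equal to j
off_diag = [M[i, j] for i in range(3) for j in range(3) if i != j]

print(diag)      # [0, 4, 8]
print(off_diag)  # [1, 2, 3, 5, 6, 7]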
@@affafjunaid3814 Sorry 😔😔😔, that's the reason I clarified I am not a native English speaker.
In the context of the inertia tensor, what do the non-diagonal elements mean or say?
Btw, thank you for the answer, I liked it.
@@lq_12 The inertia tensor is specific to a selection of axes. If all of the non-diagonal elements are zero, then the tensor is for the *principal* axes of the body. If the axes are principal axes, then a torque around an axis will only cause an angular acceleration about *that* axis. If the non-diagonal tensor elements are not zero, then the axes of that tensor are not principal axes, and a torque around one axis can cause an angular acceleration about a *different* axis!
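A hedged sketch of that idea (the entries of I are made up purely for illustration; assuming NumPy): diagonalizing a symmetric inertia tensor with an eigendecomposition gives the principal axes, and in that basis the off-diagonal elements vanish.

import numpy as np

# An example inertia tensor with non-zero off-diagonal elements
# (symmetric, values chosen only for illustration)
I = np.array([[ 4.0, -1.0, 0.0],
              [-1.0,  3.0, 0.5],
              [ 0.0,  0.5, 2.0]])

# Eigenvectors of a symmetric tensor are the principal axes
principal_moments, axes = np.linalg.eigh(I)

# Rotating the tensor into the principal-axis basis makes it diagonal
I_principal = axes.T @ I @ axes
print(np.round(I_principal, 10))   # off-diagonal elements are ~0
print(principal_moments)           # the diagonal (principal) moments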
thank you
Very good
but first what A TeNsah
❤love u
That accent 😶