Dual Vector Properties and Raising/Lowering Indices

  • Published 17 Jan 2025

COMMENTS • 18

  • @dennisbrown5313 · 1 month ago +3

    Outstanding explanation; how you show the equivalence of the orthonormal coordinate system and the dual space vector (function) is insightful. The proof was elegant. Then showing a non-Euclidean case to introduce the entire idea of a metric tensor is brilliant.

  • @ChaineYTXF · 1 month ago +3

    This is a very, very good explanation. Thank you for this outstanding contribution!🙏

  • @giuliocasa1304 · 1 month ago +1

    Thank you very much!

  • @benburdick9834 · 1 month ago +2

    Since we use the dot product to define the metric tensor, does that mean that we don't have a metric for a general vector space, since vector spaces are not always inner product spaces?

    • @FacultyofKhan · 1 month ago +1

      Yes, that's correct. You must have an inner product to define the metric tensor.

  • @sagsolyukariasagi · 1 month ago

    It's a great explanation. I think I would grasp it more firmly if there were some numerical examples, where we could see the numerical expression of non-orthogonal bases and the necessity of the metric tensor when dealing with them. Otherwise, it feels like it all just follows some rules.
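    A numerical example of the kind requested above can be sketched in numpy. The basis vectors and components below are my own illustrative choices (not taken from the video): a non-orthogonal basis makes the naive component-wise dot product wrong, and the metric tensor g_ij = e_i · e_j fixes it.

    ```python
    import numpy as np

    # Hypothetical non-orthogonal basis in R^2 (chosen for illustration):
    e1 = np.array([1.0, 0.0])
    e2 = np.array([1.0, 1.0])        # not orthogonal to e1

    # Metric tensor: g_ij = e_i . e_j
    E = np.column_stack([e1, e2])
    g = E.T @ E                      # [[1, 1], [1, 2]]

    # Contravariant components of two vectors in this basis:
    u = np.array([2.0, 3.0])         # u = 2*e1 + 3*e2
    v = np.array([1.0, 1.0])         # v = 1*e1 + 1*e2

    # Naive sum of component products ignores the basis -- wrong here:
    naive = u @ v                    # 5.0

    # Correct dot product contracts with the metric: u . v = g_ij u^i v^j
    correct = u @ g @ v              # 13.0

    # Sanity check against the Cartesian representations of u and v:
    assert np.isclose(correct, (E @ u) @ (E @ v))
    ```

    The cross terms in g (the off-diagonal entries) are exactly the e_1 · e_2 "mixed terms" discussed in the thread below; for an orthonormal basis g is the identity and the naive formula is recovered.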

  • @manstuckinabox3679 · 24 days ago +1

    Currently taking bilinear algebra; I'm starting to see some heavy similarities between these two subjects.

  • @fluteferret1129 · 1 month ago

    4:20 In eq. (3), that's definitely not how you multiply matrices; in fact, you can't multiply those two. It feels to me that what you do is multiply a row matrix with d^1 all the way to the n-th column by the u vector, plus a row matrix with d^2 all the way to the n-th column by the u vector, and so on, up to a row matrix with d^n all the way to the n-th column by the u vector, to get the RHS result. What's going on?

    • @FacultyofKhan · 28 days ago

      You’re right that you can’t multiply two column matrices like the ones in equation 3, but I’m not multiplying matrices in that equation; I’m doing a dot product of two column vectors, which is different altogether.

    • @fluteferret1129 · 27 days ago

      @@FacultyofKhan Firstly, thanks for your answer. Now let me rephrase my problem. Assuming, of course, that the right way to compute a dot product is via the sums in eq. (3), you get mixed terms there. By contrast, in methods 1 and 2 you don't get those mixed terms (e_1*e_2). So from my point of view the problem does not arise when you transpose the covector from row to column; it's still there in method 1 when it's a row vector, since you still don't get the mixed terms. The problem is that the mixed terms don't show up in either method. So why are you saying that the problem is in the transpose?

    • @皇甫累 · 26 days ago

      Yes I feel the same way

    • @皇甫累 · 26 days ago

      @@FacultyofKhan Is it correct that when you use the subscript, you are considering it in the dual vector space, whereas when the superscript i is used, you put it in the normal vector space, since you use the same basis vector e there? In other words, when you transpose the vector, you change the vector space?

    • @FacultyofKhan · 26 days ago

      @@皇甫累 Yes, vector components with a subscript represent the components of the dual vector, while vector components with a superscript represent the components of a regular/normal vector.
      What I want to emphasize though is that taking the transpose of a vector doesn't necessarily give you its dual vector: this rule only applies in Euclidean space with an orthonormal basis. In every other situation, you need to use the metric tensor to lower the index and convert a vector to its dual.
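      The distinction in the reply above (transpose vs. lowering with the metric) can be sketched numerically. The basis and components below are illustrative assumptions of mine, not taken from the video: lowering an index means u_i = g_ij u^j, which only coincides with a plain transpose when g is the identity (orthonormal basis).

      ```python
      import numpy as np

      # Hypothetical non-orthonormal basis (chosen for illustration):
      e1 = np.array([1.0, 0.0])
      e2 = np.array([1.0, 1.0])
      g = np.array([[e1 @ e1, e1 @ e2],
                    [e2 @ e1, e2 @ e2]])   # metric g_ij = e_i . e_j

      u_up = np.array([2.0, 3.0])          # contravariant components u^i

      # Lowering the index: u_i = g_ij u^j gives the dual-vector components.
      u_down = g @ u_up                    # [5, 8] -- differs from [2, 3]

      # In an orthonormal basis g is the identity, so lowering changes
      # nothing and the transpose rule happens to work:
      assert np.allclose(np.eye(2) @ u_up, u_up)
      ```

      So transposing u_up would leave the components [2, 3], while the actual dual components are [5, 8]; the metric is what carries the change of space.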

  • @aliasghar4409 · 1 month ago

    Which software do you use to record these videos?