Outer Product

  • Published 5 Sep 2024

COMMENTS • 20

  • @user-qp2ps1bk3b · 2 years ago · +15

    The tensor product is denoted by the same symbol. How does it relate to both the vector product and the Kronecker product?

    • @HilbertXVI · 2 years ago · +14

      The outer product and Kronecker product are just special cases of the tensor product.
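
      In index notation the identification behind this reply is direct (a minimal sketch of the standard convention, not from the thread): for u in R^n and v in R^m,

      `
      % the tensor product of two vectors, written as the n x m
      % outer-product matrix u v^T:
      (u \otimes v)_{ij} = u_i v_j, \qquad u \otimes v \;\cong\; u v^{\mathsf{T}} \in \mathbb{R}^{n \times m}
      `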

  • @pinklady7184 · 2 years ago · +6

    So this is called the "outer product."
    I was stuck on this symbol before coming here. Now I understand its usage. Thank you for explaining.

    • @pinklady7184 · 2 years ago · +1

      Btw, I love the example with its usage. That helps me understand the outer product better.

    • @lagrangian143 · 2 years ago · +4

      the same symbol is also used for the tensor product

  • @remlatzargonix1329 · 2 years ago · +4

    Thank you for doing these videos!
    Cheers!

  • @floribus5522 · 2 years ago · +8

    Thank you very much for your (also more advanced) videos. Isn't this also called a dyad? Could you also cover the "wedge" product and this symplectic "stuff"? Thanks again :)

  • @MrLikon7 · 2 years ago · +3

    so the outer product of vectors is the same as the Kronecker product of matrices but with the second factor transposed
    edit: found confirmation here:
    en.wikipedia.org/wiki/Outer_product#Connection_with_the_Kronecker_product
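
    A quick numerical check of that identity (a sketch using numpy, not from the thread): np.outer(u, v) matches the Kronecker product of u as a column with v as a row.

    `
    import numpy as np

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0])

    # outer product: the 3x2 matrix u v^T
    outer = np.outer(u, v)

    # Kronecker product of u as a 3x1 column with v^T as a 1x2 row
    kron = np.kron(u.reshape(-1, 1), v.reshape(1, -1))

    print(np.array_equal(outer, kron))  # True
    `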

    • @patryk_49 · 2 years ago

      Also, for column vectors the outer product is the same as matrix multiplication with the second factor transposed.
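
      The same check for the matrix-multiplication form (a sketch, assuming numpy): with u and v stored as column matrices, u @ v.T reproduces np.outer(u, v).

      `
      import numpy as np

      u = np.array([[1.0], [2.0], [3.0]])  # 3x1 column vector
      v = np.array([[4.0], [5.0]])         # 2x1 column vector

      # matrix product of a column with a transposed column = outer product
      print(np.array_equal(u @ v.T, np.outer(u, v)))  # True
      `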

  • @BrickBreaker21 · 5 months ago

    This is equal to a column vector times a row vector (or AB^T)

  • @Hold_it · 2 years ago

    I believe that I have never seen this symbol before.
    Now I'm prepared for the day I do come across it somewhere else.
    Thank you very much :)

    • @froglet827 · 2 years ago

      It shows up all the time in linear algebra

  • @davidmurphy563 · 2 years ago

    OK, doing the sum is simple, but what does it mean to take the outer product? With the dot product, the scalar output is a measure of the likeness of the vectors. The cross product outputs a new orthogonal vector.
    What is the outer product output?

    • @brightsideofmaths · 2 years ago · +1

      A matrix = linear map. We will discuss this in my linear algebra series :)
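
      One way to see the linear map concretely (a sketch, not from the video): the outer-product matrix u v^T sends any x to (v · x) u, so its image is the line through u and its rank is 1.

      `
      import numpy as np

      u = np.array([2.0, 1.0])
      v = np.array([1.0, 0.0])
      M = np.outer(u, v)               # the matrix u v^T

      x = np.array([3.0, 4.0])
      print(M @ x)                     # [6. 3.]
      print(np.dot(v, x) * u)          # [6. 3.] -- same: (v . x) * u
      print(np.linalg.matrix_rank(M))  # 1: every output lies on the line through u
      `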

    • @davidmurphy563 · 2 years ago · +1

      @@brightsideofmaths Yeah, I've been going through the search results for this and everyone explains the trivial task of how to do it but not why to do it. I'm experimenting with a neural net using the discrete cosine transform. As far as I know, this has not been tried before. I want to do it without just using modules so I have a full appreciation of the transforms as my maths is very weak. So the task I have is taking an x, y plot and breaking it into its cosine frequency and amplitude components.
      Another (engineering) channel mentioned that the outer product is a matrix with "low dimensionality". That sounds fascinating... Probably the best thing I can do is sit down with a piece of paper and see what it would take to produce an identity matrix. Then I'll see what happens when I multiply the matrix by the original vectors.
      Hopefully I can figure this out. Tbh, I've coded 3d graphics engines from scratch but I've never heard of this product... I'm quite excited to find out what it does / is / can be used for.

    • @davidmurphy563 · 2 years ago

      Wow, this thing is amazing...
      `
      import math

      class Vector2:
          def __init__(self, x, y):
              self.x = float(x)
              self.y = float(y)

          def normalize(self):
              r = math.sqrt(self.x*self.x + self.y*self.y)
              self.x *= 1/r
              self.y *= 1/r
              return self

          def __str__(self):
              return f"Vector2({self.x}, {self.y})"

      class Matrix:
          def __init__(self, mx, my):
              self.mx = mx
              self.my = my

          def __str__(self):
              return f"Matrix({self.mx}, {self.my})"

      def outer(v, v2):
          return Matrix(Vector2(v.x*v2.x, v.x*v2.y), Vector2(v.y*v2.x, v.y*v2.y))

      if __name__ == '__main__':
          print(outer(Vector2(1, 0), Vector2(1, 0)))
          print(outer(Vector2(0, 1), Vector2(1, 0)))
          print(outer(Vector2(0, 1), Vector2(1, 1).normalize()))
          print(outer(Vector2(-1, 0), Vector2(1, 0)))
      `
      Outputs:
      `
      Matrix(Vector2(1.0, 0.0), Vector2(0.0, 0.0))
      Matrix(Vector2(0.0, 0.0), Vector2(1.0, 0.0))
      Matrix(Vector2(0.0, 0.0), Vector2(0.7071067811865475, 0.7071067811865475))
      Matrix(Vector2(-1.0, -0.0), Vector2(0.0, 0.0))
      `
      Very low dimensional... And look, identical vectors get you a matrix with a single value top left. And you get the first line of the matrix zeroed out when the vectors are orthonormal. Look at the PI/4 one... you lose the first vector, but it preserves the 45 degree angle of the transposed second vector.
      Fascinating... You're obviously preserving the relationship between the vectors. Let me add vector matrix multiplication to the code and see what happens.
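
      A minimal sketch of that next step (assuming the Vector2, Matrix, and outer definitions from the code above are in scope): applying the outer-product matrix to a vector always returns a multiple of the first factor, which is the "low dimensionality" (rank 1) in action.

      `
      def mat_vec(m, v):
          # ordinary matrix-vector multiplication; m.mx and m.my are the rows
          return Vector2(m.mx.x * v.x + m.mx.y * v.y,
                         m.my.x * v.x + m.my.y * v.y)

      u = Vector2(0, 1)
      w = Vector2(1, 0)
      m = outer(u, w)

      # (u w^T) x = (w . x) * u -- every input lands on the line through u
      print(mat_vec(m, Vector2(3, 4)))  # Vector2(0.0, 3.0)
      `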

  • @thesovereignofdawn9300 · 1 year ago

    Finally reached max karma. Didn't know that The Bright Side of Mathematics was an echo all along.