The tensor product is denoted by the same symbol. How does it relate to both the vector product and the Kronecker product?
The outer product and the Kronecker product are just special cases of the tensor product.
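If you have NumPy handy, you can check this numerically. A quick sketch: `np.tensordot` with `axes=0` computes the tensor product, and both `np.outer` and `np.kron` come out as reshapings of it (the index shuffle for `kron` is the only subtle part):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])

# Outer product of vectors is exactly the tensor product with axes=0:
outer = np.outer(u, v)               # shape (2, 3), entries u_i * v_j
tensor = np.tensordot(u, v, axes=0)  # same entries
assert np.allclose(outer, tensor)

# Kronecker product of matrices is the tensor product with the four
# indices (i, j, k, l) flattened pairwise into two (block structure):
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [1.0, 0.0]])
kron = np.kron(A, B)                   # shape (4, 4)
tensor4 = np.tensordot(A, B, axes=0)   # shape (2, 2, 2, 2)
assert np.allclose(kron, tensor4.transpose(0, 2, 1, 3).reshape(4, 4))
```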
So this is called "outer product."
Before coming here, I was stuck on this symbol. Now I understand its usage. Thank you for explaining.
Btw, I love the example with its usage. That helps me understand the outer product better.
the same symbol is also used for the tensor product
Thank you for doing these videos!
Cheers!
Thank you very much for your (also more advanced) videos. Isn't this also called a dyad? Could you also cover the "wedge" product and this symplectic "stuff"? Thanks again :)
Yes, indeed. It's also called a dyad.
So the outer product of vectors is the same as the Kronecker product of matrices, but with the second factor transposed.
Edit: found confirmation here:
en.wikipedia.org/wiki/Outer_product#Connection_with_the_Kronecker_product
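The claim is easy to verify in NumPy (a quick sketch; treating the inputs as column vectors so that `np.kron` sees their shapes):

```python
import numpy as np

u = np.array([[1.0], [2.0]])           # column vector, shape (2, 1)
v = np.array([[3.0], [4.0], [5.0]])    # column vector, shape (3, 1)

# Kronecker product with the second factor transposed reproduces
# the outer product (np.outer flattens its inputs internally):
assert np.allclose(np.kron(u, v.T), np.outer(u, v))
```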
Also, for column vectors, the outer product is the same as matrix multiplication with the second factor transposed.
This is equal to a column vector times a row vector (i.e., ab^T).
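In NumPy this is literally one matrix multiplication (a quick sketch of the column-times-row view):

```python
import numpy as np

a = np.array([[1.0], [2.0]])           # column vector, shape (2, 1)
b = np.array([[3.0], [4.0], [5.0]])    # column vector, shape (3, 1)

# A (2, 1) column times a (1, 3) row gives the (2, 3) outer product:
M = a @ b.T
assert np.allclose(M, np.outer(a, b))
assert M.shape == (2, 3)
```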
I believe I have never seen this symbol before.
Now I'm prepared for the day I come across it somewhere else.
Thank you very much :)
It shows up all the time in linear algebra.
OK, the computation is simple, but what does it mean to take the outer product? With the dot product, the scalar output is a measure of the likeness of the vectors. The cross product outputs a new orthogonal vector.
What is the outer product output?
A matrix = linear map. We will discuss this in my linear algebra series :)
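One way to see the "matrix = linear map" answer concretely (a quick sketch): the outer product u v^T, applied to a vector x, measures x against v with a dot product and sends the result in the direction of u. That also makes it a rank-1 matrix:

```python
import numpy as np

u = np.array([2.0, 1.0])
v = np.array([1.0, 0.0])
x = np.array([3.0, 5.0])

M = np.outer(u, v)   # the rank-1 matrix u v^T

# (u v^T) x = (v . x) u: project x onto v, then point along u.
assert np.allclose(M @ x, np.dot(v, x) * u)

# For nonzero u and v, the resulting matrix always has rank 1:
assert np.linalg.matrix_rank(M) == 1
```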
@@brightsideofmaths Yeah, I've been going through the search results for this and everyone explains the trivial task of how to do it but not why to do it. I'm experimenting with a neural net using the discrete cosine transform. As far as I know, this has not been tried before. I want to do it without just using modules so I have a full appreciation of the transforms as my maths is very weak. So the task I have is taking an x, y plot and breaking it into its cosine frequency and amplitude components.
Another (engineering) channel mentioned that the outer product is a matrix with "low dimensionality". That sounds fascinating... Probably the best thing I can do is sit down with a piece of paper and see what it would take to produce an identity matrix. Then I'll see what happens when I multiply the matrix by the original vectors.
Hopefully I can figure this out. Tbh, I've coded 3D graphics engines from scratch, but I've never heard of this product... I'm quite excited to find out what it does / is / can be used for.
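On the identity-matrix question: a single outer product can never give the identity (it always has rank 1, while the n-by-n identity has rank n), but the identity is the sum of the outer products of an orthonormal basis with itself. A quick NumPy sketch:

```python
import numpy as np

n = 3
# Sum e_i e_i^T over the standard basis vectors e_i (rows of eye(n)):
I = sum(np.outer(e, e) for e in np.eye(n))
assert np.allclose(I, np.eye(n))
```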
Wow, this thing is amazing...
`
import math

class Vector2:
    def __init__(self, x, y):
        self.x = float(x)
        self.y = float(y)

    def normalize(self):
        # Scale to unit length.
        r = math.sqrt(self.x * self.x + self.y * self.y)
        self.x *= 1 / r
        self.y *= 1 / r
        return self

    def __str__(self):
        return f"Vector2({self.x}, {self.y})"

class Matrix:
    def __init__(self, mx, my):
        self.mx = mx  # first row
        self.my = my  # second row

    def __str__(self):
        return f"Matrix({self.mx}, {self.my})"

def outer(v, v2):
    # Outer product: entry (i, j) is v_i * v2_j.
    return Matrix(Vector2(v.x * v2.x, v.x * v2.y),
                  Vector2(v.y * v2.x, v.y * v2.y))

if __name__ == '__main__':
    print(outer(Vector2(1, 0), Vector2(1, 0)))
    print(outer(Vector2(0, 1), Vector2(1, 0)))
    print(outer(Vector2(0, 1), Vector2(1, 1).normalize()))
    print(outer(Vector2(-1, 0), Vector2(1, 0)))
`
Outputs:
`
Matrix(Vector2(1.0, 0.0), Vector2(0.0, 0.0))
Matrix(Vector2(0.0, 0.0), Vector2(1.0, 0.0))
Matrix(Vector2(0.0, 0.0), Vector2(0.7071067811865475, 0.7071067811865475))
Matrix(Vector2(-1.0, -0.0), Vector2(0.0, 0.0))
`
Very low dimensional... And look, identical vectors give you a matrix with a single value in the top left. You get the first row of the matrix zeroed out when the pair is orthonormal. Look at the PI/4 one... you lose the first vector, but it preserves the 45-degree angle of the transposed second vector.
Fascinating... You're obviously preserving the relationship between the vectors. Let me add vector-matrix multiplication to the code and see what happens.
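A minimal sketch of that experiment, reusing the same `Vector2`/`Matrix` classes (the added `dot` and `apply` methods are new here): applying the outer product matrix of u and v to a vector w gives (v . w) * u, so the matrix "measures" w against v and scales u by the result.

```python
class Vector2:
    def __init__(self, x, y):
        self.x = float(x)
        self.y = float(y)

    def dot(self, other):
        return self.x * other.x + self.y * other.y

    def __str__(self):
        return f"Vector2({self.x}, {self.y})"

class Matrix:
    def __init__(self, mx, my):
        self.mx = mx  # first row
        self.my = my  # second row

    def apply(self, w):
        # Matrix-vector multiplication: each row dotted with w.
        return Vector2(self.mx.dot(w), self.my.dot(w))

def outer(v, v2):
    return Matrix(Vector2(v.x * v2.x, v.x * v2.y),
                  Vector2(v.y * v2.x, v.y * v2.y))

# (u outer v) applied to w equals (v . w) * u.
u, v, w = Vector2(2, 1), Vector2(1, 0), Vector2(3, 5)
result = outer(u, v).apply(w)
s = v.dot(w)  # 3.0
assert (result.x, result.y) == (s * u.x, s * u.y)  # (6.0, 3.0)
```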
Finally reached max karma. Didn't know that The Bright Side of Mathematics was an echo all along.
Yeah :D
@@brightsideofmaths wait, have you played rain world?