"Do what you want with this information. I don't know what this is useful for, and, to be honest, I don't care, because it's just beautiful as it is." Spoken like a pure mathematician. Study math because math is beautiful!
It doesn't need to have any immediate clear uses; it just _might_ turn out to be useful for something at some point, for whatever reason. So math is a little bit like preparing a "toolbox", where things are as general and flexible as possible, just in case they turn out to be needed.
Cool exercise. It teaches us something about the domain of math and how to explore it. Just a small slip of notation there, though: x^(1/n) is not (1/n)√x it is n√x.
I just finished my intro to linear algebra course and I was hoping to never see anything related to linear again but this was really interesting and fun to watch! What's even better is that I actually understood the steps you were taking.
I've been having a bad dwelling anxiety attack and what do I find that saves me from my somber mood? This gem! Genial! The Mad Man did it!I am so happy to see these bizarre beauties on your channel!
I've only studied math until C1 for my business degree, and to be honest, it is not my favorite subject, but is awesome to see how passionate you sound in your videos, keep up the good work, your content is very interesting
2:00 there is a fair argument you can make in favor of what you are doing. Essentially, a^b is a left-right association, but at the same time you could find a mathematical use for treating roots and powers differently, as the nth root of x is a power-base ordered phrasing, so you could actually want to use e^(n^-1 ln(x)) for roots, and e^(ln(x) n) for regular powers. In this case, it boils down to convention, as long as it's forever consistent.
I'll have to subscribe after seeing this. I haven't seen the Matrix since college so I will need to go back and review some more of Dr. Peyam's videos.
omg diagonalization is so powerful it seems the main technique in linear algebra invented by Grassman 1848 , the matrix Latin for womb by Sylvester an American Actuary 1848 , with Cayley defining the inverse in the 1860s Another beaudy by Dr Peyam always upbeat and chirpy 😜👏🏿👏🏿👏🏿
Yup, algebra is amazing. It is the most potent form of meta-mathematics that exists, studying decompositions, representations and data compression of structures. It is like a detective game, but within mathematical structures. No math would prosper without algebra ✌
I'm a math major and just finished my linear algebra sequences. And let me tell you that I've never dreamt that this could be done. It's weird lol. But beautiful
Perhaps one can derive some kind of rule for similar problems? I notice that the matrix that needs to be taken a root of is simply divided by 2 and 4 at the bottom row, which possibly has something to do with the 2's in the diagonal of other matrix (the one above the root symbol). And it also happens to contain 1,2 and 3 in both matrices.
There is a little mistake of the video but it is just notation problem. 1/n root of x is equal to x^(1/n). It is actually is x^n. But the video is very entertainment I have subscribed it to your channel and liked this video. :)
I agree with your statement about not caring about what this is useful for. but I do think it would be worthwhile to try to obtain some intuition about what this means. what is the meaning of taking the matrix root of something. very strange but the analysis shows that it works and therefore there is probably some meaning behind it. oftentimes things like this can reveal something about the operation in question. we can view root extraction as something far more general than just an operation on vectors. i think a lot of ppl would appreciate if you'd explain a bit more about why you can just apply a function like ln or e^x to a diagonalized matrix the way you did. i know i didn't understand that bit, but my linear algebra is a bit ancient and weak 🥴
I'm in 12th class currently and I don't carry much knowledge about matrices in this standard but when I saw the thumbnail of the video I just went crazy and tapped on it immediately....This is a truly wonderful clickbait
Ok, so if we consider scalars to be 1x1 matrices, then for an nxn matrix, it appears we can define the 1x1 root as well as the nxn root of it. Can this be generalized to any mxm matrix root? Or is there something special about 1 and n in producing the roots?
I believe that there is indeed something special about 1 and n in this context, since we're in an algebra (the algebra of matrices, which is essentially a vector space with an additional product operation, like in a ring), and in this algebra, we can define multiplication either by scalars (1x1 matrices if you like) and other elements of the algebra (nxn matrices). So, in that sense, I can't think of a natural way to generalize this root operation to accept other sizes of matrices
I like it when a math person assumes that we know what he talking about. Sounds like my mathematical physics teacher 40 years ago. I was the only student that liked him. Not stated is that from the power series expansion of any function, the eigenvector matrix and its inverse would be end up adjacent to each other given identity, leaving the diagonal matrix of a particular power.
@@drpeyam Thanks you for the prompt reply sir. specifically I am asking if there is any published book or something. I have studied matrix functions which are extended from real valued functions but I have never seen such thing.
Couldn't we compute the logarithm of a matrix A=RDR^-1 as log(A)=log(RDR^-1)=log(R)+log(D)+log(R^-1)=log(D) ? I know that this would probably hold only if the matrices commuted, but it could be nice.
Not partical fan of these number examples since the small computational problems keeps me distracted to see the big picture. I would rather like a more generalized approach, let say a 2x2 Matrix ([a1,a2], [a3, a4]) or even nxn matrix
@@drpeyam It is completely doable to do matrix-matrix exponentials for normal nonsinguar matrices A,B such that A^B = exp(log(A) B). However, I guess the case where A^(B^-1) is just a matter of handwork. Any idea if diagonalization of B will make it doable?
The physical significance of the matrix root of another matrix is the one to one mapping of the galaxies of one universe onto its neighboring universe assuming that the mapped universe is invertible. The mapping is unique and conforms to the laws of relativity
Well, one possible evolution of neural network might be a convolution, somehow, of exponentiation of matrices (i.e., connections between layers), so ,.... it might be VERY useful :D
That's great but could eigen value be zero and can you apply such diagonalization for any square matrix? I mean, we have charateristic polynoms for eigenvales and last ones have complexity roots sometimes.
No it's correct. Think of √4, it's the same as 4^(1/2) = 2. This is because 4^(1/2)*4^(1/2) = 4^(1/2+1/2) = 4^1 = 4, so it follows that (4^(1/2))^2 = 4, so it is in fact the square root of 4.
Well… I mean….
"This is math. We can do whatever we want."
As the beaten to death Thanos meme goes, "reality can be whatever I want" - and this is true in linear algebra where you can choose any basis!
Legendary quote. I’m going to put it at the top of my syllabus
@@angeldude101 Nice, I was just watching some of her videos!
challenge accepted
*let 1 = 2*
A matrix that contains a matrix as an element
"Do what you want with this information. I don't know what this is useful for, and, to be honest, I don't care, because it's just beautiful as it is."
Spoken like a pure mathematician. Study math because math is beautiful!
It doesn't need to have any immediate clear uses;
it just _might_ turn out to be useful for something at some point, for whatever reason.
So math is a little bit like preparing a "toolbox", where things are as general and flexible as possible, just in case they turn out to be needed.
Cool exercise. It teaches us something about the domain of math and how to explore it. Just a small slip of notation there, though: x^(1/n) is not the (1/n)th root of x; it is the nth root of x.
Thank you!!!
@@drpeyam Dr P is always so courteous 😜
From the category of calculus to the category of linear algebra, there is a fully faithful functor. Perhaps contravariant?
Do you pay property taxes for your forehead? That’s a lot of acres man…
Yep
I just finished my intro to linear algebra course and I was hoping to never see anything related to linear again but this was really interesting and fun to watch! What's even better is that I actually understood the steps you were taking.
Exactly how I felt watching this
This is totally batshit crazy, I love it
I've been having a bad, dwelling anxiety attack, and what do I find that saves me from my somber mood? This gem! Brilliant! The Mad Man did it! I am so happy to see these bizarre beauties on your channel!
This is not madness but mathness
So he should be called Mad Maths!
I've only studied math up to C1 for my business degree, and to be honest, it is not my favorite subject, but it is awesome to see how passionate you sound in your videos. Keep up the good work; your content is very interesting.
I have used exponential matrices and the logarithm of matrices before. Writing some kind of matrixth root is just a nice possibility to consider.
OK, thank you for blowing my brains out. Linear algebra was one of my favorite subjects in college, but this is exquisite nuts stuff.
So fitting that December is the release month of the Matrix Resurrections!
Funny because the new trailer just released a few hours ago. After all...I still know math fu...
@@citizencj3389 i know! Are you pumped to go see it?
@@devsquaredTV Yeah I just hope it is at least half as good as the first one. I still liked the other two though.
2:00 there is a fair argument you can make in favor of what you are doing. Essentially, a^b is a left-right association, but at the same time you could find a mathematical use for treating roots and powers differently, as the nth root of x is a power-base ordered phrasing, so you could actually want to use e^(n^-1 ln(x)) for roots, and e^(ln(x) n) for regular powers. In this case, it boils down to convention, as long as it's forever consistent.
I think you might want roots to still be the inverses of powers, so you need to keep the convention consistent between them.
I'll have to subscribe after seeing this. I haven't seen the Matrix since college so I will need to go back and review some more of Dr. Peyam's videos.
Thank you!!!
WOW THAT IS CRAZY!!!
omg, diagonalization is so powerful; it seems to be the main technique in linear algebra. Invented by Grassmann in 1848; the term "matrix" (Latin for womb) was coined by Sylvester, an actuary, in 1848, with Cayley defining the inverse in the 1860s.
Another beauty by Dr Peyam, always upbeat and chirpy 😜👏🏿👏🏿👏🏿
Yup, algebra is amazing.
It is the most potent form of meta-mathematics that exists, studying decompositions, representations and data compression of structures.
It is like a detective game, but within mathematical structures.
No math would prosper without algebra ✌
That is the essence of a mathematician: generalizing concepts and operations.
An intelligent observation, my algebraic friend.
"I don't know what this is useful for, and to be honest, I don't care" - every mathematician's favorite sentence
This is insane in every definition of the word! Great job :)
I'm a math major and just finished my linear algebra sequences. And let me tell you that I've never dreamt that this could be done. It's weird lol. But beautiful
I've taken scalar to matrix and matrix to scalar powers before, but never matrix to matrix. Very cool
Amazing! Can you do the matrixth derivative of a matrix?
I never thought the answer would be this, but your explanation was so simple that I got it almost at once. Thank you for the interesting video.
right is always right
I didn't think he'd actually do it, lol!
Perhaps one can derive some kind of rule for similar problems? I notice that the matrix that needs to be taken a root of is simply divided by 2 and 4 at the bottom row, which possibly has something to do with the 2's in the diagonal of other matrix (the one above the root symbol). And it also happens to contain 1,2 and 3 in both matrices.
since its all based on the diagonalised eigenmatrix maybe you can directly use that?
Seriously amazing concept
There is a little mistake in the video, but it is just a notation problem: the (1/n)th root of x is not x^(1/n), it is actually x^n. But the video is very entertaining; I subscribed to your channel and liked this video. :)
The way to do this is to write X=exp(log(X)), and then use the series expansions for log and exp.
Then you will have to deal with the convergence issues though.
@@hOREP245 Of course but that just falls upon eigenvalues of the matrix.
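A minimal NumPy sketch of the truncated-series idea from this thread (my own illustration, not from the video; the convergence caveat raised above applies, and here I pick a nilpotent matrix where the series is exact after two terms):

```python
import numpy as np

def expm_series(A, terms=30):
    """Truncated power series for the matrix exponential:
    exp(A) ~ I + A + A^2/2! + ... Convergence of the truncation
    depends on the eigenvalues of A, as noted above."""
    result = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k   # term becomes A^k / k!
        result = result + term
    return result

# Nilpotent example: A^2 = 0, so exp(A) = I + A exactly.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
print(expm_series(A))  # [[1. 1.] [0. 1.]]
```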
Foarte interesant! Care este aplicabilitatea practica?
I agree with your statement about not caring what this is useful for, but I do think it would be worthwhile to try to obtain some intuition about what this means: what is the meaning of taking the matrix root of something? Very strange, but the analysis shows that it works, and therefore there is probably some meaning behind it. Oftentimes things like this can reveal something about the operation in question. We can view root extraction as something far more general than just an operation on numbers.
I think a lot of ppl would appreciate it if you'd explain a bit more about why you can just apply a function like ln or e^x to a diagonalized matrix the way you did. I know I didn't understand that bit, but my linear algebra is a bit ancient and weak 🥴
There’s a video on matrix exponentials that explains this, it basically applies to any function that has a power series
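To illustrate the reply above, here is a small NumPy sketch (my own, with a hypothetical helper name `matrix_func`) of why a function like ln or e^x can be applied to a diagonalizable matrix: diagonalize, apply the function to the eigenvalues, change basis back:

```python
import numpy as np

def matrix_func(A, f):
    # Diagonalize A = V diag(w) V^{-1}, apply f to the eigenvalues,
    # then change basis back. This is the construction used in the video.
    w, V = np.linalg.eig(A)
    return V @ np.diag(f(w)) @ np.linalg.inv(V)

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # eigenvalues 1 and 3
S = matrix_func(A, np.sqrt)              # one square root of A
print(np.allclose(S @ S, A))             # True: S*S recovers A
```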
I fucking love how much this guy is enjoying himself. King.
mad
absolutely crazy love it
Doctor Peyam I absolutely love your videos!! It's so inspiring to see such a knowledgeable man as you at work! It instantly makes me want to study :p
I'm in 12th class currently and I don't have much knowledge about matrices at this level, but when I saw the thumbnail of the video I just went crazy and tapped on it immediately... This is truly wonderful clickbait.
Ok, so if we consider scalars to be 1x1 matrices, then for an nxn matrix, it appears we can define the 1x1 root as well as the nxn root of it. Can this be generalized to any mxm matrix root? Or is there something special about 1 and n in producing the roots?
I believe there is indeed something special about 1 and n in this context, since we're in an algebra (the algebra of matrices, which is essentially a vector space with an additional product operation, like in a ring), and in this algebra we can define multiplication either by scalars (1x1 matrices if you like) or by other elements of the algebra (nxn matrices). So, in that sense, I can't think of a natural way to generalize this root operation to accept other sizes of matrices.
A true mad lad, thanks for this 🤣
This is another level... thank you very much for bringing light into the cave.
"I'm sorry ln(DeGeneres) this is my time to shine" - 😂🤣😅🤣😂🤣😅 I can't believe how much I laughed.
🤣💀
"I don't know what this is useful for, to be honest I don't care, because it's just beautiful as it is"
I think that's something my mother says.
You're incredibly entertaining to watch! Greetings from Italy ✋🍕🔥
The root of a matrix. That's a good one!
When he said "[two, minus one, minus three, second]th", I felt that.
This is insane, I love it
Bravo, Maestro! Bravissimo! I never even thought of this , let alone how to do it! Live and learn, the Weird!
Linear algebra final on Wednesday, this is perfect
Great job sir
This is absolutely CRAZY but wonderful!!!
Why didn’t I ever think of this in 6 decades?
I want more insanity!!!
Thanks so much!!!
What's next? αth derivative of a matrix function with respect to a matrix variable, where α is also a matrix?
Fractional derivative of the curve integral of homological chain complexes of Lie algebras or some other crazy shit lol
@@Wabbelpaddel something that's more likely to be taught at hogwarts, honestly
"This is Math, we can do whatever we want"! Love it!
I like it when a math person assumes that we know what he's talking about. Sounds like my mathematical physics teacher 40 years ago; I was the only student who liked him. Not stated is that, in the power series expansion of any function, the eigenvector matrix and its inverse end up adjacent to each other, giving the identity and leaving the diagonal matrix of a particular power.
It’s because i’ve done countless videos on this, check out my eigenvalues playlist
Sounds applicable for some tensor calc in GR
"...because right is always right"
Just a reminder that Dr Peyam is left handed.
You're a literal god Dr. Peyam
Thanks so much!!!
" this is math , we can do whatever we do " this statement is mathematically false 😁❤ .... salute to you ❤
It's a cool exercise on matrix to the power of matrix. It must have an interesting app some day.
I love this. Thank you very, very much.
This is so cool. I used to philosophize about this kind of shit in high school and college. Cool to see that it is possible to do a problem like this.
Makes me wonder... can the gamma function be extended to matrices in order to get a smooth matrix factorial? 🤯
Full immersion i m in love
He is crazy but in a good way!
Sweet.
VERY GREAT EXERCISE SIR
YOU ARE REAL MATHS MASTER SIR
THANK YOU SIR
I just finished my linear algebra final and this… THIS THING! Shows up in my recommended!?
At 0:28, did you mean $\sqrt[n]{x} = x^{1/n}$ rather than $\sqrt[1/n]{x} = x^{1/n}$?
Hi Dr. P. Where can I read more about this? Could you pls help me
Check out the playlist
@@drpeyam Thank you for the prompt reply, sir. Specifically, I am asking if there is any published book or something. I have studied matrix functions extended from real-valued functions, but I have never seen such a thing.
That was really a funny example!
What would be the general form of A^B, where A is the matrix
a b
c d
And B is the matrix
w x
y z
?
Left as an exercise to the reader :)
This is sooo crazy!!!
I don’t understand such high level of math…but I fcking loved this. Instant sub
Thank youuuu
Right is always right?
Couldn't we compute the logarithm of a matrix A=RDR^-1 as log(A)=log(RDR^-1)=log(R)+log(D)+log(R^-1)=log(D) ? I know that this would probably hold only if the matrices commuted, but it could be nice.
Sadly logs don’t operate this way for matrices, in fact we don’t even have identities like exp(A+B) = exp(A) exp(B) for matrices
@@drpeyam Sadface
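A quick numerical check of the point made above, that exp(A+B) = exp(A)exp(B) fails for non-commuting matrices (my own sketch; the series-based `expm` helper is a hypothetical stand-in for a library routine):

```python
import numpy as np

def expm(M, terms=40):
    # Simple truncated-series matrix exponential (fine for small matrices).
    out = np.eye(len(M))
    term = np.eye(len(M))
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])   # AB != BA, so the identity fails
print(np.allclose(expm(A + B), expm(A) @ expm(B)))  # False
```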
Not a particular fan of these numerical examples, since the small computational problems keep me distracted from seeing the big picture. I would rather like a more generalized approach, say a 2x2 matrix ([a1,a2], [a3,a4]) or even an nxn matrix.
LOL, well good luck with that
@@drpeyam It is completely doable to compute matrix-matrix exponentials for normal nonsingular matrices A, B, such that A^B = exp(log(A) B). However, I guess the case A^(B^-1) is just a matter of handwork. Any idea if diagonalization of B will make it doable?
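The A^B = exp(log(A) B) convention from this thread can be sketched in a few lines of NumPy (my own illustration; `funm` and `mat_pow` are hypothetical helper names, and everything assumes the matrices involved are diagonalizable with eigenvalues in the domain of log):

```python
import numpy as np

def funm(A, f):
    # f(A) via eigendecomposition, assuming A is diagonalizable
    w, V = np.linalg.eig(A)
    return V @ np.diag(f(w)) @ np.linalg.inv(V)

def mat_pow(A, B):
    # One convention for A^B, as in the comment above: exp(log(A) B)
    return funm(funm(A, np.log) @ B, np.exp)

# Sanity check: A raised to the identity matrix should give A back.
A = np.array([[2.0, 0.0], [0.0, 3.0]])
print(np.allclose(mat_pow(A, np.eye(2)), A))  # True
```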
excellent thanx a lot!!
The physical significance of the matrix root of another matrix is the one to one mapping of the galaxies of one universe onto its neighboring universe assuming that the mapped universe is invertible. The mapping is unique and conforms to the laws of relativity
This was recommended to me. Im proud of myself
Peyam do you prefer using pens or pencils for doing math?
Pencil for sure
@@drpeyam
Thanks
@@dfdxdfdydfdz all mathematicians love doing analytical math via pencil...and paper respectively.
Very interesting 👍🏼
This reminds me of Kalman filters ... if there is any interest, perhaps see if this might apply somehow to moving-target tracking. Cheers.
Ha ha ha ha. This was so giddy fun. Stuff we do with maths
Well, one possible evolution of neural networks might somehow involve convolution with exponentiation of matrices (i.e., of the connections between layers), so... it might be VERY useful :D
Matrices as exponents are in fact useful in machine learning.
Matrixth is my new favorite word
Me not knowing ANYTHING about a mathematical matrix and still watching:
_Interesting_
Thanks!
Omg thanks so much for the super thanks!!!
This looks like something you would watch while procrastinating at 3am.
Loved it! Btw, at 0:35 should it not be "x to the power of n"? Since you're inverting an inverse? Lol
He wrote it wrong at first... he meant the nth root, not the (1/n)th root.
That's great, but could an eigenvalue be zero, and can you apply such diagonalization to any square matrix? I mean, we have characteristic polynomials for eigenvalues, and those sometimes have complex roots.
It’s fine, ln(0+) = - infinity and if you exponentiate that you get 0. And ln(-1) is complex so also ok
what is an eigenvalue?
Insanity can be a sign of genius and I think that applies here!
isn't the equation in 0:35
wrong?
i think it's x^n
No it's correct. Think of √4, it's the same as 4^(1/2) = 2.
This is because 4^(1/2)*4^(1/2) = 4^(1/2+1/2) = 4^1 = 4, so it follows that (4^(1/2))^2 = 4, so it is in fact the square root of 4.
Yeah he accidentally wrote 1/n on the left side
@@bomboid
That's it
@@ubs7239 oh yeah you're right, I'm sorry, it is the n-th root yeah (or just x^n as you said).
@@bomboid that would make it a very interesting problem!!
I think I've seen this type of linear algebra used in Kalman filtering, but I'm not an expert on it. Neat vid though
Ooooh interesting!!
Great stuff. But how does one cause a matrixth root or a matrixth power of a matrix?
Very surprising. Thank you.
You're welcome!!
Hurting our heads so early in the Holiday season.
That was.................interesting
"This is Math, we can do whatever we want. " - Dr. Peyam
So in my history of Math, I was never wrong. I just did whatever I wanted. 😁
So I guess A^B (for matrices A,B) can not be defined uniquely?
Left power and right power :)
@@drpeyam Yeah, unfortunate :-|
This is why I propose notation A^B=(exp(ln(A)B)) and A ↑B=exp(Bln(A))
@@aneeshsrinivas9088 The up arrow already has a meaning.
The up arrow means repeated exponentiation.
x ↑ 3 = x^x^x
@@poutineausyropderable7108 that's a double up arrow, not a single up arrow; the single up arrow is the same thing as exponentiation
I'd say the best way to find applications for this kind of math is to model it in a simulation.
Can we somehow decompose a 3×3 matrix into several 2×2 matrices such that the operation is unique and the inverse decomposition yields the same 3×3 matrix?
This is complete madness 😂😂
Very good 👏 thank you