"What is this used for? Maybe quantum mechanics ... that is always the answer." :-) You are a Maths comedian, Dr Peyam.
That's the question ... What is it used for? Quantum mechanics? An example?
This method requires A to be diagonalizable, but in fact that's not required to calculate the i-th root of a matrix; it just gives some handy shortcuts for a closed-form answer.
How would you do it in general?
@@drpeyam You can use the Taylor series for log(I+M) to get
log(I+(A-I)) = (A-I) - (A-I)^2/2 + (A-I)^3/3 - ...
Then multiplying by -i gives -i*log A, and using the Taylor series for e^M, we get
A^(1/i) = A^(-i) = I + (-i*log A) + (-i*log A)^2/2! + ...
The only problem with this is that the Taylor series for log(I+M) will only converge if the eigenvalues of A - I have absolute value less than 1, but maybe there's a way around that.
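A rough Python sketch of that series idea (the 2x2 matrix A below is invented, with eigenvalues 1.2 and 0.9 so that A - I is small enough for the log series to converge):

    import numpy as np
    from math import factorial

    A = np.array([[1.2, 0.1],
                  [0.0, 0.9]])   # invented example; eigenvalues of A - I have absolute value < 1
    I = np.eye(2)
    M = A - I

    # log A via the series log(I + M) = M - M^2/2 + M^3/3 - ...
    logA = sum((-1) ** (n + 1) * np.linalg.matrix_power(M, n) / n for n in range(1, 60))

    # A^(1/i) = A^(-i) = exp(-i log A) via the exponential series
    X = -1j * logA
    ith_root = sum(np.linalg.matrix_power(X, n) / factorial(n) for n in range(25))

    # sanity check against diagonalization: A^(-i) = P diag(lambda^(-i)) P^(-1)
    w, P = np.linalg.eig(A)
    print(np.allclose(ith_root, P @ np.diag(w.astype(complex) ** (-1j)) @ np.linalg.inv(P)))   # True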
That’s very nice too
There’s something called Abel’s theorem for series that takes care of that
@@MathFromAlphaToOmega If you consider A as a linear operator acting on a Hilbert space, it suffices for the Hilbert-Schmidt norm of A - I to be strictly smaller than 1 in order for the series to converge
"What is this useful for in practice?" Us real mathematicians don't concern ourselves with such questions
Only wish the root that's lifting a section of driveway out front was imaginary ... Great video, as always (enjoyed the poke at quantum too) ... Cheers ...
„Spiel mit mir“ ("play with me") 🤣👍 Greetings from Germany. (P.S. great channel)
It’s always hard for a student to find examples… maybe not in a formal form, but sometimes, when I am studying, it’s hard to find more intuitive first steps; there is always time for a more formal treatment in the books… thank you very much
This is why I love the recommendation algorithms of the internet. I always seem to learn something about a topic that has never even crossed my mind
0:50 makes me think about how amazing (i) is with the property: x^(1/i) = 1/x^i
solving... (the property says x^(1/p) * x^p = 1 for every x, i.e.)
x^(p + 1/p) = 1
-> p + 1/p = 0 or x = 1; since this is supposed to hold for all x, we take the former.
-> p^2 = -1
this really is a property that uniquely follows from the definition of (i)
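A quick numerical check of that starting identity with an arbitrary test value (principal branch):

    x = 2.7 + 1.3j                  # arbitrary complex test value
    lhs = x ** (1 / 1j)             # x^(1/i); note 1/i = -i
    rhs = 1 / (x ** 1j)             # 1/x^i
    print(abs(lhs - rhs) < 1e-12)   # True: both equal exp(-i * Log x)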
I like that!
Next up: finding the ith root of derivative of a function
^ Yeah
@@drpeyam I have seen techniques such as this used before. I am very uncertain about them. The thing is, from my very old mathematical viewpoint, what you are doing is an analytic continuation from the integers into the complex plane. Maybe this is rigorous, but there are examples where doing this causes problems.
Next up: Finding the matrix root of _i_ :)
Oooooh I like that hahaha
that's actually not too difficult if one knows Euler's formula and the Taylor series for e^x, and the matrix is "nice"
i is equal to e^(i*pi/2), and so the M-th root of i (where M is a matrix) = (e^(i*pi/2))^(1/M) = e^((i*pi/2)*M^-1) = sum as n goes from 0 to infinity of ((i*pi/2)*M^-1)^n / n! (well, assuming that M is invertible and that this series converges)
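A hedged scipy sketch of that formula; the 2x2 matrix M below is invented and assumed invertible, and principal branches are used throughout:

    import numpy as np
    from scipy.linalg import expm, logm

    M = np.array([[2.0, 1.0],
                  [0.0, 1.0]])   # invented invertible matrix

    # i^(1/M) = e^((i*pi/2) * M^-1); expm sums the exponential series for us
    R = expm((1j * np.pi / 2) * np.linalg.inv(M))

    # check: "R to the power M", read as exp(M log R), should be i times the identity
    print(np.allclose(expm(M @ logm(R)), 1j * np.eye(2)))   # True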
I had no idea how to do it!! Thank you bro! New subscriber!
Yaaaaay
Excellent presentation. Wow!! Dr 3.14159.....m.
3.14159265358979323846264338* to be more precise
As-salamu alaykum, Dr Peyam
Could you please answer my question?
If we plug in 1 instead of 3 in the Euler formula, we certainly get 1 to the minus i = 1
On the other hand, we can think of 1 as exp((2)(pi)(i)), so that 1 to the minus i = exp(2 pi)
What an amazing coincidence!
Yep :)
Excellent sir
It actually does pop up in quantum mechanics in a way. The time-evolution operator (if we assume H is a time-independent Hamiltonian, which can be represented as a matrix) is given by U = exp(itH), so if you define A = exp(-tH) for whatever reason, you will in fact get U = A^-i
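A small scipy check of that identity with a made-up 2x2 Hermitian H and time t (the principal matrix log is unambiguous here because exp(-tH) is positive definite):

    import numpy as np
    from scipy.linalg import expm, logm

    H = np.array([[1.0, 0.3],
                  [0.3, 2.0]])   # made-up Hermitian "Hamiltonian"
    t = 0.7

    U = expm(1j * t * H)                 # time-evolution operator in the convention above
    A = expm(-t * H)                     # A = exp(-tH), positive definite
    A_to_minus_i = expm(-1j * logm(A))   # A^(-i) = exp(-i log A)

    print(np.allclose(U, A_to_minus_i))  # True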
Hahahaha
Wait, I've actually seen that one before... transforming t into tau = it, useful for switching between Lorentzian spacetime (special relativity) and Euclidean space and time (just 4D space), if you want to do that for some reason. Called a "Wick rotation", I believe.
@@geoffrygifari3377 yes that's a Wick rotation. If I recall correctly a Wick rotation of the Schrödinger equation gives the heat equation
This was really interesting, thanks. What are some of the applications of taking imaginary roots of matrices? Also, how does this extend to quaternions/octonions/etc.?
I mentioned applications at the end hahaha
Since the i-th root isn't well defined in terms of how many roots we should expect (there are, for example, more than 2 square roots of a 2 by 2 matrix), how would we know if there is more than just one answer?
Morning from Argentina, Dr. Peyam, thanks for giving one more lesson of knowledge. I've been watching your tutorials for a long time, and every time I enjoy myself.
Can this be solved using SVD instead of eigen decomp?
Ahh, so starting with the Fibonacci companion matrix
[ 1 1 ]
[ 1 0 ]
I was able to find that the ith Fibonacci number is roughly 0.6520 + 0.3294i
it looks like I get basically the same answers across the complex plane as the closed form formula I'm using (ignoring conjugates and a right shift)
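A numpy sketch of that computation, using the principal branch of the log for the negative eigenvalue; the top-left entry reproduces the value quoted above:

    import numpy as np

    Q = np.array([[1.0, 1.0],
                  [1.0, 0.0]])   # Fibonacci companion matrix

    # Q^i = P diag(lambda^i) P^(-1), with lambda^i = exp(i * Log(lambda))
    w, P = np.linalg.eig(Q)                            # w is real: (1 ± sqrt(5))/2
    w_to_i = np.exp(1j * np.log(w.astype(complex)))    # principal log, so Log(-0.618...) = ln(0.618...) + i*pi
    Q_to_i = P @ np.diag(w_to_i) @ np.linalg.inv(P)

    print(Q_to_i[0, 0])   # about 0.6520 + 0.3294i, as quoted above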
Nice
I like the way the exercise is solved!
Hey thanks! I just realise that any a^i (for real a > 0) is on the unit circle in the complex plane.
So any complex number can be written as c*a^i. (I know, I know, it is evident; just put a = e^theta.)
But can something similar be used to delimit the domain of A^i, A being a matrix? Is there a matrix equivalent to the unit circle?
- - Follow-up - - -
OK, I thought it out. Please correct me if I err.
The 2x2 real-numbered matrix A is a linear operator on R². So a complex-numbered matrix is a linear operator on C².
The eigenvectors determine a change of basis (new reference axes) and the eigenvalues an expansion/contraction along these axes/directions. For complex matrices, that expansion happens in the complex (double-)plane C².
Similarly to any complex number being equal to c*a^i, I conjecture that any complex matrix can be written as B*A^i. B is the expansion, A^i is the direction, the angle, on the (2-dimensional) circle.
This also hints at the use: linear operations on complex vectors.
Finally, the i-th root of A is just (A^-1)^i.
Yes there is a polar decomposition of a matrix and I think I made a video on it somewhere
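For reference, a minimal scipy sketch of that polar decomposition, with an invented complex 2x2 matrix (the positive factor plays the role of the "expansion", the unitary factor the "direction"):

    import numpy as np
    from scipy.linalg import polar

    M = np.array([[1 + 2j, 0.5],
                  [-1j, 3.0]])   # invented complex matrix

    U, P = polar(M, side='left')   # M = P @ U with P positive semidefinite and U unitary

    print(np.allclose(M, P @ U))                     # True: "expansion" times "direction"
    print(np.allclose(U.conj().T @ U, np.eye(2)))    # True: U is the matrix analogue of a point on the unit circle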
Thank you, it's great!! From France
Merciii
Add to this category of problems without serious physical significance: the integral of a matrix.
Hi Dr Peyam, I think it may be useful in a new compression algorithm. Not sure, will have to think more on the problem.
Hahaha omg
Hmm there has to be a complex simplification of that (maybe in terms of sinh) but it eludes me. I do appreciate you reminding me of de Moivre's identity.
So 1^-i = 1 b/c 1 = e^0, so (e^0)^-i = 1. But what about the other ways to write 1 (e^(2*pi*m*i))? Are they all right? Same thing with 3^-i.
They’re all right, and would give you different roots
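A quick numerical illustration of that: the extra 2*pi*m*i in the exponent turns into a real factor e^(2*pi*m), so the different branches of 3^-i differ in size rather than in angle:

    import cmath

    for m in range(-1, 3):
        log3 = cmath.log(3) + 2j * cmath.pi * m   # one branch of log 3
        print(m, cmath.exp(-1j * log3))           # 3^(-i) for that branch: e^(2*pi*m) * (cos(ln 3) - i sin(ln 3))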
But (ln3 + 2πm) would have the same cosine and sine as ln3, wouldn't they? This is one and the same complex number. So is the final answer the same?
Oh wow, I'm first for the first time. Also, Dr. Peyam always makes my day just a bit better due to his smile, lol; he's always like "thanks for watching :D" at the beginning of every video.
Before watching the video, I would guess that it involves the Taylor series of x^(1/i), or a Laurent series to be more precise.
Sort of, but I hope you watched it
I am a bit perplexed by the check at the end in which you compute (A^(1/i))^i; it feels like you just redo the steps backwards with the same assumptions.
Yep, exactly 😁
How do you know that A^x = P D^x P^-1? I mean, it's obvious for natural powers, but I don't see it for real/complex ones.
By approximating real numbers with rational ones and taking limits
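A small numerical sanity check of that definition, for a made-up symmetric positive-definite A (so the eigenvalue powers are unambiguous) and an arbitrary real exponent:

    import numpy as np
    from scipy.linalg import fractional_matrix_power

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])   # made-up symmetric positive definite matrix
    x = 0.731                    # arbitrary real exponent

    w, P = np.linalg.eig(A)
    via_eig = P @ np.diag(w ** x) @ np.linalg.inv(P)   # P D^x P^(-1)

    print(np.allclose(via_eig, fractional_matrix_power(A, x)))   # True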
Well Dr Peyam, very weird question. Has application in what branch of science? I will show this to Dr Osaka (Portuguese mathematician).
Quantum mechanics haha
Would love to see you integrate sin(x^n)/(x^n(x^2+1)) from -inf to inf. Super interesting!
Nice, how do you do it?
WHAT FOR?
Thank you Dr. Peyam. And yes, your information is helpful in quantum mechanics. I was thinking about this problem a couple of weeks ago. Nth root matrices are easier to understand. Here's an example: if you scroll up a little from this link, you can see a square root matrix being used in quantum computing. en.wikipedia.org/wiki/Quantum_logic_gate#Controlled_gates
Next up, the ith derivative of a function
Already done ✅
Yes, is this Dr. Peyam's Matrix Power Pizzeria? Could I request an order to go of a matrix raised to a trig function? Thanks, will come to pick it up whenever it's ready. - Also, looking at good math is like sampling good food.
That would be cool
@@drpeyam Wait, I have an idea. I'm not sure if it works, though. Maybe, at least if we raise a matrix to sin(x) or cos(x), we just need to concentrate on the real or imaginary parts of the complex exponential function f(x) = e^(ix). Maybe for starters what we need to look at more clearly is: what does it mean to raise a matrix to the power of a function? (Each element in the domain is related to exactly one element in the codomain - I'm not a logician, but I wish I were, so I could write this down nicely with quantifiers.)
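Not sure that's the intended reading, but one concrete way to make "a matrix raised to a trig function" precise is to put the scalar value cos(x) in the exponent, A^cos(x) = exp(cos(x) * log A); a sketch with an invented positive-definite A:

    import numpy as np
    from scipy.linalg import expm, logm

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])   # invented positive definite matrix, so log A is real and unambiguous
    logA = logm(A)

    def A_to_the(s):
        """A^s = exp(s * log A) for a scalar exponent s, e.g. s = cos(x)."""
        return expm(s * logA)

    x = 1.2
    print(A_to_the(np.cos(x)))   # A raised to the (scalar) power cos(1.2)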
If you don't know where to go with a matrix, the answer is: find its eigenvalues !! (and eigenvectors, of course)
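In that spirit, a minimal numpy sketch of the eigenvalue route to an i-th root (a made-up diagonalizable A with positive eigenvalues, and using 1/i = -i):

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [0.0, 2.0]])   # made-up diagonalizable matrix with positive eigenvalues

    w, P = np.linalg.eig(A)
    D_to_minus_i = np.diag(w.astype(complex) ** (-1j))   # lambda^(1/i) = lambda^(-i)
    ith_root = P @ D_to_minus_i @ np.linalg.inv(P)       # A^(1/i) = P D^(-i) P^(-1)

    # check, in the spirit of the verification at the end of the video: raising back to the i-th power recovers A
    check = P @ np.diag((w.astype(complex) ** (-1j)) ** 1j) @ np.linalg.inv(P)
    print(np.allclose(check, A))   # True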
Mathematician getting adventurous :)
Next up, matrix raised to the power of a matrix
Already done ✅
@@drpeyam Oh! I forgot, now I remember, you are great. And also, it's been so long since you have used a lot of chen lu, looking forward to it.
It is amazing how the little i makes this matrix become a monster 😀
Let's imagine...
interesting
Anyone here believe that log should mean natural log, and not ln?
Next: finding the root of all evil.
😈😈😈
yep, quantum mechanics.
Hi, would you give an example, please?
@@diegotristan8234 The time-evolution operator; it can be represented as an exponential operator.
Next: just do some normal math for crying out loud... Just kidding XD
Oh like g^-1 H g kind of math, or A*A = AA* kind of math 🤪
Sigh, this video should be Creative Commons, not the YouTube standard license.
Huh? What’s the difference?
Noice
Thanks, Dr. PEYAM, and don't care about irrelevant comments, let's go on!