Matrix exponential properties
- Published Sep 15, 2024
- Showing that exp(A+B) doesn't equal exp(A)exp(B), but showing that it's the case when AB = BA
Check out my Eigenvalues playlist: • Diagonalize 2x2 matrix
Subscribe to my channel: / @drpeyam
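The claim in the description is easy to check numerically. A minimal sketch (assuming NumPy/SciPy are available), using the video's nilpotent pair A, B and a commuting diagonal pair for contrast:

```python
import numpy as np
from scipy.linalg import expm

# Non-commuting pair: exp(A+B) != exp(A) exp(B) in general
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
assert not np.allclose(A @ B, B @ A)                    # AB != BA
assert not np.allclose(expm(A + B), expm(A) @ expm(B))  # identity fails

# Commuting pair: the identity holds
C = np.diag([1.0, 2.0])
D = np.diag([3.0, -1.0])
assert np.allclose(C @ D, D @ C)
assert np.allclose(expm(C + D), expm(C) @ expm(D))
```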
Holy sh*t! The discrete version of Fubini!!!
It kinda is!
Dr Peyam is my hero. His explanations are so useful and he makes it so fun.
For pi day I did a Math Olympian event, and a question similar to this was asked. I couldn't for the life of me answer it because I'm only a Calc 3 student with no Linear Algebra experience. Glad to see you do something like this :)
2:22 isn't e^B [1 0; 1 1], because I is [1 0; 0 1]?
Ya
At 3:03, is it not that e^B has a zero in the top right entry instead?
Yeah
@drpeyam, thanks for the sanity check. Great vid, I remember looking into this and being amazed. Happy to have a youtuber who is willing to play with maths like this.
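For anyone double-checking the 2:22 step: with B = [0 0; 1 0], B² = 0, so the exponential series truncates after I + B. A quick sketch confirming it with scipy:

```python
import numpy as np
from scipy.linalg import expm

B = np.array([[0.0, 0.0], [1.0, 0.0]])
assert np.allclose(B @ B, 0)            # B is nilpotent, series stops early

eB = expm(B)
assert np.allclose(eB, np.eye(2) + B)   # e^B = I + B = [[1, 0], [1, 1]]
```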
There is a very nice general formula I know from quantum mechanics, in the context of the Feynman path integral, known as the Baker-Campbell-Hausdorff (BCH) formula; in the special case where A and B are commuting operators, it reduces to exp(A)exp(B)=exp(A+B).
I think Dr Peyam can do another video on BCH formula...
I don’t know that formula, haha
@DrPeyam: REALLY??? haha
@drpeyam You should really learn that formula. It shows us how to compute z from x and y, knowing that e^x * e^y = e^z.
Thus it kind of gives us the ln of a product. The Baker-Campbell-Hausdorff formula uses quite a lot of commutators, where the commutator of x and y, [x,y], is defined as [x,y] = x*y - y*x.
The formula for e^(x+y) is much easier, and iirc can be expressed through anticommutators instead, i.e. {x,y} = x*y + y*x.
Baker-Campbell-Hausdorff works not only for matrices and associative algebras, but for alternative algebras as well, like the octonions e.g.
hey man, just wanted to say this really helped me on my Quantum Theory exercise.
Thank you!
Hello split imaginary unit. (the [[0,1],[1,0]]) I just now got the connection between the split imaginary and hyperbolic functions. Lol.
When you “pull out” the summation or “constant”, do you have to pull it out to the left, or can you pull it out to the right, thus preserving the order e^A•e^B? I would like to know under what conditions we pull (factor out) to the left or right. Thanks
when you multiply matrices by matrices, the left/right-hand side is very important, as multiplication is asymmetrical. But when we take a matrix out of a sum over a single variable, it doesn't matter. Notice that c*A = A*c when c is a scalar constant or variable.
Skeleton Rowdie: around 15:30, the doctor takes out the soon-to-become e^B because it no longer depends on k. Notice that e^B is a matrix, so it should stay to the right of the summation over k. Hence I think e^A•e^B would be more appropriate at that step.
according to 10:10,
can this theorem be an if-and-only-if statement?
in other words, is the converse statement also true: does exp(A+B)=exp(A)exp(B) imply AB=BA???
according to mathworld.wolfram.com/MatrixExponential.html:
"in general, the formula exp(A)exp(B)=exp(A+B) holds *only* when A and B commute, i.e., [A,B]=AB-BA=0." That seems to be the converse of the statement at 10:10, but I am not sure.
I thought it wasn't necessary, but anticommutativity doesn't work. Another fun topic would be deriving the de Moivre formula for quaternions. You already showed that you can peel off the real part.
Me answering this by intuition:
Yes. Obviously. That's just a property of the exponential.
Wait, no. Maybe. No...
No, of course not, because addition commutes but multiplication doesn't.
At 2:30, isn't e^B = [1 0; 0 1] + [0 0; 1 0] = [1 0; 1 1] (ie with the rows swapped)? Or am I having another of my brain-fades?
It is
@@drpeyam Thanks! I assumed I was going mad(der) for a minute, there!
Super interesting when you consider an exponent X that is an element of a matrix Lie algebra (g = TG). In this case, exponentiating (e^tX) is "flowing from the identity along the integral curves of X" in the associated Lie group G. It's the exponential map exp(tX): g --> G, where g is the Lie algebra of G. Then [X, Y] is the Lie derivative of e^X by Y and measures the failure of the flows in successive directions to return to the identity.
Thanks doctor, it's an extremely good video and I benefited a lot, but I need more videos to make it clear
I was hoping you'd prove the version of this that involves commutators of A and B and commutators of commutators and so on and so on. I think it was called the Baker-Campbell-Hausdorff formula?
I love linear algebra extravaganzas!
Just subscribed. Beautiful problem and solution!
How can I find a rule to obtain all the 2x2 matrices that commute with one another under multiplication?
Thanks Dr. Peyam 😘 but I really didn't catch that 12:42 interchange of summation... 🤔 how did you do that? 😅
Magic!
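The 12:42 step is the "discrete Fubini" reindexing mentioned above: summing over the triangle 0 ≤ k ≤ n row-by-row gives the same terms as summing it column-by-column, i.e. Σₙ Σ_{k=0}^{n} f(k, n−k) = Σ_k Σ_m f(k, m). A finite sanity check, with a hypothetical summand f:

```python
# Reindex the triangle {(n, k): 0 <= k <= n <= N} two ways;
# with m = n - k, both orders visit exactly the same terms.
def f(k, m):                      # hypothetical summand for illustration
    return 2.0**k / (1 + k + m)

N = 20
row_by_row = sum(f(k, n - k) for n in range(N + 1) for k in range(n + 1))
col_by_col = sum(f(k, m) for k in range(N + 1) for m in range(N + 1 - k))
assert abs(row_by_row - col_by_col) < 1e-9
```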
You got e^B wrong, it's:
[1 0]
[1 1]
Yeah
9:35 So what is BA factorial? :P
The sum of A and B here - [0 1, 1 0] - actually corresponds to the "split complex" unit j, which squares to 1 and also has the property exp(t*j) = cosh(t) + j sinh(t) (very similar to Euler's formula!), both of which were demonstrated.
p.s. funnily enough, you accidentally made a connection to it at 15:09!
Yes, split complex numbers are very underrated. The deepest intuition you can have about complex numbers, split complex numbers, and even quaternions and octonions, is that they are both numbers and symmetries of their corresponding spaces. You can actually generate all 2D linear transformations by combining two copies of the imaginary split numbers (call them j and k) and one copy of the regular imaginary number (i) and a real part. In fact, in such a number system, the squared magnitude of such a hyper-complex number is the determinant of a corresponding matrix!
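The split-complex identity mentioned here is easy to verify in matrix form: with J = [[0,1],[1,0]], J² = I, so exp(tJ) = cosh(t)·I + sinh(t)·J, the hyperbolic analogue of Euler's formula. A quick sketch (assuming SciPy is available):

```python
import numpy as np
from scipy.linalg import expm

J = np.array([[0.0, 1.0], [1.0, 0.0]])
assert np.allclose(J @ J, np.eye(2))      # J^2 = I, like the split unit j

t = 1.0
# exp(tJ) = cosh(t) I + sinh(t) J -- the hyperbolic Euler formula
assert np.allclose(expm(t * J),
                   np.cosh(t) * np.eye(2) + np.sinh(t) * J)
```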
Plugging matrices into functions via power series always reminds me of linear algebra 2, where some Cayley-Formula1 driver said that if you plug a matrix into its characteristic polynomial, you get the 0-matrix.
Hahaha, Cayley-Formula 1, I’m gonna use that 😂
e^A e^B=exp[A+B+1/2(AB-BA)] if A and B commute with their commutator, i.e.
[A,B]=AB-BA
[A,[A,B]]=[B,[A,B]]=0
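This special case of BCH can be checked with the 3x3 "Heisenberg" matrices, a standard example (not from the video) where the commutator [A,B] is central, so it commutes with both A and B:

```python
import numpy as np
from scipy.linalg import expm

# A, B generate the Heisenberg algebra; [A,B] is central.
A = np.zeros((3, 3)); A[0, 1] = 1.0
B = np.zeros((3, 3)); B[1, 2] = 1.0
C = A @ B - B @ A                  # [A,B] = E_13
assert np.allclose(A @ C, C @ A) and np.allclose(B @ C, C @ B)

# e^A e^B = exp(A + B + [A,B]/2), while plain e^(A+B) differs
assert np.allclose(expm(A) @ expm(B), expm(A + B + C / 2))
assert not np.allclose(expm(A) @ expm(B), expm(A + B))
```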
So amazing!!
This is brilliant
Great explanation! Thanks!
Shouldn’t e^B be
1 0
1 1
And not
1 1
1 0
, or am I missing something?
Exactly, i was thinking the same
Wow ,I found it !💐💐💐💐
Love u
It would be nice if there were somewhere on the internet a simple proof of exp(a+b)=exp(a)*exp(b), not for matrices
I think there is one somewhere in my videos
Beautiful
So AB=BA is a sufficient condition for exp(A+B)=exp(A)exp(B)
Is it necessary
Absolutely!
This math is beautiful.
Thanks a lot Dr Peyam, peace be upon you
Nice proof!
Greetings, genius! That is all I had to say.
Sir i am from india 🇮🇳
Sorry sir, I have a small doubt: you wrote exp(B) = I + B = [1 1; 1 0]. Sir, please check the matrix
I think that might be a typo, see comments
In fact that *is* a rotation matrix - in Minkowski space!
iff a and b commute!
What are the prerequisites of this video sir ?
No prerequisites, really
@@drpeyam let the video be played.😊
Is there any special names for that kind of matrix A,B which you defined ?
Not really
A + B looks like an evil identity matrix lol.
e^(A+B) = e^A e^B ?
I come here, certain that if A & B commute, the equality will hold.
Because the Taylor series that defines the exponential of a matrix, consists of a sum of scalar multiples of non-neg int. powers of the matrix, and the same algebra that can show eˣ⁺ʸ = eˣeʸ for numbers, by multiplying Taylor series, will work for matrices that commute.
Note that
e^(A+B) = e^(B+A)
because matrix addition always commutes; and that if the topmost equality ("distribution of the exponential") always holds, then
e^A e^B = e^(A+B) = e^(B+A) = e^B e^A
showing that e^A & e^B commute in that case.
What I'm not quite certain of, is that if A & B fail to commute, the equality always fails.
Seems likely, but I'm not sure how to show it. I might approach it using the commutator*:
[A,B] =def= AB - BA
*The usual definition of the commutator, in a group, is
[A,B] =def= ABA⁻¹B⁻¹
so I'm actually hijacking the term here; in a group there's just 1 defined operation; matrices form a non-abelian ring, and so there are 2 operations.
I don't know what else to call the construction I'm using.
If [e^A, e^B] can be developed into a series of powers of [A,B], and that series can be shown ≠ 0 when [A,B] ≠ 0, that would clinch it.
But that seems too hard; there's probably a nicer way.
Let's see what the good Doctor has up his sleeve . . .
Post-view:
That was very good!
My only disappointment was that the converse wasn't shown; but because I can't see how to do that either, I really can't complain. ;-)
So, Dr. ∏M, what do you think? Could my idea about commutators be carried out to the desired conclusion? Do you have any ideas how to do that?
Or can you, or anybody here, give a counterexample; namely, where [A,B] ≠ 0, but e^(A+B) = e^A e^B ?
BTW, the "almost rotation matrix" you got in your example, is actually a "boost" matrix in special relativity! It's a Lorentz transformation in 1 spatial dimension, for a velocity of
v = tanh(1)·c ≈ 0.76c.
Fred
I think it’s not a necessary condition, if A and B fail to commute, we may still have cases where the exponential identity holds
@@drpeyam [I've added to my comment while you were replying...] Yes, I can't quite see how to rule that out, but I suspect it *is* a necessary condition.
This is another one of those,
'This will be tough either way, so do I spend my time & effort on a long-and-involved proof, or on a long-and-involved search for a counterexample?'
situations... GRRRR ;-(
Fred
One of the commenters seems to suggest that there are counterexamples, although I can't come up with one right off the bat
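For what it's worth, there is a standard counterexample to the converse over the complex matrices (not from the video): A = diag(iπ, −iπ) and B = [[iπ, 1], [0, −iπ]] do not commute, yet e^A = e^B = −I and e^{A+B} = I, so exp(A+B) = exp(A)exp(B) anyway. A sketch checking this numerically:

```python
import numpy as np
from scipy.linalg import expm

# Complex counterexample to the converse: AB != BA,
# yet exp(A+B) = exp(A) exp(B) (both equal to I here).
A = np.array([[1j * np.pi, 0.0], [0.0, -1j * np.pi]])
B = np.array([[1j * np.pi, 1.0], [0.0, -1j * np.pi]])
assert not np.allclose(A @ B, B @ A)          # they do not commute

assert np.allclose(expm(A), -np.eye(2))       # e^A = -I
assert np.allclose(expm(B), -np.eye(2))       # e^B = -I as well
assert np.allclose(expm(A + B), np.eye(2))    # e^(A+B) = I = e^A e^B
```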
oh my *cosh*
I see what you are doing here.
ur a god
I like your video, you like my comment. You know the rules, Peyam.
Baker Campbell Hausdorff would be good.
Don’t know it, though :/
Cool thanks.
Cool
ABBA! 😍🙈
Baker-Campbell-Hausdorff
wow!
I'm not first, I'm so disappointed!