Euler's Formula Beyond Complex Numbers

  • Published 21 Nov 2024

COMMENTS • 289

  • @morphocular
    @morphocular  Рік тому +262

    4:24 - I feel I should also mention that which tool you use to describe higher-dimensional rotations also depends on what use case you have in mind. In particular, if computational speed and numerical stability are required, matrices may not be the best choice for representing 3D rotations; in that case, quaternions and the like may be the better option. In this video, I wanted to explore the matrix representation, but as is often the case in math, there are multiple ways to do the same thing.

    • @guidosalescalvano9862
      @guidosalescalvano9862 Рік тому +9

      I'm particularly interested in "reorientation"; two subsequent rotations of a 3D shape cannot necessarily be expressed using a single rotation. For instance, imagine an airplane that pitches 90 degrees and then rolls 90 degrees. No single rotation axis and angle can be found that rotates the airplane into this orientation. A rotation matrix can express this, though. So something weird is going on here that I wish I understood better.

    • @morphocular
      @morphocular  Рік тому +24

      @@guidosalescalvano9862 Good question! But as it turns out, any 3D rotation actually can be expressed as a rotation by a certain angle about a certain axis. The keyword to search for is Euler's Rotation Theorem. In the case of your pitch and roll example, if I take the rotation axes to be the x-axis and the y-axis, a 90 degree rotation about x followed by a 90 degree rotation about y can actually be expressed as a 120 degree rotation about an oblique axis (specifically, a 120 degree rotation about the vector (1,1,-1)). An example of this kind of rotation actually appears in the video at about 3:38.
      However, in 4D and higher dimensions, I believe you're right. There we can encounter a phenomenon known as a "double rotation" where an object is being rotated independently in more than one plane. So I believe in 4D and above, it's possible to have rotations that truly cannot be expressed in terms of a single angle within a single plane.
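      A quick numerical sketch of that claim (assuming SciPy's Rotation class; the specific calls and values are my own choice, not something from the video):

          import numpy as np
          from scipy.spatial.transform import Rotation as R

          # 90 degrees about x, applied first, then 90 degrees about y
          combined = R.from_rotvec([0, np.pi / 2, 0]) * R.from_rotvec([np.pi / 2, 0, 0])

          rotvec = combined.as_rotvec()        # axis * angle of the single equivalent rotation
          angle = np.linalg.norm(rotvec)
          axis = rotvec / angle

          print(np.degrees(angle))             # ~120
          print(np.round(axis * np.sqrt(3)))   # ~[ 1.  1. -1.]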

    • @guidosalescalvano9862
      @guidosalescalvano9862 Рік тому +5

      @@morphocular Thank you for your response, but I think you didn't understand my question. You are right, for a single point you can apply Euler's rotation theorem. But for a 3D shape, i.e. a group of points, two consecutive rotations on ALL points cannot be simplified to a single rotation on all points. To see why, all you need to do is construct the rotation axes. Let's say the x axis runs along the right wing of the plane at its initial position. The z axis runs towards the tail of the plane at its initial position. The y axis points upwards, perpendicular to the x and z axes. After the two rotation operations the plane's nose points upwards, and the wings stretch along the z axis. So let's compute the rotation axes for both the nose and the right wing tip. We can do this by computing the rotation axes for the initial position of the nose/wingtip and the final position of the nose/wingtip. The nose starts on the z axis, and ends on the y axis. This means that the rotation axis for the nose is the x axis. The right wingtip starts on the x axis, and ends on the z axis. So the rotation axis is the y axis. Because the x and y axes are not the same, no single rotation can be found that can simplify two rotations of the plane as a WHOLE to a single rotation. A rotation matrix can express shape rotations, but I strongly suspect that there are better representations; using two rotation axes, the same shape rotation can be expressed in multiple ways (pitch roll can also be expressed as roll yaw, yaw pitch). So I am wondering: can a shape reorientation be uniquely expressed in terms of two independent operations? If I had to guess it would be two reflections... but how do I prove this beyond a shadow of a doubt?

    • @Trinexx42
      @Trinexx42 Рік тому +9

      ​@@guidosalescalvano9862 Not quite.
      "The nose starts at on the z axis, and ends on the y axis. This means that the rotation axis for the nose is the x axis."
      This does not follow. There are actually infinitely many possible rotations that map the y axis to the z axis. For example, you can rotate pi around the axis that runs pi/4 in between the y axis and z axis or (yhat+zhat)/sqrt(2)
      As for your exercise, if you do an xhat rotation pi/2 followed by the yhat rotation pi/2, it's equivalent to a single rotation by 2pi/3 that runs about axis equal to (xhat+yhat-zhat)/sqrt(3)
      If you want to see this for yourself, you can if you have a rubik's cube. If you hold the cube with the yellow face on the top and the red face on the front and do an xhat rotation, orange ends up on top. Then if you do a yhat rotation, blue ends up on top and yellow ends up in the front.
      Now in your left hand, put your thumb on the upper front left corner (yellow red blue) and your middle finger on the lower back right corner (white green orange). Using your other hand, you can rotate the cube back to having the red on the front and yellow on the top. It's a single rotation that's equivalent to two.

    • @TheLuckySpades
      @TheLuckySpades Рік тому +1

      ​@@guidosalescalvano9862since we are considering the rotation around the center of the plane, Euler's Rotation Theorem states that the composition of the rotations you describe is equivalent to a rotation around some axis passing through the center of the plane again
      The flaw with your argument is that, given the center (O), a point (A), and the image of the point (A'), you say there is only one possible axis and angle (the angle formed by AOA', and the axis perpendicular to the plane described by those points; I'll call this axis g since I will use it later).
      But there are more possible axes of rotation, and each of those has its own angle.
      A useful one to consider is the axis passing through the midpoint of A and A' and the center (let's call this one f); rotating half a turn either way around that axis gives us the desired image for A.
      And similarly we can find angles for any line in the plane formed by f and g that passes through the center O, giving us uncountably many possible axes, represented by that plane.
      We can then repeat that with a second point and get a second plane of possible axes.
      These planes intersect in at least one point (the center), which either means they are identical (which we can avoid by choosing the second point well and ruling out the case that we mirrored around the plane) or their intersection is a line, which would be an axis of the rotation.
      Since we know 3 non-collinear points and their images, and since the transformation is orientation-preserving, the transformation is unique and thus can be described via the rotation we found.

  • @txikitofandango
    @txikitofandango Рік тому +210

    The variety of styles among math YouTube explainers is truly kaleidoscopic, and I really appreciate the balanced tone and energy of all your videos.

    • @easports2618
      @easports2618 Рік тому +6

      Dude, can you give me a list of all the math YouTubers you've encountered? I have a math channel that's only for mathematics, and I'd love to learn from them.

    • @txikitofandango
      @txikitofandango Рік тому

      @@easports2618 For a good smattering check out these channels: Michael Penn, Maths 505 (integral sports), blackpenredpen, Flammable Maths, 3blue1brown, Mathologer, zetamaths, Dr Barker

  • @dunodisko2217
    @dunodisko2217 Рік тому +599

    “We should get a gradual rotation of the cube about the z-axis. And…” *ascends*

  • @AzureLazuline
    @AzureLazuline Рік тому +156

    i'm a programmer and i've done a *lot* of work with 3D graphics, which uses this stuff all the time, but i never thought about how it was built up. The triple cross product is absolute genius, and using it to derive the general rotation matrix (which i use every day and recognized on-sight) is mind-blowing! There's still a few little things i didn't get, so i'll need to watch this again at some point. Thanks for making this video, and i hope everything's been going well for you ❤

    • @sawc.ma.bals.
      @sawc.ma.bals. Рік тому +5

      @guitarszen no one asked

    • @AkamiChannel
      @AkamiChannel Рік тому +3

      @guitarszen no one asked, and geometric algebra is used in programming only by people who specifically sought out such a library. None of the major linear algebra libraries use geo alg.

    • @umeshrajesh1
      @umeshrajesh1 Місяць тому

      “I want to be a programmer when I grow up!”
      -My Son😊

    • @umeshrajesh1
      @umeshrajesh1 Місяць тому

      0:00

    • @umeshrajesh1
      @umeshrajesh1 Місяць тому

      Oh uh I misclicked above but that’s how you’ll watch the video again! 0:00

  • @johnwickfromfortnite5744
    @johnwickfromfortnite5744 Рік тому +68

    Another addendum for the matrix exponential: computing it is a LOT easier when you change basis to the "eigenbasis" of the matrix, the basis made of the vectors that it only scales by a number and doesn't rotate at all. Doing this, the exponential of a matrix is exp(A) = B exp(D) B^-1, where B and B^-1 are the matrices switching between the standard basis and the eigenbasis, in which the matrix A is represented by the matrix D, a diagonal matrix of the eigenvalues (the scale factors of the eigenvectors). Diagonal matrices are MUCH MUCH easier to work with for calculations, so finding the eigenbasis of a matrix is very important and helpful. There are even tricks to "almost" diagonalize a matrix in case your nxn matrix has fewer than n independent eigenvectors (keyword: the Jordan normal form!)
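    A minimal NumPy/SciPy sketch of that recipe (the generator and angle below are arbitrary choices of mine): a rotation generator is skew-symmetric, so its eigenbasis lives in the complex numbers, but B exp(D) B^-1 still lands back on a real rotation matrix.

        import numpy as np
        from scipy.linalg import expm

        # Generator of rotations about the z-axis
        A = np.array([[0., -1., 0.],
                      [1.,  0., 0.],
                      [0.,  0., 0.]])
        theta = 1.2

        # Eigendecomposition A = B D B^-1 (eigenvalues here are 0 and +/- i)
        eigvals, B = np.linalg.eig(A)

        # exp(A*theta) = B exp(D*theta) B^-1; exponentiating a diagonal matrix is just exp of its entries
        expA = B @ np.diag(np.exp(eigvals * theta)) @ np.linalg.inv(B)

        print(np.allclose(expA, expm(A * theta)))  # True: matches the direct matrix exponential
        print(np.allclose(expA.imag, 0))           # True: the imaginary parts cancel out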

  • @jamiewalker329
    @jamiewalker329 Рік тому +40

    This has real importance in quantum mechanics, as we define the angular momentum operator in terms of the generator of infinitesimal rotations. The lack of commutativity of rotations around two different axes results in non-trivial commutation relations that have implications for the inability to measure things simultaneously.
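    A small NumPy check of those commutation relations for the rotation generators themselves (my own sketch; in quantum mechanics the angular momentum operators carry an extra factor of i*hbar, which is omitted here):

        import numpy as np

        # Generators of infinitesimal rotations about x, y and z
        Lx = np.array([[0.,  0.,  0.], [0., 0., -1.], [0., 1., 0.]])
        Ly = np.array([[0.,  0.,  1.], [0., 0.,  0.], [-1., 0., 0.]])
        Lz = np.array([[0., -1.,  0.], [1., 0.,  0.], [0., 0., 0.]])

        def commutator(A, B):
            return A @ B - B @ A

        # The failure to commute closes back into the same set of generators
        print(np.allclose(commutator(Lx, Ly), Lz))  # True
        print(np.allclose(commutator(Ly, Lz), Lx))  # True
        print(np.allclose(commutator(Lz, Lx), Ly))  # True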

    • @pseudolullus
      @pseudolullus Рік тому +7

      Yup, this is basically the exponential map between Lie groups and algebras in very thin disguise

    • @ukkomies100
      @ukkomies100 Рік тому +2

      Is all that really as hard as all this technical talk has me believe? I'm studying engineering at a university and there is an optional course on particle and quantum physics. Will it end my already feeble mental health?

    • @oni8337
      @oni8337 11 місяців тому +1

      Especially when you get to the path integral formulation

    • @umeshrajesh1
      @umeshrajesh1 Місяць тому

      What

  • @benwhite2226
    @benwhite2226 Рік тому +29

    This video would have been useful when I was taking a vector analysis course, as it gives some very good perspectives on matrix rotations and the matrix exponential and their geometric meanings. This reminds me a lot of Euler angles and their derivation, and of using cross products to relate the new basis vectors.

  • @chriskiwi9833
    @chriskiwi9833 Рік тому +4

    I remember deriving all of this content in my first year linear algebra course but I never knew what it all meant. Thank you, I am so grateful.

  • @angeldude101
    @angeldude101 Рік тому +62

    GA and bivectors are more abstract algebraically, but I'd argue more concrete geometrically. It's also worth mentioning the traditional limit definition of the exponential, because it's absolutely beautiful seeing the transformation broken into many small pieces and then applied consecutively to build up the final transformation.
    I have a tendency to see things that are isomorphic and behave the same way as being the same thing, and it only took a single look at the output of the tilt product to recognize the resulting matrix as a bivector. The tilt product can be compared _either_ to the wedge product of vectors _or_ the commutator product of two bivectors (the latter of which actually uses the same symbol as the cross product).
    Quaternions are usually seen as complicated 4D things, but quaternions are no more 4D than 3x3 matrices are 9-dimensional. They're a space of linear combinations of 3D rotations: 180° from x to y around z, 180° from y to z around x, 180° from x to z around y, and a final 0° rotation. By mixing different amounts of these 4 rotations (or not-rotation in the case of the identity), you can get any (scaled) rotation in 3D space. I should warn you though that these are _rotations,_ not _orientations._ For quaternions, as well as general rotors in GA, 270° clockwise is a distinct rotation from 90° counterclockwise, even if they end up in the same place. This is also related to why the basis rotations are 180° rather than 90°.
    In addition to distinguishing different directions of rotation, quaternions (and rotors) often yield nicer interpolations precisely because they're _specifically_ rotations, whereas interpolating two rotation matrices often gives intermediate matrices that _aren't_ rotations. While you _can_ try rescaling them so the determinant is 1, that's impossible for 180° rotations, since they end up with a matrix with 0 determinant in the middle, and the interpolation also wouldn't know which direction to rotate through. Quaternions and rotors have neither of these problems unless you're interpolating from 0° to 360°, which is understandably uncommon.
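    A short sketch of the interpolation point (assuming SciPy's Rotation and Slerp; the two 180° rotations are arbitrary examples of mine): naively averaging the two rotation matrices gives a degenerate matrix, while slerp stays on actual rotations.

        import numpy as np
        from scipy.spatial.transform import Rotation as R, Slerp

        # 180° about x and 180° about y
        key_rots = R.from_rotvec(np.pi * np.array([[1., 0., 0.], [0., 1., 0.]]))

        # Linear interpolation of the matrices: the halfway point has determinant ~0
        M_half = 0.5 * key_rots[0].as_matrix() + 0.5 * key_rots[1].as_matrix()
        print(np.linalg.det(M_half))  # ~0.0 -- degenerate, can't be rescaled into a rotation

        # Quaternion slerp: every intermediate step is a genuine rotation
        Q_half = Slerp([0., 1.], key_rots)([0.5])[0]
        print(np.linalg.det(Q_half.as_matrix()))  # ~1.0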

    • @tylerfusco7495
      @tylerfusco7495 Рік тому +18

      Ah yes, a fellow geometric algebra enthusiast! It's really a shame too, seeing as how the "tilt product" matrix times a vector is exactly equivalent to taking the dot product with the bivector, i.e. ⟨u×v⟩x = x•(u∧v), and in my opinion that's a MUCH more elegant and geometrically enlightening way to do it that doesn't involve matrix manipulation 😢

    • @nad2040
      @nad2040 Рік тому

      I found this video two days ago. GA seems great. ua-cam.com/video/60z_hpEAtD8/v-deo.html

    • @morphocular
      @morphocular  Рік тому +20

      I had a feeling my choice to use matrices might bother some in the GA scene :)
      As a person who knows a little GA but not a lot, perhaps you (or anyone else) can answer a question I had: One of the reasons I went with the matrix approach in this video was because I was seeking a version of Euler's Formula that was as similar as possible to its original complex number version, particularly in that rotations could be performed by simply multiplying your vector by a single object. As far as I've been able to see, GA performs vector rotations by sandwiching your vector between two versions of a half rotor---a technique that I'm sure has its merits, but isn't quite what I was looking for. So I'm curious, do you know if GA has a way to represent rotations with just a single multiplication by an exponential expression?

    • @angeldude101
      @angeldude101 Рік тому +14

      @@morphocular Because multiplication is composition, not application. For 2D complex numbers, these are equivalent (and related to them being commutative), but in 3D, applying one rotation, then applying another gives a new rotation distinct from having applied one to the other. Since composition is the only primitive you have access to, applying a rotation is equivalent to performing one rotation, then the other, and then _undoing_ the first. That's why it uses the sandwich product. For unit quaternions/rotors, "undoing" is equivalent to using the complex conjugate, or for rotors: the "reverse." In general though, it's instead the inverse, because performing an action, and then immediately undoing it, is equivalent to having done nothing at all: the identity 1.
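      A bare-bones sketch of that sandwich product with plain quaternions (my own code, using the (w, x, y, z) convention; none of it comes from the video):

          import numpy as np

          def qmul(a, b):
              # Hamilton product of quaternions stored as (w, x, y, z)
              w1, x1, y1, z1 = a
              w2, x2, y2, z2 = b
              return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                               w1*x2 + x1*w2 + y1*z2 - z1*y2,
                               w1*y2 - x1*z2 + y1*w2 + z1*x2,
                               w1*z2 + x1*y2 - y1*x2 + z1*w2])

          def conj(q):
              return q * np.array([1., -1., -1., -1.])

          # Half-angle rotor for a 90° rotation about the z-axis
          theta = np.pi / 2
          q = np.array([np.cos(theta / 2), 0., 0., np.sin(theta / 2)])

          # Sandwich: embed the vector as a pure quaternion, then compute q v q*
          v = np.array([0., 1., 0., 0.])                 # the x-axis
          print(np.round(qmul(qmul(q, v), conj(q)), 6))  # [0. 0. 1. 0.] -- x rotated onto y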

    • @angeldude101
      @angeldude101 Рік тому +14

      ​@@morphocular GA can only represent rotations as a one-sided multiplication if the vector is completely inside the plane of rotation/is completely orthogonal to the axis of rotation. This is trivially true in 2D GAs like G(2), G(1,1), G(1,0,1), etc, but it can also be true for select elements in higher dimensional Geometric Algebras.
      However, since rotations are preferred to be represented as compositions of two reflections, which are still double sided even in the lower dimensional algebras, it's actually preferred to use the 2-sided rotations even in 2D even if it isn't strictly necessary.
      That's right: drink the GA kool-aid hard enough and you become convinced that we've been using complex numbers wrong; that we've been ignoring the fact that we've been composing rotations this whole time and never actually rotating anything with them.

  • @Ihab.A
    @Ihab.A Рік тому +2

    Saying GREAT VIDEO does not even come close to expressing my gratitude and the joy of watching this video. THANK YOU! Subscribed!

  • @Gus-AI-World
    @Gus-AI-World Рік тому +5

    My God! You made my head spin in joy. I'll be watching this many times more. Thanks a million 👍

  • @costa_marco
    @costa_marco Рік тому +4

    Finally I got it! Thank you so much for your insight on the TILT operation. It makes so much more sense to understand that the cross product is just a coincidence in 3D space.

  • @stereotypical-taco
    @stereotypical-taco Рік тому +17

    wake up babe, new morphocular vid just dropped!

  • @tedsheridan8725
    @tedsheridan8725 Рік тому +10

    What a cool video! I'd love to see you connect the matrix representations with the quaternions and/or Clifford algebra in a future video.

  • @General12th
    @General12th Рік тому +7

    Hi Morph!
    This is a great video. And it makes me even more excited to see the follow-ups!

  • @42f87d89
    @42f87d89 Рік тому

    The explanation of why higher-than-2D rotation is not commutative, being the same as why multiplication of the higher-than-"2D" rotation objects is not commutative, is absolutely brilliant.

  • @elliotbasem5850
    @elliotbasem5850 Рік тому +3

    Wonderful timing on this set of videos! I just finished a course in spacecraft dynamics and am putting together a note set for myself that goes a little deeper into 3D rotation formalisms like the Simple Rotation Theorem/Rodrigues's rotation formula and the Euler axis-angle formulation, so I'm excited to see what the next video will be like.

  • @thomasjefferson6225
    @thomasjefferson6225 Рік тому

    I studied these problems from old exams, nice to learn it now!!😊
    Please, more of this!!! It's beautiful!!!!

  • @enbyarchmage
    @enbyarchmage Рік тому +12

    This video is A-FREAKING-MAZING! 🤩🤩 Maybe the best of the whole channel up until now, and that's saying something.

  • @howdy832
    @howdy832 Рік тому +3

    I was hoping this would be a geometric algebra video and was disappointed when you turned to matrices, but this approach is so delightfully similar!
    The GA equivalent of your projection matrix is the bivector u ^ v (u wedge v), which is formed by 0.5(uv - vu). The projection/tilt operation is done with the dot product, r . (u ^ v), and rotation is achieved by exponentiation: to rotate r by angle t in the xy plane, the rotor is exp(t * x^y/2) and r' = exp(-t x^y/2) * r * exp(t x^y/2). GA has 1/2 everywhere instead of double-computing a symmetric matrix, though it can't avoid the sandwich product like your matrices can.

  • @TheJara123
    @TheJara123 Рік тому +1

    OOh, what a nice voice, nice math, nice pace and the right blend of a sense of wonder, all coming together to make it a fantastic complex ride!!

  • @Kram1032
    @Kram1032 Рік тому +14

    Honestly the Multivectors approach is the easiest to understand. Quaternions is Multivectors in disguise, and Multivectors also neatly generalize to include other symmetry transforms (translations, screw motions) in basically the same form. They do everything, ultimately, in terms of repeat reflections, and the product is really easy to understand.
    One of the nicest things about them is, that they make transparently clear the otherwise mysterious distinction of a "polar" vs. "axial" vector.
    Quaternions aren't nearly as hard as they are made out to be, when viewed through the multivector lens, where they end up simply being a subset (with a slightly different sign convention which is an arbitrary choice anyway)
    Matrices look far more mysterious in comparison.
    In fact, much of what you did there amounted to the typical bivector-based rotors, but, imo, your tilt product is still somewhat mysterious the way you explained it, as you only gave us *that* these decompositions are possible and *that* these matrices have that form, but not really why that is or how to arrive at it.
    Even so, very nice video

    • @michaelwang1730
      @michaelwang1730 Рік тому +2

      Alright, I'm looking forward to your video explaining Multivectors the intuitive way!

    • @Kram1032
      @Kram1032 Рік тому +5

      @@michaelwang1730
      This is obviously very abbreviated and short on detail as it wouldn't reasonably fit into a comment, but the basics are roughly this:
      it just starts with the multiplication with three pretty simple rules:
      1) x² = 1 (etc.)
      2) x y = -y x (etc.)
      3) the distributive law holds, so (a+b)(c+d) = ac + bc + ad + bd
      For convenience in R³, the trivector xyz is called i. It's easy to confirm by explicit calculation that it commutes with all elements and squares to -1.
      That way, the complex unit arises completely naturally, in a geometric fashion, rather than having to declare a new unit out of nowhere as would normally be done.
      It also explains why that space spanned by {1, i} works so well as an alternative to a space spanned by {x, y}, but the same isn't true for higher dimensions. It comes down to combinatorics (i.e. counting how many base directions there are for each grade) and the geometric dual.
      By simply following that, you get your regular polar vectors as vectors {x, y, z} and your axial vectors as bivectors {x, y, z} i = {yz, zx, xy}
      This, to me, is one of the biggest wins of this view, explicitly distinguishing between polar and axial vector, really showing by simple construction what this difference arises from. Regular vector algebra really makes that an odd peculiarity of R³
      The sandwich product
      - r v r
      reflects the vector v in the vector r
      and if you do it twice in a row, you get a rotation.
      The sandwich product as meaningful geometric entity, describing a symmetry transform of the inner object about or along the outer object is rather nice and intuitive too.
      From there it's a pretty similar construction as in the video to find the exponential form, using the taylor series definition of exp(X) to get the continuous rotation.
      In full generality it can be very abstract. But for any given specific space, such as R³, it's really not that hard to grasp.
      The only initially weird thing is that you get objects with multiple grades, i.e., for R³, in general
      s + V + B + i t
      a scalar, a vector, a bivector, and a trivector all in a single number, using 8 numbers total.
      But that just naturally arises from the three simple rules I started with, and really isn't much of an issue in the end and, imo, isn't really stranger on the face of it, than introducing matrices and what not.
      The only reason matrices are seen as less strange is because they tend to be taught relatively early (i.e. often before university) whereas Geometric Algebra tends not to be taught at all (i.e. even university often doesn't cover this if you don't go into the right courses). But as far as I can tell, this is mostly an accident of historic happenstance.
      I'm willing to bet that it would actually be easier to teach kids the basics of geometric algebra (restricted to 2 or 3 dimensions at least), than it is to teach them matrices and vector algebra, especially when it comes to rotation.
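      Those three rules really are enough to compute with directly. A throwaway Python sketch of them (entirely my own; blades are just sorted strings of basis letters), confirming that i = xyz commutes with x and squares to -1:

          from itertools import product

          def simplify(blade):
              # Rule 2: swap adjacent letters, flipping the sign, until the letters are sorted
              sign, letters = 1, list(blade)
              changed = True
              while changed:
                  changed = False
                  for k in range(len(letters) - 1):
                      if letters[k] > letters[k + 1]:
                          letters[k], letters[k + 1] = letters[k + 1], letters[k]
                          sign, changed = -sign, True
              # Rule 1: a repeated letter squares to 1 and drops out
              out = []
              for c in letters:
                  if out and out[-1] == c:
                      out.pop()
                  else:
                      out.append(c)
              return sign, "".join(out)

          def gp(a, b):
              # Geometric product of multivectors stored as {blade: coefficient}, using rule 3 (distributivity)
              result = {}
              for (ba, ca), (bb, cb) in product(a.items(), b.items()):
                  sign, blade = simplify(ba + bb)
                  result[blade] = result.get(blade, 0) + sign * ca * cb
              return {k: v for k, v in result.items() if v != 0}

          x, y, z = {"x": 1}, {"y": 1}, {"z": 1}
          i = gp(gp(x, y), z)            # the trivector xyz

          print(gp(i, i))                # {'': -1}   -> i squares to -1
          print(gp(i, x), gp(x, i))      # {'yz': 1} twice -> i commutes with x, and i*x = yz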

    • @muenstercheese
      @muenstercheese Рік тому +1

      i would LOVE to see "Quaternions is Multivectors in disguise," explained more. i've always thought of them as extensions of complex numbers rather than having any relation to linear algebra, so seeing the link between the two representations would be super satisfying

    • @Kram1032
      @Kram1032 Рік тому +4

      ​@@muenstercheese that part is quite easy:
      The imaginary part of a quaternion is simply a bivector.
      The scalar part is the scalar part.
      So the quaternions are the "even-graded subalgebra" of multivectors built on R³.
      A "grade" is basically the kind of geometric structure you are looking at.
      grade 0 is scalars
      grade 1 is vectors (corresponding to 1D structures such as lines)
      grade 2 is bivectors (corresponding to 2D structures such as planes)
      grade 3 is trivectors (corresponding to 3D structures. In R³ there is, up to scaling, only one of those. It's therefore often called a "pseudo-scalar" and identified with i)
      So if you just take the even ones, grades 0 and 2, scalars and bivectors, you already are working with quaternions.
      I think due to sign conventions, the y-axis (and its corresponding bivector zx) gets flipped between the two. But otherwise it's identical and you'd use it in exactly the same way.
      so the quaternions' i, j, k are, in the multivector view, ix, -iy, iz or, equivalently, yz, xz, xy
      (that equivalence can be straightforwardly derived through explicit calculation too. For instance:
      i * x = (by definition i = xyz)
      xyz * x = (anti-commutativity)
      - yxz * x = (anti-commutativity)
      yz x * x = (x² = 1)
      yz
      Literally just applying the rules I mentioned in my previous comment.)

    • @angeldude101
      @angeldude101 Рік тому +1

      @@Kram1032 One of the most important aspects of Geometric Algebra can be demonstrated with nothing more than a piece of paper with a drawing on it. You can reflect the drawing by flipping the paper over across a line on it, and then you can do it again across a different line and watch how the drawing rotates around the intersection of the two lines by twice the angle between them. If the lines don't intersect, then the drawing instead doesn't rotate, but rather translates perpendicularly to the two lines by twice the distance between them. There's also a more subtle aspect of GA being used here which is the relation between reflections in 2D being represented as 180° rotations in 3D. This is because the 2D PGA being demonstrated is actually embedded as an even-subalgebra of 3D PGA; a relation that can be done for any Geometric Algebra. (The even subalgebra of every GA is another GA. Every GA is the even subalgebra of another GA.)
      All that from flipping a piece of paper; something someone in _kindergarten_ can understand even if they don't understand the significance of it yet.

  • @meson2439
    @meson2439 Рік тому +1

    Awesome. I had been confused about the rotation matrix for a long time before this.

  • @infernape716
    @infernape716 Рік тому

    This channel deserves way more subscribers

  • @Degenerac1ng
    @Degenerac1ng Рік тому +3

    MORE! WE NEED MORE OF SIMILAR STUFF

  • @scptime1188
    @scptime1188 Рік тому

    Clarifies a lot of my understanding of matrix mechanics in QM. Thank youuu

  • @DokterKaj
    @DokterKaj Рік тому +9

    Ooh, new intro animation! The text is finally centered lol

    • @morphocular
      @morphocular  Рік тому +3

      At the cost of the "O" no longer being centered, unfortunately. But actually, the main reason for the new logo was to make the "O" no longer oversized, which was bothering me :)

  • @IanBLacy
    @IanBLacy Рік тому +1

    This is brushing so uncomfortably close to abstract algebra that I’ve been picking up terms from lately and I’m dreading trying to wrap my head around it. Thankfully, I’ll never have to do it in school. Sadly, I’ll probably never get to do it in school

  • @parzh
    @parzh Рік тому +4

    Props to this guy. He single-handedly invented rotation. Way to go!

  • @TheJara123
    @TheJara123 Рік тому

    What a delightful video, superb math, superb animation, and superb pace, superb narration and to top that superb voice with right speed of articulation..so to say
    Thank you is understatement..
    Thank you very much...for enriching our Intellectual joy..
    Please post more!! More!!😂

  • @jakoblenke3012
    @jakoblenke3012 Рік тому +5

    Nice work!
    Funnily enough, I just had to prove that det(e^(Rt)) = e^(tr(R)*t) in my university course Higher Mathematics II, which is why I knew something couldn't work out as planned at 13:40 :D
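    That identity is easy to spot-check numerically (a sketch of mine using NumPy/SciPy; the random matrix and the value of t are arbitrary):

        import numpy as np
        from scipy.linalg import expm

        rng = np.random.default_rng(0)
        R = rng.standard_normal((4, 4))   # works for any square matrix, not just rotation generators
        t = 0.7

        lhs = np.linalg.det(expm(R * t))
        rhs = np.exp(np.trace(R) * t)
        print(np.isclose(lhs, rhs))       # True: det(e^(Rt)) = e^(tr(R) t)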

  • @GlortMusic
    @GlortMusic Рік тому +2

    Awesome concepts with stunning visuals! They must have cost a lot of time for sure, but I'm really glad to see that it paid off very well!
    P.S. I want to see a Wikipedia page about Tilt Products now.

  • @dennisbrown5313
    @dennisbrown5313 Рік тому

    Overall an excellent presentation. Very logical and a really good approach, linking the various subjects and showing what they are 'doing' graphically.

  • @AriKath
    @AriKath Рік тому +2

    i am so excited whenever you drop another gem!

  • @thefunpolice
    @thefunpolice Рік тому

    Regarding exponentiation, there is a lovely theorem involving the spectral decomposition of a matrix, A, and the exponential exp(A).
    If U is a matrix whose columns are a set of ordered eigenvectors of A and (bending conventional notation a bit for the current purpose) U* is the inverse of U... and suppose also that we create some matrix called E, which is a diagonal matrix containing the appropriately ordered eigenvalues of A, then,
    A = UEU*
    Is the so-called _spectral decomposition_ (or eigenvalue decomposition) of A.
    The theorem of interest is the following, exp(A) = U (exp(E)) U*
    This is, of course, still valid for a scalar multiple, ø, of A, namely exp(Aø) = U (exp(Eø)) U* so that you don't need the Maclaurin series for exp(A). It's easier to use Maclaurin of course but it's nice that there's another method to serve as a check on the result.

  • @salvadorvillarreal1643
    @salvadorvillarreal1643 Рік тому +1

    I'm surprised you didn't touch upon Lie groups and algebras in this video, I really expected that when you started introducing generators. It's really interesting to see this topic developed from a different perspective.

  • @kono152
    @kono152 Рік тому +2

    Small comment about the tilt product notation: since a lot of textbooks use those brackets for the dot product, it might be better, to avoid confusion, to use something like \triangleright (which, as far as I know, nobody uses).

    • @morphocular
      @morphocular  Рік тому +1

      Ironically, the resemblance is actually part of the reason I chose to use angle brackets, since the tilt product involves outer products, kind of making it the complement of the inner product. I hoped it wouldn't be too confusing though, since inner product notation uses a comma as the operand separator, whereas the tilt product uses a cross ¯\_(ツ)_/¯

  • @celadon2048
    @celadon2048 Рік тому

    Yo, are you sudgylacmoe? You sound like him on a different mic, I swear, and your explanations are just as crystal clear! So glad to be subscribed to both channels. ❤

  • @jacobgluhcheff5569
    @jacobgluhcheff5569 Рік тому

    Commenting for engagement, thanks for the great visuals and clear explanations

  • @bidyut9873
    @bidyut9873 Місяць тому

    Superb lecture. Keep creating more

  • @hifriend7789
    @hifriend7789 Рік тому +2

    I understood probably 25% of the video. Still watched until the end bc I love math

  • @Sanchuniathon384
    @Sanchuniathon384 Рік тому

    This was great work, I really look forward to the connection of sines and cosines to the tilt product in higher dimensions.

  • @lostythesecond3916
    @lostythesecond3916 9 місяців тому +3

    why did I look at the thumbnail and think "Wow math is on rule 34 now"

  • @Sciencedoneright
    @Sciencedoneright Рік тому +6

    Amazing video, goes to show how incredibly useful matrices are ❤

  • @DeclanMBrennan
    @DeclanMBrennan Рік тому +1

    That was excellent. I really like your coining of the "tilt product". I love that it works as is in higher dimensions without having to pull in the whole of geometric algebra. The cross product always seemed a bit hand crafted for 3D.
    I've played around with animating folding polyhedrons in the past. Whenever I finally get some free time again, applying the tilt product to folding polytopes might be fun. But looking at the result might cause a bit of brain tilt :-)

    • @schmetterling4477
      @schmetterling4477 Рік тому

      The cross product isn't "hand crafted" for three dimensions: it simply does not exist for dimensions other than three and seven (with a slight change in its properties). It's just one of those things where n matters big time. It's not a choice that can be made without losing major structural elements.

    • @angeldude101
      @angeldude101 Рік тому

      @@schmetterling4477 That's because the 3D and 7D cross products are just quaternion and octonion multiplication respectively while ignoring the scalar terms. A more general way to describe cross products is something called a "lie algebra" (pronounced "lee"), of which none exist for 7D. There are however lie algebras and corresponding "cross products" for 6D, 10D, 15D, 21D, 28D... (N choose 2)D. You may or may not guess that these particular lie algebras have more to do with N-dimensional geometry than their own dimension. If you did, then you'd be right.

    • @schmetterling4477
      @schmetterling4477 Рік тому

      @@angeldude101 That's correct, but there is no other such algebra that preserves the physics of the cross product (I am a physicist, so to me this matters). Physics as you know it happens exclusively in three dimensions. One can, of course, build general antisymmetric algebras for many more cases, but they don't go from vectors to vectors the way the cross product does as far as I know and they don't have the meaning of a Laplace-Runge-Lenz vector if applied to position and momentum (which, of course, derives from the symmetries of the four dimensional hypersphere).

    • @angeldude101
      @angeldude101 Рік тому

      @@schmetterling4477 _The_ 6D version doesn't preserve physics as we know it, but _a_ 6D version _does,_ actually 2 of them. One describes linear and rotational motion in Galilean Relativity, the other in Special Relativity. _The_ 6D version _would_ preserve physics if the universe had spherical or elliptic spacetime geometry, but that would also allow for smooth transitions to backwards through time motion. (Alternatively, it could just mean 4 spatial dimensions.) Technically the 3D cross product _only_ describes rotational motion and can't represent linear motion at all unless we were talking about 2+1D spherical spacetime.
      Technically all of those are just parts of larger 8 or 16 dimensional algebras. It's also not a coincidence that 3 of those 6 dimensions correspond with the usual 3 linear directions, and the other 3 correspond with the 3 rotational directions.

    • @schmetterling4477
      @schmetterling4477 Рік тому

      @@angeldude101 I am not aware that Kepler-like closed orbital motion exists in four spatial dimensions, though. One can map the four dimensional harmonic oscillators to the three dimensional Kepler problem by transforming the time variable (which unfortunately is not a physical operation because time is not actually a dimension), so mathematically there is a somewhat strange correspondence, but it is not a simple one to one mapping, if I remember. A student friend of mine did some work on the Kustaanheimo-Stiefel transformation a long time ago, but I could never wrap my mind around it.
      There is no need to treat linear motion the same way as rotations because technically linear motion is not even motion in a sense. In physics linear motion is caused by the absence of physical interaction. All systems in linear motion form equivalence classes, so to speak, but all of this is only "nice" in flatland. If we drill down deeper, however, as soon as we get to general relativity and gravity becomes a field theory, the notion of linear motion on geodesics breaks any and all local conservation laws at the level of the particle/trajectory approximation anyway because it induces gravitational waves (deformations of the background manifold). At that point all the nice algebra that we have built up so far goes to hell. You can see that in the difficulty to even define something as "simple" as "total mass-energy" for a strongly gravitating system.

  • @zathrasyes1287
    @zathrasyes1287 Рік тому

    Very good explanation of the subject! Fun to watch. Thx.

  • @ErickBuildsStuff
    @ErickBuildsStuff Рік тому

    Please do the follow up video faster. I would like to know the next as it would be super helpful to learn Bloch spheres for my quantum course 😅 I seem to understand faster with the way you explain.

  • @alexbennie
    @alexbennie Рік тому +1

    Around 9:30, after actually paying attention (not just seeing the same expansion of e for the umpteenth time):
    I think I finally understand the "why" part for spirals in fractals!
    I imagined a plot of (n, T_n) for the infinite sequence, where T_n is the nth term containing an A = exp(iθ)... Raising the exponent to increasingly higher orders increases the angle of rotation.
    Fractals (the usual/popular ones) rely on iteration, i.e. increasing exponents on the seed value.
    I would love to know if this train of thought is correct, since 90% of "fractal"-related videos out there are surprisingly either Number Theory related (too scary sometimes), "Ooh! Pretty math!" videos, or Biology-meets-philosophy pop-science videos.
    The few out there that do tackle the reasons behind the geometric structure of fractals (e.g. why do the tendrils in the Mandelbrot set make spirals, and not wobbly lines?) are unfortunately not really in the 'hobbyist' category.
    So, yeah. This channel really is awesome in the simplicity of presenting massive math!
    Awesome video as always!

  • @rrr00bb1
    @rrr00bb1 Рік тому

    Actually, the fact that complex numbers commute is the quirk. If you include directions in space in the algebra, a pair of orthogonal directions multiplied together squares to -1, but the directions themselves do NOT commute. Directions in space square to 1. What is happening is that the product of parallel vectors only has a scalar part, and the product of perpendicular vectors has a bivector part. Anything in between is the sum of the two, which is a complex number. Ex:
    // counter-clockwise loop of the same representation of directions
    right * up = up * left = left * down = down * right
    =
    right * up = up * -right = -right * -up = -up * right
    =
    right * up = -(up * right) = right * up = -(up * right)
    So that directions anti-commute. You get a minus sign when you swap them. These create complex numbers and geometry, etc.
    right up right up
    =
    right * -(right up) * up
    =
    - (right right) (up up)
    =
    -1
    ex: i = (right up)
    i^2 = -1
    using (right up) gives us a counter-clockwise rotation. Using (up right) gives us a clockwise rotation. But it happens to be that for a single plane, the anti-commutative part cancels out when you are just squaring the one rotation (right up), etc.
    This also means that there isn't a single thing called "i" that squares to -1. Many things square to -1, including different planes:
    -1
    =
    (right up)^2
    =
    (right forward)^2
    Notice that these are two different possibilities for "i". Their square is the same, but they represent different planes. If you do sin[x] and cos[x], you are implicitly operating on an ambiguous rotation plane.

    • @angeldude101
      @angeldude101 Рік тому

      Two parallel arrows have no bivector part when multiplied, but that's just because arrows are invariant under translation. More general reflections can be parallel while still having a bivector part of their product. Whether or not two objects are parallel depends only on the scalar part, not the bivector part.

  • @CognitiveOffense
    @CognitiveOffense Рік тому

    I learned a lot from this video. Thank you for making it.

  • @dAni-ik1hv
    @dAni-ik1hv 9 місяців тому +1

    had to go learn Linear Algebra to understand this; I now am infatuated with matrices.

  • @keyboard_toucher
    @keyboard_toucher Рік тому

    19:01 Taking v to be 90 degrees away from u is indeed a clever way to perform a 90-degree rotation from u towards v.

  • @idohalamit
    @idohalamit Рік тому

    Amazing video. Did you develop this?

  • @adrigor4461
    @adrigor4461 Рік тому

    Amazing video! Made my day better!

  • @ominollo
    @ominollo Рік тому

    I need to rewatch it but it was a very nice video 😊

  • @mikikaboom9084
    @mikikaboom9084 Рік тому +1

    Pls make a follow-up with quaternions

  • @anandvaz1421
    @anandvaz1421 Рік тому

    Thanks for the excellent video.
    As a matter of curiosity, I would like to know about the software that you are using for the presentation and the graphics.
    Thanks, once again!

  • @tszhanglau5747
    @tszhanglau5747 Рік тому

    Finally, a generalized Euler's formula

  • @ahmedtijanimusa4926
    @ahmedtijanimusa4926 Рік тому

    The explanation is good

  • @berserker1340
    @berserker1340 Рік тому +1

    Tbh I think quaternions would have been a better follow-up to the previous video, as they are a historically older tool than matrices and they represent angular velocities in 3 dimensions in much the same way that complex numbers do in 2D.

  • @jeppekjer6138
    @jeppekjer6138 Рік тому

    The entries in the matrix at 21:45 are all determinants of some 2x2 matrix that you can construct from u and v. The (i, j)'th entry is det([[ui, vi], [uj, vj]]). I think it's an easy way to remember it, and it also makes it such that the zero-diagonal occurs naturally. Now it's been too long since I dove deeply into linear algebra, so whatever the geometric interpretation of these determinants may be, I do not know. But it should scale seamlessly to higher dimensions than 3, liberating it from its cross product origins, as it also ended up in the video.
    To get a neat notation for this, we would of course need to invent something new. I like that in the video, the newly invented notation leans heavily towards established notation. This might be difficult to do while thinking in determinants.
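    A quick NumPy check of that determinant pattern, here in 4D (my own sketch; the sign convention may be flipped relative to the video's):

        import numpy as np

        u = np.array([1., 2., 3., 4.])
        v = np.array([0., 1., -1., 2.])

        # Entry (i, j) = u_i v_j - u_j v_i = det([[u_i, v_i], [u_j, v_j]])
        T = np.outer(u, v) - np.outer(v, u)

        i, j = 1, 3
        print(np.isclose(T[i, j], np.linalg.det([[u[i], v[i]], [u[j], v[j]]])))  # True
        print(np.allclose(np.diag(T), 0))  # True: diagonal entries are determinants with repeated rows, hence 0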

    • @schmetterling4477
      @schmetterling4477 Рік тому

      The cross product does not extend to higher dimensions. There is a similar product in seven dimensions, it seems, but that's it. This is probably one of the reasons why nature gave us a three dimensional space and not some other dimension: in three dimensions angular momentum is a very special physical property. Dynamically this expresses itself in a rather dramatic way: there are no stable n-dimensional planetary orbits, other than in two and three dimensions. See Bertrand's theorem for a rather strong limit on orbital stability of central potentials (in 3 as well as n dimensions). So the next time when a physicist talks to you about 10 or 11 dimensional spaces as part of string theory etc., keep this in mind: no matter what we do in the microscopic world, it always has to collapse to three dimensions at the end for macroscopic dynamics, otherwise the universe goes off the rails... literally.

    • @angeldude101
      @angeldude101 Рік тому

      ​@@schmetterling4477 Except we _don't_ live in a 3D universe. We live in a 4D universe where we usually just ignore 1 of them because it's different from the other three. The cross-product only existing in 3D is because 3D is the only dimension where the number of rotational degrees of freedom is the same as the number of linear degrees of freedom. The 7D cross-product is the result of doing octonion multiplication without the scalar term.
      4D actually has 2 different analogs to the cross product, but neither give 4-dimensional linear vectors as outputs. One takes two 4D linear vectors and then gives a 6-dimensional rotational object, while the other takes two of those 6D objects (known as bivectors) and gives another bivector. While 4D bivectors have 6-components (or "are 6D"), they still exist entirely within 4 geometric dimensions. 5D also has cross-product like operators, but there the results have 10 components each.

    • @schmetterling4477
      @schmetterling4477 Рік тому

      @@angeldude101 Time is not a dimension. Only one theory treats it like a pseudo-dimension and that theory can't even explain why matter exists at all. In all other theories in physics time is an irreversible local order parameter. Math is great, but it is not physics. Not even close.

  • @whilewecan
    @whilewecan 5 місяців тому

    Thank you. I learned a lot.

  • @axewellll
    @axewellll Рік тому

    you should use a 'text-align: left' to avoid the crazy jittering on the animation of 't = 3.43'

  • @Jaylooker
    @Jaylooker Рік тому

    The exponential function exp(A), with A being a skew-symmetric matrix, defines a vector field. This vector field could be taken to be a Hamiltonian vector field, which describes the phase space of classical mechanics. exp(A) sends a Lie algebra to a Lie group. Continuing, this Lie group can be taken as the gauge group. Hermitian matrices of quantum mechanics are skew-Hermitian matrices after multiplying by i. Note Hermitian matrices are symmetric matrices. Maybe this factor of i multiplying the Hermitian matrices is connected to the Wick rotation?

  • @DudeWhoSaysDeez
    @DudeWhoSaysDeez Рік тому

    these videos are great. keep up the good work!

  • @fungouslobster5123
    @fungouslobster5123 Рік тому +1

    this would be a great intro into linear lie groups👀

  • @6DAMMK9
    @6DAMMK9 Рік тому

    Welcome to computer graphics / computer vision, but somehow in the notation of the Im-Re plane lol
    All you need is some "transformation matrix" (affine transformation), or a "pinhole camera model" (perspective projection).
    However, it will be an (n+1)-dim matrix for an n-dim rotation.
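    A tiny sketch of that (n+1)-dim trick for n = 3 (my own example, not from the video): a rotation and a translation packed into one 4x4 homogeneous matrix.

        import numpy as np

        theta = np.pi / 2
        R = np.array([[np.cos(theta), -np.sin(theta), 0.],
                      [np.sin(theta),  np.cos(theta), 0.],
                      [0.,             0.,            1.]])   # 90° about z
        t = np.array([5., 0., 0.])                            # plus a shift along x

        M = np.eye(4)
        M[:3, :3] = R
        M[:3, 3] = t

        p = np.array([1., 0., 0., 1.])   # the point (1, 0, 0) with a homogeneous 1 appended
        print(np.round(M @ p, 6))        # [5. 1. 0. 1.] -- rotated, then translated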

  • @darkseraph2009
    @darkseraph2009 Рік тому

    Me, who has only ever learned how to 3d rotate with quaternions: Getting some real quaternion vibes from this.

  • @iphys8074
    @iphys8074 Рік тому +1

    Really nice. Can we have some references you've used please ?

  • @אלון-מ4ו
    @אלון-מ4ו Рік тому

    A. Thanks. great video!
    B. Can you add a link to a video or a website, like a TA session, where you solve questions / present examples with numbers?

  • @raajchatterjee3901
    @raajchatterjee3901 Рік тому

    This seems like an alternative explanation of the Lie algebra of the special orthogonal group of rotations, where a skew-symmetric matrix defines the tangent space?

  • @Darkness_7193
    @Darkness_7193 Рік тому +2

    To better understand what matrices are, imagine a Lego house. We want to rotate it 90 degrees. To do this, we take it apart into Lego blocks, rotate each block 90 degrees, and then put it back together. The matrix here is the transition from Lego blocks to rotated Lego blocks. You may remember Python dictionaries:
    Matrix = {
    block1 -> turned_block1,
    block2 -> turned_block2,
    ...
    }
    Now replace the Lego house with a vector, and the Lego blocks with the x, y, z hats. (By the way, this is another matrix.)
    There are many benefits to this approach. There are many repeating blocks in a Lego house (vector) [for example, the vector (3, 1) is x hat + x hat + x hat and 1 y hat], and each of them can be rotated by rotating one block. We can also rotate not only the house, but also other Lego structures (other vectors).
    Our notation is clear, but loosely linked to modern notation. Let's convert the matrix from the video at 13:37 into our notation:
    Matrix = {
    x_hat=(1, 0, 0) -> (0, 1, 0),
    y_hat=(0, 1, 0) -> (-1, 0, 0),
    z_hat=(0, 0, 1) -> (0, 0, 1),
    }
    As you can see, in modern notation the old blocks are omitted, as they are always x_hat, y_hat and z_hat. Also, the modern notation is rotated: each image vector becomes a column of the matrix.
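    For what it's worth, that dictionary picture runs as-is with a little glue code (a sketch of mine in Python/NumPy, reusing the same 13:37 mapping written above):

        import numpy as np

        # "Lego blocks" -> "turned Lego blocks"
        blocks = {
            (1, 0, 0): (0, 1, 0),     # x_hat -> rotated x_hat
            (0, 1, 0): (-1, 0, 0),    # y_hat -> rotated y_hat
            (0, 0, 1): (0, 0, 1),     # z_hat -> unchanged
        }

        def apply(blocks, vector):
            # Take the vector apart into blocks, turn each block, and reassemble
            out = np.zeros(3)
            for block, turned in blocks.items():
                out += np.dot(vector, block) * np.array(turned)
            return out

        # Modern notation: the image vectors become the columns of the matrix
        M = np.array(list(blocks.values())).T

        v = np.array([3., 1., 2.])
        print(apply(blocks, v))   # [-1.  3.  2.]
        print(M @ v)              # [-1.  3.  2.] -- the same rotation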

  • @evandrofilipe1526
    @evandrofilipe1526 Рік тому

    I've been acting like a spoiled brat in the comments, but this is simply because I love geometric algebra. I still enjoyed this video and I think you chose just the perfect background music. I'll watch stuff that I don't like if it's by you. ;)

  • @matemaicon
    @matemaicon Рік тому

    Thank you for the excellent video. Could you share some references about this topic?

  • @randfur
    @randfur Рік тому

    Fantastic stuff, learned a lot and didn't have to pause much.
    I'd love to understand more about wedges and bivectors with representing rotations as a goal.

  • @kianchoi177
    @kianchoi177 Рік тому

    Thanks for the great video!

  • @gat0tsu
    @gat0tsu Рік тому

    thanks for the great video

  • @barryzeeberg3672
    @barryzeeberg3672 Рік тому

    When performing interactive 3D rotations of a digitized image, one often uses a trackball as the input device. Can you derive how to relate the (arbitrary) rotation of a trackball to a mathematical/numerical model of how to correspondingly transform the digitized image?

  • @ابولفضلجهانی-ص2و

    very intuitive, god bless you

  • @AriKath
    @AriKath Рік тому

    i am really grateful, thank you!

  • @khiyar2287
    @khiyar2287 Рік тому

    Your videos are so good

  • @asadyezdan3858
    @asadyezdan3858 Рік тому

    Cube when it’s supposed to rotate
    “I’m bout to end this man’s whole career”

  • @tasnimul0096
    @tasnimul0096 Рік тому

    perfect sequel!

  • @bowlineobama
    @bowlineobama 7 місяців тому

    This subject is over my head. I am still trying to understand Euler's Formula.

  • @lucianchauvin8587
    @lucianchauvin8587 Рік тому

    Great video!

  • @marcelocampos665
    @marcelocampos665 Рік тому

    Beautiful.

  • @Scyth3934
    @Scyth3934 Рік тому +1

    yooo new video

  • @fangjiunnewe3634
    @fangjiunnewe3634 Рік тому +5

    15:52 The matrix with the z column 0 would not purely rotate the cube; it would flatten it into the xy plane first and then rotate it. Consider the corner of the cube at [1,1,1]: after t = 1 it should be at [-1,1,1], but instead it would be at [-1,1,0].
    Edit: nvm, I got it: the matrix itself would flatten the cube, but exp(matrix) will rotate it.

    • @F_A_F123
      @F_A_F123 Рік тому +4

      you should put that matrix in e^(Aθ) formula like on 9:44

    • @QweRinatrtY
      @QweRinatrtY Рік тому +1

      e^0 = 1

  • @ghhoward
    @ghhoward Рік тому

    Thank you!

  • @categorygrp
    @categorygrp Рік тому

    MORE OF THIS

  • @tilkesh
    @tilkesh Рік тому

    Thank you

  • @LucenProject
    @LucenProject Рік тому

    22:37 super small thing, but I believe the first term of the dot product shown on screen UxVy is meant to be UxVx.

  • @KangHyunChu
    @KangHyunChu Рік тому

    Fascinating!

  • @devrimturker
    @devrimturker Рік тому +1

    Poisson brackets in classical physics, commutators in quantum mechanics :)

  • @cxzuk
    @cxzuk Рік тому

    Thanks

  • @mathisnotforthefaintofheart

    2D rotation is the same as multiplying a complex number "equivalent" by "i"...it's true of course but I never thought about it that way in one line of thought

  • @gabrielj.9786
    @gabrielj.9786 Рік тому

    The tilt product can be expressed as a linear map:
    (u tilt v)(w) = Dot(w, u) v - Dot(w, v) u
    In 2D and 3D, the sum of two tilt products is a tilt product.
    But in 4D, the sum of two tilt products may not be a tilt product.
    For example, (i tilt j) + (k tilt l) cannot be simplified to a single tilt product (u tilt v).
    What does this non-simple sum of tilt products represent?

    • @angeldude101
      @angeldude101 Рік тому

      Since the tilt product behaves like the exterior product, and even gives the matrix form of a 2-blade, the matrices resulting from the tilt product add in a manner similar to 2-blades. You may have heard the term "bivector" and may be wondering why I'm saying "2-blade." If you have and are, bivectors are specifically sums of 2-blades. In 2D and 3D, every bivector is also a 2-blade, but 4D is where this stops being true. _Why_ does this happen? Several reasons, actually. A big part is in fact the phenomenon of double-rotations in 4D. In general, a rotation in N dimensions can have at most floor(N/2) distinct and independent axes. That's 0 in 0D and 1D, 1 in 2D and 3D, 2 in 4D and 5D, 3 in 6D and 7D, etc.
      The sum of two tilt-products / 2-blades in 4D or 5D gives "the axis" for a double rotation, encoding both independent axes in a single object. By exponentiating, this should give the full rotation matrix that does both parts of the rotation.
      Fun fact: it's possible to treat 3D _translation_ as a type of rotation in a 4th dimension. In this context, this bivector can be called a "screw axis," since when exponentiated, it gives a rotation about some axis combined with a translation along said axis, forming a screw motion. Mathematically, this behaves very similarly to a 4D double rotation.
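      A small NumPy/SciPy sketch of that (my own code; the sign convention for the tilt matrix may differ from the video's): exponentiating (i tilt j) + (k tilt l) with two different angles gives a single 4D rotation matrix that rotates the 12-plane and the 34-plane independently.

          import numpy as np
          from scipy.linalg import expm

          def tilt(u, v):
              # one convention for the tilt matrix: (u tilt v) w = (w . u) v - (w . v) u
              return np.outer(v, u) - np.outer(u, v)

          e1, e2, e3, e4 = np.eye(4)

          # A bivector that is NOT a single tilt product: different angles in two orthogonal planes
          A = 0.5 * tilt(e1, e2) + 1.3 * tilt(e3, e4)
          Rot = expm(A)

          print(np.allclose(Rot.T @ Rot, np.eye(4)), np.isclose(np.linalg.det(Rot), 1.0))  # True True
          print(np.round(Rot, 3))   # block diagonal: one 2x2 rotation block per plane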

  • @zackbarkley7593
    @zackbarkley7593 Рік тому

    The closest match to the Euler formula, whose advantage is to use the multiplication rules of complex numbers, would be the quaternions... so I wish you had covered that first.

  • @ko-prometheus
    @ko-prometheus Рік тому

    I am looking for mathematical theories and algorithms to describe the laws and regularities inherent in Metaphysics. I don't know if this knowledge is appropriate for this question?

  • @tariq3erwa
    @tariq3erwa Рік тому

    4:40 I see you are a man of multiculture as well