Joan Lasenby on Applications of Geometric Algebra in Engineering

  • Published 3 Jun 2024
  • Joan Lasenby - www-sigproc.eng.cam.ac.uk/Mai... - is a University Reader in the Signal Processing and Communications Group of the Cambridge University Engineering Department, and is a College Lecturer and Director of Studies in Engineering at Trinity College. Here's a list of her published work - publications.eng.cam.ac.uk/vie...
    In this episode we talk about Joan’s research into 3D reconstruction from multiple cameras and her interest in geometric algebra.
    The YC podcast is hosted by Craig Cannon - / craigcannon
    ***
    Topics
    00:00 - What's a tangible example of geometric algebra?
    1:20 - What is geometric algebra?
    6:15 - What resparked interest in geometric algebra?
    7:10 - Why is it important?
    11:00 - When did Joan start working on it?
    12:55 - Rotations
    16:45 - Computer vision in the early 90s
    19:00 - Joan's fellowship at the Royal Society
    23:00 - What's changed in computer vision since the 90s to allow for Joan's drone research?
    29:35 - Machine learning in computer vision
    31:20 - How Joan and her students are applying machine learning
    34:30 - Unifying qualities of geometric algebra
    40:30 - Joan's paper ending up on Hacker News
    45:00 - Where could geometric algebra take hold?
    47:00 - Running and mobility
    48:00 - Where to learn more about geometric algebra
  • Science & Technology

COMMENTS • 78

  • @JimSmithInChiapas · 5 years ago · +38

    LinkedIn has two GA groups: "
    Geometric Algebra" , and "Pre-University Geometric Algebra". We will welcome your participation.

  • @markrolfe6979 · 1 year ago · +4

    In my school experience (1970), vectors were introduced in the final years of high-school physics to model real-world quantities that have both a magnitude and a direction. Looking back now, Geometric Algebra offers a system of algebra that looks after the details naturally and automatically, in much the same almost magical way that integration and differentiation look after the details in calculus. It seems inconceivable that this is not introduced in high-school physics and mathematics.

  • @DrunkenUFOPilot · 2 years ago · +22

    Long-time fan of David Hestenes and his way of looking at the laws of physics through multivectors, and I've worked in computer vision... so naturally, I think this is a great interview!

  • @paulbloemen7256 · 1 year ago · +5

    Very interesting and enthusiastic video, thank you!
    I'm 72 years old and not very into mathematics, but for about two months I've been almost obsessed with geometric algebra, watching YouTube videos and taking notes to make sense of it all. I think it's amazing!
    As for study material, and as a kind of marketing effort, I wonder whether a publication for school kids is possible or available. Magic words: easy to understand, easy to apply. Those interested could then use it in whatever computer endeavour they are undertaking, and take that knowledge and experience to university later, having a head start on the subject. In the long run this may help give geometric algebra its rightful place in the mathematical world.
    If you know of such a publication, website, etc., please could you tell me?

  • @flaguser4196 · 1 year ago · +2

    It's like a new language, and, as with learning foreign human languages, the challenge is finding motivation given that you can already get things done in your native language. I'm still dabbling in it, as it's very interesting and not really that difficult; it's really just a matter of finding a use case that makes sense to me.
    She's honest and practical and doesn't oversell it. That makes her more credible.

  • @brandonmcfarland6678 · 5 years ago · +4

    Great discussion. Thanks Joan and YC!

  • @geoyoshinaka5251 · 2 years ago · +6

    What a wonderful interview! I sure would love to see a follow-up on the applications of Geometric Algebra in neural networks, robotics, etc. Thanks for the insight into the uses in computer vision!

    • @MiroslawHorbal · 1 year ago · +1

      I've been thinking about this for the past month. I got interested in the subject through the beauty of the algebra, but as an ML practitioner I'm curious whether we can bring the geometric product into neural-network activations.
      For example, rather than learning a matrix multiply feeding into a ReLU activation, can we learn an oriented hyper-cone onto whose interior input vectors are projected?
      The question is whether that allows for more expressive activation functions that actually improve model performance.
      There are probably also some interesting applications in image processing.

  • @biswajitmishra7341 · 4 years ago

    Great discussion, Joan. Going down memory lane... It was great to hear your simple yet effective discussion of the operations and usefulness of Geometric Algebra!

  • @bellinterlab8139 · 3 years ago · +4

    This was a genuine pleasure. Folks, it's Real.

  • @OnlyPenguian · 3 years ago · +5

    Very good interview. There is a whole extra mathematics and physics backstory that runs through characters like Cartan, Dirac, Chevalley, Atiyah, Bott and Shapiro, Porteous, Lounesto, and many others, and deals with Clifford algebras, spinors, their representations, their geometric properties, and their applications to topology, differential geometry and physics. Then come applications in coding theory, machine learning, computer vision, robotics, and engineering, as discussed here.

    • @Anonymous-by5jp · 2 years ago

      I'm fascinated by the history of mathematics - can the backstory involving the above characters be found in one place or do I need to seek out their individual bios?

    • @OnlyPenguian · 2 years ago

      The post-quaternion history stretches back to Grassmann in 1844, then Clifford, Lipschitz, Cartan, and the others I mentioned above. The first chapter of the book "Clifford Algebras and Spinor Structures: A Special Volume Dedicated to the Memory of Albert Crumeyrolle (1919-1992)" gives some of the history up to about 1992.

    • @Anonymous-by5jp · 2 years ago

      @OnlyPenguian Thanks - I'll look that up 😊

  • @FerroNeoBoron · 5 years ago · +3

    I was reading up on golang when suddenly, at 12:50, "Particularly things that involve rotations" got my attention. I've used quaternions and have an intuition for them, but the arithmetic is insane (though useful), and the extension chain real, complex, quaternion, octonion, ... is rather a lot, especially given their double cover of orientations.

  • @samirelzein1978 · 2 years ago · +1

    so glad to see this level of abstraction at an accelerator!

  • @howwitty · 5 months ago

    Thank you for this video. Trinity College never disappoints.

  • @reyramirez5695 · 5 years ago · +2

    Good explanations!

  • @BjrnRemseth · 5 years ago · +8

    Fascinating subject, and fascinating person :-)

  • @TAntonio · 5 years ago · +4

    With the quantum computing episode I was just barely keeping up. This episode left me in the dust at 0:15 😂

  • @Interspirituality · 2 years ago

    Joan is amazing!

  • @robertmcmorran6680 · 3 years ago · +2

    Wow, the applications to physics and Maxwell's equations.

  • @jesseghansah2001 · 5 years ago · +3

    Fascinating...

  • @emcgrotty · 2 years ago · +1

    Great interview. As Joan said at the end of the interview, GA is a powerful tool that gives people who have geometric insight an advantage. GA levels the playing field: it makes complex physics concepts more accessible to more people. Therein lies the power of this tool. As an engineer I am particularly interested and excited about the work coming out of the University of Cambridge in the area of "acoustic space-time".

  • @tombouie · 2 years ago

    Thks & she's sooooooo amazingly smart (!nullius in verba rules!).

  • @antoniusvanoosterwijck8368 · 1 year ago

    We need to introduce GA at secondary school now!

  • @michaelgonzalez9058 · 1 year ago

    Cambridge lines are an computer based area 3.14(2)

  • @lucagagliano5118 · 5 years ago · +2

    I still don't understand the advantage of such a framework compared to linear algebra. Does it simplify computations when implemented? Is there an instance where such an advantage is evident?

    • @minRef · 5 years ago · +3

      youtube search for "marc ten bosch rotors"

    • @minRef · 5 years ago · +8

      It certainly does simplify the math after a painful start. There's a web demo called Geometric Algebra Not Just Algebra (Google search for "ganja.js coffeeshop"). Note that a few dozen different areas of math (back-propagation, quaternion skinning, 3D surfaces, plotting, etc.) are each implemented in under 70 lines using the same generator function. The actual engine is less than 1200 lines. The Mandelbrot-set example has only 16 lines including comments. Other mathematical systems may represent a few things very succinctly, but eventually trip over themselves when they leave their domain. Considering that the proof of concept is just JavaScript and not NVIDIA PTX assembly or something, it's pretty impressive. It proves the point that the same system can represent all of these mathematical things in a naturally decipherable way (and not just in video games). There are tons of frameworks and tons of notations, but this is the most usable web thing I've seen.
      As far as native PC gaming goes, I'm not aware of any open-source GA graphics projects that are super impressive... yet.
      Unity and Unreal have a closed-source lighting plugin called Enlighten that is rumored to use GA for its lighting math, but I haven't seen the code.
      The best GA frameworks are yet to be written, given just how powerful the math is.
      If you enjoy physics, I suggest googling "C2 Clifford Algebra", first result.
      For a TL;DR version, search YouTube for "uni adel vector algebra war" and play it at 1.5x speed.

    • @lucagagliano5118 · 5 years ago · +1

      @minRef Having fewer lines of code doesn't necessarily mean having more efficient code. It is my understanding that GA can be implemented efficiently, but that still doesn't mean it would be any better than LA.

    • @minRef · 5 years ago · +7

      @Luca Gagliano Regarding "efficiency of LA": Clifford/geometric algebra can use trivial rules to literally create LA, as well as things like quaternions and Plücker coordinates, or implicitly create new theorems on the fly to simplify things and avoid brute-forcing. The idea being that the fastest code is code that doesn't exist. Linear algebra cannot generate Clifford algebra, and it struggles with automated theorem proving. That being said, nothing changes the fact that you still have to benchmark every real-time system and pick the best implementations for your finite set of target architectures. The number of lines wasn't about efficiency; it was a rough estimate of the relatively small variance in the complexity of expressions across a set of problems that varied significantly in type and dimension. I mentioned the length of the engine, and the fact that there's only one generator function, to show that the complexity isn't being hidden somewhere and appears to be emergent. If you click on the examples you'll see what I mean. This is the most obvious web-based demo of GA's strengths that I've seen, and it suggests that Hestenes's work to create a "unified language for mathematics and physics" is going in the right direction.
      Anyway, I would only suggest trying geometric algebra if you've ever been bothered by why certain linear algebra and quaternion theorems seem to come from nowhere. If so, I'd suggest chapter 4 of Lengyel's "Foundations of Game Engine Development Vol 1: Mathematics", or the YouTube lecture "A Bigger Mathematical Picture for Computer Graphics". I'm not saying you should ship this next week, but building math intuition ends up being useful in surprising ways. Have fun!

    • @andrewluscombe497 · 5 years ago · +8

      Linear Algebra does some parts of what Geometric Algebra does; so do vector algebra, tensor algebra, complex numbers, quaternions, and the Pauli and Dirac matrices. Geometric Algebra covers all of those things in a consistent way (a short worked example follows this thread).
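
A short worked version of the unification described in this thread, as a sketch in standard GA notation rather than anything stated in the interview: for vectors a and b, the geometric product packages the dot and wedge products together, and the same product builds the rotors that do the job of both rotation matrices and quaternions.

    ab = a \cdot b + a \wedge b
    R = e^{-B\theta/2}, \qquad v' = R \, v \, \tilde{R}

Here B is the unit bivector of the rotation plane. In 3D with B = e1 e2, the rotor R = cos(θ/2) − e1 e2 sin(θ/2) sends e1 to e1 cosθ + e2 sinθ, exactly what a rotation matrix or quaternion sandwich would give, but the same sandwich formula applies unchanged in any dimension and to objects of any grade, which is the consistency mentioned above.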

  • @kwccoin3115 · 1 year ago

    Having a maths chair as a dad helps a lot to avoid the rush and the uncertainty of being a young person who isn't yet sure. Moving from philosophy to maths to physics... good if you can do it, but hard in those days. And now.

  • @skeptical_Inquirer200 · 2 years ago

    The exterior algebra of differential forms is much better known than geometric algebra and also has been used to simplify Maxwell's equations. Can somebody recommend a reference where the two are related?

    • @numoru · 1 year ago

      "A geometric algebra reformulation of geometric optics" by Quirino M. Sugon Jr. and Daniel J. McNamara just came out.

  • @waystilos · 5 years ago

    What mics are those?

  • @aleksandarjankovski6542 · 10 months ago

    Mathematics is so beautiful.

  • @robfielding8566 · 5 years ago · +6

    O(2^(2n)) complexity in general multivector multiplication is an issue. I find the standard books extremely unclear on multivectors, making it seem like there are a lot of corner cases. But if you just look at a pair of multivectors that explicitly represent every part, then each multivector has 2^n components, and multiplying them is just the distributive law; the magic comes from the cancellations and anti-commutations when you compute (a b) = (a1 e1 + a2 e2)(b1 e1 + b2 e2)... Questions about whether something is a scalar, vector, bivector, etc. are answered by which parts come out as zero. I had to find this out myself by writing code to perform the general multivector computation (a minimal sketch of that raw product follows this thread), and I don't think any of the books or papers make it clear at all. Of course, the O(2^(2n)) complexity of general multivector multiplication can be handled behind the scenes with a sparse representation generated by a smart compiler.

    • @robfielding8566 · 5 years ago

      Note the 2^n parts... think of patterns of bits turning e1, e2 on/off: (0 e00 + a1 e01 + a2 e10 + 0 e11)(0 e00 + b1 e01 + b2 e10 + 0 e11)... Then the input and output of each multivector has 2^n scalars, and you distribute in 2^(2n) raw operations, or you need a clever sparse implementation. You also get problems with numerical accuracy, where something sometimes comes out as the wrong type because you subtract two numbers and don't get exactly zero for a component... suddenly you can have something that is "mostly" a bivector with a very, very tiny trivector part... and that infects everything.

    • @minRef · 5 years ago

      @robfielding8566 Regarding the quote
      "...suddenly you can have something that is "mostly" a bivector with a very very tiny trivector part... that infects everything"
      Very interesting (no sarcasm).
      Could you point me to an open-source GA codebase that suffers from this?
      Does this affect 5D conformal GA as well?
      (I agree that the multivector-related big-O is pretty nasty, and I agree that the books are still a mess and everyone has their own notation.)

    • @rrr00bb1 · 5 years ago · +1

      @minRef This definitely happened in my custom Go code, which uses a float64 for each of the slots in the multivector and ONLY does raw multivector multiplication, with no notion of higher primitives. I haven't uploaded it to GitHub yet. I'm sure the Python code will suffer from it as well:
      github.com/rfielding/gaMul/blob/master/gaMul.py
      It is literally just distributing multiplication over terms, where I had to sort the basis vectors to figure out the final sign before adding into the target terms. (My Go code is a bit better tested than this, but I haven't gotten around to uploading it anywhere yet.)
      For 3D rotation, for instance, ((b a) v (a b)): (b a) by itself has no issues, because it's (vector vector) -> (scalar + bivector), and you end up multiplying by input terms that are exactly 0.0. But when you multiply the (scalar + bivector) by the (vector), the trivector part can be non-zero for a while. Then you multiply that by (a b), another (scalar + bivector), and even though the trivector part should technically cancel out, it may not cancel to exactly 0.0 due to floating-point inaccuracy. With floating point, exact comparisons generally don't work. If you KNOW that the final result was multiplied by exactly 0, you can test parts for zero to determine the type. But if you are expecting to subtract two things and arrive at 0.0, it is sometimes off.
      In fact, subtracting sums of squares is a well-known numerically bad case when doing stddev calculations. If you square two things and subtract them, in just a few iterations the noise becomes larger than the signal. After many iterations, something that should be zero can come out as a completely random answer of any size, especially if the expression involves division. That is why it is common to use strange-looking recurrences for stddev calculations.
      So this is one of the reasons why a naive O(2^(2n)) multiplication of multivectors isn't as nice to use as a much more complicated, strongly typed implementation. I only did it because it's an exceedingly simple way to think about the geometric product.

    • @minRef · 5 years ago · +2

      @rrr00bb1 Cool stuff. I think it's a neat approach and worth exploring. I agree that the core should be written in a more strongly typed language to allow powerful, narrow contracts, and then wrapped in a more user-friendly OO language. I've thought about it a bit, and it sounds like the floating-point problem you are referring to might not be caused by the language but by oddities of IEEE 754 (I've seen similar problems in strongly typed languages). The solution is often to brush up on how to avoid getting stung by IEEE 754's edge cases (Google search "bitbashing comparing floating point numbers is tricky", first result). Knowing that, either 1) use "epsilons" or 2) rethink the algorithm/data structures to avoid the problem altogether.
      Alternative resource to the bitbashing dot io page: (Google search "comparing floating point number 2012", first result).
      You should be able to try this with Go, although I would personally write those modules in something like C just to reduce the chance that the problem is coming from anywhere other than IEEE 754 or the algorithm. If you're familiar with Python, Ben Shaw's tutorial on making C modules inside Python is good (YouTube search "ben shaw new zealand", first result). Hope one of these links helps.

    • @rrr00bb1 · 5 years ago

      @minRef When I talk about "strong typing", I'm actually not talking about the language at all. What I mean is that you use a bit string to represent a slot on the multivector where a value goes, such as e_000 for the scalar, e_001 for e1, e_010 for e2, e_011 for (e1 e2), etc. A "scalar" is a multivector for which every part is exactly 0.0 other than the e_000 slot. If you just multiply raw multivectors, some parts will come out slightly non-zero in some circumstances. But if you know you are going to perform (b a v a b), then you know the end result will be of type vector; therefore, do it as one computation where every slot is guaranteed to be 0.0 except for e_001, e_010, e_100. That's what I mean by "strongly typed". If you just do general multivector multiplication, you can't simply look at where the zeroes are to deduce the type.
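
A minimal sketch of the raw multivector product this thread describes, assuming a Euclidean signature and a dense list of 2^n coefficients indexed by basis-blade bitmask; it is written from scratch rather than taken from the gaMul repo linked above, and the function names and the 40-degree example are purely illustrative:

import math

def reorder_sign(a, b):
    """Sign picked up when the product of basis blades a and b (bitmasks)
    is reordered into canonical order; each swap of adjacent basis vectors
    flips the sign."""
    a >>= 1
    swaps = 0
    while a:
        swaps += bin(a & b).count("1")
        a >>= 1
    return 1.0 if swaps % 2 == 0 else -1.0

def gp(A, B):
    """Geometric product by brute-force distribution over all component
    pairs: the O(2^(2n)) cost discussed above."""
    C = [0.0] * len(A)
    for i, ca in enumerate(A):
        if ca == 0.0:
            continue
        for j, cb in enumerate(B):
            if cb == 0.0:
                continue
            C[i ^ j] += reorder_sign(i, j) * ca * cb  # e_i e_j = +/- e_(i XOR j)
    return C

def reverse(A):
    """Reverse ~A: a grade-k blade picks up the sign (-1)^(k(k-1)/2)."""
    out = []
    for i, c in enumerate(A):
        k = bin(i).count("1")
        out.append(c if (k * (k - 1) // 2) % 2 == 0 else -c)
    return out

# G(3) slots: 0=scalar, 1=e1, 2=e2, 3=e12, 4=e3, 5=e13, 6=e23, 7=e123.
# Rotate v = 1*e1 + 2*e2 + 3*e3 by 40 degrees in the e1e2 plane: v' = R v ~R.
half = math.radians(40.0) / 2.0
R = [math.cos(half), 0.0, 0.0, -math.sin(half), 0.0, 0.0, 0.0, 0.0]  # cos - sin*e12
v = [0.0, 1.0, 2.0, 0.0, 3.0, 0.0, 0.0, 0.0]
v_rot = gp(gp(R, v), reverse(R))

print([round(c, 6) for c in v_rot])
print("trivector residue:", v_rot[7])

In exact arithmetic the e123 slot (index 7) cancels, but in floating point it can come out as a tiny non-zero number, which is the "mostly a vector with a tiny extra part" problem described in the replies; grade tests therefore need an epsilon rather than a comparison with exactly 0.0. The output is roughly [0.0, -0.52, 2.17, 0.0, 3.0, 0.0, 0.0, ~0.0], with the last slot depending on rounding.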

  • @michaelgonzalez9058 · 10 months ago · +1

    Use. The matrix

  • @christopherwalsh3101 · 4 years ago

    *british accent* wow she already sounds really smart!

  • @BOON2785 · 8 months ago

    It's like legos that make complex processes visual and intuitive.

  • @petergoodall6258 · 2 years ago · +1

    “Point of view is worth 80 IQ points.” -Alan Kay

  • @feraudyh · 1 year ago

    Every time I mull over the limitations of the cross product, I get cross.

  • @kwccoin3115 · 1 year ago

    Guess separate those point, line with direction, plane with direction and volume. All in i.
    Guess no python library.

  • @michaelgonzalez9058 · 10 months ago · +1

    Pythagoras' theorem

  • @rileystewart9165 · 2 years ago

    There still is a cross product in 4D, right? Assuming basis vectors w, x, y, z: 2w cross 3y is 6z, no?

    • @DrunkenUFOPilot · 2 years ago · +1

      Cross products don't make much sense in 4D. W ∧ X = WX, a bivector. Any bivector has a "dual", in this case YZ. That might be the closest thing to a "right-hand rule", but it gives another bivector, not any kind of vector, not even a "polar" vector.

    • @rileystewart9165 · 2 years ago

      @DrunkenUFOPilot Hmm, also in my example you could interchange x and w, making them essentially the same, which creates problems. I'll have to look more into the so-called outer products. Interesting video.

    • @Dayanto · 2 years ago

      It depends on how you generalize its meaning. If you see it as an object orthogonal to a plane (bivector), you get another plane. If you think of it as a pseudovector (i.e. the dual of an (n-1)-dimensional object), then you get the vector perpendicular to a trivector volume.
      For 2D, that corresponds to finding the vector perpendicular to another vector.
      Vectors are only perpendicular to rotation in a plane in exactly 3D (since 2+1=3). The generalization you choose depends on whether you care about vectors or rotation.

    • @jamesharford9788 · 7 months ago

      There is no cross product in 4D. In 3 dimensions, the cross product of any two of the three orthogonal basis vectors is by definition the third basis vector. In 4 dimensions there are six pairwise products of basis vectors (wx, wy, wz, xy, xz, yz) but only four basis vectors (w, x, y, z). See the problem? (A short worked version of this count follows the thread.)
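
The dimension count in the last reply, written out as a short sketch in standard GA notation (not from the video): the product of two vectors naturally lives in the space of bivectors, and only in 3D does that space have the same dimension as the space of vectors.

    e_w \wedge e_x = e_{wx}, \qquad \{ e_{wx}, e_{wy}, e_{wz}, e_{xy}, e_{xz}, e_{yz} \}, \qquad \binom{4}{2} = 6 \neq 4

In 3D, C(3,2) = 3 matches the number of basis vectors, so the bivector a ∧ b can be traded for its dual vector, a × b = −I (a ∧ b) with pseudoscalar I = e1 e2 e3. In 4D the dual of a bivector is another bivector (the dual of e_wx is ±e_yz, depending on sign convention), so no vector-valued cross product of two vectors exists there.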

  • @thomasmonti2450 · 1 year ago

    Anghjulu clound juju-palteforne

  • @AkamiChannel · 3 months ago

    Wife of Anthony Lasenby?

  • @user-xedwsg · 5 years ago · +1

    nah..

  • @fkeyvan · 1 year ago

    hard to understand her accent

  • @oraz. · 2 months ago

    the guy's voice is too loud and he's constantly talking while she's talking.

  • @Troinik · 3 years ago

    What’s wrong with this guy’s voice?

  • @thomasmonti2450 · 1 year ago

    Nice tony tensorflow123