The unreasonable effectiveness of linear algebra.

  • Published 21 Nov 2024

COMMENTS • 245

  • @MichaelPennMath
    @MichaelPennMath  11 months ago +40

    To apply for an open position with MatX, visit www.matx.com/jobs.

    • @nripdave673
      @nripdave673 11 months ago +1

      Isn't this website really for math...?
      And is there any age restriction on this website for applying...?

    • @oni8337
      @oni8337 11 months ago +2

      Truly a video a representation theorist would've made

    • @AutoDisheep
      @AutoDisheep 11 months ago

      Thank you for this post. I didn't even know linear algebra could do so much. I knew it was great math, but I was just ignorant as to how great it is.

    • @jneal4154
      @jneal4154 11 months ago +1

      Wow. Thanks! Just the kind of work I'm looking for.

    • @jaideepshekhar4621
      @jaideepshekhar4621 11 months ago

      The explanations could have been clearer.

  • @felipelopes3171
    @felipelopes3171 11 months ago +569

    For people who want to know more, what Michael Penn is hinting at is called Representation Theory. One very popular line of attack to classify mathematical structures is to represent them as compositions of linear transformations in vector spaces. In many cases of interest, you can prove that if you cannot find a representation with certain properties, then it means that the thing you are trying to study does not have an important property. And since studying representations is much easier than studying the abstract structure, it simplifies things a lot.
    That's how Fermat's Last Theorem was ultimately conquered. They reduced the problem to the nonexistence of a given structure, and through some long arguments could reduce it to properties of the representations, which could be brute forced to prove no solution would exist.

    • @Wielorybkek
      @Wielorybkek 11 months ago +8

      "In many cases of interest, you can prove that if you cannot find a representation with certain properties, then it means that the thing you are trying to study does not have an important property"
      Is it some kind of theorem? Could you throw out some keywords so I could learn more?

    • @felipelopes3171
      @felipelopes3171 11 months ago +24

      @@Wielorybkek Look at the Cartan subalgebra of a finite-dimensional Lie algebra; in particular, it's defined in terms of a property of a representation of a Lie algebra.
      It's the properties of the Cartan subalgebra that allow us to classify finite-dimensional Lie algebras. This is widely considered one of the most powerful results in Lie theory.

    • @jplikesmaths
      @jplikesmaths 11 months ago +3

      Representation theory will be my thesis. Writing about rep theory of special linear groups.

    • @repbacchista
      @repbacchista 11 months ago

      That was a nice comment! Gonna look into it! Thanks!! =D

    • @emanuellandeholm5657
      @emanuellandeholm5657 11 months ago +6

      I'm reminded of linear filtering done in frequency-like domains instead of convolution in the original domain. Transform your filter kernel, transform your signal, pointwise multiply, inverse transform. Where "transform" is something like the FFT. You can do this operation in blocks and then there are various methods with various tradeoffs used to "stitch" together the inverse transform blocks.
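
The transform-multiply-inverse-transform recipe described above can be sketched in a few lines of NumPy (an illustrative example of mine, not code from the video; the padding length is what makes circular convolution agree with linear convolution):

```python
import numpy as np

# FFT-based linear filtering: transform the kernel, transform the signal,
# multiply pointwise, inverse transform.  Zero-padding both to length
# len(x) + len(h) - 1 makes the circular convolution match np.convolve.
def fft_filter(x, h):
    n = len(x) + len(h) - 1
    X = np.fft.rfft(x, n)
    H = np.fft.rfft(h, n)
    return np.fft.irfft(X * H, n)

rng = np.random.default_rng(0)
x = rng.standard_normal(256)   # signal
h = rng.standard_normal(16)    # filter kernel
y_fft = fft_filter(x, h)
y_direct = np.convolve(x, h)   # direct convolution in the original domain
```

The block-wise "stitching" mentioned above is the overlap-add / overlap-save family of methods (SciPy's `signal.oaconvolve` implements overlap-add).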

  • @godelianconfucianism8184
    @godelianconfucianism8184 11 months ago +788

    "If you can reduce a mathematical problem to a problem in linear algebra, you can most likely solve it, provided you know enough linear algebra". This was a quote in the preface of Linear Algebra and its Applications by the great mathematician Peter D. Lax. It was my first book on the subject and that sentence stuck with me ever since

    • @trevoro.9731
      @trevoro.9731 11 months ago +18

      But for real-life modelling, if you can reduce a problem to a problem in linear algebra, it very likely means that your understanding of the problem is wrong or you have made a huge mistake in your modelling.

    • @arthurswanson3285
      @arthurswanson3285 11 months ago +48

      @@trevoro.9731 Bro, people have sent rockets to other planets, built trillion-dollar search engines, and built the compression algorithms you're using to watch this YouTube video right now, all with linear algebra. Huh?

    • @trevoro.9731
      @trevoro.9731 11 months ago

      @@arthurswanson3285 First of all, you are talking about data abstraction, not modelling real processes. I'm not sure about rockets, as they involve modelling non-linear processes, as do electronics and other things.

    • @misterlau5246
      @misterlau5246 11 months ago +5

      🤓 My first linear algebra book was the very easy to read, Grossman 🤗🤗🤓🤓🤓

    • @pik910
      @pik910 11 months ago +1

      Wonderful sentence. I think of a lot of math as spatial analogies.

  • @michakuczynski2987
    @michakuczynski2987 11 months ago +162

    My quantum mechanics professor once mentioned that "we are very lucky that the fundamental laws of nature are expressed using the language of linear algebra". This video really changed my perspective on this matter...

    • @iyziejane
      @iyziejane 11 months ago +20

      An old school perspective on this is that classical mechanics rapidly gives rise to nonlinear differential equations, like the pendulum theta'' = -sin(theta), but the dynamics of quantum systems are always linear equations (time derivative of a state is equal to a matrix applied to the state). The traditional explanation is that the dimension of the matrix grows exponentially with the number of particles, and a compact nonlinear equation is in some sense better to work with than an exponentially large linear set of equations. But yes it could be that quantum mechanics is linear because that's the only part of it we can access (like a tangent approximation to a full theory, Hilbert spaces as approximations to Kahler manifolds).
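
The linear/nonlinear contrast in this comment can be made concrete with a toy two-level system (a sketch of mine, with an arbitrary Hamiltonian and hbar = 1): quantum dynamics is pure linear algebra, and Hermiticity forces the evolution to preserve the norm of the state.

```python
import numpy as np

# The pendulum obeys a nonlinear ODE, theta'' = -sin(theta); a quantum
# state obeys a *linear* one, psi' = -i H psi.  For Hermitian H the
# solution psi(t) = exp(-i H t) psi(0) comes from an eigendecomposition,
# and the resulting evolution operator is unitary (norm-preserving).
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])        # toy 2-level Hamiltonian (hbar = 1)
evals_H, V = np.linalg.eigh(H)     # H = V diag(evals_H) V^dagger

def evolve(psi0, t):
    U = V @ np.diag(np.exp(-1j * evals_H * t)) @ V.conj().T
    return U @ psi0

psi0 = np.array([1.0, 0.0], dtype=complex)
psi = evolve(psi0, t=3.7)
```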

    • @TheThreatenedSwan
      @TheThreatenedSwan 11 months ago +6

      I hate that way of thinking when people mean it literally. It's kind of like how the question "is light a wave or a particle?" doesn't make sense as a question. People take systematized conceptions that are useful for describing reality, which is immanent, and run with them, getting further and further away from the point of science. And since science has become high status, a bunch of people now rush in and refer to "the science", or the form of science (sometimes not even that), without the essence.

  • @aweebthatlovesmath4220
    @aweebthatlovesmath4220 11 months ago +134

    Algebra is a really important subject of math, and almost everything in algebra can be understood with linear algebra via representation theory. This makes linear algebra a really powerful tool!

    • @raphaelreichmannrolim25
      @raphaelreichmannrolim25 11 months ago

      If anyone here also likes Number Theory, look up the concept of Arithmetic Space which I invented in my book, Foundations of Formal Arithmetic.

    • @thirdeyeblind6369
      @thirdeyeblind6369 10 months ago

      @@raphaelreichmannrolim25 Is there an English version of your Masters Thesis? I am afraid the only copy I can find is in Brazilian Portuguese.

    • @raphaelreichmannrolim25
      @raphaelreichmannrolim25 10 months ago

      @@thirdeyeblind6369 Sadly, there isn't yet. I had the objective of translating it myself, but I haven't yet. The most fundamental concept exposed there is the nilpotent arithmetic space of order N. Its algebra behaves as a finite-dimensional projection of the infinite algebra of arithmetic operations. When I was researching this, I used this finiteness while applying abstract harmonic analysis and Gelfand theory to obtain trigonometric representations of functionals defined on the group of invertible arithmetic operations of these algebras. In particular, we can use this to represent any function obtained by additive and multiplicative convolutions, such as the Mertens function. However, despite the simplicity of the method, I didn't see how this could help us bound the Mertens function. I stopped working on the subject a few months ago.

  • @boomerzilean
    @boomerzilean 11 months ago +143

    I think when you learn linear algebra as a student you don't really get this impression, but it's an important thing to realise when you study mathematics: linear algebra is conceptually extremely easy and is basically "solved" as a subject.

    • @Rudenbehr
      @Rudenbehr 11 months ago +21

      When I was studying it, it felt like we were just doing different versions of the same matrix multiplication/addition/subtraction at the beginning of the course, but obscured through vocabulary and proofs.

    • @dougsherman1562
      @dougsherman1562 11 months ago +3

      So true, in my path to a physics degree, we really didn't spend lots of time on linear algebra. I was always fascinated by it and appreciate these videos. Once I retire it will be fun to study this again. Thanks for this video!

    • @Myrslokstok
      @Myrslokstok 11 months ago +3

      I passed linear algebra and I always wondered what real mathematicians do with representation theory. I had no idea I was so close!

    • @jessewolf7649
      @jessewolf7649 11 months ago +1

      Not true. Particularly applied linear algebra. "Applied" here still means pure mathematics: e.g., solving a linear system more efficiently than standard techniques under certain conditions. An extremely vibrant field, actually used in AI, for example (which is a true "application" of pure mathematics).

    • @LaughingBat
      @LaughingBat 6 days ago

      This is so wildly untrue that I can't tell if you're trolling or just never actually studied linear algebra beyond introductory material and have mistakenly concluded that the introduction is all there is.
      Either way I encourage you (and anyone else interested in math) to study linear algebra. It's an exceptionally fun research area where you're never lacking in support due to its utility both in and outside of mathematics.

  • @ke9tv
    @ke9tv 11 months ago +64

    I've done a lot of numerical analysis in a long career. I've long claimed that 90% of the job is finding the integral transform that maps your impossible problem into linear algebra, and then letting a computer do the linear algebra. If asked what piece of the subroutine libraries that I would re-implement first if I didn't have them, I'd have to say that it's the singular value decomposition. It's the Swiss Army Knife of numerical linear algebra.
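
As one illustration of why the SVD earns the "Swiss Army Knife" title, here is a minimal sketch (mine, not the commenter's code) of a rank-aware, minimum-norm least-squares solver built directly on it:

```python
import numpy as np

# Solve min ||Ax - b|| via the SVD: A = U S V^T, minimum-norm solution
# x = V S^+ U^T b, where S^+ inverts only the singular values that are
# not negligibly small (this is what makes the SVD robust to rank
# deficiency and near-singularity).
def svd_solve(A, b, rcond=1e-12):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rcond * s[0]                      # drop near-zero modes
    return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

rng = np.random.default_rng(1)
A = rng.standard_normal((50, 4))
b = rng.standard_normal(50)
x = svd_solve(A, b)
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)   # reference solution
```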

    • @MatheusOliveira-dk9zq
      @MatheusOliveira-dk9zq 11 months ago

      I have a question: why is the SVD important outside of your general lossy compression or least-norm problems, and how is it used in those cases? For example, you mentioned calculus; any keywords for those methods?

    • @christressler3857
      @christressler3857 11 months ago

      Would you be willing to recommend books on this?

  • @janvesely3279
    @janvesely3279 11 months ago +21

    The best application of linear algebra is surely functional analysis. It transforms the mess of differential/integral equations into something really elegant and easy to use.

    • @raphaelreichmannrolim25
      @raphaelreichmannrolim25 11 months ago

      If you are also interested in Number Theory, look up the concept of Arithmetic Space which I invented in my book, Foundations of Formal Arithmetic.

  • @PhilBoswell
    @PhilBoswell 11 months ago +111

    At 16:50, should the adjacency matrix be adjusted a bit? It seems to suggest that node 1 is only connected to itself and node 5, missing out the connection to node 2, and that node 2 is connected to itself…

    • @joelklein3501
      @joelklein3501 11 months ago +6

      Yep I think so too

    • @Alan-zf2tt
      @Alan-zf2tt 11 months ago +4

      Thanks for your post. At first viewing I had involuntarily stopped listening, as Row_n = Column_n for the obvious 1

    • @spogel9981
      @spogel9981 11 months ago +3

      And the second column should be 10010, because 2 is connected to 1 and 4.

    • @spogel9981
      @spogel9981 11 months ago +1

      Many thanks for this video. Short remark: the second column should be 10010, because 2 is connected to 1 and 4.

    • @krisbrandenberger544
      @krisbrandenberger544 11 months ago

      Yes. Node 2 is connected to 1 and 4, not itself and 4.

  • @jhfoleiss
    @jhfoleiss 11 months ago +5

    That first example is so refreshing!

  • @TomFarrell-p9z
    @TomFarrell-p9z 11 months ago +30

    Wow! Great overview! My favorite applications of linear algebra: spherical geometry (makes the equations intuitive), Fourier analysis, multivariate Gaussian distributions, affine transformations of random variables, linear regression, and engineering problems combining some of the above, especially when the matrices can be manipulated to make the solution methods (almost) unreasonably elegant! 🙂

    • @AdrianBoyko
      @AdrianBoyko 3 months ago

      I’m curious to learn more about spherical geometry as linear algebra. Do you have any pointers? Searches like “linear algebra of spherical geometry” hit almost everything related to LA *or* SG but I haven’t found anything related to LA *and* SG.

    • @TomFarrell-p9z
      @TomFarrell-p9z 3 months ago

      @@AdrianBoyko I would start with a search for "3D rotation matrices" or just "rotation matrices", and explore from there. That's really what I meant when I said spherical geometry. I rotate to a coordinate system where points of interest are in a plane and then measure distance, etc. in that plane. You might find references to chapters in astronomical calculations text books that deal with the subject. (I'm away from my library for an extended period, or I'd offer a suggestion.) Hope this helps.

    • @AdrianBoyko
      @AdrianBoyko 3 months ago

      @@TomFarrell-p9z Thanks for responding but the spherical geometry I’m interested in is on the 2D surface of a sphere. I’m aware of the applications in astronomy and navigation so I’ve seen all that (:

  • @mathisnotforthefaintofheart
    @mathisnotforthefaintofheart 11 months ago +8

    I found the idea of using Cayley-Hamilton to find the four square roots of a 2 by 2 matrix stunning, because the algebra behind it is ultimately so easy... Linear algebra is a must for every math-inclined person.

  • @scottcentoni7478
    @scottcentoni7478 11 months ago +7

    My favorite application of linear algebra is quantum mechanics. Quantum chemistry basically is a huge eigenvalue problem. If you use a plane wave basis with periodic boundary conditions, you can do some of the calculations much more efficiently in momentum space using a fast Fourier transform.
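
A toy version of "quantum chemistry as a huge eigenvalue problem" (my own sketch, using a real-space finite-difference grid rather than the plane-wave basis mentioned above): discretize the 1D harmonic oscillator Hamiltonian and diagonalize it; the exact levels are n + 1/2 in units where hbar = m = omega = 1.

```python
import numpy as np

# Discretize H = -(1/2) d^2/dx^2 + x^2/2 on a uniform grid and
# diagonalize.  The second derivative becomes a tridiagonal matrix, so
# the Schrödinger equation literally turns into linear algebra.
n, L = 1000, 10.0
x = np.linspace(-L, L, n)
dx = x[1] - x[0]
# second-order finite-difference approximation of d^2/dx^2
D2 = (np.diag(np.full(n - 1, 1.0), -1) - 2.0 * np.eye(n)
      + np.diag(np.full(n - 1, 1.0), 1)) / dx**2
H = -0.5 * D2 + np.diag(0.5 * x**2)
evals = np.linalg.eigvalsh(H)   # lowest eigenvalues approximate n + 1/2
```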

  • @jarno.rajala
    @jarno.rajala 11 months ago +37

    Besides the adjacency matrix, a graph can be represented using the closely related Laplacian matrix. This has some mind-blowing applications. For instance, if you take the eigenvectors corresponding to the two smallest non-zero eigenvalues of the Laplacian and use them as arrays of X and Y coordinates for the nodes, you get a really nice 2D representation of the graph, which happens to be the solution to a particular optimization problem where the edges are springs.

    • @tzimmermann
      @tzimmermann 11 months ago +5

      Thanks, your answer led me to reading a few articles on spectral graph theory, a subject I am more or less innocent about. It looks powerful; I'm gonna have a good time geeking this up!

    • @joshavery
      @joshavery 11 months ago

      What is the optimization problem called?
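
The layout described above is usually called spectral graph drawing; the spring formulation is Hall's quadratic placement problem (1970). A minimal sketch of mine, using the conventional choice of eigenvectors for the smallest non-zero Laplacian eigenvalues; on a cycle graph the layout comes out as a regular polygon:

```python
import numpy as np

# Spectral layout: L = D - A; use the eigenvectors of the two smallest
# non-zero eigenvalues of L as x- and y-coordinates for the nodes.
# For the cycle graph C_12 the optimal "spring" layout is a circle.
n = 12
A = np.zeros((n, n))
for i in range(n):                       # build cycle graph C_12
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
Lap = np.diag(A.sum(axis=1)) - A
evals, evecs = np.linalg.eigh(Lap)       # ascending; evals[0] ~ 0 (constant vector)
xy = evecs[:, 1:3]                       # 2D coordinates for the 12 nodes
radii = np.linalg.norm(xy, axis=1)       # all equal: nodes lie on a circle
```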

  • @ianfowler9340
    @ianfowler9340 11 months ago +8

    I remember the first time (Grade 13 high school) I saw the complex numbers, a+bi, represented by a 2x2 real matrix:
    a -b
    b a
    All of the operations on the complex numbers match exactly with matrix operations. So simple and obvious once you see it, but that's what makes it so amazing. Modulus/determinant, De Moivre's theorem, the rotation matrix. The list goes on. Even
    0 -1
    1 0 multiplied by itself gives
    -1 0
    0 -1
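
This correspondence is easy to check numerically (a small sketch of mine):

```python
import numpy as np

# a + bi  <->  [[a, -b], [b, a]].  Complex multiplication then matches
# matrix multiplication, |z|^2 matches the determinant, and the matrix
# representing i itself squares to -I.
def mat(z):
    return np.array([[z.real, -z.imag], [z.imag, z.real]])

z, w = 2 + 3j, -1 + 4j
J = mat(1j)                       # the matrix [[0, -1], [1, 0]]
prod_as_matrix = mat(z) @ mat(w)  # should equal mat(z * w)
```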

  • @mattsgamingstuff5867
    @mattsgamingstuff5867 11 months ago +1

    Not a mathematician, I just like to mess with math sometimes. I was playing with some ideas for fun and stumbled upon modular addition and subtraction looking like rotation through a different lens (it was an algebraic approach, with no matrices). I was interested in sets that generate their elements cyclically under a given operation (which might also have an inverse). I was playing with the idea that for some functions repeated application of the derivative yields the integral, and started trying to generalize and extend it (I figured that if e^x, e^ix, sin(x), and cos(x) are examples, I could maybe do interesting stuff with something more general). After I had a tentative list of axioms for the behavior I was interested in, modular addition/subtraction and rotations (clockwise and counterclockwise) ended up among the examples I could come up with that satisfy them. Maybe one day I'll get to finish playing around with these ideas, and most likely discover I'm going nowhere new and have stumbled on something easily googlable.

  • @tommychau1211
    @tommychau1211 11 months ago +12

    Days ago, I was thinking about the volume of a slanted/oblique cone. If it were a cylinder, the volume would obviously be the same as a regular cylinder's, by thinking of a stack of disks pushed sideways.
    After a little digging, I found this is called Cavalieri's Principle, and it should work for cones as well. So, from another perspective, I tried to write this pushing of a stack of disks as a matrix, which formed something called a "shear matrix". Amazingly, the determinant is 1, meaning there is no impact on the volume!

    • @sumdumbmick
      @sumdumbmick 11 months ago +10

      This is why triangles are 1/2 * b * h regardless of how "slanted" they are.
      The basic thing you've (partially) discovered is a shape constant. The shape constant for any rectangular n-dimensional figure is 1, but if you remove material from this to get other shapes, then a shape constant comes into play. For ellipses it's pi/4, which relates both the area of the ellipse to that of the circumscribing rectangle and the circumference of the ellipse to the perimeter of the circumscribing rectangle, though if the ellipse has an eccentricity greater than 0 there's another shape coefficient at play as well; this is pretty well approximated by pi/4 * ((1+b/a)^2 + 4), or pi/4 * ((b/a)^2 + 2b/a + 5), where a is the semi-major axis and b is the semi-minor axis of the ellipse. For triangles the shape constant is 1/2, and this only relates to area. For pyramids the shape constant is 1/3, and this only relates to volume. The reason it's 1/2 and 1/3 is that it's the inverse of the number of dimensions the shape lives in, and it's no coincidence that these are identical to the coefficients that appear in the power rule when taking integrals of polynomials.
      Since a cone is just a pyramid with a circular base, its volume will always be pi/4 * 1/3 of the rectangular prism made of its height and the rectangle circumscribing its base.
      pi/4 is in fact so fundamental that most slide rules that mark something related to pi mark either pi/4 or 4/pi, because it allows finding the areas or volumes of shapes much more easily than trying to use pi usually does.
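
The shear-matrix observation in the parent comment can be checked directly (a sketch of mine): a shear slides each horizontal slice sideways by an amount proportional to its height, exactly the "stack of disks pushed over" picture, and its determinant is exactly 1, which is the linear-algebra form of Cavalieri's principle.

```python
import numpy as np

# A shear maps (x, y, z) -> (x + k*z, y, z).  Its determinant is 1,
# so it preserves volume; an oblique cone is a sheared right cone.
def shear(k):
    S = np.eye(3)
    S[0, 2] = k     # slide x by k times the height z
    return S

det = np.linalg.det(shear(2.5))   # exactly 1: volume is unchanged
```

Composing shears is still a shear-like volume-preserving map, so the determinant stays 1 no matter how slanted the cone gets.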

  • @goodplacetostop2973
    @goodplacetostop2973 11 months ago +14

    17:57 Actually, that would be graph theory. But I also like to show the Fibonacci formulas with matrices!
    18:01 Good Place To Stop

    • @einbatixx4874
      @einbatixx4874 11 months ago +2

      How long have you been doing this by now actually?

    • @goodplacetostop2973
      @goodplacetostop2973 11 months ago +2

      @@einbatixx4874 I thought it was around 2 years, but actually it's 3 and a half years 🤯

    • @anonymous_4276
      @anonymous_4276 11 months ago +1

      Dude, I got busy with college and stuff, so I couldn't really visit this channel much. Now, after two years, you're still doing this. Respect the dedication!

    • @ewthmatth
      @ewthmatth 11 months ago

      @@einbatixx4874 "Doing this"? Doing what?

  • @kylebowles9820
    @kylebowles9820 11 months ago +7

    Finally a video I could keep up with! There's a small error in the adjacency matrix in column 2, but this was a great video. I recently used linear algebra to least-squares fit a function on a non-Euclidean manifold. Linear algebra is unreasonably effective even in intrinsically nonlinear spaces lol

  • @xlerb_again_to_music7908
    @xlerb_again_to_music7908 11 months ago +3

    The concepts and symbology of linear algebra were not taught to me at school, so at uni the jump into what the classes were doing hit me like a train and pushed me out of the course. Ten years later, I took a subject-related course (computing), and it happened again: same problems. I would never pass that module, so I dropped out.
    This happened to a relative in 2023: he had to quit after the first semester as he was utterly lost, having never used LA, with the class already past what he knew and pulling away rapidly. His subject: finance.
    I went on to complete a PhD in a related topic, but the thing is: why is LA missing from some schools, yet assumed at uni??

  • @darthTwin6
    @darthTwin6 9 days ago

    Fabulous!!!! I am so excited to explore all of this stuff more deeply!

  • @TheLuckySpades
    @TheLuckySpades 11 months ago +6

    In a bunch of classes we would reduce parts of the problems to Linear Algebra and the proof would then be written as "proof via LinAlg"

  • @alexdefoc6919
    @alexdefoc6919 11 months ago +2

    When I started learning determinants, I began wanting to revert determinants back into matrices. So I studied them a lot, and I've got a prototype for turning matrix equalities back into equation equalities, in the reverse order: basically finding all the solutions of a 3-variable equation.

  • @JalebJay
    @JalebJay 11 months ago +2

    I remember using it for mono-directional paths, with the trace of A^n being the number of ways to complete a cycle in n steps. Useful in one of my classes back in 2013.
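
The trace identity mentioned here, tr(A^n) = number of closed walks of length n, is easy to verify by brute force on a small example (a hypothetical 4-node directed graph of mine):

```python
import numpy as np
from itertools import product

# Entry (i, i) of A^n counts length-n walks from i back to i, so the
# trace counts all closed walks of length n.  Brute-force check below.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 1],
              [1, 0, 0, 1],
              [1, 1, 0, 0]])
n = 3
trace_count = int(np.trace(np.linalg.matrix_power(A, n)))

brute = 0
for walk in product(range(4), repeat=n):       # (v0, ..., v_{n-1})
    steps = list(walk) + [walk[0]]             # close the walk at v0
    if all(A[steps[i], steps[i + 1]] for i in range(n)):
        brute += 1
```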

  • @DaniErik
    @DaniErik 11 months ago +3

    So hard to choose one application as my favorite, but I think Markov chains are definitely high up on my list.

  • @rogerr4220
    @rogerr4220 11 months ago +2

    I like the use of linear algebra to find a closed-form expression for the nth Fibonacci number: solving linear recurrences by turning them into a matrix, diagonalizing it, and computing its powers to give entries of the sequence.
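
A minimal sketch of the matrix form of that recurrence (mine; diagonalizing Q is what yields Binet's closed form with phi = (1 + sqrt(5))/2):

```python
import numpy as np

# F(n+1) = F(n) + F(n-1) in matrix form:
# [F(n+1), F(n)]^T = [[1, 1], [1, 0]] @ [F(n), F(n-1)]^T,
# so Q^n = [[F(n+1), F(n)], [F(n), F(n-1)]].
Q = np.array([[1, 1], [1, 0]], dtype=np.int64)  # int64: exact up to F(92)

def fib(n):
    return np.linalg.matrix_power(Q, n)[0, 1]   # = F(n)
```

Binet's formula phi^n / sqrt(5), rounded to the nearest integer, agrees with these matrix powers.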

  • @OmnipotentO
    @OmnipotentO 11 months ago +9

    I regret not taking linear algebra in college. I was a biology major, but I love math and physics. I should've taken them as electives.

    • @devon9374
      @devon9374 11 months ago +2

      Learn it now

    • @DistortedV12
      @DistortedV12 11 months ago +2

      Check out the Linear Algebra Done Right videos or Gilbert Strang's lectures... it's not too late (just gotta devote Saturday mornings).

  • @raphaelreichmannrolim25
    @raphaelreichmannrolim25 11 months ago +2

    My favorite application of linear algebra is in the foundations of mathematics and number theory, through the concept of an Arithmetic Space, which I developed, showing a way to study the Peano axioms using linear algebra. I didn't solve the hardest problems yet, though! You can find it in the book Foundations of Formal Arithmetic.

    • @vyrsh0
      @vyrsh0 4 months ago

      is that your book?

    • @raphaelreichmannrolim25
      @raphaelreichmannrolim25 4 months ago +1

      @@vyrsh0 It is! Not the most polished book, but in it I introduced the notion of the algebra of Arithmetic Transformations, which gives a universal foundation for the theory of generating functions directly from the Peano axioms and linear algebra. It's very neat! The most interesting object is the group of invertible Arithmetic Transformations.

  • @JosephCatrambone
    @JosephCatrambone 11 months ago +1

    The modulo sum to matrix multiplication blew my mind. I wish I'd known that years ago.

  • @dominiquelaurain6427
    @dominiquelaurain6427 11 months ago +1

    My favorite application of linear algebra is intersecting conics in the 2D Euclidean plane. I do Euclidean geometry (with Python code, SageMath and so on) and, because I never found a better way, I use one simple piece of code, which I never tried to completely understand, for getting the intersection points. It is based on the 3x3 matrix representation of conics. My second application is the Cayley conditions for Poncelet configurations. My third application would be quaternions represented as 3x3 matrices: more number theory, or a sequel to reviewing Hamilton's work. My fourth application would be graph theory in computational geometry, like the last part of your video, about paths in a graph (if M is the adjacency matrix with binary elements, then M^n counts paths of length n, and M + M^2 + ... is a matrix whose nonzero elements decide whether there is ANY path between two vertices).

  • @journeymantraveller3338
    @journeymantraveller3338 11 months ago +1

    In statistics we have eigenvalues representing variances in factor analysis, the Cholesky decomposition, Jacobians, the Hessian matrix, the magical LR hat matrix, and variance/covariance matrices.
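
For instance, the "magical" hat matrix of linear regression, H = X (X^T X)^{-1} X^T, is an orthogonal projection: symmetric, idempotent, and with trace equal to the number of fitted parameters (a small sketch of mine with simulated data):

```python
import numpy as np

# The hat matrix "puts the hat on y": y_hat = H y are the fitted values.
# As a projection onto the column space of X it satisfies H @ H = H,
# H = H^T, and tr(H) = number of columns of X (the model's parameters).
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(30),
                     rng.standard_normal((30, 2))])  # intercept + 2 predictors
H = X @ np.linalg.inv(X.T @ X) @ X.T
```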

  • @icenarsin5283
    @icenarsin5283 11 months ago

    Wow!!! I never saw this connection before. Integrating by using an inverse matrix!!! So awesome. Thank you!

  • @DailyFrankPeter
    @DailyFrankPeter 9 months ago +1

    I like linear algebra, it's straight to the point.

  • @stevepa3416
    @stevepa3416 11 months ago

    There was an exercise in a parallel computing textbook that I solved with a pretty fun application of linear algebra. It's especially nice for showing people why we care about linear independence and why we care about working over general fields F rather than just R or C. The problem was to show that all cycles in hypercubes have an even number of edges.
    I imagine there's a more straightforward way to do it, but the idea is to make a Boolean vector space. If your hypercube is d-dimensional, take its vertices as binary strings/tuples of length d; there are 2 to the d of those tuples.
    As vector addition take componentwise XOR, and as scalar multiplication logical AND. I'm pretty sure this is just Z mod 2 arithmetic, but I don't know if there's something I'm forgetting.
    Tl;dr: use the standard basis e_i, but again with Z mod 2 arithmetic componentwise and as your field. Then bam, the hypercube is the span of your basis, and the vertices of the graph are the vectors. For the edges of the hypercube, take the product of the vertex set with itself to get all 2-tuples of vertices; the edges are the subset of pairs whose sum lies in the basis.
    Now you're off to the races. Take any cycle of length p; we have to show p is even. The thing we have to work with is that, by the definition of a cycle, you end where you start. How did we define an edge? That adding its two endpoints gives a basis vector. So you can cook up an equation: if v1 is the start, then v1 + (a sum of p basis vectors) = v1. Add v1 to both sides; vi + vi is the zero vector in this arithmetic, since any bit XOR itself is 0. So now a sum of basis vectors is 0, possibly with many repeats, since you can cycle around a trillion times any which way. Collect the common ones, so you'll have a1 copies of e1 + a2 copies of e2 + ... + ad copies of ed = the zero vector. The only way that happens is if each ai is 0 mod 2. So all the ai are even, the cycle length is p = a1 + ... + ad, and any finite sum of even numbers is even.
    Really fun for pure math folks who only think of linear algebra for use in analysis, and really fun for CS/EE types who just think of it as a means to crunch matrices.
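
A brute-force companion to this argument (my sketch): over GF(2), traversing an edge adds a standard basis vector, which flips the bit-parity (popcount mod 2) of the vertex. So every edge joins vertices of opposite parity, the hypercube is bipartite, and every cycle must have even length.

```python
from itertools import product

# Build the d-cube: vertices are 0/1 tuples of length d, and edges join
# vertices differing in exactly one bit (i.e. their XOR is a basis vector).
d = 4
vertices = list(product([0, 1], repeat=d))
edges = [(u, v) for u in vertices for v in vertices
         if sum(a ^ b for a, b in zip(u, v)) == 1]   # ordered pairs

# Every edge flips the parity of the bit sum, so a closed walk (same
# start and end parity) must use an even number of edges.
parity_flips = all(sum(u) % 2 != sum(v) % 2 for u, v in edges)
```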

  • @ANTONIOMARTINEZ-zz4sp
    @ANTONIOMARTINEZ-zz4sp 11 months ago +15

    Machine Learning is a powerful application of linear algebra in the IT ecosystem.

  • @musicarroll
    @musicarroll 11 months ago +1

    Linear algebra can also be thought of as mathematics in the small, i.e., local analysis. A large-scale structure like an n-dimensional smooth manifold looks like a Euclidean space when you zoom in, and voilà, you can apply linear algebra!

  • @ArjenVreugdenhil
    @ArjenVreugdenhil 11 months ago

    A simple but elegant application is representing f(x) = (ax + b)/(cx + d) as the matrix [[a b][c d]], thus making the group of Möbius functions isomorphic to PGL(2) (or to something much more exciting when working in a module over Z instead of a real/complex vector space).
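
The correspondence is quick to check numerically (my sketch): composing Möbius functions matches multiplying their matrices, up to an overall scalar, which is why the group is a projective one rather than GL(2) itself.

```python
import numpy as np

# f(x) = (a x + b) / (c x + d)  <->  [[a, b], [c, d]];
# composition of Möbius functions = matrix multiplication.
def mobius(M, x):
    (a, b), (c, d) = M
    return (a * x + b) / (c * x + d)

F = np.array([[2.0, 1.0], [1.0, 3.0]])
G = np.array([[0.0, 1.0], [1.0, 0.0]])   # g(x) = 1/x
x = 0.7
composed = mobius(F @ G, x)              # matrix product, then apply
direct = mobius(F, mobius(G, x))         # apply g, then f
```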

  • @TheEternalVortex42
    @TheEternalVortex42 11 months ago +7

    I wonder how much of this is that our puny human brains don't do well with nonlinear concepts. Thus the math that we've gotten good at happens to be the linear stuff. You could imagine an alternate universe in which we are better at nonlinear concepts in which linear math is but a tiny subset of what we focus on.

    • @Alan-zf2tt
      @Alan-zf2tt 11 months ago

      I must admit to wondering whether things really are chaotically linear (linearly chaotic?), with chaos inherent within a system merely by nesting a generator in the system.
      In other words, nested chaotic feedback.
      Wouldn't it be nice if nature played simply with simple things to beguile us, rather than creating monsters just out of sight?

    • @kylebowles9820
      @kylebowles9820 11 months ago

      I deep-dived into the Lie algebra for quaternions; everything made a lot of sense. It gives me hope that we could one day master the nonlinear (although it's a much larger class than the linear, so maybe it's apples and oranges).

    • @ulrichtietz1327
      @ulrichtietz1327 11 months ago

      our puny human brains can't even formulate the question to the answer "42" 😅

  • @사기꾼진우야내가죽여
    @사기꾼진우야내가죽여 4 months ago

    When we define a norm on an Lp space using Lebesgue integration, positive definiteness does not hold, since the integral of a nonzero function that is almost everywhere equal to 0 is 0. Fortunately, the set of all functions almost everywhere equal to 0 is a subspace of the Lp space, so a new vector space, the quotient of the Lp space by that subspace, can be defined. On this vector space we can define the Lp norm without failing positive definiteness.
    The quotient space is a powerful concept.

  • @bigbroiswatchingyou2137
    @bigbroiswatchingyou2137 11 months ago

    That is indeed a wonderful picture that you've drawn, thanks for the video!

  • @carriersignal
    @carriersignal 10 months ago +1

    I've always found mathematics to be a great subject and have always respected it for both what it is and its usefulness. However, in the past I always had trouble understanding some of it. Lately, I have spent much more time studying the subject, and have realized that with enough effort, time, and determination, you can get there.

  • @ArjenVreugdenhil
    @ArjenVreugdenhil 11 months ago

    Here is a physical application to explore: in geometric optics, represent a light ray by the vector (nu, h), where n = refractive index of medium, u = slope of light ray, h = height at which ray enters a given surface (relative to optical axis). An optical system can be described by composition of matrices:
    * [[1 P] [0 1]] for refraction, where P is the refractive power of the surface
    * [[1 0] [-d/n 1]] for travelling through a medium, where d is the horizontal distance
    For instance, a typical thin lens situation is described as product of four matrices: object distance, entering the lens, leaving the lens, image distance.
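    A minimal sketch of composing such ray-transfer matrices, using the matrices exactly as written in this comment (sign conventions vary between textbooks, and the specific numbers below are made up purely for illustration):

```python
from fractions import Fraction as F

def matmul(A, B):
    """Product of two 2x2 matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def refraction(P):
    # Surface of refractive power P, as given in the comment above.
    return [[1, P], [0, 1]]

def translation(d, n):
    # Travel a distance d through a medium of refractive index n.
    return [[1, 0], [F(-d, n), 1]]

# Illustrative thin-lens setup: object distance 30 in air, two surfaces of
# power 1/20 each, image distance 15 (numbers chosen arbitrarily).
system = matmul(translation(15, 1),
         matmul(refraction(F(1, 20)),
         matmul(refraction(F(1, 20)),
                translation(30, 1))))

# Each factor is unimodular, so the composite system matrix is too.
print(det(system))  # -> 1
```

    Whatever sign convention one picks, every factor has determinant 1, so the whole system matrix does as well; that is a handy sanity check on any composition.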

  • @budstep7361
    @budstep7361 11 місяців тому

    7:12 this is only so simple because the V space was set up as the identity matrix! For anyone curious

  • @antoniusnies-komponistpian2172
    @antoniusnies-komponistpian2172 8 місяців тому

    This might actually help me with getting more familiar with analysis/calculus

  • @grizzleyeasy4480
    @grizzleyeasy4480 10 місяців тому +1

    what the hell just happened. is he a magician? why did nobody tell me this? i had like 20 math subjects

  • @phenixorbitall3917
    @phenixorbitall3917 11 місяців тому

    Differential Equations is my favorite. WOW! THANK YOU SO MUCH FOR THIS VIDEO SIR ❤🧠

  • @ripper5941
    @ripper5941 11 місяців тому +1

    Linear algebra is the most fluid and versatile metaphoric skeleton that you can use to solve real-life scenarios

  • @commonwombat-h6r
    @commonwombat-h6r 11 місяців тому +2

    your videos are a joy to watch!

  • @crimfan
    @crimfan 11 місяців тому +1

    One of my professors in grad school---a famous numerical analyst---said that, with maybe a few exceptions like sorting, any applied problem that can't be turned into linear algebra can't be solved at all.

  • @KusacUK
    @KusacUK 11 місяців тому +1

    And now I have a simple way of deriving the sum of angles formulae for sin and cos!

    • @jagatiello6900
      @jagatiello6900 11 місяців тому

      12:00 Ha! This can be obtained from exp(tW)=I+tW+1/2!(tW)^2+... where I is the 2x2 identity matrix and W has first row (0 1) and second row (-1 0), by working out the matrix products of the expansion and identifying that exp(tW)=K(t) where K has first row (cos(t) sin(t)) and second row (-sin(t) cos(t)) from the Iwasawa decomposition, as in Proposition 2.2.5 of Bump's book Automorphic forms and representations.
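      For anyone who wants to check this numerically, here is a small sketch: sum the first few terms of the series for exp(tW) and compare with K(t). Pure Python, no libraries; the value of t and the number of terms are arbitrary choices.

```python
import math

def mat_exp_series(W, t, terms=30):
    """Partial sum of exp(tW) = I + tW + (tW)^2/2! + ... for a 2x2 matrix W."""
    A = [[t * W[i][j] for j in range(2)] for i in range(2)]
    S = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at the identity I
    P = [[1.0, 0.0], [0.0, 1.0]]   # current term (tW)^k / k!
    for k in range(1, terms):
        P = [[sum(P[i][m] * A[m][j] for m in range(2)) / k
              for j in range(2)] for i in range(2)]
        S = [[S[i][j] + P[i][j] for j in range(2)] for i in range(2)]
    return S

t = 0.7
W = [[0, 1], [-1, 0]]
E = mat_exp_series(W, t)
K = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]
# E and K agree to within the series truncation error.
```

      The identity W^2 = -I is what collapses the series into the cos/sin pattern, exactly as in the proposition cited above.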

  • @muhammadkumaylabbas8513
    @muhammadkumaylabbas8513 11 місяців тому +1

    Very fascinating. As usual a super interesting video!

  • @Alkis05
    @Alkis05 11 місяців тому

    That first example reminded me of category theory somehow. I bet there is a functor hiding in that situation, but I'm just a novice in CT

  • @mschuhler
    @mschuhler 10 місяців тому

    great video, loved the derivative example! (as a side note, i think your 2nd column for Node 2 in the last example should go {1 0 0 1 0})

  • @charbeleid193
    @charbeleid193 11 місяців тому +1

    As a researcher in quantum information, I am incapable of imagining what my world would look like without the wonders of linear algebra.

  • @shrayanpramanik8985
    @shrayanpramanik8985 8 місяців тому

    9:10 no no no no! I'm in love after seeing this.

  • @MrFtriana
    @MrFtriana 11 місяців тому +1

    Many problems in physics can be studied with linear algebra. From Newtonian mechanics to the monsters called quantum field theory and relativity (special and general), linear algebra has proved to be a powerful tool for making predictions about Nature, because it has great unifying power. You can study coupled linear oscillatory systems and find their symmetries under linear transformations; in the end, this implies conserved quantities, according to Noether's theorem.

  • @1vootman
    @1vootman 11 місяців тому

    My favorite math class in college, particularly because it was the easiest for me! I'm a visual thinker and LA suited my brain.

  • @nujuat
    @nujuat 11 місяців тому

    I feel like the most intuitive way to think about linear algebra is LTI systems, i.e. amplifiers. Let's say you want to put a sound through an amplifier and mix it with another one. Then it doesn't matter if you mix the two sounds before or after you put them through the amplifier. That's all linearity means. Now let's say you're Dr. Dre and want to pump the bass. What is an arbitrary sound going to look like after being put through the amplifier? No idea. But each individual frequency is just going to be multiplied or phase-shifted by some number. Therefore the frequencies are eigenvectors of the amplifier. That's all eigenvector means. The eigenvalues are just the multiplications and phase shifts. So you can simplify the calculation of what's going to happen to a sound by transforming to a frequency basis and doing your calculations there. That's all matrix diagonalisation is.
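    The "frequencies are eigenvectors" claim is easy to verify numerically. A sketch with a made-up 2-tap filter (any FIR filter would do; the frequency and length are arbitrary):

```python
import cmath

# A toy LTI "amplifier": the 2-tap moving average y[n] = 0.5*x[n] + 0.5*x[n-1].
omega = 0.4                                             # arbitrary frequency
N = 50
x = [cmath.exp(1j * omega * n) for n in range(N)]       # one pure frequency
y = [0.5 * x[n] + 0.5 * x[n - 1] for n in range(1, N)]  # filter output

# Frequency response at omega, i.e. the eigenvalue attached to this frequency:
H = 0.5 + 0.5 * cmath.exp(-1j * omega)
# Then y[n] == H * x[n]: the sinusoid comes out only scaled and phase-shifted.
```

    Run the filter on a sum of two frequencies instead and the output no longer looks like the input at all; only the pure frequencies pass through unchanged in shape.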

  • @D.E.P.-J.
    @D.E.P.-J. 11 місяців тому +1

    Another class of applications comes from algebraic topology. Algebraic topology uses linear algebra to study topological spaces.

  • @rainerzufall42
    @rainerzufall42 11 місяців тому +1

    The 2-node is messed up in the last example. 2 is not connected to 2, but to 1! So the upper left 2x2 square is not (1, 0; 0, 1), but (1, 1; 1, 0)...

    • @rainerzufall42
      @rainerzufall42 11 місяців тому

      BTW: It's interesting to calculate the eigenvalues and eigenvectors of this matrix...
      For example, the 5th eigenvalue is λ_5 = 0 with eigenvector v_5 = (0, -1, 1, 0, 1).
      On the other hand, the first eigenvalue is the biggest one, λ_1 = (1+sqrt(13))/2, with eigenvector v_1 = (λ_1, 2, 1, λ_1, 1).
      The other eigenvalues are λ_2 = (1+sqrt(5))/2, λ_3 = (1-sqrt(13))/2, and λ_4 = (1-sqrt(5))/2.
      But in some cases it may be better to ignore the reflexive edges (loops); then λ_1 = -sqrt(3), λ_2 = sqrt(3), λ_3 = -1, λ_4 = 1, and λ_5 = 0.

  • @zachchamp93
    @zachchamp93 11 місяців тому

    This is like using abstractions in computer programming, basically: just representing and embedding computational algorithms as algebraic expressions

  • @user-bk2fo7ny9s
    @user-bk2fo7ny9s 11 місяців тому +1

    my fav: dx(t)/dt = A x(t) solution is x(t) = exp(At) x(0)
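    A sketch of that solution formula in action, with exp(At) approximated by its Taylor series (the matrix A and the initial condition below are arbitrary choices; a diagonal A makes the exact answer easy to check):

```python
import math

def taylor_expm(A, t, terms=40):
    """Partial sum of exp(At) = I + At + (At)^2/2! + ... for a 2x2 matrix A."""
    At = [[t * A[i][j] for j in range(2)] for i in range(2)]
    S = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at the identity
    P = [[1.0, 0.0], [0.0, 1.0]]   # current term (At)^k / k!
    for k in range(1, terms):
        P = [[sum(P[i][m] * At[m][j] for m in range(2)) / k
              for j in range(2)] for i in range(2)]
        S = [[S[i][j] + P[i][j] for j in range(2)] for i in range(2)]
    return S

# dx/dt = A x with A = diag(-1, -2) decouples into x1' = -x1 and x2' = -2 x2,
# so the exact solution is x(t) = (e^-t x1(0), e^-2t x2(0)).
A = [[-1.0, 0.0], [0.0, -2.0]]
x0 = [3.0, 5.0]
t = 0.8
E = taylor_expm(A, t)
x_t = [E[0][0] * x0[0] + E[0][1] * x0[1],
       E[1][0] * x0[0] + E[1][1] * x0[1]]
# x_t ≈ [3*exp(-0.8), 5*exp(-1.6)], matching x(t) = exp(At) x(0)
```

    For non-diagonal A the same formula works; diagonalizing A (when possible) just makes exp(At) cheap to evaluate.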

  • @5_inchc594
    @5_inchc594 11 місяців тому

    I’m enlightened. Thanks

  • @terryrodgers9560
    @terryrodgers9560 7 місяців тому

    I'm not a math professor, but I think trig is one of the most important concepts in mathematics (as a multivariable calculus student)

  • @markharder3676
    @markharder3676 11 місяців тому

    Thanks for this lecture. The calculus application was something I never learned about before. A real eye-opener, that.
    In the first example, how do we know that the 4 trig-based functions actually span a 4 D space? What about linear independence?
    In the graph theory application, it seems to me that vtx 1 is also connected to vtx 2, which you did not include in the matrix.

    • @vangrails
      @vangrails 11 місяців тому

      I think that you need to prove that the spanning set is also a basis; that is, you need to prove that those 4 functions are linearly independent. They are, so this spanning set is also a basis.
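      One way to check independence concretely (a sketch; the video's exact four functions aren't reproduced here, so {cos t, sin t, t·cos t, t·sin t} is taken as an illustrative quadruple): sample all four at four generic points. If the sample matrix is invertible, any relation c1·f1 + ... + c4·f4 = 0 would force c = 0.

```python
import math

def det(M):
    """Determinant by Laplace expansion along the first row (fine for a 4x4)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

# Illustrative quadruple of trig-based functions (NOT necessarily the video's):
funcs = [lambda t: math.cos(t),
         lambda t: math.sin(t),
         lambda t: t * math.cos(t),
         lambda t: t * math.sin(t)]
points = [0.5, 1.3, 2.1, 2.9]            # four generic sample points
M = [[f(t) for f in funcs] for t in points]

# If c1 f1 + ... + c4 f4 = 0 as functions, then M c = 0 at these points, so an
# invertible M forces c = 0: the four functions are linearly independent.
# (A zero determinant would prove nothing; the points might just be unlucky.)
d = det(M)
```

      Here d comes out well away from zero, certifying independence for this quadruple.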

  • @mathobey
    @mathobey 11 місяців тому +2

    I don't think these examples show us "unreasonable effectiveness"; their effectiveness is very reasonable. Spaces of smooth functions naturally have the structure of vector spaces, and linear differential equations by definition arise from linear operators on these spaces.
    Same story with groups. On the one hand, they have strong connections with rings (because there is the group-ring construction, and a group action of G gives a ZG-module structure) and so with modules over rings (the theory of modules over rings is a generalization of linear algebra). On the other hand, vector spaces have a natural action of the automorphism group (also known as GL, the general linear group), and for every group G we can find a big enough space V and build a faithful representation G -> GL(V) (a "vectorification" of Cayley's theorem for groups).
    That's why the connections with group theory and the theory of differential equations are not surprising.
    What is REALLY surprising is that linear algebra helps us solve a lot of problems in discrete maths. For example, the weak Berge conjecture (a graph is perfect iff the complement of the graph is perfect) has a linear-algebraic proof. Also, spectral graph theory studies the spectra of graph matrices (a purely combinatorial construction; it's hard to see algebraic meaning in it) and gives us results about (for example) the inner structure of regular graphs.
    This is what we really can call the "unreasonable effectiveness of linear algebra".
    Sorry for mistakes, English is not my native language

    • @mikecaetano
      @mikecaetano 11 місяців тому

      Unbelievable would make better sense here than unreasonable. Linear algebra is highly effective, so effective that at times it may be difficult to believe exactly how highly effective.

  • @nicholaslear7002
    @nicholaslear7002 11 місяців тому

    Hey great video! Would you mind sharing what brand of chalkboard you use in your videos?

  • @Wise4HarvestTime
    @Wise4HarvestTime 11 місяців тому

    I remember taking linear algebra in college and it opened up amazing possibilities in computer graphics, but this is pretty dense. I'm saying this 4 minutes in and will continue watching to see if I get what he's saying before applying for a job with the sponsor 😅😂😂🤣

  • @ComplexVariables
    @ComplexVariables 11 місяців тому

    I definitely share this view with all my students; linear is a serious power-tool

  • @Bertogil98
    @Bertogil98 8 місяців тому +1

    My Differential Equations lecturer said that humans are so stupid that we are only capable of computing linear things. Out of all the possibilities, our brain only works for A(x+y)=A(x)+A(y), so we have developed our mathematics along the lines of the "developable", which means relating everything to linear algebra.
    To answer "why is linear algebra so useful?", try to imagine anything not using linearity: no progress would be achieved, so nobody would study it, nobody would care about it, and we would forget about it. So the usefulness of linear algebra is a kind of survival bias.

  • @aarongracia4555
    @aarongracia4555 11 місяців тому

    We need a video about graphs, pls 🙏🏼

  • @euanthomas3423
    @euanthomas3423 11 місяців тому +1

    Fascinating insight. What do the eigenvectors mean in these situations?

    • @01binaryboy
      @01binaryboy 5 місяців тому

      Even after the transformation, an eigenvector remains pointing along the same direction; it may only be stretched.

  • @abdelkaioumbouaicha
    @abdelkaioumbouaicha 11 місяців тому +3

    📝 Summary of Key Points:
    📌 Linear algebra provides powerful tools for analyzing mathematical structures and gaining a deeper understanding of them. It allows for the translation of mathematical concepts into the language of linear algebra, enabling their study using linear algebra techniques.
    🧐 In the first example, a four-dimensional real vector space spanned by four functions is examined. By representing this structure as a matrix, the derivative of the functions can be analyzed, and the matrix can be used to find the anti-derivative of a function.
    🚀 The second example explores how a group can be represented in linear algebra using matrices. The group ZN is represented as 2x2 matrices with real entries, and addition in ZN is represented as matrix multiplication. This representation allows for the study of groups using linear algebra techniques.
    📊 Linear algebra has applications in data science and machine learning. Data can be encoded into matrices, and matrix factorization techniques can be used to analyze the data. Examples include encoding images and representing networks or graphs using adjacency matrices.
    💡 Additional Insights and Observations:
    💬 "Linear algebra provides a powerful framework for studying mathematical structures and solving problems in various fields."
    🌐 The video references the use of linear algebra in data science and machine learning, highlighting its practical applications in these areas.
    📣 Concluding Remarks:
    Linear algebra is a versatile and effective tool for studying mathematical structures and solving problems. By translating mathematical concepts into the language of linear algebra, we can gain a deeper understanding and apply powerful techniques. From analyzing derivatives and integrals to representing groups and encoding data, linear algebra plays a crucial role in various fields of study.
    Generated using Talkbud (Browser Extension)

    • @ulrichtietz1327
      @ulrichtietz1327 11 місяців тому +1

      Bravo! ChatGPT -- using the video-transcript -- couldn't have produced a better summary.

  • @minecraftermad
    @minecraftermad 11 місяців тому

    17:50, your matrix is off: it's saying that 2 connects to itself, when that 1 is supposed to be one slot higher.

  • @rayjay13790
    @rayjay13790 3 місяці тому

    So I had been recommended the study of LA *before* Calculus 3 so that an understanding of the chain rule as a simplified form of The Jacobian could be ascertained.

  • @strikeemblem2886
    @strikeemblem2886 11 місяців тому +2

    The claim that dim V = 4 at 3:15 is unjustified. Wrong adjacency matrix at 16:50.

  • @Alan-zf2tt
    @Alan-zf2tt 11 місяців тому +1

    Truly = I do not know.
    I have much to learn
    I do appreciate it's beauty

  • @marcotosini7156
    @marcotosini7156 11 місяців тому

    The graphical representation appears to be incorrect, but the video is fantastic!

  • @사기꾼진우야내가죽여
    @사기꾼진우야내가죽여 11 місяців тому

    The statement that every vector space has a basis is equivalent to the axiom of choice, which seems to be unrelated to linear algebra.

  • @juandesalgado
    @juandesalgado 11 місяців тому +1

    If you ever browse over the "attention" paper on the transformers architecture in LLMs, the sentence about positional encoding that goes "... for any fixed offset k, PE_{pos+k} can be represented as a linear function of PE_{pos}" has some relation to the first application in this video.

  • @alnitaka
    @alnitaka 11 місяців тому

    One can represent the elements of the Galois group of an equation as matrices.

  • @jennifertate4397
    @jennifertate4397 10 місяців тому

    Great vid. Thanks.

  • @cd-zw2tt
    @cd-zw2tt 11 місяців тому

    Could you theoretically use any periodic function instead of sin and cos in the modulo sum example? My thinking is you need pure sine and cosine to get a steady tick around the circle, but with other periodic functions you could have some sort of interesting "weighting function" on the inputs. With sine and cosine it's pure and steady, but with something like a triangle wave or a compound sine wave, you could induce some very strange behavior.
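    For reference, the reason sine and cosine give the steady tick is that they assemble into rotation matrices, and rotations by multiples of 2π/N multiply exactly like Z_N adds. A quick sketch (N = 7 and the elements a, b are arbitrary choices):

```python
import math

N = 7

def R(k):
    """Rotation by 2*pi*k/N: the 2x2 matrix standing in for k in Z_N."""
    a = 2 * math.pi * k / N
    return [[math.cos(a), -math.sin(a)],
            [math.sin(a),  math.cos(a)]]

def matmul(A, B):
    return [[sum(A[i][m] * B[m][j] for m in range(2)) for j in range(2)]
            for i in range(2)]

# Multiplying matrices realizes addition mod N: R(a) R(b) == R((a + b) % N).
a, b = 5, 4
lhs = matmul(R(a), R(b))
rhs = R((a + b) % N)
```

    A triangle wave or compound sine would break the angle-addition identities behind R(a)R(b) = R(a+b), so the matrices would no longer represent the group operation, even though the entries would still be periodic.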

  • @NityaKrishnaDas926
    @NityaKrishnaDas926 6 місяців тому

    Thank you so much 🙏🙏🙏🙏🙏🙏🙏🙏

  • @MS-sv1tr
    @MS-sv1tr 11 місяців тому

    Finding the integral by finding the inverse of the derivative matrix was kind of mind-blowing. I know your example was chosen to make it simple, but can this be generally applied as a way to compute integrals?

    • @rainzhao2000
      @rainzhao2000 10 місяців тому +1

      I remember being mind-blown just like you and wondering the same thing when I first saw this. I went down the rabbit hole of functional analysis and differential equations and I'm still digging. A neat way of thinking about this problem is to view it as solving the differential equation Df=g for the antiderivative f of the function g, and D is the derivative operator.
      Since differentiation is linear, Df=g now looks like a linear algebra problem where D is a linear operator, and f and g are vectors. In fact, in the context of functional analysis, functions are vectors belonging to vector spaces of functions appropriately called function spaces.
      If we can find a finite basis for the function space of f and g, then we can represent D as a matrix, g as a coordinate vector, and solve for f by matrix multiplying D inverse with g just like in the video.
      In general, these function spaces could be infinite dimensional and there's not always a useful basis to represent them, but the field of functional analysis has classified many kinds of function spaces with a variety of useful bases for solving differential equations.
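      To make that concrete, here is a tiny worked instance (the two-dimensional space span{e^t·sin t, e^t·cos t} is chosen just for illustration; it is closed under d/dt, and D is invertible on it):

```python
from fractions import Fraction as F

# Basis: b1 = e^t*sin(t), b2 = e^t*cos(t).
# d/dt b1 = b1 + b2 and d/dt b2 = -b1 + b2, so in this basis
# the derivative operator D has columns (1,1) and (-1,1):
D = [[F(1), F(-1)],
     [F(1), F(1)]]

def inv2(M):
    """Inverse of a 2x2 matrix by the adjugate formula."""
    d = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [[ M[1][1] / d, -M[0][1] / d],
            [-M[1][0] / d,  M[0][0] / d]]

Dinv = inv2(D)
# Antiderivative of b1: apply D^-1 to the coordinate vector (1, 0),
# i.e. read off the first column of D^-1.
coeffs = [Dinv[0][0], Dinv[1][0]]
print(coeffs)  # -> [Fraction(1, 2), Fraction(-1, 2)]
# So the integral of e^t*sin(t) dt = (e^t*sin(t) - e^t*cos(t)) / 2 (+ C),
# matching the classical integration-by-parts result.
```

      The "+ C" lives outside the picture because constants are not in this function space; that is also why D had to be restricted to a space on which it is injective before it could be inverted.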

  • @tikeshverma4777
    @tikeshverma4777 11 місяців тому

    My fav is least square solutions.

  • @michaelaristidou2605
    @michaelaristidou2605 11 місяців тому

    Excellent video!

  • @빛과어둠-q8s
    @빛과어둠-q8s 24 дні тому

    As someone who knows neither, should you study Linear Algebra first or Calculus first?

  • @anonymous_4276
    @anonymous_4276 11 місяців тому

    Great video!

  • @ronaldjorgensen6839
    @ronaldjorgensen6839 11 місяців тому

    thank you

  • @superuser8636
    @superuser8636 11 місяців тому

    Hi, how would you encode the +C ? You can point me to reference or calculate directly here, I will comprehend using notation. Also, never saw the integral vector matrix notation but follow you very easily. MS CS with BS CS (ML) BA Math (Stats). Wanted to take more real/complex/advanced matrix theory but ran out of time and had to get my career started before I hit 35. Love staying sharp with this content early mornings. TYVM❤

  • @MrJepreis
    @MrJepreis 11 місяців тому

    Brilliant! as always!

  • @MasterHigure
    @MasterHigure 11 місяців тому

    (Finite dimensional) linear algebra is the backbone of calculus. And calculus is everywhere, so linear algebra is everywhere.

  • @xlegiofalco
    @xlegiofalco 11 місяців тому

    In Soviet Russia (I forgot the name of the economic advisor) linear algebra was used to balance production across cities to maximize output and minimize waste. Also, I think about linear algebra all the time while I sit at strings of red lights in traffic every day going to class 🤬😡

  • @DaddyRaiden
    @DaddyRaiden 11 місяців тому

    That is really fucking interesting

  • @littlenarwhal3914
    @littlenarwhal3914 11 місяців тому

    A lot of math is about figuring out how to reduce a complicated problem to linear algebra

  • @rmv9194
    @rmv9194 9 місяців тому

    Shouldn't the first column in the graph matrix be (1,1,0,0,1)????

  • @dogbiscuituk
    @dogbiscuituk 6 місяців тому +1

    You love your linear algebra. I love your abstract algebra. Let's call the whole thing off.

  • @quiksilverrandom
    @quiksilverrandom 11 місяців тому

    Isn't there a mistake in the matrix at 14:20 and following? The matrix shows no connection between points 1 and 2, but a connection between 2 and itself.

  • @MGoebel-c8e
    @MGoebel-c8e 11 місяців тому

    8:46 Isn't that the wrong order of matrix and vector? Seems it should be (1,0,0,0)*D^-1