Functional Analysis 15 | Riesz Representation Theorem

  • Published 29 Sep 2024

COMMENTS • 84

  • @psyspin
    @psyspin 3 years ago +8

    Congratulations man! This is an amazing intro to a topic that I like very much (although I am not a mathematician) but I struggle to understand through my self-study. It really helps me a lot! Again congrats and keep up the good work :)

  • @lucaug10
    @lucaug10 3 years ago +25

    I'm loving the frequent videos! Such a happy surprise to open UA-cam and see a new functional analysis video every day! :)

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +4

      Thank you! I am working hard at the moment :)

    • @saadtahir96
      @saadtahir96 3 years ago

      @@brightsideofmaths Thank you for this! Can you please share your email address or inbox me at saadtahir96@gmail.com? I have some useful material that you may like, and it would ultimately also help me with this course too! :D

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +1

      @@saadtahir96 ua-cam.com/users/brightsideofmathsabout

  • @JR-iu8yl
    @JR-iu8yl 2 years ago

    Thank You

  • @weirdo-jw9kc
    @weirdo-jw9kc 3 years ago

    Do a series on topology and algebra too. If you have done one before, please share the link. I like how you present the ideas; it gives the right intuition.

  • @sanjursan
    @sanjursan 3 years ago

    My proof is much simpler than this. Of course, it is wrong!

  • @NorwegianFr34k
    @NorwegianFr34k 3 years ago +15

    Not even joking, yesterday I was about to ask whether you would make a video on this topic. So this was a nice surprise :D

  • @dibeos
    @dibeos 3 years ago +5

    I don’t understand why (lambda)*x(hat)-x is in the Kernel of l... 7:08

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +3

      If you apply the map l, you get zero. This is the same calculation as done by the blue brackets above.

    • @dibeos
      @dibeos 3 years ago +2

      Ahhh, got it! And by the way, thanks for the videos. They are really amazing. I even already signed up to your website steadyhq.

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +1

      @@dibeos Thank you very much :)
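The kernel membership discussed in this thread can be sanity-checked in finite dimensions. Below is a minimal Python sketch; the functional l(x) = ⟨v, x⟩ on R^3 and the vectors v and x are made-up illustrative choices, not taken from the video:

```python
# Hypothetical example in R^3: l(x) = <v, x> for a fixed vector v.
v = (1.0, 2.0, -1.0)

def l(x):
    # A continuous linear functional on R^3, given by a dot product.
    return sum(vi * xi for vi, xi in zip(v, x))

# Unit vector x_hat in ker(l)^perp, i.e. a normalized multiple of v.
norm_v = sum(vi * vi for vi in v) ** 0.5
x_hat = tuple(vi / norm_v for vi in v)

x = (3.0, -1.0, 4.0)               # arbitrary test vector
lam = l(x) / l(x_hat)              # the lambda from the proof

# lambda * x_hat - x is killed by l, so it lies in ker(l):
diff = tuple(lam * xh - xi for xh, xi in zip(x_hat, x))
print(abs(l(diff)) < 1e-12)        # True
```

The check works because l(lam * x_hat - x) = lam * l(x_hat) - l(x) = l(x) - l(x) = 0 by linearity, which is exactly the "blue brackets" calculation.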

  • @tensorfeld295
    @tensorfeld295 3 years ago +20

    Can you do a course on differential geometry? Starting elementary, then continuing with manifolds. Maybe you can do something with Banach and Hilbert manifolds. Would be nice! ^^

  • @RepTheoAndFriends
    @RepTheoAndFriends 2 years ago +3

    I know that this is a meme, but I really enjoy the statements of basic functional analysis, because they resemble stuff from representation theory (a continuous function H -> F seems to be an analytic version of an exact functor from a 'nice' triangulated category T to k-Vect). For a certain triangulated category T one can show that K_0(T) is the power series ring k[[t]]. Altogether we obtain a map K_0(T) = k[[t]] -> k = K_0(k-Vect) (at least after tensoring with k). The statement is that every such functor which is continuous (i.e. exact and sends arbitrary coproducts to arbitrary coproducts) is representable (this is the Brown representability theorem).

  • @qiaohuizhou6960
    @qiaohuizhou6960 3 years ago +3

    Hi, thank you so much for your video! I am sorry if I throw too many questions at you on the same day... I am wondering if you could please share some insight on why x_l must belong to the orthogonal complement of the kernel of l? I know the kernel is a subspace of a vector space, and I know the row space (or column space) is orthogonal to the null space. I can sort of follow every step up to where l(x) = ⟨x_l, x⟩, but I don't get the insight behind choosing x_l from the orthogonal complement of the kernel. Also, it seems this special x_l is analogous to the singular vector in a finite-dimensional space... are these two concepts somehow connected?
    Sorry, I didn't major in maths and have a very limited background in all sorts of maths subjects. I hope you don't find my question naive and lacking in basic understanding. I am glad if you could point me in the right direction of study!

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +2

      Don't worry at all. All questions are welcome here. Even naive ones can help other viewers quite a lot.
      The choice of x_l makes sense here because in the inner product all elements of ker(l) have to be sent to 0 as well, and this is exactly what the inner product with x_l achieves.
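The whole finite-dimensional version of the construction can be sketched in a few lines. A minimal Python illustration in R^3 (so no complex conjugation); the functional l(x) = ⟨v, x⟩ and the test vectors are illustrative choices, not from the video:

```python
# Given l, pick a unit vector x_hat in ker(l)^perp and set x_l = l(x_hat) * x_hat.
def dot(u, w):
    return sum(ui * wi for ui, wi in zip(u, w))

v = (2.0, -1.0, 3.0)                 # illustrative: l(x) = <v, x>
l = lambda x: dot(v, x)

# Unit vector orthogonal to ker(l) (here, a normalized multiple of v):
x_hat = tuple(vi / dot(v, v) ** 0.5 for vi in v)
x_l = tuple(l(x_hat) * xh for xh in x_hat)   # the Riesz representative

for x in [(1.0, 0.0, 0.0), (0.5, 2.0, -1.0), (3.0, 3.0, 3.0)]:
    assert abs(l(x) - dot(x_l, x)) < 1e-12   # l(x) = <x_l, x> on every test vector
print("x_l represents l")
```

The point of choosing x_l from ker(l)^perp is visible here: the inner product with x_l automatically sends all of ker(l) to 0, matching what l does there.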

  • @Hold_it
    @Hold_it 3 years ago +4

    I hope you still get enough sleep with all these high quality videos coming out in a short time ;)

  • @mathieumaticien
    @mathieumaticien 3 years ago +2

    Why does the closedness of ker(l) imply that ker(l)^ortho is nontrivial?

    • @brightsideofmaths
      @brightsideofmaths  3 years ago

      We also have the assumption that ker(l) is not the whole space. Hence closedness means that ker(l) is a proper closed subspace, and therefore itself a Hilbert space inside the Hilbert space X. Does this already help you?

    • @mathieumaticien
      @mathieumaticien 3 years ago

      @@brightsideofmaths Hmmm, now I'm wondering why the closedness is necessary. If we say ker(l) is a strict subset of X, and k is in ker(l), and let x be in X but not in ker(l), then ⟨x, k⟩ = 0 by definition, so x is in ker(l)^ortho. Since 0 is in ker(l) and we defined x to not be in ker(l), x is not 0, and ker(l)^ortho is nontrivial.
      Where does the closedness of ker(l) come into play?

    • @hanfsi
      @hanfsi 3 years ago

      @@mathieumaticien To even be able to split up the whole space into a subspace and its orthogonal complement, you need to apply the Hilbert projection theorem (which is done implicitly in the video), and that theorem requires a closed subspace (just look at its proof). So it's really a condition imposed by that theorem if you want to be able to split the space up in the first place.

  • @h-bar8649
    @h-bar8649 10 months ago

    No clue where you would put it, but it would be great if somehow Fréchet and Gateaux were discussed in this Functional Analysis series. Unless you think it should belong elsewhere? Thanks for the videos!

  • @arturo3511
    @arturo3511 a year ago

    At 4:45, is it always true that, by continuity, the pre-image of a closed set is closed? You said that continuity translates to closed sets via complements. I don't understand what you mean by complements; is there an extra criterion for it to translate to closed sets? Or is it always true that if the map is continuous, the pre-image of a closed set is closed? I'm simply asking this to know whether it's possible for the preimage of a closed set to be open, which wouldn't go against the definition of continuity we saw. Thank you!
    Additionally, it seems that at 5:16, x_l can be defined by any x^ (x-hat) that satisfies the given properties. Is it true that only one x^ satisfies these properties, since x_l is unique?

    • @brightsideofmaths
      @brightsideofmaths  a year ago

      By the abstract definition of continuity we have: preimages of open sets are open. This translates to: preimages of closed sets are closed, because the preimage of a complement is the complement of the preimage.
      Please also note that a set can be closed and open at the same time.
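The translation step in this reply rests on the set identity f⁻¹(Y \ A) = X \ f⁻¹(A). A toy finite illustration in Python (the map f and the sets are arbitrary made-up choices):

```python
# Toy check that "preimage of complement" = "complement of preimage".
X = set(range(10))          # stand-in for the domain
Y = set(range(5))           # stand-in for the codomain
f = lambda n: n % 5         # an arbitrary illustrative map X -> Y

def preimage(A):
    return {n for n in X if f(n) in A}

A = {0, 2}                  # stand-in for an open set in Y
lhs = preimage(Y - A)       # preimage of the complement (the "closed" side)
rhs = X - preimage(A)       # complement of the preimage
print(lhs == rhs)           # True
```

So if preimages of open sets are open, the preimage of a closed set is the complement of an open set, hence closed.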

  • @jorgearturomartinezsanchez4882
    @jorgearturomartinezsanchez4882 2 years ago +1

    Thank you, good sir. I'm writing my thesis and never took functional analysis, so your videos help a lot

  • @anne-catherine_gagne
    @anne-catherine_gagne 4 months ago

    Thank you! Your video really helped me understand the material better. I feel more confident for my final tomorrow

  • @lorenzougo6571
    @lorenzougo6571 8 months ago

    want to cry, calculus 3 incoming hahahah

  • @hyperduality2838
    @hyperduality2838 a year ago

    Domain (pre-image) is dual to the co-domain (image) -- rank nullity theorem in linear algebra.
    Isomorphism (sameness) is dual to homomorphism (similar or relative sameness) -- Group Theory.

  • @AadityaVicramSaraf
    @AadityaVicramSaraf a year ago

    I'm unsure if I'm being dumb, but at 6:34, doesn't the complex conjugate come when we multiply the scalar in the second component? I might be confused, but kindly clarify.

    • @AadityaVicramSaraf
      @AadityaVicramSaraf a year ago

      In Wikipedia and Conway's functional analysis as well, I saw that it is the conjugate in the second component and normal in the first component. I checked out your previous video where you said linear in the second component, though.

    • @AadityaVicramSaraf
      @AadityaVicramSaraf a year ago

      Ok, sorry, I rewatched that video and found at 6:18 that you clarified that you had chosen this definition. I also understood that eventually it is there to ensure positivity, so it is our choice to have linearity in the first (or second) argument. Thanks. Stuff is much clearer now

    • @brightsideofmaths
      @brightsideofmaths  a year ago

      Great :)
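The convention question in this thread is purely definitional, and the difference is easy to see concretely. A minimal Python sketch on C^2 with small Gaussian-integer entries (chosen so all arithmetic is exact); both inner products and the test vectors are illustrative:

```python
# Two conventions for a complex inner product on C^2.
def ip_first_conj(x, y):
    # Conjugate-linear in the FIRST slot (the video's convention).
    return sum(xi.conjugate() * yi for xi, yi in zip(x, y))

def ip_second_conj(x, y):
    # Conjugate-linear in the SECOND slot (e.g. Conway's convention).
    return sum(xi * yi.conjugate() for xi, yi in zip(x, y))

x, y, a = (1 + 2j, 3j), (2 - 1j, 1 + 1j), 2 + 5j

# The conjugate appears in a different slot depending on the convention...
assert ip_first_conj(tuple(a * xi for xi in x), y) == a.conjugate() * ip_first_conj(x, y)
assert ip_second_conj(x, tuple(a * yi for yi in y)) == a.conjugate() * ip_second_conj(x, y)
# ...but both conventions give the same real, nonnegative norm:
assert ip_first_conj(x, x) == ip_second_conj(x, x)
print("same norm, different conjugated slot")
```

This matches the resolution above: positivity of the norm forces a conjugate somewhere, and which slot carries it is a choice of definition.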

  • @luciaperez4400
    @luciaperez4400 a year ago

    Excellent video! Could you give a reference for the proof that the orthogonal complement of a closed set in a Hilbert space contains elements other than 0?

  • @hoijanlai
    @hoijanlai 3 years ago

    The course I am taking also has a step that proves that the dimension of the ortho-complement of ker(l) is 1. Do you know why that is? Thanks

  • @StratosFair
    @StratosFair 2 years ago

    Wanted to give myself a quick refresher on the proof of the Riesz representation theorem, and this was extremely clear and helpful, just like I remembered it to be!
    I hope you will get the chance to cover orthogonal projections at some point

  • @tigernov_425
    @tigernov_425 2 years ago

    Why is the bound on the norm of l given by the norm of l applied to a unit vector of X?

  • @chenliou2578
    @chenliou2578 3 years ago +1

    Thx

  • @MikhailBarabanovA
    @MikhailBarabanovA 3 years ago

    Finally an answer to why we can just transpose vector space elements and they become OK as a functional. Thanks!

  • @munausef3891
    @munausef3891 2 years ago

    Thanks, good explanation. Can you give me a site to practice solving questions? Thx

  • @lonjezosithole6285
    @lonjezosithole6285 3 years ago

    I am learning a lot from your videos, man. Thank you for posting this content

  • @JaspreetSingh-zp2nm
    @JaspreetSingh-zp2nm 12 days ago

    Why does the orthogonal complement, being closed, have to contain something other than the zero vector? Closedness is something topological, so I am confused here. In finite dimensions the Gram–Schmidt process may help, but in general I am not sure.

    • @brightsideofmaths
      @brightsideofmaths  12 days ago

      The orthogonal complement is always a closed set. So maybe you can clarify your question?

  • @pan19682
    @pan19682 2 years ago

    We are looking forward to you giving us a video series on topology

  • @Domzies
    @Domzies 3 years ago

    3:50 has made me realise I didn't understand this. The professor in my functional analysis course did say that the theorem wouldn't work without X being a Hilbert space, but he didn't explicitly say why. Judging from your video, I also probably don't quite understand orthogonal projectors as well as I'd like to. I've tried looking into the book Functional Analysis by Peter Lax, but got even more confused. There it almost seems like you need a vector subspace (not just an arbitrary set) in order to even define an orthogonal complement. Besides this, it would seem that perhaps the classical relation from linear algebra, namely that X = Y directsum Y^ortho for any vector subspace Y, only holds true in a general Hilbert space if Y is closed?

  • @hectormerinocruz7965
    @hectormerinocruz7965 3 years ago

    A beautiful and very well explained proof of this important theorem.

  • @moritzbecker5703
    @moritzbecker5703 3 years ago

    Thank you very much for your excellent videos!

  • @scollyer.tuition
    @scollyer.tuition 3 years ago

    In a finite dimensional Euclidean space, we often represent linear functionals via row vectors, which map column vectors into the underlying field via a dot/inner product. I guess the Riesz Representation Theorem guarantees:
    a) that this operation can be justified rigorously
    b) that the analogue of this operation in infinite dimensional vector spaces also exists

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +2

      I think that is a short rough summary one can always have in mind.
      However, in infinite-dimensional spaces some technical details are involved as well: We need completeness for example and the dual space consists of *continuous* functionals.

  • @RangQuid
    @RangQuid a year ago

    The proofs are very elegant, they really bring out the beauty and bright side of mathematics!

  • @ecologypig
    @ecologypig 2 years ago

    Thanks for your super helpful videos! 😀 I have a quick question: how do we know that $x_l := l(\hat{x}) \hat{x}$ is still inside the set $X$? Since we have scaled $\hat{x}$ by $l(\hat{x})$, and the scaling might be large, couldn't it be that $x_l$ now lies outside of the set $X$?

    • @brightsideofmaths
      @brightsideofmaths  2 years ago

      X is not just a set but a vector space. Hence you can never leave it just by scaling :)

    • @ecologypig
      @ecologypig 2 years ago

      @@brightsideofmaths oh got you! Thanks very much for your quick reply!😃

  • @zaccandels6695
    @zaccandels6695 3 months ago

    Excellent video

  • @anowarali668
    @anowarali668 a year ago

    Thanks for the video. My doubt is: whenever you enter l(x^) into the inner product, you take the conjugate of l(x^). Why? We know the conjugate comes when we take it with the second term of the inner product. Please clarify.

    • @brightsideofmaths
      @brightsideofmaths  a year ago

      I defined the conjugate in the first term of the inner product.

    • @anowarali668
      @anowarali668 a year ago

      @@brightsideofmaths Is it not against the inner product formula, since we know ⟨ax, y⟩ = a⟨x, y⟩ and ⟨x, by⟩ = b*⟨x, y⟩?

    • @brightsideofmaths
      @brightsideofmaths  a year ago

      @@anowarali668 What is not against it?

    • @anowarali668
      @anowarali668 a year ago

      @@brightsideofmaths l(x^)

    • @brightsideofmaths
      @brightsideofmaths  a year ago

      @@anowarali668 As I said: we defined the inner product with the property ⟨x, by⟩ = b⟨x, y⟩

  • @TheWombatGuru
    @TheWombatGuru 3 years ago

    Thank you for this video :)

  • @xwyl
    @xwyl 2 years ago

    With your constructed x\hat, the proof goes like a knife through butter. But it raises a bigger question: how did you come up with the construction?

    • @brightsideofmaths
      @brightsideofmaths  2 years ago

      Thanks. We know the start and the goal. One just tries to fill in the gaps and finds x_l.

    • @xwyl
      @xwyl 2 years ago +1

      @@brightsideofmaths I'm trying to understand this without any construction (for these constructions were perhaps invented after the theorem was proven, and may hinder deeper understanding).
      The original inspiration may be the Euclidean space R^n. Consider a vector r in R^3, r = (x,0,0) + (0,y,0) + (0,0,z). When we study l(r), taking l((x,0,0)) for example, the linearity of l implies that l((x,0,0)) is just a multiple of x; therefore l(r) is just an inner product ⟨x_l, r⟩. This also implies that dim(ker(l)) = n-1 for the space R^n.
      Knowing that x_l exists, any vector is the sum of a part parallel to x_l and a part orthogonal to it. Then it's natural to propose the unit parallel component x\hat (meaning x_l is a multiple of x\hat), and the parallel part is easily l(x)/l(x\hat) * x\hat = \lambda * x\hat.
      The next big leap is 6:17, where ⟨x_l, x⟩ is miraculously put there. It's natural to approach from l(x) = ⟨x_l, x⟩: comparing with the equation l(x) = \lambda * l(x\hat), and knowing that x_l is a multiple of x\hat, say x_l = a*x\hat, we finally get ⟨a*x\hat, x⟩ = \lambda * l(x\hat), and solve for a = l(x\hat), i.e. x_l = a*x\hat = l(x\hat)*x\hat. And this process can be generalized to Hilbert spaces.
      Sorry for the messy writing, but the reasoning is completely natural without any prior construction, all from what we already have in the derivation. I prefer this derivation for it's more basic and learner-friendly.
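The solve-for-a step in this comment can be verified numerically. A minimal Python sketch in R^3 (the functional l(x) = ⟨v, x⟩ and the test vector are illustrative choices, not from the video or the comment):

```python
# Write x = lambda * x_hat + k with k in ker(l), then check a = l(x_hat)
# makes <a * x_hat, x> equal to l(x).
def dot(u, w):
    return sum(ui * wi for ui, wi in zip(u, w))

v = (1.0, 1.0, 2.0)
l = lambda x: dot(v, x)
x_hat = tuple(vi / dot(v, v) ** 0.5 for vi in v)   # unit vector parallel to x_l

x = (2.0, -3.0, 1.0)
lam = l(x) / l(x_hat)                   # coefficient of the parallel part
k = tuple(xi - lam * xh for xi, xh in zip(x, x_hat))
assert abs(l(k)) < 1e-12                # the remainder lies in ker(l)

a = l(x_hat)                            # the coefficient solved for above
assert abs(dot(tuple(a * xh for xh in x_hat), x) - l(x)) < 1e-12
print("a = l(x_hat) gives <a*x_hat, x> = l(x)")
```

This mirrors the comment's chain: the parallel part carries all of l(x), and the orthogonal remainder is invisible to both l and the inner product with x_hat.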

  • @Zero-es-natural
    @Zero-es-natural 3 years ago

    Great video!

  • @jaimelima2420
    @jaimelima2420 3 years ago

    Thank you so much!

  • @zazinjozaza6193
    @zazinjozaza6193 3 years ago +2

    Wow this was a really cool topic, can't wait to see the applications.

  • @RohanKumar-zn4qg
    @RohanKumar-zn4qg 3 years ago

    Can you share your slides?

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +1

      Oh sorry! I totally forgot. Now they are all in :) steadyhq.com/en/brightsideofmaths/posts/c6641292-1666-4a24-a4b9-cd9c4147d7d3

    • @RohanKumar-zn4qg
      @RohanKumar-zn4qg 3 years ago

      @@brightsideofmaths It is asking for member access... please share on some open platform

    • @brightsideofmaths
      @brightsideofmaths  3 years ago +2

      @@RohanKumar-zn4qg PDFs are a perk for my Steady members.