Categories 3: Natural transformations

  • Published 15 Dec 2024

COMMENTS • 20

  • @bideshghosh5899 • 3 years ago • +21

    It's our pleasure to learn from you, Sir. You are great at teaching as well. We are really lucky. Thank you 🙂

  • @franciscusrebro1416 • 3 years ago • +11

    Excellent presentation, as always. I really hope a video on the Yoneda lemma is coming!

  • @zubin8010 • 3 years ago • +8

    Wow! Coincidentally, we just discussed the idea of the dual of a vector space in linear algebra this morning!
    We went over some of the exact same facts, for finite-dimensional vector spaces: that V and V* are isomorphic, but that this isomorphism is not "natural", whereas the isomorphism between V and V** is.
    I still don't fully understand duals (I think I will need to write down and fiddle with many more examples) but I'm just happily surprised to see the same idea show up here, and in my linear algebra class, at nearly the same time :)

    • @98danielray • 3 years ago • +2

      If your vector space has an inner product and is a complete metric space (or is finite-dimensional), there is a really cool interpretation of the (topological) dual. In that case, each dual linear functional can be identified with the inner product against a fixed element (for instance, the functional on R^3 given by f(x,y,z) = ax+by+cz can be uniquely associated with the inner product against the vector (a,b,c)). Likewise, since the inner product is similar to a "shadow" or projection of one vector onto another, you can think of elements of the dual as tools for measuring vectors: they take a vector and measure the length of its shadow times the norm of the fixed element.
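
      A minimal numeric sketch of this identification, assuming NumPy; the fixed vector `w` and the sample numbers are made up purely for illustration:

```python
import numpy as np

# The functional f(x, y, z) = a*x + b*y + c*z is "inner product against (a, b, c)".
w = np.array([2.0, -1.0, 3.0])             # the fixed element (a, b, c)
f = lambda v: float(np.dot(w, v))          # the dual functional <., w>

v = np.array([1.0, 4.0, 0.5])
print(f(v))                                # -0.5

# "Shadow" reading: f(v) = (signed length of v's shadow on w) * (norm of w).
shadow = np.dot(w, v) / np.linalg.norm(w)  # signed length of the projection of v onto w
print(shadow * np.linalg.norm(w))          # the same value, up to rounding
```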

    • @ILikeMario1 • 3 years ago • +1

      Fun fact: once you learn about inner products (the dot product is the prototypical example), it turns out that the non-degenerate inner products are in bijection with the isomorphisms V --> V*.
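
      Spelled out (a standard fact, stated here for a finite-dimensional V), the correspondence sends a non-degenerate bilinear form B to

      $$ B \;\longmapsto\; \bigl(v \mapsto B(v,\,\cdot\,)\bigr)\colon V \to V^{*}, $$

      and non-degeneracy of B is exactly what makes this map injective, hence an isomorphism when V is finite-dimensional.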

    • @wreynolds1995 • 3 years ago • +3

      The following thought process helped me understand this stuff. If you choose a basis in V, then the elements of V can be thought of as column vectors: the entry in the i-th place is the coefficient of the i-th basis element. Then the dual space V* can be thought of as the space of row vectors of the same length, and applying a linear functional to an element of V is just matrix-multiplying the corresponding row and column vectors (which you'll note is the same as calculating an inner product). Moreover, from this viewpoint, the isomorphism between V and V* is exactly "taking the transpose". Obviously, this isomorphism only makes sense to talk about because you chose a basis in the first place. But now suppose you take the transpose and take the transpose again; then it doesn't matter what basis you chose to begin with, you must get the same vector back because that's how "taking the transpose" works.
      Now I recommend you take the time to really understand why the application of a linear functional in V* to an element of V should so closely resemble an inner product calculation (hint: dual basis), and understand the details of the claim "the isomorphism is exactly "taking the transpose"." I hope this helps.
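
      A small NumPy sketch of the row-vector/column-vector picture described above; the particular numbers are only illustrative:

```python
import numpy as np

v = np.array([[1.0], [2.0], [3.0]])   # element of V as a column vector (a basis was chosen)
f = np.array([[4.0, -1.0, 0.5]])      # element of V* as a row vector

# Applying the functional is matrix multiplication, which is the same arithmetic
# as an inner product of the two coefficient lists:
print(f @ v)                          # [[3.5]]
print(np.dot(f.ravel(), v.ravel()))   # 3.5

# The basis-dependent isomorphism V -> V* is "take the transpose"; transposing
# twice returns the original column vector regardless of the chosen basis.
print(np.array_equal(v.T.T, v))       # True
```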

    • @yaeldillies • 3 years ago • +4

      What really clicked for me is that `V*` is really just `V → ℝ`, the type of maps from `V` to `ℝ`. Similarly, `V**` is `V* → ℝ`, which is `(V → ℝ) → ℝ`. So now you want a "nice" function `V → V**`, that is, a function `V → ((V → ℝ) → ℝ)`. What in the world could it be... What if you just took `v : V` and `f : V → ℝ`, and returned `f v : ℝ`? That does the job! And it's canonical!
      Of course, ℝ here has no meaning further than being the ring of scalars of your vector space/module. You can replace it by anything you want and you'll still get that homomorphism. For the isomorphism, a bit more is required.
      People in type theory would write `V → ((V → ℝ) → ℝ)` as just `V → (V → ℝ) → ℝ` because, as it turns out, `a → b → c` is very much the same thing as `b → a → c`, both meaning "I take (something of type) `a` and (something of type) `b`, and return (something of type) `c`".
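
      A minimal Python sketch of that canonical map; the name `eval_at` and the toy functional are just illustrative, and nothing here enforces linearity:

```python
from typing import Callable, TypeVar

V = TypeVar("V")   # the "vectors" (any type)
R = float          # the scalars, standing in for ℝ

# The canonical map V -> ((V -> R) -> R): take v, then take a functional f, return f(v).
def eval_at(v: V) -> Callable[[Callable[[V], R]], R]:
    return lambda f: f(v)

# Usage: a toy "linear functional" on pairs, evaluated at a vector.
f = lambda p: 3.0 * p[0] - p[1]
print(eval_at((2.0, 1.0))(f))   # 5.0
```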

    • @zubin8010 • 3 years ago • +1

      Thank you!

  • @hausdorffm • 3 years ago • +3

    6:29 V → V** is not an isomorphism... I do not understand. The trouble is that for a vector space X there does not always exist some vector space Y so that X = Y**. Maybe this trouble occurs in Banach spaces; infinite dimensions give such examples.
    7:54 FG = 1_D: there is a typo, it should be 1_C.
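
    For reference (a standard fact, stated here because it bears on the confusion above), the map in question is the canonical evaluation map

    $$ \mathrm{ev}\colon V \to V^{**}, \qquad \mathrm{ev}(v)(f) = f(v), $$

    which is always injective and is an isomorphism precisely when V is finite-dimensional. For an infinite-dimensional space it is not surjective onto the full algebraic double dual, and for Banach spaces surjectivity onto the continuous double dual is exactly reflexivity.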

    • @hausdorffm • 3 years ago

      I misunderstood! Thanks.

    • @wargreymon2024 • 3 months ago

      Maybe he's referring to finite-dimensional vector spaces.

  • @i6g7f • 2 years ago

    Brilliant explanation and demonstration technique.

  • @rodolfoassereto7626 • 2 years ago • +1

    My professor gave us the definition of a natural transformation without referring to any motivating example (like this one!). Why would he do so?

  • @jimadams8385 • 2 years ago • +1

    Determinants are contravariant! Specify equivalence in Topologic! Spec(T).

  • @migarsormrapophis2755 • 3 years ago • +6

    yeeeeeeeeee

  • @jimadams8385 • 2 years ago • +2

    I think it is Burn-Willy and Grow-ten-dick. Am I right?🙃

  • @brookrangeloff3423 • 1 year ago

    Trans and more

  • @RB_Rajesh • 3 years ago

    Nice
    #we_love_our_nature