It's our pleasure to learn from you, Sir. You are great at teaching as well. We are really lucky. Thank you 🙂
Excellent presentation, as always. I really hope a video on the Yoneda lemma is coming!
Wow! Coincidentally, we just discussed the idea of the dual of a vector space in linear algebra this morning!
We went over some of the exact same facts, for finite-dimensional vector spaces: that V and V* are isomorphic, but that this isomorphism is not "natural", whereas the isomorphism between V and V** is.
I still don't fully understand duals (I think I will need to write down and fiddle with many more examples) but I'm just happily surprised to see the same idea show up here, and in my linear algebra class, at nearly the same time :)
If your vector space has an inner product and is a complete metric space (or is finite-dimensional), there is a really cool interpretation of the (topological) dual. In that case, every dual linear functional can be identified with taking the inner product with a fixed element (for instance, the functional f(x,y,z) = ax+by+cz on R^3 is uniquely associated with taking the inner product with the vector (a,b,c)). And since the inner product behaves like a "shadow", the projection of one vector onto another, you can think of elements of the dual as tools for measuring vectors: such a functional takes a vector and returns the length of its shadow on the fixed vector, times the norm of that fixed vector.
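Concretely, here's a tiny numerical sketch of that identification for R^3 (NumPy; the particular coefficients are just made up):

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0])   # fixed vector that represents the functional
f = lambda x: a @ x              # f(x, y, z) = 2x - y + 3z, i.e. "inner product with a"

x = np.array([1.0, 0.0, 1.0])
print(f(x))          # 5.0
print(np.dot(a, x))  # 5.0 -- applying the functional is the same as taking the inner product with a
```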
Fun fact: once you learn about inner products (the dot product is the prototypical example), it turns out that the non-degenerate inner products are in bijection with the isomorphisms V → V*.
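A minimal sketch of one direction of that correspondence (notation mine): a non-degenerate form B gives the map Φ_B : V → V*, Φ_B(v) = B(v, -), and non-degeneracy is exactly what makes Φ_B injective, hence an isomorphism when V is finite-dimensional.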
The following thought process helped me understand this stuff. If you choose a basis in V, then the elements of V can be thought of as column vectors: the entry in the i-th place is the coefficient of the i-th basis element. Then the dual space V* can be thought of as the space of row vectors of the same length, and applying a linear functional to an element of V is just matrix-multiplying the corresponding row and column vectors (which you'll note is the same as calculating an inner product). Moreover, from this viewpoint, the isomorphism between V and V* is exactly "taking the transpose". Obviously, this isomorphism only makes sense to talk about because you chose a basis in the first place. But now suppose you take the transpose and take the transpose again; then it doesn't matter what basis you chose to begin with, you must get the same vector back because that's how "taking the transpose" works.
Now I recommend you take the time to really understand why applying a linear functional in V* to an element of V should so closely resemble an inner product calculation (hint: dual basis), and to understand the details of the claim that the isomorphism is exactly "taking the transpose". I hope this helps.
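If it helps to see this with actual numbers, here's a small sketch of the row/column picture (NumPy; the vector is arbitrary):

```python
import numpy as np

v = np.array([[1.0], [2.0], [3.0]])   # element of V as a column vector (coefficients in a chosen basis)
phi = v.T                             # the corresponding functional in V*: its row-vector "transpose"

print(phi @ v)                   # [[14.]] -- applying phi to v is row-times-column, i.e. a dot product
print(np.array_equal(phi.T, v))  # True   -- transposing twice gives the original column back
```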
What really clicked for me is that `V*` is really just `V → ℝ`, the type of (linear) maps from `V` to `ℝ`. Similarly, `V**` is `V* → ℝ`, which is `(V → ℝ) → ℝ`. So now you want a "nice" function `V → V**`, that is, a function `V → ((V → ℝ) → ℝ)`. What in the world could it be... What if you just took `v : V`, `f : V → ℝ`, and returned `f v : ℝ`? That does the job! And it's canonical!
Of course, ℝ here plays no role beyond being the ring of scalars of your vector space/module. You can replace it by anything you want and you'll still get that homomorphism. For the isomorphism, a bit more is required.
People in type theory would write `V → ((V → ℝ) → ℝ)` as just `V → (V → ℝ) → ℝ` because `→` associates to the right, so `a → b → c` means `a → (b → c)`. And, thanks to currying, that type is interchangeable with `b → a → c`: both just mean "I take (something of type) `a` and (something of type) `b`, and return (something of type) `c`".
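If you like seeing it executed, here's a minimal sketch of that canonical evaluation map (plain Python; the names are mine):

```python
def to_double_dual(v):
    """The canonical map V -> V**: send v to the functional 'evaluate at v'."""
    return lambda f: f(v)

# Example with V = R^3, a functional given as an ordinary Python function.
v = (1.0, 2.0, 3.0)
first_coordinate = lambda x: x[0]           # a functional in V*
print(to_double_dual(v)(first_coordinate))  # 1.0 -- evaluating "v seen in V**" at that functional
```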
Thank you!
6:29 V → V** is not an isomorphism... I don't understand. The trouble is that for an arbitrary vector space X there does not always exist a vector space Y such that X = Y**. Maybe this trouble occurs for Banach spaces; infinite dimensions give such examples.
7:54 FG = 1_D is a typo; it should be 1_C.
I misunderstood! Thanks.
Maybe he's referring to finite-dimensional vector spaces.
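For what it's worth, a sketch of why finite-dimensionality matters for the purely algebraic dual (setting aside the topological/Banach-space issues above): the canonical map ι : V → V**, ι(v)(f) = f(v), is always injective. If dim V is finite, then dim V** = dim V* = dim V, so ι is an isomorphism; if dim V is infinite, then dim V* > dim V, so ι cannot be surjective.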
Brilliant explanation and demonstration technique.
My professor gave us the definition of a natural transformation without referring to any motivating example (like this one!). Why would he do that?
Determinants are contravariant! Specify equivalence in Topologic! Spec(T).
??? What are you talking about?
yeeeeeeeeee
ye
I think it is Burn-Willy and Grow-ten-dick. Am I right?🙃
Trans and more
Nice
#we_love_our_nature