Impressive fast talking and fast scrolling. A lot of knowledge and experience for sure. I guess I'll have to do some digging if I want to really benefit from this lecture.
Great work! I will definitely follow up on your website. Some of the clustering results are really remarkable. Have you thought about hierarchical clustering on the embeddings to see if a sensible taxonomy emerges?
What does the semantic vectorization of a word look like, in a mathematical sense? Is it like every word has its spatial ID (coordinate) and gets kind of multiplied with a vector array of associated IDs?
I’m totally new to embeddings and this video inspired me to want to learn even more!
These 38 minutes changed my next 38 years
How so? I'm still at the beginning
This has been a timely and really useful presentation. Thank you for posting it!
very great demo, thanks for sharing!
this is an excellent example of practical use of embeddings.
Thanks Simon for sharing your knowledge. This video is so underrated.
"vibes-based search" lol. love the term you invented.
Things start to become magical.
That was awesome. Thank you for uploading it!
Great talk. Thoroughly enjoyed it. Thanks!
A nice and very inspiring presentation! Thank you!
Brilliant!!! (as always)
Great talk. Very interesting subject.
What a genius
damn!!!
Can you show us some Imagebind unix-fu?
Why are you wearing a mask 😷?
Maybe he doesn’t want to get people sick, genius