Such a refreshing feeling while watching a young and brilliant mind presenting AI ... Thank you for your videos and work, much appreciated! And I will now subscribe on your patreon too!. If you celebrate Diwali/Deepavali tomorrow, I wish you happy celebration with your beloved ones!
Thank you so much for the kind words! This has to be one of the best comments I’ve received.
Happy Diwali to you too! If you celebrate Halloween, Happy Halloween!
Thank you for explaining transformer-based LMs with code and concepts in such a simple manner! Hard to find such tutorials these days!
Because I saw your upload yesterday, I thought to myself, hmmm, why not? Why don't I try to understand Transformers? So I began the journey by watching your first 4 videos, then I watched 2 lectures online, then I kinda watched the 3b1b playlist on deep learning and transformers, then I read the paper, and now I'm back here, finally ready to understand what you are talking about. You were very helpful and a key motivator for this journey. Thank you very much. You deserve FAR MORE subscribers. Your videos are awesome and well produced.
Woahhh! Very impressive! Amazing dedication.
This became one of my most fav channels on Neural Networks. The other favs are by Karpathy and HeduAI.
Thanks a lot! I am glad to be on that list. :)
You are a really great Machine Learning teacher, from the math to language models. This is cool.
Thanks for the kind words! 🙏
You are good. Keep it up. Subscribed and Liked!
Thanks!
This video is full of information. Let me request one thing here: I'm not a beginner, but I still had to struggle with a lot of concepts to understand this, which could be because you have to explain a whole course in a single video. But if it is possible for you to teach this in a simpler way, that would be very helpful even for beginners. Thank you so much for this amazing video❤.
Thanks a lot for the feedback! Striking a balance is definitely something I’m always looking to do. Glad you enjoyed it overall.
Excellent contribution 👍
Thanks for the appreciation!
I like your explanations.
This guy deserves to be viral.
Bro, can you make a video on image-to-3D object generation using neural radiance fields?
Great idea. I'll add NeRFs to my TODO list. Fwiw, I do have a video on the channel that covers the theory and concepts behind some of the seminal works in the field of NeRFs. Here is the link:
Understanding Zip-NeRF - a cool new AI algorithm for 3D scene synthesis
ua-cam.com/video/BE_kimatpnQ/v-deo.html
It's one of the earliest videos on the channel, so it might feel a bit janky coz I was figuring things out from scratch back then (I am still figuring things out now, tbh). But hopefully you'll find the explanations interesting and get references to some nice repos and papers!
Great work! Could you please describe a little more about the book "Neural Attention"?
Thanks! I am not sure which book you are talking about. I am not much of a reader :(
🎉
🙏🙏
This is an implementation of a transformer from scratch, right??
Pretty much! So a traditional transformer is a sequence-to-sequence model with an encoder-decoder architecture. This implementation covers the "decoder" part of the transformer, which is the sequence generation part only. Modern language models like GPT (and successors) use the same decoder architecture that is implemented in the video!
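For a rough picture, here is a minimal sketch of one such decoder block in PyTorch. The dimensions, names, and hyperparameters are illustrative, not taken from the video; the key ingredient is the causal mask, which is what makes a decoder "generation only":

```python
# A minimal sketch of a decoder-only transformer block (illustrative, not the video's code).
import torch
import torch.nn as nn

class DecoderBlock(nn.Module):
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        # Self-attention over the sequence itself (no encoder cross-attention).
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)
        # Position-wise feed-forward network.
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )

    def forward(self, x):
        # Causal mask: True entries are blocked, so token i can't attend to tokens after i.
        seq_len = x.size(1)
        mask = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)
        attn_out, _ = self.attn(x, x, x, attn_mask=mask)
        x = self.ln1(x + attn_out)    # residual connection + layer norm
        x = self.ln2(x + self.ff(x))  # residual connection + layer norm
        return x

# Quick smoke test: batch of 2 sequences, 10 tokens each, 64-dim embeddings.
block = DecoderBlock()
out = block(torch.randn(2, 10, 64))
print(out.shape)  # torch.Size([2, 10, 64])
```

Stacking several of these blocks on top of a token embedding layer, then adding a final linear layer over the vocabulary, gives you the GPT-style architecture in miniature.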
@@avb_fj Can you do a transformer implementation from scratch? Your explanation is clear.
Finally, some good phukin food. Let's go.
Haha! Appreciate it!
I have one PDF of 125 pages in total, with each page range covering a different topic. For example, pages 10 to 20 cover Health, then pages 21 to 30 cover Education.
First, I need to store this 125-page PDF's data in a vector database.
Then the actual requirement is: if the user gives a topic such as Health, I need to retrieve the whole content from pages 10 to 20 and turn it into a simple blog.
How do I make this possible? Suggest me the best approach to start this, bro.
I made a video on RAGs recently which might be of interest to you.
ua-cam.com/video/OHh_SByRYmQ/v-deo.html
I'll also suggest that if you've got just 125 pages, that is not that big, so you don't need the complexity of interacting with vector databases right away. I'd start by storing the data in RAM as a dataframe and operating on it with Pandas. Add a column to specify which topic the docs belong to (Education, Health, etc). Go with sentence-level or page-level chunking to begin with and build the full quick-and-dirty end-to-end pipeline first. You can then play around with different embedding & retrieval techniques to iterate on alternative approaches. Finally, once you are happy with the general approach, you can switch to vector databases if you need to scale to larger documents.
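For example, here's a quick sketch of what that in-memory Pandas setup could look like. It assumes sentence-transformers as the embedder; the model name, page contents, and the retrieve helper are all illustrative choices, not the only way to do it:

```python
# Quick-and-dirty in-memory retrieval with Pandas, no vector database needed.
import numpy as np
import pandas as pd
from sentence_transformers import SentenceTransformer  # one possible embedder choice

# One row per page, with a topic column so you can filter before retrieval.
# In practice, fill "text" with the text extracted from each PDF page.
df = pd.DataFrame({
    "page": range(10, 31),
    "topic": ["Health"] * 11 + ["Education"] * 10,
    "text": ["...page text extracted from the PDF..."] * 21,
})

# Embed every page once, up front.
model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model name
df["embedding"] = list(model.encode(df["text"].tolist()))

def retrieve(query: str, topic: str, top_k: int = 5) -> pd.DataFrame:
    # Filter to the requested topic first, then rank pages by cosine similarity.
    subset = df[df["topic"] == topic].copy()
    q = model.encode([query])[0]
    embs = np.stack(subset["embedding"].to_numpy())
    sims = embs @ q / (np.linalg.norm(embs, axis=1) * np.linalg.norm(q))
    subset["score"] = sims
    return subset.sort_values("score", ascending=False).head(top_k)

# Usage: retrieve("healthy diet tips", topic="Health")
# then pass the retrieved pages to an LLM to draft the blog post.
```

Since the topic-to-page mapping is already known here, the topic filter alone gets you the whole Health section; the similarity ranking only matters once you want finer-grained retrieval within a topic.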
Thanks bro ❤. How do I contact you for your guidance? I definitely will not disturb you much 😅
Dude it's illegal for this to be free