The AI field is becoming really dynamic. A lot is changing as we move from traditional machine learning to generative AI, and we need to keep updating ourselves as we go.
Join my Telegram group, where I post and discuss this kind of content. Happy Learning!!
Make sure you have Telegram installed.
t.me/+V0UeLG8ji-F8ThNb
Hi, do we have an open-source model that can help us select something from an available list based on requirements, or is there any other way to do this?
Sir, we don't want to use a pre-trained model; we want to fine-tune these models with our custom data.
Sir, I have a basic question related to Prompting. Does any learning (with model weight update) happen during Prompting? If not, then how does the model learn from Few Shot Prompting?
Your videos are awesome. If you could kindly make a video on RLHF with the code, it would be greatly helpful.
Hi sir, could you please help me with the error I'm facing when running the model?
File "C:\Users\tarun\anaconda3\envs\venv\lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 534, in _run_script
exec(code, module.__dict__)
File "C:\Users\tarun\llm\app.py", line 57, in
st.write(getllamaresponse(input_text,no_words,blog_style))
File "C:\Users\tarun\llm\app.py", line 28, in getllamaresponse
response=llm(prompt.format(style=blog_style,text=input_text,n_words=no_words))
File "C:\Users\tarun\anaconda3\envs\venv\lib\site-packages\langchain_core\prompts\prompt.py", line 132, in format
return DEFAULT_FORMATTER_MAPPING[self.template_format](self.template, **kwargs)
File "C:\Users\tarun\anaconda3\envs\venv\lib\string.py", line 161, in format
return self.vformat(format_string, args, kwargs)
File "C:\Users\tarun\anaconda3\envs\venv\lib\site-packages\langchain_core\utils\formatting.py", line 18, in vformat
return super().vformat(format_string, args, kwargs)
File "C:\Users\tarun\anaconda3\envs\venv\lib\string.py", line 165, in vformat
result, _ = self._vformat(format_string, args, kwargs, used_args, 2)
File "C:\Users\tarun\anaconda3\envs\venv\lib\string.py", line 205, in _vformat
obj, arg_used = self.get_field(field_name, args, kwargs)
File "C:\Users\tarun\anaconda3\envs\venv\lib\string.py", line 270, in get_field
obj = self.get_value(first, args, kwargs)
File "C:\Users\tarun\anaconda3\envs\venv\lib\string.py", line 227, in get_value
return kwargs[key]
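The traceback above ends inside `string.Formatter.get_value` (`return kwargs[key]`), which is where `str.format` raises a `KeyError` when the template names a placeholder that was not supplied, or when `PromptTemplate`'s `input_variables` don't match the `{placeholders}` in the template. A minimal sketch of the failure mode; the template and variable names below mirror the app code in the traceback, but they are a hypothetical reproduction, and the exact missing key depends on the real template:

```python
# Hypothetical template with the same placeholders as the app code above.
template = "Write a {style} blog on {text} within {n_words} words."

# Omitting any named placeholder makes str.format raise KeyError from
# string.Formatter.get_value: the last frame in the traceback above.
try:
    template.format(style="technical", text="Llama 2")  # n_words missing
except KeyError as missing:
    print("KeyError:", missing)

# Fix: supply every variable named in the template, and make sure the
# PromptTemplate's input_variables list matches the {placeholders} exactly.
prompt = template.format(style="technical", text="Llama 2", n_words=100)
print(prompt)
```

So the first thing to check is that every `{name}` in the template string appears in the keyword arguments passed to `prompt.format(...)` (and in `input_variables`), with identical spelling.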
Hello sir, PW student here. First of all, I would like to thank you for the amazing ML classes you took for PW. They were really amazing, and the way you taught was very easy and simple. Looking forward to learning the rest from you here.
What is PW?
@@riyayadav8468 😮 Physics Wallah
@@riyayadav8468, Physics Wallah
It is taking a lot of time to execute. Can somebody tell me a solution for that?
Because of your efforts, I got a job as an Associate Data Scientist. Thank you so much for your support, sir. You are my motivation for everything....
Hey, can you please tell me how you downloaded the model on your system?
Great, Sir. I am a full-stack developer looking forward to getting involved in the data science field, and I am going to follow your The-Grand-Complete-Data-Science-Materials and Roadmap-To-Learn-Generative-AI-In-2024.
My 2024 target is to only follow you. 😊
Thank you, GURU JI 🙏
Hi Krish, thank you so much for uploading all the best resources to keep us up to date. I have been your follower for many years. Because of your resources I got into a data science job two years back, and I keep upgrading myself with your videos and encouragement. Thank you.
that's great 😊
Thank you, Krish. Please continue to create content like this!
Even though I am not from a dev background, I am still able to understand the entire open-source LLM model. Nice video, keep it up! (Y)
I am happy that I found you on UA-cam. You are doing a great job. Keep going.
Thank you very much, You have made my Project way easier by the way you explained everything!
Please keep doing such videos!! Kudos to your work and patience.
That was great fun, Sir. Really well structured! Thanks a lot!
I really love the way you explain, Krish. I am a big fan of your teaching style and content too.
Your contributions to the deep learning field are amazing. Thank you.
Eagerly waiting for fine-tuning such a model on our own data.
wow @krishnaik06!!! Thank you for this. Just in time!! I had been looking for Llama tutorials and here you are :)
Thanks for making it simple yet comprehensive. Look forward to more such videos. :)
Thanks for making this video; it is very useful for implementing real-world LLM use cases, which of course we are working on currently.
Very good and informative video. Thank you for your time. I highly appreciate your efforts.👍
Thank you, Krish. I am from a QA background, but your videos helped me a lot to learn AI. You are my AI Guru :)
Thank you, sir. Your video has every detail we need for using an LLM model.
Finally, Meta has also created their own LLM models.
Sir, your knowledge, confidence, and content are phenomenal. Salute to your great efforts. :)
Hey Krish, this is a treasure. Please do videos on fine-tuning and deployment also.
This is a great way to learn, Krish. Keep doing your great work.
Hi Krish,
I like this approach to the training. It gives you a good background.
Please, can you tell me how you downloaded the model on your system?
Please keep going with this format for other models as well. Thanks.
I am learning about data science and AI. Thank you very much, sir; your videos really help me work on projects and also understand the future workplace in AI and data science.
This is my best learning till today. Thank you so much, sir.
Thank you for making this amazing video on open-source Llama 2; really helpful for programmers looking for free GPT alternatives.
Great explanation, sir. Much love from Sri Lanka ❤
Sir, it's a very good video to learn about LLMs and to know how to build end-to-end projects from these models.
Thanks a lot for your amazing videos! I learn a lot from you. Love from Togo!
The video tutorials are good. A quick suggestion: could you move your image somewhere else? It was on top of some of the commands, making them invisible.
Haha, your predicted stats have been smashed. Cheers to seeing more videos around working with LLMs.
Your videos deserve more than 100000000 comments and likes.
I can't believe this is available for free
ALWAYS BIG THINGS AND EXPLANATIONS, THANK YOU
Thank you for teaching us... Very helpful for other domains too.
Can you please make a tutorial on using LLMs to augment textual data on private data, where no data will be reused to train LLM models? That would be a great help!
Really cool tutorial. You made it easy to understand. Hats off!
It already met your target. Keep teaching!
Quite good, Sir, and we want more like this.
It's a really great initiative. You are doing very well.
Sir, please also make videos on using pre-trained models like YOLO and COCO-SSD, so that we can dive deep into how to build projects using them.
Hi Krish,
Thanks for your educational videos.
Hello Sir, thank you for such a quick and concise tutorial on Llama.
I watched till the end and coded along side by side, BUT the output in the Streamlit app is taking forever to generate the required text.
I hope to see your future video tutorials on AI-based chatbots using Python. Thanks.
Thanks Krish, this is the way ahead with ChatGPT-style LLMs.
Thank you sir for this wonderful and fruitful video
This video is based on pre-trained models like Llama 2, which have been trained on vast datasets and can generate high-quality text with minimal setup. Since the model is already trained, you don't need extensive computational resources.
This use case suits applications where the quality of generation is paramount and the context or data to be retrieved is general or contained within the LLM's training data.
However, could you please create a video on a customized Retrieval-Augmented Generation (RAG) project? That would be ideal for applications where the retrieval needs are specific and the data sources are unique or require customized processing (e.g., legal documents, technical manuals, news articles).
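The retrieve-then-generate loop being asked for can be sketched in a few lines. This is a toy illustration only: keyword-overlap retrieval stands in for a real vector store, `generate()` is a placeholder for an actual LLM call, and the document texts are invented examples.

```python
# Invented mini-corpus standing in for legal docs, manuals, etc.
docs = [
    "Llama 2 is an open-source LLM released by Meta.",
    "Streamlit lets you build data apps in pure Python.",
    "LangChain provides prompt templates and chains for LLM apps.",
]

def retrieve(query, corpus, k=1):
    """Return the k documents sharing the most words with the query
    (a stand-in for embedding similarity search)."""
    q = set(query.lower().split())
    ranked = sorted(corpus,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return ranked[:k]

def generate(prompt):
    # Placeholder for a real LLM call (e.g. a local Llama 2 model).
    return f"[LLM answer grounded in: {prompt!r}]"

query = "Who released Llama 2?"
context = "\n".join(retrieve(query, docs))
answer = generate(f"Context:\n{context}\n\nQuestion: {query}")
print(answer)
```

In a real customized RAG system, the retriever would be a vector store built over your own documents, which is exactly what makes it suitable for domain-specific sources the base model never saw.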
Best video on Llama.
Great lecture, and great effort. Thank you very much.
It is really a nice initiative by you
This is a great video,
but please also make a video about how to connect an NVIDIA GPU with conda or Python.
Thanks for the wonderful video. Can you also create a video on the deployment of such large models?
Thank you for uploading an end-to-end project. Can you make videos on other LLMs like Falcon, Jurassic, and LlamaIndex?
This is nice. However, can you create an end-to-end video project like network log analysis or DB log analysis for enterprise customers, so that it is more meaningful?
No need to stop it in the terminal; if you refresh the webpage, it will fetch the latest changes you made in Visual Studio.
Hi, how do I solve the "No module named 'langchain_community'" error?
Please help.
Thank you for sharing this valuable knowledge
Hi Sir, fantastic video. Keep it up
Good, sir. These videos are very helpful. Nice!
Hi Sir, please create a video on optimization and quantization techniques for LLMs.
Thanks, Krish, for this series. Can you also make a video on the Hugging Face RagModel and RagRetriever for generative AI?
Thanks a lot, sir. Nice initiative; please keep it up.
Amazing video, Thanks Krish
It was a great project! Could you please create a video on integrating the same project with an API?🙂
Did you find any other resources about it? Because I don't want to download such a big model onto my local machine.
Great content! thank you Krish
Good project🎉
Great video, love from Pakistan sir...
This is an awesome format.
Thank you so much. I love all your Videos.
Great share sir.
Many many thanks
Amazing... it is really good.
I want to understand the future of LLMs and GenAI for developers. What kind of work will we get? Using other AI models to create chatbots, images, videos?
Good project. Thanks Krish
make a video on how to train llama2 on your custom data on local machine and make fastapi deploy on azure
Incredible stuff, keep going ✌
Hi Krish! I learned data science through your videos and am now working as a data scientist. Big thanks! Can we use QA answer models without OpenAI, or anything free of cost?
One of the best videos.
Good job, man. Thank you!
Great, Sir. Thanks for this wonderful video. Please continue reading----
I have built chatbots using Llama 2 13B via the API only. I built two bots: one for normal text generation, and another for uploading PDF and text documents (you made a video on this using OpenAI) and asking Q&A on the document. Both bots perform well. 😊
But I tried to build the same document Q&A for a CSV file, and it does not perform well with CSV (when I use the CSV agent, it works great). If possible, please make a video on Q&A using a CSV file.
Also, please make a video that uses the API instead of downloading the model and running it locally (I don't have the capability to run it locally).
Thanks again.
How did you build this chatbot using the API only?
Really enjoyed learning from you. Could you please tell us how we can use this LLM's API?
I hope the next video will be as open source as possible. If you are going to use the cloud, please make sure it's free so that we can code along with you. Again, thank you so much.
Nice initiative ❤
Thank you so much for amazing contents
You forgot the "f" in the formatting string:
template = f"""
Text
"""
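One caveat on the suggestion above: an f-string interpolates at definition time, whereas a template meant to be filled later with `.format(...)` (which is how LangChain's `PromptTemplate` consumes its template string) should stay a plain string with `{}` placeholders. A quick contrast, with hypothetical variable names:

```python
n_words = 100

# f-string: interpolated immediately, at the moment this line runs.
immediate = f"Write a blog within {n_words} words."

# Plain string: the placeholder survives and is filled later by .format(),
# which is what PromptTemplate needs to substitute values at call time.
deferred = "Write a blog within {n_words} words."

print(immediate)                      # uses n_words = 100
print(deferred.format(n_words=200))   # filled at call time with 200
```

So whether the "f" is actually missing depends on how the template is used; for a `PromptTemplate`, the plain string is usually what's intended.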
Superb video. Thanks, Krish. Just a question: How do you use a finetuned model adapter to build the application? A video on this would be greatly helpful.
Great effort
Thank you for the great video
Thanks for making this video
Make more project videos please!
NICE WORK SIR
Can you point me to the video where you deploy this to AWS, as you mentioned in the video? I created a similar streamlit app for my use case but want to deploy it to AWS to reduce the latency of the generation. Looking forward to hearing from you!
For your effort🎉
Sir, it's a very important video. Nice job, Sir. Can I add it to my website if I use WordPress?
Get this man to 100 likes
Really great tutorial
Please make more videos using open-source LLMs.
That was really helpful.
Sir, we also need a video on using RAG with these models.