Meta Llama 3.1 - Easiest Local Installation - Step-by-Step Testing
- Published 12 Sep 2024
- This video shows how to locally install Meta Llama 3.1 8B model and test it on various benchmarks.
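For reference, the local install the video demonstrates boils down to loading the model with Hugging Face `transformers`. A minimal sketch, assuming you have installed `torch`, `transformers`, and `accelerate` via pip, and logged in with a Hugging Face token that has access to the gated `meta-llama` repository (the exact steps in the video may differ):

```python
# Minimal sketch: load Meta-Llama-3.1-8B-Instruct with Hugging Face transformers.
# Assumes: pip install torch transformers accelerate, plus huggingface-cli login
# with a token approved for the gated meta-llama repo.

def build_chat(user_message):
    """Build the chat-message list that transformers' text-generation pipeline accepts."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": user_message},
    ]

def main():
    # Heavy imports kept inside main() so the helper above stays importable
    # without torch installed.
    import torch
    from transformers import pipeline

    pipe = pipeline(
        "text-generation",
        model="meta-llama/Meta-Llama-3.1-8B-Instruct",
        torch_dtype=torch.bfloat16,  # full (unquantized) weights, ~16 GB of VRAM
        device_map="auto",           # uses the GPU if present, else falls back to CPU
    )
    out = pipe(build_chat("What is the capital of France?"), max_new_tokens=64)
    # With chat input, generated_text is the conversation; the last turn is the reply.
    print(out[0]["generated_text"][-1]["content"])

# main()  # uncomment to run; downloads roughly 16 GB of model weights
```

This is the full-precision route; Ollama (discussed in the comments below) ships a smaller quantized build instead.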
🔥 Buy Me a Coffee to support the channel: ko-fi.com/fahd...
🔥 Get 50% Discount on any A6000 or A5000 GPU rental, use following link and coupon:
bit.ly/fahd-mirza
Coupon code: FahdMirza
▶ Become a Patron 🔥 - / fahdmirza
#llama3.1 #llama405b
PLEASE FOLLOW ME:
▶ LinkedIn: / fahdmirza
▶ YouTube: / @fahdmirza
▶ Blog: www.fahdmirza.com
RELATED VIDEOS:
▶ Code www.fahdmirza....
▶ Resource huggingface.co...
All rights reserved © 2021 Fahd Mirza
🔥Meta Llama 3.1 - Easiest Local Installation ua-cam.com/video/2PalnkYmux0/v-deo.htmlsi=K4EpvMBf2nrpWlPP
🔥Most Rational Coverage of Llama 3.1 405B Model ua-cam.com/video/MKgV6ZORXg0/v-deo.htmlsi=AfwQ5eXfyjhRPmjm
Thanks a million for your amazing videos. It is astonishing to see my machine screaming for power, and I really love it. Is there a follow-up video on this coming shortly? I would like to see how I could connect the setup you demoed to interesting frontend applications, Open WebUI being one of them, or even hosting that Open WebUI on the internet! Thanks, Fahd Mirza, for your good work.
Thank you so much. It really helped me.
Glad to hear that!
Thanks for the video! Could you please tell us what versions you used for Python, torch, CUDA and so on? I couldn't make it work with my GPU even though I have an NVIDIA GeForce GTX 1650 Ti, so I installed the CPU-only version of PyTorch, but the response is extremely slow.
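A quick way to diagnose this kind of problem is to check whether the installed PyTorch build can actually see the GPU; a CPU-only wheel will run, just very slowly. A minimal sketch (the helper takes the `torch` module as a parameter, which is an illustrative choice, not anything from the video):

```python
# Minimal sketch: report whether a torch build has usable CUDA support.
# A GTX 1650 Ti needs a CUDA-enabled PyTorch wheel, not the CPU-only build.

def cuda_summary(torch_module):
    """Return a short description of the given torch module's CUDA support."""
    if torch_module.cuda.is_available():
        # CUDA build and a visible GPU: report the device name.
        return "CUDA OK: " + torch_module.cuda.get_device_name(0)
    # CPU-only wheel, missing driver, or unsupported GPU.
    return "CPU-only (reinstall torch with a CUDA wheel for GPU inference)"

# Example usage:
# import torch
# print(torch.__version__, cuda_summary(torch))
```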
Thanks for sharing!
Thanks for watching!
I finished the download successfully, but now I want to uninstall it. How do I do that?
Why uninstall?
Thanks Fahd for this great video
I've installed all the packages and I can run the program without errors (after several fights with library conflicts), but it takes too much time to print the response. Why? Is it because I'm using the CPU? I don't have an NVIDIA card in my laptop. Is there a way to make it faster?
Thank you again Fahd
very welcome, thanks.
Amazing. Thank you, brother!
My pleasure!
Hi. Easiest install? Ollama works with Llama 3.1, and you can use the Page Assist Chrome extension for a nice UI.
Yes, this is the easiest install of the full model. Ollama uses a quantized format of the actual model, in fact a very trimmed-down version; this video installs the full model.
@@fahdmirza ok. Got your point. Thx
@@RABRABB very welcome, thanks
I will use the web version; why do I need to spend 6 GB of data?
Can I run Llama 3.1 in Python (VS Code or Google Colab)? If I can, could you tell me how and what to put in the terminal for that?
Yes you can, I just did a video.
It's literally the exact same thing he does in this video lol
It can't run on regular home machines. So what's the point?
You can try running it with Ollama on CPU. I have done another video on it. Thanks.
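The Ollama route mentioned here is roughly the following (a sketch, assuming Ollama is installed from ollama.com; note that Ollama pulls a quantized GGUF build, not the full-precision weights installed in the video):

```shell
# Sketch: run a quantized Llama 3.1 8B via Ollama; works on CPU, just slowly.
# Guarded so the commands are skipped when Ollama is not installed.
if command -v ollama >/dev/null 2>&1; then
    ollama pull llama3.1          # downloads the default quantized 8B build
    ollama run llama3.1 "Hello"   # inference runs on CPU when no GPU is found
else
    echo "ollama not installed"
fi
```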
Hi, thanks for your video. I'm sorry, but I find it very difficult to follow. Please describe each step and the tools you are using, as not everyone is a software engineer.
This video does not explain anything?!
👎
OK, can you please say what explanation you were after? It would be good to have some constructive feedback here. Thanks.