How To Run ANY Open Source LLM LOCALLY In Linux
- Published 5 Jul 2024
- In this video, I will show you how to run ANY open-source LLM (large language model) locally on Linux using Ollama & LMStudio. Ollama & LMStudio are the best tools that allow you to run various models such as llama3, Gemma, Mistral, codellama & much more. Watch this video and learn how to run LLMs locally on a Linux computer.
Timestamps
00:00 Introduction
00:38 Pre-requisites
01:16 Installing Ollama
02:18 Download LLM
03:01 Testing LLAMA3 & Gemma
05:31 Customizing Model
06:55 Installing LMStudio
Download
Ollama: ollama.com/download
LMStudio: lmstudio.ai/
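For reference, the Ollama part of the video boils down to a few commands. A minimal sketch, assuming a typical Linux machine with curl; the install-script URL is the one from Ollama's download page, and llama3 is just the example model used in the video:

```shell
# Install Ollama via its official install script (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull a model (use lowercase names: llama3, gemma, mistral, codellama, ...)
ollama pull llama3

# Chat with it interactively
ollama run llama3
```

These commands need network access and (for the installer) root privileges, so run them on your own machine rather than copying blindly.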
Relevant Tech Videos
Dual boot ubuntu 24.04 LTS And Windows 11 - • How to Dual Boot Ubunt...
Clean Install Ubuntu 24.04 LTS - • How TO Install Ubuntu ...
Install Ubuntu 24.04 LTS On Virtual Box - • How To Install Ubuntu ...
~ Buy Me A Coffee - buymeacoffee.com/kskroyal
~ Connect On Instagram - @KSKROYALTECH
~ For Business Enquiries ONLY - business.ksktech@yahoo.com
~ My Website - kskroyal.com/
© KSK ROYAL
MereSai - Science & Technology
For anyone having trouble with 'ollama create': you have to spell the model's name in lowercase, according to the Ollama documentation. So the first line would be 'FROM llama3'.
Awesome bro, I was facing the same problem too ❤❤❤
Correct.
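For anyone following the customization step from the video, a minimal Modelfile might look like this. A sketch only: the output model name and the system prompt are placeholders, and note the lowercase llama3 as the commenter above points out:

```shell
# Write a minimal Modelfile (prompt and parameter values are placeholders)
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER temperature 0.7
SYSTEM You are a concise Linux assistant.
EOF

# Build the custom model from it (requires Ollama to be installed):
# ollama create my-assistant -f Modelfile
# ollama run my-assistant
```

FROM, PARAMETER, and SYSTEM are standard Modelfile directives; see the Ollama docs for the full list.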
Running LLMs locally is cool, but what is the best training set?
Very nice Video 🙏🏼
But is there also a free one you can host locally for picture generation? Maybe that would be worth a video 😊🙌🏼 I would be interested 💯🙌🏼
The Ollama system configuration is very useful for agentic workflows
Need to learn how to make LLMs talk to each other
Absolutely!
Any good LLM for low end hardware?
TinyLlama, Gemma 2B..
Alpaca is the best LLM GUI. It's on Flatpak as well. Clean & simple UI.
Thanks for the tip, will try.
Which is your main system for your work? Also, what are you doing in life, like from an education point of view?
I use Linux and macOS as my primary OSes. macOS I use for building iOS apps,
but mostly I spend my time with Linux. I love tinkering with open source stuff.
Education: I dropped out of B-Tech long back. I do natural farming part-time and YouTube full-time.
@kskroyaltech great bro
@Arador1112 Nice dp 🙂
Hey, how can one delete a model from Ollama?
ollama list (to see all models)
ollama rm MODEL_NAME (to remove one)
@kskroyaltech I mean how to completely delete it. It still takes up space even after running this command.
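If disk space still seems used after 'ollama rm', it is worth checking Ollama's model store directly. A hedged sketch: on Linux the store is usually ~/.ollama/models for a user-level install, or /usr/share/ollama/.ollama/models when the install script set up the system service, so the exact path varies per machine:

```shell
# Remove a model first (requires Ollama; shown for context):
# ollama rm llama3

# Check how much space the user-level model store still uses.
# The path may differ per install; fall back to a message if it's absent.
du -sh ~/.ollama/models 2>/dev/null || echo "no user-level model store found"
```

If the store still holds large blob files after removing every model, that would point at leftovers worth deleting manually.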
How to uninstall Ollama from my computer? I have no graphics card
You don't need one
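For actually uninstalling Ollama on Linux, the steps below are a sketch based on the removal instructions in Ollama's Linux documentation; they assume the standard install script was used (which creates a systemd service and an 'ollama' user), so adjust paths if your setup differs:

```shell
# Stop and disable the background service
sudo systemctl stop ollama
sudo systemctl disable ollama
sudo rm /etc/systemd/system/ollama.service

# Remove the binary and the downloaded models
sudo rm "$(command -v ollama)"
sudo rm -r /usr/share/ollama

# Remove the service user and group created by the installer
sudo userdel ollama
sudo groupdel ollama
```

These commands need root and will delete all downloaded models, so double-check each path on your own system before running them.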
So what exactly makes Linux superior for AI?
You do realise that you can run Ollama & LM Studio just as easily on macOS & Windows. Not to mention, they also work with AMD GPUs, not just Nvidia.
Of course.
Yeah, but Windows sucks balls 😂😂😂😂