Mistral AI is so back with Apache 2.0!!! 💥 Mistral Small 3 Hands-on Testing💥
- Published 9 Feb 2025
- Mistral Small 3, a latency-optimized 24B-parameter model released under the Apache 2.0 license.
Mistral Small 3
mistral.ai/new...
Mistral Small can be deployed locally and is exceptionally "knowledge-dense", fitting in a single RTX 4090 or a 32GB RAM MacBook once quantized.
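A rough back-of-the-envelope sketch (my own, not from the video) of why a quantized 24B-parameter model fits on a single RTX 4090 (24 GB VRAM) or a 32 GB MacBook: weights-only memory at different bit widths, ignoring the KV cache and activations, which add a few extra GB.

```python
# Weights-only memory estimate for a 24B-parameter model
# at common quantization levels.

PARAMS = 24e9  # Mistral Small 3 parameter count

def weight_memory_gb(bits_per_param: float) -> float:
    """Approximate memory for the weights alone, in gigabytes."""
    return PARAMS * bits_per_param / 8 / 1e9

for label, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{label}: ~{weight_memory_gb(bits):.0f} GB")
# FP16: ~48 GB, 8-bit: ~24 GB, 4-bit: ~12 GB
```

At 4-bit quantization the weights come to roughly 12 GB, which is why the model can run on a single 24 GB consumer GPU or fit comfortably in 32 GB of unified memory.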
huggingface.co...
❤️ If you want to support the channel ❤️
Support here:
Patreon - / 1littlecoder
Ko-Fi - ko-fi.com/1lit...
🧭 Follow me on 🧭
Twitter - / 1littlecoder
Dude you get this info out so fast with actual useful info. Love your content 🔥
@ThatNerdChris This time I had to spend a while designing questions aligned with the model's strengths, hence some delay, but thanks for the kind words!
Very true
Yes, the answers in French are all correct; some are more formal, while others are more familiar, but all are accurate.
@anne-marieroy8812 Merci for confirming 😊
@1littlecoder You are welcome, and thank you for your videos, always interesting to watch.
@anne-marieroy8812 thanks for the kind words!
Great to see Mistral back, I really liked experimenting with 7B back when it came out. Been waiting for local models that were about this size / capable. Going to try it out.
@parttimelarry 7B was my favorite until all the Qwen models came out. So glad to see Mistral back!
Your content is perfect as it is!
Thanks, just trying different things at times!
another banger! ❤🔥🧑🚒🚒🔥 Happy Prompting!
Happy prompting 🙏🏾
ty for the tests
Hello, the French test results are about right. Thanks for that review 👍
My pleasure!
best ai youtuber
@Student-m4f thanks boss!
I've had so much trouble with quantized models ever since the quantization boom, when people tried to squeeze Llama 3 70B models into consumer graphics cards. Excited to see that we're getting professional quants from the big leaguers; if Mistral says we're supposed to run a quant, then it's a good one.
Never touch the schema :D. French answers all correct. Bisous and thanks for the awesome insights! More on LLM coding, JSON, agents etc. please.
@Pregidth thank you!
Can you cover the same stuff for Kimi, Qwen, and Tülu as soon as you can? I appreciate all the valuable content you give us and the effort you put in from your side. I think you're an underrated channel, keep rocking!
Wish India would develop a model to replace our babus.
Yes, the answers for saying "see you later" in French are fine. "Bisous" assumes that you know the person pretty well ;)
Stop with the green screen effects!!!
@Leo_ai75 Not good?
@Leo_ai75 I had a few people tell me that effect is nice
Brother, you're a no-nonsense expert in the field. Your tutorials from a few years ago opened my mind to coding with AI. I think your time is better spent elsewhere.
BTW, just some added info: I don't have a green screen. The editor uses object segmentation to cut the person out!
If it’s done automatically and you’re not having to do loads of work for the effects then I guess it’s ok! I wouldn’t want you being delayed in getting the news out!