Great breakdown on AI's potential! The idea of using AI to generate responses for common questions is super intriguing-I can see how it would save time while maintaining quality. Love the insight into balancing AI's capabilities with ethical considerations, especially regarding spam bots. Looking forward to more updates and content from you! 🌟
12:52 Why is there a woman licking your head? 😂
Excellent, thanks for posting this, it's useful.
Thank you for the good news. :)😊
After a week of trying, I still can’t get Sana to work properly. I need an idiot’s walkthrough so I can understand.
I will have a look into that. I haven't used it myself yet
@@OlivioSarikas thank you!
Retired from newspaper art but it is fun to witness this revolution in AI art and simulated photography. I have been prompting the free stuff for fun.
"ai photography" is a lot of fun :)
I'm always late to the party....and on a budget. I'm looking at buying a 4060 TI 16 GB assuming it will drop in price even further with the 5000 series being released.
For the price, it's only really worth me upgrading to 24GB VRAM and I don't see them dropping enough for me. Not when you allow for the other system changes I'd need to make as well. I wouldn't expect them to drop much either; ever since the whole mining shortage, Nvidia know people will pay the higher prices, so there's no incentive to drop them.
RTX 4070 will most likely still be the best bang for the buck for the time being. 5070 is not much better and will be price scalped to hell and end up costing at least 700 bucks for a year or more...
@@Elwaves2925 I got my basic 4060 on sale; for what I do with it I'm satisfied. But then I started messing with AI image gen and LLMs, so the 8GB runs out fast.
Live and learn, I guess.
You are better off going for a 3090; the 4060 Ti is too slow because it uses a smaller bus width. If not, then go for a 4080. I'm on a 2080 Ti, and it's faster than the 4060 Ti. The 2080 Ti undervolts very well, as low as 150W, and it's still faster than the 4060 Ti. You can upgrade the VRAM to 22GB, and 8GB GPUs can be upgraded to 16GB.
@ I don't have the coin for a 4080, nor the PSU; it's a mid-tier PC at best.
How do you upgrade VRAM?
As far as speed, I make a 2048x2048 in about 4 minutes, and that uses more than my 8GB card has: about 12GB of "shared" memory. So I would assume having more VRAM would speed it up, as it won't have to swap with system RAM when decoding.
Ofc I don't really know what I'm doing, just getting into it.
You can convert those images to a model using photogrammetry then get it into blender. It'll need a lot of clean up.
I feel so behind on this stuff now. lol
It really is all moving at light speed. You are not alone.
You know what's worse? I live in Brazil and it's really bad here; the dollar is 7 times our currency. I bought my computer in 2020, a very good PC, but it has 8GB VRAM, so what more can I do with it? I'd have to replace my whole computer to improve it. I'm still using SD with 1.5 models and that's all I can do. I love AI, but there's a real limitation for those of us living in Brazil. A person in the USA can buy a car much faster; for us it literally takes a lifetime. So how am I supposed to keep up with new technology even a little? It's tragic for us. I'm satisfied with what I have, but as time passes I know I won't stay satisfied for long. I want to learn and enjoy new things but I can't, and honestly almost no Brazilian can.
Yea, I've tuned everything out since it seemed to slow down, but I guess not.
@ Na, it's speeding up like crazy
@ That really sucks. I don't understand how that is fair. The government should really protect their people and the economy.
12gb vram = nope
Yes, that's really not great. That said, SD 1.5 is still really, really nice.
It looks nicely detailed in that turntable, but I can already see massive dents at the paw of the cat and the coins in the box. They just don't look right with that shading and that turntable. It certainly needs to be reworked.
Too real, yes, I see AI models on Instagram everywhere and it's harder and harder to see what's real and what's not.
Yes, with models like Kling, things are getting really extreme. AI voices are also really good now.
those ai spambots are scummy as hell
Thanks for the effort, amazing video! The NVIDIA part shed light on a lot for many of us, including me. But these AI tools are like the meme coins you see in crypto: a lot of junk. We need more big projects that won't fade after a week; that's where the money is. Huggingface is filled with spaces like these.
If you mean projects being quickly abandoned: yes, for sure. But keep in mind that these are often free open-source projects, so people don't feel as attached to them. Big projects, on the other hand, cost money to use, and not everyone can or wants to afford that. It's a bit of a minefield right now.
There should be something in between a 5080 and 5090. A 5070/80 Ti with 24GB for an additional 300€ or something like that.
yes, totally agree. they should give more vram choices
Is 'rtx 3060 12gb' ok?
Requirements for Windows: Nvidia Project DIGITS 128GB.
1. Hardware:
- Project DIGITS Hardware: Includes the NVIDIA GB10 Grace Blackwell Superchip.
- System RAM: At least 128GB.
- Storage: Up to 4TB NVMe storage.
- Networking: High-speed Ethernet (10Gbps recommended for linking systems).
- Power Supply: Standard electrical outlet.
2. Software:
- Windows 11 Pro: Install for compatibility with professional AI tools.
- Drivers: Latest NVIDIA Studio Drivers.
- AI Software:
- Install NVIDIA AI Enterprise (for enterprise-grade security and updates).
- Tools like Python, PyTorch, and Jupyter Notebooks.
3. Considerations:
- Windows may not fully support all Linux-based features (e.g., DGX OS optimizations).
- You may need WSL2 (Windows Subsystem for Linux) to run some Linux-exclusive tools and features.
For Linux (Preferred for Full Functionality):
1. Hardware:
- Same as Windows (Project DIGITS hardware, 128GB RAM, NVMe, Ethernet).
2. Software:
- Ubuntu Linux 22.04: Required for running NVIDIA DGX OS.
- NVIDIA DGX OS: Ships with Project DIGITS for seamless integration.
- AI tools like NVIDIA RAPIDS, NeMo framework, and NVIDIA Blueprints.
3. Setup Steps:
- Connect the DIGITS hardware to your desktop.
- Use the NVIDIA NGC catalog and Developer Portal to access prebuilt software.
- Optimize networking for model sharing and cloud scaling.
Key Notes:
- Windows Drawback: Some Linux-optimized tools and DGX OS benefits may not work natively on Windows.
- Linux Benefit: Full access to the NVIDIA AI stack, preloaded on Project DIGITS hardware.
- Linux offers better performance and compatibility for this system if you're focused on maximizing AI model training and inference, e.g. Stable Diffusion text-to-image generation.
Project DIGITS vs. Prebuilt RTX 5090 System: Which Saves You More?
- Project DIGITS ($3,000):
- Tailored for AI tasks like model training, inference, and text-to-image generation.
- Comes with 128GB unified memory and up to 4TB NVMe storage.
- Setup requires some manual work (installing software, configuring tools, etc.).
- Total estimated cost: $3,000 + your time for setup.
- RTX 5090 Prebuilt System (~$5,000-$6,000):
- High-end gaming and creative workstation with general-purpose performance.
- Focused on cutting-edge graphics rather than AI-specific workloads.
- Fully assembled and ready to use out of the box.
- Total cost: $5,000-$6,000.
Savings Breakdown:
If you’re focusing on AI generation and willing to set up DIGITS yourself, you save $2,000-$3,000 compared to buying a prebuilt RTX 5090 system.
Plus, DIGITS is specifically optimized for AI workloads, making it the better choice for tasks like Stable Diffusion, LoRA training, and text-to-image generation.
**Tutorial: Why Project DIGITS is Ideal for Stable Diffusion and ComfyUI**
1. **Compatibility**:
- Project DIGITS is fully compatible with Stable Diffusion and ComfyUI, as it supports NVIDIA CUDA and Tensor Cores.
- With its Grace Blackwell Superchip, it delivers the performance needed for AI model inference and training.
2. **Performance Comparison**:
- **RTX 5090**: Faster for quick, single-image generations (1-4 at a time). However, its 32GB VRAM limits its ability to handle larger models or heavy workloads, causing slowdowns or crashes when memory limits are exceeded.
- **Project DIGITS**: Equipped with 128GB unified memory, it ensures consistent speed even with large models, high-resolution generations, or multiple simultaneous tasks.
3. **Using DIGITS with Stable Diffusion and ComfyUI**:
- Stable Diffusion: DIGITS handles larger batch sizes and complex prompts without hitting memory bottlenecks, ensuring stable performance.
- ComfyUI: With DIGITS, you can load intricate workflows involving multiple models or high-resolution outputs while maintaining steady generation speed.
4. **Key Advantage**:
- DIGITS maintains consistent performance as workloads increase, making it better suited for researchers or creators working on complex AI projects.
**Alternative Advantage**:
- DIGITS can run multiple instances of Stable Diffusion or ComfyUI at the same time without slowdowns, making it ideal for multitasking workflows. This is something the RTX 5090 cannot handle as effectively due to its VRAM limitations.
**Conclusion**:
While the RTX 5090 is faster for light tasks, Project DIGITS is the superior choice for consistent, scalable performance in Stable Diffusion and ComfyUI.
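To make the "capacity vs. speed" argument above concrete, here is a rough, hypothetical back-of-the-envelope sketch estimating how much memory just the weights of a model need at a given precision. The parameter counts below are illustrative assumptions, not figures from the video, and real usage is higher once activations, the VAE, and caches are counted.

```python
def model_memory_gb(num_params_billions: float, bytes_per_param: int) -> float:
    """Estimate memory needed just to hold model weights.

    bytes_per_param: 4 for fp32, 2 for fp16/bf16, 1 for int8 quantized.
    """
    return num_params_billions * 1e9 * bytes_per_param / 1024**3

# A hypothetical 12B-parameter image model in fp16 (roughly Flux-sized):
print(round(model_memory_gb(12, 2), 1))  # ≈ 22.4 GB, already near a 24GB card's limit

# A hypothetical 70B LLM in fp16: far beyond any consumer GPU, but it fits
# in 128GB of unified memory:
print(round(model_memory_gb(70, 2), 1))  # ≈ 130.4 GB... that one still needs quantization
```

The point of the sketch: a single big model at half precision can exhaust a consumer card before any actual generation work starts, which is why unified-memory systems trade raw speed for headroom.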
you're the best!
Shorts are the most annoying thing on YouTube. It's basically just useless spam.
The 5090 is dead to me at $2,000!!! Just get the DIGITS for $3,000; it is so much better for AI. If you've got $2k for a GPU, you have $3k for a supercomputer.
That one STARTS at $3k. But 4 AMD 32GB GPUs in one rig would have my preference. Support is slowly improving, AFAIK.
DIGITS is not really a supercomputer; it's more like a Mac Pro with 128GB unified RAM. A 5090 is probably going to be better for AI.
It looks like DIGITS will be good for inference, but not for training
@@ronnetgrazer362 There are some hoops and caveats with Radeon consumer cards, but if you are prepared to run Linux, the 7900 XTX is pretty capable, although I'd be more inclined to use a 3090.
@ I was born prepared to run Linux :) But yeah, 4 of those on a proper mobo would get you 96GB for around 5K, which seems hefty until you do the same calculation for the 3090s... Although there are advantages, it mostly comes down to use case, I guess.
Still keeping my fingers crossed for an affordable ASIC wondercard to arrive some time this year. I will reflow the memory myself if I have to.
Hope someone releases an Unreal Engine plugin for the 3D stuff.
That would be amazing. It won't be long before we have model generation in UE and other AI software.
RTX 5070 is an effin joke. The card barely has more performance than an RTX 4070, 30.97 vs 29.15 TFLOPS... this company is PURE GREED at this point.
Anybody could see that they are forcing you to buy the 5090 by lowering VRAM on all other cards. Thankfully, I've been saving up for a year for the 5090 so this problem doesn't apply to me.
5070, the VRAM of 4090, pass.
The original 4070 was also released with 12GB VRAM; only the "Super" version has 16GB. While 12GB isn't much for the larger new models, it can still be enough for 1.5 models, GGUF models, and LLMs. Also, the 5070's VRAM is faster. Not saying it's a great card, but it's kind of OK for that price. And that's the release price; it will probably go down soon enough ;)
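On the "faster VRAM" point: peak memory bandwidth scales with bus width times per-pin data rate. As a rough sketch, assuming a 192-bit bus on both cards with GDDR6X at 21 Gbps per pin on the 4070 versus GDDR7 at 28 Gbps on the 5070 (my assumed figures, not from the video):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pins * per-pin rate, divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

print(bandwidth_gb_s(192, 21))  # assumed RTX 4070 config → 504.0 GB/s
print(bandwidth_gb_s(192, 28))  # assumed RTX 5070 config → 672.0 GB/s
```

So even with the same bus width and capacity, the newer memory standard gives roughly a third more bandwidth, which matters when a model spills close to the VRAM limit.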
@@bigge1002 And my 2-year-old 4090's resale value will stay good :D Can't wait to start with the 5090.
It's just a software upgrade.
I would like to see more tutorial videos on Flux as well as some new but simplified workflows. I would also like to see some tutorials on running Hunyuan models for txt to video as well as img to video using all the latest lightweight versions of the models. WaveSpeed for comfy also comes to mind.
239k subs and you still barely average 20k/30k views per video.. something's not right, eh? 😅🤔😄
Lots of other channels about AI these days. Plus I slowed down a bit in my production of new videos.
I found Nvidia Cosmos more interesting.
Yes, that one is pretty cool too. :) Can you believe that we are about to live with walking, talking robots? Crazy!
AI TOPS is kind of a useless metric; you should have a look at CUDA cores instead.
Dear Olivio, Roland here. Would you be interested in reading a book series that was created in collaboration with an AI? If yes, I could send you an email, since I contacted you in the past.
I wish I had time to read books. I have a list of over 100 books I would love to read, but even the ones I really want to read I don't have time for.
All the technical details will not be worth exploring as AI gets more "intelligent". Soon it will be evident that humans will not understand anything that the machines will be putting out. So why bother with educating ourselves to the end of trying to comprehend this stuff?
@@duanium Have you considered that learning is fun? You might as well say: since you don't live forever, why do anything at all? Because you can. That's the reason. Humans do things because we like doing them. That's our main drive.
You do know that people actually need to build this tech right? It doesn't just fall out of the sky. Unless you're asking why you personally need to learn this stuff to which I would answer... you don't. You can eat Cheetos and watch Netflix all day for all I care.
rtx is a rip off
It's the home of Cuda and that makes it amazing for AI and Gaming ;)
@OlivioSarikas The card is less powerful; it needs to rely on AI technology and software, but the card itself is not strong. It's a weak card.