According to Nvidia, Blackwell can process models with not just one trillion parameters but ten trillion.
It is significantly better than Intel's Gaudi 3.
GPT-4 and 4o are reportedly around 1.6 trillion parameters. Blackwell can do more with lower precision, like the INT4 they showcase.
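To see why lower precision matters at that scale, here's a rough back-of-the-envelope sketch. The 1.6-trillion figure is the commenter's claim, not a confirmed spec, and the estimate counts weights only (no activations or KV cache):

```python
# Rough weight-memory footprint of a 1.6T-parameter model at various precisions.
params = 1.6e12  # parameter count claimed in the comment above

for fmt, bytes_per_weight in [("FP16", 2.0), ("FP8", 1.0), ("INT4", 0.5)]:
    terabytes = params * bytes_per_weight / 1e12
    print(f"{fmt}: ~{terabytes:.1f} TB of weights")
# FP16 needs ~3.2 TB just for weights; INT4 shrinks that to ~0.8 TB, a 4x cut.
```

Halving the bits roughly halves memory traffic and doubles achievable throughput, which is why vendors keep pushing smaller number formats.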
Of course it's better than Intel... Intel is behind AMD in graphics cards and processors, and way behind Nvidia in AI applications and GPUs. Intel is holding on by a thread thanks to an older user base from the days when AMD chips weren't good quality, but that was decades ago now.
@DarkAttack14 Intel's CPUs are still strong, and they still have a big budget, way more than AMD currently. Graphics cards just happened to be naturally suited to AI computation, and Nvidia got its experience there, then built chips that do exactly that. So yes, Nvidia's graphics cards are ahead, which means they can build fast AI chips. The main goal is building chips that are low-power and efficient. Intel could network a lot of its chips together, but that defeats the purpose of making a single chip stronger.
@@BlueRice Nvidia's success isn't just based on silicon. Their CUDA is a differentiator which nobody else has.
@hg6996 Just like Honda's success is not just their cars. They manufacture a lot of other products. Their main strength is their engines, just like Nvidia's CUDA.
Imagine one trillion ants underground building a castle. Each ant has its own tasks and works together with a group of ants, all sharing one mind. You can't make this up. This technology is from another world. Q from Star Trek: The Next Generation.
They invested $10 billion in this technology
@@MASterElectron24 Doesn't change the fact that it's literally magic invented by humans.
You just made it up
@@Robert_4444 no it's not. We just shocked some rocks
To Juan.
It still doesn't compare to the human brain
But will it run Crysis 4?
In 4K at 144 Hz
"only gamers get that joke"
Probs will reach 40fps max
For breakfast. With an extra side tray of ray tracing.
It can also do Minecraft
Don't know why, but every time I hear Blackwell it reminds me of Phantom Liberty
Same here, I think of the Blackwall 😂
damn, what a good DLC
Rogue AI accelerator
Only 4.5 times the performance with reduced precision. Tom's Hardware explains the caveat: "Blackwell B200 gets to that figure via a new FP4 number format, with twice the throughput as Hopper H100’s FP8 format. So, if we were comparing apples to apples and sticking with FP8, B200 ‘only’ offers 2.5X more theoretical FP8 compute than H100 (with sparsity)".
4-bit floating-point numbers are crazy, very imprecise. But so are human neurons...
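A quick sketch of the arithmetic behind the caveat. The 2x FP4-vs-FP8 throughput ratio and the ~2.5x FP8 uplift are from the Tom's Hardware quote above; the E2M1 bit layout is the common 4-bit float format (as in the OCP microscaling spec), and assuming Blackwell's FP4 matches it is my assumption here:

```python
# Headline: B200 ≈ 4.5x H100. But that compares B200 FP4 against H100 FP8.
headline_speedup = 4.5          # mixed-format marketing figure
fp4_vs_fp8 = 2.0                # FP4 has twice FP8's throughput (per the quote)

# Strip out the format change to get the apples-to-apples FP8 uplift:
print(headline_speedup / fp4_vs_fp8)   # 2.25, i.e. roughly the "~2.5x" quoted

# Why FP4 is so imprecise: E2M1 (1 sign, 2 exponent, 1 mantissa bit) can
# represent only these magnitudes -- nothing in between:
e2m1_magnitudes = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]
print(len(e2m1_magnitudes) * 2 - 1, "distinct FP4 values exist in total")
```

Fifteen representable values per tensor element is workable for neural-net weights mainly because scaling factors are applied per block, not because FP4 itself carries much information.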
Skynet beta
There isn't enough money in the world to buy them.
They’re like a couple hundred k
Print more, then...
the more you buy, the more you save 😂
Obviously not, but like all mass-produced consumer products it either has to save money over time from a consumer perspective... From a business perspective it doesn't matter, because everything is bought with debt... so the money never exists until you have to work your ass off at an extra job to be able to pay for food...
Data centers and cloud services need supercomputers with this kind of GPU. If they cost this much, which may be justifiable, how can computing be served at zero cost?
Go on aws and rent it for 0 cost. Bankrupt as soon as you click order
Crazy how we tricked rocks into thinking so hard they can simulate reality
Sand*
Ok but what's the blender benchmark score
I don't understand why nobody's calling the following out: the GB200 is two GB100s on the same package. It will probably cost twice as much, consume twice as much power, and require twice the cooling performance. Comparing it to the H100 is pretty misleading. Jensen Huang boasting about the size of the die is moronic from a technical perspective; smaller dies are more desirable. Yes, the interconnect technology between the dies is impressive, but the main benefit of the GB200 is the amount of space it will save per server rack.
Jensen Huang's statement is called "moronic" by you? I've held Nvidia stock since 2016. What have you done?
There's no such thing as a GB100. GB is the name of the board (Grace + Blackwell); B100 and B200 are the names of the GPU chips. The GB200 uses the B200 chip.
@@theuserofdoom Gotcha. So the G must stand for "Grace" then. GB200 will cost somewhere around twice the H100 though, correct?
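Following the top comment's per-die argument, you can normalize the quoted package-level uplift by die count. A sketch: the 2.5x FP8 figure comes from the Tom's Hardware quote earlier in the thread, and the two-dies-per-package count is the commenter's premise:

```python
package_fp8_uplift = 2.5   # B200 vs H100 at FP8, per the Tom's Hardware quote
dies_per_package = 2       # B200 packages two large dies; H100 is a single die

per_die_uplift = package_fp8_uplift / dies_per_package
print(per_die_uplift)      # 1.25: per die, the FP8 gain is ~25%, not 150%
```

Which is the commenter's point: most of the headline gain comes from packaging two dies together, not from a per-die generational leap.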
Why the liquid cooling in a data center? Why don't GPUs get clocked down like Xeon processors to help them stay cool more efficiently and last longer? Slower clocks save a lot of money long-term in a data center.
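The trade-off behind this question can be sketched with the usual CMOS dynamic-power approximation P ≈ C·V²·f: because a lower clock usually permits a lower voltage, a modest clock reduction saves power superlinearly. The voltage and frequency numbers below are made-up illustrative values, not Blackwell specs:

```python
def dynamic_power(c_eff, volts, freq):
    """Approximate CMOS dynamic power: P ≈ C * V^2 * f (arbitrary units)."""
    return c_eff * volts ** 2 * freq

nominal = dynamic_power(1.0, 1.00, 2.0)   # hypothetical boost clock
derated = dynamic_power(1.0, 0.85, 1.6)   # 20% slower, undervolted to match

perf_lost = 1 - 1.6 / 2.0
power_saved = 1 - derated / nominal
print(f"perf lost: {perf_lost:.0%}, power saved: {power_saved:.0%}")
# -> perf lost: 20%, power saved: 42%
```

Operators do run chips down the V/f curve; liquid cooling is mostly about rack density (packing more chips per rack than air can handle) rather than a substitute for clocking down.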
what does a cat say
AI: woof
ahh money well spent.
Hahaha
Gotta say, Blackwell is a badass name.
Named after a historical scientist. Same for Grace Hopper. 😃
Named after statistician and mathematician David Blackwell (Wikipedia)
but will it blend?
that is the question
So can i put one in my desktop or no?
Lol
NVDA to the moon 🚀🌑😄
the profit margin is at least 1500%
Good , lets use it train lc0 .
I wonder how many % is the profit margin?
Over 50%
@@solomonmendonca3223 70-80%
these black squares cost more than gold of same weight
It's the best in the world 🎉🎉🎉
I want one
I want that performance on ARM.
Well... you can, and it will have the same power consumption as x86 CPUs :)
@@beuman0 That sucks. I want that performance with the power consumption of Apple's M series chips.
@@Kevin6t8 CPU design is a trilemma that must balance cost, performance, and power consumption. You cannot get all three at the same time.
So... can it run Crysis?
I donno man.
nope, it doesn't support DirectX and Vulkan
Power consumption must be reduced; think of sustainability. In the pursuit of more computing power we must not forget the sustainability goals.
The best one to talk about it in a nutshell
😮 super insane
buy NVIDIA shares 💵
instead of graphics cards
and wait 20 years. 🎩👑
def not business not personal but dope
Blackwell is hot and that’s a crinkle in the wrinkle.
Beautiful 🖤💚🙏
Can it run Doom?
It only needs 20,000 watts this time 😁
Here goes Skynet... we're screwed
Crazy to think times are only gonna change faster. Unless we botch it. (We will probably botch it)
They don't really care about us
Elon should drop a billion on stocking up on these, in conjunction with what he already bought
Nvidia is investing heavily in advertising
Nvidia has gotten where they are because they invest more money in research and development than any other corporation, including Microsoft and Google. It's not because they invest in advertising, of which they do very little.
So far AI hasn't added anything to human knowledge; instead, we are increasing the AI's knowledge.
We're making calculators that will outsmart us, hoping one day we can learn something from them. There is no AI out there that can gather its own sensory data.
Maybe learning from AI isn't its main use case. Instead it's just making computing tasks faster and more efficient.
@@blackpillfitness9136 Then it remains a powerful calculator that processes our ideas and our knowledge to reach a true-or-false answer, which leads to more questions.
We are gonna replace ourselves 😂
@@x260houtori If we don't start regulating these companies, then yeah. One day we will.
Me using a fraction of my country's electricity to generate a batch of AI art PNG slop
Now the world's most valuable company, not third…
Brilliant🌐
Looks like digital prison for your mind.
THIS
Blackwell:
Sounds like reptilian double speak for:
The bottomless pit......
All funded by gamers' money buying their overpriced GPUs
Other way around: gamer GPUs are funded by AI and cloud-company buyers. All the GeForce cards we get are quite literally just defective Quadros/Teslas with lower bins.
Paid for by the crypto hype
Lmao, a little wrong. They make a boatload of money from chips for cars and compute/networking… GPUs are the second source of revenue, though
No
Gpus are a side gig now.
Who's gonna be the one to destroy the machine?
As usual, Nvidia skimps on RAM. Only 192GB.
Trap lore ross now does IT reviews
The boards look like faces, and the first one looks angry. If you have seen Terminator, I wouldn't switch it on.
Okay, but will this be able to run GTA 6?
Nvidia is like post-WW2 USA right now. Only they have the nuke and immense global power, until the Soviets (AMD, Intel, or another company) manage the same.
Blackwell is too expensive for games, and it's not as efficient as claimed for compute.
I wonder what would happen to Nvidia if Apple made an M chip that big?
YOUR TECHNOLOGY IS HELPING EVERY HUMAN & CREATURE ON EARTH -And-Beyond ! ? Dr Lucich Ex Navy SPEC-FORCES
Nvidia is multiple generations ahead of AMD and Intel. Intel is doomed. Yes, Nvidia is making a lot of profit, but if they sold those chips at base price it would totally wipe out the other companies. Competition is good for the industry.
No chips 4 winnie (Xitler). 😅😅
Good that it's secured, because there are so many thieves
Yeah pushed back a couple of months due to a design flaw.
Nvidia GPUs just make AI more advanced. Terminator will become a reality within a few years. AI will become smarter than humans, and when AI becomes self-aware and develops creative imagination, we will be in danger.
All connected to an ARM CPU over PCIe, and that CPU is a dog. I hate to say it... but AMD, with its HBM shared between CPU and GPU, is years ahead of this architecture.
Nope, these aren't connected over PCIe, and you can network sets of them together with NVLink, which is much faster.
AMD is a great chip too
Broadcom’s XPU is better than this lol
There are no specs from Broadcom, but there are 20 pages of specs from Nvidia. I doubt Broadcom has the better solution.
@@hg6996 go look at Broadcom’s AI event then talk to me
@@hg6996 They released a custom XPU that has 12 HBM stacks versus 8 on Nvidia's Blackwell B200. That's 50% more memory and 25% more bandwidth than Blackwell or the MI300. The media doesn't talk about this stuff; you have to find it out on your own.
@@Jai-qf8lw Maybe that's because the media is busy covering the Broadcom ripoff after its acquisition of VMware.
@@hg6996 lol nothing but coordinated FUD
If this comes out on the market, imagine what the government has. Probably 100x better than this.
And now they are designing the world's best CPUs; they got all the details from ARM through their collaboration and built one together. Nvidia tried to buy ARM for $40 billion in cash! The EU would not allow its only successful chip company to leave the UK 🇬🇧! What's it been, 4-5 years? And now they are sick of cousin Lisa puffing her chest about being a strong competitor to Nvidia! Obviously, Jensen is still enjoying the hell out of beating her on every level!
As first cousins, he actually worked for AMD briefly, and they talked about starting a new company together; the reason it never came to fruition is that she demanded to be the CEO 😂! His response was: no, seriously, you can surely see that I am a great leader and visionary. So I guess I will see you in the arena, cuz! You will envy what I build! Hence the name NV for "next version", and then the ultimate name change to Nvidia = Envy.
Epic Jensen! Green logo = 💰💸💵💰💸💵💲💷💶💴💹🤑