Whenever I see a YouTube title that says "shocking" or any variation of it, I automatically assume it's just clickbait and not worth watching. Maybe this type of title would've worked on me 10 years ago.
This technology is really impressive and scary at the same time. The last part, a fully robotic factory that is actually designed by AI, makes human beings obsolete in a world economy based on humans being able to eat because they get paid for the work they do and the final product that is sold. Automating all aspects of productive work means humanity will need a new economic model in which the current capital owners no longer control the system. Keeping the current model through this AI revolution means that only capital owners would be able to exist. If the current model of society is not completely remade, social unrest will arise and we will self-destruct as a race. So, in a fully AI-based world like the future Huang presents, this production capability must be transferred to collective ownership. That looks a lot like communism, but it is not: communism negates humans' rights to benefit individually from their own work. The new model would be based on human labor no longer being required, except for a very small number of control positions that should be filled ad honorem.
For Nvidia's line of business of selling computing capability, this new ML-based "Software 2.0" is great. But why does that make current, classic programming "bad"? Because it is made by human beings to whom Nvidia can't sell? Why replace fully working deterministic systems with ones made by AI that cost a lot of power and energy to produce, refine, and run? I can see AI as a good efficiency tool for producing new systems that humans can't build or haven't built yet because the data volume is beyond human scale, like simulating and testing billions of combinations. But neural networks are like statistics: they rest on the premise that the data is correct. If you feed an ML model enough data saying that 2 + 2 = 5, it will produce a model that gives wrong addition results. This is very different from deterministic, rules-based programming, which is correct (or wrong) by its fundamentals. As Huang mentions in this speech, this programming model is like a numeric approximation formula, and those never produce truth, only approximations of it. That might be a dangerous path to follow if all systems get built that way: you will end up with software that occasionally produces wrong results that are impossible to track down and correct, because no human being can modify ML-generated functions; the programs are unreadable. These issues are already happening in some industries, where automated decisions hurt customers, and because those companies trust automation blindly instead of providing proper support, customers get their rights denied and the basic rule "the customer is always right" turns into companies that only trust their own navel.
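The garbage-in-garbage-out point in the comment above can be made concrete with a tiny sketch (the data here is hypothetical, purely for illustration): a least-squares model trained on "sums" that are systematically off by one faithfully learns the wrong arithmetic.

```python
import numpy as np

# Hypothetical training set: pairs (a, b) plus a bias column, with labels
# that are systematically wrong -- every "sum" in the data is a + b + 1.
X = np.array([[a, b, 1.0] for a in range(5) for b in range(5)])
y = X[:, 0] + X[:, 1] + 1.0

w, *_ = np.linalg.lstsq(X, y, rcond=None)  # fit the model to the bad data
pred = float(np.array([2.0, 2.0, 1.0]) @ w)  # ask it: what is 2 + 2?
print(round(pred, 2))                        # -> 5.0, exactly as trained
```

The model is not broken; it is doing its job perfectly on data that encodes the wrong rule, which is the commenter's point.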
Sorry Jensen, there are some things you can't "accelerate" (i.e., parallelize). Pretty much every program has a sequence of steps that need to be executed in order. Scientists and engineers have been searching for this holy grail of parallelizing everything, but it's mathematically not possible. CPU companies like Intel and AMD have put decades of work into branch prediction and gotten some gains, but you can't predict everything. I'm sure today's latest AI models could improve on this, but you can't do what is logically impossible (100% prediction of all branches).
Branch prediction is very important for common CPUs, but it has nothing to do with AI. Imagine you have a huge array of data and all you need to do is a very simple allreduce. These are very simple calculations, and there is no need to jump depending on the result of a calculation; every chunk of data can be processed independently (in parallel).
@@dimes1917 Not arguing with that scenario. That's why GPUs exist and are so good at parallelizable tasks such as graphics and DNN training. But in the video, Jensen is implying that we (software developers) have to apply this throughout our code instead of doing sequential operations. I don't think that's possible. Take a super basic "Hello World" program. The program starts, it prints the string, and then it ends, and it has to do all three in that order. You can't end the program at the same time you're printing the string. Another example would be a function with a switch statement based on the current temperature outside. The value is fetched from an API, and say we divide the values into three ranges, so there are 3 different branches the code can take depending on the temperature. You can't predict the temperature in the code, so your branch prediction pipeline will fail two-thirds of the time. Expand this to 100 temperature ranges and you see how the problem only gets worse as the conditional logic gets more complicated.
@@dimes1917 I'd love to hear in more detail how this is outdated. Tell me how we can use the GPU to "accelerate" a REST API that does CRUD operations for an e-commerce website?
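The limit this thread is circling is usually stated as Amdahl's law: if only a fraction p of a program can run in parallel, total speedup is bounded by 1/(1 - p) no matter how many processors you add. A minimal sketch:

```python
def amdahl_speedup(p: float, n: int) -> float:
    """Overall speedup when a fraction p of the work runs on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even a 95%-parallel program caps out near 20x on a million processors;
# a half-sequential one caps out near 2x.
print(round(amdahl_speedup(0.95, 1_000_000), 2))  # -> 20.0
print(round(amdahl_speedup(0.50, 1_000_000), 2))  # -> 2.0
```

This is why GPU acceleration pays off spectacularly for workloads that are almost entirely parallel (graphics, DNN training) and barely at all for inherently sequential control flow like the examples above.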
@@TickerSymbolYOU: Not a next-word predictor trained by stochastic gradient descent. Understanding the meaning of something means grasping the abstract principles underlying it, not building a statistical model.
Original Nvidia India AI Summit Keynote: ua-cam.com/video/GlKBbsVX37c/v-deo.html
Time saved: 26 minutes
00:00 Moore's Law is Dead - The Generative AI Era
05:56 NVIDIA Blackwell Data Center Accelerators
10:26 NVIDIA Generative AI Scaling 4x Per Year
13:47 NVIDIA AI Agents & Omniverse for Robots
I honestly thought that other guy on stage was an AI prototype.
Yeah, I was thinking so... that's an AI agent now?
@@brianaragon1641 IRL NPC IMO
Nah. he looks upset that he wasn't able to utter a word.
I was thinking the same thing.. is that a morpheus clone !!
@@manishchhetri And Jensen keeps walking around him, patting him on the shoulder as he makes slight facial gestures.
My grandkids will be baffled that the G in GPU stands for "Graphic"
it actually stands for generating processor unit grandpa
Chips made exclusively for AI, could be called IPUs.
Programming will not end, but will evolve and continue. Programming is our ability to think ahead in a very controlled way. We will have higher level tools to do programming.
programming will emerge more as a conversation than the direct application of coding
AMAZING! thank you for this! It's mind blowing typical of Jensen Huang.
Computers been around only 60 years?
Dang, not even a century and already they're advancing pretty fast.
@@CrowleyBlack2 I've been working with computers for 60 years (since 1964) 😀, so they've been around at least 70 years, or 80 if you include the ENIAC from WWII.
The inevitability of where we are today was predicted very early in the life of computers. Where we go tomorrow with computing is impossible to comprehend, as compute capability is advancing at exponential speed. The next 10 years will see colossal change; I'm not sure most people will enjoy it, and it may be painful.
Yeah, actually computers go back to before WWII, but they weren't digital then; they were analog computers. The push to digital computers started once the transistor became small enough to build a computer on transistor technology. It was a digital computer that flew to the moon in 1969.
@@mikewa2 The magic is that for personal computing, 99% of the population can buy a PC for under $1000 and it will do what they want for the next 15-20 years as long as it's still supported, and the hardware tends to stop being supported first. The only REAL issue right now is how much AI compute power people want.
But this is why I didn't listen to this video: I already heard a brief snippet of what Jensen said, and his goal is to convince you that you NEED an Nvidia-based system they're putting out in the near future. With AI being the new thing, you could be convinced that AI accelerator cores HAVE to be included in the processor. No, no they don't. With faster interfaces, anything connected to the CPU on the PCIe bus can process AI. We are now at PCIe gen5, which is fast enough for any home use case. Neural Processing Units (NPUs) are going to be released by different companies on NVMe devices, and in fact they already are, but more powerful ones will come out on NVMe and won't cost much, and that means I could run AI workloads off an NVMe device connected to a Zen 3 based CPU on a motherboard that has an open NVMe port. Gee, glad I bought a motherboard that has at least 2 NVMe ports. I will admit the PCIe interface on that motherboard is gen4, but it will still be fast enough for the typical light AI loads that run on a home computer. And this means hardware that's already 4 years old can be upgraded, run Microsoft Windows 12, and you can be fat, dumb, and happy watching YT videos on your 10-year-old system that runs great, because you don't need the kind of power being talked about, as 99% of the population doesn't.
But new X86-64 CPUs are going to have a mix of cores and will certainly have NPUs built into the CPU. Graphics cores might not be included, because once again PCIe gen5 is fast enough to push that work to the GPU anyway. OMG, it takes TWO MORE SECONDS!!!!!!
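For scale on the "PCIe gen5 is fast enough" claim above, here is a rough bandwidth sketch using the commonly cited effective per-lane rates after 128b/130b encoding overhead; treat the exact figures as approximations:

```python
# Approximate usable PCIe bandwidth per direction, in GB/s per lane,
# for generations 3-5 (effective rates after encoding overhead).
PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Effective one-direction bandwidth for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

print(round(pcie_bandwidth_gbps(4, 4), 1))   # gen4 x4 NVMe slot: ~7.9 GB/s
print(round(pcie_bandwidth_gbps(5, 16), 1))  # gen5 x16 GPU slot: ~63.0 GB/s
```

Whether ~8-63 GB/s is "enough" depends entirely on the AI workload, which is the crux of the disagreement in this thread.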
@@johndoh5182 The Apollo Guidance Computer was indeed the first modular digital computer, and paved the way for the computers we know today.
Not trusting any of this AI hype.
What was that other dude doing there, other than looking like he's part of the screens used for the presentation?
Security
So, his opening statement is incorrect, to be clear, and HE is saying it because he wants people to move to an Nvidia-based system that they will put out in the somewhat near future AND that will cost more than an X86-64 based system.
X86-64 offers everything from a system that's dirt cheap and will do office apps all day, browse the web, and let you watch 4K HDR movies, to systems that can run engineering problems that 10 years ago took a week to solve and now take a few hours.
With X86-64 the limiting factor is NOT the CPU or its general processing cores, which are worlds better than even 10 years ago. The limiting factor is ALWAYS cost for the type of work you want to do, and that cost has come down DRAMATICALLY over the last 15 years and ASTOUNDINGLY over the last 20. You can now do renders on the best workstations that 6 years ago would have required a render server eating a lot of power and taking a few days. I'm talking studio production, not the simple renders people do at home.
So while Jensen has done well to grow Nvidia into a technology company while still putting out the best GPUs for a PC, what he's saying isn't factual; it's driven by his desire for people to leave a very mature platform, whether that's Apple, X86-64, or some new (immature) Microsoft ARM-based system, and move to an Nvidia-based system, making Nvidia the masters of the world and charging you an insane amount of money once you're trapped.
No thanks, my system does EVERYTHING I want to do as fast as I need it to.
Should individual NVDA stock be held in a Roth IRA? Or is it worth holding qqm?
Jensen is a genius. NVDA CC next week will be a monumental event. Massive upside with major BUY recommendations and higher price targets
What is CC?
What's cc?
He probably means the earnings call. Just another UA-cam person with a crystal ball…
Jensen has completely mastered the art of public speaking; this guy presents so well. Musk could take some classes from him.
Thank you very much Michelle
When will this fit in a box on my desk to run locally? I have no interest in paying a monthly subscription to run my games, etc. over the internet. This feels like the 60s/70's style mainframe computers that now fit in your pocket.
And this my friends is the future!
Seeing this after watching a video about autonomous killer drones used in Ukraine, and remembering an article from a few days ago about a couple from the US moving to Jamaica to live a longer, happier life in a very real way, with such extremes as doing laundry by hand, yet living in a community, singing, dancing, eating healthy food. What a disconnect! Just think for a second...
Thank you. Very informative.
Glad it was helpful!
Recursion Pharmaceuticals should be around $ 50 in 3 years!! Greetings from Australia 🇦🇺
this feels like old news. because it is. gg thanks for the supercut
The original came out October 31… so it’s less than 2 weeks “old”
Some day: "My great grandfather used to WRITE PROGRAMS. Imagine that!"
First let's make sure they know how many fingers there are in a hand.
My great great grandmother used to be a computer. Imagine that!
@@KilgoreTroutAsf That's pretty much been solved. Not worth worrying over things that progress either has or soon enough will resolve.
I was able to buy an RTX 4050 after 4 years. This Nvidia CEO is not one to bet against. Insane what they have achieved in this short GPU life span
I need a solar panel and this Blackwell system
And I will do wonders!!
dead internet theory is evident from the comment section..
hardly a human in sight
And really
Hey I’m here 😊
You're a bot
bleep blop boop. I will compute for you.
beepboop
Only thing I don't like is that they are very close to SMCI, which needs to clear a lot up and report their 10-K.
Yes, but how well does it mine bitcoins?
A question for someone who works with CUDA: is there an opportunity to drill down below that level to take advantage of particular features of the chips, akin to what you would traditionally do by writing assembler for some particularly important part where you need every tweak you can get? Likewise at the LM level, how do you narrow the input data to get tighter datasets that exclude duplications etc. when training?
What brand of shoes are those?
@@TheMurrblake crocs
@@TheMurrblake ali baba
Temu Airs
Who was Rochelle? The man in black who looked like Jensen's security.
Why did he constantly give India references 😅 Hindi, chicken 🍗 biryani, Mumbai
Because this event was in India.
Ha, hearing Jensen say "is impossible" is funny; if he wanted to, he could probably do more impossible things! Go NVDA!
Donate for a Dream
AI (playing the role of Dr Strange) is telling us that after over 4 million training simulations, the only way to begin to save ourselves is to save Tony (played by Jensen). Which is the ant and which is the boot? Personally, I find it fascinating to watch all the players positioning themselves for the game of the century.
I see the title, and I check the stock 😆
Moore's Law isn't dead?
Also see our new processor. It's 100x faster and 500x bulkier🎉
This guy, Elon and Alex Karp are 3 of the best CEOs in the world and I wouldn't bet against any of them. Can't wait for Nvidia to crush earnings !
@@DeroQC89 No offense, but have you seen how Elon has been running Tesla and Twitter? Tesla has been shipping fewer vehicles and is being outcompeted by Chinese EVs. Tesla has had the same stagnant designs for years, not to mention the Cybertruck, which has been a complete and utter failure; I think only about 2% of Cybertruck reservations actually turned into purchases. And then we have FSD, which he has been promising for nearly 10 years and still has not delivered. Twitter has lost over 80% of its revenue and value. Elon has gotten lucky in many regards, and what SpaceX has achieved is nothing short of impressive, but he is quite incompetent in many ways. Not to mention how many people have been turned off from buying Teslas due to his rhetoric on Twitter.
@@pingpong1727 I agree. Somewhat emotionally unstable.
@@pingpong1727 He's still the richest person in the world. Chinese EVs outcompete due to cheaper labor, not better quality, management, or technology. Outside of Chinese Tesla sales, Teslas are made in the US and are profitable. How many other auto companies are selling EVs at a profit? Tesla is the first and only Western automaker selling EVs at a profit.
Twitter was always unprofitable. Musk put it in a better position to become profitable when revenue eventually returns. He cut headcount by 75% and the platform still runs just as fast and stably. The main cost issue is the legacy contracts they had in place with AWS. Once that goes away, a huge liability goes away and X will become profitable. The short-term value doesn't matter. He got free advertising for all of his other companies.
He has founded or turned around every company he has touched, from PayPal to Tesla to SpaceX to Neuralink. He's only owned X for 2 years. If that's how you judge a person's success, you're a clown. Apple almost went bankrupt under Steve Jobs before he was able to turn it around. You could call that luck. Tell us about all the success you've had.
I add Sam Altman in your list of CEOs👍👍
@@pingpong1727 The easily led went from TDS to EDS just by being directed by the DNC. The Model Y is the best-selling car in the world (of any kind).
Gary Marcus says this is all hokum. Anybody have an opinion based on the reality of the situation?
But will Blackwell run Crysis?
So could the physical AI they are working on outdo FSD?
I've been doing this for years; maybe it's time to make me a shareholder, right?
Thank you for your video. How can I start trading crypto as a complete beginner? I'm new to cryptocurrency and don't understand how it really works. How can one know the right approach to investing and make good profits from cryptocurrency investment?
Teacher Lewis Hayes. He really is the best; people are talking about him here.
I've read so many people here on UA-cam sharing his extraordinary work, and I think he's really good and interesting to work with...
Yes, I just received my second payment from him yesterday, and I'm looking forward to my third payout too...
It has been an extraordinary experience with Ms. Lewis Hayes so far. May Allah bless her for changing my life.
Please help, I'm new here; how can I contact him?
NVDA... Nvidia pushing green today? VHAI up 30% this week. Vocodia: conversational AI tech with new revenue streams. Palantir up 5% today. SYM... Symbotic green today. Thumbs up the video/comments as the AI evolution begins globally. Thanks
Nvidia and Palantir are the future 😊
Don't care how great Blackwell is; they are beholden to TSMC and lined up like everyone else with allotted slots, so they can't fill their many Blackwell orders even if they want to.
Bro, do you know what kind of dish chicken biryani is?
The fancy Blackwell means individual GPUs can't get faster. Moore's Law is dead, but in the other direction: Nvidia can't make a single chip 2x faster, so they have to combine many GPUs into one. That's not Moore's Law; that's clustering, which was done for CPUs decades ago!
The overall narrative is the growing utility of AI. AI has extended my capabilities in ways that save me 3-4 hours a week and enable extensions to my work that weren't possible before. That kind of productivity will unlock new kinds of products, services, systems, platforms, and businesses. And that's before we even talk about physical robots.
Akoustis Technologies is not a widely known stock, but according to ChatGPT and Gemini it's both overlooked and undervalued 😎
Let’s Go
10:47 For those who don't get why he didn't give a number for after 10 years of 4x per year: it's because it's over a trillion x, while Moore's law is only 100x after 10 years.
"Incredible scaling," my ass. More like focken insanity.
trillion, or million?
@@kurtpedersen1703 1.05 million.
Also raw calculation power is less efficient since managing the parallel tasks becomes cumbersome. Still impressive though.
@@kurtpedersen1703 If it's 4 doublings a year, it's about 1 trillion; if it's 4x a year, it's about a million x in 10 years. Maybe I misunderstood his quote, but either way it's a lot lol
@brainstormingchannel7490 I think he said 4x/year, not 4 doublings. But I'm not gonna watch it again 😆
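The arithmetic behind this thread, for anyone checking: 4x per year compounds to roughly a million over a decade, while Moore's-law doubling every ~18 months gives the familiar ~100x per decade.

```python
# Compound growth over a decade under the two scaling claims in this thread.
ai_scaling = 4 ** 10          # 4x per year for 10 years
moore = 2 ** (10 / 1.5)       # doubling every ~18 months for 10 years

print(f"{ai_scaling:,}")      # -> 1,048,576 (about a million x, not a trillion)
print(round(moore))           # -> 102, i.e. the familiar ~100x
```

Four doublings per year (2^40) would indeed be about a trillion, which is where the two readings of the quote diverge.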
The King!
The absolute legend!
I am regretting missing the recent Tesla rally.
@@DK-ox7ze don't feel like you missed out Tesla. It's a company has over 1 trillion market cap and only makes $3.65 a share. It's up because of politics, not because the company has a solid ground.
It's usually young people investing in this company.
NVDA has plenty more room to grow than TSLA.
Tesla Energy alone may grow net profit by 40% one year from now. With over $30 billion in cash, I think Tesla is in a pretty good position to grow. Too bad the media spews endless misinformation about Tesla. Didn't they say all the competition had arrived five years ago? VW, Audi, BMW, Hyundai were all supposed to leave Tesla in the dust. What actually happened? The Model Y and Model 3 are some of the best-selling cars globally, while their competitors are far, far behind, except for BYD.
will turn 1000 again in a couple of years 🎉🎉🎉🎉
If you make something 2x faster, it means that you do something smart. If you make something 100x faster, it means that you stop doing something stupid.
Haha I like this. I’m stealing it!
Did no one else notice an agent bot in development got loose and walked on stage trying to be useful...
Huang just patted him, called him Michelle, and pretended it wasn't anything unusual 😂😂😂😂😂
Then he walked off stage and shouted at his staff, "Bring me our guardrails engineer, it was too close this time"
Compute in, compute out. On/off, yes/no, true/false, base 2. Linearity ruled. Now we're moving towards 3-dimensional computing.
Never crossed your mind that NVIDIA's accounts can also be "cooked" like SMCI's? 😅
60 years? Really? I never heard about it.
Nvidia needs to package the next chip in a black leather case. I reckon sales would double.
no new GPUs? :(
This is just the beginning. NVIDIA will change the world we know. It's bigger than anyone can imagine now. Humanity will skyrocket in the next 5 years. We'll solve problems you wouldn't imagine now: eternal life, warp travel, time travel, black hole creation, terraforming, unlimited energy, dark matter, etc. This is beyond reality.
You're absolutely correct.
None of that will happen in the next 5 years, or even 10. They still haven't given us our flying cars.
In the late 1980s they said by 2000 we would have flying cars, and 2000 was nearly 25 years ago!
Black holes were solved a long time ago. Search P Diddy freak offs.
😂 keep dreaming. Think about real problems, not just theories that are more fiction than real science
@@ShaneMcGrath. We have them, but no one wants them or uses them.
Whenever I see a YouTube title that says "shocking" or any variation of it, I automatically assume it's just clickbait and not worth watching. Maybe these types of titles would've worked on me 10 years ago.
Can AI help me speed up my computer without using a graphics card 🤔
This technology is really impressive and scary at the same time. The last part, a fully robotic factory that is actually designed by AI, makes human beings obsolete. Our world economy is based on humans being able to eat because they get paid for the work they execute and the final product that is sold. Automating all aspects of productive work means humanity will need a new economic model, one where the current capital owners no longer control the system. Keeping the current model through this AI revolution means that only capital owners would be able to exist. If the current model of society is not completely remade, social unrest will arise and we will self-destruct as a race. So, in a fully AI-based world like the future Huang presents, this production capability must be transferred to collective ownership. That looks a lot like communism, but it is not: communism negates human rights regarding individual benefit from direct work. The new model will be based on the fact that human labor is no longer required, except for a very small number of control positions that should be filled ad honorem.
speechless indeed.
So Blackwell is Tesla's Dojo?
End of human Era. Age of Robots is upon us. What's going to happen to all the manual jobs?
You will still need manual labor … it won’t replace everything
@@alanalda9686 and just remember, we aided it all, every step of the way.
Jobs for dummies won’t exist anymore.
@kilodave77 Pretty much, and it'll be the dummies that gave them away. 😂
@ I climb trees for a living. I’m not worried. No computer or robot can do what I do.
it looks like the computers from the 60s :)
Definitely just bought more nvidia
This guy is amazing. How'd they come up with all this? Nuts!
Yeah that dude on stage was eerily ai looking.
how do I do a hello world with this?
You don’t. The AI now does hello world with you.
@@edgardcz say hello to your computer overlords
😂😂😂😂😂😂😂 as an NVIDIA investor I call this a complete joke.
Saw at least the first part of this video a while back. Is there anything new in this video?
The original video is from Oct 31, 2024. If you've watched Jensen's other keynote though, you probably saw most of this as well.
I remember the days before digital anything. Has the quality of life improved? Warning Will Robinson...Cybernetics ahead.
what date was this talk?
Oct 31, 2024
He's such a rebel with his black clothes and black leather jacket .... /s
You just know he's backstage hitting a tobacco cone as they make his introduction
He never said a million times faster, he said 4^10.
So this is clickbait for people who can't do the math, or can't be bothered to.
AI stole my job.
what have I watched?
Just realised the white on his shoes is to match his hair…
Make the software the hardware.
Just curious, whether your bull thesis on SMCI is intact or not! Or trash it?
Jensen it's time to buy a new jacket
For Nvidia's line of business of selling computing capability, this new ML-based "Software 2.0" is great. But why does that make current, classic programming "bad"? Because it's made by human beings, so Nvidia can't sell it? Why replace fully working deterministic systems with AI-made ones that actually cost a lot of power and energy to produce, refine, and run? I can see AI as a good efficiency tool for producing new systems that humans can't or haven't built yet because the data volume is beyond human scale, like simulating and testing billions of combinations. But neural networks are like statistics: they rest on the premise that the data is correct. If you feed an ML model enough data saying that 2 + 2 = 5, it will produce a model that gives wrong addition results. This is very different from fully deterministic, rules-based programming that is correct (or wrong) by its fundamentals. As Huang mentions in this speech, this programming model is like numeric approximation formulas, and those never produce truth, only approximations of truth. That might be a dangerous path to follow if all systems get made that way. You will end up with software that occasionally produces wrong results that are impossible to track and correct, since no human being will be able to modify ML-generated functions; the programs will be unreadable. These issues are already happening in some industries, where automated decisions hurt customers, and because those companies don't provide proper support and trust automation blindly, customers get their rights denied and the basic rule "the customer is always right" turns into companies that only trust their own navel.
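The 2 + 2 = 5 point above can be made concrete with a toy sketch (my own illustration, not anything from the keynote): fit a least-squares line to addition data whose labels are all off by one, and the "learned" adder faithfully reproduces the corruption.

```python
# Least-squares fit of y = w*s + c on corrupted data: every label
# says a + b + 1 instead of a + b. The model learns the corruption.
pairs = [(a + b, a + b + 1) for a in range(10) for b in range(10)]
n = len(pairs)
mean_s = sum(s for s, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n
w = sum((s - mean_s) * (y - mean_y) for s, y in pairs) / \
    sum((s - mean_s) ** 2 for s, _ in pairs)
c = mean_y - w * mean_s
print(round(w * (2 + 2) + c))  # the learned "2 + 2" comes out as 5
```

The fit is perfect with respect to its training data, which is exactly the problem: nothing inside the model can tell that the data itself was wrong.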
Real-time translation
Nvidia slowly turning into Skynet?
80% of the information gets lost in the system, space, time, or elsewhere.
Good thing they use GPU instead of GUP….it would laugh every time I say it….
If that technology existed for that many years, why bring it up only now, if you had the idea of creating AI technology all along?
The hardware to train LLMs only relatively recently (in the last couple of years) became fast enough to do the training.
Sorry Jensen, there are some things you can't "accelerate" (aka parallelize). Pretty much every program has a sequence of steps that need to be executed in order. Scientists and engineers have been searching for this holy grail of parallelizing everything, but it's mathematically not possible. CPU companies like Intel and AMD have put decades of work into branch prediction and gotten some gains, but you can't predict everything. I'm sure today's latest AI models could improve on this, but you can't do what is logically impossible (100% prediction of all branches).
Branch prediction is very important for common CPUs, but it has nothing to do with AI.
Imagine that you have a huge array of data and everything you need to do is a very simple allreduce.
These are very simple calculations, and there is no need to branch depending on the result of a calculation.
And every chunk of data can be calculated independently (in parallel).
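A minimal sketch of the kind of workload described above, using Python's standard multiprocessing (the chunk count and pool size of 4 are arbitrary choices for illustration):

```python
# Data-parallel reduce: split the array into independent chunks,
# sum each chunk in its own process, then combine the partial results.
from multiprocessing import Pool

def chunk_sum(chunk):
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]  # 4 disjoint slices covering all of data
    with Pool(4) as pool:
        partials = pool.map(chunk_sum, chunks)
    print(sum(partials))  # same answer as sum(data), computed in parallel
```

No branch depends on any computed value, which is exactly why this maps so well onto GPUs.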
Quantum computing would solve it I guess, easily simulate trillions of states at the same time
@@dimes1917 not arguing with that scenario. That’s why GPUs exist and are so good at tasks that are parallelizable such as graphics and DNN training. But in the video, Jenson is implying that we (software developers) have to apply this throughout our code instead doing sequential operations. I don’t think that’s possible. Let’s take a super basic “Hello World” program. The program starts, it prints the string, and then ends, and it has to do all 3 in that order. You can’t end the program at the same time you’re printing the string. Another example would be a function that has a switch statement based on the current temperature outside. This value is fetched from an api, and let’s say we divided the values into three ranges, such that there’s 3 different branches the code can take depending on the temperature. You can’t predict the temperature value in the code so your branch prediction pipeline will fail 2/3rds of the time. Expand this to 100 temperature ranges and you see how the problem only gets worse the more complicated the conditional logic is.
@@magicsmoke0 sorry, dude, but seems you think too narrow.
Probably outdated...
Well, about 25 years ago I worked in Intel compiler team...
@@dimes1917 I’d love to hear in more detail how this outdated. Tell me how we can use the GPU to “accelerate” a REST API that does CRUD operations for an e-commerce website?
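The cap this thread is circling around is Amdahl's law: the serial fraction of a program bounds the total speedup no matter how many processors you add. A few lines make the point (the 95% figure is just an illustrative choice):

```python
# Amdahl's law: with a parallelizable fraction p running on n processors,
# overall speedup is bounded by 1 / ((1 - p) + p / n).
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even unlimited processors can't push a 95%-parallel program past 20x,
# because the 5% serial part always runs at 1x.
for n in (10, 100, 1000):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

This is why both sides of the thread are partly right: embarrassingly parallel workloads (graphics, DNN training) scale beautifully, while code dominated by sequential steps barely moves.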
This is highly edited.
Yes, that's what a supercut is.
@@TickerSymbolYOU kudos to saving some time for us.
That's not the meaning of 'meaning'.
What does 'meaning' mean to you?
@@TickerSymbolYOU: not a next-word predictor trained by stochastic gradient descent. Understanding the meaning of something means grasping the abstract principles underlying it, not building a statistical model.
Why can your AI not find the truth?
Too bad Tesla doesn't have a smart, full-time CEO like Jensen.
They do a lot of good AI stuff, but their graphics cards are a massive rip-off.
Poor Jensen has to censor his DNA and identity 😂😂😂
It's funny, did he say Nvidia made the first 3D GPUs? Well, there were 3dfx, Matrox, etc.
What the heck is he talking about
Huh, we are nowhere close to machines developing everything.
Put that in YEARS: 50? LOL, not a chance it's anywhere that far out.
Like you have any idea on what is going on.
@@francoisdube0 Well, I'm a software engineer, and for now LLMs are kinda shit at coding despite all the videos you watch that hype it up.
🥳🇺🇲✌️
Jensen's mouth is 1000 times faster