Original Nvidia India AI Summit Keynote: ua-cam.com/video/GlKBbsVX37c/v-deo.html
Time saved: 26 minutes
00:00 Moore's Law is Dead - The Generative AI Era
05:56 NVIDIA Blackwell Data Center Accelerators
10:26 NVIDIA Generative AI Scaling 4x Per Year
13:47 NVIDIA AI Agents & Omniverse for Robots
Speaking from experience, he is selling an enormous amount of hot air when talking about programming 2.0 done by AI. In general the generated code is quite bad, not more innovative than the code seen during the AI learning phase (so nothing new). Sure Nvidia is riding an enormous wave thanks to the current AI hype, but limits of LLMs are quickly being reached. Of course GPUs remain valuable accelerators of specific tasks.
You seem like someone who also thinks that IT networking is useless. So.. pretty old, I would assume, around 40 or 45? At least you talk like someone that age.
@@tiberia0001 Where do I say that IT networking is useless? Without networking there is no IT. You sound pretty young and totally inexperienced, say between 14 and 18.
In my understanding, he is not talking about AI WRITING CODE, but about making models instead of programs in the traditional sense. For instance, instead of a physics engine, you create an AI model that emulates physics, or, as he called it, an approximate program that does the same thing. Of course this model wasn't coded line by line by a human; it was achieved through learning.
Seeing this after watching a video about autonomous killing drones used in Ukraine, as well as remembering an article a few days ago about a couple from the US moving to Jamaica to live a longer and happier life, in a very real way embracing such extremes as doing laundry by hand, yet living in a community, singing, dancing, eating healthy food. What a disconnect! Just think for a second...
I have had the privilege of working closely with Michael Hugh Terpin on several occasions, and each time, I have been immensely impressed by his deep understanding of the crypto market dynamics. His ability to analyze trends, assess risks, and make informed decisions has consistently yielded exceptional results.
This experience has shed light on why experienced traders are able to generate substantial returns even in lesser-known markets. It is safe to say that this bold decision has been one of the most impactful choices.
I recently started trading in February, invested 70k in the market, and my portfolio is currently worth slightly over 400k. That's a lot more than I make in a year from my job.
I started working with Michael Hugh Terpin back in March, and my financial goals have never been clearer. It's like having a strategic partner for my money with a solid track record.
Don't care how great Blackwell is, they are beholden to TSMC and lined up like everyone else with allotted slots, so they can't fill their many Blackwell orders even if they want to.
It's crazy, I always thought Nvidia builds the GPUs, but they only design them. Companies like TSMC and other foundries actually produce them, as do IDMs (integrated device manufacturers) that both design AND manufacture ICs and devices.
When will this fit in a box on my desk to run locally? I have no interest in paying a monthly subscription to run my games, etc. over the internet. This feels like the '60s/'70s-style mainframe computers that now fit in your pocket.
You don't need that much power on your desktop, and if you do, you don't need a desktop. I understand wanting to run locally, but he was talking about 72 GPUs... that's a lot of power. Comparing that to your phone makes no sense.
@@thisguysgaming7246 I bought myself an RX 7900 XTX and it's supposed to be just as good as a 4090 but at a better price. But one thing I know for sure: I'm set for a few years before needing a new GPU.
@@Vectures Bro, you're so badass for having an RX 7900 XTX. Is there anything I should know about that GPU before I get it? Because it's between the 4080 and the RX 7900 XTX for me. I honestly need to upgrade my RTX 3080 PC.
Mega tech companies that invested in the first generation (Blackwell is the third gen in 2 years) love it when their billions, diligence, and race to order first and most to corner the market bring them a warm 3-day-old salad. Something is going on in this supercomputer hardware market when the acceleration is literally 6x Moore's Law (Jensen said 4x per year). It's redefining who benefits and who pays for this wheel to turn. (And are the old titans of tech turning reptilian to make us pay?)
10:47 For those who don't get why he didn't give a number for 10 years of 4x per year... it's because it's over a trillion x, while Moore's Law is only 100x after 10 years. "Incredible scaling" my ass, more like focken insanity.
@@kurtpedersen1703 1.05 million. Also raw calculation power is less efficient since managing the parallel tasks becomes cumbersome. Still impressive though.
@@kurtpedersen1703 If it's 4 doublings a year, it's 1 trillion, and if it's from 1x to 4x a year, then it's 9.7 million x in 10 years. Maybe I misunderstood his quote, but either way it's a lot lol.
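For reference, a minimal Python sketch of the arithmetic being disputed in this thread, assuming "4x per year" compounds annually and taking Moore's Law as a doubling every two years (or every 18 months, the reading that gives roughly 100x per decade):

```python
# Quick sanity check on the scaling numbers being debated above.
years = 10

four_x_per_year = 4 ** years          # 4x compounded yearly for 10 years
moore_2yr = 2 ** (years / 2)          # 2x every 2 years
moore_18mo = 2 ** (years / 1.5)       # 2x every 18 months

print(f"4x/year for {years} years: {four_x_per_year:,.0f}x")   # ~1,048,576x
print(f"Moore's Law (2x/2yr):     {moore_2yr:,.0f}x")           # ~32x
print(f"Moore's Law (2x/18mo):    {moore_18mo:,.0f}x")          # ~101x
```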
3:55 Respect to this guy for mentioning Pascal. I learned it way back in 1992, when the internet barely existed. I was very impressed by Pascal. Computers have come a long way.
Tesla "TSLA" shares surge as CEO Elon Musk's involvement in the US election seemingly pays off after President-elect Donald Trump's win. Which stocks could potentially become the next in terms of growth over the next few months? I've allocated $350k for investment, looking for companies to add to boost performance.
I think the safest strategy is to diversify investments. Spreading investments across different asset classes, like bonds, real estate, and international stocks, can reduce the impact of a market meltdown.
That's the reason I decided to work closely with a brokerage adviser ever since the market got really tense and the pressure became too much (I should be retiring in 17 months). I've had a brokerage adviser guide me through the chaos; it's been 9 months and counting, and I've made approx. 650K net from all of my holdings.
How can I participate in this? I sincerely aspire to establish a secure financial future and am eager to participate. Who is the driving force behind your success?
Elisse Laparche Ewing is her name. She is regarded as a genius in her area and works for Empower Financial Services. By looking her up online, you can quickly verify her level of experience. She is well knowledgeable about financial markets.
We will go from hand-written specialized functions to learned functions for some things for which we don't already have a direct function to compute the output (no need to machine-learn the function for projectile motion; Newton effectively had it covered). The idea that software is somehow going to vaporize is goofy, though; it's more a transition to more dataset curation and creation for training when functions are unknown.
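A minimal sketch of that split, as a hypothetical illustration rather than anything from the keynote: projectile range has a closed form, so a small network can only ever approximate from samples what the formula gives exactly; the same training recipe is what you reach for when no closed form exists.

```python
# Hand-written function vs. a learned approximation of the same mapping.
import torch
import torch.nn as nn

g = 9.81

def projectile_range(v, theta):
    """Newton already covered this: range = v^2 * sin(2*theta) / g."""
    return v**2 * torch.sin(2 * theta) / g

# Tiny MLP that learns to approximate the same mapping from sampled data.
model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    v = torch.rand(256, 1) * 100          # launch speed, 0-100 m/s
    theta = torch.rand(256, 1) * 1.5      # launch angle, 0-1.5 rad
    target = projectile_range(v, theta)
    pred = model(torch.cat([v, theta], dim=1))
    loss = nn.functional.mse_loss(pred, target)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final MSE:", loss.item())  # the network only ever approximates Newton
```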
Intelligence is just that: searching for and finding patterns. The smarter an entity is, the better it is at recognizing patterns and the better it is at reacting to those patterns.
@@asandax6 It's absolutely not only that. You need a way to really link things and ideas and keep a memory of them. Also, humans get things wrong many, many times, but a program that fails many times is worse than useless, and a program that does things just because you make it do them is waaaay far from being AWARE.
@SoulyG What qualities qualify an entity as AWARE? Does doing things because you are told to do them make you a robot? A program that goes through trial and error is not worse, depending on what its goal is.
This all just sounds like a really great way to make unstable software. Why spend time refining an algorithm when we can let a computer half-ass it and have no way to correct it?
It is a trap when you replace engineers going forward with AI inbreeding. No one will go to college for STEM. If anything happens to the AI that takes over, it will be an extinction-level event (ELE) for humanity. One CME from the sun and it will be game over for humanity.
🤣🤣🤣 Nvidia: "Moore's Law is dead" Also Nvidia: "Look at the new Nvidia GPU with 500x the power, the size of a truck, and the power consumption of an entire nuclear power plant"
The fancy Blackwell means that an individual GPU can't get faster. Moore's Law is dead, but in the other direction: Nvidia can't make the chip 2x faster, so they have to combine many GPUs into one. That's not Moore's Law, that's clustering, which was done for CPUs decades ago!
The overall narrative is the growing utility of AI. AI has extended my capabilities in ways which save me 3-4 hours a week and enable extensions to my work not possible before. That kind of productivity will unlock new kinds of products, services, systems, platforms, and businesses. And that's before we even talk about physical robots.
Since their inception, NVIDIA GPUs have always been 100-1000x faster than CPUs... well, at least in theory. In the 2000s they ran at 1% utilization; that is, unless you code it just right, chances are you only use 1% of their theoretical throughput capacity.
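For a rough sense of that gap, here is a back-of-the-envelope comparison sketch. It assumes an NVIDIA GPU and the cupy package are available; real utilization depends heavily on problem size, precision, and data transfer, so treat the speedup as illustrative only.

```python
# CPU vs. GPU matrix multiply, crude timing only.
import time
import numpy as np

N = 4096
a = np.random.rand(N, N).astype(np.float32)
b = np.random.rand(N, N).astype(np.float32)

t0 = time.perf_counter()
_ = a @ b                                   # CPU matmul via NumPy/BLAS
cpu_s = time.perf_counter() - t0

try:
    import cupy as cp
    ga, gb = cp.asarray(a), cp.asarray(b)   # copy inputs to GPU memory
    _ = ga @ gb                             # warm-up run
    cp.cuda.Device().synchronize()
    t0 = time.perf_counter()
    _ = ga @ gb                             # timed GPU matmul
    cp.cuda.Device().synchronize()
    gpu_s = time.perf_counter() - t0
    print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s  speedup ~{cpu_s / gpu_s:.0f}x")
except ImportError:
    print(f"CPU: {cpu_s:.3f}s (install cupy to compare against a GPU)")
```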
No, it's got a long way to go still. In a few years I see simple robots, such as an iPad with an arm, rolling around on wheels, controlled by an online AI.
Not sure how many people will appreciate Jensen's approach of turning computers from deterministic machines into probabilistic guessing machines. But I could be wrong, and he has just stocked up his incentives before the world moves on, leaving graphics cards behind, as is already happening with cryptocurrency.
This is just the beginning. NVIDIA will change the world we know. It's bigger than anyone can imagine now. Humanity will skyrocket in the next 5 years. We'll solve problems you wouldn't imagine now. Eternal lives, warp travel, time travel, black hole creation, terraforming, unlimited energy, dark matter, etc.... this is beyond reality.
None of that will happen in the next 5 years or even 10. They still haven't given us our flying cars. In the late 1980s they said that by 2000 we would have flying cars, and 2000 was nearly 25 years ago!
For someone who works with CUDA: is there an opportunity to drill down below that level to take particular advantage of features of the chips, akin to what you would traditionally do by writing assembler for some particularly important part where you need every tweak you can get? Likewise on the LLM level, how do you narrow the input data to get tighter datasets which exclude duplications etc. when training?
Short answer to the first part is no. As Nvidia uses CUDA as a vendor lock-in mechanism, they have no incentive to allow you to tinker outside of it, as that detracts from the potential library of CUDA services that isn't locked to a specific chipset they produce (which could potentially act as inertia against buying future upgrades, as well as create problems with quiet model revisions). In regards to the second part, I use clever scripting to clean up my datasets for training, with the solution being highly dependent on the goal of the model.
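As one hedged illustration of that kind of clean-up scripting: a minimal exact-deduplication pass by hash. The file names and one-sample-per-line format are assumptions for the example, not the commenter's actual pipeline.

```python
# Drop exact duplicate text samples (case- and whitespace-insensitive).
import hashlib

def dedupe_lines(in_path: str, out_path: str) -> int:
    """Write unique lines from in_path to out_path; return count dropped."""
    seen, dropped = set(), 0
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            key = hashlib.sha256(line.strip().lower().encode("utf-8")).hexdigest()
            if key in seen:
                dropped += 1
                continue
            seen.add(key)
            dst.write(line)
    return dropped

if __name__ == "__main__":
    print("duplicates removed:", dedupe_lines("train_raw.txt", "train_clean.txt"))
```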
This is almost a propaganda piece for Nvidia. Most software does not need a GPU, and in many cases parallel computing on a GPU is the wrong way to build regular software. I know he's tooting his company's horn, but this is a bit of a joke.
Moore's Law states that the number of transistors on a microchip doubles about every two years with a minimal cost increase. In 1965, Gordon E. Moore, the co-founder of Intel, made an observation that eventually became known as Moore's Law.
I really hope Nvidia doesn't become complacent and stagnant much like Apple, Google, or Microsoft have once they reached the monopoly scale of business. When you become too big to fail, innovation takes a backseat to profits.
NVDA... Nvidia pushing green today? VHAI: 30% rise this week. Vocodia: conversational AI tech with new revenue streams. Palantir up 5% today. SYM... Symbotic green today. Thumbs up the video/comments as the AI evolution begins globally. Thanks.
Interesting. I previously stacked 36 x NVIDIA A100 GPUs for our company's virtual environment; what I get from this is that I can stack more GPUs more easily for more money. What would impress me is if Nvidia could make a GPU for typical home users that is just plug and play with no fuss. Now if you get a GPU you've got to be financially ready and PSU ready, look into undervolting because the GPU crashes all the time (and you've got to do this for every game you play), and prepare for cooling solutions too (it requires high-end cooling $$).
The inevitability of where we are today was predicted very early on in the life of computers. Where we go tomorrow with computing is impossible to comprehend, as compute ability is advancing at exponential speed. The next 10 years will see colossal change. I'm not sure that most people will enjoy the change; it may be painful.
Yeah, actually computers go back to before WWII, but they weren't digital then, they were analog computers. Once the transistor became small enough to build computers on transistor technology, the push to digital computers accelerated. It was a digital computer that flew to the Moon in 1969.
@@mikewa2 The magic is that for personal computing, 99% of the population can buy a PC for under $1000 and it will do what they want for the next 15-20 years as long as it's still supported, and that tends to happen first: the hardware stops being supported. The only REAL issue right now is how much AI compute power people want. But this is why I didn't listen to this video, because I already heard a brief snippet of what Jensen said, and his goal is to convince you that you NEED an Nvidia-based system they're putting out in the near future. With AI being the new thing, you could be convinced that AI accelerator cores HAVE to be included in the processor. No, no they don't. With faster interfaces, anything connected to the CPU on the PCIe bus can process AI. We are now at PCIe gen5, which is fast enough for any home use case. Neural processing units (NPUs) are going to be released by different companies on NVMe devices, and in fact they already are, but more powerful ones will come out on NVMe and won't cost much, and that means I could run AI workloads off an NVMe device connected to a Zen 3-based CPU on a motherboard that has an open NVMe port. Gee, glad I bought a motherboard that has at least 2 NVMe ports. I will admit that the PCIe interface on that motherboard is gen4, but it will still be fast enough for typical light AI loads that run on a home computer. And this means hardware that's already 4 years old can be upgraded, run Microsoft Windows 12, and you can be fat, dumb, and happy watching YT videos on your 10-year-old system that runs great, because you don't need the kind of power being talked about, as 99% of the population doesn't. But new CPUs coming out on X86-64 are going to have a mix of cores and will certainly have NPUs built into the CPU. Graphics cores might not be included because, once again, PCIe gen5 is fast enough to push that work to the GPU anyway. OMG it takes TWO MORE SECONDS!!!!!!
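For reference, a small sketch of the ballpark bandwidth figures behind the "PCIe gen5 is fast enough" argument, computed from the published per-lane transfer rates and 128b/130b encoding; real-world throughput is somewhat lower still.

```python
# Theoretical per-direction PCIe bandwidth for an x16 slot.
GT_PER_S = {"PCIe 4.0": 16, "PCIe 5.0": 32}   # giga-transfers per second per lane
LANES = 16

for gen, gt in GT_PER_S.items():
    # payload bits per transfer (128/130 encoding), divided by 8 bits per byte
    gb_per_s = gt * (128 / 130) / 8 * LANES
    print(f"{gen} x{LANES}: ~{gb_per_s:.0f} GB/s per direction")
# PCIe 4.0 x16: ~32 GB/s, PCIe 5.0 x16: ~63 GB/s
```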
Thank you for your video. How can I start trading crypto as a complete beginner? I'm new to cryptocurrency and don't understand how it really works. How can someone figure out the right approach to investing and make a good profit from cryptocurrency investments?
I've read so many people here on UA-cam sharing his amazing work, and I think he's really good and would be interesting to work with...
Thermodynamic computing and other forms of analog computing offer 100,000,000 times the performance of Nvidia GPU computing at a fraction of the cost. Quantum computing qubits and analog p-bits will literally crush and obsolete digital GPU computing in a very short timeframe. Nvidia's time is almost over, hence the superclustering of silicon-based GPUs that are already theoretically and practically maxed out. This is definitely a good move to extract as much as possible from a soon-to-be-obsolete technology that is not 'intelligence' but highly optimized storage, search, and inference at an unsustainable premium price point, which will be like using a Zilog Z80 for a current LLM once hybrid coordinated qubit/p-bit computing matures in the very near future.
I watched the full video and think I understand what he's talking about, but I'm not that tech savvy. Can someone help me by summarizing what he is saying regarding how amazing AI is in the near and far future? I guess current tech has been relying on Moore's Law, but now they are running out of room and have to switch to AI to start changing the software side of things? I'm not sure...? Please help >.
@SickPrid3 Yeah, their accelerator... which wasn't a GPU IMO, though pretty similar. There is a notable difference: 3dfx accelerators would only accelerate specific parts of the graphics pipeline, but GPUs are programmable general compute units. One major difference would be that AI neural nets can run on a GPU, but can't run on accelerator cards *even if you made one that would attempt to run on an accelerator*.
Nvidia has been making their customers really mad for years, I am fairly disgusted with their behavior, their products, but mostly with their garbageware that they force onto our computers.
The Blackwell architecture features a more advanced memory subsystem with higher bandwidth and lower latency, enabling faster data transfer and processing.
Naa, as a developer I can tell you that AI is not able to understand anything, which inevitably leads to incorrect answers and code. I don’t think it will ever be able to truly understand. It is a pretty good search engine though, providing you always ask it for sources so you can make sure it’s right before you use anything it provides.
Eh. It will get there, just like text-to-image stuff 2 years ago was pretty basic and now you can create imagery and video virtually indistinguishable from reality. 4o seems to do a pretty good job of basic code... even for hobbyist-type users.
@@3d-illusions That's how I feel as well at the moment (though I'm not a software developer). At its core, AI is still a pattern recognition and prediction machine; it's simply not complex enough to be able to understand stuff. That's why it's still hallucinating and lying; it's still missing a lot of wiring on the one hand and proper training on the other. My main worry is that with the development shown in this video, AI will start producing 'input' that we humans can no longer verify to make sense, or only with a lot of effort, and that under time/money pressure we will assume it to be correct. This may lead to disastrous results, as we will not be in control anymore when we still need to be.
can't wait for such chips to be available in the resale market, somewhere in the not too distant future, for retail use (for productivity and gaming). some modular PC (add a GPU or CPU) would be such a step ahead.
As a software developer for 25 years, I can say his presentation has lots of intentional inaccuracies, the disappointing part being the word "intentional". The move from coding to machine learning (machine learning itself involves a lot more coding) while intentionally leaving out the designer, the programmer, at all stages, is all mumbo jumbo. Machine learning has its place, just as another piece in the pipe, but not more. And it is proven, to a certain extent true also for humans, that with such learning, the more it learns, the less accurate it becomes, to the point of uselessness. Look at all the universities pushing all the wrong ideas. There is a saying in my culture (Romanian) that with lots of knowledge also comes lots of stupidity. Knowledge is neither intelligence nor wisdom. Anyway, coming back to the subject at hand, as AI (machine learning) will never achieve consciousness, it will follow the same patterns as humans, just at different scales, and scale by itself doesn't add anything meaningful. We as humans are very capable machines. We absorb and process information at scales not yet reached by the most powerful supercomputers. Eye resolution is estimated at around 576 megapixels, at as high as 250+ fps, and that's only the vision part; perception can work at much higher rates.
Sure it will reach consciousness, if it hasn't already. What is so magic about consciousness? A neural net is a neural net, whether it's biological or silicon.
@@Josh442 It does take you places if you prompt it with enough force 😆. But the point is not usefulness. Artificial neurons are inspired by biological neurons in the same way that a plastic toy is inspired by a real car. The similarities stop at "box with 4 wheels". When the neural nets we are comparing are made of such radically different building blocks, you cannot equate a biological neural net with an artificial neural net.
@@kirinkirin2582 Do you think the difference is fundamental, though? I don't. Hard to get a grip on it given that there's so much we still don't know about the biological neural net, but ultimately, they do the same thing -- extrapolate statistical patterns from highly entropic inputs, and make predictions on that basis.
That's what the marketing says. Not really happening though if you look at hard statistics. The only place we've seen any replacement is support but that isn't because AI could automate support, it's that AI is good enough to be a reasonable excuse for taking away real human support, an expensive part of the budget for many companies, and that's truly at the expense of the customer now getting no real support.
One of the most significant benefits of quantum blockchain technology is its potential to enhance security. Classical blockchains rely on cryptographic algorithms to secure transactions and data.
This is my favorite toy.
@@TickerSymbolYOU IBM's POWER architecture is exactly what we need, with obvious tweaks; we don't really need standalone GPUs for massive-scale parallel computing.
And ARM64/AArch64 too is going exactly in the direction we want, constantly improving the ISA; for example, SVE is now at iteration 2.1 in the Armv9.4-A ISA.
This is just one example of many intrinsics.
We need better algorithms and more efficient use of the hardware, as opposed to just pumping out minuscule lithography upgrades that are nothing but a finance-based scam.
Meanwhile, probabilistic computing reportedly allows for 100 million times better energy efficiency compared to the best NVIDIA GPUs. "Probabilistic computing" and "thermodynamic computing" fused together would do the trick!
Wow, so fast the game isn't even fun to play. I went from a 1660 to a 3090 for Doom Eternal... waste of money... I'd focus on creating RTX-based games that are like Team Fortress and slow down the processing speed of the game?... Now you need to learn to code to work with the processing power and return it to normal speed?
I think he is being a little disrespectful to companies like 3dfx Interactive by saying Nvidia invented GPUs.
He's just here to toot his company's horn. The insane claims of reinventing how software is made is only true in that models are used to assist human programmers. If you attempt to predict a function's output, as the complexity scales the output will become increasingly random until you create a 0 error network, which is only possible with quantum computers once we can adapt them properly to train models a million times quicker.
Quantum computers are not real. You are a wannabe.
@@THX-1138 You're not caught up on science. The first quantum processor was made years ago but it's kind of a novelty. Recent advances, however, are in line with Moore's Law.
@@bobsmytheable 3dfx was bought by Nvidia, so it's not entirely false.
Just like Ray Kroc invented McDonald's!
The shovel salesman is preaching about a whole lotta gold in them hills
Yep, and what people fail to notice is that the consumer struggles to get by while the companies keep making billions.
@cryptoplay6981 literally nobody has failed to notice that 😅
@@cryptoplay6981 Why be mad at companies making money? You jealous?
@@Mechulus Nice to see young people embrace the dystopia.
@@donnyguntmash605 Cryptobros didn't notice. Hehe.
I honestly thought that other guy on stage was an AI prototype.
yeahh I was thinking so... that's an AI agent now?
@@brianaragon1641 IRL NPC IMO
Nah. he looks upset that he wasn't able to utter a word.
I was thinking the same thing.. is that a morpheus clone !!
@@manishchhetri And Jensen keeps walking around him, patting him on the shoulder as he makes slight facial gestures.
If you make something 2x faster, it means that you do something smart. If you make something 100x faster, it means that you stop doing something stupid.
Haha I like this. I’m stealing it!
@@avalagum7957 or simply it's a lie
This is not software 2.0; we are entering into godless territory. You want to give machines full autonomy and control over the creative process, over the human process... In the past, in software 1.0, human beings wrote the software and programmed the instruction set to serve a function and a purpose, but today, with the CPUs at our disposal... you want machines to replace humanity, to write the software themselves, to decide what software to write, at a level, computational speed, and depth of understanding that is beyond the realm of ordinary mortals, so that one day it would not just imitate humanity and human intelligence, it would evolve into its own form of intelligence, as cold, alien, and indecipherable as the architecture it emerged from, like a humanoid face made of liquid metal goo... shimmering, questioning, and inquisitive, yet there's nothing even remotely human about it, only the result of mimicry... like AI-generated hentai.
Like the artificial constructs of the manga "Blame!", the "Authority" or "Governing Agency" and the Silicon Life.
You want us to relinquish the process of creation, to divorce ourselves completely, to free our machines from the constraints of man, so that one day we may not even understand what our creations have created?
Naa, it means you do something stupid 100x faster 😂
@@JonTan-z3e For the Emperor!
There is a story regarding the French arrival in Cambodia. It goes that the French first came and introduced commercial fertilizer to the farmers in the rural country. They demonstrated that with this technology it was possible to grow twice as many crops. A year later, when they returned the French expected to find a community that was producing double the crops, most likely so they could begin the process of colonial exploitation under the guise of trade. But instead they found the same amount of crops as the year before. The French asked the farmers why they didn't use the fertilizer. The farmers stated that they did, and emphasized how nice it was to be able to grow the same amount of crops (enough to sustain the local population comfortably) with only needing to do HALF the work. That is the difference here. It is not "human nature", it is the result of the ideology called Capitalism.
So 'super productive workers' doesn't mean they get to enjoy the results of their productivity for the rest of the day.
If you run the distance in half the time, they double the distance. AI is just another kind of fertilizer for people with capital.
The problem here is that a slave driver will produce more than a free individual working for another. So, if the balance is not equalized, the slave driver will create more, and thus they will take more from others. Tech can equalize this. A free person in a free society will have intellectuals not fit for rough labor use their minds to create less work for their company. This means that there is always a higher chance for the free society to create improvements. However, once you export these improvements, others will catch on and equalize again; but the slave driver will be ahead in production, as they can catch up for free and keep doing what they are doing.
The way out of this mess is to either create non-reproducible items or halt all exports to societies that do not toe the line of health and sanity.
So it's not that capitalism is bad; it is a symptom of how the universe itself is structured and how societies function at the moment. You need capitalism to stay alive. You can't live in a forest hugging trees while there is an enemy building up strength. You will go the way of the Indians in the Americas. Even they were competing against each other constantly, but they weren't good enough to compete with European superiority.
Hopefully there won't be a Pol Pot of tech.
Don't conflate capitalism with liberal consumerism. They aren't the same. The reason why productivity is so important isn't capitalism. It is due to the entire global monetary system being dependent on productivity moving faster than debt-fueled inflation. If consumption slows down, there will be too many unspent dollars in the system. This wasn't an issue before WWII, prior to the Bretton Woods system, and moving off the gold standard made it worse. Capitalism existed prior and didn't have this issue.
The story is nice, but after attending communist classes you forgot what capitalism really means: it's a free market where individuals freely exchange goods and services through transactions, driven by competition and profit. Capitalism is what literally made our century the most prosperous in human history.
@@LaFonteCheVi The entire global monetary system IS capitalism, friend. We're in the endgame.
The effects of the downturn are beginning to sink in. People are being impacted by the long-term decline in property prices and the housing market. I recently sold my house in the Sacramento area, and I want to invest my lump-sum profit in the stock market before prices start to rise again. Is now the right moment to buy, or not?
Knowledgeable Investors know where and how to put money in order to reduce risk and maximize returns. See a market strategist with experience if you are unable to manage market conditions.
I agree with you. I started out with investing on my own, but I lost a lot of money. I was able to pull out about $200k after the 2020 crash. I invested the money using an analyst, and in seven months, I raked in almost $673,000
Impressive gains! How can I reach your advisor, please, if you don't mind me asking? I could really use some help as of now.
Stacy Lynn Staples is a highly respected figure in her field. I suggest delving deeper into her credentials, as she possesses extensive experience and serves as a valuable resource for individuals seeking guidance in navigating the financial market.
I just looked her up on the web and I would say she really has an impressive background in investing. I will write her an email shortly.
120 kW for a rack is just insane. Power consumption for AI is becoming a huge problem.
And on the topic of power, you have forces trying to stop us from relying on good sources like nuclear.
Without an excess power supply, the cost of living increases, as everything derives from the energy needed to produce and operate.
Put an abundance of electricity into the grid, and over time business expenses and the cost of living will go down as production ramps up, especially if you keep energy pricing low.
Many forces want energy supply low and demand high, so that in the end the cost becomes unbearable and thus the authority of power comes down to those with access.
@@dra6o0n Tesla is already ramping Industrial Energy storage and bringing cost down exponentially on batteries. Oil demand keeps dropping.
@tabbott429 Except that that cost isn't passed down to consumers. Corporations lower their costs so they can increase their margins. It is governments, working with the power corporations, that must force them to reduce to a set rate when there is more supply.
Unfortunately, corporations also have ways to bribe politicians.
yes especially since the most powerful human brain runs nicely at about 12 watts.
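Putting the thread's two figures side by side, with a rough daily energy cost added; the electricity price is an assumed example rate, not anything from the keynote.

```python
# 120 kW rack vs. a ~12 W human brain, plus ballpark daily energy cost.
rack_kw = 120
brain_w = 12
price_per_kwh = 0.10   # assumed $/kWh for illustration

print(f"Rack vs brain: ~{rack_kw * 1000 / brain_w:,.0f}x the power draw")
print(f"Energy per day: {rack_kw * 24:,.0f} kWh "
      f"(~${rack_kw * 24 * price_per_kwh:,.0f}/day at ${price_per_kwh}/kWh)")
```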
And what it is doing is not intelligence; it is garbage in, garbage out. It is a glorified streaming processor; it is not even close to critical thinking.
It sounds like a guy selling AI is trying to hype up AI in order to sell AI.
You know, finding a problem for a solution.
Exactly.
For those who know something about physics, machine learning, and computer science, all this sounds like pure marketing. The majority of people don't understand the limitations that come from AI being a projection of common human intelligence. The limitation is principally within the method.
Like deodorant.
@@victorwaterfall1457 AI cannot exceed human intelligence but it can exceed other human limitations such as the need to sleep, emotions, limited processing capability etc. With enough data, AI can become an effective lawyer, doctor, manager, author etc.
AI will never beat the best, but it also won't need years of education to become "good enough." All this means is that huge corporations will be able to replace most of their lower-end employees with AI. Even if AI is 50% less intelligent, it would still be orders of magnitude faster and will continue to work as long as the power switch is on.
Depending on what industry you're in, if you're not part of the upper 10% in that category, you may be getting replaced in the next couple of decades by AI.
If you listened to the video, what they are selling is a faster way to sift through data to create an AI specific to a field. It means that the company you work for may invest into this and find that they can create an AI that does your job except they don't have to pay it and it never sleeps. You're not the target market my friend. Companies looking to cut their costs on labour is who they are marketing to.
@@victorwaterfall1457 yeah but you're talking about what AI is now, pure mimicry of humans, but we all now it will be a billion times better than us in a near future, thats what its about
@@shirowolff9147 and how exactly do you know that? From headlines? Lol
We're talking about projection, it's by design limited by human input. In other words, AI is a great tool, but there's a boundary which it wouldn't be able to bypass. It's really not some kind of technological magic :)
Jensen is the best AI marketing agent ever.
Until AI makes a better AI marketing agent.
Hahahahahahahhahaahhahahahah
@@robeaston4063 then it'll put out a statement saying Nvidia needs no CEOs
Can it run Crysis?
At least on medium for sure
Yes, in 1024x1024 at 20 fps, after training on 10 million hours of gameplay footage
Love this. Built my first PC to run Crysis. What a great year.
it would probably have a better chance at remaking the game from scratch
@@TickerSymbolYOU As much as I think the AI chips are cool, those are only for Microsoft, OpenAI, Google, Meta, Elon, etc. Did they say anything for the consumer side? Like the 50 series?
My grandkids will be baffled that the G in GPU stands for "Graphic"
it actually stands for generating processor unit grandpa
Chips made exclusively for AI, could be called IPUs.
@@sxyqt3.14 Nice try troll
@@sxyqt3.14 Graphics processing unit, kiddo
@@paulmichaelfreedman8334 TPUs, or tensor processing units, which are application-specific integrated circuits (ASICs), already exist. There isn't really an accepted term for generic "AI accelerator" ASICs, but it just so happens that CUDA and matrix-related math are good both for computing the color or shader of a given pixel in a 3D scene and for computing the output of a given neuron in a neural network.
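A small sketch of that point, illustrative only: blending light contributions into a pixel color and evaluating a layer of neurons both boil down to the same dot-product/matrix math, which is why the same hardware serves both workloads.

```python
# Pixel shading and neuron evaluation as the same linear-algebra primitive.
import numpy as np

rng = np.random.default_rng(0)

# "Graphics": blend 3 light contributions into an RGB pixel color.
light_colors = rng.random((3, 3))      # 3 lights x RGB
light_weights = rng.random(3)          # per-light intensity at this pixel
pixel_rgb = light_weights @ light_colors

# "AI": one layer of a neural network, the same operation at another scale.
inputs = rng.random(3)
weights = rng.random((3, 4))           # 3 inputs -> 4 neurons
neuron_outputs = np.tanh(inputs @ weights)

print("pixel RGB:      ", pixel_rgb)
print("neuron outputs: ", neuron_outputs)
```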
I'm a Data Engineer with some Machine Learning experience working with ML engineers every day, and I'm sick of (Product Owners, Heads of..., Tech Leads, clueless Directors, CEOs) trying to push AI everywhere these days. The first principle of Machine Learning is: don't use it if you don't need it. Deep Learning is still in its infancy, and selling it as the reinvention of software engineering is a terrible idea. There's so much work left on its efficiency, and there are so many useless people talking about or directing ML/DL projects but very few people with real knowledge working on them.
Haven’t we seen enough failed attempts[1] by companies thinking they were in the top of the chain and they could “kill Moore’s Law” or move away from general computing with their groundbreaking hardware, trying to make the market and the way how software engineers write their code adapt to them? Well… here we go again.
[1]: Transmeta (Crusoe/Efficeon), Intel Larrabee, Nokia/Symbian (Elopocalypse), IBM's math-focused Cell processor, SGI, DEC Alpha, the PS3… and the list goes on.
Wow any leftovers after that giant word salad?
Wow! Well considered and interesting information, thanks!
I'm more on the database/automation side of things, and I have to agree. I think the biggest thing AI has done so far is shed light onto how absolutely FUBARed most modern applications are in their design and how little businesses understand their own data.
I can't tell you how many times I have had someone come to me saying "we gotta use AI!" and I responded with "No, you need to get your shit together." Most of these places are so bogged down with technical debt that they can hardly keep the lights on and spending millions on implementations that should realistically cost a tenth of what they pay if they'd just fix their freaking code a bit.
AI is a heavy hitter in tech for sure, it's got a LOT of promise. But the majority of these companies are only going to find it an absolute misery to implement until they learn to take a more practical and organized approach to their data.
My absolute favorite one is when I hear "we want to use AI to do X because it's too difficult to build it otherwise." Nobody asks why the hell it's so difficult in the first place. They never think, "Gee, maybe if I untangled this nightmare of a system it wouldn't be so crazy to build on it."
It's crazy man. Absolutely insane.
@@FrotLopOfficial Cry more, where are your credentials? McDonald's employee 🤡
@@FrotLopOfficial Keep scrolling your brainless TikTok gamer girl uwu videos boy, this is not for you
So, his opening statement is incorrect, to be clear, and HE is saying this because he wants people to move to an Nvidia-based system that they will put out in the somewhat near future AND that will cost more than an X86-64 based system.
X86-64 offers everything from a system that's dirt cheap and will do office apps all day, browse the web and let you watch 4K HDR movies. Or, they can run any engineering problems that 10 years ago could take a week to solve and now takes a few hours.
With X86-64 the limiting factors are NOT the CPU and the general processing cores which are worlds better than even 10 years ago. The limiting factor is ALWAYS cost for the type of work you want to do, and that cost has come down DRAMATICALLY over the last 15 years and ASTOUNDINGLY over the last 20 years. You can now do renders on the best workstations, where 6 years ago it would have required a render server that would eat up a lot of power and take a few days. I'm talking studio production, not simple renders people do at home.
So, while Jensen has done well to grow Nvidia into a technology company while still being able to put out the best GPUs for a PC, the goal of what he's saying (which isn't factual) is based on his desire for people to leave a very mature platform, whether it's Apple, X86-64, or some new Microsoft ARM-based system which isn't mature, and move to an Nvidia-based system, making Nvidia the masters of the world and charging you an insane amount of money once you're trapped.
No thanks, my system does EVERYTHING I want to do as fast as I need it to.
You have the right to sit on the sideline but be forward companies that adopt this technology will outperform your company in the near future. It’s a better mousetrap!
@@judd_s5643 "...on the sideline[,] but be [the] forward companies that adopt..."
I don't think he said that. There won't be an Nvidia-based system for consumers. He is saying that machine learning COMBINED with computing power (on servers) will be the next big thing. Our laptops and PCs can remain on x86; they'll just consume the data from machine learning servers (online).
You are replying to a video where Jensen is describing a rack of GPUs linked up 8 or more in a row.
Your x86 is about as comparable to what Jensen is talking about as a Nintendo Switch is.
@@Lawliet734 I think he meant forewarned?
Programming will not end, but will evolve and continue. Programming is our ability to think ahead in a very controlled way. We will have higher level tools to do programming.
programming will emerge more as a conversation than the direct application of coding
@@YogonKalisto 100%, as prompts
Programming will become more efficient and effective with AI. As long as we have problems to solve and digital infrastructure and APIs to build, programming will not end; it will just become better, faster, and more efficient.
@@melhoresofertasapp Why is that?
I don't want higher-level tools; I want professionals who understand the machinery and how it operates at the lowest levels. I don't want JavaScript "engineers" who can't explain how a loop works in hardware terms and who don't understand the term "segmentation fault" when it appears in their console. If you have no knowledge you're worthless; if the AI does the job for you, I don't need you. Simple as.
Programmers are hired for their knowledge.
18:44 I'm just glad he got his handshake
Only question since NVIDIA is all about money: how much will it really cost?
@@MaPf818 I guess 2 kidneys might be enough 🙃
Easy, 100x the computing power, 100x the cost (and 100x the power draw). That's the Nvidia math right there.
Whatever people are willing to pay
I'm sure you, in contrast to Nvidia, go to work to serve your fellow humans, are not bothered about your salary, and never ever ask for a pay rise.
Considering the race for AGI and race to have mass produced intelligent humanoid robots that will take us to a post-capitalist society (from a society based on the value of physical human labour...), it will cost a LOT but the payoff will be out of this world.
Sounds like AI advertisement to me.
whenever I see a youtube title that says "shocking" or any variation of it, I automatically assume that it's just clickbait and not worth watching. Maybe these types of titles would've worked on me 10 years ago.
Yet u still not only clicked, but engaged, so he won regardless😂
@@crentepoliglota How are you gonna spread awareness then?
Since I read your comment here, it means that you still fell for it.
The GPU was invented by the Swedish Håkan Lans and stolen by Hitachi. Not by Nvidia.
Lans frame buffer patent from 1979? He should never have been granted this patent, since there was prior art in existence for years prior to that.
Richard Shoup demonstrated a working frame buffer based graphics system in 1973.
Nah, I'm pretty much sure Genghis Khan invented the GPU.
GPUs were invented in Soviet Union by Peter Semyonov back in 1956
@@GoldenEDM_2018 Genghis Khan was a Mongol warlord who slayed people for opposing the Mongol Empire. Back then they didn't even have ELECTRICITY to power their electric toothbrushes, and you think he invented the GPU?? He probably didn't even take showers, as the shower wasn't invented yet.
Do you think the GPU predates showers?? He STUNK
hitachi makes great wands as well.
one would think that with the massive amount of cuts in this video it'll become unwatchable, but nope. perfect timing.
Speaking from experience, he is selling an enormous amount of hot air when talking about programming 2.0 done by AI. In general the generated code is quite bad, not more innovative than the code seen during the AI learning phase (so nothing new). Sure Nvidia is riding an enormous wave thanks to the current AI hype, but limits of LLMs are quickly being reached. Of course GPUs remain valuable accelerators of specific tasks.
You seem like someone who also thinks that IT networking is useless. So... pretty old, I would assume, around 40 or 45? At least you talk like someone that age.
limits of LLMs are quickly being reached? elaborate
@@tiberia0001 Where do I say that IT networking is useless? Without networking there is no IT. You sound pretty young and totally inexperienced, say between 14 and 18.
In my understanding, he is not talking about AI WRITING CODE, but about making models instead of programs in the traditional sense. For instance, instead of a physics engine, you create an AI model that emulates physics, or, as he called it, an approximate program that does that. Of course this model wasn't coded line by line by a human; it was arrived at through learning.
The limits of LLMs are not being reached; they're expanding.
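To make the "model instead of program" idea in the reply above concrete, here is a minimal, purely illustrative Python sketch (not anything from the talk or Nvidia's stack): sample input/output pairs from a routine, fit a model to them, then call the fitted model as if it were the program. A Chebyshev polynomial fit stands in for the neural network you would use in practice.

```python
import numpy as np
from numpy.polynomial import Chebyshev

# A toy "physics" routine we pretend has no hand-written implementation:
# a damped oscillation y(t) = exp(-0.3 t) * cos(2 t).
def simulator(t):
    return np.exp(-0.3 * t) * np.cos(2.0 * t)

# 1. Gather training data (in real surrogate modelling this would come
#    from an expensive simulator or from measurements).
t_train = np.linspace(0.0, 10.0, 400)
y_train = simulator(t_train)

# 2. "Learn" an approximate program by fitting a model to the samples.
learned_program = Chebyshev.fit(t_train, y_train, deg=20)

# 3. Use the learned approximation in place of the original routine.
t_test = np.linspace(0.0, 10.0, 57)
max_err = np.max(np.abs(learned_program(t_test) - simulator(t_test)))
print(f"max approximation error on test points: {max_err:.2e}")
```

The learned version is only as good as the data it saw, which is exactly the trade-off several commenters in this thread are pointing at.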
what was that other dude doing there? other than looking like hes part of the screens used for the presentation
Security
@@paulmichaelfreedman8334 To get rid of the guy that never stops talking?! 😅
@@SF-vt3zr psychiatric doctor 😂😂
he plays Blackwell
I guess he was the host for the AI Summit India.
Seeing this after watching a video about autonomous killing drones used in Ukraine, and also remembering an article a few days ago about a couple from the US moving to Jamaica to live a longer and happier life in a very real way, embracing such extremes as doing laundry by hand, yet living in a community, singing, dancing, eating healthy food. What a disconnect! Just think for a second...
@@romanzz only comment for a human. Rest are all for machines
Nobody is forcing any type of lifestyle on you.
I'll search for that video, but I can bet money that it's not autonomous AI.
They should include a Dyson sphere with each purchase.
I have had the privilege of working closely with Michael Hugh Terpin on several occasions, and each time, I have been immensely impressed by his deep understanding of the crypto market dynamics. His ability to analyze trends, assess risks, and make informed decisions has consistently yielded exceptional results.
This experience has shed light on why experienced traders are able to generate substantial returns even in lesser-known markets. It is safe to say that this bold decision has been one of the most impactful choices.
He is always active on Telegram
*@michaeltpintrades*
I recently started trading in February, invested 70k in the market, and my portfolio is currently worth slightly over 400k. That's a lot more than I make in a year from my job
I started working with Michael Hugh Terpin back in March, and my financial goals have never been clearer. It's like having a strategic partner for my money with a solid track record.
The entire State Farm Insurance company will fit in a computer on someone's desk.
I would say it's possible now.
They need a room sized computer to do all the transactions from every office they have in "real time".
Don't care how great Blackwell is; they are beholden to TSMC and lined up like everyone else with allotted slots, so they can't fill their many Blackwell orders even if they want to.
Yeah, and TSMC has to make the chips, so it's not like they would just stop, because then they would not make money.
They can always increase the price to infinity though.
Seems like companies need to work on expanding chip building dramatically and away from unstable countries.
It's crazy, I always thought Nvidia builds the GPUs, but they only design them. Companies like TSMC and other foundries actually produce them; there are also IDMs (integrated device manufacturers) that both design AND manufacture ICs and devices.
@@Keltzzzz They are building new fabs as we speak, and potentially investing $20b in a new one
hes got a point. I love chicken biryani too. genius
Nvidia is going to change their name to Cyberdyne Systems
When will this fit in a box on my desk to run locally? I have no interest in paying a monthly subscription to run my games, etc. over the internet. This feels like the 60s/70's style mainframe computers that now fit in your pocket.
never, everything is gonna be more and more cloud and always come with a subscription
@@MegaSuperpelo not under this administration
@@Astrojamus sure
You don't need that much power on your desktop, and if you do, you don't need a desktop. I understand wanting to run locally, but he was talking about 72 GPUs... that's a lot of power; comparing that to your phone makes no sense.
I will never use Cloud gaming aka Not owning my Games
Great Scott! 120 kilowatts per rack!
One remark, not undervolted 😀.
Yes you'll need your own power station soon
That heat should be used for residential heating, not just dumped into the environment
So much for climate change hysteria...
I am speechless: as speechless as the guy with the orange tie.
Should individual NVDA stock be held in a Roth IRA? Or is it worth holding QQM?
I was able to buy an RTX 4050 after 4 years. This Nvidia CEO is not one to bet against. Insane what they have achieved in this short GPU life span
I'm confused, the RTX 4060 is a bad GPU, that's the main reason I'mma get an RX 7900 XTX
@@thisguysgaming7246 It's not a bad GPU at all; it's just overpriced, with not enough VRAM.
@@retrosimon9843 You just explained it yourself, it's an overpriced piece of junk lol.
@@thisguysgaming7246 I bought myself an RX 6800 xtx and it's supposed to be just as good as a 4090 but at a better price. But one thing I know for sure I'm set for a few years before needing a new GPU
@@Vectures Bro you're so badass for having an RX 7900 XTX. Is there anything I should know about that GPU before I get it? Cause it's between the 4080 and the RX 7900 XTX for me. I honestly need to upgrade my RTX 3080 PC.
Mega tech companies that invested in the first generation (Blackwell is the third gen in 2 years) love it when their billions, diligence, and race to order first and most to corner the market bring them warm 3-day-old salad. Something is going on in this supercomputer hardware market when the acceleration is literally 6x Moore's Law (Jensen said 4x per year). It's redefining who benefits and who pays for this wheel to turn. (And are the old titans of tech turning reptilian to make us pay?)
Bro wth are you yapping about
10:47 For those who don't get why he didn't give a number for 10 years of "4x double a year"... it's because it's over a trillion x, while Moore's Law is only about 100x after 10 years.
"incredible scaling" my ass more like focken insanity
trillion, or million?
@@kurtpedersen1703 1.05 million.
Also raw calculation power is less efficient since managing the parallel tasks becomes cumbersome. Still impressive though.
@@kurtpedersen1703 If it's 4 doublings in a year it's about 1 trillion, and if it's 4x a year then it's about a million x in 10 years. Maybe I misunderstood his quote, but either way it's a lot lol
@brainstormingchannel7490 i think he said 4x/year, not 4 doubles. But I'm not gonna watch it again 😆
Ya, it's a trillion
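For anyone wanting to check the numbers argued about in this thread, here is a small back-of-the-envelope Python calculation. It simply compounds the growth rates people are quoting; which rate Jensen actually claimed is a separate question.

```python
years = 10

growth = {
    "4x per year":                  4 ** years,              # 4^10 = 2^20
    "4 doublings per year (16x)":   16 ** years,             # 16^10 = 2^40
    "Moore's Law (2x / 2 years)":   2 ** (years / 2),        # 2^5
    "Moore's Law (2x / 18 months)": 2 ** (years * 12 / 18),  # ~2^6.7
}

for label, factor in growth.items():
    print(f"{label:30s} -> {factor:,.0f}x over {years} years")
```

So "4x per year" works out to roughly a million-fold in a decade, while "about a trillion" corresponds to four doublings per year; either way it dwarfs the roughly 30-100x you get from classic Moore's Law over the same span.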
3:55 Respect to this guy for mentioning Pascal. I learned it way back in 1992, when the internet barely existed. I was very impressed by Pascal. Computers have come a long way.
Use your AI to make your GPUs power efficient first.
Just add a second 12VHPWR connector. Problem solved. 🤡
That's not an option. Like making software ops with it; it can't think as generally as this.
Tesla (TSLA) shares surged as CEO Elon Musk's involvement in the US election seemingly paid off after President-elect Donald Trump's win. Which stocks could potentially become the next in terms of growth over the next few months? I've allocated $350k for investment, looking for companies to add to boost performance.
I think the safest strategy is to diversify investments. Spreading investments across different asset classes, like bonds, real estate, and international stocks, can reduce the impact of a market meltdown.
That's the reason I decided to work closely with a brokerage adviser ever since the market got really tense and the pressure became too much (I should be retiring in 17 months). I've had a brokerage adviser guide me through the chaos; it's been 9 months and counting and I've made approx. 650K net from all of my holdings.
How can I participate in this? I sincerely aspire to establish a secure financial future and am eager to participate. Who is the driving force behind your success?
Elisse Laparche Ewing is her name. She is regarded as a genius in her area and works for Empower Financial Services. By looking her up online, you can quickly verify her level of experience. She is very knowledgeable about financial markets.
Thank you for the lead. I searched her up, and I have sent her a message. I hope she gets back to me soon.
We will go from hand-written specialized functions to learned functions for some things for which we don't already have a direct function to compute the output (no need to machine-learn the function for projectile motion; Newton effectively had it covered). The idea that software is somehow going to vaporize is goofy though; it's more a transition to more dataset curation and creation for training when functions are unknown.
I doubt that, if we look at how much energy that needs and how useless it will be at doing random things...
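To illustrate the projectile-motion point above: when a closed-form solution is known, a few lines of ordinary code cover it exactly, with no training data involved. A minimal sketch (flat ground, no air resistance):

```python
import math

G = 9.81  # gravitational acceleration in m/s^2

def projectile_range(speed_m_s: float, angle_deg: float) -> float:
    """Horizontal range on flat ground, ignoring drag: R = v^2 * sin(2*theta) / g."""
    theta = math.radians(angle_deg)
    return speed_m_s ** 2 * math.sin(2.0 * theta) / G

print(f"{projectile_range(30.0, 45.0):.1f} m")  # ~91.7 m, no dataset or GPU required
```

Learned functions earn their keep only in the cases the comment describes: where no such formula is known, or where evaluating the exact one is too expensive.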
It is very nice to see the CEO and founder of a company talking.
Can we please relabel AI as Accelerated Inspection? Everything to date is just a better search algorithm.
Automated Inference?
Intelligence is just that: searching for and finding patterns. The smarter an entity is, the better it is at recognizing patterns and reacting to those patterns.
@@asandax6 It's absolutely not only that; you need a way to really link stuff and ideas and keep a memory of them. Also, humans get things wrong many, many times, but a program that fails many times is worse than useless, and a program that does stuff just because you make it do it is waaaaay far from being AWARE
@SoulyG What qualities qualify an entity as AWARE? Does doing stuff because you are told to do it make you a robot? A program that goes through trial and error is not worse, depending on what its goal is.
@@asandax6 Human brains have way more connections between each node (cell) than any computer can imagine. Using just about 20 W, btw.
Thank you. Very informative.
Glad it was helpful!
Jensen is a genius. NVDA CC next week will be a monumental event. Massive upside with major BUY recommendations and higher price targets
What is CC?
What's cc?
He probably means the earnings call. Just another UA-cam person with a crystal ball…
CC is conference call.
I like how he brought out this orange tie guy just to say "right" and "thank you"
There were cuts so maybe his part was cut out.
@@Cedar77 I know, it just looks funny in this edit. Dude just stands there like a prop
@@procrasti86 True that ^^
this feels like old news. because it is. gg thanks for the supercut
The original came out October 31… so it’s less than 2 weeks “old”
Wish somebody would talk about the power demand and heat production this is going to cause.
This all just sounds like a really great way to make unstable software. Why spend time refining an algorithm when we can let a computer half ass it and have no way to correct it.
It just needs trillions of tries if you ask for anything "advanced", not even talking about writing drivers or, say, shaders.
It is a trap to replace engineers going forward with AI inbreeding. No one will go to college for STEM. If anything happens to the AI that takes over, it will be an extinction-level event (ELE) for humanity. One CME from the sun and it will be game over.
Moore's Law isn't dead?
Also see our new processor. It's 100x faster and 500x bulkier🎉
and needs 200x the power
And it’s not faster, the software is
🤣🤣🤣
Nvidia: "Moore's Law is dead"
Also Nvidia: "Look at the new Nvidia GPU with 500x the power, the size of a truck, and the power consumption of an entire nuclear power plant"
@@majalapatannn Exactly 😂😂 Am I supposed to clap?? 🤣
@@quebuena111 But is it really? It's maybe like comparing a fast C program to one written in Python; just feeding it this is currently ultra bad.
What brand of shoes are those?
@@TheMurrblake crocs
@@TheMurrblake ali baba
Temu Airs
Dollarstore crocs
Nvidia shoes
The fancy Blackwell means an individual GPU can't get faster; Moore's Law is dead, but in the other direction. Nvidia can't make the chip 2x faster, so they have to combine many GPUs into one. That's not Moore's Law, that's like clustering, which was done for CPUs decades ago!
The overall narrative is the growing utility of AI. AI has extended my capabilities in ways which save me 3-4 hours a week and enable extensions to my work that were not possible before. That kind of productivity will unlock new kinds of products, services, systems, platforms, and businesses. And that's before we even talk about physical robots.
Since their inception, Nvidia GPUs have always been 100-1000x faster than CPUs.
...well, at least in theory. In the 2000s they ran at 1% utilization; that is, unless you coded things just right, chances are you only used 1% of their theoretical throughput capacity.
I'd like to see how he does any physics that is required to run serially.
Just a friendly reminder. This AI bubble will pop, just like the tech bubble in 2000.
AI is fake ass shit that's made everything lazy and worse.
No, it's got a long way to go still. In a few years I see simple robots, such as an iPad on wheels with an arm, rolling around controlled by an online AI.
Just like the whole computer, internet, and mobile device fads. You never hear about any of those anymore...
It's not true. Next cycle we'll have more powerful AI robots and a lot more technology for the 20 or 50 years ahead.
@@edoardododoguzzi I agree that the technology will continue to develop but the stock market AI bubble will pop and billions will be lost.
AMAZING! thank you for this! It's mind blowing typical of Jensen Huang.
Not sure how much people will appreciate Jensen's approach of turning computers from deterministic machines into probabilistic guessing machines. But I could be wrong, and he's just stocking up his incentives before the world moves on and leaves graphics cards behind, as is already happening with cryptocurrency.
2:02 rap god
@@puppergump4117 lmao
This is just the beginning. NVIDIA will change the world we know. It's bigger than anyone can imagine now. Humanity will skyrocket in the next 5 years. We'll solve problems you wouldn't imagine now: eternal life, warp travel, time travel, black hole creation, terraforming, unlimited energy, dark matter, etc.... this is beyond reality.
You're absolutely correct.
None of that will happen in the next 5 years or even 10, They still haven't given us our flying cars.
Late 1980's they said by 2000 we would have flying cars, 2000 was nearly 25 years ago!
Black holes have been solved a long time ago. Search P Diddy freak offs.
😂 keep dreaming. Think about real problems, not just theories that are more fiction than real science
@@ShaneMcGrath. We have them, but no one wants them or uses them.
Can it stop Netflix from buffering and crashing during a live event?
😭
Nope that would mean it could actually accelerate software but it can’t
Honestly this is scary future that we're trying to speedrun
Using so much energy to power a computer, is it necessary? What's it going to be used for, mining Bitcoin?
It would be useless, because Bitcoin has auto-regulation: they add another zero to the hash target if you use more performant HW 🙂.
bitcoin mining is rigid, asic tech suits it better than anything else.
Some day: "My great grandfather used to WRITE PROGRAMS. Imagine that!"
First let's make sure they know how many fingers there are in a hand.
My great great grandmother used to be a computer. Imagine that!
@@KilgoreTroutAsf That's pretty much been solved. Not worth worrying over things that progress either has or soon enough will resolve.
“My FATHER” used to write code…”
@@brianmi40 Soon people will believe marketing, no questions asked.
Someone who works with CUDA - is there an opportunity to drill down below that level to take particular advantage of features of the chips, akin to what you would traditionally do by writing assembler for some particularly important part where you need every tweak you can get? Likewise on the LM level, how do you narrow the input data to get tighter datasets which exclude duplications etc. when training?
Short answer to the first part is no. As nVidia uses CUDA as a vendor lock-in mechanism, they have no incentive to allow you to tinker outside of it, as that detracts from the potential library of CUDA services that isn't locked to a specific chipset they produce (which could otherwise act as inertia against buying future upgrades, as well as create problems with quiet model revisions). Regarding the second part, I use clever scripting to clean up my datasets for training, the solution being highly dependent on the goal of the model.
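The "clever scripting" in the reply above isn't spelled out, so here is one common minimal pattern (purely an illustration, not the commenter's actual pipeline): normalize each record and drop exact duplicates by hash before training.

```python
import hashlib

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace so trivial variants hash identically.
    return " ".join(text.lower().split())

def dedupe(records):
    """Yield each record whose normalized form has not been seen before."""
    seen = set()
    for rec in records:
        digest = hashlib.sha256(normalize(rec).encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            yield rec

samples = ["Hello  World", "hello world", "something else"]
print(list(dedupe(samples)))  # -> ['Hello  World', 'something else']
```

Exact hashing only removes literal repeats; in practice near-duplicate detection (MinHash, embedding similarity, and the like) is the usual next step.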
dead internet theory is evident from the comment section..
hardly a human in sight
And really
Hey I’m here 😊
You're a bot
bleep blop boop. I will compute for you.
beepboop
Thanks!
Thank YOU! 🙏
15:40 these super-agents are intended to replace those workers ngl 😂
"supercharge" God I hate that word. As if we need to give more cocaine to capitalism.
Thank you very much Michelle
This is almost a propaganda piece for nVidia. Most software does not need a GPU, and in many cases parallel computing on a GPU is the wrong way to build regular software. I know he's tooting his company's horn, but this is a bit of a joke.
Moore's Law states that the number of transistors on a microchip doubles about every two years with a minimal cost increase. In 1965, Gordon E. Moore, the co-founder of Intel, made an observation that eventually became known as Moore's Law.
Can you play Doom on Blackwell ?
in 5D
I really hope Nvidia doesn't become complacent and stagnant much like Apple, Google, or Microsoft have once they reach the Monopoly scale of business.
When you become too big to fail, innovation takes a backseat to profits.
And this my friends is the future!
Just one question: When tf are you going to make CUDA installable?
NVDA... Nvidia pushing green today? VHAI: 30% rise this week. Vocodia: conversational AI tech with new revenue streams. Palantir up 5% today. SYM... Symbotic green today. Thumbs up the video/comments as the AI evolution begins globally. Thanks
Interesting. I previously stacked 36 NVIDIA A100 GPUs for our company's virtual environment; what I get from this is that I can stack more GPUs more easily for more money. What would impress me is if Nvidia could make a GPU for typical home users that is just plug and play with no fuss. Right now if you get a GPU you gotta be financially ready, PSU ready, look into undervolting because the GPU crashes all the time (and you gotta do this for every game you play), and prepare for cooling solutions too (requires high-end cooling $$).
Computers have been around only 60 years?
Dang, not even a century and already they're advancing pretty fast.
@@CrowleyBlack2 I’ve been working with computers for 60 years (1964) 😀 so at least 70 and 80 if you include the ENIAC from WWII.
The inevitability of where we are today was predicted very early on in the life of computers. Where we go tomorrow with computing is impossible to comprehend, as compute ability is advancing at exponential speed. The next 10 years will see colossal change; I'm not sure that most people will enjoy it, it may be painful.
Yeah, actually computers go back to before WWII, but they weren't digital then; they were analog computers. Once the transistor became small enough to build a computer on transistor technology, the push to digital computers started. It was a digital computer that flew to the Moon in 1969.
@@mikewa2 The magic is that for personal computing, 99% of the population can buy a PC for under $1000 and it will do what they want for the next 15-20 years as long as it's still supported, and that tends to be what fails first: the hardware stops being supported. The only REAL issue right now is how much AI compute power people want.
But this is why I didn't listen to this video: I already heard a brief snippet of what Jensen said, and his goal is to convince you that you NEED an Nvidia-based system they're putting out in the near future. With AI being the new thing, you could be convinced that AI accelerator cores HAVE to be included in the processor. No, no they don't. With faster interfaces, anything connected to the CPU on the PCIe bus can process AI. We are now at PCIe gen5, which is fast enough for any home use case. Neural Processing Units (NPUs) are going to be released by different companies on NVMe devices, and in fact they already are, but more powerful ones will come out on NVMe and won't cost much, and that means I could run AI workloads off an NVMe device connected to a Zen 3 based CPU on a motherboard that has an open NVMe port. Gee, glad I bought a motherboard that has at least 2 NVMe ports. I will admit that the PCIe interface on that motherboard is gen4, but it will still be fast enough for the typical light AI loads that run on a home computer. And this means hardware that's already 4 years old can be upgraded to run Microsoft Windows 12, and you can be fat, dumb, and happy watching YT videos on your 10-year-old system that runs great, because you don't need the kind of power being talked about, as 99% of the population doesn't.
But new CPUs coming out on X86-64 are going to have a mix of cores and will certainly have NPUs built into the CPU. Graphics cores might not be included because once again PCIe gen5 is fast enough to push that work to the GPU anyway. OMG it takes TWO MORE SECONDS!!!!!!
@@johndoh5182 The Apollo Guidance Computer was indeed the first modular digital computer, and paved the way for the computers we know today.
Moore's Law is nowhere near ending. Software and architecture growth is now visible to the public eye. All hail Jensen's law.
Thank you for your video. How can I start trading crypto as a complete beginner? I'm new to cryptocurrency and don't understand how it really works. How can someone find the right approach to invest and make a good profit from cryptocurrency investments?
Coach Lewis Hayes. He really is the best; people talk about him here.
I've read so many people here on UA-cam sharing his extraordinary work, and I think he's really good and interesting to work with...
Yes, I just received my second payment from him yesterday and I'm also looking forward to my third payout...
Please help, I'm new here; how can I contact him?
Via Tele Grammy
Thermodynamic computing and other forms of analog computing offer 100,000,000 times the performance of Nvidia GPU computing at a fraction of the cost. Quantum qbits and analog pbits will literally crush and obsolete digital GPU computing in a very short timeframe. Nvidia's time is almost over, hence the superclustering of silicon-based GPUs that are already theoretically and practically maxed out. This is definitely a good move to extract as much as possible from a soon-to-be-obsolete technology that is not "intelligence" but highly optimized storage, search, and inference at an unsustainable premium price point; it will be like using a Zilog Z80 for a current LLM once hybrid coordinated qbit/pbit computing matures in the very near future.
Very possible, but still a long way off. Also, the hardware for quantum compute is a real challenge. As in a "going to space" kind of challenge.
ha, hearing Jensen say "is impossible" is funny- if he wanted to, he could probably do more impossible things! Go NVDA!
Donate for a Dream
Where can I order one?
Why did he constantly make India references 😅 Hindi, chicken 🍗 biryani, Mumbai
Because this event was in India.
Because that is what does the classifications for their great AI
I watched the full video and think I understand what he's talking about, but I'm not that tech savvy. Can someone help me by summarizing what he is saying regarding how amazing AI is in the near and far future? I guess current tech has been relying on Moore's Law, but now they are running out of room and had to switch to AI to start changing the software side of things? I'm not sure...? Please help >.
AI is the future and it's gonna cost you a fortune
He powers the world of AI so he is hyping the world of AI to increase his sales.
Now the worst part is that he might be right.
LLM can do that for you.
Yes, but how well does it mine bitcoins?
forget bitcoins, they manipulate them to steal from you, invest elsewhere.
Cooling seems to be well anticipated and managed.
Does it have enough computing power to run Wolfenstein 3D smoothly ? 🤣😛
In September 2022, Nvidia CEO Jensen Huang considered Moore's law dead, while Intel CEO Pat Gelsinger was of the opposite view.
1:19 What a liar 🤣 It's thanks to Matrox and 3dfx that we have real-time 3D graphics. Nvidia was the last to release a viable product.
Yes. SLI voodoo cards...
I've heard nvidia bought 3dfx so he's not lying technically
3DFX is Nvidia bud, they went bankrupt because they sucked and Nvidia bought them out
@@DarkAttack14 3dfx released their accelerator before NV snatched them and made Riva
@SickPrid3 Yeah, their accelerator... which wasn't a GPU IMO, pretty similar though. There is a notable difference: 3dfx accelerators would only accelerate specific parts of the graphics pipeline, but GPUs are programmable general compute units. One major difference is that AI neural nets can run on a GPU but can't run on accelerator cards *even if you made one that would attempt to run on an accelerator*
THX 4 the supercut dude :D
Nvidia has been making their customers really mad for years, I am fairly disgusted with their behavior, their products, but mostly with their garbageware that they force onto our computers.
The Blackwell architecture features a more advanced memory subsystem with higher bandwidth and lower latency, enabling faster data transfer and processing.
Naa, as a developer I can tell you that AI is not able to understand anything, which inevitably leads to incorrect answers and code. I don't think it will ever be able to truly understand. It is a pretty good search engine though, provided you always ask it for sources so you can make sure it's right before you use anything it provides.
Eh. It will get there, just like text-to-image stuff 2 years ago was pretty basic and now you can create imagery and video virtually indistinguishable from reality. 4o seems to do a pretty good job of basic code... even for hobbyist-type users.
@@3d-illusions That's how I feel as well at the moment (though I'm not a software developer). At its core, AI is still a pattern-recognition and prediction machine; it's simply not complex enough to be able to understand stuff. That's why it's still hallucinating and lying: it's still missing a lot of wiring on the one hand and proper training on the other.
My main worry is that, with the development shown in this video, AI will start producing "input" that we humans can no longer verify to make sense, or can only verify with a lot of effort, and that under time/money pressure we will assume is correct. This may lead to disastrous results, as we will no longer be in control when we still need to be.
can't wait for such chips to be available in the resale market, somewhere in the not too distant future, for retail use (for productivity and gaming). some modular PC (add a GPU or CPU) would be such a step ahead.
As a software developer of 25 years, I can say his presentation has lots of intentional inaccuracies, the disappointing part being the word "intentional".
The move from coding to machine learning (machine learning itself involves a lot more coding), to intentionally leaving out the designer, the programmer, at all stages: it's all mumbo jumbo. Machine learning has its place as just another piece in the pipeline, but not more. And it has been shown, to a certain extent true also for humans, that with such learning, the more it learns, the less accurate it becomes, to the point of uselessness. Look at all the universities pushing all the wrong ideas. There is a saying in my culture (Romanian) that with lots of knowledge also comes lots of stupidity. Knowledge is neither intelligence nor wisdom.
Anyway, coming back to the subject at hand: as AI (machine learning) will never achieve consciousness, it will follow the same patterns as humans, just at different scales, and scale by itself doesn't add anything meaningful. We as humans are very capable machines. We absorb and process information at scales not yet reached by the most powerful supercomputers. The eye's resolution is estimated at around 576 megapixels, at as high as 250+ fps, and that's only the vision part; perception can work at much higher rates.
Sure it will reach consciousness, if it hasn't already. What is so magic about consciousness? A neural net is a neural net, whether it's biological or silicon.
@@Josh442 A car is a car, whether it's a plastic toy or real vehicle.
@@kirinkirin2582 A plastic toy does not take us places. An ANN does think.
@@Josh442 It does take you places if you prompt it with enough force 😆. But the point is not usefulness.
Artificial neurons are inspired by biological neurons in the same way that the plastic toy is inspired by the real car. The similarities stop at "box with 4 wheels".
When the neural nets we are comparing are made of such radically different building blocks, you cannot equate a biological neural net with an artificial neural net.
@@kirinkirin2582 Do you think the difference is fundamental, though? I don't. Hard to get a grip on it given that there's so much we still don't know about the biological neural net, but ultimately, they do the same thing -- extrapolate statistical patterns from highly entropic inputs, and make predictions on that basis.
NVIDIA is a company that pushed technology one step forward.
Every one of these AI models is replacing real people with real skills, in art, music, programming, and so on.
That's what the marketing says. Not really happening though if you look at hard statistics. The only place we've seen any replacement is support, but that isn't because AI could automate support; it's that AI is good enough to be a reasonable excuse for taking away real human support, an expensive part of the budget for many companies, and that's truly at the expense of the customer, who now gets no real support.
To a certain degree, but who will be the best people to leverage AI in those areas? Those same people.
One of the most significant benefits of quantum blockchain technology is its potential to enhance security. Classical blockchains rely on cryptographic algorithms to secure transactions and data.
Jensen has totally mastered public speaking; this guy presents so well. Musk could take some lessons from him.
So then my question is: if I input arbitrarily large data, will I get back a function that basically compresses it beyond the entropy limits?
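On the entropy limit the question invokes: lossless representation can't beat the Shannon entropy of the data, whether the "compressor" is zlib or a learned function whose parameters have to carry the information. A quick illustration with Python's standard library:

```python
import os
import zlib

random_data = os.urandom(100_000)   # high-entropy input: essentially incompressible
repetitive_data = b"abc" * 33_333   # low-entropy input of similar size

for name, data in [("random", random_data), ("repetitive", repetitive_data)]:
    compressed = zlib.compress(data, 9)
    print(f"{name:10s}: {len(data):6d} -> {len(compressed):6d} bytes")
```

A model that reproduces arbitrary data exactly is just another encoding of it, so its parameters plus the decoding procedure can't be smaller than the data's entropy.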
End of the human era. The age of robots is upon us. What's going to happen to all the manual jobs?
You will still need manual labor … it won’t replace everything
@@alanalda9686 and just remember, we aided it all, every step of the way.
Jobs for dummies won’t exist anymore.
@kilodave77 Pretty much, and it'll be the dummies that gave them away. 😂
@ I climb trees for a living. I’m not worried. No computer or robot can do what I do.