Nvidia is indeed way ahead, but are they too far out over their AI skis? Though none quite this massive or fast, I've been through countless explosive industry growth spurts that left market leaders scrambling when the hype train suddenly switched tracks. Don't get me wrong, Intel has long needed to be shaken up more than AMD's been able to manage by themselves. It's just that so much money and research being poured into AI core architecture gives me an admittedly nostalgic mental pause/hesitancy. Then again, I may just be getting too old for this. 😁
These are fantastic picks. I was really hopeful about my investments this year, but I followed some stock suggestions that didn't go so well. I've been studying the stock market, and I realized some investors made millions from the recent recession. I was wondering if that kind of success rate could be achieved in the present market, with the Federal Reserve taking a more hawkish approach to interest rates.
Throw it into AI stocks / hold some in gold. I built a seven-figure, well-diversified portfolio just by following Trisha Jean Webb's recommendations. I buy quality firms, plan to hold them regardless of what happens, pay up but not too much, keep track, sell only when necessary, and stay ready to course-correct. Also, ignore the forecasts and market views, which are at best entertaining but completely useless.
Expect Nvidia to fall by about 90% in the coming recession. Not because it's a bad company, but because that's how much the stock is overvalued compared to its profit and revenue figures.
Not so fast, buddy. AMD is still very much alive. It's just that Lisa didn't sell her chips the way Nvidia did. That's a blind spot... Lisa messed up big time, but the tech is definitely in line. She just doesn't know how to spin tech!!!
I put money aside to invest, and NVDA could be one of my picks, but no way at this point. Because if I buy into a company at a trillion-dollar valuation and sell at a 25% profit, how much profit is really left to make?
The All-In podcast... Grace Hopper is a separate ARM processor with separate memory... MI300 is a single-package, unified-HBM-memory CPU-and-GPU MCM module, which is heterogeneous. How exactly is Grace Hopper, being monolithic, even a remote threat?
150x the work compared to what, exactly? No details provided, and nobody is asking. It's interesting how differently this is perceived by those in the investing space vs. the engineers and developers who actually implement and use these systems.
Excellent Video. It has become clear to me over the past year - Nvidia has addressed the silicon need right along with an amazing ecosystem AND community. I’m convinced their market dominance comes from the fact they have succeeded simultaneously on both fronts. Competing companies talk about AI. Nvidia IS AI. Having said all that, obviously none of this is a done deal. Environments evolve. At this point, NVIDIA has a big lead
If China were to invade Taiwan, what happens to NVDA?
I think it wouldn't be just Nvidia who gets screwed if that happens. Basically every new high-end chip is produced there, whether it's a CPU or a GPU. Imo we'd basically get chip shortage vol. 2, but much worse. And analysts are leaning toward the conclusion that it's not a question of whether it will happen but when.
The fab in AZ doesn't get the most advanced nodes. The most advanced nodes always stay in Taiwan. The Taiwanese are not stupid! However, I am from Asia (HK) and I am 99% sure China will not invade Taiwan. China won't have the military power to invade for many years.
@@ec188 China will not invade Taiwan, not because of military power, but because keeping the tension tight keeps the USA distracted from its core focus.
I want to see Intel succeed because depending on Taiwan for 90% of advanced chips is a problem. People can talk smack about the company, but if anything happens to Taiwan, all the tech companies you mentioned will be lining up to use Intel's foundry service. Jensen likes their new chips and says he's open to working with them BTW. The DGX H100 uses Xeons, too.
Hi Alex, love your video editing! and I want to start my youtube channel - what software do you use to edit? where do you get your music and images? lastly, how do you create your thumbnails? your advice would be greatly appreciated.
Hey there. Thanks for the kind words. I use Adobe Premiere Pro to edit, get my music and images from Storyblocks, and create my thumbnails in Adobe Photoshop. Cheers.
If I were in the Nvidia board room, I would be pushing for a long-term deal with Intel to lock down the foundries. In one handshake they would put the entire industry in a chokehold for the next decade at the very least. With Intel's foundries coming online over the next few years, it would be a one-stop shop for the future of the tech world and would redefine economics as the world understands it today. And we'd all get filthy rich as it happened.
Nvda is a gorilla company. They have the dominant open-proprietary architecture that includes hardware & software, and a value-added chain that includes software applications and tools. High switching costs. Msft, Google, Meta, Amazon, etc. cannot afford to wait for AMD to catch up… or they lose in the marketplace. They have to pay Nvda to play in AI, smart cloud, etc. This scenario is similar to Wintel during their exponential growth; eventually competitors came in, but it took years. Arm Holdings could be a good investment after its IPO, as their cores are used by Nvda and AMD, along with the rest of the world including Europe.
Intel has neuromorphic chips that process data closer to how a living creature would, while Nvidia is just brute-forcing things with GPU shaders and tensor cores. So it's kind of obvious that Intel has the real lead in the long run. Plus, they are in talks to produce next-gen Nvidia parts.
Don't forget though… Jensen and the AMD CEO are cousins! Hmmmm… I don't own AMD, but I don't think AMD is going away. My Uncle Jensen always delivers🎉
I don't think it's so bad for AMD. The datacenter AI boom is mostly for training; for inferencing, Nvidia will not have a monopoly at these prices. Right now we see inferencing on these beasts, but LLMs will become smaller, because people don't like how it is now and because the massive computing power needed for inference isn't worth it either. So for a datacenter, yes, AMD may be less of a thing for now, but if they have full support for Torch, it is usually feasible to swap the vendor. And I know it's far from simple, but if it's worth switching, "small" players like Google will be able to do it. Moreover, I (and many others) feel that AI is just hype. I mean, it's inevitable that AI is useful, but it is not AS useful as we believe. Not yet. It will be, but not now. So the hype will drop in a few years.
The Nvidia H100 has a transformer engine (for the LLM architecture) that they claim is 3x faster. AMD doesn't have any such optimizations for LLMs. People want smaller architectures because they're cheaper, but for best performance, the larger ones such as GPT-4 are still currently in a different league. LLMs have already demonstrated usefulness in coding (Copilot) and learning (a threat to Google). We have no idea what new killer applications might emerge a year from now.
If it drops in a few years, that's good enough! I'm worried it will drop very soon, but based on my research, that's not very likely. I'll find something else to invest in after AI cools down. It's extremely unlikely that companies will let Nvidia have a monopoly. They are not stupid. AMD is almost guaranteed to get some market share. I do agree AI is just hype. It is useful, but it is overexaggerated.
Intel certainly has issues, but they are working to transform the business into a TSMC competitor and really focus on chip manufacturing for other companies, which is good. And they are in talks with ARM on a 10 billion dollar investment pre IPO, which would be huge and very strategic. Easy firm to hate on, but they have the potential to get back on track.
The problem with AI is that it makes its users poorer, because of 1000x more competition. And the need for power and memory has already dropped 1000x too in just a week, and the P/E still
Most people are negative regarding Intel; however, their stock is screaming "BUY me NOW"!!! (I did on June 8th at $31.75 and I'm already up 14%... $36.37.) Its next resistance levels are ~$44.00 and $55.00. Much more affordable than Nvidia and AMD as of Juneteenth, 2023.
8:53 “This is like them going for the jugular” Are you sure it has anything to do with another company and not them trying to build the best product they can
That's the problem: by the time you invest in your own hardware and create all the software necessary to get it working, Nvidia will have a production solution on better tech. Not to mention they have the tools to crush you on price whenever you get somewhat competitive. Just look at what they have done to AMD in the past. Anytime AMD gets close, they just drop the price on a previous-gen GPU and smash them.
The long term winners in AI will be software companies, not those making hardware. Think Intel vs Microsoft. This means companies like Tesla will benefit far more from AI than a hardware company like Nvidia.
Bro, you are wrong. Hardware is the main part of AI's computing. If there is no hardware, there are no AIs. It's just like computer software: if you don't have a computer to run the software, how can you use the software?
Uhm, if Nvidia doesn't adjust its prices, then it won't kill anything, because many GPU users won't mind using a slightly less powerful but reasonably priced GPU instead of paying the crazy prices Nvidia is asking. Even the slightly lesser GPUs are powerful enough for current and future games. So again, if Nvidia doesn't change its crazy pricing, it will price itself out of the common market and retain only the high-end users, and I think most of the money is made in the common market.
@@tylerbennett4488 Ahh ok, my bad, but I still don't think this will kill AMD or any other company, especially not Intel, because they will still have enough market share in other places not to be killed off by Nvidia. Also, AMD and Intel themselves will surely bring out a product that can compete with Nvidia.
"Nobody" uses CUDA directly anymore. PyTorch and other libraries are the glue... If these libraries work with AMD, NVIDIA's moat is gone. Right now what we have is a lack of trust in AMD; firms just buy what they know is going to work... like in 2018/2019, when AMD had better data-center products than Intel but people didn't buy them. What's more, big companies (MSFT, Meta, Google, ...) don't like NVIDIA, and they want competition in the sector so that their acquisition costs are lower. Also, AI is not only GPUs... the majority of AI is done on CPUs.
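The "frameworks are the glue" point can be sketched in a few lines. This is a toy, hypothetical backend registry (all names invented here), not real PyTorch internals, but it shows the mechanism: user code calls a framework-level op, and the framework dispatches to whichever vendor backend is registered, so swapping CUDA for ROCm never touches user code.

```python
# Toy sketch of framework-level dispatch eroding vendor lock-in.
# BACKENDS, register_backend, and matmul are hypothetical names for illustration.

BACKENDS = {}

def register_backend(name, ops):
    """Register a vendor backend (e.g. 'cuda' or 'rocm') with its op table."""
    BACKENDS[name] = ops

def matmul(a, b, backend="cuda"):
    """Framework-level op: user code never calls the vendor library directly."""
    return BACKENDS[backend]["matmul"](a, b)

# Two toy 'vendor' implementations -- same math, different registered backend.
def _naive_matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

register_backend("cuda", {"matmul": _naive_matmul})
register_backend("rocm", {"matmul": _naive_matmul})

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
# Identical user code on either backend:
assert matmul(a, b, backend="cuda") == matmul(a, b, backend="rocm")
```

For what it's worth, PyTorch's ROCm builds follow roughly this pattern in practice: AMD GPUs are exposed through the same `torch.cuda` device API, so most model code runs unchanged.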
Yes, I heard Nvidia AI accelerators are only busy 25% of the time; the other 75% they sit idle while the CPU takes over. I believe AMD's x86-plus-AI-chip combination has the advantage. Nvidia's Arm CPU plus AI accelerator combination is not going to be popular, since Arm CPUs don't have good software support.
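The 25% figure above is hearsay, but the arithmetic behind the concern is easy to check. Taking that number purely as an assumption, Amdahl's law says that if only a quarter of wall-clock time runs on the accelerator, making the accelerator faster barely moves end-to-end time:

```python
# Back-of-envelope check, assuming the commenter's 25% busy figure.
# If the CPU-bound 75% doesn't shrink, accelerator speedups cap out fast.

def end_to_end_speedup(accel_fraction, accel_speedup):
    """Overall speedup when only accel_fraction of total time benefits (Amdahl)."""
    return 1.0 / ((1.0 - accel_fraction) + accel_fraction / accel_speedup)

print(round(end_to_end_speedup(0.25, 2.0), 3))   # 2x faster GPU -> ~1.143x overall
print(round(end_to_end_speedup(0.25, 1e9), 3))   # "infinite" GPU -> ~1.333x overall
```

Even an infinitely fast accelerator caps out at 1/0.75 ≈ 1.33x here, which is exactly why the CPU sitting next to the accelerator matters so much in this argument.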
Yes, Nvidia is very innovative and strong atm, but at the end of the day they also need to manufacture all of that stuff in huge quantities, and that's where the problem lies. There simply aren't enough factories on the planet that can pump these chips out fast and cheaply enough. TSMC cannot do it all, and most of its capacity for quite some time has been bought up by Apple, as far as I know. So as long as Nvidia doesn't build its own very expensive factories, these chips will be an expensive niche product, imho.
@@techwithdave Costco had quality products at a reasonable price. Look at their parking lots and the carts all loaded up. Great service. Not some trashy place like Walmart or Sams.
Because AMD is putting the same amount of HBM3 in as Nvidia probably - 192GB. EDIT: actually Grace Hopper only has 96GB - Nvidia's GH whitepaper is slightly confusing in this respect.
@TickerSymbolYOU you are missing the point. Nvidia has like 3x the memory that AMD does on their configuration and more memory allows for larger models and less network traffic.
MI300X is untouchable in the near future. Nvidia just can't do anything to AMD on cloud AI, at least in the near term. In AMD's announcement there's a PyTorch part. That's a pretty big thing you have missed, since you're not a developer. That's understandable.
I invest in AMD, Nvidia, and other AI software and EV companies. I don't agree with you (I could be wrong). If the AI market is so big, it's not possible that AMD can't get any market share; the chance of that is almost 0%. If Nvidia can't get enough TSMC capacity to meet demand, the rest will most likely have to be filled by AMD. I've also heard the total cost of ownership of AMD's AI chips is cheaper, since Nvidia charges a huge premium (just like AMD vs Intel). Based on many reviews, AMD's AI hardware is not bad, but the software is not as good as CUDA. AMD is going to release the MI300 later this year; let's see how it does. I think it will do well. Nvidia's stock price is just so high that it has to execute well to justify it. Right now, Nvidia is a "monopoly" until AMD releases its new AI chip, so they can claim whatever they want. I think Nvidia will do very well this year. AMD's AI chip will not beat Nvidia anytime soon, but it will take some market share.
Agree, it's a matter of time before NVDA loses the complete monopoly. However, even 50% of a $150B TAM is larger than almost 100% of the current $30B. That's why NVDA has more headroom in its stock.
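The arithmetic in that comment, spelled out with its own rough figures (the $30B and $150B numbers are the comment's assumptions, not forecasts): a smaller share of a much bigger market can still exceed a near-total share of today's market.

```python
# Comment's own rough numbers: ~$30B market today, ~$150B projected TAM.
current_tam_b = 30
future_tam_b = 150

nvda_today = 1.00 * current_tam_b   # ~100% of the current market -> 30.0
nvda_future = 0.50 * future_tam_b   # only 50% of the future market -> 75.0

# Losing half the market share while the TAM 5x's still more than doubles revenue.
print(nvda_today, nvda_future)
```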
@@dfv671 The TSMC quota is per company. Even if Nvidia uses up all of its quota, AMD may still have some, unless Nvidia buys up all the advanced-node quota, which is not very likely.
The money isn't in the hardware (Intel, Qualcom), it's in the software (Microsoft, Apple). The chip makers will make money, but the software companies will make more.
Many forecasts don't factor in geopolitical events, like the possibility of China blockading or invading Taiwan, which Nvidia, AMD, and many others currently depend on. Intel's fabrication is more spread out globally, though it will inevitably be impacted too if China moves against Taiwan.
Because it's only used internally at Google, and Google Cloud is a distant third in the cloud war. It's great for Google to have their own chips. But enterprise customers want to use a platform that works on all cloud or on prem.
Nvidia doesn't have a monopoly; it's just that other companies didn't have the balls to take risks and invest in innovation. Nvidia's plans for the future were always public. They've been working on this for about a decade now, and for a decade they were the only ones who believed in AI hardware acceleration. Other companies have had a decade to make a similar decision, but they didn't. This is just the result of Nvidia taking risks and investing in innovation. They've earned first place fair and square, since no one else had the balls to do what Nvidia did.
Nvidia doesn't have a monopoly. Except that almost all PC games are developed directly for Nvidia GPUs, and CUDA makes it practically a monopoly in the AI space; and while it does take balls to do that, that doesn't contribute to the discussion. I know for a fact you have no idea what innovations are happening at both AMD and Intel, or how much is invested in them.
@@nexovec Why call it "practically a monopoly", which literally cannot exist, when you can just call it what it really is? That is, crushing the competition fair and square with a superior product. Let me guess: "monopoly" equals bad, so it's meant to make Nvidia sound like the bad guy. Also, you're totally right that I don't know about the groundbreaking innovations AMD/Intel have provided to the industry over the past ten years or so. Clearly nothing in the GPU sector. CPUs? Don't they own all the rights to the x86 architecture? Slightly better than a monopoly, a duopoly? Mind providing some examples? And having balls does contribute to the discussion. Innovation isn't going to happen if you don't have the balls to dig deep, out of your comfort zone. I mean, just look at Apple. They are/were not even using Windows and still made it this big, and now they even use their "own" CPU that isn't even x86. Competition means nothing with regard to innovation/progress. Having balls does.
Intel has supposedly had super smart engineers and world-class R&D infrastructure for a long time. How come Intel has fallen so helplessly behind?
IDK about the industrial segment, but on the consumer side, they relaxed and laid back during their CPU domination phase (Core i3~i7 2000 series up to the 9000 series, when Ryzen finally took off and started pecking them on the butt), throwing in an extra +5~7% performance per generation. They thought they had it made, like Apple does with their market share. The reality check didn't go so well.
The general consensus on what nGreedia is after, is that they want to become "the Apple of AI", much in the same way as real Apple themselves have their constant share of consumer market. This *may* be achievable. But there are also multiple examples of epic fails and slip-ups in the history, e.g. 3dfx, who were *the* GPU at the time, but slipped so hard they got gobbled up by nV; IBM were once *the* PC to have, but now they're nowhere near consumer electronics. Some people also like to compare AI and cryptocurrencies hype, but AI has practical workload implications outside just abstract "value generation". But here also lies the caveat, namely "market saturation". AI is just barely out of infancy and grows rapidly, but this is not an infinite loop.
Like Tesla, Nvidia saw the opportunity and took hold of it long before anyone else. How's Google's battle with Tesla on autonomous driving going? Where's the Apple car? How about Microsoft's browser war? Won't be so easy catching up to nVidia.
OP made jokebait. Nvidia can't even make drivers that properly support a 1080, forget about a 4080 😂 Nvidia is blowing cards' lifespans by overpushing power into cards that can't even give you stable or near-constant FPS. Yet they want to attempt this.
Jensen Huang is a Founder/CEO. It makes a huge difference in company culture. Their employees are engaged, feel connected, and have a strong sense of ownership. Intel, AMD, and others are just corporations offering good and well-paid jobs but nothing more.
Solid point!
Jensen Huang is in a different league of CEO. He has technical knowledge and a vision of the entire GPU field, from the beginning he helped create all the way through to its future. Other CEOs are worried about next quarter's earnings and figuring out wtf AI is.
AMD's CEO has three degrees in electrical engineering, has spent much of her life developing products including semiconductors, and you are talking nonsense. Do a minute of research. Huang and Su are closely related by family.
@@Longtack55 Lisa Su is an expert in semiconductors and EE but, from all the talks I've seen of her, demonstrates only a very basic knowledge and understanding of the AI field. When I watch Jensen's presentation, I see he knows all the cutting-edge details of the field and has a vision of where the next big thing is heading. Lisa Su at AMD AI Day and Jensen at Computex was like night and day. Being closely related by family does not translate to equivalent AI knowledge and capability as CEO. Lisa Su has been an amazing CEO for AMD, but Jensen is at a whole new level.
What the hell does a family connection have to do with anything? Are they sharing some telepathic family hive mind?
And AMD offers custom designs for specific solutions. I think people are really underestimating the advantage of AMD chiplets and the flexibility they offer. And Nvidia is not customer friendly. They force what they think you want into the market and people are adapting
Not only flexibility, but also price. From what I've picked up as rumours, AMD will kill Nvidia on the pricing front. Chiplets are BRUTALLY cheaper to produce.
@ThomasTomiczek that's exactly what I'm saying. AMD can literally vary the chiplet CCD count for whatever workload a product is specific to, without much effect on pricing. It's like a jigsaw puzzle where all the parts are interchangeable. NVIDIA margins might be good for the 2nd half of this year, but the 1st quarter next year will paint the AI picture more clearly.
From a purely valuation perspective, Nvidia's stock should fall about 90% in the next recession, AMD's should get cut in half, but Intel's should barely fall at all because it is already so depressed.
Yeah, but let's not forget that AMD is the company that drops the Popsicle on the floor and cries looking at it like a baby. You could put a fish right in their hand and it would still escape.
What? AMD is way behind! Right now their bread and butter is consoles... notebooks... not serious about the high end.
Big customers like Amazon and MSFT will develop software for AMD.
The big edge Nvidia has is their software stack (CUDA), but I've heard several in the ecosystem are not a big fan of the lock in that comes with CUDA. Just something to keep in mind.
True, but right now they're so far ahead that this is going to have national security ramifications. I can see major world players like France, Turkey, and Japan having a problem with the United States having this technology to itself.
And AMD offers custom designs for specific solutions. I think people are really underestimating the advantage of AMD chiplets and the flexibility they offer. And Nvidia is not customer friendly. They force what they think you want into the market and people are adapting early. It will change shortly and those Nvidia margins will crash...
You downplay AMD and don't even mention PyTorch? That is huge given AMD's history with open source. How do you think LLMs are made? AMD has made a huge step, going from absolutely nowhere to actually being somewhere.
Definitely agree with this but do you really think that's an apples-to-apples comparison?
@@TickerSymbolYOU Dude, you lost my respect!! Please do just basic research before you try to claim anything.
@thechezik so what are you claiming??
The problem with AMD's recent AI presentation is that I don't really feel like any of this is going to make them competitive with NVIDIA in the AI space. It's not so much the hardware that's the issue, but the lack of investment on the software side from AMD. NVIDIA has heavily invested in CUDA, making it very powerful, which in turn makes it very useful for AI. NVIDIA also does a whole heap of AI research internally using their own hardware, which leads to further improvement with each generation.
A simple example of this is the move to 4bit inference. As Lisa Su said, their 192GB AMD system allows you to run an 80 billion parameter LLM model (since it would be running at 16bits on AMD hardware). With CUDA it's possible to run these models at 4bits with no perceptible loss in quality (there is a scientifically measurable loss, but you'd never notice it in practice). This allows you to run those models with a quarter of the VRAM. To put it another way, you can already run 33+ billion parameter models on a single RTX 3090, which is a 24GB consumer GPU available for around $700 on the second-hand market. So running an 80 billion parameter model just isn't that impressive anymore.
Hypothetically speaking, if AMD could get 4bit inference working on these systems, it would allow you to run a 300+ billion parameter model with 192GB of VRAM, which would actually be game-changing (as it would mean being able to run models larger than GPT3 on a single AMD system). Unfortunately, they're just throwing VRAM at the problem, rather than making the technological investment needed to make their hardware more useful for AI.
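The VRAM numbers in this argument follow from simple arithmetic: weight memory is roughly parameter count times bits per parameter divided by eight. A rough sketch (weights only; activations, KV cache, and runtime overhead are ignored, so real usage runs somewhat higher):

```python
# Rough weights-only VRAM math behind the comment's 16-bit vs 4-bit numbers.

def weight_vram_gb(params_billions, bits):
    """Approximate GB needed just to hold the weights at a given precision."""
    return params_billions * 1e9 * bits / 8 / 1e9

print(weight_vram_gb(80, 16))   # 80B params @ fp16  -> 160.0 GB (192GB-class system)
print(weight_vram_gb(80, 4))    # 80B params @ 4-bit ->  40.0 GB
print(weight_vram_gb(33, 4))    # 33B params @ 4-bit ->  16.5 GB (fits a 24GB 3090)
print(weight_vram_gb(300, 4))   # 300B params @ 4-bit -> 150.0 GB (fits in 192GB)
```

This is why 4-bit inference is the crux of the comment: the same 192GB box holds either an 80B model at 16 bits or a 300B+ model at 4 bits.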
I'm no tech geek, but from Su's presentation it does sound like they're throwing in more RAM to compensate for a lack of compute and offering a more affordable version than NVDA.
I keep hearing NVDA's software/ecosystem is way more developed than AMD's, so it will be interesting to see how AMD catches up on that side.
"Unfortunately, they're just throwing VRAM at the problem, rather than making the technological investment needed to make their hardware more useful for AI." They're actually doing both; see the changes in ROCm.
NVDA iterates on itself with AI; it's insane.
This video is not very insightful. I agree that AMD's presentation was a little too honest, but Nvidia's presentation was super cringe in comparison. AMD's datacenter APUs are super competitive with much more memory; yes, these MI300 products come out a little later, but they are actually stronger, making up for the delay. The MI300 series brings feature parity with better 16-bit float, int8, and int4 support. Nvidia's main lead is in software, and that will probably continue for one or two generations, but AMD is catching up really fast. CUDA's vendor lock-in value is hugely overstated. People don't write CUDA code directly for data science problems; they write in frameworks like TensorFlow, PyTorch, Flax/JAX, etc. AMD's chiplet-based hardware will be cheaper to produce and will have much more silicon to dedicate to each product.
Nvidia needs to deliver on chiplets in their next generation otherwise they will suffer a slow death. AMD will take the lead if they continue their advances with chiplets and if Nvidia doesn't.
AMD could crawl back if NVDA doesn’t master MCM and vertical stacking.
I said it in the last video and I'll say it again: Nvidia has a PhD in AI while AMD is still tryna graduate from high school.
😅 well said
Ha haaaa, hold my beer. Ngreedia is going down the rabbit hole with its craving for cash while Intel and AMD gain more customers and fans.
Dave Brown (vice president of Elastic Compute) said AWS had declined to work with Nvidia on the DGX Cloud offering.
They sounded more willing to use AMD's MI300 than Nvidia products, so it's not cut and dried when it comes to hyperscalers.
@@TickerSymbolYOU
You are giving a thumbs up to every praise of Nvidia.
I agree. Nvidia is way ahead. It's a very focused company waiting for a long time for their moment. Now their moment has come.
Well, its market cap is way ahead, anyway.
The problem with nvidia is that they are not big on open source software and drivers. This is important to me. That’s why I’m using AMD on Linux.
a very valid concern
I recently switched from Nvidia to an AMD GPU on Linux because I really got annoyed with Nvidia's shitty drivers.
Linus needs to drop them another middle finger 😂
Wrong. Nvidia is actually a HUGELY prolific open source publisher of software, you have no idea what you're talking about.
There won't be a monopoly because there's not enough supply from NVDA to meet the demand of the whole market. Grace is nowhere close to EPYC in performance. The MI300A can steal enough CPU-sensitive AI workloads as well. NVDA has marketed itself to non-experts, which makes it seem more important than it actually is. Google isn't a long-term customer for the H100; they are investing in their own TPU and networking solution, with proprietary software written from scratch, bespoke to their hardware. The same is true of Microsoft working on its own AI chip. When software gets bespoke, it doesn't matter if you have a much more powerful chip, because those custom-crafted workloads will still run slower on it.
Nvidia is indeed way ahead, but are they too far out over their ai skis? Though none quite this massive or fast, I've been through countless explosive industry growth spurts that left market leaders scrambling when the hype-train suddenly switched tracks. Don't get me wrong, Intel has long needed to be shaken up more than AMD's been able to by themselves.
It's just that so much money and research being poured into ai core architecture gives me an admittedly nostalgic mental pause/hesitancy.
Then again, I may just be getting too old for this. 😁
Nvidia isn't valued as a trillion-dollar corporation today based on "jumping on the bandwagon". Nvidia has the money to spend.
That's just ridiculous. Where are the 8K PC screens??? By 8K I mean 7680x4320 on a reasonable 24", and I want two of those driven by a single-slot GPU.
so how do you really feel about NVIDIA? Cause im just not clear....
These are fantastic picks. I was really hopeful about my investments this year, but I followed some stock suggestions that didn't go so well. I've been studying the stock market, and I realized some investors made millions from the recent recession, so I was wondering if such a success rate could be achieved in this present market, with the Federal Reserve taking a more hawkish approach to interest rates.
Yes, If you go long against the market you'd make such success.
Throw it into AI stocks / Hold some in gold. I built a 7 figure well-diversified portfolio just by following Trisha Jean Webb's recommendations. I buy quality firms, anticipate to hold them regardless of what happens, pay up but not too much, keep track, sell only when necessary, and be ready to course correct. also ignore the forecasts and market views which are at best entertaining but completely useless.
@@RandyPelletier I did look up your FA and found her web page. she has a pretty decent bio, I wrote her and I'm waiting on her reply.
Expect Nvidia to fall by about 90% in the coming recession. Not because it's a bad company, but because that's how much the stock is overvalued compared to its profit and revenue figures.
Not having tensor cores in AMD was a very, very bad decision. Now they are playing catch-up while Nvidia leads.
See the AMD Instinct MI300 and you will understand why.
Nvidia started early. Since the RTX 20 series they have been targeting the AI market.
Not so fast, buddy. AMD is still very much alive. It's just that Lisa didn't sell her chip the way NVDA did. That's a blind spot... Lisa messed up big time, but the tech is definitely in line. She just doesn't know how to spin tech!!!
I don't know... NVDA's price has already gone too high. That makes me afraid to invest in it at this point, as it can drop at any moment 😬
It won't because their AI breakthrough is phenomenal
I am invested since 2017 but if I were you I wouldn't invest at this point either. The rally always ends at some point.
When the stock goes vertically up, there's hype and FOMO fused together; a crash is imminent. NVDA is selling axes and now everyone is looking for gold.
It's going for a split.
I put all my money aside to invest, and NVDA could be one of my picks, but no way at this point. Because if I buy a company at a trillion-dollar valuation and sell at a 25% profit, how much profit will I really make?
The All-In podcast... Grace Hopper is a separate ARM processor with separate memory. The MI300 is a single-package, unified-HBM-memory CPU+GPU MCM module, which is heterogeneous. How exactly is Grace Hopper, which is monolithic, even a remote threat?
150x the work compared to what exactly? No details provided and nobody is asking. It's interesting how different this is perceived by those in the investing space vs the engineers and developers who actually implement and use these systems.
Why does everyone covering Nvidia's recent success avoid talking about the 4000 series fiasco?
because it's not what's driving their balance sheet
@@TickerSymbolYOU 😢 ik
@@TickerSymbolYOU fair enough!
Excellent Video. It has become clear to me over the past year - Nvidia has addressed the silicon need right along with an amazing ecosystem AND community. I’m convinced their market dominance comes from the fact they have succeeded simultaneously on both fronts.
Competing companies talk about AI. Nvidia IS AI.
Having said all that, obviously none of this is a done deal. Environments evolve. At this point, NVIDIA has a big lead
Exactly right
The rush to invest in AI since GPT3's popularity makes little sense to me. Do people think AI will generate massive amounts of profit so soon?
maybe were not even talking profit anymore but world domination ...
If China were to invade Taiwan, what happens to NVDA?
I think it wouldn't be just Nvidia who gets screwed if that happens. Basically every new high-end chip is produced there, no matter if it's a CPU or GPU. IMO we would basically get chip shortage vol. 2, but much worse. And analysts are leaning toward the conclusion that it's not a question of whether it will happen, but when.
That is why the US is building up chip manufacturing at home.
TSMC has new factory in Phoenix AZ
The fab in AZ does not get the most advanced nodes. The most advanced nodes always stay in Taiwan. The Taiwanese are not stupid!
However, I am from Asia (HK) and I am 99% sure China will not invade Taiwan. China won't have the military power to invade for many years.
@@ec188 China will not invade Taiwan, not because of military power, but because keeping tensions high keeps the USA distracted from its core focus.
I want to see Intel succeed because depending on Taiwan for 90% of advanced chips is a problem. People can talk smack about the company, but if anything happens to Taiwan, all the tech companies you mentioned will be lining up to use Intel's foundry service. Jensen likes their new chips and says he's open to working with them BTW. The DGX H100 uses Xeons, too.
Hi Alex, love your video editing! and I want to start my youtube channel - what software do you use to edit? where do you get your music and images? lastly, how do you create your thumbnails? your advice would be greatly appreciated.
Hey there. Thanks for the kind words. I use Adobe Premiere Pro to edit, get my music and images from Storyblocks, and create my thumbnails in Adobe Photoshop. Cheers.
@TickerSymbolYOU Brilliant - Thank you for such a quick response, and wish me luck, you are one of my inspirations !
I wonder if, to stop the monopoly, they would have to allow CUDA to run on other hardware.
It's not a natural monopoly so they could get broken up in the future. Nvidia should be careful.
NVDA designs chips but does not manufacture them. What does this mean for NVDA? Will it be a problem if they can't meet demand?
Thanks for throwing in that chip bag gag. Gives your videos that human touch.
If I were in the Nvidia board room, I would be pushing for a long-term deal with Intel to lock down the foundries. In one handshake they would put the entire industry in a chokehold for the next decade at the very least. With Intel's foundries coming online over the next few years, it would be a one-stop shop for the future of the tech world and would redefine economics as the world understands it today. And we'd all get filthy rich as it happened.
NVDA is a gorilla company. They have the dominant proprietary architecture that includes hardware and software. They have a value-added chain that includes software applications and tools. High switching costs. MSFT, Google, Meta, Amazon, etc. cannot afford to wait for AMD to catch up, or they lose in the marketplace. They have to pay NVDA to play in AI, smart cloud, etc.
This scenario is similar to Wintel during their exponential growth. Eventually competitors came in, but it took years. Arm Holdings could be a good investment after its IPO, as their cores are used by NVDA and AMD, along with the rest of the world, including Europe.
I made a similar comment about Wintel and the decade-long monopoly of the PC market.
@@phvaessen The Gorilla Game, by Geoffrey Moore.
The Innovator's Dilemma, by Clay Christensen.
I wish I had invested more into NVDA 🤦♂️ 😭
more opportunity in the markets, do not worry habibi
@@1lowtrade Do you think of any specific stock?
@@hugonordenswan You are asking for investment tips on youtube...you are already a loser at this point lol.
Not too late! DCA
Your last video was about AMD destroying Nvidia. Now this😅. I think both will do fine over the decade.
My last video was definitely not about AMD destroying Nvidia.
@@TickerSymbolYOU maybe I should watch the video and not just the thumbnail 😂
What do you think about the collaboration between AMD and huggingface? Could that be one way to fight CUDA for generative AI? Cheers!
Intel has neuromorphic chips that process data closer to how a living creature would, while Nvidia is just brute-forcing things with GPU shaders and tensor cores. So it's kind of obvious that Intel has the real lead in the long run. Plus, they are in talks to produce next-gen Nvidia parts.
Nvidia too big.
I be supporting amd.
Alex, great video, thanks. What is the next big thing in computing and the AI revolution, and how can we make money? Your thoughts?
still very early innings but semiconductors seem to be the way to go for now
Don’t forget though… Jensen and the AMD CEO are cousins! Hmmmm… I don’t have AMD, but I don't think AMD is going away. My Uncle Jensen always delivers 🎉
This fact blows my mind every time I hear it
More like Uncle and Niece. Her granddad is eldest bro of Jensen’s mom
Yes… that’s what I meant. These guys are pros!
I don't think it's so bad for AMD. The datacenter AI buildout is just for training; for inference, Nvidia will not have a monopoly at these prices. Right now we see inference running on these beasts, but LLMs will become smaller, because people don't like how things are now and because the massive computing power needed for inference isn't worth it either. So for a datacenter, yes, AMD may be less of a thing for now, but if they have full support for Torch, it is usually feasible to swap the vendor. And I know it's far from simple, but if it's worth it to switch, such "small" players as Google will be able to do it.
Moreover, I (and many others) feel that AI is just hype. I mean, it is inevitable that AI is useful, but it is not AS useful as we believe. Not yet. It will be, but not now. So the hype will drop in a few years.
You sound like the CEO of Blockbuster video.
Where is Blockbuster now?
The Nvidia H100 has a transformer engine (for the LLM architecture) that they claim is 3x faster. AMD doesn't have any such optimizations for LLMs. People want smaller architectures because they're cheaper, but for best performance, the larger ones such as GPT-4 are still in a different league. LLMs have already demonstrated usefulness in coding (Copilot) and learning (a threat to Google). We have no idea what new killer applications might emerge a year from now.
If it drops in a few years, that's good enough! I am worried it will drop very soon. However, based on my research, that is not very likely.
I will find something else to invest in after AI cools down.
It is extremely unlikely that companies will let Nvidia have a monopoly. They are not stupid. AMD is almost guaranteed to get some market share.
I do agree AI is just hype. It is useful, but it is way overexaggerated.
👍👍👍👍👍👍👍👍
I think it's unfair to say she doesn't know how big the market will be, nobody does - it's actually an honest position.
Intel certainly has issues, but they are working to transform the business into a TSMC competitor and really focus on chip manufacturing for other companies, which is good. And they are in talks with ARM on a 10 billion dollar investment pre IPO, which would be huge and very strategic. Easy firm to hate on, but they have the potential to get back on track.
Intel cannot buy Arm or own a majority of voting shares under UK law ⚖ as it is a British company.
The problem with AI is that it makes users poorer because of 1000x more competition. And the need for power and memory has already dropped 1000x too in just a week, and the P/E still
not sure why the monopoly comments. you don't think google, amazon, facebook are all monopolies?
Next Nvidia stock is 🔥PLTR!
Most people are negative regarding Intel; however, their stock is screaming "BUY me NOW" !!! (I did June 8th for $31.75 and I'm already up 14% ... $36.37) Its next resistance level (s) ~ $44.00 and $55.00, much more affordable than Nvidia and AMD as of Juneteenth, 2023.
8:53
“This is like them going for the jugular”
Are you sure it has anything to do with another company and not them trying to build the best product they can
Why can’t it be both? Companies only build products they think they can sell in a market with competitors right?
AMD way overpriced today.
"I am the Sound effect" and the Karaoke with AI was best
Nvidia’s DGX supercomputer just killed Tesla’s Dojo…
That’s the problem: by the time you invest in your own hardware and create all the software necessary to get it working, Nvidia will have a production solution on better tech. Not to mention they have the tools to crush you on price whenever you get somewhat competitive. Just look at what they have done to AMD in the past: anytime AMD gets close, they just drop the price on a previous-gen GPU and smash them.
What does that mean, "my ticker symbol is YOU"? Are you a stock?!
The long term winners in AI will be software companies, not those making hardware. Think Intel vs Microsoft. This means companies like Tesla will benefit far more from AI than a hardware company like Nvidia.
It depends whether NVDA settles just being a Hardware company. It’s move up the stack has been impressive and more to come.
Lol
Tesla can be Nvidia’s customer in AI.
Many people forget AMD is a software company: amdgpu
Bro, you are wrong. Hardware is the main part of AI's computing. If there is no hardware, there is no AI. It's just like using computer software: if you don't have a computer to run the software, how can you use it?
Uhm, if Nvidia doesn't adjust its prices, then it won't kill anything, because many GPU users won't mind using a slightly less powerful but reasonably priced GPU instead of paying the crazy prices Nvidia is asking. Even the slightly lesser GPUs are powerful enough for current and future games. So again, if Nvidia doesn't change its crazy pricing, it will price itself out of the common market and retain only the high-end users, and I think most money can be made in the common market.
I think he is rather talking about Nvidia killing AMDs chance to grow big in the AI industry.
He’s talking about workstation GPU’s. Not consumer GPU’s. Nobody is buying 3060Ti’s for their AI workloads.
@@tylerbennett4488 Ahh ok, my bad. But I still don't think this will kill AMD or any other company, especially not Intel, because they'll still have enough market share in other places to avoid being killed off by Nvidia. Also, AMD and Intel will surely bring out products that can compete with Nvidia.
Can you do a break down of tesla DOJO in relation to nvda H100’s 🙏
sure, i'll definitely do another Tesla vs Nvidia video!
Next time we see Jen-Hsun Huang on stage, he will be holding the positronic brain from Asimov's Bicentennial Man...
Nvidia has moved away from consumerism
"Nobody" uses cuda anymore. Pytorch and other libraries are the glue... If this libraries work with AMD, NVIDA moat is gone. Right now what we have is lack of trust on AMD, firms just buy what they know it's going to work... like in 2018/2019 when AMD had better data-center products than INTEL, but people didn't bought it.
More... big companies (msft, meta, google,...) don't like NVIDIA, and they want competition in the sector so that their acquisition cost is lower.
Also AI is not only GPU's... the majority of AI is done with CPU's.
Agree, in the long run AMD will probably catch up. However, as investors, how much longer can we ride on Nvidia's coattails?
Yes, I heard Nvidia AI accelerators are only in use 25% of the time; the other 75% they sit idle while the CPU takes over. I believe AMD's x86 CPU with an AI chip has the advantage. Nvidia's Arm CPU plus AI accelerator combination is not going to be popular, since the Arm CPU doesn't have good software support.
PyTorch already works with AMD GPUs; it's just that nobody is using it.
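For what it's worth, a minimal sketch of why the framework layer hides the vendor: the ROCm build of PyTorch reuses the `torch.cuda` API, so the same code runs on an Nvidia or an AMD GPU unchanged (assuming a working torch install; this snippet falls back to CPU when no GPU is present):

```python
import torch

# torch.cuda.is_available() is True on both CUDA and ROCm builds,
# so framework-level code never has to name the vendor.
device = "cuda" if torch.cuda.is_available() else "cpu"

model = torch.nn.Linear(16, 4).to(device)   # tiny stand-in for a real model
x = torch.randn(8, 16, device=device)       # a batch of 8 inputs
y = model(x)                                # forward pass on whatever device we got
print(tuple(y.shape))                       # (8, 4)
```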
Can we have an episode about Arm ipo and softbank 🙏
Yes, Nvidia is very innovative and strong at the moment, but at the end of the day they also need to manufacture all of that stuff in huge quantities, and that's where the problem lies. There simply aren't enough factories on the planet that can pump these chips out fast and cheaply enough. TSMC cannot do it all, and most of its capacity has, for quite some time, been bought up by Apple, as far as I know. So as long as Nvidia doesn't build its own very expensive factories, these chips will be an expensive niche product, IMHO.
If anyone is wondering, you can buy the popcorn at Costco.
Dollar cost average to Costo. Slow and steady.
I don't see what's the hype with Costco. It's Just another place to buy stuff. And not cheap for sure BTW!!!!!
@@johnboon5330 I can’t speak for the USA, but in Canada, many items at Costco are a good buy compared to what you get at Canadian grocery stores.
@@techwithdave Costco had quality products at a reasonable price. Look at their parking lots and the carts all loaded up. Great service. Not some trashy place like Walmart or Sams.
Why are you keeping AMD stock? It looks like they lost the race on AI chip design.
Because they’re winning in data center CPUs
Cuda baby. Amd will always be a toy without CUDA
No one really knows what size the market will be
No one is even talking about the huge amount of memory Nvidia has put on these chips compared to AMD.
Because AMD is putting the same amount of HBM3 in as Nvidia probably - 192GB. EDIT: actually Grace Hopper only has 96GB - Nvidia's GH whitepaper is slightly confusing in this respect.
I'm pretty sure Nvidia won't have trouble putting more memory on these chips if they thought it was a serious differentiator
@TickerSymbolYOU you are missing the point. Nvidia has like 3x the memory that AMD does on their configuration and more memory allows for larger models and less network traffic.
I get the feeling that these companies don't even know what they are investing in or what the impact will be.
MI300X is untouchable in the near future. Nvidia just can't do anything to AMD on cloud AI at least in the near future.
In AMD's announcement there's a PyTorch part. That's a pretty big thing you missed, since you're not a developer. That's understandable.
I bought the stock when it was in the low $100 range. People doubted the company, and I still believe in the company's longevity 😌
I lol'd at the Intel dumpster fire. Thank you.
Nvidia doesn't make CPUs for consumer or datacenter requirements. I'd say AMD is very much safe and comfortably steaming ahead in its own lane.
Isn't 3.2 GWh a lot?
No. Not for the size.
that keynote... was pretty bad... lol Nvidia might think AI is the way for everything, but it's not lol
Why would I take investment advice from a YouTuber? It's like asking a butcher to advise you on buying diamonds.
I’m a full-time investor but ok
@@TickerSymbolYOU If you say so.
I actually think AMD's chip is pretty cool... : / It's doing all that on ONE die.
I agree; definitely not as lame as the demo made it out to be
@@TickerSymbolYOU Imagine being an AMD engineer right now with Nvidia as your competition.
Besides Elon Musk's, many other NN accelerator chips are also under development. Nvidia will not have a monopoly on large-scale training & inference forever.
Popcorn bit was a BIG miss... yikes.
I know. I should’ve used actual chips
“Early on our partnership …”
Not now😊
Do you really think NVIDIA alone is the real deal?
Let's see how it pans out.
I invest in AMD, Nvidia, and other AI software and EV companies. I don't agree with you (though I could be wrong). If the AI market is so big, it is not possible for AMD to get no market share; the probability of that happening is almost 0%. If Nvidia can't get enough TSMC capacity to meet demand, the rest will most likely be filled by AMD. I've also heard the total cost of ownership for AMD's AI chips is lower, since Nvidia charges a huge premium (just like AMD vs. Intel).
Based on many reviews, AMD's AI hardware is not bad, but the software is not as good as CUDA.
AMD is going to release the MI300 later this year; let's see how it does. I think it will do well. Nvidia's stock price is just so high that it has to execute well to justify it.
Right now, Nvidia is a "monopoly" until AMD releases its new AI chip, so they can claim whatever they want. I think Nvidia will do very well this year.
AMD's AI chip will not beat Nvidia anytime soon, but it will take some market share.
TSMC also makes all of AMD's chips.
Agree, it’s a matter of time before NVDA loses the complete monopoly. However, say 50% of a $150B TAM is larger than almost 100% of the current $30B. That’s why NVDA has more headroom in its stock.
@@dfv671 The TSMC quota is per company. Even if Nvidia uses up all its quota, AMD may still have some, unless Nvidia bought all the advanced-node quota, which is not very likely.
Oh. I see a monopoly coming. Antitrust law suit anyone?
The Nvidia guy and the AMD lady is the same person. Just a different wig and different jacket.
So going all in on Nvidia would be a safe retirement play?
The money isn't in the hardware (Intel, Qualcomm), it's in the software (Microsoft, Apple). The chip makers will make money, but the software companies will make more.
Not a good idea... have some in AMD, INTC, TSMC, or other techs.
Yes, definitely gonna spread it around tech companies.
I did, but I bought it under $200 in the past year... now is not the time.
Many forecasts don't factor in geopolitical events, like the possibility of China blockading/invading Taiwan, which currently Nvidia, AMD and many others are dependent on. Intel's fabrication is more spread out globally though will inevitably be impacted too by China moving against Taiwan.
Nvidia ruining gaming! 😒 beware for the little guy 😎
Why does no one talk about Google TPUs? They are cost- and watt-competitive with anything Nvidia has today.
Because it's only used internally at Google, and Google Cloud is a distant third in the cloud war. It's great for Google to have their own chips. But enterprise customers want to use a platform that works on all cloud or on prem.
It's also because of CUDA. Their software that runs on their GPUs puts them far ahead of any competitor.
@Sandipan Majhi google uses nvidia GPUs for inference, actually
@Sandipan Majhi I think GPUs are more expensive but I honestly don't know
@@chrisrogers1092 TPUs are expensive
News videos about AI this year, no matter the release date: "it's been an insane few weeks for AI announcements". (:
Nvidia doesn't have a monopoly; it's just that other companies didn't have the balls to take risks and invest in innovation. Nvidia's plans for the future were always public. They've been working on this for about a decade now, and for a decade they were the only ones who believed in AI hardware acceleration. Other companies had a decade to make a similar decision, but they didn't. This is just the result of Nvidia taking risks and investing in innovation. They've earned first place fair and square, since no one else had the balls to do what Nvidia did.
Nvidia doesn't have a monopoly, except that almost all PC games are developed directly for Nvidia GPUs, and CUDA makes it practically a monopoly in the AI space. And while it does take balls to do that, that doesn't contribute to the discussion. I know for a fact you have no idea about the innovations happening at both AMD and Intel and how much is invested in them.
@@nexovec Why call it "practically a monopoly," which literally cannot exist, when you can just call it what it really is: crushing the competition fair and square with a superior product? Let me guess: "monopoly" equals bad, so it's to make Nvidia sound like the bad guy.
Also, you're totally right about me not knowing of the groundbreaking innovations AMD/Intel have provided to the industry for the past ten years or so. Clearly nothing in the GPU sector. CPUs? Don't they own all the rights to the x86 architecture? Slightly better than a monopoly, a duopoly? Mind providing some examples?
And having balls does contribute to the discussion. Innovation isn't going to happen if you don't have the balls to dig deep outside your comfort zone. I mean, just look at Apple. They weren't even using Windows and still made it this big, and now they even use their "own" CPU that isn't even x86 architecture. Competition means nothing with regard to innovation/progress. Having balls does.
Nvidia stock is headed to Pluto. They will be at 1000+. Mark my words!! NVIDIA, let's go!
Intel has supposedly had super-smart engineers as well as world-class R&D infrastructure for a long time. How come Intel has fallen so helplessly behind?
IDK about the industrial segment, but on the consumer side, they relaxed and laid back during their CPU domination phase (Core i3 to i7, from the 2000 series up through the 9000 series, when Ryzen finally took off and started pecking them on the butt), throwing in an extra +5 to 7% performance per generation. They thought they had it locked, like Apple does with their market share. The reality check didn't go so well.
You have overlooked Tesla's DOJO.
no, i haven't.
Nvidia share price rn 🤯🤯
The general consensus on what nGreedia is after, is that they want to become "the Apple of AI", much in the same way as real Apple themselves have their constant share of consumer market. This *may* be achievable. But there are also multiple examples of epic fails and slip-ups in the history, e.g. 3dfx, who were *the* GPU at the time, but slipped so hard they got gobbled up by nV; IBM were once *the* PC to have, but now they're nowhere near consumer electronics. Some people also like to compare AI and cryptocurrencies hype, but AI has practical workload implications outside just abstract "value generation". But here also lies the caveat, namely "market saturation". AI is just barely out of infancy and grows rapidly, but this is not an infinite loop.
Nvidia just a drop of microcosm is a response to threats
Like Tesla, Nvidia saw the opportunity and took hold of it long before anyone else. How's Google's battle with Tesla on autonomous driving going? Where's the Apple car? How about Microsoft's browser war? Won't be so easy catching up to nVidia.
Exactly my take as well
Alex, pass that popcorn. This battle is crazy !!!
🍿🍿
@@TickerSymbolYOU Thank you for the popcorn. Looking forward to sequel(s).
Why is he jump scaring us with himself with the popcorn
Because it’s fun
OP made jokebait. Nvidia can't even make drivers that properly support a 1080, forget about a 4080 😂 Nvidia is blowing cards' lifespans by pushing too much power into cards that can't even give you stable or near-constant FPS. Yet they want to even attempt this.
If you think the future of compute is only about AI, then you're clearly fooling yourself.