Nvidia's dominance in AI is not even so much CUDA, but more the overall relentlessness in support and polish for everything, writing tons of custom optimized kernels for libraries etc. The whole CUDA moat in this case is more of an umbrella term for their entire ecosystem, because many of these advantages aren't technically CUDA related. Nvidia actually spent more than a decade of effort while these other companies did not.
IMHO, the ML frameworks are where Nvidia has a huge leg up, because they were (and have been) there for a long, long time helping build them. But I'll be totally honest, if Intel makes realistically good TF/PyTorch support a reality, they could cause a huge upset by simply saying "look, we have this nice card with the A750 and 64GB and it doesn't cost an arm and a leg, because VRAM is cheap and Nvidia are bloodsuckers". Make them ~$600 and I'll be running out the door to buy a couple (couples) :P Much like the prisoner's dilemma, the one that defects first on VRAM size/cost will totally wipe the rest for a modest time, because performance isn't really a concern if you can't even fit the model in memory to start with.
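To put that "fit the model in memory" point into rough numbers, here is a minimal back-of-the-envelope sketch (hypothetical model sizes of my choosing; weights only, ignoring activations and KV cache):

```python
# Rough VRAM needed just for the model weights, which is why a cheap high-capacity
# card can matter more than raw speed for local inference.
def weights_gib(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 2**30

for params, dtype, bpp in [(7, "FP16", 2), (13, "FP16", 2), (70, "FP16", 2), (70, "INT4", 0.5)]:
    print(f"{params}B params @ {dtype}: ~{weights_gib(params, bpp):.0f} GiB for weights alone")
```

A 64GB card comfortably holds models that a 24GB gaming card simply cannot load, no matter how fast the smaller card is.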
So how about it, brother, will Intel manage all of it? I want to set up AI at my home. Come on AI, massage my hands and feet. Come on AI, come on AI, make tea with ginger in it... those are the kinds of orders I'm going to give.
That Intel guy considered the last decade boring?! What the f**** were they doing? Oh yeah! NOT fixing the countless design flaws in their CPUs, instead expecting us to sacrifice the performance we paid for by rolling out (software) mitigations. Go to heck, Intel!
After watching Fallout I've learned what we really need are Super Managers so that these ai projects can continue for 200 years. 200 years of song and dance.
07:32 I remember an ad campaign, "Switch", talking about openness of formats, compatible interfaces, easy-to-replace parts. The same *totally family-unfriendly word* company is now one of the toughest and most harshly defended monopolies in consumer devices and services. If you're trying to gain position, you talk about openness. If you're leading, you lock it down with copyright and proprietary solutions. Decades change, a-hole business practices don't.
TBH my biggest personal hesitance towards using any AI at all is that we constructively can't run and use it locally to ensure the safe-keeping of confidential information. In law, that sort of means sending absolutely privileged information right to some company, usually a big and powerful one. Just because I don't know how these companies will use the information and how they might adversely affect my specific client doesn't mean I can put them at unknown risk for the sake of possibly more efficient use of time. But being able to run open-source code on local hardware could change that. If Intel can come out with AI add-in cards that don't cost the kidney required to buy a 4090 (if one somehow evades China's inhalation of them), then maybe I'll consider it. They could save cost on discrete cards, separate from GPUs, by not having to fuss with all the other silicon devoted to things unhelpful for AI purposes. It might be nice to return to the days of having more than one add-in card.
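In the spirit of that "keep it on local hardware" point, here is a minimal sketch of what fully local inference can look like, assuming the Hugging Face transformers library and a small openly licensed model (the model name is just an example); nothing in the prompt or output leaves the machine:

```python
# Fully local text generation: the weights are downloaded once,
# after which prompts and outputs stay on this machine.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # example open model; pick whatever fits your hardware
    device_map="auto",                           # CPU, a CUDA GPU, or another supported accelerator
)

prompt = "Summarize the key obligations in the following engagement letter: ..."
print(generator(prompt, max_new_tokens=128)[0]["generated_text"])
```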
While the notes for the Xeon 6 2.4x and 2.7x claims said it was vs. prior generation platforms, on Intel's website they were comparing those Xeon 6 parts to 2nd Gen Xeon, which used the Skylake architecture, so not exactly the prior generation - unless they were thinking that, because Sierra Forest is based on E-cores, which are based on Skylake, the comparison is somewhat valid? Also they had a slide comparing Xeon 6 Granite Rapids (P-cores only) with... 4th Gen Xeon, which uses the same architecture as Alder Lake, so again not exactly the previous gen (which would be 5th Gen). But I guess their wording should be enough to avoid problems with false marketing, since they didn't say previous generation but, again, prior generation platforms. To be precise, their wording is "Based on architectural projections as of Feb. 14, 2023 vs. prior generation platforms. Your results may vary." They do write about which CPUs the comparison was against, but it is in their written article and not on the slide itself. So basically a bit of snake oil, just enough to make everything shinier than it should be.
They are pretty competitive in price though. You're better off buying an AMD card rather than a 40-series card that's not a 4090. Unless you're willing to toss extra dollars at the Nvidia tax for less VRAM and a smaller generational performance increase, then sure, do what you do.
How is Intel a competitor in the GPU space? Currently, at best, they make AMD and Nvidia's GPUs look better, while not having better software or better hardware price-to-performance than AMD. At launch, the A770 had the same performance per dollar as the 1650 SUPER.
Checking the Gaudi 3 AI white paper, the 1.5x speed-up *on average* is just due to the larger memory on each card compared to the H100 (128GB vs 80GB)... There are some improvements in interconnect speed (900GB/s vs 1200GB/s on Intel) and memory bandwidth (3.35TB/s vs 12.8TB/s), but Gaudi 3 is essentially a slower card: 1835 vs 1979 TFLOPS on BF16, and 1835 vs 3958 TFLOPS on FP8.
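A quick sanity check on the figures quoted above (ratios computed from those numbers only; real workloads also depend heavily on memory capacity, bandwidth and software support):

```python
# Relative peak throughput and memory capacity, using the figures cited in the comment.
specs = {
    "BF16 TFLOPS":       {"Gaudi 3": 1835, "H100": 1979},
    "FP8 TFLOPS":        {"Gaudi 3": 1835, "H100": 3958},
    "HBM capacity (GB)": {"Gaudi 3": 128,  "H100": 80},
}
for metric, v in specs.items():
    print(f"{metric}: Gaudi 3 / H100 = {v['Gaudi 3'] / v['H100']:.2f}x")
# On paper it trails on raw compute but leads on capacity, which is where any
# claimed average speed-up on large models would have to come from.
```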
But Gaudi is made by TSMC, and TSMC's capacity is booked by Nvidia and Apple. When are Intel's fabs going to be able to compete, even for Intel's own business?
Hey, just letting you know, I actually really appreciate the AI hardware/software coverage. I'm a gamer and an AI software engineer working on developing a Retrieval Augmented Generation chatbot at my company. It's nice to hear Gamers Nexus cover both sides of the things I love to do. lol unfortunately, these chips are so damn expensive, even my company can't afford to buy them for me :') Not that they're being cheap. They'd be 100% willing to buy them for me if they could get the ROI from it to justify the purchase for our mid-size company. But the hardware is too expensive to get proper ROI. Hoping for some stuff to "trickle down" to my company, and then eventually trickle down to prosumer.
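Since Retrieval Augmented Generation came up, here is a toy sketch of the retrieval half of such a chatbot (my own illustration, using scikit-learn TF-IDF in place of a real embedding model; the documents and question are made up):

```python
# Toy RAG retrieval step: pick the most relevant internal document for a question,
# then build the prompt that would be handed to whatever LLM the chatbot uses.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The VPN must be enabled before accessing the internal wiki.",
    "Expense reports are due on the 5th of every month.",
    "GPU cluster jobs are scheduled through the internal Slurm queue.",
]
question = "How do I run a job on the GPU cluster?"

vectorizer = TfidfVectorizer().fit(docs + [question])
scores = cosine_similarity(vectorizer.transform([question]), vectorizer.transform(docs))[0]
context = docs[scores.argmax()]

prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)  # this prompt would then be sent to the generation model
```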
Nvidia is seeing what Cisco experienced in the early 2000s where a bunch of manufacturers jumped onboard the networking equipment bandwagon. I prefer setting up Cisco at work but at home I prefer Netgear. There's a brand for everyone else in between.
Absolutely. This show is just stand-up comedy and self-caricature. I don't know what they are smoking at Intel, but if people react to the show the same way they do to a presidential election, Intel has a bright future ahead. It's not always the best who wins, but the one who makes the most noise. They are quite a way behind AMD; aiming for NVIDIA is self-destruction. But in their presentation they show themselves as if they are already king of the AI world. And this cringe-fest just makes Gelsinger look like a clown. He might be a smart man, but he should not try to present himself as the "edge-overlord". This space is already overcrowded with the ego of Jensen Huang.
I work/worked on the LLM in the RA3 FAB that was shown off. Although it may have been "staged" in the sense that we knew what we were going to ask it before the camera turned on. It does work in FAB to get content real time without the MET searching the four or five different data sources that a MET would normally need to review for an RFC. It's freakin Epic. I wish I had it when I was an L3E in the factory.
I think that's part of the allure though. Nerds should be at intel. If it turned into RedBull Bro culture we'd probably stop upgrading often... or we'd all become redbull bros
It's on purpose. Intel in specific wants to present as boring and safe. We have to remember who Intel's core market is. It's not us, and it's not Wall Street either. It's Fortune 500 companies run by boards of directors over 60. The socially awkward nerd is safe; he'll do what he says to the best of his ability. The guy in the leather jacket and sunglasses is trying to put his hand up your granddaughter's skirt.
@@jtland4842 On the Blizzard note these "people" are an intersection of frat boys and nerds (and rap**ts). An aberration of nature. Blizzard North is long gone, for decades, and Activision wears its corpse. Just look up a photo of the makers of Diablo I together...
If you look at the Intel image at 5:10, it clearly shows their novel approach, where memory is surrounded by processing cores across two separate dies. An impressive achievement for a real first-time competitive offering.
Google translate 😂😂😂😂😂😂😂😂 I don't think this AI crap is going to go well. There was a limit of development in this direction that we should not have crossed, but it is already too late. Technology went exactly in this direction, which partly scares me. But in the end, if humanity isn't wiped out by an asteroid or something like that, it will at least be wiped out by something we ourselves created. 😅
The slide on Intel Gaudi 3 talks about inference and running models, so it is not about training AI. The inference phase is what ChatGPT does to answer when you write some text (so, after training). AFAIK, both Nvidia and Intel are behind the Groq LPU both in terms of latency (time to reply in chat) and energy efficiency (and Groq is still using 14nm)... There is a huge architecture difference: the Groq part has deterministic behavior (no varying instruction latency due to caches or out-of-order execution... a dream for developers) and embeds a network switch, which as I understand it provides deterministic behavior at distributed scale.
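To make the training-vs-inference split concrete, here is a minimal PyTorch sketch with a toy model of my own; the slide's pitch is about the second half (answering requests), not the gradient-update loop:

```python
import torch

model = torch.nn.Linear(16, 2)   # stand-in for a real network
x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))

# Training: forward pass + gradients + weight updates (what big training clusters are sold on).
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()

# Inference: forward pass only, no gradients - the phase a chatbot runs once per request.
model.eval()
with torch.no_grad():
    prediction = model(x[:1]).argmax(dim=-1)
print(prediction)
```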
AI-AI-oh! Thread for people who don't get the comment about 9 not being a prime number is below! If we rhyme with AI like Intel, can we eliminate the prime with Dell?
Grab one of our GN15 Metal Emblem pint glasses to support our work! store.gamersnexus.net/products/gn-3d-emblem-glasses
Watch our coverage of the NVIDIA keynote that preceded this: ua-cam.com/video/0_zScV_cVug/v-deo.html
@@lunarvvolf9606 why do you say that?
Thanks Steve.
Xcellent dancing four this man number one king four computer. Intel gr8 and wonderful four India become number one nationality. Intel mr Pet sir more xcellent and more smartly and power then AMD Lisa su, number one disgrace four all humanitarians but she try her best her branes and four this I am so proud.
Annnals of History
@@RanjakarPatel Very beautifully said, brother.
good news steve, auto captions was able to successfully transcribe "annals"
Thanks to Nvidia H100, funny
Really?! It must have changed at some point. I checked originally and it said "ANIMALS."
Instant pause and rewind with cc on for me as well lol
on the other hand it messed up "Google autot transcribes" only a few words later
@@GamersNexus HEHEH....SURE....
I’m sorry Steve, I have not done enough acid to be prepared for this video, I’ll be right back.
Hahahaha
...you've had two hours now...
How's the acid
Last time I took acid, I took 10 tabs and experienced the ineffable dissection of reality and my soul and how it relates to said reality. I have not done enough acid to be prepared for this video, I'll be right back.
Last time I took psychedelics the code in which the universe is written was laid bare to me and I was annihilated for realizing the true nature of the simulation.
I have not done enough acid to be prepared for this video, I'll be right back.
At this point, we're going to need a compilation of all the intros where Steve gets cut off before he can swear due to sheer bewilderment at what was presented before him
Reality becomes more and more unhinged. How do we even perceive this change anymore, other than simply staring flabbergasted and waiting for the appropriate time to say "what the fuck is going on"? It's all so hilarious
Nah, this is just CEOs in their natural habitat. The AI just lets them loose
@@luandoduy416 CEOs were probably always a little unhinged but mostly contained by PR/HR, but these days being memeable is an asset, while somebody like Steve Ballmer was a liability back in the day.
Hear hear! Please do that at the end of year recap.
12 seconds, 12 fucking seconds and I am already cringing to the point my dentist can smell the payday, what the actual fuck
(Manufacturing AI and Robotics practitioner here) Machine learning in factories is good for getting actionable insight from the enormous amount of data in a smart factory... Applications like anomaly detection (is the machine ok?), explaining phenomena (what's causing bad quality?), optimisation (energy, quality, material consumption), corrections (can I add X to save the batch?).
Factories often had the issue of drowning in data but not deriving enough benefits from it.. cool, we have 100gb of historical data on that electric motor but what's going on now? What's going to happen? What should happen?
Thanks for the interesting insight!
You don't need state of the art NPUs for that type of AI/ML tho
@@mobugs There were computer driven manufacturing machines from the early 1950's. The tools got a whole lot nicer after that but we're talking "ease of use" and "leveraging new technology" rather than mind-blowing paradigm shift.
I don't want a spontaneously smart robot. I want it to boringly work the same way until I tell it not to.
@@MichaelSmith-fg8xh Yeah, got it. What I'm saying is that 100GB of data and anomaly detection can be handled quite well without deep learning (neural nets with tons of hidden layers), which is what these NPUs are used for. If you were talking about things like state-of-the-art computer vision - which I'm sure has its uses in industrial applications - that's another story.
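As a minimal illustration of that "no deep learning needed" point, a classical anomaly detector on plain tabular sensor data (synthetic numbers purely for illustration):

```python
# Classical anomaly detection on motor readings: no neural network, no NPU required.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[60.0, 1500.0], scale=[2.0, 30.0], size=(1000, 2))  # temp (°C), RPM
faulty = rng.normal(loc=[85.0, 1200.0], scale=[5.0, 80.0], size=(10, 2))    # overheating, slowing down

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print(model.predict(faulty))  # -1 = anomaly, 1 = normal; the faulty readings should mostly be -1
```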
0:37 I thought the music was edited, but it was actually part of the reveal lmao
I thought that as well at first. I thought "this can't actually be part of the presentation" but I underestimated them...
It seems the companies truly try to out cringe each other with this stuff.
@@B_Machine well... nerds be always be awkward... it's a trope for nothing.
@@B_Machinebrother have you seen the windows 95 reveal
@@steviesavagery no, I'll check it out!
Edit: omg, that was something else!💀
They at least had genuine energy to it. It had me thinking, "hell yeah," instead of, "oh no...🤦🏻"
"Every company will be an AI company"
Every time I see stuff like this I feel like John Cusack's character in 1408, living in a Kafkaesque reality.
I fully agree with you, cultured man. It lowkey chills my spine.
Companies will say whatever they need to in order to get Wall Street to give them money.
or that it is becoming a bubble, until it pop.
And when everyone's AI... no one will be
Every restaurant is Taco Bell
- How do we stitch it together?
With glue, Intel. Glue.
and tape
Because they use EMIB for die to die interconnects, it’s more like a sticky pad😆
dont forget the Snake Oil!
@@benc3825 Nope... a soldered metal pad, vs AMD's short-term glue that their chips are gonna fall off of
and also duct tape
It's quite fascinating to hear constantly about miraculously advanced AI stuff in an age when mainstream IDEs - used by the tech crowd - still often fail to correctly execute a simple refactor/rename operation. Or, to hear about upcoming advanced AI in Windows while they still couldn't even develop a properly working basic search box that could find a partial word match among the few dozen installed applications.
It's not for you, it's for execs who have wet dreams about replacing their workforce with technology that doesn't exist and will not exist any time soon
jetbrains moment
The US tech industry has always made money on hype. This is the latest one.
There is a good amount of 'there' there, but it's a tiny fraction of the hype. Like, yes, Gen AI CAN do a lot of things... but it shouldn't. And after the novelty, no one really wants it to.
Those 24 200Gbit Ethernet nodes would make a nice Quake server for a LAN party
Fuck yeah
Wanna slide in a couple cod2 rounds, too? Pls! I won’t play shotguns, i promise!!
"Prior decade was sort of boring"
Yeah, and whose fault was that?
Lugma corporation
Exactly. Intel spent it sandbagging and flogging quad cores until AMD spanked its ass with Ryzen, and then Intel panicked and had nothing in return, coz it had spent over a decade rinsing consumers with mediocre upgrades while twiddling its thumbs thinking "yeah, we can milk this gig for another decade" instead of researching and developing anything groundbreaking.
AMD, for not being competitive and creating a situation where everyone else was allowed to coast.
"4 cores 4 ever" was the defining thing for their decade.
Clearly it was all those pesky engineers who love doing “boring” and “meaningful” work. The C-suite is here to fix that!
Good, I don't like Nvidia's stranglehold on multiple sectors of the industry.
All we need now is a competitor to CUDA.
what about ROCM????
@@theaveragecactus you're comedic
Supposedly Microsoft is working on a consolidation layer (like they did for DLSS, FSR, etc) to help bridge the gap between CUDA, RocM and Intel's oneAPI. Take this rumor with a grain of salt though.
@@uzikun People said this with Zen, but here we are. Everything can improve with time, we'll just have to wait and see.
Yes, it's called SYCL
The “I love NVIDIA” AI song will go down in history as the tipping point in losing the race to our robot overlords
15:46
Even Big Brother is losing his job to AI...
Skynets plan to wipe out humanity is to make everyone unemployed.
The robots are coming
It is super sexy 💚🖤🙏
Normally I hate Pat Gelsinger, but that chip reveal was so cringy that it swung back around to me loving it. I mean I still hate him but I liked the bit.
For real? I love Pat's passion and his all-in mentality... sure he's cringy, but I love it
@@thepunish3r735 I found the way he told Congress to go hurry up and get him his Chips Act money like he was entitled to my tax Dollars was as revolting as it was infuriating
@@relucentsandman6447 Ahh, I gotcha. The CHIPS Act was in the making for a while; he made the Ohio construction plans and had the companies wait, because it was the only reason he was expanding as much. The European chips act went fast, and Intel was going to expand even more in Europe, thinking the CHIPS Act wasn't going to get signed. But I see your point. To me, the CHIPS Act is the only thing Biden did that was solid, considering China mainly wants to invade Taiwan for TSMC. (TSMC is the only company that can make high-end chips for our military, like the Patriot system, aka we won't have access to make any more.)
@@thepunish3r735 I completely agree with your points about it being a good strategic use of tax dollars. But that makes it more enraging to me, because I haven't seen any sign of Intel getting its act together since Pat joined again, and now that prick is going to get even richer just for sitting in the right chair at the right time.
@@relucentsandman6447 A 10 billion extension in Arizona, a 10 billion extension in New Mexico, a 10 billion extension in Oregon, plus the first-of-its-kind high-end EUV litho machine (TSMC doesn't even have it yet). A 20 billion Ohio plant being built (with a 100 billion building extension in the future; it will be the biggest foundry building). A new 20 billion foundry in Germany, a 20 billion building in Ireland, a massive R&D-only facility in France, a 20 billion foundry in Israel (doesn't start until 2027)... All of this takes years to build and get working at max potential. All of this takes time, and they will be a powerhouse; Nvidia even wants them to make 300,000 chips a month, which I believe starts in 2025. I know I'm a nobody to you on the internet, but I've been researching them hardcore since the beginning of last year, and I see the public seeing their potential at the end of next year, and 2027 seeing a lot of their investments paying off
Google: correctly transcribes "annal"
Also Google: incorrectly transcribes "auto"
trillion dollar tech company, give em some slack
2:53 I too had to turn on my Closed Captioning and check, after he made that little comment 😉
Guess they should have used AI.
Oh wait
Computer hardware marketing has officially jumped the shark. I know we used to complain that these announcements/presentations were boring but this is not what we wanted them to do instead.
I also felt like I needed to go wash, and to yell for a translator. Half of what the presenter said was fucking acronym gibberish.
IDK I’m kind of enjoying laughing at them
Totally agree. I like my tech announcements prefaced by developers developers developers
Bruv, we're going back to the Blue Man Group Pentium days.
On one hand, I understand that they're trying to market to people with little to none of the requisite knowledge to understand what their products do or how they work: Tech investors and corporate executives.
On the other hand, why are major investment and purchasing decisions being made by people who need bright flashing lights and a live DJ to keep them entertained in presentations about cutting edge computer hardware? Why do we live in a world where this isn't a dry PowerPoint presentation going over performance data and their methodology for collecting said data?
Lisa Su looking through binoculars at Intel and nVidia fighting: "Let them fight..."
"let them take over the trillion dollar industry, those fools."
Yeah, Intel isn’t doing that any time soon. Losing tons of money tends to lead to having problems taking over the industry
@@benc3825 Isn't it remarkable that we live in a world where Intel is behind AMD and Nvidia in Market Cap? I never thought I'd see it.
@@smuggy8576”Intel has depleted its supply of prime numbers.” LMAO.😂
When Lisa Su became CEO, AMD was basically an unprofitable penny stock company with inferior products while Intel was a borderline monopoly. Since then, AMD share price has gone up over 50x and has overtaken Intel in market cap. Their CPUs now dominate gaming, and they're continually taking market share away from Intel when it comes to data centers as well. She knows what she's doing.
0:47 is an Intel MMX reference from a TV ad back in the 90's... but you already knew that didn't you.
Regardless I'm glad he included the clip because it's funny seeing this out of touch CEO goofily dancing while holding his product.
Probably the most under-appreciated aspect of this channel is Steve’s sense of humor 🤣🤣
Also brilliantly covered the entire topic.
Papas lil baby.
PAPA'S HERE *boss music*
Mr. Gelsinger I feel very uncomfortable right now.
Some sugar baby just got a new ringtone 😂
He ate Papa John's pizza.
@HyperionZero I like their CPUs, the GPUs not so much with a few exceptions.
The robots will look back one day in embarrassment at how goofy they were conceived
Yes, but so did you
In the words of Ford Prefect; "Family is always embarrassing isn't it."
goofy aah
*goofily
Sorry 😂
Pat Gelsinger is slowly transmogrifying into John McAfee, and I'm not sure how I feel about that...
Bah! Is he faking heart attacks to escape the feds? Is he trolling people by tweeting about his whale banging? Is he absconding to some banana republic?
The music and dancing reminded me of Steve Ballmer’s “DEVELOPERS DEVELOPERS DEVELOPERS”
As long as he doesn't buy a house in south America we're fine
I doubt Pat will sell drugs, kill his neighbours and flee the country any time soon.
He will just keep destroying the company the same way he has been doing, until it becomes a fab-only business.
I welcome this insanity
One of the most concrete pieces of evidence of a higher power was the fact that everyone listening to that Nvidia AI song in person wasn't convulsing like an epileptic salmon at the first note.
Gaudi is an old German word for having a lively party
Intel has finally completely lost it LMAO
😂😂😂
Yeah, I'd say so. WTF were those clips lol. And at 0:37, at first I thought GN edited in the music. But I underestimated how cringe Intel would get... And I genuinely don't know WTF is going on at 1:16, other than it kind of has that watercolor look you can sometimes get when running images or videos through AI upscaling programs (like Topaz Video AI). The clips were bringing back memories of that amazingly cringe Qualcomm conference with the whole "Born Mobile" thing (Qualcomm at CES 2013). Or Konami E3 2010.
All the companies at least indirectly do maybe, but Intel is first and directly supports occupation and terrorists.
They are obviously most insane and should be boycotted
This is still my favorite out of context quote from good ol’ Patty G
I might even be able to beat my children with that.
@@benc3825I feel like it’s such a rich person thing if you beat your kid with a GPU
If anyone can target nvidia with ai it’s Intel. Intel arc has really interesting RT and productivity on their GPUs that I didn’t expect
Absolutely right on the media side as well.
Intel has the best of the best video codec hardware so far. Yet all-rounder Nvidia still holds the market. However, Nvidia must decrease prices and do what AMD does: make some of the software open source. Then Nvidia will wipe the floor with Intel. Otherwise, within 10 more years Intel will be catching up with Nvidia, or even sooner if their GPUs can consume far less power.
I did a lot of testing on Arc in its early days. It's easy to dunk on them for drivers, but it's continued to impress me on the media/productivity side.
I just hope for more competition in the market.
RT doesn't really mean much for AI performance. However, Arc is pretty much designed for AI. The XMX that runs XeSS is an advanced matrix engine, which is what you really want for tensor operations. In a generation or two, I wouldn't be surprised to see XMX be very competitive with Nvidia's tensor cores. I don't know if AMD has anything comparable in CDNA, but they don't for RDNA and that will hold them back.
NVIDIA, AMD, Intel. The competition is set up now. Let's see how the race will be going. Gaudi2 had a good price tag.
Had no problems with "annals", but for some reason your clearly spoken "auto transcribe" turned into "autot transcribe".
Do they use AI for it now? I don't remember there being so many spelling mistake before, but I've been seeing weird errors for a while now.
@@AvendesoraThat part has always been AI what are you taking about
@@letcreate123 Sorry. A modern LLM. better?
I remember the days when the Blue and the Green teams used to team up and simply ignore the Red team. Also the days when Blue used to openly insult Red in their presentations, calling them an "imitator". The tables didn't just turn, they flipped upside down!
That Intel presentation reminds me of the old 80's "Mind of Minolta" TV Ads. They were amazing back then.
Seems like Intel might have a new vibe coming to it.
I'm thinking it might be due to real competition from elsewhere, and some folks leading the way.
6:13 Fortunately 6 is a perfect number
The sheer volume of cringe these companies are putting out is absolutely insane.
A reflection of many of their customers, they are trying to cash in on the whole fanboy/celebrity thing.
That intel thing is highly likely inspired by that car idiot who put up a dancing human trying to pass it off as an innovation in robotics.
@@Skobeloff... Cash in, as in people who like cringe?
The cringe doesn't matter. They're boomers. Only the raw untapped compute chips matter.
@@mysticalpotato86 lots of people like cringe, whether they are aware of it, or not.
The captions show annals correctly, but Gelsinger as gelnar lol
Sounds like a D&D character name!
"I am become Gelnar, destroyer of Ansys!"
That lian li Case looks insane. Actually got decent space for cables.
I'm only 1 minute in, what the hell is going on with Intel's presentation 😂
lol
With the number of supercuts of people repeatedly saying AI, I'm surprised the editor didn't make the keynote speakers sing Old MacDonald
Can they at least use AI to make those presentations less terrible?
That's the thing!
AI makes it WORSE!
I bet the AI is referencing some crappy presentation, and the generated presentation got referenced again by future AI, and so on
Imagine a tech industry without AI
Engineers would get SO SO SOOO MUCH more work done
@JeskidoYT You're saying that as if AI isn't a recent invention. People will always find ways not to get stuff done, and the best inventions in tech history have mostly been accidents (if everything had gone to plan for IBM, there would be no PC compatibles like we have today, only proprietary IBM PCs)
Nah. Good and honest presentations come from AMD.
NVIDIA's music was really brainwashing because I couldn't get it out of my head for weeks.
That's how Skynet will get us. Drive us mad with earworms.
And that was even before Udio arrived on the scene
Your editing team is hilarious. 😂
Since Intel famously follows BMW numbering, that means the 6-series must be a 2-seater cabriolet.
Intel: (every time they say) “A.l.”
Ali G: “Ayye!”
A'aight (The 't' is silent)
Because of that opening, I had to check that the video was not uploaded on April 1st. 😂
Or 1998
Old mcdonald had a server farm 1:46
Come on dude I thought of that first (in my mind)
Intel & Nvdia together could be a real power house. If they do it right..
The dancing dj fab worker made me cry
Old McDonald, ai, ai, ohhh
Old MacDonald's render farm,
ai, AIO
On this farm he had Jensen
ai, AIO
Thanks Steve!
Back to you Steve!
What a classic 😂😂😂😂
Pat forgot the leather jacket lol
a small mercy
He hasn't unlocked that trade with the villagers yet, Pat needs to farm more cows first.
He forgor 💀
He cannot afford it😂
@@TheDumbTake-xb6rr living down to your name
This video was really well done. Awesome!
14:39 SRAM in Polish means "I shit", and Intel is opening factories in Poland, so let's hope they handle any miscommunications there
NVidia seriously needs competition
If Biden gave 20 billion to Nvidia, just see what would happen. TSMC got only 6 billion and relocated all its best engineers from Taiwan to help out. TSMC can withstand 7.4 earthquakes. Take that, Intel.
Making graphics cards is really hard; it's why it's only Nvidia, AMD, and Intel.
Yes, this is why AMD is making the MI cards and is already baking TOPS into every CPU. And they already beat Intel by light-years. Before aiming at NVIDIA, Intel should try to beat AMD first, which they are not able to do right now. This is just a brainwashing show to make them look competitive in some way, but they are not.
Technically, both AMD and Intel already more than compete with Nvidia; the problem is people just won't see that and will by default keep looking only at Nvidia.
And that is because Nvidia is a marketing company, meaning they spend most of their budget on looking good and making people think of them as great.
As a result, AMD, Intel, Broadcom and others need to provide way better value in order to even get some okay market share,
and as a result they generally have lower market share, making that insanely better value harder to reach.
The main problem is essentially people being mentally weak, and so weak towards Nvidia's marketing.
If people would just decide to go with Nvidia's competition, those products would suddenly get a lot better, due to finally having okay market share and therefore software also being optimized for them (right now only Nvidia receives much optimization from third parties; if Intel and AMD also got that optimization, like game engines optimizing for them by default instead of only for Nvidia, then AMD and Intel cards would perform much better than they do now).
Hardware is one thing.
Drivers are one thing.
First party is one thing.
But third-party implementation and optimization gives insane effects, and it is reached through market share.
For example, if an Nvidia GPU is just designed poorly or has terrible drivers, games, game engines and most software will in general optimize around and fix those problems caused by bad Nvidia design or poor drivers.
If AMD or Intel has a driver issue, only in rare cases do third parties optimize to get around it; as a result, people think Intel and AMD are somewhat poor.
While in reality, as an example, Gaudi 3 currently beats the H100 by around 1.5 times while also being much cheaper and more efficient.
However, when things are optimized for Gaudi 3 as well, its potential is many times higher, since it just is many times faster in actual AI performance, but most software is made for Nvidia.
And Nvidia has the nasty habit of not adopting new technology, instead relying on old legacy technology, so their stuff looks good early on yet gets outdated rapidly.
As a result, big AI datacenter workloads are still designed for those legacy methods, for which Nvidia's hardware is also optimized. Meaning that Gaudi, since it is not optimized for those legacy (old) things and instead for the future, will right now only be around 1.5 times faster, while if AI gets more future-proofed, Gaudi 3 will be many times better than that 1.5 times.
Originally I thought they referred to the entire board with 8 of those modules when they named the AI performance, since in that case even 1 module would still be much faster than the H100 in technical performance when optimized for; but it turns out it referred to 1 such module, which can help you imagine how much more powerful than the H100 it is.
But it is a vicious cycle: Nvidia has most of the market share, so everything is optimized for Nvidia; Nvidia, however, uses very outdated methods, meaning everything is optimized for those outdated methods and hardware types; as a result, everything runs way worse than it actually could. And if it were actually optimized properly to be more efficient and fast, then Nvidia GPUs could no longer really run it, because they were built on legacy stuff and so barely support the new stuff, if at all. So, since Nvidia has the biggest market share, software keeps using those outdated methods despite them greatly increasing power usage and reducing performance.
@@enbe3188 AMD will never be as good as Nvidia. That's why Nvidia is slacking.
One would’ve hoped that AI would have replaced mundane tasks to allow people to focus more time on their hobbies, expressing themselves with art and music.
Unfortunately, the AI-obsessed C-suites of modern tech companies would love to see it be quite the opposite; they want to replace the artists, the people with passion and vision, with AI models that hallucinate amalgamated images using training data that they didn’t earn.
As someone who has been interested in both tech and art for a long time, the shift is both disturbing and disheartening.
Money and greed corrupts everything.
I'm not sure what gave you the impression that they would be so altruistic. It's always just been about selling the product. The reason for art and music being on the forefront is simply because it was the easiest. For better or worse it's coming to replace everything, it's just a matter of how hard is it to implement in each use case.
Let me tell you how the economy works,
and how your complaints about AI literally do not matter.
The cycle goes like this: new groundbreaking tech is created, that technology displaces a portion of the workforce while increasing productivity far more cheaply, new jobs are created that usually pay more than the previous jobs, and the displaced workers find new jobs.
I want to ask you: where do you think we would be if we had rejected advances in technology in the industrial or agricultural sector because of "muh job as a farmhand"? I'll tell you what would have happened: we would still be in a pre-industrial society with less food, fewer products, less money, dying of polio or some other BS at 30.
Implementing AI allows businesses to create things more cheaply, it allows artists to be more productive, and it allows people to create more niche products at a higher standard of quality that would otherwise not be economical.
In other words, before you start talking about anything, you should probably at least understand the bare minimum.
"Money and greed", you guys seriously think you understand anything. Let's be real here: the thing you're complaining about is capitalism, which just shows how narrow-minded you are and, frankly, that you're not exactly the brightest bulb.
Also, AI is overvalued like hell right now. Sure it's useful, but this is getting to .com levels of bubble, maybe even worse. The internet was the future; that doesn't mean that every .com was useful.
@@alexis1156 Brevity dude.
And redirecting our complaints to capitalism sounds like a lazy, all-encompassing umbrella. There are morals in business: when the internet arrived it created more jobs than it destroyed, across all skill levels. When AI was developed, it sapped skill from lower-level workers to benefit a small fraction of people. "That's where capitalism comes in"... keep peddling that irrelevant retort, because AI has helped significantly less than the media has led you to believe.
Go buy shares, it's the new trend right now.
Intel is on the cutting edge of AI generated Cringe.
Yeah the video was pretty cringe. But I’m happy they are taking the competition seriously. The more competition the better
“We are not an AI-focused outlet”…. You will be…..you will be.
Hi Steve! Just a quick one... I got the Evo RGB and the 140mm fans are bigger than the bracket; the 2 screws on the bottom and the 2 on the top of the 3 fans just screw into thin air :D I love how in the demo they're shown fitting on the support :))
10:57 PyTorch is not a good example here. PyTorch is a deep learning library; its counterparts are TensorFlow and many others. It does not interface with the GPU directly; instead it goes through CUDA or ROCm (the AMD equivalent). Intel is trying to push their own open-source "CUDA", oneAPI, which I just checked still does not have an officially released PyTorch build using it. Nvidia has more than 10 years of development and experience in CUDA and GPU-accelerated computing (not limited to deep learning). Both Intel and AMD will have a very rough uphill battle to fight.
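For context, here's a minimal sketch of that layering from the framework side. The "cuda" path is standard PyTorch; the Intel "xpu" branch is an assumption based on Intel's separate PyTorch extension / oneAPI stack, not an officially upstreamed backend, and is included only to illustrate the point above:

```python
# Minimal sketch: model code talks to PyTorch, and PyTorch routes to a vendor
# backend underneath. On NVIDIA the "cuda" device goes through CUDA; AMD's
# ROCm builds of PyTorch reuse the same "cuda" device string on top of HIP.
# The "xpu" branch below is a hypothetical Intel path (oneAPI /
# intel_extension_for_pytorch) and may not exist in a given install.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():            # CUDA on NVIDIA, ROCm on AMD builds
        return torch.device("cuda")
    xpu = getattr(torch, "xpu", None)        # optional/assumed Intel backend
    if xpu is not None and xpu.is_available():
        return torch.device("xpu")
    return torch.device("cpu")

device = pick_device()
model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)
print(model(x).shape, "on", device)
```

The point is that model code never touches CUDA or ROCm directly, which is exactly why whoever owns the backend underneath the framework holds the moat.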
Nvidia's dominance in AI is not even so much CUDA as the overall relentlessness of their support and polish for everything, writing tons of custom optimized kernels for libraries, etc. The "CUDA moat" in this case is more of an umbrella term for their entire ecosystem, because many of these advantages aren't technically CUDA-related. Nvidia spent more than a decade of effort on this while these other companies did not.
IMHO, the ML frameworks are where Nvidia has a huge leg up, because they were (and have been) there for a long, long time helping build them. But I'll be totally honest: if Intel makes realistically good TF/PyTorch support a reality, they could cause a huge upset by simply saying "look, we have this nice card like the A750 with 64GB, and it doesn't cost an arm and a leg because VRAM is cheap and Nvidia are bloodsuckers". Make them ~$600 and I'll be running out the door to buy a couple (couples) :P
Much like the prisoner's dilemma, the one that defects first on VRAM size/cost will totally wipe out the rest for a good while, because performance isn't really a concern if you can't even fit the model in memory to start with.
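Rough numbers on that point (weights only, decimal GB; the KV cache and activations add more on top, so treat these as lower bounds):

```python
# Back-of-the-envelope sketch of why VRAM gates model choice: the weights of
# an N-billion-parameter model at a given precision must fit before raw
# TFLOPS matter at all.
def weight_memory_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1e9  # GB, decimal units

for params in (7, 13, 70):
    print(f"{params}B model: "
          f"FP16 ~ {weight_memory_gb(params, 2):.1f} GB, "
          f"8-bit ~ {weight_memory_gb(params, 1):.1f} GB, "
          f"4-bit ~ {weight_memory_gb(params, 0.5):.1f} GB")

# e.g. a 13B model in FP16 needs ~26 GB just for weights, so a 24 GB 4090
# can't hold it, while a hypothetical 64 GB card would, regardless of speed.
```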
So how about it, brother, will Intel manage all of this? I want to set up AI at my home. Come on AI, massage my hands and feet. Come on AI, come on AI, make some tea with ginger in it... those are the kinds of orders I'm going to give.
This lowkey has the same energy as the Gavin Belson Box3
hahahaha
Signature edition Box 3
I only watched SV a year ago, and it's a future documentary like Idiocracy. Judge is a genius.
lmfao :D :D
Gavin Belson: "I want the signature to be BIGGER"
Steve turning from Tech Jesus to Tech Santa.
This tech is already dead, it just doesn't know it
🎅🎅🐱
That Intel guy considered the last decade boring?! What the f**** were they doing? Oh yeah! NOT fixing the countless design flaws in their CPUs, instead expecting us to sacrifice the performance we paid for by rolling out (software) mitigations. Go to heck, Intel!
3:02 google actually was spot on accurate with the AI closed captions
After watching Fallout I've learned what we really need are Super Managers so that these ai projects can continue for 200 years. 200 years of song and dance.
There hasn't been a new Crowbcat video in a while, but this scratches some of that itch, thanks guys
You gotta embrace the bore. Lean into it.
07:32 I remember the "Switch" ad campaign, talking about openness of formats, compatible interfaces, easy-to-replace parts. The same *totally family-unfriendly word* company is now one of the toughest and most harshly defended monopolies in consumer devices and services.
If you're trying to gain position, you talk about openness. If you're leading, you lock things down with copyright and proprietary solutions.
Decades change; a-hole business practices don't.
TBH my biggest personal hesitance towards using any AI at all is that we constructively can't run and use it locally to ensure the safekeeping of confidential information. In law, that more or less means sending absolutely privileged information straight to some company, usually a big and powerful one. Just because I don't know how these companies will use the information and how it might adversely affect my specific client doesn't mean I can put them at unknown risk for the sake of possibly more efficient use of time.
But being able to run open-source code on local hardware could change that. If Intel can come out with AI add-in cards that don't cost the kidney required to buy a 4090 (if one somehow evades China's inhalation of them), then maybe I'll consider it. They could cut costs on discrete cards, kept separate from GPUs, by not having to fuss with all the other silicon devoted to things unhelpful for AI purposes. It might be nice to return to the days of having more than one add-in card.
You can; Stable Diffusion and LLMs can already run locally on your PC. It still requires mid-to-high-end hardware, but you can do it now, for free (see the sketch after this thread).
@@sadkurtable 10GB VRAM really isn't enough, and getting a new GPU means getting a new waterblock too
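As a rough illustration of what running an LLM locally looks like, here's a minimal sketch assuming the Hugging Face transformers and accelerate packages and a small open chat model (TinyLlama is just an example choice, not a recommendation); nothing leaves the machine after the weights are downloaded:

```python
# Minimal local-inference sketch: the prompt and the output stay on this
# machine. Swap model_id for whatever fits your VRAM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # illustrative small model

tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # half precision roughly halves VRAM use
    device_map="auto",           # GPU if available, otherwise CPU (needs accelerate)
)

prompt = "Summarize attorney-client privilege in one sentence."
inputs = tok(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tok.decode(output[0], skip_special_tokens=True))
```

Quantized 4-bit or 8-bit builds of larger models are how people squeeze them into 10-12 GB cards, which is exactly the constraint raised above.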
I'm convinced ai is safety word for cocaine
CocAIne
When the "AI" segment started I thought you were playing a clip from Ozzy's Crazy Train ("Eye Eye Eye")
A.I....bro just tranquilize me at this point🙄
"I am the sound effect" - Darth Jensen.
While the footnotes for the Xeon 6 2.4x and 2.7x claims said the comparison was vs. prior-generation platforms, on Intel's website they were comparing those Xeon 6 parts to 2nd Gen Xeon, which used the Skylake architecture, so not exactly the prior generation. Unless their thinking is that because Sierra Forest is based on E-cores, which are roughly Skylake-class, the comparison is somewhat valid?
They also had a slide comparing Xeon 6 Granite Rapids (P-cores only) with... 4th Gen Xeon, which uses the same core architecture as Alder Lake, so again not exactly the previous gen (that would be 5th Gen). But I guess their wording is enough to avoid false-marketing problems, since they didn't say "previous generation" but "prior generation platforms". To be precise, their wording is "Based on architectural projections as of Feb. 14, 2023 vs. prior generation platforms. Your result may vary." They do write which CPUs the comparison was against, but it's in their written article and not on the slide itself. So basically a bit of snake oil, just enough to make everything look shinier than it should.
Can’t wait for battlemage! It’s so cool to see a third competitor!
third? Oh you actually think AMD is competitive ,, LMAO
@@tilapiadave3234 i am more inclined to buy amd cpu and gpu lol. nvidia can suck my dingdong
They are pretty competitive on price though. You're better off buying an AMD card than a 40-series card that isn't a 4090, unless you're willing to toss additional dollars at the Nvidia tax for less VRAM and a smaller performance increase over the previous generation, in which case sure, do what you do.
How is Intel a competitor in the GPU space? Currently, at best, they make AMD and Nvidia's GPUs look better, while having neither better software nor better hardware price-to-performance than AMD.
At launch, the A770 had the same performance per dollar as the 1650 SUPER.
@@Eins3467 You are much better off buying an Nvidia card... superior product in every way.
I'm sorry, but I totally lose it and start laughing uncontrollably when Steve goes "WTF" at the 1:37 mark XD
I like these videos outside of the gaming focus; thanks for everything you do, guys.
Checking the Gaudi 3 AI white paper, the 1.5x speedup *on average* is mostly down to the larger memory per card compared to the H100 (128GB vs 80GB)... There are some improvements in interconnect speed (900GB/s vs 1200GB/s on Intel) and memory bandwidth (3.35TB/s vs 12.8TB/s), but Gaudi 3 is essentially a slower card: 1835 vs 1979 TFLOPS on BF16 and 1835 vs 3958 TFLOPS on FP8.
As much as I think monopolies are generally bad for most people, I HATE how Nvidia runs itself.
“Please pump our stock.”
big Ballmer energy coming from intel
Nowhere near that amount of sweat stains
DEVELOPERS DEVELOPERS
But now with AI
not enough jumps and wohoo
lol, no, they are not that bad, yet
we NEED intel for high end gpus but not sure if thats happening anytime soon
Their battlemage is coming in the fall and that’s supposed to go band for band with nvidia
@@TEENYcharma Not at the high end. Battlemage is topping out at around 4070 Ti level.
Something to compete at the 70 Ti level should do the job, maybe the 80 series, but that's too optimistic; most people buy GPUs under that mark for personal use.
I feel like this conference from Intel was a true Steve Ballmer spec show, and I am excited to see more
"I guess there is the 9"... Don't joke like that. I almost died.
Thanks Steve, back to you Steve
INTC 500$ ?
But Gaudi is made by TSMC, and TSMC's capacity is booked by Nvidia and Apple. When are Intel's fabs going to be able to compete, even for Intel's own business?
Hey, just letting you know, I really appreciate the AI hardware/software coverage. I'm a gamer and an AI software engineer developing a Retrieval-Augmented Generation chatbot at my company. It's nice to hear Gamers Nexus cover both sides of the things I love to do.
lol unfortunately, these chips are so damn expensive, even my company can't afford to buy them for me :')
Not that they're being cheap. They'd be 100% willing to buy them for me if they could get the ROI from it to justify the purchase for our mid size company. But the hardware is too expensive to get proper ROI. Hoping for some stuff to "trickle down" to my company, and then eventually trickle down to prosumer.
The Intros to GN's vids are just getting better and better 🤣🤣🤣
Nvidia is seeing what Cisco experienced in the early 2000s where a bunch of manufacturers jumped onboard the networking equipment bandwagon. I prefer setting up Cisco at work but at home I prefer Netgear. There's a brand for everyone else in between.
Thoughts on Ubiquiti?
Love when you cover this stuff tbh I just find it all very interesting
Why does the part at 18:51 where "codename" is written in parentheses shift 1 pixel to the left?
1:38 this is the constant state of my brain in the last 5 years
Love how animated Pat is
Intel are pretty good at selling refreshes and stand-up comedy without a doubt
Absolutely. This show is just stand-up comedy and self-caricature. I don't know what they are smoking at Intel, but if people react to the show the same way they do to presidential elections, Intel has a bright future ahead. It's not always the best who wins, but the one who makes the most noise. They are quite a bit behind AMD, and aiming for NVIDIA is self-destruction. Yet in their presentation they show themselves as if they are already king of the AI world. This cringe-fest just makes Gelsinger look like a clown. He might be a smart man, but he should not try to present himself as the "edge-overlord". This space is already overcrowded with ego from Jensen Huang.
I work/worked on the LLM in the RA3 fab that was shown off. Although it may have been "staged" in the sense that we knew what we were going to ask it before the camera turned on, it does work in the fab to get content in real time without the MET searching the four or five different data sources they would normally need to review for an RFC. It's freakin' epic. I wish I had it when I was an L3E in the factory.
God, the Intel presenter is so nerdy and awkward I'm having flashbacks to Bill Gates presenting stuff in the 90s.
I think that's part of the allure though. Nerds should be at intel. If it turned into RedBull Bro culture we'd probably stop upgrading often... or we'd all become redbull bros
It's on purpose. Intel specifically wants to present as boring and safe. We have to remember who Intel's core market is. It's not us, and it's not Wall Street either. It's Fortune 500 companies run by boards of directors over 60. The socially awkward nerd is safe; he'll do what he says to the best of his ability. The guy in the leather jacket and sunglasses is trying to put his hand up your granddaughter's skirt.
@@didamnesia3575 That would be what Activision Blizzard turned into and we all know what their reputation has become.
or Lisa Su, Jensen Huang, and that other nobhead AMD dork in the '3' sport shirt
@@jtland4842 On the Blizzard note these "people" are an intersection of frat boys and nerds (and rap**ts). An aberration of nature. Blizzard North is long gone, for decades, and Activision wears its corpse. Just look up a photo of the makers of Diablo I together...
If we look at the Intel image at 5:10 it will easily show their novel approach, where memory is surrounded by processing cores using two separate dies. An impressive achievement for a real first-time competitive offering.
Vader: "I am your father"
Jensen: "I am your sound effect"
AMD REALLY did hire the right CEO in Lisa Su.
"I guess there's the 9" lol
Google translate 😂😂😂😂😂😂😂😂 I don't think this AI crap is going to go well. There was a limit to development in this direction that we should not have crossed, but it is already too late. Technology went exactly in this direction, which partly scares me. In the end, if humanity isn't wiped out by an asteroid or something like that, it will at least be wiped out by something we created ourselves. 😅
Which is the most embarrassing way if you think about it ;)
That presentation Intel held made me completely want to switch to AMD CPU/GPU with my next computer.
The slide on Intel Gaudi 3 talks about inference and running models, so it is not about training AI. The inference phase is what ChatGPT does to answer when you write some text (i.e., after training).
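To make that training/inference split concrete, here's a tiny PyTorch sketch (purely illustrative names, not Intel's or anyone's actual code); the Gaudi 3 slide targets the second half:

```python
# Training vs. inference in miniature.
import torch

model = torch.nn.Linear(16, 2)

# Training phase: gradients flow and the weights get updated.
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(4, 16), torch.randint(0, 2, (4,))
loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
opt.step()

# Inference phase: no gradients, just a forward pass to produce an answer.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 16)).argmax(dim=-1)
print(prediction)
```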
AFAIK, both Nvidia and Intel are behind the Groq LPU, both in terms of latency (time to reply in chat) and energy efficiency (and Groq is still on 14nm)... There is a huge architectural difference: the Groq chip behaves deterministically (no varying instruction latency from caches or out-of-order execution... a dream for developers) and embeds a network switch, which as I understand it provides deterministic behavior at distributed scale as well.