Consumers: "Man, I hope the 50 series is good" Jensen: "Let's see how many times I can say AI in my rambling two hour speech no one wants to hear" Next, they'll combine Crypto and AI. They'll call it CrAI, because that's what most consumers will be doing as AI continues to be shoved down our throats.
Cost is about $70,000 for that CPU + 2 GPU board that plugs into the blade and you need 2 per blade. You are looking at about $3 million per server rack. Do you see why NVIDIA's stock is so high?
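The rough math in the comment above can be sketched out. The per-board price is the commenter's figure, not official Nvidia pricing, and the blades-per-rack count is an assumption based on NVL72-style racks:

```python
# Back-of-the-envelope rack cost using the commenter's figures
# (not official Nvidia pricing): ~$70,000 per CPU + 2-GPU board,
# 2 boards per blade, and an assumed 18 compute blades per rack.
board_cost = 70_000       # USD per Grace + 2x Blackwell board (commenter's figure)
boards_per_blade = 2
blades_per_rack = 18      # NVL72-style rack layout (assumed)

compute_cost = board_cost * boards_per_blade * blades_per_rack
print(f"Compute boards alone: ${compute_cost:,}")
# Networking, NVLink switch trays, cooling, and the chassis push the
# total toward the ~$3M per-rack figure quoted above.
```

That lands at about $2.5M in compute boards alone, which makes the ~$3M all-in figure plausible.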
Technically it all benefits gamers... just not always intentionally. Gamer divisions alone aren't enough for any of the big three to turn a profit at scale. Everything we get from the big three are byproducts of development for their professional products. Even AI has benefited gamers in the form of DLSS and frame generation. Think of anything you like about your favorite video card and I can pretty much guarantee you at some point it originated with an engineer looking at something for a professional product and going "Huh, would this work for gaming?"
Remember when Microsoft said the 360 was going to do something similar to the beginning demo with Project Milo, when the Kinect came out? Something that took an insane Nvidia super GPU... they apparently had for the 360. No wonder Project Milo never actually became a product
1:45 almost the most expensive mistake linus would ever make
They don't call him Linus Drop Tips for nothing
Since they are bare dies, I'm pretty sure there's nothing there to break.
the way that Yvonne rushed in lol
In hospital, probably
Lol wife to the rescue 😂
Oh, man! Do I have to buy a new case again?
Maybe a new house my guy.
no you don't
@@dzello damn, it wont fit
No, a new house
Nah bro for new gpus you need to buy a new country + planet to barely fit this thing 💀💀💀
So it can heat up the planet while simulating itself heating the planet?
My mind is blown.
The amount of power these use is less than what was used just a couple of years ago, and the next generation will use less power too, so you'll eventually have the compute power required for ChatGPT on your desktop.
@@pcmason great explanation ngl
Genius
@@pcmason you already can have it with GPT4ALL. It can run on your cpu or use vulkan, opencl, cuda or the M series of the macbooks and it is pretty fast if you have a gpu with 6 or more gb of ram
@@pcmason While you're definitely not wrong, we still have the problem that the use cases for these machines keep getting better with better performance, leading to massive growth in usage and thus an increase in total energy consumption. The efficiency increases we see these days are far smaller than the overall increases in usage. Data center energy consumption is projected to double by 2030. The current trend toward "daily" AI usage, with more compute-intensive applications becoming simpler to run, is heavily counteracting any efficiency improvements we've made. Using ChatGPT instead of Google, for example, uses (up to) 25 times the energy, no matter how efficient the servers are.
So tldr, you are both not wrong, but also not right.
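The efficiency-vs-usage tension in the thread above is just arithmetic. A tiny sketch with illustrative numbers; only the "up to 25x per query" figure comes from the comment, everything else is assumed:

```python
# Efficiency gains can be swamped by usage shifts (Jevons-paradox arithmetic).
# Only the ~25x energy-per-query figure is from the comment above; the
# per-search Wh, query volume, and efficiency gain are illustrative assumptions.
search_wh = 0.3               # rough Wh per traditional web search (assumed)
chatgpt_wh = search_wh * 25   # "up to 25x" per the comment

queries = 1_000_000           # queries per day (assumed volume)

energy_before = queries * search_wh / 1000    # kWh/day, all web searches
energy_after = queries * chatgpt_wh / 1000    # kWh/day, all chatbot queries

# Even a 5x server-efficiency improvement doesn't close a 25x per-query gap:
energy_after_efficient = energy_after / 5
print(f"{energy_before:.0f} kWh/day -> {energy_after_efficient:.0f} kWh/day")
```

Under these assumptions the total still grows 5x despite the hardware getting 5x more efficient, which is the thread's point in a nutshell.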
1:45
It's hilarious how the moment he says "oh no" everyone rushes their shit to support him
IT WOULD HAVE BEEN A MASTERCLASS if the cameraman had rushed too dropping his camera.
‘Sorry Linus, you pay me to film you and not to catch stuff … got a solid clip tho bro’
I can't remember ever being so underwhelmed by a demo
Nvidia should embrace the memes further and just announce a case designed around a GPU.
Isn't that technically what their SFF certification is?
@@Dajova I mean, I would want a case with actual cooling: a radiator case that's just a GPU, so I could hook my water loop into the case. Maybe then we'd have a chance to watercool the RTX 50 cards
Sure it's fast but can it run elden ring?
@@bluflame5381 This. I dream about a powerful PC, yet I'd probably still stream UA-cam half the time and play Elden Ring/Battlefield and Cyberpunk the other half... which runs fine on my 4060.
External GPU
Little did we know the GPU is normal size, but Linus being so short makes it look huge
Lmaoo
1:45 that Nvidia employee forgot she was working with linus. You got to stay on your toes.... And just a few seconds later you realize it's Yvonne, the GOAT of guiding Linus. Here's to Yvonne, saving the company one small catch at a time.
Isnt that his wife
@@mpresto15 It is.
@@ACHonezGaming She* is
@@gonthyalavishalchandan1193 "is *that* John?"
"yes, *it* is"
That's a common type of exchange, most people don't wanna be referred to with *it* or *that,* but it's common to use those pronouns in this specific case, I'm sure no disrespect was meant
Love how she rushed in like she was thinking "WE CANT AFFORD THIS DROP LINUS!!" 😂
1:46
Where would Linus be without Yvonne to support him?
Linus Tech Tips brought to you by NVIDIA.
probably in debt
Linus drop tips
I mean, he's said many times that LMG wouldn't exist without her. That was just the latest example of why lol.
She looked extremely ready to react and I'm guessing that wasn't staged :)
1:43 concerned wife runs to save her house LOL
in 5 years we gonna have entire buildings for just our GPU, just like the 60s!
i wonder how powerful it'd be if Nvidia could actually build a single GPU out of like 1000 Blackwell superchips that would fill up multiple rooms. And i wonder what we'd be able to do with such a thing...
I mean, almost? These things are meant for datacenters, so you'd have dozens or hundreds of racks like these, all connected with extremely fast interconnects. That's almost like having a GPU the size of a building.
YES! The "Nvidia Univac"
@@SimplCup You also had to build a nuclear power plant next door, just to power that thing!
IKR. I was noticing that the boards were the size of VLSI PDP-11 or VAXen boards...
All hail the return of the vector-based supercomputer.
Imagine the insurance company seeing Linus carrying your equipment.
It feels like there's linus clause for tech companies.
@@grze149 so,Top Gear/Grand Tour Hammond similarity? :)
I bet he has to take out a policy every time he does these kinda projects.
AI stocks will dominate 2024. Why I prefer NVIDIA is that they are better placed to maintain long term growth potential, and provide a platform for other AI companies. I know someone who has made more than 200% from NVIDIA. I'll also take any other recommendations you make.
I believe the next major breakthrough will be in A.I. For sustained growth similar to META, it's crucial to avoid making impulsive decisions based on short-term market fluctuations. Instead, prioritize patience and maintain a long-term perspective. Consider seeking financial advisory services to make informed buying and selling decisions.
In a similar situation, I sought advice from a financial advisor. By restructuring my portfolio and diversifying with good ETFs, the S&P 500, and growth stocks, I was able to grow my portfolio from $200,000 to over $800,000 in just a few years.
This is definitely considerable! Think you could suggest any professionals/advisors I can get on the phone with? I'm in dire need of proper portfolio allocation
Rebecca Nassar Dunne is the licensed fiduciary I use. Just research the name. You'd find the necessary details to correspond with her and set up an appointment.
I just curiously searched her up, and I have sent her an email. I hope she gets back to me soon. Thank you
Watching Jake & Yvonne on crowd control at the start made me spit out my lunch haha
They did great! - LS
Now what is this going to mean for the 50 series..boooom
at first i thought, what's jake doing standing over there? now i know, haha
so thats what she was doing there. I kind of wondered why she was just standing there.
Great content well made
72 Petaflops? hah, losers, i'm already at 2 Flipflops
underrated comment !
Next is Exaflop followed by Zettaflop FYI, if you care to know. The Russian military is already a Z-Flop.
its kinda funny cuz a flip-flop is also a computer thing. but 2 flip flops is like 2 bits of storage so not very impressive
How many petaflops in a flip flop?
XD
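On the flip-flop joke in this thread: a flip-flop really is the basic 1-bit storage element in digital logic, so two of them hold exactly two bits. A minimal sketch of a D flip-flop as the thread describes it:

```python
# A D flip-flop latches its data input on each clock tick, storing one bit.
class DFlipFlop:
    def __init__(self):
        self.q = 0          # stored bit, the Q output

    def clock(self, d):
        self.q = d & 1      # latch the data input on the clock edge
        return self.q

# Two flip-flops = two bits = a whopping four representable states,
# which is indeed "not very impressive" next to 72 petaflops.
ff0, ff1 = DFlipFlop(), DFlipFlop()
ff0.clock(1)
ff1.clock(0)
state = (ff1.q << 1) | ff0.q
print(state)  # 1
```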
"That's not a graphics card, it's a space station!" Obi-Wan Kenobi
1:45 Linus, please, not again.
They were ready to hold it
@@xugro I think you mean catch it, which they did. Because again, Linus is Linus.
Catcher
wait, what happened the first time?! Lmao
@@Aonoexorcist100 he dropped a 3090 during it's review... C'mon it's not been that long.
1:20 - All I heard is Nvidia will never give us SLI back.
Key words US
We got DX12 and Vulkan......
That either no one bothers trying to use, or no dev bothers doing the "less work than back then" to enable.....
Gimme back SLI / Crossfire dammit!
Well hopefully amd is cooking up something
It sucked and nobody used it anyway...
@@littlefrank90 TF is "nobody?"
Speak for yourself 🤣
2:00 Somehow this is my first time hearing Yvonne speaking Chinese 🤣
ikr? isn't she korean?
@@skylinrg Singaporean iirc but hearing that I doubt she had a lot of practice.
That's no Graphics Processing Unit, that's a Giant Processing Unit!
so it processes giants... love it.
Nah; that's an AI produced monster that will surely lead to our downfall within a few integrations in the near future..
@@51Archives And that is what we want, our downfall; after all, we all have to die eventually. If we die as the last humans in history, then we are special.
The G’s for “Grill”, right?
No all you have to do to survive is have Linus drop it on purpose or take off the cooling and turn it on
Time to sketchily wire together 4 2000w power supplies
duct tape them together
you mean 50?
I believe in Alex
@ZaHandle Just use those industrial red power plugs. You'll need at least one of those for the rig, if not more. That puppy can supply over 300,000 watts. PSU? Who needs that crap, just plug it straight into a nuclear reactor. The rack likes raw unfiltered energy kappa.
@ZaHandle Almost the entire world uses 240v power.
1:45 LTT entire wealth just flashed before Yvonne eyes
computers in the past: it's the size of a whole room. computers today: also the size of a room lol
And that will not change anytime soon I think.
I mean,... you would have to probably cover the whole world with 'super' computers of old (Zuse anyone?) and wouldn't have a faction of calculation power of one of these bad boys
Computers gets compact releasing the space that could be filled with even more computers
@@GroovingDrums You mean fraction, don't you? 0o
That one rack could probably simulate every mainframe that ever existed on a molecular level in real time.
Can't wait to buy a GPU in 2050 that's equivalent to that whole rack, and only use it to play Minecraft.
And will only hit 20fps in Minecraft....
That'll likely already be around 2035
i bet we could still crash it using tnt or redstone.
🤣🤣🤣🤣
Lol. 2050? Don't think the world is gonna be in a good enough shape for that my friend...
I designed something very similar to the spline about 12 years ago, except mine also included quick connect cooling lines to connect a hot and cool side with teg between for slight power efficiency increases. It had 2 iterations, one for servers (with interconnects for multiple racks) and another for a modular/scalable pc concept.
1:45 multiple NVIDIA employees' hearts skipped a beat just there.
will it fit in my mini-ITX case?
yes it sure will
If not just return it, someone will appreciate an open box deal.
LMAOOO 😂😂😂
If you use the chip as a side panel maybe
yes you just need apple's hydraulic press
4:26 “Reticulating Splines” that takes me back a bit 🤣
1:45 The Ghost of Michael Jackson passed through Linus there
The GPU needed for GTA 6:
Minimum requirement for GTA6 720p 30 fps
@@hannes0000 seriously, that's all I'm wishing for, to play GTA 6 on a GTX 1080
With how optimization has been going the past couple of years. I think you're going to be lucky to get a solid 60fps Using a 5090 at 4k.
Just get a PS5 or Xbox Series S/X console, 'cause it won't be out for PC day one, but for console it will
@@hokahn6313you def will buy it
When I started investing last year, I avoided significant mistakes. I've focused on investing modest sums in stable businesses for the long term. If stocks perform well, I hold onto them; otherwise, I reinvest losses into profits. Recently, I made $9.5k from a $4k investment in NVIDIA.
well as you know bigger risk, bigger results, but such impeccable high-value trades are often carried out by pros.
I wholeheartedly concur; I'm 60 years old, just retired, and have about $1,250,000 in non-retirement assets. Compared to the whole value of my portfolio over the last three years, I have no debt and very little money in retirement accounts. To be completely honest, the information provided by investment advisors shouldn't be ignored or neglected. Simply do the research to choose a trustworthy one.
Her name is 'JULIANNE IWERSEN NIEMANN'. Just research the name. You'd find the necessary details to correspond with her and set up an appointment.
I looked up her name online and found her page. I emailed and made an appointment to talk with her. Thanks for the tip
I didn't know what Jensen was cooking, but it might be all of us with all the power consumption and the carbon footprint
The cost of power per compute has been greatly improved.
I really hope it's renewable
@@Safetytrousers Sure, and if all you do is play games with the exact same framerate and graphics settings as before then you'll use less power. But people won't do that.
@@Safetytrousers that's gonna create more demand and gap in the AI training market so there's that too
Our AI overlords will apparently win through creation of ecological collapse.
11:30
that segue was so fast i didnt even notice that it was a sponsor
great video as always!
I am happy to be a part of this project in making this tech. All those years of buying more than 25 different units across many generations of Nvidia GPUs for gaming allowed me to be part of this project. I contributed .000000001 percent of the revenue Nvidia used to R&D this technology!
Me seeing Linus and GPU in the preview: So it's a normal sized GPU?
I was searching for this comment
Nvidia enters the SFF market
Finally a GPU that could run Cities Skylines 2
But can it run Crysis?
@@markflakezCGbut can it run Minecraft alpha 1.5.2?
Finally, something that can run Microsoft Flight Simulator in VR
Okay, Linus. Whole house GPU when. The only logical step after whole house watercooling
Can't wait for LTT to benchmark it in 10years time.
i love how linus' hand goes behind the "GPU" at around 0:11 😆
1:46 she was fast with that because she knew no one in that room could’ve afford that😂
11:34 is what I believe is the smoothest segue transition so far, did not expect to get baited that hard.
*Segway*
Two-wheeled, self-balancing personal vehicle.
A Segway is a two-wheeled, self-balancing personal transporter device invented by Dean Kamen. It is a registered trademark of Segway Inc. It was brought to market in 2001 as the Segway HT, and then subsequently as the Segway PT. HT is an initialism for "human transporter" and PT for "personal transporter."
A lesser known fact about Segway is that the company's previous owner died when he fell off a cliff while riding a Segway, kind of ironic. Those things aren't safe, let me tell you. Also, the word you're looking for is 'segue'.
Moran
@@TriggerHippie🫡
It almost sounds like an audio glitch, it was so well timed. I kept scrolling the comments till I found someone who agreed, lol
1:45 Heart Rate Go BRRR! The most impressive/unbelievable part of this video is that NVIDIA allowed him to pickup/carry this tech
@6:25 I can't wait until those heatsinks start popping up on ebay in several years.
But effectively useless unless you have data centre level airflow generators
Why did that heatsink in the beginning look so real, I didn't even notice it until it was gone
because it didn't. You're either viewing this on a Nokia phone from 2004, or half blind. His hand would even disappear every time he waved it in front of it
5:26 was the man behind Linus the Supermicro CEO?
Yvonne literally teleports at 1:45 🤣
To Nvidia: This admittedly funny marketing stunt won't make me forget that your other GPU's are still overpriced. To Linus: BUY IT AND RUN THINGS ON IT.
He doesn't give a shit about you or any home user. He's got almost a laser focus on the datacenter. Sales of gaming GPUs are just the cherry on top of the cake for him.
@@arnox4554 I mean, ngl, datacenters have more purpose than games.
You can’t be serious with a comment like this.
4:17 Jensen specifically called it the spine in the keynote. Did the press kit call it a spline?
*3 million dollar machine exists*
Linus: *Goes near machine*
Me: 😬😬😬😬😬
Jensen: 😅
Linus: Look at all this amazing stuff.
Comments: Yea, yea, you guys did you see that he almost dropped it!
The amount of power data centers are already consuming is absolutely egregious, and this is going to almost double current usage rates. RIP to the glaciers and Florida💀
my room is 11 degrees Celsius right now, so if only they could somehow route the excess heat to my home, the problem would be solved
Apparently many large scale AI data centers in the US are capped as they can't draw any more power without causing grid instability.
1:45 I think that was Linus most expensive "OH NO"😂
That last presentation is literally what one guy made called Neuro Sama. And that ai has more personality than a Bethesda character
are u certain they have the same functions? i'm just checking to make sure.
@@Viewer13128 Neuro will make jokes about what's happening in game; she insults people's gaming setups within moments. She's a personal buddy to the creator, and he's just one guy. Before ChatGPT was even a thing
@@novadestry i was mainly wondering if the ai in this presentation are using more advanced stuff or not. for example the clip where the ai can react to moving game footage. i'm not sure what else the ai does so i was wondering if neuro is indeed doing everything it can do and more.
i was just wondering how an entire company with lots of budget still can't make something equal/better than 1 person. that would be quite shocking and they really need to go back to school.
love how fast the nvidia rep reacted, almost like she was expecting it. 1:39
That's Linus's wife, but yes, she definitely expected that and reacted FAST.
Linus listing the specs around the 5:15 mark reminds me of the turbo encabulator meme
Makes you wonder why Nvidia has not already released a consumer ARM CPU if they are so good.
Simple, MS Winblows used to run like ass on ARM and Nvidia is too dumb for everything else (and burned all goodwill so no one else will do it for 'em).
they make ARM chips because they don't have a x86 licence so they don't have much choice
I mean they have the Shield as a standalone product, along with their Jetson series to some extent. As for selling bare CPUs, there is no standard platform yet on the market to my knowledge. So you end up with proprietary junk on Apple side for consumer Arm, and more proprietary junk on the datacenter space from Nvidia. AMD and Intel are in no hurry to move away from x86 yet. Nvidia and Apple both love money so I don't see how Arm PCs are gonna happen anytime soon. Kind of a shame.
They are going to, in a partnership with MediaTek; it's supposed to be a Snapdragon X Elite-like competitor. It's expected around the end of the year or in 2025 (don't quote me on these numbers, I don't remember it too well)
They have had ARM SoC's for well over a decade. For example the Nintendo Switch has a NVIDIA Tegra X1 from 2015
but can it run Crysis?
I feel like I just bought my 3090 last week… and it’s already two cycles old
Everyone and their mom knows the 50-series is launching this year.
If you spent hundreds of dollars on an upgrade without researching that, it’s on you.
@@trapical My mom doesnt know that
@@trapical I didn't know you were able to read people's minds through your monitor.
@@trapical Comprehension isn’t your strong suit.
You literally took my words and made them mean what you wanted them to so you could argue.
Try reading again… slower… and see if you come out with the same result.
L
lmao i bought my 4090 last week as well. its so sad...
NVIDIA networking makes me remember when NVIDIA firewall was installed by default with one of my GPUs back in the day that would block absolutely all traffic by default 😂
That's what a firewall is supposed to do: block everything, and from there allow only what you actually need.
@@DJDocsVideos most people have no idea what they need to allow and would click yes on any falling-green-text app that requested it
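The default-deny principle this thread describes can be illustrated with a toy packet filter. The rules here are hypothetical examples, not the old nForce firewall's actual ruleset:

```python
# Toy default-deny packet filter: everything is blocked unless a rule
# explicitly allows it -- the posture the thread above describes.
# The allowlist entries are illustrative, not any real product's defaults.
ALLOW = {("tcp", 443), ("tcp", 80), ("udp", 53)}  # https, http, dns

def permit(proto: str, port: int) -> bool:
    # No matching allow rule means the traffic is dropped.
    return (proto, port) in ALLOW

print(permit("tcp", 443))    # True  -- explicitly allowed
print(permit("tcp", 25565))  # False -- default deny wins
```

With an empty allowlist, this filter blocks absolutely all traffic, which is exactly the out-of-the-box behavior the original comment ran into.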
5:11 Techquickie Linus showed up for a second.
4:13 felt like he was gonna segue to another sponsor, "I'm gonna have to tell you… about our sponsor"
Nono he would say "I'm gonna have to tell you.. about this segue to our sponsor!"
What if I don't want my pc to send a screen shot to Microsoft or nvidia every 10 seconds
He said with the last demo that it supposedly runs on device (no data is sent anywhere)
then you pick Linux and a non-AI bs GPU
Have you tried turning off your internet
@@frankhuurman3955 or better yet, pick nothingness. Maybe now you're not forced to use AI, but sooner or later you will be. At this point someone should amass a "no tech" campaign, since pretty much all tech nowadays is the same: making you rely on it.
0:05 Jake looking like Linus's bodyguard moment lol
1:02 Premiere Pro moment XD (if someone sees this, a re-export might fix it, because this happened to dankpods before)
GPU = Generative Processing Unit?
Gougable pricing utility.
This makes much more sense than it should.
Rip GPGPU.
I actually work for a hyperscale ODM in QC and testing and we're renovating one of our rooms at the moment for building scale liquid cooling. Am excite.
Can't wait for future generations to see this video and be like "Oh yeah, 8 TB/s. That's roughly the transfer speed of those cheaper USB flash drives... What slow memory those guys had."
0:02 Yvonne rockin' backpack prototype
0:35 Linus holding it like that made me so nervous given his history of dropping things
Almost as big as drakes “gpu”
too bad he prefers mini itx cases
Oh my
@@alleymeow 😂😂😂😂😂😂😂 this got me dying
@@alleymeow nearly spit my drink out
Lol@@alleymeow
Can’t wait until 3035 so we can have this in a regular GPU sized 6:09
Having it go straight from the GPU, rather than being cloud based, is the real game changer.
Programming the AI personality reminded me of how the holodeck is programmed in Star Trek.
With Linus' almost completely absent upper body strength, the fact that they still let him pick stuff up at all is amazing to me.
TLDR: Your AI waifu is about to look at you in a much higher resolution than ever and be even more realistic.
0:27 that hose clamp moved when Linus hit it
Yvonne with the save. Stop letting this maniac handle expensive shit!
🤣🤣🤣
Finally, a system that can run Cities Skylines 2.
I'm glad humanity was given Linus in this era and not 300 years ago
2:07 lololol the AI killed the vibe
They didn't show the responsiveness from the first test... I wonder how long it takes to reply... 🤔
About 3-6 seconds. Just didn't want to break the flow that early in the intro - LS
@@LinusTechTips all of this hardware and generative AI still sucks lmao
@@LinusTechTips Well. I guess we can wait...3-6 seconds is still fast to me. I guess there would be a time when things are going **too** fast.
0:30 Yvonne just slappin' around their multi-million-dollar demo 😂 i love it..
4:20 is that spline reticulated?
So that's where all the memory Nvidia didn't put on their consumer GPUs went
I can’t wait for this to become outdated, because one of those boards in a nice frame with some lighting on the inside, hung on your wall, would be some great wall art
8:51 Jake in the background appears on TV 😂
This entire video, including Linus, was AI generated on Nvidia GB200's! 😂
In fact the GB200 is just an AI hallucination and does not exist!
Consumers: "Man, I hope the 50 series is good"
Jensen: "Let's see how many times I can say AI in my rambling two hour speech no one wants to hear"
Next, they'll combine Crypto and AI. They'll call it CrAI, because that's what most consumers will be doing as AI continues to be shoved down our throats.
Had the feeling as soon as Linus picked it up that I was about to witness another classic "Linus drops everything and breaks the budget"
You know, it looks like a great deal, but the shipping is just too much for me. Guess I'll buy one next generation
That reaction time is uncanny xD
That’s what happens when you’re married to him.
4:50 realized Linus was in some type of luxury hotel or some fancy conference building
So weird to see Jake and Yvonne as NPCs
How many kidneys do i need to sell?
you only need to sell one of yours and 300 others from your friends
You need to sell more than half your organs; a kidney isn't enough for a single tube of that thing
Cost is about $70,000 for that CPU + 2 GPU board that plugs into the blade and you need 2 per blade. You are looking at about $3 million per server rack.
Do you see why NVIDIA's stock is so high?
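That per-rack figure roughly checks out. A minimal sketch of the math, where the board price and boards-per-blade come from the comment above, but the blades-per-rack count is my own assumption for illustration, not a figure from the video:

```python
# Back-of-the-envelope rack-cost estimate for a GB200-style rack.
# All figures are rough assumptions for illustration, not official pricing.
board_price = 70_000      # USD per CPU + 2-GPU board (figure from the comment)
boards_per_blade = 2      # per the comment
blades_per_rack = 18      # assumed number of compute blades per rack

compute_cost = board_price * boards_per_blade * blades_per_rack
print(f"Compute boards alone: ${compute_cost:,}")  # Compute boards alone: $2,520,000
# Add switches, networking, cooling, and the chassis, and you land near $3M per rack.
```

With those assumptions, the compute boards alone come to about $2.5M, so "about $3 million per server rack" is plausible once networking and cooling are included.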
Most of the population doesn't appreciate the absolutely insane engineering these things take. It's hard to even imagine, but it's wild.
11:25 - that is NOT a Triceratops...too many spikey bits on the crest.
9:02: The "-ish" does some really heavy lifting here.
Technically it all benefits gamers... just not always intentionally. Gamer divisions alone aren't enough for any of the big three to turn a profit at scale. Everything we get from the big three are byproducts of development for their professional products. Even AI has benefited gamers in the form of DLSS and frame generation.
Think of anything you like about your favorite video card and I can pretty much guarantee you at some point it originated with an engineer looking at something for a professional product and going "Huh, would this work for gaming?"
Remember when Microsoft said the 360 was going to do something similar to the beginning demo, with Project Milo, when the Kinect came out? Something that takes an insane NVIDIA super GPU... which they apparently had for the 360. No wonder Project Milo never actually became a product
I cannot imagine how you do it 😂 just laying around, goofing around and filming the shots for your video, unfazed by all the other people haha
I love the opening shot with Yvonne playing bouncer so no one walks into the shot. Yvonne's the enforcer lol
4:13 I think the guy behind you almost had a heart attack in that moment