Looking at the history of supercomputers, 2 of these GPUs would match the TFLOPS of the most powerful (public) supercomputer of 2005. By NASA public information standards, it's more like 2007-2008. An entire room's worth of computing with over 20 miles of cables is beaten by something that can fit inside a suitcase.
My 7900XTX was bought with the intention of not caring at all for a GPU for at least 10 years. Given the crappy AAA games it may very well outlast this period of time.
@@fampic7133 My 6800 XT handles 4K just fine, at an average of over 150 fps in any game at medium-high settings, wdym? And AI is possibly the worst thing for gaming, ever, lol. It causes so many issues: increased input delay, graphical artifacts, etc. I don't use modern features, because they're awful, lol. 16 GB VRAM is perfectly fine. Of course, if you won't be happy with anything less than 4K gaming at ultra settings while expecting like 360 fps... you're not gonna be happy with anything, because that doesn't exist. xD
Watercooling my card to make sure. Then again all these new games that "need" a new graphics card don't seem to look any better than the ones that don't 🤔
If they're doing 32 on the '90, they should do 20 on the 80 and 24 on the 80 Ti. The 70s should get 16, the 60s 12 GB, 8 on the 50s, and 6 on the 30s.
The 80 has half the cores of the 90. It's no 80; it's a 50. Nvidia is releasing the 90 to show what they can do while cutting everything else down to milk the consumer.
@@iris4547 The 1060 had a similar core ratio to the 1080, as did the 1660 Ti to the 2080, and the 2060 has just over a third of what the 3090 has. I'd say it's close to being a 60 Ti or 70 relative to the 5090. They should up the RAM amount too: if a 5080 Ti is going to happen, then raise the 80 to 20GB and the Ti to 24; otherwise the 80 should be 24.
So Nvidia has basically decided to blur the line between gaming GPU and workstation card to pretend Moore's law still exists, but really it's just old tech with higher specs, refreshed, renamed, and rebranded to satisfy and confuse consumers.
I mean, their stock price has been on a roller-coaster ride because of constant reports/murmurs that the big Blackwell chips keep hitting production snags, so, eh?
All I want is an Intel card that can outpunch my 3080 in traditional rendering (idc about AI upscaling or Rey Skywalker tracing) that only costs one goat and some bone marrow.
Our dystopian cyberpunk hyper-corporate future is already here: it just doesn’t look as cool as we thought it would. Where’s my motorcycle-riding ninja data-broker? Or my samurai sword wielding, hacker pizza delivery guy? 😅
I think your Ground News sponsorship segment was actually more informative and explained THE ACTUAL POINT OF THE SITE better than others who get sponsored do. Love to see it!
@iCore7Gaming The 4090 has 68% more cores and 40% more bandwidth while costing almost double a 4080, and only has 25% better performance. Yeah, the 5090 will have the same plateau limitation.
@@TheRealLink Try 4K path tracing. It also means you are leaving significant performance on the table; I get 10 to 15% delta gains in RT at 3GHz/450 watts.
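Taking the figures claimed in this thread at face value (they are the commenter's numbers, not verified benchmarks), the value complaint can be sketched as a quick performance-per-dollar check; `value_ratio` is a made-up helper name for illustration:

```python
def value_ratio(perf_gain: float, price_ratio: float) -> float:
    """Relative performance-per-dollar of the pricier card vs the cheaper one.

    perf_gain: fractional performance uplift (0.25 means +25%).
    price_ratio: price of the bigger card divided by the smaller one.
    """
    return (1 + perf_gain) / price_ratio

# Using the claim above: ~+25% performance at roughly 2x the price.
print(value_ratio(0.25, 2.0))  # 0.625 -> you pay 2x the money for 1.25x the frames
```

A ratio below 1.0 is the "plateau" being described: each extra dollar buys less performance than it did on the cheaper card.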
In Scandinavian countries it's still huge! My whole family actively uses Snapchat, from my youngest nephews to my grandparents. I'd love to quit using it, but it's basically the primary form of communication.
They have a yearning for ye olde yesteryear, back when a computer was a bunch of huge cabinets taking up multiple rooms, giving off huge heat and sucking huge power.
Embed the GPU die into the PSU, since it's gonna draw that much power anyway, and have it plug into the wall and supply power to the rest of the components. It has plenty of power, uses the same PSU fan, and saves space. THIS CAN WORK.
"If you announce something enough times eventually the public will be too exhausted to get angry". So true and sad...
Please talk about the Spruce Pine quartz mine. Critical to the semiconductor manufacturing supply chain, operations were suspended due to hurricane damage.
6:35 You mean that thing they sneak in when you install Windows without telling the user? BitLocker and drive encryption are always enabled; if they aren't, it means the user disabled them willingly. You know, opt-out.
Instead of saying the mouse is shaped like existential dread, I hoped you would say it's shaped like the organic gaming console from the movie eXistenZ, the one that connects directly to your spine and makes you lose your sense of reality. The mouse really does remind me of that thing.
So for the 6000 series, are we just gonna plug the graphics card directly into the wall? Am I going to need another 15-amp circuit to run it? It's only $3000 USD. Comes with a discount on the electrician to install it.
We always go through this song and dance. Nvidia makes its non-90 cards disappointing. We all say we're going to get AMD. AMD then releases their top-end card as a 4080 competitor. Only problem? It's 8-10% less expensive than the 4080. Most people shrug and decide to either sit this generation out or begrudgingly get the Nvidia card.
600 watts, that's a microwave or toaster in a PC case. Definitely don't want that hot air staying in the case; it will want water cooling with the hot air going directly outside, or it will need some intense airflow inside the case.
How is it changing anything? Top 4090 models have 600W TDPs, and most of them are air-cooled and don't even run hot. I doubt any 5090 will go beyond 700W, but those 700W models are going to be AIO for sure.
@@filippetrovic845 How is it changing anything? It has more transistors, more and faster RAM, a larger memory bus... I feel like I'm already wasting my time having to explain this. More cores and more components, when they already struggle to shrink the transistors, mean more power draw. But you already know that; it's why you said "top 4090." So how is it changing anything? It's a new card, and the trend has always been more power and more power draw, even though shrinking makes them more efficient. Efficiency is the low-end market and power is the high-end one, and again, that's why you said it.
Perhaps I should explain: when I said that's a microwave or toaster in a PC case, I was also including the rest of the system, as in total output. But also, GPUs are not always pulling full power; I was making a joke.
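The "whole system, not just the GPU" point above can be sketched as a rough power budget. All the component wattages here are illustrative guesses (the 600 W GPU figure is the rumor discussed in the video, the rest are typical ballpark draws), not measured numbers:

```python
# Rough system power budget in watts, using assumed, typical component draws.
components = {
    "gpu": 600,                  # rumored 5090-class board power
    "cpu": 150,                  # mid/high-end desktop CPU under load
    "motherboard_ram_ssd": 60,   # everything on the board, roughly
    "fans_pumps": 25,
}

total = sum(components.values())
psu_rating = 1000                # a common PSU size for builds like this
headroom = psu_rating - total

print(f"total draw ~{total} W, headroom {headroom} W on a {psu_rating} W PSU")
```

With these made-up numbers the box as a whole really is in toaster territory at the wall, even though each part rarely pulls its maximum simultaneously.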
Gotta love when Ryan mentions the "looking for 7 trillion dollars" part of Altman's plan, I instantly think of it as some kind of hyperbole, and no, it's actually 7 trillion dollars. Bro, wtf
I'll be hanging onto my RX 6800 for a long time, till it dies. Bought it second-hand last year; it was an ex-mining card and boy, it's still performing great... Hopefully the RX 9000 or 10000 series GPUs will still use the 2x 8-pin connectors... And I still WON'T touch that Recall feature with a 10 ft pole.
I swear I was listening, until @6:16 - upper right corner: "Winamp releases source code, asks for help modernizing player". I'd love to hear more about that. 👍
5090s need the extra memory and speed for AI and extremely resource-heavy work. It's completely stupid to brand them as gaming GPUs. Running games at 4K ultra at 120 fps with ray tracing was already achieved with the 4080/4090. And yes, I count DLSS and FG as GPU performance; I don't care how much people cry about it.
Great to see Sony _and_ Nintendo promoting this incredible historical artifact, and this collection tells you _a lot_ about Atari's equally impressive hardware history, along with giving you a sense of what the machines could do, and when...
In 1979, Atari released the first of its "8-bit line" with the _Atari 400 and 800,_ sporting *the first separate sound and graphics chips* in mass-market computers, as well as a direct predecessor to USB, "Serial I/O," which allowed for "daisy-chaining" multiple accessories through a single port and "hot-swapping" devices (un/hooking them while the computer was on), which seems ordinary today but remained a HUGE pain on other systems for years (in some cases two decades) to come... The separate sound and video chips were such a major advancement that Commodore even got the _same two engineers_ who led the development of Atari's 1979 8-bit-line chips to design updated versions for the C64 (1982). Sound boards were popular in PCs through the mid-late Aughts, and graphics boards are obviously still a thing (although that's all reverting, with tech advances...).
Atari's hardware placed them among the top five most influential computer brands (with Apple, Microsoft/IBM, and Commodore) and helped shape the landscape of computing; their custom chips let their apps and games outdo PC/Apple counterparts, back when computing was wide open and IBM and Microsoft were positioning themselves as "making 'PCs'" and "running on 'PCs'" to make IBM, MS-DOS, and Windows synonymous with "PC," or "Personal Computer," despite many solid competitors. That includes Atari, which sold 30+ million 2600s and almost 5 million 8-bits, and Commodore, which sold over 20 million C64s, still (and probably always) _the most units of a single computer model ever sold._ The pre-'80s POKEY sound chip was even reused in the subsequent 5200 (1982) and 7800 (1984) consoles.
Using recorded audio (on CDs and DVDs), computers and consoles started ditching audio-generation chips (although, of course, some DAC chip is still necessary somewhere along the line, even if it's in your soundbar or (pre-)amp...).
Meanwhile the 40 series cards are just not dropping in price at all. Thanks, AMD, for also pulling out of the race. With developers getting lazier and asking gamers to get better hardware instead of optimizing, gaming is slowly becoming a hobby for the rich. Even the consoles are going up in price now. Back to Minesweeper we go, I guess.
"B-but AMD are the good guys! They offer real value for their GPUs!" As claimed by the many AMD shills even as they see the effects of a duopoly falling apart because one half decided to just stop putting effort into their products.
@@Hybris51129 to be fair, amd never stood a chance. nvidia were already pricing like they dont exist cause they really dont. only poor people buy an amd gpu, and now even poor people cant afford them.
@@iris4547 The higher Nvidia goes AMD only has to be priced a little cheaper and their fans will keep parroting their definition of "value". I really hope in a few years Intel's GPU division will get off the ground and at least start pinching AMD in the budget and mid-tiers and maybe force some form of a balance to the market.
Only recently did I deem the graphics in my CPU outdated; they're now replaced by that AMD card that was just a mobile chip in a coat. That thing can run GTA V just fine.
Why would you ever expect the XX90 series to be mid range, it’s literally the definition of top end. The XX60 and XX70 cards already exist for mid range along with the X700 and X800 cards from AMD.
@@-Burb Haha, good answer. When the 90 series didn't exist, no one complained. But people like this guy don't realise that quite a lot of us who can afford it actually care, and don't care for midrange cards. The 80 series is the true high-end gaming GPU, while the 90 is a halo product. The 70 and 60 series should be compared to the 80 series, and then you see that the 70 series is actually quite adequate. Hell, even the 4060 runs most demanding games as well as AMD's 7900 XT in many cases.
@@filippetrovic845 Ok, but the 90 existing doesn't excuse the fact that Nvidia is intentionally handicapping their midrange lineup to upsell you to their high-end cards. And just because you can afford something doesn't mean you're automatically interested. Honestly, if my choices are spending an uncomfortable amount of money I could be spending on other things, or knowingly getting ripped off, I'd rather just not bother anymore. They risk alienating a lot of potential customers.
8:57 Good to see the Statial getting mainstream attention after Optimum and DiamondLobbyReviews released sponsored videos on a 3D-printed modular mouse.
Between that and TSMC's CEO calling him a "podcasting bro" Alt-Man has a lot to cry into his pillow tonight. You know, if he had functional tear ducts.
hey guys its riley from the video i just got back from the top of the glorious mountaintop where i go to meditate on how absolutely BASED i am because I access local perspectives to better understand world politics and current events with Ground News. Use my link ground.news/TechLinked to get 40% off the Vantage plan and join me in the light of correctness
Cool
thank you Riley
GPT had a go at me for being non-inclusive towards Nazis.
Riley peak based
btw the 4090 can only do 52fps in alan woke 2 @1440p without upscale from 1080p or glitchy fake frames. which makes the 4090 basically just a 1080p card in actually modern games (lets ignore drops to 16fps in cp2077).
*pats 1080ti*
"you ready to go for one more gen buddy?"
I was rocking 970s until the 30 series and went AMD because Nvidia price to performance ratio is abysmal these days. I hope AMD goes hard in software and forces Nvidia to take the wakeup hit they need
@@Day100 hopeful thinking
"I didn't hear no bell."
Went from 970 to 4060, change was night and day, maybe I'm just too biased to say dlss3 it's a marvel along frame gen, but I never would imagine be able to play nowadays high end games at 1080 with above 60 fps, but I feel like I will upgrade eventually to a 4080 in the long term
Still have the RTX 2080, which is a 1080 Ti but with RT cores.
when the 5090 releases so the price of the 4090 goes down so the price of the 3090 ti goes down so the price of the 2080 ti goes down so the price of the 1080 ti goes down so that i can get a 1050 ti : - ))
Treat yourself to a 1660 Super.
😂
Someone gets it. We will just leech the savings even more, saving on the already cheap budget cards 😂🫡
Rumor is they stopped making the 4090 so they don't have to discount them when the 5090 comes out.
4090 isn't being discounted, the 5090 is going to be more expensive
Can’t wait for the Nvidia 1000090 that will require its own nuclear power plant to run
And 6 connectors known to have fire issues!
Microsoft can plug you into 3 Mile Island.
I kinda wanna see this now
@@tonys.1946 6? 2000 connectors
Umm, MS's AI skunkworks has a controlling share of Three Mile Island and has already run some test starts. They expect all the nuclear cores to be back online and working, with even more safety guardrails. LOL, I guess someone at MS saw Terminator 1-3 and decided that building Skynet to delete humans was a bad idea.
Ain't no way the RTX 5090 consumes more power than my bedroom air conditioner 💀
You're gonna need another air conditioner to cool it
That isn't much for an air conditioner. A good air conditioner is usually 2kw or more.
@@iCore7Gaming Well, it's a mini-split, not a central AC system, so for a 16 m2 (180 sqft) bedroom, it's plenty enough.
Also 2kW is literally 1/3 of my house's maximum allocated power💀
Can't play games on your air conditioner. Or can you 👀💀.
@@ASEM-1123 Damn didn't realise that some houses were only rated up to 6kW. I've got a tiny city flat that's rated up to 20kW
why not just make the 5090 a PSU for the entire PC
Wire a new electrical circuit just for the GPU
Built in psu just plug in two power cables into the back
Yeah science...
cause money.
Wait, wait... I think you're cooking here...
That might be genius.
At this point Nvidia should start using the C14 connector instead of making power supplies obsolete with every new generation.
not bad idea, but better to not use nvidia, AMD ON TOP
@@Pan_Torcha But AMD has unfortunately given up the fight with Nvidia and will settle for mid- and low-range graphics cards, where Intel too is competing for its piece of the pie.
Meanwhile, Nvidia will be free to charge whatever they want in the enthusiast class.
I CAN'T LIVE WITHOUT CUDA ANDD DLSSS SORRYY @@Pan_Torcha
@@AkashYadavOriginal Shame AMD gave up the fight; it'd be easy to surpass Intel. If I was running AMD, they'd crush Intel.
@@AkashYadavOriginal Unless you're in the top 1% of gamers, AMD has a card at every price point that works great, lol. Only at the like top top top top do they not compete...
So if I get a 5090 I won't need to turn on my heater anymore
600W is insane. Imagine your PC cooking trying to run Black Myth: Wukong on max settings. I'm pretty sure some gaming rooms already have their outlets maxed out too 😂
My old 1080 FE pumps out 80 Celsius. Didn't have to turn on my heater once through winter.
If you trust the connectors enough, you become the heater too, so does everything else though. :(
Should be integrated with water tank. My water heater is 1000W, that's probably 8090.
Free heat, my brother. Free heat😂
NVidia knows that most reasonably powerful AI models require 24gb+ of VRAM to run them. Sounds like they're using VRAM to limit the appeal of all cards aside from their flagship models. Its the cypto-mining false-scarcity model all over again
Just going to buy the 5080 Super with 24 GB then. More of an "I want" than an "I need."
16GB can run a lot of good models, but 32GB doesn't run the best ones, since they require 70+.
@@Donnirononon when do you think the super models will be released
@@khunshub18461 year after release.
Yep! VRAM reality hit Stability AI very hard... so they did what every other random moonshot startup does: give up. And then come back as Flux: small, medium, large, and XL. XL is for research only (i.e. people who can afford many GPUs with 48 gigs of RAM); the medium models are slower than large, but "only" need about 20-24 gigs of VRAM. It's almost as if using asynchronous compute on a computer part never meant to make "copy-pasta, analog eldritch horror" isn't going to help make general AI, or keep people from making prompt-based pr0n.
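The VRAM numbers being thrown around in this thread follow from simple back-of-the-envelope math: parameter count times bytes per parameter. A minimal sketch (the `model_vram_gb` helper is made up for illustration, and it only counts the weights, ignoring activations and KV cache):

```python
def model_vram_gb(n_params: float, bits_per_param: int) -> float:
    """Rough VRAM in GB just to hold a model's weights.

    Ignores activations, KV cache, and framework overhead, so real
    usage is somewhat higher than this lower bound.
    """
    return n_params * bits_per_param / 8 / 1e9

# A 70-billion-parameter model, the size class mentioned above:
print(model_vram_gb(70e9, 16))  # 140.0 GB at fp16 -> far beyond any consumer card
print(model_vram_gb(70e9, 4))   # 35.0 GB at 4-bit -> still over a 32 GB flagship
```

This is why a rumored 32 GB card comfortably fits mid-size models but still can't hold the largest open ones without aggressive quantization or multiple GPUs.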
From OpenAI to ClosedAI to ProfitAI
next stop TheranosAI
last stop TerminatorAI
It was never open.
to DystopianAI
my wallet taketh a vacation
I lacketh both of those
Oh no 😅
btw the 4090 can only do 52fps in alan woke 2 @1440p without upscaling from 1080p or glitchy fake frames, which makes the 4090 basically just a 1080p card in actually modern games (let's ignore the drops to 16fps in CP2077).
"High-end GPUs" are obsolete and 10x overpriced in 2024, because they can't even run 2024 games properly on a $200 1440p 180Hz monitor. The 5090 would be a 1440p 75fps GPU, lol, which barely makes it a 1440p card when all cheap 1440p monitors are 144-180Hz these days.
@@rawdez_ The problem is the games not being optimized, not the GPUs, tbh.
@@lonewanderer3292 Take consolation in knowing you are not alone my brother
It consumes more power than my 2.5 hp split type inverter aircon.
It’s getting to server territory in power consumption. I have a T430 that has 750W PSUs. Probably going to get there with the 6090 (nice)
I’m not sure my aircon can handle 600w in the room lol
They realized they can get away with whatever wattage they want and consumers will still buy, so this is the result.
why do you think they're making freakin nuclear plants to power these things
2.5hp=1838W.
Remember when they used to make GPUs better by improving architecture and manufacturing process and not just by making it literally bigger and more power hungry every year?
No
@@MrMKFreak yes, shit I am getting too old
I was too poor in the before time
Really? 16GB on a $1000 card made for 2025? Come on Nvidia. Next thing they're going to do is keep 6GB of VRAM on the 5050M
$20 for a Whopper combo, man.
Don't forget 8GB on 5070M
Apple still sells Laptops with 8gb ram xd
I wouldn't put it past them to charge as much for the 5080 as the 4090. I'd be surprised if the 5090 is less than $2000. :(
ive already got a 4090 but its bottlenecked by my computer right now. i think i'll just upgrade that and see how the reviews for the 5090 are. i'll probably get double the performance out of my poor 4090 anyways(which is waaay more than i need)
I remember when the goal was getting lower and lower TDP with each high-end generation. Like all the 700, 900, and 1000 series marketing against their AMD counterparts, which, since the Volcanic Islands series, were all power and heat hogs.
Silicon limits. Advancements in die shrinks have slowed significantly over the last 15 years, so the only way for the next generation to be significantly faster is to pump more watts through it.
Yet when products actually did aim at lower TDP, like 4060Ti vs 3060Ti or the Ryzen 9000 processors, reviewers and consumers hated them for not providing enough performance gain.
And the 4090 is quite power efficient if you FPS-limit it, it just also has the ability to go pedal to the metal if you want the best graphics out there.
@@T33K3SS3LCH3N Because in the past you still got the performance gain as well as the lower TDP.
This is like the people complaining about games getting too big, smh... To these people wanting significant improvements but no compromises: could you just stop and think for a second first?
@@R0DSTER L take. The whole point of a new architecture/smaller node is efficiency/more performance at a similar power draw. Otherwise there's basically no value in changing to a new process.
0:33 Missed the chance to say "This is the best Nvidia GPU we have ever made" in a Tim Cook voice.
Our thickest GPU yet
"Actually, this is not the best phone we've ever made, buy the 15"
5080 only 16gb is typical Nvidia again... 20gb would have been good.
20 is for ti
When "last gen" AMD has 24GB. Yeaaaaah all is fine.
They're gonna release a 12gb 70 series card again ☹️
20 makes too much sense when they need you to pay for their more expensive model.
✨for your 1080p gaming needs✨
Even if the Kia system was secure, I don’t like the idea of the dealer having that much access to my car after I’ve bought it, that is horrendous
That’s what you get for not reading the TOS. W/ all new cars you need to disable the antennas so they can’t communicate w/ the mother ship. OTA updates should not be a thing when it comes to vehicles.
There are articles saying that Nissans are tracking everything about their drivers down to their sex life which is insane (mainly because there's no way that Altima drivers are actually getting laid).
@@CyanRooper 🤣🤣🤣🤣
It is probably a feature for the lending entity since it makes it a lot easier to repo the cars.
They're all doing it now. ColdFusion recently made a vid about it. It's used against you as well - that data is sold to insurers, resulting in your premiums going up.
Sure you can uninstall the 'recall' feature.... and you will have to after every single update.
Even if you uninstall it, do you really THINK it's uninstalled? That's like uninstalling a keylogger.
let's just take a step further back and ask why is it going to be installed to begin with
@@CoreDreamStudios Uninstalling a keylogger? You mean reformatting your PC to Linux? :3
@@MyouKyuubi EXACTLY :D
@@CoreDreamStudios xD
2:07 you know, a bunch of people crapped on the Qualcomm chips, but I’m so happy they made them. Interesting how when they said “we have 5x battery life compared to x86” suddenly intel and amd made their processors extremely power efficient. Just a thought.
Honestly? I'd rather Raspberry had a segment with Qualcomm Snapdragon chips, even if we had to pay a premium over the conventional ones.
*reads title*
*glances at my gtx 770*
Same here with my GTX 1050 Ti Max-Q on my 5 year old laptop.
@@microbuilder 2080tis are pretty cheap now
One of them situations where you're like, "hold on, little card, one day, one day."
@@redinthesky1 Just watch out for the ones with those sketchy failing Micron chips.
reads title
looks at my Mac m1 with disappointment
I don't like this trend. The 4090 was already a massive jump in specs over the 4080; now the 5090 is literally double the 5080. It's like they're releasing the **90 card to show what they can do, then cutting the rest of the lineup right down to milk profit while keeping performance on the table. I'm totally fine with using the **90 card to push the boundaries at a crazy price only elitists will pay, but making the **80 card actually more of a **50 card by comparison is just wrong.
**50? Seriously? 4080 eclipses 3090ti in raster and is way better in RT. What arbitrary metric are you following!? 5080 is supposedly around 10% faster than 4090! On which planet is it a 50 class performance!!?
With AMD pulling out of the higher tier, Nvidia is free to get away with whatever they please. Now they are providing minimal uplift which defines a generation shift.
Exaggerating much
@@jal.ajeera read better
@@caliginousmoira8565 Read the "edited" tag in the actual comment.
@@jal.ajeera Let's see if what you "suppose" is true. Nvidia organizing an early shortage on the 4090 is not a good sign (for both prices & performance).
Can't wait to not afford this generation either
RIGHT!!!! "The new 5090 with 32 gb of ram, only a modest 5 grand for a modest performance boost over a 4090"
@@arch800 plenty of good titles that are out between 2019 and now with amazing graphics and story, don't get pulled from the whirlpool of "keeping up", just to play micro transaction hell shit shows of a game nowadays.
Get a job, you will be able to buy it.
@@edenassos I'm an IT manager and I still can't afford it, brain dead take 🤡
@@arch800 Poor financial planning at its finest but keep finding excuses.
The 5080 only has 16 gigs, like wtf Nvidia 😂 they sure are just taking the piss now
and you need more than that why
@@andrewcairnsmrkiplin Because we are already seeing games using more VRAM 🤔 so 16 gigs will become the new 12 gigs or 8 gigs... do we need to draw pictures?
@@reghardmostert8425 frr the 2080ti i used to have had 12gb and that was from 2018.... +4 gigabytes for almost the same price 8 years later is wild
@@-ev1l562 It is crazy, like make the 5080 then 24gig, but it all to not make them future proof...Look also at new MFS2024 specs we will soon run into super heavy gpu needs
5:58 Recall is one of those features where Microsoft spent too much money developing it to just throw it away even though none of us wants it. The equivalent of grandma's terrible, terrible mincemeat pie.
"No, grandma, no. It's neither minced nor is it meat. Keep that the hell out of my operating system!!!"
but come on man they fixed it. you can tru....... hahaha sorry i couldn't say that without cracking up laughing.
The problem here is you called it YOUR operating system. It isn't yours. It is grandma's.
I kinda want it, but my laptop doesn't have an NPU, man.
@@Gigaheart Regular people don't gaf about Linux and never will.
@@Toastybees paranoia much?
People say that x86 is old and that ARM is modern. But they don't realise that both x86 and ARM are older than ATX form factor itself.
I don't know that calling 5090 a gaming gpu is fair
Yeah.. gaming on a NASA super computer 😂
Looking at the history of supercomputers, 2 of these GPUs would match the TFLOPs of the most powerful (public) supercomputer of 2005. By NASA public information standards, it's more like 2007 - 2008. An entire room's worth of computing with over 20 miles of cables is beaten by something that can fit inside a suitcase.
@@ryanthompson3737 exactly
The real target is AI developers
It's 2K, 2.5K tops. A few months of work at most and you can buy it.
My 7900XTX was bought with the intention of not caring at all for a GPU for at least 10 years. Given the crappy AAA games it may very well outlast this period of time.
a 6800 XT does the job just fine too. :]
@@MyouKyuubi na, i have the same GPU; 16GB of VRAM is not enough for 4K or AI
@@fampic7133 my 6800 xt handles 4k just fine, at an average of over 150 fps on any game at medium-high settings, wdym?
And AI is possibly the worst thing for gaming, ever, lol. It causes so many issues like increased input delay, graphical artifacts, etc etc.
I don't use modern features, because they're awful, lol.
16 gb vram is perfectly fine.
Of course, if you're a complete f***ing r****d, and wont be happy with anything less than 4k gaming at ultra settings, and expecting like 360 fps... You're not gonna be happy with anything, because, that doesn't exist. xD
@@fampic7133 For AI, just enable CPU offload. No need to hold everything in VRAM.
Watercooling my card to make sure. Then again all these new games that "need" a new graphics card don't seem to look any better than the ones that don't 🤔
If they're doing 32 on the '90 they should do 20 on the 80 and 24 on the 80 Ti. The 70s should get 16, the 60s get 12 GB, 8 on the 50s, and 6 on the 30s.
the 80 has half the cores of the 90. it's no 80, it's a 50. nvidia releasing the 90 to show what they can do but cutting everything else down to milk the consumer.
@@iris4547 The 80 at 50% leaves enough room to introduce a 5080 Super, 5080 Ti and 5080 Ti Super!
ti super reflex edition
Ngreedia
@@iris4547 The 1060 had a similar core ratio to the 1080, as did the 1660 Ti to the 2080. The 2060 has just over 1/3rd of what the 3090 has. I'd say it's close to being a 60 Ti or 70 relative to the 5090. They should up the RAM amount also. If a 5080 Ti is going to be done, then raise the 80 to 20GB and the Ti to 24; otherwise the 80 should be 24.
So Nvidia has basically decided to blur the line between gaming GPU and workstation to pretend Moore's law still exists, but really it's just old tech with higher specs, refreshed, renamed, and rebranded to satisfy and confuse consumers.
? GPUs don't use leading edge nodes, they're not and never have been a measure for Moore's law.
I mean, their stock price has been on a roller-coaster ride because of constant reports / murmurs that the big Blackwell chips keep hitting production snags, so, eh?
Mmmm... satisfusion.
Very true Moore's law is dead
@@TheSouthParkVidsFTW no, it isn't
All I want is an Intel card that can outpunch my 3080 in traditional rendering (idc about AI upscaling or Rey Skywalker tracing) that only costs one goat and some bone marrow.
Our dystopian cyberpunk hyper-corporate future is already here: it just doesn’t look as cool as we thought it would. Where’s my motorcycle-riding ninja data-broker? Or my samurai sword wielding, hacker pizza delivery guy? 😅
They are always there; you just don't know that side of them ;)
i think i will be holding on to my 4090 for a few generations (assuming it doesnt melt)
I'm gonna sell my 3090 and 4090 to get the 5090
still holding onto my 1660ti
I won't, I will sell mine as soon as Nvidia releases something better that doesn't guzzle down energy. I don't trust it to survive for long.
@@zedoctor_ ooooooof
you can't pass up on the 6090
Well, I guess I don't need a heater in my room for winter; this card will do the job.
i think your sponsorship segment for ground news was actually more informative and explained THE ACTUAL POINT OF THE SITE better than others who get sponsored, love to see it!
That's it I'm building a nuclear reactor now to be ready for it😂
Microsoft is on board with this solution
even my AC wasn't that power hungry today
Just plug the mouse and keyboard into the GPU at this point
RTX 60 series will just be a nvidia gaming console, except you have to insert your own hard drive/ssd
600 watts? really? this is not power, it's a waste of energy and an unoptimized MESS, this is ridiculous!
It's still way more efficient than the 4090 though.
@@iCore7Gaming They both have terrible price-to-performance and power optimization. It's pitiful.
That's always the limit. Not like it's going to draw 600 all the time.
My 4090 is rated for 450, haven't ever seen it push past around 325-360.
@iCore7Gaming The 4090 has 68% more cores and 40% more bandwidth while costing almost double a 4080, and only has 25% better performance. Yeah, the 5090 will have the same plateau limitation.
@@TheRealLink Try 4K path tracing. It also means you are leaving significant performance on the table; I get 10 to 15% delta gains in RT at 3GHz/450 watts.
7:14 Snapchat has 422 million daily users as of 2024 Q1. I feel like I know none of them.
Why would you want to?
It’s all kids.
@@zheta42 Can confirm, snapchat is HUGE in my local high school and is the main form of communication between students.
In Scandinavian countries, it’s still huge! My whole family is actively using snapchat, from my youngest nephews to my grandparents. I’d love to quit using it, but it’s basically the primary form for communication
@@VetoHow If it keeps your family together, then more power to it!
Remember the days of the 1080ti? When they tried to reduce power consumption per fps for each generation? What the duck happened?
They have a yearning for ye old yester year, back when a computer was a bunch of huge cabinets taking up multiple rooms, giving off huge heat, and sucking huge power.
Embed the GPU die into the PSU since it's gonna draw that much power, and have it plug into the wall and supply power to the rest of the components. It has plenty of power, uses the same PSU fan, and saves space. THIS CAN WORK.
"If you announce something enough times eventually the public will be too exhausted to get angry". So true and sad...
600W stock which means it'll be 800W in reality.
i remember when i bought my cooler master silent pro gold 1000 watt, everyone was saying you will never need that much. oh really.
It's still on a 600w connection.
600w max output einstein
@@demonpride1975 saaame
" 1.21 gigawatts?! "
- Dr. Brown discovering the 5090
Please talk about the Spruce Pine quartz mine. Critical to the semiconductor manufacturing supply chain, operations were suspended due to hurricane damage.
why does Kia keep having stupid security issues?
It will happen to all the manufacturers eventually. Cars with Internet access to core systems is a monumentally bad idea.
That's why you don't want internet on everything, everything will be used against you
6:35 you mean that thing they sneak in when you install windows, without telling the user? bitlocker and drive encryption are always enabled; if they aren't, it means the user disabled them willingly. you know, opt-out.
Instead of saying the mouse is shaped like existential dread, I hoped you would say it's shaped like Existenz, the organic gaming console that connects directly to your spine and makes you lose your sense of reality, from the movie Existenz. The mouse really does remind me of that thing.
So for the 6000 series are we just gonna plug the Graphics card directly into the wall? Am I going to need another 15 amp circuit to run it?
It's only $3,000 USD. Comes with a discount on your electrician to install it.
only 16 gb guess im going amd next gen
I'm waiting for the 6070 TI Ultra Super tbh
We always go through this song and dance. Nvidia makes its non-90 cards disappointing. We all say we're going to get AMD. AMD then releases their top-end card as a 4080 competitor. Only problem? It's 8-10% less expensive than the 4080. Most people shrug and decide to either sit this generation out or begrudgingly get the Nvidia card.
AMD left the High End GPU Market. Was in the news this week.
Thank you for covering the exoskeleton story!
Where does Copilot save its files?
I want to make sure to pre-give the folder Deny All permission ahead of time.
I think I need more Riley videos in my feed. Instantly makes me feel better anytime I watch these
The 5080 got absolutely scalped. So.. Is the rest of the lineup going for 4060 status ?
By claiming you can "uninstall" recall, what they mean is: We'll just add it back with every update until you give up.
600 watts, that's a microwave or toaster in a PC case.
Definitely don't want that hot air staying in the case, it will want water cooling and the hot air going directly outside or it will need some intense airflow inside the case.
How is it changing anything? Top 4090 models have 600W TDPs, and most of them are air-cooled and don't even run hot. I doubt any 5090 will go beyond 700W, but those 700W models are going to be AIO for sure.
@@filippetrovic845 how is it changing anything? it has more transistors, more and faster ram, a larger memory bus... i feel like i'm wasting my time having to explain this. more cores and more components, when they already struggle to shrink the transistors, mean more power draw. but you already know that, it's why you said "top 4090." so how is it changing anything? it's a new card, and the trend has always been more power and more power draw, even though shrinking makes them more efficient. efficiency is the low-end market and power is the high-end one, and again, that's why you said it.
That's a low end micro, my microwave is 800w 😂
@@frallorfrallor3410 higher powered microwaves tend to cook food fast on the outside and less on the inside, more is not always better :D
perhaps i should explain: when i said that's a microwave or toaster in a PC case, i was also including the rest of the system, as in total output. but also, GPUs are not always pulling full power; i was making a joke.
Gotta love it when Ryan mentions the "looking for 7 trillion dollars" part of Altman's plan. I instantly think of it as some kind of hyperbole, and no, it's actually 7 trillion dollars. Bro, wtf
Seinfeld fans unite:
Previous video: "These pretzels are making me thirsty!"
This video: "They are real and they are spectacular."
The next one will be "No [insert tech here] for you!"
Seinfeld fan # 426321 reporting for duty!
0:43 Hey I just watched that movie! Are you spying on me Riley? I wouldn't mind actually...
Thanks for the Sam Altman insults. First time I ever upvoted Techlinked!
For these prices and wattages, it should come with its own full size AC
I'll be hanging onto my RX 6800 for a long time, till it dies. Bought it 2nd hand last year, it was an ex-mining card, and boy, it's still performing great... hopefully the RX 9000 or 10000 series GPUs will still use 2x 8-pin connectors... And I still WON'T touch that Recall feature with a 10ft pole.
Just want to say, I really love your delivery and the writing on this show. Thank you so much for sharing your talent
600 Watts? No thanks. Feels like the tech has stalled, and they're just cramming in more processors and more power pull.
You're obviously broke, why talk about something you can't afford?
I swear I was listening, until @6:16 - upper right corner: "Winamp releases source code, asks for help modernizing player". I'd love to hear more about that. 👍
How much you wanna bet the only upgrades the 5090 has are bigger memory, GDDR7, and 600 watt access. That's incredibly lame if that's the case.
5090s need the extra memory and speed for AI and extremely resource-heavy work. They're completely stupid for branding them as gaming GPUs. Running games at 4K ultra at 120fps with ray tracing was already achieved with the 4080/4090. And yes, I count DLSS and FG as GPU performance. I don't care how much people cry about it.
@@smittyvanjagermanjenson182 I wasn't dissing on frame gen dude.
This is the best grounded humor I've thoroughly enjoyed in a while. Thank you!
Why does Sam look so evil in that 3:50 picture lol 😢😅
Bro I was gonna say that his face is nightmare fuel. Idk what kind of editing they did or where they got the photo from but they really did sam dirty😂
@@Renaldo_Graham got him looking like he’s ready to play the next installment of Marvel’s Red Skull 👹
@@npc-drew lmao realll
Low key Riley could do voice acting for cartoons and shit
600 WATTS?!? That sucker better have the performance to run anything at 4K Ultra 120FPS for the next 5 years
Natively? It won't.
Too bad my private powerplant is down for repair, so I have to wait
Does an AC come with the 5090?
Looks at 4090
"We need to talk!"
need to black out my entire city to power up the GPU
Jakob's ad lib game was on point in this one
I’m surprised Tim Cook knows so much about Nvidia
Great to see Sony _and_ Nintendo promoting this incredible historical artifact, and this collection tells you _a lot_ about Atari’s equally impressive hardware history - along with giving you a sense of what the machines could do, and when...
_In 1979,_ Atari released the first of its “8-bit line,” the _Atari 400 and 800,_ sporting *the first separate sound and graphics chips* in mass market computers, as well as the direct predecessor to USB, “Serial I/O,” which allowed for “daisy-chaining” multiple accessories through a single port and “hot-swapping” devices ((un)hooking them while the computer was on), which seems ordinary today but was a HUGE pain for other systems for years (in some cases 2 decades) to come...
The separate sound and video chips (boards) were such a major advancement that Commodore even got the _same two engineers_ that led the development of Atari’s (1979) 8-bit line chips to design updated versions for the C64 (1982). Sound boards were popular in PCs through the mid-late Aughts, and graphics boards are obviously still a thing (although that’s all reverting, with tech advances... ).
Atari’s hardware placed them among the top five most influential computer brands (with Apple, Microsoft/IBM, and Commodore) and helped shape the landscape of computing, in many ways allowing their apps/games to be better than PC/Apple, back when computing was wide open and IBM and Microsoft were positioning themselves as “making ‘PCs’” and “running on ‘PCs,’” to make IBM, MS-DOS, and Windows synonymous with “PC,” or “Personal Computer,” despite so many solid competitors, including Atari, which sold 30+ million 2600s and almost 5 million 8-bits, and Commodore, which sold over 20 million C64s (which still is, and always will be, _the most units of a single computer model ever sold_).
* The pre-‘80s POKEY sound chip was used in the subsequent 5200 (1982) and 7800 (1984) consoles. Once recorded audio (on CDs and DVDs) took over, computers and consoles started ditching audio generation chips (although, of course, some DAC chip is still necessary somewhere along the line, even if it’s in your soundbar or (pre-)amp...).
Riley shirt status: washed 3 days ago, picked up off the floor this morning
tech news isn’t even tech news anymore. nice to see more actual tech news
Meanwhile the 40 series cards are just not dropping in price at all. Thanks, AMD, for also pulling out of the race. With developers getting lazier, asking gamers to get better hardware instead of optimizing, gaming is slowly becoming a hobby for the rich. Even the consoles are going up in price now. Back to Minesweeper we go, I guess
"B-but AMD are the good guys! They offer real value for their GPUs!" As claimed by the many AMD shills even as they see the effects of a duopoly falling apart because one half decided to just stop putting effort into their products.
@@Hybris51129 to be fair, amd never stood a chance. nvidia were already pricing like they dont exist cause they really dont. only poor people buy an amd gpu, and now even poor people cant afford them.
@@Hybris51129 you are mad at the shills? Or the duopoly?
Yea, for me to get a 4090 in Aus, It's roughly around 3k - 3.5k Aud. Fucking insane.
@@iris4547 The higher Nvidia goes AMD only has to be priced a little cheaper and their fans will keep parroting their definition of "value".
I really hope in a few years Intel's GPU division will get off the ground and at least start pinching AMD in the budget and mid-tiers and maybe force some form of a balance to the market.
Now in winter you can choose between an oil heater and an RTX 5090
My 1070 TI is fine.... really....
Only recently the graphics in my cpu was deemed outdated by me, now replaced by that amd card that was just a mobile chip in a coat
That shit can run gta v just fine
It honestly is. How many truly exciting AAA games are being released these days
words we never fathomed: "it better be good, or someone's going to buy you," while dancing. X-)
Devs: “optimizations? You mean peasants settings?”
Gotta love the Seinfeld reference right at the beginning 😂😂😂 "they're real & they're spectacular"
Hey Techlinked! Are you sure they are real? Those chips have silicone in them!
honestly you guys flow like a sick beat !
Hey, where's Tech Linked for Sept 30th, 2024? No video?! Wack...
This
Well, when I make the upgrade from my 1080 Ti to the 5090, it's going to be one hell of an upgrade.
Thankyou Notification Bell
Think outside the.... notification panel?
I was confused about the statutory holiday on Monday and then I remembered I live in Ontario and it ISN'T a statutory holiday here.
Who asked for a 600 watt unaffordable card? How many years do normal gamers have to wait for a midrange card?
Why would you ever expect the XX90 series to be mid range, it’s literally the definition of top end. The XX60 and XX70 cards already exist for mid range along with the X700 and X800 cards from AMD.
@@-Burb Haha good answer. When the 90 series didn't exist, no one complained. But dumb people like this guy don't realise that quite a lot of us who can afford it actually do care, and don't care for midrange cards. The 80 series is the true high-end gaming GPU, while the 90 is a halo product. The 70 and 60 series should be compared to the 80 series, and then you see that the 70 series is actually quite adequate. Hell, even a 4060 runs most demanding games as well as AMD's 7900 XT in many cases.
@@filippetrovic845 Ok, but the 90 existing doesn't excuse the fact nvidia is intentionally handicapping their midrange lineup to upsell you to their high end cards. And just because you can afford something, doesn't mean you're automatically interested. Honestly, if my choices are spending an uncomfortable amount of money I could be spending on other things, or knowingly getting ripped off, I'd rather just not bother anymore. They risk alienating a lot of potential customers
xx90s are not gamer cards. They never were. xx90s are for workstations; xx70 and xx80 are for gaming.
8:57 good to see the Statial getting mainstream attention after Optimum and DiamondLobbyReviews released sponsored videos on a 3D printed modular mouse.
Let's bet. How many AI bros will grovel and plea to buy a 5090 before anyone else? It will sell like hot cakes.
But not to gamers.
Oh shit, are you wrong...
Seinfeld reference in almost every episode, it's gold Riley, GOLD!
My whole PC uses a 550W psu ;-;
"also the attack takes, like, 30 seconds" makes that exploit especially scary.
1:22 lol 😂
no
Yes @@xapu8775
Never did I think I would hear the words “twink Palpatine”, thank you Jakob.
Between that and TSMC's CEO calling him a "podcasting bro" Alt-Man has a lot to cry into his pillow tonight.
You know, if he had functional tear ducts.
Please AMD, make a good competitive product, with moderate power consumption and a fair price
They won’t. AMD announced this week that they’re pulling out of the High End GPU Market.
maybe they will compete well in the mid to low end
- my wallet