bought a 7900xt knowing that amd won't release a new "high-end" card this year. i'll just skip the next 1-2 generations and wait for improvements. i can easily just swap out the gpu in 2 years and still have a high-end pc with my 9800X3D. sticking with 20gb of raw power for now, should do the trick playing in 2k.
Got a 4080 and never use raytracing bc dlss doesn't do it when base frames are below 70 or 80, so we need raw performance to get those frames first. Working with 30 base frames in performance mode doesn't work for me at all
We're making the most modern GPUs ever made, and yet we still can't even get 5 year old games to run at 4k Native 60fps on them, let alone 120fps. DLSS is just a damn cancer on the gaming industry.
Not long ago, playing CS2, I noticed how sluggish the controls were, and players often killed me as if they saw me before I saw them. When I turned on the FPS counter, I saw that my frame time was jumping between 18-24 milliseconds. There are several factors here, of course; after working on a lot of things, I eventually got it down to 11-14 milliseconds, and this radically changed how the game feels. If you think about what 10 milliseconds is, it's almost nothing, but in reality everything became faster: controls, reaction time, even my skill went up and it became easier to get kills. What I'm getting at is that they want to feed us 300-500 fake frames with a delay of over 35 milliseconds, and that is just hell for multiplayer shooters.
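For scale, here's the same point as plain arithmetic, converting the quoted frame times into frame rates (numbers taken from the comment above, nothing measured by me):

```python
# Rough back-of-envelope: convert the quoted frame times to FPS to see
# how much headroom shaving ~10 ms per frame actually buys.
def fps_from_frametime_ms(ms: float) -> float:
    return 1000.0 / ms

before = [fps_from_frametime_ms(ms) for ms in (24, 18)]  # ~42 and ~56 fps
after = [fps_from_frametime_ms(ms) for ms in (14, 11)]   # ~71 and ~91 fps

print(f"before: {before[0]:.0f}-{before[1]:.0f} fps")  # before: 42-56 fps
print(f"after:  {after[0]:.0f}-{after[1]:.0f} fps")    # after:  71-91 fps
```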
First thing I do is turn off all the upscaling and frame generation, as it feels washed out and blurry when it's on. Reportedly a 20% uplift on the 5090 compared to the 4090 after 2 years? Does not seem worth it.
21:05 Dude forgets about students who game in between classes and need a laptop for study; that's a niche market. As well as gamers who travel a lot for their jobs, like journalists/business executives/etc., who also need a laptop for work on the go.
6:00 pc building is already modular and incredibly easy. it's just an adult version of the kids' toy where you put the square block through the square hole and the round block through the round hole. anybody can do it. putting together ikea furniture is more difficult. (not that that is particularly hard either)
@@TopShot501st Still stupid, since they could just make a built-in 90 degree connector... though it would be hard since the PCB itself is absolutely tiny compared to that of the 4090.
@@TopShot501st what was so hard about plugging a few of the bigger 8 pin connectors in. that is already modular. it's not hard. it works. it's reliable. it looks good. and why tf does it need to look good anyway.
9:18 Don't be fooled.
The GPU's raw performance is 28 fps.
Like huh? This is so stupid.
There is no way their raw power only gets 28 fps.
This is even more proof that modern games are extremely unoptimized and rely heavily on DLSS.
It is because it runs full path tracing at 4K, which is ridiculously demanding. But yeah, fake frames suck, it isn't performance.
In the same scenario with full RT Overdrive, the 4090 gives you 17-18 fps of raw performance. Just to be clear.
@@KraszuPolis You wouldn't need DLSS/FSR for full RT if the games were made properly. You'd just need rasterisation.
@@sanek9500 The 4090 gives 20 fps, but keep in mind this "jump" in "raw performance" will be even lower when you're not using RT
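Taking the figures quoted in this thread at face value (they are comment numbers, not verified benchmarks), the raw uplift they imply looks like this:

```python
# Rough generational uplift using the figures quoted above:
# 5090 ~28 fps raw, 4090 ~17-20 fps raw, 4K path tracing, no DLSS.
rtx_5090_fps = 28
rtx_4090_fps_range = (17, 20)

for fps in rtx_4090_fps_range:
    print(f"4090 at {fps} fps -> 5090 is ~{rtx_5090_fps / fps:.2f}x faster in raw frames")
# prints ~1.65x and ~1.40x, i.e. roughly +40-65% raw,
# a long way from the "2x the 4090" marketing framing.
```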
Yep, I commented the exact same thing.
18:10 Zack still doesn't understand that the retail price of goods is not determined by the costs of production. Goods are aimed to be priced at the optimum point that generates the most aggregate profit.
You're absolutely correct in saying this. It's why ink cartridges have a nightmarish profit margin. They know you'll pay.
Sad that Asmon will read comments on videos he reacts to and not his own YT comment section.
These are two separate things. Of course the price is determined by the cost of production. And they come down from that price after they have developed cheaper production methods.
@@joev3783 It's a different business model; ink cartridges work like a subscription. You pay for the printer in ink.
@@damazywlodarczyk They are not separate; they are priced above BOM/production and margined at what people will spend for the product. It is tech, not a necessity.
You're basically paying and banking on Nvidia to provide you performance through software updates rather than buying a solid and powerful GPU. No thanks.
Yes 💯% True!
this is it 1000% it's like a subscription. you want to unlock the new software features we locked behind the new hardware that's not even much better, give us 2k...
The software "updates" are only for their next gen cards every time
As an owner of the 4080 Super, the extra fake frames feel terrible; it does not feel smooth and the response to your keyboard and mouse feels really delayed. I tried them a few times and never turned them back on.
Finally somebody else said it.
Be careful. I just said the same on another video and got 10 fanboys telling me I'm against technology and I'm stupid etc etc. Supposedly if they can't notice it then the issue doesn't exist🤡
@@GFClocked i doubt they have a 40 series card lmao
That’s why they showed a reduced latency. Everything will be AI in the future
@@РаЫо It's wild they are going that route; every bit of compute that goes to AI means it's not going toward actually generating authentic pixels the way devs want them.
Yup, listen to the guy that's been personally invited backstage at Nvidia's HQ, with multiple Nvidia executives listening to every word, because if there is one negative thing, he'll lose Nvidia as a potential sponsor.
I like LTT, but this felt forced. Almost like at gunpoint.
He vowed not to get the 4090 but he said he's gonna get the 5090. However the elephant in the room is he didn't get the 4090 because of the price, and the 5090 is even more expensive.
He's always expressed distaste for Nvidia. He says they're a huge pain to work with all the time.
The card is more powerful. And he's got the money to spend. I don't blame him for getting the best card available. But he's probably just as miffed as most of us that this is all dlss AI performance.
You're still going to get great performance without craptracing on.
@@Grodstark poor linus🤣
Yeah, with a whole room full of Nvidia execs looking at you while making a video, it must have felt awkward as hell.
@@truckywuckyuwu If he truly thought Nvidia was distasteful and a pain to work with he simply wouldn’t work with them.
5090: “Hey we heard you love AI generated frames so we put more AI in your AI”
Truly a yo dawg moment
truly an Xzibition of our time
Let's play Unreal 98!
RTX 5090: Hey, i can't play that. it doesn't use DLSS!
"You can't, but I can!" *3DFX Voodoo 3 3000*
Graphics Processing Unit ❌
Frames Generating Unit✅
Hallucination lsd-trip unit 🤡
27 frames on native resolution ahh card
@@shreyasdharashivkar8027 this is the future you don't understand!!11!!! 😠😠
@@shreyasdharashivkar8027 on 4k with path tracing
😂😂😂 @@sadge0
A bunch of tech channels already commented that what DLSS4 offers is 3 "fake" frames for every 1 true frame.
It only works in games that support it, and while LOOKING smooth, it has noticeable delay.
It doesn't even look smooth; there is noticeable ghosting and smearing even in these early reviews. Which means it will be even worse in person.
a lot of the tech channels on youtube are just influencers posing as reviewers
Can we pay fake money for fake frames?
Bitcoin
You can just buy the B580 instead and less money going to them will get your point across. So hope you're not one of those hypocrites with their cards ...
We already do lol, our money is backed by nothing.
@@nw932 never was
AI upscaling subscription coming to a GPU near you.... I wouldn't be surprised if they swap to a model like that in the future.
The hottest take anyone can give:
We should have NEVER left the GTX lineup. RTX and the very IDEA of raytracing has done nothing but irreversible damage to the entire gaming industry as a whole to the point of ruin.
Raytracing was not only a mistake, but flat out the worst direction the industry has taken en masse, due to the negative impact it has had on development (industry-wide) and then, by proxy, performance.
Gaming would be in a MUCH better place right now if raytracing technology never existed.
I wouldn't go so far as to say it shouldn't exist, more that it shouldn't be on a commercial product. Production workloads still benefit from RTX cards for ray-traced rendering.
Or if it had been released in a finished state - as in, all games would be fully ray-traced out of the box. Nowadays this is not the case - RT is an unoptimized mess that runs on top of another unoptimized mess, and is also completely butchered by needing a proper implementation from the dev team, which happens... extremely rarely.
It's so funny to me that raytracing can look worse but needs far more resources than baked lighting. But GUYS THE RAYS, THE TRACING, SPEND $2000 NOW!
@@sebastianrosenheim6196 Bought into the hype with a 3070, I'll turn RT on to see how it looks, then back off when I actually go to play.
you can still buy a 1080ti if you want, they are like $100
NVIDIA doing the Apple move increasing pricing even higher. Chip shortage am I right
The 5090 is hardly a consumer card; if you only plan to game then you can just pretend it doesn't exist.
People with financial difficulties shouldn’t be buying halo tech products. For moderately successful middle-aged people, $2K isn’t a big deal.
@@ivyr336 IKR. Anything that can't run smoothly on a 3000 series card deserves to go down under. Unoptimized bullshit.
@@indeedinteresting2156 I'd say that anything that can't run on a 1000/1600 series card on lowest settings without any frame gen and upscaling bullshit deserves to go under
@@ivyr336 It's very much a consumer card. Won't be buying it though.
Frame generation is like painting stripes on your car and changing the exhaust, then saying it's faster now just because it looks and sounds faster.
Nah, frame generation is parking at the finish line and bribing a ref to affirm you ran the race
Comment section on this video is making me cry hahahaah, good one dude, really puts things into perspective
Lol the copium is through the roof.
Add stripes to your Lada to make it an Adidas car
frame gen is like making games like Hogwarts and MSFS run smooth for the first time ever
1000 bucks for 16GB of vram.
this is the worst timeline
Ai reduces vram usage who cares
Fr plus it's GDDR7, but people love complaining@@РаЫо
@@РаЫо HAH. Good one.
@@РаЫо and that's the problem the over reliance on AI
@@РаЫо In their tests it only reduced 400mb tho
21:00 as someone who bought a gaming laptop i think it's because I wanted to be able to take it to university, play games when I'm on my lunch breaks, and be able to take my laptop to a friend's house and game together with split screen games
Split screen? I hope you are at least doing it on a TV; even a regular monitor is way too small for that.
@@alexturnbackthearmy1907 No reason you can't. all you need is an HDMI cable
@@alexturnbackthearmy1907 it's just to go to a friend's dorm and casually game together if we were bored. There's a lot of uses normies get out of gaming laptops was my point, I couldn't take my pc with me
Yes I got a gaming laptop for that purpose too.
Ended up getting a desktop PC and a cheap laptop. Just remote into the gaming desktop with moonlight and sunshine. Extremely low latency. Works out perfectly.
Laptop gets great battery life because it's only streaming video.
No sound from the loud desktop as it's in another room.
And I can stream it to any screen anywhere.
@@hotrodhunk7389 the simplest explanation I can say is, this correlates as to why mobile gaming is so big too. It's all about convenience.
I'm not paying 2k for double the fake frames. Real frames peaked
True but tbh human eyes don’t get a real benefit over 120 fps. So what’s the point in trying to go higher?
@@smokinace926 Well when it can't even go over 28fps at native....
Tell me when they hit 60fps@@smokinace926
@@smokinace926 tbh it's harder and harder to notice, sure, but proper 240 is already noticeably better. For those who chase further and further fluidity it makes sense I guess
@@smokinace926 the point is, the real frames are only 28 so...
One of the problems with fake frames is that the performance doesn't carry over to VR or other high-demand applications: 3D modeling, video editing, multi-screen, etc.
So no reason to get 5090 if those don't work
yep. pure rasterization performance is always more important and more useful in general for various tasks. imho the die space for dlss, raytracing and ai is just wasted resources that could have been put toward more powerful gpus.
That's a good point
@@socks2441 you seem to not understand current engineering capabilities; the reason we have software solutions is because of our hardware limitations. GeForce is geared towards gaming, not productivity. DLSS and Ray Tracing are both beneficial for gaming.
Linus would have been an infomercial star like 30 years ago
He was.
The Billy Mays of tech.
He was for Gerber
I cannot believe that frame generation feels good at all.
Each frame that is generated does not have any direct input from your mouse and keyboard.
This would feel like the input lag we had with v-sync 60hz, where you also have to wait for the monitor's refresh rate (it creates small gaps between control-input and video-output).
Also, the fps counter can be manipulated by frame generation by simply showing the same generated frame twice. If you have enough different frames in a second, you won't notice it except for some blur - which the AI conveniently generates anyway, and the fps counter going up.
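A minimal sketch of that argument, treating frame generation as "render at the base rate, interpolate the rest" (a simplification for illustration, not Nvidia's documented pipeline):

```python
# Simplified latency model: generated frames carry no new input, and
# interpolation has to hold back at least one rendered frame, so the
# input path stays tied to the base frame rate no matter what the counter says.
def frame_gen_feel(base_fps: float, multiplier: int) -> None:
    base_frame_ms = 1000.0 / base_fps
    displayed_fps = base_fps * multiplier
    # one base frame to render, plus one held back for interpolation when FG is on
    held_back_ms = base_frame_ms if multiplier > 1 else 0.0
    input_latency_ms = base_frame_ms + held_back_ms
    print(f"counter shows ~{displayed_fps:.0f} fps, "
          f"input sampled every {base_frame_ms:.1f} ms, "
          f"felt latency on the order of {input_latency_ms:.0f} ms")

frame_gen_feel(28, 4)    # ~112 fps on the counter, ~36 ms input interval, ~71 ms feel
frame_gen_feel(112, 1)   # a real 112 fps: ~9 ms per frame, ~9 ms feel
```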
I've tried it with just regular frame gen and it feels like Jell-O. It'll be even worse, it'll feel like a monitor with slow pixel response time despite having more frames displayed.
Bro got abducted to the Nvidia Backrooms to make content for them.
They even got a "casting" couch in there 🤣
The most optimal hardware from previous generations is about to skyrocket in value; it'll be a scalper's paradise
I think that from now on GPUs should no longer be called "hardware"; here you are practically paying for "software", since 3/4 of the frames are generated by DLSS4
does that make it feel laggy or can you not even tell its on like current nvidia frame gen
they literally invented the HARDWARE to make DLSS and real time RT possible... SMH
@@HellPedre so you really think that artificially generated frames are better than physical frames generated by the gpu? 🤣 I think that dlss4 should only "help" a gpu "compensate" with additional frames when it becomes obsolete... not become the main feature. otherwise we are no longer talking about a GPU but about an FGU (frame generating unit)
@@HellPedre haha you think dlss is hardware? good joke!
@@HellPedre that's exactly what they want you to think.
4070 to 5070
$549 for software upgrade "AI frame"
When he says "ai" all I can hear is "horrible TAA" and "blurry in motion" and also "horrific ghosting artifacts", too. Stalker 2 is one great recent example of all of those.
we'll see with benchmarks if Nvidia really solved that with DLSS 4.0
You haven't seen the DLSS Transformer model. It will be available for all RTX cards since the 20 series. And it fixes a lot of the ghosting and blurriness issues. Go look it up, Digital Foundry already has a video on it
Remember - GSC sacrificed A-Life for aggressive optimisation in Stalker 2
Hallucinated frames. Soon we'll have fully hallucinated games with the 6000 series!
@@BlazeBluetm35 Probably not...
I think we brought this on ourselves. Many gamers are so obsessed with frame rates that graphics cards no longer provide "visuals" as such; they are simply focused on generating frame rates, while games look like their 2015 counterparts but in 4K...
That's a lie. We're talking about the 60 fps standard, the lowest possible bar, set more than 10 years ago. Neither I nor anyone else should be gaslit into believing that "gamers have unreasonable standards."
Games should run on reasonably affordable hardware, at 60 fps, minimum, before any frame generation. That has been the standard for more than 10 years, the absolute minimum standard, and it's the industry's fault for failing to meet even the minimum standard.
did you even game in 2015?
@@afelias Games can run at 60fps though, if everything else is sacrificed.
Frankly, I see no issue with the rtx50 generation. The AI and frame generation is for people wanting 4K and 60 fps and ray tracing and pathtracing.
If you play at 1080p or 2K 60 fps and no raytracing or pathtracing you won't need the AI frame generation.
@@sadiqraga3451 Yes, but you shouldn't advertise any sort of improvement if it comes at the cost of the minimum bar.
If you have to turn off RT, lower the texture quality, lower the resolution or use supersampling, just to hit 60 FPS, then that is the limit of the performance uplift. If the games can't look better at 60 FPS then there would be no reason to spend more or even an equal amount of money for new hardware. That's the point.
It's not like it's anyone's fault but NVIDIA's for aggressively pushing for RT in the first place. So they can't advertise any kind of uplift if they have to start below-minimum to create that comparison.
@@bigturkey1 I gamed in the 1990s on Amiga 600 :P
And then it dies after exactly 2 years of warranty
Don't you already get a new card before the warranty runs out?
c o n s u m e
@@roshuneppBro, what?
This is what bothers me most. Paying 2000 should get you a minimum of 4 year warranty. I'd be happy with that.
@@roshunepp Bruh, been running my 2080TI since 2019, intend to keep going for a few more years and praying maybe something that isn't AI dogslop comes out with actual hardware improvements.
0:46 no wonder their employees are retiring
No we’re not.
Imagine how bad the optimization is going to be in future releases oh god, they're already stuttery messes on high end current hardware
i use a 4080 at 4k and never felt like a game was unoptimized
@@bigturkey1 maybe because you are using a damn 4080?
on a 3080ti i have noticed almost every AAA game of the last few years is terribly optimized. indiana jones being the one exception.
@@socks2441 Indiana Jones at 4k with a 3080 gets 80-90 in the jungle areas and 110-120 indoors.
@@bigturkey1 exactly. it's incredible. especially after so many games that require dlss and low settings and still can't hold 60fps.
but yeah, i was shocked to find my 3080 ti could max the game out at NATIVE 4k60.
i wish they had not limited the game to ray tracing hardware though, because clearly this game is well enough optimized for any decent GTX gpu to play. i don't think the forced ray tracing is doing much in this game.
Now we wait for game devs to release unoptimized slop running at 15fps and leave it to MFG to reach their 60fps target.
60 frames being the target is very disingenuous now, because 60 frames at native should be the target. Not 60 frames after all this AI bs
25% price increase for 25% performance at 25% more electricity used.
The question is do you need that 25% increase to play a game? Most people would say no. Band wagoners would say probably yes. 😂
best case scenario.
Im still on my 1080ti lol
a lot of people, including me, travel a lot and buy a powerful gaming laptop to be able to play while on business trips. it doesn't feel like the setup at home, but gets close enough.
This GPU draws more power than my fridge.
That's true for practically any GPU genius
8:11
DLSS Performance
Screen Space Reflections Quality not on the maximum setting
This is 1/4th of the resolution of the original, meaning this "4k" gameplay is only 1080p upscaled; also 3 out of 4 frames are AI generated. As a simplified calculation you have to divide by ~7 (if you factor in the actual cost of the upscaling, a more realistic number is dividing by 5-6) to get the "real" native 4K performance of this card.
So the 110 FPS you see at 8:41 would be about 20-25 FPS
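Written out, the back-of-envelope estimate the comment describes looks like this (the divisors are the comment's own rough factors, not measured data):

```python
# DLSS Performance renders ~1/4 of the output pixels and MFG shows 4 frames
# per rendered frame, but upscaling and frame gen aren't free, so the
# combined divisor lands around 5-7 rather than a clean 16.
displayed_fps = 110

for divisor in (5, 6, 7):
    print(f"divide by {divisor}: ~{displayed_fps / divisor:.0f} fps native-4K estimate")
# prints roughly 22, 18 and 16 fps -- in the same ballpark as the
# 20-25 fps figure the comment arrives at.
```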
Damn I can't believe DLSS isn't actually real, can't believe we've been lied to
7:40 "you can't change the settings" sounds weird to me, i'm sure it's not because they have to reset it for the next journalist, more like "it only works if you put it like that"
basically what triple-A devs do today: "the game only works if you play the way we want you to play!" if you do it a bit differently, everything breaks
They don’t want Linus to turn off DLSS. Nvidia is building up hype around the 5090 performing twice as good as the 4090 but only with DLSS which is meaningless.
they have it set up so the 4090 runs at a crawl, and the 5090 has all the new ai software for fake frames. if he could turn that off he could actually compare raw performance numbers of the actual hardware instead of the software. they don't want that.
Yep. But the problem for Nvidia will be that a few days after the official launch we will have tech reviews from GN, Hardware Unboxed etc., and we will see the reality. Making Linus not change things is stupid; they know very well that every major tech reviewer will play with the settings and the truth will shine.
The 5090 exists as a price anchor, what nVidia really wants you to buy is the 5070/5080. The 5090 is priced that way to make the 5070/5080 look like a good deal (they aren't).
The 5070 could be sold at $400 and the 5080 at $650.
More like the opposite. The 5080 is not even half of the 5090. And the other cards show no significant improvement over the previous gen either. So cards for the poor, and A SINGLE card that actually delivers performance.
@@alexturnbackthearmy1907 You're thinking of 4060 (eventually 5060), an intentionally horrendous value meant to push buyers to the 4060ti (eventually 5060ti). The 5090 has 102% more cores, the 5080 has 14% faster clock. The 5080 is half the price, more than half the performance.
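Napkin math on those figures (cores × clock is a crude proxy for throughput and real scaling is worse than linear; the prices are the announced MSRPs):

```python
# Crude throughput proxy using the numbers in the comment above.
cores_ratio = 2.02   # 5090 has ~102% more cores than the 5080
clock_ratio = 1.14   # 5080 clocks ~14% higher
price_5090, price_5080 = 1999, 999  # announced MSRPs in USD

perf_5080 = 1.0
perf_5090 = cores_ratio / clock_ratio            # ~1.77x the 5080 on paper
value_5080 = perf_5080 / price_5080
value_5090 = perf_5090 / price_5090

print(f"5090 ~{perf_5090:.2f}x the 5080 on this napkin model")
print(f"5080 ~{value_5080 / value_5090:.2f}x the perf-per-dollar")  # ~1.13x
```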
20% more performance, 100% more DLSS4 performance lmao. Imagine AI generating 15 out of 16 pixels and calling it performance lol. Unlike FSR this will only work with a few titles that require Nvidia to work with the developers. Nvidia must be mad that the 1080 Ti was relevant for too long, and they want performance to be locked behind "software compatibility" while an old and weak GTX 1050 from a decade ago can run AMD's frame generation technology lol
If it actually works i dont see any issue.
:D
yeah, but since that frame generation is 50 series exclusive, it still produces double the frames. And think about the fact that game devs don't give a shit about optimization anymore, so they just implement these frame gen features. So in the end it does not matter, still double the frames. BTW did no one notice that the resolution was even at 4K???? With the new DisplayPort 2.1 upgrades and the new 4K OLED 240Hz monitors, the quality of pixels will be insane.
Real frames are generated; these are more like hallucinated, given how bad they are. The GPU just went on an LSD-level artifact trip. But somehow it's amazing, and I'm just stupid because others can't notice the problem, so it must not exist.
@@Jay-vt1mw Same.
I've used Frame Generation on some titles now. Some were broken at release but later fixed, some had input-lag issues, but some also ran perfectly fine, as if there was no Frame Gen (it was on), with high FPS.
What Nvidia needs to do is push it onto more games or engines like UE5, OR make it available via drivers so you can use it on all games (but it might be broken).
NGL, that chain did add charisma. Now, it's gone. RIP Chain Asmon
Built my lad his first pc on Christmas day, will never forget it.
20:55 In India, the gaming laptop market is significantly larger than the gaming PC market.
It's worldwide 🥰
1440p for a monitor is plenty enough
nah you need to play at 8k
For most. Same way that 1080p was standard for 2020, and 720p was for 2015/2016, 1440p should be standard for 2025. Benchmarking to 60 frames at ultra at 1440p native should be the ideal for optimization now.
@@PerciusLive you need to get an 8k tho
@@PerciusLive I feel like 1080p has been a standard for waaay longer... Like the late 2010s, pretty much every single person had a 1080p monitor. Besides that i couldn't agree more.
Even 1080p is plenty for basically everyone, and it's so, so much easier to run compared to 1440p, let alone 2160p. I do however think that while 1080p is enough for gaming, for more productive work with multiple windows side by side 1440p is the sweet spot. For gaming only, 1080p is enough for most people.
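The pixel math behind the "easier to run" point (plain resolution arithmetic, nothing GPU-specific):

```python
# Pixel counts per resolution relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels ({w * h / base:.2f}x 1080p)")
# 1080p: 2,073,600 (1.00x); 1440p: 3,686,400 (1.78x); 2160p: 8,294,400 (4.00x)
```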
The "Zuckerberg free speech" video wore the chain
13:30 and now, seeing it broken by the next video, it's like seeing the conclusion of a side quest story
There are no "tech bloggers".
There are only freelance marketers for tech corporations.
and then there's Gamers Nexus
It feels like the biggest issue with DLSS is not its performance, but that developers are relying on it instead of developing actually good graphics processes.
DLSS does have its negative quirks, such as producing frames that are not responsive to user input, and smearing between frames, but it does produce a lot of "performance" for much less effort.
Wait for benchmarks. Every launch Nvidia has had since the 970 has had an issue on day 1. Plus you're paying more $$$ for artificial frames.
Cars are like a religion to a bunch of us. Working on them, building them, customizing them, it’s gonna be really tough to make driving illegal without a major uprising.
Once, the roads belonged to people, not to cars. Even if it takes time, they will be forced out of driving their cars. For "their own good", of course.
If only we had games, worth playing
If only we knew how to use commas.
Just because you cant find a game you enjoy doesnt mean „we“ dont have good games😂😂🤦🏽♂️
@@megaplexXxHDOK concord player lmao
Concord
Banana
Linus gave it to us clearly. He even showed the fault in the text when moving the frame quickly. He also said it was software doing a lot, and it has bugs which can be fixed.
We need to hear what production capacity is going to be. Will the gaming GPU supply be stable? If there's going to be an exponential increase in demand then production capacity needs to be explained.
Production location - Taiwan or somewhere less vulnerable?
Trump Tariff - how much and when?
RTX 5090 giving 20 fps at native 4k with full ray tracing
Knowing linus, he probably got a free 5090, put it on a motherboard it didn't fit, said it didn't work, sold it at auction against the wishes of nvidia, then when asked to apologize, doubled down and said nvidia are stupid.
yeah that story you're referencing is why i stopped watching linus
Does this mean that my 3DFX Voodoo can outperform the RTX 50 series because it doesn't rely on AI & just runs off the 3D Acceleration?
Honestly, I think Nvidia took this approach because their marketing strategies are lacking, and collaborating with YouTubers might help them come across as more genuine.
their marketing is so far behind AMD at this point it's sad.
@@thescarletpumpernel3305 10-15% market share AMD Radeon doesn't really scream "leadership in marketing strategy" like you think it does.
@@TheDravic Yeah... at least Nvidia has something to show. Coming empty handed is worse than not participating at all.
Gaming laptops are being used as powerhouse processing machines for programs like Photoshop, 3ds Max, Blender, and by animation studios.
9:25 50 ms input lag, oh my Lord, it's disgusting.
And that's the problem with fake frames. The engine is still bogged down, and it'll feel like shit.
I had to recheck because I didn't even know that was the input lag... If 5ms is already noticeable but playable, I can't imagine what 50ms does to a person HAHAHAHA
Just wait for Deep Learning Input Generation.
Cyberpunk a.k.a. the new Crysis. Make a graphics heavy game and give it a C name with a y in it for 10+ years of mentions.
Yes, lets trust Linus's words.
Linus is a shill
he also promoted the Honey scam ads, why does everyone trust his word
The only person you can really trust is Steve.
especially when the NVIDIA staff are watching him while he's doing the review. Totally not holding him at "gunpoint" 😂
You realize every youtuber didn't know Honey was scamming, right?? @@HitomiYuna1
Look at the stock plummet down
just a side note: I think it should be called Big Black Case.
I love how the raytracing on Jen-Hsun Huang's jacket gets better with every presentation.
I think many have already begun to guess that if you have 1 real frame and 3-4 AI generated ones, then the quality of these generated frames will be mediocre at best.
On the gaming laptop question, I did. I recently bought a gaming laptop. I'm an engineer, I work in a company that has multiple sites out of town, and occasionally I get called to those places. This is where the gaming laptop comes in handy. Mind you, I have a gaming PC as well, but the laptop is much more handy, especially when the accommodation you get is a table at a 3 star hotel. Perfect size for a decent amount of performance. I am not packing my PC in a bag and getting on a train. I already have to pack clothing.
I bring an HDMI cable to connect my gaming laptop to the room TV for gaming when on the go.
$1000 is unrealistic for every reason, starting with all the different manufacturers for each PC component charging what they want - even for very basic bare-bones features. And then there are the periodic gimmick advancements we see in hardware/software. It'll never really happen without "non-name-brand" types of cost reductions.
12gb vram on the 5070, that's really a no go, regardless of the price. You're going to have a bad time with that already with certain games, especially new releases this and next year.
People will tell you that they "fixed" it when they literally showed that the new technology only reduces VRAM usage by 400mb in games :v
That’s why they ran benchmarks at 4K and had zero issues
The 12-16 GB VRAM preaching is such a gaslight attempt by the community. Unless you're playing at 4K, 12GB covers all modern games and is fairly future proof too. And I'm not paying another 200-300 bucks + the extra for a new screen just to still have worse performance, because I'd then need even better stuff so as not to have terrible FPS. It's a scam.
the 5000 series prices, for me: they found the price cap for each class with the 4000 series, and used the successful 'Super' card prices to gauge the 5000 series cards. the 5090, they can price whatever they want... because... what else are you buying at that level? 🤣
the problem with current top tier GPUs is that we've been constrained by power and heat for like 6 years now,
cards can't get more powerful because they simply generate more heat than any reasonable PC case can dissipate through air, so the only thing they can do now is make more efficient cards and add more AI features to keep ramping the FPS, otherwise it will feel like the latest Intel iteration which isn't faster, just draws less power
This 50 series is probably just a substandard byproduct of their ongoing Blackwell AI research.
The money shifted from their GPUs to their AI enterprise instead.
It's like petrol byproducts. You really want the petrol, but why not sell the byproducts too.
1:07 "the most beautiful real time rendered demo" that ball is blurring... in the demo.
An investment is buying a gaming phone back in 2019 with 256GB storage and 12 GB RAM for $350, and still using it to this day. I've been using it 24/7 since then-for gaming, coding, video editing, you name it.
I don’t see how overpaying for something that will only be relevant for another 3-5 months can even be called an investment.
Agreed. People don't know what that word even means nowadays. But i guess it is an investment- A TERRIBLE ONE.
Heh, a GTX 1080 Ti w/ 11 GB VRAM is a good investment.
exactly. especially as the main feature isn't even available on 40 series cards. (mfg 4x) i thought longevity would be the one advantage of relying on ai instead of actual native performance, but apparently not.
@@abelingaw5070 Nvidia will never make that mistake again🤣
@@ayandangcobo1755 Yeah, because people will still fall for their BS.
Remember RTX 3050? 😂😂😂😂
3:30 chatter: "AD on AD" lol
Not just scalpers and inflation but pretty soon tariffs as well
to answer the laptop topic:
people don't realize the cons of laptop computers, they think that a desktop and a laptop are the same, except that the laptop can be brought everywhere
20:51 usually it's corpos, professionals who need the mobility, and engineers doing site analysis.
Not a fan of the AI shit. As others have said, all it really brings to the table is blur and texture smear.
DLSS fan girls will flame you for this.
my 4080 never smears anything
@@bigturkey1 then you're blind.
@@mikschultzyevo what nvidia card do you use to game at 4k and what monitor, i use 4080 and 55" 4k oled. no smearing no blur
@@Grodstark and here one is
I dOnt GeT SmEar wItH mY 4o80
After photorealism there is videorealism. Animations would have to stop being made by humans and start being generated on the spot by AI. From animation to simulation. Think of clipping, for example. Clipping happens when an animation, made by a human, either doesn't exist for the interaction of two or more different assets, or exists in a generic form for a specific asset without taking into account that the asset is going to interact with many different ones. With simulation, the interaction between assets is generated on the fly for those two specific assets. Once you have photorealistic graphics, the biggest challenge to your suspension of disbelief is the animations, which are impossible to prepare beforehand for all the multiple combinations of different interactions between assets.
Physics in videogames has always been a huge challenge, and it was a first attempt to tackle the limitations of working with animations, but the computational power required is shocking. AI can be of great help here and it would be, imo, a great trend in the years to come. It isn't going to work for everything, but it can potentially be a great jump in terms of realism in motion.
People in the military and people that travel for work a lot use gaming laptops
Once you go ultrawide, it is SO hard to go back... WoW in ultrawide... Path of Exile, FPS games, etc. are all just mindblowingly better in ultrawide... mix in OLED and a regular non-ultrawide will never ever work again.
I don't know how I'm ever gonna afford this shit, so annoying with PC games being unoptimized shit stains nowadays and you need DLSS and 5070 Tis to run them well
I'm on an Acer 4K 60Hz 28" monitor and I don't really think you need a high refresh rate monitor. I have an RX 6750 XT GPU and game at 1440p on my 4K monitor, and some games at 4K. And I have no issue in multiplayer games. Started playing Marvel Rivals and in 15 games I got 9 MVPs. Also I play CoD a lot and I'm always in the top five in any match. And same with Fortnite, I win way more than I lose. And I'm not even a hardcore gamer. I think higher refresh is only needed to make 1080p look better.
True, you don't need a higher refresh rate monitor, I promise you. I have proof.
The 5070 has fewer AI TOPS than the 4090, so if that is the case, why can't the 4090 have MFG?
Because the 4090 mfs can afford a 5090 too
@@РаЫо Exactly, that's why I don't like how they advertise the card, and the fact that they chose not to update the 40 series with MFG to "force" people into thinking they "need" the 50 series
@@YamGaming idk man i feel like people with 90 cards always buy new 90 cards anyway.
@@РаЫо Yeh and that is a problem :v
Asmongold looking at the settings like an episode of CSI.
Linus is no one I listen to for component advice at all since lockdown. They are biased, wrong, and fuck up comparisons constantly and in an obvious manner. That said, I am very surprised he didn't suck eggs over the 5070 claims. DIY gamers know the 5070 is nowhere near the 4090's league, so Linus calling them out and only agreeing that 1 feature was similar was unexpected.
The 5070 will be good for what it is and is still £200 overpriced.
Nvidia just needs a headline for the masses to read. "5070 like 4090", that is pro marketing. It does not matter that in reality the only thing those 2 GPUs share is a producer.
Exactly, I have no doubt the 5070 will be an acceptable card when on promo, but ANYONE, and I mean ANYONE, that has ever built a computer knows that comparison is fallacious and misleading.
They intentionally don't compare it to the 4070S and 4080S, or you would see how little it matters; the only product that mattered here was the 5090, behind insane pricing.
The crazy thing is the 5070 is going to be worse in raster than the 4070 Super, as it has fewer CUDA cores and fewer RT cores. Just wait, the reviews are going to be brutal.
but but but multi frame generation in 2 out of 100 games where the game doesn't look like complete dogshit.
I’m so invested in this video, I didn’t notice that Asmon wasn’t wearing a white shirt
I have a custom PC at home but travel for work a lot, so I got a gaming laptop 2 years ago to keep up on some games while traveling. Lots of gaming laptops are sold for travel purposes.
Do you remember when people used to smoke on planes = do you remember when people used to drive their car manually
Linus is untrustworthy IMHO.
Gamers Nexus/Hardware Unboxed/JaysTwoCents are the go to
Yep, he will simp for his sponsor
14:10 A casual player might not notice a blur but they surely will not buy every top end graphics card right as it comes out.
It's over
When he said 'the thing'... i was like.. 'Oh... you're gonna drop it...'
It's all about FAKE frames now. I am sorry, I am going with AMD for now.
Yup, holding onto my 7900 XTX until it doesn't work at this point
@@Cin-Ultra Yes sir, RX 7800 XT in my rig rn and I couldn't be happier with its performance
Bought a 7900 XT knowing that AMD won't release a new "high-end" card this year. I'll just skip the next 1-2 generations and wait for improvements. I can easily swap out the GPU in 2 years and still have a high-end PC with my 9800X3D. Sticking with 20 GB of raw power for now, should do the trick playing in 2K.
Got a 4080 and never use raytracing, because DLSS doesn't cut it when base frames are below 70 or 80, so we need raw performance to get those frames first.
Working with 30 base frames in performance mode doesn’t work for me at all
We're making the most modern GPUs ever made, and yet we still can't even get 5 year old games to run at 4k Native 60fps on them, let alone 120fps. DLSS is just a damn cancer on the gaming industry.
Be specific. Your statement is wrong.
We actually can run games that are 5 years old and older at native 4K 60fps, as long as there is no raytracing or pathtracing.
Not long ago, playing CS2, I noticed how viscous the controls felt, and opponents often killed me as if they saw me before I saw them. When I turned on the FPS counter, I saw that my frame time was jumping between 18-24 milliseconds. There are several factors here, of course; after working on a lot of things, I eventually got it down to 11-14 milliseconds, and this radically changed how the game feels. If you think about what 10 milliseconds is, it's almost nothing, but in reality everything became faster: controls, reaction time, even my aim improved and it became easier to get kills. What I'm getting at is that they feed us 300-500 fake frames with a delay of over 35 milliseconds; this is just hell for multiplayer shooters.
It's great for movies pretending to be video games though
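A back-of-the-envelope sketch of the frame-time arithmetic in the CS2 comment above (the fps values below are illustrative picks, not anyone's measurements): frame time is simply 1000 / fps, so 18-24 ms corresponds to roughly 42-55 rendered fps and 11-14 ms to roughly 71-90 fps.

```python
# Minimal frame-time arithmetic sketch (illustrative numbers only).

def frame_time_ms(fps: float) -> float:
    # One frame's duration in milliseconds.
    return 1000.0 / fps

for fps in (42, 55, 72, 90, 240):
    print(f"{fps:>4} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# 18-24 ms of frame time is roughly 42-55 fps; 11-14 ms is roughly 71-90 fps,
# so the ~10 ms the commenter shaved off is the gap between those two ranges.
```

The relevant point for frame generation: interpolated frames raise the displayed fps number, but input latency still tracks the base (rendered) frame rate plus whatever buffering the generator adds, which is why a high "fake" fps figure can still feel sluggish in a shooter.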
18:30 Great pause, Insecure Linus lmao 🤣
First thing I do is turn off all the upscaling and frame generation, as it feels washed out and blurry when it's on. Reportedly a 20% uplift on the 5090 compared to the 4090 after 2 years? Does not seem worth it.
21:05 Dude forgets about students who game in between classes and need a laptop to study, that's a niche market. As well as gamers who travel a lot for their jobs, like journalists/business executives/etc., who also need a laptop for work on the go.
Broke the necklace now you can't learn jujitsu. RIP
6:00 PC building is already modular and incredibly easy. It's just an adult version of the kids' toy where you put the square block through the square hole and the round block through the round hole. Anybody can do it. Putting together IKEA furniture is more difficult (not that that is particularly hard either).
TBH them angling the Power connector is nice since the cards got so big it would push the cable against the glass and make it a pain to route.
@@TopShot501st Still stupid, since they could just make a built-in 90 degree connector... though it would be hard, since the PCB itself is absolutely tiny compared to that of the 4090.
@alexturnbackthearmy1907 CableMod did, then it decided to catch fire/melt on people's 4090s.
@@TopShot501st What was so hard about plugging in a few of the bigger 8-pin connectors? That is already modular. It's not hard. It works. It's reliable. It looks good.
And why tf does it need to look good anyway?
@@TopShot501st i can understand that. great point.
Frame generation is a cancer on gaming... as is AI and the push for tech built around it :/
12:10 Motion blur smooths the blur effect; it doesn't just overlay some parts of the image at 100% alpha.
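A tiny sketch of the accumulation idea that comment is pointing at (everything here is a made-up toy, not how any particular engine implements it): averaging several sub-frame samples makes a moving object fade along its path instead of being stamped on top at full opacity.

```python
import numpy as np

def accumulation_motion_blur(subframes):
    # Average several sub-frame samples; each contributes 1/N,
    # so the moving object smears smoothly instead of overwriting
    # pixels at 100% alpha.
    return np.mean(np.stack(subframes, axis=0), axis=0)

# Toy example: a bright 1-pixel "ball" crossing a 1x8 strip within one output frame.
subframes = []
for x in range(4):
    f = np.zeros((1, 8), dtype=np.float32)
    f[0, x] = 1.0
    subframes.append(f)

print(accumulation_motion_blur(subframes))
# Each visited pixel ends up at 0.25: a smooth smear, not a hard overlay.
```

Real-time implementations typically blur along per-pixel motion vectors in a post-process rather than brute-force temporal supersampling like this, but the weighted-blend idea is the same.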
Human eye tap-out is 8K at 1 inch; at that point your eye cannot see the pixels. A lot of studies were done on this around when VR started gaining traction.
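For what it's worth, that kind of "can't see the pixels" limit is usually estimated in pixels per degree of visual angle, which depends on both panel size and viewing distance. A rough sketch, assuming a made-up 27-inch 16:9 panel (about 23.5 inches wide) viewed from 24 inches, and the commonly cited ~60 pixels/degree figure for 20/20 acuity:

```python
import math

def pixels_per_degree(h_pixels: float, width_in: float, distance_in: float) -> float:
    # Horizontal field of view subtended by the panel, then pixels per degree.
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# Example values only: 27" 16:9 panel (~23.5" wide) at a 24" viewing distance.
for h_pixels, label in ((3840, "4K"), (7680, "8K")):
    ppd = pixels_per_degree(h_pixels, 23.5, 24.0)
    print(f"{label}: ~{ppd:.0f} pixels/degree (about 60+ is the usual 'pixels invisible' ballpark)")
```

The takeaway is that the threshold isn't a single resolution; it moves with how large the screen is and how close you sit, which is why VR headsets (lenses an inch from the eye) need far higher pixel densities than desktop monitors.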