It's amazing and kind of sad that even a 4090 is not capable of native 240Hz 4K, hopefully the 5090 leads us to a better result (Ofc I understand max settings are kinda useless, as shown in this video, but even at medium-high the 4090 can't reach 240 in many games)
I'm tired of asking something I shouldn't have asked in the first place, but really it looks like my life has been played around with and dramatized instead of given what it deserved
Wait 5-10 years. For playing today's games, of course, not the ones that come out in 2030-2035. 8K is a veeeeeeery large number of pixels, too hard for any hardware in the near future and no chance today.
@@Geezer341 Well, let's be honest, sitting at a desk, how big does the monitor really need to be? Companies already want to make 16K monitors, which seems like too much to me, unless you buy a console controller to play
@@AdaptacionGamer Resolution does not directly depend on screen size. In principle, you can make a panel of any size with any resolution. 16K does not mean that the screen itself must have a huge diagonal; it can be 27" or less, or 50" or more. Look at the panels on smartphones, they are tiny compared to monitors or TVs, but they have a high resolution, often no less than that of large monitors (which are many times, or even tens of times, larger).
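The point that resolution and screen size are independent comes down to pixel density (PPI); a rough sketch of the arithmetic, where the 27" 4K monitor and the phone panel figures are example numbers, not anything from the video:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" 4K monitor vs a ~6.1" phone panel at 2556x1179 (flagship-phone class):
print(round(ppi(3840, 2160, 27)))   # roughly 163 PPI
print(round(ppi(2556, 1179, 6.1)))  # roughly 461 PPI, far denser despite being tiny
```

So a small panel can carry a huge resolution; it just ends up with a much higher PPI.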
It's hard to believe how bad the MS Flight Simulator engine is. Stutters from nowhere, sudden drops to 25% CPU utilization (so what, 4 threads? I'm afraid it's 2 cores / 4 threads) while the GPU is far from busy. Bad product.
More like a 5080, that's usually how the performance increases work. Also they wouldn't launch a Ti right away, usually at least 6 months after the regular card 🤷‍♂️
Games:
CYBERPUNK 2077 | 1080p | - 0:06 - gvo.deals/TestingGamesCP2077
CYBERPUNK 2077 | 1080p | Path Tracing - 0:51
CYBERPUNK 2077 | 1440p | - 1:19
CYBERPUNK 2077 | 1440p | Path Tracing - 1:53
CYBERPUNK 2077 | 4K | - 2:29
CYBERPUNK 2077 | 4K | Path Tracing - 3:08
CYBERPUNK 2077 | 4K | Path Tracing, DLSS 3 - 3:51
Assassin's Creed Mirage | 1080p | - 4:32
Assassin's Creed Mirage | 1440p | - 5:09
Assassin's Creed Mirage | 4K | - 5:46
Red Dead Redemption 2 | 1080p | - 6:22 - gvo.deals/TestingGamesRDR2
Red Dead Redemption 2 | 1440p | - 7:17
Red Dead Redemption 2 | 4K | - 7:55
Forza Horizon 5 | 1080p | - 8:55 - gvo.deals/TestingGamesForza5
Forza Horizon 5 | 1440p | - 9:36
Forza Horizon 5 | 4K | - 10:17
Avatar Frontiers of Pandora | 1080p | - 11:02
Avatar Frontiers of Pandora | 1440p | - 11:52
Avatar Frontiers of Pandora | 4K | - 12:39
Alan Wake 2 | 1080p | - 13:21
Alan Wake 2 | 1440p | - 14:26
Alan Wake 2 | 4K | - 15:22
Alan Wake 2 | 4K | DLSS 3 - 16:00
Hogwarts Legacy | 1080p | - 16:42 - gvo.deals/TG3HogwartsLegacy
Hogwarts Legacy | 1440p | - 17:22
Hogwarts Legacy | 4K | - 18:04
The Last of Us Part I | 1080p | - 18:36
The Last of Us Part I | 1440p | - 19:30
The Last of Us Part I | 4K | - 20:03
Resident Evil 4 Remake | 1080p | - 20:45
Resident Evil 4 Remake | 1440p | - 21:37
Resident Evil 4 Remake | 4K | - 22:18
Microsoft Flight Simulator | 1080p | - 23:13 - gvo.deals/TestingGamesMFS20
Microsoft Flight Simulator | 1440p | - 24:14
Microsoft Flight Simulator | 4K | - 25:15
Microsoft Flight Simulator | 4K | DLSS 3 - 26:27
Remnant 2 | 1080p | - 27:07
Remnant 2 | 1440p | - 27:44
Remnant 2 | 4K | - 28:35
Starfield | 1080p | - 29:28
Starfield | 1440p | - 30:09
Starfield | 4K | - 30:56
Spider-Man | 1080p | - 30:43 - gvo.deals/TestingGamesSpiderManPC
Spider-Man | 1440p | - 32:25
Spider-Man | 4K | - 33:06
The Witcher 3 | 1080p | - 33:51 - gvo.deals/TestingGamesWitcher
The Witcher 3 | 1440p | - 34:41
The Witcher 3 | 4K | - 35:24
System:
Windows 11
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz - bit.ly/3XlBGdU
GeForce RTX 4090 24GB - bit.ly/3CSaMCj
32GB RAM DDR5 6000MHz - bit.ly/3BOxlni
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
What FPS and temperature monitor are you using?
Bro, make a video of 1 RTX 4090 vs 2 RTX 4090. FPS test
The dream CPU & GPU combo for gaming.
that I will never be able to afford
It’s amazing 🤩
a correctly tuned 7950x3d can be better
True
For now
It's crazy that even a 7800X3D is still bottlenecking the 4090 at 1080p or even 1440p in some of these games! What the heck do these game engines need?
I'd probably put it down to the sheer horsepower of the 4090 that there's no current CPU which can max it out below 1440p ultrawide (3440x1440p)
The 4090 is just an insane GPU. Before the 7800X3D, there were a few games bottlenecked even at native 4K!
I got a 4090 and 7800X3D. The 7800X3D is a hard bottleneck in Helldivers 2. Only 4K seems to be good for this combo.
There is no 1080p in this video, DLSS is on, so it's probably just 720p upscaled to 1080.
@@Geezer341 I guess you didn't watch the part where he showed the graphics and resolution settings in each game.
Currently running an RTX 4080 with the 7800X3D. Loving this CPU
I’m assuming it’s extremely close but since I’m a perfectionist I must ask why you have the 4080 instead of 4090?
@@PatsPerfectPlatinums bc he's not Nvidia's personal fellatio artist
@@blueballs6357
Learnt what "fellatio" meant for the first time. Favourite reply today.
@@PatsPerfectPlatinums maybe because it's 1k more. Are you gonna buy it for him?
terrible combo man
I'm shocked. Even on top hardware, cyberpunk does not produce 144 fps, not at the highest settings at 1080p. when will processors overtake gpus and there will be no bottleneck...
That's not correct, I play at 4K with DLSS on Balanced and get 110 FPS
@@tarunchalla7031 do you know how to read?
@@tarunchalla7031 due to the CPU bottleneck it's harder to max out the GPU at a lower resolution
This test is a mega job, thank you and respect! :D
guess both amd and nvidia are busy enough w the respective title lul
thats true nvidia mainly lol
Almost 400 Watts of heat and not even 60C. That radiator is insane.
That 6 core chip from 13 years ago certainly has much larger die area than the 16-24 core mainstream chips today (cores got smaller over time) which makes it easier to cool.
Why is DLSS on at 1080p? It looks bad and bottlenecks the card. It doesn't add much performance.
30% of players = 1440P / 4K gaming is so smooth.
70% of players = 1080P bottleneck. 😂😂
1080P beautifully shows us how poorly optimized games are over the last 2 years.
extreme cpu heavy game = poorly optimized 🤦♂
@@sheltonpicardo2161 CPU heavy? Starfield, which gets just 80-100 fps with everything maxed out, on a 4090 at 1080p? 😂😂 Starfield is truly poorly optimised, but the story is nice. This game had potential..
@@gibon1431 oh no no, i agree starfield is unoptimized, but the other games in the video are quite optimized, idk what OP is talking about. it's just starfield that's unoptimized in this video.
@@sheltonpicardo2161 Yeah, he kinda said nonsense. People with high-end PCs and at least 2K or 4K monitors tend not to use 1080p cuz of the bottleneck. This is not about "buttery smooth" things..
That RTX 4090 is working the 7800X3D the hardest I have seen. Wow.
I'm waiting for the 5090 to finally play Cyberpunk the correct way
wdym
same here lol
@@TheBrainBubble 4K all ultra + path tracing at high FPS is dream Cyberpunk gameplay.
1080p+4090 is not even crazy for Alan Wake 2, barely over 60fps at native lmao
The Witcher 3 looks so beautiful at 4K 😮
You can probably upgrade to a GT 710 1GB DDR3 paired with Pentium G2020 for the best performance. And oh, 4GB RAM.
that's legendary 😎
I have owned everything from a P4, Core 2 Duo, i7 3rd gen, 7th gen, 10th gen, 11th gen; 14th gen is coming in a few weeks.
@@FriendlyPCGamers 14th is terrible value, but hey you do you
@@shadowlemon69 not the i7
Nah bro, intel hd 4000 with no cpu and 2gb of ram is peak performance‼️
Ridiculous how in 7-8 months time, 4090 will be a mid range card...
Yeah , it's insane
It won't be that soon. The 3090 still isn't mid range
@@AmEg The 3090 is on par with the 4070 Super, which is a mid-range card. The 4090 is on its way to becoming mid range soon.
@@AmEg The 3090 IS mid range. It's on the same level (well, worse actually, since it doesn't have DLSS 3) as the 4070, which is mid range.
@@unknownorigin8446 Well, maybe. But not that soon I believe
11:07 13GB of VRAM? 💀
Bugisoft did it again
More devs will follow, 16GB is mandatory @1080p from now on
/s
Forza 8 eats 20 GB VRAM on my 4090 regularly because it's broken.
But normally, Cyberpunk or Alan Wake eat 18 GB VRAM fully maxed out in 4K DLSS
@@Chasm9 So what you're seeing is VRAM allocated (kinda fake), not dedicated
Do you use CQP (in OBS) to record videos?
As much as i want a 4090 or 7900 XTX, the power draw of those cards is too much. I hope the next generation of graphics cards will be more efficient, with a good uplift in performance.
They can't really improve the transistor (process node) size much, as we're already at 3/4nm and progress is getting really slow (due to multiple problems occurring at these smaller sizes... adding to that, a silicon atom is only 0.2nm wide... this is all kinda insane anyway).
So progress on the physical side will not be that impressive. But they can't even do that much architecture-wise, as there haven't been any (publicly known) big achievements in microarchitecture research since the 4000 series came out.
Maybe they'll just produce bigger chips then (drawing more power) ... let's see.
Hi! Nice video.
What cooler did you use with this setup?
A water cooler
@@mixorab13 custom or AIO? Up to 10 degrees less than mine with an NZXT Kraken 360
Imagine looking like shit and still stuttering on RTX4090
Starfield moment
It’s insane how much horsepower it takes to run Alan wake 2.
4:00 Same performance with my 5800X3D
Tell me you know nothing about components' roles in a gaming PC without telling me 🤣
@@MrDrelnar He is right in this case, the GPU is limiting the processor, that means that the processor can take advantage of more powerful GPUs
Thats with PT, so its GPU bottleneck and not CPU
@@AdaptacionGamer there is no more powerful GPU so what's the point.
@@MrDrelnar Of course there is no more powerful GPU, but that is now, another generation will come out later, and with this processor you will not have to buy another one to use a modern GPU
RE 4 looking insane on max graphics. 4090 is 1080p gaming
So it would seem a 4090 is now a 1440p card...
How are u barely hitting 60°C with ur 7800X3D in Hogwarts at 4K? I'm also on a 4090 with the same CPU, but I'm on a custom loop, and with only 20-30% usage my 7800X3D is easily in the low 70s. I don't get it; this is also with a -20 all-core offset, PPT set to 80 and voltage at 1.20. How is your 7800X3D so much cooler?
Yeah I would really like to know the same, have similar setup, tuning and temps as you
I was wondering the same. If he's using that Dark Rock Pro, then he's getting those temps with an air cooler, and yet my 360 AIO can never hit those temps. The dude must be in Antarctica.
@@TheSoxor123 I spent over 2 hours reading about 7800X3D thermals. Mine sits around 44-47 Celsius idle and sometimes hits 70 Celsius or higher while playing. I am cooling it with a Corsair H150i Elite LCD. Haven't arrived at an answer
@@TheSoxor123 I'll let you guys in on a secret: it's because none of these benchmarks are real. The guy uses the exact same footage in multiple videos and uploads some nonsense graphs and charts based on what other, credible reviewers and benchmarkers have said about the card. You can tell because, if you look closely, the exact same stutter appears in multiple of his videos across vastly different hardware configurations, while the frametime graph shows no stutter in some videos and a noticeable stutter in his video on the RTX 4060 paired with a Ryzen 5 7600, which is most likely the only actual config he has ever tested.
best PC at the moment
for gaming maybe
@@AlexCS8
"Maybe"
@@AlexCS8 just switch to a 7950X3D and 64GB RAM instead of 32 and boom, best for work too. Except for crazy stuff like game dev or higher, which will require an RTX A6000 or something, but for someone at home there's no way anything is better. Even for office use or big companies, you won't see a 1% difference above a 4090 until the 5090 comes
All modern CPUs are bottlenecking the 4090 at low resolutions, but what I really like about this GPU is the power, the silence, and the fact that it doesn't throw tons of heat in your face in summer like my RX 7900 XTX.
is it even worth upgrading past the 4090 in the future
Yes for gaming and or productivity but I would say mainly for the enthusiasts.
When a 4090 cannot run everything maxed at 1440p with 144 fps minimum, I don't even know what is worth and what isn't. All I know is, this is getting ridiculous.
Upgrading from the 4090, probably no; having the power of the 4090 in a $600 RTX 5070, yes.
@@Sims64340 According to the leaks, a 5080 will be on the 4090's level, maybe 10% faster if that. That means a 5070 will be slower than a 4090.
I’m wondering how the temps are so low.
Mine idles at about 46 Celsius and runs 65 at 60% load
He use artic cooler cp
It's because all the numbers and graphs in this video are made up. The whole channel is nothing but fake videos.
Why is it stuttering here?? 24:00
Why are the games not smooth with a 4090?
Wtf are you talking about, it's YouTube quality and framerate, but this is the final form for a gaming configuration
The power draw at 1080p is crazy good... but at 1440p and 4K it's so much power draw and heat on most GPUs. Crazy. :o
The GPU is not being fully utilized at 1080p; it's hitting other bottlenecks in the system and is mostly just waiting for the rest to catch up. Even with that setup.
@@bb5307 DLSS is on ☹
Yeah, also crazy that this is probably one of the only cards that gives you more FPS at 1440p than at 1080p :D
1080p = 2 million pixels
1440p = 3.7 million pixels
2160p = 8.3 million pixels
so 4K needs (roughly calculated) 4 times the processing power of 1080p ... hence it makes sense the power draw is so high.
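The pixel counts above can be verified in a couple of lines (standard 16:9 resolutions assumed):

```python
# Pixel counts for the common 16:9 resolutions and their ratio to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.1f}x 1080p)")
# 1080p: 2,073,600 pixels (1.0x 1080p)
# 1440p: 3,686,400 pixels (1.8x 1080p)
# 2160p: 8,294,400 pixels (4.0x 1080p)
```

4K is exactly 4x the pixels of 1080p; power draw doesn't scale exactly linearly with pixel count, but it's a reasonable first-order estimate of the extra work.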
Thanks for the video.
I'm confused by your memory readout, why does it say 6GB running at 10,000 MHz?
I have a 4090 and an i9 14900k and I still think it’s not enough for 4K so I use a 1440p still
Great card.. but I'm getting the 5090 next June hopefully.. got to have patience when upgrading. My 3080 Ti still does the job.. even my 3060 Ti.
When is the 5000 series coming, in your opinion? I'm going to buy a whole new system soon, so I don't know if I should wait until at least September 2024?
Well here we are in June how is that 5090 treating you 😂
Hi buddy, good video, but something's not right about the FPS in some games. I have a 5900X and a 4090 and my lows are not below 90 FPS; in Ghost of Tsushima it's 145 FPS with DLSS frame gen, and in The Witcher at 4K my lows are 90 to 110.
Hello and welcome to another episode of "you're poor"
CPU power below 80W? 1.3V?
As a person who plays with locked 60 fps, I see it as an absolute win
120 fps locked is better than 60 fps
@@MafiosoDon2130 fps locked 😎
@@teeg1130 stop the cap ✋
@@MafiosoDon21 or, you know, 24 fps locked. Cinematic experience. 🔥🔥🔥🔥
@@teeg1130 nah bruh you living in a fake simulation
13:21 It seems that Alan Woke game optimization is still trash
Looks silky smooth
Finally a 1080p maxed out 60fps gpu
Diablo 4: try this: 4K, ultra settings, all ray tracing on, resolution scaling: none, resolution percentage: 200%, sharpen image: 100. So you get the best picture quality in the game. So what was the FPS on a 4090, 2-3 FPS?
not sure man, i still don't have a plan; maybe i'll do a graphics comparison, maybe just lazily upload the settings for content
What's crazy is that this card is basically drawing the same wattage as an RX Vega 56 or RX Vega 64 :D
both have HBM memory instead of regular GDDR; it's stacked memory technology, like what's now used in the X3D Ryzen memory caches
but iirc it's not the 1st AMD product that used stacked memory/HBM, but i forgot the gpu
How much cheaper would an RTX 40 card be with only 12 GB VRAM?
Tell me, what is better to buy for games, the 7800X3D or the 7950X3D?
7800x3d
The 7800X3D is better, roughly 10% ahead of the 7950X3D, but for future-proofing the 7950X3D's 16 cores will hold up.
7950x3d for production and gaming work
7800x3d for just gaming
Buy the 7800X3D. It costs less and is better in gaming. The 7950X3D has a bunch of problems with games not using the X3D cores, and you get worse performance.
Also, the 7800X3D has way better 0.1% FPS lows
Weird, I'm getting a consistent 120-130 FPS at ultra settings at 4K in Cyberpunk. Same CPU and 4090 (Zotac).
recording bro
@@mayxui69 No, it's not because of the recording, the footage wasn't recorded on an RTX 4090, the guy probably doesn't even own an RTX 4090. None of the footage is real, none of the graphs are real, the entire channel is fake.
Should upload this in 4K if it's at 4K res.
I am just here to see how an RTX 5080 will play out since it will be 10% faster than a 4090...
In CP2077 maxed out at 4K with FG there's a 💩ton of visible artifacting and ghosting. Basically, even FG isn't helping when maxing out Nvidia's software tech. RT is such a marketing gimmick.
Dang, Avatar eats 16gb of Vram
Jesus Christ what is wrong with Hogwarts at 1080p
Why test high end cards in 1080p
I think is something off with fps in MFS. Or its just my weird feelings
No MW3 test? My eyes!
in 20 years this will be the minimum requirement
Only 7 years since the 1080 Ti to this. I'd say in 5 years people will wonder why even bother with a 4090, crazy to think about
why is Remnant 2 getting 50fps at 4K lol
Damn it Alan wake 😂
The CPU was just resting in every game 😂😂😂
Honestly, this card was never worth its money. Maybe it's still the best you can get right now, but for me, high end only includes a graphics card that is actually capable of living up to the potential of the rest of the installed hardware and really getting everything out of games. The latter means, for me, that EVERY game should run at 4K, ultra settings and full ray tracing with at least 120 or rather 144 FPS (as long as the rest of the hardware doesn't prevent it). Really sad that such a card is still supposed to cost €2000. Those prices weren't justified even at release. They always try to justify such high prices to the customer, talking about so much research and more elaborate, supposedly more expensive manufacturing processes. But you realize pretty quickly that these are just cheap excuses when you consider what research, development and manufacturing in these areas actually look like nowadays. This makes me especially angry right now because I need a new PC myself, but I have no desire to spend so much money on something that didn't meet the actually appropriate standards even at launch, and that in a few years will already be scraping along at the lowest frame limits of current games at higher settings.
When you can get a 7900 XTX at the price of a 4080 Ti and it's only 20 FPS behind the double-priced 4090 Ti
You're right, but there is no 4090 Ti, yet...
Not one online game. For RPG games the 7800 is good. Try PUBG or Warzone in the city and houses. You'll see the 1% and 0.1% FPS drop. It's AMD problem No. 1. P.S. And use an OC'd 4090 @ 3000+.
You don't need more than this ❤
Why do we have games in 2024 that a 4090 can't run at 4K 60 FPS? It makes no sense at all, especially with no ability to use decent frame generation. I mean, why do game developers keep chasing mysterious graphical effects that aren't visible enough but still need to be drawn by the GPU?
Literally my rig. Definitely wasn’t cheap by all means 😅
Same, except I got the RX 7900 XTX instead of a 4090
@@brbgboxing5738 mine came with driver issues. Got my money back and went with the 4090 😭
When a 4090 cannot run everything maxed at 1440p with 144 FPS minimum, I don't even know what is worth it and what isn't. All I know is, this is getting ridiculous. The 4090 is now a 1440p card, and what makes things even worse: because a 5080 will be very similar to the 4090 in performance, if we want to game at 144 FPS minimum in all games maxed, we need a 5090... Great...
damn...avatar is one heavy motherfucker of a game hahahahah
CPU cooler? And undervolt?
thank you
36 minutes, nice
is it just me or is Witcher 3 kinda terrible. Lots of stutter
Bro plz upload more videos of the RTX 4060 vs all the GPUs and CPUs in its budget range plz ❤ a subscriber needs it
why does CP run at only 100-130fps at 1080p? the GPU is not fully used, why? other YTs show it reaching 170fps at ur settings, bad bad RAM?
how do you get the temperatures so low
This entire channel is fake, none of these benchmarks are real. They've been reusing footage for a while now and making amateur mistakes like this.
@@Redstarka22 what 😭 you can get 50-55C on a 7800X3D with an Arctic Liquid Freezer 360mm AIO and an undervolt, maybe even without the undervolt
@@Redstarka22 and the 4090 has a huge heatsink; if you have a good model card, staying under 60C is doable with very good airflow in a cool room, but since it's a test it might be an open test bench, which would mean even lower temps than in a case
@@Redstarka22 but this channel is a bit fishy, I agree
@@Qelyn you're not getting a consistent 50-55C with that CPU. I know, because I have one with a custom loop, and during gaming, it will regularly spike to 65-70-75C under heavy loads, and then it will go back to around 60C. This is with a 20% undervolt. All AM5 CPUs are designed like this, but it's especially evident with the X3D variants. Any benchmark where the CPU temperature is static, with no spiking is an obvious fake, and anyone who has an AM5 CPU can testify to this.
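The spiking behaviour described above is also why a smoothed or averaged temperature overlay can look suspiciously flat; a toy illustration with made-up sample data (not real sensor readings):

```python
# Hypothetical per-second CPU temperature samples (C) during a gaming session.
samples = [58, 60, 59, 72, 61, 60, 75, 59, 58, 70]

average = sum(samples) / len(samples)
peak = max(samples)
print(f"average: {average:.1f} C, peak: {peak} C")
# average: 63.2 C, peak: 75 C
# An overlay that only plots the average would hide the 75 C spikes entirely.
```

A readout that never moves off one value is therefore a red flag; real X3D telemetry bounces between the average and the peak.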
I'm surprised no FPS game was considered in the testing... thank you for the test video 🫡
Every FPS game will be above 144 fps at 4K ultra
People play FPS games at low settings only
@@AzZulski true competitive wise
I expected more from cyberpunk 1080p! 126fps even no 144
a lot depends on the location in the game: in some places the fps reaches 160, in others it drops to 100
I usually get around 80 to 100 fps at 4k Max and RT Overdrive with DLSS3 set to Quality plus frame generation. Their GPU in the video isn't being fully utilized at 1080p, so it has underwhelming performance at that low a resolution.
how is your 4090 so cool? Mine goes up to 75 on heavy load
That is toasty for a 4090. Not dangerously high by any means though. Which 4090 you have? My ZOTAC never got that hot with the air cooler, but now it is in the mid 40s at load with a waterblock. Edited: question mark in wrong spot.
@@Hunter-nb5bj I have an ROG Strix OC edition in a Phanteks NV7 case. Resting temp is 37 but it goes up to 75 under heavy load. Not sure what's wrong with it
@@burai647 could be case ventilation, ambient air temp being higher, the fan curve on the GPU isn’t as aggressive. But I wouldn’t worry at all, you aren’t losing out on performance and it won’t damage the GPU. So keep gaming on! 😊
The chip or the hot spot?
On my 4090 Gaming OC
Max temperatures in games:
Room: 20 degrees
Chip: 63 degrees
Memory: 62 degrees
Hot spot: 78/80 degrees!
There's nothing wrong with 75°C smh
For the price this card is a serious joke, and the performance? I'm not jealous, but this card is overrated tbh; if it cost $1000 then it'd be mid/fine (the fps in Alan Wake for $2200 is degrading and a scam)
Alan Wake says 1080p, but natively it's 1440p or 4k. Check the power consumption. It's not real 1080p. The GPU is shown 1080p but is rendering 1440p or close to 4k. 400W at 1080p is never real.
this is because the first part of the test in Alan Wake 2 is at 1080p without DLSS, and at 1440p and 4k with DLSS
No, it is because ray tracing is on. I think path tracing is also.
@@TestingGames it doesn't much matter, man. I use DLSS with Control and get the same power as at native 1440p. Most games today just use upscaling so much. 1440p is like 4k native: the GPU sees 4k, the monitor sees native 1440p, but the GPU doesn't.
@@furieux6742 He is right. I also have a 4090/7800X3D, and Nvidia GPUs use less energy with DLSS enabled, even at 100% usage. 40-series GPUs use less energy while using DLSS.
because it renders less with DLSS @@venusprinzj8094
It's amazing and kind of sad that even a 4090 is not capable of native 4k at 240Hz; hopefully the 5090 leads us to a better result
(Of course I understand max settings are kinda useless, as shown in this video, but even at medium-high the 4090 can't reach 240 in many games)
Best combo
How about a 12700F, 4090, and 32GB of RAM?
Can't wait to see 1 year later the 5070 Ti giving the same performance as the 4090 for half the price and power consumption.
😂yeah for 1500 dollars
@@Fazersofti It'll not sell at $1,500...
@@laszlozsurka8991 it's not good to dream a lot, bro! It's NVIDIA we're talking about!!
WOW
5600 vs 14100f thx.
Freaking 60FPS in 4K for all that money. JOKE!
for real. imagine in a few years
I'm tired of asking something I shouldn't have asked in the first place, but really it looks like my life has been played around with and dramatized instead of given what it deserved
it's like the lowest way to earn money and also the lowest way to treat a human being
Brave, the best browser in the world
And my 8K 60 FPS all at maximum for when? 😥
Wait 5-10 years. For playing today's games, of course, not the ones that come out in 2030-2035. 8k is a veeeeeeery large number of pixels, too hard for any near-future hardware and no chance today.
@@Geezer341 Yup, I just wanted to make a joke about this, since we are not yet ready to play games in 8K; maybe 30FPS, but 60FPS is not there yet
@@AdaptacionGamer It’s even worse, there’s not even 30fps without upscalers. And with upscalers it’s not 8k at all.😓
@@Geezer341 Well, let's be honest: sitting at a desk, how big should the monitor be? Companies already want to make 16K monitors, which seems like too much to me, unless you buy a console controller to play from further away
@@AdaptacionGamer Resolution does not directly depend on screen size. In principle, you can make a panel of any size with any resolution. 16k does not mean that the screen itself must have a huge diagonal; it can be 27" or less, or 50" or more. Look at the panels in smartphones: they are tiny compared to monitors or TVs, but they have a high resolution, often no less than that of large monitors (which are many times, or even tens of times, larger).
The best
bro's vids haven't been shown to me, it seems
where is pubg???
Flight simulator only for rich 😂💀
❤
It's hard to believe how bad the MS Flight Simulator engine is. Stutters out of nowhere, sudden drops to 25% CPU utilization (so what, 4 threads? I'm afraid it's 2 cores with SMT) while the GPU is far from busy. Bad product.
Underwhelming performance for the price. I expected better for $2K+
top
I wonder if we'll get an RTX 5070 or RTX 5070 Ti with the performance of an RTX 4090.
ahahhahah no
welcome to another masterpiece episode ladies and gentlemen..
benchmarks on hardware that 90% of us can't buy
but 100% of us can watch it ☝ like your $150k dream car. Always nice when it drives by...
thought it was my vid
At the end of the year you'll get the same performance from a 5070 Ti and a Ryzen 7 9700.
More like a 5080, that's usually how the performance increases work. Also they wouldn't launch a Ti right away, usually at least 6 months after the regular card 🤷♂️
@@Hunter-nb5bj Wrong. 4070 Ti launched before the regular 4070.
@@laszlozsurka8991 strange, they always used to do it the other way lol
include PUBG pls, the 2nd most played game on Steam, thanks
Cyberpunk micro-stutters even with the best chips atm. Wow.
The microstutters in the game are almost always autosaves, so it’s not just a random thing.
@pizzagamecube2564 I installed a mod from Nexus which disables the autosaving feature, and it works with the latest patches. No more stutter every 5 or so minutes.