Will RTX 5090 Be Too Fast For Any Current CPU?

  • Published 22 Apr 2024
  • ► Watch the FULL Video: • DF Direct Weekly #159:...
    ► Support us on Patreon! bit.ly/3jEGjvx
    ► Digital Foundry UA-cam: / digitalfoundry
    ► Digital Foundry Merch: store.digitalfoundry.net
    ► Digital Foundry at Eurogamer: eurogamer.net/digitalfoundry
    ► Follow on X/Twitter: / digitalfoundry

COMMENTS • 680

  • @UnimportantAcc
    @UnimportantAcc 1 month ago +302

    Just replace the CPU with a GPU silly 🥰🥰

    • @tsorakin
      @tsorakin 1 month ago +14

      Like duh 😂

    • @vitordelima
      @vitordelima 1 month ago +23

      This is kind of possible but it takes too long to explain. IBM Cell was one failed attempt at it.

    • @jmssun
      @jmssun 1 month ago +19

      The more you buy, the more you save ~

    • @daniil3815
      @daniil3815 1 month ago +10

      exactly. you have money for a 5090, but not for a CPU upgrade lol

    • @photonboy999
      @photonboy999 1 month ago +3

      @@daniil3815
      I think you missed the joke.
      Anyway, I remember Intel trying to go the other way with Larrabee, using stripped-down x86 cores to create a GPU. It was interesting, but predictably very inefficient for GPU tasks. So I'm curious why they put the money into it. I'm not going to just assume there was no purpose whatsoever based on my limited computer experience.

  • @leo_stanek
    @leo_stanek 1 month ago +57

    What a world we live in, where we worry that our GPUs have gotten so good the CPU becomes the bottleneck at 4K high refresh rates. I remember when 1080p60 was the dream for high-end hardware.

    • @stephenmeinhold5452
      @stephenmeinhold5452 1 month ago +6

      It still is for me on a 2080 Ti, although with DLSS I can go up to 1440p.

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 1 month ago +5

      @@stephenmeinhold5452 Yeah, Alan Wake 2 can barely run at 1080p on my 4090 with ray tracing on lol.

    • @UTFapollomarine7409
      @UTFapollomarine7409 28 days ago

      my 3900X falls short at 4K in some areas, believe it or not

    • @LordKosmux
      @LordKosmux 23 days ago +2

      @@Veganarchy-Zetetic You know that this is due to the developers' laziness in optimizing the game, right?

    • @Veganarchy-Zetetic
      @Veganarchy-Zetetic 23 days ago +2

      @@LordKosmux I would say it has a lot to do with Ray Tracing.
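
The jump the thread above is marveling at is easy to put in numbers. A quick back-of-envelope sketch (illustrative arithmetic only, no benchmark data):

```python
# Raw pixel throughput of 4K at 240 Hz vs the old 1080p60 "dream" target.

def pixel_rate(width: int, height: int, hz: int) -> int:
    """Pixels the GPU must deliver per second at a given resolution/refresh."""
    return width * height * hz

old_dream = pixel_rate(1920, 1080, 60)    # 1080p60
new_target = pixel_rate(3840, 2160, 240)  # 4K240

print(f"1080p60: {old_dream:,} px/s")
print(f"4K240:   {new_target:,} px/s")
print(f"ratio:   {new_target / old_dream:.0f}x")  # 16x the raw pixel work
```

Sixteen times the pixel work per second, before any per-pixel cost increase from ray tracing or heavier shading.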

  • @byronfranek2706
    @byronfranek2706 1 month ago +141

    32" 4K/240Hz OLED displays would be an obvious target for the 5090.

    • @xpodx
      @xpodx 1 month ago +14

      8K 120Hz/144Hz

    • @GatsuRage
      @GatsuRage 1 month ago +20

      even a 5090 wouldn't be able to push 4K 240fps... unless you're only playing CS and LoL lmao, so I seriously see no point in looking at those displays yet. 1440p still makes way more sense for high refresh rates.

    • @mttrashcan-bg1ro
      @mttrashcan-bg1ro 1 month ago +1

      That's all well and good, but I've just gotten a 4K 240Hz monitor. I have a 4090 and a 5900X, and even games from 2015 are CPU bottlenecked; 240Hz should never be a target. Making 8K gaming doable seems to be what it'll be targeting - we don't need better GPUs for 4K right now.

    • 1 month ago +4

      @@GatsuRage Well, the more FPS the better. 4K looks way better than 1440p, and OLED is better than any other display on the market.

    • @xpodx
      @xpodx 1 month ago +1

      @GatsuRage I get 180-220 in CoD Vanguard with my 4090 at 4K max, no DLSS. But yeah, the 5090 won't be able to do Cyberpunk 2077 maxed at 4K 240, or other similarly heavy games. But easier games, for sure.

  • @wickfut8917
    @wickfut8917 1 month ago +160

    VR needs more power. Always. My 4090 isn't good enough for high-resolution headsets in graphically intense games. The new headsets on the horizon, with 3800x3800 resolution per eye, will easily chew through the power of the next few generations of GPUs.

    • @clockworklegionaire2135
      @clockworklegionaire2135 1 month ago +13

      Real

    • @mattzun6779
      @mattzun6779 1 month ago +9

      Who in their right mind would develop a game that NEEDS something faster than a 4090 to be good?
      If VR needs that much power, VR games would need to sell for several thousand dollars each to make a profit with current tech.
      One can hope that next-gen consoles and Nvidia 6000-series mid-range cards get to that level.
      Hopefully, tricks like frame generation and foveated rendering (higher resolution where you are looking) will help.

    • @rahulahl
      @rahulahl 1 month ago +27

      @@mattzun6779 Not official games. But the UEVR mod allows you to play non-VR UE games in VR mode. Imagine running the latest UE5 games at about 6K resolution, aiming for a stable 90 FPS. My 3080 couldn't even run simple games like Talos Principle 2 at a playable quality or frame rate. The best I got was a blurry mess, equivalent to 720p at low settings, at about 80-ish FPS. This is why I am waiting for the 5080/90, so I can finally play those UE games in VR.

    • @xpodx
      @xpodx 1 month ago +7

      Yeah, and tons of games can do higher render scaling, and the 4090 is not strong enough for 4K max with an 8K render at 144Hz+.

    • @nossy232323
      @nossy232323 1 month ago

      @@mattzun6779 I personally would hope games will be scalable enough to use all the power from the lower end up to the ultra high end.
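
The per-eye resolutions quoted above translate into a startling pixel load. A rough comparison against a single 4K monitor (pure arithmetic, not a benchmark; real VR compositors typically add supersampling on top):

```python
# One 4K frame vs one stereo frame of a rumored 3800x3800-per-eye headset.

uhd = 3840 * 2160        # one 4K frame: 8,294,400 px
vr  = 3800 * 3800 * 2    # both eyes:   28,880,000 px

print(f"4K frame: {uhd:,} px")
print(f"VR frame: {vr:,} px (~{vr / uhd:.1f}x 4K)")

# At a 90 Hz VR refresh, that is roughly 2.6 billion pixels per second.
print(f"VR @ 90 Hz: {vr * 90:,} px/s")
```

That is before lens-distortion oversampling, which pushes the real render target higher still.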

  • @professorJorge11
    @professorJorge11 1 month ago +162

    I need a 5090 for My 1080p monitor, like a fish needs a bicycle

    • @darkdraconis
      @darkdraconis 1 month ago +9

      Hey hey hey hey now!
      What kind of "fish-racist" are you?
      Does a fish not have the right to evolve into a bike-riding creature?
      Incredible, you anti-fishists!

    • @professorJorge11
      @professorJorge11 1 month ago +1

      @@darkdraconis it's a song bruh

    • @darkdraconis
      @darkdraconis 1 month ago +4

      @@professorJorge11 it's a joke brah

    • @CeceliPS3
      @CeceliPS3 1 month ago +4

      Let me teach you, professor. There's this thing called DLDSR. You can render games at 1440p or 1620p, maintain high-af FPS, and still get much crisper, cleaner graphics on your 1080p monitor. Sure, a 4090 user (or even a 5090 user) could do with a 1440p monitor, but your analogy is entirely wrong in this case, as there is a use for those GPUs with a 1080p monitor.

    • @professorJorge11
      @professorJorge11 1 month ago

      @@CeceliPS3 I have a Radeon 7600. There's no DLSS, it's FSR2

  • @markusmitchell8585
    @markusmitchell8585 1 month ago +59

    Diminishing returns at this point. Games are unoptimized; that's the only reason these ridiculously stupid specs are needed.

    • @saliva1305
      @saliva1305 1 month ago +5

      so true

    • @justhomas83
      @justhomas83 1 month ago +3

      correct. I have an RTX 4080 and I am not upgrading for another 3 years.
      It's getting pointless

    • @saliva1305
      @saliva1305 1 month ago +1

      @@justhomas83 i plan to get a 50 series since im on a 3070 Ti, but maybe AMD is an option too.. sad we have to buy top-tier GPUs to play games that should run on a 3080 with no frame gen

    • @justhomas83
      @justhomas83 1 month ago +3

      @@saliva1305 I feel you, I just can't do it anymore. I have more bills coming in now and my daughter is graduating college.
      I understand though, you're right, the top tier is the only path now. We are chasing rabbits in Alice in Wonderland at this point 😔😔😔 damnit

    • @Koozwad
      @Koozwad 1 month ago +2

      yes, exactly - and the fact that RT/PT exists is there to make people $pend, $pend and $pend some more
      amazing graphics are possible without RT/PT - just look at RDR2 from, what, 2018(?)
      I've been saying for years that devs should use RT/PT as a TOOL to see how scenes should be lit, and then recreate them by NON-RT means, which would give players a ton of performance and be much friendlier to their wallets
      plus hand-crafting can look nicer

  • @EmblemParade
    @EmblemParade 1 month ago +101

    As a 4K/120 gamer I can promise you that we're still GPU limited with the 4090. I often have to compromise on AAA games by enabling DLSS 3 or lower settings, and sometimes just hit 60 FPS. At the same time, I do think upscaling is changing our requirements and expectations, so I hope the silicon can be optimized around that. We don't necessarily need more pixel shader performance if we assume upscaling. The die space is better spent on other features.

    • @StarkR3ality
      @StarkR3ality 1 month ago +4

      I can think of one title where you would be fully GPU limited on that card: Cyberpunk 2077's path tracing mode... and that game is an outlier and also extremely CPU heavy. I'm CPU bottlenecked in that title in certain areas on a 4070S, so I'd get your card looked at, because something's wrong there.

    • @lorsch.
      @lorsch. 1 month ago +9

      And to max out high end VR headsets these days a 6090 is probably not enough...

    • @ghostofreality1222
      @ghostofreality1222 1 month ago +4

      @EmblemParade - What CPU and RAM are you running? CPU and RAM have a lot to do with it as well. But I also agree with @starkr3ality - a 4090 at 4K 120 should be running fine in all AAA games, with a couple of exceptions like Cyberpunk or Microsoft Flight Simulator; a 4090 should be maxing out all AAA games at 4K 120Hz. You have got to be hitting a CPU bottleneck, and that is why you're having to lower graphics settings to get your desired results. Again, this is mostly assumption at this point, as I have no idea what CPU or RAM you're running, but this is what makes sense to me given what you stated in your post.

    • @EmblemParade
      @EmblemParade 1 month ago

      @@ghostofreality1222 Look at benchmarks from Hardware Unboxed, Gamers Nexus, and others, and you'll see that you are very far from the mark. I have a 5800X3D and high-end DDR4 RAM. I'm not saying I'm not having a great time with this setup, but forget about ultra-settings AAA at 4K without the help of DLSS. The 4090 is great, but 4K is a lot of pixels.

    • @blackcaesar8387
      @blackcaesar8387 1 month ago +5

      @@StarkR3ality cyberpunk is no longer an outlier... Alan Wake 2 made sure of that. I am guessing Hellblade 2 will further confirm it.
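
The "forget ultra at 4K without DLSS" point above comes down to internal render resolution. A sketch using the commonly cited DLSS 2/3 scale factors (treat the exact values as approximate; they are not taken from this thread):

```python
# What the GPU actually renders before upscaling to a 4K output,
# per commonly cited DLSS mode scale factors.

SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
```

So "4K DLSS Quality" is really a ~1440p render - which is exactly why it relieves the GPU but does nothing for a CPU bottleneck.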

  • @mchits9297
    @mchits9297 1 month ago +26

    [It's 2030; I go to buy an RTX 9090 with all of my savings.]
    Me: Hey, do you have the latest GPU, maybe an RTX 9090?
    Dealer: (goes inside and comes back with an 8-foot server rack) Here's your GPU, sir. Just $100k.

    • @akam9919
      @akam9919 1 month ago

      rah. it'll be a tiny quantum board... but you need to buy a giant freezer-sized cooler... and not like the fridge you have at your house... like a giant walk-in freezer. You will also have to pay $500K to turn it on, wait 2 days for the thing to get cold enough, and then spend $2.3M to run Crysis, $2.345 for Doom, and $76B for Fortnite... no, the game will not look more realistic.

    • @mchits9297
      @mchits9297 1 month ago

      @@akam9919 I might create physical sets for all those games with that kinda money 🤑💰

    • @mchits9297
      @mchits9297 1 month ago

      @@akam9919
      Virtual reality ❌
      Reality ✔️

    • @GamingXPOfficial
      @GamingXPOfficial 28 days ago +1

      This was somehow very hard to read/understand, but I got it in the end.

    • @LordKosmux
      @LordKosmux 23 days ago +2

      What if they get smaller instead? A GPU the size of your smartphone. And the price of a house.

  • @heyguyslolGAMING
    @heyguyslolGAMING 1 month ago +119

    I'll only consider the 5090 if it puts my house at risk of burning down. If it can't do that, then it's not powerful enough.

    • @ThePlainswalker13
      @ThePlainswalker13 1 month ago +13

      Nvidia Power Plug Engineer: "Hold my half-caf soy milk grande caramel macchiato."

    • @EdNarculus
      @EdNarculus 1 month ago +10

      I'm an overheating enthusiast myself and would like to see products that carry risk of spontaneous human combustion.

    • @murray821
      @murray821 1 month ago

      Easy, just put steel wool on it while playing

    • @chillnspace777
      @chillnspace777 1 month ago

      Just get a 14900KS, then you're good to go

    • @dieglhix
      @dieglhix 1 month ago

      it will be more efficient than a 4090, which can be power-capped at 70% while running at 98% performance.. meaning a 5090 will be able to run at its full potential at lower than 300W. Stock power is already too much.
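
Taking the commenter's numbers at face value (a 70% power limit on a 450 W 4090 retaining 98% of performance - their claim, not a measured result), the implied perf-per-watt gain works out as:

```python
# Perf-per-watt arithmetic for a power-capped card, using the
# commenter's figures as assumptions.

stock_watts = 450      # 4090 stock board power
cap_fraction = 0.70    # power limit from the comment
perf_fraction = 0.98   # retained performance from the comment

capped_watts = stock_watts * cap_fraction
gain = perf_fraction / cap_fraction

print(f"capped draw: {capped_watts:.0f} W")    # ~315 W
print(f"perf/watt vs stock: {gain:.2f}x")      # 1.40x

# The cap itself is set with nvidia-smi, e.g.:  nvidia-smi -pl 315
```

A 1.4x efficiency gain from a driver setting is why undervolting/power-capping flagship cards is so popular.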

  • @JamesSmith-sw3nk
    @JamesSmith-sw3nk 1 month ago +16

    There is always a bottleneck in a PC, depending on the resolution and game settings. Doesn't matter if it's a $400 PC or a $4000 PC.

    • @ZackSNetwork
      @ZackSNetwork 1 month ago +1

      Not exactly; if your hardware pairs together well, the bottleneck can be the software and not the hardware.

    • @EspHack
      @EspHack 1 month ago +1

      that's why I aim for a monitor bottleneck

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 1 month ago +3

      Such a dumb argument; there is a difference between something BARELY hindering the performance of another part vs a HUGE hindrance.

  • @hoverbike
    @hoverbike 1 month ago +26

    the 4090 is still hugely GPU limited in games like MSFS 2020 in VR, and I expect 2024 to be utterly devastating in VR - and I'm all for it. We get closer and closer to Star Trek computer simulations every year.

    • @2drealms196
      @2drealms196 1 month ago +2

      Visually, yeah, a single flagship video card is getting closer and closer when it comes to rasterization. But the holodeck's virtual NPCs have GPT-5-or-higher-level AI, the physics simulations are orders of magnitude more realistic and computationally demanding, true path tracing without the need for any denoising, ultra-realistic animations. Latency is also an issue with LLM responses. So you'd need an entire futuristic datacenter's worth of power, with futuristic networking that provides exponentially quicker responses - even a single flagship video card from 2045 wouldn't be enough.

    • @numlock9653
      @numlock9653 1 month ago +3

      Actually, the CPU is definitely the bottleneck when using a 4090 in MSFS VR, in my experience. I have a 7800X3D and a 4090, using a Vive Pro 2 on highest settings, and no matter what graphical setting I change, it makes little difference to the frame rate; but if I lower traffic I get a huge boost, and that is all CPU-based calculation. Very poor CPU optimization, unfortunately.

    • @hoverbike
      @hoverbike 1 month ago

      @@numlock9653 The Vive Pro 2's res is very low

    • @hoverbike
      @hoverbike 1 month ago

      @@numlock9653 huh, well you must've either clogged up that CPU with poor BIOS settings, or your Vive Pro 2 just has subpar resolution.

    • @Myosos
      @Myosos 1 month ago

      ​@@numlock9653 get a better VR headset

  • @tomthomas3499
    @tomthomas3499 1 month ago +41

    50%, 70%, or even double the power of the 4090 - as long as it's not melting its connector, it's fine by me

    • @ZackSNetwork
      @ZackSNetwork 1 month ago +6

      I expect 60% faster rasterization performance and 2.5x better ray tracing, while drawing the same power.

    • @iansteelmatheson
      @iansteelmatheson 1 month ago +6

      @@ZackSNetwork exactly. efficiency is really underrated

    • @nicane-9966
      @nicane-9966 1 month ago +1

      gonna pack 2 of those filthy-ass connectors man lol

    • @nicane-9966
      @nicane-9966 1 month ago +1

      @@iansteelmatheson it's not; it's just that those very high-tier cards are made to draw the maximum amount of power possible. For efficiency you have the 80s and below.

    • @aberkae
      @aberkae 1 month ago

      @@iansteelmatheson It's the 4N node - how much efficiency can they squeeze out of a similar node 🤔

  • @chrisguillenart
    @chrisguillenart 1 month ago +5

    Ampere didn't have price cuts of any kind; it maintained the price hikes of Turing.

  • @eugkra33
    @eugkra33 1 month ago +19

    Alex said that especially with RT it'll be CPU bound, because of the BVH workload. But what the next generation could offer is moving BVH maintenance to the GPU, alleviating a whole bunch of CPU work.

    • @Hi-levels
      @Hi-levels 1 month ago +1

      NPUs will also come to the rescue, either on GPUs or CPUs

    • @yesyes-om1po
      @yesyes-om1po 1 month ago

      @@Hi-levels I don't think that has anything to do with RT's BVH workload; the only thing an NPU could do better is AI denoising, but Nvidia already has dedicated hardware for that on RT cards, and I'm pretty sure an NPU as a separate piece of hardware would introduce too much latency for real-time denoising.

    • @BaieDesBaies
      @BaieDesBaies 1 month ago

      RT is so GPU intensive that I don't see how it could make games CPU limited.
      If I activate RT in games, CPU load tends to drop because the GPU is struggling.
      I have an i5 and a 3080
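
The BVH maintenance mentioned at the top of this thread is worth a concrete picture: when animated geometry moves, every ancestor bounding box must be refit each frame before rays can be traced, and that pass has traditionally run on the CPU. A toy sketch (minimal illustration, not engine code):

```python
# Bottom-up AABB refit over a tiny bounding volume hierarchy: the kind of
# per-frame CPU work the comment proposes moving onto the GPU.

from dataclasses import dataclass, field

@dataclass
class Node:
    lo: list                       # AABB min corner [x, y, z]
    hi: list                       # AABB max corner [x, y, z]
    children: list = field(default_factory=list)

def refit(node: Node) -> None:
    """Recompute each internal node's box from its children, leaves up."""
    if not node.children:
        return
    for c in node.children:
        refit(c)
    node.lo = [min(c.lo[i] for c in node.children) for i in range(3)]
    node.hi = [max(c.hi[i] for c in node.children) for i in range(3)]

# Two leaves under one root; moving one object forces the root box to grow.
a = Node([0, 0, 0], [1, 1, 1])
b = Node([2, 0, 0], [3, 1, 1])
root = Node([0, 0, 0], [3, 1, 1], [a, b])

b.lo[0], b.hi[0] = 5, 6            # the object moved along x this frame
refit(root)
print(root.lo, root.hi)
```

Scale that refit to hundreds of thousands of animated triangles per frame and the CPU cost of ray tracing stops being surprising.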

  • @Torso6131
    @Torso6131 1 month ago +55

    I mean, 4K120 has to be on the table for most games. Even 4K90 - throw on some DLAA and call it good. Aside from totally broken PC ports, I feel like we're still GPU limited 99% of the time, especially if you have something like a 7800X3D.

    • @StarkR3ality
      @StarkR3ality 1 month ago +4

      Depends on the res, like you've said, but in a lot of the games Alex mentioned and others, I'm bottlenecked on a 5800X3D with a 4070 Super at 1440p in a lot of modern titles - which is crazy, right?
      For me, the only use case for 90-class and even 80-class GPUs is if you're going to be using 4K and ray tracing in every title available. You're getting GPUs like the 4090 that are doubling performance gen on gen, and then you get, what, a 20% increase in perf going from a 5800X3D to a 7800X3D?
      CPUs cannot keep up, and it's really starting to show.

    • @WhoIsLost
      @WhoIsLost 1 month ago

      @@StarkR3ality something must be wrong with your PC if you're getting bottlenecked with that hardware. The 5800X3D is only 12% slower than the 7800X3D at 1440p

    • @StarkR3ality
      @StarkR3ality 1 month ago

      @@WhoIsLost Defo not, pal, I can assure you. I still get great performance, and I'm not talking about every title, only the recent big ones: Baldur's Gate 3, Cyberpunk, Witcher 3 next-gen.
      My point is, if I'm CPU bottlenecked at times in some titles, what's a 4090 gonna be, which I think offers 2.5x the performance? Not to even mention the 5090.

    • @JBrinx18
      @JBrinx18 1 month ago

      @@StarkR3ality The 4090 is only ~60% stronger than a 4070 Super. But yes, CPU performance is an issue... I think 4K will be a problem, and there's just not a market for 8K... The only avenue available might be VR

    • @Plasmacat91
      @Plasmacat91 1 month ago +2

      @@WhoIsLost Negative. I have a 5800X3D and a 6900 XT and am CPU limited most of the time at 1440p.
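
The back-and-forth above is really about frame times: each displayed frame costs roughly the larger of the CPU's and GPU's per-frame work, so a faster GPU stops helping the moment the CPU becomes the slower of the two. A sketch with made-up numbers for illustration:

```python
# Simple bottleneck model: frame rate is set by whichever processor
# takes longer per frame. All frame times below are hypothetical.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective frame rate when CPU and GPU work overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0   # hypothetical CPU frame time -> a 125 fps ceiling

print(f"GPU at 10 ms:  {fps(cpu_ms, 10.0):.0f} fps")   # GPU bound: 100 fps
print(f"2x faster GPU: {fps(cpu_ms, 5.0):.0f} fps")    # CPU bound: 125 fps
print(f"4x faster GPU: {fps(cpu_ms, 2.5):.0f} fps")    # still 125 fps
```

Doubling the GPU again buys nothing once its frame time drops below the CPU's - which is exactly the worry about pairing a 5090 with current CPUs.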

  • @gnoclaude7945
    @gnoclaude7945 1 month ago +6

    I'll ride my 4070 Super until next-gen consoles drop. That's when the jump to AI NPUs for gaming and other new features will push me to upgrade. Honestly, it's overkill for 1440p in the games I play at the moment.

  • @holyknighthodrick5223
    @holyknighthodrick5223 1 month ago +6

    Parallelism is badly needed in many game engines, and modern consumer CPUs really need more cores. Single-core performance uplift is too small to keep up anymore. More game engines need to adopt the strategy of running tasks in parallel, instead of just using a render thread, lighting thread, logic thread, etc. Easier said than done, but it is the only real way forward.
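
The shift described above - from one fixed thread per subsystem to many small jobs on a worker pool - can be sketched minimally. This is a toy illustration only: real engines use native job systems with dependency tracking, and CPython's GIL limits true CPU parallelism for threads.

```python
# Task-based parallelism sketch: per-frame work is split into small
# independent jobs and handed to a worker pool, instead of being pinned
# to dedicated render/lighting/logic threads.

from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-entity jobs a frame might schedule.
def animate(entity):  return ("anim", entity)
def cull(entity):     return ("cull", entity)
def ai_step(entity):  return ("ai", entity)

entities = range(4)
jobs = [(fn, e) for e in entities for fn in (animate, cull, ai_step)]

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lambda job: job[0](job[1]), jobs))

print(f"{len(results)} jobs completed across the worker pool")
```

The payoff of this shape is that adding cores adds workers; the fixed-thread design caps out at however many subsystems the engine happens to have.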

  • @LeoDavidson
    @LeoDavidson 1 month ago +19

    Until it does triple 4K at 240Hz without DLSS or frame generation, there's always room for more. :) Whether there's enough of a market for that outside of the sim-racing and flight-sim niches, I don't know, but I'd buy one.

    • @clockworklegionaire2135
      @clockworklegionaire2135 1 month ago +3

      That's never happening, with new features like RT and PT constantly coming out

    • @fcukugimmeausername
      @fcukugimmeausername 1 month ago

      Battlemage will do this.

    • @clockworklegionaire2135
      @clockworklegionaire2135 1 month ago +6

      @@fcukugimmeausername You are clearly out of your mind

    • @xpodx
      @xpodx 1 month ago

      8K 144Hz will come out soon; we'll definitely need more power. Never enough.

    • @Sota_eth
      @Sota_eth 1 month ago +1

      64k even

  • @Boss_Fight_Wiki_muki_twitch
    @Boss_Fight_Wiki_muki_twitch 1 month ago +47

    Considering the 5090 will be $1700 at least, it needs to be twice as fast as the 4090 to be worth it.

    • @SuperSavageSpirit
      @SuperSavageSpirit 1 month ago +6

      Rumor is it's 70%.

    • @Chuck15
      @Chuck15 1 month ago +14

      twice? 🤣🤣🤣🤣

    • @squirrelsinjacket1804
      @squirrelsinjacket1804 1 month ago +8

      A 70% raw performance improvement, along with a better version of frame gen to boost frame rates even more than the 40-series version, would be worth it.

    • @daniil3815
      @daniil3815 1 month ago +4

      that's a weird argument.

    • @dpptd30
      @dpptd30 1 month ago +9

      I don't think so; it will likely be above $2000 due to them using the same node as Ada and Hopper. Their data center B100 is already twice as large as the H100 in order to get a decent performance uplift on the same node, and twice as large on the same node should mean twice as expensive, especially when they still aren't using chiplets.

  • @Monsux
    @Monsux 1 month ago +17

    I will just use DLDSR + DLAA on a 4K 120Hz TV/monitor. The CPU won't be the limiting factor, and I'm always getting a graphical upgrade. Add path tracing with maxed-out settings and the GPU (even an RTX 5090 Ti Super) would scream for help. I just love DLDSR and how versatile it is for all types of games… Doesn't matter if I'm playing new or older titles.

  • @blast_processing6577
    @blast_processing6577 21 days ago +1

    At this point, graphics cards are a side hustle for Nvidia's primary business: AI infrastructure.

  • @vexun11
    @vexun11 29 days ago

    Will the i9-14900K be able to work well with a 5090, even at 35000 watts?

  • @ijustsawthat
    @ijustsawthat 1 month ago +8

    Not if it burns your whole house down.
    Can't they design a better connector instead?

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 1 month ago +1

      Can you have more strength than a 12-year-old virgin and properly seat the connector?

    • @ijustsawthat
      @ijustsawthat 1 month ago

      @@user-rt4ct6tq3r if you think it's force-related, you need to watch/read more on the topic. Gamers Nexus did a full investigation of these connectors and clearly demonstrated how they are flawed by design.

    • @rambo9199
      @rambo9199 1 month ago +5

      @@user-rt4ct6tq3r Has this ever been a problem in the past for you? Are you speaking from experience?

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r 1 month ago

      @@rambo9199 deflect more, virgin boy.. wanna see a video of my 4090 I've had since launch working perfectly fine? I bet you don't 😂

    • @Oliver-sn4be
      @Oliver-sn4be 18 days ago

      @@user-rt4ct6tq3r what if you move your PC and it comes out a bit, hmm? It's not like you always open the case to check; there is always a chance of it, and most of all, even if you put it all the way in, it can still burn 🔥 there is never a 100% guarantee that it won't 😂

  • @KontikiChrisSK8FE
    @KontikiChrisSK8FE 26 days ago

    I would honestly not mind if it were too fast, or even more power efficient with the new architecture, because I would mainly use a card like that for rendering purposes.

  • @chrissoucy1997
    @chrissoucy1997 1 month ago +21

    GPUs are getting faster at a pace that CPUs can't quite keep up with. I have an RTX 3090 paired with a Ryzen 7 5800X3D, and in games with heavy ray tracing like Spider-Man Remastered, my 3090 is CPU limited even at 1440p, and to some degree at 4K. I am itching for a CPU upgrade right now way more than a GPU upgrade; my 3090 is still fine. I am looking to upgrade to a 9800X3D when it comes out.

    • @vitordelima
      @vitordelima 1 month ago +9

      Stupid methods of rendering that move too much data around, need to rebuild complex data structures all the time, ...

    • @StreetPreacherr
      @StreetPreacherr 1 month ago +5

      It sounds like the game engines aren't designed 'properly'... Can't the (RTX) GPU handle most of the processing necessary for high-quality ray tracing? I didn't realize that even with RTX, ray tracing was most often restricted by CPU performance! Isn't the GPU supposed to be doing all the additional ray tracing processing?

    • @vitordelima
      @vitordelima 1 month ago +4

      @@StreetPreacherr There are a lot of intermediate steps that need a lot of CPU; in the case of ray tracing, the spatial subdivision seems to be one of them, for example. Uncompressed assets, assets with excessive detail, poor code parallelism... are other common causes of bottlenecks.

    • @ZackSNetwork
      @ZackSNetwork 1 month ago

      That's because Ampere GPUs handled data weirdly. The 4070 Super is way faster than a 3090 at 1080p, and faster at 1440p as well.

    • @mackobogdaniec2699
      @mackobogdaniec2699 1 month ago +3

      Spider-Man is a very specific game; it is heavily CPU-limited (but at high fps, not like DD2), but it is an exception. It is hard to find visible CPU bottlenecks in most games at 4K, with a 4090 and a top CPU (or even a new mid-range one, or something like the 5800X3D).
      If we're talking about RT, I think it differs heavily from game to game. It's very CPU demanding in Spider-Man Remastered or especially Hogwarts Legacy, but not at all in CP2077.

  • @MarcReisSyllogism
    @MarcReisSyllogism 25 days ago

    VR performance would be one thing: devices like the Pimax 8K and 12K HMDs can sure burn some GPU time (and CPU). And of course AI - maybe an AI API for gaming use - plus the mooted PhysX return, and ever more RTX (even in VR).

  • @garethperks7032
    @garethperks7032 1 month ago +3

    Thankfully AMD has opened a door for us with X3D. CPUs finally start becoming useful for gaming with ultra-high-speed memory (e.g. on-die cache).

  • @Zapharus
    @Zapharus 1 month ago +1

    "Hi guys exclamation point"
    LOL Dafuq! That was hilarious.

  • @dystopia-usa
    @dystopia-usa 1 month ago +3

    Once you hit certain quality-of-experience performance thresholds in gaming, it doesn't matter and becomes overkill for the sake of giggles. It only matters to professional benchmarkers and internet braggarts.

  • @British_Dragon-Simulations
    @British_Dragon-Simulations 1 month ago

    My RTX 4090's GPU usage percentage is all over the place in 4K and in the Pimax Crystal (2x4K) with an i9-12900K.
    It never stays at 99% or even 98%.
    It usually varies from 100% to 80% constantly in 4K.
    My RTX 3080 usually stayed at 99% to 98% in 4K.
    In GPU-Z I'm now limited by voltage, and the limiting-factor line is always blue instead of green, even when I overclock my P-cores to 52 and E-cores to 40.
    My next PC upgrade may need to be the 14900KS.

  • @powerpower-rg7bk
    @powerpower-rg7bk 1 month ago +1

    The thing I'd be hoping for on an RTX 5090 would be two 8-pin power connectors and a return to more sane power consumption. More/faster VRAM would be nice too, as I feel that has been the RTX 4090's bottleneck, especially at 4K.
    Other grab-bag features would be the return of NVLink/SLI support to scale up via multi-GPU, and integrating some Thunderbolt 5 controllers. It'd be nice to be able to plug a USB-C monitor directly into the GPU without external cabling and get full USB support on the display and other peripherals connected to it. Similarly with Thunderbolt 5, it'd be clever to include a mode where you could use the GPU externally with a laptop without the need for a Thunderbolt bridge board in an external chassis: literally just the card, a power supply, and a power switch to turn it on. The PCIe slot connector would go unused.

  • @CrashBashL
    @CrashBashL 26 days ago +1

    That's why we need an ARM/RISC architecture as soon as possible.

  • @odiseezall
    @odiseezall 29 days ago +1

    it's all about VR.. the 50x0 series will be the first that's really capable of fully immersive, high-quality, hi-res VR.

    • @Skrenja
      @Skrenja 19 days ago

      Yep. PCVR is where these overkill cards will shine.

  • @phizc
    @phizc 1 month ago

    Cyberpunk 2077 with path tracing in VR using VorpX/Luke Ross with a render resolution of ~10000x5000 at 90+ Hz would probably break the 5090 Ti Egregious Super too. Maybe when 8090 TiTi Super Duper comes around.

  • @Vincornelis
    @Vincornelis 1 month ago +3

    The CPU side is interesting, because CPU limitations seem to be universal across all modern CPUs. Pretty much every major release will either run perfectly fine on the now-iconic 3600, or, if it struggles on that, it struggles on everything. Having a faster CPU with more threads seems not to make much of a difference at this point. Most modern CPUs don't look in any inherent danger of being underpowered for the job. It's just game developers struggling to get to grips with multithreading, and that affects all modern CPUs pretty much equally badly.

    • @bricaaron3978
      @bricaaron3978 1 month ago +1

      *"It's just game developers struggling to get to grips with multithreading..."*
      No, it's just that ever since the Great Consolization of 2008, all AAA games have been designed and coded for console HW.
      Current consoles have only 6 cores available to games. But further, those cores are considerably less powerful than even the cores of an 11-year-old 4770K.

    • @orlandoluckey5978
      @orlandoluckey5978 25 days ago

      @@bricaaron3978 cap, the PS5 and new Xbox both have CPUs equivalent to a Ryzen 7 3700X. It features 8 cores and 16 threads. idk where you got your info but it's wrong my guy

    • @orlandoluckey5978
      @orlandoluckey5978 25 days ago

      @@bricaaron3978 running at 3.8GHz

    • @bricaaron3978
      @bricaaron3978 25 days ago +1

      @@orlandoluckey5978 *"cap, ps5 and new xbox both have cpu's equivalent to a ryzen 7 3700x. it features 8 cores and 16 threads. idk where you got your info but its wrong my guy"*
      I repeat: Both the PS5 and the Xbox Series X have only 6 cores available to games, just like the PS4 and Xbox One.
      Each of those cores has significantly lower FLOPS than a 4770K from 2013.

  • @smurfjegeren9739
    @smurfjegeren9739 1 month ago

    I just hope I'll be able to afford a 5000-series card. And be able to fit it into my tiny case

  • @altaresification
    @altaresification 1 month ago

    I wonder if an ASIC on the GPU would be able to compile shaders on the fly, provided a new API is exposed for that.

  • @FMBriggs
    @FMBriggs 1 month ago +1

    I love questions like this because they assume on some level that large companies (Nvidia, AMD, Microsoft, Intel etc) wouldn't be thinking about bottlenecks or be actively working on developing new ways to utilize cutting edge hardware.

  • @Goblue734
    @Goblue734 1 month ago +25

    I have an RTX 4090; the only way you would see me with a 5090 is if we can get 4K visuals with RT and at least 60 FPS of rasterized performance, no frame gen or DLSS.

    • @SafMan89
      @SafMan89 1 month ago +10

      You're in the 0.1% who upgrade top-end GPUs every generation

    • @ZackSNetwork
      @ZackSNetwork 1 month ago +5

      What games do you play? I can do that already with my 4090.

    • @JeremyFriebel
      @JeremyFriebel 1 month ago

      ​@@ZackSNetworksame but 4080

    • @kevinerbs2778
      @kevinerbs2778 1 month ago +4

      never going to happen in the next 3 years. RT takes about 100x more computational power than rasterization does, and most RT relies on rasterization as a base. Expect Blackwell to be only 20%-30% faster than an RTX 4090 at most; unless Blackwell comes out with a massive 224 ROPs or more, it isn't going to be that fast.

    • @aberkae
      @aberkae Місяць тому

      DLSS set to DLAA is where it's at though imo. Better than TAA at native resolution.

  • @kathleendelcourt8136
    @kathleendelcourt8136 Місяць тому +10

    People getting GPU limited in 95% of their games: ...
    The same people getting CPU limited in the remaining 5%, of which only 1% actually results in a sub 100fps framerate: OH NO I'M CPU BOTTLENECKED!!

    • @GabrielPassarelliG
      @GabrielPassarelliG Місяць тому +1

      There are ways to spend the extra GPU power, like on monitors with higher resolution and refresh rate. And if you care a lot about a specific game where a CPU bottleneck is a thing, then invest more in the CPU and less in the GPU. Not hard, given that high-tier GPUs cost multiples of high-tier CPUs.

    • @johnnymosam7331
      @johnnymosam7331 Місяць тому

      Yeah pretty much. CPU rarely bottlenecks unless you're playing an RTS.

    • @FantasticKruH
      @FantasticKruH Місяць тому

      Not to mention that 4090 and 5090 are 4k cards, even older cpus rarely bottleneck the 4090 on 4k.

  • @R-yb6xt
    @R-yb6xt Місяць тому

    high end VR-oriented features? not melting?

  • @DefinitelyNotPedro
    @DefinitelyNotPedro Місяць тому +2

    Honestly maybe a slight increase in performance from a 4090 but with much more efficiency would be awesome. Imagine 4090 level performance at only 200 watts for example, it would be crazy!

    • @ZackSNetwork
      @ZackSNetwork Місяць тому +1

      Dude, then just get a 5080. The 5090 will have 60% more rasterization performance and 2.5x better ray tracing than the 4090, while pushing the same amount of power. It’s called “performance per watt”, not “low watt output”.

    • @DefinitelyNotPedro
      @DefinitelyNotPedro Місяць тому +1

      @@ZackSNetwork im not going to get either, i was commenting on what i think would make a good product.

    • @lharsay
      @lharsay Місяць тому

      That might happen in 2 or 3 generations, not in one. The 4060Ti just reached the 2080Ti's performance under 200W but the 2080TI was a 350W card at most, not 450W.

    • @DefinitelyNotPedro
      @DefinitelyNotPedro Місяць тому

      @@lharsay maybe! Im just speculating here, the 4060 has around the same performance as the 3060 but uses 50% less power

    • @user-rt4ct6tq3r
      @user-rt4ct6tq3r Місяць тому

      Imagine undervolting... derp. My 4090 runs at 3GHz with an undervolt, and in most games I'm at 300 watts or less; peaks are 350-ish.

  • @jaredangell5017
    @jaredangell5017 Місяць тому +2

    The 9800x3d will be able to handle it. Nothing else will though.

  • @nephilimslayer
    @nephilimslayer Місяць тому

    AAA titles at 4K with RT on will be the sweet spot for the 5090

  • @xXmobiusXx1
    @xXmobiusXx1 Місяць тому

    The problem is x86: no matter how low-level your API is, you still have to run everything through the CPU due to how the PCI bus works. Basically we would need a bypass of some sort, akin to what AGP did.

  • @jackpowell9276
    @jackpowell9276 Місяць тому

    I mean, with more GPU power comes more scope to add VFX, textures, higher resolutions, frames, VR etc.

  • @deathstick7715
    @deathstick7715 Місяць тому +1

    What are they going to do after the 9090 comes out? Will it be the 10k90?

    • @Alp577
      @Alp577 Місяць тому +1

      Probably go back to small numbers, like how AMD did it with their HD 7970 > R9 290 > R9 390 etc.

  • @doityourself3293
    @doityourself3293 17 днів тому

    The optical GPU is just around the corner. So is the optical CPU, 10,000 times faster than what we have now.

  • @vertigoz
    @vertigoz Місяць тому +1

    What I want is a 240Hz 4K CRT

  • @diggler64
    @diggler64 Місяць тому

    I play DCS and MSFS 2020 with a 12900K and 4090 in VR, and I can get the fps I want; then it's all a matter of turning knobs for how much visual quality you want, and visual quality makes you GPU limited. So... for me a 5090 would probably help.

  • @mkreku
    @mkreku Місяць тому +1

    I use an RTX 3090 right now and I would actually be interested in upgrading if they would start making mini versions of GPU's again. The 40XX series (and AMD's 7900 series) are all gigantic and since I build ITX rigs, they're just not interesting to me. But imagine if they used the smaller processor nodes to keep the performance the same and instead used the node advantage to shrink GPU's. One can dream.

    • @Katastra_
      @Katastra_ 28 днів тому

      Part of the reason why I went with the NR200 for my first itx build. I still remember seeing someone with a 4090 in theirs lol makes my 3080 ftw3 look tiny

  • @johndavis29209
    @johndavis29209 Місяць тому +3

    Rich is a gem.

  • @RafaelSilva-yv3oh
    @RafaelSilva-yv3oh Місяць тому

    Can it run Cyberpunk at 480hz 1440p? Cause that's gonna be my next monitor.

  • @krz9000
    @krz9000 12 днів тому

    There is no problem keeping a GPU busy with path tracing. More samples... more bounces... until we reach unbiased territory

  • @mushroom4051
    @mushroom4051 Місяць тому

    Lazy optimization makes hardware upgrades a must. Look at MGSV: the Fox Engine can run on old CPUs.

  • @capslock247gaming9
    @capslock247gaming9 Місяць тому

    I have a Samsung Odyssey G8 4K 240Hz. I'm running an i9 14th gen with a 3090 Ti. I'm hitting 140 fps on ultra settings in story games, and in shooters like CoD and Apex on low settings I peak at 240 fps. And once you get used to 4K, good luck going back to a lower resolution.

  • @SuprUsrStan
    @SuprUsrStan Місяць тому +1

    Just get a G9 57" monitor or any other 4K+ monitor. You'll instantly be GPU bound again.

  • @tommyrotton9468
    @tommyrotton9468 Місяць тому

    Maybe not, if the GPU of the 5090 is used to take AI calculations off the CPU, like it did with PhysX. It could well max out the Hz at 1080p and 1440p, but 4K is getting to 144Hz

  • @johndzwon1966
    @johndzwon1966 Місяць тому

    CPUs only bottleneck at low resolution/high refresh rates. Just means that when it's time to upgrade the CPU, I won't have to worry about updating the GPU (future proofing).

  • @ymi_yugy3133
    @ymi_yugy3133 Місяць тому

    Improvements are probably gonna be modest, both in software and in hardware.
    More interpolated frames is not bad, but it's in the marginal gains category.
    I see a couple of fronts where they could make progress, though most of this probably won't come with Blackwell:
    Generated predicted frames. Instead of interpolating between real frames, the next frame is simply predicted. This gets rid of the latency issue.
    A more integrated neural rendering approach.
    Lots of deep-learning-powered NPCs in games.

  • @0wl999
    @0wl999 21 день тому

    '4K screens coming online now...' rofl, as if they haven't already been out for a year or two.

  • @user-qv2wd2jc6m
    @user-qv2wd2jc6m 15 днів тому

    Nice conversations, but did they really answer the question? I think the user was asking how the most powerful consumer graphics card money can buy can really enhance gaming experiences when we also need equally powerful CPU performance to make that happen, and the CPU often ends up choking the GPU.
    I mean, I can name a plethora of things video games are lacking today that REALLY need improvements compared to what we have had for the past decade or two: real-time physics (this will always need improvements), real-time water physics (extremely held back by limitations in CPU calculations), 3-D volumetric special effects (I believe this is both GPU and CPU focused), as well as robotic NPCs that pull you out of immersion (heavily CPU dependent as well). Point being, we can have these powerful GPUs, but if video game experiences don't improve, we are stuck with the same mediocre gameplay with shiny polished graphics at 556 FPS.
    Also VR, to me at least, IS the absolute FUTURE OF GAMING: being able to actually transport yourself into these incredible worlds. And that requires much more powerful hardware, both GPU and CPU. My hope is that solutions are found for these concerns, allowing developers to have more fun creating games and not be limited constantly by hardware.
    PS5 was really exciting when it released because they saw SSDs bottlenecking a lot of freedom for developers and found a solution that even pushed out into the PC SSD market, since most PC SSDs at the time weren't hitting the speeds the PS5 was because of I/O throughput issues. My hope is that Cerny and his amazing team come up with bottleneck solutions for improving CPU performance, allowing more efficient multithreaded performance in games, that stretch out into the industry and become standard, you know?
    And I imagine AI will also assist in numerous ways with improving performance and user experience, and with helping developers make games more polished and advanced. SO exciting to think about...

  • @Cblan1224
    @Cblan1224 11 днів тому

    Can't wait for games to be even less optimized and just throw a 5070 ti at the recommended specs

  • @viktorianas
    @viktorianas 9 днів тому

    CPUs are light-years ahead of GPUs at 4K res and above (VR).

  • @livingCorpe420blaze
    @livingCorpe420blaze Місяць тому

    Games used to be GPU bound; over the past few years they've become both CPU and GPU bound. Impressive regression in how games are developed.

  • @HielUFF
    @HielUFF Місяць тому

    I have no idea why they are talking about 1080p. It is a GPU designed for 4K or more, and at those resolutions it is very clear it will not have problems with current-generation processors, even mid-range ones like a 7600 or an i5. Hi from Venezuela!!

  • @InternetOfGames
    @InternetOfGames 22 дні тому

    I'm getting ready to add a new CPU to my 4090 before adding a 6090 to my next CPU.

  • @boedilllard5952
    @boedilllard5952 Місяць тому

    I'd love to see less power draw (not gonna happen), a lower price (not gonna happen), thinner (not gonna happen), so that leaves about 4 times the ray tracing performance, which is probably not gonna happen either.

  • @kaslanaworld4746
    @kaslanaworld4746 Місяць тому +1

    well the 9800x3d should be releasing soon once the 5090 launches

    • @Icenfyre
      @Icenfyre 3 дні тому

      It's not enough. CPU tech is way behind GPU tech

  • @BlueRice
    @BlueRice Місяць тому

    Any performance gain from the CPU and GPU is always a gain. Now the age-old question: how much are people willing to pay for that little (or large) gain?

  • @Fiwek23452
    @Fiwek23452 Місяць тому +1

    The current 4080 and 4090 are already bottlenecked by current CPUs; hyper-threading and E-cores are total bs

  • @CrystronHalq
    @CrystronHalq 16 днів тому

    The 7800X3D is rarely being used above 50-60% in games with the 4090 (at 4K).
    I don't think the 7800X3D will bottleneck it when it comes to gaming at all. It will still be an amazing CPU for it.

  • @OldMobility
    @OldMobility Місяць тому +1

    No it's not. Gaming in 4K with ray tracing, even with DLAA or DLSS, is the most beautiful graphics I've ever seen, and that's just with my 3090. With a 5090 I could turn off DLSS, go native, and still get spectacular FPS.

  • @0x8badbeef
    @0x8badbeef Місяць тому +8

    Frame-gen is not a solution. It is a work-around. Those with a 5090 won't have to use it.

    • @XZ-III
      @XZ-III Місяць тому +1

      I think you mean shouldn't have to use it

    • @OldMobility
      @OldMobility Місяць тому +2

      DLSS is magic in a lot of cases. I’ve played many games now that look better under DLSS quality with sharpening at 100%.

    • @numlock9653
      @numlock9653 Місяць тому +2

      Frame gen is specifically designed to boost performance in cpu limited games. Has little to do with lack of graphics performance.

    • @0x8badbeef
      @0x8badbeef Місяць тому

      @@numlock9653 It is not little. Asset quality is on the GPU. With frame gen you can render those higher-quality assets at a lower frame rate and have frame gen fill in the gap.

    • @Error-0x0194
      @Error-0x0194 Місяць тому

      ​@@numlock9653 every game I've played the GPU is 99% and the CPU is down around 30%. When it is the other way around the game is not made correctly.

  • @alexis1156
    @alexis1156 Місяць тому

    I think it's too hard to tell.
    On 4k probably not, lower res though, maybe
    But next gen of cpus is also looking great.

    • @GabrielPassarelliG
      @GabrielPassarelliG Місяць тому

      Who'd buy a 5090 to play at less than 4K?

    • @alexis1156
      @alexis1156 Місяць тому

      @@GabrielPassarelliG Probably no one i would say.

    • @FantasticKruH
      @FantasticKruH Місяць тому

      I really doubt it on 4k. 4090 is pretty much the bottleneck in 4k even if you have an older cpu.

  • @FreakyAndrew428
    @FreakyAndrew428 Місяць тому

    I wish my games would be CPU limited when I get an RTX 5090 to use on my Neo G9 S57 at 7680x2160 @ 240Hz 😅
    So here you have a real use case.

  • @superthrustjon
    @superthrustjon Місяць тому

    No joke, just cashed in about 1,000,000 Marriott points for over $3,000 in Best Buy gift cards 😂 getting ready for the 5090

  • @mahouaniki4043
    @mahouaniki4043 Місяць тому +2

    $2500 for +15% max over 4090.

  • @charizard6969
    @charizard6969 Місяць тому

    This is a non-issue at 4K when Ryzen 9000 comes out; it is set to improve gaming performance by 17.3% across the board

  • @RiderZer0
    @RiderZer0 Місяць тому

    I'd be most interested in how much quicker the 5090 renders in Blender than the 4090. My 3090 still takes forever on highly detailed projects.

  • @yourhandlehere1
    @yourhandlehere1 16 днів тому

    Frame gen is like margarine gaming.
    I like real butter.

  • @1stcrueledict
    @1stcrueledict Місяць тому

    The best current pairing for a 5090 will be the 7800X3D. It's only like 300 bucks, and the AM5 platform is gonna be around for a few more years, so we'll get a more suitable one at some point in the future. No use wasting the extra 300 on a 7950, or wasting 700 on having to replace an Intel board and CPU.

  • @mrmrgaming
    @mrmrgaming Місяць тому

    If they rush a 2024 launch, I wonder more about 4090 owners needing to upgrade. Normally there is some big, melts-your-PC (no pun) game that calls for the upgrade, but the only one I can see that might gain from a 5090 is Stalker 2. I would have thought mid-2025 would have been better.

  • @jaffaman99
    @jaffaman99 Місяць тому +2

    Enough of this now; it's becoming clickbait

  • @ScientificZoom
    @ScientificZoom 5 днів тому +1

    Instead of silicon why not opt for graphite?

  • @bamazeen
    @bamazeen 26 днів тому

    Way to bring the energy boys.

  • @hartyeah
    @hartyeah 28 днів тому

    I’m gaming on 7680x2160 samsung g9 57” with 4090 and 7800x3d. I sure hope I can get more fps with the 5090.

  • @Accuracy158
    @Accuracy158 8 днів тому

    My 4090 is already CPU limited in basically all games (and very CPU limited in many of my favorite games), but I play at 1440p.

  • @Jaggith
    @Jaggith Місяць тому

    We don't even have 4K, 5K, 6K, or 8K displays with the new DisplayPort 2.1 standard yet.

    • @xpodx
      @xpodx 18 днів тому

      They just made a 4k with dp 2.1

  • @francoisleveille409
    @francoisleveille409 Місяць тому +1

    A GeForce RTX 3060 is right at home with an old i7-4790, especially when playing games in 4K, so a 5090 would be right at home with either a 13th/14th-generation i9 or a Ryzen 9 7950X/7950X3D.

  • @kilroy987
    @kilroy987 28 днів тому

    Oh, I'm sure I can make a 5090 choke with PCVR.

  • @numlock9653
    @numlock9653 Місяць тому

    The problem still remains the lack of multithreaded optimization in games. Modern CPUs just can't fix the fundamental nature of games being programmed to rely heavily on a single thread. It would be in Nvidia's best interest to explore the use of AI to potentially solve this issue; otherwise there is little reason to scale much beyond a 4090 without a miracle in single-threaded CPU performance.

  • @yumri4
    @yumri4 Місяць тому

    Expecting game developers to write DX12 code so the CPU handles GPU tasks better has an issue: right now DX12 is a wrapper, and to do what they are talking about the wrapper would have to change, and that change would mean limited CPU support. Even CPUs in the same family might not be supported if it's done wrong and too much of the wrapper is removed, so the GPU can submit work to itself instead of going through a GPU-to-CPU-to-GPU call.
    There are so many ways this can be messed up; there is a reason the API was made. Yes, skipping it and coding directly for the CPU, which is part of what DX12 allows you to do, is quicker, but I think that was made for the datacenter, not the consumer. Like .NET, it contains both datacenter parts and consumer parts. In the datacenter you know the CPU model in every server, so you know what to target in the code. For games there are literally millions of possible CPUs that might be in the system, so saying the game will only work on this generation of CPUs will kill the game once that generation is no longer sold. Good idea for the server market, bad for the consumer market.
    The parts that go through the DX12 wrapper are slower than direct hardware access, but how much slower? 1 to 100 cycles, which isn't much when the CPU is usually waiting on data to be sent to it anyway, and the upside is the code doesn't have to target one specific CPU, just any x86 CPU. Every API has some slowdown.
    The RTX 5090 will most likely be bottlenecked by the code, not the CPU. Under a pure GPU load it will most likely be bottlenecked by PCIe gen 4 x16 bandwidth, not the CPU. Better code is needed, but schools teach code that is easily readable by humans, so compilers don't always produce the objectively best binary.

  • @lil----lil
    @lil----lil Місяць тому +2

    You know what's gonna happen? Nvidia be like: so you guys can't keep up with us, huh? We're gonna design our own CPU! That's _exactly_ what happened to Intel. Apple be like: get your $hit together or we're going our own way. Now we have M1/2/3 and soon M4. Watch Intel/AMD stocks TUMBLE when Nvidia announces their own CPU!!! AMD/Intel better get their $hit together.

  • @tryharder1053
    @tryharder1053 Місяць тому

    Black myth wukong and my 8K mini led TV enters chat

  • @0perativeX
    @0perativeX Місяць тому

    I don't think bottlenecking will be a problem, because by the time the 5090 is out, Intel's Arrow Lake CPUs will hit the shelves at more or less the same time. And of course AMD will release their next-gen CPUs in response.

  • @adityatomar8287
    @adityatomar8287 Місяць тому

    Someone needs to develop quantum computers fast.

  • @brkbtjunkie
    @brkbtjunkie Місяць тому

    Depends on your target framerate obviously. Also not everyone likes the DLSS reconstruction techniques specifically in motion. A 5080 is probably going to be my upgrade from a 3080.

  • @bartjandejong9412
    @bartjandejong9412 11 днів тому

    No. I use VR in ACC, and with my 4090 I still can't run 100% resolution. I need the 5090

  • @vonbleak101
    @vonbleak101 27 днів тому

    I play @1080p and have an i9 13900k and a 3080... I have no issues with any modern game and prob wont for another couple of years at least lol... The 5090 would be insane for me haha...