Knew a guy who bought a 100% top-of-the-line gaming PC around New Year's 2020, which he intended to use for 10 years. i9-9900KS, 2x Titan RTX, 64GB of DDR4-4266 CL19, and three 4TB SSDs. He paired it with both a 4K 144Hz and a 1440p 240Hz monitor, both 27". Spent $10,000 on the whole thing in the end.
The thing is, for 2025 he wants to upgrade the parts. It shows that no PC will last you 10 years if you want top-of-the-line gaming. And it's not that components get worse; it's that new titles get less optimized. Still, people would kill for a dual Titan RTX system even today: performance of a 4070 Ti with double the VRAM.
🙄🤡🤣
SLI isn't supported by any game from this decade and wasn't widely supported even by games from the latter half of the 2010s. Your friend is definitely not getting 4070 Ti performance, and even when SLI is running, it just doesn't work that way with the VRAM, because both GPUs have to load the same set of textures. The VRAM limit is still the same as a single GPU's, which is still a lot heading into 2025. A single Titan RTX will be truly obsolete only when you need more than 24GB. And a Titan RTX is most definitely not a 4K GPU in late 2024.
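To put the mirrored-VRAM point in numbers (a minimal sketch; 24GB is the Titan RTX's published capacity):

```python
# Why SLI doesn't pool VRAM: under alternate-frame rendering each GPU
# holds a full copy of the working set (textures, buffers), so the
# usable capacity is that of a single card, not the sum.
gpus = [{"name": "Titan RTX", "vram_gb": 24},
        {"name": "Titan RTX", "vram_gb": 24}]

installed_gb = sum(g["vram_gb"] for g in gpus)  # 48 GB on the spec sheet
usable_gb = min(g["vram_gb"] for g in gpus)     # still 24 GB in practice

print(f"installed: {installed_gb} GB, usable: {usable_gb} GB")
```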
What your friend could do with that setup is run Lossless Scaling and offload the overhead from scaling and/or frame gen to the second GPU. But your friend would be best served by selling the second Titan RTX to fund a platform upgrade and getting a far cheaper GPU if they want to offload the Lossless Scaling overhead. They could use something like a GTX 670 for that, because Lossless Scaling doesn't need modern DX feature-level support to run. I bought a 670 SC for $25 recently.
Nobody is going to kill for a dual Titan RTX system. Those GPUs, however, still go for $600+ on eBay, assuming you can find someone who wants to buy one. That is a lot of money for what you get, even if you are doing generative AI or AI training, when you can get a brand new 4070 Ti Super for the same price.
Spending $10k on a gaming PC is simply ludicrous. When I was 12, I talked my dad into spending about $4k on a top of the line PC (Pentium 90). It was in need of some major upgrades just 3 years later.
Another option for your friend is to set up the PC for AI timesharing as a source of passive income and just build a new premium gaming PC. If they have the kind of money to blow $10k on a ridiculous gaming PC, they could probably afford to build a $3k machine with a 9800X3D (soon to be released) and a 4080 Super. Or maybe even a 5080 (also releasing soon).
A Titan RTX is a 4070 Ti with double the VRAM? No! That's wrong on so many counts!
Titan RTX: 4608 CUDA cores, 576 Tensor cores, 72 RT cores, 24GB GDDR6, 384-bit bus, 280W TDP
RTX 4070 Ti: 7680 CUDA cores, 240 4th-gen Tensor cores, 60 3rd-gen RT cores, 12GB GDDR6X, 192-bit bus, 285W TDP
RTX 2080 Ti: 4352 CUDA cores, 544 Tensor cores, 68 RT cores, 11GB GDDR6, 352-bit bus
@@Lurch-Bot Damn, what a paragraph I'm ready to dismantle in a few sentences.
SLI? Games? Who TF was talking about gaming? You can still use multiple GPUs for so many things. Nobody is talking about gaming here.
There are videos online that show 2x 2080 Tis beating out a 3090, and dual Titan RTXs are a bit faster than that. The RTX 4070 Ti is a bit faster than an RTX 3090, with 12GB of VRAM. Benchmarks prove you wrong, have a good day.
@@AI.Musixia I'm talking on paper here. You aren't even understanding the point of the original comment. Come on, man, do better.
@@RobloxianX "It shows, no PC will last you 10 years if you want top of the line gaming"
4K and 1440p side by side is instantly noticeable for me. Even going from 4K back to 1440p, which I've done a few times, is very noticeable.
1440p on a 4K TV is like turning motion blur to the max.
This is the first video I've seen from Danny's channel and it is great! Nice touch going into detail about the specs of the PC and what you did to test the GPU.
Thanks so much! I try to be informative and helpful
I just got a 2060 KO for $100, refurbished from Micro Center. It's great for 1080p gaming.
Nice! I've used the KO models many times. They are sweet little cards, and you can't argue with that price.
I refurbished a tobacco tar stained 2060 KO recently for a build I'm selling. Most time consuming GPU refurb to date that didn't involve board repair - about 5 hours to get every last trace of tar off the GPU. I'm an ex-smoker so I am very sensitive to the smell these days and the GPU smells like a new GPU now.
During testing, I was genuinely surprised at how well it performed, having no trouble running Cyberpunk at 1440p60 with medium-ish settings with DLSS Quality. I also played FO4 at 1440p Ultra and it was able to maintain at least 90fps most of the time. Really don't want to go much above that anyway because it breaks dialogue and the ragdoll physics. FC6 ran at 60+ fps at high settings at 1440p with RT enabled.
I don't think there is a game in existence that can't run at 1440p on a 6GB GPU, at least at low settings. But that will change soon. I expect 6GB will be fine for 1080p for another 2-3 years at least. And, when it isn't, Lossless Scaling works amazingly well for scaling and frame gen, considering it is a $7 app. I don't really like AFMF because you can't cap it. Lossless Scaling also lets you dial in a custom scaling factor.
The 2060 KO wasn't one of EVGA's best efforts; they just threw on a 1660 Super cooler and it has thick ass thermal pads. You can improve VRAM cooling by putting thermal pads between the back of the board and the metal backplate, something I often do when refurbishing GPUs.
Turing is an amazing value for budget builds these days. I used an EVGA 2070 Super FTW3 for almost 3 years. What a beast! It is what got me on 1440p. Overclocked like a champ.
VERY impressive. Thanks, Danny! 🙏🏼👍🏼
No-BS video, I love it, I'm subbing.
I'm still running a 2080 Ti!!
My GTX 1070 Ti, refurbished in 2018, is still working lol
Will you test the 5090 when it comes out? I'm expecting native 4K ultra 120 fps with RT in modern AAA games. Is that too much to expect?
Expect 10-15% faster speeds than the previous generation. So I'd say that's probably too much
Possibly 100fps in that situation, but I could be wrong.
If we are talking Cyberpunk at 4K native, ultra settings with ray tracing, you can probably expect 120fps max on the 5090. That's my best guess.
@@scoddiboi 4090 is 50% faster than 3090
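For what it's worth, a quick back-of-the-envelope projection under both uplift guesses floated in this thread; the 60fps 4090 baseline is purely an illustrative assumption, not a benchmark:

```python
# Project hypothetical 5090 fps from an assumed 4090 baseline under
# the two generational-uplift guesses above. Illustrative numbers only.
baseline_4090_fps = 60.0  # assumed: 4K native ultra + RT in a heavy AAA title

conservative = baseline_4090_fps * 1.15  # "10-15% faster" scenario
optimistic = baseline_4090_fps * 1.50    # "50% faster, like 3090 -> 4090"

print(f"conservative: {conservative:.0f} fps, optimistic: {optimistic:.0f} fps")
```

Even the optimistic scenario lands at 90fps, still short of a locked 120fps target at these assumptions.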
If I can afford it….sure I’ll test it. Depending on the game, I think you’re expecting too high of performance gains from it.
Not too much to expect out of a $2,200-$2,500 card. It had better be able to do everything.
Still using my 8700/2080 rig. It's my 2nd PC now, but it still pulls off some very decent performance considering it's 6 years old now, mostly at 1440p or lower.
I recently upgraded from a Zotac Amp Maxx 2080 Ti to a Zotac 4070 Ti Super HoloBlack. I still had good framerates with it, but I knew its time was approaching, and now I have Black Myth: Wukong!!! AHEM, Fortnite runs well on potato PCs... (Remembers playing PUBG on a GTX 460 1GB)
Again, Fortnite depends on the settings. Ultra at 4K is quite demanding. Congrats on the upgrade too.
Made the same upgrade, mate. The 4070 Ti Super is like 4 times faster in some games. The 2080 Ti was getting very dated. Now I'm playing pretty much everything at 4K 120fps. The difference is night and day.
An AIO is wasted on a 5700X3D. It is powerful in terms of gaming performance, not in terms of power consumption. Only uses about 50W. I have a $17 Thermalright Assassin X 120 R SE on mine and it is plenty of cooling. Tops out around 70C under a stress test and haven't seen it go higher than 60C in gaming.
The cool thing about the 5700X3D other than the price is the fact it will make a noticeable difference to your gaming experience even on a budget GPU. It isn't so much about the extra frames, it is about the fact it provides a much smoother, more consistent gaming experience.
I upgraded to the 5700X3D recently, taking advantage of the Prime Day special. I patiently waited for 2 years for an X3D upgrade. I was waiting for the 5800X3D to get under $200 on the used market but decided to get a new 5700X3D instead. The one I got boosts almost as high as a 5800X3D anyway.
An 850W PSU is total overkill for this build. You could run this spec on a good 500W PSU and even crank the power limit to 120% on the GPU. Can't OC an X3D much.
Going overkill on your PSU just means a greater chance of a fire if you experience a hardware fault. This trend is why so many 12VHPWR connectors are melting and even catching fire. The protections in your PSU are designed to protect the PSU, not what is plugged into it, a common misconception. It is just like the circuit breakers in your home. There are many ways you can start a fire before you exceed the rating of the breaker and it trips.
Another thing worth pointing out is that 80 Plus ratings are efficiency ratings. They say nothing about the quality of the PSU and the differences in efficiency are too small for them to be relevant in a gaming PC. In fact, when mining Eth during the last boom, I mostly used white and bronze rated PSUs because the lower cost of the PSUs more than made up for the tiny bit more power I burned over the course of mining for a couple of years.
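That tradeoff is easy to estimate; the load, gaming hours, and electricity price below are illustrative assumptions, not measurements:

```python
# Rough yearly cost difference between a Bronze-class (~85% efficient)
# and a Gold-class (~90% efficient) PSU at a ~300 W gaming load.
load_w = 300.0            # assumed DC load while gaming
hours_per_year = 4 * 365  # ~4 hours of gaming per day
price_per_kwh = 0.15      # assumed electricity price, $/kWh

def wall_energy_kwh(dc_load_w, efficiency, hours):
    """Energy drawn at the wall, accounting for conversion losses."""
    return dc_load_w / efficiency * hours / 1000.0

bronze_kwh = wall_energy_kwh(load_w, 0.85, hours_per_year)
gold_kwh = wall_energy_kwh(load_w, 0.90, hours_per_year)

extra_cost = (bronze_kwh - gold_kwh) * price_per_kwh
print(f"Bronze costs ~${extra_cost:.2f} more per year at these assumptions")
```

A few dollars a year, which is why a cheaper, lower-rated unit can come out ahead if the price gap is large enough.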
I’d be interested to know how this compares to a Titan XP, of course with understanding of their fundamental differences like RT or DLSS support. The XP Empire edition will always be the best looking GPU of all time, minus the Star Wars branding.
A friend of mine had that one. It looked awesome!
Can the RX 7600 play 1440p?
It sure can! My Intel ARC GPU comparison has RX 7600 results in it for 1080p & 1440p ua-cam.com/video/vwctAerBbR8/v-deo.htmlsi=95MpaWFIyvVnNo0M
Absolutely! I even get some games playing at 60 fps with 4K. 1440p is the sweet spot though! High fps with high graphics. 4K has to be tuned to performance for fps games. That’s my experience though!
That GPU is $1000 on Newegg.
In terms of rasterization performance, it is similar to the RTX 3070, with a caveat... the RTX 3070 has only 8GB of VRAM, which means it cannot play many games at 4K without dialing down some settings. I dare say the RTX 4060 Ti 16GB is closer to the 2080 Ti, as it does not suffer from low VRAM.
The 4060 will smash it. I upgraded from a 2080 Ti to a 4070 Ti Super and it's literally 4 times faster in some situations: Cyberpunk path-traced at 4K gets 70 to 80 fps with DLSS Performance, and I can play 4K ray-traced with DLSS Quality at a locked 120fps. My 2080 Ti couldn't even do 4K ultra with DLSS Ultra Performance and medium ray tracing at 60fps; it would frequently drop into the high 30s. I get better frames maxed out on my new card. IMO, for about the same money, I'd guess the 4060 is about twice as quick. Plus, turning on ray tracing doesn't cripple the 40 series, whereas it is the unplayable button on the 20 series: you can max every setting with RTX off, but turning it on makes most games stutter fests.
@@1986misfits
Thank you very much for your detailed reply!!
"Latest triple-A games"
Proceeds to benchmark old games
You’re right. I should have just said the “most popular titles.”
Realistically, if you're budget buying, you don't intend on running Alan Wake at 4K high; that's just not a realistic use case in today's world. We know what we are stepping into.
You call that an old card? Please, I'm still using a 2060 in 2024.
A 6-year-old GPU vs 4K? Yes, it can. Some much older GPUs can handle 8K (30fps).
8k 30 on what? Tetris?
Minecraft, Terraria @@BigWisper
Had a 2080 Ti; it got 16 fps in Quake 1 RTX 😂😂 Same with Minecraft RTX 😂😂 Pretty sure it hasn't got the power for 8K Tetris. Maybe Pong 😂😂 but I still think it would struggle with that if they added ray tracing 😂😂 That card should have been named the RTX Off 2080 Ti, because there was not a single game I could enable it on at 4K.
Me remembering Fortnite came out in 2017. Pretty sure that's not a "today's 4K" anything.
Sure it is! They keep updating the game. Don’t knock it till you’ve turned on all the visual effects and stuff. It looks great! It’ll cripple most systems trying to run it in 4K
Fortnite is surprisingly demanding
@@Jayko33 with certain settings
@@DannysTechChannel Fortnite highest settings with RT is pretty demanding
6 year old GPU btw
lol thanks.
First! Yay! 🤗
Just sold my 2080 Ti and upgraded to a 4070 Ti Super. The difference is night and day. The 2080 Ti isn't even a great 1440p card, let alone 4K. Plus, ray tracing really doesn't work well with that series; it's basically a button that just ruins your fps for no visual uplift.
That's not true; there are multiple indie and older, less demanding games that can run at high settings on this classic card.
😂😂 A former £2000 heavyweight card can only run indie games at 1440p within 4 years of release. My mid-range 4070 is a much better investment, both now and in the future. Anyone trying to stick up for the 20 series either knows nothing about computers or is desperately trying to justify how stupid they were for buying one 😂😂
@@1986misfits Troll, there are multiple videos to prove you wrong. Not just indies; big AAA and non-demanding games all run fine. Look up the videos, you troll, laughing while exposing your delusional self. I also have a classic 3080, which liars like you said couldn't run games from 2022 to now, yet I can run multiple games at high settings at 1440p and even 4K with high fps. You guys are trolls and liars.
@@cmoneytheman Sorry dude, you watched a few videos so you must know what you're talking about 😂😂 I owned the card for 4 years and have real-world experience with it. It can play stuff at 60fps, just about: nothing with ray tracing, and not a lot at 4K without tonnes of upscaling making it appear no better than 1080p. In comparison to my current mid-range card, the thing was an absolute pile of doodoo. Now I can literally play 4K maxed out at 120fps on most stuff without upscaling, but if I ever need it I have the option to enable it. You know, the way it's supposed to work, instead of having to have it on just to be able to play. If you haven't felt the difference for yourself you wouldn't understand, but the 40 series is in a different league compared to that 2080 Ti. From barely even looking 4K at 60fps, using way too much upscaling to get there, to full native 4K maxed out at 120+ fps in pretty much all games. I even get 80/90fps path-traced in Cyberpunk. My 2080 Ti would probably get about 12fps in that scenario, to put that into perspective for you.
@@1986misfits Yeah, okay, so you don't know what you're talking about, troll.
When games hit 60fps at 1440p and 4K high settings, that's all that matters. So why are you here trolling, saying it can only run indie games, when there are multiple videos showing this card is a monster in indies, demanding games, and even unoptimized games from 4 years ago to now? The average gamer wants 60fps as a minimum, so your comment about 120fps doesn't even make sense. You haven't seen every video of every game, yet you make such a ridiculous comment. Oh, let me talk about a card and say it runs every game badly. Yeah, that makes a lot of sense, doesn't it?
120fps is not the standard; it never has been. It's an extra want, not a requirement. The only time 120fps and higher is the standard is for CS or other online shooters. The standard for story games has always been 60, and anything higher, as I told you, is an extra want, not a need. As I told you, I also have a 3080, which runs games great at high settings in native 1440p/4K, and I have a 120Hz screen. I can't run every game I play at a stable 120fps, so I cap it to 90 or 60, which is way more stable. So you're the one who doesn't know what he's talking about, you troll. Go ahead and laugh again at being exposed; that's what you like to do, don't you? For some odd reason, you laugh at nothing funny, exposing yourself.
Max settings have been dead for about eight years. There are multiple videos showing there is not a big difference from high settings. The only time max shows a big difference from high is when there are extra settings in games like Exodus, Doom, Lords of the Fallen (the new one), GTA, and others. So you're not even getting the best FPS-to-quality tradeoff at max; the biggest step up from medium is high. Bragging about max settings doesn't make sense either.