Hope this video was helpful, let me know if there are any other games you want me to test in future videos!
➡Time stamps⬅
0:00 - Intro
2:01 - Minecraft
5:18 - Rainbow Six Siege
7:48 - Warzone
12:36 - Overwatch 2
15:41 - Forza Horizon 5
18:16 - Fortnite
21:41 - Valorant
23:35 - Battlefield 5
26:10 - Apex Legends
27:55 - A Plague Tale: Requiem
29:41 - Cyberpunk 2077
32:59 - Destiny 2
34:46 - God of War
36:43 - Microsoft Flight Simulator
38:25 - Horizon Zero Dawn
39:31 - GTAV
41:07 - Red Dead Redemption 2
Any demanding VR game
Hi thank you for the info I am a big fan and hope to have a cool setup someday
No counter strike :(
i5 13600k please
Fortnite without performance mode 😡
Never would I have ever expected 120 stable frames in 4K. And this goes to 240+. Unreal to see how far technology has come
Unreal how much wattage one GPU pulls ...
And you only get 120Hz at 4K because of DP 1.4 on the 4090 ;-)
Buying this soon
Can't wait to play games with 1k frame rate 😂😂😂
I was looking forward to this day. 2-3 years ago I wondered what games would look like if they could run at 144-200 FPS, and my dream has come true! The only problem is the money; however, that will soon not be a problem. Gaming is going to look much better in the future.
Some of the bottlenecking with the 4090 is actually the max frame rate the game engine can handle, which is wild to think about. It's the first time a card has been able to do that. Your 1080p bottleneck is the CPU: the GPU is outrunning what the CPU is feeding it. A lot of game engines are designed to lean more on the CPU at lower resolutions.
Yup, in GTA V my RX 550 at 768p low settings was getting 100fps with lots of stuttering. I locked the fps to the 60s and it was buttery steady, even played on higher settings with no fps drops at all. GTA V's engine was not meant to run at that many fps
I got scammed
@@joaopaulogalloclaudino9671 GTA 5 doesn't really have a frame cap, but the physics break around the 180fps mark and the game starts stuttering badly. Your stuttering is likely your CPU, since I'm guessing it's not a powerful one if it's paired with an RX 550. Just from experience, the game runs just fine at around 120-140fps on my PC at 1440p
@@extremexfactor8362 good
@@exglued2394 why did you say good? I got scammed out of my money 💰💰💰 for real, my mom is going to kill me
30:20 the reason you get the same fps with the 13900k as with the 12900k is that you are completely GPU-bound at that resolution in Cyberpunk, so a different CPU will make almost no difference in performance
@•°S:F:D:T°•khorne Half-Life Black Mesa
Half Life 2 2020 Remaster (Fan Remaster)
- Absolutely shit, trash games. Don't play them, don't download them, don't even look at them.
Just forget them, like a nightmare of bugs, lag and low-quality textures.
No.
That's what he said like a second later, you just repeated him 💀
@Abradolf Lincler 41:58 “once again we are bottlenecked”
@@krak3n852 Half-Life Black Mesa is amazing. I have a mid-range PC and I can run it at ultra settings, 1920x1080, 60 FPS, with everything at its highest
Simulation distance in Minecraft is probably what made you lag a lot. Imagine mobs in all 32+ chunks you are facing, and they are moving, etc. You can't even see them; probably best to keep that at like 12
If he used the Sodium mod he would surely get 1000+ fps without losing quality; my 1050 Ti gets 120+ fps with 64 chunks, shaders, and 60+ more mods
@@Kuro-pv3ly You're talking 'bout the GPU, but Minecraft Java is not about that. The game runs on the JVM, and its old renderer dumps most of the heavy lifting on the CPU, while the GPU is basically just for video output; a 1030, something older, or even an iGPU might do just as well.
Long story short, Minecraft Java's work can't really be offloaded to the GPU (as far as I know), so it's almost always CPU-bound.
No offence tho
In Minecraft there's a mod where you actually get double the FPS. It's called Sodium; it's similar to OptiFine but way better. So probably 1000-1500 fps stable. Pretty good!
edit: this is the best benchmark I've ever seen! good job!
Doesn't Lunar Client have Sodium AND Iris? I kinda feel he would get better fps if he used a client; then he doesn't have to go through downloading mods.
I have done it with my i7 12700k and I was getting 1500 fps
@@mrcrispygamer1676 lunar uses optifine so it doesn’t have great fps boosts
@@aedan4978 it uses both
optifine>sodium
Holy crap, watching you play Warzone with the fast movements really shows how badly YouTube compresses videos
How do you know? Did YouTube really compress it, or did he edit it in himself?
@@MichaelLuo0311 he doesn't edit the fucking weird compression shit in, and YouTube has always been notorious for literally trash-compacting any quick movements.
OMG...... obviously none of you watched the intro to this video...
"Unfortunately we're bottlenecking"
I'd sell my soul to bottleneck at 500fps
Real
There is no bottleneck with a 13900KS. If there were, you'd see differences in frames per second between different CPU power levels, and each extra GHz would gain you precious frames. I went from an Intel 9900X to a 13900KS, and my FPS went up about 2.5x with my 3080 Ti. I definitely had a bottleneck with my 9900X.
Just a heads up: Minecraft without performance mods is very CPU-intensive, so if you added OptiFine you'd use more of your GPU and have fewer frame-time spikes, and then you could also use shaders to make it look prettier 👍
I fondly remember watching a video like 3-4 years ago explaining why companies didn't make super high refresh rate monitors (anything past 144Hz, usually), since our technology at the time couldn't hit super high frame rates. People could only run 240Hz in games if they set their resolution to 1080p. Now we're already in the era of GPUs pushing 200+ fps at MAXED OUT, ray-traced 4K resolution. That's absurd.
For now, Samsung has two models, a 4K and a 5K, in new production at 240Hz 1ms. My favourite is the Samsung 49-inch Odyssey Neo G9: 1ms, 240Hz, Dual QHD, Quantum Mini-LED, HDR2000, RGB, 1000R curve, 5K gaming monitor
@@Caga-x_Oren_Ishii_x I’m hoping to run one of those monitors in a couple of years
@@Caga-x_Oren_Ishii_x that will be like $1k+, no?
Asus made a 540Hz monitor
I've been rocking my 280Hz monitor for 3 years now. It's a smooth experience
Hi Ed, for Minecraft, Graphics: Fast is not the most demanding setting for the hardware; that would be Fancy. Nice video!
would be Fabulous, not Fancy
Hi Ed, for MSFS the real test for a CPU is in busy airports and dense cities. Then the CPU has to deal with generating the scenery and all the air traffic around. Hope this helps for future reviews :)
Yes, can you please test MSFS over NYC? Interestingly, I noticed a stutter when he was on 1440p. I wonder what the frame times are? As MSFS is highly CPU bound, a further test on this would be great.
@@Steve31000 MSFS is GPU and CPU intensive
He was already CPU main-thread limited in this video; that's why his FPS didn't change when he lowered the resolution.
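These MSFS replies describe the classic bottleneck test: if FPS doesn't move when you drop the resolution, the CPU is the limit. In a simple model each frame takes max(CPU time, GPU time), and resolution only shrinks the GPU term. A minimal sketch with made-up per-frame costs (the millisecond numbers are illustrative, not measurements from the video):

```java
// Toy bottleneck model: a frame is ready only when BOTH the CPU work
// (simulation, draw calls) and the GPU work (rendering) are done, so the
// frame time is the slower of the two.
public class BottleneckModel {
    static double fps(double cpuMs, double gpuMs) {
        return 1000.0 / Math.max(cpuMs, gpuMs);
    }

    public static void main(String[] args) {
        double cpuMs = 4.0; // hypothetical CPU cost per frame; resolution doesn't change it

        // Hypothetical GPU costs that shrink as resolution drops:
        System.out.printf("4K    -> %.0f fps%n", fps(cpuMs, 8.0)); // GPU-bound: 125 fps
        System.out.printf("1440p -> %.0f fps%n", fps(cpuMs, 4.0)); // balanced:  250 fps
        System.out.printf("1080p -> %.0f fps%n", fps(cpuMs, 2.3)); // CPU-bound: still 250 fps
    }
}
```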
it's impressive imo, worth the money
0:12 bro almost sounds like he's about to crack up at his own joke 😂 gotta love Ed tho
For Rainbow, it wasn't made apparent in the video, but in the graphics menu there is also a resolution scaler setting that may have been turned down automatically to boost fps
No, it runs the game at max render resolution, so it's fine
The lag in Minecraft is due to not enough RAM being allocated to the game. By default it's limited to 2GB, which is not enough in vanilla to run 32 render distance without major lag
Also, OptiFine optimizes usage of all the CPU cores
Finally, I was looking for a comment about this. He obviously knows little about Minecraft, so it's a reasonable assumption that he left it at 2GB and didn't feed it much more. My current PC runs better than what is shown because he didn't adjust this.
He even left the graphics at Fast and said it was maxed out💀
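If the 2GB default these comments mention is the culprit, the fix is a JVM argument: -Xmx sets the heap ceiling, and the vanilla launcher exposes a JVM arguments field per profile. A minimal sketch for checking what a JVM actually got (the 2GB figure is the commenter's claim, not verified here):

```java
// Prints the heap ceiling and core count the current JVM is running with.
// Run as `java HeapCheck`, then as `java -Xmx8G HeapCheck`, to see the
// -Xmx flag move the ceiling; the same flag raises Minecraft's allocation.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.printf("Max heap : %d MiB%n", rt.maxMemory() / (1024 * 1024));
        System.out.printf("Cores    : %d%n", rt.availableProcessors());
    }
}
```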
So in Minecraft you can download certain mods/clients that will give you 2x the fps you get in normal unmodded Minecraft, and with these clients/mods you can also add shaders, which will make your Minecraft look much better and much more realistic.
Just say optifine
@@Jf179_ a
There's also Sodium/Lithium/Starlight/etc., which are better and can be used together
@@Jf179_ OptiFine is the worst-performing mod you can add; better alternatives for Forge would be Sodium or Rubidium, as they provide twice the fps OptiFine can. On Fabric, Sodium goes even further and is the best of all the performance mods, but OptiFine has the most supported shaders/texture packs/resource packs and mod clients, on the other hand.
@darkshadowX he provided examples in his comment
@darkshadowX so what's the point of asking 😭
At 4K the gains from the 13900K are a small percentage; where it shines is mainly 1080p. Even my 5950X + 4090 does pretty much the same as a 13900K at 4K. At 4K, the GPU is the main thing.
Intel I9 more like icookyourpc
What an insane build
❤Seriously!
Truly, I'm so happy, because that's exactly my build if you swap out the 4090 FE for the 4090 Suprim X
Around 2020, during the pandemic, I bought the best parts that were around and averaged 240 frames at 1080p in game, and I still think it's good to this day. But seeing current PCs, it's crazy how many more frames they're able to push out now
"That's not even a person, THAT'S A LAMP" Ed Techsource 2022
This video confirmed what settings I’ll be running with my 4090 and i9 13th gen. Thank you for your service!
On Cyberpunk, if you look at the stats the CPU only goes about as high as 50% usage; it's the graphics card that limited it, as it got into the 90s
It's the game devs that limited this game lmao, so unoptimized
Not really. It's optimized enough graphically to make almost full use of the GPU, which is why it sits at 97-100% usage.
CPU usage is almost always lower because the CPU has many cores, but games only fully use 2, sometimes 4, of them, because not all game code can be spread across every core; that messes things up.
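The "only 2-4 cores get fully used" point is Amdahl's law: if only a fraction p of each frame's work can run in parallel, total speedup is capped at 1/(1-p) no matter how many cores you add. A quick sketch with an assumed p = 0.5 (purely illustrative; real games vary):

```java
// Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n), where p is the
// parallelizable fraction of the work and n is the core count.
public class Amdahl {
    static double speedup(double p, int cores) {
        return 1.0 / ((1.0 - p) + p / cores);
    }

    public static void main(String[] args) {
        double p = 0.5; // hypothetical: half of a frame's work parallelizes
        for (int n : new int[] {2, 4, 8, 16, 24}) {
            System.out.printf("%2d cores -> %.2fx speedup%n", n, speedup(p, n));
        }
        // With p = 0.5 the ceiling is 2x regardless of core count, which is
        // why a 24-core CPU doesn't deliver 24x the fps of a single core.
    }
}
```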
I'm a big fan, I love your budget vids. I don't have a PC, but I like the lights and other things you show on them. :D
At 1080p you are 100% CPU-bound. Idk why you were getting "disappointed" by dropping your resolution, because at that point you're limiting a GPU that is not going to hit 100% usage at lower resolutions. Ideally 1440p-4K is where this card sits. If you are buying a 4090 for 1080p, you are really good at wasting money and won't even touch the card's power limits.
bro finally 1440p 240fps competitive gaming is possible! Great vid btw
MSFS is extremely demanding, especially with DX11/12. No wonder you were at 100fps on high settings. Thanks for the review!
Please use the same setup in minecraft but with LunarClient
damn... flight simulator is so beautiful. Always amazes me.
lessgo edgar finally 4090 game tests thx a lot for this ed ....
The maximum FPS in Minecraft is 2500. You've got to fix your settings to get that kind of FPS; you're using default settings, and that's why you're getting around 600-700 FPS even though it's an RTX 4090
Sodium mod triples fps and gives you more options
MCBYT has gotten 6000. Ofc he did a lot of stuff to get there and wasn't playing normally, so I know you won't get that kind of fps playing normally, but it's still cool to see how much power these things have
You say that, but stock Minecraft, even at 4K, hasn't benefited from faster GPUs in years; it's all CPU now
TECHSOURCE YOU ARE THE BEST TECH CHANNEL X
I would be curious to see how fps is affected by lowering graphics settings to high instead of maxed/ultra. In my experience you can't usually tell the difference and you get a significant fps boost.
Aye that BBC joke was class xD
Hey, quick input :) for a Minecraft benchmark using shaders and hi-res 256x texture packs: biome blend ruins your fps and has never been a feature useful to the player. Personally I use the LB Photo Realism Pack 256×256 and SEUS aether shaders :) I get 130 average fps at 4K using a 4090 and an i7 12700K.
this is a good tip, i had no idea biome blend hurt fps. Gonna turn it off next time im on
yeahy
Thank you for this content, love the way you showcase Siege and Minecraft, both my fav games
Hey man, it looks like you only have two 8-pins connected to the PSU, and they are daisy-chained via the pigtails to the 4x8-pin NVIDIA adapter. Apparently you are not supposed to do that. Have you run into any issues voltage- or stability-wise?
Good catch
Reply reminder
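Rough arithmetic behind the daisy-chaining concern, assuming the PCIe spec's 150W rating per 8-pin cable and the 4090's roughly 450W stock power limit (both widely cited figures; check your PSU's manual before trusting pigtails): with only two physical cables feeding the 4-way adapter, a full 450W draw means about 450W / 2 = 225W per cable, well over the 150W rating, whereas four separate cables would carry about 450W / 4 = 113W each, comfortably within spec. Good PSU cables are often built beyond the spec rating, which is why pigtails sometimes work anyway, but that is the gamble the comment above is pointing at.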
Should test out Squad, absolutely brutal game on your whole system, idk why nobody tests it.
what display monitor are you using ? picture looks great
That new Sony monitor
Best tech channel. Been a fan since 2020. Love the vids. Also, whats that monitor?
In GTA 5 there's an engine limit of 188fps, and the game even stutters when it reaches 188fps. For reference, watch zwormz's RTX 4090 test in GTA 5; he explained it very nicely
he has cool reviews
fuck, so hardware really is hitting the limit on some older but robust engines
@@QPLAYS010 No, if you just limit it to 186fps it will not stutter.
Even if your GPU is capable of more than 188fps, with a cap it will not stutter
You can cap the fps if you want using various programs
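All a frame limiter does, conceptually, is sleep off whatever is left of each frame's time budget; external tools like RTSS just do it with more precise timing. A minimal sketch of the idea (186 is picked only to stay under the ~188fps trouble zone described above):

```java
// Minimal frame limiter: cap a loop to TARGET_FPS by sleeping away the
// remainder of each frame's time budget.
public class FrameLimiter {
    static final int TARGET_FPS = 186; // just under GTA V's reported ~188fps limit
    static final long FRAME_NANOS = 1_000_000_000L / TARGET_FPS;

    public static void main(String[] args) throws InterruptedException {
        for (int frame = 0; frame < 600; frame++) {
            long start = System.nanoTime();
            // ... simulate and render the frame here ...
            long remaining = FRAME_NANOS - (System.nanoTime() - start);
            if (remaining > 0) { // sleep off the unused budget
                Thread.sleep(remaining / 1_000_000L, (int) (remaining % 1_000_000L));
            }
        }
    }
}
```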
What is the best option for someone who has a 10700K OC'd @ 5.1GHz and a 4080 Super: stay on a 1080p display, or upgrade to a 4K monitor, since the bottleneck with the 10700K and 4080 Super is horrible?
In Minecraft you should press f1 to see your crosshair and gui
The video I needed badly
Seems like I'll be more than happy in tac shooters coming from a 1080 Ti. I was hoping to see more Apex gameplay at 1440p in actual fights, to see if it dips below 280 (currently on a 270Hz monitor), but I'm sure it'll be great since I don't use max settings like this video. Nice vid, thanks :D
my brother has a 1080ti, been using it since 2017? and his games run smooth af
@@1hunnet823 1080 ti is a beast
Love to watch ur every video Ed
I think a build like this for 1440P gaming could easily last 5-10 years.
No lol. Plague Tale requires a 3070 (which is supposed to be a 1440p card) to run 60fps at 1080p, and the new Silent Hill game says a minimum of a 6800 XT is required for 1080p 30fps at high settings. You underestimate how fast graphical demand increases.
I bet the 4090 won't be able to run the most graphically intensive games of 2027 at 1440p 60fps
@@SabsWithR Yes, but more games are also integrating DLSS. As DLSS becomes more adopted and improved, it'll allow cards to last longer.
@@LA_Designer Ik, but 5-10 years is still a stretch for modern games of that generation. Not everyone likes playing with DLSS. I'd say for the next 3 years it should be able to run every modern game at 1440p 60fps (hopefully at least).
Legit, in just one generation we've gone from the 6800 XT being a 4K card to it getting 30fps at 1080p in Silent Hill (I am aware this still makes it a good card for other games, but we can expect to see more games like this), so we don't know how much more power games will require in the future
Well, for competitive games, yeah. For story-mode games that really focus on graphics and the like, probably too, but not at 4K 120fps
@@SabsWithR Who said they were going to play all new games at max settings? You can still play new games at medium and low settings if needed 5 to 10 years from now, and with DLSS on top.
I really enjoyed this review, it was actually like a real human being speaking instead of just numbers next to a benchmark.
That’s actually insane to me when I saw the Warzone footage.. it’s actually blowing my mind. I’m so used to Warzone and COD looking like crap because I lower the settings to get the fps I want at 1080p.
4K with 170-209 FPS? That’s just insane… the game didn’t even look like cod anymore it looked like a movie..
I can’t believe how powerful that GPU is but wow that price is also mind blowing….
I think I’ll just hold on to hope and try to get a 3080 at a decent price with my 12400f 😂
Actually planning on buying this exact spec by the end of the week🤣🤣 upgrading from a 2060 laptop
You are a very good videographer.❤
Great video, Ed. I was mainly here for Valorant. Although the fps stays about the same in most games, I see a huge difference in the wattage consumed by the 4090 at different resolutions. In Valorant, the draw dropped from 350-370W to 80-100W between resolutions.
1:10 did you actually ruin your motherboard 😅
Bro the benchmarks are crazy 😭
na fr☠️☠️
What 4k 144hz monitor did you use?
4090 is definitely THE card for 4k gaming
Yep! Truly 4K gaming... finally, those 4K 144Hz monitors are worth buying now.
I never understood the rationale of buying a 90-class card just for gaming
@@mrducky179 I'm a simracer, I have triple 1440p, and no choice: to play in good conditions, I have to buy the 4090.
@@mrducky179 We've had like 3 total 90-class cards ever launched, and 2 were dual-GPU cards. If you include Titans then sure, but the 3090 was a poor example, being 7% faster than the 3080 in most cases. The 4090, though, is supposed to be about 20% faster than the 4080, meaning it's more akin to the Titans, which were essentially more expensive 80 Ti's for early adopters. The thing is, the 4090 isn't much more expensive than the 4080 Ti is gonna be. When you're talking $600 vs $1200 cards then sure, but $1300/$1400 vs $1600 is small considering it launched far earlier, will probably overclock better, and I'm guessing the 4080 Ti will be a 20GB card
8K buddy, 4K is yesterday. LONG LIVE THE MASTER RACE.
I WAS REALLY NEEDIN THIS INFORMATION CUZ I TOTALLY HAVE THE MONEY TO BUY THIS THANKS ALOT!
Jesus Christ this graphics card is a power house
Yea, but for what? Monitors haven't caught up yet 😂
@@lilzeddy Of course they have. In fact, 8k displays make the 4090 beg on its knees for mercy. 4k is the sweetspot for this GPU
this with the cpu
@@lilzeddy yes they have. I have a 4K 160hz monitor with my 4090 and it’s more than good enough. There’s also a couple 4K 240hz monitors available too for people who play shooters and such but it’s not like you need more than 144hz. For me I think it’s perfect considering I play only open world rpg games with ray tracing at ultra and those are so demanding that most of the time I sit anywhere between 120-150 fps in 4K.
Lol that BBC part caught me off guard 😂😂😂😂
Man turned all CPU load settings to max on minecraft, and didn't realise that "fast" graphics means low graphics. F.
_notices lag_
Ed : "The GPU is struggling"
The reading on top-left : GPU usage = 15%
Yeah the guy in this video doesn’t really know what he’s talking about lol
Watching this man struggle with minecraft is awesome
I think GTA V works well when capped at 120fps even if ur pc is capable of higher
120 is the sweet spot
I'm just shocked how cool the GPU temp stays at 4K ultra settings, with insane fps on top of that
Great job testing Fortnite in performance mode instead of DX12 or 11, where those numbers would actually matter. Wanted to see if the game stutters at all in the 1% lows. A 1060 and a 7700K can play Fortnite in performance mode; it doesn't mean anything
4090: Who the hell is PS5??!
PS5: im just kidding.. you're the king sorry
GTA is capped at 188 because higher fps breaks the game, due to how old it is. Also, a lot of people still play it
Finally, a pc to run flappy bird.
Legit just came for Minecraft, & I wasn't disappointed
22:19 bruh!😂😂
38:15 mayday! mayday!
I feel exactly the same about warzone BTW 😂😂💔Great vid thanks Ed!!
this setup is a beast! hope to have it in the future..
Ed, will you be upgrading your big red soon now that these parts are out? Does it annoy you when you've just finished an absolutely beautiful build like yours and then new stuff comes out?
How does the top hardware you can buy still not run today's games at 4K max settings with enough FPS to match the monitors? Lots of games are not playable with the best hardware at max settings at 4K =*(
Damn, your movement in Warzone is pretty cracked 🔥
So glad to see the test of Rainbow Six
“I cracked my legs are you kidding me”😂
Perfect video man
I just bought a prebuilt version of this pc on nzxt.
I think this PC is very good for streaming with only 1 PC
Hey Tech, very nice video. Can u do a gaming test on Hell Let Loose? It's my favorite game and I would like to know how much fps u get with this card and processor :)
Thanks for this sir!
I'd like to see the insane amount of FPS you get in low-demanding games (CSGO, browser games, bad Steam games, etc) with the graphics on lowest settings.
An insane amount of FPS is quite useless if you aren't also using a high-refresh monitor. Good luck with your 500fps on a 60Hz monitor ;-)
Running Cyberpunk at Max settings in 4K with smooth frame-rates
"I'm disappointed" *NAH I'M PROUD*
What fan RPM are you running on the 4090, mate? Keep up the great content
1:10 😂. Reminds me of my first PC build back in 2001. I didn't know what thermal paste was, so I superglued my heatsink to the CPU. It burnt up. I ended up sending it back and getting a replacement.
thank you for putting Destiny 2 on that list.
Imagine blowing up tons of TNT at 3:29, lol
Your Warzone test had dynamic resolution turned on, meaning the game isn't necessarily rendering at the display resolution you have it set to.
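For the curious, dynamic resolution works by nudging the render scale every frame until frame time hits a target, which is why the output can silently differ from the display resolution. A toy version with fabricated numbers (an illustration of the general technique, not Warzone's actual algorithm):

```java
// Toy dynamic-resolution controller: adjust render scale so frame time
// converges on a target. GPU cost scales with pixel count (~ scale^2),
// so the correction uses a square root. All numbers are made up.
public class DynamicRes {
    public static void main(String[] args) {
        double targetMs = 1000.0 / 120; // hypothetical 120 fps target
        double scale = 1.0;             // 1.0 = full display resolution
        double[] frameMs = {10.2, 9.8, 9.5, 8.9, 8.1, 7.9}; // fake frame times

        for (double ms : frameMs) {
            scale *= Math.sqrt(targetMs / ms);           // too slow -> shrink scale
            scale = Math.min(1.0, Math.max(0.5, scale)); // clamp to 50-100%
            System.out.printf("frame %.1f ms -> render scale %.2f%n", ms, scale);
        }
    }
}
```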
This is the kind of video i like.
Love your vids 😊
Hey whats up omg
Thanks for the benchmarks, sir.
No worries, student.
really great benchmarkk
Hey Ed, what monitor was used in this video? Looking for a monitor that will switch from 4K to 1440p without losing an insane amount of quality
Considering you didn't even turn on DLSS in any of the games, still getting those fps is insane
What's the name of your monitor?
I find it crazy how he's like "we're only getting a steady 400 fps" while I can barely get 100 in most games😭
This is a sign that in the next few years 1080p gaming will be replaced with 4K. 1080p has been the most used resolution for, what, 10 or more years now? Yeah, that will change soon; there will be no point playing at 1080p when there are 4K monitors with 1000Hz lol
I love how it's called TechSource and he's like "I don't know what this is, I don't know what that does" lol. Subscribed
3300 fps in Rainbow Six Siege's loading screen is absolutely stunning
Good choice of games👍
My man was like: Minecraft settings are maxed out.
Also him, with the graphics set to Fast:
💀 💀 💀 💀
That was literally the worst Minecraft benchmark I've ever seen.
No one plays at max FOV.
He should've used Bedrock and turned on RTX to really show its potential
the quality is clearer than my vision