I currently have a 3080 10GB. If I got a 4090 free in a contest or something, I would be ecstatic. But I got my card for the MSRP, and gaming just isn't important enough to me to be spending $1600 on a GPU, especially at a time when our economy is being driven off a cliff.
@@boohoogamer1713 Yes me too, I don't mind using DLSS pretty aggressively (in 4k even the performance mode looks fine to me) so this card will be relevant for a very long time for me.
Going to skip this gen too, the first time I've ever skipped a generation. If the 4080 were 10% or so less than a 4090 I'd get one, but at the price-to-performance they're asking, no chance.
@@Fantomas24ARM dlss is amazing in my opinion, I mostly use 21:9 3440x1440 at 120 and some games 4k 60. So it is plenty for a while I think. I'm disappointed that dlss 3.0 isn't supported though
This is hands-down the best review of the 4090 I've seen, and I've watched them all. What made this so great was the direct apples-to-apples comparison between the new gpu and one I'm well familiar with. This is real-world, applicable knowledge which is much more helpful. Thank you!!
The 4090 actually starts at $2.5k in my country, and I paid $850 for an OC version of the 3080 a few months ago. So when I read $1000 I can tell you that the value of the 4090 can definitely be way worse outside the US. On top of the 4090 you'll need a high refresh rate 4K monitor, which adds another $550. For me a 3080 is more than enough to play everything I want on max settings at 1440p. Even Cyberpunk works great with DLSS at 1440p. The crazy thing is that prices are rising again for the 30 series as well. It costs $70 more now. It's scratching at the $1000 mark again...
Well, Nvidia meant the 40 series for next-gen games. Look what Unreal Engine achieved with tech similar to what Euclideon invented. So when next-gen games hit the market, the 3080 could be a low-to-medium settings, 50/60 fps card.
Same, I have a 3080 10gb bought at msrp, sold a 3070 at a scalped price to a miner (1200 euros, almost paid the 3080 lmao) and i'm all fine with that. I'll skip the 4000 gen and buy another gpu in 5 years or so.
He states at the start of the video that the comparison is mostly for those of us who got the 3080 super early. I paid $1200 for my 10GB three months after release... two months later they went for $1800-2000 USED. The highest shop prices here, when the mining craze was at its worst, hit $2300 for the 10GB 3080. So for me the cheapest 4090 costs $2100, so it's like he says.
@@SuperKREPSINIS That's the slight benefit you get from buying a high-end card. 50/60 fps on low-medium settings? When do you think that's gonna be? Sure, next gen games will hit the market in a few years. By that time we will probably have 50- or 60-series and a mid range card will most likely outperform the 4090 anyway.
I picked up a 3080 10gb for 500 used and sure it definitely was a mining card but so far its been thoroughly tested and giving me a fantastic 4k gaming experience. Honestly I'm satisfied.
I just bought a 6900XT for a pretty reasonable price. Considering I'm upgrading from a 1060 6GB I cannot wait for it to arrive tomorrow! I've been waiting over 5 years for a new card. My 9700KF will finally become the bottleneck! lol
I spent too much on my 3080 but my other card shat the bed; after buying it, nothing really stresses it anyway, so there's no point in upgrading for me. Because I spent a bit more than I should have, I'll run it for the next few years anyway.
It's also worth noting, most games aren't worth running at ultra settings. You're usually gaining very little in visuals, but taking quite a big hit in performance. Hell, in many games, going above medium on settings adds very little but tanks your performance. I'd definitely recommend keeping textures high/ultra in most games, but lowering the majority of other settings to medium. You'll likely notice very little difference, but the FPS gains will be huge!
Not everyone uses this stuff for gaming, video upscaling from my 3060 to 3080ti was like 4 times faster even though I really didn't see a real bump in games. Also see differences on any high end video editing program like using Davinci Resolve would probably be noticeable.
I usually just hit "recommended" if a game has it and play at whatever it spits out. Total War: Warhammer 3 put me on ultra, and the benchmark says I'm getting around 144fps (my monitor's max frame rate). Weirdly, if I start dropping things to high or medium to peg the game at 144, it starts losing frame rate...
Just me personally: I just upgraded to a 3080 12gig yesterday and I’m thrilled with it. It only cost 730 bucks and I can’t find a 4090 for less than 1800. I game on a 1440p monitor, I think I’m set for the next couple years. Also, I would need a new PSU for a 4090 so that’s another 180 bucks or so
@@grego10r I def think 1440p high refresh gaming is the sweet spot. But I can see the appeal of 4K gaming growing as 4K displays are more mainstream than ever. I still think we have a ways to go before it's as popular and affordable as 1080p/1440p gaming, though.
@@Lyu-Phy Yeah its a great card and Im happy i grabbed a slightly used one for $150 off the $700 msrp. I plan to keep it for a while and go the used market again when 4000 series cards get cheap at the end of their cycle lol
Really good analysis here. I don't see ANYBODY else looking this closely at the user experience with various settings. I've been enjoying your videos for a couple months now, and finally caved in and subbed :) (PS: there's a video the world hasn't made yet... Can the 4090 run Crysis?)
The new version of 'can it run Crysis' is 'can it run Star Citizen', because Star Engine is a highly modified (almost completely rebuilt) version of CryEngine, which is what Crysis was made to show off.
This dude is cappin and he is cappin big. There is no way an RTX 3080 could run above 40 fps even with DLSS performance. He is just comparing the RTX 4090 with an RTX 3090, 100%.
This dude is cappin and he is cappin big. There is no way an RTX 3080 could run above 35 fps even with DLSS ultra performance lol. He is just comparing the RTX 4090 with an RTX 3090 Ti. Tip of advice, Daniel Owen: AMD has lost the war against Intel, let alone Nvidia, lol, so don't get your hopes up. Instead of making videos in your mom's basement, go get a job on Monday so you can build your dream PC with an RTX 4090🤣🤣🤣🤣
Frame generation combined with DLSS quality mode actually looks crisper than native, and the artifacts are totally unnoticeable unless you slow the footage down by 250%. VSync compatibility is on the way. So if you can't work it, stop calling the earth flat 😂🤣
I know we're testing the GPU, so we're going for ULTRA quality settings, but I'm always wondering what HIGH settings look like compared to ULTRA, and what the performance gain is. Ultra settings often add barely visible changes that are very taxing on the GPU. That's what I loved about this review! By far the best review I've seen in a while!
@@odytrice Thank you! I've been saying this for years. I can't see any changes in most games from high to ultra. Even if I can see a difference it might as well not be there as I'm focused on the game.
@@odytrice Oh yeah, 100%! If we actually add that into the equation I could almost not care less haha. I'm still rocking an RX 580 and had to play Cyberpunk on low settings with FSR on, and I've never been bothered by how it looks. It was also the first game of its kind I played, so I was super immersed. Actually, I was always stunned by how great it looks, but then wondered what it would look like with better settings.
I am so happy with my ASUS RTX 3080 12GB that I really don't see the need to upgrade; it's not worth the money. It would be really interesting to see how many players could tell the difference just by looking at the images, with no naming of the cards or the values.😉
Looking at two static images isn't a relevant way of comparing graphics cards. All graphical settings should be identical to perform an accurate test of relative performance. Comparing footage on YouTube typically doesn't do the difference between graphics cards justice either, as most YouTube footage is limited to 30 FPS. Unless you're stuttering below 30 FPS, there won't be much if any difference in the videos. If you took a 3080 and a 4090 on a high refresh rate 4K monitor with modern, graphically demanding games, the difference would be night and day. In many cases, a 4090 can generate roughly double the frames of a 3080.
@@user-ve2jj1ik4b Not really. Hardware Unboxed openly called out LG, if I'm not mistaken, for attempting to manipulate a review, and released a video on it with all the emails. Some people don't care about money and do it out of passion, and you can clearly see the passion in this guy.
Why? Because he lies for minorities like you who can't work to buy their dream PC lol.
For price to performance, I would still go with the 3080. 60 fps in single player is fine imo. I'm more interested in seeing what AMD releases in the $700 range!
Yup 60fps is still the standard for me. Heck, growing up in the 90s (and as seen even in modern console releases), 30fps isn't all that much of an issue if I'm being honest.
@@cmoneytheman How does his preference not "make sense"? It is his opinion. I agree with him, to be honest - although it bothers me much less in a singleplayer game.
@@cmoneytheman Yeah, but now he's experienced it and doesn't want to go back. Just like I've been using a 75Hz screen for a few years and now 60Hz feels really slow and choppy (despite being just a 15Hz difference).
Same here, I came from a 1660ti so it was quite the upgrade. I only play single player titles and use a 60Hz display. The 3080 12gb was just what I needed and was on sale @Amazon in late Aug for $799.00.
@@PotatoHead7662 Unfortunately I don't, sorry. I do think it was well worth the upgrade since the price was great at the time. I've had issues with GPU overheating which I assumed was due to gpu not keeping up since turning down graphics settings helped somewhat. After cleaning and repasting my cooler and seeing very little improvement I started researching gpu performance stats and upgraded my system a few months ago. Before that I knew very little about GPUs.
@@SnaFubar_24 Ah ok. I'm still getting 70-100 fps (medium settings) in Cyberpunk on a 1080p monitor, which seems... decent to me. Just wondering if it's worth upgrading now or if I should wait till I move to 1440p or 4K. Also, 3000 series cards have gotten a bit cheaper, but they're still fairly pricey atm.
The 4090 and 13900k's efficiency gains over last gen are interesting -- it means you can undervolt and power limit them and get better performance at much lower power draw. If these can be changed easily with presets it'd certainly be nice to run a low power setup for older games when you know you don't need the maximum performance.
Lower power for older games? That's why we have FPS limiting, or even v-sync. And before anyone says input lag, nope, limited at 144 FPS the only input lag is in one's own head 😂
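For anyone wanting to try the low-power preset idea above, it can be scripted with Nvidia's stock `nvidia-smi` tool. This is only a sketch: the wattage and clock values below are hypothetical, and you should query your own card's supported range first.

```shell
# Query the current, default, and min/max allowed power limits
nvidia-smi -q -d POWER

# Set a lower board power limit in watts (requires admin/root;
# 250 W here is a made-up example, not a recommendation)
nvidia-smi -pl 250

# Optionally lock GPU clocks to a modest range for older games
nvidia-smi -lgc 1400,1800

# Restore full performance later: reset clocks, raise the limit back
nvidia-smi -rgc
nvidia-smi -pl 450
```

Wrapping these in two small scripts ("quiet.sh" and "full.sh", say) gets you the easy preset switching the comment wishes for.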
That was one big feature for the 40 series for me. I have an ITX build so my EVGA 3080 XC was running very hot on games. Easily running around 80-90C on the more intensive ones, undervolting was necessary to keep it quiet. The price for the 40 series is absurd, but I got a good deal for a 4k monitor on Black Friday and my friend was interested in my card, so I took the upgrade to the 4080 (4090 way too large for my case). In comparison, I just finished Spiderman Miles Morales with all settings on high with RTX and was only pulling 56C on average with a quieter fan curve. An insane difference, I can actually keep my build near silent.
Beginner PC user here. Great video, I have a 3080 12gb, so I found this very helpful. You explain things very well, and I subscribed straight after watching your video! Keep up the good content
Yeah, 3080 is still fantastic for 1440p and the 4090 is overkill. If I remember correctly you still have a 9900K CPU and at 1440p even my 5950x wasn't really keeping up with the 4090 in a lot of these tests.
I feel this test. I was weighing the value of the 3080, then trying to see if the difference was worth it for the 4090. I ended up getting a $425 3080, because there's no way it was worth 4x the price.
That's a great price! I grabbed a used 3080 for $550 and still love the price to performance I'm getting with it. I'll def hold onto the 3080 for a while; no need to spend that much on a 4090. Don't even think it would fit in my case lol
@@Bdot888 yeah I nabbed a 3080 12gig a few days ago because it was cheap and I would need a new PSU and new case for a 4090, so I’m just gonna wait a couple years for a brand new RTX 50 series build with a Zen 5+ CPU. Until then I’m happy with a 5900x and 3080 12gb for 1440p gaming
$550 here too. Great card. Glad I waited. If I can ever get a 4080 or 4090 for under $1000 I'll do that. I'm sure it'll be off the used market in a 2 to 3 years time when some other card is sought after and I'm fine with my 3080 till then. ☺️
I got a 3080 10GB at release at MSRP and I'm really tempted to upgrade again this generation since I've upgraded the past few generations (1060 to 2070 super to 3080) but I think I'll skip this generation because at this point an upgrade really just isn't necessary. The 3080 is already an incredible card
I appreciate your channel so much. It often feels like you're the only channel that's in touch with non-YouTuber PC gaming enthusiasts (I hope my sentiment makes sense). Either way… thank you!
If excessive FPS is an issue, why can't you cap the FPS within the driver? It's been available for years now... I've got mine set to my monitor refresh rate (165) all the time.
You certainly can cap your FPS to the native refresh, but many games will still suffer from tearing that way. In some games, v-sync is required to eliminate tearing, and of course that introduces input lag.
With DLSS 3 Frame Gen, capping the FPS causes latency problems (probably because Frame Gen has to wait for a future frame to render to produce its in-between AI image) and is not officially compatible (games disable vsync and framerate limits by design when you enable Frame Gen for this reason).
@@danielowentech As you probably know perfectly well, screen tearing can appear even when the frame count matches the refresh rate, because some frames take longer to render than others. G-SYNC/FreeSync should solve the problem... I'm struggling to understand what the actual issue is. Do you mean that since DLSS 3 inserts "fake" frames into the animation, it becomes more noticeable when there are even fewer "real" ones?
I think an important thing to mention about Frame Gen is that NVIDIA has said they plan to make VSync available in the future, but it's still a WIP.
@@dra6o0nDouble the framerate of course. But you need a monitor that can actually handle the framerate and as it is Daniel doesn't, so I can see why he isn't using it.
It looks like even Ryzen 5000 can't keep up with this card in a lot of cases. I was hoping to be able to wait another gen to upgrade my cpu but I'm starting to think I may need one to take advantage, even at 4k
I'll be pairing the 4090 with an i7-12700K. Since I got that CPU recently, I don't want to upgrade again already to the i7-13700 (and I refuse to pay i9 prices - maybe ironic to say, since I paid a near fortune for a 4090, but... whatever lol). From what I'm seeing, the bottom recommendation is an i7-12700K, but I'll start testing it next week.
@@kennylaysh2776 in fairness to you, the 4090 is the new 1080 Ti, it’s going to be high end for 5+ years. They clearly went overkill with it to stay ahead of AMD. Paying a fortune for a 4090 is still much better value than going from a 12700 to a 13700
At 18:51 your metrics show the 4090 with more headroom, but it seems to stutter. The pan is slow and smooth on the 3080 side, but I see stutter on the right.
I have a similar setup to yours except I have a 5900x but have been unable to replicate the same fps. I've been averaging around 43 fps with RT reflections at 4k. Are there other settings you have changed that have given you that many more frames?
I really needed this comparison! You're right, I'm sure half the people using a 3080 are trying to see if the 4090 is worth while. Love your channel! Keep up the great work
I'm in the half who aren't. I get all I need and want from 3080 12GB on 1440p raised to simulated 2160p (DSR 2.25 DL). Getting a 4090 together with commensurate additional hardware would be the definition of wasteful. For the half who conclude that it's worthwhile: please send your donations to my worthy cause :)
The jump is indeed great, but for the price right now, meh. I'll wait, and as long as Nvidia doesn't go scummy on it, I'm fine with my RTX 3080. The 3080 is still a beast. I can play many games at 1440p over 144fps, even most recent titles with proper settings, and DLSS is also a big future-proofing factor. So the 40 series can wait.
Something else worth considering: people using UPS backup systems might need to think about upgrading those as well. I recently picked up a 6800 XT, and while gaming my UPS soon started alarming, saying the wattage was too high for the unit to support. I had to shut off the monitors on my work PC, and that fixed the problem.
I think the 4090 is worth it if you want high refresh rate gaming at 4K. Otherwise you're fine with the 3080. I'm currently using a 3070 Ti paired with a 1440p monitor and it seems fine for this generation's games. In 2 to 3 years though, with new UE5 games coming out, it will be time to upgrade if I want to stay above 60fps. Honestly though, the 4090 is a great GPU. If it comes down to around $1100-1200 it will be unbeatable.
4090 for 1440p for future proofing and high fps... at least that's how i am doing it 🤔 But I am doing this only because I've been on a 970 for a long time... if i had the 30 series i wouldn't even think of upgrading.
Which is why the first thing to spend your money on is a high refresh rate monitor, because if they have a 4K monitor that only does 60 Hz, then what's the point of getting an RTX 4090 🤣
3080 maxed in Cyberpunk: 40 fps. 4090: 50 fps. DLSS: a 30 fps boost on the 4090. It has no performance improvement. The 4090 is a gimmick card that you brainwashed kids would buy.
I'm very happy with my 3080 12gb and will continue to be happy with it for a long time if these benchmarks mean anything. Thanks again D for quality content!
I bought the 4090 at launch, for fun. It kept maxing out my 12600K in any title at 1440p, so I realized I'd need to spend a lot more to remove the CPU bottleneck, plus a new monitor. I sold the 4090 for £50 more than the £1700 I paid, and I'm now using my old 3060 Ti. Conclusion: the 4090 brings other costs.
@@ferrety2810 Yeah, 4k monitors are so affordable now, if you already had the money for a 4090, not sure why you wouldn't just spend a few hundred on a 4k display.
Comparing the 4090 to the 4080 will be very interesting. They might not be as far apart as expected because of CPU bottlenecks and DP 1.4a limitations.
@@KlennR I have a gaming-only rig, so a 6-core will be more than enough. In my country an AMD B550 mobo was significantly cheaper; if I'd put a Z690 with the 12600K it would've been a lot more expensive. The 12600K is head to head with the 5600X, and the 5800X3D is an incredible increase if you go the long-term CPU-upgrade route.
Or maybe wait one more year if it's not urgent, as they will surely drop in price, especially since the games coming out don't evolve that much in requirements.
4k 120+ hz monitors are also like $600+, which is a hidden cost. Not only is the graphics card almost double the price, but the monitor as well. Otherwise with a 1440p monitor you'll be getting an effectively equivalent experience between a 4090 and 3080. This is also on top of a bigger case and cpu upgrade you might want/need to do for a 4090.
The biggest thing I learned from this video is Ray tracing(especially at its highest settings) still isn't worth it even with the 4090 in some cases. I can't believe people are willing to cut their fps in half for a relatively modest image improvement in most scenarios. Like, the performance loss is literally larger than the leap from 1440p to 4K. I would love it if you could make a video comparing ultra Ray tracing with lower quality Ray tracing settings/ray tracing off to see if the image improvement is worth the gargantuan performance loss.
Definitely depends on the particular game and how it's implemented. In some games, you hardly notice any visual difference with ray tracing, and it tanks your performance. In other games, ray tracing adds some really nice visual bells and whistles, and the game will still run quite well. This, pretty much, only applies to 30 and 40 series cards, though. Turning on ray tracing in any newer games on a 20 series definitely tanks the performance into the depths of hell.
I got a 3080 Ti after seeing the 4090 prices. I play on the couch on a TV, and it does 4K at 60fps no problem. I came from a 1660 Ti and it's a huge improvement. The 4090 is amazing, but for what I use it for it's just overkill. I play single player games, no need for more than 60fps.
How many games don't have controller support these days? 4090 should be your GPU for the next 6 years at least so overkill now but you should get a long life out of it.
@@steveleadbeater8662 Or he can just skip this generation and buy something like a 5070/5080 for less money than a 4090, with better performance over time. I got an RX 6900 XT a few months ago just because it was on sale. Now I'm skipping this gen, and whether I ever buy another GPU instead of a PS6 (if it comes out) or a new Xbox comes down to price. Right now the PC master race is only for very rich people who can afford a) a big price for one component, b) big prices for the other components to keep them from bottlenecking each other, and c) the energy cost when your entire setup can draw up to (I guess) 800W-1kW.
I'm gonna buy AMD's 4090 competitor, or whatever is closest, for 3440x1440 high refresh for slightly more future proofing. It'll be paired with a 12900K at 5.5GHz, so it should hold up for a long while.
Interesting. Are you not able to cap frame rate under 120 to prevent tearing? Also, what performance overlay is that? I'd love to have one that includes a frame time graph.
Would like to know too. Sometimes i feel stutters in games but I don't really see a meaningful fps drop so would be curious to get the frame time graph too.
I've had a 3080 10GB since it launched, got it at retail price around $800. I'm using a 165hz 1440p monitor and don't plan on upgrading it anytime soon. That being said, I've never gone out and bought an xx90 card and I told myself just this once, it's time to get the best of the best and have some fun. No regrets.
I'm planning a setup with exactly your card and monitor specs. How is the experience? I'm new to PC gaming; right now I'm on a GTX 1650, 1080p 120Hz gaming laptop. I'm wondering if I should wait for the RTX 4070, and also whether I should go for 1080p 240Hz instead. Any thoughts are appreciated!
For now I'm satisfied with my 3080 10GB Gigabyte Eagle OC. For 4K 60fps high/ultra it's been enough; only in a few games do I need to lower some graphical detail. When the 4080 16GB arrives in Brazil the price will be absurd, so I'll wait about a year after that before considering an upgrade.
Not much difference between the 3080 and 3090, but there's a massive leap in performance from the 30xx series to the 4090, hence why he compares against the 3080, which was itself the biggest leap over the 20xx series. What we're seeing now is Nvidia finally deciding to make the flagship worthy, with its gigantic increase in performance over the 30xx cards and the 4080. I am thinking of returning mine due to the new power connector being quite poor.
Daniel, are you worried about your 12-pin connector? There was a Reddit post where a 12-pin connector burnt. Many have speculated that the cable may have been bent during installation, but it still poses a concern. Old PCIe cables are more durable and not as prone to this issue. What are your thoughts and concerns?
Thanks for the video, I'm definitely waiting for the 7900XT since I have no interest in the frame generation and Ray Tracing. Hoping for same to higher raster performance and lower pricing.
The problem is the GPU itself. The graphics card of a 4090 offers more vram but the processor isn't a huge leap over the 3080's processor to justify the cost. The biggest difference is the clock frequency and pipeline. (CUDA cores). The 5080 will be the upgrade from a 3080 that is worth the price and perf. jump.
I never see this mentioned, but you can cap your framerate without using VSYNC via the RivaTuner that comes with MSI Afterburner. I always cap my framerate a couple frames below my monitor's refresh rate to make G-SYNC more consistent. LinusTechTips did a video on this for G-SYNC years ago.
This applies to FreeSync as well. I do the exact same thing. Also, people just aren't aware that RivaTuner provides the most consistent frame times compared to other frame-limiting software. There's a really good YouTube channel called battlenonsense that ran various tests on this and found that RivaTuner, with the framerate capped just below the monitor's refresh rate, provides the best frame times while still keeping latency low compared to all other frame limiters. Unfortunately, the person who ran that channel doesn't upload anymore, but it has really good info and tests.
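The idea these comments describe, capping a few FPS below the refresh rate and sleeping off the leftover frame budget each frame, can be sketched in plain Python. RTSS does the same thing with far more precise timing; the 3 FPS margin below is just the commonly suggested value, not an official number.

```python
import time

def limiter_cap(refresh_hz: int, margin: int = 3) -> int:
    """Cap a few FPS below the refresh rate so frame times stay inside
    the G-SYNC/FreeSync window instead of bouncing off v-sync."""
    return refresh_hz - margin

def frame_budget_ms(cap_fps: float) -> float:
    """Time each frame must take, in milliseconds, to hold a given cap."""
    return 1000.0 / cap_fps

def run_frames(n: int, cap_fps: int) -> float:
    """Toy render loop: sleep off whatever is left of the frame budget.
    Returns the achieved FPS, which should sit at or under the cap."""
    budget = 1.0 / cap_fps
    start = time.perf_counter()
    for _ in range(n):
        frame_start = time.perf_counter()
        # ... actual render work would happen here ...
        leftover = budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    return n / (time.perf_counter() - start)

print(limiter_cap(165))                 # 162
print(round(frame_budget_ms(141), 2))   # 7.09 ms per frame
```

For a 165Hz panel like the one mentioned above, that means capping around 162 FPS rather than exactly 165.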
Your reviews, comparisons and all the content you post is always super detailed and exactly what I’m looking for when I’m needing info. Keep up the great work
I recently upgraded my GTX 1080 to an RTX 3070, which I got for $360 on ebay, and I am honestly blown away by the performance difference. Granted I haven't played any of the latest AAA games but of the games I have tried so far, my 3070 has been able to do 4K @ 100+ fps without any trouble. I was honestly expecting to be limited to 1440p so I was quite surprised that I was able to do 2160p quite easily. I'm guessing all these websites are using Ultra presets because it seems if you are willing to drop the settings a bit you absolutely do NOT need a 3080 to do 4K/60-100.
Thank you for your comment. I am planning to upgrade my 1050 Ti soon, and my final choices are the 3070 or the 3080 Ti, with the goal of playing Cyberpunk and Warhammer 3 smoothly at 1440p. Can I ask if you've tested your rig with either of these games yet?
@@kevinng0703 I’ve had setups similar to both your upgrading paths. As long as you pair the cards with a reasonable cpu (mine was 5800x, but with a 3070 you can get away with like a Ryzen 3600) the 3070 can handle 1440p no problem w/o ray tracing. To turn RT on and keep 60+ at 1440p, I’d go more towards the 3080 ti upgrade path. I use the Digital Foundry optimized settings for Cyberpunk which is about high settings with ultra textures and I usually play at 4096x2160 60 fps with ray tracing off on my 3080 12 gb (it’s basically a cheaper 3080 ti/3090 and is my recommendation to get the 3080 12 gb over the Ti or 3090 cuz you get 3090 performance for 3080 10 gb money). I hope this helps with deciding your upgrade path.
Yeah you do. You just said you didn’t play AAA titles. I have a 3070 and 4k at 100 FPS is for old titles. Gotta play with the settings and use DLSS for 60 fps on most AAA titles at 4k - sometimes for 1440p
The perf difference wasn't that big, to be honest. Sure, with DLSS, but it's not like you upgraded to get twice the fps; you didn't upgrade from a 1060 or equivalent. From a 3070, a 3080 12GB / Ti / 3090 is what I would call a worthy upgrade.
I'm not understanding this frame data at all. Just last year, the 3080 at 4K was over 60fps in most videos. Now, because the 4090 released, the 3080 hardly gets over 40fps in 4K vids? Like, huh???
Something that isn’t talked about too much is 16:9 vs 21:9 vs 32:9. If you are running a ultra wide or super ultra wide a 90 series might be worth it as Cyberpunk can run up against that 12gb vram limit on those type of monitors
You're absolutely correct. There was a massive difference in Cyberpunk and even more noticeably in Dying Light 2. The 3080 just couldn't run RT very well on my ultrawide when I was using the 3080 vs. now it runs like a dream.
@Husha563 get the 4090. AMD just doesn't have the same quality in their drivers nor the level of technology in their cards that Nvidia does generation after generation.
Great job as usual Daniel. I tested my 4090 vs my former 3090 Ti and was pretty blown away by the increase in performance for games. I tested it in Deepfacelab and gained around 25% speed on my model training times, and it also increased my frame rate in Deepface Live by around 30%.
@@EvLTimE If you need the VRAM and can find a good price on a 3090/3090 Ti, then sure.. But for just gaming, I don't see a reason for someone with a 3080 to upgrade to a 3090 Ti. I would wait and see the new AMD offerings and the 4080/other mid-range cards from Nvidia. I use the VRAM for Machine Learning tasks but for gaming, 24GB is still pretty massive overkill.
@@EvLTimE I went from a 3080 to a 3090 Ti. It's a nice upgrade for 1440p. I only have an i9 10900K and it bottlenecks the 3090 Ti sometimes in games. So if you don't have a 12900K or better and don't play at 4K, the 4090 is a waste.
@@EvLTimE I would say it's more worth it to get the 3090 Ti. The i9 12900K bottlenecks the 4090 in so many games at 1440p. The 4090 is only worth it for 4K 144Hz imo, and then again you need a really expensive monitor for that too.
Hello Daniel, there is something I don't understand when you say that the video card is limited by the CPU. For example at 14:25, your CPU usage is between 22-35%, so it's not maxed out, then how can it be the one slowing down or I should say keeping you from getting more FPS? Thanks in advance :)
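A partial answer to the question above: the overall CPU percentage averages across every core/thread, while games are usually limited by one or two heavy threads (the main/render thread). One pegged thread caps the FPS even while the average looks low. A toy example with hypothetical numbers for a 16-thread CPU:

```python
# Hypothetical per-thread utilization: the game's render thread is pegged
# at 100%, one helper thread is busy, and the rest are mostly idle.
per_thread_usage = [100, 80] + [15] * 14

average = sum(per_thread_usage) / len(per_thread_usage)
print(f"average CPU usage: {average:.1f}%")            # ~24.4%, looks far from maxed
print(f"busiest thread:    {max(per_thread_usage)}%")  # 100%, the real limit
```

So 22-35% total usage is entirely consistent with a hard CPU bottleneck; per-core graphs (Task Manager's logical-processor view, or HWiNFO) show it more honestly.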
I have an R9 3900X and a 3080 10GB that I was able to get relatively early on for ~$850. I think I would like to upgrade to a R7 7700X and 4090 for a 1440p 240Hz monitor, maybe within the next year...but I'll keep an eye on where things go for now
Been running the old, beat-up 1080 Ti Duke and it did a great job for a lot of years. When debating going 3080 Ti or 4090… this review helped me decide on the 3080 for cost-benefit returns. I thank you.
Yup, skipping this gen as well. Got my 3090 Ti TUF and it works perfectly for 3D renders. Sure, the 4090 is "nice to have", but for that price... Not to mention the possible upswing in power... No thanks.
I've heard that by enabling the frame cap in the nvidia control panel to 1-2 FPS under half your refresh rate, it will still get doubled by DLSS 3 and the input lag will be comparable to unlocked framerate.
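If that tip is right (I haven't verified it, and the 1-2 FPS margin is the commenter's suggestion, not an official figure), the arithmetic would look like this:

```python
def fg_cap(refresh_hz: int, margin: int = 2) -> int:
    """Base-framerate cap just under half the refresh rate, so that DLSS 3
    frame generation's roughly doubled output stays under the refresh rate."""
    return refresh_hz // 2 - margin

base_cap = fg_cap(120)    # 58 fps actually rendered
generated = base_cap * 2  # ~116 fps displayed after frame generation
print(base_cap, generated)
```

On a 120Hz display that would mean capping at 58, landing around 116 FPS after generation, inside the VRR window.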
Great premise, comparing the 3080 to the 4090. I'm planning on ordering a 3080 gaming PC; I also felt the 3090 wasn't enough of a difference over the 3080 for the money. As for getting the 4090, I'll probably wait till Intel 14th gen comes out in 1-2 years, by which time the platform and 4K gaming will be much more common.
Just FYI, you can cap pretty much 99% of games using the Nvidia control panel or MSI Afterburner with RivaTuner. You said you hate it when you can't cap it. Hope that helps. Keep up the great reviews!
FINALLY! Thank you for this video / comparison! Everything I wanted to see. I'm on a 4K OLED TV, so I've been waiting for something that can singlehandedly handle 120FPS in games (preferably withOUT Frame Generation). Definitely planning to use DLSS, just maybe not FG in something like Call of Duty or Spider-Man. It's just the MSRP... yeesh. Excellent coverage, keep doing your thing. Subbed. (Your timestamp game is godly btw)
Yep, just upgraded from a 3080 to a 4090 with an LG OLED, as I wanted 4K 120Hz at very high settings and the 3080 needs lots of help to get there. Happy with the results. Great video.
This dude is cappin and he is cappin big. There is no way an RTX 3080 could run above 35 fps even with DLSS ultra performance lol. He is just comparing an RTX 4090 with an RTX 3090 Ti. Tip of advice Daniel Owen: AMD has lost the war against Intel, let alone Nvidia lol, so don't get your hopes up. Instead of making videos in your mom's basement go get a job on Monday so you could build your dream PC with an RTX 4090🤣🤣🤣🤣
Would a 5800x3d be a good pair with the 4090? I really didn’t want to get a next gen cpu but now I’m feeling like maybe I should. I currently have a 5800x.
yes, the 5800x3d is literally the best cpu for the price to performance. (well before the 13600k) Just get a dual rank ddr4 kit (2x16) that is 3600-3800mhz c16
4090 is a gpu that absolutely trashes the 3080/3080 ti/3090/3090 ti in terms of raw performance, that's what it is. Its a halo card which costs a fortune.
@@zeus4634 You sound extremely salty lol. People who "make videos in their moms basement" have the potential to end up earning more money from monetization on these platforms compared to working a 9-5 all your life, unless you become a Lawyer, Doctor, Dentist or psychiatrist, or software engineer and even in some cases, those basement dwellers earn more than them as well.
I'm almost 40, and what I've noticed is that gaming has become an activity for snobs. Most users can't tell the difference between 4K and 1080p, and the small details you are showing here don't matter if the game is good. There is no reasonable cause to spend so much money on a GPU.
From what I can see, Nvidia did a great job prioritizing what needs the most performance increase. Rasterization performance is more than enough. They focused on RT core performance, which is where it is really needed. The cards are also much more efficient. In his apples-to-apples comparison, the 4090 performed almost double the 3080 12GB while using the same amount of power.
@@teddyholiday8038 but the performance is still way better at the same power level. More efficient. If limited to the power level of the 30 series it would still perform much better.
Thank you for this! Exactly what I've been looking for (have watched a lot of reviews/coverage and this is by far the most comprehensive, and clear comparison!). My situation is pretty much exactly as you laid out. I purchased a 3080 at the end of 2020 (but was lucky enough to get it directly from newegg). And have recently upgraded my TV to the LG C2 Evo. And was thinking of getting the 4090 for exactly that, outputting 4k@120hz. Sounds like it's the move I'm going to make! Now I just have to patiently wait through stock notices... and manage to grab one at MSRP...
I have the LG C9--yes, from 2019. Same exact boat as you with the 3080, and it's wild that my chase to finally be able to fully utilize this TV in every game seems finally over. That is, once I get my annual bonus haha. Maybe they'll be in stock by then
It's gonna be interesting to see how much of these they can fix. When you think back on how bad DLSS 1.0 was, blurry and glitchy mess. If they fix it so it looks just as good as regular DLSS 2.0 then frame generation is a game changer! Although they have to fix the input lag.
Input lag is what makes this damn lame DLSS technology useless, and it needs to be stopped; improving it is just a waste of time. Same as G-Sync and AMD's version, it's just a layer of input lag... go back to analog :D
Actually upgraded to the 3080 about a week ago from a 1080 Ti. Personally I think it's a great card that will last years! Not everything needs FPS FPS FPS, enjoy the damn game! I think most gamers have lost the point of gaming and now only care about who has the best hardware etc etc. Thank you for this video!!!!
I completely agree with you. I just upgraded from an RTX 2080 to an RTX 3080 Ti, and from what I've seen in benchmarks these higher-end 3000 series cards absolutely slaughter the next-gen consoles and will be monsters for years to come.
Good video. I went from an RTX 3080 10GB to a 3090 24GB. Where the RTX 3080 10GB runs into trouble is when it runs out of VRAM, and that can happen in any game with user-created mods or levels. Actually, game designers are pretty good at keeping levels playable. I noticed this with Far Cry 5 arcade mode at 4K with settings turned up. Far Cry 5 is an old game now, but there were arcade user-created levels where the fps would just tank at 4K with settings cranked up, because there was too much going on screen and it used more than 10GB of VRAM. The 3090 NEVER has that problem.
When I heard TechDeals mention he saw VRAM usage go above 12GB on his 3090, that was an incentive to go with the 3090 over the 3080 (especially with the prices being within $150 of each other when I bought it recently). I rarely have time to game these days, but when I do, I want to enjoy it as much as possible. I have a big backlog, so I won't need to upgrade for years. I'd still like to know I can play whatever comes along in the future.
@@Phar2Rekliss Nvidia upmarked both the 4080 12GB and 16GB models from 4070 variants. The 4090 may very well have been meant to sit in the 4080/Ti tier but was moved up too, just as the 4070s moved to the 4080 tier, in order to make this shift look better.
Is it possible to get past the tearing when frame generation is on by leaving vsync off and power-limiting the GPU so that frames never go over 120Hz? Or is the variance too high?
I had a 3080 10GB and switched to the 4090. The performance increase in RDR 2 was so high I could not believe my eyes. I maxed it out out of curiosity, and even at native the margin is enormous. 6K maxed was smooth; 8K maxed wasn't, but still 35-45 FPS.
Just went from a 3080 Xtreme Waterforce edition to a 4090. If I were just playing flat-screen games I think the 3080 was great... trying to run games in VR on a Pimax 8KX was a different story. The 4090 has been a complete game changer for me, and with frame generation I'm running Spider-Man on my OLED with ray tracing maxed at native 4K, above 120fps. For me it's absolutely been worth it.
This is my main reason for wanting to upgrade. My 3080 is fine for pancake but I also do a lot of intense VR titles on a Reverb G2. I’ve been struggling with DCS because my frame rates are right at the cusp between being ok and being uncomfortable. I think the 4090 would make a big difference for me.
The 3080 10gb was and still is a very GREAT card for 1440p and it destroys at 1080p. It’s reasonably good at 4k as well. MAN I LOVED THAT CARD. Unfortunately I sold it. GOOD TIMES ❤
I currently have an RTX 3090 which I bought way below its MSRP and even cheaper than some of the RTX 3080 Ti cards during a sale from one of the online shopping apps in Asia. I have to say, spending almost twice the amount (or even more for some brands) for an RTX 4090 seemed impractical from my point of view as the eye-candy deal of the RTX 3090 was a legit steal in terms of price-to-performance value. I'm happy with what I got as I know it will last me for years with my type of usage. 😄
Very interesting video. I have been looking at used 3090s on eBay for about $700 now. haven’t looked into 3080s much. My cpu is old 4770k at 4.4ghz but I have been gaming in 4k for about 10 years on it with a 980ti hydro. I only really play Arma3, Warthunder, DCS and Squad right now and I can play 4k in all those 30-60fps with G-sync for last 10 years. I will upgrade CPU next year I think but with price drops on 30 series they seem to be good bang for buck.
Just go and buy, I've switched from 970 to rx6800xt. And also play competitive games (war thunder, PUBG, escape from tarkov). It's significant performance and quite improvement. Especially in games you mentioned, they like high frame rates to reduce input-lag. You never regret about this decision. I'm playing in 4k resolution.
Used 3080's are selling for $500 right now. And the 3070's are very close to $350. It's honestly the best time to pick up whatever used 30's series you can afford
The CPU bottleneck might go away with DirectStorage 1.1, because asset decompression can bypass the CPU. Worth doing this comparison again after it's released to measure the impact.
Is dlss 3 for 30 series also or is it only for 40 series, I know frame generation is 40 only but what about dlss 3, do we get that for older rtx cards like the 3080ti?
One thing I am sad about is that I bought the 3080 Ti right as the prices were crashing, so it was still like 10% above MSRP, versus when it had been like 1600-1700 USD. Something in me was telling me to just wait, since I already had a 3080 laptop that was meeting my gaming needs while waiting for a GPU to complete my PC setup.
It’s all good man, as pc gamers I feel we all go through something similar at one point or another. End up spending money on something we don’t necessarily need at the moment. As long as you’re enjoying it 💯
Im considering upgrading my 3080 because its starting to show age in 4k AAA titles, it has served me well, and i have thoroughly enjoyed it. Personally, its not within my budget to buy an rtx 4090 just for the spec increase alone, but I fully intend to buy a 5080 or 5090 when they release, because i believe they will be worth the ridiculous price. I am content with a 3000 series card and i believe if you haven't upgraded yet, then don't! Wait until the new gen comes out, and then decide
I don't see the visual loss in performance/balanced mode that people keep talking about. Is it just me? Is it hard to see on YouTube? Do you have to know what you are looking for?
I currently have a 3080 10GB. If I got a 4090 free in a contest or something, I would be ecstatic. But I got my card for the MSRP, and gaming just isn't important enough to me to be spending $1600 on a GPU, especially at a time when our economy is being driven off a cliff.
Which economy u talking about?
Odds are he’s American and he assumes everyone else in the world is too 😂
@@Rupes0610 🤣
@@Rupes0610 :))). You do know most of the time when the US goes into a recession, so does the rest of the world right?
@@Rupes0610 Europe has it worse, but go on.
I have a 3080 10gb and honestly it does everything I need, going to skip this generation.
Same here I may even wait 2 gens
@@boohoogamer1713 Yes me too, I don't mind using DLSS pretty aggressively (in 4k even the performance mode looks fine to me) so this card will be relevant for a very long time for me.
Going to skip this gen also, first time I've ever skipped a generation. If the 4080 was 10% or so less than a 4090 I would get one but at the price to performance they are asking for then no chance
@@Fantomas24ARM dlss is amazing in my opinion, I mostly use 21:9 3440x1440 at 120 and some games 4k 60. So it is plenty for a while I think. I'm disappointed that dlss 3.0 isn't supported though
Surely 3080 is fine for everything except 4k.
This is hands-down the best review of the 4090 I've seen, and I've watched them all. What made this so great was the direct apples-to-apples comparison between the new gpu and one I'm well familiar with. This is real-world, applicable knowledge which is much more helpful. Thank you!!
Why dont you say something without saying something.
Its because he is a real person, not a paid-off techtuber
Couldn't agree more
It's fake, 4090 power ports didn't melt.
@@xAndrzej42 LOL 😂
The 4090 actually starts at $2,500 in my country, and I paid $850 for an OC version of the 3080 a few months ago. So when I read $1000, I can tell you that the value of the 4090 can definitely be way worse outside the US. On top of the 4090 you will need a high refresh rate 4K monitor, which adds another $550. For me, a 3080 is more than enough to play everything I want on max settings at 1440p. Even Cyberpunk works great with DLSS at 1440p. The crazy thing is that prices are rising again for the 30 series as well. It costs $70 more now; it's scratching at the $1000 mark again...
Well, the 40 series is meant by Nvidia for next-gen games. Look at what Unreal Engine achieved with tech similar to what Euclideon invented. So when next-gen games hit the market, the 3080 could be good only for low to medium graphics settings at 50/60fps.
Same, I have a 3080 10gb bought at msrp, sold a 3070 at a scalped price to a miner (1200 euros, almost paid the 3080 lmao) and i'm all fine with that. I'll skip the 4000 gen and buy another gpu in 5 years or so.
He states at the start of the video that the comparison is mostly for those of us who got the 3080 super early. I paid $1200 for my 10GB card 3 months after release... 2 months later they went for $1800-2000 USED. The highest shop price here when the mining craze was at its worst was $2300 for the 10GB 3080. So for me the cheapest 4090 is $2100, and it's like he says.
@@SuperKREPSINIS That's the slight benefit you get from buying a high-end card. 50/60 fps on low-medium settings? When do you think that's gonna be? Sure, next gen games will hit the market in a few years. By that time we will probably have 50- or 60-series and a mid range card will most likely outperform the 4090 anyway.
@@lgeiger Midrange outperforming the 4090 in a few years? that I wanna see!
I picked up a 3080 10gb for 500 used and sure it definitely was a mining card but so far its been thoroughly tested and giving me a fantastic 4k gaming experience. Honestly I'm satisfied.
Same. Best purchase I've ever made.
I just bought a 6900XT for a pretty reasonable price. Considering I'm upgrading from a 1060 6GB I cannot wait for it to arrive tomorrow! I've been waiting over 5 years for a new card. My 9700KF will finally become the bottleneck! lol
I spent too much on my 3080, but my other card shat the bed. After purchasing it, nothing really stresses it anyway, so there's no point in upgrading for me. Because I spent a bit more than I should have, I'll run it for the next few years anyway.
@@ruffastoast8570 yeah and after seeing how bad AMD lied to us about 7XXX performance a month ago I don't need to upgrade for awhile lol
It's also worth noting, most games aren't worth running at ultra settings. You're usually gaining very little in visuals, but taking quite a big hit in performance. Hell, in many games, going above medium on settings adds very little but tanks your performance. I'd definitely recommend keeping textures high/ultra in most games, but lowering the majority of other settings to medium. You'll likely notice very little difference, but the FPS gains will be huge!
“Ultra settings are for screenshots, high settings are for playing.”
this is the reason why i also always use optimized settings suggested by digital foundry or hardware unboxed (when possible) !
you mean its a law of diminishing returns?!
Not everyone uses this stuff for gaming, video upscaling from my 3060 to 3080ti was like 4 times faster even though I really didn't see a real bump in games. Also see differences on any high end video editing program like using Davinci Resolve would probably be noticeable.
I usually just hit "Recommended" if a game has it and play at whatever it spits out. Total War: Warhammer 3 put me on ultra, and the benchmark says I am getting around 144fps (my monitor's max frame rate). Weirdly, if I start dropping things to high or medium to peg the game at 144, it starts losing frame rate...
Just me personally: I just upgraded to a 3080 12gig yesterday and I’m thrilled with it. It only cost 730 bucks and I can’t find a 4090 for less than 1800. I game on a 1440p monitor, I think I’m set for the next couple years.
Also, I would need a new PSU for a 4090 so that’s another 180 bucks or so
Exactly! The 4090 isnt an easy upgrade, you can build a whole other pc with the price of the card 😭😂
At 1440p a 1080 Ti will rock all games. I don't get kids these days with 4K gaming; 1440p is best for real gamers.
@@grego10r i def think 1440p high refresh gaming is the sweet spot. But i can see the appeal of 4k gaming getting more popular as 4k displays are more mainstream than ever before. But i still think we have a ways to go before its as popular and affordable as 1080p/1440p gaming
3080 absolutely demolishes todays gaming, you are set for years.
@@Lyu-Phy Yeah its a great card and Im happy i grabbed a slightly used one for $150 off the $700 msrp. I plan to keep it for a while and go the used market again when 4000 series cards get cheap at the end of their cycle lol
Really good analysis here. I don't see ANYBODY else looking this closely at the user experience with various settings.
I've been enjoying your videos for a couple months now, and finally caved in and subbed :)
(PS: there is a video the world hasn't made yet... Can the 4090 run Crysis?)
The new version of 'can it run Crysis' is 'can it run Star Citizen', because Star Engine is a highly modified (almost completely rebuilt) version of CryEngine, which is what Crysis was made to show off.
This dude is cappin and he is cappin big there is no way RTX 3080 could run above 40 fps even with dlss performance ,He is just comparing RTX 4090 with RTX 3090 100%
@@zeus4634Buyer's bitter remorse? Grow up :-3
Frame generation with the power of DLSS quality mode is actually crisper than native, and artifacts are totally unnoticeable unless you slow it down by 250%, and Vsync compatibility is on the way. So if you can't work, stop calling the earth flat 😂🤣
I know we're testing the GPU, so we're going for ULTRA quality settings, but I'm always wondering how HIGH settings look compared to ULTRA and what the performance gain looks like.
Often, ultra settings add imperceptible changes that are very taxing on the GPU.
Thats what I loved about this review! By far the best review I've seen in a while!
Indeed but after I spent 1300€ to buy a 3080 last year I'm not going to accept only high settings anywhere 🤣
Truth be told High settings in most games are almost indistinguishable from Ultra especially if you are focused on actually playing game
@@odytrice Thank you! I've been saying this for years. I can't see any changes in most games from high to ultra. Even if I can see a difference it might as well not be there as I'm focused on the game.
@@odytrice oh yeah 100%! If we actually add that into the equation i could almost not care less haha
I'm still rocking an RX 580 and had to play Cyberpunk on low settings with FSR on, and I've never been bothered by how it looks. It was also the first game of its kind I played, so I was super immersed.
Actually, I was always stunned by how great it looks, but then wondered how it would look with better settings.
When you're testing GPU performance you want to maximize GPU usage to get a better comparison. Otherwise you might get CPU bottlenecking.
I am so happy with my ASUS RTX 3080 12GB that I really don't see the need to upgrade; it's not worth the money. It would be really interesting to see how many players could tell the cards apart just by looking at the images, with no naming of the cards and no values shown.😉
Looking at two static images isn't a relevant way of comparing graphics cards. All graphical settings should be identical to perform an accurate test of relative performance.
Comparing footage on YouTube typically doesn't do the difference between graphics cards justice either, as most YouTube footage is limited to 30 FPS. Unless you're stuttering below 30 FPS, there won't be much if any difference in the videos.
If you took a 3080 and 4090 on a high refresh rate 4K monitor and modern graphically demanding games, the difference would be night and day.
In many cases, a 4090 is able to generate roughly double the frames of a 3080.
Man, your channel is hella underrated. You talk sense in every video and give genuinely good advice.
He knows his shit and touches on issues I want elaboration on. Job well done.
@@user-ve2jj1ik4b Not really, Hardware Unboxed called out openly LG if I'm not mistaken for attempt to manipulate the review and released a video on it with all the emails. Some people don't care about money and do it because of passion, and you can clearly see the passion in this guy
well he is a teacher... so it comes easy when it comes to explaining stuff which makes his video's top notch
For price to performance, I would still go with the 3080. 60 fps in single player is fine imo. I'm more interested in seeing what AMD releases in the $700 range!
Yup 60fps is still the standard for me. Heck, growing up in the 90s (and as seen even in modern console releases), 30fps isn't all that much of an issue if I'm being honest.
60fps is playable but feels slow, laggy and old. My eyes are waaay too used to 144fps on a 144Hz display.
@@kubasniak But you didn't always have that 144Hz, so you make no sense; you didn't always game on that for years.
@@cmoneytheman How does his preference not "make sense"? It is his opinion. I agree with him, to be honest - although it bothers me much less in a singleplayer game.
@@cmoneytheman yeah, but now he experienced it and doesn't want to go back.
Just like I've been using a 75Hz screen for a few years, and now 60Hz feels really slow and choppy (despite being just a 15Hz difference).
Recently upgraded to a 3080 12gb and I'm happy with it. Less than half the price of a 4090 and no intention of 4k gaming so fits my needs.
Same here, I came from a 1660ti so it was quite the upgrade. I only play single player titles and use a 60Hz display. The 3080 12gb was just what I needed and was on sale @Amazon in late Aug for $799.00.
Where I am a 4090 is 2600€ which is totally insane. I got my 3080 for 800€ back then.
@@SnaFubar_24 im still using a 1660ti, is it really worth the upgrade? any fps increase you can share?
@@PotatoHead7662 Unfortunately I don't, sorry. I do think it was well worth the upgrade since the price was great at the time. I've had issues with GPU overheating which I assumed was due to gpu not keeping up since turning down graphics settings helped somewhat. After cleaning and repasting my cooler and seeing very little improvement I started researching gpu performance stats and upgraded my system a few months ago. Before that I knew very little about GPUs.
@@SnaFubar_24 Ah ok. I'm still having 70-100 fps (med settings) on cyberpunk on a 1080 monitor. Which seems..decent to me, just wondering if its worth the upgrade now or if i should wait till i upgrade to 1440p or 4k.
Also... 3000 series cards have gotten a bit cheaper, but they're still fairly pricey atm.
The 4090 and 13900k's efficiency gains over last gen are interesting -- it means you can undervolt and power limit them and get better performance at much lower power draw. If these can be changed easily with presets it'd certainly be nice to run a low power setup for older games when you know you don't need the maximum performance.
Kind of like a toggle that steam deck has, that would be amazing actually. Custom power limits for each game.
Set up profiles with shortcuts in MSI Afterburner.
Lower power for older games? That's why we have FPS limiting, or even v-sync. And before anyone says input lag, nope, limited at 144 FPS the only input lag is in one's own head 😂
That was one big feature for the 40 series for me. I have an ITX build so my EVGA 3080 XC was running very hot on games. Easily running around 80-90C on the more intensive ones, undervolting was necessary to keep it quiet. The price for the 40 series is absurd, but I got a good deal for a 4k monitor on Black Friday and my friend was interested in my card, so I took the upgrade to the 4080 (4090 way too large for my case). In comparison, I just finished Spiderman Miles Morales with all settings on high with RTX and was only pulling 56C on average with a quieter fan curve. An insane difference, I can actually keep my build near silent.
If you need to UV them, just buy way cheaper alternatives. Simple.
Beginner PC user here. Great video, I have a 3080 12gb, so I found this very helpful. You explain things very well, and I subscribed straight after watching your video! Keep up the good content
Bro, I don't recommend a 3080 12GB for a beginner PC user.
@@TheTrue22 yeah my current pc is a GTX 1650 Super (its my first)
@@bloodsportz yeah thats a good start my first was a 2060 8 gb
$1600-1700 is a unicorn price; on the real market it's $2000+.
4090 is in reality $1000 more than a 3090Ti
So?? You just wait until you get a card for $1700 or less... It will be much easier than last gen at least.
@@michaelangst6078 We will see; if that's the case, the 4090 at $1600 is a better deal than the $1000 3090 Ti.
60-80% more performance for 60% more money.
@@michaelangst6078 just to remind you, 4090 is still above 2000 euro ;) and new gen is coming out already
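For rough context on the thread's price-to-performance argument, here is the back-of-envelope math using launch MSRPs ($699 for the 3080, $1599 for the 4090) and the "roughly double the frames" figure cited elsewhere in these comments; real street prices vary wildly, as the thread points out:

```python
# Back-of-envelope value math: cost per unit of relative performance,
# with the 3080 as the 1.0x baseline (assumed ~2x for the 4090).
def dollars_per_relative_frame(price: float, relative_perf: float) -> float:
    return round(price / relative_perf, 2)

print(dollars_per_relative_frame(699, 1.0))   # 3080 baseline: 699.0
print(dollars_per_relative_frame(1599, 2.0))  # 4090 at ~2x: 799.5
# Similar cost per frame: the 4090 is faster, not a better value per dollar.
```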
On a 1440p panel I'll stick with my current gpu as price to performance is not worth it in the long run.. interesting stats tho
What gpu do you currently have Neil ?
@@biovultures7714 a second hand 3080
Yeah, 3080 is still fantastic for 1440p and the 4090 is overkill. If I remember correctly you still have a 9900K CPU and at 1440p even my 5950x wasn't really keeping up with the 4090 in a lot of these tests.
@@danielowentech Yea, correct: my i9 9900K & 32GB DDR4 RAM will see me OK for the foreseeable future.
I feel this test. I was weighing the value of the 3080, then trying to see if the difference up to the 4090 was worth it. I ended up getting a $425 3080, because there's no way the 4090 was worth 4x the price.
Thats a great price! I grabbed a used 3080 for $550 and still love the price to performance im getting with it. I will def hold onto the 3080 for a while, no need to spend that much on a 4090, dont even think it would fit in my case lol
@@Bdot888 yeah I nabbed a 3080 12gig a few days ago because it was cheap and I would need a new PSU and new case for a 4090, so I’m just gonna wait a couple years for a brand new RTX 50 series build with a Zen 5+ CPU. Until then I’m happy with a 5900x and 3080 12gb for 1440p gaming
@@teddyholiday8038 oh yeah definitely! We will be more than satisfied with the 3080 for years to come!
$550 here too. Great card. Glad I waited. If I can ever get a 4080 or 4090 for under $1000 I'll do that. I'm sure it'll be off the used market in a 2 to 3 years time when some other card is sought after and I'm fine with my 3080 till then. ☺️
Hopefully you got the 12GB VRAM version, because 10GB already feels at its limit in some games; just look at FH5 in this test.
I got a 3080 10GB at release at MSRP and I'm really tempted to upgrade again this generation since I've upgraded the past few generations (1060 to 2070 super to 3080) but I think I'll skip this generation because at this point an upgrade really just isn't necessary. The 3080 is already an incredible card
Just get a 5800X3D CPU while the price is good. Got mine for $305 on sale, upgraded from a 3700X. 100% worth it. Very good FPS gains.
I appreciate your channel so much.
It often feels like you're the only channel that is in touch with non-YouTuber PC gaming enthusiasts (I hope my sentiment makes sense)
Either way… thank you!
If excessive FPS is an issue, why can't you cap the FPS within the driver? It's been available for years now... I've got mine set to my monitor refresh rate (165) all the time.
You certainly can cap your FPS to the native refresh, but many games will still suffer from tearing by doing it that way. Some games v-sync is required to eliminate tearing, and of course, that introduces input lag.
With DLSS 3 Frame Gen, capping the FPS causes latency problems (probably because Frame Gen has to wait for a future frame to render to produce its in-between AI image) and is not officially compatible (games disable vsync and framerate limits by design when you enable Frame Gen for this reason).
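A simplified model of that latency trade-off (my own illustration; real pipeline overheads and Reflex behavior differ): the generated frame is interpolated between two rendered frames, so the pipeline must hold a finished frame back by roughly one base frame-time before it can display anything.

```python
# Simplified frame-generation latency model (illustrative assumption:
# interpolation holds one rendered frame back, overheads ignored).
def frame_gen_model(base_fps: float):
    base_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2        # FG roughly doubles displayed fps
    added_latency_ms = base_ms          # the held-back frame's worth of delay
    return displayed_fps, round(added_latency_ms, 2)

print(frame_gen_model(60.0))   # (120.0, 16.67): 120 fps shown, ~17 ms extra
```

This is why motion looks twice as smooth while input response still tracks the base framerate (or slightly worse), and why capping the base framerate lower makes the added latency more noticeable.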
@@danielowentech As you probably perfectly know, screen tearing may appear also when the number of frames matches the refresh rate number, cause some frames take longer to render than others. GSYNC/FREESYNC should solve the problem...
I'm struggling to understand what the actual issue is. Do you mean that since DLSS 3 inserts "fake" frames into the animation, it makes it more noticeable when there are fewer "real" ones?
I think an important thing to mention about Frame Gen is that NVIDIA have said that they plan to make VSync available in the future, but that its still a WIP
Why use frame generation at all other than playing with a new toy and then getting bored and turning it off?
@@dra6o0n Because in many games where you are hitting a CPU bottleneck, frame gen lets you bypass that substantially and gives you more fps.
@@harshtigga7596 Still at an input lag detriment. It's just not worth it in most games.
And they can still say sike.
@@dra6o0n To double the framerate, of course. But you need a monitor that can actually handle the framerate, and as it is Daniel doesn't, so I can see why he isn't using it.
It looks like even Ryzen 5000 can't keep up with this card in a lot of cases. I was hoping to be able to wait another gen to upgrade my cpu but I'm starting to think I may need one to take advantage, even at 4k
I'll be pairing the 4090 with an i7-12700K. Since I got that CPU recently I don't want to upgrade again already to the i7-13700 (and I refuse to pay i9 prices; maybe ironic to say, since I paid a near fortune for a 4090, but... whatever lol). From what I'm seeing, the bottom recommendation is an i7-12700K, but I'll get to start testing it next week.
@@kennylaysh2776 in fairness to you, the 4090 is the new 1080 Ti, it’s going to be high end for 5+ years. They clearly went overkill with it to stay ahead of AMD. Paying a fortune for a 4090 is still much better value than going from a 12700 to a 13700
@Transistor Jump um, of course, that’s not what I said
Also, do people realize how obnoxious and dismissive it is to start a reply with “nah?”
@@teddyholiday8038 It's what you implied.
At 18:51 your metrics show the 4090 doing better on head room, but it seems to stutter. The pan is slow and smooth on the 3080 side but I see stutter on the right.
I have a similar setup to yours except I have a 5900x but have been unable to replicate the same fps. I've been averaging around 43 fps with RT reflections at 4k. Are there other settings you have changed that have given you that many more frames?
I really needed this comparison! You're right, I'm sure half the people using a 3080 are trying to see if the 4090 is worth while. Love your channel! Keep up the great work
I'm in the half who aren't. I get all I need and want from 3080 12GB on 1440p raised to simulated 2160p (DSR 2.25 DL). Getting a 4090 together with commensurate additional hardware would be the definition of wasteful. For the half who conclude that it's worthwhile: please send your donations to my worthy cause :)
The jump is indeed great, but for the price right now, meh. I'll wait, and as long as Nvidia doesn't go scummy on it, I'm fine with my RTX 3080. The 3080 is still a beast. I can play many games at 1440p and over 144fps, even the most recent ones with proper settings, and DLSS is also a big future-proofing factor. So the 40 series can wait longer.
Something else worth considering: people using UPS backup systems might need to upgrade those as well. I recently picked up a 6800 XT, and while gaming my UPS alarm soon started going off because the wattage draw was more than the unit could support. Had to shut off the monitors on my work PC, and that fixed the problem.
Same reason why I went with the i7-12700 (non-K) and RTX 3070 FE.
I think the 4090 is worth it if you want high refresh rate gaming at 4K. Otherwise you are fine with the 3080. I am currently using a 3070 Ti paired with a 1440p monitor and it seems fine with games of this generation. In 2 to 3 years though, with new UE5 games coming out, it will be time to upgrade if I want to stay above 60fps. Honestly though, the 4090 is a great GPU. If it comes down to around $1100-1200 it will be unbeatable.
4090 for 1440p for future proofing and high fps... at least that's how i am doing it 🤔
But I am doing this only because I've been on a 970 for a long time... if i had the 30 series i wouldn't even think of upgrading.
The first thing to spend your money on is a high refresh rate monitor, cause if you have a 4K monitor that only does 60 Hz then what's the point of getting an RTX 4090 🤣
I don't see the difference between 1440p and 4K. Most people just want bragging rights, which is straight up stupid.
I game at 4K high fps in flight sim with my 3080 Ti. I wouldn't waste my money on a non-Ti card; I won't even get a 3090 because the 3090 Ti exists.
3080 max Cyberpunk: 40 fps
4090: 50 fps
DLSS: 30 fps boost on the 4090. It has no real performance improvement; the 4090 is a gimmick card that you brainwashed kids would buy
I'm very happy with my 3080 12gb and will continue to be happy with it for a long time if these benchmarks mean anything. Thanks again D for quality content!
Me too!
why arent you using Gsync?
Weird to me too.
I am. It doesn't stop screen tearing when you are above your monitor's max refresh rate. It only helps within a certain fps range.
I bought the 4090 at launch, for fun. It kept maxing out my 12600K in any title at 1440p, so I realized I'd need to spend a lot more to remove the CPU bottleneck, plus a new monitor. I sold the 4090 for £50 more than the £1700 I bought it for, and now I'm using my old 3060 Ti. Conclusion: the 4090 brings other costs.
Well, the CPU would have a lot more to give if you just grabbed a 4K display with that setup.
@@ferrety2810 yup
@@ferrety2810 Yeah, 4k monitors are so affordable now, if you already had the money for a 4090, not sure why you wouldn't just spend a few hundred on a 4k display.
All you had to do was get a 4K monitor
Lol @ the replies.
Comparing the 4090 to the 4080 will be very interesting. They might not be as far apart as expected because of CPU bottlenecks and DP 1.4a limitations.
especially if you have a bad CPU that can't keep up :P
DP 1.4a means nothing; the compression works flawlessly and is lossless. Seriously, all these people are lying to you.
@@guywithalltheanswers6942 Then why will AMD have it on their new cards?
@@germanmade1219 That is irrelevant. The compression is lossless. Would it be nice to have 2.1 dp ? Sure I guess but it's not a real issue.
@@guywithalltheanswers6942 I call bs on lossless, where is your proof on that
Paired a 5600x with msi gaming z rtx 3080 10gb this month for my first build and it works very well at 1440p, don't really see a reason to upgrade
I'm still deciding if I should go AM5 or AM4. If AM4, then you can later upgrade from the 5600X to the 5800X3D when it's cheaper.
@@KlennR I have a gaming-only rig, so a 6-core will be more than enough. In my country an AMD B550 mobo was significantly cheaper; if I went with a Z690 and a 12600K it would've been a lot more expensive. The 12600K is head-to-head with the 5600X, and the 5800X3D is an incredible increase if I go the long-term CPU upgrade route.
@@KlennR DDR5 also isn't really better for gaming and not worth it.
@@hardcorehardware361 factually inaccurate about the Ryzen 7700x and 7600x.
you should get the 4090 to bottleneck your 5600x kid🤣
These cards are great to get if you plan to keep them for years to come; as CPUs get stronger, you'll gain extra frames as time goes on.
Very true!
Or maybe wait one more year if it's not urgent, as they will surely drop in price, especially compared to the games coming out, which aren't evolving that much in requirements.
4k 120+ hz monitors are also like $600+, which is a hidden cost. Not only is the graphics card almost double the price, but the monitor as well. Otherwise with a 1440p monitor you'll be getting an effectively equivalent experience between a 4090 and 3080. This is also on top of a bigger case and cpu upgrade you might want/need to do for a 4090.
Best info comment thank you.
I care about the speed advantage for competitive play, not 4K gaming.
Are the performance gains small when compared in that manner???
The biggest thing I learned from this video is that ray tracing (especially at its highest settings) still isn't worth it, even with the 4090, in some cases.
I can't believe people are willing to cut their fps in half for a relatively modest image improvement in most scenarios. Like, the performance loss is literally larger than the leap from 1440p to 4K.
I would love it if you could make a video comparing ultra Ray tracing with lower quality Ray tracing settings/ray tracing off to see if the image improvement is worth the gargantuan performance loss.
Definitely depends on the particular game and how it's implemented. In some games, you hardly notice any visual difference with ray tracing, and it tanks your performance. In other games, ray tracing adds some really nice visual bells and whistles, and the game will still run quite well. This, pretty much, only applies to 30 and 40 series cards, though. Turning on ray tracing in any newer games on a 20 series definitely tanks the performance into the depths of hell.
I got a 3080 Ti after seeing the 4090 prices. I play on the couch on a TV, and it does 4K at 60fps no problem. I came from a 1660 Ti and it's a huge improvement. The 4090 is amazing, but for what I use it for it's just overkill. I play single-player games; no need for more than 60fps.
Did the exact same thing
How many games don't have controller support these days? 4090 should be your GPU for the next 6 years at least so overkill now but you should get a long life out of it.
@@steveleadbeater8662 Or he can just skip this generation and buy something like a 5070/5080 for less money than a 4090 and get better performance over time. I got an RX 6900 XT a few months ago just because it was on sale. Now I'm skipping this gen, and whether I ever buy another GPU instead of a PS6 (if it comes out) or a new Xbox comes down to price. Right now the PC master race is only for very rich people who can afford a) the big price of one component, b) the big price of the other components to keep them from bottlenecking each other, and c) the energy cost when your entire setup can draw (I'd guess) 800 W to 1 kW.
Im gonna buy amd’s 4090 competitor or whatever is closest for 3440x1440p high refresh for slightly more future proofing. It’ll be paired with a 12900k at 5.5ghz so should hold up for a long while.
Interesting. Are you not able to cap frame rate under 120 to prevent tearing?
Also, what performance overlay is that? I'd love to have one that includes a frame time graph.
Would like to know too. Sometimes I feel stutters in games but I don't really see a meaningful fps drop, so I'd be curious to get the frame time graph too.
@@irishRocker1 It's MSI Afterburner.
What software is he using to get all of that info in the upper left corner?
I've had a 3080 10GB since it launched, got it at retail price around $800. I'm using a 165hz 1440p monitor and don't plan on upgrading it anytime soon. That being said, I've never gone out and bought an xx90 card and I told myself just this once, it's time to get the best of the best and have some fun. No regrets.
I'm planning on a setup with exactly your card and monitor specs. How is the experience? I'm new to pc gaming. Right now I'm on GTX 1650, 1080p 120Hz gaming laptop. I'm wondering if I should wait for the RTX 4070, and I'm also wondering if I should go for 1080p 240Hz. Any thoughts is appreciated!
same here, coming from a rtx 2060 to 4090 , 4k is something else
For now I'm satisfied with my 3080 10GB Gigabyte Eagle OC; for 4K 60fps high/ultra it's been sufficient, and only in a few games do I need to lower some graphical detail. When the 4080 16GB arrives in Brazil the price will be absurd, so I'll wait about a year after that before considering an upgrade.
For me, I would have rather had a 3090 versus a 4090 comparison, but thank you for the thorough review!
There isn't much difference between the 3080 and 3090, but there is a massive leap in performance from the 30xx to the 4090, hence why he compares against the 3080, which itself was the biggest leap over the 20xx.
What we're seeing now is that Nvidia has finally decided to make the flagship worthy, with its gigantic performance increase over the 30xx and the 4080. I'm thinking of returning mine due to the new power connector being quite poor.
Well, more people own a 3080 than a 3090, and the performance gap between them isn't even huge, so this video is worth checking out.
What's the name of the little program that puts the orange info in the top left corner? I forgot its name.
Daniel, are you worried about your 12-pin connector? There was a Reddit post where a 12-pin connector burnt. Many have speculated that the cable may have been bent during installation, but it still poses a concern. Old PCIe cables are more durable and not as prone to this issue. What are your thoughts and concerns?
Thanks for the video, I'm definitely waiting for the 7900XT since I have no interest in the frame generation and Ray Tracing. Hoping for same to higher raster performance and lower pricing.
x2
They said the 7950XT is 2.5x faster than the 3090; let's just see...
Just gotta wait for the 6090 it will dominate.
Would be interesting to see the same test but 3080 10gb (way more common) Vs the new 4080.
Would have liked a comparison to the Red Devil 6950 XT as well. It's down to $800.
Yeah, if RDNA3 is just super high end expensive I’ll probably look at 6800xt for $500 or less when I build a new rig.
The problem is the GPU itself. The 4090 offers more VRAM, but its processor isn't a huge enough leap over the 3080's to justify the cost. The biggest differences are the clock frequency and the pipeline (CUDA cores).
The 5080 will be the upgrade from a 3080 that is worth the price and perf. jump.
I never see this mentioned, but you can cap your framerate without using VSYNC by using the RivaTuner that comes with MSI Afterburner. I always cap my framerate a couple frames below my monitor's refresh rate to make G-SYNC more consistent. LinusTechTips did a video on this for G-SYNC years ago.
This applies to FreeSync as well. I do the exact same thing. Also, people just aren't aware that RivaTuner provides the most consistent frame times compared to other software that limits frame rate. There is a really good YouTube channel called Battle(non)sense that performed various tests on this and found that using RivaTuner with the framerate capped just below the monitor's refresh rate provides the best frame times while still keeping latency low, compared to all other frame limiters. Unfortunately, the person who ran that channel doesn't upload anymore, but his channel has really good info and tests.
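The rule of thumb the two comments above describe (cap a few fps below the panel's refresh rate so G-SYNC/FreeSync stays engaged) is simple arithmetic; here's a minimal sketch, assuming the commonly cited ~3 fps margin (the exact margin is a tuning choice, not a spec):

```python
def vrr_frame_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Frame-rate cap just below the refresh rate so VRR stays active."""
    return refresh_hz - margin_fps

def frame_time_ms(fps: int) -> float:
    """Frame-time budget in milliseconds for a given fps cap."""
    return 1000.0 / fps

# e.g. a 165 Hz panel -> cap at 162 fps, roughly a 6.17 ms frame-time budget
```

You'd then enter the resulting number into RivaTuner's framerate limit (or the NVIDIA Control Panel's Max Frame Rate setting).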
Your reviews, comparisons and all the content you post is always super detailed and exactly what I’m looking for when I’m needing info. Keep up the great work
I have a 3080 10gb I'd be stupid if I bought a 40 series. I have plenty of power until the 50 series
@Salt Maker bless you
I recently upgraded my GTX 1080 to an RTX 3070, which I got for $360 on ebay, and I am honestly blown away by the performance difference. Granted I haven't played any of the latest AAA games but of the games I have tried so far, my 3070 has been able to do 4K @ 100+ fps without any trouble. I was honestly expecting to be limited to 1440p so I was quite surprised that I was able to do 2160p quite easily. I'm guessing all these websites are using Ultra presets because it seems if you are willing to drop the settings a bit you absolutely do NOT need a 3080 to do 4K/60-100.
Thank you for your comment. I am planning to upgrade my 1050 Ti soon, and my final choices are the 3070 or the 3080 Ti, with the goal of playing Cyberpunk and Warhammer 3 at a smooth 1440p. Can I ask if you've tested your rig with either of these games yet?
@@kevinng0703 I’ve had setups similar to both your upgrading paths. As long as you pair the cards with a reasonable cpu (mine was 5800x, but with a 3070 you can get away with like a Ryzen 3600) the 3070 can handle 1440p no problem w/o ray tracing. To turn RT on and keep 60+ at 1440p, I’d go more towards the 3080 ti upgrade path. I use the Digital Foundry optimized settings for Cyberpunk which is about high settings with ultra textures and I usually play at 4096x2160 60 fps with ray tracing off on my 3080 12 gb (it’s basically a cheaper 3080 ti/3090 and is my recommendation to get the 3080 12 gb over the Ti or 3090 cuz you get 3090 performance for 3080 10 gb money). I hope this helps with deciding your upgrade path.
Yeah you do. You just said you didn’t play AAA titles. I have a 3070 and 4k at 100 FPS is for old titles. Gotta play with the settings and use DLSS for 60 fps on most AAA titles at 4k - sometimes for 1440p
Nice, I Upgraded from a FX8320/R9 390 to a 5800x/3070Ti.
The perf difference wasn't that big, to be honest. Sure, with DLSS, but it's not like you upgraded to get twice the fps; you didn't go from a 1060 or equivalent to a 3070. A 3080 12GB/Ti/3090 is what I would call a worthy upgrade.
I'm not understanding this frame data at all. Just last year, the 3080 was over 60fps at 4K in most videos. Now, because the 4090 released, in the 4K vids the 3080 hardly gets over 40fps? Like, huhhh???
@pkadair Lol, all these "updates" are a real conspiracy.
The jump from the RTX 3080 to the 4080 was smaller than the jump from the 4080 to the 4090; it was like 45% at 4K. That's why I decided on the 4090.
You can cap your refresh rate in Control Panel, before using frame insertion.
Something that isn’t talked about too much is 16:9 vs 21:9 vs 32:9. If you are running a ultra wide or super ultra wide a 90 series might be worth it as Cyberpunk can run up against that 12gb vram limit on those type of monitors
You're absolutely correct. There was a massive difference in Cyberpunk and even more noticeably in Dying Light 2. The 3080 just couldn't run RT very well on my ultrawide when I was using the 3080 vs. now it runs like a dream.
@Husha563 get the 4090. AMD just doesn't have the same quality in their drivers nor the level of technology in their cards that Nvidia does generation after generation.
Great job as usual Daniel. I tested my 4090 vs my former 3090 Ti and was pretty blown away by the increase in performance for games. I tested it in Deepfacelab and gained around 25% speed on my model training times, and it also increased my frame rate in Deepface Live by around 30%.
Do you think upgrading from a 3080 10GB to a 3090 Ti is worth it, or should I just wait for the AMD and 4080 benchmarks?
@@EvLTimE If you need the VRAM and can find a good price on a 3090/3090 Ti, then sure.. But for just gaming, I don't see a reason for someone with a 3080 to upgrade to a 3090 Ti. I would wait and see the new AMD offerings and the 4080/other mid-range cards from Nvidia.
I use the VRAM for Machine Learning tasks but for gaming, 24GB is still pretty massive overkill.
@@EvLTimE I went from a 3080 to a 3090 Ti. It's a nice upgrade for 1440p. I only have an i9-10900K and it bottlenecks the 3090 Ti sometimes in games. So if you don't have a 12900K or better and don't play at 4K, the 4090 is a waste.
@@xTurtleOW I have a 12700K and play mostly FPS games at 1440p, and was thinking of grabbing a 3090 Ti and waiting for the 5000 series. Not a fan of DLSS 3 frames.
@@EvLTimE I would say it's more worth it to get the 3090 Ti. The i9-12900K bottlenecks the 4090 in so many games at 1440p. The 4090 is only worth it for 4K 144Hz imo, and then again you need a really expensive monitor for that too.
Can you cap it with Rivatuner at 120hz?
Hello Daniel, there is something I don't understand when you say that the video card is limited by the CPU. For example at 14:25, your CPU usage is between 22-35%, so it's not maxed out, then how can it be the one slowing down or I should say keeping you from getting more FPS? Thanks in advance :)
Games don't use all cores; one or two maxed-out threads can bottleneck you even while total CPU usage looks low.
@@Limptek Thanks Manfred
I have an R9 3900X and a 3080 10GB that I was able to get relatively early on for ~$850. I think I would like to upgrade to a R7 7700X and 4090 for a 1440p 240Hz monitor, maybe within the next year...but I'll keep an eye on where things go for now
Upgrade only the monitor and save your money for now, because there aren't games that yet really need a 4090
@@watchlover7750 he wants to play at 1440p at 240hz so he prob needs a 4090.. 3080 doesnt give you 240fps ^^
Been running the old, beat-up 1080 Ti Duke and it did a great job for a lot of years. When debating going 3080 Ti or 4090, this review helped me decide on the 3080 Ti for cost-benefit returns. I thank you.
Or you could go AMD, as long as you don't care about hardware ray tracing.
Yup skipping this gen as well got my 3090Ti TUF and works perfectly for 3d renders. Sure 4090 is "nice to have" but for that price... Not to mention possible upswing in power... No thanks.
I've heard that by enabling the frame cap in the nvidia control panel to 1-2 FPS under half your refresh rate, it will still get doubled by DLSS 3 and the input lag will be comparable to unlocked framerate.
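The trick described above (cap 1-2 fps under half your refresh rate so DLSS 3 frame generation doubles it back up to just under the refresh rate) is also just arithmetic; a minimal sketch, assuming frame generation roughly doubles output and using a 2 fps headroom (both are approximations, not guarantees):

```python
def fg_base_cap(refresh_hz: int, headroom_fps: int = 2) -> int:
    """Base (pre-frame-generation) cap: half the refresh rate minus a small headroom."""
    return refresh_hz // 2 - headroom_fps

def fg_output_estimate(base_fps: int) -> int:
    """Rough output fps if frame generation inserts one frame per rendered frame."""
    return base_fps * 2

# e.g. a 120 Hz display -> cap the base framerate at 58 fps, ~116 fps after generation
```

Since the generated output stays under the refresh rate, tearing is avoided without VSync, which is why the input lag reportedly stays close to an unlocked framerate.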
Interesting, I'll try that if I ever get a high refresh rate monitor lol
what is that overlay software you use?
Great premise, comparing the 3080 to the 4090. I'm planning on ordering a 3080 gaming PC; I also felt the 3090 wasn't enough of a difference over the 3080 for the money. As for getting the 4090, I'll probably wait till Intel 14th gen comes out in 1-2 years, by which time the platform and 4K gaming will be much more common.
Just FYI, you can cap pretty much 99% of games using the Nvidia control panel or MSI Afterburner with RivaTuner. You said you hate it when you can't cap it. Hope that helps. Keep up the great reviews!
Doesn't work with frame generation tho, even from the control panel
FINALLY! Thank you for this video / comparison! Everything I wanted to see. I'm on a 4K OLED TV, so I've been waiting for something that can singlehandedly handle 120FPS in games (preferably withOUT Frame Generation). Definitely planning to use DLSS, just maybe not FG in something like Call of Duty or Spider-Man. It's just the MSRP... yeesh.
Excellent coverage, keep doing your thing. Subbed. (Your timestamp game is godly btw)
Same situation here, this was really helpful!
wait for AMD
Yep just upgraded from 3080 to 4090 with an LG oled and wanted 4k 120hz at very high settings as the 3080 needs lots of help to get there . Happy with the results . Great video.
I think we need more tests of DLSS 3 artifacts.
This dude is cappin, and he is cappin big. There is no way an RTX 3080 could run above 35 fps even with DLSS Ultra Performance lol. He is just comparing the RTX 4090 with an RTX 3090 Ti. Tip of advice, Daniel Owen: AMD has lost the war against Intel, let alone Nvidia, so don't get your hopes up. Instead of making videos in your mom's basement, go get a job on Monday so you can build your dream PC with an RTX 4090 🤣🤣🤣🤣
Would a 5800x3d be a good pair with the 4090? I really didn’t want to get a next gen cpu but now I’m feeling like maybe I should. I currently have a 5800x.
Just upgraded from the 3700x to the 5800x3d with a 4090 it's a awesome combination, I was coming from a 2080 super 👌
yes, the 5800x3d is literally the best cpu for the price to performance. (well before the 13600k) Just get a dual rank ddr4 kit (2x16) that is 3600-3800mhz c16
Out of curiosity, are you setting Low Latency Mode to Ultra? That might help with the cpu bottleneck a bit.
just joined the 3080 12gb party guys. we made it and whats even a 4090??
The 4090 is a GPU that absolutely trashes the 3080/3080 Ti/3090/3090 Ti in terms of raw performance; that's what it is. It's a halo card which costs a fortune.
Awesome benchmark comparison for these two cards here.
I see one small problem with the high fps out of a 4090 at 4K: where are the 120 Hz 4K monitors? 🤣
@@raven4k998 the samsung Odyssey neo G7 is 120Hz. I have the Neo G8 which is 240Hz
@@raven4k998 they r available u fool
@@zeus4634 You sound extremely salty lol. People who "make videos in their moms basement" have the potential to end up earning more money from monetization on these platforms compared to working a 9-5 all your life, unless you become a Lawyer, Doctor, Dentist or psychiatrist, or software engineer and even in some cases, those basement dwellers earn more than them as well.
I'm almost 40, and what I've noticed is that gaming became an activity for snobs. Most users can't tell the difference between 4K and 1080p, and the small details you are showing here don't matter if the game is good. There is no reasonable cause to spend so much money on a GPU.
I hadn’t checked on your channel in a couple months surprised to see you up to almost 80k congrats.
can you feel the input lag when turning on DLSS and Frame Generation?
From what I can see, Nvidia did a great job prioritizing what needs the biggest performance increase. Rasterization performance is more than enough, so they focused on RT core performance, which is where it's really needed. The cards are also much more efficient: in his apples-to-apples comparison, the 4090 performed almost double the 3080 12GB while using the same amount of power.
4090 is using less power when it’s CPU limited
@@teddyholiday8038 but the performance is still way better at the same power level. More efficient. If limited to the power level of the 30 series it would still perform much better.
Thank you for this! Exactly what I've been looking for (have watched a lot of reviews/coverage and this is by far the most comprehensive, and clear comparison!).
My situation is pretty much exactly as you laid out.
I purchased a 3080 at the end of 2020 (but was lucky enough to get it directly from newegg). And have recently upgraded my TV to the LG C2 Evo. And was thinking of getting the 4090 for exactly that, outputting 4k@120hz.
Sounds like it's the move I'm going to make! Now I just have to patiently wait through stock notices... and manage to grab one at MSRP...
I have the LG C9--yes, from 2019. Same exact boat as you with the 3080, and it's wild that my chase to finally be able to fully utilize this TV in every game seems finally over. That is, once I get my annual bonus haha. Maybe they'll be in stock by then
It's gonna be interesting to see how many of these issues they can fix. Think back on how bad DLSS 1.0 was, a blurry and glitchy mess.
If they fix it so it looks just as good as regular DLSS 2.0 then frame generation is a game changer! Although they have to fix the input lag.
They can fix the glitches but they can't fix the input lag. It's just a fundamental part of the feature.
Input lag is what makes this damn lame DLSS technology useless, and this needs to be stopped; improving it is just a waste of time. Same as G-SYNC and AMD's version, it's just a layer of input lag... go back to analog :D
In the Spider-Man test you said it's CPU limited? It's only around 40% CPU usage; can you explain?
Are graphics cards soon gonna become a separate box?
Actually upgraded to the 3080 about a week ago from a 1080 Ti. Personally I think it's a great card that will last years! Not everything needs FPS FPS FPS; enjoy the damn game! I think most gamers have lost the point of gaming and now only care about who has the best hardware etc. Thank you for this video!!!!
I completely agree with you. I just upgraded from an RTX 2080 to an RTX 3080 Ti, and from what I've seen in benchmarks these higher-end 3000 series cards absolutely slaughter the next-gen consoles and will be monsters for years to come.
Preach brother, preach!!!
Good video. I went from an RTX 3080 10GB to a 3090 24GB. Where the RTX 3080 10GB runs into trouble is when it runs out of VRAM, which can happen in any game with user-created mods or levels. Actually, game designers are pretty good at keeping levels playable. I noticed this with Far Cry 5's arcade mode at 4K with settings turned up. Far Cry 5 is an old game now, but there were arcade user-created levels where the fps would just tank at 4K with settings cranked up, because there was too much going on screen and it used more than 10GB of VRAM. The 3090 NEVER has that problem.
When I heard TechDeals mention he saw vram usage go above 12gb on his 3090, that was an incentive to go with 3090 over 3080, (especially with the price being within $150 of each other when I bought it recently).
I rarely have time to game these days, but when I do, I want to enjoy it as much as possible. I have a big backlog, so I won't need to upgrade for years. I'd still like to know I can play whatever comes along in the future.
@@goldenheart Oh, that was allocation, not usage.
Love this 3080 vs upmarked 4080 comparison, thanks for the thoughtful videos as always.
This is a 4090 not a 4080. The 4080 is not out yet
@@Phar2Rekliss Nvidia upmarked both the 4080 12GB and 16GB models from 4070 variants. The 4090 was very likely meant to be in the 4080/Ti tier but was moved up as well, with the 4070s moved to the 4080 tier to make this shift look better.
@Boofy Goofy Something literally everyone that hasn't been huffing paint for months knows.
Is NVIDIA slowly using its driver updates to make older GPUs slower on purpose?
Is it possible to get past the tearing when frame generation is on by leaving VSync off and power-limiting the GPU so the framerate never goes over 120 fps? Or is the variance too high?
I had a 3080 10GB and switched to the 4090. The performance increase in RDR2 was so high I could not believe my eyes. I maxed it out out of curiosity, and even at native the margin is enormous. 6K maxed was smooth; 8K maxed was not, but still 35-45 FPS.
Got my 3080ti for $800 and I'm so happy with how games play and look. Upgrading from a 2070 looks like a huge improvement.
New in box or used/refurbished? If new, where?
If you thinks that’s a huge improvement imagine if you had a 4090 coming from a 2070 😅
Just went from a 3080 Extreme Waterforce edition to a 4090. If I was just playing flat-screen games, I think the 3080 was great... trying to run games in VR on a Pimax 8KX was a different story. The 4090 has been a complete game changer for me, and with frame generation I'm running Spider-Man on my OLED with ray tracing maxed, native 4K, at above 120fps. For me it's absolutely been worth it.
How much did you pay for the 4090?
This is my main reason for wanting to upgrade. My 3080 is fine for pancake but I also do a lot of intense VR titles on a Reverb G2. I’ve been struggling with DCS because my frame rates are right at the cusp between being ok and being uncomfortable. I think the 4090 would make a big difference for me.
@@cesarpdc got the gigabyte OC edition. Basically 1900 after tax
Same here went from a 3080 10gb to a Msi gaming X trio 4090. Crazy improvements on my OLED 1440p 240hz 😊
The 3080 10GB was and still is a GREAT card for 1440p, and it destroys at 1080p. It's reasonably good at 4K as well. MAN, I LOVED THAT CARD. Unfortunately I sold it. GOOD TIMES ❤
Would love to see a video like this on 3440x1440 with the i7 13700k & the i9 13900k
Why doesn't your CP2077 have frame generation? I saw Jackfrags or someone similar have it in their game somehow
It’s not out yet publicly. They’re using it on a new patch that’s not out for us plebs lmao
I currently have an RTX 3090 which I bought way below its MSRP and even cheaper than some of the RTX 3080 Ti cards during a sale from one of the online shopping apps in Asia. I have to say, spending almost twice the amount (or even more for some brands) for an RTX 4090 seemed impractical from my point of view as the eye-candy deal of the RTX 3090 was a legit steal in terms of price-to-performance value.
I'm happy with what I got as I know it will last me for years with my type of usage. 😄
What are these online shopping apps?
What was the price, under $800?
3080 all the way!
Very interesting video. I have been looking at used 3090s on eBay for about $700 now. haven’t looked into 3080s much. My cpu is old 4770k at 4.4ghz but I have been gaming in 4k for about 10 years on it with a 980ti hydro. I only really play Arma3, Warthunder, DCS and Squad right now and I can play 4k in all those 30-60fps with G-sync for last 10 years. I will upgrade CPU next year I think but with price drops on 30 series they seem to be good bang for buck.
Just go and buy one. I switched from a 970 to an RX 6800 XT, and I also play competitive games (War Thunder, PUBG, Escape from Tarkov). It's a significant and very noticeable performance improvement, especially in the games you mentioned; they like high frame rates to reduce input lag. You'll never regret this decision. I'm playing at 4K resolution.
Used 3080's are selling for $500 right now. And the 3070's are very close to $350. It's honestly the best time to pick up whatever used 30's series you can afford
get a new cpu, or else you will have a big bottleneck even at 4k.
The CPU bottleneck might go away with DirectStorage v1.1, because asset decompression can bypass the CPU. Worth doing this comparison again after it's released to measure the impact.
At 4:27, am I right to see that with fast movement, DLSS alone has more fps than Frame Generation?
Is DLSS 3 for the 30 series also, or only the 40 series? I know frame generation is 40-only, but what about DLSS 3? Do we get that on older RTX cards like the 3080 Ti?
DLSS 3 is basically DLSS 2 + frame generation. So only for the 4000 series
One thing I am sad about is that I bought the 3080 Ti right as the prices were crashing, so it was still like 10% above MSRP, down from when it was like $1600-1700. Something in me was telling me to just wait, since I already had a 3080 laptop that was meeting my gaming needs while I waited for a GPU to complete my PC setup.
It’s all good man, as pc gamers I feel we all go through something similar at one point or another. End up spending money on something we don’t necessarily need at the moment. As long as you’re enjoying it 💯
Im considering upgrading my 3080 because its starting to show age in 4k AAA titles, it has served me well, and i have thoroughly enjoyed it. Personally, its not within my budget to buy an rtx 4090 just for the spec increase alone, but I fully intend to buy a 5080 or 5090 when they release, because i believe they will be worth the ridiculous price. I am content with a 3000 series card and i believe if you haven't upgraded yet, then don't! Wait until the new gen comes out, and then decide
I don't see the visual loss in the performace/balanced that people keep talking about. Is it just me? Is it hard to see on youtube? Do you have to know what you are looking for?
Love your channel; a very simple, no-fluff presentation that still covers a ton of details.