I love how they improve lightning, reflections and shadows that doesn't change much from high to very high and so on while costing us horrendous performance loss
The level of fidelity in these games doesn't justify the hardware requirement. If we'll need a 500$ GPU to run UE5 games on 1080p Ultra 60FPS, ngl it's less than ideal.
THANK YOU somebody else has eyes and Independent thought, people keep trying to tell me "these are pushing the Graphics so hard and that is why they run bad" but I am sorry the JUST AREN'T especially Jusant there is NO WAY a game that looks like that should run SOOOO BAD
I thought it was already an established fact that it's more about the simplification of development process rather than improvement to gaming experience, much like Ray Tracing.
@@juanblanco7898 literally this, Nvidia dropped the ball with the value/performance and tried to sell upscaling tech in short of being able to handle these new features on lower end with their marginal improvements. They want $300+ for a GPU that barely plays the new standard which was not the case with their last 3 generations. At the very least you can get the equivalent AMD product for noticeably less and even then youre still not getting a great buy this gen.
When are people gonna realise that resolution doesn't matter that much. When you were a kid, did you watch TV at 480i and go "man this looks so unrealistic"? A Gen 8 game at 720p looks better than a gen 6 game at 4K.
Don't get me wrong, UE5 makes for some visually appealing games, but I don't think it's enough to justify the massive increase in hardware requirements. It's basically an increase in 100% raw hardware power for barely 30% better looking games.
That was always the case, and the same argument was made when UE4 came out. I've watched this song and dance for nearly three decades and it never changes...
We reached the point of diminishing returns a long time ago imo. Games from 2016 like bf1, deus ex, dishonored 2 and mirrors edge catalyst still look fine while being like 2-3 times less demanding
Yes, you are correct, the scenes at the start of Talos Principle 2 are not very demanding. At the very least, you must reach New Jerusalem (which you do after you complete the first puzzles) or preferably get to the North puzzle areas (flooded valley in particular). Those ones contain vegetation, foliage and water which are extremely demanding on the GPU and upscalers as well have a very tough time rendering them correctly (DLSS included apparently).
In reality this 4060 feels like where a 4050 should be, given the die size, bus width and VRAM... so nobody can really get their hopes too high that it will be very future proof.
@@miguelmunozbustos5319 yeah, true. And then people are surprised when they see Nvidia getting an increase in Q3 profits, despite supposedly the bad sales from 40 series.. this is the biggest reason: giving you less for more money, while demanding a bigger margin cut to AIB partners 🤣🤣 I'm so happy I could snag a nice used RTX 3090 last October after cripto went bust for a nice price.
For the RX 7600 (which is a 6650XT on 6nm instead of 7nm) going by the presets completely kill it. You need to manually mix and match settings (basically remove all features that use RT acceleration) to get a good experience.
@andersjjensen manual mix and matching is how these videos should be done. That is the old golden age of PCMR testing way. Even on my 4090 I do that. 6650 XT is not the same as 7600. They are near equal but not the same
@@CharcharoExplorer Both have 32CUs/2048 shader cores, 128 TMUs, 64 ROPs and 32 RT cores on a 128 bit bus. The RX 7600 uses a whopping 11W less power for what Techpowerup deans an earth-shattering 1% better performance. So calling it "a 6nm 6650XT" is, as a layman's term, neither far fetched nor insulting to any of those two. The only thing it offers is AV1 hardware encoding as RDNA2 already had AV1 hardware decoding. It just goes to show that RNDA3 "bombed" on AMD. For some reason it just doesn't clock well. The 7600 has a game clock of 2250MHz and the 6650XT has a game clock of 2410MHz which is a difference of 7% while the 6650XT only uses 6% more power. So a near linear relationship between clock speed and power consumption. For a half-node step from N7 to N6 that is actually rather ridiculous, and shows that RNDA3 only saw a 7% IPC increase.
Daniel, lately I've been exploring the current trend toward ultra-realistic graphics that require hyper-expensive GPUs and various frame generation and up-scaling tricks to be playable. Many "influencers" are pushing back on games that have so-called great graphics but are boring and unimaginative in their art style and content. They're saying that many of these modern, very demanding games are subpar in their ability to present an immersive world and tell a great story because their publishers are too focused on presenting technically advanced graphics. I'm curious to hear your take on this.
It really feels like most recent higher-budget games are just tech demos. It's just all buzzwords, marketing and microtransactions. No attention is payed to the actual enjoyment of playing them. It's just cut and paste + new useless tech.
@@Zyxlian Yep. Gotta run frame gen and upscaling to get decent FPS with all the supposed great graphics turned on. It's like the runners and swimmers are setting new records, but the stopwatch is calibrated 10% slow.
nothing replaces raw power. remember 2 generations ago all influencers went "the 2080ti is only for 4k gaming, nobody should buy it for 1080p"...yea and now cards with the same power can''t even keep 1080p/60 on high stable lol. lesson: ALWAYS buy more power if you can, no matter the resolution. worst case your card will run cooler/more quiet if you framelimit it.
Which is why I'm running an XTX at 1440p Native. So far I can run everything at stupid good settings without upscaling, and I expect that to be the case until the next gen consoles have been out for a while, and the first this-gen-only console games start hitting the streets.
Pretty sure Remnant 2 did exactly what you suggest, it ran very slow on ultra originally because of the shadow maps setting, so their solution was just turn that off even on the ultra preset, so people would stop complaining about the game's performance
Kinda funny how you didn't run a test with 1440p DLSSQ at high settings. There is nothing objectively wrong with how you organized this video, but you literally neglected the one setting configuration that most people in this price tier should actually use. It is the sweet spot for performance/visual fidelity where 1080p native ultra is sort of just... why?
nvidia saw people gaming on 1030 2gbs during the shortage and said " the 30 series is the new budget gaming series, so 30 series is now 50 series" thats what this generation's fuckery has been about lmao
Better to just skip 4060 entirely, and wait till the next gen, newer card shouldn't be needing a crutch to run high 60 natively, or if you don't wanna wait jump to 4070 tier instead.
I think you're giving Nvidia too much credit. Nothing is actually preventing them from releasing another 5060 8GB/128bits that's barely 10% better than a 4060 without some crazy AI shenigans - forcing people to "wait" for the next generation forever.
@@masteroak9724Heavily doubt they will do their Bs again. Next generation is gonna be a huge jump in VRAM for their GPU’s. 5090 32gbs, 5080 20gbs, 5070 16gbs and 5060 12gbs or 16gbs.
I'm so hyped to get back to these GPU comparison videos when big companies got sucked dry by Nvidia with the AI chips. This bubble will explode soon enough for this market to be interesting again. A few extra TFLOPS won't matter much if you have to pay another 100K USD to get it. We just have to stay patient. My condolences on the other hand for the consistent content creators like Daniel for having to review all these boring stuff every week. Thanks for your service! This topic is so much easier on us, consumers. We can just divert our attention to other areas of life when we get bored of stagnation.
@@BlackJesus8463 Huge companies who are benefitting from faster computations and more VRAM for similar power usage. If it's worth you more money to do the same in less time why won't you buy them? Naturally it involves governments and military purposes, if you can deflect a missile in half the time it's probably worth the $. It's in line with the bans. I can just hope that most of these use cases are directed towards something that is actually beneficial for humans and not to enhance spyware and military stuff.
Good video. The rtx 4060 simply isn't worth the money. It won't have a long life for future games and it's not even great for current AAA games. At least the 3060 had 12gb vram. The minimum Nvidia card to get is the 4070 with 12gb vram. The 7800xt is similar in price SO that's the card to get. Such is the state of video cards these days.
@@JahonCross tbh i wouldnt, but it all depends on what games u aim, for current, i guess its okey, but never not really, mine brother have 6700xt, for what he plays its more than enough, but i played on that pc alan wake 2 and i had to use fsr 2 to have 60 fps at 1080p, buy 7700xt or used 6800 /xt
been getting happier about my 3060 every damn day since the gpu shortage ended. DLSS means that 3060 is still 1440p viable, high vram means high texture quality settings, looking good in general and negating some downsides of DLSS, and the 192bit bus seems like a luxury instead of a handicap now lmfao
When you push requirements too hard, then you'll soon find that people simply won't buy the game and the game will flop hard. Nobody want's a game running like shit on a $400+ card. All that results in, is people just passing on that $60 game, rather than forking out a $1000 for a brand new card. Which in turn if they stupidly did, would still play the game at an unsatifactory level for the price they just paid. Dev's need to learn to scale games so that they run at a reasonable level on the typical "run of the mill" hardware, that the majority of gamers have. I've got a feeling after seeing the slew of demanding games have come along this year and the backlash that gamers had over it, that this is all going to seriously backfire very badly for Dev's and likely very soon. That's unless they actually have taken note and start to dial hardware requirements back. UE5 and it's performance demands simply isn't helping the situation one single bit (neither is dev's chasing after RT). It's just pushing demands that are far too high onto gamers, before anyone is even close to having hardware that is both affordable and powerful enough, to be able to meet those high demands.
Upscalling, Raytracing, Lumen, Nanite, UE5 tools, etc. Are supposed to reduce development costs and time. Yet, we just keep getting unfinished, poorly optimized games that cost even more money than the previous generation ones.
Nice review on the 4060! Coming from a GTX 1050 Ti, I'm happy I finally able to run Alan Wake 2 on the lowest settings with DLSS with my 4060 from Gigabyte. The most shocking thing I've noticed is how the ingame graphic looks exactly the same whether I used ultra 4k or low settings at 1080p. I wonder if game developers rigged the graphic settings to make us spend money on upgrades? I'm looking at you Alan Wake 2 and Starfield.
I really dont feel like buying these games unless they look so good , i rather play old game which looks much better than recent titles and perform much better.
That's a pretty tough call, damn. I mean at first blush it looks like if you're 1080p gaming this card is okay. Not great but okay. And if you're a 1440p gamer, you'll be alright if you use DLSS. And then, there is a pretty vast gulf in price to the 4070 - at least in my market the 4070 is *twice* the price of the 4060. This all seems kind of in line with what a 60-series card is supposed to be: Enough, but nothing earth shattering. But theeeen I start to think: - If it's just okay today, how long until it's not okay anymore? Should I really buy a card today for 330€ that will be good enough for maybe a couple years? - It only has 8Gb of VRAM. I mean damn, my 1070 - a SEVEN and a half year old card - has the same, and games sure aren't the same today as they were in June 2016... UA-cam tests seem to be showing that 8Gb is just about enough today - but again, for how long? - Part of the selling points of the 4000-series cards against both the earlier 3000-series and especially AMD's 6000 and 7000-series is Ray Tracing. Looking at these results, could you really imagine that you would be using ray tracing on any new title coming out? Nope. You'd be either playing at 1080p high natively or 1440p with DLSS just to make 60fps, and I can bet any demanding RT would be out. - Then there's of course DLSS vs FSR. And yeah, we're all kind of agreed that DLSS is the superior technology. BUT, if you could comfortably get your games to 60fps natively (either 1080p or 1440p) for about the same cost, then is DLSS really so valuable after all? Meanwhile, FSR quality certainly isn't SO bad that you couldn't help your AMD card reach a bit higher. Especially if the games get even more demanding, and you would realistically be choosing between FSR quality and DLSS balanced. But then also eventually you end up with a bum deal: Would you really want to pay MORE than the cost of a new 4060 for a used 3070 or 2080 (/Ti)? Or would you really want to pay twice the price of a 4060 for a 4070? Or would you pay very slightly less than a 4060 for a last gen 3060, if you can get one? Or would you really want to pay 1,5x the price of the 4060 for a 7700XT, which would get you better native performance? Or would you really want to pay a tiny bit less than the 4060 for a 7600, which is pretty much the same if not even a little slower? Conclusion: Market's borked.
You only need high textures and everything else can be turned down. I mean are raytraced shadows, puddles and window reflections that good? Like Nvidia Tax good? Hell no.
@@BlackJesus8463 Perhaps. Then again, it's precisely the high textures that are most in want of VRAM. On the other hand, just having plenty of VRAM won't help forever. Like, my GTX 1070 has 8Gb VRAM as I said, but it sure as heck can't run these games well :D You can only turn down settings so far - at some point the limit comes up. In the 4060s case, if it's already just "okay" at 1080p high or ultra, as in this video, how long until you'd have to go down to medium, low, or for even that not to be enough? In fairly many games, the FPS gain from turning settings down isn't as much as I'd hope.
@@BlackJesus8463 Yeah. It's damn hard. I could easily see a case for myself to buy any of the aforementioned, if I was without a GPU right now: - I could buy the 4060. It's the most affordable brand new card that could do the newest current games at 1080p native or 1440p with DLSS. - I could buy the 4070. It's twice the price, but at least it would run the newest games comfortably at 1440p native and probably 4K with DLSS, and be somewhat future proof for maybe the next 5 years. It could also do some actual ray tracing, though not quite comfortably. - Could buy the 7800 XT for about the same as the 4070 (in my market), or maybe less if ordered from abroad, and get slightly better performance than the 4070, but worse RT and no DLSS. - Or could get the 7600 for a good offer somewhere to kind of scrape by for a while and upgrade later - Could risk it and get something used, save the change for something else. Like, all of these options sort of make sense. It's just that all of these options are just priced too high for comfort. Conclusion: Market's borked 🤣
@@AleksiJoensuu You just made the case for buying a 6700XT or 6750XT instead of the 4060. If you're upfront expecting to forego RT and screen space reflections entirely, while keeping textures at max, then the 12GB VRAM and 12-19% better raster simply offers more value.
Do remember with undervolting, debloated drivers, nvidia control panel optimization and optionally ingame settings adjustment your performance will improve greatly as all these test are with stock everything to simulate the average consumer experience. Nowdays you can have an ultra graphics game with up to 20% performance boost with little to no loss in visuals with changing graphics setting.
@@juanblanco7898 This video will start you off right, I saw a massive performance gain even to the point that I can play Tiny Tina Wonderlands at average of 80fps with 60fps lows with a gtx 1080 tweaked ultra settings 1440p no dlss/scaling. ua-cam.com/video/mc0xk06RRTE/v-deo.htmlsi=MzDVY17V_YABeGre
I feel some of these performance issues are partly related to the engine being so new and not very updated/experimented enough and/or that the games released have skimped on optimization for one reason or the other.
Unfortunately, the beginning of the game Talos Principle 2 is not representative. This is a virtual environment in the story of the game. For future tests, you'll have to solve a few puzzles until you wake up from standby mode. You will then find yourself in the city of New Jerusalem. I would still go to the first game mission (at the great pyramid). There you can test the performance well with a lot of vegetation.
I actually like to downscale using the [NVIDIA RTX] DLDSR function 1440p to 1080p, the detail goes way up. Would be nice if you could benchmark that also.
Apart from the 4090, I think this gen of GPUs has been a huge disappointment for the most part. But at the same time, Unreal 5 can be seriously demanding and most titles that used it are often not very optimised. In the case of LOTF on consoles,, in the 60fps mode, the native res can drop as low as 648p on XSX, and a blurry mess on the s machine. So it seems Unreal 5 is designed to be run with upscaling and FG tech due to its demanding nature.
Yeah it's kinda unusual that the high end option actually is the best offer in terms of what you get for your money. Even though it is expensive it doesn't feel overpriced, much less so than the 3090 did. Kinda weird.
The inability to hit native 60fps @ 1080p is not a good look for the price they are asking. I can understand needing dlss & fg @ 4k or even 1440 ultra, but this is just weak and/or disappointing.
1% lows are the only number that matters so If DLSS gives you identical 1% lows to native rendering that is a catastrophic failure and basically means the feature is worse than useless because its making your image lower quality with ZERO performance gain.
How about a ”reality check” when testing budget gpu:s? They probably won’t be paired with a 7800X3D, more likely with a 5600, 12400 or 7600. Maybe an idea for a video? Comparing a budget card in a top-tier testing system vs a budget build? How much perfomance will we lose? I think that would be enlighting. 😉
Currently playing at 1440p with an old 2080 and i noticed that Ultra is definitely not worth it quality wise. I'd love to see more "High" settings on budget cards since it seems more realistic. No one buys a cheaper alternative gpu to run games on ultra. Otherwise great work as usual!
It is more of an issue of the card being functionally useless for future games. The 1060 6gb was able to get avg. 70+ fps at 1080p in the Witcher 3 at max settings (barring Hairworks) when it came out. If the current 60-series cards are getting mid-30's fps with max settings in current games, then it means that they are functionally half as good for it's time. The next UE update will make the 4060 a sub-30 fps 1080p card for those games. It just feels bad.
Same boat, playing at 60+ fps 1440p mostly high and some ultra settings with dlss quality on lords of the fallen (rtx 2080) so really good performance for how the game looks, the biggest performance impact is ultra global illumination which has no real visual upgrade over high. New UE5 games need some options tweaking to run good and look virtually the same as ultra settings
UE5 with nanite and/or lumen, does significantly scale with pixel count. Lumen is ray tracing, a simplified shortcut software based ray trace, but it is ray tracing global illumination. More pixels equal more rays cast. More rays cast is more compute and thus lower performance. Nanite is a special way of handling geometry polygons. The thing it cares most about is pixel count. It handles very high poly counts easily, no need for LOD streaming, but nanite does have a high base overhead just to exist. It is almost as if nanite looks at a pixel and figures out what geometry should exist there, versus taking geometry and figuring out what pixels the geometry contains. One thing said about these things is that throwing "more" at them does not incur the performance penalties one might think via more traditional techniques. With lumen, beyond the pixel count, the highest global illumination setting will typically incur a good performance penalty. More ray bounces and maybe more rays. So, more compute but should be more accurate.
Can barely tell any difference between the visual quality of ultra and high here without pixel peeping, ultra doesn't really seem worth the performance hit.
@@ZackSNetwork DLSS Q in 1440P is rendering at around 900P, which makes it runs better and looks better than native 1080p at the same time. If you can accept more visual quaility hit, you can use DLSS B in 1440p and it will definitely looks way better then DLSS Q in 1080p.
Nanite manages geometric density based on the amount of pixels you are rendering, so it makes a lot of sense to see such massive performance boosts from DLSS.
Oh that explains a lot, thanks! Another thing is, somehow UE5 games look good even in "dlss performance" mode, sometimes it's even hard to tell the difference from native especially if you add some sharpening game filter it might even look bit better... that's on 1440p tho, not sure about 1080p.
One thing I am wondering about UE5, does nanite scale the number of triangels rendered depending on resolution? because, I mean, it would make a whole lot of sense for the engine to render FAR less triangels on a Steam Deck (800p) compared to a 4K monitor since you would literally be throwing away calculations at things you cannot see. And if this is the case, what happens with the amounts of triangles when utilizing DLSS/FSR? does the number of triangels rendered through nanite depend on the base resolution or the upscaled resolution?
@@israelRaizer Yes, this is exactly why UE5 + nanite scales like crap when you turn up the real resolution (not the output resolution). It's adding exponentially more pixels to fill AND more geometry to draw at the same time.
Just buy an xtx and be happy honestly. I have a 3080ti and either I'll buy an xtx soon, or I'll wait for next gen to see how things are. Short of the 4080 and 90 the Nvidia cards are really underwhelming this generation, atleast for the price you pay for them. I can get an xtx for 1100 locally, or j could buy a 4080 for 1500 or a 4090 for like 2k.
Yes) I play Lords of the Fallen in QHD without dlss and frame generation. hundred plus fps. I watched the video and I honestly don’t understand why there is such a huge difference between 4060 and 4070))
I'd rather upscale 1080p than buy an Nvidia gpu in this market. The difference in silcon cost between performance tiers is very small compared to the cost of the card. You should be waiting for the refresh before you buy anyway.
One could can turn the question around: is 'Unreal Engine 5' optimization ready for "mid range cards", like the 4060, to run natively at 60fps@1080p? With the same level of graphics fidelity those cards could run an EU4 title at 60fps@1080p. That should be the question in my honest opinion. Like DefactoOverlord rightfully mentions in his comments here, which I like to echo: "The level of fidelity in these games doesn't justify the hardware requirement."
To add. If graphics engines/game developers/publisher continue the push their focus on ever increasing graphical fidelity, at the expanse of it not scaling the the required gpu power to drive it. In time this will back fire. Unless GPU prices come down significantly, which they won't. Thus a whole GPU product group, the mid range and thus their consumers will be sidelined. And who likes to get sidelined...
We are getting to a point in graphics development where the improvements that are being made visually seem to be only slight and can at times be barely even noticeable to some people. However, those slight increases in fidelity seem to now be unjustifiable, when you take into account the huge cost to performance that inevertably also comes along with them. Yep, upscaling tech helps out a touch (most of us would now be screwed without using it), however it's still just a quick fix and less than ideal solution to an increasing issue. Maybe it's just time for dev's to lay off pushing the graphical bandwagon so hard for a while, until GPU performance has had time to catch up. Hell, instead they could focus on improving over a dozen other core mechanics in games like storylines, accurate physics, accurate character models ect, that are in dire need of improvement. Plus, those improvements will likely be noticed far more by gamers. BTW, some of them have needed a update for years now, but they always seem to be sidelined, simply because Dev's prefer to push out flashy graphics to the masses.
just got my Powercolor Fighter RX 7600 and loving it watching this helps me alot since all in all its only 21% give or take less then the 4060 but I was able to snag the card for 200$ with a deal coupon on new egg so its great :D
Hey Daniel, don't you think it's time to abandon 1080p at all? 1440p DLSSQ is +/- the same performance as 1080p native, and 1440p DLSSQ looks good, and 1080p with any upscaling looks bad, so you'd really like to stick to native in this resolution. Combining it together with better and bigger screen (27" 1440p instead of 23" 1080p) IMO does the job pretty good. I'd really not buy an 1080p display anymore.
Remember when 060 cards could play AAA games at 1080p 60fps without DLSS of FG? Those where the good times! Guess I will be playing indies from now on, because I refuse to pay 630/690 euros (4070) to play 1080p 60fps AAA games.
I see two takeaways from this video: 1. UE5 can actually scale performance with settings, unlike we saw in some of the first games that came out using that engine. 2. 4060 is seriously underpowered, but we knew that already.
1060: "The new 1080p resolution at 60fps? No problem!" 4060: "The old 1080p resolution at 60fps? Well, you could lower settings, or I guess you could upscale?" Why does it feel as though developers are getting sloppier with their optimizations, and hardware manufacturers aren't delivering the expected generational improvements?
The commenter above me is right. Significant performance leaps in hardware are getting more and more difficult to hit with current technologies. That’s partly why there’s such a focus on software tricks to make up for it. And yeah devs are making super unoptimized games, partly because of publishers and studios who rush game releases
@@noidsuper you are seriously blaming the technology for « lackluster » performance and still hating on this card after almost a year ? Its been 8 months already this card performs way better than a 3060 now, accept it.
People hate the 4060 for not being able to run games in photorealistic engine at 100fps. But the question is, why would 70, 80, 90 series even exist if 60 could run all of that?
In relation with Remnant 2, if you didnt know, they launched a patch recently that fixed major problems with different graphics quality and fsr/dlss performance. Sometimes you could barely see any changes in fps with different settings. If only they wouldnt have waited 5 months to release the patch smh
im kinda proud of my brand new 4060 actually lol. So much hate, I know. But hey, im the average player and consumer.And im sure there are alot like me. Got it with a new i7 11700f for a bargain of around 800euros, already tried star citizen to test it, with high settings, ran well enough. Im not picky or very demanding, so might just go medium and im sure i can get a nice experience at some graphics cost. Seems i will be able to enjoy some u5 titles as well, maybe at medium settings, but whatever, even in Low, this new gen games are quite impressive lol.
have u notice that nvidia have no plan to release 4050 ? since they use AD107 which is suppose to be 4050 no 4060 for more monies. we actually get an expensive 4050
I wish you'd tested High Settings (+DLSS Q) (++ DLSS FG) because that card is just not an ultra settings card, especially for people with high refresh rate monitors.
I’m personally not a fan of Frame Generation, however for some scenarios, they may be a benefit. Thinking back, I feel Frame Generation is basically SLI levels of boosted frame rates ….. around 60-70% more frames using 2x gpus over 1x GPU. Nvidia swapped its hardware SLI for software SLI and charging a premium without the buyer getting the benefit of physical hardware.
I swear I see blurry sht and sharpness gone on dlss enabled footage... I know you will likely not see them if you playing, but just watching the video and pausing, the difference is huge.
No matter how it's called, 1050ti or 4060... Nvidia's cards with a 128bit bus is ALWAYS brings only dropping from high to low/medium settings in around 2 years to have acceptable framerate in new games, and absolutely uncomfortable gameplay in around 4 years in a new games (latency, long lvl-loads, bad 0.1%/1%).
@@romankozlovskiy7899 why? Nvidia gives a lot of artifacts also with ray reconstruction and overdrive mode fsr gives better performance and no smugging
be intresting to see just how money it will now cost to get a GPU that can run this set of games at 60fps with high + setting and nice see price cost like for like test using AMD Radeon RX 6000 / 7000 Series GPUS
"The 1060 was great, so I stick to the 60-series, because they're budget-friendly powerhouses." - many long-time gamers who don't follow hardware reviews anymore.
all unreal engine 5 games i have seen so far do not justify the performance hit at all they do not look anywhere close to how good they'd have to look wondering what is in it for them reducing the amount of people who can buy your game at 40-60$ to people who have bought a 500$+ gpu sometimes in the last year
Probably not. The super series doesn’t target that lower end price bracket. The best you’ll see is lower 4070 ti and 4080 prices (MAYBE) to clear out old stock. 4070 likely won’t see huge changes since they’re still going to manufacture it. They’ll just price the 4070 super higher than the 4070. At best you’ll see the 4070 drop to Black Friday Prices and that’s a big if
The RTX 4060 and 4060 8gb are terrible GPU’s that should not have existed. To heavily cut down on Cuda cores, memory bandwidth and vram. The lowest end card should have 12gbs of vram and mid range GPU’s should have 16gbs of vram. The 4070ti Super will have 16gbs of vram so the 5070 should have 16gbs as well.
Running the 4060 at 1440p is an exercise in futility. I upgraded from an RX 580 at 1080p to a 6700XT at 1440p during the human malware snafu, but I already jumped to an XTX, but stayed at 1440p, because the hardware requirements for new games are sky rocketing.
@gustav0711 Of course, how could I have forgotten the most important detail. More money, less memory! The Nvidia way! ... or was it the Apple way? I tend to get large faceless corporations mixed up.
it's ridiculous how they keep pushing the limits of GPUs yet without any major or justified graphics improvement and add fuel to the fire with gimmicks like RT and mesh shaders
Are we looking at GPU limited results? Would someone that has the budget to buy an AM5 system with a Ryzen 7800X3D cpu go for an RTX 4060 GPU? Personally i think it's kind of a mismatch. How would the results be when a cheaper AM4 system would be used?
Depends on the resolution, it'd be fine at 1080p but FSR sucks at 1440p. Tend to think all mid range cards out atm won't do well when UR5 games start hitting proper. Hopefully AMD get FSR in a much better place, fingers crossed on that one.
Hardware Unboxed arent the smartest of people when it comes to tech. 1st the consoles are Powerful or weaker the 4060. So gpu power is no issue. 2nd UE5 uses Nanite (which is similar to mesh shaders). This means it wont use excessive vram. Even at 4k. So no. Its pretty capable of Ue5 with no gpu power or vram issues
@@R4K1B- Oh ya I'm sure you random UA-cam commenter are so much more knowledgeable than Steve or Tim. You also ignore that the consoles have much more VRAM and memory bandwidth than the RTX 4060. And a $300 GPU from 2023 should be considerably out performing $500 consoles from 2020.
1080P is sufficient for a monitor smaller than 24 inches, and 4060 is sufficient for 1080P and 60fps. If it's a small screen, some roughness can be tolerated. If you want more than that, just throw in as much money and electricity as you like.
When I first got my first 144hz monitor I decided to never play anything that does not reach that frame rate with decent settings. I just pretend these games do not exist, even on a 7900 XTX. No amount of fidelity will beat smoothness and responsiveness, specially in multiplayer games which are my jam
I don't see UE5 as the future, maybe for console and pure 3rd/1st person action AAA gamers, but not for most PC gamers, which includes many genres where gameplay matters. You can't make gameplay-oriented games on UE5 unless you want to create flops such as Lords of The Fallen lol. Who wants to play an RTS, beat em up, metroidvania, sports game, racing sim, hardcore stealth game, soulslike etc at a wonky 30-45fps? You can technically play generic 3rd person AAA action games with 30fps, this is what the PS3-4 did and it's happening to this console gen once again. It's not like mechanically shallow games such as Spiderman, Horizon, Cyberpunk, Starfield, God of War etc need fluid gameplay, the games are clunky even at 165fps so it doesn't matter how they run, people play these games for the graphics, story and presentation where they can turn off their brains. On the other hand, you can't even really release even shitty indy UE5 horror walking sims on PC either, because those games are played by kids with several gen old x50/x60 cards mostly, so there's no market for UE5 games at all, neither in the indy nor AA market. The gameplay-oriented games are going to be written in better engines than UE5, which at this point is literally any engine other than whatever monstrosity Bethesda uses. The best actually gamey AAA games from the last 5 years are Elden Ring and Hitman by far IMO, and neither IOI nor From Software use third-party engines because it's much easier to work with an engine taylored to your needs than using a 3rd party engine and trying to study it. Even the RT in Elden Ring and Hitman was actually developed over a long time hence why it wasn't available from the start.There are is no shortcut to developing a good game, and good game companies know that.
Great video. I think as they were building UE5 as an engine, they thought nVidia would have have made the 4060 at least faster than a 3060ti but it slower! People that normally upgrade to the next 60 series should not be upset at UE5 but upset at nVidia. Meanwhile the RX 6800 and the new 7800XT are working better in UE5 titles? Almost like AMD anticipated how to build for it. Seriously WTF nVidia?!
I did not mesured my FPS in Lords of the fallen. But game is super smooth on my RTX 3060 laptop on Ultra settings DLSS Quality. I dont know why hes geting so low framerate on 4060.
Get great deals and sell your old gear to fund your upgrade at Jawa! bit.ly/JawaOwenDec23
RTX 40 vs Ark Survival Ascended pls
@@РоманКузько-щ7ъ I usually focus on fully released games, not early access. But perhaps if I have time in the future
Still looking forward to seeing you cover the Nvidia + AMD cards together. :P
Also curious if it would work with a external GPU.
I love how they improve lightning, reflections and shadows that doesn't change much from high to very high and so on while costing us horrendous performance loss
Although nanite is amazing. No one remembers how soul crushing geometry and object pop in was. It used to break immerion so much!
It does change much, that's when you lose Lumen and soft RT they're doing.
The level of fidelity in these games doesn't justify the hardware requirement. If we'll need a 500$ GPU to run UE5 games on 1080p Ultra 60FPS, ngl it's less than ideal.
THANK YOU somebody else has eyes and Independent thought, people keep trying to tell me "these are pushing the Graphics so hard and that is why they run bad" but I am sorry the JUST AREN'T especially Jusant there is NO WAY a game that looks like that should run SOOOO BAD
@@ocularcavity8412 Jusant looks amazing though. It uses Lumen so it's kind of expected that RT is going to tank performance.
I thought it was already an established fact that it's more about the simplification of development process rather than improvement to gaming experience, much like Ray Tracing.
@@juanblanco7898 literally this, Nvidia dropped the ball with the value/performance and tried to sell upscaling tech in short of being able to handle these new features on lower end with their marginal improvements. They want $300+ for a GPU that barely plays the new standard which was not the case with their last 3 generations. At the very least you can get the equivalent AMD product for noticeably less and even then youre still not getting a great buy this gen.
When are people gonna realise that resolution doesn't matter that much. When you were a kid, did you watch TV at 480i and go "man this looks so unrealistic"? A Gen 8 game at 720p looks better than a gen 6 game at 4K.
Don't get me wrong, UE5 makes for some visually appealing games, but I don't think it's enough to justify the massive increase in hardware requirements. It's basically an increase in 100% raw hardware power for barely 30% better looking games.
ditto.
That was always the case, and the same argument was made when UE4 came out. I've watched this song and dance for nearly three decades and it never changes...
@@andersjjensenPepper Ridge farm remembers
@@gaijinkuri684 Hey! I'm not THAT old! :P
We reached the point of diminishing returns a long time ago imo. Games from 2016 like bf1, deus ex, dishonored 2 and mirrors edge catalyst still look fine while being like 2-3 times less demanding
Thank you for your service, Daniel. A lot of work. This gen in the lower classes is lackluster to remain polite.
Ultra settings are usually not worth the performance hit and if my goal was running games at ultra I would not buy a 60 class gpu.
So true
Yeah we all watched the video bruh. 1080p ultra is the new Crysis.
@@BlackJesus8463 I haven't watched it yet.
Especially any game that has an ultra shadow setting. Usualy from ultra to high you get a damn good fps imcrease and shadows look barely aby different
@@BlackJesus8463 cool story bruh
Yes, you are correct, the scenes at the start of Talos Principle 2 are not very demanding. At the very least, you must reach New Jerusalem (which you do after you complete the first puzzles) or preferably get to the North puzzle areas (flooded valley in particular). Those ones contain vegetation, foliage and water which are extremely demanding on the GPU and upscalers as well have a very tough time rendering them correctly (DLSS included apparently).
In reality this 4060 feels like where a 4050 should be, given the die size, bus width and VRAM... so nobody can really get their hopes too high that it will be very future proof.
Just swap the 5 with a 6 and rake in the fanboi money.
it is even worse than the 3060 ti
@@miguelmunozbustos5319 yeah, true. And then people are surprised when they see Nvidia getting an increase in Q3 profits, despite supposedly the bad sales from 40 series.. this is the biggest reason: giving you less for more money, while demanding a bigger margin cut to AIB partners 🤣🤣 I'm so happy I could snag a nice used RTX 3090 last October after cripto went bust for a nice price.
You have one I assume?
The 4060 literally uses the Ada version of the chip in the 1050ti.
AD107-400-A1 vs. GP107-400-A1.
By Pascal standards, it's the 4050ti.
Do the same for the RX 7600 please.
For the RX 7600 (which is a 6650XT on 6nm instead of 7nm) going by the presets completely kill it. You need to manually mix and match settings (basically remove all features that use RT acceleration) to get a good experience.
@andersjjensen manual mix and matching is how these videos should be done. That is the old golden age of PCMR testing way. Even on my 4090 I do that.
6650 XT is not the same as 7600. They are near equal but not the same
@@CharcharoExplorer Both have 32CUs/2048 shader cores, 128 TMUs, 64 ROPs and 32 RT cores on a 128 bit bus. The RX 7600 uses a whopping 11W less power for what Techpowerup deans an earth-shattering 1% better performance. So calling it "a 6nm 6650XT" is, as a layman's term, neither far fetched nor insulting to any of those two. The only thing it offers is AV1 hardware encoding as RDNA2 already had AV1 hardware decoding.
It just goes to show that RNDA3 "bombed" on AMD. For some reason it just doesn't clock well. The 7600 has a game clock of 2250MHz and the 6650XT has a game clock of 2410MHz which is a difference of 7% while the 6650XT only uses 6% more power. So a near linear relationship between clock speed and power consumption. For a half-node step from N7 to N6 that is actually rather ridiculous, and shows that RNDA3 only saw a 7% IPC increase.
"Introducing the RTX 4060 - yesterday's performance at tomorrow's prices!" 🙃
Daniel, lately I've been exploring the current trend toward ultra-realistic graphics that require hyper-expensive GPUs and various frame generation and up-scaling tricks to be playable. Many "influencers" are pushing back on games that have so-called great graphics but are boring and unimaginative in their art style and content. They're saying that many of these modern, very demanding games are subpar in their ability to present an immersive world and tell a great story because their publishers are too focused on presenting technically advanced graphics. I'm curious to hear your take on this.
Honestly, I think that is the case.
It really feels like most recent higher-budget games are just tech demos. It's just all buzzwords, marketing and microtransactions. No attention is payed to the actual enjoyment of playing them. It's just cut and paste + new useless tech.
@@Zyxlian Yep. Gotta run frame gen and upscaling to get decent FPS with all the supposed great graphics turned on. It's like the runners and swimmers are setting new records, but the stopwatch is calibrated 10% slow.
nothing replaces raw power.
remember 2 generations ago all influencers went "the 2080ti is only for 4k gaming, nobody should buy it for 1080p"...yea and now cards with the same power can''t even keep 1080p/60 on high stable lol.
lesson: ALWAYS buy more power if you can, no matter the resolution. worst case your card will run cooler/more quiet if you framelimit it.
Which is why I'm running an XTX at 1440p Native. So far I can run everything at stupid good settings without upscaling, and I expect that to be the case until the next gen consoles have been out for a while, and the first this-gen-only console games start hitting the streets.
Pretty sure Remnant 2 did exactly what you suggest, it ran very slow on ultra originally because of the shadow maps setting, so their solution was just turn that off even on the ultra preset, so people would stop complaining about the game's performance
Kinda funny how you didn't run a test with 1440p DLSSQ at high settings. There is nothing objectively wrong with how you organized this video, but you literally neglected the one setting configuration that most people in this price tier should actually use. It is the sweet spot for performance/visual fidelity where 1080p native ultra is sort of just... why?
It's a 4050 in disguise, and it stomps it.
Edit: I mean it stomps its predecessor 3050.
What are you punching down?
nvidia saw people gaming on 1030 2gbs during the shortage and said " the 30 series is the new budget gaming series, so 30 series is now 50 series" thats what this generation's fuckery has been about lmao
Better to just skip 4060 entirely, and wait till the next gen, newer card shouldn't be needing a crutch to run high 60 natively, or if you don't wanna wait jump to 4070 tier instead.
I think you're giving Nvidia too much credit. Nothing is actually preventing them from releasing another 5060 8GB/128bits that's barely 10% better than a 4060 without some crazy AI shenigans - forcing people to "wait" for the next generation forever.
the real 60
why not just go with amd if nvidia is not to your liking
@@masteroak9724 Nvidia tripled their revenue this year so you can expect their bullshit to continue until it stops working for them (likely never)
@@masteroak9724Heavily doubt they will do their Bs again. Next generation is gonna be a huge jump in VRAM for their GPU’s. 5090 32gbs, 5080 20gbs, 5070 16gbs and 5060 12gbs or 16gbs.
I'm so hyped to get back to these GPU comparison videos when big companies got sucked dry by Nvidia with the AI chips.
This bubble will explode soon enough for this market to be interesting again. A few extra TFLOPS won't matter much if you have to pay another 100K USD to get it.
We just have to stay patient.
My condolences on the other hand for the consistent content creators like Daniel for having to review all these boring stuff every week. Thanks for your service!
This topic is so much easier on us, consumers. We can just divert our attention to other areas of life when we get bored of stagnation.
I want to know who is buying their professional cards.
@@BlackJesus8463 Huge companies who are benefitting from faster computations and more VRAM for similar power usage. If it's worth you more money to do the same in less time why won't you buy them?
Naturally it involves governments and military purposes, if you can deflect a missile in half the time it's probably worth the $. It's in line with the bans.
I can just hope that most of these use cases are directed towards something that is actually beneficial for humans and not to enhance spyware and military stuff.
Good video. The rtx 4060 simply isn't worth the money. It won't have a long life for future games and it's not even great for current AAA games. At least the 3060 had 12gb vram. The minimum Nvidia card to get is the 4070 with 12gb vram. The 7800xt is similar in price SO that's the card to get. Such is the state of video cards these days.
Honestly I rather get the 6700xt for its price to proformance with its 12gb vram
@@JahonCross tbh i wouldnt, but it all depends on what games u aim, for current, i guess its okey, but never not really, mine brother have 6700xt, for what he plays its more than enough, but i played on that pc alan wake 2 and i had to use fsr 2 to have 60 fps at 1080p, buy 7700xt or used 6800 /xt
been getting happier about my 3060 every damn day since the gpu shortage ended. DLSS means that 3060 is still 1440p viable, high vram means high texture quality settings, looking good in general and negating some downsides of DLSS, and the 192bit bus seems like a luxury instead of a handicap now lmfao
@@salmon_wine4060 is stronger than 3060 dawg as well as having better dlss and raytracing
Thanks to upscaling, UE5 tech is laughably inefficient. The future of budget video cards in new games is not 1080p, it's 540p.
When you push requirements too hard, then you'll soon find that people simply won't buy the game and the game will flop hard. Nobody want's a game running like shit on a $400+ card. All that results in, is people just passing on that $60 game, rather than forking out a $1000 for a brand new card. Which in turn if they stupidly did, would still play the game at an unsatifactory level for the price they just paid.
Dev's need to learn to scale games so that they run at a reasonable level on the typical "run of the mill" hardware, that the majority of gamers have.
I've got a feeling after seeing the slew of demanding games have come along this year and the backlash that gamers had over it, that this is all going to seriously backfire very badly for Dev's and likely very soon. That's unless they actually have taken note and start to dial hardware requirements back.
UE5 and it's performance demands simply isn't helping the situation one single bit (neither is dev's chasing after RT). It's just pushing demands that are far too high onto gamers, before anyone is even close to having hardware that is both affordable and powerful enough, to be able to meet those high demands.
Upscalling, Raytracing, Lumen, Nanite, UE5 tools, etc. Are supposed to reduce development costs and time. Yet, we just keep getting unfinished, poorly optimized games that cost even more money than the previous generation ones.
Nice review on the 4060!
Coming from a GTX 1050 Ti, I'm happy I finally able to run Alan Wake 2 on the lowest settings with DLSS with my 4060 from Gigabyte. The most shocking thing I've noticed is how the ingame graphic looks exactly the same whether I used ultra 4k or low settings at 1080p.
I wonder if game developers rigged the graphic settings to make us spend money on upgrades? I'm looking at you Alan Wake 2 and Starfield.
I really dont feel like buying these games unless they look so good , i rather play old game which looks much better than recent titles and perform much better.
Old games dont look better. UE5 is the best there ever was or ever will be.
@@BlackJesus8463 lmao what a fanboy, profile pic checks out
That's a pretty tough call, damn. I mean at first blush it looks like if you're 1080p gaming this card is okay. Not great but okay. And if you're a 1440p gamer, you'll be alright if you use DLSS. And then, there is a pretty vast gulf in price to the 4070 - at least in my market the 4070 is *twice* the price of the 4060. This all seems kind of in line with what a 60-series card is supposed to be: Enough, but nothing earth shattering.
But theeeen I start to think:
- If it's just okay today, how long until it's not okay anymore? Should I really buy a card today for 330€ that will be good enough for maybe a couple years?
- It only has 8Gb of VRAM. I mean damn, my 1070 - a SEVEN and a half year old card - has the same, and games sure aren't the same today as they were in June 2016... UA-cam tests seem to be showing that 8Gb is just about enough today - but again, for how long?
- Part of the selling points of the 4000-series cards against both the earlier 3000-series and especially AMD's 6000 and 7000-series is Ray Tracing. Looking at these results, could you really imagine that you would be using ray tracing on any new title coming out? Nope. You'd be either playing at 1080p high natively or 1440p with DLSS just to make 60fps, and I can bet any demanding RT would be out.
- Then there's of course DLSS vs FSR. And yeah, we're all kind of agreed that DLSS is the superior technology. BUT, if you could comfortably get your games to 60fps natively (either 1080p or 1440p) for about the same cost, then is DLSS really so valuable after all? Meanwhile, FSR quality certainly isn't SO bad that you couldn't help your AMD card reach a bit higher. Especially if the games get even more demanding, and you would realistically be choosing between FSR quality and DLSS balanced.
But then also eventually you end up with a bum deal: Would you really want to pay MORE than the cost of a new 4060 for a used 3070 or 2080 (/Ti)? Or would you really want to pay twice the price of a 4060 for a 4070? Or would you pay very slightly less than a 4060 for a last gen 3060, if you can get one? Or would you really want to pay 1,5x the price of the 4060 for a 7700XT, which would get you better native performance? Or would you really want to pay a tiny bit less than the 4060 for a 7600, which is pretty much the same if not even a little slower?
Conclusion:
Market's borked.
You only need high textures and everything else can be turned down. I mean are raytraced shadows, puddles and window reflections that good? Like Nvidia Tax good? Hell no.
@@BlackJesus8463 Perhaps. Then again, it's precisely the high textures that are most in want of VRAM. On the other hand, just having plenty of VRAM won't help forever. Like, my GTX 1070 has 8Gb VRAM as I said, but it sure as heck can't run these games well :D You can only turn down settings so far - at some point the limit comes up.
In the 4060s case, if it's already just "okay" at 1080p high or ultra, as in this video, how long until you'd have to go down to medium, low, or for even that not to be enough? In fairly many games, the FPS gain from turning settings down isn't as much as I'd hope.
@@AleksiJoensuu Even upscaling uses more VRAM. lol
@@BlackJesus8463 Yeah.
It's damn hard. I could easily see a case for myself to buy any of the aforementioned, if I was without a GPU right now:
- I could buy the 4060. It's the most affordable brand new card that could do the newest current games at 1080p native or 1440p with DLSS.
- I could buy the 4070. It's twice the price, but at least it would run the newest games comfortably at 1440p native and probably 4K with DLSS, and be somewhat future proof for maybe the next 5 years. It could also do some actual ray tracing, though not quite comfortably.
- Could buy the 7800 XT for about the same as the 4070 (in my market), or maybe less if ordered from abroad, and get slightly better performance than the 4070, but worse RT and no DLSS.
- Or could get the 7600 for a good offer somewhere to kind of scrape by for a while and upgrade later
- Could risk it and get something used, save the change for something else.
Like, all of these options sort of make sense. It's just that all of these options are just priced too high for comfort.
Conclusion:
Market's borked 🤣
@@AleksiJoensuu You just made the case for buying a 6700XT or 6750XT instead of the 4060. If you're upfront expecting to forego RT and screen space reflections entirely, while keeping textures at max, then the 12GB VRAM and 12-19% better raster simply offers more value.
Do remember with undervolting, debloated drivers, nvidia control panel optimization and optionally ingame settings adjustment your performance will improve greatly as all these test are with stock everything to simulate the average consumer experience. Nowdays you can have an ultra graphics game with up to 20% performance boost with little to no loss in visuals with changing graphics setting.
Debloated drivers?
@@juanblanco7898 This video will start you off right, I saw a massive performance gain even to the point that I can play Tiny Tina Wonderlands at average of 80fps with 60fps lows with a gtx 1080 tweaked ultra settings 1440p no dlss/scaling. ua-cam.com/video/mc0xk06RRTE/v-deo.htmlsi=MzDVY17V_YABeGre
I feel some of these performance issues are partly related to the engine being so new and not very updated/experimented enough and/or that the games released have skimped on optimization for one reason or the other.
I second that Todd Howard in disguise. Lol.
Unfortunately, the beginning of the game Talos Principle 2 is not representative. This is a virtual environment in the story of the game. For future tests, you'll have to solve a few puzzles until you wake up from standby mode. You will then find yourself in the city of New Jerusalem. I would still go to the first game mission (at the great pyramid). There you can test the performance well with a lot of vegetation.
I actually like to downscale using the [NVIDIA RTX] DLDSR function 1440p to 1080p, the detail goes way up. Would be nice if you could benchmark that also.
Imagine if you had a 1440p screen. oled
@@BlackJesus8463 imagine if this video was about 4090 or 4080
Future? The 4060 isn't ready for gaming NOW😅
Some people have to have Nvidia though. Share in the greatness of the 4060 with them.
Some people have to have Nvidia though. Share in the greatness of the 4060 with them.
Apart from the 4090, I think this gen of GPUs has been a huge disappointment for the most part. But at the same time, Unreal 5 can be seriously demanding and most titles that used it are often not very optimised. In the case of LOTF on consoles,, in the 60fps mode, the native res can drop as low as 648p on XSX, and a blurry mess on the s machine. So it seems Unreal 5 is designed to be run with upscaling and FG tech due to its demanding nature.
Yeah it's kinda unusual that the high end option actually is the best offer in terms of what you get for your money. Even though it is expensive it doesn't feel overpriced, much less so than the 3090 did. Kinda weird.
The inability to hit native 60fps @ 1080p is not a good look for the price they are asking.
I can understand needing dlss & fg @ 4k or even 1440 ultra, but this is just weak and/or disappointing.
Its actually pretty good, way better than consoles which need to drop down to 720p to reach 60fps in UE5 games.
1% lows are the only number that matters so If DLSS gives you identical 1% lows to native rendering that is a catastrophic failure and basically means the feature is worse than useless because its making your image lower quality with ZERO performance gain.
I certainly hope so, it's the GPU I upgraded to from a GTX 1660 Ti!
I'm on a GTX1660ti currently waiting for the delivery of my rtx4070, gonna change my life to have dlss
rip your future game performance
Ouch the 4060 is terrible. It’s the definition of spending more is better.
😂🤣🤣
@@ZackSNetwork Well in my country I can spend $399 for 4060 or $529 for 4060 Ti 8GB. Or $759 for an actually decent card like a 4070.
How about a ”reality check” when testing budget gpu:s? They probably won’t be paired with a 7800X3D, more likely with a 5600, 12400 or 7600. Maybe an idea for a video? Comparing a budget card in a top-tier testing system vs a budget build? How much perfomance will we lose? I think that would be enlighting. 😉
Currently playing at 1440p with an old 2080 and i noticed that Ultra is definitely not worth it quality wise. I'd love to see more "High" settings on budget cards since it seems more realistic. No one buys a cheaper alternative gpu to run games on ultra. Otherwise great work as usual!
No! We want raytraced shadows! 🤣✌
It is more of an issue of the card being functionally useless for future games. The 1060 6gb was able to get avg. 70+ fps at 1080p in the Witcher 3 at max settings (barring Hairworks) when it came out. If the current 60-series cards are getting mid-30's fps with max settings in current games, then it means that they are functionally half as good for it's time. The next UE update will make the 4060 a sub-30 fps 1080p card for those games. It just feels bad.
Same boat, playing at 60+ fps 1440p mostly high and some ultra settings with dlss quality on lords of the fallen (rtx 2080) so really good performance for how the game looks, the biggest performance impact is ultra global illumination which has no real visual upgrade over high. New UE5 games need some options tweaking to run good and look virtually the same as ultra settings
Same with the 6600/ 7600? 🙂
UE5 with nanite and/or lumen, does significantly scale with pixel count. Lumen is ray tracing, a simplified shortcut software based ray trace, but it is ray tracing global illumination. More pixels equal more rays cast. More rays cast is more compute and thus lower performance. Nanite is a special way of handling geometry polygons. The thing it cares most about is pixel count. It handles very high poly counts easily, no need for LOD streaming, but nanite does have a high base overhead just to exist. It is almost as if nanite looks at a pixel and figures out what geometry should exist there, versus taking geometry and figuring out what pixels the geometry contains.
One thing said about these things is that throwing "more" at them does not incur the performance penalties one might think via more traditional techniques.
With lumen, beyond the pixel count, the highest global illumination setting will typically incur a good performance penalty. More ray bounces and maybe more rays. So, more compute but should be more accurate.
Can barely tell any difference between the visual quality of ultra and high here without pixel peeping, ultra doesn't really seem worth the performance hit.
Yeah it's been like that since the mid-late 2010s.
UE5 itself is unoptimized. we are still getting games from ue4 which are unoptimized as well.
Using DLSS in 1440p makes more sense than both native 1080p and DLSS 1080p.
Sure it would look better however these are literally 1080p GPU’s.
@@ZackSNetwork DLSS Q in 1440P is rendering at around 900P, which makes it runs better and looks better than native 1080p at the same time.
If you can accept more visual quaility hit, you can use DLSS B in 1440p and it will definitely looks way better then DLSS Q in 1080p.
Nanite manages geometric density based on the amount of pixels you are rendering, so it makes a lot of sense to see such massive performance boosts from DLSS.
Oh that explains a lot, thanks! Another thing is, somehow UE5 games look good even in "dlss performance" mode, sometimes it's even hard to tell the difference from native especially if you add some sharpening game filter it might even look bit better... that's on 1440p tho, not sure about 1080p.
12:25 my lord pharaoh looks well endowed 😅😂🤣😂🤣….. upon initial glance 💪
One thing I am wondering about UE5, does nanite scale the number of triangels rendered depending on resolution? because, I mean, it would make a whole lot of sense for the engine to render FAR less triangels on a Steam Deck (800p) compared to a 4K monitor since you would literally be throwing away calculations at things you cannot see. And if this is the case, what happens with the amounts of triangles when utilizing DLSS/FSR? does the number of triangels rendered through nanite depend on the base resolution or the upscaled resolution?
If nanite determines the number of triangles based on resolution I'd say it's the base resolution, not the upscaled resolution
@@israelRaizer Yes, this is exactly why UE5 + nanite scales like crap when you turn up the real resolution (not the output resolution). It's adding exponentially more pixels to fill AND more geometry to draw at the same time.
Its the rendering resolution
Love my 4060. Frame generation and dlss is amazing together.
I am watching on 720p but can't tell the difference in high vs ultra settings
Would the 4070 be a better bet? I'm leaning more toward DLSS over FSR.
Just buy an xtx and be happy honestly. I have a 3080ti and either I'll buy an xtx soon, or I'll wait for next gen to see how things are. Short of the 4080 and 90 the Nvidia cards are really underwhelming this generation, atleast for the price you pay for them. I can get an xtx for 1100 locally, or j could buy a 4080 for 1500 or a 4090 for like 2k.
Wait for the Super launch in a month or 2. See how prices shake out
Yes) I play Lords of the Fallen in QHD without dlss and frame generation. hundred plus fps. I watched the video and I honestly don’t understand why there is such a huge difference between 4060 and 4070))
@@TheRealDlou really expect price drops from nvidia 😂 ?
@@pohanysiku6491Probably because 4070 was twice as expensive as 4060 when it launched.
I'd rather upscale 1080p than buy an Nvidia gpu in this market. The difference in silcon cost between performance tiers is very small compared to the cost of the card. You should be waiting for the refresh before you buy anyway.
I Say Geforce RTX 4070 or Any AMD Video Card would be preferred.
The price is so close to the 7800XT…which can raster on par with the 3090
One could can turn the question around: is 'Unreal Engine 5' optimization ready for "mid range cards", like the 4060, to run natively at 60fps@1080p? With the same level of graphics fidelity those cards could run an EU4 title at 60fps@1080p.
That should be the question in my honest opinion.
Like DefactoOverlord rightfully mentions in his comments here, which I like to echo: "The level of fidelity in these games doesn't justify the hardware requirement."
To add. If graphics engines/game developers/publisher continue the push their focus on ever increasing graphical fidelity, at the expanse of it not scaling the the required gpu power to drive it. In time this will back fire. Unless GPU prices come down significantly, which they won't. Thus a whole GPU product group, the mid range and thus their consumers will be sidelined. And who likes to get sidelined...
We are getting to a point in graphics development where the improvements that are being made visually seem to be only slight and can at times be barely even noticeable to some people. However, those slight increases in fidelity seem to now be unjustifiable, when you take into account the huge cost to performance that inevertably also comes along with them. Yep, upscaling tech helps out a touch (most of us would now be screwed without using it), however it's still just a quick fix and less than ideal solution to an increasing issue.
Maybe it's just time for dev's to lay off pushing the graphical bandwagon so hard for a while, until GPU performance has had time to catch up. Hell, instead they could focus on improving over a dozen other core mechanics in games like storylines, accurate physics, accurate character models ect, that are in dire need of improvement. Plus, those improvements will likely be noticed far more by gamers.
BTW, some of them have needed a update for years now, but they always seem to be sidelined, simply because Dev's prefer to push out flashy graphics to the masses.
1080p doesn't really make sense any more, 1440 Quality, or even Balanced has same or better performance with better visuals!
80 % are on full HD
just got my Powercolor Fighter RX 7600 and loving it watching this helps me alot since all in all its only 21% give or take less then the 4060 but I was able to snag the card for 200$ with a deal coupon on new egg so its great :D
Hey Daniel, don't you think it's time to abandon 1080p at all? 1440p DLSSQ is +/- the same performance as 1080p native, and 1440p DLSSQ looks good, and 1080p with any upscaling looks bad, so you'd really like to stick to native in this resolution. Combining it together with better and bigger screen (27" 1440p instead of 23" 1080p) IMO does the job pretty good. I'd really not buy an 1080p display anymore.
Remember when 060 cards could play AAA games at 1080p 60fps without DLSS of FG? Those where the good times!
Guess I will be playing indies from now on, because I refuse to pay 630/690 euros (4070) to play 1080p 60fps AAA games.
I guess people expect to be able to play latest and greatest with RT on max settings and mods on with a 60 class gpu
I see two takeaways from this video:
1. UE5 can actually scale performance with settings, unlike we saw in some of the first games that came out using that engine.
2. 4060 is seriously underpowered, but we knew that already.
1060: "The new 1080p resolution at 60fps? No problem!"
4060: "The old 1080p resolution at 60fps? Well, you could lower settings, or I guess you could upscale?"
Why does it feel as though developers are getting sloppier with their optimizations, and hardware manufacturers aren't delivering the expected generational improvements?
Because that's true, games are getting increasingly unoptimized and hardware is starting to hit physical limitations.
The commenter above me is right. Significant performance leaps in hardware are getting more and more difficult to hit with current technologies. That’s partly why there’s such a focus on software tricks to make up for it. And yeah devs are making super unoptimized games, partly because of publishers and studios who rush game releases
Yes no hardware problem , last of us can nowe run on 8gb vram and hogwarts no problems anymore
@@noidsuper you are seriously blaming the technology for « lackluster » performance and still hating on this card after almost a year ? Its been 8 months already this card performs way better than a 3060 now, accept it.
About what I would expect
DLSS is magic, they should have leaned into that instead of RT
It's the only reason I was able to play some games on my laptop, it's frankly insane
How can it be insane when its not as good native? They shouldve leaned into native but they gave you both aRT and DLSS.
DLSS is magic at 4k, nice but compromised at 1440p, and a blurry mess at 1080p.
@@MinosML at 1080 it's still miles better than doing it the old way by going down in res. So it is magic mostly when you sit on older hardware.
Wait till people realise that Ue5 doesn't use much vram due to Nanite
Man, I want the opposite tech. I want games to use 24gb VRAM, and ease the performance load instead.
Make one for 4070 too
I'm looking at expanding this same test set to most current gen GPUs if I have time, and doing a bunch of head to heads.
People hate the 4060 for not being able to run games in photorealistic engine at 100fps. But the question is, why would 70, 80, 90 series even exist if 60 could run all of that?
lol exactly plus Nvidia gonna sell what they can sell.
You got point there
4:20 I can do that too by lowering the settings to medium.
In relation with Remnant 2, if you didnt know, they launched a patch recently that fixed major problems with different graphics quality and fsr/dlss performance. Sometimes you could barely see any changes in fps with different settings. If only they wouldnt have waited 5 months to release the patch smh
im kinda proud of my brand new 4060 actually lol. So much hate, I know. But hey, im the average player and consumer.And im sure there are alot like me. Got it with a new i7 11700f for a bargain of around 800euros, already tried star citizen to test it, with high settings, ran well enough. Im not picky or very demanding, so might just go medium and im sure i can get a nice experience at some graphics cost.
Seems i will be able to enjoy some u5 titles as well, maybe at medium settings, but whatever, even in Low, this new gen games are quite impressive lol.
Damn RTX 4060 is already weak WTH
Always was...
It's a 4050 in disguise after all...
You guys are still stuck in june 2023, stop saying this card sucks its been months already this card performs better now.
@@naipigidi nvidia gives more vram or better pcie lines now for it ?
@@marcinkarpiuk7797 not like they do anything to performance that much, and most games with high vram are badly optimized, sooo.
I don t know but the future of gaming looks shiet
have u notice that nvidia have no plan to release 4050 ? since they use AD107 which is suppose to be 4050 no 4060 for more monies. we actually get an expensive 4050
They already did it’s just in low powered laptops lmao.
I wish you'd tested High Settings (+DLSS Q) (++ DLSS FG) because that card is just not an ultra settings card, especially for people with high refresh rate monitors.
I’m personally not a fan of Frame Generation, however for some scenarios, they may be a benefit. Thinking back, I feel Frame Generation is basically SLI levels of boosted frame rates ….. around 60-70% more frames using 2x gpus over 1x GPU.
Nvidia swapped its hardware SLI for software SLI and charging a premium without the buyer getting the benefit of physical hardware.
I swear I see blurry sht and sharpness gone on dlss enabled footage... I know you will likely not see them if you playing, but just watching the video and pausing, the difference is huge.
No matter how it's called, 1050ti or 4060... Nvidia's cards with a 128bit bus is ALWAYS brings only dropping from high to low/medium settings in around 2 years to have acceptable framerate in new games, and absolutely uncomfortable gameplay in around 4 years in a new games (latency, long lvl-loads, bad 0.1%/1%).
Bus width doesn't matter,It's all about having enough bandwidth.
Do you think you’ll ever do more testing with a more representative cpu like a 5600x?
Weak 4060 i prefere 3090 or 3080 on fsr 3.0
Just use Dlss, Fsr 3.0 isn’t good on any card
@@romankozlovskiy7899 why? Nvidia gives a lot of artifacts also with ray reconstruction and overdrive mode fsr gives better performance and no smugging
I'm so glad I recently picked up a 3080fe used for $300.
No way I could match that performance per dollar. The 4060 is junk.
be intresting to see just how money it will now cost to get a GPU that can run this set of games at 60fps with high + setting
and nice see price cost like for like test using AMD Radeon RX 6000 / 7000 Series GPUS
What’s the price for a 4060 in the US? Here in Australia the price point for gpus are really pricy for us here.
4060 is 300 bucks in the US
Wow it costs us 700-750aud. Imagine paying up to 500usd for a 4060
Love the videos, keep up the great work!
I don't understand why this card is selling so well
People afraid of AMD they had so meny driver issue on 5 000 seris
"The 1060 was great, so I stick to the 60-series, because they're budget-friendly powerhouses." - many long-time gamers who don't follow hardware reviews anymore.
GTX 4060 is just a name. In reality is RTX 3050 replacement for $50 more.
Its ready, mostly at 1440p and at 1080p
all unreal engine 5 games i have seen so far do not justify the performance hit at all
they do not look anywhere close to how good they'd have to look
wondering what is in it for them reducing the amount of people who can buy your game at 40-60$
to people who have bought a 500$+ gpu sometimes in the last year
Great job as always! Keep it up.
Should add Ark Survival Ascended to the game list as it brings even 4080 to standstill @ 1440p lol
EVERYONE looking to buy SHOULD wait IMO until the SUPER announcement in a month. Might shake up the pricing!
Probably only older 70ti and 80 will be affected...
Probably not. The super series doesn’t target that lower end price bracket. The best you’ll see is lower 4070 ti and 4080 prices (MAYBE) to clear out old stock. 4070 likely won’t see huge changes since they’re still going to manufacture it. They’ll just price the 4070 super higher than the 4070. At best you’ll see the 4070 drop to Black Friday Prices and that’s a big if
The RTX 4060 and 4060 8gb are terrible GPU’s that should not have existed. To heavily cut down on Cuda cores, memory bandwidth and vram. The lowest end card should have 12gbs of vram and mid range GPU’s should have 16gbs of vram. The 4070ti Super will have 16gbs of vram so the 5070 should have 16gbs as well.
Running the 4060 at 1440p is an exercise in futility. I upgraded from an RX 580 at 1080p to a 6700XT at 1440p during the human malware snafu, but I already jumped to an XTX, but stayed at 1440p, because the hardware requirements for new games are sky rocketing.
*Is overclocked 3060 ready?
Overclocked 3060 with DLSS 3 and frame gen...
Overclocked 3060 with DLSS 3 and frame gen...
@@AlucardNoir😂
@gustav0711 Of course, how could I have forgotten the most important detail. More money, less memory! The Nvidia way!
... or was it the Apple way? I tend to get large faceless corporations mixed up.
it's ridiculous how they keep pushing the limits of GPUs yet without any major or justified graphics improvement and add fuel to the fire with gimmicks like RT and mesh shaders
For folks with amd gpus. Xess or TAA@0,8-0,9 looks way better than fsr2 at 1080p.
Are we looking at GPU limited results? Would someone that has the budget to buy an AM5 system with a Ryzen 7800X3D cpu go for an RTX 4060 GPU? Personally i think it's kind of a mismatch. How would the results be when a cheaper AM4 system would be used?
This is probably a silly recommendation but I want to see how a RX 6950 XT runs on UE5 just to see how well RDNA2 holds up longevity wise.
Depends on the resolution, it'd be fine at 1080p but FSR sucks at 1440p. Tend to think all mid range cards out atm won't do well when UR5 games start hitting proper. Hopefully AMD get FSR in a much better place, fingers crossed on that one.
theres no reason to get 4060 if you can find 3060ti around that price if you must go for nvidia
Me when nvidia
aslong as the game runs on 1080p quality fg with 60-90 fps. U never need an upgrade. Unless u have a lot of cash and want to play at 4k or so....
Good video. Would like a video showing AMD (and Intel?) equivalents to the 4060 in theses Unreal Engine 5 games. (Has this already been done?) Ty!
Lol no. Go rewatch the Hardware Unboxed review, the RTX 4060 is barely ready for the gaming _of today._
You don’t tell a math teacher go watch someone else UA-cam when got the hardware it self to test it
Doesn't have the luxury of consoles saying everything is 4k, while hiding the heavy upscaling and sacrifices made just to get it there.
Hardware Unboxed arent the smartest of people when it comes to tech.
1st the consoles are Powerful or weaker the 4060. So gpu power is no issue.
2nd UE5 uses Nanite (which is similar to mesh shaders). This means it wont use excessive vram. Even at 4k.
So no. Its pretty capable of Ue5 with no gpu power or vram issues
Just don't listen to Linus. We all know those ppl have no idea how to test
@@R4K1B- Oh ya I'm sure you random UA-cam commenter are so much more knowledgeable than Steve or Tim. You also ignore that the consoles have much more VRAM and memory bandwidth than the RTX 4060. And a $300 GPU from 2023 should be considerably out performing $500 consoles from 2020.
1080P is sufficient for a monitor smaller than 24 inches, and 4060 is sufficient for 1080P and 60fps. If it's a small screen, some roughness can be tolerated. If you want more than that, just throw in as much money and electricity as you like.
When I first got my first 144hz monitor I decided to never play anything that does not reach that frame rate with decent settings. I just pretend these games do not exist, even on a 7900 XTX. No amount of fidelity will beat smoothness and responsiveness, specially in multiplayer games which are my jam
I don't see UE5 as the future, maybe for console and pure 3rd/1st person action AAA gamers, but not for most PC gamers, which includes many genres where gameplay matters. You can't make gameplay-oriented games on UE5 unless you want to create flops such as Lords of The Fallen lol. Who wants to play an RTS, beat em up, metroidvania, sports game, racing sim, hardcore stealth game, soulslike etc at a wonky 30-45fps?
You can technically play generic 3rd person AAA action games with 30fps, this is what the PS3-4 did and it's happening to this console gen once again. It's not like mechanically shallow games such as Spiderman, Horizon, Cyberpunk, Starfield, God of War etc need fluid gameplay, the games are clunky even at 165fps so it doesn't matter how they run, people play these games for the graphics, story and presentation where they can turn off their brains. On the other hand, you can't even really release even shitty indy UE5 horror walking sims on PC either, because those games are played by kids with several gen old x50/x60 cards mostly, so there's no market for UE5 games at all, neither in the indy nor AA market.
The gameplay-oriented games are going to be written in better engines than UE5, which at this point is literally any engine other than whatever monstrosity Bethesda uses. The best actually gamey AAA games from the last 5 years are Elden Ring and Hitman by far IMO, and neither IOI nor From Software use third-party engines because it's much easier to work with an engine taylored to your needs than using a 3rd party engine and trying to study it. Even the RT in Elden Ring and Hitman was actually developed over a long time hence why it wasn't available from the start.There are is no shortcut to developing a good game, and good game companies know that.
Great video. I think as they were building UE5 as an engine, they thought nVidia would have have made the 4060 at least faster than a 3060ti but it slower! People that normally upgrade to the next 60 series should not be upset at UE5 but upset at nVidia. Meanwhile the RX 6800 and the new 7800XT are working better in UE5 titles? Almost like AMD anticipated how to build for it. Seriously WTF nVidia?!
Nevermind the future, it's not even ready for the present.
Man I just hope I would be able to play a few future titles on my 3050. Thats all I am asking for just hang in there for 3-4 years
I did not mesured my FPS in Lords of the fallen. But game is super smooth on my RTX 3060 laptop on Ultra settings DLSS Quality. I dont know why hes geting so low framerate on 4060.
All the cheating gimmicks Nvidia needs to keep up their performance