But FSR 2.1 (I don't know about DLSS) additionally provides temporal anti-aliasing. And considering that TAA will always be a bit blurry, FSR holds up incredibly well (in quality mode), while also providing a modest performance boost. It also quite effectively removes aliasing from Hairworks in TW3, so you don't have to rely on MSAA, which makes it actually worth using.
Not right now, but when every computer has the capability to do AI upscaling, it will definitely be the next step in making really good-looking games, by developing with it as the base.
Why not? It depends on the GPU and the game. If we're talking about a relatively slow GPU and a relatively heavy game, that game was a no-go for that GPU before upscaling became a thing. So in many use cases upscaling will be required if higher fps is desirable. Btw, didn't Remnant 2 get an update that somewhat fixed performance? It already seems pretty normal for new games to have performance problems at launch.
The main issue here is that studios are putting too much emphasis on graphics and not worrying about quality overall. Graphics look real nice in your trailers, but they don't replace the fun factor (which definitely suffers if you need to buy a powerful card to play and still need tricks to fight FPS drops).
And then when a big developer dares to make a game that doesn't conform to the graphical fidelity goose chase, people still bitch about it. Hopefully Fromsoft never listens to these dumbasses.
Literally been an issue for 25+ years (yes I'm old). Obviously upscalers etc didn't exist in the days of BUILD, Quake (1-3) engines and everything from the PC "golden age" c1994 to 2004, but we've always _always_ had issues with studios putting graphics above gameplay. For that matter, there's also _always_ been issues with graphical advances outpacing hardware -- at least for the majority of people. When Quake came out back in the '90s, the vast majority of gamers didn't have graphical acceleration at all, and the ones who did were forced to run the game at low settings and in a reduced screen window. The 2000s to mid 2010s were the happy medium of cheap graphics tech and engines that ran great and looked great... But now we seem back to the late '90s-early '00s era of lazy devs and engines outpacing affordable tech once again. Just saying, from someone who's been gaming since the goddam 1980s, this is nothing new.
I've been saying this for a while now. Dying Light 2 was the first game that really made me notice how bad the issue was getting with games seeming to require DLSS to play at a decent framerate. Upscaling is an awesome technology to squeeze more performance out of older hardware, or to make something like Ray Tracing viable in games for a more modern card. But it seems like it's being used as a crutch to just not bother optimizing games anymore.
DL2 was my first experience with this too. I had a 1080 Ti (which even now is still quite powerful), and it couldn't hold 60 FPS at minimum settings at a resolution I had no problems running other demanding games on (I should note my peak was like 43 fps while I stood still and stared at the ground in the intro sequence lmao).
Modern software developers have managed, through sheer incompetence and inertia, to render irrelevant multiple magnitudes of hardware improvements over the past 30 years. I wonder how much electricity is wasted each year because of bad code.
With anything Nvidia, expect deceptive tactics, and the FPS obsession was the start of the industry messing with your mind. I always played at 60-70 fps and am one of the best players in LoL, and in Battlefield back when I played it a while ago.
@@Bustermachine You're confusing DLSS with TAA. TAA needs to be run within the pipeline to look decent. DLSS and FSR just need TAA to work and are easily implemented.
Don't buy a game at launch; buy it when it works properly, i.e. when all the issues are ironed out. Don't be afraid to skip a game if it launches in a poor state.
My rule of thumb is 1.5 years after a single-player title releases. That's usually enough time for all the DLC to come out and the issues to be worked out. MP? Well, that's a shitshow, because you've got to get in early on MP or you miss out before the population drops off, which it almost inevitably does.
I was so close to pre-ordering Remnant 2 because of how much I liked the first game and how well it ran. But I was glad I backed out after the early access benchmarks came out; then I went to the subreddit, and after seeing that post from the devs my fears were confirmed.
That's what I did with Elden Ring. It was running poorly for people at launch, but right now, after multiple patches, it's running damn nice at 1440p/high/60fps with very small, rare drops.
@@Dregomz02 Ye Elden Ring was the first game that I ever purchased and played Day 1 and I got burned. Waited a couple months and played it when it ran well. Never again though lol
60 fps at 4K DLSS Performance with a 3080 and 120+ fps at 4K DLSS Performance with a 4090 does seem to be what new games are being optimized for. Makes sense with how many TVs support 4K 120Hz now. There have got to be way more people overall buying 4K 120Hz TVs than 1440p/1080p monitors in 2023.
Except FSR and DLSS are not the problem. Just because FSR and DLSS exist doesn't mean it's a free pass to skip optimization entirely. They don't automatically make devs extremely lazy. Devs can't get away with FSR/DLSS alone and zero game optimization.
Just look at Cyberpunk's Phantom Liberty update and Starfield. This is BS. The way the gaming industry has gone, I want to play games less and less.
I refuse to believe that new games are "outpacing" the power of GPUs. They don't look much better than they used to. Devs are just getting lazier and lazier about optimization.
Yeah, I mean, older games that look incredible came out during the RTX 2000 series, or hell, even the GTX 1000 series. Even the best-looking games these days don't really hold a candle to Red Dead Redemption 2, which came out on the PS4/Xbone, hardware significantly less powerful than the best PC hardware at the time, and it still ran at a very solid framerate. Meanwhile, if you tried playing RDR2 on PC when it came out (and even now, really), it runs like dogshit on PC hardware that's multiple times more powerful than those old consoles... No matter how much power you have, if the devs didn't properly optimize their game, it won't matter. Look at the original Dark Souls 1, for example: that thing brought even RTX 2080s to their knees in Blighttown, despite looking like an early PS3/Xbox 360 title. It's true that this generation of graphics cards has been absolute dogwater, though. The RTX 3060 Ti outperforming its successor, the 4060 Ti, is fucking pathetic and laughable, and Nvidia should be ridiculed for it at every opportunity. Literally the only card that offers significant gains over its predecessor is the $1,000+ 4090, which is bullshit.
One big culprit is ultra HD textures, which players won't even notice until they're hugging an object. You see this problem with Hogwarts Legacy and the RE4 remake: VRAM is gobbled up to store textures to an insane degree. Turning textures down actually fixes performance while barely changing visuals. In Hogwarts in particular you will not notice a difference between ultra-high and arguably even medium texture settings.
@@WelcomeToDERPLAND I played dark souls remastered on 1660 TI laptop and an RTX2080 and I can’t ever remember frame dips or stuttering on either system. 1660ti was 1080p max settings and 2080 was 1440p max settings. I definitely had to turn down settings with my 970m laptop to get 60fps at 1080p though, ps3 and 360 ran that game at 30fps 720p, with frequent dips into slideshow territory.
@@MechAdv Key word: Remastered The Remaster fixes most of the performance issues. The prepare to die edition is what kills fps, specifically in blighttown.
PCs are now doing what consoles have been doing since the Xbox360/PS3 era: running in low resolutions and trying to act like they're not. Most of that old console era was 720p, even though all our tvs and systems were 1080p - and with xbox one/ps4, it was the same thing, a lot of "adaptive resolution". It's going to suck if this starts to be more commonplace in PC games. I remember being happy that my little 2060 had DLSS, as I figured it would give me a bit more oomph, but it's been a mess in most games that bring it on board, who only bring it to cover problems and not to help broke ass PC gamers like me lol
I always feared DLSS and Nanite because, despite being pretty cool tech-wise, they're an extremely easy excuse for big companies to ignore game optimisation even more than they already do.
Welcome to the next generation. I remember when everyone was "forced" to upgrade their machines when everything was suddenly built for 3D video cards. And people cried just as much about being "forced" to upgrade.
Not the GPU manufacturers' fault. The game devs are using cool features like DLSS & FSR as an excuse to not properly optimize their games, like a lot of people said they would when it first got announced.
@@hypnotiq Nvidia is also guilty of slacking off because of DLSS; the 4060 is a garbage graphics card because it literally depends on DLSS to get good fps.
@@hypnotiq I disagree. The pipeline for GPUs, graphics engines, and game development is too interlinked to say that the problem sits solely in the lap of any one of the three. They're all to blame.
I've been working as a backend dev for more than a year now, and let me tell you, most of the time it's not about caring. It's more like: you could fix it, but your boss has other priorities, and once those priorities are done he won't pay you to optimize the game if it isn't too bad. Money is power nowadays (and has been for a very long time).
Yeah, I can definitely see that. I’m sure some game devs enjoy their jobs but at the end of an 8 hour shift or longer you just wanna get tf home, especially if the needed work is unpaid.
Yep. Optimization is not easy and it takes a lot of time. People want games quickly and those at the top want them released quickly. FSR/DLSS is being used as a way to get it playable sooner and out the door. If it sells well, it gets optimized and patched. If not, maybe it gets optimized down the road or maybe it doesn't.
@@michaeldeford1335 If it sells well without controversy related to performance, why would a company spend money on optimization (they won't)? It isn't used to get games out the door quicker and patch them later; it's just used as a crutch to get games out the door for the sake of an early release.
Same here and I've had similar experiences being a backend dev. Normally it isn't the lead dev/scrum master who wants to leave it out but someone up in middle/upper management who decides that the costs & effort outweigh the benefits even if the devs disagree. I've had countless arguments with these people on how much effort things take and they seem to think I'm purposefully putting way too much time down. They literally have no clue
The main problem is devs not optimizing for DX12. In some cases, devs are just taking their game, running it in DX12, but then converting those calls to DX11, which hits your performance even more. It's sad that the entire software industry in recent years has been more concerned with taking shortcuts and pumping out content than with optimizing its software to run better.
@@DiogoManteu I'd advise you to do your research. They aren't. Hell, Intel's GPUs do this all the time by converting older DX calls to 12. That's why recent Intel GPUs don't perform as well on older DX versions.
@@DiogoManteu It is if they're so fucking lazy they won't convert their game to 12 and instead just translate 11 calls to 12. Or not develop for 12 at all and kill performance on cards like Intel's. Get out of here 🤡🤡. You don't know shit.
@@Ober1kenobi Are you trying to say the hardware that's been put out recently is good and up to standard? In my opinion it is very much a case of both groups being in the wrong.
People with 8GB videocards, and PC gamers in general, need to lower their expectations of console ports. Making a game is extremely hard work, and to accuse game developers of being lazy is just honestly completely ignorant. The optimization argument is getting blown out of proportion. Would you rather wait three more months before it releases? If you want it to be better optimized, just wait three more months, and it will typically be much better optimized by then, if it wasn't well optimized to begin with. It's much more important that the game is actually a good game than to have it super well optimized, and this looks to be a VERY good game. Making a game really well optimized for PC can be very difficult for a game that's first developed for the PS5. Making sure that the game will run really well on the PC will give the game designers more restrictions, and will make it harder for them to optimize the game for the PS5.
@@syncmonism Yes, I would like a three-month delay if it means the game launches better. It's called patience. And please enlighten me and the others as to how making a game like Call of Duty, for example, is hard work.
1) I think many, but not all, devs are relying on upscalers (DLSS / XeSS / FSR) to get new games to some sort of playable state. 2) Investors / shareholders / management pressure studios to have a title out by a certain timeframe regardless of the state the game is in, pushing devs to not optimise titles properly. 3) Games are more about DLC / loot boxes / cash grabs to upsell features etc. and are no longer just a game... it's a business with repeat turnover and that's it.
4) The graphics hardly look different or better than a game released 10 years ago, yet we are getting worse and worse performance. Feels like collusion with GPU manufacturers, or terrible optimisation, to me.
You are one of the very few who acknowledges that the devs basically have to do what the greedy higher-ups tell them to. Most people and content creators only mention the devs, when in most cases they are doing what the publisher orders them to.
Optimization is one of the last things you do in game development, so if it's not getting the attention it needs, that could be because of time pressure. Devs might *want* to have time to optimize, but when looking at potential steps that could be condensed, it would be an obvious candidate given how much it can be "cheated" using upscaling. I suspect this may be what happened to Jedi: Survivor
I've actually heard from people who code, and a few developers with experience, that they learned at some point that you should optimize early. If you don't, you have to go back through the whole project optimizing everything, when you could have built it from the ground up on a framework of proper optimisation. It makes sense that while learning a new engine or gaming device, bloat creeps in as mistakes during the learning process. But eventually, general experience, especially with the specific engines and platforms, should lead to teams knowing what they are doing at project inception. You would see this progression with older versions of Windows and with game consoles: advanced games that ran better than the previous ones on those same systems.

UE5 just came out last year, and the PlayStation 5 in late 2020. (Maybe the Xbox Series X is a big technical change; I just wouldn't be surprised if it wasn't that much of a difference compared to the Xbox One and Windows 10... and now 11. And the Switch was built on older tech even when it released around 6 years ago. But the PS5 is the popular thing folks have to work around now.) I would say it hasn't been enough time, but that would be excusing all the bad launches throughout the 2010s.

It appears that greater graphical fidelity is demanded even when that same level of fidelity takes twice or four times the effort and performance, while giving diminishing returns in terms of actual satisfaction, when... so many popular games played online with others are years and years old now and run on potatoes and toasters. Do they just have to keep hiring more and more new people to make it all happen on a tight schedule? Well, we know the answer to that already: yes. So it seems that AAA devs are in a perpetual cycle of learning how to do things, and by the time they've built up experience, something new comes along that is even more expensive both computationally and effort/logistics-wise, along with studios closing down and people constantly being shuffled around, and... I suspect they just never get to that point of competence. And if they were to spend the money to somehow do that on time... maybe they wouldn't profit?

We could just use 10 years with no new additions to the whole process... just take a break, work things through, straighten the kinks out, and see the quality rise back up before moving forward. After all, it's not like anyone is demanding 8K, and surely we've finally hit the top mark and entered the real area of diminishing graphical returns? Right... right?!?!
You are wrong; we must develop with optimization in mind from the beginning. If you schedule a three-day development task that "just works" and nothing else, you will have a lot of work to do rewriting your code later. I am a software developer, and that is correct: it's about thinking first about the best performance of your code.
@@chillinchum For sure, people always expect too much from the beginning of a new generation. There is a sweet spot, though. 2009-2011 is my favorite example when it felt like devs really had a solid handle on the current tech *just* before new hardware rendered their progress meaningless. And then there's MGSV and the venerable Fox Engine, which seemingly blows a massive hole into this theory
Optimizing models, effects, and textures costs money and A LOT of time. I feel a lot of developers are pretty much skipping a huge portion of this process today, and they also rely on different technologies to either automate it or at least make it easier, which sometimes produces a suboptimal result. Hopefully technologies like Nanite for Unreal Engine will remedy some of the core issues that we have with game performance.
Bro, I'm a software developer, and when I read about the various tactics game developers were using way back in the 90's / 2000's, I am literally blown away by the sheer ingenuity and complexity with which they ended up solving their problems. I have NO QUESTION in my mind that modern-day developers across the board... aren't anywhere near as good at writing efficient software 😂❤ We were spoiled lol
The problem with Nanite (and UE5 in general) is how baseline-heavy it is. Make a blank UE5 scene and look at how poor the framerate is. That said, it's like a freight train: you can load it up with stuff and it won't slow down.
Yeah, because the management side of things is only interested in something to show the investors for a given quarter, so they rush things to release. And of course the developers' priority, when being forced to release a game months or sometimes years ahead of schedule, is to focus on getting the game to an actually functional state. Unfortunately, stuff like optimization is usually one of the last steps of the whole process once everything else is done, and they don't even have time to finish everything else. Upscaling technology isn't the problem here. The problem is capital getting in the way of art, a tale as old as time. Blaming upscaling tech is a weird take, because we'd probably be in the same situation regardless, just with even less playable games.
The second upscaling technology went from an optional feature to increase FPS for users willing to compromise on certain visual features to mandatory to achieve even just 60 FPS, the industry started to lose its way. This is a horrible trend starting, and I'm worried this will be the standard going forward.
It's a total contradiction of what raytracing was supposed to add to games. Raytracing = better accuracy of shadows & reflections for "high picture quality" but lower fps; DLSS = "worse picture quality" by reducing the accuracy of things like AA, AF, shadows, & draw distance, but higher fps.
@@kevinerbs2778 Modern AA was already removing details, with methods like TAA creating ghosting and blurring the whole picture. I honestly think DLSS and FSR are the best kind of anti aliasing at high resolutions, since MSAA is so heavy on the framerate and TAA doesn't look good imo.
@@meuhtalgear You don't even need AA at higher resolutions like 4K. Downscaling just makes your image look like dogshit; it's something you don't really want if you play at higher res.
Cringe take. This happens because of console culture that is based on locked 30 or 60 fps. DLSS is not responsible for the developers picking one over the other, and in most cases still being unable to deliver.
@@evaone4286 The need for AA isn't purely resolution-bound; it depends on screen size and resolution together. A 7" 1080p screen doesn't need AA because your eyes can't make out the jagged edges.
I think the biggest problem in a lot of these latest releases is Unreal Engine 5 itself. For games that are available on both UE4 and UE5, there is a hugely noticeable increase in performance hitching, memory requirements, and required drive speeds. Unreal Engine 5 has actively discouraged optimization in its marketing, and even with it, games seem to run worse.
Mordhau went from UE4 to UE4.5 and the game became a fair bit smoother fps-wise when it happened, at least on my power-gamer AMD rig from 2+ years ago. Big battles with loads of real players still put a big burden on my CPU, however, due to the insane number of computations, but still.
@@arcticfox037 It's all about how it's used and who's working on it. Until the engine automates literally everything and auto-optimises it, which is still a decade or more away (maybe UE7 or 8), competent devs will still need to work on polish and optimisation.
@@arcticfox037 UE5 without Nanite or Lumen will perform almost exactly the same as UE4; it's when you turn these more expensive features on that the cost is felt. They are more expensive, but have no limit to their ability when they are used. So: more expensive, but more capability from Lumen and Nanite.
DLSS was originally meant to exist as a crutch to make ray-traced games performant enough to be playable. The fact that we have to use it for rasterized games nowadays is... annoying.
@@thelastgalvanizer Up to the developers. I personally find pop-in more distracting than modern DLSS, but I can understand it's a personal preference. Using Nanite also probably reduced the budget (helpful for a small team with limited resources). No Nanite means a team would have to create 5x more objects to manage LOD transitions (also, every time they make a new object, they then have to create all the different LOD models, which slows down development). Nanite doesn't just fix pop-in; it also saves a huge amount of time and resources for developers. I can understand why they used Nanite.
@@legendp2011 For the record, LOD creation is practically automatic, and the number of people who would be able to tell the difference between Nanite and good LOD levels is... literally zero. You can't, since LOD levels are determined by distance from the render view.
Nanite is kind of an enigma. It can be more costly than traditional LODs when it comes to less complex scenes. But when geometric density, and especially object density reaches a certain level, it can end up being cheaper and more performant. It really comes down to use cases and optimization. And surely there could be implemented a kind of quality scaling setting for nanite that allows for more or fewer polygons overall, or for fewer polygons in the distance. It could be like standard quality options we have in traditional games. Why does it have to be so locked down?
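For contrast with Nanite, here's a minimal sketch of the distance-based LOD selection the comments above describe, with the kind of user-facing quality knob the previous comment wishes Nanite exposed. The function name, thresholds and quality factor are all made up for illustration; real engines do this per-mesh with screen-space size rather than raw distance.

```python
# Hypothetical distance-based LOD picker with a "quality" scale,
# roughly the knob the comment above wishes Nanite exposed.
def pick_lod(distance_m: float, quality: float = 1.0) -> int:
    """Return an LOD index (0 = full detail) for an object at `distance_m`.

    quality > 1.0 keeps high-poly models visible further away;
    quality < 1.0 swaps to low-poly models sooner (cheaper frames).
    """
    lod_distances = [10.0, 25.0, 60.0, 150.0]  # illustrative thresholds, metres
    for lod, threshold in enumerate(lod_distances):
        if distance_m <= threshold * quality:
            return lod
    return len(lod_distances)  # beyond the last threshold: lowest detail

if __name__ == "__main__":
    for d in (5, 30, 200):
        print(d, "m -> LOD", pick_lod(d), "| at low quality:", pick_lod(d, 0.5))
```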
Wait till the next gen, when AMD won't make any high-end cards. You'll pay more for less, and be happy with DLSS 4 while Jensen Huang gets another leather jacket.
Either that or use a console lol. I'm so pissed that the Xbox Series S runs games better than my 3090 desktop. The console literally only has 4 teraflops of performance!
@@vogonp4287 Intel is essentially a year behind Nvidia and AMD. Battlemage is going to be launching when RDNA 4 and Blackwell are coming out and it's aiming to compete with current gen not Nvidia's and AMD's next gen. Intel won't be competing against Nvidia's high end either so Nvidia has even more of an incentive to do what they did with the 40 series outside of the 4090 and move all products down one tier while charging more because AMD isn't competing either. It's the same reason why Nvidia just cancelled the 4090 Ti. They have no reason to ever release it because their main competitors are nowhere near close to their level of performance.
I personally think this is also an advertising issue - fancy graphics and tricks look and sound real good in trailers (and some gamers also get real obsessive about having big numbers,) but there they can just run it on a supercomputer and ignore the optimization. Then, actually making it run well on normal devices takes a lot more dev time than the studio will usually sign off on when they can just upscale and call it a day with relatively minimal impact on their profits. Kind of a funny parallel with apples (red delicious in particular) - in the short term, people buy fruits in the store based on how they look and maybe feel, so stores select for fruit that looks pretty and stays fresh as long as possible. Problem is, with apples, that means they end up waxy, grainy, and bland when you actually eat them.
I would imagine the devs do care, the real question is whether the suits they answer to cares. Think of the bottom line and how much money you can save if you can just use the magic slider to fix the problems that would take more development and therefore more time and money. I like to think that the actual developers generally care about what they produce and that problems typically stem from higher up. Like the guy at a store who genuinely wants to provide proper service but the bossman thinks of it as a reduction in productivity because that energy could be used to stock more items.
As someone who's an up-and-coming game dev, I can confirm most people [devs] do care; we want our games to be played how we intended. Sadly, I agree that the people cracking the whip simply don't care. They don't see games as art, they see them as a "product", something to be done as quickly, cheaply and as profitably as possible. If there's a way to almost cheat performance into a game, I can guarantee they'll do it, because if there are corners to be cut, they'll cut them all right. It's the sad state of some modern games.
@@Marsk1tty Flat organizations aren't the best; they mostly work in small groups. Once the groups get big enough, power dynamics and an invisible hierarchy are created. It's good for indie projects, but the bigger the game gets, the more people are needed than a flat hierarchy can handle.
Imagine thinking DLSS or frame generation are "sliders"... DLSS/FSR is upscaling that reduces the resolution of details you barely notice or don't notice in a frame (which almost doesn't affect the experience) to boost performance, which gains you a TON when you're gaming.
Game developers are absolutely partly to blame. This doesn't mean individual programmers, but game development studios. The amount of VRAM used by games released in 2023 is not because of any technical reason. We know this because Cyberpunk looks better than every game from 2023 and an 8GB GPU can run it no problem at 1440p.
Remnant 2 uses UE5's Nanite feature for the polygon detail, and it works based on the native resolution, meaning the more real resolution you throw at it, the harder Nanite has to work, which destroys performance, because it was always meant for a low-resolution output that's upscaled using one of the big 3/4 res upscalers. So no, DLSS didn't ruin the game, UE5 did. If DLSS never existed, I think UE5 would still have ultimately been engineered this way, with its own upscaler called TSR. But Nvidia did ruin the prices, using DLSS as an excuse.
I see zero difference, tbh, between the game with and without Nanite. If there's zero difference in other games too, we can pretty much consider this thing a useless gimmick, probably made only to make devs' work easier, but at the cost of performance.
Nanite is just too power-hungry for now. I think it's a great technology for the future, but right now it's too much, unless the Remnant 2 devs didn't implement it efficiently, since Epic's demos looked quite good. Lumen seems to be a better technology for current-gen gaming.
@@Extreme96PL Nope, there are definitely merits to using Nanite. Even though this was my first Nanite experience, I could tell right away it looked better. There was little to no shimmering on distant objects, even at 1080p. Distant object detail was maintained so well I couldn't tell when an object was gaining or losing polygons, which I can notice easily in any other game because of their varying levels of LoD implementation, like how when you walk far enough an object will suddenly be replaced with a lower-poly model; that doesn't happen with Nanite. So I legitimately couldn't tell when an object (and its shadows!) faded in or out of existence, and most importantly object pop-in was non-existent. If I looked past the horrible character models and animation, the graphics really did feel like a next-gen thing.
I knew this would come. The first games using DLSS actually got a good performance boost. I hoped that we would jump a graphics generation and DLSS would make those games just playable, but I forgot that it's also possible for developers to use it to skip work.
It is actually embarrassing that games are so poorly optimized that I can't get a constant 144 fps on my 4090; in Remnant without DLSS it's around 80-90, and most people are not enthusiast enough to invest in a 700€+ GPU. I have no clue which systems those devs actually test their stuff on.
Not just that. Years ago we used to render games at a higher resolution than our monitor and then downscale the image to gain clarity while still playing at good framerates. Now we are doing the opposite: we render games at lower resolutions and accept poorer clarity just to reach playable framerates.
Awesome video! I feel like player expectations played a role in this, when 4k hit the tv market. Many console players anticipated the new consoles to support 4k resolution. This generation of consoles also brought older games to the new systems, giving us familiar titles with smoother 60fps performance. Excitingly, new technologies like ray tracing have emerged as well. However, expecting a leap of four times the pixels and doubling the framerate while enhancing lighting and detail is a big challenge. Upscaling is probably trying to fill that gap somewhere.
Pretty much this. Raytracing is one of the big use cases for upscaling because initial raytracing cards just had no chance to keep up with actual raytracing at full resolutions, so for that it was a worthwhile trade-off. But with ever-increasing resolutions it instead became a crutch to run even basic graphics on intended resolutions.
@@ilovehotdogs125790 I think it could be catching up, but the hardware that actually IS better is overpriced as fuck. Just look at the entire 40 series from Nvidia. It doesn't have the same generational uplift that any previous generation had over its predecessor; it stagnated, sure. But strangely it all falls in line if you call the 4060 a 4050, the 4070 a 4060, and just shift everything one bracket down. They just shifted the numbering scheme and realized they can make more money that way.
All of the post-processing effects in deferred-rendering games add tremendous amounts of blur to the experience. From TSAA, to upscaling algorithms, to whatever else they come up with to cheat poor performance, you get a sloppy picture. I already have blurry vision; I don't want my games to be blurry too! It's supersampling the native resolution and no AA for me.
Just imagine what the world would look like if every business application were in the state of your average videogame. The world would collapse. Great, now your PayPal transaction takes several days, your money can instantly vanish, your browser crashes constantly, database roundtrips take hours, you need a high-end gaming PC to start the simplest office software, and the list goes on and on. I will never understand how you can be proud of the state of such a product. At this point, videogame development feels like the choice for junior developers, and once you actually have any understanding of how to develop software, you move on to a different industry.
It's funny how these technologies started with the aim of giving us a native-4K-quality image by upscaling from a lower resolution. Now we are upscaling games to 1080p from even lower resolutions. I like these technologies, but like you say, I fear they are becoming a necessity to maintain playable framerates rather than a luxury for obtaining excellent image quality.
The best part is that when I got Remnant 2 and jumped in, it looked terrible at 1080p, so I turned upscaling off. It still looked terrible, so I checked the config files, and resolution scaling is automatically set to on at 50%, with no in-game slider for it. So you're already running at 540p and then upscaling that with Performance mode. The resolution is fucked.
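For anyone wondering how badly that cuts the internal resolution, here's a quick back-of-the-envelope sketch. The 50% render scale and the extra Performance-mode factor are taken from the comment above; whether the two really stack in this particular game's config is an assumption, and the function is purely illustrative.

```python
# Internal render resolution after a render-scale setting and an upscaler factor.
# Assumes both scales apply per axis and multiply together (see caveat above).
def internal_resolution(width, height, render_scale=1.0, upscaler_scale=1.0):
    s = render_scale * upscaler_scale
    return int(width * s), int(height * s)

print(internal_resolution(1920, 1080, render_scale=0.5))                      # (960, 540)
print(internal_resolution(1920, 1080, render_scale=0.5, upscaler_scale=0.5))  # (480, 270) if Performance mode stacks on top
```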
IMO, as bad as this really seems, it's going to be a temporary problem. Over the next few years, as hardware improvements slow down significantly and game developers start to realize it, they will eventually come back around and focus more on optimising their games. This could also mean higher resolutions drop in popularity a bit? Possibly? On another topic, about steamcharts: I hate that if you use upscaling at, say, 1440p, it probably gets recorded as running at 1440p when it really isn't.
The thing is, most of the time when games get harsher requirements, it's because consoles are getting stronger. ATM consoles are about as strong as an i3-12100 and a 6700 XT (I know they use a 6700, but consoles have optimizations at the board level compared to computers). So in 5 years they will probably be on a much higher level, since we have rapid improvements in the SoC and other SFF component markets. So it wouldn't be wrong to think that the next console generation will be as strong as a current Ryzen 5 7600X + 4070 Ti config.
Yeah, of course reactionary behaviour will probably take place. The issue isn't that it will stay the same; it's that it doesn't have to get worse before it gets better. Hopefully games similar to BattleBit Remastered come out as a replacement for these higher-budget games. They look much worse, and usually that comes with a lot better performance, and they may still do other things better since they aren't tied to all the red tape, demands, and the larger budget needed to make any changes. That's not to say lower-tier graphics automatically give you better performance; there can still be stuff that slows a game down significantly. In Teardown, for example, the game is very low-res, but stuff like smoke can lag hard. I haven't played that game, but I have seen some people play it on YouTube.
The overreliance on it in every new game is the problem. The fact that consumers expect it to be in every new game is also part of the problem. DLSS/FSR should be an absolute last resort, something you only do if your system barely meets the minimum requirements to play the game. It should not be touted around as a standard feature for everyone to use. It should be a tool for getting a game from unplayable to playable on your minimum machine. The more normalized you make that kind of crutch, the more developers will lean on it to excuse bad performance. So instead of getting excited when a game advertises DLSS/FSR support, I would instead be wary of why they are making a big hoopla over something that's meant to be used sparingly. It probably isn't a good sign.
I love Remnant 2, but the optimization really disappointed me. At 4K with a 4090, I get about the same performance I would in Cyberpunk with RT on high, meaning sub-60 fps. While I get that I'm playing at a stupidly high resolution, no other game gives me issues like this, especially considering the performance gains I get by lowering the resolution or using DLSS/FSR are quite a bit less than in other games. It's not until about 1080p or lower that I start to see strong performance gains. They really need to do a lot more optimization. I honestly feel like this is an area where Gunfire was weak before Remnant 2 as well: Remnant 1 didn't have great optimization and didn't even have the ability to do 4K.
@@parlor3115 I've been building PCs for 20 years, I know what a 4090 is. There's benchmarks on youtube that confirm my framerate. 120fps is with dlss quality and framegen on. Double check your settings, I guarantee those are on.
As a former developer, I'd like to say there is a difference between the developers themselves and the higher-up managers. It takes time, and thus money, to optimize performance. Often the quick and easy route is taken, and technologies like upscaling are luring as a quick fix that saves money :(
Unfortunately the majority of people (and content creators) don't realize how this works behind the scenes, and they only mention the devs, when in reality the vast majority of the horrible anti consumer decisions are made by the publisher and the greedy higher ups.
I've had this thought for a long time: upscaling technologies are encouraging devs not to optimize their games for native resolution. DLSS and FSR are literally ruining games. I remember when upscalers weren't a thing and devs actually had to optimize their games. Honestly, before DLSS came out, the last best-looking and most optimized games I ever played were Ghost Recon Wildlands and Far Cry 5; they ran so well at native resolution on my old GTX 1060 6GB.
They also have less complex graphics; good optimization back then also meant lower texture resolution and variety, less interactive foliage, sand and snow, non-dynamic lighting, objects that shine in the shadows, less complex geometry, etc... i.e. BattleBit 😊
How are they ruining games? Which part of your gaming experience specifically has been ruined by upscalers? Your game looks a little bit worse? That probably saved the devs months of development time in optimization that they put to make other areas of the game better.
@@Ansalion if the game is made for being played at half resolution even on the latest hardware, gamers who don't yet own an rtx will literally not be able to run the game at all lol. And don't respond by telling people to upgrade, gpus are expensive as shit and if you're able to upgrade your pc, congrats, you're privileged as shit.
"Optimized Wildlands" sounds like a joke, but that's okay, at least it scales very well on lower settings for older hardware and had a really massive seamless open world with breathtaking detailed landscapes.
Game developer here - we are actually delighted to spend absurd amounts of time optimizing for performance, it's incredibly complex, deep and satisfying. When you're talking about 'developers not caring', what you're really talking about is how deadlines are imposed by budgets and publishers. If you have 12 months to finish a milestone, optimization is done after the game is content complete and a fun, fulfilling experience, and due to these external financial pressures optimization can sometimes be cut or delayed, particularly at a small studio. Delays can happen to make this work, but only if there's cash in the bank to pay for everyone's healthcare, food and rent long enough to delay, which is rarely the case. Most deadlines set us up to fail from the start and are never enough time, because management and the higher ups are starting from a date they would like to release and working backwards on a spreadsheet to assign work. Management is also the group who would see something like DLSS and say to the team, 'well why dont you just turn that on by default?'
Technologies like nanite and lumen aren't things you just turn on and see if they work on different hardware, they are complex codebases that apply differently to every single object, scene, and asset in the game, and typically are intertwined in a 100-step optimization process that affects every line of code and every type of hardware that could possibly be released on.
'well why dont you just turn that on by default?' Tell them that it makes the game blurry. Sucks. If it didn't make the game look blurry, then it would be acceptable for games to depend on it from launch.
Very good point. It really defeats the purpose of buying an expensive GPU: you expect it to give better results, and then, after installing it in your system, you have to use software just to make it perform the way you paid for.
But the expensive GPU still does give better results compared to less expensive GPUs? Less expensive GPUs are even more reliant on upscalers compared to your expensive GPU. You’re comparing it against some imagined standard that doesn't actually exist.
Game devs want their game to run well (even though optimizing is boring from a game dev's POV), but publishers want money coming in as early as possible, so optimization ends up last on the priority list. That's why it's common these days to see broken games at launch that get better a few patches later. The rant should be aimed at publishers, not at game devs. This happens in almost all software industries.
I feel devs always keep getting the short end of the stick for problems they should not be held responsible for. Management at first, and now even hardware manufacturers. It's not Steve the environment artist's fault that Jensen thinks a 10% perf gain on half the bus width is an "upgrade". The 1060 was on par with the 980, the 2060 was on par with the 1080, and even the 3060 was chewing on the heels of a 2080 non-Super. The 4060 can't even beat the 3060 Ti; heck, it even falls behind the 3060 when VRAM overflows.
@@jaronmarles941 See, software development in general is hard. Try to coordinate 10 people into making a small game and have it done by the end of this week. Impossible. By the end of this week, what's happened is that, with a bit of luck, everyone has their repo set up and is ready to sta- oh wait, it should have been done by now?? Yeah, people who don't know like to shit on the devs for not making good games. In reality, it's the publishers and the management; they are the root of all evil.
Optimization is absolutely not boring; it's super fun. I think one of the big issues is that the games industry has a lack of engineering talent. Not only is there a lack of engineering, but the bar has been progressively raised, due not only to continuously rising graphics expectations, Moore's law dying, and the introduction of 8/16-core consumer CPUs, but also to newer, harder-to-use technologies like Vulkan/DX12. It's not uncommon to have a 50-to-1 artist-to-graphics-programmer ratio: a team of 400 people, 250 artists, 5 graphics programmers. Artists start just PILING work on the heap, and the graphics programmers are way underwater just getting the game to not DEVICE_REMOVED, because Vulkan was just such a good idea of a technology.

OK, another story for you: due to a lack of graphics programming talent, a company decides to use Unreal, a general-purpose engine. General-purpose means not specialized, i.e. not optimized. To make this crystal clear: if a game uses Unreal, it is unoptimized. Full stop. If the game looks like it has advanced graphics and it uses Unreal, prepare to buy a supercomputer. The grand irony is we moved from OpenGL/DX11 to low-level APIs, but that raised the barrier to entry high enough to push three quarters of the industry to use Unreal, which is going to perform worse than a well-designed custom OpenGL/DX11 engine. The company demands graphics that can compete with, e.g., Red Dead Redemption. So what do you get? A general-purpose engine pushed way past its limits. Not only that, but Unreal has the horrific idea of letting artists write shader code. They glue together overcomplicated garbage with their Lego Duplo blueprints despite having never heard of the words 'register' or 'occupancy'. No graphics programming talent to fix it. Might get a cleanup crew contracted near the end of the project to get it to pass console certification.
People complained consoles held back PC graphics. Now they call it bad optimization when games are demanding, rather than recognizing the higher fidelity. So now the community is going to hold PC graphics back. DLSS is here to help hardware achieve better performance at higher visual fidelity, which it does. Boost already-solid fps? Why? Better to boost fidelity at the same solid fps.
DLSS is a double-edged sword. On one hand, I've been using it in The Witcher 3 and I can't really tell any visual anomalies, so I get why games don't really care about optimization. But that is really killing older GPUs; cards like the 1080 Ti might have a shorter life expectancy because developers don't care to optimize anymore.
I have always stood against upscaling and frame generation because I knew this is where it was gonna go. It mirrors exactly how overclocking has gone. Overclocking was once the best way to wring more performance out of cheap, lower-end parts. Today overclocking is locked down and only available on high-end, expensive parts with an extra surcharge tacked on. Same thing here: first upscaling was a nice boost, now it's the baseline, and next it will only be available on the high end for a surcharge. As soon as Nvidia is happy that their technology has cemented itself as the leader, it's getting stripped off the low-end parts. They made it clear this was the plan when the RTX 20 series came out and they made the "peasant grade" GTX 16 series to go with it: 20-series GPUs with the RT stripped out.
@@ANGER2077 I don't disagree with everything, but it's important to note that if we want raw, smooth performance, people will need to purchase more expensive hardware that most either cannot afford or are not comfortable paying for. DLSS and FSR aren't just "fake frame" generators; when you look closely at the details, they also fill in the gaps where some animations (let's say character movement) would otherwise look slightly choppy because of the time it takes your hardware to output one frame before the next. With DLSS or FSR enabled, those "fake frames" end up filling in those gaps, which can provide a smoother experience (depending on how you have your game set up). I was stating that with the scale at which games are growing, you will often need one of two things to compensate: either more power, i.e. a better card that can handle the raw performance and provide a smooth experience, or technology such as DLSS and FSR, which can offer a significant improvement without the need to dish out extra funds. You have the choice and can pick whatever poison you'd like. I'm not sure if you've noticed, but there's an "Off" option too in the games that suggest using DLSS or FSR. If we want devs to magically wrangle up performance in a title that may require more, without these features, sorry, but your experience won't be too great. But if you believe otherwise, you are more than welcome to go out there, become a game developer, and I encourage you to prove me wrong.
Excellent video. Excellent topic. This is the way of marketing, no matter the item. They introduce it to you as good for one thing, and then manufacturers start using it for other nefarious purposes. Pretty soon it becomes just another way to get ripped off for your money.
I don't see DLSS as a solution just for higher framerates but as a tool for pushing the image beyond its limits. I noticed this with Guardians of the Galaxy: I played it at native 4K, and after I added DLSS to sharpen the image even more, that's where it became mind-blowing.
I feel like this is gonna be the gateway to what people fear: an AI server room becoming the lifeline that GPUs are to us right now, locked behind a subscription service, doing stuff like DLSS to make games playable without a GPU, except we don't own the hardware. I dread that fkn day.
Lack of coding culture ("copy-paste" and generated code, overuse of standard modules that are excessive for the goal, bad architecture and poor optimisation, if any, etc.) is ruining games. Prioritising profit over gaming experience is ruining games. Technologies are not ruining anything; they're just being used as an excuse.
With Nvidia and Amd shifting their focus to AI, putting minimal effort into their consumer GPUs and giving them only marginally better performance every new gen, this trend might continue.
@@thecamlayton I'm not trying to blame the devs. I'm sure the devs are trying to make the best games with the newest tech available to them, but if the average person's hardware can't keep up, it'll continue like this. In this case I'm blaming the hardware companies.
@@thecamlayton Bruh, stop d*ckriding. These devs nowadays are there for a paycheck and agree to switch to UE5 and just slap their assets on and move on. Modern devs in the triple-A scene are mindless sheep who do as they're told, and you all buy it up.
Frame generation (DLSS 3/FSR 3) is my biggest fear. From a nice addition, I really expect it to become mandatory in the near future, just like the upscaling techniques (DLSS 2, FSR 2) did. But the worst thing about frame generation is that, unlike upscaling, you need a minimum amount of horsepower to make it usable (without too many artifacts). Imagine a lazy, greedy publisher like EA making it mandatory just to hit 60 fps on high-end hardware; it could be a real disaster, making CPU optimization a joke and creating a threshold you need to clear just to make the game playable, and if you can't reach it, it's unplayable: buy a new GPU. I find it really ironic to make us pay for technologies that can eventually become software obsolescence.
I think upscaling techniques are amazing and a good thing to have been developed. However, I've had this worry since these features released, and it seems I was right to worry. I thought developers would lean on upscaling as a crutch to avoid having to optimize, and even though they can, I don't think they ever should. Upscaling should be there for when you want to use a high resolution or when your GPU is starting to show its age. Unfortunately, I don't think we will go back now, and devs are going to keep leaning on upscaling.
I just "downgraded" from 4070 to 6650XT and plan on dropping my 1440p monitor (picked up a 27 inch 1080p instead) to stop worrying about 1440p. Upscaling, it's not really necessary in pretty much anything with the 6650XT, I think the only thing that isn't running at or close to my refresh limit is Ratchet & Clank. It's fine! Don't need no DLSS, don't even really need no FSR. Ten minutes after I installed my "worse" monitor I forgot I was even running 1080p. It's all numbers games, don't be owned by your PC.
That's kinda my reason to never upgrade from 1080p. For example, you can get a 4090 for 4K gaming, but 2-4 years later it becomes a 1440p card. But if you stay at 1080p with, say, a 4080, you can easily use it for 5-6 years.
@@zalankhan5743 But nobody in their right mind is gonna use a $1200 4080 for 1080p, because you would get the same experience with a $200 card now, or with a $200 card in 3-4 years' time.
I have been looking at 1440p monitors but, 1080p is fine. It is just fine. 50% less pixels to render means it is easier on the graphics card too. Meaning I don't need to upgrade yet. The only reason I would like 1440p is for non gaming use.
What should happen is that DLSS and frame gen should actually be extras; I mean, with DLSS and frame gen turned off, 60 fps should be 60 fps. Devs are being cunning and not optimizing when they should, so the FPS in games is kissing the ground, and this annoys me.
At least in the case of Remnant 2, the issue seems to be that because the levels aren't static set pieces, the game uses tiles, and they used a lot of advanced graphics APIs to generate shadow maps and dynamic lighting for them. So rather than devote a lot of time and energy to making a low-resource-cost solution, they slapped on a bunch of off-the-shelf solutions, and it bogs the game down like crazy. You can get mod packs that allow you to shut off some of the advanced graphics stuff, and the game runs way, way smoother when you do. And the big kick is that Remnant doesn't even look all that good. Gears of War 5 looks better and is multiple years old. Outriders looks better and is a couple of years old. Hell, Warframe looks as good and can run on super-low-end hardware. It sure seems like they sacrificed a ton of performance for very modest visual gains.
What do we expect? When GPU makers push something like upscaling software as the main feature, and not as a bonus, of course game developers take notes too...
It's not the devs. I'm a dev, not a game dev, but a dev nonetheless. We don't even get to choose what "jobs" are made; they're passed down from a PO and put on the backlog, and those are the jobs we do. If the higher-ups don't care about, say, optimising, then that job never even touches a board for a dev to pick up. It's suits and managers that ultimately decide what the game becomes.
If you pick up a job, say regarding lighting generation, you can spend 2 weeks tweaking a new shader that's ultra-performant, but then you'll need to explain why you didn't just use RT and cut the time in half or more. Now you've gone over the allocated time, now you have a product owner angry that you've taken too long, and your team is fucked over because they need to pick up more jobs to fill in what you missed, or some features get removed if it builds up, etc., etc.
@@Wavezzzz601 I'm not talking about the developers' roles in a company specifically. The product manager says X because they didn't get enough budget to fulfill Y. 95% of BAs don't really care about their jobs and will write whatever story gets them through the day, and devs just do whatever is prescribed. So I get that it's not "literally" their fault. But at the end of the day, the leadership that aligned the scope and budget for the project didn't bother. They'd rather make a half-baked game on purpose than something nice. I understand why, because it's more profitable. People will still buy it to some capacity unless there's a flaw that is extremely intrusive to the core product. The core gameplay is fun, so people will overlook this at the end of the day. But this still fits my statement that "you give them an inch and they take a mile", because someone is always running a cost-benefit analysis.
Totally agree with most of the points. I'm getting fed up with every game these days REQUIRING you to use an upscaler to be playable. Native resolution looks so much better than any upscaling tech. To me it just comes off as lazy development. Games from 2017-2019 that don't use any of these by default still look great today. Say what you want about the game, but Anthem, for example, still looks excellent (even though you can use DLSS in it, you don't need to). What happened to optimising a game to run well? It needs to stop.
This! When reviewers say shit like "it runs great!" When they're running a game at 120fps with DLSS on a 4090 but 60fps natively, that is hella not good optimization!
As someone in school for game development, game optimization is one of the worst and most boring parts of a project. Not only that, but waiting months to optimize a game costs money, and management would much rather patch performance after people have paid for it than go another month without any returns.
Vis blocking, tri culling and the like are really easy to implement at the level design phase of a project, if folks just took some time to understand them, along with game engines being written well. Unfortunately it seems studios think of it as an afterthought, which is lazy and dumb at best.
@@cxngo8124 Agreed, it's time-consuming, but it's an integral part of being a level designer, again, if the engine is good. In my opinion a lot of game studios are run by suits who don't fully understand design; they think it's possible to just patch bad design at a later date, facepalm. It also seems a lot of studios are more bothered about diversity hiring than quality employees these days; although that is a totally separate issue, it also plays a part in creating a quality product.
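For readers unfamiliar with the culling mentioned a couple of comments up, here's a toy sketch of distance plus view-cone culling on bounding spheres. This is not how any particular engine does it: the cone test stands in for a real frustum test, the camera forward vector is assumed to be unit length, and real engines layer hierarchies and occlusion queries on top.

```python
import math

def visible(obj_pos, obj_radius, cam_pos, cam_forward, fov_deg=90.0, max_dist=500.0):
    """Very rough visibility test: distance cull + cone (poor man's frustum) cull."""
    dx, dy, dz = (obj_pos[i] - cam_pos[i] for i in range(3))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist - obj_radius > max_dist:   # too far away: cull
        return False
    if dist <= obj_radius:             # camera is inside the object's bounds
        return True
    # angle between the camera forward vector and the direction to the object
    inv = 1.0 / dist
    cos_angle = (dx * cam_forward[0] + dy * cam_forward[1] + dz * cam_forward[2]) * inv
    half_fov = math.radians(fov_deg) / 2.0
    # widen the cone a bit to account for the object's radius
    return cos_angle >= math.cos(min(math.pi, half_fov + math.asin(min(1.0, obj_radius * inv))))

# Example: object straight ahead is drawn, object behind the camera is culled.
print(visible((0, 0, 50), 2.0, (0, 0, 0), (0, 0, 1)))   # True
print(visible((0, 0, -50), 2.0, (0, 0, 0), (0, 0, 1)))  # False
```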
Important note: The devs DO care, their management require them to include hot features like ray tracing and fancy graphics in order to boost popularity prior to release and then give them much too small a time frame to get both the game done, fancy extraneous features included, and everything optimized. So things like upscalers are more than likely just included in the place of optimization because they simply don’t have enough time/resources to invest in optimization.
Real question: does ANYONE ever use ray tracing? The only time I've ever seen someone choose to play with ray tracing is streamers in the first 10 minutes of a game, and then they turn it off because the FPS is too low with it on.
IMO a big part of the problem is that game programming and game art have both become so extremely complicated that people with a decent cross-section of both (technical artists) are too rare. So the art department just dumps a bunch of unoptimized stuff into the game, and the engineers are too busy to care or say anything about it.
I fear that many future games' graphics will be built and balanced around using DLSS (or FSR), so it will become the "standard". There's nothing wrong with upscalers + frame gen, and I really appreciate the boost, but it's rude to ship a game and expect players to use DLSS by default. Playing at native res will no longer be considered "intended" by the game devs.
This was always going to happen. The path of least resistance meant that developers would lean on the crutch of upscaling rather than put in the work themselves.
Ergo, these useless-ass devs are fucking lazy and would rather not put the effort in to ensure their product works on as many different hardware configs as possible. Surprise, surprise.
I've been saying for a while already: A game should perform at or beyond 60 fps on current gen mid-low end hardware WITHOUT DLSS. Then older gens can use upscaling technology to catch up. What the industry is doing is... Making me angry. Very angry.
At first I loved upscalers, but the more I use them, the less I like them. They just come with too many downsides. I think I'd prefer a simple dynamic res implementation without any upscaler the vast majority of the time, and if I can sustain native 4K, all the better.
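For readers wondering what a plain dynamic resolution setup actually does under the hood, here is a minimal sketch of a frame-time-driven render-scale controller. The target budget, step size, and clamps are made-up illustration values, not defaults from any particular engine.

```python
# Minimal dynamic-resolution sketch: nudge the render scale each frame so the
# measured GPU frame time converges on a budget. All constants here are
# illustrative assumptions, not engine defaults.

TARGET_MS = 1000.0 / 60.0    # aim for 60 fps
STEP = 0.05                  # how aggressively to adjust the per-axis scale
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale: float, gpu_frame_ms: float) -> float:
    """Return the per-axis render scale to use for the next frame."""
    if gpu_frame_ms > TARGET_MS * 1.05:      # over budget -> drop resolution
        scale -= STEP
    elif gpu_frame_ms < TARGET_MS * 0.85:    # comfortably under -> raise it
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: 4K output while the GPU is spiking past the 16.7 ms budget.
scale = 1.0
for frame_ms in (22.0, 21.0, 18.0, 15.5, 14.0):
    scale = update_render_scale(scale, frame_ms)
    print(f"frame {frame_ms:4.1f} ms -> render {int(3840*scale)}x{int(2160*scale)}")
```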
I’d really look at it from the angle that smaller teams can produce more impressive work by focusing on the things that actually make the game fun. Will that happen in all cases? No. But I'd say trashing upsampling, which is really just a part of the graphics toolkit at this point, doesn't make much sense.
One of the major problems I see with DLSS and Frame Generation is that it muddies the water for consumers. We used to be able to look at reviews and KNOW that a frame was a frame, that a "frame" was a known quantity and a universal unit of measurement, but now... are we sure all frames are equal? Are we 100% sure an Nvidia card renders the same quality of "frame" as an AMD card? How can we be sure? I'm seriously asking: how do we know that one isn't rendering a lesser-quality frame to pad its numbers? It's obvious that Nvidia wants to use software to obscure consumers' ability to discern the performance of the hardware. Just look at the 4060 and 4060 Ti: Nvidia was DESPERATE for us to equate software performance increases with hardware performance increases and consider the two one and the same, and I absolutely guarantee you it'll be even worse next generation; perhaps all we'll get is software performance increases. What I'm saying is that a consumer used to be able to compare GPUs across multiple generations and even across multiple reviewers (granted the test bench is similar) and be confident the numbers were comparable, but with trickery like DLSS and frame generation, can we be sure of that in the future? It just doesn't make things easier for the consumer.
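To make the "is a frame still a frame" worry concrete, here is a rough back-of-the-envelope sketch. It assumes the simple case where roughly every other presented frame is generated rather than rendered, which is a simplification for illustration, not a claim about any specific vendor's implementation.

```python
# With frame generation, the FPS counter reports *presented* frames, but input
# sampling and latency still track the *rendered* frames underneath.
# The 50% generated ratio below is an illustrative assumption.

def rendered_fps(presented_fps: float, generated_ratio: float = 0.5) -> float:
    """Estimate how many frames per second were actually rendered."""
    return presented_fps * (1.0 - generated_ratio)

def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

presented = 120.0
rendered = rendered_fps(presented)   # ~60 truly rendered frames per second
print(f"counter shows {presented:.0f} fps "
      f"({frame_interval_ms(presented):.1f} ms between presented frames)")
print(f"but only ~{rendered:.0f} fps are rendered "
      f"(~{frame_interval_ms(rendered):.1f} ms between input samples)")
```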
The pc consumer base needs to stop purchasing/funding these games and recent developer tactics otherwise they’re going to continue doing this. Stop buying them just like the overpriced graphics cards over this past year.
This is why I love Dead Island 2. When it first launched, the game ran at ~100FPS at native 1440p, but enabling FSR actually made it drop to about 50 frames lmao.
I saw this coming from a mile away: devs getting lazy and not optimizing because of upscaling. However, it's nice to have upscaling for low-tier GPUs, for people who can't afford a new one.
It's still gonna run like crap on low end gpus. Heck, even high end gpus need upscaling for remnant 2. If they optimized their games so decent hardware can have a smooth experience without upscaling, then low end gpus could have an enjoyable experience with these upscalers.
Upscaling looks really bad at 1080p, so it's horrible for people with low-end hardware. You may as well lower the resolution; it could probably look better than the FSR/DLSS blur. Even with upscaling that looked like ass, Remnant 2 produced really low framerates. It was never a smooth 60fps, and medium with FSR was puke-inducing; even PS2 games had more pleasant visuals than this.
guys, this is how tech works. Nvidia and AMD and everyone involved didn't introduce upscaling tech to help players on the lower end of the tech curve get acceptable framerates, they introduced the tech to induce late-adopters to jump ahead of their own adoption rates. From this point forward, you can expect system requirements to include an expectation of having to use upscalers. In a generation or two, they will probably disappear as they become invisible to the end user.
Upscalers should be used for cases where the low end gpus are either not enough to run it at a playable frame rate or you want your high end gpus to be more power efficient and so that your room doesn't heat up as fast. It should not be needed for the latest flagship gpu to run the latest games. If your game NEEDS an upscaler to have a playable frame rate, you are doing something wrong.
Thats exactly what I feared when they announced it. In theory it would be a cool tool to increase performance especially for big high resolution monitors or TVs. But now it is almost exclusively used as a crutch by developers that can't or won't optimize their games.
I think ultimately the choice is between worse-looking games, or better-looking games that upscale and thus might look a bit worse under certain circumstances (i.e. ghosting, loss of fine sharpness, which are already things TAA suffers from), or worse if you're team red/blue where the upscaler isn't quite as good. "Optimisation" is basically nonsense to a degree. It's not like there's some magic "make this game run on a toaster" button that will make path tracing, voxelised realtime global illumination, contact shadows, physically accurate material shaders, and all the other stuff that goes into modern games and makes them subtly look way better than older titles suddenly run faster on tech that's increasingly struggling to maintain generational gains. There are games that are just a fucking mess internally, but that's always been the case. Even Jedi: Survivor is almost certainly heavily optimised, if rushed out of the door for unclear reasons; it's also just incredibly heavy to run and pushed so far to the edge that it's difficult to even get it into a state that's fully acceptable on powerful rigs.

I don't like the trade-offs of having to run an upscaler much, but I think the honest truth is you either have to accept using an upscaler to reduce the number of pixels being pushed, or you have to accept games that just aren't as graphically impressive. Especially when they're not particularly triple-A, which I don't think Remnant is compared to something like Jedi or Hogwarts Legacy or TLOU on PC. Just look at the previous game in comparison: for as much as the performance penalty is steep (and the stuttering is an issue), it also looks two console generations better. The tech that allows that graphical shortcut also means a steep increase in performance requirements. They could have made a worse-looking but better-performing title, but considering how often gamers complain about perceived 'downgrades', do you really think that wouldn't affect their ability to market it to the general public? People lost their minds when Uncharted 2 had what they thought were worse-looking *pre-rendered* cutscenes, even though the changes were necessarily purely artistic. "This game looks a console generation older, but at least you won't need to toggle the slider that most people won't even notice is on" is not going to fly for the majority of titles. Especially since so many games sell based purely on looking very pretty or having the latest tech features (i.e. Cyberpunk Overdrive, Metro Enhanced; I'm sure some people bought Forspoken just because it had DirectStorage).

The most I'd agree is that 'low' should run fine without an upscaler. That's a genuine, unquestionable optimisation issue. But the idea that 'ultra' shouldn't require one is basically demanding that games look incredible and yet not be taxing to run. Daniel Owen was running a lower-mid-range card (as much as the card is still expensive) on medium settings. On low (with textures at whatever is appropriate for the VRAM) I'm sure the game would still trash its predecessor and be less demanding. That would be the comparison I'd consider important. If medium looks better than the previous game, I think it's reasonable to expect a proportional increase in requirements. Then he goes and runs the game on a 4090 at Ultra... and yes, of course it's not a massive frame rate. You set it to Ultra!

You release a game whose max settings match the console, and gamers complain about it being held back by consoles. You give better visual options to use that spare performance, and now it's unoptimised? Does the game look way better than the previous game at default settings? Yes. Then bitching about ultra is pointless. It used to be that ultra settings would give most players 10 fps (i.e. Crysis, The Witcher 2's Ubersampling) and people just... used settings that looked almost as good and didn't murder their PCs. Instead we're demanding that games be more impressive than older titles or the console port... while also not being any more demanding? "Oh no, my 4090 only gets 120 fps on the absolute highest setting without the almost visually indistinguishable toggle on that would let me run it at 4K for no performance loss." Seriously, if the game supported a 16K output resolution, would 4090 owners complain their FPS is trash? Because that's how the "I should get 500 fps at ultra" people sound to me. The point of max settings is to be the most demanding way to run the game if you have performance to spare, not to be the default mode.
Sadly this is the standard software engineer/programmer mindset in big production. They all think "CPUs/GPUs are _so powerful nowadays_ that you should care about 'clean code' more than performance." And as Molly Rocket said in his video "Performance Excuses Debunked", this is wrong; old software was fast *EVEN ON OLD HARDWARE!!!*
I first noticed this with Nvidia Reflex in BF2042. Instead of optimizing the game's input lag and then using Reflex as the cherry on top, Reflex is what makes it almost playable.
WTF. I cannot believe I haven't subscribed yet. Subscribed now, yours is one of the best upcoming fresh channels on gaming that I want to keep up to date on!
Remnant II uses Nanite and VSM, which are super heavy on the GPU. Download any UE5 demo and you'll find they're just as heavy in the demos, and you need to use DLSS there too.
OK, but it doesn't look THAT good. I don't care what they use if the graphical quality is barely on par with modern stuff but performs worse than a game with path tracing.
This is by design. Nvidia wants upscaling to be the main driver of performance uplift moving forward. Doing so allows their margins to increase per unit over time and lets them pocket the profit while giving everyone the illusion of technical progress. Developers are forced to work within that paradigm. It doesn't help that AMD is playing along by those rules, and that it will take Intel a while before they're 100% competitive with the other players in the market, but truthfully this is where hardware manufacturers are pushing development.
Intel creates limited edition gpus that are discontinued before battlemage is even close to launching. Not to mention they believe gpus should be released every year which is a horrible business model because desktop gpu users typically would upgrade every 2 years and aren't willing to purchase gpus released every year. No one seems to understand that a gpu company needs money for game ready drivers, developing technology, paying engineer employees, paying customer support employees, etc.
@@QuestForVideoGames Intel partners are still selling those parts and will continue to do so. Intel themselves stopped making those cards under their own branding, but they continue to develop drivers regardless. I don't see a problem with that. A lot of people stop at the headlines and don't read the full thing (as seems to have happened with you there...). For a company just starting its discrete GPU business, I see no problem with them taking a crack at it yearly. They're clearly still positioning their lines, and GPUs are not Intel's bread and butter nor a meaningful source of revenue for their core operations, so they can afford to do these things. I think over time they'll have more flexibility when they move to building these chips in their in-house foundries instead of relying on TSMC. Intel has a HUGE advantage in that regard: they can undercut everyone greatly because they can integrate their manufacturing vertically (something neither Nvidia nor AMD is able to do). If they can come up with a high-end line comparable to Nvidia's, the paradigm will be forced to change again.
@@christinaedwards5084 I'd believe that argument if the latest lines weren't more power efficient and also so depressing. They're not doing it for the environment. They're undercutting raster performance for profit.
I remember when Nvidia introduced Temporal Filtering into Rainbow Six Siege, and it was really helpful for getting my aging 550 Ti able to play the game at 1080p. Now here we are, FSR and DLSS being used to supplant proper optimization. Realistically though, I'd imagine it's being used as a stop gap between release jitters and post launch stabilization to get the release window tighter in a schedule. It's not ideal, but hopefully it's a better choice than game devs simply not caring.
FINALLY! Someone is actually talking about it! This is what I've thought about DLSS ever since video games started being unplayable without it. While it is cool to be able to get more frames thanks to DLSS, games don't really look that good when it's on. I just want to enjoy a game on my decent rig without any upscaler.
Hundreds of games released this year on PC, and only a handful of them really "need" DLSS or FSR. Stop complaining about lazy devs. The point of upscalers is to get higher FPS, not to make the game playable...
So THIS is what I've been experiencing with newer games. I've been complaining about games not being optimized at all for a solid while now, this would explain it.
Nice video, I was waiting for someone to bring this to light. So many overly positive reviews ignore that this game simply CAN'T be played without upscaling.
WDYM a 2060 is not demanding (0:30)? I use a 1650 for The Witcher 3 and it is amazing. I never needed to upgrade my GPU; things are lightning fast and it's still supported by Nvidia. BTW, the 1650 is the average GPU on Steam. I said average, meaning roughly half of Steam users use something worse. The minimum requirement for the game is a 1650 too. To me this means the devs were too lazy to make a "good" game AND make it usable by half of the users on Steam, so they did their best and had their business team put out those requirements to actually get people to buy the game. PS: I use Proton to run The Witcher 3 on Linux; I can't imagine how much more FPS I could get from my GPU if I ran the game on Windows. I also don't use any of the upscaling and raytracing BS.
The thing is that Remnant 2 doesn't even look THAT good. Sure, it's a good-looking game, but nothing special IMO; it's marginally better looking than the first Remnant game yet 100x harder to run. That just makes it more confusing as to why the game is in the state it's in.
FSR/DLSS should be a nice little bonus to boost your already solid framerate, it shouldn't be required to get the game to work right
But FSR 2.1 (I don't know about DLSS) additionally provides temporal anti-aliasing. And considering that TAA will always be a bit blurry, FSR holds up incredibly well (in quality mode), while also providing a modest performance boost. It also quite effectively removes aliasing from Hairworks in TW3, so you don't have to rely on MSAA, which makes it actually worth using.
Not right now, but when every computer has the capability to do AI upscaling, It will definitely be the next step into making really good looking games by developing with it as the base.
@@Elite7555 another weirdo that likes TAA
@@bravish I think so too, and by that point, native resolution's only purpose will be to take close-up screenshots within the game.
Why not... depends on the GPU and the game. If we speak about relatively slow GPU and relatively heavy game, that game was no-go for that GPU before upscale become a thing. So in many use cases upscale will be required if higher fps is desirable. Btw remnant2 got update somewhat fixing performance? Seems like it is pretty normal already for new games to have performance problems at start.
Main issue here is that studios are putting too much emphasis on graphics and not worrying about quality overall. Graphics look real nice on your trailers but doesn't replace the fun factor (that is definitely affected if you need to buy a powerful card to play and still need tricks to fight FPS drops)
Sadly better graphics and in-game bells and whistles sell.
True except dlss doesn't even make the game look much better for the hardware requirements. It will get scrapped
And then when a big developer dares to make a game that doesn't conform to the graphical fidelity goose chase, people still bitch about it. Hopefully Fromsoft never listens to these dumbasses.
Literally been an issue for 25+ years (yes I'm old). Obviously upscalers etc didn't exist in the days of BUILD, Quake (1-3) engines and everything from the PC "golden age" c1994 to 2004, but we've always _always_ had issues with studios putting graphics above gameplay. For that matter, there's also _always_ been issues with graphical advances outpacing hardware -- at least for the majority of people. When Quake came out back in the '90s, the vast majority of gamers didn't have graphical acceleration at all, and the ones who did were forced to run the game at low settings and in a reduced screen window. The 2000s to mid 2010s were the happy medium of cheap graphics tech and engines that ran great and looked great... But now we seem back to the late '90s-early '00s era of lazy devs and engines outpacing affordable tech once again.
Just saying, from someone who's been gaming since the goddam 1980s, this is nothing new.
Proven by how publishers and whatnot collectively said "don't expect all games to be this good" when Baldur's Gate 3 came out.
I've been saying this for a while now. Dying Light 2 was the first game that really made me notice how bad the issue was getting with games seeming to require DLSS to play at a decent framerate. Upscaling is an awesome technology to squeeze more performance out of older hardware, or to make something like Ray Tracing viable in games for a more modern card. But it seems like it's being used as a crutch to just not bother optimizing games anymore.
Yep same here. DL2 has really subpar performance and not much going on to justify it.
Except it does not automatically make devs extremely lazy. They cannot get away with just FSR/DLSS alone and zero game optimization.
Took you a while huh? I've noticed this issue post-Crysis.
DL2 was my first experience with this too. I had a 1080ti (which even now is still quite powerful), and it couldn't hold 60 FPS @ minimum settings on a resolution I had no problems running other demanding games on (I should note my peak was like 43 fps while I stood still and stared at the ground in the intro sequence lmao).
Modern software developers have managed, through sheer incompetence and inertia, to render irrelevant multiple magnitudes of hardware improvements over the past 30 years. I wonder how much electricity is wasted each year because of bad code.
Yes. This is exactly what we said would happen when DLSS launched. Devs using it as an excuse to not optimize their games.
It probably doesn't help that DLSS likely requires its own optimization pipeline to look halfway decent.
Where's the proof?
Anything nvidia expect deceptive tactics, and FPS bs was the start of messing with your mind from the industry..
I always played 60-70fps and am one of the best players in LoL, and Battlefield when i played while ago.
exaclty what we knew would happen
@@Bustermachine You're confusing DLSS with TAA. TAA needs to be run within the pipeline to look decent. DLSS and FSR just need TAA to work and are easily implemented.
Dont buy a game at launch, buy it when it works properly, i.e when all the issues are ironed out. Don't be afraid to skip a game if it launches in a poor state
My rule of thumb is 1.5 years after a single-player title's release. That's usually enough time for all the DLC to come out and the issues to be worked out. MP? Well, that's a shitshow, because you've got to get in early on MP or you miss out before the population drops, which it almost inevitably does.
I was so close to pre-ordering Remnant 2 because of how much I liked the first game and how well it ran. But I'm glad I backed out; after the early access benchmarks came out I went to the subreddit, and after seeing that post from the devs my fears were confirmed.
Tell that to all the dumb masses still pre-ordering xD
That's what i did with Elden Ring, it was running poorly for people at launch but right now after multiple patches it's running damn nice at 1440p/high/60fps with very small rare drops.
@@Dregomz02 Ye Elden Ring was the first game that I ever purchased and played Day 1 and I got burned. Waited a couple months and played it when it ran well. Never again though lol
I have a feeling we’ll see a lot more games like this
60fps at 4K DLSS Performance on a 3080 and 120+fps at 4K DLSS Performance on a 4090 does seem to be what new games are being optimized for. Makes sense with how many TVs support 4K 120Hz now; there have got to be way more people buying 4K 120Hz TVs than 1440p/1080p monitors in 2023.
Except FSR and DLSS are not the problem. Just because FSR and DLSS exist does not mean it's a free pass to 100% skip optimization. It does not automatically make devs extremely lazy. They cannot get away with just FSR/DLSS alone and zero game optimization.
@@Rairosu They are literally skipping optimization, bro, as we have seen in recent games.
Just look at Cyberpunk's Phantom Liberty update and Starfield.
This is BS.
The way the gaming industry has gone, I want to play games less and less.
@@NoiceBOB Starfield isn't even going to have DLSS at all on release, please do research first
I refuse to believe that new games are "outpacing" the power of GPUs. They do not look much better than they used to. Devs are just becoming more and more lazy about optimization.
Yeah, the graphics are the same, and not even as realistic as they claim.
Yeah, I mean, older games that look incredible came out during the RTX 2000 series, or hell, even the GTX 1000 series. Even the best-looking games these days don't really hold a candle to Red Dead Redemption 2, which came out on the PS4/Xbone, hardware significantly less powerful than the best PC hardware at the time, and it still ran at a very solid framerate.
Meanwhile, if you tried playing RDR2 on PC when it came out (and even now, really), it ran like dogshit on PC hardware that's multiple times more powerful than those old consoles...
No matter how much power you have, if the game devs didn't properly optimize their game, it won't matter. Look at the original Dark Souls 1, for example: that thing brought even RTX 2080s to their knees in Blighttown, despite looking like an early PS3/Xbox 360 title.
It's true that this generation of graphics cards has been absolute dogwater, though. The RTX 3060 Ti outperforming its successor, the 4060 Ti, is pathetic and laughable, and Nvidia should be ridiculed for it at every opportunity. Literally the only card that offers significant gains over its predecessor is the $1,000+ 4090, which is bullshit.
One big culprit is ultra HD textures, which players will not even notice until they're hugging an object. You see this problem with Hogwarts and RE4 Remake: VRAM is gobbled up storing textures to an insane degree. Turning textures down actually fixes performance while barely changing visuals. In Hogwarts in particular you will not notice a difference between ultra-high and arguably even medium texture settings.
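A rough worked example of why texture resolution dominates VRAM. The byte-per-texel figure assumes a common block-compressed format and a full mip chain; real engines and formats vary, so treat these as ballpark numbers only.

```python
# Back-of-the-envelope VRAM cost of a single texture at different resolutions.
# Assumes ~1 byte per texel (typical block compression) plus ~1/3 for mips;
# these are illustrative assumptions, not any engine's exact accounting.

def texture_mib(size_px: int, bytes_per_texel: float = 1.0,
                with_mips: bool = True) -> float:
    base = size_px * size_px * bytes_per_texel
    total = base * 4 / 3 if with_mips else base   # full mip chain ≈ +33%
    return total / (1024 ** 2)

for size in (4096, 2048, 1024):
    print(f"{size}x{size} ≈ {texture_mib(size):6.1f} MiB")

# A scene streaming ~500 unique textures: 4K vs 2K versions of the same set.
print(f"500 textures at 4K ≈ {500 * texture_mib(4096) / 1024:.1f} GiB, "
      f"at 2K ≈ {500 * texture_mib(2048) / 1024:.1f} GiB")
```

Dropping one texture tier quarters the memory, which is why the texture slider is usually the cheapest fix for VRAM pressure.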
@@WelcomeToDERPLAND I played Dark Souls Remastered on a 1660 Ti laptop and an RTX 2080, and I can't remember frame dips or stuttering on either system. The 1660 Ti was 1080p max settings and the 2080 was 1440p max settings. I definitely had to turn down settings on my 970M laptop to get 60fps at 1080p, though; the PS3 and 360 ran that game at 30fps 720p, with frequent dips into slideshow territory.
@@MechAdv Key word: Remastered
The Remaster fixes most of the performance issues.
The Prepare to Die Edition is what kills FPS, specifically in Blighttown.
relying on DLSS to optimize your game is like using glue to hold your submarine together at depth.
have you heard of a little submarine called "Titan"?
PCs are now doing what consoles have been doing since the Xbox360/PS3 era: running in low resolutions and trying to act like they're not. Most of that old console era was 720p, even though all our tvs and systems were 1080p - and with xbox one/ps4, it was the same thing, a lot of "adaptive resolution". It's going to suck if this starts to be more commonplace in PC games. I remember being happy that my little 2060 had DLSS, as I figured it would give me a bit more oomph, but it's been a mess in most games that bring it on board, who only bring it to cover problems and not to help broke ass PC gamers like me lol
@@PuncherOfWomenAndMinorities that's a new one. you got the joke but you didn't get it
or reuse condoms
And we all know how that turned out
I always feared DLSS and Nanite because, despite being pretty cool tech-wise, they're an extremely easy excuse for big companies to ignore game optimisation even more than they already do.
Except FSR and DLSS are not the problem. Just because FSR and DLSS exist does not mean it's a free pass to 100% skip optimization. It does not automatically make devs extremely lazy. They cannot get away with just FSR/DLSS alone and zero game optimization.
You don't know what Nanite is
Nanite IS optimization
What does Nanite have to do with it? It's a real optimization for geometry.
Welcome to the next generation.
I remember when everyone was "forced" to upgrade their machines when everything was suddenly created for 3d video cards
And people cried just as much about being "forced" to upgrade.
The GPU race has turned into the best "crutches" race.
not the gpu manufacturers fault.
the game devs are using cool features like dlss & fsr as an excuse to not properly optimize their games.
Like a lot of people said they would when it first got announced.
@@hypnotiq Nvidia is also guilty of slacking off because of DLSS; the 4060 is a garbage graphics card because it literally depends on DLSS to get good FPS.
@@hypnotiq I disagree, the pipeline for GPUs, graphics engines, and game development, is too interlinked to say that the problem sits solely in the lap of any one of the three. They're all to blame.
@@Bustermachine also very true
@@hypnotiq its a mix of both. nvidia and game devs are both at fault
I've been working as a backend dev for more than a year now, and let me tell you, most of the time it's not about caring. It's more like: you could fix it, but your boss has other priorities, and by the time those are done he won't pay you to optimize the game unless it's really bad. Money is power nowadays (and has been for a very long time).
Yeah, I can definitely see that. I’m sure some game devs enjoy their jobs but at the end of an 8 hour shift or longer you just wanna get tf home, especially if the needed work is unpaid.
Yep. Optimization is not easy and it takes a lot of time. People want games quickly and those at the top want them released quickly. FSR/DLSS is being used as a way to get it playable sooner and out the door. If it sells well, it gets optimized and patched. If not, maybe it gets optimized down the road or maybe it doesn't.
Sadly, they don't get optimized in a patch. Often (for me anyway) patches end up making it even worse than launch.
@@michaeldeford1335 If it sells well without controversy over performance, why would a company spend money on optimization? (They won't.) It isn't used to get games out the door quicker so they can patch them later; it's just used as a crutch to get games out the door for the sake of an early release.
Same here and I've had similar experiences being a backend dev. Normally it isn't the lead dev/scrum master who wants to leave it out but someone up in middle/upper management who decides that the costs & effort outweigh the benefits even if the devs disagree. I've had countless arguments with these people on how much effort things take and they seem to think I'm purposefully putting way too much time down. They literally have no clue
The main problem is devs not optimizing for DX12. In some cases, devs are just taking their game, running it in DX12, but then converting those calls to DX11, which hits your performance even more.
It's sad that in recent years the entire software industry has been more concerned with taking shortcuts and pumping out content than with optimizing its software to run better.
This!
This is not how graphics APIs work; they are too different.
@@DiogoManteu I'd advise you to do your research. They arent.
Hell, Intel's GPUs do this all the time by converting older DX calls to 12. That's why recent Intel GPUs don't perform as well on older DX versions.
@@silvy7394 that's not on the game or engine programmer.
@@DiogoManteu It is if they're so fucking lazy they wont convert their game to 12 and instead just translate 11 calls to 12. Or not develop for 12 and kill performance on cards like Intel's.
Get out of here 🤡🤡. You dont know shit.
Devs need to optimize their shit better rather than letting GPU upscaling manage it.
Facts
Amazing how we’re blaming hardware for poor quality games
@@Ober1kenobi Are you trying to say the hardware that's been put out recently is good and up to standard? In my opinion it is very much a case of both groups being in the wrong.
I've been saying this forever
People with 8GB videocards, and PC gamers in general, need to lower their expectations of console ports.
Making a game is extremely hard work, and to accuse game developers of being lazy is just honestly completely ignorant.
The optimization argument is getting blown out of proportion. Would you rather wait three more months for the release? If you want it better optimized, just wait three months after launch; by then it will typically be in much better shape, if it wasn't well optimized to begin with.
It's much more important that the game is actually a good game than to have it super well optimized, and this looks to be a VERY good game.
Making a game really well optimized for PC can be very difficult when it's developed for the PS5 first. Making sure the game runs really well on PC puts more restrictions on the designers and makes it harder to optimize the game for the PS5.
@@syncmonism Yes, I would like a three-month delay if it means the game launches in a better state. It's called patience. And please enlighten me and the others as to how making a game like Call of Duty, for example, is hard work.
1) I think many, but not all, devs are relying on upscalers (DLSS/XeSS/FSR) to get new games to some sort of playable state.
2) Investors/shareholders/management are pressured to have a title out by a certain timeframe regardless of the state the game is in, pushing devs to not optimise titles properly.
3) Games are more about DLC/loot boxes/cash grabs to upsell features; it's no longer just a game, it's a business with repeat turnover and that's it.
4) The graphics hardly look different or better from a game released 10 years ago, yet we are getting worse and worse performance. Feels like collusion with GPU manufacturers, or terrible optimisation, to me.
Short timeframes are the result of idiots still buying half-made games.
You are one of the very few who does acknowledge that the devs basically have to do what the greedy higher ups tell them to. Most people and the content creators only mention the devs when in most cases they are doing what the publisher orders them to.
A lot of devs, I find, for some reason think that DLSS on Balanced looks just as good as native, so they probably just assume there's no problem using it.
@@mryellow6918 Those devs are dumb; they have no clue what game engines can handle and use way too many polys, effects, etc.
Optimization is one of the last things you do in game development, so if it's not getting the attention it needs, that could be because of time pressure. Devs might *want* to have time to optimize, but when looking at potential steps that could be condensed, it would be an obvious candidate given how much it can be "cheated" using upscaling. I suspect this may be what happened to Jedi: Survivor
I've actually heard from people who code, and a few experienced developers, that at some point they learned the lesson that you should optimize early.
If you don't, you have to go back through the whole project optimising everything, when you could have built it from the ground up on a framework of proper optimisation.
It can make sense that while learning a new engine or gaming device that the bloat comes in as mistakes during the learning process.
But eventually, general experience, and especially experience with the engines and platforms, should lead to teams knowing what they are doing at project inception. You would see this progression with older versions of Windows and with game consoles: advanced games that ran better than the previous ones on the same systems.
UE5 just came out last year, PlayStation 5 in late 2020.
(Maybe the Xbox Series X is a big technical change; I just wouldn't be surprised if it isn't that much of a difference compared to the Xbox One and Windows 10, and now 11. And the Switch was built on older tech even when it released around 6 years ago. But the PS5 is the popular thing folks have to work around now.)
I would say it hasn't been enough time, but that would be excusing all the bad launches throughout the 2010s.
It appears that greater graphical fidelity is demanded even when that same level of fidelity takes two or four times the effort and performance, while giving diminishing returns in actual satisfaction, when... so many popular online games are years and years old now and run on potatoes and toasters.
Do they just have to keep hiring more and more new people to make it all happen on a tight schedule? Well we know the answer to that already, yes, it's yes.
So it seems that AAA devs are in a perpetual cycle of learning how to do things and by the time they've built up experience, something new comes along that is even more expensive both computationally and effort/logistics wise, along with studios closing down and people constantly shuffled and....
I suspect they just never get to that point of competence. And if they were to spend the money to somehow do that on time.... maybe they wouldn't profit?
We could just use 10 years with no new additions to the whole process....just, take a break and work things through, straighten the kinks out, and see the quality rise back up before moving forward.
After all, it's not like anyone is demanding 8K, and surely we've finally hit the top mark and into the real area of diminishing Graphical returns?
Right....right?!?!
You are wrong; we must develop with optimization in mind from the beginning. If you schedule a three-day development task that "just works" and nothing else, you will have a lot of rework later rewriting that code. I am a software developer, and that is how it goes. It's about thinking about the performance of your code first.
Consumers buying a game in alpha state: "Wtf this game is poorly optimized..."
Devs: "..."
@@chillinchum For sure, people always expect too much from the beginning of a new generation. There is a sweet spot, though. 2009-2011 is my favorite example when it felt like devs really had a solid handle on the current tech *just* before new hardware rendered their progress meaningless. And then there's MGSV and the venerable Fox Engine, which seemingly blows a massive hole into this theory
Optimizing models, effects and textures costs money and A LOT of time. I feel a lot of developers are pretty much skipping a huge portion of this process today, and also relying on different technologies to automate it or at least make it easier, which sometimes produces a suboptimal result. Hopefully technologies like Nanite for Unreal Engine will remedy some of the core issues that we have with game performance.
Unless we get the ability to remove the people who rush devs into releasing a game, that's not going to happen anytime soon lol.
Bro, I'm a software developer, and when I read about the various tactics game developers were using way back in the '90s and 2000s, I am blown away by the sheer ingenuity and complexity with which they ended up solving their problems.
I have NO QUESTION in my mind that modern-day developers, across the board, aren't anywhere near as good at writing efficient software 😂❤
We were spoiled lol
The problem with Nanite (and UE5 in general) is how baseline-heavy it is. Make a blank UE5 scene and look at how poor the framerate is. That said, it's like a freight train: you can load it up with stuff and it won't slow down.
Yeah, because the management side of things is only interested in something to show the investors for a given quarter, so they rush things to release, and of course the developers' priority, when being forced to release a game months or sometimes years ahead of schedule, is to focus on getting the game to an actually functional state, and unfortunately stuff like optimization is usually one of the last steps of the whole process once everything else is done, and they don't even have time to even finish everything else.
Upscaling technology isn't the problem here. The problem is capital getting in the way of art, a tale as old as time. Blaming upscaling tech is a weird take, because you know that we'd probably be in the same situation regardless, just with even less playable games.
UE5 has easy presets for a lot of things, so they never bother to optimize anything.
The second upscaling technology went from an optional feature to increase FPS for users willing to compromise on certain visual features to mandatory to achieve even just 60 FPS, the industry started to lose its way. This is a horrible trend starting, and I'm worried this will be the standard going forward.
It's a total contradiction of what raytracing was supposed to add to games.
Raytracing = more accurate shadows and reflections for "higher picture quality", but lower FPS.
DLSS = "worse picture quality" by reducing the accuracy of things like AA, AF, shadows and draw distance, but higher FPS.
@@kevinerbs2778 Modern AA was already removing details, with methods like TAA creating ghosting and blurring the whole picture. I honestly think DLSS and FSR are the best kind of anti aliasing at high resolutions, since MSAA is so heavy on the framerate and TAA doesn't look good imo.
@@meuhtalgear You don't even need AA at higher resolutions like 4K. Downscaling just makes your image look like dogshit; it's something you don't really want if you play at higher res.
Cringe take. This happens because of console culture that is based on locked 30 or 60 fps. DLSS is not responsible for the developers picking one over the other, and in most cases still being unable to deliver.
@@evaone4286 The need for AA isn't bound to resolution alone; it depends on screen size and resolution together (i.e. pixel density). A 7-inch 1080p screen doesn't need AA because your eyes can't make out the jagged edges.
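A quick pixel-density calculation backs this up; the screen sizes below are just illustrative examples.

```python
# Pixel density (PPI) from resolution and diagonal size: the same resolution
# can be far above or below the point where aliasing is visible, depending on
# how large the panel is. Example screens are arbitrary.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

screens = [
    ("7-inch 1080p handheld", 1920, 1080, 7.0),
    ("27-inch 1440p monitor", 2560, 1440, 27.0),
    ("27-inch 4K monitor",    3840, 2160, 27.0),
]
for name, w, h, d in screens:
    print(f"{name:<24} ~{ppi(w, h, d):5.0f} PPI")
```

The handheld lands around 315 PPI while the 1440p desktop monitor sits near 109 PPI, which is why the same resolution can look clean on one and jagged on the other.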
I think the biggest problem in a lot of these latest releases is Unreal Engine 5 itself. For games that are available in 4 and 5 there is a hugely noticeable increase in performance hitching, memory requirements, and required drive speeds. Unreal Engine 5 has actively discouraged optimization in its marketing , and even with it, games seem to run worse.
Mordhau went from UE4 to UE4.5 and the game became a fair bit smoother FPS-wise when it happened, at least on my power-gamer AMD rig from 2+ years ago. Big battles with loads of real players still put a big burden on my CPU due to the insane number of computations, but still.
Doesn't UE5 have better performance at the same quality level?
@@arcticfox037 It's all about how it's used and who's working on it. Until the engine automates literally everything and auto-optimises it, which is still a decade or more away (maybe UE7 or 8), competent devs will still need to work on polish and optimisation.
@@arcticfox037 UE5 without Nanite or Lumen will perform almost exactly the same as UE4; it's when you turn those more expensive features on that the cost is felt. They are more expensive, but they have no limit to their ability when used. So: more expensive, but more capability from Lumen and Nanite.
DLSS was originally meant to exist as a crutch to make ray traced games performant enough to be playable. The fact that we have to use it for rasterized games nowadays is..annoying.
To be fair, Unreal Engine 5's Nanite is incredible technology (zero pop-in).
@@legendp2011 to be fair, maybe nanite isn't worth it then if it makes the game unplayable without upscaling technologies.
@@thelastgalvanizer That's up to the developers. I personally find pop-in more distracting than modern DLSS, but I can understand that it's a personal preference.
Using Nanite also probably reduced the budget (helpful for a small team with limited resources).
No Nanite means a team would have to create 5x more objects to manage LOD transitions (and every time they make a new object, they then have to create all the different LOD models, which slows down development).
Nanite doesn't just fix pop-in; it also saves a huge amount of time and resources for developers. I can understand why they used it.
@@legendp2011 For the record, LOD creation is practically automatic, and the number of people who would be able to tell the difference between Nanite and good LOD levels is... literally zero, since LOD levels are determined by distance from the render view.
Nanite is kind of an enigma. It can be more costly than traditional LODs when it comes to less complex scenes. But when geometric density, and especially object density reaches a certain level, it can end up being cheaper and more performant. It really comes down to use cases and optimization. And surely there could be implemented a kind of quality scaling setting for nanite that allows for more or fewer polygons overall, or for fewer polygons in the distance. It could be like standard quality options we have in traditional games. Why does it have to be so locked down?
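For anyone who hasn't worked with traditional LODs, this is roughly the hand-authored, distance-threshold selection being discussed above, the thing Nanite's continuous detail replaces. The thresholds and LOD counts are made-up illustration values, not from any engine.

```python
# Classic per-object LOD selection: pick a pre-built mesh variant based on
# distance to the camera. The hard thresholds are what cause visible "pops"
# when an object switches models. Values here are arbitrary examples.

LOD_DISTANCES = [15.0, 40.0, 90.0]   # metres; beyond the last -> lowest detail

def pick_lod(distance_m: float) -> int:
    """Return 0 for the full-detail mesh, higher numbers for coarser meshes."""
    for lod, threshold in enumerate(LOD_DISTANCES):
        if distance_m < threshold:
            return lod
    return len(LOD_DISTANCES)        # lowest-poly fallback / impostor

for d in (5, 20, 60, 200):
    print(f"object at {d:>3} m -> LOD{pick_lod(d)}")
```

Every mesh needs those extra LOD models authored (or auto-generated and checked), which is the workload the comments above say Nanite removes.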
Wait till the next gen when AMD won't make any high end cards. you'll pay more for less, and be happy for DLSS 4 while Jensen Huang gets another leather jacket .
Paying more for less is better than "the more you spend, the more you save."
DLSS 3 is actually locked via software on non-40-series GPUs; Nvidia is truly doing this purely for profit.
Intel Battlemage had better be competitive. They are our last hope.
Either that or use a console lol. I'm so pissed that the Xbox Series S runs games better than my 3090 desktop.
The console literally only has 4 teraflops of performance!
@@vogonp4287 Intel is essentially a year behind Nvidia and AMD. Battlemage is going to be launching when RDNA 4 and Blackwell are coming out and it's aiming to compete with current gen not Nvidia's and AMD's next gen.
Intel won't be competing against Nvidia's high end either so Nvidia has even more of an incentive to do what they did with the 40 series outside of the 4090 and move all products down one tier while charging more because AMD isn't competing either. It's the same reason why Nvidia just cancelled the 4090 Ti. They have no reason to ever release it because their main competitors are nowhere near close to their level of performance.
I personally think this is also an advertising issue - fancy graphics and tricks look and sound real good in trailers (and some gamers also get real obsessive about having big numbers,) but there they can just run it on a supercomputer and ignore the optimization. Then, actually making it run well on normal devices takes a lot more dev time than the studio will usually sign off on when they can just upscale and call it a day with relatively minimal impact on their profits.
Kind of a funny parallel with apples (red delicious in particular) - in the short term, people buy fruits in the store based on how they look and maybe feel, so stores select for fruit that looks pretty and stays fresh as long as possible. Problem is, with apples, that means they end up waxy, grainy, and bland when you actually eat them.
Except FSR and DLSS are not some dark magic wizardry that makes optimization just magically work right out of the gate.
I would imagine the devs do care; the real question is whether the suits they answer to care. Think of the bottom line and how much money you can save if you can just use the magic slider to fix problems that would otherwise take more development, and therefore more time and money.
I like to think that the actual developers generally care about what they produce and that problems typically stem from higher up. Like the guy at a store who genuinely wants to provide proper service but the bossman thinks of it as a reduction in productivity because that energy could be used to stock more items.
As someone who's an upcoming game dev, I can confirm most devs do care. We want our games to be played the way we intend them. Sadly, I agree that the people cracking the whip simply don't care; they don't see games as art, they see them as a "product", something to be done as quickly, cheaply and as profitably as possible.
If there's a way to almost cheat performance into a game, I can guarantee they'll take it, because if there are corners to be cut, they'll cut them all right. It's the sad state of some modern games.
We need more flat organizations with less management, let the professionals be professional
@@Marsk1tty Flat organizations aren't the best; they mostly work in small groups. Once the groups get big enough, power dynamics and an invisible hierarchy form.
It's good for indie projects, but the bigger the game gets, the more people are needed than a flat hierarchy can handle.
They don't; modern devs only know copy and paste, and crying.
Imagine thinking DLSS or frame generation are "sliders"... DLSS/FSR upscale from a lower internal resolution, trading detail you barely notice in a frame (which hardly affects the experience) for a performance boost, which gains you a TON when you're gaming.
DLSS seemed like a great idea for VR: using it to generate the extra frames needed for the second eye.
I don't think you'd want it in VR, as the artifacts would be more visible.
Game developers are absolutely partly to blame. This doesn't mean individual programmers, but game development studios. The amount of VRAM used by games released in 2023 is not because of any technical reason. We know this because Cyberpunk looks better than every game from 2023 and an 8GB GPU can run it no problem at 1440p.
Remnant 2 uses UE5's Nanite feature for its polygon detail, and Nanite works based on the internal resolution, meaning the more real resolution you throw at it, the harder Nanite has to work, which destroys performance, because it was always meant for a lower-resolution output that's then upscaled using one of the big three or four upscalers.
So no, DLSS didn't ruin the game, UE5 did. If DLSS never existed, I think UE5 would still have been engineered this way, with its own upscaler called TSR.
But Nvidia did ruin the prices, using DLSS as an excuse.
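To put rough numbers on "Nanite's cost scales with the real resolution": its per-pixel rasterization work tracks the internal pixel count, so the upscaler's input resolution matters a lot. The per-axis scale factors below are the commonly cited approximate values for the usual quality modes, used here purely for illustration.

```python
# Internal pixel counts at a 4K output for different upscaler quality modes.
# Scale factors are approximate, commonly cited per-axis values.

MODES = {"native": 1.0, "quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_pixels(out_w: int, out_h: int, scale: float) -> int:
    return int(out_w * scale) * int(out_h * scale)

out_w, out_h = 3840, 2160
native = internal_pixels(out_w, out_h, 1.0)
for mode, scale in MODES.items():
    px = internal_pixels(out_w, out_h, scale)
    print(f"4K {mode:<11}: {px / 1e6:5.2f} M internal pixels "
          f"({px / native:.0%} of native)")
```

Quality mode shades under half the pixels of native 4K, and performance mode a quarter, which is the headroom that resolution-bound passes like Nanite's rasterization end up leaning on.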
Honestly I see zero difference between games with and without Nanite. If there's still zero difference in other games, we can pretty much consider this thing a useless gimmick, probably made only to make devs' work easier at the cost of performance.
Nanite is just too power-hungry for now. I think it's a great technology for the future, but right now it's too much. Unless the Remnant 2 devs just didn't implement it efficiently, since Epic's demos looked quite good.
Lumen seems to be a better technology for current gen gaming.
@@Extreme96PL Yeah, devs should focus more on Lumen.
@@Extreme96PL nope there are definitely merits to using Nanite. Even though this was my first Nanite experience I could tell right away it looked better. There was little to no shimmering in distant objects, even at 1080p. Distant object detail was maintained so well I couldn't tell when an object was gaining or losing polygons, which I could notice easily in any other game because of their varying levels of LoD implementation. Like how when you walk far enough an object will suddenly be replaced with a lower poly model - that doesn't happen in Nanite. So I legitimately couldn't tell when an object (and it's shadows!) faded in or out of existence, and most importantly object pop-in was non-existant. If I looked past the horrible character models and animation the graphics really did feel like a next-gen thing.
Do you not like unreal engine 5? ( I hate it, I think it's trash, & it makes developers lazy.)
I knew this would come. The first games using DLSS actually got a good performance boost out of it.
I hoped we would jump a graphics generation and DLSS would make those games just playable, but I forgot it's also possible for developers to use it to skip work.
It is actually embarrassing that games are so poorly optimized that I can't get a constant 144fps on my 4090. In Remnant without DLSS it's around 80-90, and most people are not enthusiast enough to invest in a 700€+ GPU. I have no clue which systems those devs actually test their stuff on.
Not just that. Years ago we used to render games at a higher resolution than our monitor and then downscale the image to gain clarity, while still playing at good framerates. Now we're doing the opposite: we render games at lower resolutions and accept poorer clarity just to reach playable framerates.
Awesome video! I feel like player expectations played a role in this, when 4k hit the tv market. Many console players anticipated the new consoles to support 4k resolution. This generation of consoles also brought older games to the new systems, giving us familiar titles with smoother 60fps performance. Excitingly, new technologies like ray tracing have emerged as well. However, expecting a leap of four times the pixels and doubling the framerate while enhancing lighting and detail is a big challenge. Upscaling is probably trying to fill that gap somewhere.
Pretty much this.
Raytracing is one of the big use cases for upscaling because initial raytracing cards just had no chance to keep up with actual raytracing at full resolutions, so for that it was a worthwhile trade-off. But with ever-increasing resolutions it instead became a crutch to run even basic graphics on intended resolutions.
Finally, someone who isn't dumb. Been saying this since before launch.
@@ilovehotdogs125790 I think it could be catching up, but the hardware that actually IS better is overpriced as fuck.
Just look at the entire 40 series from Nvidia. It doesnt have the same generational uplift as any previous generation had over its predecessors, it stagnated, sure. But strangely it all falls in line if you call the 4060 a 4050, a 4070 a 4060 and just shift everything one bracket down. They just shifted the numbering scheme and realized they can actually make more money that way.
Yep. 4K killed graphics innovation and performance.
And why did we expect 4k? Because they put a big 4k/8K GRAPHIC ON THE FRONT OF OUR PS5 box. How is this player expectation?
All of the post-processing effects in deferred-rendering games add tremendous amounts of blur to the experience. From TSAA to upscaling algorithms to whatever else they come up with to cheat poor performance, you get a sloppy picture. I already have blurry vision; I don't want my games to be blurry too! It's supersampling the native resolution and no AA for me.
I think DLSS and FSR are the best antialiasing performance wise. At 4k the blur is negligible but ghosting is what annoys me the most.
Fsr is the worst
Not really; in DS, DLSS has better clarity than native res at 4K, and the same goes for CP.
Just imagine what the world would look like if every business application were in the state of your average video game. The world would collapse. Great, now your PayPal transaction takes several days, your money can instantly vanish, your browser crashes constantly, database roundtrips take hours, you need a high-end gaming PC to start the simplest office software, and the list goes on and on. I will never understand how you can be proud of the state of such a product. At this point, video game development feels like the choice for junior developers, and once you actually have any understanding of how to develop software, you move on to a different industry.
Its funny how these technologies started with the aim to give us a native 4k quality image by upscaling from a lower resolution. Now we are upscaling games to 1080p from lower resolutions. I like these technologies but like you say I fear that they are becoming a necessity to maintain playable frame rates and not a luxury for obtaining excellent image quality.
The best part is that when I got Remnant 2 and jumped in, it looked terrible at 1080p, so I turned upscaling off. It still looked terrible, so I checked the config files: resolution scaling is automatically set on at 50%, with no in-game slider for it. So you're already running at 540p, and then upscaling that with the Performance preset on top. The resolution is completely fucked.
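Here is the arithmetic behind that, as a small sketch. The `screen_percentage` name is just a stand-in for whatever the game's config actually calls its render-scale setting, and stacking an upscaler mode on top of it is the scenario described above, not a confirmed behaviour.

```python
# Internal resolution from a per-axis screen percentage, optionally compounded
# with an upscaler's own per-axis scale (e.g. ~0.5 for a "Performance" mode).
# Key names and the stacking behaviour are assumptions for illustration.

def internal_resolution(out_w: int, out_h: int, screen_percentage: float,
                        upscaler_scale: float = 1.0) -> tuple[int, int]:
    s = (screen_percentage / 100.0) * upscaler_scale
    return int(out_w * s), int(out_h * s)

print(internal_resolution(1920, 1080, 50))        # (960, 540): the 50% default
print(internal_resolution(1920, 1080, 50, 0.5))   # (480, 270) if a 0.5x mode stacks
```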
IMO, if this is really as bad as it seems, it's going to be a temporary problem. Over the next few years, as hardware improvements slow down significantly and game developers start to realize it, they will eventually come back around and focus more on optimising their games. This could also mean higher resolutions drop in popularity a bit? Possibly?
On another topic, about the Steam stats: I hate that if you use upscaling at, say, 1440p, it probably gets recorded as running at 1440p when it really isn't.
Those Steam numbers come from the hardware survey, which just records the resolution of your primary monitor.
The thing is, most of the time when games get harsher requirements, it's because consoles are getting stronger. ATM consoles are about as strong as an i3-12100 and a 6700 XT (I know they use a 6700, but consoles have optimizations at the board level compared to PCs). So in five years they will probably be on a far higher level, since we're seeing rapid improvements in the SoC and other SFF component markets. It wouldn't be wrong to think the next console generation will be as strong as a current Ryzen 5 7600X + 4070 Ti config.
@@FenrirAlter I can see what you mean, and I feel like consoles would be a little more enticing to get if that's the case, compared to gaming PCs.
Yeah of course reactionary behaviour will probably take place.
The issue isn't that it will stay the same. It's that it doesn't have to get worse before it gets better.
Hopefully more games like BattleBit Remastered come out as a kind of replacement for these higher-budget games.
They look much worse and usually this comes with a lot better performance and they may still do other things better as they aren't tied to all the red tape, demands and larger budget to make any changes. That's not to say lower-tier graphics automatically give you better performance, there can still be stuff that slows it down significantly.
In Teardown, for example, the game is very low-res, but stuff like smoke can lag hard. I haven't played that game, but I have seen people play it on YouTube.
The overreliance on it in every new game is the problem. The fact that consumers expect it to be in every new game is also part of the problem. DLSS/FSR should be an absolute last resort, something you only do if your system barely meets the minimum requirements to play the game. It should not be touted around as a standard feature for everyone to use. It should be a tool for getting a game from unplayable to playable on your minimum machine. The more normalized you make that kind of crutch, the more developers will lean on it to excuse bad performance.
So instead of getting excited when a game advertises DLSS/FSR support, I would instead be wary of why they are making a big hoopla over something that's meant to be used sparingly. It probably isn't a good sign.
I love Remnant 2, but the optimization really disappointed me. At 4K with a 4090, I get about the same performance I would in Cyberpunk with RT on high, meaning sub-60 fps. While I get that I'm playing at a stupidly high resolution, no other game gives me issues like this, especially considering the performance gains I get by lowering the resolution or using DLSS/FSR are quite a bit less than in other games. It's not until about 1080p or lower that I start to see strong performance gains. They really need to do a lot more optimization. I honestly feel like this is an area where Gunfire has been weak before Remnant 2 as well; Remnant 1 didn't have that great optimization and didn't even have the ability to do 4K.
I'm not sure how you are getting those framerates, using an i7 with a 4090 I'm getting about 120fps at 4k
@@gokushivum with no upscaling and framegen off with max settings?
@@darktrexcz Make sure your 4090 is really a 4090 not a 4070 TI undercover
@@parlor3115 I've been building PCs for 20 years, I know what a 4090 is. There's benchmarks on youtube that confirm my framerate. 120fps is with dlss quality and framegen on. Double check your settings, I guarantee those are on.
@@darktrexcz And you got goofed like that. Man you suck
As a former developer I'd like to say there is a difference between the developers themselves and the higher-up managers. It takes time and thus money to optimize performance. Often the quick and easy route is taken,
and technologies like upscaling are just tempting as a quick fix that saves money :(
Unfortunately the majority of people (and content creators) don't realize how this works behind the scenes, and they only mention the devs, when in reality the vast majority of the horrible anti consumer decisions are made by the publisher and the greedy higher ups.
As a former dev, you should know the problem is also object and pixel density being too high, too.
@@NeoShameMan True.
The need to compete with better graphics also causes pressure.
as a consumer that knows
my condolences
corporate greed f's us both
lol there are plenty of incompetent developers too, easy to use the higher ups as scapegoats...@@J.A.Z-TheMortal
Devs are absolutely using upscalers to avoid optimisation.
I'd love to see someone delve into the laptop gpu arena around this topic. So much weird (and disappointing) stuff going on there already.
I've had this thought for a long time: upscaling technologies are encouraging devs not to optimize their games for native resolution. DLSS and FSR are literally ruining games. I remember when upscalers weren't a thing and devs actually had to optimize their games. Honestly, before DLSS came out, the last best-looking and most optimized games I ever played were Ghost Recon Wildlands and Far Cry 5; they ran so well at native resolution with my old GTX 1060 6GB.
They also have less complex graphics; good optimization then means lower texture resolution and variety, less interactive foliage, sand and snow, non-dynamic lighting, objects that shine in the shadows, less complex geometry, etc... i.e. BattleBit 😊
I would've preferred if AC Odyssey came with FSR instead of me having to run it at a blurred-out 720p.
How are they ruining games? Which part of your gaming experience specifically has been ruined by upscalers? Your game looks a little bit worse? That probably saved the devs months of development time in optimization that they put to make other areas of the game better.
@@Ansalion if the game is made for being played at half resolution even on the latest hardware, gamers who don't yet own an rtx will literally not be able to run the game at all lol. And don't respond by telling people to upgrade, gpus are expensive as shit and if you're able to upgrade your pc, congrats, you're privileged as shit.
"Optimized Wildlands" sounds like a joke, but that's okay, at least it scales very well on lower settings for older hardware and had a really massive seamless open world with breathtaking detailed landscapes.
Game developer here - we are actually delighted to spend absurd amounts of time optimizing for performance, it's incredibly complex, deep and satisfying. When you're talking about 'developers not caring', what you're really talking about is how deadlines are imposed by budgets and publishers. If you have 12 months to finish a milestone, optimization is done after the game is content complete and a fun, fulfilling experience, and due to these external financial pressures optimization can sometimes be cut or delayed, particularly at a small studio. Delays can happen to make this work, but only if there's cash in the bank to pay for everyone's healthcare, food and rent long enough to delay, which is rarely the case. Most deadlines set us up to fail from the start and are never enough time, because management and the higher ups are starting from a date they would like to release and working backwards on a spreadsheet to assign work. Management is also the group who would see something like DLSS and say to the team, 'well why dont you just turn that on by default?'
Technologies like nanite and lumen aren't things you just turn on and see if they work on different hardware, they are complex codebases that apply differently to every single object, scene, and asset in the game, and typically are intertwined in a 100-step optimization process that affects every line of code and every type of hardware that could possibly be released on.
'Well why don't you just turn that on by default?' Tell them that it makes the game blurry. It sucks. If it didn't make the game look blurry, then it would be acceptable for games to depend on it from launch.
Very good point. It really defeats the purpose of buying an expensive GPU: you expect it to give better results, and then, after installing it in your system, you have to use software just to make it perform the way you paid for.
But the expensive GPU still does give better results compared to less expensive GPUs? Less expensive GPUs are even more reliant on upscalers compared to your expensive GPU. You’re comparing it against some imagined standard that doesn't actually exist.
Game devs want their game to run well (even though optimizing is boring from a game dev's POV), but publishers want money coming in as early as possible, so optimization ends up last on the priority list. That's why it's common these days to see broken games at launch that get better a few patches later. The rant should be aimed at publishers, not at game devs. This happens in almost all software industries.
I feel devs always keep getting the short end of the stick for problems they should not be held responsible for. Management at first, and now even hardware manufacturers. It's not Steve the environment artist's fault that Jensen thinks a 10% perf gain on half the bus width is an "upgrade". The 1060 was on par with the 980, the 2060 was on par with the 1080, even the 3060 was chewing on the heels of a 2080 non-Super. The 4060 can't even beat the 3060 Ti; heck, it even falls behind the 3060 when VRAM overflows.
The devs are just as responsible, I'm seeing examples of devs themselves making excuses that we KNOW are full of shit.
@@jaronmarles941 See, software development in general is hard. Try to coordinate 10 people into making a small game and have it done by the end of this week. Impossible. By the end of this week what happens, with a bit of luck, is that everyone has their repo set up and is ready to sta- oh wait, it should have been done by now??
Yeah, people who don't know like to shit on the devs for not making good games. In reality, it's the publishers and the management, they are the root of all evil.
It's not because it's boring but because it costs money.
Optimization is absolutely not boring, it's super fun. I think one of the big issues is that the games industry has a lack of engineering talent. Not only is there a lack of engineering, but the bar has been progressively raised due not only to continuously rising graphics expectations, Moore's law dying, and the introduction of 8/16-core consumer CPUs, but also to new and harder-to-use technologies like Vulkan/DX12.
It's not uncommon to have a 50 to 1 artist to graphics programmer ratio. Team of 400 people, 250 artists, 5 graphics programmers. Artists start just PILING work on the heap, graphics programmers are like way underwater in work just getting the game to not DEVICE REMOVED because Vulkan was just such a good idea of a technology.
OK another story for you, due to lack of graphics programming talent company decides to use unreal, general purpose engine. General purpose means not specialized, ie not optimized. To make this crystal clear, if a game uses unreal, it is unoptimized. Full stop. If the game looks like it has advanced graphics and it uses unreal, prepare to buy a supercomputer. The grand irony is we moved from opengl/dx11 to low level apis, but that raised the barrier to entry high enough to push three quarters the industry to use unreal, which is going to perform worse than a well designed custom opengl/dx11 engine.
Company demands graphics that can compete with e.g. red dead redemption. So what do you get? A general purpose engine pushed way past its limits. Not only that but Unreal has the horrific idea to let artists write shader code. They glue together overcomplicated garbage with their lego duplo blueprints despite having never heard of the words 'register' or 'occupancy'. No graphics programming talent to fix it. Might get a cleanup crew contracted near end of project to get it to pass console certification.
People complained consoles held back PC graphics. Now they think it's bad optimization when games are demanding, rather than higher fidelity. So now the community is going to hold PC graphics back. DLSS is here to help hardware achieve better performance with higher visual fidelity, which it does. Boost already-solid fps? Why? Better to boost fidelity at the same solid fps.
DLSS is a double-edged sword. On one hand I've been using it in The Witcher 3 and I can't really tell any visual anomalies, so I get why games don't really care about optimization, but that is really killing older GPUs; cards like the 1080 Ti might have a shorter life expectancy because developers don't care to optimize anymore.
Damn good point I didn’t consider this.
Which is a shame, the 1080ti is still a beast and with its 11GB vram could still last a lot longer.
Sadly, this was probably inevitable. This is why we can't have nice things.
The 4060 is such a trash move from NVIDIA. 15% increase from previous gen? And 2.5 years between release dates? Trash.
I have always stood against upscaling and frame generation because I knew this was where it was gonna go. It mirrors exactly how overclocking has gone. Overclocking was once the best way to wring more performance out of cheap lower-end parts. Today overclocking is locked down and only available on high-end expensive parts with an extra surcharge tacked on. Same thing here: first upscaling was a nice boost, now it's the baseline, next it's only being available on the high end for a surcharge. As soon as Nvidia is happy that their technology has cemented itself as the leader, it's getting stripped off the low-end parts. They made it clear this was the plan when the RTX 20 series came out and they made the "peasant grade" GTX 16 series to go with it: 20 series GPUs with the RT stripped out.
This is what a comment looks like from someone who got his education from a kellogs box. And they're still looking for the toy at the bottom.
@@SyntheticSoundAIdudes completely right what are you on about
@@ANGER2077
I don't disagree with everything; however, it's important to note that if we want raw smooth performance, people will need to purchase more expensive hardware that most either cannot afford or are not comfortable paying for. DLSS and FSR aren't just "fake frame" generators: when you look closely at the details, they also fill in the gaps where some animations (let's say character movement) would otherwise have slightly choppy motion due to the time it takes for your hardware to output an image before the next frame. With DLSS or FSR enabled, these "fake frames" end up filling in those gaps, which 'can' provide a smoother user experience (depending on how you have your game set up; there's a toy sketch of the idea just below).
I was stating that, with the scale at which games are growing, you will often need one of two things to compensate: either more power, meaning a better card that can handle the raw performance and provide a smooth user experience, or technology such as DLSS and FSR, which can offer a significant improvement without the need to dish out extra funds.
You have the choice and can pick whatever poison you'd like. I'm not sure if you've noticed but there's an "Off" option too for these games that suggest the use of DLSS or FSR. If we want devs to magically wrangle up performance in a title that may require more without these features, sorry but your experience won't be too great. But if you believe so you are more than welcome to go out there, become a game developer, and I encourage you to prove me wrong.
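For anyone curious what "filling in the gaps" means in practice, here is a deliberately naive sketch. Real DLSS 3 / FSR 3 frame generation uses motion vectors and optical flow rather than a plain blend, so treat this only as an illustration of the "synthesize a frame between two rendered ones" idea (NumPy assumed, function name is made up):

```python
import numpy as np

def generate_midpoint_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Toy 'generated frame': a plain average of two rendered frames.
    Real frame generation warps pixels along motion vectors instead."""
    blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2
    return blended.astype(prev_frame.dtype)

# Two fake 1080p RGB frames just to show the call shape.
frame_a = np.zeros((1080, 1920, 3), dtype=np.uint8)      # "previous" rendered frame
frame_b = np.full((1080, 1920, 3), 255, dtype=np.uint8)  # "next" rendered frame
frame_mid = generate_midpoint_frame(frame_a, frame_b)    # displayed between the two
```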
Excellent video. Excellent topic. This is the way of marketing, no matter the item: they introduce something to you as good for one thing, and then manufacturers start using it for other, nefarious purposes. Pretty soon it becomes just another way to get ripped off for your money.
The big issue is that DLSS was meant to help weaker cards stay relevant for longer, but it's being used to make more expensive games.
I don't see DLSS as a solution just for higher framerates but as a tool for pushing the image beyond its limits. I noticed this with Guardians of the Galaxy: I played it at native 4K, and after I added DLSS to sharpen the image even more, that is where it became mind-blowing.
I feel like this is the gate to what people fear, where an AI server room becomes the bloodline of what GPUs are to us right now, locked to a subscription service doing stuff like DLSS to make games playable without a GPU. We wouldn't own the hardware, and I dread that fkn day.
Isn't that cloud gaming?
@@RexGuard yes
You will own nothing and be happy
NVIDIA will push GeForce Experience down people’s throats in a few years and it’s gonna be HILARIOUS
Lack of coding culture ("copy-paste" and generated code; overuse of standard modules that are excessive for the goal; bad architecture and poor optimisation, if any; etc.) is ruining games. Prioritising profit over the gaming experience is ruining games. Technologies are not ruining anything; they're just being used as an excuse.
With Nvidia and AMD shifting their focus to AI, putting minimal effort into their consumer GPUs and giving them only marginally better performance every new gen, this trend might continue.
I think it's wrong to put all the blame on game devs. Hardware is fundamental.
@@thecamlayton I'm not trying to blame the devs. I'm sure the devs are trying to make the best games with the newest tech available to them, but if the average person's hardware can't keep up, it'll continue like that. In this case I'm blaming the hardware companies.
@@thecamlayton bruh stop d*ckriding. these devs nowadays are there for a paycheck and agree to switch to ue5 and just slap on their assets and move on. modern devs in the triple a scene are mindless sheep who do as told and you all buy it up.
@@thecamlayton Back then devs used to work hard to get their game to work; modern times allow unfinished games that get patched over time.
@@diddykong7354 that's literally been the case for at least the past 20 years
Frame generation (DLSS 3/FSR 3) is my biggest fear: from a good addition, I really expect it to become mandatory in the near future, just like the upscaling techniques (DLSS 2, FSR 2) did.
But the worst thing about frame generation is that, unlike upscaling, you need a minimum amount of horsepower to make it usable (without too many artefacts). Imagine a lazy, greedy publisher like EA making it mandatory to reach 60 fps even on high-end stuff: it could be a real disaster, making CPU optimization a joke and creating a threshold you need to overcome to make the game just playable, and if you can't reach it, it's unplayable, buy a new GPU. (See the back-of-the-envelope numbers below.)
I find it really ironic that they make us pay for technologies that can eventually become software obsolescence.
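Rough numbers for why the base framerate matters so much here (a simplified model, not measurements of any particular game or vendor implementation):

```python
# Interpolation-style frame generation roughly doubles the DISPLAYED fps, but
# input response still tracks the rendered fps, and holding a frame to
# interpolate against adds roughly one rendered frame of extra delay.
def frame_gen_feel(rendered_fps: float) -> str:
    rendered_frame_ms = 1000 / rendered_fps
    displayed_fps = rendered_fps * 2        # one generated frame per rendered frame
    added_delay_ms = rendered_frame_ms      # the held frame used for interpolation
    return (f"{rendered_fps:.0f} fps rendered -> {displayed_fps:.0f} fps shown, "
            f"~{added_delay_ms:.0f} ms extra latency")

for fps in (30, 60, 120):
    print(frame_gen_feel(fps))
# 30 fps base: looks like 60 but adds ~33 ms, so input still feels like (or worse than) 30.
# 120 fps base: looks like 240 and only adds ~8 ms, which is barely noticeable.
```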
Gone are the days when games were meant to be played at native resolutions 😔
I think upscaling techniques are amazing and were a good thing to develop. However, I've had this worry since these features were released, and it seems like I was right to worry. I thought that developers would lean on upscaling as a crutch instead of optimizing, and even though they can, I don't think they EVER should. Upscaling should be there for when you want to use a higher resolution or when your GPU is starting to show its age. Unfortunately I don't think we will go back now, and devs are going to keep leaning on upscaling.
I just "downgraded" from 4070 to 6650XT and plan on dropping my 1440p monitor (picked up a 27 inch 1080p instead) to stop worrying about 1440p. Upscaling, it's not really necessary in pretty much anything with the 6650XT, I think the only thing that isn't running at or close to my refresh limit is Ratchet & Clank. It's fine! Don't need no DLSS, don't even really need no FSR. Ten minutes after I installed my "worse" monitor I forgot I was even running 1080p. It's all numbers games, don't be owned by your PC.
That's kinda my reason to never upgrade from 1080p.
For example, you can get a 4090 for 4K gaming, but 2-4 years later it becomes a 1440p card.
But if you already have 1080p with, like, a 4080, you can easily use it for 5-6 years.
@@zalankhan5743 But nobody in their right mind is gonna use a $1200 4080 for 1080p, because you would get the same experience with a $200 card now and a $200 card in 3-4 years' time.
I am doing the same: bought a 6800 XT ($400) for a 1080p monitor and gonna keep it for the next 4 years. Fck AMD and Nvidia. All hail 1080p.
Also using 1080p with my 6700 xt, i'd rather max out games at native 1080p before using upscaling.
I have been looking at 1440p monitors, but 1080p is fine. It is just fine. Roughly 44% fewer pixels to render (see the quick math below) means it is easier on the graphics card too. Meaning I don't need to upgrade yet.
The only reason I would like 1440p is for non gaming use.
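The pixel math behind that, for reference (standard 16:9 resolutions; quick Python check):

```python
# Pixel counts for the common 16:9 resolutions mentioned in this thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["1080p"], pixels["1440p"])                 # 2073600 vs 3686400
print(f"{1 - pixels['1080p'] / pixels['1440p']:.0%}")   # 1080p draws ~44% fewer pixels than 1440p
print(f"{pixels['1440p'] / pixels['1080p']:.2f}x")      # 1440p is ~1.78x the pixel load of 1080p
```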
What should happen is that DLSS and frame gen should actually be extras. I mean, with DLSS and frame gen turned off, 60 fps should still be 60 fps. They are being cunning and not optimizing when they should, so the FPS in games is kissing the ground, and this annoys me.
At least in the case of Remnant 2, the issue seems to be that, because of the tiles, they used a lot of advanced graphics APIs to generate shadow maps and dynamic lighting. Because the levels aren't static set pieces, the game uses tiles. So rather than devote a lot of time and energy to making a low-resource-cost solution, they slapped on a bunch of off-the-shelf solutions and it bogs the game down like crazy. You can get mod packs that allow you to shut off some of the advanced graphics stuff, and the game runs way, way smoother when you do.
And the big kicker is that Remnant doesn't even look all that good. Gears of War 5 looks better and is multiple years old. Outriders looks better and is a couple of years old. Hell, Warframe looks as good and can run on super low-end hardware. It sure seems like they sacrificed a ton of performance for very modest visual gains.
What do we expect? When you have GPU makers putting something like upscaling software forward as the main feature, and not as a bonus, of course game developers take notes too...
There's a simple solution to all of that.
DO NOT PREORDER,
DO NOT BUY GAMES ON DAY ONE.
You give devs an inch, they take a mile
It's not devs. I'm a dev, not a game dev but a dev nonetheless, and we don't even get to choose what "jobs" are made; they're passed down from a PO and put on the backlog, and those jobs are the jobs we do. If the higher-ups don't care about, say, optimising, then that job never even touches a board for a dev to pick up. It's suits and managers that ultimately decide what the game becomes.
If you pick up a job regarding, say, lighting generation, you can spend 2 weeks tweaking a new shader that's ultra performant, but then you'll need to explain why you didn't just use RT and cut the time in half or more. Now you've gone over the allocated time, now you have a product owner angry that you've taken too long, and your team is fucked over because they need to pick up more jobs to fill in what you missed, or some features get removed if it builds up, etc. etc.
@@Wavezzzz601 I'm not talking about developers' roles in a company specifically.
If the product manager says X because they didn't get enough budget to fulfill Y.
95% of BAs don't really care about their jobs and will write whatever story gets them through their day.
Devs just do whatever is prescribed. So I get that it's not "literally" their fault.
But at the end of the day, the leadership that aligned the scope and budget for the project didn't bother.
They'd rather make a half-baked game on purpose than something nice.
I understand why, because it's more profitable. People will still buy it to some capacity unless the flaw is extremely, extremely intrusive to the core product. The core gameplay is fun, so people will overlook this at the end of the day.
But this still fits my statement, "you give them an inch and they take a mile", because someone is always running a cost-benefit analysis.
Totally agree with most of the points. I'm getting fed up with every game these days REQUIRING you to use an upscaler to be playable. Native resolution looks so much better than using upscaling tech. To me it just comes off as lazy development. Games from 2017-2019 that don't use any of these by default still look great today. Say what you want about the game, but Anthem, for example, still looks excellent (even though you can use DLSS in it, you don't need to). What happened to optimising a game to run well? It needs to stop.
To be fair, optimization is painful sometimes, but that's still no excuse for requiring an upscaler to even play the game properly.
This!
When reviewers say shit like "it runs great!" When they're running a game at 120fps with DLSS on a 4090 but 60fps natively, that is hella not good optimization!
As someone in school for game development, game optimization is one of the worst and most boring parts of a project. Not only that, but waiting months to optimize the game costs money, and management would much rather patch the performance once people have paid for it than go another month without any returns.
Vis blocking, tri culling and the like are really easy to implement at the level design phase of a project, if folks just took some time to understand them, along with the game engine being written well. Unfortunately it seems studios think of it as an afterthought, which is lazy and dumb at best.
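To make that concrete for non-devs, here is a toy Python sketch of the general idea behind visibility culling: skip objects the camera can't plausibly see before they ever reach the GPU. Real engines use frustum tests, occlusion queries and precomputed visibility sets in engine code, so this is only an illustration with made-up names:

```python
from dataclasses import dataclass
import math

@dataclass
class SceneObject:
    name: str
    x: float
    y: float

def visible_objects(objects, cam_x, cam_y, cam_dir_deg, max_dist=100.0, fov_deg=90.0):
    """Keep only objects within draw distance and inside the camera's view cone."""
    kept = []
    for obj in objects:
        dx, dy = obj.x - cam_x, obj.y - cam_y
        if math.hypot(dx, dy) > max_dist:
            continue                           # too far away: culled
        angle = math.degrees(math.atan2(dy, dx)) - cam_dir_deg
        angle = (angle + 180) % 360 - 180      # wrap to [-180, 180]
        if abs(angle) > fov_deg / 2:
            continue                           # outside the view cone: culled
        kept.append(obj)
    return kept

scene = [SceneObject("tree", 10, 5), SceneObject("rock", -50, 0), SceneObject("far_hill", 500, 0)]
print([o.name for o in visible_objects(scene, 0, 0, 0)])  # -> ['tree']
```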
Game optimization is not that hard. Guess your school is fooling you into creating the next generation of lazy devs.
Sad.
@@levijosephcreatesExactly!
@@tertozer3543 I know it's not hard; it's time-consuming and boring. It's not a big problem for me, but other people absolutely despise it.
@@cxngo8124 Agreed, it's time-consuming, but it's an integral part of being a level designer, again if the engine is good. In my opinion a lot of game studios are run by suits who don't fully understand design; they think it's possible to just patch bad design at a later date, facepalm.
It also seems a lot of studios are more bothered about diversity hiring than quality employees these days; although that is a totally separate issue, it also plays a part in creating a quality product.
Important note:
The devs DO care; their management requires them to include hot features like ray tracing and fancy graphics in order to boost popularity prior to release, and then gives them much too small a time frame to get the game done, fancy extraneous features included, and everything optimized. So things like upscalers are more than likely included in place of optimization because they simply don't have enough time/resources to invest in it.
Real question - does ANYONE ever use ray tracing? The only time I have ever seen someone choose to play with Ray tracing, is streamers on the first 10 minutes of the game, then they turn it off because the FPS is too low with it on
IMO a big part of the problem is that game programming and game art have both become so extremely complicated that people with a decent cross-section of both (technical artists) are too rare. So the art department just dumps a bunch of unoptimized stuff into the game, and the engineers are too busy to care or say anything about it.
I fear that many future games' graphics will be built & balanced around using DLSS (or FSR), so it will become the "standard". There's nothing wrong with upscalers + frame gen, and I really appreciate the boost, but it's rude to put out a game and expect players to use DLSS by default. Playing at native res will no longer be considered "intended" by the game devs.
Mostly I don't like to use DLSS unless the frame rate is very low. Whatever Nvidia says, it's still an upscaling method and makes games a little blurry.
This was always going to happen. The path of least resistance meant that developers would lean on the crutch of upscaling rather than put in the work themselves.
Based opinion.
Ergo, these useless-ass devs are fucking lazy and would rather not put the effort in to ensure their product works on as many different hardware configs as possible.
Surprise, surprise.
The problem with DLSS and the others is that they turn fur, hair, bushes, trees, grass and fabrics into a pixel mess. RDR2 with upscalers looks awful.
I've been saying for a while already: A game should perform at or beyond 60 fps on current gen mid-low end hardware WITHOUT DLSS. Then older gens can use upscaling technology to catch up. What the industry is doing is... Making me angry. Very angry.
At first I loved upscalers, but the more I use them, the less I like them. They just come with too many downsides. I think I'd prefer a simple dynamic-res implementation without any upscaler the vast majority of the time, and if I can sustain native 4K, all the better.
Crazy how we spend $500+ on cards that can't even run native resolutions in modern games.
What's going on, gaming world... everyone happy with their so conveniently priced RTX 4090?
They are a minority..
@@roklaca3138 really...
I’d really look at it from the angle that smaller teams can produce more impressive work by focusing on things that should actually make the game fun. Will that happen in all cases, no. But I would say thrashing up sampling which is really just a part of the graphics toolkit at this point doesn’t make much sense.
I'm jaded with modern gaming, so I rarely buy any new games.
One of the major problems I see with DLSS and frame generation is that it muddies the water for consumers. We used to be able to look at reviews and KNOW that a frame was a frame was a frame, that the "frame" was a known quantity and a universal unit of measurement, but now... are we SURE all frames are equal? Are we 100% sure that an Nvidia card renders the same quality of "frame" as an AMD card? How can we be sure? I'm seriously asking: how do we know that one isn't rendering a lesser-quality frame to pad its numbers? It's obvious that Nvidia wants to use software to obfuscate consumers' ability to discern the performance of the hardware... just look at the 4060 and 4060 Ti. Nvidia was DESPERATE for us to equate software performance increases with hardware performance increases and just consider the two one and the same, and I absolutely guarantee you that it'll be even worse next generation; perhaps all we'll get is software performance increases.
What I'm saying is that a consumer used to be able to compare GPUs across multiple generations and even across multiple different reviewers (granted the test bench is similar) and be confident the numbers were comparable, but with trickery like DLSS and frame generation, can we be sure of that in the future? It just doesn't make things easier for the consumer.
The pc consumer base needs to stop purchasing/funding these games and recent developer tactics otherwise they’re going to continue doing this. Stop buying them just like the overpriced graphics cards over this past year.
This is why I love Dead Island 2. When it first launched, the game ran at ~100 FPS at native 1440p, but enabling FSR actually made the framerate drop to about 50 lmao.
I saw this coming from a mile away: devs being lazy and not optimizing because of upscaling. However, it's nice to have upscaling on low-tier GPUs for people who can't afford a new one.
It's still gonna run like crap on low end gpus. Heck, even high end gpus need upscaling for remnant 2. If they optimized their games so decent hardware can have a smooth experience without upscaling, then low end gpus could have an enjoyable experience with these upscalers.
Upscaling looks really bad at 1080p, so it's horrible for people with low-end hardware. You may as well lower the resolution, and it could probably look better than the FSR/DLSS blur. Even with upscaling that looked like ass, Remnant 2 produced really low framerates; it was never a smooth 60fps, and medium with FSR was puke-inducing, even PS2 games had more pleasant visuals than this.
guys, this is how tech works. Nvidia and AMD and everyone involved didn't introduce upscaling tech to help players on the lower end of the tech curve get acceptable framerates, they introduced the tech to induce late-adopters to jump ahead of their own adoption rates. From this point forward, you can expect system requirements to include an expectation of having to use upscalers. In a generation or two, they will probably disappear as they become invisible to the end user.
Tell me you're not in game dev without telling me you're not in game dev.
Upscalers should be used for cases where the low end gpus are either not enough to run it at a playable frame rate or you want your high end gpus to be more power efficient and so that your room doesn't heat up as fast.
It should not be needed for the latest flagship gpu to run the latest games. If your game NEEDS an upscaler to have a playable frame rate, you are doing something wrong.
That's exactly what I feared when they announced it. In theory it would be a cool tool to increase performance, especially for big high-resolution monitors or TVs. But now it is almost exclusively used as a crutch by developers who can't or won't optimize their games.
I think ultimately the choice is between worse-looking games, and better-looking games that upscale and thus might look a bit worse under certain circumstances (i.e. ghosting, loss of fine sharpness, which are already things TAA suffers from), or if you're team red/blue where the upscaler isn't quite as good.

"Optimisation" is basically nonsense to a degree. Like there's some magic "make this game run on a toaster" button that is going to make path tracing, voxelised realtime global illumination, contact shadows, physically accurate material shaders, and all the other stuff that goes into modern games and makes them subtly look way better than older titles suddenly run faster on tech that's increasingly struggling to maintain generational gains. There are games that are just a fucking mess internally, but that's always been the case. Even Jedi Survivor is almost certainly heavily optimised, if rushed out of the door for unclear reasons; it's also just incredibly heavy to run and pushed so far to the edge that it's difficult to even get it into a state that's fully acceptable on even powerful rigs.

I don't like the tradeoffs of having to run an upscaler much, but I think the honest truth is you either have to accept using an upscaler to reduce the amount of pixels being pushed, or you have to accept games that just aren't as graphically impressive. Especially when they're not particularly triple-A, which I don't think Remnant is compared to something like Jedi or Hogwarts Legacy or TLOS PC. Just look at the previous game in comparison, and you can see that for as much as the performance penalty is steep (and the stuttering is an issue), it also looks like two console generations better. The tech that's allowing that graphical shortcut also means a steep increase in performance requirements.

They could have made a worse-looking but better-performing title, but considering how often gamers complain about perceived "downgrades", do you really think that wouldn't affect their ability to market it to the general public? People lost their minds when Uncharted 2 had what they thought were worse-looking *pre-rendered* cutscenes, despite the changes necessarily being purely artistic. "This game looks a console generation older, but at least you won't need to toggle the slider that most people won't even notice is on" is not going to fly for the majority of titles. Especially since so many games sell based purely on looking very pretty or having the latest tech features (i.e. Cyberpunk Overdrive, Metro Enhanced; I'm sure some people bought Forspoken just because it had DirectStorage).
The most I'd agree is that "low" should run fine without an upscaler. That's a genuine, unquestionable optimisation issue. But the idea that "ultra" shouldn't require one is basically demanding that games look incredible and yet not be taxing to run. Daniel Owen was running a lower-mid-range card (as much as the card is still expensive) on medium settings. On low (with textures at whatever is appropriate for the VRAM), I'm sure the game would still trash its predecessor and be less demanding. That would be the comparison I'd consider important. If medium looks better than the previous game, I think it's reasonable to expect a proportional increase in requirements.

Then he goes and runs the game on a 4090 at Ultra... and yes, of course it's not a massive frame rate. You set it to Ultra! You release a game whose max settings match the console, gamers complain about it being held back by consoles. You give better visual options to use that spare performance, now it is unoptimised? Does the game look way better than the previous game at default settings? Yes. Then bitching about ultra is pointless. It used to be that ultra settings would give most players 10 fps (i.e. Crysis, Witcher 2 Ubersampling) and people just... used settings that looked almost as good and didn't murder their PCs. Instead of demanding that games both look better than older titles or the console port... while also not being any more demanding? "Oh no, my 4090 only gets 120 fps on the absolute highest setting without the almost visually indistinguishable setting on that would let me run it at 4K for no performance loss." Seriously. If the game supported a 16K output resolution, would 4090 owners complain their FPS is trash? Because that's how the "I should get 500 fps at ultra" people sound to me. The point of the max settings is to be the most demanding way to run the game if you have performance to spare, not to be the default mode.
Sadly this is the standard software engineer/programmer mindset in big production. All of them think that "CPUs/GPUs are _so powerful nowadays_ that you should care about 'clean code' more than performance", and as Molly Rocket said in the video "Performance Excuses Debunked", this is wrong; old software was fast *EVEN ON OLD HARDWARE!!!*
I remember thinking this could happen when DLSS first got released, that some devs would be lazy and depend on DLSS or use it as a crutch.
is this like a "i used to walk uphill to school both ways and you should too" type of thing? Is optimization obsolete now?
I first noticed this with Nvidia Reflex in BF2042. Instead of optimizing the game's input lag and then using Reflex as the cherry on top, it just makes the game almost playable instead.
WTF. I cannot believe I haven't subscribed yet. Subscribed now, yours is one of the best upcoming fresh channels on gaming that I want to keep up to date on!
Remnant II uses Nanite and VSM (Virtual Shadow Maps), which are super heavy on the GPU; if you download any UE5 demo, you'll find they're just as heavy in the demos and you need to use DLSS.
What is vsm?
People forget that and just wanna complain to complain. "My GTX 980 was great back in 2017, now it can barely push 30 fps in 720p!"
OK, but it doesn't look THAT good. I don't care what they use if the graphical quality is barely on par with modern stuff but performs worse than a game with path tracing.
@@A.Froster Barely on par? You need new glasses?
As a developer on Unity, DLSS looks by far the best and also has hands down the best framerate, so it is the best of both worlds in that case
Your channel needs more visibility, your work is awesome
This is by design. Nvidia wants upscaling to be the main driver of performance uplift moving forward. Doing so allows their margins to increase over time per unit and lets them pocket the profit while giving everyone the illusion of technical progress. Developers are forced to work within that paradigm.
It doesn't help that AMD is playing along to those rules and that it will take Intel a while before they're 100% competitive with the other players in the market, but truthfully this is where hardware manufacturers are pushing development.
Intel creates limited edition gpus that are discontinued before battlemage is even close to launching. Not to mention they believe gpus should be released every year which is a horrible business model because desktop gpu users typically would upgrade every 2 years and aren't willing to purchase gpus released every year. No one seems to understand that a gpu company needs money for game ready drivers, developing technology, paying engineer employees, paying customer support employees, etc.
With increasing energy costs and the green agenda, I see future GPUs being all about low power consumption.
This is how they’d do it.
@@QuestForVideoGames Intel partners are still selling those parts and will continue to do so. Intel themselves stopped making those cards under their own branding but they continue to develop drivers regardless. I don't see a problem with that. A lot of people stay in the headlines and don't read the full thing (as it seems happened with you there...).
As a company just starting their discrete GPU business, I see no problem with them taking a crack at it yearly. They're clearly still positioning their lines, and GPUs are not Intel's bread and butter nor a meaningful source of revenue for their core operations, so they can afford to do these things.
I think over time they'll have more flexibility when they move to build these chips to their in-house foundries instead of relying on TSMC to make them. Intel has a HUGE advantage in that regard. They can undercut everyone greatly as they have the ability to integrate their manufacturing vertically (something Nvidia nor AMD are able to do). If they can come up with a high-end line comparable to Nvidia the paradigm will be forced to change again.
@@christinaedwards5084 I'd believe that argument if the latest lines weren't more power efficient and also so depressing. They're not doing it for the environment. They're undercutting raster performance for profit.
@@lammyjammer6670 I bet there is some juicy ESG black rock WEF money involved.
Game developers don't care about optimization; that's why there are no good games on Mac and why SLI is dead.
I remember when Nvidia introduced Temporal Filtering into Rainbow Six Siege, and it was really helpful for getting my aging 550 Ti able to play the game at 1080p. Now here we are, FSR and DLSS being used to supplant proper optimization. Realistically though, I'd imagine it's being used as a stop gap between release jitters and post launch stabilization to get the release window tighter in a schedule. It's not ideal, but hopefully it's a better choice than game devs simply not caring.
FINALLY! Someone is actually talking about it! This is what I've thought about DLSS since video games started being unplayable without it. While it is cool to be able to get more frames thanks to DLSS, games don't really look that good when it is on. I just want to enjoy a game on my decent rig without any upscaler on.
Hundreds of games released this year on PC, and only a few of them really "need" DLSS or FSR. Stop complaining about lazy devs. The point of upscalers is to get higher FPS, not to make the game playable...
FSR and DLSS should be meant for Ray Tracing performance offset or to achieve 120fps.
Hard stop.
So THIS is what I've been experiencing with newer games. I've been complaining about games not being optimized at all for a solid while now, this would explain it.
Nice video, I was waiting for someone to bring this to light.
So overly positive reviews are ignoring how this game simply CAN'T be played without upscaling.
WDYM a 2060 is not demanding (0:30)? I use a 1650 for The Witcher 3 and it is amazing. I never needed to upgrade my GPU; things are lightning fast and it's still supported by Nvidia. BTW, the 1650 is the average GPU used on Steam, and I said average, meaning about half of Steam users have something worse. The minimum requirement for the game is a 1650 too. For me this means those devs were too lazy to make a "good" game AND make it usable by half of the users on Steam, so they resorted to doing their best and having their business team put up those requirements to actually get people to buy their game.
PS: I use Proton to run The Witcher 3 on Linux; I can't imagine how much more FPS I could get from my GPU if I ran the game on Windows. I also don't use any of the upscaling and ray tracing BS.
The thing is that Remnant 2 doesn't even look THAAT good. Sure, it's a good-looking game, but nothing special; IMO it's marginally better looking than the first Remnant game yet is 100X harder to run. It just makes it more confusing as to why the game is in the state that it's in.