That is so true. I still believe the graphics of 2015-2016 are incredible, and we didn't need ray tracing or path tracing wrecking our fps just to show off good lighting. The technology didn't exist yet and games were still beautiful.
@@lovelyghost81 the trick is the "turn off" button for RT in the options. You can bring an RTX 4090 to its knees in Minecraft just by cranking the RT settings to insanity, and that's only one example of many. Right now the RTX 2060, a SIX year old card, can run 95% of games at a constant 60fps. And of course consumers keep getting abused, because they behave like... not smart people.
I will say this: people modded Black Myth: Wukong so that the cloud you receive in Chapter 6 (which lets you fly around the stage and is normally exclusive to that chapter) could be used in all the chapters. What they found was TONS of fully rendered geometry the player would NEVER normally see during a standard playthrough, including an ENTIRE SMALL CITY that was, again, fully rendered during gameplay, had nothing in it, and could never be reached by normal means in-game. So there might be a *little* bit of truth to the whole "invisible bloat just to tank framerate" claim.
If you were looking at the object, how was it "not meant to be rendered"? Occlusion culling is view-based: you literally cull objects by not looking at them. Unreal builds a complicated DAG-like structure every frame that it uses for multiple types of culling.
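To make the view-based culling point concrete, here's a minimal sketch of the general idea (illustrative only, not Unreal's actual implementation): test each object's bounding sphere against the camera's frustum planes and skip anything entirely outside them.

```python
# Minimal view-frustum culling sketch: skip objects whose bounding sphere lies
# entirely outside any frustum plane. Real engines (Unreal included) layer
# occlusion queries, hierarchical Z tests, etc. on top of this basic step.
from dataclasses import dataclass

@dataclass
class Plane:
    normal: tuple  # plane equation: normal . p + d = 0, normal points into the frustum
    d: float

def signed_distance(plane: Plane, point: tuple) -> float:
    nx, ny, nz = plane.normal
    px, py, pz = point
    return nx * px + ny * py + nz * pz + plane.d

def is_potentially_visible(frustum_planes, center, radius) -> bool:
    # If the sphere is farther than `radius` outside any plane, it cannot be seen.
    return all(signed_distance(p, center) >= -radius for p in frustum_planes)

# Toy usage: a frustum reduced to a single "near" plane facing +z at z = 1.
near = Plane(normal=(0.0, 0.0, 1.0), d=-1.0)
for name, center in {"in_front": (0, 0, 5), "behind_camera": (0, 0, -5)}.items():
    print(name, "->", is_potentially_visible([near], center, radius=1.0))
```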
It's not just that. Unreal Engine 5 is an unoptimized mess, despite all the sellouts telling you otherwise. They broke the rasterized renderer to force TAA and blur into everything, and they bloat the render pipeline with a huge amount of useless junk, so much so that people are now noticing the issue and have had to create heavily modified, de-bloated builds of Unreal Engine 5 to remove the dependency on TSR and TAA for everything. Have you never wondered why other engines like Frostbite don't need a stupid amount of supersampling and anti-aliasing to look nice and clean? UE5 also introduces so much bloat into the TAA pipeline that every game now has ghosting during gameplay movement, and you can only use DLSS and frame gen to counterbalance the lack of clarity and the blurriness it creates. Unreal Engine 4.21 ran five times lighter with more demanding anti-aliasing methods than UE5 does with TAA and DLSS Quality applied, because it's a huge mess. Nvidia absolutely colludes with Epic to bloat UE5 with heavier and heavier rendering pipelines that look really bad UNTIL you use a combination of forced DLSS, TAA/TSR and frame gen.
Never attribute to malice what can be attributed to incompetence. Why would they have a fully modeled city if they just wanted to add invisible bloat? It would be easier to create a coconut with 1 billion polygons and hide it somewhere.
Optimisations absolutely do exist. I will give a simple example: Forza Horizon and Forza Motorsport, two games made on the same engine by different developers. Forza Horizon is an absolute masterpiece of optimisation: the game runs at very high and stable frame rates while having some of the best graphics of this generation, and it does that in a massive open world without any load times or visible LODs. Forza Motorsport, on the other hand, runs like absolute garbage. The requirements are insane: 10GB of VRAM for 1080p. And on top of that it looks like it came from 2012, with no open world, just one track to render. How is this even possible? It's the same damn engine.
As far as we know, Playground Games is a talented studio, while Turn 10 has become the 343 of the Forza series, with Microsoft propping up the game with individual contractors who only work on it for 18 months. That doesn't change the fact that Horizon 5 is the least well-regarded entry among fans, optimization differences aside.
Forza Horizon pays for that optimization with huge VRAM requirements. Anything below 8GB is iffy even at 1080p. But IMO that's a good thing. AMD users aren't complaining lol
@@evilpotatoman9436 I wouldn't turn it on even if I had 90 or 120fps. That stuff comes with latency because it has to hold the next frame for the calculation, which means you're always seeing the image 1-2 frames late. And yeah, below 50 or 60fps it's literally trash: it adds to the already high latency and gives you garbage visual artifacts.
Yeah, it is criminal. I'd rather lower the resolution and render natively than run frame gen, because the input lag/latency drives me crazy. Same with DLSS/FSR: the flickering and latency make me feel like my brain cells are staging a mass exodus.
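As a rough back-of-envelope for the "held frame" latency mentioned above, here's a simplified model (it ignores the cost of generating the in-between frame and any latency-reduction tech, so treat the numbers as illustrative):

```python
# Rough model: interpolation-based frame generation holds the latest rendered
# frame until the generated in-between frame has been shown, so what you see
# lags by roughly one extra rendered-frame interval. Simplified estimate only.
def added_latency_ms(base_fps: float, frames_held: float = 1.0) -> float:
    return frames_held * (1000.0 / base_fps)

for fps in (30, 60, 120):
    print(f"{fps:>3} base fps -> ~{added_latency_ms(fps):.1f} ms extra latency per held frame")
# At 30 base fps the penalty (~33 ms) is far more noticeable than at 120 (~8 ms),
# which is why frame gen feels worst exactly where people most want to use it.
```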
Not illegal, just poorly optimized, or trying to push the latest and greatest graphical options. But if a game needs super high specs, then the visuals had better look next-level.
In the case of tessellation in Crysis 2, the ocean rendered nonstop, without culling, under the entire map, so AMD/ATI cards would struggle, while the tessellation level under the map was kept low enough to stay well within the capabilities of Nvidia cards at the time. Those studios never made "mistakes" like that before they were sponsored.
For the water issue, it's hard to pin down the exact failure mode, since there were also areas of the map where the tiniest gap would cause other objects to render under the terrain. But for the water specifically, there were tests where people would look at an outdoor location with no visible water, then disable water and watch the frame rate increase by a good amount. The correlation in those cases was the presence of the water plane under the entire map, though it could also be down to visual glitches where you can see slightly through the ground, which then causes extra geometry to render. Mistakes like that are always possible, but in many games the bottom of the map is just a bottomless void of nothingness.
This is not true, despite being a very common misconception. The debug tessellation wireframe mode used for those screenshots disables culling; the water wasn't rendered like that in actual gameplay. Crytek devs debunked this a decade ago, and I don't know why it persists.
All games have a performance target relevant to the time they are made in. Most games run pretty similarly. That some will manage to be prettier than others is just a fact of life. Older games had the same issue. You're probably comparing the prettiest older games to the ugliest modern games, instead of looking at how the uglier old games looked.
@@albert2006xp I think it's more an optimization issue. Developers know that GPUs keep getting more powerful, so why invest too much time and money in optimization? I bet a lot of games would run way better with more optimization.
@@IlSinistero Though, more optimisation costs more money. That's why GTA 6 is rumoured to be tested on 6GB GPUs as well, while Unreal Engine 5 was a breakthrough because it requires a lot less manual work to produce top-tier visuals. That drastic reduction in work lets smaller studios make games like Black Myth that they could never afford otherwise; in exchange for letting smaller devs make AAA-quality games, it demands a lot more graphics oomph.
@@IlSinistero That's nonsense social media groupthink. Why didn't every GPU generation in the past 20 years make optimization less prevalent? Devs still optimize, because optimization means better graphics, and better graphics means better sales. Nothing has changed. Hardware improves on average, graphics improve on average, and console generations move forward as well. Games run just fine today on the average available hardware and relative to the consoles. The issue is that a lot of people have below-console-level PCs, and consoles often run their quality modes at 30 fps and lower render resolutions. So people think devs should just wave the optimization wand and push things past that performance target for some reason, which would mean degrading graphics, because you can't really double fps through optimization alone.
@@albert2006xp I would agree if non-first-party games at least ran at a locked 60fps on consoles. They don't, and that lets Sony sell us pretty pricey new hardware too, like the PS5 Pro. Also, look at CP2077: that game has SCALABLE graphics. You can bring the very best GPU to its limits with it, while still running it on lowest settings on really old hardware. Developers could always design their games this way.
This is why I gravitate towards playing older games... it's just a much better way to play on PC. You get fully patched games, and you don't need to buy the most expensive hardware to run them properly. Not to mention they go on sale often and at cheap prices. I have no issue doing this to publicly traded AAA developers, where the money mostly feeds the wallets of insanely rich capital investors instead of the developers themselves. Indies and privately owned companies, however, I usually buy at full price on day one. :)
There are literally hundreds of high-quality games anyone could play on a potato PC, more games than we could ever play in our lifetimes. There's definitely no need to spend big on a PC if you choose to play older games, which are on average better than modern slop anyway.
Successful consumer marketing is about creating a need you didn't know you had, then offering you a product to address that need. It's the wonderful world humanity has created for itself.
This is exactly what ray tracing (and demanding games in general) is doing. Anyone who doesn't think devs and GPU/CPU makers collude to make games more and more demanding every year is delusional.
If you're interested in this topic, you should watch The Century of the Self if you haven't yet, a documentary about Edward Bernays (Sigmund Freud's nephew and U.S. government employee). It has been uploaded to YouTube several times since it was released. Not turning it off early might mean understanding how the concept of money has replaced certain forms of interpersonal exchange in society, and not having a reactionary fear of anything that suggests capitalism and consumerism have some ingrained, inherent problems. It's 'Mad Men' (if you remember the show), but a little more complicated than that. Edit: also, what concerns me (in my job) is how this form of modern consumerism has a very distinct relationship with how things were developing in 1920s and 30s Germany. It belongs to the same system of propaganda with supposedly different ideals, and the outcomes of those systems lead to the _inevitability_ of certain characters 'ascending'.
@@eustacequinlank7418 Thanks for the perceptive reply. I've watched Adam Curtis' documentaries over the years, including The Century of the Self. I also enjoyed his more recent one, HyperNormalisation, which gives some fascinating insights into the origins of Trump. As you implied, Bernays' ideas helped create modern consumerism by applying an understanding of psychology to manipulate people, for example getting women to take up smoking by linking cigarettes to women's suffrage and calling them "torches of freedom". It worked, and now people's identities are for the most part tied to their consumption patterns.
I think you are talking about the Dx11 patch released for Crysis 2 after the game’s official launch, in which case the issue was more related to “not very optimized tessellation.” In my opinion, this is unfortunate but forgivable.
Same thing with Wukong... the graphics don't look good enough to cost that much performance. The RT doesn't even look good, and games from 6 years ago have better graphics. At least Cyberpunk actually looks good; Wukong, not so much. RT seems even more irrelevant outside of three Nvidia games.
@@SemperValor I play Wukong on a 6600 XT; just turn off RT. And the reason RT kills Radeon GPUs is that AMD still sucks at ray tracing, and FSR is not good either.
@@gerardotejada2531 It's not about RT performance; RT is always on in Wukong even when it's turned off. I'm just saying that for how demanding the game is (bringing a 4090 to 24 FPS, for example), it doesn't look good enough graphically to cost that much performance compared to games from 6 years ago. Also, RT on and RT off in Wukong look the same. It's not like Cyberpunk, where you actually saw a difference; in Wukong there's no graphical upgrade either way. Pause the video at 25:13 and look: those graphics/textures look worse than games from 2015.
It was not Crysis, it was HAWX 2, a jet fighter game with massive tessellation on the ground. They also did it with PhysX in Batman: the smoke used PhysX to screw over AMD, but it tanked performance for Nvidia too. BTW, that kind of volumetric smoke doesn't really exist anymore, and it was a very cool effect. I remember Crysis running very well on my AMD GPU, on par with similarly priced Nvidia ones.
There is optimization in most games. The issue is that on PC they push things further: on console they get it to run, while on PC they crank everything up and push it hard. Listen to the Call of Duty devs; they always have to hold back what they could do in order to hit an optimized 60 fps.
Except there are plenty of examples of amazing work done to bring graphics into the next age. Think what it took to get path tracing in a game like Cyberpunk to work and be even remotely playable, or what it took to get a game like Alan Wake 2 to work; that's basically a 2030 game without heavy optimization.
They're spending more time and energy on marketing than on the actual game, and a shining example of this is Cyberpunk 2077 itself! Never forget that 2020 launch! Now they've spent nearly 4 years updating and fixing a single-player game.
The problem is temporal AA/SS. Parts of these engine pipelines aren't getting proper optimization, and devs just say "hey, it costs a lot, but we'll save cost by letting TAA fix up the resolve," when that only has about a 2% impact on performance, and we proved that. Nvidia profits the most, and AMD/Intel have been too busy competing in an area they will never win. They can only win if they work on more optimized workflows and integrate optimized effects into popular engines.
The dude who bought a 1080 Ti for 1080p back then "future-proofed" his PC all the way to 2024. The dude who bought a 4090 to play Cyberpunk 2077 at 4K will need to buy a 5090 to play Black Myth: Wukong and Monster Hunter Wilds at max settings, and the card is what, 2 years old? Future-proofing is dead.
I was a dude who bought a GTX 1080 (not Ti) to play at 1440p. I upgraded to an RTX 4090 when it came out. Still gaming at 1440p, but now on a 240Hz OLED. I hope I can keep this for a few years, at the very least until the RTX 6090.
That's my bro lol. I gave him my 1080 Ti back in 2020 and he's still rocking that card, but he plays at 1080p and never looks at settings or frame rates 😅 so he's happy. I, however, am a degenerate who loves playing at 4K on an OLED screen and having the latest GPU.
It's also worth mentioning that as hardware gets faster, a lot of the software optimization work that would have been done in the past (probably out of necessity) gets skipped. That's how a new game can come out running worse than an older game that is broadly comparable in rendering fidelity. This might also have a broader side effect of making the skill of writing efficient software rarer among software engineers as demand shrinks across the industry, which can make it really difficult to build an optimized game like Doom 2016: you just can't find people with the necessary skills who aren't already working for prestigious companies like Epic, id Software, Nvidia, etc.
UE5, as pushed by Epic, is likely one way this plays out with game devs, especially on the cinematic side. You see many devs not using the engine properly, while also avoiding older engines that work fine, like UE4.
It always makes me laugh that the UE3-based Arkham Knight looks better than games that came after it on UE4 and now UE5, which also have higher hardware requirements... (Gotham Knights, Suicide Squad)
@@Micromation tools can become very sophisticated but they are useless if the person using them is incompetent. No engine can make up for lack of creativity and vision required to create entertainment media.
@@Micromation Meh, to a degree creativity and vision are involved in visual quality and performance. Solid art direction and some style will carry more weight, for longer, than even the most cutting-edge fidelity.
The reason Nanite and Lumen exist is to "reinvent" LOD and lighting as one-click, turn-your-brain-off features, so game devs don't have to think about how to do lighting or level of detail. But because of this, Epic gets to set Nanite's and Lumen's default state, much like Nvidia could set the standard for how much HairWorks a character "needed". Hey look, we've got new tech, let's spam it so it kills hardware performance! Let's make it the standard so everyone kills hardware performance.
It's outsourcing. The engine developers outsource. The publishers outsource. The developers outsource. Teams are spread across the globe on top of this. No cohesion.
This, and the fact that some AAA studios just fire veteran devs and artists and hire inexperienced ones in order to pay their employees less. A cost-saving measure. French commenters pointed out that Ubisoft basically hired people based on criteria that had nothing to do with work experience, but that can also ultimately be a political choice, because they want to promote "diversity" at work.
Lol, I work for an outsourcing company in Europe, and the amount of crap code American coders write is astounding; the introductions they write about themselves are long enough for them to disappear up their own rear ends. Not all outsourcing is bad: Halo was outsourced for a decade, then with Infinite they brought everything back under MS due to sanctions imposed on the CIS countries, and what did that lead to? A DOA game with a barely functioning live-service model.
No, it's consumers. Lol, if you as a business can make the same amount while spending half as much on QA, you'll do whatever gives you the most profit. It's up to the consumer to say no. But scalpers reigned for a bit, and manufacturers saw that and said "hold my Starbucks." This is all OUR fault, not theirs. Until we own that, it won't change.
Games used to be developed primarily by people who were coders or understood coding on a fairly deep level. Think of someone like John Carmack, who would grind away at code for days and nights to optimize rendering. As time has gone on, development direction and design have shifted toward generic game engine suites without any real proprietary design or implementation. You no longer need to understand the foundational backend code; you can just move assets in, out and around, and add all sorts of built-in effects and tools with the click of a button. On top of that, development has shifted toward "designers" and marketing, where coders are simply employed to implement the desires of leadership. They may offer suggestions to optimize performance, but if leadership doesn't see value in it, there's no green light. Basically, if the goal and expectation of the project is dictated to be 60 fps at 1080p, why spend extra time and money? Just hit the metric you were given and move on. Many of these major developers see 4K, ultrawide, high refresh rates, etc. as niche enthusiast requirements. We live in a bubble and they live in a different bubble.
This is why Sony exclusives perform so well on PS4 and PS5 relative to their graphical level: the studios' devs still optimize the game as much as possible while running bespoke engines.
@@fadingdimension But are they wrong about 4K, ultrawide and high refresh being niche? Hell, are they wrong even if we take each one independently of the others? Look at the Steam hardware survey and then tell me they're wrong.
Game development is going backwards. We used to have absolute bangers of studios and devs who made legitimate breakthroughs. We may have higher-quality graphics now, but development quality has been on a steady downward slope, IMO, usually due to decisions that come down from the top of the corporate ladder and affect everyone. The problem-solving ingenuity is gone, and FSR/DLSS/TAA/what-have-you are just shorthand band-aid fixes; no more creative solutions to real hardware problems. It explains why we don't have idTech-quality engines in 2024 when other tech has somehow advanced.
I miss Carmack being in control at id, the days of the Quake 3 level-editing mod scene with Radiant and a full operating manual. I almost ended up working in game design, but I saw the industry becoming a suit-run operation. As soon as the suits got control it became about marketing, hype and false promises (Todd Howard / Peter Molyneux syndrome), and profit became the goal over quality. It seems we're living in a time where the suits are so disconnected from popular culture that they don't even know what the masses want anymore. The culture war and DEI hiring haven't helped, but whatever, that's a totally different can of worms.
Yesterday I was playing part of the Battlefield 4 campaign, an 11-year-old game you can run on a potato. Compared to recent games the graphics have barely improved, but the hardware requirements have exploded. What happened?
I believe it was Crysis 2 that had such crazy-high tessellation, killing FPS on AMD GPUs, that AMD had to ship a driver option to hard-cap it. Tests showed zero impact on quality, but performance tripled... now THAT you could call invisible bloat! That's also an EA game, and EA is a shady company to begin with.
@@PyromancerRift Makes me wonder whether something similar is going on with ray tracing on AMD GPUs, like whenever a game detects an AMD GPU it cranks up the ray bounce count to tank performance, etc. I use the driver-level tessellation setting and cap it at x16.
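For a sense of why a driver-level cap like x16 helps so much, here's a rough back-of-envelope (a simplified model: a uniformly tessellated patch with factor N produces on the order of N² triangles; exact counts depend on the partitioning mode):

```python
# Rough model: triangle count per tessellated patch grows roughly with the
# square of the tessellation factor, so capping the factor pays off quadratically.
def approx_triangles(tess_factor: int) -> int:
    return tess_factor * tess_factor  # simplified; real partitioning modes differ slightly

for factor in (8, 16, 32, 64):
    print(f"factor x{factor:<2} -> ~{approx_triangles(factor):>4} triangles per patch")
# Capping x64 down to x16 is roughly a 16x cut in tessellated geometry, which is
# why the driver override helped so much with next to no visible difference.
```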
I don't know exactly what's happening, but something definitely is. DLSS should exist to give $250 cards a little oomph at 1080p and to make older-gen cards last longer. Somehow DLSS turned into an essential feature for brand-new $600 cards, or $800+ cards, just to run brand-new games at high/ultra with RT at 60fps.
Simple. AAA gaming now sucks. AAA gaming was the crypto before crypto. Companies thought that if they just pushed for "wow, cool graphics" people would flock to the games. Then they noticed it takes way too much time to create games with high graphical fidelity. So they rushed devs, who missed deadline after deadline. They laid off workers to save money because the game wasn't even out yet and investors were getting angry. No Man's Sky. CP2077. Warcraft 3: Reforged. Gollum. Concord. Overwatch 2. Dying Light 2. These are just some examples of dogshit games caused by poor development, and GPUs not being able to keep up is a symptom of the same issue.
@@hachiko2692 Gaming ended around 2010. Before that: Doom 3, Half-Life 2, STALKER, Far Cry 1, Battlefield 2, Burnout Takedown, Left 4 Dead 2, FEAR, Halo 1-3, Demon's Souls, GTA 4, Max Payne 2, Borderlands 1, CoD MW2, the Chronicles of Riddick game, and Prototype, which was an original concept. First PC started to get good, then the Xbox 360 killed gaming: cool games, but made for casual console gamers. Now all games cost $100 million+, are empty open worlds, and have no "levels" at all.
@@themodfather9382 Jesus, you all sound so depressed. There have been tons of amazing games since then; more good games than most people have time to play.
@@mojojojo6292 And none of them aim for any graphical fidelity awards, so we don't need the latest RTX 69420 or AMD Ryzen 6663D. I enjoy games, just not the "flagship" ones. This discussion is about those games because they're bound to GPUs. Shut up now. Thanks.
This is one of the biggest reasons people just play older games: playing a new game would cost hundreds if not thousands of dollars, and considering today's game quality it's not really worth it. Just boot up one of your old favorites and enjoy yourself on the PC you already have.
Facts. I grew up with “bad graphics” as a 90’s kid. The other day I got pissed off at modern games being broken so I went and played Super Nintendo on a CRT tv. I had a great time. I might never upgrade my PC at this point.
This. I will keep my current rig until there's a game I really want to play that can't hit at least 1080p 60 fps on high. There are many older titles to keep me busy; at this rate I'll stick with what I have for a long time.
If devs stopped using xx90 cards to optimize their games, we would have a lot fewer issues with terribly running games. Make the performance ceiling lower so mainstream GPUs lag behind a bit less.
But then you'd get morons complaining that everything looks the same and there's nothing interesting graphically anymore. Boundaries should be pushed in tech. You don't need the newest tech; you can wait if you can't afford it. Same as a car: you can buy a brand-new 2025 model for 60k, or you can wait and buy it used in 2028 for 22k.
@@ancientflames Well, right now games look the same or worse, so what's your point? I'm sorry, but the best-looking titles from 7 years ago run and look better than current titles on lowered settings, and at max settings the differences are truly minimal. This is not pushing boundaries; this is cutting corners and making the customer fill the gap. It is what it is.
@@Micromation Really not true. Go play Alan Wake 2 at max settings. The path-traced lighting is astounding and really brings the whole package together. An excellent showpiece.
@@ancientflames Dude, they still by and large look the same. 2020 was the peak of actually significant gains in graphical fidelity. I'd prefer performance to go up instead.
@@Chibicat2024 That is your personal opinion. I agree with you on some games, but many games that use ray tracing and path tracing were not possible on GPUs from 2020, and they look astounding when built from the ground up for the tech.
No Daniel, game developers are destroying game optimization. A simple "Hello World" program now requires 5 GB of hard drive space, 32 GB of RAM and a 4070.
Uncompressed everything, the dirtiest code, and closed-source features forced by the company and injected into the game at the end of the development cycle by Nvidia.
@@MorMacFey-v2g And a lot of lazy high-level languages like Java and Python that take 10 times the CPU cycles to execute a subroutine that C++ can do in one.
I want to say Crysis 2 was one of those examples. If I’m remembering right, when run on DX11 it had an insane amount of tessellation on a ton of stuff that wasn’t even on-screen (including an entire water plane rendered below the map at all times)
Items were rendered across the entire world; there was no limiting draw distance, since the draw distance basically extended to the end of the game map... despite the insta-kill black wall they put up.
Crysis 2 wasn't as bad as Crysis 1. At least you didn't need top-end hardware for it; current-gen hardware wasn't enough to max out Crysis 1 at the time.
This was also intentional, as it was GameWorks forcing it, and GameWorks at the time was a closed-source blob you just attached to your game's rendering pipeline to add effects. So devs couldn't even tone down the tessellation if they noticed it and wanted to, thanks to Nvidia.
Before I upgraded my PC, I tried to run some newer games on the lowest settings, and what I noticed is that older games, which actually had "crappy old graphics", vastly outperformed modern games in this kind of "affordable settings" scenario. A modern game on lowest quality, which allegedly has "numbers" somewhat close to an old game's default quality, looks like absolute garbage, while the old games looked decent. So I can see where this idea of "invisible bloat" is coming from. At the end of the day, I CANNOT actually avoid the latest stuff and tech; I have to hit at least the mid settings, or the picture completely falls apart in a new game. So apparently there is software bloat going on, just like how some software, Discord for example, is built on tech that isn't really meant to be efficient, and the developers just go: "ah, screw it, modern PCs will handle it."
I made a video comparing Far Cry 3 and Gray Zone Warfare, and my god, the difference is insane. Gray Zone looks like ass but also runs like shit. Far Cry 3 looks beautiful and it came out over a decade ago.
I remember when we had the GTX 1080 and thought we were right on the cusp of 4K becoming the standard resolution, just waiting on a mid-range card that could run every game at 4K the way the 1060 did at 1080p. It's been 8 YEARS since those cards released and we're still no closer to that than we were then, and games from 2016 still hold up phenomenally against games today, many of them with a much clearer image and cleaner presentation than modern TAA-heavy releases. Oh yeah, and now the 60-series cards are around $400 instead of $200. Graphics move marginally forward while value sprints backward.
4:32 AMD did deny it. Frank Azor literally said "If they ask us for DLSS support, we always tell them yes" and "If and when Bethesda wants to put DLSS into the game, they have our full support," but hey, that wasn't newsworthy I guess. The way the news handled the "FSR exclusive" story was really messed up. The only semi-decent coverage was from Gamers Nexus, and even they never came back to check on what Frank Azor said.
If just turning off RT made any difference, nobody would complain. The problem is that when you turn a game down to ultra low, it looks like garbage and still runs like absolute crap. Most new games on low have unjustifiable hardware demands. Some games on low look worse than, or similar to, titles from 2015 (!!!) and have 4-5x the requirements. It's bananas. Go turn on Remnant 2, switch to potato mode (it's absolutely ugly) and tell me your performance... Games from 2010 look and run better 😂😂 I refuse to buy dogshit that can't even hit decent framerates (120+) on top-end hardware at 1440p ultrawide without frame gen and upscaling, and that makes virtually no meaningful gameplay improvements. The only area that ever advances is graphics, and year on year those improvements get ever more minuscule. AI is as bad as it was in 2001. Audio-wise, games still can't get positional audio right. Mechanically, games are as dumb as they were in 2000, some even dumber, and open worlds are as empty as they've ever been. There is a crisis in gamedev... a crisis of competence.
I've been playing some Battlefield 1 multiplayer lately, and it's amazing how often I wonder how a game from 2016 can still look completely on par with current releases. Increase the texture budget for crisper details and there's nothing left to object to. The frontier that developers refuse to tackle is interactivity, physics and the like. But hey, they'll give us ray tracing bloat instead of baked lighting that gives basically the same results. Developers are lazy.
@@gerardotejada2531 Why should I do that? You think that just because I don't make movies I can't criticize them for blatant displays of incompetence on screen? These studios had competent senior developers; they've either been pushed out or left of their own volition. None of the people who made the games that rose those studios to fame still work there, and it shows. It shows when they leave, start their own studios, and still create bangers.
The selected comment is mostly spot on. Devs need to get back to games with a unique art style that can be rendered on relatively modest hardware, instead of making games that try to look like over-sharpened videotape and require a 4090 to render properly. It's not just about needing a more robust GPU and CPU; gamers are also being pushed to buy 1440p 240Hz monitors that are way overkill for running AAA games at appropriate settings on the hardware they can actually afford. It's a travesty.
Partially correct: they have engineered themselves into a pickle with game development and asset handling. In the old days, developers had to build 3-4 models of the same object for the various distances and detail levels to retain performance. But that takes expensive artist time, so it's cheaper to just buy the detailed model and shift the cost onto the consumer with "buy better hardware". Of course, this can cause GPUs to waste up to 75% of their shading work because of the 2x2 pixel quads they shade in (there are videos that explain this in more detail). Couple that with the smeared image created by frame gen and TAA, and you get really shitty performance with substandard graphics, because the developers are out to cut cost and development time.
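A rough illustration of that "up to 75% wasted" figure (a simplified model: rasterizers shade pixels in 2x2 quads, so a triangle covering only part of a quad still pays for all four shader invocations):

```python
# Simplified quad-overdraw model: pixels are shaded in 2x2 quads, so a tiny
# triangle that touches only part of a quad still pays for all four invocations.
QUAD_SIZE = 4  # 2x2 pixels shaded together

def wasted_fraction(pixels_covered_in_quad: int) -> float:
    return 1.0 - pixels_covered_in_quad / QUAD_SIZE

for covered in (1, 2, 3, 4):
    print(f"{covered}/4 of a quad covered -> {wasted_fraction(covered):.0%} of its shading wasted")
# A 1-pixel triangle wastes 75% of its quad's work, which is why piles of
# sub-pixel triangles (e.g. un-LOD'd photogrammetry meshes) hurt so much.
```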
You know LOD creation can be automated now? Even Blender can create a lower-detail LOD with a single modifier called "Decimate", and the industry-standard Autodesk Maya has a one-button tool that generates an entire LOD group. Games like Skyrim have automated tooling to generate all the LODs and billboards needed for a given area of world space, and that's just a third-party mod tool. You don't notice LODs anymore because they now take resolution into account to hide the transitions better. With very few exceptions, all AA, AAA and 3D indie titles use LODs by default. If you don't notice them, that means they're working correctly.
@@11cat123 Clearly the automatic approach doesn't work that well then. From what I've seen recently, the main change between LODs is that stuff "disappears" or is hidden on the model to reduce complexity, but you can still see that the base model is the same one.
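For reference, the Blender "one button" route mentioned above boils down to something like this minimal bpy sketch (run inside Blender with a mesh object active; the ratios are arbitrary example values, and a real pipeline would tune them per asset and then apply/export the results):

```python
# Minimal Blender (bpy) sketch: duplicate the active mesh a few times and add a
# Decimate modifier at decreasing ratios to get a crude LOD chain. Example only.
import bpy

source = bpy.context.active_object  # assumes a mesh object is selected/active
for level, ratio in enumerate((0.5, 0.25, 0.1), start=1):
    lod = source.copy()
    lod.data = source.data.copy()          # give the copy its own mesh data
    lod.name = f"{source.name}_LOD{level}"
    bpy.context.collection.objects.link(lod)
    mod = lod.modifiers.new(name="Decimate", type='DECIMATE')
    mod.ratio = ratio                      # fraction of faces to keep
    # apply the modifier / export each LOD as your pipeline requires
```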
This is not new or unique to gaming; it's been a plague on software development forever. The more hardware resources and abstraction we get, the less efficient and optimized all software becomes. Why? Pretty simple: why bother, when you can just throw some extra hardware at the problem? It requires FAR more expertise and time the other way. Talk to someone with expertise in assembly and/or compiler optimization; in the old days they hated even the C language, let alone C++. This is why I will forever say that probably the most impressive game ever programmed was Chris Sawyer's RollerCoaster Tycoon, which was done entirely in assembly by him alone.
Man, that sounds like a pain, but I agree. Lots of "devs" nowadays get a certificate in JavaScript, if that, and call themselves programmers because they use UE5 Blueprints lmao.
I was shocked to see that the Low graphics preset in Jedi: Survivor looked no different from Epic to the naked eye, unless you inspect it extremely closely with GPU-nerd terminology in mind.
The tessellation thing was Crysis 2, whose DX11 update had lots of unnecessary tessellation baked into the levels (tessellated ocean waves under the ground even when no ocean is on screen, simple assets like concrete blocks more heavily tessellated than anything else in the scene). The story came from Tech Report, back when they were one of the best PC enthusiast sites on the web.
It is as simple as this: as soon as games started to be "sponsored", game optimization ended, or at least ended for the other team. Some time ago, how games ran depended on your chip and drivers; now it depends on the sponsor…
Well, as a game designer with some optimization experience from my previous AA projects: my fellow game designers say we don't need optimization nowadays, that optimization is a thing of the past. 😢
What I can't wrap my head around is that we're on the third iteration of RT cores and such, and fps still takes a 50% or greater hit every time you enable ray tracing. That suggests the improvements have effectively been made mainly on the rasterization side of the rendering pipeline.
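One way to see why the relative hit can stay around 50% even as RT hardware improves is a toy frame-time model (the millisecond numbers below are made up purely to show the ratio effect):

```python
# Toy model: if raster and RT costs shrink at similar rates each generation,
# the *relative* fps hit from enabling RT barely changes, even though
# absolute performance keeps improving.
def fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

generations = {            # (raster ms, extra RT ms) -- illustrative numbers only
    "gen 1": (16.0, 16.0),
    "gen 2": (11.0, 11.0),
    "gen 3": (8.0, 8.0),
}
for name, (raster_ms, rt_ms) in generations.items():
    off, on = fps(raster_ms), fps(raster_ms + rt_ms)
    print(f"{name}: {off:5.1f} fps RT off -> {on:5.1f} fps RT on ({1 - on / off:.0%} hit)")
# The hit stays ~50% in every row; it would only shrink if RT cost fell much
# faster than raster cost, rather than both getting cheaper together.
```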
I don't think there's any kind of conspiracy going on. To me, this just looks like devs/publishers cutting the wrong corners to save on the ballooning costs of game development, instead of dealing with the mismanagement that leads to those costs in the first place.
@@kunka592 Because Nvidia doesn't care about gaming. The overwhelming majority of Nvidia's riches come from companies like Google, Meta and Amazon buying their AI chips to set up massive AI farms. I'm all for hating on Nvidia, but I'm also going to shit on your room-temperature IQ.
I think it's the same sort of scenario as "hard times create strong people, strong people create good times, good times create weak people, and weak people create hard times," in that game devs now have an unparalleled amount of tools and resources and so take them for granted; they can make a marketable game with 10% of the effort. Whereas in the early-to-mid 2010s, devs had to use every resource at their disposal to make their games run on the hardware. They used tricks like removing polygons from unviewable surfaces and other little hacks to squeeze out every drop of performance they possibly could. As a result, the games of that era still hold up today. The devs and companies of today, however, are complacent, because the hardware and software basically do the optimization for them, so they just don't need to try as hard to make money.
Limitation breeds innovation. Many of the stories from NES/Sega devs about what they did to eke whatever performance they could out of 8-bit consoles are equally mind-blowing.
I got a 4070 at launch after sitting on my GTX 1080 far too long because of the mining boom. I got stung paying close to an 80-class price for a 70-class card that performs like a 60 Ti-class card. But RT was never part of my decision: RT has been out for 6 years and is still not fit for purpose IMO, especially for the masses. DLSS was far more of a deciding factor for me, and frame generation was a good idea on paper.
@@ns-tech1304 I spent that much on a GPU because Nvidia are robbing bar stewards and I was sick of waiting. I don't use frame gen, I said it was a good idea on paper. I use DLSS mainly in UE5 games because UE5 relies heavily on it.
I like it when you talk about things like this, keep it up :) It's good to discuss what value these products are giving us and to make clear points about the positives of both companies without bias, while also talking about the past!
Currently I'm just hunkering down on my Ryzen 7600X and 3060 Ti. If I can't run a new game decently around 60 fps without buying new hardware, then I'm not buying the game. Game devs need to wise up and see the potential of sticking to a tier of hardware for longer periods; then they could also put more time into the game itself rather than spending untold amounts of time stuffing it with minute visual upgrades. The only thing that might tempt me is the upcoming "8800 XT"; that said, provided it's near 4070 Ti Super/4080 performance at low wattage, I'll stick with it for at least 5 years. Good games over eye candy any time; there's a reason I've revisited Super Metroid on many occasions.
And if you turned everything off, it ran on a fridge. Can you say the same about UE5 games? Or do you think Remnant 2 in potato mode has any business running the way it does while looking like a PS3 game?
I feel like game companies are betting on upscaling and frame generation. They see that a lot of people have lower-end hardware, but that's such a backwards way of thinking.
Personally, I think of the "extreme max settings" as a tech demo... I hardly ever see any difference between the highest setting (path tracing excluded) and the next-highest one, except in the FPS.
The other aspect I didn't catch you mention, Daniel, is that Nvidia has skimped on video memory so heavily that we're currently in a situation where a PS5 has access to more video memory than most midrange Nvidia cards. Sure, it's unified memory shared with the CPU, but the console can make up for that with hardware-specific optimisations. A great example is how much less RAM Apple products have historically needed for the same or better performance than higher-spec Android phones. PC hardware generally compensated for having less optimisation by having access to more raw performance, but that's currently being siphoned away from the mid and low end so we can be upsold on prohibitively expensive high-end cards.
Compare Arkham Knight with Gotham Knights or "Woke Squad Kills the Justice League": the old game looks two generations ahead, and then look at the PC requirements. Ridiculous.
With the AMD driver that released yesterday, even a 6700 XT gained +6 FPS on average over the previous driver in both Ragnarok and Space Marine 2. That's at least a 10% boost. I haven't seen or tested higher-end GPUs yet; it could be the same 10% or even a couple of percent more.
@@DemiSupremi The 4060 beats the 3060 by like 20 percent in all games. I don't know where this fake narrative comes from. Probably from testing the GPUs at 1440p ultra settings when both cards are meant for 1080p.
Nvidia's marketing is very braindead in my opinion. I don't know why it works, but when they show an RTX 3070 failing to run a game made before the card was, just to show the 4070 running twice as fast, regardless of how they got those metrics, they paint the picture that if you buy an Nvidia card it won't last long enough to even play same-generation games comfortably. They showcased the 3070 getting around 25-30 fps in Cyberpunk while the 4070 got around 90, thanks to more VRAM, DLSS frame generation and upscaling. They ran the benchmark at a high resolution with high textures so they could exploit the 3070's lack of VRAM to make the 4070 look unfairly better even without DLSS 3, since the 3070 was VRAM-bottlenecked.
Game developers love the full-RT, path-traced way of making games, because lighting setup becomes much simpler on their end. They're salivating at the prospect of not having to place lighting manually everywhere.
My opinion is that these games are well optimized for their visuals, in the sense that the graphical effects they implement run about as fast as they can. However, advanced visuals at this level are basically indistinguishable from poor optimization. If you have to throw all these effects at a game to make it look only slightly better than PS4-era titles, while running at the same or worse resolutions as back then... what are we doing here?
That's a good point. The difference between a PS4 game and a PS5 (or modern PC) game is mostly down to effects and lighting, not AI, pathing, super-high-res textures or physics. The budget goes into more visually noticeable things, but at the cost of massive performance spent on "boring" but transformative stuff like ray tracing, path tracing, etc.
What we're doing here is what we've always been doing here. We're pushing graphics up to the performance targets based on what hardware people have/what consoles have. If you don't like it you're welcome to lower your settings.
I think this is worthy of an extensive benchmark. When I had a 3070, an older driver produced at least 5 fps more on average in AAA games. Unfortunately, I had to update to a new driver because a game wouldn't launch unless I did.
I switched to AMD when I noticed my 2070 Super getting throttled and having issues a few months after the 30-series announcement. I haven't had a problem since, and I'm still on the older 6000 series. My FPS has in fact only gotten better every couple of months with driver updates. They're not major jumps, but the fact that they keep optimizing for older cards is awesome, and I plan to buy a card from the upcoming generation.
Lmao, I did the same thing. I had a 2070 Super for over 4 years, but I couldn't play new games at 1440p high/ultra settings. So I decided to give AMD a try, bought a GPU from them over a year ago, and I have no regrets.
I agree with this comment. I mean, I understand what you say about RT and I agree. Maybe it's okay for RT effects to tank the framerate, but take a recently released game like the Silent Hill 2 remake. I love the graphics and atmosphere of Silent Hill, but it shouldn't be as heavy to run as Cyberpunk. I think that's what the commenter was saying: games that don't look as good as Cyberpunk, for instance, still run badly on PC even without RT effects.
These games were legendary for being great but unoptimised. Lords of the Fallen (the first game) and Watch Dogs 1 could only hit 30fps on a 980, and only got above that at 4K maxed once the high-end 10-series cards arrived. Control couldn't run well at native 4K high settings until the high-end 30-series cards came out. Very demanding games aren't made for the GPUs of the moment, which is why I named the games I did. Alan Wake still can't run without drops, even at 1440p; Black Faith, a Souls clone, can't run at 1440p without dropping FPS, and stutters badly and drops FPS at 4K on a 4070 Ti. I still can't play Remnant well even at native 1440p high settings: I'm stuck in the 40s with drops. Hellblade is very demanding, and Tarkov and Cyberpunk still don't run like they should. I'm thinking all of these games and others will finally run the way they should once the 50 series comes. All the old games I named were well optimized; the cards just weren't strong enough at the time to run them great.
Lol, the guy can't imagine anything more complex than Cyberpunk? The game has PS4-era textures and geometry, PS4-era level design, and a PS4-era AI and animation system; it's only the ray-traced lighting that makes the game look good.
100%! Why do you think they turned CSGO into CS2? I had a wonderful time with 300+ fps at 1080p playing CSGO on my 4790K PC; after CS2, only 100 fps at 1080p. It's 100% to make old hardware useless! The game developers are corrupt AF, scammers making their games pay-to-win. 4:50 "Probably putting some invisible bloat in their games just to tank framerates": no, he's not pushing it too far! In CS2, on maps like Inferno, they added 137 flowerpots to tank the FPS. A dude showed in a video that if he went around and removed all the flowerpots from the map, he got 100+ more fps. Keep in mind this map gave me 300+ fps in CSGO, and in CS2 I can barely hit 144 fps for my 144 Hz monitor. They are absolutely adding junk to the maps just to tank FPS, 100% without a doubt! And why do old maps give great FPS while new maps give absolute junk FPS? Because they add things like water, flowerpots, furniture, art and all sorts of useless stuff nobody in the CS community cares about; we want high FPS, not realistic environments! Even with Windows XP, Windows 10 and Windows 11: why do you think games run better on Windows XP than on Windows 10 (like 10-20% better in CSGO, I tested it myself) and better on Windows 10 than on Windows 11? Because they're doing the same thing with Windows. They want to suck your old hardware dry so you go and buy new hardware; it's exactly the same issue with Windows!!
But I am glad people are finally waking up to this scam! I have been arguing with gamers about this for a long time; they told me I was wrong and a conspiracy theorist. Even with Windows XP, Windows 10 and Windows 11: why do you think games run way better on XP than on 10 and 11, and better on 10 than on 11? Because they're doing the same thing with Windows. They want to suck your old hardware dry so you go and buy new hardware; it's exactly the same issue with Windows!!
@@orion9k also to answer your question regarding windows! i have an old laptop running windows 10 before Aug update it was running ok but after the update it struggles to run properly! well that's your answer to they want to suck your old hardware LOL it's all about money & marketing to make more money in this era things aren't what used to be anymore!
@@zf8496 100%, make no mistake, they know exactly what they are doing. With all the AI today, they could make old hardware from even before 2014 come back to life and play modern games, I guarantee you!!
The current problem is that the developers of these games don't want to, or don't have time to, optimize them. No code is perfect, but there are rules to go by, and you can simply learn by trying and testing what works and what doesn't. Battlefield 1 and V are, IMO, simply the most optimized games there are. DICE knew exactly what they were doing; the games look beautiful and can run on really modest hardware. They're also very smooth when it comes to frame pacing, which is often overlooked, as in CS2 and many other games that suffer from poor optimization. Devs need more time, or they need to put in the work to make their games run well. That's why you look at games today and just think "Wow, nothing has changed, and yet my hardware tanks drastically." Hopefully in the future that will change and devs will get back to optimizing their games. (I certainly hope so, since CS2 hasn't had an optimization update since it released, and that game... wow, the FPS is bad. And the frame drops...)
Very well explained. This is how I build my PCs: I target console performance and build around that. I just want the games to run, and sometimes even then the PC version runs better, because of a dedicated GPU with its own memory rather than an APU with shared memory. And no, developers will always target 30 fps in favour of better graphics, because better graphics sell better.
Devs are on board with Nvidia because ray tracing apparently saves a bunch of development time compared to baking in light sources and lighting elements (shadows, reflections, etc.) individually.
AMD is literally copying Nvidia with FSR 4; they're now using AI too. Nvidia could have patented that tech so that neither AMD nor Intel could ever use it, but they didn't!! You are all conspiracy theorists and it's sad... everyone is so paranoid nowadays, it's pathetic.
The newest demo of MH Wilds at Tokyo Game Show was running at 60fps with some dips here and there on a base PS5. I honestly think those specs are just a placeholder so they could put preorders up while they keep working on it. Thank you for at least saying "the game is still 5 months out, we don't know how it will perform." So much doomposting and panic based purely on recommended specs with no actual benchmarks out.
Or we could compare it with other games on the same engine (RE Engine). We're not operating in a vacuum: Dragon's Dogma 2 levels of crapformance aren't something to debate over, they're just something you're going to get. Wilds will run like garbage. Honestly, it could have kept looking like World; instead we'll get a game that chokes just about any hardware in existence for no good reason whatsoever... how can gamers be so gullible? WHEN was the last time "the game is 5 months out, let's wait and see" yielded ANY improvement whatsoever? Because quite frankly, I can't remember that being the case, for the most part.
@@GAMERGUYXL Specs are required to put preorders up on Steam. The last stretch of development is always bug fixes and optimization; you need to implement systems and content before you can optimize them. Updating requirements closer to launch is a very common practice, but nobody makes a big deal of requirements going down, so it doesn't get talked about.
@@djagentorange "You need to implement systems and content before you can optimize them": that's an R&D step, not game development. You know nothing about game development at all.
I like that you diplomatically didn't touch the "graphics in games peaked around 2015-16" bit. It perfectly showcases how disconnected from reality the average YouTube "gamer" comment is. Go play The Witcher 2 on the PS4 (or really, any PS4 game) and then compare it to something like Star Wars Jedi: Survivor or even Dragon's Dogma 2; they're just observably wrong on a basic level. Sure, there have been diminishing returns, and that's been the case basically since the inception of video game graphics tech, but pretending there have been no major breakthroughs in graphics technology in the last 8-9 years is pure delusion. It reminds me a lot of the "ray tracing does nothing except tank your fps, baked lighting looks better" comments. In some cases I can agree it doesn't do enough to justify the fps drop (*cough* Elden Ring *cough*), but that it does nothing? Even where it's minimal, it's usually a marked improvement. Man, I see so much mental gymnastics (we'll be diplomatic too) in YouTube comments, not just about games and hardware but also about other topics like history, where the statement is just observably false. There are a lot of confidently wrong people out there. Whatever helps you sleep at night, I guess?
Proprietary tech, bigger and more complex engines, "the way it's meant to be played". We also went from fitting an entire texture map inside one small atlas to save VRAM, to downloading photogrammetry-scanned models, throwing them at Nanite and calling it a day.
So, according to you, they should stop making more complex software altogether? No more FX for video editing, no more CAD upgrades; after all, it's all designed to sell GPUs!!!
That's not true. Extra tessellation that was not visible in the game was added to Crysis just to tank performance. This was pushed by Nvidia because they knew the impact on AMD GPUs was higher.
Damn Daniel! I used to watch your videos regularly when you had less than 50K subs. I go away for a few months and when I return you're at 200K. Nice work. Keep up the quality.
I swear to god, the games AMD picks for their freebies are some of the worst garbage I've ever seen... it was a long streak of mediocre crap until Space Marine 2 just now.
@@BasedBebs Far Cry 6, the Saints Row remake (ahahahah), Forspoken, Sniper Elite 5, Starfield, Dead Island, Callisto Protocol, Company of Heroes 3, Avatar... and Dragon's Dogma 2, which doesn't even run acceptably on any of their hardware. The Last of Us Part 1 was decent and Space Marine 2 was great, but that's a long streak of utter trash.
It gets better, because they released a more powerful GPU, the 7900 GRE, at the same price a few months later. I tried to be an AMD guy, but they're just as messed up as Nvidia, only in different ways.
I have a bit of commentary on this, as someone who has been big into the tech behind gaming since the 90s. I think part of the issue is that the consoles, the PS5 and Xbox Series X, are very powerful, no longer use weird custom hardware, and games that never would have been ported to PC 10 or 15 years ago now are, because devs need the biggest possible market to turn a profit. So games that actually push the PS5 and XSX are closer in hardware requirements to high-end PCs than on any previous console generation. The PS5 has roughly an RX 6700-equivalent GPU, and it performs better in the PS5 than as a PC add-in card because of greater bandwidth and a tighter architecture with the integrated CPU. The PS5 Pro is a further crank upward. Games developed against those targets are GOING to need super beefy PCs: they'll be less efficient there than they were on the PS5 and will need bigger and better hardware. And all of this is exacerbated by how much more expensive high-end hardware is today compared to 10 years ago. Gone are the days of a $700 halo product (RIP 1080 Ti); now that gets you the GPU in the middle of Nvidia's product stack.
So buddy, what should I do now? I was thinking of building a PC but wanted it to be future-proof, so that at least I can play GTA 6 when it releases in 2026 or 2027 at 1080p max settings.
@Dantetheoneatpinnacle Hard to say. With GTA 6 being in development for so long, it'll either have hugely inflated requirements like Cyberpunk or be a generation behind. The closest thing to "future-proof" GPUs today, IMO, are the 4070 Ti Super and better, and the 7800 XT and better: 16GB or more. If you have to buy now, the answer is "buy the biggest your budget will allow." However, I would wait until next-gen specs and pricing are official.
@ElysaraCh I'm also thinking about waiting, but I don't have a PC at the moment. The last PC I had was from 2015, built for GTA with a GTX 1050. That card died twice under warranty and got exchanged, but the third time it was gone and the warranty was over, so I never built a PC after that. Now I'm thinking about GTA 6, but that game is probably 2026 or 2027 on PC.
I think the rush to 4K is also part of the picture here. A lot of sentiment basically skipped 1440p, which has only belatedly risen in profile. Plenty of people regard 4K as almost normal and 8K as the "next thing", when for a lot of games 4K really is still the bleeding edge.
@@Dempig that is not at all my point. Also, 1080p looked nuts to people when it first came on the scene. The theory I am proposing is exactly that people jumped to 4k ahead of the technological curve. This means they are now used to 4k (thinking 1440p looks bad fits perfectly with this) even though our tech is struggling to perform at that resolution even with high end hardware. If 4k was still seen as the bleeding edge then people would expect issues and not be surprised that demanding games may not hit 60fps there.
@@LlywellynOBrien I mean ive been gaming at 4k since 2019 when I had a 2080 super. It wasnt too difficult to run 4k native untill recently with heavy RT games and UE5. I would say running Rt/ue5 games at native 4k could be considered cutting edge but most pc games can easily be ran at native 4k even with something like a 3060 ti. I use a 6950xt personally right now. I think thats why theres so many people complaining about 4k performance, because it used to be much easier to run and only recently required a big bump in GPU power. Most people that have a 4k display dont want to downgrade their display, but also dont want to spend $1000+ on a gpu to keep playing at 4k.
Thank you for this video, I'm glad that you didn't went all in bashing my comment that was written when I was a bit emotional to put it simply. As for agreeements, there's a very recent example, frostpunk 2 that was released about two weeks ago. Developers of that game, 11 bit studio are one of official partners of AMD, and both sides are very open about it. And guess what, 4090 with 7800x3d barely reaches 40 fps at 4k max settings, while 7900xtx with same cpu can hit 60. How something like that would be possible? We know for a fact that 4090 is reasonably more powerful gpu, and it's just one example, a city builder AA game, what if there are other examples like that we just don't know about or haven't noticed yet? And I'm not talking about RT or upscalers, I'm more interested in real performance, not something brought in from thin air. There's Alan Wake 2, I personally haven't played that game and likely never will, I only had seen some benchmarks. 7900xtx gets 44 fps on 4k ultra average, pure native, no RT, 4090 gets 50-60. From what I've seen I can't say that Alan Wake 2 is a more graphically complex game than cp2077. cp2077 has lots of neon lights, futuristic objects, crowded cities with npcs, traffic, interactive points, etc. What Alan Wake 2 has? A forest with trees, a lake and a village? How it's possible for that game to be more demanding? What about Wukong, a game that was made by a small chinese studio, it doesn't look that groundbreaking as well, yet even 4090 somehow can't handle it at all. The reason of why I even wrote that comment is quite funny. When I was choosing between getting 4090 or 7900xtx + good 4k monitor, i picked the latter. The logic behind it was the following: as a gamer I don't really care about high budget AAA games from the most hyped studios, 90% of my steam library consists of japanese games, and they do not really feature any gimmicks such as RT or DLSS/FSR, so the most logical choice would be to pick a good gpu that doesn't really excel and those things either. And yet, there's a new MH game from a japanese dev that now can't be run at 1080p without a frame generation, a technology that I personally can only describe as self-deception. Hence my frustration from seeing seemingly healthy part of gaming industry being now plagued with FG/upscalers in system requirements as well.
12:57 Well that's not entirely true , some games/ engines have RT baked in by default that you cant turn off , so the argument of '' turn off those features'' dies there
I can't listen to this whole video about a comment like this cause it just frustrates me to hear. People acting like GPU's are different than literally anything else in tech, that within a matter of days or weeks are obsolete, is somehow not a normal thing. This happens with all things tech. Just because you have an unrealistic expectation for how fast or slow technology should advance in the world, doesn't mean game devs are all of a sudden conspiring against consumers along with the GPU manufacturers. There are a lot of aspects to graphical fidelity that are not noticed between games by the average consumer either, which can affect the frame rate and performance an end user gets. Should be happy that things get better and better as quickly as they do, rather than complaining because you can't afford or just don't care to upgrade your GPU to something newer or better for the type and quality of games you expect to play.
Agreed, i pointed this aspect multiple times myself. But there is this... let's say "zeitgeist", in the gaming community, where people somehow colectively think that if they complain all the time and like each other these type of comments, gpu manufacturers will sell them flagships at 300$ that will stay highly relevant for like 8 years. I'm exaggerating obviously but you get the point.
@@adriancioroianu1704 Social media is the problem. They think they're correct because they upvote each other so they just sink into this delusion and seek out confirmation from each other constantly. Also they are technologically illiterate so that doesn't help. They have no idea what even goes into a game.
people are complaining because they see very little "progress"/"improvement" vs the hardware needed to play it. people dont complain about their old gpu not being able to reach some new graphics level, they complain when games that Dont look any better at normal quality settings all of a sudden require 2x the graphics power.
If I remember correctly, there was an issue with hairworks in The Witcher 3 , Tomb Raider, and also in some Final Fantasy (XV?) (where a herd of sheep that you couldn't see but was somehow always calculating the fur with hairworks) tank the fps. Regarding the newer games, there were few games that you couldn't really turn off the RT completely (without editing config files / modding the game). In most situations, those were patched, but still, initially you had a time window where the game underperformed thus you were "encouraged" to get a better gpu. Not sure, but I think Warhammer 40k Darktide was one of them, but don't take my word for it. But from my understanding the comment was more aimed at new games that are just not optimized. Like Dragon's Dogma 2, Hogwards legacy, Jedi (the last game forgot the name), last Star Wars games (Outlaws?) cpu issues in some locations (usually towns), but realistically there is just no reason for that (when compared to other games that have more npcs, more complex environments, with more geometry, activity, shadows etc etc). Generally, there are some tech "wizards" that can make a game look phenomenal and not have high hardware requirements. Some "AAA" games can't match that level of "wizardry" even with higher hardware demands, thus is kind of normal for people to be frustrated because someone is asking for more and gives less. And considering how nGreedia was behaving, blaming them (due to new generation being released) doesn't seem anything out of norm. If you remember, last nVidia gpu gen (when 4000 series was released), first few official drivers from nVidia were benchmarked to check if they didn't "tank" the dps of 2xxx and 3xxx series (in order to make 4xxx series look better). But my personal opinion that PC is too complex to optimize for if the game wasn't built for PC initially. Devs just aren't going down that path in depth. Probably they have a relatively small budget, check if there aren't any major slowdowns that happen often and that's about it (so technically, "showstoppers" are fixes, everything else is too expensive to fix and you hope it will not ruin the sales). Was it our favorite Fallout Dev (Todd Howard and his 16 times the detail) that said something like: "just buy a new pc"
Why are you so protective of GPU Manufacturers and Game Developers/Publishers ? Are you saving your paycheck in the back? When you are yourself seeing a 4090 getting 23-24 Fps on a Game even if maxed out.. You don't think that's a problem? Especially, when the visuals are nowhere to be seen, only the Fps tank. Stop ignoring the obvious...
An additional component in this is that many people remember just the feeling of the games looking awesome (and for their time they did) , but if they actually played it today without any mods, many people would be surprised at how much worse it looks then they remember/ compared with new games)
My memory is hazy, but I do recall something to do with topology in some older games being designed to heavily favor Nvidia cards over AMD/ATI cards. The games were designed to use (as you said) gameworks features which gave significant performance drops to non nvidia cards. (Fallout 3 performance is coming to mind but it could have been an ESO game)
I would agree but my RTX3080 8GB has done me good I do feel like developers have been doing a trend of optimization a game months or even a WHOLE YEAR LATER or even never, I also thought it was common knowledge that higher end GPUs where price > performance
i feel like games are x4 or x5 times more demanding but they doesnt look x4 or x5 times better
no its optimisation thats x4 or x5 less budgeted for
That is soooo true like , i still believe graphics in 2016,2015 are still incredible and we didn't want no ray tracing or path tracing to F up your fps to show you good lighting, there was no such technology and games were still beautiful
They often look worse than games from a few years prior.
@@lovelyghost81 the trick is the "Turn off " button in options for rt -.-*
U can make a rtx4090 explode on minecraft by just turning rt settings up to insanity just one example of a lot.
Atm the rtx 2060 a SIX, 6, S I X year old card can run 95% of games at 60fps constantly.
And of course consumers always get abused, cause they behave like...not smart people..
No games nowadays look better then games before
I will say this: people modded Black Myth: Wukong in order to be able to use the cloud you receive in and is exclusive to Chapter 6 that allows you to fly throughout the stage, in all the Chapters. What people found were TONS of fully-rendered geometry that the player would NEVER normally see during a standard playthrough, including an ENTIRE SMALL CITY that was, again, FULLY RENDERED during gameplay, which had nothing in it and could never be accessed by normal means in-game. So there might be a *little* bit of truth to the whole "invisible bloat just to tank framerate" claim.
Like what happened with Final Fantasy XV that was rendering things WAY outside of the visible map.
And still you will hear people defend such awful things with the line "get a better system"
If you were looking at the object, how was it "not meant to be renderered?" Occlusion culling is view based. You literally cull objects by not looking at them. Unreal per frame builds a complicated DAG like object that it uses to do multiple types of culling.
It's not just that. Because unreal engine 5 is an unoptimized mess despite all the sellouts telling you otherwise, they broke the rasterizer renderer to force in TAA and blur in everything, and they bloat the render pipeline with a huge amount of useless mess, so much so people are now noticing the issue and have to create heavily modified versions of unbloated unreal engine 5's engine to remove the dependency of TSR and TAA for everything, you never wondered why other engines like frostbite doesn't need a stupid amount of supersampling and antialiasing to look nice and clean ? ue5 also introduce so much bloat in the TAA pipeline that every game now has ghosting in gameplay movement, and you can only use DLSS and framegen to counter balance the lack of clarity and bluriness it creates. unreal engine 4.21 ran 5 times lighter with more demanding Antialiasing methods then ue5 with TAA and dlss quality applied because it's a huge mess.
Nvidia absolutely colludes with unreal engine to bloat ue5 with heavier and heavier rendering pipelines that look really bad UNTIL you use a combination of forced dlss, taa / tsr and framegen.
Never attribute to malice what can be attributed to incompetence. Why would they have a fully modeled city if they just wanted to add invisible bloat? It would be easier to create a coconut with 1 billion polygons and hide it somewhere.
Optimisations absolutely do exist. I will give a simple example:
Forza Horizon and Forza Motorsport. Two games made on the same engine by different developers.
Forza Horizon: Absolutely masterpiece of game optimisations. Game runs at very high and stable frame rates while having one of the best graphics of this generation. And it does that in massive open world without any load times or visible LODs
On the other hand...
Forza Motorsport: Game runs like absolute garbage. Requirements are absolutely insane. 10gb vram for 1080p... And on top of that it looks like it came from 2012. While having no open world... Just one track to render. How is this even possible. It's the same God dam engine
And all it would have taken is borrowing Forza Horizon's best engineers to consult and provide feedback for a couple weeks I'd bet.
As far as we know, Playground Games is a talented studio while Turn 10 has become the 343 of the Forza series by way of Microsoft propping up a game with individual contractors that only work in it for 18 months.
That's not taking back the fact that Horizon 5 is the least well-regarded game by the fans, optimization differences aside.
Forza Horizon pays for that optimization with huge VRAM requirements. Anything below 8GB is iffy even at 1080p.
But IMO that's a good thing. AMD users aren't complaining lol
Horizon runs really great. Never dropped below 60 fps on native 4k on 3060 ti with medium settings. It’s really impressive
uunless they reused a lot of the previous codebase, there's no guarantee what's gonna come out with the new one
Games hardware recommendations with frame gen on should be illegal!
Yes that crap is useless at less than 50FPS and if you have 50 then you dont need it :)
@@evilpotatoman9436 I wouldn't turn it on even if i have 90 or 120fps, that crap comes with latency cause it has to hold the next frame for calculation. That means you will always be seeing 1-2 frames late on the screen. And yea, below 50 or 60fps its literally trash, if increase the already low latency and gives you garbage visual artifacts.
Yeah it is criminal. I prefer reducing resolution to play in native res instead of running frame gen, because the input lag/latency drives me crazy. That and DLSS/FSR. The flickering and latency makes me feel like my brain cells are committing mass exodus
Not illegal, just poorly optimized or trying to push the latest and greatest graphical options. But, if a game needs super high specs, then it better look super and next level with the visuals.
@@skychaos87 In select games that are locked at 60 fps, it's a nice bonus to have.
... That said, games being locked to 60 fps should be illegal.
In the case of tessellation, in crysis 2, they made the ocean render nonstop without culling, under the entire map, thus AMD/ATI cards would struggle, though it was used at a lower level under the map to remain well within the capabilities of Nvidia cards at the time.
those game studios never made "mistakes" like that before, until they were sponsored.
Doesn't work the way you think it does regarding water
Lol i just said the same thing, i didn't realize you beat me to it.
For the water issue, It is hard to tell 100% of the failure mode, as there were also various areas in the map where the tiniest of a gap would also cause other objects to render under the map, but for the water there were tests where people would look at an outdoor location with no visible water, then disable water and watch frame rates increase a good amount. The correlation for that in those cases was the presence of water under the entire map, though it could also be due to visual glitches where sometimes there are areas where you can see through the ground slightly which then causes extra stuff to render.
Mistakes like that are always possible, but in many games, the bottom of the map is a bottomless void of nothingness.
That's possibly why AMD added driver level control over tesselation.
This is not true, despite being a very common misconception. The debug tessellation wireframe mode enabled for those screenshots disables culling, it wasn't rendered in actual gameplay. This was debunked by crytek devs a decade ago, I don't know why it persists.
The problem is that a lot of games don’t look any better, but need way more graphics power than a lot of older games
All games have a performance target relevant to the time they are made in. Most games run pretty similarly. That some will manage to be prettier than others is just a fact of life. Older games had the same issue. You're probably comparing the prettiest older games to the ugliest modern games, instead of looking at how the uglier old games looked.
@@albert2006xp I think its more an optimization issue, developers know that gpus are getting more powerful, so why investing too much time and money in optimization - I bet a lot if games would run way better with more optimization
@@IlSinisterothrough but more Optimisation cost more money. It's why gta 6 is being rumourd to be tested on 6 Gigabyte GPU as well while games made with unreal engine 5 was a break through because required a lot less physical work to produce top tier visuals.
The drastic reduction in work allows smaller studios to make games like black myth which they would never be able to afford otherwise, in exchange of allowing smaller devs to make triple a quality games, it requires a lot more graphics oomph,
@@IlSinistero That's nonsense social media groupthink. Why didn't every GPU generation in the past 20 years make optimization less prevalent? Devs still optimize, because optimization means better graphics and better graphics means better sales. Nothing has changed. Hardware improves on average, graphics improve on average, console generations move forward as well. Games are running just fine today on the average available hardware and in relation to the consoles. The issue is a lot of people have below console level PCs and consoles run 30 fps for quality modes at lower end render resolutions often enough. So they think devs should just wave the optimization wand and make things hit the performance target above that for some reason. Which would mean degrading graphics, because you can't really double fps from optimization.
@@albert2006xp I would agree if non-first party games would at least be running in perfect 60fps on consoles. But they don't and that enables Sony to sell us quite pricey new hardware, too with the PS5 Pro.
Also, look at CP2077. This game has SCALABLE graphics. You can bring the bestest GPU to its limits with it while you can still run it on lowest settings on really old hardware. Developers could always design their games this way.
This is why I levitate towards playing older games... It's just a much better way to play games on PC. You get fully patched games, you don't need to purchase the most expensive hardware to run them properly. Not to mention, they go on sales often and at cheap prices. I have no issue doing this to publicly traded AAA developers, where the money is literally feeding the wallets of insanely rich capital investors instead of the developers themselves. Indies and privately owned companies however, I usually pay full price at day one. :)
This. Supporting indie developers pay biggest dividends to gamers.
Literally 100's of high quality games anyone could play on a potato PC. More games than we could ever play in our lifetimes. There is definitely no need to spend big on a PC if choose to play older games which are on average better than modern slop anyway
@@miguelpereira9859but red dead redemption 2 and gta 6.
he said levitate
cant wait for the juicy rtx 5050 8gb with performance of a 3060 Ti
"$300 please" Nvidia would say
8gb? What??? Why you being so greedy man?? The 5050 will have 6gb of vram and thats more than enough to run your games at 720p.
@@sebastianandres6802 $400*
I mean. If it was $150 that wouldn't be too bad. A 4060ti for a better price would be cool.
it's nvidia, so it will be the 5070 8gb with the performance of a 3060 ti, remember every generation the product stack drops a tier now
Successful consumer marketing is about creating a need you didn't know you had, then offering you a product to address that need. It's the wonderful world humanity has created for itself.
Wonderful comment. Create a problem, then sell a solution.
summed perfectly!
This is exactly what ray tracing (and demanding games in general) are doing. Anyone who doesn't think devs and gpu/cpu makers collude to ensure games are more and more demanding every year is delusional
If you're interested in this topic, you should watch Century of the Self if you haven't yet, a documentary about Edward Bernays (Sigmund Freuds nephew and U.S government employee). It has been uploaded to UA-cam several times since it was released. Not turning it off early might mean understanding how the concept of money has replaced certain forms of interpersonal exchange in society and also not having a reactionary fear of anything that suggests Capitalism and consumerism has some ingrained and inherent problems within it.
It's 'Mad Men' (if you remeber the show), but a little more complicated than that.
edit: also, what's of concern to me (in my job) is how this form of modern consumerism has very distinct relationship with how things were developing in 1920 and 30's Germany at the time. It belongs to same system of propaganda with supposedly different ideals, the outcomes of those systems lead to the _inevitability_ of certain characters to 'ascend'.
@@eustacequinlank7418 Thanks for the perceptive reply. I've watched Adam Curtis' documentaries over the years, including Century the Self. Also enjoyed his more recent one: Hypernormalisation, which gives some fascinating insights into the origins of Trump.
As you implied, Bernays' ideas helped create modern consumerism by applying an understanding of psychology in order to manipulate people.
For example getting women to take up smoking by linking cigarettes to women's suffrage and calling cigarettes "torches of freedom".
It worked, and now people's identities are for the most part tied to their consumption patterns.
Crysis & nvidia did it before. They had extra tessellation outside the game map to hinder ati/amd gpus.
I think you are talking about the Dx11 patch released for Crysis 2 after the game’s official launch, in which case the issue was more related to “not very optimized tessellation.” In my opinion, this is unfortunate but forgivable.
Same thing with Wukong... Graphics don't look that good to cost so much in performance. The RT doesn't even look good and games from 6 years ago have better graphics. Atleast Cyberpunk actually looks good.. Wukong not so much. RT seems even more irrelevant outside 3 NVidia games.
@@SemperValor I play wukong in a 6600xt, just turn off RT. And the reason RT kill Radeon GPU is because AMD still sucks for Ray Tracing, and FSR is not good either
@@gerardotejada2531 Its not about RT performance RT is always on In Wukong even when its turned off. Im just saying for how demanding the game is bringing a 4090 to 24 FPS for example it doesn't look that good graphically to cost that much performance. Compared to games from 6 years ago. Also RT on or RT off in Wukong looks the same. Its not like Cyberpunk where you actually saw a difference. Wukong there's no upgrade graphically on or off. Pause the video and look at 25:13 those graphics/textures look worse than games from 2015.
It was not crysis, it was hawx 2. A jet fighter game with massive tesselation on the ground.
They also did it with physX in batman. The smoke used physX to fuck over AMD, but it was also tanking performance for nvidia. BTW volumetric smoke don't exist anymore and it is a very cool effect.
I remember crysis runing very well on my AMD GPU. On par with similarly priced nvidia ones.
Optimization require competent developers and with the way the industry work right now, they are driving the best away and releasing rushed crapware.
There is optimization for most games. Issue is pc they push them farther. Console, they get it to run. On pc, they crank it and push it hard. Listen to call of duty devs. They always have to hold back what they could do for optimized 60 fps.
Except there's plenty of examples of amazing work done to bring graphics to the next age. Think what it took to get path tracing in a game like Cyberpunk to work and be even remotely playable. Or what it took to get a game like Alan Wake 2 to work, that's like a 2030 game without heavy optimization.
@@zagorim7469 optimization is not magic, optimization requires cutting the graphics fidelity
They're spending more time and energy on marketing than the actual game, and a shining example of this is Cyberpunk 2077 itself! Never forget that 2020 launch! Now they've spent nearly 4 years updating and fixing a single player game
The problem is Temporal AA/SS. Parts of these engine pipelines are not getting proper optimization and they just say "hey, it cost a lot, but we'll save cost by letting TAA fix up the resolve" when that has 2% percent impact on performance and we proved that.
Nvidia profits the most and AMD/Intel have been to busy competing in a area they will never win at. They can only win if they work on more optimized workflows and integrate optimized effects in popular engines.
The dude that bought a 1080 Ti back then for 1080p "future-proofed" his PC up to 2024, the dude that bought a 4090 to play Cyberpunk 2077 at 4K will need to buy a 5090 to play Black Myth: Wukong and Monster Hunter Wilds at max settings, the card is what? 2 years old??? Future-proofing is dead.
I was a dude that bought as GTX 1080 (not TI) to play at 1440P. I upgraded to a RTX 4090 when it came out. Still gaming at 1440P but now on a 240 refresh OLED. I hope I can keep this a few years. At the very least until the RTX 6090.
That’s my bro lol I gave him my 1080ti back in 2020 and he’s still rocking that card but he does play at 1080p and never looks at settings or frame rates 😅 so he’s happy. I however am a degenerate who loves playing at 4K on a OLED screen and having the latest GPU
future proofing was never a thing
It's not true at all. A 4080 will run mhw at 4k 60fps or more.
But the 1080 ti wasn't running games maxed out even at 1080p. It can't do RT for starters.
I use to work in the Corporate world. Don't underestimate two corporations colluding together to do shady things.
no fucking way
@@ba.atiste Yes way.
99.99% sure AMD and Nvidia are colluding
And to tell you what I am not Surprised at all
Facts look at doctors and the pharmaceutical companies
There is Doom, and there is everything else.
Those guys kick ass.
Agreed. Doom Eternal looks amazing and runs like butter even on the Steam Deck. Like wtf, what are all the other devs doing?
System Shock Remake also looks great while having insanely low system requirements.
@@koerel100%. All other devs seem to like bloatware instead.
Modern resident evil games were well optimised. Not like Doom but still good.
It's also worth mentioning that as hardware gets faster, a lot of software optimization work is going to be skipped, that would have been done (probably out of necessity) in the past. That's how a new game can come out running worse than an older game that is otherwise broadly comparable in terms of rendering fidelity. This might also have a broader side-effect of making the skills of writing efficient software more and more rare among software engineers as the demand shrinks across the industry. This can make it really difficult to build an optimized game like Doom 2016 because you just can't find people with the necessary skills who aren't already working for prestigious companies like Epic, iD Software, nVidia etc.
There is also more emphasis on creating stunning models rather than relying on ray tracing to do the heavy lifting for mediocre content.
Nobody talked about GTX Titan losing 4K 60Hz support with a driver update. So it wouldn't surprise me to see these kinds of driver updates.
What the actual fuck? That was a thing? Nvidia is a disgusting company
Bruh
Look how they massacred my boy
how? its my first time hearing about this
@@kochissMe too
UE5 pushed by epic is likely one method they are doing with game devs, especially in cinematography.
You see many game devs not using the engine properly but also avoids using older engines that works fine like UE4.
It always makes me laugh when UE3 Arkham Knights looks better than games that came after on UE4 and now UE5 and have higher hardware requirements... (Gotham Knights, Suicide Squad)
@@Micromation tools can become very sophisticated but they are useless if the person using them is incompetent. No engine can make up for lack of creativity and vision required to create entertainment media.
@@pretentious_a_ness I'm taking just about visual quality and performance. The lack of creativity and vision is completely different subject.
@@Micromation Meh, to a degree creativity and vision is involved in visual quality and performance. Solid art direction and some style will forever carry more weight for longer, than even the most cutting edge of fidelity.
The reason why Nanites and Lumen exist is to 'reinvent' LOD and Lighting, to be 'one click' so you can turn your brain off sort of function so that game devs don't have to think about how to do lighting or level of details.
But due to this, Epic can set UE5's Nanites and Lumens' default state much like how Nvidia can set the standards on how much 'hairworks' needed on a character.
Hey look we got a new tech, lets spam it so it kills hardware performance! Lets make it standard so everyone kills hardware performance.
It's outsourcing. The engine developers outsource. The publishers outsource. The developers outsource. Teams are spread across the globe on top of this. No cohesion.
It’s unrealistic to expect game studio to invest billions of dollars to just develop a game engine before making a game and sell you a game in $70.
This, and the fact some AAA studios just fire veteran devs and artists to hire unexperimented ones in order to pay their employees less. Cost-saving measure.
French people pointed out that Ubisoft basically hired people based on criteria which were absolutely not related to work experience, but that can also be ultimately the choice of the politics themselves, because they want to promote "diversity" at work.
@@rinsenpai135Yes and now we see what they have become......turns out their approach wasn't future proof😂
Lol I work for an outsourcing company, in Europe, the amount of shit that American coders write is astounding, the length of introductions that they make about themselves is long enough for them to tongue fuck themselves.
Not all outsource is bad, HALO was outsourced for a decade, then with Infinite they brought everything back under MS due to sanctions imposed onto the CIS countries and what did it lead to? A DOA game with barely functioning live service model.
No it's consumers. Lol if you as a business can make the same amount but spend half on QA you will do whatever gives you the most profit. It's up to the consumer to say no. But scalpers reigned for a bit and manus saw that and said hold my Starbucks. This is all OUR own fault. Not theirs. Until we own this it won't change.
Games used to be developed primarily by people who were coders or understood coding on a semi-deep level. Think of someone like John Carmack who would grind away at code for days and nights to optimize rendering. As time has gone on, development direction and design has shifted towards generic game engine suites without any real priorietary design or implementation. You no longer need to understand the backend foundational code and can just move assets in, out, and around, add all sorts of built-in effects and tools with the click of a button. In addition to this, development has shifted towards "designers" and marketing, where coders are simply employed to implement the desires of leadership. They may offer up suggestions to optimize performance, but if the leadership doesn't see value in this, there is no green light. Basically if the goal and expectation of the project is dictated to be 60 fps @ 1080p then why spend time and money? Just hit the metric you were given and move on. Many of these major developers see 4k, ultrawide, high-refresh rate, etc as niche enthusiast requirements. We live in a bubble and they live in a different bubble.
I can tell a game is using UE5 not by the quality of the graphics, but by how poorly it performs, lol.
This is why Sony exclusives perform so well on PS4 and PS5 in comparison to there graphical levels because the studios devs still optimize the game as much as possible while running bespoke engines.
@@fadingdimension but are they wrong about 4k ultra wide high fps? Hell, are they wrong even if we take each one independently of the others? Look at the steam hardware survey and then tell them they're wrong.
Game development is going backwards. We used to have absolute bangers of studios and devs who made legitimate breakthroughs. We may have higher quality graphics now, but the development quality has been a steady slope downwards imo. Usually due to decisions that come down from the top of the corporate ladder that affect everyone. Now, the problem solving ingenuity is gone and FSR/DLSS/TAA/Whathaveyou are just shorthand band-aid fixes. No more creative solutions to real hardware problems. It explains why we don't have idtech-quality engines in 2024 where other tech has somehow advanced.
I miss Carmack being in control at ID, the days of quake 3 level editing mod scene with radiant and a full operating manual. I almost ended up working in games design but saw the industry was becoming suit run operations. As soon as the suits got control it became about marketing, hype, false promises (Todd Howard/ Peter Molyneux syndrome), profit has become the goal over quality. It seems we are living in a time where the suits are so disconnected to popular culture they don't even know what the masses desire any more, the culture war and DEI hiring hasn't helped but whatever, that is a totally different can of worms.
I was yesterday playing part of the Battlefield 4 campaign. A 11 year old game you can run on a potato. Compared to recent games the graphics have barely improved but the hardware requirements have exploded. What happened?
Well if you only play PS4 ports and GAAS stuff, obviously you will think graphics haven't changed.
@@jorge69696 BF4 was never a port of PS4.
Still play multiplayer, it was one of the best games from Bf series. And BF4 still looks better than 70% of the games made today (for me).
@@sorinbanu3360
I played that game for over 2100 hours!!!
@@jorge69696 Bruh
I believe it was crisis 2 that had crazy high tessellation that killed FPS on AMD GPUs that AMD had to release a driver to have a hard cap. Tests showed zero impact on quality but performance was tripled... That you could say it's invisible bloat! That's also a EA game, a shady company to begin with.
Hawx 2 put tesselation on the ground to tank AMD performance. Batman put physX volumetric smoke to tank AMD performance.
@@PyromancerRift PhysX shit is optional though
@@PyromancerRiftmakes me question is there something going on with raytraicing when using amd gpu, like whenever game detects amd gpu it cranks up ray bounce count to tank performance etc.
I use the driver level tessellation setting, set it to x16
@@Tom3kkk PhysX could be run just as well on AMD hardware, nvidia simply blocked it and forced amd systems into running the code on the CPU side.
@@Tom3kkk I remember the PhysX blood option in Borderlands 2 is absolutely broken no matter if you're on amd or nvidia
I don't know what is happening but something definitely is going on. DLSS should be used for $250 cards to give em a little umph at 1080p for really really low end cards. And for making old gen cards last longer.
Somehow dlss turned into an essential feature for brand new $600 cards. Or +$800 cards to run everything on high/ultra RT for brand new games at 60fps.
ya i was always weirded out by people buying xx80-xx90 tier cards just to turn on dlss.....
Simple. AAA gaming now sucks.
AAA gaming was the crypto before crypto. Companies thought that if we just push for "wow cool graphics" people would flock to the games.
Then they noticed that it's way too much time to create games with high graphics fidelity. So they rushed devs.
They miss deadline after deadline. They lay off workers to save funds because the game is not even out yet and investors are getting angry.
No Man's Sky. CP2077. Warcraft 3 Reforged. Gollum. Concord. Overwatch 2. Dying Light 2.
These are just some of the examples of dogshit games due to poor development, and GPUs not being able to keep up is a symptom of this issue.
@@hachiko2692 gaming ended around 2010. Before that, doom 3 half life 2 stalker far cry 1, battlefield 2, burnout takedown, left 4 dead 2, FEAR, halo 1-3, demons souls, GTA 4, max Payne 2, borderlands 1, cod mw2, chronicles of Riddick game,. Prototype which was an original concept.
First PC starts to get good, then Xbox360 killed gaming, cool games but made for casual console gamers. Now all games cost 100 million+ empty open world. No "levels" at all
@@themodfather9382 Jesus you all sound so depressed. There has been tons of amazing game's since then there's more good games than most people have time to play
@@mojojojo6292 And none of them aim for any graphic fidelity awards.
So we don't need the latest RTX 69420 or the AMD Ryzen 6663D.
I enjoy games. Not the "flagship" ones. This discusses those said games because they're bound to GPUs. Shut up now. Thanks.
Short answer: Yes. Long answer: Fuck yes.
This is one of the biggest reasons why people just play older games. Because playing a new game would cost hundreds if not thousands of dollars. And considering the game quality today it's not really worth it. Just boot up one of your old favorites and enjoy yourself with the PC you already have.
Facts. I grew up with “bad graphics” as a 90’s kid. The other day I got pissed off at modern games being broken so I went and played Super Nintendo on a CRT tv. I had a great time. I might never upgrade my PC at this point.
This. I will keep my current rig until there is a game I really want to play can't get at least 1080p 60 fps high. There are many older titles to keep me busy. At this rate I will stick with what I have for a long time.
Bs. The real reason is that we are finally get games for current gen consoles.
I can play new games just fine but I play old games because new ones are just bad for the most part
New games are not automatically better.
If devs stopped using xx90 cards to optimize their games we would have lot less issues with terribly running games. Make the performance ceiling lower so mainstream GPU does lag behind a bit less.
But then you’d get morons complaining everything looks the same and there’s nothing interesting graphically anymore.
Boundaries should be pushed in tech. You don’t need to the newest tech. You can wait if you can’t afford it. Same as a car. You can buy a brand new 2025 model for 60k or you can wait and buy it in 2028 used for 22k.
@@ancientflameswell right now games look the same or worse so what's your point? I'm sorry but best looking titles from 7 years ago run and look better than current titles with lowered settings. And at max settings differences are truly minimal. This is not pushing the boundaries, this is curring corners and making customer fill the gap. It is what it is.
@@Micromation really not true. Go play Alan wake 2 at max settings. The path traced lighting is astounding and really brings the whole package together. An excellent show piece.
@@ancientflamesdude they still by enlarge look the same. 2020 was the peak of actual significance in graphical fidelity. I prefer that performance goes up instead.
@@Chibicat2024 that is your personal opinion. I agree with you on some games, but many games that use ray tracing and oath tracing were not possible on gpus from 2020. And they look astounding when built from the ground up for the tech.
No Daniel, game developer's are destroying game optimization . A simple "Hello World" program now requires 5 Gb of hard drive space, 32 GB of RAM and a 4070
Uncompressed everything. The dirtiest code. And forced by the company, closed source features injected in to the game at the end of the development cycle by NVIDIA.
Not everyone uses electron, eh…
400GB of wasted file space for languages the user doesn't speak on every installed piece of software.
@@MorMacFey-v2g And a lot of lazy scripting languages like Java and Python that take 10 times the CPU cycles to execute a subroutine that C++ can do in one
@@PassiveMoney1979
Bad language choice
Bad hardware utilisation
Bad optimization
I want to say Crysis 2 was one of those examples. If I’m remembering right, when run on DX11 it had an insane amount of tessellation on a ton of stuff that wasn’t even on-screen (including an entire water plane rendered below the map at all times)
items rendered in the entire world. there was no limiting draw distance as the draw distance was basicly to the end of game map... despite the insta kill black wall they put up
Crysis 2 wasn't as bad as crysis 1. At least you didn't need top end hardware for it. Current gen hardware wasn't enough to max out crysis 1 at the time
@@mikeramos91 Well unfortunately their prediction of cpu reaching 10ghz and above with absurb single thread performance did not pan out.
This was also intentional, as it was Gameworks forcing it and Gameworks at the time was a closed source blob you just attach to your game rendering pipeline and it added effects. So devs couldnt even tone down the tessellation if they noticed it and wanted to due to nVidia.
@@anemone5870 10ghz cpu is possible but more practical to spread the load on different cores
Before I upgraded my PC, I have tried to run some of the newer games on lowest settings - and what I noticed, that older games, that actually had "crappy old graphics" vastly outperformed modern games in this kind of "affordable settings" scenario.
Modern game on lowest quality, that allegedly has "numbers" somewhat close to "old games default quality" looks like absolute garbage, while old games looked decent.
So, I can see where this idea of "invisible bloat" is coming from.
At the end of the day, I CAN NOT actually "avoid using latest stuff and tech" - I have to at least hit the "mid settings". The picture would absolutely fall apart in the new game otherwise.
So apparently there is a software bloat going on, just like how some software, like discord, for example, is created with a tech not really meant to be efficient, but developers are kind of: "ah, screw it, modern PCs will handle it".
It much simpler than that. What Nvidia and AMD are doing is the oldest marketing trick in the world.
I made a video comparing far cry 3 and gray zone warfare and my god the difference is insane. Gray zone looks like ass but also runs like shit.
Far cry 3 looks beautiful and it came out over a decade ago
"that" *moves the camera over the word* "family friendly show" LMAO. That was golden
I remember when we had the GTX 1080 and we thought we were right on the cusp of a 4k standard resolution. Just waiting on a mid-range card that could run every game at 4k like the 1060 did at 1080p. It's been 8 YEARS since those cards released and we're still no closer to having that than we were then, and games from 2016 still hold up phenomenally to games today, many having a much clearer image and cleaner presentation overall due to the rise of TAA. Oh yeah, and now the 60 series cards are like $400 instead of $200 as well. The graphics move marginally forward while the value sprints backward.
i remember thinking the same thing ,when telltale games was a thing . 2D games with absurd system requirments
4:32 AMD did deny it, Frank Azor literally said “If they ask us for DLSS support, we always tell them yes.” and “If and when Bethesda wants to put DLSS into the game, they have our full support,” but hey, that wasn't news worthy I guess. The way news handled the "FSR exclusive" was really messed up. The only semi decent coverage was done by Gamers Nexus and even they did not come back to check up on what Frank Azor said.
Journalism integrity is dead all over.
Congratulations on 200k followers. You're growing fast
If just turning off RT was making any difference nobody would complain. The problem is when you turn game on ultra low, it oooks like garbage and still runs like absolute crap. Most new games on low have unjustifiable hardware demands. Some games on low look worse or similar to titles from 2015 (!!!) and have 4-5x higher requirements. It's bananas. Go turn on Remnant 2, switch to potato mode (it's absolutely fugly) and tell me your performance... Games from 2010 look and run better 😂😂😂 i refuse to buy dogshit that can't even work with decent framerates (120+) on top end hardware in 1440p ultrawide withou framegen and upacalling, that makes virtually no meaningful gameplay improvements - only thing that ever makes any advancements is in graphics department and year on year these improvements become ever so miniscule. AI is as bad as it was in 2001. Audio-wise games still can't get audio positioning right. Mechanically games are as dumb as they were in 2000 and some even got dumber and open worlds as empty as they've ever been. There is crisis in gamedev... a crisis of competence.
I have been playing some Battlefield 1 multiplayer lately and it is amazing to me how there are plenty of times when I wonder how it is possible for a game in 2016 to still look completely on par with current releases. If you increase the texture budget to get crispier details there is nothing else to object to. The frontier that game developers refuse to handle is interactive, physics and such. But hey they will give Ray tracing trash bloat instead of baking in lighting that gives basically the same results. Developers are lazy.
Maybe you should show them how to do It.
@@Chibicat2024 I think first game that eclipsed Crysis visually was Battlefront 1 from 2015... It took them almost a decade 😂
@@gerardotejada2531 why should I do it? You think just because I don't make movies I am not able to criticize them for blatant displays of incompetence on the screen? They've had competent senior developers working for these studios - they've been either pushed out or left on their own volition. None of the people that made games that rose those studios to fame are still working for them and it shows. It shows when they leave, start their own studios and still create bangers.
@@Micromation Battlefield 1 STILL to this day blows games from 2024 out of the water visually.
The selected comment is mostly spot on. Devs need to get back to games that have a unique art style that can be rendered on relatively modest hardware instead of making games that try to look like over-sharpened videotape and require a 4090 to render properly. It's not just about needing more robust GPU and CPU. Gamers are also being pushed to buy 1440p 240Hz monitors that are way overkill for running AAA games at appropriate settings on the hardware they can afford to buy. It's a travesty.
Valheim and Dishonored have timeless artsyles
Partially correct, they have engineered themselves into a pickle with game development and asset handling.
In the old days developers had to build 3-4 models of the same object they wanted in the game, for the various distances and detail levels to retain performance.
But that takes up expensive artist time, it is cheaper to just buy the detailed model and shift it onto the consumer to "Buy better hardware".
Of course, this causes GPUs to waste 75% of work due to the quads of pixels they work with, there are videos that explains this in more detail.
Couple this with the smeared graphics created by frame gen and the use of TAA, and you have really shitty performance with substandard graphics, since the developers are out to cut cost and development time.
Couldn't have said it better, there's a channel that talk about this issue for those that could interest, it's ThreatInteractive
You know they can automate LOD creation now? Even blender can create a lower level LOD with a single button called "decimate" and the industry standard Autodesk Maya has a single button that generates the entire LOD group. Games like Skyrim have automated tooling to generate all the lods and billboards needed for a given area in world space and that is just a 3rd party mod tool. You don't notice LODS anymore because they now will take resolution into account to hide them better. With very few exceptions, all AA, AAA, and 3D indie titles will use LODs by default. If you don't notice them then that means they are working correctly.
nooo you got it wrong, its not laziness its malice. they want you to expend the last cent that you have in your wallet
@@11cat123 this. Automatic LoDs literally negate the need for creating multiple redundant models
@@11cat123 Clearly the automatic one doesn't work that well then, from what I have seen recently the main change in models is stuff "disappears" or is hidden off the model to reduce complexity, but you still see that the base model is the same one.
This is not new or unique to gaming. It's been a plague on software development forever. With more hardware resources and abstraction the less efficient and optimized all software becomes. Why? Pretty simple, why bother when you can just throw some extra hardware at the problem. It requires FAR more expertise and time the other way.
Talk to someone with expertise in assembly and/or complier optimizations. Old days they hated even the C language let alone C++.
This is why I will forever say that probably the most impressive game ever programmed was Chris Sawyer’s RollerCoaster Tycoon which was entire done in Assembly by him alone.
Chris Sawyer’s Transport Tycoon was the best for me, and still programmed in Assembly, and very well optimised.
man that sounds like a pain. but i agree. lots of "devs" nowadays get a certificate if at all in javascript and call themselves programmers because they use UE5 blueprints lmao.
I was so shocked to see low preset of graphic setting in Jedi Survivor made no difference from Epic to the naked eye unless observed so closely by using the complex terminologies of GPU domain in the mind.
The tessellation thing was with Crysis 2, whose DX11 update had lots of unnecessary tessellation baked into the levels (tessellated ocean waves under the ground even when no ocean is onscreen, simple assets like concrete blocks being more tessellated than anything else in a scene). The story came from Tech Report, back when they were one of the best PC enthusiast sites on the web.
It is simple as this: as soon as games started to be “sponsored”, game optimization ended or at least ended for the other team. Some time ago, how games ran was chip and drivers dependent; now they are sponsor dependent…
It was way more fun learning about the og optimization progress curve. Nowadays its just depressing and the ingenuity is gone
Yep & everyone is sponsored by NVIDIA. It's the same thing with GTA 5 too. NVIDIA cards is more smoother than AMD cards on GTA 5 even until now
Well as a game designer and have some experience for optimization from my previous AA projects, my fellows game designer said, we don't need optimization nowdays, optimization is only on the past. 😢
What i cant wrap my head around is the fact that we are at the 3rd iteration of rt cores and stuff and still fps take a 50% or more hit every time you enable ray tracing. Suggesting that improvements were effectively made mainly on the rasterization side of the rendering pileline
The problem is new devs hires are incompetent.
true old veterans are retired! & it shows the huge gap between the old/new games contents!
🤓
Diversity hires
Bunch of woke women 😂
Yeah its called diversity hires.
I don’t think there’s any kind of co-conspiracy going on. To me, this just seems like devs/publishers trying to cut the wrong corners to save on the ballooning costs of game development instead of trying to deal with the mismanagement leading to said costs in the first place.
you must be daft. a true believer 👏👏👏👏
Remember the Order 1886? That ran on the PS4 at the near the start of the generation.
UE5 was a mistake
Nah. It's amazing. Tech always evolves, hardware just needs to catch up
@@ehenningsen Yeah UE is evolving alright, backwards...
@@bayneo77 check out the UE 5.5 demo. Looks amazing
@@bayneo77 You are a freacking donkey.
@@bayneo77 No. GI and nanite are much better in UE5. It takes more performance to make them work but it will age very well because of it.
Nvidia encourage the devs to add more stuff rendered on screens which have zero value gameplay just to ensure we all get "cinematic experience" 😗
😂😂😂
jensen has a lot to answer for
That's some wicked, dirty move right there if it's true.😠😡
0:00 - 0:30 nice intro, especially when he hides the last word with his head. Daniel Owen is a genius.
if you are trying to decide between incompetence and conspiracy, you should always lean towards incompetence
well in 98% theres incompetence, so the question most of the time is, what else is there ^^
In a vacuum maybe, but whenever large sums of money are involved...
Ahuh, that's why nVidia's market cap is $3 trillion. No, it's all calculated.
And why is that?
@@kunka592Because Nvidia doesn't care about gaming.
The overwhelming majority of Nvidia's riches come from companies like Google, Meta, Amazon buying their AI chips so that they can set up massive AI farms.
I'm all for hating for NVIDIA but I'm also gonna shit on your room temperature IQ
I think it's the same sort of scenario as "Hard times create strong people, strong people create good times, good times create weak people, and weak people create hard times" in that game devs now have an unparalleled amount of tools and resources and so they take it for granted; they can make a marketable game with 10% of the effort. Whereas in the early to mid 2010's, devs had to use every single resource at their disposal to make their games run on the hardware. They used tricks like removing polygons from un-viewable surfaces and other little tricks to get every drop of performance they possibly could.
As a result the games of that era still hold up today. The Devs/companies of today however are complacent as hardware and software basically do the optimization for them, and so they just don't need to try as hard to make money.
100% correct.
limitation breeds innovation. many of the stories from nes/sega devs of what they did to eke out whatever performance they could out of 8-bit consoles is equally mindblowing.
I got a 4070 at launch after having to wait too long sat on my GTX 1080 because of the mining boom. I got stung paying closer to an 80 class price for a 70 class card that performs like a 60 Ti class card. But RT was never included in my decision. RT has been out 6 years and is still not fit for purpose IMO, especially for the masses. DLSS was way more of a deciding factor for me and frame generation was a good idea on paper.
I wanted one thing out of RT - realistic mirrors and rear view mirrors that do not make the rendering pipeline scream in pain. Are we there yet?
@@Mamiya645 not yet! 2026 maybe? probably.
If u spent that much money on a GPU to use upscaling and framegen, you should NOT be giving advice. You are clueless.
@@ns-tech1304 I spent that much on a GPU because Nvidia are robbing bar stewards and I was sick of waiting. I don't use frame gen, I said it was a good idea on paper. I use DLSS mainly in UE5 games because UE5 relies heavily on it.
I like it when you talk about things like this keep it up :) Its good to discuss what value these products are giving us and have clear points on the positives of both companies without being biased while also talking about the past!
finally, glad to see you talk about this. i recommend checking threat interactive's video about fake optimization
Currently I'm just hunkering in on my Ryzen 7600x and my 3060ti, if I can't run a new game without buying new hardware to run it decently around 60 fps - well, then I ain't buying the game. Game dev's need to nut up and see the potential of sticking to a tier of hardware for longer periods, then they can also put more time into the game itself rather than spending an unforeseeable amount of time of stuffing it with minute visual upgrades.
The only thing that might tempt me is the upcoming "8800xt", that said, provided it's near 4070TS/80 performance at low wattage, then I'll stick to that for at least 5 years.
Good games over eye candy anytime, there's a reason I've revisited Super Metroid on many occasions.
@danielowentech, you'd get my thumbs up anyway... but the "family friendly show" got my out-loud laugh, as well... 😂
Witcher 3 did this and no one bitched about it. If you didn't turn off Nvidia HairWorks, it would tank an AMD GPU's frame rate.
And if you turned everything off, it ran on a fridge. Can you say the same about UE5 games? Or do you think Remnant 2 in potato mode has any business running the way it does while looking like a PS3 game?
Nvidia HairWorks tanked frame rate on Nvidia GPUs too. I was using it back when Witcher 3 came out lol.
@@PremMajumdarVG Meanwhile AMD had TressFX before them, and it actually ran well on both companies' hardware.
Excellent, insightful conversation here Daniel. Very solid takes 🎉
Congrats on 200k dan
The GPUs should be like 3-4x more powerful to match the current games
I feel like game companies are betting on upscaling and frame generation. They see a lot of people have lower-end hardware. But that's such a backwards way of thinking.
It's like they forgot how to optimize and instead use the tech you named as a way of doing it. Very lazy.
The PS5 has an rx5700xt, there's no way a game needs a minimum 4060 to run at 1080p 60fps 💀💀
Personally, I think of the 'extreme max settings' as 'tech demo'... I hardly ever see any difference between the highest setting (path tracing not counted) and the next highest setting... except for the FPS.
The other aspect that I didn't catch you mention, Daniel, is that Nvidia has gimped video memory so heavily that we're currently in a situation where a PS5 has access to more video memory than most midrange Nvidia cards. Sure, it's unified memory shared with the CPU, but it can make up for that with the hardware-specific optimisations you get on console. A great example of that is how much less RAM Apple products have historically needed for the same or better performance than Android phones with higher specs. PC hardware generally compensated for having less optimisation by having access to greater performance, but that's currently being siphoned away from the mid to low end so that we can be upsold on prohibitively expensive high-end cards.
Because Apple forces game devs and app devs to optimize their apps/games if they want them on the App Store.
Apple's battery life situation is no different.
Compare Arkham Knight with Gotham Knights or Woke Squad Kills the Justice League: the old game looks two generations ahead. And look at the PC requirements, ridiculous.
The 4060 is literally overtaking the RX 6800 in the new GoW Ragnarok.
If only it could beat the 3060. The card it's allegedly supposed to be the next Gen version of. Alas.
With the AMD driver that released yesterday, even a 6700 XT gained +6 FPS average over the previous driver in both Ragnarok and Space Marine 2. That's a 10% boost at least.
I haven't seen or tested higher-end GPUs yet; it could be the same 10% or even a couple percent more.
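For what it's worth, the "+6 FPS is about 10%" arithmetic only works if the baseline was around 60 fps. A quick sketch, where the 60 fps baseline is an assumption and not a measurement from this thread:

```python
# Sanity check on the "+6 fps is roughly a 10% boost" claim.
old_fps = 60.0              # assumed average on the previous driver (not measured)
new_fps = old_fps + 6.0     # reported gain from the driver update
uplift_pct = 100 * (new_fps - old_fps) / old_fps
print(f"uplift: {uplift_pct:.0f}%")  # -> 10%
```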
@@DemiSupremi The 4060 beats the 3060 by like 20 percent in all games. I don't know where this fake narrative comes from. Probably from testing the GPUs at 1440p ultra settings when both cards are meant for 1080p.
@@Stardomplay Well yes, if a card runs games better at 1440p then I would call it a better card.
Keep dreaming lolllll. Even a 6700 will destroy the 4060.
Nvidia's marketing is very brain-dead in my opinion. I don't know why it works, but when they show an RTX 3070 failing to run a game made before the card was, just to show the 4070 running twice as well, regardless of how they got those metrics, they paint the picture that if you buy an Nvidia card, it won't last long enough to even play same-generation games comfortably. They were showcasing the 3070 getting around 25 to 30 fps in Cyberpunk, while the 4070 was getting around 90 because of the RAM, DLSS frame generation, and upscaling. They ran the benchmark at a high resolution with high textures so they could use the lack of VRAM on the 3070 to make the 4070 look unfairly better, even without DLSS 3, since the 3070 was VRAM-bottlenecked.
Game developers love the path-traced, full-RT way of making games, because lighting setup becomes much simpler on their end. They are salivating at the prospect of not having to manually set up lighting everywhere.
My opinion is that these games are well optimized for visuals, in the sense that they get the graphical effects they implement running as fast as those effects can run.
However, any advanced visuals at this level are basically indistinguishable from poor optimization. If you have to throw in all these graphical effects to make games look only slightly better than PS4-era ones, while running at the same if not lower resolutions than back then... what are we doing here.
We're arguing with people in the comments who can't see this obvious reality.
That's a good point. The difference between a PS4 game and a PS5 (or modern PC) game is mostly down to effects and lighting, not AI, pathfinding, super-high-res textures or physics.
The budget is going into more visually noticeable things, but at the cost of massive performance spent on "boring" but transformative stuff like ray tracing, path tracing etc.
What we're doing here is what we've always been doing here. We're pushing graphics up to the performance targets based on what hardware people have/what consoles have. If you don't like it you're welcome to lower your settings.
I think this is worthy of an extensive benchmark. When I had a 3070, an older driver produced at least 5 fps more on average in AAA games. Unfortunately, I had to update to a new driver because the game wouldn't launch unless I did.
8:54 it won't stop. lmaao
I changed to amd when I noticed my 2070 super getting throttled and having issues a few months after the 30 series announcement. Haven't had a problem since then and I'm still on the older 60 series. My FPS has in fact only gotten better every couple of months with driver updates. It's not major fps jumps but the fact that they are optimizing on older cards is awesome and I have made plans to purchase a card from the next gen coming up.
What amd gpu do you have right now?
Lmao I did the same thing. I had a 2070 Super for over 4 years but I couldn't play new games at 1440p high/ultra settings. So I decided to give AMD a try and bought a GPU from them over a year ago, and I have no regrets.
I agree with this comment. I mean, I understand what you say about RT and I agree. Maybe it's okay for RT effects to tank the game, but take a recently released game like the Silent Hill 2 remake, for instance. I love the graphics and atmosphere of Silent Hill, but it shouldn't be as complex to run as Cyberpunk. I think that's what the commenter was saying: games that don't look as good as Cyberpunk, for instance, running badly on PCs even without RT effects.
who remembers witcher 3 hairworks
These games were legendary for being unoptimised but great. Lords of the Fallen (the first game) and Watch Dogs 1 could only hit 30 fps on a 980, and couldn't get above that at 4K maxed until the 10-series high-end cards came out.
Control couldn't run well at 4K native high settings until the 30-series high-end cards came.
Games that are very demanding aren't made for the GPUs of the moment. That's why I named the games I did below.
Alan Wake still can't run without drops, even at 1440. Black Faith, a souls clone, can't run at 1440 without dropping FPS, and it stutters badly and drops FPS at 4K on a 4070 Ti.
I still can't play Remnant even at native 1440 high settings; I'm stuck in the 40s and get drops.
Hellblade is very demanding, as are Tarkov and Cyberpunk, and they still don't run like they should.
I'm thinking all of these games and others will finally run like they should when the 50 series comes.
All the old games I named were optimized well; the cards just weren't strong enough at the time to run them great.
Lol, the guy can't imagine anything more complex than Cyberpunk? The game has PS4-era textures and geometry, PS4-era level design, and a PS4-era AI and animation system; it's only the ray-traced lighting that makes the game look good.
Exactly what I thought. Literally just look at an AC unit as it turns from low-poly crap into a JPEG when you get further away, and then you will know XD
100% !
Why do you think they made CSGO into CS2? I had a wonderful time with +300 fps at 1080p playing CSGO on my 4790K PC; after CS2, only 100 fps at 1080p. It's 100% to make old hardware useless!
The game developers are corrupt AF and scammers, making their games 'PAY TO WIN'.
4:50 "Probably putting some invisible bloat in their games just to tank framerates"
No, he's not pushing it too far! In CS2, on maps like Inferno, they added 137 flowerpots to tank the FPS. A dude shows in a video that when he went around and removed all the flowerpots from the map, he was getting like +100 more fps. Keep in mind this map gave me +300 fps in CSGO, and in CS2 I can barely get 144 fps to match my 144 Hz monitor.
They are absolutely adding junk to the maps just to tank FPS, 100% without a doubt!
And why do old maps give great FPS but new maps give absolute junk FPS? Because they add things like water, flower pots, furniture, art and all sorts of useless things that no one in the CS community cares about. We want high FPS, not realistic environments!
Even things like Windows XP, Windows 10 and Windows 11: why do you think games run way better on Windows XP compared to Windows 10 (like 10-20% better in CSGO, I tested it myself), and better on Windows 10 compared to Windows 11?
Because they are doing the same thing with Windows: they want to suck your old hardware dry so you go and buy new hardware. It's completely the same issue with Windows!!
But I am glad people are finally waking up to this scam!
I have been arguing a long time with gamers about this issue; they told me I was wrong and a conspiracy theorist.
How dare they make a game that only runs 100fps on dog shit 10 year old hw :D
@@orion9k Also, to answer your question regarding Windows: I have an old laptop running Windows 10. Before the August update it was running okay, but after the update it struggles to run properly. Well, there's your answer to "they want to suck your old hardware dry" LOL.
It's all about money and marketing to make more money. In this era, things aren't what they used to be anymore!
@@zf8496 100%, make no mistake, they know exactly what they are doing. With all the AI today, they could make old hardware from even before 2014 come back to life and play modern games, I guarantee you!!
@@orion9k Absolutely scummy behaviour from these game dev scammers.
The current problem is that the developers of these games don't want to, or don't have time to, optimize their games.
No code is perfect, but there are rules to go by in code, and you can simply learn by trying and testing what works and what doesn't.
Battlefield 1 and 5 are, imo, simply the most optimized games in existence. DICE knew exactly what they were doing; the games look beautiful and can run on really simple hardware. They're also very smooth when it comes to frame pacing, which is often overlooked, like in CS2 or many other games that suffer from poor optimization (a rough way to quantify that is sketched below).
Devs need more time, or they need to put in the work to make their games run well. That's why you look at games today and simply think, "Wow 😲, nothing changed, and yet my hardware tanks drastically".
Hopefully in the future this will all change and devs will turn back to optimizing their games (I certainly hope so), since CS2 hasn't had an optimization update since it released, and that game... wow, the FPS is bad. (And the frame drops...)
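Since frame pacing comes up here and is often overlooked next to average FPS, here is a rough sketch of why two captures with the same average frame rate can feel completely different. The frame times below are made up purely for illustration.

```python
# Rough sketch of why frame pacing matters separately from average FPS.
# Two captures with the SAME average frame rate can feel very different.

import statistics

def describe(frametimes_ms):
    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    worst = max(frametimes_ms)                 # single worst frame (a hitch)
    jitter = statistics.pstdev(frametimes_ms)  # spread of frame times
    return f"avg {avg_fps:.0f} fps, worst frame {worst:.1f} ms, jitter {jitter:.1f} ms"

smooth   = [16.7] * 10          # steady ~60 fps, every frame the same length
stuttery = [10.0] * 9 + [77.0]  # same average frame time, but with one big hitch

print("smooth :", describe(smooth))
print("stutter:", describe(stuttery))
```

Both captures average about 60 fps, but the second has a 77 ms hitch and far higher jitter, which is what players perceive as poor frame pacing.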
Very well explained. This is how I build my PCs: I also target console performance and build my PC around that. I just want the games to run. And sometimes even then the PC game runs better, because of a dedicated GPU rather than an APU with shared memory. And no, developers will always target 30 fps in favour of better graphics, because better graphics sell better.
Devs are in agreement with Nvidia because ray tracing apparently saves a bunch of development time compared to baking in light sources and lighting elements (like shadows, reflections, etc.) individually.
AMD is literally copying Nvidia with FSR 4; they are now using AI. Nvidia could have patented that tech so that AMD or Intel would never have been able to use it, but they didn't!! You are all conspiracy theorists and it's sad... everyone is so paranoid nowadays, it's pathetic.
That was already possible without RT. Advanced real-time per-pixel lighting has existed since 2007; Crysis 1 was fully real-time lighting, no baking.
@@Argoon1981 I stand corrected. I am not 100% sure how it works because I wasn't as enthusiastic about new technology at that time.
Nobody uses baked lighting anymore; it's mostly a console thing if anything. You don't need RT for real-time shadows and lighting.
@@thomassp So RT might just be a quality thing? Like more accuracy and less light leakage, right?
The newest demo of MH Wilds at Tokyo Game Show was running 60 fps with some dips here and there on a base PS5. Honestly, I think those specs are just a placeholder so they could put preorders up while they keep working on it. Thank you for at least saying "the game is still 5 months out, we don't know how it will perform". So much doom posting and panic based purely on recommended specs with no actual benchmarks out.
Or we could compare with other games on the same engine (RE Engine). Like, we're not operating in a vacuum: Dragon's Dogma 2 levels of crapformance are not something you debate over, they're just something you're going to get. Wilds will run like garbage. Honestly it could've kept looking like World; instead we will get a game that chokes just about any hardware in existence for no good reason whatsoever... how can gamers be so gullible... WHEN was the last time "the game is 5 months out, let's wait and see" yielded ANY improvement whatsoever? Because quite frankly, I can't remember that being the case for the most part.
Why would they write such specific requirements for just a placeholder?
@@GAMERGUYXL Specs are required to put preorders on Steam. The last bit of development is always bug fixes and optimization; you need to implement systems and content before you can optimize them. Updating requirements closer to launch is a very common practice, but nobody makes a big deal of requirements going down, so it doesn't get talked about.
@@djagentorange "You need to implement systems and content before you can optimize it": that is an R&D step, not game development.
You know nothing about game development at all.
I mean if you don't see a problem with "60fps (with frame gen)" requirements ...
I like that you diplomatically didn't touch the 'Graphics in games reached its peak in about 2015-16' bit.
This pretty much perfectly showcases how disturbed and disconnected from reality the average YouTube 'gamer' comment is. Like, go play Witcher 2 on the PS4 (or really, any PS4 game) and then compare it to something like Star Wars Jedi: Survivor or even Dragon's Dogma 2. They're just observably wrong on a basic level. Sure, there have been diminishing returns, and that has been progressing since basically the inception of video-game graphics tech, but to pretend there have been no major breakthroughs in video-game graphics technology in the last 8-9 years is pure delusion. It reminds me a lot of the 'ray tracing does nothing except tank your fps, baked-in lighting looks better' comments. In some cases I can agree that it doesn't do enough to justify the fps drop (*cough* Elden Ring *cough*), but that it does nothing? Even in cases where it's minimal, it's usually a marked improvement...
Man, I see so much mental gymnastics (we'll be diplomatic too) in YouTube video comments, not even just about video games and hardware but also about other topics like history, where the statement is just observably false. There are a lot of confidently wrong people out there. Whatever helps you sleep at night, I guess...?
Yup, gamers suffer from rose tinted memories galore.
I very much like the mix of old- and new-school graphics in Forever Winter.
Proprietary tech, bigger and more complex engines, "the way it's meant to be played".
We also went from fitting the entire texture map inside one small atlas to save VRAM, to downloading 360-camera-scanned models, throwing them at Nanite and calling it a day.
I kinda feel like we are in math class and Mr. Owen is scolding someone lol.
So, according to you, they should stop making more complex software altogether? No more FX for video editing, no more CAD upgrades; after all, it's all designed to sell GPUs!!!!!!!!!
He didn't say that.
Get real. If Nvidia is manipulating software to sell GPUs, it's not just video games. It's a BS argument and, frankly, slander.
He's saying that developers should make better software. Better programmers make better code.
What makes people think every game on the planet should run on their setup?
These cry baby people are just stifling innovation cuz they can’t afford it
It's never been GPU manufacturers' fault. Shit optimization is 100% the fault of the devs, ALWAYS.
That's not true. There was extra tessellation added in Crysis 2 that was not visible in the game, just to tank performance. This was pushed by Nvidia because they knew the impact on AMD GPUs was higher.
@@nossy232323 With the consent of the devs, again: they could have said no to shit practices like these.
@@thehunk1 Sure, but what are game devs going to do when they get a lot of dev support, marketing and/or money from Nvidia?
@@nossy232323 The CEOs of AMD and Nvidia are close cousins.
Sad to see family kill each other over market share.
Really need to stop abbreviating Cyberpunk like that tbh.
Damn Daniel!
I used to watch your videos regularly when you had less than 50K subs. I go away for a few months and when I return you're at 200K. Nice work. Keep up the quality.
My 7800xt came with Starfield :(
💀
I swear to god, the games AMD picks for their freebies are some of the worst garbage I've ever seen... it was a long streak of mediocre crap until Space Marine 2 just now.
😂😂😂😂
@@BasedBebs Far Cry 6, Saints Row remake (ahahahah), Forspoken, Sniper Elite 5, Starfield, Dead Island, The Callisto Protocol, Company of Heroes 3, Avatar... Dragon's Dogma 2, which doesn't even run acceptably on any of their hardware offerings. The Last of Us Part 1 was decent and Space Marine 2 was great, but that's a long streak of utter trash.
It gets better, because they released a more powerful GPU, the 7900 GRE, for the same price a few months later. I tried to be an AMD guy, but they are just as f-ed up as Nvidia, just in different ways.
I have a bit of commentary on this, as someone who has been big into the tech behind gaming since the 90s. I think part of the issue is that consoles, the PS5 and Xbox Series X, are very powerful now, no longer using weird custom hardware, and games that never would have been ported to PC 10 or 15 years ago now are, because devs need the biggest market possible to profit.
So games that actually push the PS5 and XSX are closer in hardware requirement to high end PCs than any previous consoles. The PS5 has an RX 6700 equivalent GPU. It also runs better in the PS5 than it does as a PC add-in card because of greater bandwidth and tighter architecture with the integrated CPU.
The PS5 Pro is an even further crank upward. And games developed with those targets are GOING to need super beefy PCs to run. They're going to be less efficient in function than they would've been on the PS5 and will need bigger and better hardware.
And all of this is being exacerbated by how much more expensive high end hardware is today compared to 10 years ago. Gone are the days of a $700 halo product (RIP 1080ti). Now that gets you the GPU in the middle of Nvidia's product stack.
I wrote this before I was done watching the video. I'm so happy to see that Daniel touched upon the same points.
So buddy, what should I do now? I was thinking of building a PC but wanted it to be future-proof, like at least being able to play GTA 6 when it releases in 2026 or 2027 at 1080p max settings.
@Dantetheoneatpinnacle Hard to say. With GTA 6 being in development for so long, it'll either have way inflated requirements like Cyberpunk or be like a generation behind.
The closest to "future proof" GPUs today IMO are the 4070TiSuper and better, and the 7800XT and better - 16GB or more. If you have to buy now, the answer would be "buy the biggest your budget will allow".
However I would wait until next gen specs and pricing are official
@ElysaraCh I'm also thinking about waiting for it, but I don't have a PC atm. The last PC I had was in 2015, for GTA, with a GTX 1050. That card died twice under warranty and got exchanged, but the third time it was gone and the warranty was over, so I never built a PC after that. But now I'm thinking of GTA 6, and that game is like 2026 or 2027 for PC.
@Dantetheoneatpinnacle If you've gotta buy ASAP, I'd probably get at least a 4070 Ti Super or a 7900 XT for the most future-proofing bang for your buck.
I think the rush to 4k is also part of the picture here. A lot of sentiment basically skipped 1440p which has seemingly risen in profile belatedly. Plenty of people regard 4k as almost normal and 8k as the 'next thing' when for a lot of games 4k really is still the bleeding edge.
I play at 1080p on a 24" monitor 🙂
@@GreyDeathVaccine same & I'm planning to stay on this resolution!
Because 1440p just looks bad, I'm sorry. 1080p is unplayable.
@@Dempig that is not at all my point. Also, 1080p looked nuts to people when it first came on the scene.
The theory I am proposing is exactly that people jumped to 4k ahead of the technological curve. This means they are now used to 4k (thinking 1440p looks bad fits perfectly with this) even though our tech is struggling to perform at that resolution even with high end hardware. If 4k was still seen as the bleeding edge then people would expect issues and not be surprised that demanding games may not hit 60fps there.
@@LlywellynOBrien I mean, I've been gaming at 4K since 2019 when I had a 2080 Super. It wasn't too difficult to run 4K native until recently, with heavy RT games and UE5. I would say running RT/UE5 games at native 4K could be considered cutting edge, but most PC games can easily be run at native 4K even with something like a 3060 Ti. I use a 6950 XT personally right now. I think that's why there are so many people complaining about 4K performance: it used to be much easier to run and only recently required a big bump in GPU power. Most people that have a 4K display don't want to downgrade their display, but also don't want to spend $1000+ on a GPU to keep playing at 4K.
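Some back-of-the-envelope pixel math behind this thread's point about 4K still being bleeding edge. Shading cost only roughly tracks pixel count, so treat the numbers as illustrative rather than a performance prediction.

```python
# Pixel counts per resolution, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels/1e6:.1f} MP, {pixels/base:.2f}x the pixels of 1080p")

# 1440p is ~1.78x the pixels of 1080p; 4K is 4x 1080p and 2.25x 1440p,
# which is a big part of why native 4K asks so much more of the GPU.
```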
Thank you for this video. I'm glad that you didn't go all in on bashing my comment, which was written when I was a bit emotional, to put it simply.
As for agreements, there's a very recent example: Frostpunk 2, which released about two weeks ago. The developers of that game, 11 bit studios, are one of AMD's official partners, and both sides are very open about it. And guess what: a 4090 with a 7800X3D barely reaches 40 fps at 4K max settings, while a 7900 XTX with the same CPU can hit 60. How would something like that be possible? We know for a fact that the 4090 is the considerably more powerful GPU, and this is just one example, a AA city builder. What if there are other examples like that we just don't know about or haven't noticed yet?
And I'm not talking about RT or upscalers; I'm more interested in real performance, not something conjured out of thin air. There's Alan Wake 2; I personally haven't played that game and likely never will, I've only seen some benchmarks. The 7900 XTX gets 44 fps average at 4K ultra, pure native, no RT; the 4090 gets 50-60. From what I've seen I can't say that Alan Wake 2 is a more graphically complex game than CP2077. CP2077 has lots of neon lights, futuristic objects, crowded cities with NPCs, traffic, interactive points, etc. What does Alan Wake 2 have? A forest with trees, a lake and a village? How is it possible for that game to be more demanding? What about Wukong, a game made by a small Chinese studio? It doesn't look that groundbreaking either, yet even the 4090 somehow can't handle it at all.
The reason I even wrote that comment is quite funny. When I was choosing between getting a 4090 or a 7900 XTX plus a good 4K monitor, I picked the latter. The logic behind it was the following: as a gamer I don't really care about high-budget AAA games from the most hyped studios; 90% of my Steam library consists of Japanese games, and they don't really feature gimmicks such as RT or DLSS/FSR, so the most logical choice was to pick a good GPU that doesn't really excel at those things either. And yet, there's a new MH game from a Japanese dev that now can't be run at 1080p without frame generation, a technology that I personally can only describe as self-deception. Hence my frustration at seeing a seemingly healthy part of the gaming industry now plagued with FG/upscalers in system requirements as well.
12:57 Well, that's not entirely true; some games/engines have RT baked in by default that you can't turn off, so the argument of "turn off those features" dies there.
I can't listen to this whole video about a comment like this because it just frustrates me to hear. People act like GPUs are different from literally anything else in tech, which also becomes obsolete within a matter of days or weeks, as if that's somehow not normal. This happens with all things tech. Just because you have an unrealistic expectation of how fast or slow technology should advance doesn't mean game devs are suddenly conspiring against consumers along with the GPU manufacturers. There are also a lot of aspects of graphical fidelity that the average consumer doesn't notice between games, which can affect the frame rate and performance an end user gets. We should be happy that things get better as quickly as they do, rather than complaining because you can't afford, or just don't care, to upgrade your GPU to something newer or better for the type and quality of games you expect to play.
Agreed, I've pointed this out multiple times myself. But there is this... let's say "zeitgeist" in the gaming community, where people somehow collectively think that if they complain all the time and upvote each other's comments like these, GPU manufacturers will sell them flagships at $300 that will stay highly relevant for like 8 years. I'm exaggerating, obviously, but you get the point.
@@adriancioroianu1704 Social media is the problem. They think they're correct because they upvote each other so they just sink into this delusion and seek out confirmation from each other constantly. Also they are technologically illiterate so that doesn't help. They have no idea what even goes into a game.
People are complaining because they see very little "progress"/"improvement" versus the hardware needed to play it.
People don't complain about their old GPU not being able to reach some new graphics level; they complain when games that DON'T look any better at normal quality settings suddenly require 2x the graphics power.
If I remember correctly, there was an issue with HairWorks in The Witcher 3, Tomb Raider, and also in some Final Fantasy (XV?), where a herd of sheep that you couldn't see was somehow still having its fur calculated with HairWorks, tanking the fps.
Regarding newer games, there were a few where you couldn't really turn off RT completely (without editing config files / modding the game). In most cases those were patched, but still, initially you had a window where the game underperformed and you were thus "encouraged" to get a better GPU. Not sure, but I think Warhammer 40k Darktide was one of them, but don't take my word for it.
But from my understanding the comment was more aimed at new games that are just not optimized.
Like Dragon's Dogma 2, Hogwarts Legacy, the Jedi game (the last one, forgot the name), and the latest Star Wars game (Outlaws?) with CPU issues in some locations (usually towns). Realistically there is just no reason for that when compared to other games that have more NPCs and more complex environments, with more geometry, activity, shadows, etc.
Generally, there are some tech "wizards" who can make a game look phenomenal without high hardware requirements. Some "AAA" games can't match that level of "wizardry" even with higher hardware demands, so it's kind of normal for people to be frustrated when someone asks for more and delivers less.
And considering how nGreedia has been behaving, blaming them (with a new generation being released) doesn't seem out of the norm at all.
If you remember, last GPU generation (when the 4000 series was released), the first few official drivers from Nvidia were benchmarked to check that they didn't "tank" the fps of the 2xxx and 3xxx series (in order to make the 4xxx series look better).
But my personal opinion is that PC is too complex to optimize for if the game wasn't built for PC initially.
Devs just aren't going down that path in depth. They probably have a relatively small budget, check that there aren't any major slowdowns that happen often, and that's about it (so technically, "showstoppers" get fixed; everything else is too expensive to fix and you hope it won't ruin sales).
Was it our favorite Fallout Dev (Todd Howard and his 16 times the detail) that said something like: "just buy a new pc"
Why are you so protective of GPU manufacturers and game developers/publishers? Are they slipping you a paycheck out back?
When you yourself are seeing a 4090 getting 23-24 fps in a game, even maxed out, you don't think that's a problem? Especially when the improved visuals are nowhere to be seen; only the fps tanks.
Stop ignoring the obvious...
An additional component of this is that many people remember just the feeling of games looking awesome (and for their time they did), but if they actually played them today without any mods, many would be surprised at how much worse they look than they remember, compared with new games.
My memory is hazy, but I do recall something about topology in some older games being designed to heavily favor Nvidia cards over AMD/ATI cards. The games were designed to use (as you said) GameWorks features, which caused significant performance drops on non-Nvidia cards. (Fallout 3 performance comes to mind, but it could have been an ESO game.)
I would agree, but my RTX 3080 8GB has done me good. I do feel like developers have a trend of optimizing a game months or even a WHOLE YEAR later, or never. I also thought it was common knowledge that with higher-end GPUs it's
price > performance
There is no 3080 8GB. Maybe you're on a laptop, so the 3080 is a 3070 chip, hence the 8GB.
Definitely not a laptop. Guess that's why it's done so well then, having more VRAM than I thought.