“The enemy of art is the absence of limitations.” - Orson Welles I believe the digital age has disintegrated the mindset of optimization. Games no longer have to work before it goes in the box or fit on the cartridge or disc. Games can be 200 Gb, run poorly, and get patched later. Broken indeed.
@@CurtOntheRadio i'd say it is. anyone could write some code that does a thing, but not everyone can optimize it to run well Like how anyone can dip theyhads in paint and print them on a paper, but not everyone can make beaautiful painting from that
@@captainjimo Hmm. Then everything is art too. I'd venture Welles wasn't speaking about optimisation so much as art is about the constraints - only having these few notes, these few instruments, these few locations, whatever ie working within material limits that constrain one and getting the most out of it. Optimisation is more about removing constraints. Arguably the art comes in working within constraints, whatever they are, and getting the most out of it - not removing the constraints (which is what optimisation does, and which is largely a technical, objective, engineering job).
It's a great way for the CEO of a console company to try and persuade people to stop pursuing tech improvements because it eats away at the cost and effort of a console 10x more than it does on a PC where you can simply adjust settings. You guys eat up marketing so easily while thinking you're too informed to buy into it.
It's interesting when you look at it from the perspective of Memory improvements. The PS3 had 256MB of RAM, plus 256MB set aside just for graphics. The PS4 has 8GB plus 2GB. That's a 30x improvement. THIRTY EX! O_O Now, the leap from PS4 to PS5 was....2X....it went from 8GB to 16GB. A bit underwhelming from a memory standpoint. So, while the PS4 could store 30x the textures that the PS3 could, In contrast, the PS5 only doubled texture capacity from the PS4. Crazy.
Gaming is hitting what Apple hit a few years back. Apple used to be so obsessed with thinness that in every new iPhone and Macbooks they got thinner and thinner to the point that it costed them battery life and heat distribution and eventually performance. Now Macbooks are back being a bit thicker and iphones too and their performance is on a good track
The problem is devs are overthinking about graphics, yes having huge graphics improving each generation is nice but we need good gameplay, storytelling and optimization. Right now, we have games that tank your fps trying to be as photorealist as they can be with bad gameplay and no optimization or barely (don't tell me relying on scaling is optimization!)
They are not overthinking. They are cutting costs, by using rt, dlss, frame gen, and get money from partnering with companies like epic (ue5 lumen), and nvidia. It’s easy money for everyone.
THANK YOU, when going from Low to Medium setting almost made it look like you were playing a DIFFERENT GAME and if you went from Ultra to high or even high to Medium thee wereBIG FPS gains NOT ANYMORE especially since DLSS and FSR gave Publishers the Excuse to cut COSTS by killing Traditional optimization
No, the problem is with the increasing complexity and features in engines many artists don't gain low level knowledge and rely on existing platforms with limited developer teams that constantly need to cater new people. So basically lack of expertise + stupid deadlines.
Nobody ask for graphics, the industry did that to itself, giving the illusion of value on visuals. Black myth wukong would have been a 1 - 2 years max if it wasn't that visually heavy. also that shows that future games won't even have to care that much for optimization on framerate, with FSR/DLSS and framegeneration, game will keep spending on graphics and leave "force" FSR/DLSS and framegen as wukong did to perform as expected.
Spot on, I have no issue with FSR or PSSR being used to enable a rock solid 4K30 game being playable at 1440p60, i think thats a fair trade off, but when you see games abusing FSR to the point that PS5 and Series X games have an internal resolution of 720! Its crazy, Immortals of Avium is actually a fun game as well, but going UE5 ruined the game on console as it had no chance of hitting 60fps without FSR at a base res of 720p!
Fuck graphics what pains me is how environments are sterile and uninteractive and how physics in games is dead or how AI is literally same for over a decade now. Burnout Paradise STILL has the best realtime best car deformation system in a racing game and that game came out 16(!) years ago then titles like Red Faction Guerilla that had insane destructible and interactive enivornment or The Force Unleashed that combined multiple middleware tech to reach design objectives of devs. Devs are pushing for visual shit like mentioned in video that can be barely visible meanwhile you play current gen title and while going through foliage you pray observing if your character model will interact with that foliage properly or if your character will just ghost through that shit lmao. What the actual fuck.
@@arkgaharandan5881 That too. Devs really need to scale down on fancy polygons and provide meaningful and more interactive experiences and by interactive I mean everything that includes physics systems. Everyone slowly becomes Ubisoft when it comes to bloat of safe slop designed on sterile corporate templates. A fact you can launch decade old title and look what physics engine of said title does and not just be impressed but you cannot find a game 15 years later that is even a bit close to this in that regard is fucking insane. Only titles that push something like that are some meme indie tech demos that market themselves around singular physics based gimmick and thats it. When was the last time we had cool gameplay centered innovation in a big budget title? Nemesis system in Shadow Of Mordor 10 years ago by Monolith. Nemesis is also AI based and looking at their previous work with FEAR in that regard no wonder they were the last studio that even tried doing something around AI. Again what the fuck is going on, its like entire industry became MCDonalds tier.
Car destruction isn't even on the fault of the developers, it's on the car makers who hate seeing their cars destroyed. Most racing games license out cars to use and often the manufacturers get to decide whether or not they want a detailed destruction model
@@crestofhonor2349 I know about the patent stop excusing talentless worthless western developers that could design some other gameplay centered innovation. Same with licenses and racers, there are games that are using custom vehicles and those STILL arent close to Burnout Paradise in that regard.
@@TheNamelessOne12357 no. TAA is meant to cover up bad execution of Ray tracing, ambient occlusion, and many other effects. They are all implemented poorly from the start, and devs use TAA to cover up the mistakes and low quality effects. No effect truly benefits from TAA.
@RafaelMunizYT I'm more for gameplay, since you know, we're playing a game? Yes, it's true that art style is also a key appeal but the "story" part is exactly why nowadays we get less and less of gameplay to make the game feel more cinematic in AAA space.
@@Dima064 I feel like people forget the point of gaming is having variety, and nowadays we have more variety than any other point in gaming history. I for example mainly play "older games" so the mess of the recent releases don't affect me much because there are so many games I haven't played yet, I don't need to jump from release to release. there are many AAA games there are more gameplay focused and it's good to have both because I for example enjoy story focused and cinematic games but I don't wanna play that all the time. I like variety so I play story focused games, gameplay focused games, multiplayer games, indie games and I have the best of all worlds
Very simple answer: I do want better hardware. But not for better graphics, but for bigger scale. Keep current resolution, models, lighting, and so on, but add more detail to the maps(more props, furniture, interiors), more actors that are more interactive, farther view ranges etc. There's still a lot of growth that can happen to games as far as their looks go, it's not all about resolution and raw texture quality or number of triangles.
Everything you just described is still graphics though, and there are improvements being made in that realm. That's part of why things like mesh shaders are useful just to name one example. A lot of stuff like this is happening in game development it's that it's more of a literacy issue now. It's sort of like how people might know they like one song in a genre and not one in another, but might not actually have a way to convey why that it is. Gamers might be able to recognize that something is improved or different now but they no longer can articulate why because the difference is no longer a jump from "barely being able to represent something that looks like a thing" to "being able to represent the thing". Sort of also like how you can tell only when CGI in films is bad, but how someone watching would struggle to tell you why it looks bad and sticks out. Game environments generally are more dense, they've got way more stuff in them, shit farther view distances got solved a long time ago for everything but foliage with an infinite reversed depth buffer, the only thing to push there is maybe higher fidelity LoDs and that's it. Like all of this stuff is improving you just don't have the literacy about how games are made to be able to articulate or grasp what's different now.
i cant wait for AI speech algorithms to be implemented into gaming. just imagine a whole story based on AI given a small script and scaling the story based on how you react
@@adeptalakay I want to talk to an npc not chatgpt. Chat bots are not at the level where they can fit seamlessly into a game world while making sense and not hallucinating
Another problem is that companies aren't innovating anywhere like they used to. They're playing safe with trend chasing at the forefront. Also, the whole release it now, fix it later mentality is also hurting the industry. Optimization really hasn't been a thing since at least the 6th generation of consoles, maybe the 7th. Graphics are fine now. I don't need them to be any better. We need smoother framerate and innovative gameplay that doesn't involve sleazy, greedy and FOMO tactics. I've only ever bought two re-visioned consoles, the PS2 slim (simply because my fat model was having trouble reading dual layered DVDs) and the PS3 slim (because my original 80GB model got stolen).
I'll play the commie here now despite hating them, but capitalism and corpos are ruining the whole entertainment industry. Music is safe because the record labels don't want to risk and lose money. Garage-sized bands don't get algorithm attention because they don't bring the views for ads and sometimes are too edgy for it as well. Movies make remakes and sequels and "somehow Palpatine returned" things because movies are too expensive, full of CGI, and the studios don't want to risk investor money. Games are pure safe, buggy slop with better graphics because Publishers don't want to lose money, so better pump that slop for pig gamers to consume ASAP. And in all those companies, the CEOs care more about the trimester earnings instead of the long-term health of their business because they have to cater to shareholders, including DEI shit because Blackrock and Vanguard money. I really hope one day, shareholders start avoiding the entertainment industry like the plague, the whole thing crashes extremely hard, everyone whom is unneeded in the industry gets fired, and passion products rise from the ashes of this industry.
that will never happen unless capitalism dissappears completely or the nation somehowngets a total reset. Sorry fam. Til then, gonna have to rely on independent (usually left-leaning) devs
Triple A Companies are mainly the ones doing this. The Indie Space is kinda fire, so perhaps its worth a try. And games from FromSoft and Black Myth Wukong are really cool exceptions.
"Another problem is that companies aren't innovating anywhere like they used to. They're playing safe with trend chasing at the forefront." This is a symptom, not a cause. When you have to bet 250+ million dollars to get the game on the shelf, you're going to naturally be fairly risk-averse and needing to target a very wide audience to have any reasonable chance of getting enough sales to turn a profit. But I agree, it becomes a catch-22 - you can't make money without appealing to almost everyone... but making something that appeals to every means it's a bland mish-mash that doesn't REALLY knock it out of the park for anyone.
@@jellorelic Seems like a simple solution, then. Don't have the game cost 250+ million dollars. There is little reason why games cannot be made cheaper, we have many examples of Great games that did Really well, despite costing little to no money. We've seen single devs crank out a game in a fraction of the time it takes for a Triple A game to do so- and on such a smaller budget, it isn't even comparable. Over-inflated, under-developed, shallow gameplay- all hallmarks of Triple A, now.
I think physics simulation could use a major overhaul. It's been an afterthought for quite a while now. Even the games we recognize as having really good physics exhibit floaty objects that rarely express their true weight. I'd like to see the industry turn their sights in that direction. Audio has also fallen behind. Some games sound great, but only in comparison to other games. Much like RTGI which bounces light, I'd like to hear sound that bounces as well. It's not often that we have a game that does that. More often, if a sound is coming from my right I only hear it on the right, as if there were a void on my left that sound cannot reflect from. Maybe these things are not important enough to the consensus, but I for one have been wishing for these improvements.
I think physics simulations is a hard thing to do not because of limitations but because making physics an integral part of your game is not easy. Half-Life 2 and Half Life ALYX are like the only examples where really good physics simulations actually meant something for gameplay, I wish more games did that.
@@keatonwastaken I’m currently playing Control, which is why the subject is fresh on my mind. The physics are great in that game, but even so, objects seem to have one predefined weight, which takes me away from the experience a bit. I’d say even if physics aren’t critical to the experience, I think it would make games more immersive. Regarding a new frontier in gaming, that’s among my top picks.
@@LilMissMurder3409 Absolutely, especially open world games. It’s back to the drawing board for sure when it comes to that, because it would be too costly and unreasonable to mocap filler NPC’s. Inevitably I think AI will be a big part of this gaming renaissance, as much as I hate to admit it. I can see how it would be beneficial, but it’s also a big can of worms. I can feel other commenters readying their pitchforks now lol, and I can’t blame them.
We were already at diminishing returns with XBox 360 era graphics, it's just absolutely obvious now. You can't just throw more teraflops at the screen and wow people with perspective-correct 16x anisotropic filtering. The giant RAM and CPU budgets of modern PCs and consoles are spent on quicker development so that more content can be produced easier even if it's not running very well. The irony is the project scopes still expand to fill all available time and money.
That's just a result of ageing, unfortunately. The more you experience the higher your standards become, and the less novel everything is. What was once exciting and unexpected is now dull and predictable. And there is a foreshortening aspect when you look to the past. The past contains all the great things you loved, it is full of them. The present only contains the things you love right now, which are inevitably fewer. Same reason as why music was always "better" in the past.
@@calmhorizons You aren't wrong but I don't fully agree with you. It's pretty obvious when you look at games being released say 2004-2011 and compare it to games being released 2017-2024 that the quality and quantity of good games has dropped significantly. Same with music. I'm mostly listening to old music, and discovering new stuff all the time so its not nostalgia in any way. The only thing one can aruge about is survivorship bias in that all the bad stuff from the past have been forgotten/ignored and only the good stuff remembered and filtered out. That works to your advantage though, since you know the old stuff is going to be pretty good for that reason.
@Skumtomten1 most old musics are terrible, and like every genre or year youre on. Most music are bad and a good number is decent and a few are very good. Obviously you're only listening to the good ones.
@@calmhorizonsPredictable doesn't equate to bad. Music only has so many note combinations and chord progressions, but a new unique spin can still intrigue someone who prefers his generation's young music. Pokemon, Mario kart, Mario party, smash bros, video game market is full of predictable but still great releases with new tweaks. One piece can be predictable but it's still the greatest work of fiction in modern history.
@@thunderstar254 True to a point. But you might be discounting the vast gulf between possible combinations and plausible combinations. There is a reason why we see the same flavours, melodic intervals, story beats and visual motifs repeated over and over in art and entertainment - evolution furnished us with a limited margin of acceptable interests and constrained sense organs. To make a crude example - it doesn't matter how many variations of shit flavoured ice cream you make, it ain't gonna sell. 😁
That's my biggest issue with games. I play for immersion, and I can very easily get immersed in a game with a great art direction, no matter how cartoony it is. But the more realistic a game is, the smaller an unrealistic feature has to be for me to get yanked out of my immersion. It's why I don't like very realistic texture mods, they destroy any and all immersion I might have had. Me playing, very immersed in the game, sees a hyper realistic texture on some clothes that is in juxtaposition to the game textures, immediately remembers I am playing a game and gets bored.
I'm glad that the graphics are plateauing. Hopefully now, the focus will shift to improving optimisation, gameplay and work conditions. (Spoiler: the focus will be MTX)
I think there's one thing being overlooked. In your test at 3:50, the video looks similar because the models are the same, since they were made for that level of graphics. In order to make the most of the hardware, the models would need to be higher poly, to the point that it doesn't look any different at some point (at some point, adding points to a circle to make it more round won't be noticeable). Same with textures, there are only so many fitting textures to add to an area or object, so buffing the hardware wouldn't make the textures look better. Basically, the buff to hardware doesn't increase graphics much anymore, but does allow for more things to be happening simultaneously, with more in-depth physics and lighting simulations, and multiple AI running around with animations and interactions. At some point, graphics are no longer the focus, and that's good.
I have a high-end pc with a 4090. I see myself playing my hacked oled Switch more especially in the past 3 months. We all know Nintendo doesn't care for high frames each game has cartoon visuals I don't mind one bit as long it's fun. And, if you can run the Switch game on pc with Yuzu or Ryujinx Luigi Mansion 3 looks like a CGI film with a couple tweaks at 120fps.
I like both. Not everything has to be super unique in terms of art style. Also realistic games don't always age poorly, it just depends on how you hide your limitations
we have reached the point where Upscale and AI frame generation are recomended to play actual games with graphic cards over 1k dolars to play at not even true 60fps, its disgusting the obsesion with graphics when there are 10 year old games that look amazing, like metal gear phantom pain
@@Kurainuz That's because good graphics has everything to do with visual and stylistic appeal and only a little with pushing certain limits and technical impressiveness. Just look at that Silent Hill 2 remake vs the original in its visuals; the old one is vastly superior even though many of its artistic choices were made because of hardware limitations.
I've been frustrated with the games industry's obsession with graphical upgrades for decades. I'm an artist. I appreciate things looking good. But driving up the barrier to entry for considering new games by a few hundred bucks every couple of years was never feasible for my family. As an adult, it's still not. My friends and I are also huge D&D fans, and my buddy is trying to get me to drop over $400 to purchase on a video game based on D&D. It feels insane. I'm also a huge fan of Halo, and I assure you the graphical limitations do not stop us from playing games from 2001. Hell, even the improvements between Halo 3 and Reach were significant, and they both ran perfectly fine on 360. I never needed a 360Elite. We all know we've had a steady decline in the quality and depth of the most expensive games since the 2000s. Obvious oversimplification, but with that, it's just been insulting that the marketing has always been about how much prettier new games are. So I'm VERY happy to see we're hitting a plateau and a lot of people are agreeing with me. But that's me and my poor ass that barely plays video games.
The people focus too much on AAA experience. You have 50.000+ all kinds of creative and imaginative games listed on Steam you wont be able to play in your life time. (It stretches from stunning pixel art games to very impressive 3D physics engines.) If people support indie studios of games that they like, hence they increase the demand and get more of the games that they like. It is as simple as that. If people complain about lack of creativity in AAA and still buy that sh!t, they are getting what they deserve. It is basic market logic 101 and the democratic choice we have as consumers.
@@Stuntmandouble08 The problem is that the majority of people aren't like us and just think what they're told to think and buy what they're told to buy through manipulative marketing tactics such as "FOMO" heh
Nintendo figured this out years ago, that and the Japanese love of anime meant they could step off of the power battle and keep focus on gameplay density as opposed to pointless tacked on content. Hifi Rush is another example of art style taking precedence to create a fun experience. Then you have the recent AA games, basically games with "good enough" graphics pushing 8-15 hour (typically) single player experiences, cheaper to build because they're more focussed smaller undertakings. Sony blew off $165 million on Concorde, that could have financed 20 AAs to compliment their platform. Software sells platforms which leads to more software sales, only Nintendo out of the console makers seems to remember this, at this stage the libraries on PS and Xbox look like the libraries of failed consoles. 20 for one Concord, you put 20 good quality single player exclusive AAs on PS and their fan base is happy. Nintendo publish a lot of one million sale break even games because they know it helps drive sales in the main, everyone else now bets the company on each release, now commonly meaning one title a console generation. Look at Gears of War, 4 entries on 360 and one XBone, 2 on Series, that's 4 sets of sales for the same dev time as one, 4 times £40 15 years ago. Which do you think is the sustainable model?
17:10 I think another reason why improving graphics is partially pointless is you don't focus too much on details when you are in the middle of some frenetic action. So while some things are worth inproving others are not, having more field of vision, better lighting and overall better depth perception is good, having many distracting details is pointless
I would 100% love it if the world just decided that graphics right now are good enough. Why push progress 1mm forward if that 1mm costs me 1000$ just to keep up? The answer is "because they want that 1000$ from me again and again as they very slightly change almost nothing but performance cost" The industry is very clearly and very deliberately sabotaging performance in order to force consumers to buy new expensive parts if they want to keep up. They might actually add slight improvements to games, but they're so minimal. The only actual change I've noticed in newer games is that they run worse and look worse on my hardware than older games do. That's all there is to it. It's never been about making prettier games. It's always been about making consumer hardware obsolete.
I think this is a bit of a stretch. It's not that manufacturers and publishers are colluding to make hardware obsolete, it's that hardware evolution continues steadily, and prettier games attract consumers, which in turn cost more resources to make, which in turn influences publishers to crunch development time to save money. It's a perfect storm of pressures, and in the AAA space that means 4-6 year old hardware becomes obsolete when, for example, games like Alan Wake 2 are designed around mesh shaders - and stand out visually for pushing the needle forward. Your observations aren't wrong, but the narrative is more simple and more nuanced than "the industry is against its consumers." The truth is that there's more computational power in the hands of developers than ever before, but those resources are being squeezed to get games out the door quickly, and rely on cheap techniques like TAA to buff out the visual imperfections. I'm not sure of the solution. Times are tough right now, new games have very much stagnated, and we need more publishers willing to take on more risky ideas than yet another remake of their next greatest hit. Alleging a conspiracy doesn't help anyone.
Graphics can take a backseat when we finally have raytracing doing all the lighting as the norm. That’s going to be a huge graphical leap that can’t be understated.
@@Granpire they are actually, nvidia got big paying for games to use technology only they could provide so if you wanted the best version of the game you had to buy their new gc "the way it is meant to be played", see it like epic paying for exclusives but instead of the game coming to steam later, they just came to steam uglier, for the majority of developers games got cheaper to make
@Sauvva_ That's certainly true with Ray tracing, 4-5 years ago it was moreso about Nvidia's drivers implementing per game shader optimizations in their driver updates. For me, it's mostly about their unmatched upscaling quality - I'd be inclined to purchase an AMD card if they had a valid competitor to DLSS. Intel comes closer but they're not the most high end cards. I feel the pain of the price gouging Nvidia's done, but I also can't blame them, as they have little competition in RT and upscaling.
Thank God someone said it. I have been feeling alienated and like I've gone nuts for getting frustrated and seeing the patterns regarding planned hardware obsolescence and what *seems* to be deliberate performance degradation. It's especially obvious on smart phones, at least the ones I've used which have ranged from budget to midrange.
Graphical fidelity is good enough. They need to make the physics way better. It's like the picture is getting slightly better, but less and less interactive compared to older games. Arkham Knights to this day looks graphically amazing, and when you turn on Nvidia Physics, it looks way cooler than any raytracing we got today.
Agreed. The Nvidia Physx smoke in Arkham Knight still looks better than the smoke in almost all games today.. Felt like a taste of the future back then.. But, here we are, in the future. And we are still seeing the same old, non-interactive, billboard transparent textures meant to represent smoke, and it's just sad..
I don't consider the smoke physx interactivity I consider a world like Zelda where the world reacts in a common sense way to your actions an example of Interactivity. That smoke was nice but held no gameplay importance just fancy fx
@@marsdenit2845 Just a reminder, that minecraft is still the most popular game in the world. Gameplay is the most important part in games, not graphics.
It's easier to tell a triangle from a square than a dodecahedron from a circle, but the actual number of vertices doesn't really correlate with how good something looks. We never *really* needed any of these improvements. The game I am most excited for this year is Shantae on the Gameboy Advance.
I occasionally have to remind myself that I'm in an echo chamber of AAA games with ever increasing system requirements. Outside of that echo chamber, the "real world" loves the Switch and the Steam Deck, and many hugely successful games run on fairly humble machines. The "plateau" is actually quite a nice place.
I had to upgrade from my 2060 Super of 5 years to a 4060ti for some AI VRAM work. Despite all the community trashing on it. Turns out, there's really no game I play that actually struggles on it and that small bump from 2060S is all I needed. Most of my regular FPS games are CPU reliant. Most of the others are anime-esque that don't push graphics, or stylized like Overwatch, and the actual most demanding AAA games I like to play and revisit, RDR2 and CP2077 are the most demanding out of them all. Which all run smoothly. The game I've gotten most hours out of in the last 5 years is Factorio, Genshin etc. I've realized how far I am from the expectations of AAA gamers who push 4k, RT etc and it was warping my mind for a while thinking I made bad purchases, based on other people.
@@lancevance6346 its not their point tho , U can't just say oh 4006ti is amazing cause its not . U don't make a card literally same perf as last gen card and even 3060 had 12 gb of vram and put a higher price for it . second nividia is not the good person in the story cause they actually profit tons of money from the consumers they become the richest in the world rn even apple didn't make like them, so shitting on PC hardware companies is so valid rn
@@lancevance6346 and even if u like aaa games from 5 years these games worked on gpus from 5 years too ,now even 4060ti can't run most aaa games at 1080p rn what will it do next or after 5 years ? did 2060s when u bought it didn't run rdr 2?
This seems like a commonly held misconception which doesn't hold much weight under scrutiny. Which games specifically are you referring to? Which games do you play? Have you genuinely not played any top tier games over the past 10 years or so?
@@steviewonder0850While there will always be good and great games coming out to play, there has also been an exponential increases in shovelware and bad games over the past decade. For every Baldur's Gate 3 or Helldivers 2, we get a sea of trash sports games, unfinished games that take 1-2 years to actually be complete, and games annoyingly butchered by microtransactions.
@@darthwoody9917 I don't know. There was so much shovel ware during the PS2 and Wii era. The DS, Wii, and PS2 were known for just having so much shovel ware dropped onto them
Difference is those games usually weren't flagship titles made by large AAA devs. Activision & EA USED to actually make decent games, now they pump out slop
Imagine your son want a gaming pc, nothing too fancy, just enough to enjoy modern games, and it costs you 1500€. I bought my first pc 15 years ago for 500€, and I could play everything. A non adult will not be able to afford PC gaming anymore, because we added a bit of grass in the distance
Imagine thinking 15 years ago prices would remain the same. We could say the same about gas prices, groceries, housing prices, rent, taxes, healthcare. Maybe the issue is that prices have gone up but wages have not and that's the reason why the world is slowly crumbling
You can still play everything on PC with a low end, but like your $500...you have to be on 1080p and turn down all settings. Everyone knows that experiences are best enjoyed at the max settings. Thats why people buy better hardware because this is their MAIN hobby. Its like golf clubs. You only need 1 driver but people into it will have 5. And each of them can cost $2000.
my only hobby is watching stuff online to save money and gaming, everything is free entertainment wise, if i cant afford a desktop or laptop i only buy a phone and use that until it falls apart, i literally only work, sleep, save money and volunteer, i have no other hobbies, i go nowhere, i cant afford it and i cant afford most things like basics and housing, its very doable to afford a phone with no monthly cost and a computer and using free internet at food places, until jobs pay then there is no reason to do or go anywhere or be involved with anyone, entertainment is free and great and a laptop and desktop is more then just a gaming machine
Just made a PC for a friend with an RTX 4060 and I3 13100f for 650€ running with a gen 4 m.2, inflation is crazy but if you look for the best price across countries and website you can spare a few hundreds bucks
Funnily enough, PC Gamer and other magazines (back in the day) have been asking this same question for years, and every few generations, it pops back up again in different wording. The answer was the same then as it is now. Games get better looking and more advanced, and one or two games will create a paradigm shift that turns the industry around - like Quake 3, Unreal Tournament 2004, Crysis, or whatever.
@@bliglum I just started playing TitanFall 2 completely maxed out and up-scaled to freaking 5760x2400 over the weekend and it's as smooth as butter. It's been in my back catalog for awhile.
@@bliglum A couple months ago I treated myself to the full Half-Life bundle - it amazed me that a 20-year old game can still look that good. It was money well spent (and righted a wrong, since I'd pirated the original back in the day 😅 )
@@bliglum It's the digital equivalent of digging through the software bargain bin at PC retailers back in the day. That's another thing that has died out.
I remember when the 20 series was coming out and everyone was talking about the potential of 4K gaming. Nearly 3 gens later and not even the most expensive cards can run 4K natively at a consistently smooth rate yet.
True 4K that fully benefits from higher resolution is scam for consumers. It is almost nowhere. Facts: 1. Sub pixel detail. It gives noticeable better looking pixels if there is more information per pixel that are blended to one pixel. Keyword: "supersampling" 2. Camera resolution. Camera has different pixels for red, green and blue with some filter. They are counted for those megapixels so good 4K image requires actually 8K camera. 3. UA-cam and streaming. They use so heavy compression that 4K video is just as good on FullHD. In fact that those video quality settings are more like quality settings and have more headroom on pixels, so they can be downscaled if necessary. True 4K in UA-cam requires 8K video. 4. Human eye. Human eye resolution is about 1 arcminute. Just do the math based on view distance and screen size what you can actually see. In living room conditions, 4K usually don't happen. Large display in front of armlength or projected to wall in movie theatre is where it matters. 5. GPUs suck on small triangles. That is very wasteful and requires LOD levels, and having more LOD levels is also wasteful because GPU can't benefit instancing so easily and that makes memory bandwidth as bottleneck. Shortly, there isn't much of 4K gaming that really benefits from 4K. It is lacking even in camera technology. So while framegeneration easily sucks, frame upscaling actually not. Game can be made for 720p...1080p and upscaled from that to panel resolution and that is how it is done in reality. PC gamers try to play game without upscaling that is when they see the truth: Game is made to played from sofa in livingroom in console, running internally 900...1080p and upscaled from that. If game runs 30fps 1600x900 on PS5, large 4K display with native resolution on 60fps requires almost 12x more powerful GPU and bandwidth. And I'm not talking about raytracing or pathtracing yet.
Eh, ive been playing at native 4k oled since the gtx1080ti. Im still playing baldurs gate 3 at native 4k on my 1080ti. But next year will upgrade to a 5090
@@slopedarmor 4k rendering is possible but it is expensive. Just buy 4 times powerful hardware what is latest gen console, and you get equal framerate and upscaling ratio what is found from console at FullHD. But content itself is easily made for that resolution found from console prior upscaling. Btw, I just started to play older game, made 2010. That game content is targetted to 1280x720. What I did, I set game resolution to 1280x720 and put MSAA antialias settings to max from driver (because it has forward rendering pipeline) and that give best image because it just look like crap when texts are smaller, there is lots of texture upscale filtering close up and there is graphic elements that are optimized to 1280x720. There is even grain effect that is optimized to 720p. So game actually look much better when running on resolution where it is targetted. It is just smooth everywhere and limited texture resolution or other assets doesn't distract. Settings forced to high from driver keep pixel quality high so they don't look bad when pixels on screen are larger.
I never really made a big deal out of this. To me a game needs to have good gameplay above all else and as far as graphics go, all they need to be is visually/stylistically appealing. To me good graphics never meant pushing the limits on realism or whatever is most impressive on a technical level.
"To me a game needs to have good gameplay above all else and as far as graphics go, all they need to be is visually/stylistically appealing." Yeap. Graphics don't need to be realistic, they need to be coherent. As technology develops, there are new ways to produce graphics. I've played very beautiful pixel art games from era before rendered graphics became common. Actually they artists often manage to add more details on pixel art than what was in early rendered graphics.
It'd be interesting to see the AAA game budgets separate development budget from marketing budget. I swear half of their entire budget is just going to marketing and ads to sell their poorly designed games that arent fun in the first place.
It's not unusual to see the highest cost game advertising have a budget 20-30% higher than what they spent on multiple years of development, and even after all of that, how many games can you remember any of the advertising for? You're not far off on the 50% ratio even at AA levels.
Your best marketing is your consumer, give them a good time/experience and word spreads fast. Does not matter if its a device, a restaurant, a hotel, a beach, etc. The person buying any product is your marketing.
Everyone says that the Nintendo Switch has terrible hardware... But you can play Super Mario Odyssey on it and it runs at a smooth 60 fps (and that was 8 years ago). Sure not ALL Nintendo games are that optimized (looking at you Scarlet and Violet), but the fact that they can make it work at all says a lot. That's why Nintendo does it. Spend less on hardware, more on optimization. Imagine what these other games could do if they simply optimized their games properly.
@@TopOfAllWorlds My point is, other companies should be optimizing their games. If Nintendo can do this stuff on a potato then imagine what they could do with the hardware of other consoles.
@lasercraft32 games on the Switch compared to PC makes the Switch graphics look like a$$ though. There is a difference between Nintendo and the rest. And it's not pretty. I used to be a Nintendo fan; not anymore for that and other reasons. The Steam Deck OLED can play all of my indies better than my Switch ever could. I see no reason to get a Switch 2 when very few games run even decently on the Switch. And I'm not even talking about the elephants in the room, I'm talking about many games that never came because the Switch would have a stroke trying to run them (Alan Wake 2, Baldur's Gate 3, Cyberpunk 2077), were going to but were canceled because it couldn't happen because the Switch would be a stuttery mess (Marvel's Midnight Suns), or did eventually come but with serious sacrifices like Hogwarts Legacy and a bunch of other games. Legend of Zelda TotK on the Switch looks like a Simpsons' skin color mixed with the color of urine. And to think Nintendo left the Switch to rot while games couldn't run on it... Oh, and by the way, I mostly play indies and "retro games" these days. On my Steam Deck OLED.
You know I miss? Developer tricks to boost performance, things like Crash Bandicoot levels loading in chunks and having the levels be designed around hiding it off screen or Ratchet and Clank using the part of a disc you really shouldn't be using but in their case it worked but made porting a nightmare, hell I'm surprised we're even allowed to keep the lowpoly models in the distance in the current market :T
This comment sums up the absolute brain rot of the entire comment section. Modern games are a culmination of every "developer trick" that any engineer has dreamt up in the last 30 years. Modern engines are colossal magic shows designed to render scenes in the quickest way possible while doing the minimum amount of work. And "levels loading in chunks"? Seriously? How the fuck do you think ANY game works?
I was very happy growing up with PS1, NES and N64, GBC and so on as a kid. Never complained about graphics. That's probably why the graphics don't matter to me at all today. I care much more about the story, the characters, the music and gameplay. And high fps of course. Nice graphics are just an added bonus--if they don't make the game run like trash.
@@Thunderhawk51 Same. I still love 2D pixel art as much as 3D, but enjoy good games of any genre from any era. Currently mid-way through my first playthrough of Bloodborne on PS4 connected to a plasma TV and it looks and plays great.
I think it will really vary a lot depending on the age of the person commenting. For me it was without a doubt the Nintendo 64. The OG Playstation was great, but all the geometry warping just looked bad to me, it wasn't the 3D revolution I wanted. The N64, however, just looked so high fidelity and smooth, and the geometry warping was nowhere to be seen, it just blew my mind. Dreamcast and PS2 were both great as well, but nowhere near as transformative to gaming for me as N64 was.
@@K31TH3R N64 was still too fuzzy and low fidelity for me. Ocarina of Time was amazingly atmospheric for the time, but in general, the generation of 3D was novel for how it transformed gameplay more so than visuals and I thought most N64 games were ugly too. For me, the Dreamcast was the first console that made 3D appeal on an aesthetic level.
22:00 there is a reason why ppl still have their ps4s and the switch sold so well everywhere, graphics aren't as nesasary to gamers as b4 rather games being fun.
There is a point where it just looks bad and outdated though. Look how much better the PS4 looks than any Switch. It is night and day. And I hate most big video game hardware companies, but Nintendo is by far the worst. They should have been kicked out of the video game hardware market long ago.
@cameronbosch1213 yet ppl are going back to play ps2 games. Nintendo only exists because they provide a service/product that is valuable enough to the consumer to spend their money on it and in return they make profit. So for them to be kicked out the industry consumers/gamers would have to see their product as not valuable and buy something else After seeing how long it takes to make one street fighter 6 character (making each individual hair for an eyebrow) i really don't mind lower fidelity in exchange for more fun and cheaper product (less than 300 mill to make)
I would absolutely be happy with games staying the same in terms of graphical fidelity and required hardware to put more focus on optimisation, storytelling, physics, mob AI, mechanics etc.. My friend showed me system requirements for new Indiana Jones game and it’s just downright silly at this point and I don’t believe they actually couldn’t make the game run on more achievable hardware, but hardware race allows the companies to be lazy and not optimise AAA titles.
11:30 Yeah? Man, The best, most innovative games, were always the ones that worked with, and not against, hardware limitations. Limitation asks for innovation, which is much harder to happen when you have unlimited possibilities hardware-wise. Where do you stop?
What I find crazy is the Graphics sometimes aren't even "that great" and you're getting such a performance hit. As well, that GPU's are becoming so HUGE, yet they STILL can't give you "Good Performance" at 4k.... I mean, tech is supposed to be getting "smaller", NOT bigger.
have you see smartphones in the last 10 years? They do the same as gpus, getting bigger every year But i do agree, if the Top of the Top gpu - rtx 4090 - cant do 4k 60 in any modern game, then whats the point of it being the best
@@CurtOntheRadio Well OP wants good performance at 4K... I've personally given up on that. 1440p all day everyday. 4K is just about good for indie games. AAA can't be arsed to optimize their game.
Yeah we might be seeing a trend of diminishing returns in graphics, but the problem is that hardware requirements aren't following the same trend. Despite being able to play beautiful games today, a game 5 years into the future that might even look stylistically worse won't even be able to run on my PC. So frankly we don't even have the option to go "yeah I'm comfortable with my hardware today", we need to spend more and more money just to be able to play games that look identical to the games we could previously play.
I typically just skip a console generation. Im still playing on a ps4 and enjoy it. Yea, i miss out on playing the latest and greatest BUT typically the best games will carry on to the next console era and be cheaper.
I remember really being blown away by the graphics in the Witcher 3, that came out in 2015. Ten years prior to that, in 2005, games looked completely and utterly different. Jump ~10 years after Witcher 3 to today in 2024... Games generally look compatible to The Witcher 3 still. Kinda crazy.
It sounds great to me. I upgraded and am pleased with this machine. It pains me to see cards like the 4090 starting to be called a 1440p card (id like one).
@@pedropierre9594 My 4090 is a literal space heater. Great for the winter but during the summer I'll use framerate caps and power limiters to make it more efficient.
@@minnidot Hahaha who would call it a 1440p card? It has 24GB of VRAM and DLSS is best experienced on 4K displays. The reality is when you push graphics to the maximum (i.e. path traced Cyberpunk) you are dealing with 1080p upscaled to 4K and you need frame generation. The 4090 will be playing games perfectly fine at 4K for another 5+ years. It won't be keeping up with the latest and greatest ultra PC settings but it will still be light years ahead of what the consoles will be able to achieve for the foreseeable future.
what i have been noticing , that games just look pretty , less interaction with the world , less animation. i dont know how taxing or demanding it is to make foliage bounce off characters rather than clipping through them , or have destructible environments , or a world that is affected by battles. more than 5 npc models. every time is less and less animations..
Physics systems are pretty demanding from both a technical standpoint and a coding standpoint. I don't think you remember Nvidia PhysX but pretty much everyone hated that because of the cost on the performance but it did add in a bunch of physics interactions
@@crestofhonor2349 Surprising how we haven't improved the physics performance in the 15 or so years since PhysX was a household name.. it's the new gimmick word now. "RTX".
I don't know what sort of tech The First Descendant is using, but it was quite funny, I had RTX Ray Tracing enabled, and was near a light source admiring how the light was bouncing off my character and even told my friend, "Wow look at this, it probably wouldn't be bouncing off my character light that without ray tracing on". Later on, I was hitting some performance hiccups in battle so I turned my settings down, came across the same light by happenstance and regular old rasterization looked the exact same lol.
Rasterization doesn't actually have anything to do with real-time global illumination. There are multiple software based ways of approximating the effect or doing it literally, the stuff like RT Cores that are a part of the newer cards are really just dedicated hardware to accelerate ray-intersection tests and bounding volume hierarchy traversal. It basically just lets them do raytracing faster using hardware acceleration.
4:00 A major problem is that too many devs on consoles especially are aiming for the left instead of the right when the right one looks better in motion. Draw distance and frame rate are way more important than the water effects and lighting improvements
I remember wanting games to be more immersive but not graphically necessarily. When I played ME2 I was visualizing branching narratives, dialogues etc. This was the future for me what "realistic" meant, not visuals but content and interactivity. Old crpgs are better in story and dialogue, than most games, this aspect has not evolved but devolved. Only games like BG3 showed a glimpse of hope for more realistic and open interaction.
It's amazing how an RPG from 30 years ago (Ultima 7) manages to have a world that feels more alive than pretty much anything made since (partly I blame the original Diablo, which was _marketed_ as an RPG despite really being a dungeon crawler / hack'n'slash game - that really lowered the standards for what could be called an RPG). Ultima 7 NPCs have actual lives, sleep, eat, go to work, react to the environment around them (ex., opening windows during the day, lighting candles at night, etc.), have huge and complex dialogue trees, etc.. Nowadays people just accept "RPGs" where characters stand in the same place and repeat the same 3 lines over and over.
@@RFC3514 There is a big opportunity for this kind of games, but they need decent graphics and voice acting for them to be main street. I think Larian is the best hope, but I'm sure if they get another big hit, some other studio will come out.
@@digitalsublime - Voice acting was another thing that hurt RPGs around the Diablo era. They couldn't fit audio for the huge dialogue trees of games like Ultima 7 into a CD, so they made most dialogues very linear. In fact, the dialogue in some older games was generated dynamically, so it couldn't really be pre-recorded (unless it was assembled from individual words, and that usually sounded too robotic). Maybe now with decent voice synth and generative text AI we'll see games with more complex dialogue systems. Just don't ask the NPCs to count the number of Rs in "strawberry". 😜
Some really good points in this video. Ideally I could keep running modern games on my PC for a long time without upgrading, by turning down graphics settings, but when I upgrade I would notice a significant improvement. People have gotten burned by spending a couple grand to upgrade their system and not notice the difference from medium to ultra. People are also frustrated by games that are 100 GB because of all the 4K textures, even though they'll be playing at 1080p.
I think the 80-20 rule applies here. It took 20% of the effort to get to 80% photorealism. The last 20% take 80% of it. But most people are fine with 80% so they don't see the point in upgrading anymore.
I think DF said it best yesterday when I listened to them “the Terraflop war is over”. The games I’ve felt have been transformed most by Ray tracing are old ones like Quake 2, Portal, Minecraft and I’m really looking forward to Half-life 2 Rtx. Personally I’m happy with less fidelity and games that take 3 years or less to make as opposed to 4-7 years in the triple a world of gaming. I enjoyed Cyberpunk 2077 and Witcher 3 but have enjoyed the Fromsoft souls games more which are technically inferior to CDPR’s output.
The funny thing is that when you displayed the example around the 4:16 mark, I thought the right side looked better. This reminds me of how in Monster Hunter World you can disable Volumetric Rendering to stop the backgrounds from being blurred and washed out. There are some newer features that don't look good, even if they are trying to be realistic in some cases.
Yes, because the OTHER thing that's kicked in was the "planned obsolescence" of ditching older games for no reason where we would rather just be able to play good games on a stable platform rather than chasing down the ability to play them. A similar thing to see how this impacts development is a video comparing what development mindset in *_Left For Dead 2_* was vs. *_Back 4 Blood_* in terms of where cost, time, effort, energy, & budget went. We want THAT from devs especially being burned out by all the Live Service models that want to chase you down and throttle your wallet, rarher than give you something that's worth the cost vs. diminishing returns on all fronts.
we reached a certain plateau back then when 3080/3090 is the king. you can play on 1440p with DLSS Quality for years to come. in 2-3 generations the visual representation will reach a point where no more power is needed. backing this up with "no one wants photo realism in games" . a good art style is much more important. people do not want games to look like the real world, the opposite is the case. the freeze frame can also be told to using 4K and 1440p . you cannot see if a game runs on 4K or 1440p on a monitor. the pixel density is is already high enough for like 27" displays - a 4K display adds almost nothing to it at a super high performance cost.
Modern games are already starting to run like piss on my 3090 despite not looking any better than RDR2 which runs in native 4K with raytracing above 60fps without breaking a sweat.
I'd say we aren't "starting to get into diminishing returns" we've been there for the last 2 console gens. even PC cards seem to have plateaued completely this gen with advancements only really coming from software enhancements, not hardware enhancements.
If developers create a game that’s too demanding, the PS5 won’t be able to run it, which would lead to low sales and reduced income. As a result, they design games that the PS5 can handle, even if that means fewer improvements compared to previous titles. Similarly, when viewing photos, you won’t notice much difference between a 10MP image and a 100MP image-unless you zoom in. While it's a significant improvement, can you actually see the difference?
Flat panel TVs/displays have followed the same trend, unsurprisingly. SD to HD to UHD (and the variants of HD in between), have shown pretty clear improvements, but in the jump from 4k to 8k the conditions and content being shown have to be just right for it to have any impact. This just means as the years rolled on 4k tv's went from massively expensive home theatre-like luxuries to cheap consumable items for a fraction of the cost. Will gaming hardware follow suit?
As with all forms of technology, you reach a point if diminishing returns where "2x the power" starts to seem like "1.5x the power" then "1.25x the power" I'll relate it to 3D modeling, we use to see a new GPU generation mean multiple minutes off render times, now a new GPU generation may see only 10 seconds off render times or even less. But when you look at the percentage difference it's still about the same, because 10 seconds is twice as fast as 20 seconds, but it's still only 10 seconds at the end of the day
Show me a system that requires expontentially less energy to produce twice as much of ANYTHING. A perpetuun mobile still doesn't exist and will never exist. 😅
Going from 800x600 to 1078 and same framerates are about 40 % increase in gpu power or you could play at 800x and have better visuals but same fps as your previous gpu - Mindbogling... a gpu should be 100% better and prefebly at a higher resulution to pay off
Yup, this is why I have been enjoying VR cause the mobile hardware of quest increase by 2x every three years and you can see a next generational leap like resident evil 4 to Batman Arkham Shadow. Sadly I think the quest 4 will be the last massive next generational leaps for VR and every model after that will be improvements like sharpness and FOV.
@@esmolol4091 we arent talking about generators, but if your motor used to require 10 times the energy it would output and now requires only 5 times the energy you still doubled the performance, gpus main job is using paralelisation to gain 100x performance without increasing energy consumption at the same rate
the visual differences between rdr2 and gra6 will show how far we got in the last years. very interesting video, BTW. you point out a very true development. don't we see something similar, for example, in the smartphone industry happening in the last years?
That made me think. I was buying a high-end GPU like every 5 years to play the newest games. But at some point, games can't just get "better", so I guess there should be also a point where my GPU is still high-end years later.
kinda happened with 3090 & 6900, the new cards are more powerful but we've entered an era where cost is so high that they will be extremely relevant for a good few years yet, still 4k cards for the most part in a world where upscaling and frame gen are the new normal.
That would sure be nice, but if UE5 is any indication, we'll still be getting games more demanding on hardware at a similar rate without necessarily improving the visuals (as is the case with modern games starting to require TAA/DLSS as a crutch for optimizing performance).
@@kalebgross1310 it’s possible if you are still in 1080p a 1080 Ti can easily do it at high atleast not ultra. Playing a game, the same game at higher resolution is such a game changing experience and if possible skipping scaling.
I haven’t finished the video yet, so I don’t know if you addressed it, but another classic example of a wonderfully optimized beautiful title getting absolutely brutalized by modern “features” for a minuscule visual improvement is the Witcher 3. I played through the whole game at 1080p low/med settings on a gtx970 when it first came out to get 60fps, and then when I got my RTX 3080 in 2020, I download it all the texture packs and sky boxes and lighting mods, booted it up in 4K, and played through the whole game again getting 90 to 120 FPS, and being absolutely blown away at the visual experience. Cd project red updated the game to DirectX, 12 and incorporated those mods into the base settings, and the same exact system can barely get 40 to 50 FPS with the same settings on the new version. If there’s a visual improvement between the classic version that runs smooth as butter, and the DirectX 12 version that is stuttery and horrible to play, it is splitting hairs, and not worth losing a single frame of performance. It’s also shitty that all the new players that go to try out the game and its current version for the first time will get the shittiest experience possible. To answer your question mid video, if you gave me games that looked like the Witcher 3 (with better NPC character models though) or RDR2, and could run comfortably 90-120 FPS at 4K on an upcoming 5070 or 8800xt, then I would be perfectly happy with that if it meant the games got more FUN and polished from a gameplay standpoint.
The Witcher 3 current gen update is so god damn cpu intensive I just cannot understand why. I’m getting the same cpu performance as I did on the original launch which I played on an i5 4690k!, and my current CPU is the i5 13400f. It is eating my cpu up more than cyberpunk running path tracing.
Witcher 3 is also not artistically designed for ray tracing, which hinders the benefits a lot. Hardware Unboxed did a video talking about the visual comparisons and improvements possible with ray tracing, and they rated Witcher 3 as "Different, but not necessarily better".
This is so much worse with the new GPU hardware being focused on using AI to generate frames for games. Why optimize a single thing and keep a game from risking your PC’s longevity if the hardware will cover it up for you? “There is no war in Ba Sing Se” kinda shit is gonna start happening with video games now.
It feels like more people are looking to legacy hardware/emulators to get their gaming fixes, these days, because of diminishing returns, and because of the state of AAA gaming, as set by the publishers.
We peaked with Prey 8 years ago. Can we now start focusing on better optimizing games so that people who can't afford the latest hardware can still play new releases? In the incredible words of Hakita, the creator of Ultrakill, 'Culture shouldn't exist only for those who can afford it.'
Slightly off topic: I had gotten so used to my Windforce graphics card running Overwatch 1 at under 30 FPS. I was CONSISTENTLY landing headshots as Hanzo. My friend watched me play a bunch and said: "Dude you could go pro!" and gave me his 1080. I was ecstatic, set everything up. Literally couldn;t land a headshot for shit afterwords and went back to my Windforce and immediately went back to dunking on people. Then OW2 came out and now I have the 1080 installed. What does any of this have to do with anything? No clue, but lower frames worked better with my brain, dunno if thats an actual thing that others experience or not. Great friend, we still hangout and talk.
Less visual clutter and better general awareness since your brain won't be "processing" as many different images. The extra frames were probably throwing off your timing as well. Something similar happened to me back when i would top100 every level on DOOM16's Arcade Mode and Metal Hellsinger. Took a few dozen hours to adjust in each game
what i feel is that AAA companies are stuck to push graphics because its the only thing they surely can achieve with so many people,. if they would just try to make good games with subpar graphics then indies would surpass them in everything, not just good stories, originality, puzzles, critical thinking etc. AAA are now made to be movies or simulators of some kind with big laggy open worlds where u can do nothing smart, just pick berries or skin an animal, but still holding hand with where to go what to push and what head to click on.. (yes, i hated RDR(and i played many games since 2000))
19:19 not only are we not there yet, but designers are having to work with two lighting systems that make life a lot harder. See that HU video for heaps of examples where prioritising one means the other suffers (i.e. the game is mainly designed with non-raytracing in mind, so when you turn it on you get weirdly lit spaces). To nail both you have to carefully design the light of each environment using both approaches and use tricks to avoid one going wrong.
Yeah this transition period is pretty brutal. And sadly, because RTX is still so expensive and underpowered for normal customers, we'll be here for another 10 years until RT gets commonplace and at a decent power, at which point the developers could finally switch and only develop with RT in mind and make it a hard requirement.
4090 pricing may have been exuberant, but in terms of performance? It's fantastic. So is is the 4080. When the 5090 releases, I'll own that as well. The GPUs aren't the problem. Either your ego is or your wallet.
@@huskers1278 Its a scam just based on the prices. Really should regulate the market when no competition and slap a 50% tax on all Nvidia stuff, they are printing money with their monopoly.
Flatscreen gaming graphics may have plateaued to an extent, but as someone who plays a fair bit of VR there is plenty of room for improvement on that front. There are very few AAA quality titles and mods which transform flatscreen games into VR (like Cyberpunk) look fantastic and have amazing potential but even a 4090 can struggle to run them to an acceptable degree. I feel like that might be the area to focus on more in future.
With VR you're only ever going to get what you get as a result of improvements made for normal monitor use. VR is extremely niche and hardly anyone is interested in it and that isn't likely to change. Sure it's market share is increasing a lot as it advances and it's a lot more prevalent now but that's an increase of a small number to a less small number. VR is cool and I hope it gets more attention for those into it but ready player one isn't going to become reality. The Apple Vision for example is probably their biggest failure ever and the metaverse failed spectacularly.
@@paulc5389 AVP can't even play games and wasn't designed to so that's kind of irrelevant, hardly a shock what is basically a VR Iphone at $3500 didn't go mainstream. The Quest 2 sold 20m units which is not that far below Xbox Series sales (28m), sure it's still niche but there's definitely a market there. I think VR conversions of flatscreen games is a potential goldmine as it doesn't require extra investment or specific dev and can work surprisingly well, it's just the hardware needs to catch up. Once we get affordable GPUs that can run games like Cyberpunk in VR with the same fidelity as flatscreen (probably 2 gens away) i think it will explode as it's a truly incredible and transformative way to experience games.
@@paulc5389 Nvidia doesn't really to do anything in particular, better GPUs for flatscreen gaming = better GPUs for VR. It's just extremely hard to run VR games in high fidelity on current hardware as it's basically like trying to play in 8K flatscreen. But there are little tricks and things they can do for optimisation like foveated rendering which will only get better. VR is still in its infancy really, probably like the equivalent of a PS2 right now compared to what will eventually be possible.
This is why I never really cared about PC vs console. PC snobs have to pull up a screen showing the side by side of numbers between both. Frame rate, pixels, processing power and so on to make a point. Having to literally show side by side footage of both for us to squint and struggle to see the difference. By that point, I’m happy with console. Especially, since I don’t do online gaming. I do single player.
You have no clue and your comment screams it. If that's how you think PC is compared to console (screenshots) then you're lost like all of the other arrogant consoles users Best part is you've never had a PC so your ignorance will continue like so many others 😆 keep making out you know what's what. When you clearly don't DLSS? DLAA? MFAA? RESHADE? DLDSR? Frame generation? Lossless scaling? You have no idea what you're talking about 🙄
@@floresf8727 So you're full of assumptions and bad knowledge 🙄 I listed all of those things because you have no clue about them. As your experience & knowledge prove that. I ALSO have consoles & know the difference 🤣 what a clown. Your gaming experience is stuck in a box. But somehow PC owners are snobs for having gaming equipment better than consoles 🤣🤣 what crazy war are you fighting. You don't even know what FSR/DLSS is or the difference between them. You're playing on AMD equipment. But you have no idea of it's competitors software 😆🤣 I just laugh when folks like you try to downplay PC's. More so when the consoles are struggling with games right now 🙄
I don't really care about graphics, I care about gameplay, that being said I still hope the ceiling gets higher and higher, we are in a dip right now where small improvements take a lot of performance, but hopefully as time goes on these processes get more effecient and new games come in that raise the ceiling even more.
Scaling projects up is not a linear increase, costs rise exponentially. Ps1 era devs didnt have to deal with facial expressions for example but nowdays we are not only looking at very expensive motion capture performances for facial expressions but you also have to worry about hair simulations, sub surface light scattering and mussle movements, pores stretching, and the list goes on. And if any of these is neglected then you get a mismatch and a dissonance in how things look vs how you expect them to look and behave. A lot of the advancements we see now has to do with automating these time and cost consuming endeavours (RT, metahumans, etc) but shits getting more and more expensive and companies keep pushing bigger and bigger worlds and projects.
Many classic games pre-2000 were developed with dev teams under 50, and they usually developed their own engine at the same time. The original Baldur's Gate and its engine was made by approx. 60 people over 18 months. CP77 was in development for nearly 7 years and peaked at around 500 people on the team.
The extra performance should be used for gameplay, not graphics. Real AI powered NPCs, full physics integration on gameplay interactions, gameplay relevant path tracing (using reflections for stuff or something)... The only purely graphic performance we need now is to be able to achieve current PC graphics on VR, after that, i would be fine with graphics no longer getting better, Half-Life Alyx is so close already...
I agree. I mostly play retro games on emulators but when I see a modern AAA game the first impression is always "wow it looks great" and then you start noticing that the gameplay is mostly a cool background with modal overlays that say "Press X Now".
I have never seen a UA-camr move their camera view around the screen like you do. You do it really well, adding a lot to your narative. Very nicely done.
Style is dead in modern gaming, replaced with the cult of realism. And graphic improvements seem to be pursued for its own ends, rather than for the benefit or vision of the game itself. To the point it feels like companies make graphics first and then the game from it.
we are at a point where something truly revolutionary (hardware wise) would need to happen. Traditional polygonal rasterization is essentially completely plateaued. I'd personally argue that this has been the case for almost a decade, with the GTX 1080ti being the "beginning of the end". If you play at 1080p, that card is STILL all you need. It's insane.
This is also a problem for improving graphics with PC games. If you make a game that requires a 4090 to run at 1080 60 then your market is 4090 owners only.. i.e. hardly anyone. And that lack of willingness to upgrade is only made worse by Nvidia's greed and poor specs on cards that are actually affordable. If I was going to go Nvidia I wouldn't touch anything below a 4070 ti super or any 50 series card that has less than 16GB vram. And for a lot of people that's simply not affordable.
Orrrr, it's the fact that the peak of what the human eye can see, and distinguish difference in sits around a 2-4k resolution. We've hit the point in gaming now where the best looking games are already the best looking they can get. An unreal engine 5 game, using all their graphical and rendering power is as realistic as real life to your eyes. The human eye can't notice any difference in resolution past 4k. Anything past it, like 8k, is entirely pointless. To 99.9% of people's eyes, 4k and 8k will look the same. There isn't any way forward to go resolution-wise. The only improvements we can make to games now, are fundamental engine improvements, like better frame rate(which also has a cap the human eye can see btw, around 200-300fps depending on visual acuity of the viewer), which you can already reach on PC. At this point, there's not really much of an "up" to get to when it comes to game visuals. Now, it's all optimization, and getting the game engine to handle games the best they can.
Actually having full RT with Everygame without performance problems would speed up making games. As you would no longer have to bake lighting, shadows. Etc.
It's bigger than people think because there is a lot of time spend authoring and lighting a game using rasterization as well as all the time spent making new rasterized effects to imitate what ray tracing does
"Starting to get to the era of diminishing returns?" I just question the word "starting" In the late 90's through the early 2000's dropping £400 on a new graphics card made a HUGE difference Often games just would not play on 3 year old hardware. they wouldn't install, wouldn't boot, you'd get an error message telling you your GFX and CPU - were not good enough and that was that. A CPU and GFX that was top of the range 3 years earlier, could not play a game at all. That wasn't a rare occurrence. So you'd stump up for new hardware, and immediately you could see just why your old hardware could not perform on this new game Now, as my daughter plays the latest games on my hand-me-down almost 9 year old 1070 - that was MID RANGE in early 2016. I look over her shoulder as she plays and think "I wouldn't care if I had that back, those graphics look just fine to me" Yeah it could look better... but not $800 better. Not even close. Not ever.
So what you're saying is if that old PC and my PC with a 4080 super were running cyberpunk side by side you wouldn't think that my pc running it in ultra wide 1440p at 60fps with path tracing would be worth the upgrade vs what.. minimum settings at 1080p and 60fps?
There was often times new tech being dropped that just wasn't supported on older hardware like deferred rendering, tessellation, pixel shaders, or many other things. Those massive jumps were just because everyone was developing new tech for GPUs. Today we rarely get new tech for a GPU and the only recent addition being ray tracing. Those games that use hardware based ray tracing will do a similar thing to older GPUs from a few years ago
@@Minimal_M so you'd take the 1070 if you had to play in ultra wide exclusively with a 4080? cmon bro. anyways, that's what the extra 16:9 monitor is for. options 💁♂
@@crestofhonor2349 it's already happening, and it's even happening for current GPUs from other brands.. I ditched a 7900XTX for the 4080, yes I occasionally get a tiny bit less FPS in other games, but anything ray traced and I'm going from sub 60 to 100, except for cyberpunk, where I went from sub 40 to 60.. but that's proper RT (PT)
In general the focus around games has been too much on graphics. It's the first thing everyone will complain about. I'd much rather see some innovation in game design instead of chasing "photo realism". Or a shift of focus from graphical fidelity to improving stuff like NPC AI or physics which have absolutely stagnated or even regressed. The reality is that games don't need state of the art graphics to be fun and successful, you see it with many indies and especially Nintendo.
It's funny how game AI has barely improved at all (and is still rubbish, frankly) even as we're on a supposed revolution of self-driving cars, talking machines, even generalised AI. Much of that is hype and BS, of course, but you'd think we might already get at least some improvement in game AI before the Robots supposedly make lawyers and artists redundant. But nope. Much easier to do it in games than realworld, too, as you have all the variables and a very limited 'world' in which to operate. Driverless cars (supposedly) and yet driving games have awful AI opponents. Same in shooters, everything. Garbage. If game devs can't do it after forty years, what makes anyone imagine Elon can do it "by next year". lol
@@CurtOntheRadio It's weird isn't it? Seems like a much easier canvas to work with than the real life applications you mentioned. The big difference though is that they have dedicated research where unimaginable sums of money go into. So they are and will be more advanced than what you see in games where it's clearly not been a large topic of focus for a while.
@@vintatsh True, it's not a fair comparison. Buuuut, we might at least expect some improvement in game AI long before we hand over our children to be educated by the Robots, say. Or fire all the lawyers. Even if it needs a cloud subscription. It at least suggests some further scepticism is warranted about the use and integration of AI more generally, imo. I keep trying to think of ways to use AI in games and keep coming up against this issue: how do you swap data between traditional and AI without losing the point of the AI, or the use of the traditional compute? Like, say, you could have AI be a shopkeeper, or blacksmith, so you can better deal with them, be more inventive, maybe - say in Skyrim type. But any result would still have to be passed back to the traditional compute running the game, and it would all need be defined as variables the trad game code 'understands'. Yet if you limit AI to output the trad game understands have you gained anything, really? You can't broaden choices easily, you can't invent new things and all in all I think it's difficult to find applications. Though maybe sports games might be one - driving, tennis, whatever. Blah blah blah But where is it? Where is any game with a novel, 'proper' AI component?
"Several games with AI on the market today implement highly sophisticated forms of AI to elevate the player experience. Games with the best AI often elevate the gaming experience in cool ways. For example, The Last of Us: Part II uses advanced AI to power its enemies, providing them with an ‘awareness state.’ This means that if an enemy sees one of their comrades killed without actually seeing the culprit, that particular enemy will be on alert and more vigilant as they plot to take their revenge. Expect the future of AI gaming will include much smarter NPCs since NPCs have always been one core use case for AI in games." TLOU2 is pretty good re NPCs, I thought. Def a step up from most. Not sure if they just mean 'well programmed' here though. And there is this: ua-cam.com/video/PYtmFF02OH4/v-deo.html
@@CurtOntheRadio I feel like that will be a „next-gen“ type development. I fully expect PS6 and whatever the next Xbox will be, to have a highly capable neural engine to handle the processing of AI interactions and that will be their defining feature. The PS5 Pro already has a 300 tops machine learning block after all for PSSR. That is going to be the point where even AAA developers will start to embrace AI use-cases in games, once the average player has the hardware. It seem like the next logical evolution. As to how exactly it‘s going to be integrating into game design I‘m also a little bit confused, especially regarding storytelling but we‘ll see.
I mainly play simplistic grapghic games because there easier to run an look great over time while realistic games just look awful over time. Physics, and effects like grass skies and water should be the current main focus. Like if in vr the grass should move semi realisticly from my finger. An water splash could look so good if details werent blocking innovation.
Reaching the hard limit on computing power would be really weird and really cool because suddenly the entire software world revolves around these limitations and we could theoretically see a greater amount optimization across the board. A world of frozen computing power, ironically, could see its performance improve over time as more people get more experience squeezing out as much power out of the silicon as possible, which in tern could mean that software gets faster to make and therefore cheaper.
Great video as always! I think part of the issue is resolution and the other part of it is VRAM. This is the first time we have ever gone 4x in resolution, PS1 and PS2 had the benefit of being able to change the output resolution of the console to your CRT screen and knew it would still look good, PS3/360 jumped from basically 480p to 720p, then we moved to 1080p, now its 4k, that has eaten away at half the GPU uplift we got this gen. Then you add in that RAM/VRAM barley moved! Mass Effect Legendary Edition, the main benifit both ME2 and 3 get is the HUGE improvement to texture quality! On PS3 devs had 480MB of usable RAM (32MB for the OS), PS4 gave them 5.5GB, this gen they have 13GB, and when you factor in the resolution increase etc, how much has this been eaten into? And with Nvidia keeping 8GB card relevant 4 years longer they should have with the likes of the otherwise excellent 3070, you end up in a situation like black myth wukong, a game that uses the latest GPU technologies and can look incredible, yet in places looks like a PS3 game with terrible textures. I would argue the single biggest upgrade in the Horizon Zero Dawn remaster is the increase in texture quality which along side the improved animation in cutscenes and better framing of secondary/side quest conversations (based on the video DF did) and in all honestly, other than more VRAM, none of those improvements actually needed more GPU power, the PS4 version of Horizon Forbidden West still has the same incredible motion capture and well framed conversations as the PS5/PC version. While im a sucker for better lighting in incredible micro detail like peach fuzz on Alloys face (it does look so impressive) but does it actually make the game better? No not really, the gameplay and story (my god what a story!!!!) are why I fell in love HZD.
But...modern consoles are NOT, NOT, 4k though. The vast majority of the time to maintain stable 30-40fps they'll run at 1080p. Even if you set it to run quality, the highest resolution point you'll hit isn't 4k at all, but 1660p. It's only capable of upscaled 1440p...but hardly ever runs that. I and a friend hooked up the PS5 to a his computer and ran some programs to read its performance. Ran several games at multiple settings and it averaged 1080p nearly 70% of the time...even with the two "4k" capable games at the time.
From the 360/Ps3 era onward the console companies have been lying about the resolution of the consoles. The consoles almost never render anywhere near the resolution listed on the box. The console simply upscales the image. Your "1080p" console was in fact doing 540p-800p upscaled. Your "4k" console was in fact doing 900p-1600p upscaled.
@@iprfenix Consoles like the SNES and PS1 could all output at 480i and yet they chose to do 240p. Plus even then resolutions weren't consistent. There were a whole host of pretty strange resolutions during the analog days of CRTs. Resolutions were never static. Even the PS2, which could go all the way up to 1080i and did support 480p out of the box, often rendered at 480i. It often took two 240p images and interlaced them to create a 480i image unlike the other consoles which often used a 480p frame buffer and then output a 480i or 480p image. Plus even those that ran at 480p didn't always run at 480p and could have an internal resolution of 448p and just use the fact that there's overscan to hide the missing pixels. There isn't a single generation where your console ever always output at native resolution
PS2 era still ran at mostly 480i (which for rendering costs is largely equivalent to 240p) and earlier consoles mostly ran at 240p. Also modern consoles very rarely run any game at native 4k (stuff like the Quake 2 rerelease does run 4k120 on everything but the Series S), it's mostly stuff like 1200p upscaled to 4k with either checkerboard rendering or FSR and 30 fps.
Small Indie game developer here (or at least starting to develop games) I think one of the checkpoints to get to that point is draw distance, the best 2 examples of that I can think of is for one: The whole Ghillie suit and the grass at long distance thingy in DayZ, and the other example is how a whole lot of games are situated in this sort of "valleys" or "canyons" with high walls that conveniently stop the player from looking at details in long distances (although, nowadays it's more of an artistic tool for representing different ideas of a storyline). I just think that the day I can have the exact same visuals while looking through a x14 scope than looking at my character feet, or the day I can have an AC-130 stylish mission where I can see the white in the eyes of a character while zooming in, is the day that we're there (although I use to play 2010ish games, so maybe we're already there and I don't know yet).
when i played dayz in 2014-15 me and a friends were talking about the grass issues at distance and he had an interesting idea At longer distance, just move the ground texture up, so if you are prone you are under the ground texture This way you wont stick out and realistically someone that far away shoudnt be able to see you anyway
I feel a big part of it is not just technology but the artists and developers actually making it look nice. You can have the tech for the most beautiful scenery ever but you still need to arrange and light and animate it yourself, and thats effort a lot of AAA studios may or may not wanna put in
11:30 honestly yeah. Because companies would then start optimizing to get better performance again. We’d see these beautiful games that actually run well instead of just requiring you to have a 4090 to run at 60FPS
not true, a lot of games are build first with console in mind and than pushed with some extras on PC. Vegetation is one of those option where the engine is just put to do a lot and that's called ultra settings.
One thing to note, is that consoles (PS3) at one time gave a price to performance balance. including a Blu-Ray player, media center, and gaming device all wrapped together for a decent budget price.
@@jiggerypokery2962 Ps3 was clearly overpriced at launch, but it was the cheapest bluray drive at the same time... Many sales were only driven by the bluray drive itself.
The problem may be looking at consoles that way. Back in the day many people bought a ps2 or xbox for its home media capabilities such as dvd and blu-ray. Ofcourse the world has changed now but then so too should the console's marketing. What about if you sold consoles like phone plans? That way you could jam more premium components in and have the overall cost be higher but not less affordable. You could also package it with big brands for the modern age. So lets say if you buy and xbox subscription you get the new xbox, gamepass, netflix,spotify premium,youtube premium and amazon prime. That way your entertainment and online quality of life services are all covered under the one payment. Much like how your dvd and game playing device was all in 1 unit in the 90's and early 2000's
im on a $300 mini pc, playing modern AAA titles at their lowest settings, and im stoked about it. the fact that 2-3 years from now i can expect to be able to replace this pc with something that can play these same games at high settings in the same price range is awesome. not only that, but my setup only uses 25-70watts total with monitor and everything. i live on solar panels, so needing a 500watt gpu to play things is just not a possibility for me.
I wouldn't mind if performance didn't improve beyond this point (at least for a very long time). It would allow developers the time to actually improve-, and master the rendering features we currently have. Why most people want more performance today is because games perform poorly. But at the same time, games perform poorly because developers haven't had the time to master what we currently have. It's a chicken and egg kind of situation, where our economic system (feeding on non-stop growth) requires consumers to WANT the new products. But in reality, just like game development takes longer these days, so does optimizations to engines, to games etc. It feels like the kind of rendering optimization improvement developers made during a 5-year period on the PS2 would take 20 years today due to the added complexity of modern rendering. I would personally like to spend less money on hardware, and allow developers to become more focused on creative graphics engineering.
If developers stop the practice of developing games that perform best based on hardware that is not even in the market yet that would be a good thing. Personally, I lost my enthusiasm for gaming a while back when it felt like developers (especially AAA studios) decided that good looking games mattered more than good or innovative gameplay. Probably only those who grew up in the era when gaming started going mainstream in the 70's and 80's would get where I am coming from.
Developers don't make any decisions you attached to them in your comment. The R&D or production teams: 1 artist, 3 devs, 1 analyst, 1 tester, 1 dev-leader. Those teams make up less than 60% of the company staff. Whatever you think devs do, they don't make decisions. Especially the strategic ones.
When was this "a while back" ? 30 years ago ? Because that's what AAA studios/developers did since forever. Literally. And the current situation is much better than it was in the '90s and early 2000s, where games would literally not run on 3 year old hardware. Nowadays almost all games can still run on the midrange GTX 1060 from 8 years ago. This will be even more true when RTX 2060 reaches 8 years old.
@@p4r4g0n I'm almost with you on that one, but I did had some exceptions (DOOM 2016 and Kingom Come: Deliverance, from what I remember). And In the Warcraft 3 days, we were pirating it (it was common in my area, pretty poor country). It's funny that for Warcraft 3, I had it installed for several months before I could play it. Because, I didn't had a GPU initially.
They're literally the same... I have a PS4 Pro and a friend has the PS5. Putting them hooked up to a computer and running some programs...the PS4 Pro runs about 95% of what the PS5 does. PS5 basically just has rt and can run 30-45fps rather than locked at 30. They're the same
PS4 Pro crawls running Cyberpunk not even holding 30 fps at 1080p with some dynamic resolution and lowest settings possible. Meanwhile PS5 runs the game at native 1440p with high settings and ray traced local shadows (which are cheap) then upscaled to 4K at locked 30fps. But sure, PS5 is barely more powerful.
@@phattjohnson Bruh, they both run in predominantly 1080p-ish at 30-40-ish fps, and some titles can be 1440p-ish upscaled to around 1660p-ish...and run 30-45fps... The only difference is PS5 has rt in some games, and has a few new titles. That's it! Lol
not just he fidelity, the fps thing is diminshing returns. If we pretend we can tell over 60 fps despite every study proving otherwise, 30-60 fps is a 16.7ms difference. 60-120 is 8.3ms different. you actually have to go from 60 to 960 to get the same difference as 30-60. the difference between 60 and 120 is almost half of human perception. it's literally a difference tooo small to see, but even if you could its only half the upgrade.
Sorry that your visual acuity is decidedly on the left side of the bell curve peak, dog. Humans can identify images shown for 1 frame in a 200 FPS sequence of frames. You drew the short straw.
@@nopenoperson9118 Sorry that you're incapable of reading. Please try again and recognize I said nothing about myself and only about every study done on the subject ever. No, humans cannot identify images shown for 1 frame in a 200 fps sequence of frames. You do know we're on this thing called the internet and so you can fact check things, right? 99.9% of studies done on the matter show 60fps as the limit, 0.1% suggest that some people might get to 68, but that's it. Showing a single frame for 1/200th if a second is not the same thing as showing 200fps and telling the difference or seeing that frame. Even in that one they put "see" in quotes as they didnt see the frame, they saw an afterimage, a visual illusion. when shown 1 frame not 200 frames with 1 different one. Just try googling something as simple as "how many fps can humans detect" You wont find a single solitary result that says over 60. there's multiple reasons why anything over 60fps is the same as 60fps. You're simply lying to yourself and trying to justify spending all that money. The bell curve peak would actually be more like 53 fps, since 45 is the low end and 60fps is the high end.
@@albert2006xp I just did a new build and my old one had an 7th gen I3 and a 1050. My buddy still plays on that pc lol its fine for medium-low graphics 1080 on most game 60+ fps. New games are optimized for people with $8000 PC. Not very many people can even play the new games they are releasing not because they don't want to but because they don't have $3500 laying around to play the new game at not even the best quality
it’s probably because devs wanted 32GB of VRAM for the PS5/XSX generation, but only got 16GB. The jump from 360 to PS4 in VRAM was 16x, PS4 to PS5 was 2x…
High vram alone doesn't help if the gpu is not fast enough to process the saved data and ship it onto the monitor. The clockspeed will always be the first bottleneck and the higher it is, the more energy it drains, the more durable it has to be and the better the cooling has to be, which leads to bigger and bigger parts (cooling solutions).
@@thelazyworkersandwich4169 Yeah ofc they don'tt complain because no one tells them to code a game for console that runs in 1440p or 4k at AT least 60fps stable, WITHOUT disabling/reducing a dozen effects, like shadows, draw distance, lod, etc. until it turns into a mushy mass of indistinguishable hot garbage.
I think dlss is a problem. I feel like games are designed so most people need to use dlss just to run it at 60fps, with the downgrade in visuals it translates to.
Now that we've hit diminishing returns on graphics, I think the only place for it to go now is immersion and how you interact with the game world. Physics, AI, NCP behavior, interactable objects, World size, ground breaking game mechanics that have never been seen before. Games that have a bigger scope. Star Citizen comes to mind. that game is pushing boundaries with it's mechanics. Like for example, a group of players being able to walk around a ship that's being piloted by another player, landing on planets, going into space, entering quantum travel, all seamlessly. No loading screens ever. I'm just so ready for games to more mechanic focused and less graphics focused, and hopefully THAT will be the driving force going forward in the gaming industry and the reason for new hardware to exist.
performance *MUST* be the priority going forward;
games shouldn't cost millions of dollars and run sub 30fps.
It's in the hardware companies best interest to push graphics.
Bro you know that a freaking developer can cost a 100k per year and even more? You need millions of budget to create a game.
Blockbuster films cost millions and run sub 30 fps. What's the difference? Both most of today's film hits and most popular games are crap.
@@classicallpvault 🤣
laughable comparison, you don't _play_ movies.
@@maervo4179 and somewhere in that budget, they can make room for optimization
🙂
“The enemy of art is the absence of limitations.” - Orson Welles
I believe the digital age has disintegrated the mindset of optimization. Games no longer have to work before it goes in the box or fit on the cartridge or disc. Games can be 200 Gb, run poorly, and get patched later. Broken indeed.
Ja
Is optimisation art?
@@CurtOntheRadio i'd say it is.
anyone could write some code that does a thing, but not everyone can optimize it to run well
Like how anyone can dip theyhads in paint and print them on a paper, but not everyone can make beaautiful painting from that
@@captainjimo Hmm. Then everything is art too. I'd venture Welles wasn't speaking about optimisation so much as art is about the constraints - only having these few notes, these few instruments, these few locations, whatever ie working within material limits that constrain one and getting the most out of it.
Optimisation is more about removing constraints. Arguably the art comes in working within constraints, whatever they are, and getting the most out of it - not removing the constraints (which is what optimisation does, and which is largely a technical, objective, engineering job).
@@CurtOntheRadio Everything isn't art, if that was the case there wouldn't be art schools.
"Only dogs can hear the difference" is a GREAT way to describe this predicament
Also, even a dog can tell the difference between older well made games vs some of the *expletive* we have being released (and shuttered in failure).
It's a great way for the CEO of a console company to try and persuade people to stop pursuing tech improvements because it eats away at the cost and effort of a console 10x more than it does on a PC where you can simply adjust settings. You guys eat up marketing so easily while thinking you're too informed to buy into it.
It's interesting when you look at it from the perspective of Memory improvements. The PS3 had 256MB of RAM, plus 256MB set aside just for graphics. The PS4 has 8GB plus 2GB. That's a 30x improvement. THIRTY EX! O_O
Now, the leap from PS4 to PS5 was....2X....it went from 8GB to 16GB. A bit underwhelming from a memory standpoint. So, while the PS4 could store 30x the textures that the PS3 could, In contrast, the PS5 only doubled texture capacity from the PS4. Crazy.
256gb thats a lot 😀
@@MikeFeatherston0700 Augh! MEGAbyte, MEGAbyte. XD
Sorry, typo. I fixed it. =P
Gaming is hitting what Apple hit a few years back. Apple used to be so obsessed with thinness that in every new iPhone and Macbooks they got thinner and thinner to the point that it costed them battery life and heat distribution and eventually performance. Now Macbooks are back being a bit thicker and iphones too and their performance is on a good track
The problem is devs are overthinking about graphics, yes having huge graphics improving each generation is nice but we need good gameplay, storytelling and optimization. Right now, we have games that tank your fps trying to be as photorealist as they can be with bad gameplay and no optimization or barely (don't tell me relying on scaling is optimization!)
They are not overthinking. They are cutting costs, by using rt, dlss, frame gen, and get money from partnering with companies like epic (ue5 lumen), and nvidia. It’s easy money for everyone.
THANK YOU, when going from Low to Medium setting almost made it look like you were playing a DIFFERENT GAME and if you went from Ultra to high or even high to Medium thee wereBIG FPS gains NOT ANYMORE especially since DLSS and FSR gave Publishers the Excuse to cut COSTS by killing Traditional optimization
No, the problem is with the increasing complexity and features in engines many artists don't gain low level knowledge and rely on existing platforms with limited developer teams that constantly need to cater new people. So basically lack of expertise + stupid deadlines.
Nobody ask for graphics, the industry did that to itself, giving the illusion of value on visuals. Black myth wukong would have been a 1 - 2 years max if it wasn't that visually heavy. also that shows that future games won't even have to care that much for optimization on framerate, with FSR/DLSS and framegeneration, game will keep spending on graphics and leave "force" FSR/DLSS and framegen as wukong did to perform as expected.
Spot on, I have no issue with FSR or PSSR being used to enable a rock solid 4K30 game being playable at 1440p60, i think thats a fair trade off, but when you see games abusing FSR to the point that PS5 and Series X games have an internal resolution of 720! Its crazy, Immortals of Avium is actually a fun game as well, but going UE5 ruined the game on console as it had no chance of hitting 60fps without FSR at a base res of 720p!
Fuck graphics what pains me is how environments are sterile and uninteractive and how physics in games is dead or how AI is literally same for over a decade now. Burnout Paradise STILL has the best realtime best car deformation system in a racing game and that game came out 16(!) years ago then titles like Red Faction Guerilla that had insane destructible and interactive enivornment or The Force Unleashed that combined multiple middleware tech to reach design objectives of devs.
Devs are pushing for visual shit like mentioned in video that can be barely visible meanwhile you play current gen title and while going through foliage you pray observing if your character model will interact with that foliage properly or if your character will just ghost through that shit lmao. What the actual fuck.
morrowind's map while smaller than modern games felt so much more like a real place that you could explore everywhere than modern games.
@@arkgaharandan5881 That too. Devs really need to scale down on fancy polygons and provide meaningful and more interactive experiences and by interactive I mean everything that includes physics systems. Everyone slowly becomes Ubisoft when it comes to bloat of safe slop designed on sterile corporate templates.
A fact you can launch decade old title and look what physics engine of said title does and not just be impressed but you cannot find a game 15 years later that is even a bit close to this in that regard is fucking insane. Only titles that push something like that are some meme indie tech demos that market themselves around singular physics based gimmick and thats it.
When was the last time we had cool gameplay centered innovation in a big budget title? Nemesis system in Shadow Of Mordor 10 years ago by Monolith. Nemesis is also AI based and looking at their previous work with FEAR in that regard no wonder they were the last studio that even tried doing something around AI.
Again what the fuck is going on, its like entire industry became MCDonalds tier.
Car destruction isn't even on the fault of the developers, it's on the car makers who hate seeing their cars destroyed. Most racing games license out cars to use and often the manufacturers get to decide whether or not they want a detailed destruction model
@@doooodeh Nemesis system has been completely locked to just WB because they patented the tech so no one else can use it
@@crestofhonor2349 I know about the patent stop excusing talentless worthless western developers that could design some other gameplay centered innovation. Same with licenses and racers, there are games that are using custom vehicles and those STILL arent close to Burnout Paradise in that regard.
Immagine devs spending so much on nice graphics, only to apply forced TAA which makes it a blurry mess anyways.
Trademark Unreal Engine 5
But for ray/path tracing you still need TAA or there will be lot of noise.
@@TheNamelessOne12357 Yeah sure develop the game around the most niche and demanding graphical option we have
The problem is they are taking shortcuts with the graphics and are using TAA as a way to upscale and fix all the models
@@TheNamelessOne12357 no. TAA is meant to cover up bad execution of Ray tracing, ambient occlusion, and many other effects. They are all implemented poorly from the start, and devs use TAA to cover up the mistakes and low quality effects. No effect truly benefits from TAA.
10:45 YES! There's a reason why indie games are also popular, as AAA. Because indies focus more on gameplay, while AAA on graphics. You do the math.
not only gameplay, also story and art style which is more important than raw graphics
@RafaelMunizYT I'm more for gameplay, since you know, we're playing a game? Yes, it's true that art style is also a key appeal but the "story" part is exactly why nowadays we get less and less of gameplay to make the game feel more cinematic in AAA space.
@@Dima064 I feel like people forget the point of gaming is having variety, and nowadays we have more variety than any other point in gaming history. I for example mainly play "older games" so the mess of the recent releases don't affect me much because there are so many games I haven't played yet, I don't need to jump from release to release. there are many AAA games there are more gameplay focused and it's good to have both because I for example enjoy story focused and cinematic games but I don't wanna play that all the time. I like variety so I play story focused games, gameplay focused games, multiplayer games, indie games and I have the best of all worlds
Very simple answer: I do want better hardware. But not for better graphics, but for bigger scale. Keep current resolution, models, lighting, and so on, but add more detail to the maps(more props, furniture, interiors), more actors that are more interactive, farther view ranges etc.
There's still a lot of growth that can happen to games as far as their looks go, it's not all about resolution and raw texture quality or number of triangles.
Everything you just described is still graphics though, and there are improvements being made in that realm. That's part of why things like mesh shaders are useful just to name one example. A lot of stuff like this is happening in game development it's that it's more of a literacy issue now. It's sort of like how people might know they like one song in a genre and not one in another, but might not actually have a way to convey why that it is. Gamers might be able to recognize that something is improved or different now but they no longer can articulate why because the difference is no longer a jump from "barely being able to represent something that looks like a thing" to "being able to represent the thing". Sort of also like how you can tell only when CGI in films is bad, but how someone watching would struggle to tell you why it looks bad and sticks out. Game environments generally are more dense, they've got way more stuff in them, shit farther view distances got solved a long time ago for everything but foliage with an infinite reversed depth buffer, the only thing to push there is maybe higher fidelity LoDs and that's it. Like all of this stuff is improving you just don't have the literacy about how games are made to be able to articulate or grasp what's different now.
i cant wait for AI speech algorithms to be implemented into gaming. just imagine a whole story based on AI given a small script and scaling the story based on how you react
more clustered scene = higher cpu & vram requirements
Best part about that type of improvement is it doesn't have to lock anyone out from a hardware perspective. Like render distance in minecraft
@@adeptalakay I want to talk to an npc not chatgpt. Chat bots are not at the level where they can fit seamlessly into a game world while making sense and not hallucinating
Another problem is that companies aren't innovating anywhere like they used to. They're playing safe with trend chasing at the forefront. Also, the whole release it now, fix it later mentality is also hurting the industry. Optimization really hasn't been a thing since at least the 6th generation of consoles, maybe the 7th.
Graphics are fine now. I don't need them to be any better. We need smoother framerate and innovative gameplay that doesn't involve sleazy, greedy and FOMO tactics.
I've only ever bought two re-visioned consoles, the PS2 slim (simply because my fat model was having trouble reading dual layered DVDs) and the PS3 slim (because my original 80GB model got stolen).
I'll play the commie here now despite hating them, but capitalism and corpos are ruining the whole entertainment industry.
Music is safe because the record labels don't want to risk and lose money. Garage-sized bands don't get algorithm attention because they don't bring the views for ads and sometimes are too edgy for it as well.
Movies make remakes and sequels and "somehow Palpatine returned" things because movies are too expensive, full of CGI, and the studios don't want to risk investor money.
Games are pure safe, buggy slop with better graphics because Publishers don't want to lose money, so better pump that slop for pig gamers to consume ASAP.
And in all those companies, the CEOs care more about the trimester earnings instead of the long-term health of their business because they have to cater to shareholders, including DEI shit because Blackrock and Vanguard money.
I really hope one day, shareholders start avoiding the entertainment industry like the plague, the whole thing crashes extremely hard, everyone whom is unneeded in the industry gets fired, and passion products rise from the ashes of this industry.
that will never happen unless capitalism dissappears completely or the nation somehowngets a total reset. Sorry fam. Til then, gonna have to rely on independent (usually left-leaning) devs
Triple A Companies are mainly the ones doing this. The Indie Space is kinda fire, so perhaps its worth a try. And games from FromSoft and Black Myth Wukong are really cool exceptions.
"Another problem is that companies aren't innovating anywhere like they used to. They're playing safe with trend chasing at the forefront."
This is a symptom, not a cause. When you have to bet 250+ million dollars to get the game on the shelf, you're going to naturally be fairly risk-averse and needing to target a very wide audience to have any reasonable chance of getting enough sales to turn a profit. But I agree, it becomes a catch-22 - you can't make money without appealing to almost everyone... but making something that appeals to every means it's a bland mish-mash that doesn't REALLY knock it out of the park for anyone.
@@jellorelic Seems like a simple solution, then. Don't have the game cost 250+ million dollars. There is little reason why games cannot be made cheaper, we have many examples of Great games that did Really well, despite costing little to no money. We've seen single devs crank out a game in a fraction of the time it takes for a Triple A game to do so- and on such a smaller budget, it isn't even comparable. Over-inflated, under-developed, shallow gameplay- all hallmarks of Triple A, now.
I think physics simulation could use a major overhaul. It's been an afterthought for quite a while now. Even the games we recognize as having really good physics exhibit floaty objects that rarely express their true weight. I'd like to see the industry turn their sights in that direction. Audio has also fallen behind. Some games sound great, but only in comparison to other games. Much like RTGI which bounces light, I'd like to hear sound that bounces as well. It's not often that we have a game that does that. More often, if a sound is coming from my right I only hear it on the right, as if there were a void on my left that sound cannot reflect from. Maybe these things are not important enough to the consensus, but I for one have been wishing for these improvements.
I think physics simulations is a hard thing to do not because of limitations but because making physics an integral part of your game is not easy.
Half-Life 2 and Half Life ALYX are like the only examples where really good physics simulations actually meant something for gameplay, I wish more games did that.
That, and the uncanny valley of character faces, movements and overall realism in the way NPCs interact with the world.
Cpu req increase a lot by the time and physx npcs and ia are always the same....
@@keatonwastaken I’m currently playing Control, which is why the subject is fresh on my mind. The physics are great in that game, but even so, objects seem to have one predefined weight, which takes me away from the experience a bit. I’d say even if physics aren’t critical to the experience, I think it would make games more immersive. Regarding a new frontier in gaming, that’s among my top picks.
@@LilMissMurder3409 Absolutely, especially open world games. It’s back to the drawing board for sure when it comes to that, because it would be too costly and unreasonable to mocap filler NPC’s. Inevitably I think AI will be a big part of this gaming renaissance, as much as I hate to admit it. I can see how it would be beneficial, but it’s also a big can of worms. I can feel other commenters readying their pitchforks now lol, and I can’t blame them.
We were already at diminishing returns with XBox 360 era graphics, it's just absolutely obvious now. You can't just throw more teraflops at the screen and wow people with perspective-correct 16x anisotropic filtering. The giant RAM and CPU budgets of modern PCs and consoles are spent on quicker development so that more content can be produced easier even if it's not running very well. The irony is the project scopes still expand to fill all available time and money.
My hardware is just a means to an end. I just want good games, and it feels like there are less of that every year.
That's just a result of ageing, unfortunately. The more you experience the higher your standards become, and the less novel everything is. What was once exciting and unexpected is now dull and predictable. And there is a foreshortening aspect when you look to the past. The past contains all the great things you loved, it is full of them. The present only contains the things you love right now, which are inevitably fewer. Same reason as why music was always "better" in the past.
@@calmhorizons You aren't wrong but I don't fully agree with you. It's pretty obvious when you look at games being released say 2004-2011 and compare it to games being released 2017-2024 that the quality and quantity of good games has dropped significantly.
Same with music. I'm mostly listening to old music, and discovering new stuff all the time so its not nostalgia in any way. The only thing one can aruge about is survivorship bias in that all the bad stuff from the past have been forgotten/ignored and only the good stuff remembered and filtered out. That works to your advantage though, since you know the old stuff is going to be pretty good for that reason.
@Skumtomten1 most old musics are terrible, and like every genre or year youre on. Most music are bad and a good number is decent and a few are very good. Obviously you're only listening to the good ones.
@@calmhorizonsPredictable doesn't equate to bad. Music only has so many note combinations and chord progressions, but a new unique spin can still intrigue someone who prefers his generation's young music. Pokemon, Mario kart, Mario party, smash bros, video game market is full of predictable but still great releases with new tweaks. One piece can be predictable but it's still the greatest work of fiction in modern history.
@@thunderstar254 True to a point. But you might be discounting the vast gulf between possible combinations and plausible combinations.
There is a reason why we see the same flavours, melodic intervals, story beats and visual motifs repeated over and over in art and entertainment - evolution furnished us with a limited margin of acceptable interests and constrained sense organs. To make a crude example - it doesn't matter how many variations of shit flavoured ice cream you make, it ain't gonna sell. 😁
Graphics can only add so much to realism before the immersion breaks with lack of physics. The next step is to make the map have physics
That's my biggest issue with games. I play for immersion, and I can very easily get immersed in a game with a great art direction, no matter how cartoony it is.
But the more realistic a game is, the smaller an unrealistic feature has to be for me to get yanked out of my immersion. It's why I don't like very realistic texture mods, they destroy any and all immersion I might have had.
Me playing, very immersed in the game, sees a hyper realistic texture on some clothes that is in juxtaposition to the game textures, immediately remembers I am playing a game and gets bored.
Problem is, physics are exponential more compute heavy than graphics.
@DrTheRich that's true, but each physics improvement has a bigger effect
bad news, almost all these improvements in graphics are only possible by making as much of the environment as possible static and non-interactive
@Tuxfanturnip Battlefield 1 is a great example of a destructible environment with both physics and tasteful visuals
I'm glad that the graphics are plateauing. Hopefully now, the focus will shift to improving optimisation, gameplay and work conditions.
(Spoiler: the focus will be MTX)
Whats MTX?
@@Itsnothing389Microtransactions
@@Itsnothing389microtransaction
the actual focus now will be automation of graphics with AI
@@Itsnothing389 *Mountain: X!* Jk, it's a cancer medication. Jk, it's an audio equipment company.
I think there's one thing being overlooked. In your test at 3:50, the video looks similar because the models are the same, since they were made for that level of graphics. In order to make the most of the hardware, the models would need to be higher poly, to the point that it doesn't look any different at some point (at some point, adding points to a circle to make it more round won't be noticeable). Same with textures, there are only so many fitting textures to add to an area or object, so buffing the hardware wouldn't make the textures look better. Basically, the buff to hardware doesn't increase graphics much anymore, but does allow for more things to be happening simultaneously, with more in-depth physics and lighting simulations, and multiple AI running around with animations and interactions. At some point, graphics are no longer the focus, and that's good.
At this point, I don't care anymore for realistic graphics. I prefer unique art style which always tend to age better.
You can blame the mindless NPC consumers for the focus on ultra realistic graphics. They have no standards and only care about le pretty graphics.
Both can happen at once; I think Cyberpunk is a good example of that where it looks unique and realistic but it's not soulless.
Good for you. I want games to get closer and closer to perfect realism. I want to look at better graphics.
I have a high-end pc with a 4090. I see myself playing my hacked oled Switch more especially in the past 3 months. We all know Nintendo doesn't care for high frames each game has cartoon visuals I don't mind one bit as long it's fun. And, if you can run the Switch game on pc with Yuzu or Ryujinx Luigi Mansion 3 looks like a CGI film with a couple tweaks at 120fps.
I like both. Not everything has to be super unique in terms of art style. Also realistic games don't always age poorly, it just depends on how you hide your limitations
Diminishing returns is the best way to describe modern gaming.
we have reached the point where Upscale and AI frame generation are recomended to play actual games with graphic cards over 1k dolars to play at not even true 60fps, its disgusting the obsesion with graphics when there are 10 year old games that look amazing, like metal gear phantom pain
@@Kurainuz That's because good graphics has everything to do with visual and stylistic appeal and only a little with pushing certain limits and technical impressiveness.
Just look at that Silent Hill 2 remake vs the original in its visuals; the old one is vastly superior even though many of its artistic choices were made because of hardware limitations.
Games haven't really "wow'd" me graphically since the PS3 era
@Lucrei. I'm older. Games wowed me with N64 and Playstation 1 and 2 especially coming from 2D gaming.
@ haha don't worry I was right there with you! ;)
Going from Sonic 3 to Ocarina of Time was a tremendous leap
I've been frustrated with the games industry's obsession with graphical upgrades for decades. I'm an artist. I appreciate things looking good. But driving up the barrier to entry for considering new games by a few hundred bucks every couple of years was never feasible for my family. As an adult, it's still not. My friends and I are also huge D&D fans, and my buddy is trying to get me to drop over $400 to purchase on a video game based on D&D. It feels insane. I'm also a huge fan of Halo, and I assure you the graphical limitations do not stop us from playing games from 2001. Hell, even the improvements between Halo 3 and Reach were significant, and they both ran perfectly fine on 360. I never needed a 360Elite.
We all know we've had a steady decline in the quality and depth of the most expensive games since the 2000s. Obvious oversimplification, but with that, it's just been insulting that the marketing has always been about how much prettier new games are. So I'm VERY happy to see we're hitting a plateau and a lot of people are agreeing with me. But that's me and my poor ass that barely plays video games.
The people focus too much on AAA experience. You have 50.000+ all kinds of creative and imaginative games listed on Steam you wont be able to play in your life time. (It stretches from stunning pixel art games to very impressive 3D physics engines.) If people support indie studios of games that they like, hence they increase the demand and get more of the games that they like. It is as simple as that. If people complain about lack of creativity in AAA and still buy that sh!t, they are getting what they deserve. It is basic market logic 101 and the democratic choice we have as consumers.
Yep! Welcome to capitalism!
@@Stuntmandouble08 The problem is that the majority of people aren't like us and just think what they're told to think and buy what they're told to buy through manipulative marketing tactics such as "FOMO" heh
Nintendo figured this out years ago, that and the Japanese love of anime meant they could step off of the power battle and keep focus on gameplay density as opposed to pointless tacked on content. Hifi Rush is another example of art style taking precedence to create a fun experience. Then you have the recent AA games, basically games with "good enough" graphics pushing 8-15 hour (typically) single player experiences, cheaper to build because they're more focussed smaller undertakings. Sony blew off $165 million on Concorde, that could have financed 20 AAs to compliment their platform. Software sells platforms which leads to more software sales, only Nintendo out of the console makers seems to remember this, at this stage the libraries on PS and Xbox look like the libraries of failed consoles. 20 for one Concord, you put 20 good quality single player exclusive AAs on PS and their fan base is happy. Nintendo publish a lot of one million sale break even games because they know it helps drive sales in the main, everyone else now bets the company on each release, now commonly meaning one title a console generation. Look at Gears of War, 4 entries on 360 and one XBone, 2 on Series, that's 4 sets of sales for the same dev time as one, 4 times £40 15 years ago. Which do you think is the sustainable model?
17:10 I think another reason why improving graphics is partially pointless is you don't focus too much on details when you are in the middle of some frenetic action. So while some things are worth inproving others are not, having more field of vision, better lighting and overall better depth perception is good, having many distracting details is pointless
I would 100% love it if the world just decided that graphics right now are good enough.
Why push progress 1mm forward if that 1mm costs me 1000$ just to keep up?
The answer is "because they want that 1000$ from me again and again as they very slightly change almost nothing but performance cost"
The industry is very clearly and very deliberately sabotaging performance in order to force consumers to buy new expensive parts if they want to keep up.
They might actually add slight improvements to games, but they're so minimal. The only actual change I've noticed in newer games is that they run worse and look worse on my hardware than older games do.
That's all there is to it.
It's never been about making prettier games. It's always been about making consumer hardware obsolete.
I think this is a bit of a stretch. It's not that manufacturers and publishers are colluding to make hardware obsolete, it's that hardware evolution continues steadily, and prettier games attract consumers, which in turn cost more resources to make, which in turn influences publishers to crunch development time to save money. It's a perfect storm of pressures, and in the AAA space that means 4-6 year old hardware becomes obsolete when, for example, games like Alan Wake 2 are designed around mesh shaders - and stand out visually for pushing the needle forward.
Your observations aren't wrong, but the narrative is more simple and more nuanced than "the industry is against its consumers."
The truth is that there's more computational power in the hands of developers than ever before, but those resources are being squeezed to get games out the door quickly, and rely on cheap techniques like TAA to buff out the visual imperfections. I'm not sure of the solution. Times are tough right now, new games have very much stagnated, and we need more publishers willing to take on more risky ideas than yet another remake of their next greatest hit. Alleging a conspiracy doesn't help anyone.
Graphics can take a backseat when we finally have raytracing doing all the lighting as the norm. That’s going to be a huge graphical leap that can’t be understated.
@@Granpire they are actually, nvidia got big paying for games to use technology only they could provide so if you wanted the best version of the game you had to buy their new gc "the way it is meant to be played", see it like epic paying for exclusives but instead of the game coming to steam later, they just came to steam uglier, for the majority of developers games got cheaper to make
@Sauvva_ That's certainly true with Ray tracing, 4-5 years ago it was moreso about Nvidia's drivers implementing per game shader optimizations in their driver updates. For me, it's mostly about their unmatched upscaling quality - I'd be inclined to purchase an AMD card if they had a valid competitor to DLSS. Intel comes closer but they're not the most high end cards.
I feel the pain of the price gouging Nvidia's done, but I also can't blame them, as they have little competition in RT and upscaling.
Thank God someone said it. I have been feeling alienated and like I've gone nuts for getting frustrated and seeing the patterns regarding planned hardware obsolescence and what *seems* to be deliberate performance degradation. It's especially obvious on smart phones, at least the ones I've used which have ranged from budget to midrange.
Graphical fidelity is good enough. They need to make the physics way better. It's like the picture is getting slightly better, but less and less interactive compared to older games. Arkham Knights to this day looks graphically amazing, and when you turn on Nvidia Physics, it looks way cooler than any raytracing we got today.
This 100x. I’ll take Generational increases in interactivity over any visuals any day. What makes games fun is interactivity not visuals.
Agreed. The Nvidia Physx smoke in Arkham Knight still looks better than the smoke in almost all games today.. Felt like a taste of the future back then..
But, here we are, in the future. And we are still seeing the same old, non-interactive, billboard transparent textures meant to represent smoke, and it's just sad..
I don't consider the smoke physx interactivity I consider a world like Zelda where the world reacts in a common sense way to your actions an example of Interactivity. That smoke was nice but held no gameplay importance just fancy fx
No it is not good enough. You're going to have to buy a new PC, get over it. We've been doing this for 30 years.
HL2 still got one of the best physics.
Seems games made a step back in last 10 years xD.
Digital Foundry made a vid about Ageia Physx Card.
And this is why it makes so much sense to play retro games these days.
If you are a boomer
@@marsdenit2845 Just a reminder, that minecraft is still the most popular game in the world. Gameplay is the most important part in games, not graphics.
@@Boris-Vasiliev
Which has received a ridiculous amount of updates and is not really a "retro" game.
Retro gaming is better @@marsdenit2845
Ship of Harkinian OoT Randomisers baby!
It's easier to tell a triangle from a square than a dodecahedron from a circle, but the actual number of vertices doesn't really correlate with how good something looks.
We never *really* needed any of these improvements.
The game I am most excited for this year is Shantae on the Gameboy Advance.
I occasionally have to remind myself that I'm in an echo chamber of AAA games with ever increasing system requirements. Outside of that echo chamber, the "real world" loves the Switch and the Steam Deck, and many hugely successful games run on fairly humble machines. The "plateau" is actually quite a nice place.
Well duh play other games besides AAA.
I had to upgrade from my 2060 Super of 5 years to a 4060ti for some AI VRAM work. Despite all the community trashing on it. Turns out, there's really no game I play that actually struggles on it and that small bump from 2060S is all I needed. Most of my regular FPS games are CPU reliant. Most of the others are anime-esque that don't push graphics, or stylized like Overwatch, and the actual most demanding AAA games I like to play and revisit, RDR2 and CP2077 are the most demanding out of them all. Which all run smoothly. The game I've gotten most hours out of in the last 5 years is Factorio, Genshin etc. I've realized how far I am from the expectations of AAA gamers who push 4k, RT etc and it was warping my mind for a while thinking I made bad purchases, based on other people.
iceyy, cool to see you here, how's the X3D testbench videos going?
@@lancevance6346 its not their point tho , U can't just say oh 4006ti is amazing cause its not .
U don't make a card literally same perf as last gen card and even 3060 had 12 gb of vram and put a higher price for it .
second nividia is not the good person in the story cause they actually profit tons of money from the consumers they become the richest in the world rn even apple didn't make like them,
so shitting on PC hardware companies is so valid rn
@@lancevance6346 and even if u like aaa games from 5 years these games worked on gpus from 5 years too ,now even 4060ti can't run most aaa games at 1080p rn what will it do next or after 5 years ?
did 2060s when u bought it didn't run rdr 2?
Doesn't matter how good the hardware gets when the games we get nowadays are not up to scratch..
This seems like a commonly held misconception which doesn't hold much weight under scrutiny. Which games specifically are you referring to? Which games do you play? Have you genuinely not played any top tier games over the past 10 years or so?
@@steviewonder0850While there will always be good and great games coming out to play, there has also been an exponential increases in shovelware and bad games over the past decade.
For every Baldur's Gate 3 or Helldivers 2, we get a sea of trash sports games, unfinished games that take 1-2 years to actually be complete, and games annoyingly butchered by microtransactions.
@@darthwoody9917 I don't know. There was so much shovel ware during the PS2 and Wii era. The DS, Wii, and PS2 were known for just having so much shovel ware dropped onto them
@@darthwoody9917 Trust me there was no shortage of awful games 30 years ago.
Difference is those games usually weren't flagship titles made by large AAA devs. Activision & EA USED to actually make decent games, now they pump out slop
Imagine your son want a gaming pc, nothing too fancy, just enough to enjoy modern games, and it costs you 1500€. I bought my first pc 15 years ago for 500€, and I could play everything. A non adult will not be able to afford PC gaming anymore, because we added a bit of grass in the distance
This!! So true.
Imagine thinking 15 years ago prices would remain the same. We could say the same about gas prices, groceries, housing prices, rent, taxes, healthcare. Maybe the issue is that prices have gone up but wages have not and that's the reason why the world is slowly crumbling
You can still play everything on PC with a low end, but like your $500...you have to be on 1080p and turn down all settings. Everyone knows that experiences are best enjoyed at the max settings. Thats why people buy better hardware because this is their MAIN hobby. Its like golf clubs. You only need 1 driver but people into it will have 5. And each of them can cost $2000.
my only hobby is watching stuff online to save money and gaming, everything is free entertainment wise, if i cant afford a desktop or laptop i only buy a phone and use that until it falls apart, i literally only work, sleep, save money and volunteer, i have no other hobbies, i go nowhere, i cant afford it and i cant afford most things like basics and housing, its very doable to afford a phone with no monthly cost and a computer and using free internet at food places, until jobs pay then there is no reason to do or go anywhere or be involved with anyone, entertainment is free and great and a laptop and desktop is more then just a gaming machine
Just made a PC for a friend with an RTX 4060 and I3 13100f for 650€ running with a gen 4 m.2, inflation is crazy but if you look for the best price across countries and website you can spare a few hundreds bucks
Funnily enough, PC Gamer and other magazines (back in the day) have been asking this same question for years, and every few generations, it pops back up again in different wording.
The answer was the same then as it is now. Games get better looking and more advanced, and one or two games will create a paradigm shift that turns the industry around - like Quake 3, Unreal Tournament 2004, Crysis, or whatever.
There's lots of great older games on Steam. I'm going through my backlog.
It's great fun to load up older games with upgraded modern hardware!
You can max them out, run at 4K+ resolutions, at the highest framerates possible.
@@bliglum I just started playing TitanFall 2 completely maxed out and up-scaled to freaking 5760x2400 over the weekend and it's as smooth as butter. It's been in my back catalog for awhile.
@@bliglum A couple months ago I treated myself to the full Half-Life bundle - it amazed me that a 20-year old game can still look that good. It was money well spent (and righted a wrong, since I'd pirated the original back in the day 😅 )
@@LilMissMurder3409 Hell yeah!
No better time to explore the backlog than now.
@@bliglum It's the digital equivalent of digging through the software bargain bin at PC retailers back in the day. That's another thing that has died out.
I remember when the 20 series was coming out and everyone was talking about the potential of 4K gaming.
Nearly 3 gens later and not even the most expensive cards can run 4K natively at a consistently smooth rate yet.
True 4K that fully benefits from higher resolution is scam for consumers. It is almost nowhere.
Facts:
1. Sub pixel detail. It gives noticeable better looking pixels if there is more information per pixel that are blended to one pixel. Keyword: "supersampling"
2. Camera resolution. Camera has different pixels for red, green and blue with some filter. They are counted for those megapixels so good 4K image requires actually 8K camera.
3. UA-cam and streaming. They use so heavy compression that 4K video is just as good on FullHD. In fact that those video quality settings are more like quality settings and have more headroom on pixels, so they can be downscaled if necessary. True 4K in UA-cam requires 8K video.
4. Human eye. Human eye resolution is about 1 arcminute. Just do the math based on view distance and screen size what you can actually see. In living room conditions, 4K usually don't happen. Large display in front of armlength or projected to wall in movie theatre is where it matters.
5. GPUs suck on small triangles. That is very wasteful and requires LOD levels, and having more LOD levels is also wasteful because GPU can't benefit instancing so easily and that makes memory bandwidth as bottleneck.
Shortly, there isn't much of 4K gaming that really benefits from 4K. It is lacking even in camera technology.
So while framegeneration easily sucks, frame upscaling actually not. Game can be made for 720p...1080p and upscaled from that to panel resolution and that is how it is done in reality. PC gamers try to play game without upscaling that is when they see the truth: Game is made to played from sofa in livingroom in console, running internally 900...1080p and upscaled from that. If game runs 30fps 1600x900 on PS5, large 4K display with native resolution on 60fps requires almost 12x more powerful GPU and bandwidth. And I'm not talking about raytracing or pathtracing yet.
Eh, ive been playing at native 4k oled since the gtx1080ti. Im still playing baldurs gate 3 at native 4k on my 1080ti. But next year will upgrade to a 5090
@@slopedarmor
4k rendering is possible but it is expensive. Just buy 4 times powerful hardware what is latest gen console, and you get equal framerate and upscaling ratio what is found from console at FullHD. But content itself is easily made for that resolution found from console prior upscaling.
Btw, I just started to play older game, made 2010. That game content is targetted to 1280x720.
What I did, I set game resolution to 1280x720 and put MSAA antialias settings to max from driver (because it has forward rendering pipeline) and that give best image because it just look like crap when texts are smaller, there is lots of texture upscale filtering close up and there is graphic elements that are optimized to 1280x720. There is even grain effect that is optimized to 720p.
So game actually look much better when running on resolution where it is targetted. It is just smooth everywhere and limited texture resolution or other assets doesn't distract. Settings forced to high from driver keep pixel quality high so they don't look bad when pixels on screen are larger.
I never really made a big deal out of this. To me a game needs to have good gameplay above all else and as far as graphics go, all they need to be is visually/stylistically appealing. To me good graphics never meant pushing the limits on realism or whatever is most impressive on a technical level.
"To me a game needs to have good gameplay above all else and as far as graphics go, all they need to be is visually/stylistically appealing."
Yeap. Graphics don't need to be realistic, they need to be coherent. As technology develops, there are new ways to produce graphics. I've played very beautiful pixel art games from era before rendered graphics became common. Actually they artists often manage to add more details on pixel art than what was in early rendered graphics.
It'd be interesting to see the AAA game budgets separate development budget from marketing budget. I swear half of their entire budget is just going to marketing and ads to sell their poorly designed games that arent fun in the first place.
It's not unusual to see the highest cost game advertising have a budget 20-30% higher than what they spent on multiple years of development, and even after all of that, how many games can you remember any of the advertising for? You're not far off on the 50% ratio even at AA levels.
Half? Those are rookie numbers
They also tend to locate their studio's in hipster cities that have egregiously expensive living costs and overheads.
Could be said the same for movies
Your best marketing is your consumer, give them a good time/experience and word spreads fast. Does not matter if its a device, a restaurant, a hotel, a beach, etc. The person buying any product is your marketing.
Everyone says that the Nintendo Switch has terrible hardware... But you can play Super Mario Odyssey on it and it runs at a smooth 60 fps (and that was 8 years ago). Sure not ALL Nintendo games are that optimized (looking at you Scarlet and Violet), but the fact that they can make it work at all says a lot.
That's why Nintendo does it. Spend less on hardware, more on optimization. Imagine what these other games could do if they simply optimized their games properly.
Also their games aren't like 200 million dollars basically guaranteed some kind of profit not to mention it doesn't take forever to make a game
Ok but lets not platue on switch level hardware please lol. A bit higher than that would be nice ^_^
@@TopOfAllWorlds My point is, other companies should be optimizing their games.
If Nintendo can do this stuff on a potato then imagine what they could do with the hardware of other consoles.
@@TopOfAllWorlds I'd rather have the Switch's hardware over the lack of updates in the newest stuff
@lasercraft32 games on the Switch compared to PC makes the Switch graphics look like a$$ though. There is a difference between Nintendo and the rest. And it's not pretty. I used to be a Nintendo fan; not anymore for that and other reasons.
The Steam Deck OLED can play all of my indies better than my Switch ever could. I see no reason to get a Switch 2 when very few games run even decently on the Switch. And I'm not even talking about the elephants in the room, I'm talking about many games that never came because the Switch would have a stroke trying to run them (Alan Wake 2, Baldur's Gate 3, Cyberpunk 2077), were going to but were canceled because it couldn't happen because the Switch would be a stuttery mess (Marvel's Midnight Suns), or did eventually come but with serious sacrifices like Hogwarts Legacy and a bunch of other games.
Legend of Zelda TotK on the Switch looks like a Simpsons' skin color mixed with the color of urine. And to think Nintendo left the Switch to rot while games couldn't run on it...
Oh, and by the way, I mostly play indies and "retro games" these days. On my Steam Deck OLED.
You know I miss? Developer tricks to boost performance, things like Crash Bandicoot levels loading in chunks and having the levels be designed around hiding it off screen or Ratchet and Clank using the part of a disc you really shouldn't be using but in their case it worked but made porting a nightmare, hell I'm surprised we're even allowed to keep the lowpoly models in the distance in the current market :T
Yeah it feels like devs went from trying to fit big ideas into smaller boxes to ballooning small ideas to fit the larger box
@@planetkhemical This is brilliant.
Culling is a good one dark souls 3 uses it and it's pretty great
This comment sums up the absolute brain rot of the entire comment section. Modern games are a culmination of every "developer trick" that any engineer has dreamt up in the last 30 years. Modern engines are colossal magic shows designed to render scenes in the quickest way possible while doing the minimum amount of work. And "levels loading in chunks"? Seriously? How the fuck do you think ANY game works?
@@ahall9839 the kind of person who pisses in their own coffee just so they can go off at whomever asks
The first time I saw a PS2 game as a kid I could hardly believe what I was seeing.
I was very happy growing up with PS1, NES and N64, GBC and so on as a kid. Never complained about graphics. That's probably why the graphics don't matter to me at all today. I care much more about the story, the characters, the music and gameplay. And high fps of course. Nice graphics are just an added bonus--if they don't make the game run like trash.
@@Thunderhawk51 Same. I still love 2D pixel art as much as 3D, but enjoy good games of any genre from any era. Currently mid-way through my first playthrough of Bloodborne on PS4 connected to a plasma TV and it looks and plays great.
I was much more impressed by the GameCube TBH. PS2 was convenient to play DVD's on 😅
I think it will really vary a lot depending on the age of the person commenting. For me it was without a doubt the Nintendo 64. The OG Playstation was great, but all the geometry warping just looked bad to me, it wasn't the 3D revolution I wanted. The N64, however, just looked so high fidelity and smooth, and the geometry warping was nowhere to be seen, it just blew my mind. Dreamcast and PS2 were both great as well, but nowhere near as transformative to gaming for me as N64 was.
@@K31TH3R N64 was still too fuzzy and low fidelity for me. Ocarina of Time was amazingly atmospheric for the time, but in general, the generation of 3D was novel for how it transformed gameplay more so than visuals and I thought most N64 games were ugly too. For me, the Dreamcast was the first console that made 3D appeal on an aesthetic level.
22:00 there is a reason why ppl still have their ps4s and the switch sold so well everywhere, graphics aren't as nesasary to gamers as b4 rather games being fun.
There is a point where it just looks bad and outdated though. Look how much better the PS4 looks than any Switch. It is night and day. And I hate most big video game hardware companies, but Nintendo is by far the worst. They should have been kicked out of the video game hardware market long ago.
@cameronbosch1213 yet ppl are going back to play ps2 games.
Nintendo only exists because they provide a service/product that is valuable enough to the consumer to spend their money on it and in return they make profit.
So for them to be kicked out the industry consumers/gamers would have to see their product as not valuable and buy something else
After seeing how long it takes to make one street fighter 6 character (making each individual hair for an eyebrow) i really don't mind lower fidelity in exchange for more fun and cheaper product (less than 300 mill to make)
I would absolutely be happy with games staying the same in terms of graphical fidelity and required hardware to put more focus on optimisation, storytelling, physics, mob AI, mechanics etc..
My friend showed me system requirements for new Indiana Jones game and it’s just downright silly at this point and I don’t believe they actually couldn’t make the game run on more achievable hardware, but hardware race allows the companies to be lazy and not optimise AAA titles.
11:30 Yeah? Man, The best, most innovative games, were always the ones that worked with, and not against, hardware limitations. Limitation asks for innovation, which is much harder to happen when you have unlimited possibilities hardware-wise. Where do you stop?
What I find crazy is the Graphics sometimes aren't even "that great" and you're getting such a performance hit.
As well, that GPU's are becoming so HUGE, yet they STILL can't give you "Good Performance" at 4k.... I mean, tech is supposed to be getting "smaller", NOT bigger.
2080 graphics card has 18.6 billion transistors - size - 775mm²
4090 has 76 billion. Size 609 mm².
have you see smartphones in the last 10 years? They do the same as gpus, getting bigger every year
But i do agree, if the Top of the Top gpu - rtx 4090 - cant do 4k 60 in any modern game, then whats the point of it being the best
@@CurtOntheRadio So 4 times has much transistors and it's sitll not reliable at 4K
@@machintrucGaming Does it have to be? That's your own arbitrary target. And is it unreliable? It's surely better than your 1050Ti.
@@CurtOntheRadio Well OP wants good performance at 4K... I've personally given up on that. 1440p all day everyday. 4K is just about good for indie games. AAA can't be arsed to optimize their game.
Yeah we might be seeing a trend of diminishing returns in graphics, but the problem is that hardware requirements aren't following the same trend. Despite being able to play beautiful games today, a game 5 years into the future that might even look stylistically worse won't even be able to run on my PC. So frankly we don't even have the option to go "yeah I'm comfortable with my hardware today", we need to spend more and more money just to be able to play games that look identical to the games we could previously play.
I typically just skip a console generation. Im still playing on a ps4 and enjoy it. Yea, i miss out on playing the latest and greatest BUT typically the best games will carry on to the next console era and be cheaper.
I’ve been saying this to my friends for years and you finally put it in better terms than I could ever
I remember really being blown away by the graphics in the Witcher 3, that came out in 2015. Ten years prior to that, in 2005, games looked completely and utterly different. Jump ~10 years after Witcher 3 to today in 2024... Games generally look compatible to The Witcher 3 still. Kinda crazy.
"we're gonna stop here" sounds great to me for like 5 years. I just upgraded my pc after 10 years. Can you imagine.
i built my pc in 2013, lasted me till 2021, and now feeling how much more heat and power my second build makes, its crazy
It sounds great to me. I upgraded and am pleased with this machine. It pains me to see cards like the 4090 starting to be called a 1440p card (id like one).
Console generations do already last 5 years or more though
@@pedropierre9594 My 4090 is a literal space heater. Great for the winter but during the summer I'll use framerate caps and power limiters to make it more efficient.
@@minnidot Hahaha who would call it a 1440p card? It has 24GB of VRAM and DLSS is best experienced on 4K displays. The reality is when you push graphics to the maximum (i.e. path traced Cyberpunk) you are dealing with 1080p upscaled to 4K and you need frame generation. The 4090 will be playing games perfectly fine at 4K for another 5+ years. It won't be keeping up with the latest and greatest ultra PC settings but it will still be light years ahead of what the consoles will be able to achieve for the foreseeable future.
what i have been noticing , that games just look pretty , less interaction with the world , less animation. i dont know how taxing or demanding it is to make foliage bounce off characters rather than clipping through them , or have destructible environments , or a world that is affected by battles. more than 5 npc models. every time is less and less animations..
That's very game dependent.
Physics systems are pretty demanding from both a technical standpoint and a coding standpoint. I don't think you remember Nvidia PhysX but pretty much everyone hated that because of the cost on the performance but it did add in a bunch of physics interactions
Destructible environments are very game engine dependent
@@crestofhonor2349 Surprising how we haven't improved the physics performance in the 15 or so years since PhysX was a household name.. it's the new gimmick word now. "RTX".
@crestofhonor2349 The last game I know of that uses physx is Control. That game had really good physics and it's well optimized too
I would be happy if the consoles stagnated for a bit and jus continued to create games for what we have now.
I don't know what sort of tech The First Descendant is using, but it was quite funny, I had RTX Ray Tracing enabled, and was near a light source admiring how the light was bouncing off my character and even told my friend, "Wow look at this, it probably wouldn't be bouncing off my character light that without ray tracing on". Later on, I was hitting some performance hiccups in battle so I turned my settings down, came across the same light by happenstance and regular old rasterization looked the exact same lol.
First Descendant is Unreal Engine 5. The game does use a combination of UE5's Lumen and ray traced shadows
Rasterization doesn't actually have anything to do with real-time global illumination. There are multiple software based ways of approximating the effect or doing it literally, the stuff like RT Cores that are a part of the newer cards are really just dedicated hardware to accelerate ray-intersection tests and bounding volume hierarchy traversal. It basically just lets them do raytracing faster using hardware acceleration.
Atypically interesting comment, here. Read it to the end and pondered it afterward. It also inspired a couple of solid comments.
Great stuff, dawg.
4:00 A major problem is that too many devs on consoles especially are aiming for the left instead of the right when the right one looks better in motion. Draw distance and frame rate are way more important than the water effects and lighting improvements
I would agree on this for sure
Agreed, to me the difference is night and day.
I remember wanting games to be more immersive but not graphically necessarily. When I played ME2 I was visualizing branching narratives, dialogues etc. This was the future for me what "realistic" meant, not visuals but content and interactivity.
Old crpgs are better in story and dialogue, than most games, this aspect has not evolved but devolved. Only games like BG3 showed a glimpse of hope for more realistic and open interaction.
It's amazing how an RPG from 30 years ago (Ultima 7) manages to have a world that feels more alive than pretty much anything made since (partly I blame the original Diablo, which was _marketed_ as an RPG despite really being a dungeon crawler / hack'n'slash game - that really lowered the standards for what could be called an RPG).
Ultima 7 NPCs have actual lives, sleep, eat, go to work, react to the environment around them (ex., opening windows during the day, lighting candles at night, etc.), have huge and complex dialogue trees, etc.. Nowadays people just accept "RPGs" where characters stand in the same place and repeat the same 3 lines over and over.
@@RFC3514 There is a big opportunity for this kind of games, but they need decent graphics and voice acting for them to be main street. I think Larian is the best hope, but I'm sure if they get another big hit, some other studio will come out.
@@digitalsublime - Voice acting was another thing that hurt RPGs around the Diablo era. They couldn't fit audio for the huge dialogue trees of games like Ultima 7 into a CD, so they made most dialogues very linear.
In fact, the dialogue in some older games was generated dynamically, so it couldn't really be pre-recorded (unless it was assembled from individual words, and that usually sounded too robotic).
Maybe now with decent voice synth and generative text AI we'll see games with more complex dialogue systems.
Just don't ask the NPCs to count the number of Rs in "strawberry". 😜
Some really good points in this video. Ideally I could keep running modern games on my PC for a long time without upgrading, by turning down graphics settings, but when I upgrade I would notice a significant improvement.
People have gotten burned by spending a couple grand to upgrade their system and not notice the difference from medium to ultra. People are also frustrated by games that are 100 GB because of all the 4K textures, even though they'll be playing at 1080p.
I think the 80-20 rule applies here. It took 20% of the effort to get to 80% photorealism. The last 20% take 80% of it. But most people are fine with 80% so they don't see the point in upgrading anymore.
I think DF said it best yesterday when I listened to them “the Terraflop war is over”.
The games I’ve felt have been transformed most by Ray tracing are old ones like Quake 2, Portal, Minecraft and I’m really looking forward to Half-life 2 Rtx.
Personally I’m happy with less fidelity and games that take 3 years or less to make as opposed to 4-7 years in the triple a world of gaming. I enjoyed Cyberpunk 2077 and Witcher 3 but have enjoyed the Fromsoft souls games more which are technically inferior to CDPR’s output.
The funny thing is that when you displayed the example around the 4:16 mark, I thought the right side looked better. This reminds me of how in Monster Hunter World you can disable Volumetric Rendering to stop the backgrounds from being blurred and washed out. There are some newer features that don't look good, even if they are trying to be realistic in some cases.
Yeah I thought the contrast made the tree textures pop more - seemed more realistic and crisper on the right.
It does
@@phattjohnson I thought the lighting had more dynamic range on the right, lol!
Yes, because the OTHER thing that's kicked in was the "planned obsolescence" of ditching older games for no reason where we would rather just be able to play good games on a stable platform rather than chasing down the ability to play them.
A similar thing to see how this impacts development is a video comparing what development mindset in *_Left For Dead 2_* was vs. *_Back 4 Blood_* in terms of where cost, time, effort, energy, & budget went. We want THAT from devs especially being burned out by all the Live Service models that want to chase you down and throttle your wallet, rarher than give you something that's worth the cost vs. diminishing returns on all fronts.
we reached a certain plateau back then when 3080/3090 is the king. you can play on 1440p with DLSS Quality for years to come. in 2-3 generations the visual representation will reach a point where no more power is needed. backing this up with "no one wants photo realism in games" . a good art style is much more important. people do not want games to look like the real world, the opposite is the case. the freeze frame can also be told to using 4K and 1440p . you cannot see if a game runs on 4K or 1440p on a monitor. the pixel density is is already high enough for like 27" displays - a 4K display adds almost nothing to it at a super high performance cost.
Gaming is in a weird place right now.
Gaming industry is dead
@@RAKUNTU It has been dead for over 10 years to me
After 2013 everything got Worst
@@RAKUNTU can't agree more
Modern games are already starting to run like piss on my 3090 despite not looking any better than RDR2 which runs in native 4K with raytracing above 60fps without breaking a sweat.
They want you to buy a 4090 bro
I'd say we aren't "starting to get into diminishing returns" we've been there for the last 2 console gens. even PC cards seem to have plateaued completely this gen with advancements only really coming from software enhancements, not hardware enhancements.
If developers create a game that’s too demanding, the PS5 won’t be able to run it, which would lead to low sales and reduced income. As a result, they design games that the PS5 can handle, even if that means fewer improvements compared to previous titles. Similarly, when viewing photos, you won’t notice much difference between a 10MP image and a 100MP image-unless you zoom in. While it's a significant improvement, can you actually see the difference?
Flat panel TVs/displays have followed the same trend, unsurprisingly. SD to HD to UHD (and the variants of HD in between), have shown pretty clear improvements, but in the jump from 4k to 8k the conditions and content being shown have to be just right for it to have any impact. This just means as the years rolled on 4k tv's went from massively expensive home theatre-like luxuries to cheap consumable items for a fraction of the cost. Will gaming hardware follow suit?
I miss CRTs
As with all forms of technology, you reach a point if diminishing returns where "2x the power" starts to seem like "1.5x the power" then "1.25x the power"
I'll relate it to 3D modeling, we use to see a new GPU generation mean multiple minutes off render times, now a new GPU generation may see only 10 seconds off render times or even less. But when you look at the percentage difference it's still about the same, because 10 seconds is twice as fast as 20 seconds, but it's still only 10 seconds at the end of the day
Show me a system that requires expontentially less energy to produce twice as much of ANYTHING.
A perpetuun mobile still doesn't exist and will never exist. 😅
Going from 800x600 to 1078 and same framerates are about 40 % increase in gpu power or you could play at 800x and have better visuals but same fps as your previous gpu - Mindbogling... a gpu should be 100% better and prefebly at a higher resulution to pay off
Yup, this is why I have been enjoying VR cause the mobile hardware of quest increase by 2x every three years and you can see a next generational leap like resident evil 4 to Batman Arkham Shadow. Sadly I think the quest 4 will be the last massive next generational leaps for VR and every model after that will be improvements like sharpness and FOV.
@@esmolol4091 we arent talking about generators, but if your motor used to require 10 times the energy it would output and now requires only 5 times the energy you still doubled the performance, gpus main job is using paralelisation to gain 100x performance without increasing energy consumption at the same rate
@@Sauvva_ What do you try to say exactly?
the visual differences between rdr2 and gra6 will show how far we got in the last years.
very interesting video, BTW. you point out a very true development. don't we see something similar, for example, in the smartphone industry happening in the last years?
That made me think. I was buying a high-end GPU like every 5 years to play the newest games. But at some point, games can't just get "better", so I guess there should be also a point where my GPU is still high-end years later.
My 1030 can still play all the newest games at 4k ultra and ray tracing. Pretty good card you should buy it.
kinda happened with 3090 & 6900, the new cards are more powerful but we've entered an era where cost is so high that they will be extremely relevant for a good few years yet, still 4k cards for the most part in a world where upscaling and frame gen are the new normal.
That would sure be nice, but if UE5 is any indication, we'll still be getting games more demanding on hardware at a similar rate without necessarily improving the visuals (as is the case with modern games starting to require TAA/DLSS as a crutch for optimizing performance).
@@gorillagroddgaming😭😭😭😭🔥🔥
@@gorillagroddgaming Sure , at 1 or 2 fps :P
Imagine having one gpu for 10+ years and still being able to play the newest games on highest settings.
Its doable probably but discouraged by hardware manufacturer mafia
Sounds like a Twilight Zone nightmare.
The highest setting is a gross exaggeration
Never
@@kalebgross1310 it’s possible if you are still in 1080p a 1080 Ti can easily do it at high atleast not ultra.
Playing a game, the same game at higher resolution is such a game changing experience and if possible skipping scaling.
I haven’t finished the video yet, so I don’t know if you addressed it, but another classic example of a wonderfully optimized beautiful title getting absolutely brutalized by modern “features” for a minuscule visual improvement is the Witcher 3. I played through the whole game at 1080p low/med settings on a gtx970 when it first came out to get 60fps, and then when I got my RTX 3080 in 2020, I download it all the texture packs and sky boxes and lighting mods, booted it up in 4K, and played through the whole game again getting 90 to 120 FPS, and being absolutely blown away at the visual experience. Cd project red updated the game to DirectX, 12 and incorporated those mods into the base settings, and the same exact system can barely get 40 to 50 FPS with the same settings on the new version. If there’s a visual improvement between the classic version that runs smooth as butter, and the DirectX 12 version that is stuttery and horrible to play, it is splitting hairs, and not worth losing a single frame of performance. It’s also shitty that all the new players that go to try out the game and its current version for the first time will get the shittiest experience possible.
To answer your question mid video, if you gave me games that looked like the Witcher 3 (with better NPC character models though) or RDR2, and could run comfortably 90-120 FPS at 4K on an upcoming 5070 or 8800xt, then I would be perfectly happy with that if it meant the games got more FUN and polished from a gameplay standpoint.
AMEN brother most modern games are like that now and it is SAD
The Witcher 3 current gen update is so god damn cpu intensive I just cannot understand why. I’m getting the same cpu performance as I did on the original launch which I played on an i5 4690k!, and my current CPU is the i5 13400f. It is eating my cpu up more than cyberpunk running path tracing.
Witcher 3 is also not artistically designed for ray tracing, which hinders the benefits a lot. Hardware Unboxed did a video talking about the visual comparisons and improvements possible with ray tracing, and they rated Witcher 3 as "Different, but not necessarily better".
@@ElysaraCh The worst part is that you can turn of RT and you still have a massive performance penalty playing the “upgraded” version.
Didn't it also change the artstyle to be worse than before?
This is so much worse with the new GPU hardware being focused on using AI to generate frames for games. Why optimize a single thing and keep a game from risking your PC’s longevity if the hardware will cover it up for you?
“There is no war in Ba Sing Se” kinda shit is gonna start happening with video games now.
It feels like more people are looking to legacy hardware/emulators to get their gaming fixes, these days, because of diminishing returns, and because of the state of AAA gaming, as set by the publishers.
We peaked with Prey 8 years ago. Can we now start focusing on better optimizing games so that people who can't afford the latest hardware can still play new releases? In the incredible words of Hakita, the creator of Ultrakill, 'Culture shouldn't exist only for those who can afford it.'
Slightly off topic: I had gotten so used to my Windforce graphics card running Overwatch 1 at under 30 FPS. I was CONSISTENTLY landing headshots as Hanzo. My friend watched me play a bunch and said: "Dude you could go pro!" and gave me his 1080. I was ecstatic, set everything up. Literally couldn;t land a headshot for shit afterwords and went back to my Windforce and immediately went back to dunking on people. Then OW2 came out and now I have the 1080 installed. What does any of this have to do with anything? No clue, but lower frames worked better with my brain, dunno if thats an actual thing that others experience or not. Great friend, we still hangout and talk.
your probably just more used to the lower framerate tbh
Less visual clutter and better general awareness since your brain won't be "processing" as many different images. The extra frames were probably throwing off your timing as well.
Something similar happened to me back when i would top100 every level on DOOM16's Arcade Mode and Metal Hellsinger. Took a few dozen hours to adjust in each game
what i feel is that AAA companies are stuck to push graphics because its the only thing they surely can achieve with so many people,. if they would just try to make good games with subpar graphics then indies would surpass them in everything, not just good stories, originality, puzzles, critical thinking etc.
AAA are now made to be movies or simulators of some kind with big laggy open worlds where u can do nothing smart, just pick berries or skin an animal, but still holding hand with where to go what to push and what head to click on.. (yes, i hated RDR(and i played many games since 2000))
19:19 not only are we not there yet, but designers are having to work with two lighting systems that make life a lot harder. See that HU video for heaps of examples where prioritising one means the other suffers (i.e. the game is mainly designed with non-raytracing in mind, so when you turn it on you get weirdly lit spaces).
To nail both you have to carefully design the light of each environment using both approaches and use tricks to avoid one going wrong.
Yeah this transition period is pretty brutal. And sadly, because RTX is still so expensive and underpowered for normal customers, we'll be here for another 10 years until RT gets commonplace and at a decent power, at which point the developers could finally switch and only develop with RT in mind and make it a hard requirement.
Somebody tell Nvidia to stop skimping on Silicon and VRAM! 40 Series was a Scam!
4090 pricing may have been exuberant, but in terms of performance? It's fantastic. So is is the 4080. When the 5090 releases, I'll own that as well. The GPUs aren't the problem. Either your ego is or your wallet.
@@huskers1278 just wants to brag about owning a 4090
A scam was 3.5gb. Being poorly priced isn't a scam lol
@@huskers1278 Its a scam just based on the prices. Really should regulate the market when no competition and slap a 50% tax on all Nvidia stuff, they are printing money with their monopoly.
Some of the 40 series was a scam, some models weren't. Choose wisely.
Flatscreen gaming graphics may have plateaued to an extent, but as someone who plays a fair bit of VR there is plenty of room for improvement on that front. There are very few AAA quality titles and mods which transform flatscreen games into VR (like Cyberpunk) look fantastic and have amazing potential but even a 4090 can struggle to run them to an acceptable degree. I feel like that might be the area to focus on more in future.
Exactly what I think!
With VR you're only ever going to get what you get as a result of improvements made for normal monitor use. VR is extremely niche and hardly anyone is interested in it and that isn't likely to change. Sure it's market share is increasing a lot as it advances and it's a lot more prevalent now but that's an increase of a small number to a less small number. VR is cool and I hope it gets more attention for those into it but ready player one isn't going to become reality. The Apple Vision for example is probably their biggest failure ever and the metaverse failed spectacularly.
@@paulc5389 AVP can't even play games and wasn't designed to so that's kind of irrelevant, hardly a shock what is basically a VR Iphone at $3500 didn't go mainstream.
The Quest 2 sold 20m units which is not that far below Xbox Series sales (28m), sure it's still niche but there's definitely a market there.
I think VR conversions of flatscreen games is a potential goldmine as it doesn't require extra investment or specific dev and can work surprisingly well, it's just the hardware needs to catch up. Once we get affordable GPUs that can run games like Cyberpunk in VR with the same fidelity as flatscreen (probably 2 gens away) i think it will explode as it's a truly incredible and transformative way to experience games.
@steviewonder0850 oh for sure. I just mean don't ever expect it to be the main focus of Nvidia et al.
@@paulc5389 Nvidia doesn't really to do anything in particular, better GPUs for flatscreen gaming = better GPUs for VR. It's just extremely hard to run VR games in high fidelity on current hardware as it's basically like trying to play in 8K flatscreen. But there are little tricks and things they can do for optimisation like foveated rendering which will only get better. VR is still in its infancy really, probably like the equivalent of a PS2 right now compared to what will eventually be possible.
This is why I never really cared about PC vs console.
PC snobs have to pull up a screen showing the side by side of numbers between both. Frame rate, pixels, processing power and so on to make a point. Having to literally show side by side footage of both for us to squint and struggle to see the difference.
By that point, I’m happy with console. Especially, since I don’t do online gaming. I do single player.
You have no clue and your comment screams it. If that's how you think PC is compared to console (screenshots) then you're lost like all of the other arrogant consoles users
Best part is you've never had a PC so your ignorance will continue like so many others 😆 keep making out you know what's what. When you clearly don't
DLSS? DLAA? MFAA? RESHADE? DLDSR? Frame generation? Lossless scaling?
You have no idea what you're talking about 🙄
@ we get it. You’re a PC snob. And you proved my point by having to bring up all these metrics normal people don’t care about.
@@floresf8727 So you're full of assumptions and bad knowledge 🙄 I listed all of those things because you have no clue about them. As your experience & knowledge prove that. I ALSO have consoles & know the difference 🤣 what a clown. Your gaming experience is stuck in a box. But somehow PC owners are snobs for having gaming equipment better than consoles 🤣🤣 what crazy war are you fighting. You don't even know what FSR/DLSS is or the difference between them. You're playing on AMD equipment. But you have no idea of it's competitors software 😆🤣 I just laugh when folks like you try to downplay PC's. More so when the consoles are struggling with games right now 🙄
I don't really care about graphics, I care about gameplay, that being said I still hope the ceiling gets higher and higher, we are in a dip right now where small improvements take a lot of performance, but hopefully as time goes on these processes get more effecient and new games come in that raise the ceiling even more.
Scaling projects up is not a linear increase, costs rise exponentially. Ps1 era devs didnt have to deal with facial expressions for example but nowdays we are not only looking at very expensive motion capture performances for facial expressions but you also have to worry about hair simulations, sub surface light scattering and mussle movements, pores stretching, and the list goes on. And if any of these is neglected then you get a mismatch and a dissonance in how things look vs how you expect them to look and behave. A lot of the advancements we see now has to do with automating these time and cost consuming endeavours (RT, metahumans, etc) but shits getting more and more expensive and companies keep pushing bigger and bigger worlds and projects.
Very true.
Many classic games pre-2000 were developed with dev teams under 50, and they usually developed their own engine at the same time.
The original Baldur's Gate and its engine was made by approx. 60 people over 18 months.
CP77 was in development for nearly 7 years and peaked at around 500 people on the team.
Once again, hyper realism is the bane of gaming.
The extra performance should be used for gameplay, not graphics.
Real AI powered NPCs, full physics integration on gameplay interactions, gameplay relevant path tracing (using reflections for stuff or something)...
The only purely graphic performance we need now is to be able to achieve current PC graphics on VR, after that, i would be fine with graphics no longer getting better, Half-Life Alyx is so close already...
AI, physics and density are all CPU limitations. Maybe when NVidea joins the CPU market things will get better
Developing NPC/ world ai is something Ive wanted for a long time. That has stagnated for like 15 years
That actually takes effort from the developers, unlike using stuff like ray tracing or making grass be seen from a little farther away.
I agree. I mostly play retro games on emulators but when I see a modern AAA game the first impression is always "wow it looks great" and then you start noticing that the gameplay is mostly a cool background with modal overlays that say "Press X Now".
I have never seen a UA-camr move their camera view around the screen like you do. You do it really well, adding a lot to your narative. Very nicely done.
Style is dead in modern gaming, replaced with the cult of realism. And graphic improvements seem to be pursued for its own ends, rather than for the benefit or vision of the game itself.
To the point it feels like companies make graphics first and then the game from it.
Ps2 to ps3 was the last big leap. On an iPhone at least it’s hard to tell between a Spider-Man game from ps3 to ps4.
we are at a point where something truly revolutionary (hardware wise) would need to happen. Traditional polygonal rasterization is essentially completely plateaued. I'd personally argue that this has been the case for almost a decade, with the GTX 1080ti being the "beginning of the end". If you play at 1080p, that card is STILL all you need. It's insane.
This is also a problem for improving graphics with PC games. If you make a game that requires a 4090 to run at 1080 60 then your market is 4090 owners only.. i.e. hardly anyone. And that lack of willingness to upgrade is only made worse by Nvidia's greed and poor specs on cards that are actually affordable. If I was going to go Nvidia I wouldn't touch anything below a 4070 ti super or any 50 series card that has less than 16GB vram. And for a lot of people that's simply not affordable.
Orrrr, it's the fact that the peak of what the human eye can see, and distinguish difference in sits around a 2-4k resolution. We've hit the point in gaming now where the best looking games are already the best looking they can get. An unreal engine 5 game, using all their graphical and rendering power is as realistic as real life to your eyes. The human eye can't notice any difference in resolution past 4k. Anything past it, like 8k, is entirely pointless. To 99.9% of people's eyes, 4k and 8k will look the same. There isn't any way forward to go resolution-wise. The only improvements we can make to games now, are fundamental engine improvements, like better frame rate(which also has a cap the human eye can see btw, around 200-300fps depending on visual acuity of the viewer), which you can already reach on PC. At this point, there's not really much of an "up" to get to when it comes to game visuals. Now, it's all optimization, and getting the game engine to handle games the best they can.
Actually having full RT with Everygame without performance problems would speed up making games. As you would no longer have to bake lighting, shadows. Etc.
Exactly. We need to get to full path tracing and just throw away raster mode rendering.
It's bigger than people think because there is a lot of time spend authoring and lighting a game using rasterization as well as all the time spent making new rasterized effects to imitate what ray tracing does
until the majority of consumer gpus are rtx cards this will never happen. The majority of the world games on low to mid end gpus
@@kooale3252 According to the steam hardware survey you are very wrong. Out of the top 15 most used cards only 3 were not RTX cards.
"Starting to get to the era of diminishing returns?"
I just question the word "starting"
In the late 90's through the early 2000's dropping £400 on a new graphics card made a HUGE difference
Often games just would not play on 3 year old hardware. they wouldn't install, wouldn't boot, you'd get an error message telling you your GFX and CPU - were not good enough and that was that.
A CPU and GFX that was top of the range 3 years earlier, could not play a game at all.
That wasn't a rare occurrence. So you'd stump up for new hardware, and immediately you could see just why your old hardware could not perform on this new game
Now, as my daughter plays the latest games on my hand-me-down almost 9 year old 1070 - that was MID RANGE in early 2016. I look over her shoulder as she plays and think "I wouldn't care if I had that back, those graphics look just fine to me"
Yeah it could look better... but not $800 better. Not even close. Not ever.
So what you're saying is if that old PC and my PC with a 4080 super were running cyberpunk side by side you wouldn't think that my pc running it in ultra wide 1440p at 60fps with path tracing would be worth the upgrade vs what.. minimum settings at 1080p and 60fps?
@@definitelyfunatparties Get rid of the ultra wide and I'd choose your setup
There was often times new tech being dropped that just wasn't supported on older hardware like deferred rendering, tessellation, pixel shaders, or many other things. Those massive jumps were just because everyone was developing new tech for GPUs. Today we rarely get new tech for a GPU and the only recent addition being ray tracing. Those games that use hardware based ray tracing will do a similar thing to older GPUs from a few years ago
@@Minimal_M so you'd take the 1070 if you had to play in ultra wide exclusively with a 4080? cmon bro. anyways, that's what the extra 16:9 monitor is for. options 💁♂
@@crestofhonor2349 it's already happening, and it's even happening for current GPUs from other brands.. I ditched a 7900XTX for the 4080, yes I occasionally get a tiny bit less FPS in other games, but anything ray traced and I'm going from sub 60 to 100, except for cyberpunk, where I went from sub 40 to 60.. but that's proper RT (PT)
In general the focus around games has been too much on graphics. It's the first thing everyone will complain about.
I'd much rather see some innovation in game design instead of chasing "photo realism". Or a shift of focus from graphical fidelity to improving stuff like NPC AI or physics which have absolutely stagnated or even regressed. The reality is that games don't need state of the art graphics to be fun and successful, you see it with many indies and especially Nintendo.
It's funny how game AI has barely improved at all (and is still rubbish, frankly) even as we're on a supposed revolution of self-driving cars, talking machines, even generalised AI. Much of that is hype and BS, of course, but you'd think we might already get at least some improvement in game AI before the Robots supposedly make lawyers and artists redundant. But nope.
Much easier to do it in games than realworld, too, as you have all the variables and a very limited 'world' in which to operate. Driverless cars (supposedly) and yet driving games have awful AI opponents. Same in shooters, everything. Garbage. If game devs can't do it after forty years, what makes anyone imagine Elon can do it "by next year". lol
@@CurtOntheRadio It's weird isn't it? Seems like a much easier canvas to work with than the real life applications you mentioned. The big difference though is that they have dedicated research where unimaginable sums of money go into. So they are and will be more advanced than what you see in games where it's clearly not been a large topic of focus for a while.
@@vintatsh True, it's not a fair comparison.
Buuuut, we might at least expect some improvement in game AI long before we hand over our children to be educated by the Robots, say. Or fire all the lawyers. Even if it needs a cloud subscription.
It at least suggests some further scepticism is warranted about the use and integration of AI more generally, imo.
I keep trying to think of ways to use AI in games and keep coming up against this issue: how do you swap data between traditional and AI without losing the point of the AI, or the use of the traditional compute? Like, say, you could have AI be a shopkeeper, or blacksmith, so you can better deal with them, be more inventive, maybe - say in Skyrim type.
But any result would still have to be passed back to the traditional compute running the game, and it would all need be defined as variables the trad game code 'understands'.
Yet if you limit AI to output the trad game understands have you gained anything, really? You can't broaden choices easily, you can't invent new things and all in all I think it's difficult to find applications. Though maybe sports games might be one - driving, tennis, whatever.
Blah blah blah
But where is it? Where is any game with a novel, 'proper' AI component?
"Several games with AI on the market today implement highly sophisticated forms of AI to elevate the player experience. Games with the best AI often elevate the gaming experience in cool ways. For example, The Last of Us: Part II uses advanced AI to power its enemies, providing them with an ‘awareness state.’ This means that if an enemy sees one of their comrades killed without actually seeing the culprit, that particular enemy will be on alert and more vigilant as they plot to take their revenge. Expect the future of AI gaming will include much smarter NPCs since NPCs have always been one core use case for AI in games."
TLOU2 is pretty good re NPCs, I thought. Def a step up from most. Not sure if they just mean 'well programmed' here though.
And there is this:
ua-cam.com/video/PYtmFF02OH4/v-deo.html
@@CurtOntheRadio I feel like that will be a „next-gen“ type development. I fully expect PS6 and whatever the next Xbox will be, to have a highly capable neural engine to handle the processing of AI interactions and that will be their defining feature. The PS5 Pro already has a 300 tops machine learning block after all for PSSR.
That is going to be the point where even AAA developers will start to embrace AI use-cases in games, once the average player has the hardware. It seem like the next logical evolution. As to how exactly it‘s going to be integrating into game design I‘m also a little bit confused, especially regarding storytelling but we‘ll see.
I mainly play simplistic grapghic games because there easier to run an look great over time while realistic games just look awful over time. Physics, and effects like grass skies and water should be the current main focus. Like if in vr the grass should move semi realisticly from my finger. An water splash could look so good if details werent blocking innovation.
Reaching the hard limit on computing power would be really weird and really cool because suddenly the entire software world revolves around these limitations and we could theoretically see a greater amount optimization across the board. A world of frozen computing power, ironically, could see its performance improve over time as more people get more experience squeezing out as much power out of the silicon as possible, which in tern could mean that software gets faster to make and therefore cheaper.
theyll pivot to software as a service/live service games, and theyll have fewer updates with vanishingly smaller performance gains between them 🤣
Never underestimate harfware/software mafia geniously ripping you off again and again. They always find a way
Great video as always! I think part of the issue is resolution and the other part of it is VRAM. This is the first time we have ever gone 4x in resolution, PS1 and PS2 had the benefit of being able to change the output resolution of the console to your CRT screen and knew it would still look good, PS3/360 jumped from basically 480p to 720p, then we moved to 1080p, now its 4k, that has eaten away at half the GPU uplift we got this gen. Then you add in that RAM/VRAM barley moved! Mass Effect Legendary Edition, the main benifit both ME2 and 3 get is the HUGE improvement to texture quality! On PS3 devs had 480MB of usable RAM (32MB for the OS), PS4 gave them 5.5GB, this gen they have 13GB, and when you factor in the resolution increase etc, how much has this been eaten into? And with Nvidia keeping 8GB card relevant 4 years longer they should have with the likes of the otherwise excellent 3070, you end up in a situation like black myth wukong, a game that uses the latest GPU technologies and can look incredible, yet in places looks like a PS3 game with terrible textures. I would argue the single biggest upgrade in the Horizon Zero Dawn remaster is the increase in texture quality which along side the improved animation in cutscenes and better framing of secondary/side quest conversations (based on the video DF did) and in all honestly, other than more VRAM, none of those improvements actually needed more GPU power, the PS4 version of Horizon Forbidden West still has the same incredible motion capture and well framed conversations as the PS5/PC version. While im a sucker for better lighting in incredible micro detail like peach fuzz on Alloys face (it does look so impressive) but does it actually make the game better? No not really, the gameplay and story (my god what a story!!!!) are why I fell in love HZD.
But...modern consoles are NOT, NOT, 4k though. The vast majority of the time to maintain stable 30-40fps they'll run at 1080p. Even if you set it to run quality, the highest resolution point you'll hit isn't 4k at all, but 1660p. It's only capable of upscaled 1440p...but hardly ever runs that. I and a friend hooked up the PS5 to a his computer and ran some programs to read its performance. Ran several games at multiple settings and it averaged 1080p nearly 70% of the time...even with the two "4k" capable games at the time.
From the 360/Ps3 era onward the console companies have been lying about the resolution of the consoles. The consoles almost never render anywhere near the resolution listed on the box. The console simply upscales the image. Your "1080p" console was in fact doing 540p-800p upscaled. Your "4k" console was in fact doing 900p-1600p upscaled.
@@iprfenix Consoles like the SNES and PS1 could all output at 480i and yet they chose to do 240p. Plus even then resolutions weren't consistent. There were a whole host of pretty strange resolutions during the analog days of CRTs. Resolutions were never static. Even the PS2, which could go all the way up to 1080i and did support 480p out of the box, often rendered at 480i. It often took two 240p images and interlaced them to create a 480i image unlike the other consoles which often used a 480p frame buffer and then output a 480i or 480p image. Plus even those that ran at 480p didn't always run at 480p and could have an internal resolution of 448p and just use the fact that there's overscan to hide the missing pixels.
There isn't a single generation where your console ever always output at native resolution
PS2 era still ran at mostly 480i (which for rendering costs is largely equivalent to 240p) and earlier consoles mostly ran at 240p. Also modern consoles very rarely run any game at native 4k (stuff like the Quake 2 rerelease does run 4k120 on everything but the Series S), it's mostly stuff like 1200p upscaled to 4k with either checkerboard rendering or FSR and 30 fps.
Small Indie game developer here (or at least starting to develop games) I think one of the checkpoints to get to that point is draw distance, the best 2 examples of that I can think of is for one: The whole Ghillie suit and the grass at long distance thingy in DayZ, and the other example is how a whole lot of games are situated in this sort of "valleys" or "canyons" with high walls that conveniently stop the player from looking at details in long distances (although, nowadays it's more of an artistic tool for representing different ideas of a storyline). I just think that the day I can have the exact same visuals while looking through a x14 scope than looking at my character feet, or the day I can have an AC-130 stylish mission where I can see the white in the eyes of a character while zooming in, is the day that we're there (although I use to play 2010ish games, so maybe we're already there and I don't know yet).
Sensible comment. You're in the wrong place. ;)
when i played dayz in 2014-15 me and a friends were talking about the grass issues at distance and he had an interesting idea
At longer distance, just move the ground texture up, so if you are prone you are under the ground texture
This way you wont stick out and realistically someone that far away shoudnt be able to see you anyway
Finding out about the grass in dayz still leaves a sour taste in my mouth
I feel a big part of it is not just technology but the artists and developers actually making it look nice.
You can have the tech for the most beautiful scenery ever but you still need to arrange and light and animate it yourself, and thats effort a lot of AAA studios may or may not wanna put in
11:30 honestly yeah. Because companies would then start optimizing to get better performance again. We’d see these beautiful games that actually run well instead of just requiring you to have a 4090 to run at 60FPS
"I have a $3000 PC because ultra is the way the developer made the game to be experienced" justification x10
not true, a lot of games are build first with console in mind and than pushed with some extras on PC.
Vegetation is one of those option where the engine is just put to do a lot and that's called ultra settings.
One thing to note, is that consoles (PS3) at one time gave a price to performance balance. including a Blu-Ray player, media center, and gaming device all wrapped together for a decent budget price.
PS3 was the most expensive console Sony ever put out adjusted for inflation and it almost killed the brand. What are you talking about.
@@jiggerypokery2962 Ps3 was clearly overpriced at launch, but it was the cheapest bluray drive at the same time... Many sales were only driven by the bluray drive itself.
@@jiggerypokery2962 True, but you also can't oversee that Blu-Ray players were almost as, or even more expensive than the PS3
@@CrazyDoodEpicLeaves SONY blu ray player at the time of PS3 release... $1000
The problem may be looking at consoles that way. Back in the day many people bought a ps2 or xbox for its home media capabilities such as dvd and blu-ray.
Ofcourse the world has changed now but then so too should the console's marketing. What about if you sold consoles like phone plans? That way you could jam more premium components in and have the overall cost be higher but not less affordable.
You could also package it with big brands for the modern age. So lets say if you buy and xbox subscription you get the new xbox, gamepass, netflix,spotify premium,youtube premium and amazon prime. That way your entertainment and online quality of life services are all covered under the one payment. Much like how your dvd and game playing device was all in 1 unit in the 90's and early 2000's
im on a $300 mini pc, playing modern AAA titles at their lowest settings, and im stoked about it. the fact that 2-3 years from now i can expect to be able to replace this pc with something that can play these same games at high settings in the same price range is awesome.
not only that, but my setup only uses 25-70watts total with monitor and everything. i live on solar panels, so needing a 500watt gpu to play things is just not a possibility for me.
I wouldn't mind if performance didn't improve beyond this point (at least for a very long time). It would allow developers the time to actually improve-, and master the rendering features we currently have. Why most people want more performance today is because games perform poorly. But at the same time, games perform poorly because developers haven't had the time to master what we currently have. It's a chicken and egg kind of situation, where our economic system (feeding on non-stop growth) requires consumers to WANT the new products. But in reality, just like game development takes longer these days, so does optimizations to engines, to games etc. It feels like the kind of rendering optimization improvement developers made during a 5-year period on the PS2 would take 20 years today due to the added complexity of modern rendering. I would personally like to spend less money on hardware, and allow developers to become more focused on creative graphics engineering.
If developers stop the practice of developing games that perform best based on hardware that is not even in the market yet that would be a good thing. Personally, I lost my enthusiasm for gaming a while back when it felt like developers (especially AAA studios) decided that good looking games mattered more than good or innovative gameplay. Probably only those who grew up in the era when gaming started going mainstream in the 70's and 80's would get where I am coming from.
Developers don't make any decisions you attached to them in your comment.
The R&D or production teams: 1 artist, 3 devs, 1 analyst, 1 tester, 1 dev-leader. Those teams make up less than 60% of the company staff.
Whatever you think devs do, they don't make decisions. Especially the strategic ones.
When was this "a while back" ? 30 years ago ? Because that's what AAA studios/developers did since forever. Literally. And the current situation is much better than it was in the '90s and early 2000s, where games would literally not run on 3 year old hardware. Nowadays almost all games can still run on the midrange GTX 1060 from 8 years ago. This will be even more true when RTX 2060 reaches 8 years old.
@@Winnetou17 Have not bought a game on release since Warcraft III.
@@p4r4g0n I'm almost with you on that one, but I did had some exceptions (DOOM 2016 and Kingom Come: Deliverance, from what I remember). And In the Warcraft 3 days, we were pirating it (it was common in my area, pretty poor country).
It's funny that for Warcraft 3, I had it installed for several months before I could play it. Because, I didn't had a GPU initially.
Between PS4 Pro and PS5 there's no big graphic differences.
They're literally the same... I have a PS4 Pro and a friend has the PS5. Putting them hooked up to a computer and running some programs...the PS4 Pro runs about 95% of what the PS5 does. PS5 basically just has rt and can run 30-45fps rather than locked at 30. They're the same
There are huge differences not being emphasized due to so much cross gen support
@@GrySgtBubba The differences might not mean much to you but to say they're "literally the same" is objectively wrong mate.
PS4 Pro crawls running Cyberpunk not even holding 30 fps at 1080p with some dynamic resolution and lowest settings possible.
Meanwhile PS5 runs the game at native 1440p with high settings and ray traced local shadows (which are cheap) then upscaled to 4K at locked 30fps.
But sure, PS5 is barely more powerful.
@@phattjohnson Bruh, they both run in predominantly 1080p-ish at 30-40-ish fps, and some titles can be 1440p-ish upscaled to around 1660p-ish...and run 30-45fps... The only difference is PS5 has rt in some games, and has a few new titles. That's it! Lol
not just he fidelity, the fps thing is diminshing returns. If we pretend we can tell over 60 fps despite every study proving otherwise, 30-60 fps is a 16.7ms difference. 60-120 is 8.3ms different.
you actually have to go from 60 to 960 to get the same difference as 30-60. the difference between 60 and 120 is almost half of human perception. it's literally a difference tooo small to see, but even if you could its only half the upgrade.
Sorry that your visual acuity is decidedly on the left side of the bell curve peak, dog. Humans can identify images shown for 1 frame in a 200 FPS sequence of frames. You drew the short straw.
@@nopenoperson9118 Sorry that you're incapable of reading. Please try again and recognize I said nothing about myself and only about every study done on the subject ever.
No, humans cannot identify images shown for 1 frame in a 200 fps sequence of frames. You do know we're on this thing called the internet and so you can fact check things, right?
99.9% of studies done on the matter show 60fps as the limit, 0.1% suggest that some people might get to 68, but that's it. Showing a single frame for 1/200th if a second is not the same thing as showing 200fps and telling the difference or seeing that frame. Even in that one they put "see" in quotes as they didnt see the frame, they saw an afterimage, a visual illusion. when shown 1 frame not 200 frames with 1 different one.
Just try googling something as simple as "how many fps can humans detect" You wont find a single solitary result that says over 60.
there's multiple reasons why anything over 60fps is the same as 60fps. You're simply lying to yourself and trying to justify spending all that money. The bell curve peak would actually be more like 53 fps, since 45 is the low end and 60fps is the high end.
Yes Daniel, I would be happy to run my pc until the wheels came off.
Fans, maybe..
@@christophermullins7163 Until the fan blades came off *
No, I wouldn't. We need to be better. The PC is the most important thing, you can afford to upgrade it once every 8 years.
@@albert2006xp I just did a new build and my old one had an 7th gen I3 and a 1050. My buddy still plays on that pc lol its fine for medium-low graphics 1080 on most game 60+ fps.
New games are optimized for people with $8000 PC. Not very many people can even play the new games they are releasing not because they don't want to but because they don't have $3500 laying around to play the new game at not even the best quality
it’s probably because devs wanted 32GB of VRAM for the PS5/XSX generation, but only got 16GB. The jump from 360 to PS4 in VRAM was 16x, PS4 to PS5 was 2x…
For what? haven't heard a single dev complain about the Ps5 ram size. Even equivalent pc's rarely go past 8.
High vram alone doesn't help if the gpu is not fast enough to process the saved data and ship it onto the monitor.
The clockspeed will always be the first bottleneck and the higher it is, the more energy it drains, the more durable it has to be and the better the cooling has to be, which leads to bigger and bigger parts (cooling solutions).
The PS5/XSX doesn't even have 16 gigs of VRAM, it has 16 gigs of shared memory between GPU and CPU.
@@thelazyworkersandwich4169 Yeah ofc they don'tt complain because no one tells them to code a game for console that runs in 1440p or 4k at AT least 60fps stable, WITHOUT disabling/reducing a dozen effects, like shadows, draw distance, lod, etc. until it turns into a mushy mass of indistinguishable hot garbage.
@@esmolol4091 Yeah that's my point......
I think dlss is a problem. I feel like games are designed so most people need to use dlss just to run it at 60fps, with the downgrade in visuals it translates to.
Now that we've hit diminishing returns on graphics, I think the only place for it to go now is immersion and how you interact with the game world. Physics, AI, NCP behavior, interactable objects, World size, ground breaking game mechanics that have never been seen before. Games that have a bigger scope. Star Citizen comes to mind. that game is pushing boundaries with it's mechanics.
Like for example, a group of players being able to walk around a ship that's being piloted by another player, landing on planets, going into space, entering quantum travel, all seamlessly. No loading screens ever. I'm just so ready for games to more mechanic focused and less graphics focused, and hopefully THAT will be the driving force going forward in the gaming industry and the reason for new hardware to exist.