Seriously, they just don't care about optimization anymore. They just tell people they should upgrade their PCs. And if people say anything about getting bad performance, people will say brain dead things like, "Don't be poor". It's insane.
Well you need to upgrade like Ram, SSD new graphic, but and devs must to actually work under performance and optimization than just "oh gamer will need better hardware"
@@bigturkey1 People often forget that the world isn’t just the USA. Where I live, a 3080 costs around $1,000, which is more than my monthly salary. Guess it’s time to stop being poor! My 3- and 4-year-olds are definitely ready to start pulling their weight. Time to send them to work in the mines !
Sorry about the typo. We are not advocating for more work though. We're saying the major issues that come with these checkmark workflows are being defended just for being a checkmark workflow. Instead of validating the issues or fixing them, the industry keeps pushing them. As for the realism aspect, it's not realistic for things to be noisy or smear in motion. We also have massive support from developers who actually watch out content instead of judging on superficial issues(addressed in the new video).
I know you guys are busy, but here's an idea that came to mind; A slower and bit longer, Crowbcat style video showcasing and highlighting "then and now" in terms of graphics. I'd imagine your audience would love it, and you guys clearly have the technical knowhow to do it proper justice.
I totally support you guys in what you are doing. Even though Wukong is like 10x better than western modern games in terms of art etc.,I saw in that game these visual issues you manage to put into words as an expert(I just played the game more that is why I noticed). As someone with migraine, these artifacts and kind like strobe-like(don‘t know the best terminology) lighting makes me really tired and almost trigger a migraine sometimes. Hope you can get real devs behind your project and that the gaming industry goes away from stupid activists and invest into talent. Waiting to see a game you helped get better. Best of lucks.
Thank you very much for doing this, and speaking in behalf of gamers, I'm glad you're getting the recognition you deserve. I was always confused as to why the hell all these new games looked like if had a layer of Vaseline all over it and ran like crap. I spend so many days trying to fix settings on my own and get rid of the smear and artifacts and nothing worked until I discovered your videos. It all makes sense now.
You need to make a really dumbed down, and *slowed down* version of the argument, so that dumb people like me can understand it - and that should go viral and spark outrage and whatnot
We seem to have slipped back into the berenstain bears universe and everything is out of wack. Im currently working on a "back in wack" machine that will return us to our original berenstein universe where things made sense. I will keep you updated soldier. Stay frosty
If you only pay attention to the crap games. More very good games are released every year than ever before. There are not enough minutes in the year to finish all the very good games being released. Ignore the rage economy.
To be fair, this mainly seems to apply to some big triple A studios. There are games like Black Myth Wukong after all, with more like that to come. Also, despite the predatory and shitty monetization practices, it does seem like more and more gacha games are taking the Hoyoverse route and putting money into 3D models etc, where as years ago gacha games typically had low budget looking 2d chibi sprites as the norm.
GPU companies and game developers are totally milking us. Games barely look better than 9 years ago, but now you need a monster PC to run them. Feels like they're just creating artificial demand to keep selling expensive hardware.
Consumers demand their 8k rocks and dirt in the background. Right up there with the 5 channels of uncompressed audio piped through an amazon basics soundbar.
@@CarelessOcelot me over here happy as hell and nobody could convince me otherwise with a 65lgcx, god tier 7.2.4 system, and a ps5 pro for gaming. My 3080 pc aint getting touched till it cant torrent anymore.
At first I thought you're blowing it out of proportions. Then I realized that Witcher 3 came out 8 years ago and that game really did have graphics comparable to today. On the other hand I also went from 1080p 60fps and medium settings on the newest titles of 8/9 years ago to 1440p 100-165 fps and max settings on the newest titles of today. So the hardware and performance of the graphic engines are miles better compared to 9 years ago.
Hilarious that modern gamers are finally realizing gaming peaked in PS2 era when games were made to be fun and no one knew what a developer looked like
Yeah pretty much this. Today’s technology in the graphics space of games has the potential to produce some of the best looking games we’ve ever seen. And in a few games it actually has. But in most cases it feels like artistic design, color palette, it’s all being replaced with “but look at these textures though!” I was hoping this new era of graphics would have produced games that have perfect or near perfect texturing but with the artistic and visual design of something like Elden Ring. So far it just hasn’t been the case.
And foolish game companies keep letting go of the people who have real experience, institutional knowledge, and understand the tricks of the trade, because these people are "too expensive" and "hard to work with."
And there were less incentives to please investors and meet unrealistic deadlines. Investors ruined gaming and it continues because people still pay for garbage like MTX and pre-cut DLC.
I think older games were developed better too but I don't see it as a lack of resources. Budget, development time and debugging were on a much smaller scale before 7th gen came around. Some of the best developed games of the 90's and early 00's were definitely using as many resources as possible. I do think that a lot of devs in the past 10-20 years have been very creative when working with less resources though. It's as if they were more focused on gameplay than anything else.
It is an aspect but I don't think the public understands the meddling Blackrock does in all media. This is the main culprit, Blackrock makes these companies add in certain aspects mostly regarding political propaganda , and if it makes the game bad it does not matter because the loans they get from Blackrock overshadow any damage a boycott could do. That's why bud light is stronger than ever right now. How would that be possible if barely anybody bought their beer for years?
@@DimeszTV That's the problem though they want people like us that don't give a fuck about politics either demoralised and pushed out of gaming or they want the games saturated to such a point where we don't have a choice but to play them. Hence the big push to takeover Japan etc to conform to western standards. I mean they are fully aware the primary audience for western ESCAPIST entertainment is straight men, this fact is just despised. This is precisely why its so divisive, the artists have changed but the audience hasn't, despite all the talk about the so called "modern audience".
@ yeah, I feel it but even then it’s like ok cool. Make a trans character… I’ll play it. But why do we always have to talk about your hardships as a trans character when you are fighting literal dragons and demons lol
“Necessity breeds ingenuity.” Before we had the ease of photorealistic graphics at our fingertips, developers had to be creative in order to get the most bang for their buck. They had to actually think about their design choices, and make deliberate decisions, instead of copy pasting the latest hyper realistic asset and applying some ray tracing
*cough cough* Monster Hunter Wilds. Got a 3080 w i9 10900k both have a slight overclock and still running native 1080p without Framegen or DLSS was getting 40-45ish FPS this modern take the AAA development on game performance is a joke.
Wtf are you talking about lol. A 2080 which I have runs every game I have tried at over 165 fps at 1080p on ultra. Even 1440 p Im gonna get over 60 fps on 99% of games. 4k is where it gets dicey
@@charlesogden8492 I had the sad realization that the last game I've played that seems to be really well optimized was CoD Modern Warfare 2019. The damn game runs at 120-150fps (1080p) on a potato level RTX 2060 6GB for crying out loud... Try that with any more modern game (especially UE5) and you'd probably be in the single digits
@@Kysa2 you're playing the pirated version aren't ya 😂 i bought the game cause i really liked it and I'm playing on a 3070 ryzen 5 7700, getting 80 fps locked no stutters. The cracked version is an older version which has lot of problems. Buy the game in the next winter steam sale if you want to enjoy smooth experience.
As a 3D artist who used to work on games in Seattle, its not even that hes bringing this up and being ignored, hes 100% right and much of the industry (at least when i was in it in 2018) is actively working against it and only focusing on message pandering and micro-transaction monetization. We had people and even a lead at one place i worked who acted like there were living little worlds in the screens and we had to respect characters like people...
Yeah "Puppy/Baby-Syndrome" where people adopt things and attribute human properties to them and/or adopt them as their children.... Very very Common in Leftist Women that are not or can not become Mother's.... Also common among single mom raised Leftist Boys that idolize Women and have been taught to be some male-version of a mother
@@logandunlap9156if thats where the studios are located its not ops fault for having to live there to work at these studios. Where else is he going to go? He aint going to find a AAA studio to work for if he moves to north dakota..
This guy is doing the God's work. I'm tired of upgrading my PC again and again just for the lazy devs to go "we can spend less time on optimizations now".
Welcome to living in a society that only knows how to use tech on the front end and not how to make it or what its limits are on the backend. Late millenials and early zoomers (1994-2004) where the peak of tech literacy. We grew up when the modern internet and PC's where rolling out and learned about them as they did. To later born zoomers and onward starting in 2005 they are basically magical objects they have always had and just worked for them, they never had to go through janky growing pains and learn technical skills like hardware repair and software optimization.
this 110%. ive heard for years how zoomers were going to be so great at tech. sometimes they are worse at simple tasks like installing a printer than the boomers
@@pwnomega4562But at that age if you were going into a field that involves them and had deep interest before, you would have had to work around all those problems and learned along the way
Honestly at this point, I think there is an understanding between game development companies and Nvidia to keep games so overly non optimized that you constantly need the latest hardware to run it.
I rem games at the start on ps1 and end on ps1 and the difference was night and day. Now its just add more to everything including $$ for it to look nicer
I’m thinking exactly the opposite: Nvidia insisting they’re not going to give you more VRAM suddenly seems to make more sense. They know game devs can optimize and run their games on mid-tier GPU’s so NVIDIA is in no hurry to double the VRAM in their products. They’re not even in a hurry to sell you a 5000 series gpu.
The fact that the latest hardware barely work for some of these games either is what’s crazy. Examples being CoD and Space Marine 2 having awful loading times with the solution being “YoU nEeD aN SSD” but having it on an SSD doesn’t even help half the time.
1) Game developers don't want to optimize because that's extra work and why do that when gamers will buy all the blurry stuttering crap anyways 2) Nvidia wants to lock gamers into their ecosystem by making them dependent on the upscaling and frame generation which needs the games to perform like crap 3) match made in heaven.
Look at Battlefield 1. Look at Uncharted 4. Look at RDR2; game graphics peaked in 2015-2019, but now games looks very plastic and artificial, which is ironic since AAA devs are pouring more into graphics now then the last decade Edit: I forgot Batman Arkham Knight
Games like these should be the visual standard-bearers we demand for all AAA games going forward. Criticize anything that doesn’t look as good as these.
rdr2 suffers from pretty much requiring taa to resolve the imagine. its actually one of the first big games to pioneer this issue lol. people on fucktaa always trying to find a fix for the blur
There was also a big talk when the PS5 was coming out that SSDs were just as fast as RAM in certain cases and that we could basically load the entire game just as fast. This would effectively remove any loading screens. Devs did not take advantage of this and we are left in the last gen.
I am TIRED of these developers thinking gooey smear vision is realistic! The clarity of mid 00s images is much better. Had much cheaper but sensible effects. We need no RTX. They come up with new heavier tech to make us buy new cards for stuff that was figured out in 2015.
Efficiency in power consumption was also a lot better before. Passive cooling actually was a thing for GPUs. Like Nvidia wants us to drive highly overpriced experimental rocket cars on public roads.
They focus on graphics while everything still looks crap. Worst of all, physics peaked in the PS3 era and went downhill after. They've got NO excuse to look this bad. It's absolute insanity
Everything’s so fucking blurry now. Go back and play something like Serious Sam, the original. Obviously it’s more primitive but everything is so crisp and clean, the lighting is so nice but also simple. Now it’s like there’s layers of blur and random colors and lens flares and shit and it’s so annoying.
im tired of developers thinking that photo realistic graphics are good at all imo the games that look the best are games that look like art pieces not like real life ie; wind waker, the border lands series botw ect
A lot of this reminds me back around 2004 when mp3's and lower compressed quality music was preferred over the advancement in audio quality and hardware. As a former audio engineer I would record and mix with incredible sample rates and bit depth, but then have to to separate mixes for compressed mp3's that were to be played out of a mono cell phone speaker. It is where the majority of your audience is that you are tailoring too. As long as you have the high quality mixes you could mix for vinyl later on if project wants. I would like to see if possible for games maybe different releases for those who care about quality graphics? I understand the struggle knowing the tech!
I know this is really anecdotal but starwars battlefront 2016 is the game that really made me wake up. Endor, hoth, bespin are all still absolutely gorgeous by todays standards 8 years later.
@@MikeTV-w3l Both tho i believe Bespin was only in II. I would not say Hoth is anything special as an ice planet but Endor is much harder to do and they did a great job on it. I personally also like Kamino too.
We are entering the era of the people in charge being people who grew up with no expectations and no training. They actively dislike beauty and have no idea how to create it. People want cheap, fast, and cost effective. This will get much worse unless we implement extremely high expectations.
The industry used to rely on debugging and no post-processing to optimise a game. AI upscaling has become a reliant part of development. There aren't too many genius level type programmers like John Carmack out there. I can only imagine how many genuinely talented people have been wasted or neglected in modern hiring culture.
Short-sighted view. The industry constantly improves, and we are in a transition period. Back in the 80's and 90's low-hanging fruit made it easy to make big strides. Real time global ray-traced illumination is a pipe dream 30 years in the making. And requires new optimization techniques currently being developed and improved on. UE is still being worked on, but it focuses on different techniques than the guy in the vid wants them to focus on.
The sad thing is that my 7 year old gaming laptop can play these older games that objectively look better... but it can't play Marvel Rivals at all. Dev's don't know how to optimize anymore.. They just say "fuck it" and expect everyone to have computers as good as the ones they used to build the game.
Crysis from 2007 is still more impressive then most of todays bs Wtf
9 годин тому+40
That game at least had really good reason for the high system requirements. It looked... no, still looks good. Amazing actually. Technologically highly advanced. The biggest fault of the engine was that they thought higher CPU clocks will be the important part, not the more cores. So yeah, it doesn't really take advantage of quad cores for example. I'm still impressed with that game despite it has lower polygon count models, somewhat lower texture sizes and no real GI. But the interactivity and overall look is still really appealing and it's sharp!
even MGS4 from 2008 is far more visually impressive than most games that are out today, i'd say even more impressive than MGS V, that game honestly looks like shit in comparison. (sure, it had some performance issues) game devs are lazy and need to wake up, why could we get top games back then with only a fraction of what we have today. (I know, corporate greed)
Even Halo 1 from 2001 looks better LOL. I was dumbfounded when i played Indiana Jones and found that Halo 1 was far greater than Indiana Jones and the great graphics disaster.
I built my first PC to run Crysis. It was junk, it burned out my graphics card after two months, but it was pretty and worth it. Can confirm, no smudge on my monitor.
The guys saying its expensive. If game companies cut out expensive parts of making games, why are games more expensive than ever to make? These companies should reroute 75% of the marketing budget to making the game good.
@@RedMatthew this is it the ceos thought they could just fire talent and place someone to replace them but they didnt know how much the talented devs where worth they shot themself in the foot
Graphics were better 10 years ago, and games also ran at stable 60 fps 10 years ago. Graphics are worse now and we get constant below 60 fps games. It's atrocious. Honestly the fps part is the real issue.
@@richardcollis5576 The only inaccurate thing here is you saying "THIS is an inaccurate statement" when I have several different claims in my post. Which one is inaccurate, multiple? all of them? Be more specific when you reply to comments.
false & falsetruth. if u obv take games like Crysis 1 & compare it with Dragon Age Veilguard. sure, but that comparison isnt worth to argue about. Crysis 1 ran like garbage, same as Crysis 2 or Far Cry. 30fps & u were happy in most games & trust me at that given resolution: it was blurry, no motion blur needed in the first place. now u can play Stalker, Wukong, Dragons Dogma 2, Alan Wake 2 & it will look way better than the 10 year older games. u also have to compare at given hardware. if u take a 4080 & boot up 10 year old games they obv run better, common sense.
@@Vss077 Graphical quality isn't just about fidelity, art style/direction, and performance AT those fidelity levels are just as important. And we've lost out in all those areas in the last decade for sure.
it's not so black and white as you think. But with a limited and cynical view of the industry, I can easily see why you would jump to these conclusions. And before your cynical mind jumps to another conclusion, I can see coming from a mile away; No I am not an Nvidia fanboy.
@@ancientmadness-rlk it's not UE5 fault bat game devs fault, and all the industry benefits from that, nvidia gives them millions, and hardware companies make billions selling new hardware so we can play new games with worst graphics then 10 years ago. Just compare Rdr2 from 2018 to new games.
@@webtiger1974PTGyea I feel like it’s partly ue5 though the only game I’ve had run decent for me on my 3070 is wukong but even then it was maxing out my vram just feels like there’s a lot of bloat in ue5
I am completely the opposite. I highly prefer games that strive for "reality". Likely because I am from the 80's and grew up looking forward to better graphics that were constantly developing. Real time global illumination to me is just as big of a jump as going from 2d side scrollers to 3d games.
The resolution thing is silly. Like phone screens not needing more than 1080p because of the screen size. Why do you want 4k unless you're a professional that needs all details precisely represented so you can do your job properly? Actually the whole "PS3 games look awful" thing is silly. I'd be much better if the game industry was focused on gameplay mechanics and providing a new experience rather than trying to make games look like movies but the gameplay of them making me feel like I'm playing a hundred God of War 1 mods.
Gaming Companies are too busy making everything look super realistic rather than finding their own style for their own games. Hopefully they start catching on that not everything needs to look hyper-realistic.
Nah, I love hyper-realism but outside of Cyberpunk, there ZERO console games that focus on hyper realism as they’re all using their own style. I wish gaming companies would focus on hyper realism but they’re not.
@@kruze69 you are literally proving the argument that developers aren’t trying to utilise the technology to its fullest. The technological capability is there, the developers just don’t care to do it. Just like they could make a non-DEI ridden game but they’re not motivated to do so for whatever reason you decide it to be. It’s the same way with utilising the technology to its fullest.
I'm leaving a meeting where a new AAA game needs to be optimized after the FPS has gone through the basement. The problem is how big teams operate these days. It works like this: Every department and feature team throws as much as possible into the scene, knowing that it will be optimized away in a further optimization step. The problem is that certain details absolutely destroy the player experience, such as lighting. When the game director has to choose between better networking performance and better lighting, and he is under budget and time pressure, he always chooses networking optimization because gameplay brings in revenue. Everyone wants good graphics, but even more want a good game. 20 years ago, game development was easier because we had smaller teams where we could communicate more efficiently, but also because the games were more straightforward. Networking today means, for example, that every idiot from anywhere in the world has to be able to play the game well, with many multiplayer features. In the 2000s, almost nobody could do this in terms of performance vs. cost, and games were not designed with this in mind. Think how FPS games changed over time, and how you hosted own servers in the early days. In our meeting, the engineering team came up with the idea of putting 10 cameras in one scene. It's unprofessional and screams incompetence. So, how could it happen? We discovered that the art director, his artists, and many senior developers don't have enough experience with LOD and draw calls for some wild reason. So, what does management do now? They throw people like me on -- for 5x the price. Some people are laid off. And every department now tries to salvage as many features and details as possible in the end game, or else their career takes a hit. This happens with almost every big game these days, the bigger the team. I know from friends that Valve is much more efficient in this regard than Ubisoft, EA, or Microsoft.
dude thanks for share u experiencie i love to hear how these things works on big companies since im a unreal engine user for a few years now but never know how the things works in the real world scenerios
What's sad is the solution is staring the rest of the industry in the face: Be like Nintendo, keep your talent for generations and keep them behind the cutting edge so you get 100% out of their experience. But nope, consultant sponsored lay off death spiral is what western AAA is choosing lmao
people don't always notice the difference in visuals, but they surely notice if their Gpu can't keep up if the solution both looks better and runs better, it's a win-win for consumers, but it requires more work, so it's a no-no for corpos, who want games to come out as fast and as cheap (for them) as possible
I've been playing games 10 years old or older. They play better and have better styles in general. Not to mention they utilize unique development engines. They have their own soul.
Battlefield 1 from 2016 still looks amazing, so does Doom Eternal, both run like a dream on my GTX 970 at ultra settings. Meanwhile, Baldur's Gate 3, which looks nice, I can't complain, struggles to hold a steady 30 fps even on medium settings.
Every AAA game developer "you can see your reflection on water", "the rain look so real" or "you can see how smooth it move" everytime keep telling graphic quality than story quality
Yep, which is why only very few of my games are after that year, but they are all remakes and remasters. The only games that come to mind that are not a remake of remastered I have added since then is Ruiner, Necromunda Hired Gun, Soulstice, Kingdom Come Deliverence, Trepang 2, Everspace 2, Remnant 2 and Vermintide 2. Cyberpunk is tolerable if you ignore the character creator and the one character they added i the expansion during the race events. I don't own Witcher 3, I found it extremely boring but I recall in being woke free, I think.
Yeah, it's extra pain in the ass when you're trying to help someone who's trying out a game, but all they can tell you is that it looks "blurry" and you have to figure out where the blur is coming from.
I will say that as someone playing Division 1 in 2024 for the first time ever, I had to double check that I didn't get some remastered version, the games graphics was literally leagues above most games out now with much better performance smh
I know i keep going back to the game and being speechless every time. You can even see the slow degredation of talent between division 1 and 2. Where models for clothing and such were just messy or unrealistic looking to the point where i think they must have got some college work experience people in to do the job.
What's crazy is that at the time people looked at that game as a classic example of Ubisoft lying in their trailers and then heavily downgrading in the actual game. Now we look at even what they gave us and think that was ahead of its time compared to now
@@scouthatesrainbowsYeah, because what they showed in E3 was absurd. It looked like if they had squished 4 Crysis games together and used the juice that came out of that to make their game, the immersion, lighting, reflections were crazy. And ultimately it made sense, because they couldn’t deliver those.
I’ve been saying this forever. Same for movies. Name a movie that looks years better than Avatar from 2010. They literally quit improving them and just started using them as vehicles for their agendas.
I mean, just looks better? Dark City (2001), A man for al seasons (1966), The Pinchcliffe Grand Prix (1975), 12 Angry Men (1957) are all amazing visually, and are even older. Art direction is severely underrated. But that's a cheeky answer, to give a serious one: If you mean, has better special effects, then Dune: Part Two (2024) looks like a beautiful painting in every scene. It not only has incerdibly good art direction, but also very good special effects and it just looks gorgeous. Watch it in a cinema if you can, you'll be blown away.
We went from 5% female employees on these studios to 40% in 9 years, hiring females was the first DEI movement before DEI was even a thing. As soon as these studios made some money and hired a proper HR, someone noticed the discrepancy between men and women and immediately started hiring lower quality females. I have never stated women are inferior in any way, it just so happens that more dudes are into games, more dudes are engineers and more dudes are more capable in this field, creating an natural 15x1 male to female ratio. Whenever they forcibly try to correct this they hire people way under the standard from 10 years ago, today we see the new standard
That's the problem with equity (compared to equality) that these companies don't seem to care about for some reason. Equality means everyone has the same, equal opportunity. That sounds great, right? Who objects to that? Equity means companies force equal outcomes. That sounds ridiculous, because not everyone has the same skills and capabilities. There would be nothing wrong with there being 40% female employees, if that 40% were as good at their job as the men they were replacing. They aren't though. So these hiring practices serve no real purpose...other than to force diversity with less capable workers. Consumers get mad, companies lose money, diversity gets blamed and shunned. Literally nobody wins...yet they keep doing it smh
Doesn't help that they also hire some people with barely any experience near fresh out of school bc they're cheaper than the dev/specialist with 20 years experience. (Although this is a problem everywhere in the job market, not just AAA games)
I miss the ps2 days when updates didn't exist. The devs were forced to ship games when they were actually ready cause what they shipped is basically the final version of the game.
One thing that makes me sad is how more video games have less and less emphasis on little objects, particles, and physics effects. They pack so much into having grand hyper-realistic 4k designs and lighting that they often minimize or forget the little details that make environments feel life-like, interactive, and exciting to play in. Plus a lot of games are barely optimized now and take NASA computers to run for only a little improvement.
PR guys be like "but look at the wrinkles on the nose of the main character! This is future!". Bro, 99% of the game im looking at their fooking back 5 meters away!
10:55 thats exactly it. Devs will often use the baked in systems to save time Thats why for years if you played a shooter and went to throw a grenade, you could immediately go "oh its unreal engine" based on how the throw physics and trajectory.
there's truth to what you're saying but i don't know or know of a single person that has that kind of autism where they can tell what engine is being used based on that.
I'm a game dev and this is the type of thing that production and/or stakeholders will simply tell you "nobody will care about this, this is a waste of time, you're not allowed to work on it" if you bring up wanting to work on, research, and/or fix this. Good luck trying to convince them when they barely understand 10% of what you're saying. Especially now that most studios are dealing with being extremely understaffed and can barely put out a functional game before the deadlines. And there's always the argument "other games are getting away with it too" (heard that one too much)
In 2015 I thought 1080@60 would be the "Doom running on a fridge" standard by now. But apparently tech goes backwards now. We're living in a Warhammer universe!
I wonder what 2013's techniques optimized to modern hardware would look like. I mean a game made with the mentality and the tech of 2013, but knowing that modern hardware exists. Devs can put high-detail geometry objects and ultra high resolution textures.
Simply put, the PS5 is 10 times faster than the 4, but due to lazy implementation of technology, the result is games that run as if it had about the same hardware performance.
Exactly. It’s crazy that we have a 5-10x hardware upgrade but the performance is somehow seems worse than last gen. We have been playing cross gen games for more than half the generation like that dog Ragnarok. Sony stopped releasing AAA games as well. Spiderman 2 barely looked better than Spiderman 1 but had 10x more pandering and propaganda. They even reused most of the map from the first game. Remember Killzone 2 and God of War 3, Uncharted 2 on PS3 looked 20x better than any PS2 game. That was in a hardware that was really hard to optimise for due to the unique architecture. Current devs need to be purged from the industry and hire real developers. Not to mention the cost of games from PS4 to PS5 quadrupled.
Take a game that runs 30 fps at 720p on ps4. Port that to ps5. Going from 30 to 60 fps needs 2x the performance. Now say the ps5 version is 1440p (4x the resolution). Well that'll take 4x the graphics power to hit. That's an 8x increase in hardware level for the same game and features, just at higher resolution and double the fps. That's why everything is stagnant.
@@xkannibale8768 >Going from 30 to 60 fps needs 2x the performance It's not that simple. Depending on a underlying code, going to 60 may be impossible without the major rewrite no matter how many cycles you throw at it.
One thing I've noticed a lot about games, is how games back in PS3 era had focus on physics and other more smaller details rather than just graphics. A great example of this is comparing the old Far Cry 2 game to the newer ones in the series, and see how everything apart from graphics has pretty much been downgraded since then.
Even that was a downgrade of course; most of those details were old by the time they hit consoles. Far Cry 1 was a technical marvel back in 2004, with ragdolls and vehicles and crazy long-distance sniping. And of course, the actual Far Cry developers released Crysis a year before Far Cry 2... and that game had fully physics-driven destructible scenery to a larger degree.
@@NicholasBrakespear Never played the first one but I don't doubt you. Also another example is GTA IV with its euphoria animation/ragdoll system and the vehicle destruction. A part from BeamNG I haven't seen any other game that is equal to or better than GTA IV's vehicle destruction/deformation
@@dinmamma138 Oh for sure, I loaded up GTA IV a while back... I'd forgotten how much stuff was removed by the time of GTA V. Some of it I could understand - making gameplay a bit less comedically drunken - but a lot of it was just... lazy. When you go further back too, there are all kinds of details that were thrown aside. Playing through Thief: The Dark Project again at the moment... and the sound system is crazy. Not only can you hear things from miles away, but the sound moves realistically through the level; instead of magically passing through solid walls, the sound actually comes from the direction of the open door etc, and to eavesdrop on what's on the other side of a door? You just walk up to it... and manually lean against it. And Quake 3? You can type messages to the bots, and they respond; they had a built-in text parser.
Definitely. Far Cry 2 had amazing systems running on the engine. Al the following entries in the series dropped those features one by one. Far Cry 4 was a static unbreakable objects filled bland and boring shooting gallery. Everything was static. Bottles, buildings, fences you name it. They took all the fun away and just left the shooty shooty pew pew. Enjoy
@@NicholasBrakespearThief The Dark Project was amazing with a Creative Sound card. Truly spooky in the Haunted levels. Remember when sound cards were a thing? Some developers squeezed the snot out of them.
I don't get how some people are pulling the "who cares" card. Like bro, this guy just wants a better product for everyone and telling them how they can fix the problem. Like literally what is the negative part of him speaking about this? This is like if someone said "hey I found out a way to draw in a way to be faster and look more realistic" and you just pull the "who cares" card.
That's how valve makes their games in source 2. There's a command to show all those spheres for baked lighting. That's why source games run so well even on big maps. I can't understand why studios keep running away from baked lighting
Because it is not dynamic. It requires a certain type of game with non-destructible objects and only a handful of toggleble lights. So no dynamic day night cycles, no torches etc.
laziness and time sadly is not even just UE5, look at Capcom, RE8 Village and RE4 Remake asked for a GTX 1070 for 1080p/60fps as the recommended then in Dragons Dogma 2 a GTX 1070 is the minimum to run at 1080p/30fps the performance issues clearly made people refund or not even bother with DD2, in the long run this is costing them more money
Consoles aren't a bad thing. I would argue the worst thing about consoles has been the "pro" and high-end variants. Consoles provide a hardware limitation which PC doesn't. This forces developers to optimize to get the best graphics they can on the limited hardware. The rapid iteration of new consoles has destroyed that incentive. It's one of the reasons we're in this mess.
Consoles are good if you consider them as a budget gaming machine. 300-400 bucks and you could freely play your games for 8 years. Now they feel overpriced
The problem with the Pro models are the unrealistic performance expectations that are never realized. A Pro model should focus on other things like better or additional controllers, increased disk space, better warranty, swappable storage, easy maintenance and customization, backwards compatibility, an included disc drive etc. instead of a better customizable experience you get promises of a performance boost that is never seen because the game was built around the original hardware
also the death of disk drives in consoles. developers where forced to cram a complete(!) game on a 700mb/4,7gb Disk. nowadays we just get a 500gb update.
People nowadays can quickly become videogame connoisseurs. As a videogame connoisseur, a PC is needed for all the abandonware and mods that are not distributed by official companies currently in business. Also, storage issues are very easy to deal with on PC
I’d have nothing against “pro” variants of consoles if they were really big steps up in performance. PlayStation could have somewhat upgraded the CPU (not that the PS5 is really cpu bound) on the Pro but they could have massively upgraded the GPU. Gone from basically a laptop gpu to a real PC quality GPU. People would have accepted a bigger form factor and Sony has the scale to get those GPUs at a low price we consumers could not.
Its because getting 90% of the way to photorealism is a lot easier than even going 90-95%. That's why they still don't have self driving teslas everywhere. They make it took like it's easy but the last bits of the problem are so complex that it takes 9 years to make reflections look better.
So figuring out the last 10% made the other 90% trash and now that's acceptable because the last 10% will fix all of the lower quality issues cropping up currently to the point of being photorealistic? Or am I missing your point. Also I don't see Tesla's becoming less functional because they are improving the final 10% till full autonomous driving, it seems just more stagnant instead of regressive
A big thing is we got really good at creating fake non dynamic lighting that looked realistic. Then we started doing ray tracing which had to do the same thing but for real with a low performance hit. We have been playing catchup ever since. Some recent titles like Indiana Jones are a great example of we are getting closer to that amazing lighting but it still requires very high end hardware. While we have to support the old fake lighting techniques AND ray tracing it almost doubles the work for a developer.
@@qAnon118 it was very limited, if you had more than 3 active lights in the area it would become buggy, even with mods pushing the engine as far as it could
The only reason Companies are shifting to UE5 is to avoid wasting money and time teaching developers their proprietary engine. It all comes down to Money. They don't have to pay for R&D for the Engine. They don't have to hire developers who are more experienced and they cost more. They can easily outsource Graphics are the least part they worry. (in some cases of course)
yet they still pay 100 mil dollar for that and still fail to deliver because now instead of wasting 50% of budget on optimization, we spend it on sweet baby inc and other wokegendas. "if I wouldn't smoke 2 packs a day, I could buy a lambo in 10 years". Lie. The extra money would be spent in other stuff instead. People always have same budget but it used to be mainly spent on graphics and optimization and attention to details. Now that is not true, they pump more content instead at a lower quality and they add "procedurally generated" slop feature that all games recently have. Basically they spend the same OR MORE but on most useless part of the game. Sad. This is why indie game are now popping off. They add that attention to detail and interest that most AAA just skip.
tbh, I as a consumer, don't want to pay for every single game company to develop in-house engines especially when they'll do it worse than a company that focuses on developing engines -- why reinvent the wheel over and over again?
@@Rumble-Tusk Bloat. I'm sick and tired of games being 15x the size they should be because they come with all these unnecessary physics and lighting engines that make a 2D RPG look and run worse. And then, somehow, while using UE, their teams are 20 times larger than they used to be when they made their own, so you're just paying for an army of college grads who'll drop out of the industry in 5 years to be replaced by the next batch.
True it is. I hate motion blur. but sometimes removing it will show the flaws in movement when playing the games. Making you get eyestrain or get motion sickness.
Yup, as if plain old greed wasn't bad enough we now have identity politics coming before passion and merit. When you have these companies all advertising their stunning and brave inclusive recruitment while Ubisoft is joking about 50% of their team having never worked on a game before (astheir studio is burning) its like ... bro. The first question you should be asking prospective developers is their experience, after that their interest in games and what genres they are passionate about. At that point anything else is completely irrelevant.
How many times can we go over how AAA gaming is dead. People don't ask even basic questions like why engine upgrades like environmental destruction are practically non-existent, even if you assume graphics reach some sort of diminishing return. Why is every game the same crappy player characters walking around on a 3D bitmap.
As someone who's always used consoles, I just recently switched to Steam Deck and it has already been a huge improvement. It's been nice playing games without them being all choppy, or looking like a visual mess, or both, on top having sooo many more settings to mess with for games that didn't have them before. Even an FOV slider was only a recent luxury on consoles..
Division 1 was max settings was unreal and still looks fantastic because the atmosphere of the environment added to the aesthetic. Looks well better than div 2.
I'm at the point (8:20) where he's talking about lighting, and guys, remember FEAR? Where careful budgeting got you things like grenades shaking lamps, which then moved the light sources? Meanwhile you were fighting AI that really *REALLY* wanted to murder you and did things that made them feel alive? And it was all done with very careful planning, level design etc. that catered to the engines strengths and just skipped over the bads? AMAZING game. It will play on anything that can run Windows 11.
I always felt like I was at fault for not having good enough hardware. I just got a new computer and was redownloading my steam library, and was surprised by the jump in gbs between generations. Like the graphics are comparable and the content is relatively the same but there’s nearly a 10x amount of space difference between some sequels
I am so happy people are finally tlaking about this. I first noticied it when i tried RDR2 on pc and my eyes started to hurt due to blurry TAA. I bought 4k monitor and for that game it didn't help that much. For some others it is fine, but still
For a long time, improvements to fidelity had a hardware limit, so folks had to look for ways to improve outside of just more raw power. I think we have moved beyond that. Now, it seems the hardware improves faster than people can max potential so to go to way of trying to improve fidelity is just to throw more power at it. That is only one way that you can improve fidelity and by relying on it as key solution, all other elements stagnate, and innovation flatlines.
Chromatic aberration, eye focus, vignette, world motion blur, no reflections without ray tracing in mirrors, or broken mirrors, film grain, grain in ray tracing. Modern graphics with all options on are distorted as fuck with effects.
This is what I have been trying to tell my programming circle for years. Original hardware constraints drove innovation in the 90s/20's. Lazy Programmers just say "We dont need to worry about performance because hardware is getting better" I cannot stand unreal engine being adopted by large studios or corporations.
Caveman here, I just want games with good stories, interesting characters, and fun game play. I don't give a f about graphics, the last time graphics really mattered was Nintendo 64, the improvements made each generation following have been miniscule. I can't stand to play games on PC, I want to sit back and relax while gaming, being on a PC feels like I'm at work or in school. Plus who the hell wants to drop thousands of dollars on a PC when you can get a console for less than half the price. Plus consoles just work, it's plug and play, I don't have download drivers or whatever the hell you have to do with PC, I just want to play not have to spend time doing non play work just to make my game function.
15:00 that's the problem, is most people do not see the problem with the current graphics. They're extremely use to it that they do not expect anything other than Fortnite-like models and textures. That is hugely how you see some games either massively fail or succeed.
The technical details in this video overshadow the real problem. Hardware improvements and things like DLSS become crutches and reduce the incentives for developers to optimize their game, because all they need is "just good enough," and if everybody is doing it, there is no risk of losing customers. It's paradoxical because you are forced to buy better hardware and still get the same product. I am overgeneralizing, but you get the idea.
These companies have brainwashed idiots to believe that DLSS is a feature. It's an anti-feature. I come from the generation where 4k means actual 4k, not 1440p with an interpolation filter.
@@jimb12312 I'd say this wouldn't be a problem necessarily if the technology itself would have better A.I . So because the "default" is not smart enough to produce good results for most games , devs have to create a personalized new default for their own game or a different personalized default for each level or map. Which is A LOT OF WORK. So yeah they are "lazy". But Idk if I'd blame devs entirely. Because it has become industry standard, now they are offered timelines that are hard to follow, so if you optimize and personalize, you are behind schedule so you would get fired 🤷♂ . The person in charge must understand that, not the dev. "Devs are lazy" is only true on projects where they have that extra time allowed yet they waste and still use the default. BUt on another note. going back to the idea that I believe it's not end of the world because at some point I am sure we would get GOOD enough default tools with cheap cost no matter the content. Basically that's the gambling here. "DLSS" is an anti-feature because the job DLSS does is generating pixels but it's not accurate enough to look perfect. But in future at some point it will. So games would run at 13 FPS but thx to the AI, you'd run 100FPS even on cheap hardware. My point is that if a technology is bad but good enough that everyone gets lazy, then the only other two solutions are: 1) Get everyon stop being lazy 2) Get the tool work properly so people can be lazy yet the result to be perfection
@@harnageaa DLSS is a statistical method of interpolation. It can never be 100% accurate. That is mathematically impossible. The information is lost. I don't think devs are lazy in general. The demand for talent exceeds supply. DEI hiring policies make it worse. Game studio executives prioritizing profit and not allocating resources to optimization makes it worse.
Yup 1080@60 is good enough, you start to get extreme diminishing returns past that point. As long as objects don't look aliased to fuck it's good. There is only so much stuff you can take notice of in a moving frame. I get that 120FPS and beyond looks smoother, but it's not required like it was from 30 to 60 where they had to use motion blur to make it not look juddery.
Basically, 10 years ago games competed over how good they looked, so they had development teams to optimize the graphics. Then Epic came along and said, "If you just use our engine you can have 90% of the quality with 10% of the effort" So companies cut back on how maybe engine developers they had and hired DEI consultants to compete on buzzwords instead.
So we got 100% of the game before and 100% of the effort and now we get 90% of the game at 10% the effort. Yet the budget and the team working on the game is bigger than ever. Which means those extra tens of millions of dollars are going clearly not into anything game related, just pure waste of money.
I remember when 4k was becoming more prevalent, people would say anti aliasing isnt required and it should help with maintaing performance. Boy was that not true. TAA and other temporal solutions are required in modern high end gaming.
It simply went from "optimize it so it's playable" to "why optimize when it's playable?"
9 годин тому+3
Sadly DLSS and frame generation is a get away solution for game devs. Why optimize when you can fake everything? Blurry mess with ghosting and input lag, but at least it runs... kinda... That's the issue.
Let's also consider the actual visual elements that are disappearing, that used to be present - leaving aside the lighting techniques and so on. Look at Far Cry 1 in 2004... notice that enemies have impact decals when you shoot them. How many games do you see now, on a regular basis, that don't have this feature? How many games that don't even have the Quake 2 approach (where enemies had a "damaged" skin for when you'd shot them a few times)? I was playing POE1 the other day, and I noticed that plants... actually caught fire and burned when fire effects were used near them. Yet when you look at Space Marine 2, with the player wielding a weapon that can supposedly melt metal? Plants are impervious. There are so many details like this, which should be standard by now, and so many other details that we should reasonably expect. And the argument that people "don't know any better" or "don't notice it" doesn't hold even slightly. Standards have to be maintained, or you don't just fail to progress, you slide backwards. Doesn't matter if subsequent generations of gamers ever "notice"; doesn't matter in the slightest.
Will have to rewatch, but on a similar video, A directional light is just a ray or line segment A point light is a sphere in orbital space They may have different rendering computations, and optimizations, as they are object-types. One may have more optimizations for stack memory, the other maybe for heap memory
Game as a service happened, Fortnite, Minecraft, Overwatch. The industry moved to where the money was, and they agreed that the money wasn't on graphics
I would say their money is on graphics as well, so much that it breaks and they don't do shit about optimizing graphics and smoothing their gameplay and physics that it breaks even more.
Those are some unlucky examples for graphics. No idea about Fortnite, but both Minecraft and Overwatch both run perfectly fine on semi-current potatoes. Specifically, Minecraft is an indie game that was initially made by one dev. While Overwatch was and still is a very optimized game considering its graphics. Mind you, 2016 graphics and in-house engine. You don't even need an average GPU to pump out 100+ frames. And its lowest settings at 100% render scale doesn't look that different from max settings like it used to be the case for games back in the day. Meanwhile the newest Marvel Rivals is a much more demanding game while not looking like an improvement in graphics at all. Furthermore, it's a stylized game that still relies on all these smearing techniques like forced TAA, upscaling, sharpening, Lumina and whatever else is happening under the hood. 4090 runs the game with max settings and supposedly native DLAA at 70-75 fps with dips to 40 during actual gameplay. I feel like that's pretty ridiculous. Some of my friends can't play the game at 60 without lowering render scale despite having a mid 10 or 20 series GPUs. I'd like to remind that visually the game does _not_ look like a huge improvement over smth like Overwatch. But hardware requirements are much higher.
The blurring effect we see in modern games is the result of bad anti-aliasing being used. TAA and FXAA being the top examples I can think of that causes blur. SMAA is still the king of anti-aliasing imho and the most visually appealing.
The reason for bad anti aliasing is because MSAA, which is the gold standard, doesn't work in modern game engines. Modern engines use temporal post processing effects and these effects don't work with MSAA, they only work with heavy handed AA that can "smooth out" extremely bad aliasing and TAA is the gold standard, the problem is "smooth out" = blur, lots of Vaseline blur and then they try to use sharpening filters to fix the blur but it does not work
Longtime game artist : What happened is we went from relying an artists eye to texture and optimize to letting machines calculate everything. It’s Tech for the sake of tech, people too obsessed with what they can do instead of what they should do.
Graphical fidelity jumps peaked when Crysis blew everything out of the water. It was singlehandedly the largest jump in graphic fidelity ever. Now adays the "jumps" are barely noticeable.
Seriously, they just don't care about optimization anymore. They just tell people they should upgrade their PCs. And if people say anything about getting bad performance, people will say brain dead things like, "Don't be poor". It's insane.
Frame Generation is about to make thing even worse as they start using it as a crutch to reach the acceptable framerate.
@@Ryzen776the worst part is that tech like FSR and DLSS make issues like ghosting and blurring even worse
a 3080 can play everything at 4k with upscaling and those are like $350 now, thats sounds pretty optimized to me
Well you need to upgrade like Ram, SSD new graphic, but and devs must to actually work under performance and optimization than just "oh gamer will need better hardware"
@@bigturkey1 People often forget that the world isn’t just the USA. Where I live, a 3080 costs around $1,000, which is more than my monthly salary. Guess it’s time to stop being poor! My 3- and 4-year-olds are definitely ready to start pulling their weight. Time to send them to work in the mines !
I am Game Dev and I can definitely say that the new people we are getting in the industry doesn't know sht about optimization
Yes, but they know a lot of wokeness.
I agree but… at some point I was so tired of baking the light constantly and remapping that I just embraced the laziness and convenience of unreal 🤣
Are they xy or xx 😂
Same as in audio production.. They just do the same technique they learned with 0 creativity.
I feel sorry for you thank you for your work
Sorry about the typo.
We are not advocating for more work though. We're saying the major issues that come with these checkmark workflows are being defended just for being a checkmark workflow. Instead of validating the issues or fixing them, the industry keeps pushing them.
As for the realism aspect, it's not realistic for things to be noisy or smear in motion.
We also have massive support from developers who actually watch out content instead of judging on superficial issues(addressed in the new video).
I know you guys are busy, but here's an idea that came to mind; A slower and bit longer, Crowbcat style video showcasing and highlighting "then and now" in terms of graphics. I'd imagine your audience would love it, and you guys clearly have the technical knowhow to do it proper justice.
I saw Marty'sMod supporting your project. If the king of SMAA, AO and RTGI says you're right, then you're right.
I totally support you guys in what you are doing. Even though Wukong is like 10x better than western modern games in terms of art etc.,I saw in that game these visual issues you manage to put into words as an expert(I just played the game more that is why I noticed). As someone with migraine, these artifacts and kind like strobe-like(don‘t know the best terminology) lighting makes me really tired and almost trigger a migraine sometimes. Hope you can get real devs behind your project and that the gaming industry goes away from stupid activists and invest into talent. Waiting to see a game you helped get better. Best of lucks.
Thank you very much for doing this, and speaking in behalf of gamers, I'm glad you're getting the recognition you deserve. I was always confused as to why the hell all these new games looked like if had a layer of Vaseline all over it and ran like crap. I spend so many days trying to fix settings on my own and get rid of the smear and artifacts and nothing worked until I discovered your videos. It all makes sense now.
You need to make a really dumbed down, and *slowed down* version of the argument, so that dumb people like me can understand it - and that should go viral and spark outrage and whatnot
Worse stories, worse graphics, worse IP's, the gaming industry is going backwards hard
We seem to have slipped back into the berenstain bears universe and everything is out of wack. Im currently working on a "back in wack" machine that will return us to our original berenstein universe where things made sense. I will keep you updated soldier. Stay frosty
If you only pay attention to the crap games. More very good games are released every year than ever before. There are not enough minutes in the year to finish all the very good games being released. Ignore the rage economy.
There are plenty of good games, you're just focusing on the bad ones
To be fair, this mainly seems to apply to some big triple A studios. There are games like Black Myth Wukong after all, with more like that to come. Also, despite the predatory and shitty monetization practices, it does seem like more and more gacha games are taking the Hoyoverse route and putting money into 3D models etc, where as years ago gacha games typically had low budget looking 2d chibi sprites as the norm.
I also went backwards and now play old games.
GPU companies and game developers are totally milking us. Games barely look better than 9 years ago, but now you need a monster PC to run them. Feels like they're just creating artificial demand to keep selling expensive hardware.
Consumers demand their 8k rocks and dirt in the background. Right up there with the 5 channels of uncompressed audio piped through an amazon basics soundbar.
@@CarelessOcelot me over here happy as hell and nobody could convince me otherwise with a 65lgcx, god tier 7.2.4 system, and a ps5 pro for gaming. My 3080 pc aint getting touched till it cant torrent anymore.
At first I thought you're blowing it out of proportions. Then I realized that Witcher 3 came out 8 years ago and that game really did have graphics comparable to today. On the other hand I also went from 1080p 60fps and medium settings on the newest titles of 8/9 years ago to 1440p 100-165 fps and max settings on the newest titles of today. So the hardware and performance of the graphic engines are miles better compared to 9 years ago.
It's called the Jewish effect .
Hilarious that modern gamers are finally realizing gaming peaked in PS2 era when games were made to be fun and no one knew what a developer looked like
People get more creative when they have less resources. Thats why I think older games were developed better.
Yeah pretty much this. Today’s technology in the graphics space of games has the potential to produce some of the best looking games we’ve ever seen.
And in a few games it actually has.
But in most cases it feels like artistic design, color palette, it’s all being replaced with “but look at these textures though!”
I was hoping this new era of graphics would have produced games that have perfect or near perfect texturing but with the artistic and visual design of something like Elden Ring.
So far it just hasn’t been the case.
And foolish game companies keep letting go of the people who have real experience, institutional knowledge, and understand the tricks of the trade, because these people are "too expensive" and "hard to work with."
And there were less incentives to please investors and meet unrealistic deadlines. Investors ruined gaming and it continues because people still pay for garbage like MTX and pre-cut DLC.
I think older games were developed better too but I don't see it as a lack of resources. Budget, development time and debugging were on a much smaller scale before 7th gen came around. Some of the best developed games of the 90's and early 00's were definitely using as many resources as possible.
I do think that a lot of devs in the past 10-20 years have been very creative when working with less resources though. It's as if they were more focused on gameplay than anything else.
It is an aspect but I don't think the public understands the meddling Blackrock does in all media. This is the main culprit, Blackrock makes these companies add in certain aspects mostly regarding political propaganda , and if it makes the game bad it does not matter because the loans they get from Blackrock overshadow any damage a boycott could do. That's why bud light is stronger than ever right now. How would that be possible if barely anybody bought their beer for years?
9 yrs ago the industry was filled with talent and not activists...
True!
Why are we making politic stances in video games. We don’t give a fuck
@@DimeszTV That's the problem though they want people like us that don't give a fuck about politics either demoralised and pushed out of gaming or they want the games saturated to such a point where we don't have a choice but to play them.
Hence the big push to takeover Japan etc to conform to western standards.
I mean they are fully aware the primary audience for western ESCAPIST entertainment is straight men, this fact is just despised. This is precisely why its so divisive, the artists have changed but the audience hasn't, despite all the talk about the so called "modern audience".
@ yeah, I feel it but even then it’s like ok cool. Make a trans character… I’ll play it. But why do we always have to talk about your hardships as a trans character when you are fighting literal dragons and demons lol
Try 19...
It's funny how the movie industry is paralleling the video game industry during the same time period. Big budgets, lots of greed, zero inspiration.
Red One = Concord
Its same people running both, and when they get tired loosing money, they just remove your ability to purchase any product not made by them.
The original star wars was made for $11 mil. Thats $57M in today's money. 😮
If I see another Red One Commercial I'm going to go insane. They spent a lot of advertising money on me who will never watch their movie.
This is what happens when you only care about profits and not the art.
"The industry is bullying him? Why?"
They hate him for telling the truth.
Unreal Jesus told them the truth, so they hated Him
“Necessity breeds ingenuity.”
Before we had the ease of photorealistic graphics at our fingertips, developers had to be creative in order to get the most bang for their buck. They had to actually think about their design choices, and make deliberate decisions, instead of copy pasting the latest hyper realistic asset and applying some ray tracing
This or the statement, limitation breeds creativity. Both are true to what's been going on in gaming imo.
You said bang
Anyone remembers the Crysis trilogy? Truly a masterpiece ahead of its time
@@behindthen0thing for money of course!
What breed are you speaking Professor????
2010 : look at that awesome game I can run on my modest PC!
2025 : what do you mean my 5090 needs DLSS to run this game at 60fps 1080p?!?!?!
*cough cough* Monster Hunter Wilds. Got a 3080 w i9 10900k both have a slight overclock and still running native 1080p without Framegen or DLSS was getting 40-45ish FPS this modern take the AAA development on game performance is a joke.
Wtf are you talking about lol. A 2080 which I have runs every game I have tried at over 165 fps at 1080p on ultra. Even 1440 p Im gonna get over 60 fps on 99% of games. 4k is where it gets dicey
Brah I have a 7900XTX and 5800x3d and I can’t even get 60 fps in hogsmead or even in Hogwarts. Imagine what lower spec gpu and CPUs run at…
@@charlesogden8492 I had the sad realization that the last game I've played that seems to be really well optimized was CoD Modern Warfare 2019. The damn game runs at 120-150fps (1080p) on a potato level RTX 2060 6GB for crying out loud... Try that with any more modern game (especially UE5) and you'd probably be in the single digits
@@Kysa2 you're playing the pirated version aren't ya 😂 i bought the game cause i really liked it and I'm playing on a 3070 ryzen 5 7700, getting 80 fps locked no stutters. The cracked version is an older version which has lot of problems. Buy the game in the next winter steam sale if you want to enjoy smooth experience.
This is like how furniture 100 years ago was beautiful handcrafted works of art. Now we have flat particle board crap from Ikea. Progress
best analogue so far
As a 3D artist who used to work on games in Seattle, its not even that hes bringing this up and being ignored, hes 100% right and much of the industry (at least when i was in it in 2018) is actively working against it and only focusing on message pandering and micro-transaction monetization. We had people and even a lead at one place i worked who acted like there were living little worlds in the screens and we had to respect characters like people...
People who can't separate fiction from reality
>seattle
there's your problem, big dawg. sometimes washington astounds me with how much more cali they can be than cali.
holy shit
Yeah "Puppy/Baby-Syndrome" where people adopt things and attribute human properties to them and/or adopt them as their children....
Very very Common in Leftist Women that are not or can not become Mother's....
Also common among single mom raised Leftist Boys that idolize Women and have been taught to be some male-version of a mother
@@logandunlap9156if thats where the studios are located its not ops fault for having to live there to work at these studios. Where else is he going to go? He aint going to find a AAA studio to work for if he moves to north dakota..
This guy is doing the God's work. I'm tired of upgrading my PC again and again just for the lazy devs to go "we can spend less time on optimizations now".
Just buy a console like the rest of us boomers👍
Agreed, i just hope he will drop using makeup
@@Nattedooier I Loved my snes and ps1 as a kid. But after that there ist no reason to buy a console If you own a PC.
thats not a thing. you only read that nonsense on reddit
@nikallew50 nah my uni laptop only runs Rome Total War, Medieval 2 and Terraria💀
Welcome to living in a society that only knows how to use tech on the front end and not how to make it or what its limits are on the backend. Late millenials and early zoomers (1994-2004) where the peak of tech literacy. We grew up when the modern internet and PC's where rolling out and learned about them as they did. To later born zoomers and onward starting in 2005 they are basically magical objects they have always had and just worked for them, they never had to go through janky growing pains and learn technical skills like hardware repair and software optimization.
this 110%. ive heard for years how zoomers were going to be so great at tech. sometimes they are worse at simple tasks like installing a printer than the boomers
Bro I was born in 2001 and I'm not really good with computers/tech 😭
@@pwnomega4562But at that age if you were going into a field that involves them and had deep interest before, you would have had to work around all those problems and learned along the way
Honestly at this point, I think there is an understanding between game development companies and Nvidia to keep games so overly non optimized that you constantly need the latest hardware to run it.
I rem games at the start on ps1 and end on ps1 and the difference was night and day. Now its just add more to everything including $$ for it to look nicer
That's exactly why
I’m thinking exactly the opposite: Nvidia insisting they’re not going to give you more VRAM suddenly seems to make more sense. They know game devs can optimize and run their games on mid-tier GPU’s so NVIDIA is in no hurry to double the VRAM in their products. They’re not even in a hurry to sell you a 5000 series gpu.
The fact that the latest hardware barely work for some of these games either is what’s crazy. Examples being CoD and Space Marine 2 having awful loading times with the solution being “YoU nEeD aN SSD” but having it on an SSD doesn’t even help half the time.
1) Game developers don't want to optimize because that's extra work and why do that when gamers will buy all the blurry stuttering crap anyways
2) Nvidia wants to lock gamers into their ecosystem by making them dependent on the upscaling and frame generation which needs the games to perform like crap
3) match made in heaven.
Look at Battlefield 1. Look at Uncharted 4. Look at RDR2; game graphics peaked in 2015-2019, but now games looks very plastic and artificial, which is ironic since AAA devs are pouring more into graphics now then the last decade
Edit: I forgot Batman Arkham Knight
Games like these should be the visual standard-bearers we demand for all AAA games going forward. Criticize anything that doesn’t look as good as these.
Arkham Knight was an unoptimised mess on release though.
Uncharted 4 is not canon, bro. lmao
rdr2 suffers from pretty much requiring taa to resolve the imagine. its actually one of the first big games to pioneer this issue lol. people on fucktaa always trying to find a fix for the blur
Graphics peaked with wind waker and okami
There was also a big talk when the PS5 was coming out that SSDs were just as fast as RAM in certain cases and that we could basically load the entire game just as fast. This would effectively remove any loading screens. Devs did not take advantage of this and we are left in the last gen.
Oh trust me. It is very apparent that there has been a drastic decline in game development as far as aesthetics and quality.
At this point they would be better off making their own engine to use from scratch each and every time.
Used to be the common @@luckybutunlucky8937
I am TIRED of these developers thinking gooey smear vision is realistic! The clarity of mid 00s images is much better. Had much cheaper but sensible effects. We need no RTX. They come up with new heavier tech to make us buy new cards for stuff that was figured out in 2015.
Problem is it takes time and effort to come up with solutions instead of relying on default checkboxes and AI to fix their problems
Efficiency in power consumption was also a lot better before. Passive cooling actually was a thing for GPUs. Like Nvidia wants us to drive highly overpriced experimental rocket cars on public roads.
They focus on graphics while everything still looks crap.
Worst of all, physics peaked in the PS3 era and went downhill after.
They've got NO excuse to look this bad. It's absolute insanity
Everything’s so fucking blurry now. Go back and play something like Serious Sam, the original. Obviously it’s more primitive but everything is so crisp and clean, the lighting is so nice but also simple. Now it’s like there’s layers of blur and random colors and lens flares and shit and it’s so annoying.
im tired of developers thinking that photo realistic graphics are good at all imo the games that look the best are games that look like art pieces not like real life ie; wind waker, the border lands series botw ect
A lot of this reminds me back around 2004 when mp3's and lower compressed quality music was preferred over the advancement in audio quality and hardware. As a former audio engineer I would record and mix with incredible sample rates and bit depth, but then have to to separate mixes for compressed mp3's that were to be played out of a mono cell phone speaker. It is where the majority of your audience is that you are tailoring too. As long as you have the high quality mixes you could mix for vinyl later on if project wants. I would like to see if possible for games maybe different releases for those who care about quality graphics? I understand the struggle knowing the tech!
tube amp > digital amp. analogue production > digital production.
I know this is really anecdotal but starwars battlefront 2016 is the game that really made me wake up. Endor, hoth, bespin are all still absolutely gorgeous by todays standards 8 years later.
Yep. Both reboot games look fantastic to this day.
Me trying to figure out if you mean Battlefront I (2015) or Battlefront II (2017)
@@MikeTV-w3lboth are beautiful. 2 of my favorite games of all time
@@MikeTV-w3l Both tho i believe Bespin was only in II.
I would not say Hoth is anything special as an ice planet but Endor is much harder to do and they did a great job on it.
I personally also like Kamino too.
@MikeTV-w3l talking about battlefront 1 i must have been remembering a dlc release in 2016. Thats almost almost 10 years ago damn i feel old.
Ever wonder how we still play games that roughly 10 years old? All went to shit, not just optimizations. EVERYTHING
We are entering the era of the people in charge being people who grew up with no expectations and no training. They actively dislike beauty and have no idea how to create it. People want cheap, fast, and cost effective.
This will get much worse unless we implement extremely high expectations.
The industry used to rely on debugging and no post-processing to optimise a game. AI upscaling has become a reliant part of development. There aren't too many genius level type programmers like John Carmack out there. I can only imagine how many genuinely talented people have been wasted or neglected in modern hiring culture.
In summary, it's not the consoles or PCs problems, it's the devs.
Short-sighted view. The industry constantly improves, and we are in a transition period. Back in the 80's and 90's low-hanging fruit made it easy to make big strides. Real time global ray-traced illumination is a pipe dream 30 years in the making. And requires new optimization techniques currently being developed and improved on. UE is still being worked on, but it focuses on different techniques than the guy in the vid wants them to focus on.
@@Rem_NLthose techniques look like crap in practice like the guy in the video points out.
@@Rem_NLthats the thing though. the industry isnt improving. it hit its peak and now is declining.
The sad thing is that my 7 year old gaming laptop can play these older games that objectively look better... but it can't play Marvel Rivals at all. Dev's don't know how to optimize anymore.. They just say "fuck it" and expect everyone to have computers as good as the ones they used to build the game.
Crysis from 2007 is still more impressive then most of todays bs
Wtf
That game at least had really good reason for the high system requirements. It looked... no, still looks good. Amazing actually. Technologically highly advanced. The biggest fault of the engine was that they thought higher CPU clocks will be the important part, not the more cores. So yeah, it doesn't really take advantage of quad cores for example.
I'm still impressed with that game despite it has lower polygon count models, somewhat lower texture sizes and no real GI. But the interactivity and overall look is still really appealing and it's sharp!
even MGS4 from 2008 is far more visually impressive than most games that are out today, i'd say even more impressive than MGS V, that game honestly looks like shit in comparison. (sure, it had some performance issues)
game devs are lazy and need to wake up, why could we get top games back then with only a fraction of what we have today. (I know, corporate greed)
Even Halo 1 from 2001 looks better LOL. I was dumbfounded when i played Indiana Jones and found that Halo 1 was far greater than Indiana Jones and the great graphics disaster.
I built my first PC to run Crysis. It was junk, it burned out my graphics card after two months, but it was pretty and worth it. Can confirm, no smudge on my monitor.
@HappyGamerz946 you shouldn't exaggerate to the point of sounding ridiculous
The guys saying its expensive. If game companies cut out expensive parts of making games, why are games more expensive than ever to make? These companies should reroute 75% of the marketing budget to making the game good.
DEI is the biggest expense
@@RedMatthew this is it the ceos thought they could just fire talent and place someone to replace them but they didnt know how much the talented devs where worth they shot themself in the foot
They need 2000 DEI hires to do the same or less than a 100 did 10 years ago.
Some indie with 10 people does better than those thousands of devs.
what about the ceo's 100mil?
Graphics were better 10 years ago, and games also ran at stable 60 fps 10 years ago. Graphics are worse now and we get constant below 60 fps games.
It's atrocious. Honestly the fps part is the real issue.
This is an inaccurate statement
@@richardcollis5576 The only inaccurate thing here is you saying "THIS is an inaccurate statement" when I have several different claims in my post.
Which one is inaccurate, multiple? all of them? Be more specific when you reply to comments.
false & falsetruth. if u obv take games like Crysis 1 & compare it with Dragon Age Veilguard. sure, but that comparison isnt worth to argue about.
Crysis 1 ran like garbage, same as Crysis 2 or Far Cry. 30fps & u were happy in most games & trust me at that given resolution: it was blurry, no motion blur needed in the first place.
now u can play Stalker, Wukong, Dragons Dogma 2, Alan Wake 2 & it will look way better than the 10 year older games.
u also have to compare at given hardware. if u take a 4080 & boot up 10 year old games they obv run better, common sense.
@@Vss077 Graphical quality isn't just about fidelity, art style/direction, and performance AT those fidelity levels are just as important. And we've lost out in all those areas in the last decade for sure.
"Don't optimize, just consume new hardware." - UE5 Devs (probably)
I wonder who benefits from that.
it's not so black and white as you think. But with a limited and cynical view of the industry, I can easily see why you would jump to these conclusions. And before your cynical mind jumps to another conclusion, I can see coming from a mile away; No I am not an Nvidia fanboy.
So true, too much spaghetti
@Rem_NL Really? Even with nvidia actually funding major releases and straight up locking you out of a game for not having a competitors card?
@@ancientmadness-rlk it's not UE5 fault bat game devs fault, and all the industry benefits from that, nvidia gives them millions, and hardware companies make billions selling new hardware so we can play new games with worst graphics then 10 years ago.
Just compare Rdr2 from 2018 to new games.
@@webtiger1974PTGyea I feel like it’s partly ue5 though the only game I’ve had run decent for me on my 3070 is wukong but even then it was maxing out my vram just feels like there’s a lot of bloat in ue5
If billion dollar game studios actually had creative art direction, they wouldn't need to spend so much money trying to make things look "real".
Exactly, my favourite games most of the time aren't trying to look like real life. I play games to escape reality, not replicate it.
I am completely the opposite. I highly prefer games that strive for "reality". Likely because I am from the 80's and grew up looking forward to better graphics that were constantly developing. Real time global illumination to me is just as big of a jump as going from 2d side scrollers to 3d games.
@@happyjam92do you got game recommendations?
@@Rem_NL bro, just walk outside.
Billion dollar studios??????,
The resolution thing is silly. Like phone screens not needing more than 1080p because of the screen size. Why do you want 4k unless you're a professional that needs all details precisely represented so you can do your job properly?
Actually the whole "PS3 games look awful" thing is silly. I'd be much better if the game industry was focused on gameplay mechanics and providing a new experience rather than trying to make games look like movies but the gameplay of them making me feel like I'm playing a hundred God of War 1 mods.
Gaming Companies are too busy making everything look super realistic rather than finding their own style for their own games. Hopefully they start catching on that not everything needs to look hyper-realistic.
Except just standard realistic in basic with stylize color shader
Nah, I love hyper-realism but outside of Cyberpunk, there ZERO console games that focus on hyper realism as they’re all using their own style. I wish gaming companies would focus on hyper realism but they’re not.
@@ThisisFerrariKhan yeah cause no console gonna run a hyper fucking realistic game properly
@@ThisisFerrariKhan nah, they all look like the same generic UE5 bs
@@kruze69 you are literally proving the argument that developers aren’t trying to utilise the technology to its fullest. The technological capability is there, the developers just don’t care to do it. Just like they could make a non-DEI ridden game but they’re not motivated to do so for whatever reason you decide it to be. It’s the same way with utilising the technology to its fullest.
I'm leaving a meeting where a new AAA game needs to be optimized after the FPS has gone through the basement. The problem is how big teams operate these days.
It works like this: Every department and feature team throws as much as possible into the scene, knowing that it will be optimized away in a further optimization step. The problem is that certain details absolutely destroy the player experience, such as lighting. When the game director has to choose between better networking performance and better lighting, and he is under budget and time pressure, he always chooses networking optimization because gameplay brings in revenue. Everyone wants good graphics, but even more want a good game.
20 years ago, game development was easier because we had smaller teams where we could communicate more efficiently, but also because the games were more straightforward. Networking today means, for example, that every idiot from anywhere in the world has to be able to play the game well, with many multiplayer features. In the 2000s, almost nobody could do this in terms of performance vs. cost, and games were not designed with this in mind. Think how FPS games changed over time, and how you hosted own servers in the early days.
In our meeting, the engineering team came up with the idea of putting 10 cameras in one scene. It's unprofessional and screams incompetence. So, how could it happen? We discovered that the art director, his artists, and many senior developers don't have enough experience with LOD and draw calls for some wild reason. So, what does management do now? They throw people like me on -- for 5x the price. Some people are laid off. And every department now tries to salvage as many features and details as possible in the end game, or else their career takes a hit.
This happens with almost every big game these days, the bigger the team. I know from friends that Valve is much more efficient in this regard than Ubisoft, EA, or Microsoft.
Thanks. That was informative.
And unfortunate.
dude thanks for share u experiencie i love to hear how these things works on big companies since im a unreal engine user for a few years now but never know how the things works in the real world scenerios
What's sad is the solution is staring the rest of the industry in the face: Be like Nintendo, keep your talent for generations and keep them behind the cutting edge so you get 100% out of their experience. But nope, consultant sponsored lay off death spiral is what western AAA is choosing lmao
Really cool insight
DEI and bureaucracy go hand in hand
people don't always notice the difference in visuals, but they surely notice if their Gpu can't keep up
if the solution both looks better and runs better, it's a win-win for consumers, but it requires more work, so it's a no-no for corpos, who want games to come out as fast and as cheap (for them) as possible
I've been playing games 10 years old or older. They play better and have better styles in general. Not to mention they utilize unique development engines. They have their own soul.
Cheap too
Battlefield 1 from 2016 still looks amazing, so does Doom Eternal, both run like a dream on my GTX 970 at ultra settings. Meanwhile, Baldur's Gate 3, which looks nice, I can't complain, struggles to hold a steady 30 fps even on medium settings.
Run like a dream as well
@@ValHazzardYour living in a 30fps world no matter what you do 😂
@@kayc7442 wdym ?
Every AAA game developer "you can see your reflection on water", "the rain look so real" or "you can see how smooth it move" everytime keep telling graphic quality than story quality
This video is about graphic downgrades not story
@tlevs7621 that the thing. They more focus on graphic not the story. Why make game graphic look cool if story was bad
@@different_From_Yesterday but why not both?
@@Blackbirdone11 both good. I hope they not screw up Borderlands 4😂
Gameplay>graphics>story
developers getting lazy as hardware gets stronger and deadlines get tighter has been a thing forever.
15:43 DEI started heavily in 2014, the train took a while
This is the truth.
Yep, which is why only very few of my games are after that year, but they are all remakes and remasters. The only games that come to mind that are not a remake of remastered I have added since then is Ruiner, Necromunda Hired Gun, Soulstice, Kingdom Come Deliverence, Trepang 2, Everspace 2, Remnant 2 and Vermintide 2.
Cyberpunk is tolerable if you ignore the character creator and the one character they added i the expansion during the race events. I don't own Witcher 3, I found it extremely boring but I recall in being woke free, I think.
Hmm, but I thought hiring almost exclusively women and minorities would result in better games than ever?
People do notice it. They just typically don't understand why it looks "off".
Yeah, it's extra pain in the ass when you're trying to help someone who's trying out a game, but all they can tell you is that it looks "blurry" and you have to figure out where the blur is coming from.
This is one of the biggest reasons why so many people are playing older games. Developers nowadays don't even think of optimizing their games.
I will say that as someone playing Division 1 in 2024 for the first time ever, I had to double check that I didn't get some remastered version, the games graphics was literally leagues above most games out now with much better performance smh
I know i keep going back to the game and being speechless every time. You can even see the slow degredation of talent between division 1 and 2. Where models for clothing and such were just messy or unrealistic looking to the point where i think they must have got some college work experience people in to do the job.
not being mean but division 1 did release in 2016 so it's nothing to really gloat about
What's crazy is that at the time people looked at that game as a classic example of Ubisoft lying in their trailers and then heavily downgrading in the actual game. Now we look at even what they gave us and think that was ahead of its time compared to now
@@thatmemeguy2520which means it’s 8 years old. An entire console gen.
@@scouthatesrainbowsYeah, because what they showed in E3 was absurd. It looked like if they had squished 4 Crysis games together and used the juice that came out of that to make their game, the immersion, lighting, reflections were crazy. And ultimately it made sense, because they couldn’t deliver those.
"do you have 20 years of experience for this entry position, young man?"
"yes."
When people say "no one wants to work anymore", you can see the results here
I’ve been saying this forever. Same for movies. Name a movie that looks years better than Avatar from 2010. They literally quit improving them and just started using them as vehicles for their agendas.
avatar was the last western 3d movie. only china and russia continued to make them.
i mean, true 3d movies, not 2d with post edit.
I mean, just looks better?
Dark City (2001), A man for al seasons (1966), The Pinchcliffe Grand Prix (1975), 12 Angry Men (1957) are all amazing visually, and are even older. Art direction is severely underrated.
But that's a cheeky answer, to give a serious one:
If you mean, has better special effects, then Dune: Part Two (2024) looks like a beautiful painting in every scene. It not only has incerdibly good art direction, but also very good special effects and it just looks gorgeous. Watch it in a cinema if you can, you'll be blown away.
Bladerunner 2049 is the only thing that holds a candle to pre 2010 shit.
King Kong, which came out in 2005, looks better than the movies they make today.
@AndrewH9999 yup
that was a rare good nostalgia sequel
We went from 5% female employees on these studios to 40% in 9 years, hiring females was the first DEI movement before DEI was even a thing. As soon as these studios made some money and hired a proper HR, someone noticed the discrepancy between men and women and immediately started hiring lower quality females. I have never stated women are inferior in any way, it just so happens that more dudes are into games, more dudes are engineers and more dudes are more capable in this field, creating an natural 15x1 male to female ratio. Whenever they forcibly try to correct this they hire people way under the standard from 10 years ago, today we see the new standard
This is the real reason
Inferior generally no. Women have their strengths like men. Generally I'm going to agree though because of their human biological hardware
That's the problem with equity (compared to equality) that these companies don't seem to care about for some reason.
Equality means everyone has the same, equal opportunity. That sounds great, right? Who objects to that? Equity means companies force equal outcomes. That sounds ridiculous, because not everyone has the same skills and capabilities. There would be nothing wrong with there being 40% female employees, if that 40% were as good at their job as the men they were replacing. They aren't though. So these hiring practices serve no real purpose...other than to force diversity with less capable workers. Consumers get mad, companies lose money, diversity gets blamed and shunned. Literally nobody wins...yet they keep doing it smh
HR department definitely inferior
Doesn't help that they also hire some people with barely any experience near fresh out of school bc they're cheaper than the dev/specialist with 20 years experience. (Although this is a problem everywhere in the job market, not just AAA games)
I miss the ps2 days when updates didn't exist. The devs were forced to ship games when they were actually ready cause what they shipped is basically the final version of the game.
One thing that makes me sad is how more video games have less and less emphasis on little objects, particles, and physics effects.
They pack so much into having grand hyper-realistic 4k designs and lighting that they often minimize or forget the little details that make environments feel life-like, interactive, and exciting to play in. Plus a lot of games are barely optimized now and take NASA computers to run for only a little improvement.
PR guys be like "but look at the wrinkles on the nose of the main character! This is future!". Bro, 99% of the game im looking at their fooking back 5 meters away!
Definitely. You feel more connected when the world feels tactile and responds to your presence. We used to see it a lot and now it’s basically dead.
@@GreggyAck What? You don't like interacting with lightly glowing and outlined objects by pressing "E"? Weird.
10:55 thats exactly it. Devs will often use the baked in systems to save time
Thats why for years if you played a shooter and went to throw a grenade, you could immediately go "oh its unreal engine" based on how the throw physics and trajectory.
there's truth to what you're saying but i don't know or know of a single person that has that kind of autism where they can tell what engine is being used based on that.
I'm a game dev and this is the type of thing that production and/or stakeholders will simply tell you "nobody will care about this, this is a waste of time, you're not allowed to work on it" if you bring up wanting to work on, research, and/or fix this. Good luck trying to convince them when they barely understand 10% of what you're saying. Especially now that most studios are dealing with being extremely understaffed and can barely put out a functional game before the deadlines. And there's always the argument "other games are getting away with it too" (heard that one too much)
In 2015 I thought 1080@60 would be the "Doom running on a fridge" standard by now. But apparently tech goes backwards now. We're living in a Warhammer universe!
PS4 was running 1080p natively PS4 pro was able to do the same with sometimes 4K.
PS5 cannot run 1080p natively a lot of times. How bad is that?
@rsmith8113 wouldn't this make the ps4 the better product even today? Exclusives excluded ofc 😅
I wonder what 2013's techniques optimized to modern hardware would look like. I mean a game made with the mentality and the tech of 2013, but knowing that modern hardware exists. Devs can put high-detail geometry objects and ultra high resolution textures.
>indie games
@@KenBladehart add the budget of a AAA to pay artists, 3D designers and the accessibility of techniques like motion capture…
Don't use Win 10 or Win 11; that's it.
FF14 Dawntrail, not even kidding! 10/10
We would have insanely good looking games with no blur.
Simply put, the PS5 is 10 times faster than the 4, but due to lazy implementation of technology, the result is games that run as if it had about the same hardware performance.
Hardware cannot outrun bad code
Exactly. It’s crazy that we have a 5-10x hardware upgrade but the performance is somehow seems worse than last gen. We have been playing cross gen games for more than half the generation like that dog Ragnarok. Sony stopped releasing AAA games as well. Spiderman 2 barely looked better than Spiderman 1 but had 10x more pandering and propaganda. They even reused most of the map from the first game. Remember Killzone 2 and God of War 3, Uncharted 2 on PS3 looked 20x better than any PS2 game. That was in a hardware that was really hard to optimise for due to the unique architecture. Current devs need to be purged from the industry and hire real developers.
Not to mention the cost of games from PS4 to PS5 quadrupled.
Take a game that runs 30 fps at 720p on ps4. Port that to ps5. Going from 30 to 60 fps needs 2x the performance. Now say the ps5 version is 1440p (4x the resolution). Well that'll take 4x the graphics power to hit. That's an 8x increase in hardware level for the same game and features, just at higher resolution and double the fps. That's why everything is stagnant.
@@xkannibale8768 >Going from 30 to 60 fps needs 2x the performance
It's not that simple. Depending on a underlying code, going to 60 may be impossible without the major rewrite no matter how many cycles you throw at it.
I'm a rendering engineer for a decent sized studio. Hyper Realism as a "style" needs to fizzle out.
Tell them then
@visionforetold4568 >work somewhere with hundreds of people
>think you have any sway over the direction higher ups want to take
Are you serious?
@@johnjackson9767 defeatist attitude is unbecoming of a man
@@visionforetold4568 Keep tilting at windmills, Mr. Quixote.
I agree, decent art direction is what we need. ‘Realism’ is boring.
One thing I've noticed a lot about games, is how games back in PS3 era had focus on physics and other more smaller details rather than just graphics. A great example of this is comparing the old Far Cry 2 game to the newer ones in the series, and see how everything apart from graphics has pretty much been downgraded since then.
Even that was a downgrade of course; most of those details were old by the time they hit consoles. Far Cry 1 was a technical marvel back in 2004, with ragdolls and vehicles and crazy long-distance sniping. And of course, the actual Far Cry developers released Crysis a year before Far Cry 2... and that game had fully physics-driven destructible scenery to a larger degree.
@@NicholasBrakespear Never played the first one but I don't doubt you. Also another example is GTA IV with its euphoria animation/ragdoll system and the vehicle destruction. A part from BeamNG I haven't seen any other game that is equal to or better than GTA IV's vehicle destruction/deformation
@@dinmamma138 Oh for sure, I loaded up GTA IV a while back... I'd forgotten how much stuff was removed by the time of GTA V.
Some of it I could understand - making gameplay a bit less comedically drunken - but a lot of it was just... lazy.
When you go further back too, there are all kinds of details that were thrown aside. Playing through Thief: The Dark Project again at the moment... and the sound system is crazy. Not only can you hear things from miles away, but the sound moves realistically through the level; instead of magically passing through solid walls, the sound actually comes from the direction of the open door etc, and to eavesdrop on what's on the other side of a door? You just walk up to it... and manually lean against it.
And Quake 3? You can type messages to the bots, and they respond; they had a built-in text parser.
Definitely. Far Cry 2 had amazing systems running on the engine. Al the following entries in the series dropped those features one by one. Far Cry 4 was a static unbreakable objects filled bland and boring shooting gallery. Everything was static. Bottles, buildings, fences you name it. They took all the fun away and just left the shooty shooty pew pew. Enjoy
@@NicholasBrakespearThief The Dark Project was amazing with a Creative Sound card. Truly spooky in the Haunted levels.
Remember when sound cards were a thing? Some developers squeezed the snot out of them.
I don't get how some people are pulling the "who cares" card. Like bro, this guy just wants a better product for everyone and telling them how they can fix the problem. Like literally what is the negative part of him speaking about this? This is like if someone said "hey I found out a way to draw in a way to be faster and look more realistic" and you just pull the "who cares" card.
That's how valve makes their games in source 2. There's a command to show all those spheres for baked lighting. That's why source games run so well even on big maps. I can't understand why studios keep running away from baked lighting
Because it is not dynamic. It requires a certain type of game with non-destructible objects and only a handful of toggleble lights. So no dynamic day night cycles, no torches etc.
laziness and time
sadly is not even just UE5,
look at Capcom, RE8 Village and RE4 Remake asked for a GTX 1070 for 1080p/60fps as the recommended
then in Dragons Dogma 2 a GTX 1070 is the minimum to run at 1080p/30fps
the performance issues clearly made people refund or not even bother with DD2, in the long run this is costing them more money
@@Rem_NL and games still have a lot less physicality than during later half of 2000 era.
Because DEI hires have no idea how to do it.
@@Rem_NL??? Source games have destruction and flashlights amd fire giving off light
Consoles aren't a bad thing. I would argue the worst thing about consoles has been the "pro" and high-end variants. Consoles provide a hardware limitation which PC doesn't. This forces developers to optimize to get the best graphics they can on the limited hardware. The rapid iteration of new consoles has destroyed that incentive. It's one of the reasons we're in this mess.
Consoles are good if you consider them as a budget gaming machine. 300-400 bucks and you could freely play your games for 8 years. Now they feel overpriced
The problem with the Pro models are the unrealistic performance expectations that are never realized. A Pro model should focus on other things like better or additional controllers, increased disk space, better warranty, swappable storage, easy maintenance and customization, backwards compatibility, an included disc drive etc. instead of a better customizable experience you get promises of a performance boost that is never seen because the game was built around the original hardware
also the death of disk drives in consoles. developers where forced to cram a complete(!) game on a 700mb/4,7gb Disk.
nowadays we just get a 500gb update.
People nowadays can quickly become videogame connoisseurs.
As a videogame connoisseur, a PC is needed for all the abandonware and mods that are not distributed by official companies currently in business.
Also, storage issues are very easy to deal with on PC
I’d have nothing against “pro” variants of consoles if they were really big steps up in performance. PlayStation could have somewhat upgraded the CPU (not that the PS5 is really cpu bound) on the Pro but they could have massively upgraded the GPU. Gone from basically a laptop gpu to a real PC quality GPU. People would have accepted a bigger form factor and Sony has the scale to get those GPUs at a low price we consumers could not.
Its because getting 90% of the way to photorealism is a lot easier than even going 90-95%. That's why they still don't have self driving teslas everywhere. They make it took like it's easy but the last bits of the problem are so complex that it takes 9 years to make reflections look better.
So figuring out the last 10% made the other 90% trash and now that's acceptable because the last 10% will fix all of the lower quality issues cropping up currently to the point of being photorealistic? Or am I missing your point. Also I don't see Tesla's becoming less functional because they are improving the final 10% till full autonomous driving, it seems just more stagnant instead of regressive
Tesla self driving had major improvements lately. Travelling between major cities with no interventions.
A big thing is we got really good at creating fake non dynamic lighting that looked realistic. Then we started doing ray tracing which had to do the same thing but for real with a low performance hit. We have been playing catchup ever since. Some recent titles like Indiana Jones are a great example of we are getting closer to that amazing lighting but it still requires very high end hardware. While we have to support the old fake lighting techniques AND ray tracing it almost doubles the work for a developer.
Better informed comment than 90% of the hyperbole nonsense you see here.
Believe it or not, Skyrim has a uniqur torch with reactive lighting. Bethesda had lighting down years ago.
@@qAnon118 it was very limited, if you had more than 3 active lights in the area it would become buggy, even with mods pushing the engine as far as it could
The only reason Companies are shifting to UE5 is to avoid wasting money and time teaching developers their proprietary engine. It all comes down to Money.
They don't have to pay for R&D for the Engine.
They don't have to hire developers who are more experienced and they cost more.
They can easily outsource
Graphics are the least part they worry. (in some cases of course)
yet they still pay 100 mil dollar for that and still fail to deliver because now instead of wasting 50% of budget on optimization, we spend it on sweet baby inc and other wokegendas.
"if I wouldn't smoke 2 packs a day, I could buy a lambo in 10 years". Lie. The extra money would be spent in other stuff instead.
People always have same budget but it used to be mainly spent on graphics and optimization and attention to details. Now that is not true, they pump more content instead at a lower quality and they add "procedurally generated" slop feature that all games recently have.
Basically they spend the same OR MORE but on most useless part of the game. Sad. This is why indie game are now popping off. They add that attention to detail and interest that most AAA just skip.
tbh, I as a consumer, don't want to pay for every single game company to develop in-house engines
especially when they'll do it worse than a company that focuses on developing engines -- why reinvent the wheel over and over again?
@@Rumble-Tusk Bloat. I'm sick and tired of games being 15x the size they should be because they come with all these unnecessary physics and lighting engines that make a 2D RPG look and run worse.
And then, somehow, while using UE, their teams are 20 times larger than they used to be when they made their own, so you're just paying for an army of college grads who'll drop out of the industry in 5 years to be replaced by the next batch.
I've been saying for years that quality of game graphics are going down while the suppose hardware has never been better.
I feel that motion blur is meant to hide the issues. I always turn it off because it is an obnoxious default setting.
Same with the film grain effect
True it is. I hate motion blur. but sometimes removing it will show the flaws in movement when playing the games. Making you get eyestrain or get motion sickness.
>motion blur is meant to hide the issues
It's meant to hide low framerate.
“Bro, something happened in 2015!” Me: What could it be? *Laughs in DEI*
You good bro?
Yup, as if plain old greed wasn't bad enough we now have identity politics coming before passion and merit. When you have these companies all advertising their stunning and brave inclusive recruitment while Ubisoft is joking about 50% of their team having never worked on a game before (astheir studio is burning) its like ... bro.
The first question you should be asking prospective developers is their experience, after that their interest in games and what genres they are passionate about. At that point anything else is completely irrelevant.
🎯
@David-k1z just curious what do you do for a living?
MBA and DEI, the death knolls of all creativity but opposite political affiliation
How many times can we go over how AAA gaming is dead. People don't ask even basic questions like why engine upgrades like environmental destruction are practically non-existent, even if you assume graphics reach some sort of diminishing return. Why is every game the same crappy player characters walking around on a 3D bitmap.
“Consoles are PCs for cavemen” brother is a poet
I am a caveman can confirm. They half ass the most trivial interactions.
I don't get it tbh
@@micklucas1451 Consoles are oversimplified
As someone who's always used consoles, I just recently switched to Steam Deck and it has already been a huge improvement. It's been nice playing games without them being all choppy, or looking like a visual mess, or both, on top having sooo many more settings to mess with for games that didn't have them before. Even an FOV slider was only a recent luxury on consoles..
PC has all the problems and are the entire reasons that "games are buggy and shitty" I will admit that Cyberpunk didn't run on PS4.
Division 1 was max settings was unreal and still looks fantastic because the atmosphere of the environment added to the aesthetic. Looks well better than div 2.
"Consoles are PC's for cavemen" This man has seen the truth nobody is ready to accept.
everyone but a good 70% of all console players are ready to accept this truth
I'm at the point (8:20) where he's talking about lighting, and guys, remember FEAR? Where careful budgeting got you things like grenades shaking lamps, which then moved the light sources? Meanwhile you were fighting AI that really *REALLY* wanted to murder you and did things that made them feel alive? And it was all done with very careful planning, level design etc. that catered to the engines strengths and just skipped over the bads? AMAZING game. It will play on anything that can run Windows 11.
to this day, I'm still curious what had happened to Jankowski. I could never know if he truly died, or entered the backrooms
Visuals sure have stagnated, that much I've noticed, a game from 8 years ago is 95% the same as it is now visual quality wise.
Visuals truly didn't improve much but the system requirements skyrocketed.
8 years ago we had visual clarity
I always felt like I was at fault for not having good enough hardware. I just got a new computer and was redownloading my steam library, and was surprised by the jump in gbs between generations. Like the graphics are comparable and the content is relatively the same but there’s nearly a 10x amount of space difference between some sequels
I like 2014s crysis better than newer games smh
No Crysis games came out in 2014.
Crysis was 2007, Crysis 2 was 2011, and Crysis 3 was 2013.
Crysis 3 is stilla top notch game. Too bad my 4090 cant run it on max :(
crysis 3 remastered has all the things you people complain about. it needs dlss performance to get decent fps at 4k.
I prefer Cyberpunk. F’n fantastic game and story.
@@Therealcarolinaguy i had no issues playing crysis 3 at 4k with 4080
I am so happy people are finally tlaking about this. I first noticied it when i tried RDR2 on pc and my eyes started to hurt due to blurry TAA. I bought 4k monitor and for that game it didn't help that much. For some others it is fine, but still
I thoroughly enjoyed rdr2 on my ps5 and shitty flatscreen
@@Nattedooier today we learned RDR2 has bad graphics
@@MikeTV-w3l yah maybe sitting inches in front of the screen for hours on end has something to do with hurting eyes lol
@@Nattedooier turn off taa in rdr2 and tell me how it looks ;)
Also you don't know how far away i sit f orm my screen so you're not clever
Buy 1080p, bro.
For a long time, improvements to fidelity had a hardware limit, so folks had to look for ways to improve outside of just more raw power. I think we have moved beyond that. Now, it seems the hardware improves faster than people can max potential so to go to way of trying to improve fidelity is just to throw more power at it. That is only one way that you can improve fidelity and by relying on it as key solution, all other elements stagnate, and innovation flatlines.
Chromatic aberration, eye focus, vignette, world motion blur, no reflections without ray tracing in mirrors, or broken mirrors, film grain, grain in ray tracing. Modern graphics with all options on are distorted as fuck with effects.
Good luck entering anything into a Cyberpunk 2077 keypad LUL
Chromatic aberration completely ruined the visuals in The Outer Worlds, lucky the remaster doesn't suffer the same fate.
@@Yougottubed89 the outer worlds was an absolutely mid game
I was just playing Dying Light and was thinking how good it still looks a decade later.
PS2 was peak. ZERO lag/frame drops. In any game.
I am convinced that at somepoint 10 to 15 years ago our current timeline jumped to an alternate reality....
Dec 21 2012. 😂
@@koryhelm8515🤣🤣 I see what you did there .. 👍👍
@@koryhelm8515 Yup
@koryhelm8515 the world really did end, just not in the way we expected
When they don't keep the experienced people around because they want cheaper workers.
This is what I have been trying to tell my programming circle for years.
Original hardware constraints drove innovation in the 90s/20's.
Lazy Programmers just say "We dont need to worry about performance because hardware is getting better"
I cannot stand unreal engine being adopted by large studios or corporations.
Just think about it, that developers of new games save on content and their time, meanwhile in the past this was punished more than now.
newer games post ray tracing upscaling era have so much blur and noises it's annoying af😑😑😑
Caveman here, I just want games with good stories, interesting characters, and fun game play. I don't give a f about graphics, the last time graphics really mattered was Nintendo 64, the improvements made each generation following have been miniscule. I can't stand to play games on PC, I want to sit back and relax while gaming, being on a PC feels like I'm at work or in school. Plus who the hell wants to drop thousands of dollars on a PC when you can get a console for less than half the price. Plus consoles just work, it's plug and play, I don't have download drivers or whatever the hell you have to do with PC, I just want to play not have to spend time doing non play work just to make my game function.
15:00 that's the problem, is most people do not see the problem with the current graphics. They're extremely use to it that they do not expect anything other than Fortnite-like models and textures. That is hugely how you see some games either massively fail or succeed.
The technical details in this video overshadow the real problem. Hardware improvements and things like DLSS become crutches and reduce the incentives for developers to optimize their game, because all they need is "just good enough," and if everybody is doing it, there is no risk of losing customers. It's paradoxical because you are forced to buy better hardware and still get the same product. I am overgeneralizing, but you get the idea.
These companies have brainwashed idiots to believe that DLSS is a feature. It's an anti-feature. I come from the generation where 4k means actual 4k, not 1440p with an interpolation filter.
@@jimb12312 I'd say this wouldn't be a problem necessarily if the technology itself would have better A.I .
So because the "default" is not smart enough to produce good results for most games , devs have to create a personalized new default for their own game or a different personalized default for each level or map. Which is A LOT OF WORK. So yeah they are "lazy".
But Idk if I'd blame devs entirely. Because it has become industry standard, now they are offered timelines that are hard to follow, so if you optimize and personalize, you are behind schedule so you would get fired 🤷♂ . The person in charge must understand that, not the dev. "Devs are lazy" is only true on projects where they have that extra time allowed yet they waste and still use the default.
BUt on another note. going back to the idea that I believe it's not end of the world because at some point I am sure we would get GOOD enough default tools with cheap cost no matter the content. Basically that's the gambling here. "DLSS" is an anti-feature because the job DLSS does is generating pixels but it's not accurate enough to look perfect. But in future at some point it will. So games would run at 13 FPS but thx to the AI, you'd run 100FPS even on cheap hardware.
My point is that if a technology is bad but good enough that everyone gets lazy, then the only other two solutions are: 1) Get everyon stop being lazy 2) Get the tool work properly so people can be lazy yet the result to be perfection
@@harnageaa DLSS is a statistical method of interpolation. It can never be 100% accurate. That is mathematically impossible. The information is lost.
I don't think devs are lazy in general. The demand for talent exceeds supply. DEI hiring policies make it worse. Game studio executives prioritizing profit and not allocating resources to optimization makes it worse.
if they didnt use dlss games would still look like they did in 2017
They are losing customers but those customers are just a small percentage. Can make up loss of sales with DLC, Macrotransactions, Battle Passes, etc.
1080p 60fps has been good enough for me for the past 10 years and it will be good enough for the next 10.
Weird way to announce how poor you are
Yup 1080@60 is good enough, you start to get extreme diminishing returns past that point. As long as objects don't look aliased to fuck it's good. There is only so much stuff you can take notice of in a moving frame.
I get that 120FPS and beyond looks smoother, but it's not required like it was from 30 to 60 where they had to use motion blur to make it not look juddery.
@@SupaBuu People like you are a part of the problem why games are as they are nowdays
@@SupaBuu you sound pretty poor yourself bro. That 4090 isn't easy on the Wallet, nor the kidneys.
1080p would be enough with MSAA anti-aliasing. It's not enough with TAA blur applied
Basically, 10 years ago games competed over how good they looked, so they had development teams to optimize the graphics.
Then Epic came along and said, "If you just use our engine you can have 90% of the quality with 10% of the effort"
So companies cut back on how maybe engine developers they had and hired DEI consultants to compete on buzzwords instead.
So we got 100% of the game before and 100% of the effort and now we get 90% of the game at 10% the effort. Yet the budget and the team working on the game is bigger than ever. Which means those extra tens of millions of dollars are going clearly not into anything game related, just pure waste of money.
0:18 team america puppet
I remember when 4k was becoming more prevalent, people would say anti aliasing isnt required and it should help with maintaing performance. Boy was that not true. TAA and other temporal solutions are required in modern high end gaming.
It simply went from "optimize it so it's playable" to "why optimize when it's playable?"
Sadly DLSS and frame generation is a get away solution for game devs. Why optimize when you can fake everything? Blurry mess with ghosting and input lag, but at least it runs... kinda... That's the issue.
Let's also consider the actual visual elements that are disappearing, that used to be present - leaving aside the lighting techniques and so on. Look at Far Cry 1 in 2004... notice that enemies have impact decals when you shoot them. How many games do you see now, on a regular basis, that don't have this feature? How many games that don't even have the Quake 2 approach (where enemies had a "damaged" skin for when you'd shot them a few times)?
I was playing POE1 the other day, and I noticed that plants... actually caught fire and burned when fire effects were used near them. Yet when you look at Space Marine 2, with the player wielding a weapon that can supposedly melt metal? Plants are impervious.
There are so many details like this, which should be standard by now, and so many other details that we should reasonably expect. And the argument that people "don't know any better" or "don't notice it" doesn't hold even slightly. Standards have to be maintained, or you don't just fail to progress, you slide backwards. Doesn't matter if subsequent generations of gamers ever "notice"; doesn't matter in the slightest.
Will have to rewatch, but on a similar video,
A directional light is just a ray or line segment
A point light is a sphere in orbital space
They may have different rendering computations, and optimizations, as they are object-types.
One may have more optimizations for stack memory, the other maybe for heap memory
Game as a service happened, Fortnite, Minecraft, Overwatch. The industry moved to where the money was, and they agreed that the money wasn't on graphics
And they are right. The problem is if graphic is everything you can present because the rest is shit
I would say their money is on graphics as well, so much that it breaks and they don't do shit about optimizing graphics and smoothing their gameplay and physics that it breaks even more.
mircraft is not
Those are some unlucky examples for graphics. No idea about Fortnite, but both Minecraft and Overwatch both run perfectly fine on semi-current potatoes. Specifically, Minecraft is an indie game that was initially made by one dev. While Overwatch was and still is a very optimized game considering its graphics. Mind you, 2016 graphics and in-house engine. You don't even need an average GPU to pump out 100+ frames. And its lowest settings at 100% render scale doesn't look that different from max settings like it used to be the case for games back in the day.
Meanwhile the newest Marvel Rivals is a much more demanding game while not looking like an improvement in graphics at all. Furthermore, it's a stylized game that still relies on all these smearing techniques like forced TAA, upscaling, sharpening, Lumina and whatever else is happening under the hood. 4090 runs the game with max settings and supposedly native DLAA at 70-75 fps with dips to 40 during actual gameplay. I feel like that's pretty ridiculous. Some of my friends can't play the game at 60 without lowering render scale despite having a mid 10 or 20 series GPUs. I'd like to remind that visually the game does _not_ look like a huge improvement over smth like Overwatch. But hardware requirements are much higher.
“This game came out 9 years ago?” That’s the Fox engine for ya.
The blurring effect we see in modern games is the result of bad anti-aliasing being used. TAA and FXAA being the top examples I can think of that causes blur. SMAA is still the king of anti-aliasing imho and the most visually appealing.
The reason for bad anti aliasing is because MSAA, which is the gold standard, doesn't work in modern game engines. Modern engines use temporal post processing effects and these effects don't work with MSAA, they only work with heavy handed AA that can "smooth out" extremely bad aliasing and TAA is the gold standard, the problem is "smooth out" = blur, lots of Vaseline blur and then they try to use sharpening filters to fix the blur but it does not work
@ Yes MSAA is also problematic, I wasn't making a comprehensive list.
@ I think the problems we see today could be simply fixed by stepping back to older AA techniques and better optimization.
Insdustry is cutting so many corners to save on "time and money" that it truly makes you question whether or not we're saving anything.
Longtime game artist : What happened is we went from relying an artists eye to texture and optimize to letting machines calculate everything. It’s Tech for the sake of tech, people too obsessed with what they can do instead of what they should do.
Graphical fidelity jumps peaked when Crysis blew everything out of the water. It was singlehandedly the largest jump in graphic fidelity ever. Now adays the "jumps" are barely noticeable.
My stingy ass only play games of 5 years or older, checkmate
You somehow checkmated yourself. Good job.
@@Dexyu i am somewhat of a Magnus Karlsen myself
Keep winning king. I don't buy new games anymore either. Detroit become human funnily enough is the latest one I bought when it came out
Based
5 years from now you'll be very disappointed in this arc