@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach in GDC one year, you can just go watch their presentation.
yes!! 2 years ago the performance of that game started going downhill, from always over 100fps on ultra to almost never going above 80fps, with too many fps drops to 60fps or below, lots of stuttering, and increased input lag. Removing the old maps and game modes in the 1.0 update, plus the worse performance, are the worst issues of Ready or Not to me; it's just so unfortunate.
6:07 i got curious what game this is and made a code to analyze all existing games -- "a game that isn't dying or anything or at least that's what the devs like to think" -- is it World of Warships?
Dunkirk is a small city in France, on the English Channel, in which the British troops were trapped during the invasion of France. There is also a movie about this with the same name. @worldsinmotion
That’s because its environment is easier to render than plenty of modern games that are set in cities. It’s just not super dense, so RDR2 really leans into its lighting. They have tons of presentations on how they achieved the visuals they made. Although this doesn’t mean it surpasses them in all aspects, as there are a select few things many 9th-gen games are doing. The problem stems from the fact that we are still kinda stuck in cross-gen.
Now that raster techniques are so good, RT feels like a brute-force waste of resources. And console games are just PC games with low settings… which is not what it should be.
Moore’s Law is only dying if you’re talking about transistor density, not compute power density, which is what we should really be measuring and which isn’t slowing down any time soon.
I know "thing vs thing, Japan" is cringe as hell but I can't deny that Japanese developers really know how to optimize their games very well. Allowing my 2060 to run on highest possible graphics, while it struggles with basically any western game to have stable 60fps on low-mid settings is a feat worth acknowledging.
This is not just with games; it happens with cars as well. Western cars are horribly made and very unreliable. Meanwhile Japanese cars are built like tanks.
Except Gamefreak 😔 Well, it's probably not a problem with their developers as much as it is with the insane time constraints they're forced to work with.
In the future games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics.
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that; Minecraft is a pretty poorly optimised game, community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@jeff_7274 Bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without RTX LOL. But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they switch Java over to C++ as well and make them both equal.
@@jeff_7274 Compared to vanilla Java, perhaps. With my setup I can get ~60 fps in a jungle at 64 chunks on Bedrock, and 30 in the same seed at 32 chunks in Java, so you are right there. But with the Sodium mod and a few others I can get around 100-300 fps. So there is still a long way to go.
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, because of the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response to criticism is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
9:22 You aren't using the full performance benefits of Dictionaries. By accessing the value by key instead of iterating over the values, you get an execution time of 0.25 ms:

using System.Collections.Generic;
using System.IO;

Dictionary<string, List<string>> wordsDictionary = new Dictionary<string, List<string>>();

void LoadWords()
{
    string[] lines = File.ReadAllLines("words_alpha.txt");
    foreach (string line in lines)
    {
        string morse = Translate(line, "");
        if (!wordsDictionary.ContainsKey(morse))
        {
            wordsDictionary.Add(morse, new List<string>());
        }
        wordsDictionary[morse].Add(line);
    }
}

string Translate(string input, string divider = " ")
{
    string result = "";
    foreach (char c in input)
    {
        // lettersDictionary (char -> Morse string) is assumed defined elsewhere, as in the video
        lettersDictionary.TryGetValue(c, out string morse);
        result += morse + divider;
    }
    return result;
}

List<string> TranslateInvalidCode(string input)
{
    input = input.Replace(" ", "");
    return wordsDictionary[input];
}
Ark has always been poorly optimized, ever since the game initially came out on UE4. There is nothing new from those devs other than proving that they don't care about performance. The best examples are the original Xbox One port and the original Nintendo Switch port.
@@crestofhonor2349 Apparently the new Ark Switch port actually runs pretty well and looks better than the original crappy port, which I find incredible. I'd be very interested to see what they did to get it to run on there. Everywhere else though it still seems just as bad as it was the day it came out.
I work as a programmer for a small indie studio and my job has really made me wonder how AAA companies make videogames. You often don't see this a lot with indie games, because if your studio has 1 - 3 programmers and someone doesn't know what they're doing, you just don't have a videogame. But AAA companies seem to always struggle with the absolute silliest things, like a single weapon in Destiny 2 breaking 36 times, or the Freddy Fazbear model in Security Breach having straight up movie-level polygons, or games like CoD or RDR2 having absolutely zero compression on anything, or GTA 5 Online literally having an infinite loop on the BOOT. How do these things happen??? EVIDENTLY they have talented engineers on the teams for all of these games, but all of these mistakes are so stupid, so first-semester-of-college basic, that I wonder who these people are actually hiring. Same goes for security too. It's definitely a lot of extra planning, but not much work, to make your remotes secure and your client-server communication reliable and safe (P2P is harder though). But then these games never do it, end up with massive cheating problems, and then end up having to shell out millions for rootkit garbage "anticheat". It's baffling.
I think the problem with massive games is that their jumble of code ends up connecting a lot of seemingly separate things. Something we see as a small glitch could be an unimaginable tangle of code that is just waiting to break the entire game.
I will say something dumb, but I think it's because they don't hire good programmers who play games; instead they hire activists who really hate gamers, or just lazy developers.
@@theemperor4814 I don't think anyone who hates gamers or games would work in the industry; it's an absolutely insane amount of work that not many people truly understand. But I wouldn't put it past an unqualified or untrained team to be in charge of these things. Having a novice audio or graphic designer be responsible for importing assets and forget to compress files, or having more junior developers write less critical code that is never cross-examined, could easily cause this issue.
@@zeelyweely1590 The thing is, they love games and hate gamers, but they want what you call diversity. Look at X (before Elon) having an excess of workers; that's the majority of the gaming industry now.
I think as the games become bigger, it's also becoming harder to polish them before release. And also I hate it when a game is released unoptimized and then they say go rely on DLSS to up the frame rate. That's a clown's move.
Really good video, although there's a problem: you're mainly focusing on optimisation in Unity, which is a fundamentally terrible engine that nobody should use anymore, especially given how mismanaged it is and the runtime fee. It would be great if you talked more about optimisation in Unreal, which I feel you kinda glossed over by only mentioning Nanite and Lumen; there's much more to it than that, such as light baking, blueprint nativisation, anti-aliasing methods, shadow optimisation, etc. Godot would also have been great to mention since it's open source and all that.
The only thing I know for sure is that no one cares about the low end. If the release of CS2 has told me anything, it's that we are shit, and shit is disposable.
- As a game dev obsessed with optimization (and kinda having it as my only responsibility at my workplace), I'm indescribably upset with the current technical state of modern games. Back in the day, with limited hardware specs as well as a harsh development reality, people had to put in their best simply to make the game playable. I've been excited about this ever since I saw it working as a kid. Nowadays, games are slapped with the most generalized optimization techniques, which aren't even polished to fit the game well enough, lol.
- Today, with so many high-level languages with large overhead, game engines aimed at designers who freak out at the mere thought of using their keyboard instead of their mouse, the industry falling to its lowest and becoming yet another business area, the publishers' ridiculous tendency to develop lots of generic, over-casualized games in a foolish desire to conquer new users, and overall shitty educational quality, games end up with so much overhead and technical debt that you could easily fit 2 or even 3 old games in there, both memory- and performance-wise.
- I was drawn into the industry by this and have remained committed to optimization all along. I treat it like an art, which it really is. It's a shame to see the industry bloated with people so far from understanding how games work that Betelgeuse appears to be within arm's reach, but this is the reality. Less competition as well, which, oh man, do I abuse.
- Thanks for making the effort to read, whoever you are. Much appreciated :D
What's even more infuriating is seeing gamers and even some benchmark channels say things like "Oh, this 2023 game runs at 60fps on medium settings on a $900 PC? Then it's optimized!", meanwhile that game may have 2012 graphics at absolute best and absolutely no physics or interactive environments to speak of.
There's this braindead notion that a game's optimization is judged by when it was released rather than by how it looks and how much physics or environmental interactivity it has. As a result, devs use modern hardware not to push graphics and technology forward, but to become lazier.
Imagine if we had told devs 12 years ago that in the future video games would look 15% better visually but would run quite literally 500% worse.
People don't often consider a game's optimization in their reviews, or praise it all that much.
Take NMS: procedurally generated, yet it runs much better than most titles. Valheim is less optimized, but it's an indie title and has made large strides in improving its optimization. RDR2 runs pretty well too, same with GTA V. Doom, obviously. Distance is another well-optimized game. Same with "The Entropy Center". Forza Horizon 4 and 5 are very well optimized.
Yet people will still say FC5 is "well optimized" or stuff like that.
While a few studios no doubt demonstrate a degree of technical incompetence, I firmly believe the vast majority of these cases simply boil down to poor management. Tight deadlines, feature creep, loose vision, quick turnaround to avoid investor pullout, higher ups that are disconnected from the pipeline and complexities that arise out of short-sighted decisions, the list goes on.
With "industry" now being the operative word for the gaming industry, it's no surprise we see the same practices being utilised in larger projects to corner cut so long as people keep buying; and more so than any other sector of products I've seen, the consumer-base for games is far and away the most willing to put up with this capitalistic downward spiral.
@qma2275 Watching the video and hearing so much from you got me really interested in diving deep into these topics. I really want to learn more about optimization and graphics, what actually works, and everything that happens behind a game. So I'd love any recommendations you've got for a beginner like me!
thats why AAA makes billions while you work for minimum wage as an indie dev lmao
The pinnacle of modern game optimization has got to go to Doom Eternal; that game put Doom 2016 and all other modern games that rely on bloated file sizes to shaaaaaaaame.
I'd argue games developed by Nintendo themselves on the Switch take the cake, but I guess that depends on what requirements you're going for here.
What about Factorio?
@@what42pizza it's not a AAA game. If we were talking about indie games, then you'd see most of them being well optimised, like Factorio.
@@reglan_dev Okay, I have two big problems with that. First off, why do indie games not qualify for the pinnacle of modern game optimization? Also, do you know how much better optimized Factorio is than any other indie game?
@@what42pizza 1. The video and the commenter both mentioned AAA games. As we all know, the modern AAA scene is famous for its massive amount of unoptimised games.
Indie games, on the other hand, are known for how well done and optimised they are. So they could be called the pinnacle of game optimisation; however, the video wasn't talking about them.
2. Yes, I know how well optimised Factorio is. However, it's not the only very well optimised game out there either.
Most companies' idea of "optimization" is to just raise the system requirements.
I've always joked that if a Windows programmer today was asked to write an exact copy of Space Invaders, it would require a 2.4GHz i5 CPU, 8GB of RAM, a graphics card supporting Pixel Shader 3.0, DirectX 11, and a minimum of 20GB of hard drive space.
And you'll not be wrong 😂
"Our game is running just fine, maybe it's time to upgrade."
Todd Howard: To PC gamers with i9's and 4090ti
What pissed me off the most regarding Starfield and Todd is when he said they optimized the game pretty well but a few weeks later they released a patch that increased performance by quite a lot. His own team proved that he was talking BS.
Not that it needed much proving anyway because we have Cyberpunk where I could achieve literally double the FPS. And Night City is not even remotely comparable to Neon or Akila.
Okay you can't fill a room with cheese in Cyberpunk but who even really cares.
@@valentinvas6454 Fr, unpaid modders released optimization patches like that the same night it released 😂 they also added DLSS and other much-needed stuff.
@@valentinvas6454 Dude you're comparing apples to oranges. This is so ignorant on so many levels I don't even know what to say.
@@ged-4138 Care to elaborate?
@@ged-4138 You don't know what to say because you don't have anything to say. So next time just keep it to yourself.
I can't wait until the Silicon Limit is reached and we finally start actually optimising more rather than demanding more and more RAM, higher and higher clock speeds, and yet more storage.
Don't get me wrong, there are issues with over-optimisation or pre-optimisation from a maintenance standpoint, but *some* optimisation would be nice.
there is no issue with over-optimization; an optimized game will run perfectly for years. Skyrim is the peak example
@@fireloop69 I mean Skyrim is a BAD example, it's not a well optimised game, and it doesn't run perfectly.
But also my issue was that there *aren't* over optimisations, instead games and such just demand higher spec machines.
@@cptnraptor Understandable. Well, take RDR2 as an example then: one may say it's over-optimised, but it still looks better than most modern games while running smoother as well.
@@fireloop69 RDR2 is definitely the perfect example.
Tried to run Fortnite on a GTX 1050 recently; it looks like shit. Meanwhile, I can run CoD WW2 on medium-low with incredible performance! And don't get me started on other well-optimized games like BioShock, Black Mesa, etc.
At BlizzCon many years ago, Blizzard told a story: they created a new city for World of Warcraft called Dalaran, but they discovered that rendering this city was ridiculously slow.
On further investigation, they discovered the problem was a toy shop, which had a model of the city as one of the toys. To save time, the programmer had merely referenced the normal city model, scaled it down, and rendered it.
This led to a huge problem: if you looked at the toy shop, the game would resize the city and render it as a model. This of course included the toy shop and the model city, which caused it to resize the city again, effectively causing infinite recursion.
that's crazy. thanks for sharing.
Wouldn't the game just not work then? It has to render an infinite number of the cities.
Perhaps not because they get smaller?
But somehow this makes me think about how some big mobile games are optimised to run on comparatively weaker CPUs.
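To the question above: engines typically cut this off with a recursion depth cap, or by stopping once a copy becomes too small to see; the miniatures shrinking each level is exactly what makes that workable. A minimal C# sketch of the idea (all names hypothetical, not Blizzard's actual code):

using System.Collections.Generic;

class SceneNode
{
    public List<SceneNode> Children = new List<SceneNode>();
    public SceneNode MiniatureOf;  // e.g. the toy referencing the full city model

    // Without the depth/scale guard, a model that (indirectly) references
    // itself recurses forever. Capping depth, or stopping once the copy is
    // sub-pixel sized, is why such a scene can still render at all.
    public void Render(float scale, int depth = 8)
    {
        if (depth <= 0 || scale < 0.001f) return;  // too deep or too small to see
        Draw(scale);
        foreach (var child in Children)
            child.Render(scale, depth);
        MiniatureOf?.Render(scale * 0.01f, depth - 1);  // the scaled-down copy
    }

    void Draw(float scale) { /* issue the actual draw calls here */ }
}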
For those who learned OpenGL: have your objects extend the same base class, so that when you apply a shader you can do it for all objects at once through one overridden function (a common problem).
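In other words: give every drawable a common base class, then batch by shader and bind each shader once per group instead of once per object. A rough C# sketch of that pattern (hypothetical names, not tied to any real engine):

using System.Collections.Generic;

abstract class Renderable
{
    public int ShaderId;          // shader program this object uses
    public abstract void Draw();  // each subclass overrides this
}

static class Renderer
{
    // Bind each shader once, then draw every object that uses it,
    // instead of re-applying the shader per object.
    public static void DrawAll(IEnumerable<Renderable> objects)
    {
        var byShader = new Dictionary<int, List<Renderable>>();
        foreach (var obj in objects)
        {
            if (!byShader.TryGetValue(obj.ShaderId, out var list))
                byShader[obj.ShaderId] = list = new List<Renderable>();
            list.Add(obj);
        }
        foreach (var group in byShader)
        {
            BindShader(group.Key);  // one state change per shader
            foreach (var obj in group.Value)
                obj.Draw();
        }
    }

    static void BindShader(int id) { /* glUseProgram(id) in real OpenGL */ }
}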
This comment was approved by real American patriots!!!
Imagine if hardware was developed just like modern games:
“- Whoa! It POSTs!
- Ship it!
- But there’s issues and clock is unstable.
- We will fix it via driver updates!”
Shh! Don't give them ideas!
It's consumers that enabled this behavior. If people stopped buying broken games due to their FOMO, we'd be doing much better
If they could, they would. Fortunately, casual gamers aren't the only ones buying motherboards and computer parts, I guess? Professionals and big corporations do too, and they probably want a stable and well made part.
You described every new GPU launch since 2015! The RX 480 blowing the motherboard's PCIe slot by pulling more than 75W is one recent example that was fixed by a driver update :P
developers often don't even optimize... they wait to see if idiots will buy overpowered hardware first.
code is written by idiots... suboptimally... they try to get the product done first.
Nowadays games:
- 98GB for just a fighting game (yes Tekken, I am talking about you);
- DirectX 12 that wrecks the graphics for machines with actually strong chips but old graphics cards (like the GTX 970), making them run slowly or producing really annoying visual artifacts due to the low resolution applied (another example: Tekken 5 looks way prettier than Tekken 8 on low quality; also, SF6 doesn't need DirectX 12, it could have had DirectX 11 and Vulkan versions for performance too).
Seems like the industry only cares about path tracing in real-time rendering.
It's not DX12's fault, it's the fault of lazy developers used to DX11's abstractions.
Tbf, with Tekken 8 you can just delete the story files (30+ GB in size).
I think graphics card companies also have something to do with games not being optimized: it sells stronger GPUs when people need them to run a terribly optimized game.
Actually, GPU vendors tend to work with game studios to optimize their games to run on their hardware so their hardware looks better in reviews. Your comment reminds me of Nvidia Gameworks though. It was this black box Nvidia gave to devs to do certain effects, such as hair. But, it was pretty much Nvidia abusing brute force tessellation to make subpar effects. Nvidia had better tessellation than AMD, so games ran faster on Nvidia cards. AMD at the same time made open source ways to do the same thing, though looking better, and running faster... even on Nvidia hardware.
This entire bullshit of a theory is actually being taken seriously by some people. It blows my mind how unreasonably moronic people have become. Releasing unoptimized games really hurts the reputation of the devs, and this in turn could hurt their sales enough to shut them down. It's quite literally devs releasing the game in a broken state so they can get money quickly from all the preorders. At the end of the day, devs release broken games to make a quick buck, not to save some GPU vendor's ass.
@@xeridea Game sponsorship only tells part of the story. Almost all, if not all Ubisoft games are AMD sponsored, yet if we're talking CPUs, Intel CPUs handle the Assassins Creed and Far Cry games a lot better, an i9 9900k significantly outperforms a 5950X which came out a year later and usually trades blows with the 10900k. GPUs however, I believe Nvidia's usually do just as good a job, and in some cases even better, like in Far Cry 6 which has Ray Tracing.
Games that don't even look as good as the original Crysis (2007) run like crap on today's hardware... just look at Starfield. An empty planet and a few pebbles... no terrain, no vegetation and yet it can drop to 50 FPS on my high end PC. It's ridiculous.
@@xeridea I remember this.
Game devs have it easy with ample RAM and CPU and they still fuck it up. Try embedded development where resources are extremely limited and crashes can potentially cause catastrophic failures.
I would credit every single Japanese video game developer in this department. They release some of the most optimised games ever. Special shoutout to Metal Gear Solid V: I played that game with 2GB of RAM.
Oh yes I especially like how even cutting enemies into 100 plus pieces only lags slightly
There is a very important thing that is often forgotten about big O notation: it ignores constants, and those constants might be huge. And that matters a lot if your code isn't operating on a big dataset.
explain
@@MehmetMehmet-y8c Big-O is a quick-and-dirty notation meant to eyeball how an algorithm scales with the number of inputs. Imagine I have some hypothetical algorithm, and after careful analysis I determine that the number of operations required to complete is n^3 + 10n^2 + 100n. With big-O notation you're making the rather dirty observation that eventually n^3 dominates and everything else doesn't matter. But when is "eventually"? How can 100n be ignored for small n? At n = 5 the 100n term costs 500 operations while n^3 costs only 125. Big-O is flawed in that sense.
@@ef3675 i understand what you mean. good example
Look, this is why I say "More Hardware doesn't make a better game" The argument should be "Better Optimization makes a better game"
Nintendo is great with compression techniques, it's crazy. They truly are the best at making compressed video game file formats.
They had to; their Nintendo Switch is comparable to a 2023 midrange smartphone in terms of power.
That's intentional; Nintendo's law is to always make games that are fun regardless of graphics. That's why Wii games have held up so well compared to PS2 and PS3 games.
True. Unoptimized games are not because of the software engineers but because of the management.
Speaking from experience as a dev myself. Boy, if you could see our codebase. Lol. Tight deadlines, so we are cutting corners. I don't want to, but I need to xD
The big O notation describes the growth of a function, not the actual execution speed. For example, hash maps have a lookup time complexity of O(1), whereas linear arrays have a lookup time complexity of O(n). However, if the hashing function is slow, array lookups will most likely outperform map lookups for lower values of n: a constant-time lookup costing 10,000 units is slower than a linear scan costing n units whenever n < 10,000.
That is to say, it would be incorrect to state that the big O notation accurately represents the real-world performance of an algorithm outside of big data. The correct way to locate slow functions is by profiling.
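You can watch that happen with a toy C# benchmark: a linear scan over a tiny array versus dictionary lookups. Exact numbers depend on machine and runtime (which is the whole point: profile, don't guess), but the O(n) loop routinely wins at small n:

using System;
using System.Collections.Generic;
using System.Diagnostics;

class ConstantsMatter
{
    static void Main()
    {
        const int n = 16;               // tiny dataset
        const int iterations = 1_000_000;

        var keys = new string[n];
        var map = new Dictionary<string, int>();
        for (int i = 0; i < n; i++) { keys[i] = "key" + i; map[keys[i]] = i; }

        var sw = Stopwatch.StartNew();
        long sum = 0;
        for (int it = 0; it < iterations; it++)
            for (int i = 0; i < n; i++)      // O(n), but each step is a cheap compare
                if (keys[i] == "key9") { sum += i; break; }
        Console.WriteLine($"linear scan: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int it = 0; it < iterations; it++)
            sum += map["key9"];              // O(1), but hashing the key isn't free
        Console.WriteLine($"dictionary:  {sw.ElapsedMilliseconds} ms ({sum})");
    }
}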
I just about cried when I saw you put the subdivision modifier on the door handle. I think I felt genuine pain.
Simply quality content: straight to the point and engaging. Luckily, I was already subscribed to this channel.
This video was super cool. Why is this channel so underrated. Hope it blows up!
The demonstration about "what if light was slow" was also amazing
You deserve much more subs and views! Amazing video!!
100%, I was confused by how this can only have 300 views, great work
Wrong title, it should have been: The LOST Art of Game Optimization
It’s not lost though. AAA games aren’t the only games on the market and even then there are well optimized AAA games
@@crestofhonor2349 And for common people, AAA is the only one that matters.
@@crestofhonor2349
Name 1 optimized AAA game made after 2016
Asphalt 8/9 is crazy optimized if you’ve ever played it. Glorious graphics, solid 60 fps all the time even on pretty old computers and mobiles, very responsive.
Ksp2 devs: Let's make a ridiculously accurate rocket building game! And the map is the whole Solar System, just for good measure!
Also KSP2 devs: the recommended GPU for 1080p60 is an RTX 3080.
what
Cyberpunk runs on the Steam Deck... but people want to tell me it was impossible to run it on the PS4 with the add-on and 2.0 update? NO WAY... This game was a crime.
The Steam Deck is more powerful than the PS4. So yes... it runs on a Steam Deck and not a PS4.
@@jairit1606 And the Legion Go is more powerful than the Steam Deck. I played every quest and the add-on. Really freaking good game (and handheld PC).
To give some feedback:
I think the "Code" section was a bit too fast especially you explaining what we are actually trying to accomplish. I didn't get it before rewatching that part a few times
Dude, I heard the accent, went straight to the channel description, and saw what I was hoping for. A HUNGARIAN youtuber making English content? I'm going to browse this channel very thoroughly 😂 Keep at it, you're doing something great 💯
Meanwhile "Alan Wake II"
Triangles? What triangles? There are only 1 trillion triangles in the player's view though. Surely your entry-level 4090 can handle it, right?
Having an RTX 3060 be the MINIMUM GPU requirement for a game to run screams shoddy optimization
I jokingly made the reference that AAA studio leads just say "Our game isn't unoptimised, just buy a 4090" behind the scenes.
And then Todd Fuckwad said it in an actual interview. Fuck everything that guy stands for. And then he dares to act disappointed at the Game Awards every time it doesn't win. Dude thinks the sun shines out of his ass.
I give it a like just for the effort that you put in the video
In the 90s, due to the limited resources on computers, devs had no choice but to optimize the software (not just games) before release; the internet for updates was pretty much non-existent.
Remember how little RAM some programs used, like Adobe's PDF reader. Heck, look for PDF reader alternatives and you will see some use less RAM overall.
Heck, Windows did not show seconds on the taskbar clock because the CPU would need to render a new number every second and that would cause a performance hit. While this was more true for Win9x, it still shows that devs had to make sure the code they wrote was good from the start.
But these days, because we can have 64GB of RAM and 2TB SSDs, devs don't really bother optimizing software. Why waste time optimizing when you have so many resources?
wow, someone who knows what they're talking about, very refreshing
A YouTube video once said the reason DX12 is so "bad" is that the burden of optimisation is now on the game devs, who don't have much experience with drivers...
What about vulkan....?
@@BOT-fq9bu Also a low-level API; roughly speaking, you would have to do the same amount of work.
There are mid-2000s games that look and play amazingly on what we today consider low-end hardware.
The number one thing that works for optimization is frame generation. Anybody can now do it with any game and any graphics card with a program on Steam called Lossless Scaling. That's what happens when AMD makes FSR frame gen open source.
I think we should focus on software more than hardware now; hardware is at its peak, and it will never improve to the point of actually... well, improving. I have a 3050 laptop; it's pretty sluggish in certain games but runs others like a dream. Is there a huge graphical difference? No. So why does it do that?? Fucking optimization.

The 3050m GPU is definitely not the peak I was talking about, but it's pretty damn close. Just add some extra VRAM, a few more CUDA cores, a little faster bandwidth, and boom; you just made yourself something that can run literally everything (spoiler: that exists, it's called the 3060). And if the 3050 by itself can run everything, some games at lower settings, a lot more at higher, then I don't see a single reason we should "improve" on the 3090, much less the 4090.

If we're talking realistically, the 4090 can last people decades! What more do we need from a graphics card? Time travel? It's a graphics card, it runs games at 4K ultra with RT on; I seriously do NOT see anything we could improve. If game companies put care into optimization, then I wouldn't have to upgrade my laptop for another decade, and those with much more powerful GPUs, like a 3070 or a 4070, shouldn't have to upgrade for however long it takes for these GPUs to simply break.
Ray tracing can be more efficient than typical rasterization when dealing with a huge quantity of triangles: a rasterizer has to at least consider every triangle, while rays traversing an acceleration structure (a BVH) only touch a logarithmic slice of the scene.
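A toy cost model makes that concrete. Assuming one primary ray per pixel and a balanced BVH (so each ray does roughly log2(T) work), and ignoring every constant factor, a C# sketch of how the two approaches scale:

using System;

class RayVsRaster
{
    static void Main()
    {
        double pixels = 1920.0 * 1080.0;  // one primary ray per pixel
        foreach (double tris in new[] { 1e5, 1e7, 1e9 })
        {
            double raster = tris + pixels;               // visit every triangle, shade every pixel
            double traced = pixels * Math.Log(tris, 2);  // each ray walks ~log2(T) BVH levels
            Console.WriteLine($"{tris:E0} triangles: raster ~{raster:E0} ops, ray tracing ~{traced:E0} ops");
        }
    }
}

Under these (very rough) assumptions the rasterizer wins at 10^5 triangles but loses by more than an order of magnitude at 10^9, which is the crossover the comment above is pointing at.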
This video basically explained why I'll never learn programming. The numbers immediately made my head hurt.
New games almost entirely lack the proper optimization to make them playable and acceptable for players to enjoy. Those who still do it well are passionate artists...
This was surprisingly interesting… Thank you!
I'm not really from a rich place, nor do I have money as a student, so my PC is kinda meh... and I admire this a lot. As a programmer myself, I've really dived deep into this lately.
I just can't play a game slower than 40-50 fps... or with drops.
What a great work dude!
Mojang needs to see this
Java Minecraft has so many issues stemming from the fact that it is single-threaded. Bedrock Minecraft, however, does not have this issue.
9:26 "pretty good optimization"? no, that's insane optimization. what a simple solution as well. now, whenever i think a program can't be more optimized, i'll slap myself and remember this example and try to optimize it more.
This is underrated. Great video, sir!
Thought it was the subdivision that broke shit but 3 MILLION?!!? HOW DO YOU DO THAT BY ACCIDENT
WOW! That's a highly concentrated, quality video. So much meaningful content in so little time.
"Optimisation is easy just use the 20 new unrealengine5 meme effects that require a 4090 to run!"
This video is why modern videogames are the way they are lmao
He should definitely have talked about baked lighting in the video. It makes games run so much better, and it looks awesome if done right. Many beginner devs just use Lumen since it's now enabled by default in UE5, and those devs don't know any better than to use this extremely slow, performance-consuming technique.
Also a fun fact: I once saw a tutorial on "how to remove the 'lighting needs to be rebuilt' error", and the dude literally just told the viewers to select all the lights and make them dynamic. Talk about good performance...
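For anyone wondering what "baking" means in code terms: the expensive gather-over-all-lights loop moves out of the per-frame path into a one-time build step, leaving just a stored value to read at runtime. A toy C# sketch of the idea (hypothetical names, nothing engine-specific; real baking works on lightmap texels, not one value per surface):

using System.Collections.Generic;

class Light { public float X, Y, Z, Intensity; }

class Surface
{
    public float X, Y, Z;
    public float BakedLight;  // precomputed at build time

    // Expensive: sums every light's contribution. Dynamic lighting runs
    // this every frame for every surface; baking runs it exactly once.
    public float GatherLight(List<Light> lights)
    {
        float total = 0f;
        foreach (var l in lights)
        {
            float dx = l.X - X, dy = l.Y - Y, dz = l.Z - Z;
            total += l.Intensity / (1f + dx * dx + dy * dy + dz * dz);  // distance falloff
        }
        return total;
    }
}

static class Lightmap
{
    // Build step: pay the cost once, store the result.
    public static void Bake(List<Surface> surfaces, List<Light> lights)
    {
        foreach (var s in surfaces)
            s.BakedLight = s.GatherLight(lights);
    }

    // Frame step: a plain read instead of a loop over all lights.
    public static float Shade(Surface s) => s.BakedLight;
}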
@@paper_shreds I see some people calling baked lighting "faking it", as if it's somehow inferior to real-time lighting even when the results are better. I can't wait until they realize that all 3D graphics are built on trickery.
@@SomeRandomPiggo I feel like those people saw some presentation about open-world games at some point where dynamic light sources are the only option, and that made them think that's the best way.
@@SomeRandomPiggo
I believe that baked lighting gets a bad rep because nearly every modern game with baked lighting looks terrible (see: R6 Siege).
@@TheJohn_Highway R6 used to actually look good though. They've reduced the graphics over the years and over-sharpened it for E-sports. That's a bad example anyways, Half-Life Alyx and CS2 both use baked lighting and look photoreal sometimes.
Very informative video. Some parts of the video are really hard to hear due to the heavy accent; the autogenerated subs can only do so much.
People always say "stop optimizing games, it's stupid", but well, they need to be optimized.
I feel like Moore's Law will always continue, because 15 years ago everybody was sure it was going to grind to a halt in 2020 or so due to physical limits, and clearly that has not been the case. And there's the whole quantum thing that has been brewing on the horizon for a decade or so now; while not ready for "prime time" yet either, it's clear that it does work and will let us reach new heights a few more years down the line that were long thought to be impossible.
Otherwise a great video ^^
Quantum computing cannot compute normal logic faster than a typical computer
Quantum computing is highly specific, it will not make video games run faster.
Quantum computing is not something that is on the horizon, at least not for the general public. They only function at almost absolute zero temperature, and a cooling system capable of that will definitely not be available to us anytime soon, and it would take an even longer time until it becomes affordable to build a PC with that.
Dude, imagine being the most sold game of all time and having optimization so bad that a huge chunk of your community is dedicated to fixing it.
Minecraft is truly not a heavy game; if only the devs addressed it, it would be playable.
They have been working on it. I think they also focus on good practice more than performance.
Thanks, this was a pretty cool rundown.
C. You are using Unity.
Devs relying on pure hardware power instead of skill at squeezing as much as possible out of that hardware is the bane of most modern games (especially bigger ones).
For example, remember how Crash Bandicoot did more than even Sony believed was possible?
Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access.
Old ARK takes over 430GB on disk, looks like crap, plays even worse and is plagued with more bugs than some actual Early Access games I've played (combined). FitGirl's repack is about 43GB.
Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because their game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general even though their hardware is at most medium-grade compared to gaming PCs).
Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some shit and think themselves great developers.
Hey! What's the tool you are using at 3:50? This seems way better than having to boot up Photoshop every time I want to use smoothness textures (damn Unity smoothness on Albedo Alpha :v) Great video!
This feels like it should be titled "The Lost Art of Game Optimization"
Quality content right here! Thank you for sharing this!
Thank you very much, now I know why I am getting 3 fps👌🏻
We can't even run games at 4K 165fps without upscaling unless it's a 4080 or 4090. It sucks that GPU prices keep getting higher when we're not even getting good performance at native resolution
4K is a lot of pixels to cover; one needs those high-priced GPUs to render it.
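To put numbers on it: 3840 × 2160 is 8,294,400 pixels, exactly four times 1080p's 2,073,600; at 165Hz that's roughly 1.4 billion pixels to shade every second, before overdraw even enters the picture.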
Research floating point operations.
buy a 1920x1080 monitor.
On a low budget, developers ship the whole game engine with the game they make. Making Pong now requires exporting a custom Unreal Engine 3 build along with it.
I needed to empty out some space on my drive. From what I've gathered, indie devs do so much better at optimization. Just look at the recent poor optimization of Cities: Skylines 2. Most of the time the devs argue that home computers are underperforming to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones that are, like Valheim: its optimization is alright. Now compare it to NMS, a AAA game. Or look at Choo-Choo Charles, an indie game whose optimization is not that great. Granted, it was made relatively quickly by just one guy, but still.
What's going on though, right? What happened to optimizing games properly, bro? (somebody fill me in)
1. Overdependence on high-end cards, processors, systems and consoles?
2. Projects rushed because of money and time pressure from financial backers?
3. Half-assed development and early releases for fast money intake?
Most games exited the FUN area and went into "BUSINESS MODE ONLY". This is why I have absolute love for indie game developers who actually make really well-optimized games with great content and interact with their community.
I did not understand a word from 5:46 to 10:17. 10/10
get your ears checked then lol
Try understanding some bitches, we can understand him fine
The games you show at the beginning are doing all the things you mention in the rest of the video, so that's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. Games may be optimized to run their workload as well as possible (given the time constraints and skills of the whole team) but still won't run on a potato PC, because that's not part of their design goals. If you include a path tracer in your engine, you'd better be "optimized". But if you have to run on Switch, it makes better sense to drop the path tracer and use a simple renderer with a few light sources.
Unless you have access to the source code and assets of those games, your claim is just speculation.
@@swh77 Because in his video he is just talking about basic stuff. Not what goes into optimizing a raytracer, shader passes and so on.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach at GDC one year; you can just go watch their presentation.
Someone needs to send this to the team working on ready or not 😂
yes!! 2 years ago the performance of that game started going downhill: from always over 100fps on ultra to almost never going above 80fps, with too many drops to 60fps or below, lots of stuttering and increased input lag.
Removing the old maps and game modes in the 1.0 update and worse performance are the worst issues of Ready or Not to me, it's just so unfortunate.
Give this man a world of warships sponsor
Nowadays game studios just throw DLSS at everything and call it a day.
6:07 i got curious what game this is and wrote some code to analyze all existing games -- "a game that isn't dying or anything, or at least that's what the devs like to think" -- is it World of Warships?
Apparently Microsoft just say “fuck this” and shit out a crudely written port for €70
Dunkirk is a small city in France, on the English Channel, in which the British troops were trapped during the invasion of France. There is also a movie about this with the same name. @worldsinmotion
Ur gonna blow up
RDR2 looks better than most PS5 games right now. It amazes me how far the devs went to perfect the game's optimization and fit it on an 8th-gen console.
That's because its environment is easier to render than that of plenty of modern games set in cities. It's just not super dense, so RDR2 really leans into its lighting; Rockstar has tons of presentations on how they achieved those visuals. That doesn't mean it surpasses modern games in all aspects, though, as there are a select few things many 9th-gen games do that it doesn't. The problem stems from the fact that we are still kind of stuck in cross-gen.
@@crestofhonor2349 This year almost all games are next-gen only, so we'll definitely see drastic improvements.
No way wreckfest made it on a thumbnail
Now that raster techniques are so good, RT feels like a brute-forcing waste of resources.
And console games are just PC games with low settings… which is not how it should be
Moore's Law is only dying if you're talking about transistor density, not compute power density, which is what we should really be measuring and which isn't slowing down any time soon.
I know "thing vs thing, Japan" is cringe as hell but I can't deny that Japanese developers really know how to optimize their games very well. Allowing my 2060 to run on highest possible graphics, while it struggles with basically any western game to have stable 60fps on low-mid settings is a feat worth acknowledging.
Yeah, the internet needs to stop glazing over Japan.
This is not just with games; it happens with cars as well. Western cars are horribly made and very unreliable, meanwhile Japanese cars are built like tanks.
Except Gamefreak 😔
Well, it's probably not a problem with their developers as much as it is with the insane time constraints they're forced to work with.
In the future, games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs whilst maintaining realistic graphics
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that; Minecraft is a pretty poorly optimised game, community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@@MrMoon-hy6pn Bedrock was pretty well optimized. You can get 60 fps at a render distance of 84 chunks on the right specs.
@jeff_7274 bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without rtx LOL
But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they port the Java edition to C++ as well and make the two editions equal.
@@jeff_7274 Compared to vanilla Java, perhaps. With my setup I can get ~60 fps in a jungle at 64 chunks on Bedrock, versus 30 in the same seed at 32 chunks on Java, so you're right there. But with the Sodium mod and a few others I can get around 100-300 fps, so there is still a long way to go.
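Worth noting when comparing those numbers: the loaded area grows quadratically with render distance, roughly (2r+1)² chunk columns for distance r, so 64 chunks means 129² ≈ 16,600 columns versus 65² ≈ 4,200 at 32 chunks, about four times the terrain to generate and draw.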
Topic is good but I can't understand a single word; I'm depending on the subtitles only 😢
really good video, but you need to find some way to reduce the peaks whenever you pronounce an S, as it hurt my ears a bit...
A sacred art lost to time..
Hi! Good video, but you used the wrong Godot logo at about 2:00. That's all
You mean those extra teeth?
@@llllllXllllll yes, that logo is very old and not used anymore
Ah yes, game optimization, the thing every developer ever forgot about
When did neural networks and machine learning become "AI"? It's such an overused buzzword.
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, what with the overreliance on DLSS and FSR, because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response to that is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
All of this in 10 minutes, insane video
Very awesome video. Such an accessible introduction to optimization
9:22
You aren't using the full performance benefit of dictionaries. By looking the value up by key instead of iterating over all the values, you get an execution time of 0.25 ms:

using System.Collections.Generic;
using System.IO;

// Maps a Morse string to every word that encodes to it.
Dictionary<string, List<string>> wordsDictionary = new Dictionary<string, List<string>>();

// Letter -> Morse code table, set up the same way as in the video.
Dictionary<char, string> lettersDictionary;

void LoadWords()
{
    string[] lines = File.ReadAllLines("words_alpha.txt");
    foreach (string line in lines)
    {
        string morse = Translate(line, "");
        if (!wordsDictionary.ContainsKey(morse))
        {
            wordsDictionary.Add(morse, new List<string>());
        }
        wordsDictionary[morse].Add(line);
    }
}

string Translate(string input, string divider = " ")
{
    string result = "";
    foreach (char c in input)
    {
        lettersDictionary.TryGetValue(c, out string morse);
        result += morse + divider;
    }
    return result;
}

List<string> TranslateInvalidCode(string input)
{
    input = input.Replace(" ", "");
    return wordsDictionary[input];
}
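The speedup comes from hashing: wordsDictionary[input] is an average O(1) lookup, instead of re-checking every word in words_alpha.txt (hundreds of thousands of entries) on each query.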
WildCard Studio needs to see this video
ARK Ascended has the worst optimization I've ever seen in a video game
ARK has always been poorly optimized, ever since the game initially came out on UE4. There is nothing new from those devs other than more proof that they don't care about performance.
Best example is the original Xbox One port and original Nintendo Switch port
totally agree :) @@crestofhonor2349
@@crestofhonor2349 Apparently the new ARK Switch port actually runs pretty well and looks better than the original crappy port, which I find incredible. I'd be very interested to see what they did to get it to run on there. Everywhere else, though, it still seems just as bad as it was the day it came out.
You know it IS quality content when you hear that accent
I work as a programmer for a small indie studio and my job has really made me wonder how AAA companies make videogames.
You often don't see this a lot with indie games, because if your studio has 1-3 programmers and someone doesn't know what they're doing, you just don't have a videogame. But AAA companies seem to always struggle with the absolute silliest things, like a single weapon in Destiny 2 breaking 36 times, or the Freddy Fazbear model in Security Breach having straight-up movie-level polygon counts, or games like CoD or RDR2 shipping with absolutely zero compression on anything, or GTA Online's infamous multi-minute load times tracing back to an accidental O(n²) loop (sketch below). How do these things happen???
EVIDENTLY they have talented engineers on the teams for all of these games, but all of these mistakes are so stupid, so first-semester-of-college basic, that I wonder who these people are actually hiring.
Same goes for security too. It's definitely a lot of extra planning, but not that much work, to make your remote calls secure and your client-server communication reliable and safe (P2P is harder, though). But these games never do it, end up with massive cheating problems, and then have to shell out millions for rootkit garbage ""anticheat"". It's baffling.
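The GTA Online one is a good case study: t0st's 2021 community write-up traced the load times to parsing a ~10MB JSON of shop items with sscanf, which re-ran strlen over the whole buffer on every call, plus de-duplicating the ~63k parsed entries with a linear scan. A minimal C# sketch of that second mistake (the numbers and names here are mine, purely for illustration):

using System;
using System.Collections.Generic;
using System.Diagnostics;

class DedupDemo
{
    static void Main()
    {
        const int n = 63_000; // roughly the entry count in GTA Online's shop-item JSON
        var sw = Stopwatch.StartNew();

        // The accidental version: checking each new item against a flat list
        // is a linear scan per insert, so the whole pass is O(n^2).
        var ids = new List<int>();
        for (int i = 0; i < n; i++)
            if (!ids.Contains(i))
                ids.Add(i);
        Console.WriteLine($"List de-dup:    {sw.ElapsedMilliseconds} ms");

        // The boring fix: a hash set makes each membership check O(1) on average,
        // so the whole pass is O(n).
        sw.Restart();
        var seen = new HashSet<int>();
        for (int i = 0; i < n; i++)
            seen.Add(i);
        Console.WriteLine($"HashSet de-dup: {sw.ElapsedMilliseconds} ms");
    }
}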
I think the problem with massive games is that their jumble of code ends up connecting a lot of seemingly separate things. Something we see as a small glitch could be an unimaginable tangle of code that is just waiting to break the entire game
I will say something dumb, but I think it's because they don't hire good programmers who play games; instead they hire activists that really hate gamers, or just lazy developers
@@theemperor4814 I don't think anyone who hates gamers or games would work in the industry, it's an absolutely insane amount of work not many people truly understand; but I wouldn't put it beyond an unqualified or untrained team being in charge of these things.
Having a novice audio or graphics designer be responsible for importing assets and forgetting to compress files, or having more junior developers write less critical code that is never cross-examined, could easily cause this issue.
@@zeelyweely1590 The thing is, they love games but hate gamers, and they want what you'd call diversity. Look at X (before Elon) having an excess of workers; that's the majority of the gaming industry now
@@zeelyweely1590 just look at Concord
I have lived long enough to see the days where good graphics don't always mean a good video game. Lethal Company for the win!
Now you need an RTX 3080 just to get over 60 fps in modern games that have worse graphics than RDR2, and not even at 4K, just 1080p...
I think as games become bigger, it's also becoming harder to polish them before release. And I hate it when a game is released unoptimized and then they tell you to rely on DLSS to bring the frame rate up. That's a clown's move.
A good example is GTA 4: somehow GTA 4 performs worse than GTA 5
nice video
Nanite does a lot of work offline, before the game runs, so I wouldn't really call it real-time; it's more like an automated, really fancy LOD system
Really good video, although there's a problem: you're mainly focusing on optimisation in Unity, which is a fundamentally terrible engine that nobody should use anymore, especially given how mismanaged it is and the runtime fee. It would be great if you talked more about optimisation in Unreal, which I feel you kinda glossed over by only mentioning Nanite and Lumen; there's much more to it than that, such as light baking, blueprint nativisation, anti-aliasing methods, shadow optimisation, etc. Godot would have also been great to mention since it's open source and all that
@1:50 C. You are using Unity
The only thing I'm pretty sure of is that no one cares about the low end. If the release of CS2 has told me anything, it's that we are shit, and shit is disposable.