The games you show at the beginning are doing all the things you mention in the rest of the video; that's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. A game may be optimized to run its workload as well as possible (given the time constraints and skills of the whole team) and still not run on a potato PC, because that was never part of its design goals. If you include a path tracer in your engine, you'd better be "optimized". But if you have to run on Switch, it makes more sense to drop the path tracer and use a simple renderer with a few light sources.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions about all of this in the public domain, a significant portion of which comes from developers working on AAA games. You don't need access to the source code of game X's screen-space reflections when the studio that made game X literally held a GDC presentation discussing their approach one year; you can just go watch the presentation.
I'm a bit of a conspiracy guy, and I'd say the big hardware players have some kind of deal with the AAA companies not to optimize their games, given the over-reliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response to that is "it's optimized, go and upgrade your PC lol". C'mon, it's really sus.
If it's so hard to estimate and so hard to do, why do game companies keep hyping up the gaming community with announcements and release dates? Get the job done and then announce!
Now that raster techniques are so good, RT feels like a brute-forcing waste of resources. And console games are just PC games on low settings... which is not what they should be.
Ark has always been poorly optimized, ever since the game initially came out in UE4. There is nothing new from those devs other than proving that they don't care about performance. The best examples are the original Xbox One port and the original Nintendo Switch port.
@@crestofhonor2349 Apparently the new Ark Switch port actually runs pretty well and looks better than the original crappy port, which I find incredible. I'd be very interested to see what they did to get it to run on there. Everywhere else, though, it still seems just as bad as it was the day it came out.
I know "thing vs thing, Japan" is cringe as hell but I can't deny that Japanese developers really know how to optimize their games very well. Allowing my 2060 to run on highest possible graphics, while it struggles with basically any western game to have stable 60fps on low-mid settings is a feat worth acknowledging.
This is not just games; it happens with cars as well. Western cars are horribly made and very unreliable, meanwhile Japanese cars are built like tanks.
Except Game Freak 😔 Well, it's probably not a problem with their developers so much as the insane time constraints they're forced to work with.
6:07 I got curious what game this is and wrote some code to analyze all existing games. "A game that isn't dying or anything, or at least that's what the devs like to think": is it World of Warships?
Really good video, although there's a problem: you mainly focus on optimisation in Unity, which is a fundamentally terrible engine that nobody should use anymore, especially given how mismanaged it is and the runtime fee. It would be great if you talked more about optimisation in Unreal, which I feel you glossed over by only mentioning Nanite and Lumen; there's much more to it than that, such as light baking, Blueprint nativisation, anti-aliasing methods, shadow optimisation, etc. Godot would also have been great to mention since it's open source.
The pinnacle of modern game optimization has to go to Doom Eternal. That game put Doom 2016, and every other modern game that relies on bloated file sizes, to shaaaaaaaame
I'd argue games developed by Nintendo themselves on the Switch take the cake, but I guess that depends on what requirements you're going for here.
What about Factorio?
@@what42pizza It's not a AAA game. If we were talking about indie games, then you'd see most of them being well optimised, like Factorio.
@@reglan_dev Okay, I have two big problems with that. First off, why do indie games not qualify for the pinnacle of modern game optimization? Also, do you know how much better optimized Factorio is than any other indie game?
@@what42pizza 1. The video and the commenter both mentioned AAA games. As we all know, the modern AAA scene is famous for its massive number of unoptimised games.
Indie games, on the other hand, are known for how well made and well optimised they are. In that sense they are the pinnacle of game optimisation; the video just wasn't talking about them.
2. Yes, I know how well optimised Factorio is. However, it's not the only very well optimised game out there.
Most companies' idea of "optimization" is to just raise the system requirements.
I've always joked that if a Windows programmer today were asked to write an exact copy of Space Invaders, it would require a 2.4GHz i5 CPU, 8GB of RAM, a graphics card supporting Pixel Shader 3.0, DirectX 11, and a minimum of 20GB of hard drive space.
And you wouldn't be wrong 😂
- As a game dev obsessed with optimization (it's kinda my only responsibility at my workplace), I'm indescribably upset with the current technical state of modern games. Back in the day, with limited hardware and a harsh development reality, people had to do their best just to make the game playable. I've been excited about that ever since I saw it working as a kid. Nowadays games are slapped with the most generalized optimization techniques, not even polished enough to fit the game well, lol.
- Today, with so many high-level languages with large overhead, game engines aimed at designers who freak out at the mere thought of using their keyboard instead of their mouse, the industry sinking to its lowest and becoming yet another business sector, publishers' ridiculous tendency to churn out generic, over-casualized games in a foolish bid to win new users, and overall shoddy education, games end up carrying so much overhead and technical debt that you could easily fit two or even three old games in there, both memory- and performance-wise.
- I was drawn into the industry for exactly this, and I've remained committed to optimization all along. I treat it like an art, which it really is. It's a shame to see the industry bloated with people so far from understanding how games work that Betelgeuse seems within arm's reach, but this is the reality. Less competition as well, which oh man do I abuse.
- Thanks for making the effort to read, whoever you are. Much appreciated :D
What's even more infuriating is seeing gamers and even some benchmark channels say things like "Oh, this 2023 game runs at 60fps on medium settings on a $900 PC? Then it's optimized!" when that game may have 2012 graphics at absolute best and no physics or interactive environments to speak of.
There's this braindead notion that a game's optimization should be judged by when it was released rather than by what it looks like and how much physics or environmental interactivity it has. As a result, devs use modern hardware not to push graphics and technology forward, but to get lazier.
Imagine telling devs 12 years ago that in the future video games would look 15% better but run quite literally 500% worse.
People don't often consider a game's optimization in their reviews all that much, or praise it all that much.
Take NMS: procedurally generated, yet it runs much better than most titles. Valheim is less optimized, but it's an indie title and has made large strides in improving. RDR2 runs pretty well too, same with GTA V. Doom, obviously. Distance is another well-optimized game, same with "The Entropy Center". Forza Horizon 4 and 5 are very well optimized.
But people will just say FC5 is "well optimized" or stuff like that.
While a few studios no doubt demonstrate a degree of technical incompetence, I firmly believe the vast majority of these cases simply boil down to poor management. Tight deadlines, feature creep, loose vision, quick turnarounds to avoid investor pullout, higher-ups disconnected from the pipeline, complexities arising from short-sighted decisions; the list goes on.
With "industry" now being the operative word in "gaming industry", it's no surprise we see the same corner-cutting practices in larger projects so long as people keep buying; and more than in any other product sector I've seen, the consumer base for games is far and away the most willing to put up with this capitalistic downward spiral.
@qma2275 After watching the video and hearing so much from you, I got really interested in diving deep into these topics. I really want to learn more about optimization, the graphics techniques that actually work, and everything that happens behind a game. I'd love to try any recommendations you've got for a beginner like me!
that's why AAA makes billions while you work for minimum wage as an indie dev lmao
At BlizzCon many years ago, Blizzard told a story: they had created a new city for World of Warcraft called Dalaran, but they discovered that rendering this city was ridiculously slow.
On further investigation they found the problem was a toy shop, which had a model of the city as one of its toys. To save time, the programmer had simply referenced the normal city model, scaled it down, and rendered it.
This led to a huge problem: if you looked at the toy shop, the game would resize the city and render it as a model. That of course included the toy shop and its model city, which caused the city to be resized again, effectively causing infinite recursion.
that's crazy. thanks for sharing.
Wouldn't the game just not work then? It has to render an infinite number of cities.
Perhaps not, because they get smaller?
But somehow this makes me think about how some big mobile games are optimised to run on comparatively weaker CPUs.
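That's likely the answer: recursive rendering like this is normally cut off once the nested model would be smaller than a pixel, or at a fixed depth cap, so it terminates but still re-renders the whole city several times over. A hypothetical C++ sketch of the idea (all names and numbers made up):

```cpp
#include <cstdio>

// Hypothetical sketch: recursive scene rendering with a size cutoff.
// Each nested "toy city" is drawn at a fraction of its parent's scale;
// once the model would be smaller than roughly a pixel, recursion stops.
void renderCity(float screenScale, int depth) {
    if (screenScale < 0.001f || depth > 8) return; // cutoff prevents infinite recursion
    std::printf("drawing entire city at scale %.6f (depth %d)\n", screenScale, depth);
    // ... draw every building, prop, NPC, etc. ...
    renderCity(screenScale * 0.01f, depth + 1); // the toy-shop miniature references the whole city
}

int main() {
    renderCity(1.0f, 0); // terminates after a couple of levels,
                         // but each level re-renders everything, hence the slowness
}
```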
"Our game is running just fine, maybe it's time to upgrade."
Todd Howard, to PC gamers with i9s and 4090s
What pissed me off the most regarding Starfield and Todd is when he said they had optimized the game pretty well, but a few weeks later they released a patch that increased performance by quite a lot. His own team proved he was talking BS.
Not that it needed much proving anyway, because we have Cyberpunk, where I could achieve literally double the FPS. And Night City is not even remotely comparable to Neon or Akila.
Okay, you can't fill a room with cheese in Cyberpunk, but who really cares.
@@valentinvas6454 For real, unpaid modders released optimization patches like that the same night it released 😂 They also added DLSS and other much-needed stuff.
@@valentinvas6454 Dude you're comparing apples to oranges. This is so ignorant on so many levels I don't even know what to say.
@@ged-4138 Care to elaborate?
@@ged-4138 You don't know what to say because you don't have anything to say. So next time just keep it to yourself.
Imagine if hardware was developed just like modern games:
“- Whoa! It POSTs!
- Ship it!
- But there are issues and the clock is unstable.
- We will fix it via driver updates!”
Shh! Don't give them ideas!
It's consumers that enabled this behavior. If people stopped buying broken games due to their FOMO, we'd be doing much better
If they could, they would. Fortunately, casual gamers aren't the only ones buying motherboards and computer parts, I guess? Professionals and big corporations do too, and they probably want a stable and well made part.
you described every new GPU launch since 2015! The RX 480 overloading the motherboard PCIe slot by pulling more than 75W is one recent example that was fixed by a driver update :P
developers often don't even optimize... they wait to see if idiots will buy overpowered hardware first.
code is written by idiots... sub-optimally... they try to get the product done first.
I can't wait until the Silicon Limit is reached and we finally start actually optimising more rather than demanding more and more RAM, higher and higher clock speeds, and yet more storage.
Don't get me wrong, there are issues with over-optimisation or pre-optimisation from a maintenance standpoint, but *some* optimisation would be nice.
there is no issue with over-optimization; an optimized game will run perfectly for years. Skyrim is the peak example
@@fireloop69 I mean Skyrim is a BAD example, it's not a well optimised game, and it doesn't run perfectly.
But also my issue was that there *aren't* over optimisations, instead games and such just demand higher spec machines.
@@cptnraptor Understandable. Well, take RDR2 as an example then; one may say it's over-optimised, but it still looks better than most modern games while running smoother as well.
@@fireloop69 RDR2 is definitely the perfect example
Tried to run Fortnite on a GTX 1050 recently; it looks like shit. Meanwhile I can run COD WW2 on medium-low with incredible performance! And don't get me started on other well-optimized games like BioShock, Black Mesa, etc.
For those learning OpenGL: derive objects that share a shader from a common base class, so that when you apply the shader you bind it once and draw all of those objects together through an overridden draw function, instead of re-applying it per object (a common problem).
This comment was approved by real American patriots!!!
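A minimal C++/OpenGL sketch of that idea (assuming a GL context and a loader like GLAD are already set up; the struct and function names here are made up): sort drawables by shader program and bind each program once per frame, rather than calling glUseProgram for every object.

```cpp
// Sketch only: assumes an OpenGL context and loader (e.g. GLAD) already exist.
#include <glad/glad.h>
#include <algorithm>
#include <vector>

struct Drawable {
    GLuint shader;      // program object
    GLuint vao;         // vertex array object
    GLsizei indexCount; // number of indices to draw
};

// Sort by shader, then bind each program once and draw everything that uses it.
void drawScene(std::vector<Drawable>& scene) {
    std::sort(scene.begin(), scene.end(),
              [](const Drawable& a, const Drawable& b) { return a.shader < b.shader; });
    GLuint bound = 0;
    for (const Drawable& d : scene) {
        if (d.shader != bound) {   // switch programs only when we must
            glUseProgram(d.shader);
            bound = d.shader;
        }
        glBindVertexArray(d.vao);
        glDrawElements(GL_TRIANGLES, d.indexCount, GL_UNSIGNED_INT, nullptr);
    }
}
```

Program switches are among the more expensive GL state changes, so batching by shader (and then by material/texture within each shader) is a standard first optimization.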
I think graphics card companies also have something to do with games not being optimized: it lets them sell stronger GPUs so people can run a terribly optimized game.
Actually, GPU vendors tend to work with game studios to optimize games for their hardware, so the hardware looks better in reviews. Your comment reminds me of Nvidia GameWorks, though. It was a black box Nvidia gave to devs for certain effects, such as hair, but it was pretty much Nvidia abusing brute-force tessellation to produce subpar effects. Nvidia had better tessellation than AMD, so those games ran faster on Nvidia cards. AMD meanwhile made open-source ways to do the same things that looked better and ran faster... even on Nvidia hardware.
This entire bullshit of a theory is actually being taken seriously by some people. It blows my mind how unreasonably moronic people have become. Releasing unoptimized games really hurts a dev's reputation, and that in turn can hurt their sales badly enough to shut them down. It's quite literally devs releasing the game in a broken state so they can get money quickly from all the preorders. At the end of the day, devs release broken games to make a quick buck, not to save some GPU vendor's ass.
@@xeridea Game sponsorship only tells part of the story. Almost all, if not all, Ubisoft games are AMD-sponsored, yet if we're talking CPUs, Intel CPUs handle the Assassin's Creed and Far Cry games a lot better: an i9 9900K significantly outperforms a 5950X, which came out a year later and usually trades blows with the 10900K. On GPUs, however, I believe Nvidia's usually do just as good a job, and in some cases better, like in Far Cry 6, which has ray tracing.
Games that don't even look as good as the original Crysis (2007) run like crap on today's hardware... just look at Starfield. An empty planet and a few pebbles... no terrain, no vegetation and yet it can drop to 50 FPS on my high end PC. It's ridiculous.
@@xerideaI remember this.
There is a very important thing often forgotten about big-O notation: it ignores constants, and those constants might be huge. That matters a lot when your code isn't running on a big dataset.
explain
@@MehmetMehmet-y8c Big-O is a quick-and-dirty notation meant to eyeball how an algorithm scales with the number of inputs. Imagine I have some hypothetical algorithm, and after careful analysis I determine that the number of operations required to complete is n^3 + 10n^2 + 100n. With big-O notation you're making the rather dirty observation that eventually n^3 dominates and nothing else matters. But when is "eventually"? How can 100n be ignored for small n? Big-O is flawed in that sense.
@@ef3675 i understand what you mean. good example
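To make that concrete, here's a tiny C++ loop (values purely illustrative) that evaluates each term of that hypothetical n^3 + 10n^2 + 100n cost. At n=1 the cubic term is under 1% of the work, at n=10 all three terms are exactly equal, and only well past that does the cubic genuinely dominate:

```cpp
#include <cstdio>

// Evaluate the example cost function n^3 + 10n^2 + 100n term by term,
// to see where the "dominant" cubic term actually takes over.
int main() {
    for (long long n : {1LL, 5LL, 10LL, 50LL, 100LL}) {
        long long cubic = n * n * n;
        long long quad  = 10 * n * n;
        long long lin   = 100 * n;
        std::printf("n=%-4lld n^3=%-10lld 10n^2=%-8lld 100n=%-6lld cubic share=%.0f%%\n",
                    n, cubic, quad, lin,
                    100.0 * cubic / (cubic + quad + lin));
    }
}
```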
I just about cried when I saw you put the subdivision modifier on the door handle. I think I felt genuine pain.
I would credit every single Japanese video game developer in this department. They release some of the most optimised games ever. Special shoutout to Metal Gear Solid V; I played that game on 2 GB of RAM.
Oh yes I especially like how even cutting enemies into 100 plus pieces only lags slightly
Nintendo is great with compression techniques, it's crazy. They truly are the best at making compressed video game file formats.
They had to; their Nintendo Switch is comparable to a 2023 midrange smartphone in terms of power.
that's intentional; Nintendo's law is to always make games that are fun regardless of graphics. That's why Wii games have held up so well compared to PS2/PS3-era games.
Game devs have it easy with ample RAM and CPU and they still fuck it up. Try embedded development where resources are extremely limited and crashes can potentially cause catastrophic failures.
Games nowadays:
- 98 GB for just a fighting game (yes, Tekken, I am talking about you);
- DirectX 12 that wrecks the graphics on older graphics cards paired with actually strong chips (like the GTX 970), making them run slowly or produce really annoying visual artifacts due to the low resolution applied (another example: Tekken 5 looks way prettier than Tekken 8 on low quality; also, SF6 doesn't need DirectX 12, it could have offered DirectX 11 and Vulkan as performance versions too).
Seems like the industry only cares about real-time path tracing.
it's not DX12's fault, it's the fault of lazy developers used to DX11's abstractions
Tbf, with Tekken 8 you can just delete the story files (30+ GB).
Look, this is why I say "more hardware doesn't make a better game". The argument should be "better optimization makes a better game".
This video was super cool. Why is this channel so underrated. Hope it blows up!
The demonstration about "what if light was slow" was also amazing
True. Unoptimized games are not on the software engineers but on the management.
Speaking from experience as a dev myself. Boy, if you could see our codebase, lol. Tight deadlines, so we're cutting corners. I don't want to, but I need to xD
Big-O notation describes the growth of a function, not actual execution speed. For example, hash maps have a lookup time complexity of O(1), whereas linear arrays have O(n). However, if the hashing function is slow, array lookups will most likely outperform map lookups for small n: a constant cost of 10,000 operations is still O(1), yet it loses to an O(n) algorithm costing n operations whenever n < 10,000.
That is to say, it would be incorrect to state that big-O notation accurately represents the real-world performance of an algorithm outside of big data. The correct way to locate slow functions is profiling.
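A small C++ sketch of exactly that point (sizes and counts are arbitrary, and the results will vary by machine and compiler, which is itself the argument for profiling): for a small table, a linear scan with a tiny constant factor can beat a hash lookup with a big one.

```cpp
#include <chrono>
#include <cstdio>
#include <unordered_map>
#include <utility>
#include <vector>

// For small n, an O(n) linear scan can beat an O(1) hash lookup,
// because hashing carries a large constant factor. Profile, don't guess.
int main() {
    const int n = 64;
    const int queries = 1000000;
    std::vector<std::pair<int, int>> vec;
    std::unordered_map<int, int> map;
    for (int i = 0; i < n; ++i) {
        vec.push_back({i, i * 2});
        map[i] = i * 2;
    }

    // Time a million lookups through an arbitrary lookup function.
    auto bench = [&](const char* name, auto&& lookup) {
        auto t0 = std::chrono::steady_clock::now();
        long long sum = 0; // accumulate so the work isn't optimized away
        for (int q = 0; q < queries; ++q) sum += lookup(q % n);
        auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%-12s %8lld us (checksum %lld)\n", name, (long long)us, sum);
    };

    bench("linear scan", [&](int key) {  // O(n), tiny constant
        for (const auto& p : vec)
            if (p.first == key) return p.second;
        return -1;
    });
    bench("hash map", [&](int key) {     // O(1), bigger constant
        return map.at(key);
    });
}
```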
Simply quality content: straight to the point and engaging. Luckily I was already subscribed to this channel.
You deserve much more subs and views! Amazing video!!
100%. I was confused by how this can only have 300 views; great work.
Asphalt 8/9 is crazy optimized if you’ve ever played it. Glorious graphics, solid 60 fps all the time even on pretty old computers and mobiles, very responsive.
Wrong title, it should have been: The LOST Art of Game Optimization
It’s not lost though. AAA games aren’t the only games on the market and even then there are well optimized AAA games
And for most people, AAA games are the only ones that matter. @@crestofhonor2349
@@crestofhonor2349
Name 1 optimized AAA game made after 2016
KSP2 devs: Let's make a ridiculously accurate rocket-building game! And the map is the whole Solar System, just for good measure!
Also KSP2 devs: Recommended for 1080p60 is an RTX 3080
what
what
what
what
what
Cyberpunk runs on the Steam Deck... but people want to tell me it's impossible to run it on the PS4 with the expansion and the 2.0 update? NO WAY... That version of the game was a crime.
The Steam Deck is more powerful than the PS4. So yes... it runs on a Steam Deck and not a PS4.
@@jairit1606 And the Legion Go is more powerful than the Steam Deck. I played every quest and the expansion on it; really freaking good game (and handheld PC).
wow, someone who knows what they're talking about, very refreshing
dude, I heard the accent, went straight to the channel description, and saw what I was hoping for. a HUNGARIAN youtuber making English content? I'm going to browse this channel very thoroughly 😂 keep it up, you're doing something great💯
Meanwhile "Alan Wake II"
Triangles? What Triangles? There's only 1 Trillion triangles on the player's perspective tho. Surely your entry level 4090 can handle it, right?
Having an RTX 3060 be the MINIMUM GPU requirement for a game to run screams shoddy optimization
A YouTube video once said the reason DX12 is so "bad" is that the burden of optimisation is now on the game devs, who don't have much experience with what drivers used to handle...
What about vulkan....?
@@BOT-fq9bu Vulkan is also a low-level API; roughly speaking, you'd have to do the same amount of work.
There are mid-2000s games that look and play amazingly on what we'd consider low-end hardware today.
I gave it a like just for the effort you put into the video.
To give some feedback:
I think the "Code" section was a bit too fast especially you explaining what we are actually trying to accomplish. I didn't get it before rewatching that part a few times
Ray tracing can be more efficient than typical rasterization when dealing with a huge quantity of triangles: a rasterizer has to process every triangle, while a ray tracer walking an acceleration structure like a BVH does work per ray that grows roughly logarithmically with triangle count.
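A back-of-the-envelope cost model in C++ (the constants are made up purely to show the shapes of the curves, not real renderer numbers): raster work grows linearly with triangle count, per-ray BVH traversal grows roughly logarithmically, so at some triangle count the lines cross.

```cpp
#include <cmath>
#include <cstdio>

// Toy cost model: rasterization pays per triangle, ray tracing pays
// per-ray BVH traversal that grows roughly with log2(triangle count).
int main() {
    const double rays = 3840.0 * 2160.0;        // one primary ray per 4K pixel
    for (double tris = 1e4; tris <= 1e10; tris *= 100) {
        double raster = tris;                   // ~1 unit of work per triangle
        double rt = rays * std::log2(tris);     // ~log2(n) node visits per ray
        std::printf("%.0e triangles: raster=%.2e  raytrace=%.2e\n", tris, raster, rt);
    }
}
```

Under these made-up constants, rasterization wins easily at 10k triangles but loses by more than an order of magnitude at 10 billion, which is the intuition behind the comment.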
This was surprisingly interesting… Thank you!
New games almost entirely lack the proper optimization to make them playable and enjoyable; those who still do it well are passionate artists...
This video basically explained why I'll never learn programming. The numbers immediately made my head hurt.
I jokingly made the reference that AAA studio leads just say "Our game isn't unoptimised, just buy a 4090" behind the scenes.
And then Todd Fuckwad said it in an actual interview. Fuck everything that guy stands for. And then he dares to act disappointed at The Game Awards every time his game doesn't win. Dude thinks the sun shines out of his ass.
What great work, dude!
This is underrated. Great video, sir!
I thought it was the subdivision that broke shit, but 3 MILLION?!!? HOW DO YOU DO THAT BY ACCIDENT
The number one thing that works for optimization is frame generation. Anybody can now do it with any game and any graphics card with a program on Steam called Lossless Scaling. That's what happens when AMD makes FSR frame gen open source.
In the 90s, due to the limited resources on computers, devs had no choice but to optimize software (not just games) before release; using the internet for updates was pretty much nonexistent.
Remember how little RAM some programs used, like Adobe's PDF reader? Heck, look for PDF reader alternatives today and you will see some still use less RAM overall.
Heck, Windows did not show seconds on the taskbar clock because the CPU would need to render a new number every second, and that would cause a performance hit. While this was more true for Win9x, it still shows that devs had to make sure the code they wrote was good from the start.
But these days, because we can have 64GB of RAM and 2TB SSDs, devs don't really bother optimizing software. Why waste time optimizing when you have so many resources?
Hey! What's the tool you're using at 3:50? It seems way better than having to boot up Photoshop every time I want to make smoothness textures (damn Unity smoothness on the albedo alpha :v). Great video!
Thank you very much, now I know why I am getting 3 fps👌🏻
Thanks, this was a pretty cool rundown.
WOW! That's a highly concentrated quality video. So much meaningful content in so little time.
"Optimisation is easy just use the 20 new unrealengine5 meme effects that require a 4090 to run!"
This video is why modern videogames are the way they are lmao
He should definitely have talked about baked lighting in the video. It makes games run so much better, and it looks awesome if done right. Many beginner devs just use Lumen since it's now enabled by default in UE5, and those devs don't know any better than to use this extremely expensive technique.
Also a fun fact: I once saw a tutorial on "how to remove the 'lighting needs to be rebuilt' error", and the dude literally just told the viewers to select all the lights and make them dynamic. Talk about good performance...
@@paper_shreds I see some people calling baked lighting "faking it", as if it's somehow inferior to real-time lighting even when the results are better. I can't wait until they realize that all 3D graphics are built on trickery.
I feel like those people saw some presentation about open-world games at some point, where dynamic light sources are the only option, and that made them think it's the best way @@SomeRandomPiggo
@@SomeRandomPiggo
I believe that baked lighting gets a bad rep because nearly every modern game with baked lighting looks terrible (see:R6 Siege)
@@TheJohn_Highway R6 used to actually look good, though. They've reduced the graphics over the years and over-sharpened it for esports. It's a bad example anyway; Half-Life: Alyx and CS2 both use baked lighting and look photoreal at times.
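For anyone unfamiliar with what "baking" means here, a toy C++ sketch (one flat surface, point lights, diffuse only; every value is hypothetical): the expensive lighting loop runs once at build time and the result is stored in a texture, so runtime shading becomes a single lookup instead of a per-frame, per-pixel light loop.

```cpp
#include <cstdio>
#include <vector>

// Toy "lightmap bake": compute lighting once, store it, look it up at runtime.
struct Vec3 { float x, y, z; };
struct Light { Vec3 pos; float intensity; };

int main() {
    const int W = 8, H = 8; // tiny lightmap over a flat surface at z = 0
    std::vector<Light> lights = {{{2, 3, 1}, 5.0f}, {{6, 1, 2}, 3.0f}};
    std::vector<float> lightmap(W * H);

    // Bake step: the expensive loop over all lights runs exactly once.
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            float sum = 0;
            for (const Light& l : lights) {
                float dx = l.pos.x - x, dy = l.pos.y - y, dz = l.pos.z;
                sum += l.intensity / (dx * dx + dy * dy + dz * dz); // inverse-square falloff
            }
            lightmap[y * W + x] = sum;
        }
    }

    // Runtime step: "shading" a point is now just an array/texture lookup.
    std::printf("baked light at texel (2,3): %.3f\n", lightmap[3 * W + 2]);
}
```

The trade-off is the classic one: baked lighting is nearly free at runtime but only valid for static lights and geometry, which is exactly why fully dynamic systems like Lumen exist and why they cost so much more.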
Mojang needs to see this
Java Minecraft has so many issues stemming from the fact that it is single-threaded. Bedrock Minecraft, however, does not have this issue.
Very informative video. Some parts of the video are really hard to hear due to the heavy accent; the autogenerated subs can only do so much.
C. You are using Unity.
People always say "stop optimizing games, it's stupid", and, well, they do need to be optimized.
9:26 "pretty good optimization"? no, that's insane optimization. what a simple solution as well. now, whenever i think a program can't be more optimized, i'll slap myself and remember this example and try to optimize it more.
Dude, imagine being the best-selling game of all time and having optimization so bad that a huge chunk of your community is dedicated to fixing it.
Minecraft is truly not a heavy game; if only the devs addressed it, it would be playable.
They have been working on it. I think they also focus on good practice more than performance.
I'm not really from a rich place, nor do I have money as a student, so my PC is kinda meh... and I admire this a lot. As a programmer myself, I've really dove deep into this lately.
I just can't play a game slower than 40-50 fps... or with drops.
Quality content right here! Thank you for sharing this!
I think we should focus on software more than hardware now; hardware is near its peak and won't keep improving to the point of actually... well, improving. I have a 3050 laptop; it's pretty sluggish in certain games but runs others like a dream. Is there a huge graphical difference? No. So why does it do that? Optimization.

The 3050m GPU is definitely not the peak I was talking about, but it's pretty damn close: add some extra VRAM, a few more CUDA cores, slightly faster bandwidth, and boom, you've made something that can run literally everything (spoiler: that exists, it's called the 3060). And if the 3050 by itself can run everything, some games at lower settings and a lot more at higher, then I don't see a single reason we should "improve" past the 3090, much less the 4090.

Realistically, the 4090 could last people a decade or more. What more do we need from a graphics card? Time travel? It runs games at 4K ultra with RT on; I seriously do NOT see anything left to improve. If game companies put care into optimization, I wouldn't have to upgrade my laptop for another decade, and people with much more powerful GPUs, like a 3070 or a 4070, shouldn't have to upgrade until those cards simply break.
I did not understand a word from 5:46 to 10:17. 10/10
get your ears checked then lol
Try understanding some bitches, we can understand him fine
This title feels like it should be the lost art of game optimization
All of this in 10 minutes, insane video
Give this man a world of warships sponsor
In the future, games will be made in a way that doesn't require insane hardware, and there'll be a universal standard for specs while maintaining realistic graphics.
Can't tell if you're completely delusional or colossally optimistic. I wouldn't trust modern devs to make a 1:1 copy of Minecraft that would run as well as the OG one.
@@TheJohn_Highway I dunno about that; Minecraft is a pretty poorly optimised game, and community mods have more than doubled the performance of the base game. Granted, Mojang has been improving it, especially with the recent lighting engine rewrite that fixed one of the worst bottlenecks.
@@MrMoon-hy6pn Bedrock was pretty well optimized. You can get 60 fps at a render distance of 84 chunks on the right specs.
@jeff_7274 Bedrock is CRAZY optimized. That's why I'm always tweaking the settings to run shaders without RTX LOL
But Java... well, I know it's a little limited by the language it's using, but still, we're talking about the second or third biggest company in the world. I honestly hope they port Java to C++ as well and make the two versions equal.
@@jeff_7274 Compared to vanilla Java, perhaps. With my setup I get ~60 fps in a jungle at 64 chunks on Bedrock, and 30 in the same seed at 32 chunks in Java, so you're right there. But with the Sodium mod and a few others I can get around 100-300 fps, so there's still a long way to go.
Someone needs to send this to the team working on ready or not 😂
Yes!! 2 years ago the performance of that game started going downhill, from always over 100fps on ultra to almost never going above 80fps, with too many drops to 60fps or below, lots of stuttering, and increased input lag.
Removing the old maps and game modes in the 1.0 update, plus the worse performance, are the worst issues with Ready or Not to me; it's just so unfortunate.
Nowadays game studios just throw DLSS at everything and call it a day.
I feel like Moore's Law will always continue, because 15 years ago everybody was sure it would grind to a halt around 2020 due to physical limits, and clearly that has not been the case. And there's the whole quantum thing brewing on the horizon for a decade or so now; while not ready for prime time yet either, it clearly does work and will let us reach new heights a few more years down the line that were long thought impossible.
Otherwise a great video ^^
Quantum computing cannot compute normal logic faster than a typical computer
Quantum computing is highly specific, it will not make video games run faster.
Quantum computing is not something that is on the horizon, at least not for the general public. Quantum computers only function at near absolute zero, and a cooling system capable of that definitely won't be available to consumers anytime soon; it would take even longer for building a PC around one to become affordable.
What's going on though, right? What happened to optimizing games properly, bro? (somebody fill me in)
1. Over-dependence on high-end cards, processors, systems, and consoles?
2. Projects rushed because of money and schedule pressure from financial backers?
3. Half-assed development and early releases for fast money?
Most games exited the FUN area and went into full "BUSINESS MODE ONLY". This is why I have absolute love for indie developers who actually make really well-optimized games with great content and interact with their community.
Devs relying on pure hardware power, instead of on skill at squeezing as much as possible out of that hardware, is the bane of most modern games (especially bigger ones).
For example, remember how Crash Bandicoot did more than even Sony believed was possible?
Meanwhile we have bland-looking games that look maybe 5% better than games from a few years ago but take dozens of times more disk space and struggle to run even on the best hardware consumers could theoretically access.
Old ARK takes over 430GB on disk, looks like crap, plays even worse, and is plagued with more bugs than some actual Early Access titles I've played (combined). FitGirl's repack is about 43GB, which tells you how much of that space is redundant or uncompressed data (see the sketch below).
Starfield is almost unplayable without Nvidia's DLSS, and Todd has the nerve to say it's because the game pushes the technology to its limits and PC players should just upgrade (ignoring the fact that consoles run the game better in general, even though their hardware is mid-range at best compared to gaming PCs).
Modern devs are just lazy. They self-learned from poor-quality tutorials on YT or some shit and think themselves great developers.
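On where file-size gaps like ARK's come from: a big chunk is usually uncompressed (or duplicated) asset data, textures and audio especially. A back-of-the-envelope sketch with made-up but representative numbers, using 4 bytes per pixel for raw RGBA8 versus 1 byte per pixel for BC7 block compression:

```python
# Back-of-the-envelope: why shipping uncompressed textures balloons installs.
# RGBA8 is 4 bytes/pixel; BC7 block compression stores 16 bytes per 4x4
# block, i.e. 1 byte/pixel. The texture count below is made up, purely
# for illustration; mipmaps (~33% extra) are ignored for simplicity.

W, H = 4096, 4096            # one 4096x4096 texture
raw_mb = W * H * 4 / 2**20   # uncompressed RGBA8
bc7_mb = W * H * 1 / 2**20   # BC7-compressed

print(f"per texture: {raw_mb:.0f} MB raw vs {bc7_mb:.0f} MB BC7")
# per texture: 64 MB raw vs 16 MB BC7

textures = 2000              # hypothetical asset count for a big game
print(f"{textures} textures: {textures * raw_mb / 1024:.0f} GB raw "
      f"vs {textures * bc7_mb / 1024:.0f} GB compressed")
# 2000 textures: 125 GB raw vs 31 GB compressed
```

Repacks mostly just apply aggressive general-purpose compression on top of this, which is why the delta can be so embarrassing for the original install.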
Ah yes, game optimization, the thing every developer ever forgot about
Ur gonna blow up
Very awesome video. Such an accessible introduction to optimization
I needed to empty out some space on my drive. From what I gathered, indie devs do so much better at optimization. Just look at the recent poor optimization of Cities: Skylines 2. Most of the time the devs argue that home computers are underpowered, to deflect the criticism.
Yeah, but indie games are generally not that large in scope either. Look at the ones which are, like Valheim; its optimization is alright. Now compare it to NMS, a AAA game. Or look at Choo-Choo Charles, an indie game whose optimization is not that great. Granted, it was made relatively quickly by just one guy, but still.
No way Wreckfest made it onto the thumbnail
Moore’s Law is only dying if you’re talking about transistor density, not compute power density, which is what we should really be measuring and which isn’t slowing down any time soon.
RDR2 looks better than most PS5 games right now. It amazes me how far the devs went to perfect the game's optimization and fit it on an 8th-gen console.
That’s because its environment is easier to render than in plenty of modern games that are set in cities; it’s just not super dense, so RDR2 really leans into its lighting. They have tons of presentations on how they achieved those visuals. That doesn’t mean it surpasses modern games in every aspect, though, since there are a select few things many 9th-gen games are doing that it isn’t. The problem stems from the fact that we’re still kind of stuck in cross-gen.
@@crestofhonor2349 This year almost all games are next-gen only, so we'll definitely see drastic improvements.
nice video
A sacred art lost to time..
We can't even run games at 4K 165 fps without upscaling unless it's a 4080 or 4090. It sucks that GPU prices keep getting higher when we're not even getting good performance at native resolution
4K is a lot of pixels to cover; you need one of those high-priced GPUs to render it.
Research floating point operations.
Buy a 1920x1080 monitor.
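The replies above are right about the raw numbers: 4K is four times the pixels of 1080p, and at 165 fps the throughput demand gets enormous before a single floating-point shader operation is even counted. A quick worked example:

```python
# Quick arithmetic on why "4K at 165 fps, native" is such a big ask.
# Each of these pixels then costs hundreds to thousands of floating-point
# shader operations per frame, which is where the GPU's FLOPS budget goes.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
FPS = 165

for name, (w, h) in RESOLUTIONS.items():
    px = w * h
    print(f"{name:>5}: {px:>9,} pixels/frame, "
          f"{px * FPS / 1e9:.2f} billion pixels/s at {FPS} fps")

# 1080p: 2,073,600 pixels/frame, 0.34 billion pixels/s at 165 fps
# 1440p: 3,686,400 pixels/frame, 0.61 billion pixels/s at 165 fps
#    4K: 8,294,400 pixels/frame, 1.37 billion pixels/s at 165 fps
```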
Dunkirk is a small city in France on the English Channel, in which the British troops were trapped during the invasion of France. There is also a movie about this with the same name. @worldsinmotion
On low budgets, developers export the whole game engine along with the game they make. Making Pong nowadays means exporting a custom Unreal Engine 3 build with it.
Really good video, but you need to find some way to reduce the peaks whenever you pronounce an S, as it hurt my ears a bit...
Hi! Good video, but you used the wrong Godot logo at about 2:00. That's all
You mean those extra teeth?
@@llllllXllllll Yes, that logo is very old and isn't used anymore
The games you show at the beginning are doing all the things you mention in the rest of the video; that's not "why" they are "unoptimized". And "optimized" is often an ambiguous term. Games may be optimized to run their workload as well as possible (given the time constraints and skills of the whole team) but still won't run on a potato PC, because that was never part of their design goals. If you include a path tracer in your engine, you had better be "optimized". But if you have to run on Switch, it makes better sense to drop the path tracer and maybe use a simple renderer with a few light sources.
Unless you have access to the source code and assets of those games, your claim is just speculation.
@@swh77 Because in his video he's just talking about basic stuff, not what goes into optimizing a ray tracer, shader passes, and so on.
@@swh77 It's not speculation when there's a plethora of research papers, presentations, articles, blog posts and discussions regarding all of this out in the public domain, a significant portion of which has come from developers working on AAA games. You don't need access to the source code that game X uses to implement screen-space reflections when the studio who made game X literally held a presentation discussing their approach in GDC one year, you can just go watch their presentation.
Apparently Microsoft just says "fuck this" and shits out a crudely written port for €70
I feel like this is a video you could have spent more time on with the explanations. It's not like someone clicks on this when they are in a hurry.
You know it IS quality content when you hear that accent
I'm a bit of a conspiracy guy, and I would say the big hardware players have some kind of deal with the AAA companies not to optimize their games, given the overreliance on DLSS and FSR, and because no one on the entire planet can tell me Starfield is optimized when Todd Howard's response is "it's optimized, go upgrade your PC lol". C'mon, it's really sus.
Comment for the algorithm, awesome content!
Thanks man I loved it
If it's so hard to estimate and so hard to do, why do game companies keep hyping up the gaming community with announcements and release dates? Get the job done, and then announce!
Now that raster techniques are so good, RT feels like a brute-force waste of resources.
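For a rough sense of the "brute force" part: even a single ray per pixel at 4K/60 is roughly half a billion rays per second, before bounces, shadow rays, or denoising are counted. A quick sketch:

```python
# Rough arithmetic on ray tracing's brute-force flavor. One primary ray
# per pixel is the bare minimum; real effects trace several rays per pixel
# plus bounces, so actual budgets are multiples of this.

W, H, FPS = 3840, 2160, 60
rays_per_pixel = 1

rays_per_sec = W * H * rays_per_pixel * FPS
print(f"~{rays_per_sec / 1e6:.0f} million rays/s")  # ~498 million rays/s
```

Raster techniques sidestep that bill by precomputing or approximating most of it, which is exactly why they scale so well.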
And console games are just PC games on low settings… which is not what they should be
I have lived long enough to see the days where good graphics don't always mean a good video game. Lethal Company for the win!
Studio Wildcard needs to see this video
ARK: Survival Ascended has the worst optimization I've ever seen in a video game
ARK has always been poorly optimized, ever since the game initially came out on UE4. There's nothing new from those devs other than proof that they don't care about performance.
The best examples are the original Xbox One port and the original Nintendo Switch port
@@crestofhonor2349 totally agree :)
@@crestofhonor2349 Apparently the new ARK Switch port actually runs pretty well and looks better than the original crappy port, which I find incredible. I'd be very interested to see what they did to get it running on there. Everywhere else, though, it still seems just as bad as it was the day it came out.
I know "thing vs thing, Japan" is cringe as hell but I can't deny that Japanese developers really know how to optimize their games very well. Allowing my 2060 to run on highest possible graphics, while it struggles with basically any western game to have stable 60fps on low-mid settings is a feat worth acknowledging.
Yeah, the internet needs to stop glazing over Japan.
This is not just with games; it happens with cars as well. Western cars are horribly made and very unreliable, meanwhile Japanese cars are built like tanks.
Except Game Freak 😔
Well, it's probably not a problem with their developers as much as it is with the insane time constraints they're forced to work with.
Fascinating video!
6:07 I got curious what game this is and wrote some code to analyze all existing games -- "a game that isn't dying or anything, or at least that's what the devs like to think" -- is it World of Warships?
Peak content ur underrated frfr
You're probably the only one who can save Call of Duty
Remember that time some dude ported Tomb Raider to the Game Boy Advance?
Modern game devs: I will pretend I did not see that
Really good video, although there's a problem: you mainly focus on optimisation in Unity, which is a fundamentally terrible engine that nobody should use anymore, especially given how mismanaged it is and the runtime fee. It would be great if you talked more about optimisation in Unreal, which I feel you kinda glossed over by only mentioning Nanite and Lumen; there's much more to it than that, such as light baking, Blueprint nativisation, anti-aliasing methods, shadow optimisation, etc. Godot would also have been great to mention since it's open source and all that
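Since light baking came up: the core idea is to evaluate lighting that never changes once, offline, and store the result in a lightmap texture, instead of re-evaluating it every frame. A toy sketch of the cost difference, with hypothetical scene numbers and nothing engine-specific:

```python
# Toy sketch of why light baking pays off: contributions from static lights
# are computed once at build time and written into a lightmap, instead of
# being recomputed for every pixel, every frame. All numbers hypothetical.

PIXELS = 1920 * 1080   # shaded pixels per frame at 1080p
STATIC_LIGHTS = 16     # lights that never move or change
FPS = 60
MINUTES = 10           # ten minutes of gameplay

per_frame = PIXELS * STATIC_LIGHTS               # light evaluations per frame
dynamic_total = per_frame * FPS * 60 * MINUTES   # paid live, every frame
baked_total = per_frame                          # paid once, offline

print(f"dynamic: {dynamic_total:,} evaluations over {MINUTES} min of play")
print(f"baked:   {baked_total:,} evaluations, paid once at build time")
# dynamic: 1,194,393,600,000 evaluations over 10 min of play
# baked:   33,177,600 evaluations, paid once at build time
```

The trade-off is that baked lights can't move and lightmaps cost memory, which is why engines mix baked lighting for static geometry with a handful of dynamic lights.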
When did neural networks and machine learning become "AI"? It's such an overused buzzword.