well i mean, the first game that "Sonic Team" made was Phantasy Star, with a 3D dungeon on an 8-bit console. Then they did something similar again with Sonic's bonus level thingy that gave you a Chaos Emerald or something, not to mention both Phantasy Star and Sonic have that unique oomph in almost every game
You killed it. You explained this perfectly. One thing with Frontiers: they switched to light probes with colour transfer, because baked GI wouldn't really be feasible there. It lights and colours the environment in real time
19:20 Frontiers keeps the entirety of the current level, open or cyber, loaded and in memory at once in order to prevent any load stutters at high speed. The objects that triggered loading in Generations and Forces were actually stripped out entirely!
This comes at a cost, unfortunately. It's likely the reason for the lack of LOD models for things like the platforms, to avoid them eating up memory, which also means they have to use an aggressive draw distance setting for platforms on Switch, with no meaningful attempt to increase it on the more powerful platforms like PS5
I was waiting for someone to explain the Hedgehog Engine fully. Many people think that if it's Hedgehog Engine 2 it has bad physics, but they don't know it's just a lighting engine, so Sonic Unleashed could be remade with Hedgehog Engine 2's power if they wanted
Another fact about this engine people might not know -- Yoshihisa Hashimoto was picked up by Square Enix after the Unleashed project to work on the Luminous Engine, which is the graphics engine used in Final Fantasy XV. It's also what powered Forspoken, although most of the original devs had left by that point. They also forked this engine early in development to make FFXIV 2.0, and that fork is still used today. If anyone remembers the Agni's Philosophy tech demo... that's Yoshihisa Hashimoto. There are probably still videos on YouTube of him presenting it for the first time, too. The Hedgehog Engine was always really, really, REALLY ahead of its time... developers wouldn't focus this heavily on lighting techniques until many years later, when the PS4/Xbone became a thing. Sonic Unleashed was one of the few games to even attempt global illumination in that age; in fact, you probably couldn't find another console game of the time that featured it. Anyone who lived through the early 360 era probably remembers how absolutely mindblowing Sonic Unleashed looked. The reason the Hedgehog Engine looks so good is because Sonic Team was essentially mimicking raytracing back in 2008. These days we have actual hardware-accelerated RT, but this was a really genius way to do it, even if it probably took a shitload of expensive compute time to bake back in 2008.
It wasn't the only game. Mirror's Edge also used baked global illumination. Around this game's release, Epic Games added Lightmass to Unreal Engine 3, a tool for baking GI, so any games developed with it afterwards probably had baked GI as well. Lightmass carried over to UE4 and is still present in UE5, but it's being replaced with Lumen now. Still, these games have aged really well. Sonic still looks great, and Mirror's Edge I think looks better than the sequel.
@@dianavi1893 Except that Mirror's Edge didn't use Unreal's Lightmass. DICE, working with Illuminate Labs, developed their own lightmap baker supporting GI called "Beast"; that's what was used in Mirror's Edge.
There’s a mod for Frontiers called “Improved Ambient Lighting” and another called “Subsurface Scattering Fix”. These make the game look more like Unleashed by making the lighting more detailed and affect Sonic based on the environment, and by fixing the bad lighting spots you could see on Sonic's arms and in his ears
Don't forget Graphics Enhancement. The mod utilises HE2 more than Sonic Team did and makes the game more stunning, also featuring atmospheric changes as well
their goal of resembling Pixar was a resounding success. The entire game feels like a Pixar movie, and that’s a big part of why I think they have still yet to make a game that looks and feels as good as Unleashed.
i think it's because all the lighting was intentional; real-time lighting doesn't have the same charm. I don't think they are going for the Pixar vibe anymore anyway. The tone of Frontiers and Forces was more serious. I dislike Frontiers' art style tho.
Some more insight into HE2's lighting: It still relies on baked data, though its algorithms are more advanced. The lightmaps now use 4 textures stored in HDR color space (which means they can exceed the 0-1 range) and use something called spherical gaussians to allow for more detailed bounced lighting, especially on surfaces with normal maps. The shadows are baked into a separate texture, which gets blended with the real-time shadows. The light field uses volumes (boxes that encompass an area of the level) and stores the data in 3D textures using spherical harmonics, which, similarly to spherical gaussians, can imitate the way bounced light travels around an object. The IBL probes use HDR images of the area, complete with direct and indirect lighting. Frontiers changed some things, mainly by storing only the luminance data in the light fields and using baked maps only for indoor/underground areas. The IBL probes are images with data from the 3 buffers used by HE2's deferred renderer: albedo (color only), normal, and parameter (for metalness, smoothness, ambient occlusion, etc.). These are combined in real-time, which makes the reflections look correct during different times of day.
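Since spherical harmonics come up a lot in this comment: here's a toy, single-channel sketch of how a probe's SH coefficients might get evaluated in a direction. It uses only the constant and linear bands, and the function name and coefficient layout are illustrative -- this is not HE2's actual format.

```python
from math import pi, sqrt

# Basis constants for spherical harmonic bands 0 and 1.
C0 = 0.5 * sqrt(1.0 / pi)   # constant term (Y_0,0)
C1 = 0.5 * sqrt(3.0 / pi)   # linear terms (Y_1,-1 / Y_1,0 / Y_1,1)

def eval_sh_l1(sh, direction):
    """Evaluate 4 SH coefficients in a unit direction (one color channel).

    sh = [constant, y, z, x] -- a toy stand-in for the per-probe data a
    light-field volume would store per channel.
    """
    x, y, z = direction
    return sh[0] * C0 + C1 * (sh[1] * y + sh[2] * z + sh[3] * x)

# A probe lit mostly from above: positive 'y' coefficient.
probe = [1.0, 0.5, 0.0, 0.0]
print(eval_sh_l1(probe, (0.0, 1.0, 0.0)))   # facing up: brighter
print(eval_sh_l1(probe, (0.0, -1.0, 0.0)))  # facing down: dimmer
```

With just these 4 numbers per channel, an object anywhere in the volume can look up smoothly varying bounce light in any direction, which is why SH is such a compact fit for light fields.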
Hey! Great video. Game dev here. This is actually how most 3D games handle their lighting even today. Of course there are things like PBR and raytracing now, but I had no idea sonic team came up with these techniques.
I’m thoroughly amazed at the amount of work that goes into games and how they’re able to figure out workarounds whenever they come up against obstacles. It’s funny seeing how much detail we may overlook but that is needed to make the game feel complete.
There was a point in this video where I broke and burst out laughing, because it suddenly hit me how monumentally clueless most people are when they talk about Sonic. There's so much depth to the development of these games; how can anyone just break it down to "shit" or "mid"? It kinda makes me sad. Anyways, thanks for making videos like this, it's nice hearing someone talk about these games below the surface. Not enough people do that.
The reason why it seems like smaller objects are being rendered first is probably just that there are more of them. The depth buffer helps avoid rendering something if there's already something in front of it that was rendered earlier. Sorting the objects from closest to furthest is an O(n log n) operation, which means it would be slower than just accepting the occasional wasted work.
I've always loved hearing about the Hedgehog Engine. Obviously they used baked lighting to get most of their lighting data, because doing that stuff in real time was not feasible then. Shadow maps were new at the time; in the previous generation, shadows were mostly static and prebaked into textures. The new baked GI system plus the use of shadow maps really helped. Plus the models were all pretty high-poly, especially compared to the models used on the PS2, GameCube, and Xbox. Sonic's cutscene model was so good that polygon edges were difficult to see.
it's interesting that Lumen in Unreal Engine 5 basically uses a similar rendering ideology Sonic Team came up with so long ago, proving just how ingenious they were. A lot of these techniques did exist back then, but they still managed to revolutionize them and make a stunning game
@@Thamstras i was thinking more of how lumen uses voxels to calculate lighting in real time comparable to the system hedgehog engine uses to apply gi to sonic
14:32 small clarification: mip maps don't have a set amount. In engines like Unity, I'm pretty sure mip map levels are made by halving the resolution until it reaches 1x1, storing each one
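That halving rule is easy to sketch. The helper names below are made up, and the count follows the usual full-chain convention of halving each axis down to 1x1:

```python
from math import floor, log2

def mip_count(width, height):
    """Levels in a full mip chain: halve until 1x1, counting the base level."""
    return floor(log2(max(width, height))) + 1

def mip_chain(width, height):
    """(w, h) of every level, halving each axis and clamping at 1."""
    levels = [(width, height)]
    while levels[-1] != (1, 1):
        w, h = levels[-1]
        levels.append((max(1, w // 2), max(1, h // 2)))
    return levels

print(mip_count(1024, 1024))  # 11
print(mip_chain(8, 4))        # [(8, 4), (4, 2), (2, 1), (1, 1)]
```

So a 1024x1024 GI texture carries 11 levels, which is exactly what makes streaming low-res mips first (as described earlier in the thread) so cheap.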
8:48 I believe the Hedgehog Engine does do this, with a max cap of 4 scatter bounces strictly reserved for Sonic's model handling occlusion light. I think the environment only used 1-2 bounces. There also was an order to the lighting layers being rendered: ambient/fill light first, occlusion light from the environment, direct hard light, and Sonic's occlusion light back onto the environment (this is why Sonic is fully rendered at 10:27, as his reflected light has not been applied to the environment yet -- it's really not noticeable unless Sonic is in direct hard light and close to an object). Water does take the lighting engine into account and, due to its unpredictable motion, is technically rendered in real time. So water gets re-rendered tens of times over once disturbed from its "default" pattern. This is why the game lags so much in Adabat.
so this is why unleashed has aged so well visually! I am a sucker for anything relating to rendering in computer graphics and this video explains all of the concepts it talks about very well. This video and your channel deserve way more subs!
Neat stuff! I remember the Unleashed GDC talk, it was my obsession for a good year tbh. Have you considered hitting up some people in the modding scene with more in-depth understandings of the engine's low-level functions, architecture / design, etc? It'd be a lot more technical but I think that has some value for people who're into that kinda thing (like "how is a CPU even made?" kinda stuff). I know the author of the Gens Freecam mod you've used has a lot of insight in Gens specifically, and a few people who mod Lost World talk all the time about how that engine is WILDLY different from Unleashed and Gens under the hood. Would be cool, maybe!
Really cool video! Yoshihisa Hashimoto was kind of a pioneer in a lot of ways and I wish he'd return. A lot of work went into the Hedgehog Engine and I think it could look great if pushed to its full potential.
Random fact I want to share about Frontiers: it actually *does* support baked GI lightmaps/shadowmaps, and pre-baked light-field. The lightmaps are used in the indoor/temple sections of Ares/Chaos/Ouranos Island for example. And while shadowmaps and pre-baked light-field aren't used in the game at all (since the light-field's colored in real-time), I've ported some stages from older HE2-based games (M&S Tokyo 2020, Sakura Wars, and Olympics 2020 (the one with humans) specifically) and since the file formats are identical, the game has no issue with reading them. That out of the way, I wanted to say, this was quite well-researched, and def an interesting watch. Especially as someone who actually tends to like working with lighting/materials.
I know I learned something here today. This was a really interesting video essay. I was always fascinated with how games like Sonic Unleashed and Generations looked so detailed and how they have aged well visually. So it’s cool to finally have a summary of how the Hedgehog Engine and Engine 2 work.
amazing video! it's so interesting to learn about how much they went through just to achieve their goal of that pixar look, and the engine has come so far, just look at shadow generations! it looks gorgeous so far and im so excited!
There's something about Unleashed's lighting that I heavily prefer over Forces' or Frontiers'. It's still the most beautiful-looking game to me graphically. Thx for the rundown
This video explains how much effort Sonic Team put into making this engine, and it also explains why Sonic Frontiers has so many pop-ins while you're exploring the islands.
it's crazy to think they worked so hard to make this engine what it is, just for the player to blaze through the stage and barely see anything through the motion blur
15:54 IIRC backface culling only renders a polygon if its vertices are clockwise on the screen. The algorithm is very simple but saves a LOT of performance!
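The winding test boils down to the sign of a 2D cross product. A minimal sketch (function names are made up, and whether clockwise or counter-clockwise counts as "front" is just a per-engine convention -- Direct3D and OpenGL default differently):

```python
def signed_area_2x(p0, p1, p2):
    """Twice the signed area of a screen-space triangle.

    Positive means the vertices wind counter-clockwise, negative clockwise.
    """
    return ((p1[0] - p0[0]) * (p2[1] - p0[1])
            - (p1[1] - p0[1]) * (p2[0] - p0[0]))

def cull_backfaces(triangles, front_is_ccw=True):
    """Keep only triangles facing the camera -- roughly half, for free."""
    sign = 1.0 if front_is_ccw else -1.0
    return [t for t in triangles if sign * signed_area_2x(*t) > 0]

tris = [((0, 0), (1, 0), (0, 1)),   # counter-clockwise winding
        ((0, 0), (0, 1), (1, 0))]   # same triangle, clockwise
print(cull_backfaces(tris))                      # keeps the CCW one
print(cull_backfaces(tris, front_is_ccw=False))  # keeps the CW one
```

Three subtractions and two multiplies per triangle, and around half the scene never reaches the pixel shader.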
They absolutely had to. RenderWare wasn't cutting it and had largely been dropped by the rest of the industry at that point in time, and if they had attempted to use Unreal Engine 3 the results would have been horrendous. Making their own tech-pushing engine, custom fit to their needs, is what they had been doing for years until RenderWare derailed things
And it is a company. Nintendo makes entire consoles, and they get more out of it than "just do all that licensed for the Bill Gates ecosystem, bro, it's already further ahead". Not every company is your little sister's lemonade stand.
@@sboinkthelegday3892 does taking Nvidia's homework and modifying it with custom firmware count, tho? Like, yeah, the Switch is okay, but most of their work was on the software side of things. I guess there's the Joy-Cons, which are somewhat unique.
What you're seeing at 16:40 looks like a depth pre-pass (hard to say for certain without looking at NSight myself). This serves as a way to reduce fully-shaded overdraw in forward renderers by basically "pre-loading" the depth buffer. This allows you to then render the scene again, fully shaded this time, and using the pre-rendered depth buffer to then only shade each pixel once. In a sense, you could think of this as paying a small cost up front for pixel-perfect culling later.
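A toy illustration of the two-pass idea described above (not real GPU code -- rasterization is faked with precomputed pixel lists, and all names are made up):

```python
def render_with_depth_prepass(primitives, shade):
    """Toy forward renderer: depth-only pass first, then shade once per pixel.

    primitives: list of (depth, pixels, material) where 'pixels' is the list
    of (x, y) coords the primitive covers -- a stand-in for rasterization.
    shade: the expensive per-pixel function we want to call as rarely as possible.
    """
    # Pass 1: depth only. Cheap -- no shading, just closest-depth bookkeeping.
    depth_buffer = {}
    for depth, pixels, _ in primitives:
        for p in pixels:
            if depth < depth_buffer.get(p, float("inf")):
                depth_buffer[p] = depth

    # Pass 2: full shading, but only fragments matching the pre-pass depth
    # survive, so each visible pixel pays the expensive shade exactly once.
    framebuffer = {}
    for depth, pixels, material in primitives:
        for p in pixels:
            if depth_buffer.get(p) == depth:
                framebuffer[p] = shade(material)
    return framebuffer

# "near" occludes "far" at pixel (0, 0), so "far" is never shaded there.
tris = [(0.5, [(0, 0), (1, 0)], "near"), (0.9, [(0, 0), (2, 0)], "far")]
print(render_with_depth_prepass(tris, str.upper))
```

The small up-front cost is the extra geometry pass; the payoff is that heavy shading never runs on hidden fragments, which is exactly the "pixel-perfect culling" framing above.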
Several years later, the mad people at Insomniac made Ratchet & Clank: Rift Apart, which can easily be described as Pixar-like. I love showing off that game to family, as they all say that it looks like a movie.
very good video. Although there are some small mistakes, you managed to explain everything in a way that someone who is curious but not knowledgeable could understand, and hopefully they'll go on to learn more about how games in general work. There are some interesting tools you can also use to check how the engine works; you could've used ReShade to read the depth buffer from the game. Also keep in mind the PS3 has 256 MB of RAM and 256 MB of VRAM -- it's very impressive what they managed to pull off back then. I'm pretty sure they could've optimized the game to run better on it, but it's still interesting to see how they thought back then about improving performance. Well done :)
I noticed the lighting in Unleashed straight away as I was familiar with the lighting concept of radiosity, in real life, in paintings, and in ray tracing, and how it rarely got seen in any form in any games because of the huge amount of computation required. It worked so well in the Mediterranean-like streets of Unleashed. The gaming press seriously let us down here by completely not noticing/ignoring this visual achievement at all. Ultimately, Sonic Generations made me a PC gamer because I couldn't stand being stuck with 30fps gameplay over and over again and wanted to see and play these at 60fps. But for all the visual innovation of Unleashed, and the intelligence, capability and determination of the developers at Sega, I'm still stunned they thought the Werehog was a good idea and seriously went ahead with it! Surely they must have had some doubts at some point?!
So the real light transport problem is not that your light rays "multiply", it's because you'd have about 10^18 photons to simulate each frame in a "physical" simulation. That's too many! A good rule of thumb on what we can do today is still about one ray per pixel. So each ray models many photons hitting the same spot but at different tiny offsets, getting scattered in all different directions by the microscopic shape of the surface (this is diffuse scattering). Yes, we have to "spawn" more rays to model this on the renderer end, but that's still far, far fewer rays than there are photons in the scene - otherwise we'd be violating conservation of energy!
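The "one ray stands in for many photons" idea is just Monte Carlo averaging. A tiny 1D sketch (the function and names are stand-ins, not a real renderer):

```python
import random

def estimate_diffuse(incoming_radiance, samples=64, seed=1):
    """Monte Carlo estimate of the average light arriving at a shading point.

    Rather than simulating every photon, average a handful of random
    samples: each one stands in for the many photons arriving around that
    direction. 'incoming_radiance' maps a number in [0, 1) (a toy 1D
    stand-in for a hemisphere direction) to incoming light.
    """
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    total = 0.0
    for _ in range(samples):
        u = rng.random()               # pick one random incoming "direction"
        total += incoming_radiance(u)  # one sample models many photons
    return total / samples             # the average converges to the integral

# The true average of 2*u over [0, 1) is 1.0; the estimate gets close
# with thousands of samples instead of ~10^18 photons.
print(estimate_diffuse(lambda u: 2.0 * u, samples=10000))
```

Offline GI bakers like the one in Unleashed can afford huge sample counts per texel because it all happens before the game ships; real-time path tracers are stuck near one sample per pixel and lean on denoising instead.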
Wow dude, thank you so much for this video. Though I don't know shit about game dev, it's really cool to see how the hedgehog engine works as it explains why hd games post unleashed came out with so much polish. Really great job here, congrats!
15:50 quick note - EVERY game engine since the dawn of time uses backface culling. 17:17 Transparent objects are typically sorted back-to-front (the painter's algorithm) so distant surfaces don't pop out in front; for opaque geometry, the depth buffer settles what's in front regardless of draw order, so engines often draw front-to-back instead to skip hidden pixels.
Dude, you are awesome for making this video. I've always loved the look of these games, and now I understand more about how they are able to look so fabulous. Gracias papi chulo.
This is an interesting video. I'd heard about the Hedgehog Engine but hadn't watched many videos that go into more depth on it, so this was compelling. I finished Unleashed yesterday, so it's nice to see more content on it. To think they made a cool new engine in two years after '06.
Lmao I literally did present the Hedgehog Engine at my school, as a 20-minute speech presentation to validate one of my years. And this video confirms that I hadn't fucked up lol. PRETTY nice video, enjoyed it!
Super Mario Galaxy, Sonic Unleashed, Mario Kart 8 (mid gameplay aside), and Black Ops 3 have the BEST engines. I wanted a full video on the Hedgehog Engine for so long, thank you for making it!!
Halo 3 also has an amazing lighting engine! Man, all these graphically amazing games with comparable lighting engines coming out around 2007-2008 is really strange. It's as if someone (3:00) scared these console-locked publishers into funding more advanced techniques out of fear of "falling behind in the race." I really enjoy how all these dominoes fell just right, so that 2004-2008 ended up producing the best games to ever exist, despite the technology limitations at the time too!
good call on Mario Kart 8, that game looks unreal (no pun intended) also worth mentioning Battlefield 3. Hard to believe that game was made for seventh generation consoles.
Great video! I love seeing people interested in the Hedgehog Engine and baked lighting! I will point out a couple of small mistakes that I found, though.
5:43 - Hemispheric Ambient Lighting is not about taking normal maps into account (that's a nice consequence!). It refers to ambient lighting being defined not as a single color, but as one color for an upper hemisphere and one for a lower hemisphere. So surfaces pointing up are tinted by the upper hemisphere's lighting, and those pointing down are tinted by the lower hemisphere's. This was a great first step toward imitating color bounce, as the two biggest sources of ambient lighting (light coming from the sky and light bouncing off the floor) could now be rendered. This is independent of normal maps: models without them still have surfaces pointing up or down that get tinted.
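For anyone curious, the upper/lower hemisphere blend can be sketched in a few lines. This is a toy version with made-up names, assuming +y is up -- not the engine's actual code:

```python
def hemispheric_ambient(normal, sky_color, ground_color):
    """Blend two ambient colors by how far the unit normal points up.

    normal: (x, y, z) with +y as up; colors are (r, g, b) tuples.
    """
    # Map normal.y from [-1, 1] to a [0, 1] weight toward the sky color.
    w = 0.5 * (normal[1] + 1.0)
    return tuple(g + w * (s - g) for s, g in zip(sky_color, ground_color))

sky, ground = (0.4, 0.6, 1.0), (0.35, 0.25, 0.15)
up = hemispheric_ambient((0.0, 1.0, 0.0), sky, ground)     # pure sky tint
down = hemispheric_ambient((0.0, -1.0, 0.0), sky, ground)  # pure ground tint
side = hemispheric_ambient((1.0, 0.0, 0.0), sky, ground)   # 50/50 mix
print(up, down, side)
```

Note the blend needs nothing but the surface normal, which is why it works on models with or without normal maps, exactly as described above.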
6:10 - Global Illumination is independent of the techniques you mentioned before (except AO). GI is the lighting technique, but its storage can take any form. Sonic Colors stores its global illumination in vertex colors on a lot of objects, and uses Hemispheric Lighting to store global illumination for moving objects.
15:55 Backface culling is just a standard feature of 3D rendering. It's not an automatic solution or anything; nobody modeled back walls or told the 3D engine to render backfaces.
17:55 Now, I'm definitely not an expert here, but I believe what you're actually seeing here is not the rendering process but the GI loading process. The first part, without GI, is before it has loaded. The second, with GI and very low-resolution shadows (so low that they seem absent, but they're in the GI texture's alpha channel, so they are there), is the lowest-resolution mip, which loads first. And every part after that is a higher-resolution mip being loaded for the GI texture.
Since Gens is a forward-rendering engine, there are no multiple object and material passes to gather information from, like you later show in Frontiers; objects come out fully rendered. From the little info I know, the rendering goes like this: first, the terrain/baked objects are drawn with the indirect lighting; next, the realtime sunlight gets added on top of that, masked by the shadow map contained in the alpha channel of the GI texture and by the realtime shadow map. After that comes something called the eyelight, a second light coming from the camera that makes textures with normal maps more noticeable in shadow, as GI in HE1 doesn't actually interact with normal maps. Dynamic objects follow a similar procedure, but instead pick up indirect lighting from the light field and mask the sun with just the dynamic shadow map.
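The pass order described above can be sketched as simple layering. This is a single-channel toy based on my reading of this comment, not the engine's actual shader, and all names are made up:

```python
def shade_baked_terrain(albedo, gi, sun, n_dot_l,
                        baked_shadow, realtime_shadow, eyelight):
    """Single-channel layering: baked indirect light first, then realtime
    sunlight masked by BOTH shadow terms, then the camera-attached eyelight.

    All inputs are scalars in [0, 1] standing in for one color channel.
    """
    indirect = albedo * gi                     # baked bounce lighting
    sun_mask = baked_shadow * realtime_shadow  # both masks must pass
    direct = albedo * sun * max(0.0, n_dot_l) * sun_mask
    return indirect + direct + eyelight        # eyelight fills in shadows

# In full baked shadow the sun term drops out; indirect + eyelight remain.
print(shade_baked_terrain(0.8, 0.3, 1.0, 1.0, 0.0, 1.0, 0.05))
print(shade_baked_terrain(0.8, 0.3, 1.0, 1.0, 1.0, 1.0, 0.05))
```

Multiplying the two shadow masks is what lets a cheap baked shadow and an expensive realtime one coexist: either can darken the sun without the other knowing.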
18:38 Both HE1 and HE2 before Frontiers render lighting under the same principles, so neither has "realtime lighting" as we would refer to it nowadays. They both bake everything and add a realtime sunlight (static and masked by the baked shadows) and small realtime lights on top of that. The difference is that HE2 bakes and renders everything in a more modern way.
18:45 PBR relies on normal maps even more heavily than previous rendering techniques. HE2 even uses a new version of the GI textures that actually takes normal maps into account, because of how much more they are used.
19:12 You're pretty much correct about how IBL works generally, but in HE2 the lighting work is split between the cubemaps and the old HE1 techniques. The GI textures and light fields are used for the diffuse (non-reflective) lighting, and the IBL for the specular (reflective-only) lighting. They both work together at the same time.
Thank you for the insight 🤝
@@Cifesk Pin?
@@something-from-elsewhere I thought it was pinned already whoops
Bro literally spitting facts
@@Cifesk This research would look really good in a collaboration video with LowSpecGamer
hedgehog engine developers: look how good our lighting looks :D
unleashed speedrunners: that's cool, anyway, drift
WOO, FEELIN GOOD
More like "that's cool, anyway m-speed"
I mean does that really change the fact that unleashed looks really good when played on newer emulation hardware?
Joshua Salkeld: *Casually speedruns through the game while explaining how good the game is*
epic gamer moment!!!!
Man this really does show how Sonic Unleashed WAS ahead of its time
still kinda is, cause it's really hard to emulate it with a good frame rate
@@Arthemisishere Or vertex explosions
It's still the best looking Sonic game.
@@Nov-5062
Art Direction-Wise, Yes.
In terms of the lighting... I think Forces is better.
@@gameman5804 Totally. What killed the appearance of Forces was its art direction, relegating the lighting to the third division, because with some textures, and specifically the character textures, the game looks plastic and flat. The real work and presentation, I think, is in Frontiers.
I am so glad someone FINALLY talks about the Hedgehog Engine with some sense and context. For all too long, Sonic fans have complained about the engines simply because they didn't like the games made with them--without realizing how much is actually going on under the hood, running in real time at thousands of in-game km/h, rendering absolutely everything at 30-60 frames a second and nearly perfectly maintaining that framerate (outside of a few notable areas). Both HE and HE2 are computer black magic, and how someone feels about Unleashed or Forces or Frontiers shouldn't be a factor in how incredible these things are. Especially in the case of HE2, which has a number of non-traditional Sonic uses as well.
Genuinely, I'm so happy to see this video. People treat Hedgehog Engine 2 as this terrible thing Sega needs to get rid of, pretending they understand how game and engine development works. I've had so much brain damage explaining to people how adding physics to Frontiers would not "destroy the entire engine because it's made in C++ and changing one thing would ruin every other part of the engine", ignoring the fact that every fucking game engine has scripting languages that handle the actual gameplay logic (typically Lua). Hedgehog Engine 2 is not at fault; it's an incredible piece of tech that doesn't get enough credit because "iT's HoLdInG bAcK sEgA"
Hey it's the Goat Pretz🔥🔥🔥
@@monkemango the only thing holding back SEGA is the fans and that's that on that. 🤣
@@Ivan210bluboi Hey it's the Goat Ivan🔥🔥🔥🔥
@@Hard_Pretzel Nahh You goat-er🔥🔥🔥
"our goal pixar graphics"
Man, I'm not even surprised anymore about how Sonic Unleashed still looks amazing. This game was ahead of its time
None of the 3D Sonic games come close to what they should have been, which is more like Sonic Utopia, a fan-made project
@@ikilledthemoon Sonic Unleashed is still the best-looking Sonic game, even when compared to Generations. It comes pretty damn close to satisfying their goal
Sonic unleashed looked so much better than Pixar by miles. My favorite game of all time
@@ikilledthemoonscrew that.
Nah.
Unleashed looks far better.
Shadow Generations is finally trying to match those graphics... after 16 years 😅 (better late than never though!)
I'm a game dev professor, I recently explained how light baking works in Unity and it's exactly this. Pretty stunning that these guys either came up with this stuff or twisted similar existing ideas about 15 years ago. As a teenager I didn't get any of what I read at the time but now it makes so much sense.
It's a classic approach and something a lot of people arrived at independently. Precomputing where the light is coming from is a natural idea for speeding up rendering of static environments. You could say it's a type of dynamic programming.
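The precompute-then-look-up idea can be sketched in a few lines (toy Python, nothing engine-specific; the "GI solver" here is a made-up stand-in for the real thing):

```python
# Minimal "bake once, look up forever" sketch: an expensive lighting
# function is evaluated offline into a table (the lightmap), and the
# runtime just does a cheap lookup. The lighting function here is a
# made-up stand-in for a real GI solver.

def expensive_gi(x, y):
    # pretend this traces thousands of rays; here it's just a gradient
    return (x + y) / 2.0

def bake_lightmap(size):
    return [[expensive_gi(x / (size - 1), y / (size - 1))
             for x in range(size)] for y in range(size)]

LIGHTMAP = bake_lightmap(4)        # done offline, once

def shade(u, v, size=4):
    # runtime: nearest-texel lookup instead of re-running the solver
    x = min(int(u * size), size - 1)
    y = min(int(v * size), size - 1)
    return LIGHTMAP[y][x]
```

This only works for static scenes, which is exactly the trade-off the comment describes: the lighting is frozen at bake time.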
This video proves for me that Sonic '06 was not a fuck-up tragedy failure, but instead a game made by amazing hard working pioneering people who did the most with what little time, money, and sanity (from stress) they had while working on the project. This game wasn't a flop, but instead a case study for how much humans can do with so little money and time and A LOT of pressure.
Why are you trying to excuse a bad video game? It doesn't matter who made it; a bad product is a bad product sold to us by a greedy company such as Sega. I can see this excuse kinda working for smaller indie games, but even indie developers, who sometimes risk their livelihoods just to bring their vision to life, can produce a better product than Sonic Team mostly does
But it was a fuck-up of a product though? Just because there's a backstory doesn't change that it's a generally bad game that tarnished the brand.
Like, most products fail because of corporate influence, not because the devs suck
the replies to this comment are depressing
@@ProjectYellowHound I can totally agree with the "big company bad" part, but you have to consider that Sonic Team works for SEGA, and SEGA decides how long they get to produce each title. They worked as hard as they could, that's for sure, and I think there's some credit in that.
Nope, 06 is shit. Every bad game has a story. Nobody sets out to make garbage. But none of that excuses that they sold this shit for full price. Fuck SEGA.
Sometimes I forget oldschool Sonic Team were mad scientists. I hear most of them left the company to work at Nintendo on Mario Odyssey, though.
The ending of Odyssey felt like a deliberate throwback to Sonic Adventure games
Well, I mean, the first game that "Sonic Team" made was Phantasy Star, with a 3D dungeon on an 8-bit console
Then they did something similar again with Sonic's bonus level thingy that gave you a Chaos Emerald or something, not to mention both Phantasy Star and Sonic have that unique oomph in almost every game
That's just a damn shame though... Bring back og Sonic Team...
@@graalcloud Every time I’m in New Donk City, it makes me miss Empire City
@@Racco20 sega would just give em impossible deadlines again and their game would be crap through no fault of their own.
You killed it. You explained this perfectly.
One thing with Frontiers: they switched to light probes with colour transference, because baked GI wouldn't really be feasible at that scale. It lights the environment and colours it in real-time
19:20 Frontiers keeps the entirety of the current level, open or cyber, loaded and in memory at once in order to prevent any load stutters at high speed. The objects that triggered loading in Generations and Forces were actually stripped out entirely!
This comes at a cost, unfortunately. It's likely the reason for the lack of LOD models for things like the platforms, to avoid them eating up memory, which also means they have to use an aggressive draw distance setting for platforms on Switch, with no meaningful attempt to increase it on the more powerful platforms like PS5
@@GunFeverr This is untrue, all of the platforms and objects in the game have LOD models. Them having low range values is a separate issue.
That explains the pop in
@@teenageapple3788 his comment was incorrect; the low draw distance is there, but they do have LODs
@@GunFeverr Pretty sure the reason the platforms have no LOD models, is cuz they were added really late into development.
crazy to think the devs had to _come up_ with ambient occlusion
They also had to "come up" with baked lighting that's been around since at least 1996 lol
I think they don't come up with it they just "reflect it" 🥁🥁
They didn't "come up" with it.
@@Dr.W.Krueger it was sarcasm
@@Shardso
No, just ignorance.
I was waiting for someone to explain the Hedgehog Engine fully. Many people think if it's Hedgehog Engine 2 it means bad physics, but they don't know it's just a lighting engine, so Sonic Unleashed could be remade with Hedgehog Engine 2's power if they wanted
I swear, every time someone puts the blame on Hedgehog Engine 2 for something related to gameplay, it makes me die a bit inside
It's not just a lighting engine. Hedgehog Engine and HE2 are fully-encompassing game engines responsible for how the game works.
@@jubbalubThis ^
@@jubbalub Any game engine lets you program the gameplay. It's not like the game engine forces the game to not have momentum
@@michawhite7613 do you actually understand what you're arguing, or are you just having an argument for the sake of having an argument?
Another fact about this engine people might not know -- Yoshihisa Hashimoto was picked up by Square Enix after the Unleashed project to work on the Luminous Engine, which is the graphics engine used in Final Fantasy XV. It's also what powered Forspoken, although most of the original devs had left by that point. They also forked this engine early in development to make FFXIV 2.0 and that's still used today. If anyone remembers the Agni's Philosophy tech demo...that's Yoshihisa Hashimoto. There are probably still videos on YouTube of him presenting it for the first time too.
The Hedgehog Engine was always really, really, REALLY ahead of its time...developers wouldn't focus so heavily on lighting techniques until many years later, when the PS4/Xbone became a thing. Sonic Unleashed was one of the few games of that era to even attempt global illumination; in fact you could hardly find another console game that featured it.
Anyone who lived through the early 360 Era probably remembers how absolutely mindblowing Sonic Unleashed looked.
The reason the Hedgehog Engine looks so good is because Sonic Team was essentially mimicking Raytracing back in 2008. These days we have actual hardware accelerated RT, but this was a really genius way to do it, even if it probably took a shitload of expensive compute time to bake back in 2008.
It wasn't the only game. Mirror's Edge also used baked global illumination. Around this game's release, Epic Games added Lightmass to Unreal Engine 3, which was a tool for baking GI, so any games developed with it after this probably had baked GI as well. Lightmass carried over to UE4, and is still present in UE5 but is being replaced with Lumen now.
Still, these games have aged really well. Sonic still looks great, and Mirror's Edge I think looks better than the sequel.
@@dianavi1893except that Mirror's Edge didn't use Unreal's Lightmass. DICE, working with Illuminate Labs, developed their own lightmap baker supporting GI called "Beast", that's what was used in Mirror's Edge.
There are mods for Frontiers called "Improved Ambient Lighting" and "Subsurface Scattering Fix"; these make the game look more like Unleashed by making the lighting more detailed and having it affect Sonic based on the environment
And by fixing the bad lighting spots you could see on Sonic's arms and in his ears
So the Hedgehog Engine 2 lead to some visual downgrades then?
@@Hyp3rSon1X not really, they just dont use it to its full potential
Don't forget Graphics Enhancement. The mod utilises HE2 more than Sonic Team did and makes the game more stunning, also featuring atmospheric changes as well
_TIME TO DOWNLOAD ANOTHER MOD!_ 💨💨
@@kaistudios5536 Don't forget the "Graphics enhancement" mod if it's not in your mod folder already
their goal of resembling Pixar was a resounding success. the entire game feels like a Pixar movie and that’s a big part of why I think they are still yet to make a game that looks and feels as good as Unleashed.
This explains so much about why Unleashed's visuals look so good to this day, especially compared to more modern Sonic games.
i think it's because all the lighting was intentional; real-time lighting doesn't have the same charm. I don't think they're going for the Pixar vibe anymore anyway. The tone of Frontiers and Forces was more serious. I dislike Frontiers' art style tho.
Some more insight into HE2's lighting: It still relies on baked data, though its algorithms are more advanced.
The lightmaps now use 4 textures stored in the HDR color space (which means they can exceed the 0-1 range) and use something called spherical gaussians to allow for more detailed bounced lighting, especially on surfaces with normal maps.
The shadows are baked into a separate texture, which gets blended with the real-time shadows.
The light field uses volumes (boxes that encompass an area of the level) and stores the data in 3D textures using spherical harmonics, which, similarly to spherical gaussians, can imitate the way bounced light travels around an object.
The IBL probes use HDR images of the area complete with direct and indirect lighting.
Frontiers changed some things, mainly by storing only the luminance data in the light fields and using baked maps only for indoor/underground areas. The IBL probes are images with data from the 3 buffers used by HE2's deferred renderer: albedo (color only), normal and parameter (for metalness, smoothness, ambient occlusion, etc.). These are combined in real-time, which makes the reflections look correct during different times of day.
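A toy version of the spherical-harmonics storage idea described above (illustrative only: real SH bases are properly normalised and the engine's formats are far richer, but the shape of the trick is the same):

```python
# Toy sketch of the spherical-harmonics idea used by light fields:
# a probe stores a few coefficients, and shading reconstructs a
# direction-dependent ambient colour from them. Real engines use
# properly normalised SH bases; the constants here are illustrative.

def sh_l1_project(samples):
    """Project (direction, value) samples onto 4 L0/L1 coefficients."""
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for (x, y, z), v in samples:
        coeffs[0] += v        # constant band
        coeffs[1] += v * x    # linear bands
        coeffs[2] += v * y
        coeffs[3] += v * z
    n = len(samples)
    return [c / n for c in coeffs]

def sh_l1_eval(coeffs, normal):
    """Reconstruct incoming light along a surface normal."""
    x, y, z = normal
    return coeffs[0] + coeffs[1] * x + coeffs[2] * y + coeffs[3] * z

# Bright light from above, dim bounce from below:
probe = sh_l1_project([((0, 1, 0), 1.0), ((0, -1, 0), 0.2)])
up = sh_l1_eval(probe, (0, 1, 0))     # surface facing the sky
down = sh_l1_eval(probe, (0, -1, 0))  # surface facing the floor
```

Note how four numbers per channel are enough to make upward-facing surfaces brighter than downward-facing ones, which is exactly the hemispheric-ambient behaviour from the pinned comment, generalised to any direction.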
Hey! Great video. Game dev here. This is actually how most 3D games handle their lighting even today. Of course there are things like PBR and raytracing now, but I had no idea sonic team came up with these techniques.
I’m thoroughly amazed at the amount of work that goes into games and how they’re able to figure out workarounds whenever they come up against obstacles. It’s funny seeing how much detail we may overlook but that is needed to make the game feel complete.
There was a point in this video where I broke and burst out laughing, because it suddenly hit me how monumentally clueless most people are when they talk about Sonic. There's so much depth to the development of these games, how can anyone just break it down to, "shit" or "mid?" it kinda makes me sad.
Anyways, thanks for making videos like this, it's nice hearing someone talk about these games below the surface. Not enough people do that.
That's why I don't engage in online Sonic discussions, it's just mindless nonsense
Glad you enjoyed it :)
game development is an art, that doesn't mean you can't judge one on its merit
Nice to see the step by step rendering shows by that NVidia Tool. In general many other game engines used very similar techniques also back then.
The reason why it seems like smaller objects are being rendered first is probably just because there are more of them. The depth buffer helps to avoid rendering something if there's already something in front of it that was already rendered. Sorting the objects from closest to furthest is an O(n log n) operation, which means it would be slower than just having occasional wasted work.
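The trade-off described here can be sketched directly (toy Python with invented fragment data, not real renderer code):

```python
# Tiny sketch of why a depth buffer removes the need to sort:
# draw order doesn't matter for correctness, only for how much
# shading work gets wasted on hidden fragments.

def render(fragments):
    """fragments: list of (pixel, depth, colour) in draw order."""
    depth_buffer = {}   # pixel -> nearest depth seen so far
    colour_buffer = {}  # pixel -> colour of nearest fragment
    shaded = 0
    for pixel, depth, colour in fragments:
        if pixel in depth_buffer and depth_buffer[pixel] <= depth:
            continue          # hidden: rejected before shading
        depth_buffer[pixel] = depth
        colour_buffer[pixel] = colour
        shaded += 1           # this fragment paid full shading cost
    return colour_buffer, shaded

# Same scene, two draw orders -> same image, different wasted work
scene = [("p", 5.0, "far"), ("p", 1.0, "near")]
img_a, work_a = render(scene)                  # back-to-front: shades twice
img_b, work_b = render(list(reversed(scene)))  # front-to-back: shades once
```

Both orders produce the same final image; sorting only reduces the wasted shading, which is why engines can skip the O(n log n) sort entirely.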
I've always loved hearing about the Hedgehog engine. Obviously they used baked lighting to get most of their lighting data because doing that stuff in real time was not feasible then. Shadow maps were new then as before in the last gen they were mostly static and prebaked into textures. The new baked GI system plus use of shadow maps really helped. Plus the models were all pretty high poly, especially compared to the models used in the PS2, Gamecube, and Xbox. Sonic's model in cutscenes was so good polygon edges were difficult to see.
It’s crazy how innovative and talented Sonic Team used to be when it came to graphics/lighting.
Was? Sonic Forces' lighting is really good. Frontiers is an exception with how eh it looks
@@waffles245 may I ask why frontiers can’t capture the uniqueness similar to unleashed?
it's interesting lumen in unreal engine 5 basically uses a similar rendering ideology sonic team came up with decades ago proving just how ingenious they were, a lot of these techniques did exist back then but they still managed to revolutionize them and make a stunning game
What's being described here is more like the Lightmass system used in UE3 and UE4.
@@Thamstras i was thinking more of how lumen uses voxels to calculate lighting in real time comparable to the system hedgehog engine uses to apply gi to sonic
14:32 small clarification, mip maps don’t have a set amount; in engines like Unity I’m pretty sure it makes mip map levels by halving the resolution until it reaches 1x1 and storing each one
You can set the amount when exporting images, but the default is the full chain
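A quick sketch of the halving rule (assuming the common convention that a full chain runs all the way down to 1x1, which gives floor(log2(max(w, h))) + 1 levels):

```python
# Sketch of how a full mip chain is sized: halve each axis
# (rounding down, minimum 1) until you reach 1x1. The level count
# matches the usual formula floor(log2(max(w, h))) + 1.
import math

def mip_chain(width, height):
    levels = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        levels.append((width, height))
    return levels

chain = mip_chain(1024, 512)
count = math.floor(math.log2(max(1024, 512))) + 1  # 11 levels
```

So a 1024x512 texture carries 11 mips, and the non-square axis just bottoms out at 1 early.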
8:48 I believe the hedgehog engine does do this with a max cap of 4 scatter bounces strictly reserved for Sonic's model handling occlusion light. I think the environment only used 1-2 bounces.
There also was an order to the lighting layers being rendered: ambient/fill light first, occlusion light from the environment, direct hard light, and Sonic's occlusion light back onto the environment (this is why Sonic is fully rendered at 10:27 while his reflected light has not yet been applied to the environment - it's really not noticeable unless Sonic is in direct hard light and close to an object).
Water does take the lighting engine into account and, due to its unpredictable motion, is technically rendered in real time. So water gets re-rendered tens of times over once disturbed from its "default" pattern. This is why the game lags so much in Adabat.
so this is why unleashed has aged so well visually! I am a sucker for anything relating to rendering in computer graphics and this video explains all of the concepts it talks about very well. This video and your channel deserve way more subs!
Neat stuff! I remember the Unleashed GDC talk, it was my obsession for a good year tbh.
Have you considered hitting up some people in the modding scene with more in-depth understandings of the engine's low-level functions, architecture / design, etc? It'd be a lot more technical but I think that has some value for people who're into that kinda thing (like "how is a CPU even made?" kinda stuff). I know the author of the Gens Freecam mod you've used has a lot of insight in Gens specifically, and a few people who mod Lost World talk all the time about how that engine is WILDLY different from Unleashed and Gens under the hood. Would be cool, maybe!
Great video. Was interested the whole time. Didn't think I would be so very interested! Love learning about this.
Really cool video! Yoshihisa Hashimoto was kind of a pioneer in a lot of ways and I wish he'd return. A lot of work went into the Hedgehog Engine and I think it could look great if pushed to its full potential.
This is some pretty fascinating stuff. Great video!
Great insights and perfectly explained. Amazing job
I mean, baked Lightmap GI is pretty old hat now, but... back in 2006!? Positively MINDBLOWING
Even then it's been well established at that point
don't half life 2 and counter strike source use lightmap GI?
Established even in 2006. We did that all the way back in 1994-95.
"How Does the Hedgehog Engine Work?"
With hedgehog, I'd assume.
Random fact I want to share about Frontiers: it actually *does* support baked GI lightmaps/shadowmaps, and pre-baked light-field. The lightmaps are used in the indoor/temple sections of Ares/Chaos/Ouranos Island for example. And while shadowmaps and pre-baked light-field aren't used in the game at all (since the light-field's colored in real-time), I've ported some stages from older HE2-based games (M&S Tokyo 2020, Sakura Wars, and Olympics 2020 (the one with humans) specifically) and since the file formats are identical, the game has no issue with reading them.
That out of the way, I wanted to say, this was quite well-researched, and def an interesting watch. Especially as someone who actually tends to like working with lighting/materials.
I actually had no idea about the GDC talk! I'll be watching that next!
I know I learned something here today. This was a really interesting video essay. I was always fascinated with how games like Sonic Unleashed and Generations looked so detailed and how they have aged well visually. So it’s cool to finally have a summary of how the Hedgehog Engine and Engine 2 work.
This video was a really cool dive into this topic, I had always wanted to know more about the Hedgehog Engine.
Holy crap, a fellow portuguese sonic fan!
Excelente video, continua o ótimo trabalho!
Ganhaste um subscritor.
Thanks for converting some of the tech paper that came a long time ago to this video.
Man, the Hedgehog Engine really is something, isn't it?
I mean that in a good way, of course. Nicely done on the video!
Man, this is super cool, thank you a lot, now I am able to appreciate even more the amazing looks of Sonic Unleashed
amazing video! it's so interesting to learn about how much they went through just to achieve their goal of that pixar look, and the engine has come so far, just look at shadow generations! it looks gorgeous so far and im so excited!
mindblowing. Always thought this engine looked incredible. Had no idea they literally blew the power in their office making it
Thank for this video! It's great to have a precise and concise video to point to for "How the Hedgehog Engine" worked. Really well done!
As a computer science student taking a computer graphics course this video is really interesting and informative, great job!
THE GOAT IS BACK AGAIN
this makes me appreciate Unleashed even more. people may not realize it, but this actually pinpoints itself as a major part of Game History.
There's somethin bout Unleashed's lightin that I heavily prefer over Forces or Frontiers. It's still the most beautiful-lookin game to me graphically. Thx for the rundown
Just got back from watching the sonic symphony, this is the perfect video to watch after a long trip!
I've seen plenty of videos talking about the Source Engine and its features and I loved those kinds of videos, so this will be a fun watch
0:34 I remember the Black Knight stage port
Fascinating info! Thanks for the explanation!
Man, it's really interesting to learn about how the Hedgehog Engine works. Really good video btw 👍
My Xbox would smell of burning plastic whenever I played this and I had to actually open my window so I could breathe
Another great video. I was always interested in how it all worked and you explained it perfectly.
Amazing how they figured this all out themselves, major props
bro you know its a good day when cifesk uploads
Man no wonder the graphical mods for frontiers can push things so far since it has an incredibly solid foundation
This video explains how much effort Sonic Team put into making this engine, and also why Sonic Frontiers has so much pop-in while exploring the islands.
it's crazy to think they worked so hard to make this engine what it is, just for the player to blaze through the stage so fast they barely see anything with the motion blur
15:54 IIRC backface culling only renders a polygon if its vertices wind a particular way on screen (clockwise or counter-clockwise, depending on the engine's convention). The algorithm is very simple but saves a LOT of performance!
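The winding test really is just a couple of multiplications; here's a minimal sketch (which winding counts as front-facing is a per-engine convention, e.g. OpenGL defaults to counter-clockwise front faces):

```python
# Minimal sketch of the winding test behind backface culling: the
# sign of the 2D cross product of two triangle edges says which way
# the screen-space vertices wind. Which winding counts as "front"
# is just a convention.

def is_front_facing(a, b, c, front_is_ccw=True):
    """a, b, c: (x, y) vertices already projected to the screen."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    # z component of (b - a) x (c - a); > 0 means counter-clockwise
    # in a y-up coordinate system
    cross = (bx - ax) * (cy - ay) - (by - ay) * (cx - ax)
    return cross > 0 if front_is_ccw else cross < 0

tri = [(0, 0), (1, 0), (0, 1)]        # counter-clockwise on screen
front = is_front_facing(*tri)         # drawn
back = is_front_facing(*tri[::-1])    # same triangle seen from behind: culled
```

Two multiplies and a subtraction per triangle, which is why rejecting roughly half the scene this way is basically free.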
An entire engine made by hedgehogs. Groundbreaking stuff. Science is amazing!
Very cool, lots of scientific knowledge here!
Always been a fan of the Hedgehog Engine. Amazing work
Sonic Team doesn't have to make an engine, but they did it anyway, and that's dedication. 👍🏻
They absolutely had to. RenderWare wasn’t cutting it and had been largely dropped by the rest of the industry at that point in time, and if they had attempted to use Unreal Engine 3 the results would have been horrendous. Making their own tech-pushing engine custom-fit to their needs is what they had been doing for years, until RenderWare derailed things
Except that in 2005~2008 they had to make one; there clearly weren't as many choices as there are today
Plus relying on RenderWare after it had been bought by EA would be... something I can only describe as... mostly _bad._
And it is a company. Nintendo makes entire consoles, and they get more out of it than "just do all that licensed for the Bill Gates ecosystem, bro, it's already further ahead".
Not every company is your little sister's lemonade stand.
@@sboinkthelegday3892 does taking nvidia's homework and modifying it with custom firmware count tho?..
like, yeah, the switch is okay, but most of their work was the software side of things.
i guess there's the joycons, which are unique somewhat.
YEEEEEEEEEEEEEEEES, A VIDEO THAT EXPLAINS THIS ENGINE THANKS!!!!!!
What you're seeing at 16:40 looks like a depth pre-pass (hard to say for certain without looking at NSight myself). This serves as a way to reduce fully-shaded overdraw in forward renderers by basically "pre-loading" the depth buffer. This allows you to then render the scene again, fully shaded this time, and using the pre-rendered depth buffer to then only shade each pixel once. In a sense, you could think of this as paying a small cost up front for pixel-perfect culling later.
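A depth pre-pass can be sketched as two loops (toy Python with made-up fragment data; not the engine's actual pipeline, just the idea described above):

```python
# Sketch of a depth pre-pass in a forward renderer: pass 1 writes
# only depth (cheap), pass 2 runs the expensive shading but only
# for fragments that exactly match the stored nearest depth, so
# each pixel is shaded at most once.

def depth_prepass(fragments):
    """Pass 1: depth only, no shading."""
    nearest = {}
    for pixel, depth, _ in fragments:
        if pixel not in nearest or depth < nearest[pixel]:
            nearest[pixel] = depth
    return nearest

def shading_pass(fragments, nearest):
    """Pass 2: shade only the visible (depth-equal) fragments."""
    image, shaded = {}, 0
    for pixel, depth, colour in fragments:
        if nearest[pixel] == depth:  # only the nearest fragment shades
            image[pixel] = colour
            shaded += 1
    return image, shaded

scene = [("p", 5.0, "far"), ("p", 1.0, "near"), ("q", 2.0, "mid")]
image, shaded = shading_pass(scene, depth_prepass(scene))
```

Three fragments go in, but only two get shaded: that's the "small cost up front for pixel-perfect culling later" trade.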
All i know about hedgehog engine is that Puyo Puyo Tetris 2 runs on it
Several years later, the mad people at Insomniac made Ratchet & Clank: Rift Apart, which can easily be described as Pixar-like. I love showing off that game to family, as they all say that it looks like a movie.
Yeah, it's just as groundbreaking in its era as this game was in its own. It's a shame it didn't sell that well.
very good video; although there are some small mistakes, you managed to explain everything in a way that someone who is curious but not knowledgeable enough could understand, and hopefully they'll go search more about how games in general work.
there are some interesting tools you can also use to check how the engine works and you could've used reshade to read the depth buffer from the game.
also keep in mind, the PS3 has 256MB of RAM and 256MB of VRAM; it's very impressive what they managed to pull off back then. Pretty sure they could've optimized the game to run better on it, but it's still interesting to see how they thought about improving performance back then.
well done :)
I noticed the lighting in Unleashed straight away as I was familiar with the lighting concept of radiosity, in real life, in paintings, and in ray tracing, and how it rarely got seen in any form in any games because of the huge amount of computation required. It worked so well in the Mediterranean-like streets of Unleashed. The gaming press seriously let us down here by completely not noticing/ignoring this visual achievement at all.
Ultimately, Sonic Generations made me a PC gamer because I couldn't stand being stuck with 30fps gameplay over and over again and wanted to see and play these at 60fps.
But for all the visual innovation of Unleashed, and the intelligence, capability and determination of the developers at Sega, I'm still stunned they thought the Werehog was a good idea and seriously went ahead with it! Surely they must have had some doubts at some point?!
Well, the Werehog seemed like a good idea *at the time*. In the late 2000s, God of War was hugely popular and the Werehog is very "inspired" by that.
there are people that liked the werehog thing
i actually didn't understand why they even did it at first, but it has grown on me over the years, lol..
6:39 impressive technical terms used here
good, unique content I could always come back to
Really informative video! MK8 also has baked GI lightmaps (HDR encoded via the alpha channel) and SH probes for GI lighting on moving objects and drivers.
Great analysis! Really impressive what they came up with for this engine.
Great video! I loved hearing about this.
So the real light transport problem is not that your light rays "multiply", it's because you'd have about 10^18 photons to simulate each frame in a "physical" simulation. That's too many! A good rule of thumb on what we can do today is still about one ray per pixel. So each ray models many photons hitting the same spot but at different tiny offsets, getting scattered in all different directions by the microscopic shape of the surface (this is diffuse scattering). Yes, we have to "spawn" more rays to model this on the renderer end, but that's still far, far fewer rays than there are photons in the scene - otherwise we'd be violating conservation of energy!
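A tiny Monte Carlo example of the "one ray stands in for many photons" point: estimating the diffuse hemisphere integral of cos(theta), which is exactly pi, from a modest ray budget instead of 10^18 photons. (Toy code, uniform hemisphere sampling; real renderers use smarter importance sampling.)

```python
# Toy illustration of "one ray models many photons": instead of
# simulating every photon, we Monte Carlo estimate the diffuse
# integral over the hemisphere, integral of cos(theta) dOmega = pi,
# with a modest number of random sample rays.
import math
import random

def estimate_cosine_integral(n_rays, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_rays):
        # for directions uniform over the hemisphere, z = cos(theta)
        # is itself uniform in [0, 1)
        z = rng.random()
        total += z
    # hemisphere solid angle is 2*pi; average cos(theta) is 1/2
    return (total / n_rays) * 2.0 * math.pi

estimate = estimate_cosine_integral(100_000)  # should land near pi
```

100,000 rays gets within a fraction of a percent of the exact answer, while a "physical" photon count for the same pixel would be astronomically larger.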
I hope the next sonic game shows this off more lol
really really really really really awesome video, thank you :)
I just realized how far ahead sonic unleashed was for its time.
This video is popping off! Good work bro 🙏
brilliant video brother! God bless!
Wow dude, thank you so much for this video. Though I don't know shit about game dev, it's really cool to see how the hedgehog engine works as it explains why hd games post unleashed came out with so much polish. Really great job here, congrats!
15:50 quick note - EVERY game engine since the dawn of time uses backface culling.
17:17 Most 3D game engines actually render opaque geometry front-to-back so the depth buffer can reject hidden pixels early; back-to-front ordering (the painter's algorithm) is mainly needed for transparent objects, where blend order changes the result.
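Back-to-front ordering genuinely matters for alpha blending, where the "over" operator isn't commutative (opaque geometry is usually left to the depth buffer instead). A toy sketch on single grey values:

```python
# Sketch of why draw order matters for blending: alpha-blended
# layers must be composited back-to-front (painter's algorithm)
# because the "over" operator is not commutative.

def blend_over(dst, src, alpha):
    """Standard 'over' operator on a single grey value."""
    return src * alpha + dst * (1.0 - alpha)

def composite(background, layers):
    """layers: (depth, value, alpha); farthest layers composite first."""
    result = background
    for _, value, alpha in sorted(layers, key=lambda l: -l[0]):
        result = blend_over(result, value, alpha)
    return result

# near white pane and far black pane over a grey background
glass = [(1.0, 1.0, 0.5), (2.0, 0.0, 0.5)]
correct = composite(0.5, glass)
# drawing the near pane first gives a visibly different (wrong) result
wrong = blend_over(blend_over(0.5, 1.0, 0.5), 0.0, 0.5)
```

Same two panes, two orders, two different pixels: that's the whole reason engines sort their transparent draws.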
Dude, you are awesome for making this video.
I've always loved the look of these games, and now I understand more about how they are able to look so fabulous.
Gracias papi chulo.
Unleashed is what made me want to study lighting and games
The background music needs to be way louder, but keep up the good work. I like the formula of the videos. 👍
another video, another banger!
This video is super informative, great video!!🔥🔥
great video! thank you for your work
hedgehog engine developers:
"DeltaTime? Never heard of it!"
Sonic on slow pc: "Uyaaa aaammm sssppieeddd"
I was surprised by how good Generations looked and how well it ran back when it launched. Years later it still held up great.
are you telling me sonic team had to figure out what a lightmap is
This is an interesting video. I'd heard about the Hedgehog Engine but hadn't watched many videos that go into more depth on it, and this was compelling. I finished Unleashed yesterday, so it's nice to see more recommendations on it. To think they made a cool new engine in the two years after 06.
That is extreme dedication from Sonic Team.
Lmao, I literally presented the Hedgehog Engine at my school as a 20-minute speech presentation to validate one of my years.
And this video confirms that I haven't fucked up lol.
PRETTY nice video, enjoyed it!
Super Mario Galaxy, Sonic Unleashed, Mario Kart 8(mid gameplay aside) and Black Ops 3 have the BEST engines
I wanted a full video on Hedgehog engine for so long thank you for making it!!
Halo 3 also has an amazing lighting engine!
Man, all these graphically amazing games with comparable lighting engines coming out around 2007-2008 is really strange. It's as if someone (3:00) scared these console-locked publishers into funding more advanced techniques out of fear of "falling behind in the race."
I really enjoy how all these dominoes fell just right so that 2004-2008 ended up producing the best games to ever exist, despite the technology limitations at the time too!
good call on Mario Kart 8, that game looks unreal (no pun intended)
also worth mentioning Battlefield 3. Hard to believe that game was made for seventh generation consoles.
5:13 realtime global illumination is still hard now! Unreal engine 5 has Lumen which is their solution, but it’s still very heavy
Hedgehog Engine, a 3D blast processing