@@ForsakenSanityGaming Haha, yeah, on desktop you can turn the audio on from the thumbnails on the main page. But be warned, it's either off or 100 percent volume level lol
The quality of animation is absolutely fantastic. The transition from being zoomed in on the screen and showing the pixels, and then seeing 'you' walk off set, idk it was just. fantastic.
Not sure... I'm getting early 2010's Hollywood impression of what they thought future animation might look like. it's kinda ropey, but I guess it's an amateur production rather than a studio like Cloud Imperium Games or Naughty Dog.
@@graxxor 100% mocap with cleanup and manual retargeting if any, the typical issues are clear as day to any mocap animator. source : I am a gameplay animator.
I think this video is totally fine for anyone who just likes to learn a little bit about color in general at a theoretical level, but there are some things I'd recommend you study further and implement in a future video. From my understanding (CG artist), these are the main things I'd recommend revisiting: 1: ACES is a color space, it's not tone mapping. The difference might seem like semantics, but it's important. Transforming from one space to another is a mathematical process and not open to interpretation. For example, there is only one valid way to go directly from ACES to sRGB. Tone mapping is mainly used to go from HDR to SDR and display it in an aesthetically pleasing manner. This can be subjective, and doesn't serve the same purpose. 2: ACES may have been originally created for filmmakers, but the idea was to make one intermediary colorspace that various sources could all adhere to, so that they would look and behave the same in the grading process. It just so happens that it's also linear (yeah, ACES is linear - a whole other can of worms on terminology there..) and therefore keeps all the values that you need to properly render the full dynamic range of the image, allowing you to get the nice highlights and colors. 3: I would never stop someone from advertising their own presets, but please don't gatekeep the information - just tell people that it's AgX. If I have misunderstood any concepts, professional colorists or graphics programmers are more than welcome to correct and/or add to this.
As a semi-professional colorist, I agree with what you said here. Although I am no guru on the subject of ACES, your description of it aligns with my own understanding of it much more closely than the description in the video.
Great info! In particular the point about this just being AGX, or perhaps something very similar. I would also agree that the usage of "ACES" here is a bit of a simplification, but the reality, unfortunately, is that there is a standard ACES tonemapper which has been in very common use in game engines and image processing suites. You can argue that it's incorrectly named, but it's absolutely a thing. The greater confusion, I think, is that this aforementioned "ACES tonemapper" is not nearly as universal as she implies. There are dozens of popular tonemappers, but only very recently have we begun to see decent ones being used for games.
3:16 if this video was uploaded in a HDR format it probably would've been a better demo to show the difference, yt does support it and it's very noticable on smartphones
But then it would have been subject to whatever color space conversion and tonemapper that UA-cam and/or the displaying device uses as well. Most devices support sRGB and include it in their gamuts, i.e. no clipping or tonemapping necessary.
@@stekeln Yes and no. SDR is as much a subject to inaccurate displays as HDR is, but one advantage it has is that it's clearly defined, while in SDR there are two commonly used gamma standards, pure gamma and sRGB piece-wise gamma, with some monitors using one and some using the other, and there's no way to know which one is the intended way to decode the colors, without author telling us which of those standards they used when they were making the video. If the video is uploaded in HDR, then it is known that youtube will convert it to SDR under the assumption that it's converting it to pure gamma. And as long as the video doesn't go extremely bright, meaning that it is graded the way HDR is supposed to be graded, the resulting SDR conversion will actually be really good, with result being no worse than if video was SDR in the first place.
My main issue with those "realistic tonemappings" is that with skylight, its always lit with no color, like it was permanentely overcast. looks nice under controlled situations, but the illusion falls flat once looking to a big light source
Do you mean that the global illumination looks desaturated, or that the sky itself appears colorless? Both are likely the result of the original artists trying to make the sky and GI look good using a tonemapper where bright blues appear more saturated, and then switching it out for another tonemapper.
@@stekeln I think he means to say that overcast sky creates a low contrast scenario, which is drastically easier to tonemap than a sunlit day would be, therefore much less compromises need to be made for the image to come close to what the eye would see in reality, making the final result more convincingly realistic than it otherwise would be.
The finger snap (0:16) between the two color spaces does a great job of showing that without design consideration, no matter what technique you use, your end product might not improve even if it's different. However, the hue shifting lights (7:06) scene is literally designed FOR the custom color space, including composition choices that maintain contrast, showing that when you make your scene to WORK with color space instead of despite it, you can achieve great results. In film, costumes and sets can be designed with the final color space taken into account (not that it happens all the time, on every production, or even successfully) but I think that pipeline is less present in video games. Some games, in some scenes, make great use of their color space (TLOU2 Subway) but color isn't usually given precedent over other design/dev constraints. With film, because the end product has an even greater emphasis on the image, I think it makes a lot of production and business sense to redesign scenes to better share the story/experience with the colors you want -- if game devs want to make their games look better the will have to account for their color space choices early in the development of the project instead of treating colors as a post processing issue that they will address near completion. To point, earlier versions of UNRECORD (2:00) had a muzzle flash that did not fit the aesthetic of the game's superb level design even if it was within the color space. The game has since removed that flash and done work on several assets that clashed with the photo-realism. The level the player walks through makes great use of light to dark values their color space can detail and found an aesthetic in the dilapidated building and overcast day that works well with the limited color saturations. In the same trailer there are green barrels, blue and red containers, and other objects that look glaringly out of place in their game (and are rightly not highlighted when people share clips of the project) because their color space struggles with assets not specifically designed for the color spaces limitations. This is a great video! I think your next video on this topic should include some of the design decisions and trade-offs you encounter using the new color space though.
A big thing about ACES is that it is not only a tonemapper, but an entire workflow that addresses more concerns then mentioned in the video. The biggest benefit of ACES is that it supports any display type, including any current and future HDR format, and any input gamut. Admittedly, it sounds a bit like magic to me but supposedly, the Reference Rendering Transform (RRT), the component that is most responsible for the ACES look, makes the colors translate perceptually similar to any display type, sRGB, DCI-P3 or whatever else we might want in the future, given that the correct ODT is used after it. I am all for alternatives or improvements to the RRT but if I had to choose between classic ACES or a solution that looks better on sRGB but only looks good on sRGB, I'm going for ACES no doubt
The RRT is pretty bad actually. Most movie studios invert it. The SDR Rec. 709 ODT is often inverted alongside it so studios can go with the following workflow: IDT -> LMTs and grading -> K1S1 or other custom LUT -> Inverse RRT & ODT -> RRT -> ODT instead of the prescribed ACES workflow: IDT -> LMTs and grading -> RRT -> ODT The workflow with the extra steps allow movie studios to outright replace the RRT and the SDR Rec 709 ODT while still checking the ACES box which is a requirement at Netflix.
Correct me if I got any of this wrong. @4:34 ACES sounds a lot like a "color management system" having to do with color space and color gamut. From what I've briefly read, tone mapping involves remapping HDR content to fit an SDR display, adjusting luminance, black levels, and to a lesser degree color saturation. A lot of the issues mentioned seem to relate to luminance and black level problems, At least based on some of the pictures shown in the video.
this was a really cool video. I'm coming from the video space, so it was an interesting angle as I delve in UE. Also, this is my first vid of yours. Very well made, and the whole virtual setup really allows for some great transitions, nice job!
@@vincentazara2947 From what my laymans eyes see she uses motion capturing for body movement (possibly an off the shelf VR-setup including goggles, controllers plus positional markers for the feet) The lip sync and facial expressions looks like AI interpreting the audio of the voicelines. The movement there is slightly dragging behind the audio. Whats notably absent is breathing of the character model. The ribcage and belly is an unmoving block.
@@rock962000 What tf is "real AI" supposed to mean? Do you mean AGI? - None here, it doesn't exist yet. Do you mean what's often referred to as "generative AI"? (wich is usually diffusion based algorythms that were trained via reinforcement learning) - None of those used here. Audio seems legit, visuals are UE. Maybe the background music? Judging by the hand tracking they use index controllers and the weight shifting is consistent with 3d markers. Pretty standart VR stuff. But both would be unusual for vtubers (who often feed cam footage through an algorithm to track movement and expressions, thus the flat characters with no hand gestures or reasonable weight shifts)
absolutely love the quality that went into this video, like the echo voice in the cave or the walking off set when you were on screen showing the pixels on screen. true information journalism right here
I'm too distracted by how good the video looks - editing, details, motioncapture and just how freaking cool this format is - to focus on and digest what is being said.
That's The coolest presentation of a video I've seen to date I believe. Giving the concept of being totally immersed in the actual examples is a great way to go.
back in the late 90's when I was in animation school, nothing like this was ever covered nor did the big programs like Houdini, Alias or Softimage have any kind of renderer or post renderer that took this topic into account.. good stuff, glad you've put it out there for the world.
@ianskinner1619 crazy how different times are...in high school in the early 00s we had access to terragen and poser, which were like incredible to me in 9th grade in 02, considering my modeling and animation experience at the time was limited to making character and weapon models for quake 3 arena and half life mods using Milk Shape 3D lol. By the time I got to college in 06 I got to see Maya, Cinema 4D, and Z Brush for the first time...damn I remember Z brush felt like wizardry.
Great content and delivery! But also, wow, the amount of work that went into some of the presentations here is, well, incredible. Just the one scene with the zoomed in TV pixels and then the change in sound... so well done. Smashing subscribe!
Ok first time seeing your content... Holey crap your ability to control your avatar is absolutely amazing! Obviously your speech and mouth arent quite there yet but your facial expressions, arm motions, body gestures.. man so amazing! Your teaching was also amazing and I loved learning about tonemapping.
Wonderfully presented. I'm that person who endlessly tweaks eq curves and dynamics in mixing amd mastering. Sort of the sound equivalent of tone mapping in a way I'd say. Just want to say i see the passion you have for this and thoroughly enjoy your content
I feel like my IQ just jumped up a couple of points! I am not a film maker or game developer, but it's always great to peak behind the curtain and see the tech they use to bring all their work to life! Speaking of life... Wow, Angelica! You are looking the BEST you ever have! The lip syncing, hair movement, facial expressions, eye movements, and speech have all VASTLY improved! You are a marvel, and a great spokesperson for tech that makes you possible. Keep up he great work!
This was really good info. Great in-depth explanations. As well as break downs and concept transitions and understanding. I liked how you came off screen from the ‘film shoot’ like they do in real-life.
Ok, this some of the most informative well written, blocked and executed videos i have seen in a long.. long time. Instant fan. Good luck on your project! Looking forward to your rapid rise.
I've dabbled with these colorspaces in Blender and they might be realistic, but they also look really washed out. I don't mind the ultra-high contrast, super saturated look of older media. And I am a big fan of stuff like Technicolor. If I want realistic, dull and washed out, I go outside in Winter in Germany or play a game from 2010. I like it, when stuff pops. But color theory is still interesting.
The actual topic and talking points were why I clicked on the video, and it was defined all very clearly. Yet I gotta say the animation storyboarding and all editing of this video is really amazing.
It depends. Small sensor devices such as smartphones and action cameras have a horrible hue resolution that they sacrificed to increase low light sensitivity, which would produce visible banding if they aimed for similar saturation like from photographic film. I regularly shoot with a Powershot G2 and an Optio W10 and the colours are actually quite strong, while being very film-like in colour, so they don't look artificially punched up.
I've been following this project for a few years now and I have to say how impressed I am with the progress you've made. In my opinion you've very nearly bridged the uncanny valley and made an avatar that wouldn't look out of place in the 3d reality our meat suits live in. On top of that, the voice you've crafted once had a slightly mechanical buzz behind it but sounds like a legit human voice now. I'm blown away and excited to see what can happen when commercial technology catches up with what you've been doing here.
Not sure I agree with this. The avatar is cool, don't get me wrong, but photorealistic it ain't. It's still very much in the "uncanny valley" of video game graphics with limited motion and face capture. It looks like the CGI puppet that it is. That's not a dig at the model, but let's not get ahead of ourselves.
Something I noticed about the custom rainbow chart: if you look at the top third, the colored peaks appear to shrink, and if you look at the bottom third, they grow. The effect is almost nonexistent in the ACES mapping. I think that shows another reason why it looks so realistic. Without changing the image, it's able to change our perception of the colorspace just by looking at regions with more illumination.
Your best piece of content yet, after following your progress for years. Well done. It was assured, informative, and a thoroughly good watch. This one might blow up.
I feel like your videos at this point are themselves blurring the edges of reality. It's not just that Angelica is becoming more and more realistic, both visually and from an auditory perspective, but the actual composition of this video is so polished, yet so meta. A computer generated character moving in and out of computer generated worlds explaining to me how lighting of computer generated characters and worlds work. The actual structure of the video is flawless, the way Angelica moves in and out of the scene makes it sometimes seams like she's a part of the worlds she's talking to us about, but sometimes an observer on the outside looking in (like we are). The script is tight and well pulled together, it actually feels like something I might have watched in class (but actually enjoyed). I just love this. I love all of it. That said, it was one thing when this channel was mostly silly Omegle trolls, but I almost feel like now the mention of "spicy" content takes away from the other remarkable aspects of your content. I'm not judging, just saying it feels SO disconnected from the otherwise high-brow, thoughtful content you've been doing for a while now. Like, I would love for you to do some kind of cross over with Captain Disillusion, but I suspect the spicy content might take such collaborations off the table... but maybe that juxtaposition is just another part of the surreality of Angelica. Keep doing what you want and may the AI Angel project live long and prosper.
I agree with everything you said, except about the "spicy content". I think if people want that, it's perfectly fine. If they don't like it they don't have to see it.
@@theRPGmaster Yeah, I grant that it's not super "in your face" or anything, it just feels weird. It's like if someone did thoughtful political commentary or had a channel about how they love animals and then closed with an OF pitch. I'm not judging it from a moral perspective (and I'm sure it helps support the channel), but it feels disconnected to me. Like the streams are being crossed. My observation is: we're talking about two VERY different products here.
Everything about this video is amazing. I'm so impressed and amazed and clueless how you accomplished this masterpiece of video. The movement of the character is almost life-like. Just amazing
Very cool info. Although I will say, I don't really see this kind of tone mapping as objectively better. I mean it clearly is if the main goal is photo realism. But like film, games are often a mode of story telling. And just as I get why filmmakers would want to favor saturation in a lot of cases to tell their story, I also get why game devs would too. But its definitely good that this is being understood so that devs can make the choice that's appropriate for what they're looking to achieve, vs just being stuck going down one path because its the only one they really know.
To me what really sold me was the scene of the color light shifting. In the original ACES it felt over saturated at times. And I know I've experienced some of issue of using color lights not really illuminating things the way I want.
@@84bitDev On the other hand ACES is able to display super saturated colors. Like neon or like pure 100% red lights etc. This tone mapper how ever sacrifices maximum possible saturation for smoother transitions but looks more gray as a result. It's simple not capable of reproducing full saturation colors which ACES can do at least. And in ACES you can hold back on the saturation of the colored light or surface materials if you want to.
I think you could shift the tonemapper when you want super saturated colors. that could be a nice effect, like for example, your character has a vision boosting effect, you switch back to ACES or another saturation biased mapping, and get blasted with color, or you could do it when hit or something. or you could map it linearly, and saturate more as you build combos.
@@SuperemeRed color is very context dependent. I bet it is still possible to make things look super saturated by making the surrounding things look a bit more grey.
Wow, you actually have a beautiful voice and a really clear way of communicating. I almost clicked away when I realised that yourr name was "Ai" something, this is because the plague of bad AI content on youtube lately. But I subscribed to you now.
Sooooo good bro!! Color mapping has been a sore topic for photographer and videographers for eons. Add to that game devs and sfx teams. I really dig how you interacted with the topic, explained it with actual examples and the camera tap was perfect too! SO GOOD! The voice jives with her teeth and is nearly flawless and authentic sounding. Catch lights in her eyes could really put chefs kiss on the whole presentation. I also appreciate the skin texture, hadn’t noticed that before. Subtle and perfect. Thank you again for all the INSANE hard work.
Damn, this is all so damn good. I should support you on Patreon, even without the saucy stuff, but I won't say no to a bonus. You're doing amazing work, Angel.
Weird simplification of color space transform to sell a product. ACES was meant as a way to convert all camera's unique color spaces into a standardized color space to help preserve the intent of the cinematographers, it is then used as an intermediate color space and transformed again into the final deliverable color space (eg. REC.709 or REC.2020, P3 DCI etc). So the goal of adding it to unreal was to allow the generated content to be presented in a consistent color space for later processing, not to create a final look. From the looks of things, the demo you are showing is just an ACES rec 709 transform, which is an incredibly limited color space, though what 90% of content videos use, even this one.
@@AiAngel In color tab you have to put one color transform node first, going from what you rendered with (probably ACES for color space and input gamma), to davinci wide gamut / davinci intermediate, then add your color correction nodes after that if needed (here you are in the gigantic sweet sweet davinci color space and log gamma! can't break the image). Then, you have to use as a last node another color transform to go from davinci wide gamut / intermediate to what you want as deliverable (probably REC 709 / gamma 2.2 or sRGB). for the desaturated look with increase of light level, it will come naturally, but dont forget to up the input of tone mapping (custom max input) to the max (10000) and check apply forward OOTF. this is very close to AgX. now, if you want to desaturate the high lights even more, use the curve of sat. vs lum. between the two color space transfor mode (in davinci wide gamut color space), this is perfect for this! if you don't have to color correct, only one color transform node will do (from ACES to rec 709/gamma 2.2).
I am a total beginner in this field, and a lot of things were difficult for me to understand, but your content gave me a huge overview. I am totally wowed by your way of presentation. Your hard work is clearly seen in your content.
This video popped up in my feed and decided to click even though it's on the periphery of my interests. Even though I knew very little about the subject matter you did a fantastic job of explaining it. I hope you choose to do more educational videos. You're great at it. Also, please consider that there's a huge potential audience of people who don't understand what you do but would like to learn more. Case in point, I was absolutely amazed by your avatar. It moved and spoke in such natural ways that it really felt like a person was engaging me but there was no uncanny valley weirdness in the experience. It felt a bit like magic.
Do you do the body animations with mocap, or face recordings? As a beginner animator I would be interested in how it's done. Also something similar for Blender would likely interest millions of 3D Artists. Edit: There is also a good tall about this Topic on the Blender channel called "Why Color Management Absolutely Su..s"
Good video but it has a few errors: * Mapping from a wider color gamut to sRGB is gamut compression, not tonemapping. The ACES RRT (the actual tonemapping curve) expects input in the ACES AP1 gamut and the input is dependent on the color gamut of your assets. Some engines like Unity will use AP1 when ACES tonemapping is used but fall back to sRGB for other curves or in the pre-ACES era, just render in sRGB gamut (*linear* sRGB that is), so gamut compression would not be necessary at all. Color gamuts and color spaces are also different things which adds to the confusion * Tonemapping refers to compressing dynamic range from [0-arbitrary_limit] to [0-1] which might not relate to color at all, tonemapping luminance and handling highlights separately is a pretty valid approach. What brightness 1 is supposed to be can also be wildly different, depending on monitor brightness, HDR standard you're delivering for, if any, etc. ACES is a full post processing pipeline, with 2 color gamuts, multiple color spaces that use said gamuts, input transforms, a tonemapping curve, output transforms. The whole system is honestly a pain in the ass for use in games because of the funky white point they chose for their gamuts. Also, since people like reinventing things, there's now many terms for it: tonescale, display transform, etc, etc * Tonemapping in games itself predates modern workflows by a lot. Formulas like exponential [1 - e^(-x)] and simple Reinhard [x / (1+x)] have been around since the 2000s. The Uncharted 2 curve by John Hable still sees use today. And a stupid tangent, but RGB rendering of any kind will be 'hacking' color, after all it's a mathematical model of how we see things, not how light itself works. Spectral renders is where the real deal is at, though it's decades away from being usable in real time.
Computer Technician here: This was REALLY interesting! I would have loved to see you go deeper into the differences between LED, and TFT, even OLED screens and the way the process color and how the various tone mappings affect each type of screen.
@@LagrangePoint0 probably some TTS in combination with an RVC model. If not, something similar to ElevenLabs. The benefit of the first one is that you can get pretty good results with little latency, depending on the model. You can also ditch the TTS, and use the RVC on your voice directly. Interesting stuff.
dude i found this really interesting. As an every now and then viewer, if you made this kind of stuff more im pretty sure id start tunining in a lot more. Great work keep it up
Really it is still trash and the cyberpunk thing is just a bunch of bokeh and going hard on the reflections... it works in very few areas. Most of the time it just looks like a more grey cyberpunk.
I'm not comfortable with this trend at all. Remember how we had grey and brown near-monochrome era, end of ps2 era, start of xbox360. Then we had shiny era, where everyone's skin had way too much specular. Now we're entering the horribly desaturated era. I'm noticing it with modern digital cameras as well, which is why i photograph with ones from early and mid 2000s, the colour feels just right, emulates film just right, it's so satisfying, there's everything, it's consistent, it's muted when it has to be, it's super colour punchy when it has to be. The mod tonemapper in Cyberpunk well that's just the ps2 trick all over again, feels like. It doesn't look photographic at all. An influence today are smartphone, surveillance and action cameras which have horrible hue resolution that they sacrificed to light scavenging and would produce visible hue banding if they were tuned to a similar saturation as you expect from film. I don't feel like games should emulate that.
A good point! Nobody wants the "Grey-Brown Grit" color grading coming up again. Also, that 'shiny skin' era was *terrible* . That being said, I'd still take "desaturated but looks more like what I see IRL" over the "super saturated but terrible luminance" that I've been seeing on screens these past few years. There *has* to be some happy medium though, I refuse to believe that *nobody* knows how to produce images in a game engine that look like what I see every day with my Mk1 eyeball.
@@brentogara A screen is incapable of reproducing the latitude of light that you experience in real life. And if a screen is bad, paper is worse. Photo paper is worse than screen, and magazine print is worse than that.
Yeah! Tech pioneering, beautiful visuals and engaging discussions, just like the old times. I don't do computer graphics myself, but I do paint, so every bit of colour theory is useful, especially in your very professional presentation. Thank you! Wonderful work!
ACES _is_ linear. Linear color space can do glow just fine, the issue is whether there is enough room for sufficient exposure and appropriate tonemapping to return it to the narrow dynamic range of your display.
@@JayBeKey is not a she lmao, when AI voice wasnt a thing they had to use some morph dunno if in real time or edited but clearly there are guys behind this, just to imagine there were also +18 pics of this
Do you live in an area where there is no colorful spring and summer? Your colors look muted compared to reality, unless we're to assume the weather was dreary.
It looks more like "Common" Light be it hazy or cloudy days or indoor not lit like a film production. Less saturation but more depth across a gradient.
great video, but the motion capture is a bit jittery, that got me a bit distracted, maybe see if there's a way to smooth that a bit (ment as constructive criticism)
Your 3D character looks very uncanny valley. You should probably tint your shadows warmer (they're too grey) and add subsurface scattering to your skin material. Also add a bit of orange/red light fall-off between your lights and shadows.
I feel like this could be useful even for paintings that I do in real life. They look a look better in real life than in pictures and I can never get the image on the screen to look quite as good.
This is for "photorealism", but film and media don't tend to be actually, let's say - eyeballrealimsm. Over the course of a century we've come to expect a certain look from our media - and it's not a bad thing to change the industry and buck norms, but I doubt the majority of people want the ultraeyeballrealism look over a more filmic look.
3:55 - Dude, this is so many levels, layers and skill of content creation. Having been a subscriber since like 8 years back, you sure never disappoint.
I wish this was released a month ago. I just delivered a rendering project two days with an issue where changing color lights bouncing off a surface being too 'step' like. Your custom tone mapping would have solved this problem. Instead of delivering a video rendered in engine, I ended up using several screenshots and transitioning them manually in a video editor.
I think something has been lost in this change... Everything seems much flatter and paler, without intensity. I would like to see a middle ground between the two.
Intriguing. As someone with one of the few cameras capable of outputting Linear Raw direct from the sensor, I'd be interested in trying that Tone Mapper in a film project, and seeing how well it would be displayed on Digital Cinema Projectors.
Severe lack of jiggle physics 0/10... Seriously though great video, The production value here is top tier... That mixed with the narrator's enthusiasm makes this educational video very entertaining to watch. I Look forward to seeing what else this channel has to offer :)
Thank you for this. I been an environment Artist in Unreal for years and notices this many times as I try to get the correct pallet for my projects. I will have to start implementing this into my current project. again. thank you.
good video. ive been a 3D artist for like half my life, and started jumping into unity for devving and modding games for the past few years. its always blown me away how important color management is, and how much its blown off to the side for games. the source engine's dynamic tonemapper keeps half-life 2 looking better than most games today, and then with half-life: alyx/source 2 they only improved upon it while still employing most of the same basic lighting and rendering techniques from 20 years ago. and half-life: alyx still looks better than, every game IMO(while being vr!!!!!). im glad this video is out to inform newer devs about this stuff, and even i knowing about this concept myself for a while, havent really been able to apply it to my own work (especially when it comes to making my own tonemapper). one thing im interested in hearing more about: colors in textures, and how that can also affect the image overall. nowadays it seems standard to try to be as close to "ground truth" or as flat as possible with textures to let lighting, and other texture maps like normals and occlusion/roughness/metallic do the heavy lifting, but early games with straight photo textures can still look pretty damn good. so i would be interested in seeing research into texture color management and how that can affect perceived image quality.
percieved brightness is a problem with colour pickers for illustration too. Cycle through hues and tones are all over the shop, I'd love a checkbox that keeps perceived brightness relative rather than shifting so much through the hues.
Possibly the best explanation of tone mapping I've ever seen. Explaining this inside of the game engine and having that side by side for ACES vs your tone map is crazy!
Well done and very well explained even a noob at color differences like myself can follow this explanation. Also amazing work on the animated model and the movement mapping you did with it. Amazingly well done work.
This series of informational videos are probably some of the most engaging I have ever seen. The way that you're able to explain a concept and then immediately follow it with a literal example of it in action is equal parts fascinating and satisfying... I don't even have an aspiration to apply these concepts to anything but these videos make the process of learning them so enjoyable.
"Beauty is in the eyes of the beholder." I don't know if your tone mapping work is based on scientific papers that need to be validated by peers. But in my opinion it is a very very good job, go ahead with what you believe and see. In any case, you do a tremendous job editing the video, congratulations, it is a real pleasure to see such a well-done and educational job. Bravo.
Nice work, a complex subject explained in about as a simple way as possible. The only limiting factor I can foresee is the screen as most gamers won’t calibrate their monitor or TV which is why most assume sRGB as a target. The next leap will be full merging of ICC profiles with BxDF for textures for accurate display for high quality car paints like colour flip and pearlescent effects.
I gotta say, I'm more a fan of aces and higher saturation. Especially for games. "Realistic" today looks as exciting as a soviet apartment cube. That said, your production levels and quality for this video is absolutely amazing.
I mean, unless your goal is to look like a 'cinematic' movie. Just look outside. You can see for yourself whether it looks more close to what your eyes see in the real world or not. Generally it will, because ACES is very saturated, real life can be brightly coloured too but it's not so overbearing IRL and most things aren't like that. Plus IRL has a blue tint, not a green tint like CP2077 normally has.
Not sure how much further your expertise extends, but I'd love more videos explaining technical things like this, you broke it down into multipled digestable formats and that's super helpful. Thanks a lot and fantastic job on the UE5 tonemapper!
This breakdown is incredible-so informative and well-explained! I never realized just how pivotal tone mapping is to achieving realistic visuals in games. The comparison with Hollywood standards and the deep dive into color behavior were eye-opening. Definitely need more videos like this; you make complex topics so engaging and easy to follow. Keep them coming!
Me: *Clicks on the audio on the thumbnail*
My living room speakers at 9am: "POLYGONS PER NIPPLE!"
I don't see them they are probably occlusion culled.
It's the new version of : a giant horse **** weighs over 11 pounds
"Audio on the thumbnail"???
@@ForsakenSanityGaming Haha, yeah, on desktop you can turn the audio on from the thumbnails on the main page. But be warned, it's either off or 100 percent volume level lol
@@brando3342 Noope I wrote this from desktop, no thumbnail audio whatsoever for me!!
The quality of animation is absolutely fantastic. The transition from being zoomed in on the screen and showing the pixels, and then seeing 'you' walk off set, idk it was just. fantastic.
Is it mocap? Crazy if manual.
@@xentiment6581 mocap is a form of animation
Not sure... I'm getting early 2010's Hollywood impression of what they thought future animation might look like. it's kinda ropey, but I guess it's an amateur production rather than a studio like Cloud Imperium Games or Naughty Dog.
@@graxxor 100% mocap with cleanup and manual retargeting if any, the typical issues are clear as day to any mocap animator. source : I am a gameplay animator.
I think this video is totally fine for anyone who just likes to learn a little bit about color in general at a theoretical level, but there are some things I'd recommend you study further and implement in a future video. From my understanding (CG artist), these are the main things I'd recommend revisiting:
1: ACES is a color space, it's not tone mapping. The difference might seem like semantics, but it's important. Transforming from one space to another is a mathematical process and not open to interpretation. For example, there is only one valid way to go directly from ACES to sRGB. Tone mapping is mainly used to go from HDR to SDR and display it in an aesthetically pleasing manner. This can be subjective, and doesn't serve the same purpose.
2: ACES may have been originally created for filmmakers, but the idea was to make one intermediary colorspace that various sources could all adhere to, so that they would look and behave the same in the grading process. It just so happens that it's also linear (yeah, ACES is linear - a whole other can of worms on terminology there..) and therefore keeps all the values that you need to properly render the full dynamic range of the image, allowing you to get the nice highlights and colors.
3: I would never stop someone from advertising their own presets, but please don't gatekeep the information - just tell people that it's AgX.
If I have misunderstood any concepts, professional colorists or graphics programmers are more than welcome to correct and/or add to this.
This
As a semi-professional colorist, I agree with what you said here. Although I am no guru on the subject of ACES, your description of it aligns with my own understanding of it much more closely than the description in the video.
Great info! In particular the point about this just being AGX, or perhaps something very similar.
I would also agree that the usage of "ACES" here is a bit of a simplification, but the reality, unfortunately, is that there is a standard ACES tonemapper which has been in very common use in game engines and image processing suites. You can argue that it's incorrectly named, but it's absolutely a thing.
The greater confusion, I think, is that this aforementioned "ACES tonemapper" is not nearly as universal as she implies. There are dozens of popular tonemappers, but only very recently have we begun to see decent ones being used for games.
Fully agreed. I watched this Video and asked myself why shes spreading misinformation. Aces is a colorspace not a tonemapper.
@@SlugsnetGamescame to the comments because I was confused about the same thing
3:16 if this video was uploaded in a HDR format it probably would've been a better demo to show the difference, yt does support it and it's very noticable on smartphones
But then it would have been subject to whatever color space conversion and tonemapper that UA-cam and/or the displaying device uses as well. Most devices support sRGB and include it in their gamuts, i.e. no clipping or tonemapping necessary.
@@stekeln Yes and no. SDR is as much a subject to inaccurate displays as HDR is, but one advantage it has is that it's clearly defined, while in SDR there are two commonly used gamma standards, pure gamma and sRGB piece-wise gamma, with some monitors using one and some using the other, and there's no way to know which one is the intended way to decode the colors, without author telling us which of those standards they used when they were making the video. If the video is uploaded in HDR, then it is known that youtube will convert it to SDR under the assumption that it's converting it to pure gamma. And as long as the video doesn't go extremely bright, meaning that it is graded the way HDR is supposed to be graded, the resulting SDR conversion will actually be really good, with result being no worse than if video was SDR in the first place.
My main issue with those "realistic tonemappings" is that with skylight, its always lit with no color, like it was permanentely overcast. looks nice under controlled situations, but the illusion falls flat once looking to a big light source
I was comeing in here to say this ^
Do you mean that the global illumination looks desaturated, or that the sky itself appears colorless? Both are likely the result of the original artists trying to make the sky and GI look good using a tonemapper where bright blues appear more saturated, and then switching it out for another tonemapper.
@@stekeln I think he means to say that overcast sky creates a low contrast scenario, which is drastically easier to tonemap than a sunlit day would be, therefore much less compromises need to be made for the image to come close to what the eye would see in reality, making the final result more convincingly realistic than it otherwise would be.
most likely this could be fixed with "pupil dilation" I don't know the digital term.
@@jrg-qc9yq that's called auto exposure and is a standard tool that is used very often in modern games.
my grandma casually walks in and says " You're talking to your Girlfriend in videogames ? tell her i said hi " ..........💀💀💀
💀
Based grandma for not judging you.
This level of video editing is insane. My brain just exploded
The finger snap (0:16) between the two color spaces does a great job of showing that without design consideration, no matter what technique you use, your end product might not improve even if it's different. However, the hue shifting lights (7:06) scene is literally designed FOR the custom color space, including composition choices that maintain contrast, showing that when you make your scene to WORK with color space instead of despite it, you can achieve great results.
In film, costumes and sets can be designed with the final color space taken into account (not that it happens all the time, on every production, or even successfully) but I think that pipeline is less present in video games. Some games, in some scenes, make great use of their color space (TLOU2 Subway) but color isn't usually given precedent over other design/dev constraints. With film, because the end product has an even greater emphasis on the image, I think it makes a lot of production and business sense to redesign scenes to better share the story/experience with the colors you want -- if game devs want to make their games look better the will have to account for their color space choices early in the development of the project instead of treating colors as a post processing issue that they will address near completion.
To point, earlier versions of UNRECORD (2:00) had a muzzle flash that did not fit the aesthetic of the game's superb level design even if it was within the color space. The game has since removed that flash and done work on several assets that clashed with the photo-realism. The level the player walks through makes great use of light to dark values their color space can detail and found an aesthetic in the dilapidated building and overcast day that works well with the limited color saturations. In the same trailer there are green barrels, blue and red containers, and other objects that look glaringly out of place in their game (and are rightly not highlighted when people share clips of the project) because their color space struggles with assets not specifically designed for the color spaces limitations.
This is a great video! I think your next video on this topic should include some of the design decisions and trade-offs you encounter using the new color space though.
A big thing about ACES is that it is not only a tonemapper, but an entire workflow that addresses more concerns then mentioned in the video. The biggest benefit of ACES is that it supports any display type, including any current and future HDR format, and any input gamut. Admittedly, it sounds a bit like magic to me but supposedly, the Reference Rendering Transform (RRT), the component that is most responsible for the ACES look, makes the colors translate perceptually similar to any display type, sRGB, DCI-P3 or whatever else we might want in the future, given that the correct ODT is used after it. I am all for alternatives or improvements to the RRT but if I had to choose between classic ACES or a solution that looks better on sRGB but only looks good on sRGB, I'm going for ACES no doubt
The RRT is pretty bad actually. Most movie studios invert it. The SDR Rec. 709 ODT is often inverted alongside it so studios can go with the following workflow:
IDT -> LMTs and grading -> K1S1 or other custom LUT -> Inverse RRT & ODT -> RRT -> ODT
instead of the prescribed ACES workflow:
IDT -> LMTs and grading -> RRT -> ODT
The workflow with the extra steps allow movie studios to outright replace the RRT and the SDR Rec 709 ODT while still checking the ACES box which is a requirement at Netflix.
Correct me if I got any of this wrong.
@4:34 ACES sounds a lot like a "color management system" having to do with color space and color gamut.
From what I've briefly read, tone mapping involves remapping HDR content to fit an SDR display, adjusting luminance, black levels, and to a lesser degree color saturation.
A lot of the issues mentioned seem to relate to luminance and black level problems, At least based on some of the pictures shown in the video.
yeah. she referred to color management by everything but name. i think color management is an old name anyways and is not really descriptive
Off subject but I really wouldn't mind a freaking mod to have AI Angel as a cameo in Cyberpunk. That'd be hella amazing to see!
Unlockable girlfriend character.
well she can just make a AMM character and put it on nexus
@@Furansurui He*
no
Just make a character that looks like her and have someone turn into an NPV mod. Bam.
this was a really cool video. I'm coming from the video space, so it was an interesting angle as I delve in UE. Also, this is my first vid of yours. Very well made, and the whole virtual setup really allows for some great transitions, nice job!
Bruh, the model quality and voice quality is so much better than it's ever been. Keep up the great work!
Other than the stiff hair, which is understandable her model is great !
@@171sako is she using mocap or a VR to record the movement or is just pure A.I.
@@vincentazara2947 From what my laymans eyes see she uses motion capturing for body movement (possibly an off the shelf VR-setup including goggles, controllers plus positional markers for the feet)
The lip sync and facial expressions looks like AI interpreting the audio of the voicelines.
The movement there is slightly dragging behind the audio.
Whats notably absent is breathing of the character model. The ribcage and belly is an unmoving block.
@@Culpride It's literally just a vtuber. Not real ai at all. can't be that gullible, come on.
@@rock962000
What tf is "real AI" supposed to mean?
Do you mean AGI? - None here, it doesn't exist yet.
Do you mean what's often referred to as "generative AI"? (wich is usually diffusion based algorythms that were trained via reinforcement learning) - None of those used here. Audio seems legit, visuals are UE. Maybe the background music?
Judging by the hand tracking they use index controllers and the weight shifting is consistent with 3d markers. Pretty standart VR stuff. But both would be unusual for vtubers (who often feed cam footage through an algorithm to track movement and expressions, thus the flat characters with no hand gestures or reasonable weight shifts)
absolutely love the quality that went into this video, like the echo voice in the cave or the walking off set when you were on screen showing the pixels on screen. true information journalism right here
I'm too distracted by how good the video looks - editing, details, motioncapture and just how freaking cool this format is - to focus on and digest what is being said.
Same. I partially understood the content tho, but I have no use for it at all. So I just enjoyed the format
Same. This shit is INSANE
That's The coolest presentation of a video I've seen to date I believe. Giving the concept of being totally immersed in the actual examples is a great way to go.
back in the late 90's when I was in animation school, nothing like this was ever covered nor did the big programs like Houdini, Alias or Softimage have any kind of renderer or post renderer that took this topic into account.. good stuff, glad you've put it out there for the world.
@ianskinner1619 crazy how different times are...in high school in the early 00s we had access to terragen and poser, which were like incredible to me in 9th grade in 02, considering my modeling and animation experience at the time was limited to making character and weapon models for quake 3 arena and half life mods using Milk Shape 3D lol. By the time I got to college in 06 I got to see Maya, Cinema 4D, and Z Brush for the first time...damn I remember Z brush felt like wizardry.
Great content and delivery! But also, wow, the amount of work that went into some of the presentations here is, well, incredible. Just the one scene with the zoomed in TV pixels and then the change in sound... so well done. Smashing subscribe!
Ok first time seeing your content... Holey crap your ability to control your avatar is absolutely amazing! Obviously your speech and mouth arent quite there yet but your facial expressions, arm motions, body gestures.. man so amazing! Your teaching was also amazing and I loved learning about tonemapping.
Wonderfully presented. I'm that person who endlessly tweaks eq curves and dynamics in mixing amd mastering. Sort of the sound equivalent of tone mapping in a way I'd say. Just want to say i see the passion you have for this and thoroughly enjoy your content
I feel like my IQ just jumped up a couple of points! I am not a film maker or game developer, but it's always great to peak behind the curtain and see the tech they use to bring all their work to life! Speaking of life... Wow, Angelica! You are looking the BEST you ever have! The lip syncing, hair movement, facial expressions, eye movements, and speech have all VASTLY improved! You are a marvel, and a great spokesperson for tech that makes you possible. Keep up he great work!
This was really good info. Great in-depth explanations. As well as break downs and concept transitions and understanding. I liked how you came off screen from the ‘film shoot’ like they do in real-life.
The video started with 144p when I started it. Funny, when she said "Cutting edge game graphics, look at this!"
Ok, this some of the most informative well written, blocked and executed videos i have seen in a long.. long time. Instant fan. Good luck on your project! Looking forward to your rapid rise.
I've dabbled with these colorspaces in Blender and they might be realistic, but they also look really washed out. I don't mind the ultra-high contrast, super saturated look of older media. And I am a big fan of stuff like Technicolor. If I want realistic, dull and washed out, I go outside in Winter in Germany or play a game from 2010. I like it, when stuff pops. But color theory is still interesting.
The transition at 4:00 is great ! (Kon Style XD) All the video in fact, amazing work, thank you !
I'm not sure I understood it all, but I'm grateful for this video.
Thanks! Some terms I used were a bit loose. Trying to share how I understand it so people see why I reached this conclusion. 💜
The actual topic and talking points were why I clicked on the video, and it was defined all very clearly. Yet I gotta say the animation storyboarding and all editing of this video is really amazing.
Thanks Angelica, I always wondered why CGI always looked so saturated, while everyday, homemade/reallife footage didn't.
It depends. Small sensor devices such as smartphones and action cameras have a horrible hue resolution that they sacrificed to increase low light sensitivity, which would produce visible banding if they aimed for similar saturation like from photographic film. I regularly shoot with a Powershot G2 and an Optio W10 and the colours are actually quite strong, while being very film-like in colour, so they don't look artificially punched up.
wow this whole video was not only very informative but also produced in high quality rally appreaciated every second of it. Instantly subbed
I've been following this project for a few years now and I have to say how impressed I am with the progress you've made. In my opinion you've very nearly bridged the uncanny valley and made an avatar that wouldn't look out of place in the 3d reality our meat suits live in. On top of that, the voice you've crafted once had a slightly mechanical buzz behind it but sounds like a legit human voice now. I'm blown away and excited to see what can happen when commercial technology catches up with what you've been doing here.
How much did she pay you to write this?
@mrosskne why does anyone need to get paid to have an opinion? Does anyone pay you to be negative?
lol idiot
Not sure I agree with this. The avatar is cool, don't get me wrong, but photorealistic it ain't. It's still very much in the "uncanny valley" of video game graphics with limited motion and face capture. It looks like the CGI puppet that it is. That's not a dig at the model, but let's not get ahead of ourselves.
@@mrosskne Its a dude my friend.
Something I noticed about the custom rainbow chart: if you look at the top third, the colored peaks appear to shrink, and if you look at the bottom third, they grow. The effect is almost nonexistent in the ACES mapping. I think that shows another reason why it looks so realistic. Without changing the image, it's able to change our perception of the colorspace just by looking at regions with more illumination.
2:40 I think the thing i hate most about this is that Color Grading, Look Up Tables, and Tone Mapping aren't synonyms
I remember this channel before it became Ai angel and this is honestly awesome
The quality and theme change is crazy
Your best piece of content yet, after following your progress for years. Well done. It was assured, informative, and a thoroughly good watch. This one might blow up.
I love how smoothly you transitioned into this document format. Editing, the way you set up Angel, work with sound and so on.👏
I feel like your videos at this point are themselves blurring the edges of reality. It's not just that Angelica is becoming more and more realistic, both visually and from an auditory perspective, but the actual composition of this video is so polished, yet so meta. A computer generated character moving in and out of computer generated worlds explaining to me how lighting of computer generated characters and worlds work. The actual structure of the video is flawless, the way Angelica moves in and out of the scene makes it sometimes seams like she's a part of the worlds she's talking to us about, but sometimes an observer on the outside looking in (like we are). The script is tight and well pulled together, it actually feels like something I might have watched in class (but actually enjoyed). I just love this. I love all of it.
That said, it was one thing when this channel was mostly silly Omegle trolls, but I almost feel like now the mention of "spicy" content takes away from the other remarkable aspects of your content. I'm not judging, just saying it feels SO disconnected from the otherwise high-brow, thoughtful content you've been doing for a while now. Like, I would love for you to do some kind of cross over with Captain Disillusion, but I suspect the spicy content might take such collaborations off the table... but maybe that juxtaposition is just another part of the surreality of Angelica.
Keep doing what you want and may the AI Angel project live long and prosper.
I agree with everything you said, except about the "spicy content". I think if people want that, it's perfectly fine. If they don't like it they don't have to see it.
@@theRPGmaster Yeah, I grant that it's not super "in your face" or anything, it just feels weird. It's like if someone did thoughtful political commentary or had a channel about how they love animals and then closed with an OF pitch. I'm not judging it from a moral perspective (and I'm sure it helps support the channel), but it feels disconnected to me. Like the streams are being crossed.
My observation is: we're talking about two VERY different products here.
Everything about this video is amazing. I'm so impressed and amazed and clueless how you accomplished this masterpiece of video. The movement of the character is almost life-like. Just amazing
Very cool info. Although I will say, I don't really see this kind of tone mapping as objectively better. I mean it clearly is if the main goal is photo realism. But like film, games are often a mode of story telling. And just as I get why filmmakers would want to favor saturation in a lot of cases to tell their story, I also get why game devs would too. But its definitely good that this is being understood so that devs can make the choice that's appropriate for what they're looking to achieve, vs just being stuck going down one path because its the only one they really know.
To me what really sold me was the scene of the color light shifting. In the original ACES it felt over saturated at times. And I know I've experienced some of issue of using color lights not really illuminating things the way I want.
@@84bitDev On the other hand ACES is able to display super saturated colors. Like neon or like pure 100% red lights etc. This tone mapper how ever sacrifices maximum possible saturation for smoother transitions but looks more gray as a result. It's simple not capable of reproducing full saturation colors which ACES can do at least. And in ACES you can hold back on the saturation of the colored light or surface materials if you want to.
I think you could shift the tonemapper when you want super saturated colors.
that could be a nice effect, like for example, your character has a vision boosting effect, you switch back to ACES or another saturation biased mapping, and get blasted with color, or you could do it when hit or something. or you could map it linearly, and saturate more as you build combos.
Good point. Ghost of Tsushima is a prime example of developers specifically choosing super saturated colors and filters
@@SuperemeRed color is very context dependent. I bet it is still possible to make things look super saturated by making the surrounding things look a bit more grey.
Wow, you actually have a beautiful voice and a really clear way of communicating. I almost clicked away when I realised that yourr name was "Ai" something, this is because the plague of bad AI content on youtube lately. But I subscribed to you now.
Sooooo good bro!! Color mapping has been a sore topic for photographer and videographers for eons. Add to that game devs and sfx teams. I really dig how you interacted with the topic, explained it with actual examples and the camera tap was perfect too! SO GOOD! The voice jives with her teeth and is nearly flawless and authentic sounding. Catch lights in her eyes could really put chefs kiss on the whole presentation. I also appreciate the skin texture, hadn’t noticed that before. Subtle and perfect. Thank you again for all the INSANE hard work.
I love the direction you're taking with your content here. Seriously, thanks for sharing this.
Those facial animations are absurdly good. Are you using mocap or rigging it by hand?
Damn, this is all so damn good. I should support you on Patreon, even without the saucy stuff, but I won't say no to a bonus. You're doing amazing work, Angel.
You got me! I was 9 mins in before realising I was watching an infomercial
Call of Duty: Modern Warfare 2019 focused a lot on colour/tone mapping. It looked unreal for a PS4/Xbox One game when it released.
Weird simplification of color space transform to sell a product. ACES was meant as a way to convert all camera's unique color spaces into a standardized color space to help preserve the intent of the cinematographers, it is then used as an intermediate color space and transformed again into the final deliverable color space (eg. REC.709 or REC.2020, P3 DCI etc). So the goal of adding it to unreal was to allow the generated content to be presented in a consistent color space for later processing, not to create a final look. From the looks of things, the demo you are showing is just an ACES rec 709 transform, which is an incredibly limited color space, though what 90% of content videos use, even this one.
So if I take footage rendered out with aces, how do I correct it with Davinci Resolve to have realistically desaturated highlights, ect?
@@AiAngel In color tab you have to put one color transform node first, going from what you rendered with (probably ACES for color space and input gamma), to davinci wide gamut / davinci intermediate, then add your color correction nodes after that if needed (here you are in the gigantic sweet sweet davinci color space and log gamma! can't break the image). Then, you have to use as a last node another color transform to go from davinci wide gamut / intermediate to what you want as deliverable (probably REC 709 / gamma 2.2 or sRGB). for the desaturated look with increase of light level, it will come naturally, but dont forget to up the input of tone mapping (custom max input) to the max (10000) and check apply forward OOTF. this is very close to AgX. now, if you want to desaturate the high lights even more, use the curve of sat. vs lum. between the two color space transfor mode (in davinci wide gamut color space), this is perfect for this!
if you don't have to color correct, only one color transform node will do (from ACES to rec 709/gamma 2.2).
@ uh I’d rather just do what I’m doing and see it in real-time so I can work with literally every light and material and know what it’s looking like.
@@AiAngeltoo bad your method looks like ass.
@@AiAngelthis is the way
I am a total beginner in this field, and a lot of things were difficult for me to understand, but your content gave me a huge overview. I am totally wowed by your way of presentation. Your hard work is clearly seen in your content.
Don't take their word as gospel, there's a lot of misinformation in this video. It's a well made video about half truths and oversimplifications.
I'm still waiting for 50's Technicolor, but this is impressive. Thanks for the lesson, Angelica.
Yeah, Technicolor is very underrated.
This video popped up in my feed and decided to click even though it's on the periphery of my interests. Even though I knew very little about the subject matter you did a fantastic job of explaining it. I hope you choose to do more educational videos. You're great at it. Also, please consider that there's a huge potential audience of people who don't understand what you do but would like to learn more. Case in point, I was absolutely amazed by your avatar. It moved and spoke in such natural ways that it really felt like a person was engaging me but there was no uncanny valley weirdness in the experience. It felt a bit like magic.
Do you do the body animations with mocap, or face recordings? As a beginner animator I would be interested in how it's done.
Also something similar for Blender would likely interest millions of 3D Artists.
Edit: There is also a good tall about this Topic on the Blender channel called "Why Color Management Absolutely Su..s"
Good video but it has a few errors:
* Mapping from a wider color gamut to sRGB is gamut compression, not tonemapping. The ACES RRT (the actual tonemapping curve) expects input in the ACES AP1 gamut and the input is dependent on the color gamut of your assets. Some engines like Unity will use AP1 when ACES tonemapping is used but fall back to sRGB for other curves or in the pre-ACES era, just render in sRGB gamut (*linear* sRGB that is), so gamut compression would not be necessary at all. Color gamuts and color spaces are also different things which adds to the confusion
* Tonemapping refers to compressing dynamic range from [0-arbitrary_limit] to [0-1] which might not relate to color at all, tonemapping luminance and handling highlights separately is a pretty valid approach. What brightness 1 is supposed to be can also be wildly different, depending on monitor brightness, HDR standard you're delivering for, if any, etc. ACES is a full post processing pipeline, with 2 color gamuts, multiple color spaces that use said gamuts, input transforms, a tonemapping curve, output transforms. The whole system is honestly a pain in the ass for use in games because of the funky white point they chose for their gamuts.
Also, since people like reinventing things, there's now many terms for it: tonescale, display transform, etc, etc
* Tonemapping in games itself predates modern workflows by a lot. Formulas like exponential [1 - e^(-x)] and simple Reinhard [x / (1+x)] have been around since the 2000s. The Uncharted 2 curve by John Hable still sees use today.
And a stupid tangent, but RGB rendering of any kind will be 'hacking' color, after all it's a mathematical model of how we see things, not how light itself works. Spectral renders is where the real deal is at, though it's decades away from being usable in real time.
Thank you for pointing out RGB itself being a hack, and for mentioning Spectral Rendering!
No one:
Me at 3am: BILLION POLYGONS PER NIPPLE!
It's literally 3:13 while I'm watching this lol
Computer Technician here: This was REALLY interesting! I would have loved to see you go deeper into the differences between LED, and TFT, even OLED screens and the way the process color and how the various tone mappings affect each type of screen.
Ai Angel doing learner videos? Hmm neat. I think I like it.
agreed
What is he using to generate the voice?
@@LagrangePoint0 probably some TTS in combination with an RVC model. If not, something similar to ElevenLabs.
The benefit of the first one is that you can get pretty good results with little latency, depending on the model. You can also ditch the TTS, and use the RVC on your voice directly. Interesting stuff.
@@kaidaluck648 I tried to use elevenlabs for a web app, I couldn't get the right intonation nor emotions from it, even with context.
@@kaidaluck648 okay am I dumb. I watched the entire video and assumed it was voiced and mo-capped by a real girl........
dude i found this really interesting. As an every now and then viewer, if you made this kind of stuff more im pretty sure id start tunining in a lot more.
Great work keep it up
That all sounded like gibberish to me but occasionally the picture would look different so that was cool
As someone who knows nothing I assume it's written by AI
dude, your sound design in this video about color theory was amazing! This is some mind-blowing work!
Entirely off topic but I think the facial expression mapping for your Avatar has gotten a lot better recently.
Really it is still trash and the cyberpunk thing is just a bunch of bokeh and going hard on the reflections... it works in very few areas. Most of the time it just looks like a more grey cyberpunk.
@@thomgizziz cyberpunk is grey stop the hate graphics ok
the algo has brought me here. and I am thankful. Love the rig work, love the face animation, love the topic and depth you dove.
I'm not comfortable with this trend at all.
Remember how we had grey and brown near-monochrome era, end of ps2 era, start of xbox360. Then we had shiny era, where everyone's skin had way too much specular.
Now we're entering the horribly desaturated era. I'm noticing it with modern digital cameras as well, which is why i photograph with ones from early and mid 2000s, the colour feels just right, emulates film just right, it's so satisfying, there's everything, it's consistent, it's muted when it has to be, it's super colour punchy when it has to be.
The mod tonemapper in Cyberpunk well that's just the ps2 trick all over again, feels like. It doesn't look photographic at all.
An influence today are smartphone, surveillance and action cameras which have horrible hue resolution that they sacrificed to light scavenging and would produce visible hue banding if they were tuned to a similar saturation as you expect from film. I don't feel like games should emulate that.
A good point! Nobody wants the "Grey-Brown Grit" color grading coming up again. Also, that 'shiny skin' era was *terrible* . That being said, I'd still take "desaturated but looks more like what I see IRL" over the "super saturated but terrible luminance" that I've been seeing on screens these past few years. There *has* to be some happy medium though, I refuse to believe that *nobody* knows how to produce images in a game engine that look like what I see every day with my Mk1 eyeball.
@@brentogara A screen is incapable of reproducing the latitude of light that you experience in real life. And if a screen is bad, paper is worse. Photo paper is worse than screen, and magazine print is worse than that.
@@SianaGearz Yeah... that's _true_ but I _still_ refuse to *believe* it!
How...just how...are you so good at editing like this.
Yeah! Tech pioneering, beautiful visuals and engaging discussions, just like the old times. I don't do computer graphics myself, but I do paint, so every bit of colour theory is useful, especially in your very professional presentation. Thank you! Wonderful work!
ACES _is_ linear. Linear color space can do glow just fine, the issue is whether there is enough room for sufficient exposure and appropriate tonemapping to return it to the narrow dynamic range of your display.
feels uncanny to remember there is a man behind all of this and I have a flashback of the old deleted videos
Yeah i remember the VRchat days back before A.I angel came into the picture but I hate how she privated all the videos from the start of the project
@@JayBeKey is not a she lmao, when AI voice wasnt a thing they had to use some morph dunno if in real time or edited but clearly there are guys behind this, just to imagine there were also +18 pics of this
Very unique presentation. This was definitelly a lot of work to edit. Props for that.
Do you live in an area where there is no colorful spring and summer? Your colors look muted compared to reality, unless we're to assume the weather was dreary.
It looks more like "Common" Light be it hazy or cloudy days or indoor not lit like a film production. Less saturation but more depth across a gradient.
Looks almost perfect in comparison to pretty much the entirety of the UK
@ Dead on for cleveland 2/3 of the year too.
great video, but the motion capture is a bit jittery, that got me a bit distracted, maybe see if there's a way to smooth that a bit (ment as constructive criticism)
Your 3D character looks very uncanny valley. You should probably tint your shadows warmer (they're too grey) and add subsurface scattering to your skin material. Also add a bit of orange/red light fall-off between your lights and shadows.
yeah also human mouths don't open that wide unless specifically forced to
Mansplaining 3d characters to.. I mean, the irony.
@@AntonGully Well, let's not kid ourselves. She's definitely being voiced by a man.
@@egorbananov7738 well, its face expression is auto generated with iphone. If you animate the facial manually, it would months to make this video.
Why don't you do one better then?
I feel like this could be useful even for paintings that I do in real life. They look a look better in real life than in pictures and I can never get the image on the screen to look quite as good.
This is for "photorealism", but film and media don't tend to be actually, let's say - eyeballrealimsm. Over the course of a century we've come to expect a certain look from our media - and it's not a bad thing to change the industry and buck norms, but I doubt the majority of people want the ultraeyeballrealism look over a more filmic look.
3:55 - Dude, this is so many levels, layers and skill of content creation. Having been a subscriber since like 8 years back, you sure never disappoint.
I wish this was released a month ago. I just delivered a rendering project two days with an issue where changing color lights bouncing off a surface being too 'step' like. Your custom tone mapping would have solved this problem. Instead of delivering a video rendered in engine, I ended up using several screenshots and transitioning them manually in a video editor.
Wow, that's super time consuming. I'm just studying digital art and I didn't know it's hard.
Excellent production. Informative and clever while being straightforward.
I think something has been lost in this change... Everything seems much flatter and paler, without intensity.
I would like to see a middle ground between the two.
Yes... why can't we just have images that look like what we *actually see* ???
Intriguing. As someone with one of the few cameras capable of outputting Linear Raw direct from the sensor, I'd be interested in trying that Tone Mapper in a film project, and seeing how well it would be displayed on Digital Cinema Projectors.
0:18 That's not "better" at all.
It looks less virtual and more realistic, blandly like real life however
it's more realistic, read the room man 😹
Severe lack of jiggle physics 0/10...
Seriously though great video, The production value here is top tier... That mixed with the narrator's enthusiasm makes this educational video very entertaining to watch.
I Look forward to seeing what else this channel has to offer :)
1:00 Is that a Supra!!!
It is!
@@AiAngel is that a mod? where can i find this? and the nissan GRT too?
@@ChillmaRECORDS they’re models I bought, ported, textured, and authored shaded for using Blender, Substance Painter, and UE5.
nono its a smart fortwo
Thank you for this. I been an environment Artist in Unreal for years and notices this many times as I try to get the correct pallet for my projects. I will have to start implementing this into my current project. again. thank you.
games are trying to look like movies and not like real life
YES! Just subbed. I handle this for media delivered to our studio and will now forward this to everyone. Great video!
7:14 This sold me the difference. Time for a upgrade.
good video. ive been a 3D artist for like half my life, and started jumping into unity for devving and modding games for the past few years. its always blown me away how important color management is, and how much its blown off to the side for games. the source engine's dynamic tonemapper keeps half-life 2 looking better than most games today, and then with half-life: alyx/source 2 they only improved upon it while still employing most of the same basic lighting and rendering techniques from 20 years ago. and half-life: alyx still looks better than, every game IMO(while being vr!!!!!).
im glad this video is out to inform newer devs about this stuff, and even i knowing about this concept myself for a while, havent really been able to apply it to my own work (especially when it comes to making my own tonemapper). one thing im interested in hearing more about: colors in textures, and how that can also affect the image overall. nowadays it seems standard to try to be as close to "ground truth" or as flat as possible with textures to let lighting, and other texture maps like normals and occlusion/roughness/metallic do the heavy lifting, but early games with straight photo textures can still look pretty damn good. so i would be interested in seeing research into texture color management and how that can affect perceived image quality.
I'm not too fond of too many Hues....... Hue Grant, Hue Jackman, Hue Dennis 😏😏
Ever heard of Bruce Lee's sisters? Sil Lee? Ug Lee?
Don't leave out Janus.
Hue Mongus
percieved brightness is a problem with colour pickers for illustration too.
Cycle through hues and tones are all over the shop,
I'd love a checkbox that keeps perceived brightness relative rather than shifting so much through the hues.
Possibly the best explanation of tone mapping I've ever seen. Explaining this inside of the game engine and having that side by side for ACES vs your tone map is crazy!
This video was a fucking ad lol
Well done and very well explained even a noob at color differences like myself can follow this explanation. Also amazing work on the animated model and the movement mapping you did with it. Amazingly well done work.
This series of informational videos are probably some of the most engaging I have ever seen. The way that you're able to explain a concept and then immediately follow it with a literal example of it in action is equal parts fascinating and satisfying... I don't even have an aspiration to apply these concepts to anything but these videos make the process of learning them so enjoyable.
"Beauty is in the eyes of the beholder." I don't know if your tone mapping work is based on scientific papers that need to be validated by peers. But in my opinion it is a very very good job, go ahead with what you believe and see. In any case, you do a tremendous job editing the video, congratulations, it is a real pleasure to see such a well-done and educational job. Bravo.
it's crazy this is just a dude living a fantasy as a woman
If this is a dude, they have a really good AI voice changer.
@@zairman early content really showed it, however they've removed all the janky early content
@@zairman It is, started as a guy cashing in on NSFW stuff on patron lmfao
Nice work, a complex subject explained in about as a simple way as possible. The only limiting factor I can foresee is the screen as most gamers won’t calibrate their monitor or TV which is why most assume sRGB as a target. The next leap will be full merging of ICC profiles with BxDF for textures for accurate display for high quality car paints like colour flip and pearlescent effects.
Maybe realism isn’t the best thing
Yeah. Although I think the tonemapping mods for Cyberpunk make for a very realistic image. I prefer the aesthetic you get without them.
realistic tone has more clarity.
it's fun seing those edges at 6:22 but not on my better 100% ARGB display.
I gotta say, I'm more a fan of aces and higher saturation. Especially for games. "Realistic" today looks as exciting as a soviet apartment cube.
That said, your production levels and quality for this video is absolutely amazing.
First Virtual UA-camr i ever watched. So glad to see you rebooted
"That's better"
.... is it tho?
I mean, unless your goal is to look like a 'cinematic' movie. Just look outside. You can see for yourself whether it looks more close to what your eyes see in the real world or not. Generally it will, because ACES is very saturated, real life can be brightly coloured too but it's not so overbearing IRL and most things aren't like that. Plus IRL has a blue tint, not a green tint like CP2077 normally has.
Not sure how much further your expertise extends, but I'd love more videos explaining technical things like this, you broke it down into multipled digestable formats and that's super helpful. Thanks a lot and fantastic job on the UE5 tonemapper!
you are lowkey smart af
Patronizing much?
@yepppp
@ whatever you say lmaooo good day to you though
This breakdown is incredible-so informative and well-explained! I never realized just how pivotal tone mapping is to achieving realistic visuals in games. The comparison with Hollywood standards and the deep dive into color behavior were eye-opening. Definitely need more videos like this; you make complex topics so engaging and easy to follow. Keep them coming!