The initial upload of this vid had less than 1/3 the views of the previous upload in the first minute so I just re-uploaded it lol.. Also I've been awake for like 20 hours so I'm gonna sleep, and if YouTube totally guts this vid's reach then so be it, we'll see when I wake up lmao..
Yup, the algorithm has been going berserk for a while now. If I hadn't just checked my subscription page I'd probably have found this after like a week via YouTube recommendations on the home page... great content btw. I guess we're just glad to be seeing you back - I can't even count how many times I was asking myself if I was dreaming or something... yeah, I'm talking about the explainer type content
You are an absolute Angel. Love this video and appreciate it as a colour grader who works in Resolve and spends far too long messing with scopes and tone mapping settings. It really is a science that goes deep and there are so many ways to go about tackling it, and many frustrations, as your eyes adapt quickly and you lose the ability to even see that it looks like dogsh** until you walk away and come back after a break.
The quality of animation is absolutely fantastic. The transition from being zoomed in on the screen and showing the pixels, and then seeing 'you' walk off set, idk it was just. fantastic.
There's a lot more complexity to color than you present in this video. I think you're falling into the Dunning-Kruger effect. ACES (Academy Color Encoding System) is not a tonemapper. It's a color encoding system (it's in the name). There's a lot of information in this video that simply doesn't make sense and I'm concerned it's misleading a lot of people. Tonemapping is all about taking wide dynamic range and representing it faithfully in a low dynamic range environment. Throughout the video you seem to mistake colorspace transforms for tonemapping and vice versa. Tonemapping is a very subjective process. Note how you prefer your custom tonemapper over Unreal's default one; that alone is proof that it's subjective. Without tone mapping, luminance values would be clipped when being transformed from larger to smaller dynamic ranges. A colorspace transform is used to transform colour between two colorspaces; when going from smaller to larger colorspaces, there is no 'preference' there, only objective truth. For example, you can do a colorspace transform between Rec709 and Rec2020. But what does it mean to do a colorspace transform? The keyword is relativity. It is ALL relative. You have failed to highlight a crucial fact: values only go between 0 and 1, with 0 being off and 1 being fully on. ACES, Rec2020, P3, Rec709... there's no difference between the data. It's all a range of 0 to 1. So if it's all 0 to 1, then what the fuck is going on? It's relativity. Rec709 is a technical standard; it is simply a reference point. That's important - it's a guide, a goal, something to strive towards achieving. Rec709 defines specific color coordinates which can be measured IRL using a spectrophotometer. For example, you could take pure green (say, RGB 0/1/0) and display it on a Rec709 display. You then measure the light emitting from the display with a spectrophotometer. If the measured result matches the Rec709 standard, congrats, you have a perfect Rec709 green.
But the thing is, what if you display RGB 0/1/0 on a Rec2020 display? Rec2020 is supposed to cover a much broader range of colors, right? Correct. If you put 0/1/0 on a perfect Rec2020 display and measure it, the reading will match the Rec2020 standard. What gives? Suddenly all our color values have no meaning at all and it's just up to the display? Well... yes. That's right. A colorspace transform is a way of mapping that 0/1/0 between colorspaces. Let's say we have that pure RGB 0/1/0 green in Rec709, but we want to display it on a Rec2020 display. What you do is transform the 0/1/0 into Rec2020. What ends up happening is 0/1/0 now becomes (for example, not correct) 0/0.6/0. Now the green is lower. This makes sense; Rec2020 goes much more saturated, so to match Rec709, you need to bring the green down. Now if you put the transformed green on a Rec2020 display beside a full 0/1/0 green on a Rec709 display, they will match. It's ALL about relativity. And when you go from a LARGE space like Rec2020 pure green down to Rec709, you need to start making decisions about how that maps out, and that's where you start to enter the land of tone mapping: making a LARGE dynamic range fit into a LOW dynamic range in a way that represents the nature of how it looked in the large dynamic range. With this new information I want to revisit ACES. There are MANY colorspaces that are part of the Academy Color Encoding System. ACES 2065-1 has 100% coverage of all wavelengths in our reality; actually it extends beyond it significantly. If you had a 0/1/0 in ACES, no display in the universe could ever display it. The point of ACES 2065-1 is an intermediate container. It's objective. You slam all your different color spaces into ACES, and then transform out of it in a unified way. You never WORK in ACES, you only conform and store color in ACES.
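To make the "0/1/0 stops being full green" idea concrete, here's a minimal sketch of the linear Rec.709 → Rec.2020 transform. The matrix is the standard BT.709-to-BT.2020 conversion for linear-light RGB; this is a toy illustration, not production color management (no gamma handling, no gamut mapping).

```python
# Standard BT.709 -> BT.2020 matrix for LINEAR-light RGB
# (no OETF/gamma applied; rows each sum to ~1 so white maps to white).
M_709_TO_2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def transform(rgb, m):
    """Multiply a linear RGB triplet by a 3x3 colorspace matrix."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in m)

# Pure Rec.709 green, re-expressed in Rec.2020 coordinates:
green_2020 = transform((0.0, 1.0, 0.0), M_709_TO_2020)
print(green_2020)  # roughly (0.329, 0.920, 0.088) - no longer "full" green
```

The green channel lands below 1.0 and picks up some red and blue, which is exactly the "bring the green down to match" behaviour described above.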
They tried to make ACES usable in a workflow with AP1, but AP1 is so close to Rec2020 that colorists prefer Rec2020, since that's what people with Rec2020 displays will be watching. But I just want to bring attention to one final interesting thing: Pointer's gamut. Pointer's gamut is an irregular gamut of colours that exist in our real world. That is to say, Dr Michael Pointer went out and measured real colors reflected from the objects around us. The result is a wavy gamut that represents the color we see most of the time. The only time you'll see colors outside of Pointer's gamut is with things like LEDs, where they can produce a specific and incredibly saturated wavelength of light. Now compare Pointer's gamut to Rec2020. Rec2020 covers colors that you don't really see in real life. In fact P3 is the closest match you'll get to "real life" colors, and P3 is the cinema standard. And this really highlights my opinion on this entire matter: do you have materials in your scene that are producing colors outside of Pointer's gamut? Because if you do, of course you need a tonemapper to fix it. You're producing unrealistic colors! No wonder your image looks unrealistic even after tonemapping. Your use of postprocessing, color correction, and LUTs won't fix the root issue; your scene is producing colors in a scene-referred world (aka linear) which is then tonemapped to look more reasonable. Also fun fact, colorists spend their time grading in linear since it's how light actually works. So if you have an issue with how your image is looking, you need to start paying attention to the linear scene-referred color instead of the result down the chain of tonemapping and correcting. There's so much more I could say about this since it's so deep and complex, and this is truly just scratching the surface.
Also worth noting that tonemapping can go the other way too. It really is as simple as the name suggests: remapping a tone to a new one in whatever colorspace is being used.
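As a toy illustration of "remapping a tone", here's the classic Reinhard curve, one of the simplest possible tonemappers: it squeezes unbounded scene-referred luminance into [0, 1). This is only a sketch of the general idea, not what the video's custom tonemapper or ACES actually does.

```python
def reinhard(x: float) -> float:
    """Classic Reinhard tone curve: maps [0, inf) luminance into [0, 1)."""
    return x / (1.0 + x)

# Scene-referred values above 1.0 would clip on a display;
# the curve compresses them smoothly instead of cutting them off.
for lum in (0.18, 1.0, 4.0, 100.0):
    print(f"{lum:>6} -> {reinhard(lum):.3f}")
```

Note how a luminance of 100 still lands just under 1.0 instead of clipping, which is the "faithfully representing wide dynamic range in a low dynamic range environment" described above.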
back in the late 90's when I was in animation school, nothing like this was ever covered nor did the big programs like Houdini, Alias or Softimage have any kind of renderer or post renderer that took this topic into account.. good stuff, glad you've put it out there for the world.
@@vincentazara2947 From what my layman's eyes see, she uses motion capture for body movement (possibly an off-the-shelf VR setup including goggles, controllers, plus positional markers for the feet). The lip sync and facial expressions look like AI interpreting the audio of the voice lines; the movement there is slightly dragging behind the audio. What's notably absent is breathing of the character model. The ribcage and belly are an unmoving block.
@@rock962000 What tf is "real AI" supposed to mean? Do you mean AGI? - None here, it doesn't exist yet. Do you mean what's often referred to as "generative AI"? (which is usually diffusion-based algorithms that were trained via reinforcement learning) - None of those used here. Audio seems legit, visuals are UE. Maybe the background music? Judging by the hand tracking they use Index controllers, and the weight shifting is consistent with 3D markers. Pretty standard VR stuff. But both would be unusual for vtubers (who often feed cam footage through an algorithm to track movement and expressions, thus the flat characters with no hand gestures or reasonable weight shifts)
I feel like my IQ just jumped up a couple of points! I am not a film maker or game developer, but it's always great to peek behind the curtain and see the tech they use to bring all their work to life! Speaking of life... Wow, Angelica! You are looking the BEST you ever have! The lip syncing, hair movement, facial expressions, eye movements, and speech have all VASTLY improved! You are a marvel, and a great spokesperson for the tech that makes you possible. Keep up the great work!
It depends. Small sensor devices such as smartphones and action cameras have a horrible hue resolution that they sacrificed to increase low light sensitivity, which would produce visible banding if they aimed for similar saturation like from photographic film. I regularly shoot with a Powershot G2 and an Optio W10 and the colours are actually quite strong, while being very film-like in colour, so they don't look artificially punched up.
This series of informational videos are probably some of the most engaging I have ever seen. The way that you're able to explain a concept and then immediately follow it with a literal example of it in action is equal parts fascinating and satisfying... I don't even have an aspiration to apply these concepts to anything but these videos make the process of learning them so enjoyable.
I feel like your videos at this point are themselves blurring the edges of reality. It's not just that Angelica is becoming more and more realistic, both visually and from an auditory perspective, but the actual composition of this video is so polished, yet so meta. A computer generated character moving in and out of computer generated worlds explaining to me how lighting of computer generated characters and worlds works. The actual structure of the video is flawless; the way Angelica moves in and out of the scene sometimes makes it seem like she's a part of the worlds she's talking to us about, but sometimes an observer on the outside looking in (like we are). The script is tight and well pulled together, it actually feels like something I might have watched in class (but actually enjoyed). I just love this. I love all of it. That said, it was one thing when this channel was mostly silly Omegle trolls, but I almost feel like now the mention of "spicy" content takes away from the other remarkable aspects of your content. I'm not judging, just saying it feels SO disconnected from the otherwise high-brow, thoughtful content you've been doing for a while now. Like, I would love for you to do some kind of crossover with Captain Disillusion, but I suspect the spicy content might take such collaborations off the table... but maybe that juxtaposition is just another part of the surreality of Angelica. Keep doing what you want and may the AI Angel project live long and prosper.
@@LagrangePoint0 probably some TTS in combination with an RVC model. If not, something similar to ElevenLabs. The benefit of the first one is that you can get pretty good results with little latency, depending on the model. You can also ditch the TTS, and use the RVC on your voice directly. Interesting stuff.
It's interesting that we can make games that look like real life, but we can always tell it's a video game when we look at the human characters. Our minds have the ability to detect when something is slightly off about a person or the way they move
Completely separate from the wacky stuff you might find on the patreon, this project and channel providing REALLY insightful content like this is amazing. That comparison between ACES and your Custom tone map was the perfect visual and made me understand everything instantly! I'm glad you're using this project as an education tool and I hope it continues!
Your best piece of content yet, after following your progress for years. Well done. It was assured, informative, and a thoroughly good watch. This one might blow up.
Sooooo good bro!! Color mapping has been a sore topic for photographers and videographers for eons. Add to that game devs and sfx teams. I really dig how you interacted with the topic, explained it with actual examples, and the camera tap was perfect too! SO GOOD! The voice jives with her teeth and is nearly flawless and authentic sounding. Catch lights in her eyes could really put the chef's kiss on the whole presentation. I also appreciate the skin texture, hadn't noticed that before. Subtle and perfect. Thank you again for all the INSANE hard work.
Thank you for this. I've been an environment artist in Unreal for years and have noticed this many times as I try to get the correct palette for my projects. I will have to start implementing this into my current project. Again, thank you.
Very cool info. Although I will say, I don't really see this kind of tone mapping as objectively better. I mean, it clearly is if the main goal is photorealism. But like film, games are often a mode of storytelling. And just as I get why filmmakers would want to favor saturation in a lot of cases to tell their story, I also get why game devs would too. But it's definitely good that this is being understood, so that devs can make the choice that's appropriate for what they're looking to achieve, vs just being stuck going down one path because it's the only one they really know.
To me, what really sold me was the scene of the colored light shifting. In the original ACES it felt oversaturated at times. And I know I've experienced some of the issues of colored lights not really illuminating things the way I want.
@@84bitDev On the other hand, ACES is able to display super saturated colors, like neon or pure 100% red lights etc. This tone mapper, however, sacrifices maximum possible saturation for smoother transitions, but looks more gray as a result. It's simply not capable of reproducing fully saturated colors, which ACES at least can do. And in ACES you can hold back on the saturation of the colored light or surface materials if you want to.
I think you could shift the tonemapper when you want super saturated colors. That could be a nice effect: for example, your character has a vision-boosting ability, you switch back to ACES or another saturation-biased mapping and get blasted with color. Or you could do it when hit or something. Or you could map it linearly and saturate more as you build combos.
Yeah! Tech pioneering, beautiful visuals and engaging discussions, just like the old times. I don't do computer graphics myself, but I do paint, so every bit of colour theory is useful, especially in your very professional presentation. Thank you! Wonderful work!
Color, Lighting, Textures. The three pillars of graphical quality. And who better to explain the importance of color than the GOAT of V-tubers in this reality.
Not sure how much further your expertise extends, but I'd love more videos explaining technical things like this. You broke it down into multiple digestible formats and that's super helpful. Thanks a lot and fantastic job on the UE5 tonemapper!
Really it is still trash and the cyberpunk thing is just a bunch of bokeh and going hard on the reflections... it works in very few areas. Most of the time it just looks like a more grey cyberpunk.
Good video. I've been a 3D artist for like half my life, and started jumping into Unity for devving and modding games the past few years. It's always blown me away how important color management is, and how much it's blown off to the side in games. The Source engine's dynamic tonemapper keeps Half-Life 2 looking better than most games today, and then with Half-Life: Alyx/Source 2 they only improved upon it while still employing most of the same basic lighting and rendering techniques from 20 years ago. And Half-Life: Alyx still looks better than every game IMO (while being VR!!!!!). I'm glad this video is out to inform newer devs about this stuff; even knowing about this concept myself for a while, I haven't really been able to apply it to my own work (especially when it comes to making my own tonemapper). One thing I'm interested in hearing more about: colors in textures, and how that can also affect the image overall. Nowadays it seems standard to try to be as close to "ground truth" or as flat as possible with textures, letting lighting and other texture maps like normals and occlusion/roughness/metallic do the heavy lifting, but early games with straight photo textures can still look pretty damn good. So I would be interested in seeing research into texture color management and how that can affect perceived image quality.
This was really good info. Great in-depth explanations. As well as break downs and concept transitions and understanding. I liked how you came off screen from the ‘film shoot’ like they do in real-life.
Color grading makes movies' real life scenes look artificial. But we like that because it helps convey contexts and things. What you did with the colors, makes artificially generated scenes look real. Well, at least way more real than we thought was possible. Awesome work!
I have literally zero knowledge in game design, but watching your content always has me wishing I did because it always looks so good and has been pretty informative lately (which I love). I think my biggest thing I'd want to learn is how you made your character and what you use for motion capture. I'd love to be able to have an avatar to play around with like yours.
I think something has been lost in this change... Everything seems much flatter and paler, without intensity. I would like to see a middle ground between the two.
Fantastic video! I would like to add that ACES makes sense in the realm of video production because cameras are ultimately limited to a much lower dynamic range than the human eye. Most cameras on the market do about 13-16 stops of dynamic range, but our eyes can do up to 23 stops. So using a limited tonemapper like ACES works for that, because you are already dealing with limited information. In the digital realm, however, there is virtually no limit to how much dynamic range we can create. The digital scenes that we make can easily process the dynamic range of our eye, and probably even further. Now, there is little reason to try and display that, because the display medium is ultimately gonna be the limiting factor here. But if we have the possibility to digitally process that much dynamic range, then we absolutely should, to preserve as much information as possible before we compress it down to a display format like sRGB. And ACES kinda artificially limits the capabilities of this. We're not dealing with digital video shot on a CMOS camera sensor, so we shouldn't treat the colors like we are. Also, I've been arguing for a long time now that pushing higher polygon counts and trying to get rendering closer and closer to the way it behaves IRL with, say, raytracing or path tracing is not what is missing to get things to look more photorealistic. The whole idea of trying to do photorealism but then slapping low dynamic range simulations, tons of bloom, volumetric lighting and god knows what else into the mix is a setup for failure. A hyperrealistic approach is never gonna look photorealistic, because the nature of photorealism is to present things as they should look IRL, not as an artist would like them to be. Photorealism is about replicating, not about interpreting or enhancing. So strip away effects if they're too much, dull it down and focus on the fundamentals. And I think with your tonemapper we are definitely one step closer to achieving that goal.
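For scale: "stops" are doublings of light, so the gap between camera and eye is bigger than the raw numbers suggest. A quick back-of-the-envelope sketch using the stop counts quoted above:

```python
def contrast_ratio(stops: int) -> int:
    """Each stop doubles the light, so N stops spans a 2**N contrast ratio."""
    return 2 ** stops

camera = contrast_ratio(14)  # mid-range of the 13-16 stops quoted for cameras
eye = contrast_ratio(23)     # the ~23 stops quoted for adapted human vision
print(camera, eye, eye // camera)  # 16384 8388608 512
```

So 9 extra stops is roughly a 500x wider contrast ratio, which is why a tone curve built around camera-range footage can feel limiting for a synthetic scene.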
Now all we need to do is get companies to ditch the outdated and highly flawed CIELAB color space for the much more versatile and perceptually accurate OKLab.
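For anyone curious what the OKLab mapping actually looks like, here's a sketch of Björn Ottosson's published linear-sRGB → OKLab conversion (using his reference matrices; note it assumes linear-light sRGB input in [0, 1], not gamma-encoded values):

```python
def linear_srgb_to_oklab(r, g, b):
    """Convert linear sRGB (components in [0, 1]) to OKLab (L, a, b)."""
    # Linear sRGB -> cone-like LMS response
    l = 0.4122214708 * r + 0.5363325363 * g + 0.0514459929 * b
    m = 0.2119034982 * r + 0.6806995451 * g + 0.1073969566 * b
    s = 0.0883024619 * r + 0.2817188376 * g + 0.6299787005 * b
    # Cube-root nonlinearity approximates the perceptual lightness response
    l_, m_, s_ = l ** (1 / 3), m ** (1 / 3), s ** (1 / 3)
    # LMS' -> Lab axes (L = lightness, a = green-red, b = blue-yellow)
    L = 0.2104542553 * l_ + 0.7936177850 * m_ - 0.0040720468 * s_
    a = 1.9779984951 * l_ - 2.4285922050 * m_ + 0.4505937099 * s_
    b_ = 0.0259040371 * l_ + 0.7827717662 * m_ - 0.8086757660 * s_
    return L, a, b_

# White should land at L ~ 1 with near-zero chroma:
print(linear_srgb_to_oklab(1.0, 1.0, 1.0))
```

The appeal over CIELAB is that equal distances in this (L, a, b) space track perceived color difference more evenly, especially for blues.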
I gotta say, I'm more a fan of aces and higher saturation. Especially for games. "Realistic" today looks as exciting as a soviet apartment cube. That said, your production levels and quality for this video is absolutely amazing.
As someone who is doing a lot of stuff with Unreal for games/films and my own fun projects, these are the kind of discoveries I love. I love it when someone thinks outside the box and breaks the mold to improve something. Also love the in-depth explanation of the whole subject, well done! Definitely going to support this.
You've been blowing my mind since you were created, Angelica, and even though I know very little about game development, I could listen to you talk about it for days. I hope you do more of these, this was a great video ❤
This has got to be the best video you've made! Even though I understood only about half of what you were describing/explaining, you did it in a way that made sense, step by step. It's good to see the AI Angel project making these sweeping changes. By the way, nice fangs, LOL.
Colour space is a really interesting topic. You never really consider that 100% of what you see on screens is informed by super elite wizard people that get to decide. Acerola has a really good technical and fun video on the subject for those interested.
This was a really cool video. I'm coming from the video space, so it was an interesting angle as I delve into UE. Also, this is my first vid of yours. Very well made, and the whole virtual setup really allows for some great transitions, nice job!
Haven't seen your content show up in quite some time, and this was great. Love the look and sound/voice you've updated on Angelica. And this was a fantastic shift (to me) in content, being a learning video.
My main issue with those "realistic tonemappings" is that with skylight, everything is lit with no color, like it was permanently overcast. Looks nice under controlled situations, but the illusion falls flat once you look at a big light source
I'm too distracted by how good the video looks - editing, details, motioncapture and just how freaking cool this format is - to focus on and digest what is being said.
I'd love to see more videos like this. I'm a 3d artist looking to go into some video production using UE and knowing this ahead of time has no doubt saved me a bunch of effort.
This is super interesting and informative. I'm not a graphics artist or animator or anything so I'm not sure why it showed up in my feed, but what a great job you've done helping me sort of understand the topic and your presentation is fantastic! Subscribed. 😺
While a little bit went over my head, I really liked that you explained it so well. I feel like I have a better grasp of lighting and color now, so thank you for that. Oh, and your custom stuff looked way better than the ACES stuff..
I remember your old vids and... I'm amazed at the pace you got better, looove the hair bounce! And I think you picked a great topic, I'm excited for this kind of content :)
Looks impressive. I am immediately convinced. ACES reminds me of the yellow filter in 2010-era games. The issue being that ACES is apparently unavoidable. Another big issue is that the limited color range is so widely deployed that it's unforeseeable when a new wider-range standard will be supported by game devs, given the small relative number of people owning the required hardware for it. Lastly, I myself have permanent night light on, i.e. blue has only 50% intensity. This again changes realism. Given the video presentation, the foremost big issue before color and geometry, to me, is the projection itself. When looking outside, I am used to not being able to tell whether two lines are parallel or whether a line is straight, due to the fisheye effect of my human eye. I also need time to set focus onto an observed point. A screen will always ruin these two effects and therefore always makes it impossible to experience a virtual situation as though you were actually in it yourself. Even a 3D cave with shutter glasses does not cut it. One needs actual eye-lens tracking and a sharpening effect computed in real time to match the lens focusing of each individual eye, also accounting for lens defects otherwise rectified by wearing glasses (as of course these cannot be worn while connected to the virtual interface apparatus).
I don't even do dev work, cinematography, etc., and this was extremely interesting. The color shifting rooms was a phenomenal comparison. Really liking these tech vids.
Fantastic reporting. I really liked this style and it's pretty cool how cutting edge what you're doing is. It's very interesting what you've accomplished so far and it's cool to see you are always evolving. Thank you for giving us a window into the developing world in a fun and interactive way. I'm looking forward to seeing where this path takes you and us, because this is going to have an influence on the industry.
As a software dev (in fintech rather than gaming), this was really well put together for ordinary people to follow. The script and visual examples were thoughtfully refined and lucid.
This is so cool! You're so brilliant at making these cool videos and I'm really enjoying the information lessons added in with the cyberpunk theme. Keep up the great work mate! 👍👍👍
What about skin? Even here and in every game, everyone looks like an action figurine. It's always too shiny; really, everything in games is (save for that insane tech demo mentioned and a few motorsport games). The new Half-Life 2 raytracing videos look like plastic action figures on an acrylic-painted diorama; the over-contrast is terrible and it doesn't look much better than real-time cubemap lights and GI transport. These giant space-heater video cards for barely 60fps, blurry temporal AA, noisy reflections. It's crazy disappointing. Voxel cone tracing should have gotten hardware acceleration before raytracing fr
The difficulty rendering realistic skin is not in color space or tone mapping, it's in the way light interacts with skin. Skin is not a hard surface; it has many layers and light bounces between them. There are approximations, but the better ones are too expensive, yet still not perfect (it's a difficult problem). Ray tracing (or path tracing) does not change the properties of the materials, so a hard material with a better light model will look MORE like a hard material. Re: giant video cards barely managing 60fps at raytracing - hey, you have realtime raytracing! Sure, it's not perfect yet, but just a couple years ago this was a pipe dream. For now we can enjoy real reflections (I'll take blurry over fake) and cheap amazing shadows.
@MadsterV then why does most SSS look even worse? I put it to you specular is just plain wrong in nearly everything, and seeing it in vr dramatically highlights the flaws in the current approximation
Subscribed. I've seen a few of your videos in the past and they are always super informative. I've learned so much about broader design concepts from you!
3:16 if this video was uploaded in an HDR format it probably would've been a better demo to show the difference. YT does support it and it's very noticeable on smartphones
Knowing more about tonemapping and how to edit it in Unreal is certainly something I am interested in. My goal is not realism however, but having the power to adjust things to fit my particular project would certainly be nice.
Just wanted to say, I have been a huge fan of yours since the good old Omegle days and wanted to thank you for making my days brighter when I was going through the shit. I hope one day you could do that again, maybe on Discord or the next version of Omegle :D. Anyways, glad you are still at it, and maybe do those challenges again where you get people to edit a recording of you on the green screen :D
Damn, this is all so damn good. I should support you on Patreon, even without the saucy stuff, but I won't say no to a bonus. You're doing amazing work, Angel.
Absolutely love your info videos Angelica! Goes way beyond just a tech- or info-filled death-by-PowerPoint. You get the points and info across while still breaking it down Barney-style so we get it and can visibly see it. Thanks for all your hard work.
WTF, YouTube just recommended this and it's amazing. I know nothing about colors or aesthetics or design, but I LOVE learning more about the technicalities behind things I like (like video games). Great channel, imma watch more videos. Cheers from Brazil
Intriguing. As someone with one of the few cameras capable of outputting Linear Raw direct from the sensor, I'd be interested in trying that Tone Mapper in a film project, and seeing how well it would be displayed on Digital Cinema Projectors.
Ok. I'm in love. I'm an amateur game developer. I'm no artist. Everything I make is ugly AF. Recently I moved to UE5 (after Unity pricing scheme) and decided to focus more on visuals because UE5 helps me make stuff pretty with way less effort than Unity I was using. And it's time to get out of my comfort zone. And as someone who has very little knowledge about visual part of game development (I will spend hundreds of hours programming some awesome, complex mechanic but maybe 10 to make visual representation of that mechanic) - You explain stuff so well. And you make it look interesting. Thanks!
Amazing video. I hope the channel blows up big time, this is such good content. Knowledge, like actual deep knowledge of cinema and game dev, a cute polygon girl, absolutely love this.
Yes and it's even more complex than that in a bunch of directions. What you ideally want is that the color impression happening in the viewer's head matches the color an area of an object in a scene would realistically have considering all lighting and all surrounding objects and their properties in a scene. There are a lot of things influencing that. Let's try, if I get them all: - The developers who create the "3D tool" (yes that's a very simplified term), e.g. Unreal Engine and how deep they want to go concerning all aspects of realistic color while weighing it against performance impact - The artists creating worlds in said 3D tool and their level of need to a) change reality due to their need to get their artistic idea across and b) take shortcuts with color filters and other tricks, because they don't want to or don't have the budget to actually rig lighting and atmosphere conditions exactly right to get the effect they're going for. Ideally you switch all the "tricks" off and get as close to what you want with simulated lighting, atmosphere, ... and then if there's still a gap between what you see and what you want add tricks. Takes more time, so that's a money and patience issue. - The monitor companies. We haven't seen a lot of visible improvements concerning stretching out that tiny sRGB triangle in the hardware (Monitor companies will of course disagree). It's not impossible, but the pressure towards them to innovate in that direction isn't enough. And monitors delivered to the general public are not well calibrated, if at all. So all the people above and below this bullet point see "wrong" colors, if they didn't invest time and money to properly calibrate their stuff. There's a lot of room to improve there. - The human eye. Very tricky thing. The bad news is everybody is a bit different. Color perception changes with a bunch of physical conditions and environment lighting. 
The ability to perceive color how it actually is also gradually degrades in old age. So there is definitely a need for the viewer of the final product to be able to adjust the colors they're presented with so it looks optimal to them. Not sure if that list is complete, but it's at least something. There's a lot to think about in getting the viewer's color perception realistic.
Oh boy, I think I opened a can of worms by watching this. I'm just a "consumer" and have zero experience in filmmaking or content/game creation. But I have a hunch that I won't be able to un-see color tone mapping issues going forward. You know, like the indicator blip for an upcoming movie reel change that is visible in older movies. Once you have knowledge about it, you can't un-see it anymore, even when not really paying attention to it.
I really wish you didn't delete your old videos 😢 it was cool watching the progression of your avatar. How far it's come. Also, the stranger interaction from Omegle was entertaining too.
Yo! What is this channel? This presentation is living in 2077! Really cool stuff! How is this 3D character so perfectly synced? Just a MetaHuman thing, or more advanced motion capture? It's so natural. The interaction with the viewer is next level!
@@Jimmy-wf6vy Like most V Tubers, yes. But it doesn't matter. It looks cool. And I rather look at a fake babe, than some DEI crap that we're fed with lately.
@@Jimmy-wf6vy What are you on about here? The character model is the fake babe I was talking about. "Fake" - coz there's often a dude behind it all, "babe" - coz she looks attractive for typical reasons. What's with the weird questions?
I love your detailed explanation; now I want to take a dip into tone mappers. I usually just use the color grading that's available in Unreal. As for The Last of Us, that's probably the color style they were aiming for in that scene, a highly saturated one. Thanks for the video! Also, I've been subscribed to you for years, but your videos never show up in my home feed, literally for I don't know how long, and I just noticed your old videos are gone now, or am I on a different timeline lol
So, my screen is able to display most of the DCI-P3 color space. I’d be curious to know how this changes the picture for tone mapping? Also how does OLED versus LCD change the picture. I’m pleased that your lip sync is very on target here. Most AI stuff I’ve seen has problems in that space. And, as of now, you are the only AI-related channel that I am subscribed to.
OK, she is so good at this. I'm having some callbacks to the first time I watched The Matrix in the movie theater, back when it first came out. It was amazing. She is really amazing at this. God, I hope she's making enough money. We need to support our talent or lose it.
The initial upload of this vid had less than 1/3 the views of the previous upload in the first minute, so I just re-uploaded it lol. Also, I've been awake for like 20 hours, so I'm gonna sleep, and if YouTube totally guts this vid's reach then so be it; we'll see when I wake up lmao.
Your videos are always above and beyond any others I have seen. I'm a total beginner to game design and I've learned so much from you, so thank you!
Yup, the algorithm has been going berserk for a while now.
If I hadn't just checked my subscriptions page, I'd probably have found this after like a week via the YouTube recommendations on the home page... great content btw.
I guess we're just glad to be seeing you back - I can't even count how many times I asked myself if I was dreaming or something... yeah, I'm talking about the explainer-type content.
You are an absolute Angel. Love this video and appreciate it as a colour grader who works in Resolve and spends far too long messing with scopes and tone mapping settings. It really is a science that goes deep, and there are so many ways to go about tackling it, and many frustrations, as your eyes adapt quickly and you lose the ability to even see that it looks like dogsh** until you walk away and come back after a break.
Oh ai angel :O
Get some sleep, great work.
The quality of animation is absolutely fantastic. The transition from being zoomed in on the screen showing the pixels, and then seeing 'you' walk off set... idk, it was just fantastic.
Off subject but I really wouldn't mind a freaking mod to have AI Angel as a cameo in Cyberpunk. That'd be hella amazing to see!
Unlockable girlfriend character.
well, she can just make an AMM character and put it on Nexus
@@Furansurui He*
no
Just make a character that looks like her and have someone turn it into an NPV mod. Bam.
There's a lot more complexity to color than you present in this video. I think you're falling into the Dunning-Kruger effect. ACES (Academy Color Encoding System) is not a tonemapper; it's a color encoding system (it's in the name). There's a lot of information in this video that simply doesn't make sense, and I'm concerned it's misleading a lot of people.
Tonemapping is all about taking a wide dynamic range and representing it faithfully in a low-dynamic-range environment. Throughout the video you seem to mistake colorspace transforms for tonemapping and vice versa.
Tonemapping is a very subjective process. Note how you prefer your custom tonemapper over Unreal's default one; that alone is proof that it's subjective. Without tonemapping, luminance values would be clipped when being transformed from larger to smaller dynamic ranges.
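To make the clipping point concrete, here's a toy Python sketch (my own example; the Reinhard curve is just one classic operator, not the one from the video):

```python
# Toy comparison: hard clipping vs a smooth tone curve.
def clip(x):
    """Naive range reduction: everything above 1.0 is simply lost."""
    return min(max(x, 0.0), 1.0)

def reinhard(x):
    """Classic Reinhard operator: compresses highlights, never clips."""
    return x / (1.0 + x)

# Scene-referred luminance values, some well above display range.
hdr = [0.18, 1.0, 4.0, 16.0]

for v in hdr:
    print(f"{v:6.2f} -> clip {clip(v):.3f}  reinhard {reinhard(v):.3f}")
```

Notice that 4.0 and 16.0 both clip to the same 1.0 (all highlight detail gone), while the curve still keeps them distinguishable.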
A colorspace transform is used to transform colour between two colorspaces; when going from smaller to larger colorspaces, there is no 'preference' there, only objective truth. For example, you can do a colorspace transform between Rec709 and Rec2020.
But what does it mean to do a colorspace transform?
The keyword is relativity. It is ALL relative.
You have failed to highlight a crucial fact: values only go between 0 and 1, with 0 being off and 1 being fully on. ACES, Rec2020, P3, Rec709... there's no difference between the data. It's all a range of 0 to 1. So if it's all 0 to 1, then what the fuck is going on?
It's relativity. Rec709 is a technical standard; it is simply a reference point. That's important - it's a guide, a goal, something to strive towards. Rec709 defines specific color coordinates which can be measured IRL using a spectrophotometer. For example, you could take pure green (say, RGB 0/1/0) and display it on a Rec709 display. You then measure the light emitted by the display with a spectrophotometer. If the measured result matches the Rec709 standard, congrats, you have a perfect Rec709 green.
But the thing is, what if you display RGB 0/1/0 on a Rec2020 display? Rec2020 is supposed to cover a much broader range of colors, right? Correct. If you put 0/1/0 on a perfect Rec2020 display and measure it, the reading will match the Rec2020 standard.
What gives? Suddenly all our color values have no meaning at all and it's just up to the display? Well... yes. That's right.
A colorspace transform is a way of mapping that 0/1/0 between colorspaces. Let's say we have that pure RGB 0/1/0 green in Rec709, but we want to display it on a Rec2020 display. What you do is transform the 0/1/0 into Rec2020. What ends up happening is 0/1/0 becomes (for example, not the correct number) 0/0.6/0. Now the green value is lower. This makes sense; Rec2020 goes much more saturated, so to match Rec709 you need to bring the green down. Now, if you put the transformed green on a Rec2020 display beside a full 0/1/0 green on a Rec709 display, they will match. It's ALL about relativity. And when you go from a LARGE space like Rec2020 pure green down to Rec709, you need to start making decisions about how that maps out, and that's where you enter the land of tone mapping: making a LARGE dynamic range fit into a LOW dynamic range in a way that represents the nature of how it looked in the large dynamic range.
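To make the relativity idea concrete, here's a rough Python sketch (my own numbers, not from the video) that pushes Rec709 pure green through CIE XYZ into Rec2020, using the standard 4-decimal D65 RGB-to-XYZ matrices:

```python
# RGB -> XYZ matrices (D65 white point, 4-decimal precision).
M709 = [[0.4124, 0.3576, 0.1805],
        [0.2126, 0.7152, 0.0722],
        [0.0193, 0.1192, 0.9505]]

M2020 = [[0.6370, 0.1446, 0.1689],
         [0.2627, 0.6780, 0.0593],
         [0.0000, 0.0281, 1.0610]]

def inv3(m):
    """Invert a 3x3 matrix via the adjugate."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    adj = [[e*i - f*h, c*h - b*i, b*f - c*e],
           [f*g - d*i, a*i - c*g, c*d - a*f],
           [d*h - e*g, b*g - a*h, a*e - b*d]]
    return [[x / det for x in row] for row in adj]

def mul(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

# Rec709 pure green -> XYZ -> Rec2020
xyz = mul(M709, [0.0, 1.0, 0.0])
rgb2020 = mul(inv3(M2020), xyz)
print(rgb2020)  # roughly [0.329, 0.920, 0.088]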
With this new information I want to revisit ACES. There are MANY colorspaces that are part of the Academy Color Encoding System. ACES 2065-1 has 100% coverage of all wavelengths in our reality; actually, it extends beyond them significantly. If you had a 0/1/0 in ACES, no display in the universe could ever display it. The point of ACES 2065-1 is to be an intermediate container. It's objective. You slam all your different colorspaces into ACES, and then transform out of it in a unified way. You never WORK in ACES; you only conform and store color in ACES. They tried to make ACES usable in a workflow with AP1, but AP1 is so close to Rec2020 that colorists prefer Rec2020, since that's what people with Rec2020 displays will be watching.
But I just want to bring attention to one final interesting thing: Pointer's gamut. Pointer's gamut is an irregular gamut of colours that exist in our real world. That is to say, Dr Michael Pointer went out and measured real colors reflected from the objects around us. The result is a wavy gamut that represents the color we see most of the time. The only time you'll see colors outside of Pointer's gamut is with things like LEDs, which can produce a specific and incredibly saturated wavelength of light.
Now compare Pointer's gamut to Rec2020. Rec2020 covers colors that you don't really see in real life. In fact P3 is the closest match you'll get to "real life" colors, and P3 is the cinema standard.
And this really highlights my opinion on this entire matter: do you have materials in your scene that are producing colors outside of Pointer's gamut? Because if you do, of course you need a tonemapper to fix it. You're producing unrealistic colors! No wonder your image looks unrealistic even after tonemapping. Your use of postprocessing, color correction and LUTs won't fix the root issue: your scene is producing colors in a scene-referred world (aka linear), which are then tonemapped to look more reasonable.
Also, fun fact: colorists spend their time grading in linear, since it's how light actually works. So if you have an issue with how your image looks, you need to start paying attention to the linear scene-referred color instead of the result down the chain of tonemapping and correcting.
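A tiny sketch of what working in linear buys you, assuming the simple 2.2-gamma approximation of sRGB (the real sRGB curve is piecewise, so this is only illustrative):

```python
# Why light math belongs in linear, not in the encoded signal.
def decode(v):
    """Display-encoded signal -> linear light (2.2 gamma approximation)."""
    return v ** 2.2

def encode(v):
    """Linear light -> display-encoded signal."""
    return v ** (1 / 2.2)

a, b = 0.0, 1.0               # encoded black and white pixels

wrong = (a + b) / 2           # naive 50/50 blend on encoded values
right = encode((decode(a) + decode(b)) / 2)  # blend in linear light

print(wrong, right)  # 0.5 vs ~0.73
```

The naive blend of the encoded signal comes out noticeably darker, because the encoding is nonlinear; light only adds correctly in the linear domain.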
There's so much more I could say about this since it's so deep and complex, and this is truly just scratching the surface.
ngl this whole video is just an agx transform glazing post and they charge you 30 dollars for it smh my head
literally this, up to the top you go
Also worth noting that tonemapping can go the other way too. It really is as simple as the name suggests: remapping a tone to a new one in whatever colorspace is being used.
my tiktok infested short term memory brain doesnt know what you wrote, but after I read the first 2 sentences I agreed with you.
🤓 TL;DR 🤓
back in the late 90's, when I was in animation school, nothing like this was ever covered, nor did the big programs like Houdini, Alias or Softimage have any kind of renderer or post-renderer that took this topic into account. Good stuff, glad you've put it out there for the world.
Bruh, the model quality and voice quality is so much better than it's ever been. Keep up the great work!
Other than the stiff hair, which is understandable, her model is great!
@@171sako is she using mocap or a VR rig to record the movement, or is it just pure A.I.?
@@vincentazara2947 From what my layman's eyes can see, she uses motion capture for body movement (possibly an off-the-shelf VR setup including goggles and controllers, plus positional markers for the feet).
The lip sync and facial expressions look like AI interpreting the audio of the voice lines.
The movement there drags slightly behind the audio.
What's notably absent is the breathing of the character model. The ribcage and belly are an unmoving block.
@@Culpride It's literally just a vtuber. Not real ai at all. can't be that gullible, come on.
@@rock962000
What tf is "real AI" supposed to mean?
Do you mean AGI? - None here, it doesn't exist yet.
Do you mean what's often referred to as "generative AI"? (which is usually diffusion-based algorithms trained via reinforcement learning) - None of those are used here. The audio seems legit, the visuals are UE. Maybe the background music?
Judging by the hand tracking, they use Index controllers, and the weight shifting is consistent with 3D markers. Pretty standard VR stuff. But both would be unusual for vtubers (who often feed cam footage through an algorithm to track movement and expressions, hence the flat characters with no hand gestures or plausible weight shifts).
I feel like my IQ just jumped up a couple of points! I am not a filmmaker or game developer, but it's always great to peek behind the curtain and see the tech they use to bring all their work to life! Speaking of life... Wow, Angelica! You are looking the BEST you ever have! The lip syncing, hair movement, facial expressions, eye movements, and speech have all VASTLY improved! You are a marvel, and a great spokesperson for the tech that makes you possible. Keep up the great work!
Thanks Angelica, I always wondered why CGI always looked so saturated, while everyday, homemade/reallife footage didn't.
It depends. Small-sensor devices such as smartphones and action cameras have horrible hue resolution, which they sacrificed to increase low-light sensitivity; aiming for saturation similar to photographic film would produce visible banding. I regularly shoot with a PowerShot G2 and an Optio W10, and the colours are actually quite strong while being very film-like, so they don't look artificially punched up.
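A rough Python illustration of the banding trade-off mentioned above (toy numbers of my own, nothing measured from a real camera):

```python
# Coarse tonal precision -> few distinct steps -> visible banding.
def quantize(v, bits):
    """Snap a 0..1 value to the nearest representable level at `bits` depth."""
    levels = 2 ** bits - 1
    return round(v * levels) / levels

ramp = [i / 999 for i in range(1000)]   # smooth 0..1 gradient

for bits in (5, 8, 10):
    steps = len({quantize(v, bits) for v in ramp})
    print(f"{bits}-bit: {steps} distinct steps across the gradient")
```

At 5 bits a smooth gradient collapses to 32 visible bands; boosting saturation stretches those few steps further apart, which is exactly when banding becomes objectionable.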
This series of informational videos are probably some of the most engaging I have ever seen. The way that you're able to explain a concept and then immediately follow it with a literal example of it in action is equal parts fascinating and satisfying... I don't even have an aspiration to apply these concepts to anything but these videos make the process of learning them so enjoyable.
I feel like your videos at this point are themselves blurring the edges of reality. It's not just that Angelica is becoming more and more realistic, both visually and from an auditory perspective, but the actual composition of this video is so polished, yet so meta. A computer generated character moving in and out of computer generated worlds explaining to me how lighting of computer generated characters and worlds work. The actual structure of the video is flawless, the way Angelica moves in and out of the scene makes it sometimes seams like she's a part of the worlds she's talking to us about, but sometimes an observer on the outside looking in (like we are). The script is tight and well pulled together, it actually feels like something I might have watched in class (but actually enjoyed). I just love this. I love all of it.
That said, it was one thing when this channel was mostly silly Omegle trolls, but I almost feel like now the mention of "spicy" content takes away from the other remarkable aspects of your content. I'm not judging, just saying it feels SO disconnected from the otherwise high-brow, thoughtful content you've been doing for a while now. Like, I would love for you to do some kind of cross over with Captain Disillusion, but I suspect the spicy content might take such collaborations off the table... but maybe that juxtaposition is just another part of the surreality of Angelica.
Keep doing what you want and may the AI Angel project live long and prosper.
Ai Angel doing learner videos? Hmm neat. I think I like it.
agreed
What is he using to generate the voice?
@@LagrangePoint0 probably some TTS in combination with an RVC model. If not, something similar to ElevenLabs.
The benefit of the first one is that you can get pretty good results with little latency, depending on the model. You can also ditch the TTS, and use the RVC on your voice directly. Interesting stuff.
@@kaidaluck648 I tried to use elevenlabs for a web app, I couldn't get the right intonation nor emotions from it, even with context.
@@kaidaluck648 okay am I dumb. I watched the entire video and assumed it was voiced and mo-capped by a real girl........
I'm still waiting for 50's Technicolor, but this is impressive. Thanks for the lesson, Angelica.
Yeah, Technicolor is very underrated.
It's interesting that we can make games that look like real life, but we can always tell it's a video game when we look at the human characters. Our minds have the ability to detect when something is slightly off about a person or the way they move.
Completely separate from the wacky stuff you might find on the patreon, this project and channel providing REALLY insightful content like this is amazing. That comparison between ACES and your Custom tone map was the perfect visual and made me understand everything instantly! I'm glad you're using this project as an education tool and I hope it continues!
Your best piece of content yet, after following your progress for years. Well done. It was assured, informative, and a thoroughly good watch. This one might blow up.
Sooooo good, bro!! Color mapping has been a sore topic for photographers and videographers for eons. Add to that game devs and SFX teams. I really dig how you interacted with the topic, explained it with actual examples, and the camera tap was perfect too! SO GOOD! The voice jibes with her teeth and is nearly flawless and authentic-sounding. Catch lights in her eyes could really put the chef's kiss on the whole presentation. I also appreciate the skin texture; hadn't noticed that before. Subtle and perfect. Thank you again for all the INSANE hard work.
Thank you for this. I've been an environment artist in Unreal for years and have noticed this many times while trying to get the correct palette for my projects. I will have to start implementing this in my current project. Again, thank you.
I love how smoothly you transitioned into this documentary format. The editing, the way you set up Angel, the work with sound, and so on. 👏
Very cool info. Although I will say, I don't really see this kind of tone mapping as objectively better. I mean, it clearly is if the main goal is photorealism. But like film, games are often a mode of storytelling. And just as I get why filmmakers would want to favor saturation in a lot of cases to tell their story, I also get why game devs would too. But it's definitely good that this is being understood, so that devs can make the choice that's appropriate for what they're looking to achieve, versus just being stuck going down one path because it's the only one they really know.
To me, what really sold it was the scene with the colored light shifting. In the original ACES it felt oversaturated at times. And I know I've experienced the issue of colored lights not really illuminating things the way I want.
@@84bitDev On the other hand, ACES is able to display super-saturated colors, like neon or pure 100% red lights. This tonemapper, however, sacrifices maximum possible saturation for smoother transitions, but looks more gray as a result. It's simply not capable of reproducing fully saturated colors, which ACES at least can do. And in ACES you can hold back on the saturation of the colored lights or surface materials if you want to.
I think you could shift the tonemapper when you want super saturated colors.
That could be a nice effect. For example, your character has a vision-boosting ability: you switch back to ACES or another saturation-biased mapping and get blasted with color. Or you could do it when hit or something. Or you could map it linearly and saturate more as you build combos.
Yeah! Tech pioneering, beautiful visuals and engaging discussions, just like the old times. I don't do computer graphics myself, but I do paint, so every bit of colour theory is useful, especially in your very professional presentation. Thank you! Wonderful work!
look who’s back after 3 light years
Light years is a measure of distance not time
@ i know i’m just being dramatic 😭
@ sorry I’m autistic I couldn’t resist
@@TrifectaMonkey I understand and I'm not mad; I'm just bad at showing who I am and whether I'm joking or not. But don't be sorry. You're smart.
@@TrifectaMonkey how do you know Ai Angel didn't travel 3 light years before making a new video?
Color, Lighting, Textures. The three pillars of graphical quality. And who better to explain the importance of color than the GOAT of V-tubers in this reality.
lol 99.99% of people aren't even using a color accurate picture mode
Not sure how much further your expertise extends, but I'd love more videos explaining technical things like this. You broke it down into multiple digestible formats, and that's super helpful. Thanks a lot, and fantastic job on the UE5 tonemapper!
Entirely off topic but I think the facial expression mapping for your Avatar has gotten a lot better recently.
Really it is still trash and the cyberpunk thing is just a bunch of bokeh and going hard on the reflections... it works in very few areas. Most of the time it just looks like a more grey cyberpunk.
good video. I've been a 3D artist for like half my life, and started jumping into Unity for devving and modding games over the past few years. It's always blown me away how important color management is, and how much it's blown off to the side for games. The Source engine's dynamic tonemapper keeps Half-Life 2 looking better than most games today, and then with Half-Life: Alyx/Source 2 they only improved upon it, while still employing most of the same basic lighting and rendering techniques from 20 years ago. And Half-Life: Alyx still looks better than every game IMO (while being VR!!!!!).
I'm glad this video is out to inform newer devs about this stuff; even though I've known about this concept for a while, I haven't really been able to apply it to my own work (especially when it comes to making my own tonemapper). One thing I'm interested in hearing more about: colors in textures, and how those can also affect the image overall. Nowadays it seems standard to keep textures as close to "ground truth" or as flat as possible, to let lighting and the other texture maps like normals and occlusion/roughness/metallic do the heavy lifting, but early games with straight photo textures can still look pretty damn good. So I would be interested in seeing research into texture color management and how it affects perceived image quality.
This was really good info. Great in-depth explanations. As well as break downs and concept transitions and understanding. I liked how you came off screen from the ‘film shoot’ like they do in real-life.
Attractive Avatar - Yea ok sure.
Colour theory and examples - Alright, makes sense
Lightsaber example at around 7:53 - 7:57? - YO WTF THAT IS CRAZY.
3:23 the camera bump was insanely realistic, well done.
Baller. Thank you for helping make my future projects even better.
Color grading makes movies' real life scenes look artificial. But we like that because it helps convey contexts and things. What you did with the colors, makes artificially generated scenes look real. Well, at least way more real than we thought was possible. Awesome work!
I have literally zero knowledge in game design, but watching your content always has me wishing I did because it always looks so good and has been pretty informative lately (which I love). I think my biggest thing I'd want to learn is how you made your character and what you use for motion capture. I'd love to be able to have an avatar to play around with like yours.
Those facial animations are absurdly good. Are you using mocap or rigging it by hand?
Holy crap this channel has come far
you are lowkey smart af
I think something has been lost in this change... Everything seems much flatter and paler, without intensity.
I would like to see a middle ground between the two.
Fantastic video! I would like to add that ACES makes sense in the realm of video production because cameras are ultimately limited to a much lower dynamic range than the human eye. Most cameras on the market do about 13-16 stops of dynamic range, but our eyes can do up to 23 stops. So using a limited tonemapper like ACES works there, because you are already dealing with limited information. In the digital realm, however, there is virtually no limit to how much dynamic range we can create. The digital scenes that we make can easily cover the dynamic range of our eyes, and probably even further. Now, there is little reason to try to display all of that, because the display medium is ultimately going to be the limiting factor. But if we have the possibility to digitally process that much dynamic range, then we absolutely should, to preserve as much information as possible before we compress it down to a display format like sRGB. And ACES kind of artificially limits the capabilities of this. We're not dealing with digital video shot on a CMOS camera sensor, so we shouldn't treat the colors like we are.
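For scale, the back-of-the-envelope math behind those stops (one stop is a doubling of light, so n stops correspond to a 2^n : 1 contrast ratio; the 13/16/23 figures are just the round numbers from the comment above):

```python
# One photographic stop = a doubling of light, so n stops of dynamic
# range correspond to a contrast ratio of 2**n : 1.
for label, stops in [("camera (low end)", 13),
                     ("camera (high end)", 16),
                     ("human eye (adapted)", 23)]:
    print(f"{label}: 2^{stops} = {2**stops:,} : 1")
```

So the gap between a 16-stop camera and a 23-stop eye is not 7 "units" but a factor of 128 in contrast ratio, which is why a camera-oriented tonemapper leaves headroom on the table for synthetic scenes.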
Also, I have been arguing for a long time now that pushing higher polygon counts and trying to get rendering closer and closer to the way light behaves IRL, with say raytracing or pathtracing, is not what is missing to make things look more photorealistic. The whole idea of trying to do photorealism but then slapping low-dynamic-range simulations, tons of bloom, volumetric lighting and god knows what else into the mix is a setup for failure. A hyperrealistic approach is never going to look photorealistic, because the nature of photorealism is to present things as they would look IRL, not as an artist would like them to be. Photorealism is about replicating, not about interpreting or enhancing. So strip away effects if they're too much, dull it down, and focus on the fundamentals. And I think with your tonemapper we are definitely one step closer to achieving that goal.
Now all we need to do is get companies to ditch the outdated and highly flawed CIELab color space for the much more versatile and perceptually accurate OKLab.
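For reference, a sketch of the linear-sRGB to OKLab conversion, with the coefficients as published in Björn Ottosson's OKLab write-up (transcribed from memory, so verify against the original before relying on them):

```python
# Linear sRGB -> OKLab: an LMS-like matrix, a cube root, then a second
# matrix producing lightness L and the two opponent axes a, b.
def srgb_linear_to_oklab(r, g, b):
    l = 0.4122214708*r + 0.5363325363*g + 0.0514459929*b
    m = 0.2119034982*r + 0.6806995451*g + 0.1073969566*b
    s = 0.0883024619*r + 0.2817188376*g + 0.6299787005*b
    l_, m_, s_ = l ** (1/3), m ** (1/3), s ** (1/3)
    return (0.2104542553*l_ + 0.7936177850*m_ - 0.0040720468*s_,   # L
            1.9779984951*l_ - 2.4285922050*m_ + 0.4505937099*s_,   # a
            0.0259040371*l_ + 0.7827717662*m_ - 0.8086757660*s_)   # b

print(srgb_linear_to_oklab(1.0, 1.0, 1.0))  # white -> (~1, ~0, ~0)
```

Neutral grays map to a = b = 0 with L following the cube root of the input, giving a perceptually even lightness axis.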
dude, your sound design in this video about color theory was amazing! This is some mind-blowing work!
I gotta say, I'm more a fan of aces and higher saturation. Especially for games. "Realistic" today looks as exciting as a soviet apartment cube.
That said, your production levels and quality for this video is absolutely amazing.
As someone who does a lot of stuff with Unreal for games/films and my own fun projects, these are the kinds of discoveries I love. I love it when someone thinks outside the box and breaks the mold to improve something. Also love the in-depth explanation of the whole subject. Well done! Definitely going to support this.
You've been blowing my mind since you were created, Angelica, and even though I know very little about game development, I could listen to you talk about it for days. I hope you do more of these, this was a great video ❤
This has got to be the best video you've made! Even though I understood only about half of what you were describing/explaining, you did it in a way that made sense, step by step. It's good to see the AI Angel project making these sweeping changes. By the way, nice fangs, LOL.
the evolution of this channel has been amazing to watch
Your best video yet. This was really cool to see the comparisons throughout the whole video.
Colour space is a really interesting topic. You never really consider that 100% of what you see on screens is informed by super elite wizard people that get to decide. Acerola has a really good technical and fun video on the subject for those interested.
this was a really cool video. I'm coming from the video space, so it was an interesting angle as I delve in UE. Also, this is my first vid of yours. Very well made, and the whole virtual setup really allows for some great transitions, nice job!
Havent seen your content show up in quite some time, and this was great.
Love the look, sound/voice you have updated on Angelica.
And this was a fantastic shift(to me) on content by being a learning video.
My main issue with those "realistic" tonemappings is that with a skylight, everything is lit with no color, like it was permanently overcast. It looks nice in controlled situations, but the illusion falls flat once you look at a big light source.
I'm too distracted by how good the video looks - editing, details, motioncapture and just how freaking cool this format is - to focus on and digest what is being said.
Okay wow I wasn't following really well until you had them side by side. The custom tone mapper makes a WORLD of a difference! Sheesh!
I'd love to see more videos like this. I'm a 3d artist looking to go into some video production using UE and knowing this ahead of time has no doubt saved me a bunch of effort.
This is super interesting and informative. I'm not a graphics artist or animator or anything so I'm not sure why it showed up in my feed, but what a great job you've done helping me sort of understand the topic and your presentation is fantastic! Subscribed. 😺
While a little bit went over my head, I really liked that you explained it so well. I feel like I have a better grasp of lighting and color now, so thank you for that.
Oh, and your custom stuff looked way better than the ACES stuff.
I remember your old vids and... I'm amazed at the pace you got better, looove the hair bounce! And I think you picked a great topic, I'm excited for this kind of content :)
Looks impressive; I am immediately convinced. The ACES look reminds me of the yellow filter in 2010-era games. The issue being that ACES is apparently unavoidable. Another big issue is that the limited color range is so widely entrenched that it's unforeseeable when a new wider-range standard will be supported by game devs, given the relatively small number of people owning the required hardware. Lastly, I myself have night-light mode permanently on, i.e. blue at only 50% intensity. This again changes the realism. Given the video presentation, though, the biggest issue before color and geometry is to me the projection itself. Looking outside, I'm used to not being able to tell whether two lines are parallel or whether a line is straight, due to the fisheye effect of the human eye. I also need time to set focus on an observed point. A screen will always ruin these two effects and therefore always makes it impossible to experience a virtual situation as though you were actually in it yourself. Even a 3D CAVE with shutter glasses does not cut it. One needs actual eye-lens tracking and a sharpening effect computed in real time for the lens focus of each individual eye, also accounting for lens defects otherwise corrected by wearing glasses (as of course these cannot be worn while connected to the virtual interface apparatus).
I don't even do dev work, cinematography, etc., and this was extremely interesting.
The color shifting rooms was a phenomenal comparison.
Really liking these tech vids.
As someone studying media technology engineering... this was extremely informative. And a fantastic video.
I guess this is why I have hated the color and lighting of most games for the past 10+ years. It just feels way too dark, contrasty, and oversaturated.
Fantastic reporting. I really liked this style and it's pretty cool how cutting edge what you're doing is. It's very interesting what you've accomplished so far and it's cool to see you are always evolving. Thank you for giving us a window into the developing world and a fun and interactive way. I'm looking forward to seeing where this path takes you and us because this is going to have an influence on the industry.
This may be the most sophisticated content on color/color mapping theory I've ever seen. Also love your production quality. Bravo!
As a software dev (in fintech rather than gaming), this was really well put together for ordinary people to follow. The script and visual examples were thoughtfully refined and lucid.
Colour is nice, but animation quality is the real culprit here
This is so cool! You're so brilliant at making these cool videos, and I'm really enjoying the information lessons mixed in with the cyberpunk theme. Keep up the great work, mate! 👍👍👍
What about skin? Even here and in every game, everyone looks like an action figurine. It's always too shiny; really, everything in games is (save for that insane tech demo mentioned and a few motorsport games). The new Half-Life 2 raytracing videos look like plastic action figures in an acrylic-painted diorama; the over-contrast is terrible, and it doesn't look much better than real-time cubemap lights and GI transport. These giant space-heater video cards for barely 60fps, blurry temporal AA, noisy reflections. It's crazy disappointing.
Voxel cone tracing should have gotten a hardware acceleration before raytracing fr
The difficulty rendering realistic skin is not in color space or tone mapping, it's in the way the light interacts with skin. Skin is not a hard surface, it has many layers and light bounces between them. There are approximations, but the better ones are too expensive yet still not perfect (it's a difficult problem). Ray tracing (or path tracing) does not change the properties of the materials, so a hard material with a better light model will look MORE like a hard material.
Re: giant video cards barely managing 60 fps with ray tracing: hey, you have real-time ray tracing! Sure, it's not perfect yet, but just a couple of years ago this was a pipe dream. For now we can enjoy real reflections (I'll take blurry over fake) and cheap, amazing shadows.
@MadsterV Then why does most SSS look even worse? I put it to you that specular is just plain wrong in nearly everything, and seeing it in VR dramatically highlights the flaws in the current approximations.
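For anyone curious, the "light bounces between layers" point in this thread is why real-time engines often fall back on cheap diffuse tweaks instead of simulating skin properly. A minimal sketch of one such trick, wrap lighting, which softens the terminator the way subsurface scattering does (the function names and the wrap factor here are illustrative, not any particular engine's API):

```python
def lambert(n_dot_l: float) -> float:
    """Standard hard-surface diffuse: light cuts off sharply at the terminator."""
    return max(n_dot_l, 0.0)

def wrap_diffuse(n_dot_l: float, wrap: float = 0.5) -> float:
    """Cheap subsurface approximation: 'wraps' light past the terminator,
    softening the falloff the way light scattering inside skin does."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# Just past the terminator (n_dot_l = -0.2), a hard surface is fully
# dark, but the wrapped version still glows softly.
hard = lambert(-0.2)       # 0.0
soft = wrap_diffuse(-0.2)  # (-0.2 + 0.5) / 1.5 = 0.2
```

This only fakes the soft falloff; it doesn't model the reddish bleed or the blurring of fine shadow detail that proper diffusion profiles give, which is part of why cheap SSS can look worse than none at all.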
Love the presentation. Angelica is very fluid, and the way she moves and talks is excellent. Congrats.
Subscribed
Subscribed. I've seen a few of your videos in the past and they are always super informative. I've learned so much about broader design concepts from you!
I love the direction you're taking with your content here. Seriously, thanks for sharing this.
3:16 If this video had been uploaded in an HDR format, it probably would've been a better demo to show the difference. YouTube does support it, and it's very noticeable on smartphones.
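For context on why an HDR upload would show the difference better: HDR10 video encodes absolute luminance with the SMPTE ST 2084 "PQ" transfer function instead of a relative gamma curve, so highlights far above SDR white survive into the signal. A minimal sketch of the encoding side (constants are from the ST 2084 spec; the function name is mine):

```python
def pq_encode(y: float) -> float:
    """SMPTE ST 2084 (PQ) encoding. `y` is linear luminance normalized so
    that 1.0 = 10,000 nits; returns the nonlinear signal value in 0..1."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    yp = max(y, 0.0) ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

# A 100-nit "SDR white" lands near the middle of the PQ signal range,
# leaving the upper half of the code values for highlights SDR can't show.
sdr_white = pq_encode(100 / 10_000)  # ≈ 0.508
```

That headroom above SDR white is exactly what a tone-mapping comparison video loses when it's delivered as SDR.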
Very nice video 👍 I imagine the hue choice also depends on what kind of world you want to make: realistic vs. cartoonish or something else.
Knowing more about tone mapping and how to edit it in Unreal is certainly something I am interested in. My goal is not realism, however, but having the power to adjust things to fit my particular project would certainly be nice.
Just wanted to say, I have been a huge fan of yours since the good old Omegle days and wanted to thank you for making my days brighter when I was going through the shit. I hope one day you can do that again, maybe on Discord or the next version of Omegle :D. Anyways, glad you are still at it, and maybe you still do those challenges where you get people to edit a recording of you on the green screen :D
Damn, this is all so damn good. I should support you on Patreon, even without the saucy stuff, but I won't say no to a bonus. You're doing amazing work, Angel.
YES! Just subbed. I handle this for media delivered to our studio and will now forward this to everyone. Great video!
I've noticed this shift in visuals but had no idea how to understand it. Thank you for this!
Incredible video! I feel like ACES just crushes detail if you are doing a game that has darker areas.
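The shadow-crushing complaint above is easy to see numerically with Krzysztof Narkowicz's widely shared curve fit of the ACES filmic tonemapper (a popular approximation of the RRT+ODT, not the full ACES pipeline):

```python
def aces_fit(x: float) -> float:
    """Narkowicz's fitted curve approximating the ACES filmic tonemapper.
    Input is linear scene luminance; output is clamped to display 0..1."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    return min(max((x * (a * x + b)) / (x * (c * x + d) + e), 0.0), 1.0)

# The curve's toe pushes linear shadow values well below their input level,
# which is what reads as "crushed" detail in dark areas of a game.
shadow_a = aces_fit(0.02)  # ≈ 0.0105, darker than the linear 0.02
shadow_b = aces_fit(0.05)  # ≈ 0.0443
```

Whether that toe is a feature (filmic contrast) or a bug (lost shadow detail) is exactly the kind of subjective call a custom tonemapper lets you make per project.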
The transition at 4:00 is great ! (Kon Style XD) All the video in fact, amazing work, thank you !
The skills on display in this video are through the roof. So much creativity, and a fantastic, engaging way to present information.
Absolutely love your info videos Angelica! They go way beyond just a tech- or info-filled death-by-PowerPoint. You get the points and info across while still breaking it down Barney-style so we get it and can visibly see it. Thanks for all your hard work.
Came in with mild curiosity. A minute later you had my rapt attention. You my friend have an incredible talent for teaching
Wow the production of this video is insane, amazing work.
What software are you using to pull all of this off?
dos 3.12
Did 3.11 actually ℹ️
@@AiAngel its fine if you prefer keeping that info to yourself, would appreciate learning about it tho
@ I spell out what software I use in the video
Wow, amazing production values there. Great job and thanks for the insight! 💖
WTF, YouTube just recommended this and it's amazing. I know nothing about color or aesthetics or design, but I LOVE learning more about the technicalities behind things I like (like video games). Great channel, imma watch more videos. Cheers from Brazil
Intriguing. As someone with one of the few cameras capable of outputting Linear Raw direct from the sensor, I'd be interested in trying that Tone Mapper in a film project, and seeing how well it would be displayed on Digital Cinema Projectors.
Ok. I'm in love. I'm an amateur game developer. I'm no artist. Everything I make is ugly AF. Recently I moved to UE5 (after the Unity pricing scheme) and decided to focus more on visuals, because UE5 helps me make stuff pretty with way less effort than the Unity I was using. And it's time to get out of my comfort zone. And as someone who has very little knowledge about the visual part of game development (I will spend hundreds of hours programming some awesome, complex mechanic but maybe 10 making the visual representation of that mechanic): you explain stuff so well. And you make it look interesting. Thanks!
Amazing video. I hope the channel blows up big time, this is such good content. Knowledge, like actual deep knowledge of cinema and game dev, a cute polygon girl, absolutely love this.
Yes and it's even more complex than that in a bunch of directions.
What you ideally want is for the color impression in the viewer's head to match the color an area of an object would realistically have, considering all lighting and all surrounding objects and their properties in the scene.
There are a lot of things influencing that. Let me see if I can get them all:
- The developers who create the "3D tool" (yes that's a very simplified term), e.g. Unreal Engine and how deep they want to go concerning all aspects of realistic color while weighing it against performance impact
- The artists creating worlds in said 3D tool, and how much they a) bend reality to get their artistic idea across and b) take shortcuts with color filters and other tricks, because they don't want to, or don't have the budget to, rig lighting and atmosphere conditions exactly right for the effect they're going for. Ideally you switch all the "tricks" off, get as close as you can to what you want with simulated lighting, atmosphere, etc., and only then add tricks if there's still a gap between what you see and what you want. That takes more time, so it's a money and patience issue.
- The monitor companies. We haven't seen a lot of visible improvements concerning stretching out that tiny sRGB triangle in the hardware (Monitor companies will of course disagree). It's not impossible, but the pressure towards them to innovate in that direction isn't enough. And monitors delivered to the general public are not well calibrated, if at all. So all the people above and below this bullet point see "wrong" colors, if they didn't invest time and money to properly calibrate their stuff. There's a lot of room to improve there.
- The human eye. Very tricky thing. The bad news is everybody's a bit different. Color perception changes with a bunch of physical conditions and with environment lighting, and the ability to perceive color accurately gradually degrades with age. So the viewer of the final product definitely needs a way to adjust the colors they're presented with until they look optimal to them.
Not sure, if that list is complete, but it's at least something.
There's a lot to think about in getting the viewer's color perception realistic.
Oh boy, I think I opened a can of worms by watching this. I'm just a "consumer" and have zero experience in filmmaking or content/game creation. But I have a hunch that I won't be able to un-see color tone mapping issues going forward.
You know, like the indicator blip for an upcoming movie reel change that is visible in older movies. Once you have knowledge about it, you can't un-see it anymore, even when not really paying attention to it.
I really wish you didn't delete your old videos 😢 it was cool watching the progression of your avatar. How far it's come. Also, the stranger interaction from Omegle was entertaining too.
Yo! What is this channel? This presentation is living in 2077! Really cool stuff!
How is this 3D character so perfectly synced? Just a MetaHuman thing, or more advanced motion capture? It's so natural.
The interaction with the viewer is next level!
Yeah it almost makes you forget that it's a guy behind it lmfao
@@Jimmy-wf6vy Like most V Tubers, yes. But it doesn't matter. It looks cool. And I rather look at a fake babe, than some DEI crap that we're fed with lately.
@@deny_s I was just commenting on how good it is. What you on about looking at "fake babes"? You not one of those weird people are you?
@@Jimmy-wf6vy What are you on youself here ? The character model is the fake babe I was talking about. "Fake" - coz there's often a dude behind it all, "babe" - coz she looks attractive for typical reasons. What's with weird questions ?
This was really fun to watch, I hope there will be more videos of Angelica just talking about stuff!
Mind blowing production value! Also super relevant, sending this to my art director asap
"a billion polygons per nipple" caught me off guard
This was really interesting. I'd like to see more of these for different features if you find some way to talk about them like you did here.
This popped up on my feed, didn't know I was even interested in colour mapping! Every day is a school day thank you!
Very well-put-together explanation and showcase! I knew some things about the subject, but this clearly points out the flaws. Excellent!
I love your detailed explanation; now I want to take a dip into tone mappers. I usually just use the color grading that's available in Unreal. As for The Last of Us, that's probably the color style they were aiming for in that scene, a highly saturated one. Thanks for the video!
Also, I've been subbed to you for years, but your videos literally haven't shown up in my home feed for I don't know how long, and I just noticed your old videos are gone now. Or am I on a different timeline lol
So, my screen is able to display most of the DCI-P3 color space. I’d be curious to know how this changes the picture for tone mapping?
Also how does OLED versus LCD change the picture.
I’m pleased that your lip sync is very on target here. Most AI stuff I’ve seen has problems in that space.
And, as of now, you are the only AI-related channel that I am subscribed to.
Never mind the colour mapping. Your hair alone is insane; love its volume and bounce with your movements 😄
I just watched a 10 minute AD...got baited good.
Four minutes in and I’m geeked the hell out of that transition. That’s just insane please get all the sleep you need.
OK, she is so good at this. I'm having some callbacks to the first time I watched The Matrix in the movie theater back when it first came out. It was amazing. She is really amazing at this. God, I hope she's making enough money. We need to support our talent or lose it.