I thought the creator of Filmic Blender said that there are some issues with the internal implementation of Filmic. I would recommend downloading it just in case.
Andrew's real dining room actually looks like a render, too. I think he is literally living inside of Blender at this point and he doesn't even realize it.
I downloaded the link and started the install process until I realized that Blender 2.82a already uses Filmic color management. Then I realized this video was uploaded in 2017. Your campaign was a success, congrats :)
Inside the CG world there are no limits on dynamic range; the only challenge is to imitate it. Dynamic range matters for cameras, but inside software you can do whatever you want.
Pete I am realllyyyy starting to think you are very very young, as school is not meant for satanic purposes. Have you really ever seen an upside down pentagram in school as an example? NO. Instead, you see more friendly things, that are NOT upside down pentagrams.
Pete that’s what you believe, but no. School is not an agent for heaven or hell; it is only there to teach you things you’ll have to use in later life.
@ Holy shit am I watching the formation of a 4chan user? Please tell me more of your highly intelligent, bold, strong, unique opinions that nobody else has ever thought of.
One of my favorite things about watching your videos is that almost every time you complain about something in an old tutorial, I don’t end up having the same problem in the most recent version of blender. The fact that these things get updated and polished constantly gives me a lot of trust for blender’s future and as a 3d noob it’s very comforting.
You just tricked me into watching a blender tutorial, that I enjoyed watching, because I wanted to know the secret ingredient, and I've never used blender. You're an amazing youtuber, gotta hand it to you.
yeahBoi i don’t know what it is. I’m not a blender artist. I’m not a photographer. I have no idea what he’s talking about. But he’s doing something right because there was no point during the video that I wanted to stop watching
As a game developer, I have written a game engine that renders scenes in HDR, which potentially produces very intense lighting values. I use tone mapping to convert the 0-x light values down to 0-1 RGB values that can actually be displayed without clipping. The tone mapping operator even includes a bit of math to produce the realistic desaturation effect you mentioned. I also experienced the same issue with intense lights changing hue: this happens because clipping can alter the ratio between the R, G and B channels. I appreciated this video because it was like a mirror image of everything I learned in the course of making my game's lighting display correctly, only seen from another perspective. And the final result speaks for itself- educating people on this topic is a good thing.
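The operator described above can be sketched in a few lines. This is a generic Reinhard-style mapper with a simple desaturation term, not the commenter's actual engine code; the `desat_strength` knob is an invented parameter for illustration:

```python
def tonemap_pixel(rgb, desat_strength=0.3):
    """Map unbounded linear RGB into [0, 1) with film-like desaturation.

    A sketch of the idea in the comment above, not a production operator.
    """
    # scene luminance (Rec. 709 weights)
    lum = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    # blend each channel toward luminance as the pixel gets brighter,
    # so intense lights wash toward white instead of shifting hue
    t = min(1.0, desat_strength * lum)
    desat = [c + (lum - c) * t for c in rgb]
    # Reinhard operator: compresses [0, inf) smoothly into [0, 1)
    return [c / (1.0 + c) for c in desat]
```

Clipping each channel at 1.0 instead would change the R:G:B ratio for intense lights, which is exactly the hue-shift problem the comment mentions; desaturating before compressing avoids it.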
@@mirogaming1982 you are like the trolling NPCs in games: the guard sees a player with good alchemy skills and says "an alchemist, eh? can you brew me an ale?"
Good products don't have to be paid for, but if I had put hundreds of hours into something I would consider charging people for it or at least accept donations.
It is likely (it seems to me, at least) that with Arnold spreading to other software packages, the knowledge is running horizontally across professionals (that is, no secrets in the methods and formulas... almost all publicly available). So it wouldn't make a lot of sense to make it paid, because it would only take one professional who is unhappy with the "paid" status to do the same thing for free. Either that, or pure kindness... the Blender community is pretty kind.
Because he's being forced to do something because someone else is "just so damn friendly". Who doesn't love explaining a joke first thing in the morning..
Your conversational style, attention to detail and commitment to explaining the concepts make you stand out. This is exceptional material. It's not easy keeping the internet's attention for 30 minutes
Stuff like this honestly makes me want to learn Blender. The dedication and generosity in this community is absolutely astounding. Update: Oh gosh, I've been getting a lot of replies to this since posting. To answer: Yes! I've since learned Blender. It's now a core part of my work flow and I'm enjoying it a lot. :)
@Solid Snake Considering the coverage I just saw on 2.8, I highly agree! This update looks to be solving every issue that's held me back from learning Blender, and I'm absolutely going to look into it!
@@Chilcutte Fittingly enough, I've started binging on it since making this comment. ;) I'm actually firing up the program right now to do some work! It's been great, and I'm enjoying it a lot.
Even though Filmic is now part of Blender, this video still taught me a lot about color and adjusting color in Blender in Nodes. I found all of it to be very interesting and useful knowledge. Thank you, as always!
I did film production and photography in college and NEVER once heard the term “Middle Grey”, thank you for that! I feel like that’s going to offer me an opportunity to improve my photography and filming!
Technically not true that high end lenses only show t-stops. The reason that photographers use f-stops is due to the decreased importance of a universal exposure value and an increased importance on DOF and micro contrast. Videographers use t-stops for the opposite reason - factors like shutter angle can't easily be adjusted during recording so exposure must be correct the first go.
Actually, it's even simpler than this. An f-stop is a theoretical mathematical value, which is why it's used when discussing things like dynamic range. T-stops (the T stands for transmission, the "true" stop) are marked on cinema lenses because they measure the actual light passing through the physical glass.
A T-stop lens is way more expensive because it has to be measured very accurately, while an f-stop is a theoretical value. For filming you don't want your exposure to change even a bit when changing lenses or zooming; for photography it's not as big a deal, so we get cheaper lenses.
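The relationship between the two stops can be written down directly. A small sketch; the 80% transmittance figure is just an example value, not a measurement of any real lens:

```python
import math

def t_stop(f_stop: float, transmittance: float) -> float:
    """Convert a theoretical f-stop to a measured T-stop.

    The f-number is pure geometry (focal length / aperture diameter);
    the T-stop additionally accounts for light lost inside the glass,
    so it is always >= the f-stop.
    """
    return f_stop / math.sqrt(transmittance)

# e.g. an f/2.8 lens that transmits only 80% of the light:
# t_stop(2.8, 0.8) ≈ T3.1
```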
Nice overview here!! At Pixar (I was a shading TD) they render in 32-bit and tone mapping is done later. Their tools can create accurate bounced light, but everything is lit 'artistically', so each shot may have 20-30 lights in it. Characters have their own lights too. The thing I thought was kind of crazy was that every shot gets broken out into its own file, so the lighting has to be done for each shot separately. You cannot go in and edit the lighting for an entire sequence or set after a certain stage of production.
With RenderMan, true ray tracing was not even a standard feature until recently. Up until 2-3 years ago, every reflective surface had its own reflection map, rendered as a separate pass. Things like global illumination are generally baked into shaders or 'brick maps' so they only need to be computed once per scene. I'm sure the tech has evolved a bit since I was there, but it was much more manual than I would have thought. Still, the quality of the renders at full resolution is amazing.
From what I understand, in non-animated movies the lighting is also adjusted for each camera angle, so they are essentially just mimicking the normal way of shooting a film. Maybe they use more lights than usual, since there are no physical or power constraints here and no rental fees.
@AaronDavis Pixar transitioned largely on Monsters U to a global-illuminated lighting model (radiosity). Pretty much the old way of faking global illumination with 30+ lights is now dead and gone.
FOR MAC USERS: -go to the Blender icon -right-click it -choose "Show Package Contents" -open the "Contents" folder -then "Resources", "2.78" and "datafiles"
Does this ever happen to you? I don't do any rendering, I'm not a photographer, I have no idea why this was in my recommended. I still watched the whole thing.
For those wondering where the effect of colors fading to white from overexposure comes from: the sensors in your eyes that detect different colors don't actually detect 'just' red, green or blue light. In fact, 'red' is not a single wavelength of radiation but a spectrum of wavelengths we've classified as red, and the sensors aren't completely immune to other colors.

So at super-exposure levels, if an enormous amount of light in the red part of the spectrum hits your retina and all of your red sensors are blaring at your brain, you cross the threshold where your blue and green sensors start firing too. Your brain thinks it is also seeing those colors, so it 'washes' the red out toward pure white as the exposure increases (because red + green + blue = white, as far as your brain is concerned).

Because the enhancement proposed in this video greatly increases light intensity, you start running into exposure ranges where you NEED this desaturation to occur for the image to feel realistic. The most important takeaway is that 'realism' in photo/video editing and rendering has little to do with physical reality and everything to do with how WE, the viewers, see that reality. Our monitors and TVs cannot (yet) put out anywhere near the amount of light the sun can in order to trigger the color-washout effect in our eyes directly (though thanks to OLED we at least get nigh-perfect infinite contrast), so we have to emulate it for our viewing devices.
This is also why indoor rendered scenes in video games or animations can look 'correct' when there is no outside light and the only illumination comes from low-lumen artificial sources, but as soon as a window is involved everything immediately looks wrong and fake (without the fixes above): even on a 'dark' cloudy day, the sunlight coming from the sky at noon is still several orders of magnitude brighter than a lightbulb. This is a problem I, as a gamer, have often witnessed in video games, even up to this day.

Your eyes have an iris that contracts and expands to shift your exposure range up or down to deal with this insane difference in brightness, but it does so based on what you are looking at. If you look at the bright window, everything else in the room becomes darker; if you look at the 'dark' wall, your iris expands to let in more light, the wall becomes nicely visible, and the bright window in the corner of your eye blows out to pure white.

This is another issue with static rendered images: you basically have to decide for the viewer what their eyes should focus on, and that takes away realism as well. On a static image you can look at the dark bookshelf in the corner or the over-exposed table and the exposure doesn't change, whereas in reality your eyes would adjust to each. So even though the tricks in this video can simulate 'photo'realism, you still won't be able to simulate ACTUAL realism the way you can in an interactive experience like a video game, where you can simply adjust the exposure based on what the crosshair is pointing at.
@Fluxots When I saw the thumbnail for this video I said to myself, "the answer is more photons" aaaaaand yes indeed, photons are the answer. Glad I know my stuff :D
This video showed me 3D was a thing. Thanks to it I have a job as a 3D artist today, and I just bought my very own house. Every time I tell people what I do, I mention this video. Thank you Andrew.
I’ve been a graphic designer for many, many years now. I’ve watched a ton of tutorial videos over those years and you are one of the best I have ever encountered. Can’t compliment you enough. Keep up the amazing work, and thank you for being my gateway to Blender!!
Amazing how good things get. I'm watching this in 2022, and this whole video is about the Filmic view, which is just one click for me now. And that's thanks to Andrew and all those people in the industry. Thanks, guys.
I haven’t even installed Blender. This is literally the second tutorial I’ve watched and I feel like I’ve been given a several month head start on my skills. Thanks!
In case anyone is confused by the difference between "display referred" and "scene referred" data, it has to do with a logarithmic vs linear representation of light. Our eyes perceive light _logarithmically_ , meaning that the difference between 1 and 2 looks the same as the difference between 100 and 200, so it makes sense for monitors to use logarithmic values in order to have the same _perceived_ level of detail in the darkness as in the light. The problem is that fundamentally, light is just a collection of photons, and in order to properly add them together, you need to do it _linearly_ . When you add 10 + 10, you want to get 20, not 100. This is why logarithmic light values have problems with clipping when you add them together (the further you get from the "center" of the log-scale, the more extreme the difference becomes), and why you should stick with the linear values until the very end.
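The 10 + 10 point above is easy to demonstrate numerically. A sketch using a plain 2.2 gamma as a stand-in for the display encoding (real sRGB adds a small linear segment near black):

```python
GAMMA = 2.2

def encode(linear: float) -> float:
    """Linear scene light -> display-referred value (approx. sRGB gamma)."""
    return linear ** (1 / GAMMA)

def decode(display: float) -> float:
    """Display-referred value -> linear scene light."""
    return display ** GAMMA

a = encode(0.1)                             # one light, display-encoded
# Correct: decode to linear, add the photons, re-encode.
correct = encode(decode(a) + decode(a))     # == encode(0.2)
# Wrong: adding the encoded values directly overshoots badly.
wrong = a + a
```

With these numbers, `correct` is about 0.48 while `wrong` is about 0.70, which is why compositing or light accumulation done on display-referred values clips and shifts so quickly.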
I've been shooting LOG footage on cinema cameras forever, and it's really cool being able to get essentially LOG footage and colorgrade it how I'm used to with real cameras in blender. Your tutorials are AMAZING!
This is a legendary tutorial in render-lighting history. It's the second mind-blowing, life-changing, highly useful tutorial I've found on YouTube (the other one was about making realistic glass materials with mix shaders and different weathering textures).
Filmic Blender has been on the official target list for Blender 2.79 for quite a while now. No need to start a big campaign for inclusion in master as suggested at 10:40; the developers are on it :) Until then, enjoy Filmic Blender as an addon.
BlenderDiplom So when is 2.79 out? Do you recommend we wait? Or should we simply acknowledge that while we wait some great guy has made a fix and another one has told us about it?
Want to really get pissed off? That's just a nice 2000s CRT. We had high resolution, high refresh rates, and excellent dynamic range... but sacrificed all of that for a smaller footprint and lower weight. Only now with OLED are we catching back up to the same quality. We've been in a 20-year video dark age.
@@bm1747 oh yeah I've been watching vids from digital foundry and want one, I'm currently rocking the LG C9 but I want a newer one for better BFI, I'm hoping for an HDR OLED bfi VR soon but idek if it's possible
6:00 lmao I thought you were trying to fool my ass and then just say like "oh no that's actually a render with fixed lighting". Really looked like a render
You're such an optimistic person, with so much knowledge that you share with others. I can't even begin to think how much you have helped this community.
I think it's awesome that Blender took it upon themselves to integrate this into their update. I seriously don't see the benefit of any other 3D Studio. It's like my Converse; once worn, rarely go back to Nike. As you can tell, I'm a bit late in the game!
Man... I'm rewatching this video from about 2 years ago because I wanted to re-learn it (I was too lazy the first time), and damn, so many things have been updated: the UI (which I was very happy about, because it overwhelmed me the first time and made me not want to keep learning), 2D animation, and this. Man, I'm so lucky to have started learning 2 weeks ago.
Apropos people who misunderstand colour spaces... For one, sRGB being ancient is neither here nor there: all web images and web browsers are sRGB, all displays are approximately sRGB when displaying 8-bit inputs, and so is the output of the iPhone camera in JPG mode (or high-depth linear in RAW mode). Most commercial LCD PC monitors and TVs don't even cover the complete sRGB space, neither in gamut (unless they use RGB backlighting) nor in dynamic range, so all they have to work with is a rather pitiful approximation. For all the faults of CRTs (geometry, resolution, sharpness, convergence), colour reproduction was generally not one of them, and they are still occasionally used as a colour reference.

For another, the claim that sRGB represents only 8 f-stops is a blatant misunderstanding. It would be true if sRGB were an 8-bit linear space, but it's an 8-bit exponential space with an approximate exponent of 1/2.2. So how much dynamic range does it have? Take the lightest non-zero value: it has a linear value of (1/255)^2.2 ≈ 0.000005, finer than the dark resolution of 16-bit linear. That is not quite correct, because sRGB has a linear section at the start while I have substituted an approximate gamma, so realistically it's closer to the resolution of 12-bit linear, or 12 photographic f-stops, if the resolution loss in the brightest tones is partially neglected (which it can be, according to perceptual metrics).

When rendering, infinite dynamic range is generally assumed, because any exponent of 0 is 0, in Blender and elsewhere. However, this is not how cameras work. Precisely because their dynamic range is finite, they end up mapping values to maximize useful information and reject noise. This is what was missing and has been correctly added by Filmic Blender, and the explanation written by its author Troy Sobotka is absolutely correct. You merely failed at reading it.
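The stop arithmetic in the comment above can be sanity-checked with a pure 2.2 gamma. This deliberately ignores sRGB's linear toe, so it gives the optimistic upper bound the comment then discounts toward ~12 stops:

```python
import math

GAMMA = 2.2

# smallest non-zero 8-bit code, decoded to linear light
darkest = (1 / 255) ** GAMMA       # ≈ 0.000005

# dynamic range in photographic stops (factors of two) up to white (1.0)
stops = math.log2(1.0 / darkest)   # == GAMMA * log2(255) ≈ 17.6
```

So the encoding spans roughly 17-18 stops between its smallest non-zero value and white under this approximation, far more than the 8 stops an 8-bit *linear* space would give.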
So, Filmic Plugin actually makes it look more like a photo (by reducing information in the same manner as a camera) rather than more realistic, per se?
You could say so, James, except we neither necessarily have a "ground truth" for realism that doesn't stem from a camera, something we could faithfully compare against, nor do we have display devices that are capable of representing a real light field.
Shalok Shalom I don't have an opinion on Mirasol displays specifically, but I'm sceptical about the dynamic range of any reflective medium. At the top end it's limited by reflectivity and the amount of ambient light available; at the low end, by diffusion of that same ambient light across the top surface and inner surfaces. Having worked in print many decades ago: even offset print has a pretty low dynamic range compared to a reasonable PC monitor, so you really have to push your contrast and use tricks to make the image readable, and all reflective display solutions so far have had to work with a lot less. After all, in print the ink colours don't need to share their surface properties with the substrate. I think the closest we will come is HDR OLED in a darkened room, but even then emitted light still diffuses around the emitter.
Shalok Shalom you'll still be judging it in comparison to the brightly lit environment, and with the losses we have in glass electrodes alone, you get a choice between a somewhat dim display and a low viewing angle. Still, I like new stuff; I'll be watching what they come up with. Perhaps interference-based colour displays can reach the quality and contrast of monochrome displays some day. In old news, I think CCSTN displays were curious: they displayed limited colour without any filters and had top-notch reflectivity for the time. Old Siemens phones used to use them, as did a Tamagotchi-style Pokemon toy with a colour display, and not much else, all made around the late '90s. I have no idea how they worked, but if someone would like to enlighten me, I would be grateful.
Augustinus, let's try this, shall we? 0^2 = 0*0 = 0. Ergo, not every exponent of 0 is 1. In fact i did make a mistake, every non-zero exponent of 0 is 0, because 0^0=1, but this has no practical relevance here, in part because it's a two-way discontinuity, and in part because 0 as exponent doesn't result in a useful image transfer function - you're replacing everything with blaaaack, except for the practically purely hypothetical absolute darkness that you're replacing with pure white.
This is all well and good (Filmic Blender), but there are a lot of misconceptions and misinformation in this video. All renderers (including Cycles) render under the hood in a linear colour space. This is so the math behind everything is correct and not skewed; it's necessary for a correct representation of physics and energy transfer/propagation/light simulation, which is what path tracers like Cycles do. Thus, by default, Blender actually outputs a linear image, because it renders a linear image.

The sRGB conversion (display device) is done so that it looks correct on your monitor. It isn't some old, terrible workflow; it's literally so that the linear image, rendered to behave like real life, looks natural to you when you view it on your screen. It takes a while to understand, but Blender's own color management wiki page does a good job of explaining it.

The point is, this Filmic Blender plugin doesn't actually "fix" anything. Nothing was broken. Blender, like any other renderer that can work in 16 or 32-bit floating point, has near-infinite dynamic range, not 8 or 25 stops as you say. This is merely a plugin that emulates the nice logarithmic response film has: a method of tone mapping, shifting all that colour information, the bright whites and dark blacks, into a space that looks natural to your eye and can fit in an 8-bit image or video viewable on the vast majority of consumer devices.
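For reference, the linear-to-sRGB display conversion the comment describes is a fixed piecewise function. This is the standard published sRGB curve, not anything Blender-specific:

```python
def linear_to_srgb(c: float) -> float:
    """Encode one linear [0, 1] channel with the standard sRGB curve."""
    if c <= 0.0031308:
        return 12.92 * c                      # linear toe for near-black values
    return 1.055 * c ** (1 / 2.4) - 0.055     # gamma segment for everything else

# 18% middle grey lands near the middle of the display range:
# linear_to_srgb(0.18) ≈ 0.46
```

Note that this function only makes sense for values already inside [0, 1]; scene-referred values above 1 have to be tone-mapped down first, which is exactly the job Filmic takes over from a plain clip.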
I am now interested, can Blender output in any HDR formats compatible with compliant devices (HDR10/Dolby Vision), is there any configurations or plugins for this?
If Blender has a practically infinite dynamic range, why does it not give the same results as the reference image? How can you change the dynamic range to mimic a DSLR or the human eye?
I'm gonna be honest, I normally don't stick through tutorials this long. But the information and the way you explained it was EXTREMELY helpful and gave me such a better understanding of the way blender is managing my colors. I do a ton of product renders and this has helped me more than you could know! Thank you!
Thank the gods for this genius who gave us filmic blender and was able to get it added to the program as a default. This is a part of history that should never be forgotten. o7
Minimalist interior perhaps? Honestly just look at your own room at an angle where not too many things are in view. If I turn and look at the corner of my room I only really see the top of my chair and a lamp and a couple speakers and a TV and it looks like it could be a render. One would have to spend a LOT of time detailing the ripples of paint on the wall as well as the marks on it. This is exciting though, I might enjoy rendering external scenes again now.
THANK YOU! You are 100% right, this isn't some revolutionary way to "correctly" work with colour. The correct, physically accurate, most lifelike way to work with colour is a simple linear workflow, which is the default in Blender. Just render your shit as linear 16 or 32-bit EXRs and colour correct to your heart's content, with no loss of information. This video is riddled with misinformation.
This video looks interesting... but 1.) I don't do blender. 2.) It was on my recommended. and 3.) It's 3 AM where I'm at and I'm supposed to be studying.
SansyBoy Digital Design, Data Structures and Algorithm, and Calculus :'( 3 exams today. 2 exams a while ago and Calculus is about to start as I'm typing. Well... wish me luck. I hope all that Blender-photorealism knowledge is gonna be put to good use.
Watched this vid, went to download it, and found the note about it already being included from 2.79... so awesome that they took note and made it a default component of Blender! Thanks for helping to get this out there!
@@Hamstray > Enters light brightness value: 10 SN What does SN stand for? Super Nova. Next time, 100 BB Big Bangs. Still better than measuring volume in multiples of olympic swimming pools.
You could compare this problem to microphones and sound. Sound has lots of dynamic range, but a mic can only capture so much. Outside the mic's range (too quiet or too loud), quality is lost; a better mic has a wider range. That's also why sound feels more realistic when you can hear more frequencies at good quality.
The only time I can think of where I'd use sRGB over Filmic would be if I was trying to mimic the look of the old Pixar shorts. Like, the old-old ones.
I think this is a really helpful video, but I think it's also important to take into account what goes into making a photo. You took that photo of your room using a phone, and your phone is preprogrammed with all sorts of software to approximate what a "good" shot should look like. It's applying LUTs and colour adjustments to counterbalance the range of the natural light into something more aesthetically appealing, as determined by Apple's (or whoever's) software scientists. Blender is acting more like an unaffected camera, catching the light you've actively placed in there, and the reason you need to add other light sources is the same reason a film's DP might have to: natural light doesn't always look the most natural on film, and it takes crazy effort to make something look "natural".

Also, dynamic range isn't the panacea for lighting issues, but it does make fixing this stuff in post a hell of a lot easier. Tons of incredible-looking movies have been made on film stocks and cameras with dynamic ranges in the realm of 3-6 stops; it just requires a whole lot more effort on the part of the DP to control those light differentials. It ultimately all comes back to the grey scale: whether your dynamic range is 6 or 18 stops, white is white and black is black, but with more dynamic range you have more room to push that brightness differential and fix it in post.
Needs a short update video. Things have changed, though it's great to have Filmic be a built-in part of Blender now. It's not labelled "Render View" any more, but "View Transform". Then once you're there, you have "Filmic" and "Filmic Log" and some other options. Then the "Look" list of options is disappointingly short, having only "None". Please guide us through these uncharted lands, oh good Guru of ours!
10:20 I think you're making a big mistake here by saying sRGB is wrong! I believe you're confusing *color profiles*, *color spaces* and *dynamic range*. None of these are the same, and sRGB is fine for most common usage; that is why it's still used as the reference across all kinds of devices, including the ones being built and released right now.
sRGB simply is a "package" containing: • 3 colour chromaticities defining the vertices of the gamut triangle you can display, and these are fine! • 1 white point chromaticity called D65, at a temperature of 6500K (the colour of a regular blue sky), which is also totally fine • 1 luminance transfer curve that is not really a gamma of 2.2 nor of 2.4, but a mix of a linear slope segment and a 2.4 gamma curve

It's certainly that last point, the gamma curve, that makes you say sRGB is incorrect, except it isn't. Of course the values sRGB deals with are supposed to lie in a [0,1] range, but that's because you're missing an important stage in color management called *Luminance Adaptation* (and also *Tone Mapping*; the two are often mixed together). The color pipeline should then be: 1) HDR image rendering (all renderers should be doing this) 2) Luminance adaptation (the result is still HDR at this point) 3) Tone mapping (the result is still HDR, but anything above 1 will be clipped) 4) sRGB encoding (this time we store an LDR image, usually in 8 bits, and clipping occurs)

What's important to know is that luminance adaptation + tone mapping is not a linear transform, and these two operations alone are responsible for compressing HDR into LDR. sRGB is not the culprit here; rather, it's the default tone mapping operator in Blender, which must be set to something as simple as "Clip" or "Scale". We see the exposure you set is 0, and 2^0 = 1, so your scale factor is 1, and thus it's perfectly normal for HDR colors to simply be clipped...
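The four stages listed above can be strung together in a short sketch. The hard clip and plain 2.2 gamma stand in for the real operators (actual sRGB encoding uses a piecewise curve, and real tone mappers are far gentler than a clip):

```python
def display_pipeline(hdr_pixel, exposure_ev=0.0):
    """Stages 1-4 from the comment above, in order, for one RGB pixel.

    hdr_pixel: scene-referred linear values (stage 1), possibly far above 1.0.
    """
    # 2) luminance adaptation: scale by 2^exposure (EV 0 -> factor 1)
    adapted = [c * 2.0 ** exposure_ev for c in hdr_pixel]
    # 3) tone mapping: the simplest possible operator is a hard clip to [0, 1]
    mapped = [min(1.0, max(0.0, c)) for c in adapted]
    # 4) sRGB-style encoding + 8-bit quantisation (approximated as 2.2 gamma)
    return [round(255 * c ** (1 / 2.2)) for c in mapped]

# with EV 0, any channel above 1.0 simply clips to code 255, as described
```

With a clip in stage 3, every scene value above 1.0 collapses to the same white, which is precisely the behaviour Filmic's gentler curve is meant to replace.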
Finally, you're saying sRGB was made to support CRT displays, but that's only partially true. You're implying that now that we have super cool flat LCD screens we shouldn't need the gamma curve anymore, I suppose? Except we still do, for a simple reason: our eyes are more sensitive to shadows than highlights! Gamma correction ensures just that: by writing gamma-corrected colors, we expand the range (bit-wise) of dark tones we can store in a single byte, at the expense of compressing the range of light tones, which don't matter as much. For example, to encode the same amount of shadow information as an 8-bit gamma-corrected image, you would need about 12 bits without the gamma correction!
That's a common misconception. Our eyes are more sensitive to shadows than highlights, sure, but that's how we see real-world light, which is as linear as your rendered EXRs. Gamma correction is a two-part operation: the transfer function you apply to your sRGB images is compensated by the non-linearity of displays (a legacy aspect from the CRT days). Check Poynton's Gamma FAQ and look up what OETF and EOTF are. Current hardware would be able to display linear images directly, but it keeps the whole non-linear pipeline partly for legacy reasons (CRTs were nonlinear, and we're still displaying signals/data designed for nonlinear hardware) and partly because 8 bits can't accommodate enough precision for linear images.
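The precision point both comments agree on can be counted directly: of the 255 non-zero 8-bit codes, how many land in the deep shadows (below 1% of white) with and without a 2.2 gamma encoding? (The 1% threshold is an arbitrary illustrative cutoff.)

```python
# 8-bit codes whose decoded linear value falls below 1% grey
linear_shadow_codes = sum(1 for c in range(1, 256) if c / 255 < 0.01)
gamma_shadow_codes = sum(1 for c in range(1, 256) if (c / 255) ** 2.2 < 0.01)

# a linear encoding leaves only 2 usable shadow steps, while the gamma
# encoding gives 31 -- this is why 8-bit images are stored gamma-encoded
```

Two steps for everything darker than 1% grey produces visible banding, which is the practical reason 8-bit linear storage is off the table regardless of the display's own transfer curve.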
I don't use blender, but I use unity and I will definitely start using the ACES colorspace now. Great job explaining the difference, I was always wondering why it made everything grey
Thanks for sharing Andrew, this is really a life changer ! On a sidenote, my workflow does include a lot of texture baking, sadly I just found out that bakes don't seem to be affected by Color Management settings, unless I am missing something here?
The rendering is not changed at all. The color management changes with Filmic intervene after the rendering is complete and simply map the render to a wider dynamic range. Changing only the dynamic range will not alter your render; if you want the improvement a wider dynamic range enables, you will need to redesign the lighting in your scenes. It is not a "magically make your render better" switch you turn on and off, but more like a "let me use realistic lighting values in my scenes" switch. Have a nice day, Create.
Thanks for replying ! The thing is, I actually get huge "exposure" difference between my baked textures and viewport renders. What I mean is that while color management settings show on my renders, usually when I'm pleased with the result and ready to bake textures, my bake will end up way different, just as if the dynamic range was narrower for baking, usually the whites are clipping a lot more on them. But I was hoping that, since texture baking is basically a flat rendering, there was a chance for the color management settings to affect them, even just gamma and exposure, but I guess that's just not possible at this time. So basically even if I redesigned my scene lighting + filmic, my render would be drastically improved while my baked textures would remain the same.
The problem is then moved from Blender to the renderer you will be using your baked textures in. So probably Unity, Unreal Engine etc. I'm only familiar with Unity, but using linear color space, PBR textures, HDR cameras and tone-mapping makes all the difference there. Your textures in most cases should not contain any lighting information (except AO, if you consider that lighting information) at all. Your case might of course be different from what I imagine, but that is usually how it goes.
It's free, so why not give it a try? If you can follow these videos you'd be surprised how easy it is to fiddle around and produce something for fun. Sculpting is pretty good in blender now and really satisfying to do. Great fun.
@@NN-sp9tu One of my friends died at a young age chasing this realistic render, leaving behind his wife and his child. Today, 3D artists may be trading their life span for every tiny spot of extra realism. :) I am just a noob.
In my opinion you are confusing sRGB with the sRGB EOTF, the same way you are confusing the CMS with the filmic view transform .. 1) sRGB is the color space of the monitor. That is, it's the "color palette" the monitor is able to use to show you any picture .. 2) the sRGB EOTF is a transfer function from linear space to sRGB space. That is, it is a translation from "real colors" to the sRGB color palette .. 3) the filmic view transform, that's the Filmic option in Blender 2.79, is just another transfer function from linear to sRGB, the same as the sRGB EOTF. The difference is that it's a better function. That is, it gives a better perception on an sRGB image of what the human eye would see in the real world with real colors .. In short: the filmic view transform is better than the sRGB EOTF because it translates better from linear to sRGB.
I'd agree that Filmic Blender's desaturation is closer to reality, if it weren't for the fact that the table's color saturation in the real photo is more similar to the sRGB version than to the washed-out Filmic version... you can check it at 21:50. Now I wonder why... I understand that desaturation is a real thing, but why is the "wrong" version closer to reality in terms of color saturation than the Filmic version?
Well, it is taken with his iPhone, which does not use film, so this effect does not really happen there. Digital cameras are more accurate, but many prefer the defects that come with film cameras; hence the "filmic" name. Blender's default tonemapping isn't very good at all, and this fix is a very simple way of improving it quickly. But if you really want control over your renders, use a linear workflow: save your files as HDRIs or other linear file formats, and do your tonemapping in software like Photoshop.
Emil Klingberg And how do you set the intensity of your lamps for that? As stated in the video, if you do your lighting through the limited-dynamic-range default view, you tend to use overly low ratios in order to avoid clipping. That's mainly the purpose of Filmic (providing a wider-dynamic-range tonemapping to help artists light their scenes with realistic ratios); the desaturation is a plus.
This is a great fix for Blender users, since they don't have a proper tone mapper, but most other render packages come with one, so that you can see preliminary results before your final comp in Photoshop. It also introduces that filmic look (washed-out colors on the high end), which is technically a limitation of film, where a digital camera will be more accurate and not look so washed out with intense colors. It's a preference, not a physically accurate thing. This is all just about displaying the rendered information; the information is the same, just interpreted differently. And when you save your file, if you use a format like JPEG you lose all your light data, so you need to readjust the gamma for viewing. But if you save all your floating-point data, you can then be very precise with your tone mapping in Photoshop and choose exactly how you want to display it. Bottom line is, it's a cool way to get some great-looking images right out of Blender, but it's not a great way to work if you want to do compositing work later
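A toy illustration of that last point, with made-up pixel values and no actual EXR/JPEG file I/O (just the clipping and quantization those formats imply):

```python
# Why saving floating-point (EXR-style) data matters for later tone mapping:
# an 8-bit format clips and quantizes, a float buffer keeps the full range.

render = [0.02, 0.18, 1.0, 7.5, 60.0]   # scene-linear pixel values

# "JPEG-style" storage: clip to [0, 1], quantize to 8 bits
jpeg_like = [round(min(max(v, 0.0), 1.0) * 255) for v in render]

# "EXR-style" storage: keep the float values as-is
exr_like = list(render)

# Later, in compositing, lower exposure by 3 stops (divide by 8):
regraded_jpeg = [v / 255 / 8 for v in jpeg_like]   # 7.5 and 60.0 both became 255
regraded_exr = [v / 8 for v in exr_like]           # still distinguishable

print(regraded_jpeg)  # last three values identical: highlight detail is gone
print(regraded_exr)
```

Once the bright values have been clipped to the same 8-bit code, no amount of regrading can pull them apart again, which is exactly why the linear data should be kept until the final display step.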
Why is it not great if you want to do compositing work later? If you do that, you most likely want to do it from an EXR, which is not affected by the view. Nobody is saying that you have to use JPEG as the output format because you used Filmic. Filmic is an OCIO configuration; you can take it along with your linear EXRs to a compositing program and do your compositing grade from there, if you want. But still, the point is having a view that allows users to light scenes with realistic light ratios. That's the main thing to consider here. And regarding desaturation, saying that digital cameras are more accurate than film is inaccurate in this context. When you go out after being in a dark environment and outdoor light blinds you, what do you see? Is it white, or is it magenta where there were reds, yellow where there is orange, cyan skies?
I believe the latest version of Blender (2.79) now includes filmic color, rendering this video moot. I started to follow the directions in this video and then realized I was about to cut out the filmic color management that was already there. Maybe we can get an update to this to show how filmic color is now built into Blender? Awesome video!
Yeah, but those are RGB yellow, cyan and magenta, so whatever, aqua it can be! CMYK is for printing, and you will never have those colors printed as seen in the video. Not even close, especially the cyan. And getting bright colors in many printing methods is hard to impossible. You get around it by using shiny support material, varnish, and sometimes a prepared spot color (like Reflex Blue), but you can't just throw a spot color on a picture and voila. Anyway, I will stop here! :) Have a good day, sir!
@Mike Crapse What the fuck? Were you drunk when you typed that? CMYK is for printing. Simulating CMYK is just simulating. RGB and CMYK are color models, like YCbCr (which can be used by HDMI).
Built-in calibration is not accurate, as it relies on your eye to "measure" color. I am not saying you can't improve quality with it, but as I said, it is not accurate and thus not reliable. It is advised to use a colorimeter and profiling software.
Use a test picture (the kind you sometimes see on TV) and do it by eye. There are instructions on the web on HOW to do it. Way more accurate than software.
Aaaah, your videos on photorealism are so awesome! Coming from a photography background myself, I did think about most if not all of these aspects, but making that connection so systematically and showing HOW to implement proper amounts of dynamic range etc. is just very fulfilling to watch, and gets me really excited on diving into my current project's VFX side. :)
FYI, filmic is now part of Blender 2.79! No need to download anything extra :)
So, maybe a quick guide on how to set it up and use it would be useful? :)
Yomo agreed!
I thought the creator of Filmic Blender said that there are some issues with the internal implementation of Filmic. I would recommend downloading it just in case.
No reason to switch back. The version in 2.9 by default is fine.
So...how do I know if the version packaged with blender is turned on? Maybe I'm blind but my renders look the same...
Andrew's real dining room actually looks like a render, too. I think he is literally living inside of Blender at this point and he doesn't even realize it.
I was gonna say, that room looks like a great but underdetailed room.
It's the camera's dynamic range.
Just came to the comments to find this particular one, thank you.
God made the world with blender. You see the default cube? That's adam himself.
We were all defaut cubes at the start
I clicked the link and started the download until I realized that Blender 2.82a already uses Filmic color management. Then I realized this video was uploaded in 2017. Your campaign was a success, congrats :)
Yeah here I am with 2.92 thinking, damn, I need this. Then i realized it's there by default.
@@DuringDark YouTube refuses to let me read more
@@DuringDark thank you google, amazing job lol
Yup, Blender version 2.79 added it as Blender Guru mentions in his pinned comment.
things i took most from this video:
Dude's dining room is so sleek it looks like it's CG
You know he cleaned it rigorously for this video.
His real dining room doesn't have enough "clipping", apparently.
professional deformation xD
Legit, if he had showed that photo first I would've been like, "oh great, can't wait to see how he makes this look real"
It's a nice dining room tho
As a photographer the sentence "something so seeming unimportant as dynamic range" made me laugh way harder than it should have
Me too
Inside the CG world there are no limits on dynamic range; the only challenge is to imitate it. Dynamic range is important for cameras; inside software you can do whatever you want
Lol me too, I was like "HOW DARE YOU!!" 😂
*takes one picture of a waterfall*
Yuh really shows that there is a very odd disconnect between photographers and renders
This 30-minute video on light dynamics in software I have never used is apparently more important to my brain than studying for exams.
Pete we all hate school, yes, but it’s still good for you.
Pete I am realllyyyy starting to think you are very very young, as school is not meant for satanic purposes. Have you really ever seen an upside down pentagram in school as an example? NO. Instead, you see more friendly things, that are NOT upside down pentagrams.
Pete also I said spell, not read, still re-check your entire life from day 0, month 0, year 0, decade 0, century 0, and so on.
Pete that’s what you believe, but no.
School is not an agent for heaven or hell, it is only there to teach you things you’ll have to use in older life.
@ Holy shit am I watching the formation of a 4chan user?
Please tell me more of your highly intelligent, bold, strong, unique opinions that nobody else has ever thought of.
I feel so betrayed by the Blender Devs for making me use sRGB this whole time even though I've literally never used Blender in my life
I'm right with you. lol
Hell yeah, and thanks to us this became standard option of the Blender.
@@11kele exactly, we did it together
Yep.................me too!
Haha...
His actual dining room looks fake
yeah, I was like "that shitty render" :D but no, it's just the iPhone's crappy camera :D
That's what I was going to write haha, the true picture looks fake
Tetrachromia I thought the photo was a render. :)
That's because it is.
hhhhhhhh I thought it was a render, Daaaymn!
One of my favorite things about watching your videos is that almost every time you complain about something in an old tutorial, I don’t end up having the same problem in the most recent version of blender. The fact that these things get updated and polished constantly gives me a lot of trust for blender’s future and as a 3d noob it’s very comforting.
"How bright is the Sun?"
"Twenty."
Hey Vsauce! Michael here
"Three thousand"
@@HELLO7657 ok boomer
It's over 9000!!!
@@thomashuijzer6168 you just had to get technical.
You just tricked me into watching a blender tutorial, that I enjoyed watching, because I wanted to know the secret ingredient, and I've never used blender.
You're an amazing youtuber, gotta hand it to you.
yeahBoi i don’t know what it is. I’m not a blender artist. I’m not a photographer. I have no idea what he’s talking about. But he’s doing something right because there was no point during the video that I wanted to stop watching
@@codysodyssey3818 (I am a Blender user) believe me, you were hypnotized
Why it feels like you are an Indian
@@blizzerd2094 probably the grammar, which feels too punctuated for a typical American or other country's response
As a game developer, I have written a game engine that renders scenes in HDR, which potentially produces very intense lighting values. I use tone mapping to convert the 0-x light values down to 0-1 RGB values that can actually be displayed without clipping. The tone mapping operator even includes a bit of math to produce the realistic desaturation effect you mentioned. I also experienced the same issue with intense lights changing hue: this happens because clipping can alter the ratio between the R, G and B channels. I appreciated this video because it was like a mirror image of everything I learned in the course of making my game's lighting display correctly, only seen from another perspective. And the final result speaks for itself- educating people on this topic is a good thing.
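For readers curious what such an operator can look like: below is a minimal Reinhard-style tone map with an added desaturation term. This is an illustrative sketch, not the commenter's actual engine code; the Rec. 709 luminance weights are standard, while the 0.6 desaturation strength and the blending scheme are arbitrary choices for the demo.

```python
# HDR RGB (0..inf) -> displayable 0..1, desaturating bright pixels instead
# of hard-clipping them (which would destroy the channel ratios, i.e. the hue).

def tonemap(rgb, desat=0.6):
    lum = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]  # Rec.709 luminance
    mapped_lum = lum / (1.0 + lum)            # Reinhard: compresses highlights
    t = desat * mapped_lum                    # desaturate more as brightness rises
    return tuple(
        min((c * (1 - t) + lum * t) * (mapped_lum / lum if lum > 0 else 0.0), 1.0)
        for c in rgb
    )

intense_red = (40.0, 2.0, 2.0)
print(tonemap(intense_red))                      # ~ (1.0, 0.58, 0.58): washed-out red
print(tuple(min(c, 1.0) for c in intense_red))   # hard clip: (1.0, 1.0, 1.0), pure white
```

The hard clip turns the intense red into featureless white, while the tone map rolls it off toward a desaturated red, preserving the hue, which is the effect the comment describes.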
game devs always impress me so much
What game are you creating?
@@mirogaming1982 you are like the trolling NPCs in games, sees player with good alchemy skills and guard says "an alchemist eh? can you brew me an ale?"
Will the game be publicly available?
I'm also about to write one in the future so just watched this to see if I learn something.
This is even better than gamma correction.
My heart stopped at "free"...how could something so significant be free? I'm blown away by the blender community
Good products don't have to be paid for, but if I had put hundreds of hours into something I would consider charging people for it or at least accept donations.
It is likely (it seems to me, at least) that with Arnold spreading to the other software packages, the knowledge is running horizontally across professionals (that is, no secrets in the methods and formulas... almost all publicly available). So it wouldn't make a lot of sense to make it paid, because it would only take another professional who is unhappy with the "paid" status to do the same thing for free. Either that, or pure kindness... the Blender community is pretty kind
So I was just trying to say how nice the community is... now everyone is philosophizing about money and the reality of it... geez, way to overthink it :D
Most likely answer? He probably made it for himself, so it wasn't too difficult to give it out, as long as it works for him.
I tend to believe that mods for an open-source program should be free with an optional donation, just like the program itself. But that's just me
I never had Blender installed in my computer, and honestly I won't be doing any 3d stuff any time soon... yet here I am, watching all your videos.
I seldom do any active rendering but Andrew's just so damn friendly and enthusiastic I can't help but watch his videos.
Sounds like rape.
why?
Because he's being forced to do something because someone else is "just so damn friendly".
Who doesn't love explaining a joke first thing in the morning..
you should give it a go, its completely free, you should really give it a go :)
I have no idea how to use Blender, but I still watched the entire video.
John Smith I subscribed to his channel even though I don't have Blender, because he explains so well
I know right?! haha I love these kind of channels.
John Smith I think it's the accent that makes it more intriguing
Yeah, I agree Robbie. The guy did a good job, and I did learn a lot, though it's useless knowledge for me. lol
Same here :P
The fact that it improved a default cube concerns me.
still very deletable
This issue has been fixed.
Default cube shall rise
Reference photo is clearly a render. Blunder Guru is an advanced generative AI of some sort.
Your conversational style, attention to detail and commitment to explaining the concepts make you stand out. This is exceptional material. It's not easy keeping the internet's attention for 30 minutes
This should be helpful when I finished dealing with the cube.
or the donuts LOL
I sat here watching the whole 31 minutes and 27 seconds munching on my popcorn, knowing damn well I never have and never will touch Blender
good shit man
Stuff like this honestly makes me want to learn Blender. The dedication and generosity in this community is absolutely astounding.
Update: Oh gosh, I've been getting a lot of replies to this since posting. To answer: Yes! I've since learned Blender. It's now a core part of my work flow and I'm enjoying it a lot. :)
@Solid Snake Considering the coverage I just saw on 2.8, I highly agree! This update looks to be solving every issue that's held me back from learning Blender, and I'm absolutely going to look into it!
@@moleyface Well welcome then, you couldn't have a better timing for learning it, it's easier than ever.
Are you? You should!
@@Chilcutte Fittingly enough, I've started binging on it since making this comment. ;) I'm actually firing up the program right now to do some work!
It's been great, and I'm enjoying it a lot.
I'm a blender beginner. I'm learning as much as I can about it, and it's AMAZING! You won't regret learning it!
Even though Filmic is now part of Blender, this video still taught me a lot about color and adjusting color in Blender in Nodes. I found all of it to be very interesting and useful knowledge. Thank you, as always!
6:05
This guy's rendered Curtains look more realistic than the real one lol 😂.
I did film production and photography in college and NEVER once heard the term “Middle Grey”, thank you for that! I feel like that’s going to offer me an opportunity to improve my photography and filming!
They should go and adjust their Exposure 101 class then!
Your room has nice graphics
*rendered in ray tracing
Rtx on
With cycles of course
Dynamic range isn't measured in "f-stops", it's just measured in "stops". A f-stop is a measurement of the aperture of a lens.
Yeah that bit confuses me. Since my lenses go up to 30 f-stops.
Technically not true that high end lenses only show t-stops. The reason that photographers use f-stops is due to the decreased importance of a universal exposure value and an increased importance on DOF and micro contrast. Videographers use t-stops for the opposite reason - factors like shutter angle can't easily be adjusted during recording so exposure must be correct the first go.
Actually, it's even simpler than that. F-stop is a theoretical mathematical value, which is why it's used when discussing dynamic range. T-stops (the T stands for transmission) are used on lenses because they measure the actual stop of something physical, the light the lens really transmits.
Cut 'im some slack, he's a modeling nerd, not a camera nerd.
T-stops are way more expensive because they have to be very accurate. F-stop is a theoretical value. For filming you don't want your exposure to change even a bit when changing lenses or zooming in, etc.; for photography it's not as big a deal, and we get cheaper lenses.
Nice overview here!! At Pixar (I was a shading TD) they render in 32-bit and tone mapping is done later. Their tools can create accurate bounced light, but everything is lit "artistically", so each shot may have 20-30 lights in it. Characters have their own lights too. The thing that I thought was kind of crazy was that every shot gets broken out into its own file, so the lighting has to be done for each shot separately. You cannot go in and edit the lighting for an entire sequence or set after a certain stage of production.
With RenderMan, true ray tracing was not even a standard feature until recently. Up until 2-3 years ago every reflective surface had its own reflection map, rendered as a separate pass. Things like global illumination are generally baked into shaders or "brick maps" so they only need to be computed once per scene. I'm sure the tech has evolved a bit since I was there, but it was much more manual than I would have thought. Still, the quality of the renders at full resolution is amazing.
tell me more, tell me more, did you get very far ?
from what i understand, in non-animated movies, the lighting is also adjusted for each camera angle, so they are essentially just mimicking the normal way of shooting a film. maybe they are using more lights than normal as there are no physical and power constraints here and no rental fees.
@AaronDavis Pixar transitioned largely on Monsters U to a global-illuminated lighting model (radiosity). Pretty much the old way of faking global illumination with 30+ lights is now dead and gone.
Tell us more, we thrive off this!!!
I don’t do any 3D modelling but I really like this guy’s voice
Also, the actual photo of the dining room looks fake to me for some reason... :D
I swear I can see individual polygons the kitty is made of. I guess I am getting crazy. :D
thegoodhen I legit thought the exact same thing when I saw the photo, I thought it was the final render
Also, CRT monitors are great, still have one by my side as I'm typing this comment! :3
Pheew, so I'm not the only one. I had to pause the video there and get closer look, and I still thought it was render :D
It looks fake to you because you are a chicken! You are a fake bird!!
why am i watching this, i dont even use blender, good content tho lol
lol same
Word
Noa Green vernacular
yeah dude, why would you watch anything on youtube? like.. you dont even own top gear so why would you watch it? *facepalm*...
@@masterjohn3126 imagine saying 'facepalm' in 2019 unironically, I clicked on a video in my recommended tab because I was curious about how it worked.
FOR MAC USERS:
-go to the blender icon
-(right) click on it
-click on "display bundle content" or "show package contents"
-open the "content" folder
-open "resources", then "2.78", then "datafiles"
It might also say "show package contents".
aahhh :)
telling mac users to "right click"
you're welcome
yes my bad
As soon as you called your explanation "Dynamic Range", HDR just made a whole lot more sense.
Does this ever happen to you? I don't do any rendering, I'm not a photographer, I have no idea why this was in my recommended. I still watched the whole thing.
Knowledge is power
Do you watch gaming vids or gaming docs?
For those wondering where the effect of colors fading out to white from overexposure/lightintensity comes from:
The sensors in your eyes that detect different colors don't actually 'just' detect red, green or blue light. As a matter of fact, the color 'red' is not just a single wavelength of radiation but more a spectrum that we've classified as being 'red'.
They're not completely immune to other colors. So when we are talking super-exposure you get to a point where if there is an enormous amount of light in the red wavelength spectrum hitting your retina and all of your red sensors are blaring at your brain, you reach the threshold of your blue and green sensors starting to fire making your brain think it is also seeing those light-types and thus your brain 'washes' the red color out increasingly to pure white as the exposure becomes more and more (because red + blue + green = white according to your brain).
Because the enhancement proposed in this video severely increases light exposure, you start running into the ranges of exposure where you NEED this desaturation to occur in order for it to feel realistic.
The most important thing to take away from all of this is that "realism" in the field of photo/video editing/rendering has nothing to do with physical reality but with how WE, the "viewer", see said reality. Our monitors/TVs cannot (yet) put out the insane amount of light that the sun can in order to make our eyes/brain trigger the color-washout effect (though thankfully, thanks to OLED, we can now at least get nigh-perfect infinite contrast), so we have to "emulate" it for our viewing devices.
This is also the reason why indoor rendered scenes in videogames or animations can look "correct" when there is no outside light present and the only illumination comes from low-lumen artificial light sources, but as soon as there's a window involved everything immediately looks wrong and fake (without the above fixes applied), because even on a "dark" cloudy day the sunlight coming from the sky at noon is still several orders of magnitude brighter than a lightbulb. This is a problem that I, as a gamer, have often witnessed in videogames in the past and even up to this day.
Your eyes have an iris that can contract and expand to shift the exposure range up or down to deal with this insane difference in exposure, but it does this according to what you are looking at. So if you look at the bright window, everything else in the room becomes darker; if you look at the "dark" wall, your iris expands, letting through more light, and then the wall becomes nicely visible while the bright window in the corner of your eye becomes completely white.
This is another issue with 'static' rendered images because basically you have to 'decide' for the viewer what his eyes should focus on and that takes away the realism as well. Because on a static image you can look at the dark bookshelf in the corner of the room or the over-exposed table and the exposure doesn't change but in reality your eyes would adjust to their brightness accordingly.
So, even though with the tricks used in this video you can simulate 'photo'realism, you still won't be able to simulate ACTUAL realism in the way you are capable of in an interactive experience such as a videogame where you can simply adjust the exposure based on what the crosshair is pointing at.
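The progression described above, a saturated color washing out toward white as intensity climbs, can be mimicked with a hard per-channel clip; a toy sketch with an arbitrary orange value, not any particular renderer's transform:

```python
# Hard-clipping each channel at 1.0 (what a limited-range display transform
# effectively does) shifts hue on the way to white.

def clip(rgb):
    return tuple(min(c, 1.0) for c in rgb)

orange = (1.0, 0.45, 0.05)                     # an orange light, scene-linear
for exposure in (1, 4, 16):
    bright = tuple(c * exposure for c in orange)
    print(exposure, clip(bright))
# 1x stays orange; at 4x the green channel saturates too and it reads as yellow;
# at 16x all channels are near the ceiling and it reads as near-white.
```

Note the intermediate hue shift (orange to yellow) that a plain clip produces, where the eye's response, as described above, washes out toward white more gracefully, which is what a filmic-style desaturating transform tries to imitate.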
One reason I never enjoyed DoF is because I ignore center screen 75% if the time. Now that we're getting VR and eye tracking.. Wooboi.
Blender Guru: *Advanced camera and lighting language*
Me, a dumbass, who has no idea what he's saying: "Ah yes of course"
same tbh
Made me laugh
Lmao
@Fluxots
When I saw the thumbnail for this video I said to myself, "the answer is more photons" aaaaaand yes indeed, photons are the answer. Glad I know my stuff :D
he did explain it
This video showed me 3D was a thing. Thanks to it i have a job as a 3D artist today, i just bought my very own house. Everytime i tell people about what i do i mention this video. Thank you Andrew
Woa, wholesome context right here
Amazed. Once again. When will you ever drop in quality? There must be a ceiling?!
But no, yet again: Another gamechanger from Sir Andrew...
5:55 wow, that photo looks like a render. I was like, wtf, it's a real-life photo?!
I thought I was the only one who thought it looked rendered.
Same haha
Ruttokello so fukin true lmao i thought it was a render
yeah, NO WAY it's real
JAJA OMG, I think the same!
I’ve been a graphic designer for many many years now. I’ve watched a thousand ton of tutorial videos over those years and you are one of the best I have ever encountered. Can’t compliment you enough. Keep up the amazing work and thank you for being my gateway to Blender!!
Amazed how things get good. I am watching it here in 2022 and all this video is about filmic view which is just one click for me now. And it is with the help of Andrew and all those people in the industry. Thanks, guys
I haven’t even installed Blender. This is literally the second tutorial I’ve watched and I feel like I’ve been given a several month head start on my skills. Thanks!
In case anyone is confused by the difference between "display referred" and "scene referred" data, it has to do with a logarithmic vs linear representation of light.
Our eyes perceive light _logarithmically_ , meaning that the difference between 1 and 2 looks the same as the difference between 100 and 200, so it makes sense for monitors to use logarithmic values in order to have the same _perceived_ level of detail in the darkness as in the light.
The problem is that fundamentally, light is just a collection of photons, and in order to properly add them together, you need to do it _linearly_ . When you add 10 + 10, you want to get 20, not 100. This is why logarithmic light values have problems with clipping when you add them together (the further you get from the "center" of the log-scale, the more extreme the difference becomes), and why you should stick with the linear values until the very end.
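A tiny numerical sketch of that point, with a plain 2.2 power law standing in for the real (piecewise) sRGB curve:

```python
# Adding light must happen on linear values; summing display-encoded values
# gives the wrong, too-bright answer.

GAMMA = 2.2

def encode(linear: float) -> float:    # linear radiance -> display value
    return linear ** (1 / GAMMA)

def decode(display: float) -> float:   # display value -> linear radiance
    return display ** GAMMA

a = b = 0.5                    # two equal lights, in linear radiance

correct = encode(a + b)        # add photons first, encode once: exactly 1.0
naive = encode(a) + encode(b)  # adding encoded values: ~1.46, past the display max

print(correct, naive)
```

The naive sum overshoots white by almost half a stop even in this mild case, which is exactly the clipping problem the comment describes; the further from mid-grey the values sit, the worse the error gets.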
I've been shooting LOG footage on cinema cameras forever, and it's really cool being able to get essentially LOG footage and colorgrade it how I'm used to with real cameras in blender. Your tutorials are AMAZING!
This is a legendary tutorial in render-lighting history. It's the second mind-blowing, life-changing, highly useful tutorial I've found on YouTube (the other one was about making realistic glass materials with mix shaders and different weathering textures)
Filmic Blender is on the official target list for Blender 2.79 for quite a while now.
No need to start a big campaign for inclusion in master as suggested at 10:40, the developers are on it :)
Until then, enjoy Filmic Blender as an addon.
Thanks to this video, and to the Facebook page that shared it, I learned about this; not thanks to the "developers". So...
BlenderDiplom
So when is 2.79 out? Do you recommend we wait?
Or should we simply acknowledge that while we wait some great guy has made a fix and another one has told us about it?
wiki.blender.org/index.php/Dev:2.7#Suggested_targets
looks like early May, but no need to wait. Just follow instructions in the video.
My comment was probably misleading. I didn't suggest waiting. There is just no need to push the devs as suggested at 10:40, they are on it :)
Ah thats great to hear!
Oh my, browsing old Blender Guru videos and suddenly a new one pops out. :D
Can you browse some more? we want more new videos... ;)
Yes. More browsing needed.
When I watch blender guru video I feel like I Level Up ! (DING)
(DING) Mee too! :D
(DING) no, ron
KACHING FLIP!!!!
DjentFoxProductions becky used to lemme smash
Want to click like but dont want to break 333 likes xD
We never should've jumped to 4k, we should've jumped to HDR at 1440p
Want to get really pissed off? That's just a nice 2000s CRT. We had high resolution, high refresh rates, and excellent dynamic range... but sacrificed all of that for a smaller footprint and lower weight. Only now with OLED are we catching back up to the same quality. We've been in a 20-year video dark age.
@@bm1747 oh yeah I've been watching vids from digital foundry and want one, I'm currently rocking the LG C9 but I want a newer one for better BFI, I'm hoping for an HDR OLED bfi VR soon but idek if it's possible
After trying to insert filmic into my blender, my blender broke. I had to buy a new one. You owe me €120.
Stop using fancy blenders then
Vitamix. Takes all plugins. ;-)
TAKE YOUR AWARD AND GET THE HELL OUT!
pirate : *"cough"*
〈彡XʜᴜɴᴛᴇʀX彡〉 You didn’t get the joke
People like Troy are what makes Blender such a great tool
6:00 lmao I thought you were trying to fool my ass and then just say like "oh no that's actually a render with fixed lighting". Really looked like a render
You're such an optimistic person with so much knowledge that you share with others. I can't even begin to think how much you have helped this community
I think it's awesome that Blender took it upon themselves to integrate this into their update.
I seriously don't see the benefit of any other 3D Studio.
It's like my Converse; once worn, rarely go back to Nike.
As you can tell, I'm a bit late in the game!
Man... I'm rewatching this video from about 2 years ago because I wanted to re-learn it, since I was too lazy the first time. And damn, so many things have been updated, like the UI (which I was very happy about, because being overwhelmed by it the first time led to me not wanting to learn anymore), 2D animation, and this. Man, I'm so lucky to be learning this now.
His IRL dining room looks like a starter set you'd get in UE4 lmao
Apropos people who misunderstand colour spaces... For one, sRGB being ancient is neither here nor there, because all web images and web browsers are sRGB, and all displays are approximately sRGB when displaying 8-bit inputs, as is the output of the iPhone camera in JPG mode (or high-depth linear in RAW mode). Most commercial LCD PC monitors and TVs don't even cover the complete sRGB space, neither in gamut (unless they use RGB backlighting) nor in dynamic range, so all they have to work with is a rather pitiful approximation. For all the faults of CRTs (geometry, resolution, sharpness, convergence), colour reproduction was not generally one of them, and they are still occasionally used as a colour reference.
For another, the claim that sRGB represents only 8 f-stops is a blatant lie and misunderstanding. It would be true if it were an 8-bit linear space, but it's an 8-bit exponential space with an approximate exponent of 1/2.2. So how much dynamic range does it have? Take the darkest non-zero value: it has a linear value of (1/255)^2.2 ≈ 0.0000051, beyond even the dark resolution of 16-bit linear (1/65535 ≈ 0.0000153). Now that is of course not quite correct, because sRGB has a linear section at the start while I have used an approximate substitution gamma, so it's realistically closer to the resolution of 12-bit linear, or about 12 photographic f-stops, if resolution loss in the brightest tones is partially neglected, which it can be according to perceptual metrics.
When rendering, generally infinite dynamic range is assumed, because any exponent of 0 is 0, in Blender and elsewhere. However this is not how cameras work. Specifically because their dynamic range is finite, they end up mapping the values to maximize useful information and reject noise, and this is what is missing and has been correctly added by Filmic Blender, and the explanation written by its author Troy Sobotka is absolutely correct. You merely failed at reading it.
So, Filmic Plugin actually makes it look more like a photo (by reducing information in the same manner as a camera) rather than more realistic, per se?
You could say so, James, except we neither necessarily have a "ground truth" for realism that doesn't stem from a camera, something we could faithfully compare against, nor do we have display devices that are capable of representing a real light field.
Shalok Shalom I don't have an opinion on Mirasol displays specifically, but I'm sceptical about the dynamic range of any reflective media. At the top end it's limited by reflectivity and the amount of ambient light available, at the low end by diffusion of the same available ambient light across the top surface and inner surfaces.
Having worked in print many decades ago: even offset print has a pretty low dynamic range compared to a reasonable PC monitor, so you really have to push your contrast and use tricks to make the image readable, and all reflective display solutions so far have had to work with a lot less. After all, the ink colours in print don't need to share their surface properties with the substrate.
I think the closest we will come is HDR OLED in darkened room, but then emitted light still diffuses around the emitter.
Shalok Shalom you'll still be judging it in comparison to the brightly lit environment, and with the losses we have in glass electrodes alone, you get a choice between somewhat dim display and low viewing angle. Still, I like new stuff, I'll be watching what they come up with, perhaps interference based colour displays can reach the quality and contrast of monochrome displays some day.
In old news, I think CCSTN displays were curious. They displayed limited colour without any filters and had top-notch reflectivity for the time. Old Siemens phones used to use them, as did a Tamagotchi-style Pokémon toy with a colour display, and not much else, all made around the late '90s. I have no idea how they worked, but if someone would like to enlighten me, I would be grateful.
Augustinus, let's try this, shall we? 0^2 = 0*0 = 0. Ergo, not every exponent of 0 is 1. In fact I did make a mistake: every non-zero exponent of 0 is 0, because 0^0 = 1. But this has no practical relevance here, in part because it's a two-way discontinuity, and in part because 0 as an exponent doesn't result in a useful image transfer function - you'd be replacing everything with blaaaack, except for the practically purely hypothetical absolute darkness, which you'd replace with pure white.
This is all well and good (filmic blender), but there are a lot of misconceptions and misinformation in this video. All renderers (including Cycles) render under the hood in a linear colourspace. This is so that the math behind everything is correct and not skewed. It's necessary for a correct representation of physics and energy transfer/propagation/light simulation, which is what path-tracing renderers like Cycles do. Thus, by default, Blender actually outputs a linear image, because it renders a linear image. The sRGB conversion (display device) is done so that it looks correct on your monitor... It isn't some old, terrible workflow; it's literally so that the linear image, rendered to work like real life, looks natural to you when you look at it on your screen. It takes a while to understand, but Blender's own color management wiki page does a good job of explaining it.
The point is, this filmic blender plugin doesn't actually "fix" anything. Nothing was broken. Blender and any other renderer that can render in 16- or 32-bit floating point has near-infinite dynamic range, not 8 or 25 as you say.
This is merely a plugin which emulates the nice logarithmic response that film has. It's a method of tone mapping: shifting all that colour information, the bright whites and dark blacks, into a space that looks natural to your eye and can fit in an 8-bit image or video, viewable on the vast majority of consumer devices that can only display that much information.
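As a rough sketch of what such a logarithmic tone map does (a generic illustration, not the actual Filmic Blender OCIO transform by Troy Sobotka):

```python
import math

def log_tonemap(linear: float, stops: float = 16.0) -> float:
    """Compress linear values spanning `stops` f-stops below 1.0 into [0, 1]."""
    if linear <= 0.0:
        return 0.0
    v = (math.log2(linear) + stops) / stops   # position within the range
    return min(max(v, 0.0), 1.0)              # clamp the extremes

# A 16-stop scene range fits into the displayable [0, 1] interval:
for x in (0.001, 0.18, 1.0):
    print(f"{x:>6} -> {log_tonemap(x):.3f}")
```

A value 10 stops darker than white still lands well above zero on the display, instead of being crushed to black the way a linear 8-bit encoding would crush it.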
I am now interested: can Blender output in any HDR formats compatible with compliant devices (HDR10/Dolby Vision)? Are there any configurations or plugins for this?
If Blender has practically infinite dynamic range, why does it not give the same results as the reference image? How can you change the dynamic range to mimic a DSLR or the human eye?
This tutorial is so well-done. It explains everything in the perfect amount of detail and gives very easy-to-follow instructions. Amazing!
I'm gonna be honest, I normally don't stick through tutorials this long. But the information and the way you explained it was EXTREMELY helpful and gave me such a better understanding of the way blender is managing my colors. I do a ton of product renders and this has helped me more than you could know! Thank you!
18:34 I know this video is two years old, but the phrase is “paradigm shift,” not “paradox shift.”
context
@@shondelb5175 what
I know your comment is one year old, but I'm hungry.
I don't even know how to make a sphere in blender but I'm convinced I need Filmic Blender in my life
Your background music is dope.
Thank the gods for this genius who gave us filmic blender and was able to get it added to the program as a default. This is a part of history that should never be forgotten. o7
Thank you so much for explaining this concept so well!! This has to be the most valuable tutorial I've seen in 2 years on UA-cam.
Also, maybe I'm missing the point, but your dining room looks like a render IRL.
Minimalist interior, perhaps? Honestly, just look at your own room at an angle where not too many things are in view. If I turn and look at the corner of my room, I only really see the top of my chair, a lamp, a couple of speakers, and a TV, and it looks like it could be a render. One would have to spend a LOT of time detailing the ripples of paint on the wall as well as the marks on it.
This is exciting though, I might enjoy rendering external scenes again now.
you forgot to model the cat
An unforgivable oversight!
And the couches on the right
The cat is more important.
Cdabek and the fan on ceiling :)
and the books on the bookshelf
I'm not a blender user or even a render-er in any way but watching this changed my life, thank you Guru, and thank you Troy
Have you tried to re-render your sci-fi drone with Filmic?
he wouldn't notice any difference since he said that he "tweaked" it with photoshop so..
No, I think he said that before he got to know Filmic.
Anyway, I was just curious to see the comparison.
Ultimately I think people could just redo his older tutorials using this tweak and see how it goes.
You don't need to re-render anything. Eh, the video is full of mistakes; it's just a setting for how the data is read.
THANK YOU! You are 100% right; this isn't some revolutionary way to "correctly" work with colour. The correct, physically accurate, most lifelike way to work with colour is a simple linear workflow, which is the default in Blender... Just render your shit as linear 16- or 32-bit EXRs, and colour correct to your heart's content, with no loss of information. This video is riddled with misinformation.
I'm like 'ooh interesting' even though I don't even use Blender! What am I doing with my life???
Joe Cowan You're not alone. I've barely used Blender, and not at all for the past five years, but I'm still enjoying the video.
Same :D
I have a research paper waiting to be typed and data to be collected! Why in hell did I enter this 3D world?!
yeah its fun! :D
Same here :O
what is blender? I'm joking ... i just googled it
This video looks interesting... but
1.) I don't do blender.
2.) It was on my recommended.
and 3.) It's 3 AM where I'm at and I'm supposed to be studying.
JMap what do you study?
SansyBoy Digital Design, Data Structures and Algorithms, and Calculus :'( 3 exams today. 2 exams a while ago, and Calculus is about to start as I'm typing. Well... wish me luck. I hope all that Blender-photorealism knowledge is gonna be put to good use.
This is exactly the same as me: it is now 3:01 AM, I'm watching this from my recommendations, and I don't use Blender. xD
it starts sounding like a cult hahaha it's 04:44 AM, C4D user, graphic design student
lmao
Watch this vid, went to download and found the note about it already being included from 2.79... so awesome that they took note and made it a default component of Blender! Thanks for helping to get this out there!
Well, ultimately, it would be great if we could just set light sources brightness in Blender in physical units. Watts, Lumens, etc.
Hey Fred how bright is that light?
Fred: 20
20 what?
Fred: I already told you its 20
Tennouseijin now we have it
if you set it high enough it will disintegrate your scene
@@Hamstray > Enters light brightness value: 10 SN
What does SN stand for?
Super Nova.
Next time, 100 BB
Big Bangs.
Still better than measuring volume in multiples of olympic swimming pools.
How the F*** are your videos so informative...
hehe thanks mate
He lives up to the name Blender Guru :)
Nash Jordeen +1 for this comment's appropriate intensity. and now we know it won't clip.
Tone of voice is hard to discern via text
This sounds aggressive
still doesn't look as good as minecraft with ray tracing
seus shaders are amazing lol
My computer is crying
Blender has ray tracing
life doesn't look as good
Johann Inong blender is a path tracer
6 years later and this video is still helpful. thanks for putting all this info together!
That "off" feeling is actually kinda unsettling, it could work with spooky horror renders
You could compare this problem to microphones and sound. Sound has lots of dynamic range, but a mic can only have so much. Outside the mic's range (too quiet or loud), quality is lost, and the range gets bigger with a better mic. And that's why sound feels more realistic when you can hear more frequencies at good quality.
ThruThe9 harmonics
The only time I can think of where I'd use sRGB over Filmic would be if I was trying to mimic the look of the old Pixar shorts. Like, the old-old ones.
I think this is a really helpful video, but I think it's also important to take into account what goes into making a photo. You took that photo of your room using a phone, and your phone is preprogrammed with software that amalgamates what a "good" shot should look like. It's applying all sorts of LUTs and colour adjustments to rebalance the range of the natural light into something more aesthetically appealing, as determined by Apple's (or whoever's) software scientists. Blender is acting more like an unprocessed camera, catching the light that you've actively placed in there, and the reason you need to add other light sources etc. is the same reason a film's DP might have to. Natural light doesn't always look the most natural on film, and it takes crazy effort to make something look "natural".
Also, dynamic range isn't the panacea for lighting issues, but it does make fixing this stuff in post a hell of a lot easier. Tons of incredible-looking movies have been made on film stocks and cameras with dynamic ranges in the realm of 3-6 stops; it just requires a whole lot more effort on the part of the DP to control those light differentials. It ultimately all comes back to the grey scale. Whether your dynamic range is 6 or 18 stops, white is white and black is black, but with more dynamic range you have more room to push that brightness differential and fix it in post.
I wish you would have showed the 100 strength value for the sun in sRGB for comparison
Using Blender 2.79 and this is included already. So, cool. I don't have to replace anything.
6:00 The reference also has a cat, which immediately makes it inherently superior to any cat-less render.
Needs a short update video. Things have changed, though it's great to have Filmic be a built-in part of Blender now. It's not labelled "Render View" any more, but "View Transform". Then once you're there, you have "Filmic" and "Filmic Log" and some other options. Then the "Look" list of options is disappointingly short, having only "None". Please guide us through these uncharted lands, oh good Guru of ours!
Is it just me, or does his real living room look more like CG than the actual CG image now? XD 😂😂
Yea
Because we can see between 15 and 20, and filmic is more than 20 in exposure 😂😂😂
It's the iPhone camera.
@@amberheard2869 qwqwweeow
Its the final render, he actually lives in the default cube, or a donut
10:20 I think you're making a big mistake here by saying sRGB is wrong!
I believe you're mistaking *color profiles*, *color spaces* and *dynamic range*.
None of these are the same and sRGB is fine for most of the common usage, that is why it's still used as the reference across all kinds of devices, including the ones being built and released right now.
sRGB simply is a "package" containing :
• 3 color chromaticities defining the vertices of the color gamut triangle that you can display, and these are fine!
• 1 white point chromaticity called D65 at the temperature of 6500K (which is the color of regular blue sky), this is totally fine also
• 1 luminance curve transform that is not really a gamma curve of 2.2 nor a gamma curve of 2.4 but a mix of a linear slope curve + a 2.4 gamma curve
And it's certainly that last point, the gamma curve, that makes you say sRGB is incorrect - except it isn't.
Of course, the values sRGB deals with are supposed to lie in a [0,1] range, but that's because you're missing an important stage in color management called *Luminance Adaptation* (and also *Tone Mapping*; those are often mixed together).
The color pipeline should then be :
1) HDR image rendering (all renderers should be doing this)
2) Luminance Adaptation (result image is still HDR at this point)
3) Tone Mapping (result image is yet again still HDR but anything above 1 will be clipped)
4) sRGB encoding (this time we store a LDR image, usually on 8 bits and clipping occurs)
What is important to know is that luminance adaptation + tone mapping is not a linear transform, and these 2 operations alone are responsible for compressing HDR into LDR. sRGB is not the guilty party here; rather, it's the default tone mapping operator in Blender, which must be set to something as simple as "Clip" or "Scale". But we see the exposure you set is 0, and 2^0 = 1, so your scale factor is 1, and thus it's perfectly normal for HDR colors to simply be clipped...
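A minimal sketch of those four stages with the simplest possible operators (exposure as the adaptation step, "Clip" as the tone mapper, then the standard piecewise sRGB encoding; the function names are mine, not Blender's):

```python
def srgb_encode(linear: float) -> float:
    """Official sRGB encoding: linear toe below 0.0031308, 2.4 gamma above."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def render_to_8bit(linear: float, exposure_stops: float = 0.0) -> int:
    adapted = linear * 2 ** exposure_stops      # 2) luminance adaptation
    clipped = min(max(adapted, 0.0), 1.0)       # 3) "Clip" tone mapping
    return round(srgb_encode(clipped) * 255)    # 4) 8-bit sRGB encoding

print(render_to_8bit(0.18))                     # mid grey -> 118
print(render_to_8bit(5.0))                      # HDR value clips to 255
print(render_to_8bit(5.0, exposure_stops=-3))   # lower exposure recovers it
```

With exposure at 0 the scale factor is 2^0 = 1, so any scene value above 1.0 clips to white exactly as described above; it's the tone mapper, not the sRGB encoding, that decides what survives.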
Finally, you're saying sRGB was used to support CRT displays, but that's only partially true: you're implying that now that we have super cool flat LCD screens, we shouldn't need the gamma curve anymore, I suppose?
Except we still do, for a simple reason: our eyes are more sensitive to shadows than highlights!
And gamma correction ensures just that: by writing gamma-corrected colors, we expand the range (bit-wise) of dark tones that we can write in a single byte, at the expense of compressing the range of light tones, but those don't really matter as much.
For example, to encode the same amount of shadow information as an 8-bit gamma-corrected image, you would need about 12 bits without the gamma correction!
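That figure is easy to sanity-check: decode the darkest non-zero 8-bit sRGB code value and count the linear bits needed to resolve it (a sketch using the official sRGB decode constants):

```python
import math

def srgb_decode(code: int) -> float:
    """Inverse of 8-bit sRGB encoding: code value -> linear light."""
    c = code / 255
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

step = srgb_decode(1)                 # darkest non-zero step, in linear light
bits = math.ceil(-math.log2(step))    # linear bits needed to resolve it
print(f"smallest shadow step: {step:.2e} -> ~{bits} linear bits")
```

Code value 1 decodes to about 3e-4 linear, and resolving a step that small in a plain linear encoding indeed takes about 12 bits.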
That's a common misconception. Our eyes are more sensitive to shadows than highlights, sure, but that's how we see real-world light, which is as linear as your rendered EXRs.
Gamma correction is a two-part operation: the transfer function you apply to your sRGB images is compensated by the non-linearity of displays (a legacy aspect from the CRT days).
Check Poynton's Gamma FAQs and investigate what OETF and EOTF are.
Current hardware would be able to display linear images directly, but it keeps all the non-linear machinery purely for legacy reasons: CRTs were nonlinear, we're still displaying signals/data that were designed for nonlinear hardware, and 8 bits can't accommodate enough precision for linear images.
Well okay, try storing your images as linear 8-bits, without gamma-correction then, and see how that goes...
Best of luck removing the banding.
Did you even read the whole comment?
Did I need this?
No
Did I even search for this?
No
Did I love it?
Absolutely.
I don't use Blender, but I use Unity, and I will definitely start using the ACES color space now. Great job explaining the difference; I was always wondering why it made everything grey.
Thanks for sharing Andrew, this is really a life changer !
On a sidenote, my workflow does include a lot of texture baking, sadly I just found out that bakes don't seem to be affected by Color Management settings, unless I am missing something here?
The rendering is not changed at all. The color management changes made by filmic intervene after the rendering is complete and simply map the render to a wider dynamic range. Only changing the dynamic range will not alter your render, but if you want the improvement that a wider dynamic range enables, you will need to redesign the lighting in your scenes. It is not a "magically make your render better" switch you turn on and off, but more like a "let me use realistic lighting values in my scenes" switch.
Have a nice day,
Create.
the colour management happens after the render so any baking you do will be based on the default lighting settings of your scene
Thanks for replying !
The thing is, I actually get a huge "exposure" difference between my baked textures and viewport renders. What I mean is that while color management settings show up in my renders, when I'm pleased with the result and ready to bake textures, the bake ends up looking way different, as if the dynamic range were narrower for baking; the whites usually clip a lot more on it. I was hoping that, since texture baking is basically a flat render, the color management settings might affect it, even just gamma and exposure, but I guess that's just not possible at this time. So even if I redesigned my scene lighting + filmic, my render would be drastically improved while my baked textures would remain the same.
Right, that's what I figured
The problem is then moved from Blender to the renderer you will be using your baked textures in. So probably Unity, Unreal Engine etc.
I'm only familiar with Unity, but using linear color space, PBR textures, HDR cameras and tone-mapping makes all the difference there. Your textures in most cases should not contain any lighting information (except AO, if you consider that lighting information) at all. Your case might of course be different from what I imagine, but that is usually how it goes.
I'm not even an animator but I still stayed for the entire video bc this stuff is fascinating?????
I am a 3D visualizer... and I fast-forwarded so many times....
It's free, so why not give it a try? If you can follow these videos you'd be surprised how easy it is to fiddle around and produce something for fun. Sculpting is pretty good in blender now and really satisfying to do. Great fun.
Gunawan Lee now now, don’t be cruel
@@gunawanlee5182 you must be a mega genius
@@NN-sp9tu One of my friends died at a young age chasing this realistic render, leaving behind his wife and child. Today, 3D artists may be trading their life span for every tiny spot of added realism. :) I am just a noob.
In my opinion you're confusing sRGB with the sRGB EOTF, the same way you're confusing the CMS with the filmic view transform .. 1) sRGB is the color space of the monitor. That is, it's the "color palette" the monitor is able to use to show you any picture .. 2) the sRGB EOTF is a transfer function from linear space to sRGB space. That is, it is a translation from "real colors" to the sRGB color palette .. 3) the filmic view transform (the Filmic option in Blender 2.79) is just another transfer function from linear to sRGB, the same as the sRGB EOTF. The difference is that it's a better function; that is, it gives a better perception, on an sRGB image, of what the human eye would see in the real world with real colors .. In short: the filmic view transform is better than the sRGB EOTF because it translates better from linear to sRGB.
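A toy comparison of the two kinds of linear-to-display transfer function described here, using a simple Reinhard-style roll-off as a stand-in for a filmic curve (again, not the actual Filmic Blender transform):

```python
def srgb_view(linear: float) -> float:
    """Plain sRGB encode: anything above 1.0 simply clips."""
    clipped = min(max(linear, 0.0), 1.0)
    if clipped <= 0.0031308:
        return 12.92 * clipped
    return 1.055 * clipped ** (1 / 2.4) - 0.055

def filmic_like_view(linear: float) -> float:
    """Same encode, but with a smooth highlight roll-off first."""
    rolled = linear / (1.0 + linear)          # Reinhard-style compression
    if rolled <= 0.0031308:
        return 12.92 * rolled
    return 1.055 * rolled ** (1 / 2.4) - 0.055

for x in (0.5, 1.0, 4.0, 16.0):
    print(f"{x:>5}: sRGB={srgb_view(x):.3f}  filmic-like={filmic_like_view(x):.3f}")
```

Both map linear values onto the same sRGB display; the difference is that the plain encode flattens everything above 1.0 to the same white, while the roll-off keeps distinguishing 4.0 from 16.0, which is the sense in which a filmic-style function "translates better" from linear.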
That makes more sense. I knew that the sRGB talk wasn't right because sRGB is a colour space... but this clarifies a lot. Thanks!
I've never even touched render or thought about making digital art yet I watched the whole thing. Great video
I think you got your blender room and actual room mixed up. 5:55 i swear this is a render.
You should make a new updated video using Blender 3.0
I'd agree that Filmic Blender's desaturation is closer to reality, if it weren't for the fact that the table's color saturation in the real photo is most similar to the sRGB version rather than the washed-out Filmic one... you can check it at 21:50.
Now I wonder why... I understand that desaturation is a real thing, but why is the "wrong" version closer to reality in terms of color saturation than the Filmic version?
I noticed that too... Maybe he used too strong a sunlight? Anyway, I think you can correct it in post-production.
Well, it's taken with his iPhone, which does not use film, so this effect doesn't really happen there. Digital cameras are more accurate, but many prefer the defects that come with film cameras; hence the "filmic" name. Blender's default tonemapping isn't very good at all, and this fix is a very simple way of improving it quickly, but if you really want control over your renders, use a linear workflow: save your files as HDRIs or other linear file formats and do your tonemapping in software like Photoshop.
Emil Klingberg And how do you set the intensity of your lamps for that? As stated in the video, if you do your lighting through the limited-dynamic-range default view, you tend to use ratios that are too low, to avoid clipping.
That's mainly the purpose of Filmic (providing a wider-dynamic-range tonemapping to help artists light their scenes with realistic ratios); the desaturation is a plus.
This is a great fix for Blender users, since they don't have a proper tone mapper, but most other render packages come with one, so you can see preliminary results before your final comp in Photoshop. It also introduces that filmic look (washed-out colors at the high end), which is technically a limitation of film; a digital camera will be more accurate and not look so washed out with intense colors. It's a preference, not a physically accurate thing.
This is all just about displaying the rendered information; the information is the same, just interpreted differently. When you save your file, if you use a format like JPEG you lose all your light data, so you need to readjust the gamma for viewing. But if you save all your floating-point data, you can be very precise with your tone mapping in Photoshop and choose exactly how you want to display it.
Bottom line: it's a cool way to get some great-looking images right out of Blender, but it's not a great way to work if you want to do compositing later.
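The point about losing light data can be illustrated in a couple of lines (a toy sketch; real JPEG and EXR handling involve far more than this):

```python
# Linear scene values, some brighter than display white (> 1.0).
values = [0.02, 0.5, 4.0, 16.0]

# "JPEG-like" storage: clamp to [0, 1] and quantize to 8 bits.
jpeg_like = [round(min(max(v, 0.0), 1.0) * 255) / 255 for v in values]

# "EXR-like" storage: keep the floating-point values as they are.
exr_like = list(values)

print(jpeg_like)   # 4.0 and 16.0 have both collapsed to 1.0
print(exr_like)    # full range intact; tone mapping can still choose
```

Once the highlights have been clamped together, no amount of grading in post can tell 4.0 apart from 16.0, which is why compositing workflows keep the floating-point data.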
Why is it not great to work with if you want to do compositing work later?
If you do that you most likely want to do it from an EXR, which is not affected by the view.
Nobody is saying that you have to use JPEG as output format because you used filmic.
Filmic is an OCIO configuration; you can take it along with your linear EXRs to a compositing program and do your compositing grade from there. If you want.
But still, the point is having a view that allows users to light their scenes with realistic light ratios. That's the main thing to consider here.
And regarding desaturation, saying that digital cameras are more accurate than film is inaccurate in this context. When you go out after being in a dark environment and the outdoor light blinds you, what do you see? Is it white, or is it magenta where there were reds, yellow where there was orange, and cyan skies?
You're the only guy who makes me watch a 30-minute video without checking the time.
I believe the latest version of Blender (2.79) now includes filmic color, rendering this video moot. I started to follow the directions in this video and then realized I was about to remove the filmic color management that was already there. Maybe we can get an update showing how filmic color is now built into Blender? Awesome video!
It does have Filmic now, yes. But that doesn't make the video moot because it's still not the default even more than two years later.
23:35
Yellow, Cyan and Magenta
oh cmon, that's basic.
Yolo, Cyna and Magneta!
Yeah, but those are RGB yellow, cyan and magenta, so whatever; aqua it can be! CMYK is for printing, and you will never get those colors printed as seen in the video - not close, at least, especially the cyan. Getting bright colors in many printing methods is hard to impossible. You get around it by using shiny support material, varnish, and sometimes a prepared spot color (like Reflex Blue), but you can't just throw a spot color on a picture and voilà.
Anyway, I will stop here! :) Have a good day, sir!
hdmi is encoded in cmyk...
@Mike Crapse What the fuck? Were you drunk when you typed that? CMYK is for printing. Simulating CMYK is just simulating. RGB and CMYK are color models, like YCbCr (which can be used by HDMI).
And don't forget to calibrate your monitors!
After calibrating my monitor, highlights look much smoother (especially in green areas).
Built-in calibration is not accurate, as it relies on your eye to "measure" color. I'm not saying you can't improve quality with it, but as I said, it's not accurate and thus not reliable. It's advised to use a colorimeter and profiling software.
Use a test picture (the kind you sometimes see on TV) and do it by eye. There are instructions on the web for HOW to do it. Way more accurate than software.
xd
Aaaah, your videos on photorealism are so awesome! Coming from a photography background myself, I did think about most if not all of these aspects, but making that connection so systematically and showing HOW to implement proper amounts of dynamic range etc. is just very fulfilling to watch, and gets me really excited on diving into my current project's VFX side. :)
5:55 oh he fixed the light cool... wait this is real photo