@0:30 BT.601 was used for ye olde SD video (BT.709 for HD, BT.2020 for UHD); BT.2100 is the PQ (or HLG) transfer curve in the BT.2020 color gamut. @3:52 The TV does it with dedicated hardware like an ASIC or FPGA, while "the game engine" most likely has to do it on the CPU. If that assumption is correct, doing it on the TV is faster and less power/resource consuming. Also, the TV "knows better" how to do it; otherwise it would need to inform the "console" about its capabilities and the current ambient light so the console could apply the correct color mapping.
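To make the "transfer curve" part concrete, here is a rough Python sketch of the BT.2100 PQ EOTF (SMPTE ST 2084), which turns a 0-1 code value into absolute luminance in nits. The constants come from the standard, but the function name and the snippet itself are only an illustration, not production code.

```python
# Rough sketch of the BT.2100 PQ EOTF (SMPTE ST 2084):
# non-linear code value in [0, 1] -> absolute luminance in nits (cd/m^2).
m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(code: float) -> float:
    e = code ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(pq_eotf(1.0))   # 10000 nits, the ceiling of today's HDR standards
print(pq_eotf(0.5))   # ~92 nits, roughly SDR reference white
```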
This is great feedback! Thanks for explaining the different colorspace standards too! I still have much to learn apparently.
""the game engine" most likely has to do it on CPU"
Unless you misspelled GPU as CPU, definitely not. It happens on the GPU.
Also, from what I could gather about Dolby Vision so far (not much, because it's proprietary), I think it's basically "just" a suite of inverse tone mapping functions.
Also, while you can pretty easily leverage the existing HDR framebuffer to display HDR luminance levels (Uncharted 4 for PC), leveraging the HDR color gamut (P3 or BT.2020) is vastly more complex. Most games do it "simply" through reverse tone mapping (Gears 5) or by boosting saturation (Doom Eternal).
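To illustrate the two hacks mentioned above, here is a toy Python sketch of the general idea; this is not the actual Gears 5 or Doom Eternal code, and the function names are made up.

```python
# Toy illustrations only -- not any shipping game's code.

def inverse_reinhard(sdr: float) -> float:
    """Undo the simple Reinhard curve y = x / (1 + x), stretching SDR back toward
    an unbounded scene-linear value (the "reverse/inverse tone mapping" idea)."""
    sdr = min(sdr, 0.999)        # guard against division by zero at pure white
    return sdr / (1.0 - sdr)

def boost_saturation(r: float, g: float, b: float, amount: float = 1.3) -> tuple:
    """Push a linear RGB color away from its luma so it reaches further into the
    wide gamut (the "boost saturation" idea)."""
    luma = 0.2627 * r + 0.6780 * g + 0.0593 * b   # BT.2020 luma weights
    return tuple(luma + (c - luma) * amount for c in (r, g, b))
```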
Yeah, so you get HDR, but not WCG right?
Is that what the "remastering" passes do in Special K? I haven't experimented with them until recently, but don't have a good game to test with.
@@KevinGhadyaniTech I'm sorry, I have no knowledge of Special K. I was talking in general, from a game developer's perspective.
Yes, properly achieving wide color gamut requires much more work. Inverse tone mapping and the like are just hacks if you ask me.
Technically you'd even need to (re)author your textures at 10 bits per channel in P3 or BT.2020. I don't think anyone is currently doing textures in 10 bit though. Doing that means losing the alpha channel.
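For context on the alpha point: a common 32-bit "10 bit per channel" layout like R10G10B10A2 spends 30 of the 32 bits on color, so only 2 bits (4 levels) are left for alpha. A quick Python sketch of that packing (the helper name is made up):

```python
def pack_rgb10a2(r: int, g: int, b: int, a: int) -> int:
    """Pack 10-bit R/G/B and a 2-bit A into one 32-bit word (R10G10B10A2-style layout)."""
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024   # 10 bits per color channel
    assert 0 <= a < 4                                          # only 4 alpha levels remain
    return (a << 30) | (b << 20) | (g << 10) | r
```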
I hope I was able to answer your question. I wasn't very sure what exactly you were asking me.
HGIG can be more than just a recommendation. Some displays have an HGIG setting. It basically just disables tone mapping.
Correct, but that HGiG setting isn't meant to be named that way. It's meant to say "actually disable the tone mapper".
LG named it HGiG, but there's no standards body.
On the HGiG page, it says not to use that name for TV settings.
When I first set my TV to HGiG, I thought it did something. But it actually does nothing; in fact, it tells the TV to stop processing HDR :p. It's the opposite of what I expected HGiG to mean.
@@KevinGhadyaniTech Yeah, I read the recommendations. The current situation is ironic, really. A bit of their own fault from what I can tell though. IIRC they didn't specify what it SHOULD be called and they made it clear that they are not a governing body. Idk what they expected. lol
What about using ReShade to get access to the 10-bit or 16-bit? Is that a thing?
To do what? Can you elaborate?
Do you want to use ReShade to convert a game to HDR?
From what I could gather, this is currently not possible, because ReShade has to give the back buffer back to the application. To get HDR, you need to change the back buffer format.
@@DasAntiNaziBroetchen Thank you, that answers my question. Legendary.
@@TaticalRemedy This is my second try at posting this comment; YouTube seems to have nuked the previous one because of the links.
I spent some more time researching this topic, because I am really not an expert on ReShade, and didn't wanna just give you wrong information.
I don't see any reason why you couldn't just output an HLG-encoded HDR image into a game's existing 32-bit back buffer.
You could also add some dithering to make the banding less noticeable compared to 10-bit (some displays do that internally!).
There may be additional challenges if a game's back buffer is encoded in non-linear sRGB instead of linear, but I am not sure.
And then all you have to do is force your display into HLG mode (possible on LG TVs for example).
It would still be 8-bit HDR though.
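A minimal sketch of what that could look like, assuming the shader already has a scene-linear value normalized so 1.0 is the nominal peak. The HLG constants are from BT.2100, but the dithering and function names are just my own illustration, not an existing ReShade shader:

```python
import math
import random

# BT.2100 HLG OETF constants
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene-linear light in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

def to_8bit_dithered(signal: float) -> int:
    """Quantize to 8 bits with a little noise so banding is less visible than plain rounding."""
    return min(255, max(0, round(signal * 255 + random.uniform(-0.5, 0.5))))
```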
I also found this ReShade addon that may already do what you want:
[Search for "AutoHDR addon" forum post in the ReShade forum]
I also found this here for "Special K" (seems to be a tool similar to ReShade):
[Search for "Special K HDR Retrofit"]
Sorry for the multi-comments, but the idea that even a 10,000-nit display requires no tone mapping is really misguided. The sun itself is 1,600,000,000 nits, for example. 10,000 nits is just the average clear sky. A game engine's internal 16-bit float framebuffer can represent values much higher than 10,000 nits. So you do need to tone map, just not as aggressively. Artistic intent is another reason to tone map.
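For scale: a 16-bit float channel tops out around 65,504, and engines can represent far brighter values still via exposure scaling, so even a 10,000-nit panel needs at least a gentle rolloff. A toy Python sketch of that idea (my own illustration, not any engine's actual curve):

```python
def rolloff(nits: float, display_peak: float = 10000.0, knee: float = 0.8) -> float:
    """Leave values below the knee untouched, then softly compress everything above it
    so it still lands under the display's peak."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    overshoot = nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * overshoot / (overshoot + headroom)

print(rolloff(5_000))           # 5000.0 -- passed through untouched
print(rolloff(1_600_000_000))   # just under 10000 -- the sun, squeezed onto the panel
```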
Great point! 🔥
I was thinking in terms of the standards. Like how you're not tone mapping in SDR: the camera and game engine do it for you. 10K nits is the max for today's standards, so when you have a display that can do 10K nits, that won't require further tone mapping.
But like you said, games can get brighter than that!
@@KevinGhadyaniTech I guess you have a point. Looks like you are talking about display-side tone mapping. But there are some caveats I think:
- SDR does have something similar to tone mapping, and that is "limited" (16-235) vs "full" (0-255) range.
- If there is ever such a thing as a 20,000-nit display (or higher), you are implying that people will want to just run it at half the brightness, because the content only goes up to 10,000 nits. I would argue that at least some users will want to stretch that to 20,000 nits. I would refer to this as a form of tone mapping.
- Check "ITU-R BT.1886" on Wikipedia, for example (rough sketch below). Note how the EOTF allows for user control. This is the case with a few formats to my knowledge. Wouldn't you call this a form of tone mapping too?
You're spot on. Yes, you may still tone map if 20K sets come out.
And good point about Limited vs Full, but I consider that in the "wide color gamut" camp rather than tone mapping. Maybe I'm wrong.
@@KevinGhadyaniTech I think you are correct that limited RGB would reduce the range of available colors due to raised black levels and white crush (?), if displayed on a full-range display.
However, if we account for the fact that displays that expect a limited-range video signal (toggleable on some, as we know) will chop off the bottom (0-16) and top (235-255) parts of the signal and then stretch it out across the display's supported range, you should get the same overall color gamut, but with slightly fewer gradations. This gives you a reduced number of colors, but the extremes should look the same between limited and full range.
When we talk about gamut, we (from what I can tell) usually only talk about the extremes/chromaticity coordinates (I may be wrong), and leave the number of gradations up to the bit depth of the signal (plus limited vs full range, and chroma subsampling like 4:2:2 and the like).
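To put numbers on the "same extremes, fewer gradations" point: the usual expansion is full = (limited - 16) * 255 / 219, clipped to 0-255, so 16 and 235 land exactly on 0 and 255 while only 220 input codes have to cover the 256 output levels. A quick Python sketch (the function name is made up):

```python
def limited_to_full(v: int) -> int:
    """Expand an 8-bit limited-range value (16-235) to full range (0-255), clipping the ends."""
    stretched = round((v - 16) * 255 / (235 - 16))
    return min(255, max(0, stretched))

print(limited_to_full(16))    # 0   -> black stays black
print(limited_to_full(235))   # 255 -> white stays white
print(len({limited_to_full(v) for v in range(16, 236)}))  # 220 distinct levels, not 256
```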