Why No 12-Bit TVs & What's 12-Bit Color Anyway?

  • Published Jun 29, 2024
  • We explain 12-bit TVs, what makes 12-bit color so special, and why we don't have it yet if it's so awesome.
    LINKS
    Dolby White Paper
    bit.ly/2Wgdtbr
    Color Depth and Bits
    bit.ly/2YKqlrM
    bit.ly/35K5yWQ
    bit.ly/2WDuoU9
    bit.ly/3fB405Z
    Nanosys CEO Interview re 10,000 nits
    • Quantum Dots Beating M...
    MUSIC Selection in order of appearance: bit.ly/2Tdc4AF
    Clockwork Cherry by the Blue Bay
    Funky Beat by Stefano Mastronardi
    Eclipse by Out of Flux
    ===
    Products for purchase through Affiliate Links. You can support our channel by buying our recommended TVs and products via our Amazon affiliate links:
    TVs
    Hisense H8G amzn.to/3anI3nq
    TCL 6 Series 65 inch: amzn.to/2znXlM9
    Samsung Q90T: amzn.to/2W90dWq
    Samsung Q90R: amzn.to/2Y0qOnn
    Hisense H9F: amzn.to/2qYteXO
    LG C9: amzn.to/2Dln6Ls
    Sony A9G: amzn.to/34PsC5u
    BLU RAY PLAYERS
    PANASONIC UB820: amzn.to/2mM9kgA
    Sony X700: amzn.to/2m91IEk
    About Affiliate Links: #CommissionsEarned. Some of the links in this video description are affiliate links, which means that, at no extra charge to you, I will make a small commission if you click on them and make a qualifying purchase. Equipment and materials used for this review can be purchased on Amazon.
    ======
    TEST DISCS & Movies:
    Spears & Munsil Benchmark Blu Ray: amzn.to/2malKhL
    Murder on the Orient Express: amzn.to/2PFS5th
    Pan’s Labyrinth 4K: amzn.to/2VtikUv
    Maleficent 4K: amzn.to/2op4LsP
    BBC Planet Earth 2: amzn.to/2mcQoqX
    ======
    HDMI Cables:
    Certified 2.1 Cables amzn.to/2WpnAtq
    4K@60 18Gbps HDR Certified: amzn.to/2zF4Cnm
    LICENSED MUSIC: bit.ly/2Tdc4AF
    We use Artlist for a great selection of background music - affordable and royalty free, you can sign up here bit.ly/2Tdc4AF
    =================
    All TVs and equipment you see reviewed or used on this channel have been purchased without any accommodation or compensation from any manufacturer, dealer, or retailer. Exceptions to this policy will be clearly stated at the beginning of the video. We make every effort to back up our conclusions and reviews with demonstrations that clearly show the reasoning behind our opinions.
  • Science & Technology

COMMENTS • 406

  • @musiclistener211 4 years ago +83

    I wanted to clear up a few things from your video:
    The EOTF (electro-optical transfer function) of the TV maps the source's brightness range to what your TV can display, and since it is a 10-bit panel, there would still be 1024 gradations of each color on the screen. Since the TV can't get as bright as the source material in some cases, the gradations just end up more compressed (meaning more gradation information at all shades of color). If the TV is capable of displaying the brightness intended by the source material, say 4,000 nits, the EOTF would not have to re-map the pixel shades to lower brightnesses (the higher-brightness shades are re-mapped more severely than the lower-brightness ones), but the TV would still display 10 bits of color. If this is true, and I think I have a proper understanding of the topic, a 12-bit TV would have some benefit in reducing banding, even if the TV is not capable of, say, 4,000 nits. Banding is highly reduced by dithering, though: a 12-bit source signal played back on a 10-bit TV capable of reading the 12-bit source (Dolby Vision) and processing it with dithering before displaying the final 10-bit signal would produce a much cleaner image than a 10-bit source could, and one that is theoretically nearly indistinguishable (though still technically detectable) from a true 12-bit image.
    Secondly, in your blue-to-white example, the "blue" would look almost completely black at low brightness and should look very bright and blue, not white, at the highest brightness your TV can produce. If all three subpixels are driven as hard as they can be at the same time, that produces the highest luminance your TV is capable of (in an RGB TV); only by blending the three subpixels together do we see white. Current OLEDs have an additional white subpixel, which adds to the overall brightness of the TV but not to the maximum color intensity of a single pixel.
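The dithering idea above can be sketched in a few lines (a minimal numpy illustration of my own, not any TV's actual pipeline): a 12-bit code that falls between two 10-bit levels is lost to plain truncation, but survives as a spatial average under dithering.

```python
import numpy as np

rng = np.random.default_rng(0)

def truncate_to_10bit(code12):
    # Drop the two least-significant bits: four neighboring 12-bit codes
    # collapse onto one 10-bit code, which is what produces banding.
    return code12 >> 2

def dither_to_10bit(code12, rng):
    # Add up to one 10-bit step (four 12-bit codes) of noise before
    # truncating, so the *average* displayed level tracks the 12-bit value.
    noise = rng.integers(0, 4, size=np.shape(code12))
    return np.minimum((code12 + noise) >> 2, 1023)

# A 12-bit code that falls between two 10-bit levels (2048 and 2052)
block = np.full(10_000, 2050)

plain = truncate_to_10bit(block)
mixed = dither_to_10bit(block, rng)

# Truncation lands on code 512 every time; dithering mixes 512s and 513s
# so the average, expressed back in 12-bit units, sits near 2050.
print(plain.mean() * 4)   # 2048.0
print(mixed.mean() * 4)   # ~2050
```

In a real image the eye does the averaging across neighboring pixels, which is why a dithered 10-bit rendering of a 12-bit source shows far less banding than truncation.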

    • @stopthefomo 4 years ago +28

      AWESOME insight, thank you for educating us. I'm pinning this! Gotta love my viewers, please continue contributing 😊

    • @truthseeker6649 4 years ago +3

      Doesn't dithering add quantization noise? You'd then need to remove that with noise-shaping filters. It's already done in audio, with side effects, which is a controversial subject.

    • @ezilka 4 years ago

      musiclistener211 Came for a dithering comment and got it. Somehow I was waiting for FOMO to talk about it, because you don't need a 12-bit panel to show a 12-bit signal approximation using dithering.

    • @C--A 4 years ago +3

      @@ezilka True, though a 12-bit panel will produce better picture quality than a 10-bit panel, specifically for 4K Blu-ray discs mastered in Dolby Vision at 12 bits.
      It's similar to the current market, where a 10-bit TV consisting of 8-bit+FRC never has picture quality as good as a true 10-bit TV.

    • @optimalsettings 3 years ago

      So does that mean sending 12-bit from a PC is visually better than sending 10-bit?

  • @johncolbron3136 4 years ago +10

    Your videos are so good at cutting through the jargon and explaining everything more clearly than these TV manufacturers do. Keep up the good work!

  • @kev3226 4 years ago +1

    Thanks for explaining it. It really helps. Now I know I won't be missing out on 12-bit color.

  • @AndySomething 4 years ago

    Found your channel recently and I'm very impressed. This video was very informative, and I've come away learning something that I didn't know before 😅 All the best ❤️

  • @IkariWarrior1701 4 years ago

    Great clearly explained video! Really appreciate this kind of content!

  • @adammcpherson9536 4 years ago +1

    So well explained FOMO. Really enjoy your style of presenting information in a clear and entertaining way.

  • @chrisborland6787 4 years ago

    Love the detailed information. Thank you.

  • @SuperKrisNOR2 4 years ago +1

    Great content, keep up the good work! Love your channel!

  • @chriskaradimos9394 4 years ago

    Fantastic video, thank you; you cleared it up for me.

  • @Mr.Sjottdeig 4 years ago +8

    Great video, great explanation, and I totally agree we need 12-bit. Going from 6-bit to 10-bit is almost like going from 60 Hz to 144 Hz, but going from 10-bit to 12-bit is like going from 144 Hz to 240 Hz: almost not noticeable, but it's there, and that extra smoothness would add a lot overall, just as 12 bits would.

  • @cjnelson79 3 years ago

    I really enjoyed this. Thank you. Very enlightening. You do a great job of teaching these things.

  • @stopthefomo 4 years ago +14

    Does this put a damper on your dreams of buying a 12-bit TV on Black Friday, or are you still rockin' an 8-bit CRT?

    • @knekker1 4 years ago +1

      Apart from LG secretly downgrading the HDMI 2.1 bandwidth without telling anybody until being questioned about it, I think it would be insightful if you addressed why it is such a big deal that LG CX TVs got their bandwidth reduced from 12-bit to 10-bit. Judging by the specs, it really shouldn't matter, given that the 10-bit panel on the LG CX isn't able to saturate colors beyond 10-bit anyway. So why the big fuss? Perhaps keeping the 12-bit bandwidth would allow overclocking the TV's frame rate beyond 120 Hz without having to compromise on uncompressed 4:4:4 chroma subsampling?

    • @tehama4321 4 years ago

      @@knekker1 You can't overclock the refresh rate beyond 120 Hz.
      You're correct that it shouldn't make a difference, since a 10-bit panel is still a 10-bit panel regardless.
      In his previous video he claimed that the C9 is 'more future proof' since it has the full 48 Gbps and can accept 12-bit, compared to the CX.

    • @bravedwarf 4 years ago

      @@knekker1 For me, the C10 is for my Xbox Series X this fall.

    • @bravedwarf 4 years ago

      @@holgerbahr8961 Could you please elaborate on the missing DTS encoder?

    • @knekker1 4 years ago +4

      @@tehama4321 I guess the big question then is: how is the C9 more future proof with its full 48 Gbps bandwidth if the panel is still just a 10-bit panel?

  • @De03314 4 years ago

    This is one of your best videos! Thanks!

  • @robinberg6898 4 years ago

    Thank you for these videos. Perfection👌

  • @RushHourGameplay395 3 years ago

    Awesome information, couldn't be better, thanks.

  • @zitbug 4 years ago

    Very informative dude, thank you!

  • @timothyweakly2496 4 years ago

    Amazing explanation. Thanks!!!

  • @ssaannddrro 4 years ago

    Excellent explanation. Thank you.

  • @H0DLTHED0R 4 years ago +1

    Been wondering this myself, especially with 8K TVs now here. Thanks for covering this.

  • @mikef5659 2 years ago

    Are you a teacher/professor? If not you should be because that was probably the easiest to understand, well spoken video I've ever seen. Very well done!

  • @9yearoldepicgamersoldier129 4 years ago +1

    What an excellent video. Thank you for explaining this. I feel much wiser now. But we definitely won't get there with OLED, so I can't wait for MicroLED.

  • @muhammedyazar8840 4 years ago

    This was a very informative video, thanks.

  • @PaulShare1 4 years ago

    Good video, learnt a great deal here. Well done.

  • @MERCERENiTY 3 years ago

    Already liked and subscribed at the one-minute mark of the vid. The way you talk and what you say reeled me in that quickly.

  • @TheEgzi 4 years ago +2

    Ur TV vids are so good!

  • @timmturner 4 years ago +43

    I need 10K nits so I can tan at home, just put up an image of the sun and tan away.

    • @darkknightforU 4 years ago +5

      T Turner, Greta Thunberg is standing outside your door, knocking and screaming "How dare you" 😂😂😂

    • @CaptainScorpio24 4 years ago

      🤣

    • @kjellrni 4 years ago +5

      I won't be happy until my TV can generate X-rays.

    • @CaptainScorpio24 4 years ago

      @@kjellrni 👉😉

    • @CaveyMoth 3 years ago +1

      This would be great for watching the movie 'Sunshine.'

  • @honklertheconkler155 3 years ago

    Thanks for explaining, big help.

  • @wizzdom100 4 years ago +10

    Sony has had a concept 8K 85-inch 10,000-nit TV for a few years now; hopefully we get it in the next 7 years.

  • @HeloisGevit 4 years ago +2

    Thanks for this vid, very informative and shows that kicking up a fuss over the CX not having bandwidth for 12 bit is ridiculous.

  • @christophermartin1973 3 years ago

    I feel the need to necro this thread to point a couple of things out:
    1. Downsampling always produces better results than upsampling, so content should always be recorded at 12-bit or higher so it doesn't look terrible in the future. I want video that gets better over time, not worse.
    2. I want 12-bit color specifically because it requires a higher nit value, so the top-end TVs get much brighter and maybe even need some active cooling. This will make the trickle-down effect of brighter 10-bit displays happen much more quickly, because in order to meaningfully realize 12-bit capabilities on those big-box-store displays, manufacturers have to raise the brightness to differentiate them side by side.
    Sometimes we want "the thing" not because it's useful or practical, but because of the implication of that thing existing: it needs to stand in contrast with what we already have in order to sell.

  • @Koopinou. 4 years ago

    Excellent work !!! Kiss from 🇫🇷

  • @gamersplaygroundliquidm3th526

    Watching you right now in 12-bit on my 65-inch TV. It looks amazing since I found the settings I forgot about, and yes, it really is in 12-bit mode, and it is very noticeable compared to 8- and 10-bit. So much so that when my wife came home, she started to say hello, saw the TV picture paused, and said, "You bought a new damn TV? Again?" I really had to show her a few videos to show how and why it looks so much better.

  • @gaming-zombie1392 4 years ago

    Thanks for the info. I'm happy with 8-bit for now, and when we hit 12-bit it'll be amazing to see...

  • @itchyblanket5508 3 years ago

    Awesome video!! I feel like an expert now.

  • @madpistol 4 years ago +7

    As someone who just bought a 55" CX, this definitely makes me feel better about my purchase. The TV can't display 12-bit color. So? I had a hard time believing there is a much better TV than this currently on the market. The colors, contrast, motion handling, gaming features, etc. are all sooooooo good. The TV looks amazing!

    • @stopthefomo 4 years ago +2

      Exactly. This is why the C9 and A9G last year continue to be relevant options this year and for the foreseeable future!

    • @madpistol 4 years ago +1

      Yokai Ninja99 I was going to post somewhere that I’ve been incredibly impressed with the dark scene performance of the CX... that is to say I don’t see any black crush. Everything looks very defined. The CX is a visual treat.

  • @green_universe 1 year ago

    I know what color bit is, but your in-depth explanation was colorful, gotta tip it.

  • @lucee2261 4 years ago +19

    Excellent, informative, video - you should be a teacher.👍

  • @JackRABBITslim27 4 years ago

    I would just like to say thank you for all you do. You literally answer all the questions for TV buyers of all types in a very logical and intelligent way. In my quest to set purchase requirements to buy the best TV in the year 2023, it's hard to track what these manufacturers are doing. Bit depth has been a recent interest of mine, and you pretty much settled that debate. I currently own a 2019 Samsung Q70R paired with a Panasonic UB820 to watch my movies. Future purchase requirements are hard to make. I believe the global pandemic will drive higher-bandwidth streaming content, making 8K more viable. However, HDMI 2.1 and 4K@120 are still the biggest upgrades for my needs. Thanks again, and stay safe. EDIT: Also, entertainment and sporting events with no spectators, combined with increased at-home viewership, might force content providers to invest in higher-bandwidth content. Just my 2 cents.

  • @Ghostelmalo44 4 years ago +3

    Amazing job you're doing, my friend. Thank you so much!! I'm learning a lot!!! WAITING FOR THE RIGHT TV FOR THE NEW XSX CONSOLE!

    • @trumpameri1638 4 years ago +1

      Yes, we're all waiting for the next-gen consoles (PS5 or Xbox) and the best possible TV solution.

    • @stopthefomo 4 years ago +3

      If you are OK with OLED, it really doesn't get better than either the C9 or the CX (if you need 48") for console TV gaming: it preserves all the color while having the best latency of any TV. Even when Vizio brings out its OLED, I can't imagine it will be priced lower than the C9. As much as I'm excited about what Samsung is doing with 8K QLED this year, I'm disappointed that all QLED TVs lower the picture quality and FALD control in order to have better latency. That means going from movie watching to game playing will degrade my color/black levels. The LG C9/CX maintains all picture quality regardless of whether you're in game mode or not.

    • @Ghostelmalo44 4 years ago

      @@stopthefomo What about burn-in problems with OLED?... Isn't it more convenient to get a MicroLED from LG? I heard those have amazing quality, better than QLED (I heard).

    • @AngelicRequiemX 4 years ago

      @@Ghostelmalo44 The real question is: can you actually afford a MicroLED TV when it first releases? Probably not.
      In that case, buy an OLED instead. Burn-in is basically a non-issue with the latest generations of OLEDs unless you have 1) a defective panel, or 2) purposely abuse the TV like Rtings did for their one-year burn-in stress test.

    • @Ghostelmalo44 4 years ago

      @@AngelicRequiemX MicroLED has been out for a few years now. Don't know what you're talking about, m8. And they're cheaper than OLED.

  • @johnnyhustle6976 2 years ago

    When people were limited to 480i standard definition (SD) TV, they probably didn't think they needed high definition (HD) television either. But when they saw how crisp, clear & vibrant the HD picture quality looked, they knew they needed it. The same will be with 12-Bit color. HDR on those cheap 8-Bit TVs, like the older TCL 4 Series TVs of 3 or 4 years ago looked decent. However, when we saw what real HDR10 & Dolby Vision looked like on a 4K TV with a true 10-Bit color panel, those of us with sense knew how much of a game changer it was. And the same will happen when true 12-Bit color panels are available for retail purchasing.

  • @lethalno1 4 years ago

    Awesome video 🤘🏾

  • @dnwfc4400 4 years ago

    awesome video man!

  • @trendkillwill9806 4 years ago

    I really enjoy your videos, very informative. I'm a big nerd 🤓 so I like learning things like the nits needed to make HDR10 work and all the technical specs explained. I'm having a hard time trying to figure out which TV I want for next gen. It seems newer TVs are focusing on 8K instead of perfecting things like HDMI 2.1, HDR, VRR, and supporting next gen. Thanks.

  • @abap-gaming 4 years ago +4

    From what I understand, HDMI 2.0 only has enough bandwidth for HDR in 4:2:0 chroma; as long as the CX can do HDR with 4:4:4 chroma, I'm good.
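The arithmetic behind that is easy to sketch (my own back-of-the-envelope numbers: active pixels only, no blanking, and HDMI 2.0's roughly 14.4 Gbps of effective payload after 8b/10b coding out of the 18 Gbps line rate):

```python
# Approximate video payload for 3840x2160 at 60 fps on HDMI 2.0.
# Samples per pixel depend on chroma subsampling:
#   4:4:4 -> 3 samples/pixel, 4:2:2 -> 2, 4:2:0 -> 1.5
SAMPLES = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
HDMI20_PAYLOAD = 18e9 * 8 / 10  # ~14.4 Gbps after 8b/10b encoding

def data_rate(bpc, chroma, w=3840, h=2160, fps=60):
    # Bits per second = pixels/s * bits per component * samples per pixel
    return w * h * fps * bpc * SAMPLES[chroma]

for bpc, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:0")]:
    rate = data_rate(bpc, chroma)
    fits = "fits" if rate <= HDMI20_PAYLOAD else "does NOT fit"
    print(f"{bpc}-bit {chroma}: {rate/1e9:.1f} Gbps -> {fits}")
```

With real HDMI framing the margins shift a little, but the ordering holds: 4K60 at 10-bit 4:4:4 needs HDMI 2.1-class bandwidth, while 4:2:0 squeezes into 2.0.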

  • @Danieltredway1431 3 years ago

    Thanks so much!

  • @chrisbullock6477 4 years ago +8

    That was SHARP by the way with the Yellow.

    • @stopthefomo 4 years ago

      YES, I didn't want to say anything as I felt bad for kicking a dog when it's down. Sharp has really gone south since those heady days when they were literally at the cutting edge of TV technology - that yellow pixel caused quite a stir, but made everything feel like the "Magic Hour"

    • @thomasvinelli 4 years ago

      @@stopthefomo I thought Hisense bought sharp.

    • @stopthefomo 4 years ago +3

      @@thomasvinelli Definitely not! Foxconn bought Sharp, and Sharp continues to develop and produce panels in Japan. Hisense only bought the Sharp TV factory in Mexico and the exclusive naming rights to sell "Sharp"-branded TVs in the U.S. So all Sharp-branded TVs in the U.S. are actually rebranded Hisense, but outside of the U.S. Sharp continues to be independent of Hisense. Sharp was hoping to re-enter the U.S. and take back its naming rights from Hisense, but changed its mind after CES 2020 when retailers like Best Buy said, "No, nobody will buy your TVs no matter how good they are."

    • @rolandm9750 4 years ago

      @@stopthefomo I wonder how true it is that "nobody will buy the TVs," given that Hisense and TCL TVs were fairly unknown at one time but sell fairly well today, all based on positive reviews from people like yourself and various other personalities/publications. Hisense relegated the "Sharp" name to the low end in Canada/US, but still, if Sharp "came back" and the units got some stellar reviews, I think people would start buying them just the same. The only thing is, I think they'd have to buy out Hisense's remaining time (however long they contracted the rights for), which is probably not worth it. IIRC they tried to sue Hisense, but that didn't work, so they probably just said "forget it" for now.
      That said, Quattron was interesting tech, but when it was out, it wasn't as if those TVs were considered amazing just because of the yellow. Similarly with Sony's original Triluminos, which was an RGB LED backlight: the XBR8 was an amazing TV at the time, but in retrospect the RGB backlight doesn't seem to have been necessary, given that Sony dropped it and never looked back. These enhancements seem to come and go without any real loss. Just as you don't really *need* a quantum dot filter to make a great LCD TV either.

    • @NUCLEARARMAMENT 3 years ago

      @@rolandm9750 I have a monitor with an RGB-LED backlight, the original HP DreamColor LP2480zx with 10-bit IPS, A-TW polarizer, and 97% DCI-P3 color gamut coverage.

  • @BigChuck777 3 years ago

    Thanks professor!!!

  • @batslog 4 years ago

    Excellent video.

  • @pheotonia 4 years ago +8

    10K nits would probably not be any hotter than my LG Plasma from 2010!

    • @stopthefomo 4 years ago +3

      LOL I have literally felt my bedroom get hotter when watching on my 50" Kuro - had to keep the windows open!

    • @mastersmurfify 4 years ago +2

      @@stopthefomo Still running my Kuro now (I game on an LCD I plan to replace with a CX). I am not even going to worry about this, because my Kuro still looks great for streaming Roku TV.

  • @steadychaosproductions3376 4 years ago

    awesome video!

  • @JuanGarcia-lh1gv 4 years ago

    Thanks for the information! I have an 8-bit display with only 180 nits of peak brightness. But to my surprise, after calibration, it covers 100% sRGB and 100% P3! I calibrated it with my colormunki. 8-bit content looks great, but the contrast is only 3000:1, so it doesn't hold a candle to OLED or QLED.

  • @LeMatt87n 4 years ago +46

    Next episode he’s not even going to be in the frame.

    • @stopthefomo 4 years ago +9

      ROFL was thinking the same thing while editing - "what the hell??" In my defense this video was recorded at 1AM after a long day at work, and I was just happy I got the information correct - after re-shooting 5 times!

    • @sukithreddy 3 years ago

      😂 😂 😂

  • @stipeslol 4 years ago

    Great video

  • @dcaseng 4 years ago

    This video shows a lot of growth.
    You have a great ability to break things down to simple terms.

    • @stopthefomo 4 years ago

      I try, thank you for hanging on!

  • @tomchan2559 4 years ago +1

    So 12-bit TVs are what the manufacturers should be focusing on in the near future. In terms of resolution, 4K is good enough for most households.

  • @coisasnatv 1 year ago +1

    TVs are limited to 10 bits?
    My 2012 Sony Bravia KDL-47W805A (KDL-47W802A in the US) has a 12-bit display; it works with both 4:4:4 RGB and YCbCr. This is the one I use to edit videos and photos, and it can display 12-bit color just fine. The sad thing is that OLED and other technologies still use 10-bit displays.

    • @C--A 1 year ago

      No, your 2012 Sony Bravia doesn't, lol. No current consumer TV or professional monitor has native 12-bit depth!
      That Sony Bravia you own from over a decade ago was, at the time, a second- or third-tier cheap IPS LCD with poor black levels compared to VA LCDs, let alone OLEDs with true blacks!

    • @coisasnatv 1 year ago +1

      @@C--A I've worked in the industry since the late '80s, and yes, this 2012 Sony Bravia monitor has a native 12-bit display. Only a few of us had that information at the time, which is why I bought it and the reason some studios still use it.
      It really doesn't matter to me what you think about it. This just proves that people who are supposed to know these things actually don't know anything at all.

  • @suliman9058 3 years ago

    I miss college, thanks for the lecture

  • @majorastorm 3 years ago

    I was just looking this up for the Series X.

  • @rockofloveusa 4 years ago

    The best video I've seen from him, compared to the HDMI one.

  • @1VideoGameDude 3 years ago

    Thank you for explaining this. I noticed this was filmed 9 months ago; I'm wondering how true it still is today. Is 12-bit still irrelevant as long as our TVs don't reach 10,000 nits? I'm on the fence about splurging on the LG CX because of the 10-bit issue and wanted your input. Thank you again!

  • @camryhsalem5139 3 years ago

    Thanks

  • @id104335409 4 years ago +5

    Close your right eye to watch this video comfortably.

    • @stopthefomo 4 years ago +1

      I'll drift back into the frame next time :)

  • @bobmorane7329 4 years ago

    Nice one.

  • @SwagKingColeLeprecun 3 years ago

    “The pixel is lit” - i felt that

  • @chungexcy 4 years ago +14

    12-bit vs 10-bit is all about more shades and better gradation. They have the exact same range defined in HDR; both can go up to 10,000 nits.
    12-bit color can provide smoother gradation and keep the delta-E between two neighboring colors below the human perceptual threshold. 10-bit is pretty good; 12-bit is perfection.
    If you don't explain the Perceptual Quantization (PQ) curve defined in the HDR standard, your audience will never get this point.
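For reference, the PQ curve is standardized as SMPTE ST 2084; a small sketch of its EOTF shows how code values map to nits and why two extra bits buy roughly 4x finer luminance steps over the same 0-10,000-nit range:

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to display luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

# Luminance step between adjacent code values near the middle of the range:
step10 = pq_eotf(513 / 1023) - pq_eotf(512 / 1023)
step12 = pq_eotf(2050 / 4095) - pq_eotf(2049 / 4095)
print(f"10-bit step: {step10:.3f} nits, 12-bit step: {step12:.3f} nits")
```

Both bit depths span the identical PQ range (code 0 is 0 nits, code max is 10,000 nits); 12-bit just slices the same curve into four times as many steps, which is what keeps adjacent shades below the visibility threshold.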

    • @noname1st139 2 years ago

      I've got a 12-bit option on my Fire TV Stick; would you leave that on instead of 10-bit? I've got a 50-inch Samsung QN90A. Thanks in advance 👍

  • @aboodinatorgaming1929 3 years ago

    I was searching for this after I heard that LG nerfed the C series from 48 Gbps in the C9 to 40 Gbps in the CX and was wondering if that even mattered. This cleared up the issue for me. Thanks.

  • @PaceyPimp 3 years ago +1

    So should I never choose the 12-bit option on media devices, or will 12-bit interpolate or downscale better than 10-bit color?

  • @IosifViorelMila 4 years ago

    Hi, can you please make a video about the 2020 Samsung "The Frame" QLED vs. the standard QLED range? For example, how does The Frame 55LS03T compare with the Q60T/Q70T/Q80T? Looking forward to your opinion on this. Thanks!

  • @xephyrxero 9 months ago +1

    So now that Hisense's UXN is going to have 6,000 nits of brightness, can we resume our desire for 12-bit color?
    Although it's still not fully there until all these TVs start supporting 100% of the Rec. 2020 color space. I want that even more.

  • @pushpindersingh8101 4 years ago

    Good video.👍

  • @DCMCOSPLAYdotCOM 4 years ago +1

    While nits may be one way of looking at it, take a deeper dive into color banding. This is where more color is needed, even in the 300-nit range. For example, take a color gradient; it doesn't matter between which two primaries, say red to blue. Even at 10-bit, that's 1024 shades you can show between them. 4K has nearly four times that number of pixels across the screen, and if each one cannot be assigned a different color, banding can occur. The problem gets even worse when you zoom into that gradient: you still only have 1024 shades but are trying to fill more pixels with them, so more banding occurs. 12-bit is barely enough to satisfy the needs of 4K displays. This is especially an issue with gaming. Movies, BT.2020, compression: that's a whole other can of worms when it comes to color. Gaming can produce a wider range of colors, and gradients are used in virtually everything. Even worse, look into chroma subsampling, which can make text unreadable when presented at lower bit depths.
    We need 12-bit / 4:4:4 color for more reasons than just the Dolby Vision specification. The lack of it is a total dealbreaker for HTPC enthusiasts.
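The pixels-versus-shades argument above can be checked numerically (my own quick sketch): a full-width ramp on a UHD panel has more columns than a 10-bit signal has codes, so codes must repeat and form bands.

```python
import numpy as np

WIDTH = 3840  # pixel columns across a 4K/UHD screen

def bands(bits, width=WIDTH):
    # Quantize a full-width gradient to the given bit depth; each
    # distinct code value shows up on screen as one vertical band.
    ramp = np.linspace(0.0, 1.0, width)
    codes = np.round(ramp * (2**bits - 1)).astype(int)
    levels = len(np.unique(codes))
    return levels, width / levels  # shades used, pixels per band

for bits in (8, 10, 12):
    levels, px = bands(bits)
    print(f"{bits}-bit: {levels} shades, {px:.1f} px per band")
```

At 8 bits each band is about 15 pixels wide, at 10 bits about 4, and at 12 bits every column can take its own shade, so the ramp is effectively band-free.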

  • @lorenzofernandez8572 10 months ago

    I have a Samsung 55-inch 4K HDR TV, and I get 12-bit at 24 Hz. If I raise it to 60 Hz, it drops to 8-bit. For regular browsing that's good enough, but when I watch movies, and I watch a lot of movies, I throttle down to 24 Hz 12-bit on my Nvidia RTX 2070. It looks fantastic, especially in "Filmmaker" mode.
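That 24 Hz/60 Hz behavior matches HDMI 2.0's TMDS ceiling; here is a rough sketch using standard CTA-861 total timings, blanking included (the timing numbers are my own reading of the standard, not Samsung's spec sheet):

```python
HDMI20_LIMIT = 18e9  # aggregate TMDS bit-rate ceiling for HDMI 2.0

def tmds_rate(h_total, v_total, fps, bpc):
    # TMDS sends 10 bits per 8 video bits on each of three data channels;
    # deep color (10/12 bpc) scales the character clock by bpc/8.
    pixel_clock = h_total * v_total * fps
    return pixel_clock * (bpc / 8) * 3 * 10

# CTA-861 total timings for 3840x2160 active video:
# 4400x2250 at 60 Hz, 5500x2250 at 24 Hz
cases = [("4K60  8-bit", 4400, 2250, 60, 8),
         ("4K60 12-bit", 4400, 2250, 60, 12),
         ("4K24 12-bit", 5500, 2250, 24, 12)]

for name, ht, vt, fps, bpc in cases:
    rate = tmds_rate(ht, vt, fps, bpc)
    verdict = "OK on HDMI 2.0" if rate <= HDMI20_LIMIT else "exceeds HDMI 2.0"
    print(f"{name}: {rate / 1e9:.2f} Gbps -> {verdict}")
```

4K60 at 8-bit RGB just fits under 18 Gbps, 12-bit at 60 Hz blows past it, and dropping to 24 Hz brings 12-bit back within the limit, which is exactly the fallback the GPU driver negotiates.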

  • @carlcat 4 years ago

    Oooooooooooooo, props, I love props!

  • @dantefekete7617 2 years ago

    Thank you very much for this video. I do, however, have one question. I am a hobbyist filmmaker, and my friends and I get together to watch each other's films and stories. As I film in 12-bit 1080p, would a 4K TV downsample that information, or will I have issues viewing that footage appropriately? We are all coming from using actual film and editing for projection, so digital cinema is somewhat new to all of us.

  • @killerkevin27 4 years ago +1

    So I've been watching you and Vincent from HDTVTest for a year now, and I swear I just now realized that you two aren't the same person. Mind blown, but at the same time I feel like an idiot, lol. In other news, Best Buy just delivered my LG CX today. Just waiting for my stand to come tomorrow.

  • @tyroneslothdrop9155 4 years ago

    Great video; I did not know that color gamut was determined by luminance. This makes the LG controversy look even more ridiculous. I don't want to pay a premium for useless features, but many people seem to feel differently.
    I am, however, less excited to upgrade my Panasonic plasma if color banding is still an issue with newer televisions.

  • @robertobuatti7226 4 years ago

    I'm a person who gets easily confused, but you explained the color system and peak brightness very well, and it was easy to follow. I just purchased a 55-inch Samsung Q80T because I couldn't stand the blooming on the X950G, but I'm a little disappointed with the peak brightness on the Q80T: it's only 700 nits, and having spent $2100 I should at least be getting 1000 nits.

    • @jkairi4 4 years ago

      700 nits is not bright enough for you? There are compromises with all of these displays.

    • @robertobuatti7226 4 years ago

      @@jkairi4 For HDR it's not, because HDR needs 1,000 nits or more. But for SDR it's fine.

  • @zeonabdul2001 4 years ago +5

    So I may be watching a 9-bit TV right now. How nice; let's just put the cart before the horse and move on to 16K.

    • @Toliman. 4 years ago +3

      @@holgerbahr8961 Pfft, worst year... as if. In 2013 and 2014, they made HDMI 2.0 4K TVs with "HDR support"... knowing they couldn't get 4K HDR over HDMI 2.0 cables, it was limited to 1080p HDR.
      The UHD group also didn't ratify HDMI 2.0b until after they had started making inroads on HDMI 2.0, leaving two years of premium-priced 4K TVs that you could not play UHD discs on.

  • @ledooni
    @ledooni 4 years ago +1

    8K resolution in 12-bit at up to 10,000 nits will be great... in maybe 5-10 years. For the next few years, 10-bit on an OLED will still be the best experience in a dark room, or on a QLED in a bright room. The only thing I'm interested in over the next few years is whether Samsung can actually improve on OLED technology with their QD-OLED, with maybe slightly more brightness and color volume.

  • @goatcity2266
    @goatcity2266 4 years ago +1

    Good video, everything makes sense now

  • @realistvision2547
    @realistvision2547 4 years ago +1

    Great video, buddy - very interesting and informative. How far away do you think 12-bit TVs are? 5 years?

    • @stopthefomo
      @stopthefomo  4 years ago +2

      Possibly 3 years, or not until TVs can honestly get past 4,000 nits. In my interviews with the leading developers of TV technology (my videos with the CEOs of Nanosys and Rohinni), my takeaway is that we are already at 5,000-nit backlighting technology, so it's just a matter of combining this with the next-generation quantum dot color filter - it's this latter development that we are waiting on (Samsung is working very hard on this QD color filter tech, $11 billion and all that). Nanosys confirmed that we should be seeing working prototypes of TVs using QD color conversion filters later this year. Assuming it takes another 24 months of fine-tuning the color processor to integrate 4,000 nits of mini-LED and QDCC into a Dolby Vision qualified 12-bit display, we may have 12-bit "capable" TVs in 3 years.
      Remember, this first generation of 12-bit TVs will look no better than the best 10-bit TVs at the same brightness, because the 12-bit TV would have to tone map all the colors lost from the 10,000-nit content, which kind of defeats the purpose of 12-bit. It's like LG giving us the HDMI 2.1-capable C9 last year when the TV itself had no hope of ever processing content requiring 48Gbps of bandwidth. First-generation 12-bit TVs will not do 12-bit content justice, just like first-generation 10-bit TVs were not good enough to deliver HDR performance without further R&D.
      This year, with the Samsung Q900TS/950TS, we may be even closer to what 10-bit quality offers, because it can hit 3,000 nits while being noticeably blacker than before (but obviously still short of OLED). So in 3 years, when 10-bit TVs finally hit peak performance, 12-bit TVs must do a lot better to justify the 150% price premium (yes, more than double the price at launch).
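The per-channel arithmetic behind the bit depths discussed in this reply can be sketched in a few lines of Python. This is a simplified illustration only: the peak-nit pairings (100/1,000/10,000) are conventional reference points, and the "nits per step" figure assumes a naive linear code-to-luminance mapping, whereas real HDR signals use the nonlinear SMPTE ST 2084 (PQ) curve, which allocates steps far more efficiently.

```python
# Per-channel levels and total colors at each bit depth, plus a naive
# linear step size. Peak-nit pairings are illustrative, not normative.

def levels(bits):
    """Distinct code values per color channel."""
    return 2 ** bits

for bits, peak_nits in [(8, 100), (10, 1_000), (12, 10_000)]:
    per_channel = levels(bits)
    total = per_channel ** 3              # R x G x B combinations
    step = peak_nits / (per_channel - 1)  # nits per step IF mapping were linear
    print(f"{bits}-bit: {per_channel:>5} levels/channel, "
          f"{total:>14,} colors, ~{step:.2f} nits/step (linear)")
```

The point of the sketch: going from 10-bit to 12-bit quadruples the levels per channel (1,024 to 4,096), which is what keeps the step size manageable when the luminance range stretches toward 10,000 nits.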

    • @realistvision2547
      @realistvision2547 4 years ago +1

      @@stopthefomo Thank you for the in-depth response; I really appreciate it. I agree with your detailed analysis. There are a lot of exciting and interesting technologies on the horizon, and they are going to have a significant impact on picture quality. However, until we get 12-bit self-emissive displays, people are best off getting a high-quality 4K TV now and then waiting until TV technology develops. This information does confirm that OLED is getting closer to its sell-by date and that this technology is not the future, whereas LED TVs will comfortably be able to produce 12-bit panels. I find this very interesting, as Samsung has stopped LED production, and LG - well, we know what little effort they have put into their LED TVs.

  • @lateralus46n2777
    @lateralus46n2777 4 years ago

    Look out!! He's whipping out props....

    • @stopthefomo
      @stopthefomo  4 years ago

      You mean the prop that disappeared out of the left side of the frame? LOL

  • @TheUAProdigy
    @TheUAProdigy 4 years ago

    Unrelated: if possible, can you do a video showing how to compensate for a TV's weaknesses? Ex: fixing black crush on the TCL 8 Series.

  • @AzzaBro59
    @AzzaBro59 4 years ago +1

    I feel the simple way of explaining it is that there are no 12-bit panel TVs, so the 10-bit panel only needs a 10-bit signal altogether.
    I feel smarter. But a question: how do they get a low-luminance white? But then I guess OLED has the white pixel.

  • @stanlee5465
    @stanlee5465 4 years ago

    I'm planning to harness the SUN with mirrors to power my 14 bit display panel!!!

  • @TheCrucialQ
    @TheCrucialQ 4 years ago

    I must add that if you are truly interested in seeing some of the best HDR, Dolby Vision, and HDR10+, you should consider purchasing Spears & Munsil's UHD HDR Benchmark Blu-ray from Amazon.
    There is a montage at 10,000 nits in BT.2020 (HDR10, Dolby Vision FEL/MEL, and HDR10+ versions). This montage is stunning, and it really tests your display's tone mapping capabilities.
    Tests are included to help you get to the proper settings for your display and UHD Blu-ray player.

  • @ReganMarcelis
    @ReganMarcelis 2 years ago +1

    Hey brother, I'm still on a Panasonic GT50 plasma, which is how I found you originally, and I started checking into the different techs. Windows says this Panny plasma is now running at 12-bit depth, but my Radeon Pro WX 8200 says 10-bit; is either of those even possible? Either way, it definitely looks more amazing than the panel used to, and it feels like it looks the best it ever could - that's not just in my head.
    I also had the screen get a tad messed up: "somebody" cleaned it and must have caused a tiny scratch that went through the black pro coating, which I did NOT know was an actual coating. It wiped off, so it looks like a blotch on the screen 24/7 when it's off; when it's on, it's barely visible except maybe at a very unusual angle, and even then only when a specific color is in that area, which rarely happens. Just a shame, but what's done is done, so it's time to upgrade.
    I'm finally ready to get rid of the GT50, mostly because the power it draws is getting absurd nowadays, even though she's still been good to me, even for gaming. I will always miss plasma for certain reasons, especially the GT and ZT (even VT) series; I think you have or had a ZT-65 or ZT-60? I always wondered who would win between the VT-65/VT60 and the best Pioneer Kuro plasma, or the last plasma Samsung ever released (forget the model number).
    I'd like an ultrawide panel in 38 to 42 inches with the brightest nits and the highest bit depth (12 to 16 bits, the higher the better), quality build all the way down to the stand, and a refresh rate of 90Hz or above would be awesome as well. Do they even make monitors like that? I'm dying for this; I love the rich colors. Suggestions?

  • @matwtf
    @matwtf 4 years ago

    One thing no one is talking about is color resolution.
    On a 1080p TV, the black-and-white component of the image is 1080p, but the colour is often a quarter of that. Graphics are full colour resolution: 4:4:4. Video, however, is usually 4:2:0 (full b+w, quarter-resolution colour spread out). You don't notice, mind, because your eyes also see black and white at a higher resolution than colour.
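To put rough numbers on the subsampling described in this comment, here is a minimal sketch of uncompressed frame sizes. The 1080p figures are simple arithmetic under the stated assumptions (raw 8-bit samples, no compression), not measurements of any real device.

```python
# Back-of-the-envelope frame sizes for 4:4:4 vs 4:2:0 chroma subsampling,
# assuming raw, uncompressed samples.

def frame_bytes(width, height, bits_per_sample, chroma="4:4:4"):
    luma = width * height                  # one Y sample per pixel
    if chroma == "4:4:4":
        chroma_samples = 2 * width * height               # full-res Cb and Cr
    elif chroma == "4:2:0":
        chroma_samples = 2 * (width // 2) * (height // 2) # quarter-res Cb, Cr
    else:
        raise ValueError(chroma)
    return (luma + chroma_samples) * bits_per_sample // 8

full = frame_bytes(1920, 1080, 8, "4:4:4")  # 6,220,800 bytes (~6.2 MB)
sub  = frame_bytes(1920, 1080, 8, "4:2:0")  # 3,110,400 bytes (~3.1 MB)
print(full, sub, full / sub)                # 4:2:0 halves the raw data
```

So quarter-resolution chroma cuts a raw frame to exactly half the size of full 4:4:4, which is why broadcast and disc video use it.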

  • @BryantAvant
    @BryantAvant 4 years ago

    So, I was at CES 2018 and Sony had an 8K 10,000-nit TV. It was piercingly bright but didn't give off heat.

  • @fabiozagoo
    @fabiozagoo 3 years ago

    So why does the Oppo 203 upscale to 12 bits if the TV can't handle it? Is it to buffer, or to reduce artifacts?

  • @nedywest71
    @nedywest71 4 years ago

    Are you planning to review the TCL C81?

  • @doyouthinkitsdead
    @doyouthinkitsdead 4 years ago

    Dude, what's a FOMO? Also, great channel. I've learned a few things here recently whilst looking at what a TV needs for either the PS5 or Series X consoles.

    • @eclectice
      @eclectice 3 years ago +1

      Fear Of Missing Out

  • @TheOneAndOnlyBruceDickinson
    @TheOneAndOnlyBruceDickinson 4 years ago

    I just bought a 65" Q7 last week from Costco for $1,000, only to discover it doesn't support Dolby Vision. Being constrained by a lower budget, I am unable to gratify my home theater wishes with the best TVs. I am happy with the Q70, but am wondering if I could get a better TV in my price range with DV, and if that DV would give me a better picture. Should I return it and swap it? Any recommendations if so? Last question: I notice the movies on Disney Plus and Netflix that say Dolby Vision on other devices just say HDR on my Samsung. If I don't have DV, how am I able to see the HDR that I am seeing?

  • @Keabrown79
    @Keabrown79 4 years ago

    So how does 10-bit (8-bit + FRC) figure into all of this as far as color depth?

  • @malash3972
    @malash3972 4 years ago

    For my Xbox One X, should I use 10-bit or 8-bit? Also, would you recommend HDR on or off?

  • @dneary
    @dneary 1 year ago

    If you're talking about luminance in nits, does that mean the native colorspace of a TV/screen is La*b* or YCrCb? I thought that traditionally TVs were mostly RGB with different color LEDs, each of which had ~3-bit settings, but if you have YCrCb, wouldn't you still want 6-8 bits of Cr/Cb subsampling to differentiate colors? Luminance is only one axis of the color, so would 12 bits equal 6 bits luma with 4:2:0 subsampling, or 4:4:4 RGB - is that right?

    • @xephyrxero
      @xephyrxero 9 months ago

      The bits he's talking about in the video are per channel, not the entire payload. And 4:2:0 refers to how the chroma samples are laid out, not to bits: in a 2x2 block, each of the 4 pixels gets its own luma value, but all 4 share a single Cb and a single Cr sample. So with 8-bit YUV we're talking 48 bits for the whole cluster of 4 pixels (4x8 for luma, plus 8 each for the shared Cb and Cr), or 12 bits per pixel thanks to this subsampling (as opposed to 24 for RGB). At 10-bit 4:2:0 we'd have 15 bpp, and for 12-bit color it would come out to 18 bpp.
      So even in YUV 4:2:0, each channel still gets 10 or 12 bits, but we've cut the bandwidth required in half. Although a hybrid with 10-bit chroma and 12-bit luma would be a neat idea (17 bpp).
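The 2x2-block arithmetic in this reply can be checked in a couple of lines. This is a sketch: "bpp" here means average bits per pixel of raw, uncompressed YUV 4:2:0, before any codec is applied.

```python
# Average bits per pixel for raw YUV 4:2:0: a 2x2 block carries 4 luma
# samples plus one shared Cb and one shared Cr sample (6 samples / 4 pixels).

def bpp_420(bits_per_sample):
    samples_per_block = 4 + 1 + 1   # 4x Y, 1x Cb, 1x Cr
    return samples_per_block * bits_per_sample / 4

for bits in (8, 10, 12):
    print(f"{bits}-bit 4:2:0: {bpp_420(bits):.0f} bpp "
          f"(4:4:4 would be {3 * bits} bpp)")

# The hybrid mentioned above: 12-bit luma with 10-bit chroma
hybrid_bpp = (4 * 12 + 10 + 10) / 4   # = 17 bpp
```

This reproduces the commenter's figures: 12, 15, and 18 bpp for 8-, 10-, and 12-bit 4:2:0, and 17 bpp for the hypothetical 12-bit-luma/10-bit-chroma hybrid.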

  • @catalin.siminiuc
    @catalin.siminiuc 3 years ago

    1. If 8-bit has 256 shades, how is that represented at 100 nits? I thought 1 nit represents 1 step, and therefore there would be only 100 shades. Are there 0.5-nit steps, or how does it work? How many shades can 100 nits display, and how can we calculate the number of shades in relation to nits?
    2. Also, why do some manufacturers make 400+ nit panels with true 10-bit and others with 8-bit+FRC? Why use FRC if true 10-bit is achievable? Is something different in the pixel layout/panel treatment? How are colors added without FRC?
    3. What is the relation, or the difference, between an 8-bit panel which displays 16 million colors and the 16-bit color mode I used to select in Windows to display 65,536 colors?
    Off-topic: how is bit/color depth represented in printing, since there is no luminance? What is the maximum which can be printed?

  • @displaytalk
    @displaytalk 4 years ago

    My KRP-500M has 12-bit color. 2009 release date.

  • @biueprint
    @biueprint 4 years ago

    Hey man, I wanted to ask and see if maybe you knew about any new 4K 144Hz monitors coming out this year. I was going to buy the ROG Swift 27-inch 4K monitor, but I feel like that panel has been out a while and maybe something newer is on the horizon. Can you let me know what you think?
    Monitor = ASUS ROG Swift PG27UQ 27" Gaming Monitor 4K UHD 144Hz DP HDMI G-SYNC HDR Aura Sync with Eye Care

  • @mjjjjslwbzxxfkkek
    @mjjjjslwbzxxfkkek 4 years ago

    I absolutely love the way you indirectly call people dumb because they're too scared to form their own opinion on things and trust their own expectations, and instead base their opinions entirely on what YouTubers tell them to believe, relying on those expectations instead of their own. It's a crazy "minority" group... So before I get burned alive, I have to say it's a beautiful thing as well, and that I believe every word you're saying. Love your work ☀️