Computer Color is Broken
- Published 19 Mar 2015
- Thanks to www.audible.com/minutephysics for supporting this video!
MinutePhysics is on Google+ - bit.ly/qzEwc6
And facebook - / minutephysics
And twitter - @minutephysics
Minute Physics provides an energetic and entertaining view of old and new problems in physics -- all in a minute!
Music by Nathaniel Schroeder / drschroeder
This video made possible by the following Patreon Supporters:
Mark
Wes Brown
John Green
Florian Philipp
Rens van der Heijden
Bob Bolch
Daniel Ametsreiter
Joël Quenneville
Richard Pearson
David Dailey
Steven Mulder
Karim
Ethi Raj
Ryan Kyle
William Ricketts
Collin Mandris
Matt
Jonathan Foster
Siddharth Sadanand
Maarten Daalder
Robby Olivam
Alan Browning
Jonathan Piersa
Julia Person
James Craver
Sarah Chavis
Yonatan Bisk
Richard Campbell
Chris Barrett
Jan A
Christopher Coleman
Daniel “YoureDown” Breger
Hendrik Payer
Daniel Yip
Matt K
William Pearson
Kevin Lynch
Nick Ward
Kevin
john eriksson
Allan Farrell
Tobias Olesen
Chris Chapin
Michael Keefe
Jon Mann
Bert Goethals
Joji Wata
Adam Naber
Rob Ibsen
Jacob Gumpert
Peter Collier
Andi Davis
Aarthy
Raymond Cason
Evan Gale
Paul
Tori McClanahan
Andrew Stobie
Dominik Steenken
Danilo Metzger
Christian Altenhofen
Ezra Lee
Roy Morgan
Olivia Darroch
Amber Ciarvella
ryan horlacher
Keith Chang
Milokot
Janel Christensen
Will Scherer
Mike Fulcher
Larom Lancaster
Liam Callaway
John Harman
Christos Papandreou
Fernando Pazin
Johnathon Kinville
Jason Medrano
Andrew Barnett
Katharina Schuchmann
John Gietzen
Michael Tardibuono
Matthew Hebert
Christy Filipich
Pierre-Louis Bourgeois
Genevieve Lawrence
Brian D'Agostini
Chris
Dominik Menzi
Ryan A. Schauer
Daniel Johnson
Nico Houbraken
Michael Carr
Ragnhild
Elizabeth Meisterling
Lysann Schlegel
Magnus Krokstad
Chase Turner
Owain Blackwood
Russ Arrell
Brenden Bullock
Asaf Gartner
Mark Samberg
Tina Johnston
Mike Cochrane
Tom Murphy
Peter L
Jeff
Erica Pratt
David
Artur Szczypta
David Drueding
Nicklas Ulvnäs
Nigel W
James Nelson
Mary Foster-Smith
Clayton Neff
Michael Merino
Jason and Gayle Corfman
Mihaly Barasz
Steven Klurfeld
Richard Bairwell
Tamas Bartal
Erven
Justin Prahl
Michael Maitlen
John Harman
Hans van Staveren
Kasey
Karlin Nazario
M K
Jacques LABBÉ
Geralyn Byers
jason black
Candice Blodgett
Daniel Gibbs
Henry Berthelsen
Andy Kittner
Steve Hall
Erdumas
Rob Snyder
John Kelly
Jessica Rosenstein
Bill Tomiyasu
Vasco Simões
Eoin
Simon Hammersley
iain
Holger
Alexis Carpenter
Jay Goodman
Joseph Perry
Mark Govea
Eduardo Rampelotto Gatto
Created by Henry Reich - Science & Technology
ahh!!! this means a lot to me as a digital artist!! thank you for explaining why my drawings turn out like garbage when i try to blur them
Me too, it really help me as a digital artist :D
I don't understand how it helps. What do you do to fix the problem?
futurestoryteller They could find an application that properly blends colors, such as the aforementioned settings in Photoshop :p
In Photoshop, you create a custom RGB setting with a gamma of 1.0. Edit>Convert to Profile. Click on the Profile drop down menu and select Custom RGB. Then type in a gamma of 1.0. You may also want to change the Primaries to Adobe RGB 1998 (mine defaults to HDTV for some reason).
As for any other programs, couldn't tell ya.
If you're trying to blur little dots or something like that, lower the opacity and lower the size of your smudge tool.
This actually helped me so much with the raytracer I was writing
+chasenallimcam I am a programmer who did not know this, but had experienced it before. Now I know why, and will surely forget before I need it again.
+Richard Smith give me your email, I'll set up a spam program to remind you every 3 hours or so. Anything so I don't have to fix your mess later.
My reasons are different, generating color gradients mathematically for display in RGB, the color space of the frame buffer.
hal hahah ok!
Wow, that's dope
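For the gradient-generation use case mentioned earlier in this thread, here is a minimal Python sketch. It is illustrative only, not any particular library's API: sqrt stands in for the real sRGB curve, and `lerp_color` and its rounding are my own choices.

```python
def lerp_color(c0, c1, t):
    """Interpolate two stored RGB triples (0-255) in linear light."""
    out = []
    for a, b in zip(c0, c1):
        la, lb = (a / 255) ** 2, (b / 255) ** 2               # decode stored -> linear
        out.append(round(255 * (la + (lb - la) * t) ** 0.5))  # blend, re-encode
    return tuple(out)

# Midpoint of a red-to-green gradient: a bright (180, 180, 0),
# not the muddy (128, 128, 0) that naive interpolation produces.
print(lerp_color((255, 0, 0), (0, 255, 0), 0.5))
```

At t = 0 or t = 1 it returns the endpoints unchanged, so it drops straight into existing gradient code.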
I'm a professional illustrator, and you just taught me something. I'd always assumed it was because image editors were intentionally treating colours like pigments and mixing them subtractively instead of additively, since the result generally fits.
>An Adobe product not having the default option be the best choice.
As typical as the sun rising in the morning.
haha hating on adobe cause everyone does it how funny and original
@@ForfunckleStudios Hating a company due to their bad consumer practice is clearly wrong
same por Apple
Nice greentext.
@@ForfunckleStudios Hey I've been hating on Adobe ever since Photoshop 2.5 you whippersnapper!
Interestingly, the non-linear storage of brightness in computer graphics did not evolve as a clever deliberate choice; instead, it was merely a legacy from the display systems used back then: Cathode ray tubes. Their brightness happens to be roughly proportional to the square of the control voltage. Designers of TV broadcasting norms were aware of this, and decided to compensate for this effect in the broadcasting side of the system, to keep the receivers as simple as possible. When those same receivers were later adapted as computer displays, the computer engineers never seemed to have paid any attention to this detail. It was only when computers started to be used in the printing industry that this quirk started to get any attention in computer technology.
CLipka2373 Very good
It turns out that while it was a by-product of the physics back in the day, it has stuck around because it is actually useful for data compression.
@@TuckerDowns how is it useful for data compression; I've never fully understood that point.
@@brod515 Roughly speaking, taking a square root means keeping the first half of a number's most significant bits and dropping the other half, essentially cutting the size in half. This is more complicated in practice, but I hope you get the idea.
@@Ruhrpottpatriot This still doesn't quite make sense. Information like that would be stored in 4-byte floating-point numbers, which still use all the bits to represent a number. I don't think that's what he was referring to as compression... there is a common idea that storing the values non-linearly keeps only the information useful to the human eye, and I don't quite understand it.
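One concrete way to see the compression argument in this thread (a sketch under the video's sqrt approximation, not the exact sRGB curve): count how many of the 256 storable 8-bit codes land in the darkest 1% of linear light, where human vision distinguishes shades best.

```python
def decode_linear(c):
    """8-bit code stored linearly -> light level in [0, 1]."""
    return c / 255

def decode_gamma(c):
    """8-bit code storing sqrt(light) -> light level in [0, 1]."""
    return (c / 255) ** 2

def codes_below(threshold, decode):
    """How many of the 256 codes represent light below `threshold`."""
    return sum(1 for c in range(256) if decode(c) < threshold)

dark = 0.01  # the darkest 1% of linear light
print(codes_below(dark, decode_linear))  # 3: only codes 0, 1, 2
print(codes_below(dark, decode_gamma))   # 26: codes 0..25 -- far finer dark steps
```

So with only 8 bits per channel, gamma-style storage spends its codes where the eye can actually tell shades apart, which is the sense in which it acts as compression.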
the bass in this video is shaking my house
it ruined my sub
cyclone8200 I see what you did there.
You should check your sub's volume...
INVISIGOTH Bangin' tunes mate. Amirite?
This comment was so random I had to lol.
Gamma correction is like daylight savings time. The actual mathematical operation is super easy, but I can't for the life of me figure out whether to do the one step or its inverse. I keep rewatching this video time and again.
Spring forward, Fall back.
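As a mnemonic for which direction to apply: stored pixel values are roughly sqrt(light), so you decode (square) before doing math on light and encode (square root) afterwards. A tiny sketch with made-up names, using sqrt as a stand-in for the real curve:

```python
def decode(stored):
    """Stored value in [0, 1] -> linear light (undo the file's sqrt)."""
    return stored ** 2

def encode(light):
    """Linear light -> stored value in [0, 1] (what gets written to the file)."""
    return light ** 0.5

def average_pixels(p, q):
    """Gamma-aware average of two stored pixel values."""
    return encode((decode(p) + decode(q)) / 2)

# Blending full brightness with black stays bright (~0.707),
# not the naive midpoint 0.5 that causes the dark band.
print(average_pixels(1.0, 0.0))
```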
I noticed how as an artist I never use even a tenth of all the very bright white values available to me, and it irritates me a lot when my dark grey gets 1 unit closer to black but it looks much darker...
Minute Physics
Minute Maths
Minute Biology
Minute Technology
4 Minutes of Awesome
Where is minute chemistry?I am a chemist
casaverdero there isnt
casaverdero "In science, there is only physics, all the rest is stamp collecting" -Ernest Rutherford
Ruben Lucescu But we need math for physics....
Regnor Math isn't a science (there's nothing empirical about it)
What is meant by that quote is that at the time physics was a reductionist use of math make testable predictions, most other sciences were still concerned with just phenomenology.
As a computer programmer, I think this is less an issue of laziness and more an issue of not realizing the color values were square-rooted in the first place. Thanks for sharing this information.
My mind is blown once again. Thank you, MinutePhysics.
268 likes epik
thank you. Now all I see is incorrect blurring.
:P
Nope... beauty isn't the default... laziness is... ask any engineer...
performance. computers haven't always been this fast. the image formats come from a time where desktop computers were slower than your average smart phone. he was even making this point in the video.
***** That was in the past... however, present-day software still uses the same technique... answer... laziness... trust me, I'm in my first year of IT engineering...
Are you kidding? You think that you can speak for all engineers because you are a first year IT engineering student? Don't make me laugh.
caramida9 Engineers should develop better alternatives for jpeg, png and bmp since they're way outdated.
VirtualCoder Except they already exist and nobody uses them. The same way there has been an alternative to .docx that's one trillion times better and nobody uses it. And the same way everyone should be using .webm instead of .gif, but then again nobody does.
Wow! Just checked, and Photoshop does mess it up :( But! Blender's node editor, free 3D software, makes it yellow how it's supposed to be! :D
*****
Thats ok :)
+Abraham Animations Yup Blender is awesome like that. XD
+Abraham Animations Comparing Blender and Photoshop doesn't make sense tho. They're both made for vastly different reasons.
*****
True, but in the sense of blurring, photoshop doesn't do it quite right :)
Abraham Animations It does it just fine, if you toggle the right mode. The fact that barely anyone noticed this prior to this video speaks volumes about how little it matters. And for those to whom it does matter (see graphic designers, texture artists, etc.), Photoshop has the option right there for them already, even back in earlier incarnations of the software, because they know who uses it.
that moment when your drawing program has blurring on an image but it knows what it’s doing and doesn’t make it ugly
Damn, I remember when this video was new, one of the first videos I saw on the channel. I am really enjoying YouTube recommending me old MinutePhysics videos all of a sudden
WOW, that was WAY more relevant to me as a photographer than I thought it would be when I clicked on the video
As a student of computer science I can say this is accurate. We learn to blur images with the wrong approach and then with the good approach. It's about understanding how computer graphics work; the same as with bubble sort, we learn the easiest method first. What is wrong is having the wrong method in a professional tool, as the video says.
So, can you tell me whether the default approach using RGB values is wrong? Or is this about the actual formats?
+Sebb747 Like the video says, human vision can't tell bright colors apart as well as it can dark colors. So, instead of wasting data storing bright colors, we can get a better image by storing the root of the original brightness value. It's like the mp3 format: instead of saving inaudible sounds, we delete those frequencies.
Angel Alvarado Yes, I'm well aware of this. I'm in CS myself.
But if you do image processing, you usually convert your image into an RGB(A) array which you then work with instead of working with the raw data of whatever image the user chose to supply to you.
My Question was whether those RGB values are representing square roots and are being multiplied down the graphics pipeline or whether this is just a problem for people who choose to - for whatever reason - work with the raw image data.
+Sebb747 You can't know unless you have the data from the original source, take for instance a camera, you can set the gamma values on it but once the photo/video is taken all is stored in the basic RGB(A) values. The same when displaying the image, you can change the gamma values in your TV or screen. The thing for us as developers is how to treat those pixels, you can choose the lazy path and use the mean to "blur" the image or be aware that it's not that simple and you need to consider all cases. Color math is an interesting topic as well.
I stopped learning about IP but there are a lot of resources out there.
Angel Alvarado Guess I'll have to write a test case for my image generation stack. Thanks anyway :)
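The "good approach" described in this thread can be sketched as a 1-D box blur that squares the stored values, averages in linear light, and square-roots the result (sqrt again standing in for the real transfer curve; the function name is mine):

```python
def box_blur_gamma_aware(row, radius=1):
    """1-D box blur of stored values, done in linear light."""
    linear = [v ** 2 for v in row]                      # decode to linear light
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        window = linear[lo:hi]
        out.append((sum(window) / len(window)) ** 0.5)  # average, re-encode
    return out

# A hard bright/dark edge blurs to sqrt(2/3) and sqrt(1/3):
# visibly bright midtones instead of a dark band.
print(box_blur_gamma_aware([1.0, 1.0, 0.0, 0.0]))
```

The "wrong approach" is the same code with the decode and re-encode lines removed, which is exactly what produces the dark sludge at color edges.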
As a (hobby) programmer, I come back to this video occasionally to remind myself about this. Thank you. On this watch I realized I've programmed contrast wrong in a few programs.
Thank you! That was an awesome video to go with my coffee. I learned something and now I want to figure out which editing programs will give me those gorgeous RGB blends!
More than one million views and no comments??
As a person who does a lot of graphic art on the computer, I'm amazed I didn't know this before. And why this has not been fixed.
***** Yes. I see them now. I did find it very strange.
***** Thanks for the tip. I will try that.
+Mason Bially In general, the downsides of open software with regards to bugginess, lack of support, and most importantly, lack of economic incentive to fix problems, far outweigh any theoretical advantages.
_"Also, you as the user can always fix the problems with open software."_
Absolutely not. Almost no users are competent to fix problems with any large software package - even if they happen to be professional programmers - that stuff is just a pipe dream in 99% of cases, possibly slightly better if the problem can be fixed by writing a plugin.
The most laughable part of it though, is that if I, as a programmer, ran into a problem with an open source image processing package and knew what the problem was, it would almost certainly be 100 times quicker for me to write my own special purpose program to just do that particular job instead of spending days or weeks attempting to understand the original program to the point where I could modify it safely without breaking other things.
The typical case is that open source software is written by a handful of dedicated enthusiasts, with minimal programming input from users. Blender is a prime example of this, as large critical portions of the program are developed by one guy. (I know this because the lack of development effort prevented me from using the software at one point, and it was decidedly not worth my time to write the software myself when I could just buy it from an actual business.)
+Julien12150 that happens to me all the time on my phone
The reason it hasn't been fixed is that for larger images, the amount of time a proper blur takes matters far more than for smaller images. A 1024x1024 image has just over 1 million pixels.
The first method uses an addition and a division operation per pixel (the colors are already square-rooted), so for the picture using the first method, just over 1 million additions and divisions.
The second method uses 1 addition, 1 division, 2 multiplications (the squares), and 1 square root (the most expensive basic math function a computer can do, not counting trig functions). Multiplied by 1 million pixels, that's 1 million additions, divisions, and square roots, plus 2 million multiplications.
If you are going for a faster program with less wait time between blurring operations (Paint, Photoshop) or less intense software on older hardware in general, you go with the 2 million operations rather than the 5 million operations
He just basically called Apple lazy :D
+minecraftace123 Ya apples are lazy they just hang on trees
+Ethan Chou How very true. . .
The level of polish we've come to expect from Apple products
You're right, like I'm totally sure the Apple engineers just accidentally put 2GB of DDR2 RAM in a computer that has two 4GHz quad-core processors. It totally wasn't just to scam idiots out of their money or anything.
LE/A Tyrone Indeed, indeed!
this explains a lot
i thank you for letting us know this
Beauty is the default! Look how elegantly an entire image was stored using as few bright gradations as the human eye can even notice!
Is there a setting for this in GIMP?
I want to know the same thing. I think GIMP does this, because I found the settings: cubic and linear (with cubic as standard).
On my copy of gimp it did it right by default.
builderecks Using which filter? I tried Blur, Gaussian Blur, Motion Blur ... none of which did the right thing. I also tried cubic and sinc interpolation when upscaling and even that didn't do the right thing. That's pretty shocking I have to say. This was all done using Gimp 2.8.10.
samramdebest That's only the interpolation between pixels when scaling the picture. So all but nearest neighbor go through the same colors; just the shape this gradient takes is different. It's got nothing to do with gamma correction. The images in this article explain it much better than my words did: en.wikipedia.org/wiki/Bicubic_interpolation
Penny Lane Don't know if maybe the default settings are different on linux (which is what I use) blur, Gaussian and motion all smoothly blended with no darkness issues in the color test I did.
Beauty should be the default. That's true in many circumstances.
Too bad most things and most people aren't beautiful.
Simply amazing. Thank you so much for this marvelous work.
I know this is an old video, but can I just say it's AMAZING how easy this is to understand. I have a very shitty range of skills in maths. I don't understand square roots; times tables are like rocket science to me, etc. etc., yet despite this I can still understand what you're saying. Good job on the way this was worded!
I'm gonna blatantly send this video to all iOS and software developers.
Boogster Su I like that that implies that IOS developers aren't software developers.
lol like they're gonna listen to what people actually want
Just because he mentions iOS doesn't mean Windows or Android work differently. As you can see, he only mentions Instagram even though every website works the same way.
Morons. He just used something he knows people are familiar with
Yes, because they are all incompetent people who know nothing about this problem...
Apple isn't stupid; they picked the inaccurate method for a reason: it's fast. Squaring and then square-rooting takes up a lot more processing power; if they had gone with that, it would have been laggy. I did some tests, and the square-root method of finding averages was 30x faster.
Edit: someone pointed out to me that using lookup tables (essentially a long list of pre-calculated values) can speed up the squaring method. I tried that, and it really sped it up. Using lookup tables, the squaring method is now only 2.6x slower, which is a performance hit iOS developers could handle, so yes, they are lazy.
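The lookup-table trick described above might look like this in Python. An illustrative sketch under stated assumptions: integer linear space, sqrt as the stand-in for the real sRGB curve, and table names of my own choosing.

```python
# Tables are built once; per-pixel work is then two reads, an add, and a halving.
DECODE = [c * c for c in range(256)]                      # stored 0..255 -> linear 0..65025
ENCODE = [round(v ** 0.5) for v in range(255 * 255 + 1)]  # linear -> stored 0..255

def average_lut(p, q):
    """Gamma-aware average of two 8-bit values with no per-pixel sqrt."""
    return ENCODE[(DECODE[p] + DECODE[q]) // 2]

print(average_lut(255, 0))  # 180: the blend stays bright, unlike the naive 127
```

The ENCODE table costs a few hundred kilobytes here; real implementations shrink it or use small per-channel curves, but the principle is the same.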
Can I vote to see a sequel to this video explaining "color space"??
Relating to monitors, tvs, and any digital (or non-digital) final presentations. It would shed more light on the subject... "Color space" can be hard to get your head wrapped around: What are you working/editing in? vs. what is the final output in? and how to compensate appropriately.. The sheer amount of different "color spaces" reminds me of the frustration in the amount of different video codecs there are... which could be another interesting topic/video to explore..??
#danerwinfb
This is an interesting problem I had noticed, yet not really pondered much about. Thanks for explaining the reasons for it. Square roots matter more to me now.
Great I am sure I watched this video at a certain point in time but now that I am learning game dev and wanting to understand gamma correction for textures I watched it again and it makes more sense now.
I can't believe you made a video for this.... hahahahahah AWESOME!
In Photoshop, when creating a new image, set "Color mode" to "Lab color". That'll set it as the default for new files.
When saving to PNG or JPEG, you'll need to go to Image > Mode and set it to RGB.
Checking "Blend RGB Colors Using Gamma" seems to only work for painting (on RGB Color Mode). When I blur the image, I still get the black edges.
Switching to Lab Color produces correct results both with painting and blurring whether "Blend RGB Colors Using Gamma" is checked or not.
Using Adobe Photoshop CC 2017.
What the heck is Lab Mode, why does everything look better lol (now when you desaturate, the black and white values picture look actually correct). Why do we even use RGB mode?
@@Fynmorphover cielab?
brilliant video! understandable and yet in-depth. Finally, I know what the gamma in colour setting is!
Thanks for explaining gamma in a way that doesn't make my brain crust over. I enjoy graphics algorithms and the Gaussian blur issue here is very useful to know!
Americans blend away the u in color.
*colour
lol
i bet brit bongs say "loul" instead, too
chaquator
Colour rhymes with 'duller' ...lol rhymes with log. You're not going to say color. The second o in colour is never pronounced as the o in log.
The u in color is like the brown in between two colors, ugly and not needed
Innar Koït Chtofenbeurg AmE - color BE - colour
I FINALLY understand the purpose of lab color mode in Photoshop! thank you
Years of programming, including reading about gamma, and I never saw it mentioned that both cameras and monitors use non-linear scales, therefore all our beloved 8-bit image brightness is also on a non-linear scale. "Gamma correction" is always portrayed as a funky post-processing effect to manipulate brightness, not an intrinsic step the monitor does to reverse what the camera did.
What a beautiful video.. and a beautiful observation. Well done minutephysics
I'll add this to my giant list of why iOS sucks
iOS really doesn't suck....iOS just serves a different purpose from Linux and Windows.
***** Well iOS is basically a very expensive version of unix so...
Hunter Grimx This isn't limited to just iOS...
I think you missed the point of the video. It's not just iOS, it's the vast majority of computers. Every one needs to change, not just the OS you dislike.
***** It's overpriced as fuck.
It's super easy for me to blend colors. All I have to do is take off my glasses! (BTW, I'm nearsighted.)
Same all i need to do to blur the picture is steal your glasses
My sister works for Valspar Paint and creates her CH (Color Harmony) thoroughly through it. I love her!
Thank you! I need this knowledge. Hope I'll remember how to do it when the time comes.
Holy shit. I consider myself a software developer with good understanding of image processing, but this is news for me.
is anyone else getting super bass in their headphones?
no i am not getting fish in my headphones. if you are, please see a doctor
lol that actually made me laugh
So you don't laugh at a fish very often.
I'm not really a fan of Nicki Minaj
Someone isn't using neutral headphones I see
Thank you! A few years ago, I had the exact same issue when trying to animate a color change from red to green. And the library (jquery-ui) I used added some dark gray in between. Now I know the reason :)
Okay, YT, I have absolutely no idea why you are giving me a four-year-old video, but this is actually a bloody awesome one! Good job on making this, mate
♥️"Shouldn't beauty be the default?"♥️
YES BEAUTY SHOULD BE THE DEFAULT! YES YES YES!
Way to go Henry! Great analysis.
Very good video, clear explanation and super helpful and interesting
This is the first time in a long time I could follow along holy shit
from a programming point of view though, blurring functions are already computationally heavy, and square roots are notoriously slow to process as well. I think we'd find that blurring images the correct way on devices like iOS with high pixel densities might actually produce upsetting lag in the interface. It's the kind of trade-off that can be well worth it, given the small number of people it might actually upset and the small number of images it might mess up. IMO
That's what look up tables are for. :)
You have no fucking clue what a lookup table is.
No, I was not referring to the *blur*, but the gamma correction!
What Joe meant was a rainbow table. You only need a couple of megabytes to map one for every single color
You literally pre-compute the inverse gamma curve and the normal gamma curve...multiply the working texels by the appropriate value of the inverse curve to get back to linear color space, blend, and multiply by the gamma curve to convert back to sRGB encoding....the curve is the same for each color channel too so you don't even have to waste memory pre-calculating for every possible color
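A rough sketch of the precomputed-curve pipeline described in this thread, using a plain 2.2 power curve as an approximation of sRGB (the function and table names are mine, not any engine's API):

```python
GAMMA = 2.2  # a plain power curve, a common approximation of sRGB

# Precomputed inverse-gamma curve: one table shared by all three channels.
TO_LINEAR = [(c / 255) ** GAMMA for c in range(256)]

def blend(p, q, t=0.5):
    """Blend two 8-bit channel values in linear light; return an 8-bit value."""
    lin = TO_LINEAR[p] * (1 - t) + TO_LINEAR[q] * t
    return round(255 * lin ** (1 / GAMMA))

print(blend(255, 0))  # 186: brighter than the naive midpoint of 127 or 128
```

As the comment says, only the decode direction needs a full table; the single encode per output pixel can also be tabulated if even the one power call is too slow.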
Thank you for this. I’ve always wondered
thank you for creating this!
How he got through this video without ever once mentioning that one digital picture (that shall not be named), I'll never know... ;P
But seriously, very interesting video! :D
what image?
Raicuparta The image of a particular item of clothing :P
josh11735
pfft, that fad lasted like 2 hours
cyberconsumer That's why it was a joke
it's actually more impressive how he did not mention the city lights picture from NASA... scale it without converting to Lab colorspace first, and you get an image that's waaaay different.
Do you know the settings for this on gimp ? And btw, great video !
Wow... I never knew! Good videos keep em coming!
Interesting, all this time I attributed this to thinking that the perception of color brightness was based on the highest color value (so, which RGB value is exciting our optic cones the most), and so blending red 255,0,0 and green 0,255,0 got a yellow 128,128,0 that appeared dull because its top end was only at 128, even if it had the same total # of photons (or so I thought). The more you know.
Well, indeed it's not sqrt but a gamma transform (^(1/2.2)); or really it's the sRGB transform, which is more complicated. OK, it can roughly be approximated by sqrt, but please don't say it IS sqrt. It's not more complicated to do the real math.
It doesn't particularly matter; sqrt is just a function that spaces dark values apart more than bright ones, which is what the video wants to show. Introducing the actual real math there wouldn't serve to do anything other than alienate the average viewer for no real reason. He puts an asterisk in for people like you as well.
What's the point of the video if not to be accurate?
The video literally says this already at 2:01.
@@dlwatib What is the point of your comment other than to imply that 2.0 is not in between 1.8 and 2.2 (which the video explicitly shows at 2:01)? This is not a rhetorical question.
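For reference, these are the actual piecewise sRGB transfer functions the thread alludes to: a linear toe below a small threshold, then a 2.4-power segment, per the standard constants.

```python
def srgb_to_linear(s):
    """sRGB-encoded value in [0, 1] -> linear light."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    """Linear light in [0, 1] -> sRGB-encoded value."""
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

# Mid-grey: close to sqrt(0.5) ~ 0.707 but not identical.
print(linear_to_srgb(0.5))  # ~ 0.7354
```

So sqrt is a decent mental model, but the real curve is neither a pure power law nor exactly gamma 2.2.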
if you REALLY want the Blur to use the correct Luminosity value, don't use RGB, but switch to LAB image mode (Image>Mode>Lab in Photoshop).
Only there you will find the correct Luminosity applied to the color edges.
In Photoshop I set the image to 32-bits/channel mode, then it does the math right. Too bad many functions aren't implemented or poorly adapted to that mode.
THANK YOU A LOT. I really wanted to correct this because the blur effects I used to adjust illumination in 3D renders were producing a weird darkness. =D =D =D
Using Lab is not more "correct", from a physics point of view using linear RGB is correct while Lab is wrong. However, Lab may very well _look better_ since the Lab color space is designed to model human perception.
And from what I've seen, you can use it to adjust skin tones. (Flesh Man Group intensifies)
It's always great to understand what you intuitively know.
In display drivers development we end up converting to Linear RGB a lot through a de-gamma process to avoid a lot of the artifacts that occur when blending multiple layers. I wish I knew why a lot of software takes short cuts that look bad.
D: I had never considered there to be another possibility. Is there a good way to fix this in Photoshop?
To be fair to us artists though, having it set this way probably makes it easier for us to transition from physical pigment mediums (paints, coloured pencils, anything of the sort) and better predict the results, since the mix of pigments produces darker, less saturated colours too. It's natural for us to understand the mixing of red and green (or any contrasting, complementary colours) as something that produces dark, desaturated brown. And the method you showed seems to have the problem of generating too much light between the colours, which could prove to be very tricky to deal with for, say, digital illustration. I'd have to test it myself.
Oh, sweet! I found the setting, and the gamma adjustment allows you to avoid the problem of too much brightness going on keeping the setting at ~1.5.
Wait, where's the setting? D:
+Foxeste You can see it briefly in the video. In Photoshop, click Edit > Color Settings. A window opens up and there should be an option to "Blend RGB Colors Using Gamma", with an unticked box and a field where you can enter a number between 1 and 2.2 (1 = most gammafied, 2.2 = normal).
(I have it in Spanish, so the wording might not be exact.)
Hmmm, now can you fix this in Inkscape?
Yes. Go to the XML editor, find the filter definition (under ), find the style attribute and remove the 'color-interpolation-filters' property from it (or delete the attribute entirely if that's its only property).
Remember that Inkscape can only do what SVG can, and probably less. More info:
www.w3.org/TR/SVG/filters.html#FilterPrimitivesOverviewIntro
Clearest explanation I've ever seen. Thanks!
I didn't know how badly I wanted to know the reason behind ugly blurring.....Thank you!
Anybody know where the setting is for Gimp?
Dr. Certifiable somewhere underneath the leather suite.
sadly incorrect, just tested ^^
Blargles Malargles
"Leather suite"...? Idk what that is but Kaveh is saying that's not right. Can you clarify?
Dr. Certifiable I think it has to do with BDSM. www.urbandictionary.com/define.php?term=gimp
Dr. Certifiable I think you have to search under Tools -> GEGL libraries -> gaussian-blur, but I'm really not sure..
I'm interested in trying out the difference between these settings in photoshop, but even after I made the change shown in this video, photoshop appears to still blend the exact same way. Clicking that box hasn't altered any of the blending methods I've tested so far. Maybe I'm missing something and there's more to it than just clicking that box in the advanced color settings?
Same here :-/
His method will fix it for blending one layer on top of another one.
To fix blurring one layer into itself, you have to use LAB mode rather than RGB.
Not sure if you ever figured it out but this is what I do to fix it:
imgur.com/a/Ovc9bsJ
Well this is useful info for when I make graphics and shaders in Blender and Unity.
i remember when this came out before i got into photoshop and ive remembered this video since
I totally agree with that last statement
What an insightful comment!
+Roy Vivat comments don't need to provide insight
What an insightful comment!
Did you even understand that last statement?
That's why professional photographers always use "raw" image format. It preserves colour and brightness information correctly.
jpeg sucks for quality
So I accidentally turned on my translator and it messes up really often so showed a different comment and then this comment
@@banana_man_101 ok
@@banana_man_101 ok
Lovely explanation of how blur works, did not know that!
Part of this could be related to efficiency, since the square root operation is computationally expensive compared to multiplication and division... but still, I think we're at a point where the hardware can handle it pretty easily.
The proper thing to do, would be storing the exact response curves of the camera, with the image, so you can go back to "radiance-linear" (proportional to the number of photons that hit a sensor pixel) space, do blending etc., then re-apply the response.
Wouldn't that take up more space.
greenmumm It would be insignificant, compared to the image content.
We are talking a table of a few hundred bytes, vs. megabytes for the pixel data.
lnpilot Right but that's why they didn't do that at first right?
greenmumm I guess, it's because you need relatively expensive equipment to acquire the camera's response curve, plus it takes some time.
It would make sense for more professional cameras though...
We have 3, $2000 high-end machine vision cameras for our robot project and they all have completely different responses (same sensor, same manufacturer!).
So, I had to design / build a rig with a programmable, calibrated RGB light, to acquire the curves.
It would be nice if the manufacturer did this and just stored the curves in the camera's firmware.
lnpilot Makes sense.
2:53 Fun little thing to try to prove for yourself (if you like math): averaging in square-root space, (sqrt(x)+sqrt(y))/2, never comes out brighter than the true linear average once you square it back.
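A quick numeric check of that exercise (a Python sketch): squaring the average of square roots never exceeds the plain average, which is exactly why blends done in sqrt-encoded space come out too dark.

```python
import random

def gamma_space_blend(x, y):
    """Average in sqrt (encoded) space, then square back to linear."""
    return ((x ** 0.5 + y ** 0.5) / 2.0) ** 2

def linear_blend(x, y):
    """The physically correct average of two linear intensities."""
    return (x + y) / 2.0

# The difference works out to (sqrt(x) - sqrt(y))**2 / 4 >= 0, so the
# sqrt-space blend can never be the brighter of the two:
for _ in range(10_000):
    x, y = random.random(), random.random()
    assert gamma_space_blend(x, y) <= linear_blend(x, y) + 1e-12
```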
good to know this. next time i make image editing software, i'll make sure to remember this
Photographers who shoot in RAW keep the data linear, so the default setting will work for them out of the box. Some image formats have flags to tell you what color space the data is in, but they aren't used often enough. For games it's not easy to tell what gamma you should render at to get it to show up accurately on the monitor. Image assets are usually saved in gamma space, then blended linearly in the game engine, then converted to gamma space again, causing artifacts.
Well this answers why gradients with transparencies are so ugly in Illustrator.
The solution: don't use crappy tools.
Jeffery Liggett or learn how to use non-crappy tools
Jason Crafts
That's the correct way to define it.
***** imageMagick 8D
***** MS Paint.
Jeffery Liggett Don't use Adobe software......LOL
So, after a bit of experimentation in my copy of Photoshop CS6 Beta: going to Edit > Color Settings and checking "Blend RGB Colors Using Gamma" does nothing.
BUT: all I have to do is go to Image > Mode, and select "Lab Color" - et voilà! When applying a Gaussian Blur filter to an image that's pure red on one side and pure green on the other, I get the gold color as displayed in this video, rather than 'dark sludge'. :D
Thank you, MinutePhysics, for helping me out with this! …Even if it was indirectly! :)
yeah but.....will it blend?
oh, yes. yes it does.
Audible, the leading provider of YouTube sponsorship!
I have to use this in one of my programming projects!
Wow, never knew. Good lesson!
This is what a linear workflow does
2:34
You missed the part where you draw red! xD
darn, that explains why some drawing programs are better at blurring than others. didn't think of that
Working in Photoshop for 6 years now and learned something new today... thanks!!!
+MinutePhysics videos are probably the only YouTube videos that fuck with my sub, playing a bass line at a frequency it does not like! Grrrrrrrrrrrrrrrrr
That must be annoying; I completely know that feeling, where something has an odd harmonic that creates a low growling or distorted sound. Wonder what it is. Our town's radio station has an imbalance of 15% between the stereo channels, and that already drives me crazy. My right ear is happy, the left one is sad. XDDD
Maybe I should email them. I noticed it isn't as bad on stereos as on portable devices like phones and MP3 players. Perhaps nobody even knows it does that. Though I do find old analog broadcast hardware quite cool; it has quirks from time to time, that's for sure. :D
+theLuigiFan0007 have you tried listening to that station with different radio equipment and/or in different locations? Stereo FM transmission is not trivial. It starts by transmitting the sum L+R, to be compatible with non-stereo receivers, then computes the difference L−R, uses it to amplitude modulate a higher frequency signal, called a Subcarrier, then merges it back with the main signal.
What I mean is that there may be some interference in your specific location and/or a fault in your own equipment that gives that imbalance. It may or may not be the station's fault. This is also one of the reasons most stereo equipment (used to?) have a Balance knob, to tweak the stereo balance manually.
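For the curious, the multiplexing scheme described above can be sketched in a few lines of Python. The 10% pilot amplitude and the sample-based framing are illustrative assumptions, not broadcast-spec values.

```python
import math

PILOT_HZ = 19_000       # pilot tone; receivers double it to 38 kHz
SUBCARRIER_HZ = 38_000  # DSB-SC subcarrier carrying the L-R signal

def stereo_multiplex(left, right, t):
    """Compose the FM stereo baseband at time t (seconds) from
    instantaneous left/right sample values in [-1, 1]."""
    mono = (left + right) / 2   # L+R: what mono receivers decode
    diff = (left - right) / 2   # L-R: recovered only by stereo sets
    pilot = 0.1 * math.sin(2 * math.pi * PILOT_HZ * t)
    return mono + pilot + diff * math.sin(2 * math.pi * SUBCARRIER_HZ * t)
```

An imbalance like the one complained about above would show up as the receiver mis-scaling the recovered L-R term before adding/subtracting it from L+R.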
etatoby
Yeah, I know how stereo broadcast works to some extent; isn't the MPX subcarrier between 19 kHz and 39 kHz?
Could be interference, as the roof is a steel roof, which is made of enameled steel plate. But, I don't think so as if I use a USB SDR stick or a car radio there's no imbalance. Both of those auto adjust stereo balance, as far as I know.
Could just be older receivers don't like the signal output by the station. I tried it on a somewhat decent stereo a while back and it sounded fine as well. I think the problem is limited to cheap FM radios.
Maybe you need to use advanced settings ;)
Wouldn't this just create a new problem for any image *not* taken with a digital camera?
Or do all current image containers utilize the same squaring algorithms?
_Example, an image created solely in Photoshop_
It's not about whether you took the image with a digital camera; it's about the format you store it in. Even when you create images in Photoshop and then save them as JPG or whatever format Henry is talking about, they will be stored the same way as digital photos.
***** I would assume that formats that are considered less "lossy" and less compressed would avoid this or at least minimize the effects. Such as saving as PNG instead of JPG
***** From an editing standpoint, you should always work with RAW images. The quantity of information they contain REALLY does make a difference. However, even when working with RAW, trying to blur something in RGB gives you the same dark effect we're trying to avoid. The LAB color space, as far as I can tell, is really the only thing that has a significant impact on this. After all, it doesn't matter how much data you have for an image if the way the data is altered (the editing) is wrong to begin with, and that is exactly the problem we have here. The problem is not the data; it is the way your program (e.g. Photoshop) modifies said data.
All images created for display in the web are created in the sRGB color space, and therefore follow the sRGB gamma curve (roughly a power of 2.2, not technically "squared"). This is so that they don't need modification in order to be displayed by web browsers. JPEG files are generally assumed to be in sRGB, and this is what the average image editor will assume as well.
PNG files actually have a gamma and color profile setting so you can store it with any gamma curve you want, but many web browsers and image viewers still horribly suck at proper color management.
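For reference, the sRGB curve mentioned above is piecewise: a short linear segment near black followed by a 2.4-exponent power function, which together approximate a pure 2.2 gamma. A minimal Python version:

```python
def srgb_to_linear(v):
    """Piecewise sRGB decoding (IEC 61966-2-1): linear segment near
    black, then a 2.4-exponent power curve; close to v ** 2.2 overall."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v):
    """Inverse of srgb_to_linear (linear light -> stored sRGB value)."""
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * v ** (1 / 2.4) - 0.055

# Round-trip sanity check:
assert abs(linear_to_srgb(srgb_to_linear(0.5)) - 0.5) < 1e-9
```

The linear toe near black exists because a pure power curve has infinite slope at zero, which is numerically awkward.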
What a great video! Now I finally understand gamma
This is a really good video, I can't believe I've never seen it before
Ah the triangle inequality, so useful
The thing is, it's arguably bad design to do this gamma correction *during color blending/blurring* in each application via a setting. That may be why even knowledgeable programmers are reluctant to do it, let alone enable it by default. It *should* be done when *reading* the image file, then all the mixing and blending should be done linearly, and then the reverse gamma correction should be applied to the result. Doing gamma correction during color blending would also make the program run slower -- color blending is often done on the GPU these days, and the hardware color blender even in modern GPUs is "fixed-function", i.e. it isn't freely programmable, it's only somewhat parameterizable, and it can only do linear blending (see www.opengl.org/sdk/docs/man/html/glBlendFunc.xhtml). You can do arbitrary non-linear blending in a pixel shader (a small user program that is run by the GPU in parallel for each pixel of the frame to compute its color), but that's still slower than the fixed-function blender, and it may require a large refactoring of your rendering pipeline code, so you may not be able to do that easily.
As mentioned, the right way to solve this problem is to apply ^(1/Gamma) to all pixel values of an input image as you're reading it, since common image formats like JPEG have already been gamma-corrected during creation, i.e. their pixel values correspond to (physical light intensity of the pixel)^Gamma. Then you would do all the color mixing and blending linearly, and then apply ^Gamma once to the final result pixel right before sending it to the display hardware -- which internally computes ^(1/Gamma) to get the actual physical brightness of the pixel on the display (this was due to physical properties of the screen on CRTs, and modern LCDs emulate it for backwards compatibility).
This is all supported in hardware these days: the sRGB texture format (since OpenGL 2.1) takes care of the gamma correction when reading images -- i.e. it computes texture value = (image pixel value)^(1/Gamma) when reading an image into a texture. Then you can blend multiple texture values linearly, i.e. just add/average them to obtain the result pixel which you write into the framebuffer. After that, the sRGB framebuffer format (GL_FRAMEBUFFER_SRGB, since OpenGL 3) takes care of computing output pixel value = (framebuffer pixel value)^Gamma, which is sent to the display. I don't know which part of that pipeline common tools like Photoshop or GIMP get wrong (I'm not really an expert on this stuff either), but the solution proposed here ("Blend RGB colors using Gamma" setting in Photoshop) is not optimal.
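Outside of OpenGL, the decode-blend-encode pipeline described above can be sketched in a few lines of Python, using a pure 2.2 power curve as a stand-in for the exact sRGB transfer function:

```python
GAMMA = 2.2  # stand-in for the exact sRGB transfer function

def read_texel(stored):
    """sRGB-texture-style read: stored value -> linear light."""
    return stored ** GAMMA

def write_framebuffer(linear):
    """sRGB-framebuffer-style write: linear light -> stored value."""
    return linear ** (1.0 / GAMMA)

def blend_texels(a, b):
    """Linear (fixed-function-style) blend of two decoded texels,
    re-encoded exactly once on the way out."""
    return write_framebuffer((read_texel(a) + read_texel(b)) / 2.0)

# Blending full brightness against black stays bright (~0.73 stored),
# instead of the muddy 0.5 a gamma-space average would give.
result = blend_texels(1.0, 0.0)
```

The key property is that the nonlinear conversions happen only at the edges of the pipeline; everything in between stays linear.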
That's not true. Desktop GPUs have fixed-function blending. Shaders are for pixel generation during rendering, but the blending process that puts those pixels onto the framebuffer is still fixed-function. This is actually programmable on some mobile GPUs which use tile-based rendering, but desktop GPUs do not have this feature. You can simulate arbitrary blending by using more complicated shader programming, but it makes everything a lot more complicated and slower (in some cases impractically so).
Olaf is correct. Gamma correction should be considered a *file encoding* detail. In fact it's almost the same as a-law and mu-law audio encoding, and nobody is dumb enough to try to do audio processing with a-law and mu-law data. We just do it with video/graphics because we're used to doing it wrong.
GIMP 2.9 (devel) does this properly. Its image operations all use linear 0-1 values. The working image can be internally stored in any format in memory, and in case of gamma-encoding it will be converted to linear light before any operations are performed. For ideal precision across multiple operations, you should change the image mode to 32-bit float linear, work on it, then only export to sRGB the final image. But even without doing that, operations will be correct (though you may lose a bit of precision due to intermediate conversions).
3:06 You know how many art tutorials there are that tell you to brighten midtones and blended color areas with arbitrary made-up reasons? Sometimes ambient occlusion and subsurface scattering get thrown in as explanations, when sometimes it's actually just this^.
Wow, I never heard about this! Even though I am pretty advanced in computer vision (I did a vSLAM diploma project, and almost got a computer vision engineer position).
I had no idea! Is the same true for the alpha channel in semi-transparent "colors"?
Nice!
Yup!
Yes, the alpha channel has the same problem. Do the following experiment:
1) Create a new image in Gimp, filled with red.
2) Create a new layer, half-filled with green.
3) Gaussian-blur the green layer.
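A closely related check can be done numerically without Gimp. This is a Python sketch of compositing a 50%-alpha green layer over red, once naively on the stored values and once in linear light (using a plain 2.2 gamma as an approximation of the sRGB curve):

```python
GAMMA = 2.2  # power-law approximation of the sRGB curve

def over_gamma(fg, bg, alpha):
    """Naive 'over' operator applied directly to stored values."""
    return alpha * fg + (1 - alpha) * bg

def over_linear(fg, bg, alpha):
    """'Over' performed in linear light, then re-encoded."""
    lin = alpha * fg ** GAMMA + (1 - alpha) * bg ** GAMMA
    return lin ** (1 / GAMMA)

red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
naive = tuple(over_gamma(f, b, 0.5) for f, b in zip(green, red))
proper = tuple(over_linear(f, b, 0.5) for f, b in zip(green, red))
# naive is (0.5, 0.5, 0.0), a dark olive; proper is a brighter yellow
# with R and G around 0.73.
```

The dark fringe you see after the Gaussian blur in the Gimp experiment is the per-pixel version of the `naive` result.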
I'm beginning to think that framebuffers should be using floating point numbers representing luminances in CIE color space and converted to monitor color space for display by the graphics card. That would make doing the Right Thing so much simpler.