What is the Dynamic Range of the Human Eye?
- Published May 19, 2024
- Download Free Blueprint on Making a Movie: mailchi.mp/wolfcrow/blueprint...
In this video I'll explain how we can approximate the dynamic range of the human eye, as it pertains to cinematography.
How much more dynamic range do cameras need to have to match the human eye? Let's find out!
For the written article with the numbers, read this: wolfcrow.com/what-is-the-dyna...
What other people would sell as a course, this man is giving us for free. Great job!
The brain is great at computational photography: if it expects to see something, it will see it, even if it is not there. Great analysis, thanks.
I was one of Galileo’s first 100 subs, that’s my boy
😅
I saw a video of yours titled "Why Movies Are Shot in Two Colors"; you got a million and a half views on it, and as a viewer I could tell why. The voiceover on that video was just a masterpiece; I could feel the emotion in it, and it even reminded me of Leonardo DiCaprio's voice. I'm really angry that your videos are not growing like before. I really wish for you to be on top; you are a really inspiring content creator.
Regardless of dynamic range, the human eye has much better bit depth. 16-bit readout/encoding is still a flagship premium spec.
💯
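For a sense of scale, here's a quick sketch of what that bit-depth claim means numerically, assuming a simple linear encoding (the `tonal_levels` helper is mine, not from the video):

```python
def tonal_levels(bits: int) -> int:
    """An N-bit encoding can distinguish 2**N code values per channel."""
    return 2 ** bits

# Compare common bit depths: 16-bit gives 64x the code values
# of 10-bit, which is why it matters when grading LOG footage.
for bits in (8, 10, 12, 16):
    print(f"{bits}-bit: {tonal_levels(bits):,} levels")
```

This prints 256, 1,024, 4,096, and 65,536 levels respectively.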
Now the next videos you should make are on the bit depth of the human eye (is it log or linear?) and the aperture or depth of field of the human eye. Lastly, construct a camera that mimics the human eye as closely as possible. Does it need to be 3D, with two lenses side by side?
This man's videos are filled with brain-knocking information. I'm not qualified enough to understand these terms, yet I watch start to end 😊😂
I think a slightly simpler video would be better.😊
Amazing video! Thanks for that🎉
Thank you for sharing. Very good information.
Excellent again. Thank you.
You’re welcome!
The end... just saw the 42mm video before
Now do one for aspect ratio (of human vision) 😜
this is so good..
Never knew Galileo was a famous youtuber😂
So when they say "film is closer to the eye," it's more like barely closer. And the Alexa 35, with 17 stops, is still not that close to the human eye.
but so much cheaper :)
Keep deep diving. Human vision is weird and because of quirks in biology is not as precise as we think. Phrases like “seeing is believing” start to seem ironic when you learn a lot of human vision is our brains “best guess” of what limited data the eye is returning. The brain does and will fill in gaps with what it would expect to be there (based on past experience) and other biases.
Tons of instances of people standing right next to each other and spotting something in the woods and seeing things wildly different from each other.
I have dual IPS monitors set to the EBU profile, and sometimes Rec.709, to see if I notice a difference.
Most YouTube videos I watch, at least highly produced ones, look "washed out" to me, so I use a browser extension to apply filters, normally +10 contrast and +5 saturation.
Honestly, contrast is what makes them look better. I wonder if my monitor is too soft or if those channels don't properly color correct. I watch a lot of videography content and those channels are the worst, as if they publish LOG footage just to show they have fancy cameras; yet they clearly know how to color correct, since they often teach how to do it.
Am I in the wrong here? I like my monitors and don't want to replace them yet; I'm happy doing my own filtering. But I would like to know if I'm alone in this. I don't like this trend.
Most content creators have to produce based on the lowest common denominator. If most screens that their videos will be displayed on are phone screens, then why bother doing time consuming post-production that will only benefit a very small minority of viewers?
At the end 21×2 = 42. Lmao genius 😂🤣 the closest focal length to our eyes
-What is dynamic range of the human eye?
- 21 stops
-Thank you. Didn’t even need to watch the video
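The 21-stop figure from the video can be expressed as a contrast ratio, since each stop doubles the amount of light. A minimal sketch of the arithmetic (the `stops_to_contrast` helper is mine):

```python
def stops_to_contrast(stops: int) -> int:
    """N stops of dynamic range span a 2**N brightest:darkest ratio."""
    return 2 ** stops

# Human eye per the video: 21 stops -> 2,097,152:1
print(f"Human eye (21 stops): {stops_to_contrast(21):,}:1")
# Alexa 35 (mentioned in the comments): 17 stops -> 131,072:1
print(f"Alexa 35 (17 stops):  {stops_to_contrast(17):,}:1")
```

Four stops of difference means the eye spans a contrast ratio 16x larger than the camera's.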
But bro, if they could build this into cameras, they wouldn't need HDR features; they'd be powerful on their own.
True story, the octopus has objectively better eyes than we do in just about every metric.
Well they can't see color
@@ironman5034 technically, color doesn't exist outside your mind. Also, the octopus can see with its skin. That's how it changes its color to match its surroundings. That it changes its color indicates awareness of color. So why do they say the octopus cannot see color?
What about if you have the sharingan?
Is that brusspub music :D?
wait Galileo is a youtuber?
(not so) Fun Fact: In Australia, "nits" is a word for lice
When I look at the sun and close my eyes, that's the f/50
Oh man, this whole video is weird 😕
Loved this channel for a long time, but I've got to unsubscribe. Using gen AI tools as a filmmaker is such a slap in the face to all the artists those datasets have stolen from.
That's how the world works, sorry.
The world system doesn't care about theft and unlawful activity unless humans in higher positions are affected by it.
You can cry about it, but your fellow humans will move on and use AI or whatever to make small profits to feed their families and dreams.
@@iamakkkshay sorry, I have this thing called integrity. It’s kind of hard to explain to people who are utterly devoid of it.
@ZaneOlson you have the same "integrity" as someone who de-legitimizes digital art vs. traditional art, or someone who said that photography would be the death of painting.
@@ZaneOlson it's just the direction we're going. Who knows, maybe generative AI will be used in tandem with photography and videography to help boost resolution 🤔
@@ComicDan That's a really dumb comparison; AI datasets have been proven to contain MILLIONS of stolen pieces of art, from images to entire books. Just because you think it's cool doesn't make it ethical, and I would implore you to actually do your research instead of spouting ignorant comments about a subject you clearly don't know anything about. It's embarrassing.