Could you give a bit more detail about your scripting language? I'm interested in implementing a scripting language like yours. You could give information such as whether you used LLVM or your own backend, your own parser or a parser generator, etc. please. (By the way, I just discovered your channel and really liked your content, visualizations, explanations... And sure, I subscribed naturally.)
Thank you and thanks a lot for watching! I would say that it is "reasonably" performant for the purposes of ray-tracing. The calculation of the diffraction pattern takes the longest time, especially if the color resolution is high, but it usually only needs to be calculated once per camera/lens (and theoretically could be cached between sessions). I can think of some ways to parallelize this calculation with a GPU but it wasn't useful for this particular project. The convolution between the diffraction pattern and an image frame is reasonably fast and depends on the speed of the FFT implementation. One might be able to implement a 2D FFT on GPU but I don't know much about this myself. Apparently UE4 offered a real-time bloom simulation that was similar to this but I don't know the details of their implementation. Hope that answers your question! If you're interested, the code for this project is open-source and part of the MantaRay ray-tracer on my GitHub page (link in description).
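For readers curious what that workflow looks like concretely, here is a rough sketch in Python/NumPy. This is illustrative only, not the actual MantaRay code: the function names `make_bloom_kernel` and `apply_bloom` are made up, and a plain circular mask stands in for a real aperture image. The key point from the reply above is that the expensive kernel is computed once per camera/lens, while the per-frame cost is just a few FFTs.

```python
import numpy as np

# Illustrative sketch of the workflow described above: the diffraction kernel is
# computed once per camera/lens from the aperture mask, then reused to convolve
# every rendered frame. All names here are made up for the example.
def make_bloom_kernel(aperture):
    # Fraunhofer approximation: far-field pattern = |FFT(aperture)|^2,
    # normalized to unit sum so the convolution preserves total energy.
    pattern = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture))))**2
    return pattern / pattern.sum()

def apply_bloom(frame, kernel):
    # Per-frame cost is just two forward FFTs and one inverse FFT.
    K = np.fft.fft2(np.fft.ifftshift(kernel), s=frame.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(frame) * K))

N = 64
y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
kernel = make_bloom_kernel((x**2 + y**2 < 8**2).astype(float))  # computed once

frame = np.zeros((N, N)); frame[10, 10] = 50.0                  # one bright light
out = apply_bloom(frame, kernel)                                # per-frame step
print(out[10, 10] < 50.0, out[12, 12] > 0.0)  # energy spread to neighboring pixels
```

A real renderer would do this per wavelength/color channel and on HDR data, but the precompute-once, convolve-per-frame split is the same.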
Yes, Bloom is one of those graphics effects I go through the settings of the various games to kill / turn off upon first launch. That and the music is the first to die :D
2:16 Maybe you should look up halation, because this absolutely happens with film, which has been around a heck of a lot longer than CCD and CMOS sensors.
So this is absolutely amazing and explains so much! I noticed that the iris of our pet dog is quite interesting and very different to a human iris. Any chance of running that filter to show what a dog would see?
This has a criminally small amount of views, this channel is a literal goldmine
Wow thanks for the kind words! I appreciate your viewership 🙏
Exactly what I was thinking
This is the kind of knowledge you should get when you study game development in uni
Not how to flip normals, everyone can do that
This is what will really elevate your understanding of the world around you
The channel is growing right now. There isn't much similar content on YouTube at that level of quality
Just chiming in to agree. This channel is indeed a goldmine of extremely useful insight into very niche but interesting subjects.
Sorry but your use of the word literal is criminal ;)
it's absolutely insane how many applications Fourier transforms have: image processing, data compression, audio analysis, just to name a few
I'm genuinely interested in knowing how you came up with the pupil reference image to use as the aperture function. The simulation is really nuts. I had to turn on the flashlight on my phone and noticed it was literally 100% like the rendered simulation.
Impressive work. Just came across your channel and I'm blown away, particularly with the motor sound simulation and your mixed skills involving coding, blender, 3D, programming, raytracing, video editing, talking about stuff... dude there's so much! Congratulations!
the pupil is merely a hole in the camera obscura that is the eye. this kind of hole is called an aperture.
he made (or found) a texture that approximates the natural imperfection of this hole's shape, because if you think about it, the pupil is whatever shape remains unblocked by the iris, which has quite a texture to it because of the intertwined proteins and pigments (aka the pigmented muscular curtain). knowing all this, there are only three features to such an aperture texture:
- general round shape
- very rough edges
- slight area imperfections (slightly exaggerated in his example; I think this helps to emphasize "squinting", aka looking through the eyelashes, but it can be done more consciously)
you can fake such a texture if you understand the optics he describes in this video, just by studying a close-up image of an eye.
My eyelash catching some light to bloom it more
how would you simulate squinted or rested eyes though? the "lens flare" looks nothing like the fully opened one.
Guy makes his own engines from scratch, his own games from scratch, his own coding language from scratch. I've never seen anything like it. It's another level of programming.
Has this ever been implemented in a video game? The effect looks really cool.
I believe UE4 may have had a bloom system based on the convolution technique I showed so it's probably been used in some video games, don't know any examples off the top of my head though
@NordicFrog I'm thinking it's the next step in the bloom evolution. First we had bloom with threshold:
Pretty simple. We threshold the color buffer then downscale it. Then perform a series of vertical/horizontal blurs. Then upscale and add it to the color buffer.
Problem is though, it looks horrendous. (That's why many players disable it first thing)
Next step is non-thresholded, where we directly downscale the color buffer 5 or 6 times, and then progressively upsample again. Finally we add it on top of the color buffer with linear interpolation. Looks better.
I think we are waiting for graphics card evolution a bit more. But beyond that no reason why this shouldn't become standard at some point - it looks way better, and it's more physically accurate.
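For anyone wanting to see the first, thresholded variant concretely, here is a toy Python/NumPy sketch. Everything is an illustrative simplification, not any engine's real pipeline: the 3-tap `box_blur_1d`, the nearest-neighbor scaling, and all the names are made up for the example.

```python
import numpy as np

# Minimal sketch of classic thresholded bloom: threshold the color buffer,
# downscale, blur, upscale, then add back onto the original.
def box_blur_1d(img, axis):
    # Simple 3-tap box blur along one axis (stand-in for the usual
    # separate vertical/horizontal blur passes).
    return (np.roll(img, -1, axis) + img + np.roll(img, 1, axis)) / 3.0

def threshold_bloom(color, threshold=0.8, passes=4):
    bright = np.maximum(color - threshold, 0.0)  # keep only bright pixels
    small = bright[::2, ::2]                     # naive 2x downscale
    for _ in range(passes):                      # alternating H/V blurs
        small = box_blur_1d(box_blur_1d(small, 0), 1)
    bloom = np.kron(small, np.ones((2, 2)))      # naive 2x upscale
    return color + bloom                         # additive composite

img = np.zeros((64, 64)); img[32, 32] = 2.0      # one very bright pixel
out = threshold_bloom(img)
print(out[32, 30] > 0.0)  # neighboring pixels picked up the glow: True
```

The hard-edged threshold is exactly what produces the harsh halo the comment above complains about; the non-thresholded progressive-downsample variant avoids it.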
@@AngeTheGreat UE4 uses an image in the convolution mode
yes look up overgrowth by wolfire games, they used the same paper that i suspect this video is based heavily on. here is the paper: people.mpi-inf.mpg.de/~ritschel/Papers/TemporalGlare.pdf
@angethegreat i think you should at least credit the author
I actually don't have access to that link so I can't comment. To be honest, I mainly followed Wikipedia articles about Fraunhofer diffraction; there isn't much else to this and I don't claim it's revolutionary
Not only did you give a super informative explanation on what bloom is, you also accurately recreated what I see with my own eyes!!! Legit paused the video, turned on my flashlight and saw THE EXACT EFFECT YOU CREATED. Jaw on the ground. incredible
I am blown away by your channel... I really appreciate the effort you put into your videos and the clarity with which you convey complicated information. I recently finished a graduate course in electrodynamics and could *not* wrap my head around diffraction until I came across this video and it all clicked. Seeing the treatment of Fraunhofer diffraction as a two dimensional Fourier transform worked out visually like this across a set of different apertures made it all make sense. I am so thankful I found this channel :)
Wow, thank you for the kind words! It especially means a lot to me on this video because I spent a lot of time trying to figure out a good way to explain a somewhat counterintuitive phenomenon. I'm really glad that it worked out (for you, at least) and I really appreciate your viewership and comment. Welcome to the channel!
Are you sure of having completed a graduate course in Electrodynamics? What he talks about in the video is pretty basic and every UNDER graduate Physics student should know about it and the causes behind it.
Really well explained. I had never really thought about this before.
Before I wrote this simulation, I hadn't thought much about it either haha... thanks for watching!
I just discovered your channel a few days ago and am absolutely fascinated by your amazing work and great but compact explanations of complex topics. Keep it up!
Welcome to the channel and thanks for the kind words!
This was super interesting!
Glad you liked it and thanks for watching!
I just want to tell you that I have been (passively) looking for exactly this information for a long time, so thanks a bunch.
That's great! I'm glad you got something out of the video and thanks for watching!
holy shit your channel is so incredibly underrated
Glad you like the channel and thanks for watching!
@@AngeTheGreat Legit re-watched this, it's super entertaining and also interesting. Thank you.
this channel is extremely underrated. these videos are both interesting and explain enough to understand without making you feel like you didn't understand one bit, which is awesome.
tl;dr: this channel is like Kurzgesagt but with coding, and that's neat
This is and the motor engine simulation are my favorite vids of the channel so far.
You shine at going in-depth.
I would absolutely 100% like to see more videos like this
I have found my new favourite channel!
Your stuff is amazing... it's absolutely wild seeing you take a real world phenomenon and making its almost perfect digital recreation look so simple...
you're truly an amazing coder!
This is incredible, fantastic explanation of the Fourier transform of an image. The eye result looks brilliant, much like a pupil
your production quality is straight up stunning. not for a channel of your size, but in general I wouldn't have expected to find such great videos randomly on my start page. truly amazing work, much appreciated
10:50 The human eye looks incredibly interesting. I am an architectural visualizer and I'm always chasing after the newest tech.
Bloom in most renderers is kind of meh. What they do is take a low-res version of the image, blur it a bit and add it to the original. Then blur it some more, add it again, and so on and so on. This does give a nice and smooth falloff effect, but this looks MUCH more pleasing and realistic.
Are you aware of any programs/renderers that currently tackle bloom like this? Is it even possible to do so with the current photon-based raytracing engines (which is basically every big unbiased render engine)?
And if not, is it possible to apply this in some sort of way as a post step on a 32-bit HDR image (you obviously need the extra dynamic range to get this to work properly)?
If a solution like this became available where you could just choose an alpha mask of an aperture and get insanely realistic bloom like this, I am sure a lot of 3D artists chasing after photorealism would be interested.
BTW, MORE CONTENT LIKE THIS! This type of in-depth content is a literal goldmine to people who are chasing after photorealism in art!
Afaik Unreal uses this partially. Doing this in real time can be expensive, so they probably use a mishmash.
The calculations are all post-process effects anyway, unless you simulate the diffraction in the lens (which would be incredibly expensive and unnecessary). You only need the brightness and colour data of your render to perform these calculations, and yes, you probably need HDR data for more range to make it look good.
But yeah, it would be enough to code a small program that takes your HDR render and applies the bloom, without the need to change the raytracer. It's all post-process.
I'm a compositor for TV/Film in the VFX industry, and we use software called Nuke. It has a node (effect/tool) called convolve, which takes an input image and convolves it with a kernel image (can be anything, but usually is an image of an aperture). Very common thing to do, because it's super valuable for making various lens effects - bokeh, glare, flares etc. Can do a lot of really creative things with it (and obviously Nuke by extension - every film you watch these days that has any sort of VFX will have used Nuke).
Nuke works internally in 32bit float, so you have a tonne of dynamic range to work with. So yeah that software exists :)
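As a rough illustration of what a convolve-style operation does internally, here is a generic FFT convolution sketch in Python/NumPy. This is not Nuke's actual implementation; the cross-shaped "flare" kernel is made up, where a real setup would load an aperture or flare image as the kernel.

```python
import numpy as np

# Convolve an (HDR) image with a kernel image via the convolution theorem:
# multiply their spectra, then transform back.
def convolve_fft(image, kernel):
    # Circular (wraparound) convolution is fine for a sketch; production
    # code would pad the image to avoid wraparound at the borders.
    K = np.fft.fft2(np.fft.ifftshift(kernel), s=image.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * K))

H = W = 64
kernel = np.zeros((H, W))
kernel[H//2, :] = 1.0; kernel[:, W//2] = 1.0  # made-up cross-shaped flare
kernel /= kernel.sum()                        # normalize to preserve energy

hdr = np.zeros((H, W)); hdr[20, 20] = 100.0   # single super-bright "light"
streaks = convolve_fft(hdr, kernel)
print(streaks[20, 50] > 0)  # horizontal streak runs through the light: True
```

With a high-dynamic-range input, every bright source gets a copy of the kernel stamped onto it, which is exactly why this one operation covers bokeh, glare, and flare looks.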
@@PeterJansen That's interesting! I am familiar with Nuke and how much it's used in CG, but I've never used it since I work mostly in 3D and create stills. I never really touch compositing except for Photoshop.
Sounds like something cool I might be able to try though! Thanks!
Only thing I'm thinking is how to tackle tonemapping, but I'm sure I can figure it out haha
Dude, I feel sad that this hasn't got more views 😭
Keep it up tho this is awesome stuff!!!
When you showed the pupil diffraction pattern in the beginning I wondered where you got that image from. Was blown away when you brought out the pupil in simulation. Loved this video!
I just found your channel today; THIS IS INCREDIBLE!! I look forward to seeing how far you go with the physics simulations, they're mind-boggling. I wonder how much computing power you would need to combine all of them, along with a ray tracer, into the most physically accurate game / simulation ever.
I do a lot of audio programming, and so it brings me a ton of joy to see a different "artistic" application of the FFT. Most examples are either "drawing a picture with circles", or the spectrometer you showed before. FTs are awesome, incredibly powerful, and are the backbone of so much of our technology, both scientific and consumer.
Dude... this is definitely the best video I've watched in years. I had such a big smile on my face because of how realistic everything looked. I'm a sucker for these things 😂
Still one of the best simulation videos on this platform, wish you'd do more like this :)
Another reason bloom can happen, mainly for humans, is that light is not going perfectly through the eyeball. It is being scattered a bit inside the eyeball.
HOLY SHIT YOUR CHANNEL IS A BLESSING TO THIS WORLD
I'm amazed how such relatively simple mathematics can simulate a pupil's "lens glare" so accurately. When I saw it actually rendered I was blown away
THIS IS THE GUY WE NEED ON STACK OVERFLOW
This channel is a hidden gem, I'm so glad I stumbled over it
Glad to have you and thanks for watching!
Absolutely amazing. Especially incredible that such a difficult to describe effect can be distilled into just a few simple operations.
Once in a blue moon channel with fascinating content, You sir are a legend!
You are the best engineer i have ever seen on youtube by far ! Keep up the good work please
What you are simulating is diffraction from an aperture, while most bloom is caused by optics that soften up the image, causing it to bleed over to other pixels. Think of it as a clean glass vs the same glass that you've breathed on, which diffuses the light. So while your diffraction stuff is really interesting to see and really well explained, it's only part of the story :)
why does this still have only 22k views... this should be at least a million by now
Crazy how accurate this is! Really well done!
That was fascinating. The simplicity in the core simulation that results in such complex patterns astounds me. Thanks for making this video
I just discovered your channel and am frankly blown away by your content! Please, keep going and you'll have a million subs in no time!
Wow this is incredible. Easy to understand and well presented.
Dude, your content is amazing!! I love physics and computer science as well, and your videos are inspiring. Thanks for your time and efforts!!!
this video is so good I'm surprised your channel is under a million subs. hope it changes soon
Could this be implemented as an add-on for Blender? Especially the human pupil aperture
Your channel is really going to take off one day, don't give up!
What an incredible work!
Maybe this program could simulate how people with astigmatism see their own bloom. That would be interesting to see.
I'm completely hooked on your channel. I'd love to implement this shape-to-diffraction-pattern convolution in Nuke. Sadly I'm in no position to code. Maybe it's time to learn some BlinkScript. I'd really love to implement this. Awesome video!
There is another guy in Australia who writes his own games and game engine stuff in C++. The Cherno is his channel. He talks about bloom as well, with more on implementing it.
this video was super fun to watch!
That's actually incredible! Awesome video!
This was so interesting. I feel like I could have listened for hours if you delved even deeper into this phenomenon and related subjects 👍
amazing job, man. It never would have occurred to me that a smudged lens has anything to do with the double slit experiment.
Wow, I'm amazed at how incredibly clear your presentation of this was! Super interesting content, is this bloom method implemented in your MantaRay renderer?
Really cool and entertaining video. Hope this video gets blessed by the YouTube algorithm!
Thank you! Let's hope so, I worked hard on this one haha :')
loving the vids and explanations, thanks!
I do enjoy content like this. I just stumbled across your channel and subscribed straight away.
This is incredible
The way they usually calculate bloom in post-process is by convolving a diffraction pattern of a single point light source with the image. This way one can define that diffraction pattern directly, bypassing the aperture that it corresponds to, and then directly use that diffraction pattern to generate the post-processed image.
However, for most realtime applications, doing FFT -> convolution -> inverse FFT is usually too slow, so they use approximations of convolutions based on mipmaps: even though algorithmically the FFT has the same asymptotic complexity as the mipmap-based approximations, in practice the constant in that O(N log N) is much higher, and FFT-based proper bloom ends up being an order of magnitude slower than mipmap-based approximations.
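A toy version of the mipmap-based approximation described above might look like the following Python/NumPy sketch. It is illustrative only: real engines use filtered hardware mips and carefully tuned blend weights, not this naive 2x2 box filter, and the `blend` parameter here is made up.

```python
import numpy as np

# Progressive-downsample bloom: build a mip chain of the color buffer, then
# walk back up the chain, blending each level into the next-higher resolution.
def downsample(img):
    # 2x2 box-filter downsample (stand-in for a hardware mip level).
    return 0.25 * (img[::2, ::2] + img[1::2, ::2] + img[::2, 1::2] + img[1::2, 1::2])

def upsample(img):
    # Nearest-neighbor 2x upsample (real engines use bilinear/tent filters).
    return np.kron(img, np.ones((2, 2)))

def mip_bloom(color, levels=4, blend=0.5):
    chain = [color]
    for _ in range(levels):              # build the mip chain
        chain.append(downsample(chain[-1]))
    acc = chain[-1]
    for mip in reversed(chain[:-1]):     # walk back up, blending each level
        acc = mip + blend * upsample(acc)
    return acc

img = np.zeros((64, 64)); img[32, 32] = 16.0   # one very bright pixel
out = mip_bloom(img)
print(out[40, 40] > 0)  # glow reaches far from the source: True
```

Each downsample is O(N), so the whole chain is O(N) per frame with tiny constants, which is why this wins over the O(N log N) FFT route in real time even though the FFT gives the physically correct kernel.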
I loved your channel! It would be very nice to implement the bloom effect as a Blender compositor node. It would be a great addition.
Amazing!
Interesting, I always thought it was just an effect caused by the camera sensor or our eyes.
Yeah me too lol... I was most surprised by how I never really thought too much about this and it's never really mentioned in science classes or anything. Hope you liked the video and thanks for watching!
@@AngeTheGreat I really like your videos. Your channel is hugely underrated!
Yay, a smart person who does graphics! :D :D :D
Smart person where?? 👀 Haha thanks for watching 🙏
Could this be used to simulate vision with an eye that has higher order aberrations as a result of refractive surgery?
I believe that you also can get the light polarization effect (wiki: Haidinger's brush) if you do not discard the imaginary part of your Fourier's transform.
One of the best chanels
That was beautiful, sir
Wow. Love it!
Thank you!
Learning something new every day. Thanks :]
Thanks for watching and glad you enjoyed it!
Man. You are brilliant.
I swear, if you had a AAA game company where all devs had the same knowledge as you, the games you would be able to create would be insane
I always thought ray tracing was a gimmick, and that instead someone should create wave tracing; I can't program, so I could never create it myself.
Great to see someone actually making a realistic simulation instead of using rays.
@ModuMaru But its not, its using waves.
A light source only turns into a Ray when you Physically observe the object, otherwise it is giving off Waves.
You Manifest Reality by physically observing the object.
@@driatrogenesis Light is a wave. In fact, light is a byproduct of sound. Your eyes send rays towards objects and when it interacts with light it turns into a ray.
the feeling you get that you are being watched is because you can feel the rays hitting you.
@ModuMaru thank you, I've been looking for something like this.
This video is just incredible
very insightful.
Wait, so I'm seeing a 2D Fourier transform of my pupil shape when I look at a point source of light? Genuinely mindblowing stuff
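Pretty much! In Fraunhofer (far-field) diffraction, the point-spread function for a single wavelength is proportional to the squared magnitude of the aperture's 2D Fourier transform. A toy numpy sketch of that relationship (assuming a binary pupil mask and one wavelength; this is my own simplification, not the video's code):

```python
import numpy as np

def pupil_psf(aperture):
    """Far-field diffraction pattern (PSF) of an aperture mask.

    `aperture` is a 2D array: 1 where light passes (the pupil),
    0 where it is blocked (the iris). The starburst you see around a
    point light is essentially this pattern.
    """
    field = np.fft.fftshift(np.fft.fft2(aperture))
    psf = np.abs(field) ** 2
    return psf / psf.sum()  # normalize to unit total energy
```

Feed it a rough-edged circular mask and you get exactly the kind of streaky starburst the video shows.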
What is bloom?
Baby don't hurt me... Don't hurt me ... NO MORE !
An extremely fundamental topic, for everyone with eyes.
The Blender demo you used was rendered with Eevee, which isn't a ray tracer, but it's pretty close and usually runs much faster than the ray-traced counterpart
Eevee is not a ray tracer, you are correct. I was mainly just showing how most renderers typically do bloom with post processing. A ray tracer would need to do the same steps as well
I've always wondered about that; my guess was diffraction, since the patterns seemed to come from the gaps between eyelash hairs, as well as the eyelid slit and the texture of the eye.
Eyelashes and eyelids (pretty much anything that can obscure view) will definitely have an effect on the diffraction pattern!
It's fucking beautiful!
Annnnnnnnnnd subbed.
2 vids in 2 weeks?
christmas came early :P
It's a miracle! Thanks for watching, hope you enjoyed the video :)
@@AngeTheGreat as always, it was a pleasure
Wow, I learned so much and this is really cool! I wonder if it would be possible to get this to realtime for games. I know it's overkill, but it would be impressive :D
mind blown, wp sir!
you're brilliant
As someone with an astigmatism, yes yes I have looked at a bright point and seen that
Genius !
Science!
Nice explanations!
Nice
😎
If you have astigmatism, you'll also get blurry radial lines off point lights
8:17 I watched Apocalypse Now yesterday evening and saw the same bloom octopus arms in a night scene where a headlight was shown
1:20 Though, this can be done intentionally in computer graphics in order to simulate an object that's brighter than the screen can display -- to make it so the total integrated intensity of the spot is still correct even if the spot size is not correct.
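A toy numpy illustration of that energy-conservation idea (spreading the over-bright excess uniformly into a 3x3 neighbourhood is my own simplification; real bloom would use a proper kernel):

```python
import numpy as np

def spread_overbright(image, limit=1.0):
    """Redistribute intensity above `limit` into neighbouring pixels.

    Plain clipping discards the excess energy; spreading it keeps the
    total integrated intensity of the frame unchanged while individual
    pixels stay near the displayable range.
    """
    excess = np.maximum(image - limit, 0.0)
    base = image - excess  # same as np.minimum(image, limit)
    spread = np.zeros_like(image)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            spread += np.roll(np.roll(excess, dy, axis=0), dx, axis=1) / 9.0
    return base + spread
```

A pixel at 10x the display limit ends up as a small bright blob whose summed intensity still equals 10, instead of a single clipped pixel that lost 90% of its energy.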
Could you give a bit more detail about your scripting language, since I'm interested in implementing a scripting language like yours? For example, whether you used LLVM or your own implementation, your own parser or a parser generator, etc. (By the way, I just discovered your channel and really liked your content: the visualizations, the explanations... And sure, I subscribed naturally.)
Really cool stuff! How performant is it tho? And is it implementable on the GPU (can be parallelized)? :)
Thank you and thanks a lot for watching!
I would say that it is "reasonably" performant for the purposes of ray tracing. The calculation of the diffraction pattern takes the longest, especially if the color resolution is high, but it usually only needs to be calculated once per camera/lens (and could theoretically be cached between sessions). I can think of some ways to parallelize this calculation with a GPU, but it wasn't useful for this particular project.

The convolution between the diffraction pattern and an image frame is reasonably fast and depends on the speed of the FFT implementation. One might be able to implement a 2D FFT on the GPU, but I don't know much about this myself. Apparently UE4 offered a real-time bloom simulation that was similar to this, but I don't know the details of their implementation.

Hope that answers your question! If you're interested, the code for this project is open-source and part of the MantaRay ray-tracer on my GitHub page (link in description).
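For anyone wanting to compare against the FFT approach, the mipmap-style realtime approximation can be sketched roughly like this (a naive CPU toy with box filters and nearest-neighbour upsampling; real engines use filtered taps on the GPU and blend the result in with a lerp):

```python
import numpy as np

def mip_bloom(image, levels=5, strength=0.1):
    """Approximate bloom by progressively downsampling the frame, then
    summing the re-upsampled mip levels back on top of it.

    Each halving widens the effective blur radius, so a few levels give
    a cheap, wide glow with no FFT at all.
    """
    def down(img):  # 2x box-filter downsample (even dimensions assumed)
        return 0.25 * (img[0::2, 0::2] + img[1::2, 0::2]
                       + img[0::2, 1::2] + img[1::2, 1::2])

    def up2(img):  # nearest-neighbour 2x upsample
        return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

    mips, mip = [], image
    for _ in range(levels):
        if min(mip.shape) < 2:
            break
        mip = down(mip)
        mips.append(mip)

    glow = np.zeros_like(image)
    for i, m in enumerate(mips, start=1):
        for _ in range(i):  # upsample back to full resolution
            m = up2(m)
        glow += m[:image.shape[0], :image.shape[1]]
    glow /= max(len(mips), 1)
    return image + strength * glow
```

The whole chain touches O(N) pixels (a geometric series of shrinking buffers), which is why it beats the FFT's constant factor so easily, at the cost of a less physically accurate kernel shape.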
Yes, Bloom is one of those graphics effects I go through the settings of the various games to kill / turn off upon first launch.
That and the music is the first to die :D
But those both improve the feeling/atmosphere of the game
@@circuit10 agreed
💯 awesome
This video isn't just about bloom, it's also about lens flare
2:16 Maybe you should look up halation, because this absolutely happens with film which has been around a heck of a lot longer than CCD and CMOS sensors.
Oh interesting, never heard of that effect. That would be cool to include in a film simulation
@@AngeTheGreat Film simulation would be a guaranteed hit
That is pretty much what my eyes see. Except I also get an outer rainbow halo around bright lights at night.
So this is absolutely amazing and explains so much!
I noticed that the iris of our pet dog is quite interesting and very different to a human iris. Any chance of running that filter to show what a dog would see?
Now I want to simulate the effect of lasik on my eyes
11:01 Man just has a recording from his own eyes :)
amazingly true
11:56 The phone also applied image-enhancement algorithms :)
Can this be implemented in a path-tracing render pipeline like Cycles to achieve a photorealistic bloom effect?
What if you had been my computer science and physics teacher back in my school days 🤔🤔
But thanks to Allah, I found you now... 🤩🥰