Great video as always. Could you make a video about half/quarter-resolution bilateral upsampling techniques? Rendering into low-resolution passes has become super important in recent years, but the crux always lies in a proper upsampling pass imo (I'm thinking of half/quarter-res SSR, AO, volumetrics, etc. upsampled to full res while preserving details from the full-res pass). Thanks anyway
This was awesome to hear. I've always wanted to be a game developer because you can bury yourself in making algorithms like this for seemingly simple functions.
I think UE's layered thing is mainly to approach the exponentiality of light decay ergo bloom. Blurred plus'd bloom looks weird because there is no proper light decay. I think. I'm a compositor not a gamedev lol.
I was starving for your videos. Your way of explaining is so exciting and concise. Sometimes I think I could try to get a job at Google after watching this :). Please, keep going.
I kinda expected a video describing the phenomenon of bloom in the real world instead of a video describing how it's usually implemented in video games. Knowing how it works in the real world would certainly help make it look more convincing. I turn effects like this off whenever I can, especially considering my eyes and glasses are already doing it with zero performance impact.
No, wait a minute! What about the light sabres and blaster pistols and rocket engines in Star Wars? That was the first movie to blow our minds with bloom, not TRON.
Awesome video! How does the Source 2 engine handle it though? Games made with the Source engine always seem so timeless (Portal 2, for example); when you remove the lighting it looks very old, but with lighting it still looks quite modern. I'm guessing this is partially due to subtle bloom effects, but I'm not sure.
source1 games have pretty much no bloom, actually. or well, they do, but it's extremely small and weak and barely contributes to the picture at all; the glow effects are achieved via simple additively-blended sprites. the baked lightmaps are what make the games you're thinking of look deceptively good for their age. not sure about source 2 though. it's *probably* just a variant of one of the techniques mentioned in the vid? it does look damn good, that i can say with certainty. HL:A partially looks so realistic because of the bloom, which just feels so right and accurate to how i see it in real life, so they absolutely nailed it, whatever they did
@@randomcatdude interesting, really impressive lighting for 2011 (in the case of Portal 2), and games that run on modified versions of Source (like Titanfall 2) also look great. I wonder what would happen if steam ever made a Source 3 engine
@@mki443 titanfall 2 completely rips out the old source renderer and uses its own, which is overall way better. it has proper bloom, better dynamic lighting support, and even improves the baked lighting. source 2 is fairly similar in terms of improvements. also, steam is just the name of the game platform that Valve owns, it's not a company/developer.
@@randomcatdude really? That's interesting, especially considering the age of Source; never expected Titanfall to rip out everything (although it is a pretty unique game in quite some ways). About the steam/valve thing, I knew that but was in a hurry writing that message
Hey man, I really enjoy your videos. Always super exciting to see one in my recommended. Truly some of the gem videos on YouTube. Just a suggestion, take it or leave it: way out of my domain, but I was looking into volumetric lighting in smoke and stuff. I was thinking it would be a cool challenge to try to make some realistic smoke (not fog, I mean visible smoke that drifts from a point like a particle system) that interacts well with light. Either with volumetrics, particles, or other techniques. Something I've been wondering about for a while.
@@simondev758 That's true. I know you work a lot in java, would implementing that into an engine like Unity or Unreal just be using shaders? (I'm pretty new to vfx and shaders and didn't know how this would fit into pre-existing game engines)
@@photonpotato2490 I mostly do JS because it's convenient, but I'm actually a C++ dev. Implementing these into an existing engine requires writing both some client-side code and the shaders, so it's more involved. You've gotta know a bit more about your target engine to integrate the effects.
@@simondev758 Ok, I'll look more into that. I also was wondering whether knowledge from your shader course in GLSL will translate over to HLSL as well (because I'm trying to learn HLSL shaders in Unity). Thanks so much for the help man! Looking forward to your next video.
Modern GPUs contain fast fixed-function hardware to compute some version of the FFT, and its inverse too: the JPEG, H.264 and H.265 codecs (strictly speaking they use the DCT, a close cousin of the FFT). I wonder if it would be possible to abuse these pieces of hardware for the bloom?
tip for pronouncing romanized japanese names or words: the vowels are always pronounced only one way. "a" is always "ah", "e" is always "eh" (like "send"), "i" is always "ee" (like "seen", same way you'd read an i in spanish). so the name "kawase" is "kah wah seh". it's respectful to double check this stuff before misreading someone's name, ideally.
My implementation isn't production ready, so I'm hesitant to profile it. Doing it at 1/4 resolution seemed to be plenty fast though, and still looked good (handwavy).
On the other hand, doing an FFT is intriguing for some reason. I think I'll do it one day, maybe in Python, for reasons. Maybe make a competitor to Melodyne, bcos he needs a competitor!11
Hello, I really love your channel and your content. I have enrolled in your shader course and I really enjoy it! I have one question that has been bothering me for quite some time when I watch your videos, and I really mean no offense by it! Do you use a computer voice when recording your videos? I guess in this day and age it must not be unlikely. Either way, thank you for your educational efforts!
9:24 yet another reason I seem to prefer unity over unreal. The bloom effect from Unity is more subtle and makes more sense, but unreal is just another iteration on the old style. But in general I just prefer C# to C++ and blueprints. And the way that unity deals with components compared to unreal's version. I really wish it was easier to find a unity job.
I always thought games render the frames twice, with normal brightness and lower brightness (say factor 2) and then use a threshold of 50% on the darker frame to get the parts which are brighter than 100% on the normal frame to calculate the bloom. For the darker frame you could also render it in a lower resolution. 🤔
@@simondev758 I haven't heard of it either. 😄But I couldn't think of any other way to create the bloom only in the places where the image was really brighter than the possible value of 255.
You don't need to render everything twice for that though. Just get 2 FBOs and render to both of them in only one pass. However, if you are going with this approach, it's probably better to just store color in HDR instead of LDR, for example 16-bit floats instead of 8-bit ints. This will take up twice the memory, but your approach will do that anyway + more. And instead of having a maximum value of 200% to find the bright parts, you get ~6.55*10^6 %. This will enable extremely bright parts to have even more bloom than regular really bright parts.
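The bright-pass idea in this thread can be sketched in a few lines. This is a hypothetical pure-Python illustration (the `bright_pass` helper and the sample values are made up, not any engine's actual code); in a real renderer the same logic runs per pixel in a fragment shader over the HDR buffer.

```python
# Bright-pass filter over HDR pixel values (floats that may exceed 1.0).
# Energy above the threshold feeds the bloom blur; the rest is discarded.

def bright_pass(pixels, threshold=1.0):
    """Keep only the per-channel energy above `threshold`."""
    return [tuple(max(c - threshold, 0.0) for c in px) for px in pixels]

hdr_frame = [
    (0.25, 0.5, 0.75),   # dim pixel: contributes nothing
    (1.5, 2.0, 1.0),     # bright pixel: only the excess blooms
    (8.0, 8.0, 8.0),     # very bright pixel: blooms strongly
]
bloom_src = bright_pass(hdr_frame)
print(bloom_src)  # [(0.0, 0.0, 0.0), (0.5, 1.0, 0.0), (7.0, 7.0, 7.0)]
```

With 8-bit LDR buffers the darker-frame trick above is needed because values clip at 100%; with 16-bit float buffers the threshold can sit anywhere up to roughly 65504 (the half-float maximum), which is where the ~6.55*10^6 % figure comes from.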
@@NicosLeben Way back on XBox360, I think I played around with storing kinda a light intensity value in the alpha channel to help bloom, so while rendering out you could optionally throw an intensity out into the alpha channel.
Definitely worth exploring. I recorded a lot of the video by downsampling the main image to 1/2 and 1/4 resolution, doing the FFT there, and upsampling the result (kinda a mix of all the techniques). Didn't profile it though.
i'm curious if there's some way to (mathematically/arithmetically) simplify the fourier -> multiply by specific frequency-space texture -> inverse process to maybe make it a little faster for some specific filter. especially when you're multiplying by 0 in a bunch of places, but sadly i do not actually have the math chops to poke at that any time soon.
Love your content and this video, but what do you mean that there isn't a correct way of doing it? Imo you need linear space, where you could take a threshold that would be "maximal capacity of the capturing sensor", take what is over that and "bloom" it in a way that simply spreads said overflowing values around (with some loss of energy), no? (Performance aside :D ) IMO the issue is the fact that we are using blur for it, because that doesn't really capture the lightbleed effect of IRL bloom.
Any chance you will share the FFT algorithm implementation you've used? It would be very helpful. I've spent so much time on this with very little luck. I've managed to transform an image back and forth using FFT, but I keep getting some artefacts. Like, on some camera angles the whole image suddenly disappears. On some GPUs I get crazy artefacts and I don't know why. Also, how do you deal with non-square images?
4:38 I think this is wrong. Box blur is single-pass, on the same image. Gaussian blur is done in 2 passes, "ping-ponging" between two identical copies of the same image, once for horizontal and once for vertical. The box blur uses a kernel (3x3, 5x5, 9x9, etc), while the Gaussian blur does not, since it samples either vertically or horizontally a certain number of pixels (since that's where the "strength" of the blur comes in).
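For what it's worth, a Gaussian blur is separable, so both 1D passes do use a (1D) kernel. Here's a small pure-Python sketch (illustrative only; a 3-tap binomial kernel stands in for a true Gaussian) showing that a horizontal pass followed by a vertical pass spreads a bright pixel exactly like the equivalent 2D kernel would:

```python
# Separable blur: a 2D Gaussian-like blur done as a horizontal 1D pass
# followed by a vertical 1D pass, turning k*k taps per pixel into 2*k.

KERNEL = [0.25, 0.5, 0.25]  # 1D binomial approximation of a Gaussian

def blur_1d(row, kernel=KERNEL):
    r = len(kernel) // 2
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(row) - 1)  # clamp at the edges
            acc += w * row[idx]
        out.append(acc)
    return out

def blur_2d_separable(img):
    h_pass = [blur_1d(row) for row in img]         # horizontal pass
    cols = [blur_1d(col) for col in zip(*h_pass)]  # vertical pass on columns
    return [list(row) for row in zip(*cols)]       # transpose back to rows

img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0  # a single bright pixel
blurred = blur_2d_separable(img)
# The pixel spreads into the 3x3 outer product of the kernel:
# the centre ends up at 0.5 * 0.5 = 0.25, and total energy is preserved.
```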
I thought about it randomly and came here to comment this. Isn't it true that by FFTing the image you could combine a bunch of FFT-able effects in one go? (like how you can multiply transformation matrices and apply them all at once). Say, a sharpen effect, and a bloom effect, all from a single FFT round trip. Dunno what else, but probably something!
Bloom reminds me so much of the PS3. PS3 games also have a specific dithering in their shadows or in transparent stuff. I'm not sure what it is? You can see it in the animated menu as well.
It's not really in the video, sorry :) you can see it in the main menu of the PS3, the waves use dithering I think? A lot of shadows in PS3 games also look unstable.
It seems like the convolution with a texture can be easily approximated with a good flare effect, can't it? With cheap cost. In my game, I find a good flare effect adds better variation, feels more alive than trying too hard on the bloom effect.
Patrons can now vote for the next video! Thank you for your support.
❤ Support me on Patreon: www.patreon.com/simondevyt
🌍 Live Demo + Courses: simondev.io
I'm almost surprised there isn't just a hardware implementation for FFT in modern GPUs, it seems like it would be useful for many things
I bet a whole lot of new applications would emerge too
There probably just hasn't been a big market for it yet.
Recent new features like AI acceleration gain more possible applications every day, faster raytracing is great for the gaming industry, but doing many FFTs in parallel is probably not a common use in the mass market. But maybe it'll just take a new shiny hammer for developers to look for more nails, if it can help with blur & bloom filters I'm certain people will find more clever applications for it.
@@SaHaRaSquad I don't think the point would be to make many FFTs in parallel, but just to make FFTs of decently sized data *really* fast.
Changing representation through the Fourier transform is a really powerful tool in general, it crops up from these bloom effects all the way to quantum mechanics. As mentioned in the video, it can turn convolutions (whose cost grows with kernel size) into multiplications (kernel-size independent), so I'd definitely bet that if transforming between representations was really fast, people would find all sorts of uses for it.
That being said, the FFT as an algorithm is already pretty fast as far as I know, and there are parallelized versions which are in fact run on GPUs (say, using CUDA), so maybe there's just not that much to be gained by going hardware-level, compared to the costs.
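The convolution-theorem point above can be verified with a toy example. This is a minimal textbook radix-2 FFT in pure Python (illustrative, not production code): multiplying two spectra pointwise and transforming back gives exactly the circular convolution of the inputs, regardless of kernel size.

```python
import cmath

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return [even[k] + twiddled[k] for k in range(n // 2)] + \
           [even[k] - twiddled[k] for k in range(n // 2)]

def ifft(X):
    """Inverse FFT via the conjugation trick."""
    n = len(X)
    y = fft([v.conjugate() for v in X])
    return [v.conjugate() / n for v in y]

def conv_circular_direct(a, b):
    """O(n^2) circular convolution, as a reference."""
    n = len(a)
    return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

a = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]           # "image" row
b = [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.25]         # blur kernel
via_fft = [v.real for v in ifft([p * q for p, q in zip(fft(a), fft(b))])]
direct = conv_circular_direct(a, b)
# via_fft and direct agree to floating-point precision.
```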
NVIDIA provides an FFT library that runs on CUDA cores, called cuFFT, aimed at industry for signal processing and computational physics. It's claimed to be highly optimized.
While not hardware implemented, both cuFFT and vkFFT are extremely fast on common hardware.
Using FFT for blurring and bloom is really awesome, I'm definitely going to play with that
That's the most interesting use of Final Fantasy Tactics I have heard. ;w;
FFT is ... well, fast. There's no special hardware needed, honestly. You can do full-screen FFT bloom in under 100µs.
@@dexio85That seems to go against what Simon told us, that it's a heavy operation
@@Mystixor We used it in the water simulation in The Witcher 3 back on the PS4 10 years ago, and it was less than 1 ms. No, it's not heavy.
"Back in the day, we developers thought it looked awesome and so next-gen, and it kinda did, so how about shut up." Amazing. Thank you for all the info, btw!
Using the FFT for image processing is really interesting, something I will definitely look into.
One pedantic note though: the Karis average is actually for suppressing bright pixels, not preserving them. The Karis average helps with fireflies and shimmering artifacts by preventing really high HDR values from overwhelming the ultimately very small sample set of pixels.
hey its the guy with all the cool ffxiv shaders!!
Oupelaï, gotta check things over a bit more thoroughly.
Btw, fan of your channel :)
@@simondev758 Thanks!
It's a strange coincidence that I am also bringing up the karis average in my next video lol, I thought it preserved bright pixels as well but when I implemented it I got the exact opposite results that I wanted
@@Acerola_t Totally admit I glanced at the code and made a guess at what it did without thoroughly checking heh, glad to see I'm not the only one
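For reference, the Karis average discussed above can be sketched like this (a hypothetical pure-Python illustration using Rec. 709 luma weights, not any engine's actual implementation): each sample is weighted by 1/(1+luma), so a single huge HDR "firefly" can no longer dominate the average.

```python
# Karis average: weight each sample by 1 / (1 + luminance) so one extremely
# bright HDR sample (a "firefly") can't dominate the downsampled average.

def luma(rgb):
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 luma weights

def karis_average(samples):
    weights = [1.0 / (1.0 + luma(s)) for s in samples]
    total = sum(weights)
    return tuple(sum(w * s[i] for w, s in zip(weights, samples)) / total
                 for i in range(3))

def plain_average(samples):
    return tuple(sum(s[i] for s in samples) / len(samples) for i in range(3))

quad = [(0.5, 0.5, 0.5)] * 3 + [(100.0, 100.0, 100.0)]  # one firefly
print(plain_average(quad)[0])              # 25.375 -- the firefly dominates
print(round(karis_average(quad)[0], 2))    # 0.99   -- the firefly is suppressed
```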
You can use FFT to remove the repeating paper texture from old scanned pictures or to remove halftones, and it WORKS
I love your explanations, they're so clear and concise, it really helps me understand these complex topics better even without a formal education. thanks so much for what you do :)
FFT-based glare is the same method I implemented for Megamind. It turns out that Fraunhofer diffraction through an aperture is essentially equivalent to an FFT. With that and making sure you rescale appropriately for each wavelength of light, you get the point spread function, which you can inverse-FFT and use to convolve. But since Fraunhofer diffraction is equivalent to an FFT anyway, the inverse FFT just gets you back to the simulation of your aperture, and you can just use that as your convolution filter... This means you can simulate the bloom based on a lens or even a human eye as long as you can adequately simulate the aperture. Yes Fresnel diffraction is more appropriate for an eye because of the short distance between the pupil and retina, but it's still pretty convincing in producing a glare that looks like glare you'd personally experience through your own eyes.
It would be cool to see FFT blur be practical in realtime... It is just plain simple and generic in the same way that ray tracing is.
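The aperture/FFT equivalence described above can be illustrated with a 1D toy: the far-field (Fraunhofer) intensity of a slit is, up to scaling, the squared magnitude of the aperture's Fourier transform, i.e. the classic sinc² pattern. A naive O(n²) DFT keeps the sketch short (purely illustrative numbers, not a physically calibrated simulation):

```python
import cmath

# Fraunhofer far-field intensity of an aperture is (up to scaling) the
# squared magnitude of its Fourier transform. A 1D slit gives the
# classic sinc^2 pattern: a bright centre with periodic nulls.

def dft(x):
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

N, slit_width = 64, 8
aperture = [1.0 if i < slit_width else 0.0 for i in range(N)]  # open slit
intensity = [abs(v) ** 2 for v in dft(aperture)]

print(intensity[0])             # 64.0: DC term = slit_width^2
print(round(intensity[8], 6))   # 0.0: first null of the sinc^2 pattern
```

Repeating this per wavelength (with the rescaling mentioned above) is what builds the coloured point spread function.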
Super cool, I loved that movie!
I feel like we're right on the edge of using it in realtime in games. I didn't profile the quarter res fft, but it seemed pretty quick (handwavy).
You worked on megamind???
FFTs are super powerful. They are commonly used in signal processing to do essentially the opposite of bloom: remove artifacts of the sensor. This reverse process, so-called deconvolution, can for example remove the 6-star artifacts on the JWST images.
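A toy sketch of that deconvolution idea (pure Python, naive O(n²) DFT, and the idealized noise-free case with a kernel whose spectrum has no zeros — real deconvolution like the JWST case needs regularization, e.g. Wiener filtering):

```python
import cmath

# Deconvolution sketch: if blurred = circular_conv(signal, kernel), then
# dividing the spectra recovers the signal, since convolution in space is
# multiplication in frequency space.

def dft(x, inverse=False):
    n, s = len(x), (1 if inverse else -1)
    out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * j * k / n) for k in range(n))
           for j in range(n)]
    return [v / n for v in out] if inverse else out

signal = [0.0, 1.0, 0.0, 0.0, 3.0, 0.0, 0.0, 0.0]
kernel = [0.5, 0.25, 0.0, 0.0, 0.0, 0.0, 0.0, 0.125]  # spectrum has no zeros
n = len(signal)
blurred = [sum(signal[k] * kernel[(i - k) % n] for k in range(n))
           for i in range(n)]

B, K = dft(blurred), dft(kernel)
recovered = [v.real for v in dft([b / k for b, k in zip(B, K)], inverse=True)]
# recovered matches the original signal to floating-point precision.
```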
Love your work. Cool to see you here too
There is a correct way of doing bloom, it's just way too computationally expensive to do in real time. The idea is to get the result of the diffraction of light around the aperture of the light-capturing device (eye or camera). The "blur and downsample" approach from Advanced Warfare was found to give results very close to the 1/d³ falloff of a real Airy spot, which is the bloom you obtain with a round aperture at the focal point. The fact that it gets faster due to downsampling is a nice side-effect.
Agreed. I wasn't trying to imply the effect doesn't have a real-world basis, but that the old-way of doing it had many ways of going about it. The context was that older games used to do it a bunch of ways.
i have a feeling that some of the intense color grading and bloom seen in the 2000s was at least partly inspired by Need for Speed Most Wanted without taking into account why they did it and what look they were trying to emulate (a stylized bleach bypass)
need for speed also known as bloom observations with cars
NFS MW came out in 2005.
Probably the "bloomiest" game I've ever played was Prince of Persia: The Sands of Time (2003).
2 years before NFS MW.
@@Rundik here's the thing
MW doesn't have much bloom at all
the game's look is its attempt to emulate bleach bypass (which it actually does surprisingly well, on the 360 at least) which it also color grades to give it the brown look
If the FFT is cheap enough, I'd love to try animating or dynamically changing the bloom shape one the fly.
I downsampled to 1/4 resolution and did exactly that in the last part of the video, the bloom shape is being programmatically built and transformed every frame.
@@simondev758 What FFT implementation are you using? How much of the frame time it takes up?
Fun fact: Unreal Engine 5 makes Convolution Bloom much cheaper by simply downsampling the source frame to 1/4th of the resolution, essentially cutting down the load by 93.75%. You can toggle back to half resolution to get higher-accuracy bloom for cinematic quality.
Yeah I kinda figured they must do something along those lines. For the video, I did the same thing, tried at both 1/2 and 1/4 resolution.
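Assuming the 1/4 in the comment above applies per axis (which is what makes the quoted figure work out), the arithmetic is: (1/4)² = 1/16 of the pixels survive, a 93.75% reduction in FFT input size. Hypothetical 1080p numbers:

```python
# Quarter-resolution per axis leaves (1/4)^2 = 1/16 of the pixels,
# i.e. a 93.75% reduction in the amount of data the FFT has to chew on.

full_w, full_h = 1920, 1080
quarter_pixels = (full_w // 4) * (full_h // 4)
reduction = 1.0 - quarter_pixels / (full_w * full_h)
print(f"{reduction:.2%}")  # 93.75%
```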
This is an insane amount of info on bloom in just 13 minutes! Thanks to you I know there's faster and prettier ways of doing bloom and I might implement them in the future!
Holy crap, I have never thought that I would finally learn how the FFT filter ACTUALLY works and what it stands for. Thank you!
I'm glad you're still teaching. The industry is massively lacking in content for anything beyond the Unity scene-graph, unless you dive into shaders... which I see you have a course for now.
The FFT is a heavy operation but it really frontloads your computational work as so many involved image manipulations can be done incredibly cheaply once you do get that transform.
AngeTheGreat has an amazing "What is bloom" video from a photoreal perspective. He makes many great simulation videos, and in this one he goes into how Fraunhofer diffraction is simulated for rendering, which corresponds to this 2D Fourier transform operation!
Thanks for sharing. We usually take it for granted and crank bloom to max hehe
seriously perfect video, I cannot say enough how refreshing the format, tone, and pacing was. thank you
Well, where was this video 2 years ago when I needed it? I think the shader I ended up using was Unity-style, so it's nice to have a clear explanation of how it works. Thanks!
I've seen videos on the fourier transform but only now do I realize how awesome it is.
My technical understanding doesn't get everything ofc but nice video!
Now let me go into bed, replay the video and close my eyes
Please keep going with these topics (more post-processing)! It's always a good day when you release one of your videos! 🙂
One of the few channels I actually get excited about when new content is released!
Wow, the implementation of the convolution bloom is amazing
You're the MadSeason of game programming, so relaxing to hear.
damn so you could recreate astigmatism with that convolution bloom method
Great video! I think some of the better implementations I saw of 2000s era bloom was to use the alpha channel in the frame buffer to mask areas that should glow like lights or neon signs. That way it prevented the thresholding problem of causing white objects to glow unintentionally.
Definitely! In fact, I played around with that exact method back on XBox 360/PS3
8:04 that grid is a Hermann grid illusion!
I really like this. Probably will never use it. Kinda like watching a car mechanic fix a car. It's calming. And I can give some comments, like, "hey, nice bloom tut there!" But that's about it! :D
i absolutely love your choice of program in the first terminal animation
I actually wrote a tiny python program to generate the output, lol
Incredible video, thorough and cohesive, very easy to understand visualization material, while also presenting a pretty complete analysis on how and why the technique advanced throughout the years. It might've been interesting to explain why bilinear taps were used, despite it being "obvious" for people already in the know.
HE's BACK! You are a legend my friend. You helped me so much along the way, thank you.
your vids are a blessing, and the demonstration of convolution bloom was especially eye-opening
Wow! Nice vid!
Still remember trying many different methods of blooming during the fixed function pipeline days.
Had to use a downsampled view so that we get 60 fps using a software blur. Then just combine them with additive blend.
Thresholding was also done by averaging RGB channels together then checking if they pass the threshold.
All these were done in a 1/2 and 1/4 downsampled view so the effect wasn't as good as what we get these days with shaders.
There was also a pure hardware technique (making use of hardware downsampling for free), but you don't get thresholds and the result was blocky at best. Or maybe it was just my 64 MB NVIDIA MX400.
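The RGB-average thresholding described above can be sketched like so (a hypothetical pure-Python illustration with made-up values; in the fixed-function era this was done with render-target tricks rather than code like this):

```python
# Old-school bright-pass: average the RGB channels and compare against a
# threshold; pixels that pass are kept for the downsampled blur.

def passes_threshold(rgb, threshold=0.75):
    return sum(rgb) / 3.0 > threshold

def bright_mask(pixels, threshold=0.75):
    return [px if passes_threshold(px, threshold) else (0.0, 0.0, 0.0)
            for px in pixels]

frame = [(0.9, 0.9, 0.9),   # bright white: passes
         (0.9, 0.3, 0.3),   # bright red, but low channel average: rejected
         (0.2, 0.2, 0.2)]   # dim: rejected
print(bright_mask(frame))
# [(0.9, 0.9, 0.9), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
```

The saturated-red case shows the weakness of averaging channels: a vivid but unbalanced colour can fail the threshold even though it looks bright.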
I was on the tail end of fixed function, was always kinda fun fiddling with different tevstages and stuff like that, trying to coax an effect out of the hardware. Like a puzzle.
Bloom, always loved it even when everybody hated it, thanks a lot
Streaking is achievable by attaching a billboard sprite to the light source. It is not always gonna be accurate to the light source those streaks are supposed to represent, but it does its job well enough for 80% of light sources. One of the reasons why I love Unreal 1 and most of UE1 games is those billboard flares.
Though if we are on the path of moving away from rasterization tricks and towards more accurate models, I welcome the FFT approach as something representing the real life better
Thank you, that was really interesting. I use a free Java gauss filter to bloom and it takes 1 second to render a 1/24th of a second frame. I reeeeeeeally need to speed that baby up ...
Warframe came to my mind when blooming effect mentioned
And you just pulled some footage of it 😝🤣🤣🤣
I just googled for terrible bloom and picked a few heh
10:30 The Fourier transform has huge implications throughout many fields; once you get the hang of it you will start to see/use it everywhere :)
Honestly tho, the bloom first seen in Unreal was ... _unreal!_ In a good way. Playing that game for the first time was sooooo epic! There was so much cool stuff to discover, and you completely forgot you were in a game. Instead you were in another world.
Looking forward to the moment bloom effects in three.js are doable in a simple manner. It's such a powerful and common tool, so I hope it will be possible soon :)
Regarding the last part about using a Fourier transform, @angethegreat did a great video going into more detail for those interested (video titled “What is bloom? (And how is it simulated?)”). His channel is also focused on programming, but he goes a little into the physics of why we use a Fourier transform, which is very informative (well, I was satisfied with what I learned at least). (I hope it’s okay to mention other channels in the comments, but given that you shared links to videos about the FFT yourself, I’m going to assume that sharing is caring).
Neat, I'll check out their vid
Omg I love how fft bloom looks
Came for Oyster smiling, stayed for the rest of the explanation.
5:13 The one on the right is a game called "Syndicate" released in 2012, not "mid-2000's".
But yeah, they really cranked that shit up. 😆
Hah, I just went on Google images and searched for too much bloom, and picked some awful ones.
Hah, there are three settings I disable in any game. Bloom, motion blur and maybe the depth of field. Rarely do I see good implementations of bloom :D
What an excellent video! Super clear description of bloom.
Great video as always. Could you imagine making a video about half/quarter-resolution bilateral upsampling techniques? I think rendering stuff into low-resolution passes became super important in the last few years, but the culprit always lies in a proper upsampling pass imo (I think of half/quarter-res SSR, AO, volumetrics, etc. upsampled to full res while preserving details from the full-res pass). Thanks anyway
This was awesome to hear. I've always wanted to be a game developer because you can bury yourself in making algorithms like this for seemingly simple functions.
Yeah, the rabbit hole behind each option in the menu goes super deep heh
Those videos are just amazing for a graphics programmer like me, keep it up!
Dude you are a god, I love watching your videos and learning from you
I think UE's layered thing is mainly to approximate the exponentiality of light decay, ergo bloom. Blurred, plus'd bloom looks weird because there is no proper light decay. I think. I'm a compositor, not a gamedev lol.
I was starving for your videos. Your way of explaining is so exciting and concise. Sometimes I think I would try to get a job in Google after watching this :). Please, keep going.
I kinda expected a video describing the phenomenon of bloom in the real world instead of a video describing how it's usually implemented in video games.
Knowing how it works in the real world would certainly help making it look more convincing. I turn effects like this off whenever I can, especially considering my eyes and glasses are already doing it with zero performance impact.
I think that'd definitely be a neat follow up
Now it is springtime, and everything is in bloom! Was TRON the first movie to do the bloom effect? What algorithm did Disney use, I wonder?
No, wait a minute! What about the light sabres and blaster pistols and rocket engines in Star Wars? That was the first movie to blow our minds with bloom, not TRON.
Great video again Simon! Bloom is awesome and it's great to hear about how the different engines implement it.
Great break-down. I had to figure all of this out the hard way/over a long period of time heh
All us old timers did
@@simondev758 word haha
Awesome video! How does the Source 2 engine handle it though? Games made with the Source engine always seem so timeless (Portal 2 for example); when you remove the lighting it looks very old, but with lighting it still looks quite modern. I'm guessing this is partially due to subtle bloom effects but I'm not sure.
Good question, no idea, I should go try and see if there's info around.
source1 games have pretty much no bloom actually
or well, they do, but it's extremely small and weak and barely contributes to the picture at all
the glow effects are achieved via simple additively-blended sprites
the baked lightmaps are what make the games you're thinking of look deceptively good for their age.
not sure about source 2 though. it's *probably* just a variant of one of the techniques mentioned in the vid?
it does look damn good, that i can say with certainty. HL:A partially looks so realistic because of the bloom, which just feels so right and accurate to how i see it in real life, so they absolutely nailed it, whatever they did
@@randomcatdude interesting, really impressive lighting for 2011 (in the case of Portal 2), and games that run on modified versions of Source (like Titanfall 2) also look great. I wonder what would happen if steam ever made a Source 3 engine
@@mki443 titanfall 2 completely rips out the old source renderer and uses its own, which is overall way better. it has proper bloom, better dynamic lighting support, and even improves the baked lighting. source 2 is fairly similar in terms of improvements.
also, steam is just the name of the game platform that Valve owns, it's not a company/developer.
@@randomcatdude really? That's interesting, especially considering the age of Source; never expected Titanfall to rip out everything (although it is a pretty unique game in quite some ways). About the Steam/Valve thing, I knew that but was in a hurry writing that message
Hey Man, I really enjoy your videos. Always super exciting to see on my recommended. Truly some gems of videos on YouTube.
Just a suggestion take it or leave it:
Way out of my domain but I was looking into volumetric lighting in smoke and stuff. I was thinking it would be a cool challenge to try to make some realistic smoke (not fog, I mean visible smoke that drifts from a point like a particle system) that interacts well with light. Either with volumetrics, particles, or other techniques. Something I’ve been wondering about for a while.
The cloud stuff from the previous video would be a good starting point for that
@@simondev758 That's true. I know you work a lot in java, would implementing that into an engine like Unity or Unreal just be using shaders? (I'm pretty new to vfx and shaders and didn't know how this would fit into pre-existing game engines)
@@photonpotato2490 I'm mostly doing JS because it's convenient, but I'm actually a C++ dev. Implementing these into an existing engine requires both writing some client-side code and the shaders, so it's more involved. You've gotta know a bit more about your target engine to integrate the effects.
@@simondev758 Ok, I'll look more into that. I also was wondering whether knowledge from your shader course in GLSL will translate over to HLSL as well (because I'm trying to learn HLSL shaders in Unity). Thanks so much for the help man! Looking forward to your next video.
Modern GPUs contain fast fixed-function hardware to compute some version of the FFT, and its inverse too. That hardware is the JPEG, H.264 and H.265 codecs. I wonder if it would be possible to abuse these pieces of hardware for bloom?
Interesting avenue to explore
I do believe you are one of the best channels here :)
I wonder if you can use the VK_KHR_cooperative_matrix extension or HLSL ShaderModel 6.0 to speed up FFT transform.
tip for pronouncing romanized japanese names or words: the vowels are always pronounced only one way. "a" is always "ah", "e" is always "eh" (like "send"), "i" is always "ee" (like "seen", same way you'd read an i in spanish). so the name "kawase" is "kah wah seh". it's respectful to double check this stuff before misreading someone's name, ideally.
Yeah, it's funny to me, an (ex-)anime enjoyer, hear him say "Kawase" as "Quasi" or something, idk. 😆 'Muricans.
finna add the trollface to my convolution bloom effect
“Until next time, cheers”
…
Thought I was watching Sebastian for a sec 😂
That is high praise, his videos are really good.
Very interesting video! Some actual numbers on the convolution bloom would have been great. Is that currently feasible at all in games, or not?
My implementation isn't production ready, so I'm hesitant to profile it. Doing it at 1/4 resolution seemed to be plenty fast though, and still looked good (handwavy).
As always great video! The FFT stuff is insane
On the other hand, doing an FFT is intriguing for some reason. I think I'll do it one day, maybe in Python, for reasons. Maybe make a competitor to Melodyne, bcos it needs a competitor!11
Hello, I really love your channel and your content. I have enrolled in your shader course and I really enjoy it! I have one question that has been bothering me for quite some time when I watch your videos, and I really mean no offense by it! Do you use a computer voice when recording your videos? I guess in this day and age it must not be unlikely. Either way, thank you for your educational efforts!
Hah, no offense taken :) Nope, 100% me, I'm pretty monotone when I speak.
@@simondev758 Thank you! That puts my mind to rest :D
9:24 yet another reason I seem to prefer unity over unreal. The bloom effect from Unity is more subtle and makes more sense, but unreal is just another iteration on the old style.
But in general I just prefer C# to C++ and blueprints. And the way that unity deals with components compared to unreal's version.
I really wish it was easier to find a unity job.
I haven't looked at job postings, is everyone asking for Unreal?
@@simondev758 yeah, more or less. If it's not an Unreal thing it's usually some crypto scam. But it's also hard to find entry-level jobs in general.
My life has something I call the gloom effect TwT
this is getting to be a very good channel
ty! Trying to up the quality of the videos, fewer but more time spent on them heh
I always thought games rendered the frame twice, with normal brightness and lower brightness (say a factor of 2), and then used a threshold of 50% on the darker frame to get the parts which are brighter than 100% on the normal frame to calculate the bloom. For the darker frame you could also render at a lower resolution. 🤔
I've never heard of that approach, feels too expensive.
@@simondev758 I haven't heard of it either. 😄But I couldn't think of any other way to create the bloom only in the places where the image was really brighter than the possible value of 255.
You don't need to render everything twice for that though. Just get two FBOs and render to both of them in one pass. However, if you're going with this approach, it's probably better to just store color in HDR instead of LDR, for example 16-bit floats instead of 8-bit ints. This will take up twice the memory, but your approach would do that anyway, plus more. And instead of having a maximum value of 200% to find the bright parts, you get ~6.55*10^6%. This enables extremely bright parts to have even more bloom than regular really bright parts.
@@NicosLeben Way back on XBox360, I think I played around with storing kinda a light intensity value in the alpha channel to help bloom, so while rendering out you could optionally throw an intensity out into the alpha channel.
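The HDR variant from the reply above is easy to sketch: with a float buffer there's no second render, you just keep whatever energy sits above the displayable range. A hypothetical numpy version (the 1.0 knee and the sample values are made up for illustration):

```python
import numpy as np

# HDR frame stored as float16, as the reply suggests: values above 1.0
# are "brighter than the display can show" and feed the bloom.
hdr = np.zeros((4, 4, 3), dtype=np.float16)
hdr[1, 1] = 8.0   # a very bright emitter
hdr[2, 2] = 0.9   # bright, but still displayable

def bright_pass_hdr(frame, knee=1.0):
    # Keep only the energy above the threshold; no second render needed.
    return np.maximum(frame.astype(np.float32) - knee, 0.0)

bloom_input = bright_pass_hdr(hdr)
# Only the 8.0 emitter survives, contributing 7.0 units of excess energy.
```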
Sick, I love the deep dives
Is it worth doing the FFT and using the results for multiple effects before combining them?
Definitely worth exploring. I recorded a lot of the video by downsampling the main image to 1/2 and 1/4 resolution, doing the FFT there, and upsampling the result (kinda a mix of all the techniques). Didn't profile it though.
Simon's secret is that he's redoing the Prototype game using only bloom.
Recreating prototype would be kinda fun, I still vaguely remember how the world and stuff was structured.
Interestingly the frequency transform is a similar technique to how JPEGs are compressed.
Oh my god! The amount of valuable information i learned in just 13 mins is just amazing! Thanks Simon!
i'm curious if there's some way to (mathematically/arithmetically) simplify the fourier -> multiply by specific frequency-space texture -> inverse process to maybe make it a little faster for some specific filter. especially when you're multiplying by 0 in a bunch of places, but sadly i do not actually have the math chops to poke at that any time soon.
5:45
I'm very nostalgic for those bloom effects now. But I remember thinking blur was a scourge on games around Fallout 3.
Yeah I think those were peak bloom days heh
"the dark ages" oh man....I feel old, lol
Great video! Very informative and interesting.
Now BLOOM I can get into!
Could I see a hat wobble?
perfect timing as i was planning to revamp bloom in my engine soon
Love your content and this video, but what do you mean there isn't a correct way of doing it? Imo you need linear space, where you could take a threshold that would be the "maximal capacity of the capturing sensor", take what is over that, and "bloom" it in a way that simply spreads the overflowing values around (with some loss of energy), no? (Performance aside :D )
IMO the issue is the fact that we are using blur for it, because that doesn't really capture the lightbleed effect of IRL bloom.
Pretty sure that section was in reference to how older games did it
Any chance you'll share the FFT algorithm implementation you used? It would be very helpful. I've spent so much time on this with very little luck. I've managed to transform the image back and forth using FFT, but I constantly get artifacts. Like on some camera angles the whole image suddenly disappears. On some GPUs I get crazy artifacts and I don't know why. Also, how do you deal with non-square images?
4:38 I think this is wrong.
Box blur is single-pass, on the same image.
Gaussian blur is done in 2 passes, "ping-ponging" between two identical copies of the same image, once for horizontal and once for vertical.
The box blur uses a kernel (3x3, 5x5, 9x9, etc), while the Gaussian blur does not, since it samples either vertically or horizontally a certain number of pixels (since that's where the "strength" of the blur comes in).
Both Box and Gaussian are 2D (or more), but "can" be separated to 2 1-d passes. en.wikipedia.org/wiki/Box_blur
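To make the separability point from that reply concrete: a 2-D Gaussian kernel is the outer product of two 1-D kernels, so two cheap 1-D passes give exactly the same result as one full 2-D convolution. A small numpy check (the binomial kernel weights are just an illustrative Gaussian approximation):

```python
import numpy as np

# A Gaussian kernel is separable: the 2-D kernel is the outer product of
# two 1-D kernels, so one 2-D convolution == two cheap 1-D passes.
g1d = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
g1d /= g1d.sum()
g2d = np.outer(g1d, g1d)  # 5x5 kernel: 25 taps per pixel

def conv1d_axis(img, kernel, axis):
    # Circular 1-D convolution along one axis via shifted adds.
    r = len(kernel) // 2
    out = np.zeros_like(img)
    for i, k in enumerate(kernel):
        out += k * np.roll(img, i - r, axis=axis)
    return out

def blur_separable(img):
    # 5 + 5 = 10 taps per pixel instead of 25.
    return conv1d_axis(conv1d_axis(img, g1d, axis=0), g1d, axis=1)

def blur_direct(img):
    # The full 2-D convolution, for comparison.
    r = g2d.shape[0] // 2
    out = np.zeros_like(img)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += g2d[dy + r, dx + r] * np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out
```

The tap-count saving is the whole reason GPU blurs ping-pong between a horizontal and a vertical pass.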
I thought about it randomly and came here to comment this. Isn't it true that by FFTing the image you could combine a bunch of FFT-able effects in one go? (like how you can multiply transformation matrices and apply them all at once). Say, a sharpen effect, and a bloom effect, all from a single FFT round trip. Dunno what else, but probably something!
Yup, that bloom I show is actually like 7 different blooms at once. 2 streaks + 4 gaussians of varying sizes.
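That combining trick falls straight out of the linearity of the Fourier transform: sum the kernels in frequency space and pay for only one forward/inverse FFT pair. A numpy sketch with two made-up kernels (a small box blur and a horizontal streak), purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((32, 32))

# Two hypothetical effect kernels, zero-padded to the image size.
blur = np.zeros((32, 32)); blur[:3, :3] = 1.0 / 9.0
streak = np.zeros((32, 32)); streak[0, :7] = 1.0 / 7.0

# Combine both effects with a single forward/inverse FFT pair.
F = np.fft.fft2(img)
combined_kernel = np.fft.fft2(blur) + np.fft.fft2(streak)
both = np.fft.ifft2(F * combined_kernel).real

# Same result as doing each convolution separately and adding.
separate = (np.fft.ifft2(F * np.fft.fft2(blur)).real
            + np.fft.ifft2(F * np.fft.fft2(streak)).real)
```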
The first thing I do when I download a new game… is disable bloom and turn off motion blur
I really like bloom if it's subtle, totally worth it
Nicely explained, but that is the first thing I disable in any game, before I start playing. Why do they even bother to code/use it in games?
Bloom reminds me so much of the PS3. PS3 games also have a specific dithering in their shadows or in transparent stuff. I'm not sure what it is? You can see it in the animated menu as well.
Not sure what you mean, maybe point to a specific time in the video?
It's not really in the video, sorry :) you can see it in the main menu of the PS3, the waves use dithering I think? A lot of shadows in PS3 games also look unstable.
Congrats with 100k!
Thanks!
This one was great, thnx so much!
Excellent video
I wonder how slow these would be using cuFFT and vkFFT. Maybe doable?
it seems the convolution with a texture could easily be recreated with a good flare effect, couldn't it? At a cheap cost.
In my game, I find a good flare effect adds better variation; it feels more alive than trying too hard on the bloom effect.
Pretty similar ya, although maybe if an actual artist got a hold of this and made a better bloom texture...
wow this is awesomesauce my guy upvoted and subscribed
Lol now we can simulate driving at night with astigmatism
The wonders of technology eh?
Huge thanks