Buy a shirt! - acerola.gg/
This has been in the works for a while since I got the 4090 back in July, first video uploaded in 4K too! Hope you think it's cool.
That cat has some really impressive fur simulation, it's almost life-like
dude well done, you always figure out how to push things further than they normally seem to go
But acerola, you should have found a way to optimize your code on the gtx1660!
But as a 1660 Ti owner, I'm happy you'll continue to use your 1660, so I'll be certain I can follow along. Love your videos.
The fractals look fantastic in 4K
Thank you for sticking with the 1660. It makes your simulations more realistic, not graphically but practically. Not all of us have RTX graphics cards, after all.
Yeah but think of the ridiculous stuff acerola could do with 24GB of VRAM...
I'm stoked.
@@theftking THEFTKING???? I LOVE YOUR VIDS
Well now he has a 4090
@@theftking hi i watch ur fnaf vids
as someone who upgraded to a 2080 Super from a 1660, do not believe the lies. get a new gpu, it's worth it
oh shit right I HAVEN'T PUBLISHED THE BLOG POST YET ;-;; it's over, now they'll know you got secret early access. ill try to get it live asap brb
I was so confused I thought I missed it
Holy shit, I love how collaborative YouTubers are. I always see fun shoutouts and cameos! And you two are undeniably my favorite graphics programming creators!
no way the bezier curve woman!
Smh insider information trading by my graphics YouTubers??
omg
> can't figure out how to optimise particle system
> realise bottleneck is in hardware
> problem solved
Nice.
Acerola unlocked the wealth algorithm.
@@Beakerbite LOOOOL
NVIDIA go brrrrrr @@Beakerbite
Triple-A style optimization: have better hardware or don't play
Hey man, I just wanted to let you know that whenever I'm walking home alone from a pub while drunk, I stick on one of your videos and pretend I'm on the phone by answering your rhetorical questions. It makes me feel a lot safer, and the interesting stuff keeps my mind off being alone. So I just wanted to say thanks for that.
"I saw this guy on the street, he was clearly drunk but still telling some poor game developer what to do over the phone. And then people wonder why games' performance sucks these days."
@@RFC3514 Look, how is the average game developer supposed to be expected to stay sober these days?
@@RFC3514 he kept on starting every sentence with "But Acerola?" too - did he think this Acerola guy was stoopid?
@@thirteen3678 I'd be more surprised if the average game developer could afford to be drunk
This has got to be the funniest comment I have ever seen. Just the image of some drunk game dev talking about compute shaders on the phone is so hilarious
Problem: Maths too difficult
Solution: Randomly apply functions
If only there was a way this was possible for everyday problems
Problem: Optimization is too difficult
Solution: Just buy better hardware
Functions are math too, just wearing a dress to look cuter.
Oh but everyday is like that, you just have to keep going until it works out
Eventually you will saturate every search space. It will usually take more than the lifetime of the entire universe.
Every programmer ever is sweating rn
Acerola with a 4090 feels like thanos with the infinity stones
imagine him getting his hands on A100 gpu’s
Fr imagine if bro had a Quadro he’d be America’s most wanted
@@kuromiLayfe the A100 isn't that good for gaming tbf, so idk if it would be good for particle sim rendering
@ It sucks at rendering but is an absolute beast at compute which is what you need for simulations.. have the A100 do the compute and then the 4090 for rendering :)
But acerola,
but Acero LA...
Butt Acerola,
But cacerola,
But Ace in the hole,
this will never get old
Death metal bands just found a font generator
XD good one
It was so epic when Acerola activated his domain expansion, CHAOS GAME, and reduced the time complexity of the algorithm from exponential to linear time, truly a cinematic masterpiece.
Absolute cinema
you made a lot of cool visual art like the realistic waves, the filters that turn images into text, and now this cool particle system. Why not use that for your merch? Then people could pick the merch related to their favourite video
yeah now that I can render at 4k I'll be doin prints soon
Yeah I was really disappointed that this channel is all about beautiful graphics effects and math, and the merch is just flat boring text.
The one at 31:57 would be really cool on a black t-shirt.
@@Acerola_t Yeah pls put your fractals etc. onto shirts or posters!
@@Spots1000 I was surprised by the merch as well. But I also realized that I
35 minute Acerola video. We've been blessed today
Real and true
Indeed!
agree
The 1660 asked the point cloud, "If you and 4090 fought, would you lose?". The point cloud responded, "Nah, I'd win".
if the number of thread groups exceeds the maximum dispatch limit i might have a little trouble
would you lose?
"Wait a second" - starts stopwatch 💀
This one killed me. I had to pause the video to make sure I didn't miss anything while laughing
padding for ads. shaking my smh
13:55 timestamp: wait a second.
What's next?
Your own real time fluid sim?
Turbulent flow models know how to fight back
I love how your videos feel like those videos of NES developers explaining how hacky they got just to display things that seem simple nowadays. You really manage to both explain why it's needed and how you did it in such a simple and engaging way that very few other channels manage to recreate
I remember when you had to do two shifts and an add (y
@@RFC3514 there are other, new tricks being done though.
Me, myself, I'm working on documenting some old pixel art techniques I feel have been lost to time and hardware power, which turned out to be not all that great (bandwidth constraints and lower-power mobile devices), but the techniques were already lost. It's been quite the journey, but I hope I'll be able to put them on paper..... and in the process I've come up with new tricks that run on the same old hardware!
@@MadsterV - If you haven't yet, check out a video called "8 Bit & 8 Bitish Graphics Outside the Box", which is mainly about CLUT-cycling (something that can't really be done in modern graphics modes). I did some of that back in the Atari ST / Amiga (and early VGA) days, but Mark Ferrari takes it to insane extremes.
You could easily write a paper and publish it given the quality of this content and the novelty of this technique
I'm kinda dumb, so the only thing I learned from this video is that Acerola uses Unity Light Mode @16:42
it's more of a neutral gray
@@Acerola_t 1 nit or bust
@@Acerola_t What a monster
Maybe it's just that I "grew up" with 3D Studio (and later 3DSMAX) and Autodesk Animator (and later Combustion / Eyeon Fusion / Nuke), but if I see someone running a 3D IDE in dark mode, I automatically think: "Wanker. Everybody knows that dark mode is only for 2D compositing."
And coding IDEs should _obviously_ use light grey text over a dark blue background, like Borland and Watcom intended! 😜
@@RFC3514 yeah it's definitely because you're used to it, because most software is dark mode by default now
Maybe you could've weaponized the massive amounts of overdraw and relied on additive blending and the Z-buffer to calculate your lighting? Since you don't have any light sources that may be occluded by the geometry.
I was really hoping for this since the halfway point
no z-buffer needed, just additive blending. You'd get a sort of x-ray though, with higher density zones being whiter. Occlusion really looks great.
Ahh, this is brilliant.
My idea was to vary the color based on the distance to the camera, but maybe that’d be too lazy. :p
I agree, this technique is called Eye Dome Lighting and is heavily used in 3D scanning applications (lidar data, 3D scanners...). The results are incredible for virtually zero cost since it is based on the depth map alone, which is computed anyway
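For reference, a rough CPU-side sketch of the eye-dome idea above (Python/NumPy; the `strength` knob and the 4-neighbour comparison are made-up illustration choices, and in practice this runs as a post-process shader over the depth buffer):

```python
import numpy as np

def eye_dome_lighting(depth, strength=4.0):
    """Cheap screen-space shading from a depth buffer alone.

    `depth` is a 2D array of positive view-space depths; `strength` is an
    arbitrary tuning knob, not a value from the video.
    """
    log_d = np.log(np.maximum(depth, 1e-6))
    shade_term = np.zeros_like(log_d)
    # Compare each pixel with its 4 axis-aligned neighbours.
    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        neighbour = np.roll(log_d, shift=(dy, dx), axis=(0, 1))
        # Pixels sitting behind their neighbours (inside crevices) get darkened.
        shade_term += np.maximum(0.0, log_d - neighbour)
    return np.exp(-strength * shade_term)  # multiply the colour buffer by this
```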
I'm really loving that rendition of WHITE SPACE. It sounds great! It's also a song very dear to me, so it surprised me to hear it.
Oh boy! Another acerola video, I sure hope I am not reminded of a video game that permanently altered my brain chem.
(My clueless ass about 40 minutes ago)
26:14 NEW GPU LETS GOO
I mean, it's not like bad topology helps either. Even if they are not the bottleneck here there _are_ setups you'll encounter in practice that are taken out back behind the barn by overdraw and subpixels. So the gpu does care, just not too much.
Sidenote: most of those blends really do look like wings, would be pretty cool to have that system attached to a character, even before the Kuwahara makes them even more feathery. The noise even fits with/supports the whole "energy gathering into a shape" motif.
The Domain Expansion: Chaos Game bit was amazing!
Aight. Time to watch another acerola video that i understand nothing about but act like i do.
biblically accurate angel generator
Wow, this was the topic of my bachelor thesis! Really happy to see it computed this nicely.
The mathematics on why this chaos game approach works is a really nice topic to delve into
For particle lighting you might want to try deep shadow maps using Fourier series.
I made an example of this ages ago (search Volumetric Particle Lighting Fort Attack if you're interested).
Another option is using statistics, randomness and temporal blending. Each frame you randomly select a set of particles to render from the light's perspective, then do a standard shadow map calculation and temporally blend the results for each particle. If you have 100 particles evenly distributed on the light's path and only select one per update, then a particle in the middle will be closer to the light 50% of the time, giving you volumetric lighting.
looks really good!
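A toy sketch of the stochastic + temporal shadowing idea above (Python/NumPy; the map resolution, sample fraction, blend factor and the orthographic projection are all made-up assumptions, not the commenter's or the video's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

def frame_shadow_estimate(positions, light_dir, prev, res=64,
                          sample_fraction=0.05, blend=0.1):
    """One frame: only a random subset of particles occludes, and the
    per-particle lit/shadowed result is blended into `prev` over time.
    positions: (N, 3) array; light_dir: direction toward the light."""
    light_dir = np.asarray(light_dir, float)
    light_dir = light_dir / np.linalg.norm(light_dir)
    # Toy orthonormal basis for an orthographic "shadow map" plane.
    up = np.array([0.0, 0.0, 1.0]) if abs(light_dir[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    u = np.cross(light_dir, up); u /= np.linalg.norm(u)
    v = np.cross(light_dir, u)

    depth = positions @ light_dir              # distance along the light ray
    px, py = positions @ u, positions @ v
    ix = np.clip(((px - px.min()) / (np.ptp(px) + 1e-9) * (res - 1)).astype(int), 0, res - 1)
    iy = np.clip(((py - py.min()) / (np.ptp(py) + 1e-9) * (res - 1)).astype(int), 0, res - 1)

    # Splat only a random subset of particles into the depth map this frame.
    shadow_map = np.full((res, res), np.inf)
    subset = rng.random(len(positions)) < sample_fraction
    np.minimum.at(shadow_map, (iy[subset], ix[subset]), depth[subset])

    # A particle counts as lit this frame if nothing in the subset is in front of it.
    lit_now = (depth <= shadow_map[iy, ix] + 1e-4).astype(float)

    # Temporal blend: over many frames this converges to a soft, volumetric look.
    return prev * (1.0 - blend) + lit_now * blend
```

Starting from `prev = np.ones(N)` and calling this every frame gives the running shadow estimate the comment describes.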
I can't believe I understood everything from this video, this is such an achievement for me.
I feel like I should put "understanding this video" on my CV, or maybe print it and put it on the wall like a degree!
Sort of same here lol. I used to learn stuff like this 25 years ago and never touched anything related since then. My knowledge is just enough to be amazed at how easy the dude makes it look :D
The randomly generated shape @ 17:01 looks like a phoenix, and I love it
Many familiar topics here, nice to see. Like lerp functions (I think of them as mix functions..) and easing functions (I think of them as transfer functions).. and like, even Affine transformations are analogous to a classic Gain + Offset module in the context of modular audio synthesis.
Okay and yeah, a possible way to "fix" the noise issue is to use another easing/transfer function on the values of the voxel grid before the occlusion calculation. This is the same problem as Gate signals in modular audio synthesis being too abrupt (especially with digitally controlled ones that can operate arbitrarily quickly), so they're used to control Envelope modules, which give a way to set the rate of change for the rising and falling edges of the signal. This is really important when using a threshold detector on an incoming audio signal to generate gates, because otherwise super brief changes in the signal level can produce large changes in the output level... anyway yeah, interesting to see more and more analogous operations.. almost as if they're the same topic but over different domains or something hehe.
As it happens, even in GLSL (the language used for writing shaders in OpenGL) the built-in lerp function is called "mix"
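A tiny sketch tying these together, applying an easing/transfer function to the voxel values before the occlusion pass as suggested above (Python; the thresholds and the fake voxel data are invented for illustration):

```python
import numpy as np

def mix(a, b, t):
    # GLSL's mix() is exactly a lerp: a*(1-t) + b*t.
    return a * (1.0 - t) + b * t

def smoothstep(edge0, edge1, x):
    # A classic easing / transfer function (same shape as GLSL's smoothstep).
    t = np.clip((x - edge0) / (edge1 - edge0), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

# Hypothetical use of the suggestion above: ease the voxel counts before the
# occlusion pass so small count differences don't produce abrupt jumps.
voxel_counts = np.random.poisson(2.0, size=(32, 32, 32)).astype(float)
density = smoothstep(0.0, 8.0, voxel_counts)   # gentle ramp instead of a hard cutoff
occlusion = mix(1.0, 0.2, density)             # denser voxels -> darker
```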
32:41 but I'm sure it's coming! See "3D Gaussian Ray Tracing: Fast Tracing of Particle Scenes", an NVIDIA paper from this summer.
13:46 I love TAA shirt made my day lmaoooo
One thing to note about the tiny triangle quad overdraw problem: it probably wasn't a big deal in the particle case because the particles were not texture mapped. The GPU only does the full quad finite differences pass if the fragment shader reads an interpolated attribute like texture coordinates; solid color triangles don't pay the cost.
In general the tiny triangle inefficiency only becomes a problem in very specific cases; the performance tax it incurs depends on the way the rest of the rendering pipeline is set up. The more work that is done by the triangle's fragment shader, as opposed to a full screen pass later, the worse the performance hit, and MSAA magnifies it. Microtriangles need special consideration in the most extreme conditions, like UE5's Nanite, where every triangle is a microtriangle by design, or VR rendering, where frametimes are small and everything is rendered using forward rendering with lots of MSAA (basically the quad inefficiency worst case), but you really don't need to worry about it in the general case. A game isn't running slow because of one really dense model, or even a few really dense models.
I would wear a shirt with the fractal render at 31:57 on it
13:45 Seeing an "I
The screenshots at the end are marvelous! I would more likely buy a print of one of them rather than a shirt haha
I've got a soft spot in my heart for the Sierpinski fractal. There was a TI-BASIC program to generate them in my graphing calculator's manual and it was the first fractal I ever saw a computing device draw. As for your particle system technique, I wonder if you could use this to simulate hurricanes and maybe make better predictive models.
oh i'm glad you mentioned Freya Holmer's video! that was one of the first things that got me interested in how graphics work
The closest I've seen to this was a demoscene demo that required an obscene amount of memory for the time and was of course much lower detail. Notably it also used ambient occlusion for shading. I remember it partly because the author referred to that same 1990 paper you brought up.
i didnt expect to have to break out my simple domain in the middle of an acerola video
The explanation about transformation matrices would've been so useful when I was working on my submission for your game jam. That said, there's so much useful information here that I'm going to be rewatching this video a lot. Great video, thanks!
I've always loved your Monogatari editing style, and (for some reason) just realized why you're called Acerola. nice 🦇
Pretty good video. Because like 10 years ago I had to merge and interpolate between 2D matrices. And exactly as you mentioned, interpolating and merging rotations of a sheared object was extremely tricky.
I haven't seen anyone mention this, but from your video you made on water rendering, I swear you look much bigger and more built, great job working out bro you look sick 🔥🔥
"Domain Expansion, The Chaos Game"
Absolute Cinema
you should really implement TAA for this I think, I can definitely see how it would benefit this project in particular!
these fractal looking things look sick af, and i think the kuwahara filter works well for those since they're kind of millions of tiny brush strokes
1:00 ok coconut lady
16:53 sitting dragon
16:55 bat
16:57 hawk
17:01 Very fluffy eagle
17:02 vulture that had to have his coff. . .
Oh. It's not my appointment?
Such a nice solution! But, unfortunately, I haven't found a 4090 in my kitchen. What do I do now?
Silly. The 4090s are in Acerola's kitchen.
You prerender it on the GTX 555 you found on your driveway.
1:25 Particle Systems are programs! ..
that create and manage collections of few to many entities,
where the rule set of a collection controls the behavior of the individual entities.
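That definition fits in a few lines; here's a minimal toy version for illustration (Python, not anything from the video):

```python
from dataclasses import dataclass, field
import random

@dataclass
class Particle:
    x: float; y: float
    vx: float; vy: float
    life: float

@dataclass
class ParticleSystem:
    """A particle system in the sense above: a program that creates and manages
    a collection of entities, with one shared rule set driving all of them."""
    particles: list = field(default_factory=list)

    def emit(self, n):
        for _ in range(n):
            self.particles.append(Particle(0.0, 0.0,
                                           random.uniform(-1, 1),
                                           random.uniform(0, 2),
                                           life=random.uniform(1, 3)))

    def update(self, dt):
        for p in self.particles:
            p.vy -= 9.8 * dt        # the shared rule set: gravity...
            p.x += p.vx * dt        # ...integration...
            p.y += p.vy * dt
            p.life -= dt
        self.particles = [p for p in self.particles if p.life > 0]  # ...and culling

ps = ParticleSystem()
ps.emit(100)
ps.update(1 / 60)
```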
But Acerola, you of all people deserve a 4090, I’m happy for you 😊
I have never seen this youtuber once in my life, I clicked on it not knowing anything, and I left still kind of not knowing anything, and I loved every second of it. Excellent video.
i love how your editing and writing style mixes all this kinda-hard math with a dash of silliness
Neat! I did something similar for a VR prototype a couple years ago, though I just transformed the previous eye buffers instead. That means you lose a dimension from the transformation each time, but that adds some really weird effects to it. (especially in stereo) It's basically like a VR version of the infinite mirror effect. :) Instead of points I drew a few hundred little meshlets each frame which would then get multiplied in successive frames instead of doing the whole IFS in a single frame. This made it _extremely_ cheap to compute on mobile VR hardware.
My biggest pet peeve right now in graphics is the use of blurring effects to hide inefficiencies. I get that it would be too costly to generate things without noise and dithering, but I just hate how the results look. I have not run into an implementation of TAA that has looked good to my eyes, including upscaling like DLAA and such. I like how ray tracing is being pushed more, but this transition time that relies on dithering and other noise effects to hide the performance impact cannot end soon enough.
there will always be inefficiencies, it came for free with your Riemann sums
@@Acerola_t I get it, but I just feel that the older methods of hiding them were better than the current temporal methods. The way that they are implemented now means that there is not even the option of turning off the temporal effects without destroying visuals completely. So you just have to put up with smearing instead of the jaggies of before. I preferred jaggies since there are multiple other ways of hiding them - like a simple increase in render resolution once hardware gets better.
@@Zyxlian you're looking at them with rose-tinted glasses. Old games were aliased and jaggy as hell, and no, increasing resolution doesn't make the problem go away, it just makes it smaller. That doesn't count since in this case, the resolution of your screen has no relevance to the resolution of the particle system. I get that TAA is the new biggest thing for everyone to pretend to be an expert on, but the truth is that these are new problems that weren't feasible before due to limitations of hardware. "Old methods" didn't do particle systems like this better because they literally were not possible on old hardware. You're literally looking at a novel technique that hasn't been done before and saying "well back in my day things were done better!!!" while failing to understand what the word "novel" means
Sorry, I don't mean to come off as rude, I think the particle systems shown here is really cool. I just got irked about handwaving the performance impact of it by implementing TAA. I get that these techniques to push the visuals in graphics are always going to be ahead of the hardware (I was working with ray tracing implementations in the early 2010s, where it took several seconds to render frames). I just hate how there is an increasing number of games where the implementations of these effects are locked on, and the solution to the significant performance impact is to just decrease the resolution and add blurring. What's the point of ray tracing or particle swarms if they just end up being displayed as a dithered blur?
As for my increase in resolution comment, I was thinking more along the lines of increasing the amount of particles in the swarm. In the sense that future hardware will be able to render more particles natively. By implementing the feature in a way that, even with this increase in particles and hardware performance, will still be blurry/dithered no matter how performant the hardware is, just seems counter-intuitive to me.
I just want the option to turn the blurring effects off without destroying visuals.
@@cataclystp the forced TAA hate is so funny to me, it's the holy grail of AA, it really feels like a jesus situation lmao
I know nothing about graphics, but to add shading I thought you could draw each particle transparent, though I think that really depends on how small a value (how transparent a value) you can apply to a single pixel.
That 4090 bit was quite funny.
This is so insanely cool! I study chaos game and its relationship to IFS for my PhD, but I never would have expected these topics to come up here! I use chaos game to represent genetic information, thus producing a unique fractal out of it - which I then use as input for all sorts of analyses like convolutional neural networks. I do all of this through the lens of virology, as I use this method to study how COVID-19 evolves in real time (real time being every few days because the gov won't let you sample people too often) in response to different pressures - like antiviral treatment! I hope someday to make something so visually intuitive for my research, and this video has given me plenty of ideas on how to do exactly that - huge thanks!
I love the way you introduced the chaos game lol
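For anyone curious what a chaos-game representation of genetic data looks like, a minimal sketch (Python; the corner assignment is one common convention, not necessarily the one used in the commenter's research):

```python
import numpy as np

# Chaos-game representation of a DNA sequence: each base pulls the point
# halfway toward "its" corner of the unit square.
CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr_points(sequence, start=(0.5, 0.5)):
    x, y = start
    pts = []
    for base in sequence:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0
        pts.append((x, y))
    return np.array(pts)

def fcgr(sequence, k=4):
    """Frequency matrix of the CGR at resolution 2^k x 2^k -- the kind of
    image one could feed to a CNN, as the comment describes."""
    res = 2 ** k
    grid = np.zeros((res, res))
    for x, y in cgr_points(sequence):
        grid[min(int(y * res), res - 1), min(int(x * res), res - 1)] += 1
    return grid

print(fcgr("ACGTACGTTTGACCA", k=2))
```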
32:20 Omg the Metaphor music ❤
just do it already. stop waffling, you know exactly what I mean. why keep waiting
Acerola is the only YouTuber I’ve seen who has mastered the sponsored segment. I watch the whole thing.
jaw actually dropped when the kuwahara filter was enabled
The calmness of Sebastian Lague is all well and good, but we need a more chaotic counterpart
ooo i actually did a similar thing to the Sierpinski triangle example you showed, like 7 years ago or something, after seeing a demo on Numberphile where they had three target points and one moving point. You'd just pick one of the target points (literally at random), move the moving point halfway toward it, plot its position, and repeat the steps. The moving point very quickly (like within three or four iterations) moved itself into the triangle fractal area and then stayed there no matter what. I experimented by using more than 3 points, as well as changing just how far towards them the moving point would move instead of "halfway through", and got some very cool results. - EDIT: HAH it was their Chaos Game video from 7 years ago, I was right! The way I did it back then was much more inelegant though, I simply opened GameMaker and rawdogged a surface with draw_point calls, which is all CPU bound _and_ incredibly slow (it's slightly faster nowadays but i still wouldn't recommend it lol)
Funny how I did the exact same thing after watching that exact video too
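The Numberphile-style chaos game described above, as a minimal sketch (Python; plotting left out):

```python
import random

# Pick a random target corner, move halfway toward it, plot, repeat.
# Within a handful of iterations the point falls onto the Sierpinski
# triangle and never leaves it.
targets = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
x, y = random.random(), random.random()

points = []
for _ in range(100_000):
    tx, ty = random.choice(targets)        # "literally at random"
    x, y = (x + tx) / 2.0, (y + ty) / 2.0  # move halfway there
    points.append((x, y))

# Try more corners, or a fraction other than 1/2, for the other shapes mentioned.
```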
the cat segment is genius
Dude, your videos are so lit. Never touched graphic design and probably never will but I do like video games and I love peeking into how this aspect of them is created through your videos. I really like how you navigate between idealism and pragmatism in your approach and the examples you create always turn out really aesthetically pleasing.
Also from an educational standpoint you impress me in a big way, especially with the math heavy parts. A lot of the time you are explaining concepts that are totally novel to me and 1) you're able to make me interested in and end up understanding a lot of what you cover, and 2) even when something totally goes over my head, I never feel like I've lost the thread and the rest of the video is incomprehensible. You always tie stuff into the overarching goals/ideas that people with zero relevant background can grasp.
Besides, even though strictly the topics themselves are unlikely to ever be useful to me, I feel like your videos legitimately have taught me some really valuable perspective about approaching big, complicated problems that I can apply to my own creative ambitions.
Thanks for making these!
My grandfather was a computer scientist back in the day, and his favourite saying is "If you have a problem, throw more hardware at it"
Perfect example
I haven’t watched yet but I know this is gonna be such a banger
Update: It was indeed a banger
@@t3dotggoh no it's theo
@@t3dotggoh no it's theo
All those formations after you yapped about matrix the movie look like pretty cool drawn explosions.
actually, I suspect some splatting, perhaps even gaussian splatting, could work for games if you don't do it every frame, but like, use it as a basis, much like videos have keyframes and then a bunch of change frames that only modify parts of the frame...
like you quickly chaos game the general scene very broadly, then splat a mostly-done scene over it, then quickly chaos game the approximation of details, and then do a little bit of something more, like use low-poly models and traditional rendering to map the result to a model, and for the next couple frames, you could just figure out a few changed areas, and do the chaos details + lowpoly, and perhaps re-splat the changed areas every say, 5th frame, or 10th frame? maybe even like every 30th frame or so...
I'm not smart, but I think some sort of combination thingie like this could actually work, not necessarily exactly as I described, but something a bit similar...
Acerola: explaining and showing complex particles and graphics rendered with a 4090*
Me watching in 360p:
11:30 You'd be surprised how often GPU land comes up at my AA meetings
Acerola really was like _"Man fuck optimisation! We now use 4090s in this household..."_ lol!
This is one of your best videos in my opinion. The results were crazy beautiful!
hot damn, when you added lighting it became absolutely mesmerizing. I'm impressed you managed to publish this video, I'd still be watching the simulation ceaselessly transforming into infinity.
25:37 missed opportunity to pull it out of your oven, like Jensen did in his kitchen
always cool to see matrix math i learned about being applied in an interesting way
31:35 THREE
I've got a chem exam tomorrow and it's 23:26, seems like the perfect video to watch instead of studying
babe wake up acerola just posted
this would make such a great loading screen
31:54 Holy shit bro is an artist.. That's actually beautiful
Some of those snapshots might make great t-shirts
This week I've been learning Vulkan and this gives me just the motivation I needed
25:52 Jokes aside the company I work for has a lot of money and my team provides 3D applications bundled WITH the hardware. If we have performance problems we just buy the best possible GPUs, and then if even that is not good enough we eventually start optimizing...
So good point
"accidentally invented a new particle system never seen before" i know god damn RIGHT the video will be good
I'm a simple man, I hear heavy use of Va-11 Hall-A OST on top of cool stuff, I like
I hate the way this video is edited, but it makes me keep watching
The chaotic approach isn't really necessary. It's possible to use a deterministic approach of position generation based on particle ID, using several iterations of a simple algorithm with integer divisions. It eliminates the necessity of denoising entirely.
Also a compute shader isn't necessary at all, all positions may be generated in the vertex shader instead.
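A sketch of what that deterministic, ID-driven variant could look like (Python; the toy IFS, the iteration depth and the digit ordering are illustrative assumptions, not the commenter's actual algorithm):

```python
import numpy as np

def position_from_id(pid, transforms, depth=10, start=np.zeros(3)):
    """Decode the particle ID into a fixed sequence of transform indices with
    integer divisions, instead of picking transforms at random each frame.
    `transforms` is a list of (A, b) affine pairs."""
    m = len(transforms)
    p = start.astype(float)
    for _ in range(depth):
        A, b = transforms[pid % m]   # next "digit" of the ID in base m
        pid //= m
        p = A @ p + b
    return p

# Toy Sierpinski-style IFS: a uniform 0.5 scale plus three different offsets.
S = 0.5 * np.eye(3)
ifs = [(S, np.array([0.0, 0.0, 0.0])),
       (S, np.array([0.5, 0.0, 0.0])),
       (S, np.array([0.25, 0.5, 0.0]))]

# With depth 8 and 3 transforms, the 3**8 particle IDs map to 3**8 distinct
# leaves of the attractor, with no randomness and hence no noise to clean up.
positions = np.array([position_from_id(i, ifs, depth=8) for i in range(3 ** 8)])
```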
NO WAY SLAMM JEKER
What if we move through the point cloud space with bezier curves? We "choose" multiple matrices in advance and blend between them with a bezier curve; it could make the animation smoother, instead of going from one to another and seeing a clear stop in between the blends.
Sorry if it is not well written, English is not my first language
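A minimal sketch of the matrix-Bézier blending idea above (Python; element-wise De Casteljau over hypothetical control matrices, which inherits the usual caveats about lerping rotations):

```python
import numpy as np

def bezier_matrix(control_mats, t):
    """De Casteljau evaluation with matrices as control points: repeatedly lerp
    adjacent controls until one matrix remains. Gliding t from 0 to 1 passes
    smoothly through the whole sequence with no visible stop between blends."""
    mats = [np.asarray(m, dtype=float) for m in control_mats]
    while len(mats) > 1:
        mats = [(1.0 - t) * a + t * b for a, b in zip(mats, mats[1:])]
    return mats[0]

# Four made-up 3x3 transforms as control points; sweep t over the animation time.
controls = [np.eye(3), 0.8 * np.eye(3),
            np.diag([1.2, 0.5, 0.9]), np.eye(3)]
M = bezier_matrix(controls, t=0.37)
```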
ACEROLA FREYA COLLAB WHEN
I'm still amazed Freya got mentioned. You got cool graphics youtube in my cool graphics youtube, what the heck.
but aceroola, I'm at wooork :(
This video is simultaneously very impressive, comforting, educational and fun to watch.
Also timer on "wait a second" had me dead
hey acerola, i have been studying dynamical systems in my free time these past few months, and a really interesting project you could do that is equivalent to this is modeling state space diagrams of dynamical systems. nonlinear dynamics and chaos theory (ding ding ding!! you just dipped your toes in it with this video) would be a really fascinating computer science and computationally difficult problem at reasonable compute times. Like a live Mandelbrot set, or whatever other maps. all sorts of cool orbits, cycles, attractors and the like to explore, especially if, given some dynamical system, you could freely explore certain systems and investigate their structure. anyways, really cool video as well. super neat stuff
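As one concrete instance of the state-space idea above, a sketch that drags a particle cloud through the Lorenz flow (Python; textbook parameters, nothing from the video):

```python
import numpy as np

def lorenz_step(p, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz system -- the kind of state-space flow the
    comment above is describing."""
    x, y, z = p[..., 0], p[..., 1], p[..., 2]
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return p + dt * np.stack([dx, dy, dz], axis=-1)

# Scatter a cloud of particles in state space and let the flow drag them onto
# the attractor; rendered as points, it looks a lot like the video's system.
particles = np.random.uniform(-20, 20, size=(100_000, 3))
for _ in range(2000):
    particles = lorenz_step(particles)
```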
my smooth brain clocked out really quick, but i like your funny words magic man
Please write a paper on this. The generalization of the chaos game to an affine matrix transformation that can be parallelized is genuinely genius.