I feel it would realistically just be instanced meshes that are moved about by a vertex shader. Given that all the leaves are essentially the same mesh, you could 'instance' it: you send the mesh and vertex data (position, normals, UV coords for each vertex) to the GPU once, but send a transform matrix (position, rotation, etc.) to the GPU for each individual leaf. Although this is more effective for larger meshes, it could still be done here. Then all you need is some sine wave magic in the vertex shader to move each vertex a little bit, and if you want a random look you can also seed an RNG with the leaf's position. The vertex shader would simply offset the incoming vertex positions by this randomized sine wave function. As a result you get the waving leaves with pretty much 0% CPU usage, which by extension means that you can use this for entire forests, though you probably don't want to use a trigonometric function like a sine wave for that if you want really good performance.
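For clarity, the randomized sine offset described above can be sketched on the CPU (all names and constants here are made up; in practice this math lives in the vertex shader, with `t` as a time uniform):

```python
import math

def leaf_wobble(pivot, t, wind_speed=1.0, amplitude=0.05):
    """Offset for one leaf: a sine wave whose phase is derived from the
    leaf's pivot position, so neighboring leaves wobble out of sync."""
    # Any cheap position hash works as a phase; these weights are arbitrary.
    phase = (pivot[0] * 7.0 + pivot[1] * 13.0 + pivot[2] * 17.0) % (2 * math.pi)
    s = math.sin(t * wind_speed + phase) * amplitude
    return (s, 0.0, s * 0.5)  # displace mostly sideways
```

Because the phase depends only on position, the motion is deterministic frame to frame, which is exactly what you want from a vertex shader.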
I may be wrong, but I think that currently in Godot there's a limitation that you can't have an instanced mesh inside another instanced mesh. So doing this in a shader is a much better idea if you want to do a forest. Also, placing each leaf by hand in Godot is not good practice; it's much better to do it in a 3D editing program like Blender, where you have better control over your mesh. Of course, you could write some kind of exporter which saves each leaf location to a file and then reads it in Godot, but to be honest I think that's a lot worse than embedding it in a mesh file or texture.
@@wojtek_pe Never really used Godot, but if it does support instancing you could simply have the trunks and leaves as separate meshes: the trunks in one instanced call and the leaves in a second. The leaves' vertex shader would then offset all the leaves. Even then, 'recursive instancing' wouldn't be supported, as it doesn't exist (to my knowledge). You would just need to choose how you batch them yourself.
@@sirhallstein1336 Yes, you're right, you can do that in two instancing calls. But with this technique you can do it in one call, and I don't think instancing is easier than this method.
I wouldn't even use a mesh for the vertex shader. The mesh could be generated efficiently from a tessellation + geometry shader, so that the only vertex data you'd send would be the origin + radius of each of those spherical leaf clusters. Sine waves are pretty efficient in a geometry shader - so I wouldn't worry about it.
It'd be really cool if you could redo this tutorial but explain all the steps properly. Upon trying to do this myself, I feel like a bunch of extremely important steps get skipped. Still, great video and subbed!
Next, let the GPU actually permanently modify the stored pivot cache and have it be affected by gameplay. You could have some input value for the wind/forces in the current area, let the GPU move all the leaves accordingly by modifying the pivot values, then render out the pivot data texture to be read back on the CPU and fed to the GPU again. This way your leaves can blow off the tree and be affected by forces determined by the CPU (like wind, or a player swinging a weapon near the leaves, etc.), and the CPU doesn't need to do anything other than read the texture and send it back to the GPU. This sort of compute shader work is tricky in Godot 3 as it's not really set up for it, but it can be done. Maybe it's easier in 4.
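A CPU-side sketch of the per-pivot update such a compute pass might perform (hypothetical names and constants; on the GPU this would be a texture ping-pong rather than Python lists):

```python
def step_pivots(pivots, velocities, wind, dt, damping=0.98):
    """One simulation tick: accumulate a wind force into each pivot's
    velocity, damp it, and integrate. On the GPU the results would be
    written back into the pivot texture for the next frame."""
    out_p, out_v = [], []
    for p, v in zip(pivots, velocities):
        v = tuple(damping * (v[i] + wind[i] * dt) for i in range(3))
        out_p.append(tuple(p[i] + v[i] * dt for i in range(3)))
        out_v.append(v)
    return out_p, out_v
```

With zero wind the pivots stay put, and any CPU-side force (a swung weapon, a gust) just changes the `wind` input for the next tick.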
I don't know about vertex colors, 'cause then you're limited to the minimum and maximum values. You should try using a couple of UV coordinates: one for X and Y and another for Z (and whatever else you want to use, maybe a time offset? Idk). Of course, I don't know what Godot has; I'm more of a Unity girl.
Making an array from child objects (leaves) and a "for each" scripted animation with some randomized values seems 100 times easier than what you suggest. I doubt it's more performant, but it should be okay... it would deserve a try.
Colors are vectors, with the difference that they're usually stored with very low precision, and sometimes sRGB conversion can cause problems. Vertex colors provide an easy way to store such information directly in the mesh while being accessible in a vertex shader and compatible with most game engines, mesh editors and mesh file formats. I'm not aware of any similarly compatible standard intended for storing vectors, other than higher-precision vertex colors, or vertex weights (which often aren't accessible in shaders).
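A rough sketch of that low-precision round trip (the `bounds` range is a made-up example; real meshes pick it to cover the model):

```python
def encode_vec_as_color(v, bounds=2.0):
    """Map a vector with components in [-bounds, bounds] into 8-bit RGB,
    the way a pivot would be baked into a vertex color."""
    return tuple(round((c / bounds * 0.5 + 0.5) * 255) for c in v)

def decode_color_as_vec(rgb, bounds=2.0):
    """Inverse mapping, as a shader would do after reading the color."""
    return tuple((c / 255.0 - 0.5) * 2.0 * bounds for c in rgb)
```

With 8 bits per channel you only get 256 steps across the whole range, so the decoded vector can be off by up to half a step (about 0.008 for these bounds) - that's the low precision being discussed.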
What's the benefit of doing it this way rather than making each leaf its own object, parented to the tree object, using the object center as pivot and Z+ as the forward vector? Is it for performance, or is there some other cool benefit I'm missing out on?
It would be less efficient to do it that way, as the CPU would have to transfer all of those changes to the GPU every frame, whereas this method keeps all of the math on the GPU. Depending on whether or not the engine decides to use instancing to render your leaves, you might even end up with hundreds of very inefficient draw calls rendering each leaf individually, versus a guaranteed 2 draw calls for this method (trunk and leaves).
Wouldn't very similar code work as an animation directly in Blender, which could then be exported as one big jiggling frame-animated mesh to avoid all the conversion details? Seems like something Python and keyframing through bpy could handle. Or does a game engine not import Blender animations, or maybe it's to keep control of the effect in the engine? What are some other approaches, if any (briefly of course, lol, I don't expect an essay), if you have time to indulge my curiosity? Thanks for your work, this channel was a great find.
doing it procedurally like this lets you change the parameters on the fly, so you can change stuff like wind speed and direction without having to generate and store many different animations. baked animations also use up more memory and may actually be slower to calculate each frame.
May I suggest looking at the branches of smaller trees in RDR2? I think it's the same technique, but it's player-interactable, and I think it even updates some of the collision.
Man, storing pivot data as color is a nice solution. How do you think the triangles on the ground were added? They don't overlap, and they don't seem to sit on a regular grid, i.e. some hexagon, square, rectangle, or triangle tiling. How come they are so well spread out? Is it Voronoi noise? Probably? Some sort of random points -> triangles with space between each other. How can that be done?
if you mean the texture on the ground that's likely just hand drawn. if you look closely you can see it repeats every few meters so it's definitely not generated in real-time.
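If you did want to scatter non-overlapping marks like that procedurally, one common approach is minimum-distance rejection sampling, a poor man's Poisson-disc. A hedged sketch (all sizes, counts and the seed are made-up examples):

```python
import random

def scatter_points(n, min_dist, size=10.0, seed=42, tries=1000):
    """Place up to n points in a size x size square, rejecting any
    candidate closer than min_dist to an already-accepted point.
    The result is random but evenly spread: no overlaps, no visible grid."""
    rng = random.Random(seed)
    points = []
    for _ in range(tries):
        if len(points) >= n:
            break
        c = (rng.uniform(0, size), rng.uniform(0, size))
        if all((c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2 >= min_dist ** 2
               for p in points):
            points.append(c)
    return points
```

Each accepted point could then get a small triangle stamped at it, rotated by a position hash for variety.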
Could something like this be used in Unreal Engine 5? Nanite is an amazing tool, but it doesn't allow meshes to bend or change: meshes must be static. However, static doesn't mean they can't move. One of the biggest complaints is that you couldn't simulate leaves moving, because that normally bends the leaves' triangles. So could this be used to just pivot the leaves, so it looks like the wind is blowing while Nanite stays active on the foliage?
I'll mention that if you're building a larger mesh and you notice some jank in the vertex animation, using a color array is probably better than using a texture for the same effect. A 32-bit color texture only has 8 bits per channel, so 256 steps per axis.
You're overthinking it way too much. Yes, pivot caching is used for tree animation, but a more complex version of it. What you're seeing here is really just a vertex position offset in the shader, masked by the V coordinate of the leaf's UV space. :)
Instead of moving the mesh to the origin, rotating by your wind vector and returning it, why not just offset the wind vector by your pivot vector? Shouldn't that achieve the same thing?
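It should, yes: rotating about a pivot expands to R(v - p) + p = Rv + (p - Rp), so the pivot can be folded into a single offset applied around the rotation. A 2D sketch (rotation about Z only; the helper name is hypothetical):

```python
import math

def rotate_about_pivot(vertex, pivot, angle):
    """Rotation primitives act about the origin, so shift the vertex
    so the pivot sits at the origin, rotate, and shift back."""
    x, y = vertex[0] - pivot[0], vertex[1] - pivot[1]
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y + pivot[0], s * x + c * y + pivot[1])
```

The folded form Rv + (p - Rp) gives the same answer, so which one you write is a matter of taste and instruction count.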
You should change your Blender vertex bake code to include all of the coordinate system conversion in it; otherwise those are wasted shader instructions. Anyway, great video =) *subscribed*
Doesn't he already convert the color space? Could you please elaborate on what you mean so we can learn from it?
I'd never actually thought of using vertex colors to store information like this.
Vertex colors are really just the same as vertex positions. Xyz is rgb.
@@juvesidc5960 until some "helpful" default in your toolchain converts RGB to sRGB lol
@@tissuepaper9962 Or when those darn shaders just can't agree whether it's 0 to 255 or 0 to 1.
Could store information in your UV sets as well - though UV sets only contain two floats each, of course.
@@tissuepaper9962 sRGB isn't just unclamped RGB; it's RGB with a non-linear transfer curve applied, which is exactly why the conversion can bite you here.
I thought the title was "pivot catching" and was expecting some fancy in-game bug-catching technique. I was pleasantly surprised with something much more interesting XD
Today's best gift: not only these trees, but it also increased my inspiration to continue learning Python....
best interactive foliage tutorial I ever watched
You are a legend!!!!! Always love your breakdown of Blender code 😊
I would have thought this was just a simple case of instancing, like you'd do with particles. But this makes sense as well. I don't think there's any performance difference (either way it's just one draw command on the CPU and the vertex/fragment shaders run just as many times), but I suppose this way it's easier to design the tree in 3d software.
That's definitely a valid approach too. It might even be worth a video of its own, covering the process of exporting all of the leaf transforms to a text file and parsing that in Godot to reproduce the instancing you see in Blender.
Only issue is that it's harder to tailor a particle effect to the same degree. They work better with random effects.
@@harshmudhar96 This approach does have several perks, but a particle system can do this effect just fine. If your game engine supports using mesh data to emit particles, set the particle amount to match the vertices in your emitter. In Unity I would use the built-in custom vertex streams to get the per-particle data. But unless you instance them, you also need to offset by the world coordinate of the particle system pivot. However, this technique works just as well for parts of a mesh with very different pieces, which wouldn't be suitable for instancing and particles.
@@Frellmeckerell Although I think there's no inherent problem with your approach, I imagine some artists would prefer the leaves not to be generated at run time, so there are no surprises with seeds and stuff.
I'd also imagine that on some hardware, the shader compile time is worth it versus generating a particle system. But that's the beauty of programming and modelling: there are lots of right answers! It comes down to the needs of your project.
@@carlosnava1471 I completely agree; I just needed to point out that generating or instancing doesn't need to involve anything random to make sense. I've used several variations of both. Having the person creating the content have a say in the workflow is very valid. Recently we've been using Houdini to make a TOP network to convert the data from what the artists want the workflow to be into what is most optimised for the devices we target.
Seriously can’t express how truly helpful it is having the code explained step by step. Helps to better understand what I am actually doing
This channel is pure gold!
I had to check if I was dreaming when I saw that YouTube had recommended a tech art video. Somehow it had happened, and I’m very happy that it did!
Commenting so that YouTube recommends such masterpieces
I've done something like this before with normals - storing the normal of the instancer in an attribute of the instanced object at the point it is emitted, and then using it for shading. It's a neat effect.
best channel I'd never heard of
I always wondered how they did that leaves effect in AC! It's really cool behind the scenes
Here comes the next great guy who knows how to explain this stuff. Super nice quality.
Insanely good content, easily deserves 100x more views!
Easiest subscription of my life. Fantastic video
Your channel is an absolute goldmine when it comes to doing cool stuff with procedural methods. Subbed.
Something like this has never crossed my mind. I'm so glad this popped up in my recommended.
These are the freakin best code videos on the internet
Awesome guide. Looking forward to more content
Very neat stuff! Although a lot of the technical aspects went over my head, it was fascinating to see you break down how to achieve the wiggling leaves effect. It really deepens my appreciation for just how much thought & effort goes into little details in my favorite games :]
That's what I call a good teacher. Thanks
this is unfairly underrated!
i would have guessed it was done by a vertex displacement shader, didn't know about pivot caching!
It does display vertices though, so it *is* a vertex displacement shader, no? With added "pivot cache".
Would love more videos like this! amazing!!
Wow this is some quality stuff. Subbed!
jeez man, very impressive for what seems to be a lower-profile game engine. it's you, bro.
Hey! Awesome video. The quality is also amazing.
I'd love to know more about that thing where the horizon folds away really quickly. There was a similar effect in Cosmic Break and it's nostalgic for me.
Probably just a vertex shader trick. After you transform everything into world space and then camera space, do some sort of transform that will drop and rotate the vertices (maybe full cylinderisation, but that doesn't look quite what it's doing to my eye - just pushing vertices downwards on a curve driven by the Z (into the screen) coordinate would do it I think).
Mind you, what that looks like in the pipeline of an "engine" like Unity or Unreal I don't know. Similar concept though.
I love your weird programming channel. :)
I feel like you could do something similar by passing each vertex through some combination of sin(t), cos(t), tan(t) on the X, Y, and Z coordinates (with cofactors for things like wind speed and frequency). Another idea I've come up with for more advanced/realistic foliage (which might not look appropriate for the AC vibe) is assigning a texture where the R, G, and B channels are stats like flexibility (how much the vertex closest to each pixel can be displaced), sensitivity (a scalar multiplied by the offset for the vertex closest to each pixel), and time offset (given that the offset is the sine of some time variable, a phase shift from the given time value; not super useful for foliage, but it works for the waviness in things like cloth and flags that wave in the wind).
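The R/G/B-stats idea might look something like this (the channel meanings are the hypothetical ones described above; in a real shader the stats would come from a texture sample rather than a tuple):

```python
import math

def displace(vertex, stats, t):
    """Displace a vertex using per-vertex material stats packed as RGB:
    R = flexibility (max displacement), G = sensitivity (scale),
    B = time offset (phase shift for the driving sine wave)."""
    flexibility, sensitivity, time_offset = stats  # each in [0, 1]
    amount = math.sin(t + time_offset * 2 * math.pi) * flexibility * sensitivity
    return (vertex[0], vertex[1] + amount, vertex[2])
```

An artist could then paint stiffness straight onto the mesh: stiff cloth near seams gets low R, free edges get high R.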
Yeah - exactly. You can do this MUCH more efficiently than is suggested in the video.
What's the performance difference?
The greatest issue with this approach is that sin, cos, tan rotate the vertices about (0,0,0), so you won't be able to correctly control the rotation of the leaves, unless you recentre them to (0,0,0), which is impossible AFAIK because a vertex shader only has access to the currently processed vertex. You can't get its neighboring vertices to figure out the bounds of the leaf and recentre it to (0,0,0) (to apply rotations with trig functions). That's why Martin stores the offset (pivot) to recentre to (0,0,0); he just happens to use quaternion functions to rotate the leaves instead of trig functions.
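A minimal Python sketch of that recentre-then-rotate step (standing in for the shader; the quaternion rotate uses the standard t = 2(qv × v), v' = v + w·t + qv × t identity and assumes a unit axis):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def quat_rotate(axis, angle, v):
    """Rotate v by the unit quaternion built from (axis, angle)."""
    half = angle * 0.5
    w, s = math.cos(half), math.sin(half)
    qv = tuple(a * s for a in axis)
    t = tuple(2 * c for c in cross(qv, v))
    u = cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))

def rotate_leaf_vertex(vertex, pivot, axis, angle):
    """The trick being discussed: recentre on the stored pivot,
    rotate, then restore the offset."""
    local = tuple(vertex[i] - pivot[i] for i in range(3))
    rotated = quat_rotate(axis, angle, local)
    return tuple(rotated[i] + pivot[i] for i in range(3))
```

The pivot comes per-vertex from the baked vertex color, so every vertex of a leaf agrees on the same rotation centre without needing its neighbours.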
I don't know anything about this stuff but Holy crap this was interesting. Amazing vid
I believe the leaves were done in a shader
I think it's just some sine waves in a shader; I do it in my voxel engine. You're right that it isn't rigged, and it is a separate mesh from the unanimated parts, though. In my game, the sine wave is offset by the vertex position, so that vertices near each other undulate similarly. What you think is the "pivot" is just a vertex or two on each leaf that isn't being animated. I have to do this in my engine as well, since the leaf vertices that connect to the tree can't animate, or else you'd see behind the mesh into the void. EDIT: a second later I realized that the leaves do kind of rotate a bit, so it's not per vertex, it's per leaf, so I guess they probably did do pivot caching like you say. Seems like a lot of manual work to define all the pivots, but I guess my way is easier for me partly because my mesh is already procedural and it's no extra work to define it per vertex.
This tutorial is very nice. Please consider making a tutorial about the terrain system of Animal Crossing!
Cool... it'd be neat to link this with a weather system where wind fluctuates within different ranges based on the current weather. So on calm days the leaves would range from barely any movement to just a little, during a gusty day they'd have moderate movement with spikes of intense movement, and on a stormy night they might move rather violently for extended durations. Never played much Animal Crossing, so I don't know if they already do that.
programmers are absolute wizards, so thankful for them as an artist lmao
Quality content, I'd say, buddy
this was an immediate subscribe. thank u for the video!!
Really easy to follow for a newcomer and very well edited, thanks a lot 👌
Oh man, I just discovered your channel and you've got some truly wonderful videos on here. I hope you eventually continue.
Awesome! Thank you for video
A very cool tutorial and explanation! Thanks mate
This is really amazing. Love it!
very nice youtube channel, things are well explained, thanks a lot i learned a lot, can't wait for new videos!
Ha! That's a LOT of explanation for a much simpler technique (assuming you're not constrained to what Blender can do). Firstly, there's no need to color the leaves - you can use their coordinate relative to the origin of the tree as an input to a randomization function (I use the product of the low-order digits of the leaf's xyz coordinate), add the time/frame number to it modulo some constant that gives them a frequency, and stuff that into a sin() function to turn it into a sinusoidal variation - providing smoothly varying random motion that's unique to each leaf. Use that to make a translational and/or rotational component to add to the base position of the leaf - and you're done. About 5 lines of GLSL shader code. Doing that saves sending color data - which saves lots of time. Personally, I wouldn't store the leaf geometry at all. Treat each spherical leaf clump as a single point object and do 100% of the math for leaf generation in a tessellation shader. Parameterize the heck out of it... done.
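The "randomization function" being described is usually a one-line hash; here's a Python port of the classic GLSL fract(sin(dot(...))) trick (the magic constants are the customary ones, not anything from the video):

```python
import math

def hash_from_position(p):
    """Pseudo-random value in [0, 1) derived only from a position,
    mimicking GLSL's fract(sin(dot(p, k)) * 43758.5453)."""
    d = p[0] * 12.9898 + p[1] * 78.233 + p[2] * 37.719
    return (math.sin(d) * 43758.5453) % 1.0
```

A phase like `hash_from_position(p) * 2 * math.pi` can then be fed into `sin(time * freq + phase)` for the per-leaf variation described.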
the coordinate relative to the origin of the tree is exactly what's being stored in the vertex colors. there's no way to get that information in just the vertex shader without using per-object uniforms or an instance buffer.
instancing probably is better than using vertex colors, since you only need to store geometry for a single leaf, and the position can be stored per instance rather than per vertex, but the difference will likely be negligible unless you're rendering many thousands of trees, at which point each leaf would probably take up less than a pixel on the screen, so you're wasting resources regardless. instancing may also be more complex to implement depending on the engine or API you're using.
no clue what you mean by "Treat each spherical leaf clump as a single point object and do 100% of the math for leaf generation in a tessellation shader" though. afaik tessellation shaders can only tessellate/subdivide existing geometry; i don't think they can even take "a single point object" as input, since you can't tessellate a point. you might be thinking of geometry shaders, which can create arbitrary geometry from a single vertex input, but they're abysmally slow because modern GPU hardware really isn't made for dynamically generating geometry, and even if it were, you can't store the output of a geometry shader in a new vertex buffer, so you'd be re-generating the entire set of leaves every single frame, which can never be faster than just drawing static geometry with a funny vertex shader.
@@henkle1610 I've been building graphics for flight simulation for 40 years - and I've also worked in the video games industry as a lead graphics engineer. I know what I'm talking about.
@@SteveBakerIsHere having many years of experience doesn't magically change the way tessellation shaders work, or the fact that geometry shaders, especially very complex ones that output hundreds of vertices, are going to be significantly slower than just reading from a static buffer on most hardware.
but please do educate me if i'm wrong. i would benefit greatly from geometry shaders being faster than drawing static geometry.
Keep in mind that this game is running on mobile hardware, I'm not sure the Switch's GPU can handle tessellation or geometry shaders very well
Very sweet tutorial! Thank you so much
Wow! Subscribed.
this is an underrated gem of a video
Another great video! Great job! Waiting for more content from you ;)
This is really clever, thanks for sharing.
A LOT MORE content like this, PLEASE!!!
Martin said that it is possible to store both position and rotation info using vertex colors. So, does that mean a vertex can be assigned multiple colors or is it somehow stored in a single vertex color?
Is there a video which shows storing both?
It depends a bit on your mesh format and your engine, but multiple vertex colour sets are definitely possible. You can also use additional UV sets or textures to store this stuff (see the vertex anim textures video).
Silly idea, but I wonder if this could be used to create facsimile movement of a crowd, e.g. watching an event - just enough so they don't look like statues, if you can't invest in actual idle animations for them. Especially in a more stylized artstyle.
5:36 this was a problem in Blender at the time. Now, with Geometry Nodes, it has been fixed. But this one was Blender's fault and couldn't have been fixed inside of Blender without modifying the source code.
Although Blender supports floats for vertex colors, storing them as sRGB bytes has benefits (memory + performance). Note that the engine's sRGB/linear conversion may not match Blender's.
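For reference, here's a rough Python sketch of the standard sRGB transfer functions (a sketch only; Blender's internal conversion may differ slightly, which is exactly the mismatch being described):

```python
def srgb_to_linear(c):
    # Standard sRGB decode (IEC 61966-2-1), c in [0, 1]
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Inverse transform: linear [0, 1] back to sRGB-encoded [0, 1]
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055
```

If you bake pivot data into byte vertex colors, you may need to undo this curve in the shader (or mark the attribute as non-color data) before the values are usable as positions.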
I have seen a very similar effect on twitter that was done by wiggling the leaf texture back and forth, that seems easier to set up but looks maybe slightly worse
How did I not find this channel before now? Really hope this guy makes more content
great work and really well explained, thank you :)
Seems like you love doing this kind of work; you should consider using Houdini.
Fascinating stuff!
If you're saving the pivots, why not also save rotations? Then you can just draw them as particles at runtime. But this would be useful if there were mixed kinds of leaves or various meshes for the effect.
GOATED
This is basically like pushMatrix() and popMatrix() for anyone familiar with Processing 3 or p5.js
I suspect the non-linear color space transformation happened at export from Blender due to the color space settings being set to sRGB.
Bro.... youre so smart bro...
I feel it would realistically just be instanced meshes that are moved about by a vertex shader. Given that all the leaves are essentially the same mesh you could 'instance' the mesh. Pretty much you send the mesh and vertex data (so things like the position, normals, uv coords for each vertex) to the GPU once, but send the transform matrix (position, rotation, etc of each leaf) to the GPU for each individual leaf. Although this is more effective for larger meshes, it could still be done here.
Then all you need to do is some sine wave magic in the vertex shader to move each vertex a little bit, and if you wanted a random look you could also use a RNG seeded to the leaf's position.
The vertex shader would simply offset the incoming vertex positions by this randomized sine wave function.
As a result you get the waving leaves with pretty much 0% CPU usage, which by extension means that you can use this for entire forests, though you probably don't want to use a trigonometric function like a sine wave for that if you want really good performance.
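A rough Python sketch of that per-leaf sine-wave idea (the hash constants are the usual shader-style magic numbers, and the wind direction here is a hypothetical +X only):

```python
import math

def hash_position(px, py, pz):
    # Cheap pseudo-random value in [0, 1) derived from the leaf pivot,
    # mimicking the fract(sin(dot(...)) * big_number) hash common in shaders
    h = math.sin(px * 12.9898 + py * 78.233 + pz * 37.719) * 43758.5453
    return h - math.floor(h)

def wind_offset(pivot, time, amplitude=0.1, frequency=2.0):
    # Per-leaf phase from the hash so leaves don't all sway in sync
    phase = hash_position(*pivot) * math.tau
    sway = math.sin(time * frequency + phase) * amplitude
    # Offset along a fixed "wind" axis; a real shader would use a wind direction uniform
    return (sway, 0.0, 0.0)
```

The same few lines translate almost directly into a vertex shader, with `time` coming from an engine-provided uniform.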
I may be wrong, but I think Godot currently has a limitation where you can't have an instanced mesh inside another instanced mesh. So doing this in a shader is a much better idea if you want to make a forest. Also, placing each leaf in Godot is not good practice; it's much better to do it in a 3D editing program like Blender, where you have better control over your mesh. Of course, you could write some kind of exporter which saves each leaf's location to a file and then reads it in Godot, but to be honest I think that's a much worse idea than embedding it in a mesh file or texture.
@@wojtek_pe Never really used Godot, but if it does support instancing you could simply have the trees and leaves as separate meshes: the trunks in one instanced call and the leaves in the second. The leaves' vertex shader would then offset all the leaves. Even then, 'recursive instancing' wouldn't be supported, as it doesn't exist (to my knowledge). You would just need to choose how you batch them yourself.
@@sirhallstein1336 Yes, you're right, you can do that in two instancing calls. But with this technique you can do it in one call, and I don't think two calls are any easier than this method.
I wouldn't even use a mesh for the vertex shader. The mesh could be generated efficiently from a tessellation + geometry shader so that the only vertex data you'd send would be the origin+radius of each of those spherical leaf clusters.
Sine waves are pretty efficient in a geometry shader - so I wouldn't worry about it.
@@SteveBakerIsHere Godot doesn't support geometry shaders on the basis that they're very inefficient on anything but Intel iGPUs
It'd be really cool if you could redo this tutorial but explain all the steps properly. Upon trying to do this myself, I feel like a bunch of extremely important steps get skipped. Still, great video and subbed!
next, let the GPU actually permanently modify the stored pivot cache and have it be affected by gameplay. You could have some input value for the wind/forces in the current area, let the GPU move all the leaves accordingly by modifying the pivot values, render out the pivot data texture to be read back on the CPU and re-input to the GPU again. This way your leaves can blow off the tree and be affected by forces determined by the CPU (like wind or a player swinging a weapon near the leaves, etc.), and the CPU doesn't need to do anything other than read and send the texture back to the GPU. This sort of compute shader work is tricky in Godot 3 as it's not really set up for it, but it can be done. Maybe it's easier in 4.
I don't know about vertex colors, cause then you're limited to the minimum and maximum values. You should try using a couple of uv coordinates, one for x and y and another for z (and whatever else you want to use, maybe a time offset? Idk)
Of course I don't know what Godot has I'm more a unity girl
Pog dude. Thx
1:50 you could make great use of geometry nodes here to avoid particle systems.
Wasn't there a section about leaf colour... maybe it was a different video
ua-cam.com/video/KfphtLRoUB0/v-deo.html
3:43
Press 'P' with the object selected and click "Separate by loose parts"
That will not bring back the origin information; every loose part's origin will be the same as that of the large monolithic mesh.
Really good content. Keep up the great work ;) sub
Making an array from child objects (leaves) and a "for each" scripted animation with some randomized values seems 100 times easier than what you suggest. I doubt it's more performant, though, but it should be okay... It'd be worth a try.
amaaaazing! Keep it going :)
3:23 misspelled colour
any reason it was done in colors? and not something like Vector3? smart approach, nonetheless. great work :)
Colors are vectors, with the difference that they're usually stored with very low precision, and sometimes SRGB conversion can cause problems. Vertex colors provide an easy way to store such information directly in the mesh while being accessible in a vertex shader and compatible with most game engines, mesh editors and mesh file formats. I'm not aware of any similarly compatible standard intended for storing vectors, other than higher precision vertex colors, or vertex weights (which often aren't accessible in shaders)
@@alex15095 ah okay. thanks for the explanation!
What's the benefit of doing it this way rather than each leaf being its own object, parented to the tree object, using the object center as pivot and z+ as forward vector? Is it for performance? Or is there some other cool benefit I'm missing out on?
It would be less efficient to do it that way, as now the CPU has to transfer all of those changes to the GPU every frame, whereas this method keeps all of the math on the GPU. Depending on whether or not the engine decides to use instancing to render your leaves, you might even end up with a very inefficient hundreds of draw calls to render each leaf individually, versus a guaranteed 2 draw calls for this method (trunk and leaves)
Wouldn't very similar code work as an animation directly in Blender, which could then be exported as one big jiggling frame-animated mesh, avoiding all the conversion details? Seems like something Python and keyframing through bpy could handle. Or does a game engine not import Blender animations, or is it about keeping control of the effect in the engine? What are some other approaches, if any (briefly of course lol, I don't expect an essay), if you have time to indulge my curiosity?
Thanks for your work, this channel was a great find.
doing it procedurally like this lets you change the parameters on the fly, so you can change stuff like wind speed and direction without having to generate and store many different animations. baked animations also use up more memory and may actually be slower to calculate each frame.
may i suggest looking at the branches of smaller trees in rdr2? i think it's the same technique but it's player-interactable and i think it even updates some of the collision
Dope
Is it possible to rotate them based on uv center?
Nice
@Martin Donald why did you stop?
I wish I could utilize this tutorials info but I just don't understand it...
Man, storing pivot data as color nice solution.
How do you think the triangles on the ground were added? They don't overlap, and they don't seem to appear in a raster, aka some hexagon, square, rectangle, or triangle tiling. How come they're so well spread out? Is it Voronoi noise? Probably? Some sort of random points -> triangles with space between each other. How can that be done?
if you mean the texture on the ground that's likely just hand drawn. if you look closely you can see it repeats every few meters so it's definitely not generated in real-time.
@@henkle1610 Thanks
Couldn't you do unary minus rather than multiply by -1?
Could something like this be used in Unreal Engine 5? Nanite is an amazing tool, but it doesn't allow meshes to bend or change: meshes must be static. However, static doesn't mean they can't move. One of the biggest complaints is that you couldn't simulate leaves moving, because that normally bends the leaves' meshes/triangles. So could this be used to just pivot the leaves so it looks like the wind is blowing, but still have Nanite active on the foliage?
noice noice immediate sub
I don't understand how this speeds things up.
couldnt you just generate a bunch of random points, apply the rotation then generate the meshes using a geometry shader instead?
am i crazy or is there no pop filter on your mic
Solid video! Instasub!
Will mention that if you're building a larger mesh and you notice some jank in the vertex animation, using a color array is probably better than using a texture for the same effect. A 32-bit color texture only has 8 bits per channel, so 256 steps per axis.
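As a rough illustration of that precision limit, here's a Python sketch of packing one coordinate into an 8-bit channel (the ±10-unit bounds are hypothetical; pick whatever fits your mesh):

```python
def encode_pivot(p, bounds=10.0):
    # Map a coordinate in [-bounds, bounds] to an 8-bit channel value (0..255)
    return round((p / bounds * 0.5 + 0.5) * 255)

def decode_pivot(byte, bounds=10.0):
    # Inverse mapping, as you'd do in the vertex shader
    return (byte / 255 - 0.5) * 2.0 * bounds

# Worst-case spacing between representable positions for +/-10 units in 8 bits:
step = 2 * 10.0 / 255  # roughly 0.078 units per quantization step
```

That ~0.08-unit snap is the "jank" in question; a float color array (or a 16/32-bit float texture) removes it.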
You are overthinking it way too much. Yes, pivot caching is used for tree animation, but a more complex one. What you are seeing here is really just shader vertex position offset masked by V coordinate of the leaf UV space. :)
Instead of moving the mesh to the origin, rotating by your wind vector and returning it, why not just offset the wind vector by your pivot vector? Shouldn't that achieve the same thing?
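For clarity, the "move to origin, rotate, move back" step is just the standard rotate-about-a-pivot pattern; simply offsetting the rotation vector doesn't reproduce it, because rotation is applied relative to the origin. A minimal Python sketch (rotating about the Z axis only, as an example):

```python
import math

def rotate_about_pivot(v, pivot, angle):
    # Translate so the pivot sits at the origin
    x, y, z = v[0] - pivot[0], v[1] - pivot[1], v[2] - pivot[2]
    # Rotate about Z (a real wind shader would build this axis/angle from the wind)
    c, s = math.cos(angle), math.sin(angle)
    rx, ry = x * c - y * s, x * s + y * c
    # Translate back
    return (rx + pivot[0], ry + pivot[1], z + pivot[2])
```

Note the pivot itself is a fixed point of the transform, which is exactly why each leaf needs its own pivot baked into the vertex data.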
instead of a script, you could instead use material displacement with a random seeded from: instance-id, particle-id, or random per island