For Blender 3.4 and above, use the SAMPLE INDEX node instead of the TRANSFER ATTRIBUTE node (the Transfer Attribute node was removed). Set the SAMPLE INDEX node to Vector/Point and it will work just like the TRANSFER ATTRIBUTE shown in the video.
LEGEND. Thank you for solving my problem :)
The Sample Nearest node also needs to be connected to the Sample Index node's Index input; only then does it work.
@@YNT49 Yes, thank you, it does work when you plug an Index node into the Index slot of the Sample Index node. So anyone watching: if you're having difficulties, this still works in the latest version of Blender (March 2023).
Connect Points from 'Distribute Points on Faces' to Geometry on 'Sample Index', then create a 'Position' node and an 'Index' node, connect them to the Value and Index inputs of 'Sample Index', and it will work.
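For anyone scripting this rather than clicking it together, here is a rough Python (bpy) sketch of the wiring described above. The node group name and the way the typed 'Value' socket is picked are assumptions; adapt them to your own file.

```python
import bpy

def typed_socket(sockets, name):
    # multi-type nodes (like Sample Index) expose one socket per data type;
    # only the one matching the current data_type is enabled
    return next(s for s in sockets if s.name == name and s.enabled)

tree = bpy.data.node_groups["Geometry Nodes"]        # assumed name of the modifier's node group
nodes, links = tree.nodes, tree.links

distribute = nodes.new("GeometryNodeDistributePointsOnFaces")
sample = nodes.new("GeometryNodeSampleIndex")
sample.data_type = 'FLOAT_VECTOR'                    # "Vector"
sample.domain = 'POINT'                              # "Point"

position = nodes.new("GeometryNodeInputPosition")
index = nodes.new("GeometryNodeInputIndex")

# Points -> Geometry, Position -> Value, Index -> Index
links.new(distribute.outputs["Points"], sample.inputs["Geometry"])
links.new(position.outputs["Position"], typed_socket(sample.inputs, "Value"))
links.new(index.outputs["Index"], sample.inputs["Index"])
# the sampled Value output then feeds the Mix node where Transfer Attribute used to go
```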
@@olekstarasov Legend
As a beginner this is quite astonishing; I'm more and more impressed by the day with what Blender can do. Love the nodes; even though they were intimidating at first, they play such a great role.
Quick note: if you are using a collection input instead of a direct mesh, use a Realize Instances node first.
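A minimal bpy sketch of that tip, assuming a Collection Info node feeds the tree (a socket index is used because the output label differs between versions):

```python
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]             # assumed node group name
coll_info = tree.nodes.new("GeometryNodeCollectionInfo")  # brings the collection in as instances
realize = tree.nodes.new("GeometryNodeRealizeInstances")  # flattens instances into real geometry
tree.links.new(coll_info.outputs[0], realize.inputs["Geometry"])
# realize.outputs["Geometry"] can now be sampled like a single mesh
```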
omg u saved my sanity
Thank you so much! This helped a lot! Love the quick and easy info without a ton of wasteful talking.
Wow! You are a node ninja... I need to learn how to think like this but it's like trying to put a square peg in a round hole with a diagonal much greater than the diameter. Thank you for the excellent tutorial!
Thank you so much for making this video.
Greetings, thank you for your time for sharing this valuable information 💯
Thank you! Works well on Blender 3.5 with the amended changes in the comments.
great job it helped me with my project thank you
Maaaannnnn!!!! This is crazy... you opened a gateway of opportunities.
Outstanding. Simply outstanding. Thank you.
Very good and interesting video. I never thought this could be done with Blender.
Thanks for the video dude.😊
Banger tutorial! Just subbed
Thanks for the sub!
Hello, first, thanks for this wonderful tutorial. I want to add one more step, but I couldn't manage it. After the outcome shown in the video, I want it to turn back into the original mesh (in this case the default Suzanne head). Maybe the duplicated objects could extend and come together to look like one mesh. It would be great if that's possible, thanks a lot.
Amazing tutorial! How would I give the particles a bit of random animation during and after the transition? Thank you so much!
You can use a math node to add a noise node, and keyframe the intensity of that noise so there is no noise at the beginning and end of the animation.
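A rough, assumption-heavy bpy sketch of that idea: offset the mixed positions with a Noise Texture scaled by a keyframed intensity, so the noise fades to zero at the start and end (group and socket names are guesses; wire the remaining links by hand as noted in the comments):

```python
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]    # assumed node group name
nodes, links = tree.nodes, tree.links

noise = nodes.new("ShaderNodeTexNoise")          # per-point noise pattern
scale = nodes.new("ShaderNodeVectorMath")
scale.operation = 'SCALE'                        # noise vector * intensity
offset = nodes.new("ShaderNodeVectorMath")
offset.operation = 'ADD'                         # mixed position + noise offset

links.new(noise.outputs["Color"], scale.inputs["Vector"])
links.new(scale.outputs["Vector"], offset.inputs[0])
# plug the Mix node's output into offset.inputs[1], and offset's output into Set Position

# keyframe the intensity: zero at the start and end, peaking mid-transition
strength = scale.inputs["Scale"]
for frame, value in [(1, 0.0), (30, 0.5), (60, 0.0)]:
    strength.default_value = value
    strength.keyframe_insert("default_value", frame=frame)
```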
great tutorial, thank you!!! just subd - have to binge them all :P
Awesome tutorial! Very easy to follow!
Great video and nice use of Capture Attribute. You could also use the mesh primitive node 'Ico Sphere' to keep it more procedural.
Doesn't adding materials and stuff to it become harder then?
I loved this; very elegant to mix between the transferred/sampled attribute positions with the Mix node. I wonder if this solution came to you intuitively, and if you could give any insight into that process. Please and thank you.
In shader programming you often store position in the RGB channels of a float3 or float4, so the Mix node comes naturally from that practice.
really cool, thanks !
Brillant 🎉 thanks Sir;)
You're a hero
When creating this tree, my output is that the geo connected to the Sample Index turns into a single point (or at least looks like it does) and doesn't merge from one shape to the other. Any tips?
Weird... haven't seen that before. Check my pinned comment in case you're on a newer version of Blender, or share your file in my Discord 'help' section.
I don't know how you figured this out, but thank you. I went nuts looking for bloom, then I saw you were using Eevee, not Cycles.
Thanks boss. FYI, you can also get bloom in Cycles by using the compositor's Glare node. For Cycles it's not realtime in the viewport until the next version of Blender.
Ty
Hey! Great video! Do you know of any way to use geometry nodes to interconnect these dots? Like, for example, having every dot connected to the closest 3-4 with nice blooming curves/cylinders?
Thanks for the video. How would you change the shape of the particle to a sphere? Many thanks for any help.
You can press Shift+A in the node graph and instance a UV Sphere.
I did it exactly like your tutorial but the mix rgb is different and the sample index is maybe different. I can pay if you can help.
@@RobbieTilton and I can’t get the same result
@@13thnotehifireviews7 check pinned comment that says how to use sample index node
Is it possible to do this with full objects that have textures instead of a point cloud? Like two human characters, for example.
It is possible if the geometry of the two meshes is the same (vertices, edges, and faces keep their original relationship): then you can use vector displacement in a texture shader, or use geometry nodes to do it. If you use two separately created meshes with different topology, then I guess you'd need a program to determine which point moves where, generate a 3D vector field, and use displacement or keyframes to achieve it. In many cases a "smooth" transition is mathematically impossible (see topology-related topics; an example is transforming a donut into a sphere), but visually you can fake it just fine.
Great video!
However, I am trying to implement this with a point cloud I have imported into Blender and it doesn't work. I guess the data structure of the imported point cloud is the problem. Any suggestions on how to solve this?
GREAT VIDEO
thanks!
I have a question. I want the particles to move between two sets of coordinates while morphing; how is that achieved?
In the 'Object Info' node, set it to [Relative] instead of [Original]. Then it'll move between the positions of the Suzanne mesh and the sphere mesh.
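For anyone scripting it, the same switch looks roughly like this in bpy (node and group names are assumptions):

```python
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]   # assumed node group name
obj_info = tree.nodes["Object Info"]            # the Object Info node from the tutorial
obj_info.transform_space = 'RELATIVE'           # [Relative] instead of [Original]
```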
thank youuuu i loveeee youuuuuu
In 3.2 the Transfer Attribute node has Source input instead of Target. Don’t know if it makes a difference but now the Mix node only moves the object position instead of morphing.
In 3.3 everything works perfectly, as shown in the tutorial.
That's what I'm getting too, so the tutorial no longer works. Moving the slider just moves the position of one shape until it's roughly the same size as the particle shape you set for the simulation. Is there a way around this? I'm using Blender 3.4.1.
Astonishing tutorial! Just a quick question: is it possible to give specific particles a different color? Right now everything is one color. Is it possible? Thanks Robbie!
yes - you can create an "attribute" (variable) that you pass to a shader
@@RobbieTilton Thank you so much for your answer. Could you please explain it a little more in detail? How to assign specific colors to specific particles? It is very important for my life to have a solution for this. I really appreciate your effort. Thank you in advance Robbie
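One possible route, sketched in bpy with plenty of assumptions: in the geometry node tree, store a per-point value (for example the point Index, or a random value) with a Store Named Attribute node under a name like "particle_color"; then read it back in the particle material and map it to a color:

```python
import bpy

mat = bpy.data.materials["ParticleMaterial"]     # assumed material name
nodes, links = mat.node_tree.nodes, mat.node_tree.links

attr = nodes.new("ShaderNodeAttribute")
attr.attribute_name = "particle_color"           # must match the name stored in geo nodes
# depending on whether the instances are realized, attr.attribute_type may need
# to be 'INSTANCER' instead of the default 'GEOMETRY'

ramp = nodes.new("ShaderNodeValToRGB")           # Color Ramp: value -> color
bsdf = nodes["Principled BSDF"]                  # default node in a new material

links.new(attr.outputs["Fac"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], bsdf.inputs["Base Color"])
```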
Super
Hey, is it possible to do it without the dots, as full objects instead?
Good tutorial... btw, 'ico sphere' pronunciation: ay-cos sphere.
Can this method be used on objects like clothing, e.g. a shirt mesh?
I'm working on morphing between two meshes for my project, but I need it packed into an animation so it can be triggered by an event in my game engine. So, can I do this morph, create an animation of it in Blender, and export it? Then use the animation in an engine like UE5?
you could do something like this tutorial mixed with blendshape export ua-cam.com/video/sdl-jpZ0NR0/v-deo.htmlsi=qTSultqa0ymCMqBX
and then on the UE side you'd need to code a script to take the verts and do what you need to do there
Good afternoon. Thank you so much for the lesson. Here's what I ran into myself: I took two objects from a project I had made earlier, and it didn't work. One object was added fine, but the second one wouldn't work at all. It was only a test anyway; I just typed two words instead, and then the morphing went fine.
Can you change material using geometry nodes at the same time?
TOP VIDEO !
Is it possible to animate an object and then do the morphing? For example, a flying drone morphing into a plane, or anything like that.
Interesting question. If you want solid objects to morph, you can use the Volume to Mesh node in Blender 3.2. If I have time to try it myself, I will make a new video on it.
Thank you!
It was interesting!
I wonder how people figure out which node to choose and how to connect them.
Can they do that because they know Python?
Could you also transition the color?
Could you tell me which node to use and where to place it?
I've coded for 10 years, so I use a lot of that thought process to help figure out which nodes to use.
You could easily transition the colors. You can just keyframe the color in the material shader graph to match the keyframes in your geo node graph.
@@RobbieTilton Thank you for your answer!
> keyframe the color in the material shader graph to match the keyframes in your geo node graph
So... should I use a shader node???
Or can I just keyframe the Principled BSDF in the material properties window?
@@daysmiscellaneous9569 You can keyframe the Principled BSDF in the material properties window.
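For reference, the same keyframing can also be scripted; a minimal bpy sketch (the material name and frame numbers are assumptions):

```python
import bpy

mat = bpy.data.materials["ParticleMaterial"]     # assumed material name
base_color = mat.node_tree.nodes["Principled BSDF"].inputs["Base Color"]

base_color.default_value = (0.0, 0.4, 1.0, 1.0)  # start color (RGBA)
base_color.keyframe_insert("default_value", frame=1)

base_color.default_value = (1.0, 0.2, 0.0, 1.0)  # end color
base_color.keyframe_insert("default_value", frame=60)
```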
@@RobbieTilton Thank you! I'll try!
I have a question: what if I want both to appear without any transition? I mean, both meshes appearing together as particles.
Just add the geo nodes to two separate objects, distributing points and instances on each of them.
@RobbieTilton 😅😅
Sir!!
I forgot to update: apparently it's like this, just use 'Join Geometry', and it works!!
Great video! Sorry if the question is too silly, but how do you animate it?
Press 'I' while your mouse is hovering over any input and you can insert a keyframe.
Thank you for this amazing tutorial!
There is just one thing I don't understand; maybe someone could explain it to me: the 'Position' node. I just don't understand how it works. When we put a Position node into one input of the MixRGB and the Transfer Attribute into the other, how does each Position node get its info?
Hope I'm understandable haha, thank you!
By default the 'Position' node refers to each vertex position in the geometry. It essentially runs the same operation on each vertex (separately), all at the same time. We use the Transfer Attribute node to store each vertex position of a different geometry. So then we are able to mix the vertex positions of the default geometry with the stored vertex positions. It is indeed a hard concept to wrap your head around, because the same 'Position' node can do several different things based on where it's plugged in in the node graph and whether other nodes are intercepting it.
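Conceptually, what the Position / Transfer Attribute / Mix combination computes is just a per-vertex linear blend. A small NumPy sketch of the same idea (not Blender API, purely illustrative):

```python
import numpy as np

def blend_positions(default_pos, stored_pos, fac):
    """default_pos, stored_pos: (N, 3) arrays of vertex positions; fac: 0..1."""
    return (1.0 - fac) * default_pos + fac * stored_pos

# every vertex is handled the same way, independently and "at the same time"
current = np.random.rand(500, 3)    # stand-in for the current geometry's positions
stored = np.random.rand(500, 3)     # stand-in for the transferred/sampled positions
halfway = blend_positions(current, stored, 0.5)
```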
@@RobbieTilton OK, I think I understand this better now, thank you! One point that really confused me was how the Position node selects the 'default geometry', but after some experimenting I think I understood that it's the geometry directly linked to the group output.
I'm new to geometry nodes and the technique used here is not something I would have spontaneously thought of haha, but I guess I need more experience to get it.
Anyway, thank you for everything!
@@lpzmxiv136 No prob! Yeah, to make it more confusing, sometimes Position can refer to the entire object's position rather than the vertex's... all based on the nodes that come before and after it... maybe one day they'll make it easier to understand.
@@RobbieTilton Geometry nodes are a really weird place 😂😅
What if I want to go from a solid mesh to particles on the same object? Any tips?
Interesting question... at some point I think you'd need to do a crossfade, but you could perhaps use a really high particle count to help make it less noticeable.
As soon as I connect the output from the Sample Index node to the position property of the Set Position node, I don't see the points of Suzanne being distributed.
feel free to share ur file on my discord 'help' channel
using blender 3.4, transfer attribute got removed, and idk which node to use to replace it
for 3.4 - you can use the SAMPLE INDEX node instead. Set it to Vector/Point and it will work just as before.
Transfer attribute node is missing in blender 3.6.................... Plz... help
How can i morph between 3+ objects?
add another Mix node
@@RobbieTilton The color Mix node only takes 2 values. Is there a node I can use with 3+ values and slide between them?
@@jamesxxxyz8775 You can use 2 Mix nodes to mix 3 meshes: one mixing the first 2 meshes, and another mixing the previous Mix node's result with the 3rd mesh.
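A rough bpy sketch of that chain (node identifiers match the tutorial-era MixRGB node; newer builds use the generalized Mix node instead, and all names here are assumptions):

```python
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]   # assumed node group name
nodes, links = tree.nodes, tree.links

mix_a = nodes.new("ShaderNodeMixRGB")           # blends mesh 1 and mesh 2 positions
mix_b = nodes.new("ShaderNodeMixRGB")           # blends that result with mesh 3

links.new(mix_a.outputs["Color"], mix_b.inputs["Color1"])
# mesh 1 positions -> mix_a.inputs["Color1"], mesh 2 -> mix_a.inputs["Color2"],
# mesh 3 positions -> mix_b.inputs["Color2"], and mix_b.outputs["Color"] -> Set Position.
# Animate mix_a's "Fac" from 0 to 1 first, then mix_b's "Fac" for the second transition.
```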
@@RobbieTilton This is not working for me; will you please elaborate on it, or make a video on that, please?
@@muzaffarhussain6183 I've posted a blender file that mixes 3 meshes using this technique to my discord 'help' section
Does this work with one of the meshes rigged, skinned, and with a skeleton?
just tested. it indeed does work
@@RobbieTilton Hmm, so I must be doing something wrong. I've tried to transfer a MetaHuman head modified on Zbrush to the same MetaHuman head without new sculpting (I'm still using UE4). Everything is squashed. I'll try to follow step by step and see what I did wrong. Thank you for checking it out!
@@Amelia_PC hmm... i havent dealt with metahuman heads but maybe it's the scale that it comes into blender. try pressing cntrl+A to normalize the scale to 1,1,1
@@RobbieTilton I think the scale is right. I've been modifying Metahumans heads sculpting them on Blender and using the scale 0.01. Never had any problem with it until now. I'm sure the scale is right in Zbrush to Blender as well. But I forgot to mention I'm using the latest NVidia Omniverse Blender version. I'm not sure if it's relevant, though.
@@Amelia_PC u can send me ur file in the 'help' section of my discord. i'm still suspicious of the scale being the root of the issue. sculpting at .01 scale isn't a problem, but when you're instancing points of a non-normalized scaled mesh it could get wonky.
how is this better than using shape keys?
shape keys would require each mesh to have the same vertex count
I was not able to achieve this in Blender 3.4 ): not even using sample index. It goes from one shape to just a point );
Follow the steps again in detail and it will work. Every time someone has had the issue you're mentioning, they ended up mis-connecting a node.
how do you add keyframes to transition from one to another?
hover over the value you want to keyframe with your mouse cursor and press 'i' on the keyboard
@@RobbieTilton Do you mean on the MixRGB? I'm a beginner and a little bit lost... and what kind of keyframe should I insert? Location or rotation or something?
@@metacoder4912 Yes - hover your mouse over the value field in the MixRGB node and press 'I'. That will create a keyframe wherever you are in your timeline.
@@RobbieTilton thanks a lot i figured it out now
How to turn that into a loop?
Just keyframe the Mix node by hovering over the 'Fac' (factor) value and pressing 'I', or you can right-click and choose Insert Keyframe.
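For anyone who prefers to script it, the same keyframes can be set from Python; a minimal sketch, assuming the node group is called "Geometry Nodes" and the mix node kept its default "Mix" name:

```python
import bpy

tree = bpy.data.node_groups["Geometry Nodes"]   # assumed node group name
fac = tree.nodes["Mix"].inputs["Fac"]           # the factor slider on the Mix node

fac.default_value = 0.0
fac.keyframe_insert("default_value", frame=1)   # fully the first mesh

fac.default_value = 1.0
fac.keyframe_insert("default_value", frame=60)  # fully the second mesh
```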
Why wouldn't you show how to animate this? The keyframes aren't doing anything.
What do u mean? The keyframes are how you animate it.
I cannot find Transfer Attribute in the list.
see my pinned comment
The Sample Index node doesn't work the same way as Transfer Attribute... I hope you'll make a new video to clarify how it works... with many thanks.
Can someone help? It doesn't work whatever I try, please.
The tutorial came out 2 years ago and geo nodes have changed a lot since then. Sorry if it no longer works, but you can always download the file from my Discord 'help' section and use the older version of Blender.
I'm lost on how to animate this.
insert keyframe. hover and press [i] on keyboard
How did you select the particle without deselecting the Nodes object at 4:34?
I 'pinned' the geo nodes window. See 0:37. Pinning keeps the window up even when it's not the active object.
@@RobbieTilton ooh, didn't know that. Thank you!