For Blender 3.4 and above, use the SAMPLE INDEX node instead of the TRANSFER ATTRIBUTE node (the Transfer Attribute node was removed). Set the SAMPLE INDEX node to Vector/Point and it will work just like the TRANSFER ATTRIBUTE shown in the video.
@@YNT49 yes, thank you, it does work when you plug an Index node into the Index slot of the Sample Index node. So for anyone watching: if you're having difficulties, this still works in the latest version of Blender (March 2023).
Connect Points from 'Distribute Points on Faces' to the Geometry input of 'Sample Index', then create a 'Position' node and an 'Index' node, connect them to the Value and Index inputs of 'Sample Index', and it will work.
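Under the hood, the Sample Index node just looks up an attribute value (here, a position) on the source geometry at a given index. A minimal Python sketch of that behavior, using made-up example positions (the data and names below are hypothetical, not Blender's actual API):

```python
# Sketch of what the Sample Index node computes: for each point,
# fetch the position stored at a given index on another geometry.
# The position list is hypothetical example data.

suzanne_positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]

def sample_index(positions, index):
    """Return the attribute value (a position) at the given index."""
    return positions[index]

# Point 1 of the distributed points reads vertex 1's position.
print(sample_index(suzanne_positions, 1))  # (1.0, 0.0, 0.0)
```

This is why the Index node must be plugged into the Index input: it supplies each point's own index, so point N samples vertex N of the other mesh.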
As a beginner this is quite astonishing; I'm more and more impressed by the day with what Blender can do. I love the nodes. Even though they were intimidating at first, they play such a great role.
Wow! You are a node ninja... I need to learn how to think like this but it's like trying to put a square peg in a round hole with a diagonal much greater than the diameter. Thank you for the excellent tutorial!
When creating this tree, my output is that the geometry connected to the Sample Index turns into a single point (or at least looks like it does) and doesn't morph from one shape to the other. Any tips?
Great, thank you :) How do I make one object morph into another that is 10 meters away? Is it possible? Same thing, but with the second shape in a different position than the first.
Hello, first, thanks for this wonderful tutorial. I want to add one more step but I couldn't do it. After the outcome shown in the video, I want it to turn into the original mesh (in this case the default Suzanne head). Maybe the duplicated objects could extend and come together to look like one mesh. It would be so good if that's possible, thanks a lot.
Thanks boss. FYI, you can also get bloom in Cycles by using the compositor's Glare node. For Cycles it's not realtime in the viewport until the next version of Blender.
Astonishing tutorial! Just a quick question: is it possible to make specific particles a different color? Right now everything is one color. Is it possible? Thanks Robbie!
@@RobbieTilton Thank you so much for your answer. Could you please explain it in a little more detail? How do you assign specific colors to specific particles? It is very important for my life to have a solution for this. I really appreciate your effort. Thank you in advance, Robbie.
It is possible if the geometry of the two meshes is the same (vertices, edges, and faces keep their original relationships); then you can use vector displacement in the texture shader, or use geometry nodes to do it. If you use two separately created meshes with different topology, then I guess you'd need a program to determine which point moves where, generate a 3D vector field, and use displacement or keyframes to achieve it. In many cases a "smooth" transition is mathematically impossible (see topology-related topics; an example is transforming a donut into a sphere), but visually you can do it just fine.
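For two meshes with identical topology, the per-vertex displacement idea above can be sketched in plain Python. This is just the math, not a Blender API call, and the vertex lists are hypothetical example data:

```python
# Sketch: compute per-vertex displacement vectors between two
# same-topology meshes, then blend with a factor t in [0, 1].
# Vertex positions here are made-up illustration data.

mesh_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
mesh_b = [(0.0, 0.0, 1.0), (2.0, 0.0, 0.0)]

def displacements(a, b):
    """Per-vertex vectors pointing from mesh A to mesh B."""
    return [tuple(bi - ai for ai, bi in zip(va, vb))
            for va, vb in zip(a, b)]

def morph(a, b, t):
    """Move each vertex of A a fraction t of the way toward B."""
    return [tuple(ai + t * (bi - ai) for ai, bi in zip(va, vb))
            for va, vb in zip(a, b)]

print(displacements(mesh_a, mesh_b))  # [(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)]
print(morph(mesh_a, mesh_b, 0.5))     # [(0.0, 0.0, 0.5), (1.5, 0.0, 0.0)]
```

The same vertex-to-vertex correspondence is what breaks down for meshes of different topology, which is why a separate matching step would be needed there.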
Hey! Great video! Do you know any way to use geometry nodes to interconnect these dots? For example, every dot connected to its closest 3-4 neighbors with nice blooming curves/cylinders?
Great video! However, I am trying to implement this with a point cloud I imported into Blender and it doesn't work. I guess the data structure of the imported point cloud is the problem. Any suggestions on how to solve this?
I loved this, very elegant to mix between the transferred/sampled attribute positions with the Mix node. I wonder if this solution came up intuitively, and whether you could give any insight into that process. Please and thank you.
I'm working on something morphing between two meshes for my project, but I need it packed into an animation so it can be triggered by an event in my game engine. So, can I do this morph and create an animation of it in Blender and export it, then use the animation in an engine like UE5?
You could do something like this tutorial mixed with blendshape export (ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-sdl-jpZ0NR0.htmlsi=qTSultqa0ymCMqBX), and then on the UE side you'd need to code a script to take the verts and do what you need to do there.
In 3.2 the Transfer Attribute node has a Source input instead of Target. I don't know if that makes a difference, but now the Mix node only moves the object's position instead of morphing.
That’s what I’m getting too, so the tutorial no longer works. Moving the slider just moves the position of one shape until it’s roughly the same size as the particle shape set for the simulation. Is there a way around this? I’m using Blender 3.4.1.
Good afternoon. Thank you so much for the lesson. Here's what I ran into myself: I took two objects from a project I had made earlier, and it didn't work. One object was added fine, but the second one wouldn't work at all. Then, just as a test, I typed two words as text objects, and the morphing went fine.
Interesting question. If you want solid objects to morph, you can use the Volume to Mesh node in Blender 3.2. If I have time to try it myself, I'll make a new video for it.
Interesting question... at some point I think you'd need to do a crossfade, but you could perhaps use a really high particle count to help make it less noticeable.
As soon as I connect the output from the Sample Index node to the position property of the Set Position node, I don't see the points of Suzanne being distributed.
Thank you! It was interesting! I wonder how people get to know which node to choose and how to connect them. Can they do that because they know Python? Could you also transition the color? Could you tell me which node to use and where to place it?
I've coded for 10 years, so I use a lot of that thought process to help figure out which nodes to use. You could easily transition the colors: just keyframe the color in the material shader graph to match the keyframes in your geo node graph.
@@RobbieTilton Thank you for your answer! > keyframe the color in the material shader graph to match the keyframes in your geo node graph. So... I should use a shader node? Or can I just keyframe the Principled BSDF in the material properties window?
Thank you for this amazing tutorial! There is just one thing I don't understand; maybe someone could explain it to me: the 'Position' node. I just don't understand how it works. When we put the Position node into one input of the MixRGB node and the Transfer Attribute into the other, how does each Position node get its info? Hope I'm understandable haha, thank you!
By default, the 'Position' node refers to each vertex position in the geometry. It essentially runs the same operation on each vertex (separately) all at the same time. We use the Transfer Attribute node to store each vertex position of a different geometry, so we're able to mix the vertex positions of the default geometry with the stored vertex positions. It is indeed a hard concept to wrap your head around, because the same 'Position' node can do several different things based on where it's plugged into the node graph and whether other nodes are intercepting it.
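The MixRGB node in this setup is effectively doing a per-vertex linear interpolation between the two position attributes. A small sketch of that math in plain Python (the positions below are hypothetical example values, not real mesh data):

```python
def mix(factor, a, b):
    """What the MixRGB node computes per component:
    a linear blend where factor 0 gives a and factor 1 gives b."""
    return tuple(ai + factor * (bi - ai) for ai, bi in zip(a, b))

default_pos = (0.0, 0.0, 0.0)   # the current vertex's own position
stored_pos = (2.0, 4.0, 0.0)    # position transferred from the other mesh

print(mix(0.25, default_pos, stored_pos))  # (0.5, 1.0, 0.0)
```

Animating the factor from 0 to 1 is what produces the morph: every vertex slides along the straight line between its two stored positions.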
@@RobbieTilton Ok, I think I understand it better now, thank you! One point that was really confusing to me was how the Position node selects the "default geometry", but by experimenting I think I figured out that it's the geometry directly linked to the Group Output. I'm new to geometry nodes and the technique used here is not something I would have spontaneously thought of haha, but I guess I need more experience to get it. Anyway, thank you for everything!
@@lpzmxiv136 No prob! Yeah, to make it more confusing, sometimes Position can refer to the entire object's position rather than the vertex's... all based on the nodes that come before and after it. Maybe one day they'll make it easier to understand.
@@RobbieTilton Hmm, so I must be doing something wrong. I've tried to transfer a MetaHuman head modified in ZBrush to the same MetaHuman head without the new sculpting (I'm still using UE4). Everything is squashed. I'll try to follow it step by step and see what I did wrong. Thank you for checking it out!
@@Amelia_PC Hmm... I haven't dealt with MetaHuman heads, but maybe it's the scale it comes into Blender at. Try pressing Ctrl+A to normalize the scale to 1,1,1.
@@RobbieTilton I think the scale is right. I've been modifying MetaHuman heads by sculpting them in Blender at a scale of 0.01 and never had any problem with it until now. I'm sure the scale is right going from ZBrush to Blender as well. But I forgot to mention I'm using the latest NVIDIA Omniverse Blender version; I'm not sure if that's relevant, though.
@@Amelia_PC You can send me your file in the 'help' section of my Discord. I'm still suspicious that scale is the root of the issue. Sculpting at 0.01 scale isn't a problem, but when you're instancing points of a mesh with a non-normalized scale, it can get wonky.
@@jamesxxxyz8775 You can use two Mix nodes to mix three meshes: one mixing the first two meshes, and another mixing the previous Mix node's result with the third mesh.
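Chaining two Mix nodes works because linear blends compose. A quick sketch with three hypothetical vertex positions (example data only):

```python
def mix(factor, a, b):
    """Linear blend between two vectors, as a Mix node does."""
    return tuple(ai + factor * (bi - ai) for ai, bi in zip(a, b))

# One vertex's position in each of the three meshes (made-up values).
p1, p2, p3 = (0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 2.0, 0.0)

first = mix(1.0, p1, p2)      # first Mix node fully at mesh 2
second = mix(0.5, first, p3)  # second Mix node, halfway toward mesh 3
print(second)  # (2.0, 1.0, 0.0)
```

Animating the two factors one after the other gives a mesh 1 → mesh 2 → mesh 3 morph.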
The tutorial came out two years ago, and geo nodes have changed a lot since then. Sorry if it no longer works, but you can always grab the file in the help section of my Discord and download an older version of Blender.
@@metacoder4912 Yes: hover your mouse over the value field in the MixRGB node and press 'I'. That will create a keyframe wherever you are in your timeline.