Buhhhh... Thank you for another tasty knowledge nugget. Btw it would be super funny and simultaneously helpful/professional if you moved your webcam footage off the hierarchy/outliner and put it in its own reserved space in your Blender layout. Maybe even throw in a Metal Gear Solid-esque filter on the footage (made with nodes of course). I think it would be helpful for people to see the status of the outliner (even though it doesn't change that much) during the process, and in general your virtual presence definitely deserves an upgrade given your accelerating notoriety. ;^)
Hi Erin, just a stupid question: where do you find the documentation for the new version? I just grabbed beta 3 and everything has changed in geometry nodes (just when I was beginning to understand the previous version). tx
The documentation is being updated at the moment and it should all be finished by the stable release of 3.0. I personally look at the developer commits if I need to understand something specific, but otherwise I work out my own answers. Bouncing ideas off people in my Discord and directly asking developers in the Blender chat have helped as well.
Hm, sort of. A field is just something that is sampled by the geometry. If it's a position field, then it'll return the XYZ coordinates of the vertex. If it's a noise field, it'll return the value of the noise formula. If it's an index field, it'll return the index of the geometry element.
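To put that in plain Python (just an analogy I made up, not Blender's actual API): a field behaves like a small function that every geometry element gets run through, and the kind of field decides what it returns per element.

```python
# Toy analogy of fields being sampled by geometry (names are hypothetical,
# nothing here is Blender's real API).
geometry = [
    {"index": 0, "position": (0.0, 0.0, 0.0)},
    {"index": 1, "position": (1.0, 0.0, 0.0)},
    {"index": 2, "position": (1.0, 2.0, 0.0)},
]

position_field = lambda elem: elem["position"]  # returns XYZ per vertex
index_field = lambda elem: elem["index"]        # returns the element's index
noise_field = lambda elem: sum(elem["position"]) % 1.0  # stand-in for a noise formula

# "Evaluating" a field just means sampling it for every element of the geometry:
print([position_field(e) for e in geometry])  # [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 2.0, 0.0)]
print([index_field(e) for e in geometry])     # [0, 1, 2]
```

The field itself carries no data until some geometry samples it, which is why the same noise field can give different results on different meshes.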
But how do you manipulate vertex groups of instanced geometry using the "selection" socket? Nothing appears in the modifier properties :( It seems like it only works for the emitter geometry and can't be done for imported geometry.
Just type in the name of the group. The modifier can only query the active object but that doesn't matter. You're just providing a reference to a column in the spreadsheet so if you input the correct name, it'll work
I've binge-watched your Geo Nodes playlist somewhat and, to be perfectly honest, I've still no idea how Geo Nodes works. I am trying to scatter objects randomly with random rotation on a plane in such a way that, if any of the instances exceed the plane's dimensions, they get duplicated on the opposite side. This is so that it produces a tiling texture effect. Another problem is self-intersection. I've no idea how any of this works or how you understand all of this. Is there a handbook I could download or something to better explain nodes and common use cases?
You can't check intersections between different instances that have been instanced at the same time. You won't be able to do a tiled scatter without doing something like a grid where you reposition the points with some tiling noise, etc. A lot of this stuff is hard logically because we don't have loops and we don't have an easy way to differentiate between instances after they're realised.
@@Erindale So using Distribute Points on Faces along with Instance on Points will generate instances that I cannot somehow differentiate based on their bounding boxes crossing the plane? What if I had 9 planes arranged as grid tiles? The middle plane would be used for the canvas and the other 8 would fill the boundaries of the canvas in such a way that, for example, the left plane is simply an offset of the right plane. Would this produce a tileable effect? I will try to fiddle with this tomorrow. I would love it if this were possible, because baking objects to a texture would be so much easier than manually scattering or using particles, then making sure everything tiles and deleting intersecting geometry.
If you instance a plane on a grid BUT DON'T REALISE the instances, then when you distribute points on that grid of planes, they'll have identical distribution patterns and therefore tile. You can then instance your objects and realise or do whatever you want afterwards, if that helps?
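The idea can be sketched in plain Python (a toy analogy with made-up names, nothing Blender-specific): if every tile gets the exact same point distribution, offsetting copies of it across a grid produces a pattern that repeats per tile and therefore tiles seamlessly.

```python
import random

# Distribute points on one unit tile with a fixed seed, like distributing
# points on an un-realised instanced plane: every copy sees the same pattern.
def distribute_points(seed, count=4):
    rng = random.Random(seed)
    return [(rng.random(), rng.random()) for _ in range(count)]

tile_pattern = distribute_points(seed=42)

# "Instance" that identical pattern across a 2x2 grid of tiles by offsetting it:
scattered = [
    (x + tx, y + ty)
    for tx in range(2)
    for ty in range(2)
    for (x, y) in tile_pattern
]

# Every tile holds the same local pattern, so tile edges line up with each other.
print(len(scattered))  # 4 tiles x 4 points = 16
```

Realising the instances first would give one big surface, and a single distribution over it would no longer repeat per tile, which is why the order of operations matters.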
Perfect video. This makes so much more sense now and there's so much Grasshopper-type stuff going into it. That bit about nodes looking backwards is really interesting. It lets you reuse a group of nodes without having to duplicate it.
Yes! It took me a few days to work out why it was a good design decision but as soon as I started making tools it became very clear how powerful that is!
@Erindale If the count of your heads in the thumbnail indicates the level of importance of this video, then you could add a few more :D Thanks mate, now I need some time to process the info... I'm a bit behind on all the recently added "Fields" stuff and am struggling a lot to get a grasp of it.
What would I do without Erindale? Just run around in a field like a headless chicken. Great explanation, as always. Now I understand a little more. THANK YOU! Dg
I am just a beginner with Blender, but these subtleties (round vs diamond sockets, with a dot or not, dashed or continuous wires...) really surprised me. Very interesting and good to know...
This is probably required training for geometry nodes beginners. If they don't get past this "context" concept, it will be impossible to understand. I didn't see it at first, but now I do. The way I would say it is something like this: a field's data is fed by the geometry data that is input into the green node currently being processed in the pipeline, and is specific to a vertex/point.
Do you know why normal shows up in the spreadsheet under the face domain if it's not an attribute? I thought normals were derived data, so I'm a bit confused.
This is an excellent tutorial! Thank you so much for outlining everything in such a clear manner. I really like the delivery, warm and encouraging without dumbing down the content too much
I guess, if you're a programmer, then you recognize a field as a callback. If you're not a programmer, it's quite a nice way of teaching you what a callback is. 🤓
Thanks for this! Is there a way to use an image texture and control UV mapping for displacement with these newer nodes (now that there's no more "sample attribute texture")?
This is the ultimate video on fields & geometry nodes which has finally filled up the gaps in my mental model and made it click. Can't thank you enough for this video.
Thank. You. Erindale! I was struggling with how everything fit together coming from the old attribute workflow and this video made a lot of things finally click in my brain.
No honestly, it took me a while too. It was when I found out the input nodes are just references to the spreadsheet, instead of actual data, that it clicked for me.
How can I use geometry nodes to actually build a mesh plane, control the position of each point, have the plane portions subdivided into many quads for cloth sim, and also build sewing threads to connect pieces of the object?
You can't create new edges or faces, but you can delete faces to leave edges behind. If your form is possible to make by manipulating a grid or other primitive, or by using the Curve to Mesh node, then you can delete faces to create sewing threads. In my last test, though, writing out a group for use as pins didn't work in my case.
A field is more like a function. When geometry evaluates the function, it creates an array of values that exists on the geometry. For example, a noise texture outputs a field. This noise texture is just a Perlin noise function. Without vectors from the geometry, it's not solvable; it requires the geometry to become meaningful. Functionally you're correct, you'll only ever see a list of values in the spreadsheet, but it's important to note that that array of values only exists because of the geometry. It cannot be disconnected from the geometry the way a list you create in Python or Grasshopper etc. can be.
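A rough Python sketch of that point (the noise formula here is a made-up stand-in, not real Perlin noise, and none of this is Blender's API): the function on its own produces nothing; only when a geometry supplies its vertex positions does it become the array of values you see in the spreadsheet.

```python
import math

def noise(vec):
    """Hypothetical stand-in for a noise formula; unsolvable without an input vector."""
    x, y, z = vec
    return (math.sin(x * 12.9898 + y * 78.233 + z) * 43758.5453) % 1.0

# On its own, `noise` is just a function, not data.
# Given a geometry's vertex positions, it collapses into a concrete array:
vertices = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.0), (1.0, 1.0, 1.0)]
values = [noise(v) for v in vertices]
print(len(values))  # one value per vertex -> 3
```

Delete the geometry and the array is gone too; that's the sense in which the field's values can't exist as a standalone list.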
I don't understand why Blender developers always want to reinvent the wheel. They could have taken inspiration from Houdini rather than complicating the workflow. Same thing with the nodes in C4D: they are a mess to use; as soon as you go above 50-60 nodes it becomes chaos. In Houdini, even after 200-300 nodes, everything is still very clear. Congratulations anyway on the quality and clarity of your tutorials.