Blender Tutorial - How To Use The Normal Attribute In Geometry Nodes

  • Published Aug 8, 2024
  • bit.ly/3N0iFrT Copy and paste the link to sign up for our free Geometry Nodes Starter Kit, including our free course, Blender cheat sheet, and procedural building pack.
    When working with geometry nodes, we can also work with the various attributes associated with our geometry, which can be used to influence our node setups.
    The normal attribute defines the direction of our face domain, letting us control other nodes, such as the Extrude Mesh node, to determine which faces get extruded. The normal attribute can also be used in other scenarios to control the selection and direction of the geometry we influence.
    In order to use the normal attribute successfully, we first need to identify where it is located in Blender, what type of data it is, and the various ways it can be used in our geometry node system.
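    The idea behind the normal attribute can be sketched outside Blender: a face's normal is the normalized cross product of two of its edge vectors, and "facing up" just means its Z component is near 1. A minimal illustration in plain Python (not Blender's own code; Blender computes this per face for you):

    ```python
    # Sketch of what the face normal attribute represents.
    def face_normal(a, b, c):
        """Unit normal of the triangle (a, b, c) via the cross product."""
        u = [b[i] - a[i] for i in range(3)]
        v = [c[i] - a[i] for i in range(3)]
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        length = sum(x * x for x in n) ** 0.5
        return [x / length for x in n]

    # A face lying flat in the XY plane points straight up: normal (0, 0, 1).
    top = face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0))

    # Selecting "upward" faces, as an Extrude Mesh selection input might:
    selected = top[2] > 0.5
    ```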

COMMENTS • 15

  • @SamMorell
    @SamMorell 2 years ago +3

    These "concept" or "terminology" Blender tutorials are terrific. Please continue to make these for a better understanding of how Blender works. We need the basics in order to be efficient and effective with our creativity. For example, and not related to this video, some Blender terms are thrown around (even within Geometry Node sockets and menus) without their formal meanings being fully explained. For example, Mesh vs Geometry vs Primitive . . . are they the same thing technically? I know they're all comprised of vertices/points/faces, but what is the technical distinction between those terms, if any? I guess I'm looking for a glossary of terms. Keep up the terrific work with these tutorials!

  • @mrofnoctonod
    @mrofnoctonod 10 months ago

    Thank you! This was a very helpful tutorial.

  • @retromograph3893
    @retromograph3893 1 year ago +1

    Great vid, well explained and useful!

  • @GoodByeWorld895
    @GoodByeWorld895 2 years ago +1

    great video! very helpful

  • @andrey730
    @andrey730 10 months ago +3

    How do I display face normals in the Geometry Spreadsheet as at 01:15? It seems to be hidden now in 3.6.3.

    • @URoblivion
      @URoblivion 4 months ago +2

      For you and anyone else wondering, as of 4.0, the option is now under "Mesh edit mode" instead of "Show overlays". It's one drop-down menu to the right, while still in edit mode.

  • @user-ev5ur7fw4t
    @user-ev5ur7fw4t 11 months ago

    Great job, thanks.

  • @apatsa_basiteni
    @apatsa_basiteni 1 year ago

    Thanks.

  • @afjer
    @afjer 3 months ago

    The selection works because "Selection" pulls the face's normal up the chain, receives the Z component as a number, and casts it to a boolean. Any non-zero number is True.
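    The implicit float-to-boolean cast described here can be mimicked in plain Python, whose `bool()` follows the same non-zero rule (this mirrors the behavior the comment reports, not Blender's actual internals):

    ```python
    # Any non-zero float casts to True; only exactly 0.0 is False.
    z_components = [1.0, 0.7071, -0.5, 0.0]
    selections = [bool(z) for z in z_components]
    # Only the face whose normal has z == 0.0 ends up deselected.
    ```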

  • @Broadsmile1987
    @Broadsmile1987 1 year ago +2

    I've been bitten before by assuming a normal's component would be perfectly 0 or non-zero when I used a String to Curves node and filled the curves. You'd think all normals should be exactly (0, 0, 1) in this case, and yet I was getting bugs. So don't plug Separate XYZ directly into a boolean socket; pass it through a Compare node using some small epsilon (I use 0.001 because that's the smallest value visible in an input field).
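    The tolerance-based check this comment recommends, a Compare (Greater Than) node with a small epsilon instead of a direct boolean plug, can be sketched in plain Python (the epsilon value is the commenter's, not a Blender default):

    ```python
    EPSILON = 0.001  # smallest value visible in a Blender input field

    def is_up(z, eps=EPSILON):
        """Mimic a Compare (Greater Than) node: accept z as 'up'
        when it is within eps of 1, not only when it equals 1.0."""
        return z > 1.0 - eps

    ideal = is_up(1.0)         # an exact normal passes
    noisy = is_up(0.99999988)  # a normal with float error also passes
    ```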

    • @CertifiedDoc
      @CertifiedDoc 1 year ago

      A perfectly vertical normal, mathematically, does have a value of (0, 0, 1). In graphics, however, it's (0.5, 0.5, 1).
      Normals' components can be either positive or negative (between -1 and 1), but they are usually stored in image files. Since image data only includes positive values, you lose the negative half of that range. To fix that, normals are re-mapped to fit between 0 and 1, where 0.5 represents zero.
      This is why a perfectly flat normal map has kind of a cornflower color instead of pure blue.

    • @Broadsmile1987
      @Broadsmile1987 1 year ago

      @@CertifiedDoc the issue I described is unrelated to image files: the normals at no point need to go through an image file.

    • @CertifiedDoc
      @CertifiedDoc 1 year ago

      @@Broadsmile1987 It doesn't matter; that's the way it's done. Blender encodes normals that way regardless of their source. Inspect the values of your normals. You'll see what I mean.
      Normals are encoded in the range [0,1], where 0.5 is the midpoint for each component of the vector. This is done (at least in part) so that they can both be read from and written to files using RGB channels. Whether the data ever touches a file is irrelevant.
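      The [-1, 1] to [0, 1] remapping described in this comment is a simple linear transform. A sketch of the encode/decode pair (an illustration of the mapping itself, not Blender's code):

      ```python
      def encode(component):
          """Map a normal component from [-1, 1] into [0, 1] for RGB storage."""
          return (component + 1.0) / 2.0

      def decode(value):
          """Invert the mapping when reading the normal back."""
          return value * 2.0 - 1.0

      # A straight-up normal (0, 0, 1) encodes to (0.5, 0.5, 1.0) --
      # the pale blue of a flat normal map, rather than pure blue.
      flat_encoded = [encode(c) for c in (0.0, 0.0, 1.0)]
      ```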

    • @CertifiedDoc
      @CertifiedDoc 1 year ago

      Values below 0.5 indicate that a normal is pointing inwards. While this is possible, it's not very common and is usually a mistake.

    • @Broadsmile1987
      @Broadsmile1987 1 year ago

      @@CertifiedDoc I've inspected normal values plenty of times. I assure you, when you read the normal of a face spanning the XY plane (all verts have Z=0), the normal will be an array of 3 float32 values: 0.0, 0.0 and 1.0. The issue I described in the first comment is not related to how vectors are stored differently in bitmaps; it's related to some kind of float precision error, where apparently for a realized and curve-filled character instance, the produced face(s) will have normals very close to, but not exactly, (0, 0, 1), e.g. 0.000001999999994950485415756702423095703125 for X and Y, and 0.99999988079071044921875 for Z. So even though comparing two floats is fine, when you don't have knowledge of (or a guarantee the implementation won't change) how a float value is calculated, it's safer to use a tolerance-based approach: rather than checking if Z = 1, check if Z > 0.9999.
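      The failure mode described above can be reproduced directly with the quoted Z value: it is not bit-identical to 1.0, so an exact comparison fails while the suggested tolerance check succeeds.

      ```python
      # The Z component quoted in the comment, read back as a float:
      z = 0.99999988079071044921875

      exact_match = (z == 1.0)       # fails: float error breaks equality
      tolerant_match = (z > 0.9999)  # passes: the tolerance absorbs it
      ```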