Eating breakfast while watching Robin work his magic is kinda like nodding off during math class. I'll watch attentively, look down for a small bite, then look up and suddenly pie is involved! Crazy how much a few of his clicks can accomplish
I generally use Substance Designer for this sort of things. Glad to see someone doing the same thing in blender, shows how flexible the geo node system can be!
Substance Designer is just OP —
I can forget about seamless tiling and focus on just making the most awesome work.
In other programs, you have to figure out the tiling manually, which often requires intensive work and a bunch of math.
It's a shame though, because sometimes, instead of making a leaf or rock from scratch in Designer,
I want to just use whatever geometry I already have,
without baking every single piece.
Geometry Nodes needs a little help in the seamless-tile department, and a "wrap around" viewer mode too
Yeah, this isn't really procedural. It's faking it because Blender can't hack it. The only current DCC that can get close to Designer in R E A L proceduralism for texturing details like this is Houdini. Not Blender.
@@proceduralcoffee "Blender can't hack it". Just because there are additional steps to achieve similar results, lol?
At 6:30, for the subsurf: it's more resource-intensive, but I would go ahead and use adaptive subdivision set to about 1 px. That'll ensure any displacement you bake isn't losing any resolution between input and output
When he said "not in the shader editor" I put the gun down
when he said "not in the shader editor" I picked it up. This is outrageous. It's unfair. How can you be a material and not be made in the shader editor. Robin might be a witch????
😂
@@FullHeart_Art Idk but trying to make that material fully procedurally is like begging for death
When he wrote "step 4: crack weed" I picked the pipe up 😳
@@FullHeart_Art Take a seat young texture 🤨
This video gave me many ideas, thanks. One suggestion: to avoid the guesswork with fog start and end, Geometry Nodes has an Attribute Statistic node, which can calculate the minimum and maximum of the mesh's Z coordinate automatically (Position → Separate XYZ → feed Z into Attribute Statistic as the attribute). Then you'd calculate the height, write it as a named attribute, and access it using the Attribute node in the shader
Totally doable! But there are a couple of drawbacks I ran into with that. For one, you have to collect all your objects into one geonodes object and realize instances, which (depending on your scene) can be fairly heavy. And then it doesn't support opacity maps. So for example, the plants in this material would get hard polygonal edges instead of this nice breakup. But you're totally right about the guesswork, which we want to eliminate. I'm working on an addon to automate a lot of this.
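The min/max idea in the suggestion above is easy to sanity-check outside the node graph. Here is a minimal plain-Python sketch of what the Attribute Statistic node would compute (the vertex heights are invented example data, not from the video):

```python
# Sketch of the Attribute Statistic idea in plain Python: take the mesh's
# vertex Z values, find min/max (fog start/end), and normalize a height
# into 0..1 like the named attribute the shader would read.
# The vertex list below is hypothetical example data.

def fog_range(z_values):
    """Fog start/end = (min Z, max Z) of the mesh."""
    return min(z_values), max(z_values)

def normalized_height(z, z_min, z_max):
    """Map a height into 0..1 between fog start and fog end."""
    return (z - z_min) / (z_max - z_min)

verts_z = [0.0, 1.0, 4.0, 3.0]             # hypothetical vertex heights
start, end = fog_range(verts_z)
print(start, end)                          # 0.0 4.0
print(normalized_height(1.0, start, end))  # 0.25
```

Same math, just without the node UI: min/max once per mesh, then a per-point remap.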
Great stuff! I really like your style of how you cover the content and the pacing of the narration.
Great tutorial!
This gave me a deeper understanding of geo nodes
Great workflow and walkthrough! Lots of nice tricks. Thanks
"Here's the catch , It's not made in Shader editor"
Bro got PhD in catchphrases
Awesome video seriously, gives me a ton of ideas for new materials
Amazing work, thanks for sharing and for going into the detail, the drawing you did to explain the normal map math was super didactic.❤
I like combining all the models with a plane under them and exporting to SD.
Just bake all the maps and use an ID mask.
Build the graph per usual and you're basically done.
Very well explained. Thank you very much. Always a pleasure to watch and learn. 🙂🙃
This is a good tutorial and is easy to follow, I cant wait for more!
Thank you for sharing. 👍
>super simple
>proceeds with a half an hour walkthrough of probably the most confusing, complex and advanced feature of blender
Just kidding, so cool that you can almost do things in blender that were mostly only available in apps like Substance
I need to point out something; I have done relevant experiments. If you need to use Blender's renderer to make animations, then the node editor and shader editor in Blender are very useful tools (though Blender is not convenient for UV unwrapping, especially on complex models). Blender's Cycles renderer can render very realistic scenes, but the way this renderer works is still different from V-Ray, Arnold, or Marmoset Toolbag 4. For most workflows, you still need PBR textures as the final output, which means texture baking is essential. Adobe Substance 3D Designer is also edited in a node-based way, and can easily export PBR maps or smart materials for Substance. Although there are many Blender videos, some things are not so easily replaced; Blender's shortcomings still exist. Of course, if you want to quickly render beautiful images in Blender, the node editor and shader editor will let you do exactly that.
Thanks, we all know that if you spend thousands of dollars, you can buy a bunch of programs that will be (marginally) better than Blender😂. Most of the "shortcomings" you've listed are non-issues if you know what you're doing. Mentioning "PBR textures" is especially disingenuous. Strictly speaking, there is no such thing as a PBR texture. They are just images, and it is the render engine that uses them that is PBR or not. It doesn't matter if they are made in Blender or MS Paint. He literally describes how to render every single input a PBR shader may require.
@@DimitriX-zq1dr I am honest when comparing the pros and cons of different software. Blender is free, but that doesn't mean Blender is perfect. I just think you need to learn more instead of attacking others. Free does not mean it is the best; it depends on the final work scenario. It's not that you can only work in one particular program.
@@DimitriX-zq1dr It seems like you don't even understand what a PBR texture is. It's an exported image, but most of the time you need to export as PBR because you need to render on different platforms. If you only work in Blender, I have no problem with that, but that doesn't mean other software is bad. You made a fundamental mistake in understanding.
@@DimitriX-zq1dr PBR textures usually contain a roughness image, a normal image, a color image, and a height image. These are custom-made images, ultimately used as textures on meshes. The complexity of a PBR texture represents how custom it is; it is difficult to edit without the images. Most of the time you need a normal or specular image that you can tweak. This is the basics. You are just attacking anyone who doesn't use Blender as their primary platform.
❤ Substance Designer, my baby forever
You should screenshot the nodes after you're done.
Game changer!
I think the non-color data really ought to remain linear. You are non-uniformly compressing the direction in weird ways by pushing it through a gamma curve. The only reason it looks odd is that the texture is assumed to be sRGB when it should be *read* as linear (Non-Color Data) as well
You're absolutely correct! I felt bad about cutting that aspect of the workflow, but the tutorial just became a bit too long.
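For anyone wondering what that distortion looks like in numbers, here is a small plain-Python sketch (the texel value is invented for illustration, and `srgb_encode` is just the standard sRGB transfer curve, not a Blender API call):

```python
def srgb_encode(c):
    """Standard sRGB transfer function for a linear value in 0..1."""
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def decode_normal(texel):
    """Map a 0..1 texel back to a -1..1 normal vector."""
    return tuple(2.0 * c - 1.0 for c in texel)

# A linear texel encoding the unit normal ~(0.6, 0.0, 0.8):
texel = (0.8, 0.5, 0.9)
right = decode_normal(texel)                                 # read as Non-Color
wrong = decode_normal(tuple(srgb_encode(c) for c in texel))  # sRGB applied by mistake
print(right)  # ≈ (0.6, 0.0, 0.8): unit length, Y component is flat
print(wrong)  # Y jumps to ≈ 0.47: direction is bent, vector is no longer unit
```

The mid-range values (like the 0.5 here) get pushed hardest by the gamma curve, which is exactly the non-uniform compression described above.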
13:40 You can wrap X and Y with a modulo by 1, shifting by 0.5 before and shifting back after. An optimization.
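If I'm reading that tip right (my interpretation, so treat the details as an assumption), the node math is: shift by 0.5, take the coordinate modulo 1, shift back, which wraps values into [-0.5, 0.5). In plain Python:

```python
def wrap_centered(u):
    """Wrap a coordinate into [-0.5, 0.5): shift by 0.5, modulo 1, shift back."""
    return (u + 0.5) % 1.0 - 0.5

# Anything that drifts past the tile edge reappears on the other side:
print(wrap_centered(0.7))   # ≈ -0.3
print(wrap_centered(-0.6))  # ≈ 0.4
print(wrap_centered(0.2))   # ≈ 0.2 (values already inside are untouched)
```

Applied to X and Y, this gives the wrap-around behavior without duplicating geometry.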
Is nobody going to point out that this guy literally looks like dani in almost every way
Wow! Very interesting! ⭐⭐⭐⭐⭐
This is a "procedural material", but it's really using more of a high-poly-to-low-poly baking workflow instead...
Nice method though, will use it
change the title to "procedural cobblestone GN" please 😢
Cool! I was going to ask if these are tileable in the end, but since you show the textured sphere I guess they are (maybe I missed the part where you say it)
They are!
It was at minute 9 that I said... "ok, f that", I can live without knowing
oof i accidentally learned something useful. Thanks!
Yeah, this is a way to generate tiled textures; the video title is obviously clickbait. These are not procedural materials at all. The textures tile just like any other texture, and have the same artifacts. Procedural textures do not tile; they do not repeat no matter how close or far you look at them, unless you want them to repeat. Apart from this, the video is nicely done and easy to follow, but for those who are after procedural techniques it is not useful.
Hm, not quite. What you're describing is world-space procedural workflows. Super useful for sure! But not the only kind of procedural. This is UV-space procedural. The same as Substance Designer, which is the industry standard for making procedural textures. Thanks for the compliment!
@@robinsquares This is just baking 3D geometry into 2D textures. Procedural textures are defined by functions; they require no human input apart from the math functions and parameters. Substance Painter creates 2D textures using procedural methods (algebra, trigonometric functions, random functions, color manipulation). The coordinate system for those functions is 2D, which is why it's UV, but that is only for texture generation. Using these textures does not create a procedural material.
UV and world space are just different coordinate systems. If the material is procedural, you can apply it on UV coordinates too.
So I guess you add the AO back by multiplying it in with an AO node or something, and then mix it with the diffuse color?
Can you also make this follow like say a path on a map?
Absolutely amazing! Looks like the kind of thing where you make an entire start-up file just for creating textures. :-) FWIW, the difference in quality between JPEG 95 and JPEG 100 is indistinguishable, while the file size is significantly smaller.
I'm not sure why cutting off 0.5 of the edges stops the replication. I would have to think it's greater than 8/9ths. Is it because the center plane is centered, so 0.5 is actually the edge of the plane? (I really hate how blender puts the middle of everything at zero, and half the time uses diameter and half the time uses radius. :-)
You're right on the money
@@robinsquares Thanks for the confirmation! I was half way through asking the question when I realized the answer, but I thought I'd leave it here for anyone else who might be confused. :-)
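The same answer in numbers, for anyone else puzzling over it (the sizes below are assumptions about the setup, not taken from the video):

```python
# With a tile plane 1 unit wide centered at the origin, its edges sit at
# ±0.5. Duplicates in a 3x3 grid sit at offsets -1, 0, +1, so a camera
# region spanning [-0.5, 0.5] frames exactly the center tile: the
# neighboring copies end right where the crop begins.
tile = 1.0                           # hypothetical tile size
crop = (-tile / 2, tile / 2)         # camera region edges
copy_centers = [-tile, 0.0, tile]    # one axis of the 3x3 grid
left_copy_right_edge = copy_centers[0] + tile / 2
print(crop)                  # (-0.5, 0.5)
print(left_copy_right_edge)  # -0.5: the neighbor stops exactly at the crop
```

So 0.5 "works" only because the plane is centered; half the tile size along each axis is the edge.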
Nothing says procedural like importing textures and models. I guess Substance Designer is still light years ahead of Blender
I texture like this because 2D is limiting. Also for exporting textures, I bake to a plane instead.
Yeah, I wonder why he doesn't use that method. But I guess it's an issue with all those separate mesh object parts
@@RomboutVersluijs That shouldn't be a problem
@@NicCrimson You end up with separate baked textures. It only bakes the active material, right? Since we put the empty texture inside the active material. Or am I missing something?
Baking to a plane is probably the best approach for the normal map.
@Fafhrd42 Yeah, but then I won't get the normals from the plants and the cobbles. Those are different meshes and different materials
PS: Could you simply swap the mist values? This method worked, but swapping them should work as well
So we can't get the roughness after the Principled BSDF node, only what goes into it. It would be nice if we could get it afterwards, because then, with a Mix Shader node and some complicated setup, we could still get the correct roughness
Anyone else keep hearing tiler as Tyler
I was about to advocate that he rename it to Durden, actually
Very cool. I ran into an interesting issue where this does not work when using imperial units. The camera does not crop to the middle square. Not sure why though
stick to units used by humans. problem solved ^^
@@Benn25 Ha! Oh how I wished we used metric!
@@brianmcveigh1958 you can, just use it ^^
Bro has not seen the Sanctus Procedural Materials library lol
How to import this to unity or unreal ?
Via baking, probably. When my model is done, I'd just turn it into fully fledged textures (albedo, normals). It will add bloat because each surface gets its own texture. There's a way to convert everything you want into a heightmap/normal map (the Mist pass). Baking usually applies only to textures (procedural → PNG), but it can apply here too.
Ah shit, gotta attempt to learn geo nodes again
Edit: ayyy its not too hard with this video
This is procedural modeling...
Basically it is like we are creating our own textures
Well, then it's not procedural though. The main point of procedural materials is that they can be tweaked on the fly. There's a big difference between procedural and merely procedurally created. The rendering part of the video takes nearly 10 minutes, so any iterative change to the supposedly procedural material is a ~10-minute chore. Truly procedural materials can be tweaked instantly, on the fly, without baking down to a regular PBR texture set.
True, but the immense advantage of this setup is that you can export the result to any software you want once you're done. If you stick to Blender, indeed I don't really see an immense advantage. The technique is very good though.
This isn't procedural, idiot.
Just why? The normal map should be in linear space; only the base color is sRGB. Why did you bake this in such a convoluted way and not in the traditional way? And why bake a diffuse map instead of albedo/base color?
The Render Result can be output in linear space if you set View Transform to Raw and Look to None in the Render properties. And top-down rendering is much more flexible than texture baking in this situation: you can render textures with transparent areas, or use post-processing effects from the compositor if you need to.
P.S. "Albedo" and "base color" are literally synonyms for diffuse in different rendering systems
@@DimitriX-zq1dr Diffuse includes shadows and AO while the other two do not, so they are not really suitable for a PBR workflow. Base color stores the colors of metals while albedo does not; in that case the specular map stores them.
@@davidcsakvari762 It is all still up to subjective definition, really. Diffuse doesn't have to include shadows; it depends on the art style and where the model will be viewed from. Nor does an image with baked shadows cause any serious issues with PBR rendering. Your GPU is not going to burn from that😂
Says it's going to be a geometry nodes texture and then proceeds to just import a material from Substance Designer ...
I gotta be real… this is cool. But this is so much easier in Substance Designer that it's shocking.
If you want to be a material artist, you need to use a different program. Blender is great, but the lengths you have to go to to make a brick texture, with fewer options than Substance, make it unusable by comparison…
That's all. It's just that I see people doing the most in Blender, and I know how much easier Substance is
I’m poor and a hobbyist. Can’t afford substance so I’m grateful for blender tutorials so that I can play too.
you only need to set it up once. and blender is free.
@@liialuuna facts, just make a template file and you don’t gotta ever do all that setup again
Substance is very good software, but it requires cooperation with other software of the same type. The advantage of Substance, however, is that it can easily export PBR textures. The biggest disadvantage of Blender geometry nodes is that they can only be rendered in Blender, and there are many difficulties when exporting textures. I still think Substance is the best texturing software, though sometimes Substance Painter and Designer need to work together. Substance really is widely applicable.
Is there a way to import custom meshes into Substance Designer? I've used the program to create materials from various noises and shape generators, but I'm not aware of a way to import your own sculpted stone tiles, or to instance leaves with a particle system, etc.
This is not a procedural material, as you render layers and then use them in the Shader editor. Clickbait