Not understanding - You just baked pbr texture maps, how is this better than a procedural material? I get you can go back and change the geonodes setup then re-bake, but that’s much less procedural than doing all this directly in the shader editor no?
It's a little more work, yes. Takes about 20 seconds to re-render. But in return it actually looks good, so I think it's a worthwhile tradeoff. And it's more procedural than downloading PBR maps, which you can't adjust at all. It's a technique worth knowing.
@ totally agree it’s a very helpful technique to know, and yeah if you’re willing to render a ton of variations - it could probably be setup to be tiled very well with good variation
So we can't get the roughness after the Principled BSDF node, only what goes into it. It would be nice if we could get it afterwards: if we have a Mix Shader with some complicated setup, we could then bake out the correct combined roughness.
yeah. this is a way to generate tiled textures, the video title is obviously clickbait. these are not procedural materials at all. the textures do tile just like any other texture, and have the same artifacts. procedural textures do not tile, they do not repeat, no matter how close or far you look at them: unless you want them to repeat. apart from this the video is nicely done, and easy to follow, but for the ones who are after procedural techniques this is not useful.
Hm, not quite. What you're describing is world-space procedural workflows. Super useful for sure! But not the only kind of procedural. This is UV-space procedural. The same as Substance Designer, which is the industry standard for making procedural textures. Thanks for the compliment!
@@robinsquares this is just baking 3D geometry into 2D textures. Procedural textures are described by functions; they don't require human input apart from the math functions and parameters. Substance creates 2D textures using procedural methods (algebra, trigonometric functions, random functions, color manipulation). The coordinate system for those functions is 2D, that's why it's UV, but that is only for texture generation. Using these textures does not create a procedural material. UV and world space are just different coordinate systems. If the material is procedural, you can apply it on UV coordinates too.
@@9b0 Surely to remain procedural the way you describe it requires that whatever engine it's being used in supports that particular format of procedural textures 🤔 Meaning it must be engine-specific. Is there any standard format for runtime-procedural materials? Otherwise this is kind of a pointless comment. EDIT: Ah, unless the intention is to keep the materials within Blender, in that case you're right and this isn't exactly a procedural material. I'm game dev so I was seeing this through my game dev eyes, where things have to be exported in a compatible format 🙃
Designer is "math functions" because it's in UV space, and Blender is what, hand drawn? 3D space is as full of math functions as 2D space, bud. If Designer can make tileables and call them "procedural materials", then so can this. It's just using geometry (vector math) instead of height maps (2D math). If you're confusing "procedural materials" with what you'd get out of Substance Painter, well, that's a bit different. But by and large people don't think Substance Designer can't do procedural materials while Painter can. One makes tileables and the other is specific to each piece of geometry it's connected to. Generally people just think procedural means you can tweak a slider and get different variations.
I think the non color data really ought to remain linear. You are nonuniformly compressing the direction in weird ways by pushing it through a gamma. The only reason it looks odd is because it assumes the texture is sRGB when it should be *read* as linear (or Non Color Data) as well
Yeah, I was a bit confused when he set the output colorspace to Rec. 709/sRGB. Then changed it for the base color! Everything that isn't an actual color should be in linear space.
Well, then it's not procedural, though. The main point of procedural materials is that they can be tweaked on the fly. There's a big difference between procedural and just procedurally created. The rendering part of your video takes nearly 10 minutes, so any iterative change to the supposedly procedural material is a ~10 minute chore. Truly procedural materials can be tweaked instantly, on the fly, without baking down to a regular PBR texture set.
true, but the immense advantage of this setup is that you can export the result to any software you want once done. If you stick to Blender, indeed I don't really see an immense advantage. The technique is very good though.
I think you're conflating "procedural" with "realtime". That's just not the same. Houdini can make procedural recipes for fancy FX, yet rendering the FX can take many hours; the workflow is still procedural. It's the fact that you are building a procedural system that makes it procedural, not the rendering speed.
Very cool. I ran into an interesting issue where this does not work when using imperial units. The camera does not crop to the middle square. Not sure why though
Substance Designer is a different type of tool, though. It's like saying a hammer is light years ahead of a screwdriver because the screwdriver isn't great at hammering nails. That's only true until you try to use the hammer to remove a screw, and suddenly the hammer isn't so far ahead. I don't agree that importing textures and models makes it non-procedural. This video's textures are procedural in that the elements of the final texture are added procedurally. They could also be generated procedurally, but why would you want to do that? And FWIW, you still import textures in Substance Designer. How would you achieve the leafy pattern otherwise? Good luck doing it with only maths.
@@clonkex I've made leaves, grass, and other tiny details entirely with nodes in Designer. The reason for doing so is to have full artistic control over your creation. There is nothing wrong with using pre-made textures, but it's going to severely limit your possibilities.
Absolutely amazing! Looks like the kind of thing where you make an entire start-up file just for creating textures. :-) FWIW, the difference in quality between JPEG 95 and JPEG 100 is indistinguishable, while the file size is significantly smaller. I'm not sure why cutting off 0.5 of the edges stops the replication. I would have to think it's greater than 8/9ths. Is it because the center plane is centered, so 0.5 is actually the edge of the plane? (I really hate how blender puts the middle of everything at zero, and half the time uses diameter and half the time uses radius. :-)
@@robinsquares Thanks for the confirmation! I was half way through asking the question when I realized the answer, but I thought I'd leave it here for anyone else who might be confused. :-)
Cool! I was going to ask if these are tileable in the end, but since you show the textured sphere I guess they are (maybe I missed the part where you say it).
"and we're going to do it all entirely within blender" - proceeds to use a majority of assets made and then sourced from outside of blender. Nothing wrong with sourcing assets, the whole reason I was interested at the start though was because you claimed it would all be done in blender. I already know how to source megascans, botaniq and geoscatter assets to get a similar look very quickly. Alternatively a material with a similar texture and displacement on a high poly plane will have the same look as well and save about an hour of fiddling
I need to point out something, as I have done relevant experiments. If you need to use Blender's renderer to make animations, then the node editor and shader editor in Blender are very useful tools (though Blender is not convenient for UV unwrapping, especially with complex models). Blender's Cycles renderer can render very realistic scenes. But the way this renderer works is still different from V-Ray, Arnold, or Marmoset Toolbag 4. For most workflows you still need PBR textures as the final output, which means texture baking is essential. Adobe Substance 3D Designer is also node-based, and can easily export PBR maps or smart materials for Substance. Although there are many Blender videos, I still need to point out that some things are not so easily replaced; Blender's shortcomings still exist. Of course, if you want to quickly render beautiful images in Blender, the node editor and shader editor will let you do that.
Thanks, we all know that if you spend thousands of dollars you can buy a bunch of programs that are (marginally) better than Blender 😂. Most of the "shortcomings" you've listed are non-issues if you know what you're doing. Mentioning "PBR textures" is especially disingenuous. Strictly speaking, there is no such thing as a PBR texture. They are just images, and it is the render engine that uses them that can be PBR or not. It doesn't matter if they were made in Blender or MS Paint. He literally describes how to render every single input a PBR shader may require.
@@DimitriX-zq1dr I am being honest when comparing the pros and cons of different software. Blender is free, but that doesn't mean it's perfect. I just think you need to learn more instead of attacking others. Free does not mean best; it depends on the final work scenario. You don't have to do all your work in a single piece of software.
@@DimitriX-zq1dr It seems like you don't even understand what a PBR texture is. It's an exported image, and most of the time you need to export PBR maps because you need to render on different platforms. If you only work in Blender, I have no problem with that. But that doesn't mean other software is bad. You've made a fundamental mistake in understanding.
@@DimitriX-zq1dr PBR textures usually consist of a roughness image, a normal image, a color image, and a height image. These are custom-made images, ultimately used as textures on meshes. The complexity of a PBR texture set reflects how customized it is, and it's difficult to edit without the images. Most of the time you need a normal or specular image that you can tweak. This is the basics. You are just attacking anyone who doesn't use Blender as their primary platform.
Via baking, probably. When my model is done, I'd just turn it into a fully fledged texture. (albedo, normals). It will add bloat because each surface gets its own texture. There's a way to convert everything you want into a heightmap/normal map (Mist pass). Baking usually applies only to textures (procedural -> png), but it can apply here too.
@@NicCrimson you end up with separate baked textures. It only bakes the active material, right? Since we put the empty texture inside the active material. Or am I missing something?
Even after all these years I'm still amazed that THIS is a free and open source software
I happily pay for it anyway, I don't know any other company that gets this insane development out of what little funding they have
When he said "not in the shader editor" I put the gun down
when he said "not in the shader editor" I picked it up. This is outrageous. It's unfair. How can you be a material and not be made in the shader editor. Robin might be a witch????
😂
@@FullHeart_Art Idk but trying to make that material fully procedurally is like begging for death
When he wrote "step 4: crack weed" I picked the pipe up 😳
@@FullHeart_Art Take a seat young texture 🤨
When you add the random rotation to the cobblestones add Tau to x,y,z - not just Z. Then you get all 6 faces as possible selections on each cobble because it rotates on all 3 Axes.
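Outside of nodes, the same idea can be sketched in Python (the function name and structure here are mine, purely illustrative, not from the video):

```python
import math
import random

def random_rotation_euler(rng=random):
    # Sample a rotation angle in [0, tau) for each of the X, Y, Z axes.
    # Rotating on all three axes (not just Z) means any of a cobble's
    # six faces can end up on top, so the pattern repeats far less.
    return tuple(rng.uniform(0.0, math.tau) for _ in range(3))
```

In the node setup this corresponds to feeding a Random Value (vector, 0 to tau) into all three components of the instance rotation instead of only Z.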
This is legendary. Aside from the specific guide to the actual cobblestone texture, there’s a whole bunch of generally useful stuff in here that I learned a lot from! Thanks for sharing ❤
Eating breakfast while watching Robin work his magic is kinda like nodding off during math class. I'll watch attentively, look down for a small bite, then look up and suddenly pie is involved! Crazy how much a few of his clicks can accomplish
I generally use Substance Designer for this sort of things. Glad to see someone doing the same thing in blender, shows how flexible the geo node system can be!
Substance Designer is just OP. I can forget about seamless tiling and focus on just making the most awesome work. In other programs you have to figure out the tiling manually, which often requires intensive work and a bunch of math. It's a shame, though, because sometimes instead of making a leaf or rock from scratch in Designer, I want to just use whatever geometry I already have, without baking every single one of them. Geometry Nodes needs a little bit of help in the seamless-tile department, and also a wrap-around viewer mode.
@@proceduralcoffee 'Blender can't hack it'. Just because there's additional steps to achieve similar results, lol?
@@proceduralcoffee This dude spends his whole life watching blender videos only to comment nothing but negative things about blender. Talk about waste of oxygen.
@@proceduralcoffee Perhaps you could stick with Houdini if it makes you happy, or stay open-minded and learn some techniques even if you dislike it. I assure you, you'll find something interesting.
@@proceduralcoffee All the steps done in geometry nodes are procedural. Even the simplest instancing or array is proceduralism.
Even the first 2 minutes were priceless thank you.
Update: I'm 4 minutes in and learned so much. Thank you again.
Jeez, I've been looking for ages for exactly this kind of a tutorial - on producing procedural, tiling color, normal, and roughness maps. It answers literally all my questions. Thank you!
At 6:30, for the subsurf: it's more resource intensive, but I would go ahead and use adaptive subdivision set to something like 1 px. That'll ensure any displacement you bake isn't losing resolution between input and output.
Ok, this was insane. I have like, almost no idea what was going on. I'm not all that used to materials at this point, and geom nodes have scared me since I started Blender. But holy hell do I appreciate how much you can do with them. This was a fantastic video.
I used a similar technique for a few years now. It's such a fast way to get something done with planar texturing. Nice to see a more sophisticated approach when it comes to tiling and using collections.
Very nice tutorial. Easy to understand. The technicalities were explained with ease. As an educator, I approve you! :D
This is amazing, I was worried about being limited by freely available existing textures with limited options or having to learn something like substance designer but this makes creating new unique textures that tile way more intuitive and creatively freeing for my game dev plans!
This explanation of normal maps in Blender is what I needed!!! Thanks a lot
ya that part was gold
great explanation! kinda fun to think I ended up helping out in a less direct way with the normal map part!
This video gave me many ideas, thanks. One suggestion: to avoid the guesswork with fog start and end, Geometry Nodes has an Attribute Statistic node, which can calculate the minimum and maximum of the mesh's Z coordinate automatically (Position - Separate XYZ - put Z into Attribute Statistic as the attribute). Then you'd calculate the height, write it as a named attribute, and access it using the Attribute node in the shader.
Totally doable! But there are a couple of drawbacks I ran into with that. For one, you have to collect all your objects into one geonodes object and realize instances, which (depending on your scene) can be fairly heavy. And then it doesn't support opacity maps. So for example, the plants in this material would get hard polygonal edges instead of this nice breakup. But you're totally right about the guesswork, which we want to eliminate. I'm working on an addon to automate a lot of this.
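The min/max idea in the tip above, sketched as plain Python rather than nodes (these helper names are mine, not Blender API):

```python
def height_range(vertices):
    # Min and max of the vertex Z coordinates: the same numbers the
    # Attribute Statistic node computes, so no eyeballing fog start/end.
    zs = [z for _, _, z in vertices]
    return min(zs), max(zs)

def normalized_height(z, z_min, z_max):
    # Remap a Z value into [0, 1], ready to store as a named attribute.
    return (z - z_min) / (z_max - z_min)
```

The normalized value is what you'd write out and read back with the Attribute node in the shader.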
I didn't know most of this stuff
and I've been using blender since 2015
Worth mentioning that the desert bedrock material at 0:17 was originally created by Daniel Thiger, using Substance Designer, as a paid course on his Gumroad. Six years ago. Credit your fellow artists!
Wow, this tutorial broadened my horizons!
I've been using an orthographic camera for years to render out fancy noise maps etc. for VFX textures, but very rarely for PBR maps. Good workflow, good tutorial, nice work.
You don't even always have to stick to a plane: using geometry nodes you can often transform any mesh from its 3D shape to its 2D UV representation (as in moving the vertices into that position) and bake out maps that fit onto characters etc.
For the most accurate contrast in the mist pass, you can use a plane and just move it up and down (G, Z), then snap it to what looks like a high/low point (B). If the plane only intersects at one point, you know that's the maximum/minimum depth.
Great stuff! I really like your style of how you cover the content and the pacing of the narration.
>super simple
>proceeds with a half an hour walkthrough of probably the most confusing, complex and advanced feature of blender
Just kidding, so cool that you can almost do things in blender that were mostly only available in apps like Substance
It's literally a programming language. Pretty far from simple :)
> Says it's not done in the Nodes
> Proceeds to do 95% of it in the Nodes
At the start he made it sound like it was ENTIRELY done in the 3D modelling side of things, so I got a bit excited - and then immediately got disappointed as he flew through adding more and more nodes that are highly confusing to me. Still a great tutorial and end result though!
Well, since you said "they're made 100% in blender" i expected you to show the whole process, but _damn_ this video was interesting and well-made
Wicked Tutorial Bro. I am now subbed.👍👍👍👍👍
I figured my way around to something similar and ended up with a fiddly little geonode thingy but after seeing this video, I don't know why I didn't turn it into a material. Now I can't unsee. This is so perfect I think I'm going to spend an evening with several setups giving them a religious conversion.
Absolutely loved this information dense video!
Quite informative Robin!
Thank you 🙏
Great tutorial!
This gave me deeper understanding of geo nodes
I haven't thought that procedural shaders suck for a long time now. Simon Thommes has shown that procedural shaders are very versatile and easy to manipulate when built correctly. Some of my favorite examples are Pixar movies like Wall-E and Ratatouille, or Sony Pictures' Cloudy with a Chance of Meatballs.
Awesome video seriously, gives me a ton of ideas for new materials
I would call "next level" something that simplifies the process and achieves comparable quality in a shorter time.
I’m unclear on how one becomes good at what you’re doing here? You whip through this like it’s second nature - it’s impressive.
Basically you want to create a particular result, so you give it a shot. You fail miserably because you have no clue what you're doing but you learn a bit. Then, either because you're a persistent bugger or you just find it fun and exciting to experiment, you try again. This time you get a bit further and you learn a bit more. And so on and so forth. That's pretty much how most people self-teach themselves anything. For instance, I know how to do game programming and use Blender because when I was young all I could think about was making my own games, so no matter how long it took and how frustrating it was, I kept pushing through and eventually learned how to do it.
@ Yeah, I understand the basics of learning. I have a degree in programming and another in graphic design. I'm 55, and I'm a full-stack developer. I started programming when I was around 10 years old. I just picked up Blender recently, and I'm overwhelmed by its toolset.
@@SomewhatAbnormal Oh cool, well in that case it seems pretty straightforward. Just pick a goal and work through learning to get there. That's how you become good at what he's doing in this video.
I think you should not use bump and displacement together; Blender Guru once said as much. You should remove the normal map if you select "Displacement and Bump" under the displacement settings. I mean, the displacement already contains the bump data.
Amazing work, thanks for sharing and for going into the detail, the drawing you did to explain the normal map math was super didactic.❤
"Here's the catch , It's not made in Shader editor"
Bro got PhD in catchphrases
fr
This is a good tutorial and is easy to follow, I cant wait for more!
Great channel as always, thanks for your hard work ❤
Great workflow and walkthrough! Lots of nice tricks. Thanks
Very well explained. Thank you very much. Always a pleasure to watch and learn. 🙂🙃
This was fun. Thank you!
I'll use your discount for blenderkit! thank you!
This is genius!
You.... did use the shader editor.. to assign it. 😜
It's a great tutorial!
What I'm struggling with is: how do I transition between materials/objects? For example, road to sidewalk. If you use tileable textures for both, you'll get a weird transition. And if you don't use tileable textures... well, then you don't get the benefits of tileable textures, and you'll need much larger unique textures that probably can't be reused.
This is an amazing tutorial; I have been doing something like this for years, but the way you have it set up is way better. One thing I find is, if you ARE working with EXR and the data is linear, you can get a very accurate displacement map just by piping the Z depth pass through an invert node. I keep my camera at 1 unit from world zero; it works with transparency and negative values, as EXR allows values below 0 and above 1. You also don't have to fiddle with multipliers or scale values in the displacement node: just plug the map into Height, set the midlevel to 0, and the displacement is almost pixel perfect. I wish I could add an image here as an example. Thanks so much for the tutorial!
If using png or another format, just make sure your objects are above world zero to avoid clipping, but it still works basically the same way.
Oh, and use a map range node set to from 0,1 to 1,0 instead of invert. Forgot that part!
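For readers following this thread, here's the depth-to-height trick above as plain Python: a sketch standing in for Blender's Map Range node, where the 1-unit camera distance is this commenter's setup, not something verified here.

```python
def map_range(value, from_min, from_max, to_min, to_max):
    """Linear remap, like Blender's Map Range node (no clamping)."""
    t = (value - from_min) / (from_max - from_min)
    return to_min + t * (to_max - to_min)

# With the camera 1 unit above world zero, the Z-depth of a surface point
# is "distance from camera", so a depth of 0.2 means a height of 0.8.
# Map Range from [0, 1] to [1, 0] is the same as the Invert trick (1 - z).
for depth in (0.0, 0.2, 1.0):
    height = map_range(depth, 0.0, 1.0, 1.0, 0.0)
    print(depth, "->", height)
```

Mapping [0, 1] to [1, 0] and inverting are the same linear function here; the Map Range version just makes the endpoints explicit, which matters once the camera distance is something other than 1.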
Brilliant!
An interesting approach. You can make the same things and get better quality results using Quixel Mixer, which is available for free.
Without putting any thought into this, it seems like if you put the weeds generation into the same geometry nodes setup as the cobblestones generation, it would be easier to grow the grass only between the cobblestones and not on them 🤔
Very good howto, really! But geonodes are too difficult for me at the current time, sadly.
Amazing tutorial. I have a couple detail questions. Instead of the Add/Multiply functions at 23:05, why not use the Map Range node to remap [-1,1] to [0,1]?
For the displacement map, why use the mist pass instead of the Z pass? The Map Range node will let you specify the absolute distances without having to rely on the mist depth and there will be no falloff as there would be with the mist pass.
Shouldn't the normal map be in linear color space since it's non color data? Mapping it from [-1, 1] to [0, 1] is correct I believe, but since the pixels represent directions, not colors, then applying gamma correction is basically rotating them all the wrong way. Or am I missing something? I'm still trying to figure this out myself.
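On the first question in this comment: the Add/Multiply pair and a Map Range from [-1, 1] to [0, 1] compute the same linear function, as this plain-Python sketch shows (node math only, no Blender required; the sample values are just illustrative).

```python
def add_multiply(n):
    # Multiply by 0.5, then Add 0.5 (the node pair from the video)
    return n * 0.5 + 0.5

def map_range_norm(n):
    # Map Range node: remap [-1, 1] to [0, 1]
    return (n - (-1.0)) / (1.0 - (-1.0)) * (1.0 - 0.0) + 0.0

# Both encode a normal component in [-1, 1] as a color value in [0, 1].
for n in (-1.0, 0.0, 0.37, 1.0):
    assert abs(add_multiply(n) - map_range_norm(n)) < 1e-12
print("identical, e.g.", add_multiply(0.37))
```

So the choice between the two node setups is purely ergonomic; the Map Range version is arguably more self-documenting.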
Nice video, new subscriber 🤘😎
I'm not sure if this should be called procedural or not, since "using built-in texture algorithms and shaders" is what's usually considered procedural in Blender. Maybe this definition is quite limited in the industry sense, but it still throws people off, I guess 😂. Great video btw, subscribed!
Good job!
13:40 wrap x and y with a modulo of 1, shifting by 0.5 before and after. An optimization.
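Reading that comment as the usual wrap-around trick (my interpretation, not something the video confirms): shift so the tile is centered on zero, take a modulo of 1, then shift back. In plain Python:

```python
def wrap_centered(x):
    # Wrap a coordinate into the centered unit tile [-0.5, 0.5):
    # shift by 0.5, wrap with modulo 1, shift back.
    return ((x + 0.5) % 1.0) - 0.5

# A stone poking past the right edge reappears on the left edge,
# which is exactly what makes the texture tile seamlessly.
print(wrap_centered(0.7))   # ~ -0.3
print(wrap_centered(-0.6))  # ~  0.4 (Python's % is non-negative for a positive modulus)
```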
As already mentioned by others I would say this is more like a procedurally generated texture set than a world-space procedural material. Still really really interesting workflow btw. The thing that intrigues me is how are you modifying the texture when it's applied to the sphere. Have you just rendered an image sequence where you animated different properties of the setup just for visualization sake for the tutorial, or is there something more to it that I am missing?
Game changer!
Thank you for sharing. 👍
You should screenshot the nodes after you're done.
I want to follow this but I can’t find any good rocks / stones D:
So I guess you add the AO back in by multiplying it using an AO node or something, and then mix it with the diffuse color?
8:25 I might be missing something, or you did it this way for tutorial purposes, but you could have just swapped the Mesh to Points domain to Faces.
Where is the 'Equal To' node in Blender 4.3? It seems to have disappeared.
what is the "equal" node called in 4.1
PS: could you simply swap the mist values? This method worked, but swapping them should work as well.
i just realized i was being dumb, i was trying to do everything in the shader editor, but the method you showed is clearly simpler
Could you use this to create concrete stuff as well?
When rendering roughness, it turns out completely white, what may be the problem?
Not understanding - You just baked pbr texture maps, how is this better than a procedural material? I get you can go back and change the geonodes setup then re-bake, but that’s much less procedural than doing all this directly in the shader editor no?
It's a little more work, yes. Takes about 20 seconds to re-render. But in return it actually looks good, so I think it's a worthwhile tradeoff. And it's more procedural than downloading PBR maps, which you can't adjust at all. It's a technique worth knowing.
@ totally agree, it's a very helpful technique to know, and yeah, if you're willing to render a ton of variations, it could probably be set up to tile very well with good variation
Can you also make this follow like say a path on a map?
whoever said procedural materials suck doesn't know the power of them, or how to actually make them
So we can't get the roughness after the Principled BSDF node, only what goes into it. It would be nice if we could get it afterwards, because if we have a Mix Shader node with some complicated setup, we could then get the correct roughness.
Yeah, this is a way to generate tiled textures; the video title is obviously clickbait. These are not procedural materials at all. The textures tile just like any other texture, and have the same artifacts. Procedural textures do not tile, they do not repeat, no matter how close or far you look at them, unless you want them to repeat. Apart from this, the video is nicely done and easy to follow, but for the ones who are after procedural techniques, this is not useful.
Hm, not quite. What you're describing is world-space procedural workflows. Super useful for sure! But not the only kind of procedural. This is UV-space procedural. The same as Substance Designer, which is the industry standard for making procedural textures. Thanks for the compliment!
@@robinsquares This is just baking 3D geometry into 2D textures. Procedural textures are described by functions; they do not require human input apart from the math functions and parameters. Substance Painter creates 2D textures using procedural methods (algebra, trigonometric functions, random functions, color manipulation). The coordinate system for those functions is 2D, that's why it's UV, but that is only for texture generation. Using these textures does not create a procedural material.
UV and world space are just different coordinate systems. If the material is procedural, you can apply it also on UV coordinates.
@@9b0 I would say it's a procedurally created Texture set, but not a procedural material... Still cool though...
@@9b0 Surely to remain procedural the way you describe it requires that whatever engine it's being used in supports that particular format of procedural textures 🤔 Meaning it must be engine-specific. Is there any standard format for runtime-procedural materials? Otherwise this is kind of a pointless comment. EDIT: Ah, unless the intention is to keep the materials within Blender, in that case you're right and this isn't exactly a procedural material. I'm game dev so I was seeing this through my game dev eyes, where things have to be exported in a compatible format 🙃
Designer is "math functions" because it's in UV space, and Blender is what? Hand drawn?
3D space is as full of math functions as 2D space, bud.
If Designer can make tileables and call them "procedural materials", then so can this. It's just using geometry (vector math) instead of height maps (2D math).
If you are confusing "procedural materials" with what you'd get out of Substance Painter... well, that's a bit different. But by and large, people don't think Substance Designer can't do procedural materials while Painter can. One makes tileables and the other is specific to each geometry it's connected to.
Generally, people just think procedural means you can tweak sliders and get different variations.
Realistically you either use displacement or a normal map, not both. It may look better artistically tho
Is nobody going to point out that this guy literally looks like dani in almost every way
How to get this cobblestone model?
I think the non color data really ought to remain linear. You are nonuniformly compressing the direction in weird ways by pushing it through a gamma. The only reason it looks odd is because it assumes the texture is sRGB when it should be *read* as linear (or Non Color Data) as well
You're absolutely correct! I felt bad about cutting that aspect of the workflow, but the tutorial just became a bit too long.
Yeah, I was a bit confused when he set the output colorspace to Rec. 709/sRGB. Then changed it for the base color! Everything that isn't an actual color should be in linear space.
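To illustrate what this thread is getting at, here's a plain-Python sketch of what a stray gamma does to a stored normal (using a simple 1/2.2 power as a stand-in for the exact sRGB curve; the sample vector is just illustrative):

```python
def encode(n):                      # normal component [-1, 1] -> color [0, 1]
    return n * 0.5 + 0.5

def decode(c):                      # color [0, 1] -> normal component [-1, 1]
    return c * 2.0 - 1.0

n = (0.5, 0.5, 0.7071)              # a tilted, roughly unit-length normal

# Correct round trip: encode, store as Non-Color (linear), decode.
linear = tuple(decode(encode(v)) for v in n)

# Wrong round trip: a gamma curve sneaks in between encode and decode.
gamma = tuple(decode(encode(v) ** (1 / 2.2)) for v in n)

print("linear:", linear)   # recovers n exactly
print("gamma :", gamma)    # every component pushed toward +1: the direction is wrong
```

The gamma version isn't just darker or brighter, it points somewhere else entirely, which is why normal maps must be read as Non-Color/linear data.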
How does someone know all this stuff?? Where's this knowledge coming from? 😊
Can this material be used in game engines like Unity or UE, or only in Blender?
I'm confused, don't we want Normal maps to be non-color and linear?
hello, great tutorial! i am confused about the normal map part. I thought normals maps should be linear. why do we use the gamma correction?
This is a "procedural material", but it's using more of a high-poly-to-low-poly baking workflow instead...
Nice method tho, will use it
Well, then it's not procedural though. The main point of procedural materials is that they can be tweaked on the fly. There's a big difference between procedural and just procedurally created. The rendering part of your video takes nearly 10 minutes, so any iterative change of the supposedly procedural material is a ~10 minute chore. Truly procedural materials can be tweaked instantly, on the fly, without baking down to a regular PBR texture set.
True, but the immense advantage of this setup is that you can export the result to any software you want once done. If you stick to Blender, indeed I don't really see an immense advantage. The technique is very good though.
I think you're conflating "procedural" with "realtime".
That's just not the same.
Houdini can make procedural recipes for fancy FX, yet rendering the FX can take many hours. The workflow is still procedural.
It's the fact that you are building a procedural system that makes it procedural, not the rendering speed.
do you need to be subscribed to adobe services to be able to download the material?
Wow! Very interesting! ⭐⭐⭐⭐⭐
Very cool. I ran into an interesting issue where this does not work when using imperial units. The camera does not crop to the middle square. Not sure why though
stick to units used by humans. problem solved ^^
@@Benn25 Ha! Oh how I wished we used metric!
@@brianmcveigh1958 you can, just use it ^^
Nothing says procedural like importing textures and models. I guess Substance Designer is still light years ahead of Blender
Substance Designer is a different type of tool, though. It's like saying a hammer is light years ahead of a screwdriver because the screwdriver isn't great at hammering nails. That's only true until you try to use the hammer to remove a screw, and suddenly the hammer isn't so far ahead.
I don't agree that importing textures and models makes it non-procedural. This video's textures are procedural in that the elements of the final texture are added procedurally. They could also be generated procedurally, but why would you want to do that? And FWIW, you still import textures in Substance Designer. How would you achieve the leafy pattern otherwise? Good luck doing it with only maths.
@@clonkex I've made leaves, grass, and other tiny details entirely with nodes in Designer. The reason for doing so is to have full artistic control over your creation. There is nothing wrong with using pre-made textures, but it's going to severely limit your possibilities.
Absolutely amazing! Looks like the kind of thing where you make an entire start-up file just for creating textures. :-) FWIW, the difference in quality between JPEG 95 and JPEG 100 is indistinguishable, while the file size is significantly smaller.
I'm not sure why cutting off 0.5 of the edges stops the replication. I would have to think it's greater than 8/9ths. Is it because the center plane is centered, so 0.5 is actually the edge of the plane? (I really hate how blender puts the middle of everything at zero, and half the time uses diameter and half the time uses radius. :-)
You're right on the money
@@robinsquares Thanks for the confirmation! I was half way through asking the question when I realized the answer, but I thought I'd leave it here for anyone else who might be confused. :-)
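For anyone else pausing at this thread, the arithmetic works out like this (assuming a 3x3 grid of unit tiles centered at the origin, which is how I read the "8/9ths" remark):

```python
tile = 1.0
grid = 3 * tile                    # 3x3 arrangement: whole plane spans -1.5 .. +1.5
plane_min, plane_max = -grid / 2, grid / 2

# Because the plane is centered at zero, the middle tile spans -0.5 .. +0.5,
# so cropping the camera to +/-0.5 captures exactly that center tile.
center_min, center_max = -tile / 2, tile / 2

# The 8 surrounding duplicate tiles exist only so that geometry poking past
# one edge of the center tile is rendered re-entering from the opposite edge.
area_cut = 1 - (tile * tile) / (grid * grid)
print(plane_min, plane_max, center_min, center_max, area_cut)
```

So the crop does discard 8/9 of the plane's area, but 0.5 is still the right cut because it's measured from the centered origin, not from a corner.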
Where did you get the cobblestones from? PUFF, and they were there. Up to there you lost me.
Cool! Was going to ask if these are tileable in the end, but since you show the textured sphere I guess they are (maybe I missed the part where you say it)
They are!
The color image includes the ambient occlusion!!! That's an error, right?
it was at minute 9 that i said... "ok, f that", i can live without knowing
"and we're going to do it all entirely within blender" - proceeds to use a majority of assets made outside of Blender and then imported. Nothing wrong with sourcing assets; the whole reason I was interested at the start, though, was because you claimed it would all be done in Blender. I already know how to source Megascans, Botaniq and GeoScatter assets to get a similar look very quickly. Alternatively, a material with a similar texture and displacement on a high-poly plane will have the same look as well and save about an hour of fiddling.
I need to point out something; I have done relevant experiments. If you need to use Blender's renderer to make animations, then the node editor and shader editor in Blender are very useful tools (though Blender is not convenient when unwrapping UVs, especially with complex models). Blender's Cycles renderer can render very realistic scenes. But the way this renderer works is still different from V-Ray, Arnold, and Marmoset Toolbag 4. For most workflows, you still need PBR textures as the final output, which means texture baking is essential. Adobe Substance 3D Designer is also edited in a node-based way, and can easily export PBR maps or smart materials for Substance. Although there are many Blender videos, I still need to point out that some things are not so easily replaced; Blender's shortcomings still exist. Of course, if you want to quickly render beautiful images in Blender, then the node editor and shader editor will let you quickly export beautiful images.
Thanks, we all know that if you spend thousands of dollars, you can buy a bunch of programs that will be (marginally) better than Blender 😂. Most of the "shortcomings" you've listed are non-issues if you know what you're doing. Mentioning "PBR textures" is especially disingenuous: strictly speaking, there is no such thing as a PBR texture. They are just images, and it is the render engine that uses them that can be PBR or not. It doesn't matter if they are made in Blender or MS Paint. He literally describes how to render every single input a PBR shader may require.
@@DimitriX-zq1dr I am honest when comparing the pros and cons of different software. Blender is free, but that doesn't mean blender is perfect. I just think you need to learn more instead of attacking others. Free does not mean it is the best. It depends on the final work scenario. Not that you can only work on a certain software.
@@DimitriX-zq1dr It seems like you don't even understand what a PBR texture is. It's an exported image. But most of the time you need to export as PBR. Because you need to render on different platforms. If you only work in blender I have no problem with that. But that doesn't mean other software is bad. You made a fundamental mistake in understanding.
@@DimitriX-zq1dr PBR textures usually contain a roughness image, a normal image, a color image and a height image. These are custom made images. They are ultimately used as textures on meshes. The complexity of a PBR texture represents how custom it is. It is difficult to edit it without the images. Most of the time you need a normal or specular image that you can tweak. This is the basics. You are just attacking anyone who doesn't work in blender as their primary platform.
❤ Substance Designer, my baby forever
change the title to "procedural cobblestone GN" please 😢
Hey would these procedural materials be compatible with game engines like Unreal or Unity?
How to import this to unity or unreal ?
Via baking, probably. When my model is done, I'd just turn it into a fully fledged texture set (albedo, normals). It will add bloat because each surface gets its own texture. There's a way to convert everything you want into a heightmap/normal map (the Mist pass). Baking usually applies only to textures (procedural -> PNG), but it can apply here too.
Anyone else keep hearing tiler as Tyler
I was about to advocate that he rename it to Durden, actually
I texture like this because 2D is limiting. Also for exporting textures, I bake to a plane instead.
Yeah, I wonder why he doesn't use that method. But I guess it's an issue with all those separate mesh object parts
@@RomboutVersluijs That shouldn't be a problem
@@NicCrimson you end up with separate baked textures. It only bakes the active material, right? Since we put the empty texture inside the active material. Or am I missing something?
Baking to a plane is probably the best approach for the normal map.
@Fafhrd42 yeah, but I won't get the normals from the plants and the cobbles. Those are different meshes and different materials
The only missing part is how to make it seamless, which i guess shouldn't be a hassle with dedicated tools and separate maps
Good news: It IS seamless! That's what the first chapter was for.
@robinsquares wow okay, thanks for the reminder man, I scoured through the whole things to get a gist, I missed this part apparently :)