Nice sneak peek, thanks. I didn't have a chance to give it a try yet, but the lighting looks so much better now, and this is just the beginning. What do you think about Cycles vs Eevee for realism, though? Could Eevee rival Cycles with the new ray tracing feature?
For everyone trying to fix the refraction: set the thickness value to "0" like he did, BUT also, in Material > Settings > Render Method, below the Dithered option, enable "Raytraced Transmission".
Hey Martin, I think the problems with the refraction of the cornea can be resolved with the new options in the material properties tab, where there is now an option for raytraced refraction.
In short: EEVEE Next does not use full ray tracing, but it has improved simulation-based features for better visual effects, which may seem similar to ray-tracing in terms of quality, but the underlying technology is different.
Blender Internal did raytracing first, but it was offline; Cycles has path tracing... Now that the tech has developed further and become more realtime, they're going back to raytracing, then on to faster, realtime path tracing, and wherever we're going next...
With raytracing available in 4.2 for Eevee, I could switch from Cycles to this. It'd save me a ridiculous amount of time rendering longer animations.
The example with the eyes makes the refraction issue seem like a small thing, but it's going to mess up the scene if you have water and/or glass, I suppose? Let's hope they prioritise fixing it for the full release.
There's a refraction option in the material tab settings, where the alpha clip/hashed setting is. That will fix the eye problem. At least I think so; it did for me.
@@MartinKlekner I have an eye setup that was originally based on Auto-Eye, and I got it working about 90%. You should set the refraction type to "Sphere", not Slab, and you also need to connect a value to the "Thickness" input on the Material Output node of your cornea material. If I understand the node correctly, setting the value to zero actually makes it use the object's volume to calculate the refraction, so it works basically like classic Eevee. It's fine on all but extreme close-ups, which show a tiny white infinite-distance square in the refraction for some reason. If you don't connect any value to Thickness, it just seems to use the scene's default thickness set in the render options, which is often way too big for something like eyes.
I've been waiting for this stuff desperately. Eevee is great for quick animatics, but turning those renders into something worthy demands an enormous amount of side work and workarounds, especially if you're trying to keep a character's lighting correct through a combat animation. Sometimes it's much more efficient to switch to Cycles, or to Unreal Engine 5. All the updates sound great, but I haven't stumbled across anything about light linking in Eevee, and that's a sad thing. Right now my workflow is based on rendering separate layers with different light collections, and honestly speaking, that's a huge pain. It would be great to have at least something like the light channels in Unreal.
Not even as a faster preview, so you can WYSIWYG-model your scenes before Cycles? That's how I use it most. The closer Eevee is to Cycles, the more convenient for me.
You can replicate the previous bloom without much effort with a few extra compositor nodes. Using the compositor for stuff like that gives you so much more power that it's actually quite frightening 👹👹🧛🧛
Is the "raytracing" only screen space, like the SSGI addon, or is it the real thing? Also, could it possibly make Radeon/AMD cards somewhat more viable? The Blender benchmark doesn't really give much (if any) info on how well Eevee works on each GPU type.
I think your eye problem can be fixed by turning on "Raytraced Transmission" in the material settings. I don't know if it's already on there, I'm just guessing, but still, great video!
4:56 It definitely must be a bug; we encountered the same problem when rendering one of the scenes in our latest animation. Strangely, for us the effect was reduced when rendering the image sequence, compared to rendering a single image or viewing it in the viewport.
Besides the GLOW, something that is also bothering me a lot in "EEVEE Next" is the strain and power draw it puts on the GPU. My old GTX 1080, with regular EEVEE (Blender 4.1), used to render a specific scene in 7 seconds per frame, pushing the GPU to draw at most 170W. The same scene in "EEVEE Next" takes around 32 seconds per frame and forces the GPU to draw 250W (voltage peaks). I'm not very comfortable with such power spikes stressing my GPU; I wouldn't be happy to damage it just because of "EEVEE Next". I tried to use the Nvidia Control Panel settings to limit the GPU's power draw but couldn't find a real scaling solution: either it becomes too slow, or it runs at top speed with no power limits. There seems to be no middle ground for GPU power draw, even when changing the max frame rate per app. By the way, thanks for your review.
If you desire a specific bloom effect as well as the new HDRI shadows, would it be feasible to composite render passes from separate Blender versions, or would that just take too much time to be worth it?
Thank you for the detailed video. Looks awesome. I'm moving over from Vue to Blender and I have my eye on your CGBoost 3D Environments course. Would the course require any different steps if I used the new Beta version of Blender or should I hold off from updating it? Thanks again for the video. Your channel is damn impressive
Where are some good resources for ancient Greek/Greek myth for places, characters and objects in blender? I really enjoy making them, I've already handmade some characters I'd like to put them in nice places
The problem with these tools is that while they are cool, they are VERY complex to learn and figure out. On top of that, you need detailed assets to use them, and if you're like me and don't 3D model, that can cost a lot. Worse, with marketplace assets, other users likely already have the same ones, so what you have is not truly unique. This is where I think AI will come in: with AI you'll be able to create worlds where you're not just using premade assets, since it can generate new content. I hope that with the rise of AI we'll get to the point where we can easily create detailed 3D worlds and customize them the way we want, far more easily.
I AM blown away simply because we have something between EEVEE and Cycles. The creative control and freedom you can have now with what is *effectively* cycles but can render *so much faster* on many more systems is insane.
Sadly, from what I've been able to tell, they haven't improved the transparency options: Alpha Clip was removed (VERY annoying for existing shaders), Alpha Blend still has the same bugs it had in earlier versions, and Alpha Hashed still has the same amount of noise.
Someone on the forum asked about this, and they were just told to build some complex node setup to mimic Alpha Clip. A very weird decision: something that was easy to do in legacy is now very complicated to do in Next.
Great video, thanks Martin!! How's the performance in the viewport, please? I use a lot of CC characters like you, but there is a lot of slowdown in the Eevee viewport. Is this closer to UE in terms of realtime? Looking forward to the release.
Great to hear! :-) When it comes to lots of CC characters, economy is still the key, so I recommend using LODs and as few smaller size textures as possible. Overall though, the viewport performance seems better!
@@MartinKlekner Thanks for the reply. No, I'm talking about only a couple of characters and only 2K textures. My PC specs are 32GB RAM and an RTX 2080 Ti, and UE5 runs loads of assets from CC/Kitbash etc. with no slowdown, so I'm wondering if Blender is on the same scale as Unreal yet regarding realtime.
I normally do 1v1 fight sequences, so I don’t have loads of characters on screen like your epic videos. Usually 2 CC characters, and assets from kitbash etc, but it slows down to a crawl in the viewport. (Using normal Eevee.)
Blender scenes look so nice and realistic with minimal effort; it beats UE every time in realism. The only missing thing is Nanite-like tech. I want infinite geometry (not for detail, but to save time on multiple iterations of optimization).
@@zedeon6299 Adaptive subdivision is doing the exact same thing. More geometry close to the camera, less far away. For more extreme decimation, geometry nodes handles that extremely simply.
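The "more geometry close to the camera, less far away" idea can be sketched in a few lines of plain Python. This is a toy heuristic I made up for illustration (halve the detail each time the camera distance doubles), not Blender's actual adaptive subdivision code:

```python
import math

# Toy distance-based LOD: drop one subdivision level each time the
# distance to the camera doubles, clamped to a sane range. This only
# mimics the idea of adaptive subdivision; Blender's real algorithm
# works on screen-space dicing rates, not this formula.

def subdiv_level(distance, base_level=6, reference_dist=1.0,
                 min_level=0, max_level=6):
    ratio = max(distance, 1e-6) / reference_dist
    level = base_level - math.log2(ratio)  # halve detail per doubling
    return int(max(min_level, min(max_level, round(level))))

near = subdiv_level(1.0)    # at the reference distance: full detail
mid = subdiv_level(8.0)     # 2^3 farther: three levels fewer
far = subdiv_level(1000.0)  # far away: clamped to the minimum
```

So a mesh filling the foreground gets dense geometry while the same mesh in the distance collapses to almost none, which is the time-saving behaviour the comment above describes.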
I have a question: should I get a 7900 XT or a 4070 Ti Super if I'm only planning to use Blender? Also, I heard a 12-core CPU is the most I'll need; is this true? First time building a PC, and Blender looks like fun.
It's good to see that even if Blender takes 2 steps back, at least they're taking 6 steps forward. I think I can live with that. Shadows from HDRI's in Eevee is FANTASTIC and I've been begging for it for a long time now
Does anybody else have problems transferring preferences from 4.1 to 4.2? After it's done, the add-ons and extensions windows are blank.
@@Alex-wg1mb If you don't find a solution, it's best to file a bug report. Otherwise they can't improve it.
@@Raimund58 Dug up an almost identical report; using a workaround until the final fix.
realtime displacement is a really big thing in my opinion, especially if you look at programms like gaea or world creator, who all utilize map displacement for insane terrain generation, just immagine beeing able to paint in a little house into the displacement and see it realtime, could be super awsome for big enviroment shots, with cycles it was always a pain imo
but the thing is how hard will it hit compute power though cause i'm not a beefy laptop with a gpu. no one is talking about this in their videos cause most of them have gpu
Let's hope post-process bloom will get much more love!
Things like tint, clamping, and fine-tuning the threshold can be done by adding extra nodes. So the node replacement is just core functionality, you can customize it further than the post effect.
the “fog glow” mode also has more options
@@kenziemac130Of course everything can be done by most basic image manipulation nodes. but why do we have bloom node? you can make it by levels and blur nodes as well , you will use levels to make highlights bright then blur it and then make a node called bloom, thats how people do in adobe premiere etc. if no bloom node exist., and for blender, simple things like vignette was never easy to do in blender for long. So its a big time loss. Therefore point is same goes for those extra settings for bloom, so its wrong to make people be node or image manipulation expert to do simple things. There will be various usermade nodegroups that cause lack of standards, its always better to have best or standard implementation of things Compositor is always longer way to do things for simple post processing due to the fact its missing many pratical nodes . They should just add same settings to compositor at least, its not that hard
yes they always downgrade and we wait for love so they fix something to old version. Even commercial softwares make old settings legacy for long time so people wont feel suddenly that a setting they always use is gone. Thats the only side commercial softwares are better. They dont easily get rid of some features by big changes
@@Ericaandmac or deliver it in a node-group
This is what I've been waiting for. Unreal Engine was always too much of a learning curve for the stuff I do so seeing EEVEE start to catch up is super exciting!
I didn’t realize this was going to be such a big update. I’m so excited now.
The thing with the eyes is quickly fixed by going into the Settings tab of your cornea material, setting the render method from Blended to Dithered, and activating Raytraced Transmission.
This frustrated me to my core until I figured it out lol. Easy fix in my opinion. Though for addons, it will be a little different. People would need to wait for the addon developer to update their addon for 4.2.
Enable refractions in the material properties tab: render method Dithered, Raytraced Transmission.
As for bloom, not a lot of settings were lost either; each setting can be recreated in the compositor, including the colour changes. You just need to add a little more than one node, but that can be solved by creating a node group once and saving the project. Maybe in the future they'll add a ready-made preset for bloom like the old Eevee's.
Thanks! Oh surely, one can achieve the same effect by adding more nodes; still, having the same options as in the old Bloom menu right away would be nice :)
Maybe you could do a tutorial on these nodes 😅@@MartinKlekner
@MartinKlekner If they're completely changing the workflow under the hood, a redundant bloom would only hurt things.
@@MartinKlekner Yes, "who cares about workarounds": by that logic we also don't need the bloom node, because people can easily build bloom from blur and levels or exposure nodes, so no one needs the bloom node either (or you don't need the Color Ramp or Map Range nodes, because you can build them from Math nodes; or we shouldn't have gotten the Principled BSDF, since we could use Blender Guru's PBR shader groups). The point is not whether you can make it or not; it's about an official implementation that is standardized for all users, so you aren't always building things yourself, can do simple things easily, and don't get a downgrade from previous workflows.
@@MartinKlekner Or maybe, instead of a redundant node, the great thing would be to have node presets. Finally.
Personally, I find that Blender lacks a solid global presets system. It's good that you can pull data blocks from other projects, but you literally have to organize the folders yourself and keep track of them all.
In Houdini, for example, you can make everything a preset, so the next time you open a new project, you have your presets for anything one or two clicks away. And that's not even talking about HDAs.
Complete game changer! I've got 30 years in 3D and I'm totally used to trying to keep render times down to under 10mins a frame, 5mins is optimal but this.... THIS!!! :D
Transparency: Material | Settings | enable 'Raytraced Transmission'
Great video, Martin, thanks for sharing, very useful. Love the new HDRI lighting and shadows and realtime displacement. 🤩 ~Zach
Glad you enjoyed it, Zach 🤩
Great job breaking down the changes Martin! Awesome work as always. :)
Thank you! 😊
The bloom in Eevee (while useful) was woefully limited, regardless of the number of sliders it had compared to the glare node.
You need to use render passes to get the most out of the compositor; using the diffuse direct and glossy direct passes with glare nodes gives you way more control over surface bloom than before. Using the emit and environment passes will give you amazing results for light glare.
The quick real-time bloom option was really important for testing ideas and designs quickly.
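The pass-based workflow described above can be illustrated with toy numbers (single-pixel floats, not real render data): the beauty image is roughly the sum of its light passes, so you can apply a glare-style boost to only the passes you choose and then re-add them.

```python
# Toy per-pixel pass recombination: beauty ~= sum of light passes.
# Applying a crude "glare" gain only to the emit and environment
# passes leaves diffuse and glossy surfaces untouched, which is the
# kind of control the compositor gives you over a baked-in bloom.
# All values are invented single-pixel floats for illustration.

passes = {
    "diffuse_direct": 0.30,
    "glossy_direct": 0.15,
    "emit": 0.40,
    "environment": 0.05,
}

def combine(p):
    """Beauty is (roughly) the sum of the light passes."""
    return sum(p.values())

def glare_pass(value, threshold=0.25, gain=0.5):
    """Crude stand-in for a Glare node: boost only bright pixels."""
    return value + gain * max(0.0, value - threshold)

beauty = combine(passes)

# Glare applied only to emit + environment, then recombined:
glared = dict(passes)
for name in ("emit", "environment"):
    glared[name] = glare_pass(glared[name])
with_glare = combine(glared)
```

Here only the emissive contribution gets brighter, while the diffuse and glossy passes pass through unchanged; in Blender the same selectivity comes from feeding individual pass outputs through Glare nodes before adding them back together.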
Exciting! I only use Cycles atm, but man, Eevee is coming up quick! And I'm happy to see these realtime renderers getting stronger.
I'm now using Eevee Next to render everything. It's fantastic. It's so fast; even with atmospherics it renders 4K super fast. Great animations that I paid an animator to do over a year ago can now be rendered and released. I'm so happy.
It's not fast; it's very slow on heavy scenes.
Displacement in Eevee is so great; we have been waiting for it for ages.
Holy smokes... this looks fantastic :O My New Year's resolution was to start learning UE5 for its real-time rendering, but now I think I will reconsider ;)
DEFINITELY learn Unreal, bro. Blender doesn't come anywhere close to rivaling the kind of technology in UE. Everything you're seeing here is grounded in dated screen-space technology, something Unity and UE were already doing as early as 2015. If you want to be able to render massive worlds in seconds with trillions of polys, and GI nearly on the level of Cycles without having to worry about so much CPU bottlenecking, you definitely want to go with the tool that is leading the industry right now. As a Blender user myself, I can't even begin to explain to you the benefits of UE5 over Eevee because there is just so much to cover. It's one of those things where you cannot fully understand it until you start really using UE, and then you'll be saying to yourself: "OMFG, why in the world did I EVER try to do this in Blender!!!??"
Take the leap as I did, and I promise you will NOT be disappointed. UE is not as hard as you think it is; it's actually very easy once you get used to it. Unity is fantastic too.
Take the image output, hit it with an RGB Curves node to clip the blacks down, blur it, blend it over the original image in Screen mode; there's your bloom. Done.
Not the point. Of course there are a million ways to do it. Previously, though, it was more convenient, with more immediate options.
@@MartinKlekner They don't get it bro. They just don't. I feel ya tho. Shame that they nixed it.
I strongly disagree with the bloom refactor, it was really nice and convenient as it was in 4.1. Big mistake in my opinion
@@MartinKlekner I can see the immediacy returning with the use of the library system to create a sort of post fx layer stack, with compositing node setups packed up as assets. There's no real evidence that this kind of solution is on the horizon, but it's the only reason that comes to mind that they'd remove the previous one click solution.
Oh, that's a good tip. But you get ghosting, if I'm correct? I know this sort of effect from Photoshop: you still see the sharp edges, unless you can make some kind of mask for the edges.
Regarding the cornea refraction issue, there IS a fix for this, and it's ALMOST the thing you tried: add a Value node, plug it into the Thickness input, and set it to ALMOST zero, but not actually zero. A value between 0.001 and 0.003 works fine for me. Also make sure Raytraced Transmission is checked in the material settings under Surface, and set the Thickness option to Slab if it isn't already.
If the iris behind the cornea looks a little dark, try adding nodes to hide the material from shadow rays: add a Mix Shader, connect the top input to the output of your current shader setup and the lower input to a Transparent BSDF, then add a Light Paths node and connect Is Shadow Ray to the Mix Shader's Fac input. Now plug the Mix Shader's output into the Material Output's Surface input.
Thank you so much for the tips!
4:50 for the eyes not refracting. Change the alpha blending to dithered instead of the blended option in the material for the eye. It should fix it hopefully.
doesn't :_C
Just found it: there is an option below the "Dithered" render method called Raytraced Transmission. That fixes it.
Oh, this really starts looking better and better. After trying it a few times when it came out, I completely abandoned it because it just didn't support enough of the features I was using (e.g. some material nodes), but now it's starting to look much better =) Gonna have to try it out later on; thanks for the video!
I've waited so long for displacement to work in EEVEE. Hail Blender.
Better to mention a few points:
- Raytracing, as super cool as it is, is screen-space based: if an object that emits light (direct emission or bounced from another source) is out of the frame, the indirect light it contributes simply doesn't exist. Always keep that in mind (especially in animation).
- Displacement also looks super cool, but AFAIK it requires a high-poly mesh to work properly.
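The screen-space caveat in the first point can be shown with a toy 1D model (invented numbers, not Eevee's actual tracer): a screen-space effect can only gather light from pixels that are inside the frame, so a bright object just outside the view contributes nothing, and its lighting pops off the moment it leaves frame.

```python
# Toy 1D "screen-space" gather: indirect light at a pixel is sampled
# from the rendered frame itself, so anything outside the visible
# window simply cannot contribute. Hypothetical scene: a row of
# world-space emission values, of which only a window is on screen.

world_emission = [5.0, 0.0, 0.0, 0.0, 0.0, 0.0]  # bright object at x=0

def indirect_at(x, frame_start, frame_end):
    """Sum emission from the two neighbours, screen-space style:
    neighbours outside [frame_start, frame_end) are skipped."""
    total = 0.0
    for nx in (x - 1, x + 1):
        if frame_start <= nx < frame_end:
            total += world_emission[nx]
    return total

# Bright object on screen: its neighbour picks up indirect light.
on_screen = indirect_at(1, frame_start=0, frame_end=6)

# Camera "pans" so x=0 leaves the frame: the same pixel now receives
# nothing, which is the popping you see in animation.
panned = indirect_at(1, frame_start=1, frame_end=6)
```

The world didn't change between the two calls, only the visible window did, yet the lighting result flips from lit to unlit; that's exactly why screen-space GI needs care in animated shots.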
Martin the best easy-friendly teacher of all time with knowledge beyond measure! Thank you!
Haha, appreciate it buddy 🙏☺️
I hope NPR workflows aren't forgotten with this update. Most of the raytracing features are screen-space, so they're discarded when using the Shader to RGB node.
If you're still having refraction issues after turning on raytraced refraction in the *material settings*, setting the thickness, enabling the refraction setting in Eevee, and also making sure the material is no longer set to "Alpha Blend" mode... it may be that the shader is not using the standard PBR shader. Some networks/nodes might not work properly in Eevee, and you may have to remake the material with a simpler basic PBR setup. But refraction definitely works.
the displacement node in real time looks amazing
Awesome video Martin! Eevee Next is super exciting
Thank you Cov! 🤗
Not having the Bloom, Refraction, and Ambient Occlusion settings is a deal breaker for me. I won't upgrade to this anytime soon.
Same for me, unfortunately. I work with a character asset 2-3 times a week that needs refraction.
The dev really thought that it would be great to merge all settings into "raytracing" option
And the raytracing itself is just SSGI :/
Super handy, thanks man! Here's hoping the glare node gets some bloom like settings.
Thank you, Louis! 😉 Let's hope!
I think the "bloom downgrade" you talk about perhaps seems like a step backwards on the surface, but there is so much you can do by adding other nodes that match what the previous bloom did, and more. Also, the glare node is much faster than it used to be.
I can't wait for when they add full raytracing support for eevee (not only screen space GI)
I'll retire Cycles completely from my workflow
Wow, I can’t believe they changed bloom. Used it on most of my eevee renders 😭
Not on topic, but your linothorax models are so very good. All your hoplite armors are majorly on point.
Thank you, sir, always a pleasure to meet a fan of ancient history & warfare 🙂
Bloom was a destructive effect, and those sorts of things should never be burned into a render. That being said, the compositor node should have had feature parity.
Really though, bloom is just a threshold-based extraction that is blurred, color corrected, and then added back onto the original image. It should be very, very easy to reproduce with a node network.
That being said, if you're aiming for realism, you should use the glare node.
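For anyone curious, the threshold → blur → add recipe described above really is tiny. Here's a toy sketch in plain Python on a 1-D list of pixel values (illustrative numbers and parameter names, not Blender's actual glare-node implementation):

```python
# Toy sketch of a bloom pass on a 1-D "image" (a list of pixel values).
# The threshold/strength parameters are illustrative, not Blender's.

def bloom(pixels, threshold=1.0, strength=0.5):
    # 1. Extract highlights: keep only the energy above the threshold.
    highlights = [max(p - threshold, 0.0) for p in pixels]
    # 2. Blur the extraction (a simple 3-tap box blur here).
    n = len(highlights)
    blurred = [
        (highlights[max(i - 1, 0)] + highlights[i] + highlights[min(i + 1, n - 1)]) / 3.0
        for i in range(n)
    ]
    # 3. Add the blurred highlights back onto the original image.
    return [p + strength * b for p, b in zip(pixels, blurred)]

image = [0.2, 0.4, 2.0, 0.4, 0.2]  # one bright "pixel" in the middle
print(bloom(image))  # the bright pixel now bleeds into its neighbours
```

In the compositor the same three steps would be a threshold/levels node, a blur node, and a mix (add) node back onto the render.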
Great work to Blender dev team ❤
I'm really thinking of downloading the new version. It now has amazing performance because it uses the GPU more (especially AMD GPUs), and EEVEE seems much better. I just hope they add back ambient occlusion, bloom and the other settings, since it was very simple to understand what they did.
It's everything I've wanted in EEVEE since 2.8! 🥹 I'M SO EXCITED!!!!
Been following EEVEE Next for years, and it's finally here.
Maybe I'll finally consider using Eevee now.
The only thing we can't do yet is baking textures or using buffers, I think, which would also be a neat thing to have.
this is hot
Maybe for my computer as well, but seriously, I can't wait to try out the final release of EEVEE Next!
To fix the eye refraction, go to Material > Settings > Thickness and change the mode from Sphere to Slab, and don't forget to set the thickness output to 0 as you did.
Sadly, not working, but thank you! :-)
@@MartinKlekner I just figured it out: basically, if you have a back light, it will break through the mesh. Just delete the back light, or make the cornea another sphere with a glass shader.
It doesn't have to rival UE. The number of hoops you have to jump through when doing, well, a VFX-ey shot in UE is no joke. Anyway, cool stuff, I hope all of it gets ported to macOS as well.
Lovely. Considering that at tme studios we are working on a short movie that is to be done in EEVEE, this is perfect.
Nice sneak peek, thanks. Didn't have a chance to give it a try yet, but the lighting looks so much better now, and this is just the beginning. What do you think about Cycles vs. EEVEE for realism, though? Could EEVEE rival Cycles with the new ray tracing feature?
By the way, another big thing: volume shaders can now be shaped by detailed meshes, and no longer just follow a bounding box!
For everyone trying to fix the refraction: put the thickness value at "0" like he does, BUT!!! also go to Material > Settings > Render Method and, below the Dithered tab, enable "Raytraced Transmission".
Hey Martin, I think the problems with the refraction of the cornea can be resolved with the new options within the material properties tab where it now provides an option for raytraced refraction
I do miss the bloom settings :(. Oh well, still a great alternative render engine!
This is very exciting, and a great video to introduce it - thanks!
Glad you liked it!
HDRIs with proper shadows!!
I think I just had a nerdgasm 😳
Thanks For this! Love your content 🎉🎉🎉🎉
In short: EEVEE Next does not use full ray tracing, but it has improved simulation-based features for better visual effects, which may seem similar to ray-tracing in terms of quality, but the underlying technology is different.
Whole new level ❤
The new realism is neat and all, but how does it handle NPR? Did they gimp it in hopes that Eevee will be the "we have cycles at home" renderer?
Excellent video. The Auto-Eye addon creator needs to fix this for 4.2, hopefully. It must be related to the shader setup.
Blender Internal did raytracing first, but it was offline; Cycles does path tracing... Now that the tech has developed further and become more realtime, they go back to raytracing, then forward to faster, realtime path tracing, and wherever we are going next...
With raytracing available in 4.2 for EEVEE, I could switch from Cycles to this. It'd save me a ridiculous amount of time when rendering longer animations.
You can go to the Shader Editor, then press N, and you can enable the refraction settings there.
Thanks but not sure what you mean, I see no refraction settings in Shader editor shelf...
I want a comparison video between EEVEE, EEVEE Next, and Cycles... that would be great...
Good idea for future video.
The example with the eyes makes the refraction issue seem like a small thing, but it's going to mess up the scene if you have water and/or glass, I suppose? Let's hope they prioritize fixing it in the full release.
It was an issue in the shader of the AutoEye addon ☺️ ua-cam.com/users/shortsyzZT6JoVljQ?si=hEECboOKMcnV8Jpk
@@MartinKlekner ah got it!
Please make a cinematic with eevee next to show off the new progress once its fully out from beta
There's a refraction option in the Material tab settings, where the Alpha Clip/Hashed setting is. That will fix the eye problem. I think, at least; it did for me.
Worked in some cases but not for these eyes, thanks for the tip though! Ill try to fiddle with settings some more :)
@@MartinKlekner I have an eye setup that was originally based on Auto-Eye, and I got it working about 90%. You should set the refraction type to "Sphere", not "Slab", and you also need to connect a value to the "Thickness" input on the Material Output node of your cornea material. If I understand the node correctly, setting the value to zero actually makes it use the object's volume to calculate the refraction, and it works basically like classic EEVEE. It's fine on all but extreme closeups, which show a tiny white infinite-distance square in the refraction for some reason. If you don't connect any value to Thickness, it just seems to use the scene's default thickness set in the render options, which is often way too big for something like eyes.
I've been waiting for this stuff desperately. EEVEE is great for quick animatics, but turning those renders into something worthy demands an enormous amount of side-work and workarounds, especially if you try to keep a character's lighting right through a combat animation. Sometimes it is much more efficient to switch to Cycles, or to Unreal Engine 5. All the updates sound great, but I have never stumbled across anything about light linking in EEVEE, and that is a sad thing. Right now my workflow is based on rendering separate layers with different light collections, and honestly speaking, that's a huge pain. It would be great to have at least something like the light channels in Unreal.
Oh yes, agreed, light linking in Eevee would be awesome 🙂
Starts to look like Unreal!
I don't care about Eevee, but those renders are beautiful. Keep up the good work.
Thanks a lot! :-))
Not even as a faster preview, so you can WYSIWYG-model your scenes before Cycles? That's how I use it most. The closer EEVEE is to Cycles, the more convenient for me.
They also replaced subsurface scattering with a mix colour node setup.
I'm thinking I'm going to take a pass at 4.2
For the curious: turning off raytracing undoes that weird noise you get when moving the model, if you're just using stock Blender now.
You can replicate the previous bloom without much effort with a few extra compositor nodes. Using the compositor for stuff like that gives you so much more power that it's actually quite frightening 👹👹🧛🧛
Is the "raytracing" only screen space, like the SSGI addon, or is it the real thing?
Also, could it possibly make Radeon/AMD cards somewhat more viable? The Blender benchmark doesn't really give much (if any) info on how well EEVEE works on each GPU type.
I think your eye problem can be fixed by turning on "Raytraced Transmission" in the material settings.
I don't know if it is already on there, I'm just guessing, but still a great video!
Thank you! Already fixed by the author of the addon :-) ua-cam.com/users/shortsyzZT6JoVljQ
4:56 It definitely must be a bug; we encountered the same problem when rendering one of the scenes in our latest animation. Strangely, for us the effect was reduced when rendering the image sequence, compared to rendering a single image or viewing it in the viewport.
It is somewhat unpredictable at this point, hope they can fix it or provide solutions :)
Besides the glow, something that is also bothering me a lot in "EEVEE Next" is the strain and power draw it puts on the GPU.
My old GTX 1080, with regular EEVEE (Blender 4.1), used to render a specific scene in 7 seconds per frame, forcing the GPU to draw max. 170W.
The same scene in "EEVEE Next" takes around 32 seconds per frame and forces the GPU to draw 250W (voltage peaks).
I'm not very comfortable with such power peaks stressing my GPU. I wouldn't be very happy to damage my GPU just because of "EEVEE Next".
I tried to use the Nvidia Control Panel settings to limit the GPU's power draw but couldn't find a real scaling solution. Either it becomes too slow or it runs at top speed (with no power limits). There seems to be no middle ground regarding GPU power draw, even when changing the max frame rate per app.
By the way, thanks for your review.
I know that you still need Light Probes for objects outside of the camera view. But what about Irradiance Volumes? Are they still needed?
Thanks!!! Great Overview!!! 👌😁
Great video! Thank you!
Great video! Thanks!
If you desire a specific bloom effect as well as the new HDRI shadows, would it be feasible to composite render passes from separate Blender versions, or would that just take too much time to be worth it?
Those UE5 features, Lumen and Nanite, make more sense with Blender.
5:44 Real time compositor is always GPU my friend
Thank you for the detailed video. It looks awesome. I'm moving over from Vue to Blender, and I have my eye on your CGBoost 3D Environments course. Would the course require any different steps if I used the new beta version of Blender, or should I hold off on updating it?
Thanks again for the video. Your channel is damn impressive
Awesome! The course uses mostly Cycles, so no changes in Eevee influence it :)
@@MartinKlekner awesome. Thank you for the very quick reply. Have a great day
I knew this day would come... and it goes head to head with the UE5 Lumen renderer.
No bloom is a big upset
Love it!
Where are some good resources on ancient Greece/Greek myth for places, characters and objects in Blender? I really enjoy making them. I've already handmade some characters I'd like to put in nice places.
Wish we could get camera culling for EEVEE Next.
Why no mention of subsurface scattering, which is hugely improved as well?
Maybe a topic for next time ;-)
Amazing video.
Wonder if they can update Cycles displacement as well, since that's very heavy computing-wise.
I enjoy your videos so much
Great Job
The problem with these tools is that while they are cool, they are VERY complex to learn and figure out. Plus, you have to have detailed assets to use them, and if, like me, you don't 3D model, that can cost a lot. And worse, with marketplace assets, other users likely already have them as well, so what you have is not truly unique. This is where I think AI will come in. With AI you'll be able to create worlds where you're not just using premade assets; it can generate new content, and I hope that with the rise of AI we'll get to the point where we can easily create detailed 3D worlds and customize them the way we want, far more easily.
I AM blown away simply because we have something between EEVEE and Cycles. The creative control and freedom you can have now with what is *effectively* cycles but can render *so much faster* on many more systems is insane.
Sadly, from what I've been able to tell, they haven't improved the transparency options: Alpha Clip was removed (VERY annoying for existing shaders), Alpha Blend still has the same bugs it had in earlier versions, and Alpha Hashed still has the same amount of noise.
Someone on the forum asked about this and they just told them to do some complex node setup to mimic alpha clip
Very weird decision, something easy to do in legacy is now very complicated to do in Next
@@Villager_U It's not really complicated, but it's definitely not as easy as just having a simple option. I really don't get why they did this
@3m9s actually the old method mainly relied on using light probes to fake GI
Not ideal still
Oh yeah forgot to mention that thanks 😊 This new workflow is so much better
@@MartinKlekner Yeah, most definitely :)
I hope we get even better GPU-accelerated raytracing for EEVEE soon!
Great video, thanks Martin!! How's the performance in the viewport, please? I use a lot of CC characters like you do, but there is a lot of slowdown in the EEVEE viewport. Is this more similar to UE in terms of realtime, please? Looking forward to the release.
Great to hear! :-) When it comes to lots of CC characters, economy is still the key, so I recommend using LODs and as few smaller size textures as possible. Overall though, the viewport performance seems better!
@@MartinKlekner Thanks for the reply. No, I am talking about only a couple of characters and only 2K textures. My PC specs are 32GB RAM and an RTX 2080 Ti, and UE5 runs loads of assets from CC/Kitbash etc. with no slowdown, so I'm wondering if Blender is on the same scale as Unreal yet regarding realtime.
I normally do 1v1 fight sequences, so I don't have loads of characters on screen like your epic videos. Usually 2 CC characters and assets from Kitbash etc., but it slows down to a crawl in the viewport. (Using normal EEVEE.)
Thank you for this video..
Blender scenes look so nice & realistic with minimal effort; beats UE every time in realism.
The only missing thing is Nanite-like tech. I want infinite geometry (not for detail, but for saving time on multiple iterations of optimization).
That's literally what adaptive subdivision is. We've had it for years.
@@choo_choo_ No, that's different. Nanite is realtime auto-LOD; it decimates geometry based on how far the object is from the camera.
Lol, Blender is following UE.
@@zedeon6299 Adaptive subdivision does the exact same thing: more geometry close to the camera, less far away. For more extreme decimation, geometry nodes handle that extremely simply.
@@choo_choo_ If so, then why does it increase memory usage instead of reducing it?
I have a question: should I get a 7900 XT or a 4070 Ti Super if I'm only planning to use Blender? Also, I heard a 12-core CPU is the most I will need; is this true? First time building a PC, and Blender looks like fun.
Screen-space refraction/transparency and refraction with Shader to RGB are indeed missing and sorely needed to fully migrate.
Wow, I didn't ever expect raytracing to be implemented in EEVEE!