DaVinci Resolve Studio actually has a similar feature they just added. It allows for relighting within your grading workflow and can output normal maps for video.
@@JoshuaMKerr Resolve is grading the whole scene after the fact, but relighting in UE relights the whole scene within the scene itself. The words sound almost the same, but those corrections and changes open very different dimensions for the storytelling: while Resolve's surface relighting improves the visual narrative, relighting with Unreal lets you create another visual narrative entirely.
This makes me think you could turn the normal maps into depth maps, and feed that into the pixel offset in the material, to get different fog density/depth of field/projected shadows/AO based on that recovered depth.
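To illustrate the idea behind the comment above: a normal encodes a surface slope, so you can integrate slopes to recover relative depth. This is a toy sketch (shown in 1D for clarity), not how SwitchLight or Unreal actually derive depth maps; a real 2D version integrates along both axes and reconciles the results.

```python
# Toy sketch: integrate the slopes implied by surface normals along one
# scanline to get relative depth. Illustrative only.

def normals_to_depth_1d(normals):
    """normals: list of (nx, nz) components of normals along a scanline.
    Returns relative depth per pixel (arbitrary scale, starting at 0)."""
    depth = [0.0]
    for nx, nz in normals[:-1]:
        slope = -nx / nz              # dz/dx implied by the normal
        depth.append(depth[-1] + slope)
    return depth

# A surface facing the camera stays flat; a tilted one ramps away.
print(normals_to_depth_1d([(0.0, 1.0)] * 4))  # [0.0, 0.0, 0.0, 0.0]
print(normals_to_depth_1d([(1.0, 1.0)] * 3))  # [0.0, -1.0, -2.0]
```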
Can I ask how you managed to set all the maps on the plane? I don't know how you've done it, since the maps are a video🤔 Did you set up a material, or what? I'm new to UE, so thanks for helping🙏
I feel like we're stepping into a different kind of uncanny valley. Regardless, this is huge. Just the normal maps alone are an amazing feature. Let's see what they bring to the table next.
Actually the reason to light flat is to make sure their machine learning algorithm gives me the best possible results for albedo and normal. I spoke to their developers about this.
One thing to mention: under the SwitchLight contract you give them non-exclusive rights to use your images and share them with third-party sublicensees. As a creator, if you don't care, then no problem. But good to know.
This would take forever with normal 3D software, having to get the correct lighting etc., but things can be changed on the fly?? After it's recorded?? Wow!
Great progress. Ironically, you look like a MetaHuman. I would try putting an SSS skin material on one instance of the regular plane, then key out the skin portion to reveal the SSS material version beneath.
Not a game changer, but good as a concept. The result looks very CG; maybe it can be used in a very stylized movie. And one of the problems here is the catch light in the eyes. For close enough shots it's an important thing. That was the problem in The Hobbit when that dude with the beard was hallucinating in the cave: the flickering light stick was reflecting in his eyes. You can even recreate the light setup from his eyes)
@@JoshuaMKerr It looks weird on the image now, but I imagine in a few months, maybe less, it will look as natural as real footage. Excellent work, by the way.
Would LOVE a more in-depth tutorial on how to recreate this effect! I followed what you did and got quite a flat result in UE5 (I probably did something wrong). Either way, this is incredible and very exciting!
@@JoshuaMKerr if you can lower the light effect, like just an overlay, it would be near perfect. If you can't in Unreal, maybe composite in After Effects or Nuke.
Well, this is handy. Two days ago I started learning Unreal Engine, and now I need to finish my project. Before I saw this video I was teaching myself Unreal Engine, and now somebody has made a video so I can do it too. I know nothing about Unreal Engine; I needed to learn everything. See ya❤
What kind of PC rigs/workstations are you guys mostly working with for this kind of video production stuff? I'm wondering what type of GPU might be needed to do this stuff well.
@@JoshuaMKerr Nice! Next year I want to get a nice desktop rig. From what I've read, it seems that for strictly gaming AMD GPUs are really good value, but if you want to do other things like 3D rendering, Nvidia is better but more expensive?
As someone who has been searching for some kind of normal map generation, AI or otherwise, I’m very excited about the applications this has beyond just video.
@@JoshuaMKerr I've had some ideas cooking involving 3D printing and was looking for something that could quickly turn photos into bas-reliefs. DaVinci has a similar relight function, but the depth map is lower in fidelity.
Wow, thanks for the video, it was great. I was just thinking about something like this a few days ago while editing a video texture in Unreal. It's insane seeing this.
the results look a bit plastic but it's definitely promising. It could definitely be useful in post-prod cgi lighting correction. It's application in fan editing could be interesting
That's mostly my shoddy roughness map. Just wait until SwitchLight generates them.
Well, you could add a cartoon effect to the actor. Thinking about that, it could be amazing.
@JoshuaMKerr I think it's also a lack of subsurface scattering. Light doesn't just bounce off of our skin, it travels through it and scatters throughout the inside.
@GANONdork123 yeah, Relight's results look good with that, but the Unreal Engine ones don't quite get it
That tells so much about realism. The final result does look plastic almost like it's a video game character, but that doesnt make sense because that's an actual person being recorded. Maybe the key to realism is afterall a good roughness map
Finally, I can be normal....
You win the comments section.
Lmao that's a good one
💀💀
I didn’t expect to see you here
I don't understand the joke :(
For making a roughness map, use Materialize; it allows for batch processing too. The only reason the skin looks plasticky is that skin in real life absorbs light, so whatever you are applying the video material to, you should use a mask to make the skin part a subsurface scattering material, and then with the roughness map the lighting will look photorealistic.
I'm working on this right now :)
@@JoshuaMKerr Sounds exciting, can't wait to see the video!
@@JoshuaMKerr any update on this? Really want to integrate SwitchLight into our workflow, but every time I revisit, I just come out looking as plasticky as ever, even with SwitchLight now generating roughness maps.
doing a night scene inside of a moving car would be cool to see, since the lighting is always changing as the car is moving. Awesome work man
That's where I'm heading with this
@@JoshuaMKerr sick, can't wait to see what you come up with 🔥
This is crazy impressive I could see this being a huge piece of an arsenal for compositing and CGI artists in the near future
It certainly won't be long before the results from this improve. Glad you enjoyed the video.
Honestly I kind of hope that Cyan Inc does this for the Riven remake. The FMV characters are what made Riven feel so real... and I think this is nuts that it could allow FMV style moments in games again.
Interesting. I never thought about that
"who needs meta human when you have human?" has to be the most awesome thing I have ever heard
probably you'll need a 5th element multipass for that...
Haha thanks.
I'm already using it in UE5. It gives me better results with the environment lights, and additionally, it saves you a few hours in compositing. Simply incredible. Thank you very much😎
Glad this was helpful mate. It's a lot of fun and it's still early days.
That's crazy! And the fact that you edited and relit the PNG sequences one by one is even crazier!
If I had seen your final footage entirely out of context, I would have never guessed it was shot with entirely flat lighting. This technology is amazing and shows great promise. As is right now, I can tell it would work wonders for a highly stylized film along the lines of Sin City.
Give it a little time (probably months) and the results should be even better than this. Glad you're excited too.
@@JoshuaMKerr When I was in film school, this would have been firmly the stuff of science fiction. Hell, everyone was still laughing off the prospect of studio movies shooting on digital media. I'm loving the proliferation of cheaper high-quality cameras and the wealth of amazing software available to the average indie filmmaker. The future is indeed bright.
Amazing! While it’s still early days, pushing the technology to its limits like you’ve done here will surely help to accelerate the development of this groundbreaking technology. Imagine where we’ll be in just a year!
Totally agree. These guys need to be shouted about in my opinion.
@@JoshuaMKerr DaVinci Resolve, a freemium software, is also bringing this feature for video in their next update, and the beta tests work great!!
Is it better than their current relighting tool? The current tool seemed mainly useful for making subtle changes rather than totally relighting footage.
@@agent_op Are you talking about the one in 18.5? That’s not nearly as good as this.
@@johanfolke2971 Yes, I am talking about the one in 18.5, and I think it is good enough to be used. I know it's still not perfect, but it works for me!
It's always so amazing, but taunting in a sense, that the interfaces for these ground-breaking pieces of software are so simple. Before, this would be a challenging process that takes lots of time and planning, but here is this one button that can do it in seconds.
I feel that.
Hugely promising for a fix up or two. Obviously better to get it in camera - but I can't wait to see how far this can be pushed, the number of times I've shot something and just gone "Wish I had a little more light" or "wish I'd put a rim light here" - even just basic stuff like that.
I think we are going to be very surprised at the speed that this develops.
@@JoshuaMKerr I think you're right. Generally speaking this kinda stuff is developing REALLY quickly.
This is obviously not perfect, but far, far better than I expected. I'm especially intrigued about how this tech might work on something like an iPhone that can capture LiDAR depth data alongside images.
Relight is in the DaVinci Resolve 18.5 beta, which is out now (3 months ago), and there are lots of demos (UA-cam).
Davinci is great for subtle relighting but their bump map is not detailed enough for complete relighting like this.
One thing I noticed about it is that it doesn't take skin's translucency into account, like the redness from blood showing through the skin when it's lit, hence the "plasticky" look.
Yeah, there's no subsurface map for sure
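For anyone curious, a common cheap stand-in for that subsurface effect (the classic real-time trick, not what SwitchLight or Unreal actually use) is "wrap" lighting: instead of clamping N·L at zero like plain Lambert, the falloff is shifted so light bleeds past the terminator the way it does on translucent skin.

```python
# "Wrap" lighting: an illustrative, cheap approximation of subsurface
# scattering. Not SwitchLight's or Unreal's actual shading model.

def lambert(n_dot_l):
    return max(0.0, n_dot_l)

def wrap_diffuse(n_dot_l, wrap=0.5):
    """wrap=0 reduces to plain Lambert; larger values soften the falloff."""
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

# Just past the terminator (N.L = -0.2): Lambert is pitch black,
# while wrapped lighting still lets some light through (~0.2).
print(lambert(-0.2))       # 0.0
print(wrap_diffuse(-0.2))
```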
Yeah, this technique is super cool. I also use this technique every time I put real footage in a 3D scene.
That's great. Do you use the same or different software?
@@JoshuaMKerr One time I used Photoshop to create a normal map, then used EbSynth to make a normal map of the whole video and used it in Blender. The second time I found an easier way: there is a node in DaVinci Resolve Fusion called the Bump node which converts your video into a normal map.
The quality from Davinci wasn't good enough for me. I also tried Photoshop, but its normal map isn't in world space, it's tangent space.
@@JoshuaMKerr Yeah, this method is better than both of them. Thank you 😄
No worries
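For reference, a bump-to-normal conversion like the Fusion node mentioned above essentially finite-differences a height map and packs the slopes into a tangent-space normal. This is a toy sketch of the general idea, not Resolve's exact implementation.

```python
# Toy bump-to-normal conversion: central differences on a height map,
# packed into tangent-space normals. Illustrative only.

def height_to_normals(height, strength=1.0):
    """height: 2D list of floats in [0, 1].
    Returns a (nx, ny, nz) tuple per interior pixel."""
    normals = []
    for y in range(1, len(height) - 1):
        row = []
        for x in range(1, len(height[0]) - 1):
            dx = (height[y][x + 1] - height[y][x - 1]) * strength
            dy = (height[y + 1][x] - height[y - 1][x]) * strength
            inv_len = 1.0 / (dx * dx + dy * dy + 1.0) ** 0.5
            row.append((-dx * inv_len, -dy * inv_len, inv_len))
        normals.append(row)
    return normals

# A flat height field gives normals pointing straight out, (0, 0, 1);
# a ramp tilts them against the slope.
flat = [[0.5] * 3 for _ in range(3)]
print(height_to_normals(flat))
```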
Honestly, the roughness map would complete this. The only thing I noticed is that at 6:20 it looked a bit off, like it didn't quite match the environment. But it's a lot better than it would have been without this. Really cool tech.
Next version should blow this out of the water.
The potential for this is huge. Definitely going to be keeping a sharp eye on this software. Thanks for the video!
You're welcome.
I've been expecting software like this to be invented, so here we go.
Just the beginning
This is precisely the method I used in my own video three months ago, but for me I used Blender and DaVinci Resolve. Cool stuff!
Feel free to drop a link here. I'd like to see your results
I'm surprised how great that normal map looks. :) Final lighting/comp results are 90% there towards realism but still has that unplanned green screen comp look. Since it's a computer vision process, did you do any test on distance from camera, other less common objects, etc, to see if it became confused about depth estimation?
Yeah, the normal is amazing and will improve with time. I was actually surprised how well it handles full body shots, objects and hands. All seemed quite promising.
@@JoshuaMKerr Amazing. It has no business being that good.
That's spectacularly awesome! I can't wait until these kinds of tools are just built into compositing software~
You and me both!
What really needs to happen in parallel to this development is the ability to add shadows and reflections without motion capture. For shadows, one can imagine that an algorithm could morph a keyed video of an actor into a 3D animated mesh. It wouldn't look pretty, but the mesh itself would not be visible; it would still render nuanced shadows and reflections to add realism.
Specular maps and depth maps have been developed by SwitchLight. It's just a question of me making an update video to showcase them :)
@@JoshuaMKerr Does that allow shadows, reflections from any angle?
Early days, but that's the idea.
Very interesting! You should do a test with much more dramatic lighting instead of soft; based on you initially dragging those lights around, it almost feels like the result could look a lot more believable with deep shadows.
Worth a try for sure.
It's very cool, and relighting in post with normals isn't new, but it has limits. Because you're working with a flat 2D plate, you can't cast shadows.
Relighting people with normals generated from monocular video is the new part.
Regarding shadows the plane does cast onto the environment and there are self shadows. What am I missing?
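For anyone wondering what "relighting with normals" actually computes, the core is just Lambert's cosine law per pixel: scale the recovered albedo by the cosine between the recovered normal and the new light direction. A bare-bones sketch; a real renderer layers shadows, specular and bounce light on top of this.

```python
# One pixel of normal-map relighting, stripped to the core (Lambert).
# Illustrative sketch, not SwitchLight's or Unreal's actual shading.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = dot(v, v) ** 0.5
    return tuple(x / length for x in v)

def relight_pixel(albedo, normal, light_dir, light_rgb=(1.0, 1.0, 1.0)):
    n_dot_l = max(0.0, dot(normalize(normal), normalize(light_dir)))
    return tuple(a * c * n_dot_l for a, c in zip(albedo, light_rgb))

# Light head-on returns the full albedo; light from behind goes black.
print(relight_pixel((0.8, 0.6, 0.5), (0, 0, 1), (0, 0, 1)))   # (0.8, 0.6, 0.5)
print(relight_pixel((0.8, 0.6, 0.5), (0, 0, 1), (0, 0, -1)))  # (0.0, 0.0, 0.0)
```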
Great to see this video getting the attention it deserves!
cheers mate
Hey, I remember watching Dr. Károly Zsolnai-Fehér's videos covering the papers on this technology! Neat to see it deployed in an actual tool people can use.
In Unreal, using Opacity Masks is always super harsh. Have you figured out how to use the translucency blend mode while also not losing the spec/norm/rough channels?
The alpha composite blend mode works nicely. Surface forward shading should also be enabled.
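The reason opacity masks look harsh compared with an alpha-composite blend comes down to per-pixel math: a binary mask snaps every edge pixel fully to foreground or background, while "over" compositing blends by fractional alpha. A plain-Python illustration (not Unreal's shader code):

```python
# Per-pixel comparison of alpha-over blending vs a hard opacity mask.
# Illustrative only.

def over(fg, bg, alpha):
    """Standard alpha-over: fg weighted by alpha, bg by the remainder."""
    return tuple(f * alpha + b * (1.0 - alpha) for f, b in zip(fg, bg))

def hard_mask(fg, bg, alpha, cutoff=0.5):
    """Opacity-mask behaviour: all-or-nothing at the cutoff."""
    return fg if alpha >= cutoff else bg

white, black = (1.0, 1.0, 1.0), (0.0, 0.0, 0.0)
# A half-covered edge pixel blends to grey, but a mask snaps it to white.
print(over(white, black, 0.5))       # (0.5, 0.5, 0.5)
print(hard_mask(white, black, 0.5))  # (1.0, 1.0, 1.0)
```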
Love it! The filmmaking industry is heading towards exciting times 😍😍
It truly is.
Honestly, all digital virtual production looks like shit; it's just that your perception of what looks real or not is distorted.
I don't remember mentioning realism. But I am very excited about the potential of this technology. You don't have to share that enthusiasm.
That final lighting render looks beautiful
Thanks man. Not perfect but interesting for sure
In its current state, I don’t think it’s very useful for relighting an entire scene, but it’s definitely useful in short bursts, like for example if there’s light from an explosion or gunshot that you couldn’t film practically
That's probably true. But I'd be surprised if it didn't improve very quickly.
@@JoshuaMKerr I completely agree!
You can go from real life to Next gen graphics. Imagine what it will be like in 3 years
That's what I keep telling people. It's just the beginning for all of this tech, and it's advancing very fast.
@@JoshuaMKerr We just need to find a way for the adult entertainment industry to start using it. Time and time again they have advanced the way in which we produce and consume media.
this is insane 🤯 can't wait to see what the future will bring 🙏
Right there with you
The concept behind the tech is amazing, but the results make you look like a videogame character. It kinda smooths all the surfaces a bit too much, I think.
For now. I'm waiting for 4k normals and roughness to get us there.
Your UA-cam channel is absolutely amazing! The content you create is both entertaining and informative. The production quality, creativity, and passion you put into your videos are evident in every frame, keeping me hooked with every video. Keep up the fantastic work! 😊👍
Thank you so much, that means a lot to me. I hope to keep bringing you videos you'll enjoy.
@@JoshuaMKerr 🫰🫰🫰🫰
Dude, you're gonna grow massive, God willing ❤
Here's hoping
This was in After Effects over 10 years ago. But normal map/height map creation has seen a huge bump in the last year; many good new methods.
Which part was in Ae 10 years ago?
I'm not knowledgeable where visual effects and virtual production is concerned, but is Switch Light basically baking a newly relit image using the hdri so that it can be added to a scene in unreal?
It can do that. Or it can export the normal, specular, roughness, depth and albedo maps for import into unreal so your character can react to the virtual environment.
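To give a sense of how an engine combines those maps, here's a one-pixel Blinn-Phong sketch (a simplification, not Unreal's actual BRDF): albedo drives the diffuse term, and roughness controls how tight the specular highlight is.

```python
# One-pixel Blinn-Phong sketch of combining albedo/normal/spec/roughness.
# A simplification, NOT Unreal's actual BRDF. Assumes the light and view
# directions are not exactly opposite (half vector would be zero).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = dot(v, v) ** 0.5
    return tuple(x / length for x in v)

def shade(albedo, normal, light_dir, view_dir, specular=0.5, roughness=0.5):
    n = normalize(normal)
    l, v = normalize(light_dir), normalize(view_dir)
    h = normalize(tuple(a + b for a, b in zip(l, v)))  # half vector
    exponent = 2.0 / max(roughness * roughness, 1e-4)  # rougher = broader
    diffuse = max(0.0, dot(n, l))
    spec = specular * max(0.0, dot(n, h)) ** exponent
    return tuple(a * diffuse + spec for a in albedo)

# Mid-grey albedo lit and viewed head-on: full diffuse plus the highlight.
print(shade((0.5, 0.5, 0.5), (0, 0, 1), (0, 0, 1), (0, 0, 1)))  # (1.0, 1.0, 1.0)
```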
Relight is amazing ✨👌just like you
this is so exciting for more stylised live action
That is amazing! Imagine this tech with a higher budget! I wonder if Filoni and Favreau with ILM are already testing this, combined with the Volume, for the next Mandalorian or some other SW project.
Still, it feels 3D scanned, that uncanny valley; it reminded me of LA Noire for some reason. But it is amazing!!!
I never found LA Noire that strange (but then again, I had just come from finishing GTA 5)
Being able to extract normal and albedo maps is going to be huge for indie game development too. It will make it easy to generate base materials for 3D models.
Unreal Engine is the future of filmmaking.
Asset gathering performances... mind blown
Could you make an in-depth tutorial on how to get your footage inside of Unreal and connect your normals and albedo, please? I'm new at this.
Planning on doing this soon
They made a plugin for Unreal, it's so easy now! @@JoshuaMKerr
Glass, lens, gaffer. 1080p, and even an iPhone camera with all its “imagination” and supersampling, is still a more flexible tool for capturing the right scene in camera. Old-school VFX with background replacement and simple foregrounds can still set up fantastic locations just fine.
There are many approaches and all have their merits.
It’s the poor man’s version of the screen stage used in The Mandalorian. One thing that I noticed as a 3D artist is the lack of back lighting around the alpha edges. When it’s not reflecting the colors of the background, that’s how you end up with the cutout look. I’d also love to see it use an environment map for the fill lighting to more closely match the color and brightness.
By environment map, do you mean an HDRI?
Exactly, that’s a great example. That’s what I typically use but most image formats will work. I’m not familiar with Switch Light so I don’t know how it would use an environment map, if at all. The most likely way, I would think, is to create a fake reflective texture that overlays the image which is how old video games created metallic surfaces. It’s much less computationally expensive than rendering light rays. Keep up the good work!
@Chuntise Oh, it does a great job with HDRIs. You have to try it. I've been testing their standalone desktop software and the results are 10x what you see here.
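The "fake reflective texture" trick described above is usually called sphere or matcap mapping, the way old games did metal: the view-space normal's x and y directly index a pre-shaded sphere image. One texture lookup per pixel, no light rays. An illustrative sketch of the lookup:

```python
# Matcap (sphere-map) UV lookup from a view-space normal. Illustrative.

def matcap_uv(normal):
    """Map a view-space unit normal to (u, v) in [0, 1] on the matcap."""
    nx, ny, _nz = normal
    return (nx * 0.5 + 0.5, ny * 0.5 + 0.5)

# A pixel facing the camera samples the matcap's centre;
# one facing hard right samples its right edge.
print(matcap_uv((0.0, 0.0, 1.0)))  # (0.5, 0.5)
print(matcap_uv((1.0, 0.0, 0.0)))  # (1.0, 0.5)
```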
impressive, but i wouldn't use it as a crutch. if i have a couple of stage lights and a greenscreen, i wouldn't necessarily light the actor completely flat every time. i would try to match the environment the best i can, and then use relight for the details.
Which is a totally valid approach. But the flat lighting was just to try and push the tech as far as possible and also aid the switchlight algorithm.
Wow! Please make a detailed (maybe Shorts-format) video on how to set up these maps in UE's timeline.
I should be doing this soon enough but might be on patreon.
Really amazing; I can think of a few shots I could have used this on before. Will try it out👍 Thanks for the video.
You're welcome. Glad you found it useful.
this is amazing. Lost me when you said you had to process each frame individually, but I'm sure that won't be a problem in a few months. Great stuff mate!
Yeah their video normal map feature is desperately needed but it's coming.
thank you for the show and letting us know!
No problem, glad you enjoyed it.
they needa use this on so many movies from the last decade
You deserve a sub for this. I just randomly had your video recommended and I'm glad it was.
Welcome aboard! Glad you enjoyed it
I really look forward to their after effects integration!
So the character (you) gets better lighting in the engine because of the normal map? But the downside is that you need to do each frame?
Well at the moment I had to process it in frames rather than as a video. But it shouldn't be long before this changes.
Cool! I remember seeing something from Disney research a while back, but it just focused on faces and skin.
Yeah I saw that. Very interesting stuff
It's kinda like a budget volume set, looks really good
Thanks.
Thanks for sharing with this information Joshua :) I absolutely love it the final result.
Thanks for saying. I'm going to do another video and see if I can push it further.
Please make an update video on the workflow using this for unreal virtual production
On the list
@@JoshuaMKerr thanks ur a legend 🙏
Man, that relit version of yourself seems to have it rough
but I really like the idea and how well it works, very nice video too :)
Glad you enjoyed it. It's very interesting stuff for sure.
So I love this tutorial, but I think I would invert the workflow. Using a billboard is powerful, BUT you are locked into looking at the billboard directly, and when the camera moves around the subject it breaks the illusion unless you were to key in all that movement as well. I would rather make multiple HDRIs of the Unreal Engine scene and use those HDRIs in the SwitchLight editor to relight the subject in the camera footage WITH the alpha mask. Then I would get VCAM footage taken at the same time as filming, so everything is tracked NATURALLY to the footage you took, and finally overlay both films in Premiere. That way your main footage has lighting from SwitchLight similar to what is in the scene, AND you have freedom of motion in the Unreal footage!
I think you're overcomplicating this. Every shot in my scene was static deliberately, so there's no illusion to break with camera movement.
If there were camera movement, then the process would be the same. Track the camera, get the maps from switchlight, and combine in Unreal.
that quote tho... "who needs metahuman when you have... ACTUAL HUMAN". Nice.
Haha thanks
That's cool! I believe DaVinci Resolve Studio also has something similar. I don't think it can use HDRs yet, but I've seen demos where it generates normals and you can relight scenes using point lights.
Davinci's relight feature isn't really designed for this type of relighting.
@@JoshuaMKerr What is it designed for? If I remember correctly, the way it worked was that it'd give you a greyscale image that you could use as a mask for color correction. So if you lightened the image, it'd look like it was lighter. I could be wrong, but I imagine you could string together a bunch of those nodes for multicolor light. Or are you just talking about how it can't use HDRs as a lighting input?
The bump map it generates doesn't have enough detail for you to do complete relighting like this. It's a tool for more subtle touch ups but can't be pushed very far.
@@JoshuaMKerr Ah, I see. That's cool! Thanks for taking the time to explain the differences.
You can also do this in Davinci Resolve, and I personally think it looks better. This looks a bit off for some reason.
In Davinci you aren't doing complete relighting from albedo in a custom 3d environment
While the normal map generation is impressive, this really shows that it's not enough on its own. It would help if they could generate specular or roughness maps too. I think the relighting would look far more appropriate then.
I've been shown their roughness map demo in private. I'm very excited to give it a try.
@@JoshuaMKerr that's sick!
Could you make a tutorial on how you got the image onto unreal to actually be able to edit it?
Working on it
That is way better than the Stable Diffusion method!
I've tried literally everything by this point. Switchlight for the win
This is mind blowing!
DaVinci Resolve Studio actually has a similar feature they just added. It allows for relighting within your grading workflow and can output normal maps for videos.
Have you seen the relative quality of resolves surface map?
@@JoshuaMKerr Resolve is grading the whole scene after the fact, but relighting in UE relights the whole scene during the fact.
Although the words used to describe them sound really close, those corrections and changes open very different dimensions for the storytelling.
While Resolve's surface/relighting improves the existing visual narrative, relighting with Unreal lets you create another 'visual' narrative entirely.
This makes me think you could turn the normal maps into depth maps and feed that into the pixel depth offset in the material, to get different fog density/depth of field/projected shadows/AO based on that depth.
Depth maps are on SwitchLight's roadmap. This is going to be my next experiment.
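For anyone who wants to experiment before official depth map support lands, a rough relative depth can be estimated from a normal map by integrating the surface gradients. A minimal numpy sketch (Frankot-Chellappa style FFT integration; the function name and the [0, 1] float RGB encoding are my assumptions, not SwitchLight's actual method):

```python
import numpy as np

def depth_from_normals(normal_map):
    """Estimate a relative depth map from a tangent-space normal map
    using Frankot-Chellappa integration (FFT-based least squares)."""
    # Decode from [0, 1] RGB to [-1, 1] normal vectors
    n = normal_map * 2.0 - 1.0
    nz = np.clip(n[..., 2], 1e-3, None)   # avoid division by zero
    p = -n[..., 0] / nz                   # surface gradient dz/dx
    q = -n[..., 1] / nz                   # surface gradient dz/dy

    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2.0 * np.pi
    wy = np.fft.fftfreq(h) * 2.0 * np.pi
    u, v = np.meshgrid(wx, wy)

    denom = u**2 + v**2
    denom[0, 0] = 1.0                     # avoid divide-by-zero at the DC term
    Z = np.fft.ifft2(
        (-1j * u * np.fft.fft2(p) - 1j * v * np.fft.fft2(q)) / denom
    ).real
    return Z - Z.min()                    # shift so depth starts at 0
```

The result could then drive pixel depth offset or fog density in the material, though integrated depth from normals will only ever be a rough approximation compared to a true depth map.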
can I ask how you managed to set all the maps on the plane? I don't know how you've done it since the maps are a video🤔 did you set up a material or what? I'm new to UE so thanks for helping🙏
it needs subsurface scattering to remove the plastic look
working on it.
yo josh how did you avoid getting motion trails in your image sequence when rendering???? thank you so much broddy
it is truly a new age for indie filmmakers
I think this is just the beginning
This is amazing, exactly what we need to render Hollywood obsolete :D
I can't wait to cover this again.
I feel like we're stepping into a different kind of uncanny valley. Regardless, this is huge. The normal maps alone are an amazing feature. Let's see what they bring to the table next.
Can't wait to see how it progresses.
Well, I'm glad I can already do this with Nuke, because another credits-based software to add to the lengthy list gets me jaded.
I don't know what their eventual pricing is going to be but I'm keeping my ear to the ground.
If they can generate an albedo, the whole point is that you don't need to set up flat lighting.
Actually the reason to light flat is to make sure their machine learning algorithm gives me the best possible results for albedo and normal. I spoke to their developers about this.
One thing to mention: under the SwitchLight contract you give them non-exclusive rights to use your images and share them with third-party sublicensees. As a creator, if you don't care then no problem. But good to know.
Certainly good to know. Might cause issues for anyone working under an NDA
What music were you using for the short sequence? I've been looking for that music everywhere ever since I listened to the Distractible podcast.
CJ-0_Rampant_instrumental_3_17 On Soundstripe
This would take forever with normal 3D software, having to get the correct lighting etc., but things can be changed on the fly?? After it's recorded?? Wow!
Wow this technique looks amazing!
So much potential for improvement too.
Great progress. Ironically, you look like a MetaHuman. I would try putting an SSS skin material on one instance of the regular plane and keying out the skin portion to reveal the SSS material version beneath.
It's become so much better since then too.
Waiting for part 2 of this
Not a game changer, but good as a concept. The result looks very CG. Maybe it could be used in a very stylized movie. And one of the problems here is the catch light in the eyes. For close enough shots it's an important thing. It's what was the problem in The Hobbit, when that bearded dude was hallucinating in the cave: the flickering light stick was reflecting in his eyes. You could even reconstruct the light setup from his eyes)
Roughness maps are coming and should help with this.
I didn't know there was an uncanny valley for CGI light
I'm learning a lot of new things by trying this.
@@JoshuaMKerr It looks weird on the image now, but I imagine in a few months, maybe less, it will look as natural as real footage. Excellent work, by the way.
Thanks. Yeah it's a process. We'll get there.
Not only does it look bad, now all the fun is taken out of filmmaking
It's just an emerging method of filmmaking and doesn't negate traditional filmmaking by any stretch. I had a lot of fun with it.
Would LOVE a more in-depth tutorial on how to recreate this effect! I followed what you did and got a quite flat result in UE5 (I probably did something wrong). Either way this is incredible and very exciting!
Did you flip the green channel of the normal map?
You can also mess with the intensity
@@JoshuaMKerr I am clearly way too new at UE since I have no clue what that means LMAO. Will look into this, thanks for the response!
Don't worry. There was a time not too long ago I didn't know these things. I'm sure I'll get to make some tutorials soon.
@@JoshuaMKerr That would be awesome!
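For anyone else getting the flat result mentioned above: flipping the green channel just inverts the Y axis of the normal map, which converts between the OpenGL and DirectX tangent-space conventions. A quick numpy sketch of the same operation outside Unreal (the function name is mine, and it assumes a float image in [0, 1]):

```python
import numpy as np

def flip_green(normal_map):
    """Invert the green (Y) channel of a normal map, converting between
    OpenGL-style and DirectX-style tangent-space normals."""
    out = normal_map.copy()
    out[..., 1] = 1.0 - out[..., 1]  # assumes channels encoded in [0, 1]
    return out
```

Applying it twice gets you back to the original map, so it's safe to test on a frame and see which convention looks right in your engine.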
Maybe composite more for a more natural output, but great tool.
I'm open to suggestions. I was thinking a bit of smoke in the foreground and more lightwrap
@@JoshuaMKerr If you can lower the light effect, like just an overlay, it would be near perfect. If you can't in Unreal, maybe composite in After Effects or Nuke.
This is amazing. Obviously, I had no way of doing it but this is so cool.
I hope it's a big hit
It wasn't too difficult. I think most people could achieve this.
We've been able to do this in 3D for decades, but it produces shadowless lighting. To be used sparingly if you don't want strange results.
Looks like shadows are being cast to me unless I'm misunderstanding your point. Could you elaborate on this for me?
@@JoshuaMKerr They mean you won't cast shadows on the environment
Except the plane does cast shadows on the environment. And I have made tutorials on this.
@@JoshuaMKerr Correct shadows, I mean. Shadows from a 2D plane with a normal map won't match shadows taken from an actual 3D object
Yes but there are workarounds for this and I cover them on the channel. Anyway depth maps are coming soon so we can get world position offset working.
Well, this is handy. Two days ago I started learning Unreal Engine and now I need to finish my project. Before I saw this video I was learning Unreal Engine on my own, and now somebody made a 3-day video so I can do it too. I know nothing about Unreal Engine, I needed to learn everything. See ya ❤
I hope my videos will be helpful as you learn
What kind of PC rigs/workstations are you guys mostly working with for this kind of video production stuff? I'm wondering what type of GPU might be needed to do this stuff well.
I have a video planned to go over my computer specs.
@@JoshuaMKerr Nice! Next year I want to get a nice desktop rig. From what I've read, it seems that for strictly gaming AMD GPUs are really good value, but if you want to do other things like 3D rendering etc. it seems that Nvidia is better, but more expensive.
Really interesting stuff! Great walkthrough, this can definitely be a game changer for filmmaking 🤯
It's going to be very exciting as it develops
As someone who has been searching for some kind of normal map generation, AI or otherwise, I’m very excited about the applications this has beyond just video.
What do you imagine using it for :)
@@JoshuaMKerr I’ve had some ideas cooking involving 3D printing and was looking for something that could quickly turn photos into bas-reliefs. DaVinci has a similar relight function, but the depth map is lower fidelity.
Wow, thanks, the video was great. I was just thinking about something like this a few days ago while editing a video texture in Unreal. It's insane seeing this.
I'm super glad it's finally here.
Do you have a link to switch light? I did a search and couldn’t find them.
www.switchlight.beeble.ai/#/
How do you set this up in a level sequence correctly? Where did the roughness layer come from?
I'll definitely be making a video on this for patreon. Not sure what areas of this I'll be covering on the channel next