Jed Smith
United States
Joined 15 Sep 2009
Stitching Spherical HDR Panoramas in NukeX with CaraVR
In this video we take an in-depth look at using the CaraVR toolset in NukeX 12.0+ for stitching spherical HDR panoramas.
We talk about
- Advantages of stitching in Nuke vs alternatives like PTGui and AutoPanoGiga
- Blending multiple bracketed exposures into a single HDR image
- Calibrating color and exposure
- Using the C_CameraSolver node to solve our camera rig, including an in-depth rundown on creating manual feature matches, and a manual approach for getting a solve
- Fixing alignment, ghosting and blending artifacts
- Using C_SphericalTransform to paint out unwanted objects in frame
- Using Blender, we'll test out our final HDR to see how it works!
Links
Debayer tool: github.com/jedypod/debayer
RawTherapee: rawtherapee.com
Nuke-Config: github.com/jedypod/nuke-config
ExposureBlend tool: gist.github.com/jedypod/7a4f878040747251c6833d26f85cac8a
Junk Shop Blender Scene: cloud.blender.org/p/gallery/5dd6d7044441651fa3decb56
Image Set: If you want to follow along, here are the ACES 2065-1 exr images used in the video: mega.nz/folder/hBJUFLib#h4v4LjdSYXjZqPjVlomWKg
Chapters
00:00 - Bad joke
00:32 - Summary of events
01:28 - What is an HDR?
03:06 - Brief tangent into raw image debayer with RawTherapee
05:06 - Why use Nuke for HDR Stitching?
07:55 - Apocalypse from the West
08:38 - Combining bracketed exposures into an HDR image
17:39 - ExposureBlend tool
19:48 - Calibrating exposure
22:56 - Fix alignment issue
25:53 - Intro to CaraVR toolset
26:38 - Solving our rig with C_CameraSolver
33:16 - Straighten horizon
34:05 - Creating manual feature matches
38:28 - Set mask shape
40:02 - C_Stitcher to warp and blend panorama
41:00 - Fix stitching artifacts by tweaking masking
41:56 - Fix ghosting artifacts by removing areas of each view
45:00 - Remove areas from HDR blend to reduce motion blur artifacts
50:36 - Write out a cache of the stitch before paint
52:31 - Paint out camera shadow
59:18 - Color calibration with a Macbeth Colorchecker
01:02:40 - Lighting a scene in Blender
01:06:12 - Thanks and The End
01:06:48 - Bonus Round! Manual alignment if nothing else works
Views: 11,387
Videos
The Math of Color Grading in Nuke
9K views · 4 years ago
In this video we'll take a close look at the simple mathematical functions behind color grading operations in Nuke. We will only discuss operations that affect brightness, or value of an image. We'll save the more complex discussion of color and hue for a future video. 00:00 - Introduction 01:00 - Scene-linear and why it's important 02:30 - What is a linear function 04:30 - How the Grade node w...
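For the curious, here is a minimal Python sketch of the per-channel math The Foundry documents for the Grade node; the knob names match Nuke's, but this is an illustration rather than the node's actual source:
```python
# Per-channel Grade math as documented by The Foundry; defaults mirror
# the node's knob defaults. Illustrative only, not Nuke's actual source.
def grade(x, blackpoint=0.0, whitepoint=1.0, lift=0.0, gain=1.0,
          multiply=1.0, offset=0.0, gamma=1.0):
    A = multiply * (gain - lift) / (whitepoint - blackpoint)
    B = offset + lift - A * blackpoint
    y = A * x + B
    # Gamma is applied last; guard non-positive values, which pow() rejects.
    return y if y <= 0 else y ** (1.0 / gamma)

print(grade(0.18, gain=2.0))  # 0.36 -- gain alone is a plain linear scale
```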
Tears of Steel Plates
21K views · 4 years ago
This video collects all 50 minutes of scene-linear OpenEXR plates shot for the Tears of Steel project created by the Blender Foundation. mango.blender.org/production/4-tb-original-4k-footage-available-as-cc-by/ media.xiph.org/tearsofsteel/tearsofsteel-footage-exr/ Tears of Steel remains one of the few open sources of high quality scene linear exr footage for testing and learning visual effects ...
Simulating Physically Accurate Depth of Field in Nuke
65K views · 7 years ago
A discussion of lenses and optics, and how they affect depth of field behavior in an image. We talk about depth channels: what they are, how they work, and how depth of field can be simulated in Nuke with the ZDefocus tool. We also demonstrate the OpticalZDefocus gizmo, a tool for generating physically accurate depth of field. It does this using the depth of field equation, given lens c...
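As a reference for the idea, a hedged sketch of the thin-lens circle-of-confusion equation this kind of physically based defocus builds on; the notation is mine, and the OpticalZDefocus gizmo may formulate it differently internally:
```python
# Thin-lens circle-of-confusion diameter from a depth sample. Notation is
# mine; OpticalZDefocus may formulate this differently internally.
def coc_diameter(depth, focus, focal_length, fstop):
    """All distances in mm; returns the CoC diameter in mm on the sensor."""
    aperture = focal_length / fstop  # entrance-pupil diameter
    return (aperture * focal_length * abs(depth - focus)
            / (depth * (focus - focal_length)))

# A subject 4 m away with focus at 1 m on a 50 mm f/2.8 lens:
print(coc_diameter(depth=4000.0, focus=1000.0, focal_length=50.0, fstop=2.8))
```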
Nuke Screen Replacement Tutorial Part 2: Compositing and Integrating the Screen Content
45K views · 12 years ago
This is a tutorial about how to replace the screen on a cell phone using Nuke. It deals with the challenges of tracking a glossy screen and getting a good key from a light-emitting screen covered in smudges. I tried to speed through as many of the slow parts as possible, but it is still a bit lengthy. The tutorial is split into two 45-minute parts. Part 1 covers the tracking challenges. Part 2...
Nuke Screen Replacement Tutorial Part 1: Tracking The Screen
40K views · 12 years ago
This is a tutorial about how to replace the screen on a cell phone using Nuke. It deals with the challenges of tracking a glossy screen and getting a good key from a light-emitting screen covered in smudges. I tried to speed through as many of the slow parts as possible, but it is still a bit lengthy. The tutorial is split into two 45-minute parts. Part 1 covers the tracking challenges. Part 2...
Do we need to download each frame individually, or is there a batch download?
I can't get wget to work.
Hey Jed, thanks a lot for the amazing tutorial and the tools you shared! I also watched your other tutorials and they are great, so please keep uploading videos! It would definitely be great to see a tutorial about debayering.
Do you do any one-on-one mentoring?
I get "strptime() argument 1 must be str, not None" when I hit group - What am I doing wrong?
Happy to try to help, but I'll need more specific information:
- In what software / in what tool are you hitting "group"?
- What OS?
- What version of Nuke (if applicable)?
- What version of Python (if applicable)?
@@jedsmith It happens to me too. I'm using Nuke 15 on Windows 11 with Python 3.10. It happens when I select several images and click on "sort selected reads".
@@jedsmith Here is the log:
[22:04.52] ERROR: ExposureBlend.EqualizeExposure9.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure7.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure8.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure6.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure5.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure4.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure3.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure2.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure1.value: unexpected end of "exposure/"
[22:04.52] ERROR: ExposureBlend.EqualizeExposure0.value: unexpected end of "exposure/"
Traceback (most recent call last):
  File "<string>", line 127, in <module>
  File "<string>", line 43, in sort
  File "<string>", line 43, in <dictcomp>
TypeError: strptime() argument 1 must be str, not None
@@ivantovaralbaladejo687 Thanks for the additional info, that gives me something to go on. On line 43 the code calls datetime.strptime(n.metadata().get('exr/Exif:DateTimeOriginal')) -- this is used for the exposure bracket detection. It sounds like maybe that metadata does not exist in the debayered exr image you are loading as your source. Can you confirm? If so, is there other metadata describing the capture time? It's possible Canon cameras use this key but others don't.
@@jedsmith I'm using 16-bit TIFF images converted from RawTherapee. I tested with Lightroom too and it doesn't work. Could it be the format? In the metadata of my file I do see the exposure time for each image, though.
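To make the failure mode in this thread concrete, here is a hedged sketch of a guarded version of that capture-time lookup; the metadata key is the one named above, and the Exif date format string is an assumption:
```python
# Guarded version of the capture-time lookup discussed above. The key is
# the one the thread names; the Exif format string is an assumption.
from datetime import datetime
import nuke  # only available inside Nuke's Python interpreter

def capture_time(read_node):
    value = read_node.metadata().get('exr/Exif:DateTimeOriginal')
    if value is None:
        # TIFFs from RawTherapee/Lightroom may not carry this key at all;
        # inspect read_node.metadata() to find an equivalent, if any.
        raise ValueError('%s has no exr/Exif:DateTimeOriginal metadata'
                         % read_node.name())
    return datetime.strptime(value, '%Y:%m:%d %H:%M:%S')
```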
❤🔥 wow
This is 💎
super smashing color math video!!!
This was pretty cool, thanks for doing this video; it cleared up the differences between some of the duplicate functions for me.
ASMR keyboard tracking sounds to composite/track to
This might be one of the nicest in-depth chats about this topic on YouTube. Thank you again.
Where can I get the CG & BG elements for all this footage?
Here, I believe: studio.blender.org/join/
Great explanations, thank you! BTW, the script did help me, especially looking into the CompressToe... Only thing... I can't find your name anywhere, so I can't see if you have content elsewhere...
This is very helpful. Do you have the script? I want to play around with it.
Not sure it's gonna help you much, but sure, I added it to the description: codeshare.io/zyMb7D
Can you please upload more tutorials? I liked this one. Thank you!
Thank you so much! I've been trying to do this for so long but never took the time to figure it out. This is perfect, and now I can use the absolute values from CG depth passes. Your gizmo will be my go-to for that now. Again, thanks!
I swear I just find you in the most random places on the internet lol
@@InLightVFX haha hello Jacob! Glad to see you on this gem of a video x)!
It was great. Looking forward to seeing more of these
Love it!!
Thanks for detailed description, you even attached links to desmos graphs! Very appreciated 👍
Hey there! What a nice tutorial, thank you so much! I have only one question: which tool are you using to change the AOV passes right in the viewer?
Thanks! It's Falk Hoffman's Channel Hotbox www.nukepedia.com/python/ui/channel-hotbox
Really detailed information on the math calculations, thank you very much.
Thanks for this. What are your thoughts on using Lightroom for processing the raws? I know it's Win/Mac only, but I already have it anyway. It has very good chromatic fringing correction. If you export as DNG, those should be in linear space as well, already debayered and readable in Nuke.
Howdy, I am a little late here, but AFAIK Lightroom will always apply its aesthetics to your shots, so it should be avoided in a linear workflow altogether?
Thank you so much for the tutorial with detailed explanations! May I ask why you use the 'lightwarp' node on the finger?
Thanks Jed, that is such a great video. On the Histogram node: I use it sometimes to remap, let's say, a world depth channel to create a mask channel.
Hey Daniel! Thanks for the kind words. In Nuke I do wish we had a nice Remap node. The math in the Histogram is the same as the Grade node, but I agree it could be more intuitive to use for the simple task of remapping a source min and max to a target min and max value.
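For what it's worth, the remap described in this thread boils down to a couple of lines; a minimal sketch, with the depth range values purely as an example:
```python
# Map a source [min, max] range to a target range -- the by-hand job the
# Grade/Histogram nodes get used for here. Example depth values are made up.
def remap(x, src_min, src_max, dst_min, dst_max):
    t = (x - src_min) / (src_max - src_min)  # normalize source range to 0-1
    return dst_min + t * (dst_max - dst_min)

# Squash a world-depth range of 100-5000 units into a 0-1 mask value:
print(remap(2550.0, 100.0, 5000.0, 0.0, 1.0))  # 0.5
```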
This is golden. Thank you for sharing your knowledge!
Great breakdown... much appreciated by anyone interested in this topic ;) Thank you
Can we use this footage in a reel?
This comment does not constitute legal advice. According to the README included with the footage (media.xiph.org/tearsofsteel/tearsofsteel-footage-exr/README.txt), the license of that footage is Creative Commons Attribution 3.0: creativecommons.org/licenses/by/3.0/
Great work! Many thanks
I do not have the MergeDivide node (17:09 in the lesson). What should I do?
Ah sorry, I missed that -- the built-in divide node doesn't handle values below 0 properly, so I usually use my own, built from a MergeExpression node: Ar/Br.
Sometimes I will swap the order so it's Br/Ar as well, so that I can disable the Merge and pass through the B pipe to disable the operation instead of passing through the thing I'm dividing out.
@@jedsmith Hey, thank you for the video, very useful. When you said Ar/Br, you're using the R channel? And why? Thanks
@@djordjeilic9634 Yeah, sorry, that was not very clear. It would be Ar/Br for the red channel, Ag/Bg for the green channel, etc.
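A sketch of the per-channel divide described in this thread, built as a MergeExpression via Python; the expressions are straight from the thread, and creating the node this way is just one option:
```python
# Per-channel A/B divide as a MergeExpression, per the thread above.
# expr0..expr3 drive red, green, blue, and alpha respectively.
import nuke  # run inside Nuke's Script Editor

div = nuke.nodes.MergeExpression(
    expr0='Ar / Br',  # red
    expr1='Ag / Bg',  # green
    expr2='Ab / Bb',  # blue
    expr3='Aa',       # pass A's alpha through unchanged
)
```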
19:36 Yeah, the bokeh is really cool! =)
Hey Jed, those gizmos that you wrote (you used two of them), how can I obtain them?
Hey! It's all in my nuke-config git repo: github.com/jedypod/nuke-config/blob/master/ToolSets/Merge/ExposureBlend.nk github.com/jedypod/nuke-config/blob/master/ToolSets/Color/CalibrateMacbeth.nk
@@jedsmith Hello Jed, nice to see you again. Would it be possible to make these links live again?
Hey Ms. Robles! Good to hear from you! :D Sorry it looks like I moved those files around since I posted that - I edited the above links so they work again.
@@jedsmith Thank you. You are the best!
Hi, I wanted to know if I can use this tool with Nuke 13? It's really awesome.
Yes it should work fine in Nuke 13. The only thing that would need to be updated is support for Camera3 class nodes in the "Get Selected Camera" python script button.
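A hedged sketch of the Camera3 update mentioned above; the function name is mine, not necessarily what the gizmo's script button actually uses:
```python
# Accept any Camera class generation when grabbing the selected camera;
# Camera3 is the Nuke 13 camera class. Function name is illustrative.
import nuke

def get_selected_camera():
    for node in nuke.selectedNodes():
        if node.Class() in ('Camera', 'Camera2', 'Camera3'):
            return node
    raise ValueError('Select a Camera node first')
```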
You get my upvote just for that hilariously great intro.
Thank you for this!
Thanks for the tutorial, keep these coming.
very well explained, thx!
Thanks!
Hi Jed, thanks a lot for your detailed tutorial on how to stitch spherical HDRs in CaraVR. It's been very helpful to me. Also very nice to have a workflow within Nuke without the need to use lots of different software packages. Makes things quite a bit easier. Do you think you could go over your process of debayering raw images to scene linear in more detail? I am also on Linux, but I am currently struggling to get your debayer / RawTherapee workflow to work properly.
Hi! Yours is the second request for the topic of scene-linear raw debayer workflow... I will definitely make it my next video. Now... just need to get some spare time to work on this :)
@@jedsmith awesome. Thanks a lot. Looking forward to it.
This is really useful, thank you so much!
Is there any way to get a correct defocus edge both in front of and behind the object? I can't find a way.
It's hard not having Chinese subtitles.
I'm sorry about that.
Hey Jed, thanks a lot for that great tutorial! Finally I'm close to a solution for processing my raw images into ACES, but I'm not getting it quite right yet. Is there any chance you could go into more detail on how to debayer raw images into scene linear, step by step? I already checked your debayer GitHub site but struggle with it a bit. I'm not a TD and have no clue how to use/install your tool the right way on Windows. Maybe someone could share a link or drop in the right keywords for me to get my head around it? Thanks so much! I've also tried the steps you showed with RawTherapee, but my exported ACES 32-bit TIFs seem clipped (other formats also just show the same 0-1 values). Any idea? I really appreciate your time - cheers!
Hey! Thanks for the nice comment. Debayering into ACES can be tricky. This is the subject of a video I have planned, going into a lot more detail than the little hints I put in this video. Getting Debayer to work on Windows can be tricky due to the OpenImageIO and OpenColorIO dependencies. Even using the Choco package manager to get them running can be difficult, so I can't say I blame you there. You're actually not the first person to inquire about this, so I think I'll try to work out a how-to for getting Debayer running on Windows as well. Short answer to your question: Most raw image formats store the raw bayer data in an integer format. Integer implies a 0 to 1 range, with the number of steps in between determined by the bit depth. Therefore all data the camera captures is going to be between 0 and 1, and values above 1 will be clipped. If you debayer using the RawTherapee profile I supplied in the Debayer github repo, you will get something resembling scene-linear ACES data, but encoded in a 0 to 1 range. The next step is to expose your image such that an 18% diffuse reflector sits at a value of 0.18 in scene linear. Now you have a proper scene-linear image, maybe with specular highlights above 1. Where sensor clip gets mapped to depends on how your image was exposed in the camera and your camera's dynamic range. For example, an Alexa will usually clip at around 35 to 40 in scene linear depending on the camera, the DP and the show. My shitty Canon 5Dmk3 will clip at around 12 to 15 if shooting dual-ISO using Magic Lantern. Hope that helps! I'm working on a video about image data formats, and then one on debayering is up next I think. Just need to get some solid time to work on it! :)
@@jedsmith Thanks a lot for your detailed reply - I appreciate it! Of course there are still a lot of questions, but I'm happy to wait for your next video and get my head around it. Maybe I'll also jump on ACEScentral and create a thread to get more information on the workflow. Wish you all the best!
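A minimal sketch of the exposure step described in the thread above: debayered raw data lands in a 0-1 range, and you scale it so an 18% gray reference sits at 0.18 in scene linear. The sampled gray value below is invented for illustration:
```python
# Scale debayered 0-1 linear data so an 18% gray reference maps to 0.18
# in scene linear. The 0.09 gray-card sample below is made up.
def expose_scene_linear(pixel, gray_card_sample):
    return pixel * (0.18 / gray_card_sample)

# If the gray card reads 0.09, everything doubles; raw sensor clip at 1.0
# now lands at 2.0 in scene linear.
print(expose_scene_linear(1.0, 0.09))  # 2.0
```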
You are so professional! I am using PTGui and I'm not so satisfied with the HDR results! I am new to Nuke and learned a lot from you! Thanks!
Damn. Thank you. I didn't need to know this today, but I damn well learnt it today.
Amazing stuff, that's the way to learn: by understanding the underlying processes, not by moving sliders.
My philosophy exactly. Thanks for your nice comment! :)
You solved my problem... thanks man. 'Every artist must watch this video.'
Radio-announcer mode activated. rofl. Amazing content, caught myself staying for the entire video though I meant to finish it later. Love the chapters breakdown.
Super! Waiting for the next tutorial )))
thanks!
thx!