Hello Hugh, is this still the best workflow in 2024 for editing 180 VR shot on canon lens for Vision Pro? Note that I'm editing 6K Clog3 footage, not 8K RAW, so perhaps Premiere might be the best workflow? That being said, there still might be a need to use your calibrator plug-in, which can only be done in Davinci, correct? If there are any other videos of yours with alternative newer methods, please point me in the right direction, thanks (:
So in 2024 I use this workflow: ua-cam.com/video/5YuQIwXXA74/v-deo.htmlsi=9tjuo7_4PugYmA9D and this one: ua-cam.com/video/kUXqKtrO7bQ/v-deo.htmlsi=DHqZQlpJ-AlEeHrU
@@hughhou So now, after I dumped all my Premiere Pro and bought DaVinci Resolve Studio 18.6 (in which I am still totally lost), you are going back to Premiere?? I have at least 300 3D VR180 clips and I can't edit anything. Even the downloads don't seem to have the same installers as your videos are showing. :(
Hi Hugh! Thank you for another great video! This content is so valuable! Just curious: if I shoot a character against a green screen background in 180VR 3D, is it possible to composite a 2D image environment (or animated environment) behind my keyed-out character and render it out in 180 3D VR format?
Yes! You put your 2D immersive environment on the sphere projection and place it in 3D space BEHIND your 3D green screen video. You will need to watch in a headset to find the correct depth to push your environment back to. Same principle: you can place multiple elements at different depths to comp entire 2D assets in 3D - like in a 3D software.
does your plug-in also work on Macbooks? I'm going to buy a laptop but I don't know if your plug-in also works on macOS or only on Windows, thank you in advance for your answer.😊
Great videos! I'm trying to decide between the Mac and the Razer. Do the plug-ins for Resolve work the same on the Mac as shown here on the Razer? Thanks again!
Dear Hugh, again a BIG thank you for all this material. A quick question: I guess I can practice rendering R5C footage in 4K/60 with the free DaVinci Resolve and then buy the Studio version once I am convinced I can handle the workflow myself? Are all the downloaded tools compatible with 4K footage? Thanks a lot. Valter
Yes, that is coming next. I wanted to get the hardest part out first. The R5C menu is confusing as well - but I am a C70 user so it comes easy to me. It will come soon! I promise!
URGGG! I'm trying to do the dual fisheye calibration. When I drop the HughHouAnaglyphViewer onto the clip, the clip just goes black. If I continue anyway, set up the new Fusion Composition, and drop the STmap-Creation-Fu18-v2 into Fusion, nothing happens. I zoom in and out and there's just nothing there. It acts like it dropped into Fusion, but nothing. I'm running DaVinci Resolve Studio 19.0. Thoughts?
Hugh Hou, thank you a lot for this very helpful video. It saves me a lot of time getting into the R5C with the dual fisheye lens. Today DaVinci Resolve 18.5 is out. Will it work with your workflow?
Hey Hugh, slightly off topic, but any thoughts on the Mac Studio and the best configuration for 3D video rendering in DaVinci? I don't want to fork out 7000 bucks, but am I wasting my time if I buy just the base Mac Studio Max version? Is it worth upgrading to 64GB of RAM, or should I be spending money on more SSD? It is hard to find any online reviews of the Studio for 3D video editing and rendering, so I wondered if you had any experience, even though I know you generally use the Razer. Thanks
I am considering buying a Mac Studio, but since I have a maxed-out-specs MacBook Pro M1 Max, my need to upgrade is low. If you go with the Mac Studio, I highly recommend maxing out the specs except SSD/storage. Internal storage is less useful for VR - we generate TBs of footage each shoot - so we usually store it off-computer on an SSD RAID, which is way cheaper than Mac's own storage. So go with 1TB, but max out the rest.
Hugh, thanks for putting this together! I am super excited to start using VR for my channel, but am having problems with RAW. I can use your test files, which are .mov, but if I try to apply the STMapperInline to my RAW .CRM file I either get a stretched view (but only half the screen) or still side by side. I restarted with no changes and now I don't get a preview at all. My clip is 8K 60fps. Do you happen to have a RAW file for download that I can test with? Thanks!
Maybe it's because you're using the free version of DaVinci Resolve? I think 8K is only supported in the paid version (which is understandable). I have the free version too and had the same problem as you. Set it to 4K and the workflow works like a charm. Good luck!
Your video description talks about editing on an M1 Mac, and then later in your video you have "19:02 - View 3D 180 on VR headset Meta Quest 2 with Fusion 18", but you're editing on Windows... does the MQ2 work with DR(S) on Mac? If so, how do you set this up?
Awesome Hugh! Thanks:) I assume that the same workflow will work with Atomos Ninja V+ 8K recordings in ProRes RAW... hopefully. I'm at the end of post on a feature film, so the upgrade to Resolve 18 will have to wait a few weeks. I've had the R5 and VR180 lens for months now - just too busy to really work with it, so all of your hard work is greatly appreciated!
Word of warning - the Ninja V+ takes the HDMI output in 10-bit, not 12-bit like Canon RAW. You are losing a ton of information in your "ProRes RAW". So don't get confused by Atomos marketing - it is NOT a better solution, besides letting you run longer without overheating. The R5 still needs it for that, though. Resolve works great with ProRes RAW, and the workflow is identical to what's shown here. You just don't have my Color page flexibility - changing ISO in post - so you need to nail your exposure on set when filming.
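To put a number on that 10-bit vs 12-bit gap: each extra bit doubles the code values per channel, so 12-bit RAW carries 4x the tonal steps of a 10-bit HDMI feed. A quick stdlib-only sketch (raw code-value counts only, not a full dynamic-range analysis):

```python
# Code values per channel at common capture bit depths.
# A 12-bit Canon RAW signal keeps 4x the tonal steps of a
# 10-bit HDMI feed - that is the "lost information".
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel")
```

That headroom is exactly what post-side ISO changes and heavy grades lean on.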
Hi Hugh, thanks for this fantastic video, once again! I have 2 questions: - Do you think we could use the Canon VR lens in a Nikon Z series with adaptors? - How long does it take to show the maximum quality on the UA-cam Oculus app? It says that it is already processed, and I can already see 4k on the desktop but only 1080 on UA-cam VR.
@hughhou The one thing is the render time: for a 10-minute video it is 1 day and 15 hours. A normal 4K video only takes me 20 minutes at most. You didn't go through the render process in any video I have found on 3D VR yet; what do you suggest?
That really depends on your system specs. If that is the case, I recommend transcoding your Canon RAW LT into ProRes first, using the official Canon tool or just Resolve, and then using the workflow here. It will be a lot faster. If that still doesn't work, try my other, much faster workflow here: ua-cam.com/video/1QLyX6ApuEg/v-deo.html
@@hughhou Thank you so much for taking the time to respond. The actual render time was 12hr 15s. The render time isn't the problem, though. I'm very new to this but have been editing and shooting 2D videos for years, never needing much more than color correction and titles with simple transitions. I'm shooting with the Canon R5 and the dual fisheye lens. My computer is a bit dated, running Windows 10 Home with an Intel i7-5820 3.30 GHz, 32GB RAM, and an RTX 2070 Super graphics card with 8GB RAM.

I bought Resolve Studio and was just lost on how to edit with it, having used Adobe Premiere for years. Then I found your video featuring Resolve and the R5C. I followed your instructions: installed the KartaVR package, checked everything off in the Fusion Reactor, set the STMapper, and everything seemed to be working great. The video stretched to the corners for monitoring, no longer circles. I did a basic edit with some cuts and simple fade transitions.

Then I went to the Deliver tab, selected H.265, and rendered... It did not stretch to the corners; the rendered video is back to being two circles. I can't find any instructions on the rendering/delivery settings, but I have checked and double-checked the settings you laid out in the video and they are right on. Any idea what I could be missing that would make everything fine on the timeline but not render properly?
thanks hugh, you are the best! what would the vr community do without your tutorial videos? as i already wrote you via email, the workflow with canon raw files in mistika vr is not really ideal. i am glad you show a better alternative here.
Hugh!!!! You are the man!!! Thank you sir! Quick question - would you have VR180 Creator Tool for Mac, please? (Unfortunately, Google does not provide that anymore.)
Hi Hugh! I tried to edit the CRM file directly in DaVinci, but I face super long rendering times if I add any FX such as Noise Reduction. Do you suggest I get all the color grading set up without any FX and render out an HEVC file, then bring the HEVC file back in and add the other FX? Will that shorten the rendering time? I'm still playing around and testing different methods.
@@hughhou Thanks for your suggestions. I transcoded into ProRes and it works a lot better! Thank you so much! I still suffered from long rendering times, about 5 hours to render 40 seconds; not sure if that's normal. I did find a workaround: I first render the color output cache at the highest quality, DNxHR 444 HDR, and then check "Use render cached images" when rendering. It takes about 20 minutes to cache the images and about 10 minutes to render, and the result looks identical in the headset. 5 hours vs 30 minutes difference!! Do you know if there is any downside to this method?
Hello, after I apply the STMapperInline from Effects, it doesn't do anything. It's still circles instead of equirectangular. I have watched the video several times and can't figure out what I'm doing wrong. Can you help me? Thanks
Hi Hugh, VR master. I am currently looking into the stereoscopic workflow in DaVinci (Color tab > 3D). I was wondering if you have tested the "stereo alignment" tools and what you think about the "native" 3D workflow in DaVinci. I use a ZCam for 180 stereo. I'll continue exploring, but it looks quite handy. (The only thing I need to figure out is how to remove the tripod.)
Hi Hugh! Do you still have a copy of the repacked creator tool or is that obsolete now? I'm just getting into Davinci and wanted to walk through this tutorial again, thanks in advance!
Super tutorial, thx a lot for the info. I have a question tho: the STMaps you provided work well on some footage but give bad stereoscopy on other clips. In my understanding, if I make my own STMap for a lens/body combo, will it be good for any clip using that same combo? So far the best equirect conversions and stereo results I get come from Canon's VR Utility, but I would like to bypass that process, as doing the conversion in DR and only on the final edit is way better! Thx.
If you do your own STMap for your combo - I guarantee you will get a better result than the Canon VR Utility. But yes, I recalibrate everything for a new video or a new day. There are many reasons the calibration can change - focus, left/right eye adjustment, outside conditions. If you practice the calibration in Mistika or in Resolve, it will become second nature soon. I hope this helps. My template is just to get you started, as I don't know your camera setup.
@@hughhou Thx for the quick reply, that's perfect. Thanks to your tutorial, doing the STMap myself is easy. So to be perfect, an STMap should be done for each different scene; if I change my shot's focus or positioning I should do a new one, correct?
@@hughhou Hey! I'm allowing myself to ask again as I can't find a better resource than you. I'm wondering if one STMap per shoot is enough for all the scenes of the same day, or if I should do an STMap for each individual scene? Best.
@@marting974 How do you create an STMap? I am using the one he provided but it is not working properly at all, it shows two small little rhombus shapes of the left and right instead of filling the entire frame like in the tutorial. Thinking I might have to make my own STmap
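For anyone fuzzy on what an STMap actually is under the hood: it's just a per-pixel lookup image where each map pixel stores normalized (u, v) coordinates pointing back into the source frame. A toy stdlib-only Python sketch of the idea (nearest-neighbor only; real STMapper nodes interpolate and work on float images):

```python
def apply_stmap(src, stmap):
    """Remap src through an STMap: each output pixel samples the
    source at the (u, v) stored in the map, u/v normalized to [0, 1]."""
    h, w = len(src), len(src[0])
    out = []
    for map_row in stmap:
        row = []
        for u, v in map_row:
            x = min(int(u * w), w - 1)  # nearest-neighbor lookup
            y = min(int(v * h), h - 1)
            row.append(src[y][x])
        out.append(row)
    return out

# A map whose (u, v) values point at their own pixel centers is an
# identity; a fisheye-to-equirect STMap is the same mechanism with
# the lens distortion baked into the stored coordinates.
src = [[1, 2],
       [3, 4]]
identity = [[((x + 0.5) / 2, (y + 0.5) / 2) for x in range(2)]
            for y in range(2)]
flipped = [[(1 - u, v) for u, v in row] for row in identity]  # mirror demo
```

This is why a bad calibration "bakes in" permanently: the map, not the footage, carries all the geometry.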
Hello Hugh! I'm trying to understand how to do this with 2 gopros but even with everything you say, I don't understand how you manage to stitch the two files in fusion, would you make a tutorial or explain it quickly? thanks in advance
@@hughhou It's more about setting the WarpStitchUltra node to have the right latlong warping profile. I'm getting there, but when I tried to export, Resolve crashed; I may need to combine the 2 files. Thanks for the answer!
Hello, in the Fusion 18 VR180 section, unfortunately the Median1 and Anaglyph nodes no longer have three dots as they used to; now there are only two. When I select these two dots, I'm unable to make the adjustments that you can. I would greatly appreciate your assistance with this.
Hi Hugh, I have Resolve Studio. When I try to render at 8K, I get the error "Cannot find appropriate codec for encoding the video frame." Could it be my hardware? I have a 4GB RTX 3050.
Great tutorial, Hugh, as always! I'm trying to create my own ST Map corrections and the Fusion STmap-Creation file shows a different dimension (6144x4320) in Viewer 2. Is there a way I can fix this?
The STMapper1 node was set to "Size: STMap Frame". When I change it to "Size: Image Frame", that improves things. Still haven't gone all the way through the workflow though.
Wow! Thanks so much! I will watch all these videos and expect I can do it! Now, although I am interested in VR with the Quest 3 (replacing my Quest 2), I really want to take my 3D footage in 180 and crop it to 16:9 so I can show it on a cinematic 3D projector (AWOL 3000). I can do this, I hope?
Something happened when trying to calibrate: the STMapper tool resizes my window to 12288x4320xfloat32, so the anaglyph shows 6144x4320xfloat32. That way it's impossible to see correctly to calibrate. Any ideas, Hugh? Thanks bro
Hello Hugh Hou, you are a master of VR and your videos are lessons for me. I have a problem: I filmed with the RED Raptor and the Canon mount, and I would like to have the same workflow as yours in Resolve. Can you help me? Thank you
These tutorials are amazing, thank you!! I'm just having trouble with the VR180 Creator. It comes up with an error: "Unexpected error: something isn't working 1: Video Dimensions must be set for V1 metadata injection". Can you help?
An absolutely amazing tutorial - thank you so much Hugh! Unfortunately my Studio 18 crashes each time I try to render out the AllSavers... any idea how to fix this? (I substituted the original footage with a ProRes render as well, but strangely the output size shrinks despite the ProRes movie being the full 8K... but regardless of this or the mp4... crashes galore. Not sure what to do anymore...)
Hey Hugh, it seems the Creator tool is missing from the Hugh Hou folder. It was there a few weeks ago; I just don't see it now. Can you please help?
Thank you! I have a problem on the Color page: Camera Raw is totally greyed out and I can't use any option. I've used one of your clips to test everything. Please help!
Great inspiration for VR180, thank you for your work. I have a big problem: VR180 Creator sends me an error message when I try to inject metadata: "Video dimension must be set for v1 metadata injection". I'm on a MacBook Pro M1 running macOS 12.5.1. This problem doesn't appear if I load a native fisheye video, only with equirectangular footage rendered by DaVinci or by Canon's EOS utility. Can you help me please? Thank you
Adding STMap to multiple images at once? Can you apply the STMap to multiple images that you've added to the timeline at one time or do you need to do it one by one?
Ok Hugh, I found this tutorial and I'm trying to apply it to my double GO 2 setup instead of the Mistika Boutique option :) To mimic the Canon R footage I created another timeline, put the two fisheye clips side by side, and used that timeline as a clip in the main timeline to drop the effects on - but since the FOV is different from Canon's (and also different at 30 and 50 fps), I am not sure it will work well starting from your template. Would you be willing to share the double GO 2 setup template you created, or give tips on how to adjust? Also, any tips on how to sync the two cameras? I get that there's no genlock, but maybe you have some experience with this? I've tried starting them both at the same time and still got a bit of a difference, even at 50fps (I sync them by eye using the audio).
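On syncing two cameras by audio: a clap/slate plus cross-correlation gets you sample-accurate alignment without genlock, rather than matching by eye. A minimal stdlib-only sketch of the idea (on real footage you'd run this on WAV exports of each camera's track):

```python
def best_offset(a, b, max_lag):
    """Brute-force cross-correlation: the lag (in samples) that best
    aligns track b against track a. A negative result means b's event
    happens later, so trim -lag samples off the start of b to align."""
    def score(lag):
        return sum(a[i] * b[i - lag] for i in range(len(a))
                   if 0 <= i - lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=score)

# Synthetic example: the same clap lands at sample 100 in cam A
# and at sample 130 in cam B.
cam_a = [0.0] * 200
cam_b = [0.0] * 200
cam_a[100] = 1.0
cam_b[130] = 1.0
```

At 48 kHz this gives sub-millisecond precision; for long takes you'd also want to check for drift, since the two cameras' clocks aren't locked.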
@@hughhou Of the Qoocam? :) what I would REALLY love to have is another Qoocam 8K so I can do some VR180 with two of them together. I can't even find a used one, since they discontinued it.
Do I need to do anything specific for smooth playback in Resolve? I was able to follow the whole process, and get a sample video to my Quest 2 and UA-cam. But now when I try to actually do playback and edit in Resolve, the playback is so bad, like 2-3 frames over several seconds. I am on an M1 Ultra and the footage is 8K RAW LT 60. Also with 8k RAW 60, the r5c doesn't allow proxies on a second card. Any suggestions?
Yes, if you are on Mac, the better solution now is to use FCPX to convert the file into uncompressed ProRes 4444 and then use Resolve. Canon RAW LT is taxing on any machine b/c of the lack of GPU rendering; do that and you will be 10x faster. Alternatively, you can use the Canon RAW converter - but you might need to boot into Rosetta. For 8K RAW 60fps, you can definitely do second-card proxy recording - it is actually recommended. What stops you from doing so?
@@hughhou You're right (but you know this 😉) - when I put it in 8k mp4 HEVC 422, it disables the proxy mode. I assumed it was the same for 8k RAW LT 60, but now have access to it. I did find in this as well that I can playback raw if I remove (not disable) the anaglyph effect you provided - probably will be something I will still add time to time, since Mac doesn't support Quest II directly in resolve (option is grayed out ). Thank you for the quick response!
Holy cr@p. You rock! Thank you so much for this. I'm looking forward to trying out your workflow later tonight -- it's not something I would have come up with on my own without a huge amount of time invested, and even then, who knows. That I don't need to use Premiere is an added bonus. Thank you, thank you, thank you! Please also pass along a thank you to Andrew Hazelden and Kimchi too :-)
@@hughhou Your tutorial works like a charm and is right to the point with no fluff -- wish more tutorials were like this. Once set up, easy workflow, but I would have never figured this out on my own! THANK YOU! I haven't done the stereo calibration yet (real job calls) but definitely need to -- I think I'm going to throw up after watching my first self-produced VR video. Canon should get you on their payroll! I would have returned my VR lens if not for this video -- so happy to have a solution that works exclusively out of resolve.
Nice work.. I heard the Dual Lens IPD is slightly less than 64mm (standard).. is there any workaround? or just get used to things looking larger than life
Hi Hugh, I am very interested in how to edit a 180-degree video, but I have a problem: when I try to import the sample video from the description into DaVinci Resolve it can't display the image; only the sound comes into DaVinci and it says "media offline". Please guide me 🙏
I wonder if anyone can help? I used this workflow with the R5C and it works great (as long as you make sure to keep the camera horizontal while filming so you don't have to correct for vertical parallax) However Davinci Resolve doesn't currently handle Ambisonic audio, so I had to import the resulting footage into Premiere Pro to add the audio tracks. The issue with this is that the Premiere Pro VR settings then try and reproject the footage into equi-rectangular projection again, but in this case it's already been done in Resolve. To get around this I ignored the VR in the project settings but set to VR equi-rectangular 180 SBS in the export settings. This seems to work fine. Except! If you try to add text or graphics and use VR Plane to Sphere the text ends up getting warped. For now I just accepted this and made the title "purposely" look curved but is there any way to do this correctly? (I suppose I could manually place two versions of the text in the correct places and then manually shift them a bit to get the stereo depth offset, but surely there has to be a neater way of doing it?)
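On the manual two-copies-of-the-text idea above: the per-eye shift needed is small and computable. A rough stdlib-only sketch under simplifying assumptions (title near frame center, 64 mm IPD, one eye of the SBS frame spanning 180 degrees of longitude; the function name and numbers are illustrative, so always verify the depth in a headset):

```python
import math

def per_eye_shift_px(depth_m, eye_width_px, ipd_m=0.064):
    """Horizontal pixels to nudge each eye's copy of a title inward
    (left eye right, right eye left) so it reads at roughly depth_m
    meters in a VR180 equirect frame. Simplified geometry - treat as
    a starting point, not a calibrated value."""
    half_angle_deg = math.degrees(math.atan(ipd_m / (2 * depth_m)))
    return half_angle_deg / 180.0 * eye_width_px

# e.g. a title at ~2 m in an 8K SBS frame (4096 px per eye)
shift = per_eye_shift_px(2.0, 4096)
```

Putting the title closer than the scene elements behind it avoids the depth-violation discomfort that flat zero-disparity text causes.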
Does Premiere Pro support fixing horizontal and vertical disparity? Does it handle Canon RAW with good noise reduction? Or is Resolve the way to go? I've been (reluctantly) editing with PP the past couple years. If only FCPX was VR friendly, but that's another subject. 🫤 Trying to decide whether it's worthwhile spending the time to go back to Resolve. I was using Mistika for a while but would rather avoid it if possible. As always, thanks for what you do Hugh! (yes, that rhymes) For your help, next time I'm in LA I'll buy you a pizza. Or a taco. A pizza taco?
Premiere cannot calibrate disparity at all. For Canon RAW you need to deal with noise in the EOS VR Utility, not inside Premiere. Mistika + Premiere is the best combo, though.
I'm having a nightmare with Resolve 18 trying to edit 8K RAW footage - it keeps crashing after almost every action I take. I'm going back to 17 to see if I can get that to work with 8K.
I am stuck early on in this process unfortunately! I have recorded my 8K ProRes RAW video from the R5 with the Ninja V+. I need to convert this since DaVinci does not recognize it; in the video you say to use the Insta360 software, but that software also does not recognize the 8K ProRes RAW file and just gives me an error when I try to open it. I instead converted it to Apple ProRes 4444 using Final Cut Pro X, and that file is recognized by DaVinci, but it just appears as a left/right video file instead of being combined into the actual VR video. I'm on a Mac if that makes a difference.
Yes, you need to transcode. For Canon RAW, at this point I recommend transcoding to ProRes as well before moving forward. Low-spec computers can't handle ProRes RAW or Canon RAW.
The technical conversion LUT? It should just be any Canon C-Log 3 LUT - check the C300 or C500 downloads as well. It is buried somewhere in the download, but I will find it and upload it somewhere. I use my own commercial LUT most of the time, not the Canon LUT, as it has issues! See my latest tutorial to understand why! Or better, use the just-released EOS VR Utility and then color it in Resolve.
@@hughhou will do! Saw the new Canon VR video you posted. Thanks for your help on everything. If you do find and upload that’d be great. I got the Canon tool yesterday
Download the official Google / UA-cam VR180 Creator Tool here (Hugh repacked): Windows: bit.ly/3uXeiY6 | Mac: bit.ly/3nivjHj | Linux: bit.ly/3QSmnX0
There is one problem: I stitched the video using an incorrectly calibrated STMapperInline and the source files are gone. What should I do?! Do you have any recommendations on how to save the video? :((
It's wild how much custom third-party tooling and manual wrangling it takes to achieve basic capabilities with 3D VR. You're doing good work, sir, bringing these techniques to us at no cost. Kudos to you and Andrew Hazelden for making it possible to use these extremely expensive pieces of hardware. Canon and BMD should be paying you guys big bucks.
My boy Hugh, the godfather of VR, making us another video we can't refuse. Thanks for sharing all the knowledge with us! Can't wait to see what other surprises you have in store for us. Everyone please show him some love.
Awww thank you Vanilla!!!!
🫶
Hundreds of thousands of adult content lovers say thank you for making VR better.
Helpful note:
Label the file with: 180_3D basically 180(underscore)3D
It didn't work at first for me when I used 180(dash)3D
Thanks so much for the tutorial, you've been a lifesaver for me on a big VR project.
Yes, that's "_" not "-" lol. Thx for pointing it out!
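Since players key on that exact `180_3D` underscore tag, a tiny rename pass can save re-exports. A hedged stdlib-only sketch (the `exports` folder name is just an example):

```python
import re
from pathlib import Path

def fix_vr_tag(name: str) -> str:
    """Swap a '180-3D' (dash) tag for '180_3D' (underscore) so
    VR players recognize the side-by-side 3D format."""
    return re.sub(r"180[-\s]?3D", "180_3D", name)

# Batch-fix any already-exported files in a folder:
for clip in Path("exports").glob("*.mp4"):
    fixed = fix_vr_tag(clip.name)
    if fixed != clip.name:
        clip.rename(clip.with_name(fixed))
```

Filenames already carrying the underscore tag pass through unchanged, so the pass is safe to re-run.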
Thank you, VRother Hugh! You are the Sensei of the 360VR production space and the Metaverse.
And God Bless Andrew Hazelden!
Thank you too!
First attempt at editing footage from the 5.2mm RF lens and I ran into a problem: when generating the STMap, only one eye (the left side) was being generated. To fix it I went into Fusion: with the STSaver selected, go to Export and change Clipping Mode to None; then the generated STMap will have both eyes! It works fine now. Thanks for the tutorial. :D
Damn Hugh...you are a LEGEND for not only figuring all this out....but for making this awesome tutorial! All of us aspiring VR creators are so grateful for you! Thank you!!! ^_^
I am very happy to hear that! More to come :)
Thank you Hugh! This helps so much in letting us learn about the whole production of 180 VR video, for FREE!
Thank you. There are no VR experts like you in Korea. You kindly added Korean subtitles as well, so I'm learning a lot.
Glad that helps! I will remember to include Korean subtitles from now on!
Thanks so much Hugh! Your videos are SOOO helpful! And huge thanks to Andrew Hazelden for these plugins.
Glad you find it helpful!
Most anticipated tutorial of the year!!
Haha nice!!
Cutting edge stuff! Thanks Hugh and Andrew...Oh also Kimchi and Keeley! 😀
Glad you enjoyed it
You are the best🎉
Thank you!
The best tutorial for ST Map creation - thanks for your great work!
Thank you very much!
Hey Hugh,
First off thank you for putting all of this together! Incredible effort and gets me closer to the dream of doing everything in Resolve. I have a couple of issues and I'm hoping you can help. I just tried this on some sample footage, and noticed a couple of weird issues:
1. The VR viewer I use for local files (Skybox) didn't correctly recognize the 3D format (SBS, VR180); but this isn't an issue when using Canon's EOS VR utility. Why could this be?
2. The video seemed very mismatched once viewing in a headset (Quest 3), compared to the footage that comes from the EOS VR utility. I applied some manual 3D offset correction in Skybox and it kinda sorta helped. I'm guessing this is a result of using your calibration file instead of creating my own? I'll get to it soon!
Appreciate any insight here. Thanks Hugh!
B/c Resolve does not inject metadata - you need to tell Skybox what it is. Meta injection has to happen in the FB360 step for final delivery. And yes, please create your own calibration file; each camera is different. Use mine as a starting point.
Thanks @@hughhou! After I posted that, I exported another video and that one did seem to preserve the metadata. Not sure where in the chain it got lost on the first test, but I will keep looking into it. Going to take some good reference videos for calibration purposes today! Another question: have you noticed whether creating camera-specific calibration profiles for NeatVideo makes a big impact vs. using the generic profile?
Thank you so much for this video👍, can't wait to get my lens and test your workflow. I thought I'd have to work with Adobe, and I am happy to see that I should be able to work in DaVinci. What I see with my hardware (RTX 4090 and a 7950X) is that I can work in real time with 8K 60 RAW in any 30fps timeline, but it is slightly jumpy on a 60fps timeline, so I hope the result will be about the same with your workflow in a 30fps timeline. Great videos 👍
Incredible! thank you so much for the content! very useful!
Glad you liked it!
thank you Hugh! You are amazing! was so worried when I got the R5C I would not be able to use the RAW with this wonderful lens. You saved the day!
Happy to help! R5C is powerful. Wait till I get to the camera settings to maximize your sensor output !!
@@hughhou Nice! Very excited for that, thank you!
Hi Hugh! For starters, thanks for putting out the content you do.
I just got a Quest 2 and I'm kinda fascinated by VR video. I used to do some video stuff but still photography is my jam now. But the Quest has some ideas rolling around for personal projects. I'm mostly interested in 180. I'm not a big fan of 360. Seems like either half the view is wasted or I'm missing something in half.. depending on which direction I'm looking. My question is.. you talk about the R5 and R5c here. And it seems like a lot of videos I've watched talk about the R5. I have the EOS R. Obviously, I'm not getting 8K. Can I use the 4K in the EOS R with decent results... at least until I figure out if this is something I want to do a lot? I just wasn't sure if the EOS R even has the capability. I'd like to find out before I go buy the lens. I do have a lot of 300x memory.
Thanks. - Chris
Great tutorial! And I believe this can be applied to more cameras
This is a wonderful lesson! But how do I do the calibration if there is already a piece stitched using STMapperInline, but the original video is gone???
Thank you very, very much Hugh! I've tried to play with the defish etc quite a lot so I really know the value you've provided here. Just amazing.
Great to hear! thank you for checking it out :)
Thanks Hugh - I am watching this now after following your latest YouTube video featuring the Spatial GUI Encoder/Injector. So is there any need for the workflow above? The only thing I am now missing is that the Encoder version does not do 8K from my R5?
Wait it should do 8K, the standard encoder for spatial video right? I just did 8K on that.
That was an awesome tutorial, thank you Hugh!
Glad it was helpful!
Hello Hugh, is this still the best workflow in 2024 for editing 180 VR shot on canon lens for Vision Pro? Note that I'm editing 6K Clog3 footage, not 8K RAW, so perhaps Premiere might be the best workflow? That being said, there still might be a need to use your calibrator plug-in, which can only be done in Davinci, correct?
If there are any other videos of yours with alternative newer methods, please point me in the right direction, thanks (:
So in 2024 I use this workflow: ua-cam.com/video/5YuQIwXXA74/v-deo.htmlsi=9tjuo7_4PugYmA9D and this one: ua-cam.com/video/kUXqKtrO7bQ/v-deo.htmlsi=DHqZQlpJ-AlEeHrU
@@hughhou So now, after I dumped all my Premiere Pro and bought DaVinci Resolve Studio 18.6 (in which I am still totally lost), you are going back to Premiere??
I have at least 300 3DVR 180 clips and I can't edit anything. Even the downloads don't seem to have the same installers as your videos are showing. :(
Hi Hugh! Thank you for another great videos! Those contents are so valuable! Just curious. If I shoot a character with green screen background in 180VR 3D, it it possible to composite a 2D image environment (or animated environment) with my key out character and render it out as 180 3D VR format?
Yes! You put your 2D immersive environment on the sphere projection and place it in 3D space BEHIND your 3D green screen video. You will need to watch in a headset to find the correct depth to push your environment to. By the same principle you can put multiple layers at different depths and comp entire 2D assets in 3D, like in a 3D software.
You're amazing, thank you for all this information! Do you happen have a STMapper for the Red V Raptor using the Canon Dual Fish Eye Lens?
I believe the template has it :)
does your plug-in also work on Macbooks? I'm going to buy a laptop but I don't know if your plug-in also works on macOS or only on Windows, thank you in advance for your answer.😊
VR180 is in a better place because of you Hugh! Thanks so much!
Happy to help!
Great videos! I'm trying to decide between the Mac and the Razer. Do the plug-ins for Resolve work the same on the Mac as shown here on the Razer? Thanks again!
Yes. The same.
Dear Hugh, again a BIG thank you for all this material. A quick question: I guess I can train to render R5C footage in 4K/60 with DaVinci Resolve and then buy the Studio version once I am convinced I am able to handle the workflow myself? All the downloaded tools are compatible with 4K footage ? Thanks a lot. Valter
great, thank you for this amazing tutorial. Can you go over the R5C camera setting when you shoot please.
Yes, that is coming next. I want to get the hardest part out first. The R5C menu is confusing as well, but I am a C70 user so it comes easy to me. It will come soon! I promise!
URGGG! I'm trying to do the dual fisheye calibration. When I drop the HughHouAnaglyphViewer onto the clip, the clip just goes black. If I continue anyway, set up the new Fusion composition, and drop the STmap-Creation-Fu18-v2 into Fusion, nothing happens. Zoom in and out and there's just nothing there. It acts like it dropped into Fusion, but nothing. I'm running DaVinci Resolve Studio 19.0. Thoughts?
Hugh Hou, thank you a lot for this very helpful video. It saves me a lot of time getting into the R5C with the dual fisheye lens.
Today Davinci Resolve 18.5 is out. Will it work with your workflow?
Yes! 100%
Hey Hugh, slightly off topic but any thoughts about the Mac Studio and the best configuration for 3d video rendering in DaVinci? I don't want to fork out 7000 bucks but am I wasting my time if I buy just the base Mac Studio Max version? Is it worth upgrading to 64 of ram or should I be spending money on more SSD? It is hard to find any online reviews of the Studio for 3D video editing and rendering so wondered if you had any experience even if I know you generally use the Razor. Thanks
I am considering buying a Mac Studio, but since I have a maxed-out-specs MacBook Pro M1 Max, the need to upgrade is low. If you go with the Mac Studio, I will highly recommend maxing out the specs except SSD/storage. Storage is useless for VR - we generate TBs of footage each shoot - so we usually store it off the computer on an SSD RAID, which is way cheaper than Mac's own storage. So go with 1TB, but max out the rest.
Hugh, thanks for putting this together! I am super excited to start using VR for my channel, but am having problems with RAW. I can use your test files, which are .mov, however if I try to apply the STMapperInline to my RAW .CRM file I either get a stretched view (but only half the screen) or still side-by-side. I restarted with no changes and now I don't get a preview at all. My clip is 8K 60fps. Do you happen to have a RAW file for download that I can test with? Thanks!
Maybe it's because you're using the free version of Davinci Resolve? I think 8K is only supported in the paid version (which is understandable). I have the free version too and have the same problem as you. Putting it to 4k and the workflow works like a charm. Good luck!
Thanks for the excellent video tutorial!!
You are very welcome!
Your video description talks about editing on an M1 Mac and then later in your video you have "19:02 - View 3D 180 on VR headset Meta Quest 2 with Fusion 18" but you're editing on Windows ... does the MQ2 work with DR(S) on Mac ? If so, how do you set this up?
It works on both Mac and PC, but PC allows me to easily use the Meta Quest to preview, while on Mac I need to render it out to check.
Thanks for such an awesome and detailed tutorial.
Awesome Hugh! Thanks:) I assume that the same workflow will work with the Atomos Ninja 5+ 8k recordings in ProRes Raw...Hopefully. I'm at the end of post on a feature film, so the upgrade to Resolve 18 will have to wait a few weeks. I've had the R5 and VR180 lens for months now - just too busy to really work with it, so all of your hard work is greatly appreciated!
Word of warning - the Ninja V+ takes HDMI output in 10-bit, not 12-bit like Canon RAW. You are losing a ton of information in your "ProRes RAW". So don't get confused by Atomos marketing - it is NOT a better solution beyond letting you run longer without overheating. It is necessary with the R5 tho. Resolve works great with ProRes RAW, and the workflow is identical to what is shown here. You just don't have my color page flexibility - changing ISO in post - so you need to nail your exposure on set when filming.
Hi Hugh, thanks for this fantastic video, once again! I have 2 questions:
- Do you think we could use the Canon VR lens in a Nikon Z series with adaptors?
- How long does it take to show the maximum quality on the YouTube Oculus app? It says that it is already processed, and I can already see 4K on desktop but only 1080p in YouTube VR.
Hi Hugh, thank you very much for sharing this video, it is really helpful. I Would like to learn how you color grade Canon RAW as you mentioned.
@@hughhou The one thing is that the render time for a 10-minute video is 1 day and 15 hours. A normal 4K video takes me 20 minutes at most. You didn't go through the render process in any video I have found on 3D VR yet - what do you suggest?
That really depends on your system specs. If that is the case, I recommend transcoding your Canon RAW LT into ProRes first, using the official Canon tool or just Resolve, and then using the workflow here. It will be a lot faster. If that still does not work, try my other workflow, which is a lot faster: ua-cam.com/video/1QLyX6ApuEg/v-deo.html
@@hughhou Thank you so much for taking the time to respond. The actual render time was 12hr 15s. The render time isn't the problem though. Very new to this but have been editing and shooting 2D videos for years, never needing much more than color correction and titles with simple transitions. I'm shooting with the Canon R5 and the dual fish eye lens. My computer is a bit dated running Windows 10 home with the Intel i7-5820 3.30 ghz, 32gb RAM and the RTX 2070 Super for a graphics card with 8GB ram.
Bought Resolve Studio and was just lost on how to edit with it; I had been using Adobe Premiere for years. Found your video featuring Resolve and the R5C. Followed your instructions and installed the KartaVR package, checked them all off in the Fusion Reactor, set the STMapper, and everything seemed to be working great. The video had stretched to the corners for monitoring, no longer circles. Did a basic edit with some cuts and simple fade transitions. Went to the Deliver tab, selected H.265, and rendered...
It did not stretch to the corners, rendered video is back to being 2 circles.
I can't seem to find any instructions on the rendering/delivery settings but I have checked and double checked the settings you laid out in the video and they are right on. Any idea what I would miss that would make everything fine on the timeline but not render properly?
thanks hugh, you are the best! what would the vr community do without your tutorial videos? as i already wrote you via email, the workflow with canon raw files in mistika vr is not really ideal. i am glad you show a better alternative here.
Thank you.
Hugh!!!! You are the man!!! Thank you sir! Quick question - would you have VR180 Creator Tool for Mac, please? (Unfortunately, Google does not provide that anymore.)
Oh shoot I am on PC. I believe there is another way to inject meta - let me check
2 years later, has there been any updates in Davinci for the fish eye calibration? stitching, ST Mapping? Thanks! Or any easier methods?
I need to follow up. Good suggestion :)
Thanks Hugh - you're the VR man!
Hi Hugh! I tried to edit the CRM file directly in DaVinci, but I face the problem of super long render times if I add any FX such as Noise Reduction. Do you suggest I get all the color grading set up without any FX, render out an HEVC file, then bring the HEVC file back and add the other FX? Will that shorten the render time? I'm still playing around and testing different methods.
I will suggest ProRes or transcode into ProRes.
@@hughhou Thanks for your suggestions - I transcoded into ProRes and it works a lot better! Thank you so much! I still suffered from long render times, about 5 hours to render 40 seconds; not sure if that's normal. I did find a workaround: I first render the color output cache at the highest quality, which is DNxHR 444 HDR, and then check "Use render cached images" when rendering. It takes about 20 minutes to cache the images and about 10 minutes to render, and the result looks identical in the headset. 5 hours vs 30 minutes difference!! Do you know if there is any downside to this method?
Hello, after I apply the STMapperInline in Effects it doesn't do anything. It's still circles instead of equirectangular. I have watched the video several times and can't figure out what I'm doing wrong. Can you help me? Thanks
great video. what is the limitation of the free version??
Hi Hugh VR master, I am currently looking into the stereoscopic workflow in Davinci (color tab > 3D). I was wondering if you have tested the "stereo alignment" tools and what you think about the "native" 3D workflow in davinci.
I use the Z CAM for 180 stereo. I'll continue exploring, but it looks quite handy. (The only thing I need to figure out is how to remove the tripod.)
It is great! You can probably convert VR180 to 3D using that. But the stereo calibration does not understand fisheye distortion.
Hi Hugh! Do you still have a copy of the repacked creator tool or is that obsolete now? I'm just getting into Davinci and wanted to walk through this tutorial again, thanks in advance!
Yes it is in this tutorial description: ua-cam.com/video/kUXqKtrO7bQ/v-deo.htmlsi=MhxkmcwQawbmh06Q
Thank you so much!@@hughhou
Super tutorial, thx a lot for the info.
I have a question tho, the Stmap you provided are working well on some footage and gives bad stereoscopy for other clips. In my understanding, if I do my STmap for a lens/body combo, will it be good for any clip using this same combo ? So far the best equi conversions and stereo results I get comes from Canon VR utility but I would like to by pass this process as doing the conversion in DR and only on final edit is way better !
Thx.
If you do your own STMap for your lens/body combo, I guarantee you will get a better result than the Canon VR Utility. But yes, I do a calibration for every new video or new day. There are many reasons the calibration can change - like focus, left/right eye adjustment, or outside conditions. If you practice the calibration in Mistika or in Resolve, it will become second nature soon. I hope this helps. My template is just to get you started, as I don't know your camera setup.
@@hughhou Thx for the quick reply, that's perfect. Thanks to your tutorial, doing the STMap myself is easy. So to be perfect, an STMap should be done for each different scene - if I change my focus or positioning, I should do a new one, correct?
@@hughhou Hey! I'm allowing myself to ask again, as I can't find a better resource than you. I'm wondering if one STMap per shoot is enough for all the scenes of the same day, or if I should do an STMap for each individual scene? Best.
@@marting974 How do you create an STMap? I am using the one he provided but it is not working properly at all, it shows two small little rhombus shapes of the left and right instead of filling the entire frame like in the tutorial. Thinking I might have to make my own STmap
Hello Hugh! I'm trying to understand how to do this with 2 gopros but even with everything you say, I don't understand how you manage to stitch the two files in fusion, would you make a tutorial or explain it quickly?
thanks in advance
You align the 2 files just like 2 GO 2s. I will make one when I get both a GoPro and the Max Lens Mod - I don't have that many GoPros yet.
@@hughhou it's more about setting the WarpStichUltra node to have the good warping latlong profile, I'm getting there, but when I tried to export, resolve is crashing, I may need to combine the 2 files. Thanks for the answer!
My dream camera. Wish I could afford one
Hello,
In the Vision 18 VR 180 section, unfortunately, the Median1 and Anaglyph sections no longer have three dots as they used to. Now, there are only two dots. When I select these two dots, I’m unable to make the adjustments that you can. I would greatly appreciate your assistance with this matter.
I'm also having very similar issues. None of the adjustments appear to do anything.
Hugh.!!!.. You are amazing. A big thank you.
Glad it is helpful!!
Is this lens usable with other dslr cameras?
Also am I right in saying the image once processed is in 3d? Or is it just high resolution 180 ?
They are high resolution 3D 180. It only works with the R5 and R5C :)
Hugh, thank you! If you ever conduct a masterclass overseas - sign me up!
Maybe one day!
Hi Hugh, I have Resolve Studio. When I try to render at 8K, I get the error: "Cannot find appropriate codec for encoding the video frame." Could it be the hardware? I have a 4GB RTX 3050.
When do you denoise in this workflow? Everything Canon RAW is going to have a ton of noise.
Yes, RAW has no in-camera denoise. Doing it directly in the color page should be fine. You are still editing RAW with this workflow.
Great tutorial, Hugh, as always! I'm trying to create my own ST Map corrections and the Fusion STmap-Creation file shows a different dimension (6144x4320) in Viewer 2. Is there a way I can fix this?
It should not. Maybe some node setting is off? Turn off fit-to-view and manually adjust the resolution on all nodes?
@@hughhou I'm seeing the same thing and the anaglyph image is all off. Still hunting down the reason.
The STMapper1 node was set to "Size: STMap Frame". When I change it to "Size: Image Frame", that improves things. Still haven't gone all the way through the workflow though.
Wow! Thanks so much! I will watch all these videos and expect I can do it! Now, although I am interested in VR with the Quest 3 (replacing my Quest 2), I really want to take my 3D footage in 180 and crop to 16:9 so I can show it on a cinematic 3D projector (AWOL 3000). I can do this, I hope?
Yes! I will make a tutorial on this special use case soon
@@hughhou oh, you are the best out there, Hugh! /My doggo Callie will also star in VR soon
😱Thanks for the tutorial!!!👍👍 Do you know if it is possible to live stream with the Canon cam? ✌️ Or can someone recommend a VR180 live streaming setup?
This is COMING NEXT!!!! Lol you read my mind. STMap is the key here so learn this one first and we will build on that knowledge.
@@hughhou Thank you!!!
Something happened when trying to calibrate: the STMapper tool resizes my window to 12288x4320 float32, so the anaglyph shows at 6144x4320 float32. That way it's impossible to see correctly to calibrate. Any ideas, Hugh? Thanks bro
Same for me
I fixed it: on the STMapper1, in the Controls pane, select Window -> Frame and the anaglyph should now show at 4096x4096.
@@ringrace you are a life saver!!
Hello Hugh Hou,
You are a master of VR and your videos are lessons for me. I have a problem: I filmed with the RED Raptor and the Canon mount, and I would like to have the same workflow as you in Resolve. Can you help me?
Thank you
Yes. Red V-raptor workflow is coming next! That is literally the next video!
@@hughhou You are the best, thank you !!!
Hi, how can I transform the dual fisheye video into a normal video for watching on, for example, a TV?
Same workflow as Spatial Video : ua-cam.com/video/BHtKOxGEiAw/v-deo.htmlsi=gwlDzfAMdGwVZHxI
thank you very much! 😀😊
Thank you for doing this!!
These tutorials are amazing, thank you!! I'm just having trouble with the VR180 Creator. It's coming up with an error: 'Unexpected error: something isn't working 1: Video Dimensions must be set for V1 metadata injection'. Can you help?
That is a strange error. Are you on PC? Is your final render resolution 2:1 ratio?
@@hughhou I have the same error on Mac. My video is 4096*2048
I know now, I was trying to process a .MOV and not a .MP4
@ hello, I have the same problem, how did you solve? Thanks
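For others hitting the same injector error: in this thread it came down to two things, the container (a .MOV instead of an .MP4) and the frame size the injector reads (a 2:1 equirectangular ratio like 4096x2048). A small sanity-check sketch of those two conditions - this is my own hypothetical helper, not part of VR180 Creator, and the real tool's checks may differ:

```python
def check_injectable(filename, width, height):
    """Pre-flight checks before V1 metadata injection.

    Mirrors the two failure causes reported in this thread:
    wrong container (.mov instead of .mp4) and a frame size
    that is not 2:1 equirectangular.
    Returns a list of problems; an empty list means it looks OK.
    """
    problems = []
    if not filename.lower().endswith(".mp4"):
        ext = filename.rsplit(".", 1)[-1]
        problems.append(f"container should be .mp4, not .{ext}")
    if width != 2 * height:
        problems.append(f"{width}x{height} is not a 2:1 equirectangular frame")
    return problems

print(check_injectable("final.mov", 4096, 2048))  # flags the container
print(check_injectable("final.mp4", 4096, 2048))  # no problems
```

Re-rendering as a 2:1 MP4 (e.g. 4096x2048 or 8192x4096) before injection avoids both causes mentioned above.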
An absolute amazing tutorial - thank you so much Hugh! Unfortunately my Studio18 crashes each time I try to render out the AllSavers ... any idea how to fix this? (I substituted the original footage with a ProRes rendered as well but strangely the output size shrinks despite the ProRes movie being the full 8K ... but regardless of this or the mp4 ... crashes galore. Not sure what to do anymore...)
Hey Hugh, it seems the creator up 585 asses is missing from the Hugh Hou folder. It was there a few weeks ago, I just don't see it now. Can you please help?
585 asses? Sorry what?
tnx hughhou
Can I just get the regular R5 for this? It is not possible for me to buy the R5C 😔
I have a question about resolution. My camera is set to 8192x4320. Is there a good reason to use 8192x4096?
Your camera should be 8192x4096 after stitching. You have the Canon R5C, right?
@@hughhou R5. I followed your video and was able to create 7 min 180 VR video. Is there an easy way to composite 180 VR image?
Thank you!!
You're welcome!
Thank you! I have a problem on the color page: the Camera Raw panel is totally grey and I can't use any option. I've used one of your clips to test everything. Please help!
Thanks!
Wow thank you!!
Great inspiration for VR180. Thank you for your work.
I have a big problem.
VR180 Creator sends me an error message when I try to inject metadata:
"Video dimension must be set for v1 metadata injection"
I have a MacBook Pro M1 on macOS 12.5.1.
This problem doesn't appear if I upload a native fisheye video, only with equirectangular renders from DaVinci or from the Canon EOS utility. Can you help me please?
thank you
Adding STMap to multiple images at once? Can you apply the STMap to multiple images that you've added to the timeline at one time or do you need to do it one by one?
One by one for now. But you only need one usually...
You can create a composite clip with all your segments and then apply the STMap to it
@@ringrace very interesting. Thank you!
Will this work on Davinci Resolve free version?
Are these tools useful for the VUZE XR 180/360 camera too?
That answers my Mac question thanks
Happy to help!
Ok Hugh, I found this tutorial and I'm trying to apply it to my double GO 2 setup instead of the Mistika Boutique option :) To mimic the Canon footage I created another timeline and put the two fisheye clips side by side, then used this timeline as a clip in the main timeline to drop the effects on. But since the FOV is different from the Canon's (and also different at 30 and 50 fps), I am not sure it's going to be good starting from your template. Would you be happy to share the double GO 2 setup template you created, or give tips on how to adjust?
Also, any tips on how to sync the two cameras? I get there's no genlock but maybe you have some experience with this? I've tried to click them both at the same time and still got a bit of a difference even when using 50fps (I sync them by eye using the audio).
This is not VR180 tho just to clarify :) Will add you onto the giveaway!
@@hughhou Of the Qoocam? :) what I would REALLY love to have is another Qoocam 8K so I can do some VR180 with two of them together. I can't even find a used one, since they discontinued it.
Do I need to do anything specific for smooth playback in Resolve? I was able to follow the whole process and get a sample video to my Quest 2 and YouTube. But now when I try to actually do playback and edit in Resolve, the playback is so bad, like 2-3 frames over several seconds. I am on an M1 Ultra and the footage is 8K RAW LT 60. Also, with 8K RAW 60, the R5C doesn't allow proxies on a second card. Any suggestions?
Yes, if you are on Mac, the better solution now is to use FCPX to convert the file into uncompressed ProRes 4444 and then use Resolve. Canon RAW LT is taxing on any machine because of the lack of GPU rendering; do that and you will be 10x faster. Alternatively, you can use the Canon RAW converter, but you might need to boot into Rosetta. For 8K RAW 60fps, you can definitely do second-card proxy recording - it is recommended actually. What stops you from doing so?
@@hughhou You're right (but you know this 😉) - when I put it in 8k mp4 HEVC 422, it disables the proxy mode. I assumed it was the same for 8k RAW LT 60, but now have access to it. I did find in this as well that I can playback raw if I remove (not disable) the anaglyph effect you provided - probably will be something I will still add time to time, since Mac doesn't support Quest II directly in resolve (option is grayed out ). Thank you for the quick response!
Hey Hugh, is it possible to connect the Oculus Quest to Resolve on an m1 max macbook pro?
Not Mac. I use PC on my Razer for that :(
Oculus is Android so not a Mac problem. If you use HTC Vive then yes
Holy cr@p. You rock! Thank you so much for this. I'm looking forward to trying out your workflow later this night -- it's not something I would have come up on my own w/o a huge amount of time invested and even then, ???. That I don't need to use Premier is an added bonus. Thank you, thank you, thank you! Please also pass along a thank you to Andrew Hazelden and Kimchi too :-)
Haha thank you! Yes this tutorial is literally for you then.
@@hughhou Your tutorial works like a charm and is right to the point with no fluff -- wish more tutorials were like this. Once set up, easy workflow, but I would have never figured this out on my own! THANK YOU! I haven't done the stereo calibration yet (real job calls) but definitely need to -- I think I'm going to throw up after watching my first self-produced VR video. Canon should get you on their payroll! I would have returned my VR lens if not for this video -- so happy to have a solution that works exclusively out of resolve.
I don't understand why my GPU usage is at zero while rendering... it takes ages for a minute of footage. Running Windows 10 with an AMD 5800X and a 6800XT.
Try the new Canon Software: ua-cam.com/video/5YuQIwXXA74/v-deo.html
Nice work.. I heard the Dual Lens IPD is slightly less than 64mm (standard).. is there any workaround? or just get used to things looking larger than life
Actually it is very slightly smaller than in life. A larger IPD makes things look greater than life :) So it is the opposite.
thanks!!! from japan
Thank you so so much !!!
Hi Hugh, I am very interested in how to edit a 180-degree video, but I have a problem: when I try to import the sample video from the description into DaVinci Resolve, it can't display the image. Only the sound comes into DaVinci and it says "media offline". Please guide me 🙏
Are you on the latest Resolve 18? This is a recently built-in feature.
@@hughhou I am on the latest version, but the RAW file you gave is not detected when I import it into DaVinci Resolve - "media offline".
THANK YOUUUUU!!!!!!!
Glad you like it!
I wonder if anyone can help?
I used this workflow with the R5C and it works great (as long as you make sure to keep the camera horizontal while filming so you don't have to correct for vertical parallax) However Davinci Resolve doesn't currently handle Ambisonic audio, so I had to import the resulting footage into Premiere Pro to add the audio tracks. The issue with this is that the Premiere Pro VR settings then try and reproject the footage into equi-rectangular projection again, but in this case it's already been done in Resolve. To get around this I ignored the VR in the project settings but set to VR equi-rectangular 180 SBS in the export settings. This seems to work fine. Except!
If you try to add text or graphics and use VR Plane to Sphere the text ends up getting warped. For now I just accepted this and made the title "purposely" look curved but is there any way to do this correctly?
(I suppose I could manually place two versions of the text in the correct places and then manually shift them a bit to get the stereo depth offset, but surely there has to be a neater way of doing it?)
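On the manual two-copies idea at the end: it does work, and the horizontal offset you need for a given apparent depth can be estimated from basic stereo geometry. A back-of-the-envelope sketch - my own approximation, not a Premiere or Resolve API, and the exact numbers depend on your rig's baseline and projection:

```python
import math

def stereo_offset_px(depth_m, ipd_mm=63.0, eye_width_px=4096, fov_deg=180.0):
    """Approximate horizontal pixel offset between the left/right copies
    of a title, to place it at depth_m meters in a VR180 equirect frame.

    depth_m: apparent distance of the text from the viewer (meters).
    ipd_mm: interpupillary distance; ~63 mm is a common average (the
            Canon dual fisheye baseline is slightly smaller).
    eye_width_px: width of ONE eye's image (half the SBS frame).
    fov_deg: horizontal field of view covered by that eye image.
    """
    ipd_m = ipd_mm / 1000.0
    # Convergence angle subtended by the two eyes at this depth.
    angle_deg = math.degrees(2.0 * math.atan((ipd_m / 2.0) / depth_m))
    # Map that angle onto pixels across the equirect eye image.
    return angle_deg / fov_deg * eye_width_px

for d in (1.0, 2.0, 10.0):
    print(f"{d} m -> ~{stereo_offset_px(d):.1f} px offset")
```

The key intuition: the offset shrinks as the depth grows, so text meant to sit far away needs almost no disparity, while close-up titles need a noticeable shift between the two eye copies.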
Does Premiere Pro support fixing horizontal and vertical disparity? Does it handle Canon RAW with good noise reduction? Or is Resolve the way to go? I've been (reluctantly) editing with PP the past couple years. If only FCPX was VR friendly, but that's another subject. 🫤 Trying to decide whether it's worthwhile spending the time to go back to Resolve. I was using Mistika for a while but would rather avoid it if possible. As always, thanks for what you do Hugh! (yes, that rhymes) For your help, next time I'm in LA I'll buy you a pizza. Or a taco. A pizza taco?
Premiere cannot calibrate disparity at all. For Canon RAW you need to deal with noise in the EOS VR Utility, not inside Premiere. Mistika + Premiere is the best combo tho.
@@hughhou You have a video about everything. Does that include how to use Canon VR Utility to deal with RAW noise?
I'm having a nightmare with Resolve 18 trying to edit 8K RAW footage - it keeps crashing after almost every action I take. I'm going back to 17 to see if I can get that to work with 8K.
Also try rendering the fisheye RAW LT out to ProRes first and re-importing it. Depending on the machine, re-importing ProRes will solve all the crashes.
@@hughhou Thanks Hugh, I'm going to upgrade my GPU too as I think that is under spec.
It worked well the first 2 times... now DaVinci stops responding when I drag the STMapperInline... any idea what we can do from there?
Strange. Have you tried restarting Resolve?
@@hughhou I found the bug, I had to set the app to "Rosetta" mode to make it work... normal boot on M1 MacBook Pro makes the plugin crash.
Good find! Rosetta mode always fixes M1 issues…
How do we inject metadata on a mac? Or is it only working on windows?
I am stuck early on in this process unfortunately! I have recorded my 8K ProRes RAW video from the R5 with the Ninja V+. I need to convert this since DaVinci does not recognize it; in the video you say to use the Insta360 software, but this software also does not recognize the 8K ProRes RAW file and just gives me an error when I try to open it.
I tried instead to convert it to Apple ProRes 4444 using Final Cut Pro X, and that file is recognized by DaVinci, but it just appears as a left/right video file instead of being combined into the actual VR video. I'm on a Mac, if that makes a difference.
Yes, you need to transcode. For Canon RAW, at this point I recommend transcoding to ProRes as well before moving forward. A low-spec computer can't handle ProRes RAW or Canon RAW.
This way DaVinci can edit and color grade directly, which is very convenient and loses no quality; the official software's RAW export to 444 MP4 loses quite a lot of quality from what I can see. But DaVinci's export is too slow - my 3080 Ti only gets 1.5 fps. Is there any way to speed it up?
Yes! It’s a trade off
At 11:44, Hugh adds a LUT but I'm not finding it and wasn't sure where to get it on Canon's website and install it. Anyone able to help? :)
The technical conversion LUT? It should just be any Canon C-Log 3 LUT - check the C300 or C500 downloads as well. It was buried somewhere in the download, but I will find it and upload it somewhere. I use my own commercial LUT most of the time, not the Canon LUT, as it has issues! See my latest tutorial to understand why! Or better, use the just-released EOS VR Utility and then color it in Resolve.
@@hughhou will do! Saw the new Canon VR video you posted. Thanks for your help on everything. If you do find and upload that’d be great. I got the Canon tool yesterday
Hi! Where do I find the stitching effect?
Sorry what do you mean effect stitching?