Hello. I am using the 360 1-inch. While using the lens guard, when the light is strong, the shape of the lens guard occasionally shows up in the footage. Is there a way to remove it?
@Hugh Hou hey, if anybody knows it would be you. Taking greenscreen footage, creating an alpha channel, and exporting with a transparent background, then playing that footage in DeoVR. Is there a best way, or something you have looked into? I've accomplished this, but sometimes the blacks remain transparent, like clothing or darker eyes.
Hey Hugh, on my PC I've got (for the Renderer3D node) Hardware Renderer and Hardware UV Renderer... sorry if it is a newb question, but what is the diff? Cheers Bruh!
Also another newb question (I was on Adobe for years so everything is new, sorry): if I insert a 2D video in a 360 VR video, how do I rotate the 2D video? Ex: I've made a black title card with the text in the main view, but I want to place the animated client logo in the back (180 degrees). I insert the 2D video, but it is like watching it in a mirror. Cheers... and sorry for the bother.
I detailed this in this video: ua-cam.com/video/ZwojozYr68M/v-deo.html - you need to build a 3D space in Fusion and move it in 3D. If you hate Fusion, the Boris FX Continuum VR Unit really helps: bit.ly/45eXwTE - but I still recommend learning the hard way in Fusion. You can preview this in VR faster than in Premiere, and once you've built it, the next time it is a lot faster and you can do so much more than Premiere can ever do. I have more tutorials and tips and tricks coming. It just takes me a while to make in-depth tutorials and balance my real work as a VR DP with being a full-time tutorial maker lol. Good thing is, they are all free. Also, 18.5 has some new AI features for 360 editing - testing it now.
In this video it seems so easy and quick, but the Spherical Stabilizer is so super slow. I upgraded from a GTX 1060 to an RTX 3080 with 16GB VRAM expecting much faster performance, but it is still really slow.
Do you use Proxy and Smart Render? You can also create a 1080p timeline, do the tracking there, and then bring it back to the original quality for the final render to speed things up.
Oh no you don't. For a simple 360 rotation you can use PanoMap, shown here: ua-cam.com/video/xlOhluai5mk/v-deo.htmlsi=2KI6OwjAdFaZvWJw. Or you can use the "Transform" OpenFX. When applying this effect to 360VR content, make sure to set the Transform effect's "Advanced Options > Edge Behaviour" setting to "Wrap-Around". So there you go, two ways to do it - plus the BorisFX, you've got three ways.
Hi, nice to meet you. I am in the process of creating 360 travel content. By the way, my computer really struggles when editing the videos. Can you recommend specs for a machine that can edit smoothly? I'm watching your videos - thanks as always for the good information~
Hi there, can you please help me? I have the R5 C and the dual 5.2mm lens, and I use the Canon VR Utility to import my CRM file and save it as a ProRes 4444 file. I then import this file into DaVinci Resolve to color grade and export it as an MP4 H.265 8192x4096 200 Mbps file. For the life of me, I cannot get this file to play back on the Quest 3 using the Meta Quest TV app or Skybox VR. Neither will work. What am I doing wrong?
Did you inject metadata using the VR180 Creator Tool? Check this tutorial: ua-cam.com/video/kUXqKtrO7bQ/v-deo.htmlsi=zKzY8Ftlw9CEApaz and look for the download link to my repack in the description.
@@hughhou I have used the VR180 tool and injected the metadata, and still no success with playback in Meta Quest TV or the Skybox app. I have uploaded the content to UA-cam VR and it plays 3D VR180 fine, but the quality is not good because it is UA-cam. Why is the Quest 3 not playing them? They are MP4 H.265 100 Mbps files at 8192x4096. I can hear the audio playing in Meta Quest TV and Skybox, but there is no video. Please help. I have invested a lot of money in order to create immersive video, it has been a week, and I have watched every video of yours, but still no solution as to why it will play back on UA-cam but not in Meta Quest TV or Skybox. Any ideas?
It should not affect MotionVFX, as the workflow just treats them as flat graphics. I usually like to pre-render them anyway - some MotionVFX templates are way too complicated in Fusion. It is best practice to pre-render them, and if you need changes, just decompose it.
@@hughhou yeah. Super helpful, thanks! Just made me realize I need to do a Fusion crash course lol 😂 Also, I can't find it, but do you have a video on export settings in Resolve?
I am trying to install the Spatial Media Metadata tool on my MacBook Pro 2015, but my machine is not supporting the software. Is there any option for installing it? Thank you in advance.
@@iamEmrulHasan Check out the DaVinci Resolve video on my channel. There's a chapter marker for installing it. Give it a shot - hopefully that will work.
Did you connect your HMD? If so, is your HMD connected via Oculus Link on your PC, with the Oculus app open? If that doesn't help, install SteamVR first and make sure you can see your headset. Check your USB-C cable.
Does this only work on Apple computers? I have a PC with an i6, 64 GB of memory, and an AMD 8 GB video card running Windows 10. I haven't been able to get it to generate a true 360 MP4 file. In Fusion I can move the second window in true 360, but the Edit page isn't in 360 - it never changes from when I dropped the video on the timeline. Thanks in advance for any help.
Still can't get it to save in the immersive 360 format. I have Windows 10, 64 GB of RAM, a new 12 GB Nvidia graphics card, and a Ryzen 6 processor. The clip renders out as a reframed video and not in true 360. I use Insta360 Studio in 360 output mode to an MP4 before going to DaVinci. Windows DaVinci Studio doesn't have a ProRes render mode, so I render it as an MP4. I figure if it can't render a 5-second unedited clip, then it's not going to do any of the great things you are demonstrating. Any suggestions? Or should I just be satisfied with the basic Insta360 Studio clips?
Did you track it correctly? Make sure your tracker is solid first. You might need to delete tracker points that are moving or bad - that usually helps! Less is more with tracking points, just FYI. Be selective!
@@hughhou First of all, the OP replying to a help comment on a year-old video? Massive thanks! I'm starting over with a bit longer cut as well. I'll make sure to really pay attention to trackers! Thanks for your response, Hugh! - For people who may have had the same problem: my pickup is in the shot - camera mounted up top, which means moving trackers. I removed the LatLong, did a polygon and selected everything that was moving (my pickup), then added back the LatLong, then the Camera Tracker, connected the poly to the top of the tracker, and tracked.
When I add those Fusion nodes for inserting 2D video, it increases the render time from 30 minutes to 1 day 10 hours. 😳😳😳 And this is only 1 minute of my 360 video in 4K. That's insane...
In the "dumb question" category... I need dumb help @hughhou. I'm editing a 360 clip in Fusion and I'm using a title template from MotionVFX. But when I apply my title to my Fusion clip (which is on a timeline with other clips), my title keeps appearing till the end of my timeline. Any idea what I'm doing wrong? Cheers!
Yes, this video should explain it. You need to project it correctly. The easiest fix is to make your graphics smaller - the professional fix is actually in this tutorial, using the VR Unit.
But how do I export it to YouTube in 180 or 360 VR? Here I can do everything, but how do I export this to 360 afterwards? In Adobe After Effects or Adobe Premiere Pro, no problem.
@@hughhou Thank you. It's best if you show the end result on UA-cam in 360 video. There are 2 videos of the same waterfall on my channel: one is unprocessed and the other with plugins. I'm curious if DaVinci will be better.
Hi Hugh, what about if I want to place a title flat, not wrapped in 360, on a 360 video - for example "Clean and Simple Lower Third"? When I put it in a 360 video it is curved, and I don't want that; same problem with a logo. Thank you... You are THE BEST!! I got what I was looking for in your video ua-cam.com/video/8u7GY436_Jw/v-deo.html
Thank you very much. Finally a YouTube channel that shows how to rotate the perspective on a 360 video.
Glad it was helpful!
This is the best tutorial channel for 360 that I've found. Even the newer videos have good basic instructions, so I don't have to refresh my memory by watching basic DaVinci Resolve tutorials.
Glad you think so!!
Great idea! I would absolutely love to see a detailed Insta360 Stitcher Render Tutorial for the Pro 2 😍.
Watch at .75
Thanks Hugh for another amazing video
Great video! At 3:35 you have the DaVinci render settings at ProRes for performance. However, my Windows version only has H.26x and DNxHR settings. What do you suggest?
DNxHR
Great work as always Hugh. I would be lost without your tutorials.
I believe DaVinci Resolve has an AI upscaling feature which could potentially improve 360 video quality - maybe this is something you can cover in a future tutorial. Thanks again.
I think I kinda did - 2X upscale to 4X upscale. But the quality is not as good as other options. I will need to revisit it. Thanks for the reminder.
Topaz Video AI now has a direct DR plugin so you can use that product's features inside Resolve.
I've also found that the built-in upscaler in DR is good enough to render 6K to 8K, which is massively important if you upload to UA-cam, since UA-cam will no longer show the 6K version but instead downscale to 4K - losing fully **half** of all your pixels. Which is of course a disaster for immersive VR.
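The "half of all your pixels" claim above is easy to sanity-check with rough arithmetic. A minimal sketch, assuming common 2:1 equirectangular frame sizes (5760×2880 for "6K", 3840×1920 for "4K") - the exact dimensions a platform serves are an assumption here, not a documented spec:

```python
def equirect_pixels(width: int) -> int:
    """Pixel count of a 2:1 equirectangular frame of the given width."""
    return width * (width // 2)

six_k = equirect_pixels(5760)   # 5760 x 2880 = 16,588,800 px
four_k = equirect_pixels(3840)  # 3840 x 1920 =  7,372,800 px

# Fraction of pixels that survive a 6K -> 4K downscale
retained = four_k / six_k
print(f"retained: {retained:.0%}")  # about 44%, i.e. more than half lost
```

So the downscale keeps roughly 44% of the pixels - consistent with the "half your pixels" complaint for immersive viewing, where every pixel of the sphere counts.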
Subbed dude! Super helpful, thanks!
Under "Optimized Media and Render Cache" (timestamp 00:03:10), the "ProRes" options are missing from all dropdowns. I exported the video from the 360 editor as a ProRes video. What am I doing wrong, please, mate?
Thanks for the sub!
Are you on PC? Then Avid DNxHR. ProRes is Mac-only. DNxHR is the PC ProRes - sorry, I should have explained that.
@@hughhou Please don't apologise, that's very helpful, thank you man. In Insta360 Studio, for 360 export on PC, the only encoding formats it offers me are H.264 and H.265 (and ProRes 422, which I now know is Mac-only). Which one would you say to use, please?
Wish I had a little more notice of the premiere! Got here late. Rewatching.
Haha thank you!
Thank you for this Hugh.
Also love Kimchi and would love more content on how to make money as a VR specialist.
Noted! I will sit down and write something about it.
THANKS HUGH!! I am going to finally make this happen! Keep inspiring us!
Glad to help Terry!
Tq for the tips
Thank you!!
Thanks so much for your tutorial!
You are so welcome!
Amazing, very cool
Thank you very much!
Thank you, for the update.
Yes absolutely!
Another life/time-saving tutorial, Hugh!!! Quick question for anyone knowledgeable: how can I add multiple 2D videos/photos into the same Fusion composition? I want to have 20 photos surrounding the user in 360, but I am new to Fusion and the 360 workflow. Can I duplicate the node tree and merge more photos in? Do I render or compound my clip and repeat the process with each photo? What is the best way to insert multiple 2D images into one 360 scene? Please help!
You can copy the same node structure and Merge3D them in. No need to render out. But if you're getting slowdowns after 4 or 5 instances, then rendering out as ProRes will help with speed in Fusion. It depends on your computer setup.
@@hughhou Thanks so much Hugh! Worked like a charm! You are the 360 Editing Wizard!
Woohoo 🎉 yes‼️ Thank you so much for sharing 🙏🏻👍🏻👍🏻👍🏻👍🏻👍🏻
Great tutorial, Hugh, thank you. Do you know where I can find tips about an ambisonic sound workflow in DaVinci?
I will make one. But right now the workflow is: render out stereo and import into Reaper - I have an old tutorial about that.
Subscribed mate. Checking the series from beginning and I got Interest to learn more about VR. I don't own any 360 camera (Physical Camera) but I do use unreal engine renders and I am using the knowledge gained from you to it. It is working out pretty well. Thanks a lot man. 😇
I am so glad it works. I would love to learn your world, as we use Blender and Unreal to create 360 video too - that world is growing fast as well, with better gameplay capture thanks to Quest 3 and AI with Blender. So much to learn lol!
I've replayed the part at 13:30 at least 25 times to understand what you are doing in the Camera Tracker with the aperture width and height vs the LatLong Patcher output, but you are going too quickly for me. It would be great if you could elaborate on what you are doing and why.
Just need to make sure those 2 numbers are the same - so copy the width number to the height, because Fusion needs a 1:1 pixel ratio to work. Don't worry about the LatLong Patcher - that is just for tracking. So just copy the width number to the height. BTW, if you skip that entire step it will be fine, as the focal length actually does not matter - Fusion will guess it correctly. So just track it, remove questionable tracking points, and keep tracking until you solve with an error under 1.
@@hughhou thanks for the quick reply. In your video it looks to be defaulting back to the original value but I understand it should be square. Much appreciated.
Such a valuable video! Great watch and great info 👏
Glad it was helpful! It would really help us if you could share it on socials. In-depth tutorials like this do not get many clicks and views (very targeted at editors using Resolve only). More views and engagement really help the series and let me make more content like this. There are so many Resolve tricks I would love to share!
Great video. Can you tell me the parameters for Optimised Media and Render Cache on a Windows PC, as there is no ProRes? Also, how do I use the Render Cache to speed up the final render? It is not clear in the video. And one last thing: in the Renderer3D node's render type there is no OpenGL Renderer, only 1. Hardware Renderer, 2. Hardware UV Renderer, and 3. Software Renderer. Which one should I choose, or is there maybe a way to add the OpenGL Renderer? Thanks.
DNxHR HQ is like ProRes HQ. Thank you 🙏🏼
Great tips! Just found out about this video after coming across your other video, thanks so much for sharing!
Thanks for watching!
Nice tutorial!!!
Glad you think so!
Useful video, thanks!
Glad you think so!
Great video! Still, I need help.
When adding the LatLong Patcher, the default perspective often needs to be adjusted to fit the area of interest - for example, a straight 180-degree turn on the Y rotation axis to look backwards.
Now the tracking points are accurate and taken from the right angle of the footage.
When adding a Text3D though, it appears at the original angle, not at the 180-degree turned angle. So I try to fix it by adding 180 to all of the spherical camera's Y rotation keyframes. Is there a better, more reliable way?
How do you track a part of the image that is not the default "front" of the 360 image, or that passes in and out of the default LatLong Patcher frame?
You use PanoMap to pan to the direction you want to track - PanoMap is like reframing. Then add all the nodes in this tutorial after the PanoMap.
@@hughhou Thanks a lot for your answer. Do you keyframe the PanoMap settings before tracking something that moves in and out of the LatLong Patcher's square preview window?
Another great video. Your videos provide so much useful information and are well edited. They are helping me to create better content in 360 and I appreciate that you share so much information and all of the hard work that goes into putting these videos together!!
You are very welcome! More to come as Resolve continues to get updates that make the process easier and easier. Expect a new update next year, in 2024!
Is there a way to combine two true 360 clips together and keep the 360 view? I have 18.5 Studio, non-beta.
Yes, stack one on top of the other and mask straight down the center.
@@hughhou thank you so much. I really appreciate it. I have an insta360 X3 and I'm going to try it tonight
@@jburch5752 did you get your problem resolved?
Hey Hugh! I was leaning towards DaVinci over Adobe. Is there any workflow we are able to carry over to the iPad Pro with DaVinci Studio? When does your DaVinci Resolve 19 video drop? :) Appreciate your content. I really want iPad support, even if a MacBook/PC is my main workstation.
Thanks for your answers, Hugh! :) Could you tell me what settings I have to use in Insta360 Stitcher so my 360 videos play from the hard drive of my Quest 2? They are already transferred, but I cannot see them in the file list or via Meta Quest TV...
I put them in the DCIM folder. 8K 3D photos are working, but neither the 8K 3D nor the 6K 3D videos (both 30fps) are. Do you have any idea? :( Need to try that today because of return periods....
You need to bring it into Premiere - edit and render for Meta Quest - it does not work directly, as the bitrate has to be kept under 60 Mbps for the Quest to play back smoothly. Straight out of camera it is way too high.
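The 60 Mbps figure above can be checked per file with simple arithmetic: average bitrate is just file size in bits divided by duration. A minimal sketch - the 60 Mbps ceiling is the number quoted in this thread, not an official Meta spec:

```python
def avg_bitrate_mbps(size_bytes: int, duration_s: float) -> float:
    """Average bitrate in Mbps (1 Mbps = 1,000,000 bits/s)."""
    return size_bytes * 8 / duration_s / 1_000_000

# e.g. a 90-second clip weighing 1.2 GB straight out of camera:
rate = avg_bitrate_mbps(1_200_000_000, 90)
print(f"{rate:.0f} Mbps")  # ~107 Mbps - over the target, so re-encode first
```

If the result comes out well above your target, re-encode at a fixed lower bitrate before sideloading to the headset.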
Thanks for your recent answer @Hugh Hou ❤. As requested, I'm posting follow-ups to your answers as new comments instead of replies.
Oh okay, thanks! :) Since yesterday I tried some things and rendered in 5.7K 3D 360. 30fps works well, 60fps does not - the bitrate was over 100, I think. But since I can manually adjust the bitrate in Insta360 Stitcher, there would be no need to use AE, right?
In Premiere you can set the render bitrate to anything - like 80 Mbps - don't go over 100. In the export tab, choose H.265 / HEVC.
@@hughhou Okay, thanks! But what about Insta360 Stitcher? Isn't it sufficient to do it there?
you are amazing
Thank you!
Hello Hugh, new subscriber after finding this video. My landlord asked me if I could do a 360 walk-through after he saw all my camera equipment and drones. I do not have any 360/VR experience other than shooting some personal footage with my GoPro MAX. I'm using 18.5 Beta as well, so hopefully I can learn from your channel. Thanks! I always welcome tips and suggestions if you have any for a newbie.
You should totally do it! This one might be a little too advanced for you - start with the beginning tutorials first and welcome! Thank you for the subs!
Hey Hugh, which pole did you use in this video? Looks really sturdy
The 3-meter pole I always use?
@@hughhou I don't know which one. I only have the ones from Insta360.
Hi! Really nice tutorial! But how do I do exactly the same with a 3D stereoscopic camera, to use it with a Meta Quest for example?
How do I match 3D text for both the right and left eyes?
Thanks ;)
I kinda dropped a hint in this new tutorial: ua-cam.com/video/kUXqKtrO7bQ/v-deo.html - but basically, when you track one eye, the other eye just needs to be stereo-matched, as the left and right eyes should move the same no matter what. Resolve has a new update coming, so maybe wait till the new update launches and I can make a more detailed tutorial focused on the 3D STEREO workflow?
Oh yes it would be nice! Please keep us informed!! Cheers 😉
Brilliant, can you use the same method to have a custom video stuck to a wall in VR180?
Oh yes - that is the billboard example. You need to track it very, very well so you don't drift, but yes, that is the way to do exactly that. Make sure the wall has enough features to track - put some green markers on it, like Hollywood does, to help with tracking.
@@hughhou thanks, how would that work out for both lenses? Is it just a case of getting it in position and then copying it to the other side?
H.264 and H.265 should be OK if your GPU supports them, though, right? So as long as it's not 4:2:2?
Well - that is where the slowdown comes from. HEVC (H.265) is a compressed codec, so your NLE needs to decode it before it can read and edit... you introduce extra layers of time and work for your GPU. So even if you have the latest Mac Pro with 6 Afterburner cards - they quote 8 streams of 8K ProRes playback, not H.265 lol. That is why. Those were never editing-friendly formats - they are delivery formats. Not to mention you are losing quality in the pipeline. If you have the storage, ProRes it. Again, this is up to you: it is speed vs storage. If you have to, pick H.264, NOT H.265.
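The speed-vs-storage trade-off above can be put in rough numbers. The bitrates below are illustrative assumptions (an intraframe editing codec like ProRes or DNxHR at high resolution can run on the order of hundreds of Mbps, while H.265 delivery files are often around 100 Mbps), not official figures:

```python
def gb_per_minute(bitrate_mbps: float) -> float:
    """Disk usage per minute of footage at a given average bitrate."""
    return bitrate_mbps * 1_000_000 / 8 * 60 / 1_000_000_000

editing_codec = gb_per_minute(700)   # assumed ProRes/DNxHR-class rate
delivery_codec = gb_per_minute(100)  # assumed H.265 delivery rate
print(f"{editing_codec:.2f} GB/min vs {delivery_codec:.2f} GB/min")
```

Roughly 5 GB per minute versus under 1 GB per minute - which is exactly the "speed vs storage" call the reply describes: you pay in disk space for footage your NLE can scrub without decoding work.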
Amazing, very cool! :) @Hugh Hou ❤ Thank you very much for your answers! Is it normal that it takes 15 hours to render 15 min of 8K 3D from the Pro 2, while it took only 45 min to render 15 min of 6K 2D from the 1RS 1-Inch? 6K → 8K doubles the pixel count; 2D → 3D doubles it again. So 4x the time would make sense, but it's 20x the duration.
The reason seems to be that it does not use my GPU, no matter what I select (hardware acceleration everywhere, H.265 MP4, and so on).
100% CPU load, 95% RAM usage, 4% GPU. I am on Windows... do you have any ideas? :(
Btw: even though I select CUDA in Insta360 Stitcher, it uses not my NVIDIA GPU but 100% CPU (my CPU is 10% below their minimum CPU specs, my GPU is 30% above the minimum GPU specs on their website; 16 GB RAM).
What could I do to render with my GPU so it runs at a normal speed? :(
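The ~4x estimate in the comment above roughly checks out on pure pixel count, which is what makes the observed 20x suspicious. A quick sketch - the frame sizes (5760×2880 for 6K 2D, 7680×3840 per eye for 8K 3D) are assumptions about typical output resolutions, not exact Stitcher dimensions:

```python
six_k_2d = 5760 * 2880        # one equirectangular frame
eight_k_3d = 7680 * 3840 * 2  # two eyes' worth of frames

scale = eight_k_3d / six_k_2d
print(f"{scale:.1f}x the pixels per frame")
# ~3.6x, so a 20x render time points at encoding falling back to
# the CPU rather than at the extra pixels themselves
```

In other words, the pixel math predicts about a 3.6x slowdown; the remaining factor of roughly 5-6x is consistent with the CPU-bound encoding the commenter observes.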
Yes, it is kinda normal. There are so many limitations that I should make a tutorial on this topic alone.
@@hughhou Oh okey...thank you!
Anyone have an idea what is being said at 13:48? What do I put in the aperture width? I understood copying the width to the height, but that leaves me with a square, which he doesn't have.
You have so many effects and tricks in your videos, where do you find the time to do all of that?
I taught myself lol. Also I pump out lots of 360 videos - not just on my channel, but on partner channels. So I get lots of practice opportunities under super stressful environments and deadlines lol.
Do you know if there are any places where you can download sample 3D footage? I'd love to follow along with your tutorials while messing with some 3D footage.
3D footage or VR180 (3D 180) footage? My Patreon has lots of downloads that are free.
thx from Lithuania
What is better for stabilization of Pro 2 footage? Resolve Studio or Mystika VR?
I love you
Awww thank you!!
Do you need the paid version of DR to export 8K video, or is there a free option available? I'm trying to make a 360 video with 360 tour images.
You need the Studio / Paid version to inject 8K and output 8K
Hello, I have a newer version of Resolve, and in the Renderer3D node I don't have OpenGL. If I select "Hardware Render UV", I get the image, but it is stretched and duplicated. If I choose "Hardware Renderer", it doesn't work at all - the image is empty no matter what I do, impossible to see the input image... thanks (at 7:30)
I started again and now it works. I don't know exactly what the issue was - it seems to be a global in/out problem, but I'm not sure
Glad you fixed it. The new version of Resolve still has stability issues. Hopefully they can improve it with the next update. If you can report that bug to them on the forum, that will help all of us out. Cheers!
MAN I wish you didn't go so fast.
But I get it. I slow down the video speed.
but then you sound funny 😂
Oh well. Super good information 👌🏼
THANK YOU SO MUCH
I will try to get more in-depth and slow down next time.
@hughhou Thank you for caring
Does this tutorial work exactly the same for 360 3D? Could you explain what to do differently there and how, please?
I edit 3D video in VR180 and 360 formats in Adobe Premiere Pro. Are there any advanced features there to create 3D text and graphics similar to these DaVinci Resolve ones?
I already made a tutorial on that in my Resolve series - the 17 and 18 workflow did not change.
I don't have ProRes 422 for optimized media or OpenGL for Render Type. Is this Apple specific?
Yes. You can use whatever is available in that drop-down
Hello. I am using the 360 1-Inch.
While using the lens guard, when the light is strong, the shape of the lens guard is occasionally recorded in the shot. Is there a way to remove it?
Awesome! thanks so much. Anything to allow me to get off Adobe is helpful
Happy to help!
@Hugh Hou hey, if anybody knows, it would be you. Taking greenscreen footage, creating an alpha channel and exporting with a transparent background, then playing that footage in DeoVR - is there a best way, or something you have looked into? I've accomplished this, but sometimes the blacks stay transparent, like clothing or darker eyes
@@hughhou I use DaVinci 18 for it, but I'm not sure of the best export settings for playback
Hey Hugh, on my PC I've got (for the Renderer3D node) Hardware Renderer and Hardware Render UV... sorry if it is a newb question, but what is the diff? Cheers Bruh!
Also another newb question (I was on Adobe for years, so everything is new, sorry): if I insert a 2D video into a 360 VR video, how do I rotate the 2D video? Ex: I've made a black title card with the text in the main view, but I want to place the animated client logo behind me (180 degrees). I insert the 2D video, but it is like watching it in a mirror. Cheers... and sorry for the bother.
Use the hardware one for hardware acceleration. I think it depends on your GPU and the settings.
I detailed it in this video: ua-cam.com/video/ZwojozYr68M/v-deo.html - you need to build a 3D space in Fusion and move it in 3D. If you hate Fusion, the Boris FX Continuum VR Unit really helps: bit.ly/45eXwTE - but I still recommend learning the hard way in Fusion. You can preview this in VR faster than in Premiere, and once you build it, the next time is a lot faster and you can do so much more than Premiere ever can. I have more tutorials, tips, and tricks coming. It just takes me a while to make in-depth tutorials and balance my real work as a VR DP with being a full-time tutorial maker lol. The good thing is, they are all free. Also, 18.5 has some new AI features for 360 editing - testing it now.
@@hughhou oh brother I understand and appreciate the hours of work you put in!
Hey, is it possible to fix a HUD on a 360 video? One that will always stay in the bottom right of the headset?
In this video it seems so easy and quick, but the Spherical Stabilizer is super slow for me. I upgraded from a GTX 1060 to an RTX 3080 with 16GB VRAM expecting much faster performance, but it is still really slow.
Do you use Proxy and Smart Render? You can also create a 1080p sequence, do the tracking there, and then bring it back to the original quality for the render to speed things up
@@hughhou thanks for your response, I have not. Doesn't the lower resolution of the proxy impact the stabilization / tracking quality?
oooh so you need Boris FX, even to do the simple 360 rotation 😢
Oh no you don't. For simple 360 rotation you can use PanoMap, shown here: ua-cam.com/video/xlOhluai5mk/v-deo.htmlsi=2KI6OwjAdFaZvWJw. Or you can use the "Transform" OpenFX. When applying this effect to 360VR content, make sure to set the Transform effect's "Advanced Options > Edge Behaviour" setting to "Wrap Around". So there you go - 2 ways to do it, plus the BorisFX one you've got 3 ways.
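The reason the Wrap Around setting works is worth spelling out: in an equirectangular projection, longitude maps linearly to the x axis, so a yaw (horizontal) rotation is just a wrap-around horizontal shift of every row of pixels. A toy sketch of that idea (a frame here is a list of rows of pixel values; the rotation direction is an arbitrary convention):

```python
def yaw_rotate(frame, degrees):
    """Rotate an equirectangular frame around the vertical (yaw) axis.
    Because longitude maps linearly to x, this is just a wrap-around
    horizontal shift of each row - which is why Transform needs
    Edge Behaviour = Wrap Around instead of clamping or mirroring."""
    width = len(frame[0])
    shift = int(degrees / 360.0 * width) % width
    return [row[-shift:] + row[:-shift] for row in frame]

# 4-pixel-wide toy frame: 90 degrees = a quarter-width shift
frame = [[0, 1, 2, 3]]
print(yaw_rotate(frame, 90))  # [[3, 0, 1, 2]]
```

Pitch and roll are not simple shifts (they move pixels between rows), which is why those need PanoMap or a proper 3D setup in Fusion.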
Hi, nice to meet you.
I am in the process of creating 360 video travel content.
By the way, my computer really struggles when editing videos.
Can you recommend the specs of a machine that can edit smoothly?
I'm watching your video, thanks as always for good information~
Yes, I have lots of videos on this channel about it. I suggest the MacBook Pro M2 Max or the upcoming M2 Ultra Mac Studio.
SFX in 360 video must be expensive, considering the time it takes to create and render out. Heavy on the machine.
Yes, that is an unfortunate reality - until we have faster machines
Hi there, can you please help me? I have the R5 C and the dual 5.2mm lens, and I use the Canon VR Utility to import my CRM file and save it as a ProRes 4444 file. I then import this file into DaVinci Resolve to color grade and export it as an MP4 H.265 8192x4096 200Mbps file. For the life of me, I cannot get this file to play back on the Quest 3 using the Meta Quest TV app or Skybox VR. Neither works. What am I doing wrong????
Did you inject metadata using the VR180 Creator Tool? Check this tutorial: ua-cam.com/video/kUXqKtrO7bQ/v-deo.htmlsi=zKzY8Ftlw9CEApaz and look for the download link in the description for my repack
@@hughhou I have used the VR180 tool and injected the metadata, and still no success with playback in Meta Quest TV or Skybox. I have uploaded the content to UA-cam VR and it plays 3D VR180 fine, but the quality is not good because it is UA-cam. Why is the Quest 3 not playing them? They are MP4 H.265 100Mbps files at 8192x4096. I can hear the audio in Meta Quest TV and Skybox, but there is no video. Please help. I have invested a lot of money to create immersive video, it has been a week, and I have watched every video of yours, but still no solution as to why it plays back on UA-cam but not in Meta Quest TV or Skybox. Any ideas?
Hey @hughhou, do you have a simple workflow to keep the animation from title plugins like MotionVFX? I found a way, but it is a pain.
It should not affect MotionVFX, as the workflow just treats them as flat graphics. I usually also just pre-render them - some MotionVFX templates are way too complicated in Fusion. It is best practice to pre-render them, and if you need changes, just decompose it
@hughhou it's usually what I do, but I've had some issues with the transparency not working properly, even with the opacity or alpha turned down.
Hello Hugh, is it possible to add subtitles to a video made with this method?
Absolutely!
@@hughhou could you point me to a tutorial for this please? Searching only turns up other video editing software
Wait. So how do I manage the length of 3D text or video in the Fusion composition? I did everything, but I don't want it on the entire video
There are timeline controls inside Fusion for clip trimming and fade in/out.
@@hughhou yeah, super helpful, thanks! Just made me realize I need to do a Fusion crash course lol 😂 Also, I can't find it - do you have a video on export settings in Resolve?
I am trying to install the Spatial Media Metadata Injector on my MacBook Pro 2015, but my machine is not supporting the software. Is there any option for installing it? Thank you in advance
Yes. I think you need to modify the Python code a little bit - let me dig out the guide somewhere :)
@@hughhou how can i know?
@@iamEmrulHasan did you get your spatial media injector working?
@@Nautisphere not yet 😪
@@iamEmrulHasan Check out the Davinci Resolve video on my channel. There's a chapter marker for installing it. Give it a shot. Hopefully that will help/work.
Do these techniques work with 180 degree video?
Yes!!
Heeeeelppp!!! "OpenVR: Unable to init VR runtime: Hmd Not Found (108)". I click on preview and DaVinci shows me this error. Solution?
Did you connect your HMD? If so, is your HMD connected via Oculus Link on your PC with the Oculus app open? If that doesn't help, install SteamVR first and make sure you can see your headset there. Check your USB-C cable
Does this only work on Apple computers? I have a PC with an i6, 64GB of memory, and an AMD 8GB video card running Windows 10. I haven't been able to get it to generate a true 360 MP4 file. In Fusion I can move the second window in true 360, but the Edit page isn't in 360 - it never changes from when I dropped the video on the timeline. Thanks in advance for any help.
It works on PC and actually better. But AMD graphics card might be an issue.
Still can't get it to save in the immersive 360 format. I have Windows 10, 64GB, a new 12GB Nvidia graphics card, and a Ryzen 6 processor. The clip renders out as a reframed video and not in true 360. I use Insta360 Studio in 360 output mode to MP4 before going to DaVinci. Windows DaVinci Studio doesn't have a ProRes render mode, so I render as MP4. I figure if it can't render a 5-sec unedited clip, then it's not going to do any of the great things you are demonstrating.
Any suggestions? Or should I just be satisfied with the basic Insta360 Studio clips?
@@hughhou I did get it to work. thx.
Sometimes you have to use a metadata injector
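A common injector for 360 files is Google's open-source spatial-media tool, which adds the spherical metadata players look for. A minimal sketch of invoking it from Python; the file paths are hypothetical, and it assumes you run it from a clone of the spatial-media repo with Python available:

```python
import subprocess

def inject_cmd(src, dst, stereo="none"):
    """Build the command line for Google's spatial-media injector
    (github.com/google/spatial-media). -i injects 360/spherical
    metadata; --stereo is top-bottom or left-right for 3D files,
    or none for 2D 360. Paths here are placeholder assumptions."""
    return ["python", "spatialmedia", "-i", f"--stereo={stereo}", src, dst]

cmd = inject_cmd("render.mp4", "render_injected.mp4", stereo="top-bottom")
# subprocess.run(cmd, check=True)  # uncomment inside the spatial-media repo
print(" ".join(cmd))
```

The injector writes a new file rather than modifying the original, so upload the injected copy - that is the one YouTube and headset players will recognize as 360.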
Thanks, well done! Amazing work like always :)
If you have time, can you tell us how to integrate a picture into 180 VR 3D? ;)
Sure. It is here: ua-cam.com/video/ZwojozYr68M/v-deo.htmlsi=_dnIEWE9NRia8Ujw
Are you still using KartaVR plugin?
Yes! For reframing. For heavy VFX I just build my own lol
Anyone know why my text moves? I can't get it to stay in place
Did you track it correctly? Make sure your tracker is solid first. You might need to delete tracker points that are moving or bad - that usually helps! Less is more with tracking points, just FYI. Be selective!
@@hughhou First of all - the OP replying to a help comment on a year-old video?
Massive thanks! I'm starting over with a slightly longer cut as well. I'll make sure to really pay attention to trackers! Thanks for your response, Hugh!
For people who may have had the same problem:
My pickup truck is in the shot - camera mounted up top, which means moving trackers. I removed the LatLong, drew a polygon mask and selected everything that was moving (my pickup), then added the LatLong back, then the camera tracker, connected the poly to the top of the tracker, and tracked.
You must have a lot of files to store - what storage systems do you use?
When I add those Fusion nodes for inserting 2D video, it increases the render time from 30 minutes to 1 day 10 hours. 😳😳😳 And this is only 1 minute of my 360 video in 4K.
That's insane...
Wait, do you have a GPU in your setup? Resolve needs a GPU
@@hughhou I DO have a GPU: a GeForce 3080
TOOOOOOO many distractions with your funny things, it is too much.
Lol I will tone it down!
I don't think it's too much, but then I'm used to watching UA-camrs who are much more animated
In the "dumb question" category... I need dumb help @hughhou. I'm editing a 360 clip in Fusion and using a title template from MotionVFX. But when I apply my title to my Fusion clip (which is on a timeline with other clips), my title keeps appearing until the end of my timeline. Any idea what I'm doing wrong? Cheers!
Yes, this video should explain it. You need to project it correctly. The easiest fix is to make your graphics smaller - the professional fix is actually in this tutorial using the VR Unit.
Is this the free version of resolve?
I have the Studio version, but the free version will cover about 50% of this - just follow along. The biggest thing is the resolution limitation in the free version
holy shit, no wonder no one wants to make 360 videos mainstream. Loads of extra crap just to make it all work
It’s getting better tho. I will make an updated tutorial soon.
But how do you export it to YouTube in 180 or 360 VR?
Here I can do everything, but how do I export this to 360 afterwards? In Adobe After Effects or Adobe Premiere Pro, no problem.
Oh good point I forgot to cover that in this video. Let me do a dedicated export tutorial - hold tight!
@@hughhou Thank you. It's best if you show the end result on UA-cam in 360 video. There are 2 versions of the same waterfall video on my channel - one is unprocessed and the other uses the plugins. I'm curious if DaVinci will be better.
Hi Hugh, what about if I want to place a title flat and not wrapped in 360 on a 360 video? For example, "Clean and Simple Lower Third" - when I put it in a 360 video it is curved, and I don't want that. Same problem with a logo. Thank you... You are THE BEST!! I got what I was looking for in your video: ua-cam.com/video/8u7GY436_Jw/v-deo.html
full gimmick
Dear Hugo, thank you for the knowledge you have given me🔥!
Greetings from snowy Russia...❄
Evgeniy. @lookaroundblog