Thank you! You're the best!
Glad it helped!
Does Premiere support stereoscopic workflows from independent cameras that have been calibrated with a rig? For example, I shot a stereo project on a beam splitter, so I have one video file for the left eye and one for the right. Will I have to manually crop and stack these files on my timeline to get a stereo timeline, or does Adobe offer native support for this?
Hello! Let's see if you can answer a question for me... how can I incorporate stereoscopic photography into the video? What format do I use in Premiere so that it is recognized as a stereo photo?
For 360, use top-and-bottom. Try half and full height to see which one looks right.
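In case it helps to see the format concretely, here is a minimal sketch (an assumption on my part, not a step from the video) of stacking separate left- and right-eye files - like the beam-splitter clips asked about above - into a single top-and-bottom frame with ffmpeg. It assumes ffmpeg is on your PATH and that both clips match in resolution and frame rate; the file names are placeholders.

```python
# Minimal sketch: stack separate left/right eye files into an over-under
# (top-and-bottom) frame. Left eye goes on top, which is the common convention.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "left_eye.mp4",                      # placeholder: left-eye clip
    "-i", "right_eye.mp4",                     # placeholder: right-eye clip
    "-filter_complex", "[0:v][1:v]vstack=inputs=2[v]",
    "-map", "[v]",
    "-map", "0:a?",                            # keep audio from the first clip if present
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "top_bottom.mp4",
], check=True)
```

"Half" vs. "full" just refers to whether each eye keeps its full vertical resolution or is squeezed into half the frame height - the stacking itself is the same either way.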
Thank you very much Hugh for your answer, but I don't think I formulated the question correctly... I'll try again: within a stereoscopic equirectangular video, how can I place a stereoscopic image (left eye and right eye)? It would be a floating element within the video, in the form of a stereoscopic photo.
@hughhou Thanks, and sorry I didn't explain it well the first time.
Your videos have been really helpful - thanks! I would really love to see a tutorial on your proxy workflow next.
Yes! I should make that next.
Is it possible to do just the logo part with the 3D effect inside Photoshop?
you rule.
Dear Hugh,
When removing a tripod from a stereoscopic scene in Photoshop, can't you just copy and paste the newly created mono layer twice into the stereoscopic image instead of brushing away the tripod twice?
Many thanks for your help!
It depends on the camera rig - most rigs have no stereo at the bottom - but if the shadow is too long, you still need to manually paint it out and make sure it looks right in stereo.
I am wondering if it is possible to take monoscopic 360 video (shot with the Ricoh Theta V) and turn it into stereoscopic?
Yes, but I don't know how. Post-converting to stereo is SUPER expensive, so I wouldn't recommend it.
Can you make a quick tutorial on how to convert Insta360 Pro 360 footage into 180? They've still not implemented a 180 3D mode, and I'm looking for ways to take a 180 slice from the 360 footage.
Can you just use a mask to cover the other 180 degrees of the footage? That is the easiest solution.
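If you would rather do the slice outside Premiere, here is a rough sketch (my own assumption, not a tested Insta360 workflow) using ffmpeg's crop filter: an equirectangular frame spans 360x180 degrees, so keeping the central half of the width gives you exactly the front 180 degrees, and because a top-and-bottom stereo file stacks the eyes vertically, the same crop trims both eyes at once. File names are placeholders.

```python
# Rough sketch: crop the front-facing 180-degree slice out of an
# equirectangular 360 clip (works on mono or top-bottom stereo frames).
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "insta360pro_360.mp4",               # placeholder: stitched 360 clip
    "-vf", "crop=w=iw/2:h=ih:x=iw/4:y=0",      # keep the central 50% of the width
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "front_180.mp4",
], check=True)
```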
Your videos have been really helpful - thanks! I am having trouble exporting 8K stereoscopic .mp4 video. Can you please make a video on proper export settings for both Premiere Pro and After Effects?
Sure. But if you need it now: I use QuickTime > DNxHR HQ (on a PC; ProRes on a Mac) > Match Source > check VR Video and pick Stereoscopic > check Use Previews to increase render speed with smart rendering (ua-cam.com/video/PZHaBCaeMPY/v-deo.htmlm54s). That's it. Then upload the file to UA-cam directly, or use my Horizon workflow for a smaller upload file (ua-cam.com/video/8qz8WFfZTKM/v-deo.html).
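If you ever need a similar intermediate outside Premiere, this is a rough ffmpeg equivalent (an assumption on my part, not the exact settings from the video): ProRes 422 HQ, keeping the source resolution and frame rate. File names are placeholders, and you would still need to flag the file as stereoscopic VR before uploading.

```python
# Rough ProRes HQ transcode sketch; profile 3 = ProRes 422 HQ.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "timeline_export.mov",               # placeholder: your exported master
    "-c:v", "prores_ks", "-profile:v", "3",
    "-c:a", "pcm_s16le",
    "stereo_prores_hq.mov",
], check=True)
```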
Thanks Hugh, love your work.
Thanks man, that was very helpful for me. life saver
Glad it helps!
Glad it helps!
@hughhou I was on a project looking for exactly the same thing, and you explained even more. Yours is the only channel I check for any VR questions, and they always get answered. Thanks bro, you're the best.
Great tutorial, thanks! I just have the problem that the reimported stills are less sharp than the original video, no matter what format I try. The resolution is the same... any idea what could be wrong?
Thanks for the great tutorial! I used Mistika VR to stitch our 360 3D footage. When we export the video with their "inject spatial media metadata" option checked and import it into Adobe Premiere Pro CC 2018 to edit in stereoscopic 360 mode, I'm not able to view the video with the Toggle VR Display in Premiere. Any idea why? Or any suggestions for Mistika or Premiere settings to play the video in 360 mode?
Did you set the sequence to stereoscopic top-and-bottom? Go to your sequence settings and set it - sometimes Premiere does not set it correctly for you.
Awesome!
I need to remove lines from my sphere, but they go past the frame... how would I do this? Tripod removal is in a small area, but my lines go all around the sphere. Holding down the space bar to bring up the hand tool won't let me move the canvas.
What line? Stitch line?
CreatorUp! I should have said multiple objects including chalk lines on the floor and curtains in the background. :-)
Shouldn't the same nadir patch be used for both the left and right eye since the nadir should ideally be monoscopic?
I have similar thoughts. I tried patching each eye separately, which leads to some weird effects - obviously because the patches don't line up when they are done independently. A good way around that would be using the same patch on both left and right, as @David Lawrence pointed out. This won't give you stereoscopic depth at the patch, but it sure avoids those weird artifacts around the patchwork.
P.S. The patch sometimes stands out - is there an easy way to blend it in gradually, like feathering out a mask in After Effects?
Well, sometimes the front end of your nadir can begin to become stereoscopic as it moves toward the horizon, so it really depends on your footage. Are your poles monoscopic, and if so, how far out from the bottom of your shot can you get before things start getting more stereo?
Yes, in theory and in some situations. But as you can see, I need to get rid of not only the sandbag but also the shadow, and the shadow has disparity. There are also lighting considerations - when I reuse the left-eye patch, I can clearly see the cutoff and the unmatched shading. You can probably use After Effects to blur out the edges and mix it back in, but I found it much easier to understand if you just handle the left and right eye together and merge them back. And what if you are removing an object rather than a tripod? That is why I am showing the hard way. You are 100% right though, the ground should be monoscopic. But most of the cameras (Obsidian or Insta360 Pro) do not stitch the nadir well in 360 yet - if you watch my Mistika 3D stitch video, you'll understand the issue.
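For anyone who does want to try the shared-patch route, here is a hypothetical sketch (not my Photoshop workflow, just an illustration) of pasting one nadir patch into both eyes of a top-and-bottom frame with a feathered edge so the cutoff is less visible. The file names and coordinates are made up for the example.

```python
# Hypothetical sketch: blend the same nadir patch into both eyes of a
# top-bottom equirectangular frame with a feathered (blurred) alpha mask.
import cv2
import numpy as np

frame = cv2.imread("tb_frame.png")            # placeholder: top-bottom stereo frame
patch = cv2.imread("nadir_patch.png")         # placeholder: cleaned-up ground patch
ph, pw = patch.shape[:2]

# Feathered alpha mask: solid in the middle, soft toward the edges.
mask = np.zeros((ph, pw), np.float32)
cv2.rectangle(mask, (20, 20), (pw - 20, ph - 20), 1.0, thickness=-1)
mask = cv2.GaussianBlur(mask, (41, 41), 0)
alpha = mask[..., None]                        # broadcast over the color channels

eye_h = frame.shape[0] // 2                    # height of one eye
x, y = 1800, eye_h - ph                        # example position: bottom (nadir) of each eye

for eye_top in (0, eye_h):                     # 0 = left eye (top), eye_h = right eye (bottom)
    roi = frame[eye_top + y : eye_top + y + ph, x : x + pw].astype(np.float32)
    blended = alpha * patch.astype(np.float32) + (1 - alpha) * roi
    frame[eye_top + y : eye_top + y + ph, x : x + pw] = blended.astype(np.uint8)

cv2.imwrite("tb_frame_patched.png", frame)
```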
Vivek - that is why you use the Premiere 360 view in anaglyph mode: to make sure that when you patch the ground back in there is NO visual disparity, at least at the nadir. As you can see, the shadow extends far beyond the nadir and almost reaches the case in this example. If the case had strong disparity - let's assume there are objects / patterns there - the patch should have disparity as well. And as Adrenaline Films said, it is really case by case. For me, this works well most of the time, but sometimes I need to do it in After Effects. So it really depends on your nadir. I am making a follow-up tutorial, as you can see in my last 3 minutes. It's great that you guys are giving me this really good feedback, so I can cover those situations and we can figure out the best solution together. Sadly, this is still not a one-size-fits-all solution.
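If you want to sanity-check disparity outside Premiere, a quick red/cyan anaglyph of a frame grab works too - here is a tiny sketch (my own shortcut, not part of the tutorial). If the patched ground shows red/cyan fringing, it still has disparity. The file name is a placeholder.

```python
# Build a red/cyan anaglyph from a top-bottom frame grab to eyeball disparity.
import cv2

frame = cv2.imread("tb_frame.png")             # placeholder: top-bottom stereo frame
eye_h = frame.shape[0] // 2
left, right = frame[:eye_h], frame[eye_h:]

anaglyph = right.copy()                        # OpenCV is BGR: keep blue/green from the right eye
anaglyph[:, :, 2] = left[:, :, 2]              # take the red channel from the left eye
cv2.imwrite("anaglyph_check.png", anaglyph)
```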
Yes. I will work on it next.
Hi, really nice video showing us how to work with 3D 360!
But now I'm trying to find the next video, about using Mocha VR, and I can't find it - any link here?
Thanks!
BTW, great channel!
It is here: ua-cam.com/video/5jpzS7JcxrI/v-deo.html - I should do a better job in organizing my content lol.
Thanks for the quick answer!
But is this the same one where you said you were going to teach us how to fix the floor with changing lights and colors?
I have not created that one yet. But it is the same technique - instead of the sky, you deal with the floor. You paint out the tripod, and since you use the surrounding pixels, it will adapt the lights and colors to blend everything in.
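Just to illustrate the idea of filling from surrounding pixels (this is OpenCV inpainting, not the Photoshop content-aware tools from the video), a tiny sketch - the file names are placeholders, and the mask is white wherever the tripod or shadow is:

```python
# Fill the masked tripod area from the surrounding pixels so lighting and
# color blend in; a rough stand-in for a content-aware fill.
import cv2

frame = cv2.imread("equirect_still.png")                      # placeholder still
mask = cv2.imread("tripod_mask.png", cv2.IMREAD_GRAYSCALE)    # placeholder mask, white = remove

cleaned = cv2.inpaint(frame, mask, inpaintRadius=7, flags=cv2.INPAINT_TELEA)
cv2.imwrite("equirect_clean.png", cleaned)
```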
CreatorUp! OK, thanks again, I can take it from there!
Hey, yesterday I was trying to do the regular tripod removal and I was going nuts during the step where you have to merge down and see the change in the spherical map. I couldn't get it to work until I thought of changing the color of the new layer like you did, and suddenly it worked. Do you know if that is mandatory, or is it just a bug that it only worked after changing the layer's color and name?
Thx again!
I shot some footage in monoscopic. I want to play it back in my gear vr or on my google cardboard. How do I convert it to stereoscopic? Can I do that in premiere?
You cannot convert mono to stereo - you can only go the other way around and convert stereo to mono.
I discovered through the google cardboard viewer on my phone I can get my monoscopic 360 video to play in stereo, but will my oculus do the same thing? My 360 seems to only record in monoscopic. I don't mind editing in mono, I just need to play it back in my headset.
If you output top and bottom, it will play back in stereo in any HMD....
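Concretely, "output top and bottom" for a mono clip just means duplicating the same equirectangular image into the top (left eye) and bottom (right eye) halves - here is a rough ffmpeg sketch of that (file names are placeholders). There is no real depth, but headsets will play it back as a stereo file.

```python
# Duplicate a mono 360 clip into a top-bottom layout so HMDs treat it as stereo.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "mono_360.mp4",                      # placeholder: monoscopic 360 clip
    "-filter_complex", "[0:v]split=2[a][b];[a][b]vstack=inputs=2[v]",
    "-map", "[v]", "-map", "0:a?",
    "-c:v", "libx264", "-crf", "18",
    "-c:a", "copy",
    "fake_tb_360.mp4",
], check=True)
```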
Hey Hugh!
Love your enthusiasm and knowledge! Thanks so much for the vids. I've been playing around with the Odyssey since Google lent it to me for the Jump Start program. Since this is the first time I've played with stereo 360, your tutorial here is much appreciated. You mentioned leaving a comment if we're interested in a proxy workflow tutorial, and I would be very interested. My computer has an i7 4 core CPU, GTX1070 GPU and 32GB RAM so it's kind of intermediate in the scale of heavy workstations and learning how to work well with proxies would be an asset to me. I'm particularly curious about what happens to performance when using "replace with AE composition", if I need to send my clip to AE for some deeper post/FX. Thanks again for your tutorials!
Boris
Congrats on the Jump Start program! Is it the one where they give you $40K to make whatever you want? I use the Odyssey as well at UA-cam Space LA, as it is a standard there. I will make a proxy workflow tutorial, but it won't be for a while as I am behind schedule - I will let you know as soon as I do. I wouldn't suggest going to AE, as you only have 32GB of RAM and AE relies heavily on your CPU and RAM. Try to do everything in Premiere, as the 2018 version supports this better so far.
Thanks, and I wish! Haha, it would have been nice to get some financial support for what we filmed. But no, it's where Google loans the Odyssey to content creators for free for several months (still sweet) to generate potential marketing & exposure. If you don't own the Odyssey and want to spend more time with it / get exposure for your UA-cam channel, I suggest you apply for the next round, which opens in February! vr.google.com/jump/start/
So far I've been very happy with the program because it's given me leverage to approach various people and organizations and pitch filming them in 360, since I'm fairly new to the 360 business.
Hmm, interesting, that makes sense too, but Mettle has some good plugins for 3D in Ae - spatial placement of 2D or 3D text, for example. Is there an alternative in Pr? I see that I can set my disparity using Project 2D (I believe it has a different name in the new VR plugins), but without a headset to live preview (I only own a Gear VR), I have no way to tell how near or far my text or graphic will appear.
PS. I haven't upgraded to 2018 yet, still running on 2017.1.2. I have the Mettle Skybox suite so apart from the new rebranding of those plugins for 2018, would you recommend the upgrade? I hear it's buggy and may be worth waiting for an update.
Cheers!
Thanks for the tips. Yes, I suggest upgrading to 2018. Watch this tutorial: ua-cam.com/video/iroPydRv378/v-deo.html - the new native immersive tools in Premiere are more stable and have fewer bugs.
Can you use the HTC vive with this technique or just the Oculus Rift?
The HTC Vive actually works better, especially the Pro.
any advice on making a mask on stereoscopic 180?
thx!
Right now, eyeball it in After Effects - or track it in Mocha. No magic here until plugin developers see more demand :(
Why are the two eyes stacked one on top of the other? I thought they should be side by side.
Stereo 3D 360 is top-and-bottom - all HMDs and UA-cam read this format as standard. For example, this video: ua-cam.com/video/MOHJUHsO93c/v-deo.html is 3D top-and-bottom.
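For players to read the file that way, the stereo/spherical metadata has to be present - Premiere's "VR Video" export option handles it, and outside Premiere people often use Google's spatial-media injector (github.com/google/spatial-media). The flags below are from memory, so treat them as an assumption and check the tool's README; file names are placeholders.

```python
# Inject top-bottom stereo + spherical metadata with Google's spatial-media
# tool (run from a checkout of the repo).
import subprocess

subprocess.run([
    "python", "spatialmedia",
    "-i",                                      # inject metadata
    "--stereo=top-bottom",
    "top_bottom_360.mp4",
    "top_bottom_360_injected.mp4",
], check=True)
```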
Pleeease, can you show me your proxy workflow for stereoscopic 360 footage?
I am using the VR 2048x2048 Stereoscopic DNxHR Stereo Audio preset provided by Adobe to encode proxies for my VR footage in an Adobe Team Project, but Adobe Media Encoder doesn't get past 3 seconds of work before progress ceases and the 'remaining' time counter endlessly tallies seconds.
Really? That is exactly what I use as well!!
Hey Hugh,
Thanks for the response.
I'm really not sure what's going wrong. I'm trying to write the proxies for the source videos to an external SSD to reduce the heavy lifting, and also so I can share them (the proxies) with a coworker on Creative Cloud in our team project.
Every time I generate proxy files, Media Encoder encodes for between 3 and 12 seconds and then stops showing any signs of progress.
I should note that I'm on a Windows machine (but I have enough resources to let Adobe do its thing - it's not as if my PC is getting choked).
Thank you ;)
Is there an update for "3DVR", or just VR?
What do you mean?
My name is Oscar GBETIN, director and promoter of the audiovisual agency 'DAAGBO Records Production', graduated from the Higher Institute of Audiovisual Professions (ISMA).
Dear Sir, I need your material help to develop cinema in Africa - a very rich sector thanks to our cultural diversity, but in no way exploited - so you would be contributing to the development of Africa and the fight against poverty.
Facebook: Oscar GBETIN
WhatsApp: +229 95399904
So what do you need from me again?
I need your material help for the audiovisual training center, and also your expertise in the field of video editing.
Facebook: Oscar GBETIN
WhatsApp: +229 95399904
Facebook page: daagbo records
Contact us at info@creatorup.com ( creatorup.com/ ) - we work with UA-cam / Google to provide global training in audio and video. Shoot me your request there and we will get back to you asap. Thanks!