My boy doesn’t get enough credit. The amount of time and detail he puts into these videos on top of his job must take up most of his time. Thanks for all the hard work.
Haha thanks! I hope more people can see this so the whole industry can make better content together, and camera makers can get their acts together and make better cameras for us. Yes, just landed a couple of gigs - might slow down the UA-cam uploads and make some money to fund my next camera purchase so I can continue to do reviews lol.
CreatorUp You are just so far ahead of the curve right now; once VR really becomes mainstream, people will be looking back at all your videos. Your channel is really like an advanced course on VR. Keep up the great work - we will continue to enjoy and appreciate all the gems you share with us. Wishing you continued success!
one of the most educational channels. kudos Hugh.
Thank you!
Hi, Hugh. Thanks for posting this workflow. However, I have one question or observation I’d like your input on. Much like the way the 2:1 equirectangular panorama becomes more and more warped the further you go up and down the frame, the compression noise also gets warped along with it. So, theoretically, it's almost impossible to pick a spot on your footage that accurately reflects the noise profile across the entire panorama. In your opinion, shouldn’t denoising be applied BEFORE stitching, when the noise is more uniform across all of the lenses? It stands to reason that Neat Video should do a better job of removing the noise uniformly if you create a noise profile preset for each low-light scene and apply it to all the lenses (Insta360 Pro in my case) than when the noise gets warped post-stitching. Eager to hear your thoughts on this.
Before stitching is the best! For the Pro 2 I did that. I also use a generic noise profile instead of picking a spot.
Hugh, thank you again so much for this. I purchased Neat Video 5 for Premiere Pro based on this tutorial. I was wondering why my GoPro MAX footage was coming out ugly! Pro tip for anyone having issues with Neat Video 5 being painfully laggy and slow: be sure to apply this effect first, before ANYTHING else. I couldn't figure out why my relatively fast PC kept crashing; then I removed all other attributes (I had VR projection on). Once I did this, it ran MUCH better. The only thing that sucks is having to render after every edit, but again, thank you for this amazing workflow. I also got your new 360 LUT pack - paid only $5 as a donation, and I feel like you deserve so much more! Any updates on a masterclass date?
Thank you! Still working on the masterclass. I always get pulled away by actual production. Working on a new VR music video with some mega artists for release in the next week or so. But I should be back on track pretty soon. Another tip for using Neat with the GoPro MAX is to move your clip to After Effects. In After Effects, Neat runs a lot faster with a pre-comp (you can nest in PR too, but it's not as good). So I usually reframe in Premiere - render to AE - Neat + sharpening + color grading - then back to PR for the final render. Effects passes are usually faster in AE. And you don't need to learn AE - it is just a host for your third-party effects.
@@hughhou Glad to see you staying so busy! Interesting tip - I'll try that with AE for my next video. Hoping to see you do a video on the new 14.2 Premiere Pro update with hardware encoding. Thanks for the response; I'll be checking back for the masterclass update.
This is really great! Thanks AGAIN!
3 years later... are you still using Neat with DaVinci or are the studio tools sufficient on their own?
Both! I mostly still use Neat Video as it gives me very consistent results on 360 video.
Great tutorial Hugh! Before I try: is it possible to use Neat Video on 360 videos too (i.e., avoiding issues/artifacts on the LatLong "stitch" line in the viewer)?
Yes, it is okay for 360 video - it won't create a stitch line!
Hey Hugh, I have learned a lot from your videos and they have been exceptionally helpful in my projects! I just have an issue with Neat Video 5: after adding the plugin to the exported ProRes footage from the Pro 2, a little sliver of the video is missing around the corners and at the top and bottom. You mentioned that having "Sharpen around midpoint" on can cause this, but I have double-checked and it is off. Any tips or advice on what is causing the issue?
I actually never ran into this issue. You should contact Neat Video support - they should be able to help you out.
Amazing yet again!! 360 master Hugh 👏🏻
Thanks for the tutorial! At 05:41 you mention that with the Insta360 ONE X's stitching compression phase you are out of luck, but doesn't the Insta360 app also come with a Premiere plugin that imports the footage into Premiere already stitched, so you skip the stitching compression?
Yes, the no-stitch workflow - but it has soooooo many bugs that I generally do not recommend it. They also phased it out on the ONE R in favor of a direct ProRes render now.
@@hughhou Thanks! Can you point me to the bugs I should be aware of - maybe an article or a video?
I am trying out the demo version of neat video in premiere pro to edit my 360 degree video to make sure it works before I buy it. When I am finished and export the video from premiere and then view it on my computer, the video only plays for a second and then freezes while the audio continues. I do not have a problem playing other videos that have not had neat video applied to them. I am wondering if you have come across this problem before? Thanks!
I don't think the demo version can even process anything above 4K. That could be the issue. Are you using Premiere, After Effects, or Resolve?
CreatorUp thanks for the reply! I am using Premiere Pro with 4K footage. It turned out that the problem was that I didn’t have enough RAM to render properly. The problem disappeared once I upgraded my RAM. I am now using Neat Video with no problems. Thanks for the great video!
Thanks for speaking slow and clearly :)
05:35 For the Insta360 ONE X, if you open the .insv files directly in Premiere Pro and start denoising and editing from there, will the video quality be preserved, since it was not first encoded to H.264 via Insta360 Studio?
Awww thanks! I try! My accent is pretty strong. I am also looking for community help translating my videos into different languages - if you know anyone interested, I will credit them in the description!
For your question at 05:35 - thank you for pointing this out. This is a great idea if Premiere can handle the .insv file. I have not tried it yet as I mostly use Mistika VR. But this could be THE solution. Thank you for pointing out such a great tip!! I will try it myself and report back!
Awesome tutorial CreatorUp. Thanks a bunch!
Glad it was helpful!
I decided to try the ProRes 422 codec. Do you upload the .mov to UA-cam or convert it? (The data difference is crazy: for a 30-second video, 560MB (H.265) vs 9GB (ProRes 422).) I need to step up from hobbyist to pro, and I can't afford to ignore your advice anymore :)
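For context, the file sizes quoted above work out to very different data rates. A quick back-of-the-envelope sketch (the 30-second duration and sizes are taken from the comment; the arithmetic is mine):

```shell
# Approximate data rates implied by the numbers above (30 s clip):
# 560 MB of H.265 vs 9 GB (~9000 MB) of ProRes 422.
h265_mbps=$(( 560 * 8 / 30 ))      # megabits per second, integer math
prores_mbps=$(( 9000 * 8 / 30 ))
echo "H.265: ~${h265_mbps} Mb/s, ProRes 422: ~${prores_mbps} Mb/s"
# prints: H.265: ~149 Mb/s, ProRes 422: ~2400 Mb/s
```

Roughly a 16x difference, which is why ProRes makes sense as an intermediate but gets impractical to upload unless the clip is short.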
So UA-cam has an upload file-size limit of 125G or so - so if your video is under 2 minutes, ProRes will work.
How to avoid a stitching line appearing at the edge of the image in 360 stereo after Neat?
It does not appear if you use Neat. Nothing else you need to do.
Your videos are top notch! I can't tell you how many questions you've answered for me as a newish VR videographer. Well done.
Glad it was helpful!
Hello Hugh, thank you very much for the super content. Is it possible that Neat needs about 8 hours of rendering for an 8K stereo video with 1 minute of content? It is so crazy - I have a good machine but I am stuck in the render process.
Did you enable GPU + CPU render?
@@hughhou Yes I have, and in the meantime I tested an alternative workflow: I did all the settings and edits on the footage first, then the Neat pass and sharpening in Premiere, and the render time was good - about 15-20 min.
Great video buddy! Does it work on insv if you use the no-stitch workflow?
Still testing on it - it should be :) Thx Mic!
Thanks Hugh, useful tutorial! The fact that you explain the workflow for both mono and stereoscopic videos is great! I can't wait for the tutorial about flickering lights :)
Glad you like it! Will release it when I get back from my Hawaii VR shoot.
What are the best render settings in Mistika VR for 360 3D 4K or 6K footage on Windows (Insta360 Pro 1)? Struggling with that - Premiere will not open my HEVC files, but I want uncompressed footage.
First, check my Mistika Series: ua-cam.com/video/rSQUUHsDePY/v-deo.html
I use ProRes 422 or PNG sequence.
Thanks for putting in time on these - your channel is a great resource for me!
I am glad it helps!
11:11 When using davinci resolve for noise reduction, do you know how one goes about sharpening from there? It doesn't seem to have the same options as premiere pro for VR sharpening :(
Yes! I mentioned it in the Boris FX VR Unit video - ua-cam.com/video/55og9I8WO4k/v-deo.html - it is a cross-host application and works with DaVinci Resolve and Final Cut Pro X. Most of my colorist friends use DR with the Boris FX VR Unit - and I believe it comes free with Mocha VR, which is also essential for tracking and selective color grading of VR footage in DR, as DR's native tracking can't handle VR footage that well. I know I just threw lots of ideas at you, but I will try to cover them in a future episode with my colorist friends!
CreatorUp oooo great! Sorry I missed that 🤭
Going to give Boris a try! Thanks again :)
Great video Hugh. Another technique for denoising AND sharpening, is to run both processes on the raw camera capture video files. Then it doesn't matter if the sharpening tools are VR/equirectangular aware. However, it does require running the identical denoise/sharpen settings on all six camera files (if you're using a 6 lens camera). And you have to render the denoise/sharpen on those six camera files, which takes time.
Yes absolutely! This is actually the recommended workflow. But we would need Mistika VR to stitch them, which loses the stabilization info. And most people here do not own Mistika or AVP.
Agreed, most people rely on the standard freebie stitching tools.
Thanks Hugh. So many useful little tips - the red GPU render line in Premiere had been driving me nuts so that was an amazingly useful tip.
Haha glad it helps!
So if you are using mistika is there a different step before applying neat 5?
Yes. I should make a follow-up tutorial. But basically, you denoise the individual clips first, before stitching. Render them out as ProRes or DPX and then stitch with Mistika. For moving shots, you will need Mocha to stabilize in post.
Hi Hugh -- I love your tutorials. Very informative! I have a question about workflow. You have talked about re-encoding the least number of times to preserve the highest video quality. What is the best workflow to pull footage from the Insta360 EVO in 3D, fix it, then get it into a headset like the Quest? Is it to import into Premiere, then use their VR effects and AfterCodecs (which you talk about in another video) to re-encode it as H.265? Then take the H.265 and run it through Neat Video to remove flicker/artifacts? Then re-encode it as H.264 (or H.265, if Neat can do that) so it can be played on the Oculus Quest? That seems to be a lot of encoding/re-encoding. What is your opinion?
Thanks and keep up the great work!!
Oh no - H.265 and H.264 are delivery codecs; you never use them as an intermediate render. I would use ProRes 422 - still compressed, but it does not lose quality from the Evo footage, which is only 8-bit. So the workflow is: Evo to Premiere - render with AfterCodecs, or just native QuickTime from Premiere, as ProRes 422 - do your denoise/deflicker on the ProRes 422 and then edit. When done, do the final render back to H.265.
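As a sketch of that intermediate step outside of Premiere, a ProRes 422 round trip could also be done with ffmpeg. This is my illustration, not a command from the video; the file names are placeholders, and `prores_ks` with `-profile:v 2` selects ProRes 422:

```shell
# Camera clip -> ProRes 422 intermediate (placeholder file names).
# prores_ks -profile:v 2 = ProRes 422; PCM audio keeps the intermediate uncompressed.
ffmpeg -i evo_clip.mov -c:v prores_ks -profile:v 2 -c:a pcm_s16le intermediate_422.mov

# After denoise/deflicker/edit, final delivery render back to H.265:
ffmpeg -i graded_master.mov -c:v libx265 -crf 18 -tag:v hvc1 -c:a aac final_h265.mp4
```

The `-tag:v hvc1` flag is there because some players (notably Apple's) expect that tag on HEVC in MP4/MOV containers.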
Love the YI 360 VR at the back... the only 360 cam I have...
That is old school!
@@hughhou Yes! Many thanks to you - I learn a lot from you. Hope you make more tutorials for the YI 360 VR in 2020... GBU
How does it compare to Resolve's and Premiere/After Effects' native tools? And what is the native tool you mention for Resolve - is it the 'beauty' effect?
I personally think Neat Video is way better and faster. But Resolve temporal denoiser is still really powerful.
Great video Hugh! Keep on pushing VR.
That was great. Need to upgrade to version 5 now. Thanks!
You should! I did not actually mention any new feature on this tutorial - but the new feature is pretty amazing.
Great video, Hugh.
Thx Jon!
VR will become mainstream not through VR gaming but through VR social media.
Thank You Hugh, these videos are great
Glad you like it Clay!
Thank you for your very informative videos. Brave new world for me!
Will try this with my Qoocam, as its 4K video has a lot of noise in both 360 and 180, alas!
We class you as a friend - even my two kids (10 and 13) see you on the TV and say “oh, it’s 3D Hugh”. They end up watching the videos a lot, as I use the main TV lol.
Haha nice! OMG, I never watch myself on a TV 🙈
Awesome! Thank you
Glad you like it
I've just discovered this channel and I'm subscribing as I see the intro. Love your style 🙌
Awww thank you so much!
Only channel I subscribe to that I feel I HAVE to watch bc it's always so informative!
Awwww thank you so much!!
Great, great, great!!!. Many thanks!!!
Thank you!!!!
indeed!
SUPER Highly recommend on 360 filmmaker~!!! Thanks for share nice skill, CreatorUp~!
Awww thanks for the recommendation!
Neat video is awesome, I've been using for years!
Have you tried V5? It is 1.5-2x faster if you have a good GPU.
Titan samples when?
Soon, soon! It needs lots of processing power, and I am still in the middle of building a new PC to handle that.
Explained very clearly! Great!
In your workflow, do you apply Neat Video first and then Lumetri? Shouldn't it be the reverse?
I have a supercomputer - drives in RAID 0, 128GB RAM, two ASUS ROG Strix GeForce RTX 2080 Tis, and a Core i7-6900K - 10,000,000,000 times more powerful than the computer that went to the moon, and when I add Neat Video 4 or 5, either one of them kills the workflow; instead of a computer, I have a pocket calculator... Is that the workflow? Are you sure? "....Somewhere I lose myself...."
I do Neat Video first - in fact, I denoise before anything else (even before stitching). What is the size of the footage you are talking about? 8K x 8K from the Insta360 Pro 2? Ozo? Jaunt One?
@@hughhou Insta360 Pro. I am right in the middle of it now - Adobe Premiere has crashed three times and I deleted the VID_20190000_0000.ins file. I am considering using the built-in immersive video noise reduction to finish the project and deliver it. It took me 2 days trying to remove the noise from a 1-minute sequence, plus another 2 days to remove the gimbal car - computers are not ready for 8K yet. And the worst thing is that at the end of the month they are giving me the Titan... Greetings and thanks for your wise advice.