I'm looking to use Ultimatte 12 along with Unreal and the HTC Mars tracking system for a 6-episode kids' show shot in real-time. With a correctly lit greenscreen, do you think Ultimatte 12 can achieve real-time keying that looks precomped? Thanks for continuing to make these great tutorials!
Same setup and curious to know the same
Let me also add...same setup. Has anyone come to a conclusion about this method?
Yes, Ultimatte 12 can achieve a real-time comp with a correctly lit greenscreen and the right setup. @@Justin_Allen
Same setup and looking for same
Hey Aiden, thank you so much for this walkthrough. We are building a virtual production stage here with the Ultimatte 12 4K + HTC Mars CamTracker + Hyperdeck Studio 4K Pro + ATEM Television Studio 4K + 3 cameras. I'm completely new to Unreal Engine, so your tutorials are really saving my life right now. Thank you so much! My plan is to feed all 3 cameras into 3 individual Hyperdecks, then those Hyperdecks into the ATEM switcher, with the ATEM connected to the Ultimatte. The Ultimatte I'd connect port-by-port to the Decklink 8K per your connection configuration. The ATEM would switch to a specific camera angle and send that camera's signal to the Ultimatte. Is this how you would wire everything? Correct me if I'm wrong, but if I bought 3 Ultimattes, one for each Hyperdeck, I'd also have to buy (3) Decklink cards for my computer for each Ultimatte to connect to, which really complicates the setup, not to mention I don't know how I'd handle that in UE since every video tutorial on UA-cam covers a single Ultimatte connected to a single Decklink card. This also seems more cost-effective than buying (3) Ultimattes and (3) Decklink cards.
I looked for the other two videos you mentioned, but I could not find them. I'm really interested in how you are using this, as I am looking to do the same. Thanks for this tutorial!
And what about foreground objects in the Unreal scene? Do we also have to create another CG layer for that?
Figured it out already. If you wanna know, ping me!! Bye...
If I wanted to put the composite on a plane and move the camera around it in 3 dimensions, how would I do that?
You'd put a camera tracker on your physical camera, and then feed that motion data to the cinematic camera in your scene.
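To illustrate the idea (not the exact setup shown in the video), here is a minimal, engine-agnostic Python sketch; read_tracker_pose and the tracker-to-lens offset are hypothetical stand-ins for whatever your tracking plugin (Live Link, Mars CamTrack, etc.) actually provides each frame:

# Minimal sketch: drive a virtual camera from a physical tracker pose.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float
    pitch: float
    yaw: float
    roll: float

def read_tracker_pose() -> Pose:
    # Placeholder values; in production this is updated every frame
    # by the tracking system.
    return Pose(x=0.0, y=1.5, z=3.0, pitch=0.0, yaw=90.0, roll=0.0)

def drive_virtual_camera(tracker: Pose, lens_offset_z: float = 0.05) -> Pose:
    # Copy the tracked rotation 1:1 and nudge the position by the
    # tracker-to-lens distance. A real rig would rotate this offset
    # into the camera's frame; that matrix math is omitted here.
    return Pose(tracker.x, tracker.y, tracker.z + lens_offset_z,
                tracker.pitch, tracker.yaw, tracker.roll)

print(drive_virtual_camera(read_tracker_pose()))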
Can you please tell us about the foreground objects?
Alright! Nice work!
Great tutorial. I am trying to send 3 SDI channels from Unreal (Background, Foreground Fill, and Foreground Key), all to the Ultimatte. I have two questions: 1) will the Ultimatte work this way, and 2) how do I break the foreground up into Key and Fill?
Hi. I am trying to do the opposite thing: with an external studio camera on a huge chroma wall, would it be possible, using the Vive trackers, to render the scenery in Unreal, use that Unreal render as the background, and key the camera feed over the Unreal scene? Thank you in advance.
Yes, very possible, and I have an upcoming video covering exactly that, although not in a big greenscreen studio.
@@3d_Something Thank you
Are we able to go the other way, as @Haz3rd mentioned? Wouldn't sending the UE background feed to the Ultimatte 12 utilize the hardware for live compositing and save resources on the UE side?
It would. This input approach is for when you need the keyed footage inside Unreal for other things, such as shadow generation and reflections.
Please, I really need some help. I tried the same process as you did. I have only one capture card, but it has two SDI INs. How can I bring in both the cable with the Fill signal and the cable with the Matte signal? In Unreal I can only select one.
Depending on the card, it might not support 2 independent inputs at the same time; the two connectors may instead be there to support 6G or 12G dual-link.
You need the Decklink 8K card with 4 IOs, as sooner or later you'll need to use SDI outputs too.
I have a 4K Decklink with just 1 input and 1 output. Would I still be able to use an Ultimatte with Unreal if I routed the signal differently? Perhaps the Vive-tracked raw camera feed would go straight from the camera, and the tracked real-time render coming out of Unreal would serve as the clean plate. The composite would then be done outside of Unreal using the Ultimatte hardware only?
If so, what does this mean for latency between the Vive tracker data, the camera feed, and the Unreal render? Do they all need to be delayed differently to sync back up? Genlock?
I do it that way. The only drawback is that the delay changes slightly over time, meaning after a take or two it might start slipping, and then you have to adjust. If we can figure that out, it's actually better to have the key outside, as it gets rid of software issues and gives you a massive performance boost.
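For anyone trying to dial this in, here's a rough Python sketch of the delay math; the latency numbers are assumptions, not measurements from any real rig. Each source gets padded up to the slowest one so they land on the same frame; the slipping mentioned above is why these offsets need re-checking when the devices aren't genlocked to a common reference.

# Rough sketch: align sources with different latencies (illustrative numbers only).
FPS = 50.0
frame_ms = 1000.0 / FPS

# Assumed end-to-end latencies in milliseconds; measure these on your own setup.
latency_ms = {
    "camera_feed": 60.0,     # camera + SDI chain
    "tracker_data": 20.0,    # Vive / Mars tracker
    "unreal_render": 100.0,  # tracked render out of Unreal
}

slowest = max(latency_ms.values())
for source, ms in latency_ms.items():
    pad_ms = slowest - ms
    pad_frames = round(pad_ms / frame_ms)
    print(f"{source}: delay by {pad_frames} frame(s) (~{pad_ms:.0f} ms)")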
It's just crazy how complicated everything is, and this is just to display the video in UE5. Disaster... Is there really no plugin today that replaces all these unnecessary steps?
Thanks ❤
This is just what I want to do with it; I ordered it as well. Question: is there some way to incorporate 360 video into this workflow?
What are you trying to do with the 360 video exactly?
If you mean a 360 camera move as in FOX Sports (the intro of this video: ua-cam.com/video/rOe6Gw9TvJg/v-deo.html), they use a garbage matte in Composure: the green screen creates a 360-degree effect, with computer graphics placed all around the scene, wherever the camera points. Garbage mattes are explained here: ua-cam.com/video/LL2-jxUk2_0/v-deo.html
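As a side note on what a garbage matte actually does, here's a tiny NumPy sketch with toy values (not taken from Composure itself): the hand-drawn garbage mask simply multiplies the keyer's alpha, forcing everything outside the greenscreen area to be transparent so graphics can be placed there.

# Tiny sketch: combining a chroma key with a garbage matte (toy values).
import numpy as np

key_alpha = np.array([[1.0, 0.2, 0.9],
                      [0.8, 0.0, 1.0]])   # alpha from the chroma keyer
garbage   = np.array([[1.0, 1.0, 0.0],
                      [1.0, 1.0, 0.0]])   # 1 = keep, 0 = force background

final_alpha = key_alpha * garbage         # garbage area becomes transparent
print(final_alpha)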
Why not use the Ultimatte to do the comp?
Having the video in Unreal allows for some trickery, such as having the subject you're keying cast shadows and reflections in the 3D scene. However, I have also covered how to do the comp in the Ultimatte here: ua-cam.com/video/HGLJk_2joxA/v-deo.html
Hi, just purchased the Ultimatte. Which outputs are you using on the rear of the unit for fill and key?
PGM Fill Out and PGM Matte Out
@@3d_Something Thank you, that was helpful. What if I wanted to record the final comp to an external recorder? Once I have the key and fill, would I then use the PGM output on the Ultimatte to record the final comp?
@@wertup123 I think you should use one of the Decklink card outputs, not the Ultimatte outputs. The Ultimatte is just for the chroma, nothing else. Configure Decklink card output 4 as shown at 11:05 and connect it with an SDI cable to an external recorder like the Blackmagic Video Assist or any SDI recorder.
Interesting, I expected it to go the other way, sending an Unreal camera to the Ultimatte and using that as your background for the key
That is also doable. However, by having the keyed footage in Unreal you can do more complex compositing tricks, such as fake shadows and reflections, which you couldn't do by only sending video out to the Ultimatte.
@@3d_Something exactly
@@3d_Something How do you do fake shadows? Any videos on that?
❤ How are you getting the camera's output on a separate display?
At the end of the tutorial I show how to output the final comp.
Oh I see, you need a Decklink to get that to work.
@@salt806 You can also use NDI if you don't have access to a Decklink card; NDI is a free toolset for UE4/5.
@@DemeterZephyr good idea!
Free? Are you sure? How? @@DemeterZephyr