Virtual Production Studio Tour Deep Tech Dive (Unreal Engine, Aximmetry)

  • Published 27 Sep 2024

COMMENTS • 30

  • @FluidPrompter · 5 months ago

    Pretty slick setup! The prompter in the thumbnail caught my eye. 😎

  • @Justin_Allen · 10 months ago · +1

    Great behind-the-scenes views. I'm almost at that point and looking forward to getting bogged down in production problems!

    • @cookseyyy · 10 months ago · +1

      Thanks! There are always problems. The trick is to keep pushing forward with your ideas and vision; there's always a way!

  • @drccam8458 · 3 months ago

    Thanks for this video.

  • @sybexstudio · 7 months ago

    How do you select which pass to record on each HyperDeck? For example, the green screen on one and the final composite on another?

  • @Justin_Allen · 9 months ago · +2

    I spent some time going back through your video and wanted to ask about your decision to use Aximmetry. You only mentioned you're using it for compositing... do you use it for anything else? I chose to bring in Ultimatte 12 4K systems rather than Aximmetry, and went back and forth for a while on this decision.

    • @cookseyyy · 7 months ago · +1

      I think it's great for compositing. The controls aren't as intuitive as some others', but I've found it to be more than capable for compositing in lots of situations.

    • @Justin_Allen · 7 months ago

      @cookseyyy Thanks. I keep bouncing between Aximmetry and Offworld Live's compositing tool. Have you evaluated OWL's tool, by chance?

  • @brocastteam · 10 months ago · +1

    Thank you for sharing!
    I'm trying to figure out: are the Vive rovers capable of reading the lens information via USB in real time? Have you tried working with photo lenses? Is that possible?

    • @cookseyyy · 9 months ago

      We're using a different system for the lens information (Vanishing Point Viper). Photo lenses are difficult to work with, as they don't have hard stops. We get around this with photo prime lenses by just faking the DOF.

  • @thebuzzmeade · 10 months ago · +2

    Thank you. Some helpful stuff here. Cheers 👍🏽

  • @ScorpioKing3000 · 10 months ago · +1

    How do you set up the Vive tracker system for recording VR environments?

    • @cookseyyy · 10 months ago

      There are lots of helpful resources on YouTube, depending on the exact system you're using. In my case we're using the Vive Mars system in combination with Aximmetry, but that's by no means the only way.

  • @endorfineproducties9499 · 8 months ago

    Hi Matthew, what a nice in-depth tech dive, and a nicely improved workflow. One question: at 10:26 you change the world origin with an app, which is pretty handy. Would you be willing to share the app? Or is it a native Unreal or Aximmetry app? Cheers

    • @cookseyyy · 7 months ago

      Yep, I'm using TouchOSC to create some custom control panels that I use to send OSC data to Aximmetry. I'm then using a camera mover component to change the transform values of the origin over time. You can obviously set them directly, but I find it helpful to have a controller to do it on the fly. I also sometimes use a game controller to do the same thing.
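
      For anyone wanting to replicate the approach above without TouchOSC, here is a minimal Python sketch of the same idea, using the python-osc package to send OSC messages. The host, port, and address path are placeholder assumptions; Aximmetry's OSC mappings are configured per project, so match them to your own setup.

      ```python
      # Minimal OSC sender sketch (pip install python-osc).
      # Host, port, and the "/origin/offset_x" path are assumptions; match
      # them to the OSC inputs mapped in your own Aximmetry compound.
      import time
      from pythonosc.udp_client import SimpleUDPClient

      client = SimpleUDPClient("192.168.1.50", 8000)  # Aximmetry machine + OSC port

      # Nudge the world origin along X over time, like a fader on a panel would.
      for step in range(10):
          client.send_message("/origin/offset_x", step * 0.05)
          time.sleep(0.1)
      ```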

  • @Serjakovs · 9 months ago

    Hey, great content! Wondering how you get focus/zoom data into Aximmetry?

    • @cookseyyy · 7 months ago

      We're currently using the Vanishing Point Viper system, which has encoders that read the position of the focus and zoom rings on the lens. However, we might be switching to the ReTracker system in the near future.
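
      For context on what an encoder workflow involves: the raw encoder ticks only become usable focus/zoom values after a per-lens calibration maps them to real distances or focal lengths. Below is a minimal sketch of that mapping; the tick counts and distances are invented for illustration.

      ```python
      # Sketch: mapping raw lens-ring encoder ticks to a focus distance.
      # The calibration pairs are invented; a real table comes from marking
      # known distances on the lens during calibration.
      import numpy as np

      # (encoder ticks, focus distance in metres) captured at calibration time
      ticks  = np.array([0,    800, 2100, 3500, 4095])
      metres = np.array([0.45, 0.8, 1.5,  4.0,  1e4])  # 1e4 ~ the infinity end

      def focus_distance(raw_ticks: int) -> float:
          """Interpolate between calibrated points; clamps outside the range."""
          return float(np.interp(raw_ticks, ticks, metres))

      print(focus_distance(1500))  # ~1.2 m, between the 0.8 m and 1.5 m marks
      ```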

  • @JairoAndresAmaya · 9 months ago

    Does it work with Mac?

  • @mrrafsk · 8 months ago

    Is there a live-input HDRI application that can feed log 360 images into a real-time rendering application to get GI-like lighting? I.e., so that lighting changes on set procedurally relight the scene?

    • @cookseyyy · 8 months ago · +1

      I know a company called Antilatency is working on something like this, but they're doing it in relation to your lighting setup too: you get a virtual HDRI capture of your environment, and it sends that data to your lighting to match.
      Your idea is cool though, and would work really well for AR graphics.
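
      To make the idea concrete, here is a rough Python sketch of what a live-HDRI consumer could do each frame: reduce the latest equirectangular capture (already converted from log to linear) to a flat ambient colour plus a dominant light direction that could drive a virtual light. This is purely illustrative and not based on any shipping product.

      ```python
      # Sketch: reduce a live equirectangular HDR frame to simple light cues.
      # Input is assumed to be a linearised float32 array of shape (H, W, 3);
      # the log-to-linear step depends on the camera's transfer curve.
      import numpy as np

      def light_cues(equirect):
          h, w, _ = equirect.shape
          # Solid-angle weight: rows near the poles cover less of the sphere.
          theta = (np.arange(h) + 0.5) / h * np.pi        # 0..pi, top to bottom
          weights = np.sin(theta)[:, None]                # shape (H, 1)

          # Weighted mean colour ~ a crude flat ambient term.
          ambient = (equirect * weights[..., None]).sum((0, 1)) / (weights.sum() * w)

          # Brightest pixel ~ dominant key-light direction (one mapping convention).
          lum = equirect @ np.array([0.2126, 0.7152, 0.0722])
          y, x = np.unravel_index(np.argmax(lum), lum.shape)
          phi, t = (x + 0.5) / w * 2 * np.pi, theta[y]
          direction = (np.sin(t) * np.cos(phi), np.cos(t), np.sin(t) * np.sin(phi))
          return ambient, direction

      frame = np.random.rand(180, 360, 3).astype(np.float32)  # stand-in capture
      print(light_cues(frame))
      ```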

    • @mrrafsk · 8 months ago

      @cookseyyy Thanks for the lead. I can't see anything on their website, but I'll dig deeper. I found some papers from university groups doing this, but no commercial application. A challenge when doing stage performances (incl. fake holograms) is syncing up the on-set lighting to the CG. I've seen some people develop DMX-to-Unreal bridges, but I can't find real-time HDRIs.

  • @marcusunivers · 7 months ago

    How do you get the timecode from the cams and the Mars synced up together in Unreal in real time? We're using DeckLink cards in combination with Tentacles to send timecode into the cams and the Mars, but the cams are never in sync because the Blackmagic Media Bundle doesn't receive timecode. 😔

    • @cookseyyy · 7 months ago · +1

      It's not the timecode that matters so much as genlock. Genlock ensures that every frame is in sync with the tracking data coming in; otherwise they come in at different rates and will drift over time, even if you set everything up correctly.
      You do need timecode, however, if you want to do post comp work. It means your tracking data and video carry the same timestamps, making it much easier to synchronize them when you're compositing, etc.
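
      To illustrate the post-side benefit of shared timecode: once the tracking recorder and the camera stamp the same timecode, alignment in post is a lookup rather than manual eye-matching. A toy Python sketch follows; the data shapes are invented, since tracking exports vary by tool.

      ```python
      # Sketch: matching tracking samples to video frames by shared timecode.

      def tc_to_frames(tc, fps=25):
          """'HH:MM:SS:FF' -> absolute frame count (non-drop-frame)."""
          hh, mm, ss, ff = (int(p) for p in tc.split(":"))
          return ((hh * 60 + mm) * 60 + ss) * fps + ff

      # Tracking data keyed by its recorded timecode...
      tracking = {
          "10:02:17:04": {"pos": (1.20, 1.55, -3.10), "focus_m": 2.4},
          "10:02:17:05": {"pos": (1.21, 1.55, -3.08), "focus_m": 2.4},
      }

      # ...so any video frame's matching sample is a dictionary lookup.
      video_frame_tc = "10:02:17:05"
      print(tc_to_frames(video_frame_tc), tracking[video_frame_tc])
      ```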

    • @marcusunivers · 7 months ago

      @cookseyyy Ah, I understand. So the timecode is good for later syncing in post. 👍
      But to sync the scene with the camera, after I've recorded it on the cam with timecode, how do I get the same timecode from something like a Tentacle Sync into the scene recording in Aximmetry, or even Unreal Engine directly?
      I mean, I can plug the same timecode from the Tentacle Sync into my PC's audio jack (they're in sync with each other, so the cams and the PC get the same audio timecode). But then I need to get the audio into Aximmetry, convert the left audio channel into a timecode, and feed it somehow into the scene recording inside Aximmetry to sync it up later in post.
      How are you handling timecode data in Aximmetry or Unreal? Are you providing each timecode separately, or is there a way to use the left channel of the SDI cam signal as timecode in Aximmetry? 😅

    • @parametriq_ · 7 months ago

      @marcusunivers Unreal does receive timecode input. In the Content Browser, find the Blackmagic timecode provider. After setting it up, go to Project Settings and change the timecode settings accordingly.
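
      As a companion to the reply above, here is a small Editor Python sketch for sanity-checking that the engine is actually receiving timecode once a provider is configured. The TimeManagementBlueprintLibrary call mirrors the stock "Get Timecode" Blueprint node, but exact names can shift between engine versions, so treat this as an assumption to verify against your version's Python API docs.

      ```python
      # Editor Python sketch: print the engine's current timecode after a
      # Blackmagic timecode provider has been set up in Project Settings.
      # Names follow UE's TimeManagement module (verify for your version).
      import unreal

      tc = unreal.TimeManagementBlueprintLibrary.get_timecode()
      unreal.log("Engine timecode: {:02d}:{:02d}:{:02d}:{:02d}".format(
          tc.hours, tc.minutes, tc.seconds, tc.frames))
      ```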

  • @JairoAndresAmaya · 9 months ago

    Hello! I'm building a professional studio like this, but I need help. Can we connect?

    • @cookseyyy · 7 months ago

      Of course. Feel free to DM me. Where's your studio based?

    • @JairoAndresAmaya · 7 months ago

      @cookseyyy Santa Ana, CA. I'd like to connect with you.

  • @derherrdirector · 7 months ago

    Do you need genlock for the sync? The C70 has timecode in/out, depending on your preference.

    • @cookseyyy · 7 months ago

      It depends on the tracking system. The Vive Mars system needs genlock to stay in sync, so the C70s aren't technically in sync. However, this only matters if you plan to move the cameras independently. We typically only move our A camera, an URSA 12K, which does have genlock, so we just use that.