How to make a live playable anamorphic billboard using nDisplay and Unreal Engine

  • Published 23 Nov 2024

COMMENTS •

  • @chipko · 1 year ago +7

    Oh wow... I'm nowhere near this level, having just mastered getting NDI into my UE, but maybe in the future I'll be doing wizardry! Thanks for sharing!

  • @RobertRathnow · 1 year ago +1

    Really nice to see how it works in the end! Thanks for sharing it.

  • @anejaks · 2 months ago

    Great tutorial, very informative

  • @OwenPrescott · 1 year ago +2

    Still trying to understand this all, but could this be used to render high-res video(s) for an anamorphic screen effect?

    • @offworldlive · 1 year ago +3

      Yes - use nDisplay in Movie Render Queue.

  • @JoshPurple · 3 months ago

    Very impressive 🏆👍 !

  • @mauriziomarseguerra · 5 months ago

    Great tutorial. But what if I just wanted to render the two virtual screens to project them onto two physical screens?

    • @offworldlive · 5 months ago +1

      You can use the nDisplay render pass in Movie Render Queue to render the screens offline. You may need to enable it in Plugins first.
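
      A minimal editor-Python sketch of that kind of Movie Render Queue setup, assuming hypothetical map/sequence paths and output resolution; the nDisplay render pass class isn't named here because it varies by engine version:

      ```python
      # Rough sketch: queue an offline Movie Render Queue job from editor Python.
      # Asset paths and the output resolution below are placeholders.
      import unreal

      subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
      queue = subsystem.get_queue()

      job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
      job.map = unreal.SoftObjectPath("/Game/Maps/BillboardLevel")           # placeholder map
      job.sequence = unreal.SoftObjectPath("/Game/Cinematics/BillboardSeq")  # placeholder Level Sequence

      config = job.get_configuration()
      config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)         # default render pass
      config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_PNG)  # PNG frame output

      output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
      output.output_resolution = unreal.IntPoint(3840, 2160)  # match your LED canvas

      # With the nDisplay plugin enabled, its Movie Render Queue render pass shows up as
      # another setting class; add it via find_or_add_setting_by_class in place of the
      # deferred pass (check the MRQ UI for the exact class in your engine version).

      subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)  # render in-editor (PIE)
      ```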

    • @mauriziomarseguerra · 5 months ago

      @offworldlive Thanks a lot for the answer. But rendered that way, the output is tied to the position of the camera in the scene. How do I capture only the video projected onto each monitor, so I can then play those two videos back on the physical monitors?

    • @offworldlive · 5 months ago

      @mauriziomarseguerra The nDisplay viewport capture settings in Movie Render Queue will capture the nDisplay viewports, with the skewed perspective based on the viewpoint in your nDisplay config. Or, as outlined here, you can use the OWL viewport capture and media output to capture the live scene when the nDisplay config is launched.
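
      As a hedged illustration of the render-target route mentioned above (not the OWL plugin's own API - just the stock Kismet rendering helper), you can dump whatever is currently in a render target to disk; the asset path and output folder here are placeholders:

      ```python
      # Rough sketch: export the current contents of a Texture Render Target 2D
      # that a viewport/scene capture is drawing into. Paths are placeholders.
      import unreal

      world = unreal.EditorLevelLibrary.get_editor_world()
      rt = unreal.load_asset("/Game/Capture/RT_ScreenLeft")  # placeholder render target asset

      # The written file format follows the render target's pixel format
      # (float targets come out as EXR/HDR rather than PNG).
      unreal.RenderingLibrary.export_render_target(world, rt, "D:/captures", "screen_left.exr")
      ```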

  • @AndrewHereytb · 1 year ago +2

    Very quick question here.
    It seems like nDisplay does not render GPU-powered particles when exporting using MRQ.
    For example, I have snow particles from UltraDynamicSky, which show up perfectly fine in the viewport preview of the nDisplay setup. But once I render them out, they are not there, and neither is the volumetric fog, which is what makes me think it's the GPU-powered particles.
    I was wondering if you know of any fixes for this? Changing to CPU doesn't seem to work either.
    Thanks!

    • @hardcorerick8514 · 1 year ago +1

      Hmm, this is an interesting one. I never found a problem with my fairly basic Niagara system in this video, but I didn't get to the stage of rendering out; I was just using the OWL media output, which captures the image from the render target and outputs it. I think nDisplay isn't great for particle systems because a randomly generated system won't replicate perfectly across multiple game instances. I'm afraid there isn't much info out there on using particle systems alongside nDisplay; perhaps it's worth looking at more old-school methods like fog cards for your fog?

    • @niaxr · 3 months ago

      Aha! I've had this issue too (just a couple of weeks ago)... I just assumed I was doing something wrong somewhere! Did you ever find a way to render GPU-powered particles with your nDisplay MRQ setup? :)

    • @AndrewHereytb · 2 months ago +1

      @niaxr Yes and no. 5.4 is better and particles render, but only within the vicinity of the nDisplay setup, and only in its initial position. If you move the nDisplay, the particles don't move with it; they still generate in that initial area.

    • @niaxr · 2 months ago +1

      @AndrewHereytb Oh, that's really good to know, thanks! I'll give it a go on 5.4 this time and will *try* not to move the nDisplay ✨🥳🤘

  • @maxfxgr · 10 months ago

    Fantastic Video :)

  • @AhmedMoussa147 · 1 year ago +1

    This is very unique ❤

  • @antoniocottone2961 · 1 year ago +1

    Do you think this could be possible with a spherical projection? Like 360° projection mapping onto a sphere, by using flattened UVs (and a lat-long conversion)?

    • @offworldlive · 1 year ago

      Yes - it should be possible. If you come onto the Discord we could maybe do an open session to try to work it out: discord.gg/VQXARA2

  • @MatheoMarechal-v7q · 1 year ago

    That's pretty cool and interesting, thanks for the video. I kind of tried this, but my character isn't moving when I launch my nDisplay config with Switchboard. Would you be able to guess what could be wrong, just like that? I would really appreciate it. Thank you again.

    • @hardcorerick8514 · 1 year ago

      Hmm... I would think that if it plays in the editor then it should also play when launched from Switchboard. Bear in mind that Switchboard is launching another instance of your UE project as a standalone game, so if you have any game mode overrides set for that level, that could be the cause. If you can launch your level and control your character, the same should really hold for launching the level via Switchboard. Check the pawn's Auto Possess Player setting and set it to Player 0, and check that the pawn is the default pawn in the game mode. Hope this fixes it for you!
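
      A minimal editor-Python sketch of that auto-possess check, assuming UE5's EditorActorSubsystem and simply targeting every Pawn placed in the open level:

      ```python
      # Rough sketch: set placed pawns to auto-possess Player 0 so they are
      # controlled when Switchboard launches the level as a standalone game.
      import unreal

      actors = unreal.get_editor_subsystem(unreal.EditorActorSubsystem).get_all_level_actors()
      for actor in actors:
          if isinstance(actor, unreal.Pawn):
              actor.set_editor_property("auto_possess_player", unreal.AutoReceiveInput.PLAYER0)
              unreal.log("Auto Possess Player -> Player 0 on {}".format(actor.get_name()))

      # Also confirm the active GameMode (World Settings override or Project Settings)
      # uses this pawn as its Default Pawn Class, or doesn't replace it at launch.
      ```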

  • @enricobersani8948 · 1 year ago

    Very cool! Thank you so much! I have a question: if I wanted to do the final preview inside of Unreal, my guess would be to put the LED wall mesh in a scene and set the final render as a video texture, but should I then change the UVs of the final preview mesh?

    • @hardcorerick8514 · 1 year ago +1

      Thanks! nDisplay always leans towards using Switchboard to get the final render of the screens - mostly as a performance thing, so that you don't lose frames when running the high-res outputs; you can share the cost over multiple machines.
      If you are looking to just render out the skewed image, you could do it one viewport at a time with sequenced shots, then match them up in post. This is what I did before, running the output in Blender just as a test.
      The nDisplay config viewpoint produces the skewed image and maps it to the geometry you have as your LED walls. So if the mesh you finally project onto has the same UVs, they shouldn't need editing, as long as it is still the same aspect ratio you had in your nDisplay viewport render.
      Hope this helps! Not sure I understand the question completely, but hopefully there's some stuff for you to try out there!

  • @5u3z · 11 months ago

    I'm trying to make the same balloon effect... but not sure how to do it. Please help lol

  • @xuesenli5968 · 1 year ago

    Boss, how can I use LineTraceByChannel correctly with nDisplay?

  • @vfxtechnologies2011 · 1 year ago +1

    Love the video! Thanks for using some of our clips. Would I be able to reach out to you personally?

    • @offworldlive · 1 year ago

      Yes - we received your message through the website and have emailed you back :)

  • @sakthisathis6206 · 4 months ago +1

    Does this render work on a real-world billboard?

    • @offworldlive · 4 months ago

      It's not a render, it's a real-time workflow, but you can render it out if you like. And yes, it works.

    • @sakthisathis6206 · 4 months ago

      @offworldlive Like, if we have a 3D billboard set up in the physical world, can we use this render to project onto it? Does it work directly, or do we need to go through third-party software to do that render?

    • @offworldlive · 4 months ago

      @sakthisathis6206 It depends on what format your media needs to be in to play correctly on your billboard. Sometimes LED installations come with software that takes in a certain video and then maps it pixel-for-pixel to the LED walls. We just handle the output from Unreal, but nDisplay is the route to getting that output from UE correctly and in the configuration you need for your playback.

    • @fanikini4511 · 4 months ago

      @sakthisathis6206 Maybe you could send the NDI output to Resolume Arena so you get a perfect pixel map to the LED billboard.

    • @fanikini4511 · 4 months ago

      @offworldlive Anyway, do you think it will also work for an immersive room - let's say three walls, a floor and a ceiling?

  • @ricardobravo8592 · 5 months ago +1

    Amazing product, HORRIBLE tutorial.
    It's impossible to follow this. You're always cutting scenes and talking about a different aspect every time.
    I would love to use this with the Off World Live plugin, but I just can't understand how you do it.

    • @offworldlive · 5 months ago

      Thanks for the feedback; we could make a more detailed video on this workflow. In the meantime you can see every element that went into this in our nDisplay series.