A New Virtual Reality: A 5,000 sqft VR Experience for Four Seasons Lake Austin | Unreal Fest 2023

  • Published 21 Nov 2023
  • What does it take to appropriately represent a groundbreaking real estate project in virtual reality? In this talk, we’ll look at the challenges in communicating intricate, site-specific architecture, pushing the limits of visual quality in a fully ray-traced VR experience, and providing a multi-user wireless experience inside a 5,000 sq. ft. custom showroom.
    You’ll gain insights into the challenges faced and the ultimate solutions adopted to ensure an end-user experience that aligns with the caliber of the project.
    We’re excited to bring you sessions from Unreal Fest 2023, available to watch on demand: • Unreal Fest '23 Playlist
  • Games

COMMENTS • 42

  • @ibrews · 6 months ago +16

    Thank you so much everyone for checking out our talk! This project has been a tremendous effort from many, many people on many, many teams. The R&D continues as we all work toward pushing the boundaries of realism in virtual reality. Watch this space, and cheers!
    -Alex from Agile Lens

  • @se7en28 · 6 months ago +21

    Man, Alex is like the only dude pushing VR in Unreal. Good on him for raising the bar.

    • @ibrews · 6 months ago +5

      there are dozens of us. DOZENS!

    • @behrampatel4872 · 6 months ago +2

      @@ibrews Dozens pushing VR? Sure. Helping raise the tide for all boats (educating and sharing knowledge at the bleeding edge)? Very few. Count them on 'one' finger ;)
      Cheers,
      b

    • @ibrews · 6 months ago

      @@behrampatel4872 🫶

    • @bolayer · 5 months ago +1

      @@ibrews dozens of us I tell you 😂 need to catch up soon Alex

    • @ibrews · 5 months ago +1

      @@bolayer yes yes!

  • @slot9 · 5 months ago +3

    That was amazing!

  • @manuvikraman1611 · 6 months ago +3

    Very informative, thanks.

  • @behrampatel4872 · 6 months ago +12

    Alex is a very, very talented guy. Generous with his knowledge too. Cheers

    • @ibrews · 6 months ago +2

      You’re kind! There’s a very large team who helped make this happen and I’m just fortunate to be one of the messengers 🎉

  • @syno3608 · 6 months ago +5

    This is pure gold. Thanks!!

  • @arprint3d · 6 months ago +4

    How incredible it is to see these great works, and every minuscule detail they attend to in order to achieve something unique. Congratulations! It gives me a lot of motivation to keep advancing in VR

  • @ali.3d · 6 months ago +5

    Such a great video and presentation, thanks heaps to everyone involved 🙌🏽

  • @Lupin0 · 6 months ago +3

    wow! awesome job!!

  • @dyna4studio942 · 6 months ago +2

    Good. Let's push push push!

  • @Kor3Gaming_Ghost · 6 months ago +2

    This guy is really talented, for real. I love VR and I cannot wait to push it

  • @zakaria20062 · 3 months ago +2

    I'm not sure why Epic took away an important feature like HTML5 support. I think we need this feature back

  • @uiefuh17 · 6 months ago +5

    🤞

  • @brettcameratraveler · 6 months ago +3

    Incredible effort and attention to detail. OptiTrack is very accurate for mocap, but it's expensive and not an easily portable hardware solution. If you were to do it again with the Quest 2, would Meta's Shared Spatial Anchors have been good enough to be safe for multiple users in a shared space?
    How about in the case of critically aligned IRL objects with their digital-twin counterparts? Repeatable?

    • @ibrews · 6 months ago +1

      We've been doing real-world alignment with objects for years with Vive Trackers, even controllers sometimes (if the users don't need them).
      Shared Spatial Anchors are still very bad and we could not rely on them. No improvement yet as far as we can see.
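
      For the curious, a minimal Unreal C++ sketch of that kind of tracker-based alignment: a digital-twin mesh parented to a motion controller component that follows a Vive Tracker. This is an illustration under assumptions ("Special_1" is SteamVR's usual motion source name for the first tracker), not the project's actual code.

      // Hypothetical actor that keeps a digital-twin mesh registered to a
      // physical object carrying a Vive Tracker. Illustrative only.
      #include "CoreMinimal.h"
      #include "GameFramework/Actor.h"
      #include "MotionControllerComponent.h"
      #include "Components/StaticMeshComponent.h"
      #include "TrackedTwinActor.generated.h"

      UCLASS()
      class ATrackedTwinActor : public AActor
      {
          GENERATED_BODY()

      public:
          ATrackedTwinActor()
          {
              // The motion controller component follows the tracker's pose.
              Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
              Tracker->MotionSource = FName(TEXT("Special_1")); // assumed tracker source
              RootComponent = Tracker;

              // The virtual twin is parented to the tracker, so it stays
              // aligned with its real-world counterpart as it moves.
              TwinMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("TwinMesh"));
              TwinMesh->SetupAttachment(Tracker);
          }

          UPROPERTY(VisibleAnywhere)
          UMotionControllerComponent* Tracker;

          UPROPERTY(VisibleAnywhere)
          UStaticMeshComponent* TwinMesh;
      };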

  • @haikeye1425 · 6 months ago +4

    👍

  • @juanipignatta414 · 6 months ago +1

    🤩

    • @ibrews · 6 months ago +1

      go Juani go!!

  • @arealvisionvideos · 4 months ago +2

    Hi Jose, Alex and Neil, incredible work!
    I don't quite understand the part about the mirrors: can ray tracing and sphere reflection captures work at the same time?
    I thought that if you have ray-traced reflections activated you cannot see screen-space reflections.
    Thank you very much, and excuse my English.

    • @ibrews · 3 months ago +1

      Hi! That's correct: the project ONLY uses ray-traced reflections, but changes sample count and other levels of precision depending on context (see the sketch after this thread).

    • @arealvisionvideos · 3 months ago

      @@ibrews Thank you so much 👍

    • @arealvisionvideos · 3 months ago

      The highest value that I can apply in samples is 2 with my RTX 3080 mobile

    • @ibrews · 3 months ago +1

      @@arealvisionvideos this runs on a desktop RTX 4090
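
    A minimal sketch of the context-dependent precision switching described above, using stock ray-tracing console variables. The variable names are standard UE4-era cvars, but the values and the near-mirror trigger are illustrative assumptions, not the project's settings.

      // Illustrative only: vary ray-traced reflection cost by context.
      #include "HAL/IConsoleManager.h"

      static void SetReflectionQuality(bool bNearMirror)
      {
          // Spend more samples per pixel when the user is near a mirror.
          if (IConsoleVariable* SPP = IConsoleManager::Get().FindConsoleVariable(
                  TEXT("r.RayTracing.Reflections.SamplesPerPixel")))
          {
              SPP->Set(bNearMirror ? 2 : 1);
          }

          // Cap bounce count to protect the stereo frame budget.
          if (IConsoleVariable* Bounces = IConsoleManager::Get().FindConsoleVariable(
                  TEXT("r.RayTracing.Reflections.MaxBounces")))
          {
              Bounces->Set(bNearMirror ? 2 : 1);
          }
      }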

  • @r.m8146 · 6 months ago +3

    This kind of problem would be so much easier to solve if people were leveraging dynamic foveated rendering. It's such a shame that people don't recognize its potential; I hope the Vision Pro will change that.

    • @ibrews · 6 months ago +8

      Alex here! We did try to use dynamic foveated rendering using the eye tracking of the Meta Quest Pro, but a) in 4.27 it requires the Oculus branch of Unreal, and b) the latency was perceivable.
      Much, much better in 5.3 now! (See the sketch after this thread.)

    • @brettcameratraveler · 6 months ago

      @ibrews Roughly what percentage gain in performance did you see after you toggled DFR on?

    • @ibrews · 6 months ago

      @@brettcameratraveler At best 20%? Wasn't worth it for the artifacts.
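
    A sketch of how foveation can be toggled from code in newer engine versions. The xr.OpenXRFBFoveationLevel and xr.OpenXRFBFoveationDynamic names are assumed from the UE 5.3-era OpenXR plugin and should be verified against your engine version; gaze-driven (eye-tracked) foveation is configured separately from this fixed-foveation path.

      // Sketch: enable foveated rendering via assumed UE 5.3 OpenXR cvars.
      #include "HAL/IConsoleManager.h"

      static void EnableFoveatedRendering()
      {
          // 0 = off ... 3 = most aggressive peripheral resolution reduction.
          if (IConsoleVariable* Level = IConsoleManager::Get().FindConsoleVariable(
                  TEXT("xr.OpenXRFBFoveationLevel")))
          {
              Level->Set(2);
          }

          // Let the runtime raise or lower foveation with GPU load.
          if (IConsoleVariable* Dynamic = IConsoleManager::Get().FindConsoleVariable(
                  TEXT("xr.OpenXRFBFoveationDynamic")))
          {
              Dynamic->Set(1);
          }
      }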

  • @anmolsandhu3619 · 1 month ago

    Do you think the tracking has improved with the new Meta Quest 3? I.e., is the special equipment you made for the Meta Quest Pro headsets no longer needed?
    Also, I have been personally working on a VR scene, and I am able to get absolutely amazing quality in PC VR, but I see these sleek white lines: really faded, but if you really try to look you can see them, usually on furniture like sofas or beds. Maybe it's the texture that is causing that, or maybe I am just missing a rendering setting?
    I would really appreciate it if you could tell me what settings you used in the rendering tab to get the highest quality possible.

    • @juanipignatta414 · 1 month ago +1

      Hello there! If you are using ray tracing and the white lines you are seeing are on the edges of the furniture, it is related to ray tracing and the max roughness setting. Editing the value in the post process volume, or the roughness in the material, should fix it! (A sketch follows this thread.)

    • @ibrews · 1 month ago

      @@juanipignatta414 seconding Juani! :D
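
    A sketch of Juani's suggestion applied from C++; the same value can simply be set on a Post Process Volume in the editor. The field names assume a UE4-era ray-tracing setup and are worth double-checking for your engine version.

      // Illustrative fix for faint bright edges on rough furniture: raise the
      // ray-traced reflections max-roughness threshold on a post process volume.
      #include "Engine/PostProcessVolume.h"

      static void RaiseReflectionMaxRoughness(APostProcessVolume* Volume)
      {
          if (!Volume)
          {
              return;
          }

          // Surfaces rougher than this threshold fall back to other reflection
          // methods, which is one source of the faded white edge artifacts.
          Volume->Settings.bOverride_RayTracingReflectionsMaxRoughness = true;
          Volume->Settings.RayTracingReflectionsMaxRoughness = 1.0f;
      }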

  • @JBBost · 4 months ago

    I like how they can't sell rich people houses unless every single one of the houses in that ugly clump of development has the same paper-thin veneer of reality as the people trying to buy property there.

  • @mugabsyll5155 · 29 days ago

    Lol, you guys are not the first. I've built a capital just for VR. I'm one developer with no billions

  • @Danuxsy · 6 months ago +1

    Couldn't you use neural nets to add detail to the images without using more polygons? In fact, why use polygons and traditional rendering at all? In theory a neurally driven image could have infinite resolution: the closer you get to something, the more detail is seen, as the neural net keeps generating what should be there.

    • @roquecepeda2932 · 6 months ago

      Probably, but that project started about 2 years ago, when the tech was pretty much unknown in the archviz industry. And you would probably need lots and lots of renderings of the original scenes.

    • @junkaccount7449 · 6 months ago +1

      Nanite works great, and NeRFs aren't mature enough for production applications like this. The super-resolution model you're describing doesn't exist yet for rendering realtime 3D scenes. It might in 5-10 years, but right now it's sci-fi lol

    • @joseuribe7415 · 6 months ago +3

      Good suggestion. We did explore it at the time, but we could not get the results we wanted to achieve. Remember that in VR with full 6DOF, we needed to make sure that even if you got super close to any object, you would still get a very realistic view. Neural nets are promising and we will continue our R&D on them, but it just wasn't the right fit for this project.