Live Facial Animation with Unreal Engine and Apple ARKit

  • Published Nov 4, 2024
  • This is a case-study breakdown of a recent project I did in collaboration with NYC’s Technodramatists. I was tasked with designing and creating a wireless, interactive AR system that allows a roving actor to perform a one-woman show, telling the story of Shakespeare’s Comedy of Errors with live facial motion capture and 3D animation.
    The system was driven by Epic Games’ Unreal Engine and built/hacked on top of the official Face AR Sample template, which does a lot of the heavy lifting connecting Apple’s ARKit face tracking to Unreal Engine blueprints. I wrote custom blueprint code specific to each mask to amplify some facial animation parameters and achieve the intended performance effects. Many thanks to the people at Epic whose work went into this core functionality before I adapted it to build some other stuff on top of it.
    For the full case study, check out:
    www.robertlest...
    For more live clips from the show check out:
    • Live AR Theater Sizzle
    Tools used:
    Epic Unreal Engine 4
    Maya
    Cinema4D
    RizomUV
    Turbosquid
    Polywink
    For more info:
    robertlestercreative.com
    technodramatists.com

COMMENTS • 19

  • @TorQueMoD
    @TorQueMoD 4 years ago +1

    I'd pay for a pre-built plugin that lets me use my XR to drive my characters. You should consider putting something like this together on the UE4 Marketplace.

    • @robertlestercreative
      @robertlestercreative  4 years ago

      You make a good point! I'd love to do something like that, but might need to collaborate with someone who has a stronger dev background to really make it something worth selling.

  • @mslitto9935
    @mslitto9935 2 years ago

    Hi, I am very impressed by your outcome, it looks very promising. One question: is it possible to animate 3 characters at the same time with one iPhone?

    • @robertlestercreative
      @robertlestercreative  2 years ago

      Yep! That's actually what's happening here. All the characters are receiving the same animation curves in real time to drive the performance; I am just switching the visibility of the meshes.

    • @mslitto9935
      @mslitto9935 2 years ago

      @@robertlestercreative Nice, so all 3 are active at the same time? Sounds promising. I use Live Link and have not yet tried sending it to more than one character.

    • @robertlestercreative
      @robertlestercreative  2 years ago

      @@mslitto9935 Yep! It was not too difficult to get it working with multiple characters. All are in the same level and running simultaneously.

  • @jrgen7527
    @jrgen7527 4 years ago

    Hey Robert! Very interesting use case. Thumbs up :)
    I was wondering, did you develop this on Windows? And if so, did you ever package the project? I'm having issues making ARKit connect to my build.

  • @ZooDinghy
    @ZooDinghy 3 years ago

    Wow, pretty amazing! How did you get the lip sync to be so good? Mine doesn't work at all. It's way too slow.

  • @freenomon2466
    @freenomon2466 4 years ago

    Hey, thanks for sharing! Any chance of you sharing or selling the project and the app? Or can it work with the facecap app? Looks awesome btw.

    • @robertlestercreative
      @robertlestercreative  4 years ago

      Thank you! Unfortunately I can't share the project files outright since it was commissioned work. I haven't checked out the facecap app yet, so can't say.

  • @arkadiyepanov
    @arkadiyepanov 4 years ago

    Hi! Can this work with multiple iPhone Xs at the same time? I need to capture 8 faces in real time.
    Many thanks!

    • @robertlestercreative
      @robertlestercreative  4 years ago

      I believe it can work with multiple sources, though I have not tried it personally.

    • @arkadiyepanov
      @arkadiyepanov 4 years ago

      @@robertlestercreative
      Thanks for the answer. I'll need to test it. I am in doubt about the part where the iPhone connects to Unreal. Will it be able to split input streams?

    • @robertlestercreative
      @robertlestercreative  4 years ago

      @@arkadiyepanov If I remember correctly, you can split the input streams, because you can essentially assign a unique address to each client connected via WiFi. I think the biggest challenge with doing 8 connections at once will be the bandwidth of your WiFi connection, which might create a lot of lag. I'm not really 100% sure though, since I have never tried it.

  • @JuanSanchez-xw2po
    @JuanSanchez-xw2po 4 years ago

    I'd like to know if it works with an iPhone 11??

    • @robertlestercreative
      @robertlestercreative  4 years ago +1

      Hi Juan, it should work with any iPhone after the iPhone X, so yes, an iPhone 11 would work in this system.

    • @JuanSanchez-xw2po
      @JuanSanchez-xw2po 4 years ago

      @@robertlestercreative Ok, thank you so much. We are trying to build the sample project that is in the Unreal documentation. But do you know how I can build the app? I don't know anything about the developer account.

    • @robertlestercreative
      @robertlestercreative  4 years ago

      @@JuanSanchez-xw2po You build the app directly to the phone from Unreal. The sample project contains 2 different levels: one for building the phone app, and one which runs on the computer after you have the phone app installed. FYI, you will need Xcode and an Apple developer account to set it up.

  • @fabianoperes2155
    @fabianoperes2155 2 years ago

    The mask poly count is TOO LOW.