AI Room Makeover: Reskinning 3D Scans with ControlNet, Stable Diffusion & EbSynth

  • Published 10 Jul 2024
  • ControlNet takes the chaos out of generative imagery and puts control back in the hands of the artist, or in this case, the interior designer. In this video, we'll take a deep dive into the process of reskinning a room with ControlNet and Stable Diffusion. I'll also share how I use EbSynth to interpolate the results into a temporally coherent final video. I hope this hacky but fun video2video AI workflow gives you inspiration to capture your own spaces and create your own reskinned masterpiece. Feels like we're getting a glimpse into AR experiences from the future :)
    0:00 My ControlNet Results & Overview
    0:26 Workflow Breakdown Begins!
    1:14 Why ControlNet Is Awesome & Everyone Can't Stop Talking About It
    1:40 Video2Video Workflow Overview
    2:06 #1: Create Input Video from 3D Scan
    2:55 #2: Generate Depth Maps using MiDaS
    3:30 #3: Dial In Look with ControlNet & SD 1.5
    4:50 #4: Interpolate Keyframes with EbSynth & AE
    5:36 #5: Mix ControlNet Methods for Best Results
    6:46 Key Takeaways & Questions for You!
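Step #2 in the chapter list above (generating depth maps with MiDaS and feeding them to ControlNet) ultimately comes down to normalizing MiDaS's raw float depth predictions into the 8-bit grayscale images that ControlNet's depth model expects. Here is a minimal sketch of that conversion; the `fake_depth` ramp is a stand-in for real MiDaS output, not data from the video:

```python
import numpy as np
from PIL import Image

def depth_to_controlnet_image(depth: np.ndarray) -> Image.Image:
    """Normalize a raw depth prediction (float32, arbitrary range)
    into the 8-bit grayscale image ControlNet's depth model expects."""
    d = depth.astype(np.float32)
    # Scale to [0, 1]; the small floor guards against a constant depth map.
    d = (d - d.min()) / max(float(d.max() - d.min()), 1e-8)
    return Image.fromarray((d * 255).astype(np.uint8), mode="L")

# Synthetic depth ramp standing in for a real MiDaS prediction:
fake_depth = np.linspace(0.0, 10.0, 256 * 256).reshape(256, 256)
img = depth_to_controlnet_image(fake_depth)
```

The resulting grayscale image can then be passed as the conditioning input to a ControlNet depth pipeline.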
    🛠️ Let's Connect: beacons.ai/billyfx
    📰 Creative Technology Digest Blogs: creativetechnologydigest.subs...
    🐤 Follow Me on Twitter for Experiments & Memes: / bilawalsidhu
    🎙 My New Podcast (2nd channel): / @thecreativetechnologist
    #AI #Futurism #NoCode #3D
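Step #4 of the workflow hinges on choosing a sparse set of stylized keyframes for EbSynth to propagate across the in-between frames. A minimal sketch of even-interval selection follows; the stride value is an illustrative assumption, not the exact spacing used in the video:

```python
def pick_keyframes(num_frames: int, stride: int = 20) -> list[int]:
    """Pick every `stride`-th frame as an EbSynth keyframe,
    always anchoring the final frame so the last span is covered."""
    if num_frames <= 0:
        return []
    keys = list(range(0, num_frames, stride))
    if keys[-1] != num_frames - 1:
        keys.append(num_frames - 1)
    return keys

# e.g. a 100-frame clip with stride 20:
print(pick_keyframes(100, 20))  # → [0, 20, 40, 60, 80, 99]
```

Each selected frame would be stylized with ControlNet + SD 1.5, and EbSynth then interpolates the style across the remaining frames, which is what keeps the output temporally coherent.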
  • Films & Animation

COMMENTS • 16

  • @JohnMcclaned
    @JohnMcclaned 1 year ago +3

    YouTube recommendations the goat for bringing me here

    • @bilawal
      @bilawal  1 year ago +2

      Haha so glad to hear it! Because AI is def a newish audience, and I've been wondering if I should start a new channel.

  • @wilsonfishbag
    @wilsonfishbag 1 year ago +3

    This is brilliant. Love the longer deep dive!!

    • @bilawal
      @bilawal  1 year ago +1

      Glad you found it helpful! I enjoyed making this too; could be a fun format to continue :)

  • @BigCaseadilla
    @BigCaseadilla 1 year ago +1

    Wow! Great video. Thanks!

    • @bilawal
      @bilawal  1 year ago +1

      Glad you found value in it :)

  • @kbssidhuex-ias7959
    @kbssidhuex-ias7959 1 year ago +2

    It’s both magical and nostalgic.

  • @pfuhad3760
    @pfuhad3760 1 year ago +1

    This is amazing work, Bilawal. It is amazing you are able to do this. This is awesome. I am following you on Twitter and saw your latest video. That was also awesome. Keep making videos.

    • @bilawal
      @bilawal  1 year ago +1

      Thanks for the support 🙏🏽

  • @ysy69
    @ysy69 1 year ago

    This is amazing. Thank you for making and sharing. Question: what about virtually dressing a house? Suppose you have pictures of the house empty and you want to use SD/ControlNet/Paint&Text2Image/MultiDiffusion Region Control to decorate it by specifying where each piece of furniture should be placed and rendered? Am I making sense? Like virtually dressing the house.

    • @alexlindgren1
      @alexlindgren1 8 months ago +1

      Want to know this. Got any answer?

    • @ysy69
      @ysy69 8 months ago

      @@alexlindgren1 never got an answer, but I recently found some software that does something along these lines

    • @ysy69
      @ysy69 8 months ago

      @@alexlindgren1 ua-cam.com/video/Hz3Jb2YvQ-k/v-deo.htmlsi=aC6mr55x4FnWEZIf

  • @alexlindgren1
    @alexlindgren1 8 months ago

    Very cool. I want to be able to take an initial image of a living room and restyle it, adding a specific sofa, for example. How would I do this? I want it to be as automated as possible, with no need for manual masking etc., while preserving the structure of the room, like where the doors are.

  • @LetitBrainLiberati
    @LetitBrainLiberati 1 year ago

    ⭐⭐⭐⭐⭐