LivePortrait: Add emotions to your still images by animating the face in ComfyUI - Stable Diffusion

  • Published 4 Sep 2024

COMMENTS • 14

  • @sunlightlove1
    @sunlightlove1 a month ago

    Always great contribution

  • @CGzzzzzzzzzzz
    @CGzzzzzzzzzzz 25 days ago

    Hi, I really like your videos. I am new to LivePortrait, and looking at the GitHub page there are many updates. So is this video a good start, or is it outdated? And will you make a new video on LivePortrait eventually? I appreciate your work 👍👍

    • @CodeCraftersCorner
      @CodeCraftersCorner  23 days ago +1

      Hello, this video is still good for version 1. There is a new version of LivePortrait now that may give better results.

  • @walidkh-sansfiltre
    @walidkh-sansfiltre a month ago

    Hello, great, thank you! How do I build a web app on top of ComfyUI LivePortrait?

    • @CodeCraftersCorner
      @CodeCraftersCorner  a month ago +1

      Hello, you can do the same as in my previous videos on the ComfyUI API. Save the workflow (JSON) file and load it in your app.
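      A minimal Python sketch of that idea, assuming a default local ComfyUI instance at 127.0.0.1:8188 and a workflow exported with ComfyUI's "Save (API Format)" option (the file name below is a placeholder):

      ```python
      # Queue a saved API-format workflow on a running ComfyUI server.
      import json
      import urllib.request

      COMFYUI_URL = "http://127.0.0.1:8188"  # default local ComfyUI address

      def queue_workflow(workflow_path: str) -> dict:
          """Send an API-format workflow JSON to ComfyUI's /prompt endpoint."""
          with open(workflow_path, "r", encoding="utf-8") as f:
              workflow = json.load(f)

          payload = json.dumps({"prompt": workflow}).encode("utf-8")
          request = urllib.request.Request(
              f"{COMFYUI_URL}/prompt",
              data=payload,
              headers={"Content-Type": "application/json"},
          )
          with urllib.request.urlopen(request) as response:
              return json.load(response)  # includes the prompt_id of the queued job

      if __name__ == "__main__":
          # "liveportrait_workflow.json" is a placeholder for your exported workflow.
          print(queue_workflow("liveportrait_workflow.json"))
      ```

      A web app would wrap this call behind its own endpoint and then fetch the finished output via ComfyUI's /history and /view endpoints.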

  • @yngeneer
    @yngeneer a month ago

    Did you, or do you plan to, take a look at implementing LivePortrait in a video-to-video workflow? Is it even possible?

    • @CodeCraftersCorner
      @CodeCraftersCorner  a month ago +1

      Hello @yngeneer, LivePortrait primarily works by animating a single image, so integrating it directly into a video-to-video workflow might be challenging at the moment. The main purpose is to use a video to drive the animation of the head. These projects are advancing really fast, so it may become possible in the near future. Motion capture is already widely used to animate 3D models in the film and gaming industries; soon it might be available for 2D video animation as well.

    • @yngeneer
      @yngeneer a month ago

      @@CodeCraftersCorner OK, thank you

    • @CodeCraftersCorner
      @CodeCraftersCorner  a month ago

      👍

  • @vickyrajeev9821
    @vickyrajeev9821 a month ago

    Thanks! Can I run it on CPU, because I don't have a GPU?

    • @CodeCraftersCorner
      @CodeCraftersCorner  a month ago

      Not sure! I checked the resources used during generation: for me, it took about 2 GB of VRAM and the CPU was at 100%. You can give it a try; it may work, although it will be slow. Alternatively, you can try the Hugging Face Space. It is free for now and should be fast.
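      A quick, hedged way to check what you have, assuming PyTorch is already installed (ComfyUI depends on it); ComfyUI's --cpu launch flag forces everything onto the CPU:

      ```python
      # Check for a CUDA GPU; if none is found, ComfyUI can still be
      # started in CPU-only mode, which is much slower but works.
      import torch

      if torch.cuda.is_available():
          name = torch.cuda.get_device_name(0)
          vram_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
          print(f"GPU available: {name} ({vram_gb:.1f} GB VRAM)")
      else:
          print("No CUDA GPU detected; try launching ComfyUI with: python main.py --cpu")
      ```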
