Get Better Inpainting & Outpainting Results With Fluxfill

  • Published 29 Nov 2024

COMMENTS • 49

  • @MonzonMedia
    @MonzonMedia  2 days ago +2

    What has been your experience so far with the FluxFill inpaint/outpaint model?

    • @ApexArtistX
      @ApexArtistX 2 days ago

      SDXL better 😂

    • @MonzonMedia
      @MonzonMedia  2 days ago +1

      @@ApexArtistX It's better for people with lower-end GPUs, and the ControlNets are better trained, but it's nice to have options for Flux. The Flux ControlNets are still new; they will get better.

    • @ShahriyarAli
      @ShahriyarAli 2 days ago

      Inpainting works great. Thanks for your workflows. However, for some reason, masking a subject and changing backgrounds with FLUX is weird. I think FLUX isn't ready for changing backgrounds yet.

    • @user-cz9bl6jp8b
      @user-cz9bl6jp8b 2 days ago

      For me, just like with the other workflows, it paints random noise when outpainting.
      Anyone know why this might happen? (See the outpainting sketch after this thread.)

    • @ShahriyarAli
      @ShahriyarAli 2 days ago

      @@user-cz9bl6jp8b It does paint my prompt, but the result is very unreal and fuzzy
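
For anyone hitting the random-noise outpainting issue above: one way to sanity-check the Fill model outside ComfyUI is diffusers' FluxFillPipeline, where outpainting is just inpainting with a padded image and a mask that is white over the new border. This is a rough sketch under those assumptions, not the video's workflow; the padding size, prompt, and sampler settings are illustrative.

```python
# Minimal outpainting sketch with diffusers' FluxFillPipeline (not the
# video's ComfyUI workflow). Assumes a GPU with enough VRAM for
# FLUX.1-Fill-dev; model ID and settings follow the official example.
import torch
from PIL import Image, ImageOps
from diffusers import FluxFillPipeline

pipe = FluxFillPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-Fill-dev", torch_dtype=torch.bfloat16
).to("cuda")

src = Image.open("input.png").convert("RGB")
pad = 256  # illustrative: extend the canvas by 256 px on every side
           # (keep the padded dimensions divisible by 16 for Flux)

# Pad the image with neutral grey and build a mask that is white (repaint)
# over the new border and black (keep) over the original pixels.
image = ImageOps.expand(src, border=pad, fill=(128, 128, 128))
mask = Image.new("L", image.size, 255)
mask.paste(Image.new("L", src.size, 0), (pad, pad))

result = pipe(
    prompt="a wooden table in a cozy, sunlit kitchen",  # describe the full scene
    image=image,
    mask_image=mask,
    height=image.height,
    width=image.width,
    guidance_scale=30.0,  # the Fill model is tuned for high guidance
    num_inference_steps=50,
).images[0]
result.save("outpainted.png")
```

If the new border still comes out as noise in a workflow, a mismatch between mask and image sizes, or a padded resolution far from what the model was trained near, are worth ruling out first.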

  • @edwardtse8631
    @edwardtse8631 2 days ago +2

    The context mask is gold. So many times I just need something small inpainted, and the inpaint fails to know what to do. This context mask is something I did not know was available. Thank you so much. (A sketch of the idea follows this thread.)

    • @MonzonMedia
      @MonzonMedia  2 days ago

      You’re welcome! Glad it was helpful 👍🏼

    • @ChrissyAiven
      @ChrissyAiven 2 days ago +1

      @@MonzonMedia Yes, thank you, it helped me a lot!

    • @MonzonMedia
      @MonzonMedia  2 days ago

      Great to hear! I appreciate the support!
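
Since a couple of commenters found the context mask new: the idea is to crop a region around a small mask so the model sees enough surrounding context, inpaint only that crop, and stitch the patch back into the full image. A minimal, framework-agnostic sketch with PIL; the `inpaint` callable is a hypothetical stand-in for whatever pipeline actually does the generation (e.g. a Flux Fill call), and the 128 px context margin is an illustrative default.

```python
# Crop-and-stitch inpainting sketch: hand the model local context around a
# small mask instead of the full image. `inpaint` is a hypothetical stand-in
# for your real pipeline.
from PIL import Image

def inpaint_with_context(image: Image.Image, mask: Image.Image,
                         inpaint, context: int = 128) -> Image.Image:
    box = mask.getbbox()  # bounding box of the non-zero (masked) pixels
    if box is None:
        return image  # nothing to inpaint

    # Grow the box by `context` pixels on each side, clamped to the image.
    l, t, r, b = box
    l, t = max(0, l - context), max(0, t - context)
    r, b = min(image.width, r + context), min(image.height, b + context)

    # Inpaint only the crop, then paste the patch back; passing the cropped
    # mask to paste() limits the change to the masked pixels.
    patch = inpaint(image.crop((l, t, r, b)), mask.crop((l, t, r, b)))
    out = image.copy()
    out.paste(patch, (l, t), mask.crop((l, t, r, b)))
    return out
```

This is presumably what the context mask in the workflow controls: how much of the surrounding image the model gets to look at while repainting a small area.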

  • @SouthbayJay_com
    @SouthbayJay_com 2 days ago +3

    Great video, buddy, thanks for sharing all the information! Also, congratulations on 30K, that's fantastic!! 🙌🙌🥳🥳

    • @MonzonMedia
      @MonzonMedia  2 days ago +2

      Appreciate it bro! Thanks for being part of the journey! 👊🏼🙌🏼

  • @RiftWarth
    @RiftWarth 2 days ago +2

    Thank you for this awesome tutorial and sharing your workflow. You rock!

    • @MonzonMedia
      @MonzonMedia  2 days ago

      You’re welcome! Enjoy and have fun!

  • @baheth3elmy16
    @baheth3elmy16 2 days ago +2

    Thank you very much!!!!

    • @MonzonMedia
      @MonzonMedia  2 days ago

      @@baheth3elmy16 you’re welcome very much! 😊

  • @Neotrixstdr
    @Neotrixstdr 2 days ago +2

    Thanks!

    • @MonzonMedia
      @MonzonMedia  2 days ago

      You’re welcome! Let me know if or how it goes 👍🏼

  • @contrarian8870
    @contrarian8870 2 days ago +2

    Thanks. What's the name of the node that shows CPU/GPU usage etc. in Comfy's top bar?

    • @MonzonMedia
      @MonzonMedia  2 days ago

      @@contrarian8870 It's called Crystools. Very handy to have. 👍🏼

  •  2 days ago

    Again very useful, thank you! May I ask, why is the mouse a graphic and not photorealistic, while the whole image around it is a photo? Is there any way to make the result more consistent, or is that just where Flux is at the moment?

    • @MonzonMedia
      @MonzonMedia  2 days ago +1

      Well, it's a very short and simple prompt and I was just demonstrating how inpainting works.

    •  2 days ago

      @MonzonMedia Thanks!

  • @ChrissyAiven
    @ChrissyAiven 1 day ago +1

    It works much better with the second approach; however, the results in the inpainted area are very blurry. Any idea? I paint very small areas, like fingers wrapping around a can that is already in the image.

    • @MonzonMedia
      @MonzonMedia  1 day ago +1

      Play around with the blur mask pixel value, and maybe the rescale algorithm and image size; however, for smaller details there is only so much you can do. (See the mask-feathering sketch after this thread.)

    • @ChrissyAiven
      @ChrissyAiven 1 day ago +1

      @@MonzonMedia Thank you :)
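
On the blur question above: a "blur mask pixels" style of setting is typically a Gaussian feather applied to the mask edge so the generated patch blends into the original, and growing (dilating) the mask slightly before blurring often helps on very small regions like fingers. A minimal sketch of both operations with PIL; the pixel values are illustrative, not the workflow's defaults.

```python
# Sketch of the two mask tweaks discussed above: grow (dilate) the white
# inpaint area a few pixels, then feather (blur) its edge so the patch
# blends smoothly instead of showing a hard seam.
from PIL import Image, ImageFilter

def prepare_mask(mask: Image.Image, grow_px: int = 8,
                 blur_px: int = 16) -> Image.Image:
    m = mask.convert("L")
    # MaxFilter dilates the white region; the kernel size must be odd.
    m = m.filter(ImageFilter.MaxFilter(2 * grow_px + 1))
    # GaussianBlur turns the hard mask edge into a soft falloff.
    return m.filter(ImageFilter.GaussianBlur(blur_px))
```

For tiny details, upscaling the cropped region before inpainting and downscaling the patch afterwards (presumably what the rescale suggestion refers to) usually matters more than the exact blur value.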

  • @jasoncow2307
    @jasoncow2307 1 day ago +1

    Is there a way Flux Fill can work with ControlNet? Something like when you are inpainting a model or a mouse with an expected pose.

    • @MonzonMedia
      @MonzonMedia  1 day ago

      Sure, I have a workflow that does that, but it's with SDXL. I will try to convert it to a Flux workflow and share it with you all when I do. 👍🏼

    • @jasoncow2307
      @jasoncow2307 10 hours ago

      @@MonzonMedia looking forward to it

  • @edwardtse8631
    @edwardtse8631 18 hours ago +1

    Is it possible to inpaint with LoRA?

    • @MonzonMedia
      @MonzonMedia  2 hours ago

      It doesn't work that way. For example, with SDXL there would sometimes be fine-tuned models trained just for inpainting, but not LoRAs. Maybe eventually.

  • @pablo.montero
    @pablo.montero 2 days ago +2

    I could not figure out how to combine Flux Fill with Flux Depth or Canny. Do you think this is possible?

    • @MonzonMedia
      @MonzonMedia  2 days ago +1

      What are you trying to do? For ControlNet you just use the regular Flux dev or schnell model.

    • @pablo.montero
      @pablo.montero 2 days ago +1

      @@MonzonMedia Imagine I want to inpaint something, but with the influence or conditioning of a depth map.

    • @raghurajraman
      @raghurajraman 2 days ago

      @@pablo.montero Correct. I would like Flux Fill to be conditioned with a ControlNet. Canny, Depth, and OpenPose... all would be good.

    • @MonzonMedia
      @MonzonMedia  2 days ago

      I see what you mean. I'll see if I can figure it out. It's definitely doable with SDXL; I haven't tried it with Flux, but I'd imagine it's a similar process. By the way, if you have a decent GPU, it's much easier to do this in Invoke AI. (A rough sketch follows this thread.)

    • @raghurajraman
      @raghurajraman 1 day ago +1

      @@MonzonMedia When I work with Flux Fill on humans, I see that the hands, feet, and sometimes the entire body get messed up. As the others have suggested, it would be good to be able to condition Flux Fill with a ControlNet, specifically DWPose. I also added a Power LoRA Loader to your workflow, which enhances it, especially for N**W stuff; you could do the same too. 😀
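
For the ControlNet-conditioned inpainting asked about in this thread and the one above: diffusers ships a FluxControlNetInpaintPipeline that pairs the regular FLUX.1-dev model with a ControlNet, which matches the reply above that ControlNet runs on the regular dev model rather than the Fill model. A hedged sketch assuming an InstantX canny checkpoint; the model IDs, images, and settings are illustrative, and your diffusers version must include this pipeline.

```python
# Sketch: inpainting steered by a ControlNet (canny here; depth or pose
# checkpoints plug in the same way). Requires a diffusers release that
# includes FluxControlNetInpaintPipeline; model IDs are illustrative.
import torch
from diffusers import FluxControlNetInpaintPipeline, FluxControlNetModel
from diffusers.utils import load_image

controlnet = FluxControlNetModel.from_pretrained(
    "InstantX/FLUX.1-dev-Controlnet-Canny", torch_dtype=torch.bfloat16
)
pipe = FluxControlNetInpaintPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    controlnet=controlnet,
    torch_dtype=torch.bfloat16,
).to("cuda")

image = load_image("scene.png")          # source image
mask = load_image("mask.png")            # white = area to repaint
control = load_image("canny_edges.png")  # edge map of the desired shape/pose

result = pipe(
    prompt="a mouse sitting upright on the table",
    image=image,
    mask_image=mask,
    control_image=control,
    controlnet_conditioning_scale=0.7,  # how strongly the edges steer it
    strength=0.9,                       # how much the masked area may change
    num_inference_steps=28,
    guidance_scale=3.5,
).images[0]
result.save("inpainted.png")
```

For the DWPose idea, the pattern is the same: swap in a pose ControlNet checkpoint and feed a pose map as control_image.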

  • @user-hi3ke6qh7q
    @user-hi3ke6qh7q 2 days ago +3

    Nice homage to that other guy who uses rodents in his vids.

    • @MonzonMedia
      @MonzonMedia  2 days ago +1

      Nerdy Rodent? hehehehe....didn't think of that actually 😊 🐭

    • @labmike3d
      @labmike3d 2 days ago +2

      @@MonzonMedia I'm pretty sure Nerdy would be psyched to see "his family" slowly navigating the underground YouTube tunnels. Now, the real question is: who's next? You know, rodents are pretty clever. 😎🐭

  • @noNumber2Sherlock
    @noNumber2Sherlock 1 day ago

    Hi, I'm trying to follow along with the wine and cheese example; however, the included workflow is completely different from yours. I'm totally confused.

    • @MonzonMedia
      @MonzonMedia  1 day ago +1

      The workflow is the same; the one at the beginning is just an exploded version. The one linked in the Google Drive is just cleaned up and organized.

    • @noNumber2Sherlock
      @noNumber2Sherlock 22 hours ago +1

      @@MonzonMedia Ah, I see. Well, it made me learn on my own that you can set the link render mode to straight and then use reroute nodes, because that was what confused me. Trying to duplicate your workflow as opposed to the linked one, I was like, what is that?
      Anyway, thank you. All your material is 100% unadulterated quality. I always check to see what you have for us because you have your finger on the pulse.

    • @MonzonMedia
      @MonzonMedia  22 hours ago +1

      @@noNumber2Sherlock No worries at all! In case you want the exploded version, here it is: drive.google.com/file/d/1AcSWxggnm97mc7bzRdPnkVpok8KyNVhR/view?usp=sharing There are many ways to route nodes; everyone has their own way that makes sense to them. I appreciate the kind words and support! 🙌

    • @noNumber2Sherlock
      @noNumber2Sherlock 19 hours ago +1

      @@MonzonMedia Dude, you are the gold standard! Also, that was unexpected and very kind of you. Thank you! I look forward to what you have next!