ComfyUI EP06: Using ControlNet in ComfyUI to Control Results as Desired [Stable Diffusion]

  • Published 25 Jan 2025

COMMENTS • 75

  • @Satscape
    @Satscape 1 year ago +21

    Keep them coming, these tutorials are way better than what is currently available.

    • @AIAngelGallery
      @AIAngelGallery  1 year ago +6

      Thanks a lot, I will keep making better and better tutorials

  • @naagutorres8080
    @naagutorres8080 1 year ago +5

    BEST TUTORIAL OF CONTROL NET BY FAR 1000/1000, THANKS A LOT!

  • @nitealth
    @nitealth 1 year ago +4

    How do you display the stepping preview under the KSampler?

  • @mior5882
    @mior5882 1 year ago +3

    Thank you for explaining it step by step; you made me really understand the workflow. Hoping you continue to make more videos about AI!

  • @FabiVFX
    @FabiVFX 11 months ago

    Great tutorial, I was struggling to understand how ControlNet works, but now it is very clear. Thanks for sharing your knowledge!

  • @OmerAbdalla
    @OmerAbdalla 1 year ago +1

    I stumbled upon your channel and found your tutorials to be very good at explaining different methods for accomplishing tasks in ComfyUI and why some are better than others. Keep up the good work.

  • @Scerritos
    @Scerritos 1 year ago +1

    You are the best at these, my man. Thank you

  • @MiraPloy
    @MiraPloy 1 year ago +2

    This is the best content for ComfyUI on the internet. Please keep it up.

  • @Willsing7
    @Willsing7 22 hours ago

    Great explanation! Thanks for sharing!

  • @AIAngelGallery
    @AIAngelGallery  1 year ago +3

    In the clip I suggested the wrong ControlNet Preprocessor node.
    This one is better (same principle):
    (WIP) ComfyUI's ControlNet Preprocessors auxiliary models
    github.com/Fannovel16/comfyui_controlnet_aux
    Thanks to @puoiripetere for the suggestion

    • @Archalternative
      @Archalternative 1 year ago

      Thank you for the wonderful guides, clear and comprehensive. Keep it up 🎉

    • @DealingWithAB
      @DealingWithAB 1 year ago

      Which ones do we use? I don't understand this at all.
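
Note: to clarify what a "preprocessor" node from comfyui_controlnet_aux produces: it turns a reference photo into the hint image (edge map, pose skeleton, depth map, ...) that the Apply ControlNet node consumes. Below is a minimal standalone sketch of a Canny-style edge preprocessor using OpenCV; the file names and thresholds are placeholders and the real node's implementation may differ.

    # Minimal sketch of a Canny-style ControlNet preprocessor (illustration only,
    # not the actual node code). Requires: pip install opencv-python
    import cv2

    reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder path
    if reference is None:
        raise FileNotFoundError("reference.jpg not found")
    edges = cv2.Canny(reference, 100, 200)  # low/high thresholds are placeholders
    cv2.imwrite("canny_hint.png", edges)    # this hint image feeds Apply ControlNet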

  • @zdvvisual
    @zdvvisual 1 year ago +1

    This very clearly explains several topics at once, thank you very much. Subscribed right away.

  • @अघोर
    @अघोर 1 year ago +1

    Thanks for your clear explanation. I was searching for this desperately. More tutorials on ComfyUI, please.

  • @petEdit-h9l
    @petEdit-h9l 10 months ago +1

    In the first example, what if I want to do image-to-image using ControlNet, e.g. to recreate the same or a similar image but with a different pose? How would I do that? Please help.

  • @Innomen
    @Innomen 1 year ago

    Thank you for finally showing me how to apply two nets. This is the foot in the door I needed.

  • @DarioToledo
    @DarioToledo 1 year ago +2

    Simple and effective, thanks. Question: could the detect_face parameter in the OpenPose preprocessor node be used as an approximate faceswap? I started thinking about this when you had to turn it off at 8:50.

  • @kwlxxi4813
    @kwlxxi4813 11 months ago

    Wow, so easy to understand, so straight forward. Thanks a lot!

  • @Abhilash-.
    @Abhilash-. 11 months ago

    Thank you, I was looking for a multi-ControlNet pipeline and this is really helpful.

  • @bordignonjunior
    @bordignonjunior 1 year ago +1

    Great video! Excellent explanation. Keep making videos like this.

  • @PostcardsFromJapan
    @PostcardsFromJapan 1 year ago +1

    Excellent tutorial. Easy to follow and understand. Thank you!

  • @zoches
    @zoches 6 months ago

    Brother, you helped me a lot. I just started and I already know most of the basics because of this video 🤩

  • @Settiis
    @Settiis 1 year ago

    I have the latest version but can't find the preprocessors.

  • @goldholder8131
    @goldholder8131 1 year ago +1

    You're the best, man. Thanks a ton.

  • @lilillllii246
    @lilillllii246 1 year ago

    I'm always thankful. I am a ComfyUI beginner. I can't get that skeleton shape from OpenPose. What should I do?

  • @cyberspider78910
    @cyberspider78910 8 months ago

    This is superb!! I want to give you 10 likes, actually, but YouTube limits it to one. Keep up the good work. Sir, can you let me know your hardware specifications for this type of work?

  • @ruby_moon
    @ruby_moon 1 year ago +1

    Amazing tutorial, thank you so much!

  • @mrtukk
    @mrtukk 1 year ago +2

    Here to pick up more knowledge again!

  • @fejesmarketing6736
    @fejesmarketing6736 11 months ago

    Absolutely awesome video! Thank you!

  • @ronsvfx5033
    @ronsvfx5033 8 months ago

    That was amazing!!! BIGLOVE!!

  • @L3X369
    @L3X369 5 months ago

    What are you using to show a preview of what's being processed (under the KSampler, for example)?

  • @Bikini_Beats
    @Bikini_Beats 1 year ago +1

    Amazing info my friend. subscribed. Thanks a lot.

  • @TheSORCERER-p9l
    @TheSORCERER-p9l 2 months ago

    Can you use this in a workflow where a face has already been generated, and then apply this workflow to it?

  • @sirjcbcbsh
    @sirjcbcbsh 5 months ago

    I'm always confused about how ControlNet works. Is it for generating a desired pose?

  • @rocren6246
    @rocren6246 5 months ago

    object of type 'ControlNet' has no len()
    I don't know which part I messed up. It just doesn't work.

  • @AndikaKamal
    @AndikaKamal 1 year ago

    I have been watching several ComfyUI guides... the process in the KSampler seemed so quick. I have an old late-2013 iMac (3.4 GHz quad-core Intel Core i5, NVIDIA GeForce GTX 775M 2 GB, 24 GB memory), but the KSampler runs so slowly... does anyone have a way to speed it up? I'd appreciate any solution 🙂

  • @ferniclestix
    @ferniclestix 1 year ago +1

    Oh, hi, nice work on the tutorial! :D

  • @rogerdupont8348
    @rogerdupont8348 1 year ago

    thanks, you helped me a lot :)

  • @SamhainBaucogna
    @SamhainBaucogna 1 year ago

    The best! 👏

  • @Genoik
    @Genoik 1 year ago

    Where is ControlNet Apply? I can't find it.

  • @wamz8096
    @wamz8096 1 year ago

    This is my favorite tutorial series for ComfyUI, but I can't follow the EP06 tutorial because the OpenPose Preprocessor does not detect the hands and face of the model. I use a different checkpoint to generate an anime-style image, but it can only detect the body pose. Could you help me out? Thanks.

    • @AIAngelGallery
      @AIAngelGallery  1 year ago

      Sometimes OpenPose does not work well with anime poses. You can use a realistic image as the reference and still generate an anime-style result.

  • @Yousouptar
    @Yousouptar 1 year ago +1

    Thank you!
    Could the next clip be about inpainting combined with ControlNet, teacher?

    • @AIAngelGallery
      @AIAngelGallery  1 year ago +1

      Next will be LoRA and the Detailers first.

    • @swsoundcheck570
      @swsoundcheck570 1 year ago

      Do you offer any courses?

  • @junejane6655
    @junejane6655 1 year ago

    How did you use multiple ControlNets? It's really hard!
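
Note: chaining multiple ControlNets, as shown in the video, means feeding the conditioning output of the first Apply ControlNet node into the second one before it reaches the KSampler. The Python sketch below builds a ComfyUI API-format prompt to show that graph shape; the node class names and input fields are written from memory, and the model/image file names are placeholders, so treat it as an illustration rather than a drop-in workflow.

    # Sketch of chaining two ControlNets in a ComfyUI API-format prompt.
    # Field names are assumptions from memory; file names are placeholders.
    import json

    prompt = {
        "1": {"class_type": "CheckpointLoaderSimple",
              "inputs": {"ckpt_name": "model.safetensors"}},
        "2": {"class_type": "CLIPTextEncode",
              "inputs": {"clip": ["1", 1], "text": "a person dancing"}},
        "3": {"class_type": "ControlNetLoader",
              "inputs": {"control_net_name": "openpose.safetensors"}},
        "4": {"class_type": "ControlNetLoader",
              "inputs": {"control_net_name": "depth.safetensors"}},
        "5": {"class_type": "LoadImage", "inputs": {"image": "pose_hint.png"}},
        "6": {"class_type": "LoadImage", "inputs": {"image": "depth_hint.png"}},
        # The first Apply ControlNet modifies the positive conditioning...
        "7": {"class_type": "ControlNetApply",
              "inputs": {"conditioning": ["2", 0], "control_net": ["3", 0],
                         "image": ["5", 0], "strength": 1.0}},
        # ...and the second takes node 7's output, so both controls are active.
        "8": {"class_type": "ControlNetApply",
              "inputs": {"conditioning": ["7", 0], "control_net": ["4", 0],
                         "image": ["6", 0], "strength": 0.6}},
        # A KSampler's "positive" input would then be wired to ["8", 0].
    }
    print(json.dumps(prompt, indent=2))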

  • @mmark-v3o
    @mmark-v3o 1 year ago

    May I ask: if my machine cannot be connected to the internet, how can I manually install custom nodes? I tried downloading the nodes package, but it did not take effect.

    • @AIAngelGallery
      @AIAngelGallery  1 year ago

      You can download the files and put them in the custom_nodes folder, but some custom nodes also need some additional Python library requirements to be installed.
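
Note: a minimal sketch of the manual step described above, assuming the custom node ships a requirements.txt (many do, but not all). The folder path is a placeholder for a typical portable install, and on the portable build you would run this with the embedded Python rather than a system interpreter.

    # Sketch: install a custom node's Python requirements with the interpreter
    # that runs ComfyUI. The path below is a placeholder example.
    import subprocess
    import sys

    node_dir = r"F:\SDXL\ComfyUI_windows_portable\ComfyUI\custom_nodes\some_custom_node"
    subprocess.run(
        [sys.executable, "-m", "pip", "install", "-r", node_dir + r"\requirements.txt"],
        check=True,
    )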

  • @Archalternative
    @Archalternative 1 year ago

    Hi, I noticed that if the ControlNet image is larger than the latent image, the image gets cut off... Is there a solution, like in AUTOMATIC1111, to prepare the image before it goes to the latent so that it is resized correctly?
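
Note: one common fix for the mismatch described above is to resize (or crop) the reference image to the same resolution as the empty latent before preprocessing; ComfyUI's image-scaling nodes can do this inside the graph. The standalone Pillow sketch below just shows the idea, with placeholder file names and an assumed 512x768 latent size.

    # Sketch: pre-resize the ControlNet reference to match the latent resolution.
    # The 512x768 size and file names are placeholders. Requires: pip install pillow
    from PIL import Image

    reference = Image.open("reference.jpg")
    hint = reference.resize((512, 768), Image.LANCZOS)  # match EmptyLatentImage size
    hint.save("reference_resized.png")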

  • @moonoikikapoo
    @moonoikikapoo 1 year ago +1

    1000+++/1000 Excellent tutorial, thank you a lot. Really great, I like it a lot. Subscribed!

  • @Archalternative
    @Archalternative 1 year ago

    Hi, nice video. Isn't it preferable to install the auxiliary version of the ControlNet preprocessors?

    • @AIAngelGallery
      @AIAngelGallery  1 year ago +1

      Oh, you're right! I wasn't aware of that one.

  • @marc1137
    @marc1137 1 year ago

    Hi, I installed the WAS utilities and this OpenPose setup, all working fine... then I installed the ReActor face-swap node, but now I cannot see the earlier stuff I installed, even though it says installed... Any reason? Thanks... I see there are a lot of "cannot import" errors... 'cv2.gapi.wip.draw' has no attribute 'Text'... right after installing the ReActor face swap.
    Actually, if I just move the ReActor folder out of custom_nodes, everything works fine again... the ReActor folder is a bit of a bomb for the other folders around it.

  • @rexforwood
    @rexforwood 1 year ago

    Excellent tutorial, thank you. I am getting an error; can you suggest a resolution please?
    Error occurred when executing ControlNetLoader:
    module 'comfy.sd' has no attribute 'ModelPatcher'
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI
    odes.py", line 577, in load_controlnet
    controlnet = comfy.controlnet.load_controlnet(controlnet_path)
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI\comfy\controlnet.py", line 394, in load_controlnet
    control = ControlNet(control_model, global_average_pooling=global_average_pooling)
    File "F:\SDXL\ComfyUI_windows_portable\ComfyUI\custom_nodes\AIT\AITemplate\AITemplate.py", line 347, in __init__
    self.control_model_wrapped = comfy.sd.ModelPatcher(self.control_model, load_device=comfy.model_management.get_torch_device(), offload_device=comfy.model_management.unet_offload_device())

    • @AIAngelGallery
      @AIAngelGallery  1 year ago +1

      Please try to update both ComfyUI and the custom nodes to the latest version.

  • @thanagridseesuae
    @thanagridseesuae 9 months ago

    Can you do this in SD?

  • @diveneaesthetics
    @diveneaesthetics 5 months ago

    Please come back, you haven't posted in a long time, bro.

    • @AIAngelGallery
      @AIAngelGallery  5 months ago

      I will be back soon. Thanks for thinking of me.

  • @chauhandivyansh9416
    @chauhandivyansh9416 1 year ago

    Use SAM detection for better inpainting.

  • @hainguyenhoang11
    @hainguyenhoang11 4 months ago

    Thanks a lot.

  • @DvirElisha
    @DvirElisha 1 year ago

    You are very good. Can you do a tutorial on outpainting?

  • @salomevsn3724
    @salomevsn3724 9 months ago

    thank you so much

  • @Xavi-Tenis
    @Xavi-Tenis 8 months ago

    thanks!

  • @morfest
    @morfest 1 year ago

    Thank you very much.

  • @alphaz35802
    @alphaz35802 1 year ago

    Thank you very much, teacher, for the knowledge you keep sharing.

  • @yiluwididreaming6732
    @yiluwididreaming6732 1 year ago

    Wonderful!!! Thank you!!! For sharing your knowledge and workflows!!!! SUBBED!!!!

  • @relaxingparadise2033
    @relaxingparadise2033 1 year ago

    When I attempt to use your method, the face in the photo transforms into another person. How can I capture the same face with different angles and expressions?

    • @AIAngelGallery
      @AIAngelGallery  1 year ago

      You have to use a LoRA to help with that. Please watch EP07 first.

  • @icejust9195
    @icejust9195 10 months ago

    How can I turn on the progress image on the KSampler node (box)?