"Hey, thanks a lot for Creating this tutorial! Really appreciate your effort, you're awesome!👍"
Outstanding presentation. Thank you. 😀
Thank you for your super detailed videos!
Glad you like them!
Nice tutorial for a fairly complicated installation. Thx :)
Glad it helped
You are a hero, dude! THANK YOU so much, I would never have been able to do this without your help! You are awesome! I hope you are doing well, you've earned my subscription! You explained everything so perfectly and went through so many things that would have tripped me up! One issue I had was where to put the PuLID model and the UNet loader model in ComfyUI?
Thank you for watching and the sub!
The models go inside the ComfyUI > models folder.
There, create a new folder and name it pulid. Place the PuLID ip-adapter model in it.
The base model goes inside the checkpoints folder. Be sure to use an SDXL checkpoint, as PuLID only works with SDXL, not SD1.5 models.
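A quick sketch of the folder layout described above (assuming a default ComfyUI install; the model filenames in the comments are examples, not exact names):

```shell
# Create the folders the models go into
# (run this from the directory that contains the ComfyUI folder).
mkdir -p ComfyUI/models/pulid        # PuLID ip-adapter model goes here
mkdir -p ComfyUI/models/checkpoints  # SDXL base checkpoint goes here

# After downloading, the tree should look roughly like:
#   ComfyUI/models/pulid/ip-adapter_pulid_sdxl.safetensors
#   ComfyUI/models/checkpoints/juggernautXL_lightning.safetensors
ls ComfyUI/models
```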
@@CodeCraftersCorner Thank you so much, I finally fixed it! I posted your video in Latent Vision's YouTube comment section so the people there could easily follow his tutorial. You are the best, man!!!!!
Thanks for the update! Glad it worked for you, and I appreciate you sharing the video in the comments.
thank you
I will watch the useful video and learn from it.
I will support you~
Thank you too
THANK YOU!
Thanks! Keep it up! 💪
Thank you!
thank you
thank you sir
Most welcome
Thank you🙂
Glad it was helpful!
Bro, I can't find this model: juggernautXL_4_Steps_Lightning.safetensors?
Hello, you should search for Juggernaut XL on the civitai website. The model ID is 133005. Once the page loads, just below the name, there are different versions you can choose from. Download the one that says: V9+RDPhoto2-Lightning_4S
Could someone help me? I get this error after installing PuLID from Manager and I don't know how to solve it: "Cannot import F:\Comfyui\ComfyUI\custom_nodes\PuLID_ComfyUI module for custom nodes: Unable to import dependency onnxruntime."
Can you try to open the terminal in your ComfyUI folder: F:\Comfyui
Type in this command: python_embeded\python.exe -m pip install onnxruntime
Press enter.
See if it installs, and then start ComfyUI.
Hii! Thank you so much for the tutorial! For some reason I still have 4 missing nodes when loading a pipeline (missing pulidInsightFace, PulidModelLoader, PulidEvaClip, Apply Pulid). When I try pip install insightface, it says it's already installed... I assume it's not about individual components, I can't find the solution.
Can you try updating ComfyUI and the PuLID custom node manually? In your terminal, go to the ComfyUI folder and type "git pull". Then go to the custom nodes folder > pulid > type "git pull". Then try again.
@@CodeCraftersCorner thank you so much!
How do I install it if I don't have the portable version????
Hello, most of the steps will be the same. You will have to replace part of the command with your Python environment. Every time I said "python_embeded\python.exe", use your virtual environment or conda environment's Python executable instead to run the command. Otherwise, it will be the same.
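The mapping for non-portable installs looks like this (a sketch; "venv" is just an assumed environment name, use whatever yours is called):

```shell
# Portable build (Windows) uses the bundled interpreter:
#   python_embeded\python.exe -m pip install onnxruntime
# With a manual install, call your own environment's Python instead.

python3 -m venv venv              # create a virtual environment once (if needed)
venv/bin/python -m pip --version  # sanity check: pip runs from inside the venv

# Then install dependencies with that interpreter, e.g.:
#   venv/bin/python -m pip install onnxruntime
# Conda users: "conda activate myenv", then plain "python -m pip install ..."
```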
Thank you for the amazing tutorials, liked, subbed and notifications on! Can you please tell me what the problem could be? I installed the PuLID node but it says IMPORT FAILED, and that's the case for many other nodes. My ComfyUI is Pinokio-integrated, I always keep it updated, and I tried the "TRY FIX" option. Also, for imported nodes, when I try to generate it says: Numpy error.
Hello, can you check your terminal and see if you have any "module not found" errors? Usually, the import will fail if you are missing some dependencies. A missing "insightface" module is common, or it can be the "filterpy" module.
@@CodeCraftersCorner thanks, I fixed the issue by deleting everything and reinstalling it.
Glad you got it to work.
The facexlib error never goes away, I tried everything.
Sorry to hear that! I understand it can be really frustrating. As a last resort, you might consider trying Python 3.10. However, this can lead to more complications with future updates.
@@CodeCraftersCorner Yes, I did. I was using Python 3.12, then I had to downgrade.
@ShubzGhuman I see.
what about live portrait, isn't it better?
Yes, it's far better
I'll say better
@@CodeCraftersCorner PuLID or LivePortrait???
The Live Portrait mix node one is superior
They are for two different purposes though. LivePortrait will animate a static portrait while PuLID will create a new image with the face.
ValueError: Query/Key/Value should either all have the same dtype, or (in the quantized case) Key/Value should have dtype torch.int32
query.dtype: torch.float32
key.dtype : torch.float16
value.dtype: torch.float16
🤔
You can try running ComfyUI with --force-fp16. Add it to the run_nvidia_gpu.bat (batch) file. It may or may not work, though!
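For reference, that change is a one-line edit to the batch file. Assuming the default portable launch line (yours may differ slightly), run_nvidia_gpu.bat would look like:

```bat
REM run_nvidia_gpu.bat -- append --force-fp16 to the existing launch arguments
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --force-fp16
pause
```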