What about negative prompts?
Negative prompts are not built into the Flux model. Wait for the refined models; they will most likely include some of that.
I don't understand what you did at the 05:45 mark, because you are using the same model and the same config and it took less time to load. What exactly did you do there?
I had to go back and take a look. The reason it was faster the second time is that the model was already loaded into memory. The first time you open the interface and do a generation, it has to load the model into memory, which takes a little extra time. But once it's loaded, each render after that takes less time. Also, if you watch, I reduce the number of steps for the generation, which also decreases the amount of time it takes to generate.
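A minimal sketch of where that time goes, using the diffusers FluxPipeline as a stand-in for what Forge does internally (the model name, prompt, and step counts below are just illustrative): the model load is a one-time cost, and each render after that only pays for the sampling steps.

import time
import torch
from diffusers import FluxPipeline

t0 = time.time()
pipe = FluxPipeline.from_pretrained(           # one-time load of the model into memory
    "black-forest-labs/FLUX.1-schnell", torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()                # helps fit Flux on cards with less VRAM
print(f"model load: {time.time() - t0:.1f}s")  # paid once, not on every render

for steps in (4, 2):                           # fewer steps -> shorter render
    t0 = time.time()
    pipe("a lighthouse at sunset", height=1024, width=1024,
         num_inference_steps=steps, guidance_scale=0.0)
    print(f"{steps} steps: {time.time() - t0:.1f}s")

Forge folds that load into the first generation, which is why the first run looks slower than the ones after it.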
My SDXL LoRA doesn't work; it only works on my previous Forge.
Yeah, with the new update they're dealing with certain issues that they're gradually working through and fixing.
@@AIchemywithXerophayze-jt1gg Thank you for the reply. I noticed I had to change Diffusion with Low Bits and it works well! BTW, what's a good upscaler to use for Flux in Forge? I always get blurry results when I upscale.
Img2img, DreamshaperXL_inpainting, denoise set between 0.25 and 0.30, ControlNet tile, Ultimate SD Upscale.
I probably have a video on how to do this somewhere.
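In case it helps anyone reading along, here's a rough sketch of just the low-denoise img2img pass in diffusers (the base SDXL checkpoint, file names, and prompt are placeholders; ControlNet tile and Ultimate SD Upscale are webui extensions and aren't reproduced here): upscale first, then let img2img re-add detail at low strength so the composition stays put.

import torch
from PIL import Image
from diffusers import StableDiffusionXLImg2ImgPipeline

pipe = StableDiffusionXLImg2ImgPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16
).to("cuda")

low_res = Image.open("flux_output.png")        # the blurry upscale candidate
upscaled = low_res.resize((low_res.width * 2, low_res.height * 2), Image.LANCZOS)

# Strength around 0.25-0.30 keeps the composition and only re-adds fine detail.
result = pipe(prompt="same prompt used for the original generation",
              image=upscaled, strength=0.28).images[0]
result.save("upscaled_sharp.png")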
Why does no one make higher-resolution images? What is the maximum resolution, and how long does it take?
This is a great question, and it depends a lot on the hardware of your computer. The resolution is limited by the AI model and what kind of images it was trained on.
Flux has a range of resolutions it's capable of handling. Somebody just posted the table on our Discord, but at a one-to-one ratio it can generate images up to 1440x1440 before you start getting image distortions. You can't necessarily generate an image larger than what the AI was trained on, because then the image starts getting distorted. So what you do is generate the image at a resolution it can handle, and then upscale it through various techniques.
As for how long it takes, again that just depends on the hardware in your computer. My computer has an RTX 3080 Ti with 12 GB of VRAM, 128 GB of regular RAM, and a 10th-gen Core i9 processor with solid-state drives. For me, Flux will generate a 1024x1024 image in about 30 seconds.
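To make the "generate at what it can handle, then upscale" point concrete, here's a small illustrative helper (the 1440 cap comes from the Discord table mentioned above; rounding to a multiple of 64 is just a common UI convention, not something Flux strictly requires): clamp the requested size to the trained range, render there, and upscale afterwards.

def clamp_resolution(width, height, max_side=1440, step=64):
    # Scale the request down so the longest side stays within the trained range,
    # then round each side down to the nearest multiple of `step`.
    scale = min(1.0, max_side / max(width, height))
    w = int(width * scale) // step * step
    h = int(height * scale) // step * step
    return w, h

print(clamp_resolution(2048, 2048))  # -> (1408, 1408): generate here, then upscale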
@@AIchemywithXerophayze-jt1gg I have a 3060 12 GB, but only 16 GB of RAM. Is having more RAM helpful? I sometimes get these jitters until it's finished rendering.
Wow! Forge updated? The developers wrote a couple of months ago that support was ending. Can I update an old Forge release by using the update file, or do I need to download and reinstall it from another development branch?
You can try to update your existing installation; I don't think that would be a problem. Maybe make a backup of it first. Personally, I used Stability Matrix to do the install and updates.
Great tutorial, but are you using the PINOKIO version? Thanks.
No, all of this is being done through Stability Matrix.
I think a lot of people are trying to get it working in Forge, but for most it's not there yet. All we get is an "image cannot be iterated" error; hopefully everything comes together in the next few days.
I got the same thing, where it was saying the image could not be iterated. Make sure you're using the right model, and just leave the settings on default. I don't believe it has anything to do with the amount of VRAM you have; typically it's because there's a setting that just isn't compatible with the model you're using.
Do these Flux models work with ControlNet?
They are working on that. I believe there's one out now that works in Comfy.
@@AIchemywithXerophayze-jt1gg hope you make a video on that when it comes out 😄
Still can't get it to work in my Forge, but it runs well in Comfy. I don't know why.
Sorry about that. Seems like everybody's system is a little bit different.
@@AIchemywithXerophayze-jt1gg A few people still get this "TypeError: 'NoneType' object is not iterable" message.
Are Flux LoRAs working with Forge yet, or not?
Yes, there are a few out now that seem to work.
@@AIchemywithXerophayze-jt1gg Which ones? None of them are working for me.
Hi all, can I use an AMD GPU with Forge?
Going to be honest, I'm not sure. I know there are ways to use Stable Diffusion with AMD video cards; you may need to check their GitHub discussion forums to find out if Forge supports them.
Is there a how-to for downloading and running Forge with Flux? I just installed ComfyUI, but I'm finding the overall workflow clunky and unintuitive. Forge looks way more my cup of tea.
Yeah, in my previous video I go through using Stability Matrix to install it and get it up and running.
ua-cam.com/video/6cuE1WZpzWQ/v-deo.html
discord link broken
discord.com/invite/b9qsf9Tz
BTW, I couldn't get the Schnell model that was set up to work in ComfyUI to work. (It came from Hugging Face, I believe.)
The ones I listed in the video do work. I've tested them on my system.
@@AIchemywithXerophayze-jt1gg I didn't like having to log in though.
👋
✌️
Can I have an invite to the Discord? Love your work and need some help creating characters for my platform release :)
If you can, email me at my email address listed on the channel.