How to AI Upscale with ControlNet Tiles - High Resolution for Everyone!

  • Published 19 Jan 2025

COMMENTS • 520

  • @sebastiankamph
    @sebastiankamph  Рік тому +19

    Download Prompt styles: www.patreon.com/posts/sebs-hilis-79649068
    Please support me on Patreon for early access videos. It will also help me keep creating these guides: www.patreon.com/sebastiankamph

    • @canaldetestes4517
      @canaldetestes4517 Рік тому +2

      @@interesant7 Hi, I've been an advertising man since 1975, so we deal with rights for images, photos, pictures, and designs all the time. People need to understand that if you put your work on the Internet, this will be trouble. When someone takes your photo or your drawing, copies it 100%, and sells it, that is a crime; but when you take some work from the internet and use it to learn, that is not a crime, and we have all done this since 1992 when the internet started. Just think about it and you will see. Also, when you use more than one article to write your own, that is not a crime, because that is research; if you read about rights you will see there is a lot more involved than people are saying. Artists want to show their work to get applause and then start to complain; that is the problem. Everything we see these days was created from something, and it is a "pétrea" (set-in-stone) rule that you don't take milk from a stone.

    • @AnimalsSlaughterButchery
      @AnimalsSlaughterButchery Рік тому

      The link doesn't open, is it exclusive to specific countries?

    • @zoloradomester
      @zoloradomester 4 місяці тому

      It is not free, you have to join to download...

  • @AscendantStoic
    @AscendantStoic Рік тому +172

    A few things:
    1. (10:22) If you want an upscaler that adds details to skin, try 4x_NMKD-Siax_200k or 4x_NickelbackFS_72000_G. When I upscaled using those two, the person's skin went from plastic smooth to highly realistic (with blemishes, veins, wrinkles, pores, beauty marks... it seems these upscalers were trained on photos specifically).
    2. (5:00) When I use the Seams Fix option in Ultimate Upscale I get 2 results, one before the seams fix and one after, and the "after" one barely has any noticeable seams whatsoever, so maybe try checking the "seams fix" option. Either way, it's still miles ahead of the original SD Upscale, which suffered from ghosting issues between squares.

    • @sebastiankamph
      @sebastiankamph  Рік тому +14

      Got to try those upscaler models, thank you for the tip!

    • @lollihonk
      @lollihonk Рік тому +2

      Thanks 🙏

    • @rproctor83
      @rproctor83 Рік тому +1

      I love nmkd, but it seems to add some kind of green tinting to the images. I've not heard of nickelback upscale, will check it out. thx!

    • @BesniaDarvar
      @BesniaDarvar Рік тому +4

      Hello, where can I download the 4x_NickelbackFS_72000_G ?

    • @DanielPartzsch
      @DanielPartzsch Рік тому

      Thanks! Can you please let us know where best to download them?

  • @mariokotlar303
    @mariokotlar303 Рік тому +10

    Thanks for speaking up about your experience of being unable to get rid of seams. I've had exactly the same experience myself, great to know it's not just me. Big props for using photoshop to clearly demonstrate the seams and compare results of different seams-fix modes!

    • @sebastiankamph
      @sebastiankamph  Рік тому +2

      Thank you for the support, Mario. Greatly appreciated 😊💫

    • @anushdsouza9632
      @anushdsouza9632 11 місяців тому

      I'm getting low-opacity faces on some tiles of the image, how do I fix it? @@sebastiankamph

  • @speculaBOND
    @speculaBOND Рік тому +65

    2:04 leave everything default, except denoising at 0.15 for the first pass (you could go higher on a second pass)
    2:32 check Enable for ControlNet, re-copy the image down below, preprocessor - tile_resample
    2:57 model - control_v11f1e_sd15_tile [a371b31b]
    3:37 leave default, except control mode - ControlNet is more important
    3:50 for the script - 'Ultimate SD upscale', target size type - 'Scale from image size' x2, upscaler - '4x UltraSharp'

    • @anushdsouza9632
      @anushdsouza9632 11 місяців тому

      I'm getting low-opacity faces on some tiles of the image, how do I fix it?
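
For anyone who wants to script this recipe rather than click through the UI, here is a minimal sketch of how the settings @speculaBOND lists above might map onto AUTOMATIC1111's img2img API with the ControlNet extension (webui started with --api). The ControlNet unit's key names differ between extension versions, and the Ultimate SD Upscale script takes a positional script_args list whose exact order is version-dependent, so treat this as an assumption-laden outline rather than a drop-in script:

```python
# Hedged sketch: img2img + ControlNet tile through the AUTOMATIC1111 web API.
# Assumes the webui is running locally with --api and the sd-webui-controlnet
# extension installed. Key names for the ControlNet unit differ between
# extension versions (older builds use "input_image" instead of "image"),
# so check the /docs page of your own install before relying on this.
import base64
import requests

URL = "http://127.0.0.1:7860"

def b64(path: str) -> str:
    """Read an image file and return it as a base64 string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

source = b64("input_1024.png")   # placeholder file name

payload = {
    "init_images": [source],
    "denoising_strength": 0.15,        # low denoise, as in the video's first pass
    "prompt": "",                      # several commenters drop the prompt entirely
    "sampler_name": "Euler a",
    "steps": 20,
    "cfg_scale": 7,
    "alwayson_scripts": {
        "controlnet": {
            "args": [{
                "image": source,       # passed explicitly; the UI can leave it blank
                "module": "tile_resample",
                "model": "control_v11f1e_sd15_tile [a371b31b]",
                "weight": 1.0,
                "control_mode": "ControlNet is more important",
            }]
        }
    },
    # The Ultimate SD Upscale script would be selected via "script_name" plus a
    # positional "script_args" list (target scale, tile size, seams fix,
    # upscaler); its exact order is version-dependent, so it is omitted here.
}

result = requests.post(f"{URL}/sdapi/v1/img2img", json=payload).json()
print(len(result.get("images", [])), "image(s) returned")
```

If your ControlNet build rejects the string value, an integer control_mode of 2 has corresponded to "ControlNet is more important" in the versions I've seen.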

  • @aliceleblanc7318
    @aliceleblanc7318 Рік тому +11

    You know what? I really like how you take time to compare the before and after results of the old man's 2k and 4k versions, switching several times between them. Sounds like a small thing, but that's what most youtubers I've seen don't pay attention to, making it more of an effort for their viewers to follow, and thus missing an opportunity to get their point across.
    ...Aside from that, the workflow you pieced together is awesome. THANK YOU! :D

    • @sebastiankamph
      @sebastiankamph  Рік тому +2

      Thank you for the kind words and your thoughtful response. It means a lot 😊

  • @luizalexandre9720
    @luizalexandre9720 Рік тому +3

    Greetings from Brazil. One of the best videos I've watched these last few days. It helped me a lot here.

  • @kgsz
    @kgsz Рік тому +10

    I love the way you explain things. You're truly a gifted educator.

  • @theresalwaysanotherway3996
    @theresalwaysanotherway3996 Рік тому +36

    Quick tip for speed: you can turn up the tile width to generate the upscale faster, as it requires fewer total tiles and so the total time spent waiting between tiles is reduced. It also allows ControlNet to retain more context from the image when upscaling, theoretically resulting in a better picture. 768x768 is the obvious option to choose, but even higher is an option.

    • @sebastiankamph
      @sebastiankamph  Рік тому +11

      I had a think about this. For the 768 models, surely. But for the ones trained on 512, I assume the image quality from running at 512 must be worth more than the speed gains? Either way, interesting thought.

    • @Steamrick
      @Steamrick Рік тому +3

      @@sebastiankamph I've always found that even 512 models do fine with higher resolutions once they have a framework to work with.

    • @chyrek.ambient
      @chyrek.ambient Рік тому

      Nobody can masturbate that quickly!

    • @morphles
      @morphles Рік тому +2

      @@sebastiankamph The 512 models being limited to 512 is sorta a legend at this point imo. Some of the community models can do ~1024x800 very decently. crystalClearV1 (note v1; v2 seems to have screwed things up) is a very nice model for that. I'm pretty sure you can gen 768x768 with it with no problems. Though there is the usual caveat that aspect ratio is very important depending on the prompt.

    • @fixelheimer3726
      @fixelheimer3726 Рік тому +7

      I've tried with 512, 768, 1024 and 2048. You get fewer mutations and stuff with a higher tile size; whether you also lose quality or depth, I'm unsure. One thing is for sure: regarding end quality, going 2x multiple times is better than, for example, 4x in a single step.
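
To put rough numbers on the speed point raised in this thread: ignoring the padding and overlap the script adds, the tile count grows with the square of the image-to-tile ratio, so going from 512 to 768 tiles at 4096x4096 cuts the number of diffusion passes roughly in half. A quick back-of-the-envelope helper (mine, not from the video):

```python
# Rough tile-count estimate; real scripts add padding/overlap, so actual
# counts run a bit higher, but the scaling argument is the same.
from math import ceil

def tile_count(width: int, height: int, tile: int) -> int:
    return ceil(width / tile) * ceil(height / tile)

for tile in (512, 768, 1024):
    print(f"4096x4096 at tile {tile}: {tile_count(4096, 4096, tile)} tiles")
# 512 -> 64 tiles, 768 -> 36 tiles, 1024 -> 16 tiles
```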

  • @Deefail
    @Deefail 10 місяців тому

    The best tutorial I've watched. And I'm not even talking about the actual golden information, just the way you do tutorials is great. Subscribed

  • @unheilbargut
    @unheilbargut Рік тому +16

    Oh, this is a godsend. I'm working on a children's book with a lot of photorealistic images at the moment, using a combined approach of old-school Photoshop work and Stable Diffusion, and I struggle a bit with the upscaling of some images. I'll spend the rest of the night using this approach and think it will solve my issues. Thanks for the great video!

    • @mrzackcole
      @mrzackcole Рік тому +1

      I'm a composite artist doing the same kind of work. Would love to follow your journey! Are you sharing your work anywhere?

  • @sevret313
    @sevret313 Рік тому +97

    You do not need to copy the image to ControlNet when using img2img; if you leave it blank, it will just use the image you're already using in img2img.

    • @parsley8188
      @parsley8188 Рік тому +4

      thx

    • @parsley8188
      @parsley8188 Рік тому +7

      good advice

    • @maxmustermann3938
      @maxmustermann3938 Рік тому +8

      I think he also forgot to replace the ControlNet image with the upscaled ones, so he has been using the low-res 512*512 image in ControlNet the entire time.

    • @ryanhart9391
      @ryanhart9391 Рік тому

      Ok, thanks! Whew! I'm glad that's the case. I was already starting to get annoyed with that step.

    • @androidemulation5952
      @androidemulation5952 Рік тому

      ​@@maxmustermann3938nice catch. hopefully he see this.

  • @hermitcleric
    @hermitcleric Рік тому +3

    Firstly, thanks a lot for the video, it was my starting point. So, I experimented a lot with this and found out that the reason you might be getting less detail and smoothed-out skin is that while you're setting a low denoise of 0.15, and therefore very low freedom to create content, you're using the sampler Euler a, which generates a lot of chaos. What I'm doing is setting denoise to 0.4 and the sampler to DPM++ 2M Karras, but to compensate for the extra freedom I'm giving the machine I crank up ControlNet's "Control Weight" to 2. I've achieved great results. You can also experiment with the CFG scale between 7 (default) and 14 if you want to force the machine to create certain details. Oh yeah, and DEFINITELY use Deliberate or Reliberate, guys. These checkpoints are game-changing.
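
For reference, the recipe above written out as plain values; the names are only descriptive labels (not webui or API field names), and the numbers are the ones @hermitcleric quotes:

```python
# The recipe above as plain values; labels are descriptive only.
hermitcleric_recipe = {
    "denoising_strength": 0.4,          # up from the video's 0.15
    "sampler": "DPM++ 2M Karras",       # instead of Euler a
    "controlnet_weight": 2.0,           # reins the extra freedom back in
    "cfg_scale_range": (7, 14),         # experiment within this range
    "checkpoint": "Deliberate / Reliberate",
}
print(hermitcleric_recipe)
```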

  • @Zeetkt
    @Zeetkt Рік тому +23

    You can also use multidiffusion, it does basically the same thing automatically, and it actually ADDS detail instead of removing it.

    • @trsd8640
      @trsd8640 Рік тому +3

      @sebastian: Please add an upscaling tutorial with MultiDiffusion! The results seem to be better than with tiles. But my Mac runs out of memory at higher resolutions.

  • @jossipepegonzalez
    @jossipepegonzalez Рік тому

    Dude, thank you so much for your content. It is fantastic. You have completely changed my experience of design and photomanipulation. Can't wait to see what you talk about next!

  • @Herman_HMS
    @Herman_HMS Рік тому +2

    Smoothing also comes from a low number of steps. Since denoise strength lowers them, set steps to at least 100+; then the steps on each tile will be 10+ with denoising around 0.1. It will give images more detail and texture.
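
The arithmetic behind this tip, assuming AUTOMATIC1111's default behaviour where img2img runs roughly steps × denoising strength sampling steps (there is a setting to force the full step count instead):

```python
# Approximate steps actually run per tile in img2img: about
# int(steps * denoising_strength), unless the option that forces the full
# slider step count is enabled in settings.
def effective_steps(steps: int, denoise: float) -> int:
    return int(steps * denoise)

print(effective_steps(20, 0.15))   # ~3 steps per tile: hardly any new texture
print(effective_steps(100, 0.1))   # ~10 steps per tile, as suggested above
```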

  • @aaronamortegui345
    @aaronamortegui345 Рік тому

    This is insane, thanks a lot for this video, upscaling a picture this way is crazy.

  • @AltoidDealer
    @AltoidDealer Рік тому +5

    I’ve been waiting for someone to upload this video, so glad to see you do it. Tile ControlNet is so amazing. Btw, you don’t need to put an image into the preprocessor, it automatically fetches it from img2img. Your next video should be on the ControlNet inpainting model, which has rendered all “inpainting models” obsolete (RealisticVision-inpainting, etc. etc.)

  • @ronnykhalil
    @ronnykhalil Рік тому

    MINDBLOWN. Thanks for going so in depth.

    • @sebastiankamph
      @sebastiankamph  Рік тому

      You're welcome! Thank you for the support you've given Ronny, it means a lot to me.

  • @Lansolot
    @Lansolot Рік тому +3

    This is getting so amazing. When I get a txt2img result that's good, I will use it to gen a few images at 4k res with 0.3 denoising. I then take them into Photoshop as layers and erase all the bad parts until I have the best image. Then I take that image and upscale it to 8k. I resize that back down to 4k, bring it back into Photoshop, and erase any bad detail it may have picked up. Then I set denoising to 0.15 or 0.2 and upscale to 8k.
    edit: I should mention that if your txt2img image has any imperfections, they will be amplified; use inpainting to fix them first.

  • @orirune3079
    @orirune3079 Рік тому +1

    One trick I recently discovered and started to experiment with is to generate two images, one with low denoising (0.15-0.25) and one with higher denoising (0.5-0.6). Then I load them both into GIMP, with the low-denoise image above the high-denoise one (so the less-changed image is on top). Then I apply a transparent mask, cycle between the images, and paint over any parts where I think the low-denoise image needs more detail. This allows you to keep any good detail generated by the higher denoising, while leaving the majority of the image not overly changed. Of course, it takes a lot of time and effort, but I think it's worth it.
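
The same blend can be scripted instead of painted by hand in GIMP. A minimal sketch with Pillow and NumPy, assuming you have both renders saved plus a grayscale mask where white means "take the high-denoise version" (all file names are placeholders):

```python
# Blend a low-denoise render with a high-denoise render using a hand-made
# grayscale mask (white = use the high-denoise image).
import numpy as np
from PIL import Image

low  = np.asarray(Image.open("upscale_denoise_0.2.png").convert("RGB"), dtype=np.float32)
high = np.asarray(Image.open("upscale_denoise_0.6.png").convert("RGB"), dtype=np.float32)
mask = np.asarray(Image.open("detail_mask.png").convert("L"), dtype=np.float32) / 255.0

blended = low * (1.0 - mask[..., None]) + high * mask[..., None]
Image.fromarray(blended.clip(0, 255).astype(np.uint8)).save("blended.png")
```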

  • @Raddland
    @Raddland Рік тому

    Oh my word, this is super helpful. Thanks for making the vid!

  • @yakuza25rus
    @yakuza25rus Рік тому

    I understood everything from the first time and was able to repeat it myself, thank you a lot

    • @sebastiankamph
      @sebastiankamph  Рік тому

      Thank you for the comment, glad the video was helpful!

  • @theSato
    @theSato Рік тому +9

    Thanks for covering this! Between you and other figures in the space like AITrepeneur, you're doing the great work of informing the SD/AI community on the latest tools and techniques.

    • @AltoidDealer
      @AltoidDealer Рік тому +1

      The next thing they should be covering is the ControlNet Inpainting model… it obsoletes all “inpainting models”

    • @theSato
      @theSato Рік тому

      @@AltoidDealer yee that's great

  • @Jukahetube
    @Jukahetube Рік тому +2

    I think the different results between stepped upscale and direct upscale are quite easy to explain. With the direct-to-8k method, each 'starting tile' is quite blurry, so the ControlNet has trouble doing its job of making the added detail make sense within the context of the image. In comparison, the downside of stepped upscaling seems to be a general loss of detail at each pass, which has a cumulative effect. I think both of these downsides can be mitigated and there is probably a sweet spot to be found for the number of stages to maintain both coherence and detail. Some general rule like "fewer stages is better as long as each tile is sharp enough for ControlNet to work" might apply. Or: "two stages are good, three is ok, four will lose too much detail". Finally, if there are techniques that can be used to mitigate the loss of detail in the staged approach, it might be superior overall.
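
One way to put numbers on the "sweet spot" idea: for a fixed start and target size, adding stages shrinks the per-stage scale factor, so each stage's tiles start from a sharper source, at the cost of more cumulative smoothing passes. A small illustration (mine, not from the video):

```python
# Per-stage scale factor when going from 1024px to 8192px in n equal stages.
start, target = 1024, 8192
for stages in (1, 2, 3, 4):
    factor = (target / start) ** (1 / stages)
    sizes = [round(start * factor ** i) for i in range(stages + 1)]
    print(f"{stages} stage(s): x{factor:.2f} per stage -> {sizes}")
# 1 stage  -> x8.00 per stage (blurry tile inputs)
# 3 stages -> x2.00 per stage (sharper inputs, more cumulative smoothing)
```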

  • @Skittlz444
    @Skittlz444 Рік тому +4

    That "smoothing" you're getting is actually the upscaler, 4x-Ultrasharp is actually not perfect at retaining details as you showed, it's very good, very efficient, and fast (being a GAN). I personally like LDSR for keeping details, it's slower by far and more memory intensive yes, but there is much less smoothing. Haven't tried with tiling and your work flow to try and scale to insane sizes, worth you trying out though

  • @rickarroyo
    @rickarroyo Рік тому

    Very cool, Mr. Kamph!
    I will do some tests soon.
    Thank you very much

  • @sidewaysdesign
    @sidewaysdesign Рік тому +5

    Regarding stepping up vs. max in one step, the images are so close to one another that you could stack the output in an app like Photoshop and paint in a mask that would blend the best aspects of each. Similarly, the background tiling example you showed could easily be masked over with a dedicated AI upscaler like Topaz.

    • @bryan98pa
      @bryan98pa Рік тому

      But with A1111 you can add/fix some details while you upscale a photo

  • @manleycreations6466
    @manleycreations6466 Рік тому +4

    To combat the over smoothing of details, you could bring it into Photoshop and use an unsharp mask to add detail, then send that back to SD to upscale.

    • @sebastiankamph
      @sebastiankamph  Рік тому

      I love unsharp mask! Good tip. I usually use it for final touches.
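
If you would rather do that unsharp-mask round trip without leaving Python, Pillow ships an UnsharpMask filter; a minimal sketch, with parameters that are only starting points to tweak by eye:

```python
# Sharpen an over-smoothed upscale before sending it back through img2img.
# radius/percent/threshold are Pillow's UnsharpMask parameters; tune to taste.
from PIL import Image, ImageFilter

img = Image.open("upscaled_4k.png")
sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=3))
sharpened.save("upscaled_4k_sharpened.png")
```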

  • @SabaSanatgar
    @SabaSanatgar Рік тому

    This was a super helpful video for my potato PC. All it takes is time and I can make my own 1440p desktop backgrounds. And here I was saving up my pennies for a used 3090. Thanks!

  • @mikerhinos
    @mikerhinos Рік тому +1

    I've been using the Ultimate SD Upscale technique for quite a while now and didn't really see an advantage to using ControlNet (yet), but when I go into the upscale steps, I delete the prompts and only keep the quality keywords, like "ultra detailed, 8K, hasselblad", set denoise at 0.15 to a max of 0.30, and put back a random seed, so nothing will influence the detail creation.
    Every tile is passed through the algorithm separately; that's why, if you keep a high denoise, it will generate a collage of different images. So: random seed, low denoise :)
    Try it and let me know ;)

    • @mikerhinos
      @mikerhinos Рік тому

      After viewing the video from Olivio Sarikas (ua-cam.com/video/3z4MKUqFEUk/v-deo.html) I made multiple tests with the tile ControlNet, and it helps get better image quality when upscaling directly to x4.

  • @ulamss5
    @ulamss5 Рік тому

    5:15 thank you for that detailed look into seams. It's been driving me crazy. Someone who opened an issue mentioned that apparently it goes away in Linux, but I can't verify that.

  • @DennisFrancispublishing
    @DennisFrancispublishing Рік тому +1

    I still cannot get ControlNet 1.1 to show the new settings. Should I clear the old ControlNet from my system and start over?

  • @stibo
    @stibo Рік тому +3

    I've noticed that if you use the 4x UltraSharp model, it changes the color of the result slightly. I think 4x_UniversalUpscalerV2-Sharper_103000_G gives a more accurately colored result.

    • @shiccup
      @shiccup Рік тому

      where do you get this from?

    • @stibo
      @stibo Рік тому

      @@shiccup drive.google.com/drive/u/1/folders/15dV3rhMcq_LXg5Vrn7DG6lwo4M6cXt6C
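
A quick way to sanity-check the colour-shift claim a few comments up on your own images: downscale the upscaled result back to the original size and compare per-channel means. A small sketch with Pillow and NumPy (file names are placeholders):

```python
# Compare per-channel mean colour of the original vs. an upscaled result
# (downscaled back to the original size) to spot a global colour shift.
import numpy as np
from PIL import Image

orig = Image.open("original.png").convert("RGB")
up   = Image.open("upscaled_4x_ultrasharp.png").convert("RGB").resize(orig.size, Image.LANCZOS)

a = np.asarray(orig, dtype=np.float32)
b = np.asarray(up, dtype=np.float32)
for name, shift in zip("RGB", (b - a).mean(axis=(0, 1))):
    print(f"{name} mean shift: {shift:+.2f}")
```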

  • @erikschiegg68
    @erikschiegg68 Рік тому +1

    There is also an ONNX 4x-ultra_sharp model in the link provided by Sebastian, you can use this ONNX model with ChaiNNer and an AMD graphics card.

  • @crazymoe8494
    @crazymoe8494 Рік тому

    Thanks for this video, made a few images whilst watching.

  • @MicahBBurke
    @MicahBBurke Рік тому +2

    3:57 "we installed ultimate sd upscale" - when/where?!

    • @MicahBBurke
      @MicahBBurke Рік тому +1

      Found it in the Extensions installer.

  • @jibcot8541
    @jibcot8541 Рік тому +1

    I have made a few interesting (and scary!) image fails while using the ultimate upscaler grids with too high a denoising setting. I'm going to have to try this ControlNet approach, it looks great, thanks!

  • @filmyk
    @filmyk Рік тому +2

    6:34
    To improve this we could use a photography technique: adding noise to the image. In addition, we could mask the noise according to the luminance of the photo, so that the noise is more pronounced in the dark elements of the image and less so in the light ones, affecting the portrait as little as possible and the background as much as possible. It is a common technique to remove banding from images.
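
A sketch of that technique with NumPy and Pillow: Gaussian grain weighted by inverse luminance, so shadows get more noise than highlights. The strength value and file names are assumptions to adjust per image:

```python
# Add luminance-masked grain: more noise in dark areas (where banding shows),
# less on bright areas like a lit face. Strength is in 0-255 pixel units.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("upscaled.png").convert("RGB"), dtype=np.float32)
luminance = img.mean(axis=2, keepdims=True) / 255.0      # crude luma, 0..1
strength = 8.0
noise = np.random.normal(0.0, 1.0, img.shape) * strength * (1.0 - luminance)
out = np.clip(img + noise, 0, 255).astype(np.uint8)
Image.fromarray(out).save("upscaled_grain.png")
```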

  • @jfosterdesigns
    @jfosterdesigns Рік тому +1

    This is so incredibly helpful! Thank you very much!

  • @mamadsc
    @mamadsc Рік тому +2

    About the loss of detail between the 2048 and 4096 versions, I have two comments after doing some experiments:
    1. Shouldn't you feed the resulting image back into the ControlNet image after each pass (or just remove the image from ControlNet, and it will automatically use the img2img image)? Because in your example the 4096 image is being built from the control of a 512px image, I guess it makes sense that it would lose a lot of detail.
    2. I'd try adding more denoising strength. 0.15 seems low for the amount of new pixels being created between 2048=>4096. Without letting the model denoise more, I think it has to look more "smooth" due to the basic resizing and expansion of pixels.
    Anyway, great video thanks :)

  • @NomadAerial
    @NomadAerial Рік тому +2

    Tip for getting more for less. First, lose the prompt when you send your image to img2img. Second, make sure your ControlNet is set to Pixel Perfect. Scale to the desired size as you did in your video. Select the half-tile offset + intersections seams fix. Set both your denoise and the denoise strength in the seams fix equal to one another. Without a prompt and with ControlNet, you can set denoise strength to 1.0 if you want; I find 0.5 is fine for most work. If you have a prompt, it will be interpreted individually in each tile, and it'll try to draw the complete picture in each tile if you have your ControlNet importance set to Balanced. Perfect upscale with no seams every time.

    • @NomadAerial
      @NomadAerial Рік тому

      One other thing that might be confusing: at the bottom, save only the seams fix and not the upscale.

    • @GrayFox200
      @GrayFox200 Рік тому

      Thank you for the helpful tips. I did not get this part (Set both your denoise and the denoise strength in the seam fix equal to one another)?

  • @RSKT_music
    @RSKT_music Рік тому +6

    I think you could use both of the results at the end. Use frequency separation in Photoshop and add it to your 4k render. Erase the parts that don't work :)

  • @MikkoRantalainen
    @MikkoRantalainen Рік тому

    Superb tutorial for scaling! Now I can generate high resolution images even with my old GTX 1060 3 GB card that's running my Linux desktop.

  • @Alukardo11
    @Alukardo11 Рік тому

    This is all I was asking for, similar to the Midjourney upscaler on their web app. Many thanks, sir.

  • @mgwr
    @mgwr Рік тому +1

    Why does your A1111 look different than mine, with the LoRAs and hypernetworks at the top? Also, I can't drag and drop on img2img, I have to manually find the file 😭😭

  • @monitepmoo
    @monitepmoo Рік тому +5

    Tip: to obtain sharp detail at very high resolution, try copying the previous upscaled image into the ControlNet window too! 👍🙂

    • @sebastiankamph
      @sebastiankamph  Рік тому +2

      I did try this but didn't see much improvement on my end. Did yours get better? I might've missed something then.

    • @MrSongib
      @MrSongib Рік тому

      Two passes, 4X + 4X? Good idea.

    • @monitepmoo
      @monitepmoo Рік тому +2

      @@sebastiankamph Yes! In my experience, results are sharper and more detailed with this method! Can I show you on LinkedIn? (I wrote a post with examples)

    • @mrzackcole
      @mrzackcole Рік тому +1

      @@monitepmoo I'd love to see your results!

    • @sebastiankamph
      @sebastiankamph  Рік тому

      @@monitepmoo Sure, just send me a message

  • @Zique909
    @Zique909 Рік тому +1

    If I used ADetailer on the image I want to upscale and I set denoise to 0.1 or higher, my ADetailer result is completely ignored even if I use tiles. Has anyone encountered this problem?

  • @lordzz00
    @lordzz00 Рік тому

    Resolutionaries...lol, you just got yourself a subscriber. Oh and thanks for the awesome tut :)

  • @thiago_merlo
    @thiago_merlo 9 місяців тому +1

    Great video, but this folder doesn't appear for me. I installed my Stable Diffusion today, do you know if the ESRGAN installation folder changed?

  • @TheAiConqueror
    @TheAiConqueror Рік тому

    No Clickbait here! 💪😁

    • @sebastiankamph
      @sebastiankamph  Рік тому +1

      It took a while, but it managed 8k 😅. Thank you for the support as always, you conqueror of AI, you! 😘

  • @lyonstyle
    @lyonstyle Рік тому

    Fantastic video, Sebastian! Everyone is sleeping on ControlNet, it has tons of possibilities.

  • @deathybrs
    @deathybrs Рік тому

    Oh, say, I had an idea for use with ControlNet... I don't have a channel, so I thought I'd share it with you in case you might like to make a video about it.
    The easiest way to create custom poses for use with ControlNet is to use one of those artist's manikins. It seems so obvious to me in hindsight that I feel kinda dumb for not thinking of it sooner. You can create any pose you want, and then easily get any specific angle of perspective on it once you have your pose, a lot easier than posing yourself and taking a picture, because controlling the framing and angle are going to be so much easier this way, plus easier to set up a green/white screen to have no background, etc.

  • @Venompapa
    @Venompapa Рік тому +1

    Updated ControlNet, but I don't have the tile_resample option for some reason o.o

  • @kernsanders3973
    @kernsanders3973 Рік тому +1

    Just FYI, if you've got an image already placed into img2img, then you don't need to place it again in ControlNet; it will automatically pick up the image in img2img.

    • @4.0.4
      @4.0.4 Рік тому

      I didn't know this!

    • @sebastiankamph
      @sebastiankamph  Рік тому +1

      Sent the wrong reply to the wrong comment. New here. I wasn't aware they changed this. Afaik it wasn't like that previously.

  • @76abbath
    @76abbath Рік тому

    Thanks a lot Sebastian! Your explanations are very useful! ❤

  • @ZeroCool22
    @ZeroCool22 Рік тому +1

    This method is not working anymore with the latest version of AUTOMATIC1111, it gives a different face...

  • @TransformXRED
    @TransformXRED Рік тому +3

    Seb, have you tried the multidiffusion-upscaler-for-automatic1111?
    For a 2x with ControlNet, it's pretty clean.
    What I do personally is just upscale the result 2x in the Extras tab, then use that image in ControlNet to keep the general lines.
    Then the denoising can go a bit higher.

    • @AltoidDealer
      @AltoidDealer Рік тому +1

      This is the upscaler that ControlNet says on their wiki they used as their target upscaler. It’s the best one for sure

  • @inputoutput1126
    @inputoutput1126 10 місяців тому

    A good thing to point out: if the model you're using has an inpainting version, use it for the upscale.

  • @ShayVidas
    @ShayVidas Рік тому +1

    I don't see any models when I try to open the models dropdown in ControlNet, but I see a LOT of models in the stable-diffusion-webui\extensions\sd-webui-controlnet\models folder. Does someone else have this issue?

  • @FirstLast-pu9ty
    @FirstLast-pu9ty Рік тому

    You could also do a regular quick upscale, then send the scaled image back to inpaint, mask an area where the detail was lost, and regenerate that area using a new prompt that covers only the masked area. Then repeat for other areas. A blurry area like an out-of-focus background is not going to need any extra detail anyway. Only add detail where it's needed.

  • @PawFromTheBroons
    @PawFromTheBroons Рік тому +1

    When you drop in the first-pass 1024 result to be upscaled to 2048, don't you need to drop it into ControlNet too?

  • @TheStanislavson
    @TheStanislavson Рік тому

    Thanks a lot, mate! My AMD GPU, the poor soul, was so frightened by the other methods that it output a wall of text praying to the DMT gods not to do upscales.

  • @RTW590
    @RTW590 Рік тому +1

    How does this compare to doing a normal image to image at a higher resolution and then upscaling normally? Is the benefit of tile mainly for people with GPUs that can't handle the higher image to image resolution and upscaling or does it create better overall results?

  • @alecubudulecu
    @alecubudulecu Рік тому +1

    I like using the Tiled Diffusion extension instead. Same thing, just I find it gives more granular control over the upscale. You can do quadrant prompting to target specific sections, and you can change the denoise or CFG just for certain tiles or have custom height x width settings for each tile.

  • @Jessees902
    @Jessees902 Рік тому +1

    This is probably a dumb question but I can't find an answer anywhere. Can you use a photograph in img2img and upscale that?

  • @parsley8188
    @parsley8188 Рік тому

    I love the upscalers. I've seen them before but I never knew how to use them

  • @hasuwini
    @hasuwini Рік тому

    That was amazing, my dude! Would love a continuation video doing the same thing but on a small video, like how to use ControlNet in that manner on a potato PC to upscale a small animation if possible... Let me know if that could be a thing!

  • @iLEZ
    @iLEZ Рік тому +2

    Might be good to add that it's important to choose a sampler that works with this method. Euler a seems to yield good results, I've tried some others and I get incredibly blotchy results.

  • @AlphonsoPeluso
    @AlphonsoPeluso Рік тому

    Hey, great video! Should we be using the upscaled image in ControlNet, or is leaving the 512x512 image OK?

    • @bgtubber
      @bgtubber 8 місяців тому

      Good question! 🤔

  • @themythof3dbeauty671
    @themythof3dbeauty671 Рік тому

    The license of 4x-UltraSharp is for NON-commercial use only, and attribution is legally mandatory whenever you use it.

  • @Not4Talent_AI
    @Not4Talent_AI Рік тому

    Have you tried going into Photoshop and applying one image's details (8k) to the other (4k)? Maybe with a high-pass filter or something.

  • @SensuTech
    @SensuTech Рік тому

    Sorry for my silly question, but what is the difference from doing the same upscale in the Extras tab?

  • @coda514
    @coda514 Рік тому +1

    Great content Seb. Informative as always, you help "demystify" all of these extensions for the common man. BTW, Did you see the display of still-life art? It was not at all moving.

  • @darkwing_the_spacecat
    @darkwing_the_spacecat Рік тому

    What... the.... wow! Thank you so much!

  • @Rainz_Storm
    @Rainz_Storm Рік тому

    Thank you so much, this is really useful!

  • @DG-wy9fk
    @DG-wy9fk Місяць тому

    Hello. Thank you for this cool tutorial, I repeated all the steps a couple of times to understand them. Can you please explain why the same upscaling process does not work with SDXL models and tile models like "Xinsir ControlNet Tile SDXL 1.0"? With them I only got an ugly upscaled picture with no extra detailing at all.

  • @youtor666
    @youtor666 Рік тому +4

    Change the noise multiplier for img2img from the default (0.5) to between 1 and 1.1 :) it gives way better results.

    • @theSato
      @theSato Рік тому +2

      Yes and no; it's good if you want 'additional details' painted in (sorta similar to having higher denoising), but if you want authenticity to the main image then you'd rather keep the noise multiplier lower.

    • @fixelheimer3726
      @fixelheimer3726 Рік тому

      Noise multiplier? Is that new?

    • @ramboti6402
      @ramboti6402 Рік тому

      you mean 0.1 and 0.11 right?

  • @jasonstetsonofficial
    @jasonstetsonofficial Рік тому +1

    RuntimeError: mat1 and mat2 shapes cannot be multiplied (77x2048 and 768x320)????

    • @sebastiankamph
      @sebastiankamph  Рік тому +1

      Can't mix SDXL and 1.5

    • @accrualbowtie
      @accrualbowtie Рік тому

      Is there a workaround to get this to work with SDXL? @@sebastiankamph

  • @SouthbayCreations
    @SouthbayCreations Рік тому

    Thank you Seb! Fantastic info!! 🙏🙏

    • @sebastiankamph
      @sebastiankamph  Рік тому

      Happy to help! Thanks for your continued support 😊

  • @___x__x_r___xa__x_____f______

    Also, I can't remember where you explained outpainting with MK2, which allows you to crop in and crop out in i2i...?

  • @hrmpk26
    @hrmpk26 Рік тому

    Just use unsharp mask in GIMP if it's blurry. If there is a seam issue, inpaint the seam.

  • @RHYTE
    @RHYTE Рік тому

    What does the tile model do better compared to the Ultimate SD Upscale tiling?

  • @davidbrinnen
    @davidbrinnen Рік тому

    Very well explained. Thank you.

  • @dodd15
    @dodd15 Рік тому +1

    There is a mistake in this video. To use the tile ControlNet correctly with SD upscale, you should leave the ControlNet input empty so it is fed the current tile and not the whole image for each tile. Then you can crank the denoising strength up much higher (start with 0.75) and you will get much better details!

    • @dodd15
      @dodd15 Рік тому

      Correction: what I said applies to SD upscale. I now tried with Ultimate SD Upscale and see no clear winner between with and without an image. It seems Ultimate replaces the image by itself; you can even put in a totally different image and it doesn't affect anything.

    • @simpleandfrank
      @simpleandfrank Рік тому +1

      @@dodd15 can you be more clear? I am a little bit confused..

  • @eranfeit
    @eranfeit Рік тому

    Thanks for the great tutorial. Why didn't you set the denoising to 0 (zero)? Why is the value 0.15?

  • @shiccup
    @shiccup Рік тому

    I'm getting this error: "RuntimeError: Given groups=1, weight of size [64, 3, 3, 3], expected input[1, 4, 192, 192] to have 3 channels, but got 4 channels instead". Reply if you know of a simple fix; it has something to do with the ultimate upscaler, not really sure though.

    • @mironov2162
      @mironov2162 Рік тому +1

      Try Settings > Upscaling > Upscaler for img2img > choose "None".

    • @shiccup
      @shiccup Рік тому

      @@mironov2162 Will try this, appreciate the response.

  • @santosic
    @santosic Рік тому

    Where has that SD upscaler + control net tiles combo been my whole life (well, my short Stable Diffusion life anyway)?? That is huge. I will never upscale the old way again lol! 😅

  • @streamtabulous
    @streamtabulous Рік тому

    I'm curious what happens if you take the OG image, use Gigapixel AI upscale on it, and compare it to the ControlNet one.

  • @SohanSmol
    @SohanSmol Рік тому

    Can I ask why you are doing the upscaling in img2img instead of Extras? I thought that's what Extras is for.

  • @2PeteShakur
    @2PeteShakur Рік тому

    Awesome, now how do you save these upscaling settings for the next session(s)?

  • @Steamrick
    @Steamrick Рік тому +1

    Have you tried iterating the ControlNet? I mean, always inputting the img2img output into the ControlNet for the next stage?
    Also, I'd try putting something regarding old skin into the prompt. You know, 'age spots', 'wrinkles', stuff like that. That should help, especially if you go for a bit higher denoise.
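
For what it's worth, the iteration described here is easy to express as a loop once you have any single-pass upscale routine. In the sketch below, single_pass() is only a placeholder (a plain Lanczos resize) standing in for whatever img2img + ControlNet tile pass you actually use, so the looping structure is the only point being made:

```python
# Staged upscaling loop: feed each pass's output into the next pass as both
# the img2img source and the ControlNet image. single_pass() is a placeholder;
# swap in your actual img2img + ControlNet tile call.
from PIL import Image

def single_pass(img: Image.Image, scale: float = 2.0) -> Image.Image:
    # Placeholder: a real pass would diffuse `img` through img2img with the
    # tile ControlNet at low denoise and return the result.
    return img.resize((int(img.width * scale), int(img.height * scale)), Image.LANCZOS)

image = Image.open("start_1024.png")
for stage in range(3):                 # 1024 -> 2048 -> 4096 -> 8192
    image = single_pass(image, scale=2.0)
    image.save(f"stage_{stage + 1}_{image.width}px.png")
```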

  • @miro-hristov
    @miro-hristov 5 місяців тому

    Great tutorial! Just accidentally found out that if you use the inpainting version of a checkpoint it doesn't generate any seams.

  • @Raddland
    @Raddland Рік тому

    What I don't understand is: if it is making tiles based on the original prompt, then when it gets to the tile for the lower half of his face, how does it not attempt to generate pupils and things for the lower half of his face? When I try this method, I tend to get a bunch of tiles that try to recreate the whole image based on the prompt.

  • @juanjesusligero391
    @juanjesusligero391 Рік тому

    I always press like as soon as I hear your dad joke XD I'm never disappointed XD

  • @hdawod
    @hdawod Рік тому

    That's brilliant, Thanks a lot.

  • @23dsin
    @23dsin Рік тому

    Great! Thanks for your work!

  • @qwetry-j2u
    @qwetry-j2u 28 днів тому

    Is this workflow applicable for the Flux model in Forge UI? If not, please let me know what I should do instead. Thanks!

  • @yoda5477
    @yoda5477 Рік тому

    Hello,
    Very clear and precise, so easy to reproduce, thanks a lot !
    Any idea if there is such a thing as tiled outpainting available somewhere? (To outpaint, but with a low-res constraint.)

  • @alexwang715
    @alexwang715 Рік тому

    Why not use 4x UltraSharp in the Extras upscaler directly? It seems both ways get similar results, but the upscaler way is much easier and more straightforward.

    • @sebastiankamph
      @sebastiankamph  Рік тому +1

      The upscaler alone is very simple compared to CN tiles, which can add new details.

    • @alexwang715
      @alexwang715 Рік тому

      @@sebastiankamph Thanks man

  • @brettcameratraveler
    @brettcameratraveler Рік тому

    From your tests, is this new method the very best for images scaled all the way up to 8K? Or do you recommend a previous method if you are running an RTX card with 12+ GB of VRAM?
    How about upscaling 4K YouTube (compressed) video to 8K? Is Topaz Video the best?