Architectural sketches to realistic renders with AI (Control Net)

  • Published 4 Jun 2024
  • The video shows you how to transform architectural sketches into realistic photos in seconds, at the click of a button. Using examples from architects such as John Pawson, you can skip the long 3D modelling and rendering process and create unlimited options in any style you desire. Specifically, we will be using Stable Diffusion, a deep-learning text-to-image model, together with ControlNet, an extension that adds structural control over the generated image. (A minimal code sketch of this workflow is included after the description.)
    Links:
    stable diffusion and control net github:
    github.com/lllyasviel/ControlNet
    Installation guide to control net:
    • 🤯Insane Image Manipula...
    Instagram account: / urbandecoders
    Paper on control net by Lvmin Zhang, Maneesh Agrawala: arxiv.org/abs/2302.05543
    #ai #architecture #generativedesign #sketches #interiordesign #controlnet
  • Science & Technology
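    Not from the video itself, but for readers who want a scripted starting point: the following is a minimal sketch of the sketch-to-render workflow using the open-source diffusers library. The model IDs, file names and prompts are assumptions; the video demonstrates the same idea through the Stable Diffusion web UI instead.

    # Minimal sketch-to-render example with Stable Diffusion + ControlNet (scribble).
    import torch
    from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
    from diffusers.utils import load_image

    # ControlNet trained on scribble/sketch inputs, paired with an SD 1.5 base model.
    controlnet = ControlNetModel.from_pretrained(
        "lllyasviel/sd-controlnet-scribble", torch_dtype=torch.float16
    )
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
    ).to("cuda")

    # An architectural sketch (placeholder file name).
    sketch = load_image("pawson_sketch.png")

    image = pipe(
        prompt="minimalist concrete house interior, soft daylight, photorealistic",
        negative_prompt="cartoon, blurry, low quality",
        image=sketch,
        num_inference_steps=30,
    ).images[0]
    image.save("render.png")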

COMMENTS • 31

  • @paulstein5196 (1 year ago, +2)

    Excellent intro to Stable Diffusion and ControlNet in the archviz and architectural fields. Many thanks!

  • @2819creative (1 year ago, +3)

    Looking forward to seeing more content like this. Much appreciated.

  • @Cazametroides (1 year ago, +1)

    Really interesting workflow! Looking forward to seeing how it affects the archviz industry.

  • @samlin0506 (1 year ago, +1)

    Thank you for sharing🎉

  • @pivxtrex8698 (1 year ago)

    It's a real boost for my workflow. Now I can use those "pre-rendered" images to choose the "mood" with my customers, and then I can start working on the hi-res final render in my 3D software, adding details or changing lights. Next step is sketch -> AI pre-render -> AI 3D file -> AI BIM -> 3D software -> final project.

    • @Urban_Decoders (1 year ago)

      For sure there will be progress towards a more integrated workflow like this! There are some interesting developments with 3D generation and integration with BIM. A lot to look forward to.

  • @B-water (1 year ago, +1)

    thanks for sharing

  • @homestecconstructionarchit6928 (2 months ago, +1)

    ❤❤❤

  • @rahulchatrola2752 (10 months ago)

    Hi there, it's really interesting to see how different professions use AI. I wanted to ask if we can change the exact material or texture, e.g. if there is a concrete floor and I want to put tile flooring on it, and that too with a specific size and design (which could be input via the ControlNet reference image)? Also, can we use multiple images in ControlNet? Just a newbie here wanting to learn how to use it in my workflow/projects.

    • @Urban_Decoders (10 months ago)

      For materials, I would first try to describe them in more detail in the prompt. The ControlNet will not understand specific measurements in relation to the drawing scale, such as "100mm tiles", so it is better to say something like "small square tiles". Negative prompts also help a lot: if you don't want shiny metallic textures, for example, you can put that in the negative prompt to avoid them.
      As for using multiple images in ControlNet: yes, you can. If you go to Settings, there is a multi-ControlNet option which you can increase to combine various ControlNet units and options. You can have a lot of fun combining the ControlNets. A rough code sketch of both ideas follows below. Hope this helps.
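      The reply above refers to the multi-ControlNet option in the web UI settings. A rough diffusers-based equivalent (an assumption, not the exact workflow shown in the video) is to pass a list of ControlNet models and one conditioning image per model, steering the materials with the prompt and negative prompt:

      # Multi-ControlNet sketch: canny edges + depth map, combined in one pipeline.
      import torch
      from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
      from diffusers.utils import load_image

      controlnets = [
          ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16),
          ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-depth", torch_dtype=torch.float16),
      ]
      pipe = StableDiffusionControlNetPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", controlnet=controlnets, torch_dtype=torch.float16
      ).to("cuda")

      canny_map = load_image("canny_edges.png")   # placeholder conditioning images
      depth_map = load_image("depth_map.png")

      image = pipe(
          prompt="living room, small square ceramic floor tiles, matte finish",
          negative_prompt="shiny metallic textures, glossy floor",  # steer materials away
          image=[canny_map, depth_map],
          controlnet_conditioning_scale=[1.0, 0.6],  # per-ControlNet weights
          num_inference_steps=30,
      ).images[0]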

  • @latanjang8636 (1 year ago, +3)

    This is real future technology! Maybe 3D modelling and rendering skills will be replaced by this.

    • @Urban_Decoders (1 year ago, +3)

      I believe there will always be some need for 3D artists, but methods like this will just help them make their work quicker and better. There will also be more of a focus on artistic vision and composition for 3D artists/designers rather than software skills.

    • @beyondgoodkitten4407 (1 year ago, +2)

      @@Urban_Decoders First off, out of all of my workmates I know ONE who's capable of drawing something more complex than a cube in linear perspective. So that's the first reason why the majority of us will lose jobs. Secondly, providing a low-quality drawing and spamming a litany of keywords makes this "new and exciting tool for 3D artists" so accessible to everybody that it literally eliminates the necessity of being a 3D artist to begin with.

    • @latanjang8636 (1 year ago, +2)

      @@Urban_Decoders Yes, exactly. But honestly I don't want the AI process to replace designers' roles entirely; I really hope this kind of AI will be just one of the tools for designers, a hybrid approach with various designers.

    • @Urban_Decoders (1 year ago)

      @@beyondgoodkitten4407 Having a good eye for design, or, like you said, even having the skill to create a basic sketch to start an idea with, is not something everyone has, so there will still be a place for talented designers to drive this AI. I used Pawson's sketches for this example, as even my sketches are not the best!

  • @andreadellungo4385 (11 months ago)

    Great workflow!! Is it possible with a floor plan?

    • @Urban_Decoders (11 months ago)

      Interesting idea! I'm sure you could for 3D plans, as I made a video on training a model on aerial images (linked below). Otherwise you could test or train a model for 2D plans. ua-cam.com/video/wvWJbCGOgu0/v-deo.html

  • @besprutad (1 year ago)

    Any suggestions on doing product design sketches?

    • @Urban_Decoders (1 year ago, +1)

      Product design would follow a similar workflow. If the focus is more on the 3D form, I would use the "depth" preprocessor in ControlNet, but if there are lots of small details in the sketch, other preprocessors such as "scribble" or "canny" can work better (see the sketch below).
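      A hedged illustration of that preprocessor choice, using the controlnet_aux package (an assumption; in the video the preprocessors are selected inside the web UI):

      # Compare two ControlNet preprocessors on the same product sketch.
      from controlnet_aux import MidasDetector, CannyDetector
      from diffusers.utils import load_image

      sketch = load_image("product_sketch.png")  # placeholder file name

      # "depth" preprocessor: emphasises the overall 3D form.
      midas = MidasDetector.from_pretrained("lllyasviel/Annotators")
      depth_map = midas(sketch)

      # "canny" preprocessor: keeps the fine line detail of the sketch.
      edges = CannyDetector()(sketch)

      # Feed depth_map or edges into the matching ControlNet
      # (sd-controlnet-depth or sd-controlnet-canny), as in the earlier examples.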

  • @rafael_tg (7 months ago)

    Which model is good for taking an image of a room and redesigning the decoration?

    • @Urban_Decoders (4 months ago)

      I would use the canny ControlNet to keep the room the same and then prompt to change the decoration styles. Use the Realistic Vision model for photorealistic outputs too. A rough example follows below.
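      A rough sketch of this suggestion with diffusers (the Realistic Vision repository ID and the file names are assumptions, not something shown in the video):

      # Redecorate a room photo: canny edges lock the geometry, the prompt changes the style.
      import numpy as np
      import cv2
      from PIL import Image
      import torch
      from diffusers import StableDiffusionControlNetPipeline, ControlNetModel
      from diffusers.utils import load_image

      room = load_image("my_room.jpg")  # placeholder: photo of the existing room

      edges = cv2.Canny(np.array(room), 100, 200)
      edges = Image.fromarray(np.stack([edges] * 3, axis=-1))

      controlnet = ControlNetModel.from_pretrained(
          "lllyasviel/sd-controlnet-canny", torch_dtype=torch.float16
      )
      pipe = StableDiffusionControlNetPipeline.from_pretrained(
          "SG161222/Realistic_Vision_V5.1_noVAE",  # assumed Realistic Vision checkpoint
          controlnet=controlnet,
          torch_dtype=torch.float16,
      ).to("cuda")

      image = pipe(
          prompt="the same room redecorated in a warm Scandinavian style, photorealistic",
          negative_prompt="cartoon, warped walls, low quality",
          image=edges,
          num_inference_steps=30,
      ).images[0]
      image.save("redecorated.png")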

  • @maxstudio7475 (1 year ago, +1)

    What is the future of 3ds Max and Corona after AI? What do you think about these software packages?

    • @Urban_Decoders (1 year ago)

      I think they will build their own AI tech into their software (Corona already has, for example, an AI denoiser). I imagine there could also be AI rendering within the live viewport to assist with the process.

  • @mattsloboda4450 (11 months ago)

    Has anyone had any luck getting landscape design renderings? Wondering if Adobe generative fill would add plants

    • @Urban_Decoders (11 months ago)

      I haven't tested it on planting, but it would be interesting to try on sketched vegetation! It might work if you used the "canny" ControlNet to get more detail for landscaping. For just adding plants to your scene, yes, Adobe generative fill works very well for that kind of thing.

  • @mukondeleliratshilavhi5634 (1 year ago, +1)

    Loving the negative prompt

    • @Urban_Decoders (1 year ago)

      They are very useful for giving that extra control over the image, which sometimes can't be achieved in the main prompt alone.

  • @thomasdrouant143 (11 months ago)

    Can I practice for free using my computer rather than the cloud?

    • @Urban_Decoders (11 months ago)

      Sure! You can install it locally from GitHub and run it for free (a minimal local setup is sketched below). I used this tutorial: ua-cam.com/video/BbYJd9kRjLg/v-deo.html
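      For a fully local, free run, one option is a small diffusers-based script rather than the web UI (this is an assumption; the linked tutorial covers the web UI installation instead, and the model IDs below are placeholders):

      # Run Stable Diffusion + ControlNet entirely on your own machine.
      import torch
      from diffusers import StableDiffusionControlNetPipeline, ControlNetModel

      # A consumer GPU with roughly 6 GB of VRAM handles SD 1.5 in fp16; CPU also
      # works, just much more slowly.
      device = "cuda" if torch.cuda.is_available() else "cpu"
      dtype = torch.float16 if device == "cuda" else torch.float32

      controlnet = ControlNetModel.from_pretrained(
          "lllyasviel/sd-controlnet-scribble", torch_dtype=dtype
      )
      pipe = StableDiffusionControlNetPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=dtype
      ).to(device)
      pipe.enable_attention_slicing()  # trims VRAM use on smaller local GPUs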

  • @happyandhealthy888 (6 months ago)

    Editable version.