Diffusion Texture Painting | NVIDIA Research
- Published Sep 8, 2024
- NVIDIA Research presents a technique that leverages 2D generative diffusion models (DMs) for interactive texture painting on the surface of 3D meshes. Unlike existing texture painting systems, our method allows artists to paint with any complex image texture, and in contrast with traditional texture synthesis, our brush not only generates seamless strokes in real-time, but can inpaint realistic transitions between different textures.
To enable this application, we present a stamp-based method that applies an adapted pre-trained DM to inpaint patches in local render space, which is then projected into the texture image, allowing artists control over brush stroke shape and texture orientation. We further present a way to adapt the inference of a pre-trained DM to ensure stable texture brush identity, while allowing the DM to hallucinate infinite variations of the source texture. Our method is the first to use DMs for interactive texture painting, and we hope it will inspire work on applying generative models to highly interactive artist-driven workflows.
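The stamp-based painting loop described above can be sketched in a few lines. This is a simplified illustration, not NVIDIA's implementation: `STAMP`, `inpaint_patch`, and `paint_stroke` are hypothetical names, and the diffusion-model inpainting step is stubbed out with a trivial mean-fill so the example runs; the real method inpaints patches in local render space with an adapted pre-trained DM and projects the result into the texture image.

```python
import numpy as np

STAMP = 32  # hypothetical stamp (patch) size in pixels


def inpaint_patch(patch, mask):
    """Placeholder for the adapted pre-trained DM's inpainting step.

    Fills masked pixels with the mean of the known region so the sketch
    runs without a diffusion model; the real system generates texture here.
    """
    out = patch.copy()
    known = ~mask
    out[mask] = patch[known].mean(axis=0) if known.any() else 0.0
    return out


def paint_stroke(texture, stroke_uv):
    """Stamp-based painting: for each stroke sample, cut a local patch,
    inpaint the not-yet-painted pixels, and write the result back.

    Overlapping consecutive stamps is what makes the stroke seamless:
    each new stamp sees the previous stamp's pixels as known context.
    """
    h, w, _ = texture.shape
    painted = np.zeros((h, w), dtype=bool)
    for u, v in stroke_uv:
        # Map the normalized stroke sample (u, v) to a patch origin.
        x, y = int(u * (w - STAMP)), int(v * (h - STAMP))
        sl = (slice(y, y + STAMP), slice(x, x + STAMP))
        mask = ~painted[sl]  # pixels the brush must still fill
        texture[sl] = inpaint_patch(texture[sl], mask)
        painted[sl] = True
    return texture
```

In the actual system the patch lives in local render space on the mesh surface (giving the artist control over stroke shape and texture orientation) rather than directly in UV space as in this toy version.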
Learn more: research.nvidi...
#generativeAI, #diffusionmodels, #NVIDIAResearch
Although this tool looks limited in usefulness for most texturing jobs, it seems like it would be a good fit for photogrammetry touch-up. Nice to see an AI project actually interested in making artists' lives easier rather than replacing them.
What's the difference from Substance Painter?
Let me in!! haha this is wonderful
amazing
nice idea but effect is ugly
dark magic
😄😄😄