I was just looking for a way to do this easily. You rock man. Thanks for sharing these workflows, I know how much time, effort and trial and error devising these methods takes.
I tried AI painting on mimicpc and it handles details quite well too; the results made for a surprisingly good graphic experience.
The release of each of your videos gives me great expectations that of course are not disappointed. I share your interest in linking 3D modeling with AI. You are always at the forefront in this regard. Thank you very much for the workflows and your excellent tutorials.
Excellent and inspirational. I am a beginner Blender user with plenty of 360 VR and some immersive VR world building experience. I wonder if I can walk through this and create a base model to iterate on. Just decided to challenge myself, your video was the inspiration. Thank you.
You could create a depth pass from the generated image in Comfy and use it as a displacement or bump map back in Blender.
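The depth-pass idea above can be sketched without any Blender at all: a depth map is just grayscale values, so turning it into displacement data amounts to normalizing those values into a height range. A minimal sketch (the function name and `max_height` value are illustrative assumptions, not from the video):

```python
def depth_to_displacement(depth_pixels, max_height=0.5):
    """Normalize raw depth values to displacement heights in [0, max_height]."""
    lo, hi = min(depth_pixels), max(depth_pixels)
    if hi == lo:  # flat depth map: nothing to displace
        return [0.0] * len(depth_pixels)
    scale = max_height / (hi - lo)
    # Nearer pixels (smaller depth) get larger displacement here;
    # invert this if your depth convention is the opposite.
    return [(hi - d) * scale for d in depth_pixels]
```

In practice you would run this per pixel over the depth image and feed the result into a Displacement or Bump node in Blender's shader editor.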
Wow!!!! This is exactly what I've been looking for!! Thank you so much.
Do you think it is also possible to mix the outlines on the Flux workflow?
Great work man, thanks so much! I'll get you on Patreon!
This looks insane, can't wait to start testing all these workflows out
Top notch sir! Thank you.
Nice one! The only thing missing would be some kind of tool that uses the information from the mesh combined with the texture to detect where stretching occurs in certain regions, and then generates the missing texture for those areas so the scene doesn't fall apart when the perspective changes. Then use some displacement maps to get more detail. Binga bonga, nice scene!
Damn, another subscription 😅 Really promising.
This is something I was doing with SD 1.5 and a Latent Labs LoRA, but it was really low res (no 3D model, prompts only). The 360 panorama looked great in VR (I made anime environments only, so it was easy to cheat the seam line), but projecting this onto a 3D model was something I was missing. This could turn out to be an efficient way to easily make 3D environments for VR games, especially static shooter games like The House of the Dead.
Thank you very much for this wonderful tutorial.
Great work again, Mick. You are a genius, my friend.
German brains at work 🙂
My suspension of disbelief was destroyed when you made the knight two stories tall. Like... come on man... he's towering over all of those archways and openings.... 😅 -- Anyway, this is a super cool process. Thanks for putting these tutorials and workflows together.
He clearly just has a macrophilia fetish and wanted his knight to give the Dwarven dwellers a big steppie. Come on, it's not that hard to believe.
THIS IS AWESOME!!! Saves so much time!!!
Thanks for all your wondrous workflows.
What an amazing workflow.
Once again a super duper tutorial. thx a lot!
Nice tutorial as usual ;)
For your upcoming project, you might recreate the Tiger Scene from "Gladiator" set in the Colosseum, featuring an animated tiger restrained by a chain and a crowd simulation
I was really stunned by that waterfall tho! I really like how the idea turned out, but can you please give us all a bit of info on how to implement animated water?
16:16 I used the workflow shown here! It works in a very similar way to the SDXL workflow for example but it uses AnimateDiff to create a looped animation! Fog, Fire, Water and clouds work extremely well!
I'm also hoping for a video explaining this animation part 😍
Fantastic tutorial. Love the pacing as well
So great!!!
you're a genius!
Yeah, AI is the future of VR. It's going to blow past everything else once we get fast enough GPUs and people bring SD-style image generation into a controllable environment so you can walk around inside it. Then add AI-controlled characters and voice synthesis, and you have the Holodeck. It's going to be INSANE.
I really like the fact that your hair is getting shorter and shorter. Great video as always.
Fantastic workflow. Thanks for sharing this. I wonder if it would be possible to separate some of the visuals using that depth map? It would then be possible to better simulate the parallax effect due to distance.
Damn, Mick. You're brilliant. So many cool things in one short vid. Some of the little Blender shader tips alone are worth the time of this video. And then you pile all the Comfy and Leonardo and other stuff in a tight, crisp presentation . . . excellent.
really amazing work! thank you for sharing. subbed! :)
Hi @Mickmumpitz, are you considering a video on a "Sora"-like or Runway ML tutorial and workflow? Would love to try that in ComfyUI.
amazing!
This tutorial was incredible! Thank you, congratulations, and I wish you lots of success with your projects!
Great tutorial!! Thanks!!
Oh man. This is insane. Bravo. I'm going to try this. I find Blender painful, but I need to push through to see this. I'm on a Mac, so I'm wondering how I might see this inside my Quest 3?
Thanks for the tutorial. Looking forward to seeing how you solve the problem of moving around later.
That's pretty awesome.
Great, great!
Still hoping one day we get all of this just integrated into an AI program, so all I have to do is make a 3D scene, type some prompts, and adjust some values, and bam: a whole finished artwork exactly the way I wanted.
let's hope we'll never get there :P
Wow wow wow, this is so cool)
Leonardo lets you enable tiling, btw. But anyway, AI equirectangular projection is usually not exactly equirectangular, but it's better than nothing!
Yeah true, but it's more for textures and things like that, so unfortunately it doesn't really work with those images.
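The "not exactly equirect" point above has a precise reference: a true equirectangular image maps each view direction to pixel coordinates via longitude and latitude. A minimal sketch (axis conventions are an assumption here and vary between tools):

```python
import math

def dir_to_equirect_uv(x, y, z):
    """Map a unit direction vector to equirectangular UV coordinates in [0, 1].

    Assumes -Z is the view center and +Y is up; other tools may differ.
    """
    u = 0.5 + math.atan2(x, -z) / (2.0 * math.pi)  # longitude -> horizontal
    v = 0.5 + math.asin(y) / math.pi               # latitude  -> vertical
    return u, v
```

An AI-generated "360 panorama" that deviates from this mapping will show warping when wrapped onto a sphere, which is why the seam line and poles often need cheating.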
You are a passionate artist. And it is contagious 😊
Have a look at Ian Hubert's Compify plugin to transform your environment texture from emission to principled BSDF shader
Amazing!
Oh My!! That looks really difficult for me, but it seems really easy for you :) 🤕🤤
👋 Looking forward to watching this video
Superb tutorial, kudos. ❤️🔥
Amazing!!!!
This is so good! Is it possible to import the meshes (buildings + sphere) and materials into Unreal? I imagine you have to do some sort of UV projection first? Thank you for sharing your work!
I thought the same. I think if you export (.fbx) the entire object from Blender with the texture and material applied it should work. I need to try it.
Woow, it's really fresh technology for me.
Wow, amazing! I would love to create the most amazing scenes, like dystopian scenes, but I don't see how. Your tutorials are pretty advanced. Could you teach me, through your Patreon tier, to do something beyond the common Flux images?
Can you do this offline with something less involved than ComfyUI? Easy Diffusion or something else? Thanks.
Flux certainly does a great job, but it's only suitable for users who don't care about the background at all.
What background are you referring to?
@@FranzMetrixs I mean the unfortunate cooperation with Elon Musk, which makes Flux unusable for many users. Unfortunately, unfiltered images generated with Flux appear on his platform X with a very dubious message!
very cool!
wow
Very nice! Do you think you could generate trim sheets with ComfyUI to texture the 3D environment and assets?
Such awesome work bro ❤
Very cool - great work!
Is there a program where I can just prompt for these outputs? If not, why not make that? Why even program anymore? LLMs should be able to generate the right code for this.
Is it possible to render passes like depth and outlines in ComfyUI?
Can anyone tell me the best resources to learn about Stable Diffusion, LoRAs, models, ControlNets, etc.? Any YouTube channel?
Wouldn't this method only texture the facing side of the objects, forcing the viewer to remain in the middle of the scene? For example, if you were to go to the other side of the pillars, wouldn't you see nothing, as nothing was projected there?
You basically see that in the video, and he also mentions it when looking through the scene in VR.
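The limitation discussed above can be stated precisely: camera projection only reaches surfaces whose normals face the projecting camera, so everything behind or side-on stays untextured. A minimal sketch (hypothetical helper, not from the video):

```python
def is_textured_by_projection(normal, point, projector_pos):
    """A surface point only receives projected texture if it faces the projector.

    normal, point, projector_pos: 3D vectors as [x, y, z] lists.
    """
    to_projector = [p - q for p, q in zip(projector_pos, point)]
    # Positive dot product means the surface faces the projector.
    dot = sum(n * t for n, t in zip(normal, to_projector))
    return dot > 0.0  # back-facing (or exactly side-on) points stay blank
```

This is why the viewer has to stay near the original projection point: any surface where this check fails shows stretched or missing texture.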
So inspiring! Thank you!
Did you play around with Lightcraft Jetset already? Not only the Cinema version but also the free iPhone app? Would be great to learn about a Blender Jetset workflow. 👍
The exact same prompt didn't give me 360° pictures at all, unfortunately. Any idea about this?
Thanks!❤😮
Thank YOU! 😊
The image rendered with the outline shader only displays the object I selected and not all the objects present in the project. Does anyone know the reason for this?
Amazing... I'll stick with my NVIDIA stock... because I tried FLUX and it needs a very high-end GPU; at least my RTX 2080 is very slow with that model.
Are you still using your 2070 Super? I use a 3070 right now and am thinking to upgrade to a 4090, but if you're still using your old GPU, I think I'm gonna wait for the 5000 series to upgrade.
Bruh this needs a full video not 5 seconds 😭 16:28
Hi, if I have less than 24GB of VRAM (my GPU has 10GB), is it still doable?
Did you just throw on an Oculus and zap into the Blender scene? Wow, I didn't know it supported that. I have a bunch of headsets lying around lol
Yeah, you just need to activate the "VR Scene Inspection" add-on, connect the headset (I used SteamVR for this), and click "Start VR Session" in Blender. I was also surprised that it's so easy and will use it more often now!
@@mickmumpitz I've been dying to try that with Unreal Engine, but there's all this conflicting information about how they have changed how it works, or that it's not working lol. Unreal Engine is already so confusing, I never even bothered.
@@mickmumpitz It's super powerful, as you can use the headset as a camera. For POV videos you can crawl under things and do all kinds of interesting angles that would be a pain with normal camera animation.
@@resemblanceai Niceee. Can you build out the scene while in VR with your controllers, or is it just for viewing??
@@BabylonBaller TBH I am not sure. I have a feeling you probably can. I used the headset as a camera in Unity, with baking. I think you can bake the lights in Blender too. Example: ua-cam.com/video/MSRrpgVrOoQ/v-deo.html
Now Skybox AI by Blockade Labs makes sense. That's an awesome approach.
So, how did the Oculus Quest part work? Just connecting the Quest? And why do you see a 3D effect? This makes no sense to me.
Into CogvideoX ?
I do have a VR headset, but my laptop doesn't work with it… kinda sucks…
He must have a good GPU. @_@
Flux + Runway is baked in 3 minutes. It's not 3D, but this result isn't either.
Fantastic! What app are you using to explore the scene in the VR headset? A VR viewing option in Blender?
Amazing! Is there a way to skip Blender and just use CamtrakAR and a background generated by an AI? I don't mind if the background isn't designed exactly how I want it.