It's videos like these that keep me going. There is a lot to learn on how to do all these things. Your workflow was great. Something I can follow. Blender is getting a bit easier, but that isn't saying a whole lot. Thank you for putting the work into this tutorial. You just got a subscribe from me!
I was just looking for a way to do this easily. You rock man. Thanks for sharing these workflows, I know how much time, effort, and trial and error goes into devising these methods.
Each new video of yours raises high expectations that are, of course, never disappointed. I share your interest in linking 3D modeling with AI, and you are always at the forefront in this regard. Thank you very much for the workflows and your excellent tutorials.
Excellent and inspirational. I am a beginner Blender user with plenty of 360 VR and some immersive VR world building experience. I wonder if I can walk through this and create a base model to iterate on. Just decided to challenge myself, your video was the inspiration. Thank you.
Nice one! The only thing missing would be some way to detect, from the mesh combined with the texture, where stretching occurs in certain regions, and then generate the missing texture for those areas so the scene doesn't fall apart when the perspective changes. Then use some displacement maps to get more detail. Binga bonga, nice scene!
My suspension of disbelief was destroyed when you made the knight two stories tall. Like... come on man... he's towering over all of those archways and openings.... 😅 -- Anyway, this is a super cool process. Thanks for putting these tutorials and workflows together.
This is something I was doing with SD 1.5 and the Latent Labs LoRA, but it was really low-res (no 3D model, prompts only). The 360° panorama looked great in VR (I made anime environments only, so it was easy to cheat the seam line), but projecting it onto a 3D model was what I was missing. This could turn out to be an efficient way to easily make 3D environments for VR games, especially static shooter games like The House of the Dead. Thank you very much for this wonderful tutorial.
Yeah, AI is the future of VR. It's going to blow past everything else once we get fast enough GPUs and people bring SD-style image generation into a controllable environment so you can walk around inside it. Then add AI-controlled characters and voice synthesis, and you have the Holodeck. It's going to be INSANE.
I was really stunned by that waterfall tho! I really like how the idea turned out, but can you please give us all a bit of info on how to implement animated water?
16:16 I used the workflow shown here! It works in a very similar way to the SDXL workflow, for example, but it uses AnimateDiff to create a looped animation! Fog, fire, water, and clouds work extremely well!
For your upcoming project, you might recreate the tiger scene from "Gladiator" set in the Colosseum, featuring an animated tiger restrained by a chain and a crowd simulation.
Still hoping one day we get all of this integrated into a single AI program, so all I have to do is make a 3D scene, type some prompts, and adjust some values, and bam, a whole finished artwork exactly the way I wanted.
Fantastic workflow. Thanks for sharing this. I wonder if it would be possible to separate some of the visuals using that depth map? It would then be possible to better simulate the parallax effect due to distance.
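The depth-layering idea above can be sketched in a few lines. This is a minimal illustration, not from the video: `parallax_shift` is the standard approximation that screen-space shift falls off with 1/depth, and `split_layers` is a hypothetical helper that bins depth samples into layers (foreground, midground, background) so each can be offset independently.

```python
def parallax_shift(depth, camera_offset, focal_length=1.0):
    """Approximate screen-space shift for a surface at a given depth.
    Nearer surfaces (small depth) shift more than distant ones."""
    return camera_offset * focal_length / depth

def split_layers(depth_samples, thresholds):
    """Assign each depth sample to a layer index (0 = nearest)
    by counting how many thresholds it exceeds."""
    return [sum(1 for t in thresholds if d > t) for d in depth_samples]

# Example: three samples split into near / mid / far layers.
layers = split_layers([0.5, 5.0, 50.0], thresholds=[1.0, 10.0])
```

Each layer could then be shifted by its own `parallax_shift` amount when the camera moves, which is exactly the depth-based separation the comment suggests.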
Oh man. This is insane. Bravo. I'm going to try this. I find Blender painful, but I need to push through to see this. I'm on a Mac, so I'm wondering how I might see this inside my Quest 3?
@@FranzMetrixs I mean the unfortunate cooperation with Elon Musk, which makes Flux unusable for many users. Unfortunately, unfiltered images generated with Flux appear on his platform X with a very dubious message!
Damn, Mick. You're brilliant. So many cool things in one short vid. Some of the little Blender shader tips alone are worth the time of this video. And then you pile all the Comfy and Leonardo and other stuff into a tight, crisp presentation... excellent.
This is so good! Is it possible to import the meshes (buildings + sphere) and materials into Unreal? I imagine you have to do some sort of UV projection first? Thank you for sharing your work!
I thought the same. I think if you export (.fbx) the entire object from Blender with the texture and material applied it should work. I need to try it.
Wow, amazing! I would love to create the most amazing scenes, like dystopian scenes, but I don't see how. Your tutorials are pretty advanced. Could you teach me through your Patreon tier how to do something beyond the common Flux images?
Is there a program where I can just prompt for these outputs? If not, why not make that? Why even program anymore? LLMs should be able to generate the right code for this.
The image rendered with the outline shader only displays the object I selected and not all the objects present in the project. Does anyone know the reason for this?
Wouldn't this method only texture the side of the objects facing the camera, forcing the viewer to remain in the middle of the scene? For example, if you were to go to the other side of the pillars, wouldn't you see nothing, since nothing was projected there?
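The limitation raised above can be checked with a simple back-face test. This is an illustrative sketch (not from the video): a surface receives projected texture only if its normal points toward the projection origin, which is why the far side of a pillar stays untextured under single-camera projection mapping.

```python
def faces_projector(surface_normal, surface_point, projector_pos):
    """Return True if the surface faces the projector, i.e. it can
    receive texture from a camera projection at projector_pos."""
    # Vector from the surface point toward the projector.
    to_proj = tuple(p - s for p, s in zip(projector_pos, surface_point))
    # Positive dot product means the normal points toward the projector.
    dot = sum(n * v for n, v in zip(surface_normal, to_proj))
    return dot > 0.0

# A pillar face turned toward the projector gets texture; the back does not.
front = faces_projector((0, 0, 1), (0, 0, 0), (0, 0, 5))   # True
back = faces_projector((0, 0, -1), (0, 0, 0), (0, 0, 5))   # False
```

Common workarounds are projecting from several viewpoints and blending, or inpainting the unseen regions, but with a single projection the viewer does need to stay near the original camera position.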
So inspiring! Thank you! Did you play around with Lightcraft Jetset already? Not only the Cinema version but also the free iPhone app. Would be great to learn about a Blender Jetset workflow. 👍
yeah, you just need to activate the "VR Scene Inspection" add-on, connect the headset (I used SteamVR for this), and click "Start VR Session" in Blender. I was also surprised that it's so easy and will use it more often now!
@@mickmumpitz I've been dying to try that with Unreal Engine, but there's all this conflicting information about how it has changed, how it works, or that it's not working lol. Unreal Engine is already so confusing, I never even bothered.
@@mickmumpitz It's super powerful, as you can use the headset as a camera. For POV videos you can crawl under things and do all kinds of interesting angles that would be a pain with normal camera animating.
@@BabylonBaller TBH I am not sure. I have a feeling you probably can. I did the headset-as-camera approach in Unity, with baking. I think you can bake the lights in Blender too. Example: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-MSRrpgVrOoQ.html
Are you still using your 2070 Super? I use a 3070 right now and am thinking about upgrading to a 4090, but if you're still using your old GPU, I think I'll wait for the 5000 series to upgrade.
Amazing! Is there a way to skip Blender and just use CamTrackAR and a background generated by an AI? I don't mind if the background isn't designed exactly how I want it.