
Generate 3D Sets for your Short Films! 

Mickmumpitz
82K subscribers
178K views

Published: 30 Sep 2024

Comments: 187
@TheFuzzypuddle · 1 year ago
It's insane that a person with creativity, persistence, and a willingness to learn can make a high quality short film with almost no resources. A phone with some apps, a computer with some video processing power, and some free and/or inexpensive ai tools will give you realistic composited visuals. The same equipment will give you a soundtrack and overall sound design. The tools are at our fingertips. The skill is knowing which tool to use for which task and having the imagination to combine their power to create something great. It reminds me of something Deadmou5 said several years ago when referring to creating music electronically: "This can all be done with a minimal amount of software which is why a kid can make a dance hit on a laptop."
@nichodgkinson7163 · 1 year ago
As a VFX artist with 22+ years of experience, I loved the line "if you shot on a green screen you can just simply key them"!! Very funny... It seems VFX is finally having (or about to have) its punk moment. Thanks for helping bring it forward!
@theonm.5736 · 1 year ago
Many people still don't get that AI will never get rid of artists, because of its own limitations. Artists who produce art are never going to die.
@yvesdalbiez1395 · 1 year ago
What's so funny about it?
@zo.mp4 · 1 year ago
@@theonm.5736 Those limitations will lessen over time. Eventually you'll just need an idea and AI will do it for you.
@harem137 · 1 year ago
It's come a long way... after that is the glam metal phase, watch out!
@videojames199 · 1 year ago
@@theonm.5736 I can’t emphasize this clearly enough, we used to think that AI couldn’t create “art”. Now that is highly suspect. The same thing will happen with the idea that we need artists to create high quality art. We are not special. We are bioorganic computers that are good at inference. That is all.
@arkadiuszm.j.wernicki7345 · 1 year ago
4:50 MID-LEVEL: Bro, in the Displace modifier you have this blue slider. Change it to 1; that will get rid of the "problem". 😉 Great video! 👀🍿❤️
@mickmumpitz · 1 year ago
Haha yeah, I should have mentioned that! The problem is that at a midlevel of 1, the strength of the displacement effect could no longer be changed (it just scaled the sphere). The difference between maximum and minimum displacement was too weak for me, the effect was hardly noticeable from the inside of the sphere. Do you have an idea how to fix this?
@arkadiuszm.j.wernicki7345 · 1 year ago
@@mickmumpitz What do you mean by "the displacement effect could no longer be changed"?
@LFPAnimations · 1 year ago
@@mickmumpitz You could also use a Color Ramp node on the displacement map in the material to adjust the displacement map's range. Just make the white values grey.
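A minimal Blender Python sketch of that Color Ramp idea, assuming a material named "SkySphere" and a depth image "depth_map.png" (both placeholder names, not from the video):

```python
import bpy

mat = bpy.data.materials["SkySphere"]          # assumed material name
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Image texture holding the AI-generated depth map
depth_tex = nodes.new("ShaderNodeTexImage")
depth_tex.image = bpy.data.images.load("//depth_map.png")
depth_tex.image.colorspace_settings.name = "Non-Color"

# Color Ramp compresses the value range: pull the white stop down towards grey
ramp = nodes.new("ShaderNodeValToRGB")
ramp.color_ramp.elements[1].color = (0.5, 0.5, 0.5, 1.0)

# Feed the remapped values into material displacement
disp = nodes.new("ShaderNodeDisplacement")
out = nodes["Material Output"]                 # default output node name
links.new(depth_tex.outputs["Color"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], disp.inputs["Height"])
links.new(disp.outputs["Displacement"], out.inputs["Displacement"])
```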
@genshian · 1 year ago
Mind blowing as usual! Blown away on the research and implementation you do. Exceptional!
@nrdkraft · 1 year ago
Here’s a thought. What if one took an HDRI of the real filming location, whether in front of the green screen or outdoors, et cetera. Then depending on how feasible it is, make a depth map of the original video, and smooth it out, and then displace the keyed video plane in front of the camera so that we effectively have a 3D version of the subject instead of just normal mapped. And then take the HDRI of the original film location, but give its brightness a power of minus one or something negative, and make that light only interact with the displaced video plane in hopes that depending on how accurate the the depth map is it could subtract the lighting of the original filming location (like in the case it wasn’t feasible to film outdoors on an overcast day, and have more distance lighting indoors but it’s also in practice to light it any other helpful way) and then add other lights and the environment otherwise to re-light the video plane as if the subject were another object in the scene. Probably way too overthought and impractical itself anyway but it’s a thought anyway😆 (I just wondered if there’s a way to record LiDAR on one’s phone in the video to use as the footage displacement map instead of generating it with AI)
@epessoarocha · 1 year ago
Loved the way you simplified all of this creation. Well done!!
@manecolooper · 1 year ago
Excellent, always pushing the limits of imagination!
@stephenjohnson3307 · 1 year ago
If you don't mind my asking, how long did it take you to make a single virtual set? I'm wondering because I'm thinking of making some content with this method, but I wonder if the workload is actually manageable with what I have in mind.
@RobertoBermeo_ · 1 year ago
0:04 how did you put yourself in an animated version?
@Alex-rr7qc · 1 year ago
Heck yeah! He's back 🎉🎉🎉
@SoulTuner · 1 year ago
I'm so grateful to you. I've been looking for a quick way to create virtual worlds for so long. And in this video you have revealed everything in such detail. I'm shocked, thank you!
@relaxmax6808 · 1 year ago
Are you open to creating a video clip for a musical remix without it costing me too much? I'm just a musician and I don't have a producer behind me. Thank you
@SoulTuner · 1 year ago
@@relaxmax6808 It's possible) Please send me your remixes, I want to hear them) And write how you imagine the clip. My email is in the description of my channel. Thanks.
@relaxmax6808 · 1 year ago
@@SoulTuner OK, thank you, I will reply to your email.
@LucienHughes · 1 year ago
Outstanding content as always. Pioneering stuff.
@bUildYT · 1 year ago
And for those who are using Android??? :(
@5timesm · 1 year ago
Sometimes I would like to see the name of the software used on screen... Was it Blender, was it DaVinci where you did the color correction? It would be helpful to see the names of the plugins used, especially when you're a newbie to this whole process... 😇
@Kareem_Essam · 1 year ago
Incredible, dude. I'm kinda disappointed because the relighting trick with the actor's normal map didn't go as I expected. If I have to do it, I use Photoshop and EbSynth, but that's a painful workflow.
@epelfeld · 1 year ago
It would be great to also see the full process without cuts, even as a paid mini-course. It's hard to grasp a lot of things without a deeper explanation.
@TheFilthLA · 1 year ago
Dude... these videos are incredible. You're finding ways of doing things I thought it wouldn't be truly possible to accomplish for another 5 to 10 years. Very cool! I have an ambitious short film I shot most of in 2019 that I've been wanting to finish but knew it would take a significant amount of money to create the fantastical world in the film's conclusion. This has given me hope that completing it might be feasible on a much smaller scale if I sit down and really focus on banging out in AI that would be very hard to do practically or with more traditional VFX techniques. Definitely spreading the word and keeping a close eye on your channel going forward. It's VERY EXCITING!! Thanks man, appreciate what you're doing here!
@TREXYT · 1 year ago
Midjourney also has a --tile option; nice video btw 😃
@Kenb3d1 · 1 year ago
Awesome work as always. Have you tried Tokyo_Jab's technique yet for generating more consistent frames in SD?
@azielarts · 1 year ago
You are a wizard! Thanks so much for sharing all of these explorations. Definitely taking notes!
@maya-akim · 1 year ago
Awesome job man! You always have great ideas
@grae_n · 1 year ago
Thank you for the kind words! The de-flickering normal maps sequence is amazing! I wouldn't have expected that to work so well.
@brandonjacksoon · 1 year ago
Thanks mate! Always awesome tutorials!
@toapyandfriends · 1 year ago
Subscribed
@MODEST500 · 1 year ago
I knew we would come to this stage. This is the teaching age, where the early adopters of the technology start educating the people who are just trying to work things out. AMAZING!!! Thanks for the workflow and for tying things together.
@titchc3657 · 1 year ago
So glad you have this channel, great stuff.
@patfish3291 · 1 year ago
Not so far from the classic workflow we have been doing for 20 years, only cheaper :D. The only thing that is really different is the possibility of AI image generation (at the start).
@StrongzGame · 1 year ago
Does anyone know if you can export the scenes you make into Unreal Engine?
@jagsdesign · 1 year ago
It has been an amazing learning curve, and it certainly made me watch a few times to really grasp the technique and workflow. Amazing work and a super cool explanation, with a mixture of cross-working toolsets that again makes the learning process fun. Thanks
@Caged_Monuments-x6p · 17 days ago
I think the AI revolution is about doing everything without Adobe, since they've become a "you will own nothing and be happy" type of company. That's why I use DaVinci Resolve.
@jonathankr · 1 year ago
I've watched 4 of your videos now. You make everything seem possible. I know it's complicated and you are a genius. Do you do commissions?
@Bluemograph · 2 months ago
AI will never replace filmmaking; the reason is very complex. AI is not the standard for filmmaking (except that plugins in some software may help and solve filmmaking problems). AI is not professional filmmaking; you're not done by just clicking to generate one video and competing with a professional company.
@PrairieFawkes · 2 months ago
For anybody having issues with your steps being visible and creating jagged lines and you can't figure out what it means to save in higher bit depth, check to see what kind of file you're saving from Auto1111. If you're saving JPG files, then those files have a lower bit depth than PNG files. Switch the save file type to PNG in Auto1111 settings and download the depth map as a PNG. This solved the issue for me.
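If even a PNG export still shows visible steps, one way to soften the banding outside of Photoshop is to blur the map slightly and promote it to 16 bits before displacing. A minimal sketch with Pillow and NumPy; file names are placeholders, not from the video:

```python
import numpy as np
from PIL import Image, ImageFilter

img = Image.open("depth_map.jpg")            # 8-bit source, e.g. an accidental JPG export
print(img.mode)                              # "L" or "RGB" means 8 bits per channel

gray = img.convert("L")
gray = gray.filter(ImageFilter.GaussianBlur(radius=2))   # soften the visible steps

# Promote to 16 bits so later edits (levels, displacement) have more headroom
depth16 = np.asarray(gray, dtype=np.uint16) * 257         # 0-255 -> 0-65535
Image.fromarray(depth16).save("depth_map_16bit.png")      # Pillow writes this as a 16-bit PNG
```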
@Strawberry_ZA · 7 months ago
Hi Mick! Amazing tutorial! I'm trying to reproduce the workflow but I'm stuck on the depth map white value alteration. I don't know how to do this (I have Photoshop, but no skills). Also, I can't overcome the blocky appearance of the depth map: I save the images at 32-bit depth but the same blockiness is still there.
@ChristophTallerico · 1 year ago
I'm having issues with the "image plane from visible ref" step. When I click on the "image plane..." button, set the shader as Principled and distance to 1 nothing happens. Am I missing a step to get the image plane in my scene?
@jordanco · 1 year ago
Your skill and dedication are insane. Thanks for sharing this
@iamvulgar8188 · 1 year ago
I have 16 GB of RAM, an RTX 3050, and a Ryzen 5 2600. I create my 3D setting in Blender, add a Displace and a Subdivision Surface modifier, and it just tanks my system and Blender becomes unresponsive; if I try to add a 2nd subdivision it crashes, and if I try to render it crashes. 100% RAM, and before I turned off GPU acceleration it was using 100% of that too. This doesn't seem like a very demanding thing to do, and my setup seems to be plenty for something like this, so why is it crashing Blender?
@zhehang1986 · 1 year ago
The new depth-based relighting feature in DaVinci Resolve 18.5 may be useful in this workflow. Maybe you don't need to build normal maps???
@ValentinesProducts · 4 months ago
Great stuff! Do you know how I could find someone to teach me how to use these tools to make cool short films myself?
@quantitativemedia182 · 1 year ago
My thoughts greatly aligned with yours... I have been fantasising about using AI to produce a short film for a while now. Just yesterday I found Blockade... and then boom 💥 today I found your page... please can we connect and interact one on one?
@EricLefebvrePhotography · 1 year ago
You should revisit this tutorial using the new relight functionality of DaVinci Resolve Studio BETA.
@VaibhavShewale · 1 year ago
soon we will get more awesome yt short films and more backroom videos!
@DRiMe69 · 1 year ago
Great video. Can you talk a bit more about preparing the 3D environment, particularly those last steps once you have the depth map from ControlNet? I'm not familiar with the flow for how to reduce the black and white values in PS or change the file's bit depth.
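For the questions about that Photoshop levels step, a rough script equivalent (my assumption of what the adjustment does, not the author's exact settings) is to remap the depth values into a narrower output range, so pure black and white become softer values, and then save at 16 bits:

```python
import numpy as np
from PIL import Image

depth = np.asarray(Image.open("depth_map.png").convert("L"), dtype=np.float32) / 255.0

out_black, out_white = 0.1, 0.85                 # new output range, tweak to taste
remapped = out_black + depth * (out_white - out_black)

depth16 = (remapped * 65535.0).astype(np.uint16)
Image.fromarray(depth16).save("depth_map_levels_16bit.png")
```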
@foufedu74 · 1 year ago
Hello everyone, I get this error: Error code: 1 stdout: stderr: Traceback (most recent call last): File "", line 1, in AssertionError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check. Does anyone have an idea how to fix it?
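The error message itself names the workaround. One way to apply it on a default Automatic1111 install on Windows is to edit webui-user.bat (a sketch; only do this if you actually want to run without a CUDA GPU, which is very slow):

```
rem webui-user.bat
set COMMANDLINE_ARGS=--skip-torch-cuda-test
call webui.bat
```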
@LFPAnimations · 1 year ago
Some really cool tools highlighted here. I am really interested to see where this workflow goes as the AI tools improve. I am a professional compositor and I have to say the "re-lighting" method made me cringe a bit. The normals are way too inaccurate for that ever to work. One idea for a fix, though, would be an image-to-3D-model AI, so you can project the video back onto a generated human 3D model. You can then use that geo to catch the lighting of the scene. This would also solve the flickering problem from the normals.
@photobackflip · 1 year ago
Lol you think the future of filmmaking is fucking SKYBOXES.
@DivineMisterAdVentures · 1 year ago
I never thought the day would come. Well, yeah, I did. I guess a couple of my stories from long ago cover it. But it seemed far away when I was trying to break into the Hollywood writer scene. It looks like even a single actor can play many roles.
@erichylland4809 · 1 year ago
For Stable Diffusion 1.5 there is now a 360 LoRA
@oscarjimenezgarrido7591 · 1 year ago
_The future of filmmaking_ Kerry Conran more than 20 years ago: _Here, hold my DeLorean..._
@pixel325 · 1 year ago
What an excellent idea, and it does look OK. I'll have to try this out! Thanks for the inspiration!
@aiakym · 1 year ago
Hello brother, please consider covering the software options available for this process and the steps involved in converting 3D animation videos to 2D animation videos.
@AmitYt_108 · 1 year ago
Thanks for this great video... Is there any easy way to make a cartoon short movie using AI, taking a scene from an actual movie?
@bailahie4235 · 1 year ago
That smile of "I enjoy this landscape" (while in fact you see the most boring green in existence)! 😂 😜
@janjansen6443 · 1 year ago
I am kind of scratching my head here. What's the point of all this? It seems to be subpar for loads of work. It seems like it would be a lot quicker and yield better results to just make all of this in Blender or UE5, render a world in it, or even preload one into it. Other than people and other objects that need more realistic direct interaction on screen, you don't need all those other tools.
@nightmisterio · 1 year ago
Can we use a 3D model background in Unreal Engine with all those tools as well?
@TheAllthegoodstuff · 1 year ago
Thanks for the training Mr Muppetz 😊 much love 🙏 ❤
@Ollacigi · 1 year ago
I'm a 3D animator; the hardest part of my craft is creating a good environment... and this has become my new favorite channel!!
@knottage · 1 year ago
There is so much in this video I have no idea about... still fascinating to watch though
@yashmehta6054 · 1 year ago
Hey, great tutorial! But can you make it more beginner-friendly? After some time I got lost :( Thank you!
@nostalgiaprobe · 1 year ago
This needs a high GPU/RAM computer right?
@RGBJD · 10 months ago
Very nice and great information. Thanks! Please do share more of this.
@thestorybank · 1 year ago
Can you create something like LeiaPix is doing with Blender? If yes, can you please make a tutorial about it?
@camiladorin4153 · 1 year ago
Hi, can you please provide some explanation of how to shift the depth image values in Photoshop?
@M4rt1nX · 1 year ago
You can record using an iPad Pro as well if you don't have the right iPhone.
@lanceanthony198 · 1 year ago
Future of soulless artless creative works flooding and over-saturating every genre
@mooh1245 · 1 year ago
Can you please do one on creating a 3D short film, but using Unity instead of Blender?
@futuretechgeekgal · 1 year ago
super and will try these techniques myself
@mohaneedabdo3366 · 1 year ago
love yourrrrrrrrrrrrrrrrrrrrrrrrrrrrrrrr videos you are amaaaaaaaaaaaaaaaazing thank you very much you are unique
@rallyworld3417 · 1 year ago
Oh my god, this is 1000 times better than Corridor Crew
@nutzeeer · 1 year ago
I bet the high-res but seamed depth map can be layered over the better but low-res variant, to remove the seam.
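A rough sketch of how that layering could be scripted, assuming placeholder file names and that the seam of the equirectangular map sits at the left/right image edges; only the outer 10% is cross-faded to the seamless low-res map:

```python
import numpy as np
from PIL import Image

hi_img = Image.open("depth_hires_seamed.png").convert("L")
lo_img = Image.open("depth_lowres_seamless.png").convert("L").resize(hi_img.size, Image.BICUBIC)

hi = np.asarray(hi_img, dtype=np.float32)
lo = np.asarray(lo_img, dtype=np.float32)

# Blend weight: 0 in the middle of the image, rising to 1 at the left/right edges
w, h = hi_img.size
x = np.abs(np.linspace(-1.0, 1.0, w))
mask = np.clip((x - 0.9) / 0.1, 0.0, 1.0)
mask = np.tile(mask, (h, 1))

blended = hi * (1.0 - mask) + lo * mask
Image.fromarray(blended.astype(np.uint8)).save("depth_blended.png")
```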
@rajendrabiswas · 1 year ago
How do you create 3D avatars that you can act with?
@EricLefebvrePhotography · 1 year ago
The volume has another trick: a stage that can rotate, since the screens don't go full 360 degrees. I'd be happy to shoot with a really good, large, short-throw laser projector with UE5... you could do that for less than 6k (USD). Ryan Connolly did a few REALLY impressive tests with short-throw projectors and Unreal. Was watching another video and a company sells a small LED wall setup for virtual production for as low as 10k (USD)... not EXACTLY available to us indies but still a move in the right direction.
@coloryvr · 1 year ago
WOW! What a great video! I've been experimenting with stable diffusion for months on my VR generated worlds. This video takes me a big step further.... BIG FAT FANX! So far I have used SD to generate endless textures for creating VR brushes or to integrate Deforum clips into 360 degree clips with Davinci Resolve. But: My main interest in connecting VR and AI is the possibility of transforming a 360 degree clip (based on a custom VR art world) into something else using SD Deforum...unfortunately I haven't managed to get a decent workflow done yet...mainly because of the large amounts of data ...(?) Happy colored greetinx!
@michail_777 · 1 year ago
Thank you for the video tutorial, as always very informative. There is a question, if anyone knows, explain briefly: in ControlNet there is a Preprocessor and there are Models. Among the models, for example, there is "control_canny-fp16" and there is "control_sd15_canny.pth". What are the differences between these models? The sd15 one weighs about 1.5 GB, but the fp16 one is 720 MB. Is "control_sd15_canny.pth" an older model, or what is the difference?
@digital_magic · 1 year ago
awesome tutorial, very interesting and well explained
@Truth621 · 1 year ago
I appreciate your work, thanks
@DanielPartzsch · 1 year ago
Cool idea. You should use the Midlevel slider in the Displace modifier to shift the displacement from the center outwards; this way you can keep the original depth map values and do not have to compress them. Also, instead of a UV sphere, using an ico sphere helps with getting more evenly distributed subdivisions and thus more predictable displacement on the sphere mesh.
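A minimal Blender Python sketch of that setup (placeholder names and values, not the video's exact settings): an ico sphere displaced by the depth map, with Midlevel at 1 so the map does not need to be compressed first:

```python
import bpy

# Ico sphere gives more evenly distributed vertices than a UV sphere
bpy.ops.mesh.primitive_ico_sphere_add(subdivisions=6, radius=10.0)
sphere = bpy.context.active_object

tex = bpy.data.textures.new("DepthTex", type='IMAGE')
tex.image = bpy.data.images.load("//depth_map.png")       # assumed file name

mod = sphere.modifiers.new("DepthDisplace", type='DISPLACE')
mod.texture = tex
mod.texture_coords = 'UV'
mod.mid_level = 1.0    # displacement is measured from the texture's white point
mod.strength = 5.0     # sign and magnitude depend on which way the sphere's normals face
```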
@readmore3490 · 1 year ago
Do you slide the Midlevel slider to 1? I don't know how to compress the depth map values and save at a higher bit depth, so your method might be easier 😅
@DRiMe69 · 1 year ago
I’m also curious about this since I don’t know how to reduce the black and white values in PS or change this file’s bit depth.
@simonflash_edit · 1 year ago
Your videos are great! Keep it up :)
@relaxmax6808 · 1 year ago
🎸🎹🎶📺🧮🎥 Is there an advanced amateur among the commenters on this video who would work on an animation clip for a musician? Low budget, unfortunately, for the moment. Leave your message here to discuss in more depth. THANKS
@nightmisterio · 1 year ago
Can we upload a pre-painted image?
@borisreimelt · 1 year ago
I really love the concept of "virtual production". Bringing a mix of 3D-assets/scenes and (AI modified) real footage into a game/physics engine which serves as a studio environment. Game Engine - not Blender/Maya... ! This idea is huge. As you said: not everyone can afford a studio environment with LED screens, but VR/XR headsets can do a big chunk of the work. Flipside VR for Oculus would be a candidate to leverage VR for virtual production.
@Thierrystheories · 1 year ago
Dude this is mind blowing
@Exilleron · 1 year ago
Wonderful work. As a German hobby artist interested in tracking and matchmoving, I was pretty amazed by your creativity and in-depth explanations. I left a sub here. Good luck, my friend.
@josealberto-rj1si · 1 year ago
New Relight node in Fusion; generate the normal map with AI.
@reelwurld-studios · 1 year ago
Thank you for this video. I was looking for this exact suite of tools and workflow 🙏🙏🙏
@arreshubham · 1 year ago
Suggestion: you can use the Omniverse Blender extension to convert OBJ or any other Blender-supported format to USD
@coltish · 1 year ago
Love the content. I just watched something the other day about scanning someone into the Unreal Engine. I wonder if that process is simpler?
@coltish · 1 year ago
It’s a Matt Wolfe video on using AI to create 3D games. I want to get my hands on Unreal’s metahuman. But your method allows for real human actors.
@mauimrc · 1 year ago
Awesome video! I'm gonna try these on my giant L.E.D. production studio tomorrow.
@2000ssher · 1 year ago
Hi Mick, thank you for all the wonderful videos. I am trying to find the easiest way, as I am not technically savvy, to create a music video with green screen/3D/iphone. It looks as if this video can help. If you have any suggestions, I would greatly appreciate.
@Dopamite · 1 year ago
Wow! What a brilliant idea to use ai generated normal maps to relight live action footage.
@ABC-tt7qe · 1 year ago
Holy cow, you are AWESOME! I can't wait to give this a shot!
@laporte1625 · 1 year ago
Mind. Blown. Thank you very much for this awesome presentation.
@choptop81 · 1 year ago
Waaaaah, it's too hard, can't I just put in prompts?
@upstairsneighbor4182 · 9 months ago
You're the man. Thank you for sharing your workflow. I was going down this rabbit hole blind for a year, and you, sir, are the Rosetta Stone. Muchas gracias.
@iLLStaticProductionsBeats · 1 year ago
Good stuff, very creative and informative. Thanks for the effort you put into creating this video and teaching the options available.
@lucasparra2585 · 1 year ago
Amazing content! For me this is still too advanced. I don't even know how to use Stable Diffusion or Blender
@lorenzmeier8020 · 1 year ago
Yeehaaaaa, wonderful, thanks for explaining this tool! 🎉
@Way_Of_The_Light · 1 year ago
You’re soo underrated man 😭🙏
@Glowbox3D · 1 year ago
This was dense, a lot to digest, and it's awesome. Cool workflow. Thank you.
@gabrielrolim · 1 year ago
hey! how do I just export the normal maps from a batch file?