Hey everyone! In this video we show how to use a z-depth pass in Blender 3D to composite 2D elements within your renders with more control. Lots more 3D VFX tutorials are coming soon! Enjoy, and let us know what visual effects you want to learn next.
Our Blender Add-ons:
Ultimate Blender add-on value pack: www.blendermarket.com/products/add-on-value-pack-citybuilder3dkhaoslightarchitectcablecam-bundle
CityBuilder3D: www.blendermarket.com/products/citybuilder-3d
KHAOS (Ultimate Explosion/Destruction Add-on): www.blendermarket.com/products/khaos-ultimate-explosion-simulator
Spyderfy: www.blendermarket.com/products/spyderfy-bug-system-add-on
Nature/Creature add-on bundle: www.blendermarket.com/products/naturecreature-add-ons---spyderfy--nisarga-lite-ultimate-bundle
WeatherFX: www.blendermarket.com/products/weather-fx-add-on
Cablecam (Cinematic movement rig): www.blendermarket.com/products/cablecam-cinematic-camera-movement-rig
LightArchitect (Film setup previsualization): www.blendermarket.com/products/lightarchitect---filmmaking-add-on
KHAOS Fire Shader: blendermarket.com/products/khaos-fire-shader-fire-shading-simplified
Is this an alternative to just adding the fire as an images-as-planes asset in the scene? Is there a benefit to one over the other, or is this a "best practice" technique?
This is a good question. If your shot is moving through 3D space, the images-as-planes technique will work much better; this z-depth technique is one I use in more static shots. Its advantage over images as planes is that you can control the "softness" of the depth blend. With images as planes, the edges often don't blend as well right away, though of course you can tweak things in the compositor a bit.
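To illustrate what that "softness" control means (a plain-Python sketch of the idea, not the exact math of any Blender node): pixels whose depth is nearer than a chosen `near` threshold stay fully in front of the 2D element, pixels farther than `far` go fully behind it, and the band in between ramps smoothly. Widening the band softens the edge; the names and thresholds here are hypothetical.

```python
def soft_depth_matte(depth, near, far):
    """Map a raw depth value to a blend factor in [0.0, 1.0].

    0.0 -> pixel sits fully in front of the 2D element,
    1.0 -> pixel sits fully behind it.
    Widening the [near, far] band softens the transition edge,
    similar in spirit to tweaking a Map Range in the compositor.
    """
    if far <= near:
        raise ValueError("far must be greater than near")
    t = (depth - near) / (far - near)  # linear ramp across the band
    return max(0.0, min(1.0, t))      # clamp outside the band

# Example: four pixel depths, with a soft band between 4.0 and 6.0
depths = [1.0, 4.5, 5.0, 6.5]
factors = [soft_depth_matte(d, near=4.0, far=6.0) for d in depths]
print(factors)  # [0.0, 0.25, 0.5, 1.0]
```

With a hard cutoff (a very narrow band) you would get the same stair-step edges the images-as-planes approach tends to produce; the wide band is what buys you the smoother blend.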
@LightArchitect Thank you for your help, I'm working on a scene now 😂 Quick tip: I found that with the images-as-planes technique, if I increase the emission value and add the bloom effect in Eevee, it helps soften it a step further too!
@LightArchitect I saw a tutorial from Pirates of Confusion where they did relighting using normal maps, but it was a bit difficult to understand. You can also do these z-depth tasks by connecting the depth map to the asset's blue input and selecting luminance, like we do with a luma matte in After Effects; it's nearly the same in Fusion.
I figured out my problem, at least in Blender 4.0.1: there is no Z input on the Composite node. The terminology makes it difficult to even describe the problem, because every explanation is about enabling the Z depth render output, NOT about getting an input on the Composite node (the right-most node in the compositing node tree, where the image comes out) beyond "Use Alpha", Image, and Alpha. So I started a new file in Blender 3.6: clicked View Layer properties in the Properties panel, enabled the "Z" pass under Passes > Data, then opened the Compositing tab, checked "Use Nodes", and there was the Render Layers node linked to a Composite node with that wonderful "Z" input. I saved that file and opened it in Blender 4.0, and oops: no Z input on the Composite node. Looks like a bug.
Hello, please tell me why I see a black screen instead of a white one when I output the Depth pass. Because of this, Normalize doesn't help and nothing changes.
Nothing helps to get rid of the black screen, in my case, when I plug Depth into a Viewer or Map Value node. Adding a Normalize node in between doesn't work either. Could you please help?
I had the same issue, but I realised I was trying to preview the composite live through the camera. You have to render your image first, and then the compositor can use your passes.
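One more thing that can help with the all-black preview: the Depth pass stores raw distances in scene units (often values in the tens or hundreds), so viewed directly it clips to near-black. Once actual render data exists, a min-max rescale into [0, 1] is roughly what makes it viewable. Here's a plain-Python sketch of that rescale (an illustration of the idea, not Blender's exact Normalize implementation):

```python
def normalize_depth(depths):
    """Min-max rescale raw depth values into the [0.0, 1.0] range.

    Raw depth is in scene units (e.g. 10..100), which displays as
    near-black until rescaled -- the rough idea behind normalizing
    a Depth pass before viewing it.
    """
    lo, hi = min(depths), max(depths)
    if hi == lo:
        # Flat depth (e.g. empty render): nothing meaningful to rescale
        return [0.0 for _ in depths]
    return [(d - lo) / (hi - lo) for d in depths]

raw = [10.0, 55.0, 100.0]  # hypothetical per-pixel depths
print(normalize_depth(raw))  # [0.0, 0.5, 1.0]
```

This is also why normalizing a pass that hasn't been rendered yet shows nothing: with no real depth data behind it, there's nothing to rescale.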
Just use a mask and feather it. I have a tutorial on basic masking and compositing here --> ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-rmT_7lyII3I.html Thanks for watching!
@LightArchitect Thanks, it's obvious now that you say it. I'm not used to masking things in 3D, although I'm more of a video guy. Your channel and all your content are great, man.