That's deep pixel compositing. It's not the same thing. That's just using the position pass to create masks. Deep in Nuke is a totally different thing and way more powerful
I wish I could get a copy of Nuke that would let me work in 4K, but I can't afford it. I'm making a documentary for my synagogue, and there's a lot going on: for example, in the book of Exodus, the death plague, Moses receiving the Ten Commandments, and the burning bush. I'm no expert by ANY means, but I like the simplicity of Nuke over Fusion. The software I've had formal training in is 3ds Max, Maya, Avid Media Composer, and AE (YUCK). I would love to find formal training in Nuke so I can do projects for my synagogue as an advanced hobby. I'd very much enjoy chatting with you about Nuke and the lessons I'm converting into documentaries. Thank you for this video. Cheers
You can work in 4K in Nuke Non-Commercial, you just can't render it out higher than HD. I know it's not ideal, but I suppose that's why they limit it; otherwise they'd make no money because everyone would just use the free version 🤣 If you'd like to chat, I have a VFX mentor tier on my Patreon where I help people with their projects and go over anything they want to discuss. That might be what you're after!
I learned most of what I knew before working at a studio just from watching YouTube videos. I used to use Blender and After Effects. Blender is free and has loads of great channels making tutorials, like Blender Guru and Grant Abbitt. I'd suggest getting into that, watching some tutorials and learning for yourself. Then if you want to take it more seriously, you can study VFX at an educational institution like a university or college.
Hi, thanks for the run-through. Does anyone know why, when I follow a similar workflow and open the exported FBX file in Cinema 4D, the resolution is always incorrect? The FBX export doesn't seem to retain the resolution of the timeline. Thanks.
3D cameras don't have anything to do with the project resolution. They have their own sensor size etc. that's independent of the resolution of the shots. I don't use C4D, but I'd imagine you just need to set the project resolution to the correct size when you first make the scene. I do the same in Blender.
I'd like to add to what you said about the PMatte. The PMatte works well with position data for static geometry, but it fails if your geometry moves; in that case you'll need a PRef (position reference) pass.
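To illustrate the static case: a position matte is essentially a distance falloff evaluated on the world-position pass, which is why it slides off moving geometry (the world positions change every frame, so you need a stable reference pass instead). A minimal numpy sketch of the idea; the function name and `falloff` parameter are my own, not Nuke's actual PMatte internals:

```python
import numpy as np

def position_matte(position, center, radius, falloff=1.0):
    """Spherical mask from a world-position pass of shape (H, W, 3).

    Pixels whose stored world position sits at `center` get 1.0,
    fading to 0.0 at `radius`.
    """
    dist = np.linalg.norm(position - np.asarray(center, dtype=float), axis=-1)
    return np.clip(1.0 - dist / radius, 0.0, 1.0) ** falloff

# Toy 2x2 "position pass": each pixel stores a world-space XYZ
P = np.array([[[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]],
              [[0.0, 2.0, 0.0], [5.0, 5.0, 5.0]]])
m = position_matte(P, center=(0, 0, 0), radius=2.0)
# m[0, 0] → 1.0 (at the centre), m[0, 1] → 0.5, bottom row → 0.0 (outside)
```

If the object animates, the XYZ values baked into `position` change per frame and the mask drifts, which is exactly the problem a PRef pass (positions frozen at a reference frame) solves.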
Awesome video as always, thanks for sharing with us! I have one video idea just to toss out; it’d be great to have a professional compositor do a video of all the common terms such as Mult, Pre-mult, ST Map etc. Literally just a talking head video, I would watch it at least 10 times to commit it all to memory.
One of the biggest issues with Fusion/Resolve is that there are 500 bad videos for every 1 good tutorial. Most of them seem more interested in putting their face on camera than giving you a good education. There are a lot of issues with Resolve/Fusion, but it's still a million times better than using Premiere for editing. I still use After Effects occasionally, but it has become so slow over the years that I avoid it when possible.
The info in the video is pretty good, but I do find it funny that someone boasting the "ULTIMATE VFX WORKFLOW" has obscenely choppy webcam footage. Keep up the good work!
Amazing work bro! Let me ask you: do you recommend that a small team of 3 use Blender + Nuke instead of Maya and Nuke for VFX on an independent film series? Do you have problems working with Blender and Nuke, since Maya has more integration tools with Nuke?
Thanks! I wouldn't say Maya has more integration with Nuke. There's no overlap other than custom tools that studios have made themselves. You're probably better off using Blender in a small team.
@AlfieVaughan Thanks for sharing your thoughts. I have a small team and we don't have the budget for Maya, so I was looking for a solution for high-end VFX without breaking the bank. I'll follow your experience and use Nuke + Blender.
I don't get paid to promote Nuke. I just reached out to them after paying for Nuke myself for a couple of years and asked if they'd be interested in providing a license, which they did in exchange for me making some of their official tutorials. I don't think it changes my perspective at all. And like I said, if I wasn't getting it for free I would totally pay for Indie.
Working at some of the best VFX studios in the world for 7 years? 🤷‍♂️ But in all seriousness, by definition a professional is someone who gets paid to do a job. I get paid to spend 50+ hours a week working as a compositor. How's that?
Do you have a video for the vector pass? Haven't been able to even view what it writes into the exr file, let alone use it in a functional way inside Nuke. Thanks in advance!
I don't, sorry. I always render my motion blur in 3D, as it looks better and makes the comp less heavy. But all you should have to do is add a VectorBlur node, set the vector channel to the motion vectors, and then increase the strength. I also usually set the dropdown to uniform, which makes it behave better.
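For anyone wondering what a vector blur is actually doing: it smears each pixel along the motion vector stored in the vector channel. A toy gather-style numpy sketch of the concept (names and sampling scheme are my own simplification; Nuke's VectorBlur node is far more sophisticated):

```python
import numpy as np

def vector_blur(img, vectors, samples=5):
    """Naive gather-style motion blur.

    Each output pixel averages `samples` taps taken along its motion
    vector, from half a vector behind to half a vector ahead.

    img:     (H, W) single channel
    vectors: (H, W, 2) per-pixel motion in pixels (dx, dy)
    """
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[0:h, 0:w]
    for i in range(samples):
        t = i / (samples - 1) - 0.5  # -0.5 .. +0.5 along the vector
        sx = np.clip(np.round(xs + t * vectors[..., 0]).astype(int), 0, w - 1)
        sy = np.clip(np.round(ys + t * vectors[..., 1]).astype(int), 0, h - 1)
        out += img[sy, sx]
    return out / samples

# Example: one bright pixel with a uniform 4 px horizontal vector everywhere
img = np.zeros((1, 8)); img[0, 3] = 1.0
vec = np.zeros((1, 8, 2)); vec[..., 0] = 4.0
blurred = vector_blur(img, vec, samples=5)
# the energy spreads across x = 1..5, 0.2 at each tap
```

The "uniform" behaviour mentioned above corresponds to every pixel using one consistent vector, as in this example, which is why it tends to look more stable than per-pixel variation.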
Man, those 9 minutes summarized my 3+ years of trying to figure out the Blender-Fusion VFX workflow... I hope everyone taking on a remotely similar journey finds this video right away!
Ah good, I'm glad! I've also got a video about my whole workflow that you might like. It's not Blender to Fusion, but there are probably some gems in there too in terms of render setup from Blender.
A big plus of Fusion for me is that I'm the only person in my studio using it. I've set up a Fusion render farm (which is free) on all our PCs, and now I have more than 20 PCs calculating my compositions. It's insanely fast, and all that for $300!
How is it a plus that you're the only person using it? You can't pass off the project or collaborate with ease. If you go on vacation or are sick, no one can open your project. Don't say job security either, because the moment those situations arise, a smart lead (producer or owner) would restructure. I also don't know why you have 20 computers rendering composites from a single artist when one decent computer can handle it in the evenings. Don't tell me you're using other people's stations while they're working. 🤦‍♂️
@I3ra That's quite a narrow-minded perspective, in my opinion. While you're technically right about it being difficult to hand off, if you're doing a shot that you know definitely won't be picked up by someone else, it's not a big deal. Other software can have tools that are uniquely useful and good for problem solving. I've been using Blender quite extensively for the last 7 years while working as a compositor at VFX studios. No one else uses it, and no one can pick up my work if I do something in Blender, but for standalone shots where I know it's just me, it's solved some enormous problems that would have taken 5 times as long to do solely in comp. As for the render farm, valid point, but maybe their individual machines aren't that good? I've also done jobs at 15K that took 7 hours to render overnight. If you're working late and need to send something sooner than that, it could become a 21-minute render with other machines. More power is never a bad thing.
That is very cool. I would love a longer Fusion tutorial for Nuke people. Now that I'm working for myself I've switched to DaVinci, but Fusion is still somehow uncomfortable for me.
Thanks! A lot of people have asked for it but I don't feel like I know fusion well enough to teach it. What you see in this video is more or less everything I've learned to do in it 🤣
In Fusion, wouldn't the Lens Distort node do the undistort trick? Note that I'm not referring to the Lens Distortion node, which is a different node and just a simple lens effect. Lens Distort is also able to load external distortion data.
What about the speed difference between Nuke and Fusion? The effects, especially the GPU-accelerated ones, and the speed of reading file sequences into the cache.
@AlfieVaughan At some point I read opinions that Fusion/DaVinci was much faster since it made better use of Nvidia GPUs, even multiple cards at the same time, but seeing how much more expensive Nuke is, maybe that's not true.
@RealTimeFilms I don't know enough to give a valid perspective on that, I don't think 🤣 But Nuke has a lot of GPU-accelerated nodes too, especially for heavy stuff.
There was a time when people felt the same way comparing Resolve for editing to other NLEs. It was good and could get the job done, but was missing a lot of features. Now a lot of those features are there. The thing to remember is that BMD's main goal with their products is to make affordable solutions that are just as good as the pro tools. It's true for every single product category they're in. There's almost always a pro tool that's considered the industry standard, has more features, and costs an order of magnitude more. It's true for the cinema cameras, the broadcasting gear, and the software. I say almost because in color correction, Resolve is that leading tool. Your video showcases exactly that: it's not supposed to be better than Nuke, but you can do a lot of the same things for much cheaper. BMD keeps adding features. I was playing today with the new uVolume node for VDB files (it's in the Resolve 19 beta), and they added USD support last year too. The tool keeps getting better, and I don't have to pay anything for those upgrades.
I use Flame for running the edits at work. I haven't done any comping with it, but we use it for conform/online etc. and I think it's great. I'd much rather comp in Nuke, which I still do, but as a timeline tool Flame is brilliant.
@AlfieVaughan I'm curious about swapping out Flame for Fusion for online work. We grade in Resolve, and having both in one package would save a lot of time and the headache of exporting back and forth between the two. Do you think Fusion is a capable competitor to Flame for 2D online work like that?
Thanks! Could be, but I doubt it, as several 3D packages are able to render it, so it's not exclusive in that sense. I think it's more likely that no one else has bothered putting it in a compositor, as it's not an important feature for 99% of the user base.