Hey everyone. Let me know what you think about this video. Was it too long, or too short? Should I break these tutorials up into multiple parts, YouTube Shorts, etc.? Any thoughts are valuable.
@@GKhan-un7py 😅 My lil baby was sleeping so I mixed the audio on some tiny earphones. Sounded great on them but like trash on proper speakers. My other videos shouldn't have this issue. Thanks for stopping by and checking it out
@@Toraon I composited the dust using 2D elements. I had to do a planar track to get them to stick to the shot. I've decided to upload the project and share it here. Check the description in 5 to 10 minutes for the files. You can download and explore the whole thing.
The length of the tutorial is perfect!! We prefer longer ones so we can learn better. When it's too short, we miss the details we need to learn. Thank you again, and more 3D with the DaVinci camera tracker and Blender 3D models please!!!!!
Great tutorial my friend! Please, I have a question: for some reason, when I import the scene from Fusion to Blender, the tracking is messed up and the objects keep sliding. I had a solve error of 0.27 in Fusion, which I guess is neat, but I don't know what's happening in there. Can you please help? Thanks
Sure thing. At 6:52 you can see the focal length and the plane the footage would be on don't line up. You will need to set the camera's focal length in Blender manually to stop things from sliding around. Lemme know if it helps.
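If your solver only reports a field of view instead of a focal length, you can convert it to the millimeter value Blender's camera expects. A rough sketch in plain Python (the FOV and sensor width here are made-up example numbers, not from the tutorial's shot):

```python
import math

def fov_to_focal_length(fov_deg: float, sensor_width_mm: float = 36.0) -> float:
    """Convert a horizontal field of view (degrees) to a focal length (mm)
    for a camera with the given sensor width."""
    return (sensor_width_mm / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

# Example: a 73.7-degree horizontal FOV on a full-frame (36 mm wide) sensor
print(round(fov_to_focal_length(73.7), 1))  # → 24.0 (mm)
```

Then just type that value into the Focal Length field on the imported camera in Blender.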
@@terr20114 yo! It works!!!! OMG YOU SAVED MY LIFE BRO, I had this issue and had been working on it for more than 3 hours straight lol! Thanks a million
@@anchorfeast 😁Nice! I'm glad I was able to help. This issue stressed me out for DAYS before I realized what was happening. For some reason either the focal length doesn't get exported, or Blender doesn't import it.
@@DRLZEca Pretty much: in the EQ for your music you create a wide dip where your highs and lows are still audible but your mids are brought down, so they don't clash with your voice audio.
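In code terms, that mid dip is just a peaking EQ band with negative gain. Here's a rough sketch using the standard RBJ Audio EQ Cookbook biquad formulas — the 300 Hz / -6 dB / Q values are made-up examples, not settings from the video:

```python
import cmath
import math

def peaking_eq_coeffs(fs: float, f0: float, gain_db: float, q: float):
    """Biquad coefficients for a peaking EQ band (RBJ Audio EQ Cookbook).
    Negative gain_db gives a dip centered at f0; low Q makes the dip wide."""
    a = 10 ** (gain_db / 40.0)
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [1 + alpha * a, -2 * math.cos(w0), 1 - alpha * a]
    a_coeffs = [1 + alpha / a, -2 * math.cos(w0), 1 - alpha / a]
    return b, a_coeffs

def gain_at(fs: float, freq: float, b, a) -> float:
    """Magnitude response of the biquad at one frequency, in dB."""
    z = cmath.exp(-1j * 2 * math.pi * freq / fs)
    h = (b[0] + b[1] * z + b[2] * z * z) / (a[0] + a[1] * z + a[2] * z * z)
    return 20 * math.log10(abs(h))

# A wide -6 dB dip in the mids around 300 Hz
b, a = peaking_eq_coeffs(fs=48000, f0=300.0, gain_db=-6.0, q=0.5)
print(round(gain_at(48000, 300.0, b, a), 1))  # → -6.0 dB at the center
print(gain_at(48000, 10000.0, b, a))          # highs stay close to 0 dB
```

In practice you'd do this with your DAW's EQ plugin rather than code, but the shape is the same: cut the mids, leave the lows and highs alone.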
Sorry. I had a sleeping baby in the house, so I mixed the audio on a tiny lil Skullcandy earphone. It sounded great on it, but once I got to hear the video on proper speakers I realized it sounded like hell.
@@Benn25 it can be. Personally I think it comes down to how comfortable you are with the tool. Blender’s camera tracker is good but requires more manual work so for a more complex track it might be better. Resolve’s tracker can be a quick, one click auto solution for an easy shot like this one. On my personal and client projects I use Syntheyes but I don’t use it on tutorials since it’s a whole separate paid application which would require a tutorial for itself. So if you’re fast and efficient with Blender’s tracker then it might be the better option for you. (Ian Hubert has a really cool tutorial)
Hey, I decided to upload the project file (linked in the description). I improved the dust composite a bit before I did. Sadly I can't change the video but the composite looks a bit better now.
@@terr20114 Cool workflow and compositing, my friend. Do some particle simulations and have some dust and dirt flick and bounce off the "lens" — with the right sound it'll boost the effect immensely.
@@dirtreynolds7723 I was thinking of doing some smoke sims and adding flames to the spaceship but that was way out of the scope of the tutorial. It 100% would have made it better though.
I love how fast and “automatic” Fusion’s tracker is. It’s less work than Blender’s when I need a quick auto track, unless a lot has changed since Blender 3. Most days I use Syntheyes for tracking, but since it’s paid, I don’t use it in tutorials focused on tracking. Thanks for checking out the video! :)
@@terr20114 I'm wondering now if you could save the Fusion node setup, so in new projects you could just import the premade nodes and make this process much faster
You are doing an incredible job! I need your help. Is it possible to make a tutorial to combine the USD format with camera tracking to create a VFX clip? Congratulations and thank you very much!
The focal length should be automatically exported from Fusion, right? Why didn't it? And what would you do if it's a varying focal length shot? You can't set that manually in Blender, right?
For some reason it doesn't send the data over, and no matter what settings I use, it doesn't help. I've never thought about the varying focal length scenario with Fusion's tracker though; I use Syntheyes for 99% of my tracking. If I ever figure it out I'll definitely be doing a video about it.