But sir, how about object tracking? Do I need to use PFTrack instead? I will start from here if it's legit .. ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-SbkXb31DrRA.html
Now you are my personal favorite Blender artist. You teach everything so easily that anyone can learn. I watched your video about the Array modifier and it helped me a lot. Thanks, and keep going ;)
Thank you for making this video for us newbies. I actually really appreciate the hand-holding. It helps fight the hesitation/evil procrastination. A couple of niggly bits I noticed: at 3:13 you suggest that [prefetch] puts the frames on your disk, but I think you mean it puts them into RAM for quicker access. Also, at 10:12 you point to the 0.51 px solve error, which I believe is pixels (px), not percent. I gather from multiple tuts that a solve error of a fraction of a pixel (so less than 1.0) is what you're after.
Hi Justin, you can analyze the blue error line for each tracker separately, so after deleting the obviously bad trackers before the solve, you can refine the quality of the solution by adjusting weights, or by deleting the worst trackers if need be. In the Graph Editor, the first button at the top right enables this per-tracker error analysis.
@@TheCGEssentials By removing the obviously bad trackers you get a solution within a certain error range, but that range is an average of all the errors. When you have many trackers, one of them may be far above the average yet still be important for defining the general track, so you don't want to delete it. By analyzing each tracker's error individually and giving it an appropriate weight, you can refine your solve without losing information. In many cases the error only appears, or spikes, just before the tracker is disabled; by analyzing it you can decide to cut off only the bad part, instead of (or before) applying an appropriate weight to it.
Great video. I have been slowly integrating more of Blender into my workflow, which for the past 15+ years has been Houdini / C4D, with SynthEyes for tracking. Enjoying Blender quite a bit. Question about tracking: one of the features I find really useful in SynthEyes is the offset feature. It allows a tracking point to offset to another feature/marker if it becomes temporarily occluded. Can Blender's tracking do that?
Good tutorial, but in the original video you showed at first, I could see that you move your leg slightly, and when you detect features there are some trackers on your jeans too. In camera tracking, the tracking points should only be on completely static objects, so you should have deleted the ones on your jeans. Anyways, great tutorial.
Great tutorial! Have you tried the "Refine Tracking Solution" addon? It comes with Blender; you just have to enable it. With one click it automatically puts the correct weights on your tracks based on their solve error. For example, that 0.5 you got, sometimes it gets down to 0.2 with just that one click!
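The addon's exact formula lives inside Blender, but the idea of error-based weighting the comments above describe can be sketched in plain Python. This is an illustrative assumption (inverse-error weights, normalized so the best track gets full weight), not the addon's actual code, and the track names and error values are made up:

```python
# Sketch of error-based track weighting, similar in spirit to what
# Blender's "Refine Tracking Solution" addon does automatically.
# The linear falloff below is an illustrative assumption, not the
# addon's real formula.

def weights_from_errors(track_errors, max_error=1.0):
    """Map each track's average solve error (in pixels) to a weight in [0, 1].

    The track with the lowest error gets weight 1.0; tracks at or above
    max_error get weight 0.0; the rest fall off linearly in between.
    """
    best = min(track_errors.values())
    span = max(max_error - best, 1e-9)  # avoid division by zero
    return {
        name: max(0.0, min(1.0, 1.0 - (err - best) / span))
        for name, err in track_errors.items()
    }

# Hypothetical per-track average solve errors, in pixels.
errors = {"Track.001": 0.12, "Track.002": 0.45, "Track.003": 1.30}
weights = weights_from_errors(errors)
print(weights)  # Track.001 -> 1.0, Track.002 -> 0.625, Track.003 -> 0.0
```

Inside Blender you would read each track's error and write the weight back via the Python API (each track in `clip.tracking.tracks` exposes `average_error` and `weight`), then re-solve to see the improvement.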
True. But Blender lets you pull an unending library of 3D assets straight into the tracked scene. It also lets you create photorealistic lighting with an HDRI, plus a shadow catcher for realistic shadows (completely different from an AE drop shadow). So you are correct: for adding the CG dog in this video, AE would be much simpler, but learning how to do it in Blender opens a world of possibilities. If you wanted a coffee cup to spill over and have the coffee run off the edge of a desk, you could do that by tracking the scene in Blender. You can't do that in AE.