Hi! Thanks for this! As for ARKit tracking and rigging, do you add extra parameters for the movement of things like eyebrows to track other expressions, such as sadness? Would it work differently?
Depends on what look you're going for. There's some overlap with ARKit for Live2D tracking specifically, but for example ARKit can actually track angry brows, so in that case, rather than baking the angry eye into the brow movement, you could change the default eye form to a sad one and then create an angry blendshape on its own separate parameter. Hope that made some sense! But tl;dr it's very similar.
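A rough sketch of that routing idea in Python (the coefficient names like `browInnerUp` and `browDownLeft` are real ARKit blendshapes, but the Live2D parameter names, value ranges, and the mapping itself are just made-up illustrations, not anything from a real rig):

```python
# Hypothetical sketch: route ARKit-style blendshape coefficients to
# separate Live2D-style parameters, so the sad form drives the brow
# while "angry" lives on its own dedicated parameter.

def route_expression(coeffs: dict[str, float]) -> dict[str, float]:
    """Map tracked coefficients (0.0-1.0) to illustrative parameter values."""
    # browInnerUp (inner brows raised) is commonly associated with sadness,
    # so let it drive the sad brow form directly.
    sad = coeffs.get("browInnerUp", 0.0)
    # browDownLeft/Right (brows pulled down, as in a frown) feed a separate
    # custom "angry" parameter instead of being mixed into the sad brow.
    angry = (coeffs.get("browDownLeft", 0.0) + coeffs.get("browDownRight", 0.0)) / 2
    return {
        "ParamBrowForm": -sad,  # negative = drooping/sad brow (sign convention assumed)
        "ParamAngry": angry,    # made-up parameter holding the angry blendshape
    }

params = route_expression(
    {"browInnerUp": 0.75, "browDownLeft": 0.5, "browDownRight": 0.5}
)
print(params)  # {'ParamBrowForm': -0.75, 'ParamAngry': 0.5}
```

The point is just the separation: the tracked anger signal gets its own parameter, so it never fights with the sad eye/brow form you drew.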