Usually not an Apple fan, but I'm amazed that even old iPhones are still the best option for cheap motion capture. It seems like no one managed to release a dedicated 3D camera in that entire timeframe.
Biggest flaw with Live Face: according to the forums, the iPhone has had a bug for generations where single eyebrow movement isn't registered. It can only move both eyebrows at the same time; if you raise just one, it doesn't work.
I see your video was recorded this same week, but the Live Face app hasn't looked like that for a long time. Have you not updated it? The new version doesn't have that button section showing the sliders; you only see the mesh on the face. Also, I'm getting the complete opposite results from your video. With Accuface/webcam it looks just like the left: nearly no mouth movement, but lots of eye and expression movement. Then with Live Face I get single eyebrow movement and not as much expression, but good mouth movement. I wonder if you perhaps confused the two.
Actually, I can confirm: at 4:16 in your video, Accuface is on the left. I'm able to tell because when you selected the character on the right, it's clearly titled f_O, and that character is assigned to the iPhone in Motion Live 🙂🙂
@@BlenderBob It turns out there are two apps with the same name: LIVE FACE for Unreal and LIVE FACE for iClone. I didn't know that. You do have the correct app. But what I mentioned about confusing the two cameras is still true, at 4:16.
I wonder if this is good enough for a deaf person who can lip-read to "see" the sounds/words. For the hearing world it's fine, since the lips flap around in time with the sound, but lip readers can really pick out a lot of sound. Not as much as silly FBI TV shows suggest, though.
@@BlenderBob Generally speaking, yes, but you can customize the MetaHuman mesh. I haven't tried these workflows myself, but you can do it using Maya and this free tool: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-RBLw4q9Hcrc.htmlsi=PkM3qm19PhfxBiIy And with Blender, the MeshMorpher plugin, and Wrap3: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-HOSPDn1j9Jw.htmlsi=DzOQB4LoXPPgjCF5 There are other workflows too, but these two look the most customizable.
Are you okay? The right one is better than the left. The left is jittering and didn't even open its mouth when you were talking. I think, Bob, it's time to get weaker sunglasses, you didn't look good. Secondly, you can do this easily in Blender with Faceit or Rigify. Third: the best facial capture right now is Unreal Engine, because those 52 iPhone blendshapes used by Blender Faceit, iClone, etc. are missing shapes that Unreal has. Those 52 iPhone shapes are for kids, bro, trust me. In iClone and Blender Faceit they're good only for facial expressions, not for speech. But I'm impressed with that NVIDIA Accuface. If the left one is the iPhone, it's really bad, because the eye tracking is terrible, the eyebrows too, and even the mouth is badly off XD
Well, you have to understand that I was recording from three inputs at the same time on my laptop, hence the bad sound in some places. The signal was probably lost while recording. We didn't run into any issues with Live Face on tiki.