How do you transfer this iClone face cap data to Houdini? Would love to see a workflow demo of transferring the entire character mocap + face cap data to Houdini.
[QUESTION] Is there any difference between iPhone models for face performance capture? Would you recommend one model over the others, or does the Face ID / TrueDepth dot-projector sensor (whatever is responsible for the mocap) work the same from the iPhone X through the iPhone 13 Pro? Thank you for the video.
Is it possible to store a strength multiplier preset? We have many animations, and so far we've had to adjust the strength to the same parameters on each animation.
Hello! I came across your lessons and have a problem I haven't found a solution to anywhere. I installed iClone 7 and created characters in it. Then I bought an iPhone and decided to record facial animation following the tutorial. I did everything as shown and connected the iPhone (the iPhone showed that everything was connected, with the tracking mask overlaid on my face). I selected the iPhone in the parameters and entered the phone's address. But when I click Record/Preview, nothing happens; the character doesn't repeat my expressions. I checked the character's expressions, and everything moves in manual mode. I also checked during the preview whether the coordinates of the facial points were being transmitted, and they are. Yet the legs and face stay static. What could the problem be? Is a plugin missing, or did I forget to check a box somewhere? I couldn't find a solution.
People asking for Android: ARKit is Apple's proprietary face-tracking tech. Until Android itself comes out with similar tech, Android can't do it, even if RL wanted it to. It (1) needs a special depth camera like the iPhone X has, and (2) needs ARKit-like software. I hope it does one day.
Yes, it's free. It's part of the new update that was released two days ago. I'm not sure I understood your second question correctly, but you can't use recorded clips with Live Face; it can only capture a live feed. Hope that answers your questions.
If you can't tell the quality after watching this video, there's not much more to tell you -- it's laid out pretty well here how it works and what it can do. You can compare it for yourself against the results you're getting otherwise and make your decision.
@@MikeDKelley I saw videos of Faceware Live (not Facecap, my mistake) for iClone. It looked terrific. I bought it. It's awful - weird stretching on the mouth - unusable.
@@frankcabanski9409 You saw how everything was constructed, how all the minute parts came together, as shown in this video? I've never seen such a video for Faceware. Indeed, I saw just the opposite -- folks showing that Faceware was very difficult to get right. Now, I bought Faceware as well, but Live Face is so much better that I don't even have FW installed on my system anymore.
@@MikeDKelley See, that's what I'm looking for - word from users/buyers. It will be around $600 for this plus the iPhone. Also, I don't know if Face Mojo is better or Facemotion - I guess they worked together for a while, but then they split.
@@frankcabanski9409 If you have Motion Live (which you should, for Faceware) you only need the Live Face plugin, which shouldn't be $600 (the last I looked it was around $300, though I admit that was a while ago -- I bought it at release for around $200). You can get a used unlocked iPhone (you don't need service) for less than $500 (perhaps a lot less), but be sure it has Face ID (the 3D camera -- almost all of them from the last three years do). If worst comes to worst, you could turn around and sell the iPhone for about what you paid if you REALLY didn't like it, but that's hard for me to imagine, not when it's this good (the ARKit stuff is something even the pros have been hankering after).
For now that is impossible, because Android phones don't have a camera suitable for this kind of mocap, like the 3D depth cameras on iPhones. The hardware needs to exist before they can release it for Android.
If only this video were true!!!! But it is not. I too am shocked by the quality of the mocap here, but even more so because I have an ultra-modern computer with an RTX 3070, etc., and an iPhone 11 Pro, and when I boot up the same software, the CC3+ character doesn't move AT ALL like this demo. She's jerky; she freezes for ten seconds at a time, then in a rush speeds through all the bad mocap data from the past ten seconds, then freezes, then jerks. I couldn't imagine it being any more different from this video.

Why? The folks from Reallusion haven't explained. Numerous comments below have confirmed the same problems. Many have asked whether the Wi-Fi connection is the issue, and whether the folks from Reallusion are hardwiring their connection (this is likely the case) or are indeed cheating by recording the mocap beforehand and playing it back as if it were a real-time preview. Either way, they need to provide all the specs and connections for a demo like this: the computer model and specs, the phone, and everything to do with the connection. If it's Wi-Fi, what are the Wi-Fi specs? If there's a cable, they need to show us how it can be done.

The folks from MocapX (not a solution I recommend) do provide an easy USB connection, which eliminates the Wi-Fi problem, and at least that part works better with MocapX. Still, MocapX provides no calibration, no zeroing, no smoothing, etc., and after owning MocapX for a month I have YET to produce convincing mocap. That is why I am now looking at iClone. But wow, the JITTERS and FREEZING are insane.
@@mehdi.shiraziu7559 Yeah, CC3 characters don't export perfectly, but nearly all the goodies are there in the export. It just takes a while in your NLE to learn how to adjust the settings to get something that not only looks as good as the CC3 images but even better. Better? How? That's easy: CC3 doesn't do great work on subsurface scattering or displacement (two essential techniques for making characters look 100% human). Pick a render engine (I use Arnold), learn its ins and outs, and after a month of practice your characters can really come alive.
BTW, I did get my USB cable connection working, which made this mocap run more smoothly. The key is to make sure you are connected as a 'hotspot' even though you are using a cable.