IT'S ON, LINKEY DONKEYYYYY! Here is a video of the animator app: studio.ru-vid.com/iXH79mrKADM/edit First rap test: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-mAT1X4xbEdA.html
Yes, so glad to hear you'll be doing more tests with this. Bro, I just realised I've been on this Unreal journey with you since your channel was a baby. You're literally the don in this field. Keep 'em coming!
@@Jsfilmz In their docs they group the X, 11, and 12 together, and then the 13 and 14 together. They give a caveat about the X, saying that it's not capable of capturing more than a few seconds, but they don't say that about the 11, so yeah, I think the 11 is capable of capturing longer-form content like the 12.
Thanks for the tutorial, I can't wait to try it. It will take hours, since the shorts I'm doing have a lot of dialogue, but they will look a lot more realistic. Best!
Just did my first quick test with your tuts to guide me through the process, and got to say, wowzer, Animator is amazing. So much cleaner than I expected. Well worth the iPhone rental :} Keep the vids coming :}
Please try it and report back if it works; I have the same iPhone XS. I won't have access to a Windows system for at least 15 days, but I am dying to test this feature.
A lost detail in Hideo Kojima's DS2 trailer was the end card saying the performance capture is powered by MetaHuman. That game is going to have insane levels of detail.
This really is such a big step forward. Very exciting. The only problem is the iPhone-only requirement. That makes no sense for something like this, but let's hope it changes sooner or later.
Epic is great, but they've got to give us some options with the markers, like maybe don't make the ones that go on the teeth straight-up yellow/green, lol? Great video though.
Awesome video! Thanks for the quick setup explanation! But I have to point out that MetaHuman Animator already uses AI for its solves. When you hit the "Prepare for Performance" button, it trains a model on your face so it can later mimic the way it moves and animate other MetaHuman characters to that likeness. That's why this step took 8-10 minutes :)
Epic, insane! Looks much better than Faceware or Live Link. Like you say, the curves look smooth with no jitter. And to think I spent 4 hours last night doing 30 seconds of manual facial animation that looks rubbish. So if I get a friend who has an iPhone, they can send me clips?
@@Jsfilmz Your rap video is a long take, so hopefully it will be good. I just ordered/rented a cheap refurbished one; hopefully it arrives tomorrow :} The 3Lateral video that just released is mind-blowing. I assume they used a stereo camera?
Great video! You mentioned MHA will use AI at some later point; just a heads-up, it's AI-driven already. Pixel tracking is only a small part of the foundation.
@@Jsfilmz Yeah, there's a lot under the hood that's ML-driven already. Facial tracking doesn't account for wrinkles or much else other than eye and mouth shapes; everything else is interpolated with AI. There's more coming, but the foundation already uses AI in a lot of areas. The second-pass animation is still being improved, so it'll continue to get better. But it's a model trained on a lot of human facial animation, so it knows what to do when cheeks are raised, nostrils flared, eyebrows wrinkled, etc. Night-and-day different from something like Live Face or other tools, which purely track eye and mouth shapes and don't leverage any AI to then interpolate wrinkles and pseudo face muscles into the animation.
@@AllanMcKay Oh wow, hahaha, crazy stuff. For a 1.0 it's not bad. Oh, btw, the iPhone can only output 30 fps even when recording at 60, right? Thanks, I love learning about the tech.
Amazing content as always! Could you please make a video on troubleshooting these three issues:
- "Promote Frame" randomly jumping to a different frame than the selected one.
- MetaHuman Identity Solve not producing an accurate result.
- "Add Teeth Pose" breaking the Identity Solve even further.
Thanks a lot!
Maybe in a year or two Epic will give us a way to import our own videos for mocap. I've tried so many apps and don't like the results; the only one left to try is supposedly the best one, Move AI.
Great showcase! In case you plan to make more test videos, can you show expressions that are hard to do with Apple ARKit? I'm curious how it compares. For example: a sad face with a hanging lower lip, asymmetric brow movement 🤨, a worried face 😟, or any interaction between teeth, lips, and tongue.
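For context on why those particular expressions are tough for ARKit pipelines: ARKit reduces the face to roughly 52 blendshape coefficients, so anything a coefficient doesn't cover (skin creases, subtle lip rolls) simply isn't captured. A minimal Swift sketch, assuming a running ARSession with ARFaceTrackingConfiguration delivering ARFaceAnchor updates (the function name here is just for illustration):

```swift
import ARKit

// Reads the ARKit coefficients closest to the expressions mentioned above.
// Values range 0.0-1.0; ARKit stores them as NSNumber in a dictionary.
func logTrickyExpressions(for anchor: ARFaceAnchor) {
    let shapes = anchor.blendShapes
    // Brows do get separate left/right coefficients, so asymmetric
    // brow movement is at least representable...
    let browDownL = shapes[.browDownLeft]?.floatValue ?? 0
    let browDownR = shapes[.browDownRight]?.floatValue ?? 0
    // ...a sad face with a hanging lower lip has to be approximated
    // from coarse jaw/mouth coefficients...
    let jawOpen = shapes[.jawOpen]?.floatValue ?? 0
    let frownL  = shapes[.mouthFrownLeft]?.floatValue ?? 0
    // ...and the tongue is a single in/out value, which is why
    // tongue/teeth/lip interaction looks crude in ARKit-based capture.
    let tongueOut = shapes[.tongueOut]?.floatValue ?? 0
    print("browDownL=\(browDownL) browDownR=\(browDownR) jawOpen=\(jawOpen) frownL=\(frownL) tongueOut=\(tongueOut)")
}
```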
Great video, man! I'm getting an error when I hit the Process button in the MetaHuman Performance. It says "The Processing Pipeline failed with an error." Any ideas on how to fix this would be appreciated. Thanks!
After updating the iPhone app, the Live Link Face app gives a warning for the MetaHuman Animator capture mode saying "your device model is unsupported; you may continue but your results could be affected." It does work, but I wonder: my economic situation is not good and I want to develop games. How much of a difference does the iPhone 11 make?
The software requires an iPhone with a TrueDepth sensor (the front-facing depth camera used for Face ID, not the rear LiDAR scanner) because it needs depth data to accurately track your face. You can always borrow an iPhone 11 (or newer) to do the capture and then transfer the footage to your computer.
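If you're borrowing a phone and want to confirm it actually has the sensor, here's a minimal Swift/AVFoundation sketch (the helper name is illustrative) that checks for a front TrueDepth camera. It only confirms the hardware exists; whether Epic officially supports the model for MetaHuman Animator is a separate question:

```swift
import AVFoundation

// Returns true if this device exposes a front-facing TrueDepth camera,
// the sensor Live Link Face relies on for depth capture.
func hasTrueDepthCamera() -> Bool {
    AVCaptureDevice.default(.builtInTrueDepthCamera,
                            for: .depthData,
                            position: .front) != nil
}

print(hasTrueDepthCamera() ? "TrueDepth camera found" : "No TrueDepth camera on this device")
```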
@@Jsfilmz Ah, I see your point, but wouldn't it be better to do the calibration without the helmet to get the three angles, and then do the depth-camera recordings with the helmet on? Wouldn't that make it more accurate for things like creases on the face when making facial expressions?
3:52 It says iPhone 11 here, but my phone is also an 11, and over in Live Link for MetaHuman Animator I get the warning that my device model is not supported: you can continue to use it, but results won't be that good.
Can you do a video on how to wrap a MetaHuman to look like a custom character? R3DS ZWrap is a good wrapper. I have this model of Nas I want to put to the test, as well as Michael Jordan and The Rock.
This was jam-packed with so much information. I'll have to pause this video and slowly follow all of the steps. This is amazing! I'd like to purchase the headset for my iPhone. Did you make it yourself, or did you get it from a website?
The MetaHuman team could justifiably charge at least a subscription; even the complex MetaHuman skeleton alone is worth it. Of course I use a custom rig, but still: a tool that cost several million (or billion) to build is almost free. Tools like Daz3D or Mari aren't needed anymore. I'm even afraid to imagine how much time it took to develop all this.
Hey bro, awesome! I have a little issue: when I track my face, it's like nothing happened to the animation sequence in my Sequencer, but it exported correctly, because when I open it on its own the animation is fine. Any idea? Thank you!
You're absolutely smashing it, man 👏🏽 Thank you so much for the awesome content! I do have one slight issue though: for the life of me, I cannot get the Live Link Face mocap to work with separate body mocap. The head/chest just detaches itself and the two run independently. It's driving me insane. I've tried following some advice on the UE forums to no avail 😭 Have you experienced this yet? Any tips? Thank you!
When I play the level in a new window (PIE), the MetaHuman character follows my facial motions very well, but when I click on any window other than the PIE window it starts to lag, and when I click back on the PIE window the character moves normally again. What should I do?
Thanks for the nice tutorial! It's only sad that MetaHumans reliably cause my 3 test systems to crash. This engine is such a shitshow at the moment: constant out-of-video-memory crashes even though there are still 2-3 GB of VRAM left, or just the plain old "D3D device removed" error. Sad to say, but I really hate developing in this engine; version 4 was so, so much more stable in every way, shape, and form.
@@Jsfilmz Yep, but only for a few seconds... I need 3-5 minute recordings for my whole project. I will test tonight whether it works or not, but... there's no reason for them to lie. XD
Thank you so much for your tutorial, so timely. One thing: I'm trying to import from the mesh body in UE 5.2, and it seems the tracking-marker step can no longer be carried out. Have you encountered this?
I may be a little late to the party, but I'm confused: why do you choose Live Link Archives in Capture Source (which is for uploading footage from the PC drive?), but then when you go to import, you do it from the iPhone in Capture Manager?
I have a problem where the character shows more bottom teeth than top. I don't speak like that, and Live Link doesn't do it. I tried re-tracking the face markers but still had the same problem.
The Performance audio track shows "Unresolved Binding," though I can hear the audio. When I export the animation, no sound is exported... the internet has failed me. Anyone? UE 5.3 and 5.4.
Pretty disappointed. I waited so long for this Animator to come out after it was announced, and my PC is not up to par: it crashes every time I try Prepare for Performance 😔
Thank you for sharing a fun and essential tutorial! Anyway, is there a way to use the neck animation recorded by the Unreal Live Link facial capture? When I import the facial anim with neck animation and apply it to my MetaHuman skeleton, the body and face break because of the neck anim. Using just the facial anim without neck movement works, but is there a way to keep the neck anim too? Is the only solution combining mocap data (with neck anim) with a facial capture anim that has only the face movement (without neck anim)?