Hello, I wanted to ask you something. My phone doesn’t connect to UE5, but I need that to get Live Link working. Do you have any advice for me? I already checked the Wi-Fi, and it’s the same network as my iPhone. Please help!
Have you found a way to rig IK for a character to match the MetaHuman or default mannequin while retaining Live Link Face functionality on a custom model?
Amazing! Instead of using Take Recorder to record the facial capture, you can right-click the Face track and choose "Bake Animation Sequence" to save the facial capture as an animation asset.
@@Jsfilmz well, that step was missing from your video tutorial, and I just thought someone might be interested. Sorry if it seems like I was stating the obvious. Peace!
Hi there! I have a problem... The facial animation works great in the timeline when scrubbing, but if I add a camera and Camera Cuts, I can't export the animation render... the MetaHuman face stays still with no facial movement! Do you have a solution? Thanks
@@michelesaporito7035 I wonder if you can bake it to the face rig after you get the clip in the timeline? Right-click the "Face" labeled portion of the skeleton under the character, and you may see a "Bake Anim To" menu item. Then you could select the face anim panel rig, and it should transfer the animation to the face-panel controls it is driving as keyframe data.
So cool, man! Do you have a full walkthrough of the process? 3D model creation > facial animation applied > exporting the full animation to third-party software as OBJs or FBXs? Thanks!
Amazing! I can feel the vibe... really great addition to UE5. I wonder if it can also take in audio for speech at the same time; that would make a great package for a facial animation workflow.
What about the audio? This is a game changer for me. Now we can get VO talent to download the app, record themselves, and then share the files. That’s a lot less production and a better result.
Very nice work! I tried to ask AI if this was possible (offline) and it failed miserably. AI is too much A and not enough I. Hahaha! Humans rock! Especially this human (in the video). All the other tutorials took way longer to get to the point, but still get points for trying. Keep this up. We dig you. ❤
Pretty sure none of the VP tools require simulation or play mode. They should all work in editor. Try checking if 'Update Animation in Editor' property on the skeletal mesh is turned on. That's what enables any AnimBP data or livelink data to show up in Editor.
@@Jsfilmz had a few heart attacks and surgeries, just made it back home... Got into the Avalanche beta, so I'm about to dig into that... no rest for the wicked
Very awesome! When I did it, though, my character had the upper-lip issue :D Not sure if it's 5.1 or the import method, but I sorted it out. I like this method of not having to use Live Link to record the mocap :) Thank you
You have the Justin Timberlake voice, bro! Excellent! By the way, do you have a video about scanning your own face and importing it into Unreal? Thanks...
This is amazing, it works! Btw, do you know how to set Unreal to stop using the GPU when it's in the background? When I watch YouTube, I want my Unreal project to stop using the GPU, because the PC gets so hot. Thank you
@@Jsfilmz I found the answer: type t.MaxFPS 1 into the console (which means 1 fps). It solves my problem, but it's not the most convenient way! If you have another way, please reply.
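For reference, here's roughly how that looks; t.MaxFPS is a stock console variable, and there's also a built-in editor preference for throttling the editor when it loses focus (the exact menu path may vary slightly between engine versions):

```ini
; Console (open with the ` key): cap the editor frame rate
t.MaxFPS 30

; Or, to only throttle when the editor window is not focused:
; Editor Preferences > General > Performance
;   enable "Use Less CPU when in Background"
```

The background-throttle preference is usually the nicer option, since the editor runs at full speed again as soon as you click back into it.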
Can you do a render with this imported Live Link feature? Sometimes it works for me using Movie Render Queue, and other times it doesn't track the animations in the render. Love the channel!
Uh... OK, so I've been using Live Link with Unreal for 6 months, and it has never once produced that CSV file with the other files. What are you doing extra, dude?
On the back of this: if you try to render in Path Tracing mode, I found that the imported keys alone won't work, even when playing while running the sequencer. This gets resolved by baking the animation into a clip from the Face clip in the sequencer. This will probably make your life easier throughout the importing process. Happy to elaborate further through other channels.
Never used either before; what are the advantages and disadvantages of using it over Wi-Fi (same network) versus the method you're showing now? Is it only that you don't need internet?
Hello, thank you for the video. But I wonder why my ARKit is active, yet no iPhone is detected under my ARKit face subject. Do you have a solution? Thank you
@@Jsfilmz yeah man, sometimes you have to bite into the "make it work" kind of thing. In the end, if you don't give up, it works and the whole thing comes together, no matter if it's Blueprint stuff or even tweaking an animation till it looks good. Congratulations on figuring that out, because there is no real documentation. It's like packaging a game or project for Android: trial and error, sometimes! Love your channel, you try cool stuff, mate. Respect from the northern Alps, Bavaria.
This is another awesome one!! Everything works up until I get to the drop-down. I only see my phone and not the added animation. I noticed your animation was already in the Live Link list. Did it show up automatically for you, or did you have to do anything extra to get it into that list?
Thanks again for another good one, brutha. Can you send me a link to 5.1? I'm having a hard time finding it. I wanna try the decal joint out. I watched your video on it the other day, and it's exactly what I'm needing right now.
@@Jsfilmz that’s not a problem, I just can’t find it. Email me; I'd love to talk about some ideas with you. Daniel@madmixedmedia.org. Thanks again, brutha. Keep up the dope work.
Hello JSFILMZ. I watch a lot of your informative videos; they are great. But unfortunately, I tried this method a few times and it does not work for me. Each time I import the CSV file into Unreal, the sequence time range is shorter than the original video on my iPhone. Is this a bug in my iPhone X, or maybe in Unreal?
Hi JS, thank you for your comprehensive tutorials. I have a specific question. I have a mocap suit and a custom 3D character in my scene. I want to record my mocap body and import it into Unreal, and then I wish to bring my face capture data in as FBX and retarget it to my character. Can you tell me how I might do this?
JDog, I did everything that you presented in this video, step by step. I couldn’t get it to work: no animation on my character’s face at all. I have been trying very hard to get MetaHuman Animator to work, with no luck. Then I tried the Live Link Face app and couldn’t get it to recognize my iPhone. I went to the drop-down menu for the ARKit face subject and couldn’t get anything to appear there. Then I followed this video and found my take (which I transferred to my Google Drive from my iPhone), and still no luck. Is there any way you can help me identify what I might be doing wrong? Thank you so much. As always, J.P.
Very cool, we had a similar issue where subclips weren't getting created but the Live Link data was there. Did you then manage to re-record in Sequencer and get the facial animation onto the MH Face controls?
Thank you so much!! I really need this! But I can't find the raw file on my iPad. There are only these 4 files: .csv, thumbnail, video, and take.json. Do you know how to set it to save the raw file?
Is anybody else having the issue where their MetaHuman just stares blankly forward? All the other animations are there, but the eye animations are not working.
Have you tried it now? My Live Link Face app is creating a frame-log CSV file instead of the raw CSV, and the frame log is treated as a Data Table asset in Unreal instead of a Level Sequence. Please help.
Very cool... I think... What I mean is, and forgive me if I'm asking a stupid question: I don't have a smartphone. A cell phone, yes, but it's text-and-talk only... NO DATA service (I don't really need anything other than T and T; I'm at home on my computer all day, so I don't see a need for one). So if I purchase an iPhone, can I just use it as a Wi-Fi phone (to get the app) and not have phone service? If I don't need a service, then YES, this is exactly what I need!
Did you figure out any way of editing the animation after hooking it up in a sequence? For example, when you're doing live capture with the iPhone, you can edit the graph and over-crank face shapes (like jaw/lip closing, etc.). I'm working on something that was captured on set, so I have a live face recording, but when applied to the MetaHuman, the mouth is a bit slack. I wondered if you knew of any way of modifying the data stream to add animation/adjustments ON TOP of the recorded data? Thanks for all your vids, by the way; they're really helping me get to grips with Unreal/MetaHuman!
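One workaround, outside of whatever the Sequencer offers, is to post-process the recorded curve itself: apply a gain/offset to the captured blendshape values and clamp back into the 0–1 range ARKit weights use. A minimal plain-Python sketch, assuming you've got one shape's per-frame values as a list; the function name and gain numbers are illustrative, not part of any Unreal API:

```python
def overcrank(values, gain=1.0, offset=0.0, lo=0.0, hi=1.0):
    """Scale and offset a recorded blendshape curve, clamped to the ARKit 0-1 range."""
    return [min(hi, max(lo, v * gain + offset)) for v in values]

# Recorded jawOpen curve is a bit slack; crank it up 30% and lift it slightly.
jaw_open = [0.0, 0.2, 0.45, 0.6, 0.4]
adjusted = overcrank(jaw_open, gain=1.3, offset=0.05)
```

The same idea works as a non-destructive layer if you keep the raw capture and only bake the adjusted copy.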
Hello friend, do you have a tutorial on using Live Link to control the scene with a cell phone and record a video? I see it exists for iPhone, but I don't know if it exists for Android.
@@Jsfilmz lol 😂 you know what, my bad, because I realized when I did it I didn't have to simulate; I just pressed the play button and it worked. With this, who needs iClone? This is even better than the iClone live link.
Hey, thanks a lot! Little issue: when I drag and drop my CSV file, I get a DATA TABLE OPTIONS dialog, and I can't see any option to import a Level Sequence (in your case it happened automatically), so I'm stuck :) Do you know how to avoid that?
@@Jsfilmz Thanks for the reply. I followed all the instructions; I am using UE 5.1 and I've activated the LiveLinkFaceImporter plugin. I still see the Data Table options window when I import the CSV file, and I don't get a Level Sequence. Maybe it's because I'm using a Mac, can't say :) Thanks again for the video, anyway!
Solved: on Windows it works perfectly, but on Mac it doesn't! Now, for some reason, I just don't have the "LLink Face Head" option, but that's another problem :)
Hey Jae, I'm trying to get this to work in 5.1, but it's not connecting. When I drag the CSV file into the project and open the sequencer, I go to select the sequence from the ARKit face subject drop-down, but it's no longer showing up there. The name of my phone shows up, but not the sequence name (i.e., the CSV file name). Would appreciate any help you can provide to troubleshoot.
Hey J, I have a question for you: how can I have multiple MetaHumans in one Level Sequence using this system? I tried importing one MetaHuman alongside another, but it crashed the session. Any thoughts?
Currently, 5.1 asks what asset type the .CSV should be imported as: DataTable, FloatTable, FloatCurve, etc. Then there are subcategories to choose from under each. I'm going through each one and not getting an anim file. Any suggestions?
I was able to bake the animation to the face skeleton as keyframes, but this doesn't keep the head rotation. I think the head rotation is on the body skeleton instead of the face. Is there a way to get the head rotation from the CSV file baked into head-movement keyframes only? Thanks for these vids. Very helpful.
I tried copy/pasting the keyframes from the iPhone head control's yaw, pitch, and roll one lane at a time, and it works to move the keyframes across, but it is very fiddly. Sadly, these values move the head very little, as if there is some interpolation between the actual numbers and the values driving the Live Link Face app. We need a way to get this data onto the neck without a lot of science-project time. There should be a documented way to get the head rotation from the Live Link Face app onto the character. I notice this question is asked in a lot of places.
@@jayarajvin3404 I think I did find a post that had a solution, but it was convoluted. I think Unreal could really make this process far more streamlined; there are too many gotchas and extra things to fiddle with to do this easily. There are plenty of cases where someone wants to save GPU cycles and, rather than capture a performance live, record it on their phone and have the hi-res capture transferred with little hassle. They could make this far more plug-and-play if they standardized the file format so it always carries the head-rotation data as part of the hookup, then added options to disable the head rotation (or any layer, for that matter) for those who don't want it.
@@caseycbenn Actually, I got the head rotations and face animation into the Sequencer, and it plays just fine. But when I export, the head rotations are not exported; that's the problem, man. Do you have any idea why, or what's causing it not to export while it plays fine in the sequencer?? Btw, thanks for responding, much appreciated!