Seriously guys, every time Epic comes up with new outstanding features in Unreal Engine, I go into rocket-scientist mode. The team at Epic Games rocks!
@@AlexisRivera3D Most animation data and 3D model data can be exported from Unreal Engine as FBX. You'll have to find a tutorial, but it should be possible. If not, it would be because Epic deliberately set something up to prevent it, and considering that Unreal is now part of many studio pipelines, the chances of that are slim to none.
Anyone know how to solve the issue where the body becomes detached from the animation? My head is animating, but it separates around the shoulder area.
@edentheascended4952 I exported the MetaHuman to Blender, but it has some issues. For example, the hair is not compatible with Blender, so I had to add new hair as hair cards.
Amazing tech, but not a good tutorial. I wish you had started by saying you'll need your initial Live Link Face capture to have Front, Left, Right, and Teeth poses. You don't find out about these prerequisites until halfway through the video. Also, what's the best way to get the data from your iPhone to your PC?
Ain't gonna happen, unfortunately. I gave up my hopes on that. A lot of people, especially new solo devs on a tight budget trying hard to finally start making a profit from the struggles of learning and experimenting with all they've got, are looking forward to seeing that finally happen... and while they wait, other people stay one or two steps ahead. They say we can use a head-mounted camera, but why not simply give us the chance to use the smartphones we already have? Evidently they do not agree, or do not find it business-worthy, and would rather look at us as dinosaurs... ready for extinction! I love Apple, don't get me wrong, but their monopoly on some of this stuff is hugely annoying...
Honestly kind of confusing. Is there some technical reason for the lack of support outside the iPhone? What self-respecting developer owns an iPhone anyway, a difficult-to-modify, locked-down system? There is a reason for the meme of graphic designers, artists, etc. using Apple products: they aren't generally the technically minded ones, and I say that as an artist.
I'm sure it's possible with a simple camera, so why require an iPhone? There are phones with two cameras that let you capture depth and volume, and it should also be enough to put tracking dots on your face. I hope you will add that in the very near future; if not, it sucks.
Question: does this require an iPhone (aside from the stereo camera option, of course), or can the footage come from any camera at this point? Just curious; I wasn't sure if it was using the iPhone's LiDAR data or something like that.
@@lexastron I did find out that it uses the depth data from the iPhone's camera, and that's why they require it. There are ways to use Android or a webcam; they're just not as accurate or articulate, unfortunately. However, you can get close and then hand-animate the rest to bring it a lot closer to the performance. It'll take a little more work, but I guess it's doable. It would just be a lot easier with an iPhone.
Hot damn. I use Unity at my job and Unreal at home, and switching from one to the other feels like I'm traveling 10 years into the future. This completely ruined any chance for Unity to catch up in terms of graphical fidelity. And it's just as easy to use, if not more so in some instances. Insane.
3:29 What's the iPhone app we should download to record videos to import into the engine? Could you show us the process with the iPhone (a real step-by-step)?
MetaHuman is an excellent way to spend three weeks just trying to make a floating talking head actually attach to an animated body correctly. Even when they're literally both MetaHumans, as of 5.3 there is still no official method.
Please add support for the 2022 iPad Pro. I used it with Live Link and it worked perfectly. Now I've just updated the app, and when I open it, it shows two methods: the old Live Link, and the new MetaHuman Animator, but the latter says in red that my device is not supported. How is an 11" iPad Pro 2022 (4th gen) with an M2 chip not capable when an iPhone 12 is?
Are there any guides coming for stereo capture? I made a capture source, selected stereo archive, and pointed it at my directory of stereo HMC footage. Went to the Capture Manager, selected the source, and... nothing. No videos to select. What formats are supported? I tried MP4 and MOV (does the codec matter?). I assume I'm missing some really simple step but can't find it.
Thank you for your very clear tutorial. I was wondering if it would be possible to batch-process 50 performances at once with the same MetaHuman Identity. Is that possible, or do I need to set them up manually each time?
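For what it's worth, another commenter mentions that the plugin ships Python scripts for exactly this kind of batch run. The editor-side API names vary by engine version, so this is only a pure-Python sketch of the shape of the loop: one Identity paired with many takes, with the actual "process performance" call left as a caller-supplied placeholder (`process_fn` is hypothetical, not a real Unreal API):

```python
# Sketch of batching many takes against one MetaHuman Identity.
# The real editor-scripting calls are NOT shown here; check the Python
# scripts bundled with the MetaHuman plugin for the version-specific API.

def build_batch(identity_asset: str, take_paths: list[str]) -> list[dict]:
    """Pair a single Identity with every take, preserving order and
    skipping duplicates."""
    jobs = []
    seen = set()
    for path in take_paths:
        if path in seen:
            continue  # don't process the same take twice
        seen.add(path)
        jobs.append({"identity": identity_asset, "take": path})
    return jobs

def run_batch(jobs, process_fn):
    """Run jobs one after another; in a real editor script, process_fn
    would wrap 'Prepare for Performance' + the process/export calls."""
    results = []
    for job in jobs:
        results.append(process_fn(job))
    return results

# Example: 50 takes, one identity
jobs = build_batch("/Game/MyIdentity", [f"take_{i:02d}" for i in range(50)])
print(len(jobs))  # 50
```

Inside the editor, you'd run this from the Python console and plug the bundled scripts' processing function in as `process_fn`; the class and function names differ between 5.2, 5.3, and 5.4, so inspect the shipped scripts rather than trusting any names here.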
You would need 3D capture data. iPhones come with depth cameras, which is why Epic mentions them. You can get a standalone depth camera pretty cheap from Intel (cheaper than buying a phone you don't want). Intel's depth camera line is called RealSense, and they start at about $100.
And what about animation WITHOUT capture? Not everybody has a suit or a smartphone with a TrueDepth camera, and some would rather animate by hand, old school.
Hey! I've managed to capture a performance, but once I add it as a face animation to the MetaHuman Blueprint, the head stays detached from the body. How can I merge them together and work on my body animation separately from the facial animation and head rotation?
Are these results much better than the prior Live Link setup? It does look like a more complex workflow than the offline CSV method that currently exists. Excited to test it out myself.
So I get it, you guys don't like Android, but what about other 3D camera solutions like Intel RealSense or similar? Anyway, amazing piece of software, congrats.
I really wish there was a version of this for people who aren't code heads. I'd love to do cool stuff with Metahumans, but Unreal Engine has to be the least intuitive, frustrating software I've ever attempted to learn. I've worked with Adobe Photoshop and Illustrator professionally since the 90s, and InDesign since it came out in the early 2000s, and I've also managed to learn tons of other software, but I guess I'm dumber than I thought. 🤷
I found it pretty easy. In about 3 hours I had a character that looks like me running around in third person, which I found pretty cool. The hardest part of learning for me is Blueprints, but that's still much easier than C++, and you don't need much knowledge to create a character.
@@elitegold4805 I do love the Blueprints. They're extremely easy to use and it's cool what sort of functions you can come up with without pages of code.
Have you tried Unity? I know this seems odd to suggest on an Unreal video, but I did find that Unity is far more intuitive for beginners making certain kinds of projects. I find Unreal a lot more scalable for building complex 3D functions and interactions.
@@elitegold4805 so yeah, you kinda just proved my point. You know C++. The closest thing to a programming language I've ever used is HTML. 😆 And yes, using the MetaHuman Creator interface online is super easy and fun, and I think it works great. I've gone bananas making MetaHumans, I have over 50 of them at this point. But it's when it gets into Unreal Engine that the wheels come off for me.

Unreal Engine was designed for people like you, coders, not people like me, graphic designers. So for me the learning curve is quite steep. I've never used node-based interfaces before, for instance. I'm having to learn all this new terminology, because I haven't done much in 3D before, so 'normals' and 'UV maps' and things like that are things I've had to learn about. You already have a lot of that fundamental knowledge, so for you it's more about just learning how Unreal Engine does the things you want to do. I kinda just wish there was a version of Unreal Engine that was designed for people like me, is what I'm saying.

Nothing about Unreal Engine seems obvious or intuitive to me, and it relies on already knowing a lot of stuff I don't know yet. And I don't have any interest in making games, I only want to use Unreal to make cinematics and still images, so all of the game stuff just gets in the way for me. And yes, I know _this is a game engine,_ so I have to learn about all that to use the software effectively. And I just...don't enjoy that aspect, I guess. Plus I'm Gen X, and I feel like I've been learning new software my whole life, so every time I have to start from literal scratch part of me is like 'not this again!' 😩 Oh, and I'm on a Mac. So. Yeah. 😔

I am glad, however, that the Unreal Engine interface doesn't try to hold your hand too much, because that's more annoying than not knowing stuff. I hate it when an interface peppers you with pop-up notifications while you're trying to concentrate on a task.
Adobe has gotten really bad about that in recent years, and I get that those programs are probably just as dense for total beginners as Unreal Engine is for me, but there is a better way to teach people software than Clippy On Steroids. 😁
Hi, thank you for this. Is there written step-by-step documentation for these processes? Parts of the video tutorial take things for granted, so a noob like myself has a hard time following along. Oh, and BTW, something I do not understand: how come the Live Link app is available only for iPhone, but the MetaHuman plugin is not available for UE on macOS? Why is that?
In 5.3.2, it looks like once I add the teeth pose, set the frame, and click "Fit Teeth", the B view shows two head meshes overlaid and you don't see the teeth result. Then Prepare for Performance fails. ??? REALLY frustrating.
Can anyone please tell me how to combine manually keyframed body animation (using the MetaHuman body control rig) AND face animation made with MetaHuman Animator so the head does NOT detach from the body? I tried combining body animations with face animations, but it's a dead end; nothing helps. I can't find the solution anywhere.
A lot of people are new to UE because of MetaHuman, but for a beginner this tutorial is wayyy too fast, and most clicks are not explained. Just some feedback for future work 🙂 Keep doing awesome stuff.
Tried again with 5.4 and it still fails. There was a new Identity start screen about system specs; how about more info on that? The error message just fails with no hint as to why.
Bit of an incomplete tutorial... GDC23 made it look like you could go from performance to a photorealistic, animated MetaHuman within 10 minutes. This tutorial skips past the fact that you're using a stock MetaHuman, and skips past attaching hair and linking audio with the performance. Hopefully a more complete tutorial is to come.
Awful "tutorial". It doesn't explain anything; it's just "press this button for no reason" type content. You release a "groundbreaking" new feature and can't even write a single page of documentation anywhere. Poor. Very poor.
So awesome! But I have a major issue with Bridge. I created a MetaHuman with Creator (for 5.2), but it is not showing up in Bridge no matter what I do. Do you have a solution for that, please?
@@piorism I got it to work; I was just being very dumb. I had to update Bridge via the Epic Games launcher. After that, the MetaHuman from Creator showed up in my Quixel Bridge. Thanks for your reply, man 😄
I'm reminded of something the Siren devs did years ago. Thinking about going from that old technique for creating a realistic face to this is so surreal. Yeah, it'll really help indies; they could use Unreal for faces and probably still import to other tools if need be.
At 7:05, after Mesh to MetaHuman returns, the blendshapes all go to a vertex count of 0. Is there a way to preserve them after the solve? Standard Bridge MetaHuman downloads have both an embedded DNA and morph targets.
I only see MetaHuman Identity and Capture Data (Mesh) as options. The menu itself is called MetaHuman instead of MetaHuman Animator. What did I do wrong?
"No Body Type Preset is selected in the Body Part. Please select a Body Type Preset to continue." I am stuck on this error; can anyone help? It only appears while auto-rigging the MetaHuman mesh.
I appreciate the video, but I'm 2 minutes in so far and it's missing important information: - I didn't have the MetaHuman plugin in my project, and there's no link in the video description, so I wasn't sure where to find it - It jumps straight into how to load footage with no explanation of how to record it, or even which app to download on my phone. The rest of the tutorial was great though and worked really well!
3:52 These two additional left and right frames are IMPOSSIBLE with a head rig (like the Rokoko head rig), which rotates with your head, giving a frontal view ONLY. You said the frontal pose alone should work, so I hope that is not an issue, as frontal-locked head rigs seem to be where the industry is moving.
Help please. 1:56 I find my iPhone there, but it doesn't show any of my captures. Even when I press the Start Capturing button it shows that it is capturing, but when I stop, nothing shows up.
Hihi. I've been waiting for this plugin since you announced it, and I prepared 2 minutes of video clips for making a short film. So you published it at exactly the right time, and I want to thank you for making it possible for me to do this stuff for free. :)) Thank you very much.
Hi! I have a problem when I activate the plugin and open the project. It gives me this error: "The 'MetaHuman' plugin failed to load because the 'MetaHumanMeshTracker' module could not be loaded. This may be due to an operating system error or the module might not be properly set up." and closes the project. Any idea what the problem could be? Thanks!
Has anyone noticed something strange about YouTube's search algorithms? When you type "unreal engine", you find strange videos that are not related to Unreal Engine, and lots of tutorial videos are hidden. Google employees talking about NDAs and making a Google game engine (not the Android Game Development Kit) but similar to Unity and Unreal, able to track users' activities, violate the EULA, etc. Where did you get this information from? Sources, NDAs, ChatGPT, etc. The guys in the room are worried that UE could possibly become a new UE video game console powered by UE, UE kernels, regional locks. Why are they worried about it? Nvidia, Quixel, MetaHumans... photorealistic games powered by UE. The Stadia and Google guys want to stop Unreal Engine developers from making photorealistic games. Why are they so jealous of UE? Don't know. Investors are giving UE more VC rounds.
Hi! Any advice on troubleshooting these three issues in MetaHuman Animator? - "Promote Frame" randomly jumping to a different frame than the selected one. - The MetaHuman Identity Solve producing inaccurate results. - "Add Teeth Pose" breaking the Identity Solve even more. Thanks a lot!!!!!
This tutorial is fantastic! I'm hoping that there's a follow-up about batch processing performances via the included python scripts. I have a project with around 300 takes, but I haven't been able to figure out the batch processing workflow yet!
All the Animator steps are working fine, but does anyone know why the body and face are disconnected in Sequencer when I add the animation to the face? When I set "Additive Anim Class" to "Mesh Space", it also seems very rigid.
Help! I get an error when importing iPhone video. Either it says the audio clip could not be imported, or that it could not verify the ingested file. In both cases I end up without a Capture Data asset, so I cannot move forward.
How do I combine the body and head? My head keeps detaching from the body when I attach the animation to the face and play it in Sequencer. I tried baking to the control rig, but no change...
@@elvismorellidigitalvisuala6211 I have found a very rough fix, and it only works if you have no body animation; I still don't know how to add body animation on top of it. Here it is: first, make sure you bake the animation to the control rig (just in case). Then right-click the face, go up to Rebind Component, and click Body. What this does is attach the body to the face/neck, so the whole body moves with the neck. It looks weird in a full-body shot, but if the camera is framed in portrait, it kind of works. I haven't figured out a better fix for now.
@@elvismorellidigitalvisuala6211 That fix isn't a good one, but I hope other creators will figure out how to fix this. Or maybe we need a body animation for it, or a head-mounted camera to nullify any movement below the neck, you know... 😕
Is there a way to queue processes for MetaHuman Performance under MetaHuman Animator? And, while we're at it, for Export Animation too? This would free up all the time spent waiting for each process to finish before moving on.
So I'm stuck at the part that is conveniently skipped in the tutorial. I have pulled the take files from my iPhone, placed them in a folder, and set that folder as the target for my capture source. But I don't see anything in the Capture Manager. :(
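Since several people in this thread are hitting the same empty Capture Manager, here's a quick sanity check you can run on the folder before pointing the capture source at it. It assumes each take is a subfolder laid out the way Live Link Face exports them, with a .mov plus a take.json manifest; those file names are an assumption on my part, so compare against one take that does ingest correctly before trusting it:

```python
# Rough sanity check for a capture-source folder: lists subfolders that
# look like complete takes (a .mov video plus a take.json manifest).
# The expected file names are assumptions -- verify them against a take
# that your Capture Manager actually picks up.
from pathlib import Path

def find_takes(capture_dir: str) -> list[str]:
    """Return names of subfolders that look like complete takes."""
    found = []
    for sub in sorted(Path(capture_dir).iterdir()):
        if not sub.is_dir():
            continue  # takes are folders, not loose files
        has_video = any(sub.glob("*.mov"))
        has_manifest = (sub / "take.json").exists()
        if has_video and has_manifest:
            found.append(sub.name)
    return found

# usage (hypothetical path):
# print(find_takes(r"C:\Captures"))
```

A common gotcha the thread hints at: the capture source wants the folder containing the take folders, not a take folder itself, so if this prints an empty list, check that you copied the whole take folders off the phone rather than just the video files.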