
Live Link Face updated with built in face calibration for Virtual Production 

Greg Corson
6K subscribers
1.9K views

Published: 27 Oct 2024

Comments: 10
@zippyholland3001 • 3 years ago
Thanks for posting. Really cool
@GregCorson • 3 years ago
Glad you like it!
@maxwax9509 • 2 years ago
Hey! Very good and clear tutorials, thanks a lot for them! Can you explain your approach to recording a video with a MetaHuman like the one above, and how you record audio and synchronize it with the video? Many thanks in advance.
@GregCorson • 2 years ago
For most recordings I use Take Recorder in Unreal, which will record the mocap, sound and other things. Then you can go back and adjust all the different elements in Sequencer, such as camera motion, delaying the audio to sync it up, etc. If you record audio in Unreal it will usually come out in sync, or pretty close. I usually include a sync marker/slate in the performance to help line things up; saying something like "pop" makes a good mark because the character's mouth opens in sync with the pop sound on the audio track. An alternate way to record something like this is to just perform it live and use something like OBS to record the Unreal window as you do it. OBS lets you delay the audio track if it is out of sync. As a side bonus you can live stream the performance or send it out through OBS's virtual camera, so you can use your character in a Teams/Zoom conference.
@alhusseinali • 3 years ago
Great!!
@lautarnauta • 3 years ago
Thanks Greg! 😃 🙌 Can you clean up the face mocap later in Unreal?
@GregCorson • 3 years ago
Yes, you can use Control Rig to tweak the animations you've recorded with Live Link Face.
@lautarnauta • 3 years ago
@GregCorson thanks! 😊
@williamreid74 • 3 years ago
The info on updating is good, but honestly the enunciation and lip movement on the MetaHuman here look about 4/10. It looks like a ton of cleanup is needed from an iPhone Live Link setup.
@GregCorson • 3 years ago
First off, mocap is never perfect; no matter how fancy a setup you have, some manual cleanup is almost always needed. The point here is that it's MUCH better than it was before the calibration feature was added! Also, I recorded this in the UE5 preview because it makes the character and lighting look better than the older versions. However, looking at it, I'm not sure the face mocap comes through as well. I'm running a test today with 4.26 to see if the face motion looks any different.