
Take Recorder layered recording and fixing distorted MetaHumans for Virtual Production, part 3

Greg Corson
6K subscribers
2.5K views

Published: 11 Sep 2024

Comments: 13
@animeculture · 1 year ago
Awesome video Greg. I just have a question if you don't mind. I'm currently looking into Unreal to get into virtual production for my own ideas. My plan is to use my VR setup (a Valve Index with the controllers and 3 Vive trackers for full-body tracking); I also have an iPhone for Live Link facial tracking. My question is: is it possible with this setup to act out the scene as each character (body movement, facial movement and audio) and then have it play back any time I need so I can film it? I just wanted to ask since this video is close to what I'd want.
@GregCorson · 1 year ago
While I haven't tried it myself, I believe there is a plugin on the Unreal Marketplace specifically for animating characters with the setup you have, i.e. the headset for head motion, controllers for the hands and 3 trackers for the waist and feet. So it is definitely possible to animate characters with this kind of rig. The one downside is that you can't also use an iPhone for face tracking while wearing the headset. I also don't think the plugin has any sophisticated AI to "fill in the blanks" on the other body joints, so you just get the positions of the trackers and fairly standard IK animation. In any case, the setup you have is a good start. You can also do the animation without the plugin I mentioned; it will just take more work on your part to set it up. There is also a lot of new low-cost stuff coming out, like the new MetaHuman Animator from Epic and the Sony Mocopi. They're supposed to drop in a month or two so I don't have a lot of details, but things are moving fast.
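For readers curious what "fairly standard IK" means in practice, here is a minimal sketch of the law-of-cosines step a two-bone IK solver applies when a hand tracker drives an arm. All names are illustrative; this is not code from the plugin Greg mentions.

```cpp
#include "CoreMinimal.h"

// Given the distance from the shoulder to the hand tracker and the two
// bone lengths, return the elbow's interior angle in radians. An angle
// of PI means the arm is fully straight.
float SolveElbowAngle(float UpperLen, float LowerLen, float Distance)
{
    // Clamp so an out-of-reach tracker simply straightens the arm.
    const float D = FMath::Clamp(Distance, KINDA_SMALL_NUMBER, UpperLen + LowerLen);
    // Law of cosines: D^2 = a^2 + b^2 - 2ab*cos(elbow)
    const float CosElbow = (UpperLen * UpperLen + LowerLen * LowerLen - D * D)
                         / (2.f * UpperLen * LowerLen);
    return FMath::Acos(FMath::Clamp(CosElbow, -1.f, 1.f));
}
```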
@animeculture · 1 year ago
@GregCorson This is what I was thinking. But I'm just not sure if I could do the VR stuff first, then have a separate animation layer from Live Link for the face data, and then combine the two to make the whole scene. I'm just not sure how to go about it since I'm fairly new to UE.
@GregCorson · 1 year ago
Yes, the logistics of recording all the different motions separately and then matching them up can be hard; there is no magic way to get it right. It can be particularly hard to do simple things like getting the characters to look at each other or directly into the camera.

Unreal's Take Recorder can record a layer, then play that back while you record another layer at the same time. It works but can be complicated, and it still doesn't solve the problem of the characters not looking where they should.

In a lot of my videos I use the trick of setting up a "stand-in" for the spot (the camera or another character) I am supposed to be looking at. This can be as simple as a stick or a monopod. If you are recording a character in Unreal, put the end of the stick in the real world where the other character or camera will be and look at it. This works for AR kinds of things too: in most of my videos where there is a CG character in my real studio and I'm looking at it, I have actually set up a monopod with its end where the character's head is, so I know where to look. In the final output the CG character covers up the monopod, so you don't see it.
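In code terms, the stand-in trick amounts to computing a look-at rotation toward a known world position. A minimal Unreal C++ sketch, assuming you have the head (or camera) position and the stand-in's position; FRotationMatrix::MakeFromX builds a rotation whose forward (X) axis points along the given direction.

```cpp
#include "CoreMinimal.h"

// Rotation that aims a head bone or camera at the stand-in's position.
// The caller should make sure the two positions are not identical.
FRotator LookAtStandIn(const FVector& HeadPos, const FVector& StandInPos)
{
    const FVector ToTarget = (StandInPos - HeadPos).GetSafeNormal();
    return FRotationMatrix::MakeFromX(ToTarget).Rotator();
}
```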
@diamondx4727 · 1 year ago
I use 2 Index controllers + 8 trackers for motion capture in UE and it is just terrible.
@GregCorson · 1 year ago
If you just apply the positions of the controllers/trackers directly to the joints of your character, it probably won't be very good. You have just 10 trackers and are trying to work out positions for at least 12 joints. Most motion capture setups have a "solver" that actually calibrates to your body to estimate all the joint positions correctly; I think there is some software on the Marketplace to help do this for the Vive, but I haven't tried it. Also keep in mind that even with professional gear, mocap is never perfect and usually requires some hand editing to clean it up. In the case of the Vive trackers, you must make sure the sensors can always see at least one base station or there will be drop-outs. Try moving the base stations around a bit to get better results.
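The calibration step Greg describes boils down to storing, once, where each joint sits relative to its tracker, then reapplying that offset every frame. A minimal sketch with illustrative names; a real solver also has to estimate the joints no tracker covers.

```cpp
#include "CoreMinimal.h"

// Per-tracker calibration: capture the joint's transform relative to the
// tracker during a known pose (e.g. a T-pose), then reuse that offset.
struct FTrackerCalibration
{
    FTransform JointFromTracker; // captured once during the calibration pose

    void Calibrate(const FTransform& TrackerWorld, const FTransform& JointWorld)
    {
        // Joint expressed in the tracker's local space.
        JointFromTracker = JointWorld.GetRelativeTransform(TrackerWorld);
    }

    FTransform CurrentJointWorld(const FTransform& TrackerWorld) const
    {
        // Child-local * parent-world = child-world, per UE's convention.
        return JointFromTracker * TrackerWorld;
    }
};
```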
@gabrielamontes2393 · 2 years ago
Hi Greg! Thank you so much for these tutorials! They're really great. I was wondering if you've ever encountered an issue where your MetaHuman's head floats above the posed body? I followed this tutorial closely (and parts 1 and 2), but for some reason the character's head pops up above the body in the section that Live Link records.
@GregCorson · 2 years ago
I thought I mentioned how to fix that in this tutorial. Sometimes the correct skeleton for the character does not get copied over into the recorded animation and you have to copy it over by hand. The usual symptoms are shoulders that look hunched and a head that sits too low or too high. The fix in this video should take care of that; let me know if it doesn't.
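The by-hand fix in the video is done in the editor UI, but the same idea can be expressed in code. A hedged sketch of pointing a recorded take back at the right skeleton asset; the function name is mine, and UAnimationAsset::SetSkeleton is the engine call that swaps the skeleton reference.

```cpp
#include "Animation/AnimSequence.h"
#include "Animation/Skeleton.h"

// Point a recorded animation at the correct MetaHuman skeleton. The
// hunched-shoulders / floating-head symptom comes from the take
// referencing the wrong skeleton asset.
void FixRecordedTakeSkeleton(UAnimSequence* RecordedAnim, USkeleton* CorrectSkeleton)
{
    if (RecordedAnim && CorrectSkeleton)
    {
        RecordedAnim->SetSkeleton(CorrectSkeleton);
        RecordedAnim->MarkPackageDirty(); // so the change gets saved
    }
}
```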
@alanjosephproductions · 2 years ago
This is great info; I've been driving myself crazy trying to figure out why my MetaHuman looks like a monster after animating. Question: if I choose Live Link as the source in Take Recorder and target an existing sequence for the recording to be inserted into after recording is completed, then unlike yours, I don't see any data captured from my Live Link source even though the MetaHumans are animated while recording... any thoughts on why this might be? Also, you had a nice shout-out from a BBC comedy BTS video about their VP work, thanking you for your help getting them up and running with their production, entitled "How we made a Comedy Series for the BBC using Virtual Production".
@GregCorson · 2 years ago
I have never had this problem so I'm not sure what it could be. When you go back to review the recording, is the Live Link track empty or not there at all? Is this with 4.27 or some other version?
@alanjosephproductions · 2 years ago
@GregCorson It's 4.27, and the track is there but with no keyframes along the timeline, so no data was captured. I believe I have this fixed after watching another video: it seems that even though I selected my Live Link phone connection under the actor's properties, it was not getting updated/set within the Blueprint, so I needed to go into the actor's BP and update that property in a second place. My most recent test captured the data you showed in your video. Thanks for the reply.
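For context, the subject selected in the actor's Details panel lives on the actor's Live Link controller component, while a MetaHuman Blueprint can cache the subject name in its own variable as well, which is why the two can disagree. A hedged sketch of setting the controller side, assuming the Live Link plugin's ULiveLinkComponentController exposes SubjectRepresentation as it does in 4.27; the subject name is illustrative.

```cpp
#include "LiveLinkComponentController.h"

// Point a Live Link controller component at a named subject (e.g. the
// phone's face-capture stream). Any subject name cached inside the
// MetaHuman Blueprint itself still has to be updated separately.
void SetFaceSubject(ULiveLinkComponentController* Controller)
{
    if (Controller)
    {
        FLiveLinkSubjectRepresentation Rep = Controller->SubjectRepresentation;
        Rep.Subject = FName(TEXT("iPhoneFace")); // illustrative subject name
        Controller->SubjectRepresentation = Rep;
    }
}
```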
@GregCorson · 2 years ago
Note there are two ways to record the Live Link stuff with Take Recorder: you can record the Live Link data directly, or you can record the "actor/character". I haven't really decided which method is best for which applications.