I've checked the updated files. Your download now looks like this: "Dollars", "Dollars_Metahuman", "Dollars_MONO". The only change is the removal of the "DollarsMarkerlessUE" folder; the files now download directly as "Dollars". In the video, I showed how to copy the "Dollars" folder from inside the "DollarsMarkerlessUE" folder into the "Content" folder. But now things are simpler: since the download is already named "Dollars", all you need to do is copy the "Dollars" and "Dollars_Metahuman" folders into the "Content" folder. Hope this helps!
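If you'd rather do that copy step from a script, here's a minimal Python sketch of the folder copy described above. The `downloads` and `content` paths are placeholders, not real paths from the video; point them at wherever you extracted the download and at your own project's Content folder.

```python
import shutil
from pathlib import Path

def install_dollars(downloads: Path, content: Path) -> list[str]:
    """Copy the 'Dollars' and 'Dollars_Metahuman' folders into the
    Unreal project's Content folder. Returns the names of the folders copied."""
    copied = []
    for folder in ("Dollars", "Dollars_Metahuman"):
        src = downloads / folder
        if src.is_dir():
            # dirs_exist_ok lets you safely re-run after a partial copy (Python 3.8+)
            shutil.copytree(src, content / folder, dirs_exist_ok=True)
            copied.append(folder)
    return copied

# Example (adjust these placeholder paths to your machine):
# install_dollars(Path.home() / "Downloads", Path("C:/MyProject/Content"))
```

This only automates the drag-and-drop step; Unreal still picks the folders up from the Content Browser the same way.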
Thank you for your coverage of this app. I'm really thinking of starting mocap soon, and these kinds of reviews of different products and services give me a good idea of my options and price range.
I'm ready to kiss you, my friend, there is no limit to my happiness, thank you very much. There is nothing in the official docs about the individual Dollars folders, and the files download without them. You are a great man, thank you a hundred more times.
Yes, you can. Start by setting up body mocap, then move on to a standard Live Link setup. If you want better performance for the body and head, just turn off the 'head rotation' option in the Live Link settings.
Can you make a video about it? I think Dollars' face mocap performance is lower than Live Link's. It would be much better to do body mocap with Dollars and face mocap with Live Link.
Hello :) I was trying to bring the motion capture from your last video (Real-Time Motion Capture with a Cheap Webcam (or iPhone): TDPT - Unreal Engine 5 Tutorial) to the newest version of UE5, and before that to UE5.3, but it didn't work. The newest version just doesn't show the mocap receiver plugin. That's really sad, because I like the new MetaHumans more and I wanted to use all of it in a level I had created. I will now try to downgrade everything. All in all, I have to say I'm a bit disappointed (not in you, in general). I'm a total beginner and I just want to start making movies, but it's just so complicated, and it seems there is no good solution that is cheap. I have to say the version you present in this video actually doesn't look good: the MetaHuman looks strange and unnatural and shifts on the ground. The 3D pose tracker from your last video looks promising; I think that's because the cam is limiting the data for processing. Too sad it doesn't fit UE5.4. It's kind of strange that there is no good, easy way to go with low effort. I don't get this. Thx anyway, let's get back to work and downgrade everything ^^
Can you make a video in which you create a scene with motion capture on a MetaHuman, record it, change what you've recorded with a control rig, and turn it into an animation that can be edited with a rig? Thank you very much.
Dollars MONO now has pauses every 20 seconds in the free trial. Do you use something else now, or did you just buy the full version? Thanks. Btw, great video!
Hey, your tutorials are amazing and very helpful for beginners like me 😄. I have a question: how can I composite MetaHuman animation into real-life footage, like some of the virtual influencers do? I tried searching the whole internet but couldn't find any tutorial on it. It would be very helpful if you made one of those 🙌🙌🙌
Thanks! Glad you liked it 🙌 So when you say "real-life footage", are you talking about those super realistic scenes and virtual influencers like CodeMiko? Drop an example, I'll totally check it out!
@@duz_gen I mean footage (video, MP4 format) captured with a phone or camera, like how filmmakers add CGI and VFX effects to raw footage. One example is @lilmiquela; you can check her posts and reels.
My guess is that she uses real people in her pics and puts her own face on them with Photoshop. If I were a virtual influencer, I'd do the same thing. Creating everything in 3D and making it look photorealistic? Tough job. And check out her vids closely: those backgrounds aren't real spots but 3D animations. She might be using Unreal Engine.
When I put the folders into my project folder and open the project, they are not visible... the program is working, but I can't find the folders in the Content Browser.
Figuring out the best workflow can be a bit of trial and error. If you're using an iPhone, I recommend trying Live Link + Dollars MONO. It's easy to set up and the results are pretty good.
Hello sir, can we record animation for a MetaHuman in Sequencer using Dollars MONO? I'm having an issue with finger stretching. Can you help me with this? I want to make cinematic stuff.
Hi, yes you can: Window > Cinematics > Take Recorder. Take a look at this: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xqTqWk1jUQ0.html Try wearing gloves to create contrast between your fingers and the background.
Is it possible to capture movement like playing a guitar hanging on a strap? I mean, not 'playing' an invisible guitar, but really making precise movements with both hands and then adding a 3D guitar into the scene?
It's good at tracking finger movements, as long as your webcam can see every finger. I'm not sure how well it would perform for activities like playing a guitar; there could be potential problems.
This is a great tutorial, thanks so much! The only bit I can't figure out is how to capture the mocap animations with Take Recorder and then get them onto my MetaHuman. How do you do this?
Glad you like it! You can use Window > Cinematics > Take Recorder. Please take a look at this: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xqTqWk1jUQ0.html
This is great, but there has to be a better way to set this up with your own MetaHuman than the process you described. If there's already a template project with it ready to go... no way to just migrate?
I was thinking about this idea while working on the video! 😀 I tried changing a few files, thinking it should work, but didn't get the result I wanted. Setting up body mocap is easy, but face mocap takes a bit more time and seems a bit complicated. Perhaps Dollars Mocap might figure it out soon.
Was there not an option to move the mouth line onto your mouth position on screen? That could improve the capture and matching of your speech and mouth expressions, no?
The mouth line is adjusted automatically by the software; no tweaking is allowed. In the new version, a 'facial capture sensitivity' option has been added, which improves the effectiveness of facial capture compared to the previous version.
Thank you for this insight! It would also be interesting to see how to integrate a custom-made character (for example, from Character Creator 4) into this mocap system. Character Creator 4 offers an Auto Setup for UE which already includes the rig for body and face. I wonder if the Dollars MONO mocap system can work with this.
Thanks for the tutorial. The creator of this plugin should try to add some kind of smoothing, or some control over the animation speed in real time, to give more control over jittery movements. I wish he'd implement this; it would be even better.
I've checked the updated files. Your download now looks like this: "Dollars", "Dollars_Metahuman", "Dollars_MONO". Actually, the only change is the removal of the "DollarsMarkerlessUE" folder; the files now download directly as "Dollars". In the tutorial, I showed how to copy the "Dollars" folder from inside the "DollarsMarkerlessUE" folder into the "Content" folder. But now things are simpler: since the download is already named "Dollars", all you need to do is copy the "Dollars" and "Dollars_Metahuman" folders into the "Content" folder. Hope this helps, and let me know if you need help.
That's amazing! It worked. I purchased a license through your link; idk if you're an affiliate, but just in case. It definitely helped my content creation today! @@duz_gen
Sweet, perfect timing as I start building our first public game next week. Perfect software for indies. Definitely going to purchase it. Thanks for bringing this to my attention :)
A lot of companies are about to be put out of business. Most of us have just been trying to find some decent waist-up mocap options; this including the face makes it beyond amazing. I purchased a license halfway through your video. Thanks!
Yes, you can, but there won't be any real-time integration. The only method for Blender is to record the motion capture data as a BVH file and then edit the data in Blender to create animations.
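As a rough illustration of what that recorded BVH file contains before you import it into Blender (File > Import > Motion Capture (.bvh)), here's a small Python sketch that lists the joint names from a BVH file's HIERARCHY section. The sample skeleton below is made up for the example, not Dollars MONO's actual output.

```python
def bvh_joint_names(text: str) -> list[str]:
    """Return the ROOT/JOINT names from a BVH file's HIERARCHY section,
    in the order they appear."""
    names = []
    for line in text.splitlines():
        parts = line.strip().split()
        # Each joint is declared as 'ROOT <name>' or 'JOINT <name>'
        if parts and parts[0] in ("ROOT", "JOINT"):
            names.append(parts[1])
    return names

# Made-up sample skeleton, just to show the structure:
sample = """HIERARCHY
ROOT Hips
{
  JOINT Spine
  {
    JOINT Head
    {
    }
  }
}
MOTION
"""

print(bvh_joint_names(sample))  # prints ['Hips', 'Spine', 'Head']
```

A quick peek like this can help you check that the exported skeleton matches what your Blender retargeting setup expects before you start editing the animation.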
Why can't companies just use cameras and software instead of having us spend thousands of dollars on freaking suits that could be a waste of money in the future?
MOVE AI has a camera-based motion capture app going for $500/month, and that's for the standard dual-camera package. The problem is it's pricey, and you'll also need cameras or iPhones; five cameras would be a good starting point. People seem to like it, but I haven't tried it yet. I'm curious whether it's as precise as traditional mocap suits.