
Master Class - Getting The Best iPhone Mocap in iClone - by 3DTest 

Reallusion
233K subscribers
43K views

Published: Sep 15, 2024

Comments: 84
@hbstudio7891 3 years ago
I am from India. I am your big fan.
@LordCritish 3 years ago
Good to see Sean Connery is still alive.
@thomasahrenss7493 2 years ago
Seems like what I've been waiting for... Have to look further into this feature.
@AnthonyCupo0012 3 years ago
Thank you for the video! Have you made a video on how to attach and use a cable to your PC?
@baroquedub 3 years ago
+1
@doriansean7819 3 years ago
I’d love to see the setup too
@doriansean7819 3 years ago
My capture through the phone and internet isn't as good as his! He must be hooked up to the PC!
@chillsoft 1 year ago
@@doriansean7819 How? With a normal USB-to-TB cable? Or whatever Apple calls their end of the cable :P
@starwarz8479 3 years ago
How do you transfer this iClone face capture data to Houdini? Would love to see a workflow demo of transferring the entire character mocap + face capture data to Houdini.
@nathanbayne3576 2 years ago
It works by exporting an Alembic; that's what I've been doing and it works completely, but obviously the rig doesn't come in, if that's what you meant!
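To illustrate the Alembic route described in the reply above, here is a minimal Houdini Python sketch for bringing an iClone Alembic export into a scene. The file path and node names are hypothetical placeholders, and this only loads the cached geometry, not a rig.

```python
# Minimal sketch: load an iClone Alembic export in Houdini via its Python API (hou).
# The .abc path and node names are hypothetical placeholders.
import hou

obj = hou.node("/obj")
geo = obj.createNode("geo", "iclone_facecap")        # container for the cached mesh
abc = geo.createNode("alembic", "facecap_abc")       # Alembic SOP that reads the cache
abc.parm("fileName").set("$HIP/export/iclone_facecap.abc")
abc.setDisplayFlag(True)                             # show the imported geometry
abc.setRenderFlag(True)
```

From there the cache should play back on Houdini's timeline like any other Alembic; retargeting the motion onto a rig would be a separate step.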
@snakeeyesdeclassified 1 year ago
Thank you for this, very helpful.
@JimmeeAnimAll 2 years ago
[QUESTION] Is there any difference between iPhone models for face performance capture? Would you recommend one model over another, or does Face ID / TrueDepth / the dot projector (whatever is responsible for mocap) work the same from the iPhone X through the iPhone 13 Pro? Thank you for the video.
@brianmercerjr2282 3 years ago
Awesome
@arnoldo001 2 years ago
AWESOME Video, Thanks. Could you share the "stage" or "visual" setup of this scene? It really, REALLY looks great
@andreasgeorgiou1901 2 years ago
Is it possible to store a strength multiplier preset? We have many animations, and so far we've had to adjust the strength to the same parameters on each animation.
@ToysRUsKid_Critter 3 years ago
Excellent!
@GTSongwriter 7 months ago
I've invested in iClone 6 & 7 & CTA 5... I want the "iPhone Live Face Profile" but I can't afford it.
@JGooden762 3 years ago
It must have been really expensive to hire Michael Caine for the voiceover of this video...
@prankfever8877 2 years ago
Is it possible to use the Live Face app with the Samsung Galaxy S20? It's only $499.
@doriansean7819 3 years ago
So is the mocap in this video over Wi-Fi or cabled to your PC?
@cinemarks.3d 3 years ago
Great update! I currently use an iPhone X and am wondering if the iPhone 12 will improve the performance too. Does anyone have experience with that?
@reallusion 3 years ago
The iPhone 12 is faster and will improve performance. Especially when tethering the connection.
@sliderssli1769 1 year ago
Hello! I came across your lessons. I have one problem, and I haven't found a solution to it anywhere. I installed iClone 7 and created characters in it. Then I bought an iPhone and decided to record facial animation following the tutorial. I did everything according to the tutorial and connected the iPhone (the iPhone showed that everything was connected; there was a mask on my face). I selected the iPhone in the parameters and entered the phone's address. In other words, I did everything as in the lesson. But when I click Record/Preview, nothing happens and the character does not repeat my expressions. I checked the character's expressions manually, and everything moves. I also checked during the preview whether the coordinates of the face points are transmitted, and they are. Yet the legs and face remain static. What is the problem? Is a plugin missing, or did I forget to tick a checkbox somewhere? I haven't found a solution to the problem.
@mamadoudiallo9139 3 years ago
Bring this to Mac already
@enjoyenjoy5721 3 years ago
But HOW can there still be no iClone 7 for MacBooks? Hasn't anyone asked about this before?
@cezanneali 3 years ago
iMacs are poor machines for this kind of work, or for rendering anything in general.
@RetzyWilliams 3 years ago
People asking for Android: ARKit is Apple's proprietary face tracking tech. Until Android itself comes out with a similar tech, Android can't do it, even if RL wanted it to. It (1) needs a special camera like the iPhone X has, and (2) needs ARKit-like software. I hope it does one day.
@jdsguam 3 years ago
"ARKit is Apple's proprietary face tracking tech." - That pretty much settles it, don't you think?
@imdeby 3 years ago
@Reallusion Is it possible to export just the facial motion into Blender without it being applied to a character?
@fidel_soto 3 years ago
Is the smoothing feature only for Live Face for iPhone or does it work with Faceware? Edit: Just saw the smooth option on the clip directly
@Alaz21 3 years ago
Unbelievable, it is cool.
@TXanders 3 years ago
Has this just been reuploaded?
@marcelo9655 3 years ago
😨😨👍👍 amazing
@fathouyniat8797 2 years ago
Well, can I import my own character with a rig and facial setup on it and control it in iClone?
@helenawilsena5821 3 years ago
I don't have an iPhone. Does it work with an Android phone or a DSLR camera?
@sebastianvignau4333 3 years ago
Great video. Question, can you do the MOCAP from a previously recorded video instead of using the LiveTrack? Is it better in any way?
@tsechee 3 years ago
Can I use multiple iPhones to capture multiple characters?
@PalmaMultimedia 3 years ago
Xlent
@wallstbets4865 3 years ago
Can I use my webcam with this application or do I need to buy an iPhone? Also, does it record my voice while I talk?
@mrkshh 3 years ago
Only iPhone.
@MichaelZurcher 3 years ago
Is the African American model you use in this video a preset character or one that you have modified? I am not seeing it in Character Creator. Thanks!
@doriansean7819 3 years ago
Modified
@dabneeghmoob3D 2 years ago
Can I use Android?
@WerIstWieJesus 3 years ago
Is AccuLips free for iClone/CC3 users? Can I record clips outside with a smartphone and apply them later in-house with iClone?
@Alex_Lebron_Animations 3 years ago
Yes, it's free. It's part of the new update that was released 2 days ago. I'm not sure if I understood your second question correctly, but you can't use recorded clips with Live Face. It can only capture a live feed. Hope that answers your questions.
@WerIstWieJesus 3 years ago
@@Alex_Lebron_Animations Thank you very much.
@over-e1834 3 years ago
We need this for Android!
@over-e1834 3 years ago
@@pushingpandas6479 So instead of a phone, can I just buy a camera that has the same capabilities as the iPhone?
@flixels5520 3 years ago
Sort of ironic that the software is Windows-only.
@marvinmartin6246 3 years ago
Why is this only for iPhone users?
@reallusion 3 years ago
Because only the iPhone has a TrueDepth camera built in. When Android has this, we will also support it with this plugin.
@meglaarif 3 years ago
What about Android?
@reallusion 3 years ago
Android phones do not have a TrueDepth camera.
@dreagea 3 years ago
Which iPhone works best? iPhone 10, 11, or 12?
@metulski1234 3 years ago
I would say the iPhone 12 Pro and Max should work really well. The LiDAR scanner is awesome.
@dreagea 3 years ago
@@metulski1234 Thanks
@frankcabanski9409 3 years ago
Is this better than Facecap Live? I have that, and the motion around the mouth is awful; it stretches in odd ways.
@MikeDKelley 3 years ago
If you can't tell the quality after watching this video, there's not really much more to tell you; it's laid out pretty well here how it works and what it can do. You can compare it yourself to the results you're getting otherwise to make your decision.
@frankcabanski9409 3 years ago
@@MikeDKelley I saw videos of Faceware Live (not Facecap, my mistake) for iClone. It looked terrific. I bought it. It's awful - weird stretching on the mouth - unusable.
@MikeDKelley 3 years ago
@@frankcabanski9409 Did you see how everything was constructed, how all the minute parts came together, as shown in this video? I've never seen such a video for Faceware. Indeed, I saw just the opposite: folks showing that Faceware was very difficult to get right. Now, I bought Faceware as well, but Live Face is so much better that I don't even have FW installed on my system anymore.
@frankcabanski9409 3 years ago
@@MikeDKelley See, that's what I'm looking for: word from users/buyers. It will be around $600 for this plus the iPhone. Also, I don't know if Face Mojo is better or Facemotion; I guess they worked together for a while, but then they split.
@MikeDKelley 3 years ago
@@frankcabanski9409 If you have Motion Live (which you should, for Faceware) you only need the Live Face plugin, which shouldn't be $600 (the last time I looked it was around $300, though I admit that was a while ago; I bought it when it was first released at around $200). You can get a used, unlocked iPhone (you don't need service) for less than $500 (perhaps even a lot less), but be sure it has Face ID (the 3D camera; almost all of them from the last three years do). I guess if worse comes to worst you could turn around and sell the iPhone for about what you paid if you REALLY didn't like it, but that's hard for me to imagine, not when it's this good (the ARKit stuff is something even the pros have been hankering after).
@أرحعقلك-ح2ط 3 years ago
We need this for Android.
@CarloMercadoJudgementProject 3 years ago
For now that is impossible, because Android phones don't have cameras of mocap quality like the 3D cameras on iPhones; they need the hardware before it can be released for Android.
@Alaz21 3 years ago
But how can I create my own face on it?
@JeffersonDonald 3 years ago
Via the Headshot plugin with Character Creator 3.
@Alaz21 3 years ago
@@JeffersonDonald Thanks for your reply. Can you link me to a tutorial video, please?
@fayhiba431 3 years ago
Guyyys your website is down xD
@hyperface2050 3 years ago
If only this video were true!!!! But it is not. I too am shocked by the quality of the mocap here, but even more so because I have an ultra-modern computer with an RTX 3070 etc. and an iPhone 11 Pro, and when I boot up the same software, the CC3+ character doesn't move AT ALL like in this demo. She's jerky, freezes for ten seconds at a time, then rushes through all the bad mocap data from the past ten seconds, then freezes, then jerks. It couldn't be more different from this video. Why? Well, the folks from Reallusion haven't explained. Numerous comments below have confirmed the same problems. Many have asked if the Wi-Fi connection is an issue and whether the folks from Reallusion are hardwiring their connection (this is likely the case), or whether they are indeed cheating by recording the mocap beforehand and then playing it back as if it were a realtime preview. Either way, they need to provide all the specs and connections for a demo like this: the computer model and specs, the phone, and everything to do with the connection. If it is Wi-Fi, what are the Wi-Fi specs? If there is a cable, they need to show us how it can be done. The folks from MocapX (not a solution I recommend) do provide an easy USB connection, which eliminates the Wi-Fi problem, and at least that part works better with MocapX. Still, MocapX provides no calibration, no zeroing, no smoothing, etc., and after owning MocapX for a month I have YET to do convincing mocap. That is why I am now looking at iClone. But wow, the JITTERS and FREEZING are insane.
@mehdi.shiraziu7559 3 years ago
I also have this problem and could not get it to work. The character exported from CC3 completely loses its beauty in iClone.
@hyperface2050 3 years ago
@@mehdi.shiraziu7559 Yeah, CC3 characters do not export well, but nearly all the goodies are there in the export. It just takes a while with your NLE to learn how to adjust the settings to get something that not only looks as good as the CC3 images, but even better. Better? How? Well, that's easy: CC3 doesn't do great work on subsurface scattering or displacement (two essential techniques for making characters look 100% human). Pick a render engine (I use Arnold), learn the ins and outs of it, and after a month of practice your characters can really come alive.
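As a rough illustration of the render-engine tuning mentioned above, here is a minimal Maya/Arnold sketch that enables subsurface scattering and hooks up a displacement map on an imported head. It assumes the MtoA plugin is loaded; the mesh name, texture path, and weight values are hypothetical starting points, not recommended settings.

```python
# Minimal sketch: Arnold (MtoA) skin setup in Maya for an imported CC3 head.
# Mesh name, texture path, and values below are hypothetical placeholders.
import maya.cmds as cmds

# Arnold surface shader with some subsurface scattering enabled
skin = cmds.shadingNode("aiStandardSurface", asShader=True, name="skin_mtl")
cmds.setAttr(skin + ".subsurface", 0.3)                              # SSS weight
cmds.setAttr(skin + ".subsurfaceColor", 0.9, 0.6, 0.5, type="double3")

# Shading group so the material can be assigned to geometry
sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True, name="skin_mtlSG")
cmds.connectAttr(skin + ".outColor", sg + ".surfaceShader")

# Displacement driven by a height map (hypothetical path)
height = cmds.shadingNode("file", asTexture=True, name="skin_height")
cmds.setAttr(height + ".fileTextureName", "textures/head_displacement.exr", type="string")
disp = cmds.shadingNode("displacementShader", asShader=True, name="skin_disp")
cmds.connectAttr(height + ".outAlpha", disp + ".displacement")
cmds.connectAttr(disp + ".displacement", sg + ".displacementShader")

# Assign the material to the imported head mesh (hypothetical name)
cmds.sets("CC_Base_Head", edit=True, forceElement=sg)
```

The exact weights, radii, and maps depend on the character and the look you are after; the point is only to show where subsurface and displacement plug in once the CC3 export is in your renderer of choice.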
@hyperface2050 3 years ago
BTW I did get my USB cable connected to get this mocap stuff working more smoothly. The key is to make sure you are connected as a 'hotspot' even though you are using a cable.
@joshmiller8392 3 years ago
This looks so weird.