Johnny HowTo
Host of "How To's" and insight to Digital and Real world media techniques -- from a Nerd who has worked and taught in many of them.
Facebuilder: 3D Heads from 7 Photos?!
8:52
6 months ago
Reallusion AccuFACE Demo and Review (JhowT)
11:22
8 months ago
How to get RTXGI in UE4!
14:25
4 years ago
How to Use 2 Point Tracking in Nuke
13:10
4 years ago
How to Use 1-Point Tracking in Nuke
21:54
4 years ago
Nuke Keyer Showdown / Comparison
22:57
4 years ago
Comments
@ParikshitBhujbal 5 days ago
Hi Johnny, I have some models from the UE Marketplace with facial blendshapes that I want to animate. I am confused between LiveFace and AccuFace. I do not own an iPhone but do have an RTX 3080 Ti. What are your experiences with the two, and which would be better for me? Thanks.
@HarryClipzFilmz 29 days ago
Can this work in iClone 7?
@satakarnibommuluri946 1 month ago
Finding input photos for this plug-in is challenging. Is there a source for AI-generated portraits, or photos shot at different angles?
@BarishaillaMUNNA 1 month ago
I don't know how to delete any objects in Character Creator and iClone, so I need help 😔
@masamiakita993 2 months ago
beautifully explained
@Tricky_NOOB 3 months ago
I keep hearing a Minecraft door opening
@marcogrob3899 3 months ago
Hello man! Thanks so much! I tried to connect my avatar into Dollars. The plug-in recognizes it, but nothing moves... neither in record nor in preview mode. Everything else connects properly. I am using a character I created in CC4.
@Dude_Slick 4 months ago
Great tools. It's nice to know that the rich people have these resources available. Good for them.
@Lanthiren 4 months ago
This looks extremely janky, and it was my experience as well that it just jitters all over the place, even though I have a high-quality camera and a strong PC setup.
@Pazeee1 2 months ago
Hi, please make a video of what you experienced so that people can see the original results of this software 😊
@IvanVladev 5 months ago
God bless you man. Works for plugins from Marketplace too.
@JhowT 4 months ago
Hey, I'm glad to hear this! I haven't used a source code build in a long time, but had always wondered if that would indeed work. Thank you for the update! :D
@buda3d2007 5 months ago
Can you turn around, and would it track that motion?
@JhowT 5 months ago
That's a good question -- I didn't try that but I assume it would get the torso, legs, and head. Depending on where your arms are, it of course wouldn't "see" them from the camera's point of view -- so those wouldn't get tracked. If they were out at your side, though, I'm imagining they would.
@ChrisGeo-hk4qj 6 months ago
Thanks for the video. I'm running into a problem: I'm almost done with my first animation, and I'm trying to use this to go back and add some hand movements. I'm working on a section at the end, and when I hit "record" it takes the avatar back to the center of the stage instead of leaving it stationary where the last motion clip ended. I don't know if this has to do with the GI Anchor, as I'm using a template for the background, but any advice for a noob would be helpful!
@JhowT 5 months ago
Eek, sorry...I definitely would run into issues like this at times as well, which I'm assuming were mostly me just not being sure how to 'add' instead of 'replace'. I think inside of iClone you can 'add' animation with the puppet tools or something of that sort -- I'd maybe search Reallusion's documentation a bit to see if you run into that section. Sorry I couldn't be of more help. :(
@ChrisGeo-hk4qj 5 months ago
@@JhowT Thank you for taking the time to reply!
@JhowT 5 months ago
@@ChrisGeo-hk4qj You're very welcome and, again -- sorry for the long delay in getting back to you!
@ale_straccia 6 months ago
It's running very slow on my laptop (i7, 48 GB RAM, GeForce RTX 2060). Is it the computer, or is there any setting or trick to make it run faster?
@JhowT 5 months ago
I think I made sure to use a video resolution of 1280x720 -- I think running the webcam higher than that had my system bogged down a bit as well. Similar things happened with AccuFace, so I don't think it's a knock on Dollars Mocap or your system (nice system btw!). Let me know if that helps!!
@ale_straccia 5 months ago
@@JhowT Awesome, very good insights! Thank you very much!
@JhowT 5 months ago
@@ale_straccia You're very welcome; hope some of it helps!!
@WillyCash69 6 months ago
Awesome review bud, especially the Text to Motion
@WillyCash69 6 months ago
Just curious, how is Dollars Mocap compared to AccuFace for facial expressions? There's a 5x price difference between the two.
@JhowT 5 months ago
Hi! I apologize for taking a bit to respond here -- I'm going to be doing so to quite a few comments! Solely for face motion capture, I didn't notice a 'giant' difference, as in both cases you'd want to run the audio through AccuLips, which would definitely improve the quality. Otherwise (with either one) the mocap gets the overall motions, but might miss some of the detailed shapes (and for sure the tongue movements).
@LPMOVIESTVSofficial 6 months ago
Hello my friend, very cool! Is that the $99 one you're using, my friend?
@JhowT 5 months ago
Hi! I apologize for taking a bit to respond here -- I'm going to be doing so to quite a few comments! It is indeed the $99 one...it's a pretty unbelievable value TBH. I appreciate Reallusion's own plugins as well, but bang-for-buck...it's pretty amazing!
@selamatmenontonpemirsa 6 months ago
Does this software support exporting and retargeting to any model in Blender? It would be a great tutorial for Blender users like me.
@JhowT 5 months ago
Hi! I apologize for taking a bit to respond here -- I'm going to be doing so to quite a few comments! If you are using this, it should have export options (from what I recall) for Blender & most 3D packages. I've only used it with iClone (since I have it), but I'm "pretty sure" I saw that in there somewhere. ;)
@selamatmenontonpemirsa 5 months ago
@@JhowT thanks for the response
@JhowT 5 months ago
@@selamatmenontonpemirsa You're very welcome -- sorry again for the long delay in my initial reply!
@coryharris1168 6 months ago
Why can't I hear audio while doing it live? AHHHHHH
@JhowT 5 months ago
If you're doing things live, you might have to look into how you can route the audio to your streaming method. I know that's VERY vague, but I'm sure there's a thousand different ways people get audio going for streaming. In the case of iClone, perhaps turn off audio capture altogether, since you're not going to be recording the audio and refining the animation later. That should leave the audio 'free' for your streaming software (like OBS) to latch onto. :) Just some thoughts off the top of my head -- hope you get / got it working!
@lifestoryentertainment 6 months ago
Thank you so much. I have a couple of questions. I’m using the same gear as this; iClone 8 & Mono. Is there a way for me to edit the motion file recordings in iClone? I only saw the recording in the timeline, but couldn’t find a way to edit the motion. Felt like I maybe just missed it. Do you know if it would allow me to capture the hands and body with one camera using Mono, and use a second camera to record the face capture? I would be trying to use MotionLIVE/AccuFace to record the face. I watched the video the dollars mocap people suggested that I watch, but I didn’t see any mention of a second camera. I only have one web cam, but was considering adding an action camera to record the hands body if it worked. Do you know? I’ll write the company if not. I tend to learn when I get my hands on it, and there aren’t many manuals for this new innovative 3-D gear. 😊 Thank you so much for your time and putting this video together. It was very helpful.
@lifestoryentertainment 6 months ago
Oh, sorry -- I use "Edit Motion Layer" in iClone to animate motion… is that correct? 😊 Sorry, I never got further than editing the face in viseme to date. 🦁
@solidkundi 7 months ago
if i want to make something similar at a tradeshow where an avatar can engage with audience, how many Reallusion apps will i need?
@JhowT 7 months ago
Hmm, well admittedly I haven't used the 'live' functionality very much (I'm almost always recording and then refining the animation). That being said, I'm pretty positive that's possible. If you wanted to create a custom avatar and then animate it, you'd need Character Creator 4 and iClone 8. As far as the actual face capture, you have a couple options. If you have an iPhone with a TrueDepth sensor, you can get the LiveFace plugin for iClone 8. If you don't, but have an Nvidia RTX GPU, then you can use AccuFace. Either one would work nicely I think.
@RyanWieber 7 months ago
If you find the process of lining up frames to each other too tedious, consider that you could just stabilize the footage first and then proceed as described patching your frames together. And since they are all stabilized, you can pick as many frames as you want and won't have to line up each of them. In perhaps another situation with many frames that each have useful background pieces, the reveal-painting each layer may also become impractical, in which case you could try simply punching out the action (Owen) first, with a loose roto stenciling, and then just stack stabilized frames over each other, which will reveal through to whatever layers have that region. You can then noodle the mask feathering etc while looking at the stacked-up result to make sure you don't get any hard edges.
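The stack-and-reveal idea described above can be sketched numerically. This is a toy illustration (not Nuke code, and the frame data is made up): once frames are stabilized/aligned, a per-pixel median across the stack drops a moving subject that covers any given pixel in only a minority of frames, leaving a clean background plate.

```python
import numpy as np

def clean_plate(frames):
    """Per-pixel median across pre-stabilized frames.

    frames: list of H x W uint8 arrays, already aligned to each other.
    A pixel covered by the moving subject in fewer than half the frames
    falls back to the background value under the median.
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    return np.median(stack, axis=0).astype(np.uint8)

# Toy demo: a flat gray background (value 100) with a bright "actor"
# blob that sits in a different spot in each of five aligned frames.
h, w = 8, 8
frames = []
for i in range(5):
    f = np.full((h, w), 100, dtype=np.uint8)
    f[2:4, i:i + 2] = 255  # the moving subject
    frames.append(f)

plate = clean_plate(frames)
print(plate.max(), plate.min())  # prints: 100 100 -- the subject is gone
```

The median (rather than a mean) is what makes loose roto/stenciling forgiving here: outlier values don't bleed into the result, so mask feathering only has to be roughly right.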
@AlphaSpaceDev 7 months ago
For some reason I couldn't find the info provided here in the video -- on how to copy/paste the plugins into the Runtime folder -- anywhere else. Worked! Thanks
@JhowT 7 months ago
Pleased to hear it!! This series doesn't get much viewership anymore, but it's nice to know it's still useful. :)
@midnitejesus 8 months ago
Is it possible to get the live motion capture into OBS?
@JhowT 7 months ago
I've never tried that myself...are you meaning using it as a live-stream source (live animation)? I'm pretty sure you can, though it of course wouldn't be as 'refined' as something recorded & then tweaked. :)
@3DGraphicsFun 8 months ago
*Outstanding stuff for sure!* I use AccuFace and LiveFace all the time. I sometimes prefer one over the other, but one thing's for sure: AccuLips is critical with both. Really *great* examples you have shown :) ...Cheers! *IcloneFun🤗*
@JhowT 8 months ago
Thanks so much for leaving a comment that you enjoyed the video! Yes, it's nice to have options and I kind of keep going back and forth. Thankfully I already had the needed hardware, so getting up and running was a little cheaper than it would have been otherwise.
@3DGraphicsFun 8 months ago
@@JhowT You are welcome. Great channel. I use AccuFace and LiveFace so much on my old gaming laptop (RTX 3600) and my new gaming one (RTX 4800), and for me there is no speed difference in iClone 8, CC4, AccuFace, or LiveFace. Maybe I did not need to buy the more expensive laptop haha :)
@JhowT 8 months ago
@@3DGraphicsFun Haha, yeah sometimes it's hard to tell what / where the payoff will be. I'd imagine that if you're rendering out of iClone at 4K, the extra VRAM of the newer GPU would allow it to work more efficiently.
@3DGraphicsFun 8 months ago
Ever since I discovered Topaz Video AI, I never need to render anything in iClone or CC4 at 4K. After everything is done in iClone, I bring it into Topaz and in 10 minutes get such beautiful, super-enhanced 4K. Because of that, I guess I really did not need the new gaming laptop after all. haha ... *Cheers😊IcloneFun🤗*
@JhowT 8 months ago
@@3DGraphicsFun I've heard a lot of good things about that -- though admittedly still never used it!
@toddwaddingtonproductions4227 9 months ago
This is great. It covers so many other smaller details that answered questions I didn't even know I had! Thanks!
@JhowT 8 months ago
Thank you -- I'm very pleased to hear that it helped you get the information you needed (and didn't even know you needed)! ;)
@garystinten9339 9 months ago
Are Syncovery versions compatible with one another? I.e., is a newer version compatible with an older version?
@JhowT 8 months ago
I would assume that somewhere along the line their code would change enough to where they wouldn't be compatible to run alongside each other. I use my version regularly though, and it's definitely on the older side of things.
@Aghashujaakhan 10 months ago
I opened the Terminator T-799, but was unable to find the FBX model. I opened Arnold.obj but was unable to see the texture. In the Modify tab I located the texture (Shader Type: PBR) and set Strength to 100%, but it did not work. 😞
@JhowT 10 months ago
Hi! While I can't 100% say with confidence (at the moment), I "think" I might have made an intermediate stop to export it as an FBX from another application. That being said, you "should" be able to import the OBJ and then assign the texture(s) after the fact?
@TimV777 10 months ago
Will this work if my model has horns built in? I just want the face topo so I can use the animation features.
@JhowT 10 months ago
Hi, and sorry that it took me a short while to respond. How detailed are the horns? The most straightforward way I can think of doing this would be to not include the horns in the Headshot 2 topology, and then add the horns, basically, as an 'accessory' after the fact. It'd be the same process as, for instance, putting a pair of glasses on the character; they will be parented to the head (so they will move along correctly). Hope this gives you some ideas & will be of some help!
@muralirajanfx8590 11 months ago
Wow nice 🎉
@JhowT 11 months ago
Thank you! Pleased you found it helpful. :)
@davidmonteiro6822 1 year ago
Is it possible to do this workflow without attaching to a CC4 body? I actually just want to get my custom character head into CC4 and use this workflow to add all the CC4 blendshapes to a custom character.
@JhowT 1 year ago
Hmm, I'm not "fully" sure. I think it's likely going to want you to attach it to a body...since I can't really think of a time where I've seen an isolated head as I've used the CC4 pipeline. This may just be me not having tried it though -- and that it is actually do-able. Sorry I can't give you a definite answer! If in doubt, I'd download the trial and see if it can do what you're wanting it to. I had done that in the past with a couple Reallusion plugins. :)
@aleksandrstukalov 1 year ago
1:25 I can't find a link to this page in the description :(
@mrhmakes 1 year ago
Thank you! Excellent, detailed tut!
@JhowT 1 year ago
Glad it was helpful! :)
@argus-ek5xn 1 year ago
You’re the first one that explains the headshot 2 procedure completely. You don’t miss any steps. I was waiting for a tutorial like this. Thank you.
@JhowT 1 year ago
Sorry I'm late in responding to this -- but I'm very pleased to hear that you found it useful...that's definitely what I was hoping for! :)
@vannyyorn 1 year ago
🤩🤩🤩
@derhenker6629 1 year ago
Is there some difference? I see nothing.
@JhowT 1 year ago
Hi there. You'd need to tell me what part of the video you are referring to for me to answer. It can definitely make a nice difference though! Having that bounce light is wonderful. Lumen (in UE5) does similar things built-in, but I don't think it is as performant as RTXGI is. Hope this somewhat answers things for you!
@vinograd1435 1 year ago
Very nice. I would like to learn professional deepfakes.
@JhowT 1 year ago
This is definitely an avenue where you could create a digital-double of someone. Perhaps better suited to stylized purposes, but very, very effective! :)
@jamesandtashadixon 1 year ago
Brilliant tutorial. Thank you!! 🙏
@JhowT 1 year ago
Pleased that you found it useful! It's a pretty amazing new feature, so I was excited to talk about it.
@apmanti12 1 year ago
Hey Johnny, hope you are doing good! Your tutorials are awesome and so are you!
@JhowT 1 year ago
Thanks so much -- it's really encouraging to receive a comment like yours! I hope you are doing well, too!! :)
@kostavsheoran1530 1 year ago
Man, you explained it the way it should be explained. THANK YOU!
@JhowT 1 year ago
Haha, you're very welcome -- glad it made sense in the way I worded & showed! :)
@deeber35 1 year ago
What if you have 2 faces in the dest video? How do I apply the source face to only 1 of the dest faces?
@JhowT 1 year ago
Hi there! Sorry for being a bit late in responding to your question. So in the process you eventually get to a step where the application "shows" you the detected faces for the source and destination (to train off of). What you would do is delete all the faces for the other character you DON'T want swapped out. Does that make sense? :) Then the training will run and only work on what you specified.
@aroidpapa 1 year ago
I would love to see someone do this with really badly lit BS/GS footage with seams all over the place. Most of these tutorials are just unrealistic, and it takes a lot more finesse to achieve a good key.
@JhowT 1 year ago
This is footage shot at a college where I used to teach; I used it for my compositing class. The school definitely had some nice facilities, but I wouldn't call it "unrealistic". Shots can definitely be more challenging though -- I definitely agree. There are some other past shots I'd use in class as examples that were of much lesser quality, but I don't have tutorials with them up right now. Sometimes [in general], to show a concept it's better to use something that's not the 'most challenging thing in the world' so it doesn't become overwhelming. If I were using really challenging footage, I'd likely call the video something more along the lines of "keying not-ideal footage" or something silly like that. ;)
@superpanter4902 1 year ago
damn you are so f'ing good at explaining
@JhowT 1 year ago
Thank you -- I sincerely appreciate you taking the time to tell me that. It was a huge road to try and learn the process on my own, but thankfully at that point I was able to verbalize it decently because of the 'struggle' to get there. ;)
@2badger2 1 year ago
What does the "Raw Data" tab do and can you remove it to make the file smaller? Thanks for sharing this video.
@JhowT 1 year ago
Hi there. This is applicable to photos and digital cinema, so I'll talk a bit about both. For video, the RAW data is only going to show up for, well..."raw" formats like RED footage, Blackmagic RAW footage (it used to be DNGs, but now BRAW files), etc. For photos, if you've used a DSLR or a camera that can shoot in a RAW format (and have it set to do so), it captures more data than you can actually see in the photo...and because of that it gives you more to work with. For instance, if you take a photo as a JPG on your phone (or off the internet) and the sky is 'blown out' to white, there's no data there, and if you try to darken the photo, it's just going to be a 'light gray' color. If you shot in a RAW format though, there's a good likelihood that if you take it into something like Photoshop, you can lower the exposure and...lo and behold...there will be cloud or sky data there to bring back! The same goes for deep shadows...you can lighten those areas and there will be 'imagery' there. To lower file sizes, the 'easiest' way would be to set your camera to capture in something like JPG, etc. That extra data will never be captured in the file, so it will be smaller. The 'safer' method, though, would be to take your photos in RAW, and then save them out to a different format after you make those adjustments. Hope the above helps a bit and makes sense; happy shooting! :)
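The highlight-headroom idea above can be shown with a toy numerical sketch. The numbers are made up (this is not any camera's real tone curve): the 8-bit "JPG" clips the bright sky to a uniform 255, so pulling the exposure down only produces flat gray, while the 12-bit "raw" values survive the same pull with their detail intact.

```python
# Toy illustration of RAW highlight headroom vs. an 8-bit JPG.
# Scene values as a hypothetical 12-bit sensor would record them:
sky_raw = [3600, 3900, 4000]  # a bright sky, NOT clipped in raw

def jpg_clip(v):
    """Made-up in-camera JPG transfer: scale down and clip to 8-bit."""
    return min(255, round(v / 10))

sky_jpg = [jpg_clip(v) for v in sky_raw]  # -> [255, 255, 255]: detail destroyed

def pull_exposure(values, stops, white):
    """Darken by `stops` f-stops (halve per stop), then requantize to 8-bit."""
    return [min(255, round((v / 2 ** stops) / white * 255)) for v in values]

# Pull 1 stop on the clipped JPG: every pixel lands on the same gray.
print(pull_exposure(sky_jpg, 1, 255))   # -> [128, 128, 128] (flat gray)
# Pull 1 stop on the raw values: the cloud/sky detail comes back.
print(pull_exposure(sky_raw, 1, 2550))  # -> three distinct values
```

The point of the sketch: once values hit the file's ceiling they are all identical, and no later adjustment can tell them apart; the extra bits in a raw file are what make the "lower the exposure in Photoshop" trick work.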
@2badger2 1 year ago
@@JhowT Thank you, that now makes sense. Once you've done your editing and converted to JPG, is it possible to erase the "Raw Data"?
@narnasqueneth 1 year ago
Hi Johnny. Can you advise why I might not be getting the popup allowing me to select images for preview (see 18:35 in your video)? I've also tried doing this process without a preview image, and when it says "press any key to continue" I don't get a response. I'm currently using the latest stable release of DeepFaceLab. Thanks in advance for your tips!
@JhowT 1 year ago
Hi there. Hmm, that one is a slight mystery to me. At that point in the process I don't think it's trying to load things onto the GPU yet, so it wouldn't be an out of VRAM error. I'd maybe just double check and ensure that the images are extracted to the folder(s) correctly, as that would be (logically speaking) a reason why they wouldn't be popping up.
@fkasailor 1 year ago
Cool! I created a really cool Polaroid effect back in 2021 and never saved it for future use. I was afraid of not being able to get my configurations back, but I found a way by using this video to learn about metadata. It's truly amazing, and quite scary, how much information that picture of mine was holding. Thank you so much for this video, sir!
@JhowT 1 year ago
That's super fun to hear how this video (in a round-about way) helped you find that information! But yes...it IS borderline scary the amount of information stored in photos -- that people don't necessarily even realize. For instance, when people send around phone photos, very often the GPS data is logged directly in there, so basically people know where the photo was taken! :O
@LadyKimi 2 years ago
YOU ARE AMAZING!!! REALLY, THANKS!!! Seriously, I have not found any other tutorial that explains how to make this type of mask.
@JhowT 2 years ago
I'm glad to hear from you that they are helpful...part of the reason I did these is because there wasn't much material that really explained things ultra well, and was intending to save others the 'learning pains'. ;)
@apmanti12 2 years ago
This is great! Hope you are doing good.
@JhowT 2 years ago
Thank you! All going well overall here. I definitely need to hop back in with some new content -- I've been trying to decide what to cover for a while now. Thank you for saying hi!
@apmanti12 2 years ago
@@JhowT thanks bruh, love your nuke content.
@JhowT 2 years ago
@@apmanti12 I'm glad to hear that! Nuke, while expensive, is one of my favorite applications out there.
@ronib1979 2 years ago
THANKS A LOT MAN, VERY USEFUL
@JhowT 2 years ago
You're welcome!
@sunnymon1436 2 years ago
Programmers and Game Artists: We just want to do coloured stained glass lighting for scenes in churches. The Industry: Here's a thousand ways to do bounce lighting and emission.
@JhowT 2 years ago
Haha yup...I definitely love that type of stained-glass lighting (in real life, and games!). It's definitely one of those things where baked lighting is a lot more efficient, but obviously not interactive at that point. :-/
@JhonfaRaccoon 2 years ago
Why is it that whenever I export my file, it is only 480px... how can I export it in HD?
@JhowT 2 years ago
Hmm, you mean your final render out of the software?
@JhonfaRaccoon 2 years ago
@@JhowT I mean that when I mix the target video with the frames that I trained, the trained frames look sooo pixelated -- obviously the original video is in HD.
@Kpopersempre 2 years ago
Thank you for making this video, it was very insightful! I was wondering if you were open to private tutoring and/ or available to hire. Thank you for your time
@JhowT 2 years ago
Hi Mai, and I'm glad to hear you found the video useful! Feel free to ask questions on here, that way we can all learn together. :)