How to use CAMTRACK AR 3D TRACKING with your CINEMA CAMERA in UNREAL ENGINE 5 

Hethfilms
Published: 27 Sep 2024

Поделиться:

Ссылка:

Скачать:

Готовим ссылку...

Добавить в:

Мой плейлист
Посмотреть позже
Comments: 136
@hethfilms
@hethfilms 1 year ago
Here's the official trailer for our Star Trek fan film I was talking about in the video, a few of the camtrackAR shots are in the trailer: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-kiFOlqjj41w.html
@TerenceKearns
@TerenceKearns 1 year ago
This is an awesome tutorial. Remember you can always tweak the lighting in Unreal Engine to match your shot if you can't shoot it again. Would be interesting to try a multicam setup.
@prinkitrana
@prinkitrana 1 year ago
Thank you so much 🙏🏻 this was the one tutorial I was looking for. This means a lot.
@hethfilms
@hethfilms 1 year ago
Awesome! You are very welcome! :)
@bkunal88
@bkunal88 3 months ago
Exactly what I was thinking... I mean, the phone has an amazing gyroscope, so why is no one building an app to make this live and flawless? We need an app that talks to the PC via a 5G network or Ethernet to feed the live camera position into Unreal, while the actual shooting is done on the high-resolution camera only... this whole thing could become a live shoot too... but very happy to see the results!! Amazing job!!
@hethfilms
@hethfilms 3 months ago
Thank you, there are a few solutions out there aiming for something like that. I did try to come up with a realtime version using Live Link into Unreal before I settled on camtrackAR. The main problem is keeping the cine cam and the iPhone in sync without any genlock / timecode setup etc. Check out "Jetset", which aims at something along those lines. Joshua M Kerr just did a nice video about it where they used it in some form of realtime workflow, still quite the setup to have: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-9u_8LHj9rEQ.html Another solution would be Skyglass, where you run Unreal in the cloud for realtime tracking on the phone. You can get the recorded camera track as a USDA file after the recording and follow a workflow similar to my tutorial here to match it with the cine cam. That way you at least have some sort of live tracking with a visual reference of your environment on the phone for orientation. Both solutions however are subscription based and not super cheap; camtrackAR is a one-time purchase if budget is a factor.
@lukastemberger
@lukastemberger 1 year ago
You're awesome! I've been looking for exactly what you've done except I'll be going straight into Blender. So many comments about it not being possible, but I was sure it just takes some calculating to adjust for the sensor size/lens/distance from sensor. Thank you! This should work great for my next project.
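For anyone who wants to run that calculation, here is a minimal pinhole-model sketch: the field of view a matching virtual camera needs follows from the sensor width and focal length. The sensor and lens values below are illustrative assumptions, not measurements from the video, and lens distortion is ignored:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of an ideal pinhole camera, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Illustrative comparison: a 26 mm full-frame-equivalent phone view vs. a
# 35 mm lens, both expressed on a 36 mm-wide full-frame sensor.
print(horizontal_fov_deg(36.0, 26.0))  # ~69.4 degrees (phone)
print(horizontal_fov_deg(36.0, 35.0))  # ~54.4 degrees (cine lens)
```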
@Alphain
@Alphain 9 months ago
Can you make a tutorial on how to do this in blender?
@hethfilms
@hethfilms 9 months ago
Sorry, somehow missed your comment. Glad it helped you out. Did you manage to get it working as intended in Blender?
@philippsiebler
@philippsiebler 6 months ago
You're brilliant 🔥 I didn't know yet how to solve the offset issue when the phone tracks the scene, but with the rig in Unreal that's solved smartly 👏🏼
@hethfilms
@hethfilms 6 months ago
Thanks! :) In the meantime I've worked out an even more elegant method that needs no rig and includes a live preview of the greenscreen footage. Unfortunately I haven't had the time yet to record it as a tutorial for the channel. But basically, you can also achieve the offset by creating a second transform track as additive on the tracked camera in the Sequencer and setting the offset there.
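A minimal editor-scripting sketch of that additive-offset idea, assuming Unreal's Python API; the asset path, binding name, and offset values are placeholders, not details from the video:

```python
import unreal

# Load the level sequence and find the imported camera binding
# (the asset path and binding name below are illustrative).
sequence = unreal.load_asset("/Game/Shots/TrackedShot")
binding = next(b for b in sequence.get_bindings()
               if str(b.get_display_name()) == "CineCameraActor")

# Add a second transform track whose section blends additively on top
# of the baked CamTrackAR camera animation.
track = binding.add_track(unreal.MovieScene3DTransformTrack)
section = track.add_section()
section.set_blend_type(unreal.MovieSceneBlendType.ADDITIVE)
section.set_range(sequence.get_playback_start(), sequence.get_playback_end())

# The first three channels are location X/Y/Z; set the measured
# phone-to-sensor offset (in cm, example values) as channel defaults.
for channel, value in zip(section.get_all_channels()[:3], (0.0, -4.5, 12.0)):
    channel.set_default(value)
```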
@RandMpinkFilms
@RandMpinkFilms 1 year ago
Fantastic breakdown of this tool. Thank you!
@hethfilms
@hethfilms 1 year ago
You’re welcome! 😃👍
@TaruunRJhaPHOTOGRAPHY
@TaruunRJhaPHOTOGRAPHY 1 year ago
Wonderful tutorial. Subscribed!
@hethfilms
@hethfilms 1 year ago
Awesome! Thank you :)
@gdnrecords
@gdnrecords 1 year ago
Sir, your video is amazing, thank you so much for sharing your knowledge. This app is amazing.
@hethfilms
@hethfilms 1 year ago
Thanks, glad I can help. I think camtrackAR is highly underrated for what it enables us to do! :)
@nahlene1973
@nahlene1973 1 year ago
Thank you SO SO much for making this video!
@hethfilms
@hethfilms 1 year ago
You are very welcome, happy to help. :)
@zenks_scl
@zenks_scl 11 months ago
Thank you, roboticman! I was looking for this for soooooooo looooooooooong!
@hethfilms
@hethfilms 11 months ago
Hahaha you’re welcome 😃👍
@LFPAnimations
@LFPAnimations 1 year ago
This is insane! I am going to try using this in production for low budget projects. I think this solves the issue of getting a track (even a rough one) when shooting closeups handheld on a green screen. Closeups on a green screen are normally very difficult to track because there is next to no detail to extract camera information from. Having even a rough track from the phone solves this issue.
@hethfilms
@hethfilms 1 year ago
Yes! That is exactly my thinking on this. We are still in production of the Star Trek fan film I mentioned at the end of this tutorial, and we have several close-up shots of people walking and talking in the greenscreen, all shot with this method, and it works great. Not a single marker on the screen, just pure green color. I'd say it's perfect for this type of shot, since there is no floor contact visible. Those shots are the most difficult, as you see any small errors on the feet. We had a few shots that did not work out with camtrackAR where the actors' feet were visible, too much floating / sliding etc. For those shots we had to go back to traditional 3D camera matchmoving, which we did in Blackmagic Fusion Studio.
@LFPAnimations
@LFPAnimations 1 year ago
@@hethfilms I was going to ask about feet. I'd imagine that this method only really works when you can't see actors/objects making contact with the floor. I hope one day cinema camera manufacturers will just put all the iPhone sensors inside and create their own version of ARKit for cinema use.
@hethfilms
@hethfilms 1 year ago
@@LFPAnimations Now THAT would be awesome! :) I assume the problems with floor contact are mostly related to using different focal lengths on the cinema camera than what was used on the iPhone. Also, it might be feasible to add lens distortion / correction information in Unreal for the specific lenses one is using on the cinema camera to improve results. I have not tried that yet. We had one shot in our movie that worked almost flawlessly until we tilted the camera down to show the feet of the actors while they were walking; that's where the tracking started to drift off, and after tilting up again it was spot on. We were lucky that this shot and the environment made it plausible to have a plume of steam in the frame, so we just put the steam in front of the feet in that shot so the drifting is not visible ^^
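For context on what such lens distortion information encodes, here is the common Brown-Conrady radial model, a standard computer-vision formula rather than anything specific to Unreal or this video. The k1/k2 coefficients would come from a per-lens calibration:

```python
def radial_distort(x: float, y: float, k1: float, k2: float) -> tuple[float, float]:
    """Brown-Conrady radial distortion on normalized image coordinates.

    k1 and k2 are per-lens coefficients (e.g. from a checkerboard solve);
    positive values bow straight lines outward toward the frame edges.
    """
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale
```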
@AnimeZone247
@AnimeZone247 1 year ago
Use tracking markers on green screens
@hugodatdo
@hugodatdo 1 year ago
This is so great, thank you so much. I am about to use the same method to shoot a music video.
@hethfilms
@hethfilms 1 year ago
Awesome! Hope all goes well :) Just make sure you have a plan B if something's not working as planned; it tends to happen. But we recently had a shoot for our film project, and all the tracking shots we did that day using this method worked flawlessly! :) Let us know how it turned out for you or send us a link to the music video when it's available.
@Philmad
@Philmad 1 year ago
Thanks for the effort, a lot of work.
@hethfilms
@hethfilms 1 year ago
You're very welcome! :)
@MartinLopezFunes
@MartinLopezFunes 1 year ago
Very helpful! Thanks!
@jonathanparlane1
@jonathanparlane1 1 year ago
Thanks for doing this. I thought this was a great tool but just couldn't find enough information.
@hethfilms
@hethfilms 1 year ago
Awesome, glad this helped you out. I also think camtrackAR is a valuable tool that's flying way too far under the radar. Back then I asked the developers to implement higher bitrates and color depths, and also 24/25p recording modes, to be able to use the iPhone footage itself on many occasions. They did reply, but unfortunately nothing came of it. That's why I pushed to get this more complicated workflow working. :)
@jonathanparlane1
@jonathanparlane1 1 year ago
@@hethfilms Yeah, I appreciate your work. I actually did set up my iPhone on my Blackmagic Cinema Camera 4K some months back and started to shoot tests with camtrackAR, using Blender to make the FBX, but I could not figure out how to sort the scale and position of the tracking information in Unreal. It was close-ish but not quite right. Better formats on the iPhone would be nice, but I still need to shoot with a cinema camera for my filmmaking work, so this info was perfect.
@hethfilms
@hethfilms 1 year ago
@@jonathanparlane1 Yes, definitely, smartphones are still far away from replacing a cinema camera, regardless of what their marketing says ^^
@garthmccarthy4911
@garthmccarthy4911 1 year ago
I wish there was a step-by-step like this for setting up the HTC Vive Pro
@zaakasaka
@zaakasaka 20 days ago
Great tutorial. Any idea how it can be done in DaVinci Resolve (or Studio)?
@hethfilms
@hethfilms 20 days ago
Thank you! Yes, you can totally do the compositing part I showed in After Effects in Resolve / Fusion. That's actually how we are doing it with our Star Trek fan film series now. Depending on how complex you want the composite to be, you can either align the clips in the timeline on the edit page and then use the 3D keyer on the greenscreen footage, or you align the clips on the edit page, put them in a Fusion composition, use the Delta Keyer in Fusion, and apply whatever other effects you like.
@kingnt66
@kingnt66 2 months ago
Can you share how you built your rig? What gear do you use in it?
@hethfilms
@hethfilms 2 months ago
Sure! I actually used a lot of parts that came with my Zhiyun gimbal. The whole upper part that holds the iPhone is from that; it was meant to be screwed onto the side of the gimbal's handle. The rest is just short 15 mm rods; one piece is a two-way rod connector. The lower rod is connected to the cage; the holder piece for that came with my Tilta cage for my Blackmagic Pocket 4K, but it also fits on the Sony A7 IV cage, as there are screw holes for it on the bottom. The baseplate under the cam is from the gimbal, and it has a screw hole in the front so I can put a counterweight there; this is also a gimbal accessory, although I think I bought it separately back in the day. But here's the good news: if all else fails, you can also just put a simple smartphone holder on top of the camera or cage using the cold shoe, for example. It does help to align the phone with the camera as closely as possible, but as long as you measure and write down the offset on all three axes you should be good to go, since we can enter the offsets in Unreal later on. The most important part is to get your gimbal calibrated cleanly; this might require a counterweight on the bottom somewhere, because the smartphone on top throws the balance off. By the way, I found a much better solution which is even free and lets you preview your Unreal scene during the shoot. We'll have a test shoot soon, and I hope I can make a new tutorial on that. The app is called Lightcraft Jetset, and we figured out a workflow which works great without the need for the paid version of that app! Hope this helps!
@wakeup2.369
@wakeup2.369 1 year ago
Great lesson! Thanks bro! You can tell from the dishes in the cupboard that you're from the CIS)). I was looking for lessons like this on how to use tracking from AE to embed graphics in Unreal, and only junk came up! Will there be more lessons on embedding graphics into tracked video from AE in Unreal?)
@hethfilms
@hethfilms 1 year ago
I hope I understood you correctly, cinecom just released a video about tracking in After Effects and generating a 3D camera for Unreal, if that is what you are looking for: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-7qpInM5-5r4.html
@wakeup2.369
@wakeup2.369 1 year ago
@@hethfilms Yes, but it's hard to call that tracking. They simply import After Effects camera data and output the graphics, attached to tediously tracked objects, in the alpha channel. Is it really impossible in UE5 to substitute the video footage itself for the background (as in Blender or AE or Fusion)? And immediately make an almost finished video
@hethfilms
@hethfilms 1 year ago
@@wakeup2.369 Oh okay, I get it now. You want your footage directly in Unreal and to do the compositing there. Yeah, I wish there were an easy way to do it; I've been looking for that too. But it looks like this is still not really a priority for Epic. I think there is a way by using Composure; there are very few videos about it on YouTube. But the whole workflow seemed very unintuitive and complicated, so not even close to the simple ways you can do this in programs like Blender. And it could be so simple if they, for example, just added a media material slot for a background to the cinema cameras, one that would always be visible but behind any geometry. Pretty much how Blender does it. But we would need other things too, like a shadowcatcher material etc.
@groundunderstudios
@groundunderstudios 1 year ago
THANK YOU. I've been trying to figure this out for the better part of a year!! Question - do you think this method is possible fully within After Effects with the built-in 3D camera?
@hethfilms
@hethfilms 1 year ago
Hi Nicolas, glad this helps! :) In theory it could be possible, depending on what you try to accomplish in After Effects. But I think what is most likely hard to get working is the offset, since After Effects does not deal in real-world scale. Also, there's no good way to set up a sensor size to match the field of view to the real camera. AE is just not much of a full 3D program, but it looks like Adobe will add more 3D functionality soon.
@zenparvez
@zenparvez 1 year ago
Gramps looks grumpy!
@robjabbaz
@robjabbaz 1 year ago
Excellent video
@flpbg7288
@flpbg7288 1 year ago
I'm waiting for part 5 PVZ more than the new year
@MediaBrighton
@MediaBrighton 1 year ago
Which iPhones are compatible? Thinking of buying a cheap second-hand one just for this
@hethfilms
@hethfilms 1 year ago
The App Store claims "Requires iOS 13.2 or later" for camtrackAR, so make sure an older phone can run at least that version of iOS. The app also uses Apple's ARKit API, so any model that is capable of that *should* work. If you want to be sure, maybe contact the camtrackAR support on their website before buying the wrong device. I can only speak for the iPhone 13 mini I have been using so far, and that does work fine.
@DouglasDoNascimento
@DouglasDoNascimento 1 year ago
This was great. Do you have a solution with DaVinci Resolve instead of After Effects? I would love to see that. Thank you.
@hethfilms
@hethfilms 1 year ago
Yes, you can basically do the same on the Fusion page in DaVinci Resolve for the compositing part. We are actually using Fusion Studio on our current film project to composite our footage with the Unreal Engine renders.
@PopoRamos
@PopoRamos 1 year ago
I'm going to try this method in Blender.
@charlesleroq932
@charlesleroq932 1 year ago
I would like to try this too. The benefit of using Unreal in virtual production appears to be its live features. If live compositing is unavailable with this app, then I don't see the need for Unreal. Blender has really good colour management and can export EXR sequences etc. to integrate the shots more seamlessly. The one thing which seems unclear is how to match lens distortion and perform overscan.
@PopoRamos
@PopoRamos 1 year ago
@@charlesleroq932 I can confirm that it works really well. You can apply lens distortion and other defects in the compositor, but I do it in DaVinci Resolve.
@hethfilms
@hethfilms 1 year ago
@@charlesleroq932 Well, since both apps are free, we have the option to choose, which is great. But I would not discount Unreal Engine, as you can also export EXR sequences from Unreal for final compositing, and there are color management options for rendering. I just used JPG for simplicity in this tutorial and to test the animation before committing to the final render. To my knowledge there are also ways to configure the lens distortion to match the actual camera, but this is something I have not done yet, so I can't be more specific. Of course, for anyone who is already very good in Blender it totally makes sense to use that. But I love the simplicity and speed of Unreal Engine, and with each update the quality you can render in almost realtime is pretty impressive.
@xace321
@xace321 1 year ago
Thanks for the good course. A question: can I work only in After Effects in post-production? Is AR tracking impossible in DaVinci Resolve?
@hethfilms
@hethfilms 1 year ago
Thank you, you're very welcome! No, you do not have to use After Effects for this. You can do the same thing, for example, in Fusion within Resolve, or even just on the edit and color pages in Resolve. Basically you just need to line up the footage of your cinema camera with the rendered background from Unreal Engine so they are in sync, then key out the greenscreen and match the look of the cinema camera footage to the scene from Unreal.
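As a rough illustration of what "key out the greenscreen" boils down to, here is a toy matte in Python/NumPy. Real keyers such as Resolve's Delta Keyer or AE's Keylight add spill suppression and edge refinement; this only sketches the principle:

```python
import numpy as np

def simple_green_key(rgb: np.ndarray, threshold: float = 0.15) -> np.ndarray:
    """Crude green-screen matte: alpha falls to 0 where green dominates.

    rgb: float image in [0, 1], shape (H, W, 3). Returns an (H, W) alpha.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    dominance = g - np.maximum(r, b)  # how much green exceeds the other channels
    return np.clip(1.0 - dominance / threshold, 0.0, 1.0)
```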
@shekiba.646
@shekiba.646 1 year ago
Cool. Please, I need your tutorial for the shot at 22:38, the one where he walks with the blue Star Trek screen - a full tutorial. Let me know. Thank you.
@hethfilms
@hethfilms 1 year ago
There is actually nothing special to it; follow the same method as described in this tutorial. All we did was start up camtrackAR and then walk around the parking lot so camtrackAR could map out the environment. Then set the floor plane in camtrackAR roughly in the area where you want to start the long walk. Everything else is the same process as described in this tutorial. Maybe as a tip: while mapping the area in the app, do not hold up the greenscreen in front of the cam, otherwise camtrackAR might think it is a wall and can get confused later on when the greenscreen moves with the actor.
@shekiba.646
@shekiba.646 1 year ago
@@hethfilms I understand. Thank you for the update.
@marklet
@marklet 1 year ago
Great video! So on your long-walk Star Trek shot, how did CamTrackAR create a track when you had almost completely obscured the surroundings with the moving green screen?
@hethfilms
@hethfilms 1 year ago
CamTrackAR uses all the sensors of the iPhone, not just the camera. Once you've mapped out the area you can move freely; even when the camera can't see much, the app still tracks the movement and position. That's why it's interesting for greenscreen work, as you don't need markers on the screen.
@marklet
@marklet 1 year ago
@@hethfilms Fantastic. I totally missed that massive piece of news. That is so cool!
@demonhogo
@demonhogo 1 year ago
I think livelink would work too. Plus it’s free and you wouldn’t need to import
@hethfilms
@hethfilms 1 year ago
Thanks for chiming in! I think Live Link is great for other things. With Live Link you are still required to have a PC running Unreal Engine close by to capture the motion, and if the PC can't keep up in realtime, it will record the stutters into the camera animation, which would render the recording unusable. But if you are in a controlled studio environment with a beefy PC, this might be an option to pursue. I did try to do that a while back but found Live Link just too unreliable and unstable for production purposes.
@demonhogo
@demonhogo 1 year ago
@@hethfilms good points 🦾🦾
@The-Superior-Drawings7024
@The-Superior-Drawings7024 1 year ago
This works for me, hehe
@junglejim101
@junglejim101 1 month ago
Hey guys, I'm having a major syncing issue with the CamTrackAR footage / animated background vs. the main camera footage. I didn't realise till afterwards that despite filming the main cam at 25 fps, I filmed CamTrackAR at 60 fps. Someone please help me fix the syncing issues!
@hethfilms
@hethfilms 1 month ago
Hey, make sure you render the background sequence from Unreal at 25 fps: just switch the sequencer FPS setting to 25 fps and make sure you have the same setting in the render queue. When you line up the clips based on your sync reference, they should match. We shot all our clips at 30 fps, switched to 25 fps in Unreal, and matched it perfectly with our 25 fps Sony footage.
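The reason this works: Unreal stores keyframes on a tick timebase (24000 ticks per second by default), so changing the sequencer's display rate re-maps frame numbers without moving keys in time. A small sketch of the arithmetic, with illustrative numbers:

```python
TICKS_PER_SECOND = 24000  # Unreal's default tick resolution

def remap_frame(frame: int, src_fps: int, dst_fps: int) -> float:
    """Where a key recorded at src_fps lands on a dst_fps display rate."""
    seconds = frame / src_fps  # the key's time in seconds never changes...
    return seconds * dst_fps   # ...only the frame number it maps to

# A 60 fps CamTrackAR key at frame 90 sits at 1.5 s = 36000 ticks,
# which displays as frame 37.5 once the sequencer is switched to 25 fps.
print(remap_frame(90, 60, 25))          # 37.5
print(int(90 / 60 * TICKS_PER_SECOND))  # 36000
```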
@junglejim101
@junglejim101 1 month ago
@@hethfilms Awesome, thank you for the quick response! I'll retrace my steps in Unreal and make sure it's all good. I was worried that filming CamTrackAR at 60 fps might be too difficult or impossible to sync to 25 fps, even if it renders at that frame rate in Unreal. This would be so much simpler if CamTrackAR gave us PAL frame rates 😂😭
@hethfilms
@hethfilms 1 month ago
@@junglejim101 That's true. I even contacted the devs back then and asked for PAL frame rates. But it is not in their power, as all the tracking apps like camtrackAR, Skyglass and Jetset use the same Apple ARKit API, and that limits the frame rates to 30/60 fps. But since Unreal keeps the timing intact when you change the frame rate, it still works pretty well for what it is. Of course there is no subpixel accuracy like in professional workflows, but it still worked well enough for us to use it on our Star Trek fan film for several tracks with great success.
@hethfilms
@hethfilms 1 month ago
Btw, we just used Lightcraft Jetset on our newest short episode, with even more difficult handheld camera shots, but it came out quite nicely: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-SagYkggNU6s.html
@dc37009
@dc37009 1 year ago
Bummer it's on the Apple platform... I'd never invest in that garbage dump of a company! (Wozniak aside, of course.) Still, I admire your considerable research and dev time to troubleshoot the pipeline, well done!
@hethfilms
@hethfilms 1 year ago
Thank you :) Well, I would not recommend anyone go out and buy an iPhone just for this purpose. But it needs to be said: no matter what anyone thinks about Apple, they pushed this type of technology a big step forward with their ARKit APIs. A lot of the cool features we use for our fan film production at the moment rely on that technology. Unreal Engine uses a lot of it too, making many things a lot quicker and easier to accomplish than before. For example, Epic's "Live Link Face" app drives facial performance capture for MetaHumans. Of course it would be great to have this available on all mobile platforms and devices. But from a low-budget filmmaking point of view we have to go with what works right now when we need it. We're using whatever gets the job done for any particular shot or idea. We also use Blackmagic's Fusion Studio for 3D tracking of conventional shots to add CGI elements to them. That is often much more precise than what camtrackAR can deliver. On the other hand, a conventional matchmoving solution would be useless when you shoot greenscreen shots with few or no visible markers and elements in the shot. CamtrackAR can still track the shot in real time during recording.
@dc37009
@dc37009 1 year ago
@@hethfilms Yes, of course! It's just that they buy up key tech and "catch and kill" the Windows side (including pressuring 3rd-party vendors)... they have a history of sleazy business practices to capture people into their ecosystem and then rip them off with proprietary accessories ($30 cables, $100 batteries...)! Other than that I agree with everything you said ~Thx D PS, the CamTrackAR guys said that Android is coming, if your phone supports ARCore. And as a Blender-Fusion-Raw-ML pipeline fan, we're not too far apart.
@yousuck5314
@yousuck5314 10 months ago
😂 I'd rather track my camera in After Effects and then transfer to Unreal... this looks like more steps 😂
@hethfilms
@hethfilms 9 months ago
Absolutely valid point. It does seem rather complicated at first, but once you have done it and worked out the kinks, it's actually pretty straightforward and repeatable. Apart from that, I think you might have missed the bigger point of this technique. The idea is to NOT have to do any tracking in post, and for that matter not to need any tracking markers during shooting. When you do tracking in After Effects in post, especially 3D tracking, you need enough visible markers on different planes in your footage, which can be difficult, especially in close-up shots. But as I mentioned in other comments, for wide shots with floor contact of feet, for example, a traditional post-production camera track might yield better results, as this is where camtrackAR's tracking can be sub-par and result in a sliding or floating effect.
@yousuck5314
@yousuck5314 9 months ago
@@hethfilms Yeah, definitely pros and cons.
@davidpringuer3553
@davidpringuer3553 1 year ago
I can't get this to work without really jerky tracking. Can't believe it has so many good reviews. The instructions provided by FXHome are terrible.
@hethfilms
@hethfilms 1 year ago
Hard to say what issue you are encountering. As you can see in the clips in my video, I got quite good results from CamTrackAR. We are actually using it on our current film project in a few shots. But I do agree it is a bit quirky and unreliable at times.
@ThinkBigVfx
@ThinkBigVfx 1 year ago
The jerkiness is because your footage starts at frame 0 but the recorded camera data starts at frame 1, so move the animated camera keyframes to match the footage... I have not used it with Unreal Engine, but I faced the same jerky issue with Blender, and after a lot of research I found this solution. Hope this resolves the problem.
@hethfilms
@hethfilms 1 year ago
@@ThinkBigVfx Thanks, good point. This is actually addressed in my video around the 20-minute mark. In After Effects you can try to move the rendered footage from Unreal a frame forward or backward in time to see if there's a timing problem. What you are referring to in Blender is a general discrepancy between Blender and AE or the camera footage. As you correctly state, Blender always starts with frame 1 while most other tools start with frame 0. You don't usually have this problem in Unreal. If you just use the iPhone footage and a render from Unreal, you can simply combine the two clips in AE and they line up perfectly. In my example, as we sync manually using a visual reference, there might be a difference of a frame when the cinema camera shot 24 or 25 fps and the iPhone shot 30 fps.
@ThinkBigVfx
@ThinkBigVfx 1 year ago
@@hethfilms Thanks for your time and great insight...
@BAYqg
@BAYqg 9 months ago
Exactly! This is a common bug in CamTrackAR. The author even mentioned it. Try shifting by 1 frame. Good luck!
@fabiovilares2551
@fabiovilares2551 1 year ago
You can use a video material to see the footage in realtime
@albertcarstestcrash4912
@albertcarstestcrash4912 1 year ago
Super video! Which speech service did you use?
@hethfilms
@hethfilms 1 year ago
You see it right at the start of the video; it's called ttsreader.
@kristiancapuliak
@kristiancapuliak 1 month ago
Very comprehensive, thanks for putting it down like this!
@hethfilms
@hethfilms 1 month ago
You're welcome! :)
@frankyfantasticmedia9718
@frankyfantasticmedia9718 1 year ago
I thought about this yesterday. I know you import footage as well as camera tracking data into After Effects, so I thought: what if I just attach the phone to the camera? I'd be able to use the camera footage and the tracking data from the phone. And now I'm watching this video that confirms my theory! Well done for making the video!!!
@hethfilms
@hethfilms 1 year ago
Awesome and thx! 😃👍
@otegadamagic
@otegadamagic 1 year ago
This was so detailed and quite easy to understand for a noob like me. Thank you so much. Instant sub
@hethfilms
@hethfilms 1 year ago
Awesome, thank you so much! :) I was beginning to wonder if the robotic voice was turning people away from the video. I hope you'll create something awesome with this method! :)
@otegadamagic
@otegadamagic 1 year ago
@@hethfilms Oh yes, nobody likes the robotic voice, not I either, but you were so informative and explained so well that the robotic voice was totally forgivable. I would actually suggest you do away with the robotic voice and use your normal voice; nothing wrong with having an accent, and you can always support that with closed captions for those who may have difficulty with the accent. Anyway, thanks again
@MegaMontez-k4o
@MegaMontez-k4o 1 year ago
Works perfectly! Anyone who can't get this to work needs to restart the tutorial and see what they missed.
@hethfilms
@hethfilms 1 year ago
Glad to hear that! 😃👍
@AdamSmith-pn5hk
@AdamSmith-pn5hk 1 year ago
Yes, can’t thank you enough for this great info! Unfortunately, my tests keep giving me shaky CamTrackAR footage even when I use my slider or my gimbal. I’m assuming it has to do with trying to place the markers in CamTrackAR, but I still have issues with that side of it. Cheers
@hethfilms
@hethfilms 1 year ago
Did you try to just use the iPhone, film a little test, and combine that with the actual iPhone recording of CamTrackAR? This should basically work every time, apart from the mess-ups the app has sometimes. If that's working just fine, you can move on to the method for the cinema cameras. A slider shot should give you great results without any shakes. But you are correct, the initial setup of the shot in CamTrackAR, i.e. placing the floor grid, must be spot on for the rest to work. I assume most of the shots that did not work in my project were due to vibrations from the gimbal or bad placement of the floor grid in CamTrackAR.
@AdamSmith-pn5hk
@AdamSmith-pn5hk 1 year ago
@@hethfilms Yes, I think my troubles come from the floor grid and overall tracking within CamTrackAR. I shoot the whole area that I'm working in so that CamTrackAR can feel out the area, but I can still see that some parts jitter within the CamTrackAR scene even after I place the ground plane and markers. The markers tend to move even when I place them onto points of the ground grid. Do you happen to have any other in-depth videos on successfully placing markers within CamTrackAR? Thanks again for all the help and great info!
@hethfilms
@hethfilms 1 year ago
@@AdamSmith-pn5hk I had a similar effect a few days ago during the shoot for our movie. I have yet to test whether the tracking data works or not. I placed a few markers, and after a few takes they were no longer in their initial place, but the ground floor grid still seemed to stick perfectly to the floor, so I hope this will still work out. That's why I still would not recommend this method without a safety net, or for high-value productions. There's just too much uncertainty for now.
@AdamSmith-pn5hk
@AdamSmith-pn5hk 1 year ago
@@hethfilms I agree. I do wish that when I placed the ground marker it stayed in place. It looks like sometimes it continues to move even after I place the ground point. I've even used sticky notes to try and give CamTrackAR an easier time sticking to the marker points. But still no luck yet...
@hethfilms
@hethfilms 1 year ago
@@AdamSmith-pn5hk We had a weird thing happen twice the other day. CamtrackAR would not find the floor at all, no matter how much I moved the camera around, and we had markers on the floor to help with the tracking. Only after removing the iPhone from the rig, restarting CamTrackAR and just using the iPhone on its own to track the floor did we get it working again. Then I put it back into the rig, restarted the app again, and got it to recognize the floor again. It was almost like something was interfering with the sensors of the iPhone and confusing CamTrackAR, but I have no idea what could do something like that.
@Alphain
@Alphain 9 months ago
Can you make a tutorial on how to do this in blender?
@hethfilms
@hethfilms 9 months ago
I think you can transfer a lot of the knowledge from this video to a Blender workflow. Follow the official how-to video from the FXHome YouTube channel on how to install the import plugin in Blender. After that, most of the stuff I'm doing in Unreal could be done in a similar way in Blender. Unfortunately, I haven't done much in Blender for the last few years, so I don't feel equipped to be the right person to do a tutorial on this.
@DGFA-
@DGFA- 5 months ago
Very cool!😊
@BarakXYZ
@BarakXYZ 1 year ago
Hey! This is really amazing, thank you for sharing this pipeline!! Can you please share the specs of your rig? Model names for the rig and phone holder would be amazing. Thank you!
@hethfilms
@hethfilms 1 year ago
Hey Barak, thank you! Unfortunately, my rig is really just made up of spare parts I already had available. The phone holder itself was part of the bundle with my gimbal, the Zhiyun Weebill S. The rest is mostly just 15 mm rods and connection pieces. The part that attaches to the cage of the Sony A7 IV was part of the Tilta cage for my BMD Pocket Cinema Camera 4K. And the little plate that connects the smartphone holder to the rods is just a small field monitor holder with a 1/4" screw that can be attached to a 15 mm rod; I think it was part of one of my older field monitors, or a little spare part you can buy everywhere. It doesn't really matter what you use or how you attach the phone. The most difficult part is getting the whole thing balanced on the gimbal without introducing vibrations.
@BarakXYZ
@BarakXYZ 1 year ago
@@hethfilms You're so awesome man! Thank you so much for that info! BTW, have you had a chance to play with Composure? You can then see your reference footage inside Unreal instead of only in After Effects, so you can adjust things quicker that way
@hethfilms
@hethfilms 1 year ago
@@BarakXYZ Good point. I consider myself still a total noob in regards to Unreal Engine. We found a nice little workaround with Unreal 5.1 to get the cinema camera footage into Unreal as a reference, to better align the scene before rendering. But it's still a bit hit and miss and does not seem to work every time; we had a few crashes for whatever reason in some of the setups. Composure might be a way to do it; I have yet to wrap my head around how it works. Our way in 5.1 was to use the new feature where you can just drag a video or image into the level editor as a plane, attach that to the render camera, place it directly in front of the lens, and then add it to the sequencer so it gets played back while you watch the camera animation. I still wish there was a feature like in Blender where you can load footage into the camera itself as a reference; that would be awesome for this kind of method. I hope to be able to share a few more shots we did with this method later this year when our fan film project is done. We definitely have a few cool shots that work flawlessly with this method and will be in the final film. A few are already in the official trailer: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-JSuZa7Hgr4Y.html
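A rough editor-scripting sketch of that "plane in front of the lens" trick, assuming Unreal's Python API; the actor label, mesh path, and 100 cm distance are illustrative placeholders, not the exact setup from the comment:

```python
import unreal

# Find the render camera in the open level (the label is an example).
actors = unreal.EditorLevelLibrary.get_all_level_actors()
camera = next(a for a in actors if a.get_actor_label() == "CineCameraActor")

# Spawn a plane and parent it to the camera, 100 cm in front of the lens.
plane = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.StaticMeshActor, unreal.Vector(0, 0, 0))
plane.static_mesh_component.set_static_mesh(
    unreal.load_asset("/Engine/BasicShapes/Plane"))
plane.attach_to_actor(camera, "",
                      unreal.AttachmentRule.KEEP_RELATIVE,
                      unreal.AttachmentRule.KEEP_RELATIVE,
                      unreal.AttachmentRule.KEEP_RELATIVE, False)
plane.set_actor_relative_location(unreal.Vector(100.0, 0.0, 0.0), False, False)
plane.set_actor_relative_rotation(unreal.Rotator(0.0, 90.0, 0.0), False, False)
# A material with a MediaTexture on the plane then plays the reference footage.
```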
@BarakXYZ
@BarakXYZ 1 year ago
@@hethfilms Shots look super great, amazing job man! 🫡🥳 Yeah, for sure, Unreal still has to improve the way it handles footage. Composure is pretty weird, but you can pull off a shadow catcher and some cool stuff with it, which is very useful. Plus, all of the compositing can be done inside Unreal, which is pretty powerful (or at least you can quickly adjust your scene, lighting and CG elements and afterwards send it to After Effects for additional fancy compositing). Anyway, thank you so much for sharing this method. I'll try it out myself with my A7III; the potential is actually gigantic with this tech, since camera tracking is so limiting and time consuming. This AR tech allows for some crazy camera tracking easily.
@DjBobbyHustle
@DjBobbyHustle 9 months ago
dope
@smepable
@smepable 1 year ago
This is amazing. I was searching for tutorials on this, didn't find one, and gave up; now it was recommended by YT. So glad, thanks for the video!
@hethfilms
@hethfilms 1 year ago
Awesome, you are welcome! :)
@oscaroneill8222
@oscaroneill8222 1 year ago
Very cool tutorial man! Thanks for creating it. I need to get an iPhone
@hethfilms
@hethfilms 1 year ago
You're welcome! :)
@MagivaIT
@MagivaIT 1 year ago
Wouldn't it be better to use the gyro data directly from the camera?
@hethfilms
@hethfilms 1 year ago
In theory this would be amazing; in reality, though, the gyro data of the camera is not the same as the tracking data from the smartphone app. Gyro data is only part of the information needed for tracking; it does not contain transform data for the position in 3D space. A smartphone has a lot more sensors than a pure gyroscope. That's why in virtual production you always see additional external tracking devices on the cameras to determine the position in 3D space.
@MagivaIT
@MagivaIT 1 year ago
@@hethfilms What extra sensors in the phone do you refer to? A phone has a gyro, GPS, compass, but I would think only the gyro movement data is used in the computation to provide movement data. I mean this as a healthy discussion. I wonder what the difference is between the camera gyro output (data rate, computation of movement variables, raw output etc.) vs. a phone. The software must also compute where the phone is in 3D space compared to the camera sensor to create an offset for the movement calculation. As more and more development goes into the use of virtual scenes, I can imagine someone (Blackmagic, Sony, Panasonic) may have this on a development backlog somewhere.
@hethfilms
@hethfilms 1 year ago
@@MagivaIT Your guess is as good as mine. Unfortunately, there's not much documentation out there on exactly what methods camtrackAR uses to achieve the tracking, but it must be a combination of several factors: visual tracking from the camera, gyroscopic data and who knows what else. I assume they use ARKit from Apple; I'm not sure what that all entails, but it is obvious there's more going on than just visual tracking, otherwise it would not be able to track this precisely even when the view is obscured or no details are in the shot. What I was getting at is that there must be a good reason no one seems to be using camera gyro data alone for this type of visual effects work. So I assume it is not effective or precise enough? Otherwise they would not use external tracking devices with 2 or more witness cameras / sensors etc. But like I said, I would absolutely LOVE to have something like this available, where you can just shoot your scene, extract the tracking data from the shot and feed it into Blender, Unreal etc. I just haven't seen anything close to that yet. But here's hoping! =D
@MagivaIT
@MagivaIT 1 year ago
@@hethfilms Yeah, all good points. Can't wait to see how it develops
@hethfilms
@hethfilms 1 year ago
@@MagivaIT In principle this would already be doable. It's not a mental stretch to imagine a handheld camera with the same sensors and an OS like an iPhone's, with software working the same way camtrackAR does. It's probably just not very appealing for a mass-market audience. Apple's new AR headset is already quite a lot like that; it tracks precisely in 3D space without any external hardware or sensors in the room. That's our problem: as long as this stuff remains a very niche market, there's not much incentive for manufacturers to put money into developing something like this, no matter how much we would love to have it for VFX work =D
@silverbulletin1863
@silverbulletin1863 1 year ago
I like the concept, but is this really better? I guess if you already have the gear. But I'm a bit confused: how is this cheaper than a used URSA Mini G2 + green screen, or a used 4K short-throw laser projector for the backdrop + 150" ALR screen + Vive controller and base station? For the walking you can just mix in iPhone shots or have your subject on a small treadmill. With this method I would need an iPhone 12/14 (I don't know what iPhone number is the minimum requirement) + BMP 4K or Sony A7 + full rig + monitor + gimbal + shoulder rig + the extra hassle that you can't see it before it's rendered. Let me know what I missed here; I'm looking into starting from scratch myself.
@hethfilms
@hethfilms 1 year ago
Hi Silver, yes, you are both right and wrong, I'd say :) Of course, if you start out at zero without any gear, it is expensive. But usually the people who look for this type of visual effects method are already invested in some of the gear. Clearly, if you already have an iPhone (I think the iPhone X, or at least the 11, will be enough for camtrackAR), you can just film with that, maybe put it on a little mobile phone gimbal that is not so expensive. Then the process is as easy as the makers of camtrackAR show on their website. The method described here is specifically for people who, like me, wanted the tracking but still wanted to use their already available cameras for higher quality shots. I'd never suggest buying all this equipment just for this method; there's way too much guesswork and there are too many uncertainties for anyone to throw a lot of money at it. But it's a great method if you already have the necessary parts available at little or no additional cost. The list of stuff you suggested as an alternative sounds quite expensive to me too. You don't need a full-frame camera like the Sony I used in this example; that is just the camera I had available for my project. You can do the same thing with an older used Panasonic MFT camera like a G85 or something. The argument about not seeing what you shoot while on set is absolutely valid; as stated in the video, that's one major downside for sure. On the other hand, getting a realtime capture with Unreal running with live monitoring and no lag will very quickly become a lot more expensive than you might think. There's stuff like genlock and timecode sync required to make it work reliably, and that requires a camera that can actually use SDI out, plus additional hardware for genlock etc. And of course you need to run Unreal Engine during the shoot, so you need a large indoor space with power and your whole PC setup with everything, cables etc. With the method explained in the video you can shoot wherever you like, fast and easy. It's of course not a full replacement for a professional virtual production studio, but I think the video makes that clear. For indie productions this might be a good way to try things out before spending the big bucks on a more professional setup. Does that make more sense to you?
@silverbulletin1863
@silverbulletin1863 1 year ago
@@hethfilms I think so, though I'm unsure. From the stuff I looked into, yes, it's expensive. I nailed it down to a used URSA Mini Pro G1 (1500 euro) + used 150" laser projector (around 1500-2k) + PC SDI card/cables + Vive controller and base station for a few hundred more. I don't think it has to be super expensive if you go used and entry-level. But you're right, that's a static setup; you could also just use the green screen without the laser projector. I myself think the wall laser projector, the kind that stands 30 cm from the wall, makes perfect sense if you can't afford an LED wall panel setup: you just need wall space and room in front of it. I got the projector idea from here > ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-QoheeVKGJ1w.html&ab_channel=NextWaveDV If I just start with an iPhone, I don't know; it also seems a hassle, judging from people's comments about stabilizing the image etc.
@hethfilms
@hethfilms 1 year ago
@@silverbulletin1863 Hey Silver, I thought about the projector concept too when I started out, but in my opinion it's way too limiting. It's super hard to light correctly, as you have to avoid getting any light on the wall, otherwise the image will get all muddy and low-contrast. Also, like I wrote in my last comment, you are very confined in the type of movements you can do. And you get what you shoot: if there are any errors, issues or problems with the tracking, or a detail you would want to change after the fact, it's more or less impossible, or at least a huge hassle, as it is all in camera. And the way you describe your shopping list, you would already spend more on the hardware than my example setup would cost new. I think you can get away with a BMD Pocket 4K on a lower-cost gimbal from Zhiyun and maybe one zoom lens like a 12-35 mm f2.8, for example; that would take care of the stabilization. But like I said, I would not recommend anyone specifically buy all the equipment just to do this, be it a method like you suggest or the method I presented here. It's all still very much hit and miss and tricky to get working reliably. But I can confirm that we had another greenscreen shoot for the Star Trek movie I mentioned a few weeks ago, and all the tracking shots we did with the exact setup I presented in this tutorial worked perfectly and will be featured in the finished movie. :)
@Projectanimation872
@Projectanimation872 1 year ago
When is PvZ 5 in real life gonna release?
@chesper_miguel
@chesper_miguel 1 year ago
Yesterday