FAQ - CamTrackAR for Android? Please note that the reason CamTrackAR can only be used on iOS devices currently is down to the AR camera data we are able to pull from Apple’s hardware technology. We started with ARKit, which is amazingly good, and we are actively talking to Google about the best way forward for Android - so stay tuned!
I have some questions. 1) I have an environment and an animated character. Can I import the environment with an animated mesh? That would make it very easy for me to follow the character. 2) Is there a live connection to Unreal Engine? 3) Do you need a green screen to be able to see the 3D set?
The app is amazing!! I admit the camera is shakier than what it shows during recording, but once I added the footage to Blender, all I had to do was smooth out the camera and it looks amazing! My next attempt will be a green screen man behind a CGI background. I do have a question, though: when recording your footage, did you ever have problems with the quality, such as grain and noise? If so, is there a way to fix this in HitFilm or After Effects?
Yes, the camera in the iPad is not as good as a dedicated video camera. We found some success with upscaling the footage using Topaz Labs' Video Enhance AI program. - Javert
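For anyone curious what "smoothing out the camera" amounts to once the track is in Blender: a simple moving average over the imported camera's position keyframes goes a long way. Here's a minimal, Blender-independent sketch of that idea in plain Python (in Blender you'd read and write the same values through the camera's F-curve keyframes; the sample data below is made up):

```python
def smooth_keyframes(values, window=3):
    """Moving-average smoothing for one channel of keyframe values.

    Edge frames are averaged over only the part of the window that
    fits, so the clip's start and end positions barely move.
    """
    half = window // 2
    out = []
    for i in range(len(values)):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

# Jittery X positions from a handheld track (made-up sample data)
x = [0.00, 0.12, 0.05, 0.18, 0.11, 0.25, 0.19, 0.33]
print(smooth_keyframes(x, window=3))
```

A wider window gives a smoother but "floatier" camera, so it's worth trying a few values per shot.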
One of the most helpful apps for filmmakers! GREAT! But is it somehow possible to connect the CamTrackAR app with a camera? Because then I could film with my camera (which has better video quality than my phone) and have the tracking on the phone, in the app.
I've been wondering this too. I would imagine if you match the focal length of the iPhone camera, and then build a rig that places your iPhone directly above the lens of your better camera, it should be as easy as using the tracking data with your better footage and not touching the source footage other than to match the timings! I'm thinking I'll try it with the 8K from my Canon R5 for the perfect chroma key! This app is literally going to get me to switch to iPhone! And I hate iPhones!
@@kylehessling2679 There are a few videos on RU-vid about this (mounting an iPhone with a tracking app above a DSLR), though not with CamTrackAR. I bet it would work, though.
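On the focal-length-matching point above: what actually has to agree between the phone track and the external camera is the field of view, which depends on both focal length and sensor width. A quick sanity-check sketch (the sensor figures here are rough, for illustration only):

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm):
    """Horizontal field of view of a rectilinear lens, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Approximate, assumed figures:
# iPhone main camera: ~26mm full-frame equivalent (36mm-wide reference)
# Canon R5: full-frame sensor, ~36mm wide
phone_fov = horizontal_fov_deg(26, 36)
r5_at_35mm = horizontal_fov_deg(35, 36)
print(f"iPhone-equivalent FOV: {phone_fov:.1f} deg")
print(f"R5 at 35mm:            {r5_at_35mm:.1f} deg")
```

The takeaway: matching the number printed on the lens isn't enough unless the sensor widths also match, so compare fields of view rather than focal lengths when pairing the phone track with other footage.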
WELL DONE FXhome!!! I've been sniffing at CamTrackAR since you brought it out, but info on actually setting it up and using it was sketchy at best. Like most of your tutorials, this video makes it all understandable, blows the doors off anything else I've seen about this amazing software, and gives me the confidence to dip my toes in the pool! Again, WELL DONE!!!
Hi! I don't understand how to export USDZ from 3ds Max with textures. I apply a USD material to the object and add a bitmap texture to it, but after export the model has no texture in the preview. Please help.
@@DavidCrossIN2U So, doesn't Google plan to implement something similar for Android? I'm super frustrated to have to buy an expensive iPhone just because of this when I already bought a good Samsung phone with a great camera... :(
@@matsadona I hear you. It would be great if they could do this for Android, but FXhome explained a few days ago that Apple made ARKit available for developers. I don't think Google/Android has done the same. Maybe one day they will catch up to Apple on that front. I have an iPhone XR and it works great on that.
To get over the HD recording limitation, could you mount the iOS device to your normal camera and shoot with that, but take the tracking data from the app to use with the video from your normal camera, to chroma key and composite in Unreal?
@@Cinematoginsanantonio I was looking for something like this: an app that uses the phone as a tracking probe, so you could mount your phone on the camera and then export the tracking info to any 3D program, or stream it in real time to Unreal Engine. From what I see in this video you can export the camera as .fbx, so it seems possible.
@@TheDrunksInATree I think the Film Riot guys tried this. I've heard, but haven't checked yet, that it has 4K support now?? So let us know, as I've not had time to check this.
@@TheDrunksInATree The big hub systems are the way to go, really; that way you get everything else: focus tracking, zoom, and camera tracking, among other pros.
I've only had a little play, but I've been impressed so far. I have a couple of questions other than the usual ones about frame rates, external cameras and Android... 1. Is there any advantage to using newer phones over an iPhone XS Max? 2. If you're not changing location, is there any benefit to re-calibrating/scanning between shots? 3. How large a location is reasonable, i.e. if you were outside walking down a street, how far could you go and have it still work? 4. Does it use any internal stabilization, or is that a limitation of ARKit? Would using an external gimbal be helpful? Thanks!
Newer phones will have better tech inside them; we've seen that the LiDAR scanner in the newest iPad Pro greatly helps the tracking quality. Re-calibrating is likely a good practice if you put the device to sleep; when you wake it up again it has to re-orient itself and figure out where it is in 3D space, so re-calibrating is like starting fresh. There's no internal stabilization due to ARKit. This is something we'd like to look into for the future, but using a gimbal will help. - Javert
@@FXhome thanks Javert - it's an excuse to buy a new iPad Pro at least - I'll probably wait until there are a few more options in frame rates/resolutions in ARKit, hard to tell if Apple will open it up... Cheers!
I tested on iPhone 11 Pro, 12 Pro and 14 Pro, and I have exactly the same problem: after a few minutes, the audio/video sync fails. I tried to record with a Sony a7S III at the same time, and it was impossible to sync the video/audio/tracking data, which makes it impossible to use. I tested at 30 & 60 fps, but nothing works.
There are no options for transferring files. The email method doesn't work; it keeps failing for me. I was waiting for tech support but decided to request a refund from Apple. I like the app, but there are no options for transferring files. Maybe I will get it again in the future when this problem gets resolved.
This is a game changer. I cannot believe until now that my iPhone can actually do that. It saves me a lot of time and a lot of effort, but I really hope that soon it will support 4K.
They'd love to, but simply put, iOS devices have hardware that Android devices currently don't. Hassle Google or Samsung or any of the big Android device makers. I want it toooooo
With the paid version, can I create virtual worlds and import 3D models using just an iOS device, without exporting to desktop or portable post-production apps like After Effects? Basically, can I go from start to post entirely on an iOS device? If so, do you have any examples? And do you have information on doing a budget green screen and how to light it properly? Thank you.
Hi, can you improve tracking with the LiDAR camera? It would be nice if you could first scan the place you will record with LiDAR, for a better tracking solve, and export the LiDAR geometry with the camera track, to use when building the scene.
I hid the LiDAR sensor with my finger while I filmed with CamTrackAR and nothing changed, so I'm pretty sure the LiDAR scanner goes unused by CamTrackAR. Sad.
Hey :) Is there a way to pick the color I want to use for keying? I found a workaround (just go close to the chroma and lock it when the software detects it), but picking the proper color in the viewport would be really helpful :) Also, is there a way to select different frame rates (24/25 fps)?
I could not get the USDZ model into the shot. Nothing happens even after I toggle camera off and 3D on. Please let me know how to do this; it's not in this video.
I have a question. All the camera tracking software on the market uses point trackers and creates a virtual camera based on the change in those tracked points. How does this software, especially Apple's technology, do its tracking?
So your software, or at least CamTrackAR, is limited to iOS users only and excludes the rest of us Android users? I do have a 3rd-generation iPad, but I'm sure it's too old for this software. Sad... so sad :( - Productions by Kevin B
Hey, this may sound silly, but I'm by myself and I don't have buddies :( Could you enable the front-facing camera? On phones above the iPhone 11, the front-facing camera is 4K, so the quality should look good, though it needs more light due to the small sensor. Basically, can you make it accessible for solo production artists, please 🙏🙏 Oh, I just realized I have an iPhone 12, not a 12 Pro, so I don't have LiDAR. Do I need a Pro phone with the LiDAR?
So just to confirm, this only works with the inbuilt camera on the iOS device? Is it possible to record with our own camera but then have the tracking data imported into the video file somehow, maybe by attaching the iOS device to the camera? Or is it two separate processes? Or is it possible to go the other way? For example, the BMPCC has a timecode generator; can that be fed into CamTrackAR and then synced in post?
It would be possible, but syncing with footage recorded on the camera would involve accounting for the distance from your phone to the camera, the sensor of your camera, and the lens used. In short, it will have the motion of your camera but be unable to recreate the scene accurately.
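To illustrate the "distance of your phone to the camera" part: if the phone is rigidly mounted, the correction is a constant offset expressed in the phone's local frame, rotated into world space on every frame of the track. A bare-bones sketch of that idea in plain Python (the rig geometry and numbers here are made up for illustration):

```python
def apply_rig_offset(position, rotation_matrix, local_offset):
    """Shift a tracked camera position by a fixed rig offset.

    `local_offset` is the lens position relative to the phone, in the
    phone's local axes (e.g. lens 12.5 cm below the phone mount).
    The offset is rotated into world space by the phone's 3x3
    rotation matrix for that frame, then added to the position.
    """
    world_offset = [
        sum(rotation_matrix[row][col] * local_offset[col] for col in range(3))
        for row in range(3)
    ]
    return [p + o for p, o in zip(position, world_offset)]

# Identity rotation (phone level, facing forward): offset passes through
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_rig_offset([1.0, 2.0, 3.0], identity, [0.0, -0.125, 0.0]))
# → [1.0, 1.875, 3.0]
```

Applying this per frame corrects the camera path itself; the lens and sensor differences the reply mentions still need to be handled separately by matching field of view.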
You do realise that Blender has its own tracking software built in, right? Watch this tutorial by the mighty Ian Hubert: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-lY8Ol2n4o4A.html
@@ginkarasu Yes, yes, I know, but I have HitFilm Pro, and Foundry's camera tracker in HitFilm is much easier to use than Blender's; it would make things much easier if you could combine them. But I understand you; you don't really need Foundry's camera tracker to track in Blender.
I bought this app but may request a refund, because development has been stopped for a year now. ARKit 6 was announced at WWDC in June this year and allows 4K AR capture with the rear camera; has it been implemented yet?
Can someone explain to me exactly what the purpose of anchor points is? I can't seem to find a clear explanation of how they help or what they're doing. I understand they're meant to improve the stability of the track, but that in and of itself needs explanation (I'm a TOTAL beginner).
They don't improve the stability of the track. They're used later, when you bring the tracking data into other software, to act as definite points on the ground. Otherwise, you have the 3D space data but don't actually know where 3D objects are placed in relation to the video. - Javert
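In other words, the anchors come out of the export as known world-space positions you can snap scene geometry to. A tiny sketch of the idea (the anchor coordinates below are made up):

```python
def nearest_anchor(target, anchors):
    """Return the anchor position closest to a desired placement point.

    `anchors` are world-space points exported alongside the track;
    snapping a 3D object to one guarantees it sits where the real
    floor or prop was during the shoot.
    """
    def dist_sq(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(anchors, key=lambda a: dist_sq(a, target))

# Hypothetical anchors dropped on the floor during the shoot
anchors = [(0.0, 0.0, 0.0), (1.2, 0.0, 0.4), (-0.8, 0.0, 2.1)]
print(nearest_anchor((1.0, 0.5, 0.5), anchors))
# → (1.2, 0.0, 0.4)
```

Since the anchors were tapped onto real surfaces, they also tell you where the ground plane is, which is usually the first thing you need when placing CG objects into the shot.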
Dear FXhome team, it would be brilliant if we could place a plane on a wall or on a floor. In Blender we could use the planes for many things; Blender users know what I mean.