"Hello, Lightcraft Technology. Does Seemo already synchronize timecode between an iPhone and a camera, or is it necessary to purchase a Tentacle Sync or similar device for timecode synchronization?"
Thank you for this video. I'm stuck at the Live Link configuration. I have Jetset running on the iPhone and the IP address in the clipboard; when adding the Live Link source, I see a source machine "receiving" but there is no "Jetset" inside.
The Jetset Cine lens calibration system calculates the offset between the iPhone lens and the cine camera lens wherever they are. Lens entrance pupils are usually close to the front of the glass, so it's a good idea to keep the iPhone close to the front of the cine lens (which also gives the phone a clear line of sight).
When using Jetset Cine with an external cine camera, the iPhone can operate at a different exposure than the main camera (in fact, we run the iPhone at auto exposure during Cine shots for exactly this reason).
I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :) Can't wait to test Jetset once I finish my new green screen studio (cyclorama). Amazing job, guys!
Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the LightCraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits of using the newer models with your application?
We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what is in the new hardware coming up soon.
Would this workflow with Jetset Cine and Syntheyes be useful outside of virtual production applications? With the Jetset Cine rig mounted on your camera, the footage basically comes with a pretty solid camera track and some tracked LiDAR-scanned geo that can represent the set? I could definitely see huge time savings in being able to bring a shot into Syntheyes with the camera and geo already tracked for reference to integrate CG and VFX. Or am I misunderstanding what it does? Also, does it only work with solid backgrounds/green screens?
Yes -- we think this will be very useful for 'normal' non-greenscreen projects! The AI rotomattes are very good, and the set scanning + post tracking technique will work on any shot that needs VFX added. In fact, the iPhone uses a natural feature tracking algorithm that will work better in a 'normal' environment, since there are many more trackable corner features than on a greenscreen.
And now you just buy the Ultimatte keyer from BMD and you have a professional virtual studio setup. I was waiting for this, NOW I will go for Jetset Cine... awesome. I worked many years ago with a Lightcraft Tech Previzion system, which was extremely expensive, but this is way better for an indie producer. Congrats, guys!!
Thank you so much for this! When using the USD viewer I noticed you loaded Castle_3 but didn't show what those export settings were. When I load the _2 version I get a D3D error that crashes Unreal. Any help would be greatly appreciated!
I thought I was crazy because I didn't see a _3 version and was wondering when that was supposed to be created. It seems like a part of the tutorial is missing?
I am interested in gathering only: 1. camera tracking data, 2. camera track plus respective lens data. Can we please have a simple tutorial on how to gather these inputs and use them with Unreal in post? And please share shoot data so that one can try it first-hand?
Have you tested this with full-blown rigs with a matte box? What is the limit on the distance between the cine lens and the iPhone lens? On most movie shoots we have matte boxes, so why not demonstrate such a scenario?
Good suggestion. I'm working on an update of the rigging that better handles an underslung camera rig by mounting the iPhone to the side of the main lens instead of above it. It's also fine to raise up the iPhone a bit to clear the matte box.
Thank you! So there is no workflow where a Tentacle Sync is not required? For example, having Unreal follow the built-in timecode coming from the camera itself? (Even if the camera's speed drifts ever so slightly over time)
If you want the live action and rendered CG signal to stay in sync, you'll need the Tentacle Sync. There may be some way to hack Timed Data Monitor to make that work, but frankly the Tentacle works great and is inexpensive. We also plan to use the timecode data to assist post production take syncing.
@eliotmack Sounds good :) I ask because I try to keep my camera rigs with as little extra battery-powered hardware as possible. Less hassle and fewer points of failure for a rig that is already full of lens encoders, tracking system, monitor, etc. I was also able to get a non-Jetset VP rig to "appear" to be in sync with Unreal without a Tentacle Sync, so I was wondering if your script might be able to do the same. Good point on organizing the takes, though. I'll go with the TS on the rig.
The base version of Jetset is iPhone/iPad only. Jetset Cine (the version in this video) connects to external cine cameras and provides the lens calibration and post production footage processing.
Required for synchronized real-time rendering in Unreal. Highly recommended for general use, as then the tracking data has the same timecode as the cine video takes.
Shots with visible CGI & live action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be high degrees of floor contact.
OK, so I've got the USD working, but for some reason the model I am using is always lying on the ground. I've tried rotating the scene-loc and I've tried switching the z-axis to the y-axis, and it still doesn't work. Also, does the tracking work without the full preview? For example, filming with the door messed up and just adjusting the camera after the fact in Unreal. Thanks!
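A model ending up flat on the ground after a USD import is often a Y-up vs Z-up mismatch between the exporting app and Unreal. As a rough sketch (the file name is just a placeholder, and this assumes the open-source usd-core Python package rather than anything Jetset-specific), you can check whether the stage's declared up-axis matches how the geometry was actually authored:

```python
# pip install usd-core
from pxr import Usd, UsdGeom

# Placeholder path -- substitute the USD file exported from your scene.
stage = Usd.Stage.Open("castle_set.usd")

# USD stores the up-axis as stage metadata ('Y' or 'Z'); importers use it
# to decide how to reorient the scene on load.
print("Declared up-axis:", UsdGeom.GetStageUpAxis(stage))

# If the geometry was authored Z-up but the metadata says Y-up (or vice
# versa), the importer will lay the model on its side. Set the metadata
# to match the authoring convention and save the root layer.
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)
stage.GetRootLayer().Save()
```

This only corrects the declared metadata; if the geometry itself was exported with a baked-in rotation, it still needs to be re-exported or rotated in the scene.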
Unreal has an integrated USD exporter that works on Mac. It works but doesn't handle everything that the Omniverse exporter does. We'll do a tutorial on it at some point.
Hello team, I am looking to use my Sony FX3 for virtual production. Does Jetset and its software integrate well with this specific camera? If not, how do I go about using my FX3 for virtual production?
@@alexandertabrizi2083 1080p is fine for calibration. The SeeMo 4K has an SD card reader, so you will be able to record takes directly to it if desired. Very useful for productions that don't want anything near iCloud.
We've tested spherical extensively. Anamorphic calibration in Jetset should be fine for getting a basic track. For sub-pixel refinement we're developing a Syntheyes workflow that can be extended to anamorphic solves.
Thanks for all of the great tutorials on how the system works. I'm interested in Jetset Cine for applications where there are few usable tracking points visible in the view of the main camera (e.g. the frame is full of moving people or moving elements like foliage or water). From your tutorials, it appears that Jetset is designed to have the iPhone camera facing the same direction as the main camera, so that they share a similar view of the scene. Is it possible to set the system up so that the iPhone faces in a different direction from the main camera - for example, to the left of the camera or behind the camera - where there may be a richer set of fixed points to track?
The system, as you said, is designed around the phone and the camera pointing in the same direction. The iOS camera is quite wide-angle (19-20mm S35 equivalent), so it can pick up more points than the Cine camera usually can.
Are you on Jetset Cine v1.0.118? That's the one with the updated external tracking protocols. Check under Settings & Support -> Settings and scroll down to see the version #. You can update manually with the App Store.
Perhaps this question has been already answered in another video; I'm confused which Accsoon SeeMo should I get - the SeeMo 4K HDMI Smartphone Adapter or the SeeMo Pro SDI/HDMI to USB-C Video Capture Adapter? Also, would I need the CineView transmitter?
Exciting :) Did you solve the timecode sync issue for the DeckLink cards that you mentioned, or are you still working on it and expecting to solve it? Rough ETA on that tutorial? Can't wait. Thank you! :)
Yes -- we have a pretty clean workflow working by now with AJA/Blackmagic SDI capture cards. We're putting together a tutorial in the next couple of weeks.
Tracking will be the same with or without the Tentacle. It's very useful for automatic take matching in post, and for synchronized real time data. Such a good device that we're now recommending it as standard use.
@@lightcrafttechnology I thought Jetset Cine already had its own genlock thing with the digital slate to sync the tracking data and the footage? How crucial is the Tentacle, and can I do without it just fine, or will I encounter problems in post?
@@cyrusstanola3929 The digital slate works for the Autoshot post tracking workflow, but under some lighting situations the automatic marker recognition can be fragile (lights reflecting in the digital slate screen, etc.) In general timecode is a great idea to tie things together, and the Tentacle Sync is both inexpensive and works well.
@@aust917 Yes, but we recommend the Tentacle as it can speed up take matching considerably. We're working on the implementation of timecode based take matching now.