Lightcraft Technology
Lightcraft builds Jetset and Autoshot, an ultralight virtual production system that works with Blender and Unreal.

www.lightcraft.pro/
Synchronized Rendering in Unreal
16:30
1 month ago
Tracking Refinement with Syntheyes
36:47
2 months ago
Unreal Live Render Preview v2
11:30
3 months ago
Jetset Cine with Blackmagic BRAW
17:47
3 months ago
JetsetCineRigging
9:58
3 months ago
Autoshot Unreal Round Trip
47:36
3 months ago
Autoshot Blender Round Trip
36:00
4 months ago
Setting Cine Offset Manually
2:45
5 months ago
Importing Animated Takes
6:49
8 months ago
Exporting Animated Scenes
5:47
8 months ago
Rendering 360 Panoramic Backgrounds
4:50
8 months ago
Animation Timeline Control
3:00
8 months ago
Exporting Animated Characters
4:55
9 months ago
Convert Unreal Scenes to Blender
5:00
11 months ago
Setting Tracking Origin with Markers
2:13
11 months ago
2D Backgrounds
2:12
11 months ago
Browser Slate and Remote
2:27
11 months ago
Project Pick and Take Storage
2:19
1 year ago
AI Matte Paint Fixes
3:35
1 year ago
Export Animated USDZ
1:18
1 year ago
Linking and Animating Vehicle
12:28
1 year ago
Vehicle Armature
5:42
1 year ago
Vehicle File Organization
8:47
1 year ago
Loading 3D Scenes with iCloud
1:56
1 year ago
Installing iCloud on Windows
1:44
1 year ago
Comments
@Shuyunliu-j9q 10 days ago
"Hello, Lightcraft Technology. Does SeeMo already synchronize timecode between an iPhone and a camera, or is it necessary to purchase a Tentacle Sync or similar device for timecode synchronization?"
@Shuyunliu-j9q 11 days ago
The status of the LONET2 Live Link source shows Receiving but never completes. Why doesn't the subject name show up?
@c3ribrium 12 days ago
Thank you for this video. I'm stuck at the Live Link configuration. I have Jetset running on the iPhone and the IP address in the clipboard; when adding the live source, I see a source machine "receiving", but there is no "Jetset" inside.
@Shuyunliu-j9q 11 days ago
Same problem as you :(
@EliotMack-z3v 1 day ago
Can you join an Office Hours session? We do them Mon/Tues/Fri at 9am Pacific time. It's easier to debug live.
@Utsab_Giri 21 days ago
So, the phone sits on top of the lens, not the sensor? Why is that?
@eliotmack 19 days ago
The Jetset Cine lens calibration system calculates the offset between the iPhone lens and the cine camera lens wherever they are. The lens entrance pupils are (usually) close to the front of the glass, so it's a good idea to keep the iPhone close to the front (which also gives the phone a clear line of sight).
@jemsophia 22 days ago
This is soooooo cool! Thank you!
@Utsab_Giri 22 days ago
I'm wondering what Jetset's performance will be like in low-light conditions. Thoughts? Thank you!
@eliotmack 19 days ago
When using Jetset Cine with an external cine camera, the iPhone can operate at a different exposure than the main camera (in fact, we run the iPhone at auto exposure during Cine shots for exactly this reason).
@timdoubleday4627 25 days ago
Great video as always. I'm guessing we could do a similar workflow but use Unreal Engine?
@eliotmack 25 days ago
I haven't used any of the Gaussian implementations in Unreal, so I don't know how they behave. The basic splatloc concept should work, however.
@duchmais7120 1 month ago
Hello Lightcraft Technology. Thanks for sharing the video. Which lens on the BMPCC are you using in this demonstration?
@eliotmack 1 month ago
This is a standard Canon 24-105 zoom.
@LokmanVideo 1 month ago
I've been in the VFX industry for many years, and seeing this workflow and the new technology you're bringing to the masses is so exciting :) Can't wait to test Jetset once I finish my new green screen studio (cyclorama). Amazing job, guys!
@eliotmack 1 month ago
Thanks! Post some shots when you can!
@LokmanVideo 1 month ago
@eliotmack Sure 👍
@lnproductions7958 1 month ago
Has anyone had issues with the proxy step? I'm not sure if it's the number of clips, but it doesn't even load for me.
@michaelounsa5056 1 month ago
Hello. I am considering purchasing an iPhone Pro to use the LiDAR feature specifically for virtual production with the Lightcraft application. Could you please let me know if there is a significant difference in LiDAR quality and performance between the iPhone models from version 12 up to the upcoming iPhone 16? Are there any major benefits to using the newer models with your application?
@eliotmack 1 month ago
We've found remarkable improvements with each new generation of iPhone, especially in GPU capacity and in cooling. The LiDAR hasn't changed much, but I'd still recommend getting the newest iPhone you can, simply for the other performance aspects. It makes a big difference. We're getting Jetset ready for iOS 18 and looking forward to what's in the new hardware coming up soon.
@michaelounsa5056 1 month ago
@eliotmack Thank you for the clear answer.
@Kavouhn 1 month ago
Ensure your camera is focused on the object you're scanning to get more tracking points.
@codydobie 1 month ago
Would this workflow with Jetset Cine and SynthEyes be useful outside of virtual production applications? Footage shot with the Jetset Cine rig mounted on your camera basically captures a pretty solid camera track, plus tracked LiDAR-scanned geo that can represent the set. I could definitely see huge time savings in bringing a shot into SynthEyes with the camera and geo already tracked for reference to integrate CG and VFX. Or am I misunderstanding what it does? Also, does it only work with solid backgrounds/green screen?
@eliotmack 1 month ago
Yes -- we think this will be very useful for 'normal' non-greenscreen projects! The AI rotomattes are very good, and the set scanning + post tracking technique will work on any shot that needs VFX added. In fact, the iPhone uses a natural feature tracking algorithm that works better in a 'normal' environment, since there are many more trackable corner features than on a greenscreen.
@MarkStefenelli 1 month ago
And now you just buy the Ultimatte keyer from BMD and you have a professional virtual studio setup. I was waiting for this; NOW I will go for Jetset Cine... awesome. I worked many years ago with a Lightcraft Tech Previzion system, which was extremely expensive, but this is now way better for an indie producer. Congrats, guys!
@guillaumewalle 1 month ago
This is f**** amazing, I need an iPhone now.
@joezohar 1 month ago
Thank you so much for this! When using the USD viewer, I noticed you loaded Castle_3 but didn't show what those export settings were. When I load the _2 version I get the d3d error that crashes Unreal. Any help would be greatly appreciated!
@bradballew3037 1 month ago
I thought I was crazy because I didn't see a _3 version and was wondering when that was supposed to be created. It seems like a part of the tutorial is missing?
@Kumarswamy_Hosmath 1 month ago
I am interested in gathering only: 1. camera tracking data, 2. camera track plus the respective lens data. Can we please have a simple tutorial on how to gather these inputs and use them with Unreal in post? And please share the shoot data so that one can try it first-hand.
@Kumarswamy_Hosmath 1 month ago
Have you tested this with full-blown rigs with a matte box? What is the limit on the distance between the cine lens and the iPhone lens? On most movie shoots we have matte boxes; why not demonstrate such a scenario?
@eliotmack 1 month ago
Good suggestion. I'm working on an update of the rigging that better handles an underslung camera rig by mounting the iPhone to the side of the main lens instead of above it. It's also fine to raise the iPhone up a bit to clear the matte box.
@kabalxizt5028 1 month ago
You should make more tutorials about SynthEyes.
@pinkuzaimas 1 month ago
Is there a link to InSpyReNet? I found it on GitHub but I'm not sure what to download.
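For anyone stuck on the same question: InSpyReNet's authors also ship a pip package, transparent-background, that wraps the pretrained model, which is likely the simplest thing to try. A minimal sketch, assuming the package's README-style API (file names are illustrative, and this is not Lightcraft's official workflow):

```python
# Sketch assuming the 'transparent-background' pip package, which wraps
# InSpyReNet's pretrained matting model (pip install transparent-background).
from PIL import Image
from transparent_background import Remover

remover = Remover()  # downloads the pretrained InSpyReNet checkpoint on first use
frame = Image.open('greenscreen_frame.png').convert('RGB')  # illustrative file name

# type='rgba' requests an RGBA result whose alpha channel is the matte;
# depending on the package version the result may be a PIL image or a numpy array.
result = remover.process(frame, type='rgba')
if not isinstance(result, Image.Image):
    result = Image.fromarray(result)
result.save('greenscreen_frame_rgba.png')
```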
@brettcameratraveler 1 month ago
Thank you! So there is no workflow where a Tentacle Sync is not required? For example, having Unreal follow the built-in timecode coming from the camera itself? (Even if the camera's speed drifts ever so slightly over time.)
@eliotmack 1 month ago
If you want the live action and rendered CG signals to stay in sync, you'll need the Tentacle Sync. There may be some way to hack the Timed Data Monitor to make that work, but frankly the Tentacle works great and is inexpensive. We also plan to use the timecode data to assist post-production take syncing.
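To make the take-syncing idea concrete -- a hypothetical sketch, not Lightcraft's actual implementation -- once tracking takes and cine clips carry shared timecode, matching them in post reduces to pairing each tracking take with the clip whose start timecode is nearest:

```python
# Hypothetical illustration of timecode-based take matching, assuming both
# the tracking takes and the cine clips carry SMPTE 'HH:MM:SS:FF' start timecode.
from dataclasses import dataclass

FPS = 24  # assumed project frame rate

@dataclass
class Take:
    name: str
    start_tc: str

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert 'HH:MM:SS:FF' timecode to an absolute frame count."""
    h, m, s, f = (int(x) for x in tc.split(':'))
    return ((h * 60 + m) * 60 + s) * fps + f

def match_takes(tracking: list[Take], clips: list[Take]) -> dict[str, str]:
    """Pair each tracking take with the cine clip starting nearest in time."""
    return {
        t.name: min(clips, key=lambda c: abs(tc_to_frames(c.start_tc)
                                             - tc_to_frames(t.start_tc))).name
        for t in tracking
    }

# Example with made-up take/clip names:
tracking = [Take('jetset_take_001', '14:23:10:05'), Take('jetset_take_002', '14:31:02:12')]
clips = [Take('A001_C003.braw', '14:23:09:20'), Take('A001_C004.braw', '14:30:58:00')]
print(match_takes(tracking, clips))
# {'jetset_take_001': 'A001_C003.braw', 'jetset_take_002': 'A001_C004.braw'}
```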
@brettcameratraveler 1 month ago
@eliotmack Sounds good :) I ask because I try to keep my camera rigs with as little extra battery-powered hardware as possible: less hassle and fewer points of failure for a rig that is already full of lens encoders, tracking system, monitor, etc. I was also able to get a non-Jetset VP rig to "appear" to be in sync with Unreal without a Tentacle Sync, so I was wondering if your script might be able to do the same. Good point on organizing the takes, though. I'll go with the TS on the rig.
@momenkhaled99 1 month ago
Wow
@zykoman825 1 month ago
Hey, can we get a video on the cine-camera pipeline to Blender? We only have the iPhone one, and that confuses me...
@eliotmack 1 month ago
The base version of Jetset is iPhone/iPad only. Jetset Cine (the version in this video) connects to external cine cameras and provides the lens calibration and post-production footage processing.
@Kumarswamy_Hosmath 1 month ago
Is Tentacle Sync a must?
@eliotmack 1 month ago
It's required for synchronized real-time rendering in Unreal, and highly recommended for general use, as the tracking data then has the same timecode as the cine video takes.
@Kumarswamy_Hosmath 1 month ago
@eliotmack Can I get away without it for just camera-track data to be used in post?
@shawnhuang-m3x 1 month ago
Why does the subject name show nothing when I set up the LONET2 Live Link?
@manolomaru 1 month ago
✨😎😮😵😮😎👍✨
@jordanthecadby5762 2 months ago
Under what circumstances would you need to refine the live track?
@eliotmack 2 months ago
Shots with visible CG/live-action joins. In this case it's the join between the CG railing and the practical floor, but in other shots it might be high degrees of floor contact.
@ApexArtistX 2 months ago
How do I bring them over to Unreal or Blender? I tried FBX and USD; it doesn't work... fails hard... Unreal has no camera sometimes.
@eliotmack 1 month ago
Watch closely at 17:32 -- it goes into detail on the Blender import process.
@weshootfilms 2 months ago
Amazing
@ApexArtistX 2 months ago
How do you do foreground occlusion compositing?
@ApexArtistX 2 months ago
What happens if there are duplicated and dropped frames?
@ApexArtistX 2 months ago
The USD importer is still beta, damn.
@sanjimanga8923 2 months ago
The top view does not work for me; it's just grey space, even after pressing F. I'm not sure what the reason is.
@eliotmack 2 months ago
Make sure you have something in the Unreal scene selected before hitting F; the F command just frames the selected object in the viewport.
@mustachefilms7949 2 months ago
OK, so I've got the USD working, but for some reason the model I am using is always lying on the ground. I've tried rotating the scene-loc and I've tried switching the z-axis to the y-axis, and it still doesn't work. Also, does the tracking work without the full preview? For example, filming with the door messed up and just adjusting the camera after the fact in Unreal. Thanks!
@mustachefilms7949 2 months ago
Is there a Mac workflow for this? The Omniverse launcher only has Windows and Linux.
@mustachefilms7949 2 months ago
I also can't export the layer as a USD.
@eliotmack 2 months ago
Unreal has an integrated USD exporter that works on Mac. It works but doesn't handle everything that the Omniverse exporter does. We'll do a tutorial on it at some point.
@mustachefilms7949 2 months ago
@eliotmack Thank you so much
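One quick check for the "model lying on the ground" issue in the thread above is the exported stage's up axis: a Z-up stage interpreted as Y-up (or vice versa) appears rotated 90 degrees. A sketch using the open-source usd-core Python bindings (file name illustrative; not a Lightcraft tool):

```python
# Inspect a USD file's up axis, scale, and root prims with the open-source
# USD Python bindings (pip install usd-core).
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open('castle_scene.usda')  # illustrative file name
print('Up axis:', UsdGeom.GetStageUpAxis(stage))          # 'Y' or 'Z'
print('Meters per unit:', UsdGeom.GetStageMetersPerUnit(stage))

# List top-level prims to confirm the camera and geometry actually exported.
for prim in stage.GetPseudoRoot().GetChildren():
    print(prim.GetPath(), prim.GetTypeName())
```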
@somasekharkari8364 2 months ago
Hello team, I am looking to use my Sony FX3 for virtual production. Do Jetset and its software integrate well with this specific camera? If not, how do I go about using my FX3 for virtual production?
@raphieljoeroeja9406 2 months ago
Can you save the HDRI that has been captured?
@eliotmack 2 months ago
Yes, but the quality of the HDRI wasn't that high, so we haven't pursued this heavily for VFX integration.
@raphieljoeroeja9406 2 months ago
@eliotmack Oh, that's too bad. It would be awesome if that were possible somehow. Thank you for the answer!
@darkodj4131 2 months ago
Thanks for this video; I've been following along and am keen to get on board with Cine.
@abelarabian3895 2 months ago
In my case it doesn't create the level sequence when copy-pasting the code in Unreal! It's having issues everywhere...
@knstntn.g 2 months ago
Check whether the Image Plate plugin is enabled; that worked for me.
@abelarabian3895 2 months ago
@knstntn.g Cool, thanks! I will try it out!
@alexandertabrizi2083 2 months ago
Hi, I have the SeeMo Pro 1080p version and was wondering if that will work?
@eliotmack 2 months ago
Yes -- all of the SeeMo devices will work with Jetset Cine.
@alexandertabrizi2083 2 months ago
@eliotmack Thank you! Any limitations with the non-4K version?
@eliotmack 2 months ago
@alexandertabrizi2083 1080p is fine for calibration. The SeeMo 4K has an SD card reader, so you will be able to record takes directly to it if desired. Very useful for productions that don't want anything near iCloud.
@eliotmack 2 months ago
@alexandertabrizi2083 The SeeMo 4K has an SD card slot that Jetset will soon be able to record to. Other than that, no limitations.
@StudioWerkz 2 months ago
Can this be, or has it been, calibrated with an anamorphic lens with a round trip into Blender? Or is it best to stick to spherical glass?
@lightcrafttechnology 2 months ago
We've tested spherical extensively. Anamorphic calibration in Jetset should be fine for getting a basic track. For sub-pixel refinement we're developing a SynthEyes workflow that can be extended to anamorphic solves.
@StudioWerkz 2 months ago
@lightcrafttechnology Thank you, looking forward to it.
@WatsonStorage 3 months ago
Thanks for all of the great tutorials on how the system works. I'm interested in Jetset Cine for applications where there are few usable tracking points visible in the view of the main camera (e.g. the frame is full of moving people, or moving elements like foliage or water). From your tutorials, it appears that Jetset is designed to have the iPhone camera facing the same direction as the main camera, so that they share a similar view of the scene. Is it possible to set the system up so that the iPhone faces in a different direction from the main camera -- for example, to the left of the camera or behind the camera -- where there may be a richer set of fixed points to track?
@lightcrafttechnology 2 months ago
As you said, the system is designed around the phone and the camera pointing in the same direction. The iOS camera is quite wide-angle (19-20mm S35 equivalent), so it can usually pick up more points than the Cine camera can.
@dansmith904 3 months ago
I don't have an option for External Tracking Protocol; that dropdown does not exist, and I'm updated to the latest versions.
@lightcrafttechnology 3 months ago
Are you on Jetset Cine v1.0.118? That's the one with the updated external tracking protocols. Check under Settings & Support -> Settings and scroll down to see the version number. You can update manually with the App Store.
@GadrielDemartinos 3 months ago
Perhaps this question has already been answered in another video; I'm confused about which Accsoon SeeMo I should get -- the SeeMo 4K HDMI Smartphone Adapter or the SeeMo Pro SDI/HDMI to USB-C Video Capture Adapter? Also, would I need the CineView transmitter?
@lightcrafttechnology 3 months ago
Here's a good link: lightcraft.pro/docs/which-accsoon-seemo/
@GadrielDemartinos 3 months ago
@lightcrafttechnology Thank you!
@goodenna 3 months ago
Great video! But I'm still waiting for the SynthEyes pipeline tutorial.
@brettcameratraveler 3 months ago
Exciting :) Did you solve the timecode sync issue for the Decklink cards that you mentioned, or are you still working on it and expecting to solve it? Rough ETA on that tutorial? Can't wait. Thank you! :)
@eliotmack 1 month ago
Yes -- we now have a pretty clean workflow with AJA/Blackmagic SDI capture cards. We're putting together a tutorial in the next couple of weeks.
@brettcameratraveler 1 month ago
@eliotmack Amazing news!
@jordansaward 3 months ago
Do you need Tentacle Sync? I noticed it's off by default; is there a huge difference in the track without it?
@lightcrafttechnology 3 months ago
Tracking will be the same with or without the Tentacle. It's very useful for automatic take matching in post and for synchronized real-time data. It's such a good device that we're now recommending it as standard.
@cyrusstanola3929 3 months ago
Is the Tentacle necessary for Jetset Cine? Because based on your other videos, it seems like an additional accessory rather than a necessary one.
@lightcrafttechnology 3 months ago
It's not absolutely necessary, but highly recommended.
@cyrusstanola3929 3 months ago
@lightcrafttechnology I thought Jetset Cine already had its own genlock mechanism, with the digital slate to sync the tracking data and the footage? How crucial is the Tentacle? Can I do without it just fine, or will I encounter problems in post?
@lightcrafttechnology 3 months ago
@cyrusstanola3929 The digital slate works for the Autoshot post tracking workflow, but under some lighting conditions the automatic marker recognition can be fragile (lights reflecting off the digital slate screen, etc.). In general, timecode is a great way to tie things together, and the Tentacle Sync is both inexpensive and works well.
@aust917 2 months ago
Is accurate manual synchronization possible without using Tentacle Sync?
@eliotmack 2 months ago
@aust917 Yes, but we recommend the Tentacle, as it can speed up take matching considerably. We're working on the implementation of timecode-based take matching now.