Last year we got the same Deity TC-1 generators and for some reason the Mars won't sync to any of them. The TC in the Mars runs faster than realtime, and within a few seconds it's completely out of sync.
How do we export the video file from Unreal with the timecode embedded so we can sync in post? We have TC on the camera and the Mars, and we see it in Unreal, but when we export, we can't get the timecode embedded 😢
I appreciate this approach, but I don't think it looks good enough to justify the cost of the Mars, or even the Vive tbh. 3D tracking in software when compositing uses software you probably already have, and MANY are getting excellent results. Even UE live apps offer good tracking data with integrated stabilization. I think people get caught up in the novelty of doing it with realtime trackers, but if you're already doing the post production offline, it makes no sense to spend all that money on realtime trackers imho. I only mention this because many beginners see a video like this and give up because they don't have an extra $6K for the Mars or similar.
Do you mean you would just film yourself on a green screen, get the 3D tracking data from a program like Nuke / After Effects / Mocha, then put that tracking info into Unreal Engine, export the scene, and composite in Nuke/After Effects/Premiere Pro? Would you be willing to make a video with the steps on how to do that?
@@SitinprettyProductions Yes, that's a fairly accurate description. Try the Jaro Atry YouTube channel; he runs a similar workflow. I just think when we get caught up in new tech, we're often blind to the results. Plus the Vive Mars movement is never as smooth as I'd like.
You're wrong. Mars is a bad system, but that's not the biggest issue here. You will need to match the distortion of your real lens with your virtual camera, and also the world pose. Without previz you won't know if your shot is working for weeks or even months, depending on your budget/team. Also, no compositing help: with previz you can tweak lights on set to match your virtual scene. 3D tracking is way more complex than you think, and takes a hell of a lot of time. It's nowhere near as simple as using AE; you should use software like SynthEyes or 3DEqualizer. It's really HARD and SLOW to 3D track, so if you were to shoot something that lasts, say, one hour, it becomes impossible or extremely expensive.
Man, I haven't watched you for years and you are as good as I remember. You taught me so much on my way to becoming a pro editor. Thank you for everything!!
The whole point of VP is to get the real composite from UE or any other dedicated software for VP. You missed the Keying step in UE, which is the most important one, when you see in real time the final result. And you need a capture card like Blackmagic decklink pro to get the camera feed in Unreal and many other steps. So basically this video doesn't show you how to do VP.
Want more of these virtual production tutorials, bro. Please make a full series on this from scratch: from video output and editing to adding VFX shots or objects after the shoot is done on the virtual production stage. Also, how do we add or change the background or anything else we need after all the camera recording is done? Is there any way to do it?
Still one of the cheapest solutions :-) You can also use an iPhone and the free app 'VCAM' for camera tracking. Although it's far from accurate, you can get away with it for medium shots.
Great video guys, really well explained for all the steps :) Although I have a question: how did you manage to create the shadow around 10:29? Thank you :)
That's done with After Effects. There's a great plugin from Red Giant that does this in 1 click. With Aximmetry you can also get really good shadows in real time. I'm still figuring those things out, so maybe I'll do a tutorial about it in the future.
@@ivanbruno6527 You can, by creating a material in UE that keys in real time and gets a video feed from your cam via Live Link. Then you make a plane in UE, set it to use that material, and make sure it's parented to the camera correctly; you'll essentially have a keyed card of your actor in-engine.
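For anyone curious what a keying material like that computes per pixel, here is a minimal green-screen key sketched in Python with NumPy. This is only an illustration of the math; UE's material graph does the equivalent on the GPU, and the `strength`/`balance` parameters here are hypothetical knobs, not names from any UE node.

```python
import numpy as np

def green_key_alpha(rgb, strength=1.0, balance=0.5):
    """Compute a matte: alpha near 0 where green dominates red/blue."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # "Spill" measure: how much green exceeds a blend of the other channels
    spill = g - (r * balance + b * (1.0 - balance))
    # Strong spill -> transparent; no spill -> fully opaque
    return 1.0 - np.clip(spill * strength, 0.0, 1.0)

# A saturated green pixel keys out; a neutral grey pixel stays opaque
pixels = np.array([[[0.1, 0.9, 0.1]], [[0.5, 0.5, 0.5]]])
print(green_key_alpha(pixels, strength=2.0))
```

A real keyer adds spill suppression and edge softening on top of this, but the core idea is the same: derive alpha from green dominance, then multiply it against the video feed on the card.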
As I said everywhere I could, MARS isn't, IMO, a good product atm. I say this not because I have a personal problem with either Valve or Cinecom. I love you guys. But... you will notice jitter if you do handheld shots. In your example, we can notice sliding feet even though you're on a tripod. I recommend using at least a Bliss tracker / Antilatency system if you don't have a $50K+ budget for Vicon or OptiTrack solutions (which are basically true pose, so they work 100% and you need to calibrate only once). Both are under-$10K solutions. I personally use Bliss, which is an upgraded T265 tracker that costs around $3K. The Mars system is honestly bad (in my opinion) because of the design itself: using Vive trackers that are known to produce jitter, plus issues with occlusion and issues with light, is really a terrible design choice. Still love your work and what you're doing to push VFX, so keep going.
Hello, is there a UE5 cheat sheet for the camera Film Back settings to achieve the many different aspect ratios? For example, what would the Film Back settings be for 2.35:1 or 2.39:1? Figuring out the Film Back settings is challenging.
How do you export a video file from Unreal with the timecode embedded? We get TC to the camera and the MARS, and we see it in Unreal Engine, but when we try to render and export, it doesn't have the timecode 😬
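Not from the video, but one workaround people use is restamping in post: note the timecode of the first rendered frame, then write it into the container with a tool like ffmpeg (its `-timecode` output option), or line things up by frame math in your NLE. A minimal non-drop-frame conversion sketch for that frame math:

```python
def tc_to_frames(tc, fps):
    """Convert non-drop-frame HH:MM:SS:FF timecode to a frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return (hh * 3600 + mm * 60 + ss) * fps + ff

def frames_to_tc(frames, fps):
    """Convert a frame count back to non-drop-frame timecode."""
    ff = frames % fps
    ss = frames // fps
    return f"{ss // 3600:02d}:{ss % 3600 // 60:02d}:{ss % 60:02d}:{ff:02d}"

# One second (24 frames) after 01:00:00:12 at 24 fps
print(frames_to_tc(tc_to_frames("01:00:00:12", 24) + 24, 24))  # -> 01:00:01:12
```

Note this is non-drop-frame only; 29.97 fps drop-frame timecode needs extra handling, so don't use this sketch as-is for that.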
I know “budget friendly” and “beginner” are somewhat relative, but the packages you are recommending are neither, in my opinion. You need at least a $5,000 PC, the Mars system is another $5,000, the timecode slate is over $2,000, the timecode boxes are another $500, and then there's the large enough studio space. So funny what YouTubers think is “budget friendly” when they get stuff sent to them for free. Good video though.
$13K is budget friendly compared to a single camera that can cost $25K without any lenses. I'm poor myself, so there's no way I can get into this, but I have a decent computer and I'm learning VFX in Unreal. It's a start. Maybe I'll make something that impresses people and they'll want to donate or invest. Got to start somewhere.
Thanks for the great video! I have a question about the timecode though. Where does the timecode in Unreal come from? From the camera through SDI, or do you connect another Deity somehow into the PC?
Wow! But what if I want to feed a realtime 3D element like a storm cloud or ice crystal into a real TV studio image, with tracked cameras? What is the most professional way to key out the background of that UE video feed?
Get rid of your iPad and get something that doesn't screw over people, like Apple loves to do. UE has ways to make stuff for phones, so I guess you can do some AR. But you can do a lot with a phone camera and a green screen, which can be just green cloth draped over something. If you can do that, then you can easily just add a background in UE.
Hello. I've been wanting to try some of this stuff out. The one thing I really can't figure out is how to export the tracked camera data as an fbx for use in other 3d software. I can only get it to export the first frame of it.
I love your videos and the energy you have, but to call this video "beginner" is misleading. Who has all the gear you have? Still, there is a lot of great information in the video. Still subbed!
So... when I attach the Vive to the camera in Unreal, the Unreal camera jumps to a 0,0 position under everything and I cannot move it. Do I need to do something before that?
Even though the tutorials have gotten more complicated over the years with the inclusion of more 3D tools, they still skim over the top of things just as fast, and they're no less equipment-specific and dependent.
I'll wait until they figure out how to make it easier to set up. It seems easier to create an environment, record green screen footage, and then send the camera tracking data into Unreal. Render the background and then slap it into AE with your green screen footage. At least until they make Unreal more about creativity and less of a developer tool.
Blows my mind... how many little things you need to connect, adjust, and keep in mind, and they can all conflict with each other :) I'd say it's way too complicated if you have to use even your head as an extra finger ^). And yet (thank the gods of cinema) it is possible to pull off.
What the shit... if you're going to do an offline version of this, it would have been MUCH more straightforward to just do a CAMERA TRACK after you shoot! If you're not gonna do the "live" version, all this effort is wasted.
Ah yes, the absolute beginners guide for absolute beginners who happen to have a large studio greenscreen and RED cameras (not to mention the $1000 clapper). No offense but this whole method seems fairly redundant and downright tedious to work with. Isn't the point of virtual production to be more... virtual?
Well, it's more hardware so you have to do less at the computer. Instead of making custom animations for hours or days using CGI, you can have an actor (or yourself) make the movements in 10 minutes.
"It's working," hey? Ha ha ha. Micro-move the camera and... it's working. Ha ha. Toys for professionals. I'm really scared to spend money on this sh... It's still the same Vive trackers.