A good workflow would be to record the composite and the separate virtual camera + green screen feeds, edit with the composite, and re-composite after all editing is done. So: shooting and editing with an excellent preview, while maintaining full compositing quality for output. This workflow would require rec/stop/save commands to be sent to Unreal through SDI, and auto-naming of both the Unreal sequences and the camera files, to help the camera assistant.
This looks great! It makes green screens much more believable! If any French speakers are passing through here: the JDG team very probably used this tutorial for their new green screen setup ^^. It's the same equipment, the same software, and the same level of rendering.
Gee Matt, you’ve come a long way in a short time. That is unbelievably amazing. I would love to see this in person. Any plans for workshops once we are free from the pandemic? BTW I’m finally going to be able to get the RTX 2080 Ti, so I can finally get to work with Cine Tracer. Love your work man!!!
Matt, can you make a tutorial on doing a live stream with a chroma key, but with camera movement like in the last shots, where you move the camera and the background moves too? Pleeeeaaaasssseeeee
If you check Instagram, we did some testing with backlights. I'm definitely going to do videos just on matching the live action to the virtual sets. That's the fun part.
@CinematographyDatabase I've been looking into this quite a lot... would it be possible to use a few rear projectors to build a poor man's version of the ARWALL and light your actors that way?
An idea for lighting: get a large TV screen, place it close to the model, and send it another feed of the virtual environment (taken from the opposite direction of the camera). Tune the lighting and camera for low light so that the effect of the TV is more pronounced. Now the reflection of the light from the TV on your model makes them look more immersed. It should help counteract all that green light pollution. It's all about proximity.
Yeah, we are working towards indie LED wall virtual production. I shot this demo last year, basically a mini Mando LED set - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-kFBha5BE38k.html
Hey, really enjoyed your video and the in-depth, clear breakdown of a VP setup. I was wondering, though, if you could share how to set up just the Vive tracker + base stations without the HMD. I don't have an HMD and am having an extremely hard time setting those up with Unreal. I keep hearing they are super easy to configure, but I can't find any easy setup videos for that and was hoping you could help explain it. Also, I tried following along with the configurations you showed in Unreal here and can't find most of the options you mentioned 🙏🏽
How do you get the focus pull from the lens to sync with Unreal, or is the lens focus part just for the mixed reality and not for filming all-Unreal content? This new wave of technology that you're doing, and what the Mandalorian crew did, just makes me so excited to see live tracking from the real world used for camera recording. It's like VR had to be a thing, with the whole tracking aspect, for people to get the idea of using tracking for virtual cameras; then it moved on to the next stage with mixed reality. Truly a breakthrough in the film production world, and a way to shorten the gap between a small indie studio and the big corporate film industry. Similar to when home recording software became so user friendly and PC hardware became cheap enough for bands to record their own music in a small studio without having to pay a lot of money for studio time at a big one. I hope that made sense. Anyway, this is amazing and definitely the future. I'm a CG artist and this kind of stuff really gets me excited and very interested in wanting to know more.
Is UE calculating the lighting in real time, or is the set using "baked" lighting? Either way, it looks like a fantastic start for a setup based on mostly "consumer" gear! You are really opening the door for a lot of productions!
It's exciting to see your tests with virtual production, Matt. This sort of thing would have cost a fortune 10 years ago. Imagine how cool it will be when you can get one of those big LED walls!
So how does it work with post? Are you also recording on the Ursa, and then giving the Unreal Engine scene, tracking data, and raw footage to the editor for finishing the feature film?
I was thinking of trying the new Intel RealSense tracking camera as a way of doing the tracking on the camera, rather than using the Vive method. I've seen bigger production sets using inside-out camera tracking, where they mark up the room with tracking markers and then just have a camera pointed at the ceiling with a bit of software to position and orient the camera (which is basically what the Intel tracking camera is doing, but without markers). Ideally I guess you'd have a rear projection screen behind the camera for environment lighting. Ah man, if only I had the funds I could play all day with this kind of thing.
Dear Matt, I have an exact setup like your early days, and I need an answer if you please 🙏🏻. I am connected to Unreal via a DeckLink card and I get a keyed preview in the viewport. I use Composure and also a Media Bundle projected on a plane; all are working perfectly. What I CAN'T achieve is recording my editor viewport in UE. Is it mandatory to use an external recorder via DeckLink in order to record? I do NOT want OBS or a similar screen capture. Is there a way to record inside Unreal Engine, like Take Recorder or Sequence Recorder? Appreciate your time and help.
Hi Matt! Hope you can help me. I have spent countless hours trying to get Composure to work, but something isn't right. The CG layer looks completely different compared to the actual scene, as if it doesn't read any of the post processing or lighting properly. Any advice on what I can do to fix this? Thanks
Hello, I'm working with Composure and the cg_layer color is way off; it's bad and not as good quality as the CineCameraActor. Have you come across this issue where Composure's view is different from your CineCameraActor's?
If I wanna do this live on Twitch or something, is there a way to do it without the live feed ending every time I try to play the level? I want to utilize some spider AI I coded, and their animations, but can't because this method only seems to work in the editor.
Wow! I'm really excited to try, but an HTC Vive is ridiculously expensive in Brazil xD Does anyone know if it's possible to get high quality with any other VR set?
Hey man, this is amazing, thank you very much. Just one question: once you have everything set up, how do you record it? I mean, how do you get the rendered video of what you filmed in real time? Thanks in advance!!
Great videos, love it!! Question for you though. I don't have an HTC Vive setup (yet... covid $$ issues lol), so I'm trying to achieve similar results by using the iPhone as a tracker. Even though I match my CineCameraActor settings to my DSLR, I find they don't quite match in Unreal. You don't seem to be having this issue with the Vive trackers. What "information" do they give back... transform? World location? Distance between cam and tracker? Thx! Marc.
How does one learn this in his apartment? I have a possible opportunity for some freelance work creating 3D sets using Maya. However, they bring Unreal in because they want these sets to be part of a live broadcast. This tutorial seemed like an introduction to that. However, it is occurring to me that I may need access to a green screen, a high-end camera, a card of some sort in my workstation... Holy crap. Is there a way I can learn what I need to get these jobs without having to spend gobs of money on schooling or hardware?
Hi there! Excellent work you do, and the tutorials and all your videos, congratulations! I wanted to ask you the following: what kind of PC hardware (video card, processor, RAM, etc.) is needed to do this type of work? Thanks in advance, and I look forward to your reply.
I am trying to put a foreground plate over the media plate and added a simple cube to the foreground layer, but it shows a black background along with the cube. I already changed "alpha channel to linear space color only". Any suggestions on what I'm missing?
Is there a way to sync the camera's aperture value and shutter angle with background blur and motion? I imagine this is another limitation of the green screen vs an LED wall. Is there a way to get accurate DOF based on lens focal length, aperture values, and focus points? I can't imagine that kind of data is output by cameras and available for Unreal Engine to calculate from.
It is possible to sync the real-world focus/iris/zoom/FOV/chromatic aberration/distortion of real lenses and match them in Unreal Engine. LED walls have their own challenges with matching, but they are definitely different from green screen.
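As a rough illustration of that lens sync, here is a minimal C++ sketch of driving Unreal's CineCamera from real lens readings. It assumes something upstream (a lens encoder, Live Link, etc.) already delivers the values each frame; the FLensSample struct and its fields are hypothetical placeholders, while the UCineCameraComponent properties are the real ones.

```cpp
// Hypothetical per-frame lens sync. Assumes an external source
// (lens encoder, Live Link, etc.) supplies these values; the
// FLensSample struct and its fields are placeholders.
#include "CineCameraComponent.h"

struct FLensSample
{
    float FocusDistanceCm; // from the lens' focus encoder (cm)
    float Aperture;        // f-stop from the iris ring
    float FocalLengthMm;   // from the zoom encoder (mm)
};

void ApplyLensSample(UCineCameraComponent* Camera, const FLensSample& Sample)
{
    // Drive Unreal's physically based camera with the real lens values
    // so depth of field and FOV match the live-action plate.
    Camera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
    Camera->FocusSettings.ManualFocusDistance = Sample.FocusDistanceCm;
    Camera->CurrentAperture = Sample.Aperture;
    Camera->CurrentFocalLength = Sample.FocalLengthMm;
}
```

Called once per tick with fresh encoder data, this keeps the virtual DOF and field of view tracking the physical lens.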
Matt, do you know if the BM Video Assist can record BRAW from the URSA Mini 4.6K? Not the Pro, but the first 4.6K. Hope your tech guy can answer; that would be really, really helpful. Thanks for your knowledge.
Thanks for sharing this. I've been working with the AJA card and Media Bundles. They work, but I really want to use some of the more advanced keying features in Composure. I understand how Composure works, but I don't want the media plate to be attached to the camera. My question is: how can I take the media plate in Composure and add it to a plane that I place inside Unreal, like the Blackmagic or AJA bundle? I have read that 4.25 added some new features to Composure and am wondering if this is one of them.
Hahaha, I asked him the same exact question. I found a tedious way to do it, but it's not live footage like his example here. Don't use Composure to key the footage. Just use a material to key your footage, then apply it to a plane.
@Jsfilmz That's what we have been doing with the AJA bundle; I read something about new Composure plates. AJA's keyer isn't the best, and I would love to use the Composure keyer. If you have a written link to the tedious way, I'd love to check it out.
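For anyone trying the material-on-a-plane route described above, here is a minimal sketch under some assumptions: you have a keying material (called M_ChromaKey here, with MediaTex, KeyColor, and Tolerance parameters; all of these names are hypothetical) whose graph does the actual green subtraction, and a plane mesh in the level to receive the feed.

```cpp
// Hypothetical setup for the "key in a material, apply to a plane" approach.
// Assumes an M_ChromaKey material with MediaTex / KeyColor / Tolerance
// parameters that performs the chroma key in its material graph.
#include "Materials/MaterialInstanceDynamic.h"
#include "MediaTexture.h"
#include "Components/StaticMeshComponent.h"

void ApplyKeyedVideoToPlane(UStaticMeshComponent* Plane,
                            UMaterialInterface* ChromaKeyMaterial,
                            UMediaTexture* LiveFeed)
{
    // Dynamic instance so the key parameters can be tweaked at runtime.
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(ChromaKeyMaterial, Plane);

    MID->SetTextureParameterValue(TEXT("MediaTex"), LiveFeed);
    MID->SetVectorParameterValue(TEXT("KeyColor"), FLinearColor(0.f, 1.f, 0.f));
    MID->SetScalarParameterValue(TEXT("Tolerance"), 0.25f);

    Plane->SetMaterial(0, MID);
}
```

The upside of the dynamic material instance is that the key color and tolerance stay adjustable while the feed is running, instead of being baked into the material asset.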
You mention 2.0 base stations in the description, but would this work with 1.0 base stations as well? The HTC website says the Vive Tracker works with the 1.0 base stations, but I'm not sure about this specific use case.
I don’t know the specifics, but I believe the 2.0 stations can be combined to make bigger tracked volumes more easily than the 1.0. In the case of virtual production you want the best quality tracking volume possible, or the whole thing jitters. A perfectly set up space with a perfect 1.0 setup would probably work, but most indie spaces have issues that are overcome with more stations, which means 2.0 stations.
How are you setting up the tracker's distance to the lens? Are you just estimating the distance? Because there's parallax between the camera position and the tracker position, and it's not looking good for me at the moment.
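One common fix for that parallax is to measure the physical offset from the tracker puck to the lens and fold it into the tracked pose every frame. A minimal sketch, assuming a hand-measured offset (the 12 cm forward / 8 cm down numbers are made-up placeholders; Unreal units are centimeters):

```cpp
// Hypothetical tracker-to-lens offset correction. Measure the real
// offset from your tracker puck to the lens, then compose it with
// the live tracker pose each frame.
#include "CoreMinimal.h"

FTransform GetCorrectedCameraTransform(const FTransform& TrackerToWorld)
{
    // Example: lens sits 12 cm forward and 8 cm below the puck.
    const FTransform LensOffset(FRotator::ZeroRotator,
                                FVector(12.f, 0.f, -8.f));

    // Child-relative * parent-world: the virtual camera now pivots
    // around the lens position instead of the puck position.
    return LensOffset * TrackerToWorld;
}
```

With the offset composed in, pans and tilts rotate around the lens rather than the puck, which is what removes the parallax mismatch.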
No, more like half Cine Tracer and half Virtual Production, but they overlap. For instance, the Vive-based virtual camera is in Cine Tracer right now for the next update. VP is basically R&D for Cine Tracer.
Looks like the keying is not very accurate when done live. If I was recording that, I would want the raw footage with the green to do a more advanced key in Nuke or AE.
Matt, this is AWESOME. You are getting close, and I can see that inspiration you were feeling. It's getting close to the point where you can make something that looks pretty damn amazing with Virtual Production and UE.
Question for you: where did you get the green screen background? I've been searching on Amazon but couldn't find a decent one. I'm going to try a virtual setup like this (I'm an AI programmer, so I want to do some realtime virtual characters interacting with live actors).
Hi Matt! How do you output your vlog video from Composure? Is there a straightforward way to output the composited image from Composure to a .mp4 file? Thanks!
I've spent a few days on this already, but haven't found info on how to place already-keyed footage of a person in a UE 3D environment, make some simple camera moves, and render it out. All the tutorials are for live real-time production with green screen. So frustrating...
Hi Matt. I want to buy a video camera compatible with Unreal Engine, to connect and shoot with a chroma key input directly after building the 3D scene. What type of camera do you recommend buying that is compatible with the fifth version?
How is the video signal coming into UE4? Does UE work on the "raw" captured signal, or does UE do fast compression? How good is the UE keyer? It looks a bit limited. Would it not be better to work with an external hardware keyer that keys a 4:4:4 signal and then passes that to UE, or would that add too much latency? Sorry for so many questions, but I just started with this and I am interested in having as little quality loss as possible from lens to output.
Is there a way to get this camera tracking but with just an Unreal camera? Composure plus putting the composite plane in one place, wondering if this is possible? Love the work as always!
@Cinematography Database where did you get that spaceship model from? I'd like to give it a try. The Epic virtual studio models are all very bright; I'd like to try something that looks good in dim light.
@CinematographyDatabase gonna recap your previous vids and hop on those UE tutorials when I'm back from tour! Super excited to see what I can pull off. More excited to see what you can pull off 😁
I watched that video demo with the guy on the motorcycle, using Quixel assets and the LED walls with the tracked cameras, over 20 times, showing all my friends and family how mind-blowing it is that we are now at this point. Now that I found your channel, I'm really glad I did, because I've been wanting to know more.
Really cool! I just ordered a Roko because all of this virtual production is too next level. Have you had any luck trying to get some sort of light wrap going for realtime keying?
Best part is that the Unreal Engine, the Live Key, etc. tech is all free. The cameras, Vive, green screen, etc. vary a lot, but there are links in the description if you want to kind of Amazon-cart ballpark it. Throw in a $5K PC with a 2080 Ti.
SUGGESTION: If you load the LUT you want onto the Blackmagic, you can send that over the SDI, and then you would not have to add a grade in Unreal (saving you some processing power on that computer).
Yes, you put the video on a plane and then move the UE4 camera around. The illusion is pretty broken if you try to pan or tilt, but if the camera stays perpendicular it looks OK. Someone in the Facebook group is using this technique, and it fooled me at first into thinking it was a 3D track.
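If you want the card to stay perpendicular automatically, a minimal sketch is to re-aim the plane at the camera every frame, yaw only so the footage stays upright. This assumes the video plane is its own actor and you call the helper from a tick; the function name is hypothetical, the engine calls are real.

```cpp
// Hypothetical billboard helper: keeps a flat video plane facing the
// active camera so the 2D-card illusion holds up under small moves.
#include "GameFramework/Actor.h"
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetMathLibrary.h"

void FaceCamera(AActor* VideoPlane)
{
    APlayerCameraManager* Cam =
        UGameplayStatics::GetPlayerCameraManager(VideoPlane, 0);
    if (!Cam)
    {
        return;
    }

    const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(
        VideoPlane->GetActorLocation(), Cam->GetCameraLocation());

    // Yaw only, so the footage doesn't tilt with the camera.
    VideoPlane->SetActorRotation(FRotator(0.f, LookAt.Yaw, 0.f));
}
```

This won't fix the missing parallax, but it keeps the card from skewing when the camera drifts off axis.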
Hi Matt, love all your videos. Quick (newbie) question: how can I record the CG clean plate (or just get the tracking info) if I just want to use the live comp as an on-set preview and do post later? I feel like there's a really simple answer for this, but I'm not sure what it is.
I know it takes a little longer, but would it increase quality to record the foreground and background separately and then key in the background in post? I was even thinking you could record the background on a 4K Blackmagic Video Assist so that it's all Blackmagic RAW.
This is definitely possible, and you can record the CG background, matte, and 3D camera data all together for a traditional post finish. I primarily focus on the live workflow because it evolves into projectors and LED walls.
I'm working to move in the same direction, so I'm glad you told me that before I set up my studio with the wrong workflow in mind. Love your work and excited to join in with Virtual Production.
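As a toy illustration of capturing that 3D camera data for a post finish, here is a minimal sketch that appends the tracked camera's pose to a CSV every frame. A real pipeline would use Take Recorder or an FBX camera export instead; the function name, CSV layout, and file path are assumptions, the FFileHelper call is the real engine API.

```cpp
// Hypothetical per-frame camera logger for a post-production matchmove.
// Writes time + translation + rotation as one CSV row per frame.
#include "CoreMinimal.h"
#include "Misc/FileHelper.h"
#include "HAL/FileManager.h"

void LogCameraFrame(const FTransform& CameraPose, double TimeSeconds,
                    const FString& CsvPath)
{
    const FVector  Loc = CameraPose.GetLocation();
    const FRotator Rot = CameraPose.Rotator();

    const FString Row = FString::Printf(
        TEXT("%.4f,%.2f,%.2f,%.2f,%.2f,%.2f,%.2f\n"),
        TimeSeconds, Loc.X, Loc.Y, Loc.Z, Rot.Pitch, Rot.Yaw, Rot.Roll);

    // Append so the file accumulates one row per frame across the take.
    FFileHelper::SaveStringToFile(Row, *CsvPath,
        FFileHelper::EEncodingOptions::AutoDetect,
        &IFileManager::Get(), FILEWRITE_Append);
}
```

Paired with the recorded green screen plate and CG background, a per-frame camera log like this is enough to rebuild the shot for a traditional post composite.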