Indie Virtual Production | Real Time Key in UE4

Cinematography Database
Subscribers: 278K · Views: 92K

Published: 28 Sep 2024

Comments: 190
@Gleebi 4 years ago
Well done m8. Always difficult to try something new when there are no YouTube tutorials on the subject. Looks great.
@racerschin 2 years ago
A good workflow would be to record the composite and the separate virtual camera + greenscreen plates, edit with the composite, and re-composite after all editing is done. So: shooting and editing with an excellent preview, while maintaining full compositing quality for output. This workflow would require rec/stop/save commands to be sent to Unreal through SDI, and auto-naming of both the Unreal sequences and the camera clips, to help with camera assistance.
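As an illustration of the auto-naming idea above (this is not from the video), here is a minimal Python sketch of a shared take-naming convention, so a recorded camera clip, the matching Unreal Level Sequence, and the live-comp preview can be conformed later. The prefixes, directory layout, and function names are all hypothetical.

```python
# A minimal sketch of the auto-naming idea above: generate one take ID shared by
# the camera clip and the Unreal sequence so they can be re-conformed in post.
# Everything here (prefixes, directory layout) is a hypothetical convention,
# not part of any Unreal or camera API.
from datetime import datetime
from pathlib import Path


def make_take_name(scene: str, shot: str, take: int) -> str:
    """Build a shared take identifier, e.g. 'SC01_SH010_T03_20240928'."""
    stamp = datetime.now().strftime("%Y%m%d")
    return f"{scene}_{shot}_T{take:02d}_{stamp}"


def plan_take(scene: str, shot: str, take: int, root: Path) -> dict:
    """Return matching locations for the camera clip, Unreal sequence, and preview."""
    name = make_take_name(scene, shot, take)
    return {
        "camera_clip": root / "camera" / f"{name}.braw",       # recorded in camera
        "unreal_sequence": f"/Game/Takes/{name}",               # Level Sequence asset path
        "composite_preview": root / "preview" / f"{name}.mov",  # live comp for editorial
    }


if __name__ == "__main__":
    for key, value in plan_take("SC01", "SH010", 3, Path("/shoot")).items():
        print(f"{key}: {value}")
```

The actual rec/stop triggering over SDI and the recorder integration would sit on top of a naming scheme like this.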
@CONTORART 4 years ago
So awesome to follow your journey and see it all coming together now!
@peterxyz3541 3 years ago
I need this in my life 👍🏼👍🏼👍🏼👍🏼👍🏼👍🏼. One-person micro studio to create micro sci-fi short-form stories.
@debbiedeerproductions8204 3 years ago
So awesome!! Thank you... I really loved your explanation and demo of the composite :)
@Dante02d12 3 years ago
This looks great! It makes green screens much more believable! (For any French speakers passing through: the JDG team very likely used this tutorial for their new green screen system ^^. It's the same equipment, the same software, and the same level of rendering.)
@sijigs 2 years ago
I'm a bit late, but thanks Matt. You've earned a sub.
@beachcomberfilms8615 4 years ago
Gee Matt, you've come a long way in a short time. That is unbelievably amazing. I would love to see this in person. Any plans for workshops once we are free from the pandemic? BTW, I'm finally going to be able to get the RTX 2080 Ti, so I can finally get to work with Cine Tracer. Love your work man!!!
@JOKERSTUDIO9 4 years ago
I love your work so much!! Looking forward to a new, enhanced version of Cine Tracer with a character blocking and staging system to mark their movement.
@brendanblondeau8631 3 years ago
Brilliant!
@jono0202 4 years ago
Matt, can you make a tutorial on doing a live stream with a chroma key, but with camera movement like the last shots, where you move the camera and the background moves too? Pleeeeaaaasssseeeee
@ge2719 4 years ago
Looks good, though a way to match the lighting in the scene better would be a big improvement.
@CinematographyDatabase 4 years ago
If you check Instagram, we did some testing with backlights. I'm definitely going to do videos just on matching the live action to the virtual sets. That is the fun part.
@HardRockChart 4 years ago
@@CinematographyDatabase I've been looking into this quite a lot... would it be possible to use a few rear projectors to build a poor man's version of the ARWALL and light your actors that way?
@GerfriedGuggi 4 years ago
@@HardRockChart Also looking into rear projection as a low-budget solution :)
@CinematographyDatabase 4 years ago
Louis Buys yes, this is possible and I will hopefully demo it.
@HardRockChart 4 years ago
@@CinematographyDatabase Excited to see your demo.
@TerenceKearns 4 years ago
An idea for lighting: get a large TV screen, place it close to the model, and send it another feed of the virtual environment (taken from the opposite direction of the camera). Tune the lighting and camera for low light so that the effect of the TV is more pronounced. Now the reflection of the light from the TV on your model makes them look more immersed. It should help to counteract all that green light pollution. It's all about proximity.
@CinematographyDatabase 4 years ago
Yeah, we are working towards indie LED wall virtual production. I shot this demo last year, basically a mini Mando LED set - ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-kFBha5BE38k.html
@caferoast 3 years ago
Hey, really enjoyed your video and the in-depth, clear breakdown of a VP setup. I was wondering, however, if you could share how to set up just the Vive tracker + base stations without the HMD. I don't have an HMD and am having an extremely hard time setting those up with Unreal. I keep hearing they are super easy to configure but can't find any easy setup videos for that. I was hoping you could help explain that. I also tried following along with the configurations you showed in Unreal here and can't find most of the options you mentioned 🙏🏽
@racerschin 2 years ago
Interesting stuff. Is there a lightwrap effect in the Unreal compositor? That would help a lot.
@d3tach3d 4 years ago
How do you get the focus pull from the lens to sync with Unreal, or is the lens focus part just for the mixed reality and not for filming all-Unreal content? This new wave of technology that you're doing, and what the Mandalorian crew did, just makes me so excited to see live tracking from the real world used for camera recording. It's like VR had to be a thing, with the whole tracking aspect, for people to get the idea of using tracking for virtual cameras, and then it moved on to the next stage with mixed reality. Truly a breakthrough in the film production world and a way to shorten the gap between a small indie studio and the big corporate film industry. Similar to when home recording software became so user-friendly and PC hardware became cheap enough for bands to record their own music in a small studio without having to pay a lot of money for studio time at a big one. I hope that made sense. Anyway, this is amazing and definitely the future. I'm a CG artist and this kind of stuff really gets me excited and very interested in wanting to know more.
@tomdchi12 4 years ago
Is UE calculating the lighting in real time, or is the set using "baked" lighting? Either way, it looks like a fantastic start for a setup based on mostly "consumer" gear! You are really opening the door for a lot of productions!
@CinematographyDatabase 4 years ago
This scene has baked lighting. However, with RTX and SSGI, fully dynamic lighting has very high quality, and you only need to hit 24 FPS in this case.
@ClassyDogFilms 4 years ago
It's exciting to see your tests with virtual production, Matt. This sort of thing would have cost a fortune 10 years ago. Imagine how cool it will be when you can get one of those big LED walls!
@cire30a 4 years ago
So how does it work with post? Are you also recording on the URSA, and then giving the Unreal Engine scene, tracking data, and raw footage to the editor for finishing the feature film?
@lionbeatscobra 2 years ago
Man, wish there was a PDF somewhere on how to set this up for all cameras.
@zoombapup 4 years ago
I was thinking of trying the new Intel RealSense tracking camera as a way of doing the tracking on the camera, rather than using the Vive method. I've seen bigger production sets using inside-out camera tracking where they mark up the room with tracking markers and then just have a camera pointed at the ceiling with a bit of software to position and orient the camera (which is basically what the Intel tracking camera is doing, but without markers). Ideally I guess you'd have a rear projection screen behind the cam for environment lighting. Ah man, if only I had the funds I could play all day with this kind of thing.
@sinanarts 3 years ago
Dear Matt, I have an exact setup like your early days. I need an answer, if you please 🙏🏻. I am connected to Unreal via a DeckLink card and I get a keyed preview in the viewport. I use Composure and also a Media Bundle projected onto a plane; all are working perfectly. What I can't achieve is recording my editor viewport in UE. Is it mandatory to use an external recorder via DeckLink in order to record? I do not want OBS or a similar screen capture. Is there a way to record inside Unreal Engine, like Take Recorder or Sequence Recorder? Appreciate your time and help.
@rotanncolyn4608 2 years ago
Hi Matt! Hope you can help me. I have spent countless hours trying to get Composure to work, but something isn't right. The CG layer looks completely different compared to the actual scene. It is as if it doesn't read any of the post processing or lighting properly. Any advice on what I can do to fix this? Thanks.
@vivektyagi6848 4 years ago
Really cool.
@HandsomeDragon 2 years ago
Hello, I'm working with Composure and the cg_layer color is way off; it's bad and not as good quality as the CineCameraActor. Have you come across this issue where Composure's view is different from your CineCameraActor?
@minarimon3106 1 year ago
What's the minimum size of a space required for a virtual production LED wall?
@Im_Derivative 2 years ago
If I want to do this live on Twitch or something, is there a way to do it without the live feed ending every time I try to play the level? I want to use some spider AI I coded, and their animations, but can't, because this method only seems to work in the editor.
@estudiorgb 3 years ago
Can you share the BP? I have the HTC camera tracker, but I can't get the same result as you at 9:15.
@Pedrangelo 3 years ago
Wow! I'm really excited to try, but an HTC Vive is ridiculously expensive in Brazil xD Does anyone know if it's possible to get high quality with any other VR set?
@deeeny_ 2 years ago
Hey man, this is amazing, thank you very much. Just one question: once you have everything set up, how do you record it? I mean, to have the rendered video of what you filmed in real time. Thanks in advance!!
@sultanaraboughly 4 years ago
Man... this is genius.
@lukout 4 years ago
Pretty cool! Amazing! I am learning a little bit of Unreal. Can you use any digital camera, like for example a GH4?
@sinanarts 3 years ago
Never got a reply, but let's try once more... I do not see the color difference material option. Any idea? Appreciate it.
@arnoldwinford1096 2 years ago
Hi, if I use Take Recorder to record the project, how do I set it up? Thank you so much.
@pawaalfilms 4 years ago
How can we start a small virtual production? Any simple DSLR tutorial, please?
@maizizahmed 3 years ago
Hey bro, I like your videos. What about the PTZ AW-UE150 plugin for UE4 with a Vive tracker?
@Marc-hy8cf 3 years ago
Great videos, love it!! Question for you though. I don't have an HTC Vive setup (yet... covid $$ issues lol), so I'm trying to achieve similar results by using the iPhone as a tracker. Even though I match my CineCameraActor settings to my DSLR, I find they don't quite match in Unreal. You don't seem to be having this issue with the Vive trackers. What "information" do they give back... transform? World location? Distance between cam and tracker? Thx! Marc.
@8088NET 4 years ago
Brutal!!!!! Thank you.
@Enzait 4 years ago
Nice!
@keithyakouboff8755 3 years ago
How does one learn this in his apartment? I have a possible opportunity for some freelance work creating 3D sets using Maya. However, they bring Unreal in because they want these sets to be part of a live broadcast. This tutorial seemed like an introduction to that. However, it is occurring to me that I may need access to a green screen, a high-end camera, a card of some sort in my workstation... Holy crap. Is there a way I can learn what I need to get these jobs without having to spend gobs of money on schooling or hardware?
@Khievkimhong 4 years ago
Can we use the EOS 70D for that?
@felipeaguirrebengoa6191 3 years ago
Hi there! Excellent work you're doing, and the tutorials and all your videos, congratulations! I wanted to ask the following: what kind of PC hardware (video card, processor, RAM, etc.) is necessary for this type of work? Thanks in advance, and I look forward to your reply.
@sada16dec 4 years ago
I am trying to put a foreground plate over a media plate and added a simple cube to the foreground layer; it shows a black background along with the cube... I already changed "alpha channel to linear space color only". Suggest if I am missing anything.
@ahmadsyauki 4 years ago
Is Vive tracking ready for broadcast production? Does the Vive make tiny random movements while tracking?
@DhikaPramudhia 4 years ago
Can Cine Tracer do this?
@timdelatorre 4 years ago
Is there a way to sync camera aperture value and shutter angle with background blur and motion? I imagine this is another limitation of the green screen vs an LED wall. Is there a way to get accurate DOF based off lens focal length, aperture values, and focus points? I can't imagine that kind of data is output by cameras and available to Unreal Engine to calculate.
@CinematographyDatabase 4 years ago
It is possible to sync the real-world focus/iris/zoom/FOV/chromatic aberration/distortion of real-world lenses and match them in Unreal Engine. LED walls have their own challenges with matching, but they are different from green screen for sure.
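To make the sync described above concrete: one common indie approach is to read the lens (or follow-focus motor) encoder, calibrate it against a few measured focus marks, and interpolate live readings before driving the virtual camera's focus. The Python sketch below only illustrates that mapping step; the table values are invented, and the real plumbing into Unreal (Live Link etc.) is not shown.

```python
# A rough sketch of one way lens data sync can work in principle: sample the lens
# encoder at a few marked focus distances, then interpolate live encoder readings
# through that table before feeding the result to the virtual camera. The sample
# values below are invented for illustration; real rigs calibrate per lens.
from bisect import bisect_left

# (raw encoder value, measured focus distance in cm) - hypothetical calibration pass
FOCUS_TABLE = [(0, 45), (2100, 90), (4500, 180), (7800, 400), (10000, 1000)]


def encoder_to_focus_cm(raw: float) -> float:
    """Piecewise-linear interpolation of the calibration table."""
    xs = [x for x, _ in FOCUS_TABLE]
    ys = [y for _, y in FOCUS_TABLE]
    if raw <= xs[0]:
        return ys[0]
    if raw >= xs[-1]:
        return ys[-1]
    i = bisect_left(xs, raw)
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)


if __name__ == "__main__":
    print(encoder_to_focus_cm(3300))  # ~135 cm, halfway between the 90 and 180 marks
```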
@TcheFranco 2 years ago
Can it be done with 2 cameras? 1 wide and 1 close-up?
@CinematographyDatabase 2 years ago
Yes. It would require hardware to output the different scene renders at once, but multicam/frustum is possible.
@ottogarza 4 years ago
Matt, do you know if the BM Video Assist can record BRAW from the URSA Mini 4.6K? Not the Pro, but the first 4.6K. Hope your tech guy can answer; that would be really, really helpful. Thanks for your knowledge.
@ApexArtistX 2 years ago
Does an iPhone work instead of a Vive tracker?
@CinematographyDatabase 2 years ago
It works, but it's a bit drifty and unreliable for professional production.
@patrickbeery9405 4 years ago
Thanks for sharing this. I've been working with the AJA and media bundles. They work, but I really want to use some of the more advanced keying features in Composure. I understand how Composure works but don't want the media plate to be attached to the camera. My question is: how can I take the media plate in Composure and add it to a plane that I place inside Unreal, like the Blackmagic or AJA bundle? I have read that 4.25 added some new features to Composure and am wondering if this would be one of them.
@Jsfilmz 4 years ago
Hahaha, I asked him the same exact question. I found a tedious way to do it, but it's not live footage like his example here. Don't use Composure to key the footage. Just use a material to key your footage, then apply it on a plane.
@patrickbeery9405 4 years ago
@@Jsfilmz That's what we have been doing with the AJA bundle; I read something about new Composure plates. AJA's keyer isn't the best and I would love to use the Composure keyer. If you have a written link to the tedious way, I'd love to check it out.
@CinematographyDatabase 4 years ago
You use the Composite Plate / Image Plane plugin in UE4.25 to put the comp on a 2D plane and composite it in world space.
@hiskishow 4 years ago
This production contains social distancing.
@andreic048 4 years ago
You mention 2.0 base stations in the description, but would this work with 1.0 base stations as well? The HTC website says the Vive Tracker works with the 1.0 base stations, but I'm not sure about this specific use case.
@CinematographyDatabase 4 years ago
I don't know the specifics, but I believe the 2.0 stations can be combined to make bigger tracked volumes more easily than the 1.0. In the case of virtual production you want the best-quality tracking volume possible, or the whole thing jitters. A perfectly set up space and a perfect 1.0 setup would probably work, but most indie spaces have issues that are overcome with more stations, which means 2.0 stations.
@andreic048 4 years ago
@@CinematographyDatabase Thanks.
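For what it's worth, a common software-side mitigation for residual tracker jitter (independent of how many base stations you have, and not something demonstrated in this video) is to low-pass filter the tracked pose before it drives the virtual camera, trading a little latency for stability. A minimal Python sketch of that idea:

```python
# One common mitigation for residual tracker jitter (not something shown in the
# video): low-pass filter the tracked position before driving the virtual camera.
# The smoothing factor trades jitter against latency; values here are arbitrary.


class PositionSmoother:
    """Exponential moving average over 3D positions."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha          # 0 < alpha <= 1; lower = smoother but laggier
        self._state = None

    def update(self, position):
        x, y, z = position
        if self._state is None:
            self._state = [x, y, z]
        else:
            self._state = [
                self.alpha * new + (1.0 - self.alpha) * old
                for new, old in zip((x, y, z), self._state)
            ]
        return tuple(self._state)


if __name__ == "__main__":
    smoother = PositionSmoother(alpha=0.3)
    noisy_samples = [(100.0, 0.0, 150.0), (100.4, -0.2, 149.8), (99.7, 0.3, 150.1)]
    for sample in noisy_samples:
        print(smoother.update(sample))
```

More sophisticated filters (the One Euro filter, for example) adapt the smoothing to how fast the camera is moving, which keeps fast moves responsive while still calming slow drift.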
@seefoodeatrice4608 4 years ago
CompMaterial - do you know how to make this? 8:30
@estevesbb 4 years ago
How are you setting up the tracker distance to the lens? Are you just estimating the distance? Because there's parallax between the camera position and the tracker position, and it's not looking good atm for me.
@CinematographyDatabase 4 years ago
Lens calibration isn't set up yet in this demo. In the future it will be.
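Even without full lens calibration, the parallax issue raised above comes down to composing the tracker pose with a fixed, measured tracker-to-lens offset. Here is a rough Python sketch of that composition, with made-up numbers; a proper solve would also handle lens distortion and the entrance-pupil shift with focus.

```python
# A minimal sketch of the offset problem discussed above: the Vive tracker is not
# at the lens, so its pose has to be composed with a fixed, measured
# tracker-to-lens offset before it drives the virtual camera. The numbers below
# are placeholders; real setups measure them (and ideally solve them with a
# proper lens/camera calibration, which the video notes is not set up yet).
import numpy as np


def camera_pose_from_tracker(tracker_pos, tracker_rot, lens_offset_local):
    """Apply a fixed local-space offset (tracker -> lens entrance pupil).

    tracker_pos: (3,) world position of the tracker
    tracker_rot: (3, 3) world rotation matrix of the tracker
    lens_offset_local: (3,) offset measured in the tracker's local axes
    """
    tracker_pos = np.asarray(tracker_pos, dtype=float)
    tracker_rot = np.asarray(tracker_rot, dtype=float)
    lens_offset_local = np.asarray(lens_offset_local, dtype=float)
    lens_pos = tracker_pos + tracker_rot @ lens_offset_local
    return lens_pos, tracker_rot  # orientation is shared if the mount is rigid


if __name__ == "__main__":
    identity = np.eye(3)
    # made-up offset (in cm) from the tracker to the lens entrance pupil
    pos, rot = camera_pose_from_tracker([0.0, 0.0, 150.0], identity, [0.0, 20.0, -12.0])
    print(pos)
```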
@prjpantallasdeledspantalla7722 3 years ago
Please, one tutorial on how to connect in real time!!! Regards.
@salimalsarsour6680 4 years ago
Can I use a Canon 90D for this kind of setup?
@sid3walk 4 years ago
I wonder what happened to Cine Tracer? Is development postponed for now?
@CinematographyDatabase 4 years ago
No, it's more like half Cine Tracer and half virtual production, but they overlap. For instance, a Vive-based virtual camera is in Cine Tracer right now for the next update. VP is basically R&D for Cine Tracer.
@goodenna 4 years ago
Which Unreal version were you using? I couldn't find the difference key... I could only find chroma key...
@CinematographyDatabase 4 years ago
You need the 4.25 Preview for the Color Difference Keyer.
@goodenna 4 years ago
@@CinematographyDatabase Thank you! I'll try 4.25.
@paulpierantozzi 4 years ago
Looks like the keying is not very accurate when done live. If I was recording that, I would want the raw footage with green to do a more advanced key in Nuke or AE.
@CinematographyDatabase 4 years ago
Paul Pierantozzi it's possible to record the CG clean plate and do normal post compositing, and use the live comp as an on-set preview.
@Gldp01 4 years ago
Where can I find this Color_diff Keyer? Thank you!
@CinematographyDatabase 4 years ago
You need to use UE4 4.25 to get the Color Diff Keyer.
@Gldp01 4 years ago
@@CinematographyDatabase Thank you!!! Can you use a post process volume with the compositor as a CG element?
@33Records 4 years ago
Boss.
@sixsoxsex1 4 years ago
Which release of Unreal Engine do I need to download for virtual production?
@onotoc 3 years ago
At least 4.24, but the really new stuff is in 4.26.
@sixsoxsex1 4 years ago
The hair isn't perfect; can it be improved?
@SceneOfAction 4 years ago
This is amazing. Thank you so much for sharing your process.
@darthgzuz 4 years ago
Wow... tech has come a long way. Brilliant... Now keep that AVR (Augmented Virtual Reality) as your standard background 👏👏👏
@erictko85 4 years ago
Matt, this is AWESOME. You are getting close and I can see that inspiration you were feeling. It's getting close to the point where you can make something that looks pretty damn amazing with virtual production and UE.
@zoombapup 4 years ago
Question for you: where did you get the greenscreen background? I've been searching on Amazon but couldn't find a decent one. I'm going to try a virtual setup like this (I'm an AI programmer, so I want to do some realtime virtual characters interacting with live actors).
@zoombapup 4 years ago
Balls, so you even had it in the damn title. Sorry for the dumbass question :)
@poyodiazmusic 2 years ago
This is great, but how can you record the video composited in Unreal, for example?
@airplaneian 4 years ago
This is looking great! So cool to see you spin this up so quickly.
@minakovstudio 4 years ago
Looks great. You are inspiring me to shoot a short with your techniques. Thx!
@ChrisBraibant 4 years ago
Hi Matt. You do an amazing job. I am trying to do the same with a Vive and a GH5. Could you please recommend some resources? Thanks.
@AncLight188 1 year ago
Hi Matt! How do you output your vlog video from Composure? Is there a straightforward way to output the composited image from Composure to a .mp4 file? Thanks!
@imiy 3 years ago
I've been spending a few days on this already, but haven't found info on how to place already-keyed footage of a person in a UE 3D environment to make some simple camera moves and render it out. All the tutorials are for live real-time production with green screen. So frustrating...
@ameenmo360 2 years ago
Hi Matt. If I want to buy a video camera compatible with Unreal Engine, to connect and shoot with chroma input directly after creating the 3D scene, what type of camera do you recommend that is compatible with the fifth version?
@darviniusb 3 years ago
How is the video signal coming into UE4? Does UE work on the "raw" captured signal, or does UE do fast compression? How good is the UE keyer? It looks a bit limited. Would it not be better to work with an external hardware keyer that does a key on a 4:4:4 signal and then pass that to UE, or would that add too much latency? Sorry for so many questions, but I just started with this and I am interested in having as little quality loss as possible from lens to output.
@hardcorerick8514 1 year ago
Is there a way to get this camera tracking but with just an Unreal camera? Composure + putting the composite plane in one place; wondering if this is possible? Love the work as always!
@eliteartisan6733 1 year ago
Please could you tell me, or point me to a link where I can learn to create the "Andy's Tutorial" material?
@anikettivare 2 years ago
Please make a detailed tutorial for the second option.
@mmtv_au 3 years ago
Bruh. Game changer. Even now.
@GregCorson 4 years ago
@Cinematography Database where did you get that spaceship model from? I'd like to give it a try. The Epic virtual studio models are all very bright; I'd like to try something that looks good in dim light.
@CinematographyDatabase 4 years ago
Easily one of the best on the marketplace - www.unrealengine.com/marketplace/en-US/product/sci-fi-modular-environment
@stickwithit 4 years ago
This is some next-level stuff Matt! Would love to set up a rig like this at home for a static camera!
@CinematographyDatabase 4 years ago
For static cameras it's really straightforward and really great quality. Adding camera moves makes it a bit more of a setup with the Vive etc.
@stickwithit 4 years ago
@@CinematographyDatabase Gonna recap your previous vids and hop on those UE tutorials when I'm back from tour! Super excited to see what I can pull off. More excited to see what you can pull off 😁
@tyldarprod1399 4 years ago
Well done. And here we go!!! Off to Home Depot to make that green screen frame. Big thanks to wifey. :)
@d3tach3d 4 years ago
I watched that video demo with the guy on the motorcycle, using Quixel assets and the LED walls with the tracked cameras, over 20 times, showing all my friends and family how mind-blowing it is that we are now at this point. Now that I found your channel I'm really glad I did, because I've been wanting to know more.
@aspiceoflife 4 years ago
Excited to see what you will produce!
@VfxBlender 4 years ago
Nice, you got the live key to give good results.
@Jsfilmz 4 years ago
Did you have to track these? I saw you didn't have any tracking markers on the green screen but had camera movements.
@CinematographyDatabase 4 years ago
I live track the camera using the HTC Vive and UE4.
@JLOFlix 4 years ago
FANTASTIC! Thanks SO MUCH, MATT!! TRULY motivating!
@AdrianooElias 3 years ago
Nice! Which graphics card was used?
@Sanjay_jit 4 years ago
Really cool. I just ordered a Roko because all of this virtual production is too next level. Have you had any luck trying to get some sort of lightwrap going for realtime keying?
@CinematographyDatabase 4 years ago
I haven't tried light wrap yet, but I heard something about it. I'll show it if I get it up and running.
@btcmagazine.tech24 3 years ago
Bro, thanks for sharing your rig, it looks really amazing!! Keep posting videos about it please!
@alesis_ 4 years ago
Great result, Matt. Thank you for your videos!
@24pfilms 4 years ago
Matt, what are the total ballpark costs? P.S. Great work!
@CinematographyDatabase 4 years ago
Best part is that the Unreal Engine and the live key etc. tech is all free. The cameras, Vive, green screen, etc. vary a lot, but there are links in the description if you want to Amazon-cart ballpark it. Throw in a $5K PC with a 2080 Ti.
@AmmonEhrisman 4 years ago
SUGGESTION: If you just get the LUT you want loaded onto the Blackmagic, you can send that over the SDI; you would not have to add a grade in Unreal (saving you some processing power on that computer).
@CinematographyDatabase 4 years ago
That's a great idea. I should make a UE4 LUT 🍻
@dyervisuals9963 4 years ago
Is it possible to have a static physical camera (so no tracking info) going into Unreal Engine and move a virtual camera around in Unreal Engine?
@CinematographyDatabase 4 years ago
Yes, you put the video on a plane and then move the UE4 camera around. The illusion is pretty broken if you try to pan or tilt, but if the camera stays perpendicular it looks OK. Someone in the Facebook group is using this technique and fooled me at first into thinking it was a 3D track.
@marcogrob3899 2 years ago
Thank YOU!! Can you make the background out of focus and play with focus in general? Best, Marco
@CinematographyDatabase 2 years ago
Yes you can; you need a way to track the real-world camera focus distance, however.
@SolomonJagwe 4 years ago
Well done!! 👏🏽👏🏽👏🏽
@AndreLLMedia 4 years ago
This is awesome.
@rutchjohnson 4 years ago
So this is awesome! But how do you think we can merge and sync the focus pulling between virtual and real cameras?
@CinematographyDatabase 4 years ago
With high-end hardware/software systems (MoSys, NCam, Stype) it's possible. At the indie level the community is looking for and inventing solutions.
@SitinprettyProductions 2 years ago
Hi Matt, love all your videos. Quick (newbie) question: how can I record the CG clean plate (or just get the tracking info) if I just want to use the live comp as an on-set preview and do post later? I feel like there's a really simple answer for this, but I'm not sure what it is.
@eantays7185 4 years ago
I know it takes a little longer, but would it increase quality to record the foreground and background separately and then key in the background in post? I was even thinking that you could record the background on a 4K Blackmagic Video Assist so that it's all Blackmagic RAW.
@CinematographyDatabase 4 years ago
This is definitely possible, and you can record the CG background, matte, and 3D camera data all together for a traditional post finish. I primarily focus on the live workflow because it evolves into projectors and LED walls.
@eantays7185 4 years ago
I'm working to move in the same direction, so I'm glad you told me that before I set up my studio with the wrong workflow in mind. Love your work and excited to join in with virtual production.
@DavidCrossIN2U 3 years ago
This is fantastic. I would love to learn how to do this. So, to be clear, this is UE4 Composure that you are using?