
New Virtual Production TV Studios 

Greg Corson
6K subscribers
3.8K views

Two new studios and sound! The studios are courtesy of Epic Games and can be found in their "virtual studio" sample project. I just migrated my virtual production template into it and was recording right away!
There are still some talent lighting issues: two big fluorescent lights over my head that I can't turn off clash with the daylight-balanced studio lights. Color grading the skin tones in DaVinci Resolve 16 got rid of the worst of the green tint, and I also filtered out the noise of the honking-loud HVAC fan right over my head.
I have to move to a new studio this week, so probably no updates for a bit. But the new studio is less noisy and lets me shut off the fluorescent lights, so new videos should look better.
The sample project, updated to Unreal 4.24.1, is here: github.com/Mil... This does NOT include the new Epic studios for licensing reasons; just grab them from Epic's Virtual Studio sample, then check out Readme_2 in my project for instructions on transferring my setup into that project or your own. Remember, this isn't exactly plug-and-play: you probably don't have the same setup as me, so you will likely need to make adjustments for your cameras and hardware.
#virtualproduction #unrealengine

Film

Published: 3 Oct 2024

Comments: 31
@sanjeevsaini1828 · 4 years ago
Please make a complete video of this setup from A to Z.
@prasithsay4741 · 4 years ago
Thanks for your useful tutorial and project. I work at a TV station and am looking for a way to use UE4 in my virtual set workflow for live broadcast. I'm new to UE4; your tutorial and project will help me reach my goal. Waiting to see your new tutorials, please keep them coming 😊
@GregCorson · 4 years ago
There are a lot of uses for this in broadcast TV...I'm still updating, more stuff soon.
@prasithsay4741 · 4 years ago
@@GregCorson I just tried NDI video out, but the quality can't compare to BM or AJA. Or maybe my settings aren't correct?
@GregCorson · 4 years ago
Well, a BM or AJA card is raw uncompressed video at a very high rate, so it's going to be hard to beat. With NDI it's going to depend on your network (wired or wifi) and what quality settings you use. NDI has several quality settings that up the bitrate for higher quality. You may want to read through the documents on ndi.tv and their troubleshooting guides if the quality doesn't look right to you. The iPhone NDI camera app (currently free) also has several different quality settings but the quality can go lower because it runs over WiFi. If you have the budget, a BM or AJA card with a professional broadcast or cine camera on SDI is always going to be the highest quality. But with the right setup NDI can be plenty good enough for serious TV and high quality livestreaming studios. Twit.tv uses a lot of NDI stuff. I will be spending more time messing with NDI in the coming weeks and will report back my experiences.
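The gap Greg describes can be made concrete with some back-of-the-envelope arithmetic. This sketch compares the raw bitrate of an uncompressed SDI capture with a commonly cited full-bandwidth NDI figure; the NDI number is an approximate published target, not something measured in this thread.

```python
# Rough bandwidth arithmetic: uncompressed SDI capture vs. NDI over the network.

def uncompressed_mbps(width, height, fps, bits_per_pixel):
    """Raw video bitrate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# 1080p60 in 10-bit 4:2:2 (20 bits/pixel), typical of a BM/AJA SDI capture
raw = uncompressed_mbps(1920, 1080, 60, 20)

# Full-bandwidth NDI for 1080p60 is commonly cited at roughly 150 Mbit/s
ndi = 150

print(f"uncompressed 1080p60 10-bit 4:2:2: {raw:.0f} Mbit/s")
print(f"high-quality NDI 1080p60 (approx): {ndi} Mbit/s")
print(f"compression ratio: roughly {raw / ndi:.0f}:1")
```

So NDI is discarding information at roughly a 17:1 ratio even at its highest quality setting, which is why a raw SDI card is "hard to beat" while NDI can still look very good on a solid wired network.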
@CaptainSnackbar · 4 years ago
Nice work Greg, looks really good. Your tracking is spot on and the green screen is good, but it needs more tweaking in the engine: check out Erode in the Transform/compositing passes, it can kill off the green spill coming from the floor. If I may add, you have decent color grading options in the Unreal Engine media plate; you may not need DaVinci Resolve.
@GregCorson · 4 years ago
Thanks! This does have both Erode and Despill running in the key. I think most of the green you see on my skin is from the fluorescent lights above me, not spill from the floor; I can't turn those off. Before I color-graded the skin tones I looked seasick green, and there is still a slight green tint in the more shadowed areas. Every time I walk under one of these fluorescent lights the white balance goes off.
@CaptainSnackbar · 4 years ago
@@GregCorson I'm eager to test out your project on GitHub. One of the most requested topics for virtual sets is reflections of the keyed video in the virtual world; with the current Composure options in Unreal it's not possible. Zero Density do a 3D composite of the video feed, which is how they get reflections back into the virtual world. I'm still grounded, I can't experiment without a completed studio set at the moment.
@ivanmarcos7965 · 4 years ago
It looks great Greg. Did your video input in Unreal match your camera monitor? In my case the video feed in Unreal is way overexposed (2 to 3 f-stops higher than what the actual image looks like on the camera). I had a Blackmagic card earlier on and now I'm using the Kona 4 card, and it's the same with both cards.
@GregCorson · 4 years ago
In my case it appears to, though there are a lot of settings that can affect this. If you are using a full Kona 4 with SDI inputs, your card probably has more settings than mine. Be sure to go through all the "extra" ones; use the little triangles at the bottom of each settings section in the plugin to open and look at them. The sRGB one can make a difference. If you are using a Kona, look at the video in their Control Room app and compare it to Unreal. If you are looking at the final output of a Composure composite in Unreal, be aware that Unreal's auto exposure, tone mapping, and a lot of other post-process stuff can affect the final output too. It's best to debug your setup with a simple scene like my sample project before jumping into a fancier 3D scene. Let me know if you continue to have trouble.
@ivanmarcos7965 · 4 years ago
@@GregCorson Thanks for your reply Greg. I'll go through your suggestions and I'll post results here. Thanks again.
@Ahmed-gc1ys · 4 years ago
Nice work Greg, I really appreciate it. It would be really helpful if you did a tutorial on how to build a virtual production setup from the beginning. Since you're an expert, I have some questions if possible: with a Vive tracker, can we track lens zoom and focus? For a video wall, can we make a video player input and play video from an external source? For the shadow of a real object like a presenter, can we make it look real inside UE? Thanks in advance.
@GregCorson · 4 years ago
Hi Ahmed, there are really only three parts to this. One is building your virtual set, which is just a normal Unreal level; there are lots of tutorials on the Unreal site and on YouTube for this. The second is setting up the studio/greenscreen, and the third is transferring my virtual production stuff into your virtual set. I think the last part is covered by my tutorial, but I'm working on making it simpler. I just finished moving my studio, so I'll see if I can do something on studio setup. The Vive really doesn't have any way of tracking focus and zoom as it comes; you would have to buy a lens tracker, build one, or have a camera with it built in. Unreal can handle the lens changing zoom and focus, you just need to get the data into it somehow. There are a couple of ways to get the presenter to cast a shadow on the CG world. The easiest one is to light the presenter so they cast a shadow on the green screen, then process that into a shadow map you can apply to the CG. I haven't had time to work on this yet. You saw at the end of this video I had a movie playing on one of the virtual screens; there shouldn't be any reason why you can't put live video there too. If you want to drive a real-world video wall made up of multiple screens, Unreal can do that too.
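The shadow-map idea Greg mentions (light the presenter so they shadow the green screen, then turn the darkened pixels into a mask for the CG) can be sketched in a few lines. This is an illustration of the concept only; the threshold values are made up, and real footage would need denoising and softening.

```python
import numpy as np

def shadow_mask(green_plate, clean_plate, threshold=0.15):
    """Return a 0..1 shadow mask: 1 where the plate is darker than the
    clean (empty) green screen by more than `threshold`.
    Inputs are luminance images in the 0..1 range."""
    darkening = clean_plate - green_plate          # positive where shadowed
    return np.clip((darkening - threshold) / 0.1, 0.0, 1.0)

clean = np.full((4, 4), 0.6)                       # evenly lit green screen
plate = clean.copy()
plate[2:, 2:] = 0.35                               # presenter's shadow region

mask = shadow_mask(plate, clean)
print(mask)
```

The resulting mask could then be projected onto the CG floor as a darkening layer, which is the "process into a shadow map" step in the comment above.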
@LeLabSecret · 4 years ago
Nice job Greg. How did you do your tracking?
@GregCorson · 4 years ago
Tracking is just using VIVE pucks, there are more details in other videos on the channel.
@_myaus · 4 years ago
Hi Greg, the scene looks good. I have a question: is it possible to capture video without having a video capture card? I tried to record a picture, but the documentation says this cannot be done without a board. Maybe there is a way around it?
@GregCorson · 4 years ago
To get live video into Unreal, you need some kind of capture card, USB capture dongle, or a webcam. To record your virtual production output you can use a software screen recorder like OBS, use an external recorder like an Atomos Ninja V to record the output of your video card, or use a dedicated video output card. You could pre-record video too and send it into Unreal as a media file, but you need to get camera position info into Unreal somehow. If the camera is stationary you could just measure it, or you could use something like Blender's camera tracking to track the video from a moving camera and then feed it to Unreal (I haven't tried this, but it should work).
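The "get camera position info into Unreal somehow" step above boils down to having a per-frame transform for the camera. As a hypothetical illustration only, this is the kind of intermediate file a Blender-track-to-Unreal pipeline might pass around; the column names, units (centimeters/degrees, matching Unreal's conventions), and layout are invented for this sketch, not any real Unreal import format.

```python
import csv
import io

# Hypothetical per-frame camera samples: (frame, x, y, z, pitch, yaw, roll).
samples = [
    (1, 0.0, -350.0, 160.0, 0.0, 90.0, 0.0),
    (2, 1.2, -349.5, 160.1, 0.1, 90.4, 0.0),
]

def write_camera_csv(samples, fileobj):
    """Dump per-frame camera transforms as CSV for a downstream importer."""
    writer = csv.writer(fileobj)
    writer.writerow(["frame", "x", "y", "z", "pitch", "yaw", "roll"])
    writer.writerows(samples)

buf = io.StringIO()
write_camera_csv(samples, buf)
print(buf.getvalue())
```

Whatever the exact format, the point is that a tracked camera is just a time series of transforms, whether it comes from a Vive tracker live or from Blender's solver after the fact.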
@_myaus · 4 years ago
@@GregCorson Thanks for the answer Greg, you helped me a lot! Thanks again!
@mineshg3437 · 3 years ago
How are you moving the camera 360 degrees with one flat green screen? Please share.
@GregCorson · 3 years ago
In this shot there is a normal flat green screen about 15 feet wide. There is a matte/mask that causes everything outside of the green screen to be blocked out, so when you pan the camera around all you see is the CG shot. You can see information about the mattes in the latest set of tutorials for VPStudio. You can use them to either hide objects or reveal them.
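The matte idea above is what compositors usually call a garbage matte: force alpha to zero outside the physical green screen, so panning past its edges just reveals CG. A minimal sketch, with made-up image sizes and a hand-placed rectangle standing in for the real matte shape:

```python
import numpy as np

h, w = 6, 10
key_alpha = np.ones((h, w))        # pretend the chroma key passed everything

garbage = np.zeros((h, w))         # 1 inside the green screen area, 0 outside
garbage[1:5, 2:8] = 1.0            # illustrative rectangle covering the screen

final_alpha = key_alpha * garbage  # talent survives only inside the matte
print(final_alpha)
```

Multiplying the two masks means the key only has to be clean inside the matte; everything beyond the 15-foot screen is guaranteed to show the virtual set.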
@iradmir777 · 4 years ago
Hello, thanks for the project. I'm trying to do a similar thing and I have a few questions. I was able to open your project from GitHub and connect a Vive tracker to the scene, but it is not moving right. Should the comp camera rig change location after I start the project? In my case the cylinder appears to be in front of my talent, and when I move my camera down, the camera in the scene moves up. Everything else I think works fine. Btw I'm using Unreal 4.24.
@GregCorson · 4 years ago
Be sure to do a "room setup" in SteamVR before starting; put the headset on the floor wherever you want the talent to stand. In the real world, look where the camera is relative to this spot; when you start the project, the camera should jump to the same spot in the level relative to the talent marker. If the camera is not moving right, you may have the tracker mounted to the camera in the wrong orientation. In my setup the tracker is oriented with the tripod socket facing down and the USB plug facing to the rear. If your tracker is set up differently, you will have to adjust the offsets and rotations in the comp camera rig to match your setup.
@iradmir777 · 4 years ago
​@@GregCorson thanks so much for the answer. The tracker position was wrong in the real world, right now it's working fine.
@ahmadsyauki · 4 years ago
I'm seeing some tiny random movement from the camera while you're moving it. Is this base station behavior while it tries to find/lock the position of the tracker? Do you think this is okay for live production?
@GregCorson · 4 years ago
The amount of jitter varies with the setup of the VIVE trackers. I am still working out what causes it. Some people don't seem to have it while others do, and it has varied a bit from one of my locations to another. One solution is to add a filter to the camera tracker which should be able to eliminate the problem, since the jitter is very small the filter should not cause problems. Also, if your camera is moving, this small amount of jitter isn't noticeable. So a camera that is hand-held, on a jib arm or slider shouldn't have a problem. Also, if you want a stationary camera, the tracker can be shut off after the setup and you won't see any jitter. Whether this is suitable for live production or not depends on your requirements. There are other ways to track a camera besides VIVE that can give a jitter free result, they are just more expensive. Remember what I'm doing is an experiment to see if I can get cheap hardware to perform well, so this is a work in progress, little problems being solved one at a time.
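The "add a filter to the camera tracker" idea can be sketched with simple exponential smoothing of the tracked position. This is an illustration of the concept, not Greg's actual implementation; real camera-tracking setups often use a One Euro filter instead, which adapts the smoothing to how fast the camera is moving so a slow camera gets heavy smoothing and a fast one gets low lag.

```python
def smooth(samples, alpha=0.2):
    """Exponentially smooth a sequence of 1-D position samples.
    Smaller alpha = more smoothing, but also more lag behind real motion."""
    out = []
    state = None
    for x in samples:
        state = x if state is None else alpha * x + (1 - alpha) * state
        out.append(state)
    return out

# Stationary tracker with small jitter around 100.0 (units arbitrary)
noisy = [100.0, 100.3, 99.8, 100.2, 99.9, 100.1]
filtered = smooth(noisy)
print([round(v, 2) for v in filtered])
```

Because the jitter is tiny relative to real camera moves, a gentle filter like this can flatten it out without visibly lagging handheld or jib motion, which is why Greg notes the filter "should not cause problems."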
@darwich936 · 4 years ago
Hi Greg, where can I download your virtual studio? I need to try something with the camera tracker!
@GregCorson · 4 years ago
The virtual studio project is here: github.com/MiloMindbender/UE4VirtualProduction. If you want this particular virtual studio model, it is in Epic's "Virtual Studio" sample, which you can get from the "Learn" tab of the Unreal launcher.
@darwich936 · 4 years ago
@@GregCorson THANK YOU
@ngochoaitran3477 · 4 years ago
Dear Sir, I downloaded your project and opened it, but I have problems. I can't find the same virtual studio when I open it. Can you tell me how to get the same setup as in your video? For now I'm trying Aximmetry DE. Thanks Sir.
@GregCorson · 4 years ago
The studio you see behind me in this video is not included in my project because it is from Epic's Unreal marketplace. Their license doesn't let me distribute their content. This studio is available free from Epic though. Go to the "learn" tab of the epic launcher app and scroll down to the Virtual Studio example. All the sets you see in my demo are in there.
@GregCorson · 4 years ago
I'm just putting up Tutorial 5, which shows how to use your own levels as virtual sets. I was thinking of your question when I did it, so I did the demo using the Unreal Virtual Studio set that you are asking about. You should be able to follow those instructions to get the set you want.