Make your Unreal Virtual Production more realistic. A tool to match real lights with CG ones. 

Greg Corson
6K subscribers · 3.3K views

Published: Oct 27, 2024

Comments: 34
@Tenchinu · A year ago
dude… you're just the biggest lifesaver!!! vid is easy and clear to follow :) can't tell you how many times I go back to your old vids and come back to keep learning.
@iPEMiC. · A year ago
Wow, nice explanations. Awesome!
@PhilipLemon · A year ago
This is the absolute best unreal virtual production channel around. Thanks heaps for all your work. So inspiring.
@GregCorson · A year ago
Thanks a lot!
@otegadamagic · A year ago
This is brilliant, as always.
@adorablemarwan · A year ago
Maaaaannn, I didn't know you could do that!!! Awesome
@Ryoma0z · A year ago
Greg this is too insane!!
@sohrabhosseini981 · A year ago
Nice, Greg.
@loveit8602 · A year ago
This is amazing!
@ewaughok · A year ago
Great tutorial! Very helpful!
@mariorodriguez8627 · A year ago
Great work, thank you!!
@weeliano · A year ago
Awesome tutorial! Thank you for sharing! I wonder if a similar method could be used to link/live-sync the color temperature or intensity of the lights. That would make lighting for Virtual Production even better!
@GregCorson · A year ago
I think that if you have a "reference" white object in your Unreal set and a matching one in your real studio, you could compare the two and adjust the lighting till they match. This could potentially be done automatically, but if the lighting setup is complicated it might require help from a trained eye. An editor utility widget could help with the adjustments. In my widget I just select one thing, but you could make one that lets you select a group of lights and adjust their settings (like color temperature) all at the same time. This might not work in some setups though, because light that starts out white can bounce off a floor or wall before hitting your talent, and if the "bounce" surface is colored it can change the temperature of the light reaching the talent.
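For anyone who would rather script the "group of lights" idea than build a widget, here is a minimal sketch using Unreal's Python editor scripting (requires the Python Editor Script Plugin). The function name is made up for illustration; the unreal calls are standard editor-scripting API, but treat this as a sketch, not Greg's actual widget:

```python
import unreal

def set_selected_lights_temperature(kelvin):
    """Set color temperature on every light actor currently selected in the editor."""
    actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
    for actor in actor_sub.get_selected_level_actors():
        light = actor.get_component_by_class(unreal.LightComponent)
        if light is None:
            continue  # selection includes a non-light actor, skip it
        light.set_use_temperature(True)  # switch the light to temperature mode
        light.set_temperature(kelvin)

# Example: warm all selected lights to roughly tungsten-balanced.
set_selected_lights_temperature(3200.0)
```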
@michaeltorp3863 · A year ago
Thanks for all your great tutorials. They're really helpful for getting into the world of VP. I am having some trouble with the tracked AprilTag. The data coming in from Retracker is in negative values, and as far as I can see, they are relative to the tracking camera, which makes sense. But in Unreal they are relative to the world zero position, and therefore placed under the floor. How do I set the AprilTag relative to the camera position in Unreal? I really hope you can help me out. Thanks!
@GregCorson · A year ago
I think you missed the part about setting the world pose. The Bliss feeds coordinates relative to the tracking camera's start position, and the position of the AprilTag is sent relative to the camera's current position. But I believe in this video I mention setting worldpose; this sets the tracking origin point in your studio (usually a spot on the floor), and after that is set up, all the coordinates you get should be relative to that spot. There are several ways to do this, and I believe I covered them in the video. If you are still stuck, I'll go back and dig out some specific references in the video. Let me know.
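To make the geometry concrete: once a world pose is set, every tracker reading gets re-expressed relative to that origin instead of the camera's start position. A toy sketch of that re-mapping (illustrative NumPy math only, not Retracker or Unreal code; all numbers are made up):

```python
import numpy as np

def tracker_to_world(origin_rot, origin_pos, point_in_tracker_space):
    """Re-express a tracker-relative point in world space.

    origin_rot: 3x3 rotation of the tracking origin's axes in world space
    origin_pos: world position of the origin (the chosen spot on the studio floor)
    """
    return origin_rot @ point_in_tracker_space + origin_pos

R = np.eye(3)                          # assume origin axes aligned with world axes
origin = np.array([100.0, 0.0, 0.0])   # origin spot in world coordinates (cm)
tag = np.array([200.0, 0.0, 120.0])    # tag position relative to the origin (cm)
print(tracker_to_world(R, origin, tag))  # -> [300.   0. 120.]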
@genieeye5783 · A year ago
Great job Greg! It's really a very helpful instruction for lighting the virtual scene much more easily. But most of my situations are the opposite way. For example, a virtual scene is already lit, and I need to light the character (a real person) to match the BG. I have difficulty doing this all the time. I've seen a tutorial that talked about light contrast (the NIT thing) which is a bit helpful. Hope that you could do a tutorial about virtual production compositing. Have a nice day!
@GregCorson · A year ago
You are right, this can be more tricky. Matching the real-world studio lighting to a virtual set usually requires some guesswork and measuring to get your studio lights to match. One method that can help is setting up light probes. That is, in the real world you get a grey calibration ball and (optionally) a mirror ball. Put these in the real studio where your talent will be, and also create the grey and mirror ball in the same place in your virtual set. You can use this method to make sure the balls in the virtual set are in the right place. Once you have these, adjust your studio lights till the balls in the studio and in the virtual set are a good match. You can also make a shadow gauge, just a vertical rod sticking out of a flat surface, for Unreal and your studio. This will let you see the shadows being cast by your lights so you can adjust the lights in both places till the shadows match.
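The CG half of that probe setup can be dropped in with a few lines of Unreal Python. A hedged sketch (the function and actor labels are made up; assigning the grey and chrome materials is left out for brevity):

```python
import unreal

def spawn_probe_ball(location_cm, label, scale=0.3):
    """Drop a sphere in the level to act as the CG half of a light probe."""
    actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
    ball = actor_sub.spawn_actor_from_class(unreal.StaticMeshActor, location_cm)
    mesh = ball.static_mesh_component
    mesh.set_mobility(unreal.ComponentMobility.MOVABLE)  # allow swapping the mesh
    mesh.set_static_mesh(unreal.load_asset('/Engine/BasicShapes/Sphere.Sphere'))
    ball.set_actor_scale3d(unreal.Vector(scale, scale, scale))
    ball.set_actor_label(label)
    return ball

# Put the probes where the talent will stand (centimeters, world space).
spawn_probe_ball(unreal.Vector(0.0, 0.0, 150.0), 'GreyProbeBall')
spawn_probe_ball(unreal.Vector(0.0, 30.0, 150.0), 'MirrorProbeBall')
```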
@genieeye5783 · A year ago
@GregCorson I see. I'll try that next time. Again, your tutorial is a textbook of VP. Cheers!
@AdamSmith-pn5hk · A year ago
@GregCorson This would be a wonderful tutorial as well! Always 🔥! Learning a ton and loving it! Cheers
@eliteartisan6733 · A year ago
Hi, could you tell me, or point me to a video teaching, how to put a metahuman into a real studio?
@GregCorson · A year ago
Actually, the process of putting a metahuman (or any CG thing) against a live video background is very nearly the same as regular virtual production. Start by creating a "layer" in Unreal and put just your metahuman in that layer. The Composure setup is almost exactly the same as for putting a live person in a CG background; you just need to reverse the order of the foreground and background plates in Composure. You want to make the video the background plate and the CG the foreground one. For the CG plate, set it to render only what is in the layer. This will render the metahuman with everything else see-through. For the video plate, just turn off chromakey. This will render the metahuman on top of the video feed; if your camera is aligned and tracking properly, it should look good unless someone walks in front of the metahuman. If you want an example, you can download the Live Link Bliss sample here github.com/MiloMindbender/LiveLinkPlugins/releases/tag/release_3 and the sample project has this setup in it. It's meant to be used with the Bliss tracker, but you can use the same setup with any tracker. If you already have a virtual production project set up you don't really need this; just swap the order of the foreground and background plates like I said above and you are almost done.
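The layer step is the only scriptable part; the plate reordering is done in the Composure UI as described above. A minimal Unreal Python sketch (the layer name is hypothetical; select the metahuman in the editor before running it):

```python
import unreal

LAYER_NAME = 'MetaHumanOnly'  # hypothetical layer name for the CG plate to render

layers = unreal.get_editor_subsystem(unreal.LayersSubsystem)
layers.create_layer(LAYER_NAME)

# Add whatever is selected (the metahuman) to that layer.
actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actor_sub.get_selected_level_actors():
    layers.add_actor_to_layer(actor, LAYER_NAME)
```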
@brettcameratraveler · A year ago
Is there a method to measure and reverse-engineer the characteristics of a real-world light (softness, angle, etc.) and then recreate it in Unreal? And could that same light's RGB (real vs virtual) be kept dynamically in sync with Unreal via DMX?
@GregCorson · A year ago
The usual method used in VFX for matching lighting is a combination of "light probes" as well as HDRI images taken on set. Light probes are usually done with a combination of neutral gray and mirrored ball photographs taken on the set. You can then put CG light probes in the scene and attempt to match them to a real one. A lot of VFX people also use 360 cameras to take HDRI images at various places in the set. These can be used for image-based lighting and also reflections. I'm not aware of any system that does this automatically with DMX.

Thankfully, lighting does not have to be "perfect" to look realistic. If you are trying to match a set of installed lights, the best approach might be to put light probes on the stage, turn on one light at a time, and then photograph them. This would give you some idea of how much each light contributes to the lighting at each light probe location. Using software, you could probably come up with settings for each light that would match a light probe from an Unreal or other scene. This could get pretty complicated, and I don't know of any software that does it automatically. In VFX I don't think there is any magic bullet for getting lighting exactly right automatically; someone with a good eye almost always has to adjust the lighting and software settings to get it to look correct.

Prior to Unreal supporting ray tracing, Lumen, and other kinds of global illumination, it was sometimes necessary to put dozens or even over a hundred "simulated" lights in a CG scene to make it look right. With global illumination you can get pretty good results just by matching the position/orientation of the real lights and the CG scene lights, as shown here. There are ways to exactly measure the characteristics of a light fixture and use them to light a CG scene extremely accurately; this is most often done by architects when trying to simulate lighting for buildings and offices. So far as I know, you can't do this in real time yet because the full characteristics of a light source are too complicated. However, as I mentioned before, with a little hand-tuning and a good setup, the lighting offered by UE 5.1 can look "good enough" for most uses.
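That "one light at a time" measurement is effectively a linear system: the brightness at each probe is roughly a weighted sum of each light's individual contribution, so per-light dimmer settings can be solved for numerically. A toy sketch with made-up numbers (not any shipping tool):

```python
import numpy as np
from scipy.optimize import nnls

# contributions[i, j] = brightness at probe i with only light j on at 100%
# (measured by photographing the probes one light at a time; numbers invented)
contributions = np.array([
    [0.80, 0.10, 0.30],
    [0.20, 0.70, 0.25],
    [0.05, 0.15, 0.60],
])
# Target probe brightness sampled from the Unreal scene's CG probes.
target = np.array([0.55, 0.45, 0.35])

# Non-negative least squares: lights can't have negative intensity.
settings, residual = nnls(contributions, target)
print('per-light intensity (0..1):', settings.round(3), 'residual:', round(residual, 4))
```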
@brettcameratraveler · A year ago
@Greg Corson Thank you! I'm mostly doing live actors on a green screen with Vive-tracked cameras. Getting the lighting right on the actors is a must, otherwise it just looks cheesy. Dynamic lighting in a CG scene makes that even more difficult. I'm trying to make everything real-time so a camera operator can do their magic and "find shots" on impossible sets. I'm picturing attaching a Vive tracker to something like a Titan tube that is wireless DMX, and these tubes can animate RGB light. Perhaps taking the light probes in Unreal and translating that to the appropriate light values on the Titan tube based on where it is in real space. Figure out the common color system and use the right equations, and it might make for a dynamic lighting system that reacts to the CG environments you drop into Unreal. Will see if these pieces can be brought together :)
@GregCorson · A year ago
Many people have attached trackers to a "real" light and connected it to a CG light so that when you move the real light the CG one moves too. In one of my early tutorials I did this with a tracker attached to a flashlight; it works very well! I'm just starting to mess with DMX lights so that Unreal can control real-world lights in real time. You can easily sync up real-world and in-game lighting effects this way.
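The "tracked flashlight" idea boils down to copying one actor's transform onto another every frame. A hedged editor-scripting sketch (the actor labels are hypothetical; in practice the tracker proxy would be driven by Live Link or a similar source):

```python
import unreal

def find_actor(label):
    """Find a level actor by its editor label (labels below are made up)."""
    actor_sub = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
    for actor in actor_sub.get_all_level_actors():
        if actor.get_actor_label() == label:
            return actor
    return None

tracker_proxy = find_actor('TrackerProxy')   # actor the tracker drives
cg_light = find_actor('StudioKeyLight')      # the matching Unreal light

def sync(delta_seconds):
    # Snap the CG light to wherever the tracked real light currently is.
    cg_light.set_actor_transform(tracker_proxy.get_actor_transform(), False, True)

handle = unreal.register_slate_post_tick_callback(sync)
# ...later: unreal.unregister_slate_post_tick_callback(handle)
```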
@brettcameratraveler · A year ago
@Greg Corson Thank you. I've been looking into OCIO + 18% grey balls with a color chart, in hopes of coming up with a system that can take a given light and quickly make a perfect digital twin of it around a common RGB standard, so that when I use it in the future, all the RGB changes I make using DMX end up being final-pixel accurate in camera. Trying to take as much of the long manual guesswork out of each and every setup. How have you gone about calibrating to an RGB standard when it comes to your RGB lights and final pixel?
@GregCorson · A year ago
@brettcameratraveler I have really not tried to do RGB calibration yet. "Perfect" can be very hard and still requires manual adjustment. "Good enough" is probably possible, though matching the exact characteristics of a light is always going to be tough.
@iPEMiC. · A year ago
Was wondering why we still can't use the iOS tracking system for Virtual Production, combined with a real camera plus Composure, for everyone here who can't afford a RealSense or HTC. Will try asking Unreal, because this would be the solution for everyone. Thank you for sharing with us!
@GregCorson · A year ago
Actually, you could use the iOS (or Android) AR tracking libraries to create a camera tracker for Unreal. So far as I know, though, nobody has published an app that does this. Usually people put together a phone app for their own use but don't publish it. The Unreal "virtual camera" app could potentially be used as a camera tracker, but last time I tried I found they had built it so the camera tracking couldn't be used independently of the virtual camera functions. I think they have updated this app since then, but I don't know if they added this feature or not.
@iPEMiC. · A year ago
@GregCorson I see, I recently saw an app called Skyglass for iPhone, but that actually works outside of Unreal. I will try the AR tracking libraries for the camera tracking. Which app did you mention, please?
@GregCorson · A year ago
It might be possible to use this app apps.apple.com/us/app/live-link-vcam/id1547309663 but I am not sure. It is intended for virtual cameras. Last time I checked, you could not use the tracking without the virtual camera part being turned on, so it may not work, but it might be worth a look.
@iPEMiC. · A year ago
@GregCorson Nice, I'm gonna look at it!
@iPEMiC. · A year ago
@GregCorson Actually, this is exactly what I used.