Here you will find Virtual Production tutorials and demos to help you get started. Virtual Production is used by everyone from small YouTube channels all the way up to big-budget Hollywood productions. Some of the things you can do include combining live video of yourself with a CG scene, adding animated CG characters to live video, and doing high-quality "all CG" productions.
What's really great is that, for the first time in history, the tools Hollywood is using are available free for you to learn and use!
All you need is Epic's Unreal Engine (free download!) and a PC. Just about any PC that can play 3D games will work; of course, higher-end computers will be able to do more at higher quality. If you want to do effects with live video you will need a camera (even a webcam will do). You can add on a lot more hardware and software for higher quality and more elaborate productions, but even with basic gear you can do a lot, even shoot your own indie movie!
Well done Greg. I found this extremely useful. I was having a hell of a time trying to position the hands of my talent on top of a table for a TV newscast. What version of UE did you use for this video? I am using UE 5.1.
This setup brings the camera video in on SDI3, composites it live with the CG animation in Unreal, and then Unreal outputs the video to HDMI1, where my Ninja V records it. You could also record the output from your video card's HDMI output, but the Blackmagic card supports more high-quality video formats than most video cards do. I have tested bringing camera video in through HDMI, which also works. I'm not sure what comes out of the HDMI output port in this setup, I will have to try it. My guess is that you can't input and output at the same time, or that the output is just a copy of the input.
Hey guys, I've been experimenting with UE 5.4.2 today. After setting everything up and migrating my lens files over from an existing project created with 5.1, I wanted to set my camera alignment with the Aruco tag method, only to repeatedly get an error message stating: "Failed to resolve a camera position given the data in the calibration rows. Please retry the calibration." It works fine with the same lens files in the original 5.1 project. Has anyone else experienced this?
Sorry for being late to reply. I have heard some other people complain about a similar problem with camera alignment but I haven't had time to test it yet. I changed my camera setup around and have to get everything recalibrated so I can try it myself.
This is fantastic. Does any of this require a paid plugin, or is it all included within UE 5 now? Greg, have you figured out a way to get the UE VCam app to record video? It'd make getting started with VP so much easier than having to use an AJA card and a DSLR. Cheers for this amazing tutorial man.
No special plugins are required; everything you need is included with Unreal except for the ArUco tag asset I built and made available through my GitHub (linked in the description). As for VCam, sorry, I don't know if there is a way to do this, I haven't looked into it.
@DdawgRealFX If you use MHA you need to create a MetaHuman Identity for the face, but you can keep using that identity unless the actor being captured changes.
Hello! I'm following your tutorial but using UE 5.4, and I'm having some trouble with the Live Link Component FIZ. In my case the Camera role doesn't detect a single controller. I'm aware that I need to add a Lens component to the camera, but that doesn't work. The Live Link itself does work, though; the camera moves in sync with my phone, so I'm not sure what I did wrong.
Great content and delivery. Thank you for all these videos; they make me feel supported in bringing VP into my work. I've been considering whether there's a way to bring augmented reality into a live theatre production (think a Dogville-type set) where the audience can move a screen/camera around to uncover the set. Just wondered if you had any thoughts on that at all, and insight into how latency might affect this if the audience is viewing the real and filmed actor via Unreal. Thanks again.
Hi Greg, I'm looking at full-body mocap and just discovered the Sony Mocopi. Do you still use it? What are the pros and cons? Have there been any updates to improve it since you made this video? Thanks.
There have been a few updates since this video but I haven't tested them yet; I'm planning to do that soon. The main pro is that it is very easy to put on, take off, and wear. You can literally do mocap anyplace using just Mocopi and your phone. The main disadvantage is that it only has six tracking points, so it doesn't capture finger, ankle, or wrist motion; you have to add that afterwards. Also, you still have to use it with a phone, it doesn't connect directly to a PC. You can do live mocap and send it directly to the PC, but the phone always has to be running. For a while I was having problems with disconnections during use, but found out that was being caused by a metal phone stand. The stand had never caused problems before, but it did with Mocopi. I switched to a plastic stand and the problem went away.
@gregcorson - So great - thanks! I can't seem to find the axis_guide static mesh. I've created projects in 5.1 - 5.4, and then looked at meshes in the starter content - but don't ever see that... Thoughts on where that static mesh is located?
If you go to the content browser and search for "Axis" it should be there, but it's in "Engine Content", so you don't have to load Starter Content to see it. If it doesn't show up, go to the settings in the upper right of the content browser and make sure "Show Engine Content" is checked; then it should appear.
Always love seeing these vids in my notifications. I always have to go back and watch them a couple of times to get it right, so I'm looking forward to rewatching this in the future.
I've also been looking into backlit green screens. There are a few products available, but what do you think of this front projection idea? ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-rJKLEZnsVsQ.htmlsi=GPWku1p7EprMQNz8
Yeah, as you said, the reason most people do not use green lights on a green screen is that you have to be much more careful to control the light so that it only hits the green screen. That is often difficult in smaller spaces, when a green floor is also used, and when productions are moving quickly.
If the screen and floor are both backlit, it could work for floors too. In general, though, spill seems to be an issue with either green or white light. When it works, backlighting does seem to make keying a bit easier because it reduces shadows and increases green saturation, making the green a more pure color. It can also let you drop the brightness of the green screen, which helps reduce spill. Front lighting with green is not a good idea if the floor is visible, but backlighting could work. Many people said this would make spill way worse even when you were not using a green floor; from this test it doesn't seem to change the amount of spill at all. This was just an initial test, I'll be doing more in the future. I've built a test "linear/bank" light using aluminum channel and green LED strips. It seems to work pretty well so far and gives a fairly even light. I'll do a follow-up once I get the full setup built. These lights can be made with either green or white LEDs, of course.
@GregCorson If done right, yes, a more pure green would help the key, but backlighting the floor, if it's green, will still be much more of a direct green spill issue than more natural, forgiving white light. When it comes to bounced spill, green on green shouldn't be any more of a spill issue than white light; in reality, though, there is often some direct spill coming from the lights themselves if they aren't controlled perfectly, and if that light is green, then yes, your green spill issue is going to be worse. If you really wanted to be picky, the ideal green screen would likely be a green, opaque material evenly backlit with green LEDs on a dimmer, or an LED volume with a floating/tracked green. However, it would be expensive to make very large, and even then you would still want a practical set for the floor.
I think the best solution in the end is a diffuse green material backlit by green LEDs. This is better than bounce-lighting the green screen (with any color of light) for several reasons. The biggest is that with a lightbox about a foot deep you can get a very evenly lit green screen, which can be hard to do with bounce lights. The other big advantage is that a backlit green screen works at a lower brightness level, so it generates less spill. In any case, it's still something that has to be tested; it's not something people can accept or dismiss on intuition alone. I have run into a number of professionals who have actually used green light in various situations and say it works, it's just not for everything.
@GregCorson I love testing things rather than working off assumptions. At the same time, green screen is a technique with decades of history, so these questions have most likely been explored already. If the best-quality solution is rarely used, it might be for good reason, though it can still be relevant to some. In pursuing the best-quality technique, I would say the screen material not only needs to look uniform while backlit, but also must not be so reflective that it picks up the camera-side lights. This material and/or technique is likely riding a fine line.
That's cool man! D'you happen to know how to get her hair working for the control rig? When I load the character control rig into the scene, it comes without her hair. Is there a way to connect the hair back to the rig? Using UE 5.4 currently. Thanks.
Hi, I haven't used Echo since upgrading to UE5 because Epic took a long time updating her. According to the current Marketplace listing she is only supported for 5.0-5.2, which might be why she doesn't work in 5.4. Kind of a shame, because I liked this character's style. Sometimes you can use a model on an unsupported engine version, but since Echo is fairly complicated and there have been a lot of updates to Control Rig, animation, and other things, I don't think you can count on it. It looks like, for now at least, MetaHumans are your best bet for detailed, rigged characters.
Not sure if this will help, the "Slay" example in the marketplace is listed as 5.4 compatible and contains Echo. Don't know if this is an updated version or not but you might give it a look. Let me know what you find out.
@GregCorson Yeah, I tried that recently. It seems to only have an FK rig available as opposed to the Control Rig. Still figuring this stuff out. Thanks for the heads up :)
I use an A7R4. When you are putting in the sensor size/resolution, you want to use the size of the part of the sensor that is active. If the sensor gets cropped to create a 16:9 image, then you want that cropped size. Most full-frame 35mm cameras just crop some off the top and bottom and use the full width of the sensor, so you can usually get the right numbers by dividing the full sensor width by the aspect ratio you are shooting in to get the active height. Not all cameras are the same though, so check your manual to see how much of the sensor is active in the shooting mode you are using.
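In case a worked example helps, here's a tiny sketch (plain Python; the 36 mm figure is just the typical full-frame sensor width, not a number from any specific camera in this thread) of that width-divided-by-aspect-ratio calculation:

```python
# Rough filmback estimate for a full-frame camera that crops to 16:9
# while keeping the sensor's full width. Substitute your camera's
# actual active sensor width from its manual.
sensor_width_mm = 36.0          # typical full-frame width (assumption)
aspect_ratio = 16.0 / 9.0       # the aspect ratio you are shooting in

active_height_mm = sensor_width_mm / aspect_ratio
print(f"Filmback: {sensor_width_mm} x {active_height_mm:.2f} mm")
# -> Filmback: 36.0 x 20.25 mm, the kind of numbers you would enter
#    for the sensor/filmback size in this shooting mode.
```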
Hello Greg, I am about to purchase the REtracker Bliss, but first I wanted to hear the opinion of a user who actually has it. Do you have the Indie license or the regular REtracker Bliss license? I am considering using it with an LED screen for nDisplay ICVFX (not with a chroma key setup).
Hi, I use Bliss with a green screen setup and like it very much; I know others are using it with an LED wall. Since I worked with the RETracker folks to create it, they gave me a full license. I understand the main difference with the Indie license is the lack of hands-on video chat support. The only real issue I can think of with an LED wall is that REtracker requires something stationary to track, so if you have a large wall, ceiling, and floor that move, or use a very large frustum, you might have minor issues. Usually RETracker can track whatever stationary practical elements you have in your set. It's also possible to point it at the ceiling, floor, or off to the side if facing directly toward the LED wall causes issues. You can also place tracking markers (pieces of paper with high-contrast patterns printed on them) in places Bliss can see but that are out of shot for your main camera, for example at the top of the background screen or on the ceiling or floor.
@GregCorson Thanks for the response, Greg, and for all the information; it is very valuable to have the opinion of someone who uses it in real cases. Could you give me the contact of someone who uses it with an LED screen, so I can ask them some questions?
I can't think of anyone specific who is using REtracker with an LED screen; I suggest you ask Marwan Rassi at www.retracker.co/ for some references near you.
I'm having trouble with this in Unreal 5.3.2; although it appears to detect the Aruco (it pops up when I do Show Detection), the four points don't load in. I tried changing the calibrator to Nodal Offset Points, and doing each of the four Aruco points manually, but that crashed as soon as I did Apply To Calibrator Parent. Any ideas?
I'm not certain, but I know there will be problems with the ArUco calibrators if the lens calibration plugin wasn't enabled before you imported my pre-made ArUcos into the project. I'm reworking my sample projects right now for 5.3 and 5.4; I'll let you know if I find anything weird.
Hi Greg, I seem to be running into a weird issue where the values for "Raw FIZ Input" are all "No ___ Input", even though the FIZ controller still totally works and updates the camera's view inside the lens editor. Have you seen this before? Is it a problem? Also, the Camera Feed Info seems to max out at 1920x1080, even once the Camera Info and the camera's Sensor Back are set to a higher resolution. Is that just an artifact of Unreal using fixed-resolution distortion maps, or does that need to be configured somewhere else?
Hi, I'm not really sure about this. I haven't been doing virtual production for a bit (I'm starting back up now), and Epic has made a number of incremental changes across Unreal 5.0-5.4 that alter small details of how things work. I'll try to follow up on this as I update my tutorials for 5.4.
@JMY1000 Hi, I saw this issue as well. I noticed there was a message under "Lens Component" saying there wasn't a lens component assigned to feed those values. I did some poking around, added a lens component to the CineCameraActor we're working with, and that seemed to solve it. I selected the "CineCameraActor" in the Outliner, hit "Add" at the top of the "Details" panel, typed "Lens", selected the custom "Lens" component, and then assigned the "MyLens" file (the one we built in this tutorial) under the "Lens File" setting. Hope this helps you!
@MDLabStudios1 Thanks! I'm not working on this project at the moment, but if I do get back to it I'll give this a try and see if it fixes things.
Hello, I'm trying to build a small virtual production setup and your tutorial has been a tremendous help. Thank you. I have a question: I built the blueprint following your tutorial and it worked just fine, but is there a way to run it without entering Play mode? I've searched the internet and the construction script is supposedly what makes that possible, but it wasn't working for me.
Hi, this tutorial is one of the older ones; you might want to look at my newer tutorials for Unreal 5.x. The way Live Link works has changed a bit and the setup for everything is simpler. It pretty much runs all the time now without pressing Play.
Great video! Is the nodal offset meant to fix the problem that most camera rigs' rotation center (the pan and tilt axes) is not aligned with the lens's nodal point? There must already be a solution for this, because although we can rig the camera so the nodal point is aligned while the camera is on a tripod or jib, the nodal point will never stay aligned when we handhold the camera.
For virtual production it isn't important to know the offset from the pan/tilt axis of the tripod. What you need is the offset from the lens nodal point to the tracking center of your tracker device, for example the tripod screw on a VIVE tracker. In a VP project Unreal gets the position of your tracker, and the offset to the lens nodal point lets UE place the nodal point of its virtual camera in the same place as your real-world camera's. Without the offset the UE camera would be where the tracker is, which would be wrong and cause slipping. The correct nodal-point-to-tracker offset works for any kind of camera mount, including handheld.
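If it helps to see what's happening with that offset, here's a minimal sketch (plain Python with numpy, made-up numbers, not Unreal's actual API) of composing the live tracker pose with the fixed tracker-to-nodal-point offset to place the virtual camera:

```python
import numpy as np

def pose(rotation_3x3, translation_xyz):
    """Build a 4x4 rigid transform from a rotation matrix and a translation."""
    m = np.eye(4)
    m[:3, :3] = rotation_3x3
    m[:3, 3] = translation_xyz
    return m

# Live pose of the tracking device (e.g. a VIVE tracker) in world space,
# updated every frame by the tracking system (values here are made up).
tracker_in_world = pose(np.eye(3), [100.0, 0.0, 150.0])

# Fixed offset from the tracker's origin to the lens nodal point,
# measured once during nodal offset calibration (also made-up values).
nodal_in_tracker = pose(np.eye(3), [2.0, -5.0, 12.0])

# The virtual camera goes at the nodal point, not at the tracker itself.
camera_in_world = tracker_in_world @ nodal_in_tracker
print(camera_in_world[:3, 3])  # world-space position of the lens nodal point
```

Because the offset is rigid with respect to the camera body, the same composition holds whether the rig is on a tripod, a jib, or handheld.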
This is hella cool. The only issue I see is that the neck collides with and clips through the collar of her shirt in MH Animator. But aside from that it's really, really cool. Thank you for sharing this demo! I can't wait to try them out.
Actually, that clipping only happens because only the head is being animated here. If the whole character were being animated, the shirt wouldn't clip; it would follow the head.
Is this meant for static cameras only? I can get it to line up perfectly, but as soon as I pan or tilt the camera, it goes out of alignment, even with lens distortion and the nodal offset set up.