Greg Corson
Here you will find Virtual Production tutorials and demos to help you get started. Virtual Production is used by everyone from small YouTube channels all the way up to big-budget Hollywood productions. Some of the things you can do include combining live video of yourself with a CG scene, adding animated CG characters to live video, and doing high-quality "all-CG" productions.

What's really great is that, for the first time in history, the tools Hollywood is using are available free for you to learn and use!

All you need is Epic's Unreal game engine (a free download!) and a PC. Just about any PC that can play 3D games will work; of course, higher-end computers will be able to do more at higher quality. If you want to do effects with live video you will need a camera (even a webcam). You can add on a lot more hardware and software for higher-quality and more elaborate productions, but even with basic gear you can do a lot, even shoot your own indie movie!
Niagara Demo
1:46
1 month ago
Green Screen with Green Lighting?
2:33
2 months ago
Virtual Piano Player Animated by AI!
3:16
2 years ago
Comments
@linfordfevrier5011
@linfordfevrier5011 5 days ago
Well done Greg. I found this extremely useful. I was having hell trying to position the hands of my talent on top of a table for a TV newscast. What version of UE did you use for this video? I am using UE 5.1.
@ArianaMusicNetwork
@ArianaMusicNetwork 10 days ago
Thanks for sharing
@taekjoolee6620
@taekjoolee6620 14 days ago
I sometimes use the DeckLink family to develop cloud video encoders these days. Thanks for the good explanation, Greg.
@cucobein
@cucobein 15 days ago
Amazing, thank you! Can it record and monitor simultaneously through HDMI in and HDMI out?
@GregCorson
@GregCorson 14 days ago
This setup brings in the camera video on SDI3, composites it live with the CG animation in Unreal, and then Unreal outputs the video to HDMI1 where my Ninja V records it. You could also record the output from your video card's HDMI output, but the Blackmagic card supports more high-quality video formats than most video cards do. I have tested bringing camera video in through HDMI, which also works. I'm not sure what comes out the HDMI output port in this setup; I will have to try it. My guess is that you can't input and output at the same time, or that the output is just a copy of the input.
@Tutuba
@Tutuba 17 days ago
Thank you, Greg!
@mingli1492
@mingli1492 1 month ago
Hey guys, I've been experimenting with UE 5.4.2 today. After setting everything up and migrating my lens files over from an existing project created with 5.1, I wanted to set my camera alignment with the Aruco tag method, only to repeatedly get an error message stating: "Failed to resolve a camera position given the data in the calibration rows. Please retry the calibration." It works fine with the same lens files in the original 5.1 project. Has anyone else experienced this?
@GregCorson
@GregCorson 17 days ago
Sorry for being late to reply. I have heard some other people complain about a similar problem with camera alignment but I haven't had time to test it yet. I changed my camera setup around and have to get everything recalibrated so I can try it myself.
@behrampatel4872
@behrampatel4872 1 month ago
This is fantastic. Does any of this require a paid plugin, or is it all included within UE 5 now? Greg, have you figured out a way to get the UE VCam app to record video? It'd help getting started with VP so much more than having to use an AJA card and a DSLR. Cheers for this amazing tutorial man.
@GregCorson
@GregCorson 17 days ago
No special plugins are required; all the stuff you need is included with Unreal except for the Aruco tag asset I built and made available through my GitHub (linked in the description). As for VCam, sorry, I don't know if there is a way to do this, I haven't looked into it.
@jasoncorganbrown
@jasoncorganbrown 1 month ago
I'm unable to get my face data to be read in the animation curves. The debugger is picking them up, but they're not reflected in the Face Mesh panel.
@robertomachado7581
@robertomachado7581 1 month ago
Great tutorial! ... "MetaHuman Animator" should be called "MetaHuman face animator." MetaHumans' bodies suck.
@DdawgRealFX
@DdawgRealFX 1 month ago
Hi Greg… is a depth camera on one of the newer iPads and iPhones required to use MHA?
@GregCorson
@GregCorson 1 month ago
@@DdawgRealFX I think it is. I have been using an iPhone XS Max, which works until it overheats. Getting a phone cooler will fix that.
@DdawgRealFX
@DdawgRealFX 1 month ago
Understood. Was facial calibration required in metahuman identity or was that pretty much outta the box from capture? Ty.
@GregCorson
@GregCorson 1 month ago
@@DdawgRealFX If you use MHA you need to create a MetaHuman Identity for the face, but you can keep using that identity unless the actor being captured changes.
@DdawgRealFX
@DdawgRealFX 1 month ago
Awesome. Thanks very much.
@DdawgRealFX
@DdawgRealFX 1 month ago
Amazing demo. Freeze frame can even decipher the difference. Nice work.
@Cronogeo
@Cronogeo 1 month ago
Hello! I'm following your tutorial but using UE 5.4, and I'm having some trouble with the Live Link Component FIZ. In my case the Camera role doesn't detect a single controller. I'm aware that I need to add a Lens component to the camera, but it doesn't work. The Live Link itself does work though; the camera moves in sync with my phone, so I'm not sure what I did wrong.
@cuefogthe
@cuefogthe 1 month ago
Hey a full tutorial would be great!
@jamit63
@jamit63 1 month ago
YES INTERESTED
@hoseinrahmani8822
@hoseinrahmani8822 1 month ago
Excellent job Greg. I will be waiting for the complete video. Good luck.
@Z807-FN
@Z807-FN 1 month ago
How fun! Would love to see a tutorial, it's been years since I touched composure stuff
@prasithsay4741
@prasithsay4741 1 month ago
Fantastic 🎉 Wish to see the full tutorial. It's been so long since your last video.
@MrMarniche
@MrMarniche 1 month ago
Greg, you are fantastic. Surely interested in a full tutorial:-)) 🥳
@rjregner8515
@rjregner8515 1 month ago
Would enjoy seeing your approach, long time fan of your videos. All the best.
@hegelsholiday1981
@hegelsholiday1981 1 month ago
A tutorial would be great - I really like the demo.
@arthurlyrio
@arthurlyrio 1 month ago
Very interesting!
@zippyholland3001
@zippyholland3001 1 month ago
Very nice. Please do a tutorial ❤
@Tenchinu
@Tenchinu 1 month ago
As usual, looks amazing. Gotta go back and check your tuts to reach this level, but it looks awesome
@SocailInteruption
@SocailInteruption 2 months ago
Great content and delivery. Thank you for all these videos; they make me feel supported in bringing VP into my work. I've been considering whether it's a way to bring augmented reality into a live theatre production (think a Dogville-type set) where the audience can move a screen/camera around to uncover the set. Just wondered if you had any thoughts on that at all, and insight into how latency might affect this if the audience is viewing the real and filmed actor via Unreal. Thanks again
@JamesBrett2008
@JamesBrett2008 2 months ago
Hi Greg, I'm looking at full-body mocap and just discovered Sony Mocopi. Do you still use it? What are the pros and cons? Have there been any updates to improve it since you made this video? Thanks
@GregCorson
@GregCorson 2 months ago
There have been a few updates since this video but I haven't tested them yet; I'm planning to do that soon. The main pro is that it is very easy to put on, take off and wear. You can literally do mocap anyplace using just Mocopi and your phone. The main disadvantage is that it only has six tracking points, so it doesn't get finger, ankle or wrist motion; you have to add that after. Also, you still have to use it with a phone, it doesn't connect directly to a PC. You can do live mocap and send it directly to the PC, but the phone always has to be running. For a while I was having problems with disconnections during use, but found out that was caused by using a metal phone stand. This stand had never caused problems before, but with Mocopi it did. I switched to a plastic stand and the problem went away.
@DJE-Thizzz
@DJE-Thizzz 2 months ago
0:12 It worked! :)
@dougl9758
@dougl9758 2 months ago
@gregcorson - So great - thanks! I can't seem to find the axis_guide static mesh. I've created projects in 5.1 - 5.4, and then looked at meshes in the starter content - but don't ever see that... Thoughts on where that static mesh is located?
@GregCorson
@GregCorson 2 months ago
If you go to the content browser and search for "Axis" it should be there. But it's in "engine content", you don't have to load starter content to see it. If it doesn't show up, go to the settings in the upper right of the content browser and make sure "show engine content" is checked, then it should show up.
@dougl9758
@dougl9758 2 months ago
@@GregCorson Yep - that did it! Thanks GC
@Tenchinu
@Tenchinu 2 months ago
Always love seeing these vids in my notifications. I always have to go back and watch them a couple of times to get it right, so I'm looking forward to rewatching this in the future.
@AlejandroGuerrero
@AlejandroGuerrero 2 months ago
click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click, click,
@TEDDYKILLAH
@TEDDYKILLAH 2 months ago
Hey Greg, this will be your first million views video. :)
@CaptainSnackbar
@CaptainSnackbar 2 months ago
Welcome back Greg, good to see you making videos, and nice one with the clickbait :D
@creativetelesis6704
@creativetelesis6704 2 months ago
I've also been looking into backlit green screens. There are a few products available. But what do you think of this front-projection idea? ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-rJKLEZnsVsQ.htmlsi=GPWku1p7EprMQNz8
@brettcameratraveler
@brettcameratraveler 2 months ago
Yeah, as you said, the reason why most do not use green lights on a green screen is that you have to be much more careful with controlling the light so that it strictly hits the green screen. This is often difficult in smaller spaces, if a green floor is also used and when productions are moving quickly.
@GregCorson
@GregCorson 2 months ago
If the screen and floor are both backlit, it could work for floors. In general though, spill seems to be an issue with either green or white light. When it works, it does seem to make keying a bit easier because of the reduction of shadows and increased green saturation; it helps make the green a more pure color. Also, it can allow you to drop the brightness of the green screen, which helps reduce spill. Front lighting with green is not a good idea if the floor is visible, but back lighting could work. Many people said this would make spill way worse even when you were not using a green floor; from this test it doesn't seem to change the amount of spill at all. This was just an initial test, I'll be doing some more in the future. I've built a test "linear/bank" light using aluminum channel and green LED strips. This seems to work pretty well so far and gives a pretty even light. I'll be doing a follow-up on this once I get the full setup built. These lights can be made with either green or white LEDs, of course.
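Greg's point about green saturation can be illustrated with a toy difference key. This is my own minimal sketch, not the keyer used in the video: a common simple key computes `g - max(r, b)`, so a purer, more saturated green produces a stronger matte, while shadows and spill that desaturate the screen weaken it.

```python
# Toy difference key: illustrates why a purer green makes keying easier.
# This is a simplified sketch, not any production keyer's actual algorithm.

def difference_key(r: float, g: float, b: float) -> float:
    """RGB channels in [0, 1]. Returns a matte value in [0, 1];
    higher means 'more background' (more confidently keyed out)."""
    return max(0.0, min(1.0, g - max(r, b)))

# A well-lit, saturated screen pixel vs. one desaturated by shadows or spill:
print(difference_key(0.1, 0.9, 0.1))  # strong key for the pure green
print(difference_key(0.5, 0.7, 0.5))  # much weaker key for the washed-out green
```

The washed-out pixel keys far less cleanly even though it is still "green", which is why evening out the screen and boosting its purity (for example with backlighting) reduces matte noise.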
@brettcameratraveler
@brettcameratraveler 2 months ago
@GregCorson If done right, yes, a more pure green would help the key, but even back lighting the floor, if green, will be much more of a direct green spill issue than the more natural and forgivable white light. When it comes to bounce spill light, green on green shouldn't make for any more of a spill issue than white light - however, the reality is there is often some direct spill coming from the lights themselves if not controlled perfectly, and if that light is green, then yes, your green spill issue is going to be worse. If you really wanted to be picky, the ideal green screen would likely be a green, opaque material that is evenly back lit with green LEDs on a dimmer. Or a LED volume with a floating/tracked green. However, it would be expensive to make very large and, even then, you still would want a practical set for the floor.
@GregCorson
@GregCorson 2 months ago
I think the best solution in the end is diffuse green material backlit by green LEDs. This is better than bounce lighting the green-screen (with any color light) for several reasons. The biggest one being that with a lightbox about a foot deep you can get a very evenly lit green screen which can be hard to do with bounce lights. The other big advantage is that a backlit green screen will work at a lower brightness level. So it generates less spill. In any case though, it is still something that will have to be tested. This is not something that people can accept or dismiss out of intuition. I have run into a number of professionals who have actually used green light for various situations and say it works, it's just not for everything.
@brettcameratraveler
@brettcameratraveler 2 months ago
@GregCorson Love testing things rather than working off of assumptions. At the same time, the green screen is a multidecade technique. These questions have most likely been explored. Nevertheless, sometimes, if the best quality solution is rarely used, it might be for good reason - but still relevant to some. In your pursuit of knowing the best quality technique, I would say the screen material not only needs to be uniform while back lit but also not to the point that it's reflective on the side of the camera and other lights. This material and/or technique is likely riding a fine line.
@nataliatorres9135
@nataliatorres9135 3 months ago
This is SO well explained, and SO useful... Thank you very much for this video.
@GregCorson
@GregCorson 2 months ago
This is a pretty old video, you may want to check the more recent ones, there are a lot of them about nodal offset and tracking.
@G.I_BRO_SHOW
@G.I_BRO_SHOW 3 months ago
Bro you the best, this video should have a million likes, keep up the great work, p.s I Subbed
@JWPanimation
@JWPanimation 3 months ago
Thanks!
@JWPanimation
@JWPanimation 3 months ago
Thanks!
@AnimatorHeadSpace
@AnimatorHeadSpace 3 months ago
That's cool man! D'you happen to know how to get her hair working for the control rig? When I load the character control rig into the scene, it comes without her hair. Is there a way to connect the hair back to the rig? Using UE 5.4 currently. Thanks.
@GregCorson
@GregCorson 3 months ago
Hi, I haven't used Echo since upgrading to UE5 because Epic took a long time updating her. According to the current marketplace listing she is only supported for 5.0-5.2 which might be the reason she doesn't work in 5.4. Kind of a shame because I liked this character's style. Sometimes you can use a model on an unsupported engine, but since echo is fairly complicated and there have been a lot of updates to control rig, animation and other things, I don't think you can count on it. It looks like for now at least metahumans are your best bet for detailed, rigged characters.
@AnimatorHeadSpace
@AnimatorHeadSpace 3 months ago
@@GregCorson Ah, I see. Well I guess that makes sense. Thanks for the response, appreciate it.
@GregCorson
@GregCorson 2 months ago
Not sure if this will help, the "Slay" example in the marketplace is listed as 5.4 compatible and contains Echo. Don't know if this is an updated version or not but you might give it a look. Let me know what you find out.
@AnimatorHeadSpace
@AnimatorHeadSpace 2 months ago
@@GregCorson Yeah, I tried that recently. It seems to only have an FK rig available as opposed to the Control Rig. Still figuring this stuff out. Thanks for the heads up :)
@honghaixie8383
@honghaixie8383 3 months ago
Did you use an FX6? I use one and its sensor is 35.7*18.8. To match 16:9, should I change it to 35.7*20.08?
@GregCorson
@GregCorson 3 months ago
I use an A7R4. When you are putting in the sensor size/resolution you want to use the size of the sensor that is active. If the sensor gets cropped to create a 16:9 image, then you want that cropped size. Most full frame 35mm cameras just remove some from the top and bottom and use the full width of the sensor. So you can usually get the right numbers by dividing 35mm by the aspect ratio you are shooting in. Not all cameras are the same though, so check your manuals to see how much of the sensor is active in the shooting mode you are using.
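The division rule Greg describes can be written as a tiny helper. A hedged sketch: the 35.7 mm width is just the number from the question above, and the helper assumes the camera crops only the top and bottom of the sensor, which (as Greg notes) is not true of every camera, so check your manual.

```python
# Rule-of-thumb active sensor size: keep the full sensor width and derive the
# active height from the shooting aspect ratio. Assumes top/bottom-only crop,
# which does NOT hold for all cameras - verify against your camera's manual.

def active_sensor_size(full_width_mm: float, aspect_w: int, aspect_h: int):
    """Return (width_mm, height_mm) of the active sensor area."""
    return full_width_mm, full_width_mm * aspect_h / aspect_w

w, h = active_sensor_size(35.7, 16, 9)
print(f"{w:.2f} x {h:.2f} mm")  # 35.70 x 20.08 mm
```

For the 35.7 mm-wide example above, a 16:9 crop comes out to roughly 35.7 x 20.08 mm, matching the figure in the question.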
@danielrubinstein6714
@danielrubinstein6714 3 months ago
Hello Greg, I am about to purchase the REtracker Bliss, but first I wanted to know the opinion of any user who uses it. Do you have the Indie license or the regular REtracker Bliss license? I am considering it to use it with led screen nDisplay ICVFX (not with a chroma)
@GregCorson
@GregCorson 3 months ago
Hi, I use Bliss with a greenscreen setup and like it very much; I know others are using it with an LED wall. Since I worked with the RETracker folks to create it, they gave me a full license. I understand the main difference with the Indie license is the lack of hands-on video chat support. The only real issue I can think of with an LED wall is that REtracker requires something stationary to track, so if you have a large wall, ceiling and floor that moves or uses a very large frustum you might have minor issues. Usually RETracker can track whatever stationary practical elements you have in your set. It's also possible to point it at the ceiling, floor or to the side if facing directly towards the LED wall causes issues. You can also place tracking markers (pieces of paper with high-contrast patterns printed on them) in places the Bliss can see but that are out of shot for your main camera, for example at the top of the background screen or on the ceiling or floor.
@danielrubinstein6714
@danielrubinstein6714 3 months ago
@@GregCorson Thanks for the response, Greg! Thanks for all the information; it is very valuable to have the opinion of someone who uses it in real cases. Could you give me the contact of someone who uses it with an LED screen, so I can ask them some questions?
@GregCorson
@GregCorson 3 months ago
I can't think of anyone specific that is using REtracker with LED screen, suggest you ask Marwan Rassi at www.retracker.co/ for some references near you.
@JMY1000
@JMY1000 3 months ago
I'm having trouble with this in Unreal 5.3.2; although it appears to detect the Aruco (it pops up when I do Show Detection), the four points don't load in. I tried changing the calibrator to Nodal Offset Points, and doing each of the four Aruco points manually, but that crashed as soon as I did Apply To Calibrator Parent. Any ideas?
@GregCorson
@GregCorson 3 months ago
I'm not certain, but I know there will be problems with the Aruco calibrators if the lens calibration plugin wasn't enabled before you imported my pre-made Arucos into the project. I'm reworking my sample projects right now for 5.3 and 5.4; I'll let you know if I find anything weird.
@JMY1000
@JMY1000 4 months ago
Hi Greg, seem to be running into a weird issue where the values for "Raw FIZ Input" are all "No ___ Input", even though the FIZ controller still totally works and updates the camera's view inside the lens editor. Have you seen this before? Is that a problem? Also, the Camera Feed Info seems to max out at 1920x1080, even once the Camera Info and the camera's Sensor Back are set to a higher resolution. Is that just an artifact of Unreal using fixed-resolution distortion maps or something? Or does that need to be configured somewhere else?
@GregCorson
@GregCorson 3 months ago
Hi, I'm not really sure about this, I haven't been doing virtual production for a bit (I'm starting back up now) and there have been a number of incremental changes Epic made in Unreal 5.0-5.4 that have changed small things about the way things work. I'll try to follow up on this as I update my tutorials for 5.4
@JMY1000
@JMY1000 3 months ago
Cool, thanks! Love the tutorials, looking forward to it!
@MDLabStudios1
@MDLabStudios1 3 months ago
@@JMY1000 Hi, I saw this issue as well. I noticed there was a message under "Lens Component" talking about how there wasn't a lens component assigned to feed those values. I did some poking around and added a lens component to this CineCameraActor that we're working with and that seemed to solve it. I selected the "CineCameraActor" in the Outliner, hit "Add" at the top of the "Details" panel, typed "Lens", selected the custom "Lens", and then assigned the "MyLens" file (that we built in this tutorial) under the "Lens File" setting. Hope this helps you!
@JMY1000
@JMY1000 2 months ago
@@MDLabStudios1 Thanks! I'm not working on this project at the moment, but if I do get back to working on it I'll give it a try and see if that fixes things.
@99SBX
@99SBX 4 months ago
Thanks for the great comparison! Very informative.
@parametriq_
@parametriq_ 4 months ago
Hello, I'm trying to build a small virtual production setup and your tutorial has been a tremendous help. Thank you. I have a question. I built the blueprint following your tutorial and it worked just fine, but is there a way to run it without entering Play mode? I've searched the internet and the construction script is supposed to make that possible, but I couldn't get it working.
@GregCorson
@GregCorson 4 months ago
Hi, this tutorial is one of the older ones, you might want to look at my newer tutorials for 5.x unreal. The way livelink works has changed a bit and the setup for everything is simpler. It pretty much runs all the time without pressing play now.
@unityvirtualproduction
@unityvirtualproduction 5 months ago
Can you (or anyone reading) make any recommendation for a similar workflow using Unity Engine on macOS (Apple Silicon)?
@GregCorson
@GregCorson 4 months ago
I can't be much help with Unity because I've never used it. Sorry.
@jiaxinli8811
@jiaxinli8811 5 months ago
Great video! Is the nodal offset meant to fix the problem that most camera rigs' rotation center (the pan and tilt axes) is not aligned with the lens's nodal point? There must be a solution for this already, because although we can rig the camera so the nodal point is aligned when the camera is on a tripod or jib, the nodal point will never be aligned when we handhold the camera.
@GregCorson
@GregCorson 4 months ago
For virtual production it isn't important to know the offset from the pan/tilt axis of the tripod. What you need is the offset from the lens nodal point to the center of tracking on your tracker device, for example the tripod screw on a VIVE. In a VP project Unreal gets the position of your tracker; the offset to the lens nodal point allows UE to place the nodal point of its camera in the same place as your real-world camera. Without the offset the UE camera would be where the tracker is, which would be wrong and cause slipping. The correct lens-nodal-to-tracker offset works for any kind of camera mount, including handheld.
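The composition Greg describes can be sketched in a simplified 2D form. This is plain Python for illustration, not Unreal's actual Live Link or lens API, and the positions and angles are made up: the virtual camera's nodal point is the tracker's world pose composed with a fixed local offset measured once during calibration.

```python
import math

# Simplified 2D sketch of nodal-offset composition (not Unreal's actual API):
# world nodal point = tracker position + tracker rotation applied to the
# fixed tracker-to-nodal-point offset measured during calibration.

def apply_nodal_offset(tracker_pos, tracker_yaw_deg, offset_local):
    """Rotate the local offset by the tracker's yaw, then translate by the
    tracker's world position, giving the lens nodal point in world space."""
    yaw = math.radians(tracker_yaw_deg)
    ox, oy = offset_local
    wx = ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = ox * math.sin(yaw) + oy * math.cos(yaw)
    return (tracker_pos[0] + wx, tracker_pos[1] + wy)

# Hypothetical numbers: tracker at the origin, rotated 90 degrees, with the
# lens nodal point 0.10 m ahead of the tracker along its local X axis.
x, y = apply_nodal_offset((0.0, 0.0), 90.0, (0.10, 0.0))
print(round(x, 3), round(y, 3))  # 0.0 0.1
```

Because the offset is applied in the tracker's local frame, it stays correct no matter how the rig is mounted or moved, which is why the same calibration works handheld.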
@chelo111
@chelo111 5 months ago
Please bro, we need that tutorial ASAP
@GregCorson
@GregCorson 4 months ago
Working on some new stuff now, a lot going on here in the last month, I'll post an update soon.
@3DcgGuru
@3DcgGuru 5 months ago
Angelina Jolie?
@GregCorson
@GregCorson 4 months ago
Nope, it's not Angelina.
@digitalpanther
@digitalpanther 5 months ago
This is hella cool. The only issue I see is that the neck collides and clips through the collar of her shirt in MH Animator. But aside from that it's really, really cool. Thank you for sharing this demo! I can't wait to try them out.
@GregCorson
@GregCorson 5 months ago
Actually, that clipping only happens because only the head is being animated here. If the whole character was being animated the shirt wouldn't clip, it would follow the head.
@wavelogic8471
@wavelogic8471 5 months ago
Is this meant for static cameras only? I can get it to line up perfectly, but as soon as I pan or tilt the camera, it goes off alignment, even with lens distortion and nodal offset set up.
@GregCorson
@GregCorson 5 months ago
If you have some kind of camera tracking, this should work for moving cameras. If you don't it will only work for a static camera.