
Unreal 5.1 Virtual Production WITHOUT TRACKERS 

Greg Corson · 6K subscribers
9K views

Published: Oct 27, 2024

Comments: 56
@mpbMKE · 1 year ago
Very happy we found your channel, Greg! This is super helpful.
@da_noob_09 · 1 year ago
Thank you for all the amazing content Greg, your channel is the best place to learn VP. 🙏
@GregCorson · 1 year ago
Thanks! Hoping I'm helping people get started!
@victoralexandersilva5212 · 1 year ago
@@GregCorson you're amazing!
@taekjoolee6620 · 1 year ago
Hi Greg, I've started working on virtual production as well. Thank you for the good tips.
@yusrighouse · 1 year ago
Subbed immediately! Great content. Time to slowly go through everything on your channel!
@resetmatrix · 1 year ago
Thanks Greg for your tutorials, and have a happy New Year!
@imrulkayes4623 · 1 year ago
Thank you so much, you are the one I was looking for. Subscribed.
@GregCorson · 1 year ago
Glad I could help
@weltraumimport · 1 year ago
I love your video Greg, thanks so much for sharing all this info.
@Maxyz · 1 year ago
Looks like a great trick for aligning with ref cameras! Nice one, Greg!
@P4TTT · 1 year ago
That’s cool. Need to check your previous setup videos about it. It is quite interesting 😄
@vidayperfumes7514 · 1 year ago
GENIUS.
@JoseChemmassery · 1 year ago
Thank you
@mariorodriguez8627 · 1 year ago
Great work! Thank you for the info!
@ArianaMusicNetwork · 1 year ago
Great Job
@iPEMiC. · 1 year ago
Perfect!
1 year ago
Hi Greg. I've been following your great videos and I have a question. I'm calibrating with aruco but I'm getting the following error: "We could not find an appropriate Aruco dictionary for the selected Calibrator. Make sure you have selected a Calibrator containing Calibration Point components with names per the naming convention described in the info section of this algo" Any idea what I'm doing wrong? Thanks in advance
@GregCorson · 1 year ago
Take a look at your aruco tag in the blueprint editor. There should be a bunch of components named something like "DICT_6x6_100-8". If you have built your own aruco actor, these names matter; this is how the calibrator tells which aruco tag you are going to be using in your studio. That name basically means a 6x6 aruco, and the 8 is tag number 8. For things to work, the components in your aruco actor have to be named right, and the tag you have printed out has to match. I usually print my arucos from chev.me/arucogen/ where you want to select the 6x6 dictionary. If the printed tag isn't right, or the names of the components in the aruco actor aren't right, you will get this error. I think you will also get a similar error if it has trouble detecting the aruco in your video feed. This can happen if the tag is too small/far away, is at a sharp angle to the camera, or is poorly lit with a lot of glare.
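The naming convention Greg describes ("DICT_6x6_100-8" = 6x6 dictionary of 100 tags, tag number 8) can be sketched as a small parser. This is just an illustration of the pattern, not Unreal code; the regex and function name are my own.

```python
import re

# Pattern for calibration point names like "DICT_6x6_100-8":
# grid width x grid height, dictionary capacity, "-", then tag id.
# The exact pattern is an assumption based on the example name above.
NAME_RE = re.compile(r"^DICT_(\d+)x(\d+)_(\d+)-(\d+)$")

def parse_aruco_component(name):
    """Return (grid_w, grid_h, dict_size, tag_id) for a valid name, else None."""
    m = NAME_RE.match(name)
    if not m:
        return None
    return tuple(int(g) for g in m.groups())

print(parse_aruco_component("DICT_6x6_100-8"))  # (6, 6, 100, 8): 6x6 dictionary, tag 8
print(parse_aruco_component("MyAruco"))         # None: would fail the naming convention
```

A misnamed component (like the second call) is the kind of thing that triggers the "could not find an appropriate Aruco dictionary" error above.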
1 year ago
@@GregCorson Thank you very much for the reply. Indeed, the aruco calibration points were wrongly named; I'd missed a single _ so they didn't match.👍
@wjm123 · 1 year ago
I used to be able to get this to work fine, but recently I have trouble getting the aruco marker detected properly in the nodal offset page. I have to click multiple times (getting an error that no aruco markers were detected in the image) before one of the attempts suddenly finds the aruco marker correctly. And subsequently, when I apply to camera parent, the marker is slightly offset from the scene's marker (the cine cam position has already been zeroed). Any ideas what might be tripping up the detection and alignment?
@GregCorson · 1 year ago
About detecting arucos, I have found this to be pretty reliable. If you are not getting reliable detection, it could be video noise (maybe filming at high ISO), not enough light, or glare/light reflections on the tag. Printing the tag in a larger size could also help; if the tag is viewed at an angle or at a distance, smaller tags might not detect reliably. Regarding the alignment being off, make sure all the components in your cine camera are zeroed out, including the camera parent, the default transform of the cine cam, and the transform of the cine cam component inside the camera actor. If the alignment error only happens sometimes, the aruco is probably not being correctly detected.
@wjm123 · 1 year ago
@@GregCorson thanks for the reply! The alignment issue is a real head scratcher. When I check the "Show Detection" box, it correctly outlines the periphery of the aruco marker, so I think it is actually detected properly? The alignment was working fine for a while at the start but suddenly went out of line. I'll probably reprint my marker in a few different sizes and mount them on foam-backed boards. Will see if that helps.
@GregCorson · 1 year ago
You will definitely have issues if your aruco is too small for how far away it is, or if it is bent, folded or wrinkled. I print a bunch of spares and clip them down to a sheet of 1/8th plywood with binder clips. Throw away any damaged ones. If the aruco is being viewed too edge-on it can cause issues too, the camera needs to get a good look at it. You can do it if the aruco is big enough, but I generally try to make sure it is at no more than a 45 degree angle to the camera.
@backup01-g4y · 1 year ago
Thank you very much Greg! A great start position. I have a question: I don't have any trackers, and I've followed the tutorial, it seems that I should have a tracker to set lens calibration. LiveLinkComponentControllerFIZ needs a LiveLinkComponentController to detect the lens file, though I can't create a LiveLinkComponentController without a Tracker.
@GregCorson · 1 year ago
You shouldn't need a tracker to create a lens file. The distortion calibration doesn't require one, just a manually setup FIZ to give it the lens settings. Getting the lens distortion correct should enable you to detect aruco tags and align the camera this way. For this technique it isn't necessary to know the nodal point of your lens. There have been some small changes in where you put the lens file in 5, 5.1 and 5.2 so depending on which version you use it may work slightly differently. Please let me know if you can successfully calibrate your lens, I need to do some calibrations soon and I will try doing one without using trackers to see how it works.
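For context on what the distortion calibration solves for: a common lens distortion model (Brown-Conrady, similar in spirit to the spherical model used in lens files) maps an ideal image point to its distorted position using radial and tangential coefficients. A minimal sketch, with hypothetical coefficient values and normalized image coordinates:

```python
def distort(x, y, k1, k2, p1, p2):
    # Brown-Conrady model: radial terms (k1, k2) plus tangential terms (p1, p2),
    # applied to a normalized, undistorted image point (x, y).
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd

# Example: mild barrel distortion (k1 < 0) pulls an off-center point inward.
print(distort(0.5, 0.0, k1=-0.1, k2=0.0, p1=0.0, p2=0.0))  # (0.4875, 0.0)
```

Calibration runs this process in reverse: given detected checkerboard or aruco corners, it solves for the coefficients that best explain where the points landed.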
@jazzroy · 12 days ago
@@GregCorson Hi Greg, it seems I have the same problem. Trying to do it without trackers, the Aruco and Checkerboard calibrations don't work. I manually put a fixed value into all the lens FIZ and nodal offset fields, but when I click for the calibration it still says it is missing data from the FIZ input and doesn't calculate.
@jazzroy · 12 days ago
Answering myself... I went a step further, putting a Lens Component in the Cinema Actor (I didn't put the LiveLink component, for obvious reasons, not having a tracker) and assigning the lens profile with the camera settings as input. So I no longer get the FIZ error, but when I click for calibration I get only one point, and it is the percentage of the screen where I clicked with the mouse.
@SergioHerculano · 1 year ago
Great tutorial, thanks for sharing. I am learning Unreal Engine to make virtual productions for DJ shows using the DMX plugin, and I do not own trackers. I have two 4K webcams that I use today; my idea is to plug these two webcams into UE and link two cine cams at different positions, to make two static shots from two fixed camera positions. Does this make sense, and is it possible? Any recommendations? Thanks in advance.
@GregCorson · 1 year ago
Yes, this should work perfectly fine. You can align the two webcams manually and have two different static shots in Unreal. You could have a different composure setup for each camera in the same project. The only problem is that Unreal isn't so good with camera switching so both composure setups will always be running even if you are only sending one of them out, so it doubles the load on your computer and GPU. I never found a good way around this, though I haven't tried recently. If your computer can handle the load, it's fine, if it can't you might want to try using one PC for each viewpoint, tied to an external video switcher.
@SergioHerculano · 1 year ago
@@GregCorson Thank you for the feedback and tips, I really appreciate that.
@VictoRs107 · 1 year ago
Hello Greg. Strange, but no one has yet shown how to save and export video from game mode, and I also don't understand how to export video to the sequencer.
@GregCorson · 1 year ago
Are you talking about recording the video from your camera or recording the final virtual production output? At the moment Unreal doesn't have any way to internally record video that I know of. Most people either route the final pixels out to an external recorder or record the screen with something like OBS. Another option is to record the live video externally (like in the camera) and record the tracking in unreal with take recorder. Then you can render the CG stuff non-real-time (in high quality) to a file. The camera and CG footage can then be synced up and composited in an external program.
@VictoRs107 · 1 year ago
@@GregCorson Ok thanks, yes I meant how I can record the final virtual production output. When rendering a video I use the "movie render queue". I understand that all virtual studios work in game mode. I want to understand whether it is possible to simply transfer the video recording to the sequencer during game mode, and then just render it like a normal cinematic using the "movie render queue"? Thanks.
@GregCorson · 1 year ago
Hi, if I understand you right: when doing live virtual production, the movie render queue can't be used. It is only useful for non-real-time recording and rendering in high quality. To record realtime output you need to use OBS, some other screen recorder, or an external video recorder like a Ninja V hooked to your PC. Some people who need really high quality CG will externally record the video, record the tracking in Unreal using take recorder, use the movie render queue to non-realtime render the CG in very high quality, and then sync everything back up in an external video editor. This is as complicated as it sounds and not worth the trouble unless you are looking for "movie theatre" levels of quality.
@JimmeeAnimAll · 1 year ago
This is a super cool video and technique. Thank You!! I'm going through Your other videos and I kinda know already what I want to ask, coz this tutorial proves You are the man for "quick and dirty" but reliable solutions :) Could You please share some possibilities for how to capture motion with an iPhone or VR controller? (I have a Rift2.) What I mean is: how to record animation (it can be anything with a pivot, e.g. a dummy or a point), place it in 3D space, and record its movement over time. I was thinking about take recorder (UE) and holding some object or parenting an object to the controller, oor an even quicker solution (to set up at least, but way less precise): use the Unreal camera tracker and record its position. I was also thinking about animating in Quill, attaching a point to some geo in space and baking keys, but that seems like a lot of extra unnecessary steps. Would You share some ideas how You would do it, please?
@GregCorson · 1 year ago
If you want to use a VR headset or tracker, note that a lot of these depend on the cameras in the headset for tracking and won't work by themselves. I have seen people mount a VR headset on their camera and use it as a camera tracker; this can work, but it's hard to solidly mount a VR headset on a camera. As far as using an iPhone for live tracking, I have not done it so I can't comment. There is an app for iPhone called CamTrackAR that records video with tracking (as an FBX) you can use later. There are also a number of programs that will generate tracking from a video. These don't run in real time, but when used right they can give rock solid tracking. The best one I have seen is in the free software "Blender". It takes a bit of getting used to, but the results are very good. Again, you can track a video clip and export the camera path as an FBX to use in other programs. I did a demo of this a long time ago here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-q_cL0QCoNMA.html. The tracking, 3D and compositing were all done in Blender and, as you can see, it is rock solid. You can find tutorials on YouTube by searching "blender camera tracking". There are a lot of good ones! I hope this helps.
@DouglasDoNascimento · 1 year ago
Brilliant! Thank you. Do you have any content on the Intel T265 tracker?
@GregCorson · 1 year ago
Unfortunately I do not. I have one but never got it working acceptably with Unreal, and then Intel stopped making them, so it didn't seem worth pursuing anymore since people wouldn't be able to buy one.
@DouglasDoNascimento · 1 year ago
@@GregCorson Makes sense. Thank you for your response. Your videos are really good. Thx.
@tamakacz · 1 year ago
Hi Greg! Thanks so much for this! I would love to know if it's possible to align using still images instead of a live feed from the camera.
@GregCorson · 1 year ago
You should be able to align to any image containing an aruco. You just have to get it onto the "media plate" the camera calibrator uses. Not sure what the application for this would be though?
@BartBarlow · 1 year ago
Can you move the camera afterwards with a Vive tracker, or even better an iPhone vcam, after clicking the move to parent? Neat idea Greg!
@GregCorson · 1 year ago
As I mentioned in a previous tutorial, this method can be used to set the "origin" point of your studio/tracking system for almost any tracker. I use it for the Bliss tracker, it should also work fine with VIVE or just about anything else as long as you setup a "camera parent" and attach all the tracked actors to it.
@BartBarlow · 1 year ago
Or maybe combine this with a motorized slider setup to skip tracking?
@GregCorson · 1 year ago
Without some kind of tracking, moving the camera will cause the background to get out of sync. You need to know the position of the camera if you are going to move it. If you have a slider that reports back the exact position of the camera, or you have the camera mounted on something very accurate like a robot arm, that data can be used for the camera position instead of a tracker.
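The slider idea above amounts to using the motor's position readback as the tracker. A minimal sketch of the conversion, assuming a stepper-driven lead screw moving the camera along one axis; every number here (steps per revolution, microstepping, screw lead) and the function name are hypothetical:

```python
def slider_camera_position(steps, steps_per_rev=200, microsteps=16,
                           lead_mm=8.0, origin_cm=(0.0, 0.0, 0.0)):
    """Convert a stepper count into a camera position in cm (Unreal units are cm).

    Assumes the slider moves the camera along its local X axis. The defaults
    (200 steps/rev, 16x microstepping, 8 mm lead screw) are illustrative only.
    """
    revolutions = steps / (steps_per_rev * microsteps)
    travel_mm = revolutions * lead_mm
    x, y, z = origin_cm
    return (x + travel_mm / 10.0, y, z)  # mm -> cm

# One full revolution (3200 microsteps) moves the carriage 8 mm = 0.8 cm.
print(slider_camera_position(3200))  # (0.8, 0.0, 0.0)
```

This only works if the slider actually reports its position (or never loses steps); an open-loop motor that stalls would silently desync the background, which is why Greg stresses needing exact position feedback.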
@jamit63 · 1 year ago
Basically it's for a static camera, isn't it? Is there any way to move the camera around with this setup, at least by placing the aruco tag in one corner of the scene so the video can always track it?
@GregCorson · 1 year ago
If you want to move the camera during a shot you need some kind of tracking system. It's possible to have one that tracks an aruco in real time, but doing that takes a fair amount of compute and I don't have the software for it. The next easiest step up would be to put a pan/tilt tracker on your tripod. These could be made DIY pretty cheaply, but I don't know of any ready-made ones that are cheap.
@eliteartisan6733 · 1 year ago
Do you offer one on one consultations?
@GregCorson · 1 year ago
I don't do paid work on virtual production because I can't guarantee my availability. I'm happy to help when I can and answer questions though.
@ArianaMusicNetwork · 1 year ago
I am looking for one on one training, would you be interested?
@GregCorson · 1 year ago
I don't usually take paid jobs or do training for individuals because I have limited time; I don't do this for a living, it's a hobby. If you have questions feel free to ask, but I don't do anything requiring a commitment right now because my free time and availability vary quite a bit. Please feel free to suggest tutorials in areas where you are having trouble.
@点解甘多讲中文的反华
Would you know how the Kuka robot can connect with UE5 and do the tracking with the robot? 😂😂😂
@GregCorson · 1 year ago
I don't know the exact process, but most industrial robots have some way of sending the exact position of the end of the arm, relative to the base of the robot. Sending that data into unreal could give you the position of a camera mounted on the end of the robot. Is that what you mean?
@点解甘多讲中文的反华
@@GregCorson Yes, that's what I mean, but I haven't found a way to get 6DOF data from the Kuka arm.
@GregCorson · 1 year ago
I have not got any experience with the Kuka arm. I'd suggest you talk to Kuka about how to get realtime position feedback from the arm. Kuka robots are pretty expensive, so their support should be pretty good! Once you have the information from the robot you can bring it into unreal with livelink. I have a livelink plugin for a different kind of tracking that could be modified to work with a robot. You would need to change the C++ to make it work though.
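To make the "bring it into Unreal with LiveLink" idea concrete: a custom C++ LiveLink source like the one Greg mentions typically receives pose packets over the network. The sketch below packs a 6DOF pose into a fixed binary layout; this wire format is entirely hypothetical (it is NOT Unreal's actual LiveLink protocol), and a real setup would need a matching receiver on the Unreal side:

```python
import struct

# Hypothetical wire format: timestamp + position (cm) + rotation (degrees),
# seven little-endian doubles. A custom C++ receiver would unpack the same layout.
POSE_FORMAT = "<7d"  # time, x, y, z, roll, pitch, yaw

def pack_pose(t, x, y, z, roll, pitch, yaw):
    """Serialize one pose sample for sending over UDP."""
    return struct.pack(POSE_FORMAT, t, x, y, z, roll, pitch, yaw)

def unpack_pose(data):
    """Deserialize a pose sample (what the receiver side would do)."""
    return struct.unpack(POSE_FORMAT, data)

# A real sender would then do something like:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(pack_pose(...), ("unreal-pc", 9000))   # address is hypothetical
```

The robot-specific part is entirely on the acquisition side: once the arm's controller can stream end-effector poses at a steady rate, packing and forwarding them like this is the easy half of the problem.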