
Unreal Engine VFX Breakdown Jetset Cine - Virtual Production 

creative twins
2.2K subscribers · 2.5K views
Feel free to support us by donating through our Patreon account. There, you'll find additional tutorials that we don't post on YouTube. Here's the link: / creativetwins
#UE5 #VirtualProduction #UnrealEngine #unrealengine5 #vfx #greenscreen #unrealengine5.4 #greenscreen #filmmaking #vfx @lightcrafttechnology @UnrealEngine @creativetwins02

Published: 20 Sep 2024

Comments: 21
@creativetwins02 · 1 month ago
Full Music Video Here: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-Rt_jCWXcz3U.htmlsi=OWeay0aWrDJRpwlE
@fredyrmz1 · 1 month ago
Great breakdown, and the final product looks awesome! Are you compositing the greenscreen using Aximmetry, or are you managing it in post on other software?
@creativetwins02 · 1 month ago
@@fredyrmz1 Thanks! We are chroma keying our footage in DaVinci Resolve, exporting it as pre-keyed EXR, and compositing in Unreal Engine 5.4.
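Since the pre-keyed EXRs are round-tripped from DaVinci Resolve into Unreal, a quick sanity check on the exported sequence can catch a broken frame before a failed import. A minimal Python sketch (the folder layout and glob pattern are hypothetical, not part of the workflow above) that verifies each frame starts with the OpenEXR magic number:

```python
# Sanity-check a rendered EXR sequence before importing it into Unreal.
# Only verifies the 4-byte OpenEXR magic number at the start of each file;
# it does not parse channels, so an embedded alpha is not checked here.
import pathlib

EXR_MAGIC = b"\x76\x2f\x31\x01"  # magic number that opens every OpenEXR file

def is_exr(path):
    """Return True if the file begins with the OpenEXR magic number."""
    with open(path, "rb") as f:
        return f.read(4) == EXR_MAGIC

def check_sequence(folder, pattern="*.exr"):
    """Return (all_frames, bad_frames) for every EXR-named file in folder."""
    files = sorted(pathlib.Path(folder).glob(pattern))
    bad = [p.name for p in files if not is_exr(p)]
    return files, bad
```

A truncated or mis-rendered frame usually fails this check immediately, which is cheaper to discover on disk than inside a UE import dialog.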
@djpromethazinecayenne9359 · 1 month ago
Wicked!!!
@VIPAH · 1 month ago
Thanks for sharing, bro!! Much appreciated. I'm getting into VP after using Unreal 5.4 with an Arri Amira shooting on an LED volume wall, but I'm keen to shoot on a green screen cyclorama too. I'm thinking of the VIVE MARS tracker?! Expensive, I know, but keen to use it with the free version of Aximmetry.
@creativetwins02 · 1 month ago
Hey, that's great! Based on the videos I've seen, I think the Vive Mars does a good job of tracking. However, since it only supports the Free-D protocol, it will only work with the Broadcast version of Aximmetry.
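For readers wiring a tracker into Aximmetry or Unreal via Free-D: the protocol streams fixed 29-byte "type D1" pose packets. The encoder/decoder sketch below is a rough Python illustration; the field order, the 1/32768-degree and 1/64-millimetre scale factors, and the 0x40 checksum rule follow the commonly published Free-D documentation, but treat them as assumptions and verify against your tracker's manual.

```python
# Minimal Free-D "type D1" camera pose packet: encode and decode (sketch).
def _s24(value):
    """Pack a signed integer into 3 big-endian bytes."""
    return int(value).to_bytes(3, "big", signed=True)

def _read_s24(b):
    """Read 3 big-endian bytes as a signed integer."""
    return int.from_bytes(b, "big", signed=True)

def encode_d1(cam_id, pan, tilt, roll, x, y, z, zoom=0, focus=0):
    """Angles in degrees (scaled by 32768), positions in mm (scaled by 64)."""
    body = bytes([0xD1, cam_id])
    body += _s24(round(pan * 32768)) + _s24(round(tilt * 32768)) + _s24(round(roll * 32768))
    body += _s24(round(x * 64)) + _s24(round(y * 64)) + _s24(round(z * 64))
    body += _s24(zoom) + _s24(focus)
    body += b"\x00\x00"                        # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF       # Free-D checksum rule
    return body + bytes([checksum])

def decode_d1(pkt):
    """Parse a 29-byte D1 packet back into engineering units."""
    assert pkt[0] == 0xD1 and len(pkt) == 29
    assert (sum(pkt[:-1]) + pkt[-1]) & 0xFF == 0x40, "bad checksum"
    f = lambda i: _read_s24(pkt[i:i + 3])
    return {
        "cam_id": pkt[1],
        "pan": f(2) / 32768, "tilt": f(5) / 32768, "roll": f(8) / 32768,
        "x": f(11) / 64, "y": f(14) / 64, "z": f(17) / 64,
        "zoom": f(20), "focus": f(23),
    }
```

The fixed, tiny packet is why Free-D support is so widespread in broadcast gear, and also why it carries no lens-distortion data: that has to come from a separate calibration, as with the SeeMo step mentioned later in this thread.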
@tobiasfalk9112 · 1 month ago
The tracking and the comp look really great. Have you tried adjusting the DOF in UE?
@creativetwins02 · 1 month ago
Thanks! In this breakdown, we are mostly showing the 11mm shots, so everything is in focus. However, we also used a 35mm lens, which has a visible depth of field (DOF). Jetset Cine automatically measured the focus distance using the iPhone's onboard LiDAR focusing system, and this information is also carried into Unreal Engine. It works well for shots where the focus is on the primary subject in the frame.
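The 11mm-versus-35mm difference above comes down to hyperfocal distance, which the standard thin-lens approximation makes concrete. A short Python sketch (the 0.03 mm circle of confusion is an assumed full-frame value, not something Jetset Cine reports):

```python
# Thin-lens depth-of-field limits: why 11mm shots look all-in-focus
# while a 35mm lens shows visible DOF at the same aperture.
def dof_limits(focal_mm, f_number, focus_mm, coc_mm=0.03):
    """Return (near, far) acceptable-focus distances in mm; far may be inf."""
    # Hyperfocal distance: focus here and everything to infinity is sharp.
    H = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = focus_mm * (H - focal_mm) / (H + focus_mm - 2 * focal_mm)
    far = float("inf") if focus_mm >= H else focus_mm * (H - focal_mm) / (H - focus_mm)
    return near, far
```

With these assumed numbers, an 11 mm lens at f/2.8 has a hyperfocal distance of roughly 1.45 m, so nearly the whole frame is acceptably sharp, while a 35 mm lens focused at 3 m holds only about 2.5 m to 3.8 m in focus, which is the DOF the LiDAR focus distance then drives in Unreal.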
@tobiasfalk9112 · 1 month ago
@@creativetwins02 Sounds excellent. Do you export EXRs out of UE, to control the depth pass in post, or do you render the DOF?
@creativetwins02 · 1 month ago
@@tobiasfalk9112 We render the EXR files with the DOF applied, without using a depth pass.
@tobiasfalk9112 · 1 month ago
@@creativetwins02 That's probably the most efficient workflow. I've used Sigma 'smart' cine lenses with VP, but I don't think the lens metadata really works in UE, at least not yet. Have you tried any lens encoders?
@creativetwins02 · 1 month ago
@@tobiasfalk9112 We haven’t used any lens encoders yet, but I’ve seen that most people use the Glassmark lens encoder.
@AkbarAli-ie2uv · 1 month ago
Hi sir, please make a tutorial on how to create something like this
@creativetwins02 · 1 month ago
Hey, we have tutorials on our Patreon page: www.patreon.com/CreativeTwins
@sanatpratap2258 · 1 month ago
Fantastic work! What's the CPU configuration like?
@mrtbgrgk · 1 month ago
Hi. Congratulations on your work. I am currently preparing for a sci-fi short film and trying to create a shooting plan, and I have some questions. I can't understand the benefit of virtual production if there is no LED screen. The actor is still playing in front of a green screen, so nothing changes for them. There are two monitors in front of the camera operator, who follows the image on the green-screen monitor. After the shoot is over, the keying work is done on the computer anyway. I am not sure whether I should invest in the Accsoon SeeMo + Jetset Cine so that the camera operator can see the virtual world.
@creativetwins02 · 1 month ago
Hi, Thanks. When it comes to virtual production, if your budget allows for an LED wall, it can provide a more immersive experience for the actors and crew. However, if an LED wall is out of budget, using green screen with tools like Lightcraft Jetset Cine still offers advantages. These tools allow the cameraman to see the virtual environment in real-time, helping with framing, lighting, and overall scene composition. Lightcraft Jetset Cine also tracks camera movement in real-time, which can save time in post-production and give you a better idea of the final look during the shoot. In the end, it depends on what fits your budget and needs.
@soyungatocuriso · 1 month ago
Hello, friends, what equipment do you use for camera tracking that works in Unreal? Thanks
@creativetwins02 · 1 month ago
Hey, we use an iPhone 14 Pro Max with Lightcraft Jetset Cine for camera tracking.
@b0nec4ndy · 1 month ago
I don't see a SeeMo. Is that part of the rig? I thought it was required for Jetset Cine?
@creativetwins02 · 1 month ago
The Accsoon SeeMo is mounted underneath the camera and used only at the beginning for lens calibration. After that, it can be unplugged.