Write & Direct teaches filmmaking from development through post-production in a cohesive, hands-on manner. It's the "Director's Film School," designed to help aspiring filmmakers start making movies faster than traditional education routes.
Thanks a lot Kyler! I was wondering if the Argyll settings would remain the same for a 1350:1 contrast ratio Eizo monitor, or if you would change something? It seems to sit in between the 1000:1 and 2000:1 cases discussed.
Thanks! When using the BM Monitor 3G, is it better to work with a WQHD monitor over a UHD monitor (for pixel-to-pixel viewing)? And how many horizontal pixels can you get from the BM 3G? It's 1080 vertical, but max 1920 horizontal, right? Thanks a lot!
Hey, the 3G is standard HD. So that’s 1080 vertical and 1920 horizontal. Personally, I grade with a small HD monitor. It’s totally fine. If you’re going to grade with a 4K display, I would consider a Blackmagic Decklink card versus the 3G.
Thanks! I'm on a MacBook Pro, so the DeckLink won't work, I guess. The only option I can see for 2K and 4K is the BM UltraStudio Mini, but damn, it's at $1k!! I don't personally need more than a 2K output signal, which is why I'm interested in the Eizo CG2700S (WQHD). But a clean 2048 output signal would be nice to have. Are you happy with your HD monitor (1920x1080) and the BM Monitor 3G?
@@selvafria4150 The HD display has been fine for me. Eventually I want to move to 4K, but it's not a 911. Also, you can get an external PCI card enclosure that will allow you to use the DeckLink with the MacBook Pro. That combo will ring in at way less than the BMD 4K Mini.
If you had several different kinds of cameras (say, a Sony mirrorless shooting in S-Log3, a GoPro in Rec.709, and a Blackmagic camera with BRAW), would you not use a color managed project and instead use CST nodes to convert each clip individually? Thanks for the excellent video.
Hi, how can I make DaVinci Resolve ask for confirmation before closing the app? I accidentally clicked 'X' and it shut the app down immediately, and I lost some of my progress.
With Live Save activated, it won't prompt you. But Live Save is a crucial feature to have on, IMO. The setting is under Preferences > User > Project Save and Load > Save Settings > Live Save. You can turn that off and see if it prompts. But if there's a crash, you'll lose work.
Thank you for a very useful video. Unfortunately I use Sonoma, but I have a PC with DisplayCAL. I have a related question that you might help me with. I have just bought an on-camera monitor (Viltrox DC-X2) and was wondering if you could use a method similar to the one outlined here to create a 3D LUT for calibrating that monitor? After calibration, the input to the monitor would come from the camera (not the computer), but it might help somewhat? A related question: as I use a LUT on the monitor to convert from N-Log, is it possible to combine an N-Log LUT with a calibration LUT into one LUT? Thanks!
Hey, there is a forked version of DisplayCAL that works with Python 3. I need to test it, but I've heard it works. So regarding your monitor, I'm following you. But you can't (to my knowledge) combine those two LUTs. And with that, DisplayCAL relies on tweaking RGB values on the monitor, so I'm not sure you could even dial the small one in that well. However... if you did pull off a 3D LUT for the camera monitor, you could try loading that onto a LUT box and sitting it between the camera and monitor. Then you could use the other LUT like normal. Haven't tried any of that, but if you do, let me know!
@@writedirect Hi, thank you for your reply. I might try out the forked version. Just for fun I tried to calibrate my on-camera monitor using Calibrite on my Mac, but the result was disappointing. I think this might be because the colorimeter (ColorChecker Display Pro) is too big for that small monitor (6 inches), and when measuring color it also includes more than the measuring spot, leading to very odd results. Perhaps DisplayCAL (which I can use from my Windows PC) could use a bigger spot to measure.
You know, I don't typically touch it while it's running, for whatever reason. It wouldn't surprise me if it warmed up. But the temp must be acceptable or they'd have had to put a fan in it.
I noticed the gamut coverage was quite low at 88% after profiling. You can get better coverage if you increase the saturation slightly beyond the Rec.709 primaries and check the tracking in HCFR before profiling. Just make sure to import the .ccss or .ccmx into HCFR.
This is such a fantastic channel. It really answers all the questions you didn't know to ask, in the most easy-to-understand format imaginable. Many thanks.
Hahaha. I hadn't thought of it that way, but it's probably true because that's how I think. I'm a light meter freak. Just remember, at least on some cameras you can go above the green on middle grays. Blackmagic Design cameras specifically allow you to "expose to the right" and bring that down in post without any information loss.
Great video. Learning more than I did at film school, lol. Question though: it was my understanding that deviating from the native ISO compromises image quality/introduces noise. Your 1250 example challenges my understanding. What's the point of a native ISO then?
Hey! This might change from camera to camera. I use BMD and typically ALWAYS shoot native ISO if I can control lighting. And I never go high on the ISO personally, as that will get noise. But sometimes the lower ISO can really do it for you due to lighting.

Example 1: I was shooting during golden hour. The sun dropped too much and I wasn't done with the shoot. The solution? Go to the second ISO bank on my BMD camera. The native for the 2nd bank is around 3200. But I shot around 1250 because 3200 would have been too much light. 1250 let me get my exposure right. The footage matched the native ISO 400 takes closely enough. All was good. Could I have done 3200? Sure, but I would have needed to close way down on the lens, which totally changes DOF compared to the other takes.

Another example: if I were shooting night scenes indoors with controlled lights, I'd play around with really bright lights and dropping below native so that I'd have the look I needed later in post, with shadow detail, etc. Not that you can't do native... sure you can. But that's just another time I'd experiment, because the lower ISO pushes your dynamic range lower, allowing you to dip down more into the darks.
@@PolymerJones it's the best balance of stops above and below middle gray. And it's also probably gonna be the best image from the camera. However, if you need to drop down, as long as you expose to that lower ISO correctly, you won't get noise. Going high? Noise is going to begin. But again, ISO is not an exposure tool most of the time. If you can control your lighting, leave it at native. No reason to change unless you wanted more stops below middle gray because you're shooting a sequel to the HAUNTING or something.
I have seen this and some other demos of 32-bit recorders. I totally get that a clipped recording can be brought down and not be clipped, but does 32-bit also help with boosting a low recording to a higher level without introducing noise?
@@writedirect Ok, then I don't have much use for 32-bit, I think, since my main issue is with low signals when I record. But I can definitely see the advantage of not clipping if I were doing field recording in traffic.
@@Average-Al yeah. I think experienced sound engineers push back at 32-bit float as they can totally work their magic on set. But when you're doing indie film and you don't have a dedicated sound mixer, 32-bit float can save your bacon.
@@writedirect Yes. As a hobbyist dabbling in music, I am curious if it could help me bring up lower signals with less noise from my old instruments, or if it is just about avoiding clipping.
@@Average-Al I don't think it will help you with that. It's about protecting peaks. An analogy: if this were about lighting, it would protect from overexposure, but it would not give you more detail in the shadows.
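To make the tradeoff above concrete, here's a hypothetical sketch (using NumPy, with made-up signal numbers) of why float storage can recover an overdriven peak, while a quiet signal's noise floor comes from the mic and preamp rather than the file format:

```python
import numpy as np

# A 440 Hz sine wave recorded 12 dB too hot (amplitude 4.0, where 1.0 is full scale).
t = np.linspace(0, 1, 48000, endpoint=False)
hot = 4.0 * np.sin(2 * np.pi * 440 * t)

# Fixed-point formats clip at full scale: the flattened peaks are gone for good.
fixed = np.clip(hot, -1.0, 1.0).astype(np.float32)

# 32-bit float keeps values beyond full scale, so pulling the gain down in post
# recovers the original waveform intact.
floating = hot.astype(np.float32)
recovered = floating / 4.0

print(np.max(np.abs(fixed)))      # stuck at 1.0, peaks destroyed
print(np.max(np.abs(recovered)))  # back near 1.0, waveform preserved
```

Boosting a quiet take is a different story: the analog hiss captured alongside the signal gets boosted by exactly the same amount in either format, which is the "won't help your shadows" part of the analogy.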
I watched this video over and over (but that's because it takes me a long time to process stuff!) and I'm now confident enough to take my gimbal out with me on a shoot this weekend - thank you SO much for making it simple.
@writedirect As people have said, your videos are much more intuitive than the official ones. Now, if you have something on attaching the focus motor to a Nikon Z6ii, you would have a friend for life..!
Hello, good day. Forgive me for bothering you. I'm writing from Kurdistan, Iran. I shoot with a Sony Alpha camera and have made four films. If possible, could you send me the DaVinci project?
Hi! Thank you for this extremely useful video! I have a BenQ SW272U monitor that is hardware calibrated (all color adjustments happen inside the monitor's LUT). In order to calibrate the monitor and update the internal LUT I have to use Palette Master Ultimate (BenQ's software). For this calibration I can't use the Blackmagic 3G monitor device because PMU won't recognize it. After calibrating with PMU, I then add the BM 3G into the chain, which means that whatever ICC profile was generated through the PMU calibration does nothing to the signal, since I'm now bypassing the GPU. The ICC created in this case (hardware calibrated monitor) contains the RGB color gamut info the video card should output and no color mapping at all (that lives in the monitor's internal LUT). Although the monitor is now calibrated for Rec.709 gamma 2.4 output, the image is not looking good; the blacks appear crushed. Do you think that calibrating with DisplayCAL as an extra step would benefit the final output? Maybe this is because of a difference in how Rec.709 is output from macOS (GPU) versus DaVinci Resolve (via the I/O device)? Sorry for the long message, I hope it makes sense! Thank you.
Hey! The only reason to use a 3D LUT is if you don't have a hardware calibrated display. However...since you have to hook it up to your computer to calibrate with the BenQ software, this could be a problem. Ideally you'd have someone come in and calibrate with the right tools so you could then hook it up to an external Decklink or 4K studio mini hardware. So anyway, if it were me, I'd probably calibrate like you are. Then hook it up to BMD hardware, use DisplayCAL to create a 3D LUT which would then do any needed corrections once the display isn't affected by the GPU. Does that make sense?
Thanks for this great guide!! Any chance you are still making the video about verifying the calibration? That would be super helpful! I have calibrated my BenQ PD2700U monitor (connected through a DeckLink) many times and tried too many options (0% black offset, 100% offset, full vs legal, standard correction vs user corrections ...), and every calibration looks slightly different. What looks best to my eye has the worst result in the measurement report, and what looks good on paper does not look good in the image. Idk, I am in the rabbit hole again. :D Therefore it would be much appreciated to get some help understanding the calibration result. Keep up the great work!
Thanks man! I'm behind on that lesson. And I probably won't do it until I switch to the forked version of DisplayCAL. There are just too many problems with the current version and macOS. It will get done! I've got a film in post and other chaos going on, so it'll just take some time.
Can you do a video on how to get the best overall color export for TV, cinema, iPhone, and iPad? Because when I render in gamma 2.4, it looks horrible on the phone, but TV and cinema are great. So for web, should it always be 2.2 with another version for 2.4, or one main version? It's trial and error to find the middle match. It's hard to have one version that matches everywhere.
I'll put that on the "to-shoot" list. But until then, I personally only do 2.4. And if you monitor your grade at Rec.709 Gamma 2.4 and then flip it to 2.2 it's going to suffer. You need to monitor for final output IMO. Make sure you're doing the Rec.709-A setting on the Delivery page. If you don't do that on macOS your exports aren't going to look good.
@@writedirect Hi, yeah, I already do all of that. But when you grade for 2.4 and then people watch it on their phone or iPad, it looks horrible because it's 2.2. So I guess the goal for web is always to grade in 2.2, and maybe do a TV or cinema version in 2.4.
@@PascalPayantfilms are you grading with a reference monitor attached to hardware? And with that, are you grading with a 3D LUT or do you have a hardware calibrated display? Sounds like you’re doing all of that and just seeing the difference between the two gammas. It’s all a big pain. I mean, it’s a million times better than it was 20 years ago. But it’s still a pain.
@@writedirect "I'm using an LG C3 OLED calibrated in 4K as my main monitor. I don't use an Ultra Mini, but I know it's not making a big difference. I don't use a 3D LUT, though maybe I should, but the colors aren't the issue-it's the gamma shift between different devices
@@PascalPayantfilms yeah I understand. You wouldn’t want to use a 3G mini with a 4K display. At the least I’d get a decklink card. Mount external if you had to. Gamma shift is going to change when you create a 3D LUT. the angle I’m shooting here is the closer you can dial it in to accuracy the less of a shift you might see on the various devices. Possibly. That could completely not work out, but it’s kind of the only goal we have to shoot for.
Here it is: How to Balance the Blackmagic Pocket Cinema Camera on the DJI Ronin RS3 Pro | Sirui Anamorphic Lens ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-2MPzorLFtxs.html
Great video!! Unfortunately/fortunately it’s more informative than what DJI has for support. One question, what if I want to use two motors? One for focus and one for zoom?
a helpful tutorial but I found it frustrating that the video described itself as a "complete walkthrough" only to then say "I wont explain this, it's in a previous video" and not even so much as link the previous video that covered it.
@@writedirectone example is that you talk about how amazing false colours can be but never explain them. You get into some pretty basic stuff in this video in great detail, but never give even a quick explanation on them.
@@theshaggydogg2867 oh I gotcha. So here’s what’s going on: These are random videos from my complete online film school. I have a class dedicated to false colors, and I also have that class on the RU-vid channel. So it was redundant to repeat in this lesson IMO. Having focused videos keeps overall length down versus jamming everything in one.
@@writedirect I was actually using your guide for the BMPCC 6K G2 but it was very similar and your explanations so good that I stuck with it. Needed to refresh my memory before a big shoot. Thanks again.