Thank you so much! I was confused about this for a long time, and your explanation made it much clearer! What about Apple Log, Sony, and GoPro? Can you make a video explaining those in FCP?
You can keep your phone from getting hot by putting a wet sponge behind it. When the sponge gets hot, I just remove it, pour some water on it, and put it back on. That actually works pretty well. I finally switched to the RC Pro, mainly because I would have to remove the phone from my Otter case, and every time I started flying, my phone would ring. The RC Pro costs a crap load of money, but I use my Air 2S a lot more now. Plus, if you have the Fly More combo case, the controller fits in the same spot inside the case as the old controller did. So no need for the Pelican case.
To sync a typical DSLR you need a separate Sync E, which provides the source timecode to an attached device. By typical, I mean a DSLR without internally generated timecode. Other, more advanced video cameras have internal timecode generators. Most, though not all, of these cameras with internal timecode generators are able to do "jam" sync. Jam sync is the ability of one device to read timecode from another device and then synchronize its own timecode to match the externally generated code coming in. Tentacle devices are designed to provide timecode and the jam sync ability to devices that do not have them on their own. The Sync E serves as a central timecode generator, providing the source timecode to all devices. So, if your device, a camera or an audio recorder, for example, does not have its own timecode generator, then you need a Sync E for it. If, however, your device is able to generate code internally, you may be able to jam sync it to the Sync E and then disconnect from it. Note: it is best to assign reel numbers to your recordings, a separate one for each device, which will help you keep track of the recordings and tie them back to their source if you move between separate applications.
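To make the idea concrete, here's a rough sketch of the math behind timecode-based alignment — a hypothetical illustration, not anything from Tentacle's actual firmware. Once every device carries the same timecode, syncing is just converting each clip's start timecode to frames and sliding by the difference (simple non-drop-frame math only):

```python
# Hypothetical sketch: align two clips by their start timecodes.
# Each device's start timecode becomes an absolute frame count;
# the difference is the offset to slide one clip by.
# (Non-drop-frame, integer frame rates only, for simplicity.)

def tc_to_frames(tc: str, fps: int) -> int:
    """'HH:MM:SS:FF' -> absolute frame count at an integer frame rate."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def sync_offset_seconds(tc_a: str, tc_b: str, fps: int) -> float:
    """How far clip B starts after clip A, in seconds."""
    return (tc_to_frames(tc_b, fps) - tc_to_frames(tc_a, fps)) / fps

# Camera started rolling at 01:00:00:00, recorder at 01:00:02:12 (25 fps):
offset = sync_offset_seconds("01:00:00:00", "01:00:02:12", 25)
print(offset)  # 2.48 -> slide the recorder's audio 2.48 s later
```

Real drop-frame timecode (29.97/59.94) needs extra bookkeeping, which is exactly why NLEs handle this for you once the timecode track is present.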
Without using the timecode you're just back to syncing with waveform. With firmware updates the Track E will output a TC track from the headphone port. You can use that to sync to devices that have internal Jam Sync capabilities. And you can always get a Sync E (or a few of them) later. Works great
You can't fairly compare a USB-C 3.2 Gen 2 drive against a Thunderbolt 3/4 NVMe. All drives attached via USB-C on a Mac max out around 1,050 MB/s (Gen 2) regardless. Don't even bother with Gen 2x2 on a Mac; go for a Thunderbolt 3 or Thunderbolt 4 enclosure (both 40 Gb/s).
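To show why the interface cap matters in practice, here's a back-of-the-envelope sketch. The footage size and speeds are illustrative assumptions (real drives vary; ~1,050 MB/s is roughly what 10 Gb/s USB delivers after encoding overhead, and ~2,700 MB/s is typical of a mid-range Thunderbolt NVMe):

```python
# Illustrative only: how the interface cap changes real-world copy times.
# Speeds below are rough assumptions, not benchmarks of specific drives.

def copy_minutes(size_gb: float, speed_mb_s: float) -> float:
    """Time to copy size_gb gigabytes at a sustained speed in MB/s."""
    return size_gb * 1000 / speed_mb_s / 60

footage_gb = 512  # say, a day of 4K footage
for name, speed in [("USB-C 3.2 Gen 2 (~1,050 MB/s)", 1050),
                    ("Thunderbolt NVMe (~2,700 MB/s)", 2700)]:
    print(f"{name}: {copy_minutes(footage_gb, speed):.1f} min")
```

Roughly 8 minutes versus 3 for the same offload, which adds up fast on a wedding weekend.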
I really appreciate this comment. This was shot on Sony FX3 Slog3 into Rec709 (with my Slog3 to Rec709 conversion LUT Pack) and a Kodak 2383 Emulation LUT I created. For the Osmo Pocket 3, the official DJI LUT has a ton of blue and yellow, I made a Dlog-m to Rec709 LUT that corrects this issue.
D-Log M is definitely one of the hardest "log" profiles to get right. I'd protect your highlights over anything, even if it means having some grain in your shadows, only because blown highlights are more pronounced on D-Log M. I'm still testing a couple of D-Log M conversions for the Pocket 3 which will fix the highlight issue and be applicable from -0.7 EV all the way to +0.7 EV, making them ideal for different shooting scenarios.
@joshuafcenters Okay, so after looking at the footage I shot this weekend, I'd say stay at -0.3 / 0.0 for shaded/indoor stuff and -0.7 / -1.0 for sunny outdoors. I'm working on some really great conversion LUTs, and so far they are matching our FX3 perfectly! I'll be putting out a video once I have that conversion complete.
Hey! Great question; I think an NVMe is going to be the best option and the "future-proof" way to go. It also depends on what you're using it for. I have noticed the performance of three of my SanDisk SSDs go downhill since making this video, while the two NVMe drives I built using the Samsung NVMe and Konyead enclosures are still maintaining the same fast speeds, and I use them daily for our wedding videography business. I hope this helps!
Thanks! Yes, you just have to make sure that when you're shooting at double the standard frame rates, you select the timecode frame rate at half your shooting rate. For example, if you're shooting at 59.94 (4K/60), you'd select 29.97 as your timecode frame rate.
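The pairing described above can be sketched as a simple lookup — the mapping here is assumed from the comment's example (59.94 → 29.97) plus the common doubled rates, not an exhaustive or authoritative table:

```python
# Sketch of the pairing above: double-rate footage gets timecode
# tagged at half its shooting rate, since standard timecode rates
# top out around 60 fps. Mapping is illustrative, not exhaustive.

HALF_RATE = {119.88: 59.94, 100.0: 50.0, 59.94: 29.97, 50.0: 25.0}

def timecode_rate(shooting_fps: float) -> float:
    """Pick the timecode frame rate for a given shooting frame rate."""
    return HALF_RATE.get(shooting_fps, shooting_fps)

print(timecode_rate(59.94))   # 29.97, per the example above
print(timecode_rate(23.976))  # unchanged: already a valid timecode rate
```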
I'm really interested in this topic, but it is so hard to follow this video because of the "music": two chords, two grating notes, and some cheap beats. 35 minutes of this is REAL TORTURE! Please try listening to it for 35 minutes (without any drugs or alcohol)!
Thanks for making and sharing this video. I hadn't heard of the Konyead enclosure until Google suggested this video. I am looking for an enclosure for the SK Hynix P41 Platinum 2TB SSD I have on order, and this might do the trick. Looks like I can get a decent deal on Amazon, imported into Australia much cheaper than the local stock. I'll do a little more research before I hit the "Buy" button, but so far it's looking good as a competitor to the Acasis and Orico enclosures I have been researching. Oh, and the AJA testing platform might help in these kinds of tests; I've watched a few other videos where they use both Blackmagic and AJA to compare results.
Interesting idea. I thought of something similar but find the approach too clunky. Plus, I recommend staying away from the camera LUT. Final Cut puts the camera LUT at the first instance of the signal chain, leaving you no chance to recover highlights. As you said in your video, "When adjusting the colours of your clip, they stay within limits" — this results from the LUT clipping values at 0 and 100 (or 109, depending on your LUT). The same happens with the camera LUT. Long story short, if you expose improperly, you cannot rescue anything once the camera LUT is applied, since it will clip at 0 and 100 (or 109) too. To correct that, you would need to get hold of the video signal before the camera LUT, and that's impossible in FCP.

If you set up your signal chain (just like you would set up your node tree) in a way that allows for corrections BEFORE the device input transform, you will have much more flexibility. Try using an instance of colour wheels before the custom LUT (S-Log to ACES), then make all your adjustments in the middle and have another custom LUT (ACES to Rec709) at the end. Using this approach, you can get rid of adjustment layers. I think your intention was to ensure the proper order of operations for the signal processing, but the inspector at the clip level works from top (input) to bottom (output) too, just like your node tree from left to right. Then you could save the signal chain (pre-loaded with the proper ACES LUTs) as a preset, and Bob's your uncle. Hope this wasn't too all over the place. If you want to geek out a bit about colour, feel free to shoot me an email :)
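The order-of-operations point above can be shown with toy numbers — a deliberately simplified sketch where the "LUT" is just a clip at 0–100 and the "colour wheels" are a gain multiply, not real colour science:

```python
# Toy illustration of why order matters: a Rec709-style "LUT" that
# clips at 100 destroys highlight detail if exposure is corrected
# AFTER it, but not if you correct BEFORE the LUT.

def lut_clip(values):        # stand-in for the camera/custom LUT
    return [min(max(v, 0), 100) for v in values]

def exposure(values, gain):  # stand-in for a colour-wheels adjustment
    return [v * gain for v in values]

overexposed = [40, 90, 130, 160]  # log signal with highlights blown past 100

after_lut = exposure(lut_clip(overexposed), 0.6)   # correct AFTER the LUT
before_lut = lut_clip(exposure(overexposed, 0.6))  # correct BEFORE the LUT

print(after_lut)   # [24.0, 54.0, 60.0, 60.0] -> 130 and 160 both flattened to 60
print(before_lut)  # [24.0, 54.0, 78.0, 96.0] -> highlight separation preserved
```

Same clips, same LUT, same correction, and only the chain that adjusts before the transform keeps the two brightest values distinct — which is the whole argument for putting colour wheels ahead of the LUT.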
Plus, if you set up your signal chain as discussed, there is no need for different exposure versions of the LUTs: you correct exposure before the video even hits the device input transform to ACES. :)
In Final Cut, you can use a Custom LUT effect to transform from log, and put your color grading layers above it. This way you can edit your footage before the LUT.
I got the linked Konyead Thunderbolt enclosure and the 970 EVO Plus 2TB. My results were the same as yours on the read speeds (~2,700 MB/s), but I could only reach ~1,300 MB/s on the write (tried APFS, exFAT, and Mac OS Extended; tested on my M1 MacBook Pro with 16 GB RAM and two other similar Macs). Just sharing another data point, not expecting any help. That said, if anyone wants to drop any troubleshooting tips, I'd appreciate it.
Thanks for sharing this! If you haven't already, I'd try: 1. Switching the Thunderbolt cable. 2. Using the MacBook port directly (not a hub). 3. Formatting at the 'Device' level, not the 'Volume' level — in Disk Utility on Mac, go to 'View' and 'Show All Devices'.