
Blackmagic Davinci Resolve CPU vs GPU Rendering 

EEVblog2
111K subscribers · 6K views

Blackmagic Davinci Resolve Studio CPU vs GPU Rendering
Plus the Davinci Resolve Speed Editor.
If you find my videos useful you may consider supporting the EEVblog on Patreon: / eevblog
Web Site: www.eevblog.com
Main Channel: / eevblog
EEVdiscover: / eevdiscover
AliExpress Affiliate: s.click.aliexpress.com/e/c2LRpe8g
Buy anything through that link and Dave gets a commission at no cost to you.
T-Shirts: teespring.com/stores/eevblog
#ElectronicsCreators #davinciresolve #blackmagic

Science

Published: 14 Aug 2023

Comments: 79
@Razor2048 · 11 months ago
For DaVinci Resolve, by default, if there is only one GPU detected, it will not use full GPU acceleration unless you specifically enable it. It does this to avoid stutters on the main display while it is rendering. When a video card needs to use shared memory, you are essentially dropping from 300-400 GB/s of throughput to around 30 GB/s for any working data that spills over to system memory. For proper acceleration, you need at least 12 GB of VRAM for 4K, and will also need to ensure that full acceleration is used, especially for visual effects.
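As a back-of-envelope illustration of the bandwidth gap described above (the frame format and bandwidth figures are illustrative assumptions, not measurements):

```python
# Rough cost of moving one uncompressed 4K frame (32-bit float RGBA,
# roughly what a grading pipeline works with internally) over dedicated
# VRAM vs. shared system memory. Figures are illustrative only.

def transfer_ms(frame_bytes: int, bandwidth_gb_s: float) -> float:
    """Milliseconds to move frame_bytes at bandwidth_gb_s (decimal GB/s)."""
    return frame_bytes / (bandwidth_gb_s * 1e9) * 1e3

frame = 3840 * 2160 * 4 * 4   # width * height * RGBA * 4 bytes/float, ~133 MB

vram_ms = transfer_ms(frame, 350.0)   # assumed dedicated-VRAM bandwidth
shared_ms = transfer_ms(frame, 30.0)  # assumed spill-to-system-RAM bandwidth

print(f"VRAM: {vram_ms:.2f} ms, shared: {shared_ms:.2f} ms per frame")
# At 30 fps the whole frame budget is ~33 ms, so a few multi-millisecond
# spilled transfers per frame quickly eat into real-time playback.
```

The ratio, not the absolute numbers, is the point: a 10x bandwidth drop turns a sub-millisecond copy into a meaningful slice of the per-frame budget.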
@RexZoran · 11 months ago
@Razor2048 how exactly do you enable full acceleration, please?
@Razor2048 · 11 months ago
@@RexZoran Depending on the version, the names and positions of the settings can change, though you will often have to skim through the Decode options (both the System and User tabs) and make sure GPU acceleration is enabled for Blackmagic RAW decode, H.264/H.265, and, if needed, GPU for RED debayer. For the Memory and GPU settings, make sure any setting similar to or matching "Use display GPU for compute" is enabled; sometimes they will not enable it by default if you only have one GPU in the system. If using NVIDIA, then make sure the processing mode is set to CUDA. While it can also use OpenCL, they optimized it more for CUDA, which is why you see it performing better on NVIDIA compared to AMD video cards (though they are making progress with OpenCL). By default they disable options that can lead to the display lagging while doing a really heavy compute task.
@RexZoran · 11 months ago
@@Razor2048 Thank you for your reply! I am a little confused by it. It implies the 'use display GPU for compute' option is available when there is only one video card in the system, and that unless you specifically enable it you won't be getting acceleration. However, Resolve's manual (page 93 of the most recent version) seems to contradict what you said: it says 'by default, a single GPU system uses the same GPU for the DaVinci user interface and also for image processing ... if two GPUs are installed for image processing, this checkbox enables the shared use of the display GPU instead of dedicating it to just the DaVinci user interface', implying the option isn't available unless you have two display cards, and that in a single-card system it will be used by default. Could you please clarify?
@Razor2048 · 11 months ago
@@RexZoran It should allow 2 GPUs to be used. Usually a good setup is 2 similar-generation GPUs (same versions of OpenCL or CUDA), having it use both the GPU that is running the display and the 2nd GPU, or, when mixing, an older and a newer GPU. For example, someone with an RTX 3060 as well as an RTX 4070 in the same system.
@RexZoran · 11 months ago
@@Razor2048 Thank you!
@Thirsty_Fox · 11 months ago
One thing I've discovered is how much better 4K video on YouTube looks on a 1080p monitor than a 1080p feed on the same display. I always assumed it would make no difference, but it's massive due to the amount of compression you get with 1080p, especially on 1080p versions of videos uploaded in 4K.
@marcogenovesi8570 · 11 months ago
The 4K video has a higher bitrate, so even if you crush it down to a 1080p monitor it will still look better, even though you have lost the 4K resolution.
@Thirsty_Fox · 11 months ago
@@marcogenovesi8570 That's my understanding -- the compressed 4K gives similar data to uncompressed 1080p. I also think YT compresses the non-native resolutions even more (i.e. 1080p on a 4K video looks more compressed than 1080p on a 1080p video).
@Rocky712_ · 9 months ago
@@Thirsty_Fox YouTube is known to apply awful compression if your video is below 1152p. If you go with at least 1152p, YouTube will encode your video with a much higher bitrate. That's basically why it looks better: not because higher res is inherently better, but because they see that your video is resolution XY and think "now I need to give that video a higher bitrate because it is above this specific resolution". So basically it's a quirk of YouTube's website. Other websites might handle it differently.
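The effect this thread describes comes down to bits per displayed pixel. A quick sketch with illustrative bitrates (these numbers are assumptions, not YouTube's actual encoding ladder):

```python
# Bits per pixel for a few scenarios. Bitrates are made up but in a
# plausible ballpark; the point is the ratio, not the absolute numbers.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average encoded bits spent per displayed pixel per frame."""
    return bitrate_bps / (width * height * fps)

# A 1080p stream at ~3 Mbps vs. a 4K stream at ~16 Mbps, both 30 fps:
native_1080 = bits_per_pixel(3e6, 1920, 1080, 30)
native_4k = bits_per_pixel(16e6, 3840, 2160, 30)

# Watching the 4K stream on a 1080p monitor: the full 16 Mbps now
# feeds only 1920x1080 displayed pixels.
downscaled_4k = bits_per_pixel(16e6, 1920, 1080, 30)

print(f"{native_1080:.3f} / {native_4k:.3f} / {downscaled_4k:.3f} bpp")
# The downscaled 4K stream delivers several times more data per
# displayed pixel than the native 1080p stream, hence the cleaner image.
```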
@morphx666 · 11 months ago
Hey Dave, why don't you try inviting someone from DaVinci to explain all those settings? That would make an awesome video/collab.
@DerSchrottBastler · 11 months ago
I foresee the future: part 298: "finally starting editing videos" :D
@ScottGrammer · 11 months ago
FYI, the 2060 is not the fastest video encoder. The best trick is to buy a $120 Intel A380 and add it to your existing system. It encodes H.264, H.265, and AV1 like a bandit, WAY faster than a 2060, for $120. Not much good for anything else though.
@_BangDroid_ · 11 months ago
Solid advice
@gjmi72 · 11 months ago
You are probably talking about the embedded encoders. Does DaVinci use the embedded encoders, or does it use OpenCL (or another compute lib) to encode? I think the latter, since all of the GPU power is used. So then it becomes a question of how much grunt the GPUs have. Embedded encoders in CPUs/GPUs are usually not up to par with software encoders (on CPU or GPU) and you would not use them for producing quality videos (except game streams).
@brandonaitken5950 · 11 months ago
@@gjmi72 Resolve uses the hardware encoders on your GPU. Yeah, the quality isn't the best, but for a lot of uses where the bitrate isn't low, the quality differences aren't significant.
@jaro6985 · 11 months ago
Sure it's better, but not insanely better: anywhere from 5-40%.
@Veptis · 28 days ago
The A380 also supports more codecs for decoding than NVDEC in the 4090, especially for people who shoot 10-bit 4:2:2 on Sony, Canon, DJI, or GoPro.
@xjet · 11 months ago
If you set "network optimization" to ON then YT can start encoding your upload as soon as it starts... rather than having to wait until the entire upload is completed.
@EEVblog2 · 11 months ago
Not interested in that workflow. I save the file locally and then drag and drop later and do other things when required.
@ZylonFPV · 11 months ago
@@EEVblog2 I think xjet doesn't mean uploading direct from the editor; he just means that when you drag the file to YouTube, it can start processing as soon as the upload begins, rather than YouTube waiting for the whole file to upload.
@tschuuuls486 · 11 months ago
The thing with "pure GPU" encoding (NVENC for example) is: you won't get the same quality per MB/s in the file. The NVENC encoder is just less efficient than x264/x265 on the CPU. This is because NVENC is made for streaming, basically for "good enough" quality with low system overhead on live content. If you have the CPU resources free, it's generally better to go with CPU encoding, since you can also do multi-pass encoding and other tricks to get the best quality at the smallest bitrate. That's at least my latest info on that subject.
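One of the "tricks" multi-pass CPU encoding enables is hitting an exact file size by deriving the video bitrate from a size budget. A minimal sketch of that arithmetic (the size, duration, and audio figures are made up for illustration):

```python
# Classic two-pass sizing: pick the video bitrate so video + audio land
# on a target file size. Pass 1 measures scene complexity; pass 2 then
# distributes that bitrate where the footage needs it most.

def target_video_kbps(target_mb: float, duration_s: float,
                      audio_kbps: float = 192.0) -> float:
    """Video bitrate (kbit/s) that fills a target file size in MB (decimal)."""
    total_kbps = target_mb * 8000.0 / duration_s  # MB -> kbit, spread over time
    return total_kbps - audio_kbps

# e.g. a 25-minute video squeezed into 1.5 GB:
kbps = target_video_kbps(target_mb=1500, duration_s=25 * 60)
print(f"{kbps:.0f} kbit/s")  # → 7808 kbit/s
```

A single-pass hardware encoder given the same average bitrate spends it uniformly, which is why the CPU route tends to win on quality per megabyte.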
@EEVblog2 · 11 months ago
Yes, hence why I mentioned the file size being bigger on the NVIDIA version.
@Veptis · 28 days ago
In addition to that, if you get dedicated encoding hardware (not just part of a GPU), think capture cards, HyperDecks, or the ATEM Minis with ISO recording, they will be fast (fast enough for real time, usually) and exceed NVENC/Quick Sync in quality. Sadly, Resolve doesn't let you use your ATEM as an encoder, but the quality difference is substantial. It won't really make it through to YouTube, but especially at like 8 Mb/s or so it's night and day.
@unacomn · 11 months ago
Hi Dave. Improving encoding times can be a real pain when dealing with multiple resolutions and bitrates within the same project. I found that sometimes setting a higher bitrate for the finished product made encoding faster, but with a larger final file size. There are a lot of tweaks you can do with the GPU encoder; Resolve gives you a lot of settings, but it will take time to dial them in and see which is best. I have it running on a 1660 Super, also with 6 GB of VRAM. No playback issues, no proxies; just local SSDs and a surprisingly fast 5400 rpm 4 TB Seagate HDD.
@dragosmihai3489 · 11 months ago
Very interesting. I moved from OpenShot (open source and free) to Resolve (just free) recently. In OpenShot I had access to GPU rendering, while in Resolve I was pondering whether there's any gain for me in getting the "Pro" version for the GPU rendering, so this video came at the perfect time. My rendering rig is an old-ish Dell Precision with an even older NVIDIA GPU, but crucially, maybe, a pretty old Skylake laptop CPU, and just like you, I edit, then click render, and it takes however long it takes. On this laptop, moving from software/CPU to GPU NVENC in OpenShot did reduce rendering times for x264/high 1080p from around real time to less than half real time. The CPU is however an old quad core, power limited to 45 W, while your CPU has 3x the cores and a newer architecture; it could be that your CPU is just so fast the GPU doesn't make a difference. My feeling is that GPU memory is not an issue if the GPU has been loaded to 100%; if it had to wait for memory, you'd have seen lower GPU load.
@OMEGAS246 · 11 months ago
Have you tried using the other encoder presets rather than custom? I'm fairly sure the YouTube one and Masters both encode using the GPU in the free version.
@bobstark8749 · 11 months ago
Karnaugh Mapping?! I remember learning about that in Tech School in 1973. It was fun, I could understand it, but I couldn't see much practical use for it, but you're demonstrating it in your screenshot sequences. Hmm, it still must be a thing these days for digital circuit design engineers. Thanks for the video!
@grahammuppet · 11 months ago
Make sure you have Resizable BAR enabled in the BIOS; only a slight improvement though.
@TMS5100 · 11 months ago
Is CPU or GPU better for DaveCAD though?
@KeritechElectronics · 11 months ago
One of the other Daves (Dave Lovett - Usagi Electric) uses the full version of DaVinci Resolve, IIRC. That's a quick renderer indeed... Nice showcasing of the software.
@ScottGrammer · 11 months ago
I just bought one of those last week! Still trying to figure it out. PLEASE do a tutorial! I'll watch it a dozen times.
@FrankGennari · 11 months ago
Interesting and a bit surprising. I have an older PC with a quad core CPU and Nvidia GTX 1070. When I switched from software encoding to GPU (nvenc) I saw a huge reduction in encoding time. I didn't measure it, but it was several times faster. I guess it depends a lot on how many CPU cores you have.
@marcogenovesi8570 · 11 months ago
You need a lot of cores to compete with hardware accelerated encoding on the GPU
@GarthClarkson · 11 months ago
Dave. Like you, I also don't really care how long rendering takes. The reason I would like to get the Studio version is because of the codecs. I also use Handbrake in my workflow for transcoding but apparently you don't need to do it with Studio as it has lots more codecs including ProRes. I also don't need all the amazing equipment control, etc.
@JamesBalazs · 10 months ago
Just bought the Speed Editor, specifically because it costs £340 and comes with the £240 software, so I can finally use GPU rendering. Speed boosts all around with the new keyboard too, so I can't wait for it to arrive. Didn't expect an EEVblog video though when looking for a performance comparison! Great video, thanks.
@HappyyGamePlay · 27 days ago
How is the GPU rendering performance?
@Quindor · 11 months ago
To improve smooth editing, generate proxy files first. You can do so by right-clicking on the clip in the leftmost tab. The default proxy settings are kind of insane disk-space-wise; you can lower that to a lower quality, but don't use H.264/5, as those are not well suited for real-time decoding and cutting in and out, as you are noticing during editing. With a proxy clip it's smooth as butter.

I used to use Vegas Pro too, but it just couldn't handle my 4K60. DaVinci is such an improvement, like 10-fold, and almost no crashes anymore (compared to 50 per project in Vegas, and sometimes taking days to render parts of the file and join them again because Vegas would just keep shitting itself...). But yeah, DaVinci is nice, and with a bit of tweaking, that rig you have should allow you to edit 4K just fine (make proxies), and rendering, as you said, isn't your priority; or get a new GPU for that.
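Resolve generates proxies from its own UI, but as a rough sketch of what a proxy transcode amounts to, here is a hypothetical ffmpeg command assembled (not executed) in Python. The codec choice and scale are placeholder assumptions for illustration, not Resolve's actual proxy settings:

```python
import shlex

def proxy_cmd(src: str, dst: str, height: int = 720) -> str:
    """Build an ffmpeg command for an edit-friendly proxy: an
    intra-frame codec (DNxHR LB here) so every frame is a keyframe,
    downscaled to cut disk usage versus full-quality proxy defaults."""
    args = [
        "ffmpeg", "-i", src,
        "-vf", f"scale=-2:{height}",                # keep aspect, shrink height
        "-c:v", "dnxhd", "-profile:v", "dnxhr_lb",  # low-bandwidth DNxHR
        "-c:a", "pcm_s16le",                        # uncompressed audio scrubs well
        dst,
    ]
    return shlex.join(args)

print(proxy_cmd("A001_clip.mov", "A001_clip_proxy.mov"))
```

The intra-frame codec is the important part: it trades disk space for the instant random access that makes timeline scrubbing smooth.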
@ZylonFPV · 11 months ago
The H.264 proxies actually work quite well in Resolve, because the keyframes are set to one per frame, I believe. Normal H.264 has keyframes further apart, so when skipping it has to regenerate the frame from the keyframe.
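The seek cost described above is easy to quantify: to show frame N, a decoder must start at the previous keyframe and decode forward. A small sketch (the GOP sizes are illustrative, not any particular encoder's defaults):

```python
def frames_decoded_for_seek(target_frame: int, gop_size: int) -> int:
    """Frames that must be decoded to display target_frame when a
    keyframe occurs every gop_size frames (all-intra when gop_size=1)."""
    distance_from_keyframe = target_frame % gop_size
    return distance_from_keyframe + 1  # keyframe through target, inclusive

# Delivery H.264 might place a keyframe every 250 frames;
# an all-intra proxy has one on every frame.
print(frames_decoded_for_seek(1234, 250))  # → 235
print(frames_decoded_for_seek(1234, 1))    # → 1
```

That 235-vs-1 gap is why scrubbing long-GOP footage stutters while all-intra proxies feel instant.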
@jasonk9779 · 11 months ago
I also have moved to Resolve from Vegas, though on a Mac Studio and it just flies on 4k footage. Likely due to the Apple Silicon having dedicated encoders. But the best part of Resolve vs Vegas is the stability. No more endless crashes.
@marcogenovesi8570 · 11 months ago
Strange how people prefer software that doesn't crash all the time.
@MatthewSuffidy · 11 months ago
I have a Ryzen 5900X and an RTX 3060. I use my first-generation Ryzen Asus Prime B350-Plus with the BIOS update, and I am really happy with my computer. I found I was getting overheating and crashes using just a fan, but I altered Precision Boost so it clips the max boost frequency to the one in the specs (4.8 GHz), and I also capped the max wattage at 105 W. I get pretty much the specified performance, the heat is OK, and the system runs great.

To my knowledge the 3060 was the lowest card to support Resizable BAR, so I ran the built-in Windows legacy-to-EFI converter and run it that way now. I just did a Blender test and got 2 min 47 s on CPU vs 36 s on GPU, and 27 s with the CPU as a CUDA device alongside the GPU. This is potentially not an equivalent work type, though.

One time I tried 'auto-tune' out of desperation; it tried to raise the base frequency of the CPU, and when it went to 100% load the system just restarted. I hope it didn't hurt anything, but it is working fine. I set the thermal throttle limit to 78°C, which it reaches under certain loads, but under 100% Blender load it is actually a bit lower, like 74°C. I got the CPU and GPU on sale using extra restaurant tips last summer; they were both around 450 Canadian dollars before tax. I got an open-box Gigabyte Eagle OC 3060; I don't overclock it and it never goes above 70°C.
@dynorat12 · 11 months ago
I remember the old days when you never edited your videos. Wow, how things have changed, lol. Great videos.
@EEVblog2 · 11 months ago
I have always edited my videos, even from video #1. The exception is some single-take videos on the 2nd channel.
@PhilipBryden · 10 months ago
Well, this has convinced me not to upgrade, so you've saved me £245. My CPU is a 5900X, and sometimes when editing the timeline it can feel a little choppy, even at 1080p 30 fps. Someone mentioned generating proxy media, so I tried that and it works like a dream now. Thanks for the video. I don't mind waiting a few extra minutes for the video to render.
@AdamsLab · 11 months ago
I've never understood your reason for using handbrake for transcoding. Seems like an extra step for no reason...
@dolbyman · 11 months ago
Black not back (title and description) :)
@xjet · 11 months ago
Nah... Dave has it right... BackMagic Dissolve 😛
@gglovato · 11 months ago
Leaving your comments aside, the 3000 and 4000 series have much improved NVENC and NVDEC blocks, which are faster and have much better quality at the same bitrate compared to 2000-series GPUs (there's a comparison table on Wikipedia, IIRC). Now I'm very curious to see this same test repeated on a 3000-series GPU; maybe some company will send you one? ;) Also, you were looking at "shared memory", which means nothing really; GPU memory usage is the one to look for, and it won't make any difference for video editing.
@AlexConner · 11 months ago
Your actual encoder settings were still set to high quality tuning.
@DJlegionuk · 11 months ago
It's very interesting to see how Resolve chooses to use the GPU or CPU to decode some files. The thing I don't understand is why it can't use the CPU and GPU at the same time; you can see the GPU at 100% while the CPU is at 10%, all that processing speed sitting unused.
@Veptis · 28 days ago
GPU rendering, as in calculating effects, is actually included in the free version. Not using the YouTube preset will give you more options. Also, opening the advanced settings will allow you to, for example, "bypass re-encode when possible", which really speeds things up, as it doesn't need to decode and then encode again. Also, the quality high/low/best setting isn't that useful; rather, restrict it to a bitrate you set by hand. Having Studio will give you plenty of other new features, and the edit page performance should be much greater, even generating thumbnails for the timeline tracks...
@Distinctly.Average · 11 months ago
I reckon Blackmagic are missing a trick here. That editor and their keyboard could, with a bit of software, so easily be usable as a Photoshop editor too. Yes, I know Adobe are a competitor, but this could open the door to a lot of customers. Same for Affinity Photo, Capture One, etc. I use Photoshop for images and DaVinci for video. Photoshop users often pay just this much for a dedicated editor keyboard. If they get DaVinci thrown in, it may quickly turn them away from Adobe's video editing tools.
@Lynxxde · 11 months ago
The speed of the GPU doesn't matter for video encoding/decoding; it's done by the NVENC part of the GPU, which was introduced in 2012 and has gained new features like HEVC and, most recently, an AV1 encoder. But in essence, for the same settings, the generation of NVIDIA card doesn't matter; the speed doesn't change.
@EEVblog2 · 11 months ago
Yes, I've mentioned this in previous videos. You only need the low end of a new chipset to get the exact same encoding engine on that chipset. I do think the 4060 uses a newer core than the 2060 though?
@Quindor · 11 months ago
Oh yes it does, especially on the 40-series GPUs; on a 4080 it's at least double the speed of a 3080, for instance. And there have been upgrades in the 30 series vs the 20 series too. P.S. The 4070 and lower (from memory) don't get the new double-split encoding engine, so they will see less of an improvement!
@StingyGeek · 11 months ago
I think some of the later NVIDIA GPUs have multiple encoders. Well out of this duck's price range, but it may be beneficial in multi-input workflows?
@pldaniels · 11 months ago
I ended up buying NVIDIA T400s, specifically because that was all I needed to get GPU encoding to help me with OBS, and it was cheap as well as very low on power consumption. Nice to have the 3x Mini DisplayPort outputs too.
@leandrolaporta2196 · 11 months ago
The stutter you have when editing in the playback screen is because of the mechanical hard drives in the NAS; it is NOT the network. I tested ALL of it, because it drove me crazy that, quite often while editing, the screen would freeze while the audio kept going. I tried copying the source files to a local WD Black on the editing machine and it made no difference; the real difference was having it all on NVMe and editing from there. Wow, it blew my mind; the difference was astronomical. So if you want speed, have all the source footage on NVMe drives. The destination doesn't matter; any HDD NAS will do.
@EEVblog2 · 11 months ago
Nope. I tried editing from a local SSD with no difference.
@ZylonFPV · 11 months ago
Might be good to get hold of a 40-series NVIDIA card so you can get AV1 encoding 😊 It's 25% better quality for the same file size.
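Reading the "25% better quality at the same file size" claim the other way around, a similar-quality AV1 encode can get away with a noticeably smaller file. A trivial sketch of that trade (the efficiency figure is the comment's claim, not a measurement):

```python
def av1_size_mb(h264_size_mb: float, efficiency_gain: float = 0.25) -> float:
    """Approximate file size for roughly equal perceived quality if AV1
    delivers the same quality from (1 - efficiency_gain) of the bits."""
    return h264_size_mb * (1.0 - efficiency_gain)

# A 2 GB H.264 upload could shrink to roughly:
print(f"{av1_size_mb(2000):.0f} MB")  # → 1500 MB
```

In practice the gain varies with content and encoder settings, so this is only a rule-of-thumb estimate.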
@mikehensley78 · 11 months ago
@00:34 ... That's what she said. :)
@mcconkeyb · 11 months ago
68°C seems a bit too warm. Does your GPU rate-limit when it overheats?
@zadrik1337 · 11 months ago
If you could figure out "feel-a-vision" you would revolutionize a parallel video industry.
@lezbriddon · 11 months ago
I'm moving up in the graphics card world; finally got a 1080.....
@gabest4 · 11 months ago
There's no use comparing different encoding implementations. The algorithm is different; the result is different. I can make ffmpeg encode ten times faster or ten times slower than a GPU; it depends on how much motion search I want.
@XSpImmaLion · 11 months ago
There are a few things that explain why there wasn't much difference there. I think the main point is basically that you have a very nice CPU, Dave, but your GPU, while no slouch, also isn't anything extraordinary. The NVIDIA 20xx is already 2 generations behind, and the xx60 class is one step above budget class; it's at the very bottom of mid-range. Still better than my 1660 Ti, but you know. There's also the fact that you are basically not using the PC for anything else, which lets DaVinci max out CPU usage for rendering.

These days, the difference between CPU and GPU on basic rendering tasks is not the huge gap it was in the past; with multithreading and 12-core CPUs, CPU rendering has become much faster than it used to be. Or rather, it depends on what your project contains. For instance, if you had a ton of 3D scenes, multi-cam, tons upon tons of layers, transitions and effects, text overlays and all that jazz, then perhaps you'd see a bigger difference. But with basic cuts and few layers, you won't get much of a difference. That, or if you were using the PC for something else during the render process. And then, I think there are some tweaks and configuration changes to optimize GPU usage, but I'm not too familiar with them. That's more or less it.
@vincei4252 · 11 months ago
Someone send Dave a graphics card with 48 GB of RAM 🙂
@ErrorMessageNotFound · 11 months ago
Chrome uses some GPU memory; you'll probably have more free if you close it. Also, as you mentioned, you do have a beast of a CPU; that may be part of why it's not much of an improvement.
@markissboi3583 · 11 months ago
My i7-9700K and watercooled 2080s beat most new GPUs :) It cost a bit in 2019, but I never need to upgrade; a bit like the people who bought the 1080 Tis, big performance beyond 2023.
@6581punk · 11 months ago
You need a better GPU, obviously. If your GPU and CPU are on a performance par, then it won't really help that much. A 2070 Super is a good price/performance GPU card. The 4090 is the latest, and I tend to find it doesn't get fully utilised by apps.
@marcogenovesi8570 · 11 months ago
Everybody is leaving Vegas for Resolve
@TwithGazz · 11 months ago
I'm rendering as I watch this video. Renderception.
@stelmo502 · 11 months ago
Black to the Future. Black to front, ???
@nonsuch · 11 months ago
Get a RTX 4090.
@HappyyGamePlay · 27 days ago
Do you have one? How does it affect your render times, and what settings help it go faster, if you don't mind me asking? Should I just use what he uses in the video?
@simonpaul9795 · 11 months ago
It would be much better with a better card. The 2060 is crap.