Thanks for the tip, can't wait to test this out! For anyone who has an Intel CPU with integrated graphics but the Intel Quick Sync option doesn't show up - one possibility is that you don't have the internal graphics enabled; they're disabled by default if you have a separate GPU. You can enable this in the BIOS, but the exact menus depend on the motherboard you're using - a YouTube search for "enable quicksync" should tell you everything you need to know.
Thank you! You are a life-saver. I have a 13600K and I searched for the Intel option in DaVinci but didn't see it, so I thought maybe my iGPU was dead. Lol
@@theTechNotice if I enable this, will the NVIDIA GPU still be doing effects and color grading? I have an 11400H; it supports Quick Sync, but I don't know if it's better than my 3060.
Your tip is worth its weight in gold, thank you very much. The export in particular runs about twice as fast on my MSI laptop with Intel 12700H. Best regards, Karl
In DaVinci Resolve 18 this option to select NVIDIA or Intel was gone. I use an old NVIDIA GPU and a 12th gen Intel i5, and it works fine with 1080p and with Mavic 4K.
Do a search on your CPU on Intel's website and see what codecs it supports. But any Intel CPU should, except those whose name ends in an F, if I remember correctly - those don't have an iGPU.
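The F-suffix rule above can be sketched as a tiny check - a rough heuristic for Intel desktop model names, not an exhaustive SKU decoder (note KF also ends in F and also lacks graphics):

```python
def has_igpu(model: str) -> bool:
    """Rough heuristic: Intel desktop SKUs whose suffix ends in 'F'
    (e.g. 12400F, 13600KF) ship without integrated graphics,
    so no Quick Sync option will appear in Resolve."""
    return not model.strip().upper().endswith("F")

print(has_igpu("i5-12400F"))   # False - no iGPU, no Quick Sync
print(has_igpu("i5-13600K"))   # True  - has UHD 770 graphics
```

Intel's ARK pages remain the authoritative source for which codecs a given iGPU actually decodes/encodes.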
To my knowledge, the current Intel (Arc) GPUs have a much slower video encoding engine than the iGPUs in their CPUs. I don't know if it's the same for playback (decoding).
So far I’m finding that in 18.5.1 I have to use the options to force Resolve to use QuickSync for decoding if there is an Nvidia card present. With both selected, it keeps trying to use Nvidia. I haven’t tested with AMD or using proxy or optimized files - only straight from camera h.264 files so far.
I have a question about buying two new upgrade items for my dying i5 6500 Skylake build. I'm looking at a 12400 or 13400; for the mobo I want to get an Asus Prime H670 or Asus Prime H770. I have no other budget, so I'll keep using the same 32 GB of DDR4-2133 (2400 MHz with the XMP profile). Please give me your experienced advice, master 🤔
11th gen has UHD 750 graphics, almost the same as the 12400's UHD 730; it has one video decoder. CPUs from the 12500 and higher have UHD 770, which has two video decoders, so the UHD 770 is a bit faster than the 750.
Thank you for not being a DaVinci fanboy and saying it's my computer's fault, when this computer cost 20 grand and nothing has ever caused a CPU spike EVER. Thank you for offering real solutions. I am about to switch back to Premiere if it doesn't stop. I haven't uploaded in 3 days because of this trash program. They actually banned me from the Discord because I told them it's DaVinci's fault. Craziness. P.S. This worked - I didn't have the Intel option, but I unchecked NVIDIA. Thank you.
That's unfortunate, man. I've had the same experience as you with Adobe Premiere - that's why I switched to Resolve Studio. I hated losing progress in Adobe. Maybe it's due to Windows or an older NVIDIA GPU, idk. I have a 13th gen iGPU with a 4090 and no crashing on my end on 18.6.
Two questions: First, is this only if you have Resolve Studio (I have the free one)? And second, is this for all Intel processors or just the newer generations? (I have a Core i7-8700K on my desktop and a Core i7-9750H on my laptop.)
I tried this and it really helped with h.265 files, but at the same time I lost a lot of speed with h.264 files. Playback of h.264 files stutters even when the files are placed on a FireCuda 530 disk. But with this box checked, full HD h.264 files play smoothly even when they are on my NAS, which is so much slower than the FireCuda.
This is so helpful! Thank you so much. How much better are 12th and 13th gen than my 9900K in Resolve? I don't know much about encoders and decoders, so I don't know how good they are on my CPU. Thanks!
Thanks for the video! You are talking about timeline performance - does this include VFX and color grading? What about rendering? Would you use the NVIDIA card to render the final output?
I literally switched to resolve days ago so this is timed perfectly. I’m using Intel 12900 cause I don’t overclock. My GPU is RTX 3090 Evangelion edition 😎
I don't have that option within the decode options. Currently, I am trying to figure out an intel hardware acceleration error I have on startup. That might have something to do with why I don't have that option. All my drivers are up to date and still, the error persists. I can't find any settings to allow me to fix this. I will keep searching for answers that may explain this.
I was using Premiere and I noticed it does switch to the best choice, which was a relief, as I just want it to work without having to go into settings, opening/closing them depending on what I am doing.
I'm using OBS gaming footage that's 2K h.264, but in Task Manager the iGPU is never in use or decoding video at all, no matter what options I set. I have the Studio version of DaVinci, an RTX 4090, an i9 12900K, and 164 GB RAM. I cannot figure out what is going on and there's nothing on Google about it - please can you help me? The Intel GPU is greyed out in the video decoding options; I can only select the RTX 4090. Next to the iGPU in the decode options it says the iGPU is being used discretely, or something along those lines. My video is super choppy even in proxies and quarter res. Please help me.
Does that mean those laptops with, say, a 13900H and no dedicated GPU would be good enough for editing? Have you tried that? Otherwise, maybe it's not bad to get the Arc A380 for those who don't have a 12th/13th gen Intel CPU?
I have a 9900k with 2080ti and 32 gb of ram. Are 12th and 13th gen chips way better than my 9th gen for encoding and decoding and overall experience? I do 1080p and 4k editing for someone and don’t mind upgrading.
Hi all. I would like to see a real comparison of the AMD RX 7900 XTX 24 GB vs the RTX 4080 - wouldn't you? It just has more memory and does better in tests. Especially interesting with the i5 13600 processor, versus all the 11th and 12th gen processors. Now that would be a money saver. But who would go for that? Or?
Random question: how good is the GTX 1660 Ti compared to the RX 6600 in DaVinci Resolve? Some said you don't really need CUDA cores in DaVinci, unless you're using Premiere - is that true?
I didn't know there was an i5 13th gen! Just found one on newegg. Putting together a machine for resolve 18. Kinda out of my depth. I just don't want to make a purchase and regret it a year from now when I really know what I'm doing.
I have an NVIDIA GeForce RTX 3060 and I don't have the option to change the setting like you do - is this because I'm using the free version? It won't let me select NVIDIA as my render encoder, only H.264/H.265.
OK, DaVinci Studio 18.1.2 on Linux Mint 21.1 with an Intel 12700, 64 GB DDR4, 1 TB SSD, and an NVIDIA RTX 3060 12GB. Rendering a 4K clip with Fusion elements and some effects: 57 secs with NVIDIA ON, 1:03 min with NVIDIA OFF.
To my knowledge, the current Intel (Arc) GPUs have a much slower video encoding engine than the iGPUs in their CPUs. I don't know if it's the same for playback (decoding).
@@akyhne I'm from Mongolia and English is not my native language, so I'm sorry that I didn't hear what I asked about being mentioned several times in the video. If it were a Mongolian asking, I would have helped by giving a timestamp to listen to it again, but it seems every country has its own people - you seem to just scold me. I'm not brilliant at tech, that's why I asked. I will delete my silly question, but this software should be able to choose one over the other for decoding.
@@AexoeroV The CPU is more powerful than any GPU for playback. It also supports more codecs. Therefore, there's no reason to have both checked. The software cannot know how much "energy" the CPU or the GPU uses at playback, so it doesn't always make the correct decision. That's me saying that, not the guy in the video. So if Premiere Pro can figure it out, it's because it's built into the software: a kind of matrix of CPUs and GPUs, where the software simply checks "oh, you have an Intel Core i9 13900K and an RTX 3060, so therefore I should choose the CPU for playback". This can easily get complex, so Blackmagic, who makes DaVinci Resolve, chose not to.
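The "matrix" idea described above could be sketched like this - a purely hypothetical illustration of the kind of lookup table an NLE might ship; neither Premiere's nor Resolve's actual internals are public, and every entry here is made up:

```python
# Hypothetical (CPU, GPU) -> preferred-decoder table, illustrating the
# kind of built-in matrix the comment imagines Premiere using.
DECODE_MATRIX = {
    ("i9-13900K", "RTX 3060"): "cpu",  # Quick Sync preferred (per the comment's example)
    ("i5-9600K", "RTX 4090"): "gpu",   # invented entry for illustration
}

def pick_decoder(cpu: str, gpu: str, default: str = "gpu") -> str:
    """Return the preferred hardware decoder for a CPU/GPU pair,
    falling back to a default when the pair isn't in the table."""
    return DECODE_MATRIX.get((cpu, gpu), default)

print(pick_decoder("i9-13900K", "RTX 3060"))  # cpu
print(pick_decoder("unknown", "unknown"))     # gpu (fallback)
```

The complexity the comment mentions is exactly this: the table would need an entry (or a heuristic) for every CPU/GPU/codec combination to stay correct.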
Hello, I need help with my Intel i9 13900K. I'm new to both PCs and DaVinci Resolve, and every time I play footage my Intel processor hits 100% as soon as the video starts playing. Please, I don't know what I am doing wrong.
Will this even work with a 13600KF? I.e., no built-in GPU? I'm looking at a build for Resolve right now and considering saving the money, as I'll have a pretty powerful GPU.
Is it a good idea to build a computer for the DaVinci free version using a processor with integrated graphics and more RAM, such as 32 GB, without a separate dedicated graphics card? Is an Intel processor with integrated graphics enough for stable operation? My needs are small: video forms up to 20 minutes, FHD movies or sometimes 4K. Thx.
This will cause some out-of-memory problems when you are adding heavy grading nodes. Try putting the Neat denoise plugin on and you will see. Sometimes I can't even export the project, or I can but the final rendered file has glitches. You need to tick only NVIDIA and then it never has an error when exporting. Talking about 4K footage. Tried so many times…
Recently I bought Intel Core i5-13600K. It works with MSI B660M Mortar mobo. Which BIOS options are recommended to be enabled/disabled for the cpu to be as cool as possible, while staying fast?
It depends on your project size/footage. For my projects with 8K h.264/h.265 footage, the cards with 10GB or less of VRAM run into issues. The cards with more than that are showing over 9GB of VRAM usage almost as soon as I start working with the footage. So that's why I still use a 3060 12GB on my less powerful system. So if you are working at 4K or lower, the Ti might be better. But honestly I'd seriously look into an Arc A770 16GB or 16GB AMD cards like the 6800 XT or 7800 XT. I've been testing all of them against the 4070 this week and the results have been very interesting. On my oldest system (with no iGPU), I swapped in a 4070 and a 7800 XT for the 3060. In PugetBench, the 7800 XT came out with the highest overall score (both standard and extended), but there was a split in terms of which parts of the program were handled better by the 7800 XT vs. the 4070. The 7800 XT won 4K Media, 79 to 77. The 4070 won 8K Media, 73 to 64, and GPU Effects, 112 to 98. But then the 7800 XT won Fusion, 290 to 244. The A770 has been really good with decoding footage too, even without the aid of the 13700K iGPU. I haven't tested it together with the iGPU yet.
To my knowledge, the current Intel (Arc) GPUs have a much slower video encoding engine than the iGPUs in their CPUs. I don't know if it's the same for playback (decoding).
@@akyhne This same channel said that the Intel Arcs have strong encoders that could even match higher-end NVIDIA cards, though obviously with less GPU muscle.
On Windows the Free version doesn't support a lot of things that the Studio version does in this area:
- no support for multiple GPUs (or combining the iGPU and a discrete GPU)
- no hardware acceleration for h.264 and h.265, so you don't have access to the same preferences
- no support for h.264 10-bit 4:2:2 footage (it shows up as audio only)
On Mac some of that changes, but on Windows you really need Studio to get a fast experience working with h.264/h.265 footage. If you are using the free version, you may want to convert your footage to the DNxHR format, since without the h.264/h.265 GPU acceleration, CPU playback is a lot easier with DNxHR.
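If you do go the DNxHR route, a typical ffmpeg transcode can be assembled like this - a sketch, not the only valid recipe: `-c:v dnxhd -profile:v dnxhr_hq` with PCM audio in a .mov is the standard way ffmpeg produces DNxHR HQ, but profiles and pixel formats should be matched to your source:

```python
def dnxhr_cmd(src: str, dst: str, profile: str = "dnxhr_hq") -> list:
    """Build an ffmpeg command line that transcodes src to DNxHR in a
    .mov container, which the free version of Resolve plays back easily
    on the CPU (no h.264/h.265 GPU acceleration needed)."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "dnxhd", "-profile:v", profile,  # DNxHR profiles live under the dnxhd encoder
        "-pix_fmt", "yuv422p",                   # 8-bit 4:2:2; use yuv422p10le with dnxhr_hqx
        "-c:a", "pcm_s16le",                     # uncompressed audio, as DNxHR MOVs usually carry
        dst,
    ]

print(" ".join(dnxhr_cmd("clip.mp4", "clip.mov")))
```

Expect much larger files than h.264 - that's the trade-off for an edit-friendly intraframe codec.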
This isn't true for me with a 4090/13900K - turning off NVIDIA means it will use the CPU and the Intel GPU instead. If I disable my 4090 and then watch the task monitor, it will still use the Intel GPU AND the NVIDIA GPU simultaneously. This is with Sony 8K H.265 Log media, at least. Which combination of HW acceleration it decides to use is very dependent on the codec and on what Resolve activity you are doing (edit/colour/FX). Best to leave them all turned on; you're just stressing your CPU more if you turn it off.
@@theTechNotice Yep, that's what I was using: 8K H.265 4:2:2 at max bitrate. Resolve 18.1.2 and the latest Studio drivers. I'll try 4K footage, but with 8K it's best to leave NVIDIA ticked.