As someone just starting to work seriously in Blender, this video is an absolute godsend 🙂 Thank you very much indeed for producing and uploading it; it's very useful 🙂
Thanks man I appreciate it! Would love to see some of your beautiful 3d work using any of my sketches in the future. You are a beast! Anyway have a great day and keep hustling!
Just one thing: the tile size under the render settings is important for optimizing performance (smaller for CPU, larger for GPU), but if it's too big or too small there will be no performance improvement, and the sweet spot depends on your hardware. Otherwise, great video.
Literally 6 years later and this covered my exact question, "can you render something with GPU and CPU at the same time?" Everywhere else didn't say much, but this video is the exact answer I needed, thank you hehe. PS: Blender has come a long way, sheesh.
Also, you should note that for CPU+GPU rendering you should use a tile size of 16x16, or 32x32 if you use the denoiser, because Cycles has been optimized for that with CPU+GPU. The denoiser part is still up in the air because it could change, so it's safer to stay at 16x16. The end goal is to remove the tile size option and let Cycles decide the fastest size, but for now you should mention this. Cheers.
This is what I was referring to when I mentioned render settings at the end; I didn't feel a deeper dive was required for this tutorial, but thanks 👍
Great then! I mentioned it because most people will keep using old settings like 64x64 or 256x256 and will find that CPU+GPU is MUCH slower, when the problem is just that :)
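The tile-size advice in this thread can be condensed into a tiny helper. To be clear, the numbers below are the rules of thumb from these comments (16x16 for CPU or hybrid rendering, 32x32 with the denoiser, larger tiles for GPU alone), not official Cycles defaults, and the right value always depends on your hardware.

```python
# Sketch of the tile-size heuristic discussed above. The values are
# rules of thumb from this comment thread, not Cycles defaults; the
# optimal size varies by hardware, so treat these as starting points.

def suggested_tile_size(device: str, denoising: bool = False) -> tuple:
    """Return a (width, height) tile size for a Cycles render device."""
    device = device.upper()
    if device == "CPU+GPU":
        # Hybrid rendering was tuned for small tiles; the denoiser
        # reportedly prefers slightly larger ones (this may change).
        return (32, 32) if denoising else (16, 16)
    if device == "CPU":
        return (16, 16)      # CPUs like small tiles
    if device == "GPU":
        return (256, 256)    # GPUs like big tiles
    raise ValueError(f"unknown device: {device}")

# Inside Blender this could be applied via bpy (untested sketch):
#   scene.render.tile_x, scene.render.tile_y = suggested_tile_size("CPU+GPU")
```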
I love your work, it's always amazing. The one thing I hate is that even with a lot of subs, hardly anyone opens your wonderful videos. You deserve more subs than you have, keep it up❤️👍💖💕
THANK YOU, I tried the settings and disabling the CPU nearly doubled the rendering speed. It seemed to spend a while on denoising before. On top of that, I had earlier switched my 1660 Ti from the Game Ready driver to Studio, which boosted speed by up to 3x. Now I have a 15-hour render ahead of me instead of 40 hours, and that's with 8 samples instead of 12, too. Edit: nope, it was just a glitch where the GPU can get slow after PC sleep/hibernation, needing a reboot or disabling then re-enabling the GPU driver. The Game/Studio driver setting probably does nothing.
Hi, can you tell me how you did that? I have an RTX 3060 and Ryzen 7 5800H, yet rendering still uses the GPU at only 4% and the CPU at up to 80%, even when I set it to GPU Compute.
@@amaraahmad7366 Dunno what to tell you. I updated my comment above because what I thought fixed things was actually irrelevant. The video might not even be that relevant now, bit old. I've nothing to suggest, sadly.
Would like to see an updated version of this. I am trying to get a 2070 Super + GTX 970 to render at the same time with Blender 2.8 and haven't found a solution so far. The 970 is always set as "Display" in the Blender preferences, but I want to use all the GPUs for rendering, and I'd like to add a third card too if I can get two rendering at the same time. Win 10 Pro version 1903 with NVIDIA Experience drivers 431.60 (August 2019).
I got all 3 cards working on my MSI Z97 Gaming 5 motherboard with DDR3 memory: an RTX 2070 Super + GTX 970 + GTX 970 + i7 4790 clocked at 4GHz. I do the Classroom benchmark in 2 min 12 seconds and the Barcelona one in 4 min 28 seconds... absolutely blazing fast speeds. With the new denoisers I can halve that time again. The improvements this year are pretty crazy.
(In a CPU + GPU rendering environment.) My guess is that the number of CPU cores matters more than the GPU. What do you think? For example, I assume the speed of a Ryzen 1700 + GTX 1070 will be similar to a Ryzen 1600 + GTX 1080 Ti, but it is not easy to find related benchmarks. Do you have any information?
Clearly for this particular use the GPU is more important. BUT with my 16-thread Ryzen 2700 + GTX 1060, when I switched to GPU render + CPU support (instead of CPU render + GPU support) I got a massive increase in rendering speed, more than 2x faster.
Hi, Danny Mac 3D. Congratulations on this video. Can you tell me if we (Blender users) can use GPUs to render with the Blender Render engine instead of Cycles? I have no experience making videos with Cycles, and I don't know whether some objects in my project can't be rendered in Cycles. Best regards! :)
Sadly, once 2.8 is out this tutorial will be outdated, as the Ctrl Alt U shortcut has now been removed (probably for good), and the User Preferences entry has moved from the File menu to the Edit menu.
I did think about this before I recorded it, but it could be a while before it's out so I did it anyway - it's hardly a groundbreaking tutorial though, so I don't mind the short shelf life :)
My Blender install used to do this by default, but now it renders the whole scene as one big tile, and only on one thread via my CPU. I've made it use my GPU now, but even when I select both it only uses the GPU. Anyone know how to fix this?
They should Kickstart Blender or raise more financing through donations. It's more intuitive than Maya or Max for modeling, and Cycles feels very simple to use for rendering; no idea about Eevee yet... I still prefer material editing in C4D... They should definitely offer a more stripped-down add-on experience and focus on making Blender a more compact piece of software with fewer window pulldowns =)
System specs: Dell G7 laptop, GTX 1050 Ti, Intel 8750H 6-core i7, 16GB RAM. Renderer: Cycles. Scene: Classroom (Blender demo scene). Single 1920x1080 frame: CPU only 23:42.95, GPU only 11:46.20, CPU+GPU 9:07.50. Thanks for the video.
Hello, I am following the steps to change to GPU, but the only option under Cycles compute device is "None"; there is no CUDA option. Does anyone know why that is, and if so, can you help me get the CUDA option?
Even with GPU Compute selected and my GPU ticked under the CUDA menu, I still see the GPU sitting idle in Task Manager on Windows 10 while only the CPU is busy working. The render takes forever compared to Maya.
@@basspig Change your tile size in the Performance tab; the CPU likes smaller tiles while the GPU likes bigger ones. For me 16x16 is optimal for CPU and 512x512 for GPU. I don't think you can use two tile sizes at once though, so it requires a lot of experimenting, and in my case GPU alone is still faster, but not by much.
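For the "no CUDA option" and "GPU sitting idle" questions in this subthread, the device-selection logic can be sketched as plain Python. In Blender the device list would come from the Cycles add-on preferences via bpy; here it is modeled as simple (name, type) tuples so the logic runs anywhere, and the example rig below is hypothetical.

```python
# Hypothetical helper showing the logic of enabling both CPU and GPU
# devices for Cycles. In Blender the list would come from
# bpy.context.preferences.addons['cycles'].preferences.get_devices();
# here it is modeled as plain (name, type) pairs so it is testable.

def pick_devices(devices, backend="CUDA", use_cpu=True):
    """Return the names of the devices that should be enabled.

    devices: iterable of (name, type) pairs, e.g. ("GTX 970", "CUDA").
    backend: "CUDA" for NVIDIA cards, "OPENCL" for AMD cards.
    use_cpu: also enable CPU devices for hybrid CPU+GPU rendering.
    """
    wanted = {backend}
    if use_cpu:
        wanted.add("CPU")
    return [name for name, dev_type in devices if dev_type in wanted]

# Example rig (assumed, matching the multi-GPU setup discussed above):
rig = [
    ("RTX 2070 Super", "CUDA"),
    ("GTX 970", "CUDA"),
    ("GTX 970", "CUDA"),
    ("i7-4790", "CPU"),
]
```

If CUDA never shows up at all, the usual causes are an unsupported or AMD/Intel GPU (OpenCL territory) or missing NVIDIA drivers, rather than anything in this selection step.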
Hi, I'm having an issue where when I try to use my eGPU it always turns my character's hair black or destroys any transparency in the scene. It just messes up my materials. Do you know how to fix this? I'm using the Blackmagic eGPU on Mac with 2.79. The same one in the video.
I have an i7-6700HQ at 2.60GHz and a GTX 960M. When I use CPU+GPU it's slower than using only the GPU. Does anyone know why that happens? The render finishes but keeps calculating the scene about 5 seconds more (the same difference as between the CPU+GPU render time and the GPU-only render time).
@Jared That's not true at all lol. 99% of the time the GPU will be faster, unless you have a horrible bottleneck. Graphics cards have thousands of cores vs. just a few on CPUs.
When I check CUDA with CPU and GPU at the same time, I still only get 3 squares working, not 9. Do you know why this is, and how I can solve it? Also, the CPU is working at full capacity while the GPU is doing nothing when I look at the activity via Ctrl+Alt+Delete.
The number of CPU-dedicated tiles depends on the number of CPU cores you have. If you're running a single-core CPU with multi-threading (or a dual-core CPU without HT), then you're only going to get an additional 2 tiles per tile-compute.
I have a question. I'm new to Blender; I knew 2.8 was coming out, so I held off on learning it until now, and I'm going to start. I just ordered a new iMac that should be here later this week. I maxed it out with an i9 CPU and a Vega 48 GPU; it only has 8GB of RAM now, but I will upgrade it to 128GB at some point (64GB at first). When I'm setting up my settings, what would you suggest I do to get the most out of this computer?
You made the right choice with the CPU for Blender, because Macs can't use GPU compute for Blender: Apple cut support for it, and Blender doesn't support Metal.
i think i saw a forum saying that 2.79b was only a bug fix, so the option wasn't carried over. or something like that... too bad. the version that has it is 2.79.1, but i'm not sure where to find that.
I'm not great at rigging but I've done bits and bobs over the past few years, mostly in Maya. Here I'm using a tool called auto rig pro since I don't have time to build or script my own control rig. Saves tons of time and you can go in and edit everything to your liking :)
I also use Auto-Rig Pro, but it seems to always give way too much weight to the thighs/hip area, and the deformation looks completely off. Obviously tweaking the weights is the best way to go, but I personally don't know how the leg should bend in relation to the waist/hip or how it should look. I usually put my model into an extreme pose, like the splits, for example, but once I've fixed the side of the thigh, the front and back get messed up as a result, so it seems like you just can't win and balancing them all is very hard.

@IGarrettI Auto-Rig Pro costs $40 for the pro version and $19 for the standard, I think. (The pro version basically lets you automate the whole process: you place key markers and the add-on generates the whole rig for you, though it will probably still need a little work/cleanup. With the standard version you don't get the automation, so you'd have to add a human rig and reposition the bones yourself.)

The popular free one currently is BlenRig 5, in my opinion. It's much more advanced than Auto-Rig Pro and gives some very good deformations using a pre-skinned mesh deform cage, but it also has a steep learning curve. One caveat is that the video documentation is behind a paywall; you can download the free quick-start PDF guide, but it doesn't go into as much detail or step by step as the video tutorials do, and it explains some things but then says they'll be covered more in the videos. You could also just use Rigify, which is free as well with a lot of tutorials around, although I personally don't like it as much as the other two options mentioned in this post.
Yes, Auto-Rig Pro doesn't really do your weights for you; you're always going to have to do them yourself, and yes, the thighs are a tricky area to contend with. One thing you may want to look into is corrective shape keys. It's kind of hard to explain what they do in a comment, but there should be something on YouTube.
Are AMD Ryzens capable of doing this? Example: Ryzen 3 + GTX 1050 Ti. I saw in other posts that programs should have CUDA for Intel and OpenCL for AMD... can someone please enlighten me XD