What a great talk, seriously. I came expecting to have to pause every slide to check some concepts online before resuming, but it wasn't necessary. Erik explains everything without leaving gaps or assuming the listener already knows everything. Thanks a lot, not only for the content but also for the code examples and simple explanations.
Impressive tech and talent. But the first question I get from this is: why? What do you do with all that freed-up CPU? Wouldn't using your GPU for this reduce the amount of other cool things you can do on the GPU (e.g. better graphics)? (Kinda rhetorical, since for culling they lose about 57 MB.) And also, don't you lose CPU access to some of that information?
In fact, the CPU is usually not the bottleneck, and players usually struggle to have GPUs powerful enough. I could make assumptions as to why they chose this, but I haven't seen it in the talk.
I am not an expert in the field of game optimisation, but I use 3D programs quite often, especially Blender. There you have the choice to render the scene via the GPU or the CPU, and I never ran into a scenario where the CPU was faster than the GPU. So there is quite a lot of headroom for the GPU to do some other computing. And especially on consoles, which don't have the most powerful CPUs, it seems like a good idea to shift more work to the GPU. But again, I am not an expert. I never made a game or game engine :) I just watch videos like this from time to time. This is an uneducated guess.
A commonly overlooked bottleneck is communication between the CPU and GPU; moving data around is the most expensive thing in the world of compute. Moving the culling to the GPU not only reduces synchronization points between CPU and GPU, it also allows you to do more fine-grained culling, since you are less constrained by PCIe bandwidth. Not to mention mesh shaders can do other cool things.
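To make the "fine-grained culling on the GPU" idea concrete, here's a minimal CPU-side sketch in C++ of the kind of per-meshlet sphere-vs-frustum test a compute or amplification shader would run. The `Meshlet` and `Plane` structs and their fields are my own illustrative names, not anything from the talk:

```cpp
#include <array>

// Hypothetical per-meshlet bounding data (names are illustrative, not from the talk).
struct Meshlet {
    float cx, cy, cz;  // bounding-sphere center
    float radius;      // bounding-sphere radius
};

struct Plane {
    float nx, ny, nz, d;  // plane equation n·p + d = 0, normal pointing inward
};

// A meshlet survives if its bounding sphere is not fully behind any frustum plane.
bool isVisible(const Meshlet& m, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        float dist = p.nx * m.cx + p.ny * m.cy + p.nz * m.cz + p.d;
        if (dist < -m.radius) return false;  // entirely outside this plane
    }
    return true;
}
```

On the GPU, that same test runs once per meshlet (e.g. in an amplification/task shader), and only surviving meshlets are forwarded to the mesh shader stage, so the culling result never has to round-trip back over PCIe to the CPU.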
People at Remedy are madlads, a 200 million polygon scene 💀💀. So there is actually a part of the mesh shading pipeline that was missing: why was the vertex generation part skipped? But damn, that culling capability is insane. Imagine working on scenes with fewer than 5 million polygons to begin with. Mesh shaders are the future, and for good reason 😎 Also, I don't know what to think: the PS5 has primitive shaders and the Series X has mesh shaders (from the information I could gather), so can the PS5 actually run mesh shaders in hardware? So much for misinformation.
It's possible that primitive shaders are just a different name for mesh shaders with a slightly different feature set. Vulkan and D3D12 do that a lot, for example: DirectX calls something "Variable Rate Shading" and Vulkan calls it "Fragment Shading Rate", basically the same thing, slightly different and renamed. I don't know for sure though, since I don't have an NDA for either PlayStation GNM or Xbox GDKX.
Primitive shaders existed earlier than mesh shaders, but mesh shaders introduced more features and became more popular. Currently I don't think there's a difference in capabilities between them. And yes, the PS5 is RDNA2, so it could support mesh shaders, but they decided to stick with their primitive shaders.
I suspected that stupid meshlet technology is the reason why you can't run Northlight games at native 6K resolution with a smooth framerate, even on the lowest detail. AW2 looks pretty mid on the lowest settings and still runs like crap unless you lower the resolution or turn on upscaling. It's a very disappointing experience, since no one will fix this; it's a game engine "feature". It's ironic that you have to play the game at lower resolutions despite the fact that it wants to present photorealism.
Newer graphics cards (NVIDIA 2000 series, AMD 6000 series and upwards) have zero issues with mesh shaders. It ran totally fine on my 6900 XT and on my 4070. Also, lol at native 6K.
Native 6K?? I don't get this resolution chase. Why bother presenting a high quantity of pixels if the contents of those pixels have to be sacrificed in fidelity to do so? That negates the entire point of running at a high resolution, especially since you're only further exposing the lack of fidelity in the first place. We can attain good image quality smarter these days instead of by brute force, making room for cool solutions such as those seen in this presentation. And I'm pretty sure that, provided you're using modern hardware, mesh shaders only improved the performance of the game.