
Global Illumination Based on Surfels 

SIGGRAPH Advances in Real-Time Rendering
42K views

Published: 15 Oct 2024

Comments: 81
@nowherebrain 3 years ago
Including skinned meshes, this is impressive. Can't you also save surfels by ignoring geometry with direct lighting, that is, not applying surfels to directly lit surfaces?
@NullPointer 3 years ago
I thought the same, but then the surfaces close to those areas won't receive that bounce.
@nowherebrain 3 years ago
@NullPointer I get that, I'm just not clever enough to have a creative solution for it. Besides, it's kind of arrogant of me to assume that during the (ongoing) development you hadn't thought of this. I love this btw, good work.
@sampruden6684 2 years ago
There are some cool ideas here, but after watching this just once I'm not seeing an obvious advantage vs DDGI. This has very slow convergence times, and even the converged renders sometimes look a little blotchy in the demo. There's a lot of complexity that goes into handling skinned meshes etc. (and that doesn't handle procedural geometry) that DDGI avoids by storing all of the information in the probe volume. At the start they mention that they think it's better to calculate the GI on the surface, because that's where it's needed. That sounds sensible in theory, but I wouldn't say that anything here stood out as being visually better than DDGI in practice. Is there something in the "pro" column that I've missed? I guess it doesn't suffer from DDGI's corner case when all eight surrounding probes are unreachable.
@williamxie5901 2 years ago
It's good for large open-world games. With DDGI, far objects fall back to a low-res probe grid due to its clipmap structure, whereas GIBS spawns its surfels from screen space, so the cost stays almost constant.
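A minimal sketch of that screen-space spawning idea, assuming a readable G-buffer and a surfel coverage query; all names and types here are hypothetical stand-ins, not the actual GIBS code:

#include <cstdlib>
#include <vector>

struct Surfel { float pos[3]; float normal[3]; float radius; };

struct GBufferSample { float pos[3]; float normal[3]; bool hitGeometry; };

// Stub standing in for a G-buffer read-back; a real renderer would fetch
// the world-space position and normal at pixel (x, y).
GBufferSample ReadGBuffer(int x, int y) { return {{0, 0, 0}, {0, 1, 0}, true}; }

// Stub standing in for a spatial-hash query: how many existing surfels
// already cover this world-space point?
int CountCoveringSurfels(const float pos[3]) { return 0; }

constexpr int kTileSize = 16; // screen-space tile size in pixels

// One spawn pass: probe one random pixel per tile and add a surfel wherever
// coverage is low. Because the loop runs over screen tiles, the number of
// candidate spawns per frame depends on resolution, not on world size,
// which is the property the comment above points out.
void SpawnSurfels(int width, int height, std::vector<Surfel>& surfels)
{
    for (int ty = 0; ty < height; ty += kTileSize) {
        for (int tx = 0; tx < width; tx += kTileSize) {
            int x = tx + std::rand() % kTileSize;
            int y = ty + std::rand() % kTileSize;
            GBufferSample s = ReadGBuffer(x, y);
            if (!s.hitGeometry) continue;                  // sky pixel, nothing to cover
            if (CountCoveringSurfels(s.pos) > 0) continue; // already covered
            surfels.push_back({{s.pos[0], s.pos[1], s.pos[2]},
                               {s.normal[0], s.normal[1], s.normal[2]},
                               0.1f});                     // radius tuned elsewhere
        }
    }
}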
@erikm9768 2 years ago
Isn't this essentially just photon mapping? Is there a difference, other than using surfels with depth functions instead of spheres? Photon mapping goes back several decades.
@clonkex 1 year ago
In real time, though?
@cube2fox 9 months ago
So did they end up using this approach as the default for GI? Or do they use something else for new EA/Frostbite games?
@dixie_rekd9601 2 years ago
Sooo, am I right in thinking it's kind of like NVIDIA's hardware-raytracing-based global illumination, but instead of single-pixel samples with an AI noise filter, it uses a softer, blobbier sample radius with far better performance?
@clonkex 1 year ago
RTX simply provides hardware acceleration of tracing the rays. That is, it makes it really fast to say "fire some rays from this point in this direction and tell me what they hit, how they bounce and what colour information they gather along the way". That's literally all it does. It's up to you to decide how to use that information and incorporate it into your GI shading. This is basically another version of "fire as many rays as we can afford and accumulate the results over time until it looks realistic". Hardware raytracing could totally be used in this algorithm to make it "look good faster" by firing a lot more rays. The trick with this sort of solution (well, one of many, many tricks) is that you don't want to waste any work you've already done, but you also have limited memory. I also don't think there's any AI noise filtering going on here. It's just regular procedural denoising unless I missed something.
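As a concrete illustration of that "accumulate results over time" loop, here is a hedged toy sketch of a per-surfel running average (my own code, not from the talk). Clamping the sample count turns the plain mean into an exponential moving average, so the surfel can still react when lighting changes:

#include <algorithm>
#include <cstdint>

struct SurfelRadiance {
    float    irradiance[3] = {0, 0, 0}; // running mean of incoming light
    uint32_t sampleCount   = 0;         // how many rays have contributed
};

// Fold one new ray sample into the running mean. Once sampleCount saturates
// at maxSamples, each new sample has a fixed weight of 1/maxSamples, i.e.
// an exponential moving average, so old lighting eventually fades out.
void AccumulateSample(SurfelRadiance& s, const float sample[3],
                      uint32_t maxSamples = 64)
{
    s.sampleCount = std::min(s.sampleCount + 1, maxSamples);
    float w = 1.0f / static_cast<float>(s.sampleCount);
    for (int i = 0; i < 3; ++i)
        s.irradiance[i] += (sample[i] - s.irradiance[i]) * w;
}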
@Zi7ar21 2 years ago
Neat! This was made by EA, though, so we have to troll them with jokes about how they're going to start charging $0.01 per surfel.
@ch3dsmaxuser 10 months ago
That is awesome!
@635574 2 years ago
I can thank Coretex for telling me about surfels.
@davinsaputraartandgamedev9453 2 years ago
I'm curious how this compares to Lumen. Anyone willing to share their thoughts on comparing the two?
@Xodroc 2 years ago
If it supports VR, it beats Lumen. Otherwise, it's a nice alternative.
@eclairesrhapsodos5496 2 years ago
Irradiance cache is better. Not sure about this one, but Lumen does reflections too, and it's not ray tracing. In my opinion VXGI / lightmaps / SVOGI / brute-force RTGI / photon-mapped diffuse GI are the best for now. PS: Unreal Engine will soon get real-time caustics via photon mapping on the GPU with extremely good denoising/approximation. I'm really excited about that irradiance cache method; it should land in a nice middle ground lol (a balance of quality/speed/production time).
@brainz80 2 years ago
I had the exact same thought
@edinev5766 2 years ago
In my testing, and since it's tied to my job it has been extensive, Lumen is slow for big exteriors. Unusable for most professional applications. This doesn't seem to be. But there's no way to know unless it becomes available to the general public.
@halthewise3995 2 years ago
I'm not an expert, but you're right to point out that Lumen and this are trying to solve roughly the same problem, and the high-level approach is somewhat similar as well. Both combine local probe points stuck to the surface of objects with a global grid of sample points, and both use roughly similar approaches for ray steering. The biggest difference in approach that I see is that Lumen's "local" sampling points are re-created from scratch each frame because they are strictly placed on a screen-space grid, while surfels stay alive as long as the camera hasn't moved too dramatically. That means Lumen needs to do temporal smoothing in screen space at the end of the pipeline, while surfels can do it earlier (and a little more flexibly). In theory, that means the best-case performance of surfels when the scene _isn't_ changing and the camera's not moving is significantly better, especially for high-resolution rendering. On the other hand, when the camera is moving, the surfel approach needs to do a lot more bookkeeping to move and update the surfels, so it seems likely to be more expensive in that case. In practice, the big difference is that Lumen is much farther along in development, and actually exists today, including lots of work hammering out edge cases and adding all the little tweaks required to get good real-world performance. Surfel-based GI is clearly at an earlier stage right now, so it's hard to say how good it will be when it's "done".
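To make that "more bookkeeping" point concrete, here is a toy sketch of one such maintenance task: recycling surfels that no on-screen pixel has referenced recently. All names are made up for illustration, not Frostbite code:

#include <cstdint>
#include <vector>

struct Surfel {
    float    pos[3];
    uint32_t lastSeenFrame; // bumped whenever a G-buffer pixel touches this surfel
};

// Drop surfels that have gone unreferenced for too long, freeing their
// slots for newly spawned ones. Swap-and-pop keeps the pass O(n) without
// preserving order, which a surfel pool does not need.
void RecycleStaleSurfels(std::vector<Surfel>& surfels,
                         uint32_t currentFrame, uint32_t maxAgeFrames = 120)
{
    for (size_t i = 0; i < surfels.size(); ) {
        if (currentFrame - surfels[i].lastSeenFrame > maxAgeFrames) {
            surfels[i] = surfels.back();
            surfels.pop_back();
        } else {
            ++i;
        }
    }
}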
@cptairwolf 1 year ago
Interesting solution, but I'll take path tracing with radiance caching over this anyway.
@lohphat 2 years ago
OH! I thought you said "squirrels". Worst clickbait EVAR! (I still enjoyed the video.)
@dragonslayerornstein387 2 years ago
Oh god this is so jank. But it works!
@charoleawood 2 years ago
I think "surface circle" is a better description of what these are than "surface element".
@nielsbishere 2 years ago
Surficle
@inxiveneoy 1 year ago
@nielsbishere Surcle
@nielsbishere 1 year ago
@inxiveneoy sule
@endavidg 7 months ago
Since it’s also something that has to do with sinuses, “Snot on a wall”.
@ThePrimeTech 2 years ago
Wow
@Jkauppa 2 years ago
You could shoot rays from all light sources, bounce them around, and keep the running average in check; then you get automatic global illumination. Just keep track of the real-time light maps, as if it's accumulated real ray tracing, i.e. real-time light baking.
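What this describes is essentially progressive light baking by forward tracing from the lights. A toy sketch of the accumulation side of that idea, where the trace stub and the lightmap layout are entirely hypothetical:

#include <vector>

struct PhotonHit { int texel; float energy[3]; bool valid; };

// Stub: trace one random ray from light `lightIndex` through the scene and
// report which lightmap texel it deposits energy into after its bounces.
PhotonHit TracePhotonFromLight(int lightIndex) { return {0, {1, 1, 1}, true}; }

struct Texel { float rgb[3] = {0, 0, 0}; float weight = 0; };

// "Keep the average in check": each texel holds a running mean of every
// photon that has ever landed on it, so the lightmap converges over many
// frames instead of being re-baked from scratch each frame.
void AccumulatePhotons(std::vector<Texel>& lightmap,
                       int numLights, int raysPerLight)
{
    for (int l = 0; l < numLights; ++l) {
        for (int r = 0; r < raysPerLight; ++r) {
            PhotonHit h = TracePhotonFromLight(l);
            if (!h.valid) continue;
            Texel& t = lightmap[h.texel];
            t.weight += 1.0f;
            for (int i = 0; i < 3; ++i)
                t.rgb[i] += (h.energy[i] - t.rgb[i]) / t.weight;
        }
    }
}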
@Jkauppa 2 years ago
Paint the textures with light.
@Jkauppa 2 years ago
You only need to fill the image pixels, no more.
@Jkauppa 2 years ago
How's the light outside screen space?
@Jkauppa 2 years ago
Importance-sample all objects.
@Jkauppa 2 years ago
Send more rays.
@TheIqCook 2 years ago
Pixar introduced this kind of rendering technique 15 years ago for offline rendering.
@clonkex 1 year ago
Did they? Wasn't it just regular offline raytracing?
@art-creator 1 year ago
@clonkex No, it was point-cloud/brickmap-based, with harmonic filtering etc.
@diligencehumility6971 2 years ago
Quite beautiful results... but Frostbite is an EA engine, and EA is not a nice company at all: pay-to-win and microtransactions, surprise mechanics, taking advantage of kids, etc. So not really interesting.
@miurasrpnt_v2 2 years ago
The company and the engine have to be judged separately, imo.
@clonkex 1 year ago
Who cares? They're also spending some of that money on advancing GI technology. We can benefit greatly from their research and still never touch Frostbite.