
Radiance Caching for Real-Time Global Illumination 

SIGGRAPH Advances in Real-Time Rendering
45K views

Published: 28 Aug 2024

Comments: 46
@rallokkcaz · 1 year ago
As a computer scientist, this stuff is almost pure magic. Congratulations to the team at Unreal for designing/implementing these algorithms and tools! This is amazing work.
@10minuteartist87 · 2 years ago
Remembering the old days of "Mental Ray", when I saw GI for the first time and said wowww.. ☺️
@Alexander_Sannikov · 2 years ago
I see many more people are interested in Nanite than in Lumen (judging by the number of views). But 90% of Nanite's ideas had already been implemented in the 2008 paper that it's based on. Radiance caching in screen space, however, looks novel, and even though the idea is somewhat obvious once you name it, I have actually never seen it implemented anywhere. This is, in my opinion, the core idea of the entire approach, and it can be used for a much wider set of GI algorithms based on different acceleration structures.
@djayjp · 2 years ago
Check out the video by EA about their Surfel GI technique used in Frostbite. I think it's using the same method fundamentally? Idk
@635574 · 2 years ago
@@djayjp It might achieve similar results, but we have no direct behind-the-scenes comparison for real-time motion through a scene, or for bidirectional dynamic-object GI in Lumen. There isn't as much temporal accumulation in Lumen, from what we have seen.
@wsqdawsdawdwad · 2 years ago
"But 90% of lumen ideas have already been implemented in the 2008 paper that it's based on", is this a typo? From the context I think you mean Nanite.
@Alexander_Sannikov · 2 years ago
@@wsqdawsdawdwad correct, thanks for that. I edited the post.
@Revoker1221 · 2 years ago
@@Alexander_Sannikov Would you happen to have a link or name of the 2008 paper on hand? I'd love to take a deeper dive.
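(For readers curious what the screen-space radiance caching described a few comments above might look like, here is a minimal, purely illustrative CPU-side sketch: probes on a downsampled screen grid, a small fixed ray budget per probe, and bilinear interpolation back to full resolution. The structure and the traceRay() stub are assumptions for illustration, not Lumen's actual code.)

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };
Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

// Stub: a real renderer would trace against some acceleration structure
// (SDFs, a BVH, ...) and return incoming radiance for this probe ray.
Vec3 traceRay(int px, int py, int rayIndex) { return {0.5f, 0.5f, 0.5f}; }

struct RadianceCache {
    int probesX, probesY, spacing;   // one probe every `spacing` pixels
    std::vector<Vec3> probes;        // averaged radiance per probe

    RadianceCache(int width, int height, int spacing)
        : probesX(width / spacing + 1), probesY(height / spacing + 1),
          spacing(spacing), probes(probesX * probesY) {}

    // Trace a small, fixed ray budget per probe instead of per pixel.
    void update(int raysPerProbe) {
        for (int gy = 0; gy < probesY; ++gy)
            for (int gx = 0; gx < probesX; ++gx) {
                Vec3 sum;
                for (int r = 0; r < raysPerProbe; ++r)
                    sum = sum + traceRay(gx * spacing, gy * spacing, r);
                probes[gy * probesX + gx] = sum * (1.0f / raysPerProbe);
            }
    }

    // Per-pixel lookup: bilinear blend of the four nearest probes. A real
    // implementation would also weight by depth/normal to avoid light leaks.
    Vec3 sample(float px, float py) const {
        float fx = px / spacing, fy = py / spacing;
        int x0 = std::min((int)fx, probesX - 2);
        int y0 = std::min((int)fy, probesY - 2);
        float tx = fx - x0, ty = fy - y0;
        auto P = [&](int x, int y) { return probes[y * probesX + x]; };
        Vec3 top = P(x0, y0) * (1 - tx) + P(x0 + 1, y0) * tx;
        Vec3 bot = P(x0, y0 + 1) * (1 - tx) + P(x0 + 1, y0 + 1) * tx;
        return top * (1 - ty) + bot * ty;
    }
};
```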
@DominikMorse · 2 years ago
This is just pure genius.
@caelanread3199 · 2 years ago
It looks as though, by the end, real-life graphics are achieved with 2 rays per pixel. I'm excited for the future.
@GeneralKenobi69420 · 2 years ago
People have been saying that for the last 20 years lol. It does look good, but trust me, there's still a lot that can be improved before it's actually indistinguishable from reality. Just compare it with any Disney or Pixar movie made in the last 3 years. Real-time graphics are still many years away from that, and even then, movies will probably have gotten even better by then.
@TehIdiotOne · 2 years ago
@@GeneralKenobi69420 I mean, modern graphics aren't even remotely comparable to graphics of 20 years ago dude...
@GeneralKenobi69420 · 2 years ago
@@TehIdiotOne i know, and yet you should have seen the reactions when Sony revealed that PS2 tech demo. "Omg so lifelike! It looks just like a movie!!"
@mnomadvfx · 2 years ago
@@GeneralKenobi69420 20 years ago real-time graphics were not even close to this level of advancement. As for feature CGI animation, at this point there is little to make it LOOK better. The real advancements for offline rendering will come from: 1) faster rendering of what they can already do, making 4K VFX targets for all films, regardless of shot quantity, a serious possibility; and 2) further improvements to FX simulations (water, fire, smoke) and tissue/muscle simulations for creatures and digital humans. For me, what is currently lacking is a believable volume-conserving model for facial animation that captures everything down to the small-scale tics and other unconscious movement constantly happening on the average human face. For example, it took the VFX guys a year to create the Blade Runner 2049 scene with the faux Rachael. Despite extensive reference video from the original film and skull scans of Sean Young to guide them, it still looks fundamentally off when her face moves, like parts of her face are simply asleep or affected by botox. (IMHO they also way overdid the subsurface scattering effect; her facial skin looked less like a fleshy volume over a skull than a light-diffusing film covering another translucent volume.)
@delphicdescant · 2 years ago
@@GeneralKenobi69420 People have been saying it for 20 years, and every time it's been true. That's a great thing, imo. When someone looked at Mario 64 when it came out and said "wow, that looks like real life," they weren't wrong. You can either look at the progression of graphics technology and think "we have never achieved 100% photorealism, and never will, because there will always be something better 10 years later," or you can recognize that human perception is a fuzzy and relative thing and say "we have achieved photorealism many times, and every time it stays new and exciting."
@prithvib8662 · 2 years ago
Really well explained, good stuff!
@diligencehumility6971 · 2 years ago
So what am I supposed to do with my 10 years of Unity experience now?
@iammichaeldavis · 2 years ago
It’ll translate 🥰
@eclairesrhapsodos5496 · 2 years ago
I wonder if it's possible to have semi-dynamic GI: like lightmaps with data that allows interaction/blending with real-time GI, so that only changes/characters get real-time GI.
@doltBmB · 1 year ago
It would basically be a modern version of Quake's lighting system, so yeah, it should be possible.
@clonkex · 1 year ago
Totally. I'd say they just didn't do that because it would be a pretty big chunk more work and their target audience (console game devs) would prefer to have fully dynamic lighting with no baking time.
@theneonbop · 2 months ago
How would you determine which parts of the lightmap are valid?
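(A hypothetical sketch of the semi-dynamic GI idea discussed in this thread, including one simple answer to the validity question above: treat a lightmap texel as invalid whenever something that influenced its bake has changed, and blend toward the real-time result there. The struct and function names are invented for illustration; neither Lumen nor Quake works this way.)

```cpp
#include <vector>

// One lightmap texel: the offline-baked GI value, the real-time GI value
// computed this frame, and a validity weight that a scene-change pass would
// lower whenever a light or mesh that influenced the bake moves.
struct GITexel {
    float baked[3];      // baked indirect lighting (from the lightmapper)
    float realtime[3];   // real-time GI result for the same surface point
    float validity;      // 1 = bake untouched, 0 = fully invalidated
};

// Resolve pass: lerp between the bake and the real-time result per texel.
// Static areas keep the cheap, high-quality bake; areas affected by dynamic
// changes smoothly take over the real-time solution.
void resolveSemiDynamicGI(std::vector<GITexel>& lightmap) {
    for (GITexel& t : lightmap) {
        for (int c = 0; c < 3; ++c)
            t.realtime[c] = t.baked[c] * t.validity +
                            t.realtime[c] * (1.0f - t.validity);
    }
}
```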
@robbie_ · 2 years ago
I didn't understand a single word of this. I'm still working on my Lambert shading algorithm.
@UserX03 · 2 years ago
That's alright bud, everyone starts out knowing nothing about a topic.
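(For anyone at the same stage: the Lambert shading mentioned above boils down to a clamped dot product. A minimal sketch, with both vectors assumed normalized:)

```cpp
#include <algorithm>

// Lambertian diffuse: reflected radiance is proportional to the cosine of
// the angle between the surface normal N and the light direction L, clamped
// so back-facing light contributes nothing.
float lambert(const float N[3], const float L[3]) {
    float nDotL = N[0] * L[0] + N[1] * L[1] + N[2] * L[2];
    return std::max(nDotL, 0.0f);
}
```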
@raanoctive6092 · 2 years ago
Excellent job👏👍
@djayjp · 2 years ago
Anyone know how this differs from Frostbite's Surfels GI technique, in a nutshell?
@sporefergieboy10 · 2 years ago
One uses wizard magic the other one uses alien technology
@Sergeeeek · 2 years ago
They are similar, but Lumen uses grids for its screen-space cache, while GIBS uses surfels (surface elements), which lie directly on the geometry. Also, I don't think GIBS has static world-space probes like Lumen; they just put surfels far away and make them bigger, so in a way they always use world-space probes, but place and size them dynamically. They do use a grid to accelerate searching for surfels, and the grid changes based on distance from the camera. Close up it uses a uniformly sized grid, but far away it's a view-aligned frustum thing, where each grid cell is stretched to align with the view. From the camera's perspective it looks the same, but in world space the grid cells are stretched a lot, which makes it easier to cover a huge open-world area. Overall they seem to achieve a very similar result, but I personally think surfels are neater.
@djayjp · 2 years ago
@@Sergeeeek Ah I had watched part of their respective presentations but couldn't quite determine how they might differ. Your explanation helps a lot! Thanks! I have to say that, simply based on the renders shown in each of their SIGGRAPH videos, Lumen appears to give more realistic results, closer to unbiased PT.
@Sergeeeek · 2 years ago
@@djayjp Watch the GIBS presentation fully too; I really liked the demo where a skinned character with a fully emissive material was illuminating the scene while animating through it. Very impressive imo.
@djayjp · 2 years ago
@@Sergeeeek Yeah that's a good point. I didn't see anything quite like that with Lumen. Also I haven't seen a scene with many dynamic lights in Lumen just yet.
@635574 · 2 years ago
But does it even work on dynamic objects, or only on how the map interacts with them? EA's surfels can handle rigged meshes, but so far with only one sample per bone.
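(A hypothetical sketch of the distance-dependent acceleration grid described in this thread: uniform world-space cells near the camera, and camera-centred angular bins crossed with exponential depth slices far away, as a stand-in for the view-aligned frustum cells the comment mentions, so distant cells stretch in world space but stay roughly constant in screen space. All constants and the layout are illustrative, not GIBS's actual scheme.)

```cpp
#include <algorithm>
#include <cmath>

struct CellId { int x, y, z; bool farField; };

CellId surfelCell(float wx, float wy, float wz,      // surfel world position
                  float cx, float cy, float cz) {    // camera world position
    const float nearRadius  = 50.0f;  // inside this radius: uniform grid
    const float cellSize    = 1.0f;   // uniform cell size in world units
    const int   angularBins = 64;     // directional resolution of far field

    float dx = wx - cx, dy = wy - cy, dz = wz - cz;
    float dist = std::sqrt(dx * dx + dy * dy + dz * dz);

    if (dist < nearRadius) {
        // Near field: plain uniform grid keyed on world position.
        return { (int)std::floor(wx / cellSize),
                 (int)std::floor(wy / cellSize),
                 (int)std::floor(wz / cellSize), false };
    }
    // Far field: bin by direction and by an exponential distance slice; the
    // exponential slicing is what stretches cells more the farther they are.
    float azimuth   = std::atan2(dz, dx);            // [-pi, pi]
    float elevation = std::asin(dy / dist);          // [-pi/2, pi/2]
    int ax = std::min((int)((azimuth / 6.2831853f + 0.5f) * angularBins),
                      angularBins - 1);
    int ay = std::min((int)((elevation / 3.1415927f + 0.5f) * angularBins),
                      angularBins - 1);
    int slice = (int)std::log2(dist / nearRadius);   // 0, 1, 2, ... outward
    return { ax, ay, slice, true };
}
```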
@arifahmed6610 · 2 years ago
Beautiful video 😍
@jimbanana3071 · 1 year ago
Awesome job!!
@yizhang7027 · 1 year ago
It'd be much clearer if you could state the problem prior to an explanation.
@blackrack2008 · 3 months ago
He did
@thecrypt6482 · 2 years ago
It's creepy that both Epic Games' and Electronic Arts' new GI systems are based on a 20-year-old method from Mental Ray, wow...
@roklaca3138 · 29 days ago
Pointless if it still needs a $2000 GPU to get 30 fps.
@LotmineRu · 1 year ago
unusable
@prltqdf9 · 2 years ago
This presenter's narration suffers from slurred speech. It's oftentimes hard to understand what he is talking about.
@djayjp · 2 years ago
Turn on the auto captions.