
Intel’s New AI: Amazing Ray Tracing Results! ☀️ 

Two Minute Papers
1.6M subscribers
125K views

❤️ Check out Weights & Biases and say hi in their community forum here: wandb.me/paper...
📝 The paper "Temporally Stable Real-Time Joint Neural Denoising and Supersampling" is available here:
www.intel.com/...
📝 Our earlier paper with the spheres scene that took 3 weeks:
users.cg.tuwie...
❤️ Watch these videos in early access on our Patreon page or join us here on YouTube:
/ twominutepapers
/ @twominutepapers
🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bryan Learn, B Shang, Christian Ahlin, Eric Martel, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Jonas, Jonathan, Kenneth Davis, Klaus Busse, Kyle Davis, Lorin Atzberger, Lukas Biewald, Luke Dominique Warner, Matthew Allen Fisher, Matthew Valle, Michael Albrecht, Michael Tedder, Nevin Spoljaric, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.
If you wish to appear here or pick up other perks, click here: / twominutepapers
Thumbnail background design: Felícia Zsolnai-Fehér - felicia.hu
Károly Zsolnai-Fehér's links:
Instagram: / twominutepapers
Twitter: / twominutepapers
Web: cg.tuwien.ac.a...

Published: Oct 4, 2024

Comments: 329
@bj124u14 · 1 year ago
This denoising technique is absolutely fantastic! 200 frames per second? That's unbelievable. I really hope Blender gets its hands on this technique. Denoising is very important, but doesn't get as much attention as it should. I'm so glad to see how it's evolved. Thank you 2MP, and of course, what a time to be alive!
@TwoMinutePapers · 1 year ago
You are very kind, thank you so much! 🙏
@petterlarsson7257 · 1 year ago
It's pretty sad that the most amazing technology isn't open source; imagine how much fun people could have with it.
@ty_teynium · 1 year ago
Thank you for mentioning Blender. I was thinking the same thing since my Blender renders and animations always have some noise in them. I'd love to do some more animating if not for this issue. I do hope it gets released soon.
@little_lord_tam · 1 year ago
@@petterlarsson7257 No. No one would have money to develop the tools, so they wouldn't exist.
@ClintochX · 1 year ago
@@petterlarsson7257 Bro, what you're looking at is OIDN (Intel Open Image Denoise), and it's completely free and open source. Blender already uses it, but now it's time for an upgrade.
@arantes6 · 1 year ago
I'd appreciate just a small explanation of the new idea behind the techniques presented in the videos: how they work under the hood, in a few sentences. Is it a neural network, where the previous techniques were hand-crafted? Is it a new kind of neural net architecture? Is it just a math operation that nobody thought of applying to this problem before? Just a glimpse behind the technical curtain, instead of just the results, would make these amazing videos even better!
@amoliski · 1 year ago
Sounds like we need Two Minute Papers for a quick overview and Five Minute Papers for a slightly more technical explanation. Maybe have the paper author do a quick interview showing their work if they're available?
@liambury529 · 1 year ago
The paper is called "Temporally Stable Real-Time Joint Neural Denoising and Supersampling", and if you're that curious, the link to the paper is in the video description.
@itsd0nk · 1 year ago
I second this motion.
@facts9144 · 1 year ago
@@liambury529 Wouldn't hurt just to give an extra little bit of info in the video, though.
@OmniscientOCE · 1 year ago
@@facts9144 I concur.
@StanleyKubick1 · 1 year ago
Next-gen Arc GPUs are starting to look pretty desirable.
@danielb.4205 · 1 year ago
A direct comparison between Intel's and NVIDIA's approach (OptiX) would be interesting.
@woodenfigurines · 1 year ago
I think and hope you'll get your comparison a few days after this hits the market in Intel cards :D
@Dayanto · 1 year ago
It's not really fair to compare with the outdated SVGF algorithm when the follow-up A-SVGF paper solved the main shortcoming of the original (significant ghosting/smearing during motion or lighting changes) four years ago. For example, A-SVGF is what was used in Quake 2 RTX back when real-time ray tracing was still new.
@raylopez99 · 1 year ago
There's no fair in science. Peer review is very unfair at times, and ruthless. Just the way it works. Speaking as a two-minute scholar who does not read the original sources.
@Exilum · 1 year ago
It was labelled as SVGF, but also as being from 2020. More likely to be a variation of it. SVGF is a 2017 paper; A-SVGF is a 2018 paper.
@Dayanto · 1 year ago
@@Exilum The one from 2020 was a different paper, labeled "NBG" (Neural Bilateral Grid).
@Exilum · 1 year ago
@@Dayanto OK, my bad on that.
@alegzzis · 1 year ago
I'll add more: it's not fair because NVIDIA has an even newer denoiser, NRD.
@sebastianjost · 1 year ago
Absolutely fantastic! In some of the examples I can definitely still see some room for improvement and a follow-up video, but this is still a remarkable improvement. However, I find some of the comparisons a bit difficult to interpret. I wish the same videos with different techniques were shown side by side rather than one after the other. Ideally even with just one video and a moving split showing the two techniques rendering parts of the video.
@black_platypus · 1 year ago
We barely ever hear about the inner workings anymore :( Please bring back at least a structural overview or abstract! Two Minute Papers > Two Minute "Neat, isn't it"! 😊
@roccov3614 · 1 year ago
This is a brilliant idea for real-time games. Mixing a quick light transport render with a noise filter optimised for light transport, for quality real-time output, is brilliant.
@draco6349 · 1 year ago
I want to see this used in tandem with ReSTIR. An amazing path-tracing algorithm that leaves barely any noise, combined with an amazing denoiser, should be able to get breathtaking results, and even better, in real time.
@draco6349 · 1 year ago
@@dylankuzmick3122 That's actually so cool. I might literally just install that; real-time path tracing in Blender is something I've always wanted.
@myNamezMe · 1 year ago
Impressive progress; it will be interesting to see the next paper.
@xamomax · 1 year ago
It seems that neural-network-based denoising could benefit from an input reference image rendered very fast without ray tracing, but with high detail. Then this high-detail image plus the ray-traced image can be used by the denoiser to get the lighting correct without washing out the detail.
@brianjacobs2748 · 1 year ago
That sounds cool, but it wouldn't be as authentic.
@xamomax · 1 year ago
@@brianjacobs2748 Depends on how it is trained.
@circuit10 · 1 year ago
@@brianjacobs2748 More authentic than no ray tracing at all, which is the alternative.
@circuit10 · 1 year ago
I think this would also be really good for DLSS 3, because the frame could be generated without much latency based on actual data rather than interpolating between two frames.
@MRTOWELRACK · 1 year ago
The neural network is trained against reference input. The renderer already knows the geometry and blurs the noise accordingly.
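This auxiliary-buffer idea is, in fact, how most real-time neural denoisers are fed: cheap noise-free buffers ride along with the noisy lighting. A minimal sketch of the input assembly, with illustrative buffer names and placeholder data rather than the paper's actual interface:

```python
import numpy as np

H, W = 720, 1280  # low-resolution render; the network upsamples 2x on top

# Noisy path-traced lighting at ~1 sample per pixel (random placeholder data).
noisy_radiance = np.random.rand(H, W, 3).astype(np.float32)

# Cheap, noise-free auxiliary buffers from a rasterization pass (placeholders):
albedo  = np.random.rand(H, W, 3).astype(np.float32)  # carries texture detail
normals = np.random.rand(H, W, 3).astype(np.float32)  # carries geometry detail
depth   = np.random.rand(H, W, 1).astype(np.float32)

# Stack everything channel-wise: the network can smooth the noisy lighting
# while the noise-free channels tell it where edges and textures really are.
network_input = np.concatenate([noisy_radiance, albedo, normals, depth], axis=-1)
print(network_input.shape)  # (720, 1280, 10)
```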
@notapplicable7292 · 1 year ago
It's rather incredible how many times I've listened to him explain noise in ray-traced lighting.
@13squared2009 · 1 year ago
I'd love a video taking us under the covers to see what it is these authors are actually doing to improve these techniques… I just don't see myself reading the actual papers as a casual viewer. Love your videos!
@juliandarley · 1 year ago
I would love to know if or when this could be applied to Blender/Cycles X, so that 1) we can have much reduced render times for photoreal scenes and 2) reasonably good caustics.
@shmendez_ · 1 year ago
Wait wait wait bro, how is your comment 7 hr old but the video is only 3 min old??
@Ben_CY123 · 1 year ago
@@shmendez_ Em… maybe a timezone issue?
@nullneutrality8047 · 1 year ago
@@shmendez_ Videos are shown early on Patreon.
@MattPin · 1 year ago
Honestly, once this gets included in Blender it's going to be very good. I can see this AI denoiser will definitely help render frames faster. It would be very cool to see a comparison of this technique with their other denoiser, Open Image Denoise.
@computerconcepts3352 · 1 year ago
yeah lol
@mosog8829 · 1 year ago
I'm glad this is going to be implemented in Blender, as they are already working on it.
@juliandarley · 1 year ago
Can you provide a link, please?
@juliandarley · 1 year ago
@@mosog8829 Many thanks. I have not looked at the 3.4 alpha yet. The release notes say that path guiding works only with the CPU, but GPU support is coming in the future. It may be worth doing some comparison tests with CPU only, but obviously the real gain will be with the GPU.
@mosog8829 · 1 year ago
@@juliandarley You're welcome. Indeed. It will be even better if it's possible to combine CPU and GPU.
@kleyyer · 1 year ago
You might be confusing this with the new path guiding feature for fireflies that will be implemented in Cycles in the future. I have seen absolutely nothing about this new denoising being implemented in Blender.
@juliandarley · 1 year ago
@@kleyyer It wasn't my suggestion, but I can ask Blender HQ about it. I did look at the new alpha, and it did not look the same as what is shown here. I still keep hoping for good caustics from Cycles; for me, needing photoreal renders, it is the number one thing that lets Cycles down.
@Sadiinso · 1 year ago
You forgot to mention that the 200 fps (5.19 ms per frame, as shown in the paper) was observed when running on an RTX 3070. Performance and runtime are not absolute measures; they depend on the hardware running the workload.
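(Worked out: at 5.19 ms per frame, 1000 / 5.19 ≈ 193 fps, so the "200 fps" headline is slightly rounded up, and holds only for that scene on that GPU.)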
@aidanm5578 · 1 year ago
Finally, I don't have to go 'outside' to see realistic lighting.
@zzoldd · 1 year ago
A redditor's true dream
@Sekir80 · 1 year ago
I'd like to see the result of something slower: say, give the renderer 1 second for a far less noisy input; maybe that way the upsampled result is closer to the reference. For clarity: I'm not really interested in real-time image generation if we are talking about minute/hour/day-long rendering challenges. I'm interested in great-quality results.
@Sekir80 · 1 year ago
@michael Interesting insight! I was more cynical about 3D visualization and figured I'd end up doing crappy commercials, which I disliked, so I never entered this space. Mediocrity. I even see it in AAA games: for example, I tended to model stuff for a specific rendering quality, and if I see a racing simulator where the steering wheel is a visible n-gon I just scoff. I'd rather spend the polygon budget on the most obvious things. Maybe I'm weird.
@bryanharrison3889 · 1 year ago
I love these videos... even if I WASN'T a 3D animator, I'd still enjoy them because of Károly's passion for the subjects of AI, computer graphics, and machine learning.
@fanjapanischermusik · 1 year ago
When can I expect this to be used in my smartphone? While taking photos, for example. Looks really good.
@facenameple4604 · 1 year ago
The reference simulation at 3:14 is actually NOISIER than the denoising algorithm.
@Eternal_23 · 1 year ago
This + ReSTIR = real-time CG
@t1460bb · 1 year ago
What a time to be alive! Excellent work!
@AIpha7387 · 1 year ago
5:42 It seems to me that the reflective material on the surface is being ignored. It was removed during denoising.
@AIpha7387 · 1 year ago
It is completely different from the gloss intended by the developer. This can't be used in a game as-is.
@harnageaa · 1 year ago
I think they might write a paper to add reflections for all types of materials, so you'd have two algorithms in one; that might solve the issue.
@adrianm7203 · 1 year ago
It would have been nice if this video talked about how this was accomplished. The results are interesting, but I'm more interested in how it works...
@michaelleue7594 · 1 year ago
It's called 2-minute papers, not 2-semester papers.
@alex15095 · 1 year ago
@@michaelleue7594 Cut out a minute of the footage that was looped 22 times in the video, and roughly explain what architectures they're using and how it was trained; show some diagrams or graphs from the paper that people can pause to have a deeper look, etc.
@emrahyalcin · 1 year ago
Please add this to games with high visual output, like Satisfactory. I can't even believe such a thing can happen. It is really amazing.
@fr3ddyfr3sh · 1 year ago
Thanks for your work presenting complicated papers to us in an accessible way.
@MarshalerOfficial · 1 year ago
The last 2 years felt like 20 years, imo. Sharpening images is the new sexy again. But that doesn't mean ray tracing was boring either; it's damn impressive how far we have come. What a time to be alive, boys and girls!
@jeffg4686 · 1 year ago
Amazing. Glad to see all the tech giants are into NNs. Unbelievable results.
@Guhhhhhhhhhhhhhhhhhhhhhhhhhhhh
This will be amazing for open world games with lots of light in a large area
@Build_the_Future · 1 year ago
The reference looks so much better.
@jinushaun · 1 year ago
The fact that it’s able to reconstruct the no smoking sign from that noisy mess is mind blowing.
@levibruner617 · 1 year ago
This is impressive. I hope someday we can use this technique to make Minecraft look more realistic with light. Right now the best we can do in Minecraft is ray tracing. I apologize for spelling and grammar; this is very hard for me to type because I'm partially deaf-blind. Hope you have a great day, and keep on learning.
@trippyvortex · 1 year ago
What a time to be alive!!
@R2Bl3nd · 1 year ago
We're rocketing towards photorealistic VR that's almost indistinguishable from life.
@RealRogerFK · 1 year ago
How fast all of this is actually going amazes me. I'm guessing by 2040 we'll have completely photorealistic real-time renders on mobile platforms at a good framerate. Next step: how do we make all of this *believable* in simulated worlds? CoD MWII's Amsterdam scenes *look* incredible, but you still feel like it's a videogame. That's the next challenge: proper AI and believable environments, not just clean, shiny floors and water puddles.
@JeremieBPCreation · 1 year ago
Absolutely love your videos! Is there an actual 2-minute or even 60-second "short" version of these videos (the name of the channel made me wonder)? If not, would you mind if someone else did it? I love information density, and without the repeated parts and with slightly faster speech, I think these videos would make wonderful Shorts!
@harnageaa · 1 year ago
I think he has a contract with sponsors to include the sponsor in every video, and that probably means Shorts too (since they appear on the video page), so it's probably not gonna happen, at least on this channel.
@nixel1324 · 1 year ago
Can't wait for video game tech to incorporate this! The next generation of games is gonna look incredible, especially when you factor in improvements in hardware (and of course other software/technology)!
@OSemeador · 1 year ago
Initially it looked like temporal motion blur with an upscaling pass on it, but the last comparison result shows there is more to it than initially meets the eye... pun intended.
@erikals · 1 year ago
4:45 info: this paper is from 2020 (not 2014).
@LukeVanIn · 1 year ago
I think this is great, but they should apply the filter before high-frequency normal/detail mapping.
@MacCoy · 1 year ago
The "any sufficiently advanced technology is indistinguishable from magic" quote fits the wizards over at Intel.
@witness1013 · 1 year ago
There is no magic happening at all.
@ryanmccampbell7 · 1 year ago
I wonder if these techniques can be combined with traditional rasterization-based rendering, which should produce nice sharp edges and avoid any blurriness produced by the denoising. I imagine if you run both a coarse light simulation and a rasterization step, then combine the results with a neural network, you can get the best of both worlds.
@alexmcleod4330 · 1 year ago
It's already rasterised as a first step, to produce a G-buffer of surface normals, texture & colour information. That's done with Unreal Engine 4, in this case. That rasterised layer then informs the raytracing, supersampling and denoising steps. The key idea here is that they're combining the supersampling and denoising into one process. It seems like they didn't really want to pre-process the reference scenery too much, but you could probably get a better balance of sharpness & smoothness by giving the hero meshes some invisible texture maps that just say "stay sharp here, it's a human face", and "don't worry about this, it's just the floor".
@ryanmccampbell7 · 1 year ago
@@alexmcleod4330 I see, interesting.
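For readers who want the shape of the pipeline described above, here is a rough sketch; the function names, buffer layout, and stubs are illustrative assumptions, not the paper's actual code:

```python
import numpy as np

H, W = 720, 1280  # render at low resolution, output at 2x (1440p)

def rasterize(scene, camera):
    # Stub G-buffer pass: albedo, normals, motion vectors (random placeholders).
    return {
        "albedo":  np.random.rand(H, W, 3),
        "normals": np.random.rand(H, W, 3),
        "motion":  np.zeros((H, W, 2)),  # screen-space motion vectors
    }

def path_trace(scene, camera, spp=1):
    # Stub 1-spp path trace: fast but very noisy lighting.
    return np.random.rand(H, W, 3)

def reproject(history, motion):
    # Stub reprojection: warp last frame's output along the motion vectors.
    return history  # static camera in this toy example

def joint_net(noisy, gbuf, warped_history):
    # Stub for the joint denoise + 2x supersample network; a naive
    # nearest-neighbour upsample stands in for the learned reconstruction.
    return noisy.repeat(2, axis=0).repeat(2, axis=1)

def render_frame(scene, camera, history):
    gbuf = rasterize(scene, camera)           # 1) cheap, noise-free scene info
    noisy = path_trace(scene, camera, spp=1)  # 2) noisy ray-traced lighting
    warped = reproject(history, gbuf["motion"])
    return joint_net(noisy, gbuf, warped)     # 3) one pass: denoise + upsample

frame = render_frame(None, None, history=np.zeros((2 * H, 2 * W, 3)))
print(frame.shape)  # (1440, 2560, 3)
```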
@TheKitneys · 1 year ago
As a CGI artist specialising in advertising CGI (products/interiors/cars), I'd say these images have a very long way to go for photorealistic results. But given the impressive results shown, it won't be too many years before they get there.
@JimBob1937 · 1 year ago
I'd caution you to take these with a grain of salt, and instead focus on and abstract out the techniques being shown. These research papers use "programmer art", comparable to the placeholder art you usually see in game development before actual artists enter the process. Even top-of-the-line techniques look bland if not in the hands of an artist, even if it just comes down to scene composition and lighting setup.
@michaeltesterman8295 · 1 year ago
What are the hardware requirements, and how intensive is it on the cards?
@ozzi9816 · 1 year ago
I'm very excited for when something like ReShade implements this!!! Cheap, fast RTGI, or even just straight-up ray tracing that can run on modest graphics cards, isn't that far away!
@muffty1337 · 1 year ago
I hope that games will profit from this new paper soon. Intel's new graphics cards could benefit from an uplift like this.
@quantenkristall · 1 year ago
Have you tried feeding the denoised result back into the ray tracing in a closed loop, to improve the Metropolis light transport guessing? Adaptive recursion from lower to higher resolution and depth (number of bounces). The fewer useless calculations, the less waste of energy and time. 🌳
@oculicious · 1 year ago
This gives me hope for photoreal VR in the near future.
@radekmojzis9829 · 1 year ago
I would like to see just the denoising step compared to the reference at the same resolution... I can't help but feel that even the new method outputs a blurry mess without any high-frequency details anywhere in sight, which is to be expected, since the technique has to reconstruct the "high resolution" by conjuring four new pixels for every pixel of input.
@goteer10 · 1 year ago
A couple of comparisons are shown in this video, and it's exactly as you say. The spaceship example is the most obvious: the side panels have some "dots" as either geometry or texture, and the noisy+filter version doesn't reproduce this detail. It's minor, but it goes to show that you still can't use this in a professional setting.
@radekmojzis9829 · 1 year ago
@@goteer10 I don't think it's minor; it's significant enough for me to prefer the look of traditional rendering techniques at that point.
@MRTOWELRACK · 1 year ago
@@radekmojzis9829 To be fair, this could be combined with traditional rasterization.
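(For scale: 720p is 1280 × 720 ≈ 0.92 million pixels, while 1440p is 2560 × 1440 ≈ 3.69 million, four times as many, so roughly three of every four output pixels are synthesized rather than traced.)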
@jackrdye · 1 year ago
Just shows that even if compute power isn't compounding at the same rate, new techniques are compounding outcomes.
@drednac · 1 year ago
Wow, this is mind-blowing. The progress that AI algorithms have made recently is literally unbelievable. Every day there is less and less reason to focus on anything in technology other than machine learning.
@michaelleue7594 · 1 year ago
Well, ML is great, but its most interesting applications are when it gets used to focus on other things in technology. The field of machine learning itself is advancing rapidly, but there's a wide world of applications for it in other tech fields that are even more fertile ground for advancements.
@drednac · 1 year ago
@@michaelleue7594 Years back I was working on a light transport algorithm that ran in real time on the GPU and allowed dynamic lighting and infinite bounces. Now when I look at these advancements I see where things are going. The upscaling tech, and AI "dreaming up" missing details better than a human, is a total game changer. Also the ability to generate content. We truly live in exponential times; it's hard to keep up. Whatever you work on today that seems interesting will be obsolete by next Friday...
@JorgetePanete · 1 year ago
@@drednac It's unbelievable that that's not even an exaggeration, and we don't even know what we'll have once quantum and photonic computing open new possibilities.
@drednac · 1 year ago
@@JorgetePanete We know what comes next... Skynet :DDDD
@JorgetePanete · 1 year ago
@@drednac Once fiction, soon reality. It just takes one man to connect a robot to the internet, add an AI that turns natural language into instructions, and add a task system to let it figure out how to do anything and go do it instantly.
@lolsadboi3895 · 1 year ago
We have the "enhance image" from every sci-fi film now!!
@GloryBlazer · 1 year ago
I didn't know Intel was doing AI research as well; thanks for the exciting news!
@harnageaa · 1 year ago
At this point everyone is doing it secretly; it seems this is the future of computational speed.
@kishaloyb.7937 · 1 year ago
Intel has been doing AI research for nearly 2.5 decades now. Heck, in the early-to-late 2000s they demonstrated running RT on a CPU by generating a BVH structure. Sure it was slow, but it was pretty impressive.
@GloryBlazer · 1 year ago
@@kishaloyb.7937 I didn't realize ray tracing was such an old concept; the first time I heard about it was the real-time demos by NVIDIA. 👍
@GloryBlazer · 1 year ago
@Garrett Yeah, I'm guessing it has a lot to do with software as well.
@markmuller7962 · 1 year ago
We are experiencing the third big jump in videogame graphics, after DOOM and the switch from 2D to 3D (mainly through consoles). I was there for the previous two jumps, and now, with ray tracing and UE5, luckily I'm here for this one too. This is definitely one of the most exciting industries in the entire world, like top 5.
@infographie · 1 year ago
Excellent
@erikals · 1 year ago
this is.... incredible !!!
@damonm3 · 1 year ago
Was hoping you’d do another RT vid!
@blackbriarmead1966 · 1 year ago
I'm taking a computer graphics course this semester. After writing a CPU renderer (no lighting or ray tracing) that takes a few seconds to render one frame of a very simple scene, the sheer immensity of the number of calculations required for realistic scenes set in. There is a whole stack of technology that makes stuff like this possible, from the 5 nm silicon to GPU architecture, software optimization, and creative techniques that reduce the number of calculations necessary. And the layperson takes all of this for granted when they play Cyberpunk 2077 on their 4090.
@JimBob1937 · 1 year ago
Not to mention the entire electronics platform it's running on, the signal transmission to the monitor... the monitor technology itself. Yeah, people's heads would explode if they attempted to comprehend the technology and functioning it takes for a single frame of a game to be shown to them.
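(To make the scale concrete with illustrative numbers, not figures from the paper: at 1080p a frame has 1920 × 1080 ≈ 2.07 million pixels; even a single path per pixel with four bounces means over 8 million ray-scene intersection queries per frame, and at 60 fps that is roughly half a billion per second, before any shading, denoising, or game logic.)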
@joshuawhitworth6456 · 1 year ago
I can't see this getting much better. What will get better is unbiased render engines that will one day simulate light in real time without filtering techniques.
@MRTOWELRACK · 1 year ago
That would be brute-force path tracing, which is orders of magnitude more demanding and not viable with existing technology.
@gaijintendo · 1 year ago
I want to play GoldenEye upscaled like this.
@AronRubin · 1 year ago
Do we no longer collect per-fragment info on how fully a ray (or rays) was incident? It seems like you could just in-paint low-ranking fragments instead of smoothing the whole scene.
@tm001 · 1 year ago
It's absolutely amazing, but I noticed a lot of the fine detail got washed out... I wonder if that could be avoided with some fine-tuning.
@chad0x · 1 year ago
Absolutely nuts!
@AuaerAuraer · 1 year ago
Looks like Intel is entering the ray tracing competition.
@JonS · 1 year ago
I'd like to see this added to optical stray-light simulation packages. We face the same noise/ray-count/runtime issues when analyzing optics for stray light.
@andrewwelch5017 · 1 year ago
I'd like to see a comparison between this technique applied to a native 1440p render and the 1440p reference.
@kylebowles9820 · 1 year ago
I don't understand why they don't build these techniques into the rendering. Why wait until afterwards, when you have no information about the scene? It would have been trivial to get real high-frequency details back from the geometry and textures.
@itsd0nk · 1 year ago
I think it’s the quick real-time aspect that they’re aiming for.
@kylebowles9820 · 1 year ago
@@itsd0nk I specifically built that step into the earliest part of the rendering pipeline because the advantages are monumental and the cost is insignificant. I think the main reason is so you can tack it on as an afterthought.
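One common way denoisers do keep texture detail, for context: demodulate the noisy color by the noise-free albedo before filtering and re-multiply afterwards, so texture frequencies never pass through the blur. A toy sketch of the general technique, not necessarily what this paper does (a box blur stands in for a real denoiser, and the data is a random placeholder):

```python
import numpy as np
from scipy.ndimage import uniform_filter

eps = 1e-4
noisy_color = np.random.rand(256, 256, 3).astype(np.float32)  # noisy path trace
albedo      = np.random.rand(256, 256, 3).astype(np.float32)  # rasterized, noise-free

# Divide out the texture detail, smooth only the lighting, put the detail back.
irradiance = noisy_color / np.maximum(albedo, eps)
smooth_irr = uniform_filter(irradiance, size=(9, 9, 1))  # stand-in denoiser
final      = smooth_irr * albedo  # crisp textures, smoothed lighting
print(final.shape)  # (256, 256, 3)
```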
@uku4171 · 1 year ago
Intel is doing pretty well with this stuff. Excited about them entering the GPU market.
@gawni1612 · 1 year ago
When Dr. K is happy, so am I.
@soupborsh8707 · 1 year ago
I heard something about a 100x path tracing performance improvement on GNU/Linux on Intel graphics cards.
@JimBob1937 · 1 year ago
I think that was the speedup a driver update had on Linux for Intel's dedicated GPUs. The drivers were so bad at initial release that they managed a near-100x improvement. While that is an impressive speedup... it mostly just says they released them in a pretty poor state to begin with.
@kaioshinon5954 · 1 year ago
Bravo!
@Dirty2D · 1 year ago
Looks like it mixes together some frames and smooths them. Could you not render the world without ray tracing, and then add the raytraced approximation on top of that as a reference image?
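Frame mixing of that sort is usually temporal accumulation: blend the current noisy frame with the reprojected previous result, an exponential moving average over time. A minimal sketch of the idea (illustrative, not the paper's exact scheme, which also uses motion vectors for reprojection):

```python
import numpy as np

def accumulate(current, history, alpha=0.2):
    # alpha weights the new frame: lower -> smoother but more ghosting,
    # higher -> noisier but more responsive to motion and lighting changes.
    return alpha * current + (1.0 - alpha) * history

history = np.zeros((720, 1280, 3))
for _ in range(30):  # 30 noisy frames of a static scene
    noisy_frame = 0.5 + 0.3 * np.random.randn(720, 1280, 3)
    history = accumulate(noisy_frame, history)
print(round(history.mean(), 3))  # converges toward the true value (~0.5)
```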
@azurehydra · 1 year ago
WHAT A TIME TO BE ALIVE!!!!!!!!!!!!
@Halston. · 1 year ago
2:42 if you've seen the explanation of light transport simulation before 👍
@xl000 · 1 year ago
Is this related to the incoming Intel Arc A7x0? I don't remember Intel doing much research around graphics, except Embree, OSPRay, Open Image Denoise, and maybe oneAPI. Is it related to those?
@lemonhscott7667 · 1 year ago
This is amazing!
@sikliztailbunch · 1 year ago
If the new method allows for 200 fps in that example scene, wouldn't it make sense to limit it to 60 fps and gather even more samples in the first phase, before denoising?
@JimBob1937 · 1 year ago
There is still other computation that has to happen within that frame... like the rest of the scene rendering (most applications will be more complex) and the rest of the game (since they're obviously targeting interactive applications). The 200 fps isn't the goal of some final application; this is just an example of the algorithm running, and the faster it achieves this, the better. The actual applications that implement the algorithm would be the ones to make that tradeoff decision.
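(The budget arithmetic, using the numbers quoted in this thread and in the video: at 60 fps a frame lasts 1000 / 60 ≈ 16.7 ms; if denoising and upsampling cost around 5 ms, roughly 11 ms remain, versus the 4-12 ms quoted for the light simulation, so capping the frame rate could buy roughly 2-3x the samples, minus whatever the rest of the game needs.)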
@abandonedmuse · 1 year ago
What a time to be alive!
@ZiggityZeke · 1 year ago
We are gonna have some pretty looking games soon, huh
@RealCreepaTime · 1 year ago
Would it be worth it for the light simulation to run even longer than 4-12 milliseconds, giving a less noisy base and a higher-quality output? Or would it be negligible/not worth it?
@speedstyle. · 1 year ago
The light simulation quickly plateaus, which is why it takes hours to get the (still noisy) reference. Maybe with a faster computer you could choose to spend an extra few ms on ray tracing, but for real time, if it takes 10 ms to raytrace and 5 ms to denoise/upscale, that's already only 66 fps. Once you add physics simulations and game mechanics, you won't want to decrease performance much further. For non-real-time applications like films it doesn't matter; people will spend hours per frame, then run this denoiser on the 'reference'.
@RealCreepaTime · 1 year ago
@@speedstyle. Ah, thank you for the response! That makes sense, and yeah, for video game ideals/standards you probably wouldn't want to do that. I will say my intended use is a bit different, but similar to the last portion you mentioned. I was thinking more of using it in rendering software, like the person in the pinned comment mentioned. It might be helpful for denoised viewport previews.
@dryued6874 · 1 year ago
Is this the kind of approach that's used in real-time ray tracing in games?
@zackmercurys · 1 year ago
Is this technique good at temporal denoising? Usually denoising causes flash artifacts from the fireflies.
@little_lord_tam · 1 year ago
Were the 200 fps achieved at 1440p or at 720p? I couldn't quite follow the fps in relation to the pixel count.
@davidstar2362 · 1 year ago
What a time to be alive!
@xRage88 · 1 year ago
You are wrong. This method is by NVIDIA, not by Intel.
@colevilleproductions · 1 year ago
light transport researcher by trade, dark transport researcher by light
@jacobdallas7396 · 1 year ago
Amazing
@frollard · 1 year ago
With NVIDIA getting all the fanfare the last few years, it's really interesting to see Intel get some spotlight. This is purely fantastic.
@walid0the0dilaw · 1 year ago
Glad NVIDIA isn't the only one in this space!
@PbPomper · 1 year ago
There are many others. Hopefully they don't make everything proprietary like NVIDIA usually does. It is Intel, so I am not optimistic.
@younasqureshi9179 · 1 year ago
Seriously, I get bored seeing everyone say NVIDIA invented this; look up the RT-capable PowerVR GPUs from fricking 2016.
@Veptis · 1 year ago
Some of the top Intel researchers for upscaling methods actually joined NVIDIA.
@ene_ai · 1 year ago
Yes, I see the problem: the Naruto-running dude in the simulation, lmao.
@fancy4343 · 1 year ago
That's absolutely insane
@therealOXOC · 1 year ago
I knew there was a reason I never changed systems since the 486. I know I can run Crysis on it some day.
@rb8049 · 1 year ago
Seems to me that even better results are possible. Some surfaces lose resolution. These surfaces have repetitive textures, which should be possible to replicate in the final image. Maybe you need both a high-resolution non-ray-traced stream and the ray-traced stream; the fine details of textures can come from the non-ray-traced version. Maybe use several different fast light transport methods at higher resolution and ray tracing at the lower resolution, and merge it all in one NN. They can definitely do better than this in a future paper.
@MRTOWELRACK · 1 year ago
This is exactly what modern games often do: hybrid rendering combining traditional rasterization and ray tracing.
@custos3249 · 1 year ago
Hold up. At 5:08, and more so at 5:14, is that noise I see in the reference that the program removed?
@woodenfigurines · 1 year ago
Will this be a feature of Intel's new graphics cards? Interesting times indeed!
@haves_ · 1 year ago
But can it be used on a not-seen-before scene?
@raguaviva · 1 year ago
I wish you would show the before/after comparison at the beginning of the video.
@chintanguru7083 · 1 year ago
Intel is back!
@InfinitycgIN · 1 year ago
What the actual fuk, this is unbelievable. The metro scene was unredeemable, but after the denoise, damn. Would love to see it included in Blender or some app to clean up renders; it would make life much faster for animations!