
DLSS 3.5: Ray Reconstruction 

Daniel Owen
197K subscribers
257K views

Nvidia's big Gamescom announcement is DLSS 3.5 Ray Reconstruction. It runs on all RTX GPUs, not just the 40 series, and its goal is to provide a better denoiser for RT that produces a better-than-native ray-traced image. We will still need third-party testing to verify these claims and analyze the quality, but for now let's take a look at Nvidia's claims about how it looks and how it works.
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsible amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.

Published: Sep 13, 2024

Comments: 1.7K
@jedimindtrickonyou3692 · 1 year ago
I still never get tired of you floating around the screen to point out different on-screen elements. It makes your videos more accessible because you help point out what the audience should be paying attention to.
@AssassinKID · 1 year ago
Soon he will become a downloadable cursor skin
@HanSolo__ · 1 year ago
"aaaaa..."
@Bajablast_scuba_cat · 1 year ago
It is also hilarious 😂
@phamaral249 · 1 year ago
and I love the noises he makes when resizing
@devonmoreau · 1 year ago
100%, it's my favorite quirk of these videos :)
@zidan40o0 · 1 year ago
it's really incredible that we have all these advances in graphical technology, yet enemy AI is still stuck in 1995.
@jadedandbitter · 1 year ago
Yep. Along with the destructible physics environments we were promised in like 2008.
@legendslog3911 · 1 year ago
Must have been wind
@ek9385 · 1 year ago
Games that look pretty are easier to market.
@saricubra2867 · 1 year ago
In 1995? Unreal Tournament 99's AI bots destroy a lot of modern games.
@NovaXP · 1 year ago
Enemy AI is left "dumb" intentionally because it would be a pain in the ass to beat otherwise. Computers can beat the best chess players in the world, they're definitely capable of beating anyone in games if left unrestricted.
@Real_MisterSir · 1 year ago
Gotta say, I love the tiny Daniel flying around and pointing at things as you explain them. It's hilarious but helpful at the same time haha
@gtx1650max-q · 1 year ago
I was dying when he was bouncing around like rays at the beginning xD
@shehroz295 · 1 year ago
It's functional and creative xD
@Donahue250 · 1 year ago
he needs to edit himself in a professor x wheelchair. preferably the yellow hover one.
@alexgr0111 · 1 year ago
I find it super annoying. 2nd video I watched from this guy and can't see myself following him.
@mobarakjama5570 · 1 year ago
@@alexgr0111 Don't let the door hit you on your way out.
@Narwhal001 · 1 year ago
AMD about to say they're working on FSR3.5 😂
@anthonybaker5199 · 1 year ago
Damn 😂
@user-mw4wq3ci3d · 1 year ago
LMAO
@darudesandstorm7002 · 1 year ago
and never mention it again for another year 🤣
@vitordelima · 1 year ago
They lost an opportunity to build something faster and universal with FSR.
@fafski1199 · 1 year ago
@@darudesandstorm7002 And then a year after that they'll finally release it, which is guaranteed to be second-rate in comparison.
@Chasm9 · 1 year ago
Incredible! Noisy and laggy ray tracing artifacts are the most off-putting visual anomalies to me, and here we go :) I'm so glad they've decided to tackle it this fast.
@Dionyzos · 1 year ago
I was a little disappointed by the artifacts when I tried Cyberpunk Overdrive. I expected it to look way cleaner. I hope this fixes that, at least in part.
@lilpain1997 · 1 year ago
@@Dionyzos Same here. It's honestly a huge reason why I didn't bother using it despite it being so damn good looking. The artifacts/noise when driving or just moving were annoying. I get distracted really easily and look around the screen a lot at random shit, so I notice the artifacts a ton.
@Chasm9 · 1 year ago
@@Dionyzos It's a growing field, and still in its infancy, so we have to be patient, especially when we don't know how it can be improved 😅 I'm still shocked we have real-time path tracing in video games. CDPR accurately stated that it's a 'technology preview'. The real deal is (hopefully) coming with Phantom Liberty 💪
@Junglechief87 · 1 year ago
@@Chasm9 When I was in school for Game Art, ray tracing was used for pre-baked lighting, but it took an eternity to render and still looked noisy; you'd have to smooth it out in Photoshop. To see it done in real time is ridiculous to me. This was in 2008.
@DIE2dayORelse · 1 year ago
I also hope they fix the artifacting of DLSS when used with TAA; it's subtle most of the time but really ruins the visuals of Cyberpunk for me below 4K. I think they "fixed" it, but it still seems to affect some of us (myself included). There's no real 100% fix that I've found; I've gotten it to stop, but it always comes back with game/driver updates or seemingly at random. Anyone who wants to play through Cyberpunk immersively in VR is SOL because of this: the game needs to run at native resolution while also displaying the VR image, and the only cards that can keep up are the 90-series GPUs without DLSS, and even they struggle.
@cybervoid8442 · 1 year ago
Wow Nvidia adding features to 20 series cards? Color me shocked
@Spr1ggan87 · 1 year ago
@@LibangF15 The Nvidia guy in the presentation said it's coming to all RTX GPUs.
@TheVanillatech · 1 year ago
"Features", huh? I prefer the word "gimmick". If we're lucky, we'll have 15 games that actually use it within 2 years, and half of those will be Cyberpunk / Quake 2 / Half-Life 1 / etc.
@xNemesis_ · 1 year ago
Lil bro deleted his comment, proving he didn't watch the video 💀
@infinityimpurity4032 · 1 year ago
@@TheVanillatech if it improves the ray traced quality and fps in games, why are you complaining?
@buckiesmalls · 1 year ago
@@TheVanillatech "Gimmick"? I don't think you know what that word means. Maybe AMD should start being the first to implement "gimmicks" instead of always playing catch-up.
@dededede9257 · 1 year ago
It's good to see that it's on all RTX GPUs and not only the 40xx.
@takemeseriouslyplx2124 · 1 year ago
Sad part is that without FG you would most likely not be able to play many ray-traced games on lower-end hardware anyway, since DLSS 3.5 won't be adding any FPS, as the guy in the presentation said; the only reason it might perform a little better is that the many denoisers take some performance in that scene. It's a catch-22.
@AlucardNoir · 1 year ago
@@takemeseriouslyplx2124 The 20xx series was an overpriced mistake, since RT works like shit on them. The 30xx series has decent RT performance only at the top end. But now Nvidia is showing interest in path tracing, meaning even their 40xx series is being humbled. What I wonder is whether they're interested because it's the obvious next step, or just so they can showcase their AI cores...
@Aleksey-vd9oc · 1 year ago
However, only the more powerful 4000 series models can comfortably render RT or PT.
@photonboy999 · 1 year ago
@@AlucardNoir The RTX 20 series was not a "mistake" in the way you describe. I can't really speak to price, since mining and other factors affected it. But on performance specifically, you HAD TO START SOMEWHERE. It's a chicken-and-egg problem: you won't have great software support without the hardware to drive it. And if Nvidia had spent a crap tonne of extra transistors making ray tracing better at the expense of traditional raster performance, people would complain about the FPS in non-traced games. The only way to do this is what Nvidia actually did: bite the bullet, put in enough ray-tracing hardware to get things started, and let game devs start adding it to existing games. Optimization comes later. If YOU have a better way that makes BUSINESS sense, I'd love to hear it. And this is nothing new in the computing world; for decades we've had features that weren't efficient at first: tessellation, various anti-aliasing methods, even audio when it ran on the CPU could kill performance until it became a solved non-issue. The trend is: 1) New feature, has issues, needs high-end hardware to brute-force it. 2) Optimizations and newer, more powerful cards; the feature filters down to mid-range. 3) Further optimizations and newer cards; lower-end hardware starts running the feature.
@ashamahee · 1 year ago
@@AlucardNoir Good. I love it when tech companies push the boundaries and are innovative; bring on the new tech and let's see what wonders it can achieve.
@spoots1234 · 1 year ago
If there's one thing Nvidia is good at, it's AI.
@joseyparsons7270 · 1 year ago
more like NvidAI
@itsaUSBline · 1 year ago
That is kind of becoming their primary focus now.
@sentryion3106 · 1 year ago
They know they're fast approaching the limit of the economic feasibility of how far hardware can be pushed, so they have to find another way to push it. This also means they don't have to compete with AMD on pure silicon, where AMD is ahead of them in chiplet design.
@OcihEvE · 1 year ago
As long as the CUDA API is the industry standard.
@Patrick-tw7nr · 1 year ago
@karl5010 They are. You're just too far up Jensen's arse to pay attention.
@bulutcagdas1071 · 1 year ago
Leave it to Nvidia to confuse the hell out of people regarding what the heck their own cards are capable of doing.
@WidePhotographs · 1 year ago
Stick to console gaming if this is too much information for you to take in.
@razorbackroar · 1 year ago
@@WidePhotographs lmao let the man have his cake & icing, my friend
@JustSkram · 1 year ago
@@WidePhotographs I think he's just referring to why all RTX cards can't utilize DLSS 3.0, yet they can utilize certain key features provided by 3.5.
@satakrionkryptomortis · 1 year ago
The best they can do is push a shitload of fake frames, ignoring your input and the game's response to it.
@G0A7 · 1 year ago
@@JustSkram Probably because all RTX cards have ray-tracing-capable hardware, but only the 40 series has frame generation hardware.
@OppaHan · 1 year ago
My question: I have an RTX 3000 series card. If I apply DLSS 3.5 in a game, will I get a frame boost like from 20 to 108 (SR+FG+RR), or from 20 to 60 like DLSS 2 but with Ray Reconstruction?
@nhenzimethoratadridlerin5546
The second one; the RTX 30xx series does not have FG, it's exclusive to the 40xx series.
@DragonOfTheMortalKombat · 1 year ago
DLSS: reaches version 3.5. Me, poking FSR: c'mon, do something.
@MrMeanh · 1 year ago
The main reason I don't like to use anything lower than DLSS Quality at 4k in games with RT is how "noisy" and "blurry" the image becomes (even more so in motion), so if this works well it might be the feature that makes me actually use DLSS Performance (or at least Balanced) in games.
@giglioflex · 1 year ago
It doesn't reduce blur; that's down to the fact that DLSS renders at a lower resolution and then upscales. This tech is only focused on improving the ray-traced lighting.
@guspaz · 1 year ago
@@giglioflex Games use fewer and fewer ray samples as you reduce the internal rendering resolution, so using DLSS Performance instead of DLSS Quality results in a far noisier, less defined image. You could describe it as surfaces sort of boiling while the denoisers try to compensate. DLSS 3.5 is supposed to solve that.
@unpseudototalementnormal517
@@guspaz So it's gonna improve DLSS and not only RT?
@joos3D · 1 year ago
If you have the performance headroom you can sort of emulate DLAA with DLDSR+DLSS for higher image quality
@joos3D · 1 year ago
Can't edit my comment, but the point of that would be to get around the limitation that ray reconstruction in 3.5 is currently limited to DLSS and not available with DLAA, like Owen mentioned. As a bonus you could potentially get even better image quality, depending on your hardware.
@Teh-Penguin · 1 year ago
Moving yourself over to the thing you're explaining is great visual support for what you explain!
@farazalikhan5242 · 1 year ago
Daniel, you explained it so well; no wonder you're a good teacher IRL
@bfhandsomeface409 · 1 year ago
As an AMD owner, still waiting on fsr3..............HELLO AMD are you still there! lol
@ZackSNetwork · 1 year ago
AMDone
@faultier1158 · 1 year ago
Yeah, that one is a bit of a bummer. AMD falling further behind isn't a good thing for consumers in the long run.
@BPMa14n · 1 year ago
This is going to take lighting, ambient occlusion and shadows to another level; these make the biggest difference in making worlds feel 3D and not flat.
@giglioflex · 1 year ago
It's an improved denoiser. It's inherently a slight improvement over what was already there.
@JustAnotherAccount8 · 1 year ago
Shaders are what give things a 3D feel, not ray tracing.
@justapleb7096 · 1 year ago
@@giglioflex As a movie buff, I can confidently say that an improved denoiser is NOT just a "slight improvement"
@kovrcek · 1 year ago
Try ReShade with SuperDepth3D; then the game won't just feel 3D, it will be! Of course you must have the hardware for it, but CP2077 converted to real 3D is godlike, for example. You won't even mind the lowered resolution or the little ghosting + artifacts that accompany the 3D effect/conversion. It's just so immersive. Or Subnautica (a kind of indie underwater diving game) converted to 3D is very scary (it's scary even in 2D lol).
@EvanOfTheDarkness · 1 year ago
Their AI denoiser seems to like adding more light to the scene than what is actually there. It's like bumping up the saturation on your TV: it looks more vivid but doesn't actually make the image any "better". One positive is that it'll now upscale the reflections too, but I'd temper your expectations.
@damara2268 · 1 year ago
Love this channel; Daniel always describes stuff so nicely and without any of the stupid "fanboy bait" content that a lot of other techtubers do
@odytrice · 1 year ago
It's difficult to explain, but these noise patterns are very common in ray-traced titles and you definitely see them, especially at 4K. Good to see that they're trying to tackle them.
@vitordelima · 1 year ago
@@kadupse Some raytracers reduce the amount of noise before post-processing via multiscale rendering, maybe it would also work for this rendering method.
@odytrice · 1 year ago
@@kadupse Exactly! In Cyberpunk it also shows up as nasty ghosting trails. Reminds me of GTA Vice City lol
@sumansaha295 · 1 year ago
@@kadupse very immersion breaking in Lego RTX
@photonboy999 · 1 year ago
Yeah, it's a processing-power issue. When it's obvious, it arguably doesn't make sense to even have ray tracing; at some point you're better off sticking with more traditional raster techniques. You might use screen-space reflections, for example, rather than ray tracing, and have the reflections disappear at certain locations or camera angles, but better that sometimes than reflections with a grainy flicker that catches the eye. I can tune out certain raster issues; I can't tune out the grainy flicker of ray-traced effects that don't have sufficient rays (and/or sufficient denoising) and look noisy.
@mryellow6918 · 1 year ago
@@odytrice Tbf, in Cyberpunk the default lighting solution is so noisy I'd rather play 30fps ray tracing than use that mess.
@jamesdavidsonasmr1449 · 1 year ago
Dude, you made this video 20 minutes after the GeForce video came out
@flovvers3 · 1 year ago
He is a man of focus
@joelhodoborgas · 1 year ago
Real MVP
@christophermullins7163 · 1 year ago
Daniel is the one take king 👑
@HanSolo__ · 1 year ago
Between two math lessons.
@Dionyzos · 1 year ago
Good teachers can improvise well :)
@Scytherman · 1 year ago
Ray Reconstruction seems like a great update to DLSS, and making it available to all RTX cards is great too. Can't wait to test it in Cyberpunk whenever they roll it out.
@haq44 · 1 year ago
It will be available with Phantom Liberty update.
@Wylie288 · 1 year ago
They have never just "not made something available". DLSS 3 is available for all RTX cards too; the one specific feature that uses hardware only the 40 series has is the only thing that's platform-specific, as with DLSS itself. Every Nvidia GPU with tensor cores has DLSS. They have never artificially limited their proprietary software within their own ecosystem.
@jm8080ful · 1 year ago
Crazy how you can juggle producing and releasing videos with a full-time job as a teacher, especially now that your channel is getting big and companies are definitely working with you; your schedule must be hectic 😵‍💫
@EvanOfTheDarkness · 1 year ago
So like 90% of this technology is doing upscaling *before* denoising for RT. The rest is their AI denoiser, which seems to prefer adding more light to the scene; it's not accurate at all, but it does look good, I'll give you that. At least this is something AMD can copy in a millisecond too (unlike frame gen, which does seem to rely heavily on the optical flow accelerators of the 40 series cards).
@tepkisiz · 1 year ago
This looks revolutionary. Good denoising is the biggest issue with rendering in general, since it removes a lot of detail if you don't have enough rays. If this can do it intelligently, the results will be much, much better.
@chiari4833 · 1 year ago
And thus the 50 series will stay at 8GB and 16GB of VRAM for double the price 😂
@TTx04xCOBRA · 1 year ago
And you'll still be poor and mad
@sofyo2bib · 1 year ago
and down to a 96-bit bus
@HanSolo__ · 1 year ago
@@TTx04xCOBRA No, I'd call someone a "sucker" if they pay prices pulled out of someone's ass. I can afford two 4090s. Do you think I'd pay that kind of money for a PC part only to play games on it? 😆 I have 5 other expensive and way more attractive hobbies I can spend this money on. This is the case where a steering wheel, a few 4K displays and hydraulic suspension for the seat make a racing gaming setup that costs more than an entire season of real racing (car included) in some low-cost stock racing cup. In Europe we have races with older cars that can't cost more than 1000 or 2000 euros. I'm sure that's a "worse" experience than gaming with a 4090. I don't care if you call me "poor". Go back to your mom's basement and play.
@TTx04xCOBRA · 1 year ago
@@HanSolo__ you're broke
@julianorozaa · 1 year ago
I hope you can turn this on without frame gen. DLSS 2 + this would be amazing.
@fiftyfive1s410 · 1 year ago
Yes, it's a free upgrade for all RTX GPUs, Turing included.
@gabepvpz · 1 year ago
You can! If you look at the chart it says Ray Reconstruction (All RTX GPUs)
@Miska2559 · 1 year ago
It would be logical that you can, since it will work on all RTX cards and not only the RTX 40 series like frame gen.
@kanta32100 · 1 year ago
If you can play with RT, it will just look better with the same FPS or slightly faster. Path tracing already looks sick, and now they're making it better.
@julianorozaa · 1 year ago
Awesome guys, thanks for the replies :)
@d34d10ck · 1 year ago
With GPUs becoming more and more different from each other these days, I don't think it's a bad idea to have vendor-specific graphics pipelines to get the best performance out of each of these GPUs.
@TheDarkguide · 1 year ago
It's always enjoyable to watch floating Daniel explain things. 😊
@Nobe_Oddy · 1 year ago
WOW, thanks for covering this!!! I haven't seen ANYTHING about Gamescom or this AT ALL!!! And I'm following like 10 tech/gaming channels... I guess everyone is waiting for AMD's 7800/7700 XT and FSR 3 announcement... Thanks Mr. O!!! :)
@joelashworth7463 · 1 year ago
The immersion you get playing in Overdrive is incredible, a complete game-changer. It was obvious there were some upscaling difficulties when playing; now we know some of it was denoising. I did wonder whether DLSS had been properly trained on ray-traced datasets; I guess the answer is yes, but the denoiser/rendering tech produced by the engine was a big problem. That screenshot showing the subtle reflections (from the light tunnel) of diffuse lighting is mind-blowing. Nvidia will now have the best traditional upscaler and a mind-blowing ray-traced upscaler (that is seemingly able to infer a better image from a very spotty one). Nvidia is really laying the body on AMD here. Will AMD be able to respond without tensor cores? Anyway, someone has to be the technical leader and actually lead; in this market it's definitely Nvidia so far!
@JudeTheYoutubePoopersubscribe
Nah AMD will never get close with ray tracing.
@rudrasingh6354 · 1 year ago
Apparently there are some "AI" cores on the Radeon 7000 series, but yeah, AMD software is way behind Nvidia's. They can't really humble Nvidia the way they did Intel in CPUs, because Intel got complacent for nearly a decade and Nvidia isn't making that mistake; they're just greedy, and AMD is following them in that regard as well.
@antivanti · 1 year ago
They REALLY should have had different names for the different technologies and a different umbrella term for the suite of DL techs
@thelegendaryklobb2879 · 1 year ago
Shhh, aware and informed customers are bad for business...
@antivanti · 1 year ago
@@thelegendaryklobb2879 Hehe. Though if these don't become universally supported technologies that developers can rely on, they'll die out (like PhysX etc.), and that would be even worse for business. That's why Nvidia worked with Microsoft on DirectX Raytracing.
@7xyn · 1 year ago
can't wait to try DLSS 3.5
@sesad5035 · 1 year ago
kys
@metamon2704 · 1 year ago
I don't know why they put FG under DLSS 3, since you can run it without enabling DLSS upscaling. It makes even less sense given that 3.5 works on older cards but FG doesn't; they just shouldn't have associated it with DLSS.
@bloxoss · 1 year ago
Yess! This incredible technology is no longer locked to JUST the 40 series! Proud of you Nvidia!
@JohnsDough1918 · 1 year ago
The main feature that came along with DLSS 3, Frame Generation, is still limited to the 40 series.
@nelsonmejiaslozada9362 · 1 year ago
Now 4080 owners will be able to play at 20fps with great lighting, a dream come true.
@TTx04xCOBRA · 1 year ago
Aww, you're mad lmao
@c523jw7 · 1 year ago
Salty😂
@Art_Vandelay_Industries · 1 year ago
I totally get and agree with the point you make about exclusive graphical features per manufacturer, but one positive I take away from this is that it reminds me of the early days of 3D graphics, or even early consoles, when having a different 3D acceleration card meant you got a different version of the game. I think it was a really exciting time, and this could be the beginning of another shift in technology, during which the big GPU companies diverge in innovation and we get a bunch of weird stuff.
@whitehorsept · 1 year ago
I'd like to point out that although I was also excited about those pink reflections, what I notice is that the fourth picture doesn't have any white light source while the other three do, especially the first one. That in itself could explain the different behavior and mislead our first reactions.
@tonymorris4335 · 1 year ago
This actually reinforces my opinion that there's probably a reason frame gen is limited to 4000-series GPUs. We still haven't seen AMD come out with a competing tech, let alone one that works as well, which means it might legitimately be hard to accomplish on hardware that isn't designed around it.
@thomaswhite9160 · 1 year ago
Frame interpolation is nothing new; TVs do it with "frame smoothing" (the soap-opera effect). What Nvidia is doing is that in-engine instead, inserting an AI-generated in-between frame between two frames. It works great on high-end cards because there are more frames to work with, and it's not as good on lower-tier cards; you still need good base performance anyway.
@fenril6685 · 1 year ago
It's not just an opinion; this is the actual reason DLSS 3 is not enabled for older Nvidia cards. The hardware structure of the 4000 cards is different and was designed with this technology in mind. AMD's FSR works with an entirely different method that has nothing to do with tensor-core algorithms.
@CaptainKenway · 1 year ago
Frame interpolation isn't voodoo magic or something Nvidia created; TVs have been doing it for a decade or more using low-end ARM chipsets. Turing and Ampere cards have the so-called 'Optical Flow Accelerator' that Nvidia is offloading the work to. They simply claim that the Ada one is improved and the previous-generation ones wouldn't offer as good an experience. Note that they're not saying previous-gen cards CAN'T do it, just that it'd (allegedly) have more artifacts, so they won't enable it.
@fenril6685 · 1 year ago
@@CaptainKenway Nvidia has said the Optical Flow Accelerator in the 30 series cards is not fast enough to get any FPS benefit out of frame generation. Of course DLSS is backported, because DLSS was designed with hardware in mind starting with the 20 series; they can do it, but it's not worth enabling because the entire point is a significant FPS boost. You're correct that frame interpolation is not new. What's new is the proprietary method they use and the quality of the image that algorithm produces, thanks to the custom hardware modules on the card specifically designed for this feature. That's what Nvidia claims, anyway. I don't really care either way whether it's true, as I hate Nvidia and will never go back to their hardware. I think bundling all these features under DLSS, when they're really separate features, is just another example of how anti-consumer Nvidia is. They've done stuff like this since the beginning of the company; it's an insult to the intelligence of their customers.
@giglioflex · 1 year ago
"Works as well"? It doesn't work well; it creates a ton of input lag. I can get the same effect on my smart TV. Frame interpolation is nothing new.
@Tetrapak1234 · 1 year ago
Your pointer finger is on point as always :D
@yeezystreetteam · 1 year ago
Watching you shrink yourself into a fly and start buzzing around the screen is all the breakfast i need
@existentialselkath1264 · 1 year ago
Did you specifically ask about compatibility with DLAA, or did you just ask about native? I may have misunderstood your question as one about native TAA. I'd be surprised if DLAA is unsupported
@Psychx_ · 1 year ago
One of the main points of RT and PT was physically accurate light simulation, getting rid of the approximations and artistic tricks used to achieve a somewhat realistic result. Now, through the power of AI and INT8/FP16 calculations, the approximations are back. Btw, the effect shown in the slide deck can be achieved with some regular screen-space effects.
@Niosus · 1 year ago
No, they can't. SS effects can't show anything off-screen and have horrible artifacts at the edge of the screen and in combination with occlusion. The AI/lower-precision calculations aren't approximations in the same way the old-school effects are: the old tricks do the wrong calculations to get somewhat realistic results quickly, while the new approximations perform the correct calculations at a lower precision or frequency. Sure, both have their artifacts, but the new effects will scale with new hardware to greatly reduce them. Artist effort and workflow with the new effects are also greatly reduced. We're not quite at the inflection point yet, but ray tracing will become the norm and the noisy artifacts will go away.
@TheDravic · 1 year ago
You would have to be incredibly ignorant to think screen-space effects can reproduce path-tracing effects outside of canned and cherry-picked still comparisons. The moment something moves outside of screen space, it's game over.
@giglioflex · 1 year ago
@@Niosus That's not entirely accurate. The new denoiser uses AI, as does DLSS to upscale; in both cases you have an AI making an educated guess. The only difference is that the AI is better at guessing than an algorithm.
@Psychx_ · 1 year ago
@@Niosus I specifically said the effect in the slide deck (the Cyberpunk 2077 comparison screenshot). No RT, nor AI, is needed to have an on-screen bright purple object bleed color onto the concrete below it. And no, the new effects won't scale with the hardware, not without a fundamental change in GPU architecture design. RT/PT and the algorithms employed, e.g. by UE5, leave more than half of the ALUs idle due to lane masking, as modern shader code has essentially become a cascade of "if-else" (or worse, "switch") statements, with GPUs near-always having to execute every branch sequentially. We're facing at least another 5 years of games running like shit relative to raw HW performance and looking blurry as f*ck, due to devs not optimizing anything anymore and due to upscaling, denoising and other such reconstruction and image-synthesis techniques. I'm not having it!
@Niosus · 1 year ago
@@Psychx_ Until you look down and all your lighting is gone. Or until there's an object between the ground and the light source and you get a weird shadow around it; in most games that object could even be your weapon. Do you have any source for your UE5 claim? Thousands upon thousands of engineers at Epic, Nvidia and other places are working on this topic. You expect me to believe that a random YouTube commenter is the only one to figure out they're only using half the GPU, and nobody has bothered to fix it and literally double performance for free? I find that extremely hard to believe. I'm not a graphics engineer, but I am a software engineer, and in my experience, every time someone proposes a simple solution to a complex problem, it's just wrong. You can say a lot about Epic and Nvidia, but not that they're technically incompetent.
@graxxor · 1 year ago
Sorry, but the teenager in me low-key chuckled out loud when I saw you use your tiny image as a cursor to indicate the screenshots... Love it!
@setiadisatria1038 · 1 year ago
Dude... here I am waiting for FSR 3 news and instead I find DLSS 3.5 news... AMD, please give us more competition on the technology side as well as the hardware side. As for the exclusivity, I agree: if there's too much exclusivity in technology, devs will probably lean towards the better one, whether it's easier to implement or has the better overall result. This leads to the old classic scenario where AMD is stuck with the inferior tech (you couldn't use certain options, like the hair stuff in Tomb Raider, IIRC).
@TTx04xCOBRA
@TTx04xCOBRA Год назад
You shouldve bought Nvidia dude
@setiadisatria1038
@setiadisatria1038 Год назад
@@TTx04xCOBRA well, I'm using RTX 3080 now, but the reason I'm waiting for FSR 3, is for the actual competition, and extra option for those couple games that only has FSR
@TTx04xCOBRA
@TTx04xCOBRA Год назад
@@setiadisatria1038 FSR 3 is heavily rumored to only work on RDNA 3 GPUs
@setiadisatria1038
@setiadisatria1038 Год назад
@@TTx04xCOBRA ahhh, well that's add another reason for people to buy 7000 series card huh? Finally AMD turns to the darkside, just like Nvidia mwehehehehe. Welp, we'll see how it turns out, if it's indeed interesting then..... It's time to actually start the technology wars then, which is GOOD. Might not be good in the short run, but probably good in the long run
@adriancioroianu1704
@adriancioroianu1704 Год назад
@@setiadisatria1038 What technology war? Name a technology AMD released in the last 10 years that wasn't a clone of an Nvidia technology released prior to it? There is no competition, look at the market share and steam hardware survey numbers. Nobody buys amd. Even people who talk shit about nvidia buys nvidia. Btw competition is not necessarily good for the consumers, despite the clichee. Look at cpu market, since the competition sharpened, amd has now raised the priced beyond intel prices in the dominant era. So, (1) don't fall for clichee's and (2) buy what serves your interest best, don't think about anything else.
@wabbits4596
@wabbits4596 Год назад
I rolled my eyes at first seeing this assuming it wasw only 4000s series and my 3080ti would be left behind, but wow this really looks absolutely incredible and I cant wait to test it
@Leptospirosi
@Leptospirosi 1 year ago
These hardware-tied features are very bad for game development: they can put PC gaming on the path of console "exclusiveness". I'd rather have a worse RT API that anyone can use, like DX, OpenGL and Vulkan, than old-style 3dfx closed doors. FSR and XeSS are both hardware agnostic, and I much prefer them that way. People forget how bad it was when 3dfx was in the position Nvidia is in nowadays: Nvidia itself used to be the "underdog", with ATI, PowerVR and Permedia, bringing free 3D acceleration to everyone.
@vitordelima
@vitordelima Год назад
The Unreal Engine has its own upscaler too.
@pliat
@pliat 1 year ago
XeSS is not hardware agnostic; there are two upscalers called XeSS, one for XMX cores and one for DP4a (other GPUs). Hence why XeSS is better on Intel. Until AI accelerators become standardised (fat chance), this will continue.
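[Editor's note] For context on that fallback path: DP4a is just a 4-wide int8 dot product with 32-bit accumulation, an instruction ordinary shader ALUs support, which is why that XeSS variant runs on non-Intel GPUs (more slowly than the XMX matrix-core path). A sketch of its semantics in Python (the function name and numbers here are illustrative, not Intel's API):

```python
def dp4a(a, b, acc):
    """4-element signed int8 dot product accumulated into a 32-bit
    integer - the semantics of the DP4a instruction that the generic
    XeSS path falls back to on GPUs without XMX matrix units."""
    assert len(a) == len(b) == 4
    for x in a + b:
        assert -128 <= x <= 127, "operands are signed 8-bit"
    return acc + sum(x * y for x, y in zip(a, b))

# One quantized "neuron" of an upscaling network is just many
# dp4a operations chained into the same accumulator:
weights = [12, -7, 3, 100]
pixels  = [90, 15, -4, 2]
acc = dp4a(pixels, weights, 0)
print(acc)  # 90*12 + 15*(-7) + (-4)*3 + 2*100 = 1163
```

The XMX path does the same arithmetic, but on whole matrix tiles per instruction, which is where the speed difference comes from.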
@fafski1199
@fafski1199 1 year ago
There's always been proprietary hardware and software in the PC market, just as much as there has been in the console market. Intel CPUs have always had different architectures and have used different sockets and motherboards than AMD CPUs, for decades. The same is true of Nvidia GPUs and AMD GPUs. Both have always had a different set of proprietary hardware, features, drivers and other software. It's just the way things are. If any of them come up with a new idea, innovation or patent, then they should have the right to choose whether to share it or to close it off from the competition and instead use it as a selling point. For example, do you think AMD ought to let Intel use their Infinity Cache and 3D V-Cache? Or even let Nvidia copy their MCM design? Or should they reap the rewards and force the competition into coming up with their own solutions in order to compete? Like it or not, proprietary products and IPs are a very good way of forcing companies to innovate, come up with their own ideas and compete against each other. And healthy competition is always a good thing. From a business perspective, making something open source also isn't going to earn you a dime and is helping out the competition's product just as much as it's helping your own.
@Leptospirosi
@Leptospirosi 1 year ago
@@fafski1199 No, you are missing the point by a mile. AMD, Intel and even Nvidia never tried to push proprietary APIs into games until DLSS. XeSS is obviously heavily optimized for Intel hardware, or rather the other way around, and so, I guess, is FSR for AMD. Despite this, they can still run on other hardware if the developers choose to implement that API in the game. I can still test XeSS on Radeon or RTX GPUs if I want, or FSR, and see how it compares with other upscaling APIs if I like, but I will never be able to do the same with DLSS, by design, meaning that code is for the exclusive benefit of Nvidia owners, and I pay for it even if I don't own an RTX card. I don't care what is in the drivers of a Ryzen or Intel motherboard and CPU: the program will work exactly the same on both systems. Like in the old 3dfx Voodoo API times, DLSS is brand-tied, and Nvidia pushes developers to code for their hardware specifically. You can see the difference: I can write code for FSR or XeSS, and I guess the API designers will heavily optimize their hardware to run better on their own architecture, but they don't purposely cut off other brands. If AMD is smart enough, it can freely design a GPU which is better optimized for XeSS than Intel's, as it's free-use software. Intel and AMD will never be able to design a GPU able to run DLSS code anyway, because of legal constraints, exactly as happened with the Voodoo dedicated API. The developer should set the API as an overlay for their game to work, and it's up to the drivers and the hardware to optimize for it.
@MLWJ1993
@MLWJ1993 1 year ago
@@Leptospirosi You don't need to specifically code anything for DLSS... DLSS, FSR 2 & XeSS make use of the exact same inputs that are generated by the game engine. The only thing you need is the SDK for the upscaler in question, but that's not exclusive to DLSS; FSR 2 & XeSS also require their respective SDKs in order to work. What is different is that DLSS does not have any inferior fallback for a generic GPU without "Tensor Cores".
@ICaImI
@ICaImI Год назад
Wonder what it will look like 10 years from now, when the projected raw computation power of hardware should be enough to do RT without having to rely on neat optimization tricks.
@MATTE.U.K
@MATTE.U.K Год назад
It will look pretty realistic bro
@rudrasingh6354
@rudrasingh6354 1 year ago
You could probably do 6-10 ray bounces in fully path-traced games with 4K textures and 8K screens, though the 8K screen is a bit overkill; more fps at 4K resolution is better. Imagine path-traced 4K games running at 240 fps.
@DavidAlfredoGuisado
@DavidAlfredoGuisado Год назад
Does this improve performance or just image quality? Also, you can hate on Nvidia for pricing and stuff but they're the ones bringing the tech forward and the rest are just playing catch up.
@GewelReal
@GewelReal Год назад
first slide shows slight performance increase
@allansolano5587
@allansolano5587 1 year ago
@@GewelReal But if you watch the video, Daniel said some scenes might, but it's not the norm; this is basically just an image quality option and shouldn't lift performance 90% of the time.
@lombredeshakuras1481
@lombredeshakuras1481 1 year ago
From the CP 2077 comparison it seems to do both.
@sacb0y
@sacb0y 1 year ago
Same performance, but sometimes better. Mostly, though, it just improves quality.
@groenevinger3893
@groenevinger3893 1 year ago
Without Nvidia you would still have a potato in your PCIe slot...
@LeGenDxKaOtiK
@LeGenDxKaOtiK Год назад
At least people who are interested enough in RT to want something like this pretty much all have NVIDIA GPUs already
@maxstr
@maxstr Год назад
The problem I always have with ray-tracing is how terrible it is at lighting up shadowed or dark areas. Most implementations I've seen make shadows too dark. For example, in broad daylight and going under a tree or something makes it almost pitch black. I'm assuming it's because the light rays are being sent from the sun, but not bounced around or reflected like real light.
@MLWJ1993
@MLWJ1993 Год назад
Indeed, that sounds like very limited bounces for GI (or none at all & just RT shadows).
@joos3D
@joos3D Год назад
Multiple bounces are heavy to compute, so they are limited to 1-2 in games. Portal RTX lets you increase the bounces to see what it can look like.
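[Editor's note] The bounce limit matters because each extra diffuse bounce recovers a geometrically shrinking slice of indirect light; with only 1-2 bounces, occluded areas like the under-tree shadow described above stay too dark. A toy estimate, assuming every surface has albedo 0.5 (a deliberate simplification; real scenes vary):

```python
def indirect_light(albedo, max_bounces):
    """Fraction of indirect radiance recovered after a limited number
    of diffuse bounces (toy geometric-series model: each bounce
    reflects `albedo` of the light that reached it)."""
    return sum(albedo ** k for k in range(1, max_bounces + 1))

# With albedo 0.5 the full series converges to 1.0, so a typical
# 1-2 bounce game setup captures only 50-75% of the indirect light:
for n in (1, 2, 4, 8):
    print(n, indirect_light(0.5, n))
# 1 -> 0.5, 2 -> 0.75, 4 -> 0.9375, 8 -> ~0.996
```

That diminishing-returns curve is also why offline renderers can stop at a handful of bounces without looking wrong, while zero-bounce shadows look pitch black.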
@Azhureus
@Azhureus 1 year ago
I like that Nvidia is at least trying to get something right. Can't wait to try it in CP2077!
@adityabhusari9645
@adityabhusari9645 Год назад
C'mon AMD tell us something about FSR 3💀
@keyycee5547
@keyycee5547 1 year ago
So is it faking ray tracing - like, you don't turn on ray tracing in Cyberpunk, but if you use DLSS 3.5 it looks like you have RT turned on? Or is it that you turn on RT and DLSS 3.5 and the image gets clearer? Or does DLSS 3.5 get rid of the RT lighting and replace it, or parts of it, with AI? I'm confused - maybe due to the fact that English is absolutely not my native language...
@haystackdmilith
@haystackdmilith 1 year ago
Absolutely agree that vendor-locked features are a no-go. Hopefully this stuff will be generalized through DX, Vulkan and Metal.
@alwatts
@alwatts Год назад
Someone has to be first mover though, and these features require hardware other than the system CPU to be able to run at a usable performance. Likely rules out a software company led approach. Are you suggesting you’d prefer for these features to just not exist at all? Someone has to be first (it just happens to be Nvidia in this case), then others will follow with their own and it will become a standard with support then added to DX, Vulkan etc.
@RicochetForce
@RicochetForce Год назад
Are you seriously whining like this? Nvidia is typically the one pushing the tech in this market. AMD doesn't do it. Intel doesn't do it. Someone has to do it first, and it's up to the others to match or exceed that.
@rudrasingh6354
@rudrasingh6354 1 year ago
Who is going to rent a supercomputer to train AI models for stuff like this and make it open source? AMD hasn't made anything at this scale of quality yet. Microsoft isn't going to do it for free either, though they have a better chance of doing it for DX12 or a supposed DX13. Don't expect this in Vulkan - the Khronos Group doesn't have that kind of money, and Apple doesn't care about games either.
@n8mr124
@n8mr124 Год назад
now new games will run native 20fps without upscaling
@user78405
@user78405 1 year ago
Intel and Nvidia are doing great for their customers lately. I wish AMD had engineers from Nvidia and Intel to make the 7900 XTX run faster than it does now.
@jayesh44781
@jayesh44781 1 year ago
Dang! Using DLSS I could buy any mid-range RTX card and no longer need high-end specs! Amazing to see this ❤
@1979rhino
@1979rhino 1 year ago
What about DLSS 2.0 with RR - what's that like on the 20 series, and what's the performance hit?
@bestonyoutube
@bestonyoutube 1 year ago
11:50 What do you mean by better looking? It looks terribly blurry compared to native.
@nomercy8989
@nomercy8989 Год назад
Wait this works on the 30 and 20 series as well? Can someone put this into portal RTX so I can try it out?
@keikei2185
@keikei2185 Год назад
Yes.
@atdraz9283
@atdraz9283 1 year ago
Alan Wake II is described as having fully ray traced (path traced) graphics on Nvidia's website (there is an image comparison as well). I can't wait to test this out, since I noticed that in a lot of games the image quality looked a bit dirtier with ray tracing on.
@obliviondust2719
@obliviondust2719 Год назад
AMD about to announce FSR 4!
@tylerthere5832
@tylerthere5832 Год назад
😅
@deimosok2003
@deimosok2003 Год назад
No frame gen : D hehehehe poor AMD
@pliat
@pliat Год назад
Just 2 more weeks!
@brandenwaite
@brandenwaite Год назад
The bright lights, at the top of those four images you started with... Look at the last one on the right... DLSS 3.5 obviously needs work to find more patterns if it's missing that lighting information.
@devencherry8976
@devencherry8976 1 year ago
Ray reconstruction sounds like an intriguing technology. I'm excited to try it whenever it releases.
@martineyles
@martineyles Год назад
The biggest difference between the right hand cyberpunk screenshot and the ones to the left is the removal of the white illumination above the scene, so is the render of reflections actually better?
@alanliang9538
@alanliang9538 1 year ago
Hopefully Nvidia can come up with AI to simulate more VRAM chips.
@dead_meat
@dead_meat Год назад
Lul. Ever the elephant in the room. "Shhhh, guys, it'll be fine, we've got this new trick just you wait!" No shame.
@ComradYeltsin
@ComradYeltsin Год назад
Earned my sub from this, very well done video, even the way you presented the final product then the explanation for me really worked. Looking forward to more!
@DaVizzle_Bro
@DaVizzle_Bro 1 year ago
Literally just bought a 4080 right before this announcement, and now I feel like this is a sign I made the right choice (knock on wood).
@mornedhel1
@mornedhel1 Год назад
4080 never was/is the right choice
@DaVizzle_Bro
@DaVizzle_Bro 1 year ago
@@mornedhel1 I can always take it back - but explain.
@V3RAC1TY
@V3RAC1TY 1 year ago
@@DaVizzle_Bro Worst price to performance of the current-gen GPUs; if you have the money, just get the 4090.
@DaVizzle_Bro
@DaVizzle_Bro 1 year ago
@@mornedhel1 I can understand if your point of view is that I should invest my money in AMD because I'd be getting more bang for my buck. But you have to realize, which I think a lot of people are overlooking, that if I get the AMD card equivalent to the 4080 in terms of performance (ray tracing notwithstanding), then I'm going to have to fork over another $100 to $200 for a 1000 W power supply, since I don't have that much with my current setup. Now granted, the premium Starfield bundle is VERY enticing with AMD's current promotion, but I have to say it is still not exactly winning me over. If AMD could come up with a better solution for FSR AND, more importantly, for their very big problem with ray tracing, then I might actually be won over, since the one thing I will concede AMD destroys Nvidia on is VRAM capacity, which is starting to show cracks in Nvidia's foundation in the most recent AAA games at 4K. Keep in mind also that my decision is heavily based on the fact that I have a 55-inch 4K TV, so I have to play games at 4K if I want them to look even halfway decent and not like a PS2 port.
@blasiansauce3125
@blasiansauce3125 Год назад
The 4080 is a fantastic GPU, if you have the money. Compared to a 3080 the price to performance is about 15% worse, but it is 50% more powerful. Saved me from having to upgrade my 750W power supply, runs much cooler, and you get the cutting-edge features
@awildconcept
@awildconcept 1 year ago
Say what you want about proprietary hardware and gatekeeping of features, but NVIDIA is really pushing the technical advancements for GPUs and PC gaming. Releasing this feature for DLSS 2.0 GPUs is also a nice touch.
@IghorMaia
@IghorMaia Год назад
The point is, Nvidia doesn't have competition.. and that is horrible for us
@pliat
@pliat Год назад
Good for me, i have a 4090.
@thisguy3500
@thisguy3500 1 year ago
@@pliat I'm sorry for your loss. Nvidia swindled me with the 980 Ti. We learn the hard way, I guess?
@pliat
@pliat Год назад
@@thisguy3500 i don't feel very swindled tbh, it's an excellent card.
@HanSolo__
@HanSolo__ 1 year ago
@@pliat RDNA 4 high-end GPUs will come later than Navi 44 and Navi 43 because they are aiming for 4x the performance of the RTX 4090. BTW, the RX 7900 XTX matches the 4090 in most 4K titles.
@thisguy3500
@thisguy3500 1 year ago
@@pliat Exactly how I felt at first; then 5-6 months later the 1080 released. At least with the 4090 there won't be a replacement until 2025, if that's to be believed. They intend to make people salivate for a $2200 RTX 5080.
@alexgontijo
@alexgontijo Год назад
If only this quality were real... this is just one screenshot... in movement, the quality goes away with that blurry mess. Thanks, I just want the pure horsepower nVidia.
@robertmyers6488
@robertmyers6488 Год назад
Nvidia just admitted that their reliance on RT produces a lower quality image. Those image imperfections are then amplified by FG and all with massive performance losses. As if that couldn't have been done with traditional methods to produce the better colored image. All so we can automate art with AI and can the artists.
@vitordelima
@vitordelima Год назад
Yes, RT is slow so it forces the use of very low resolutions and a lot of tricks to compensate for it (denoising, upscaling, frame generation, reduced ray bounces, higher resolution rasterized base images, ...).
@robertmyers6488
@robertmyers6488 1 year ago
@@vitordelima The way it was explained is that the rays are driving the image creation, rather than being an effect on existing textures. Am I incorrect in that understanding?
@vitordelima
@vitordelima 1 year ago
@@robertmyers6488 It should be this way, but the denoiser becomes confused by the lack of information and uses a rasterized image with only the geometry and textures to guide itself. This is shown in the slides.
@VocaloidEnjoyer1
@VocaloidEnjoyer1 1 year ago
@@robertmyers6488 I always love it when wannabe smartasses like you talk about stuff they have zero clue about; reminds me of the people who are afraid of vaccines despite professionals telling them they are safe.
@chosenundead66
@chosenundead66 1 year ago
0:17 "Let me aaaahhhh" Great video btw!!!
@podio_km4g532
@podio_km4g532 1 year ago
I mean, the feature makes sense and it most likely will be good (they already denoise RT stuff, so I don't see why not generate new rays entirely). But why TF does DLSS 2.0 have better reflections and GI compared to native?
@allansolano5587
@allansolano5587 1 year ago
I think native is not using denoisers at all, because those come in with DLSS, and you are at 20 fps, so maybe the ray hasn't even shown up yet hahaha
@podio_km4g532
@podio_km4g532 1 year ago
@@allansolano5587 Denoisers should be there in the game engine; no way an RTX 4090 can get those framerates while rendering all those rays (it would be more like a frame every 5-10 seconds, not 20 fps).
@Dionyzos
@Dionyzos Год назад
Because it's the hardware accelerated DLSS reconstruction algorithm that processes the denoising. Native only works with a simpler, let's say, stupid denoiser. It should be possible to do this without upscaling in the future, just like DLAA.
@podio_km4g532
@podio_km4g532 Год назад
@@Dionyzos ok makes sense
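[Editor's note] For anyone wondering what a "simpler, stupid denoiser" means in practice: the crudest option is a plain spatial blur over the noisy per-pixel radiance. A small Python sketch (my own toy example, not anything Nvidia or a game engine ships) showing that even a box filter cuts variance, at the cost of smearing detail - exactly the blur/ghosting trade-off a learned denoiser tries to avoid:

```python
def box_denoise(pixels, radius=1):
    """Naive spatial denoiser: average each pixel with its
    neighbours inside `radius` (window clamped at the edges)."""
    out = []
    for i in range(len(pixels)):
        lo, hi = max(0, i - radius), min(len(pixels), i + radius + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# A flat wall with true radiance 1.0, sampled with heavy per-pixel noise:
noisy = [1.0 + 0.4 * (-1) ** i for i in range(16)]     # alternates 1.4 / 0.6
print(round(variance(noisy), 4))                       # 0.16
print(variance(box_denoise(noisy)) < variance(noisy))  # True: blur cuts noise
```

The catch is that the same averaging that removes noise also removes real edges, which is why naive filters look smeary and why the denoiser is worth accelerating.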
@KenTWOu
@KenTWOu 1 year ago
0:20 But the far-right screenshot is clearly different - no bright white light sources on top; that's why the reflection has less white in it and the purple light is more pronounced. That's clearly a marketing bullshot to make the result of DLSS 3.5 more appealing.
@lennylennington
@lennylennington Год назад
Can I use it with older RT titles by manually updating the DLL file? Or does it have to be optimized and updated by devs?
@PyromancerRift
@PyromancerRift Год назад
Hawaii : The fires have destroyed so much ! nvidia : We can reconstruct that too.
@vellocx
@vellocx Год назад
I like that they keep pushing new solutions out to the public, even if it is exclusive. to me it's a proof of concept, which will make engineers look into denoising a different way so they might come up with their own solutions yada yada yada
@MLWJ1993
@MLWJ1993 1 year ago
There's a whole slew of graphics features we enjoy nowadays that were originally pushed by Nvidia as proprietary features. That something is proprietary now doesn't mean it will be proprietary forever, and since they don't need to make it compatible with everyone's hardware, they can push things to the masses faster and with lower research costs. It's not ALWAYS corporations being evil overlords...
@bondedomao
@bondedomao Год назад
It's so funny and yet effective the way you move yourself in the video hahaha
@hjge1012
@hjge1012 1 year ago
The lighting in those scenes (at the start of the video) is different, so it's really hard to compare. Why would Nvidia pick those as comparison material? Or are they supposed to be the same? If I had to choose, I'd say DLSS 3.5 looks strictly worse there. Tiles don't reflect light like that, and that lighting cylinder thing also looks way more muted in 3.5. How do you get such a strong reflection from such a muted light source? I don't know; most of it looks pretty unconvincing. The only thing that looks convincing in the marketing material is the fps numbers, and maybe those preview images (for whatever that's worth). I highly doubt those fps gains are anywhere close to true, though.
@Draanor
@Draanor Год назад
Have you been to Las Vegas at night? That's polished concrete. It's going to reflect color.
@giglioflex
@giglioflex Год назад
@@Draanor You think there'd be polished concrete in a dystopian city? I'd have to agree with the OP, the material properties were incorrectly set.
@Draanor
@Draanor 1 year ago
@@giglioflex Well, given that the first image without RT is polished concrete, I don't particularly care about your art design choices; the asset in the screenshot is polished concrete. Complain to the devs. And yes, polished concrete seems right at home in a dystopian Las Vegas style city...
@kosmosyche
@kosmosyche Год назад
You should look at nvidia's presentation to better understand why the lighting is so different in the screenshot. There's a video at the end where they show all 4 variants of the scene in motion. It's a scene with dynamic heavily colored lighting, so the lighting is changing constantly and is mismatched between all 4 variants, that's why they look so different in the screenshot. What I noticed right away though was that the scene with DLSS 3.5 changes raytraced lighting from the light sources in the scene much much faster, because of (I presume) faster denoising process which doesn't need as much frames for better accumulation of results. The difference is night and day in my opinion. Without DLSS 3.5 the secondary lighting lags behind by at least half a second and you can see how slowly it reacts to the changes in light sources. With DLSS 3.5 it happens almost instantly like it should. Basically DLSS 3.5 effectively provides faster light bouncing in the scene, which is much closer to what happens with light in reality (instant change). This is very promising. It's not yet completely lagless, but getting close to it. Also the reason the lighting looks more muted in the scenes w/o DLSS 3.5 is because it doesn't have enough time to fully saturate the scene because the light source changes faster than the GPU is able to fully process changes in RT lighting. You can see it clearly in the video, how it slowly and gradually changes from yellow-ish to blueish and back again, but never has the time to change completely to the new lighting state before the light sources change again, so it's always sorta kept half way. In DLSS 3.5 version because the light is bouncing much faster, the scene saturates fully almost instantly like it should.
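[Editor's note] The accumulation lag described above can be sketched numerically. Assuming a denoiser that blends each new frame into a history buffer with weight alpha (a toy model of temporal accumulation, not Nvidia's actual algorithm), the settle time after a sudden lighting change falls quickly as the denoiser is allowed to trust the current frame more:

```python
def frames_to_settle(alpha, tolerance=0.05):
    """Frames an exponential-moving-average history buffer needs to
    get within `tolerance` of a light that jumps from 0.0 to 1.0.
    `alpha` is the weight given to the current frame on each blend."""
    history, frames = 0.0, 0
    while abs(1.0 - history) > tolerance:
        history = (1 - alpha) * history + alpha * 1.0
        frames += 1
    return frames

# A cautious accumulator (alpha = 0.05, needed when input is very noisy)
# vs one that can trust cleaner per-frame input (alpha = 0.3):
print(frames_to_settle(0.05))  # 59 frames (~1 s at 60 fps)
print(frames_to_settle(0.30))  # 9 frames
```

That is consistent with the observation above: a denoiser that extracts more signal per frame can raise its blend weight, so bounced lighting catches up to the light source in a fraction of the time.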
@Big_Biba
@Big_Biba 1 year ago
AMD devs working overtime to catch up to DLSS 3 seeing this 🗿
@HanSolo__
@HanSolo__ Год назад
Play. Like. Full screen. Let's watch.
@frankderks1150
@frankderks1150 Год назад
No point having mixed feelings about the fact that the best optimized pathways for rendering are a combination of hardware rendering pipelines and software together. Ray tracing will mature to a point that game developers don't have to fake the lighting effects with shaders and just have to set the lighting sources in the scene and the dedicated ray tracing hardware and software does it automatically with far better results. It's just that nVidia is willing to develop and push those technologies and others are behind and try to catch up.
@AtakenSmith
@AtakenSmith 1 year ago
That 20 fps looks very arbitrary, and why is the DLSS Super Resolution lighting different? It should look the same - it's about resolution, not lighting and stuff... Am I missing something, or is this another piece of misleading marketing...?
@MutantMasterRace
@MutantMasterRace Год назад
I thought that same thing...
@antont4974
@antont4974 1 year ago
My understanding: before denoisers composite an image, the engine needs to cast and sample rays for lighting. This process takes time, and you need many rays for good-looking results; ReSTIR helps with this issue, but the result is still noisy, and at this step NVIDIA's new denoiser starts to create an image that in theory will better represent something like a game. Denoisers fill in missing pixels - that's why the result looks different.
@AtakenSmith
@AtakenSmith Год назад
@@antont4974 I see, thanks for the explanation. ^^
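[Editor's note] The reason so few rays look so noisy, as described in the exchange above, is plain Monte Carlo error: the standard deviation of a pixel estimate falls as 1/sqrt(N) with N rays. A quick simulation (my own sketch with toy numbers) of a pixel whose true light visibility is 25%:

```python
import random

def estimate_pixel(n_rays, rng):
    """Average n_rays random 'did this ray hit the light?' samples;
    the true coverage is 0.25, so the estimate is unbiased but noisy."""
    hits = sum(1 for _ in range(n_rays) if rng.random() < 0.25)
    return hits / n_rays

def rms_error(n_rays, trials=2000):
    """RMS error of the pixel estimate across many independent trials."""
    rng = random.Random(42)
    errs = [(estimate_pixel(n_rays, rng) - 0.25) ** 2 for _ in range(trials)]
    return (sum(errs) / trials) ** 0.5

# Error shrinks roughly 4x for every 16x more rays - hence the appeal
# of a denoiser over brute-force sampling at real-time ray budgets:
print(rms_error(1) > rms_error(16) > rms_error(256))  # True
```

At ~1 ray per pixel per frame, which is the real-time regime, the raw estimate is mostly noise, so the denoiser (plus sample-reuse schemes like ReSTIR) is doing most of the visual work.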
@mrnicktoyou
@mrnicktoyou Год назад
So this is a fix for their poor image quality when using Ray tracing and DLSS.
@farazalikhan5242
@farazalikhan5242 Год назад
Another up for Ngreedia vs AMD whose FSR3.0 is still in doubt
@guitaristkuro8898
@guitaristkuro8898 1 year ago
With Portal RTX you can compare reference and denoised, and boy does it eat quality and produce artifacts. The biggest improvement I hope to see from this tech, outside of that, is in the noise and shimmering DLSS introduces to certain aspects of ray traced scenes. Most often this is seen as what looks like moving film grain in reflections and such.
@Kadeda
@Kadeda 1 year ago
Welp... guess I'm going with greedy-ass Nvidia... AMD getting left in the dust... I think the next play is to consider Intel.
@itsaUSBline
@itsaUSBline Год назад
I think in another generation or two, Intel is going to be a serious competitor based on what we've been seeing with their driver improvements.
@sesad5035
@sesad5035 Год назад
intel has spyware drivers same for ngreedia.
@shanent5793
@shanent5793 Год назад
"DLSS off" is the reference, so by definition any differences are artifacts. It doesn't make sense to say that the RR image is "better" than the reference
@shreyass5756
@shreyass5756 1 year ago
This has been my big concern ever since I got an RTX GPU, because ray tracing didn't really make the gigantic difference it should have in most scenarios for the performance cost it asks. Now this feature may actually make ray tracing worth it for once.
@antimsm6705
@antimsm6705 Год назад
NVIDIA should spend more time making their GPUs better instead of forcing us to use an upscaler and fake frame generator to give the illusion of smooth gameplay.
@haukikannel
@haukikannel Год назад
And still you can do the ”same” thing with traditional ways… With Nvidia 1080 and other older GPUs… By hand coloring the images etc…
@shreyass5756
@shreyass5756 1 year ago
@@antimsm6705 This is not an illusion of smooth gameplay whatsoever. This is genuinely a way to improve ray tracing as a feature, which will be used more and more in coming games. Sure, frame generation is a gimmick, but this isn't. DLSS 2 is definitely not an illusion; it's a very good technology. Game developers are simply exploiting this feature to hide their garbage optimization. It's not all Nvidia's fault. (I do agree that the 4000 series is bad.)
@shreyass5756
@shreyass5756 1 year ago
@@haukikannel That is not what ray tracing means at all... It's completely different from what you are saying. The very point of ray tracing is to remove the need for such hand-tuned traditional methods, decrease the effort on the developer side, and increase the workload on the hardware side of making realistic lighting. You are right that such *effects* can be recreated using traditional methods, but it takes an insane amount of work and is very time-consuming, and even then it won't be completely realistic. Nvidia is trying to make lighting in games realistic without an insane amount of work, and more accessible to every developer.
@adriancioroianu1704
@adriancioroianu1704 Год назад
I disagree. A proper GI implementation of RT is actually making a big difference in immersion. You are probably talking about implementations that can barely be called "RT" like some shadows and some reflections here and there, which is most of the "RT" games today to be frank. But its still a step over pure raster, its like the new "ultra".
@syholt4621
@syholt4621 1 year ago
This feels like it could lead to games that you can only play with Nvidia cards and games you can only play with AMD cards to get access to their special technologies - or you'll have to play them without either technology enhancement if you have the wrong card.
@MrVidification
@MrVidification Год назад
Pretty much the same as frame generation, as long as one method becomes available as open source or multi format...
@kon_radar
@kon_radar Год назад
You should add a feature in OBS for your camera movements around the screen: While dragging, rotate yourself by 30° with a smooth animation. It would be even funnier.
@r4plez
@r4plez Год назад
but why? are you not entertained?
@mishashalda4516
@mishashalda4516 1 year ago
So you can turn on ray tracing but can't turn on ray reconstruction without upscaling? Does Nvidia think we are stupid or what?
@MajorSpam
@MajorSpam Год назад
Stop relying on software for lackluster hardware
@darkshark9137
@darkshark9137 5 months ago
Agreed
@prashanthdoshi9926
@prashanthdoshi9926 1 year ago
There won't be competition if everything works on every GPU. If not for Nvidia, we wouldn't have FSR 2 and XeSS.
@BoothTheGrey
@BoothTheGrey 1 year ago
For several years now, when I look at very detailed "enhancements" of graphics the manufacturer wants me to acknowledge... I often don't know if the new version is really "better". Very often, for me, it's just... different. I have similar experiences when comparing pictures taken by the best smartphone cameras. Yes, they are all different in detail. But which is really BETTER? In most cases I honestly don't know.
@photonboy999
@photonboy999 Год назад
Ray-Tracing has been added to many games AFTER the game was designed WITHOUT ray-tracing in mind. And many raster shadows and reflections look GREAT already when used in appropriate situations. If you have Screen Space Reflections for example, then maybe the MOUNTAINS reflect on to a lake. It looks great. You don't really register that if you're far away. And if the game never lets you get CLOSE it's a non-issue. But if you DO get close then those reflections can disappear. There are similar issues for all shadows and reflections... also, in the FUTURE it will eventually be far, far simpler to design a game if you ONLY design it with ray-tracing. Because you don't need to keep baking the lighting. You're building the game, you move the objects around in real time, you change the light color etc and the game engine just re-calculates all the bounces (so shadows/reflections are just automatically created like in real life). So part of this is using ray-tracing where it makes SENSE visually that justifies the performance hit, and part of this is about making it easier for the game developers (eventually). ALSO... there are certain things game devs SIMPLY DO NOT DO because raster techniques fail. One example would be a crystal cave with a lot of reflections AND refractions. So light bouncing off and going THROUGH a lot of objects. you can create those areas, but they don't look like they would in real life. And they don't currently do it with ray-tracing because it would cost too much. So there ARE situations that can only be done WITH ray-tracing and only WHEN it's computationally feasible. (Pixar knows how computationally expensive every frame is when generating animated movies. When they did "Frozen" the most computationally expensive scene, which was calculated with path tracing, was in the frozen castle due to all the light bounces needed to make it look correct.)
@BoothTheGrey
@BoothTheGrey 1 year ago
@@photonboy999 You remind me of how music theorists talk about why one song is great... and another is not so great... while I just like both. Sorry, folks. You seem to not get it. You want me to appreciate that one is better than the other and follow your opinion. I just DON'T. For me the "better" graphics often just aren't better than the already very good ones. I like both. For me they are just different.
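To put photonboy999's point about light bounces in concrete terms: a path tracer estimates each pixel by averaging random light-path samples, and the noise only shrinks with the square root of the sample count, which is why real-time ray tracing leans so heavily on denoisers like the one DLSS 3.5 replaces. A minimal sketch, with `rng.random()` standing in for a real path-traced sample and 0.5 as an arbitrary "true" brightness (both are illustrative assumptions, not renderer code):

```python
import random

# Estimate a pixel's brightness by averaging noisy light-path samples.
# rng.random() stands in for one path-traced sample whose true mean is 0.5.
def render_pixel(samples_per_pixel, rng):
    total = 0.0
    for _ in range(samples_per_pixel):
        total += rng.random()
    return total / samples_per_pixel

def spread(estimates):
    # Standard deviation of the estimates: how noisy the pixel looks.
    mean = sum(estimates) / len(estimates)
    return (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5

rng = random.Random(42)
for spp in (1, 16, 256):
    estimates = [render_pixel(spp, rng) for _ in range(1000)]
    print(f"{spp:4d} samples/pixel -> noise {spread(estimates):.3f}")
```

Halving the noise costs 4x the samples, which is why offline renders like the "Frozen" scene photonboy999 mentions can take hours per frame, and why games shoot few rays and denoise instead.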
@baroncalamityplus
@baroncalamityplus 1 year ago
Considering the rumors that Starfield will only support FSR, maybe Nvidia can work on a feature that uses their cards' AI to improve FSR to near-DLSS levels? Probably just a fantasy.
@ZoragRingael
@ZoragRingael 1 year ago
Yeah, the best solution would be an upscaler API included in DirectX 12 and Vulkan and supported by all vendors. It doesn't seem to be going in that direction right now, though.
@antimsm6705
@antimsm6705 1 year ago
This DLSS is getting worse and worse. DLSS 3 has fake frames that are not even part of the game, and the latency is higher, so there's no reason to use it; also, these fake frames look horrific and will seep into your subconscious, not a good idea. DLSS 2 is just an image upscaled from a lower resolution, so all DLSS creates a fake image, not what the developers intended. Developers are now using DLSS as an excuse not to optimize their games. Now, even with the newest GPUs, you often have to use DLSS.
@RCmies
@RCmies 1 year ago
I don't know if you pointed this out in the video, but something that really caught my eye: the DLSS off, DLSS 2, and DLSS 3 scenes all have this white shine on the right side of the RGB light fixture, whereas the DLSS 3.5 scene doesn't. Is it completely removed by DLSS 3.5 even though it should be part of the scene? It's visible in the DLSS off version, which should be the ground truth without any AI or generated frames on top. If so, why, or is this an error? What is going on? This seems important, because that bright white light should definitely affect the color of the light bounced into the alleyway.
@coky7754
@coky7754 1 year ago
Yeah, so my 2080S can run "A WHATEVER"-RECONSTRUCTION, and maybe I will get worse FPS compared to not running it. Right? WOW, IT'S GREAT. THANKS, LEATHER MAN.
@itsaUSBline
@itsaUSBline 1 year ago
It's denoising.
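For anyone wondering what "denoising" means here: real-time denoisers typically blend each new noisy frame into an accumulated history buffer (plus spatial filtering, and in Ray Reconstruction's case a neural network on top). A toy sketch of just the temporal-accumulation idea, with made-up numbers (0.7 "true" brightness, ±0.3 per-frame noise; none of this is from Nvidia's actual implementation):

```python
import random

# A noisy ray-traced pixel: the true value plus per-frame random noise.
def noisy_sample(true_value, rng):
    return true_value + rng.uniform(-0.3, 0.3)

# Exponential moving average: blend each new noisy frame into a history
# buffer, the core idea behind temporal-accumulation denoisers.
def accumulate(history, sample, alpha=0.1):
    return (1 - alpha) * history + alpha * sample

rng = random.Random(1)
true_value = 0.7
history = noisy_sample(true_value, rng)
for _ in range(200):
    history = accumulate(history, noisy_sample(true_value, rng))
print(f"denoised: {history:.2f} (true value {true_value})")
```

The catch is that a low `alpha` smooths noise but lags behind moving objects, which is exactly the ghosting/smearing trade-off the video discusses.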
@alwatts
@alwatts 1 year ago
The exclusivity problem is that these DLSS features require specific hardware acceleration that the CPU can't do itself. The realistic options are that either AMD and Intel catch up, maybe even with some kind of standardization around the required hardware (e.g. tensor cores), or Nvidia stops innovating because the competition can't keep up, and then the industry stagnates and it's back to plain old clock speeds, fab nodes, and bus widths. At least if one player is innovating like this, it may act as the driving force for the others to develop their own solutions in the area.
@ZoragRingael
@ZoragRingael 1 year ago
Nah, that's not true anymore. Intel has its own "tensor cores", called XMX, and uses them as an excuse to vendor-lock the best XeSS upscaling mode to Intel-only GPUs. AMD has AI acceleration included in the Radeon 7000 series, not used for anything except compute/AI/ML right now, but there's a suspicion it'll be used for FSR3. So yeah, Nvidia is just running its usual vendor-lock schemes.
@alwatts
@alwatts 1 year ago
@@ZoragRingael There’s no standardisation on the hardware, though. Unless you are suggesting that Nvidia shouldn’t invent new ways to break from the clock-speed/fab/bus grind to get more performance, there’s not really a viable choice other than to develop for their own hardware. Also, it sounds like an industry-wide problem if Intel is also using XMX as a lock-in and AMD may follow for FSR3. Maybe we need DirectX AI/ML in DX13 that these new features can build on top of.
@ZoragRingael
@ZoragRingael 1 year ago
@@alwatts There weren't any standards for RT either, until Microsoft came up with DXR and the Vulkan community came up with Vulkan RT. The same needs to be done for upscaling tech.
---
>Unless you are suggesting that Nvidia don’t invent new ways to break from the clock speeds/fab/bus grind to get more performance,
Nvidia didn't. Nvidia is now using the newest and smallest node and benefits hugely from it. As for the RT/AI tech in their RTX GPUs, the reason is the following:
- Nvidia uses the same architecture for both gaming and enterprise/server/compute.
- Nvidia introduced tensor cores and RT tech for their enterprise customers. DLSS and gaming RT were an afterthought, because tensor cores and RT hardware take die space, gimping the raster performance of Nvidia's GPUs, so Nvidia needed a way to turn that into an advantage.
AMD doesn't have that problem, since they have a separate architecture for enterprise/server/compute which includes AI/ML. If anything, it's AMD who's trying to overcome the limitations of process tech by utilizing chiplets.