I swear the video titles 5 years from now are probably gonna look like this "DLSS 4.1.69 vs FSRSS vs. XeSS vs. PiSS - Can Dr.Peppers' upscaling technique save us from the GeForce 7660's $2000 entry level price tag?"
I can see the differences and they definitely help to understand how the tech stacks up against each other. But I will agree that if I'm actually playing the game and not measuring my peepee with graphs and slowmos, I would find it borderline impossible to tell the difference between FSR 3.1 and DLSS, both at Quality. XeSS at Quality I could probably tell apart in certain games. This being at 4K of course; at lower resolutions I still think all of the technologies look a bit wack, and I'd rather upgrade than use upscalers at lower resolutions.
As an IT professional of 25 or more years now, I agree. Sure, if you go looking for the artifacts you can find them, but the quality difference across most games and tech these days is not like it was in the past. When you played Half-Life, or even the sequel, moving from a low-end GPU to a mid-tier one was a pretty big change, and if you went from low/mid to top tier, the performance and graphical changes were again significant enough that most untrained or uninformed people would pick up on them. I think on average the difference between low, mid, and high tier settings is much less dramatic in the current gaming era. When it comes to upscaling tech, most won't see the difference between 1080p and 1440p (okay, maybe some would), and if we talk 1440p to 4K, not many will see the change without slowdown or close inspection. Performance, now that is probably more noticeable to everyone: playing at sub-60 fps and jumping to a consistent 60 or above can be a game changer, and most will notice the improvement in fluidity, unlike extra grass or slightly clearer textures. So as long as these FG technologies can provide a smoother feel without degrading the image so much that the average layman can tell the difference, I think it is a big win, especially for lower-spec gamers!
It's a shame they waited until now to switch to DLL packaging, because now there are a few years of games stuck with an inferior upscaler unless the devs go back and patch it. But hey, at least that number won't grow anymore now.
Thankfully, if the game does support DLSS 2.0+ then you can use OptiScaler to translate it to FSR or XeSS. It only supports FSR 2.1.2 and 2.2.1 right now, but there's a branch with FSR 3 support for DirectX 11 games.
Did the frame-jumping DLSS walking animation also lead you to say "My god, that's bad"? It isn't common; DLSS *GENERALLY* is better, but it is deliberately a closed box so it can make you buy their graphics cards, and that should be a hard nope.
While I understand the intent, I believe "performance normalized settings" was a mistake. The intent of the video was to compare the improvement of FSR 3.1: how it compares to older versions of itself and how it stacks up against its current competition. Showing us comparisons when we know that each upscaler is not using the same internal resolution almost defeats the purpose of the breakdown. Comparing one upscaler pulling from 1080p to one pulling from 1440p feels pointless. I understand your reasoning, but I think it would have been better to just show the improvements and compare them at each similar internal resolution, then afterwards highlight the different fps performance at each setting. Anyhow, love the work, long-time viewer.
Upscaler performance also depends on your GPU generation and GPU class. You might get completely different performance results on a 4090, 2060 Super, 1080 Ti, 5700 XT, 2080 Ti, etc.
He basically did that near the end of the video, comparing the "Quality" modes across upscalers, did you actually watch the video? Even in that case, each upscaler uses slightly different internal resolution targets, so you're never going to get a true apples to apples comparison. Comparing them based on actual performance makes sense, as the whole reason these upscaling technologies exist is to provide improved performance while minimizing image quality loss.
I do understand the "performance" settings being compared, but it makes no sense (at least to me) to compare 1440p-to-4K upscaling with DLSS and FSR and then use XeSS 1.3 in Balanced mode, which is upscaling from 1080p... that's a huge difference pixel-wise, hence why XeSS is not delivering better results.
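For reference, the internal resolutions hiding behind these preset names can be worked out from the per-axis scale factors. A minimal sketch, assuming the commonly cited ratios (DLSS Quality at 1.5x per axis, XeSS 1.3 Balanced at 2.0x, and so on; exact values can vary by SDK version, so treat the table as an assumption):

```python
# Approximate per-axis render-scale factors for some upscaler presets.
# These are commonly cited values, not authoritative: check each SDK's
# documentation for the exact ratios of the version a game ships.
SCALE = {
    ("DLSS", "Quality"):         1 / 1.5,   # ~66.7% per axis
    ("DLSS", "Balanced"):        1 / 1.72,  # ~58% per axis
    ("DLSS", "Performance"):     1 / 2.0,   # 50% per axis
    ("XeSS 1.3", "Quality"):     1 / 1.7,
    ("XeSS 1.3", "Balanced"):    1 / 2.0,
    ("XeSS 1.3", "Performance"): 1 / 2.3,
}

def internal_resolution(out_w, out_h, tech, preset):
    """Return the internal render resolution for a given output size."""
    s = SCALE[(tech, preset)]
    return round(out_w * s), round(out_h * s)

# XeSS 1.3 Balanced at 4K renders internally from 1080p:
print(internal_resolution(3840, 2160, "XeSS 1.3", "Balanced"))  # (1920, 1080)
# while DLSS Quality at 4K renders from 1440p:
print(internal_resolution(3840, 2160, "DLSS", "Quality"))       # (2560, 1440)
```

Which is exactly the mismatch the comment describes: one upscaler working from roughly twice the pixel count of the other.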
I truly believe you're right. But with this new change Intel made to the scaling, I'm not sure their Balanced setting is still upscaling to 4K from 1440p.
Well, he normalized for performance in the initial comparison. It doesn't really matter how good it looks if it's not at the FPS you want, so he made sure that all the upscaling techniques were performing similarly first.
This reminds me of that tweet that goes "twitter is 90% someone imagining a guy, tricking themselves into believing that guy exists, and then getting mad about it"
@@trulsdirio Who cares if it is a design choice if I do not like it? By the way, the sword swing effect is a feature, not a bug. Hope you like that feature then...
If you need to zoom in 3x and slow down to 1/4 speed to elucidate the difference, is there really any tangible difference while gaming in real time, I wonder?
The flickering, ghosting and smearing of FSR is extremely noticeable in many games while playing. I switched to XeSS 1.3 whenever possible because it doesn't have these issues anymore, and the shimmering of XeSS 1.3 is much less noticeable.
For me, the shimmering and flickering are the things I can always spot and it's really distracting. Also if there's overall blurriness. Ghosting is something I don't spot that much during gameplay, although the GoT example here was really bad for FSR.
The zoom, slowmo, and other stuff is done because this is a youtube video. It's meant to drive the point and show examples in the most "understandable" way. For me personally, I can see it, to the point sometimes I'd rather stick to 60 clean fps rather than 90 fps with ghosting. It's one of those once you see it, you can't unsee it.
None of this will matter if AMD does not make a high end product next generation. My 3090 needs to retire soon but I'm hoping for a 5080 competitor from AMD to buy. If there is none, I guess I'm stuck with NVIDIA again 😔
Honestly I've been playing with FSR on my monitor and it literally looks better than XeSS and performs much better, so these videos are merely academic to me. Sorry @hardwareunboxed, I trust my fucking eyes more than your videos.
@@brunogm Google: Translate to English Clicks it. Nothing changes. I swear it's just trolling us now. Though to be fair naming schemes are pretty archaic lmao.
12:30 Surprised you didn't mention the weird warping DLSS is doing instead of ghosting, like it's splotching the ghosting out. Oddly enough, it seems like FSR has a clearer picture here despite the ghosting, due to it not being blurred.
@@JayMaverick Now I remember, I saw reports that PlayStation has its own upscaling software in the console. I mean, we've been upscaling on game systems since at least the PS4 era, and not just scaling with pure math.
Well, if AMD is only making a cheaper-to-build part that matches around the 7900 XT/XTX, Nvidia won't have any competition in the high end, and Intel will need a massive power draw reduction to be anywhere near AMD's and Nvidia's previous-gen cards.
I mean, Nvidia isn't going to fight until they have a reason to. They have zero challengers from Intel and AMD at the top end of the GPU market, and the upcoming GPUs seem to be the same: AMD isn't going to try to fight the 5090 for supremacy, and has already said as much. So nothing is changing anytime soon, bud.
What makes you think that's the vibe, lol? Maybe you're into underdog competition, but Nvidia is doing better than ever and looks to take the next-gen performance and software crown with their AI advancements.
XeSS XMX on Arc vs FSR 3.1 vs DLSS 3.7 would be interesting. With Battlemage just around the corner, this test would make sense to give some idea of what to expect. High-end Intel CPU and an A770.
Why would you need a high-end Intel CPU for this test? This is a GPU test, so a 7800X3D would technically bottleneck it less. I do like the idea of more XMX XeSS tests though; everyone seems to just test the DP4a path.
@@superamigo987 Because there is a performance gap when using an AMD CPU; there is a noticeable gain with an Intel CPU. Also, XMX is Intel-specific for Arc. Two different architectures. AMD/Nvidia, you could say, don't use the CPU much at all; Intel splits the tasks with the CPU, and it matters. Hence ReBAR for the data transfer and the use of E-cores: a lot less waiting and queuing for a lot better performance. The CPU is vital, two silicons working together. That's why Arc is very affordable: it is the selling point for the Intel CPU. It's also a reason, as I see it, for Nvidia looking into CPUs too. With Intel, every FPS counts; AMD/Nvidia need almost twice the FPS for the same quality of experience. For example, 40 FPS at 4K is very playable with Intel, while AMD/Nvidia need at least twice the FPS, because the waiting and queuing on a single silicon creates gaps and stutter to be filled. FPS from Intel is not the same as FPS from AMD/Nvidia, which share similar architectures. Then there is Intel Application Optimization (APO), paired with upcoming high-end Intel CPUs, which promises a 10-50% gain. So if you are looking for something that will age like fine wine, it is Intel.
I'll be honest, while you can find these artefacts when looking for them, all the technologies are good enough that I'd be able to just play the game without it distracting me
@@WrexBF You keep using that word, but I do not think you know what it means. (On a less memey note, I don't have a leg in the game, so why would it be?)
Why would you take something inferior when something superior exists? This is nothing like an audiophile snob problem; there's a reason each company, and now Sony, are all spending tens of millions to develop and market their own upscalers. In the future you will be right, but for now there are simply inferior and superior ones. They all have their own issues, though.
I don't understand the performance-normalized idea. Generally, while using FSR, most will just use Quality for the little extra boost to FPS while maintaining image quality.
It seems especially pointless since most will not notice the slight performance losses between the different upscalers thanks to displays with adaptive sync, but will most likely notice the big improvements in overall image quality.
It's not pointless, because it's inherently an fps-increasing technology, and one that games are increasingly relying on. A 40% performance boost is preferable to a 20% performance boost (5:39). Just going blindly with the same preset name across different upscalers and only looking at the quality is going to give you a skewed image of the value they provide. At 4K, which is largely what the video is aimed at and where upscalers are the most relevant, that additional performance can make a big difference.
@@Maxoverpower I don't have an issue with normalizing, it makes sense here. I am just confused as to why they decided to use FSR Balanced in Horizon Forbidden West since, by their own charts, FSR Quality performs as well as DLSS Quality. It's at 4:35: the difference is one frame in the average framerates, and the 1% lows are identical. This is as within the margin of error as possible. Shouldn't that particular example constitute the exact same performance uplift?
Man, not that many people play at 4k so it feels weird to focus so heavily on that resolution. Especially when we know upscaling doesn't struggle that much at higher resolutions. 1440p balanced and 1080p quality would have been a lot more helpful to have learned about.
It's because there's no point in anything anymore; we don't need the kind of resolution and quality the hardware can provide. I'm rocking a 20- or 30-series and it's absolutely fine. So these tests exist just to justify some performance advantage that we won't ever use.
I can easily tell the difference. Even dlss at quality looks bad to me at 4k. I play about 5 ft away from a 65" screen though. DLSS makes everything soft and blurry, fsr has ghosting and a LOT of visual noise and shimmering
@@Dempig I experience the same thing you mentioned: once you see it, it can't be unseen. I would like to have a High Quality setting between Native and Quality; it's just not good enough to switch to DLSS / FSR / XeSS. Graphics are king, especially in solo games like Horizon or Ratchet & Clank, where FPS and reaction matter less compared to games like Call of Duty or other e-sports titles.
I have the 7800 XT. I run most games at native 1440p. If I have to zoom in 200% and reduce the speed to notice the difference in render resolution or individual pixels, then I have a bigger problem than deciding which upscaler is better.
Man, I can confirm that as an AMD user I don't even touch upscaling technology, because we have monsters in games such as the RX 7800 XT and RX 7900 series, so there's no need to even think about upscaling. But 😂 on the other hand, Nvidia 😂 without using upscaling tech they would be equal to Nvidia's last generation 😂 plus they're too expensive 😂
As an Nvidia user, I check all the upscaling techniques available in different games. When fast objects in the foreground reveal grass or trees in the background, the sizzling is so bad that I have to turn it off. Though to be fair, I still haven't experienced an FSR 3.1 game first hand. It's hard to judge the upgrades by watching a YouTube video.
Great deep dive on the comparisons, and I'm glad that the big three are pushing the longevity of cards further. I have noticed the ghosting with FSR 3.1 in HFW but was able to reduce its perceivability by reducing motion blur. Overall, I still feel hesitant to recommend graphics cards to friends based on image upscaling features, because not everyone is willing to test to find the most optimized settings possible. With the current state and quality of implementations, it's still a nice-to-have but not a must-have. This may change as it gets implemented across more titles.
Comparing DLSS Quality vs FSR Balanced is meh... a 5% difference in FPS is not important. AMD owners mostly use FSR Quality in 90% of cases and don't care about FSR Balanced.
@@imo098765 Yeah, but the reason why people use FSR/DLSS instead of just manually setting resolution to 720p is to get an fps improvement without losing too much quality.
@@KrisDee1981 idk, maybe I'm blind, but it looks really good to me. I give it an 8/10. I don't use upscalers very often, only in games that are kind of hard to run. I have a 6800 XT.
DLSS also has different preset options you can swap between using DLSSTweaks. It's a great way to tune out ghosting or to increase the softness. In Death Stranding DC, DLSS 3.7.10 favors preset "C" due to the decreased ghosting, even though for most games I've tested or seen, preset "E" is considered the most performant and is the default. I'm not sure if either FSR or XeSS has preset options, but it's another layer to team green's cake that I enjoy.
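For anyone curious what that looks like in practice: DLSSTweaks is driven by an ini file placed next to the game executable. A rough sketch of a preset override follows, with section and key names recalled from memory and the executable name purely illustrative; verify everything against the template ini that ships with the tool before copying:

```ini
; Hypothetical dlsstweaks.ini fragment (names are assumptions, check
; the template file bundled with DLSSTweaks).
[DLSSPresets]
; Force render preset C for every game (less ghosting in some titles):
Global = C
; Or override per executable (illustrative name):
;ds.exe = C
```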
I have no clue at this point how the stuff is being tested, or any feeling that it is fair, because DLSS only runs on an Nvidia card (so any testing of it is exclusively on Nvidia). Is FSR in the same game then also exclusively run on a Radeon 7900 XTX, since as we know by now it will show the best it can do on a Radeon card? And is XeSS run at the same time on the best Xe Intel GPU, as it runs best there? Or is Nvidia being given the best of both worlds, with everything run side by side on an RTX 4090?
At around 17:45, the Ghost of Tsushima walking animation: see the jumping around in the DLSS version of his head, especially noticeable in the feather, compared to the far more flowing look from both FSR and XeSS. This is the problem of making the baseline 'wot DLSS duz' instead of "no upscaling". He's peering intently at both XeSS and FSR to find what flaws he can see, and thereby ignoring DLSS here. It's not a widespread problem for DLSS use, but that is Tim's problem with his focus on how DLSS has to be best.
I bet that's just an animation glitch in the game, nothing to do with DLSS being the upscaler. You can see he briefly stops walking to reset the animation, and the movement becomes smooth again.
It's good, but it comes with several caveats. One is you need to use a frame cap like RTSS to ensure consistent smoothness. Another is that its frame gen introduces a slight input lag, which is quite noticeable in fast-paced FPS shooters. That being said, if you play the game using a controller, the delay from the controller's input masks the frame gen's input lag. What I love about Lossless Scaling is that it works on pretty much any media as long as you know how to use it. You can watch movies at a much higher FPS than the native 25fps. I also used FSR to make old blurry videos sharper.
Great review! Can't say I can see much difference between the three technologies now, which is nice. I only use FSR on the laptop for select games, so hoping this update can improve that experience a bit 😀
I would love to see 4K native next to the three. Also, I wonder how much of a problem some of the FSR 3.1 issues really are, since I don't look at the monitor at 300% zoom and I don't play at 25% speed. I know you've added all that for the sake of the video; I'm not being harsh on you.
You're watching a compressed YouTube video here, which subdues most upscaling artefacts by turning them into compression artefacts. Zooming in is basically necessary to SHOW you how that would look on uncompressed footage (i.e. your display's output).
I just finished playing Ghost of Tsushima at 4K with FSR enabled. During the whole 60 hours of gameplay, I experienced ghosting more than once per hour. Usually it was a relatively subtle artifact that disappeared in around 1 second. Sometimes, around 10 times in total over the 60 hours, the artifact was severe enough that I had to stop playing for a couple of seconds because it interfered with my vision. I knew it was caused by FSR, but it didn't bother me enough to turn it off. The game looks spectacular anyway.
They still do. Except now you can use mid range cards like a 4070 or 7800xt to play at 4k with decent frames. I'd sooner do that using up-scaling rather than play 1440p native.
I game with an RX 7800 XT @ 1440p and never use upscaling. Upscaling, IMHO, is for lower tier hardware like laptops, handhelds and maybe entry level GPUs (if those still exist)
@@adi6293 I did the same and it was a great jump, but I still like the polish of Nvidia products when it comes to drivers and programs. I'll probably go back to Nvidia when the 6000 series comes out, or whatever is after this year's 5000 series.
Comparing the upscalers using different internal resolutions is wrong. The tests should be DLSS/FSR at Quality and XeSS at Ultra Quality. I understand doing tests to see how they all perform, but comparing image quality with DLSS at Quality and FSR at Balanced is really dumb. In the Horizon test, XeSS Performance is like 720p to 4K, while FSR is 1270p and DLSS is 1440p. How did you guys think this is how the comparisons should be done? I'm sorry, it's very stupid.
It could make some sense if all he tested was performance, but making a video about image quality improvements (this time about FSR, but next it will be either of the other two) and then comparing the techs using different base resolutions is just plain dumb.
They should also be locking the framerate to 60, for example. These are temporal technologies: the less difference there is between frames, the better picture quality you get.
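The reason this matters is that temporal upscalers blend each new frame into an accumulated history buffer, so the smaller the frame-to-frame change, the more of that history survives reprojection. A toy sketch of the exponential accumulation idea (not any vendor's actual algorithm; the function name and numbers are illustrative):

```python
import random

def accumulate(history, sample, alpha=0.1):
    """Blend a new sample into the running history.

    alpha is the weight of the new frame: a lower alpha keeps more
    history, which converges to a cleaner image while the scene is
    stable, but ghosts more once it moves.
    """
    return (1 - alpha) * history + alpha * sample

# Static scene: noisy samples around a true value of 0.5 converge
# toward it as history accumulates over many frames.
random.seed(0)
h = 0.0
for _ in range(200):
    h = accumulate(h, 0.5 + random.uniform(-0.2, 0.2))
print(round(h, 2))  # settles close to 0.5
```

Any motion (or an unstable framerate) invalidates part of the history, which is exactly when shimmering and ghosting become visible.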
Well, he's doing it because he's comparing them at equal levels of performance. It makes sense in a way, and he did also show the same settings as well, so why complain? Lol
@@pedro.alcatra Yes, but the FPS range used to decide the lower matchups was strange. Next time, a clear hysteresis of ±3% or fixed range intervals, targeting 1% low FPS instead of pure average FPS, since games are very variable.
What exactly would you say is a "night and day" difference here? 11:10 Good video, but you missed some issues, like the newfound moiré pattern issue in Ratchet & Clank with FSR 3.1, the aliasing on Clank when he's a backpack, or the disappearing confetti. I also think all the testing should've been done at lower resolutions than 4K to highlight the differences.
It's very pleasant to see that the universally compatible offerings from red and blue are greatly improved compared to their initial versions. I believe that everyone's problem would be resolved if AMD released an XMX-equivalent version of FSR, like Intel did to please their owners. Although my gut feeling is that they may first release something that runs on their shiny new NPUs. I really hope that's not the case and that they keep providing updates to FSR 3 that will make the 7000 series more compelling, at least in terms of upscaled visual quality.
We all know DLSS looks the best, but when you have an older GPU like my 980 Ti, FSR is what keeps it playable in 2024. I have RTG to thank for that, not Nvidia.
You are using a 9-year-old card, what do you expect? I mean, there are lots of better GPUs for a very small price; a used RX 6700 XT kills and spits on your GPU.
@@kevboard I actually have to do both to get it to run Cyberpunk at 1440p @ 60. It is not a matter of which would be a better experience, it's whether I get an experience at all.
@@PeterPauls I wasn't expecting it to run newer games at all. But with the help of FSR, now it does (not excellently, but at least acceptably). The Maxwell card is a backup card I have to use for now; I will get a replacement soon.
@@veda9151 I used a GT 1030 as a backup card and I was still able to game. I understand you now, but you didn't write that in your original comment. What will your next GPU be?
Good, competition is what keeps things moving. I'm aware that it is the illusion of competition but it's still movement regardless. I find I catch a ball more if my body is moving, if I freeze in anticipation to catch the ball, the ball will bounce off my hands. Thereby, innovation happens when there is movement and it stumbles when things are stagnant. It is better to make a mistake than to make nothing at all, with forgiveness a -1 can be converted through 0 to become a +1. Thereby, even if the technology is not good, at least we learned what we do not like and thereby towards the inverse is what we like.
The closest you're going to get to native AA in most of these games is DLAA; native plus temporal anti-aliasing is essentially native, as the temporal AA doesn't really incur a significant cost.

What frustrates me is looking at any of these solutions as performance enhancements. If anything, that's a bonus, as the real benefit of machine learning super sampling is that it does anti-aliasing in motion without many of the significant drawbacks of traditional TXAA, or temporal AA in general. This is why FSR is such a joke as far as I'm concerned: depending on the resolution you're upsampling from, you end up with worse performance in motion than traditional TXAA, depending on how it has been implemented. And on top of that, the absolutely brilliant contrast adaptive sharpening they have, which is usually the perfect solution for TAA blurriness, is already integrated into the package, and usually in a terrible way.

I used to have issues with DLSS, but what I didn't realize was that in-game LOD settings were being hyper-sharpened so that you could see them in all their blurry glory, not dissimilar to how TXAA is implemented in a lot of "best practices", which is silly. Nothing feels like gaslighting quite like the entire world being blurry until you stop and look at it. Once I fixed the LOD settings, the clarity in motion with transparencies and particle effects was like going back to the old days. Throw a light ReShade of CAS on top of DLSS or DLAA and you get about as close as a modern game can get to the clarity we used to have as normal, depending on whether you can fix the LOD settings, lol, as they are usually trash. I swear, developers... I don't think they have very good eyes lol.
Thanks for keeping up with the comparisons between the upscaling technologies it's extremely helpful when determining what to use and when. I would love to see an image quality comparison with XeSS in XMX mode to see what the differences are though I imagine it's best to wait for Battlemage to do that.
I tried both DLSS and FSR in Dragon's Dogma 2 with my 3070 Ti. DLSS had a lot of issues with shadow draw, creating a fuzzy, blurry mess where a shadow should be. FSR, by contrast, held no such issue for me. First instance of FSR being objectively better than DLSS for me.
It's honestly very impressive how good DLSS is compared to when it first came out and looked horrible. It's not even a thought now whether I should have it on or not. At 4K Quality mode I can't tell the difference anymore.
FSR seems to try to preserve a bit more detail, but the result is noisier. DLSS is better at hiding that noise, but at the cost of making some areas blurrier. If you pause, you can see that difference, but in motion and without zooming in, it's virtually unnoticeable. I don't think reviewers should be focusing so much on 2D fakery (spatial and temporal interpolation) anyway. Manufacturers seem to have successfully diverted attention to that, and away from the fact that the current generation is overpriced and barely any faster than the previous one at *actually rendering 3D scenes.* Likewise, in a lot of games the "raytracing" option seems to be just a switch that makes Nvidia cards slightly slower and other manufacturers' cards a lot slower, to change the "winner" of a benchmark with barely any noticeable change in image quality (in some games, RT actually makes shadows look a lot worse). Who wants to use 2x the amount of power, produce more heat, more fan noise, and _lose_ some FPS in exchange for (supposedly) more accurate reflections on irregular surfaces, that don't even look _nicer?_ And reviewers / journalists keep falling for it, and publishing two, three, or sometimes even more versions of the _same_ benchmark, which doesn't really help anyone except the manufacturers, by diverting attention from the lack of real 3D rendering performance improvements, considering the increase in price.
@@iurigrang Light probes are automatically placed at a median distance from each other, then the actual shading samples from them. Light bleed and artifacts in normal raster come from these, and later someone has to manually reposition probes to fix it. The lower production cost of ray tracing comes from not needing that manual intervention. But one could use RT results to automatically fix light probes offline, and then the game at runtime would have better lighting without realtime RT.
Great breakdown. I thought it looked decent in my testing but then again I'm not pausing and zooming in 3X to look for issues. Though, I have a 3080 so I'll continue to use DLSS unless it's not an option in a game.
Glad FSR is making good progress. This tech works on virtually every (gaming) GPU, so it's already leagues better by giving almost everyone this tech, mate. Even if it looks less good in my opinion, it's already a more valuable tech to further develop, not some gimmick only working on the latest hardware. DLSS 4 will probably work only on RTX 50, because Nvidia designs tech this way just to create an incentive to buy their newest GPUs. Same story with the ridiculous VRAM amounts on GPUs costing north of 800 bucks: 16 GB on an 1100-euro GPU is a disgrace... CP2077 already (almost) uses that amount at 4K with all settings maxed. No matter how fast the VRAM is, if it's not enough you get stutters and low FPS.

So let's focus on things that help gamers as a whole instead of bashing tech that gets better and better and is helping the entire (PC) gaming scene! It's just getting a tad boring, most channels bashing FSR/XeSS. You're paying 100-300 bucks more for DLSS, so you should EXPECT it to be better. At the same price point with AMD you get 2 to 8 GB more VRAM, and the drivers and performance are actually really competitive.
Nice comparisons; it would have been nice to see a bit more side-by-side of FSR 2.2 vs 3.1. A shame to still see so much ghosting and shimmering in 3.1. Hopefully they keep improving and get it implemented more on consoles. Will be curious to see if PSSR catches on in the PS5 Pro.
Considering that Nvidia has at least 80% of the market, and the fact that you get DLSS on 20-series cards over 6 years old, the majority of people already have access to the best upscaling method anyway.
@@initialfd-3716 Oh, I agree. Their frame gen is actually pretty close, and it's also much easier to develop than upscaling. But natively it's only in so few games. There's a mod to swap it in from DLSS FG, but most games I've tested had UI or physics issues.
IMO you should ignore the upscalers' naming and just compare them across base resolutions or percentages of screen resolution. Then you compare image quality: the ability of the upscaler to upscale or even enhance the image. You also get the performance of each upscaler as bonus info.
You should've reviewed every technique at the same or closest possible internal resolution. This benchmark isn't good because we aren't really getting a fair comparison of quality, but instead of performance-normalized quality.
So through the entire video we get a categorization based on nothing but framerates, and the "win" is assigned to GPU X for having the bigger percentage uplift (even if it has fewer frames), with no word on what the presets actually look like to the human eye when compared. So card X has to use "Performance" to match card Y's "Balanced", but what if Performance still looks better AND yields a higher frame rate? Just slap graphs and numbers on screen with no visual representation. That'll get some views.
GeForce owners get access to the best upscaler, until their GPU stops being supported by the latest DLSS version. While DLSS 3 later got updated to bring improvements to RTX 2000 and 3000 cards, the RTX 4000 launch showed it may be only a matter of time until Nvidia discontinues DLSS support on older generations. I'd probably welcome it if AMD and Intel just combined their efforts into an "everyone" upscaler, although I don't think they will ever achieve DLSS levels without using dedicated cores as Nvidia does.
You're confusing things here. The temporal upscaler is compatible with any RTX GPU, so is ray reconstruction. The only part of DLSS that isn't supported on all RTX GPU's is Frame Generation because the Optical Flow Accelerators in past generations isn't up to the task (which isn't an uncommon thing to occur really, there's lots of beta hardware in GPU's for not yet fully functioning purposes).
No, I am aware. But when the RTX 4000 series released with DLSS 3 alongside it, it was not compatible with older cards for a time (for the stated reason, and because Nvidia bundled everything together). It took them about a year to make the improvements that DLSS 3 brought to upscaling available to the earlier generations of RTX cards (while GTX cards were always left out, even when they were rather recent products). Both things show that Nvidia is willing to leave recent customers in the dust. If you watched their latest Computex show, their interest in gaming customers seems to be at a low anyway.
@Xzavn When DLSS 2.X got renamed to DLSS 3.X the upscaler didn't get any improvements. It just added the Frame Generation & renamed the thing as a whole (a terrible decision though, I agree). Everyone is able to run the exact same upscaler regardless of RTX GPU generation so far.
@@Xzavn Even RTX 2000 cards support the latest DLSS 3.7. They just don't support frame gen. They should have just made Frame Gen a separate feature name.
I understand choosing the setting that gives the same type of uplift, but realistically speaking, I think a comparison at similar fps would also have been worth doing. Someone with a 4070 and someone with a 7800 XT playing the same game will probably go with whatever gives them acceptable fps: if both cards get 100 fps at the quality setting, both players would pick that setting even if DLSS's uplift is a higher percentage, so it's a more realistic comparison. There's value in comparing at the same uplift, but a real-life comparison should also be based on similar fps. (I know you mentioned it would be too many combinations, but I think that's the scenario people actually use: they choose based on fps, not on percentage of performance uplift.)
You should switch back to Quality vs Quality comparisons, with the same input resolutions for each technique. The performance differences are interesting, but they shouldn't take center stage. If you use any of these upscalers you want more performance at image quality similar to native, and all of them deliver on that front.
I remember I used to use XeSS in CoD because FSR and DLSS had this annoying artifacting in the waiting lobby. It wasn't noticeable in game, but it used to drive me crazy.
I would have liked a native comparison in Quality, because sometimes it's hard to tell what the ground truth is supposed to look like (especially when you don't know the game). I guess you could argue that AI upscaling can look better than native, solving aliasing problems etc., and if it looks good, why would it matter how native looks? But still.
There is no ground truth in games anymore. All modern engines use some form of temporal anti-aliasing and image reconstruction as default because the image would look awful without it. Just look at RDR2 on PC, which does allow disabling all forms of AA, and the image is absolutely horrific because of all the aliasing. You are always comparing one form of TAAU solution with another and we already know DLSS and XeSS are currently the best version available as they can reconstruct detail that even the default AA cannot.
I second this. Like, I understand it from a "scientific" perspective; they want to know just how much better the methods are at improving performance. But for gamers this is an irrelevant stat. All that matters is that the resulting frame rate is satisfactory for the consumer.
FSR 3.1 is much better now; DLSS is in real trouble. DLSS is still better, but you have to pay for it, while FSR is free, so I'll give it to FSR. Thank you AMD. Thanks for the video. You rock.
If you watched this without audio, and just watched, you'd come away thinking that FSR has pretty much achieved parity with DLSS. But if you only listened to the audio, you'd think FSR isn't even close. Either my vision is going early, or what's-his-face has some personal preference influencing the interpretation. I say this having bought 6 Nvidia and 1 AMD GPU in my life. But I also have never bothered using upscaling. If I did, from just watching this video I would think FSR was close enough and go with the cheaper option. But again, if I only listened to it, I'd think, well, better not take a chance on FSR.
YouTube compression kills a lot of fine detail. I've owned both GeForce and Radeon cards with no loyalty to brands. In my opinion I'm just not interested in any upscaling below 4K; it just doesn't look good. But if you do have that native 4K image to source from, DLSS is better, mainly due to the ghosting from FSR. In some games the ghosting is quite bad.
And I think that's the best option. If you can buy a card that plays games at the framerate you want at your desired resolution, none of this should be necessary. By the time it might be, it's likely all of these features will have reached parity.
@@Chasm9 Thanks! 2 questions: you wrote Ultra Quality twice (1.3x and 1.5x) - what's the difference between them? And secondly: what are the scaling factors of DLSS Quality and FSR Quality? Ty, my wholesome potato in shining armour :D
Great video as always. Would have appreciated some handheld resolutions (800p, 1080p) given the increased issues seen at 1440p. My main issue with FSR 2 was the pixelation on foliage, so it's great to see the improvements.
No one cares; they'll never do a proper scientific method to prove which is better. They're just going to keep doing this because it's all everyone cares about.
I hope next gen Intel GPUs will also join these tests (with Intel CPU, to take advantage of the "combo" mode). Just hearing that XeSS is a good option in some titles on AMD graphics would have been crazy 2 years ago.
Hi, thanks for the effort. I have a question: I still have a GTX 1080 Ti, and in this case, is it better to use FSR Balanced rather than Quality if I want to get the best out of upscaling?
Use the highest FSR setting, which is FSR Quality. Or use XeSS. You should only choose a lower FSR setting if you want more fps at the cost of lower image quality.
Tried FSR 3.1 with Spider-Man on Steam Deck, it actually worked pretty well, just have to manually lock GPU at 1600MHz, and have a 60+fps frame rate cap
This is an upscaler that's constantly making progress and it's free. Not too long ago we were under the impression that Ampere and previous gen were not able to execute framegen. AMD exposed that. Now we have blockbuster titles that can utilize a mix of DLSS with FSR 3.1. I have friends with Turing and Pascal technology that are seeing their cards get a second wind which is awesome. On an OLED screen it still looks damn good with old tech.
I was surprised to see how well the 4070 fares against the 7800 XT in those cases when upscaling (DLSS and FSR respectively) is in use. The 4070 is a pretty competent GPU after all. If only its initial MSRP had been $500, it would be a decent GPU.
Oh yeah, most of Nvidia's cards are really great if we could just cut the pricing by like 30% lol I think most people would agree that pricing is the worst thing about GeForce at this point. Unfortunate
@@TheDarksideFNothing The pricing is the reason why I changed my 1080ti for a 6900xt when that dropped in price. I've used Nvidia since the 8800GTX but that current pricing is just... Fucking nuts.
@@thrafkroos I doubt any major changes will happen in 3 years. Next gen consoles will surely increase VRAM usage in games significantly. But they will not launch in 3 years, more like in 4-5 years. But anyway in 3 years games will become much more demanding in terms of raw GPU power. By that time the owners of 4070 S will be lucky to run modern titles at 1080p/High settings. Thus 12GB will still be fine. Look at Hellblade 2, Banishers and Robocop - they use just about 8GB at 1440p. So 4070 Super runs out of power way before VRAM capacity becomes an issue. Since UE5 is the most popular game engine, most games will behave the same way. Only a small bunch of future console ports may become a problem for 12GB cards.
With gaming for the masses, the change in picture detail from 2013's Far Cry 3 to 2021's Cyberpunk 2077 is night and day. Back in 2013 my $150 GPU would do about 30 fps at medium graphics at 900p and looked realistic, but my $350 GPU (RX 5700) running Cyberpunk at custom graphics (close to high) at 1080p gets me 60+ fps, and with upscaling from 1080p to 1440p via FSR 2.1 close to 90 fps at the same settings, and it looks incredible compared to FC6 or Ghost Recon Wildlands or any other game. So upscaling really was a huge improvement and has allowed older midrange cards to enjoy much higher FPS and/or resolution output. I personally don't notice minor imperfections, but watching the benchmarks in Far Cry 6 and GRW, the background scenery (trees/grasses) shimmers, which is annoying, and isn't rendered clearly enough in those titles. So the base game seems to have more influence on the final picture than the upscaler's ability, or maybe you just need to set graphics to Ultra to make the blades of grass look more realistic and the GPU can't handle that level of output (noticeable in GRW). The things that catch your eye are the flaws: a stationary wall that glitches is noticeable, while distant background features in the heat of battle aren't even observable, except when the action ends and you stare at them. Like others here have said, I can hardly identify the flaws; they have to be revealed by slowing time and zooming in, and even then I find them hard to spot. It's always been the case that perfectionists who want the best will get a slightly better image and slightly better frame rates if they're willing to spend double (or more) on their gaming rig.
Will you please always check the specs of the machines you use: while the 4070 is nearer the type of GPU most people use, only about 5% (a guess) use a 4K monitor for gaming. This video gives little info on whether the 4K monitor has an 'it matters' impact on the image compared to a 1440p or 1080p monitor.
4070 actually isn't close to what "most" people use. Most people are a couple generations behind, and the 60 class cards have always been more popular.
Thanks for the upscaler update! Hopefully AMD will update FSR more often now that it's compartmentalized. I don't know how much they can do without using deep learning like Intel and Nvidia, but it's nice to see they improved FSR. Hopefully they can get the ghosting under control, since it was better in FSR 2.2.
So for 1440p you either want to use DLSS (if the game has it) or stay native instead of using FSR, because it's way too blurry and grainy and has ghosting issues.
To be fair, if you're using a 4070/7800xt you don't even need upscaling for 1440p, just use optimized settings instead of automatically setting it to max settings.
One of two things: AMD and Intel either need an equal upscaler, or they need to take a stand that RT is useless tech and go all in on rasterized performance.
Got a mid-range card; when I played CP it was enabled by default. After some hours of gameplay I decided to turn it off, and man, even with half the fps the game was better than with upscaling...
I haven't used them yet, because I still use an older GPU, a Vega 64, which is still roughly on par with an RX 7600, and that's at 1080p, where upscalers have little use. 1080p is still the most used resolution, so this whole technology is still pretty niche. What I would like to use, though, is DLAA or FSR Native AA, but I can't, because I don't have a newer GPU and very few games support it.
DLSS is often better than native TAA, so in those cases it would be foolish to not use it. I'm using a Mod in FO4 to enable DLSS in DLAA mode to get rid of TAA blur.
@@Littleandr0idman I can definitely see myself never touching it if I didn't have an Nvidia GPU although Intel's solution is getting pretty good even if it doesn't provide the same boost in performance.
Summary if you don't want to read this entire text: AMD should stop making technologies for all users and should copy Nvidia in that sense. AMD should stop trying to be "the brand for gamers" and be more like Nvidia with its technologies. People don't care that AMD sells itself as a friendly brand; they're not going to sell graphics cards that way, and the mere existence of Nvidia and its 88% of the market proves my point. Everyone prefers Nvidia, and AMD can't keep trying to please Nvidia users by making its technologies work for everyone. It has to make a technology that competes with Nvidia, and if most of the time spent developing that technology is wasted on making it usable for everyone, AMD will always be left behind. P.S.: I'm a fan of everything AMD does.
I have bad eyesight anyway; I could not spot any differences in what you showed here. For me, price-performance is more important than an expensive feature set.
Same. It's hard to spot much of a difference at equivalent quality settings while the game is running at full speed. It's pretty obvious sometimes when paused or in slow motion, but that's not going to affect gameplay.
Because we don't notice during gameplay; that's why the slow motion and the zoom. These types of videos are just for views, because in reality, playing a game normally, it all looks the same.
Pretty happy with FSR 3.1 on Horizon Forbidden West. Got a new 7900 GRE and on 1440p ultrawide I get about 100-120 FPS with it. Used settings recommended in some Reddit post, looks fantastic and it's smooth. Balanced setting.
I don't understand why compare IQ of DLSS Quality, FSR balanced and XeSS performance? Can someone fill me in? I know that XeSS changed the render resolutions in the latest update. But XeSS performance does not equate to DLSS Quality (67%) and that does not equate to FSR Balanced (58%), unless I'm missing something 🤔 Edit: ah, it was judged by the relative performance output of each technique on a given GPU? I don't think this is a valid reason to benchmark IQ, though.
We use any of these 3 techniques to increase performance (not including Ray Reconstruction, which improves graphics too). If you get 70 fps with DLSS Quality but 50 fps with FSR Quality, is that a fair comparison?
@@imo098765 well, you are judging the upscaler that is upscaling from a given resolution. You're not judging performance side of things which can widely differ from PC to PC (different CPUs come to mind, as you are more apt to be CPU limited when upscaling since you are literally rendering the game at lower res; different GPUs respond differently to different upscalers, etc). So performance side of things should go out the window - you are judging IQ. To give each technique a fair shot, I'd say it's only fair to test, say, 4K with DLSS 67%, FSR 67% and XeSS 67%. I don't really understand why performance is relevant in an image quality test.
@@Chasm9 Yes, and the performance is the reason why you turn the upscaler on. If FSR needs a lower internal resolution to match DLSS's increase in fps, then so be it. It's objectively less efficient.
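For anyone lost in the percentages this thread keeps throwing around, here's a small sketch of how the preset scale factors map to internal render resolutions. The DLSS/FSR divisors (Quality 1.5x, Balanced ~1.7x, Performance 2.0x) match the vendors' commonly published defaults; the XeSS 1.3 values are my reading of Intel's revised ratios and should be treated as assumptions.

```python
# Per-axis render-scale divisors per preset. DLSS/FSR values are the widely
# documented defaults; the "xess_1_3" entries are assumed from Intel's 1.3
# release notes and may not match every game's implementation.
PRESET_DIVISORS = {
    "dlss_fsr": {"quality": 1.5, "balanced": 1.7, "performance": 2.0},
    "xess_1_3": {"ultra_quality": 1.5, "quality": 1.7,
                 "balanced": 2.0, "performance": 2.3},
}

def render_resolution(out_w: int, out_h: int, divisor: float) -> tuple[int, int]:
    """Internal resolution the GPU actually renders before upscaling."""
    return round(out_w / divisor), round(out_h / divisor)

# At 4K output, DLSS/FSR Quality renders internally at 2560x1440:
# ~67% per axis, but only ~44% of the total pixels.
w, h = render_resolution(3840, 2160, PRESET_DIVISORS["dlss_fsr"]["quality"])
print(w, h)  # 2560 1440
```

This is also why "67% vs 58%" matters more than it sounds: per-axis percentages compound, so FSR Balanced at 4K renders roughly 34% of the output pixels versus DLSS Quality's 44%.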
Since there is a performance difference when FSR 3.1 is used on Nvidia GPUs, there may be a quality difference in there as well. If you haven't already, it might be worth looking into.
so @ 24:59 you SAY the tests are run in PERFORMANCE mode, but the SCREEN shows everything at QUALITY mode... PLUS ALL the previous tests in this section of the video have been run in QUALITY mode... so why would this one suddenly be in PERFORMANCE??? - So WHICH ONE is it??? QUALITY or PERFORMANCE???!?! (I'm going with Quality :D )