I played it at launch, and for ages it had this issue: you needed to edit the ini or it would silently render at half your resolution. I imagine it's fixed by now, but I can't say because I haven't revisited it (unfortunately, due to the unavoidable shader compilation stutter).
Thank you SO MUCH for this! Most tech channels stopped caring about budget-related stuff, so videos like this are very much needed. Please, if possible, dedicate one video each to FSR and Intel XeSS at 1080p. For budget gamers it's more useful than ever, as the $100-$200 GPU segment continues to be ignored by both AMD and Nvidia (and Intel needs ReBAR, which is an unreasonable requirement at this price range).
1080p doesn't mean budget. I have a 4070 Ti and play at 1080p, firstly because my old monitor is still fine, and secondly because I like to play games at high fps while keeping high details.
Stop simping. Sub-$200 is ignored only by Nvidia, not AMD. Nvidia offers the 3050 6GB under $200, while AMD has the 6600 under $200, which slaughters even the regular 3050.
A strategy people use with DLSS at 1080p is the DLDSR feature in the Nvidia drivers, which uses tensor cores to filter a non-integer downscale. Supersampling from 1620p down to 1080p with DLDSR reportedly produces comparable results to supersampling from 4K with the traditional Gaussian filtering used in DSR. So you could render at, say, 1440p with DLDSR and then use DLSS Performance (which is 720p internally); after the upscale, it downscales to the monitor's 1080p. It performs a little worse than DLSS Quality at 1080p, but it's still faster than native TAA rendering and looks better than 1080p Quality DLSS. Unfortunately, because there is no official way to manually change the resolution scale DLSS uses (instead of the predefined presets like Quality, Balanced and so on), this is the best way to utilize the tensor cores for better image quality and fluidity. It would be really awesome if you could also test this configuration, or maybe something like DLSSTweaks, which allows custom resolution scales. Edit: I tweaked the .ini config file in The Talos Principle 2 and changed the internal resolution scale used by DLSS (this could be done with other upscalers like XeSS or TSR too), setting it to 83% instead of the 66.67% used in Quality mode. It looked a lot better and still performed better than DLAA.
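For reference, the internal-resolution arithmetic in this comment can be sketched with the commonly cited DLSS preset scale factors (the 83% entry mirrors the ini tweak described above; treat the exact factors as approximations, not official figures):

```python
# Rough sketch of DLSS internal render resolutions at a given output
# resolution. Scale factors are the commonly cited DLSS preset ratios;
# the 0.83 entry mirrors the custom ini tweak mentioned above.

DLSS_PRESETS = {
    "Quality": 0.6667,            # e.g. 1080p output -> 720p internal
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.3333,
    "Custom 83% (ini tweak)": 0.83,
}

def internal_res(width, height, scale):
    """Internal render resolution for a given output res and scale factor."""
    return round(width * scale), round(height * scale)

if __name__ == "__main__":
    # DLDSR 1440p target on a 1080p monitor, then DLSS Performance:
    # renders at 720p internally, upscales to 1440p, downscales to 1080p.
    for name, scale in DLSS_PRESETS.items():
        w, h = internal_res(2560, 1440, scale)
        print(f"{name:>24}: {w}x{h} internal at 1440p output")
```

This is also why 1440p DLDSR + Performance and plain 1080p Quality both land on a 720p internal render.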
Great comment, but it kinda proves why I'd rather just game on my consoles if I can get 1440p @ 60fps. The endless tweaking just isn't fun for me anymore (7900 XT owner).
@@rocstarang5747 Yep, definitely understandable. Everything on PC is so customizable that it always draws you into a huge loop of squeezing out better performance or visuals, whereas with consoles you can have peace of mind: you don't bear the burden of optimising the game, you know it isn't on you, and you couldn't do much even if you wanted to.
Amazing to see a video about 1080p! I just upgraded from a 1050 Ti to a 3050 and mainly play at 1080p. While I imagine most of the subscribers/watchers of your channel are using higher-end cards, there are a lot of us little guys around (just look at the Steam survey). Again, thanks for the content!
You're happy doing two things wrong. Good for you! For the price of a 3050 you could have a 1440p display plus a better GPU, for a little less or a little more money depending on the market... The usual interpretation of the Steam survey results is a joke: 11% below FHD, 59% FHD, 30% multi-display or bigger than FHD (1 in 6 Steam gamers plays at 2560x1440, 16% of the total)... At least tell me you're on a 24" display? Or a 32" TV?
@@Mr_Bloodjack Make the effort of connecting your 4070 Ti to a 4K TV with at least an HDMI 2.0 cable and v-sync at 60Hz (if you're lucky, your TV has FreeSync support up to 120Hz), and you will flip out over the missing pixels you were not able to see with your current setup... Power-wise you are wasting electricity, because the most efficient way of playing 1080p 60Hz is a 4060/4060 Ti (or an even cheaper RX 6600)... Also search for DLAA and render downsampling: in GTA V, for example, you can put everything at ultra at 1080p, then go to internal resolution scaling and set it to maximum. You won't lose FPS, but the image quality improvements will be over the moon (the bigger the display, the better)...
@@TheMrRadish I'm still willing to bet the majority (50% + 1) of RTX card users are still on 1080p panels. A lot of people would rather max out fps instead of resolution, not to mention the upgrade cost.
I do tend to use DLSS at 1080p simply to get extra frames, and I guess I'm just not as bothered by the small shortcomings as you are. That said I do wish more games would add a DLSS sharpening slider because as you stated some games are just much blurrier than they should be, and a little sharpening helps a lot even though it does tend to add a few more artifacts. CP2077 at 1080p Quality DLSS with around 35% sharpening is the best balance for visuals and performance for my taste.
If I don't see the native and DLSS comparison side by side, I can't even tell the difference, and if you run a 24-inch monitor like I do, you can't notice it.
@@BladeCrew Oh, you ABSOLUTELY can tell in person. YT compression favors DLSS here because everything looks like dogshit anyway, but in person you can absolutely tell.
Why not just use CAS sharpening with Reshade? DLSS sharpening isn't anything better, actually probably worse. And as far as I know newer versions of DLSS just ditched sharpening completely, which is great in my opinion. Older versions had it forced even when at 0 and looked way worse.
I agree. I think this video kind of glosses over the fact that it's used to hit those frames and to enjoy a smooth/er experience (particularly on newer titles) which is far more important than the odd image hiccup or slight blurriness, which, if I'm frank, isn't noticeable when playing - to me, anyway. It's a huge difference playing at an average of 40-50fps vs 80-90fps, for example.
@@HyperScorpio8688 Nice try, AMD dickrider. I play every single game with DLSS Quality on my 1080p 360Hz monitor simply to get more frames, and DLSS Quality looks really good (can't tell the difference).
These comparisons will become ever more complicated in the future as some engines ship their own TAA-based accumulators and upscalers, like UE5 with its TSR, which by default completely replaces pure TAA. And new versions of UE5 sometimes change how it works, too.
@@PaulC-xv4zr DLAA is very performance heavy while DLSS is purely performance focused. BUT, in some cases with bad TAA implementation DLSS in quality mode can be just fine and even add more details on top of native res image.
Cyberpunk recently added DRS DLSS, or dynamic DLSS, where the internal resolution scales to match the desired FPS target. On my 1440p monitor with the target FPS set to 60, I was consistently running at an internal resolution between 75-100%, whereas DLSS Quality sets the internal resolution to 66%.
Thanks for your comment; any chance you could explain what you mean exactly? As in, DLSS sets the internal resolution to 66% in Quality mode, but you also use it in tandem with dynamic resolution? I'm already aware of what the two technologies are, but I'm a bit confused by your comment. I struggle with articulating my thoughts, but basically: are you running both graphics options at the same time, or do you choose one or the other?
@@yourlocalhuman3526 Think of how consoles use FSR with a target FPS of 60. The DRS works within the upscaler, rather than having static resolution presets like "Quality" or "Performance". I just uploaded a video testing this if you want to check it out.
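The DRS idea described in this thread can be sketched as a small control loop: nudge the internal resolution scale until the measured frame time converges on the target. This is a toy illustration, not CDPR's or any console's actual implementation, and all constants are made up:

```python
# Toy dynamic-resolution-scaling controller: nudges the internal
# resolution scale up or down so the measured frame time converges
# on the target. Constants and clamping range are illustrative only.

def update_scale(scale, frame_time_ms, target_ms=16.7,
                 step=0.05, lo=0.50, hi=1.00):
    """Return a new resolution scale based on the last frame time."""
    if frame_time_ms > target_ms * 1.05:      # too slow: render fewer pixels
        scale -= step
    elif frame_time_ms < target_ms * 0.95:    # headroom: render more pixels
        scale += step
    return min(hi, max(lo, scale))

# Simulate a GPU whose frame time grows with pixel count.
scale = 1.0
for _ in range(30):
    frame_time = 22.0 * scale**2   # fake cost model: 22 ms at 100% scale
    scale = update_scale(scale, frame_time)
print(f"settled near {scale:.2f} scale")   # lands between the lo/hi clamps
```

The upshot is the same as the comment above: instead of a fixed 66% "Quality" render, the scale floats wherever the FPS target allows.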
@@proesterchen Why does TAA get to be "proper" when both are literally rendered natively? Incidentally, I don't think I've ever preferred TAA to DLAA in a game where the latter is an option. DLAA does sometimes leave a few more jaggies, but that's usually because TAA is aggressively wiping detail out of existence. Just compare distant buildings in Spider-Man Remastered: at 1440p the difference is quite shocking.
I'd actually love an IQ comparison between 1080P native and DLDSR 1.78x and 2.25x (with DLAA/TAA and Quality... etc modes) Thanks for the video, looks like DLSS still struggles at 1080p (limits of the rendering resolution too big to overcome)
It's simple: more performance (especially when you're under the 60fps range) is worth a minor loss in image quality, especially since you won't even notice most of it when focusing on gameplay rather than fishing for artifacts and upscaling issues. Also, ideally upscaling would have a percentage slider rather than fixed presets, or at least an Ultra Quality preset, which is especially relevant at lower resolutions.
Yeah, you can set your own internal resolution on each preset per game using DLSSTweaks, but it should have been supported natively by every game that supports DLSS.
Or even better: having a DRS option. Depending on the FPS target, you'd get a higher resolution where you would see details better, and a lower resolution where you won't see a difference because of the gameplay.
One of the often-missed factors when measuring DLSS quality is framerate. As the variation between individual frames becomes smaller on higher frame rates, DLSS will be able to resolve motion much better and as a result provide better upscaled images. I would like to see a comparison that tries to artificially lock FPS to 30-60-120-240 and observe quality of DLSS under these circumstances.
You can't "scale to higher resolutions": if your hardware is 1080p, that's your native display limitation. You can render higher and downscale to native (essentially SSAA). DSR is Nvidia's version of downscaling, with an added Gaussian blur post-processing pass on top of the raw downscale, while DLSS upscales a lower render to your native display. DSR-DL (DLDSR) uses the DLSS machinery to produce the higher render, which is less taxing on your GPU, but make no mistake: it's still downscaling to the display, and a much bigger performance hit than native. As a rough guide, if you can render and display your native res at 144+ FPS, you get 60 FPS with DSR 2.25x DL. A better option these days is DLAA + native rendering.
@@uhurunuru6609 1440p with DLSS Quality versus 1080p with DLAA will have similar performance, but 1440p with DLSS Quality offers better image quality, so there's no point in using 1080p with DLAA.
I'd really prefer it if DLSS and FSR used internal resolution sliders instead of presets, then you could just set the render resolution to like 900p at 1080p so you get ok image quality.
Or, at the least, it'd be great to have an "Ultra Quality" preset, which rendered the image at ~850p. It would probably be the best compromise in terms of image-quality-to-performance ratio. Though there is still a way to use DLSS on a 1080p screen with great results: just enable DLDSR (1.78x), switch the in-game resolution to 1440p and then enable DLSS (Balanced). This way you get better image quality and better performance than at native 1080p. I used these settings pretty often when I had a 3060.
In a blind test, most people can't tell the difference - just like with texture quality between high and ultra (even less so between medium and ultra in fast-paced games), despite the VRAM craze. You guys should invite some casuals and hardcore gamers, set up the same rig twice with the same games (not side by side), have them look and test, and see if they can tell which one is which. Maybe add a third mode too. Also compare texture qualities, but don't tell the BLIND testers what has changed, because if they know it's a texture comparison they'll start pixel-peeping, which isn't what normal people do. Many of the examples in this video can't really be seen in realtime, only when you slow it down, which makes most of it nitpicking, especially when TAA and DLSS each have their own tradeoffs.
Good points. I believe that when you have to zoom in at 2x and do a slow-mo to see the differences, the difference in detail quality won't matter. If DLSS gives superior performance with such hard-to-find differences, then it's better to go with DLSS.
@@Naxxagamer Why bother looking for flaws if 99% of gamers most likely won't spot them, even more so when you can get a 30-60% performance gain from using DLSS?
I've been using DLSS/FSR at 1080p for a long while now. If it's very well implemented, there seems to be no loss of image quality in "Quality" mode at 1080p, and in games with aggressive or badly implemented TAA you can actually get better image clarity, since DLSS/FSR is less blurry. I know Hardware Unboxed has always been very against using it at 1080p, but the vast majority of folk are still gaming at that resolution, and with how horribly demanding/unoptimized AAA titles have become, and with garbage GPU prices being the new standard, 1080p is still the way to go even for mid-range builds around the Nvidia 3060 level - ESPECIALLY for anyone not living in the USA, since in places like Europe hardware can be up to almost double, or at the very least 50% more expensive, versus the USA. We're in an age of gaming where it's rare to see a new triple-A title from a major studio that doesn't need AI upscaling tech, due to how badly most things run natively now. It can easily be argued that folk gaming at 4K or 2K aren't even technically gaming at that resolution: they're rendering at something like 1080p and upscaling to 2K or 4K, and those of us at 1080p are technically gaming at 720p upscaled to 1080p. We've pretty much taken a step backwards in resolution numbers these last several years.
Where are you getting badly implemented TAA? By the way TAA works, it will always cause artifacting. It creates an issue that DLSS, or any other upscaler, usually gets to solve. The best solution is to not use any AA or AF anymore in any game.
My experience is that I can usually get a better result by mixing and matching settings, rather than blanket-sweeping the whole thing with a preset and upscaling.
Thanks, Tim! Love that you're showing some love to 1080p gaming. I think the biggest takeaway is that it's friggin' 2024, a nearly 300-dollar card is struggling to stay between 50 and 80 fps at 1080p native, and we're having to rely heavily on upscaling to get consistent 60+ fps performance in AAA games! I remember getting my 1070 Ti way back when and loving the fact that I could finally get a solid 60fps at 1440p in most games at the time, albeit with a few minor compromises here and there. If you'd told me 6 years ago that 1080p performance in games would be this poor in 2024, I would have laughed in your face. Yet here we are. Sure, we technically have more detail in modern games, but when we have to rely on upscaling wizardry to get any of the benefits of that detail, what's the point? Can't help feeling that the vast majority of average gamers have been done dirty somewhere along the line - unless you have a thousand dollars (or two) to burn, that is.
I mean, I had an RX 6600, which is much cheaper, and I never had any issues with framerates at all - in fact I tried pushing some games to 1440p and it ran really well. Most people I know who game don't only play AAA, or even mostly play AAA releases; most of them have their go-to MP game for chilling, or an MMO, and rarely do I hear people constantly talking about AAA games. (Maybe you and the people you know only play AAA games - with the games costing $60+ each, if you do that, please spend more than $300 on a GPU xD.) But IMO AAA games today, bar some outliers like Cyberpunk, don't actually look that much better than games from a couple of years ago, but boy do they run a lot worse... Is it the AAA studios adding in features that no one would even know were on, gimping performance?
When I got my 660 Ti back in 2012 or so, I could play basically every game at 1080p with maxed settings and never dipped under 60 fps. And the 2-3 afterwards might not've allowed for max settings, but it allowed still for decent settings. I only replaced it in 2017. There was no upscaling back then, you got the resolution you set, and midrange cards could still do it. I mean, I could play Borderlands 2, a 2012 game, on a 2008 HD 4870 nearly maxed out. I played DMC4 on mid settings on a lowly 7300 GT and it ran fine. A friggin 4060 should be able to stay above 60 fps in 1080p max settings in all but the most demanding games!
@@HappyBeezerStudios I'm not sure what you're on about: in the HWU review of the 4060 it averaged 91fps at 1080p. Honestly, unless you're talking Starfield or some really heavy RT/PT games, it does run over 60fps at max settings at 1080p.
6 years is a long time, though. It doesn't excuse the poor state of ported games today, but technology has come a long way since then. The GTX 1000 series is now on its last legs; even the mighty 1080 Ti is challenged in newer games at 1080p.
It seems like this comparison mostly boils down to the quality of TAA implementation between games. Might be worth it to compare it without TAA, which seems to be the main variable here, not the DLSS itself.
I mean that wouldn't make much sense? Games look horrendous without *some* AA. And TAA being the standard it makes a ton of sense to use that as a baseline since that's what the vast majority of people are used to see and use.
You should check out 1440p DLDSR on a 1080p monitor with DLSS, as it's miles better than 1080p native. Textures and sharpness in particular look massively better. Edit: the Star Wars game with 1440p DLDSR, even with Performance DLSS, looks miles better than 1080p.
I think this is more about people needing DLSS for increased fps, while DLDSR will actually hurt your framerate quite a bit. But you're right in the sense that a 1440p+ display might be utterly redundant, and only a better graphics card (which could then use DLDSR) is needed for most people.
@@elmariachi5133 When you use 1440p DLDSR with Performance DLSS, it's equivalent to 1080p Quality DLSS, as both upscale from 720p. The performance difference is very minimal.
I'm actually pretty impressed with DLSS Quality at 1080p. I mean, we're talking upsampling from a 720p render. The fact that it's even comparable is great.
It's not comparable. Nobody plays games without any movement. Like the video said once in motion there is clear ghosting and artifacting. All upscalers at 1080p looks good without any movement.
@@brettlivingston595 take off those fanboy glasses. What are you impressed by, the artifacting or ghosting?...dlss quality at 1440p looks ok, beyond that it looks trash. Knock it off.
@@brettlivingston595 you do game at 1080p, stop lying 🤥. Nothing wrong with that. Just stop lying. The comparison in this video is the same comparison that people use against dlss Vs fsr at 1440p. So I don't understand how dlss is acceptable in this video, but fsr is labelled terrible. That has to be bias.
Great video. The only criticism I would add is that there is a huge difference in render quality between TAA on and off. I would even say that when using TAA, it's not entirely correct to call it "native", since TAA at 1080p drastically reduces image sharpness (even more so in motion).
Depends, the ghosting from cars is good at 4k quality DLSS but there is still some light trails. At 1440p quality DLSS the car ghosting is there but not super noticeable. Ray reconstruction with RT shadows / lighting also adds some weird ghosting effect I've noticed. I play @1440p with DLSS resolution scaling at 75 with Ray reconstruction reflections only. Another problem with CP2077 it seems like enabling Ray reconstruction on Reflections only decreases performance by a lot, not exactly sure why, maybe it enables path traced reflections or something.
The TAA implementation in CP is really bad. The best image quality is achieved using DLAA, since turning off AA (TAA is the only AA option) completely destroys the game's visuals due to how the game was made. The game is destined to be a blur-fest no matter what.
@@Zyxlian Yeah, I actually tried the mod that removes TAA; it's weird how badly it breaks the game. DLAA / DLSS Quality look good though, I definitely wouldn't call them a "blur-fest". Now that you can scale DLSS it's even better.
Make sure you turn off motion blur in it as well, it’s really bad in CP2077, and if I remember right, it gets enabled every time you change your settings preset. Pretty annoying but looks much better off.
Good, you don't want to see the difference. For me, it's screen tearing and small frame drops. None of my friends can notice or tell a difference, but I can, and it's like nails on a chalkboard to me because I took too much of an interest in it. It started to ruin gaming for me.
Remember that if the image resolution and quality are poor, YouTube compression spoils the graphics even more. In fact, on a full HD panel it doesn't look that bad at all, at 22 inches.
I really appreciate this, as 1080p is still what the majority of PC users out there are using, I believe. It really depends on the title: my 3060 could always do fine at 1080p native, but lately titles have come out, such as Starfield and Alan Wake 2, where I'm forced to choose between dumping all the graphics settings to get a 60+ fps experience or just leaving them on and turning DLSS Quality on. Ultimately, I choose DLSS and it's a fine experience. Also, a game like Control I can run no problem at native, but I choose to max it out with raytracing and DLSS Quality because it's a much more immersive presentation in my opinion, even if it is a softer image.
Looks more detailed. More realistic. I have tried it in Cyberpunk. Unfortunately I get stutters using DLDSR, so I am sticking to 1080P until I upgrade my monitor.
I tried it in FH5; DLAA at 0.8 is way better than that. It's basically the only setting that satisfied me among all the upscaling technologies offered in the game - I liked DLAA the most. TAA was fine, but DLAA gave more fps at 0.8 sharpness.
Excellent video! I personally game at either 4k on an oled tv or on a 1440p monitor depending on the game and regularly use dlss to do so. Recently I helped a friend do some upgrades to his pc which included the move to an rtx 4060 and he games on a 1080p 144hz display. We play a lot of remnant 2 and I was interested to see how dlss looked on his 1080p display. I couldn't notice really any difference in image quality during gameplay but I did notice the much higher frame rate.
I love that 1080p is now considered, "not too demanding for entry-level hardware", when a decade ago it was just starting to be the mainstream resolution and a 960 could barely manage 60 FPS in most titles with the settings turned to high.
Don't fall for the 1440p/4k gaslighting, 1080p is by far the most popular resolution for good reasons (performance, low heat/noise gpu solution, cost vs, simply buying a Playstation).
@@kesamek8537 LOL I think you meant to post to someone else? Nothing in my post mentioned anything other than 1080p. More to the point, there is no gaslighting involved in the 1440/2160 resolutions. They offer more fidelity via more pixels. That's just science.
@@kesamek8537 Gaslighting? Does being more popular make it better? 4K is simply a superior image; there's nothing more to it than that. To me, 1080p ruins the presentation of a game, with the blurriness and aliasing being extremely distracting/immersion-breaking.
Thank you for this video. Many have lower-end systems and this can help. Actually, my friend has an RTX 2070 and an 8700K, so for him DLSS Balanced or Performance is a must for 60+ FPS gaming.
Why did TAA have to be the AA solution that stuck around? I miss the days when you could choose between SMAA, MSAA and SSAA instead of having to rely on just TAA or FXAA. We have the horsepower to run those solutions now, so why did they have to be abandoned?
SSAA crushes performance. Lots of games still have the option for SMAA but it also has problems (Intel CMAA 2 is better but rarely used). MSAA crushes performance and isn't really that great without crushing more performance for 4x + CMAA or something for transparency.
@@Navi_xoo Modern GPUs can easily handle MSAA. A few years back, MSAA 2x or even 4x was not that demanding; I'm sure it's even less so now with the large GPU caches available.
I don't know why no one tests upscaling using the 'DL Scaling' resolution setting in the Nvidia drivers to 1440p or 4K on a 1080p screen. This works really well for me: you can use DLSS at the upscaled 1440p or 4K and it looks really good and sharp on a 1080p display. Doing this, I turn off the awful TAA, as it just makes the image blurry, and since the game renders at a higher resolution it looks better than native without upgrading the monitor. It works really well in games such as Red Dead Redemption 2.
I've got no other chance but playing Cyberpunk with FSR activated since I've only got a GTX 1070. Is it an optimal picture? No. Is it well enough to enjoy the game? Hell yeah! And that's what matters!
You could also try the driver-level upscaler for NVidia, it's in the control panel. "Image Sharpening" is the setting, you enable the upscaler from there. IIRC you then need to set your in-game resolution to one that's just less than 1080p (only specific resolutions allow the upscaler to work). It might be better than FSR in some games.
Isn't it interesting where we're heading in graphics? This perfectly illustrates how the way we talk about resolution is changing, and how the old way of looking at it doesn't quite apply in the age of upscaling. Even more so when we bring frame gen into the mix, where 80 fps with frame gen is a completely different thing from native 80 fps, even before you take latency into account. Great video, definite subscribe from me!
I played games on a 43" 4K TV for 2 years. I never tried 4K or 1440p monitors. But now I'm back on PC, currently using a 24" 1080p. And to be honest, I don't feel like I'm missing that much with 1080p. It's mostly that the size of the screen seems too small now, and maybe the quality of the panel, but the 1080p resolution is not that big of a deal to me. At least it's definitely not like when a game runs poorly and you decide to leave it for the future when you can appreciate it. It's perfectly fine.
I also went back from 1440p 'productivity' video/audio editing monitor to a separate 1080p for gaming. I miss nothing, I gain performance and I can run my gpu without cranking it which only blasts out heat and eats up electricity for no benefit to me at all. 1080p rules for gaming and the majority agrees.
I believe you but when I used my 4k tv as a monitor it felt really uncomfortable because of the size. No way i would use it like that. I'm curious to try 32" 4k. @@kingplunger6033
I struggle with the concept of calling a TAA-smoothed image "native". Other AA methods are A LOT sharper and clearer, and barely have any artifacts. Yes, they sometimes cost a little more to implement, but MSAA wipes the floor with any TAA or upsampling.
1. TAA (at 1080p): with any camera movement the image turns to mush; if you don't move the camera the image is very clear (pointless anti-aliasing, in a word). 2. DLSS Quality (1080p) is a necessary feature if TAA is built into the game. 3. Advice if the game uses TAA: the best solution is DSR 4K + DLSS Balanced/Performance. Using these settings on a 1080p monitor, your in-game image will be better than on native 1080p and 1440p monitors, and the fps will stay the same. Damn the day TAA appeared... honestly, it can only work adequately at 4K (and even then with issues). In general, the ancient SSAA is the best AA, as it was and is.
No, native all the way at 1080p, which is what I go for. If I can't get 60+ fps at 1080p, I'll gladly tweak the settings to get it. But at the moment my 5600 / RX 6600 / 32GB RAM rig does well beyond 60fps in every game so far at high settings, and for older games I can use ultra.
i will stay with 1080p for quite a while. still looks great on both my 24" desktop monitor and my 46" TV sitting at the end of my bed. my RX 6700XT gives me plenty of headroom to play the games i want at highest settings with 60 or more frames. no need for FSR or fake frames
@@danavidal8774 You definitely won't run a lot of games at 60+ fps with that card... even a 4090 struggles to maintain 60 fps in a lot of games at 1080p maxed out... people have been brainwashed... damn guys, wake up to reality!!!
@@danavidal8774 I would need a new monitor and a new TV. Show me a good TV with 1440p resolution. No need to spend that money right now; maybe in 5 years when I buy a new PC. The next thing I need is a better laptop for work-related stuff. I am not a millionaire.
I would love for you to consider a comparison between 4K upscaled/TAA and 4K (no TAA, no upscale, just 4K). There aren't many titles these days that can be used for that, though (most don't allow you to disable TAA)... Forza Horizon 5 is one of them. I think it's important to show what TAA/upscaling made us compromise on (if there is any).
Thanks for the video! I’m someone who often uses the Balanced & Performance DLSS settings on 1080p. My card is a 2070S and I’m a player who prefers higher fps for better response than visual fidelity. Using these settings can definitely add a bit of graphical jank… but the fps increase is worth it and it looks better than turning textures down to the minimum for example.
Unfortunately, an fps increase does not always mean better response. If the game is dogshit, shit comes out even if you try to clean it up and smear it all over the walls. Like with Starfield.
DLSS and FSR and XeSS at 1080p are DEFINITELY relevant for handhelds. Currently only FSR is, but with MSI Claw XeSS will be relevant too (probably...) and there's rumors for a Nintendo Switch 2 which will use DLSS so for that it will become relevant too. For desktop usage ... well, still relevant, but not for long, I hope.
Handhelds using XeSS on Intel hardware will definitely look better than FSR (depending on the game) but small screens will keep FSR relevant. Intel seems pretty confident if they're bidding on the next XBox.
12:07 "it may be a personal preference thing as to which issues you would rather have" - this line exactly illustrates my biggest problem with modern 3D games, indies and AAAs alike. There's a regression in render quality in new games and I despise it. Games are getting blurrier, with a less stable picture, because of forced TAA and almost-mandatory upscaling techniques. My guess is that it happens because devs are using dithering on textures to achieve transparency; without antialiasing, things like grass, hair, fur or leaves look very shimmery. I don't know what happened that made devs adopt this inferior technique. Sometimes, after disabling TAA in config files, I can remedy this by upping the render resolution and adding some other antialiasing method like SMAA, but that's not possible in every game. I wish devs would notice this issue and try fixing it, because I fear the future of gaming will be blurry and full of ghosting.
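For the curious, the dithered ("stippled") transparency mentioned above works roughly like this sketch: each pixel is either fully drawn or discarded based on an ordered-dither threshold, and TAA is then expected to average the pattern away over frames (hence the shimmer when TAA is off). This is a generic illustration, not any particular engine's code:

```python
# Minimal sketch of ordered-dither ("stippled") transparency.
# A pixel is kept or discarded by comparing the surface's alpha
# against a repeating 4x4 Bayer threshold pattern. Without TAA to
# average frames, the raw pattern is visible as shimmer.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def keep_pixel(x, y, alpha):
    """True if this pixel of a surface with the given alpha is drawn."""
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    return alpha > threshold

# At 50% alpha, roughly half the pixels in any 4x4 tile survive.
kept = sum(keep_pixel(x, y, 0.5) for y in range(4) for x in range(4))
print(f"{kept}/16 pixels drawn at alpha=0.5")
```

The appeal for devs is that it keeps the renderer free of sorted alpha blending; the cost, as the comment says, is that the result only looks right once a temporal filter smears it.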
Optimization is difficult and it is much much easier (read cheaper) to slap on a bandaid hack like undersampled rendering with TAA than to spend days upon days optimizing every detail of one effect. In the past when computer performance was lower and such bandaid solutions were not as available companies had no choice than to let the devs optimize, which is how some modern video games have such terrible performance while at the same time not looking much better than older games.
1080P performance mode scaling is definitely something that handheld PC users would be doing (Steam Deck, ROG Ally, MSI Claw, Lenovo Legion Go), so there would be an audience that's interested in the performance improvements, but not necessarily the image quality because the display is much smaller.
I would really like to see comparisons of power consumption. At 40 ct/kWh, and with the environmental impact in mind, that's the real reason for me to use upscaling.
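As a back-of-the-envelope illustration of that point (the wattages below are hypothetical examples, not measured figures; only the 40 ct/kWh rate comes from the comment):

```python
# Back-of-the-envelope energy cost estimate. The wattage figures are
# hypothetical examples, not measurements; only the 0.40 EUR/kWh rate
# comes from the comment above.

def yearly_cost(watts, hours_per_day, eur_per_kwh=0.40, days=365):
    """Yearly electricity cost in EUR for a given sustained draw."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * eur_per_kwh

native = yearly_cost(250, 2)      # e.g. GPU drawing 250 W at native res, 2 h/day
upscaled = yearly_cost(170, 2)    # same game frame-capped with DLSS, ~170 W
print(f"native:   {native:.2f} EUR/year")
print(f"upscaled: {upscaled:.2f} EUR/year")
print(f"saved:    {native - upscaled:.2f} EUR/year")
```

Even modest wattage differences add up over a year at European electricity prices, which is the commenter's point.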
As someone who games at work on GeForce NOW on a 1080p monitor, it's absolutely worth using. The anti-aliasing algorithm it uses is better than FXAA and TAA; it is usually the best-looking option. DLAA is ideal: Baldur's Gate looks sooo sick even at 1080p with max settings. MSAA works amazingly if you have the power to burn like I do. 1080p Destiny basically has no jaggies, stupidly clean, if you go 4x MSAA.
DLSS is incredible, it is the main reason I got a 4080. That and driver support. In the future when AMD matures their software I will definitely give them a try too.
Same here. Although I'm not holding my breath for FSR. Sure, it looks all right on comparison screenshots of stable images, but it's terrible in motion. I find DLSS much better. DLSS I'd run at quality mode even if just for the visual upgrade. FSR I wouldn't enable unless I really really have to.
@@pituguli5816 I don't need to. Think about it like this: a graphics card will always have a limit on how many FPS it can produce in a given game. Even a 4090. If DLSS looks just as good as, if not better than, native resolution, and gives 20% more fps, why not enable it?
@@cherryfruit5492 Because it doesn't look better. DLSS Quality at 1440p renders internally at about 960p, so how can the game look better from a lower render resolution? I tried DLSS at 1440p on the Quality setting and it made the game look washed out, with blurry distant objects. It's better at 4K, because dropping the internal resolution to 1440p still retains decent sharpness and fidelity, but upscaling to 1440p is not good imo. Maybe you don't need it because you own a 4080, but the majority of PC gamers don't have a powerful GPU and still need upscaling to run new games. Imo this shouldn't be the case; for people worried about upscaling, the more important issue is buggy console ports that don't run properly, with devs using DLSS and FSR as an excuse to release games that don't run well. I don't agree that upscaling is better than native resolution. That's Nvidia marketing telling gamers they get better graphics from a lower render resolution, but thinking logically, how can a lower resolution be better than native? Sorry for my English, I speak other languages and have only been learning English for a few months.
One thing I would like to see talked about is the different implementations of temporal upscalers. Those can vary drastically depending on the engine, it seems. There are even games where FSR works better than DLSS. Based on the table, it might be worthwhile to investigate this topic even further. The most astonishing example for me was the FSR2 implementation in No Man's Sky for the Switch. It just looks so clean, and that's on a 720p display with low-end hardware. I would wish for more of this kind, especially for gaming handhelds!
DLSS Tweaks allows you to set any scaling factor for DLSS up to 1.0 and that makes it pretty usable on 1080p even without combining it with downsampling. I mostly use it to enable DLAA in games that don't have it as in game option. DLAA on 1080p makes for about as stable image as I would ever need and it preserves texture quality way better than TAA.
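For anyone curious what that looks like in practice: if I remember the DLSSTweaks ini layout correctly (these key names are from memory, so verify against the config file bundled with the tool), the per-preset ratio overrides live in a `dlsstweaks.ini` next to the game's executable, something like:

```ini
; dlsstweaks.ini -- key names from memory, check the tool's bundled config
[DLSS]
; Forces every quality preset to render at native res, i.e. DLAA
ForceDLAA = true

[DLSSQualityLevels]
; Override the per-axis scale ratio each preset requests (values are examples)
Enable = true
Quality = 0.83
Balanced = 0.66
Performance = 0.58
```

The ratio values above are just example overrides, not recommendations; the point is that any value up to 1.0 can be mapped onto the in-game preset names.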
I have yet to encounter a single game where it doesn't work, personally. If I did then I would be using 2.25x DLDSR@100% Smoothness + DLSS Quality to get DLAA. But that does cost more than just regular DLAA.@@stangamer1151
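The arithmetic behind that combo is worth spelling out: 2.25x DLDSR is 1.5x per axis, and DLSS Quality renders at roughly 2/3 per axis, so the internal resolution lands right back on native 1080p, which is why it behaves like DLAA plus an extra downscale pass:

```python
# 2.25x DLDSR (1.5x per axis) + DLSS Quality (2/3 per axis) on a 1080p monitor:
# the internal render resolution ends up equal to native.
native = (1920, 1080)
dldsr_axis = 2.25 ** 0.5             # 2.25x pixel count -> 1.5x per axis
dlss_quality = 2 / 3

output = tuple(round(d * dldsr_axis) for d in native)      # DLDSR target res
internal = tuple(round(d * dlss_quality) for d in output)  # DLSS render res
print(output, internal)
```

So the GPU renders 1920x1080, DLSS reconstructs to 2880x1620, and DLDSR filters that back down to the 1080p monitor; the extra cost over plain DLAA is the reconstruction and downscale passes.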
I kept trying to reply to you but my comment keeps getting deleted for reasons beyond my comprehension. I cannot stand youtube anymore, might have to stop posting here completely.@@stangamer1151
As a 1080p user with a 2080 Ti, this is why I love games that let you push the internal resolution up. Bumping the internal resolution higher and then letting DLSS render from there before scaling back to 1080p (or higher, I guess) is better. It's also why I'm glad DLAA is getting a higher adoption rate. It completely smashes every other anti-aliasing technique except maybe MSAA, but that's a thing of the past.
This quality analysis comes from someone used to 4K and 1440p image but most people on 1080p monitors have never experienced higher resolution and thus have no frame of reference of how the image should look. An overwhelming majority of the issues pointed out here will not be applicable to them since the DLSS image isn't that different from the native presentation and saying it looks better at 4K means nothing. For a 1080p gamer, this is a fantastic feature as the quality level is often similar to what they are used to while getting a massive boost in performance.
Am I the only one who thinks the differences are minute? I had my screen two inches away from my face and still struggled to see much difference in detail.
I'm sure you're not the only one; many people watch YouTube videos on their phones. How much detail do you really expect to see on a small screen like that, though? Especially factoring in YouTube compression, resolution and potential quality loss if you don't have the best connection speed. Watching this video on a proper 4K monitor, the differences are clear as day. Also, how the hell can your eyes focus at 2 inches?
Nah, trust me, on a monitor, going from 1080p to 720p you will see it. You can't see it on a phone because it's small, but on a 22-inch monitor you can see it.
@@stangamer1151 There is a difference you can see on a 21-inch. It's there, not a big one, but enough that you can say "yep, that's 720p". But on a TV, oh boy, that's gonna look 🤢
@@chrys9256 I was watching on a Galaxy Tab S8 Plus, which is a 12.4-inch 1440p tablet. I feel like I can usually suss out more detail on that thing because I can press it really close to my face, unlike a monitor. It's also not that I couldn't distinguish what the guys were talking about; it's just that I feel like we're splitting really small hairs over details that can pass in the blink of an eye if you're playing these games at a somewhat normal pace. Maybe these are big deals to a lot of people, but I'd rather enjoy smoother gameplay than worry about it.
Thanks for demonstrating the results! One of my friends tried to push me toward Nvidia, and one of his arguments was the DLSS technology. Thankfully I did not listen to him, since I was playing at 1080p :P
Probably not, unless you have to in order to get a playable framerate. Edit: A lot of games here have TAA. I guess almost all modern AAA games use deferred rendering and TAA nowadays, another reason not to play them. I don't consider TAA native rendering; it's ugly compared to forward rendering with MSAA or supersampling, in my opinion. But you can't test without TAA when there are almost (probably) no DLSS games without it forced on (and integrated into the game so deeply it looks broken without it), huh. I just wish more developers stuck to forward rendering with carefully crafted graphics and clever tricks like Valve did in their Source 2 games. I miss the crispness of old games; now they are brute-forcing with high resolution to get the level of detail we used to get at 1080p and lower, not to mention new games requiring 10x the compute power to look worse than 5-10 year old games anyway. Rant over.
While I also disliked TAA based solutions in the past, I've grown so accustomed to the look and the far better results with transparency and odd cases like thin objects that at this point I prefer it over the old stuff. Add on top DLSS and I have no intention on going back. Do also note that current games push orders of magnitude more geometry with their environments and much higher res textures and more complex PBR materials. You wouldn't dream of running Pixar level graphics with real-time ray-tracing back when source 2 games were being released.
This ↑. I'd love to see a comparison with MSAA; that's a lot closer to what I would call "native" with antialiasing. TAA already uses a form of temporal accumulation to reduce the detail required per frame. But I don't know of any game supporting both MSAA and DLSS :/
You are not smart. Do you understand that SSAA destroys performance? Do you really think a bit of MSAA would solve this problem? It also destroys performance. LOL, game devs could still implement it; it's just pointless because DLSS or even TAA looks as good or better at 4K and doesn't destroy performance.
@@Navi_xoo Nobody said anything about SSAA, did they? MSAA does solve aliasing, like it or not. It's obviously more demanding than some other AA solutions, but it also offers the lowest amount of artifacting. Also, I don't think you've actually compared TAA to MSAA to say that it looks better; in motion the difference is quite easy to see. Talking about performance is great and all, but it wasn't really the point; we are talking about visual quality in motion.
@@Navi_xoo There is already a performance difference in the video, since DLSS is going to be more performant than TAA. Also, no matter the resolution, DLSS and TAA will look blurrier than native or MSAA. As someone who plays at 4k, DLSS and TAA are not an option for me because of the blurring it adds. I would much rather turn down shadow quality or something a little bit to enable MSAA or another better AA method, but games don't even offer that option anymore. Anyone who says things like "TAA or DLSS is not noticeable at 4k" has never actually tried it, or they are just too used to motion blur.
Playing Alan Wake 2 on my RTX 3060 right now with full path tracing (Medium), but DLSS set to Performance and resolution at 1080p. Does it look flawless? Nope. But it maintains a stable 30fps, and the visuals look quite nice even on a 1080p 120" projection. The only issue that sticks out is mostly fine detail like hair, which sometimes fizzles or aliases, and a generally slightly blurry presentation. It works well thanks to the heavy post processing though...
Personally I use the Performance mode at 1080p. I changed something in the Nvidia control panel. Don't ask what, I got it from a vid for Remnant 2, something about a DSR factor of 1.78x. It made the image quality better while still keeping the lower temps. It looked like native with a little bit of ghosting but ran cooler. Thanks for your vid. Maybe I will change back to native and do some more individual testing.
Because most games don't support it, and people want to see how DLSS compares to traditional native res. Also, if you think TAA is a blur filter, you are spoiled. Wait till you see FXAA, or the super shimmery MSAA, or the super demanding SSAA.
@@DragonOfTheMortalKombat I think you might want to read up on anti-aliasing methods. TAA is a blur filter, using out-of-date data from (a) previous frame(s) blended with the last computed frame. FXAA is a blur filter without the previous frame component. MSAA is geometry-only, so any "shimmery" you'll find there is merely the inside of triangles not being altered at all. To me, native is native, as in a fully-rendered frame not using stale data or blur filters. This whole nonsense of trying to include blur filters in the "native" image has been done since the release of DLSS, for some reason, and DLSS has traditionally benefitted from image quality comparisons being done against these "native," not native frames.
Glad to have come across this video. I'm on 1080p and will be for some time. I had been considering the RTX 4070 but that would require a new PSU. I'm of the opinion that 12GB of VRAM may be overkill but... Worried 8GB may be too little. I do not intend to buy a new monitor for years to come unless this one fails me.
I feel like in many games 1080p native looks far too blurry and shimmery due to bad TAA, which is fixed by DLSS. Cyberpunk 2077 is a prime example, especially in the nomad area. I was about to make a video on this, I have several screen captures, but HU beat me to it XD
For Cyberpunk on my laptop with rtx 4060 and 1200P display, I run native non-ray traced, with DLAA. I also cap the frame rate to 72, and use frame gen. This allows for a fine experience, and keeps the laptop fan quiet in the cubicle.
Swapping DLSS versions still helps a lot in many cases. Games where DLSS is 'just slapped on' don't tune the library with the API available. On newer versions a default preset changed, causing more artifacts, and by using DLSS Tweaks and picking a different preset (often C) it gets better instantly. And this is also true for 1080p. Anyway, still awesome tech; let the user decide, as always :). On a laptop the efficiency might be worth the image quality hit. And if you can't reach playable framerates, the quality hit is a no-brainer (same for reaching a solid synced 60fps when you don't have VRR). For the rest it depends on your preferences! With my own 3060, I happily go down to high or medium instead of ultra and use a bit of DLSS to get 1440p.
DLSS looks ugly to me in a lot of games, much uglier than FSR and blurrier too in a lot of cases. Perhaps FSR has its artifacts, but overall FSR appears more 'natively sharp' to me, if that makes sense.
@@cairndouglas4040 nah fsr looks worse like in almost every case. Dlss depends on how good it was put into a game. Some games have almost zero differences from dlss to native, others have a pretty obvious sharpness loss. Rdr2 doesn't look the best with dlss, but to be fair, rdr2 looks even on native blurry
@@Heisen_burger-dude I'm with him. I've found myself using FSR more than I'd like to admit. DLSS just smears the image sometimes and it feels foggy at times; FSR, despite the minor artifacts, is just easier on the eyes for me.
I constantly used and continue to use it at 1080p, even though I've now bought a 4070 Ti. It's not as seamless as it is at 1440p, but honestly it looks better on the Quality preset than 1080p native with most TAA implementations, and it runs better. It really comes down to the fact that I'd take a 120Hz lock at a medium/high settings mix over max settings at a 60fps lock any day, so I do whatever I have to to get more frames. Just got a 3050 Ti laptop too, and on the smaller 1080p 120Hz laptop display I loaded up D2 Resurrected and it looks awesome at 1080p, high settings with Balanced DLSS and a 120Hz lock, and it isn't stressing my GPU as much as a 90fps lock at native.
I hate TAA. I also hate that TAA is forced into the rendering pipeline of most titles these days and cannot be disabled, especially on console. TAA motion blur and picture softness makes 1440p content look almost 720p in so many games. I would rather disable anti aliasing altogether and take shimmering grass/hair and jaggies just to have a sharper picture because modern anti aliasing techniques are absolutely terrible. I'll die on this hill, thanks.
With all the rumors regarding DLSS 2 or even DLSS 3 support on the Nintendo Switch 2, this test is a good indicator of what we can expect if Nintendo decides to implement it. I believe 1080p and 720p would be realistic native resolutions to upscale from on a handheld Nintendo console.
I'm incredibly embarrassed that I never thought to check this, but you just solved my problem with Deathloop and the weird ghosting artifact on my ironsights. That bugged the hell out of me. Good video.
Thanks for explaining this. Now it makes sense why using DLSS was really bad on my 1080p screen... I tried it and I had tons of artifacts in Cyberpunk. I will have to save for a higher resolution monitor.
One thing you can do to fix image quality in some games using older versions of DLSS is to upgrade to a newer and much more stable DLL (usually 2.5.1). That should fix a lot of image problems, although it still won't fix all the problems caused by a 1080p resolution.
I guess hardware reviewers get spoiled by high end hardware that they get all nitpicky with lower end stuff. I honestly look at these "obvious DLSS artifacts" and I'm like dude, I see nothing wrong with it. And if that gives me 20 more FPS, hell yeah I'm gonna use it.
DLSS can be usable at all resolutions when you take into account what your goals are. If you're not pixel peeping and your goals are purely performance based, an upscaled 1080p image is very playable, and even if you ARE pixel peeping, that upscaled 1080p image will almost _always_ look better than dropping your resolution to 720p. It's all subjective of course, but you need to keep in mind that someone enabling upscaling at 1080p most likely isn't obsessed with fidelity to begin with so much as chasing a more desirable framerate, so they're going to be far more forgiving of artifacts and shimmering. I enabled FSR on an APU laptop to see what kind of maximum performance I could wring out of it, and yes, it was still quite usable to push that APU into a higher class of playable games. It didn't look good obviously, but at least you _could play_ now. If you're already at 60fps you're not as likely to enable DLSS/FSR to get to 90fps, but if you're starting from 20fps you're _very_ likely to crank the upscaling to achieve a more playable 30fps. Artifacts be damned. Don't be unfairly hard on upscaling because you're _trying_ to pick out shortcomings. If your goal is to just sit down and play the game, you're going to focus on the GAME, not a shimmering grate 150ft in the distance. If you're driving down the street you're not gazing at the pavement underneath your car to make sure you don't see a hint of ghosting from the taillights; you're looking at WTF you're driving towards so you don't run into it. Once you're focused on the _game_ and not the graphics, it takes serious artifacts to pull you back out of the game, but constant fps dips are very likely to grab your attention. While I'm all for superior fidelity, raytracing, ray reconstruction etc. on my main system, when you account for people on hardware that's struggling to play a game _at all_, you'd be surprised what kind of artifacts and blurriness you'll put up with when the alternative is just not to play.
If you have to pause and zoom in while showing them side by side to say "Look!, right there that sign isn't as sharp!" no one is going to notice that shit when they're playing the game, but they're _absolutely_ going to notice an across the board 20-30% improvement in performance when they enable upscaling. Even when you _do_ notice shimmering or artifacts whether it's using upscaling or not, it's not like you're going to slam your keyboard down and give up on playing the game altogether, you're just gonna say "hmm, that sucks" and continue playing the game. Even _heavily_ upscaled games today look dramatically better than PS3 games at their best, and I don't remember anyone saying they couldn't play video games back then because they didn't look pixel perfect to reality. This is why I can't agree with anyone that uses terms like "unplayable" when pixel peeping. A game is still "playable" with ghosting, artifacts, and shimmering. 20fps? Now _that's_ unplayable. There's a massive chasm of difference in "undesirable" and "unplayable".
Nice that DLSS produces decent image quality when a performance boost is needed. Can't say the same with FSR 2 at 1080p or below. Too much shimmering, artifacts and aliasing. Hoping that nvidia produces an apu that they could put in a handheld similar to the Ally or Legion Go.
this is one of the most practical explorations of 1080p gaming I've seen in FOREVER. really just loading the games up and showing them off alone is so important to those of us who represent the majority of PC users.
Great idea to test this at 1080p, and I'm especially curious about DLSS since I don't have access to it! I support testing FSR and XeSS too. I've seen potential in XeSS without a big image quality sacrifice, but the performance gain wasn't much.
It would be interesting to see a comparison video like this for 1080p Quality vs 1440p Balanced vs 4K Performance. I tend to play at 1440p Balanced and think it looks alright. The only game I've used 1080p Quality DLSS on was Ratchet and Clank, and it was the only game I've played at that res that looked comparable to 1440p Balanced. Insomniac really know what they're doing.
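A quick sketch of what those three configs actually render internally, using the commonly cited per-axis DLSS scale factors (games may vary, so these are approximations):

```python
# Internal render resolutions behind the three configs being compared,
# using the commonly cited per-axis DLSS scale factors (games may vary).
configs = [
    ("1080p Quality",  (1920, 1080), 2 / 3),
    ("1440p Balanced", (2560, 1440), 0.58),
    ("4K Performance", (3840, 2160), 1 / 2),
]

for name, (w, h), scale in configs:
    iw, ih = round(w * scale), round(h * scale)
    print(f"{name}: renders ~{iw}x{ih}, outputs {w}x{h}")
```

Notably, 4K Performance renders more pixels internally (~1920x1080) than 1440p Balanced (~1485x835), which helps explain why those two can look comparable despite the different output resolutions.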