Has anyone been able to run this with Nvidia Surround? I can't seem to enable both at the same time while using Nvidia Surround. Either the resolution isn't available, or Windows reverts back to the Game Ready driver, where the DSR setting isn't available. If I reinstall the Studio driver and enable DSR, then the resolution isn't available: it's either a lower resolution, or the in-game resolution is set to OPT.
DLSS Quality at 1080p is the same thing as DLSS Performance at 1440p, as both target a 1280x720 internal render resolution. You'll just get a much clearer image, with a minor performance hit, by moving from DLSS Quality at 1080p to DLSS Performance at 1440p. And even then you get a better image than native along with a performance boost, so that's a win!
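For anyone who wants to check that math, here's a quick sketch. The per-axis scale factors below are the commonly cited ones (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 1/2); exact values can vary by game and DLSS version, so treat this as an approximation:

```python
# Approximate DLSS per-axis render scale factors (can vary per game/SDK).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def render_resolution(width, height, preset):
    """Return the approximate internal render resolution for a DLSS preset."""
    s = DLSS_SCALE[preset]
    return round(width * s), round(height * s)

# 1080p Quality and 1440p Performance both land on ~1280x720:
print(render_resolution(1920, 1080, "quality"))      # (1280, 720)
print(render_resolution(2560, 1440, "performance"))  # (1280, 720)
```

Same internal pixel count either way; the 1440p output just gives the upscaler a larger target to reconstruct into.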
@@hotrod123231 Who games in 4K lmao (I'm just trolling). I play Call of Duty at 1440p at 165fps because it's a good balance of graphics and performance.
💥 Sharpness and definition are really bad on Nvidia, with jagged edges. I have a Lenovo Legion 5i notebook (i7-12700H, RTX 3060), and the image on that thing is worse than on my Acer Aspire 5 (Ryzen 7 5700U, integrated graphics) under the same conditions! Nvidia looks ugly, AMD looks beautiful! And the GeForce Experience filters increase the sharpness, but they don't make it beautiful like AMD does!
I did this like a year ago. It's really helpful for people with a 1080p monitor whose games run at a higher fps than their refresh rate: you get higher quality at 1080p while only losing (or sometimes even gaining) some fps.
@@3boodAl7asan DLSS at 1080p is effective, but it shines at 1440p and 4K, where the image is already super sharp and the quality loss is damn near impossible to spot. Honestly, I had to watch a video on what to look for just to identify the upscaling.
Great content. I use DSR + DLSS Performance to run SoTR with maxed settings + RT at 5K, yes, five K, on my UltraGear QHD monitor. The results are simply amazing ❤ Love RTX cards
But it doesn't work the same way. DLSS and DLDSR are AI-driven and far superior. DLDSR combined with DLSS gives the most amazing image quality in gaming history. It's sad that AMD hasn't been able to create something similar.
@@rodrigomendes3327 Not really. DLSS and FSR have virtually the same performance cost and generally the visual quality of DLSS is slightly better but in MW2 specifically, DLSS looks quite a bit worse than even Nvidia Image Scaling. FSR 2.0/2.1 is fairly comparable to DLSS, I would not say DLSS is 'far superior'.
@@forz2882 That's not how it works. For a number of reasons you might find you're able to do this, but for 99% of people, the first time you watch a Short you won't have the ability to do it.
I only have a 1080p monitor, and I was missing out on DLSS by not upscaling to higher resolutions like 4K. I had already tried DSR, but it looked bad because I hadn't set the smoothness to 0. Thank you very much for the tip; my game looks beautiful now.
Thanks for helping me. Now I know why my fps was dropping and my GPU was running hot: this setting was auto-set to 4K (4.0x). Turning it off boosted my fps by more than 20 🎉🎉
Lol, I literally just did this over the weekend. I only finally gave it a shot because Monster Hunter World wasn't letting me use DLSS at 1080p and it looked really blurry, but now it looks crisp around the edges... Just don't use DSR on the desktop; it'll look blurry at everything besides native res.
This video is super helpful. I want to try this on my 4GB 3050 Ti, especially for RDR2, but I'm worried I won't get better performance than native 1080p at all.
1440p with DLSS Quality puts the internal render resolution at 1707x960 (960p), which runs better than 1080p since the render resolution is lower, and looks better (the game and textures look sharper and more detailed). So if you use DLDSR to scale your output to 1440p and DLSS Quality to drop the render resolution to 960p, I feel this is even better for lower-end cards that support DLSS and DLDSR.
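A quick way to verify those numbers, assuming the commonly cited Quality scale of 2/3 per axis (the exact factor can vary per game):

```python
# DLDSR target 2560x1440; DLSS Quality renders at roughly 2/3 per axis.
quality_scale = 2 / 3
internal = (round(2560 * quality_scale), round(1440 * quality_scale))

print(internal)  # (1707, 960)
# Fewer pixels shaded than native 1920x1080, so it can run faster:
print(internal[0] * internal[1] < 1920 * 1080)  # True
```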
Just be careful: your games might look too sharp with 0% smoothness. It's less of an issue in competitive games, but human characters' skin in particular can look weird.
Basically, DLSS at 1080p looks like crap for some fps gain, but 1440p with DLSS looks much better; you lose some fps, but still end up no worse than 1080p without DLSS.
@@user-mc5er9fp1i 1080p only looks like crap if you go looking for DLSS artifacts. Every time I activate it, I look closely at the flaws, but if you just play your game with DLSS Quality, ten minutes in you'll forget it's even on.
If you're using DLSS to upscale from a lower resolution to one higher than your monitor can show, only to downscale it again for your monitor... wouldn't you be better off simply sticking with your native resolution and just choosing one of the higher DLSS quality modes?
Why? I use 4 scalers at once; they all complement each other, and my textures are clear and smooth without any anti-aliasing. The latest Windows 11 update has AI doing all that work to smooth out the problems. My 1440p res gets upscaled 4x by Nvidia DLDSR + DLSS + Windows Super Resolution + Lossless Scaling from the Steam store... no problems whatsoever, and everything looks super clear. I even have my projector at 100% sharpness; I can't stand looking at blurry textures, so I don't...
Many games don't have DLSS, so use NIS instead; it's worse than DLSS, but it can be used in almost any game. Also, if you don't have an RTX-series GPU, use it too! It's Nvidia's version of RSR. And VSR is AMD's version of DSR. Please pin this comment
I also use this trick. DLSS works like TAA for the most part: it takes information from previous frames, and a bigger frame buffer can store more of that information.
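As a toy illustration of that temporal idea (not Nvidia's actual algorithm, just a simple exponential blend over a single hypothetical pixel value), accumulating across frames is what lets a low-sample image converge on a stable result:

```python
def temporal_accumulate(history, current, alpha=0.1):
    """Blend the new frame's value into the history buffer (TAA-style EMA)."""
    return (1 - alpha) * history + alpha * current

# A pixel converging on its true value (1.0) as frames accumulate:
history = 0.0
for _ in range(50):
    history = temporal_accumulate(history, 1.0)
print(round(history, 3))  # converges toward 1.0
```

The more history the buffer holds (and the higher its resolution), the more stable the reconstruction, which is the commenter's point about bigger frame buffers.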
Yeah, but don't you have to use GPU scaling instead of display scaling for this to work? I know some monitors still only have a GPU scaling option, but once I switched to my first expensive 1440p monitor and used display scaling, the difference was extremely noticeable. It was like playing at 144fps with a smooth 7ms frametime on display scaling versus a fluctuating 10-16ms frametime on GPU scaling, or 2ms of input lag compared to about 7ms. I just can't go back to GPU scaling, even if I'm using DSR. I just have my monitor set to 1440p 180Hz and use DLSS, sometimes Quality and sometimes Balanced. I use G-Sync and cap frames around 150-160, and even on a 2080 Ti I'm staying stable on Quality with medium settings. Not too bad. Can't wait to upgrade this GPU though; tax returns just weren't it this year!
Less blurry. More pixelated though. Like you just turned off AA. I usually use shaders and sharpening to get rid of most of the dlss blur. And always use dlss in quality mode. Of course you can’t really use shaders on multiplayer games.
So DLSS increases frame rates but decreases resolution or makes the game blurry... how does that happen? For sharper, higher-definition gameplay, do I have to turn DLSS off, or what? Please explain.
@@GigaChad-vz9ng It lowers the render resolution and then uses AI to efficiently upscale the image back to your native resolution. I think it looks blurry because the pixels are being placed by AI, and some areas still have a lower effective pixel density than full native resolution. Some scenes look better than native, but most have a slightly blurry look.
@GigaChad-vz9ng It's in the name: Deep Learning Super Sampling. The render resolution decreases, but AI tries to upscale the image back with better performance. The lower the resolution, the harder the AI has to work to reach the same image quality.
Exactly!! You have to use GPU scaling to do this, and that's pretty outdated at this point. Display scaling is the way to go. When I finally got a monitor that supported display scaling, it was a night-and-day difference. So he's basically getting a slightly sharper image while increasing input lag and frametimes, and actually losing frames lol. I can understand using this in some single-player games, but not COD.
I have a 2060 with an i7-4790, so in CPU-intensive games I only see around 50-80% GPU usage. Because of this Short, I learned how DSR works and put all that extra GPU headroom to use making games look much better without losing much performance.
@@youtubeshadowbannedme I upgraded to a 1440p monitor, so at least in single-player GPU-intensive games it's not much different from having a good CPU. In CPU-intensive multiplayer games it can be a bit rough, but I don't really play those. I don't even play games much at all, which is why I haven't just upgraded the CPU.
I did this by accident: I used DLDSR to boost image quality, then when I enabled DLSS I noticed better performance without any noticeable impact on image quality. 😅
Yo, I recently installed an RTX 3050 in my PC. I thought it was the worst RTX card, so I was hoping to use DLSS, but it turned out to be very good. For example, I get 50 fps in BeamNG's Italy map with AI cars, and in Teardown on the highest settings on a big map I can bring down a whole 20-story building with no lag. It made me wonder why people buy cards like the RTX 4090 Ti or whatever; the only reason I came up with was 4K, but damn.
This is better than you think. I just upgraded to native 1440p, and honestly it's not that far off. Native is definitely better, but DSR is much better than plain 1080p. If you can run it, try it! Native 1440p also runs better than DSR.
There is one problem: my monitor is 165Hz, and when I use DSR it drops all the way down to 60Hz. And you WILL feel that difference. Know this before using DSR.
Why do some people say 50% or even more smoothness for DLDSR, while you recommend 0%? Isn't that over-sharpened? Asking because I also find 33-40-50% a bit too soft for 4K ultrawide. Around 8-10% is okay, but 0% is still sharper; idk what's best.
It really depends on many factors: your monitor, your resolution, and the settings you use it with. In general, lower is sharper (so better), so 0% should look best. However, it may look too sharp if you have a high-quality monitor or a very high resolution. I use 0% in some games and around 25-30% in others, since some games look over-sharpened otherwise.
@@sebastianm3505 I use 4K DLDSR on an ultrawide monitor with a 3440x1440 native res, so the DLDSR res is a bit more demanding than regular 4K: it's 5120x2160 or something like that. I found 8-10% to be okay-ish. I still don't know which is best though. I've tried 75, 50, 40, 33, you name it; I can't tell much of a difference, except maybe a degradation in texture quality at higher percentages? But then again, 0% also looks off. Nothing is perfect for my taste; something always catches my eye.
I've been doing this since I got the game, but I went back down to 1080p because I've mainly just been playing Rustment 24/7, and I'll take more frames over sharpness on the smaller maps. Nice to see you spreading this info though; I feel it'll help a lot.
But how does the game, rendered at 1440p DLSS Performance (50% render scale, so 720p), upscaled all the way to 1440p and then shrunk by DSR to fit a 1080p screen, look better and run faster than native 1080p?
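To make the chain in that question concrete, here is a sketch assuming DLSS Performance renders at 50% per axis:

```python
monitor = (1920, 1080)      # physical 1080p panel
dsr_target = (2560, 1440)   # what the game thinks the display is
dlss_input = (dsr_target[0] // 2, dsr_target[1] // 2)  # Performance preset

print(dlss_input)  # (1280, 720): the resolution the GPU actually shades
# 720p means fewer pixels shaded than native 1080p, hence the speedup.
# DLSS reconstructs 1440p, and DSR downsamples that to the 1080p panel,
# which acts like supersampled anti-aliasing, hence the image quality.
print(dlss_input[0] * dlss_input[1] / (monitor[0] * monitor[1]))
```

So the speed comes from shading under half the native pixel count, and the quality comes from the final downsample of an AI-reconstructed 1440p image.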
Great. I tried this and it didn't work right. I tried to revert, and now it's not running right either: I used to get 100fps or so in MP, and now I get 30. The game looks even worse after reverting!
@Joaquin Sabin Yeah, I tried the DSR thing in the past and it caused so many issues that I decided to go out and get a 4K monitor instead. However, even with high-end specs, some games struggle at 4K on ultra graphics. I'm now starting to see a steady 60fps in Star Citizen.
I'd say if you want better image quality and have the hardware for it, enable DLAA. DLAA runs at native resolution and doesn't downscale, so there's no DLSS funkiness. I have a 3070, which is better than what most people are running, but it's still worth a try.
You can decrease your screen resolution and use less aggressive DLSS. For example, I set my 1080p monitor to 1600x900 and then use DLSS at an increased quality setting.
@@sergioescobar6390 Yes. It was common in the ancient days (before DLSS) to reduce the screen resolution to increase performance, but now, with DLSS, you can use both techniques together to get better performance and a good-looking game. Reducing the screen resolution before applying DLSS actually makes the graphics feel smoother.
@@sergioescobar6390 You can experiment with what looks better to you. I don't know your actual screen resolution or your DLSS render resolution. I have a 1080p monitor, and I like to render at something like 800x600. Instead of dropping from 1080p straight to 800x600, I first set my screen to 1600x900 and then render at 800x600. Notice that I reduced the amount of scaling done, but not the render resolution. It's quite a simple idea: the FPS gain is the same in both cases, but it looks smoother if I first set my monitor to 1600x900. I did this in Hogwarts Legacy on a 1050 Ti, and I was playing at more than 100 FPS.
Bro, it looks worse XD. Nah, jk; it's really personal preference. The choices are: pixelated or blurred. They both give your eyes the same amount of information, and they only look different on a monitor with a resolution greater than 1080p. I personally prefer my image blurred over pixelated because it helps my eyes recognize patterns and shapes that don't line up well with the monitor's pixel grid.
@@Jakiyyyyy Yeah, but one thing: with DLSS and frame generation your latency will be about 10ms higher. For example, at native you might have 40ms; with these features on it will be around 50ms. You won't even notice those 10ms :)
How do I configure a game at 1440p if I only have a screen with a 1080p native resolution? If I change it directly, the image vanishes with an "out of range" error on my screen.