@@rinart73 I got pretty good fps in RDR2, but I turned FSR on for a bit just to see how it affected the image quality and I was sorely disappointed. Shadow checkerboarding, image ghosting, grainy pixels on fine details like hair. It wasn't worth the marginal frame increase.
@@rinart73 Really? Damn. Cause I think even DLSS is still shit compared to native lol, so it must be even worse. I really don't like any of these upscaling techniques; I can truly tell straight away when they're implemented, no matter what revision they bring out. But they're forcing us to use it in games that need it for best performance even on very capable hardware :( If you have a lower-end GPU, sure, it's a big help, but you cannot beat native! And I want to play all my games native with good performance on a powerful GPU without needing to compromise... Talking to you, game devs! lol
AMD's FSR 3 has been a blessing for me. I play The Talos Principle 2 (it uses Unreal 5) on a very old CPU with a mid-range AMD GPU, and when the game devs added FSR to the game, my framerate increased spectacularly (depending on the game scene, from 30-40 FPS to 50+). Great stuff!
Ya, these upscalers are amazing. The problem is that games are now being designed to require them just to run, which kinda defeats the whole purpose of being a boon to lower-end cards, or a boost for when you need one.
I played Talos Principle 2 on the Steam Deck, and had to turn FSR off partway through because the ghosting on moving laser repeaters was *incredibly* strong. Maybe there was an update, or some Linux/Proton strangeness, but... It was fun while it lasted, at least.
@@skult227 i5 3570K, RX 6700XT, 1080p monitor, 60Hz I play on Ultra quality settings with FSR on native (I think, didn't play in a while). For this particular game on this particular system, the old CPU is the bottleneck, and so my guess is that with FSR on, even the CPU (and not just the GPU) spends less time producing an interpolated frame compared to a regular frame.
FSR has been awesome at extending the life of aging GPUs. My old 1080 managed to stick around a little longer playing new games because of it, and a friend of mine has a 1080 Ti that's still happily trucking along, with him just auto-enabling FSR whenever he sees it. The ghosting has gotten a lot less noticeable since I first used it on God of War two and a half years ago. Also, the FSR native AA is really cool when games support it. A lot of temporal AA techniques have a tendency to look a bit soft; FSR's AA, on the other hand, is super crisp. Ghost of Tsushima is a great place to compare this, as that game supports just about every technique under the sun.
@Jeyel-t3k Maybe it's my poor eyesight, I can't even see screen door effects on most headsets. What I appreciate is the greater framerate stability, since I have a weak stomach.
Clearly you don't understand the difference between motion vector prediction derived through machine learning on tensor cores and software algorithms burdening the compute units that AMD uses. Nvidia could easily make software that does the same thing, but instead they opted for quality and chose not to take shortcuts.
@@mikezappulla4092 The optical flow accelerator is not without flaws either: it uses a shitload of VRAM, and Nvidia GPUs have less VRAM compared to their AMD counterparts. Frame-by-frame analysis shows that DLSS FG simply removes bad generated frames and keeps acceptable ones while FSR FG doesn't, hence why most of the time FSR FG gives more fps. Hell, in some scenarios turning on DLSS FG reduces the frame rate by quite a bit when the VRAM spills over the limit. And the thing about motion vectors: that still has to be implemented manually by the game devs themselves; the tensor cores simply run the machine-learning-based upscaling part, which looks amazing and way cleaner than FSR upscaling. But for frame gen, the difference is so negligible that it looks the same to everyone. Also, about compute units: AMD's stream processors are way more powerful than Nvidia's CUDA cores, hence why AMD GPUs actually run frame gen better than Nvidia GPUs, with similar delay/lag that can be mitigated using Anti-Lag or Reflex.
@@mikezappulla4092 Not having enough tensor cores doesn't mean you can't achieve a similar result; that's just a lazy excuse. Smart people provide an alternative without them and aim to do better.
Why my last few GPUs and CPUs have been AMD: best bang for your buck. I play on a 1440p/144Hz gaming monitor and a 4K LG Nano TV, and my current card is a 6750 XT. Very affordable GPU. It plays all current games maxed on the monitor. For 4K, I sometimes have to turn off or lower a few things (YouTubers will tell me what). If all else fails, FSR if the game supports it (pick quality/balanced/performance based on the game and my needs). If the game doesn't support it, the Super Resolution thing in the AMD app. I'm a busy but poor man. I don't have time to play 20 questions with a game about why it's messing up, or to spend more than $500 on a GPU or console.
Check out RTINGS for that sort of stuff. The unfortunate truth is that TV manufacturers can call settings and features whatever they want, and different companies often call things by different names, so you're going to need to rely on the people who actually review TVs to know what is what.
@@eldukedrino Exactly. I think if we had a big nerd explain the main ideas, how they're named, and how well they work under the common brandings, it would be wonderful.
I think they do. That is, on Linux you just need to launch your software (it doesn't even technically need to be a game) within gamescope, and you can apply FSR to that window or fullscreen. I'm 99% sure there's software that will do that on Windows (and I would be moderately surprised if it's not something you can enable system-wide in Adrenalin or Game Mode settings).
@@GSBarlev That's FSR 1 though, which is really bad compared to FSR 2. FSR 2 needs support built into the game. Nvidia also has NIS, which is like FSR 1 and can be applied to any game via gamescope.
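For anyone curious what the gamescope approach looks like in practice, here's a rough sketch of Steam launch options using its FSR filter. The exact flag names have changed between gamescope versions (older builds used `-U` instead of `-F fsr`), so treat the flags below as an assumption to verify against `gamescope --help` on your install:

```shell
# Render the game at 720p internally and have gamescope upscale
# the window to a 1080p output using its FSR 1 filter.
#   -w/-h  = internal (game) resolution
#   -W/-H  = output (display) resolution
#   -F fsr = select FSR as the upscale filter (older builds: -U)
gamescope -w 1280 -h 720 -W 1920 -H 1080 -F fsr -- %command%
```

Because this operates on the finished frame, it's FSR 1 (spatial only), which is why it works on anything but looks worse than in-game FSR 2+.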
I use 'Lossless Scaling', mostly because it works on every game. It works just like your TV's motion interpolation, but brilliantly. If you already use motion interpolation on your TV, you won't be bothered by the artifacting that comes with it; it does look a bit worse than FSR or DLSS, but it works amazingly well. It also works outside of games, for example on YouTube or other media sources.
@@yoshinatsu Very true. But the game has to be coded to accommodate it. I'm not saying that Lossless Scaling is the be-all and end-all. I'm just saying that it works for games that DLSS and FSR don't support.
Ark Ascended announced they are switching from DLSS 3 to FSR 3.1 at the launch of the Aberration map on September 4th. They said it is more in line with their goal of being cross-platform, since both current-gen consoles use an AMD APU. Up to now I've been using RSR with my 6700 XT, so I'm looking forward to FSR being introduced. Around the same time they are also moving to UE 5.4, so the combination should see good performance improvements on my system.
@@Naqaj They mention FSR 3, and yes, AFMF is part of FSR, but AFMF is only available for RDNA 2 and 3, aka RX 6000 and RX 7000. So effectively AFMF is limited to specific hardware, just like the current DLSS frame gen is to RTX 4000.
100% agree. AFMF is on the driver side, and AMD says that if FSR 3 is supported by the game, it is the preferred choice. But for games that don't support FSR 3... oh boy is it a game changer, especially with FSR 2/1 or RSR. I've had 2.5x more fps at the cost of a bit of image quality.
I was really sceptical about all the upscaling and frame gen, but after trying it on my 1080 Ti, I am actually surprised. It gave me like 90 FPS more and doesn't even look that bad.
FSR gives Ghost of Tsushima stupidly good performance on the Steam Deck, and on my desktop it's one of the few games where I can actually cap out my monitor's 165 Hz refresh rate.
As a RTX 3080 user, I don't get DLSS frame gen because, you know, Nvidia only supports 4000 series users. Thanks to AMD, I can use DLSS upscaling with AMD FSR 3 frame generation. Constant 100+ fps with max graphics at 1440p in Ghost of Tsushima. My next graphics card is going to be AMD.
@@nickduan4949 Finally, AMD found a way to gain marketshare 🙂. My Radeon HD7850 (old as fuck) died 2 months ago and I am waiting for RDNA4 to have more choice between 3 generations of Radeons. I will buy whatever has best performance/price ratio.
I just recently started playing Ratchet and Clank: Rift Apart after it was updated to support FSR 3, and omg it's been amazing. Using FSR upscaling and Frame gen, I've been getting an average of 200+ fps and even 300 fps at max settings and 144 fps with full ray tracing. (I run a 7800 xt)
I have a 4080. I've found myself using FSR 3 in most games due to DLSS frametimes being quite a bit higher. GPU times with DLSS were typically over 10 ms, spiking upwards of 15-20, while FSR 3 has been 6 ms or less, incredibly consistently. DLSS might look a bit better, but I've had a better experience with FSR. If I couldn't feel the hitching I'd prefer DLSS. This is just my experience.
Honestly, FSR 3 is great when it's available, but I've used AFMF a million times more, which is something you should've tacked on with the RSR blurb, because AFMF is AMD driver-level frame gen for anything DirectX 11/12. That said, it has some of the usual caveats, like needing to run fullscreen in (most) titles, and other limitations to enable it, but when and where it works, it works well.
I've been using FSR in Death Stranding since forever and it's probably one of the best examples for the technology. It shows both how good the image quality can get, and also how bad the ghosting can be (with foliage waving in the wind and such). For the type of game Death Stranding is, the technology is perfect, and the shortcomings can be almost entirely ignored, since it's a slower game. I love it!
I love Death Stranding, and I enable FSR 2.0 for all the games that support it. I'm on chapter 3 currently and loving the game. I also use Linux, and the drivers are just perfect for gaming since I own an AMD RX7800xt. I am currently using Bazzite, which is a gaming distro, and all the microstuttering I had on Windows 11 is completely gone. Anyway, I hope you enjoy the game.
@@ssdemon96 That's so cool to hear, I had no idea it was supported, even happier to hear it works that well. I'm doing my second playthrough on the hardest difficulty, getting max ranking on every single order xD I adore the game, you've got plenty of beauty to look forward to!
Even in games where I'm able to run comfortably at 90+ fps, setting FSR to quality often either increases framerates or decreases power consumption, which is important to manage in a small room.
Hey Riley, a video about "Lossless Scaling" frame generation on Steam would be nice: how it works, which hardware it works on, and how its quality compares to Nvidia's/AMD's/Intel's solutions.
I use FSR in most games that offer it unless I really don't need the extra frame rate at all. Good on the developers of farming simulator, they added fsr3 basically right after it came out and I use it in that game and it works great. Basically free extra FPS with not really any downside. I would say maybe some visual artifacts on certain surfaces at distance, but it's not something that bothers me enough to turn it off over getting extra frames. Really wish more games would support it
You may be interested in a tool called Lossless Scaling. It's on Steam with the image of a duck. It does frame generation and resolution scaling for games that don't normally support them. One of my friends uses it when streaming with Moonlight: he streams in low res at 30 fps and upscales the image and framerate of the received video in order to use less bandwidth. It's really good.
Been looking for a good fsr explainer video just yesterday and what's out there isn't great, so was surprised to see this today - nice one! I have a potato spec PC with an RX550 which to be fair performs far better than it has any right to, and now I've started using some games with FSR or upscaling mods (Fallout London is a good example of this). It's the difference between playable or not playable on some games, and for others allows smooth gameplay at 1080p where resolution had to be lowered beforehand. Getting something cutting edge, useful, exciting AND for free these days is amazing.
I used FSR 1.0 via Magpie in unsupported games because it's just better than lowering the resolution or drastically lowering the graphics settings. I also used it in games that just don't support higher resolutions, because their assets are made for lower ones and it just blurs if I set my screen res.
DLSS and FSR both do upscaling and frame generation, and tbh at best it's just stretching 720p video to fill a 1080p screen, so I don't use either of them. But honestly it's nice to see AMD even helping Nvidia users! I've seen many people use FSR on even older Nvidia cards like the GTX 1060, and yeah, AMD is actually loved by many people. But not as much as I love to see you, Riley, on the next TechLinked.
Switched to Team Green yesterday. The performance difference in AI/ML is orders of magnitude better. Can you all do more segments on AI and Machine Learning? Maybe just a different set of benchmarks for your creators/ML Deep Learning followers who are also gamers?
I remember playing through about half of God of War before realizing FSR was enabled; there were some specific artifacts I was noticing in certain areas. What made me notice it was a spinning grindstone emitting sparks, and it just did not look right around the edges.
FSR is definitely getting there, but it's still got a ways to go before it beats DLSS. Still, it's exciting to see these two technologies, including one that's completely open source, becoming so popular and being worked on so frequently. The future looks really, really good. (Hopefully)
Every time I tried FSR, RSR or FMF, the picture looked blurrier and jankier than when rendered natively, so I don't use them even though they could give me some more FPS. I would like to see a video where LTT tries the same thing with two setups (one with and one without) and lets your team decide if they can see the difference, like you did with DLSS before.
Wait, 2021? Nintendo implemented FSR unusually quickly. For those unaware, AMD FSR has been implemented in most first-party Switch games since April 2022's Nintendo Switch Sports. This algorithm will likely negate the need to make Next Switch versions of games from the later half of the current Switch's life, especially considering that DLSS, which is rumored to provide the Next Switch with 4K output, has been proven to be stronger than FSR. Of course, select early Switch games may see a Next Switch version regardless, if the rumors are to be believed.
Technology like DLSS and FSR is a complete game changer for a company like Nintendo that wants to keep costs low and use parts that are as cheap and freely available as possible. I remember hearing about some kind of proprietary version of DLSS that was going to be implemented at a kernel or system level, which at the time I thought was ridiculous, until all the talk of a PS5 Pro with some kind of Sony proprietary DLSS ("PlayStation Super Resolution") essentially doing the same thing with some kind of system-level integration of the tech. The next couple of months, I would say 6 at most, are going to be insanely interesting, just because I want to see how this tech ends up looking and working, and hopefully changing everything in a positive direction.
Hope someone can help since I have an even more basic question. If I enable FSR do I need to manually lower the resolution to say 1280x720 or do I keep it at 1920x1080 so the game knows that's what I would like it rendered at?
You keep the resolution at what you want the "end result" to be, so usually just your screen res. Then if you turn on FSR it will render at a lower res and upscale to that target res. For instance I have a 1080p screen, keep resolution at 1080, set FSR to quality, and it will actually render at 720p and output 1080. Hope that helps
Usually when you enable it you set two resolutions in the settings: one is the target resolution and one is the native (internal) rendering resolution. At least that's how it worked in most cases where I've used it so far. Another version I've seen in game settings is that you set the target resolution and the game gives you a slider for how far below that the internal rendering resolution is allowed to go, like 2x lower and similar options.
You keep your native resolution as you would without FSR. The preset that you're using determines how low the actual rendering resolution is, but it will always upscale to, in your case, 1080p. On the other hand, for games without FSR there's an option to upscale through the driver in the Adrenalin panel. It's worse than in-game FSR, because the game knows what part of the screen is UI and text, which should be rendered natively and sharp at minimal cost, and what's the rest it can cheap out on. The Adrenalin option requires you to pick a lower resolution and upscales everything indiscriminately, making text blurry. Good to have that option tho, just in case.
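To make the preset-to-resolution relationship from the replies above concrete, here's a small sketch using the per-axis scale factors AMD documents for FSR 2's quality presets (the function name is just for illustration):

```python
# Per-axis scale factors AMD documents for the FSR 2 quality presets.
FSR_SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def render_resolution(target_w: int, target_h: int, preset: str) -> tuple[int, int]:
    """Internal resolution the game renders at before FSR upscales
    the frame to the target (display) resolution."""
    scale = FSR_SCALE[preset]
    return round(target_w / scale), round(target_h / scale)

# A 1080p target on the Quality preset renders internally at 1280x720,
# matching the example in the replies above.
print(render_resolution(1920, 1080, "quality"))  # (1280, 720)
```

So you never lower the in-game resolution yourself; the preset implies the internal resolution, and the output always matches your chosen target.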
It may be far from perfect, but it can make the difference between a game being playable and not, especially on resource-limited devices like the Steam Deck.
Nice video, but it would be nice of you to talk about mods like DLSS Enabler and OptiScaler, which let you change the FSR version (or even the upscaler itself) if, for example, the game you're playing has a bad FSR implementation (looking at you, Deep Rock Galactic) or only supports DLSS.
The DLSS + FSR mod on Cyberpunk is why I get a solid and stable average of 87 fps @ 1080p max settings on a Core i7 3770K, 32 GB DDR3 1866 MHz and an RTX 3060 12GB 🥰
Will I get this update for my Rx 6500 Xt? I saw in the video about 6000 series, but I was unsure because of how low performance the card does compared to the other 6000 cards.
It's a blessing for some, because older PCs or ones with integrated graphics can play more games. But it's also a curse for others, because newer games are asking for it too much; for example, the new Star Wars recommends an RTX 3060 to play at 720p.
A 3060 for 720p? The f**k? I'm glad I mostly play indie games now; I barely play AAA games. I watch zWORMz's channel and I can confirm there were no games struggling on a GTX 1060 at 1080p low-to-medium until 2023, when unoptimized games started to become a huge problem.
Don't get me wrong, I run an Nvidia GPU in my desktop PC, but FSR is a godsend. It lets me play games on my very much not built for gaming laptop, which is great for when I'm travelling. Plus it makes high-end games playable by more people, and more gamers in the gaming community is a good thing.
You don't actually need a 5000-series AMD card or newer to use FSR in games that don't have it built in if you just use Lossless Scaling, which is $5 on Steam.
In every game I have ever tried FSR in, FPS drops and the image gets significantly blurrier. Feels like the opposite of what it should be doing. I get better performance going DLSS w/o frame gen than FSR + frame gen.
People are sleeping on AMD Fluid Motion Frames (FMF), which is software-based frame gen. It overcame engine limitations in modded Skyrim, doubling frames that were previously abysmal. Do you know how good your tech has to be to improve an engine-bottlenecked game???
I want to try DLSS and FSR, but my min-maxed system gets 0 extra frames in most games. How about a piece discussing the CPU overhead needed to push this tech? Which CPU/GPU combos benefit most, versus combos like mine, which get no benefit?
For my low-end notebook, this feature is game-changing. It's the only thing that allows me to play modern games like RDR2, Hogwarts Legacy, the RE4 remake and others.
tbh I'm pissed I had to buy a 4060 laptop because full-AMD laptops just aren't a thing in my area; the alternative was the Framework one (which I very much like), but it costs 2k euros without RAM and storage...
So, uh, there's an interesting point to be made here: it's commonly brought up that Steam hardware surveys show that like 70-80% of the marketshare is Nvidia GPUs, but seeing that only 52% of surveyed GPUs support DLSS is actually very interesting. That paints a picture of how much marketshare is entrenched older GPU ownership, like GTX 900- and 10-series cards. Of course, Nvidia is still by far the big dog in the market, but it's interesting to note that barely half of PC gamers on Steam have DLSS-supporting cards.
Steam and Valve really need to get together and make it easier to turn on FSR for games on the Steam Deck. It’s great you can do it even when the game doesn’t support it, but boy is it ridiculously complicated to do so.
How is it complicated? You just go into the right hand sidebar and set the scaling slider to FSR. Then you can choose whatever resolution you want in the game and it will be upscaled using FSR
I was just playing with my Adrenalin software, cause ya know... instabilities. I turned off all my AMD technologies in a bid to test a "standard configuration" with JEDEC RAM speeds and just default/auto everything, which was a failed venture. My latest musing is feeling that I just need to go harder on the AMD. It's go time, Lisa! So now I'm up in my game settings being like "is this an Nvidia technology that's enabled?" while in Adrenalin I'm like "yes, force the program to use AMD anti-aliasing." But don't get me wrong, I ain't no Hyper-RX 3itch! We're rendering at native 4K with the XTX up in this hizzy.

At this point I'm pretty sure my instabilities are from using two matched RAM kits to populate all 4 slots. They're DDR4-4000 kits @ 1.45 V, and I currently have them clocked down to 2933 @ 1.35 V in alignment with the 5800X3D max specs, but still with the 4000-level XMP sub-timings. My board can do ECC RAM, so I'm debating maxing out the system with 128 GB of DDR4 ECC at 2933 MHz @ 1.2 V.

The way this ties into AMD is that I think the X3D chip, which is barred from overclocking, has its clock and voltage algorithms affected by the RAM voltage. So the 1.45 V seems excessive, like it may be causing a cascade of overvolting that's causing thermal instability. My RAM isn't explicitly AMD compatible, and when I look at the serial numbers in the BIOS, it confuses me into thinking I put the pairs in the wrong slots, which I took care to get right during install. I also notice my VRAM is at like 2500 MHz, and I wonder, when using SAM (Smart Access Memory), if I would want not only the Infinity Fabric coupled with the RAM speed, but also the VRAM. 2666 MHz may be a delightful sweet spot, which is the quad-channel dual-rank speed on the 5800X3D spec page, and a VRAM overclock I have been able to run.
Otherwise my instability could be the 700 W Seasonic fanless PSU, or the overcurrent protection of the CPU, which can be toggled in Gigabyte Control Center, but unfortunately there are no video instructions, or written instructions, describing Gigabyte Control Center.

Which brings us full circle. As an AMD user, what are some tech tips on how to configure my Adrenalin and game settings to avoid Nvidia technology issues? Is it safe to use HBAO Full with an AMD card? Delving into the BIOS is probably too much to get into with coherent expediency, but can we get some content on how to use a motherboard's software? Even if the advice is as simple as never installing it, or having it launch at start so it's always running, or preferring AMD Adrenalin over Gigabyte Control Center for all your graphics card configuration, since both can do it. I have Ryzen Master, I have Gigabyte Control Center, I have Adrenalin. I feel there's a lack of basic introductory instructions on how to use these, and on how a true computer wizard like an LTT playa might use them to pimp their system.

That personal touch may also strike a parasocial chord with the audience. I think most channels initially attract their audience with practical content, but long term, people basically just tune in to hang out with their buddy, to see familiar faces being chill and chummy. Duos and trios have a huge advantage over solo hosts, as they create that social environment. I feel some slow-paced software musings from the perspective of the hosts' personal systems and configurations would be really great for your PC enthusiast audience. If it was a car channel, I'd be like "show me your engines"; with a PC channel, you have endless upgrade cycles and settings configurations for every personality the audience has become familiar with.
The AMD upgrade videos kind of touch on that, but they're too infrequent. Give me 6-9 minutes of James talking about his new fans every month; let the viewers grow with these computers in a parasocial kind of way. If Misty is teaching me about Pokemon, I want to see her Pokemon evolve and know what she's dropping at the tournament. A lot of content could be done during a LAN, where all the systems come out to play, but that's a busy time as is. It's like dresses on the red carpet: let's see what you're rolling up with. And your GCC config lol
Why does everyone want to turn off frame generation on TVs? I want to watch everything at 60 FPS. It makes everything look so buttery smooth, like they actually used a quality camera instead of a flip phone.
FSR is a great technology and I'd rather have it than not! (Even if it doesn't work in Forza 8.) I haven't used FSR 3 yet, but sadly FSR 1 & 2 just could not compete with DLSS and XeSS. Still great when those are not available.
I play The Witcher 3 on a GTX 1660 Super and FSR gives me like 35 fps on Ultra Performance mode (bc old CPU), so GOOD JOB AMD!! Take notes, Nvidia: I ain't spending $300 more to get better ray tracing when the 7700 XT can do all that too, and much cheaper. (In The Witcher 3 the 1660 Super can run RT too, though only at 15 fps, bc it doesn't have RT cores and uses normal cores to render.)
There is FSR in Dragon's Dogma 2, but for some reason it looks like crap. Is it just a bad implementation on Capcom's part, or is the current FSR really just that bad?