Am I misreading the patch notes, or does this latest one not talk about X2? It only mentions X3 improvements: "X3 mode has been updated to further reduce artifacts on patterned textures and in dark scenes."
If you don't like the latency, you can try the "Allow Tearing" option under Sync mode. It significantly reduces latency at the cost of potential screen tearing. However, after 60-ish hours of using this option, I haven't experienced screen tearing at all. Obviously, it might not work as well on EVERY game and EVERY pc, but it's a pretty huge improvement, so it's worth a try for anyone who has LS.
Lossless Scaling at 4x is great for 2D games with lots of scrolling. These games are often limited to 60 fps (such as Factorio and Terraria), and combined with my 240 Hz monitor they look fantastic. The motion blur induced by tracking objects with your eye is almost eliminated.
I play Super Smash Flash 2. This game is locked at 30fps, and it needs this app to reach 60fps. Ain't no other option to fix this but to use Lossless Scaling. 😅
Never in my whole life did I expect a pixelated duck software on Steam could double, triple or QUADRUPLE my FPS in games. TFS is a legend for constantly upgrading this for free. Lossless Scaling is truly the GOAT when you want more FPS in games that don't have the expensive exclusive DLSS, or AMD's whatever FSR or FMF; this duck just says "Let me allow every single game to be blessed with tons of frames and resolution!" and the frames actually did appear. It really saved my precious GTX 1060 6GB from early retirement, and I believe it will benefit people with crazy-high-Hz monitors, or the high-frame-rate TV madlads, just for the sake of playing anything at 4K or 8K.
Unfortunately Lossless Scaling is snake oil. The upscaling is literally just worse DLSS/FSR. The frame gen reduces your native frames to insert more fake frames. Fake frames are not real frames; they just create the illusion of smoothness. Your input lag is still based on your real native frames, which are now lower thanks to Lossless Scaling consuming GPU power and VRAM. The 3000 and 4000 series have special hardware in them that is used for DLSS and frame gen. That hardware makes the performance cost of those Nvidia features practically zero, which Lossless Scaling doesn't have because it's a software implementation rather than a hardware one. You're better off saving money, buying a used 3060 if you're on a budget, and using DLSS; that will give you much better quality at a much smaller performance cost. The frame gen feature could be useful for games that don't have native frame gen, but unfortunately its G-Sync support is lackluster. And G-Sync/FreeSync aren't even that good; Special K's Latent Sync blows both of them out of the water. Even then you need at least a 144hz monitor to make it worth using. If you have a decent GPU like a 4070 or above, you shouldn't struggle to get 144 fps in non-demanding games, and the demanding ones usually have native frame gen included anyway. So pretty much the only time this program is worth using is if you have an older GPU with a 144hz monitor where you get around 75-80 fps stable. Lock the framerate to 72 and double it up to 144. And even then you miss out on Latent Sync and have to use a shitty G-Sync implementation. Ain't worth it unless you're really destitute.
@@kerkertrandov459 FPS unlockers aren't an option for most older games on an emulator, where raising the frame rate breaks the physics, makes the game run too fast, nerfs jumping, etc.
I've used the program for a few months now, and in my experience, as long as you aren't specifically looking for the artifacts and those little jittering pixels, I have no issues and am very grateful this exists.
@@G0A7 maybe try it with the newly available G-Sync support turned on? Sometimes the jittering gets added along the top, but with G-Sync that problem goes away.
@@risingg3169 I could try, but tbh 60fps is more than enough for Chrono Trigger. More often than not I just forget that I have it installed and play without it.
In case no one's mentioned or tried it here yet, you can also use Lossless Scaling for videos, and it works surprisingly well. However, you do need to be in fullscreen, and there should be no UI elements present, such as captions or the scrub bar, otherwise you will get FPS fluctuations. It also works alongside RTX VSR.
@@papiertoilette-pb9lf movies are made at 24fps for a reason. Pushing them above 60fps gives a cheap "soap opera effect" look; it looks robotic, or game-like. But if you like that, that's fine.
I just bought and tested it. X2 is the best option; no need to use X4, since 120/144 FPS is enough. X2 has very good quality too. I didn't even notice any artifacts. This shit is better than Frame Generation and FSMR LOL.
@@Jeannemarre yea, this app is great for capping to your max refresh rate too. Also for games that are hard capped at 60, like Tekken 8. I play Tekken 8 at a smooth 120 fps and you don't notice any extra input lag.
The thing is, 120/144 means you have Sync on Default and not on OFF (allow tearing), which means you have more input lag. The best option is 2x performance mode with allow tearing.
@@ghostlu5462 This. Honestly, using the LS1 upscaler + X2, both on performance, has minimal impact on the image while feeling smooth. Each update silently boosts the X2 frame rate to the point that it's the best choice for a lot of gamers.
I tested it today on YouTube and a movie. And IT WORKS! You can boost a 4K 25 fps video to 100fps, or a 24fps movie to 96 fps, even in fullscreen, without graphics problems.
Hadn't gotten around to trying it like that but was hoping it might work. I tend to stream at 720p but have a 1440p 165 Hz monitor. I've only tried it on a few games. Didn't work with LA Noire but worked very well on FH4.
@@xShikari no, what should fix it is internal motion pacing. Same in games: the game could be running at 60fps, then you see a random item or character moving at 30fps or less. Sometimes it can be cool, like Into the Spider-Verse, where Miles moves at 12fps but the movie is 24fps. But running a movie at 60fps is awful; the movement looks very video-game-like. It's like those people who were interpolating JJK episodes from 24fps to 48fps and upscaling them to 4K: way too much ghosting, and the flames looked very liquid. You could see this with satellite TV running on a Samsung or LG 4K60 display. It smooths everything to run at 60fps. Every single movie, no matter how old, has that motion-smoothing feature on, which basically doubles the frame rate, so everything you watch is at 60fps.
I downloaded more SSD space yesterday. The cool thing is I did not have enough space to download the extra space, but the extra space gave me the option to download the extra space onto the extra space itself. The problem is the download was so massive that at the end of the day I have the same amount of SSD space as before.
I downloaded more internet speed, but unfortunately the download of the speed itself takes up so much bandwidth that it eats into itself. Still, it's essentially free, so I think I'll keep downloading it just so I can tell people I have a faster connection.
Turn the Performance toggle to On for better latency! There's a bit more artifacting, but it's less noticeable the more frames you feed it, and it's less noticeable than the input lag.
I've used it with 2x scaling and 2x FG to play FH4 at 1440p60 with a GTX 670 SC. Both on performance mode. Things got a bit wobbly when moving through my garage quickly but in play, latency was low and the image good. Good enough to be normally competitive in ranked racing. I've played the game at least 120 hours like that.
You probably won't notice the artifacting while actively engaged in a fast-paced game. It's when you go out of your way to find artifacts using flick tests that they become more apparent. Something to note: lowering the Mode to X2 and enabling Performance Mode (and setting Sync Mode to Disabled) significantly helps latency. While X4 is cool, it's unoptimized, and without the performance settings it will result in a bad time latency-wise.
@@DragonOfTheMortalKombat Yes, you can stack in-game FG and LSFG. Tried it on Cyberpunk with the FSR FG mod; that resulted in 660fps with the 1440p low preset and FSR Balanced. Spider-Man: Miles Morales with FSR 3 FG landed around the same fps, because LSFG caps the base FPS to the monitor refresh rate. I have a 165hz monitor, so x4 from LSFG is 660fps.
Since X2 is quality and X4 is performance, I actually use X2 to save power on my 3090. I limited my AW3423DWF's refresh rate to 120 so I can use 10-bit. Then I cap the frame rate in games to 60/90 and just let X2 generate the rest.
yeh, 2x is good enough to play with good image quality. I'm using 2x and I get 88 fps in Cyberpunk 2077, versus 42 fps before, without Lossless Scaling. I'm on a GTX 1080 8GB.
Should specify that this application is the star of the show for emulation. With emulation you're looking for a fixed frame rate 100% of the time, which goes great with this application, since it also works best at a stable FPS. VRR doesn't do anything for emulation and can cause slow motion at lower frame rates. Games that lock at 30 FPS aren't great, but games that lock at 60 FPS, like many 3D Mario titles, feel much better with it on.
When games locked at 30 frames are doubled, they look as if they're at 60 frames, but there's a catch: the more the base frames are cut, the heavier and heavier the controls feel, and that's annoying.
@@thebigcj517 unless you can use run-ahead like RetroArch and combine that with the frame smoothing, since computing power isn't the limitation here
I recently started using LSFG in Helldivers 2. Works pretty darn well! For better latency, make sure to turn on ULLM/Reflex in your Nvidia control panel. With ULLM off the delay is pretty noticeable; with ULLM on, the delay is barely there and easy to forget about. Baseline FPS on my rig is ~100 or so. With LSFG on, the baseline drops to 80-90 with 160+ output.
It only works when you put --use-d3d11 in the Steam launch parameters, as driver-level ULLM does nothing in DX12 (the reason why they came up with Reflex and put it into games).
@@RyanKristoffJohnson The latest version of RTSS also features Nvidia Reflex as the frame limiter. I tried it, and it works for games with no in-game Reflex option. Try this instead of ULLM.
Have been using this for quite some time now and I'm really happy with it. I personally don't really care about HUD elements having errors, and the artifacts are also barely noticeable, at least for me; I don't pay too much attention to finer details. I've had a 165hz monitor for almost 2 years now, but I couldn't really utilize it. I was mostly stuck at 120-144fps because of performance, even with Nvidia's x2 frame gen (I have a 4080). Since the x3 mode I can play pretty much every game at 165fps, so I can finally use the full 165hz of my monitor. And yeah, like you said, with games like Mordhau (a competitive game) I don't use it because of the latency. But in single-player games that aren't crazy fast-paced, a 60fps baseline is perfectly fine latency-wise. Only downside: you can't really go back to something like plain 60fps. My god, it feels like lag 🤣
Really interesting video Daniel! However, when stacking frame gens, you should absolutely use the performance mode. In your frame gen stack example, your base FPS post DLSS FG dropped significantly, showing a big GPU bottleneck. Please retry this (for yourself, not on video) and tell me if it improves things; it should! Also, please lock your base FPS with the in-game frame limiter to 59 FPS, so 59 -> 118 with DLSS FG + Reflex, then x4 performance takes 118 to 472 FPS, keeping you in VRR range for minimal input lag :) With this it should feel, and look, amazing :)
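For anyone wanting to sanity-check the numbers in suggestions like the one above: stacked frame gen is just multiplication, and the final rate has to land inside the monitor's VRR window. A minimal sketch (the 480 Hz display and its 48-480 Hz VRR window are my own illustrative assumptions, not something the comment states):

```python
def stacked_output(base_fps, stages):
    """Chain frame-gen stages: each stage multiplies the frame
    rate it receives from the previous one."""
    fps = base_fps
    for mult in stages:
        fps *= mult
    return fps

def in_vrr_range(fps, vrr_min, vrr_max):
    """True when the final output rate stays inside the monitor's
    variable-refresh window, so VRR handles frame pacing instead
    of vsync or tearing."""
    return vrr_min <= fps <= vrr_max

# The suggestion above: lock base to 59, DLSS FG x2 -> 118,
# then LSFG x4 -> 472.
out = stacked_output(59, [2, 4])
print(out)  # 472

# Hypothetical 480 Hz display with a 48-480 Hz VRR window:
print(in_vrr_range(out, 48, 480))  # True
```

The reason for 59 rather than 60 is the same logic as any VRR frame cap: leaving a little headroom below the refresh ceiling keeps the output inside the VRR window instead of bumping against vsync.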
I remember long ago I used to skip frames on a PS2 emulator, and up to 2x it was more or less playable. And now instead of skipping frames we're adding them. What a plot twist. Btw, in Valheim even AFMF2 feels perfect; with L.S. it would probably be even better.
EDF6 is another great option for Lossless Scaling, since it's limited to 60fps, and the frame generation does help make the game a hell of a lot smoother.
@@danielowentech While that is the best way, you can always record at the highest FPS your phone or camera can do and slow it down. It won't be the most accurate in terms of milliseconds, but I think most people just want to know which one has the lowest latency; if they can see that one is visually faster than the other, that should be enough. Also, maybe an image quality comparison would be nice: try to target a stable FPS (either 60 or 120FPS, or both!) and record it. The reason for a stable FPS is to guarantee that the recording is good and hopefully get a clean capture. I'm not sure a built-in recording feature like AMD Adrenalin's can capture FG, but if it can, you can use that, since we aren't concerned about absolute performance. If not, then obviously you can use a capture card. Whatever works best for you.
I experimented with this on my 6600XT with a 170hz 1440p monitor with vrr. It works great and I don’t notice any ghosting or artifacts pop up, but I do want to avoid having them whenever possible, same story with the latency since I play with a controller most of the time. Works best when you maintain a consistent 60fps to begin with
Daniel, it would be great if you could try LSC with a second GPU for frame generation. From the LSC panel itself, you can select which GPU you want to use. This way, there is no frame loss on the GPU that is rendering the game. I haven't seen any channel doing that.
I wish this focused on the quality of frame generation instead of the quantity of frames. The x2 is amazing but still has flickering issues (especially with 3rd person games)
That is an odd statement. Increasing the number of frames inserted is by far the easy part. The entire task with frame gen is to accurately predict the frame while not increasing latency. AMD could easily make an upscaler that looks BETTER than DLSS. Know why they don't do it? Because turning it on would DEcrease your frame rate instead of increasing it. Same here: they could easily make the 2x far better quality, but the hit to base performance would be so much that it wouldn't be worth using. There is a limited amount of resources these programs have to work with, as well as limited data: only the pixels on your screen. That is it, and they do an excellent job with that limited info. You're saying they should be working on the really hard issue instead of offering the really easy stuff like 3x and 4x. Doesn't really make sense. Everyone likes the option to do 4x, even if just for the fun of it.
@@waltuhputurdaway The only way to get better frames than Lossless is to use raw data that comes directly from the game engine, even before the frame is rendered completely. Motion vectors are the big one people talk about, but there is other useful data as well. Sure, Lossless will get better, but only a bit. By the time GPUs are strong enough to spend significantly more resources on generating frames, there will be AI-driven frame gen in every game for every GPU. With Lossless, there is a limit to how good it can get.
In absolutely any game, input lag increases a lot if your GPU is loaded at 95-100%; it doesn't matter how much fps you have. Nvidia Reflex PARTIALLY solves this problem, but the best solution today is still locking fps. Cyberpunk and Elden Ring showed you this very clearly; it's not about the gamepad or the mouse. LSFG works best if your GPU is never loaded at 100% and your base fps never drops below 60.
First DLSS 2.0 dropped and I couldn't believe it wasn't magic. Then Nvidia released Frame Gen, and it's almost as impressive. I can't wait for what they're cooking up next. They should definitely be criticized for their anti-consumer behaviour, but you can't deny the genius of their engineers.
It's just like an evolution of the injectors from back in the day; they just fiddle with settings to make games look better than they actually are. It isn't a replacement for native, and it certainly isn't magic. More like copium for devs' inability to properly optimize PC games.
Using words like magic and genius brings back bad memories of Apple's language. I don't agree with glorifying these big corporations. Unfortunately, Nvidia is becoming more and more Apple-like.
It's worth mentioning that frame generation will use extra gpu power, and this will cause you to lose "real frames." So if you were getting 60 real fps, you might get 50 when FG is on. Personally I wouldn't bother with FG if you can't keep more than 60 real fps because the input delay will just be too much
I've been using this tool for a year, and it really does work well! Wherever there's no support for AMD RSR or FSR, I'll use it. (Primarily Minecraft Java, tbh. Lol) As well as when streaming videos on YouTube or elsewhere, to bring them to 4K. (Just use a pop-out picture-in-picture for your video, then use the shortcut key to make it scale up. I use Brave browser, which has good PIP support.) Haven't tried frame generation; I'm not a fan of fake frames tbh, and I don't play games that would really require it where it wouldn't hurt me mechanically. Lol.
didn't watch the whole vid, but if you want to demonstrate the smoothness, can't you record at 240fps and play it at 0.25 speed on YouTube to display each frame? Obviously people who have 240hz know it's smoother; it's just about seeing the quality of each frame, and we can piece it back together at full speed.
Agreed. I think he should record it at 120fps, slow it down to 1/2, and upload it like that. Then those with a 120hz screen can run it at 2x and they'll get a real 120hz video.
@@spencer4hire81 Not everyone can watch his video at 2x and get the benefit of seeing it at 120hz; only those with 120hz screens can. I pretty much always watch at 2x anyway. Sure, he gets less watch time, but so what? The whole point is he's showing off high-frame-rate video, and this is a technique he could use so that what his audience sees is closer to what he sees, instead of being limited to 60fps YouTube videos.
I think you mentioned in earlier videos that capping the base frame rate can limit latency, which you didn't mention in this video. I used this with an RTX 2060 to play Black Myth: Wukong at 120 fps on 1440p high settings, with DLSS at 50% (40% in chapter 3) to get a 30fps capped base frame rate, and it played wonderfully. Not capping the framerate introduced latency spikes.
I love this application and use it daily. Yes, there are some artifacts, but they're often small or dismissable for me. In shooters I won't use it and will prefer any built-in game options. Regardless, this is amazing. It's best used in non-esport-type games; for me: Witcher 3, Spider-Man, Path of Exile mapping, and best of all YouTube/Google Chrome. Since many videos and movies are uploaded at 32 fps, this makes the videos really feel like 120, and makes me feel like I'm getting real use out of my monitor outside of gaming. Really appreciate the work they put into this and into making it so simple to use.
For those that don't know, Frame Generation version 1.07 is the best version of FG. You can manually download it and put it in your game files. That version is the most stable one.
I randomly found Lossless Scaling by looking for an app that let me scale my old games using integer scaling, which it does beautifully for most games. Slowly they added more features and it's probably the best few dollars I ever spent on any windows app. The frame generation works so well, yes it's not perfect and requires certain settings like windowed mode, etc. But when it's all in action and activated it's almost like downloading more GPU. Feels good.
A more granular 1.25x or 1.5x mode would be nice here. A perceived jump from 30 to 40fps is significant enough, honestly. I'm thinking this would be a good application for power-limited devices or lower-spec hardware: just enough of a bump to reach that good playability range, but not as much of a hit in terms of visual anomalies, since frame gen would only kick in every fourth or third frame.
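To make the fractional idea concrete, here's a small sketch of what such a cadence could look like. This is purely illustrative: Lossless Scaling doesn't actually expose fractional modes, and the even-spreading rule here is my own assumption about how one might be implemented.

```python
from fractions import Fraction

def frame_cadence(multiplier):
    """Return the output-frame pattern for a fractional frame-gen
    multiplier: 'R' = real frame passed through, 'G' = generated.

    For an output/input ratio of p/q, each group of q real frames
    becomes p output frames, so p - q of them are generated; they
    are spread as evenly as possible across the cycle.
    """
    ratio = Fraction(multiplier).limit_denominator(16)
    p, q = ratio.numerator, ratio.denominator
    pattern = []
    for i in range(q):
        pattern.append("R")
        # how many of the p - q generated frames fall after real frame i
        extra = (i + 1) * (p - q) // q - i * (p - q) // q
        pattern.extend("G" * extra)
    return pattern

print(frame_cadence(1.25))  # ['R', 'R', 'R', 'R', 'G']
print(frame_cadence(1.5))   # ['R', 'R', 'G']
print(frame_cadence(2))     # ['R', 'G']
```

This matches the comment's intuition: at 1.25x only one frame in five on screen is generated, so artifact exposure is a quarter of what a 2x mode produces.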
Daniel, can you please try using Lossless Scaling with the RX 7600? It's crashing a lot for me, and I'm not the only one. There's a whole thread on the official Discord, but so far nothing has changed with each update :(
@@MiGujack3 It's an RX 7600 and RX 580 issue. My PC blacks out and shuts down while using stock settings, without any overclocking or undervolting. The GPU works fine for gaming and there's no issue, but when I use Lossless the PC crashes and shuts down. I'm not the only one; I've found posts about this issue on the Steam page, their Discord, and Reddit. Just hoping Daniel can figure it out, because I wanna use this thing.
Haven't played an AAA game from the past several years, and my laptop is a budget one from 2019 with a 1080p 60hz display and a GTX 1650, so this stuff is all very new and cool to me. I'm upgrading this year to a laptop with a 4060 and a 3.2K 165hz display, so I'm stoked to see what it looks like.
Regarding 10:10, when Daniel enables LSFG x4 on top of DLSS 3 FG: he's attempting base 40, DLSS 3 FG doubled to 80, and then quadrupled to 320. But he's not locking a perfect 80/320 at all times; he's doing 74/305 or something. That's due to the GPU being maxed out. If he dropped some settings so the fps would be 100% locked to 80/320, the input lag would be significantly better. Talking as someone with a 7900 XTX who played Cyberpunk path traced using base 40, FSR3 FG to 80, and then LSFG x3 to 240 fps.
@@chacharealsmooth941 well yeah, there are trade-offs, otherwise no one would ever buy a new GPU. Although maybe in the future FG will be so good that it won't matter anymore.
Daniel, congrats! You have now become both an educator and a youtuber, so that answers the dilemma you were facing in one of your past videos. You offer a great, easy-to-understand explanation of things many people may not understand: the difference in latency, and what the values being presented mean. Appreciate it!
2x is for 120hz gaming, 3x is for 180hz, and 4x is for 240hz. I have been enjoying DLSS frame gen so, so much in Cyberpunk and The Witcher. Going from 65 to 110 is absolutely worth the increase in latency, which I can't seem to notice on a controller. I hope FSR and DLSS offer a 3x or 4x soon. AMD needs to take some notes from Lossless to better AFMF. I lost my 6950 XT before I could use the second version, but I can say with confidence that the first iteration of AFMF was absolute TRASH. Glad AMD is making some moves.
As someone who loves this software and uses it in most games, I do want to advise that it drops your base frame rate by about 10-20 fps. So if you're too far under 60fps, it will induce fairly significant input latency; anywhere near or above 60fps, and it's a significant improvement to smoothness.
I think you're wrong about something. x4 does have image quality issues right now, but x2 LSFG completely destroys FSR3 frame gen in terms of image quality; there's no comparison. The only advantage FSR3 has over LSFG x2 is that FSR3 skips the HUD entirely, but that leads to other issues as well. FSR3 barely creates an actually convincing/correct/stable interpolated image right now.
@@damara2268 I have Lossless Scaling. And I have used FSR 3.0 mod on Cyberpunk 2077 for example. Being easier does not matter to me. What matters is there a mod or not. And the image quality, how much the base frame rate drops etc.
@@mikalappalainen2041 FSR 3 is much better; tested in TLOU. Lossless Scaling works like shit in this game and burns the GPU a lot. FSR decreases GPU usage and temps, gives more stable fps, and cuts latency much better. The only reason to use Lossless is a game without FSR 3 frame generation.
Been using this on WoW. And omg, it is amazing. Set base fps to 60 in the game, as that's the lowest I drop to in Valdrakken, then set it to x3 with allow tearing. A perfect 180fps matches my monitor refresh rate and doesn't skip a beat. So good.
@@cajampa it is not the same at all. You can test it even without numbers; the difference is so obvious. Also, x4 eats a lot of base fps, so you've got that too.
@@christophermullins7163 there is no coordination between real and generated frames. If you use both of them at the same time it will be a bloody mess, my man; it's not just math, 2x × 2x = 4x. Enjoy the mess lol.
I'm using 2X with a PS2 emulator so I can get 60fps in 30fps titles and 120fps in 60fps titles, which is great stuff because it's a smooth experience and doesn't affect game speed, and 2x is good enough to not get artifacts all around. Sometimes I use 3X, but to be honest it's not needed for emulation; it's noticeable but doesn't make a big enough difference to compensate for the increase in artifacts/visual glitches. I used 3X on Avatar: Frontiers and it's incredible: went from 50fps at 1440p high to 140-160fps using just an RTX 4060, with a small increase in input lag, enough to be noticeable but not enough to affect the gameplay. If you don't care that much about visual fidelity, it can be awesome.
I tried Lossless Scaling X4 earlier on Trepang2, a super fast game, and noticed no input latency at all. I found I died less and played better with the fluidity. Absolutely amazing; I would rather use this than DLSS frame generation. The end of the gun garbled slightly, but in a fast-paced game it's not notable whatsoever.
One place Lossless Scaling has been amazing is emulation, where games had fixed framerates. For example, FF8 had 15fps battles (vomit), but since they are menu-driven, applying Lossless Scaling makes it look so much better. FG will never fix how laggy it feels, but if you are getting a headache from how it LOOKS, then it can help immensely.
I just tried this on Helldivers 2 and finally hit 240fps on my 240hz 1080p monitor! Felt super smooth, but the latency was definitely noticeable. I prefer x2 and x3 mode; perceived latency is practically non-existent in x2 mode as long as you have the right settings switched on.
Been using it for about 2 months now, and I'm halfway through replaying God of War for the first time since it came out on PS4; it works flawlessly. Lossless Scaling is a must-buy gem. Definitely a must-have for chill, non-competitive games.
I'm surprised Nvidia and AMD didn't do this first. Maybe they were saving 3- and 4-frame interpolation to pretend their _next_ generation is worth what they'll be asking for it.
I am loving Lossless Scaling, and yes, a controller is the way to go. With less jerky movements, regardless of the frame generation technology you use, a controller will allow the software to present a far better and more consistent-looking picture than is currently possible with a kb&m. Try using a controller and you'll notice a big difference when bumping up frames. I've yet to see how the new X4 and G-Sync update feels, but latency is just one of those things that will not feel good for most, depending of course on the game you are playing. I recommend sticking to 2X frame generation, if you can, to reduce latency.

I just recently made the switch from 30+ years of console gaming to PC gaming about 8 months ago, and witnessing frame rates almost double that of a console, without the use of scaling of any kind, is still an incredible experience to me. Knocking a game's frame rate up any higher is kinda just a bonus for me. Plus, I still prefer to play my games on my 55" LG C1 OLED TV, which only supports 120Hz, so all I'm ever going to see is 120fps, and I'm completely content with it. There is no way I can go from a large TV to a smaller PC monitor. It ain't gunna happen. Maybe when new LG TVs hit the market that support 240Hz is when I'll upgrade.

For people chasing FPS gains, the way I see it is this: of the vast majority of gamers out there, 90% to 95% are playing on consoles, at much lower frames, and are still enjoying their games at 30fps to 60fps. Rarely does 120fps pop up on console, and when it does, it's a heck of a treat. My point? Why rush to the store for more frames when what you've got right now is better than what most console gamers are experiencing? When you look at these new GPUs, you also have to look at a new monitor, which together is outrageously expensive.
On top of that, even the most incredible PC games released today are still struggling to hit native 4K at 60fps to 120fps, and that is where DLSS, FSR, and now Lossless Scaling come into play, to save your wallet where the fps gains would be minimal. The problem with the new hardware-based frame gen technology in these new GPUs is that it currently supports very few games. I think like 95% of them are sponsored games, and who wants to upgrade a GPU for the 2 or 3 new games every year that support this frame gen technology, while also hoping those games are good? Go for it if you have the money, but if you don't, just know that you are already sitting pretty as a PC gamer over the vast majority of gamers. Don't let these GPU companies fool you into upgrading; use Lossless Scaling for now, until we start to get more games that include frame generation.

What's funny is that Lossless Scaling outperforms FSR 3 frame gen technology. I've seen side-by-side comparisons of FSR 3 frame gen versus Lossless Scaling, and while FSR 3 frame gen has its perks, Lossless Scaling still looks better. AI frame gen software is exactly what people need right now who just do not financially have the means to purchase the latest and greatest in graphics technology every year, when Lossless Scaling is a hair away from the competition. A software-based GPU upgrade for only $7 bucks.

I don't know if Lossless Scaling has this feature or not, but if the devs read this, I'd like them to add an option to turn off the fps counter if possible, and replace it with a small X2, X3, or X4 watermark to indicate that the software is doing its thing. While looking at FPS counters all day can be fun, I'm just not personally a fan of fps counters in my games, because I find them a bit distracting. I don't really care how many frames it's generating when I can clearly see that it's working based on how I set it.
The only time I care about the FPS counter is when I'm initially dialing in my settings to cap the frame rate to the consistency the software requires in order for Lossless Scaling to produce stable additional frames. From that point forward, Lossless Scaling's fps counter doesn't really mean much to me, if that makes any sense. The program does what it needs to do, and it does it exceptionally well! Just a thought I'd throw out there, but no big deal. Great video as always Daniel! Cheers y'all!🍻
this thing is the best thing ever made in gaming history. It helps me with Helldivers, where I have a 4060 Ti GPU and a CPU that's weak for the game, an AMD 5600X, which meant I only reached around 40-55 fps max in that game (while Wukong can give me around 70-80 fps in cinematic). I was thinking of upgrading my whole PC except the GPU, and then I found your video and tried it, and it works just like magic. As you said, frame gen doesn't improve response time, but the higher fps makes my own natural response time better XD
Lossless Scaling is legitimately great, but I've found the most use for it on YouTube and with emulators. Real frames are always preferable; interpolating past frame caps is where it's at.
I managed to get a solid 60fps with the RX 580 and a 1440p monitor: mixed low/medium/high/ultra settings and the DLSS Enabler mod + LSFG. You need to cap the actual fps using RivaTuner though; 60/120 is smooth af with FreeSync or G-Sync.
@@MiGujack3 you will get tearing otherwise. Enable Vsync from NVCP. I had no severe latency issues with Vsync + Lossless for games that are not competitive. But I am using a controller, which is recommended.
I have an RX 570 4GB paired with an i5-2400... It runs Elden Ring at 1080p high at 30-40 fps (CPU bottleneck) most of the time. If I cap at 25 frames and use the 3x mode of Lossless, I get 75 frames (my monitor refresh rate). It has artifacting, and the input lag is noticeable, but it's waaaaaaaaaaay more stable and enjoyable for me; I can even play at a higher resolution! This program gave so much life to my PC! :)
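The trick in the comment above generalizes: pick a base-fps cap that multiplies exactly to your monitor's refresh rate. Here's a small sketch of that arithmetic; the 2x/3x/4x multiplier list matches the modes discussed in this thread, but the 24fps floor is my own assumption about the lowest playable base rate.

```python
def best_cap(refresh_hz, multipliers=(2, 3, 4), min_base=24):
    """Return (base_cap, multiplier) pairs where cap * multiplier
    lands exactly on the monitor refresh rate, highest base fps
    first. Only multipliers that divide the refresh rate evenly
    qualify, and the base cap must stay playable (>= min_base)."""
    options = []
    for mult in sorted(multipliers):
        cap, remainder = divmod(refresh_hz, mult)
        if remainder == 0 and cap >= min_base:
            options.append((cap, mult))
    return sorted(options, reverse=True)

# 75 Hz monitor: only x3 from a 25fps cap hits 75 exactly,
# which is the Elden Ring setup described above.
print(best_cap(75))   # [(25, 3)]

# 240 Hz monitor: several clean options, highest base fps first.
print(best_cap(240))  # [(120, 2), (80, 3), (60, 4)]
```

The higher the base cap you can sustain, the lower the input lag, so you'd normally take the first option your GPU can actually hold steady.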
I tried Lossless Scaling but refunded it because it made every game screen tear like crazy. I played with the settings a lot (VSync, framerate limits, etc.) but had no luck.
Some games gave me more issues than others with LS. I wonder if by tearing, you're actually seeing artifacting. The X2 mode is least likely to artifact and tear and make sure you don't have in-game settings and LS settings fighting each other. I had to play with in-game settings, LS settings, and my driver settings to get an optimal experience but once I solved it, it works great.
What was your fps? Have you enabled VSync only from the Nvidia panel? Are you using RTSS properly? What monitor refresh rate have you set? Or maybe the game you're trying has problems of its own that only the developers can fix; check the internet for stuttering fixes. I can help you with this. For me, Lossless Scaling works perfectly with all older games and emulators.
I think the endgame use of this is using the highest quality/least artefacting modes stacked on top. So DLDSR stacked with Quality DLSS stacked with x2 frame generation will have more input lag sure, but you could take a game from unplayable to solid and banish aliasing to boot. I think combining with Special K framecapping there is some potential to reduce felt input lag and improve stability even further, and in a few key situations it really gets going. I think that once these tools are mature the next step is developing a system that dynamically uses tools like these together to pull out a close-to-ideal configuration automatically, which makes fiddling with individual settings even easier as you're closer to the desirable endpoint when you start messing with stuff. Frame-generation with frame-capped games benefits especially, as you pointed out - it's a way of increasing smoothness and providing high refresh rate compatibility without an egregious downside. Guess I gotta pick up lossless scaling.
@@MrSuvvri but what's the point if the mouse isn't smooth? Idk, when I'd want to use it is when it sucks the most so this quite literally seems no good
Yeah I understand where you are coming from. Because I don't really find it useful in shooting games. Almost unplayable. I use it for livestream and videos tho, almost all the time. Movies, discord streaming. All this makes huge difference.
If the latency is too much for you, you can try setting "Low Latency Mode" to Ultra. This setting can be found in the Nvidia Control Panel and can be pretty useful for some games.
u must pay for lootboxes and get the legendary X5 mode with a drop rate of 0.000000000x10^-432149x=z²+x/69, after that you need the battle pass (only $1000 right now, it's almost free ngl), and only then, once you have 100,000 hours on Lossless Scaling and make a $2000 payment, do you finally get X6, but it's only available to those who own an RTX 10090 Ti Super Ultra (which costs around $15k in 2050)
I love to use Lossless Scaling in KT Racing's WRC games. Their game engine needs to be locked to either 30, 60 or 120fps or the experience won't be smooth. My PC can't quite manage 120fps so I lock to 60fps, enable 2x in Lossless and I get a perfect 120fps experience.
@@BourbonBiscuit. What he is saying is that no matter if you are generating 1, 2, or 3 frames, the latency is only as long as the time it has to wait for the second in-game frame. So all three should have the exact same latency, but from some reviews I've seen, the 3x has less latency than 2x (something to do with frame pacing, or idk, I ain't no tech wizard).
@@BourbonBiscuit. Yeah, but why would you do that if you can already hit the target frame rate? Going from 30 to 120 when you could go from 60 to 120 would be both very stupid and unnecessary, but going from 60 to 240 should feel the exact same as 60 to 120 when it comes to latency.
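The "same latency at any multiplier" claim can be sketched with some back-of-envelope arithmetic. This is a simplified model, not Lossless Scaling's documented pipeline: it assumes the interpolator buffers the newest real frame until the next one arrives, so the added delay is about one real frame time regardless of how many frames are generated in between.

```python
# Simplified frame-interpolation latency model (an assumption, not
# measured behavior): one real frame of buffering, independent of
# the generation multiplier.

def added_latency_ms(real_fps: float) -> float:
    """Extra delay from holding back one real frame, in ms."""
    return 1000.0 / real_fps

def output_fps(real_fps: float, multiplier: int) -> float:
    """Displayed frame rate after generation."""
    return real_fps * multiplier

# 60 fps base: 2x, 3x, and 4x all add the same ~16.7 ms in this model.
for mult in (2, 3, 4):
    print(f"{mult}x -> {output_fps(60, mult):.0f} fps, "
          f"+{added_latency_ms(60):.1f} ms")
```

In this model 60→120 and 60→240 carry identical added latency, which matches the comment; real-world frame pacing can still make the modes feel different.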
As a teacher you shouldn't compare apples with oranges; you know the scientific method. So if you want to compare latency and image artifacts, you compare DLSS FG to LS 2x, not to the 4x. You should only compare LS 4x with LS 3x and 2x, because then only one variable changes: the number of frames inserted on the same upscaler. So I have to vote negatively on your video; I could have voted positively if you had compared apples with apples.
This app is now my default method for playing AAA games, mostly because my 2060 is pretty much outdated now that all these new games have intensive graphics. It also helps me achieve higher graphics settings and favorable performance at the same time. For example, normally I wouldn't be able to achieve a stable 60 fps on High settings in games like Black Myth: Wukong. But with this app, I can just cap the frame rate at 30, turn all the graphics settings to High, set LS to 4x, and casually enjoy the free 120 fps on High settings. Sure, the input lag is there and can get a bit annoying at times, but that's way better than playing at an unstable 60 fps that keeps dropping to the 50s or even low 40s in some areas. Watching movies with this is a blast too, if you're fine with anything more than 30 fps in movies; it works really well with movies that heavily use CGI, like MCU movies. I'm getting more value from this $7 app than I ever had from spending hundreds or even thousands of dollars upgrading my hardware.
Agreed. Ignore the other comments. People acting like frame interpolation is "performance" are either lying to themselves or don't understand the technology at all. This is a cool way to get "smoother" gameplay, but it introduces input lag and artificial frames to make it happen. You can't call that performance. Imagine if your car's speedometer doubled the speed it showed you, so while driving at 60 MPH it showed 120 MPH. Is that performance? Of course not. This isn't anything like adding a turbo to an engine; overclocking would be that equivalent.
@@nintendoconvert4045 But this is actually giving you more frames per second, it's not just SHOWING more FPS like your speedometer comparison. If it said 240fps while actually getting 60 fps this app would be getting entirely shit on by everyone, but it doesn't do that.
This would be such a game changer in Winlator for emulating PC games. Hell, even FSR would be a game changer... why doesn't it exist? In VR as well, it would be a crazy game changer...
Hi, thanks for all your work; this channel is very interesting because it tries to explain things in very simple ways. Anyway, I don't understand the latency part (if we're not talking about just a sensation). The additional generated frame, according to the explanation, should sit between two frames my machine would render anyway, doubling the frames shown (in the 2x case), not delaying the second "natural" one, or it would lose its purpose. Let me explain: with no frame generator active, at 60 FPS, I press the mouse button at 1/60 and at 2/60 the gun begins to shoot on my monitor; now I activate the 2x frame generator and get 120 FPS; I press the mouse button at 1/120 and at 3/120 the gun should begin to shoot. I will have a sensation of delay because my eyes see it later, but the time between 1-2/60 and 1-3/120 should be the same. I can only think of three things: 1) The frame generator has some kind of priority and "locks" my inputs while it works, so my input is queued (my PC starts processing my input at frame 3/120 and I see the result on the monitor at frame 5/120). 2) There is no real delay; it's just an optical illusion. It bothers your perception, but the time from press to shot is the same with or without the frame generator. 3) It "stretches" the second it says the frames are in, but in that case, after many hours my clock would differ from the in-game one 😁😁 (obviously a joke, but I liked putting it in; hope you find it funny). Jokes aside, what are your thoughts on this? Do you know the real reason for the delay, if it is there and not just a sensation? (Note: since DLSS is integrated into the game, the locking part could be directly in the game, but the reasoning should be the same.) I hope you will reply, but thanks in any case for your channel if you don't find the time.
It works very well now, even when the frame rate is not an even 60. I tried it on the FF XVI demo and it was still excellent on my RTX 3060 12GB. Impressive software and tech this is. To add: everything maxed and DLSS 3 with the quality preset, plus lossless frame gen.
I am currently playing A Plague Tale: Requiem, and let's just say that with my Vega 64 I would be cooked! BUT using Lossless Scaling I can play at 1440p on the high preset at 60/70 fps with the 2x mode, and I don't even see many artifacts; only in dimly lit, occluded environments do I see a bit of a shimmering effect on some walls. I love this app; the creator is THE GOAT.
What I'd be curious about is whether the negatives mentioned happen much if you are not trying to add SO MANY extra frames, but rather just close the gap to a nice round value (like 60 or 72, capping it with MSI Afterburner/RTSS or the like).
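For picking that cap, the arithmetic is just refresh rate divided by the generation multiplier; a hypothetical helper (the function name is made up for illustration):

```python
# Hypothetical helper: choose a frame cap so cap * multiplier lands
# exactly on the monitor's refresh rate (the cap itself would be set
# in MSI Afterburner/RTSS, as the comment suggests).

def frame_cap(refresh_hz: int, multiplier: int) -> float:
    """Base frame rate to cap at for a given generation multiplier."""
    return refresh_hz / multiplier

print(frame_cap(144, 2))  # 72.0 -> 2x reaches 144 Hz
print(frame_cap(120, 2))  # 60.0 -> 2x reaches 120 Hz
```

The closer the uncapped frame rate already sits to that cap, the fewer generated frames are needed to fill the gap, which is exactly the low-multiplier scenario the comment is asking about.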