If you use this, let me know what hardware you're on and what kind of performance you're seeing. This is not something I recommend for multiplayer games since it introduces more latency, but for single-player games this is pretty awesome if you need just a bit more FPS.
Could you PLEASE show this on a DESKTOP with a low-to-mid GPU like an RX 6600 and show us what it can do with that?? Can it bring it up above 120fps at 1440p??? THANKS!!
I have an AMD R5 3600 paired with an RX 6600 XT and 32GB of 3200MHz RAM. With my setup, Ryujinx runs TOTK and BOTW locked at 30; using 3x I can play at 90fps with dips to 65fps. It's a console game anyway, so latency doesn't really matter. Oh, and I also play FO4 and Skyrim fully modded with Wabbajack modlists (200GB+ each): I lock my fps to 60 and use 2x, and I can play at 1080p with ~100fps.
I'm on a ROG Ally Z1 Extreme and I also use Lossless Scaling: FSR scaling set to 8 (optimized), LSFG 2.3 at X2 performance mode, default sync mode and 1 frame latency, depending on what game I'm playing. Graphics settings are always configured so that I get 60/120fps with LSFG. Most games I play at 720p upscaled to 1440p. On the standard Ally screen this looks pretty decent; I'm also using an external monitor which is certainly bigger, so some low detail is still visible there. I hope in the future to get my hands on an XG Mobile 4090 to massively boost GPU power and be able to play beyond 120fps at max settings.
I bought LS in 2019 to solve some issues with games that refused to work well in fullscreen 4K, but now I'm glad I have it because it has improved a lot.
Yeah, I bought this specifically for Project Zomboid at 4K, since the 4K UI scaling is bad in that game. This app has SIGNIFICANTLY grown since then. It's amazing.
Yep, the latency you get when you use x2/x3/x4 scaling from a base 30fps is the same as just playing at 30fps. It helps remove the strain of playing at low FPS, but it keeps the latency of low FPS.
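The point above can be illustrated with simple arithmetic (a minimal sketch; the function name and numbers are just for illustration): interpolated frames raise the displayed rate, but the game still samples input once per real frame, so input latency tracks the base frame time.

```python
def frame_time_ms(fps: float) -> float:
    """Time between real, input-sampling frames in milliseconds."""
    return 1000.0 / fps

# With 2x/3x/4x interpolation the display rate rises, but the game
# still reads input once per *real* frame, so the input cadence
# stays pinned to the base frame time, not the displayed one.
base = 30
for mult in (2, 3, 4):
    displayed = base * mult
    print(f"{base}fps x{mult}: displayed {displayed}fps, "
          f"input cadence still ~{frame_time_ms(base):.1f} ms")
```

Running it shows the displayed rate climbing to 60/90/120fps while the input cadence stays at roughly 33 ms, which is exactly the "30fps latency" the comment describes.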
I've been using this with a 3090 for months now. It basically adds frame gen to 30-series cards, and it's been fantastic. Works better than FSR frame gen modding, for sure.
Didn't realise the scaling was taking a low res and bumping it up. I will turn the AC SE version off (RIS, I think?) and use LS1 instead at 720p and upscale. Warhammer 2, I think, would benefit.
Y'all, if you have a crazy CPU bottleneck like me, this app is your best friend. This app literally changes the game; I can see it going mainstream very soon. It's wild.
No, DLSS and FSR are 'better', but not tweakable/adjustable. DLSS 3 also needs to be integrated into the game's render pipeline, since it uses motion vector data from the game to build the interpolated frames. So it's more powerful... but it's not going to work for non-games. The AI model is also trained on game-generated motion vector data specifically, so if you feed it non-game content (anime, animation, YouTube, or live-action material) it might not handle it well. LS can work with video, since it's essentially a video processor, like the Motion Interpolation / Motion Smoothing in TVs with game-ready features such as upscaling/interpolation. SVP is another video processing/smoothing app, but it's limited in usefulness. To some degree FSR is more similar to this, but it has fewer tweakable settings. If AMD allowed FSR with video, maybe that could be done, but it would be a bit of a hack. While DLSS could be made to work with video, it's harder to do well, since video data isn't game data.
It's probably been mentioned by others, but going up to 3x or 4x and seeing bad results has nothing really to do with the power of your device; it's just that you are creating more frames that are guessed rather than rendered. The more frames you guess, the more inaccurate the result is likely to be. For example, if your original FPS is 30-40, it doesn't matter what GPU or CPU you have: trying to push that up to 90 or even 120fps is going to look bad, as there are simply too many gaps to fill.
I was starting to notice some weird distortion at the edges of the screen in his CP2077 example. At first I thought it was just motion blur, but the more I watched, the more it seemed like this prioritizes the action in the center of the screen, and the frame gen gets a little lazy around the edges where you might not be looking.
@@bokocchop Cool thing is, Linux already has a package for scaling called gamescope that's developed by Valve themselves. I'm pretty sure the Steam Deck already uses it to some extent.
@@jefferson-tan Steam Deck doesn't have system-wide or driver-level frame gen. At the moment it only works in games that have it built in. Future updates may add AFMF.
I bought Lossless Scaling when it first came out. First of all, it is an amazing program that can net some very real gains in fps, and because they have continued to work on it, there are very few (if any noticeable) downsides. It's amazing. I'm happy you finally did a video on it. The developers are committed to furthering it too; its development is really good.
You can actually add the generated frames in Afterburner and RTSS. Add it as an RTSS sensor called Frame Rate (Presented), then add it to your overlay via RTSS 👍
Goddamn! I didn't know this has been around for quite a while; I just discovered it today through your video. Bought it on Steam and immediately tried it on Cyberpunk 2077. I only have a mid-range system, a 3060 Ti with just an i5-10400F, and I normally get 35-55fps on high settings with RTX on. But with this borderline magic, I was able to reach 2x my usual framerate, which is 80-100 (I really don't need more than 2x due to my monitor's refresh rate limitation). Sure, a bit of a sacrifice with a tiny bit of input lag, but it's barely noticeable once you've been playing for a while.
@@rambledogs2012 probably because I have RTX on high? Based on others' experience with specs similar to mine and the same RTX settings, it's really just going to be around 40-60fps, depending on the environment and situation.
@@leincin OK, got you. I turned it off as it was ruining my frame rate. I didn't notice much difference graphics-wise, so I left it off. Turned off some other settings too, as they didn't make much impact graphically.
IT ACTUALLY WORKS. Guys, let's support the devs so we can have more updates, this is insane!! It's not magic, it's AI/technology techniques. It's perfect for GPUs like mine (a 1080) that are showing their age. I got 2x the FPS with the same graphics settings as native. NVIDIA could do this, but instead they want to sell you expensive GPUs and lock the new cool stuff behind them. Games that I tried: The Last of Us (Ultra, all settings maxed, 2560x1080 ultrawide): before 40-50 FPS, after 80-90 FPS. Horizon Zero Dawn (Ultra, all settings maxed, 2560x1080 ultrawide): before 40-60 FPS, after 80-90 FPS.
Here are some extra points that I find somewhat alluring:
- For people who are into emulation, this is perfect when you are playing a game that runs at 30 or even 60fps and you want even more frames. Older titles running at 30fps are far from unplayable, but LS can let you play at a framerate the game can't natively support.
- If you're playing a game that doesn't natively hit your monitor's refresh rate, cap the in-game framerate at half of what you want to hit, then turn on LS. Even if you have a really nice PC, this is nice when you want to see 165fps, for instance, but a game maxes out at 120. I would set the game to run at 85fps, then use LS.
- If you really wanted to, you can use this for videos and movies. I don't typically do this with movies, but I watch some YouTube content that was filmed in 30fps, and I'll occasionally bump the video up to 60, sometimes even 120fps, just to see what it looks like.
You get so much for just $7; I think it's a no-brainer purchase regardless of what specs your PC has.
@@leochrismxa3672 there are community-made cheat patches that let you run 60fps in 30fps-locked games. Install one of those and then use LSFG to output 120fps. If no such patch is available, then your only option is 30 to 60 via LSFG.
I think the Lossless Scaling guy should team up with the Marty's Mods guy and use his sophisticated optical flow and normal buffer generation techniques to get even better quality and enable DLSS and XeSS support.
I've been using this on my ROG Zephyrus 2022 with the iGPU (Radeon 680M) for better battery while on the go (since I don't have the money to buy a handheld; I'm like 300 bucks short), and dang, in Cyberpunk 2077 it went from 30fps at 1080p medium with no FSR at 15W to like 60-70fps average, which is crazy, especially when I'm gaming at low wattage to keep the battery life up. An amazing experience, I would say; worth every dollar.
One tip: limit fps with RivaTuner to 30 (you can undervolt and/or overclock). Don't do it in-game, it will artifact more. Then double it to get a solid, stable 60. Frame gen is not a fan of variable framerates (unless your display has a native variable refresh rate feature), and never go above your display's refresh rate. 70-80 is nice, but it's just artifacting too much. If you have a 120fps screen, that would be OK if you can hold 60 and double it; otherwise, limit the display to 60fps. For a game like Cyberpunk I would go 60 any day.
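The rule of thumb above (never let multiplied output exceed the display refresh rate) can be written down as a tiny helper. This is a hypothetical illustration, not part of Lossless Scaling or RivaTuner:

```python
def suggest_cap(refresh_hz: int, multiplier: int) -> int:
    """Largest base-fps cap whose multiplied output stays at or
    below the display refresh rate (illustrative helper only)."""
    return refresh_hz // multiplier

# 60Hz display with 2x frame gen: cap the game at 30fps.
# 120Hz display with 2x: cap at 60fps.
# 165Hz display with 3x: cap at 55fps.
print(suggest_cap(60, 2), suggest_cap(120, 2), suggest_cap(165, 3))
```

Capping at this value with RTSS also keeps the base framerate flat, which avoids the variable-framerate artifacting the comment warns about.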
Absolutely floored by how simple and crazy performant this program is. Crazy that a single developer is beating out the likes of NVIDIA; this makes DLSS look like garbage.
I use it on my laptop's RTX 4060 to get a high-refresh-rate experience. I run Lossless Scaling's frame generation on the Intel Iris iGPU, which means I don't lose any performance when turning on frame generation; it's a straight doubling in fps, 60 -> 120.
I'm getting weird behavior in Total War: Three Kingdoms. Settings and results below, but basically, running native 1080p is clearly faster (not counting AI-generated frames, obviously, because the benchmark doesn't count them either) than a 1080p window scaled up to the 1440p native display resolution. Native 1080p also looks better in actual gameplay.

Hardware: TUF A15, 6800H + 32GB, 3060 6GB mobile. Global hardware settings: min CPU state 35%, max 90%; 35W SPL / 45W SPPT / 55W FPPT; GPU with 25W boost, no manual OC. External 3440x1440 monitor (it was on sale for the price of a 1080p 16:9; I might as well have this and enjoy it through more upgrades).

Reference native settings: 3440x1440, no NIS in NVIDIA Control Panel, advanced settings custom*. Battle benchmark: ~56fps**.

Actual gameplay settings I currently use: 3440x1440, NVIDIA Image Scaling on, in-game image scaling 70%, advanced settings custom*. Battle benchmark: ~83fps**. Qualitative: looks nice, can't tell the difference vs. native 3440x1440; can have stutters in benchmark/actual battles, has stutters on the campaign map, but nothing that would bother me more than sh*tty visuals (this isn't about FPS; included here because I didn't dig into the frametime graph).

No NIS, "native" 1080p 21:9: 2560x1080, NVIDIA Image Scaling still enabled in NVIDIA Control Panel, in-game image scaling 100%, advanced settings custom*. Battle benchmark: ~85fps** (~55fps with FXAA)**. Qualitative: differences are obvious since you start out in the menu with everything rendered at 1080p, i.e. the menu items are visibly larger and stretched out on the display; same stutter notes as above.

Lossless Scaling on: 2560x1080, windowed, 100% on the in-game image scaling slider, advanced settings custom*; LS at the default slider setting; 2x frame gen, performance on. Battle benchmark: ~55fps**, with frame gen: ~110fps. Qualitative: image quality looks like I went back to Total War: Rome II/Shogun II on low settings, i.e. kinda blocky (like the arms, sleeved or not);*** could use a lot of sharpening; contrast kinda sucks too.

*All constant to make the data more easily comparable; all settings here based on what I actually use
**Two-run average; I just need the rough range, I'm not benchmarking a computer as a content creator
***It's not Tekken 2 on PS2 blocky, but it's not even clearly on par with Tekken Tag on PS3
I've been messing around with overclocking all day to get 120fps in Warhammer 40,000: Space Marine 2, and I came out defeated; the best I got was 70fps on an RTX 4060. Then this was recommended in my feed, and OMG, yes. I've just played Dante's Inferno at 4K 120fps, and goddamn, I'm still waiting for the other shoe to drop because I still can't believe it worked. After days of tinkering with new releases to get the best quality and performance, it's always a hassle, and then you get defeated and just say "f it, guess I just gotta experience the game in windowed mode at 1080p at 60fps". This always irked me, as I want to play at 4K and I spent so much money on my PC (hence the reason I got an Xbox). But this has single-handedly solved my issues: first, fps and quality; second, longevity; and third, doubt. Nothing's worse than spending money on a great-looking game only to have to experience it at lower quality, or with frame drops to the point it's unplayable. Best £7 I've ever spent.
I have a GTX 1060 6GB and I can't manage to use the program in games like Valorant or CS. I limit the fps of those games to half my screen's refresh rate, but when I scale with the program it does nothing and the fps drops even more. I mean, I play Valorant at 200fps, but when I turn this program on it just goes to 40. I don't mind for Valorant because I have enough FPS, but in CS2, for example, I get around 100fps and I'd like to have 140. I reduce the in-game FPS limit to 70 and turn on the program, and suddenly my FPS goes to 40 and the mouse doesn't work properly.
*immediately proceeds to watch this at 4K 240Hz* :D Yes, this software is great. But you can easily notice those frames are fake when playing an FPS. As long as the image doesn't move fast, it's great.
There is some heavy head flickering in third-person games when you move the camera; Spider-Man is a good example of that. Very annoying bug. But 60fps-locked games get a nice boost from this software 👍
Best €6 I've spent on a desktop app! It's insane for framerate-locked games like Earth Defense Force; running them at 120fps (or more) instead of 60fps is great! The app even comes in clutch if you want to play more demanding games on max settings at 4K with RTX 30-series cards 👌
Doesn't really work ideally for me. I've tried every single guide possible, yet I can't seem to get better frames with doable input lag. Not worth the 7 dollars lol
Technically, it would turn 19 into 38, and 10 into 20. It "doubles". You can also use "triple" and "quadruple", but I personally don't suggest it. However, it is a post-processing effect, so it will lower your base frame rate somewhat.
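The math above can be sketched in a few lines (purely illustrative; the function name and the overhead figure are made up, since the actual cost depends on your hardware and settings):

```python
def output_fps(base_fps: float, multiplier: int, overhead_fps: float = 0.0) -> float:
    """Displayed fps after frame generation.  overhead_fps models the
    (workload-dependent, illustrative) drop in base frame rate caused
    by running the post-processing pass."""
    return (base_fps - overhead_fps) * multiplier

print(output_fps(19, 2))     # 19 doubles to 38 with no overhead
print(output_fps(10, 2))     # 10 doubles to 20
print(output_fps(60, 2, 5))  # if 60 drops to 55 real frames, you see 110
```

The last line is the "lowers your base frame rate" caveat: the multiplier applies to whatever base rate survives the post-processing cost, not to the original number.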
This should be an ad not to waste money on a laptop lol. I made the major mistake of buying a laptop after I built my desktop and realized how bad the price-to-performance was. Buying a laptop is borderline a scam.
I have a Legion Go but can't download it on Steam because it says I have a Steam Deck, which I don't. I received a refund; very strange. Is Steam the only place I can download this?
If you can play at a stable 30fps, it will work. If you can't hit at least 30fps, it's not worth using; the generated 60fps will look like a slow-motion version. You need 30fps or higher to have the best experience.
I tried this today. All I'll say is, this is made for single-player games where the gameplay speed is somewhat like Elden Ring. I then tried it on Battlefield 2042 and I could feel the delay. Not for competitive games, but for single-player games it's life-changing.
Yes, but it's things like this that give the application more exposure. I'd never heard of LS until a couple of months ago, because of a random YouTube video.
Question: I'm having a problem switching from the LS app back to the borderless-window game screen to get the game to upscale on my laptop. Is there a quick key combo I can press to go back and forth?
Let's say I'm playing an emulated game that only gets 30fps natively and has graphical glitches that show up if I try to 2x or 4x the resolution to 1080p in the emulator. Will the upscaling in this program help avoid the glitches and still give me the smoother graphics of the higher res? And what if I try to bump the frame rate up? Presumably, the higher frame rate means weird artifacts will pop up in high-movement situations.
Capping the framerate at half the refresh rate gives smoother frametimes and the best overall fluidity. I'm playing the new Senua game at 30fps boosted to 60 on my RTX 2070 and it works great. Even though I can push a native 60fps, I prefer this during summer because it gives me much lower temps and my room doesn't become an oven lol
FINALLY!! I've been screaming about this little app everywhere; I'm glad it's finally getting some attention. Tried this on an old 2015 Radeon R330M dedicated laptop GPU in The Witcher 3: went from unplayable 240p 20fps "gameplay" to a scaled-up 720p at 40fps. Same hardware, just one app... that's a miracle in my book (and by using the integrated Intel GPU as the scaler, you get no lag whatsoever). This is a must-have app.
I rock an RX 580. Still a great card, but it's showing its wear and tear. I thought there was no way a program could 4x my fps. Pretty bad input lag on an office Bluetooth mouse, but it's phenomenal. Works as advertised.
I will tell you, I don't need to use resolution scaling with a 4090, but before I had it, resolution scaling was probably one of the most amazing techniques next to DLSS. You can have an entry-level or mid-range card and get 140fps in competitive games with a cheap GPU. You guys don't know how lucky you are nowadays to have options like scaling and DLSS compared to 20-some years ago.