
40% Less Input Lag Without AMD Anti-Lag or NVidia Ultra-Low Latency 

Battle(non)sense
176K subscribers
463K views

What should have been a simple and straightforward test of AMD's Anti-Lag and NVidia's Ultra-Low Latency mode took an unforeseen turn when I discovered something else that has a major impact on input lag.
► Support Battle(Non)Sense:
/ battlenonsense
► Stable Frame-Time / Input-Lag: • How To Fix Stutter In ...
► Previous video: • Bufferbloat & Lag - Wh...
► Next Video: • CoD Modern Warfare Bet...
► Connect with me:
➜ FB: BattleNonSense
➜ twitter: BattleNonSense
➜ email: chris@battlenonsense.com
► Outro Music:
Many of you asked for the name of the outro song. Sadly I have to tell you that it is not a "song". It is custom-made music for intros/outros that I bought a while ago.
#InputLag #Anti-Lag #Ultra-LowLatency

Published: 27 Sep 2024

Comments: 1.7K
@BattleNonSense 5 years ago
► Stable Frame-Time / Input-Lag: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xsXFUVYPIx4.html
@youngyongyung 5 years ago
11:20 So the reason you get more input lag when you're GPU bound is this. The game sends a bunch of commands to the GPU, which get executed asynchronously. Then the game calls present(). At this point the driver can force the game to either wait until the frame is completed, or it can just let the game render the next frame, even if the current frame isn't completed yet (leading to better GPU utilization, as there are always more rendering commands in the pipeline). How often the game can call present() while the first frame still hasn't finished rendering without being stopped by the driver is essentially the "pre-rendered frames" setting.

If the driver *always* stopped the game at present(), even if no other frame was in flight, performance would be terrible because you would potentially give up a huge amount of GPU/CPU asynchronicity. (I hope that's a word.) But stopping the game when a single frame is in flight usually only incurs a small performance penalty. I guess what low-latency mode is trying to do is guess how long it has to block in present() so that the next present() comes in just as the GPU finishes the previous frame.

Of course if you're CPU bound (or simply *not* GPU bound thanks to a frame rate limit), none of this really does much, because every time you call present() the previous frame is already done anyway. It's essentially a perfect low-latency mode.

What can be done about this? Well, nothing really, except for low-latency mode or frame rate caps. You could be even more aggressive than the current modes, but that would incur an even bigger performance penalty. And then you have to ask yourself: in (probably competitive) games where people care so much about latency, aren't users playing with low settings anyway to get the most FPS, and therefore usually CPU bound anyway? It's probably not worth it to provide an even more aggressive mode.
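A toy model of the present() queueing described in this comment (my own sketch with made-up timings, not anything measured in the video) shows why a deeper frame queue raises input-to-display latency while a frame cap avoids it:

```python
# Assumed numbers: the CPU needs cpu_ms to prepare a frame, the GPU needs
# gpu_ms to render it, and the driver lets max_queue frames be in flight
# before present() blocks.
def simulate(cpu_ms, gpu_ms, max_queue, frames=2000):
    pending = []        # completion times of frames the GPU still owes
    gpu_free_at = 0.0   # when the GPU finishes everything submitted so far
    t = 0.0             # CPU timeline
    lags = []
    for _ in range(frames):
        input_time = t                     # input is sampled as the frame starts
        t += cpu_ms                        # CPU builds the frame, calls present()
        pending = [d for d in pending if d > t]
        if len(pending) >= max_queue:      # queue full -> present() blocks
            t = pending[len(pending) - max_queue]
            pending = [d for d in pending if d > t]
        done = max(t, gpu_free_at) + gpu_ms
        gpu_free_at = done
        pending.append(done)
        lags.append(done - input_time)     # input-to-display, ignoring the panel
    return sum(lags[200:]) / len(lags[200:])

# CPU could do 250 fps, GPU only 100 fps -> GPU-bound:
for q in (1, 2, 3):
    print(f"{q} frame(s) in flight allowed: ~{simulate(4, 10, q):.0f} ms lag")
# Same machine capped at 60 fps, GPU now ~60% loaded:
print(f"capped at 60 fps:             ~{simulate(16.7, 10, 1):.0f} ms lag")
```

With these made-up numbers the GPU-bound run at ~100 fps shows 20-40 ms of lag depending on queue depth, while the 60 fps cap lands around 27 ms, which is the shape of the result the video measures.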
@futmut1000 5 years ago
Does this apply to CPUs as well? I mean, if we limit CPU usage to 95%, would we get lower input lag than with the CPU running at 99%?
@Linets. 5 years ago
@@futmut1000 with better CPUs it's hard to get over 50%
@jonlemon9411 5 years ago
@@Linets. Some games like BF4 hit the CPU really hard, sometimes near 100%
@resresres1 5 years ago
DAMN battle(non)sense, you need to do more videos... your videos are some of the best. As a competitive gamer (not a pro, just a decent player) I absolutely love the topics and information you provide.
@kaliyuga5757 5 years ago
Doing this test with CS:GO would be interesting, because input lag is so crucial for that game and also because it is so different from the other games tested (heavily CPU limited, DX9 and very high fps)
@rdmz135 5 years ago
Yeah, my GPU is chilling at 40% even at 300+ FPS, so I guess this doesn't really affect me, or most CS players with decent PCs. It is interesting that Anti-Lag increased input lag under these circumstances though. I just left it on all this time as I didn't feel a massive difference.
@allongur 5 years ago
If it's heavily CPU limited, why would it matter? GPU will never reach 99%.
@rdmz135 5 years ago
@@allongur Right now we don't know if the drop in latency is due to fewer frames being rendered or lower GPU utilization. We also don't know how Anti-Lag affects you at really high framerates. Testing CS can help answer these.
@rdmz135 5 years ago
​@@mt441 I initially thought I felt a difference, but then I switched back and saw no difference, soooo yeah, it's impossible to tell without actual data. Never underestimate the power of placebo. That, or when you're already getting 350-500 FPS the change in input lag is so low that it's impossible to notice.
@zonkle 5 years ago
I've always capped my FPS in Overwatch to 140, because it feels like it has lower lag than uncapped, despite everyone saying uncapped is better. I never knew if I was imagining it or why it would work like that. But this seems to confirm that I was right.
@Kpaxlol 5 years ago
I've always wondered about this when I played it back in 2016. I read somewhere that when your GPU starts struggling you get bursts of input lag. I'm pretty sure that was correct for Overwatch. Mind you, I used an AMD 7870 and later a GTX 770 back then.
@puffyips 6 months ago
Uncapped still gives the best input lag; just make sure Anti-Lag or Reflex is on so it manages your GPU load and doesn't let it go too high
@heckyes 5 years ago
Excellent work. Thank you so much to your patrons for funding this! And thank you to you!
@megacosmic6507 5 years ago
It's insane that nobody has discovered this insane flaw yet! Nice job
@norpanmekk 4 years ago
But... it is discovered ?
@Johnny31323 3 years ago
@@norpanmekk yeah lol, i think he worded it wrong xd
@stephenkamenar 4 years ago
This video is HUGE value. I always use capped fps because I prefer consistent frame times and a cooler GPU. It's amazing to know this is also giving me less lag. Capping framerate is something all gamers need to learn about.
@pr3cious193 4 years ago
Agreed!
@mrjonjoe1895 4 years ago
I hope they paid him for this. Nvidia just came out with the Nvidia Reflex option, enabled automatically, which I suspect might have come from what he told them. I think it keeps the load below 97%.
@adamblomberg 5 years ago
This was totally unexpected for me and I don't think many are aware of this, not even enthusiasts.
@VirusXAX 5 years ago
Me too, I expected the opposite! I play a lot of Rainbow Six Siege, and I was using the low latency mode. I will try turning it off right now and see what happens! I have a 144Hz monitor (AOC) and a GTX 1080. I get around 200 frames and it jumps between 138-250 fps!
@GoldenEagle0007 5 years ago
Battle(non)sense, we miss you, keep the content coming
@netwuv3399 5 years ago
Esports companies should hire this guy to do more testing of this sort.
@HuMaOne69 4 years ago
To achieve the least input lag you want a high FPS for better frame times, even on a low-Hz screen, but never let the GPU reach 99% usage. When the GPU is overloaded it adds a huge amount of input lag, so for example it's better to have 120 fps with the GPU at 80% than 144 fps with the GPU at 99%.
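Rough arithmetic behind that rule of thumb (my own numbers; the assumption that a saturated GPU queues about two extra frames is illustrative, not measured):

```python
def latency_ms(fps, queued_frames=0):
    """Crude model: latency ~ one frame of render time
    plus one full frame of wait per queued frame."""
    frame_time = 1000 / fps
    return frame_time * (1 + queued_frames)

print(latency_ms(144, queued_frames=2))  # 144 fps, GPU at 99%: ~20.8 ms
print(latency_ms(120, queued_frames=0))  # 120 fps, GPU at 80%:  ~8.3 ms
```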
@MsTatakai 4 years ago
thank you
@izen1337 4 years ago
Wrong
@badnewsbruner 5 years ago
How bizarre.... I wonder how nobody noticed this before?! So strange.
@giglioflex 5 years ago
Some people did notice; it's just that most benchmarks indicate that the product with the highest FPS is the better product, so most people believe higher FPS = better. IMO this throws shade on the whole "Intel is better for gaming" thing. AMD's Ryzen CPUs have always gotten very consistent frame times; makes me wonder how good they are in actual ms of input lag.
@JeanFrancoisPambrun 5 years ago
5-10 ms is mostly an academic difference. I doubt most users can notice that; I know I can't.
5 years ago
Still people regularly pay hundreds or even thousands of dollars to shave off 5-10 ms from latency. They deserve to know why they are losing those milliseconds unexpectedly.
@rdmz135 5 years ago
@@JeanFrancoisPambrun I can notice 15ms on an input lag test, and I'm sure there are people that are more sensitive than me
@JeanFrancoisPambrun 5 years ago
@@rdmz135 5-10 ms is only about 10% of the typical input latency. Feeling such a small delta is usually difficult for our senses. I wonder how you can state with such confidence that you would notice, since testing this hypothesis is quite involved. I should probably say that I have studied human visual acuity as part of postgraduate work. Edit: In the end, the fact that nobody even noticed this before strongly suggests that it is imperceptible to most, if not all, users.
@Ryno2094 2 years ago
what an awesome test, thank you for taking the time to do this
@HeIifano 4 years ago
I would have really liked to see the 5 tests shown at 9:20 repeated with G-Sync active. EDIT: Also perhaps the first and last test repeated with Ultra Low Latency set to On (previously Maximum Pre-rendered Frames = 1) just for posterity.
@louisrmusic 4 years ago
Oh boy, you must enable reduce buffering in Overwatch. Tested that with G-Sync, V-Sync and G-Sync + V-Sync, and trust me, it reduced lag in every case. You should have that enabled at all times.
@louisrmusic 4 years ago
Dyils Yes, but also no. Whether NULL is on or off, I experience the lowest latency with the in-game setting. NULL works best when the FPS exceeds the monitor's refresh rate. It only makes a minor improvement when the FPS is capped at the refresh rate or lower (4ms according to this video). The G-Sync + NULL video shows NO improvement at all when FPS is capped to the refresh rate with G-Sync + V-Sync ON. I'm giving my point of view about something that hasn't been specified in any test. Maybe just give it a try and draw your own conclusions ;)
@louisrmusic 4 years ago
Dyils I'm sorry for being confusing in my explanation. You don't need to do the LED trick with a high-speed camera to measure that. I'm simply saying that, in Overwatch, there's a noticeable input lag reduction when you enable the in-game reduce buffering, and this applies whatever your sync settings are. When enabling low latency through the control panel, I see no difference at all, like there's still one frame in the buffer. I really see the difference when I move my mouse as fast as possible over a short distance. It's more reactive by one frame (7ms for me). You can see it better by focusing your eyes on a point between your screen and your mouse, so you can have both in your sight. Gameplay feels less spongy. (I can hear a 7ms delay when making music, so why couldn't my eyes see one? Everyone is different)
@WickedRibbon 5 years ago
Fantastic testing. It echoes my experiences playing Battlefront 2 and Apex Legends recently. If I leave the framerate uncapped, both games feel very jittery and uneven. On my 60Hz, non-FreeSync monitor, 90-100 fps seems to worsen responsiveness rather than improve it. However, as soon as I lock to 60 fps using RivaTuner, both suddenly feel really fluid and responsive. My GPU is no longer being tapped out. Having consistent frame times in line with your monitor refresh rate really is the top priority.
@infinitestars394 3 years ago
That's because of the heavy tearing lines that you get. If you have a 60Hz monitor and get FPS anywhere between 61-119, you will get tearlines that actually jitter around your whole screen. Those tearlines will separate one image into many smaller ones. Those smaller sections will MASSIVELY stutter. Plus you see the tearlines moving up and down your screen. Then 60 FPS won't actually look like 60; it almost feels like 30-50 FPS in both image and responsiveness. I just tested this 2 days back, and I play MUCH better at 60 FPS on 60Hz than 70 FPS on 60Hz... always cap the FPS to 0.5, 1 or 2 times your Hz. Btw, this tearline-stutter effect is also called "stutter beat frequencies".
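The beat-frequency arithmetic is simple to sketch (my own illustration of the Blur Busters idea: the tear line drifts at the distance between the frame rate and the nearest multiple of the refresh rate, and stands still when they divide evenly):

```python
def tearline_drift_hz(fps, hz):
    # 0 means the tear line stands still; larger = faster scrolling/jitter
    if fps >= hz:
        k = round(fps / hz)
        return abs(fps - k * hz)
    k = round(hz / fps)
    return abs(hz - k * fps)

for fps in (30, 60, 61, 70, 90, 119, 120):
    print(f"{fps:>3} fps on a 60 Hz panel -> tear line drifts at ~{tearline_drift_hz(fps, 60)} Hz")
```

This reproduces the advice above: caps at 0.5x, 1x or 2x the refresh rate give a drift of 0, while 61-119 fps leaves the tear line wandering.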
@sidtai 5 years ago
Love your testing. In your tests, you capped your frame rate so basically you are not CPU limited. For the sake of testing, I would love to see the effect of being CPU limited on input lag. Your video has shown that higher FPS does not mean lower input lag, it also depends on your GPU load. I am wondering if it applies to CPU load as well.
@gamerhobbit 5 years ago
You forgot maybe the most important comparison: in-game frame limiter with anti-lag/ULL vs RTSS frame limiter with anti-lag/ULL. RTSS is by far the best solution to cap FPS and get stable frame times; however, it is known to add 1 frame of extra delay. Since RTSS keeps both the CPU and GPU from reaching 100%, it may work well together with anti-lag/ULL and give the ideal maximum. Nice video BTW, thank you!
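For context, a frame limiter is conceptually just a per-frame wait; here is a minimal sketch of the idea (a generic hybrid sleep/spin loop, not RTSS's actual code, which hooks the game's present call at the CPU level):

```python
import time

TARGET_FPS = 141
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    pass  # stand-in for game simulation + draw calls + present()

deadline = time.perf_counter()
for _ in range(1000):
    render_frame()
    deadline += FRAME_TIME
    while True:                            # wait out the rest of the frame slot
        remaining = deadline - time.perf_counter()
        if remaining <= 0:
            break
        if remaining > 0.002:
            time.sleep(remaining - 0.002)  # coarse sleep to save CPU
        # else: busy-wait the last ~2 ms for precise frame pacing
```

Because the wait happens after the frame is already prepared, an external limiter like this effectively holds one finished frame back, which is where the commonly quoted ~1 frame of added delay comes from.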
@PixelGod240 5 years ago
Do you have CSGO? You should run your desktop at 1024x768 and run CSGO at the same resolution with all settings as low as possible. See if you can feel the difference between the two. Seems to be the best real-world comparison without the equipment to actually test.
@Limeayy 5 years ago
I already do this, but my frames sometimes drop a bit from the cap value to a lower value and I get stuttering, or frame stuttering, or whatever you call it.
@PixelGod240 5 years ago
Lime I noticed CSGO with fps_max 0 has much less latency than capping at 300 FPS. The issue people forget is that Source's timer will break and cause clock drift once FPS hits 1000, even if only for a second (menu/loading map). So if you use fps_max 0, only do so once loaded into a server; until then keep it capped at 300 or whatever. It's a pain, but it keeps CSGO from breaking. P.S. fps_max 999 is trash!
@gamerhobbit 5 years ago
@@PixelGod240 Without the equipment, I'm the "margin of error". We are talking about 8.3-16.7ms differences. Hard to notice, impossible to judge with the bare eye.
@skoopsro7656 5 years ago
You do some of the best work in the industry. Bravo man. I wish I could get as much value for the community out of my hardware as you are able to. Fantastic work. Thank you again
@RuinerXL 5 years ago
Hardware Unboxed just made a video response to this video. Cool stuff!
@andrelip1 5 years ago
Chris, if you are reaching high GPU usage, that means the CPU pipeline is faster than the GPU at its maximum. The obvious conclusion is that it will pre-render frames, so the latency will increase. That explains why Anti-Lag worked. When you cap your fps, you will run with essentially 0 pre-rendered frames, as both GPU and CPU are able to render in time and even sleep. This should be observable using GPUView. What is really interesting is Anti-Lag giving a higher response time when the expected behaviour is that it should not affect it at all.
@jays5190 5 years ago
That makes intuitive sense, but the difference in frame rate is extremely small compared to the difference in lag when capping at slightly below the max gpu-limited fps. It seems like those extra buffers must not be adding much to the average fps, or maybe something else is going on? I'm also curious why capped 80 fps had more lag than capped 60 or capped 140 (with 150% render target). It also seems like the anti-lag mode must be doing more than just eliminating buffers, since as you say you'd expect no effect for anti-lag off/on while capped if that was it.
@SIeipner 5 years ago
But still, having the GPU deliver the highest FPS (which should lower the input lag) and then delivering that frame immediately to the monitor without storing it in the buffer should give you the lowest input lag possible, but for some reason this is not the case. It's super weird imo
@Bijob 5 years ago
Yet another essential video that is changing the way I'm playing. Thank you for bringing these complicated topics to the masses! I'm trying this Low Latency mode coupled with Fast V-sync and the results are promising. I can't measure precisely, it's an overall feeling, yet it feels stable and the picture is good. My situation: 4K with a 1080 Ti, it's not good. Really bad tearing, and the input lag is atrocious with V-sync, not to mention the stutter when going below 60fps; I have a 60Hz screen. Well, Fast V-sync works as intended and eliminates tearing while not stuttering if I go below 60fps. What I like the most, though, is that stabilized input lag. I didn't realize it was fluctuating that much and it deeply affected my gameplay. Thank you so much battle(non)sense!
@jonbrandjes9024 5 years ago
Bro, Fast Sync only works when your fps goes above the monitor's refresh rate.
@Vlamos27 5 years ago
Wow. That's awesome. Great video, as always! I noticed this too when locking the fps in Overwatch; it somehow felt snappier than using a higher but variable framerate (GPU maxed out). Keep up the good work!
@blackpete 5 years ago
Yup. I don't play Overwatch, but I noticed that in other games.
@Domistroy 5 years ago
I've noticed the same thing (also OW) when using FreeSync + a 142fps cap on my 144Hz monitor; it felt smoother than uncapping it to an average of around 210fps. I just thought it was a side-perk of FreeSync rather than being below 99% GPU load.
@MrDvneil 5 years ago
Really glad that there are still people like you who dig into the real data and not just opinions. Your work here is awesome.
@theMJL 5 years ago
Your RivaTuner video was a godsend! I was getting stutter that I couldn't figure out how to get rid of, and using RivaTuner and your tips to limit frame rate eliminated it entirely. Your videos are incredible and always a pleasure to watch.
@vladimirjebievdenko7356 4 years ago
Now test input lag with new Nvidia In-Driver FPS limiter in 441.87 drivers.
@ZekeMagnum 4 years ago
he just did 10 min ago ;)
@Zeioth 2 years ago
Everything you say is true and well documented. Input latency is maybe the hardest concept to grasp in gaming because you don't have metrics to measure it. You must understand the math behind it, limit your FPS, and test it yourself. It can also be counterintuitive, because uncapping the FPS makes the game 'feel' more responsive, but in reality it means that what you are seeing on the screen is not exactly what is happening, as massive input lag is introduced.
@JP-fr6by 1 year ago
It's better to cap fps just below the point where you reach 95% GPU usage.
@Thermophobe 5 years ago
This explains why ultra low latency mode made CSGO feel not so smooth to play.
@frazplayssquad9232 5 years ago
Did you test this in fullscreen exclusive mode? Windows updates have started making games run in their own optimised mode (allows for faster alt-tabbing but causes input lag) when selecting fullscreen in the game, in some cases. To fix this you have to go into the properties of the game's .exe file and disable fullscreen optimizations. That works for most games (some exceptions) and should put you in fullscreen exclusive mode and reduce input lag.
@Kissislove17 5 years ago
Nani?!
@AssassinKID 5 years ago
@@Kissislove17 Basically Windows runs games with DWM triple buffering and calls it *fullscreen optimizations*, to allow faster response when Alt+Tabbing and switching windows
@Taltharius 5 years ago
Welp, guess testing will need to be performed yet again.
@RamenRiderX 5 years ago
Even alt-tabbing causes Overwatch's reduce buffering to break. The three dots on the FPS counter in Overwatch mean that reduce buffering is broken and you have to re-enable it.
@RequiemOfSolo 2 years ago
@@AssassinKID Happen to know if disabling FSO still works in 21H2? Or if there's something else you have to do now to fully disable it? I used to disable it on every game when I was on 2004. Stupidly decided to format and update to the latest Win 10 build, and now disabling it feels like it does absolutely nothing tbh. It's bugging the hell out of me. My average accuracy has gone down in games.
@edu000 5 years ago
Hope you do a follow-up on this based on possible responses from AMD/Nvidia, and factor in using RTSS vs in-game limiters, because my head hurts right now. RTSS limiting with those frame times feels so smooth though. I must have missed your earlier video. Great work as always!
@grafforlock 5 years ago
I wish there was a way to just have the feature automatically turn on every time you're in a scenario with 99% GPU usage and then turn off again when below that; it would make more sense than manually going back and forth.
5 years ago
Sounds like an optimization issue somewhere along the frame processing pipeline, within the GPU driver, DirectX or the game engine itself. Obviously, frame rate limiting will not be an optimal solution since fps can vary wildly depending on gameplay situation. Now I'm pretty curious to hear AMD/Nvidia's explanation.
@SteveEricJordan 5 years ago
Thanks for this video, your work is extremely important and helpful for a whole scene, and it has helped me a lot, this time once again. Keep it up!
@terra0009 5 years ago
Some input on the subject (I'm no game engine expert, so take this with a grain of salt): In the past most games would work in the following way: in each frame the game would calculate the logic step that evolved from the previous frame, then it would draw the result of this logical evolution. Let's call these steps the logical frame and the graphical frame.

As game engines evolved, some elements which were part of the logical frame moved to the graphical frame, for example simple animations that had little to no effect on the game state besides graphics. Developers realized that you could improve the graphical experience by making the graphical frame independent from the logical one. The logical frame in many situations needs to run at a fixed pace (e.g. 60 fps), otherwise some of the collision steps would become too complex. Engines like Unity separate them: you can have your game running at 240fps while the logical frames are set to 30. This allows very smooth animations and better GPU utilization than if you locked both frame stages together.

The reason why you may be getting better input lag when reducing the frame rate could be that the logical frames are set to a much lower value than the graphical frames, so by increasing the graphical frame rate you are also increasing the CPU utilization needed for the graphical frames (in the best case it should only be the bureaucracy of sending information to the GPU). So by increasing the graphical frame rate, we may actually be hurting the logical frame rate... Games that have both of them locked together should always improve latency when you increase the frame rate. Otherwise it becomes complicated. Well... that's my 2 cents.
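The decoupling described here is the classic fixed-timestep game loop; a minimal sketch of the generic pattern (not any specific engine's implementation):

```python
import time

TICK = 1 / 60          # fixed logical step, e.g. a 60 Hz simulation

def update(dt):        # game logic: input, physics, collisions
    pass

def render(alpha):     # draw, interpolating between the last two logic states
    pass

prev = time.perf_counter()
accumulator = 0.0
while True:
    now = time.perf_counter()
    accumulator += now - prev
    prev = now
    while accumulator >= TICK:   # run logic at a fixed rate...
        update(TICK)
        accumulator -= TICK
    render(accumulator / TICK)   # ...but render as fast as the GPU allows
```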
@oblivionnerd 5 years ago
Interesting. I wonder what the logical frame rates for the games tested were or if they are tied with gpu rates.
@MiauFrito 5 years ago
That makes a lot of sense, thanks for the explanation
@sKratch1337 5 years ago
So a basic summary would be: don't let your GPU get fully utilized in competitive games, to reduce input lag. You want to be CPU bound instead of GPU bound, basically? Don't use ULL or Anti-Lag if your GPU isn't being fully utilized, as it will just increase input lag. So capping CSGO @ 300FPS (which I roughly maintain about 80-90% of the time) and using the default option for pre-rendered frames (I believe it's 2 or 3 in GO) would be the best solution for input lag AND frame time stability, is that right? Or did I miss something. Edit: I remember hearing that capping your FPS also increases your input lag. Have you ever tested the difference in input lag in a game like CSGO if you're getting 500+ fps and capping it at 144, 300, 500 and 999? With the in-game command fps_max.
@Linets. 5 years ago
This is what I really wanna know
@rdmz135 5 years ago
@@TerpsiKo Some in-game limiters can give you inconsistent frame times. RTSS will give you smooth, consistent ones. The trade-off is a tiny increase in input lag, but that can be well worth it depending on the situation.
@reppy0757 4 years ago
This is why I searched for your video when I was curious about this. You're the only one who truly does in-depth testing. Thanks Chris
@methadonmanfred2787 5 years ago
Very interesting topic; it's fascinating how many things there are that we don't know about yet.
@tylerpenn545 5 years ago
I knew before that keeping your GPU load lower kept input delay lower. I didn't have any numbers though, but this was preached on the Blur Busters forum for some time. It's great to have some numbers behind it. Could you test games that have both DX12 and DX11 and run input lag tests where you are GPU bound in both cases? The developer, and not the driver, has the pre-rendered frame control in those games.
@mrnelsonius5631 4 years ago
Thank you for doing this; I've never seen this correlation in a single other report, but I've FELT it. I've been capping my frame rates for years and things just feel smoother that way; it's interesting to know the reduced GPU load is a big part of that effect. Even using G-Sync, I try to set my in-game settings so that frames seldom drop below 60 (whatever your refresh rate is), so there's plenty of "headroom" for the GPU left most of the time. Awesome findings
@fiber_king2334 4 years ago
What is your monitor refresh rate and fps cap? Do you use RTSS?
@mrnelsonius5631 4 years ago
darth_frager I used to use RTSS for a lot of games, but since getting a G-Sync monitor I haven't found it as necessary. Mainly because when you have G-Sync set up properly, it's already limiting your FPS to your monitor refresh rate. I'm on a 60Hz monitor so that's 60 FPS. From there I just try to adjust in-game settings so that I'm always hitting the 60 fps cap outside of brief dips, maybe to 55 FPS in demanding but infrequent places. This way, it feels like you're running with no vsync, zero noticeable input lag, but you don't get any screen tearing.
@fiber_king2334 4 years ago
@@mrnelsonius5631 You say it feels like "you're running with no vsync", but exceeding your refresh rate does just that. I'm confused. I have a GTX 1080 + 6700K and a FreeSync 2 monitor, 144Hz WQHD. In games where it matters there's usually a frame limiter present, so that would make RTSS obsolete for me I think. I think I should use G-Sync/FreeSync. I am getting avg 110, standard deviation 5 frames. This means 99.7% is within 95 and 125 fps. I wonder if I should limit fps somehow and how that would affect lag. I feel like constant higher lag is preferable to lower average but variable lag. Or should I only limit it if some game exceeds my refresh rate?
@mrnelsonius5631 4 years ago
darth_frager I’ve never used a high refresh monitor, but my understanding is that you shouldn’t need to frame limit unless you’re going over 144 FPS. Some games will stutter without any cap though, apparently. And you’re right about Gsync having a vsync wall to an extent, but I also frame limit to 60 in game settings when I can, and for whatever reason that combined with Gsync creates no noticeable input lag for me. I can turn on vsync proper and disable gsync and instantly feel it. I think gsync behaves differently as long as frames are being limited to refresh somehow. The site Blurbusters has the best info on all this. Check it out if you haven’t
@fiber_king2334 4 years ago
@@mrnelsonius5631 You limiting the in-game framerate to 60 might effectively mean your frame time can't be lower than 16.6ms. With RTSS, to achieve this some margin is needed. I think I settled for: G-Sync on + V-Sync ON in NVCP. I understand that no V-Sync results in tearing even with G-Sync on. I don't know yet if I should limit in-game to 141, or lock the framerate to an always achievable number so I don't have the GPU maxed out causing this bizarre lag presented in the video (depending on the game: for Apex Legends and other demanding games I'd go for like 100 and lower graphics settings if needed; Quake Champions or CSGO 141, but I gave up on these anyway). I'll test both ways as soon as I can. As for Nvidia anti-lag I think I'll settle for ON. Worst case it still shouldn't hurt much with the GPU not maxed, and it could prevent queuing multiple frames as I understand it. What's the default BTW? I own a FreeSync 2 144Hz monitor and a Pascal GPU
@resorteslgh 4 years ago
Surprising discovery, congratulations. So we were wrong to think that playing at maximum fps decreases input lag; limiting the fps can actually give us lower latencies and better fluidity in the game.
@neotimeytb 5 years ago
That's a sick test! Thanks for this. Hope AMD/NVIDIA will do something, because if I understand it correctly, if you limit your GPU usage and so get fewer fps, you get better input lag, and that sucks...
@chad-wyck 4 years ago
I recently discovered this for myself tuning and testing my Acer Nitro 5 with a GTX 1050. The frame time delay difference of a locked 60 fps 1050 at 80% GPU usage vs unlocked 150+ fps 1080 at 100% GPU usage. It legitimately broke my brain a bit having one so much smoother, while the other was so much more blatantly responsive. Input lag reduction is where it's at, great video.
@W4leed. 4 years ago
Can you help a brother out? I play R6 Siege at 60Hz. Should I use this mode? And if yes, should I do it with V-sync off or on?
@goten_9101 4 years ago
I got the same Acer Nitro 5, how do I do all that crap?
@rohansatram 4 years ago
@@W4leed. yeah you can use low latency mode, but I would turn off vsync for sure
@creaturedanaaaaa 4 years ago
@@W4leed. vsync literally always adds input lag unless you use gsync or freesync so it's recommended to keep it off.
@Dyscrasia 4 years ago
Yo
@mazedmarky 5 years ago
Super interesting results! I always noticed reduced input lag in OW when my GPU wasn't maxed out, since I did several tests in the past as well. But I always thought this was an OW-related (engine) thing, so thanks a lot for testing this with other games too!
@xminecraftbestxibest9307 4 years ago
I went through like 4 videos and this is the only one I understood, thank you
@St0RM33 5 years ago
So basically... can we get a GPU load limiter? Essentially a variable fps limit so as not to max out GPU load but still get max fps at each moment?
5 years ago
Actually, the devs need to find out why full load increases input lag and remove the bottleneck. Predicting and tuning GPU load on the fly might be a difficult task without introducing another artificial limit that might just serve as a new bottleneck.
@St0RM33 5 years ago
@ I don't think they need to find anything; this is how the system works. It is going to push frames as fast as possible, always, but it can't predict whether it will finish rendering a specific frame in time, nor in how much time. Thus some frames will be dropped and others will get delayed; that's why you get more latency.
5 years ago
@EerieEgg Good point, it might be the case that AMD/Nvidia was not really interested in solving this problem. Maybe this time is different. Battle(non)sense is a well respected source and the renewed competition and interest in input lag might make one of them move forward.
5 years ago
@St0rm33 This is the exact same problem the new low latency modes from AMD and Nvidia trying to address. The CPU should wait and prepare the frame just in time for the GPU to process it, at full GPU load. If it doesn't work like this then there's room for improvement. I don't say that it's easy but it shouldn't mean that it's impossible either.
@St0RM33 5 years ago
@ Well, I don't think it is impossible to implement correctly, but it's going to need a big driver rewrite and better integration with the CPU to make it work optimally... and after that, games need to be written well for this... which they aren't.
@davws88 5 years ago
Nice finding... I guess next Hyper Ultra Extreme Anti-Lag Latency mode will limit GPU load to 97%...
@OutOfRangeDE 5 years ago
Haha yes
@BattleNonSense 5 years ago
A "dynamic FPS limiter" which aims to keep the FPS as high as possible while keeping the GPU load below 98% would be good. It would still not give you perfect frame times, but it would give you max FPS without the input lag bump.
@mrjonjoe1895 4 years ago
This guy might be the reason Nvidia Reflex came out in games automatically enabled; it keeps GPU load under 97% I think
@ArtfulRascal8 4 years ago
definitely
@emi6388 5 years ago
If your GPU usage is less than 99%, you can still be limited by the GPU a certain percentage of the time. It's the individual frames that are limited by the GPU or not, so an "instantaneous GPU usage" or "average GPU usage over the last second" will not tell you what's up. Which is exactly why with 80% GPU usage you can gain fps by upgrading your GPU, with 99% GPU usage you can gain fps by upgrading your CPU, and you can often gain fps by upgrading your RAM.
@emphase6 3 years ago
This is by far the best video I've watched on this. I've been researching for hours, thank you for this.
@doltBmB 4 years ago
I've known about this for a long time. It's mysterious. But about NULL/AntiLag, testing with uncapped framerate should do very little, the render queue only fills up when the CPU outpaces the GPU. Limiting the framerate is not a valid case because then the engine waits and does not prepare a new frame. The worst case is when you are limited by Vsync, the engine will then prepare new frames until the queue is fully filled up. This is the real cause of Vsync input lag. You can then understand why the "58fps trick" works, by making the engine wait instead of filling up the queue you reduce input lag. If you know how most game engines are put together you will know that calling the Swap() function in your graphics API with Vsync enabled will cause the thread to stall until the frame is presented. Since the driver and hardware is a black box and we don't know exactly what happens when you call this function this is where the mystery begins. In order for the render queue to work, that means the driver has to present a false backbuffer so that the engine continues with the next frame instead of stalling. For the render queue to fill up, the CPU framerate has to be higher than the GPU framerate. If Vsync truly capped your framerate to exactly 60 then the queue could not fill up. This presents another mystery, the game engine measures the framerate by using the high precision timer to count the nanoseconds between calls. If say the CPU produces frames faster than 60 to fill up the queue, then the delta time information should also be lower than 16.6ms but this doesn't appear to be the case. Does the driver fudge the timer as well? Now the most mysterious part, which is what you've run into here. Input is taken by the engine before the frame is processed so that it can be used in the next state of the game. Input is taken instantaneously at that moment and does not update until the next frame. The lag should always be at least the time between the frame is processed and it appears on the screen, which is Input + Wait + Swap() + Queue + Sync (even without vsync the monitor still has to cycle) + Pixel Response Time. Capping the framerate should make the engine wait a certain period of time before processing the next frame, that should be extra lag. BUT, when you cap the framerate, you get a lower input lag than if you are limited by hardware. This lower input lag appears to be proportional to your potential maximum framerate. This implies that the frame is actually processed just before it appears on the screen, *even in the case of Vsync with a queue of 1!!!!!* But the thread should be stalling, which should add lag, not reduce it. But the result is you can have input lag comparable to 100fps with a cap of 20fps, but only if you could reach 100fps without the cap. This is what you'll truly have to explain to me, please forward my comment to your contacts.
@kowalski2385 4 years ago
Dude...I'll TRY to read this, but you NEED to make paragraphs. PLEASE. PLEASE make paragraphs. About every 8 lines of text, you should divide your text... I'll even show you. I'll copy and paste this, and then use notepad to screenshot it and link it here.
@RequiemOfSolo 2 years ago
I love people like you.
@TheKillerZmile 4 years ago
GPU-BOUND GAMES (95-99% load) = TURN ON ULTRA
CPU-BOUND GAMES = TURN OFF
@Raglarak 4 years ago
What about ON ?
@TheKillerZmile 4 years ago
Raglarak doesn't matter, it's the same as ultra
@hypnofba 3 years ago
@@TheKillerZmile no its not.
@TheKillerZmile 3 years ago
@@hypnofba same shit
@Ridley1911 4 years ago
Variable framerates and post-processing are the biggest input lag culprits. With a framerate lock (by the engine, not by external programs or V-sync) at 144 and no post-processing effects, you will have the best result possible.
@Meta-Drew 4 years ago
@The Big Game Theory if you have G-Sync or FreeSync, right? If you have a 60Hz monitor, setting fps way over 60 significantly reduces input lag (in most games, idk about Overwatch specifically)
@g00glegoggle72 5 years ago
Good analysis but incomplete. Perhaps it's best to wait for answers from AMD/Nvidia, but more work is needed here. How does Radeon Chill compare to a capped framerate? How about testing a game where the input polling rate isn't asynchronous from the framerate? What is a better gaming experience for competitive gamers: lower framerates with lower input lag, or higher framerates with more input lag? Do the lag characteristics change with G-Sync/FreeSync enabled? This is an interesting area to keep digging at, please keep up the great work!
@jpradasdiez 5 years ago
Very nice analysis with amazing conclusions. I have always heard it is better to leave FPS uncapped instead of capping it to reduce lag, but it seems it is just the opposite. According to your tests it is better to reduce GPU load by capping FPS. Interesting.
@eliasunalan5244 5 years ago
I just have to say that your videos are incredibly interesting and helpful! Thank you so much for all this helpful information, which would be very hard to get without your channel.
@Dboyle1209 5 years ago
This begs the question: is there a utility that caps the GPU to a certain usage % rather than fps?
@BattleNonSense 5 years ago
That would still leave you with the issue of a constantly changing Frame Rate/Frame Time, and so inconsistent input lag that you can't adapt to. The ideal solution is to find out at which FPS your PC can run a game at all times, and then create a profile for that game in RTSS where you enter that value. See: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-xsXFUVYPIx4.html
5 years ago
I'm not sure fixed frame rate is always the best solution. Sure, if your computer can push the frame rate near the max refresh rate of your monitor 99% of the time then you just have to set a limit there. But I doubt this is the case with the majority of people. Let's say you have a 144 Hz monitor and your game runs between 75 and 125 fps, depending on the gameplay. Like PUBG on my config. Where would you fix your frame rate? At 75 or 80 fps, giving up a lot of visual information and the potential for lower latency at higher frame rates? Or at 100 fps, making better use of your GPU most of the time but allowing for some regular spikes of latency whenever the frame rate drops below 100 fps and the GPU gets fully loaded? I don't think that either of these options are really desirable. I would rather prefer the lowest possible latency and highest frame rate the game allows me at any given moment, I guess.
@drunkev 5 years ago
Amazing work. TY for the info. Input lag is one of my favorite issues to tackle when it comes to gaming optimization and you always surprise me with new findings!
@GholaTleilaxu 4 years ago
Also I recommend reading Blur Busters' G-SYNC 101 article, page 11: In-game vs. External FPS Limiters. To summarize: in-game framerate limiters, being at the game's engine level, are (or should be) almost always free of additional latency. An external framerate limiter like RTSS (RivaTuner Statistics Server), which comes bundled with MSI Afterburner, is a CPU-level FPS limiter, which is the closest an external method can get to the engine level of an in-game limiter; it introduces 1 frame of delay. Nvidia's own FPS limiter (accessed with Nvidia Inspector) uses a driver-level FPS limiter, which introduces 2 or more frames of delay. I guess the "new" NVidia Ultra-Low Latency entry and the renaming of _maximum pre-rendered frames_ (long live marketing departments) is meant to compensate for the input lag of its own driver-level fps limiter, and maybe for people who always have V-sync=ON.
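Converting those frames of delay into milliseconds makes the trade-off concrete (simple arithmetic on the numbers quoted above):

```python
def limiter_penalty_ms(fps, frames_of_delay):
    # one frame of delay costs one frame time (1000 ms / fps)
    return 1000 / fps * frames_of_delay

for name, frames in (("in-game limiter", 0), ("RTSS", 1), ("driver-level", 2)):
    print(f"{name}: +{limiter_penalty_ms(144, frames):.1f} ms at 144 fps, "
          f"+{limiter_penalty_ms(60, frames):.1f} ms at 60 fps")
```

So the often-quoted "1 frame" from RTSS is ~7 ms at 144 fps but ~17 ms at 60 fps, which is why the limiter choice matters more at lower frame rates.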
@ognjenstankovic98 5 years ago
You, man, are pushing the boundaries of our understanding of gaming. THANK YOU
@lHawkee 5 years ago
You sir are a legend! Thanks for making this video
@microsista1413 5 years ago
OK, I've done some testing with GPUView and came to the conclusion that even with maximum pre-rendered frames at 1, when GPU-bound, frames overlap each other, which increases overall fps but requires the CPU frame to start earlier in the pipeline, increasing lag. In BFV you have an option called "future frame rendering"; disabling it actually prevents that and makes the frames not overlap at all, same as frame capping, so it reduces lag but decreases GPU usage and fps as a result.
@lolaleath18 4 years ago
So basically, getting your fps as high as you can while staying under 97% GPU load is the best way to go? (Without using either Anti-Lag or Ultra-Low Latency)
@doffy5394 3 years ago
yes and if you want to lower the input lag even further try ULLM set to ON vs OFF
@JunkratCake 5 years ago
There is a new option on the PTR called High Precision Mouse Input which uses a separate thread for mouse input. I tested it, and now my input response time feels like I am on 300 fps even at a lower cap like 144. It feels so good now
@jasonde2866 4 years ago
woah i didnt even notice they had that option
@samuelmanuel788 4 years ago
What is PTR, may I ask?
@JunkratCake 4 years ago
Samarth Mannual it's the public test release, a separate install of Overwatch which has future features for testing. Anyone can install it, as it helps Blizzard with balance changes and stuff
@totalprocall6894 2 years ago
You deserve much praise for this video... great work. How could AMD or Nvidia not explain this more clearly for people? Subbing based on this video.
@Bensam123 5 years ago
Very interesting. So a better version of Anti-Lag would be something that dynamically limits the FPS in the game you're playing, based on a moving average of your GPU load, to something like 95-98% (depending on the overhead needed to achieve these results). This could also be something messed up in Windows, and you're essentially manipulating a bug of some kind.
@chesterthesniper 5 years ago
Thanks for the tips man!!! Now I run CSGO at 10 fps. I can't see shit, but I have the fastest reactions ever recorded in the universe, thanks fam
@AzusaSnowflake 5 years ago
it can only cap to 49
@fiqirr 5 years ago
@@AzusaSnowflake issa joke
@mikfhan 8 months ago
TLDR: you want stable frame times, which means you want to cap your framerate with AMD Chill + FRTC (or whatever the Nvidia equivalent is) so you stay at or below ~97% GPU load at all times. In that situation neither CPU nor GPU is under max load, so Anti-Lag and Ultra-Low Latency are not necessary and may even introduce minor latency on top. At least that's how I read/saw this. Thanks Chris :)
@matteoaroi8681 5 years ago
This is extremely interesting! Great video as always.
@NolePTR 4 years ago
DPC latency should have a pretty noticeable effect on input lag too. When GPUs are maxed out (or any device, actually), DPC latency increases. It also significantly affects network jitter, so minimizing it is critical to reducing input lag.
@Loco00 5 years ago
What about "Low Latency Mode On"?
@dunteltesco264 5 years ago
Yes please. That mode should have been tested as well.
@Case_ 4 years ago
CPU utilization of each individual core doesn't really tell you much either. On modern systems, you're unlikely to get into a situation where one CPU thread consistently represents a single app/game thread, because that is not an efficient way to schedule CPU time. It does happen at times, but it's pretty rare. Usually, even if the game is mostly utilizing say two main threads (which is pretty much the majority of games nowadays, with very few exceptions), these threads are being "thrown around" the actual CPU threads (=cores). As a result, you don't see two "cores" being utilized close to 100%; you see most of your cores being utilized to a certain extent.

What makes matters worse is that these CPU monitors are also slow, so they "average out" the values you see. As a result, you can see your cores being utilized to say 20-30% max and *still* be completely CPU limited by the game's main thread. Almost everyone makes this mistake - they see activity on a lot of cores and assume there's no CPU bottleneck and the game is well multithreaded, when in a lot of these cases the games are still heavily CPU bottlenecked (because their main thread is reaching maximum CPU utilization, but it doesn't show in CPU "core" utilization) and rarely make use of more than 2-3 actual main threads. Even pros such as Digital Foundry consistently make this mistake, which drives me nuts, because not only should they know better, but they're also effectively confirming these misconceptions in their viewers... :(

If you want to know whether or not your game is CPU limited and how many threads it actually uses, you need to use a tool that allows you to monitor the CPU utilization of the actual threads of the application/game. Only then do you have correct data. Personally, I use Process Explorer for this.
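Process Explorer shows this interactively; the same per-thread view can be scripted, for example with the third-party psutil library (a sketch; the per-thread CPU times are real psutil API, but thread *names* are not available this way, only IDs):

```python
import time
import psutil

def busiest_threads(pid, interval=5.0, top=5):
    p = psutil.Process(pid)
    before = {t.id: t.user_time + t.system_time for t in p.threads()}
    time.sleep(interval)
    after = {t.id: t.user_time + t.system_time for t in p.threads()}
    # CPU seconds consumed per thread over the interval, as % of one core
    usage = {tid: (after[tid] - before.get(tid, 0.0)) / interval * 100
             for tid in after}
    for tid, pct in sorted(usage.items(), key=lambda kv: -kv[1])[:top]:
        print(f"thread {tid}: {pct:.0f}% of one core")

# busiest_threads(game_pid)  # one thread pinned near 100% = CPU-bound game
```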
@mujtaba360 5 years ago
Finally a test that relieves me... I've been wondering for years why COD4 is so freaking smooth compared to other games with the same or similar fps values. Now it makes sense: in COD4 my GPU is capable of giving me 500fps but I limit it to 250, which keeps the GPU load at about 37-42%. But in other demanding games I don't set a limit to get 200 fps, and the GPU is at 99%. I always put the blame on the new game engines, but now I know the truth, uhhh
@NomadFM 4 years ago
Getting a better monitor is most important.
@gevorian 4 years ago
You put an incredible amount of effort into this video, great job.
@clintans 5 years ago
Thanks for this. I had this theory for a while. So I have locked my fps to 120. I'm on a 240hz monitor. It just feels better with hitregs. I9 9900k and 2080 Ti
@VicBayolo 5 years ago
Yeah I locked my fps to 120 a few days ago and man, it feels a lot better. I'm hitting shots easier for some reason.
@Jazzinplayer 5 years ago
I'm going to try it out. I'm in the same scenario as you and very curious to see how it feels. Maybe a 140 fps limit?
@VicBayolo 5 years ago
@@Jazzinplayer if you can keep a constant 140 fps, sure, why not.
@clintans 5 years ago
@@Jazzinplayer As long as it's stable and consistent, why not.
@DerJoshbert 5 years ago
That makes so much sense! If you reserve some resources of the CPU and GPU to process the inputs more effectively, then this results in better latency.
@lizardizzle 5 years ago
What I don't get is that we've (or at least I'VE) been assuming that a lower fps always means a higher input delay. This sounds like as long as no v-sync is involved, limiting the framerate will always reduce the input delay when the gpu could be pumping them out higher, but that just doesn't make sense to me. I mean, the results are right there, in multiple games, but surely a game running at 1000 fps will have a lower input delay than if you limit that game's framerate to 144, right? If you did that, the GPU would be running at like, 30% usage instead of near 100%.
@BattleNonSense 5 years ago
lower FPS still means more input lag (like 30FPS vs. 60FPS at less than 80% load) - the caveat is that you have to think about the input lag increase when nearing 99% GPU load.
@lizardizzle 5 years ago
@@BattleNonSense That actually makes a lot more sense in my head, thanks.
@KEM4OFFICIAL 5 years ago
I already feel smarter by listening to his voice.
@dans6117 5 years ago
Just found out about the low input lag feature in the latest Nvidia drivers, thank you for letting us know about the new features :)
@ermansengul461 2 years ago
Don't run the video card at full load; even if the fps is high, there will be high input lag
@ohdudez 5 years ago
I knew it, my games always felt far more responsive when locking the framerate at 30 compared to letting it run wild around 40-70. Glad to see my hypothesis confirmed and that I should disable anti-lag to reduce input lag even further
@GunzHeroLoLoL 5 years ago
This is with low latency mode entirely disabled versus ultra, right? What about just having it 'on' (the old 1-buffer mode)?
@NinoPanino 5 years ago
Very interesting. You're the first to go in-depth on this topic. Everyone else just says turn it on and move on. I'm wondering now if I should turn it back off, as I cap my fps to 160 for most games.
@jimez86 5 years ago
How does adaptive sync affect this?
@hsew 5 years ago
This changes everything. If lower framerates can lead to a better experience, soon, GPU marketing will be less about FPS or frame times, and more focused on input lag and frame time variance.
@jonathansidwell1 5 years ago
WELCOME BACK BATTLENONSENSE
@Gonbatfire 5 years ago
Holy nuts i was like "what??, how???" while watching this, it's super interesting, gonna give it a try, thanks dude i'm a sub now
@Kalimeromitsopreo 5 years ago
That's science! Thanks for this
@RealNC 5 years ago
This effect has been known for quite a while now. I've been suggesting to people to use frame limiting to lower their input lag in games.
@loyalitiy 5 years ago
It actually makes sense on a hardware level. My assumptions: 1. The GPU architecture is similar to the CPU (instruction fetch/decode phases etc.). 2. The need for loading/saving registers happens for each frame. Conclusion: fewer frames means fewer decode/load instructions and less saving/loading of register data, which means less overhead, which gives you lower latency. You sacrifice peak computing performance for lower latency. PS: if I made a mistake, feel free to tell me
@igorthelight 5 years ago
Sounds reasonable.
@emromw 5 years ago
I don't quite understand why that results in an overall latency increase. Is there maybe an architectural limit or choice that favors higher frame throughput (i.e. higher fps) but has the downside of higher input lag? This wouldn't surprise me, since benchmarks mainly concern themselves with framerates, and GPUs get judged accordingly. Even taking frame time consistency into account is a rather new trend. So why would they care about low input lag, if no one ever cares about that?
@loyalitiy 5 years ago
​@@emromw OK, a little more explanation. Let's say the processing goes so fast that the time it takes to calculate is insignificantly low. Your overhead comes mostly from fetch/decode (which means translating and interpreting code) and saving calculated data for future use (e.g. in case of a context switch, or other calculation tasks forced by your OS). To speed things up, we have branch prediction: the processor guesses what might actually happen for the next picture. Let's say the first 10-15 frames share the same instructions: nothing needs to be "loaded" (the instruction was predicted), only calculation and result saving. The next couple of frames need some other instruction, the branch prediction gives you a false register and wrong instruction, and now you've messed up the whole queue, because you need to reload and save current data and load the new instruction. At this point you are doing a double round of calculation; the first results are useless, so theoretically calculation happens two times for this frame. When the queue is full and the GPU is at its limit, it can present a frame every 16ms but internally the timing is kind of messed up (internal times: avg. 16ms, range 10-25ms). More frames means more possible wrong predictions, which causes higher latency. In addition you have to consider that we have a fully parallel system, so we have to deal with a lot of context switches, because we have more threads than CPU/GPU cores. IO (in/out) input is a huge cause of delay, because every time we move our mouse it might be enough to cause a massive amount of "wrong predictions". This is highly simplified. You have to keep in mind that every user input is highly complex; for a computer it's hard to understand why the user presses a certain key. You want to see your input immediately, so everything has to be stopped, interpreted/calculated, and that's where a solid amount of delay comes from.

Nr.1-PS: you can test it out for yourself when you open the Task Manager. Under CPU you find processes, threads and handles; you will notice that you have a couple hundred processes and a shit-ton of threads. Everything happens smoothly because of fast switches and intelligent organization, but everything has its downside, especially with computers.

Nr.2-PS: the more frames you calculate, the more information you get in exchange for latency; those are my assumptions and conclusions :D

Disclaimer: everything is highly dependent on game engine and OS choice
@JustInTime0525 5 years ago
This is eye-opening, thank you for testing them!
@connor5847 2 years ago
Have you ever heard back from AMD and do you know if this issue still persists on modern graphics cards?
@x1c3x 5 years ago
I'm surprised that this is new info. One of the perks of owning lower-end GPUs over the years is noticing these "anomalies" and random performance issues even though fps may be fine. The first time I noticed 99% GPU use causing poor performance was on my Radeon 4890, but that was paired with an old CPU so ..lesser.. performance was expected, and it didn't happen often, which is one of the reasons it became obvious when it did happen, funnily enough. I stopped worrying about it after getting my new system though, but it's much more noticeable in the lower fps ranges (50-80) on lower-end hardware. Been saying for years I'd love a driver setting to cap GPU performance by %, like maxGpuUse=90%.
@tinamunich3107 5 years ago
AMD lets you set a power target. -50% power draw for your GPU = never 99% usage.
@x1c3x 5 years ago
Tina Munich I have an Nvidia card. If I limit power in Afterburner it just downclocks, and the 99% becomes 99% of a lower core clock.
@codectified 5 years ago
Yeah, I was wondering the same. This is old news for sure, but what's interesting is I haven't seen any explanation for why this is the case.
@tinamunich3107 5 years ago
@@x1c3x maybe it's MSI Afterburner being 3rd-party software vs the manufacturer's driver.
@feschber 5 years ago
Could you please test this with CSGO? I heard that this game doesn't like frame caps, and I also want to know what the input lag is like at ~300fps
@vapor5167 5 years ago
yes
@dannass5 5 years ago
His input tests were not recorded
@dannass5 5 years ago
On vidya
@TheKillerZmile 4 years ago
Set low latency mode to OFF. CSGO is CPU-intensive and already has the lowest input lag when you have high fps
@lapin0307 5 years ago
I know this comment might be lost in the list but I'll post it anyway. From my testing on my GTX 1080 I've found that default settings work best. Any program settings I changed would add lag, even the full RGB setting. The only change that works best is display scaling (no scaling). The same conclusion as yours applies. Furthermore, I've seen that enabling V-sync below the refresh rate makes motion more in sync on a G-Sync monitor (if that makes sense); it's easy to verify by shaking the mouse a little from left to right. Hope it helps.
@Sud0F1nch 5 years ago
I SWEAR I'VE BEEN NOTICING THIS!
@chicken7106 5 years ago
It actually makes complete sense. With low latency/anti-lag on, you are limiting or completely removing the number of pre-rendered frames that the GPU can prepare. When GPU load is below 100% and these features are disabled, it can use that extra processing time to render the next frame(s) in advance, reducing input lag. Seems like these features are working as intended, just not beneficial for every use case.
@microsista1413 5 years ago
that doesnt make sense
@max_955 3 years ago
This is extremely interesting. Now I get why I have improved reaction times when I enable the fps limiter.
@CYNC33 4 years ago
So this is why Rocket League started to feel more sluggish when I moved to my 240Hz monitor. My GPU was pretty much always at 100%, whereas before with my 144Hz monitor it wasn't. Makes perfect sense now!
@s_for_short2400 4 years ago
Unless you capped your game at 144fps, that shouldn't be the case
@CYNC33 4 years ago
@@s_for_short2400 I used to cap my fps to 141 when I had my 144Hz monitor because it was FreeSync. As soon as I got my 240Hz monitor I maxed it out at 250. I'd generally hover around 210fps in game, which means my GPU was at 100%. Now, even though I'm still using my 240Hz monitor, I've tried capping it at 144, 165, 180 and 200, and every single one of those feels much better than 250. I am using RivaTuner as well
@kaddasixseven3581 5 years ago
I FUCKING love your scientific methodology. Rare to see in RU-vid video reviews. The results in this video are also a huge revelation. Who could have thought that lower fps can result in lower latency? The math doesn't seem to make sense.
@desmond3245 5 years ago
RTSS scanline sync requires the GPU load to stay under around 80% for it to work. I wonder if that has to do with these results.