Amazing demonstration! Always saw the NVIDIA Low Latency option in settings and everyone told me to turn it on. Didn't know it made this much of a difference.
It's actually not that great. The only halfway truthful bit was the Brimstone op shot; the Jett peeker-advantage bit is a complete misconception and flat-out wrong.
@@manasmitjena5593 well, skins actually have an influence on how you play and perform. Even though the stats, damage etc. are exactly the same with and without skins, some skins make you feel more accurate. Idk how to explain it better
@@preetish2158 I used to be good with 75Hz or even 60, but now at 144Hz it's just so much more comfortable that it's much easier to flickshot. The Model O also had a huge impact on that
honestly that's kind of what PC is: you pay more for it, you get more of an advantage than others. It's how it's always been for PC. On console everyone is even, except for maybe elite controllers
@@ImConnor01 don’t jump to conclusions my man, I just bought a PS5 to go alongside my Xbox Series X and my PC powered by a 3070. Get the fuck off your high horse
@@av4up this comment was made when GPU prices were still sky-high due to crypto miners and it was extremely hard to get your hands on one since they’d be sold out. Thus, OP was sarcastically saying the miners would definitely love this new GeForce feature, since they’re the only ones with the GPUs to begin with. (However, the mining issue has now gotten significantly better and GPU prices are waaay lower and very close to original MSRP. That’s why I finally upgraded from a 1060 to a 3060 a couple weeks ago 😁)
It might be 30 ms less, but only if you are fully GPU-bottlenecked. And if you enable this feature you are going to have some fps drops; you always trade some framerate for lower lag. But don't expect 30 ms, that's an edge-case scenario.
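To put a rough number on that, here is a toy render-queue model. This is entirely my own sketch, not NVIDIA's published pipeline: the queue depth (2 frames) and the 16.7 ms frame time are illustrative assumptions. The idea is that when the GPU is the bottleneck, the CPU runs ahead and queues frames, and each queued frame adds about one frame-time of latency; a Reflex-style limiter keeps that queue empty.

```python
# Toy render-queue model (my own assumption, not NVIDIA's actual pipeline).
# When the GPU is the bottleneck, the CPU submits frames ahead of time;
# every queued frame adds roughly one frame-time of input latency.
def latency_ms(frame_time_ms, queued_frames):
    # time spent waiting in the queue, plus one frame-time to render/display
    return frame_time_ms * (queued_frames + 1)

gpu_bound = latency_ms(16.7, 2)  # ~60 fps with 2 frames queued -> ~50 ms
capped    = latency_ms(16.7, 0)  # queue held empty              -> ~17 ms
print(round(gpu_bound - capped, 1))  # ~33 ms saved
```

Outside a GPU bottleneck the queue is already near-empty, which is exactly why the saving shrinks in the non-bottlenecked "edge case" described above.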
@@A2Lettuce if you are a competitive gamer, yes. Since you will be putting thousands of hours into a game, why would you want a big disadvantage against players who will destroy you just because they have a better PC? But even if you have a really good PC, there will always be people with a better one, so it depends.
@@A2Lettuce commenting here for you to re-read this comment and get a chance to delete it, you've probably grown up and realized what you said, no need to thank me my good friend
Well, maybe a lil more than huge, my guy. Depends on which integrated graphics: if you have UHD then it's good and still an accomplishment; if you have Intel HD 4000 or 5000 then that's amazing, because that's insane, it should be impossible with the stutters; and if you play on an Intel HD below the 1000 series, like the Intel HD 520, then bro you might be able to get in the big leagues, cause lemme tell you, it can't even run Minecraft nicely on fast graphics and low render distance. Take it from a noob who gets 60+ fps and is still a noob lmao
@@goomyliaplayz7911 thanks man, now I'm on UHD 630, but I used to play on a dual-core PC without a GPU and reached Silver 3... enemies sometimes shot me with 3 bullets but on my screen it looked like they one-tapped me
@@matisvanasse5905 bro shut up xD, flexing your money on others doesn't make you a better person. I'm playing on 60Hz rn and it's alright. Ik higher Hz feels a lot better, but if you think 144 is unplayable, then you're probably hardstuck in Bronze or summ
@@limes_I Bro I'm not flexing, I'm just giving my opinion. I swear, once you've played 1-2 hours on 360Hz, 144 feels bad and 240 feels just ok. Didn't mean to seem like a jerk; I hope these technologies become less expensive, like 144Hz did. 120 and 144 were expensive af and became the norm with time, and I hope 360Hz becomes the norm too. I'm not rich, I'm just an enthusiast, and btw I don't play Valorant. Have a nice day G (and English is my third language, ignore the errors lol)
Sure, maybe a fraction of a second of advantage. However, a skilled player with a functional kit will still be able to drop someone using the lowest-latency gear available.
Getting a PC that can run 360fps in Valorant is actually not that pricey. I mean, for some people it's kinda pricey, but nowhere near as pricey as a 360Hz monitor.
This is true; that's why players with the best internet connection and a powerful PC go to the top. 🙋 If you're a poor player with a poor internet connection and a wooden PC, you're just food for these rich players. Real talk.
Everyone should have 360Hz :) But then... who would want to work hard or study physics, IT or medicine if they just could get 360Hz like everyone else :P
Hi, I'm having stuttering in Valorant even though I have 100+ fps, but sadly on a 60Hz old TV monitor... NVCP Low Latency Mode is set to Ultra, while in-game Reflex is On + Boost. And I'm still getting stuttering; it feels like 30fps 😕 What settings would you suggest to end the stutter while keeping input lag low? Thanks in advance.
In 2 years, when they aren't sold out, people will be drooling over the RTX 3120 Ti, the shocking 890Hz, wallet-breaking, 9-billion-fps card for $2000 (on the NVIDIA site)
I am not sure the video is correct. I mean, does it matter what you see? Or does it only matter when you send the information (to the server) that you clicked?
It should be about 35 ms at 360Hz Reflex off and about 28 ms at 144Hz Reflex on. Step-by-step calculation of my own (there is a possibility it is wrong, but it makes sense, right? :D):
- Known from the video: 60Hz Reflex on ≈ 33 ms, 60Hz Reflex off ≈ 64 ms, 360Hz Reflex on ≈ 13 ms.
- Reflex on: 33 ms − 13 ms = 20 ms saved over 300 Hz, so about 1 ms per 15 Hz. 144Hz is 84 Hz above 60, so 144Hz Reflex on ≈ 33 − 84/15 ≈ 28 ms.
- At 60Hz, the gap between Reflex off (64 ms) and on (33 ms) is 31 ms, which is about 48.5% of 64. Applying that same proportion: 48.5% of 28 ≈ 13.6, and 64 − 13.6 ≈ 50.5. So 144Hz Reflex off ≈ 50 ms.
- 360Hz Reflex off, calculated the same way, comes out around 35 ms.
Thank you for looking into this matter :D
Notice how they never show 360Hz with Reflex off? That's because Nvidia Reflex does next to nothing past 144Hz, which is what 90% of competitive players play at. That's why Nvidia Reflex is useful for things like AAA games, where most people have lower framerates, but is useless for competitive gaming.
@@xman10110 The "144hz" that you refer to is the maximum range for G-Sync. Reflex does nothing to change the refresh rate. It simply minimizes the input delay.
@@phamxuankhoaa I'm saying that gameplay at 144Hz already has such low input delay that Nvidia Reflex doesn't make a noticeable difference, and they are purposely hiding this by only showing Nvidia Reflex at 60Hz. Nothing in this video is related to G-Sync, and 144Hz is not the maximum range for G-Sync; that just depends on the max refresh rate of the monitor.
Exactly, because this is 100% bullshit. The truth is, in the 360Hz vs 60Hz comparison, the two clips weren't even played at the same framerate (hint: check the description, the 60Hz footage is at 90fps whereas the 360Hz footage is at 360fps). Yes, Reflex might decrease input lag, but this comparison isn't fair in any way. Basically, in case you didn't know (but I guess you do): the more fps, the less input lag, and that has nothing to do with Hz. The game doesn't care whether the image is displayed or not, it just processes it (that's also why there is tearing, the game doesn't wait for the image to be displayed before processing anything). More framerate = shorter tick interval = the code being processed more often = less input lag. What impresses me is how few people acknowledge this. Shame.
@@bttfsof I don't think you are right, or else the YouTuber Battle No Sense is full of shit: he says that yes, 360 fps at 60 Hz is faster than 60 fps at 60 Hz, but 360 fps at 360 Hz is even better.
@@Z3t487 Of course, a higher refresh rate is better, there is no debate on that. The thing is, at a 60Hz refresh rate and a high framerate, it depends on where in the refresh window the input is actually received by the game. See, at 60Hz there is a 16.67 ms interval between refreshes, so if the input is received 14 ms after the last frame, and you're running at a very high framerate (let's say a crazy 1 ms frame time, meaning 2 ms = 2 frames), then you could see a response to your input only ~2 ms after you actually pressed! This just serves as an example of course, but that's what happens at high framerate and low refresh rate.

Edit: and beyond that, as I probably said in my previous comment, if the game at 60fps takes more than 2 frames to "get" the input, it would have a minimum of 33 ms of delay, so there is big room for improvement: increasing the framerate can lower this to a max of 16 ms on a 60Hz screen. Basically, a high refresh rate only makes it more consistent; you can see an improvement in input lag even on a 60Hz monitor, and with a large enough test sample the lowest response times should match on low and high refresh rate screens. The thing is, comparing refresh rates at different framerates is nonsense and purely misleading, since framerate is a major variable in input lag. Basically this part of the video: ru-vid.com/video/%D0%B2%D0%B8%D0%B4%D0%B5%D0%BE-msOWcvoIC8M.html (wrote this before seeing it btw)
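The frame-window argument above can be simulated. This is a minimal sketch under simplified assumptions of my own (the input is picked up at the next game-frame boundary, rendering takes exactly one frame-time, the result appears at the next 60 Hz scanout, and there is no render queue or monitor response time):

```python
import math
import random

# Toy click-to-photon model (my own simplified assumptions, see above).
def avg_latency_ms(fps, hz=60, trials=50_000, seed=1):
    rng = random.Random(seed)
    frame, refresh = 1000 / fps, 1000 / hz
    total = 0.0
    for _ in range(trials):
        t = rng.uniform(0, refresh)                   # click lands at a random moment
        sampled = math.ceil(t / frame) * frame        # picked up at next frame boundary
        ready = sampled + frame                       # one frame-time to render
        shown = math.ceil(ready / refresh) * refresh  # appears at next 60 Hz scanout
        total += shown - t
    return total / trials

# Even on the same 60 Hz panel, 360 fps responds noticeably faster than 60 fps.
print(round(avg_latency_ms(60), 1), round(avg_latency_ms(360), 1))
```

Under these assumptions, 60 fps averages about 25 ms of click-to-photon delay, while 360 fps comes out well under that on the same 60 Hz screen, which is the comment's point: framerate cuts input lag even when the display can't show the extra frames.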
But wouldn't it be worth buying a better CPU then? Because fps higher than your refresh rate is always better: it makes the game smoother + decreases input lag
@@MrZodiac011 Yeah, but no one will play at 60Hz with low latency mode, because there will be no difference. And at high Hz there will be no difference between high Hz alone and high Hz with low latency either.
It's not that NVIDIA isn't producing cards; it's that their chip supplier has problems with production. NVIDIA depends on them, so they can't do anything about it themselves. And if you didn't know, it affects other companies too, not only NVIDIA.
Where are my 3080s? Oh right, miners are buying them in bulk from the factories LOL. Or maybe NVIDIA is mining with them themselves, since it's so profitable now, who knows...
@BartLx You can resell consumer cards even if you didn't make any profit; with an ASIC you can't. ASICs are a pretty risky investment, that's why no one gives a damn about them.