I enjoy how your videos include subtitles. I'd never noticed them before, but I always appreciate it when creators go through the effort to make them, which honestly isn't that great. Too many YouTubers, even educational ones, don't have subtitles on their videos in any form. It's really frustrating, because making them is a bit time-consuming but not that hard, and for the accessibility they provide the work is worth it.
I always thought the auto-generated subs on Philip's videos were pretty good, until one day I noticed some notes Philip had sprinkled into the subs. It just makes things easier to understand for non-native English viewers.
Probably the best way to do subtitles on YT is to let it auto-generate them, then download them, tweak them, and re-upload. It's still quite a bit of work depending on how long the video is.
As someone who used to write and maintain TF2 config files, I'm not surprised at all by these results. While I was limited by sv_pure restrictions, obtaining 1000fps was a trivial task on 4th-gen Intel CPUs. Of course, this was on an empty map staring into a wall; in an actual competitive 9v9 match the frame rate would easily be reduced to a third. Still, it was really entertaining to see how low, or high, depending on your perspective, the game could go. Also, LOD-biasing the game to look like Minecraft is always a treat.
TF2 vs CSGO is an interesting case, as TF2 has a lot more going on that would bog down the CPU, whereas CS:GO is heavier on the rendering side with its shadow system. Although these differences are really only noticeable on 2010s-era hardware. I wonder how high a stable framerate Philip's CPU could handle staring across mid on a full 32-player Hightower server.
A little late to the party, but I didn't see anyone in the comments talk about the fixated frame rate numbers at 10:43, so here's my best guess. The numbers are all fractions of 4096. The simple ones are the powers of two, 2048 and 1024, which are just 4096/2 and 4096/4 respectively, while the other numbers are the closest integers to the fractions 4096/3 and 4096/5. For example, 4096/5 = 819.2.

My theory for why this is happening is something along the lines of the game measuring frame times in 4096ths of a second. If a frame takes, say, 3 of these ticks to render, then the game sees this as a time of 3/4096 seconds, which equates to 1365.33 frames per second. The calculation must be more complicated, because you're getting fps numbers that aren't just fractions of 4096, but I would guess that at some point during the process it's doing something like what I described, then perhaps averaging over several frames to get the final number. It's worth more investigation.

I don't think the paper has anything to do with the phenomenon; it just happens to also be looking at integers that are close to fractions of 4096. Essentially, what it's describing is taking the numbers from 0 to 4095 inclusive and putting them into a certain number of boxes. You can see how this would result in the same numbers as you're getting, because if you try to split these 4096 numbers equally into five boxes, you're going to get 819 in four of the boxes and 820 in one of them. That's what's displayed in the table, which shows how many numbers are in each box when you split 4096 into anywhere from 2 to 10 boxes. (It's a little confusing because the number m on the side is the number of boxes minus one.)

Edit: SOLUTION!!! The game seems to measure frame times in 65536ths of a second. After pausing your video and doing a bit of math, I realized that all of the fps figures were fractions of 2^16, or 65536. This leads to the same math as before, except this time a frame rate like 1376 corresponds to 65536/48, and it also explains all the other frame rates the game can display, like 1820, which is 65536/36. The reason these frame rates are sticky is that once you get to 2000+ fps, the difference between the game taking, say, 32 ticks to render and 31 is a difference of 66+ in the displayed frame rate. This means that if the real frame rate is holding steady at 2500fps plus or minus 100, the only three possible frame times the computer can see are 25, 26, or 27 65536ths of a second, which it calculates as 2621, 2520, or 2427fps, and so the 200 possible real frame rates get compressed down into only 3 that the computer can display. This isn't a problem at lower frame rates, because below 256fps the difference between displayable frame rates is less than 1 frame per second.
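A quick Python sketch of that edit's math shows how the stickiness falls out of a 1/65536-second timer (the 65536-tick resolution is the commenter's inference, not confirmed engine behaviour):

```python
# Hypothetical model: frame times measured in whole 1/65536 s ticks.
TICK_HZ = 65536  # assumed timer resolution, per the comment above

def displayed_fps(real_fps):
    ticks = round(TICK_HZ / real_fps)  # frame time snapped to whole ticks
    return TICK_HZ / ticks             # the fps the game would then report

for real in range(2400, 2601, 25):
    print(real, "->", round(displayed_fps(real)))
# Every real frame rate from ~2400 to ~2600 collapses onto just
# 2427, 2521 and 2621, matching the sticky values described.
```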
@@scottmcqueen3964 It's not a problem though? When he tested it on an actual server, it was noticeably teleporting him back, since CSGO uses good networking techniques. By this logic, the speed hack will only ever work on local servers you're hosting, so it's not really an issue.
Assumption about the speedup: it's probably caused by the way Source calculates delta time. Delta time is used to ensure that physics (movement etc.) are bound not to the FPS but to the ticks, while still allowing higher FPS than the tickrate.

In this case, I assume the original assumption was that a frame would never be generated faster than 1ms (1/1000 sec, reached at 1000fps), so the developers probably divided the value by 1000 somewhere along the way to turn it into ms. At 1500 FPS, we get 1500fps / 1000 = 1.5. Since the original code was written under the assumption that 1000fps would never be reached, this operation, fps/1000, would always result in a value below 1.0.

Now here's why a value above 1.0 is significant: delta time is usually used by multiplying vectors (e.g. your movement direction and speed) with it. If the delta time is, for example, 0.5 (right in the middle between two ticks), the game multiplies your movement vector by something below 1 and thus emulates a "partial tick", putting you somewhere between where you were at the last tick and where you'll be at the next tick; 0.5 would put you exactly in the middle of those two positions. If that value, which was assumed to be below 1, happens to be something like 1.1 due to the game doing 1100fps/1000, it means that at some point between the two ticks you will actually have surpassed the point you were supposed to be at at the next tick.

This is just my assumption based on knowledge I gained while programming a few small games, and it may be wrong. Still, this explanation makes sense to me.
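A minimal sketch of the overshoot this comment describes, using made-up names (this is the commenter's hypothesis, not actual Source code):

```python
def render_position(prev_pos, next_pos, alpha):
    # Interpolate between the previous tick's position and the next one.
    # alpha is the fraction of a tick that has elapsed; it should be <= 1.
    return prev_pos + (next_pos - prev_pos) * alpha

prev_pos, next_pos = 0.0, 10.0
print(render_position(prev_pos, next_pos, 0.5))  # 5.0: a "partial tick"
print(render_position(prev_pos, next_pos, 1.1))  # 11.0: past the next tick
# If alpha were computed as fps/1000, anything above 1000 fps gives
# alpha > 1 and extrapolates you beyond where the next tick puts you.
```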
I'd bet the game running faster than intended is something to do with the delta time values between frames. If they're only tracked to 3 decimal places, the minimum would be 0.001, but above 1000 fps this might be rounded to 0 in some cases, which would mean the game thinks no time has passed between one frame and the next, and might skip ticks to "catch up", since it thinks it needs to, or something like that.
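The rounding guess is easy to check with a worked example (illustrative only):

```python
# Frame deltas stored to three decimal places, i.e. whole milliseconds.
for fps in (500, 1500, 2500):
    dt = round(1.0 / fps, 3)
    print(f"{fps} fps -> stored dt = {dt} s")
# 500 fps  -> 0.002 s (correct)
# 1500 fps -> 0.001 s (50% too long: the game would run 1.5x fast)
# 2500 fps -> 0.0 s   (the game thinks no time has passed at all)
```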
I would guess it's some floating point arithmetic error too, but it might even be Windows being Windows. As far as I know, in some languages like Python one of the time library functions is limited to a resolution of about a millisecond, and that might have to do with HPET or some internal timer implementations in the Windows API. I'm not a low-level C programmer though, so someone with more experience could maybe explain what's going on. Regardless, after a quick test in Python 3.10, I fairly consistently get 0.00099706649 seconds (0.99706649 ms) per step in a loop. That isn't exactly 1 ms, but it might still affect the game engine, since 1000 ms / 1000 frames = 1 frame every millisecond.
@Based Madara Using time.perf_counter and time.perf_counter_ns I can get it down to ~500 ns, which is great for Python, but it gets it from the OS, and it's the process time, not system time. time.monotonic returns a very accurate and stable value, but its resolution on Windows is about 16 milliseconds. I'm guessing game engines use far more accurate methods for time-critical functionality, yet lots of games nowadays break completely with high polling rate mouse sensors, or even the very widely used 1000 Hz, which is baffling to me. At the very least, devs should ensure that high framerates or huge frame time variations don't cause unfair advantages or stuttering.
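For anyone curious, a rough way to probe those clock resolutions yourself (results vary by OS and Python version):

```python
import time

def resolution(clock):
    # Spin until the clock's value changes and record the smallest
    # observed increment over many attempts.
    best = float("inf")
    for _ in range(100):
        t0 = clock()
        t1 = clock()
        while t1 == t0:
            t1 = clock()
        best = min(best, t1 - t0)
    return best

print("perf_counter:", resolution(time.perf_counter))
print("monotonic:   ", resolution(time.monotonic))
```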
I have no idea what the technical details are, but I do know this is a classic problem from old DOS games: the original software was designed for a slower processor, the devs didn't account for the future, and now your game runs 2000 times faster than it should because everything was built on the CPU/GPU's clock speed.
I've been waiting for this moment for a long time. Ever since I surpassed 40 fps by upgrading laptops, I've wondered how far we can go. Congrats on cracking 1000 fps!
I'm a verifier for Portal on srdc, and one time someone actually submitted a run with the game running at over 1000 fps. It appears that framerates higher than 1000 actually cause the tickrate to increase, going from Portal's default 66.67 to over 100 ticks per second. Needless to say, we didn't accept that run :p
Hey, wanted to tune in and let you know that the fps counter being weird and hovering around certain numbers is due to how CS internally calculates the fps it displays. Instead of doing it "properly" and counting how many frames are drawn, it uses a rolling average, which breaks at "higher" numbers because the time between frames is so small it just can't hold that much info. A much more accurate fps counter would be the Afterburner/RivaTuner overlay thingy.
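Here's a sketch of the "proper" counter this comment has in mind: count the frames whose timestamps fall inside the last second, rather than averaging a handful of tiny frame times (illustrative, not CS:GO's actual code):

```python
import time
from collections import deque

class FrameCounter:
    def __init__(self):
        self.stamps = deque()  # timestamps of recently drawn frames

    def on_frame(self):
        now = time.perf_counter()
        self.stamps.append(now)
        # Drop everything older than one second.
        while now - self.stamps[0] > 1.0:
            self.stamps.popleft()
        return len(self.stamps)  # frames drawn in the last second = fps
```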
Philip's GPU/CPU videos are so entertaining, and more interesting than other tech channels'. You focus on the less relevant benchmarks, but they are FAR MORE INTERESTING! The power consumption was very interesting, thank you E-Philip!
I generally don't care for ads, but I gotta admit that what Phil did here with the NordVPN ad read was pretty cool. He knows his audience and was able to turn it into something actually new and informative, instead of the usual boring and repetitive script every other YouTuber follows.
The bar graph starting @5:26 is getting comical. I cannot believe the 1% low is still above 400fps... AT 3840x2160. I bet moving your mouse in CSGO felt like butter while still enjoying the visual clarity of 4K. I can't help but envy.
@@griffin1366 I'm trying to work out the sweet spot for my next PC. Maybe a 3070, maybe the Ti. It seems like most stuff beyond that is just overkill for anything I'd need for a while. The 4090 just seems silly.
I love that you're tech-aware. So many YouTubers, streamers and pro players don't know shit about components and PCs. You even mentioned the then-upcoming X3D CPUs, which are now a thing and would totally push the fps even higher.
COD4 had some engine quirks that changed the physics at different frame rates. At 500 fps you could walk up slanted surfaces, I think. I'm pretty sure 250 fps gave the highest jump, while 333 was best for gaining acceleration in the air while strafing. These mattered for the COD4 jumping (KZ-like) and bouncing community. I think these are all general Quake engine quirks, but I've always wondered if other games had benefits like these.
333 doesn't give the most acceleration. It's used for height, as it's better for height than 250. 125 fps is best for head-on strafe jumps and for acceleration before landing on a platform, and 250 is best for diagonal jumps. I love COD4 physics and play CodJumper pretty much every day.
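The underlying mechanism in Quake-derived engines is that movement runs in whole-millisecond steps, so the integration error varies with frame rate. A hedged sketch (constants are Quake 3-style defaults assumed for illustration, not taken from COD4, and the exact favourable frame rates depend on the engine's actual integrator):

```python
JUMP_VEL = 270.0  # units/s, assumed
GRAVITY = 800.0   # units/s^2, assumed

def jump_apex(fps):
    # Frame time snapped to whole milliseconds, as in id-tech movement code.
    dt = round(1000.0 / fps) / 1000.0
    z, vz, apex = 0.0, JUMP_VEL, 0.0
    while vz > 0.0:
        z += vz * dt       # simple Euler step
        vz -= GRAVITY * dt
        apex = max(apex, z)
    return apex

for fps in (125, 250, 333):
    print(fps, "fps -> apex", round(jump_apex(fps), 2))
# Different frame rates land on measurably different jump heights purely
# because of the millisecond quantization.
```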
@@iceangelx22 Ahh, it's been a while, thank you for the clarification. I can remember my first time hitting the pipeline bounce, and the cheers from my more experienced friends in voice chat. It was a high like no other. I was a lurker on the CodJumper forums once upon a time as well. I'll have to install it again sometime.
It's sad that V-Sync is enabled on console games; it costs around 6-12fps per game and adds more than 10ms of latency. In games where you can preview it, like Rainbow Six Siege, I found that it runs at 4K maxed settings and pushes 150+fps, impressive for a console, though this game isn't hard to run: 23ms with V-Sync on, 11ms with it off. The game felt more responsive, and if VRR were enabled and DLSS 3 available, the experience could have pushed 200+fps. On a console. This was tested on a PS5. The cat scared by a fart and the weird FPS phenomenon were the high points! Thanks for the entertainment ❤️🥺
It would be interesting to see a follow-up on the high-fps speed bug. How does the game timer work? How does a bomb plant go? The bomb explosion? Just throwing some ideas out there.
OMG thank you Philip, 7:25 fixed the issue I have when I join community servers. Thank you soooo much, I actually thought I could never fix this, and I never thought having too much fps could break the game.
Hey Philip, there was a command in Half-Life 1 called clockwindow that stopped this "jitter" of FPS; perhaps CSGO has one too. I was pulling 999fps but had to cap it to 200, and it requires server owners to change the variable. Valve implemented it as an anti-cheat for speedhacks. (edited)
There has never been an sv_clockspeed command in any GoldSrc games, let alone Source games. It also wouldn't make any sense, since game clients/servers have nothing to do with setting any hardware configuration.
@@dealloc You're right, it's called clockwindow, and it was made to stop speedhacks. Setting it to 0 fixes this issue; you have to increase updaterate/cmdrate/rate and interp.
Thank you Philip for showing your air-cooled setup. It shows that you can have a perfectly fine high-end rig that doesn't need water cooling if you don't intend to overclock anything. The market and most tech YouTube channels seem to shove water cooling in the face of anyone looking to build even a mid-level gaming PC, and honestly you don't need it unless you're a PC enthusiast or going for heavy overclocking on your system.
@@3kliksphilip Is there something wrong with it? I thought the eco preset was also a simple power limit. Edit: ahh, we were talking about Intel CPUs. I thought eco mode was something you could set in the BIOS on both platforms; I think I've heard it mentioned a few times by different reviewers.
The speed hack above 1000FPS could be caused by floating point rounding error: if the game calculates movement by multiplying moveSpeed * deltaTime (the time between frames), then with such small values rounding errors occur and may prevent the calculations from being done correctly. It seems especially probable because those rounding errors show up around powers of 2 (1024, 2048, 4096), which are the numbers the FPS locks onto.
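A toy version of that hypothesis in Python, accumulating moveSpeed * deltaTime in 32-bit floats versus 64-bit (purely illustrative; the engine's actual movement code isn't known here):

```python
import numpy as np

move_speed = 250.0  # units/s, assumed
fps = 1000
dt32 = np.float32(1.0) / np.float32(fps)  # 0.001 isn't exact in binary
pos32, pos64 = np.float32(0.0), 0.0
for _ in range(fps):  # simulate one second of frames
    pos32 += np.float32(move_speed) * dt32
    pos64 += move_speed * (1.0 / fps)
print(float(pos32), pos64)  # the float32 sum drifts away from 250.0
# The drift here is tiny, but it shows that sub-millisecond deltas do
# accumulate rounding error in 32-bit floats.
```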
The fps counter liking certain values might be because the counter you were using doesn't use a high enough resolution system clock/timer API. Just a guess.

Edit: for example, if an fps counter were implemented as a queue, with frames older than 1000 milliseconds being discarded, the FPS accuracy could go wrong in two ways: the time recorded with each frame, and the current time you compare it to. This would go wrong if the current-time function's resolution were too low and it returned the same millisecond value for two frames. The reason you saw these funny numbers in a pseudo-random number generator research paper is possibly that random number generators are seeded on the system clock.

Edit 2 (thanks octav): it seems that CSGO fps is calculated from the time it takes to render each frame (averaged), and because 32-bit floating point values aren't accurate at very small magnitudes, precision can be lost as maths operations are performed on them. So in CSGO the issue is unrelated to the system clock.
@@octav5692 I see, so the inaccuracy in fps measurement in this scenario is probably due to low floating point precision, since all those usercmd/createmove values such as frametime are floats (or otherwise 32-bit), if I recall correctly.
New World's bricking issue was a specific manufacturer error with one GPU model that just happened to come to light because of New World. All that stuff about low settings and max fps has nothing to do with it: the menu ran too fast, and one specific card didn't have the right safeties in place to deal with it.
Yep. I hate how this gets brushed over by people. A game will not brick your PC unless the hardware itself fails. I remember the Battlefield 4 menu screen at launch would hit 6000 fps and push the GPU to 100% / full load, but by default the game was capped to 200 fps, if I recall.
Thank you for making this. At my work I occasionally work with real-time control loops running at 10 kHz. I will suggest they start writing the algorithms in Source.
The Source built-in FPS cap isn't very good at sticking to a value consistently - I've had more success with `fps_max 150` to target 145 FPS and keep a 144 Hz monitor maxed out. External FPS limiters don't have this problem, but they introduce more input lag compared to a built-in frame limiter.
@@Calinou It's actually hitting 150 FPS; it's just that the way the game calculates it in the HUD is off. Force the HPET timer to 0.5ms and the cap will bounce around 147-149. Go back to Windows 7 and it will cap to exactly 150 and be stable.
@@stormkiller4148 I'm running the game at 2.5x my refresh rate, so it's smooth enough. Most importantly, I'm saving some energy and polluting the world a little less. Who cares about smoothness.
Wow, you found another mathematical phenomenon with those frame rate numbers and that write-up you found. The same thing happened to me: there's a burger joint in Ann Arbor, Michigan that's famous for letting you have as many toppings as you want. They have a sign on the wall showing the possible combinations, equalling 2,147,483,646... one off from the RuneScape max cash stack size, aka the highest integer on a 32-bit system. But yours is much more odd to me. Good vid as always, Philip.
I love how a game I don't even play is something I follow so intently, simply because of the way you present your videos. You probably won't see this, but I appreciate your content.
Next up is trying to combine this with AI upscaling and frame interpolation, especially if the CPU is the limiting factor. 10,000 FPS is probably already in the realm of possibility.
1024 and 2048 are powers of 2, pretty important numbers in a binary world like computers'. I'm not sure why the framerate would fixate around certain numbers, however. I'd love to see a follow-up video if you're ever able to discover what's causing it.
I believe the repeated numbers have something to do with the game going off frame time rather than frame rate. All the repeated numbers are close to round frame times: 1.2ms, 1.1, 0.7, 0.6, 0.5, etc. Instead of determining the exact fps, it probably rounds to the closest "number" for that set.
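Working those frame times backwards does line up with the sticky values (a quick check, not proof):

```python
for ms in (1.22, 0.98, 0.73, 0.49):
    print(ms, "ms ->", round(1000.0 / ms), "fps")
# 1.22 ms -> 820, 0.98 -> 1020, 0.73 -> 1370, 0.49 -> 2041,
# all close to the 819 / 1024 / 1365 / 2048 figures from the video.
```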
There was an exploit in TF2 where having high FPS made the sticky launcher reload faster. Sure, it's useless for casual play, but it makes a huge difference in comp play.
Lower input latency. Input is polled each frame in most games, so as long as the framerate is lower than the polling rate, there's potential to decrease input latency by increasing framerate. 1000 FPS is the sweet spot for both 1kHz polling rate input devices and 1kHz refresh rate monitors, which, based on what Blur Busters have said, is the (minimum?) refresh rate at which zero motion blur occurs during on-screen movement.
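The back-of-envelope version of that argument (assuming input is sampled once per frame):

```python
# Average sampling delay is roughly half a frame time.
for fps in (60, 300, 1000):
    frame_ms = 1000.0 / fps
    print(f"{fps} fps -> avg input sampling delay ~ {frame_ms / 2:.2f} ms")
# Past 1000 fps a 1 kHz mouse becomes the bottleneck anyway, since new
# input data only arrives once per millisecond.
```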
The amount of detail he puts into these videos, the depth of every step, and the analysis of every detail is absolutely insane, and I for one love it. Imagine how much more he could learn with virtually unlimited resources to eliminate performance limiters like component temperature: somehow using dry ice to cool the components and benchmarking overkill CS on one of the most overkill PCs.
The game might be even faster on another version of Windows, something like Windows 10 LTSC, which would free up the processor a bit (since no telemetry is collected).
Literally zero difference, because modern Intel CPUs can run the entirety of Windows on 10% of an efficiency core. Stop with this placebo bullshit. LTSC isn't even aimed at you, and it's fucking hilarious that you think telemetry has any effect on performance.
I feel like the 1000fps mark for stuttering has something to do with milliseconds and timing on the Source engine's part. The game might be running too fast, and CSGO is telling it to slow down for a split second.
The vent sniff was perfect. I love the Steam Deck community's love for lowering TDP and sniffing probably-harmful but nice-smelling chemicals. I hope we see more ekliksphillip in the future!
wtf? the fact you started running faster is insane, never would've guessed that. and you even confirmed that forum post about the guy getting stutter over 1000 fps looool, neat
I have been wondering what that whining noise from my PC was ever since I got my 3900X back in 2019. Thank you for clearing that up! I usually got it in Minecraft or in loading screens.
10:35 - 10:50 Can confirm this also happens in tf2. Especially notable on jump maps and other maps that don't center around chaotic fights or actual gameplay.
The Source engine has a mechanism that clamps frame time to a minimum of 0.001 seconds unless you have sv_cheats enabled, which means enabling cheats will *stop* the speed hack from working.
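If that's accurate, the speedup follows directly; here is a sketch of the described behaviour (not verified against engine source):

```python
def engine_dt(real_dt, sv_cheats=False):
    # Frame time floored at 1 ms unless cheats are enabled, as described.
    return real_dt if sv_cheats else max(real_dt, 0.001)

real_dt = 1.0 / 1500  # real frame time at 1500 fps
print(f"sim runs {engine_dt(real_dt) / real_dt:.2f}x real time")    # 1.50x
print(f"with sv_cheats: {engine_dt(real_dt, True) / real_dt:.2f}x") # 1.00x
# The clamp makes each frame simulate a full millisecond even though less
# real time has passed, so the game outruns the wall clock.
```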
Judging by the magic number of 1000, I can speculate with rather high certainty that this is related to some clock values being fetched at millisecond accuracy instead of higher during client-side interpolation, which could well explain the glitchy yet rapid movement. (This is the stable clock; due to a lot of extra complexity introduced by multicore, if you use the higher-accuracy but unstable clock, you might travel "back in time" when fetching from a different core.) I don't think this is fixable on most current computers without some rather non-universal requirements on the system.

Also, at sub-millisecond frame times, system context switching from syscalls and scheduling (usually tens to hundreds of microseconds on current PCs) might start to come into the picture and introduce instabilities in the frametime.

Nice video and a very fun experiment with the now 10+ year old engine (definitely not advised for normies to try at home XD). Please keep on with such content.
This has to do with how Source handles prediction. The client and server basically do the same things and are expected to produce the same result, so when the server updates you as the client, you don't notice it, since you're already where you should be. However, when you give Source a lot of frames, my guess is that somewhere the delta time (the movement calculated between frames to keep you at a specific speed per real second) is getting crushed by just how many frames the engine is processing, making it speed you up more than normal. I'm fairly sure you can find a way to display prediction errors; the console variable "developer 1" is where I'd start. Love your vids! Keep up the good content ^^
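A generic sketch of that prediction idea, with assumed tick rate and speed values (not actual Source code):

```python
TICK_DT = 1.0 / 64.0  # assumed 64-tick server
SPEED = 250.0         # units/s, assumed

def server_tick(pos):
    return pos + SPEED * TICK_DT  # authoritative movement for one tick

def client_predict(pos, frame_dts):
    for dt in frame_dts:          # client integrates every rendered frame
        pos += SPEED * dt
    return pos

server = server_tick(0.0)
# ~24 frames fit into one tick at 1500 real fps; if each frame's dt is
# clamped to 1 ms, the client simulates 24 ms of movement in ~15.6 ms.
client = client_predict(0.0, [0.001] * 24)
print("server:", server, "client:", client, "error:", client - server)
# The mismatch is what the server corrects, showing up as a prediction error.
```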