Check out the best free tools to customize the FPS of videos: bit.ly/3zeKYuJ
Learn more about 4K 24fps, 30fps and 60fps: bit.ly/3tMWlt2
Music: Cool Vibes - Film Noire by Kevin MacLeod
If you're wondering why 24fps looks so jittery: it's not the framerate itself. It's because 24 doesn't divide evenly into your phone's or computer's screen refresh rate, which is generally 60 frames per second. That's a 1:2.5 ratio, meaning the video will always be ahead or behind by a frame. If it were a 1:2 or 1:3 ratio, like 30fps is, it would be much smoother.
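A minimal sketch of that mismatch, assuming a 60Hz display that simply shows the most recent source frame at each refresh (the names and numbers are illustrative, not from the video):

from collections import Counter

# Map 24fps source frames onto a 60Hz display: at each refresh, show the
# most recent source frame. Some frames get held for 3 refreshes, others
# for only 2, so motion advances in an uneven 3-2-3-2 cadence.
SOURCE_FPS = 24
REFRESH_HZ = 60

def frame_at_refresh(refresh_index: int) -> int:
    """Index of the source frame visible at a given display refresh."""
    return refresh_index * SOURCE_FPS // REFRESH_HZ

holds = Counter(frame_at_refresh(i) for i in range(REFRESH_HZ))
print([holds[f] for f in sorted(holds)])  # -> [3, 2, 3, 2, 3, 2, ...]

With a 30fps source, every frame is held for exactly two refreshes, so the cadence is even and motion looks steadier.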
No. It's because, unlike in cinema, the shutter speed isn't adjusted here. In cinema the shutter speed gives moving objects a blur, so they don't look as choppy as they do here. This animation has no motion blur, so the footage looks choppy from frame to frame because space is skipped between frames.
I find the best demonstrations are of a camera moving through a cityscape: you can clearly see that the framerate has little effect on the look of the slow-moving buildings further away, while lower framerates clearly make the motion of the fast-moving buildings closer to the camera look jittery and unclear.
If you're wondering why a video recorded with a camera at 30 or even 24 fps doesn't look jittery, it's because (if the camera has been set up correctly) it also captures motion blur that "compensates" for the time between the existing frames. This animation doesn't have motion blur, which is fine for a basic demonstration, but it is possible to add motion blur in the animation software. It'll look blurry, of course.
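For reference, the usual camera setting behind that "compensating" blur is the 180-degree shutter guideline: expose each frame for half the frame interval. A tiny sketch of the rule of thumb (the function name is made up for illustration):

# 180-degree shutter rule of thumb: exposure time is half the frame
# interval, so the captured blur spans half the time between frames.
def shutter_speed_180(fps: float) -> float:
    """Exposure time in seconds under the 180-degree shutter rule."""
    return (1.0 / fps) / 2.0

for fps in (24, 30, 60):
    print(f"{fps} fps -> 1/{round(1 / shutter_speed_180(fps))}s shutter")
# 24 fps -> 1/48s, 30 fps -> 1/60s, 60 fps -> 1/120s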
IF it is set correctly. Which is why I've grown to hate cinematic framerates: pan the camera across a beautiful landscape at 24 Hz and it's either a blur or jittery af. Both make me feel a bit dizzy at times, ever since I got used to much smoother motion in video games and on RU-vid. Some things you just can't go back to, just like I can't bear the high-pitched squeal of a CRT TV anymore. I don't know how it didn't bother me as a little kid, but nowadays it makes my ears bleed.
Motion blur in games is absolutely cancerous. I know cartoon animations use a similar system to simulate in-between frames, and that's fine, but motion blur is the one thing I always disable; it's just ugly and overdone.
It should be noted, though, that while motion blur might be okay, it still isn't a complete fix: you're still seeing the same number of pictures, just with more blur.
Post-processing motion blur (and interpolation), as seen in video games and most image-editing software, is very different from "real" motion blur. Rather than containing visual information that otherwise wouldn't be captured between the frames, it's just a smear on the frame itself. It's sort of like the difference between fast approximate anti-aliasing (FXAA) and multisample anti-aliasing (MSAA), if you're familiar: the latter tries to generate more detail for each pixel to make the image more realistic, while the former just smears everything. But at least you don't see the pixels, right?
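A toy illustration of that difference, assuming a 1D strip of pixels and a dot moving 8 pixels per frame (all names and numbers here are made up for the sketch): accumulating sub-frame renders captures the path the dot actually traveled, while a post-process smear only blurs the one snapshot it has.

import numpy as np

WIDTH, SPEED = 32, 8  # strip width in pixels, dot speed in pixels per frame

def render(position: float) -> np.ndarray:
    """Render one instantaneous frame: a single lit pixel."""
    frame = np.zeros(WIDTH)
    frame[int(position) % WIDTH] = 1.0
    return frame

def accumulation_blur(position: float, subsamples: int = 8) -> np.ndarray:
    """'Real' blur: average several renders taken within one frame's time,
    so the result contains every position the dot actually passed through."""
    offsets = np.linspace(0.0, SPEED, subsamples, endpoint=False)
    return np.mean([render(position + o) for o in offsets], axis=0)

def postprocess_blur(position: float, kernel: int = 8) -> np.ndarray:
    """Post-process blur: box-filter the single final frame. No information
    from between the frames, just a smeared snapshot."""
    return np.convolve(render(position), np.ones(kernel) / kernel, mode="same")

print(np.nonzero(accumulation_blur(12.0))[0])  # pixels 12..19: the path traveled
print(np.nonzero(postprocess_blur(12.0))[0])   # pixels 9..16: smears pixels never visited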
Watching 24fps at 50% speed doesn't make any sense, because it's like watching 12fps. Of course that frequency won't create an illusion of fluid motion, but in many cases, 24fps is enough to trick our brains.
Exactly. The stated framerates only hold for five seconds of the video (from 8s to 13s). For the rest of the video, the effective framerates are 30 fps, 15 fps and 12 fps, since the circles update every two, four, and five frames respectively. So fundamentally the video is bullshitting us.
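The arithmetic behind those effective rates (the per-circle update intervals are this commenter's observation, not something verified here):

# If a circle only moves every N frames of a 60fps video, its effective
# framerate is 60/N, whatever the label on screen says.
VIDEO_FPS = 60
for frames_per_update in (2, 4, 5):
    print(f"updates every {frames_per_update} frames -> "
          f"{VIDEO_FPS / frames_per_update:g} fps effective")
# -> 30, 15, and 12 fps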
Yeah. I'll run 144 fps for competitive games since it actually matters, but for anything else I usually cap it at 60 to lower my freaking power bill >_<
@@alimirza4608 How bad it is depends entirely on the type of game. If it's something like XCOM, where everything is turn-based and you never need to press a button at an exact moment, then 30 fps is no problem at all. But if you're playing something like Overwatch or Titanfall 2, where your framerate can be the difference between life and death, 30 is basically unplayable.
@@user-dw6uk9kz9q I have a 165Hz monitor, and it's amazing how stuttery 60 fps actually looks once you go above a 60Hz refresh rate. I can only imagine how bad 60 fps would look on a 240Hz monitor.
@@m2_v9 I don't see why you would think that, but I'm actually also talking about past experiences with games. I've played them running at 24 fps and at 30 fps, and the two feel drastically different.
@@lazyspate With a prerendered animation like this, monitor Hz wouldn't matter, because the final frame rate would be the same on anything from a 30Hz to a 240Hz monitor if you're talking about 30fps. As for what you said about games, yes, 30 to 60 fps is a pretty huge difference, but from my experience playing Zelda: Ocarina of Time (24 fps) and Zelda: Wind Waker (30 fps), the former feels practically unplayable to me, while the latter feels perfectly fine, despite only 6 frames of difference.
It could also be a timing issue. Say your monitor refreshes 60 times a second while the video sends an image 24 times a second. These don't match up: 24 doesn't divide into 60 an even number of times like 30 does. So some frames end up displayed longer than others, making it look considerably worse.
@@dhwanitgohel3804 I think if you're watching at that resolution, you'll get 30fps. RU-vid doesn't support 60fps at any resolution below 720p. If you're watching in 144p, the 60fps part doesn't even make a difference.
Yikes, bro. Not trying to sound entitled or anything, but the upgrade in FPS makes it look mind-bendingly better. 30 fps -> 60 fps is a massive jump, and 60 fps is more than okay, but 144 fps and then 240 fps are buttery smooth. Upgrade to 60 fps at least if you can; if you're spending a lot of time gaming, you owe yourself at least that much.
Well, that would be a comparison where the ball moves a set amount per frame, which is a great example of exactly what *not* to do if you're writing the physics for a game (make sure to multiply all movements by the time since the last frame to get a consistent result across all frame rates; it also makes speedrunners' lives easier by not forcing them to cap leaderboards at 60fps).
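A minimal sketch of that delta-time scaling, with made-up numbers: multiply each step by the frame's elapsed time, and the distance covered per second stops depending on the framerate.

# Frame-rate-independent movement: scale speed by the time elapsed since
# the last frame (delta time) instead of moving a fixed amount per frame.
def distance_after_one_second(fps: int, speed: float = 100.0) -> float:
    dt = 1.0 / fps        # seconds per frame
    position = 0.0
    for _ in range(fps):  # simulate one second of frames
        position += speed * dt   # NOT position += speed
    return position

print(distance_after_one_second(30))   # ~100.0 units
print(distance_after_one_second(144))  # ~100.0 units, same result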
@@gheetza14 Fallout 4 leaderboards cap at 60 because the physics are fps-dependent. I'd say 60 is reasonable, because almost everyone with a mid-range computer can run most modern games at 60fps 1080p, which by today's standards is generally the low end of "high quality". I think capping at 30 is unreasonable, unless you're doing console leaderboards, but consoles have reliably identical hardware and shouldn't need fps caps.
It's not so much that they use motion blur as that motion blur occurs naturally because of the slower shutter speed that usually comes with lower-frame-rate filming.
@@RiamsWorld Exactly. The motion blur occurs naturally at 24p. I guess you'd need to add motion blur if you're animating at 24p, including when adding CGI to movies.
The human eye has its own motion blur, and monitors add their own on top. For movies it works, but when you're controlling a character, three layers of motion blur is not good lmao
I remember hearing back in the day that "the human eye can't even tell the difference past 30fps", and later in life that "you can't even notice a difference from 30 to 60". Then one day I upgraded my 60Hz monitor to a 144Hz ultrawide and was amazed at the clarity. I can't help but hear those words whenever I see direct comparisons nowadays, with everyone trying to get 180+ fps in our favourite games.
@@zeroclan3816 No, no, you've gotta format it like a meme if the kids are gonna get it!
Grammar: They're vs. There vs. Their
This guy, who failed remedial English: Thayre the same thing.
You can easily increase your framerate just by changing your graphics card and maybe your processor. There are really cheap ones, like AMD's, where both together would cost you about $200 to $300 to reach 60 FPS. Reaching 60 FPS (or close) for most of the game on high settings (that part is important) is what you should aim for; it's clearly enough. As time passes, you'll step the quality of your games down from high, to medium, and finally to low. And when you're at about 40 FPS even on low, that's when you know you have to change your setup (or, if possible, only some parts). I've always done that and never spent more than about $100 per year on average, as I change components in my PC about every 4 years.
@@rigierish3807 Thanks, that helped a lot. I've been thinking of getting a better computer, as a friend I know said he'd help me build one since he knows how. I'll definitely keep this in mind tho.
@@NappaThaClappa You're welcome. So try to aim for a computer that can run the latest games at 60 FPS on ultra; later, you can just lower your games' graphics settings and still have 60 FPS (I think that's the most important thing, as the difference between low and ultra settings isn't that big). And if you change your monitor, I just want to tell you that a 720p screen at 60 FPS (I'm talking about what your computer has to render) is still better than a 1080p screen at 30 FPS, as 1080p has about twice as many pixels as 720p. I mention something as low as 720p because you said you're used to 30 FPS, so I'm guessing you're used to a low-end setup in general and probably won't put a thousand dollars or more into your PC. But it's up to you. (Btw, sorry for my bad English; I'm not a native speaker.)
Wow, this was so informative! I never realized the difference between framerates until now. I always thought higher FPS just meant better quality. Great job explaining it in a simple way!
After watching, I deliberately set the video to the 30fps option instead of the 60fps one. To my surprise, the top dot was still smooth and the middle one still jumpy, but since the video was playing at 30fps, they should both have looked exactly the same. Now, I'm not saying this is fake or exaggerated, but it is definitely suspicious.
That's why movies, which mostly run at 24 FPS, use special interpolation techniques to make them watchable, and why video games that are limited to 30 and get their fair share of lag try to copy that by using motion blur (which is gross if it's not fine-tuned for each scene). The only real solution is optimizing games better so they have a smoother framerate.
It's not usually interpolation. Interpolation usually creates a "soap opera effect" that gives a lot of people motion sickness. It's usually simple frame duplication, where certain frames in a pattern are doubled. That's much better, and it looks infinitely more cinematic than the almost inhuman movements you see with interpolation.
Great video. 24 fps looks like someone lagging, 30 fps looks blurry, and 60 is perfect. Edit: a war has started in the replies, so I'm happy to tell you guys I'll be getting a new phone with 120Hz :D
@@luciano440 Man... I'm sorry, I don't wanna be that annoying teacher, but it's:
*Nope
*seriously
*standard
Hope that will help you in the future :)
@@syahrizkyathaullahanandisa9814 Bruh, I've got a 144Hz monitor and I think 60 looks like shit, but I do think games are playable at 60 FPS, even if I prefer 144 FPS.
Honestly, he did a great job with the thumbnail luring viewers in. It gives the wrong idea, but when you click the video, the information is accurate.