1 month ago
25 vs 30 FPS dates back to the introduction of the AC power grid: in the USA the grid runs at 60 Hz, while in Europe it runs at 50 Hz.
Later on, when television came along, the USA got NTSC at 30 FPS, and Europe later got PAL at 25 FPS (I won't go into the other standards here).
For movies it was determined that, starting at around 20 FPS, humans usually perceive a sequence of images as one continuous motion, which is why the Gamebuino Classic uses that as its default.
That being said, movies run at 24 FPS to give four extra frames of buffer in case something goes wrong.
Now, will anybody really notice the difference between 25 FPS and 30 FPS content? There are probably a few people who say they can, but for the vast majority there is no noticeable difference.
The main reason many people insist on 30/60/120 FPS as opposed to 25/50/100 is that, well, the USA being the USA pushed its standard down the computer industry's throat.
As a fun fact: digital television, at least here in Germany, still runs at 25/50 FPS.
Using 25/50 FPS gives another advantage: the number of milliseconds a frame takes is a whole number: 1000/25 = 40, as opposed to 1000/30 = 33.333... So with whole-millisecond timing you would actually be running slightly above or below 30 FPS anyway, depending on whether you wait 33 or 34 ms per frame.
Running at 25 FPS also gives the CPU more time per frame for other calculations.
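To illustrate the arithmetic above, here's a quick sketch of the frame budgets involved. The function name is just for illustration, not any real Gamebuino API:

```python
def frame_budget_ms(fps):
    """Milliseconds available per frame at a given frame rate."""
    return 1000 / fps

# 25/50 FPS divide 1000 ms evenly; 30 FPS does not.
for fps in (20, 24, 25, 30, 50, 60):
    print(f"{fps:3d} FPS -> {frame_budget_ms(fps):.3f} ms per frame")

# With a millisecond-resolution timer you can only wait whole milliseconds,
# so a "30 FPS" loop effectively runs at one of these instead:
print(1000 / 33)  # effective FPS when waiting 33 ms per frame (above 30)
print(1000 / 34)  # effective FPS when waiting 34 ms per frame (below 30)
```

At 25 FPS the budget is exactly 40 ms, so a millisecond timer hits the target frame rate dead on.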
1 month ago
Yeah, most folks can't tell the difference between 25 and 30 FPS without seeing them side by side with the difference pointed out. Even 30 vs. something like 60 really needs a side-by-side comparison unless one has quite good eyesight. In my experience, most of the improved quality of higher frame rates comes from the panels used on those monitors being better to begin with.
And indeed, the more CPU cycles there are between frame updates, the fancier the effects it'll be possible to do.
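As a rough back-of-the-envelope for that: at a fixed CPU clock, a lower frame rate leaves more cycles per frame for game logic and effects. The 16 MHz figure below is the clock of the Gamebuino Classic's ATmega328; treat the numbers as ballpark only, since real loops also spend cycles on display updates and interrupts:

```python
CLOCK_HZ = 16_000_000  # ATmega328 clock on the Gamebuino Classic

def cycles_per_frame(fps, clock_hz=CLOCK_HZ):
    """Approximate CPU cycles available between frame updates."""
    return clock_hz // fps

for fps in (20, 25, 30):
    print(f"{fps} FPS -> {cycles_per_frame(fps):,} cycles per frame")
```

Dropping from 30 to 25 FPS buys you roughly 100,000 extra cycles every frame on that hardware.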