Frankly, the people who claim they can enjoy anything below 120fps at 1440p are just lying to themselves. Anything below that is objectively not fun.
I've heard a number of people involved in graphics say that motion blur at 30fps looks better than no blur at 60fps. Can't say I have enough experience to evaluate it one way or the other, but it does deal with the issue of gaps between objects from one frame to the next, which is a large part of what our brains don't like about low frame rates.

fix-the-spade said:
However, around 50-60fps is where human beings stop perceiving the flicker between images. This is why movies are projected at 72 flashes per second (24fps times 3, to be exact): the shutter flashes each frame three times, because actually projecting just 24 images per second makes films look horrific. Meanwhile most TV is broadcast at 60 fields per second (30x2) or 50 (25x2).
If you watch a film on a projector at an actual 24fps (one flash per frame) you can see the black gaps between images; it's awful. In games, at least, the previous image stays on screen until the new one is rendered, so flicker at 30fps isn't nearly as much of an issue.
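For what it's worth, the multiplication fix-the-spade describes is simple enough to sketch out. Here's a toy Python snippet (the function name is made up, just to make the arithmetic concrete):

```python
# Toy arithmetic for the numbers above: a projector's shutter flashes
# each frame multiple times, so the flash rate the eye sees is
# frame rate x flashes per frame, even though the number of distinct
# images per second stays the same.

def flash_rate(frames_per_second: int, flashes_per_frame: int) -> int:
    """Flashes of light per second reaching the viewer's eye."""
    return frames_per_second * flashes_per_frame

print(flash_rate(24, 3))  # 72 -- 24fps film, 3-bladed shutter
print(flash_rate(30, 2))  # 60 -- 30fps video shown as two fields
print(flash_rate(25, 2))  # 50 -- 25fps PAL, likewise doubled
```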
Don't worry, there's plenty more time for increasing frame rates and resolutions to keep those master race wallets empty.