Treblaine said:
Strazdas said:
So synchronization to the monitor's default frequency of 60 Hz (60 refreshes per second) is giving way to the traditional 30 FPS (every second refresh) tactic.
It is being met by a huge number of stupid fans who think it changes anything.
They try to explain it with a made-up theory of how it works, without knowing squat about how the human eye or brain interprets sight.
Well, it's Capcom... stupidity is demanded of them.
And yes, there's a huge difference between 30 and 60 fps.
Yes, it requires 2x the processing power for no gain. Synchronizing with the monitor is good and all, but it can be done at 30 (35, for example, would be a problem, as would one of those monitors that run at 80 Hz, but those are pretty much extinct now).
As far as "seeing" it goes, if synchronization is done correctly, the only effect is psychological.
And yes, I know there are games like Quake where higher FPS gives you higher jumps; that's BAD PROGRAMMING.
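A minimal sketch of the bug class being described here (the jump velocity and gravity constants are invented, and this is not Quake's actual code; Quake's own quirk reportedly came from millisecond rounding of the frame time, but the root cause is the same): when physics integrates with the per-frame delta, the simulated trajectory changes with framerate.

```cpp
// Illustrative only: explicit Euler integration of a jump, stepped at the
// render framerate. The apex height depends on the step size dt, so the
// same jump behaves differently at different framerates.
#include <cstdio>

// Integrate one jump with Euler steps of size dt and return the apex height.
double jump_apex(double dt) {
    double y = 0.0;        // height, metres
    double v = 5.0;        // initial upward velocity, m/s (arbitrary)
    const double g = -9.8; // gravity, m/s^2
    double apex = 0.0;
    while (v > 0.0 || y > 0.0) {
        y += v * dt;       // position first, then velocity
        v += g * dt;
        if (y > apex) apex = y;
    }
    return apex;
}

int main() {
    // Integration error scales with dt, so the two apexes differ.
    std::printf("apex at  30 fps: %.4f m\n", jump_apex(1.0 / 30.0));
    std::printf("apex at 125 fps: %.4f m\n", jump_apex(1.0 / 125.0));
    // The usual fix is a fixed physics timestep: accumulate real elapsed
    // time and advance the simulation in constant-size chunks, so the
    // render rate no longer feeds into the simulation.
    return 0;
}
```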
Yeah, you're the "expert". Nintendo, Sony, Valve, id Software, John Carmack... they're all just stupid idiots who have been wasting their whole careers, and you're so smart you don't even have to make any games; you'll just claim they're wrong without evidence, experience, or precedent.
It has nothing to do with synchronisation, considering how 30fps games so often dip into the 24-29fps range, which does NOT split evenly across 60 refreshes per second (see the pacing sketch below).
But no, the WHOLE INDUSTRY is wrong and you are right. Apparently.
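To make that pacing point concrete, here is a small sketch (illustrative, not from any engine) of how a v-synced 60 Hz display holds each frame for a whole number of refreshes: 30 fps maps cleanly to 2 refreshes per frame, while 27 fps is forced to mix 2- and 3-refresh holds, which is the judder people see.

```cpp
// Under v-sync, a frame that finishes rendering waits for the next vblank.
// Print how many 60 Hz refreshes pass between consecutive presentations.
#include <cmath>
#include <cstdio>

void show_cadence(double fps, double hz = 60.0) {
    std::printf("%4.0f fps on %.0f Hz:", fps, hz);
    int prev = 0;
    for (int frame = 1; frame <= 10; ++frame) {
        // Index of the vblank at which this frame is displayed; the small
        // epsilon guards against floating-point error on exact multiples.
        int vblank = (int)std::ceil(frame * hz / fps - 1e-9);
        std::printf(" %d", vblank - prev);  // refreshes since the last frame
        prev = vblank;
    }
    std::printf("\n");
}

int main() {
    show_cadence(30.0);  // 2 2 2 2 2 2 2 2 2 2  -> even pacing
    show_cadence(27.0);  // 3 2 2 2 3 2 2 2 2 3  -> uneven pacing, judder
    return 0;
}
```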
I'm not an expert, but I did my research. Nintendo was never known for being "smart" anyway, Valve never claimed that "60 fps is better than 30 fps", and so on. The explanation of why 30 fps is OK given in this article is CLEARLY false, however, so it follows that the guy is either lying or stupid.
You make assumptions about what I do and don't do, and then you blame me for a lack of evidence. Funny.
The lag spike that drops the FPS is a problem for 30 fps locks, unless you program around it like San Andreas did, which had no problems with its 25 fps lock. 30 is more popular, however, due to monitor synchronization. If the synchronization is done properly, the human eye can't see the difference. The difference appears when the game's frame generation does not match the monitor's refresh, and that's why some people claim to "see the difference": what they actually see is the game's updates failing to line up with the monitor's. V-sync is popular for a reason.
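As a concrete example of "synchronization done properly", here is a minimal sketch using GLFW (my choice of API, not the poster's; whether a driver honors a swap interval above 1 varies by platform): on a 60 Hz display, a swap interval of 2 presents one frame every second vblank, giving evenly paced 30 fps with no tearing.

```cpp
// Minimal GLFW loop, assuming a 60 Hz display: swap interval 2 makes the
// driver hold each presented frame for two vblanks, i.e. 60 Hz / 2 = 30 fps.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1280, 720, "30 fps v-sync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);
    glfwSwapInterval(2);  // present on every 2nd vblank

    while (!glfwWindowShouldClose(window)) {
        // ... update and render the frame here ...
        glClear(GL_COLOR_BUFFER_BIT);
        glfwSwapBuffers(window);  // blocks until the 2nd vblank after the last swap
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```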
If you take a game that can run at 60 FPS but lag-spikes down to, say, 50 fps, and lock it at 30 fps, you will not have lag spikes, because you only ever generate 30 frames while the hardware can manage at least 50. Of course there are things like bad end-user equipment, but really that's up to the user to sort out.
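And a sketch of the 30 fps lock itself (illustrative only; the loop length and timings are arbitrary): sleep until each fixed 1/30 s boundary, so the performance headroom absorbs slow frames instead of showing up as stutter.

```cpp
// Software 30 fps cap: each frame sleeps until the next fixed boundary, so
// as long as update+render stays under ~33.3 ms the output cadence never
// wobbles, even if raw throughput would dip from 60 to 50 fps.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    constexpr auto frame_time = std::chrono::microseconds(33333);  // ~1/30 s

    auto next_frame = clock::now() + frame_time;
    for (int frame = 0; frame < 300; ++frame) {  // run for roughly 10 seconds
        // ... update and render here; may take anywhere up to ~33 ms ...

        std::this_thread::sleep_until(next_frame);  // burn the headroom
        next_frame += frame_time;                   // fixed cadence, no drift
    }
    return 0;
}
```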
There are many people who are wrong about many things, and it's no wonder there are many in the gaming industry as well. When you take 3000 lines of code and see that it could easily be shortened to 1000 lines that take half the processing power, but they tell you to "go with what you're given" and then complain about "high system requirements", it becomes really easy to blame the industry for stupidity.