lithiumvocals said:
So what I'm asking for is a calm and tactful explanation of this. Why is 90' FOV better than, say, 75' FOV? And why do some get sick from a small field of view?
The main deal with small FoV is that you feel like you have to constantly, hectically whip your view around for situational awareness. I can usually handle 70 and up comfortably enough, and in some games I can go as low as 60 and still tolerate it, but it's not *good.*
There are two disadvantages to large FoV:
1. A bigger FoV means fewer on-screen pixels devoted to any particular spot, which can make it harder to make out details, and
2. the linear projection that games use shows increasing distortion as you get closer to 180 degrees (there's a quick sketch with rough numbers on both right after this list).
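To make that concrete, here's a quick back-of-the-envelope Python sketch. The 1920-pixel screen width and the FoV values are just examples I picked, and it assumes a plain rectilinear projection (a point at angle theta off the view axis lands at x = f·tan(theta)), which is the standard game projection being discussed:

```python
import math

# Rough sketch: for a rectilinear projection, a point theta degrees off the
# view axis lands at x = f * tan(theta), with f chosen so the screen edge
# corresponds to half the horizontal FOV. Pixel density is the derivative
# dx/dtheta = f / cos^2(theta).

def pixels_per_degree(screen_width_px, hfov_deg, theta_deg):
    """Roughly how many pixels cover one degree of view at angle theta off-center."""
    half_fov = math.radians(hfov_deg / 2)
    f = (screen_width_px / 2) / math.tan(half_fov)
    theta = math.radians(theta_deg)
    return math.radians(1) * f / math.cos(theta) ** 2

for hfov in (75, 90, 120, 160):
    center = pixels_per_degree(1920, hfov, 0)
    edge = pixels_per_degree(1920, hfov, hfov / 2 - 0.5)
    print(f"{hfov:>3} deg FOV: {center:5.1f} px/deg at center, "
          f"{edge:5.1f} px/deg near the edge (stretched ~{edge / center:.1f}x)")
```

On a 1920-pixel-wide screen that comes out to roughly 22 px/degree at the center of a 75-degree view versus about 10 at 120 degrees (issue 1), while the edge stretching climbs from under 2x to around 30x as you push toward 160 degrees (issue 2).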
However, as someone who found Halo 1's ~110-degree two-player split-screen FoV reasonably tolerable (and even beautiful in its own quirky way) under most circumstances on a 480i display (so roughly 240 lines per 110-degree view), I find these issues can be pushed pretty far before they become significantly problematic. In most cases, pushing up to around 120 degrees in wide aspect ratios seems fairly reasonable to me.
TheKasp said:
Eh, it all has to do with perception and distance to the screen. On consoles a 75' FoV works due to the distance, if it was a window it would be the field of view you had from that distance out of that window. With PCs you have a closer distance to the monitor, a 90' FoV is more natural.
I've never been especially convinced by this argument. Yes, a linear projection will project accurately back to your eyes if you view a flat screen from exactly that position, but in my experience the brain handles massive deviations just fine. I mean, when I go play my sixth-gen games on a small SD TV from an 8-foot viewing distance, I don't think "hey, this would feel more natural if the FoV were only 20 degrees!" And when I'm playing newer games on our 37" HDTV, Halo 3 probably gives a more "accurate" view with its 70-degree FoV, but ODST's view feels a heck of a lot better at ~86 degrees.
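For what it's worth, the "window" math itself is simple: the distortion-free FoV is just the angle the screen subtends from where you sit, 2·atan(width / (2·distance)). Here's a tiny Python sketch; the screen sizes and distances are invented stand-ins for the kinds of setups I mentioned, not measurements:

```python
import math

# The "screen as a window" figure: the FOV that projects back to the eye with
# no distortion is simply the horizontal angle the screen subtends.
# Sizes and distances below are made-up examples, not measured values.

def subtended_hfov_deg(screen_width_in, viewing_distance_in):
    return math.degrees(2 * math.atan((screen_width_in / 2) / viewing_distance_in))

setups = [
    ("small SD TV, 8 feet away", 20.0, 96.0),     # ~25" 4:3 set
    ("37-inch HDTV, 6 feet away", 32.3, 72.0),    # 16:9
    ("24-inch monitor, 2 feet away", 20.9, 24.0), # 16:9
]

for name, width_in, dist_in in setups:
    print(f"{name}: ~{subtended_hfov_deg(width_in, dist_in):.0f} degrees")
```

Run numbers like those and the "geometrically correct" FoV for a couch setup comes out well under 30 degrees, and even a desk monitor only subtends around 45-50, which is exactly why I think the brain's tolerance for mismatch matters more than the projection math.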
[Bonus question: I've also seen people complaining that a lower framerate makes them sick as well. Why is that?]
Sick? I dunno. But higher framerates definitely have benefits.
Now, I'm of the opinion that the "latency" argument is seriously pushing it, as is the argument that you can clearly make out the time gaps between frames.
Here's the real issue: go move your mouse around really quickly on your PC's desktop. Just shake that thing back and forth. Even at 60fps, which many people claim is enough for perceptibly perfect smoothness, those jumps are going to be HUGE. The issue is that smaller jumps are much easier on the brain, since it doesn't have to do crazy amounts of pattern matching and interpolation to figure out what's going on. If you're playing a fast PC shooter at 30fps and you whip the view around, there will be times when almost nothing of the previous frame is still in the next frame. That's tough to deal with: it can feel like you're playing blind, spending substantial fractions of a second basing what you do on whatever you glimpsed the last time you weren't jerking the view around.
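To put a rough number on the "almost nothing of the previous frame" bit, here's a little Python sketch; the 90-degree FoV, the flick speed, and the framerates are all made-up but plausible figures, not measurements from any game:

```python
# Made-up but plausible numbers: a hard 180-degree flick done in about a tenth
# of a second, viewed through a 90-degree horizontal FOV.

HFOV = 90.0                # degrees visible on screen at once
TURN_SPEED = 180.0 / 0.1   # degrees per second during the flick

for fps in (30, 60, 144):
    step = TURN_SPEED / fps                    # view rotation between frames
    overlap = max(0.0, HFOV - step) / HFOV     # fraction of last frame still visible
    print(f"{fps:>3} fps: {step:5.1f} deg between frames, "
          f"{overlap:.0%} of the previous frame still on screen")
```

At 30fps that flick covers about 60 degrees per frame, so only around a third of each frame overlaps the last one (and a bigger swing would leave no overlap at all), while at 144fps the jumps are small enough that you get something much closer to a continuous sweep.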
Now, good motion blur implementations can help fake the real motion blur caused by persistence of vision and give your brain extra information for decoding what's going on, but it's not as good as truly increasing the framerate; it's like comparing FXAA, which basically just blurs edges, to MSAA, which actually takes the subpixel locations of things into account and results in far less obnoxious shimmering on objects in actual gameplay (at the cost of being much more GPU-expensive).
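If it helps to picture what that "extra information" is, here's a toy Python sketch. It uses a brute-force accumulation blur (just averaging several sub-frame renders); real games usually get the same effect more cheaply with velocity buffers, and the strip size, positions, and sample count are all invented for illustration:

```python
import numpy as np

# A bright dot crosses part of a 100-pixel strip during one 30fps frame
# interval. Without blur the next frame only shows where it ended up; with a
# brute-force accumulation blur (averaging sub-frame renders) the path it took
# is smeared across the frame, so the brain gets the in-between information.

WIDTH = 100
START, END = 10, 70   # dot position at the start and end of one frame interval

def render(pos):
    frame = np.zeros(WIDTH)
    frame[int(pos)] = 1.0   # a single bright pixel
    return frame

no_blur = render(END)                                            # final position only
subframes = [render(START + (END - START) * t) for t in np.linspace(0.0, 1.0, 16)]
blurred = np.mean(subframes, axis=0)                             # accumulation blur

print("lit pixels without blur:", np.flatnonzero(no_blur))
print("lit pixels with blur:   ", np.flatnonzero(blurred))
```

Without blur the dot just teleports from pixel 10 to pixel 70 between frames; with the blur, every pixel along the path gets a bit of brightness, which is the cue the brain can use to read it as one continuous motion instead of two unrelated images.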
Slower gameplay can also reduce the need for high FPS, since the gaps in object locations between frames are going to be smaller. This is why console shooters, which already force slower aiming because of the controller, tend to feel fine at a fraction of the framerates that PC shooter players like to play at.
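Same math as before, just with slower turning. The turn speeds below are my own guesses (a stick-capped console turn rate versus a quick mouse sweep), not figures from any particular game:

```python
# Guessed turn speeds, not measured from any game: per-frame angular gap for a
# stick-limited console turn versus a fast mouse sweep.

turn_speeds = {
    "controller, stick pegged": 180.0,   # degrees per second
    "mouse, fast sweep":        900.0,
}

for name, speed in turn_speeds.items():
    for fps in (30, 60):
        print(f"{name} at {fps} fps: {speed / fps:5.1f} degrees between frames")
```

With numbers like those, a pegged stick at 30fps produces smaller frame-to-frame jumps than a fast mouse sweep does at 60, which matches the feel: the console game can read as smooth at 30 while the PC game at the same framerate falls apart the moment you flick.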