Poll: 60fps vs 30fps? opinions?

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
Well, since I'm stuffing my face at the moment, let's make it a food analogy:
One of them is a McDonald's burger and the other a properly cooked burger. Sure, they both do the same thing, but there's a big difference in how they taste. And a lot of people won't know it, possibly won't want to know it, as they're stuck with only one option.

Having no other option I will also squint a little and get through 30FPS games, but it will always be an eyesore (quite literally at times).
 

maxtiggertom

New member
Apr 13, 2015
41
0
0
Seraj33 said:
Pfft, 60 fps.. how about 120. Now we are talking.
I know that some, if not most, people cannot see the difference in fps. But for me it is like night and day. Playing CS:GO at anything lower than 120 gets painful for me.
Games that are limited to 30 fps are perceived by me as choppy, and unless the graphical design works well with it, it can get on my nerves while playing (not as much of a problem when just watching, however). Even in YouTube videos, 30 vs 60 fps is a clear difference to me.

A similar problem I have is when they limit the FOV to less than 90. When I first played Skyrim, I got nauseated because it felt like I had tunnel vision.
Personally, the number of people who have experienced 120fps is extremely low, especially in the grand scheme of things. The only reason I could see someone investing in a 120Hz monitor is for CS:GO and/or StarCraft 2. Just my opinion tho :D
 

maxtiggertom

New member
Apr 13, 2015
41
0
0
thewatergamer said:
Call me a "PC master racer" or "entitled", but I refuse to play games that are locked to 30fps in 95% of cases. On PC there is zero excuse for it at this point, and frankly, I think console players should expect better from their machines. If you don't care about 60fps, fine, but I get sick of people trying to push 30fps as just as good or somehow even better. It's not.
I agree with you completely, mate. Let's look at Sleeping Dogs DE, for example: 1080p at 30fps on the "next-gen" consoles. What the flying hell were they thinking? They should easily be able to keep a constant 60fps. No excuses there.
 

cikame

New member
Jun 11, 2008
585
0
0
60fps should take priority; it makes all games better. I would gladly take sacrificed graphics for 60fps in all games.
I understand that in the past you couldn't render a 3D world at 60fps. I'm thinking of the PlayStation 1 era, with games like Syphon Filter and Tomb Raider, where the render distance would have been so short it would have been a huge negative. But we're well past that. Every single game in the last generation was 30fps until COD4 came along and took the world by storm; you just couldn't get that fast-rendered, fast-paced style of multiplayer shooter on the consoles before. Of course, PC users were used to games like Quake and its sequels, and Half-Life and its hugely popular mods like Counter-Strike. COD4 was the first exposure for huge numbers of people on the consoles, so now we can easily see the difference when developers opt for 30.

There are people who claim not to notice, and that's fine, but this poll is currently 94% in favor of 60, so don't pretend it's not what people want.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
maxtiggertom said:
Antigonius said:
I play games only on 200 FPS or don't play them at all.
Yes, even if I have to turn on the PS part myself
1) 200 fps is not even remotely possible yet on any platform; lord knows when the consoles will even get a constant 60fps in every game.
2) Just stop with the trolling, mate. If you are going to screw around on this thread, then go; if you are actually going to provide your insight into the matter, that's fine as well. If this is your idea of a joke, I'm sorry, but I don't get the point or the punchline.
1) I'm pretty sure he's joking/making fun of the elitist bastards. Ignore him or laugh along. Be careful when accusing someone of living under a bridge. Mods don't take kindly to that around here.

2) Games will never all run at 60fps, because developers often have to choose between fancy graphics and performance. Many see 30fps as an acceptable compromise, and seeing as these games sell just fine, they have no reason to change their minds. One notable exception would be Nintendo. Did you know most Nintendo games run at 60fps on Wii U? Barely anyone notices, because barely anyone cares. So... yeah.

3) Like I mentioned before: the average gaming PC isn't actually much more powerful than the average modern console. Of course this is no excuse for crappy ports, but even if games are ported properly, large groups of players will still have to make compromises to get the games to run well on their systems.

And as a bonus:

4) It's kind of surprising how big of a deal fps is nowadays. While it's true that 2D console games were shown at 50 or 60Hz, they weren't actually animated at that rate; that 50 or 60Hz is just the rate your TV/monitor refreshed at. For instance, a simple 1-second walking animation might actually have only a handful of frames. High refresh rate or not, movements would still look choppy if there were only a few frames of actual animation. The refresh rate only caused problems when the animations were tied to it. Many old PAL games actually run slightly slower than the NTSC originals, because players were still getting the same number of images, but at a slower pace (50 images per second instead of 60). As a result, games like PAL Sonic play a little slower than the original.
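The PAL slowdown is easy to put a number on. Here's a quick sketch (my own illustration, assuming game logic advances exactly one step per display refresh, which many 8/16-bit games did):

```python
# If a game steps its logic once per display refresh, the same number of
# logic steps takes longer on a 50Hz PAL display than on a 60Hz NTSC one.

def relative_speed(pal_hz: float = 50.0, ntsc_hz: float = 60.0) -> float:
    """Fraction of the intended (NTSC) game speed a PAL player gets."""
    return pal_hz / ntsc_hz

print(round(relative_speed(), 3))  # 0.833 -> PAL runs at ~83% speed
```

So a PAL player saw the game roughly 17% slower than intended, without a single frame of animation being dropped.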

The first 3D games had fairly low frame rates as well, and that goes for both consoles and PCs. IIRC, 3D Monster Maze went as low as 6fps. People were fine with this, by the way. And I recall some 3D SNES games being in the 15-20 range. It's not something people complained about, because animations were often choppy anyway. It was only annoying when there was a performance drop (like a game becoming all stuttery during a Mode 7 segment) and you could see the game slow down.

The olden days were not the 60fps paradise some people imagine them to be. Not on console, and not on PC either. Actually, high framerates could be a major headache, as it was not uncommon for developers to tie the game clock to the frame rate. As a result, games sometimes sped up to near-unplayable levels when played on hardware much more powerful than whatever the developers envisioned. Usually, framerates were capped to keep that from happening. I think Doom was capped at 30 or 35?
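The speed-up problem is easy to demonstrate. A minimal sketch (hypothetical numbers, not any actual game's code) of a frame-tied update versus the modern delta-time approach:

```python
# Hypothetical numbers: a character that should move 100 units per second,
# in a game the developer tuned assuming a 35fps cap.
TARGET_SPEED = 100.0
TUNED_FPS = 35.0

def frame_tied_distance(actual_fps: float, seconds: float) -> float:
    """Old-style update: move a fixed amount every frame. Correct only at
    the tuned frame rate; faster hardware literally speeds the game up."""
    units_per_frame = TARGET_SPEED / TUNED_FPS
    return units_per_frame * actual_fps * seconds

def delta_time_distance(actual_fps: float, seconds: float) -> float:
    """Modern update: scale each step by the frame's duration, so speed
    is independent of frame rate."""
    dt = 1.0 / actual_fps
    total = 0.0
    for _ in range(int(actual_fps * seconds)):
        total += TARGET_SPEED * dt
    return total

print(frame_tied_distance(35, 1))  # ~100: correct at the tuned rate
print(frame_tied_distance(70, 1))  # ~200: twice as fast on faster hardware
print(delta_time_distance(70, 1))  # ~100: still correct at any frame rate
```

Capping the framerate, as Doom did, is the other way out: if the hardware is never allowed to run faster than the tuned rate, the frame-tied math stays correct.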

And I mean, sure, I totally get that smooth animations are awesome and we want to see developers make the most of those impressive 3D models and the subtle, realistic movements they're capable of. And yes, I'll readily admit smooth, detailed animations are important in action games where every movement counts, but some of the reactions to 30fps are just really, really silly. I imagine if those people ever discovered cartoons, they'd start worshipping classic Disney animation (as it was animated 'on the ones' - 24 images per second) and then, after they're done praying to the animation gods, build a time machine to burn the Hanna-Barbera studios to the ground.
 

Ravenbom

New member
Oct 24, 2008
355
0
0
While I voted 60 that's mainly because as a PC player I'm used to not getting rock-solid frame rates. I look at it from the perspective of the larger the buffer zone for frame rate dips, the better.

That being said, I do hate locked frame rates as well.


One thing that bugs me about the PC Master Race crowd, though, is the lack of understanding that monitor > frame rate.

I didn't notice this until last gen, when I bought my second flat-screen TV. It was 120Hz, and suddenly all my PS3 and 360 games looked better than even my PC games.
Buying that TV was a big dilemma because I didn't like the look for movies and TV, but it won me over with gaming performance.

Unfortunately, I think most newer monitors of the last few years are concentrating on curved and/or 4K over eliminating motion blur, and you have to look into each manufacturer's solution because each is slightly different.
LED and OLED pretty much eliminate motion blur if you're not interested in digging into individual manufacturers.

Most people seeing a 30fps game on an LED screen vs a 60fps game on an LCD monitor will think the 30fps LED is playing at a higher frame rate. Removing motion blur will in most cases make a locked frame rate actually look better than one that isn't.

For instance, this is a neat trick that can be used at trade shows to make early builds look better than they would otherwise appear, and it's an important consumer distinction that very few people seem to care about, instead just quoting numbers.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Antigonius said:
And honestly - this whole fps debate is literally elitists, arguing about what is better.
I don't think anybody is arguing about what is better. The "debate", or whatever it is, seems to be people on one side saying that 60 FPS is better, while on the other we have people who say they don't care. I've not seen anybody claim 30 FPS is better, unless they tack qualifiers onto that statement.

The reason being that there is no debate. I'm pretty sure most people know - some just don't care.
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
I used to not really care, but after upgrading my PC I will admit that I have become accustomed to playing at 50-60 fps. Playing at 30 fps is still fine, but I now perceive it to be noticeably less clear and fluid in motion. It is reminiscent of replaying an older game that had bleeding-edge graphics for its time: the "wow factor" disappears and you start to notice all the imperfections more easily.

Xbox One and PS4 players will probably see more games running around 60 fps at 1080p resolution once DirectX 12 becomes the new norm for developers. Both Xbox One and PS4 use AMD GPUs (rough equivalents: XB1 - HD 7790, PS4 - HD 7870), which will likely see a significant boost in performance, as AMD's GCN architecture will be able to leverage more graphical processing power under the new API.

It is way too early to tell, as we have only one gameplay benchmark for a DirectX 12 title (Ashes of the Singularity), but hopefully console players will see a similar increase in FPS between DX11 and DX12 games. For the Radeon 290X, we saw FPS literally double. The older GPUs in the consoles probably won't see such a dramatic improvement, as they use an earlier version of GCN, but a 50 to 75% increase in FPS is quite possible.
http://arstechnica.com/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/
 

LordLundar

New member
Apr 6, 2004
962
0
0
I posted this in another thread but it seems to be as applicable here:

LordLundar said:
60 FPS has mattered I would say since gaming started. Even the original Pong was running at 60 FPS. Now, the question is not when it became vital but why, and that depends on the genre. For something like a JRPG, where thought matters more than reflexes, 30 FPS has less of an impact than it does in, say, a fighting game.

But that's not what we're talking about here, now is it? No, we're not talking about when the number became important, but when the argument became important. In that respect it's a lot easier to track down, and it boils down to companies focusing so much on graphical fidelity that they realized they couldn't pull off 60 FPS. So rather than tone the graphics down to hit the numbers, they hit up the marketing team to come up with an excuse for why they couldn't pull it off. Thus did the "30 FPS is better" argument start.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Antigonius said:
This guy gets it. You're awesome, NPC009 :D
To be fair, there's a bit more to it than just animation. Input lag may be an issue for action-oriented games: a higher frame rate means a smaller delay. However, this is only relevant to games where reaction speed matters (and the player is good enough to notice a difference), such as fighting and rhythm games, first-person shooters and perhaps even RTS games that are played competitively. It's of little importance to a turn-based strategy game or a point & click adventure.
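As a rough worked example (my own numbers, ignoring display latency and the game's input pipeline, which add more delay on top), the frame time alone sets the granularity at which the picture can react to an input:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames: the smallest step at which the
    on-screen picture can possibly respond to a button press."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(fps, round(frame_time_ms(fps), 1))
# 30fps -> 33.3 ms per frame, 60fps -> 16.7 ms, 120fps -> 8.3 ms
```

Going from 30 to 60fps halves that window; going to 120 halves it again, which is why competitive players chase high frame rates even on 60Hz displays.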
 

cleric of the order

New member
Sep 13, 2010
546
0
0
30 fps makes my eyes bleed.
60 is definitively better, but why not go further?
60fps and 1080p today
200fps and 4000k resolution tomorrow
 

Emanuele Ciriachi

New member
Jun 6, 2013
208
0
0
PCMasterRacer here. Very few game types (e.g., strategy, slower-paced RPGs) can justify running at less than 60 FPS.
If I cannot get at least 60, I will decrease the detail level; fluidity trumps all.

This Christmas I gifted myself a 144Hz monitor... and replaying old titles at 144 FPS is just... well, glorious.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
LordLundar said:
I posted this in another thread but it seems to be as applicable here:

LordLundar said:
60 FPS has mattered I would say since gaming started. Even the original Pong was running at 60 FPS.
But wasn't that the refresh rate of the monitor rather than the frame rate being put out by the system? And even if it was the actual framerate, we're still just talking about the number of images per second, not the number of frames making up a movement. If an actual 1-second animation is, I don't know, 12 frames (I have no idea how many pixels the bats and ball can move in a single frame), and it's shown at 60fps, all that's really happening is that this 12-frame bit of animation is stretched out to fill 60 frames. The first frame is shown five times, the second one too, and so on. The animation is still choppy; at best there will be more of a blur, because the distinct animation frames are closer together.
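That stretching is easy to simulate. A small sketch (made-up numbers, assuming a 12-frame animation that loops once per second) of what a 60Hz display actually shows:

```python
def visible_frame(t: float, anim_fps: int = 12, frame_count: int = 12) -> int:
    """Index of the animation frame on screen at time t (in seconds)."""
    return int(t * anim_fps) % frame_count

# Sample the animation at each of the 60 display refreshes in one second:
shown = [visible_frame(i / 60.0) for i in range(60)]
print(shown[:12])  # [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2]
# Each of the 12 animation frames is simply held for 5 refreshes in a row,
# so the "60fps" display still only delivers 12 distinct images of motion.
```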

Again, I don't know how much this applies to Pong, as the original is a little before my time. And I'm not sure these ancient games would be the best example, as they were often developed together with the hardware they were supposed to run on. I kind of doubt the aim was to have games run at 60fps; people just wanted the games to be playable. (Space Invaders is a fun example. The reason the aliens speed up as they shrink in number is that the hardware couldn't really handle large groups of them. Instead of redesigning the game around the hardware limitation, the creator kept the decreasing slowdown in as a gameplay mechanic.)
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
cleric of the order said:
30 fps makes my eyes bleed.
60 is definitively better, but why not go further?
60fps and 1080p today
200fps and 4000k resolution tomorrow
60FPS at 4K UHD will be the next reasonably attainable standard for PC gaming within a couple of years. The prices of 4K monitors and TVs are starting to drop below $1000, and the next generation of GPUs from Nvidia and AMD, due in Q2-Q3 2016, will likely be able to produce 60fps at 4K resolution thanks to the drop from a 28nm node to a 16nm node (a lot more transistors). This will definitely be the standard for the PS5 and the next-gen Xbox as well.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
It's simple: 60fps is objectively better than 30fps. That doesn't mean a game running at 30fps is unplayable; it can still look and play excellently.

That said, it is kind of a shame that games still have to run at 30fps at all thanks to the underpowered hardware in the consoles. The gap between PC and consoles is likely to widen even further as early as this year, as both Nvidia and AMD have new GPUs coming on a new die size, which will mean even better performance. I imagine before this console cycle is over the PC standard will be 1440p gaming at 60fps+ (G-Sync and FreeSync are likely to become more common and popular as prices drop too).
 

cleric of the order

New member
Sep 13, 2010
546
0
0
Higgs303 said:
60FPS at 4K UHD will be the next reasonably attainable standard for PC gaming within a couple of years. The prices of 4K monitors and TVs are starting to drop below $1000, and the next generation of GPUs from Nvidia and AMD, due in Q2-Q3 2016, will likely be able to produce 60fps at 4K resolution thanks to the drop from a 28nm node to a 16nm node (a lot more transistors). This will definitely be the standard for the PS5 and the next-gen Xbox as well.
This makes me immensely happy
 

cleric of the order

New member
Sep 13, 2010
546
0
0
Higgs303 said:
60FPS at 4K UHD will be the next reasonably attainable standard for PC gaming within a couple of years. The prices of 4K monitors and TVs are starting to drop below $1000, and the next generation of GPUs from Nvidia and AMD, due in Q2-Q3 2016, will likely be able to produce 60fps at 4K resolution thanks to the drop from a 28nm node to a 16nm node (a lot more transistors). This will definitely be the standard for the PS5 and the next-gen Xbox as well.

[edit]I really dislike double posting[/edit]
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Zipa said:
It's simple: 60fps is objectively better than 30fps. That doesn't mean a game running at 30fps is unplayable; it can still look and play excellently.

That said, it is kind of a shame that games still have to run at 30fps at all thanks to the underpowered hardware in the consoles. The gap between PC and consoles is likely to widen even further as early as this year, as both Nvidia and AMD have new GPUs coming on a new die size, which will mean even better performance. I imagine before this console cycle is over the PC standard will be 1440p gaming at 60fps+ (G-Sync and FreeSync are likely to become more common and popular as prices drop too).
Keep dreaming. It's not just consoles pulling the PC elite down; it's other PC gamers as well :D

Look at what Steam users are actually [http://store.steampowered.com/hwsurvey/processormfg/] using [http://store.steampowered.com/hwsurvey/videocard/]. The Intel HD Graphics 4000 is the most common video card, and there are mid-range CPUs everywhere!

Few developers are crazy enough to develop only for top-range systems. But it would be nice if developers made a habit of putting in a frame rate slider, so users can cap theirs at whatever they feel like (and enjoy the wonders of Hearthstone or Minecraft at 144fps).
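A user-set cap like that is basically just a sleep at the end of each loop iteration. A minimal sketch (my own illustration, not any engine's actual code):

```python
import time

def capped_loop(cap_fps: float, num_frames: int) -> float:
    """Run num_frames loop iterations, sleeping off any leftover frame
    time so the loop never exceeds cap_fps. Returns measured average fps."""
    frame_budget = 1.0 / cap_fps
    start = time.perf_counter()
    for _ in range(num_frames):
        frame_start = time.perf_counter()
        # ... game update and render would happen here ...
        leftover = frame_budget - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
    return num_frames / (time.perf_counter() - start)

# capped_loop(60, 120) should come out at or a little under 60fps,
# since sleep() tends to overshoot slightly on most systems.
```

Real engines usually combine this with vsync or a busy-wait for the last millisecond, since plain sleep() is too imprecise for a rock-solid cap, but the idea is the same.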