Poll: 60fps vs 30fps? opinions?

TotalerKrieger

New member
Nov 12, 2011
376
0
0
NPC009 said:
Zipa said:
It's simple: 60fps is objectively better than 30fps, but that doesn't mean a game running at 30fps is unplayable; it can still look and play excellently.

That said, it is kind of a shame that games still have to run at 30fps at all thanks to the underpowered hardware in the consoles. The gap between PC and consoles is likely to widen even further as early as this year, as both Nvidia and AMD have new GPUs coming on a new die size, which will mean even better performance. I imagine that before this console cycle is over, the PC standard will be 1440p gaming at 60fps+ (G-Sync and FreeSync are likely to become more common and popular as prices drop, too).
Keep dreaming. It's not just consoles pulling the PC elite down, it's other PC gamers as well :D

Look at what Steam users are actually [http://store.steampowered.com/hwsurvey/processormfg/] using [http://store.steampowered.com/hwsurvey/videocard/]. The Intel Graphics HD 4000 is the most common videocard, mid-range CPUs everywhere!

Few developers are crazy enough to develop for only top-range systems. But it would be nice if developers made a habit of putting in a frame rate slider so users can cap theirs at whatever they feel like (and enjoy the wonders of Hearthstone or Minecraft at 144fps).
Yeah, but most AAA games that were primarily developed for the console market typically include many post-processing effects and other bells and whistles in the PC version that cause even high-end systems to struggle to some extent. Titles like The Witcher 3 and even GTA 5 require a fairly pricey rig to run at 1080p/60fps with all settings maxed out. If you consider playing at higher resolutions like 1440p and 4K, then even very high-end systems can be worked to their max. There is no single graphics card on the market right now that can run the most recent titles at 4K/60fps. IMO, the PC hardware of today isn't being held back at all, because consumer-grade hardware has not improved all that much in the past three years.

The notion that consoles hold back PC games is usually false. PC gamers simply tend to overestimate the capabilities of their hardware. The most recent fiasco was the graphical downgrade of The Witcher 3 from what was shown in the 2013 E3 demo. Naturally, many blame the limitations of consoles for this downgrade, but fail to consider that most high-end systems have difficulty running the final product at 1080p/60fps with maxed-out settings. You would likely need some sort of ridiculous $2000+ rig to have any hope of running The Witcher 3 with the graphical fidelity shown at E3 2013. Why the devs decided to cut these settings from the final product is less clear, but it is a bit of a moot point given that the vast majority of "hardcore" PC gamers would not be able to use them until 2-3 years after release.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
NPC009 said:
The Intel Graphics HD 4000 is the most common videocard
Yet, at the same time, Intel cards are by far the least common, aside from "Other".

DIRECTX 11 GPUS totals by manufacturer said:
AMD: 19.42%
Intel: 15.85%
NVIDIA: 49.31%
Other: 15.43%
The fact that it's the most common single card isn't actually that much of a factor.
 

RedDeadFred

Illusions, Michael!
May 13, 2009
4,896
0
0
60 is obviously better, but I am perfectly capable of playing and thoroughly enjoying a game at 30.

As far as I am concerned, there are much greater sins that consoles have caused devs to inflict upon PC games (shitty FOV, weird menus, and the mothercunting mouse acceleration).
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Higgs303 said:
Good points. My laptop is pretty pathetic, but looking at newer ones in the same price range today, I noticed there hasn't been much progress. And that's just systems within my budget; I have very little idea of what's available beyond what I'm willing to pay for a PC. Judging by what you just said, I'm guessing I'm not actually missing all that much by sticking to consoles and a low-end PC.

PC gamers have one reason to complain about consoles, though: consoles are very popular and many triple A developers develop for PS4/Xbox One first and foremost. Ports are usually decent, but some... Wait, would that make console gamers the master race?

DoPo said:
NPC009 said:
The Intel Graphics HD 4000 is the most common videocard
Yet, at the same time, Intel cards are by far the least common, aside from "Other".

DIRECTX 11 GPUS totals by manufacturer said:
AMD: 19.42%
Intel: 15.85%
NVIDIA: 49.31%
Other: 15.43%
The fact that it's the most common single card isn't actually that much of a factor.
Plenty of other videocards that are comparable in power, though. No matter how you look at it, mediocrity is what's normal :)
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
NPC009 said:
Higgs303 said:
Good points. My laptop is pretty pathetic, but looking at newer ones in the same price range today, I noticed there hasn't been much progress. And that's just systems within my budget; I have very little idea of what's available beyond what I'm willing to pay for a PC. Judging by what you just said, I'm guessing I'm not actually missing all that much by sticking to consoles and a low-end PC.

PC gamers have one reason to complain about consoles, though: consoles are very popular and many triple A developers develop for PS4/Xbox One first and foremost. Ports are usually decent, but some... Wait, would that make console gamers the master race?
At present, I would say there is a clear difference in performance and graphical fidelity, but overall it is not a massive increase in quality (more like medium-high settings vs. ultra). The real question is whether PC gaming is worth the difference in price, which depends on individual preferences. Consoles and gaming PCs each have unique, undeniable advantages and disadvantages. I enjoy tinkering with PC components and fixing any problems that may arise, and I find it satisfying to acquire that kind of technical knowledge. However, I freely admit this can be perceived as a legitimate downside to PC gaming by others who want a more plug-and-play experience. Each to their own, I say.

Yep, console gamers are clearly seen as the "master race" in the eyes of the executives who run the major publishing companies. Money talks and consoles hold a far larger share of the market.
 

Amir Kondori

New member
Apr 11, 2013
932
0
0
I thought I was going to be shaking my head looking at the poll results but surprise surprise, people know what is shit and what is shinola.

I would say that in any kind of first-person game you really need to prioritize frame rate. In a 2D top-down game like Nuclear Throne, 30fps is no big deal, and in fact necessary given how the engine they use ties its timing to the frame rate.

Basically, for most big modern releases, frame rate should be prioritized. Even if it isn't 60fps, they should at least make sure it never dips below 30, which games like Fallout 4 and The Witcher 3 do on occasion.
 

Dizchu

...brutal
Sep 23, 2014
1,277
0
0
pookie101 said:
It's weird for me. I've had a lot of people, quite a few on this forum, try to point things out and show gifs, videos, etc. to show the difference, but my eyes or brain simply can't see a difference between 30 and 60.
Honestly, it's more about how it feels than how it looks. Most live-action films are 24FPS and many animated films are often half that, but they still look decently fluid. A game is much different, though, because we're directly influencing what's on screen. Imagine if your actual vision was locked to 30FPS; it'd be completely disorienting.

Many games locked at 30FPS these days use tricks like motion blur to smooth it out (much like frames in a film blur when there's movement), but because games are interactive it still feels sluggish. It's not the most noticeable thing in the world; games at 30FPS are hardly unplayable. But there's more of a disconnect going on, because what's happening on screen runs at a low framerate while what the player is doing isn't locked to "frames" at all. So basically, the more frames there are per second, the less of a barrier there is between the player and the game.

I mean, everything above 12FPS will probably look like fluid movement, and everything above 30FPS may even be negligible. But there's a difference between how it looks and how it feels to play.
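The "feel" argument above is really a frame-time argument, and the arithmetic is easy to sketch. A minimal illustration in Python (frame_time_ms is a hypothetical helper, not from any engine):

```python
# Frame duration at common frame rates. The longer each frame lasts, the
# longer an input can sit unsampled before the game reacts to it, which
# is why 30fps "feels" slower than it looks.

def frame_time_ms(fps: float) -> float:
    """Duration of a single frame, in milliseconds."""
    return 1000.0 / fps

for fps in (24, 30, 60, 144):
    ft = frame_time_ms(fps)
    print(f"{fps:>3} fps -> {ft:5.1f} ms per frame "
          f"(up to ~{ft:.1f} ms extra input delay)")
```

Going from 30 to 60fps halves the frame time from roughly 33.3 ms to 16.7 ms, which is the responsiveness difference several posters in this thread describe.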
 

laggyteabag

Scrolling through forums, instead of playing games
Legacy
Oct 25, 2009
3,376
1,077
118
UK
Gender
He/Him
99% of the time, 60 is objectively better than 30, because it just makes the game so much smoother. That being said, after about 10 minutes, whether it is 30 or 60, as long as it is stable, I stop paying attention to it.
 

someguy1231

New member
Apr 3, 2015
256
0
0
For years, I had a sub-par computer. Most of the games I played on it just barely met the minimum requirements. I always thought the "30 vs 60" debate was overrated and insignificant... until I upgraded my computer.

After playing numerous games at 60 FPS that wouldn't have even run on my old computer, the difference is astonishing. It's less the look and more how the games feel. The increase in responsiveness is incredible. Once you've tried 60 FPS, you'll never be satisfied with 30 again.
 

AT God

New member
Dec 24, 2008
564
0
0
I've been exclusively a PC gamer for about a decade and even though I think it is sad that consoles aren't 60 FPS, it isn't what kept me away from them. I maintain that while 60 FPS is clearly smoother than 30 FPS, as well as having the benefit of being twice as fast when it comes to inputs and feedback, I don't adamantly hold that 30 FPS is insufficient by any means.

It depends entirely on the game in question. I think no 3D game made today should ever have its frame rate tied to the game's logic. That is a cheap cop-out for developers who don't want to properly bug-fix their games and would rather give some people a less enjoyable experience than spend the time to make their game run better. However, certain games gain nothing from moving the framerate away from the one they were created at. My example of this is the South Park RPG. Given that the game was designed from the get-go to be visually indistinguishable from the TV show, there is really nothing to be gained from changing the frame rate to 60 when 30 is actually closer to the desired effect.
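The standard way to avoid tying logic to frame rate is a fixed-timestep loop: the simulation always advances in constant steps while rendering runs at whatever rate the hardware manages. A minimal sketch of the idea, with a hypothetical run() driver rather than any real engine's API:

```python
# Fixed-timestep game loop: logic advances in constant 1/60 s steps no
# matter how long each rendered frame takes, so physics and timing stay
# identical at 30, 60, or 144 fps.

DT = 1.0 / 60.0  # fixed simulation step, in seconds

def run(frame_times):
    """Feed in rendered-frame durations; return the logic steps taken."""
    steps = 0
    accumulator = 0.0
    for frame in frame_times:      # wall-clock time of each rendered frame
        accumulator += frame
        while accumulator >= DT:   # let the simulation catch up
            steps += 1             # a real engine would call update(DT) here
            accumulator -= DT
        # rendering would happen here, interpolating between logic states
    return steps

# One slow 30fps frame still produces two full-rate logic updates:
print(run([1.0 / 30.0]))
```

The point is exactly the one made above: the renderer can drop to 30fps (or be locked there on purpose, as with the South Park RPG's look) without the game's physics or timing changing at all.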

More important than 30 vs 60 FPS on consoles is a constant frame rate. As a PC gamer, I can always turn settings down or upgrade my hardware to get a game to run better, and the few games that cannot reach a solid framerate quickly become infamous. Console gamers, however, cannot tweak their consoles or games to improve frame rate, and unlike in the PC space, it seems to be entirely okay if a console game loses its consistent frame rate during certain moments. This, to me, is a much bigger concern than not getting 60 FPS.

I recently got a 144Hz monitor and was curious whether I could notice frame rates above 60. Long story short, at least for me, the experience of a shifting framerate is infinitely more disorienting and nausea-inducing than being locked at a lower, stable one. I played an FPS that would stress my system and cause the framerate to drop; when running at 144 frames per second, dropping to the mid-90s for a few moments was far more discomforting than simply capping the game at 60 FPS, even though it never dropped below 90. If a console game's best-case scenario is only 30 FPS, any game that fails to hold there should be viewed as a scam.
 

FPLOON

Your #1 Source for the Dino Porn
Jul 10, 2013
12,531
0
0
This reminds me of something I said a LONG time ago (in 2015):
FPLOON post="6.873933.21945712" said:
Honestly, there has only been one game where I can totally see the difference between 30 and 60... However, it was only when I was going from 60 (original) to 30 (HD) that I noticed the full difference first hand...

Other than that, even after that revelation, my amount of fucks for frame rate, in general, is about as given as an inconstant frame rate...
What's interesting is that this quote still holds up for me to this very day... albeit I have seen certain games off my Steam account drop frames faster than constantly downloading porn pics during a random late-night insomnia session... *shakes fist at them for making me reset shit constantly*

Other than that, I just realized how weird that last phrase in that quote [of mine] is when you really break it down... :p
 

Fijiman

I am THE PANTS!
Legacy
Dec 1, 2011
16,509
0
1
So long as I can't tell that frames have been dropped I usually don't care.
 

NiPah

New member
May 8, 2009
1,084
0
0
Ezekiel said:
I don't play on consoles anymore, so I don't have to choose. Locked hardware is dumb.
Locked hardware has its place: it's easier to monetize, so people actually invest money into making games; there are fewer script kiddies in multiplayer; there are fewer issues with porting games; it's easier to develop for; and it's cheaper to buy a working system, etc.
Sure, having a higher frame rate is nicer, but how much does it really matter while playing DOTA, LoL, or whatever other MOBA clone everyone plays on PC? I'm sure Hearthstone could run perfectly well at 60FPS on a Wii U, so why spend money on a killer gaming PC?
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
A very strange thread. 60 fps is objectively better than 30 fps; it is a measurable fact. There is no room for "opinion" here.

MysticSlayer said:
Don't care. Does it look good? Does it run smooth? Is it responsive enough? Then I'm all for it.
You say you don't care, yet your third sentence deals precisely with caring about framerate. What?

Souplex said:
The difference is incredibly minuscule, and it doesn't really affect anything.
I wouldn't be inclined to care if it weren't for people kvetching about something they can barely even perceive.
False. The difference is massive, visible to everyone without eye problems, and the responsiveness alone is a big enough effect that plenty of people refuse to play at 30 fps.


Sniper Team 4 said:
I've never been able to tell the difference, honestly. I mean, if you show me the same game side by side then I can tell that one looks better, but if you sit me down and have me play a game and then ask, "Which one was it, sixty or thirty?" I will shrug and tell you I don't know.
You are in the minority, then. There was an experiment done with the conditions you describe, and 96% of respondents could correctly identify the higher framerate. When asked, they couldn't tell exactly how they knew, but they could identify it.
 

Godhead

Dib dib dib, dob dob dob.
May 25, 2009
1,692
0
0
I want games to be running at whatever the refresh rate for my monitor is, ideally a little bit more.
 

CaitSeith

Formely Gone Gonzo
Legacy
Jun 30, 2014
5,374
381
88
I appreciate having 60fps over 30fps, but a 30fps lock isn't a deal breaker for me (even in action games like Bloodborne).
 

Estarc

New member
Sep 23, 2008
359
0
0
30 FPS tends to look fine on consoles to me. The problem is that they often run below 30 FPS, dropping 10 or 15 frames in demanding sequences. If you can only manage a stable 30 FPS on console, that's fine, but if it is only 30 FPS under optimal conditions then it isn't okay.

On PC though it has to be 60 FPS. I don't need a frame counter to tell when the FPS drops. It is very noticeable. Even a drop to 40 FPS, which is higher than what I am used to on console at the best of times, is very jarring on PC.
 

Drops a Sweet Katana

Folded 1000x for her pleasure
May 27, 2009
897
0
0
I'll take a stable frame rate over a high frame rate if I have to make that choice. I can play at as low as 20fps as long as it doesn't fluctuate too much. It's not great, but it's something I can adapt to if need be.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
NPC009 said:
Zipa said:
It's simple: 60fps is objectively better than 30fps, but that doesn't mean a game running at 30fps is unplayable; it can still look and play excellently.

That said, it is kind of a shame that games still have to run at 30fps at all thanks to the underpowered hardware in the consoles. The gap between PC and consoles is likely to widen even further as early as this year, as both Nvidia and AMD have new GPUs coming on a new die size, which will mean even better performance. I imagine that before this console cycle is over, the PC standard will be 1440p gaming at 60fps+ (G-Sync and FreeSync are likely to become more common and popular as prices drop, too).
Keep dreaming. It's not just consoles pulling the PC elite down, it's other PC gamers as well :D

Look at what Steam users are actually [http://store.steampowered.com/hwsurvey/processormfg/] using [http://store.steampowered.com/hwsurvey/videocard/]. The Intel Graphics HD 4000 is the most common videocard, mid-range CPUs everywhere!

Few developers are crazy enough to develop for only top-range systems. But it would be nice if developers made a habit of putting in a frame rate slider so users can cap theirs at whatever they feel like (and enjoy the wonders of Hearthstone or Minecraft at 144fps).
In defence of the whole Intel HD4000 thing, though: we don't know whether Steam counts them simply because they are present rather than because they are actually used. A lot of laptops have one alongside some sort of dedicated GPU, and some laptops will switch between the dedicated GPU and the Intel chipset in certain low-power modes, so we really don't know how accurate the survey is.

As for G-Sync and framerates, you can already set certain options via the Nvidia control panel. Skyrim, for instance, has to be capped at a max of 60fps, otherwise it starts causing problems with things like the physics and time progression, symptoms of it being a console port.
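A frame cap like the 60fps limit described above is conceptually just a sleep that burns off the unused part of each frame's time budget. This is only an illustrative sketch of the idea (limited_loop is a made-up name; it is not how the Nvidia driver or any engine actually implements capping):

```python
import time

def limited_loop(cap_fps, n_frames, work=lambda: None):
    """Run n_frames iterations, sleeping so the loop never exceeds cap_fps."""
    budget = 1.0 / cap_fps              # time allotted to each frame
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        work()                          # stand-in for simulate + render
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:            # finished early: wait out the budget
            time.sleep(budget - elapsed)
    return time.perf_counter() - start

# 20 frames capped at 100 fps should take at least 0.2 s in total:
total = limited_loop(100, 20)
```

Real limiters (and adaptive-sync schemes like G-Sync/FreeSync) are more sophisticated, since time.sleep is too coarse for precise pacing, but the budget-and-wait structure is the same basic idea.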

As for devs not catering to high-end users, it's more of an excuse than anything now. We have games like The Witcher 3 that look mind-blowing when cranked up to full, but also scale very well to suit lower-powered hardware. Considering that game's small budget for an AAA release, people like Bethesda really have little excuse when Fallout 4 runs like shit on pretty much anything (even consoles) by comparison.