Poll: 60fps vs 30fps? opinions?

Supernova1138

New member
Oct 24, 2011
408
0
0
Higgs303 said:
The difference in quality between 30FPS and 60FPS is very easy to perceive on recorded video. Certainly, some may genuinely prefer a lower frame rate for aesthetic reasons, but the advantage of higher FPS is obvious if one is interested in recording motion as close as possible to how it was perceived in person (even 60FPS isn't sufficient for this purpose, IMO).

https://www.youtube.com/watch?v=-nU2_ERC_oE

If you have trouble perceiving any difference, try pausing the video when the dune-buggy reaches the peak of its jump (0:06-0:07/0:39).

Admittedly, the effect can be more subtle in 3D applications, but it is indeed a significant change if one also considers the reduction in input lag.

My bet is this issue will fall by the wayside once PS4 and XB1 titles start reaching 60FPS more frequently as developers make greater use of low-level APIs like DX12 and Mantle, as well as the usual late-gen optimizations.
DirectX 12 and Mantle aren't saving the Xbone and PS4 from poor performance; there just isn't enough horsepower in either console to do 60FPS without making serious compromises on visuals, world population density and draw distance. The only stuff that is going to run at 60FPS on the Xbone and PS4 is going to be indie titles or lower-budget games that don't have the resources to put into extra-shiny graphics or gigantic game worlds. I'd say remasters should be 60FPS too, but they often aren't, because the developers are too lazy to port them properly or want to add even shinier graphics that tank their performance back down to 30FPS.

We're waiting until the 9th generation at least for 60FPS to become standard, and at that point we're probably going to be trying to render games at 4K resolution, which will probably mean 30FPS on the consoles again unless the 9th generation machines have some truly beastly hardware inside.
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
Supernova1138 said:
Higgs303 said:
The difference in quality between 30FPS and 60FPS is very easy to perceive on recorded video. Certainly, some may genuinely prefer a lower frame rate for aesthetic reasons, but the advantage of higher FPS is obvious if one is interested in recording motion as close as possible to how it was perceived in person (even 60FPS isn't sufficient for this purpose, IMO).

https://www.youtube.com/watch?v=-nU2_ERC_oE

If you have trouble perceiving any difference, try pausing the video when the dune-buggy reaches the peak of its jump (0:06-0:07/0:39).

Admittedly, the effect can be more subtle in 3D applications, but it is indeed a significant change if one also considers the reduction in input lag.

My bet is this issue will fall by the wayside once PS4 and XB1 titles start reaching 60FPS more frequently as developers make greater use of low-level APIs like DX12 and Mantle, as well as the usual late-gen optimizations.
DirectX 12 and Mantle aren't saving the Xbone and PS4 from poor performance; there just isn't enough horsepower in either console to do 60FPS without making serious compromises on visuals, world population density and draw distance. The only stuff that is going to run at 60FPS on the Xbone and PS4 is going to be indie titles or lower-budget games that don't have the resources to put into extra-shiny graphics or gigantic game worlds. I'd say remasters should be 60FPS too, but they often aren't, because the developers are too lazy to port them properly or want to add even shinier graphics that tank their performance back down to 30FPS.

We're waiting until the 9th generation at least for 60FPS to become standard, and at that point we're probably going to be trying to render games at 4K resolution, which will probably mean 30FPS on the consoles again unless the 9th generation machines have some truly beastly hardware inside.
You may be right, but look at the fairly dramatic increase in FPS that was seen in AMD GPUs in the Ashes of the Singularity DX12 benchmark. At 1080p, the R9 290X went from 28FPS in DX11 mode to 48FPS in DX12 mode. Both consoles use AMD GPUs with a similar GCN architecture. If this benchmark is representative of the performance gains to be made by GCN GPUs under DX12, then console users could possibly see a significant increase in FPS.

I admit that this is pure speculation; DX12/Mantle might ultimately be of little benefit to console performance. However, this one data set does suggest that a significant increase in FPS is possible. One benchmark is hardly sufficient to say for sure, though, so I guess I was being a little too optimistic. If consoles did see an increase of 15-20 FPS, devs could likely tweak existing post-processing effects to allow for consistent performance in the 48 to 60 FPS range (although most first-person shooters already run at 60FPS on both consoles). However, it is also possible that they would sacrifice any potential FPS gain for more post-processing effects, as better graphics are a far more marketable concept.
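As a rough sanity check, the relative gain in that single benchmark, and what the same multiplier would hypothetically mean for a 30FPS console title, works out as follows. This is purely illustrative; nothing guarantees a desktop GPU's DX12 gain transfers to console hardware:

```python
# Relative DX11 -> DX12 gain from the Ashes of the Singularity numbers
# quoted above (R9 290X at 1080p). Projecting it onto a 30FPS console
# title is a hypothetical, not a measurement.
dx11_fps, dx12_fps = 28.0, 48.0
gain = dx12_fps / dx11_fps            # ~1.71x in that single benchmark

projected_console_fps = 30.0 * gain   # ~51 FPS, if the gain carried over
print(round(gain, 2), round(projected_console_fps, 1))
```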
 

Patathatapon

New member
Jul 30, 2011
225
0
0
Lemme tell you a little story: Once upon a little while ago, I wanted to play Halo 5 with my brother, just like we always did with all the previous ones. Then we learned it had no co-op. Why? Graphics. And specifically, 60fps. Why bother letting us buy more controllers if we can't even play in the same room? Next thing you know, the next Mortal Kombat won't have local co-op. If that's what having 60fps means, I don't want any of it for my console games.

Also, from what I can tell, the only time you can guarantee a "necessity" for 60fps is if you need every single frame you can get that can make the difference. So basically competitive play. Okay. Why does that affect my fucking co-op campaign experience you assholes?
 

CrimsonBlaze

New member
Aug 29, 2011
2,252
0
0
Whatever allows me to play the game flawlessly and unencumbered.

I played The Last of Us on the PS3 for a little while and found it to be an amazing title. Then, when I bought my PS4 and downloaded the remastered edition, I was just as amazed, if not blown away, at how much of a difference 1080p and 60fps make to an already stunning game.

So a set fps will not deter me from certain titles, but if I can go 60, I'll do 60.
 

Fijiman

I am THE PANTS!
Legacy
Dec 1, 2011
16,509
0
1
SycoMantis91 said:
My vote, as a PC and Console gamer: WHO CARES
The answer is PC elitist snobs who have nothing better to do with their non-gaming time than ***** and moan about how terrible they think any system that isn't on par with their super-deluxe high-end computers is. That or people who can never seem to get a stable frame rate no matter what system they're using.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Supernova1138 said:
It's rare in Western development, but you still see it happen with a lot of Japanese games. Koei Tecmo has started putting out PC ports of some of their titles; they are usually locked to 30FPS and will run into game-speed problems if you try to unlock the framerate. Japanese devs still develop primarily for consoles and operate under the assumption that they are never releasing their games on PC, so they don't have to worry about the end user unlocking the framerate, and they can be lazy with how they code their games and tie a bunch of crap to the framerate.
Fair enough, I am not as familiar with the exclusively Japanese market, so that may be the case there. Still, the debate here is clearly not about those games, as they are mentioned only as an exception.

t850terminator said:
Really? Is this a discussion? It's like asking if you want 360p or 1080p.
I once had a person on this very forum very seriously claim that he prefers 480p over 1080p in games. That was two years ago, though.

MatParker116 said:
Where's the "I don't give a flying fuck as long as the game works" option?
It's not available because framerate is an important part of a game working.

MysticSlayer said:
But I guess I should provide a couple links by now that have informed my own opinion (I would have done this last night, but I was starting to get really tired and didn't feel like looking for them).

For starters, here's an article that Gamasutra did a few years ago with the help of a Neversoft employee: link [http://www.gamasutra.com/view/feature/130359/programming_responsiveness.php?page=1]

It's long and theoretical, but a couple things do stand out:

"So if we are running at 30fps, then the lag is 3/30th or one-tenth of a second. If we are running at 60fps, then the lag will be 3/60th or 1/20th of a second... [page 2]

One of the great misconceptions regarding responsiveness is that it's somehow connected to human reaction time. Humans cannot physically react to a visual stimulus and then move their fingers in less than one-tenth of a second.

Game players' peak reaction times vary from 0.15 seconds to 0.30 seconds, depending on how "twitchy" they are. [page 3]"

Now, as the rest of the article shows, there's more that goes into input lag than just framerate, and as a result a poorly programmed game at 30 FPS is more likely to show input delay than a poorly programmed game at 60 FPS, which has more "padding". But there are also a lot of other factors, even down to the way the animations go. Essentially, it is an issue with how well the game is programmed, not whether or not the framerate is in the threshold.

Furthermore, this comes up occasionally on Game Development on Stack Exchange. Here's one example: link [http://gamedev.stackexchange.com/questions/59645/what-is-an-acceptable-input-delay]

The premise of the question was bad (they were trying to control input delay more than they should), but there were also a few comments regarding how 30 FPS is entirely within the acceptable range. One person even called it "common knowledge" without contest. This is pretty much the way I've always seen these discussions go: 30 FPS is entirely acceptable, but 60 FPS is better. The only exceptions are discussions like this one, where someone is trying to prove 30 FPS isn't acceptably responsive without much evidence beyond anecdotal experience that may have been indicating bad programming on the developer's part more than an issue with 30 FPS.

If you knew me, you wouldn't come to that conclusion.
Then why are you claiming that to be the case here?
You said I don't measure enjoyableness by smoothness and responsiveness, which is something you've added to what I've been saying.

In actuality, I know very few people that pick up input lag as often as I do or let it bug them as much as I do. I've even put down games because I could tell there was a delay between my button press and what happened on screen, only for someone to tell me they couldn't feel any. This includes those I know that only play their games at 60+ FPS.
Your link doesn't work, so thanks for quoting it. Gamasutra apparently has itself fallen for the misconception that responsiveness is about human response time. It is not; it is about the game's response time. Yes, humans can and do both see and feel whether the game responds faster or slower after a button press, even in those amounts. In fact, the reason any monitor with a response time above 5 ms is considered "shit for gaming" is that even 6 ms of monitor-introduced lag makes game controls uncomfortable, and the numbers you are talking about are over five times higher. Of course purists will go for 1 ms TN panels, but in general the 5 ms ones are acceptable.

As for your Stack Exchange link, it seems to support my statement above with "note that this and what you are talking about are not the same thing. The acceptable delay between making a motion with a mouse and seeing a result is almost certainly much smaller than the time it would take a human to react to something displayed on the screen with the mouse."

You said you can enjoy games regardless of framerate. Framerate is the main factor in smoothness and responsiveness. Therefore, you can enjoy games regardless of their smoothness or responsiveness, which means that the value you put on games lies elsewhere. Hence my conclusion.
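As a side note, the three-frames-of-lag arithmetic from the quoted Gamasutra article can be sketched directly. The pipeline depth of 3 frames is the article's assumption, not a universal constant; real engines vary:

```python
# Sketch of the input-lag arithmetic from the quoted Gamasutra article.
# Assumes a fixed pipeline depth of 3 frames between input and display,
# as the article does.
def input_lag_ms(fps: float, pipeline_frames: int = 3) -> float:
    """Input-to-display lag in milliseconds for a given frame rate."""
    return pipeline_frames * 1000.0 / fps

print(input_lag_ms(30))  # 100.0 ms -> one tenth of a second
print(input_lag_ms(60))  # 50.0 ms  -> one twentieth of a second
```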

NPC009 said:
The actual discussion is, or atleast should be, about when it's acceptable to sacrifice high framerates for better graphics and such.

I hate making food comparisons because they're such clichés, but in the current gaming landscape (where many big studios put pretty graphics before high framerates, because most consumers really don't give a fuck) 60fps is like food from a good restaurant. It's great when you can afford it, but homecooked mac 'n cheese isn't so bad either, especially if that means you can pay rent and support your action figure collecting habit on top of it.
If most consumers didn't give a fuck, then why are there literal riots every time a big franchise announces a 30 fps lock, why is FPS Police one of the most-followed curators on Steam, and why is FPS one of the main features advertised in games? Maybe you "don't give a fuck", but most consumers most certainly DO.

Oh, and there is no point to the discussion above; the answer is, of course, to stop being limited by outdated hardware.

Higgs303 said:
Certainly, some may genuinely prefer a lower frame rate for aesthetic reasons,
What? How are stuttering visuals preferable aesthetically? Or is that the whole "cinematic" bullshit again? Because that was never, in any way, shape or form, true.

My bet is this issue will fall by the wayside once PS4 and XB1 titles start reaching 60FPS more frequently as developers make greater use of low-level APIs like DX12 and Mantle, as well as the usual late-gen optimizations.
The issue is already on the "wayside". The discussion has moved on to 60 fps vs 120 fps.



Supernova1138 said:
DirectX 12 and Mantle aren't saving the Xbone and PS4 from poor performance, there just isn't enough horsepower in either console to do 60FPS without making serious compromises on visuals, world population density and draw distance.
DirectX 12 and OpenGL Next, which will be on the Xbone and PS4 respectively, allow driver-handled multicore draw calls, which, given that the consoles have many weak cores and most games currently use only some of them, may improve performance on consoles if developers take advantage of the feature. How much effect it will actually have... well, synthetic tests show a 20-30% improvement; in reality it's still to be decided.

Higgs303 said:
You may be right, but look at the fairly dramatic increase in FPS that was seen in AMD GPUs in the Ashes of the Singularity DX12 benchmark. At 1080p, the R9 290X went from 28FPS in DX11 mode to 48FPS in DX12 mode. Both consoles use AMD GPUs with a similar GCN architecture. If this benchmark is representative of the performance gains to be made by GCN GPUs under DX12, then console users could possibly see a significant increase in FPS.
Worth noting that Ashes of the Singularity is being designed from the ground up specifically for DirectX 12 and is the type of game that will benefit the most from it (a lot of independent AI actors constantly interacting, with literally thousands of particle effects on screen). The effect would be much smaller in, say, a driving game.

Fijiman said:
SycoMantis91 said:
My vote, as a PC and Console gamer: WHO CARES
The answer is PC elitist snobs who have nothing better to do with their non-gaming time than ***** and moan about how terrible they think any system that isn't on par with their super-deluxe high-end computers is. That or people who can never seem to get a stable frame rate no matter what system they're using.
People who care about the quality of their games = elitist snobs. OK then.
 

SquallTheBlade

New member
May 25, 2011
258
0
0
Patathatapon said:
Lemme tell you a little story: Once upon a little while ago, I wanted to play Halo 5 with my brother, just like we always did with all the previous ones. Then we learned it had no co-op. Why? Graphics. And specifically, 60fps. Why bother letting us buy more controllers if we can't even play in the same room? Next thing you know, the next Mortal Kombat won't have local co-op. If that's what having 60fps means, I don't want any of it for my console games.
You really wouldn't want fighting games to run at sub 60 fps. And fighting games won't get rid of the local multiplayer ever. That kind of defeats the point.

Also, from what I can tell, the only time you can guarantee a "necessity" for 60fps is if you need every single frame you can get that can make the difference. So basically competitive play. Okay. Why does that affect my fucking co-op campaign experience you assholes?
Not true. My friends and I play the Tales of games, and every game so far has been 60fps in battles, but the latest has been 30fps. Every one of us noticed it and has complained about it. It hinders our experience. It just doesn't feel good. One night after playing, we decided to pop in Xillia 2 and, god damn, it felt good to play. The difference was night and day.

60fps does affect casual play too.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Strazdas said:
NPC009 said:
The actual discussion is, or atleast should be, about when it's acceptable to sacrifice high framerates for better graphics and such.

I hate making food comparisons because they're such clichés, but in the current gaming landscape (where many big studios put pretty graphics before high framerates, because most consumers really don't give a fuck) 60fps is like food from a good restaurant. It's great when you can afford it, but homecooked mac 'n cheese isn't so bad either, especially if that means you can pay rent and support your action figure collecting habit on top of it.
If most consumers didn't give a fuck, then why are there literal riots every time a big franchise announces a 30 fps lock, why is FPS Police one of the most-followed curators on Steam, and why is FPS one of the main features advertised in games? Maybe you "don't give a fuck", but most consumers most certainly DO.
You're making the mistake of assuming the vocal majority on the internet is the actual majority buying the games. It doesn't work like that. Companies don't sell millions to whiny elitist bastards with awesome rigs, they sell millions of copies to people who buy their games in toy stores.

And even among the pro-60fps people, it's kinda obvious some don't even understand what framerates really are and do. That's when you get bullshit arguments like 'retro games all ran at 60fps, look at how far we've fallen'. No, they didn't. They were shown at 50 or 60hz interlaced, which translates to 25-30 actual frames per second, at best. TVs were kinda shitty back then, and other hardware limitations were also a thing. Not to mention that animations were often less than spectacular. So, yeah.
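For what it's worth, the arithmetic behind that interlacing claim is simple: an interlaced signal delivers alternating half-frames (fields), so a full frame needs two refresh cycles. A minimal sketch of the conversion (whether a given old console actually output interlaced video is disputed in this very thread, so treat it as the claim's math, not settled fact):

```python
# Interlaced video draws odd and even scanlines on alternating refreshes,
# so two fields are needed to build one full frame.
def effective_fps(refresh_hz: float, interlaced: bool = True) -> float:
    return refresh_hz / 2.0 if interlaced else refresh_hz

print(effective_fps(60))  # 30.0 -> NTSC regions
print(effective_fps(50))  # 25.0 -> PAL regions
```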

Oh, and there is no point to the discussion above; the answer is, of course, to stop being limited by outdated hardware.
Yes, people should all have infinite amounts of money so they can keep up with the Crysises. I don't know about your situation, but most adults have to spend most of their income on rent, utilities, food and other basic necessities. High-end gaming PCs are a big investment many people can't reasonably afford.

And honestly, that's not a terrible thing. Development costs of triple A titles have risen to absurd heights in the past decade. We've gotten to the point where selling 'only' 3-4 million copies may mean losing money. Many developers can't afford to really put the PS4 hardware to work, let alone push PC gaming to greater heights. Even if you, as a consumer, do spend big amounts of cash on hardware, your advantage over people with consoles or even low-end PCs isn't that great. Many recent awesome games have fairly low system requirements, so even with an outdated system, you're unlikely to run out of great games to play.
 

Hazy992

Why does this place still exist
Aug 1, 2010
5,265
0
0
60fps is absolutely better than 30fps; in fact, I can't think of a single game that wouldn't be improved by running at 60. It just looks smoother and feels more responsive. I honestly don't get how people don't notice the difference; it's night and day.

Granted, I'm not one of those people who find games running at 30 unplayable (which is good, because otherwise I'd miss out on some great games), but given the choice I'll always go for 60, and my first priority when I'm playing on PC is to get the game running at a stable 60.

I will say though that stable 30 > 60 with frame drops.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
NPC009 said:
You're making the mistake of assuming the vocal majority on the internet is the actual majority buying the games. It doesn't work like that. Companies don't sell millions to whiny elitist bastards with awesome rigs, they sell millions of copies to people who buy their games in toy stores.

And even among the pro-60fps people is kinda obvious some don't even understand what framerates really are and do. That's when you get bullshit arguments like 'retro games all run at 60fps, look at how far we've fallen'. No, they didn't. They were shown at 50 or 60hz interlaced, which translates to 25-30 actual frames per second. At best. TVs were kinda shitty back then and other hardware limitations were also a thing. Not to mention that animations were often less than spectacular. So, yeah.
I never claimed a vocal majority on the internet, though. The riots part may have felt like it a bit, but the other two points would in fact be driven by the silent majority for the most part. And before you go for the "Steam is online" angle, Steam is 97% of PC gaming, so yes, that is a vast majority of the people actually buying games. No, in fact companies, even console-exclusive companies, sell most of their games online, and retail is dying quickly. They don't sell millions of copies in toy stores; this is not the '00s anymore.

Well, there are people who don't understand something in most groups. Especially since framerate is often more felt than seen, due to its unique way of improving the game, a person can easily feel that a higher framerate is more pleasant to play without knowing why. As far as retro games go, it depends on the game and publisher; actual 60 fps was more common on Nintendo consoles. When it comes to PC gaming, though, we always had high FPS: back in the 90s I played StarCraft and Doom at 85 fps on my 85hz CRT monitor. Actually, pre-recorded animations are one of the easiest things to run in games nowadays, so that really isn't an excuse for a lower framerate.

Yes, people should all have infinite amounts of money so they can keep up with the Crysises. I don't know about your situation, but most adults have to spend most of their income on rent, utilities, food and other basic necessities. High-end gaming PCs are a big investment many people can't reasonably afford.

And honestly, that's not a terrible thing. Development costs of triple A titles have risen to absurd heights in the past decade. We've gotten to the point where selling 'only' 3-4 million copies may mean losing money. Many developers can't afford to really put the PS4 hardware to work, let alone push PC gaming to greater heights. Even if you, as a consumer, do spend big amounts of cash on hardware, your advantage over people with consoles or even low-end PCs isn't that great. Many recent awesome games have fairly low system requirements, so even with an outdated system, you're unlikely to run out of great games to play.
No need for infinite money (and there have been no new Crysis games since 2013). You do not need top-of-the-line hardware to run games at 60 fps; you just shouldn't expect to run that on consoles that are running 12-year-old architecture. No, a high-end gaming PC is not a big investment; in fact, it ends up cheaper than console gaming if you take cheaper games into account. You can easily build a PC for 500 dollars that will run every game currently released. That's how much a PS4 costs, isn't it?

Most development-cost bloat is the marketing budget, though, and that has no effect on how well the actual game is made. There is nothing to really "put to work": the PS4 has weak hardware using a standard architecture, the same architecture used in the Xbox One and the PC, and that architecture has been in common use for two decades.

My outdated PC can still run games at 60 fps, though. So what's your point?
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Strazdas said:
I never claimed a vocal majority on the internet, though. The riots part may have felt like it a bit, but the other two points would in fact be driven by the silent majority for the most part. And before you go for the "Steam is online" angle, Steam is 97% of PC gaming, so yes, that is a vast majority of the people actually buying games. No, in fact companies, even console-exclusive companies, sell most of their games online, and retail is dying quickly. They don't sell millions of copies in toy stores; this is not the '00s anymore.
Retail isn't doing so badly, actually. Console companies don't want to compete too hard with physical stores and webshops, because they rely on them to get their systems sold. Plus, many people just like the convenience and/or think buying online is a little scary. It's no coincidence stores carry cards for PSN, the eShop, Steam and even individual digital games and DLC.

As for Steam, if it is truly 97% of PC gaming, it seems a little quiet. Many triple-A games get, what, 10,000-20,000 reviews? Out of several million copies sold? Either Steam isn't as big as you say it is, or most people can't be bothered to take two minutes to leave a review (which is the preferred method of complaining about a game). Or maybe the biggest whiners weren't buying the games in the first place, which makes me wonder how big that group is - is it too small for developers/publishers to take notice of?

Well, there are people who don't understand something in most groups. Especially since framerate is often more felt than seen, due to its unique way of improving the game, a person can easily feel that a higher framerate is more pleasant to play without knowing why. As far as retro games go, it depends on the game and publisher; actual 60 fps was more common on Nintendo consoles. When it comes to PC gaming, though, we always had high FPS: back in the 90s I played StarCraft and Doom at 85 fps on my 85hz CRT monitor. Actually, pre-recorded animations are one of the easiest things to run in games nowadays, so that really isn't an excuse for a lower framerate.

Again, unless you had an extremely expensive TV (like a 100hz model or something), you were playing SNES games at 25/30fps or less, just like everyone else. It wasn't until the days of the PS2 that higher framerates slowly began to happen, because it was at that point that TVs capable of showing more fps and higher resolutions started to make their way into consumers' lives. Some PS2 games offered 60fps, yes, but since most people were still playing on basic 60hz TVs, few could actually take advantage of it. In Europe there was something else going on: the 50hz/60hz divide. Being able to play games like F-Zero GX at their native 60hz was a big deal back then for the more dedicated European players (most players, however, didn't really care/notice).

Also, IIRC DOOM was capped at something like 35fps. Plus, I bet your CRT monitor offered an interlaced image, meaning the actual framerate was closer to 40. The main difference between CRTs and modern flatscreens is that input lag is lower by default. This made games feel more responsive, but you weren't actually seeing high framerates. (This is, by the way, why some retro fans insist on playing old games on CRT screens: it is as close to the original experience as you can get.)



No need for infinite money (and there have been no new Crysis games since 2013). You do not need top-of-the-line hardware to run games at 60 fps; you just shouldn't expect to run that on consoles that are running 12-year-old architecture. No, a high-end gaming PC is not a big investment; in fact, it ends up cheaper than console gaming if you take cheaper games into account. You can easily build a PC for 500 dollars that will run every game currently released. That's how much a PS4 costs, isn't it?
Aside from Japanese developers (the PS3 still has a huge user base in Japan, and it will be years before the PS4 catches up, if it ever does) and some developers with extremely mainstream franchises (Just Dance, FIFA), few are still developing for the PS3 generation. So I'm not entirely sure what you're getting at here. Oh, wait, are you being edgy by claiming the PS4 was outdated a decade ago? Cute. Especially if you insist a proper gaming PC will cost as little as 500 bucks. That may get you the main components (and maybe the most basic case to put them in), and you'll be able to play even the most demanding games on low settings, but hardware such as the monitor needs to be upgraded once in a while as well. There's also the OS to consider; you'll kinda want Windows, for obvious reasons. So even if you're building it yourself, you'll end up spending 700+ dollars, and that won't put you in early-adopter territory, meaning you're still not contributing to the advance of PC gaming. In fact, you'll start to feel the age of the system within a few years.

Oh, and don't forget many people don't have the skills/knowledge/confidence to build their own system and keep it running properly. Be a dick about that all you want, but it won't change the fact most people don't find shopping for graphics card a fun pastime.

Most development-cost bloat is the marketing budget, though, and that has no effect on how well the actual game is made. There is nothing to really "put to work": the PS4 has weak hardware using a standard architecture, the same architecture used in the Xbox One and the PC, and that architecture has been in common use for two decades.

My outdated PC can still run games at 60 fps, though. So what's your point?
My extremely shitty laptop can run games at 60 fps just fine. Sure, those games are most likely a decade old or faux retro stuff, but hey, 60 fps! That makes me a proud member of the PC master race, right? Right?

...

Look, I'm not going to delude myself into thinking the PC gaming experience I'm perfectly content with is making the pissing contest-obsessed members of the master race proud. Neither should you.

And there's a lot to put to work in consoles, by the way. Sure, they're not running on Cell chip or Emotion Engine-level arcane magic this generation, which means developers won't need half a decade to figure out what the hell they're doing, but what's in there is actually overkill for many developers. Aside from triple A games, games generally aren't pushing the PS4 or Xbox One. And speaking of triple A games, ever thought the marketing budgets may be so high because they can't risk relying on word-of-mouth to sell games with massive budgets? Sony's head of worldwide development, Shuhei Yoshida, estimated the development costs for top PS4 games at over 20 million dollars (and that's lowballing it, because he also mentioned numbers such as 50 million). And that's development costs, it doesn't take marketing into account.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
NPC009 said:
Retail isn't doing so badly, actually. Console companies don't want to compete too hard with physical stores and webshops, because they rely on them to get their systems sold. Plus, many people just like the convenience and/or think buying online is a little scary. It's no coincidence stores carry cards for PSN, the eShop, Steam and even individual digital games and DLC.

As for Steam, if it is truly 97% of PC gaming, it seems a little quiet. Many triple-A games get, what, 10,000-20,000 reviews? Out of several million copies sold? Either Steam isn't as big as you say it is, or most people can't be bothered to take two minutes to leave a review (which is the preferred method of complaining about a game). Or maybe the biggest whiners weren't buying the games in the first place, which makes me wonder how big that group is - is it too small for developers/publishers to take notice of?
Retail is doing horribly: closing down shops, downscaling, aggressively pushing other services and generally going through death throes left and right. People can pick up consoles in hardware stores or Walmart. No, only a small minority of people think buying online is scary; this is 2016, not 2006. Basically the only physical copy buyers are people with awful internet or physical copy collectors. The cards for eshops exist mostly because most kids don't have bank accounts but still have disposable income to buy games. They also make good gifts to gamers; gifting a buying card/cheque is popular, and most shops that don't usually carry them stock them specifically for Christmas because people use them as gifts. Gaming is no exception here.

Yes, most people do not leave reviews, hence the phrase "silent majority". I have played over 100 games on Steam; I have left a review for 4 of them. As far as I know, none of my friends left even a single review. So no, most people don't bother writing reviews.

Again, unless you had an extremely expensive TV (like a 100hz model, or something), you were playing SNES games at 25/30fps or less, just like everyone else. It wasn't until the days of the PS2 that higher framerates slowly began to happen, because it was at that point in time that TVs capable of showing more fps and higher resolutions started to make their way into consumers' lives. Some PS2 games offered 60fps, yes, but since most people were still playing on basic 60hz TVs, few could actually take advantage of it. In Europe there was something else going on: 50hz. Being able to play games like F-Zero GX at its native 60hz was a big deal back then for the more dedicated European players (most players, however, didn't really care/notice).
No. I had my console connected to a simple 50hz 13" CRT TV that I saved up for as a kid. Some games ran at 50fps, though most, as you say, were interlaced. This problem didn't exist when I bought a Pentium I, as the PC ran just fine with the 85hz CRT I mentioned earlier. 50hz CRT TVs were the standard here in Europe, though most TV programmes were indeed interlaced 25fps broadcasts.

Also, IIRC DOOM was capped at something like 35fps. Plus, I bet your CRT monitor offered an interlaced image, meaning the actual framerate was closer to 40. The main difference between CRTs and modern flatscreens is that input lag is lower by default. This made games feel more responsive, but you weren't actually seeing high framerates. (This is, by the way, why some retro fans insist on playing old games on CRT screens, as it's as close to the original experience as you can get.)
No, DOOM had no frequency cap, though when it released most machines could not run it at 60fps anyway. And no, my CRT did not use interlacing at all; the CRT I used for my computer didn't even have interlacing as a function. CRTs have no input lag - as in, it's at 0.001ms. They were also, for a very long time (thanks to the TV manufacturing cartel), the only monitors capable of above-60hz visuals, because 144/120hz TN panels weren't around yet, but good quality CRTs could go up to 125hz progressive.

Actually, the reason many retro gamers use old CRTs is that the games were designed for curved screens and look kinda weird on flat panels. Many old CRTs have curved screens.


Aside from Japanese developers (the PS3 still has a huge user base in Japan and it will be years before the PS4 catches up, if it ever does) and some developers with extremely mainstream franchises (Just Dance, FIFA), few are still developing for the PS3 generation. So, I'm not entirely sure what you're getting at here. Oh, wait, are you being edgy by claiming the PS4 was outdated a decade ago? Cute. Especially if you insist a proper gaming PC will cost as little as 500 bucks. That may get you the main components (and maybe the most basic case to put them in) and you'll be able to play even the most demanding games on low settings, but hardware such as the monitor needs to be upgraded once in a while as well. There's also the OS to consider. You'll kinda want Windows for obvious reasons. So even if you're building it yourself, you'll end up spending 700+ dollars, and that won't put you in early adopter territory, meaning you're still not contributing to the advance of PC gaming. In fact, you'll start to feel the age of the system within a few years.
No, by that remark I meant the PS3, Xbox 360 and Wii U. A PC built for 500 will outperform a PS4 or Xbox One, though. Most likely you'll end up playing on medium settings, but it will still look nicer than what the console shows, and that's the goal with that computer, isn't it: to get more than the console for the same cost.

The monitor shouldn't be put into question here, as you don't get a monitor with a console either. And yes, you can plug your computer into your TV with the exact same cables you plug your console in with. And well, yes, monitors eventually die, something like once per 10 years of daily use, but so do TVs and all other electronics.

I built a top end PC for 700 dollars two years ago. I wasn't an "early adopter", but that isn't necessary to play your games at a decent framerate. So far the only thing I've actually upgraded was putting in an SSD, something I did specifically to make it easier to do other things in the background while I'm gaming (such as recording footage) and something an average gamer can do without. I'm not "feeling the age" yet.

Oh, and don't forget many people don't have the skills/knowledge/confidence to build their own system and keep it running properly. Be a dick about that all you want, but it won't change the fact most people don't find shopping for graphics cards a fun pastime.
Can you build things out of Lego? Yes? Congratulations, you have the skills to build a computer. You can get the knowledge on YouTube in a few hours. Confidence is something you'll have to build on your own, though. Once you put it together correctly, there is nothing to "keep running properly".

My extremely shitty laptop can run games at 60 fps just fine. Sure, those games are most likely a decade old or faux retro stuff, but hey, 60 fps! That makes me a proud member of the PC master race, right? Right?
No, I meant games released last year - 2015. My computer mentioned above handles them fine.

To be a member of the PC Master Race you have to acknowledge the superiority of PCs. You don't even have to own one.

And there's a lot to put to work in consoles, by the way. Sure, they're not running on Cell chip or Emotion Engine-level arcane magic this generation, which means developers won't need half a decade to figure out what the hell they're doing, but what's in there is actually overkill for many developers. Aside from triple A games, games generally aren't pushing the PS4 or Xbox One. And speaking of triple A games, ever thought the marketing budgets may be so high because they can't risk relying on word-of-mouth to sell games with massive budgets? Sony's head of worldwide development, Shuhei Yoshida, estimated the development costs for top PS4 games at over 20 million dollars (and that's lowballing it, because he also mentioned numbers such as 50 million). And that's development costs, it doesn't take marketing into account.
Nonsense. What is in there is already hitting its limits, with black bars, low resolutions and limited framerates. The tablet-level APUs in the consoles (and they are the same APUs that actually came out in tablets last year, yay for AMD cheaping out) have already reached their limit. DirectX 12 and OpenGL Next may improve performance a bit, so we will see a boost there, but other than that, there won't be much developer optimization this time around.

Maybe the AAA business should remember that it's not advertising or word of mouth that sells games, it's actually making a good game (points at Undertale).
 

Qizx

Executor
Feb 21, 2011
458
0
0
Is there an "I don't care so long as it's fun" option? Cause honestly, I picked 60 just because it is better... duh, but I don't really care that much so long as the game works and is enjoyable.
 

SquallTheBlade

New member
May 25, 2011
258
0
0
NPC009 said:
Yes, people should all have infinite amounts of money so they can keep up with the Crysises. I don't know about your situation, but most adults have to spend most of their income on rent, utilities, food and other basic necessities. High-end gaming PCs are a big investment many people can't reasonably afford.
I spent about 400 euros on my rig over the years. I finished the build 2 years ago and I'm still running games at 1080p/60fps on medium settings. Of course, it depends on the game whether I need to lower the settings more to achieve that. Some even run at 60fps with graphics maxed out, like Dark Souls 2: SotFS. At the moment I'm playing Final Fantasy 13 at 1080p/60fps most of the time, mostly in battles.

Building a decent gaming rig isn't as expensive as you think.
 

joest01

Senior Member
Apr 15, 2009
399
0
21
To add some actual peer-reviewed-level scientific facts to the discussion, I have it on good authority that the PC gamer in this picture has sub-.3ms reflexes and is actually training to be a cage fighter.

 

Cycloptomese

New member
Jun 4, 2015
313
0
0
There should be an "I don't care either way" option. That said, I clicked on 60FPS because bigger number.
 

AnthrSolidSnake

New member
Jun 2, 2011
824
0
0
The problem (or lack of a problem, in this case) is that your mind is very adaptable.
Let's say you've been playing a game on your PC at 60 FPS for a couple weeks (let's assume you were using a gamepad). You're entirely used to it.
Now your computer's CPU malfunctions and you're forced to play the same game on your console at 30FPS.

At first, you'll most likely notice the difference. It might feel slightly sluggish. If you pay attention, you'll notice a slight ghosting effect, and control inputs may feel delayed by a fraction of a second.
However, you play for a few hours anyway. At some point, you'll no longer notice. Your mind is used to the lower framerate and the timing for controls, and thus you no longer seem to care about the framerate.

This effect also carries over long term. If someone has been playing games at 30FPS for years without thinking about it, they may claim they notice little to no difference between 30 and 60. The opposite applies as well: someone who has been playing at 60 FPS for years will notice the difference immediately.


What is a fact about the difference?
60 FPS shows more frames of animation on screen than 30 FPS. There is no denying this.
In some games, this can be crucial. If you're objectively receiving more visual information from a game you're playing, your mind has more information to work with. This is why control inputs look and feel smoother to most people.
To many, 60 FPS is mandatory for certain games because of this. Unless you can adjust your control to a degree that makes up for the loss of animation at 30FPS, you'll likely be at a disadvantage going against a player using 60 FPS in the same game.
This is why genres such as fighting games, racing games, and first person shooters are considered games where 60 FPS (or higher) is required, at least in a competitive setting.
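The arithmetic behind that is easy to sketch (a back-of-the-envelope Python example; treating the extra input delay as roughly one frame is a simplification, since real latency has many other sources):

```python
def frame_time_ms(fps: float) -> float:
    """How long each rendered frame stays on screen, in milliseconds."""
    return 1000.0 / fps

t30 = frame_time_ms(30)  # ~33.3 ms per frame
t60 = frame_time_ms(60)  # ~16.7 ms per frame

# In the worst case an input lands just after a frame was drawn and has
# to wait roughly one full frame, so dropping from 60 to 30 fps can add
# up to ~16.7 ms of delay on top of everything else in the pipeline.
print(f"30 fps: {t30:.1f} ms, 60 fps: {t60:.1f} ms, "
      f"worst-case extra wait: {t30 - t60:.1f} ms")
```

That one-frame gap is why twitchy genres feel so much tighter at 60: both the picture updates and the input response are twice as frequent.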
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Strazdas said:
Retail is doing horribly: closing down shops, downscaling, aggressively pushing other services and generally going through death throes left and right. People can pick up consoles in hardware stores or Walmart. No, only a small minority of people think buying online is scary; this is 2016, not 2006. Basically the only physical copy buyers are people with awful internet or physical copy collectors. The cards for eshops exist mostly because most kids don't have bank accounts but still have disposable income to buy games. They also make good gifts to gamers; gifting a buying card/cheque is popular, and most shops that don't usually carry them stock them specifically for Christmas because people use them as gifts. Gaming is no exception here.
You're confusing specialised game shops with retail as a whole. It's not the same thing. And do you have any idea how many people still have bad internet connections? I'm guessing you're from a European country where fiber is common, but a lot of consumers don't have great options. Heck, they may only have one shitty option. I heard that's pretty common in the US.

Yes, most people do not leave reviews, hence the phrase "silent majority". I have played over 100 games on Steam; I have left a review for 4 of them. As far as I know, none of my friends left even a single review. So no, most people don't bother writing reviews.
Okay, so these people are rioting, but so quietly we don't even notice them? I'm not sure how that works.

No. I had my console connected to a simple 50hz 13" CRT TV that I saved up for as a kid. Some games ran at 50fps, though most, as you say, were interlaced. This problem didn't exist when I bought a Pentium I, as the PC ran just fine with the 85hz CRT I mentioned earlier. 50hz CRT TVs were the standard here in Europe, though most TV programmes were indeed interlaced 25fps broadcasts.
When set to interlaced, a monitor only refreshes half the image per pass. Instead of getting 50 new images a second, you're really only getting 25. I'm not entirely sure how many TVs actually supported 240p back then.
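That halving can be sketched in a couple of lines (a simplified model: it treats two fields as exactly one image and ignores that the two fields are drawn at slightly different moments):

```python
def full_images_per_second(refresh_hz: float, interlaced: bool) -> float:
    """Each interlaced pass draws only half the scanlines (one field),
    so two passes are needed to build one complete image."""
    return refresh_hz / 2 if interlaced else refresh_hz

print(full_images_per_second(50, interlaced=True))   # 25.0 (PAL TV)
print(full_images_per_second(50, interlaced=False))  # 50.0 (progressive)
```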

No, DOOM had no frequency cap, though when it released most machines could not run it at 60fps anyway. And no, my CRT did not use interlacing at all; the CRT I used for my computer didn't even have interlacing as a function. CRTs have no input lag - as in, it's at 0.001ms. They were also, for a very long time (thanks to the TV manufacturing cartel), the only monitors capable of above-60hz visuals, because 144/120hz TN panels weren't around yet, but good quality CRTs could go up to 125hz progressive.

Actually, the reason many retro gamers use old CRTs is that the games were designed for curved screens and look kinda weird on flat panels. Many old CRTs have curved screens.
That's the first time I've heard about the curves being better. I don't know, seems like a weak argument to me. Things like no input lag, native resolution (no scaling to mess things up) and such seem more important to me.


No, by that remark I meant the PS3, Xbox 360 and Wii U. A PC built for 500 will outperform a PS4 or Xbox One, though. Most likely you'll end up playing on medium settings, but it will still look nicer than what the console shows, and that's the goal with that computer, isn't it: to get more than the console for the same cost.
I looked up what ~$500 gets you at this point in time. I found, alongside other lists, this article [http://lifehacker.com/5840963/the-best-pcs-you-can-build-for-600-and-1200] (look at the $600 PC). Looking at what you can do with it, it seems to be on par with modern consoles, at best. Consoles and PCs are hard to compare anyway, because unlike a dedicated system, a PC always has an OS taking up room and power.

The monitor shouldn't be put into question here, as you don't get a monitor with a console either. And yes, you can plug your computer into your TV with the exact same cables you plug your console in with. And well, yes, monitors eventually die, something like once per 10 years of daily use, but so do TVs and all other electronics.
That wasn't exactly my point. I'm a laptop user (work-related reasons, plus I like how it saves space on my desk at home). If I were to upgrade to a gaming PC now, I would have to invest in a proper display. I wouldn't want to use the TV in my living room as a monitor. Despite being fairly modest, it's still too big to use comfortably with a PC (unless that PC is purely used for watching series and playing games with a controller). Using a console with a PC display works a lot better, by the way.


I built a top end PC for 700 dollars two years ago. I wasn't an "early adopter", but that isn't necessary to play your games at a decent framerate. So far the only thing I've actually upgraded was putting in an SSD, something I did specifically to make it easier to do other things in the background while I'm gaming (such as recording footage) and something an average gamer can do without. I'm not "feeling the age" yet.
(700 dollars? I thought you were from Europe?)

I guess my estimate was almost spot-on, then. Look, a decent framerate is relatively easy to obtain, as long as you're willing to sacrifice other things (lighting effects, draw distance, other fancy stuff). Well, and as long as the developer lets you - having a very minimal settings menu is a problem in some games. In any case, just saying you're getting a good number of fps isn't painting the whole picture. My example was extreme, but it is still the truth: my piece of low-tech shit can do 60fps. Just don't ask what games I'm playing and how I'm playing them.

Can you build things out of Lego? Yes? Congratulations, you have the skills to build a computer. You can get the knowledge on YouTube in a few hours. Confidence is something you'll have to build on your own, though. Once you put it together correctly, there is nothing to "keep running properly".
A set of Lego costs a few dozen bucks, a hundred if you want something big. A PC has hundreds of dollars worth of components. One might find that intimidating. Sure, in reality they're pretty easy to put together, but picking components (which involves a lot of research) and learning to put them together still requires time and commitment. Few people have fun with it. It's no wonder a lot of people would rather pay extra for a pre-built than take any risks. (Heck, I could have saved some money by building a PC for my parents, but it just wasn't worth the hassle. Ordering it online and having it sent to their place was much less time-consuming.)

Keeping it running properly is kind of a big and important thing. If you're computer savvy, you'll probably avoid doing anything too stupid automatically, but a lot of people are not computer savvy. That's more or less the software side of things. The hardware side... well, sometimes things go wrong in the weirdest ways. My sister once had a laptop that would try to boot up, shut down midway, try to boot up again and so on. Took a while before I figured out it was trying to draw power from a near-dead battery instead of directly from the power supply. And then there's all the fun I've had with modems... Please, don't get me started on modems...



No, I meant games released last year - 2015. My computer mentioned above handles them fine.
Again, that doesn't mean much. I'm playing games released in 2015. Half of them look like 16-bit games, and the other half isn't much more impressive, but yeah, games from 2015... yay.

To be a member of the PC Master Race you have to acknowledge the superiority of PCs. You don't even have to own one.
Superiority is relative. I only became interested in PC gaming again when it became much easier to use. GoG is nice. So is Steam (but I stay away from games that require UPlay and the like). I'm not interested in power. I'm interested in convenience: easy access entertainment. As long as the PC offers that, it'll be part of my gaming diet together with the PS4, Wii U, 3DS, PSP and more. (I love my PSP - it's such a cute little system!)

It seems to me that PC gaming is most fun if you're interested in the tech involved. I kinda get that putting your own system together and figuring out how to get the most out of it gives a sense of accomplishment. But, it's not what interests me, personally. It just seems like an exhausting rat race to me.

Nonsense. What is in there is already hitting its limits, with black bars, low resolutions and limited framerates. The tablet-level APUs in the consoles (and they are the same APUs that actually came out in tablets last year, yay for AMD cheaping out) have already reached their limit. DirectX 12 and OpenGL Next may improve performance a bit, so we will see a boost there, but other than that, there won't be much developer optimization this time around.
It seems to me that that is more the result of many games being rushed out the door: poorly optimised and filled with bugs (I heard the console versions of Just Cause 3 have some really fun memory leaks that end up adding whole minutes to loading times after just an hour of play). Nintendo has the least power to work with, but they're releasing some of the best looking and most stable games this generation.

More raw power will not solve most of the issues you see on consoles. Actually, I think it might make developers more complacent, and that wouldn't be good for PC gamers either. Poorly programmed console games are not a good foundation for ports.

Maybe the AAA business should remember that it's not advertising or word of mouth that sells games, it's actually making a good game (points at Undertale).
It'd be great if it were that simple. Of course you need a good game, but that alone is no guarantee of success. Undertale was one of many quirky retro-style games released in 2015, but it was also one of the few fortunate enough to be noticed by the right people (and that didn't happen without at least some marketing). It wouldn't have exploded the way it did if it hadn't been.

If you ever have a day to spare, go spelunking. Don't just stick to the depths of Steam, check sites such as rpgmaker.net as well. You'll be surprised by the gems hidden there. If you want a starting point: Off. If you liked Undertale, this RPG Maker game from 2008 may be to your liking.
 

TotalerKrieger

New member
Nov 12, 2011
376
0
0
Strazdas said:
Higgs303 said:
Certainly, some may genuinely prefer a lower frame rate for aesthetic reasons,
What? How are stuttering visuals aesthetically preferable? Or is that whole "cinematic" bullshit again? Because that was never, ever, in any way, shape or form true.
I was referring to film specifically. Cinematographers use all sorts of outdated filming methods and technology for artistic purposes; sometimes it is effective, other times it is not. However, for 3D applications, I would agree that a lower framerate like 30 FPS has no conceivable aesthetic value compared to 60 FPS.

Strazdas said:
Higgs303 said:
My bet is this issue will fall to the wayside once PS4 and XB1 titles start reaching 60FPS more frequently as developers make greater use of low level APIs like DX12 and Mantle, as well as the usual late-gen optimizations.
The issue is already on the "wayside". The discussion has moved to 60 fps vs 120 fps.
I am not so sure about that. For the most recent games, only the most expensive GPUs like the GTX 980 Ti, Titan X and R9 Fury X can hope to even approach the refresh rate of a 120Hz monitor at 1080p (they max out at about 100 FPS on average). For higher resolutions like 4K, the high end dual GPU configurations that are required to attain even 60FPS are well beyond the budget of most PC gamers. I don't think there are 4k monitors with more than a 75Hz refresh rate on the market right now. IMO, most PC gamers will choose to move on to higher resolutions well before a framerate higher than 60 FPS becomes the new standard. Nvidia's Pascal architecture and AMD's Arctic Islands architecture may pack enough punch to allow for reasonably priced single GPU builds that are capable of 4K/60FPS, but I don't see the standard moving beyond that for some time.



Strazdas said:
Higgs303 said:
You may be right, but look at the fairly dramatic increase in FPS that was seen in AMD GPUs in the Ashes of the Singularity DX12 benchmark. At 1080p, the R9 290X went from 28FPS in DX11 mode to 48FPS in DX12 mode. Both consoles use AMD GPUs with similar GCN architecture. If this benchmark is representative of the performance gains to be made by GCN GPUs under DX12, then console users could possibly see a significant increase in FPS.
Worth noting that Ashes of the Singularity is being designed from the ground up specifically for DirectX 12 and is the type of game that will benefit the most from it (a lot of independent AI actors constantly interacting, with literally thousands of particle effects on screen). The effect would be much smaller in, say, a driving game.
Yeah, I would have to agree. The benchmarks of the new DX12 Fable game might be more representative of typical performance gains. I guess we will know sometime later this year.
 

Patathatapon

New member
Jul 30, 2011
225
0
0
SquallTheBlade said:
I don't remember saying it didn't affect casual play. I would just rather play in the same room as someone while playing Halo than have 60 fps, if given the option. If we can have both, I'm fine! I don't care! But apparently we can't, so 60fps can go suck a dick.