The Almighty Aardvark

Master Racier

Sep 13, 2009
1,589
0
0
Frankly, the people who claim that they can enjoy anything below 120fps, 1440p are just lying to themselves. Anything below that is objectively not fun

fix-the-spade said:
However, around 50-60fps is where human beings stop perceiving the flickering between images. This is why movies are projected at 72fps (24 times 3, to be exact), because actually projecting 24 images per second makes films look horrific, whilst most TV is broadcast at 60fps (30x2) or 50fps (25x2).

If you watch a film on a projector at an actual 24fps you can see the black spaces between images, it's awful. On games at least the previous image stays in place until the new one is rendered, so flicker at 30fps isn't as much of an issue.

Don't worry, there is plenty more time for increasing frames per second and resolutions to keep those master race wallets empty.
I've heard a number of people involved in graphics saying that motion blur at 30fps looks better than no blur at 60fps. Can't say I have enough experience to evaluate it one way or the other, but it deals with the issue of gaps between objects from one frame to the next, which is a large part of what our brains don't like about low frame rates.
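The shutter arithmetic in the quoted post is easy to sanity-check. A quick sketch in Python (the two- and three-blade shutter counts are the commonly cited ones, not something from this thread):

```python
# Film runs at 24 fps, but a multi-blade projector shutter flashes each
# frame more than once, so the flash rate clears the roughly 50-60 Hz
# flicker-fusion threshold mentioned above.

def flash_rate(fps, blade_count):
    """Flashes per second when each frame is shown blade_count times."""
    return fps * blade_count

def frame_interval_ms(fps):
    """How long each unique frame stays on screen, in milliseconds."""
    return 1000.0 / fps

print(flash_rate(24, 3))                 # three-blade shutter: 72 flashes/s
print(flash_rate(24, 2))                 # two-blade shutter: 48 flashes/s
print(round(frame_interval_ms(24), 1))   # each film frame lasts ~41.7 ms
print(round(frame_interval_ms(60), 1))   # a 60fps game frame lasts ~16.7 ms
```

Note the flash rate only fixes flicker; each unique image still sits there for ~42 ms, which is why 24fps motion still judders even when it doesn't flicker.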
 

RandV80

New member
Oct 1, 2009
1,507
0
0
Hah, I've pretty much always been a PC peasant! It makes for a humorous comic, though: while some PC gamers like to rag on console gamers, and there can certainly be a dick-measuring contest, turning on fellow PC gamers for being under-spec'd generally isn't a thing (from what I've seen, at least)... unless maybe someone is trying to game on a laptop.

About the only time it ever may have been a big deal was in ye olden days, when 56K modems were still around and you had the dreaded 'lag' users in multiplayer games.
 

Requia

New member
Apr 4, 2013
703
0
0
Renegade-pizza said:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget
It's untested; the only honest-to-god blinded experiment (by performance rather than survey) for video games only checked sub-30, 30, and 60 fps in Quake 3. (And given how the theoretical model works, that it has to do not only with watching but with needing to rapidly respond, some games may not see improvement at the same rates either.)
 

BX3

New member
Mar 7, 2011
659
0
0
The thought of 120fps honestly makes me giddy... but I know that, as a peasant, I won't be able to afford the prerequisites for a long time. I do happen to own a 1440p monitor though. Shit's cash... but the color sucks. Maybe cuz it's old.
 

EHKOS

Madness to my Methods
Feb 28, 2010
4,815
0
0
Sure, we can't go to 1080p, but give the PS4 credit: it runs MGS:V at 60.
 

fix-the-spade

New member
Feb 25, 2008
8,639
0
0
The Almighty Aardvark said:
I've heard a number of people involved in graphics saying that motion blur at 30fps looks better than no blur at 60fps. Can't say I have enough experience to evaluate it one way or the other, but it deals with the issue of gaps between objects from one frame to the next, which is a large part of what our brains don't like about low frame rates.
I think that's code for at 60fps everyone can see the bondo and duct tape.

In movies I'm inclined to agree, motion blur is part of the shot but should be used sparingly. One of the things that struck me about The Hobbit's high frame rate in the cinema was how much more obvious the sets and effects looked compared to the same movie at the normal frame rate.

In games the opposite applies, the more frames the better.
 
The Almighty Aardvark

Sep 13, 2009
1,589
0
0
fix-the-spade said:
I think that's code for at 60fps everyone can see the bondo and duct tape.

In movies I'm inclined to agree, motion blur is part of the shot but should be used sparingly. One of the things that struck me about The Hobbit's high frame rate in the cinema was how much more obvious the sets and effects looked compared to the same movie at the normal frame rate.

In games the opposite applies, the more frames the better.
Motion blur definitely gives you a lot more freedom to cut corners in animation.

One thing worth noting: motion blur in movies is usually better implemented than in games. Movies don't have to work within real-time constraints, so they can be a lot more accurate. Games tend to use approximations, so the reason motion blur doesn't work as well in games could very well be that it actually doesn't look as good.
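For what it's worth, the crudest real-time approximation is just blending each new frame with the previous output, which is nothing like the shutter-integrated blur an offline film renderer computes. A toy sketch in plain Python (a 1D "image" with a bright dot moving one pixel per frame; the persistence value is made up for illustration):

```python
def frame_blend_blur(prev_output, current_frame, persistence=0.5):
    """Exponential frame blending: cheap, real-time, and inaccurate --
    every moving object leaves a fading trail regardless of its speed."""
    return [persistence * p + (1.0 - persistence) * c
            for p, c in zip(prev_output, current_frame)]

out = [0.0] * 5
for x in range(5):
    frame = [0.0] * 5
    frame[x] = 1.0          # a single bright pixel moving left to right
    out = frame_blend_blur(out, frame)

# The dot ends up smeared into a fading trail behind its current position.
print(out)  # [0.03125, 0.0625, 0.125, 0.25, 0.5]
```

Real engines use smarter tricks (per-pixel velocity buffers and the like), but they're still estimating the blur rather than actually integrating over a shutter interval the way an offline renderer can.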

ravenshrike said:
I'm sorry, but Monty Python references are dealt with severely on this site. Someone fetch me the comfy chair!
 

Darth_Payn

New member
Aug 5, 2009
2,868
0
0
Reinterpretation: Erin is not really a member of the PC Master Race, but an infiltrator sent by the console peasants to mess with the PCMR from the inside, and those two dolts were too stupid to know any better.
ravenshrike said:


There are three Pillars to the PC Master Race, they are FPS, Resolution, Modifications, and Backwards Compatibility.

Four, there are four Pillars to the PC Master Race, they are FPS, Resolution, Modifications, Backwards Compatibility, and Free Multiplayer.

Five, there are five Pillars to...
Blimey! I wasn't expecting a PC Inquisition!
 

Rebel_Raven

New member
Jul 24, 2011
1,606
0
0
Haha, the "wars." The only way to win is not to be a part of it.

<youtube=aDMsGl_XxTk>

Wonderful art as always.
 

Jadedvet

New member
Jul 1, 2013
48
0
0
It's been a long time since my GTX 670 was enough to meet party qualifications. What's worse, I hear they're considering 4K as a minimum in a year or so.

Both Nvidia and AMD have some good hardware coming this year; maybe I can reach upper middle class without going bankrupt.
 

IamLEAM1983

Neloth's got swag.
Aug 22, 2011
2,581
0
0
That's all well and good, but isn't there a point where throwing more FPS at the screen amounts to nothing, visually? I mean, we're all fairly limited meatbags with jelly orbs in place of visual apparatuses; there's no way the human eye can appreciate that high a framerate. There just might be a point where framerate matters more in terms of responsiveness, but even that's pretty subjective.
 

dyre

New member
Mar 30, 2011
2,178
0
0
This comment isn't specifically directed at today's strip (though it certainly applies), but I think the quality/humor of this comic has been improving as of late. Keep it up!
 

TomWest

New member
Sep 16, 2007
41
0
0
The Almighty Aardvark said:
Frankly, the people who claim that they can enjoy anything below 120fps, 1440p are just lying to themselves. Anything below that is objectively not fun
Heh. I remember doing a test a little while back on a friend who had a high end PC. I could tell the difference between 30 and 60 FPS, but guessed which one was the 60 FPS only slightly better than chance. Could not notice any difference above that.

But then, I also annoyed my wife by pointing out how much clearer her new Retina iPad was in comparison to the previous generation non-Retina model she also owned. She frostily pointed out that I was holding the *old* iPad.

Of course, sound is a different matter... Wait, no it isn't. My friend in university wanted to repossess the older high-ish end stereo he sold me (we were roommates and he was an audiophile) when he walked into my room and realized that the speaker wire had fallen out of one of my speakers. I'd been playing in mono for weeks and hadn't noticed.

Moral of the story, not all of us notice lower resolutions/FPS. I had great fun tormenting my brother-in-law by playing SD on our HD TV (although I did notice a difference when we finally got an HD signal.)
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Diablo1099 said:
....Man...How the hell do they get up to 120 FPS? Most Developers don't even go that high O_O
Yes, but it's not about whether or not it improves the game, it's about benchmarks!
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
TomWest said:
Heh. I remember doing a test a little while back on a friend who had a high end PC. I could tell the difference between 30 and 60 FPS, but guessed which one was the 60 FPS only slightly better than chance. Could not notice any difference above that.

But then, I also annoyed my wife by pointing out how much clearer her new Retina iPad was in comparison to the previous generation non-Retina model she also owned. She frostily pointed out that I was holding the *old* iPad.

Of course, sound is a different matter... Wait, no it isn't. My friend in university wanted to repossess the older high-ish end stereo he sold me (we were roommates and he was an audiophile) when he walked into my room and realized that the speaker wire had fallen out of one of my speakers. I'd been playing in mono for weeks and hadn't noticed.

Moral of the story, not all of us notice lower resolutions/FPS. I had great fun tormenting my brother-in-law by playing SD on our HD TV (although I did notice a difference when we finally got an HD signal.)
I've had fun tricking serious audiophiles before with various techniques. One of the easiest is playing a digital recording that includes tape or record pops and hiss and telling them it's vinyl or cassette. I've yet to have a single audiophile tell the difference between high-quality digital and the purity of analogue. They may exist, but apparently not where I live.

Jadedvet said:
It's been a long time since my GTX 670 was enough to meet party qualifications. What's worse, I hear they're considering 4K as a minimum in a year or so.

Both Nvidia and AMD have some good hardware coming this year; maybe I can reach upper middle class without going bankrupt.
If you're thinking about money, then you're simply not party material.

Guards!