Dead Rising 3 Targeting 30 Frames Per Second

Sight Unseen

The North Remembers
Nov 18, 2009
1,064
0
0
TiberiusEsuriens said:
Mr.Tea said:
TiberiusEsuriens said:
I played Xbox at 720p for years, then switched to full 1080p because I somehow missed that setting the first time around, but barely even saw a difference. When you put a full Blu-ray movie in, though, the difference is suddenly mind-boggling. We perceive frame rates differently depending on genre, on-screen action, and point of view.
Fair points for the rest of your post, but this is missing a fact: the consoles don't do 1080p at all, ever, with the exception of a PS3 playing a Blu-ray (or the short-lived HD-DVD add-on for the Xbox, perhaps). They just upscale whatever rendering resolution each game has set up to your 720p or 1080p.

They really run at such "high definition" resolutions as 1152x640, 960x544, 1040x624 and, of course, 1280x720 (which is the standard 720p). There are slight variations for each of those, adding or subtracting a couple of vertical or horizontal lines here and there, but nothing above 720p. Xbox 360 [http://beyond3d.com/showpost.php?p=1113344&postcount=3] | Playstation 3 [http://beyond3d.com/showpost.php?p=1113342&postcount=2]

And that's not even starting on the TVs themselves, which often don't even use the exact 1920x1080! Try plugging a computer into one of them to see what I mean.
Neat! I knew there were some distinctions, as connecting my PC gave a completely different image, but for a while my xbox was also running in 1080i. Any clue what that means? I'm still trying to figure it out, aside from it being magically below 1080p somehow.
I'm not an expert on this but I think the i in 1080i is "imaginary" and that it takes a 720p image and tries to extrapolate it out to be a 1080p image, or something like that. I'm probably wrong though.
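
To put those internal render resolutions in perspective: what matters for the GPU is how many pixels it has to shade every frame, and anything below 1920x1080 only reaches your screen after the console's scaler fills in the rest. A rough count in Python (purely illustrative, using the resolutions quoted above):

# Pixel counts for the internal render resolutions quoted above, compared
# with the 1920x1080 the TV actually displays after upscaling.
resolutions = {
    "1152x640": (1152, 640),
    "960x544": (960, 544),
    "1040x624": (1040, 624),
    "1280x720": (1280, 720),
    "1920x1080": (1920, 1080),
}

full_hd = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>9}: {pixels:>9,} pixels ({pixels / full_hd:.0%} of 1080p)")

So a game rendering at 1152x640 is shading only about a third of the pixels of a true 1080p image; the rest comes from the scaler.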
 

idarkphoenixi

New member
May 2, 2011
1,492
0
0
But I thought that "the power of the cloud" would improve games so that this exact kind of thing wouldn't happen?
 

TiberiusEsuriens

New member
Jun 24, 2010
834
0
0
Mr.Tea said:
The "i" and "p" in video formats stand for "interlaced" and "progressive" scan, respectively. The resolution is the same, which is why the number is the same; the difference between the two is in how the image is displayed.

When it's time to display an image, a TV or computer monitor will draw each frame line by line, starting from the top, until a complete image has been drawn; then it starts over, drawing as many complete images per second as the frame rate dictates (24, 25, 30, 50 or 60, depending on the source). Even at 25 times a second, it's fast enough that you don't notice, and it gives good picture quality. That's progressive scan.

Interlaced scan, on the other hand, starts out the same way. It draws the first line at the top of the display, but then it skips every even-numbered line until half of an image has been drawn. When it starts over, it starts at the second line and proceeds to skip all the odd-numbered lines that have already been drawn, until the other half of the previous image has been drawn. It does this faster than progressive scan (50 times a second in PAL [http://upload.wikimedia.org/wikipedia/commons/thumb/0/0d/PAL-NTSC-SECAM.svg/1000px-PAL-NTSC-SECAM.svg.png] regions, 60 in NTSC [http://upload.wikimedia.org/wikipedia/commons/thumb/0/0d/PAL-NTSC-SECAM.svg/1000px-PAL-NTSC-SECAM.svg.png]), but it only ever draws half an image at a time.

When it's done right, interlaced video is fine, but it's really a (dying) television standard from the days of CRTs. LCD or plasma displays never (as far as I know) work in interlaced mode, so when they get an interlaced video signal, such as a TV channel or your Xbox in 1080i mode, they have to do something called deinterlacing [http://en.wikipedia.org/wiki/Deinterlacing] before showing it to you. I'm sure you've seen bad or absent deinterlacing before... it looks like this:

[missing image: example of bad deinterlacing]
More reading [http://en.wikipedia.org/wiki/Interlaced_video].
This post deserves a gold star. Here you go.
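
For anyone curious what that deinterlacing step actually involves, here's a minimal sketch in Python/NumPy (my own illustration, not anything from the post above): it splits two made-up frames into the odd and even "fields" an interlaced signal would carry, then rebuilds a full frame the two classic ways, "weave" and "bob". The frame sizes and contents are invented purely to show the mechanics.

import numpy as np

height, width = 8, 8
frame_a = np.full((height, width), 1)   # scene at time t   (all 1s)
frame_b = np.full((height, width), 2)   # scene at time t+1 (all 2s)

# An interlaced signal carries only half the lines of each moment in time:
# the odd field from one frame, then the even field from the next.
odd_field = frame_a[0::2]    # lines 0, 2, 4, ... of the first image
even_field = frame_b[1::2]   # lines 1, 3, 5, ... of the next image

# "Weave" deinterlacing: stitch the two fields back together. Fine for
# static shots, but anything that moved between fields shows comb artifacts,
# because adjacent lines come from different moments in time.
weave = np.empty((height, width), dtype=frame_a.dtype)
weave[0::2] = odd_field
weave[1::2] = even_field

# "Bob" deinterlacing: keep one field and line-double it. No combing,
# but only half the vertical detail.
bob = np.repeat(odd_field, 2, axis=0)

print(weave[:, 0])   # [1 2 1 2 1 2 1 2] -> the "comb" you see on moving edges
print(bob[:, 0])     # [1 1 1 1 1 1 1 1] -> smooth, but built from half the lines

Real TVs use smarter, motion-adaptive variants of these, which is why deinterlacing quality varies so much between sets.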

 

Yuuki

New member
Mar 19, 2013
995
0
0
TiberiusEsuriens said:
Similar to the 120Hz TV I bought last year: with some things I watch there's a notable difference, with others not.
Ehhh, there are a few weird things here:
1) 120Hz TVs aren't actually refreshing the source at 120Hz; they interpolate extra "ghost" frames between the standard 60Hz frames to make motion appear smoother to the human eye. It makes videos appear smoother but does bugger-all for gaming, best left disabled. For a true 120Hz experience you would need a 120Hz monitor...on PC...and a PC that can drive 100-120fps :p
2) Even IF the TV could hypothetically run at a true 120Hz, your console can only output 30fps to it (60fps in some cases, depending on the game), so it would be pointless.
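
A rough picture of where those "ghost" frames come from: the TV synthesizes an in-between image from two real frames and shows it in the gap. Real sets do per-block motion estimation; the toy sketch below (my own illustration, not any vendor's algorithm) just blends two frames, which is the crudest version of the idea, but it shows why the inserted frames contain data that never existed in the source, and why the TV has to buffer ahead (adding input lag) to make them.

import numpy as np

# Two consecutive "real" 60Hz frames (tiny, made-up images).
frame_t0 = np.zeros((4, 4))
frame_t1 = np.full((4, 4), 10.0)    # pretend the scene got brighter

# The inserted "ghost" frame: an estimate of what the scene looked like
# halfway between the two real frames. Here it's a plain 50/50 blend;
# real TVs shift blocks of pixels along estimated motion vectors instead.
ghost = 0.5 * frame_t0 + 0.5 * frame_t1

# Output cadence at 120Hz: real, interpolated, real, interpolated, ...
output_sequence = [frame_t0, ghost, frame_t1]
print(ghost[0, 0])   # 5.0 -- a pixel value that never existed in the source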
 

mad825

New member
Mar 28, 2010
3,379
0
0
Zac Jovanovic said:
The thing is, it doesn't matter that it looks better. You won't be able to SEE the difference between 30 and 60 fps anyway, unless you can watch a direct comparison on 2 monitors at the same time.

The key is input delay: playing at 30 fps on PC brings a control input delay that is often higher than your server ping online. This is very noticeable in games where you directly control your character: first-person shooters, third-person slashers, etc. The game feels laggy, sluggish, less reactive.

If you watch someone play a shooter at 30 fps on PC it will look perfectly fluid to you, but for the guy playing it will be agony if he's used to 60+.
Seems like one big assertion. People said the same about The Hobbit.

For me, 30 FPS is playable but not enjoyable. The thing is that unless the GPU comfortably exceeds the resource demand, there will be spikes and dips in FPS. It won't be constant, and lag appears around 30 FPS and gets worse from there.
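
To put numbers on the input-delay point quoted above: a frame at 30 fps takes twice as long as one at 60 fps, and a button press usually travels through a few frames' worth of pipeline (simulate, render, scan out) before you see the result. The three-frame pipeline below is an assumed figure for illustration, not a measurement of any particular game:

# Back-of-the-envelope button-to-screen latency at different frame rates.
# PIPELINE_FRAMES is an assumption (simulate + render + scan out); the real
# depth varies by engine, buffering mode and display.
PIPELINE_FRAMES = 3

for fps in (30, 60, 120):
    frame_time_ms = 1000.0 / fps
    latency_ms = PIPELINE_FRAMES * frame_time_ms
    print(f"{fps:3d} fps: {frame_time_ms:4.1f} ms per frame, "
          f"~{latency_ms:5.1f} ms button-to-screen")

At 30 fps that rough model lands around 100 ms, which is indeed in the same ballpark as a decent online ping; 60 fps halves it.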
 

Easton Dark

New member
Jan 2, 2011
2,366
0
0
CrossLOPER said:
The consoles are not even out yet and they are targeting the base minimum.
j-e-f-f-e-r-s said:
You shouldn't be targeting 30fps with next-gen hardware. 30fps should be the bare minimum.
Bam and bam, two posts right next to each other.

There had better be a million zambamboes on screen at once, each with a differently modeled face, or I'm going with the lazy coding excuse.
 

MiracleOfSound

Fight like a Krogan
Jan 3, 2009
17,776
0
0
Daystar Clarion said:
Next gen consoles, folks.


I never really noticed until I switched to PC, but when I, say, spend a few hours playing Borderlands 2 on PC, then go to play The Last of Us on my PS3, that FPS drop is really noticeable; I can't stop noticing it. Especially when any console game hits a rough patch and suffers from even more FPS drop.
When I played PC Borderlands 2 and then Xbox 360 Borderlands 2, I realised it was REALLY time for a new console gen. They were just leagues apart. The Xbox version was almost depressingly ugly and felt really choppy/sloppy to control in comparison. I really hope devs focus on high frame rates this gen. It's been crucial to the success of you-know-what.
 

TiberiusEsuriens

New member
Jun 24, 2010
834
0
0
Yuuki said:
TiberiusEsuriens said:
Similar to the 120Hz TV I bought last year: with some things I watch there's a notable difference, with others not.
Ehhh, there are a few weird things here:
1) 120Hz TVs aren't actually refreshing the source at 120Hz; they interpolate extra "ghost" frames between the standard 60Hz frames to make motion appear smoother to the human eye. It makes videos appear smoother but does bugger-all for gaming, best left disabled. For a true 120Hz experience you would need a 120Hz monitor...on PC...and a PC that can drive 100-120fps :p
2) Even IF the TV could hypothetically run at a true 120Hz, your console can only output 30fps to it (60fps in some cases, depending on the game), so it would be pointless.
On most TVs now you can turn "TruMotion" off; that's what makes 120Hz TVs look and feel more like 60Hz ones. Even though my PC/Blu-ray output isn't true 120fps, the higher refresh rate dramatically smooths out the horrendously low frame rates we would otherwise have to deal with. I'll take what I can get ^.^
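
Separately from interpolation, there is one real benefit a 120Hz panel has with Blu-ray film: cadence. Film is 24 fps, which doesn't divide evenly into 60, so a 60Hz set has to hold alternate frames for 3 and then 2 refreshes (the classic 3:2 pulldown judder), whereas 120 is exactly 24 x 5 and every frame gets an equal hold. A small sketch of the two repeat patterns (the helper function is my own, just for illustration):

# How many panel refreshes each 24fps film frame gets on a 60Hz vs 120Hz panel.
# 60 / 24 = 2.5 -> frames alternate between 2 and 3 refreshes (pulldown judder).
# 120 / 24 = 5  -> every frame is held for the same amount of time.
def cadence(film_fps, refresh_hz, n_frames=6):
    repeats = []
    for i in range(n_frames):
        start = (i * refresh_hz) // film_fps
        end = ((i + 1) * refresh_hz) // film_fps
        repeats.append(end - start)
    return repeats

print(cadence(24, 60))    # [2, 3, 2, 3, 2, 3] -> uneven hold times = judder
print(cadence(24, 120))   # [5, 5, 5, 5, 5, 5] -> even hold times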
 

Ed130 The Vanguard

(Insert witty quote here)
Sep 10, 2008
3,782
0
0
Hagi said:
I'd say it really depends on why it's 30 FPS.

If you're putting twice the amount of stuff on-screen, more enemies, more weapons, more explosions, more terrain, etc., then you're going to take twice as long to render a single frame and thus halve your FPS. I'd say that's fine if it improves the gameplay. I'll sacrifice a bit of fluidity for a game that's much more filled.

If you've got badly optimized code (no, the start of the console cycle is not an excuse; this gen is pretty much normal PC architecture), then there's no excuse.
I have a feeling it's a combination of both.

Still, aiming for 30fps at the start of a console generation is pretty sad.
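
Hagi's first scenario is just frame-budget arithmetic: 60 fps gives you roughly 16.7 ms to finish everything in a frame, 30 fps gives you 33.3 ms, so doubling the per-frame work pushes you from one budget into the other. The sketch below assumes render cost scales linearly with object count, and the per-object and fixed costs are invented numbers purely for illustration:

# Frame-budget arithmetic behind "twice the stuff on screen = half the FPS".
# Linear scaling is a simplification; real engines have non-linear costs.
COST_PER_OBJECT_MS = 0.016   # invented figure, illustration only
FIXED_COST_MS = 0.7          # invented per-frame overhead

for objects in (1000, 2000, 4000):
    frame_ms = FIXED_COST_MS + objects * COST_PER_OBJECT_MS
    print(f"{objects:5d} objects -> {frame_ms:5.1f} ms per frame "
          f"-> {1000.0 / frame_ms:5.1f} fps")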
 

Grabehn

New member
Sep 22, 2012
630
0
0
Wait... this is coming out on the next "console generation" and 30 fps is what they're aiming for? Is that it? With so many people repeating that moronic claim that "next-generation consoles will kill PC gaming", I would've expected companies wouldn't be struggling to keep a game running at 30 fps...
 

spartandude

New member
Nov 24, 2009
2,721
0
0
MiracleOfSound said:
Daystar Clarion said:
Next gen consoles, folks.


I never really noticed until I switched to PC, but when I, say, spend a few hours playing Borderlands 2 on PC, then go to play The Last of Us on my PS3, that FPS drop is really noticeable; I can't stop noticing it. Especially when any console game hits a rough patch and suffers from even more FPS drop.
I noticed it when playing Skyrim. Once I upgraded my computer (and could play Skyrim on highest settings with HD mods), I was looking at something not only nicer to watch but also smoother. I then went back to the 360 version and the difference was incredible.
I also realised that in League of Legends you can lock the frame rate at 30 fps (if your machine is crap), so out of curiosity I did, and my god it was terrible.