Jimquisition: Ubisoft Talks Bollocks About Framerate And Resolution

Aaron Sylvester

New member
Jul 1, 2012
786
0
0
Ubisoft is only echoing what the general console playerbase is already happy with.

According to the thread I created, the majority of people don't care about 60fps and are happy to play at 30fps: http://www.escapistmagazine.com/forums/read/9.862342-Update-2-How-why-are-console-gamers-satisfied-with-30-fps

30fps takes literally half the resources of 60fps, resources which can instead be sunk into visuals. Investing in visuals also has additional big benefits of better-looking gameplay trailers and video reviews - since the majority of buyers get their knowledge of the game from trailers and reviews (and videos run at 30fps) the actual framerate of the game doesn't need to be any higher than that.
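To put numbers on "half the resources": a quick frame-budget calculation (a back-of-envelope illustration, not anything from Ubisoft):

```python
# Frame-budget arithmetic: halving the target framerate doubles the
# time the CPU/GPU have to render each frame.
for fps in (30, 60):
    budget_ms = 1000.0 / fps
    print(f"{fps} fps -> {budget_ms:.1f} ms per frame")
```

At 30fps a frame can take ~33.3 ms; at 60fps it must finish in ~16.7 ms, which is where the extra headroom for visuals comes from.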

There's no reason for devs/publishers like Ubisoft to aim for 60fps on consoles. There is simply not enough demand from console gamers for it.

So who can you blame?
 

J Tyran

New member
Dec 15, 2011
2,407
0
0
LaochEire said:
PS3 – 2.43 million
X360 – 2.17 million
PS4 – 1.22 million
Xbox One – 0.55 million
PC – 0.38 million
Wii U – 0.12 million

So inferior and less competitive they outsell PC by some margin. Those are January sales figures by the way, it stands at 0.47m now on PC, but as long as you get to take a free swipe at consoles you're happy Westonbirt.
Wait, what? Are those supposed to be game sales statistics or pre-built PC sales statistics? Either way, you are misunderstanding them or being disingenuous.

Most PC gamers build their machines; if you look at the raw sales statistics, more components are sold (profitably, too, not at a loss or close to margin like consoles) than current-gen consoles. More PC gamers are building and upgrading their machines than people are buying current-gen consoles. Furthermore, when it comes to games, the PC platform earns the most revenue and profit for the developers and publishers that can exploit it.

The "AAA" bullshit sector often has lower sales on the PC platform because PC gamers are fed up with it; other publishers work hard to give customers the services and products they want, and earn incomes that in some cases completely eclipse the revenue from an entire "AAA" publisher's catalogue with just a single game.
 

RicoADF

Welcome back Commander
Jun 2, 2009
3,147
0
0
Charcharo said:
*looks at Watch Dogs on PC*

Nope. Am certain it is just shit coding/optimization.

Metro looks MUCH better. Not a little, MUCH better. The only thing it has less of is NPCs and theoretical space around the character. The actual AI probably won't be that good on Unity though.

4A achieved more, with the same hardware, at higher resolutions and 60 fps. With less money and less people on the project.
Metro is a corridor FPS; Watch Dogs is a sandbox like Grand Theft Auto. They're two very different genres. Sandbox games usually have less impressive visuals with higher demands because they have so much more going on around them.
 

JayRPG

New member
Oct 25, 2012
585
0
0
I love Tales of Symphonia; it is one of my favourite games. I was so excited for Tales of Symphonia Chronicles (the HD re-release on the PS3), and then I started playing and noticed something really wrong. Having played through the Gamecube version 4 or 5 times, I could tell something was not right with the HD remake.

I later discovered it was based on the PS2 port, which was locked at 30fps (the Gamecube version is 60fps), and it is almost unplayable for me. In fact, my last playthrough was on my Gamecube (which I still have), and I would still prefer to play my Gamecube version running at 60fps, in all its blurry 480p glory, on my 55" UHD TV than the more graphically sound HD remake locked at 30fps.

You spend more than half the game in battle sequences, which are basically action-adventure/hack-and-slash. It is not an FPS, and it does not feel "more cinematic" at 30fps, nor does it feel better in any way. Hell, it barely feels acceptable after playing it at 60fps.
 

Mikeyfell

New member
Aug 24, 2010
2,784
0
0
Ultratwinkie said:
Mikeyfell said:
I don't want to piss on everyone's perception of what they "personally" consider to be "objectively" better, but I can't stand looking at things that are 60 FPS.

I can always tell the difference, and it always looks worse.
So you can take your "objective fact" and shove it. I'd think that Jim, of all people, would have the common decency not to call his opinion an objective fact and claim science supports something that is obviously his opinion.

Thank god for the improper usage of the word objective
http://gfycat.com/PaltryMiniatureConey

Even I can tell how jerky 30 fps is on that.

So something smoother and more responsive makes the game look bad, am I right? It doesn't even affect graphics, so "it makes it look worse" is just an excuse.

It means the screen is updated more often, and that means a more responsive game. It makes the game less vulnerable to lag; it doesn't speed the game up or anything.

There is no good argument against 60 fps. It's all excuses to cover for bad developers who can't be assed to make a stable game on better hardware.
I think 60 FPS looks awful, that's my opinion.
You think 60 FPS looks better, that's your opinion.
There is nothing objective or factual about either of those sentiments.

When I watch things at 60 FPS, it looks like I'm watching a puppet show in the back of a moving truck.

You can't claim something is better or worse when you're talking about aesthetic preference.
There are people out there who think 24 is the one true frame rate.

Jim might have been joking when he said 60 FPS is objectively better, but if he was, this was the first time I couldn't tell he was joking. You'd think that a guy who gets accused of being "bias" so often wouldn't use the word objective incorrectly, even for the sake of a joke.


And yes there are people who would sacrifice frame rate for draw distance and polygon count.
And increasing the responsiveness of a game by 1/60 of a second will only ever matter to people who play fighting games professionally.
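For what it's worth, the arithmetic behind that 1/60th figure can be sketched with a toy latency model (a deliberate oversimplification of my own: input is sampled at the start of a frame and displayed one frame later):

```python
# Toy input-lag model: input sampled at the start of a frame is
# displayed one full frame later, so worst-case lag is ~2 frame times.
def worst_case_lag_ms(fps):
    frame_ms = 1000.0 / fps
    return 2 * frame_ms

print(worst_case_lag_ms(30))  # ~66.7 ms
print(worst_case_lag_ms(60))  # ~33.3 ms
```

Under this model the gap between 30 and 60 fps is about 33 ms, i.e. two 60ths of a second rather than one; real pipelines add driver and display latency on top.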
 

SamTheNewb

New member
Apr 16, 2013
53
0
0
As a matter of fact, higher framerates allow better reproduction of motion and lessen temporal aliasing, producing a higher-fidelity result. But I guess temporal aliasing is only an issue for stuff that moves really fast.

Also, response times are hugely important to VR people (especially those doing head tracking), so you shouldn't discount latency.
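The temporal-aliasing point can be shown with the classic wagon-wheel effect (the spin and sample rates below are made-up numbers for illustration):

```python
# Wagon-wheel aliasing: a wheel spinning at 28 rev/s sampled at 30 Hz
# appears to crawl backwards; sampled at 60 Hz (above the 2x Nyquist
# rate) its true motion is preserved.
def apparent_rev_per_s(spin_rev_s, sample_hz):
    step = (360.0 * spin_rev_s / sample_hz) % 360.0  # degrees per sample
    if step >= 180.0:                                # wraps -> looks reversed
        step -= 360.0
    return step * sample_hz / 360.0

print(apparent_rev_per_s(28, 30))  # -2.0 (slow reverse spin)
print(apparent_rev_per_s(28, 60))  # 28.0 (correct direction and speed)
```

The same sampling logic applies to any fast on-screen motion, which is why doubling the framerate reduces temporal aliasing.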
 

JET1971

New member
Apr 7, 2011
836
0
0
Mikeyfell said:
There is nothing objective or factual about either of those sentiments.
Not true: you are setting your subjective opinion against an objective fact. More frames per second does in fact improve gameplay. Games at 15 FPS would be unplayable, 30 fps is playable but not optimal, and 60 fps is optimal because it makes up for the lack of the motion blur that TV and movies use to give the illusion of smooth motion.
 

pokepuke

New member
Dec 28, 2010
139
0
0
Mikeyfell said:
I think 60 FPS looks awful, that's my opinion.
You think 60 FPS looks better, that's your opinion.
There is nothing objective or factual about either of those sentiments.

When I watch things at 60 FPS it looks like I'm watching a puppet show in the back of a moving truck

You can't claim something is better or worse when you're talking about aesthetic preference.
There are people out there who think 24 is the one true frame rate.

Jim might have been joking when he said 60 FPS is objectively better but if he was this was the first time I couldn't tell he was joking. You'd think for a guy who gets accused of being "bias" so often, he wouldn't use the word objective incorrectly even for the sake of a joke.


And yes there are people who would sacrifice frame rate for draw distance and polygon count.
And increasing the responsiveness of a game by 1/60 of a second will only ever matter to people who play fighting games professionally.
Yes, it is objectively better. I don't see how you could think it could be argued against, or would be a joke in any way. It is literally more information on the screen for you to utilize. The video is smoother, you are shown more detail when action is happening, you miss less of the action from those missing frames, and you can react more proportionately to the action while it is happening, instead of reacting to whichever snapshots you were given while enemies are attacking you or you are engaging them.

All I can suspect is that people are used to TV and movies being at such low framerates (with all their tricks and gimmicks to smooth it out). Then when it is faster, and possibly without those gimmicks, or the gimmicks actually break, it is something you are not used to processing. Exactly why PC players don't like low framerates, because they are used to the purity and fidelity of high and smooth frames being blasted into their corneas.

Plus it is possible that some of those little errant camera movements (during fake shaky-cam and such) would be diminished at a lower framerate. You wouldn't notice it bouncing around as much because the low amount of frames would be clipping off bits of motion from the bounces or whatever.


This same kind of debate happened over The Hobbit, and people complained about "too much detail", even though that is all in the HD department, not fps. The higher framerate mainly helped the action and sweeping scenery shots, so they weren't blurry as fuck. But, to the critics' credit (even though they never actually mentioned this), I think The Hobbit wasn't shot correctly, because sometimes when people were talking it would have that effect where they seem to be moving too quickly, as if the video is trying to catch up with the audio. I think the framerate was actually messed up, so yes, it was too fast: the beginning of a shot was ahead, the end of the shot was behind, and the only correct point would be right in the middle. I didn't notice this in Hobbit 2.
 

Kohen Keesing

New member
Oct 6, 2014
40
0
0
irishda said:
If it's not inconsequential, then why does the example you provided (a twitch game) show almost no difference with regard to framerate until it reaches the low teens and below? I mean, it WOULD be really annoying if they locked it in at 5 fps. I might be pretty put off by that.
Well, all I can say is that if, for whatever reason, be it your eyes or your monitor's refresh rate, you can't tell the difference between 30 and 60, then that's lucky, or at least convenient. But for those of us who do this shit all day erry day, it's spitefully annoying/frustrating in that range.
 

Strelok

New member
Dec 22, 2012
494
0
0
Arnoxthe1 said:
Specs:

Intel Core i5-4200M (2.5 GHz, 4 CPUs)
Intel HD Graphics 4600 (Better than you think.)
1366x768
4 GB of RAM
Windows 7 Pro

What's funny though is that it can run Serious Sam 3 pretty well, and even Skyrim decently enough, so I suspect it's probably just Far Cry 2 being poorly optimized, if optimized at all.
Is it better than I think? Somehow I doubt it, as it ranks as a mid-to-low-end graphics option, #242 on the list if you look. It struggles with anything but the lowest settings at 1024x768 in games from 2005 (World of Warcraft). Sounds like you just didn't know what you were buying, or what to ask, like the most simple question: "Is the video option on this notebook integrated, and does it share VRAM with the system?"

At least we get a scale to see what "decent" means to you.

The Elder Scrolls V: Skyrim (2011)
low, 1280x720: 29.3 / 31.9 / 33 / 48 / 51.1 → ~39 fps
med., 1366x768: 14 / 15 / 15.5 / 23 / 24.2 → ~18 fps
high, 1366x768: 8 / 8.4 / 13 / 13.8 → ~11 fps
ultra, 1920x1080: 4 / 7.3 → ~6 fps

http://www.notebookcheck.net/Intel-HD-Graphics-4600.86106.0.html
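Averaging the per-run samples quoted above confirms the rough per-setting averages:

```python
# Sanity-check the rough fps averages quoted from the benchmark samples.
samples = {
    "low 1280x720":    [29.3, 31.9, 33, 48, 51.1],
    "med 1366x768":    [14, 15, 15.5, 23, 24.2],
    "high 1366x768":   [8, 8.4, 13, 13.8],
    "ultra 1920x1080": [4, 7.3],
}
for setting, fps in samples.items():
    print(setting, round(sum(fps) / len(fps)), "fps")  # 39, 18, 11, 6
```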
 

J Tyran

New member
Dec 15, 2011
2,407
0
0
RicoADF said:
Charcharo said:
*looks at Watch Dogs on PC*

Nope. Am certain it is just shit coding/optimization.

Metro looks MUCH better. Not a little, MUCH better. The only thing it has less of is NPCs and theoretical space around the character. The actual AI probably won't be that good on Unity though.

4A achieved more, with the same hardware, at higher resolutions and 60 fps. With less money and less people on the project.
Metro is a corridor FPS; Watch Dogs is a sandbox like Grand Theft Auto. They're two very different genres. Sandbox games usually have less impressive visuals with higher demands because they have so much more going on around them.
Except this argument falls apart when you consider a game like Sleeping Dogs: it's an open-world sandbox, yet it looks far better and runs far better than Watch Dogs, and it has some exceptionally pretty lighting effects. No, it doesn't look quite as drop-dead gorgeous as Metro: Last Light and Metro 2033 Redux (the one using Last Light's engine), because of the reasons you mention and the lack of PhysX, which makes a real difference to the particle effects and environmental damage models. It's also a two-year-old game and uses far fewer resources. At launch there were a few niggles with Nvidia GPUs, since it was an AMD-developed game, but instead of whining about how unfair it was, like AMD did with Watch Dogs, Nvidia worked to fix their drivers to suit it.

All in all, it's a well-optimised, great-looking game.
 

Mikeyfell

New member
Aug 24, 2010
2,784
0
0
pokepuke said:
Mikeyfell said:
I think 60 FPS looks awful, that's my opinion.
You think 60 FPS looks better, that's your opinion.
There is nothing objective or factual about either of those sentiments.

When I watch things at 60 FPS it looks like I'm watching a puppet show in the back of a moving truck

You can't claim something is better or worse when you're talking about aesthetic preference.
There are people out there who think 24 is the one true frame rate.

Jim might have been joking when he said 60 FPS is objectively better but if he was this was the first time I couldn't tell he was joking. You'd think for a guy who gets accused of being "bias" so often, he wouldn't use the word objective incorrectly even for the sake of a joke.


And yes there are people who would sacrifice frame rate for draw distance and polygon count.
And increasing the responsiveness of a game by 1/60 of a second will only ever matter to people who play fighting games professionally.
Yes, it is objectively better. I don't see how you could think it could be argued against, or would be a joke in any way. It is literally more information on the screen for you to utilize. The video is smoother, you are shown more detail when action is happening, you miss less of the action from those missing frames, and you can react more proportionately to the action while it is happening, instead of reacting to whichever snapshots you were given while enemies are attacking you or you are engaging them.

All I can suspect is that people are used to TV and movies being at such low framerates (with all their tricks and gimmicks to smooth it out). Then when it is faster, and possibly without those gimmicks, or the gimmicks actually break, it is something you are not used to processing. Exactly why PC players don't like low framerates, because they are used to the purity and fidelity of high and smooth frames being blasted into their corneas.

Plus it is possible that some of those little errant camera movements (during fake shaky-cam and such) would be diminished at a lower framerate. You wouldn't notice it bouncing around as much because the low amount of frames would be clipping off bits of motion from the bounces or whatever.


This same kind of debate happened over The Hobbit, and people complained about "too much detail", even though that is all in the HD department, not fps. The higher framerate mainly helped the action and sweeping scenery shots, so they weren't blurry as fuck. But, to the critics' credit (even though they never actually mentioned this), I think The Hobbit wasn't shot correctly, because sometimes when people were talking it would have that effect where they seem to be moving too quickly, as if the video is trying to catch up with the audio. I think the framerate was actually messed up, so yes, it was too fast: the beginning of a shot was ahead, the end of the shot was behind, and the only correct point would be right in the middle. I didn't notice this in Hobbit 2.
It's not objectively better;
there is no such thing as objectively better when you're talking about a preference.

Just because you like 60 doesn't make it better;
majority is not the measuring stick of objectivity.

I prefer 30, and I am not wrong about my own opinion. It is not objectively better, I don't care what PC gamers like because I play on a TV, and when I see video running at 60 FPS it looks horrid.
 

Arcane Azmadi

New member
Jan 23, 2009
1,232
0
0
Every time the magic number "60 FPS" comes up in gaming discussion, I always think the same thing: F-Zero X ran at 60 FPS with 30 vehicles on the screen at once, at BLISTERING speeds, on the N64 way back in 1998. You have to do more than THAT to impress me.
 

Sack of Cheese

New member
Sep 12, 2011
907
0
0
Wow, Jim is married. First time I have heard about it. I wonder what she thought of all the naughty dragon dildos he got from fans. Ahh, yes, framerate stuff. I have been wondering: why 30 vs 60? Why not 40fps? 50fps? 70fps? Why is 60 the new standard?
 

EXos

New member
Nov 24, 2009
168
0
0
While I agree that a stable FPS matters more than the highest possible FPS, PC hardware is driven to get a stable 60fps.
And yes, more is better; no argument, as the human eye can distinguish framerates up to an impressive speed (at least 200 fps, and expected to be beyond that).

http://amo.net/NT/02-21-01FPS.html
http://www.100fps.com/how_many_frames_can_humans_see.htm
http://www.cameratechnica.com/2011/11/21/what-is-the-highest-frame-rate-the-human-eye-can-perceive/
"Links with evidence? What is this? Inconceivable!"

What irks me the most is that they put limits in the game, while one of the biggest strengths of the PC is that it is ever growing in power and speed.
Just like the original Crysis, which wailed on graphics cards until they burned, an uncapped game gets better looking over the years. Locking something is extremely stupid for a PC game and just shows a lazy developer.

Just because Sony and Microsoft dropped the ball with this generation doesn't mean the PC should suffer the dumbing down too.
Want to play at 30? Set it up yourself, but NEVER, EVER lock an option.
"30fps is more cinematic?" Oh f*** off...
 

Mikeyfell

New member
Aug 24, 2010
2,784
0
0
Charcharo said:
It's not objectively better;
there is no such thing as objectively better when you're talking about a preference.

Just because you like 60 doesn't make it better;
majority is not the measuring stick of objectivity.

I prefer 30, and I am not wrong about my own opinion. It is not objectively better, I don't care what PC gamers like because I play on a TV, and when I see video running at 60 FPS it looks horrid.
60 fps is better on TVs too... Just saying...

Let's, just for a second, assume that some people may really prefer 30 fps over 60 (both fully locked). Does this mean you like 20 more than 30? Have you tried 15? Maybe it will be better for you?

Anyways, last time I checked, most well-made PC games allow for 30, 60, 75, 120, and even 45 and 144 fps locks. So, again, PC has you covered.

I've never seen 45 FPS, I might try that, but 30 is the sweet spot for me.

My internet is really shitty, so sometimes I have to make do with 12 FPS, which looks bad, but I can at least process the choppiness. When I see something at 60 or higher (I've played Watch Dogs on a 240 Hz TV that simulated 120 frames; I had to leave the room), everything looks like it's sliding around the screen.
Watching animation at 60 is sickening because, in my experience, in-between frames typically have motion added to them to smooth out the transition, but that doesn't happen at 60 because the motion is twice as smooth, so it just looks like a shape being moved around.

And live-action stuff in 60 is (this is completely pathetic, so don't mind me) frightening.

You see motion blur in real life, like when you wave your hand in front of your face really fast. So when I see the smoothness of 60 (or more) FPS, I know there should be motion blur but there isn't, and that makes reality look like the uncanny valley; it looks like a photo-realistic cartoon, and it's scary as hell.

CG in 60 looks watery.
2D animation in 60 looks like a puppet show.
And live action in 60 looks like the absolute nadir of the uncanny valley.

I realize I'm in the minority, but still 30 is nice.
 

JET1971

New member
Apr 7, 2011
836
0
0
Sack of Cheese said:
Wow, Jim is married. First time I have heard about it. I wonder what she thought of all the naughty dragon dildos he got from fans. Ahh, yes, mm framerates stuff. I have been wondering, why 30vs60? Why not 40fps? 50fps? 70fps? Why is 60 the new standard?
I am sure she is thanking the fans.

OT:

30 and 60 have been standards for years now. 30 is for low-end systems barely able to run the game, or at least to hold a stable FPS; it is the bottom line, the base framerate, the lowest number for playability. 60 fps has been the standard for PC gaming for around 20 years; that's why we have v-sync at 60 FPS, and why monitor manufacturers have used it as the bare-minimum refresh rate. 60 FPS is the standard for quality because it does the same job that motion blur does for movies and TV: it is fast enough to look correct without noticeable stuttering.

30 FPS actually has visible stuttering: watch smoke or dust-type particle effects in a game and you can see it if you pay attention. At 60 fps you won't see that stuttering.
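For the curious, the 30/60 locks being argued about come down to a frame limiter; here is a minimal sketch of one (an illustration only, not any engine's actual code):

```python
import time

# Minimal frame limiter: render, then sleep away whatever is left of
# the frame budget so the loop ticks at (roughly) the target rate.
def run_frames(target_fps, n_frames, render=lambda: None):
    budget = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(n_frames):
        render()
        deadline += budget
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

start = time.perf_counter()
run_frames(60, 30)                    # 30 frames at 60 fps -> ~0.5 s
print(time.perf_counter() - start)
```

Whether such a cap is 30 or 60 is just the `target_fps` constant; nothing about the technique forces one over the other.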

But then why 30 or 60? Well, why 8, 16, 32, 64, 128, 256, 512, 1024, 2048, or 4096?