Jimquisition: Ubisoft Talks Bollocks About Framerate And Resolution

PunkRex

New member
Feb 19, 2010
2,533
0
0
Silentpony said:
Not that I disagree, but how does this stack up to the idea that graphics don't matter? I'll admit, I only have a decent gaming lappy, and I play most of the new releases on my apparently old and worthless 360. But when I hear a debate of 30fps vs 60fps, or that the resolution is off (whatever that bloody means!), or that the in-game graphics have been downgraded since the last demo, I can follow it, and it makes sense. Gaming companies have been hoist by their own petard, so to speak. They sold us on graphics and then didn't deliver. Fair. Good. Great.

But then the same people arguing turn around and say Minecraft is fucking amazing and that graphics don't make the game. They praise shitty-looking games for 'evoking a sense of nostalgia!' and for not 'buying the corporate line about graphics, man'. And I can't help but feel that people are either being two-faced, or just like arguing for argument's sake.

Is it just that Ubisoft promised 60fps and then only delivered 30? Would there be a controversy if they just said 30fps and that graphics shouldn't matter if the game is good? Don't we all believe that? Isn't that a core principle of gaming? Why are AAA games taken to task for the exact fucking pixel count when the indies are purposefully praised for having shit graphics? Is it money? Do we expect AAA games to have great graphics to back up their absurd bankrolls? If so, aren't we tentatively implying that bad games can be fixed by flinging money at them? Then how can we complain about over-budget games? Shouldn't we all WANT an over-budget game, because it must have solved every problem?

Again, not trying to start a flame war, but how do the two principles exist side-by-side?
I can sort of agree with you here; I'm in the same boat, not knowing what half this technological mumbo jumbo is about, but I think the anger mostly comes from Ubi's condescending attitude and the fact that it was developers who pushed this stuff in the first place.

Also, indie games pick less developed graphics for cost reasons and as an aesthetic choice. ACU has a realistic look and has been claiming next-gen since day one, so it can't use the latter reason, and the first seems self-imposed.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Charcharo said:
I have a simpler reason Jim.

Ubisoft have bad coders.
In fairness, the code is probably fine. It's more likely that the artists have put too much eye candy in and no-one wants to scale down their precious artwork for something as trivial as making the game playable.
 

hydrolythe

New member
Oct 22, 2013
45
0
0
canadamus_prime said:
I think Ubisoft is trying to claim the title previously held by EA, the worst company in America. They're certainly working their way towards replacing them as the company most hated by gamers. Honestly I don't care about frame rates or visuals, I just wish they'd stop with the bullshit.
You probably mean the worst company in France. They would probably never obtain that title, since the French would rather give that honour to Infogrames (which nowadays hides under the Atari brand label to avoid brand exposure, it is that bad), mainly because they are well known there for making lots of shovelware (though a few of their titles are still well remembered there, but very few).

One has to wonder, though, whether they lie just as much to the people in their home country. I do not know.
 

Arnoxthe1

Elite Member
Dec 25, 2010
3,391
2
43
Jimothy Sterling said:
... That PC's have been doing for years...
Uh, I don't mean to split hairs here but we both know that it all depends on how much money you invested in your PC, Jim. To run any modern game at 60 FPS takes AT LEAST a $700 machine. A little less than twice what a new XOne/PS4 console costs. In fact, for 60 FPS with high graphics settings, it'll probably cost more.
 

Biran53

New member
Apr 21, 2013
64
0
0
BlueJoneleth said:
themilo504 said:
Could somebody please tell me what fps is and how it affects videogames? I honestly don't know.
Frames per second: the number of images displayed every second on your screen. The framerate has to be consistent, because otherwise it causes a choppy effect in a game.

Most movies use 24 fps. For games, 30 is usually considered the minimum bearable, but 60 has more or less been the standard for PC games for years now. People were hoping the new gen would move consoles up to 60 fps too, and devs even announced it, but it seems the Xbox One & PS4 are not powerful enough, so now we have a ton of PR people telling us 30 fps is fine.
I wonder why the XBone and PS4 aren't powerful enough. You would THINK that improvements would abound for a brand "new" console generation, but I see failure after failure with both consoles in proving why they should exist in the first place. If neither is going to improve gaming much at all, why should we even bother?

EDIT: Because I probably have no idea what I am talking about: would it be expensive to make consoles with genuinely powerful hardware?
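To put rough numbers on the fps talk above (my own sketch, not anyone's engine code from the thread): a framerate target is really a per-frame time budget, and halving the framerate doubles how long each frame stays on screen.

```python
# Framerate targets expressed as per-frame time budgets.
# Halving the framerate doubles the time each frame lingers on screen,
# which roughly doubles the worst-case delay before input shows up.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to simulate and render one frame."""
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

At 30 fps each frame takes about 33 ms versus roughly 17 ms at 60 fps; that extra ~17 ms per frame is the gap twitch players complain about.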
 

Arnoxthe1

Elite Member
Dec 25, 2010
3,391
2
43
Charcharo said:
Want to match the PS4 on a, let's say, 500 dollar machine? Lower to medium-high. And you may even BEAT it.
Uh... Considering that I have a recent $750 laptop that can't even run Far Cry 2 on medium settings at a reduced resolution without getting choppy, I think you're very wrong.
 

Muspelheim

New member
Apr 7, 2011
2,023
0
0
Heh... Jim with children.

*Mental image of Jimmie-Boy pummeling his podium with that baby doll in Think of the Children.*
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Charcharo said:
Bad Jim said:
Charcharo said:
I have a simpler reason Jim.

Ubisoft have bad coders.
In fairness, the code is probably fine. It's more likely that the artists have put too much eye candy in and no-one wants to scale down their precious artwork for something as trivial as making the game playable.
*looks at Watch Dogs on PC*

Nope. Am certain it is just shit coding/optimization.

Metro looks MUCH better. Not a little, MUCH better. The only thing it has less of is NPCs and theoretical space around the character. The actual AI probably won't be that good in Unity though.

4A achieved more, with the same hardware, at higher resolutions and 60 fps. With less money and less people on the project.
Well Watch Dogs was bad. But that's just Ubisoft shitting on the PC.

As for Metro, the theoretical space around the character is really significant. In Assassin's Creed you can climb towers and see for miles, so detail must be limited to accommodate that. In Metro, you can see a hundred yards down the tunnel you are in, so that hundred yards of tunnel can have a lot more detail.
 

Arnoxthe1

Elite Member
Dec 25, 2010
3,391
2
43
Charcharo said:
Arnoxthe1 said:
Charcharo said:
Want to match the PS4 on a, let's say, 500 dollar machine? Lower to medium-high. And you may even BEAT it.
Uh... Considering that I have a recent $750 laptop that can't even run Far Cry 2 on medium settings at a reduced resolution without getting choppy, I think you're very wrong.
What laptop? My mate's laptop cost 800 dollars, and whilst not even close to a similarly priced desktop PC (which is what WE WERE TALKING about here), it can definitely max 6-year-old games.
Specs:

Intel Core i5-4200M (2.5 GHz, 4 logical CPUs)
Intel HD Graphics 4600 (better than you think)
1366x768
4 GB of RAM
Windows 7 Pro

What's funny though is that it can run Serious Sam 3 pretty well and even Skyrim decently enough, so I suspect it's probably just Far Cry 2 being badly optimized, if at all.
 

irishda

New member
Dec 16, 2010
968
0
0
I'm willing to bet this is just going to become a rehashing of this thread.

I played Last of Us on PS3. I've got no problems with framerates as long as it's north of 0.

It's hard to deflect the criticism that gamers are childish when they're so willing to blow up over something as inconsequential as 30 v. 60 fps. Watching the footage Jim was using for AC: Unity, I'm having a hard time figuring out what the problem is.

If this is the measure of a controversy for Ubisoft, I'd say that says more about the type of controversies drummed up against them than it does about the company itself. At this point, I wonder if there'd be less of a controversy if they just put out there, "We're doing it at 30 fps because you lot aren't happy no matter what, so go fuck yourselves. /crotch grab"
 

Piorn

New member
Dec 26, 2007
1,097
0
0
At least they don't force us to wear 3D glasses at home with our regular monitors yet.

And I totally agree with Jim here: a higher framerate is literally and absolutely always better, whether it's games, movies, TV or anything. When I have to stop rotating the camera and reorient myself because the screen became a blurry, stuttery mess while turning, that's just not good. But at least in games you know which way you're turning; in movies you don't, and if you pay attention, you'll notice how much of the screentime is actually just audio with blurry filler-vision. The Hobbit was the first movie I could actually keep watching during fast scenes, because there were actual visuals there.
 

RandV80

New member
Oct 1, 2009
1,507
0
0
Yep, if these graphics & performance issues 'aren't important' anymore, then gamers may as well go out and buy Nintendo consoles.
 

Amir Kondori

New member
Apr 11, 2013
932
0
0
I don't understand why Ubisoft lies about this stuff. They have to know the majority of gamers who care enough to read all this stuff in the first place are not idiots and know that they are lying. It just makes their customers feel like Ubisoft has total contempt for them.

This parity makes zero sense as well; all it does is make their games less competitive on each platform compared to other games on that platform. I don't see how it helps them at all.

I really feel like Ubisoft is run by people who don't understand the industry they are in.
 

Kohen Keesing

New member
Oct 6, 2014
40
0
0
Arnoxthe1 said:
Specs:

Intel Core i5-4200M (2.5 GHz, 4 CPU's)
Intel HD Graphics 4600 (Better than you think.)
1366x768
4 GB's of RAM
Windows 7 Pro

What's funny though is that it can run Serious Sam 3 pretty well and even Skyrim decently enough so I suspect it's probably just Far Cry 2 being stupidly optimized, if at all.
Not bad. I'm running an i5 @ 1.7GHz with two physical and two logical cores, Intel HD 4000 graphics and 3 or 4 GB of RAM.
But for sure, if I cut out AA and complex shadows, and turn off Depth of Field (personal preference more than anything) my shitty little laptop can fucking curbstomp our 360.


alj said:
I mean come on use your testers better.
You know, I'm not actually sure that game publishers really test anymore. You used to get paid for that stuff, and nowadays public betas and alphas are almost commonplace to the point of exploitation.

irishda said:
It's hard to deflect the criticism that gamers are childish when they're so willing to blow up over something as inconsequential as 30 v. 60 fps. Watching the footage Jim was using for AC: Unity, I'm having a hard time figuring out what the problem is.
A video embedded on a forum site doesn't do it justice. As I mentioned in an earlier post, I play using my laptop on a 50" screen with a tendency not to blink for an hour at a time, and when you see a game like, to take a severe example, Crysis 1 on a large screen, the framerate really does become an issue, and a blindingly obvious one at that. I wouldn't say that 30 v 60 is "inconsequential" in all cases. As I stated, I play a lot of twitch-reflex games like Dead Space and Nuclear Throne, and when you're relying on good aim or the ability to react within milliseconds, you get into a sort of 'flow' as you play. When you rely on said 'flow' too much, a dip in framerate becomes somewhat jarring visually, especially if your display has a low refresh rate.

In a game like Skyrim, where combat is slow and methodical and the animation for someone swinging a weapon takes close to a second to finish, a drop from 60 fps to 30 fps isn't too bad - maybe visually off-putting, but not actually impacting gameplay. But in something like Risk of Rain, where there are about three frames in the animation for a devastating attack from an enemy, losing those milliseconds can become straight-up rage-inducing.

This is, of course, leaving out all the other issues that start with framerate deviation: Yahtzee talked in his Quantum Conundrum video about bugs resulting from framedrops, and in the original Dark Souls - with DSfix on - even just climbing and sliding down ladders becomes glitchy and risky.

Actually, an even better example is Skyrim: I can't even count how many times there's been a sudden framerate drop, the physics engine has bugged out, and I've gone flying off the side of a mountain at the speed of a getaway vehicle to splat on the ground a kilometre or so below and two kilometres away.

And of course, there's the video that DarkhoIlow put up (hopefully he doesn't mind me stealing it), which is exactly what I'm talking about in the Skyrim example:

DarkhoIlow said:
"Framerate doesn't matter"
Have a gander here: http://gfycat.com/SaltyGraveAmazondolphin
This applies to most games not only FPS.
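The Skyrim physics blow-ups described above are a classic symptom of stepping physics with the raw frame delta instead of a fixed timestep: one long frame becomes one huge, unstable integration step. A minimal sketch of the difference (toy numbers of my own, nothing from Skyrim's actual engine):

```python
# Why a framerate hitch can fling a character off a mountain: if the physics
# update uses the raw time since the last frame, one long frame applies one
# giant integration step. A fixed-timestep loop consumes the same elapsed
# time in small, stable slices instead.

FIXED_DT = 1.0 / 60.0  # fixed physics step: 60 updates per simulated second

def integrate(pos, vel, dt, accel=-9.8):
    """One naive Euler step under constant acceleration (gravity)."""
    vel += accel * dt
    pos += vel * dt
    return pos, vel

def variable_step(pos, vel, frame_dt):
    # Physics tied directly to frame time: a 0.5 s hitch is one huge step.
    return integrate(pos, vel, frame_dt)

def fixed_step(pos, vel, frame_dt):
    # Same elapsed time, split into stable FIXED_DT slices.
    while frame_dt > 1e-9:
        dt = min(FIXED_DT, frame_dt)
        pos, vel = integrate(pos, vel, dt)
        frame_dt -= dt
    return pos, vel

# After a single half-second frame hitch, the variable-step version has
# overshot far past where the fixed-step version ends up.
print("variable:", variable_step(0.0, 0.0, 0.5))
print("fixed:   ", fixed_step(0.0, 0.0, 0.5))
```

The exact answer for half a second of free fall is about -1.23 m of drop; the fixed-step loop lands near that, while the single big step overshoots to -2.45 m. A real engine's collision response reacting to that kind of overshoot is how you get launched off a mountainside.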
 

irishda

New member
Dec 16, 2010
968
0
0
Kohen Keesing said:
As I mentioned in an earlier post, I play using my laptop on a 50" screen with a tendency not to blink for an hour at a time, and when you see a game like, to take a severe example, Crysis 1 on a large screen, the framerate really does become an issue, and a blindingly obvious one at that.
I weep for the poor gamers who also don't blink for hours at a time, as truly they will be the most hurt by this decision.

I wouldn't say that 30 v 60 is "inconsequential" in all cases. As I stated, I play a lot of twitch-reflex games like Dead Space and Nuclear Throne, and when you're relying on good aim or the ability to react within milliseconds, you get into a sort of 'flow' as you play. When you rely on said 'flow' too much, a dip in framerate becomes somewhat jarring visually, especially if your display has a low refresh rate.

In a game like Skyrim, where combat is slow and methodical and the animation for someone swinging a weapon takes close to a second to finish, a drop from 60 fps to 30 fps isn't too bad - maybe visually off-putting, but not actually impacting gameplay. But in something like Risk of Rain, where there are about three frames in the animation for a devastating attack from an enemy, losing those milliseconds can become straight-up rage-inducing.

DarkhoIlow said:
"Framerate doesn't matter"
Have a gander here: http://gfycat.com/SaltyGraveAmazondolphin
This applies to most games not only FPS.
If it's not inconsequential, then why does the example you provided (a twitch game) show almost no difference with regards to framerate until it reaches the low teens and below? I mean, it WOULD be really annoying if they locked it in at 5 fps. I might be pretty put off by that.