No One Really Cares About 1080p, Says Far Cry 4 Dev

Silk_Sk

New member
Mar 25, 2009
502
0
0
People like pixel games so they don't care about 1080p? That's like saying people who enjoy black and white films don't care about color.
 

devotedsniper

New member
Dec 28, 2010
752
0
0
Strazdas said:
The mistake in this is that you assume Ubisoft wants to write optimized code. They have demonstrated multiple times that they have no intention of having their games optimized. After all, according to them, you should just buy a better graphics card for badly optimized ports [http://www.videogamer.com/pc/assassins_creed_4_black_flag/news/ubisoft_provides_statement_on_ac4_pc_optimisation_proud_of_pc_version.html]

By that definition I need Titans in SLI, as up until recently a 780 was one of the top dogs... which is probably true, because Watch Dogs still ran terribly even after I upgraded from my 660 Ti. That said, it's not just Ubisoft; none of the big names seem capable of the simple 1080p standard that we PC gamers have taken for granted for years.
 

Slegiar Dryke

New member
Dec 10, 2013
124
0
0
Windknight said:
Again, I agree with you up to a point, but this obsession with graphics is something devs/publishers have pushed themselves. They've used uber-graphical dick-waving as a selling point and stoked up the faction who obsess about this kind of thing... and now they're trying to backpedal when they're not keeping up with the expectations they've created.
Agreed as well on that clarification. It really is a problem of expectations built up over the years. Much as people expect ever-bigger gains in computing power from upgrading hardware, there's a growing disconnect between what systems "should" be capable of and what they actually manage to deliver. I feel that's not just because of the creators' overblown dreams, but because of the poor quality of the underlying systems built to support that dream appearance.

A less convoluted example: you can have a somewhat poorly coded website that still works and gets the job done, but there comes a point where throwing hardware at it won't fix the fact that it's poorly coded. Game companies have their heads too stuck in numbers and checkboxes to worry about whether everything fits together properly. Although now I'm getting off topic, into why I think online updates ruined code quality, so that's a discussion for another time.

Suffice it to summarize: selling points, buzzwords, and overblown expectations, ending in angry customers. I may not care about the battle of resolution-gate/FPS-gate/quality-gate/etc., but I do recognize the issues that led to all of them.
 

crypticracer

New member
Sep 1, 2014
109
0
0
When the 360 and PS3 came out, I had to buy a fancy new TV because I couldn't read the words on my old one. The developers forced that shit, so they need to pay up.

I am sick of developers assuming we are stupid. Even if we have no idea what the difference is, we could find out with a quick wiki search. This isn't 1993. You can't expect people not to know things.

Has Ubisoft said anything smart in the last year?
 

clippen05

New member
Jul 10, 2012
529
0
0
I'm done with Ubisoft. Everything over the past few months, from the lack of women in AC: Unity (or rather, not the lack of inclusion itself, but their reasoning for it) to all the graphics blunders from Watch Dogs to Assassin's Creed to Far Cry. I used to buy a lot of their games, but not only have the ones I have bought been (IMO) of declining quality in the first place, their treatment of consumers, specifically PC gamers, in the past few months has been reprehensible. The last Ubisoft game I bought was Splinter Cell: Conviction (not years ago; a couple of days ago, because it has co-op, my mate and I were bored, and it's extremely cheap on PC), and it will be my last for the foreseeable future. As much as I am interested in the revival of the Rainbow Six franchise, if they are going to continue this garbage then they aren't having my money. I can live without one specific game; I mean, I've had to make do without GTA V this long.
 

witmanfade

New member
May 6, 2013
4
0
0
I'm in the boat of "this guy doesn't quite understand resolution," but I think I agree with what he's trying to say. I really don't care what resolution a game runs at as long as it's playable and fun. Even though the new consoles are supposed to be the "best there is for graphics" right now, in a few years, when people go back and play a game again, they aren't going to play it for the graphics; they're going to play it because they enjoyed it the first time through.

Frankly, I'm tired of people griping about how some graphics are ever so slightly better (or worse) than the last game's. I would rather the extra disk space go to game content, to make games more enjoyable, or longer, or something. Losing gameplay just so the water sparkles 10% more really seems to me like a way for developers to get away with less content.

I play "retro" games not because they're "cool to play because it's retro"; I play them because they have good gameplay and enjoyable content.
 

Aiddon_v1legacy

New member
Nov 19, 2009
3,672
0
0
Windknight said:
Again, I agree with you up to a point, but this obsession with graphics is something devs/publishers have pushed themselves. They've used uber-graphical dick-waving as a selling point and stoked up the faction who obsess about this kind of thing... and now they're trying to backpedal when they're not keeping up with the expectations they've created.
Yeah, considering the context, the quote just comes off as someone realizing that people aren't happy with the finished product on display, and rightfully so. Ubisoft needs to be held responsible for its actions, and this is not the way to do it. Third parties' attitudes as of late can best be described as inept; they keep making painfully obvious mistakes, thinking that this time things will be different. It's no wonder Yahtzee chose "Let's all laugh at an industry that never learns anything, tee hee hee" as his jingle.
 

Alarien

New member
Feb 9, 2010
441
0
0
Jiggle Counter said:
Strazdas said:
Plasmas will always look great because, well, they are plasmas. They are also something that will bankrupt you through electricity bills, and they cannot be used for gaming due to massive input lag. Oh, and don't let them burn in; it does not go away, ever.

The color quality, though, almost makes it all worth it.
I've SEEN plasma TV input lag, but I've never suffered it on mine. I once thought it was a myth, or a technical fault in only a few brands or versions of plasma screens. Then I watched it happen on YouTube, and holy crap, it was awful. It was almost as if they were streaming the picture out through the internet and back through the TV.

Regardless, I have the feeling that it only applies to older plasma screens. I bought mine only 2 years ago, so maybe they've figured out a fix for it.

As for the burn-in effect, I can say that it does happen. Turn off the TV and there's the HUD still staring you in the face. It does go away, albeit half a month later.

The electricity bill is totally worth the image quality.
As someone who has owned a 50" plasma TV for about 5 years now, I can tell you that one of the reasons we bought it was that it had noticeably less input lag than LCDs/LEDs of the same period (except for those that cost twice as much as our plasma; those had about the same or a little less). I do think this is accurate: older plasmas had lag and were subject to significant burn-in, while later plasmas were not. Demon's Souls would leave noticeable image retention on my screen after a few hours of playing, but most plasmas have a screen-burn protection mode that wipes that out in a few minutes. Also, after about 6 months, burn-in on a plasma is a lot less likely, and while IR does occur pretty quickly, it also goes away pretty quickly. We've left unmoving screens up on ours for hours (overnight, even) and have no permanent retention.

The lag issue was a big deal for me because, at the time, I was a big Guitar Hero/Rock Band nerd. I had to FC everything on Expert, which is almost impossible with large lag (partially because large input lag tends to vary by a greater proportion than small input lag). I never had a problem after my initial calibrations.

In regard to the main post: this is stupid. I don't care whether YOU can see 1080p vs 720p; it can be objectively demonstrated to be better. That was made super obvious to me just playing Dark Souls on PC before and after DSFix. It's not even close. 60 FPS is also objectively better than 30 FPS.

We shouldn't be arguing "I don't see it / don't really care." We know that, given the resources, 1080p60 is achievable on PC for nearly every system that is at least mid-range (mid-range costs about the same as or less than a modern console if you're building your own), particularly with adjustable settings. Bloodborne can achieve 1080p60, Far Cry 4 can achieve 1080p60, The Witcher 3 will be able to.

The problem is the minimal improvement of the current console generation over the last one, which kicked off around what, 2005? In the 8 or so years between generation launches, the growth of console hardware bogged down severely, far more so than in previous generations (look at the difference between the SNES and the PS1). These developers are completely capable of producing these games at their PC performance level, but are hampered by needing to placate the console audience over why they can't push better specs there. It's not necessarily the devs' fault, and certainly not the fault of console fans/gamers. It's the fault of Sony and Microsoft for not either creating better machines or waiting until it was cost-feasible to do so.

This is not about consoles vs. PCs from my perspective. I prefer PC but always end up with one or more of the consoles each generation, and I like gaming from my couch for some games. This is about hamstringing the devs with sub-par hardware, driving up the "omg next gen" hype (in reality, 2 gens back for PC), and then needing to explain away not meeting that expectation. It causes Ubi and other devs to consistently release "stupid."
 

Grabehn

New member
Sep 22, 2012
630
0
0
OMG, Ubisoft at the same stupid point again. Can't they just shut THE FUCK UP? I just laugh at the idea of a dev from a "big company" in the game industry talking about how "graphics don't matter" when these same companies were pushing exactly that all over.

Saying "oh no, it's 1080p 60FPS all over" and then either pulling that statement back or simply not making it there is FAR WORSE than just releaseing your game without even talking about visuals. People get it, you can't do it, tough shit, but PLEASE shut the fuck up and make a good game. Especially when you're just making Farcry 3.5, since that already looks fine.
 

Olas

Hello!
Dec 24, 2011
3,226
0
0
Obviously people care about 1080p. They wouldn't fucking make 1080p monitors and TVs if people didn't care about 1080p.

I frequently turn my games down to 720p to get better framerates, because my PC isn't great, but I acknowledge that I'm making a sacrifice when I do so. 720p on a 1080p monitor looks fuzzy.
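The fuzziness has a simple cause, by the way: the non-integer scale factor. A back-of-the-envelope sketch (mine, nothing official):

# 720p on a 1080p panel scales by 1080/720 = 1.5: each source pixel has
# to cover one and a half display pixels, so the scaler must blend
# neighbouring pixels instead of mapping them 1:1, and the image goes soft.
# (Integer factors only stay sharp if the scaler does nearest-neighbour.)
for src_h in (540, 720, 1080):
    scale = 1080 / src_h
    verdict = "integer, can stay sharp" if scale.is_integer() else "fractional, goes fuzzy"
    print(f"{src_h}p on a 1080p panel: x{scale:g} ({verdict})")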

Also, saying that the vocal minority are the only ones who care is just... a convenient delusion. A sample isn't always representative of a whole, but it's rarely completely unrepresentative.
 

wAriot

New member
Jan 18, 2013
174
0
0
"No one cares about resolution"
"60fps is not that important"
"OH BUT WE HAVE DA BES GRAFIX!!1!"
I'm going to ask one of those stupid questions: am I the only one who would prefer 2004 (or lower) levels of graphics (textures, polygon counts, aliasing) along with high resolution and framerate, instead of, well, what we have now?
I really can't play a game if the FPS drops under 50, and 720p looks like complete crap to me, especially when I'm playing on a goddamn 40" TV.

Side note: I find it particularly funny when morons spout "graphics are not that important, THEREFORE who cares about framerate and resolution?" Two pro-tips:
1. Framerate and graphics don't have much in common.
2. The reason we can't have a decent resolution on consoles is precisely that devs assume YOU consider every other aspect of graphical quality (as I mentioned: textures, polygons, etc.) more important.
 

Fangface74

Lock 'n' Load
Feb 22, 2008
595
0
0
Zeriu said:
This guy is clearly talking out of his ass, and is projecting a bad image on his company for doing so.
Think he should start projecting images in 1080p?

Kidding aside, it's fairly obvious that Microsoft are keen to side-step an issue that Sony and its ilk are lording over them.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Jiggle Counter said:
I've SEEN plasma TV input lag, but I've never suffered it on mine. I once thought it was a myth, or a technical fault in only a few brands or versions of plasma screens. Then I watched it happen on YouTube, and holy crap, it was awful. It was almost as if they were streaming the picture out through the internet and back through the TV.

Regardless, I have the feeling that it only applies to older plasma screens. I bought mine only 2 years ago, so maybe they've figured out a fix for it.

As for the burn-in effect, I can say that it does happen. Turn off the TV and there's the HUD still staring you in the face. It does go away, albeit half a month later.

The electricity bill is totally worth the image quality.
Plasma lag exists because of how plasma works; as in, physical properties. Yes, new TVs have started doing progressive guessing to minimize it, which works most of the time but is still quite bad. It's more likely that you simply got used to the input lag after using that TV a lot, since you own it.

pilar said:
Ask the PC user with a mid-range GPU: you can't have the shiny new graphics if you want a solid 60 frame rate, especially given how The Evil Within and Shadow of Mordor either lock it to 30 or can't even run at 1080p30 on a GTX 760 at Very High settings.

It would help if people stopped thinking of PC vs. console: from their internal architecture to their hardware, they're completely different machines. It's not even close, unless you'd say a horse is related to a cat because it is quadrupedal (four legs).
I see you continue telling lies. Shadow of Mordor works fine at 60 FPS on high settings on my GTX 760, thank you.
It would help if you stopped spreading misinformation. Console architecture this generation is the same as PC.

GoodNewsOke said:
I'm one of those who don't care about resolution. 1080? 720? pfft. The game needs to be fun, immersive, well designed, that kind of thing.
"I dont care if a game is well programmed, i care if its fun, immersive or well programmed"
no, no problem with that statement at all.

cikame said:
As far as consoles and TVs go, yes, as long as the image is clean and clear with little aliasing, it really doesn't matter what the resolution is,
So it does not matter what the resolution is, as long as the resolution is high? Yes, I agree.

Lightknight said:
http://www.rtings.com/images/optimal-viewing-distance-television-graph-size.png
This graph is incorrect. It is based on our focal point and assumes we are blind outside it, which we are not. Our focal point, while having higher fidelity than the rest of our vision, covers only about 2% of our total visual field.

Another thing to note is that resolution is the true form of antialiasing, because aliasing exists due to insufficient resolution. This means that raising the resolution even beyond what our eyes can resolve is HIGHLY BENEFICIAL to the experience.

Well, the good news is that he was talking about TVs, not monitors, where you only sit a couple of feet away, and there are 4K monitors on the market for reasonable prices (compared to $1,500+ for 4K TVs).
If this were true, only the uneducated would buy 4K TVs. Considering that monitors are, functionality-wise, BETTER than TVs (which is why they cost more, not the other way around), buying the monitor when it's also cheaper than the TV wouldn't even be a choice.

He also claimed that you need to be 18 inches or MORE from a 55" TV to see the resolution, which is not even logical. Mr. Hutchinson clearly does not know what he's talking about.
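For what it's worth, here is the geometry those viewing-distance charts are built on; a rough sketch of mine, assuming the usual 20/20 figure of about 1 arcminute of acuity. Even on its own terms, the cutoff for a 55" 1080p set comes out around 86 inches (about 7 feet), nowhere near 18 inches:

import math

def max_resolve_distance(diag_in, res_w, aspect=(16, 9), acuity_arcmin=1.0):
    # Distance (inches) beyond which adjacent pixels blur together
    # for an eye resolving `acuity_arcmin` arcminutes (20/20 is ~1').
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from the diagonal
    pixel_pitch = width_in / res_w             # physical size of one pixel
    return pixel_pitch / math.tan(math.radians(acuity_arcmin / 60))

print(round(max_resolve_distance(55, 1920)))   # -> 86 (inches, ~7.2 ft)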



Mr_xx said:
As for screen size: viewing this image on your smartphone, tablet, or laptop, the image will look identical even at a really short distance (unless, of course, your phone has a 20" display).
Is this supposed to be a comparison? I can see the difference from 3 meters away looking at it on a 21" 1080p screen. The difference is jarring.

The following graph
...is bogus and unrepresentative of reality.

It's important to keep in mind that 3D applications also employ other means to achieve a 'better' image, e.g. anti-aliasing, which is sometimes less resource-consuming than using a higher resolution.
Antialiasing is a procedure where you generate the image at a higher resolution and then downscale it to the monitor's native resolution, thus removing aliasing. Some techniques do not generate the whole image at a higher resolution, but only parts of it, if the game engine supports it, which is why they often take fewer resources. Of course, there are the likes of TXAA, which do not do antialiasing but instead blur the image to hide the aliasing. TXAA should be banned, because it actively makes image quality WORSE while using more resources than no AA at all.

If you want real antialiasing, you ARE using higher resolutions; antialiasing IS generating the image at a higher resolution.
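To illustrate, the core of it is a few lines; a toy sketch of mine, not any engine's actual implementation:

import numpy as np

def supersample_downscale(hires, factor):
    # Box-filter a frame rendered at factor x native resolution back down.
    # Edges that covered only part of an output pixel now shade it
    # partially, which is exactly what removes the stair-stepping.
    h, w, c = hires.shape
    blocks = hires.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)          # stand-in for a rendered frame
frame_1080p = supersample_downscale(frame_4k, 2)  # -> (1080, 1920, 3)

The cheaper techniques (MSAA and the engine-assisted variants I mentioned) are approximations of that averaging step applied to only parts of the image.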
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Shingeki no Gingerbeard said:
Unless you work for the industry, or have access to financial data, you don't know how much anything costs. This includes both basic optimization and that ludicrous $400 million budget you pulled from some mule's rectum.

It does sound, however, like you're of the mind that a company should just spend money without any thought to cost-effectiveness. Businesses, under normal economic conditions, behave rationally. If it costs more money to optimize a port, then the port should cost more than the original. Imagine paying $60 for Sunset Overdrive on XBO, and $70 for a PS4 or PC port in another year or so.

What you, and most people here, are ignoring is that it's perfectly legitimate for a company to want to standardize its game's experience across multiple platforms, especially if it relies on cross-platform sales and doesn't lock itself into exclusivity deals. There's no sense in burning bridges by showing which platform is inferior to the competition. It's just bad business.
The answer to the first paragraph has to remain confidential.

A company should spend money where it's most rational. In a situation where 1080p is considered the minimum standard resolution, a company that has repeatedly banked its games' success on graphical fidelity, and that is known for saying that statistically graphics are the largest factor in copies sold, should adhere to at least that minimum resolution standard. That would be rational.

What is not rational is, after 7 years of praising graphics, to go around telling your customer base, who after all bought your games because of that graphics push, that the company knows better what its customers want.

Well, Ubisoft has stated it wants to turn 60-dollar-paying customers into 200-dollar-paying customers, so that 70-dollar price tag may actually become a reality. A very funny reality, as we will get to watch Ubisoft tear itself apart trying to defend such lunacy. Game prices are too high as it is, and the evidence shows that selling games for less actually earns the companies more money. Considering most PC games at launch are already below 50 dollars, even increasing their price would not put them on par with console ones, though.

No, it is NOT perfectly legitimate to want to standardize the experience when the platforms are vastly different. What's bad business is intentionally gimping one version just because the other platform is underperforming.

devotedsniper said:
By that definition I need Titans in SLI, as up until recently a 780 was one of the top dogs... which is probably true, because Watch Dogs still ran terribly even after I upgraded from my 660 Ti. That said, it's not just Ubisoft; none of the big names seem capable of the simple 1080p standard that we PC gamers have taken for granted for years.
A 780 Ti > Titans anyway, unless you really, really need those 6GB of VRAM (as in, you're gaming on three-monitor setups and the like).

Besides, the top dog now is the 980 anyway.

wAriot said:
"No one cares about resolution"
"60fps is not that important"
"OH BUT WE HAVE DA BES GRAFIX!!1!"
I'm going to ask one of those stupid questions: am I the only one who would prefer 2004 (or lower) levels of graphics (textures, polygon counts, aliasing) along with high resolution and framerate, instead of, well, what we have now?
I really can't play a game if the FPS drops under 50, and 720p looks like complete crap to me, especially when I'm playing on a goddamn 40" TV.
Yep. I'd rather we froze the polygon count for a while and instead increased the framerate, the resolution, and other technological aspects like raytracing. It's rare to see someone who thinks like that, though; most people go "but the 30,000 polygons in Star Citizen!"
 

TomWiley

New member
Jul 20, 2012
352
0
0
And what he should have said was "If you care so much about 1080p, what the fuck are you doing on a console?"
 

RavingSturm

New member
May 21, 2014
172
0
0
I remember last year when they said optimizing on PC wasn't their priority. Really, this year has been one lousy port after another.
 

pilar

New member
Jul 7, 2014
59
0
0
Strazdas said:
pilar said:
Ask the PC user with a mid-range GPU: you can't have the shiny new graphics if you want a solid 60 frame rate, especially given how The Evil Within and Shadow of Mordor either lock it to 30 or can't even run at 1080p30 on a GTX 760 at Very High settings.

It would help if people stopped thinking of PC vs. console: from their internal architecture to their hardware, they're completely different machines. It's not even close, unless you'd say a horse is related to a cat because it is quadrupedal (four legs).
I see you continue telling lies. Shadow of Mordor works fine at 60 FPS on high settings on my GTX 760, thank you.
It would help if you stopped spreading misinformation. Console architecture this generation is the same as PC.

You're calling Digital Foundry [http://www.eurogamer.net/articles/digitalfoundry-2014-shadow-of-mordor-face-off] a liar?
"You can try pushing your luck by pushing texture detail higher - for example, we tried running high quality textures on a 2GB GTX 760 working in combination with the 30fps lock option."

So 1080p, high-quality textures, at a 30 FPS lock. Hmm... welcome to the PlayStation 4.

And if the architecture is the same, then which new AAA title takes advantage of each platform? If developers are so familiar with them, why does each version look only marginally better on seriously different hardware? (The Evil Within is capped at 30 on PC, just like on the PlayStation 3.)

Do you know how pathetic it is for the consoles to be challenging a 760?
 
Apr 5, 2008
3,736
0
0
Have Ubisoft spokespeople decided to emulate EA by spewing garbage from their mouths whenever they're opened?
If people didn't care about faster framerates and higher resolution, they wouldn't have bought XB1s or PS4s.

You can't have it both ways, industry: next-gen hardware, last-gen graphics. Ubisoft, the developer that couldn't.