No One Really Cares About 1080p, Says Far Cry 4 Dev

LGC Pominator

New member
Feb 11, 2009
420
0
0
Thing is, I don't think the whole 1080p/60fps thing is really the important part of gaming tech nowadays. Realistically speaking, aside from pixel density and image smoothing, that isn't what makes a game look good, is it?

I mean, in Halo CE: Anniversary you can switch between the Halo 4 graphics and the Halo CE graphics at will, and they both run at 1080p/60. That hasn't changed, but the quality of the graphics is markedly better in the Halo 4 engine, no?

I think 1080p/60fps has become a major buzzword in modern gaming, and I don't think that's really good for gaming. I'd rather games have the depth of Deus Ex (installing it again) than the HD prettiness of BF4. If modern hardware can push THAT up, so we get games with that kind of depth at the fidelity of modern titles, I'd be as happy as can be, 1080p/60 be damned.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
wildstyle96 said:
I think it's more a problem with optimization than with the graphics hardware itself; just look at games that came out at the beginning of last gen and games that came out at the end, like GTA V.
Either they can't figure out how to code for these new consoles, or they'd rather not bother and just chase maximum profit with terribly optimized games; I think we know the answer here.
No.

The difference here is that last-generation consoles used unique hardware that was much harder to code for. Now they use the standard x86 architecture that has been around for over two decades. Someone as multiplatform as Ubisoft definitely knows that architecture. I could understand some small indie PS3-only dev knowing only the Cell architecture, but not Ubisoft.

Another fact is that they simply don't have the raw power. You can optimize all you want; weak hardware is still weak. And tests show that the Xbox One's GPU is weaker than a GTX 480, an over-four-year-old PC GPU.

Lightknight said:
Resolution only matters at the focal point because it's your focal point that resolves pixels best. If you can't resolve pixels with your focal point, you absolutely won't be able to do so outside your focal range, whereas if you can't resolve pixels outside your focal point, you may still be able to do so with your focal point. Right now, while you're looking at the screen, keep your focus on the screen but pay attention to the objects you can see outside of it. Note that they're likely blurry, or certainly not as clear as where your eyes are focused.

So the focal point is the only relevant component to evaluate when addressing resolution. If your focal point is catered to, then everything else will work too. Again, as far as resolution is concerned.
Resolution would only matter solely at the focal point if the focal point covered the whole screen at once, which it doesn't unless you are VERY far away from the screen. When it covers only part of the screen, the resolution of that part of the screen is what matters. Since the focal point is less than 2% of your total vision, and that chart was drawn assuming the focal point covers your whole vision, the chart is wrong.

The "blurry" effect outside the focal point often comes from our depth-of-field vision (we see in 3D), not so much from our eyes themselves being blurry. Since we are focused on one spot on the screen, the other parts are out of focus, so we can't see them properly in 2D. It's like watching a 3D movie without glasses: the eyes aren't focused at those points. As I type this sentence, my focal point is smaller than a single sentence on this screen, and that probably takes up less than 5% of the whole screen area. Do you see the problem with assuming the focal point covers the whole screen?

WildFire15 said:
I see no real excuse for Next-Gen systems not doing 1080p. The Wii U does it happily, so why shouldn't XB1 and PS4?
This is incorrect. Too often I hear people praise the Wii U by saying that it runs games at "1080p" while the other two consoles can't. This is NOT true. Nearly all games on the Wii U run at UPSCALED 1080p (fake 1080p). Do some research and you'll see that only two AAA titles run at native 1080p (real 1080p), those being Wind Waker HD and Rayman.
What resolutions are the games upscaled from? Usually 720p, but it can be as low as 600p. Fucking 600p!
You know what this means? There are more native 1080p games on the PS4 and Xbox One than there are on the Wii U. Many more, in fact.
Nintendo often distorts the truth about the technical aspects of their games; they like to pretend that upscaled 1080p is the same as native 1080p.

pilar said:
Didn't think anyone would ever mention Black Flag and PC in a positive context; frame rate is the only real advantage over the consoles, because you've still got to use Ubisoft's much-beloved launch software Origin to play it.

That's why most PC users will avoid that hassle altogether and just slip the disc into a console and play = +1 consoles

Battlefield 4 is indistinguishable between the PlayStation & the $400 780 = +1 consoles

A 750 Ti is more powerful than the PlayStation; that is, if it ever gets optimized, or you'll have to lower the textures considerably to double the frame rate. Same for the 760; but you never even glanced at the article linked in the last post, so I wouldn't expect you to know this.

Reading = Knowledge
Why is it that whenever I see your posts I want to laugh and leave it at that?

You don't even seem to know the name of Ubisoft's launch software - it's Uplay (Origin is from EA) - yet you claim to be knowledgeable about their games. And while you may not have thought anyone would mention Black Flag, it actually ran quite well on PC and was much more graphically impressive than its console counterparts, provided you had the hardware for better graphics.

Then you go on to list FALSEHOODS as points for consoles.

A 750 Ti is more powerful than the PS4. You do not optimize hardware; it is what it is. You do not need to lower textures to get better performance. You may need to lower textures to get double the performance of a console, but then you are getting twice the performance of the console, so it's not really something you want to dismiss. And yes, I did "glance" at the article, which is why I noticed the broken methodology that I pointed out.

Yes, reading is knowledge. Like, say, reading the name of the distribution platform you're criticizing...


LGC Pominator said:
Thing is, I don't think the whole 1080p/60fps thing is really the important part of gaming tech nowadays. Realistically speaking, aside from pixel density and image smoothing, that isn't what makes a game look good, is it?
Yes and no. No, pixel density does not make a game look good; it allows a good-looking game to look good. Resolution alone does not bring aesthetics, but you cannot have the same aesthetics without the resolution. It's an enabler, if you will.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Strazdas said:
Lightknight said:
Resolution only matters at the focal point because it's your focal point that resolves pixels best. If you can't resolve pixels with your focal point, you absolutely won't be able to do so outside your focal range, whereas if you can't resolve pixels outside your focal point, you may still be able to do so with your focal point. Right now, while you're looking at the screen, keep your focus on the screen but pay attention to the objects you can see outside of it. Note that they're likely blurry, or certainly not as clear as where your eyes are focused.

So the focal point is the only relevant component to evaluate when addressing resolution. If your focal point is catered to, then everything else will work too. Again, as far as resolution is concerned.
Resolution would only matter solely at the focal point if the focal point covered the whole screen at once, which it doesn't unless you are VERY far away from the screen. When it covers only part of the screen, the resolution of that part of the screen is what matters. Since the focal point is less than 2% of your total vision, and that chart was drawn assuming the focal point covers your whole vision, the chart is wrong.

The "blurry" effect outside the focal point often comes from our depth-of-field vision (we see in 3D), not so much from our eyes themselves being blurry. Since we are focused on one spot on the screen, the other parts are out of focus, so we can't see them properly in 2D. It's like watching a 3D movie without glasses: the eyes aren't focused at those points. As I type this sentence, my focal point is smaller than a single sentence on this screen, and that probably takes up less than 5% of the whole screen area. Do you see the problem with assuming the focal point covers the whole screen?
No, it doesn't. The areas outside of your focal point are blurry in real life too. The reason there's blur outside the focal point is that we lose focus the further we get from the focal point. That's literally why it's called the focal point: it's where you're focusing your sight, and everything else is out of focus (aka blurry).

The focal point is the point that can resolve images the best. So... who gives a shit about everything outside the focal point? If you succeed at catering to the focal point, everything else is as accommodated as it's going to get.

Again, here is how it goes:

1. If you can't resolve pixels with your focal point, you can't resolve pixels with non-focal vision.
2. If you can't resolve pixels with your non-focal vision, you may STILL be able to resolve pixels with your focal point.

So again, why do you think non-focal vision is relevant to this discussion? The focal point is the only thing that matters when discussing the resolution of the screen. In what way do you think people would do anything differently to account for the parts of your vision that you're not focusing on?
 

pilar

New member
Jul 7, 2014
59
0
0
Strazdas said:
wildstyle96 said:
pilar said:
Didn't think anyone would ever mention Black Flag and PC in a positive context; frame rate is the only real advantage over the consoles, because you've still got to use Ubisoft's much-beloved launch software Origin to play it.

That's why most PC users will avoid that hassle altogether and just slip the disc into a console and play = +1 consoles

Battlefield 4 is indistinguishable between the PlayStation & the $400 780 = +1 consoles

A 750 Ti is more powerful than the PlayStation; that is, if it ever gets optimized, or you'll have to lower the textures considerably to double the frame rate. Same for the 760; but you never even glanced at the article linked in the last post, so I wouldn't expect you to know this.

Reading = Knowledge
Why is it that whenever I see your posts I want to laugh and leave it at that?

You don't even seem to know the name of Ubisoft's launch software - it's Uplay (Origin is from EA) - yet you claim to be knowledgeable about their games. And while you may not have thought anyone would mention Black Flag, it actually ran quite well on PC and was much more graphically impressive than its console counterparts, provided you had the hardware for better graphics.

Then you go on to list FALSEHOODS as points for consoles.

A 750 Ti is more powerful than the PS4. You do not optimize hardware; it is what it is. You do not need to lower textures to get better performance. You may need to lower textures to get double the performance of a console, but then you are getting twice the performance of the console, so it's not really something you want to dismiss. And yes, I did "glance" at the article, which is why I noticed the broken methodology that I pointed out.

Yes, reading is knowledge. Like, say, reading the name of the distribution platform you're criticizing...
When I talk about optimization and hardware, I mean the software the developers use to realize their vision; if hardware was all that mattered, then these newest titles would look absolutely next-gen on PC instead of just a little better-looking with a SOLID double frame rate -- and that's with a GTX 780!

But developers don't put any priority on those high-end GPUs, which is why so few settings make a noticeable difference between the platforms. Draw distance may be better, among other things, but all of these settings can be made better or worse by the lack of software optimization.

The 750 Ti is much more powerful than a PlayStation, but no game has actually shown this; like Communism, there's what you see on paper and there's what you actually get. You end up lowering the resolution, textures, and other visual settings to sub-PlayStation levels just to lock in a higher frame rate.

Tomb Raider was upgraded to Ultra PC settings on the PlayStation 4, with several extra visual additions over the PC as well, because the PS4 is much, much more powerful than the PlayStation 3 and its software was better optimized for the new platform.

You can have any game at 1080p, but if the textures are badly designed and the mechanics are jittery, then resolution doesn't really matter (except to justify an overpriced GPU).
That's why Sony's exclusives are constantly in the mix for, or are usually the recipients of, GOTY awards at several online magazines, like this one.
No matter how powerful a GPU is, you still can't beat optimized software.
 

KisaiTenshi

New member
Mar 6, 2014
45
0
0
Steven Bogos said:
Far Cry 4 creative director Alex Hutchinson thinks it's weird that people can enjoy a retro pixel game, yet complain about resolution in AAA titles.

All of those resolution rows, like the one over MGS V: Ground Zeroes [http://www.escapistmagazine.com/news/view/132275-MGSV-Ground-Zeroes-to-Run-at-1080p-on-PS4-Only-720p-on-Xbox-One]? Hutchinson told Official Xbox Magazine that the reality is that people just don't really care if a game is 1080p or not.
Utter BS.

I want to mention two things:

a) The average person isn't actually aware that a game runs at 1080p, because the benefit of 1080p doesn't extend past the GUI. Look at a 1080p JRPG or visual novel, where the 2D assets are 1080 pixels tall, and then go look at your average 3D game.
b) The average person doesn't get 1080p television broadcasts.

So therefore the average person has likely never experienced actual 1080p at all. When you play a game that is scaled to 1080p from some other resolution (e.g. a retro 320x240 pixel game), you are playing a scaled game with that aesthetic, rather than relying on the monitor to scale it with all its wonderfully awful fuzziness. The few people who have experienced real 1080p on a console are those who play 2D games, not 3D ones. PC players, of course, have been able to play 1080p versions of games that weren't 1080p on last-generation consoles (and still aren't on today's consoles), but can you really say there is an improvement?

For the most part there isn't, because the increased resolution doesn't benefit anything but the GUI unless texture resolution also increases, and textures are typically capped at 2K x 2K (requiring 16MB of video memory per texture). The average 3D model doesn't really improve with larger textures without a corresponding increase in polygons. When you go from 240p to 480p to 720p to 1080p, there is no increase in polygons or texture sizes. It doesn't work in reverse either: you can't scale a 1080p screen down to 720, 480, or 240 and gain anything, since the GPU still had to render it at 1080p.
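(As a quick sanity check of that 16MB figure - a back-of-the-envelope sketch, assuming an uncompressed 32-bit RGBA texture with no mipmaps; compressed formats and mipmap chains would change the number.)

# Back-of-the-envelope check of the "16MB per texture" figure above, assuming
# an uncompressed 32-bit RGBA texture (4 bytes per pixel) and no mipmaps.
width = height = 2048                      # a 2K x 2K texture
bytes_per_pixel = 4
size_bytes = width * height * bytes_per_pixel
print(size_bytes / (1024 * 1024), "MB")    # -> 16.0 MB of video memory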

That is ultimately what I feel the developer is getting at, wrong-headed as it is. Developers realize that most people have 1080p screens but don't really see much difference beyond the 2D GUI elements. That's the fault of the developer building the game's assets with that assumption. It would be SO much more work to put realistic-looking high-poly models in a game, and then you could only have two or three of them on screen, because as soon as a model is more than ten virtual feet away, most of that detail isn't visible anyway and you have to switch to the low-poly version. Loading time is a problem for all games, so they favor loading time and don't have high-poly models to begin with.
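(A minimal sketch of the kind of level-of-detail switch described above; the distance thresholds and model names are made up for illustration, and real engines pick them per asset and usually blend between levels.)

def pick_model(distance_feet: float) -> str:
    # Choose which version of a character model to draw based on distance.
    if distance_feet <= 10.0:        # close enough that high-poly detail is visible
        return "soldier_high_poly"
    if distance_feet <= 40.0:
        return "soldier_medium_poly"
    return "soldier_low_poly"        # far away: most of the detail wouldn't be visible anyway

for d in (5.0, 25.0, 150.0):
    print(f"{d:>6.1f} ft -> {pick_model(d)}")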

Your average game made in 2014 looks no different than a game made in 2006 because of this.
 

Darkbladex96

New member
Jan 25, 2011
76
0
0
devotedsniper said:
It's not that it's running at 720p which is the annoying bit; it's the fact that we're in 2014 with brand-spanking-new consoles that can't even achieve 1080p at 30fps, from what we've seen so far.

1080p should really be the minimum spec by now; I don't know of a TV or monitor you can buy that doesn't do 1080p (excluding small mini-screens for things like car DVRs).

Oh well, either way it doesn't really affect me as I play on PC, but it's still disappointing to know console gamers aren't getting that standard.

Going to be honest, I would have expected the "throw everything at it and hope it copes" phase to be over by now with the new consoles (they are a year old, after all). As a developer I get the idea of "oh, it's got way more RAM now so we can be a bit less careful", but come on, inefficient code is inefficient, and you would be surprised by how much of a difference doing something a little differently can make to the performance of a program.

I've reduced SQL queries with 20-minute runtimes on stupidly complex structures down to 20 seconds just by adding a single line of code; needless to say, our customer was happy. The point is, if you write code efficiently, or have someone check it over, you would probably find all these 720p/30fps news stories would be next to non-existent. You'd be surprised how well this tag-team coding style works.
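(To illustrate the kind of one-line fix being described - a minimal sketch, assuming the slow query was doing full table scans for want of an index; the table, column names, row counts, and timings here are hypothetical, not from the post above.)

import sqlite3, time

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 ((i, i % 10_000, i * 0.5) for i in range(500_000)))

def timed_lookup() -> float:
    # Time a single aggregate query against the table.
    start = time.perf_counter()
    conn.execute("SELECT SUM(total) FROM orders WHERE customer_id = 1234").fetchone()
    return time.perf_counter() - start

before = timed_lookup()   # full table scan
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")  # the "single line"
after = timed_lookup()    # indexed lookup
print(f"without index: {before:.4f}s, with index: {after:.4f}s")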
Exactly. It's not the consoles, it's the devs being inefficient as hell with their resources.
 

Aaron Sylvester

New member
Jul 1, 2012
786
0
0
I think Ubisoft knows exactly what it's doing here. Blurting out fucking retarded statements in rapid succession makes for easy clickbait on news sites and easy advertising for their upcoming games. Quite clever.
 

devotedsniper

New member
Dec 28, 2010
752
0
0
Darkbladex96 said:
Exactly. It's not the consoles, it's the devs being inefficient as hell with their resources.
At least someone gets it!

No, the consoles aren't groundbreaking, technologically speaking, but they do have a tremendous amount of power compared to the old consoles. It's just that right now we're stuck in the usual phase that comes with every new generation: "I'm going to chuck everything at this new beast and pray it copes."

I'm just disappointed they can't even achieve 1080p at a playable 30+ fps when I and other PC gamers are starting to consider 4K and 1440p as viable resolutions.
 

WildFire15

New member
Jun 18, 2008
142
0
0
Strazdas said:
WildFire15 said:
I see no real excuse for Next-Gen systems not doing 1080p. The Wii U does it happily, so why shouldn't XB1 and PS4?
This is incorrect. Too often I hear people praise the Wii U by saying that it runs games at "1080p" while the other two consoles can't. This is NOT true. Nearly all games on the Wii U run at UPSCALED 1080p (fake 1080p). Do some research and you'll see that only two AAA titles run at native 1080p (real 1080p), those being Wind Waker HD and Rayman.
What resolutions are the games upscaled from? Usually 720p, but it can be as low as 600p. Fucking 600p!
You know what this means? There are more native 1080p games on the PS4 and Xbox One than there are on the Wii U. Many more, in fact.
Nintendo often distorts the truth about the technical aspects of their games; they like to pretend that upscaled 1080p is the same as native 1080p.
Goes to show you how little I care that I haven't researched this as much as you have. I'd much rather play and enjoy a game than worry about such things.
 

Rozalia1

New member
Mar 1, 2014
1,095
0
0
He's right, you know. The IGC makes a big deal out of a lot of things, and will twist what's happening if it doesn't fit their narrative.

Ultimately, Ubisoft is a performer the IGC hates, but that means exactly nothing. Look at all the hated performers of the past who kept drawing big crowds no problem. What's that? They can't do good stories? Can't do good graphics? Gameplay? They draw regardless, which shows how much that matters.
 

pilar

New member
Jul 7, 2014
59
0
0
devotedsniper said:
Darkbladex96 said:
Exactly. It's not the consoles, it's the devs being inefficient as hell with their resources.
At least someone gets it!

No, the consoles aren't groundbreaking, technologically speaking, but they do have a tremendous amount of power compared to the old consoles. It's just that right now we're stuck in the usual phase that comes with every new generation: "I'm going to chuck everything at this new beast and pray it copes."

I'm just disappointed they can't even achieve 1080p at a playable 30+ fps when I and other PC gamers are starting to consider 4K and 1440p as viable resolutions.
Shadow Fall and Second Son have an unlocked frame rate, and I get the feeling that The Order: 1886 will have one too; but all of them are reduced to 900p for some reason. If Uncharted 4 manages 1080p/60fps, then there won't be any excuse for publishers.

And it's not like the other platforms are doing any better; Digital Foundry had a hard time hitting Ultra at 1080p/60fps with a GTX 780 in Lords of the Fallen, and let's not even get started on The Evil Within.
 

Fdzzaigl

New member
Mar 31, 2010
822
0
0
Seriously? What a crappy argument. Just because only a small fraction of gamers (compared to the total) actively debate gaming online doesn't mean no one really cares about 720p vs 1080p anymore.

I'll say this: I'm not a graphics whore. I often play older games with crappy graphics, and things like the general style and feel of a game are much more important to me than graphical quality.

But when I play a game in 1080p versus 720p I definitely see and feel the difference, and if I CAN play at a higher resolution I obviously will.

It's hugely ironic that with this new generation of consoles, the same devs who were whoring over graphics at the cost of gameplay suddenly start downgrading to make their unoptimized products run on the consoles they're releasing on, while claiming no one cares about such things anymore.
 

Evonisia

Your sinner, in secret
Jun 24, 2013
3,257
0
0
The irony of the developers of the series that gave us Far Cry 1 talking about visuals not mattering.

I don't care whether it's 1080p or not; they tried forcing it on us for years, and now so many of our games on the 360/PS3 are 30fps, which is sad given that they're capable of higher.
 

Patathatapon

New member
Jul 30, 2011
225
0
0
To be honest, I think they have a bit of a point. Maybe around 10%-20% of the buying public cares about this sort of thing. What you see everywhere on Reddit, The Escapist, etc. is called the vocal minority. The people who speak up about their issues are not a majority of gamers. It's the same with the people who insult EA, and it's why EA still makes a lot of money: most people don't give a shit about the politics, they just want to play games (yes, this includes casual gamers).


I'm of the opinion that as long as I can see what I'm doing and the gameplay works, it's fine. I don't care if Call of Duty is in 240p as long as I can see what is important, not the blades of grass and their shadows. 720p is fantastic to me because my sight is bad enough that I can't detect anything better. If it lets them focus more on the gameplay and the actual fun in a game, then I say graphics are an acceptable sacrifice.


You may now tell me why my opinion is wrong. If you like better graphics that's fine, but you are not part of a majority, just a vocal minority.


EDIT: One other thing I'd like to mention: most people probably wouldn't be able to tell the difference unless it was side by side, either. Someone ought to do a study on that...
 

SecondPrize

New member
Mar 12, 2012
1,436
0
0
I think it's weird that someone who works in an industry that has always been an arms race to better and shinier graphics would say such a thing. You don't spend decades training your consumers to buy a game or system because it offers better graphics, and then suddenly expect them to stop caring when your budgets get out of control, by pointing at works from studios that literally identify as independent.
 
Sep 13, 2009
1,589
0
0
Well, I suppose I'll be one of the few people who agree with this statement. If there's any area whatsoever where I'd want a game to make cuts, it would be resolution. It's not like these games are running at 720p because the developers decided that's the better way to go; it's because they had to make cuts due to the power of the device, the budget, whatever. It's a different case when they lock it at 30fps and 720p, of course, but I think it's kind of ridiculous to complain about a game only running at 720p on a console that can only run it at 720p.

Then again, I spend most of my time playing games on consoles from two or more generations ago, and I still think games like Metroid Prime look far better than the vast majority of titles that come out nowadays, even at an "atrocious" 480p.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
Patathatapon said:
Vocal minorities don't cause companies to lose more money than they've ever lost in living memory (such as Microsoft, which has reported as much), ergo that statement is false. They (the various companies that act this way) have lost money on some of the worst decisions, marketing, and products in their area. Even if they retain stability, it has been noted that they did suffer for it, and probably will again. Really, the asinine PR comments, the inability to back them up, the back-pedaling... it's kind of inevitable.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Lightknight said:
No, it doesn't. The areas outside of your focal point are blurry in real life too. The reason there's blur outside the focal point is that we lose focus the further we get from the focal point. That's literally why it's called the focal point: it's where you're focusing your sight, and everything else is out of focus (aka blurry).

The focal point is the point that can resolve images the best. So... who gives a shit about everything outside the focal point? If you succeed at catering to the focal point, everything else is as accommodated as it's going to get.

Again, here is how it goes:

1. If you can't resolve pixels with your focal point, you can't resolve pixels with non-focal vision.
2. If you can't resolve pixels with your non-focal vision, you may STILL be able to resolve pixels with your focal point.

So again, why do you think non-focal vision is relevant to this discussion? The focal point is the only thing that matters when discussing the resolution of the screen. In what way do you think people would do anything differently to account for the parts of your vision that you're not focusing on?
Sigh. I think you are misunderstanding me. I'll try to use Paint; maybe that will make it clearer.



Do you see now why that graph presents unrealistic information? Do you see why I brought up peripheral vision, which is where the majority of the screen sits?



pilar said:
When I talk about optimization and hardware, I mean the software the developers use to realize their vision; if hardware was all that mattered, then these newest titles would look absolutely next-gen on PC instead of just a little better-looking with a SOLID double frame rate -- and that's with a GTX 780!

But developers don't put any priority on those high-end GPUs, which is why so few settings make a noticeable difference between the platforms. Draw distance may be better, among other things, but all of these settings can be made better or worse by the lack of software optimization.

The 750 Ti is much more powerful than a PlayStation, but no game has actually shown this; like Communism, there's what you see on paper and there's what you actually get. You end up lowering the resolution, textures, and other visual settings to sub-PlayStation levels just to lock in a higher frame rate.

Tomb Raider was upgraded to Ultra PC settings on the PlayStation 4, with several extra visual additions over the PC as well, because the PS4 is much, much more powerful than the PlayStation 3 and its software was better optimized for the new platform.

You can have any game at 1080p, but if the textures are badly designed and the mechanics are jittery, then resolution doesn't really matter (except to justify an overpriced GPU).
That's why Sony's exclusives are constantly in the mix for, or are usually the recipients of, GOTY awards at several online magazines, like this one.
No matter how powerful a GPU is, you still can't beat optimized software.
The newest titles do look next-gen; they have for years, on PC that is. The new consoles are weak, weaker than most PCs, so obviously they can't show off much. Compare games from two years ago on PC and on consoles: there's a massive difference in graphics. PC was "next gen" before the consoles were.

Sure, anything can be made better with software optimization; you could make a Commodore 64 run Crysis, though not in a playable state, and it would likely cost billions. It's much easier to just use a more powerful GPU for better textures and the like... oh... wait... consoles don't have that.

And once again you start your smear campaign, when the 750 Ti has been shown to play the same games at better settings and an identical resolution/framerate to the PS4. Once again, please stop spreading falsehoods.

Here is your beloved Digital Foundry proving you wrong on the Tomb Raider falsehood: http://www.eurogamer.net/articles/digitalfoundry-2014-tomb-raider-definitive-edition-next-gen-face-off


WildFire15 said:
Goes to show you how little I care that I haven't researched this as much as you have. I'd much rather play and enjoy a game than worry about such things.
I don't know; I care about my hobby, and I care that I see unblurred visuals. Maybe you don't.