No One Really Cares About 1080p, Says Far Cry 4 Dev

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Strazdas said:
Lightknight said:
http://www.rtings.com/images/optimal-viewing-distance-television-graph-size.png
This graph is incorrect. It is based on our focus point and assumes we are blind outside it, which we are not. Our focus point, while having higher fidelity than the rest of our vision, covers only about 2% of our total visual field.

Another thing to note is that resolution is the true form of antialiasing, because aliasing exists due to insufficient resolution. This means that raising resolution even beyond what our eyes can resolve is HIGHLY BENEFICIAL to the experience.
The focal point is the only thing that matters with resolution. Our ability to resolve two points drops dramatically outside of our focal point so the information is based only on the absolute clearest portion of our vision.

Since the question is at what distance the average eye can tell the difference between resolutions, our point of focus is the only pertinent information.

Were the chart to somehow take into account non-focal vision then you may even have to be inches away from the TV in some cases to resolve pixels. Addressing anything but our focal field is entirely pointless for any kind of discussion on resolution.
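For what it's worth, the geometry behind those viewing-distance charts is easy to sketch. A minimal Python version, assuming roughly 1 arcminute of acuity for a normal eye and a 16:9 panel (both round-figure assumptions on my part, not numbers taken from the chart):

```python
import math

def max_resolvable_distance(diagonal_in, horiz_px, vert_px, acuity_arcmin=1.0):
    """Distance (inches) beyond which adjacent pixels blur together,
    assuming the given visual acuity in arcminutes."""
    aspect = horiz_px / vert_px
    # screen width from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    pixel_pitch = width_in / horiz_px            # inches per pixel
    theta = math.radians(acuity_arcmin / 60.0)   # acuity angle in radians
    return pixel_pitch / math.tan(theta)

# A 50" 1080p TV: pixels stop being resolvable beyond roughly 6.5 ft,
# about where charts like the rtings one place the cutoff.
print(max_resolvable_distance(50, 1920, 1080) / 12)
```

Swap in 3840x2160 and the cutoff distance halves, which is the whole 4K-at-couch-distance argument in one line.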
 

Atmos Duality

New member
Mar 3, 2010
8,473
0
0
Aiddon said:
Everyone notice how suddenly devs are claiming stuff like 60 FPS and 1080p don't matter, when before this current gen they were claiming such things were important and that they needed new hardware for them? Looks like reality is rearing its ugly head and now they're trying to move the goalposts.
Yeah, all of that talk about moving towards photo-realism (which actually does require higher-resolution), and suddenly it evaporates.

Apart from the obvious points that everyone else has said, methinks there's a change in the winds here, and an even more conservative attitude is descending from on high; all the way down to the devs even.

Dialing back the rate of technical progress could be the result of companies feeling less bold; which isn't out of the realm of possibility, between AAA's exploding production costs and the generally more negative attitude from consumers.
Corners are going to get cut somewhere.
 

Godhead

Dib dib dib, dob dob dob.
May 25, 2009
1,692
0
0
Man, how many "PR Blunder of the Year" candidates are there now for Ubisoft?
 

pilar

New member
Jul 7, 2014
59
0
0
Atmos Duality said:
Aiddon said:
Everyone notice how suddenly devs are claiming stuff like 60 FPS and 1080p don't matter, when before this current gen they were claiming such things were important and that they needed new hardware for them? Looks like reality is rearing its ugly head and now they're trying to move the goalposts.
Yeah, all of that talk about moving towards photo-realism (which actually does require higher-resolution), and suddenly it evaporates.

Apart from the obvious points that everyone else has said, methinks there's a change in the winds here, and an even more conservative attitude is descending from on high; all the way down to the devs even.

Dialing back the rate of technical progress could be the result of companies feeling less bold; which isn't out of the realm of possibility, between AAA's exploding production costs and the generally more negative attitude from consumers.
Corners are going to get cut somewhere.
Not to mention how demanding it is for even a high-end GPU to run High Quality @ 60 locked.
 

Signa

Noisy Lurker
Legacy
Jul 16, 2008
4,749
6
43
Country
USA
I'm far more concerned about the native resolution than actual 1080p. It just so happens that I have a 1080p TV for all my gaming, so that's what I want my games to run at.

You know, I seem to remember a bunch of hype and pressure from all the visual media industries to get a 1080p TV...
 

Homey C-Dawg

New member
Oct 20, 2014
14
0
0
It's important when they can use it as a selling point and it's not important when they can't pull it off. Devs be so funny.
 

andri88

New member
Oct 28, 2014
5
0
0
Homey C-Dawg said:
It's important when they can use it as a selling point and it's not important when they can't pull it off. Devs be so funny.
Yup, of course I want 1080p, but even more I want the devs to stop lying to our faces in this regard.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
pilar said:
[HEADING=3]You're calling DigitalFoundry [http://www.eurogamer.net/articles/digitalfoundry-2014-shadow-of-mordor-face-off] a liar?[/HEADING]
"You can try pushing your luck by pushing texture detail higher - for example, we tried running high quality textures on a 2GB GTX 760 working in combination with the 30fps lock option."

So 1080p High Quality @ 30 FPS Lock. Hmm... welcome to the Playstation 4.

And if the architecture is the same, then what new AAA title takes advantage of each platform? If developers are so familiar with them, then why does each look only marginally better with some seriously different hardware? (The Evil Within is capped at 30 on the PC, just like the PlayStation 3.)

Do you know how pathetic it is for the consoles to challenge the 760?
Well, if it quacks like a duck, swims like a duck and flies like a duck, it's very likely a duck.

The mere fact that they locked the FPS on a PC and then thought that would be an actual test proves they are not fit to conduct tests.

You ask for games when you already gave an example - Shadow of Mordor.
Here's some more:
Black Flag (AC4)
Battlefield 4
Titanfall
The Crew

As far as The Evil Within goes, you can actually unlock the framerate and resolution on PC. It's horribly optimized, though.

Yes, I know how pathetic it is for a console to think it can work like a 760 when a 750 beats it easily.


Lightknight said:
The focal point is the only thing that matters with resolution. Our ability to resolve two points drops dramatically outside of our focal point so the information is based only on the absolute clearest portion of our vision.

Since the question is at what distance the average eye can tell the difference between resolutions, our point of focus is the only pertinent information.

Were the chart to somehow take into account non-focal vision then you may even have to be inches away from the TV in some cases to resolve pixels. Addressing anything but our focal field is entirely pointless for any kind of discussion on resolution.
Incorrect. The focal point is important, but so is everything else. This assumes that we see the WHOLE SCREEN in our focal point, which, as I said, is wrong. In fact, if you take the whole eye into account, we can see the equivalent of over 500 megapixels of resolution. Taking 2% of our vision and assuming we see the whole screen in that 2% is unfair. In reality our focal point shifts multiple times per second to different parts of the screen. It is actually medically unhealthy to use a computer in such a way that the focal point does not shift, because that strains your eye muscles beyond their normal use.

Res Plus said:
He's right - 4K's where it's at now, at very least 1420p.
1420p? That's an odd resolution; most are 1440p. Where did you get that?
 

kickyourass

New member
Apr 17, 2010
1,429
0
0
I agree with him, at least; I may be in a minority here, but as long as it runs at a consistent framerate, I honestly cannot emphasize how little I care about anything else outside the actual gameplay. If Far Cry 4 went back to the graphics level of the original Far Cry, I wouldn't care, as long as it was still as fun as Far Cry.
 

prpshrt

New member
Jun 18, 2012
260
0
0
Ugh. I wish the console industry would just back the fuck off developers. It's not anyone else's fault that they make sub-par systems. If anything, maybe they should pour more money into R&D and design cards that can actually run games at 60 fps and 1080p... Or maybe they shouldn't have gone with ATI graphics in the first place and just gone with Nvidia.
 

wildstyle96

New member
Oct 7, 2014
14
0
0
prpshrt said:
Ugh. I wish console industry would just back the fuck off developers. Not anyone's fault they make sub-par systems. If anything, maybe they should pour more money into R&D and try to design cards that can actually run games at 60 fps and 1080p... Or maybe they shouldn't have gone with ATI graphics in the first place and just gone with nvidia.
I think it's more a problem of optimization than of who manufactures the graphics hardware; just look at games that came out at the beginning of last gen versus ones that came out at the end, like GTA V.
Either they can't figure out how to code for these new consoles, or they'd rather not bother and just take maximum profit on terribly optimized games; I think we know the answer here.

In other news, I had a couple of friends over a month or two back; two console gamers were among them. When I showed them BF4 and Titanfall running on my PC they both openly exclaimed how amazing the graphics looked.

When I first finished building this PC and ran War Thunder at 120 fps, I could visibly see the responsiveness, smoothness and other benefits. When tanks came out, still unoptimized, the game ran at 30 fps and I struggled to drive the tanks due to the unresponsiveness at that framerate; now that it's running at 60 fps I'm able to play properly.

On resolution: the fact that it's a struggle to read text and HUDs in games running at 720p on a 1080p 50" TV is one of the reasons games NEED to run at 1080p; we're nearly a decade into having this resolution massively available, and it's been the standard for quite some time.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Strazdas said:
Lightknight said:
The focal point is the only thing that matters with resolution. Our ability to resolve two points drops dramatically outside of our focal point so the information is based only on the absolute clearest portion of our vision.

Since the question is at what distance the average eye can tell the difference between resolutions, our point of focus is the only pertinent information.

Were the chart to somehow take into account non-focal vision then you may even have to be inches away from the TV in some cases to resolve pixels. Addressing anything but our focal field is entirely pointless for any kind of discussion on resolution.
Incorrect. The focal point is important, but so is everything else. This assumes that we see the WHOLE SCREEN in our focal point, which, as I said, is wrong. In fact, if you take the whole eye into account, we can see the equivalent of over 500 megapixels of resolution. Taking 2% of our vision and assuming we see the whole screen in that 2% is unfair. In reality our focal point shifts multiple times per second to different parts of the screen. It is actually medically unhealthy to use a computer in such a way that the focal point does not shift, because that strains your eye muscles beyond their normal use.
Resolution only matters to the focal point because it's your focal point that resolves pixels best. If you can't resolve pixels with your focal point then you absolutely wouldn't be able to do so outside of your focal range, whereas if you can't resolve pixels outside of your focal point you may still be able to do so with your focal point. Right now, while you're looking at the screen, keep your focus on it but pay attention to the objects you see outside of it. Note that they're likely blurry, or certainly not as clear as where your eyes are focused.

So the focal point is the only relevant component to evaluate when addressing resolution. If your focal point is catered to, then everything else will work too. Again, as far as resolution is concerned.
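Both sides of this can be put in rough numbers. A quick sketch, assuming a ~2° foveal field and a 16:9 screen (assumed round figures, not anyone's measured data), of how much of the screen width the fovea actually covers from a typical couch distance:

```python
import math

def fovea_coverage(diagonal_in, distance_in, fovea_deg=2.0):
    """Fraction of the screen's width that falls inside the foveal
    field at a given viewing distance (small-angle sketch, 16:9)."""
    aspect = 16 / 9
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect ** 2)
    # linear width subtended by the foveal cone at this distance
    fovea_width_in = 2 * distance_in * math.tan(math.radians(fovea_deg / 2))
    return fovea_width_in / width_in

# 50" TV viewed from 8 ft: the fovea spans under a tenth of the
# screen width at any instant, hence the constant saccades.
print(fovea_coverage(50, 8 * 12))
```

Which arguably supports both posts at once: only the fovea resolves fine pixel detail, but it samples the whole screen by constantly moving.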
 

Bombiz

New member
Apr 12, 2010
577
0
0
Slegiar Dryke said:
*looks at the growing echo chamber in the comments and shrugs* I kinda agree with the guy. Is something fun/interesting/new/etc., i.e. enjoyable to play, mysterious to explore, or exhilarating to blow stuff up? If so, count me in, and screw the resolution and the frame rate. I only got a TV capable of 720 and 1080 in the past year or two, and I still play anything from my SNES to my Wii games on it. As for PC stuff, I'm not gonna kill my hardware to push "quality", or break games to force them to run faster than they do.
Yes, but to say that "no one cares about 1080p" is just false. We're talking about how Ubi keeps putting its foot in its mouth. Also, you agree with the guy that people don't care? Because I think you're vastly wrong about that.
 

vector_zero

New member
Mar 18, 2009
25
0
0
I think some people in here have got the right idea. It's not so much the 720p/30 frames that is the issue; it's more a problem of setting expectations. It's like telling a girl you've got a 55-inch TV and then taking her home when all you've got is a 32. Still more than big enough for a pleasant viewing experience, but that expectation is broken and will sour any experience with said device, game, etc.
 

Slegiar Dryke

New member
Dec 10, 2013
124
0
0
Bombiz said:
Yes, but to say that "no one cares about 1080p" is just false. We're talking about how Ubi keeps putting its foot in its mouth. Also, you agree with the guy that people don't care? Because I think you're vastly wrong about that.
I "kinda" agree with the guy on one very particular point, which I outlined in my original post, so I'll say it again here, because it's what's important to my opinion on the "issue":

"Is something fun/interesting/new/etc., i.e. enjoyable to play, mysterious to explore, or exhilarating to blow stuff up? If so, count me in, and screw the resolution and the frame rate."

To summarize things further: I don't care about resolution and framerate, and I don't care about people's opinions on the matter, because they're just that, opinions. The argument of objectivity vs. subjectivity, and whether people care or not, can be shouted from the rooftops, but until it becomes something that actually affects me, my opinion and expectations will always ride on whether the game is FUN. And you don't need pretty pictures to have fun.
 

WildFire15

New member
Jun 18, 2008
142
0
0
While exact resolutions and FPS don't concern me so long as I have a consistent, smooth gameplay experience with a nice aesthetic, I see no real excuse for next-gen systems not doing 1080p. The Wii U does it happily, so why shouldn't the XB1 and PS4?
 

pilar

New member
Jul 7, 2014
59
0
0
Strazdas said:
pilar said:
[HEADING=3]You're calling DigitalFoundry [http://www.eurogamer.net/articles/digitalfoundry-2014-shadow-of-mordor-face-off] a liar?[/HEADING]
"You can try pushing your luck by pushing texture detail higher - for example, we tried running high quality textures on a 2GB GTX 760 working in combination with the 30fps lock option."

So 1080p High Quality @ 30 FPS Lock. Hmm... welcome to the Playstation 4.

And if the architecture is the same, then what new AAA title takes advantage of each platform? If developers are so familiar with them, then why does each look only marginally better with some seriously different hardware? (The Evil Within is capped at 30 on the PC, just like the PlayStation 3.)

Do you know how pathetic it is for the consoles to challenge the 760?
Well, if it quacks like a duck, swims like a duck and flies like a duck, it's very likely a duck.

The mere fact that they locked the FPS on a PC and then thought that would be an actual test proves they are not fit to conduct tests.

You ask for games when you already gave an example - Shadow of Mordor.
Here's some more:
Black Flag (AC4)
Battlefield 4
Titanfall
The Crew

As far as The Evil Within goes, you can actually unlock the framerate and resolution on PC. It's horribly optimized, though.

Yes, I know how pathetic it is for a console to think it can work like a 760 when a 750 beats it easily.
Didn't think anyone would ever mention Black Flag and PC in a positive context, as frame rate is its only real advantage over the consoles; you've still got to use Ubisoft's much-beloved launcher Uplay to play it.

That's why most PC users will avoid that hassle altogether and just slip the disc into a console and play = +1 consoles

Battlefield 4 is indistinguishable between the PlayStation and the $400 780 = +1 consoles

A 750 Ti is more power than the PlayStation; that is, if the game ever gets optimized, or you'll have to lower the textures considerably to double the frame rate. Same for the 760; but you never even glanced at the article linked in the last post, so I wouldn't expect you to know this.

Reading = Knowledge