Strazdas said:The mistake in this is that you assume Ubisoft wants to write optimized code. They have demonstrated multiple times that they have no intention of having their games optimized. After all, according to them, you should just buy a better graphics card for badly optimized ports [http://www.videogamer.com/pc/assassins_creed_4_black_flag/news/ubisoft_provides_statement_on_ac4_pc_optimisation_proud_of_pc_version.html]
Agreed as well on that clarification. It really is a problem of expectations built up over the years. Similar to how people expect increasingly huge gains in computing power from upgrading hardware, there's a growing disconnect between what systems "should" be capable of and what they manage to present. I feel that's not just because of the creators' overblown dreams, but because of the lack of quality in the underlying systems created to support that dream appearance.Windknight said:Again, I agree with you up to a point, but this obsession with graphics is something devs/publishers have pushed themselves. They've used uber-graphical dick waving as a selling point, and stoked up the faction who obsess about this kind of thing... and now are trying to backpedal when they're not keeping up with the expectations they've created.
Yeah, considering the context, the quote just comes off as someone realizing that people aren't happy with the finished product on display, and rightfully so. Ubisoft needs to be held responsible for its actions, and this is not the way to do it. Third parties' attitudes as of late can best be described as inept; they keep making painfully obvious mistakes, thinking that things will be different. It's no wonder Yahtzee chose "Let's all laugh at an industry, That never learns anything, tee hee hee" as his jingle.Windknight said:Again, I agree with you up to a point, but this obsession with graphics is something devs/publishers have pushed themselves. They've used uber-graphical dick waving as a selling point, and stoked up the faction who obsess about this kind of thing... and now are trying to backpedal when they're not keeping up with the expectations they've created.
As someone who has owned a 50" plasma TV for about 5 years now, I can tell you that one of the reasons we bought it was that it had noticeably less input lag than LCD/LEDs of the same period (except for those that cost twice as much as our plasma... those had about the same or a little less). I do think this is accurate: older plasmas had lag and were subject to significant burn-in, later plasmas were not. Demon's Souls would leave noticeable image retention on my screen after a few hours of playing, but most plasmas have a screen-burn protection mode that wipes that out in a few minutes. Also, after about 6 months, burn-in on a plasma is a lot less likely, and while image retention does occur pretty quickly, it also goes away pretty quickly. We've left unmoving screens up on ours for hours, even overnight, and we have no permanent retention.Jiggle Counter said:I've SEEN plasma TV input lag, but I've never suffered it on mine. I once thought it was a myth or a technical fault in only a few brands or versions of plasma screens. Then I watched it happen on YouTube, and holy crap it was awful. It was almost as if they were streaming it out through the internet and then back through the TV.Strazdas said:Plasmas will always look great because, well, they are plasmas. They are also something that will make you bankrupt from electricity bills, and they cannot be used for gaming due to massive input lag. Oh, and don't let them burn in; it does not go away, ever.
The color quality, though, almost makes it all worth it.
Regardless, I have the feeling that it only applies to older plasma screens. I bought mine only 2 years ago, so maybe they've figured out a fix for it.
With the burn-in effect, I can say that it does happen. Turn off the TV and there's the HUD, still staring you in the face. It does go away, albeit half a month later.
The electricity bill is totally worth the image quality.
Think he should start projecting images in 1080p?Zeriu said:This guy is clearly talking out of his ass, and is projecting a bad image on his company for doing so.
Plasma lag exists because of how plasma physically works. Yes, new TVs have started doing predictive processing to minimize it, which works most of the time but is still quite bad. It's more likely that you just got used to the input lag after using that TV as much as you have, since you own it.Jiggle Counter said:I've SEEN plasma TV input lag, but I've never suffered it on mine. I once thought it was a myth or a technical fault in only a few brands or versions of plasma screens. Then I watched it happen on YouTube, and holy crap it was awful. It was almost as if they were streaming it out through the internet and then back through the TV.
Regardless, I have the feeling that it only applies to older plasma screens. I bought mine only 2 years ago, so maybe they've figured out a fix for it.
With the burn-in effect, I can say that it does happen. Turn off the TV and there's the HUD, still staring you in the face. It does go away, albeit half a month later.
The electricity bill is totally worth the image quality.
I see you continue telling lies. Shadow of Mordor works fine at 60 fps on high settings on my GTX 760, thank you.pilar said:Ask the PC user with a mid-range GPU; you can't have the shiny new graphics if you want a solid 60 frame rate, especially when The Evil Within and Shadow of Mordor either lock it to 30 or can't even run on a GTX 760 at 1080p30 on Very High settings.
It would help if people stopped thinking of PC vs. console: from their internal architecture to their hardware, they're completely different machines -- it's not even close, unless you'd say a horse is related to a cat because both are quadrupeds (four legs).
"I dont care if a game is well programmed, i care if its fun, immersive or well programmed"GoodNewsOke said:I'm one of those who don't care about resolution. 1080? 720? pfft. The game needs to be fun, immersive, well designed, that kind of thing.
So it does not matter what the resolution is, as long as the resolution is high? Yes, I agree.cikame said:As far as consoles and TVs go, yes, as long as the image is clean and clear with little aliasing, it really doesn't matter what the resolution is,
This graph is incorrect. It is based on our focus point and assumes we are blind outside it, which we are not. Our focus point, while having higher fidelity than the rest of our vision, covers only about 2% of our total visual field.Lightknight said:http://www.rtings.com/images/optimal-viewing-distance-television-graph-size.png
If this were true, only the uneducated would buy 4K TVs. Considering that monitors are, functionality-wise, BETTER than TVs (which is why they cost more, not the other way around), buying a monitor if it were cheaper than a TV would be a no-brainer.Well, the good news is that he was talking about TVs and not monitors, where you only sit a couple of feet away, and there are 4K monitors on the market for reasonable prices (compared to $1,500+ for 4K TVs).
Is this supposed to be a comparison? I can see the difference from 3 meters away by looking at it on a 21" 1080p screen. The difference is jarring.Mr_xx said:As for screen size, viewing this image on your smartphone, tablet or laptop, the image will look identical even at a really short distance. (Unless of course your phone has a 20" display.)
The graph in question is bogus and unrepresentative of reality. [quoted graph image omitted]
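For what it's worth, charts like that come from a basic visual-acuity calculation rather than any claim that we're blind outside the fovea. A minimal sketch of the math (assuming the standard 20/20 figure of roughly one arcminute of resolving power and a 16:9 panel; the function name is just illustrative):

```python
import math

def max_useful_distance_m(diagonal_in, horiz_px, acuity_arcmin=1.0):
    """Distance beyond which a single pixel subtends less than the eye's
    assumed resolving power (~1 arcminute for 20/20 foveal vision)."""
    # screen width in metres, derived from the diagonal of a 16:9 panel
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
    pixel_m = width_m / horiz_px
    # small-angle approximation: pixel_m / distance = acuity in radians
    acuity_rad = math.radians(acuity_arcmin / 60)
    return pixel_m / acuity_rad

for label, px in [("1080p", 1920), ("4K", 3840)]:
    print(f'{label}: ~{max_useful_distance_m(50, px):.1f} m on a 50-inch set')
```

By that estimate a 50-inch 1080p panel is already "pixel-perfect" from about 2 m away, and 4K only adds visible detail if you sit closer than that, which is roughly what the disputed chart shows.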
Anti-aliasing is a procedure where you generate the image at a higher resolution and then scale it down to the monitor's native resolution, thus removing aliasing. Some techniques do not generate the whole image at a higher resolution but only parts of it, if the game engine supports it, hence why they often take fewer resources. Of course, there are the likes of TXAA, which do not do anti-aliasing but instead blur the image to hide aliasing. TXAA should be banned, because it actively makes image quality WORSE while using more resources than not using any AA at all.It's important to keep in mind that 3D applications also employ other means to achieve a 'better' image, e.g. anti-aliasing, which is sometimes less resource-consuming than using a higher resolution.
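To make the supersampling variant described above concrete, it really is just "render big, average down": each block of rendered pixels is averaged into one output pixel, which smooths jagged edges. A minimal sketch (a NumPy array stands in for an actual rendered frame; the names are illustrative, not from any real engine):

```python
import numpy as np

def ssaa_downsample(frame, factor=2):
    """Supersampling AA: given a frame rendered at factor-times the target
    resolution, average each factor-by-factor block into one output pixel."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor,
                         w // factor, factor, c).mean(axis=(1, 3))

# Hypothetical usage: a 4K-sized render box-filtered down to 1080p output.
hi_res = np.random.rand(2160, 3840, 3)        # stand-in for a 2x render
frame_1080p = ssaa_downsample(hi_res, factor=2)
```

Techniques like MSAA cut the cost by only taking the extra samples along geometry edges rather than across the whole frame, which is the "only parts of it" point above.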
The answer to the first paragraph has to remain confidential.Shingeki no Gingerbeard said:Unless you work for the industry, or have access to financial data, you don't know how much anything costs. This includes both basic optimization and that ludicrous $400 million budget you pulled from some mule's rectum.
It does sound, however, like you're of the mind that a company should just spend money without any thought to cost-effectiveness. Businesses, under normal economic conditions, behave rationally. If it costs more money to optimize a port, then the port should cost more than the original. Imagine paying $60 for Sunset Overdrive on XBO, and $70 for a PS4 or PC port in another year or so.
What you, and most people here, are ignoring is that it's perfectly legitimate for a company to want to standardize their game's experience across multiple platforms, especially if they rely on cross-platform sales and don't lock themselves into exclusivity deals. There's no sense in burning bridges by showing which platform is inferior to the competition. It's just bad business.
A 780 Ti > Titans anyway, unless you really, really need that 6GB of VRAM (as in, you're gaming on 3-monitor setups and the like).devotedsniper said:By that definition I need Titans in SLI, as up until recently a 780 was one of the top dogs... which is probably true, because Watch Dogs still ran terribly even after upgrading from my 660 Ti. Saying that, though, it's not just Ubisoft; none of the big names seem capable of the simple 1080p standard which we PC gamers have taken for granted for years.
Yep. I'd rather we froze the polygon count for a while and instead increased the framerate, the resolution, and other technological aspects like ray tracing. It's rare to see someone who thinks like that, though; most people go "buy teh 30,000 polygons in Star Citizen".wAriot said:"No one cares about resolution"
"60fps is not that important"
"OH BUT WE HAVE DA BES GRAFIX!!1!"
I am going to ask one of those stupid questions: Am I the only one who would prefer a 2004 (or earlier) level of graphics (textures, number of polygons, aliasing) along with high resolution and framerate, instead of, well, what we have now?
I really can't play a game if the fps drops under 50, and 720p looks like complete crap to me, especially if I'm playing on a goddamn 40" TV.
Strazdas said:I see you continue telling lies. Shadow of Mordor works fine at 60 fps on high settings on my GTX 760, thank you.pilar said:Ask the PC user with a mid-range GPU; you can't have the shiny new graphics if you want a solid 60 frame rate, especially when The Evil Within and Shadow of Mordor either lock it to 30 or can't even run on a GTX 760 at 1080p30 on Very High settings.
It would help if people stopped thinking of PC vs. console: from their internal architecture to their hardware, they're completely different machines -- it's not even close, unless you'd say a horse is related to a cat because both are quadrupeds (four legs).
It would help if you stopped spreading misinformation. Console architecture this generation is the same as PC's: both the PS4 and Xbox One are x86 machines.