Realistic Graphics Are Broken

Rad Party God

Party like it's 2010!
Feb 23, 2010
3,560
0
0
There's already a game series centered on talking to demons, it's called Shin Megami Tensei and it's even older than Doom (Strange Journey might be even uglier than Doom, but I absolutely love that game).

But nitpicking aside, I kinda get your point and, to a certain extent, I agree: I like good aesthetics more than pretty technology with orgasmic physics systems. It never hurts to couple them both (try not to picture that in your head), but relying on technology just for the sake of it is when things get lazy and generic.
 

TheRightToArmBears

New member
Dec 13, 2008
8,674
0
0
Irridium said:
Oblivion came out in 2006. And while the graphics were nice, they ended up not mattering since the art-direction was so bland it just looked boring. Also in addition to the graphics, it also touted its "fully voiced NPC's"

Yeah, fully voiced. Only one line of unique dialog between them and all the rest is shared. This isn't even mentioning the ELEVEN voice actors for their world of "100+ NPC's"
Welp, you're right about the year it was released. Still, if anything, surely the art direction helps to prove my point? The focus was more on graphics than design, which Silentpony was claiming was a more recent thing.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Touting graphics is a foolish strategy? That and quality games are practically the only strategies consoles have. The ability to have fantastic graphics means that the system is powerful and will be able to play amazing games in the future. There's no downside to that.

I've got a few complaints here, but I want to preface this by saying I do agree with several things you're saying. I especially like the recommendations towards the end of the article. But generally this is not graphics being broken; it's the improper use of graphics that's broken. My complaints are mainly with your article title and the first seven or so paragraphs, just before you get into ludonarrative dissonance (something I consider to be lazy game-mechanic brainstorming). Anyways, here I go:

Complaint 1
Saying "the uncanny valley" isn't a reason to throw out the pursuit of crossing it. The point of it being a valley and not an asymptotic wall is that there's another side. It IS being crossed, and this next step in advancement should be enough to get us more firmly on the other side. We've already managed to produce truly beautiful human forms in the current generation, and this will end with a much better platter for us. You can't look at games like L.A. Noire that convey realistic human facial expression and think, "Well, we'd better just give up because, well, uncanny valley!" No, we started hitting the uncanny valley in much earlier consoles and have been working through that valley ever since. Now we have the ability to develop realistic humans that are at least on the attractive side of the equation. We got darn close to crossing it in the current generation and honestly may have in some games. The more powerful machines will allow for better AI. Those wives who didn't behave like wives? That's AI processing, not graphics, and both are made better by this.

Complaint 2
As for production costs, those only increase if the production companies allow them to increase. Let me explain how budgeting is supposed to work in a major company:

1. How much can we anticipate to make in revenue (total money taken in) if we make this game? (this is called forecasting)
2. How much do we want to make in profit (revenue - expenses) out of the anticipated revenue?
3. Revenue - profit = expense (how much is spent on development, marketing and everything else that creates the game and gets it into the hands of the gamer).
4. Of the anticipated expense for the game, where do we allocate the money? (This is called budgeting.)
5. Sticking to the budget.

Companies usually mess up at 1 or 5. There's room to mess up at any step, but 1 or 5 will end in those spectacular failures. If they anticipate making X in revenue and producing an ultra-high-graphics game costs X+1, a bad company goes ahead with it anyway (these are the companies that make an RPG and forecast it making COD money). A good company either says no or scales back the cost to produce a solid game that may not be ULTRA high quality.
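The forecast-then-budget order of operations above can be sketched in a few lines. All the figures here are made up for illustration; the point is only that the expense ceiling is derived from forecast revenue minus target profit, and a "bad company" is one that greenlights a project whose projected cost exceeds that ceiling:

```python
# Step 1: forecasting (hypothetical figure, in dollars)
forecast_revenue = 60_000_000
# Step 2: desired profit
target_profit = 20_000_000
# Step 3: revenue - profit = expense ceiling
expense_budget = forecast_revenue - target_profit  # 40,000,000

# Step 4: budgeting — allocate the expense ceiling (percentages assumed)
allocation = {"development": 55, "marketing": 35, "distribution": 10}
budget = {dept: pct * expense_budget // 100 for dept, pct in allocation.items()}
print(budget["development"])  # 22000000

# Step 5: sticking to it — the X vs. X+1 test from the post
projected_cost = 45_000_000
greenlight = projected_cost <= expense_budget
print(greenlight)  # False: scale back or say no
```

A "good company" in this sketch is simply one that treats `greenlight` as binding instead of proceeding and hoping revenue catches up.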

Additionally, and this may have been missed, but developers have been struggling with fitting ever better graphics into archaic machines for a few years now. Skyrim pretty much scraped the bottom of the PS3's barrel thanks to its ridiculous asset categories getting bloated by the game's assets. Having a platform that can handle more graphics will allow them to loosen up and not have to SUPER fine tune certain assets. That cuts costs. The systems have gone to x86 environments and so are MUCH easier to develop for than this past generation. You're talking about even larger savings. Hypothetically, this move may have saved them a ton of money.

Also, we should expect to see engines being made that other developers will be able to utilize, just like the Unreal Engines or Source. Engines can be a real driving force behind technology, built incrementally in a way that sidesteps the cost of making lavish technological advances every time a new title is made.

So "costs" is a cop out. Anyone who spends more on something than they can make on it deserves to not succeed there. Hopefully someone else will pick up the IP and treat it right.

Complaint 3
Graphical quality doesn't just play into character realism. You get more vibrant and realistic environments. Water behaves more accurately and lighting is more realistic. I anticipate we may see a sort of digital-exploration market (like Myst or Zork) crop up again if something like the Oculus Rift catches on. Processing power may be extremely necessary for this stuff. I strongly recommend looking into the kind of tech we're beginning to see; it's remarkable.

Pretty cool

Want to explore Pandora? Sure, why not? To render it in real time you need these advances.

Discussion
I believe this discussion (again, down to around paragraph 7 in the article) is about something other than good graphics. I think there's a lot of dissatisfaction with publishers failing to put the story first. That isn't graphics' or technology's fault. That's the publishers' fault. These indie developers aren't showing that graphics can't enhance games. They're showing that even simplicity can be better than a graphically superior game that sucks as a story.

Small, simple games like Angry Birds also expanded the market in a way AAA games can't. Not because AAA games aren't as good, but because it's the difference between reading a novel and reading a magazine. One is a long-term project while the other is something you can just jump into and out of at will. But that doesn't make the magazine articles any less engrossing. I just got done playing Thomas Was Alone. Holy heck if it isn't simple but AMAZING.

When people argue about better graphics, it just seems to me like they're angry about there being a more capable platform, even though that has nothing to do with how developers use it. The PS4 and the Xbone are machines whose sole purpose is to provide a blank canvas. This is just a bigger canvas allowing for more detail. You can still hand paint on it. It'll be the publishers' job to decide what works for them. At that point the decision will be on their shoulders, not the consoles'.

In any event, if graphics are Pandora's box, it's already been opened. Sticking with today's graphics isn't going to make bad publishers stop doing business the way they're doing it. This is a step in the right direction, and it benefits no one to stand still. I hope for a day when graphics can't get much better and everyone has a free game engine that lets them make what they want and have it look as realistic or artistic as they want. On that day, good graphics won't sell a game, because everyone can have good graphics; only good stories will survive. That's the day we'll have arrived.
 

AldUK

New member
Oct 29, 2010
420
0
0
Right now I am replaying Icewind Dale thanks to GOG and I am completely immersed. Graphics, in my mind, do not equal immersion at all. Story, environment, music and intuitive gameplay create immersion. The most immersed I have been in a modern title was in Far Cry 3, which I was very impressed with, but that was an exception rather than the norm. The push for hyper-realistic graphics has most definitely damaged games, and the real kicker? The recent booming success of the independent game scene proves that we, the consumers, are not exactly chomping at the bit for the 5 billionth lip-pixel. Just make good games and everybody's happy.
 

Dr.Awkward

New member
Mar 27, 2013
692
0
0
You know, as a challenge, I'd like to see Epic demonstrate UE4 using a very "unrealistic" cel-shaded style that's colorful and somewhat neon-tinted. None of that dirty grungy stuff we're always seeing - Just pastels and solid colors, maybe a little texturing, just to give it some sort of material look, and then use characters that just aren't proportionally realistic and give them qualities that make them stand out.
 

ResonanceSD

Guild Warrior
Legacy
Dec 14, 2009
4,538
5
43
Country
Australia
I'm agog. This was a truly excellent article. Keep up the great work, CI!!!


Now to E3, where the two major players will go "THANKS TO PC GRAPHICS CARDS, OUR GAMES WILL BE SHINIER THAN THEY'VE EVER BEEN!"
 

Shamanic Rhythm

New member
Dec 6, 2009
1,653
0
0
SecretImbecile said:
Frankly, you can blabber on about the uncanny valley for as long as you wish, but we're not even close to it yet, and thus I feel the point of this article is voided. Stylised visuals in games are not used to avoid the uncanny valley, but simply to mask the imperfections of the technology they use. If this were true, all games would be doing this, and yet the majority of AAA games push for photorealism.

The uncanny valley is not something that should be avoided, or treated as an impassible chasm. The next generation consoles are continents away from the uncanny valley, and when we finally reach the cragged edge, only by pressing onwards will we ever reach the other side. Carpe Diem.
I think you've missed his point. It's not that no one should ever try to cross the uncanny valley, but that the production cost of doing so, particularly with present-day technology, is crippling to a game. It's all well and good to make the in-game characters look lifelike, but all it takes is one moment where they start moonwalking against an invisible wall because the AI routine has hit its limits, and all suspension of disbelief disappears.

When you're playing a game, or indeed witnessing any kind of art form, your brain is subjecting it to a lot of scrutiny. The cortex subjects it to rational analysis while the amygdala forms emotional responses. Now, the emotional responses come faster, but they can be overridden by the cortex if it comes up with a more rational interpretation. This is basically an evolutionary principle designed to save our asses by encouraging swift reaction unless the slower rational processing can come up with something.

The key issue as it relates to game production is that the less realistic and more stylised the graphics are, the less likely the cognitive response is to break the immersion by highlighting an inconsistency in what you're perceiving - because the brain isn't subjecting it to the same scrutiny as it would a real-life image. The cortex knows how a human behaves, so the moment one starts walking upside down, Fallout: New Vegas style, it concludes it's not real, and any emotional responses being maintained because you've been 'tricked' into thinking it's a survival situation disappear. As Robert points out, the more the graphics go up, the more the programmers have to keep up to make sure everything within that world behaves consistently. Hence, blocky games like Minecraft can vividly simulate an entire world, while a graphics-heavy game like Call of Duty can only simulate a series of corridors and reasonably expect people to accept the experience as real.

So while it's relatively accepted (although sometimes debated as to whether it should be reprogrammed) that dirt blocks don't obey gravity in Minecraft, when it happens in a game like Call of Duty you get this kind of reaction from people:
OT: Really well-written article, I fully agree.
 

takemeouttotheblack

New member
Apr 4, 2013
61
0
0
DVS BSTrD said:
Realistic Graphics aren't broken, the games that rely on them are.
Very nicely put. It's always nice to look at something that's aesthetically pleasing, but that doesn't directly translate to these new uber-realistic graphics; look at the success of the Wii and, more widely, Nintendo games. Moreover, the fact of the matter is that I already spend far too long customising the look of a character whom I'm going to see mostly from the back or wearing a helmet, so having even more customisation over really small details like the number of nose hairs he has really isn't what I'm looking for.
 

Andy Farren

New member
Jan 22, 2013
34
0
0
Just watched a video review (by one of your rivals) of 'The Last of Us'. They say a big problem with the game is the way your AI compatriots react sometimes, and how the realistic setting makes this all the more jarring. They specifically mention how, when your character is trying to sneak by enemies, your sidekick wandering into their field of view doesn't alert them the way you would. So, kudos.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
Mimsofthedawg said:
I don't know why people are so hell bent on believing this next gen is going to be defined by graphics. Between a mediocre at best GPU and a noticeable similarity between current and next gen graphics, they're hardly the focus of the competition now. Now it's the utilization of engines and technology, with better lighting, better particle effects, better physics, etc., as well as more robust mechanics ranging from deeper "morality systems" to greater consequences for our actions and more dynamic worlds. These are things that have me and developers excited, whether a developer decides to add hyper realistic graphics on top of that simply sweetens the deal.
That's all subsequent generations are defined by, increases in average gaming computing technology means that the gaming tech increases. The ability to create stories existed before computers were capable of helping us tell them. One generation to the next is generally the incremental improvement in hardware that provides a better platform for the story.

Graphics is just one of many features that more advanced computing power allows. You may scoff at the PS4 having 8GB of GDDR5, but not only is that fancier RAM than DDR3, it's also 16 times what's in the PS3, and the PS3 is capable of playing an easily recognizable (albeit scaled-down) version of Skyrim. The CPU/GPU and everything else are many years more advanced, and believe me when I tell you that the PS3/360 have been holding the market back technologically.
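The 16x figure checks out if you count both pools of the PS3's split memory (256 MB of XDR main memory plus 256 MB of GDDR3 video memory) against the PS4's 8 GB of unified GDDR5:

```python
# Sanity check on the "16 times the PS3's RAM" claim.
ps3_ram_mb = 256 + 256   # XDR main memory + GDDR3 video memory, split pools
ps4_ram_mb = 8 * 1024    # 8 GB GDDR5, unified

print(ps4_ram_mb // ps3_ram_mb)  # 16
```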

What more advanced computing means is more dynamic AI, more realistic lighting and water effects, more realistic landscapes. But then again, maybe not; maybe the developer doesn't want realism so much as robust, dynamic environments. That's what this allows for. There is so much more you can do with more under the hood than just graphics. Those complaints in Skyrim about AI pathing and the wife not behaving as a wife? More processing can help achieve the level of detail required to change that. As far as I'm concerned, complaining about graphics might as well be complaining about processing, because that's all that allows the graphics to be rendered better or worse.

But again, we are marching forward towards a day where the hardware can render pretty much whatever you want and where even indie developers have game engines that allow them to develop games that look like top notch AAA titles. On that day, games cannot succeed by being eye candy. Eye candy will be normal. The only thing they can get by on is a meaningful story that draws people to it. It's a tech race with a finish line and we need to encourage that. We'll still have games that don't rely on fantastic graphics to tell meaningful stories. All this does is allow developers that want to produce eye candy to be able to do so and that's not something to be afraid of. That's what some people want to have.

But this is silly. Publishers that don't understand the importance of a good story aren't being held hostage by graphics. They're just bad publishers. People here are essentially blaming the carpenter's tools for the problem when it's really just the carpenter. Better tools may help a carpenter create something they couldn't before, but the quality still comes down to their knowledge of the trade. Stop scapegoating advancing tech and start blaming publishers for not understanding where to put the budget.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
DVS BSTrD said:
Realistic Graphics aren't broken, the games that rely on them are.
Exactly right. It's like blaming a very powerful car for a driver who thinks going really, really fast compensates for not steering. *grumble grumble* "Fast cars are broken!" No, that kind of driver is. Fast cars do cool things in the right hands.
 

cookyt

New member
Oct 13, 2008
126
0
0
Maybe it's not as tough as you think it is. Creating more immersive controls for a player is less about actually putting in the work to make the game smarter, and more about tricking the player into thinking it is. For example, playing Zork (and other text-based adventure games) is an immersive experience for me because it feels like I have more fine-grained control over the flow of the game. This is because it gives me, the player, an actual voice via the text system. When you look closely, the system isn't much better than other, more conventional control schemes, but it feels better. So maybe we should be optimizing not for better control systems (the technology is still far off), but for more deceptive ones.
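The "deceptive controls" idea can be sketched in a few lines. This is not Zork's actual parser, just a toy example: many different phrasings collapse onto a tiny set of real actions, yet typing them freely makes control feel fine-grained. The verb list and stopwords are invented for illustration:

```python
# Many surface forms, very few underlying actions.
SYNONYMS = {
    "take": "take", "grab": "take", "get": "take", "pick": "take",
    "look": "look", "examine": "look", "inspect": "look", "x": "look",
    "go": "go", "walk": "go", "run": "go", "head": "go",
}
STOPWORDS = {"the", "a", "an", "up", "at", "to"}

def parse(command):
    """Reduce a free-form command to an (action, object) pair."""
    words = command.lower().split()
    # first recognized verb wins; anything unrecognized falls back to "unknown"
    verb = next((SYNONYMS[w] for w in words if w in SYNONYMS), "unknown")
    # everything that's neither a verb nor filler is treated as the object phrase
    noun = " ".join(w for w in words if w not in SYNONYMS and w not in STOPWORDS)
    return verb, noun

print(parse("grab the brass lantern"))  # ('take', 'brass lantern')
print(parse("examine the lantern"))     # ('look', 'lantern')
print(parse("pick up the lantern"))     # ('take', 'lantern')
```

The engine only ever sees three actions here, but the player gets to phrase a command a dozen ways, which is exactly the "feels smarter than it is" effect being described.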
 

DrunkOnEstus

In the name of Harman...
May 11, 2012
1,712
0
0
I'd rather see more work put into sound. Ever since I got 5.1 headphones, I've been dying for the "immersion" that could be offered by truly dynamic sound effects and the illusion of real positional awareness. Maybe vary the walking sounds (and other actions) based on the surface you're stepping on better than games do right now. Oh, and music. Not enough games these days have memorable music. Everyone knows the Mario music, the Zelda theme, and the "ahh"s of the Halo intro, and they bring floods of memories. I can really only point to God of War off the top of my head as music that floods me with memories of a game from this gen (and it's really a PS2-gen thing).
 

Kahani

New member
May 25, 2011
927
0
0
Good points about the problems with developers focussing too much on realistic graphics rather than realism as a whole. However, I think your opening missed the mark a bit:
Robert Rath said:
Microsoft and Sony sit in the wings like gladiators waiting for the gates to rise, and the weapons they will battle with... are graphics.
The thing is, Microsoft and Sony are not, in fact, developers. It doesn't matter how great an idea more realistic game mechanics might be; there's really nothing they can do about it. They just provide the hardware, and it's up to the developers to worry about how to make their games immersive. Hardware manufacturers focus on graphics simply because it's a very easy measure of how powerful their hardware is. There's no point in trying to advertise a console based on its ability to display branching dialogue trees and high-quality music, because that's all just taken as a given. If we want more immersive games with more realistic mechanics, it's the developers we need to talk to, not the hardware manufacturers.
 

Headsprouter

Monster Befriender
Legacy
Nov 19, 2010
8,662
3
43
That old article about DOOM is quite funny to look at nowadays. Especially the writer's suggestion of making friends with demons. FUCKING DEMONS!! And where are the green lizards in DOOM? I have played the entirety of the original DOOM and have not seen a single green lizard.

Besides that little tidbit I decided to focus on, I totally agree with this article. Cartoonish styles make a game more long-lasting visually and make it unique. It's the reason Timesplitters 2 still looks kind of nice. But not Timesplitters 1...that game was ugly.
 

Piorn

New member
Dec 26, 2007
1,097
0
0
I just find it sad when people claim "realistic graphics" is equivalent to "good graphics"; it's not.
Good-looking graphics are the goal, and "realism" is just one way to get there.

The ludonarrative dissonance has only in recent years become gradually apparent to me personally, but that might be due to the enemies in the games I played not being human most of the time. It bothers me, though, when I have to question and justify my actions in the game. Why do I have to kill everything? I find it ok to be the bad guy, just give me a proper motivation!
I remember the first time I ever noticed it was in Tomb Raider: Underworld, which I got really cheap in some sort of Steam sale. In the very first level, the bad guy steals a small golden statue from Lara, and she proceeds to gun down an entire boat full of seamen. I stopped playing there.
 

Whytewulf

New member
Dec 20, 2009
357
0
0
Good article. I have always thought a more interactive environment is way more important than arm hair. Think about MMORPGs as an example. I think WoW is still doing well because it isn't realistic; it allows you to escape. I am not sure I want to escape with real-looking people having real-life problems. Show me a fluffy big purple bunny with a jellybean gun, blowing up boxes and opening every door to find something, and I am there.