Consoles Are Holding Gaming Back

Dragonbums

Indulge in its whiffy sensation
May 9, 2013
3,307
0
0
mike1921 said:
Dragonbums said:
mike1921 said:
Dragonbums said:
mike1921 said:
Dragonbums said:
I'm waiting for the Oculus Rift, since that seems like an actual innovation that's actually superior to the old system.
I'd bet one hundred fucking dollars that if Nintendo had come up with the Oculus Rift, everyone would whine about it being the ultimate gimmick and point to the failed versions from the 1990s.

Seriously.

The fact that your biggest gripe with motion controls is that you have to move is laughable.
I'm also willing to bet you'd look more like an idiot using the Oculus Rift than the Wii could ever make anyone look.

Motion controls: a bit more immersion, achieved by letting you perform actions that mimic your in-game counterpart's.
Everyone here calls them a gimmick and complains about how much their arms hurt (I've seriously never had this issue using the Wii).
Arms tiring out also doesn't help our image in the physical fitness department.

Oculus Rift: a headset that surrounds your vision with the game to fully immerse you in the videogame world. It still requires you to use a controller (to my knowledge).
It gets called the revolution of gaming.
And yes, not being able to see the outside world can break you out of immersion.
Why? Because depending on your environment, if you can't tell who is around you in the real world, you can get into accidents.

I'd take that bet. You know why? No one seems to have thought the DS was a shitty gimmick, at least not for most of its life cycle, and I'm fairly certain it was the first console to use a touch screen. That's because the dual-screen system is actually useful and the motions are relatively small, even if they're bigger than on normal controllers. It amazes me how people take Wii hate as Nintendo hate when Nintendo came out with TWO consoles within 12 months of each other and people seemed to love the DS.

Who cares how you look? Did I ever say my problem with the Wii is how you look?

Saying that inherent problems with a type of system are laughable is itself a laughable statement.

Motion controls wreck your immersion by causing sensations that make you think about your body more. It's not about your arms hurting; you feel the weight of your arms moving. Controllers, and keyboard and mouse, are superior because the motions are almost all less than an inch, so you're more likely to forget you're using a controller at all. Maybe an Oculus Rift-type system will make motion controls more viable by making your sensations feel more like the character's, but with a screen at a distance like that, you can't have so many sensations outside the screen tugging at your immersion. Also: Wii motion controls aren't perfect. When you make a big motion and it doesn't work the way it should, whatever immersion you'd built up gets sent through the paper shredder.

Yes, you still need to use a controller... or maybe keyboard and mouse would work. I don't see why revolutions need to change control schemes, and we'll get a different control scheme that isn't shit one day. I give it 10 years, though.
And having headgear on your head, and moving your head around, doesn't break you out of immersion?
Because if you're surrounded by the visuals of a videogame, you have to be very conscious of your surroundings, which could be even more immersion-breaking than just swinging your arms around.
The Oculus Rift and the Wii's motion controls are both gimmicks in every sense of the word.

The "gimmick" with the Oculus Rift is that you have 360-degree vision of the game environment.
The "gimmick" of the Wii is that your motions in real life affect the motions in the game.

Both of them bring something new to the gaming experience.
I'm not even going to bother with touch screens because that was going to be a mainstream thing anyway.

Motion controls are a "gimmick" because we labeled them as such.
The Oculus Rift is "innovation" because we labeled it as such.
Someone else could twist that around and claim the opposite, and you couldn't really counter that point.

Personally I think motion controls and the Oculus Rift both bring something new to the table, and I have no ire toward either of them.
But I think it's pretty unfair to claim that one is the gimmick and the other is innovation based on a company name.
No, moving your head doesn't really break your immersion, because all you're seeing is the game world; it doesn't make you perceive much outside of it. I'm assuming the headgear is light; if it's heavy, it'll likely break you out. It takes a lot more to break your immersion when all you see is the game. Maybe moving your head will break you out, but when the game reacts to it, and all you see is the game, it's easier to perceive your sensations as the character's. That seems much easier to stay immersed in than making full swings or wrestling with a motion sensor.

To me, the difference between a gimmick and an innovation is that an innovation is a clear improvement. Gimmicks are novelty for novelty's sake. A gimmick is a dildo you can use as a pen; an innovation is a dildo that vibrates. In terms of core games the Wii controller is a gimmick. Maybe it's innovative for casual games, but those aren't really the games you get immersed in.

Saying it's because of the company name is a strawman. Like I said, people love the DS. They love it. Saying touch screens were GOING TO BE a mainstream thing is irrelevant; the DS was the first console to do it, and in 2005 touchscreens weren't ubiquitous. Past innovation is still innovation, whether it caught on or was ignored.
In your opinion.
The Oculus Rift is only as innovative and revolutionary as the audience deems it.
It can easily be brushed off as a gimmick.

How many people would care about the Oculus Rift?
How many people would want to use it?
How many people can't use it?

Some people have motion sickness.
Others can be easily disoriented.
Other players are just put off at the prospect of being surrounded by gaming visuals and losing any sort of awareness to the outside environment.

The Oculus Rift can very easily become the next Wii.
Everyone will get it because you're "surrounded" by the game, and then by the end of the generation the fad will fade and people will move on to something else.
I don't see future consoles utilizing the Oculus Rift heavily, much the same way the Wii U scaled back motion controls.
I'm well aware that innovation is subjective. That's why I'm arguing the superiority of the Rift in terms of core games. The way I see it, the reason core gamers mostly rejected the Wii is that it wasn't truly beneficial to most people. Yes, it can be brushed off, but it seems to me that gamers have fairly reliable taste when it comes to things that make it into the limelight.


This all started with you talking about how PC kills innovation; I don't see why the focus would be on how much consoles utilize the Rift. I don't care how much consoles use it. If it's successful on PC, that might cause the consoles to pick it up, though.
The only thing I said even remotely about PC in this thread was that it can never be the magnum opus of gaming most people claim it to be, because a PC is first and foremost supposed to be a "jack of all trades."
People not only use PCs for various activities (in my case, digital illustration), but they may also have computers that are far older and cheaper, never upgraded, because as far as they're concerned their PC still meets their daily needs.

I don't think I was the one who said PC is holding back gaming.
Where did I state this?
What I did say is that what's holding gaming back is the game devs themselves, with their complacency: staying on the safe side, never finding new ways around limitations, preferring to play with the newest toys and doing the same old same old. Mainly due to the suits more than anything.
 

Dragonbums
Vegosiux said:
Dragonbums said:
Other players are just put off at the prospect of being surrounded by gaming visuals and losing any sort of awareness to the outside environment.
And this, this is my main reservation about it. My brain has trouble enough dealing with the sensory input the "current" reality is handing me. Bringing a "different" reality into the equation, confusing my brain into thinking that one is "real" too (because, in the end, what makes it real or not is precisely the ability of your brain to filter through sensory input) is something I really do not wish to subject myself to. With the added quirk that the "current" reality won't wait for me while I'm "fully immersed" in the other one.

I doubt any human being can use VR equipment for an extended period of time without going a tad bonkers.
I hope the Oculus Rift takes heavy cues from the Virtual Boy. I'm very certain it was more than the red-and-black color scheme that put people off the thing so heavily.
Many people suffered from headaches and other symptoms from the immersion. And that wasn't even an "alternate" reality.
 

mike1921

New member
Oct 17, 2008
1,292
0
0
Dragonbums said:
The only thing I said even remotely about PC in this thread was that it can never be the magnum opus of gaming most people claim it to be, because a PC is first and foremost supposed to be a "jack of all trades."
People not only use PCs for various activities (in my case, digital illustration), but they may also have computers that are far older and cheaper, never upgraded, because as far as they're concerned their PC still meets their daily needs.

I don't think I was the one who said PC is holding back gaming.
Where did I state this?
What I did say is that what's holding gaming back is the game devs themselves, with their complacency: staying on the safe side, never finding new ways around limitations, preferring to play with the newest toys and doing the same old same old. Mainly due to the suits more than anything.
Fuck, I thought you were someone else.

Being a jack of all trades isn't really that much of a problem. I see no reason why a PC being able to do digital illustration would take away from its ability to play a game, just like consoles being able to play DVDs isn't really a problem.

The fact that some people have older PCs seems fairly irrelevant to me. So not everyone can play every game; so what? The people who want to play the games will upgrade. If you want to play games on PC, then your old PC doesn't meet your daily needs.
 

Dragonbums
mike1921 said:
Fuck, I thought you were someone else.

Being a jack of all trades isn't really that much of a problem. I see no reason why a PC being able to do digital illustration would take away from its ability to play a game, just like consoles being able to play DVDs isn't really a problem.

The fact that some people have older PCs seems fairly irrelevant to me. So not everyone can play every game; so what? The people who want to play the games will upgrade. If you want to play games on PC, then your old PC doesn't meet your daily needs.
It is relevant, because this is the very reason people would rather buy a console than spend money upgrading their computer.
And the build-your-own-computer argument is invalid in most cases, because most people don't have the technical know-how to build one efficiently, and most don't want to deal with it.

Another thing: my doing digital art does matter, because the programs on my computer, and the files that result from them, take up memory.
Which is just as big a deal.
Games are not the only thing occupying my computer's memory; the programs and files that come with my work take it up too.
Not to mention the processing power it takes to do my art efficiently without lagging.
My laptop is old, and I have no desire to spend money on an upgrade. I will upgrade my computer when it no longer works. That is the mentality of many computer users.
If I want to play games that badly, I will buy a gaming console.
Simple as that.
 

Negatempest

New member
May 10, 2008
1,004
0
0
Okay, so wait a moment, wait a moment. Has the OP even looked at the top sellers on Steam? About 80% of the games on that list are hardly graphics-intensive like Metal Gear Solid 4. Games succeed on gameplay over graphics every single time. I'd rather pop in a fresh copy of Crash Bandicoot 3: Warped than play a game like Skyrim that's filled to the brim with bugs. T^T I really hate starting over from 10 minutes ago, again and again, in these recent western games because of annoying glitches and bugs. Can't I get one good solid game completion without worrying about a possible game-breaking bug? I can look past a zombie stuck in a wall; I can walk right by that. I hate getting stuck myself, though. T^T
 

faefrost

New member
Jun 2, 2010
1,280
0
0
The OP has it completely and utterly backwards. Consoles aren't holding back gaming through a lack of graphical improvements; improved graphics are what hold back and ultimately damage gaming. As some above have said, it's a cost-return calculation. The higher the graphical fidelity, the higher the cost, and the problem is that cost escalates far faster than any actual benefit to the paying customer. This is why there are so few games out for the Wii U: Nintendo was unprepared for just how shockingly development time and costs rise for HD games. This is why we're left with so few, ginormous, and ultimately evil AAA gaming companies these days. Games cost too much to make because of the graphics. They are starting to cost more than can reasonably be recouped from their niche customer base. And that is what is killing gaming. That is what has led to virtually every evil we have endured: flatly overpriced games, evil DRM schemes, buffering revenue streams via DLC, and selling customer metadata.

And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor, God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks somewhat better because it was genuinely tweaked to use all of the PS3's power, but it adds so little to the experience over its predecessor that the overall fun factor and value are the same for either version. Yet 3 probably cost 4x more to make.
 

mike1921
Dragonbums said:
mike1921 said:
Dragonbums said:
mike1921 said:
Dragonbums said:
mike1921 said:
Dragonbums said:
mike1921 said:
Dragonbums said:
I'm waiting for the Oculus Rift since that seems like an actually innovation that's actually superior to the old system.
I bet people one hundred fucking dollars that if Nintendo were to make up the Oculus Rift everyone would whine about how it is the ultimate form of gimmicks and point them to the failed versions of the 1990's.

Seriously.

The fact that your biggest gripe with motion controls is that you have to move is laughable.
I'm also willing to bet you will look more like a idiot using the Oculus Rift, then the Wii could ever make anyone.

Motion Controls- more bit of immersion into the game by attempting to allow you to use actions that will mimic your in game counterpart.
Everyone here calls it a gimmick, and complains about how much their arms hurt (I've seriously never had this issue using the Wii)
Arms tiring out also doesn't help our image in the physical fitness department.

Oculus Rift- A head gear set that allowed you to have your vision surrounded by the game to fully immerse yourself in the videogame world. Still requires you to use a controller(from my knowledge)
Is called the revolution of gaming.
And yes, if you can't see the outside world that can break you out of immersion.
Why? Because depending on environment if you are unable to detect who is around you in the real world you can get into accidents.

I'd take that bet. You know why? No one seems to have thought the DS was a shitty gimmick, not for most of it's life cycle at least, I'm fairly certain that was the first console to use a touch screen. That's because the dual-screen system is actually useful and motions are relatively small, even if they are bigger than normal controllers. It amazes me how people take wii-hate as nintendo hate when Nintendo came out with TWO consoles within 12 months of each other and people seemed to love the DS.

Who cares how you look? Did I ever say my problem with the Wii is how you look?

Calling the inherent problems of a type of system laughable is a laughable statement in and of itself.

Motion controls wreck your immersion by causing sensations that make you think about your body more. It's not that your arms hurt; you feel the weight of your arms moving. Controllers, and keyboard and mouse, are so superior because the motions are almost all less than an inch, so you're more likely to forget you're using a controller at all. Maybe an Oculus Rift-type system will make motion controls more viable by making your sensations feel like the character's, but with a screen that sits at a distance from you, there are too many sensations outside the screen tugging at your immersion. Also: Wii motion controls aren't perfect. When you make a big motion and it doesn't register the way it should, any immersion you managed to build up gets sent through the paper shredder.

Yes, you still need to use a controller... or maybe keyboard and mouse would work. I don't see why revolutions need to be changes in control scheme, and we'll get a different control scheme that's not shit one day. I give it 10 years, though.
And having headgear on your head, moving your head around, doesn't break you out of immersion?
Because if you are surrounded by the visuals of a videogame, you have to be very conscious of your surroundings, which could be even more immersion-breaking than just moving your arms in a swinging motion.
The Oculus Rift and the Wii's motion controls are both gimmicks in every sense of the word.

The "gimmick" with the Oculus Rift is you have a 360 vision of the video game environment.
The "gimmick" of the Wii is that the motions in real life affect the motions in the game.

Both of them bring something new to the gaming experience.
I'm not even going to bother with touch screens because that was going to be a mainstream thing anyway.

However motion controls are a "gimmick" because we labeled it as such.
Oculus Rift is "innovation" because we labeled it as such.
However, someone else can twist that around and say the opposite, and you can't really counter-argue that point.

Personally I think both Motion controls and Oculus Rift bring something new to the table, and I have no ire towards any of them.
However I think it is pretty unfair to claim that one is the gimmick and the other is innovation because of a company name.
No, moving your head doesn't really break your immersion, because all you're seeing is the game world; it doesn't make you perceive much outside of it. I'm assuming the headgear is light; if it's heavy, it'll likely break you out. It takes a lot more to break your immersion when all you see is the game. Maybe moving your head will break you out, but when the game reacts accordingly, and all you see is the game, it's easier to perceive your sensations as the character's. That seems much easier to be immersed in than making full swings or wrestling with a motion sensor.

To me, the difference between a gimmick and an innovation is that an innovation is a clear improvement. Gimmicks are novelty for novelty's sake. A gimmick is a dildo you can use as a pen; an innovation is a dildo that vibrates. In terms of core games the Wii controller is a gimmick; maybe it's innovative for casual games, but those aren't really the games you get immersed in.

Saying it's because of the company name is a strawman. Like I said, people love the DS. They love it. Saying that touch screens were GOING TO BE a mainstream thing anyway is irrelevant. The DS was the first console to do it, and in 2005 touchscreens weren't ubiquitous. Past innovation is still innovation whether it caught on or was ignored.
In your opinion.
The Oculus Rift is only as innovative and revolutionary as the audience deems it.
It can easily be brushed off as a gimmick.

How many people would care about the Oculus Rift?
How many people would want to use it?
How many people can't use it?

Some people have motion sickness.
Others can be easily disoriented.
Other players are just put off by the prospect of being surrounded by gaming visuals and losing any sort of awareness of the outside environment.

The Oculus Rift can very easily become the next Wii.
Everyone will get it because you're "surrounded" by the game, and then by the end of the generation the fad will fade and people will move on to something else.
I do not see future consoles utilizing the Oculus Rift heavily, much as the Wii U downsized its use of motion controls.
I'm well aware that innovation is subjective. That's why I'm arguing the superiority of the Rift in terms of core games. The way I see it, the reason core gamers mostly rejected the Wii is that it wasn't truly beneficial to most people. Yes, it can be brushed off, but it seems to me that gamers have fairly reliable taste when it comes to things that make it into the limelight.


This all started with you talking about how PC kills innovation; I don't see why the focus would be on how much consoles utilize the Rift. I don't care how much the consoles use it. If it's successful on PC, that might cause the consoles to pick it up, though.
The only thing I said even remotely about PC in this thread was that it can never reach the magnum opus of gaming most people claim it to be, because a PC is first and foremost supposed to be a "jack of all trades."
An individual not only uses a PC for various activities (in my case, digital illustration), but they can also have a computer that is far older and far cheaper and never bother to upgrade, because as far as they are concerned, their PC still meets their daily needs.

I don't think I was the one that said PC is holding back gaming.
Where did I state this?
What I did say is that what's holding back gaming are the game devs themselves, with their complacency: they stay on the safe side, never find new ways around limitations, would rather play with the newest toys, and do the same old same old. Mainly due to suits more than anything.
Fuck I thought you were someone else.

Being a jack of all trades isn't really much of a problem. I see no reason why a PC being able to do digital illustration would take away from its ability to play a game, just like consoles being able to play DVDs isn't really a problem.

The fact that some people have older PCs seems fairly irrelevant to me. So not everyone can play every game; so what? The people who want to play the games will upgrade. If you want to play games on PC, then your old PC doesn't meet your daily needs.
It is relevant because it is for these very reasons that people would rather buy a console than spend money on upgrading their computer.
And the build-your-own-computer argument is invalid in most cases, because most people don't have the technical know-how to build one efficiently and most don't want to deal with it.

Another thing: my doing digital art does matter, because the programs I have on my computer and the files that result from them take up memory.
Which is just as big a deal.
Games are not the only thing occupying the memory of my computer; the variety of programs and files that come with doing my work take it up too.
Not to mention the processing power it takes for me to do my art efficiently without lag.
My laptop is old, and I have no desire to spend money on a computer upgrade. I will upgrade my computer when it can no longer work. That is the mentality of many computer users.
If I want to play games that badly I will buy a gaming console.
Simple as that.
Then... buy a new computer, if yours is that shit? I'm fairly certain you can get a decent gaming computer for $500 (Xbone price; I think we can safely say that if you're comparing it to the PS4, you can easily save $100 through Steam sales and bundles and by not paying $50-$60 a year to play games online).
In fact: http://www.newegg.com/Product/Product.aspx?Item=N82E16883113263
I'm fairly certain that's more than enough to last you for years and years.

I'm assuming by memory you mean storage: large amounts of storage aren't really that big of a deal anymore. External hard drives exist, and as you can see, that computer gives you a terabyte. You could get a 1TB external for $100 if you really want one and don't want to add another internal drive.

Why is the amount of processing power your art takes relevant? You're not gaming at the same time you're doing your art, are you? I guess if you insist on doing them both at the same time that could be an issue, but... that seems pretty fucking weird to me.

Yes, I know that if you want to play games you'll buy a console. I just think that's a horrible mentality, that PCs are infinitely superior for gaming, and that it seems to be evidence of closed-mindedness: "NO, I USE THIS KIND OF DEVICE FOR THIS TASK, I DON'T CARE WHICH IS BETTER, I WANT THE DEDICATED ONE."

faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry, but God of War 3 looks more than slightly better than God of War 2. I could see how somewhere between them you're hitting the point of diminishing returns... but those are still sizable returns.
I mean, look at it: http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game, but that's more because GoW3 was dragged out as fuck; it was pretty much just the ending of God of War 2, with no real respectable reason to have an arc of its own.
 

Laughing Man

New member
Oct 10, 2008
1,715
0
0
PC's are infinitely superior for gaming
Except they aren't. PCs are actually pretty horrible for gaming. That is why getting games to work on a PC requires hardware with nearly twice the spec of a dedicated console.

Oh yes, the final result is indeed superior, but that is the product of years upon years of software, drivers, OS modifications, APIs, and any number of other software go-betweens that have been created to make games actually work on PCs. At the end of the day, it doesn't change the fact that if you built a PC with components of similar spec to a dedicated console, the console would piss all over the PC.

As for consoles holding anything back, that's total crap. A lot of the stuff we see in games nowadays is a direct result of publishers being able to spend large amounts of cash on game development, and that cash has come from consoles bringing gaming to the mainstream. Sure, we can pick out the odd indie budget title that blows us all away, or we can sit here and pretend that the bedroom-developed games of the Spectrum era were all amazing and nothing new is a patch on them, but gaming is where it is because of consoles. Without consoles, gaming would be a minor-league hobby, 'dominated' by small-league developers, and the market would be awash with one decent title for every 1,000.
 

faefrost

New member
Jun 2, 2010
1,280
0
0
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry but god of war 3 looks more than slightly better than god of war 2. Like I could see ow somewhere between them you are hitting the point of diminishing returns....but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game but that's more because GoW3 just was like, dragged out as fuck because it was pretty much just the ending of God of War 2 with no real respectable reason to have an arc of it's own.
Yes, GoW 3 looked better. Things were sharper and clearer, with better sparkly effects. But the game itself was pretty close to exactly the same as GoW 2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end-user experience that they were worth needing, at a minimum, 3x the production staff to create?

This is the same shit we're seeing in Hollywood these days. The costs of producing these lush big-budget summer movies have grown beyond the revenue stream's ability to reasonably recover them, for little practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost tens of millions. But how much did 40 continuous minutes of them bring to the movie? Did movies not communicate much the same experience for a much cheaper price way back in 1980?

For FPS fans: do you get more enjoyment, or the same level, from an online match of the current version of CoD as from Team Fortress 2? Yes, CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects it piles on the industry? At what point do we finally walk away from the old paradigm of "better graphics always = better games"?
 

mike1921

New member
Oct 17, 2008
1,292
0
0
Laughing Man said:
PC's are infinitely superior for gaming
Except they aren't PC's are actually pretty horrible for gaming. That is why getting games to work on a PC requires hardware with near twice the spec of a dedicated console.

Oh yes the final result are indeed superior but this is the combination of years upon years of software, drivers, OS modifications APIS and any number of other software go between that have been created to make games actually work on PCs but end of the day it doesn't change the fact that if you built a PC with similar spec components to a dedicated console then the console would piss all over the PC.

As for console holding anything back, that's total crap. A lot of the stuff we see in games now a days is a direct result of publishers being able to spend large amounts of cash on game development and the cash has come from console bringing gaming to the mainstream. Sure we can select the odd indie budget title that blows us all away or we can sit here and pretend that the bedroom developed games from the Spectrum era where all amazing and nothing new is a patch on them but gaming is where it is because of consoles and without consoles gaming would be a minor league hobby, 'dominated' by small league developers and the market would be a wash with 1 decent title for every 1,000.
Of course there are years of software and drivers behind it; that's how technology advances in the digital age. You update things and make incremental improvements. That's not a downside.

Yes, with the same specs the console wins; I hope we're all aware that consoles have the singular advantage of easier optimization, because every console owner has the same specs. In exchange, though, you get an open platform, cheaper games, more control options (I can use pretty much any controller with my PC), better performance if you do have decent hardware (including 60 fps), and a gigantic backlog of games, including pretty much every game that came out before the 360's release.
 

Soopy

New member
Jul 15, 2011
455
0
0
Laughing Man said:
PC's are infinitely superior for gaming
Except they aren't PC's are actually pretty horrible for gaming. That is why getting games to work on a PC requires hardware with near twice the spec of a dedicated console.

Oh yes the final result are indeed superior but this is the combination of years upon years of software, drivers, OS modifications APIS and any number of other software go between that have been created to make games actually work on PCs but end of the day it doesn't change the fact that if you built a PC with similar spec components to a dedicated console then the console would piss all over the PC.

As for console holding anything back, that's total crap. A lot of the stuff we see in games now a days is a direct result of publishers being able to spend large amounts of cash on game development and the cash has come from console bringing gaming to the mainstream. Sure we can select the odd indie budget title that blows us all away or we can sit here and pretend that the bedroom developed games from the Spectrum era where all amazing and nothing new is a patch on them but gaming is where it is because of consoles and without consoles gaming would be a minor league hobby, 'dominated' by small league developers and the market would be a wash with 1 decent title for every 1,000.
Mmm, nah. I could build a PC to compete with the upcoming generation of consoles for the same or less than what the consoles cost. Ten years ago I'd have agreed, but PC gaming is far easier these days.
 

Lilani

Sometimes known as CaitieLou
May 27, 2009
6,581
0
0
4RM3D said:
The games market isn't just about graphical fidelity or the specs of consoles or the quality of the games that come out, you know. There's also the huge aspect of marketing. E3 has become a huge moneymaking and marketing opportunity, not just for console makers but also for game devs. If not for the incredible amounts of money console games have brought into the market, such a huge event would have no reason to exist. The big AAA companies that cater to consoles also bring tens of thousands of jobs to the games industry. Consoles are a huge factor in why big celebs like Jimmy Fallon and Lewis Black stuck their necks out for games when Microsoft announced the X-bone.

Also, as has been pointed out before, the most successful console of just about every generation has not been the one with the most processing power. Which goes to show that more power does not necessarily mean more money, more games, or greater quality in games.

Like it or not, consoles are a huge part of the reason games are becoming so much more commonplace and generally accepted in society, and they bring in a lot of money, jobs, and power to the market. Mobile games are also doing this, but consoles were on their way to being ubiquitous long before Angry Birds came out. Losing consoles at this point would only serve to injure the industry in a big way.
 

The_Tron

New member
Jun 8, 2010
92
0
0
HIEL MIENE PC MIESTER RENNEN.....

In all seriousness, I find this thread ridiculous and just a piss-poor attempt to start a flame war.
 

Soopy

New member
Jul 15, 2011
455
0
0
Charcharo said:
faefrost said:
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry but god of war 3 looks more than slightly better than god of war 2. Like I could see ow somewhere between them you are hitting the point of diminishing returns....but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game but that's more because GoW3 just was like, dragged out as fuck because it was pretty much just the ending of God of War 2 with no real respectable reason to have an arc of it's own.
Yes GoW 3 looked better. Things were sharper clearer. Better sparkly effects. But the game itself was pretty close to exactly the same as GoW 2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end user experience That they made it worth needing at a minimum 3x the production staff to create?

This is the same shit we are seeing going on in Hollywood these days. The costs of producing these lush budget summer movies have grown beyond the revenue streams ability to reasonably recover them for little actual practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost 10's of millions. But how much did 40 continuous minutes of them bring to the movie? Did they not comunicate much the same experience for a much cheaper price way back in 1980?

For FPS fans, do you get more or the same level of enjoyment from an online matchup of the current version of CoD or from Team Fortress 2? Yes CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects that it piles on the industry? At what point do we finally walk away from that old paradigm of "better graphics always = better games"?
Graphics do not cost THAT much. The reason GoW3 needed so much money is that it was made on limited hardware, and OPTIMIZATION is extremely expensive. That, and other stuff such as PR and marketing, voice acting, and so on, was probably also responsible.
Take, for example, Metro: Last Light. That game is MUCH more graphically advanced than GoW3, and yet it was done by 90 people with 1/10 of GoW3's budget. Same thing with, let's say, STALKER: Clear Sky, which also had better AI.

Oh, and TF2 and CoD are quite comparable overall. CoD is probably slightly more graphically intensive, but TF2 actually taxes the CPU with better physics and more dynamic objects than CoD.
Good post dude, some very good points there.
 

faefrost

New member
Jun 2, 2010
1,280
0
0
Charcharo said:
faefrost said:
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry but god of war 3 looks more than slightly better than god of war 2. Like I could see ow somewhere between them you are hitting the point of diminishing returns....but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game but that's more because GoW3 just was like, dragged out as fuck because it was pretty much just the ending of God of War 2 with no real respectable reason to have an arc of it's own.
Yes GoW 3 looked better. Things were sharper clearer. Better sparkly effects. But the game itself was pretty close to exactly the same as GoW 2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end user experience That they made it worth needing at a minimum 3x the production staff to create?

This is the same shit we are seeing going on in Hollywood these days. The costs of producing these lush budget summer movies have grown beyond the revenue streams ability to reasonably recover them for little actual practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost 10's of millions. But how much did 40 continuous minutes of them bring to the movie? Did they not comunicate much the same experience for a much cheaper price way back in 1980?

For FPS fans, do you get more or the same level of enjoyment from an online matchup of the current version of CoD or from Team Fortress 2? Yes CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects that it piles on the industry? At what point do we finally walk away from that old paradigm of "better graphics always = better games"?
Graphics do not cost THAT much. The reason GoW3 had such a need for money was because it was made on limited hardware and OPTIMIZATION is extremely expensive. That, and some other stuff such as PR and marketing, voice acting and so on probably also was responsible.
Take for an example Metro Last Light. That game is MUCH more graphically advanced than GoW3. And yet it was done by 90 people with 1/10 of GoW3's budget. Same thing with, lets say, STALKER Clear Sky, which also had better AI.

Oh, and TF2 and CoD are quite comparable overall. CoD is probably slightly more graphically intensive, but TF2 does actually tax the CPU with better physics and more dynamic objects than CoD.
Graphics DO cost that much. Really, they do. The sheer number of digital art department man-hours required to develop a higher-resolution, graphically superior game can be staggering. In a few cases, being unprepared for that change can kill games, and has. When you double the polygon count you vastly increase the man-hours needed to color in those polygons, and this escalates exponentially throughout the development process. Remember, while most games are built on common third-party engines these days, a game's art and appearance are unique to it. The artwork has to be built pretty much from scratch for each game. Probably 30-40%+ of the total man-hours poured into a game's development are art-related.

One of the better examples of this (granted, a little dated now) was Asheron's Call 2. The game failed and closed not simply because of chat server bugs, but because it could not deliver the content players were expecting. In the prior game, Asheron's Call, players had grown accustomed to monthly content patches that moved the story and the world forward. These were great patches that would easily add two weeks' to a month's worth of content for players to enjoy every month: maybe 5 or 6 new dungeons, a bunch of new loot, etc. But AC1 had a fairly crude graphics engine; a few people could easily put it all together in a month. Then they put out the sequel with what was then the best MMO graphics engine anyone had ever seen, and suddenly those monthly content patches got real lean. Where players used to get at minimum a week or two's worth of content, the new game's patches were lucky to give them an hour. If it used to take a dev an hour to design a new sword and add it to the game, it now took 10+ hours for that one sword. At the higher level of graphics they could not produce content fast enough to keep pace with the players' ability to devour it. It took more and more people to do less and less.

Graphical fidelity is wonderful, to a point. But at the end of the day it is far surpassed by simply better art direction and good game design. And there comes a point where finer and finer graphics are not worth the costs being thrown at them.
 

Soopy

New member
Jul 15, 2011
455
0
0
faefrost said:
Charcharo said:
faefrost said:
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry but god of war 3 looks more than slightly better than god of war 2. Like I could see ow somewhere between them you are hitting the point of diminishing returns....but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game but that's more because GoW3 just was like, dragged out as fuck because it was pretty much just the ending of God of War 2 with no real respectable reason to have an arc of it's own.
Yes GoW 3 looked better. Things were sharper clearer. Better sparkly effects. But the game itself was pretty close to exactly the same as GoW 2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end user experience That they made it worth needing at a minimum 3x the production staff to create?

This is the same shit we are seeing going on in Hollywood these days. The costs of producing these lush budget summer movies have grown beyond the revenue streams ability to reasonably recover them for little actual practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost 10's of millions. But how much did 40 continuous minutes of them bring to the movie? Did they not comunicate much the same experience for a much cheaper price way back in 1980?

For FPS fans, do you get more or the same level of enjoyment from an online matchup of the current version of CoD or from Team Fortress 2? Yes CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects that it piles on the industry? At what point do we finally walk away from that old paradigm of "better graphics always = better games"?
Graphics do not cost THAT much. The reason GoW3 had such a need for money was because it was made on limited hardware and OPTIMIZATION is extremely expensive. That, and some other stuff such as PR and marketing, voice acting and so on probably also was responsible.
Take for an example Metro Last Light. That game is MUCH more graphically advanced than GoW3. And yet it was done by 90 people with 1/10 of GoW3's budget. Same thing with, lets say, STALKER Clear Sky, which also had better AI.

Oh, and TF2 and CoD are quite comparable overall. CoD is probably slightly more graphically intensive, but TF2 does actually tax the CPU with better physics and more dynamic objects than CoD.
Graphics DO cost that much. Really they do. The shear amount of digital art department man hours required to develop for a higher resolution, graphically superior game can be staggering. In a few cases being unprepared for that change can and has killed games. When you double the polygon count you vastly increase the man hours needed to color in those polygons. And this escalates exponentially throughout the development process. Remember, while most games are built around common third party engines these days, the games art and appearance is something unique to each game. The artwork has to be built pretty much from scratch for each game. Probably 30-40% + of the total man hours poured into a game development is art related.

One of the better examples of this (granted a little dated now) Was Asherons Call 2. The game failed and closed not simply becauseof chat server bugs, but because the game could not deliver what the players were expecting in terms of content. In the prior game Asherons Call players had grown accustomed to monthly content patches that progressed the story and the world forward. These were great patches that would easily add 2 weeks to a months worth of content for the players to enjoy every month. Maybe 5 or 6 new dungeons. A bunch of new loot etc. But AC1 had a fairly crude graphics engine. a few people could easily put it all together in a month. Then they put out the sequel with what was then the best MMO graphics engine anyone had ever seen. And suddenly those monthly content patches seemed to get real lean. Where players used to get at a minimum a week or twos worth of content, the new games patches were lucky to give them an hour. If it used to take a dev an hour to design a new sword and add it to the game, it now took 10+ hours, just for that one sword. At the higher level of graphics they could not produce content to keep pace with the players ability to devour it. It took more and more people to do less and less.

Graphical fidelity is wonderful, to a point. But at the end of the day it is far surpassed by simply better art direction and good game design. And there comes a point where finer and finer graphics are not worth the costs being thrown at them.
So how do you explain TW2 and other indie games with exceptional graphical quality and comparatively minuscule budgets?

AAA development is simply bloated with inconsequential crap.
 

Lunar Templar

New member
Sep 20, 2009
8,225
0
0
And here I was ALMOST expecting to see something OTHER than graphics. Silly me. Come back when they hold back something that actually matters.
 

faefrost

New member
Jun 2, 2010
1,280
0
0
Soopy said:
faefrost said:
Charcharo said:
faefrost said:
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry, but God of War 3 looks more than slightly better than God of War 2. I could see how, somewhere between them, you hit the point of diminishing returns... but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game, but that's more because GoW3 was dragged out as fuck: it was pretty much just the ending of God of War 2, with no real respectable reason to have an arc of its own.
Yes, GoW3 looked better. Things were sharper and clearer, with better sparkly effects. But the game itself was pretty close to exactly the same as GoW2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end-user experience that they were worth needing, at a minimum, 3x the production staff to create?

This is the same shit we are seeing in Hollywood these days. The costs of producing these lush big-budget summer movies have grown beyond the revenue stream's ability to reasonably recover them, for little practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost tens of millions. But how much did 40 continuous minutes of them bring to the movie? Did they not communicate much the same experience for a much cheaper price way back in 1980?

For FPS fans, do you get more or the same level of enjoyment from an online matchup of the current version of CoD or from Team Fortress 2? Yes CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects that it piles on the industry? At what point do we finally walk away from that old paradigm of "better graphics always = better games"?
Graphics do not cost THAT much. The reason GoW3 needed so much money is that it was made on limited hardware, and OPTIMIZATION is extremely expensive. That, and other things such as PR and marketing, voice acting and so on, were probably also responsible.
Take, for example, Metro: Last Light. That game is MUCH more graphically advanced than GoW3, and yet it was done by 90 people with 1/10 of GoW3's budget. Same thing with, let's say, STALKER: Clear Sky, which also had better AI.

Oh, and TF2 and CoD are quite comparable overall. CoD is probably slightly more graphically intensive, but TF2 does actually tax the CPU with better physics and more dynamic objects than CoD.
Graphics DO cost that much. Really, they do. The sheer number of digital-art-department man hours required to develop a higher-resolution, graphically superior game can be staggering. In a few cases, being unprepared for that change can and has killed games. When you double the polygon count, you vastly increase the man hours needed to color in those polygons, and this escalates exponentially throughout the development process. Remember, while most games are built around common third-party engines these days, a game's art and appearance are unique to that game; the artwork has to be built pretty much from scratch each time. Probably 30-40%+ of the total man hours poured into a game's development are art related.

One of the better examples of this (granted, a little dated now) was Asheron's Call 2. The game failed and closed not simply because of chat-server bugs, but because it could not deliver the content players were expecting. In the prior game, Asheron's Call, players had grown accustomed to monthly content patches that progressed the story and the world forward. These were great patches that would easily add two weeks' to a month's worth of content for the players to enjoy every month: maybe 5 or 6 new dungeons, a bunch of new loot, etc. But AC1 had a fairly crude graphics engine; a few people could easily put it all together in a month. Then they put out the sequel with what was then the best MMO graphics engine anyone had ever seen, and suddenly those monthly content patches got real lean. Where players used to get at minimum a week or two's worth of content, the new game's patches were lucky to give them an hour. If it used to take a dev an hour to design a new sword and add it to the game, it now took 10+ hours, just for that one sword. At the higher level of graphics they could not produce content fast enough to keep pace with the players' ability to devour it. It took more and more people to do less and less.

Graphical fidelity is wonderful, to a point. But at the end of the day it is far surpassed by simply better art direction and good game design. And there comes a point where finer and finer graphics are not worth the costs being thrown at them.
So how do you explain TW2 and other indie games with exceptional graphical quality and comparatively minuscule budgets?

AAA development is simply bloated with inconsequential crap.
By TW2 do you mean Two Worlds 2? A game that, while gorgeous in screenshots, was characterized by excessive graphics glitches (with the publisher actively bribing and blackmailing gaming review media to excuse or cover up that fact and not downrank the game because of it)? Yeah, that would be good art direction mixed with very, very poor graphics QC. Remember those "exponential production escalations" I mentioned? QC'ing the higher-res graphics is one of them. Higher-fidelity graphics mean more things that can cause tears, clipping, frame-rate issues, etc. But if you just ignore all that and publish the game anyway, then yeah, you can really lowball the costs.
 

Soopy

New member
Jul 15, 2011
455
0
0
faefrost said:
Soopy said:
faefrost said:
Charcharo said:
faefrost said:
mike1921 said:
faefrost said:
And here is an example of how little graphics actually matter. Look at God of War 3 on PS3, then look at its PS2 predecessor God of War 2. The gameplay is essentially the same. The game experience is virtually identical. GoW3 looks slightly better because it was truly tweaked to use all of the PS3's power. But it adds so little to the game experience vs the previous that the overall fun factor and value is the same for either version. But 3 probably cost 4x more to make.
Sorry, but God of War 3 looks more than slightly better than God of War 2. I could see how, somewhere between them, you hit the point of diminishing returns... but those are still sizable returns.
I mean, look at it http://origin.playstationlifestyle.net/assets/uploads/2010/02/Cyclops-Eye-Rip-Comparison-.jpg
GoW2 was the better game, but that's more because GoW3 was dragged out as fuck: it was pretty much just the ending of God of War 2, with no real respectable reason to have an arc of its own.
Yes, GoW3 looked better. Things were sharper and clearer, with better sparkly effects. But the game itself was pretty close to exactly the same as GoW2. Same gameplay. Same enjoyment. Same overall player experience. Were those slightly improved graphics worth the cost? Did they add so much to the end-user experience that they were worth needing, at a minimum, 3x the production staff to create?

This is the same shit we are seeing in Hollywood these days. The costs of producing these lush big-budget summer movies have grown beyond the revenue stream's ability to reasonably recover them, for little practical benefit to the end viewer. Look at those spectacular "knocking down buildings" sequences in Man of Steel. They cost tens of millions. But how much did 40 continuous minutes of them bring to the movie? Did they not communicate much the same experience for a much cheaper price way back in 1980?

For FPS fans, do you get more or the same level of enjoyment from an online matchup of the current version of CoD or from Team Fortress 2? Yes CoD "looks" much better. But does that really improve or add to the gameplay? And is that "looks better" worth the obscene costs and all of the negative effects that it piles on the industry? At what point do we finally walk away from that old paradigm of "better graphics always = better games"?
Graphics do not cost THAT much. The reason GoW3 needed so much money is that it was made on limited hardware, and OPTIMIZATION is extremely expensive. That, and other things such as PR and marketing, voice acting and so on, were probably also responsible.
Take, for example, Metro: Last Light. That game is MUCH more graphically advanced than GoW3, and yet it was done by 90 people with 1/10 of GoW3's budget. Same thing with, let's say, STALKER: Clear Sky, which also had better AI.

Oh, and TF2 and CoD are quite comparable overall. CoD is probably slightly more graphically intensive, but TF2 does actually tax the CPU with better physics and more dynamic objects than CoD.
Graphics DO cost that much. Really, they do. The sheer number of digital-art-department man hours required to develop a higher-resolution, graphically superior game can be staggering. In a few cases, being unprepared for that change can and has killed games. When you double the polygon count, you vastly increase the man hours needed to color in those polygons, and this escalates exponentially throughout the development process. Remember, while most games are built around common third-party engines these days, a game's art and appearance are unique to that game; the artwork has to be built pretty much from scratch each time. Probably 30-40%+ of the total man hours poured into a game's development are art related.

One of the better examples of this (granted, a little dated now) was Asheron's Call 2. The game failed and closed not simply because of chat-server bugs, but because it could not deliver the content players were expecting. In the prior game, Asheron's Call, players had grown accustomed to monthly content patches that progressed the story and the world forward. These were great patches that would easily add two weeks' to a month's worth of content for the players to enjoy every month: maybe 5 or 6 new dungeons, a bunch of new loot, etc. But AC1 had a fairly crude graphics engine; a few people could easily put it all together in a month. Then they put out the sequel with what was then the best MMO graphics engine anyone had ever seen, and suddenly those monthly content patches got real lean. Where players used to get at minimum a week or two's worth of content, the new game's patches were lucky to give them an hour. If it used to take a dev an hour to design a new sword and add it to the game, it now took 10+ hours, just for that one sword. At the higher level of graphics they could not produce content fast enough to keep pace with the players' ability to devour it. It took more and more people to do less and less.

Graphical fidelity is wonderful, to a point. But at the end of the day it is far surpassed by simply better art direction and good game design. And there comes a point where finer and finer graphics are not worth the costs being thrown at them.
So how do you explain TW2 and other indie games with exceptional graphical quality and comparatively minuscule budgets?

AAA development is simply bloated with inconsequential crap.
By TW2 do you mean Two Worlds 2? A game that, while gorgeous in screenshots, was characterized by excessive graphics glitches (with the publisher actively bribing and blackmailing gaming review media to excuse or cover up that fact and not downrank the game because of it)? Yeah, that would be good art direction mixed with very, very poor graphics QC. Remember those "exponential production escalations" I mentioned? QC'ing the higher-res graphics is one of them. Higher-fidelity graphics mean more things that can cause tears, clipping, frame-rate issues, etc. But if you just ignore all that and publish the game anyway, then yeah, you can really lowball the costs.
No, I mean The Witcher 2.