Lightknight said:
It's the same processing cost. But no, it isn't necessarily less hardware: one server can have hundreds of gigs of RAM and so much else. No, this would require more actual machines and the same amount of processing. Not only that, but every additional machine involved in the processing is another potential weakest link. You're talking about one of the least efficient approaches to server-side processing where real-time processing is concerned.
But you already have thousands of machines distributed in people's homes, sitting idle most of the time. That's a huge waste of processing power which, if utilized, would lower practical costs.
Yes, it has other downsides, plenty of them, and thus no one really uses it, but in a controlled environment it could be a good thing.
Then again, I've learned some people actually keep their consoles sideways and whatnot, so if they can't even get that right...
Let me make this clear. I bought my PS3 from Sony and my 360 from Microsoft. My hardware does not belong to anyone but me. It is not something I purchased from the makers of COD or Bioshock or any other game (except first-party stuff, I suppose). COD does not have the right to use my hardware. Hell, Microsoft doesn't either. I bought the hardware from them; it's mine. Making machines do work does wear on them. It's like letting someone drive my car when I'm not using it. Let's say it's done so efficiently that the inside of the car looks the same, the gas is at the same level, and it's always there when I'm going to use it. It's still grinding down the parts of the engine and the brakes and everything else. No one has the right to incur that cost on me without compensation or my permission. After I've purchased the console, I owe nothing to anyone else.
Yes, you described the current local hardware situation. That's irrelevant to the situation that would exist if distributed computing were in place. You would buy hardware that you own, but you would have to allow others to use it in order to use hardware from other users, just like they have to let you use theirs. That's the notion of sharing. Neither Microsoft nor whatever studio made the game you play has anything to do with it.
You know who lets someone use their transport when they're not using it? A taxi. And in turn the ride costs less than owning a car.
Yes, there is the wear factor. However, as I think someone already pointed out to you on another topic (correct me if I'm wrong, but I think he did quote you), most wear and tear in non-moving parts (which is basically everything except the fans and the HDD) comes from temperature changes rather than actual usage. You also get current surges when turning the machine on and off as the capacitors charge up and drain, which contributes to wear and tear, hence it's not advised to power things off if you're only leaving for an hour or so.
But in essence you are correct: it would get worn out, just like you wear out other people's hardware while you're playing. If the hardware is made with quality, though, that is not a problem, as technology moves so fast nowadays that it will become obsolete before it wears out. My 100 MHz Pentium I still works like new, but I won't be using it since it's already obsolete hardware. Of course that means we can't use the cheap hardware built to last two years and then break, but then again we shouldn't be using that to begin with.
And you do get compensation in the form of using other people's hardware when you are gaming. Whether you get more out of it than you lose really depends on how much and what you play, I guess.
Now let me ask you something you may not have considered. You say they use this while I'm not using it. Ok, now who gets to use it? Let's say 100 different games use this. Do all of them have an equal bid to use my processing power regardless of whether or not I've even purchased their game? Doesn't this mean that it's not unlikely to see my console getting used at the highest capacity they allow for distributed computing demands? Not to mention it using my bandwidth the entire time when I might very well be using bandwidth on another device.
They have no right to any of this.
Whoever needs to use it. The optimum would be your processor being fully loaded, processing however many games it can handle at that time; it could be 1 or it could be 100, it really doesn't matter. As far as games are concerned, it could go two ways. If it is merely crunching generic logic and algorithms, it may not even need to know which game the work is for, which means your game collection doesn't matter, but this is harder to do. The other way is that you already have the game engine on your box, so it needs far less information to know what to process and what to send back. This would also save bandwidth (which is irrelevant anyway). It would limit the pool to your game collection and introduce inconsistency, since people own different numbers of games, but it would allow easier processing and could perhaps be utilized more efficiently.
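Just to make that second option concrete, here is a toy sketch (every name and number in it is mine, purely for illustration, and not based on any real system): idle consoles advertise their spare capacity, and work is only handed to boxes that already have the matching game installed.
[code]
# Hypothetical toy scheduler for the "share idle consoles" idea described above.
# All names and numbers are invented for illustration.
from dataclasses import dataclass

@dataclass
class Job:
    game: str      # which game engine the work belongs to
    cost: float    # abstract "processing units" the job needs

@dataclass
class Console:
    owner: str
    installed_games: set   # second approach: only take jobs for engines you own
    capacity: float        # idle processing units available right now

def assign_jobs(jobs, consoles):
    """Greedily place each job on an idle console that has the matching engine."""
    plan = {c.owner: [] for c in consoles}
    for job in sorted(jobs, key=lambda j: j.cost, reverse=True):
        for c in consoles:
            if job.game in c.installed_games and c.capacity >= job.cost:
                plan[c.owner].append(job)
                c.capacity -= job.cost
                break   # placed; unplaced jobs fall back to the player's own box
    return plan

if __name__ == "__main__":
    jobs = [Job("Skyrim", 2.0), Job("COD", 1.0), Job("Skyrim", 0.5)]
    idle = [Console("neighbour_A", {"Skyrim"}, 2.5), Console("neighbour_B", {"COD"}, 1.0)]
    print(assign_jobs(jobs, idle))
[/code]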
Fiber optics make the bandwidth problem irrelevant. I don't have the best plan my ISP offers, so my fiber connection is throttled by the ISP, and I can still watch multiple HD streams and game online without an increase in ping if I choose to. Bandwidth becomes a non-problem with fiber optics. Once the infrastructure is in place, fiber is very cheap; it's the infrastructure that's costly.
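A quick back-of-envelope on why I call bandwidth a non-problem (every figure here is an assumption I picked for illustration, not a measurement):
[code]
# Back-of-envelope only; all numbers are assumptions for illustration.
hd_stream_mbps = 8       # assumed bitrate of one 1080p stream
game_state_mbps = 1      # assumed per-game traffic for shared processing
fiber_plan_mbps = 100    # assumed mid-tier fiber plan

streams = 2
shared_jobs = 5
used = streams * hd_stream_mbps + shared_jobs * game_state_mbps
print(f"{used} Mbps used of {fiber_plan_mbps} Mbps available")
# -> 21 Mbps used of 100 Mbps available
[/code]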
The rights issue I addressed in the previous quote.
You can only get so realistic before you reach it. There is no step beyond real. We as humans can't even really tell the difference between high frame rates (FPS as in frames per second, not first-person shooter) past a certain point. There will be a day when the characters look like actors on the screen, and there's nowhere to go once that's reached. My imagination isn't even capable of viewing things with the clarity of realism.
Surreal? I'm joking, of course.
Yes, you can't get better graphics than reality. But what I was arguing is that even if we double our computing power every year, it will still take hundreds of years before we get to truly real simulation.
Heck, most humans can't tell the difference between 25 and 60 FPS (provided their equipment works correctly, because most "omg it lags at 30 FPS" problems are really your GPU and your monitor not working together). That is a limitation of our eyes, however, not of reality, and at some point we will move into virtual reality and away from 2D monitors. Heck, we already need double the frame rate for the "fake 3D" effect we are trying to make on our TVs.
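To put rough numbers on that doubling (standard 1080p resolution, plain arithmetic; the 120 FPS row is simply 60 FPS per eye for frame-sequential 3D):
[code]
# Rough pixel-throughput arithmetic; the display parameters are example values.
width, height = 1920, 1080       # a 1080p panel
pixels = width * height          # 2,073,600 pixels per frame

for fps in (25, 60, 120):        # 120 ~= 60 FPS per eye for stereo "3D"
    print(f"{fps:>3} FPS -> {pixels * fps / 1e6:.1f} million pixels/second")
#  25 FPS -> 51.8 million pixels/second
#  60 FPS -> 124.4 million pixels/second
# 120 FPS -> 248.8 million pixels/second
[/code]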
There's nowhere to go once it's reached, but it will be reached when both you and I are long dead (unless that talk about immortality by 2040 is actually real, ech). So it's really not a wall we are hitting now where we could out-hardware our needs.
Besides, even if we reach real simulation we will just go... bigger. Simulating galaxies, you know, sort of like the X3 games try to do, but "more real".
This is merely because games are currently limited by hardware. That doesn't mean they can magically expand into the infinite. There are perhaps some types of games which could. Like a universe simulator that's meant to require more and more resources over time. But that'd only apply to a very specific type. But a game where you're in a town doing whatever the game has you in that town for isn't going to need to process the entire universe. It just needs to process your area and the surrounding areas to a point in such a way that the player can never get outside the given rendered area. That is to say, a finite and known amount of data needs to be processed. The only question is how detailed that data will be. When it is a finite arena, we can absolutely achieve a point where processing outstrips the possible needs of processing. As long as humans are limited by our senses and our brain, that's going to happen.
Open-world games. We are getting more and more of them not because people like them more, not because they are needed to tell a good story, but because we can, as in our hardware allows it. Plenty of linear shooters would be open-world if the hardware allowed it. Once it does, we will have pretty much every game simulating the whole area. The "never go outside the rendered area" is invisible walls due to technical limitations or, well, the game maker simply not building anything there.
We will eventually reach a point where we outstrip the hardware limitations and can fully simulate a closed-in area. But at that point there is no need for the area to be closed in.
Eventually (think millions of years), if we still have video games by then, they will all simulate whole universes even if everything happens in a single town. Why? Because they can. Why did humans go to the Moon? Because they could.
Correction, HD video editing is more demanding than most video games at the moment.
Implying there is a significant portion of the populace that edits SD video. Even local TV networks edit their shows in HD and downscale them for SD transmission, while transmitting in HD for those who can receive it. You will hardly find a YouTube video from the last two years that is below 720p unless it is a very specific thing that only needs to show a small area. Video editing pretty much moved to HD some time ago; we are using HD and 2K, and now even 4K is starting to come around.
Humans just like to make things bigger when they can. Heck, not so long ago I remember people inventing special codecs because the average PC couldn't handle the average video format; the format was too large, so they invented codecs that made the GPU do part of the CPU's work. Of course modern PCs can now decode 720p without trouble, but try running a 4K video and you will see the same problems.
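For a sense of scale, the raw pixel counts (plain arithmetic on the standard resolutions):
[code]
# Pixel-count comparison; just arithmetic on standard resolutions.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K UHD": (3840, 2160)}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    print(f"{name:>7}: {w * h:>9,} pixels per frame ({w * h / base:.1f}x 720p)")
#    720p:   921,600 pixels per frame (1.0x 720p)
#   1080p: 2,073,600 pixels per frame (2.2x 720p)
#  4K UHD: 8,294,400 pixels per frame (9.0x 720p)
[/code]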
We make things bigger when technology allows it; the world is not standing still while technology moves forward.
That's why games like The Last of Us look so impressive despite still having to fit into what is essentially a half-decade-old CPU with 512MB of RAM divided (unnecessarily) into two components.
I think it's worth mentioning that while, yes, RAM is the biggest failing of the current-gen consoles (and they seem to be trying to fix that with the next gen), the CPU, while old, is not weak. The PS3's CPU, at its full theoretical power, is faster than the current fastest PC CPU; back then it was a mind-boggling thing. Thing is, no one ever used it fully. It's hell to program for. Not even Naughty Dog, a company working almost exclusively with this CPU for a decade, managed that. But they did more than anyone else could, and thus the game looked good in comparison (while still bad compared to how PC games look).
We occasionally get games made for PC that push the limits, but those are rare because console markets are very lucrative and the 360, at least, requires somewhat minimal effort to port to. So it's best to first make most games playable on the console and then just provide upscaling utilities for the PC, which only make the game look a little better without really changing the core mechanics that were limited by the console.
Yes, the average hardware, i.e. current consoles, is the bottleneck on our ability to simulate.
This next console advancement should give us another significant leap in gaming demands and capabilities for around the same period of time. We'll also see PC technology move along faster because of this (since high-power PCs are only needed for gaming and video editing, and a stagnating gaming market does slow things down). With the introduction of higher-resolution HD gaming (and even 3D gaming) we'll see gaming demands meet or exceed video editing.
Only until we decide that editing 2K video is a good idea. But with video I've noticed the same tendency as with music: at some point people just stop wanting better. You could make audio of such quality that a modern PC would choke during playback, but the point of it would be lost, and something similar may happen with video. Games, however, I don't think will hit that point any time soon because of their interactivity.
I'm not sure how pointing to the past 10-15 years as word processing barely changing is proving any point. 15 years ago was 1998. Word processing software was first made in 1974. You stating that things haven't really changed fundamentally over the past 10-15 years is only proving my point that the demands of word processing are somewhat finite and computer technology has so drastically exceeded its needs that running a word processing application doesn't so much as cause the RAM consumption to noticeably rise on most machines.
So this comment may have been the most incorrect comment you've made in this thread. Word processing has evolved significantly over the past 40 years, and even in the past 10 years we've seen significant advances in word processing features, formats, and security. But at the end of the day it is just typing words on a page and maybe embedding images here and there. Some day, video games will be in the same position, where the environments are realistic and the individual can transition between environments without noticing it. That's all you need, and then processing has advanced beyond gaming. It could be ten years, could be fifty. But it will happen.
Fair enough, you are correct about word processing.
No, they are not catching up to hardware. They are increasing in quality and ergo processing demands, but it will never take 4GBs of RAM to play an audio file. As for movies, there are some advancements left to be made but we're getting to definitions that the human eye can't really distinguish between. Beyond that, what's the point?
Positive reinforcement is the point. We do things because we have the means, not because it is necessary. It could take 4 GB to play an audio file, though I doubt that will ever be the norm; we don't even have the technology to properly turn that into sound (as in, no such speakers exist). Video, on the other hand, still has a long way to go. Human eyes are... underrated. We can distinguish a lot even without actually seeing it fully.
Yes, at some point we will get there, but it's not going to be soon.
Why would we simulate to the atomic level if humans can't see that? That aside, what you just established was a goal. This is a goal that can be reached and then surpassed. I made no allusion to this being near future necessarily. Just that there is a finite distance we can go before arriving at realism or a quality of gaming that is virtually indistinguishable from realism (the only thing that matters). With that knowledge in mind, that goal will eventually be reached.
Because physics. We can't see air, but we wouldn't be well off without it. There are plenty of things that could only be "real"ly simulated by simulating down to the atomic level.
As far as "virtually indistinguishable" goes, I've always had a problem with that, as it sounds more like PR "you can do anything" talk than anything else. Whenever they throw out the "your character can do anything" card I think: "Can you rape[footnote]Rape is just an easy example that no game ever allows, so it's easy to prove them wrong.[/footnote] people? No? Then stop telling us we can." You either simulate reality or you don't.
Not really, they render the full room now. Currently they delay texture pop-in. So in a complex game on limited hardware you can enter the room just fine, but the textures won't pop in until loaded. This is exceedingly apparent in games like RAGE, where the texture file is particularly large. Once loaded, however, it's there. It takes significantly more processing to tie video processing into player actions.
I consider a thing rendered when it is in the complete state the user is supposed to see, but that's semantics and you are correct here.
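To illustrate the pop-in mechanism being described in general terms, here is a toy sketch of deferred texture streaming (this is not how RAGE / id Tech 5 actually implements its megatextures; every name and timing here is invented):
[code]
# Toy sketch of deferred texture streaming: the room is usable immediately with
# low-res placeholders, and full textures "pop in" once their load finishes.
# Purely illustrative; not based on any real engine's API.
import queue, threading, time

class TextureStreamer:
    def __init__(self):
        self.loaded = {}                  # texture name -> resolution currently shown
        self.requests = queue.Queue()
        threading.Thread(target=self._loader, daemon=True).start()

    def request(self, name):
        self.loaded[name] = "64x64 placeholder"   # draw something right away
        self.requests.put(name)

    def _loader(self):
        while True:
            name = self.requests.get()
            time.sleep(0.1)                       # pretend disk/decode takes time
            self.loaded[name] = "2048x2048 full"  # the visible "pop-in" moment

streamer = TextureStreamer()
for tex in ("wall", "floor", "crate"):
    streamer.request(tex)
print(streamer.loaded)   # mostly placeholders: the room is already walkable
time.sleep(0.5)
print(streamer.loaded)   # full textures have popped in
[/code]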
Interesting. I mentioned that game above. The thing is, it does really well. There are some areas where the texture doesn't load in time or you see white light along creases in the walls. But the game is pretty damn cool as far as graphics are concerned. Now imagine a machine that is 10x as powerful as a machine able to render that graphically advanced game.
Yes, and it took the most powerful CPU in the world to run it (OK, that's not fair, they didn't utilize even half of its power). Ten-times-better graphics are, well, just that: ten times better. Not realistic, not by a long shot. It's always sad when game makers strut around claiming their game looks realistic when, while beautiful, it's so far from realism that they shouldn't even mention it. The best example is stones in games; they never look like stones.
It doesn't matter; the console market is what bottlenecks the game development process. Since PCs are an amalgamation of hardware components, those are upgraded regularly, and the moment a console's specs are set in place the average PC starts to catch up and then exceed it. But even after the PC exceeds it, the consoles still have a tremendous enough following (hundreds of millions of console owners) to demand development at their capabilities.
What I meant is that even a 10-times-more-powerful console won't bring us closer to realism, as we already have more powerful machines failing at that. At least that's what I think I meant.
I'm also not certain where you're getting the 1x capable of PC Skyrim. The average computer is still stuck at 4GB thanks to how long 32-bit OSes kept us tethered to that number. And a console, even an x86 one, can't be compared on raw numbers. Consoles allow for optimization in a way that the Frankenstein's monster that is the PC, a cobbling-together of various non-standard hardware, is entirely unable to match. This is why the minimum requirement for PC Skyrim was 2GB of RAM on a relatively modern CPU/video card, yet it is still playable on a 512MB machine with a six-year-old CPU/GPU combination. A computer with 8GB of RAM is not necessarily comparable with the consoles. A super powerful video card would ultimately decide whether or not it is.
Minimum requirements and maximum graphics are different things on PC. And while, yes, 32-bit OSes kept us at 4 GB (funnily enough, I've seen plenty of PCs for sale with 8 GB and a 32-bit OS, what the hell?), that by no means makes it the right comparison.
You also make the mistake many console gamers make when they think of PCs: they aren't Frankenstein monstrosities unless the owner makes them so. The "name" brands use standard components, all connected over standard buses, and they even share hardware drivers across many generations. This is no longer the year 2000; you can build a PC by trial and error and it will work. Heck, since Intel announced they are going to merge CPUs and motherboards, the hardest part, fitting the CPU, is going away too.
PCs beat consoles in every parameter except raw theoretical[footnote]Never put into practice.[/footnote] calculating speed. And that's okay, consoles were never meant to be the top hardware runners. But a console is gaming hardware and thus should be compared to average gaming hardware, just like you don't compare an average phone to an average calculator just because the phone has a calculator function too. And the average gaming PC is the one that's 1x capable of Skyrim with the HD texture pack.
AI and physics play into graphics. The way a ball moves or wood splinters makes the world look more graphically impressive.
I would agree that some game makers consider physics part of graphics, but plenty of them tout that "they are also doing physics," so I don't know.
As far as AI goes, I meant things like enemy soldier AI, not splinter-effect AI; as in, the world not only looks real but acts real.
Your brain makes constant calculations about the world around it. It takes note when things don't behave the way it knows they should, which breaks what's known as atmosphere (aka real immersion). As such, properly reacting NPCs and correct physics do more for "graphics" than just throwing more polygons at the screen will.
That's a very... untraditional way to look at AI. But I guess by that logic it would be part of graphics, just like EVERYTHING else in the game.
How the hell does this save the consumer money? By all means, tell me how translating the cost of a server farm onto local machines will mean any more money in the consumers' wallets. The consoles themselves will be more expensive, the price of games isn't going to drop below $60, and the wear on the console will be heavier, which will mean a shorter lifespan and possible replacement costs for an entire console. This, again, at no perceivable benefit to the end user.
Right now you use local hardware. That means you need hardware at least powerful enough to run the heaviest calculations of any game you play (or you'll lag, duh). This costs a lot of money. Now, if you could buy hardware a third as powerful and let the other 2/3 of the processing be done on other people's consoles while they are at work, with yours doing work while you're at work or whatever, you would only need to buy 1/3 of the hardware. That costs less money.
Hence it is logical that by buying less costly hardware you save money.
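A purely hypothetical illustration (every price here, and the assumption that cost scales linearly with processing power, is mine, just to show the shape of the argument):
[code]
# Hypothetical numbers only, to illustrate the "buy 1/3 of the hardware" claim.
full_console_cost = 400        # assumed price of a console that runs everything locally
shared_console_cost = 400 / 3  # assumed cost scaling linearly with processing power
network_overhead = 30          # assumed extra cost: better networking, coordination, etc.

local_only = full_console_cost
shared = shared_console_cost + network_overhead
print(f"local-only: ${local_only:.0f}, shared: ${shared:.0f}, saved: ${local_only - shared:.0f}")
# local-only: $400, shared: $163, saved: $237
[/code]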
You don't get server farms for free. You pay for them as part of the $60 game price. If by sharing console power you could do away with the server farms, the price of the game would be lower (that is, unless game makers decide to get richer, which they will, but then we can write off anything with "they will just be corrupt anyway").
I addressed wear and tear before.
That's highly specific to your region, wherever it is. The average speed in the US is still around 10 Mbps. That is getting better, but that's still it.
Yes, currently it is. But barring an apocalypse, the world will catch up.
Comcast has enjoyed 30+ years of almost complete monopoly in my city. Such is the result here and in any other area where this is true. Even in some cities where there are only a few providers, I wouldn't be surprised if we learned of price fixing or collaboration to keep the services provided low.
Funny thing about monopolies and Google: remember when the monopolies sued Google for laying fiber and competing, claiming the state should ban Google from providing service because the locals offer inferior service and can't compete?
As soon as monopolies are rich enough to buy laws, they remain monopolies. I've noticed that the best scenario for consumers is three huge companies: two can easily fix prices, but with three you will get at least one of them trying to profit by stealing the others' customers, while three is still few enough that each is big enough to provide the massive infrastructure and service variety that small local businesses can't.