I don't think the current Internet infrastructure is capable of supporting that. Many of us are on High Speed Internet, but if I'm not mistaken there are still a great many areas that aren't. This may come about eventually, but not any time soon.
Not necessarily. A console will always take a significant amount of time to draw each frame. A cloud gaming server, on the other hand, can support many players at once and can therefore draw frames many times faster. It is possible for the time saved by faster rendering to exceed the time lost to network latency and coding/decoding the video stream. And latency tends to matter more in multiplayer games, where the network latency is there anyway.
ShakerSilver said: Streaming games is fucking awful for anything that requires fast response time from the player. Action titles, racing games, fighting games, shooters, etc. are all gimped by network latency. There's no instantaneous internet speed simply because the speed of light exists, so there's always going to be some latency, even on the fastest of connections.
Are you really saying that FTL communication is more realistic than regional servers?
Strazdas said: Quantum entanglement is probably the most realistic solution to faster-than-light communication. I wouldn't say it's happening, though. We can barely agree that the concept exists so far, let alone affect it in any way yet.
There is no workaround for the speed of light. The only real workaround would be to build a data server in every town and village, no matter how small. Good luck with that.
No, it's a statement to stay in line. I'm hearing this rumour again and again, and I'm starting to see a pattern here. You know why it will fail?
The Enquirer said: As much as I may despise Ubisoft, it's a really good point. Last-generation consoles moved away from being mere gaming devices, and this generation even more so. With the talk of upgradable hardware, it is turning into the exact thing a gaming computer is.
This is furthered by the continued push for cross-platform play. Soon it'll come down to a matter of "do you want short-term or long-term costs reduced?"
thebobmaster is absolutely right. The infrastructure and the contract philosophies are horrible on average.
thebobmaster said: Until data caps are removed, and bandwidth is strong everywhere, I don't think this will happen yet. He brushes off the bandwidth cost, but that is a rather serious obstacle.
We could talk for a long time about why people buy consoles, and it wouldn't be pretty, but I get your point here. Consolites were always more complacent about being mistreated.
Joccaren said: I think there's a reason he said consoles, rather than PCs, are going to be outdated by streaming. As when you hit 1080p streaming, we'll be using 8K displays and demanding that instead. You can't keep up with the PC scene, it just moves too fast.
And console players demand higher resolution and better framerate... yet they don't get it, and buy anyway. Because more than resolution and framerate, they like the convenience and relative cheapness. It's why they're not gaming on the computer half the time. In fact, historically, the consoles that have gone for more power have been flops. Things just don't work out as well on many levels when that happens. This generation the PS4 is killing the Xbone, but a lot of that is down to the terrible rap the Xbone got near launch, where it focused on being a home media spywar... entertainment device, whilst Sony focused on the games. That, and people are getting sick of Microsoft's shit, TBH. Had both been marketed the same way, they likely would have been pretty even, potentially even with the Xbone pulling ahead with a lower asking price than the PS4.
Yeah, perhaps 200 years was generous for technological advancement. But the "we know it exists" is still very much only at the high-science level; the average person does not even believe in such things, and what we're talking about is the average person using them.
No, we know it exists, and we can entangle particles. The only problem is the world record for the length of time we have artificially entangled two particles is about 13 seconds or so. We've got a lot of work to go in getting a sustained entanglement, and once that's done we've got to figure out how to use them to communicate, and then you'll have to work around the fact that an entangled particle only has one pair... There are a lot of engineering problems to figure out, but give us 200 years and we'll have nailed it. 200 years ago we were still throwing our sewage out into the street, cars didn't exist, and the white man was trying to colonise Africa. 3-4 generations is a lot of time for things to change.
I get it, you believe science will solve everything. But we're talking about FTL travel here. This is one barrier we've been hitting our heads against for over a hundred years now.
You say that...
It's the same sort of thing as people in older times saying there was no workaround for not being born with wings, and then hot air balloons were invented and we could float through the sky.
Additionally, it doesn't need to be everywhere. Just focused around major living areas, where most users will be using it. Why build one out in woop woop for 5 people who play games? Not going to happen. Having one centre cater to 20,000 gamers? Yeah, sounds much better. Rurals get shortchanged, but then again they always do. From the company's perspective, income is more important than reaching everyone with the technology, and shortchanging 100,000 more remote users whilst maintaining a core userbase in the tens of millions is a pretty decent deal.
It's not a workaround though, it's a local calculation. Something that is impossible if we move to streaming. And if we have streaming devices powerful enough to do the local calculation, then there is no point in streaming, because we just have modern PCs anyway.
Precisely, a workaround for the speed of light problem, which I thought we just said didn't exist. No, the workaround won't be the same as the one used in games, however such workarounds can and do exist. Processing, naturally, has to partially be done client side. Considering TVs will have CPUs at that point, and some already do, I don't see it as too far outside the realm of possibility in 20 years or so to have the TV apply the same sort of algorithm streaming video uses to kill its file size [you watch a 1080p movie over 2 hours. Why are you not storing terabytes of data? Because intelligent algorithms cut that down to usually around 2 GB, with minimal loss in quality most of the time], to be able to send multiple frames of data, so they will play whilst waiting for a response from the client, which tells them the next 5 frames to display, or however many is normal for normal latency.
In this way the screen can continue to present flawless images whilst awaiting the next bit of info from the server. Visually, latency isn't much of a problem after this. It still has some issues, but again, intelligent software design can act as a workaround here. Say the TV sends not only the input, but which frame it was displaying at the time the input was entered. When you can display, say, 10 frames between server responses, and you send back that in the previous batch frame 7 was when the button was pressed, the server can run a check to see if that was within the 'bounds' of when an action should have been performed, and if you have the animations for success and failure start off nearly the same, you can just swap out the ending, and control latency is reduced as well.
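To make that concrete, here is a rough sketch of the "tag each input with the frame it was pressed on" idea in Python. Every name in it is invented for illustration; no real streaming service exposes an API like this, and the dodge window is an arbitrary example.

```python
# A minimal sketch of the scheme described above: the client tags every input
# with the frame it was displaying when the button was pressed, and the server
# judges the timing against that frame. All class and method names are made up;
# no real streaming service works exactly like this.

from dataclasses import dataclass

@dataclass
class TaggedInput:
    button: str
    frame_index: int          # which buffered frame was on screen at press time

class ClientSide:
    def __init__(self):
        self.frame_buffer = []        # the batch of frames the server last sent
        self.current_index = 0        # which of those frames is on screen now

    def receive_batch(self, frames):
        self.frame_buffer = frames    # e.g. the next 10 encoded frames
        self.current_index = 0

    def on_button_press(self, button):
        # Send the input together with the frame it landed on, not just the input.
        return TaggedInput(button, self.current_index)

class ServerSide:
    DODGE_WINDOW = range(5, 9)        # frames 5-8 count as a well-timed dodge (arbitrary)

    def resolve(self, tagged):
        # Decide what "really" happened based on when the client says the button
        # was pressed, then pick which pre-rendered outcome to send next.
        if tagged.button == "dodge" and tagged.frame_index in self.DODGE_WINDOW:
            return "send_dodge_success_frames"
        return "send_dodge_fail_frames"

# Tiny usage example:
client = ClientSide()
client.receive_batch([f"frame_{i}" for i in range(10)])
client.current_index = 7                                        # button pressed on frame 7
print(ServerSide().resolve(client.on_button_press("dodge")))    # -> send_dodge_success_frames
```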
It must be 100% for streaming, otherwise it is not streaming.
Eh, you lose some of the local calculation and display. There is no "it must be 100%" rule for streaming, and with processors in the TVs that accept streaming, you can rely on a small amount of client-side processing to help you along.
Personally I found the Arkham controls very floaty and a pain to use, actually. But yes, they were quite highly praised for some reason.
Depends on the game. See the Arkham series games. They don't feel floaty, and animations for attacks can take half a second at times. There's also the gameplay nicety of needing to time your attacks properly; I often found myself only inputting commands once or twice a second. Twitch gameplay? Yeah, it's going to suffer the same problems online twitch gameplay suffers today. Timing-based gameplay, or god forbid, non-action-oriented gameplay? Yeah, it'll get on fine.
It's not guaranteed to happen in the next 20 years, but it's not outside the realm of possibility that streaming games could become a thing. Certainly not yet, but when technology advances a bit, and the industry sees a couple of properly mainstream pushes towards the streaming style involving game developers, hardware makers and service providers, it could be achieved. It wouldn't be perfect, but then again nothing is. It'd be an undertaking, but it could be done. With a bit more technology, again; presently it's not there.
Not regional servers. Local servers. And I do mean every city, town and village. No, 1000 km will never have reasonable ping for streaming. There is a reason people notice differences between 5 ms and 1 ms response in monitors. Every millisecond matters for gameplay to feel good.
Bad Jim said: Are you really saying that FTL communication is more realistic than regional servers?
Light is pretty damn fast. Roughly 300,000 km/sec, in fact. If a user is 1,000 km away from the server, light can travel that distance in 3.3 ms, and the absolute minimum ping imposed by the speed of light is a very reasonable 6.7 ms. Obviously the real ping will be higher due to processing time and the fact that there won't be a straight wire running directly from user to server, but we are still looking at a very reasonable ping without having to build a ridiculous number of server farms.
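For anyone who wants to check the arithmetic, here's the same calculation written out. The figures are the ones from the post (light speed in a vacuum); signals in fibre travel at roughly two-thirds of that, so real-world minimums come out somewhat higher.

```python
# Back-of-the-envelope minimum ping imposed by the speed of light alone.
# Uses the post's numbers (c in a vacuum); fibre carries signals at roughly
# two-thirds of c, so actual figures would be higher even before routing and
# processing overhead.

SPEED_OF_LIGHT_KM_PER_S = 300_000   # rounded
distance_km = 1_000                 # user-to-server distance from the example

one_way_ms = distance_km / SPEED_OF_LIGHT_KM_PER_S * 1000
round_trip_ms = 2 * one_way_ms

print(f"one-way: {one_way_ms:.1f} ms, minimum ping: {round_trip_ms:.1f} ms")
# -> one-way: 3.3 ms, minimum ping: 6.7 ms
```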
Actually, your ping is likely to be a lot lower than what you currently get in multiplayer games, because they will need a lot more hardware, and will therefore need to build lots more data centers to put their hardware, so your nearest data center will be closer.
As wonderful as it would be to have the laws of physics prevent shitty business practices, they don't.
The difference between 5 ms response and 1 ms response in monitors is not noticeable because of latency. It is noticeable because a long response time means that a large fraction of each frame is spent not showing the correct picture. It is clearly better to spend 1/16th of the time not showing the correct picture than nearly 1/3 of the time.
Strazdas said: Not regional servers. Local servers. And I do mean every city, town and village. No, 1000 km will never have reasonable ping for streaming. There is a reason people notice differences between 5 ms and 1 ms response in monitors. Every millisecond matters for gameplay to feel good.
Bad Jim said: Light is pretty damn fast. Roughly 300,000 km/sec, in fact. If a user is 1,000 km away from the server, light can travel that distance in 3.3 ms, and the absolute minimum ping imposed by the speed of light is a very reasonable 6.7 ms. Obviously the real ping will be higher due to processing time and the fact that there won't be a straight wire running directly from user to server, but we are still looking at a very reasonable ping without having to build a ridiculous number of server farms.
Actually, your ping is likely to be a lot lower than what you currently get in multiplayer games, because they will need a lot more hardware, and will therefore need to build lots more data centers to put their hardware, so your nearest data center will be closer.
Also see my above discussion why ping now =/= ping in streaming.
True, but yes, the core idea is that console players sacrifice quality for convenience/cost. It just comes with the territory. Whether that convenience/cost is actually worth it or not... Eh. That's another discussion. But those are pretty much the only two reasons not to play on the PC. Outside exclusives, which can go fuck themselves.
Strazdas said: We could talk for a long time about why people buy consoles, and it wouldn't be pretty, but I get your point here. Consolites were always more complacent about being mistreated.
We're getting to the point where the average person won't believe vaccines are real, yet that doesn't mean they'll stop working. The average person couldn't tell you shit about how their mobile phone works, but it still does. They don't need to believe in the technology, or know that it exists, in order to use the devices that use that technology. They just need to be told what the device does in the highest-level sense.
Yeah, perhaps 200 years was generous for technological advancement. But the "we know it exists" is still very much only at the high-science level; the average person does not even believe in such things, and what we're talking about is the average person using them.
Science can't solve everything, but it can definitely find solutions that allow us to move forward from anything. I still don't have wings. I can't flap my arms and fly. We haven't solved that particular problem. But we have an alternative. A workaround that means we don't need wings to fly. We can use balloons. Or giant mechanical devices. Or spinning blades. We may never end up breaking the speed of light, but that doesn't mean we'll never reach the stars; we could develop cryogenic sleep, and wake up when we get there. It's never the perfect solution, but it's one that can work.
I get it, you believe science will solve everything. But we're talking about FTL travel here. This is one barrier we've been hitting our heads against for over a hundred years now.
No, it needs to be everywhere. That's the point. If it's not everywhere, latency is going to be a killer and no one is going to use it. This is why just having more data centers is not a solution. Unless you can build and upkeep them for basically free, it's not going to be a financially viable solution, and without that we won't be able to do game streaming.
Unless you're just suggesting we tell everyone outside major cities to go fuck themselves, which, admittedly, companies have done in the past.
Yes, and the local calculation is a workaround. It finds a solution by dodging the initial problem. That's why it's not a solution. It's a workaround. It works around the issue.
It's not a workaround though, it's a local calculation. Something that is impossible if we move to streaming. And if we have streaming devices powerful enough to do the local calculation, then there is no point in streaming, because we just have modern PCs anyway.
No, that's the point: there is no client-side processing in streaming. The entire appeal of streaming is that you don't have to have a machine capable of processing it.
You do realize that the reason streaming does not take terabytes is that the quality is awful? The compression algorithms used in, say, a Netflix stream make the video look horrible, but allow people with below-10 Mbps internet to watch it. It's bad enough that I have to deal with it for video streams; having this for games would be the day I quit gaming.
Also, if I understand correctly, you are saying that you want servers to calculate all possible scenarios and send a few frames' worth of each of them to trick the system? You do realize how much processing power that requires, right? Not to mention the coding nightmare this would be given the variety of game engines and process chains. Yeah, it can technically be done, if you can spend a few million dollars per user on servers.
See above. I stream data from the Diablo III servers in order to play the game. Same with SimCity. Same with any MMO. It's not magically not streaming that data because some of the calculations are done on my computer. Streaming isn't a 100% thing. You can stream any amount of data, and merge it with any amount of calculated data. Hell, when you stream a movie, you're still actually running a movie player on your computer. Obviously you're not streaming that movie, right? Quicktime is running, calculating how to convert the streamed information to on-screen display, handling pause and unpause, rewind, fast forward etc., storing cache info. My god, we don't stream movies, do we?
It must be 100% for streaming, otherwise it is not streaming.
Eh, they're floaty in the same sense Guitar Hero is floaty, because you have to play to the rhythm rather than just spam the buttons. It's timing-based rather than reflex-based, and yeah, games like that - which can still be action games - would see a lot less impact from streaming than reflex-based games.
Personally I found the Arkham controls very floaty and a pain to use, actually. But yes, they were quite highly praised for some reason.
Yes, I can see non-action gaming being somewhat less affected. But that is just a small part of gaming, and, worth mentioning, one that Ubisoft does not develop.
Streaming games were a reality 5 years ago. The OP mentioned OnLive, which allowed you to stream games. It mostly suffered from a small selection of games, so no one knew about it. And yes, they tried their hardest to find a solution to latency; they didn't succeed much.
It can be achieved, but it would be a downgrade for the player. Heck, I'll believe that Ubisoft and company could even force it into gaming if they wanted, like they forced Uplay. That does not mean it's actually an improvement.
You'll note console players are used to 33.3 ms response from monitors, playing at 30 FPS. And if you're playing at 60 FPS, the best you're going to get is 16.6 ms response in what you see compared to what you do. Yeah, some PC players play at even higher FPS and can notice the high-percentile differences, but we're not talking about max high-end PC streaming. We're talking console-level streaming, which is a lot more reasonable.
Not regional servers. Local servers. And I do mean every city, town and village. No, 1000 km will never have reasonable ping for streaming. There is a reason people notice differences between 5 ms and 1 ms response in monitors. Every millisecond matters for gameplay to feel good.
Also see my above discussion why ping now =/= ping in streaming.
When talking about streamed gaming, latency will be a factor in response times. For streaming, the two become intertwined.
Bad Jim said: The difference between 5 ms response and 1 ms response in monitors is not noticeable because of latency. It is noticeable because a long response time means that a large fraction of each frame is spent not showing the correct picture. It is clearly better to spend 1/16th of the time not showing the correct picture than nearly 1/3 of the time.
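As an illustration of why the two get intertwined once the frame itself is rendered remotely, here is a hypothetical latency budget for one streamed input-to-display round trip. The individual numbers below are assumptions picked as round figures, not measurements from any real service; the point is only that every stage sits on the same path and adds up.

```python
# Illustrative only: an assumed latency budget for cloud-streamed gameplay.
# None of these figures are measurements; they are placeholder values meant to
# show how network, server, codec and display delays stack on a single path.

latency_budget_ms = {
    "input travels to server (half of ping)": 15,
    "server simulates and renders the frame": 16,
    "encode the video frame":                  5,
    "frame travels back (other half of ping)": 15,
    "decode the video frame":                  5,
    "display / monitor response":              5,
}

total = sum(latency_budget_ms.values())
for stage, ms in latency_budget_ms.items():
    print(f"{stage:<42} {ms:>3} ms")
print(f"{'total input-to-photon delay':<42} {total:>3} ms")   # 61 ms with these numbers
```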
You are grossly overstating the sensitivity that ordinary people have to latency, and completely ignoring contrary evidence like the fact that tons of people play games at 30fps on consoles when they could play those same games at 60fps or better on PCs. You are also ignoring the fact that a powerful server can serve multiple players and spend much less time rendering each frame than a console or even a PC.
No, despite a few extremists, the vast majority still very much believe vaccines are real, and even know the basic concept of how they work. It's even taught in schools.
Joccaren said: We're getting to the point where the average person won't believe vaccines are real, yet that doesn't mean they'll stop working. The average person couldn't tell you shit about how their mobile phone works, but it still does. They don't need to believe in the technology, or know that it exists, in order to use the devices that use that technology. They just need to be told what the device does in the highest-level sense.
But that is... pointless. We have local calculation, the suggestion is to move to server calculation and streaming, and the workaround is to do local calculation? Why do we even need streaming at all, then?
Yes, and the local calculation is a workaround. It finds a solution by dodging the initial problem. That's why it's not a solution. It's a workaround. It works around the issue.
DVD compression is extremely old and bad, actually. I moved to Blu-rays a couple of years ago. These use much better compression algorithms and actually have a decent bitrate - 30,000 kbps on average, more than 4 times that of Netflix. Even then, that compression is something I'd love to be without. I would also like them to move to H.265 more (not to be confused with H.264). Alas, we still cannot run that properly, because even high-end CPUs have problems decoding a 1080p video due to the very calculation-intensive compression (which results in better quality).
And it's not just Netflix that does that compression. Your DVDs are compressed like that. ALL digital media is compressed like that. DVDs store 20 GB max, so tell me how you're fitting terabytes of video and sound data onto one for a 2-hour movie?
You're not. You're fitting 2 GB or so most of the time, using compression algorithms to cut out the excess, useless information. Some compression cuts down more than others, to varying effect. Point is, though, ALL video media is compressed in some way. You're not going to be dealing with terabytes of data, no matter what medium you're using to play your 1080p movie, unless it's a film reel or something.
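For scale, here is the rough arithmetic behind that point, assuming uncompressed 8-bit RGB frames at 24 fps and ignoring audio (the simplest possible case):

```python
# Why a 2-hour 1080p movie would be roughly a terabyte without compression.
# Assumes uncompressed 8-bit RGB at 24 fps; audio and container overhead ignored.

width, height   = 1920, 1080
bytes_per_pixel = 3                  # 8 bits each for R, G and B
fps             = 24
seconds         = 2 * 60 * 60        # a 2-hour film

raw_bytes = width * height * bytes_per_pixel * fps * seconds
print(f"uncompressed size: {raw_bytes / 1e12:.2f} TB")        # ~1.07 TB

compressed_bytes = 2e9               # the ~2 GB figure quoted above
print(f"compression ratio: roughly {raw_bytes / compressed_bytes:.0f}:1")
```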
And if you press a button after only 2 frames, you're fucked. Thanks but no thanks. Frame buffers need to be REDUCED, not increased. The current 3-frame buffers already create enough input lag that people will pick screen tearing over them.
And no, the server would send the next 10 frames of gameplay, and, much like animations in games today are designed to look good despite being completed in 0.01 seconds for "good feel", the game designers design the animations for the two potential outcomes of a given action to start in a very similar way. The server then receives what happens, calculates the next 10 frames, and sends them along. It also decides what actually happened based on what the client said happened, much like multiplayer servers today do, and they send predicted algorithms for player positions and such to clients as well to try and create a smoother feeling of play. Client side, the first 10 frames could be either one action or the other. The second set confirms which action the player took. Yes, it's still more processing power than just streaming normally, but we're hitting a level of graphics now where it's going to be a struggle to improve. In 20 years? Look at the old Rare games. Compare to today. We'll hit a level of graphical technology where only tiny improvements exist from huge processing power requirements, and the streaming centres could easily cut this down, much like consoles, displaying only the essentials rather than every possible best setting, and keeping a pretty good look to each game. That greatly reduces the power cost, and future technology will greatly increase the power available. Hell, I mean, what's the point of a streaming centre if it doesn't have better hardware than you do?
That's not stream gaming. You are going into semantics now, when in reality streaming gaming means you are streaming the entire game - like Netflix - with only controls coming from the player side. Your multiplayer gaming is not an example of streaming.
See above. I stream data from the Diablo III servers in order to play the game. Same with SimCity. Same with any MMO. It's not magically not streaming that data because some of the calculations are done on my computer. Streaming isn't a 100% thing. You can stream any amount of data, and merge it with any amount of calculated data. Hell, when you stream a movie, you're still actually running a movie player on your computer. Obviously you're not streaming that movie, right? Quicktime is running, calculating how to convert the streamed information to on-screen display, handling pause and unpause, rewind, fast forward etc., storing cache info. My god, we don't stream movies, do we?
But we do. Local calculations are fine. It just requires some calculations to be done server-side. A "Stream" - hence, streaming - is simply a flow of information. It doesn't imply much about the size of that flow.
Not true. It does not matter which framerate you are playing at; there is going to be a significant difference in the "feel" of control based on input lag. This is because things DO change mid-frame based on player input. Input does not get delivered only every 33.3 ms, it gets delivered when you press it. This makes a difference to the gameplay.
You'll note console players are used to 33.3 ms response from monitors, playing at 30 FPS. And if you're playing at 60 FPS, the best you're going to get is 16.6 ms response in what you see compared to what you do. Yeah, some PC players play at even higher FPS and can notice the high-percentile differences, but we're not talking about max high-end PC streaming. We're talking console-level streaming, which is a lot more reasonable.
Elfgore said: Yeah, he really should not be brushing this off like it is nothing. It's literally the largest obstacle to seeing this happen. My dad has around 1.2 MB a second and a max of thirty GB a month. That is the best he can get, and it is insanely expensive. He'd better hope internet in the U.S. improves massively to see this theory become true.
thebobmaster said: Until data caps are removed, and bandwidth is strong everywhere, I don't think this will happen yet. He brushes off the bandwidth cost, but that is a rather serious obstacle.
1. In some areas, you'd be surprised.
Strazdas said: No, despite a few extremists, the vast majority still very much believe vaccines are real, and even know the basic concept of how they work. It's even taught in schools.
Why do you drive a car? You still have to move your feet to drive it, pressing the accelerator, the brakes and the clutch. Obviously it's pointless, as you're still moving your legs to get there...
But that is... pointless. We have local calculation, the suggestion is to move to server calculation and streaming, and the workaround is to do local calculation? Why do we even need streaming at all, then?
And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point: 30,000 kbps is orders of magnitude less than 600 Mbs. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - and we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Blu-ray and get a much higher quality experience?
DVD compression is extremely old and bad, actually. I moved to Blu-rays a couple of years ago. These use much better compression algorithms and actually have a decent bitrate - 30,000 kbps on average, more than 4 times that of Netflix. Even then, that compression is something I'd love to be without. I would also like them to move to H.265 more (not to be confused with H.264). Alas, we still cannot run that properly, because even high-end CPUs have problems decoding a 1080p video due to the very calculation-intensive compression (which results in better quality).
And yes, it's not ideal, but that's the only thing we've got, so it's that or nothing. We have better in video games, so why downgrade?
Honestly, only the highest-end "MUST RUN AT 300 FPS, BETTER PLAY AT 480P SO THAT IT'S HIGHER FPS" junkies espouse that view. The gaming community is pretty much unanimously fine with frame buffers under normal circumstances.
And if you press a button after only 2 frames, you're fucked. Thanks but no thanks. Frame buffers need to be REDUCED, not increased. The current 3-frame buffers already create enough input lag that people will pick screen tearing over them.
Incorrect. The server calculates where it expects you to be. Your client calculates where you are, and sends it to the server. The server also calculates where it expects everyone else to be, and relays that information to your client to display. This is a core part of anti-cheat mechanisms: if what your client says you are doing is not what the server expects you to be doing, it flags it as likely cheating. The server still has to predict what every player is going to do, though, funnily enough.
No, servers don't predict anything in current multiplayer games. Our local clients do, calculating it locally. Servers are sanity checks and new data from other players.
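Stripped of the disagreement over who does what, the pattern both posts are describing is usually called client-side prediction with server reconciliation. Here is a bare-bones sketch of it, reduced to movement along one axis with made-up numbers; it is not any particular engine's netcode.

```python
# Bare-bones client-side prediction with server reconciliation, reduced to
# movement along one axis. Purely illustrative; not any real engine's netcode.

class PredictingClient:
    SPEED = 5.0                          # units moved per input

    def __init__(self):
        self.position = 0.0
        self.pending = []                # inputs sent but not yet acknowledged
        self.sequence = 0

    def apply_input(self, direction):
        # Apply the input locally straight away so movement feels instant...
        self.position += direction * self.SPEED
        self.sequence += 1
        self.pending.append((self.sequence, direction))
        return self.sequence, direction  # ...and this is what gets sent to the server

    def on_server_state(self, acked_sequence, server_position):
        # The server's result is authoritative: adopt its position, then replay
        # any inputs it has not processed yet so local movement stays smooth.
        self.position = server_position
        self.pending = [(s, d) for s, d in self.pending if s > acked_sequence]
        for _, direction in self.pending:
            self.position += direction * self.SPEED

class AuthoritativeServer:
    SPEED = 5.0

    def __init__(self):
        self.position = 0.0

    def process_input(self, sequence, direction):
        # The server runs the same movement rules; a claim the rules can't
        # produce is where a sanity check / anti-cheat flag would fire.
        self.position += direction * self.SPEED
        return sequence, self.position

# Usage: the client moves right twice, then the server acknowledges the first input.
client, server = PredictingClient(), AuthoritativeServer()
msg1 = client.apply_input(+1)
msg2 = client.apply_input(+1)
client.on_server_state(*server.process_input(*msg1))
print(client.position)                   # 10.0: server state (5.0) plus replayed msg2
```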
I never said we'd hit the limit of graphics improvements. That'll go on until the end of time. What I said is we'd hit the point where you'd need a ton of extra processing power to deliver very minimal extra detail. Compare today's PC flagship titles with all the highest settings on, and then just turn realistic hair physics off. How much better does it run? A shitton. How much do you notice the worse hair? Eeeeeh, not a ton. Same goes for the more intensive forms of ambient occlusion and anti-aliasing compared to the more simplistic ones. Yeah, there are differences that can be noticed, but for a normal player in a normal gameplay session, it's so small a difference that most won't pay the extra thousand dollars required to achieve it all. And this will only be exacerbated in the future, as each tiny increase in graphics quality demands huge amounts of extra resources. It's diminishing returns at its finest, and it allows streamers to stream a lower-quality version of the graphics settings that looks very similar to the higher-quality version, for much less processor power.
Also, you really are wrong about graphics improvements. Just look at what the Snowdrop engine did. Or Avalanche. We're nowhere even close to the point where we struggle to improve. Maybe we will hit that level in 20 years, but I doubt it, given that, assuming the same technological progression, it will take until 2056 to reach even current Pixar levels of graphics rendered in real time for enthusiast users. Yes, we will hit the limit some day, but it's not anywhere close enough to use it as an argument yet.
Even Netflix has a decoder at your end that needs to run in order to turn the stream into watchable media. You even admit it here: some processing - namely, for you, controls - needs to happen client side. Yeah. Funnily enough, that's how it works. There is no 100% streaming. Streaming the vast majority of the intensive calculations, and making some minor calculations on your end to facilitate the streaming of those other calculations, is, however, 100% 'game streaming'. It's how streaming as a whole works. Your pureblood "everything is streamed" stream doesn't exist, as then you would be streaming literally terabytes of data for an hour's play session, or movie. Funnily enough, that's not how it works. Stuff is always processed client side. You can either continue trying to deny that to make the problem look more impossible than it is, or you can accept that that's the way it's always been done, and that it can be done that way in the future. You're not running complex graphics or physics calculations that take a ton of power; you're processing inputs and where in the grand scheme of things they are received, and then handling the incoming frames and when to display them.
That's not stream gaming. You are going into semantics now, when in reality streaming gaming means you are streaming the entire game - like Netflix - with only controls coming from the player side. Your multiplayer gaming is not an example of streaming.
Also not completely true. Funnily enough, games run at a particular framerate - non-graphical, but calculation-wise. Modern programming design is intelligent in that if it runs FASTER than that framerate, it slows itself down, so that you don't have different experiences on different-speed machines, where on a slow one you have 10 seconds to dodge a bullet, whilst on a fast one that bullet has been through you and killed you in 0.01 seconds.
Not true. It does not matter which framerate you are playing at; there is going to be a significant difference in the "feel" of control based on input lag. This is because things DO change mid-frame based on player input. Input does not get delivered only every 33.3 ms, it gets delivered when you press it. This makes a difference to the gameplay.
Ok, so console streaming couldn't happen because consoles are sub-par, and to stream a similar quality of experience to what consoles are now getting, you'd have to stream the top-end PC scene, and that's just impossible.
Also, considering consoles currently deliver not reasonable but subpar performance that should not exist at all in the 21st century (and yes, I was playing on a 60 Hz 1600p monitor back in the 90s; it was a clunky CRT, but it was still better than what consoles do now) - this is not reasonable, and won't be reasonable if you turn it into streaming.
But your suggestion of doing calculations both locally and on the server does not make it do less work. In fact, you end up doing more, because you are calculating both locally and on the server, instead of only locally as is done now.
Joccaren said: Why do you drive a car? You still have to move your feet to drive it, pressing the accelerator, the brakes and the clutch. Obviously it's pointless, as you're still moving your legs to get there...
Or, and hear me out here, the fact that you have to do vastly less leg work makes it worth it.
Same with streaming. It's not magically not worth it because you have to calculate 10 megaflops of data whilst the server does 100 teraflops. It's the fact that you're not doing those 100 teraflops that makes the difference, not the fact that you still have to do 10 megaflops.
No, DVD compression is not perfectly serviceable. It's a case of "we simply have nothing better". You could probably make that argument with Netflix next, and I would call every such consumer blind. Oh, wait, you did.
And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point: 30,000 kbps is orders of magnitude less than 600 Mbs. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - and we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Blu-ray and get a much higher quality experience?
If you can answer that question, well, you have answered your own question.
Is that why we invented not one but two techniques, coming out last/this year, to remove those buffers? No, the gaming community is not fine with frame buffers. They never were.
Honestly, only the highest-end "MUST RUN AT 300 FPS, BETTER PLAY AT 480P SO THAT IT'S HIGHER FPS" junkies espouse that view. The gaming community is pretty much unanimously fine with frame buffers under normal circumstances.
Additionally, no, if you press it after only 2 frames you're not fucked. Perhaps you missed the part where locally, your device records which frame the button was pressed at, and feeds that through to the server, which then treats its calculations as if that was when you had pressed the button, and gives you your output based on it. Yes, it'd require some intelligent programming to get that done right, and intelligent art design. I've said as much. Doesn't mean it isn't something that's possible to do.
No. The server only has to check what is possible. If you are being shot and the server calculates that the bullet hit you, then you not losing health is not possible, therefore the server sends the client a message to lose health, and the client adjusts health if its own calculations were wrong. (Or, as is now popular in games, it doesn't, and cheaters run free because the server does not check anything. See: GTA 5.)
Incorrect. The server calculates where it expects you to be. Your client calculates where you are, and sends it to the server. The server also calculates where it expects everyone else to be, and relays that information to your client to display. This is a core part of anti-cheat mechanisms: if what your client says you are doing is not what the server expects you to be doing, it flags it as likely cheating. The server still has to predict what every player is going to do, though, funnily enough.
You're still wrong. Like I said, look at Snowdrop or Avalanche. They made huge improvements in looks despite no big leaps in graphical processing. Oh, and hair is GORGEOUS in The Witcher 3. Human hair not so much, but animal fur - you just want to pet it! The performance hit is mostly because the devs fucked up with 64x tessellation; once they patched that out it didn't tank performance as much. Also, a bit off topic, but have you seen what Nvidia did with grass simulation? I can't wait to see that in a game.
I never said we'd hit the limit of graphics improvements. That'll go on until the end of time. What I said is we'd hit the point where you'd need a ton of extra processing power to deliver very minimal extra detail. Compare today's PC flagship titles with all the highest settings on, and then just turn realistic hair physics off. How much better does it run? A shitton. How much do you notice the worse hair? Eeeeeh, not a ton. Same goes for the more intensive forms of ambient occlusion and anti-aliasing compared to the more simplistic ones. Yeah, there are differences that can be noticed, but for a normal player in a normal gameplay session, it's so small a difference that most won't pay the extra thousand dollars required to achieve it all. And this will only be exacerbated in the future, as each tiny increase in graphics quality demands huge amounts of extra resources. It's diminishing returns at its finest, and it allows streamers to stream a lower-quality version of the graphics settings that looks very similar to the higher-quality version, for much less processor power.
No. What you're talking about is tying a game's clock to CPU speed. That isn't done anymore, and it isn't related to how games are rendered. In fact, a game's internal clock is independent of framerate in any well-designed game (only a few in the past few years made this mistake; tying physics to framerate is a more common one). A game's calculations do not wait for the next frame to count player input. This is what makes the difference when it comes to input, even in low-FPS gameplay.
Also not completely true. Funnily enough, games run at a particular framerate - non-graphical, but calculation-wise. Modern programming design is intelligent in that if it runs FASTER than that framerate, it slows itself down, so that you don't have different experiences on different-speed machines, where on a slow one you have 10 seconds to dodge a bullet, whilst on a fast one that bullet has been through you and killed you in 0.01 seconds.
Completely false - see above.
Now, I don't know what speed every game runs itself at, but it's going to be one of:
60 FPS
90 FPS
100 FPS
120 FPS
These periods are the only times your client actually calculates inputs. Sure, it'll 'receive' them before then, but it won't actually do anything with them, or even note that it's received them. It'll just go "eh" until its calculation frame comes up.
Best case, in games designed to run as fast as possible because they are twitch-reflex focused, you have at least 8.3 ms of response time just for the game to calculate what is going on. In games not as focused on twitch reflexes, you have more.
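For reference, the loop structure this back-and-forth is about is the textbook fixed-timestep game loop sketched below. Input events are collected whenever they arrive but only consumed on the next simulation tick; how much that gap actually matters to feel is precisely what is being argued, and real engines vary in how they handle it. The `game`, `poll_events` and `render` callbacks are placeholders a caller would supply.

```python
# Textbook fixed-timestep game loop (illustrative sketch, not any specific engine).
# The simulation advances in fixed DT steps; inputs are gathered as they arrive
# but are only applied when the next simulation step runs.

import time

UPDATE_HZ = 60
DT = 1.0 / UPDATE_HZ                 # 16.7 ms simulation step

def run(game, poll_events, render, max_seconds=5.0):
    start = previous = time.perf_counter()
    accumulator = 0.0
    input_queue = []

    while time.perf_counter() - start < max_seconds:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Collect whatever inputs arrived since the last pass through the loop...
        input_queue.extend(poll_events())

        # ...but only hand them to the simulation on fixed ticks.
        while accumulator >= DT:
            game.update(DT, input_queue)
            input_queue.clear()
            accumulator -= DT

        # Rendering can run as often as the hardware allows, independent of DT.
        render(game)
```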
No, console streaming couldn't happen for a multitude of reasons, but that does not mean your suggestion of a "similar" experience is something that should be acceptable in the first place. Oh, and I'm sorry, since when is a basic minimum framerate "the top-end PC scene"? Apparently five-year-old GPUs are top end now!
Ok, so console streaming couldn't happen because consoles are sub-par, and to stream a similar quality of experience to what consoles are now getting, you'd have to stream the top-end PC scene, and that's just impossible.
Right. I'm seeing what's going on here. You're here to have a bash at consoles and game making companies, not actually discuss the feasibility of streaming a console level experience. Cool. I guess that means we can leave it here. Funnily enough, whilst you don't seem to accept consoles, a great many millions do. And hell, I understand. I can't game on consoles because they're too shit. But a lot of the market can, so if you're going to say console streaming can't be done because it won't be good enough for the console market, and your argument is that the console market is full of plebs who accept poor quality so you'd better instead keep it at the PC level, where you originally weren't targeting it... Yeah, you're just having a whole different discussion now.
Evil's a bit reductive, but then again, I'm not sure they're not Captain Planet villains.
Li Mu said: I find it bizarre that the US lags so far behind other first-world nations when it comes to internet. Your telecom companies sound evil. I work in Moscow (although I'm British) and I pay $6 a month for internet with no limitations. I can usually DL a game of between 6 and 10 GB in around 15 minutes. Of course, Moscow is a capital city and not representative of the rest of the country. But if Russia can have fast, uncapped internet, why can't the US?
One word....'GREEEEEEEEED'
Hell, my parents live in an area of the UK that is pretty rural and they still get uncapped internet. The speeds are erratic since their house seems to be connected to the internet via a long piece of string, but still, it sounds better than what you guys get. :-(
And that's why Blu-ray is already dying while streaming services are on the rise. And kind of the kicker is, I'm an early adopter of Blu-ray, and I still use Netflix. Why? I don't need that quality for everything. From what I can tell, I'm far from alone.
Joccaren said: And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point: 30,000 kbps is orders of magnitude less than 600 Mbs. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - and we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Blu-ray and get a much higher quality experience?
It's rather misleading to say that people hated the 60 Hz tick rate in BF4. What actually happened is that they increased the tick rate to 60 and the increased physics calculations on water maps were too much and bogged down the servers. This is a problem caused by the tick rate being too high, not too low. No one had a problem with the latency inherent in a 60 Hz tick rate.
Strazdas said: You are underestimating the sensitivity, actually. A 60 tick rate in BF4 was very disastrously felt by everyone playing at that rate, for example. Really, you're bringing 30 fps arguments in here? You already lost then; 30 fps should not exist at all.
So what? If you're setting up a data center, it is no more expensive to have 200 big servers that support 5 players each than to have 1,000 thin servers that each support a single player.
Strazdas said: You will still need the same amount of rendering power to render the frame, so a server that can serve 5 people will likely cost 5 times as much.
You mean this:
Strazdas said: Worth noting, Crackdown 3 is coming out soon and it uses Microsoft cloud computing for its physics engine. The demo they showed has them using not one, not two, but eleven servers to power a single game experience. Basically, Microsoft runs $5,000 worth of hardware for a single player instance. They may optimize before release, of course, but this is the kind of cloud computing we're talking about here - the expensive kind.