Ubisoft CEO: Streaming Will Replace Consoles

Canadamus Prime

Robot in Disguise
Jun 17, 2009
14,334
0
0
I don't think the current Internet infrastructure is capable of supporting that. Many of us are on High Speed Internet, but if I'm not mistaken there are still a great many areas that aren't. This may come about eventually, but not any time soon.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
ShakerSilver said:
Streaming games is fucking awful for anything that requires a fast response from the player. Action titles, racing games, fighting games, shooters, etc. are all gimped by network latency. There's no instantaneous internet speed, simply because the speed of light exists, so there's always going to be some latency, even on the fastest of connections.
Not necessarily. A console will always take a significant amount of time to draw each frame. A cloud gaming server, on the other hand, can support many players at once and can therefore draw frames many times faster. It is possible for the time saved by faster rendering to exceed the time lost to network latency and coding/decoding the video stream. And latency tends to matter more in multiplayer games, where the network latency is there anyway.

Cloud gaming has other technical benefits, like the fact that you don't have to install your game, loading times should be much shorter, and framerate can be more consistent. Also, it is arguably beneficial to have the cost of rendering placed on the publishers, although there are also downsides to this.
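To make the trade-off concrete, here's a rough latency-budget comparison. Every number below is an assumption picked for illustration, not a measurement:

```python
# Illustrative latency-budget comparison: local console vs. cloud streaming.
# Every number below is an assumption chosen for illustration, not a measurement.

def total_latency_ms(render_ms, network_rtt_ms=0.0, encode_ms=0.0, decode_ms=0.0):
    """Sum the stages between pressing a button and seeing the result on screen."""
    return render_ms + network_rtt_ms + encode_ms + decode_ms

local_console = total_latency_ms(render_ms=33.3)              # 30 fps local rendering
cloud_stream = total_latency_ms(render_ms=8.0,                # assumed faster server GPU
                                network_rtt_ms=15.0,          # assumed regional ping
                                encode_ms=5.0, decode_ms=5.0) # video codec overhead

print(f"local console: {local_console:.1f} ms, cloud stream: {cloud_stream:.1f} ms")
# Whether the cloud path wins depends entirely on the assumed ping and codec cost.
```

Whether the cloud side of that comparison ever comes out ahead is exactly the open question: it only works if the server really does render several times faster and the ping really is that low.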
 
Apr 5, 2008
3,736
0
0
It could well happen. A *lot* of engineering is going into streaming in various forms. Steam Link and Nvidia Shield are examples of streaming within a home. In these examples, the gaming PC is still doing the work but the game is displayed elsewhere. Ultimately what it is doing is swapping the DVI/HDMI cable from graphics card -> monitor to a network cable.

What it means is, in basic terms, that there is an amount of bandwidth required to display a game with low/no latency over network technologies. The "work" is done elsewhere: an uplink transmits the player's inputs and a downlink provides the live image at 60 FPS.
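As a back-of-the-envelope check on that bandwidth figure (the compression ratio here is an assumption; real codecs vary a lot with the content):

```python
# Rough bandwidth estimate for streaming a 1080p60 game as video.
# The compression ratio is an assumption; real codecs vary widely with content.

width, height, fps = 1920, 1080, 60
bits_per_pixel = 24                 # uncompressed 8-bit RGB
compression_ratio = 100             # assumed H.264-class compression

raw_mbps = width * height * bits_per_pixel * fps / 1e6
compressed_mbps = raw_mbps / compression_ratio

print(f"uncompressed: {raw_mbps:.0f} Mbps, compressed: {compressed_mbps:.1f} Mbps")
# Roughly 30 Mbps on the downlink for the video; the uplink only carries
# controller inputs, which is a few kilobits per second at most.
```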

It's the singularity where video streaming and gaming meet. I imagine it can be done; the issue is one of latency and, for the unfortunate, bandwidth limits. Latency is the real barrier, I think, as a game has to be responsive to the user, and we can detect even small amounts of lag. I don't think processing the game itself or streaming the video/audio from it is the issue.

It will be interesting to see where it goes. But I swear one company was trying this and IIRC Sony bought them... Gaikai (just googled it). We'll see what happens. Consoles are definitely changing tho. They're already using PC hardware/architecture and cost as much as one; the singularity is coming, but what it will be is anyone's guess.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Strazdas said:
Quantum entanglement is probably the most realistic solution to faster-than-light communication. I wouldn't say it's happening though. We can barely agree that the concept exists so far, let alone affect it in any way yet.

There is no workaround for the speed of light. The only real workaround would be to build a data server in every town and village no matter how small. Good luck with that.
Are you really saying that FTL communication is more realistic than regional servers?

Light is pretty damn fast. Roughly 300,000 km/sec, in fact. If a user is 1,000 km away from the server, light can travel the distance in 3.3 ms, and the absolute minimum ping imposed by the speed of light is a very reasonable 6.7 ms. Obviously the real ping will be higher due to processing time and the fact that there won't be a straight wire running directly from user to server, but we are still looking at a very reasonable ping without having to build a ridiculous number of server farms.
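The maths is easy to check, and light in fibre is actually about a third slower than in a vacuum, which only adds a few milliseconds:

```python
# Minimum round-trip time imposed by the speed of light over 1,000 km.
# The fibre figure assumes a refractive index of roughly 1.5 (light in glass).

distance_km = 1000
c_vacuum_km_s = 300_000
c_fibre_km_s = c_vacuum_km_s / 1.5          # about 200,000 km/s in fibre

rtt_vacuum_ms = 2 * distance_km / c_vacuum_km_s * 1000
rtt_fibre_ms = 2 * distance_km / c_fibre_km_s * 1000

print(f"vacuum: {rtt_vacuum_ms:.1f} ms, fibre: {rtt_fibre_ms:.1f} ms")
# ~6.7 ms in a vacuum, ~10 ms in fibre - before any routing, queuing or processing.
```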

Actually, your ping is likely to be a lot lower than what you currently get in multiplayer games, because streaming requires a lot more hardware, so they will have to build many more data centers to house it, and your nearest data center will be closer.

As wonderful as it would be to have the laws of physics prevent shitty business practices, they don't.
 

Naldan

You Are Interested. Certainly.
Feb 25, 2015
488
0
0
The Enquirer said:
As much as I may despise Ubisoft, it's a really good point. Last generation consoles moved away from being mere gaming devices and this generation even more so. With the talks of upgradable hardware it is turning into the exact thing a gaming computer is.

This is furthered with the continued pushing of cross platform play. Soon it'll ultimately come down to a matter of "do you want short or long term costs reduced?"
No, it's a statement made to stay in line. I'm hearing this rumour again and again and I'm starting to see a pattern here. You know why it will fail?
thebobmaster said:
Until data caps are removed, and bandwidth is strong everywhere, I don't think this will happen yet. He brushes off the bandwidth cost, but that is a rather serious obstacle.
thebobmaster is absolutely right. The infrastructure and the contract philosophies are horrible on average.

If the manufacturers do this all together in the next generation, I'm convinced we will see a crash.

It's absolutely no longer about what the consumer masses want or will tolerate. It's about what they are even able to do.

Streaming games is absolutely impossible for the bigger market. And don't take statistics put out by telecommunication companies seriously. I know the ones Telekom puts out - a German company, the biggest in this sector, former monopolist, current unofficial monopolist, and a future growing monopolist if the EU doesn't intervene.

They throw everything together: copper with fibre optic cables, 20 Mbit with 100+ Mbit, vectoring with stable connections. That all makes their statistics sound as if Germany were at the top of the world regarding infrastructure. They are even considering throwing LTE into the mix.

And they are considering introducing bandwidth caps for their already offensive infrastructure, just like the US companies do, for example.

The Internet's infrastructure is already getting screwed, is probably in the shitter for most folks in the US and Germany (and I haven't even bothered to look at France and the UK), and for Germans at the very least the contracts will get worse on average.

The only way this works is if they manage to build a 4K machine capable of streaming absolutely fluently on 50 kbps up and 500 kbps down. Now you tell me how probable that is.

There is no way this will arrive and succeed. If it did succeed, then absolutely everything technical I know about the Internet would be false.

There could be ever so many other cool features, sugarcoating, cherries on top, everything made of gold. If it doesn't work, it doesn't work.
If you ever thought always online was bad, think again.
 

Mad World

Member
Legacy
Sep 18, 2009
795
0
1
Country
Canada
For once, Ubi$oft may actually be right. Of course, it needs to be done properly. OnLive failed, if I recall, because of input delay and okayish graphics.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Joccaren said:
I think there's a reason he said consoles, rather than PCs, are going to be outdated by streaming. By the time you hit 1080p streaming, we'll be using 8K displays and demanding that instead. You can't keep up with the PC scene; it just moves too fast.
And console players demand higher resolution and better framerate... yet they don't get it, and buy anyway. Because more than resolution and framerate, they like the convenience and relative cheapness. It's why they're not gaming on the computer half the time. In fact, historically, the consoles that have gone for more power have been flops. Things just don't work out as well on many levels when that happens. This generation the PS4 is killing the Xbone, but a lot of that is down to the terrible rap the Xbone got near launch, where it focused on being a home media spywar... entertainment device, whilst Sony focused on the games. That, and people are getting sick of Microsoft's shit, TBH. Had both been marketed the same way, they likely would have been pretty even, potentially even with the Xbone pulling ahead on a lower asking price than the PS4.
We could talk for a long time about why people buy consoles, and it wouldn't be pretty, but I get your point here. Consolites were always more complacent about being mistreated.

No, we know it exists, and we can entangle particles. The only problem is the world record for the length of time we have artificially entangled two particles is about 13 seconds or so. We've got a lot of work to go in getting a sustained entanglement, and once that's done we've got to figure out how to use them to communicate, and then you'll have to work around the fact that an entangled particle only has one pair... There are a lot of engineering problems to figure out, but give us 200 years and we'll have nailed it. 200 years ago we were still throwing our sewage out into the street, cars didn't exist, and white man was trying to colonise Africa. 3-4 generations is a lot of time for things to change.
Yeah, perhaps 200 years was generous for technological advancement. But the "we know it exists" is still very much only at the high-science level; the average person does not even believe in such things, and what we're talking about is the average person using them.

You say that...
Its the same sort of thing as people in the old times saying there was no workaround for not being born with wings, and then hot air balloons were invented and we could float through the sky.
Additionally it doesn't need to be everywhere. Just focused around major living areas, where most users will be using it. Why build one out in woop woop for 5 people who play games? Not going to happen. Having one centre cater to 20,000 gamers? Yeah, sounds much better. Rurals get short changed, but then again they always do. From the company's perspective, income is more important than reaching everyone with the technology, and shortchanging a 100,000 more remote users, whilst maintaining a core userbase in the 10s of millions, is a pretty decent deal.
I get it, you believe science will solve everything, but we're talking about FTL here. This is one barrier we have been hitting our heads against for over a hundred years now.

No, it needs to be everywhere. That's the point. If it's not everywhere, latency is going to be a killer and no one is going to use it. This is why just having more data centers is not a solution. Unless you can build and upkeep them for basically free, it's not going to be a financially viable solution, and without that we won't be able to do game streaming.

Unless you're just suggesting we tell everyone outside major cities to go fuck themselves, which, admittedly, companies have done in the past.

Precisely, a workaround for the speed of light problem, which I thought we just said didn't exist. No, the workaround won't be the same as for in games, however such workarounds can and do exist. Processing, naturally, has to partially be done client side. Considering the TVs will have CPUs at that point, and some already do, I don't see that as too far outside the realm of possibility in 20 years or so, to have the TV apply the same sort of algorithm streaming video uses to kill its file size [You watch a 1080p movie over 2 hours. Why are you not storing Terrabytes of data? Because intelligent algorithms cut that down to usually around 2Gb, with minimal loss in quality most of the time], to be able to send multiple frames of data, so they will play whilst waiting for a response from the client, which tells them the next 5 frames to display, or however many is normal for normal latency.
In this way the screen can continue to present flawless images, whilst awaiting the next bit of info from the server. Visually, latency isn't much of a problem after this. Still has some issues, but again, intelligent software creation can act as a workaround here. Say the TV sends not only the input, but which frame it was displaying at the time the input was entered. When you can display, say, 10 frames between server responses, and you send back that in the previous batch, frame 7 was when the button was pressed, the server can have a check it runs to see if that was within the 'bounds' of when an action should have been performed, and if you have the animations for success and failure start of nearly the same, you can just swap out the ending, and control latency is reduced as well.
It's not a workaround though, it's a local calculation - something that is impossible if we move to streaming. And if we have streaming devices powerful enough to do the local calculation, then there is no point in streaming, because we just have modern PCs anyway.

No, that's the point: there is no client-side processing in streaming. The entire appeal of streaming is that you don't have to have a machine capable of processing it.

You do realize that the reason streaming does not take terabytes is because the quality is awful? The compression algorithms used in, say, a Netflix stream make the video look horrible, but allow people with below-10 Mbps internet to watch it. It's bad enough I have to deal with it for video streams; having this for games would be the day I quit gaming.

Also, if I understand correctly, you are saying that you want servers to calculate all possible scenarios and send a few frames of buffer for all of them to trick the system? You do realize how much processing power that requires, right? Not to mention the coding nightmare this would be, given the variety of game engines and process chains. Yeah, it can technically be done, if you can spend a few million dollars per user on servers.

Eh, you lose some of the local calculation and display. There is no "It must be 100%" rule for streaming, and with processors in the TVs that accept streaming, you can rely on a small amount of client side processing to help you along.
It must be 100% for streaming otherwise it is not streaming.

Depends on the game. See the Arkham series games. Don't feel floaty, and animations for attacks can take half a second at times. There's also the gameplay nicety of needing to time your attacks properly, I often found myself only inputting commands once or twice a second. Twitch gameplay? Yeah, going to suffer the same problems online twitch gameplay suffers today. Timing based gameplay, or god forbid, non-action oriented gameplay? Yeah, it'll get on fine.

It's not guaranteed to happen in the next 20 years, but it's not outside the realm of possibility that streaming games could become a thing. Certainly not yet, but when technology advances a bit, and the industry sees a couple of properly mainstream pushes towards the streaming style involving game developers, hardware makers and service providers, it could be achieved. It wouldn't be perfect, but then again nothing is. It'd be an undertaking, but it could be done. With a bit more technology - again, presently it's not there.
Personally I found the Arkham controls very floaty and a pain to use, actually. But yes, they were quite highly praised for some reason.

Yes, I can see non-action gaming being somewhat less affected, but that is just a small part of gaming and, worth mentioning, one that Ubisoft does not develop.

Streaming games were a reality 5 years ago. The OP mentioned OnLive, which let you stream games. It mostly suffered from a small selection of games, so no one knew about it. And yes, they tried their hardest to find a solution to latency; they didn't succeed much.

It can be achieved, but it would be a downgrade for the player. Heck, I'll believe that Ubisoft and company could even force it into gaming if they wanted, like they forced Uplay. That does not mean it's actually an improvement.

Bad Jim said:
Are you really saying that FTL communication is more realistic than regional servers?

Light is pretty damn fast. Roughly 300,000 km/sec, in fact. If a user is 1,000 km away from the server, light can travel the distance in 3.3 ms, and the absolute minimum ping imposed by the speed of light is a very reasonable 6.7 ms. Obviously the real ping will be higher due to processing time and the fact that there won't be a straight wire running directly from user to server, but we are still looking at a very reasonable ping without having to build a ridiculous number of server farms.

Actually, your ping is likely to be a lot lower than what you currently get in multiplayer games, because they will need a lot more hardware, and will therefore need to build lots more data centers to put their hardware, so your nearest data center will be closer.

As wonderful as it would be to have the laws of physics prevent shitty business practices, they don't.
Not regional servers. Local servers. And I do mean every city, town and village. No, 1,000 km will never give a reasonable ping for streaming. There is a reason people notice the difference between 5 ms and 1 ms response in monitors: every ms matters for gameplay to feel good.

Also see my discussion above on why ping now =/= ping in streaming.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Strazdas said:
Bad Jim said:
Light is pretty damn fast. Roughly 300,000Km/sec in fact. If a user is 1000Km away from the server, light can travel the distance in 3.3ms and the absolute minimum ping imposed by the speed of light is a very reasonable 6.7ms. Obviously the real ping will be higher [due] to processing time and the fact that there won't be a straight wire running directly from user to server, but we are still looking at a very reasonable ping without having to build a ridiculous number of server farms.

Actually, your ping is likely to be a lot lower than what you currently get in multiplayer games, because they will need a lot more hardware, and will therefore need to build lots more data centers to put their hardware, so your nearest data center will be closer.
Not regional servers. Local servers. And I do mean every city town and village. No, 1000 KM will never have reasonable ping for streaming. There is a reason people notice differences between 5ms and 1ms response in monitors. every MS matters for gameplay to feel good.

Also see my above discussion why ping now =/= ping in streaming.
The difference between a 5 ms response and a 1 ms response in monitors is not noticeable because of latency. It is noticeable because a long response time means that a large fraction of each frame is spent not showing the correct picture. It is clearly better to spend 1/16th of the time not showing the correct picture than nearly 1/3 of the time.

You are grossly overstating the sensitivity that ordinary people have to latency, and completely ignoring contrary evidence like the fact that tons of people play games at 30fps on consoles when they could play those same games at 60fps or better on PCs. You are also ignoring the fact that a powerful server can serve multiple players and spend much less time rendering each frame than a console or even a PC.
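To put numbers on that fraction argument (assuming a 60 Hz display, so a frame lasts about 16.7 ms):

```python
# Fraction of each frame spent with the panel still transitioning between images,
# for two pixel response times, assuming a 60 Hz refresh (~16.7 ms per frame).

frame_ms = 1000 / 60
for response_ms in (1, 5):
    fraction = response_ms / frame_ms
    print(f"{response_ms} ms response: {fraction:.0%} of each frame in transition")
# 1 ms -> ~6% of the frame, 5 ms -> ~30%: roughly the 1/16 vs 1/3 above.
```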
 

robert022614

meeeoooow
Dec 1, 2009
369
0
0
The infrastructure is definitely not there yet, at least in the US, and the big cable companies seem hesitant to fix that. Financially, they make tons off providing the bare minimum while charging a high premium for the tiers that would make streaming like that viable, for the most part anyway.
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
Strazdas said:
We could talk for a long time why people buy consoles, and it wouldnt be pretty, but i get your point here. Consolites were always more complacent with being mistreated.
True, but yes, the core idea is that console players sacrifice quality for convenience/cost. It just comes with the territory. Whether that convenience/cost is actually worth it or not... Eh. That's another discussion. But those are pretty much the sole two reasons not to play on the PC. Outside exclusives, which can go fuck themselves.

yeah, perhaps 200 years were generous for technological advancement. But the "We know it exists" is still very much only on high science levels, the average person does not even believe in such things, and what were talking aobut is average person using them.
We're getting to the point where the average person won't believe vaccines are real, yet that doesn't mean they'll stop working. The average person couldn't tell you shit about how their mobile phone works, but it still does. They don't need to believe in the technology, or know that it exists, in order to use the devices that use that technology. They just need to be told what the device does in the highest level sense.

I get it, you believe science will solve everything. but were talking about FTL travel here. this is one barrier we were hitting out heads at for over a hundred years now.

No, it needs to be everywhere. thats the point. if its not everywhere latency is going to be a killer and noone is going to be using it. this is why just having more datacenters are not a solution. unless you can build and upkeep them for basically free, its not going to be financially viable solution. and without it we wont be able to do game streaming.

Unless your just suggesting to tell everyone outside major cities to go fuck themselves, which, admittedly, companies done in the last.
Science can't solve everything, but it can definitely find solutions that allow us to move forward from anything. I still don't have wings. I can't flap my arms and fly. We haven't solved that particular problem. But we have an alternative. A workaround that means we don't need to have wings to fly. We can use balloons. Or giant mechanical devices. Or spinning blades. We may never end up breaking the speed of light, but that doesn't mean we'll never reach the stars; we could develop cryogenic sleep, and wake up when we get there. It's never the perfect solution, but it's one that can work.

And yes, I'm not saying its what SHOULD be done, but its something that COULD be done. Yeah, people living in rural areas get screwed over. As said, from a business perspective catering to 10 million players and pissing off a few hundred thousand, for vastly reduced costs, is a pretty decent deal. Sure, as a consumer, it sucks. As a business, its a possible decision.

Its not a workaround though, its a local calculation. something that is impossible if we move to streaming. and if we have streaming devices powerful enough to do the local calculation then there is no point in streaming because we just have modern PCs anyway.

No, thats the point, there is no client side processing in the streaming. thats the entire appeal of streaming is that you dont have to have a machine capable of processing it.

You do realize that the reason streaming does not take terabytes is because the quality is awful? the compression algorythms used in, say, netflix stream makes the video look horribly but allows people with bellow 10mbps internet to watch it. Its bad enough i have to deal with it with video streams, having this for games would be the day i quit gaming.

Also if i understand correctly you are saying that you want servers to calculate all possible scenarios and send a few frames buffer of all of them to trick the system? you do realize how much processing power that requires right? not to mention the coding nightmare this would be given the variety of game engines and process chains. Yeah, it can be technically done, if you can spend a few million dollars per user for servers.
Yes, and the local calculation is a workaround. It finds a solution by dodging the initial problem. That's why it's not a solution. It's a workaround. It works around the issue.
And yes, it can work for streaming. There's always some level of processing going on at the receiving end. It's just how things work. You would never be able to stream games at all if the TV couldn't process controller inputs, store them, then send them as a data packet. The whole thing with streaming isn't that it has NO local calculations, but vastly reduced local calculations. Rather than needing a great CPU, excellent graphics card, and kickass RAM, you just need that CPU and a small amount of RAM. You stream in the rest of the calculations and/or data.

And it's not just Netflix that does that compression. Your DVDs are compressed like that. ALL digital media is compressed like that. DVDs store 20 GB max, so tell me how you're fitting terabytes of video and sound data onto one for a 2-hour movie?
You're not. You're fitting 2 GB or so most of the time, using compression algorithms to cut down the excess, useless information. Some streaming cuts down more than others, to varying effect. The point is, though, ALL video media is compressed in some way. You're not going to be dealing with terabytes of data, no matter what medium you're using to play your 1080p movie, unless it's a film reel or something.
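A quick sanity check on those sizes (8-bit RGB at 24 fps is assumed for the raw figure):

```python
# Sanity check on "a 2-hour 1080p movie is ~2 GB, not terabytes".
# The raw figure assumes 8-bit RGB at 24 fps with no compression at all.

seconds = 2 * 3600
raw_gb = 1920 * 1080 * 24 * 24 * seconds / 8 / 1e9
implied_mbps = 2e9 * 8 / seconds / 1e6      # average bitrate implied by a 2 GB file

print(f"uncompressed: ~{raw_gb:,.0f} GB")
print(f"a 2 GB file works out to ~{implied_mbps:.1f} Mbps average bitrate")
# Around a terabyte raw versus roughly 2 Mbps compressed - a ~500x reduction.
```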

And no, the server would send the next 10 frames of gameplay, and, much like animations in games today are designed to look good despite being completed in 0.01 seconds for "Good feel", the game designers design the animations for the two potential outcomes of a given action to start in a very similar way. The server then receives what happens, calculates the next 10 frames, and sends them along. It also decides what actually happened based on what the client said happened, much like multiplayer servers today do, and they send predicted algorithms for player positions and such to clients as well to try and create a smoother feeling of play. Client side, the first 10 frames could be either one action or the other. The second set confirms which action the player took. Yes, its still more processing power than just streaming normally, but we're hitting a level of graphics now that its going to be a struggle to improve upon. In 20 years? Look at the old Rare games. Compare to today. We'll hit a level of graphical technology where only tiny improvements exist from huge processing power requirements, and the streaming centres could easily cut this down, much like consoles, displaying only the essentials, rather than every possible best setting, and keeping a pretty good look to each game. That greatly reduces the power cost, and future technology will greatly increase the power available. Hell, I mean, what's the point of a streaming centre if it doesn't have better hardware than you do?
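For what it's worth, a very rough sketch of that "send frames ahead and tag inputs with a frame number" idea might look like the toy code below. Every name in it is made up, and it ignores all the hard parts (video encoding, misprediction, animation blending):

```python
# Hypothetical sketch of the "send frames ahead, tag inputs with a frame number"
# idea described above. Not a real protocol; every name here is made up.

from collections import deque

class LookaheadClient:
    def __init__(self):
        self.frames = deque()        # predicted frames received from the server
        self.frame_index = 0         # index of the frame currently on screen
        self.pending_inputs = []     # (frame_index, button) pairs to report back

    def receive_batch(self, batch):
        """Server pushes its next batch of predicted frames (e.g. 10 at a time)."""
        self.frames.extend(batch)

    def press(self, button):
        """Record which frame was on screen when the button was pressed."""
        self.pending_inputs.append((self.frame_index, button))

    def tick(self):
        """Show the next buffered frame and flush the tagged inputs upstream."""
        if not self.frames:
            return None, []          # buffer underrun: stall or repeat the last frame
        frame = self.frames.popleft()
        self.frame_index += 1
        inputs, self.pending_inputs = self.pending_inputs, []
        return frame, inputs         # the server uses the tags to backdate the input

client = LookaheadClient()
client.receive_batch([f"frame {i}" for i in range(10)])
client.press("jump")
print(client.tick())                 # ('frame 0', [(0, 'jump')])
```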

It must be 100% for streaming otherwise it is not streaming.
See above. I stream data from the Diablo III servers in order to play the game. Same with SimCity. Same with any MMO. It's not magically not streaming that data because some of the calculations are done on my computer. Streaming isn't a 100% thing. You can stream any amount of data and merge it with any amount of calculated data. Hell, when you stream a movie, you're still actually running a movie player on your computer. Obviously you're not streaming that movie, right? Quicktime is running, calculating how to convert the streamed information to on-screen display, handling pause and unpause, rewind, fast forward etc., storing cache info. My god, we don't stream movies, do we?
But we do. Local calculations are fine. It just requires some calculations to be done server-side. A "Stream" - hence, streaming - is simply a flow of information. It doesn't imply much about the size of that flow.

Personally i found arkham controls very floaty and was a pain to use actually. But yes, it was quite highly praised for some reason.

Yes, i can see non-action gaming to be somewhat less affected. but that is just a small part of gaming, and worth mentioning one that Ubisoft does not develop.

Streaming games were a reality 5 years ago. the OP mentioned OnLive that allowed you to stream game. It mostly suffered from small selection of games so noone knew about them. And yes, they tried their hardest to find solution to latency, they didn't succeed much.

It can be achieved, but it would be a downgrade for the player. Heck ill believe that ubisoft and company could even force it into gaming if they wanted like they forced Uplay. That does not mean its actually an improvement.
Eh, they're floaty in the same sense Guitar Hero is floaty, because you have to play to the rhythm rather than just spam the buttons. It's timing-based rather than reflex-based, and yeah, games like that - which can still be action games - would see a lot less impact from streaming than reflex-based games.

And yeah, streaming games were a reality... They came somewhat close to working, but the technology wasn't quite there. Same as VR, which was a thing 20 years ago. Technology progresses. Give it 20 years, we could be there.

And yeah, doesn't need to be an improvement. As said, we're not talking what should be done to deliver the highest quality games, we're talking what could be done to deliver passable quality games.

Not regional servers. Local servers. And i do mean every city town and village. No, 1000 KM will never have reasonable ping for streaming. There is a reason people notice differences between 5ms and 1ms response in monitors. every MS matters for gameplay to feel good.

Also see my above discussion why ping now =/= ping in streaming.
You'll note console players are used to 33.3 ms response from monitors, playing at 30 FPS. And if you're playing at 60FPS, the best you're going to get is 16.6 ms response in what you see compared to what you do. Yeah, some PC players play at even higher FPS and can notice the high percentile differences, but we're not talking about max high end PC streaming. We're talking console level streaming, a lot more reasonable.

And again, yeah, some people are going to get screwed over by streaming datacenters. Welcome to 'business', it doesn't need to serve everyone. I mean, there's not home delivery for my favourite restaurant near my house, yet that would be infinitely more convenient for me. Doesn't mean that I can't eat there, or that its some horrible thing - its just a service that's not offered to my area. The only problems arise when its the only option.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
Hmmm, well let's examine this for a moment. Do consoles and console gaming still make money? Yes? Well then, so much for that theory. Really, there's nothing more to it. You can blurb out anything you like, but the bottom line is the bottom line.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Bad Jim said:
The difference between 5ms response and 1ms response in monitors is not noticeable because of latency. It is noticeable because a long response time means that a large fraction of time each frame is spent not showing the correct picture. It is clearly better to spend 1/16th of the time not showing the correct picture than nearly 1/3 of the time.

You are grossly overstating the sensitivity that ordinary people have to latency, and completely ignoring contrary evidence like the fact that tons of people play games at 30fps on consoles when they could play those same games at 60fps or better on PCs. You are also ignoring the fact that a powerful server can serve multiple players and spend much less time rendering each frame than a console or even a PC.
When talking about streamed gaming, latency will be the deciding factor in response times. For streaming, the two become intertwined.

You are underestimating the sensitivity, actually. The 60 tick rate in BF4 was felt disastrously by everyone playing at that rate, for example. Really, you're bringing 30 fps arguments in here? You've already lost then; 30 fps should not exist at all.

You will still need the same amount of rendering power to render each frame, so a server that can serve 5 people will likely cost 5 times as much. Though there is an argument to be made that the server runs 24/7 and thus may serve multiple people over different timeframes, which usually does not happen with home computers.
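To show the shape of that utilisation argument, here's a back-of-the-envelope calculation where every figure (hardware cost, hours played) is an assumption:

```python
# Back-of-the-envelope: hardware cost per player for local boxes vs. shared servers.
# Every figure here is an assumption for illustration only.

console_cost = 400             # one box per player, idle most of the day
server_gpu_cost = 2000         # one server GPU, assumed to render one session at a time
avg_play_hours_per_day = 2     # assumed average per player

# If sessions were spread perfectly evenly, one GPU could cover 24 / 2 = 12 players.
players_per_gpu = 24 / avg_play_hours_per_day
cost_per_player = server_gpu_cost / players_per_gpu

print(f"theoretical players per GPU: {players_per_gpu:.0f}")
print(f"hardware cost per player: ${cost_per_player:.0f} vs ${console_cost} for a console")
# In reality everyone plays in the same evening peak, so the real sharing is far worse.
```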

Worth noting, Crackdown 3 is coming out soon and it uses Microsoft's cloud computing for its physics engine. The demo they showed used not one, not two, but eleven servers to power a single game experience. Basically, Microsoft runs $5,000 worth of hardware for a single player's instance. They may optimize before release, of course, but this is the kind of cloud computing we're talking about here - an expensive one.

Joccaren said:
We're getting to the point where the average person won't believe vaccines are real, yet that doesn't mean they'll stop working. The average person couldn't tell you shit about how their mobile phone works, but it still does. They don't need to believe in the technology, or know that it exists, in order to use the devices that use that technology. They just need to be told what the device does in the highest level sense.
No, despite a few extremists, the vast majority still very much believe vaccines are real and even know the basic concept of how they work. It's even taught in schools.

Yes, and the local calculation is a workaround. It finds a solution by dodging the initial problem. That's why its not a solution. Its a workaround. It works around the issue.
But that is... pointless. We have local calculation, the suggestion is to move to server calculation and streaming, and the workaround is to do local calculation? Why do we even need streaming at all then?

And its not just netflix that does that compression. Your DVDs are compressed like that. ALL digital media is compressed like that. DVDs store 20Gb max, so tell me how you're fitting terabytes of video and sound data onto it for a 2 hour movie?
You're not. You're fitting 2Gb or so most of the time, using streaming algorithms to cut down the excess, useless information. Some streaming cuts down more than others, to varying effects. Point is though, ALL video media is compressed in some way. You're not going to be dealing with terabytes of data, no matter what medium you're using to play your 1080p movie, unless its a film video reel or something.
DVD compression is extremely old and bad, actually. I moved to Blu-rays a couple of years ago. Those use much better compression algorithms and actually have a decent bitrate - about 30,000 kbps on average, more than 4 times that of Netflix. Even then, that compression is something I'd love to be without. I would also like them to move to H.265 more (not to be confused with H.264). Alas, we still cannot run that properly, because even high-end CPUs have problems decoding a 1080p video due to the very calculation-intensive compression (which results in better quality).
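Turning those bitrates into file sizes for a two-hour film makes the gap obvious (the Netflix rate is an assumed typical 1080p figure):

```python
# File size of a 2-hour film at Blu-ray vs. a typical streaming bitrate.
# The Netflix figure is an assumption; actual rates vary by title and connection.

seconds = 2 * 3600
for label, mbps in (("Blu-ray (~30 Mbps)", 30), ("Netflix 1080p (~7 Mbps)", 7)):
    gigabytes = mbps * 1e6 * seconds / 8 / 1e9
    print(f"{label}: ~{gigabytes:.0f} GB")
# Roughly 27 GB versus 6 GB: the same film, a very different amount of data.
```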

And yes, it's not ideal, but that's the only thing we've got, so it's that or nothing. We have better in videogames; why downgrade?

And no, the server would send the next 10 frames of gameplay, and, much like animations in games today are designed to look good despite being completed in 0.01 seconds for "Good feel", the game designers design the animations for the two potential outcomes of a given action to start in a very similar way. The server then receives what happens, calculates the next 10 frames, and sends them along. It also decides what actually happened based on what the client said happened, much like multiplayer servers today do, and they send predicted algorithms for player positions and such to clients as well to try and create a smoother feeling of play. Client side, the first 10 frames could be either one action or the other. The second set confirms which action the player took. Yes, its still more processing power than just streaming normally, but we're hitting a level of graphics now that its going to be a struggle to improve upon. In 20 years? Look at the old Rare games. Compare to today. We'll hit a level of graphical technology where only tiny improvements exist from huge processing power requirements, and the streaming centres could easily cut this down, much like consoles, displaying only the essentials, rather than every possible best setting, and keeping a pretty good look to each game. That greatly reduces the power cost, and future technology will greatly increase the power available. Hell, I mean, what's the point of a streaming centre if it doesn't have better hardware than you do?
And if you press a button after only 2 frames, you're fucked. Thanks but no thanks. Frame buffers need to be REDUCED, not increased. The current 3-frame buffers already create enough input lag that people will pick screen tearing over it.

No, servers don't predict anything in current multiplayer games. Our local clients do, calculating it locally. Servers are there for sanity checks and new data from other players.

Also, you really are wrong about graphics improvements. Just look at what the Snowdrop engine did. Or Avalanche. We're nowhere even close to the point where we struggle to improve. Maybe we will hit that level in 20 years, but I doubt it, given that, assuming the same technological progression, it will take till 2056 to reach even current Pixar levels of graphics rendered in real time for enthusiast users. Yes, we will hit the limit some day, but it's not anywhere close enough to use as an argument yet.


"Hell, I mean, what's the point of a streaming centre if it doesn't have better hardware than you do?"

Control. You don't own games anymore; you just rent them now.

See above. I stream data from the Diablo III servers in order to play the game. Same with Sim City. Same with any MMO. Its not magically not streaming that data because some of the calculations are done on my computer. Streaming isn't a 100% thing. You can stream any amount of data, and merge it with any amount of calculated data. Hell, when you stream a movie, you're still actually running a movie player on your computer. Obviously you're not streaming that movie right? Quicktime is running, calculating how to convert from the streamed information to on-screen display, handling pause and unpause, rewind fast forward ect, storing cache info. My god, we don't stream movies do we?
But we do. Local calculations are fine. It just requires some calculations to be done server-side. A "Stream" - hence, streaming - is simply a flow of information. It doesn't imply much about the size of that flow.
That's not stream gaming. You are going into semantics now, when in reality streamed gaming means you are streaming the entire game - like Netflix - with only the controls coming from the player's side. Your multiplayer gaming is not an example of streaming.

You'll note console players are used to 33.3 ms response from monitors, playing at 30 FPS. And if you're playing at 60FPS, the best you're going to get is 16.6 ms response in what you see compared to what you do. Yeah, some PC players play at even higher FPS and can notice the high percentile differences, but we're not talking about max high end PC streaming. We're talking console level streaming, a lot more reasonable.
Not true. No matter which framerate you are playing at, there is going to be a significant difference in the "feel" of control based on input lag. This is because things DO change mid-frame based on player input. Input does not get delivered only every 33.3 ms; it gets delivered when you press it. This makes a difference to the gameplay.

Also, consoles currently deliver not reasonable but subpar performance that should not happen at all in the 21st century (and yes, I was playing on a 60 Hz 1600p monitor back in the 90s. It was a clunky CRT, but it was still better than what consoles do now). This is not reasonable, and won't become reasonable if you turn it into streaming.
 

Li Mu

New member
Oct 17, 2011
552
0
0
Elfgore said:
thebobmaster said:
Until data caps are removed, and bandwidth is strong everywhere, I don't think this will happen yet. He brushes off the bandwidth cost, but that is a rather serious obstacle.
Yeah, he really should not be brushing this off like it is nothing. It's literally the largest obstacle to seeing this happen. My dad has around 1.2 MB a second and a max of thirty GB a month. That is the best he can get and it is insanely expensive. He better hope internet in the U.S improves massively to see this theory become true.

I find it bizarre that the US lags so far behind other first world nations when it comes to internet. Your telecom companies sound evil. I work in Moscow (although I'm British) and I pay $6 a month for internet with no limitations. I can usually DL a game of between 6 and 10GB in around 15 minutes. Of course, Moscow is a capital city and not representative of the rest of the country. But if Russia can have fast, uncapped internet, why can't the US?
One word....'GREEEEEEEEED'

Hell, my parents live in an area of the UK that is pretty rural and they still get uncapped internet. The speeds are erratic since their house seems to be connected to the internet via a long piece of string, but still, it sounds better than what you guys get. :-(
 

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
Strazdas said:
No, despite a few extremist, the vast majority still very much believe vaccines are real and even know the basic concept of how they work. Its even taught in schools.
1. In some areas, you'd be surprised.
2. Obvious Hyperbole.
3. Missing, ignoring, or avoiding the point.

But that is...pointless. we have local calculation, the suggestion is to move to server calculation and streaming, and the workaround is to do local calculation? why do we even need streaming at all then?
Why do you drive a car? You still have to move your feet to move, pressing the accelerator and brakes, and the clutch. Obviously it's pointless, as you're still moving your legs to get there...
Or, and hear me out here, the fact that you have to do vastly less leg work makes it worth it.
Same with streaming. It's not magically not worth it because you have to calculate 10 megaflops of data whilst the server does 100 teraflops. It's the fact that you're not doing those 100 teraflops that makes the difference, not the fact that you still have to do 10 megaflops.

DVD compression is extremely old and bad actually. I moved to BluRays a couple years ago. These use much better compression algorythms and actually has a decent bitrate. That bitrate is 30.000 kbps on average, more thna 4 times that of netflix. Even then, that compression is something id love to be without. I would also like if they move to H265 more (not to be confused with H264). Alas we still cannot run that properly because even high end CPUs have problem deciding a 1080p video due to very calculation intensive compression (which results in better quality).

And yes, its not ideal, but that's the only thing we got so its that or nothing. We have better in videogames, why downgrade?
And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point. 30,000 kbps is orders of magnitude less than 600 Mb/s. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - and we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Blu-ray and get a much higher quality experience?
If you can answer that question, well, you have answered your own question.

and if you press a button after only 2 frames your fucked. Thanks but no thanks. Frame buffers need to be REDUCED, not increased. the current 3 frame buffers already create enough input lag that people will pick screen tearing over it.
Honestly, only the highest end "MUST RUN AT 300FPS BETTER PLAY AT 480P SO THAT ITS HIGHER FPS" junkies espouse that view. The gaming community pretty much unanimously is fine with frame buffers under normal circumstances.
Additionally, no, if you press it after only 2 frames you're not fucked. Perhaps you missed the part where locally, your device records which frame the button was pressed at, and feeds that through to the server, which then treats its calculations as if that was when you had pressed the button, and gives you your output based on it. Yes, it'd require some intelligent programming to get that done right, and intelligent art design. I've said as much. Doesn't mean it isn't something that's possible to do.

No, servers dont predict anything in current multiplayer games. Our local clients do, calculating it locally. Servers are sanity checks and new data from other players.
Incorrect. The server calculates where it expects you to be. Your client calculates where you are, and sends it to the server. The server also calculates where it expects everyone else to be, and relays that information to your client to display. This is a core part of anti-cheat mechanisms; If what your client says you are doing is not what the server says it expects you to be doing, it flags it as likely cheating. The server still has to predict what every player is going to do though, funnily enough.
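A toy version of that server-side check (dead reckoning plus a plausibility test) could look like this; the speed limit and slack factor are made up:

```python
# Toy server-side sanity check: extrapolate where a player should be and flag
# reported positions they could not possibly have reached. All numbers made up.

MAX_SPEED = 10.0    # assumed maximum movement speed, units per second

def predict(last_pos, velocity, dt):
    """Dead-reckon the expected position from the last known state."""
    return tuple(p + v * dt for p, v in zip(last_pos, velocity))

def plausible(reported_pos, last_pos, dt, max_speed=MAX_SPEED):
    """Reject positions the player could not have reached within dt seconds."""
    dist = sum((r - l) ** 2 for r, l in zip(reported_pos, last_pos)) ** 0.5
    return dist <= max_speed * dt * 1.1      # 10% slack for jitter

print(predict((0.0, 0.0), (3.0, 0.0), dt=0.1))           # where we expect the player
print(plausible((0.3, 0.0), (0.0, 0.0), dt=0.1))          # True: an ordinary move
print(plausible((5.0, 0.0), (0.0, 0.0), dt=0.1))          # False: flag for review
```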

Also you really are wrong about graphic improvements. Just look at what Snowdrop engine did. Or Avalanche. Were nowhere even close to where we struggle to improve upon. Maybe we will hit that level in 20 days, but i doubt it, given that assuming same technological progression it will take till 2056 to reach even current pixar levels of graphic rendered in real time for enthusiast users. Yes, we will hit the limit some day, but its not anywhere close enough to use it as an argument yet.
I never said we'd hit the limit of graphics improvements. That'll go on until the end of time. What I said is we'd hit the limit where you need a ton of extra processing power to deliver very minimal extra detail. Compare today's PC flagship titles with all the highest settings on, and then just turn realistic hair physics off. How much better does it run? A shitton. How much do you notice the worse hair? Eeeeeh, not a ton. Same goes for the more intensive forms of ambient occlusion and anti-aliasing compared to the more simplistic ones. Yeah, there are differences that can be noticed, but for a normal player in a normal gameplay session, it's so small a difference that most won't pay the extra thousand dollars required to achieve it all. And this will only be exacerbated in the future, as each tiny increase in graphics quality demands huge amounts of extra resources. It's diminishing returns at its finest, and it allows streamers to stream a lower-quality version of the graphics settings that looks very similar to the higher-quality versions, for much less processor power.

Thats not stream gaming. you are going into semantics now, when in reality streaming gaming means you are streaming the entire game - like netflix - with only controls being from the player side. Your multiplayer gaming is not an example of streaming.
Even Netflix has a decoder at your end that it needs to process in order to turn it into watchable media. You even admit it here, that some processing - namely for you, controls - needs to happen client side. Yeah. Funnily enough that's how it works. There is no 100% streaming. Streaming the vast majority of the intensive calculations, and making some minor calculations on your end to facilitate the streaming of those other calculations, is, however, 100% 'game streaming'. Its how streaming as a whole works. Your pureblood "Everything is streamed" stream doesn't exist, as then you would be streaming literally terabytes of data for an hour's play session, or movie. Funnily enough, that's not how it works. Stuff is always processed client side. You can either continue trying to deny that to try and make the problem more impossible than it is, or you can accept that's the way its always been done, and that it can be done so in the future. You're not running complex graphics or physics calculations that take a ton of power, you're processing inputs and where in the grand scheme of things they are received, and then handling the incoming frames and when to display them.

Not true. It does not matter which framerate you are playing at there is going to be a significant difference in the "Feel" of control based on input lag. this is because things DO change mid-frame based on player input. Input does not get delivered only every 33,3MS, it gets delivered when you pressed it. This makes a difference to the gameplay.
Also not completely true. Funnily enough, games run at a particular framerate - non-graphical, but calculation wise. Modern programming design is intelligent in that if it runs FASTER than that framerate, it slows itself down, that way you don't have different experiences on different speed machines, where on a slow one you have 10 seconds to dodge a bullet, whilst on a fast one that bullet has been through you and killed you in 0.01 seconds.
Now, I don't know what speed every game runs itself at, but its going to be;
60 FPS
90 FPS
100 FPS
120 FPS
These periods are the only times your client actually calculates inputs. Sure, it'll 'receive' them before then, but it won't actually do anything with them, or even note that its received them. It'll just go "Eh", until its calculation frame comes up.
Best case, in games designed to run as fast as possible because they are twitch reflex focused, you have at least 8.3ms response time for the game to just calculate what is going on. On games not as focused on twitch reflexes, you have more.
Additionally, you were talking about screen refresh rates, and how having a 1ms refresh rate reduces input lag compared to having a 5ms refresh rate, which it just does not. As the previous poster pointed out, its to do with the removal of the previous image, and generally your game is running at 30FPS for a console - or if we're being generous for you - 60 FPS, which means that even if it were running at 1ms refresh rate on the screen, it would still display the exact same frame for 16-33 cycles.
Either way, a certain level of input lag is unavoidable thanks to the fact that the game itself runs at a capped FPS. What this FPS is depends on the game, but again, especially in non-reaction speed focused games, its often a reasonable FPS. Sometimes it'll be faster than what's displayed on screen, it doesn't always have to be though.
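The usual way that fixed simulation rate is implemented is a fixed-timestep loop along these lines - a generic sketch, not taken from any particular engine:

```python
# Generic fixed-timestep game loop: the simulation advances in fixed increments no
# matter how fast frames are rendered. A sketch, not taken from any specific engine.

import time

TICK_RATE = 60                 # assumed simulation updates per second
DT = 1.0 / TICK_RATE

def run(duration_s=0.25):
    start = last = time.perf_counter()
    accumulator = 0.0
    ticks = 0
    while time.perf_counter() - start < duration_s:
        now = time.perf_counter()
        accumulator += now - last
        last = now
        # Inputs are only consumed here, once per simulation tick, however early
        # the hardware actually delivered them.
        while accumulator >= DT:
            ticks += 1         # update(DT) would read inputs and advance the world
            accumulator -= DT
        # render() would draw the latest state as often as the display allows
    return ticks

print(run(), "simulation ticks in 0.25 s")   # ~15 ticks at an assumed 60 Hz rate
```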

Also considering consoles are currently not reasonable but subpar performance that should not happen at all in the 21st century (and yes, i was playing at 60hz 1600p monitor since the 90s. It was a clunky CRT but it was still better than what consoles do now). This is not reasonable, and wont be reasonable if you make it into streaming.
Ok, so console streaming couldn't happen because consoles are sub-par, and to stream a similar quality experience to what consoles now are getting, you'd have to make it stream the top end PC scene, and that's just impossible.
Right. I'm seeing what's going on here. You're here to have a bash at consoles and game making companies, not actually discuss the feasibility of streaming a console level experience. Cool. I guess that means we can leave it here. Funnily enough, whilst you don't seem to accept consoles, a great many millions do. And hell, I understand. I can't game on consoles because they're too shit. But a lot of the market can, so if you're going to say console streaming can't be done because it won't be good enough for the console market, and your argument is that the console market is full of plebs who accept poor quality so you'd better instead keep it at the PC level, where you originally weren't targeting it... Yeah, you're just having a whole different discussion now.
 

pookie101

New member
Jul 5, 2015
1,162
0
0
I tend to agree, though I think it will take longer than that. But it is the future of gaming.

Personally, I can see a generation of consoles that are downloadable content only, with no discs, etc.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Joccaren said:
Why do you drive a car? You still have to move your feet to move, pressing the accelerator and brakes, and clutch. Obviously its pointless as you're still moving your legs to get there...
Or, and hear me out here, the fact that you have to do vastly less leg work makes it worth it.
Same with streaming. Its not magically not worth it because you have to calculate 10 megaflops of data, whilst the server does 100 terraflops. Its the fact that you're not doing those 100 terraflops that makes the difference, not the fact that you still have to do 10 megaflops.
But your suggestion of doing calculations both locally and on the server does not mean less work. In fact you end up doing more, because you are calculating both locally and on the server instead of only locally.

And 10 megaflops wouldn't even be enough for the video codec to decode a Netflix stream. But you still need to do those 100 teraflops if you want to use the same technique as modern multiplayer games: the client does ALL of the calculations and only checks in with the server as a consistency/anticheat measure. It's why a single PC can host 128-player games easily; the server does very little because the client does most of the work.


And yet, its perfectly serviceable for the majority of consumers. And you're still only proving my point. 30,000 kbps is orders of magnitude less than 600 Mbs. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - we're talking about a pretty minor difference in quality too, so its a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Bluray and get a much higher quality experience?
If you can answer that question, well, you have answered your own question.
No, DVD compression is not perfectly serviceable. It's a case of "we simply have nothing better". You could probably make that argument about Netflix next, and I would call every such consumer blind. Oh, wait, you did.

Honestly, only the highest end "MUST RUN AT 300FPS BETTER PLAY AT 480P SO THAT ITS HIGHER FPS" junkies espouse that view. The gaming community pretty much unanimously is fine with frame buffers under normal circumstances.
Additionally, no, if you press it after only 2 frames you're not fucked. Perhaps you missed the part where locally, your device records which frame the button was pressed at, and feeds that through to the server, which then treats its calculations as if that was when you had pressed the button, and gives you your output based on it. Yes, it'd require some intelligent programming to get that done right, and intelligent art design. I've said as much. Doesn't mean it isn't something that's possible to do.
Is that why we invented not one but two techniques coming out last/this year to remove those buffers? No, the gaming community is not fine with frame buffers. They never were.

If you press a button at frame 2, you're still going to wait until frame 10 to see it take any effect, because of latency. So it's either going to be very unresponsive controls, or every button press will end up in rubber-banding. Good luck selling that to gamers.

Incorrect. The server calculates where it expects you to be. Your client calculates where you are, and sends it to the server. The server also calculates where it expects everyone else to be, and relays that information to your client to display. This is a core part of anti-cheat mechanisms; If what your client says you are doing is not what the server says it expects you to be doing, it flags it as likely cheating. The server still has to predict what every player is going to do though, funnily enough.
No. The server only has to check what is possible. If you are being shot and the server calculates that the bullet hit you, then you not losing health is not possible, so the server sends the client a message to lose health, and the client adjusts health if its own calculations were wrong. (Or, as is now popular in games, it doesn't, and cheaters run free because the server does not check anything. See: GTA 5.)

I never said we'd hit the limit of grahpics improvements. That'll go on until the end of time. What I said is we'd hit that limit where you'd need a ton of extra processing power, to deliver very minimal extra details. Compare today's PC flagship titles, with all the highest settings on, and then just turn realistic hair physics off. How much better does it run? A shitton. How much do you notice the worse hair? Eeeeeh, not a ton. Same goes for the more intensive forms of Ambient Occlusion and Anti Aliasing, compared to the more simplistic ones. Yeah, there are differences, that can be noticed, but for a normal player in a normal gameplay session, its so small a difference that most won't pay the extra thousand dollars required to achieve it all. And this will only be exacerbated in the future, as each tiny increase in graphics quality demands huge amounts of extra resources. Its diminishing returns at its finest, and allows streamers to stream a lower quality version of the graphics settings, that look very similar to the higher quality versions, for much less processor power.
You're still wrong. Like I said, look at Snowdrop or Avalanche. They made huge improvements in the looks despite no big leaps in graphical processing. Oh, and hair is GORGEOUS in The Witcher 3. Human hair not so much, but the animal fur - you just want to pet it! The performance thing is mostly because the devs fucked up with 64x tessellation; once they patched that out, it didn't tank performance as much. Also, a bit off topic, but have you seen what Nvidia did with grass simulation? I can't wait to see that in a game.

Ambient Occlusion does not take much processing power, and I usually turn it off because it's annoying.

Anti-aliasing has basically three types. "True" anti-aliasing is done by scaling up the game's render resolution and then downscaling. It takes as much processing power as running the game at that higher resolution, but it gives the best (true) anti-aliasing results. Great if you have excess power (there's a small downscaling sketch after the three types below).

The render-object anti-aliasing (most popular - MSAA) is less resource-intensive, but depends a lot on how the game is designed. If you design the game engine and/or world in a way that it cannot put objects through anti-aliasing like that, it's going to look worse than no anti-aliasing. It's very much hit or miss with the devs, and the main reason why last-gen consoles had no actual AA.

Blurring anti-aliasing. This is your FXAA and the like: very light on resources, but it also offers no benefit to the user. It takes the final image and blurs what it thinks is aliasing. In a word - it makes it worse.

Anti-aliasing hasn't changed much since the 90s; it's always been a render-intensive process that many people wanted but couldn't always have. Though now dynamic shadows are taking over as the most GPU-killing feature.
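For the curious, the "true" anti-aliasing described above really is just render-big-then-average-down. A tiny sketch of the downscaling half, using numpy purely for illustration (real supersampling happens on the GPU, not like this):

import numpy as np

def downscale_2x(image):
    # Average every 2x2 block of an (H, W, 3) image rendered at 2x resolution.
    h, w, c = image.shape
    return image.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

# A 4x4 "render" downscaled to the 2x2 output the player actually sees.
hi_res = np.random.rand(4, 4, 3)
print(downscale_2x(hi_res).shape)  # (2, 2, 3)

The averaging is what smooths jagged edges, and it's also why the cost scales with the higher render resolution rather than the output resolution.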

Diminishing returns are there, but nowhere near what you claim. As far as streamers go, Twitch quality is atrocious. It's nowhere even close to what the game actually looks like, and anyone who claims it is needs their eyes checked. The artifacting alone is a dead giveaway.

Also not completely true. Funnily enough, games run at a particular framerate - non-graphical, but calculation-wise. Modern programming design is intelligent in that if the game runs FASTER than that rate, it slows itself down. That way you don't have different experiences on machines of different speeds, where on a slow one you have 10 seconds to dodge a bullet, whilst on a fast one that bullet has been through you and killed you in 0.01 seconds.
No. What you're talking about is tying the game's clock to CPU speed. That isn't done anymore, and it isn't related to how games are rendered. In fact, a game's internal clock is independent of framerate in any well-designed game (only a few in the past few years made this mistake; tying physics to framerate is a more common one). Game calculations don't wait for the next frame to process player input. This is what makes the difference for input even in low-FPS gameplay.
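The decoupling described here is usually done with a fixed-timestep loop: input and simulation advance in fixed steps no matter how fast or slowly frames are actually rendered. A bare-bones sketch, illustrative only, with the engine callbacks left as parameters:

import time

TIMESTEP = 1.0 / 60.0                 # fixed simulation step, ~16.7 ms

def game_loop(poll_input, simulate, render, running):
    previous = time.perf_counter()
    accumulator = 0.0
    while running():
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        inputs = poll_input()         # inputs are read every pass, not once per rendered frame
        while accumulator >= TIMESTEP:
            simulate(inputs, TIMESTEP)   # the world advances in fixed steps
            accumulator -= TIMESTEP

        render()                      # rendering runs as fast (or as slowly) as the GPU allows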

Now, I don't know what speed every game runs itself at, but it's going to be one of:
60 FPS
90 FPS
100 FPS
120 FPS
These periods are the only times your client actually calculates inputs. Sure, it'll 'receive' them before then, but it won't actually do anything with them, or even note that it's received them. It'll just go "Eh" until its calculation frame comes up.
Best case, in games designed to run as fast as possible because they are twitch-reflex focused, you have at least 8.3ms (one frame at 120 FPS) of response time for the game just to calculate what is going on. In games not as focused on twitch reflexes, you have more.
Completely false - see above.

This is at best somewhat related to how server calculation works, as servers do their calculations at a set interval, called the tick rate. This is how often the server updates the client with the new information it calculated in multiplayer games. Guess what: whenever it's below 100 times a second, people complain about input lag and unresponsive servers.
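The arithmetic behind that complaint is simple enough: the gap between server updates is just 1000 ms divided by the tick rate, and that gap is a floor on how stale the client's view of the world can get. For example:

# Tick rate -> worst-case gap between server updates (illustrative rates only).
for tick_rate in (20, 30, 60, 100, 128):
    print(f"{tick_rate:>3} Hz tick -> up to {1000 / tick_rate:.1f} ms between updates")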

OK, so console streaming couldn't happen because consoles are sub-par, and to stream a similar-quality experience to what consoles are getting now, you'd have to stream the top-end PC scene, and that's just impossible.
Right. I'm seeing what's going on here. You're here to have a bash at consoles and game making companies, not actually discuss the feasibility of streaming a console level experience. Cool. I guess that means we can leave it here. Funnily enough, whilst you don't seem to accept consoles, a great many millions do. And hell, I understand. I can't game on consoles because they're too shit. But a lot of the market can, so if you're going to say console streaming can't be done because it won't be good enough for the console market, and your argument is that the console market is full of plebs who accept poor quality so you'd better instead keep it at the PC level, where you originally weren't targeting it... Yeah, you're just having a whole different discussion now.
No, console streaming couldn't happen for a multitude of reasons, but that does not mean your suggestion of a "similar" experience is something that should be acceptable in the first place. Oh, and I'm sorry, since when is a basic minimum framerate the top-end PC scene? Apparently five-year-old GPUs are top end now!

Streaming feasibility is already known - it's zero - so why not take the time to also bash some consoles while we're at it :D

Wait, so your basic argument is that "we can't make it work properly, but since those people are willing to accept shit, it's good enough, ship it"? Are you pulling a Warner Bros here?
 

Mutant1988

New member
Sep 9, 2013
672
0
0
All technical obstacles aside, we should oppose this for one very simple reason: We, the customers, lose control.

Streaming is not sustainable. And with streaming, our game investments can be taken away from us at a moment's notice (with no way to circumvent it, since we have no access to the game data) and/or be rendered completely inoperable for any number of reasons, simply because of the server-reliant design.

Yves Guillemot and others like him are enemies of all customers.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Li Mu said:
I find it bizarre that the US lags so far behind other first world nations when it comes to internet. Your telecom companies sound evil. I work in Moscow (although I'm British) and I pay $6 a month for internet with no limitations. I can usually DL a game of between 6 and 10GB in around 15 minutes. Of course, Moscow is a capital city and not representative of the rest of the country. But if Russia can have fast, uncapped internet, why can't the US?
One word....'GREEEEEEEEED'

Hell, my parents live in an area of the UK that is pretty rural and they still get uncapped internet. The speeds are erratic since their house seems to be connected to the internet via a long piece of string, but still, it sounds better than what you guys get. :-(
Evil's a bit reductive, but then again, I'm not sure they're not Captain Planet villains.

The thing is, in many areas, there is only one high-speed provider, or possibly two. My only options are Comcast or satellite-based DSL, and in my building the latter's ruled out anyway. Comcast could technically be considered to be violating antitrust laws, but got a free pass by the feds when they came into the area. But they will do everything in their power to keep competition out. Which probably does include some sort of Captain Planet schemes.

When you're a for-profit business (with shareholders to please, no less) and a monopoly, it becomes more profitable to maintain that monopoly than provide service. I mean, where else are your customers going to go?
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Joccaren said:
And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point. 30,000 kbps is orders of magnitude less than 600 Mbps. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix with its terrible compression algorithms when they could just buy the Blu-ray and get a much higher quality experience?
And that's why Blu-ray is already dying while streaming services are on the rise. And the kicker is, I'm an early adopter of Blu-ray, and I still use Netflix. Why? I don't need that quality for everything. From what I can tell, I'm far from alone.

I imagine telling people how superior PC is isn't going to change their minds now any more than it has in the last three decades or so, to boot.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Strazdas said:
You are underestimating the sensitivity, actually. A 60 tick rate in BF4 was felt very disastrously by everyone playing at that rate, for example. Really, you're bringing 30 fps arguments in here? You've already lost then; 30 fps should not exist at all.
It's rather misleading to say that people hated the 60Hz tick rate in BF4. What actually happened is that they increased the tick rate to 60, and the increased physics calculations on water maps were too much and bogged down the servers. This is a problem caused by the tick rate being too high, not too low. No one had a problem with the latency inherent in a 60Hz tick rate.
http://bf4central.com/2015/09/bf4-60hz-server-tick-rate/

Strazdas said:
you will still need the same amount of rendering power to render the frame, so a server that can serve 5 people will likely cost 5 times as much.
So what? If you're setting up a data center it is no more expensive to have 200 big servers that support 5 players each than to have 1000 thin servers that support a single player.

Strazdas said:
Worth noting, Crackdown 3 is coming out soon and it uses Microsoft's cloud computing for its physics engine. The demo they showed had them using not one, not two, but eleven servers to power a single game experience. Basically, Microsoft runs $5,000 worth of hardware for a single player's instance. They may optimize before release, of course, but this is the kind of cloud computing we're talking about here - the expensive kind.
You mean this:

That guy is playing with vastly overpowered weapons on a powerful server to demonstrate the possibilities of cloud gaming. If he causes massive damage to the destructible environment, the server can handle the complicated physics calculations that an XBOne simply could not do on its own. However, that is not normal gameplay. Normal gameplay requires far fewer calculations and can be done on the cloud in a cost-effective manner.

The point is that this massive destruction is quite possible via cloud gaming, because a datacenter supporting 1000 players only needs a little extra processing capacity to accommodate it, as long as players don't all do it at once. It doesn't, by any stretch of the imagination, mean that you need to spend $5000 per player just to stream a game they could play locally on their console.
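The capacity argument here is basically statistical multiplexing: if heavy destruction moments are rare and mostly uncorrelated, the datacenter only needs to cover the likely peak of simultaneous heavy moments, not 1000 times the worst case. A rough back-of-the-envelope simulation (the 2% chance per player and the player count are made-up assumptions, not figures from the demo):

import random

PLAYERS = 1000
HEAVY_CHANCE = 0.02          # assumed chance a player triggers heavy destruction in a given second
TRIALS = 10_000

peaks = sorted(
    sum(1 for _ in range(PLAYERS) if random.random() < HEAVY_CHANCE)
    for _ in range(TRIALS)
)
print("average simultaneous heavy moments:", sum(peaks) / TRIALS)   # around 20
print("99th percentile:", peaks[int(TRIALS * 0.99)])                # around 31, nowhere near 1000

So provisioning for roughly the 99th percentile of simultaneous heavy moments covers almost every second of play, which is why the per-player cost can stay far below the eleven-servers-per-player figure from the demo.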