Ubisoft CEO: Streaming Will Replace Consoles

Joccaren

Elite Member
Mar 29, 2011
2,601
3
43
Strazdas said:
But your suggestion of doing calculations both locally and on the server does not make it do less work. In fact you end up doing more, because you are calculating both locally and on the server instead of only locally now.

And 10 megaflops wouldn't even be enough for the video codec to decode a Netflix stream. But you still need to do those 100 teraflops if you want to utilize the same technique used in modern multiplayer games. The client does ALL of the calculations and only checks in with the server as a consistency/anti-cheat measure. It's why a single PC can host 128-player games easily; the server does very little because the client does most of the work.
I can't tell if you're being deliberately obtuse or...

I provided an example of a present-day workaround for this whole "Speed of Light" problem. I did NOT say that the TV you stream to should have a dedicated graphics card, run the game itself and check back with the streaming server. I said it should do minimal calculations, given a game designed for it and the information it already has, and relay those small calculations between the server and the player.

As for 128-player games... The reason they're so rare is that the servers start to run into issues under that much strain, thanks to the whole n^2 problem of the number of players and the number of updates you have to send out. It's often around 650Mb/s sent out for a 128-player game, before you even get into sending out bullet trajectories and such. A 64-player match is only 84Mb/s for the server, which is why it's much more feasible, and the normal console-sized matches of 24 players are only 4Mb/s or so. This is, of course, unless we use very simplistic calculations and send very inaccurate information, and then the client tries to predict what happened based off that information, but that leads to its own rubber-banding issues and poor connectivity. Suffice to say, there's a reason 128-player games these days are non-existent on consoles, and really only exist in Battlefield on PC, at a lot of cost to EA.
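(To put rough numbers on where that scaling comes from, here's a back-of-the-envelope sketch. The packet size and tick rate are made-up illustration values, not figures from any real game, so the absolute numbers won't line up with the ones above - only the roughly n^2 shape will.)

```
# Rough sketch of why a naive server's outbound traffic grows roughly with
# the square of the player count: each tick, every one of n clients gets a
# state update for the other n - 1 players. Packet size and tick rate are
# illustrative assumptions only.

def outbound_mbps(players, bytes_per_update=60, tick_rate=30):
    """Approximate server upload in megabits per second."""
    updates_per_tick = players * (players - 1)   # n clients x (n - 1) states each
    bits_per_second = updates_per_tick * bytes_per_update * 8 * tick_rate
    return bits_per_second / 1_000_000

for n in (24, 64, 128):
    print(f"{n:3d} players: ~{outbound_mbps(n):.1f} Mb/s")   # grows roughly with n^2
```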

No, DVD compression is not perfectly serviceable. It's a case of "we simply have nothing better". You could probably make that argument with Netflix next, and I would call every such consumer blind. Oh, wait, you did.
Here's the thing. You would call every such consumer blind. That just speaks of arrogance. No, the thing doesn't need to appeal to you. That doesn't mean it doesn't appeal to anyone. To think you are so important that you alone determine what is acceptable and what isn't... Uhh... No. Sorry. The wider market has spoken. Movies are perfectly serviceable, as is Netflix.

Is that why we invented not one but two techniques coming out last/this year to remove those buffers? No, the gaming community is not fine with frame buffers. They never were.
Tell me, how many non-hardcore extremists do you expect to buy these new cards with no graphics buffer purely for that decrease in input lag? What's that, none? Most people don't even notice?
Well, I never. Graphics card companies constantly try to push the highest end; it's what they sell their tech on. The average consumer? Doesn't really care. They don't own a Titan to milk those last handful of FPS out of a title. They have their cheap mid-range graphics card because occasional dips below 60 are fine.

If you press a button at frame 2, you're still going to wait for frame 10 to see it take any effect, because of latency. So it's either going to be very unresponsive controls or every button press will end up in rubber-banding. Good luck selling that to gamers.
Or you could link this up with the rest of what I was saying, in that it would require a game design push too. The animations begin very similar; by the time it's received the input and sent out the next batch of frames, the animation is at the point where it would split off. Say you could block, or get hit, in Batman. It's always him raising his arm to a block position either way. Then, once you've received the input, he either finishes the animation and blocks the incoming blow, turning to face his opponent [which would take another 10 frames, and hey presto, that buffer hasn't been noticed], or you pressed it too late and the animation finishes with him missing his block, getting hit, and staggering, which would probably take 20 frames or so. Hey presto, frame buffer not noticed.
It'd take a lot of game design attention, engineering attention, and software attention. Congratulations. That's what I've always been saying.
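(A very rough sketch of that branch-after-wind-up idea, with every frame count and name made up purely for illustration:)

```
# Hypothetical sketch of hiding round-trip latency behind a shared wind-up:
# both outcomes start from the same frames, and the branch is only resolved
# once the input has had time to travel to the server and back. Frame counts
# are made up for illustration.

ROUND_TRIP_FRAMES = 8    # assumed latency budget, in frames
WINDUP_FRAMES = 10       # shared "raise arm" animation, identical either way

def resolve_branch(input_frame, attack_frame):
    """Pick which animation finishes the move once the input (or lack of it) is known."""
    if input_frame is None:
        return "stagger"                      # never pressed block: finish with the hit reaction
    # The branch only needs deciding by the end of the wind-up, so an input
    # is "in time" if it leaves enough frames for the round trip.
    in_time = (input_frame - attack_frame) <= WINDUP_FRAMES - ROUND_TRIP_FRAMES
    return "block" if in_time else "stagger"

print(resolve_branch(input_frame=3, attack_frame=1))   # pressed quickly -> "block"
print(resolve_branch(input_frame=9, attack_frame=1))   # pressed too late -> "stagger"
```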

No. The server only has to check what is possible. If you are being shot and the server calculates that the bullet hit you, then you not losing health is not possible, therefore the server sends the client a message to lose health and the client adjusts health if its own calculations were wrong. (Or, as is now popular in games, it doesn't, and cheaters run free because the server does not check anything. See: GTA 5.)
Ok, riddle me this. How does the server check what is possible? Does it draw some tarot cards and read them?
No. It gets your velocity and location, calculates where you should end up, and makes sure you don't end up too far away from that. It calculates where that bullet should land, checks where you say it lands, and if the two aren't that far off, you're fine. If they are, you get hit anyway. Either way, it doesn't magically divine what is possible and what isn't. It calculates it. By predicting where you should be. My god, it's almost like what I've been saying!

You're still wrong. Like I said, look at Snowdrop or Avalanche. They made huge improvements in the looks despite no big leaps in graphical processing. Oh, and hair is GORGEOUS in The Witcher 3. Human hair not so much, but the animal fur, you just want to pet it! The performance thing is mostly because the devs fucked up with 64x tessellation; once they patched that out it didn't tank as much. Also, a bit off topic, but have you seen what Nvidia did with grass simulation? I can't wait to see that in a game.
Uuuhhh...
Snowdrop and Avalanche: they've put some work into making some nice animations, but otherwise the graphical quality is pretty similar to Crysis, released almost a decade ago. Yeah, I'm not seeing these HUGE leaps in graphical quality you're talking about. Meanwhile, compare Crysis to the best-looking games of 1998. HUGE difference. Imagine another 20 years from now, when that graphics difference margin has shrunk as much not once, but twice. The two will be nearly indistinguishable. Sure, some people will notice and go "Sweet, that looks nice"; the majority won't. And of those that do, the majority won't care.
As for hair in TW3... Yeah, it can look pretty nice, but it isn't some must-have, oh-my-god-it's-awesome thing. Yeah, it's cool. The game looks pretty similar without it anyway, for most people, in most circumstances. And that's what matters.

Ambient Occlusion does not take much processing power, and I usually turn it off because it's annoying.
Haha. Haha. Doesn't take much processing power. Compared to Ray Tracing, maybe, but yeah, it does, and not even 5 years ago it could cripple a card's performance in games. As you note, you turn it off because it's annoying. You don't notice the small improvements it makes to the game's shadows and how much more natural they look. And that's my point. For something that brought cards to their knees not long ago, we are now a ton more powerful, AND we don't even notice the difference without it. Hell, thank you for proving my point.

Anti-aliasing has basically 3 types. "True" anti-aliasing, which is done by scaling up the game's render resolution and then downscaling. This takes as much processing power as running the game at the higher resolution, but it gives the best (true) anti-aliasing results. This is great if you have excess power.

The render-object anti-aliasing (most popular: MSAA) is less resource-intensive but depends a lot on how the game is designed. If you design the game engine and/or world in a way that objects can't be run through anti-aliasing like that, it's going to look worse than no anti-aliasing. It's very much hit or miss with the devs, and the main reason why last-gen consoles had no actual AA.

Blurring anti-aliasing. This is your FXAA and the like: very light on resources, but it also offers no benefit to the user. It takes the final image and blurs whatever it thinks is aliasing. In a word, it makes things worse.

Anti-aliasing hasn't changed much since the 90s; it's always been a render-intensive process that many people wanted but couldn't always have. Though now dynamic shadows are taking over as the most GPU-killing feature.
Eeeeh, I've never had problems with AA, to be completely honest. I'm also one of those people who's only rarely noticed the difference with or without it. Most people don't at all. That said, I've always run near-top-of-the-line graphics cards, so it's kind of expected they'd run AA properly. Maybe it also helps that I've always run my screens at the highest resolution possible for the day [ATM 4K], so if any jaggies were going to show up, I'd barely notice them anyway.
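(For reference, the "true" anti-aliasing described in the quote boils down to something like this toy sketch: render at k times the resolution and average back down. render_scene here is just a stand-in, and the k^2 cost is the point.)

```
# Toy sketch of supersampling anti-aliasing ("render higher, then downscale"):
# render at k x the target resolution, then average each k x k block into one
# output pixel. render_scene is a stand-in for whatever produces the frame.
import numpy as np

def supersample(render_scene, width, height, k=2):
    hi = render_scene(width * k, height * k)      # k^2 times the pixel work
    hi = hi.reshape(height, k, width, k, 3)
    return hi.mean(axis=(1, 3))                   # box filter each k x k block

fake_render = lambda w, h: np.random.rand(h, w, 3)   # placeholder "renderer"
frame = supersample(fake_render, 1920, 1080, k=2)
print(frame.shape)   # (1080, 1920, 3)
```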

Diminishing returns are there, but nowhere near what you claim. As far as streamers go, Twitch quality is atrocious. It's nowhere even close to what the game actually looks like, and anyone who claims it is needs their eyes checked. The artifacting alone is a dead giveaway.
Good thing we suggested using a Blu-ray bitrate instead of Netflix then, isn't it?

No. What you're talking about is tying the game's clock to CPU speed. This isn't done anymore and isn't related to how games are rendered. In fact, a game's internal clock is independent of framerate in any well-designed game (only a few in the past few years made this mistake; tying physics to framerate is a more common one). A game's calculations do not wait for the next frame to register player input. This is what makes the difference when it comes to input, even at low-FPS gameplay.

Completely false - see above.

This is at best somewhat related to how server calculation works, as servers do their calculations at a set interval, called the tick rate. This is how often the server updates the client with the new info it calculated in multiplayer games. Guess what: whenever it's below 100 times a second, people complain about input lag and unresponsive servers.
You're basically just repeating what I said. I called it "a particular framerate - non-graphical", and you've labelled it tick rate. NO game runs as fast as the CPU can run, because that would just end up with wildly differing gaming experiences: if you calculate a bullet moving forward 3000 times per second rather than 100, it's going to be 30 times faster than it should be, regardless of whether you only render it 60 times a second or not.
The point still stands; there is a limit to how quickly your input will be processed. At 120 ticks per second, that's still 8.3ms. And as said, that's to address unresponsive servers in twitch-based games. In non-twitch-based games - which are more popular than I'm guessing you'd think - lower tick rates survive and are quite serviceable. And they are both client- and server-side, funnily enough.
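(A minimal sketch of what that non-graphical tick looks like, assuming nothing about any particular engine: the simulation advances in fixed steps regardless of how fast frames are drawn, and an input waits at most one tick interval before it is processed.)

```
# Minimal fixed-timestep loop sketch: the simulation ticks at its own rate,
# independent of how many frames get rendered, so a bullet covers the same
# distance per real second at 30 or 300 FPS. Inputs are consumed on tick
# boundaries, so the worst-case wait is one tick interval.
import time

TICK_RATE = 120               # simulation updates per second
TICK_DT = 1.0 / TICK_RATE     # ~8.3 ms between updates

def game_loop(simulate, render, poll_input, run_seconds=1.0):
    accumulator = 0.0
    previous = time.perf_counter()
    deadline = previous + run_seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= TICK_DT:          # catch up in fixed steps
            simulate(poll_input(), TICK_DT)
            accumulator -= TICK_DT
        render()   # drawn as often as possible; doesn't change simulation speed

# trivial stand-ins just to show the loop running for a split second
game_loop(simulate=lambda inp, dt: None, render=lambda: None,
          poll_input=lambda: None, run_seconds=0.05)
print(f"worst-case input wait: {TICK_DT * 1000:.1f} ms")   # 8.3 ms at 120 Hz
```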

No, console streaming couldn't happen for a multitude of reasons, but that does not mean that your suggestion of a "similar" experience is something that should be acceptable in the first place. Oh, and I'm sorry, since when is a basic minimum framerate the top-end PC scene? Apparently 5-year-old GPUs are top end now!
And who defines what is acceptable? You?
Hahaha. No. People decide what is acceptable for themselves, and 'similar' quality to consoles ATM seems to be quite acceptable, considering their popularity.
And we're not talking basic minimum framerate here. We're talking high framerate, high resolution, best graphics settings - because consoles just aren't acceptable - and everything else that comes with high-end PC territory. To stream to consoles, you replicate consoles, and that's 30FPS, at lower resolutions, with some graphics reductions. Wow. Suddenly it seems more feasible. Funny that.

Streaming feasibility is already known, it's zero, so why not take the time to also bash some consoles while we're at it :D

Wait, so your basic argument is "we can't make it work properly, but since those people are willing to accept shit it's good enough, ship it"? Are you pulling a WarnerBros here?
Streaming feasibility in 20 years isn't known, and apparently there are those in the industry who believe it could be done. I mean, I could make the same argument: Ray Tracing capability is known; none. That's for now. It doesn't mean we won't improve things enough in 20 years that it becomes feasible. Hell, 20 years ago the prevailing logic was that dynamic shadows were infeasible. It's funny how 20 years of progress changes things.

No, WarnerBros is "these people won't accept shit, but we can trick them before they purchase it, so we'll sell them shit before they know it". This is simply catering to what people want - kind of like consoles themselves, or CoD. I view both as pretty shit. But the market as a whole loves them. Funny that, how people can have tastes that aren't my own.
 

WeepingAngels

New member
May 18, 2013
1,722
0
0
Madmatty said:
I prefer to own my own games so I'll pass on these new consoles. I'll just stick to Steam if you ask me
Joke post?

I prefer to own my games so I'll just continue to long term rent from Steam??
 

Madmatty

New member
Apr 5, 2016
110
0
0
WeepingAngels said:
Madmatty said:
I prefer to own my own games so I'll pass on these new consoles. I'll just stick to Steam if you ask me
Joke post?

I prefer to own my games so I'll just continue to long term rent from Steam??
With Steam I can play downloaded games offline, so that's why I prefer Steam or GOG. Steam is like the PS Store for PC.
 

Amir Kondori

New member
Apr 11, 2013
932
0
0
WeepingAngels said:
Madmatty said:
I prefer to own my own games so I'll pass on these new consoles. I'll just stick to Steam if you ask me
Joke post?

I prefer to own my games so I'll just continue to long term rent from Steam??
GOG.com, which has grown by huge amounts, allows you to really own the games. You download the installer, zero DRM, no login or online activation required. I make them my first stop for games.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
Something Amyss said:
Not everyone has/wants Netflix. Not that there's anything wrong with Netflix overall, but there's a number of technical issues (compatibilities, servers, the operational status of your computer, etc.) that can add up to problems. My main thing is that while this is a great system and all, what Netflix giveth, Netflix can taketh away... but if you have it, you have it. I'm an early Blu-ray adopter too. It plays all DVDs, regardless of quality, as long as they're readable, and all it needs is what I've got. So, in case of crash, virus, bad video card, speaker problem, Verizon cables, net service screw-up, downed telephone pole, or any number of other computer or internet-based problems short of a power outage, I can watch what I've got at my leisure. My computer can play movies, and hell, it can do a lot of things, but it isn't my go-to device for it.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
FalloutJack said:
Not everyone has/wants Netflix.
So what? Does that make the trend any less real? Will your personal preferences for viewing somehow undo this? Of what consequence is it that not everyone wants or has Netflix if that's the way the world is trending? There are people who still love Betamax and Laserdisc. Meanwhile, you have shows forgoing physical releases.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
Something Amyss said:
FalloutJack said:
Not everyone has/wants Netflix.
So what? Does that make the trend any less real? Will your personal preferences for viewing somehow undo this? Of what consequence is it that not everyone wants or has Netflix if that's the way the world is trending? There are people who still love Betamax and Laserdisc. Meanwhile, you have shows forgoing physical releases.
Well, in short answer, yes. I'm not just me. The phrase "You're never the only one" comes to mind. I'm sure there's a bunch of people who have your opinion too. Not all, though. That's the reality. The 'trend' is that people will pursue whatever they like, and businesses will try to make money off of it. There is money in streaming, so they pursue it. There is money in not streaming, so that is also pursued. The problem is that you are thinking in an extreme, much like the Ubisoft guy, in that something will just absolutely consume all other media. And yet, there is still significant money in books, television, retro-gaming, consoles in general, etc. Streaming has found a lovely nest-egg of profit and will continue to do so for some time, no more and no less.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Bad Jim said:
It's rather misleading to say that people hated the 60Hz tick rate in BF4. What actually happened is that they increased the tick rate to 60 and the increased physics calculations on water maps were too much and bogged down the servers. This is a problem caused by the tick rate being too high, not too low. No-one had a problem with the latency inherent in a 60Hz tick rate.
No. The tick rate was too low at launch, which caused complaints. When the tick rate was raised, the complaints stopped.

So what? If you're setting up a data center it is no more expensive to have 200 big servers that support 5 players each than to have 1000 thin servers that support a single player.
And the developers are just going to set up those datacenters in every town for free?

That guy is playing with vastly overpowered weapons on a powerful server to demonstrate the possibilities of cloud gaming. If he causes massive damage to the destructible environment, the server can handle the complicated physics calculations that an XBOne simply could not do on its own. However, that is not normal gameplay. Normal gameplay requires far fewer calculations, and can be done in the cloud in a cost-effective manner.
The thing is, what we see in the video should be normal gameplay, as in actual destruction. Yes, the weapons are overpowered, but you still have to calculate all the impacts even with lower-powered weapons. This video, if anything, shows that it cannot be done in a cost-effective manner.

The point is that this massive destruction is quite possible via cloud gaming, because a datacenter supporting 1000 players only needs a little extra processing capacity to accommodate it, as long as players don't all do it at once. It doesn't, by any stretch of the imagination, mean that you need to spend $5000 per player just to stream a game they could play locally on their console.
If everyone else would please not play their games for an hour, I want to play mine. Yes, that's the only situation where you don't need hardware-per-player in the servers.


Joccaren said:
I provided an example of a present-day workaround for this whole "Speed of Light" problem. I did NOT say that the TV you stream to should have a dedicated graphics card, run the game itself and check back with the streaming server. I said it should do minimal calculations, given a game designed for it and the information it already has, and relay those small calculations between the server and the player.
No you didn't. You provided an example of a workaround current multiplayer games use to hide latency when all calculations are done locally and only confirmed with the server later. This workaround will not work with streaming, as I have repeatedly explained, because there is no local calculation to do it with. If we have local calculation big enough to do it, then there is no point in streaming the game to begin with, since you are already rendering everything locally anyway. There is no magical "does local calculation but needs no power to do it" middle ground. You either do local calculation or you don't, and if you don't, it's going to be horrible gameplay. There is no "minimal calculations". This is literally logically impossible.

As for 128-player games... The reason they're so rare is that the servers start to run into issues under that much strain, thanks to the whole n^2 problem of the number of players and the number of updates you have to send out. It's often around 650Mb/s sent out for a 128-player game, before you even get into sending out bullet trajectories and such. A 64-player match is only 84Mb/s for the server, which is why it's much more feasible, and the normal console-sized matches of 24 players are only 4Mb/s or so. This is, of course, unless we use very simplistic calculations and send very inaccurate information, and then the client tries to predict what happened based off that information, but that leads to its own rubber-banding issues and poor connectivity. Suffice to say, there's a reason 128-player games these days are non-existent on consoles, and really only exist in Battlefield on PC, at a lot of cost to EA.
No, the main reason is game balance. When there are that many players, games often turn into chaos and no tactics get used. A few exceptions like Planetside 2 exist, of course.

There is no n^2 problem if the server is done correctly. The server only has to serve every player and receive data from them. It does not have to deal with information between players, and in fact such information isn't even needed if the server does its job correctly. Also, nobody sends bullet trajectories in multiplayer. Heck, most games don't even have bullet trajectories and use hitscan for aiming.

Tell me, how many non-hardcore extremists do you expect to buy these new cards with no graphics buffer purely for that decrease in input lag? What's that, none? Most people don't even notice?
Well, I never. Graphics card companies constantly try to push the highest end; it's what they sell their tech on. The average consumer? Doesn't really care. They don't own a Titan to milk those last handful of FPS out of a title. They have their cheap mid-range graphics card because occasional dips below 60 are fine.
Most gamers, I'd expect. Given its popularity so far, I seem to be right. Oh, and are we back to the "I think most people are blind idiots" argument? Because being a blind idiot is never an excuse.

Oh, and by the way, it's not graphics cards that do this, it's monitors (though yes, graphics cards have to support them).

It'd take a lot of game design attention, engineering attention, and software attention. Congratulations. That's what I've always been saying.
Intentionally designing bad game mechanics to compensate for a bad game delivery system is hardly something worth pushing for.

Ok, riddle me this. How does the server check what is possible? Does it draw some tarot cards and read them?
No. It gets your velocity and location, calculates where you should end up, and makes sure you don't end up too far away from that. It calculates where that bullet should land, checks where you say it lands, and if the two aren't that far off, you're fine. If they are, you get hit anyway. Either way, it doesn't magically divine what is possible and what isn't. It calculates it. By predicting where you should be. My god, it's almost like what I've been saying!
The server checks what's possible because it has that information hardcoded. For example, you are driving a car in multiplayer. The max speed of the car is 230 km/h. The server receives the position from the client and can calculate the speed (don't trust the speed the client reports; that can be faked easily). If the car is going below 230 km/h it's fine; above, the player is cheating. A certain allowance should be made for unstable user connections, of course, but you get my point.

No prediction needed for reality checks.
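(A minimal sketch of that reality check, using the 230 km/h example above; the tolerance value and names are illustrative assumptions only.)

```
# Minimal sketch of the server-side reality check described above: the server
# derives the car's speed from two reported positions and flags anything over
# the hard cap. The tolerance for jittery connections is an assumption.
import math

MAX_SPEED_KMH = 230.0
TOLERANCE = 1.10            # assumed 10% allowance for unstable connections

def is_plausible(prev_pos, new_pos, dt_seconds):
    """prev_pos/new_pos are (x, y) in metres; dt_seconds since last update."""
    dist_m = math.dist(prev_pos, new_pos)
    speed_kmh = (dist_m / dt_seconds) * 3.6
    return speed_kmh <= MAX_SPEED_KMH * TOLERANCE

print(is_plausible((0, 0), (6, 0), 0.1))   # 216 km/h -> True
print(is_plausible((0, 0), (9, 0), 0.1))   # 324 km/h -> False, likely cheating
```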

Uuuhhh...
Snowdrop and Avalanche: they've put some work into making some nice animations, but otherwise the graphical quality is pretty similar to Crysis, released almost a decade ago. Yeah, I'm not seeing these HUGE leaps in graphical quality you're talking about. Meanwhile, compare Crysis to the best-looking games of 1998. HUGE difference. Imagine another 20 years from now, when that graphics difference margin has shrunk as much not once, but twice. The two will be nearly indistinguishable. Sure, some people will notice and go "Sweet, that looks nice"; the majority won't. And of those that do, the majority won't care.
As for hair in TW3... Yeah, it can look pretty nice, but it isn't some must-have, oh-my-god-it's-awesome thing. Yeah, it's cool. The game looks pretty similar without it anyway, for most people, in most circumstances. And that's what matters.
Absolutely ridiculous. Snowdrop's graphical results are way beyond what Crysis could offer, and Avalanche's open spaces are nothing we had ever seen in gaming until this point. I suggest you replay Crysis, because you seem to have forgotten what it looks like, even though it was revolutionary when it came out. There are huge leaps; they just aren't in the polygon counts that seem to be your sole focus for some reason.

Also, you only need to look at people's reaction to Battlefront to disprove the "people won't notice" argument.

Haha. Haha. Doesn't take much processing power. Compared to Ray Tracing, maybe, but yeah, it does, and not even 5 years ago it could cripple a card's performance in games. As you note, you turn it off because it's annoying. You don't notice the small improvements it makes to the game's shadows and how much more natural they look. And that's my point. For something that brought cards to their knees not long ago, we are now a ton more powerful, AND we don't even notice the difference without it. Hell, thank you for proving my point.
That's funny, I remember playing games with AO in 2006 on an Athlon XP and a 7300GT. So much for card crippling. I turn it off because it makes things look worse to me. This is one of those quirks I have where I turn off most of the post-processing crap because I hate how blurry it makes everything. I turn it off because I DO notice the difference.


Eeeeh, I've never had problems with AA, to be completely honest. I'm also one of those people who's only rarely noticed the difference with or without it. Most people don't at all. That said, I've always run near-top-of-the-line graphics cards, so it's kind of expected they'd run AA properly. Maybe it also helps that I've always run my screens at the highest resolution possible for the day [ATM 4K], so if any jaggies were going to show up, I'd barely notice them anyway.
At higher resolutions AA becomes less important because the pixels themselves are smaller (unless you increase the screen size accordingly, but that's doubtful and rarely happens). I often played without AA because I tend to have mid-range cards rather than top of the line (currently a 760) and AA likes to eat GPU power. I also turn off FXAA and all its variants because they actually make things look worse. Now, I do notice the difference AA makes; it's just sometimes a compromise for the framerate, which I find more important.

Good thing we suggested using a Blu-ray bitrate instead of Netflix then, isn't it?
You know, if Netflix/YouTube/Twitch streamed in Blu-ray quality I'd be OK with it. It's compressed, but there are no big artefacts and the compression is rather unaggressive. But then something better would probably come around and I'd want that; it's just that we've never had anything better for movies.


The point still stands; there is a limit to how quickly your input will be processed. At 120 ticks per second, that's still 8.3ms. And as said, that's to address unresponsive servers in twitch-based games. In non-twitch-based games - which are more popular than I'm guessing you'd think - lower tick rates survive and are quite serviceable. And they are both client- and server-side, funnily enough.
I think the big difference here is that you think only twitch games require responsive controls. In reality almost every game does.


And who defines what is acceptable? You?
Hahaha. No. People decide what is acceptable for themselves, and 'similar' quality to consoles ATM seems to be quite acceptable, considering their popularity.
And we're not talking basic minimum framerate here. We're talking high framerate, high resolution, best graphics settings - because consoles just aren't acceptable - and everything else that comes with high-end PC territory. To stream to consoles, you replicate consoles, and that's 30FPS, at lower resolutions, with some graphics reductions. Wow. Suddenly it seems more feasible. Funny that.
The technology of the time and the general consensus define what is acceptable. Clearly the consensus thinks 30 fps is not acceptable, and we do have the technology to play at higher framerates.

As far as resolution goes, 1080p has been the standard for most since the '90s; many expect things to improve, not downgrade.

No, 60 is not a "high framerate". It's an acceptable framerate. And 1080p is not high resolution, it's the standard resolution nowadays. I never talked about graphical settings other than to refute your point about consoles having the equivalent of high settings, so I'm not sure why you bring them up here. Clearly, being weaker hardware, they will have lower settings.

Streaming feasibility in 20 years isn't known, and apparently there are those in the industry who believe it could be done. I mean, I could make the same argument: Ray Tracing capability is known; none. That's for now. It doesn't mean we won't improve things enough in 20 years that it becomes feasible. Hell, 20 years ago the prevailing logic was that dynamic shadows were infeasible. It's funny how 20 years of progress changes things.
The difference is that even if nothing changes, current trends will make ray tracing feasible in 20 years, whereas streaming being feasible requires the invention of FTL communication.
 

DBLT4P

New member
Jul 23, 2011
136
0
0
No, please, no. I already don't have access to 100% of a game's content because the internet where I live is too slow for online play. I live in upstate NY and internet speeds are terrible here; download speeds at all 3 houses I have lived in over the past 2 years have regularly been measured in kilobits, and there are no other options for providers. Not to mention the input lag of streaming combined with current-gen TVs (it's getting worse, not better) is already terrible for games where reaction times matter.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Strazdas said:
No. The tick rate was too low on launch which caused complaints. When tick rate was raised complaints stopped.
Sorry, my first Google search missed that. But it was originally 10Hz, which is very low (actually the tick rate was 30, but the server only sent you 10 packets/sec, which could still add up to 100ms on your latency). I never claimed that an otherwise good game couldn't be ruined by latency, just that a cloud gaming service could keep latency low enough for the game to be playable.

When Onlive started, you got your first game super cheap, so I got Just Cause 2 for ?2. And it was playable. I conquered half of Panau.

Strazdas said:
And the developers are just going to set up those datacenters in every town for free?
With Onlive, you just paid for the game and you could play it as much as you wanted for no additional cost. The average cost of running the servers was obviously included in the price. But while you may spend more on each game, you don't have to buy a console or PC.

Onlive also offered a Netflix style arrangement where you paid something like $10 a month and could play whatever you wanted from a catalogue of games.

Strazdas said:
The thing is, what we see in the video should be normal gameplay.
It would be nice, but he is using more computational power than a console can handle, probably more than any affordable PC could handle. The only way of doing this at all is via the cloud.

Strazdas said:
If everyone would not play their games for an hour please, i want to play mine. Yes, thats the only situation where you dont need hardware-per-player in the servers.
If you have, for example, a limited supply of rockets, you might need about 5 seconds of physics calculations before everything settles down, then spend a minute or more running around, shooting and not doing anything that needs serious physics. Occasional mass destruction is better than no mass destruction.
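(A back-of-the-envelope sketch of that sharing argument; every number here is an assumption for illustration, not a measurement.)

```
# Back-of-the-envelope sketch of the oversubscription argument: if heavy
# physics is only needed in short bursts, one physics-capable server can be
# time-shared across many players on average. Illustrative numbers only.

burst_seconds = 5          # heavy physics per destruction event
quiet_seconds = 60         # typical time between events for one player
duty_cycle = burst_seconds / (burst_seconds + quiet_seconds)

players_per_server_avg = 1 / duty_cycle
print(f"duty cycle: {duty_cycle:.1%}")                                  # ~7.7%
print(f"players one server can cover on average: ~{players_per_server_avg:.0f}")
# The catch raised elsewhere in the thread: averages don't help if many
# players trigger destruction at the same moment, so real capacity planning
# has to budget for peaks, not just the mean.
```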
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Bad Jim said:
When Onlive started, you got your first game super cheap, so I got Just Cause 2 for ?2. And it was playable. I conquered half of Panau.
Perhaps you were lucky to be close enough.

With Onlive, you just paid for the game and you could play it as much as you wanted for no additional cost. The average cost of running the servers was obviously included in the price. But while you may spend more on each game, you don't have to buy a console or PC.

Onlive also offered a Netflix style arrangement where you paid something like $10 a month and could play whatever you wanted from a catalogue of games.
Yes, and OnLive went bankrupt. Though that's probably because it had a very poor game selection more than anything else.

Thing is, PC costs do not disappear with streaming, since most people use a computer for more than just games. And you still have to make sure the average person pays for the console hardware (by raising the game costs appropriately), because MS/Sony still need to produce the hardware on their end.

OnLive's Netflix-style arrangement was very expensive from what I remember, but yes, they did offer it. Worth noting, though, that it did not give you access to the entire catalogue, only to games whose developers agreed to it. So imagine buying a Netflix subscription, getting access to only half of the movies there, and having to buy the others separately anyway. Though of course this was not OnLive's fault but that of the publishers who refused to try it.

It would be nice, but he is using more computational power than a console can handle, probably more than any affordable PC could handle. The only way of doing this at all is via the cloud.
Thing is, he's not doing anything special though. The destructible environment he showed in that video - we had a very similar setup in RF: Guerrilla in 2009, running on an Xbox 360. So that makes me think the Crackdown devs were simply shit at optimization rather than it actually requiring massive amounts of power. Heck, we have plenty of physics-based games nowadays that offer various kinds of destruction, and none of them require the cloud. Two years ago Nvidia showed that a 680, now a card three generations behind, could calculate the physical bending of objects (for example grass) for millions of objects in real time and wouldn't even break a sweat. We have the hardware to do it; we just need the developers to implement it.

If you have, for example, a limited supply of rockets, you might need about 5 seconds of physics calculations before everything settles down, then spend a minute or more running around, shooting and not doing anything that needs serious physics. Occasional mass destruction is better than no mass destruction.
If you look closely at the video, there is a UI element that shows how many cloud resources they are using in real time. Notice how, when they are just running around, those resources do not fall all that much? The servers are still engaged with the player.

What you are describing is also very hard to implement in real gaming, because you are basically asking the client to establish new connections with the server and allocate server resources on demand every few seconds, and that's simply not how server infrastructures work. If a server allocates, say, 8GB of RAM to a user, only that user can use it and no one else, regardless of whether he actually utilizes it. What you are asking is that these resources be dropped from the player and then given back every few seconds? We haven't perfected a way to do that on a local machine for multiple applications, let alone in cloud computing.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
FalloutJack said:
Well, in short answer, yes. I'm not just me.
You are, however, part of a shrinking demographic that is reaching an increasing level of irrelevance to the consumer market, to the point where physical media releases are no longer a guarantee. The thing I just said, only in fewer words.

Again, so what? Now instead of you, there is a small market that doesn't justify the cost of mass production of physical media (that I already kind of addressed anyway, so...). That still doesn't address or change my point. It's kind of like that thing you said earlier:

FalloutJack said:
Hmmm, well let's examine this for a moment. Do consoles and console gaming still make money? Yes? Well then, so much for that theory. Really, there's nothing more to it. You can blurb out anything you like, but the bottom line is the bottom line.
The bottom line is, indeed, the bottom line. And the bottom's been falling out on physical media for a while. BD never reached saturation like DVD did, and even the companies that staked their wellbeing on them are not predicting clear skies. I mean, this is exactly what you were saying about consoles, except you're not on the side of the desired product. Digital video services happen to be what the movie companies want as well, but would be untenable if not so greatly desired by the consumer public. Meanwhile, they'd still be serving up Laserdiscs if that's what the public demanded.

Oh, and no. Streaming has not found a nest egg. It's still a rapidly growing form of video consumption.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
Something Amyss said:
Once again, you're speaking in terms of extremes. You're doing exactly what my post there said you can't do, trying to convince me that the bottom line is anything else than what it is, trying to redefine it according to those who want the money say, not those who have the money. History says you're wrong. You talk of laserdiscs and betamax, apples to oranges. Console VS streaming is as different as books and television. They're not in direct conflict with each other's existence. It's not the same as directly-competing products like Blu-Ray VS HD DVD. They require different setups, different resources, and not just in little ways like format. Essentially, you're trying to replace the hammer with a screwdriver, as opposed to the hammer with a better hammer with a claw for pulling nails out. That is the problem defined and that is what Mr. Ubisoft doesn't understand.