ForumSafari said:
Oh I don't know, it really depends on the game. As you say, it all comes down to how time-sensitive the calculation is and how complex it is in the first place, but there's room for use in slower games and in the non-game services that most consoles run in the background anyway.
Deploying dedicated servers on a cloud platform, particularly when they can piggyback on Azure's infrastructure, is a good way to load balance.
I agree with that in principle, but in practice the problem is that mainstream console games overwhelmingly trend towards "real-time" rendering.
More accurately, those kinds of games are calculated on a per-frame basis (rendering, by definition, is per-frame unless it's a totally static image), so to find a game's "timeliness", the window it has to crunch everything, you take your frames-per-second target and invert it: the frame budget is 1/FPS seconds.
So, if your game runs at, say, 60FPS, all crunching must be done within 1/60-second intervals (about 0.0167 seconds, or 16.7 milliseconds).
Meaning your round-trip latency to the Cloud MUST be less than 16.7ms for per-frame work to provide any real benefit.
(For 25FPS, the time is a much more generous 40ms, but there's a big push for higher frame rates in gaming.)
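To put rough numbers on it, here's a quick back-of-the-envelope sketch in Python. The 30ms round-trip figure is just an assumption for illustration (a decent home connection to a nearby datacentre), not a measured value:

# Rough frame-budget math: budget_ms = 1000 / fps
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

cloud_rtt_ms = 30.0  # assumed round-trip latency to the cloud, in ms

for fps in (25, 30, 60, 120):
    budget = frame_budget_ms(fps)
    verdict = "fits" if cloud_rtt_ms < budget else "too slow"
    print(f"{fps:>3} FPS -> {budget:5.1f} ms per frame; {cloud_rtt_ms} ms round trip {verdict}")

Run that and you'll see the round trip only squeaks in at 25-30FPS; at 60FPS and above the frame has already been drawn and shipped before the Cloud can answer.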
Even in an optimal scenario where you free up local processing for rendering by "unloading" non-video work to the Cloud, it still won't amount to much of an increase in performance OR fidelity.
Because increasing graphical fidelity now requires INCREDIBLE increases in processing power to be noticeable.
Freeing up 20-40% of local processing resources is nowhere near enough.
Hell, TRIPLING the processing resources available wouldn't be enough today.
So the only benefits you're really left with are the ones you'd get from dedicated servers anyway.
Which is a nice perk, but a potentially dangerous one too, since the same Cloud-offloading feature can very easily be repurposed to make a game dependent on the Cloud, i.e. "always-online" required.