No, despite a few extremists, the vast majority still very much believe vaccines are real, and even know the basic concept of how they work. It's even taught in schools.
1. In some areas, you'd be surprised.
2. Obvious hyperbole.
3. Missing, ignoring, or avoiding the point.
But that is... pointless. We have local calculation, the suggestion is to move to server calculation and streaming, and the workaround is to do local calculation? Why do we even need streaming at all, then?
Why do you drive a car? You still have to move your feet to drive: pressing the accelerator, the brakes, and the clutch. Obviously it's pointless, since you're still moving your legs to get there...
Or, and hear me out here, the fact that you have to do vastly less leg work makes it worth it.
Same with streaming. It's not magically not worth it because you still have to calculate 10 megaflops' worth of work whilst the server does 100 teraflops. It's the fact that you're not doing those 100 teraflops that makes the difference, not the fact that you still have to do the 10 megaflops.
DVD compression is extremely old and bad, actually. I moved to Blu-rays a couple of years ago. These use much better compression algorithms and actually have a decent bitrate: 30,000 kbps on average, more than 4 times that of Netflix. Even then, that compression is something I'd love to be without. I'd also like it if they moved to H.265 more (not to be confused with H.264). Alas, we still can't run that properly, because even high-end CPUs have problems decoding a 1080p video due to the very calculation-intensive compression (which is what produces the better quality).
And yes, it's not ideal, but it's the only thing we've got, so it's that or nothing. We have better in video games - why downgrade?
And yet, it's perfectly serviceable for the majority of consumers. And you're still only proving my point: 30,000 kbps is 30 Mbps, well over an order of magnitude less than 600 Mbps. As for why downgrade? For upgrades in other areas. As established, console players prefer convenience over power and looks - and we're talking about a pretty minor difference in quality too, so it's a sacrifice they're quite likely willing to make. I mean hell, why do people watch Netflix, with its terrible compression, when they could just buy the Blu-ray and get a much higher quality experience?
If you can answer that question, well, you have answered your own question.
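(For a sense of scale, by the way, here's what those bitrates work out to per hour. The Netflix figure is an assumption backed out of the "more than 4 times" comparison above, not an official number, and the 600 Mbps entry is just the figure from this thread.)

```python
# Rough data-per-hour at the bitrates discussed above. The Netflix number
# is assumed from the "4 times" comparison, not an official figure.
bitrates_kbps = {
    "DVD (max)": 9_800,
    "Netflix 1080p (assumed)": 7_000,
    "Blu-ray (average)": 30_000,
    "600 Mbps raw game stream": 600_000,
}

for name, kbps in bitrates_kbps.items():
    # bits/s -> bytes/s -> bytes per hour -> gigabytes per hour
    gb_per_hour = kbps * 1000 / 8 * 3600 / 1e9
    print(f"{name}: {gb_per_hour:.1f} GB per hour")
```

Blu-ray at 30,000 kbps comes out to about 13.5 GB an hour; Netflix to about 3 GB; the 600 Mbps stream to about 270 GB.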
And if you press a button after only 2 frames, you're fucked. Thanks, but no thanks. Frame buffers need to be REDUCED, not increased. The current 3-frame buffers already create enough input lag that people will pick screen tearing over it.
Honestly, only the highest-end "MUST RUN AT 300 FPS, BETTER PLAY AT 480P SO IT'S HIGHER FPS" junkies espouse that view. The gaming community is pretty much unanimously fine with frame buffers under normal circumstances.
Additionally, no, if you press it after only 2 frames you're not fucked. Perhaps you missed the part where, locally, your device records which frame the button was pressed on, and feeds that through to the server, which then treats its calculations as if that was when you had pressed the button, and gives you your output based on it. Yes, it'd require some intelligent programming to get that done right, and intelligent art design - I've said as much. That doesn't mean it isn't something that's possible to do.
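To be concrete, the idea looks something like this. This is a toy sketch of input timestamping with server-side rewind; the class, fields, and numbers are made up for illustration, not any particular engine's implementation:

```python
from collections import deque

HISTORY_FRAMES = 30  # how many past frames the server keeps (an assumption)

class LagCompensatingServer:
    """Toy sketch: rewind to the client's input frame, apply, re-simulate."""
    def __init__(self):
        self.frame = 0
        self.state = {"player_x": 0}                  # trivial stand-in for world state
        self.history = deque(maxlen=HISTORY_FRAMES)  # (frame, state) snapshots

    def tick(self):
        # Snapshot the world every frame so old inputs can be replayed against it.
        self.history.append((self.frame, dict(self.state)))
        self.frame += 1

    def on_client_input(self, input_frame, dx):
        # The client stamped its input with the frame it was pressed on.
        for frame, snapshot in self.history:
            if frame == input_frame:
                # Rewind: apply the input as if it happened back then...
                state = dict(snapshot)
                state["player_x"] += dx
                # ...then fast-forward the intervening frames (no-op here;
                # a real server would re-run its simulation step each frame).
                for _ in range(self.frame - input_frame):
                    pass
                self.state = state
                return True
        return False  # input arrived too late to compensate for

server = LagCompensatingServer()
for _ in range(10):
    server.tick()
server.on_client_input(input_frame=7, dx=5)  # input arrived 3 frames late
print(server.state)                           # {'player_x': 5}
```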
No, servers don't predict anything in current multiplayer games. Our local clients do, calculating it locally. Servers are just sanity checks plus new data from other players.
Incorrect. The server calculates where it expects you to be. Your client calculates where you are, and sends it to the server. The server also calculates where it expects everyone else to be, and relays that information to your client to display. This is a core part of anti-cheat mechanisms: if what your client says you are doing is not what the server expects you to be doing, it flags it as likely cheating. The server still has to predict what every player is going to do, though, funnily enough.
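The sanity-check part boils down to something like this. Again a toy sketch; the speed limit and tolerance values are invented for illustration:

```python
MAX_SPEED = 10.0  # units per tick the game rules allow (an assumption)
TOLERANCE = 1.5   # slack for latency and rounding before flagging (an assumption)

def server_sanity_check(last_pos, reported_pos, ticks_elapsed):
    """Compare where the client says it is against where the server
    expects it could plausibly be, given the movement rules."""
    max_travel = MAX_SPEED * ticks_elapsed * TOLERANCE
    travelled = abs(reported_pos - last_pos)
    if travelled > max_travel:
        return "flag: likely cheating"  # client outran the server's expectation
    return "ok"

print(server_sanity_check(last_pos=0.0, reported_pos=8.0, ticks_elapsed=1))    # ok
print(server_sanity_check(last_pos=0.0, reported_pos=120.0, ticks_elapsed=1))  # flag
```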
Also, you really are wrong about graphics improvements. Just look at what the Snowdrop engine did. Or Avalanche. We're nowhere even close to the point where we struggle to improve. Maybe we will hit that level in 20 years, but I doubt it, given that, assuming the same technological progression, it will take until 2056 to reach even current Pixar levels of graphics rendered in real time for enthusiast users. Yes, we will hit the limit some day, but it's nowhere near close enough to use as an argument yet.
I never said we'd hit the limit of graphics improvements. That'll go on until the end of time. What I said is we'd hit the limit where you need a ton of extra processing power to deliver very minimal extra detail. Compare today's PC flagship titles with all the highest settings on, and then just turn realistic hair physics off. How much better does it run? A shitton. How much do you notice the worse hair? Eeeeeh, not a ton. Same goes for the more intensive forms of ambient occlusion and anti-aliasing compared to the more simplistic ones. Yeah, there are differences that can be noticed, but for a normal player in a normal gameplay session, it's so small a difference that most won't pay the extra thousand dollars required to achieve it all. And this will only be exacerbated in the future, as each tiny increase in graphics quality demands huge amounts of extra resources. It's diminishing returns at its finest, and it allows streamers to stream a lower-quality version of the graphics settings that looks very similar to the higher-quality version, for much less processing power.
That's not stream gaming. You are going into semantics now, when in reality streaming gaming means you are streaming the entire game - like Netflix - with only the controls coming from the player's side. Your multiplayer gaming is not an example of streaming.
Even Netflix has a decoder at your end that needs to run in order to turn the stream into watchable media. You even admit it here: some processing - namely, for you, controls - needs to happen client-side. Yeah. Funnily enough, that's how it works. There is no 100% streaming. Streaming the vast majority of the intensive calculations, and making some minor calculations on your end to facilitate the streaming of those other calculations, is, however, 100% 'game streaming'. It's how streaming as a whole works. Your pureblood "everything is streamed" stream doesn't exist, as then you would be streaming literally terabytes of data for an hour's play session or movie. Funnily enough, that's not how it works. Stuff is always processed client-side. You can either continue trying to deny that, to make the problem look more impossible than it is, or you can accept that this is the way it's always been done, and that it can be done the same way in the future. You're not running complex graphics or physics calculations that take a ton of power; you're processing inputs and where in the grand scheme of things they were received, and then handling the incoming frames and when to display them.
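And the terabytes claim is easy to check. Here's the arithmetic for truly "everything streamed" - raw, uncompressed 1080p at 60 frames a second with 24-bit colour, zero client-side decoding:

```python
# Cost of streaming raw frames with no client-side decode whatsoever:
# uncompressed 1080p, 3 bytes per pixel (24-bit colour), 60 frames per second.
width, height, bytes_per_pixel, fps = 1920, 1080, 3, 60

bytes_per_second = width * height * bytes_per_pixel * fps
tb_per_hour = bytes_per_second * 3600 / 1e12
print(f"{tb_per_hour:.2f} TB per hour uncompressed")  # ~1.34 TB per hour
```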
Not true. It does not matter which framerate you are playing at; there is going to be a significant difference in the "feel" of control based on input lag. This is because things DO change mid-frame based on player input. Input does not get delivered only every 33.3 ms; it gets delivered when you pressed it. This makes a difference to the gameplay.
Also not completely true. Funnily enough, games run at a particular framerate - non-graphical, but calculation-wise. Modern programming design is intelligent in that, if it runs FASTER than that framerate, it slows itself down, so that you don't have different experiences on different-speed machines, where on a slow one you have 10 seconds to dodge a bullet, whilst on a fast one that bullet has been through you and killed you in 0.01 seconds.
Now, I don't know what speed every game runs itself at, but it's going to be some fixed number of calculation frames per second.
These tick periods are the only times your client actually calculates inputs. Sure, it'll 'receive' them before then, but it won't actually do anything with them, or even note that it's received them. It'll just go "eh" until its calculation frame comes up.
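In code terms, that's the classic fixed-timestep loop. A minimal sketch, assuming a 120-tick rate to match the 8.3 ms figure below (real games vary), with the hardware polling and simulation stubbed out:

```python
import time

TICK_RATE = 120            # calculation frames per second (assumed; varies by game)
TICK_DT = 1.0 / TICK_RATE  # ~8.3 ms per simulation tick

pending_inputs = []        # inputs received between ticks just sit in a queue

def poll_hardware():
    # Stand-in for reading the controller; a real client queries the OS/driver.
    return []

def simulate(inputs, dt):
    # Stand-in for the actual game calculation step.
    pass

def game_loop():           # not invoked here - it would run forever
    accumulator = 0.0
    previous = time.perf_counter()
    while True:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        pending_inputs.extend(poll_hardware())  # 'received', but not acted on
        # Inputs only *do* anything when a calculation frame comes up:
        while accumulator >= TICK_DT:
            simulate(pending_inputs, TICK_DT)
            pending_inputs.clear()
            accumulator -= TICK_DT
        time.sleep(0.001)  # running faster than the tick rate? slow down.
```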
Best case, in games designed to run as fast as possible because they're twitch-reflex focused, you have at least 8.3 ms of response time just for the game to calculate what is going on. In games not as focused on twitch reflexes, you have more.
Additionally, you were talking about screen response times, and how having a 1 ms panel reduces input lag compared to having a 5 ms one, which it just does not. As the previous poster pointed out, that figure is to do with the removal of the previous image, and generally your game is running at 30 FPS on a console - or, if we're being generous to you, 60 FPS - which means that even if the screen were refreshing every 1 ms, it would still display the exact same frame for 16-33 cycles.
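The arithmetic on that is straightforward:

```python
# How many 1 ms screen refresh cycles a single game frame persists for.
refresh_interval_ms = 1.0
for fps in (30, 60):
    frame_duration_ms = 1000 / fps
    cycles = frame_duration_ms / refresh_interval_ms
    print(f"{fps} FPS: each frame shown for ~{cycles:.0f} refresh cycles")
```

That's about 17 cycles at 60 FPS and 33 at 30 FPS, all showing the identical frame.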
Either way, a certain level of input lag is unavoidable, thanks to the fact that the game itself runs at a capped FPS. What this FPS is depends on the game, but again, especially in non-reaction-speed-focused games, it's often a reasonable one. Sometimes it'll be faster than what's displayed on screen; it doesn't always have to be, though.
Also, consider that what consoles currently deliver is not reasonable but subpar performance, which should not happen at all in the 21st century (and yes, I was playing on a 60 Hz, 1600p monitor back in the 90s; it was a clunky CRT, but it was still better than what consoles do now). This is not reasonable, and it won't become reasonable if you turn it into streaming.
OK, so console streaming couldn't happen because consoles are sub-par, and to stream a similar-quality experience to what consoles are now getting, you'd have to make it stream the top-end PC scene, and that's just impossible.
Right, I'm seeing what's going on here. You're here to have a bash at consoles and game-making companies, not to actually discuss the feasibility of streaming a console-level experience. Cool. I guess that means we can leave it here. Funnily enough, whilst you don't seem to accept consoles, a great many millions do. And hell, I understand - I can't game on consoles because they're too shit. But a lot of the market can. So if you're going to say console streaming can't be done because it won't be good enough for the console market, and your argument is that the console market is full of plebs who accept poor quality, so you'd better instead keep it at the PC level, where you originally weren't targeting it... yeah, you're just having a whole different discussion now.