Bad Jim said:
It's rather misleading to say that people hated the 60Hz tick rate in BF4. What actually happened is that they increased the tick rate to 60 and the increased physics calculations on water maps was too much and bogged down the servers. This is a problem caused by the tick rate being too high, not too low. No-one had a problem with the latency inherent in a 60Hz tick rate.
No. The tick rate was too low at launch, which caused complaints. When the tick rate was raised, the complaints stopped.
So what? If you're setting up a data center it is no more expensive to have 200 big servers that support 5 players each than to have 1000 thin servers that support a single player.
And the developers are just going to set up those datacenters in every town for free?
That guy is playing with vastly overpowered weapons on a powerful server to demonstrate the possibilities of cloud gaming. If he causes massive damage to the destructible environment, the server can handle the complicated physics calculations that an XBOne simply could not do on its own. However, that is not normal gameplay. Normal gameplay requires far fewer calculations, and can be done on the cloud in a cost-effective manner.
The thing is, what we see in the video should be normal gameplay, as in actual destruction. Yes, the weapons are overpowered, but it still takes processing power to calculate all the impacts even with lower-powered weapons. This video, if anything, shows that it cannot be done in a cost-effective manner.
The point is that this massive destruction is quite possible via cloud gaming, because a datacenter supporting 1000 players only needs a little extra processing capacity to accommodate it, as long as players don't all do it at once. It doesn't, by any stretch of the imagination, mean that you need to spend $5000 per player just to stream a game they could play locally on their console.
If everyone else would please not play their games for an hour, I want to play mine. Yes, that's the only situation where you don't need hardware per player in the servers.
Joccaren said:
I provided an example of a present-day workaround for this whole "Speed of Light" problem. I did NOT say that the TV you stream to should have a dedicated graphics card and run the game itself and check back to the streaming server. I said it should do minimal calculations, with a design of the game and the information existing, and relay its small information calculations between the server and the player.
No, you didn't. You provided an example of a workaround current multiplayer games use to hide latency when all calculations are done locally and only confirmed with the server later. That workaround will not work with streaming, as I have repeatedly explained, because there is no local calculation to do it with. If we have local hardware powerful enough to do it, then there is no point in streaming the game to begin with, since you are already rendering everything locally anyway. There is no magical "does local calculation but needs no power to do it" middle ground. You either do local calculation or you don't, and if you don't, the gameplay is going to be horrible. There is no such thing as "minimal calculations". It is literally, logically impossible.
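To spell out what that workaround actually is, here is a minimal sketch of client-side prediction (Python, all names invented, not taken from any real engine). Notice it only works because the client keeps a local copy of the game state and simulates against it; a pure streaming client has neither.

```python
# Minimal sketch of client-side prediction (illustrative only; all names invented).
class PredictedClient:
    def __init__(self, send_to_server):
        self.send_to_server = send_to_server  # network send, stubbed out below
        self.position = 0.0                   # local copy of the player's state
        self.pending = []                     # (sequence, move) not yet acknowledged

    def on_input(self, move, seq):
        self.position += move                 # applied instantly -> controls feel responsive
        self.pending.append((seq, move))
        self.send_to_server(seq, move)        # the server confirms later

    def on_server_update(self, authoritative_position, last_acked_seq):
        # The server is authoritative: snap to its state, then replay the
        # inputs it has not processed yet.
        self.position = authoritative_position
        self.pending = [(s, m) for (s, m) in self.pending if s > last_acked_seq]
        for _, move in self.pending:
            self.position += move

client = PredictedClient(send_to_server=lambda seq, move: None)  # network stubbed out
client.on_input(1.0, seq=1)
client.on_input(1.0, seq=2)
print(client.position)                        # 2.0 before any server reply arrives
client.on_server_update(0.9, last_acked_seq=1)
print(client.position)                        # 1.9 after reconciliation
```

The whole trick lives in the input being applied to local state immediately; take the local state away and there is nothing left to hide the latency with.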
As for 128 player games... The reason they're often so rare is because the servers start to run into issues under that much strain, thanks to the whole n^2 problem of the number of players and the number of updates you have to send out. It's often around 650Mb/s sent out for a 128 player game, before you even get into sending out bullet trajectories and such. A 64 player match is only 84Mb/s for the server, which is why it's much more feasible, and the normal console sized matches of 24 players are only 4Mb/s or so. This is, of course, unless we use very simplistic calculations, and very inaccurate information is sent, and then the client tries to predict based off that information what happened, but that leads to its own rubber banding issues and poor connectivity. Suffice to say, there's a reason 128 player games these days are non-existent on consoles, and really only exist in Battlefield for the PC, at a lot of cost to EA.
No, the main reason is game balance. When there are that many players, games often turn into chaos and no tactics get used. A few exceptions like Planetside 2 exist, of course.
There is no n^2 problem if the server is done correctly. The server only has to send updates to each player and receive data from them; it does not have to relay information between players, and in fact such information isn't even needed if the server does its job correctly. Also, nobody sends bullet trajectories in multiplayer. Heck, most games don't even have bullet trajectories and use a hitscan method for aiming.
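To put rough numbers on it (a back-of-envelope sketch; the tick rate, state size and relevance cap are assumed, not taken from any real game): the n^2 figure only shows up if the server naively broadcasts every player's state to every other player, while a server that filters by relevance keeps the per-player cost roughly flat.

```python
# Back-of-envelope server bandwidth (all figures assumed, purely illustrative).
TICK_RATE = 30        # state updates per second
STATE_BYTES = 40      # serialized state per player per update
RELEVANT_CAP = 16     # entities a player actually needs with interest management

def naive_broadcast_mbps(players):
    # every player receives every other player's state, every tick -> grows ~n^2
    return players * (players - 1) * STATE_BYTES * TICK_RATE * 8 / 1e6

def filtered_mbps(players):
    # each player only receives the entities near them -> grows ~n
    return players * min(players - 1, RELEVANT_CAP) * STATE_BYTES * TICK_RATE * 8 / 1e6

for n in (24, 64, 128):
    print(f"{n:>3} players: naive {naive_broadcast_mbps(n):6.1f} Mb/s, "
          f"filtered {filtered_mbps(n):5.1f} Mb/s")
```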
Tell me, how many non-hardcore extremists do you expect to buy these new cards with no graphics buffer purely for that decrease in input lag? What's that, none? Most people don't even notice?
Well, I never. Graphics card companies constantly try to push the highest end; it's what they sell their tech on. The average consumer? Doesn't really care. They don't own a Titan to milk the last handful of FPS out of a title. They have their cheap mid-range graphics card because occasional dips below 60 are fine.
Most gamers, I'd expect. Given its popularity so far, I seem to be right. Oh, and are we back to the "I think most people are blind idiots" argument? Because being a blind idiot is never an excuse.
Oh, and by the way, it's not graphics cards that do this, it's monitors (though yes, graphics cards have to support them).
It'd take a lot of game design attention, engineering attention, and software attention. Congratulations. That's what I've always been saying.
Intentionally designing bad game mechanics to compensate for a bad game delivery system is hardly something worth pushing for.
Ok, riddle me this. How does the server check what is possible? Does it draw some tarot cards and read them?
No. It gets your velocity and location, calculates where you should end up, and makes sure you don't end up too far away from that. It calculates where that bullet should land, checks where you say it lands, and if the two aren't that far off, you're fine. If they are, you get hit anyway. Either way, it doesn't magically divine what is possible and what isn't. It calculates it. By predicting where you should be. My god, it's almost like what I've been saying!
The server checks what's possible because it has that information hardcoded. For example, you are driving a car in multiplayer and the car's max speed is 230 km/h. The server receives the position from the client and can calculate the speed from it (don't trust the speed the client reports; that can be faked easily). If the car is going below 230 km/h it's fine; above that, the player is cheating. A certain allowance should be made for unstable user connections, of course, but you get my point.
No prediction needed for reality checks.
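A reality check like that is a few lines of code. Here is a minimal sketch (Python; the tolerance and example positions are made up for illustration):

```python
# Minimal sketch of a server-side reality check (numbers assumed for illustration).
MAX_SPEED_KMH = 230.0   # hardcoded game rule: the car's top speed
TOLERANCE = 1.10        # 10% slack for jittery connections and packet timing

def position_is_plausible(prev_pos_m, new_pos_m, dt_seconds):
    """Derive speed from the reported positions; never trust a client-reported speed."""
    dx = new_pos_m[0] - prev_pos_m[0]
    dy = new_pos_m[1] - prev_pos_m[1]
    speed_kmh = ((dx * dx + dy * dy) ** 0.5 / dt_seconds) * 3.6
    return speed_kmh <= MAX_SPEED_KMH * TOLERANCE

print(position_is_plausible((0, 0), (60, 0), 1.0))  # True: 216 km/h, within the rules
print(position_is_plausible((0, 0), (75, 0), 1.0))  # False: 270 km/h, flag as cheating
```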
Uuuhhh...
Snowdrop and Avalanche; they've put some work into making some nice animations, but otherwise graphical quality is pretty similar to Crysis, released almost a decade ago. Yeah, I'm not seeing these HUGE leaps in graphical quality you're talking about. Meanwhile, compare Crysis to the best-looking games of 1998. HUGE difference. Imagine another 20 years from now, when that graphics difference margin has shrunk as much not once, but twice. The two will be nearly indistinguishable. Sure, some people will notice and go "Sweet, that looks nice", but the majority won't. And of those that do, the majority won't care.
As for hair in TW3... Yeah, it can look pretty nice, but it isn't some must-have, oh-my-god-it's-awesome thing. Yeah, it's cool. The game looks pretty similar without it anyway, for most people, in most circumstances. And that's what matters.
Absolutely ridiculous. Snowdrop's graphical results are way beyond what Crysis could offer, and Avalanche's open spaces are unlike anything we had seen in gaming until this point. I suggest you replay Crysis, because you seem to have forgotten what it looks like, even though it was revolutionary when it came out. There are huge leaps; they just aren't in polygon counts, which seem to be your sole focus for some reason.
Also, you only need to look at people's reaction to Battlefront to disprove the "people won't notice" argument.
Haha. Haha. Doesn't take much processing power. Compared to Ray Tracing maybe, but yeah, it does, and not even 5 years ago it could cripple a card's performance in games. As you note, you turn it off, because it's annoying. You don't notice the tiny improvements it has on the shadows of the game, and how much more natural they look. And that's my point. For something that brought cards to their knees not long ago, we are now a ton more powerful, AND we don't even notice the difference without it. Hell, thank you for proving my point.
That's funny, I remember playing games with AO in 2006 on an Athlon XP and a 7300GT. So much for crippling cards. I turn it off because it makes things look worse to me. This is one of those quirks I have where I turn off most of the post-processing crap because I hate how blurry it makes everything. I turn it off because I DO notice the difference.
Eeeeh, I've never had problems with AA, to be completely honest. I'm also one of those people who has only rarely noticed the difference with or without it. Most people don't at all. That said, I've always run near-top-of-the-line graphics cards, so it's kind of expected they'd run the AA properly. Maybe it also helps that I've always run my screens at the highest resolution possible for the day [ATM 4K], so if any jaggies were going to show up, I'd barely notice them anyway.
At higher resolutions AA becomes less important because the pixels themselves are smaller (unless you increase the screen size proportionally, but that rarely happens). I often played without AA because I tend to have mid-range cards rather than top of the line (currently a 760), and AA likes to eat GPU power. I also turn off FXAA and all its variants because they actually make things look worse. I do notice the difference AA makes; it's just that sometimes it's a compromise for the framerate, which I find more important.
Good thing we suggested using a Blu-ray bitrate instead of Netflix then, isn't it?
You know, if Netflix/YouTube/Twitch streamed in Blu-ray quality I'd be OK with it. It's compressed, but there are no big artefacts and the compression is rather unaggressive. But then something better would probably come around and I'd want that; it's just that we've never had anything better for movies.
Point still stands; there is a limit to how quickly your input will be processed. At 120 ticks per second, that's 8.3ms still. And as said, that's to address nonresponsive servers in twitch-based games. In non-twitch-based games - which are more popular than I'm guessing you'd think - lower tick rates survive and are quite serviceable. And they are both client and server side, funnily enough.
I think the big difference here is that you think only twitch games require responsive controls. In reality almost every game does.
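For reference, the latency the tick rate itself adds is easy to pin down (simple arithmetic, nothing assumed beyond the tick rates); everything else, network, rendering and display, stacks on top of it:

```python
# Latency added by the tick rate alone: an input waits at most one tick interval
# for the next simulation step, and about half of one on average.
for tick_rate in (30, 60, 120):
    interval_ms = 1000.0 / tick_rate
    print(f"{tick_rate:>3} Hz: up to {interval_ms:5.1f} ms, ~{interval_ms / 2:4.1f} ms on average")
```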
And who defines what is acceptable? You?
Hahaha. No. People decide what is acceptable for themselves, and 'similar' quality to consoles ATM seems to be quite acceptable, considering their popularity.
And we're not talking basic minimum framerate here. We're talking high framerate, high resolution, best graphics settings - because consoles just aren't acceptable - and everything else that comes with the high end PC territory. To stream to consoles, you replicate consoles, and that's 30FPS, at lower resolutions, with some graphics reductions. Wow. Suddenly it seems more feasible. Funny that.
The technology of the time and the general consensus define what is acceptable. Clearly the consensus is that 30 fps is not acceptable, and we do have the technology to play at higher framerates.
As far as resolution goes, 1080p has long been the standard for most people; many expect things to improve, not downgrade.
No, 60 is not a "high framerate"; it's an acceptable framerate. And 1080p is not a high resolution; it's the standard resolution nowadays. I never talked about graphical settings other than to refute your point about consoles having the equivalent of high settings, so I'm not sure why you bring them up here. Clearly, being weaker hardware, they will run lower settings.
Streaming feasibility in 20 years isn't known, and apparently there are those in the industry who believe it could be done. I mean, I could make the same argument; Ray Tracing capability right now is known: none. That's for now. That doesn't mean we won't improve things enough in 20 years that it becomes feasible. Hell, 20 years ago the prevailing logic was that dynamic shadows were infeasible. It's funny how 20 years of progress changes things.
The difference is that even if nothing changes, current trends will make ray tracing feasible in 20 years, whereas making streaming feasible requires the invention of FTL communication.
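To put a number on the physical floor (a rough sketch; the distances are made up, and real routing, encoding and display overheads only add to it):

```python
# Light-speed lower bound on streaming round-trip time (distances assumed).
C_KM_PER_MS = 300.0    # speed of light in vacuum, km per millisecond
FIBRE_FACTOR = 0.67    # signals in optical fibre travel at roughly 2/3 of c

def min_round_trip_ms(distance_km):
    # the physical minimum; no amount of server hardware can reduce this
    return 2 * distance_km / (C_KM_PER_MS * FIBRE_FACTOR)

for km in (50, 500, 2000):
    print(f"data centre {km:>4} km away: at least {min_round_trip_ms(km):4.1f} ms round trip")
```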