I can't tell if you're being deliberately obtuse or...

Strazdas said: But your suggestion of doing calculations both locally and on the server does not make it do less work. In fact you end up doing more, because you are calculating both locally and on the server instead of only locally.

And 10 megaflops wouldn't even be enough for the video codec to decode a Netflix stream. But you still need to do those 100 teraflops if you want to use the same technique used in modern multiplayer games: the client does ALL of the calculations and only checks in with the server as a consistency/anti-cheat measure. It's why a single PC can host 128-player games easily; the server does very little because the client does most of the work.
I provided an example of a present-day workaround for this whole "Speed of Light" problem. I did NOT say that the TV you stream to should have a dedicated graphics card, run the game itself, and check back with the streaming server. I said it should do minimal calculations, given a game designed for that and the relevant information already on hand, and relay those small results between the server and the player.
As for 128-player games... The reason they're so rare is that the servers start to buckle under that much strain, thanks to the n^2 problem: the number of players times the number of updates you have to send out. It's often around 650Mb/s sent out for a 128-player game, before you even get into sending out bullet trajectories and such. A 64-player match is only 84Mb/s for the server, which is why it's much more feasible, and normal console-sized matches of 24 players are only 4Mb/s or so. That is, of course, unless you use very simplistic calculations and send very inaccurate information, with the client left to predict what happened from it, but that leads to its own rubber-banding and connectivity issues. Suffice to say, there's a reason 128-player games are non-existent on consoles these days and really only exist in Battlefield on PC, at considerable cost to EA.
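To put rough numbers on that n^2 growth, here's a back-of-the-envelope sketch in Python; the tick rate and per-player update size are assumptions for illustration, so the output won't match the Mb/s figures quoted above exactly, but the shape of the curve is the point.

```python
# Rough sketch of why server egress blows up with player count.
# The tick rate and per-player update size below are assumed values
# for illustration, not the figures quoted in the post.

TICK_RATE_HZ = 60             # server snapshots per second (assumed)
BYTES_PER_PLAYER_STATE = 40   # position, velocity, orientation, etc. (assumed)

def server_egress_mbps(players: int) -> float:
    """Each tick, every player must be told about every other player,
    so the data sent grows roughly with players * (players - 1)."""
    updates_per_tick = players * (players - 1)
    bytes_per_second = updates_per_tick * BYTES_PER_PLAYER_STATE * TICK_RATE_HZ
    return bytes_per_second * 8 / 1_000_000  # bytes -> megabits

for n in (24, 64, 128):
    print(f"{n:>3} players: ~{server_egress_mbps(n):6.1f} Mb/s upstream")
```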
Strazdas said: No, DVD compression is not perfectly serviceable. It's a case of "we simply have nothing better". You could probably make that argument with Netflix next, and I would call every such consumer blind. Oh, wait, you did.

Here's the thing. You would call every such consumer blind. That just speaks of arrogance. No, the thing doesn't need to appeal to you. That doesn't mean it doesn't appeal to anyone. To think you are so important that you alone determine what is acceptable and what isn't... Uhh... No. Sorry. The wider market has spoken. Movies are perfectly serviceable, as is Netflix.
Strazdas said: Is that why we invented not one but two techniques, coming out last year and this, to remove those buffers? No, the gaming community is not fine with frame buffers. They never were.

Tell me, how many non-hardcore extremists do you expect to buy these new cards with no graphics buffer purely for that decrease in input lag? What's that, none? Most people don't even notice?
Well, I'll never. Graphics card companies constantly try to push the highest end; it's what they sell their tech on. The average consumer? Doesn't really care. They don't own a Titan to milk those last few FPS out of a title. They have their cheap mid-range graphics card, because occasional dips below 60 are fine.
Strazdas said: If you press a button at frame 2 you're still going to wait for frame 10 to see it take any effect, because of latency. So it's either going to be very unresponsive controls, or every button press will end up in rubber-banding. Good luck selling that to gamers.

Or you could link this up with the rest of what I was saying, in that it would require a game design push too. The animations begin very similar; by the time the input has been received and the next batch of frames sent out, the animation is at the point where it would split off. Say you could block, or get hit, in Batman. Either way it starts with him raising his arm to a blocking position. Then, once the input has been received, he either finishes the animation and blocks the incoming blow, turning to face his opponent [which would take another 10 frames, and hey presto, that buffer hasn't been noticed], or you pressed it too late and the animation finishes with him missing his block, getting hit, and staggering, which would probably take 20 frames or so. Hey presto, frame buffer not noticed.
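Here's a minimal sketch of that branching-animation idea: start the shared wind-up the moment the button is pressed, and only commit to the "block" or "get hit" branch once the delayed result arrives. The frame counts and class names are made up for illustration, not taken from any engine.

```python
# Sketch of hiding input round-trip latency behind a branching animation.
# Frame counts and names are illustrative assumptions.

WINDUP_FRAMES = 10  # shared "raise arm" animation, long enough to cover the round trip

class BlockAnimation:
    def __init__(self):
        self.frame = 0
        self.branch = None  # decided later: "block" or "staggered"

    def on_button_press(self):
        # Start the shared wind-up immediately; no need to wait for the result.
        self.frame = 0

    def on_server_result(self, blocked: bool):
        # Arrives several frames later; picks how the animation resolves.
        self.branch = "block" if blocked else "staggered"

    def next_frame(self) -> str:
        self.frame += 1
        if self.frame <= WINDUP_FRAMES or self.branch is None:
            return f"windup frame {self.frame}"   # identical either way
        return f"{self.branch} frame {self.frame}"

anim = BlockAnimation()
anim.on_button_press()
for f in range(1, 16):
    if f == 8:                       # result arrives mid wind-up (round trip)
        anim.on_server_result(blocked=True)
    print(anim.next_frame())
```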
It'd take a lot of game design attention, engineering attention, and software attention. Congratulations. That's what I've always been saying.
Strazdas said: No. The server only has to check what is possible. If you are being shot and the server calculates that the bullet hit you, then you not losing health is not possible, therefore the server sends the client a message to lose health, and the client adjusts health if its own calculations were wrong. (Or, as is now popular in games, it doesn't, and cheaters run free because the server doesn't check anything. See: GTA 5.)

Ok, riddle me this. How does the server check what is possible? Does it draw some tarot cards and read them?
No. It gets your velocity and location, calculates where you should end up, and makes sure you don't end up too far from that. It calculates where that bullet should land, checks where you say it landed, and if the two aren't that far apart, you're fine. If they are, you get hit anyway. Either way, it doesn't magically divine what is possible and what isn't. It calculates it. By predicting where you should be. My god, it's almost like what I've been saying!
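A bare-bones sketch of that kind of server-side sanity check: predict from the last known state, and only accept the client's report if it's within tolerance. The physics and the tolerance value are deliberately simplified assumptions, not any particular engine's code.

```python
# Toy server-side sanity check: predict where the client should be from its
# last known state, and only accept the reported position if it's close enough.
# All numbers and names are illustrative assumptions.

from dataclasses import dataclass
import math

POSITION_TOLERANCE = 2.0   # metres of drift tolerated before overriding (assumed)

@dataclass
class PlayerState:
    x: float
    y: float
    vx: float
    vy: float

def predict(state: PlayerState, dt: float) -> PlayerState:
    """Server's own prediction: just integrate the last known velocity."""
    return PlayerState(state.x + state.vx * dt, state.y + state.vy * dt,
                       state.vx, state.vy)

def validate_position(last: PlayerState, reported: PlayerState, dt: float) -> PlayerState:
    """Accept the client's report if it's within tolerance of the prediction,
    otherwise snap the client back to what the server calculated."""
    expected = predict(last, dt)
    drift = math.hypot(reported.x - expected.x, reported.y - expected.y)
    return reported if drift <= POSITION_TOLERANCE else expected

last = PlayerState(0.0, 0.0, 5.0, 0.0)
print(validate_position(last, PlayerState(0.55, 0.0, 5.0, 0.0), dt=0.1))  # accepted as-is
print(validate_position(last, PlayerState(30.0, 0.0, 5.0, 0.0), dt=0.1))  # rejected, corrected
```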
Strazdas said: Uuuhhh... You're still wrong. Like I said, look at Snowdrop or Avalanche. They made huge improvements in looks despite no big leaps in graphical processing. Oh, and hair is GORGEOUS in The Witcher 3. Human hair not so much, but the animal fur, you just want to pet it! The performance thing is mostly because the devs fucked up with 64x tessellation; once they patched that out it didn't tank performance as much. Also, a bit off topic, but have you seen what Nvidia did with grass simulation? I can't wait to see that in a game.
Snowdrop and Avalanche? They've put some work into making some nice animations, but otherwise the graphical quality is pretty similar to Crysis, released almost a decade ago. Yeah, I'm not seeing these HUGE leaps in graphical quality you're talking about. Meanwhile, compare Crysis to the best-looking games of 1998. HUGE difference. Imagine another 20 years from now, when that margin has shrunk as much not once, but twice. The two will be nearly indistinguishable. Sure, some people will notice and go "Sweet, that looks nice"; the majority won't. And of those that do, the majority won't care.
As for the hair in TW3... Yeah, it can look pretty nice, but it isn't some must-have, oh-my-god-it's-awesome feature. Yeah, it's cool. The game looks pretty similar without it anyway, for most people, in most circumstances. And that's what matters.
Strazdas said: Ambient Occlusion does not take much processing power, and I usually turn it off because it's annoying.

Haha. Haha. Doesn't take much processing power? Compared to ray tracing, maybe, but yeah, it does, and not even five years ago it could cripple a card's performance in games. As you note, you turn it off because it's annoying. You don't notice the small improvements it makes to the game's shadows, and how much more natural they look. And that's my point. For something that brought cards to their knees not long ago, we are now a ton more powerful, AND we don't even notice the difference without it. Hell, thank you for proving my point.
Strazdas said: Anti-aliasing has basically three types. "True" anti-aliasing, which is done by rendering the game at a higher resolution and then downscaling. This takes as much processing power as running the game at that higher resolution, but it gives the best (true) anti-aliasing results. Great if you have excess power.

Per-object anti-aliasing (most popular: MSAA) is less resource-intensive, but depends a lot on how the game is designed. If the engine and/or world are built in a way that objects can't be run through anti-aliasing like that, it's going to look worse than no anti-aliasing at all. It's very much hit or miss with the devs, and the main reason last-gen consoles had no actual AA.

Blur-based anti-aliasing. This is your FXAA and the like: very light on resources, but it also offers no benefit to the user. It takes the final image and blurs what it thinks is aliasing. In a word, it makes the image worse.

Anti-aliasing hasn't changed much since the 90s; it's always been a render-intensive process that many people wanted but couldn't always have. Though now dynamic shadows are taking over as the most GPU-killing feature.

Eeeeh, I've never had problems with AA, to be completely honest. I'm also one of those people who only rarely notices the difference with or without it. Most people don't at all. That said, I've always run near-top-of-the-line graphics cards, so it's kind of expected they'd handle AA properly. Maybe it also helps that I've always run my screens at the highest resolution possible for the day [ATM 4K], so if any jaggies were going to show up, I'd barely notice them anyway.
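For what it's worth, here's a tiny sketch of the first ("true"/supersampling) approach described above: render at twice the resolution in each axis, then average every 2x2 block back down. It's a toy on a NumPy array rather than a real renderer, just to show the smoothing and why the cost is the same as running the game at the higher resolution.

```python
# Toy illustration of supersampling ("true" AA): render at 2x in each axis,
# then average 2x2 blocks down to the target resolution. The 4x pixel count
# is exactly why this costs as much as running the game at the higher res.
import numpy as np

def downsample_2x(high_res: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of the oversized render into one output pixel."""
    h, w = high_res.shape[0] // 2, high_res.shape[1] // 2
    return high_res.reshape(h, 2, w, 2).mean(axis=(1, 3))

# Fake "render" at 2x the 4x4 target: a hard diagonal edge (the jaggy case).
yy, xx = np.mgrid[0:8, 0:8]
high_res = (xx > yy).astype(float)   # 1.0 = lit, 0.0 = dark

print(downsample_2x(high_res))       # edge pixels land between 0 and 1, i.e. smoothed
```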
Strazdas said: Diminishing returns are there, but nowhere near what you claim. As far as streamers go, Twitch quality is atrocious. It's nowhere even close to what the game actually looks like, and anyone who claims it is needs their eyes checked. The artifacting alone is a dead giveaway.

Good thing we suggested using a Blu-ray bitrate instead of Netflix, then, isn't it?
Strazdas said: No. What you're talking about is tying the game's clock to CPU speed. This isn't done anymore and isn't related to how games are rendered. In fact, a game's internal clock is independent of framerate in any well-designed game (only a few in the past few years made that mistake; tying physics to framerate is a more common one). A game's calculations do not wait for the next frame to count the player's input. This is what makes the difference when it comes to input, even at low-FPS gameplay.

You're basically just repeating what I said. I called it "a particular framerate - non-graphical", and you've labelled it tick rate. NO game runs as fast as the CPU can run, because that would just end up with wildly differing gaming experiences: if you calculate a bullet moving forward 3000 times per second rather than 100, it's going to be 30 times faster than it should be, regardless of whether you only render it 60 times a second or not.
Completely false - see above.
Strazdas said: This is at best somewhat related to how server calculation works, as servers do it at a set interval, called the tick rate. This is how often the server updates the client with the new info it has calculated in multiplayer games. Guess what: whenever it's below 100 times a second, people complain about input lag and unresponsive servers.
Point still stands: there is a limit to how quickly your input will be calculated. At 120 ticks per second, that's still 8.3ms. And, as you said, that's to address unresponsive servers in twitch-based games. In non-twitch-based games - which are more popular than I'm guessing you'd think - lower tick rates survive and are quite serviceable. And tick rates exist both client- and server-side, funnily enough.
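A minimal sketch of the decoupling being argued over here: the simulation advances at a fixed tick rate while the renderer runs at whatever FPS it likes, so the bullet covers the same distance per real second either way, and input can never take effect faster than one tick interval (10ms at 100Hz, 8.3ms at 120Hz). The tick rate and bullet speed are assumptions for illustration.

```python
# Minimal fixed-timestep loop: the simulation ticks at a fixed rate regardless
# of how fast frames are rendered, so a bullet covers the same distance per
# simulated second whether you render at 60 FPS or 300 FPS. Rates are assumed.

TICK_RATE_HZ = 100
TICK_DT = 1.0 / TICK_RATE_HZ   # 10 ms: also the floor on input-to-effect latency
BULLET_SPEED = 900.0           # metres per second (assumed)

def run(num_frames: int, render_fps: float) -> float:
    bullet_x = 0.0
    accumulator = 0.0
    frame_dt = 1.0 / render_fps
    for _ in range(num_frames):        # one iteration per rendered frame
        accumulator += frame_dt        # real time produced by that frame
        while accumulator >= TICK_DT:  # consume it in fixed-size simulation ticks
            bullet_x += BULLET_SPEED * TICK_DT
            accumulator -= TICK_DT
    return bullet_x

print(run(60, render_fps=60))    # one simulated second at 60 FPS  -> ~900 m
print(run(300, render_fps=300))  # one simulated second at 300 FPS -> ~900 m
```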
Strazdas said: No, console streaming couldn't happen for a multitude of reasons, but that does not mean that your suggestion of a "similar" experience is something that should be acceptable in the first place. Oh, and I'm sorry, since when is a basic minimum framerate a top-end PC thing? Apparently five-year-old GPUs are top end now!

And who defines what is acceptable? You?
Hahaha. No. People decide what is acceptable for themselves, and 'similar' quality to consoles seems, at the moment, to be quite acceptable, considering their popularity.
And we're not talking basic minimum framerate here. We're talking high framerate, high resolution, best graphics settings - because consoles just aren't acceptable - and everything else that comes with high-end PC territory. To stream to consoles, you replicate consoles, and that's 30FPS, at lower resolutions, with some graphics reductions. Wow. Suddenly it seems more feasible. Funny, that.
Strazdas said: Streaming feasibility is already known; it's zero. So why not take the time to also bash some consoles while we're at it.

Streaming feasibility in 20 years isn't known, and apparently there are those in the industry who believe it could be done. I mean, I could make the same argument: ray tracing capability is known; none. That's for now. It doesn't mean we won't improve things enough over 20 years that it becomes feasible. Hell, 20 years ago the prevailing logic was that dynamic shadows were infeasible. It's funny how 20 years of progress changes things.
Strazdas said: Wait, so your basic argument is that "we can't make it work properly, but since those people are willing to accept shit, it's good enough, ship it"? Are you pulling a WarnerBros here?
No. WarnerBros is "these people won't accept shit, but we can trick them before they purchase it, so we'll sell them shit before they know it". This is simply catering to what people want - kind of like consoles themselves, or CoD. I view both as pretty shit. But the market as a whole loves them. Funny, that - how people can have tastes that aren't my own.