Next-gen AI and Next-gen BS

ForumSafari

New member
Sep 25, 2012
Atmos Duality said:
Not to rain on your parade, but it really won't be.
Apart from centralizing some multiplayer processes (which is a staple of dedicated/official servers already), it won't really improve much of anything.
Oh, I don't know; it really depends on the game. As you say, it all depends on how time-sensitive the calculation is and how complex it is in the first place, but there's room for use in slower games and in the non-game services that most consoles run in the background anyway.

Deploying dedicated servers on a cloud platform, particularly when they can piggyback on Azure's infrastructure, is a good way to load balance.
 

Sectan

Senior Member
Aug 7, 2011
If I remember correctly, didn't F.E.A.R. have really stupid AI? Like, so stupid that their programming was literally "RUN AT PLAYER. SHOOT." A developer came back and said the only reason the enemies flanked the player and seemed so smart was that the level design was aimed at forcing the AI into those routes and getting players to react in ways that helped this.
 

Atmos Duality

New member
Mar 3, 2010
ForumSafari said:
Oh, I don't know; it really depends on the game. As you say, it all depends on how time-sensitive the calculation is and how complex it is in the first place, but there's room for use in slower games and in the non-game services that most consoles run in the background anyway.

Deploying dedicated servers on a cloud platform, particularly when they can piggyback on Azure's infrastructure, is a good way to load balance.
I agree with that in principle, but in practice mainstream console games overwhelmingly trend towards "real-time" rendering.

More accurately, those kinds of video games are calculated on a per-frame basis (rendering, by definition, is per-frame unless it's a totally static image), meaning that to find the "timeliness" of a game, or the window it has to crunch everything, you take your frames-per-second and invert it to get seconds per frame.

So, if your game runs at, say, 60FPS, all crunching must be done within 1/60-second intervals (0.0167 seconds, or 16.7 milliseconds).
Meaning your latency to the Cloud MUST BE less than 16.7ms to provide any real benefit.

(For 25FPS, the time is a much more generous 40ms, but there's a big push for higher frame rates in gaming.)
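To put numbers on that frame-budget arithmetic, here's a trivial sketch (plain Python, nothing game-specific assumed):

```python
# Per-frame time budget is simply the inverse of the frame rate.
def frame_budget_ms(fps):
    """Return the time budget per frame, in milliseconds."""
    return 1000.0 / fps

for fps in (25, 30, 60):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")

# 25 FPS -> 40.0 ms per frame
# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
# Any cloud round trip that must land inside the same frame has to
# fit within that budget, on top of all the local work.
```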

Even in an optimal scenario where you free up more processing for video by "unloading" non-video processes to the Cloud, it still won't amount to much of an increase in performance OR fidelity, because increasing graphical fidelity now requires INCREDIBLE increases in processing power to be noticeable.

Freeing up 20-40% of local processing resources is nowhere near enough.
Hell, TRIPLING the processing resources available wouldn't be enough today.



So, the only benefits you're left with are those akin to dedicated servers.

Which is a nice perk, but a potentially dangerous one too, since the same Cloud-process-loading feature can very easily be modified to render a game dependent on the Cloud, making it "Always Online"-required.
 

Bad Jim

New member
Nov 1, 2010
Atmos Duality said:
So, if your game runs at, say, 60FPS, all crunching must be done within 1/60-second intervals (0.0167 seconds, or 16.7 milliseconds).
Meaning your latency to the Cloud MUST BE less than 16.7ms to provide any real benefit.
Only if the client-side AI is literally brain-dead without the cloud. If it can follow basic instructions, such as "go here", "attack that guy", etc., then you can have a high-level AI in the cloud that updates the client AI every half second or so. On that sort of timescale, latency is not a big issue.
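A rough sketch of the split Bad Jim is describing, assuming a hypothetical remote planner that issues coarse orders a couple of times per second while the local AI runs every frame (all names and numbers here are invented for illustration):

```python
# Hypothetical two-tier AI: an expensive remote "planner" picks
# high-level orders infrequently; the cheap local AI executes the
# current order every frame. Invented sketch, not any engine's API.

def cloud_planner(world_snapshot):
    # Stand-in for the remote decision-making; in this scheme the
    # call can tolerate hundreds of milliseconds of latency.
    return {"order": "move_to", "target": world_snapshot["best_cover"]}

class ClientAgent:
    def __init__(self):
        self.current_order = {"order": "hold"}

    def apply_order(self, order):
        self.current_order = order

    def tick(self, dt):
        # Cheap per-frame work: steering, pathing, animation.
        pass

agent = ClientAgent()
PLAN_INTERVAL = 0.5   # planner update every half second
last_plan = 0.0

def game_loop_step(now, dt, world_snapshot):
    global last_plan
    if now - last_plan >= PLAN_INTERVAL:
        # In practice this would be an async network request; the
        # client keeps acting on its last order until the reply
        # arrives, so latency never stalls a frame.
        agent.apply_order(cloud_planner(world_snapshot))
        last_plan = now
    agent.tick(dt)

game_loop_step(now=0.6, dt=1 / 60, world_snapshot={"best_cover": (10, 4)})
```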


Atmos Duality said:
Which is a nice perk, but a potentially dangerous one too, since the same Cloud-process-loading feature can very easily be modified to render a game dependent on the Cloud, making it "Always Online"-required.
There is an upside to this. If a company like EA realises it can protect its AI code by not actually giving it to their customers, they might, just might, throw a decent sack of cash at advancing the field of game AI so they can justify it. Whether the AI they develop will actually need cloud processing is largely down to random chance, but they would hopefully realise the importance of the AI being good.

Kahani said:
Somewhat ironic that your post was about people not actually realising what the AI does, since Civilisation (the latest one at least, I can't remember exactly how previous versions worked) is one of the ones that actually doesn't cheat
Actually, I got that example from a lecture by Soren Johnson.

 

Atmos Duality

New member
Mar 3, 2010
Bad Jim said:
Only if the client-side AI is literally brain-dead without the cloud. If it can follow basic instructions, such as "go here", "attack that guy", etc., then you can have a high-level AI in the cloud that updates the client AI every half second or so. On that sort of timescale, latency is not a big issue.
You can, but what's the bloody point when AI barely consumes any local processing to begin with?
You aren't really saving anything by hosting it on the cloud, and what you do save could never amount to anything significant past that.

If anything, hosting it on the cloud just creates another potential point of failure.

Bad Jim said:
There is an upside to this. If a company like EA realises it can protect its AI code by not actually giving it to their customers, they might, just might, throw a decent sack of cash at advancing the field of game AI so they can justify it. Whether the AI they develop will actually need cloud processing is largely down to random chance, but they would hopefully realise the importance of the AI being good.
That's quite a stretch for an upside.

For one thing, broadband internet has all but eliminated AI as a development priority, because multiplayer-only is the future companies like EA are moving towards. When you have a large population of human players, there's just no point to making AI-controlled elements.

That, and I have never once heard the need to "protect" AI as being a motivator for...well, anything.
It's just a bizarre assertion to make.
 

Senare

New member
Aug 6, 2010
dolgion said:
A good topic to discuss. Sure, the combat AI routines are peanuts compared to the requirements of graphics processing, but when I think of "Super AI", I'm thinking of the limitless potential the field has left untapped.

Think of an open-world game where every single NPC has dynamic interactions with others AND the player, where their actions are determined on the fly depending on ever-changing parameters. You'd have to write a human simulation, much in the vein of The Sims. When you increase the complexity and reactivity, the computational needs would surely skyrocket, no?

An example:

Bob is a farmer somewhere in Tamriel. He has two daughters in their teens and a troublesome son who's running with the wrong crowd. He worries about whom he should marry his daughters to and how he should straighten out his son. After all, last month he had to bail him out from the local guard after his son tried to steal a noble's purse as some sort of rite of passage at the thieves' guild. Thing is, the bail cost him his savings and then some. He's now in debt, and all he can hope for is a really good harvest season. One of his daughters is in love with a local banker's assistant, but Bob isn't sure if the boy has the smarts to make a career and provide for his daughter should they marry. With so many worries, and the death of his wife some six years ago, he's started having a drink before bed just so he can fall asleep.


This is just one character among thousands in this imaginary game. You'd need to simulate a working economy, personality traits, relationships between characters, and more. I wonder how impossible this kind of thing would be in terms of processing power, let alone to design algorithms for.
And even then you would have the challenge of communicating it to the player. It is not that hard to imagine that Bob may already have a whole hidden life in today's Skyrim, but the player would not notice it because the developers could not think of a good way to let Bob communicate it to the player. The game is voice-acted, so by that standard they would have to invent a voice synthesizer and a way to translate Bob's plans into words - a massive feat on its own - just to get the player to notice all the intricacies.
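As a toy illustration of the simulation dolgion sketches (every field and rule below is invented for the example), the per-NPC bookkeeping is small; it's the pairwise interactions - gossip, trade, matchmaking - that blow up the cost:

```python
import random

# Toy "Bob the farmer" life simulation; all rules invented here.
class NPC:
    def __init__(self, name):
        self.name = name
        self.gold = 50.0
        self.debt = 0.0
        self.stress = 0.0
        self.relationships = {}  # name -> affinity in [-1, 1]

    def daily_tick(self, economy):
        # Income depends on a shared, simulated economy.
        self.gold += economy["harvest_price"] * random.uniform(0.5, 1.5)
        if self.debt > 0:
            payment = min(self.gold, self.debt * 0.1)
            self.gold -= payment
            self.debt -= payment
            self.stress += 0.05
        if self.stress > 0.5:   # Bob's nightly drink
            self.gold -= 1
            self.stress -= 0.02

npcs = [NPC(f"npc_{i}") for i in range(10_000)]
economy = {"harvest_price": 2.0}

# One coarse tick per simulated day for 10,000 NPCs is cheap; the
# cost explodes when NPCs interact pairwise, which naively scales
# as O(N^2) - and then, as noted above, you still have to surface
# any of it to the player.
for npc in npcs:
    npc.daily_tick(economy)
```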
 

maturin

New member
Jul 20, 2010
Shamus, you must have never played ArmA.

The only reason AI doesn't usually require lots of system resources is because gaming AI is so incredibly limited and dumbed-down. The minute you try to do something ambitious with 'combat AI' (like in ArmA, where the poor little guys have to do EVERYTHING the player can do in a massive persistent environment), system resource limitations begin to loom large. Really large.

ArmA is a PC game, so the point about new console generations stands, but it's still one game that has already run up against a wall, as the use of headless clients (multiplayer communities buying a separate copy of the game to run the AI independently of the server) shows.

ArmA could still be a game with AI that resembles FEAR or CoD (it would probably be a lot more reliable, and it would also destroy the dynamic, sandbox nature of the game), but still, AI is only cheap because developers set their sights low in view of its great complexity.

Put 100 soldiers in a field, and fire a rifle at them from concealment. To act like humans, all 100 men have to drop to the ground, start looking around for dozens of possible places to take cover, while staying somewhat in formation, while preserving the chain of command. And then they have to start looking for the sniper by evaluating the sound and scanning the hundreds of possible objects the sniper could be hiding behind. And maybe doing some recon by fire and selecting random targets to spray at. And throw smoke? But who throws and where?

And that's the ideal AI. ArmA only does half of that. And don't tell me that that's only a few CPU cycles. It's literally thousands of LoS checks. Like every hitscan bullet fired in Halo ever.
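A back-of-envelope version of that count, with invented numbers rather than anything measured from ArmA:

```python
# Rough cost model for the "100 soldiers under fire" scenario.
# All numbers are illustrative, not profiled from ArmA.

soldiers = 100
cover_candidates = 30    # cover positions each soldier evaluates
suspect_objects = 200    # objects scanned while hunting the sniper

# Each evaluation is at least one line-of-sight raycast against
# world geometry.
cover_checks = soldiers * cover_candidates   # 3,000
search_checks = soldiers * suspect_objects   # 20,000

print(cover_checks + search_checks, "LoS checks for one reaction")
# -> 23000, and the whole thing re-evaluates continuously as the
#    soldiers and the sniper move.
```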
 

Callate

New member
Dec 5, 2008
I have a soft spot in my heart for the enemy player A.I. in the old Commodore 64/Apple II era Spy vs. Spy game, for one simple and perhaps silly reason:

The game was a split-screen action/strategy game in which each Spy attempted to locate four MacGuffins and escape, while setting various traps to kill the other Spy as they both maneuvered around the rooms of the map on the same mission. Death was an inconvenience, taking a player out of the game for perhaps ten seconds while giving the other player a bit of free rein and possibly access to any MacGuffins the downed player had already located.

One of the traps involved setting a bucket of "electrified" water over a door so that if the enemy Spy tried to open that door, it would fall on their head and immediately kill them.

When both Spies happened to run into each other in the same room, the action would condense down to one screen and the Spies had the option to try to bludgeon one another to death with clubs- an option that was certainly the slowest and most uncertain method of incapacitating one's opponent. As both players were running against a clock, this was usually best avoided.

Now, here's the thing: as this was 1980-something, both players were accessing all functions with a single eight-directional joystick and one button. This meant, among other things, that the controller function for opening a "northward" door (pressing up on the joystick and pressing the button) was more or less identical to the action for swinging a club upward (holding down the button and pressing upward). It was fairly easy for a savvy player to "trick" a human opponent into opening a trapped door and electrocuting themselves, when they were actually trying to swing their club upward.

...And I don't know why, but the computer could be "tricked" into the same error.
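For the curious, the "trick" boils down to two actions sharing one input chord, with only context to disambiguate them; a toy model (invented states, not the actual Spy vs. Spy code):

```python
# Toy model of the one-stick-one-button ambiguity in Spy vs. Spy.
# Invented for illustration; not the game's actual input handling.

def resolve_input(up_pressed, button_pressed, in_combat, facing_door):
    # "Open north door" and "swing club upward" consume the same
    # up+button chord; only the current context picks the action.
    if up_pressed and button_pressed:
        if in_combat:
            return "swing_club_upward"
        if facing_door:
            return "open_north_door"  # ...which may be trapped
    return "move"

# A savvy opponent breaks off combat at the right instant, so the
# other player's queued up+button lands as "open_north_door".
print(resolve_input(True, True, in_combat=False, facing_door=True))
```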

As Shamus notes, it's all too easy to create an AI that has advantages over the player: one who doesn't have the fog of war in a strategy game or always has free money; one who never misses in a shooter. It's harder to create one that plays convincingly like a human, including making human-style errors. But I really have to give credit when a programmer creates an "AI" player that feels like they're playing using the same interface as the player, right down to the tics of the controller. I saw it in that game back in 1980-something, and I'm not entirely certain I've seen a good example since then.
 

Andrew_C

New member
Mar 1, 2011
Atmos Duality said:
Freeing up 20-40% of local processing resources is nowhere near enough.
Hell, TRIPLING the processing resources available wouldn't be enough today.



So, the only benefits you're left with are those akin to dedicated servers.

Which is a nice perk, but a potentially dangerous one too, since the same Cloud-process-loading feature can very easily be modified to render a game dependent on the Cloud, making it "Always Online"-required.
The problem with your example is that with a good normal map the 600-poly model gives the same graphical fidelity as the 60,000-poly one, and is less resource-intensive on modern GPUs than the 6,000-poly model. And everyone uses normal maps these days. Basically, a new technology came along and totally sidelined everyone's assumptions.

I'm not saying that "the Cloud" will be that technology, particularly when it's just used as a marketing label for all the stuff that's already done server-side. Just that your example has been overtaken by changes in technology and is now misleading, which is a pity, as it was simple and easily understood.
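A minimal sketch of why normal maps are such a cheap win (toy Lambert shading under a single directional light; not any engine's actual shader):

```python
# The geometric surface stays flat (low-poly), but a per-pixel
# normal fetched from a texture fakes the lighting of dense geometry.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert(normal, light_dir):
    # Diffuse term: clamped dot product of unit vectors.
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l)))

light = (0.3, 0.8, 0.5)
flat_normal = (0.0, 0.0, 1.0)       # what the low-poly mesh provides
mapped_normal = (0.2, -0.1, 0.97)   # what the normal map stores

print("flat  :", round(lambert(flat_normal, light), 3))
print("mapped:", round(lambert(mapped_normal, light), 3))
# Same triangle count either way; the surface detail lives in a
# texture lookup instead of tens of thousands of extra polygons.
```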
 

iseko

New member
Dec 4, 2008
I wonder if the AI of Alien: Isolation will be any good. It's what is going to make or break the game.
 

Atmos Duality

New member
Mar 3, 2010
Andrew_C said:
The problem with your example is that with a good normal map the 600-poly model gives the same graphical fidelity as the 60,000-poly one, and is less resource-intensive on modern GPUs than the 6,000-poly model. And everyone uses normal maps these days. Basically, a new technology came along and totally sidelined everyone's assumptions.
*sigh*
Another great visual aid flushed right down the drain. Yeah, I forgot about bump mapping.
It's pretty great tech (especially for as old as it is), but it doesn't disprove my point; just that specific image.

Well, now I have to find a way to demonstrate the ever-ballooning state of texture maps, static meshes and a gaggle of "post-processing" goodies that are being crammed into rendering these days. All of which are FAR more resource intensive than any AI.
 

lukesparow

New member
Jan 20, 2014
I always thought all of the AI claims were a bunch of BS. Good to know I had the right idea.
But who knows? Maybe these developers will truly surprise us and come up with something amazing not possible before.