What Does the End of Moore's Law Mean for Gaming?

Something Amyss

Aswyng and Amyss
Dec 3, 2008
When we're already bumping our heads against the limitations of consoles, I don't think we're quite to that point yet.
 

Xeorm

New member
Apr 13, 2010
Kenjitsuka said:
Xeorm said:
I think the Stardock games do this (the Galactic Civilizations series), where they spend extra computing time on AI cycles. A more powerful computer will have better AI.
Wow, really? That really sucks!
The goal is to make the game as good as possible on all systems, for the best experience.
It would be really unfair if my sweet ass PC would kick my ass, just because I OC'ed it up to the heavens...
Really unfair, and probably unwanted by devs. Optional would be another story though; can NEVER have too many of those! :D
The difference isn't that big. With AI you run into diminishing returns very quickly. Even then, I don't remember a much better computer making much of a difference. Plus, better AI means your allied friends are less derpy than usual.

Rack said:
While this kind of thing can be parallelised it's probably not the kind of thing that can really soak up that much processor time to great effect. This is probably more a time and game design limitation than anything.
No, not at all. Good AI is processor and memory intensive. So much so that most of the effort goes into minimizing the resources the AI needs, rather than making it strictly better.
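To give a rough idea of how that "more CPU = smarter AI" setup can work (just a toy sketch, not Stardock's actual code; searchToDepth is a stand-in for a real minimax/MCTS evaluation): give the AI a fixed wall-clock budget and let it search as deep as it can, so a faster machine naturally gets a deeper search in the same time without punishing slower ones.

```cpp
#include <chrono>
#include <iostream>

// Stand-in for a real evaluation; imagine a minimax or MCTS search here.
static long long searchToDepth(int depth) {
    long long nodes = 0;
    for (long long i = 0; i < (1LL << depth); ++i) nodes += i % 7;  // fake work
    return nodes;
}

int main() {
    using Clock = std::chrono::steady_clock;
    const auto budget = std::chrono::milliseconds(100);  // same budget on every PC
    const auto start  = Clock::now();

    int depth = 1;
    while (Clock::now() - start < budget) {
        searchToDepth(depth);  // iterative deepening: redo the search one ply deeper
        ++depth;
    }
    std::cout << "Reached depth " << (depth - 1) << " within the budget\n";
}
```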
 
Jan 12, 2012
Thanks for all the responses, folks; I know little about how computers actually work, and it's good to learn a bit more. It seems like the answer is "There are some situations where it can help, but it's not a cure-all." I guess I'm not secretly the greatest programming mind of my generation.

Reading through @geizr's comment, I wonder how much of the coding for the next ten years is going to be people rebuilding engines to do things more efficiently, giving you all the practical benefits with none of those pesky hardware changes.
 

Amir Kondori

New member
Apr 11, 2013
freaper said:
Indeed, my PC is starting to run into trouble trying to play the latest AAA games; I'm looking forward to having to upgrade, maybe for the last time.
We are not there yet; expect at least another decade of upgrades, and that's assuming materials science doesn't give us another huge leap in processing performance. This is just a what-if, but if, say, a graphene-based processor could be made that allowed significantly higher clock speeds, like 100GHz, the upgrade cycle would continue.

Things are slowing down, but we still have quite a ways to go.
 

Wiggum Esquilax

New member
Apr 22, 2015
It also depends on the genre of game. Turn-based asynchronous games can easily split up their processing, and the PC can divide up its tasks and chew through them while waiting for the human player to finish his turn. Problems arise, of course, if one starts to hammer the end-turn button.

Fighting games, OTOH, have very little ability to divvy up work amongst cores, due to their nature of constantly, quickly processing. There's just no time for the cores to chat between themselves.
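For the turn-based case, the idea looks something like this (a toy sketch, not any particular engine's code; planAiTurn and waitForEndTurn are made-up names): kick the AI's planning off on another core the moment the player's turn starts, and only block on the result when they actually hit end turn.

```cpp
#include <future>
#include <iostream>
#include <string>

// Pretend this is an expensive strategy search.
std::string planAiTurn() {
    long long x = 0;
    for (long long i = 0; i < 50'000'000; ++i) x += i % 3;
    return "AI moves its fleet (" + std::to_string(x % 10) + ")";
}

void waitForEndTurn() {
    std::cout << "Press Enter to end your turn...";
    std::cin.get();
}

int main() {
    // Kick off AI planning the moment the player's turn starts.
    auto aiMove = std::async(std::launch::async, planAiTurn);

    waitForEndTurn();                   // the human thinks; a spare core crunches in parallel
    std::cout << aiMove.get() << "\n";  // usually ready by now, so no visible wait
}
```

Hammer the end-turn button immediately and that final get() is exactly where you end up waiting.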

Xeorm said:
Kenjitsuka said:
Xeorm said:
I think the Stardock games do this (the Galactic Civilizations series), where they spend extra computing time on AI cycles. A more powerful computer will have better AI.
Wow, really? That really sucks!
The goal is to make the game as good as possible on all systems, for the best experience.
It would be really unfair if my sweet ass PC would kick my ass, just because I OC'ed it up to the heavens...
Really unfair, and probably unwanted by devs. Optional would be another story though; can NEVER have too many of those! :D
The difference isn't that big. With AI you run into diminishing returns very quickly. Even then, I don't remember a much better computer making much of a difference. Plus, better AI means your allied friends are less derpy than usual.

Rack said:
While this kind of thing can be parallelised it's probably not the kind of thing that can really soak up that much processor time to great effect. This is probably more a time and game design limitation than anything.
No, not at all. Good AI is processor and memory intensive. So much so that most of the effort goes into minimizing the resources the AI needs, rather than making it strictly better.
GalCiv doesn't punish you for having a faster processor; difficulty select is still in play. A better CPU just means that smarter enemies are now an option. Technically, you could force a higher difficulty level than your rig is advised to handle, but the turn times will be positively glacial.

As for my question: how much of the recent CPU development slowdown is due to the shifting nature of the market? Since the proven successes of the first BlackBerrys/iPods, development has moved away from faster CPUs toward more energy-efficient, cooler CPUs. While there's certainly parallel development between the two paths, how much has mobile device R&D taken away from home consoles and PCs? With the slowing of the mobile device market, will big CPU development heat up again?
 

Callate

New member
Dec 5, 2008
P-89 Scorpion said:
Callate said:
On the plus side, recent news suggests breakthroughs in SSD technology mean we can soon expect 10TB SSDs about the size of a stick of gum. And Microsoft is making all sorts of noises about how wonderful and efficient DirectX 12 will be; time will tell. We've still got space to grow for a time, though Shamus' 10-year projection may well still prove accurate.
Samsung has already been showing off a 16TB SSD, and they say they will double that next year.
Yes. But is it the size of a stick of gum? :D

Sorry, I'm spouting off a bit. Maximum PC had a recent article about how 3D NAND flash memory pioneered by Micron and Intel enables vertical chip-stacking that's more power- and space-efficient; thus the stick-of-gum thing. Samsung's drive is in what's currently the more typical SSD form factor, 2.5 inches.

Which is not to say I would turn up my nose at one if it were offered to me right now. Nearly five-year-old computer and all that. But as long as console manufacturers are going to insist on a package that's roughly the size of a VCR, I can imagine a tiny, power-efficient SSD drive being an attractive proposition, assuming the price on the technology drops as quickly as tech writers would like to believe.
 

immortalfrieza

Elite Member
Legacy
May 12, 2011
DrunkOnEstus said:
I'm perfectly happy with graphics now. I was perfectly happy at every generation actually, because the games I was attracted to had a distinct and enjoyable art style and aesthetic. Textures really don't need to be higher res, shadows don't need to be more accurate, and we don't need a million piles of alpha-effects on the screen to make the games look better. I'd be fine with that arms race if the cost of production wasn't bankrupting publishers and causing such a safe and conservative mindset in the AAA space, and that's where a line has to be drawn somewhere before all the big players get sucked into a graphical vortex.
The obsession on both sides of the fence with MORE GRAPHICS, to the exclusion of everything else and regardless of how impractical and infeasible it is, is easily what's dragging down the video game industry more than anything else. Graphics should have been improving at a VERY slow rate: only as fast as creating them (and development in general) gets quicker and easier with the same amount of time and people involved, so that the costs of development are kept as low as humanly possible.

I guess my point is this: Amazing graphics have never made a shitty game worth playing through, and I've never put down an enjoyable experience because there wasn't enough eye candy.

This^. To any reasonable person, as long as the graphics are good enough that you can tell what you're looking at, they're insignificant to the experience of any game; it's everything ELSE that is far, far more important, by an absolutely massive amount.
 

rofltehcat

New member
Jul 24, 2009
Are you sure about that "consoles/gaming are probably held back by their CPU" part? It's quite possible they used a far-too-slow CPU to begin with, but I was surprised how well my €550 PC from 2011 (i5-2400) runs Witcher 3. The CPU sure as hell wasn't an issue.

I'd even say the current slower advancement in CPU speeds is interesting for that very reason: you probably don't need the biggest CPU possible, just one that is fast enough, as the difference after that gets negligible. Developers are also less likely to make their games too CPU-intensive, which means they run on more systems and customers avoid all kinds of hassle. Well, unless you're a console manufacturer without any foresight who wants to save $10 per machine, I guess...
 

alj

Master of Unlocking
Nov 20, 2009
Clock speed levelling off has nothing to do with Moore's Law.

Moore's Law says nothing about clock speed; it's about transistor count, and by extension performance. Just because clock speeds have stopped increasing does not mean the performance increase has also stopped. Improvements in architecture, transistor count, and memory bandwidth are far more important.

Whilst the improvement in single-core performance is slowing (albeit slightly), the increase in overall multi-threaded performance is still holding true to Moore's Law. Programmers just need to start making better use of multiple cores.
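Something like this trivial example (nothing game-specific, just standard C++ threads over a dummy workload) is the shape of it: the same job split across however many cores the machine has. Per-core speed stays flat, but wall-clock time drops roughly with the core count.

```cpp
#include <algorithm>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<int> data(10'000'000, 1);   // dummy workload
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<long long> partial(cores, 0);
    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / cores;

    // Each worker sums its own slice; no shared writes, so no locks needed.
    for (unsigned c = 0; c < cores; ++c) {
        workers.emplace_back([&, c] {
            const std::size_t begin = c * chunk;
            const std::size_t end   = (c + 1 == cores) ? data.size() : begin + chunk;
            partial[c] = std::accumulate(data.begin() + begin, data.begin() + end, 0LL);
        });
    }
    for (auto& t : workers) t.join();

    std::cout << "Sum: " << std::accumulate(partial.begin(), partial.end(), 0LL)
              << " across " << cores << " threads\n";
}
```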

Equating the tailing-off of clock speed increases with a slowdown in Moore's Law is a fundamental misunderstanding of the theory.

Compare the performance of, say, an 800MHz Pentium III and a current-gen ARM core running at the same frequency. The current-gen ARM core will run rings around the P3.

The big problem is that we are reaching the limits of what the physical properties of the silicon die can do; transistors are getting to the point where they cannot be any smaller. Intel has 10nm technology on its roadmap for 2017, and you cannot get much smaller than that without single-atom transistors.
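For a rough sense of scale (back-of-the-envelope, using silicon's ~0.54nm lattice constant): a 10nm feature is only about 10 / 0.54 ≈ 18 unit cells across, so there really isn't much shrinking left before you're counting individual atoms.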
 

raankh

New member
Nov 28, 2007
There's plenty more concurrency to exploit in modern software, in particular in the move away from thread models and toward message passing (Erlang-style) and fork/join models (OpenCL, CUDA). If there's graphics power to spare, you can always have it do serial tasks.
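A very rough sketch of the message-passing style, in case it helps (a crude Erlang-ish "mailbox" hacked together from a queue, mutex, and condition variable; the message strings are just placeholders): threads never share game state directly, they only post messages to each other.

```cpp
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

// A minimal blocking mailbox: senders push messages, the owner pops them.
class Mailbox {
    std::queue<std::string> msgs_;
    std::mutex m_;
    std::condition_variable cv_;
public:
    void send(std::string msg) {
        { std::lock_guard<std::mutex> lk(m_); msgs_.push(std::move(msg)); }
        cv_.notify_one();
    }
    std::string receive() {  // blocks until a message arrives
        std::unique_lock<std::mutex> lk(m_);
        cv_.wait(lk, [&] { return !msgs_.empty(); });
        std::string msg = std::move(msgs_.front());
        msgs_.pop();
        return msg;
    }
};

int main() {
    Mailbox box;
    std::thread worker([&] {  // an "actor" that owns its own work queue
        for (;;) {
            std::string msg = box.receive();
            if (msg == "quit") break;
            std::cout << "worker handled: " << msg << "\n";
        }
    });

    box.send("pathfind unit 7");
    box.send("recompute economy");
    box.send("quit");
    worker.join();
}
```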

Consumer software has a long way to go to catch up to enterprise/embedded systems, but that mostly means there are more insanely expensive GPUs to be bought down the line! New architectures ahoy!
 

Rect Pola

New member
May 19, 2009
I've had a similar thought, at least about graphics, many times. The need for the tech has been pushing the industry. At first, I worried about what would happen when we hit the computing event horizon. But two things have changed my opinion. The first is the resurgence of indies and their use of pixelated graphics. Though semi-unrelated forces caused both movements, it proved gaming would be fine as long as people had ideas they could interpret in gameplay. The technology was always there to do it in any fashion they wanted; the key was the culture not being so addicted to the hottest and sexiest that simpler designs couldn't appeal.

The second was the potential of a universal format. With nowhere else to go, this would serve as a perfectly good reason to drop proprietary hardware and settle on a single format. The only snag in this nirvana is oddball Nintendo inventing new ways to play with the current tech.

But to be fair, their recent consoles are another solution to keeping the hardware generations going.