Rob Robson said:
Lots of ignorance. Lots and lots.
Ignorance on my side, assumptions on yours. I'd call us even.
Rob Robson said:
1. Crysis 3 and Battlefield 3 already use 8 cores. No, it's not a perfect usage. Yes, it will get better.
How many games do you know that strive only to have the best graphics? I know two; the two you mentioned.
Don't forget that not every developer has the money to push the consoles to their limits, or, in fact, even wants to. Crysis 3 is a nice example, though. Over the last few patches its performance has improved, and by now the difference between an AMD octa-core (I don't remember which one exactly, sorry) and an Ivy Bridge i5 was ~3 FPS in benchmarks.
Rob Robson said:
2. Technical revolution? The fact that both upcoming consoles are 8-core (6-7 effective cores with AMD architecture compared to Intel) realistically guarantees that optimization will go towards 8 cores in gaming.
Here you can find said assumptions.
"realistically guarantees[...]". No, just no.
The PS3 was effectively a hexa-core, since its Cell processor could run six threads in parallel for games. The 360 had a tri-core, and yet many games don't even use three cores.
A new console generation doesn't magically expand every developer's budget. In fact, they'll have even less money for this. Look at the broken AAA games industry and the constant struggle to make your money back because games have become so expensive. Even without writing new engines that make efficient use of those cores, costs will rise simply from making games catch up to the graphics quality of modern PC games.
AAA games are only a fraction of the entire games industry, and of those AAA games, only a fraction pushes the consoles to their limits.
So no. The PS4 and XBone having the cores won't make 8-core games the standard. It will just increase the number of said exceptions. Do you honestly believe that the majority of game developers will jump from games optimized for 2-3 cores straight to games optimized for 8 cores?
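And even for the developers who do make the jump, the theoretical payoff is smaller than "8 cores" sounds. A back-of-envelope Amdahl's law calculation makes the point; the 60% parallel fraction below is my own assumed figure for illustration, not a measurement of any real engine:

```python
# Amdahl's law: speedup is limited by the serial part of the work.
# The 60% parallel fraction is an assumption for illustration only.
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup over a single core when only
    `parallel_fraction` of the frame's work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 6, 8):
    print(f"{cores} cores: {amdahl_speedup(0.6, cores):.2f}x")
# With a 60% parallel fraction, 8 cores only yield about a 2.1x
# speedup over one core -- barely more than 6 cores (2.0x).
```

If most of a game's per-frame logic stays serial, the seventh and eighth cores buy almost nothing, which is exactly why "the hardware has 8 cores" doesn't force anyone to optimize for 8 cores.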
Rob Robson said:
3. CPU dependent exist, in case you haven't noticed. Skyrim and similar sandbox games will always need CPU power to draw all that near and distant geometry. Object references alone will be a handful for CPUs of the now and the future. The only reason why Skyrim didn't go whole hog with CPU load is because the amount of RAM on the Xbox and PS3 made it impossible. Also, some games are simply CPU heavy by design without necessarily being open world. Natural Selection 2 is a good example of a game that uses up to 6 cores and where you will see tangible FPS benefits simply from overclocking your CPU. (I went from 65 to 110 FPS by overclocking to 4.8 Ghz.)
AFAIK Natural Selection 2 is a poor example, because it's less a game that is optimized to use 6 cores and more a game that is poorly optimized in the first place. Planetside 2 would have been a much better example: it's a CPU-killer by design.
It's nice that you mention Skyrim and its problems on the current-gen consoles in this context, though.
When the PS3 came out, it was a processing-power monster; people even built supercomputers out of PS3 clusters. And yet, what brings the PS3 to its limits doesn't make the quad-core CPU you'd find in gaming PCs from years ago break a sweat.
The PS4 and XBone launch with CPUs that are already underpowered by the standards of years ago.
I believe (and you are well within your rights to believe otherwise) that there won't be any game that won't run on a powerful gaming PC with a quad-core CPU, considering that the i5-4570 eats the CPUs from both consoles for breakfast.
I intentionally said "believe", since I want to wait and see what the PS4 can actually do. As of now, it looks like its CPU won't be powerful.
Rob Robson said:
I would get deeper into your studies.
I would stop reducing the entire games industry to the few developers who can actually afford this kind of stuff.
That being said, if I wanted to be sure to run those games, I still wouldn't get an AMD octa-core, but an overclockable Haswell i7. While a true octa-core CPU will outperform a quad-core with HT (all else being equal) in a game that makes efficient use of all its cores, we are still far away from such games.
I'd much rather have the better performance in most games and the lower temperatures of the Intel CPUs than a benefit in very few games.