John Carmack: PS4 and Xbox One Are "Essentially the Same"

Lightknight

Mugwamp Supreme
PoolCleaningRobot said:
Lightknight said:
I agree with that. When it comes to 2 devices with about a "50%" (finger quotes here) difference in power, it's not really that much on paper, but the whole goal of consoles is to squeeze out as much performance as possible on these machines. Eventually that 50% will be huge. Moar reasons to love the PS4!
I don't think the difference will be huge


CrystalShadow said:
Lightknight said:
They're so similar architecturally that you can guess at their relative performance from the specifications alone. On that basis, the PS4 is almost certainly faster.

But probably only by a factor of 2-4 at most. And to a PC developer (And remember who Carmack is here...) that's nothing.

(Dealing with 10x performance gaps has been routine for years. Modern PC's even force 100x performance gaps to be an issue - which is pretty demanding, and seems to have led to a lot of lower performance systems being unable to run games...)

For a point of reference, the Wii was about 20 times less powerful than the 360 and PS3.
That gap is huge compared to what the PS4/Xbox One gap is likely to be even in the worst case scenario.

Hell, even the Wii U is unlikely to even get much past 5-6 times slower than the fastest of these systems at the most...
Which is why the claims that it can still compete with them aren't as crazy as some people make it sound...
(Remembering that a 10x performance gap was routinely handled by PC developers for quite a long period.)
As I stated in a previous post (#53) above to PoolCleaningRobot, consoles allow for optimizations that PCs simply can't compete with directly (they do it indirectly by allowing upgrading of the components over time). When developers know all the hardware that is in most of their consumers' hands, they can create efficiencies between those components and push them in a way that simply won't work on other machines assembled from unknown but equally powerful hardware.

You could potentially do this with any computer. Come up with a standard set of hardware and software that is somewhat closed to alteration, and then developers can begin to pick away at its strengths and weaknesses until they get the most use out of the system. This is why Oblivion and Skyrim are so very different in quality on the same machines. I mean, frankly (and after the Skyrim patches to the PS3, for the reasons I also mentioned earlier), these games were playable on consoles that were significantly slower than the comparable minimum specs for the games. I mean, 1/4th the minimum RAM (or less, if you consider that the RAM was divided on the PS3) and 5-6 year old CPUs/GPUs. This is nothing to scoff at where advantages are concerned. I don't think the difference will be so severe at the end of this generation, but it will still be there.

As such, how can anyone expect to just divine the difference in processing power without taking that into account? The smallest of advantages could make a noticeable difference, and the PS4 is supposed to have a significant advantage comparatively. That isn't going to be immediately apparent but should become noticeable as the systems become older. I think Sony crippled the PS3 by partitioning the RAM and forcing developers to balance assets into various categories. The thing is, they split assets into categories on purpose. The CEO (a project lead when he said this) actually stated that the reason was they were afraid developers would unlock the full potential of the system. They should have been so lucky to have games that look like they do today back in 2006. I was quite pleased when they dropped the proprietary hardware crap.

As for the WiiU pacing alongside the major consoles: it's doubtful. You're right that it is closer to these machines than the Wii was to the PS3/360. But the difference is significant enough to require downscaling in AI and graphics. It'll be one of those things where you can definitely see the difference. The WiiU has a few other things against it:

1. Proprietary hardware: It will now be the only system that is particularly difficult to program for and port to. Porting between pc, XBO and the ps4 will be remarkably easy thanks to x86 architecture. While this should mean more multi-platform games in general, it will make the WiiU the only one that requires special attention to both code for and to downscale large titles appropriately.
2. Sales: The WiiU sales are outright sad and only getting slower as per Nintendo's announcement last week. At this rate, it may not outsell the Dreamcast. The Dreamcast only sold 10.6 million units in about 2.5 years before it was discontinued. The WiiU has slowed dramatically and has only sold 3.61 million units since Nov 18th, 2012. 3.06 million of those units were sold in the first month (numbers released on Dec 12, 2012), but only 390k sold in the following three months (Mar 3, 2013) and then 160k for the next three (June 6, 2013). Sales like this would (and do) quickly lead developers to wonder whether or not it's worth their development time to port a game to the system. It will especially have difficulty attracting exclusive titles.
3. The disc and small HDDs: It still isn't known whether or not the WiiU disc can read dual layers up to the current 50GB standard. If not, this could lead to some serious issues midway down the road, since the small HDDs on both WiiU models are really not friendly. Also, DLC can quickly become an issue when storage is only a few GBs.

Can the WiiU turn around and become a major competitor in this market like the Wii was and is? I don't think so. I think Nintendo systematically botched this console's launch and failed to sustain interest in it. They'd have to pull off something wildly impressive, and I don't think they can.

Likewise, being however many times faster or slower than something else doesn't necessarily translate across generations. What I mean is this: imagine that last generation's standard was 10 units per second. Being half as fast was a smaller disparity than in a generation whose standard is 100 units per second. The first would be 5 units behind, the second 50 units behind. Whatever those units are, a given multiple gains significant gravity as the baseline number of units increases. We'll have to see, though. Maybe it will be able to keep pace. From what I've seen, though, it's not much stronger than the 360. While the 360 is over 10x weaker than the XBO appears to be, I don't think it's 20x.
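If it helps, here's a quick back-of-the-envelope sketch of that arithmetic in Python (the "units" and both baselines are completely made up, purely to illustrate the point):

# Hypothetical throughput baselines in made-up "units per second"
last_gen_standard = 10
next_gen_standard = 100

# A machine running at half the standard in each generation
last_gen_laggard = last_gen_standard / 2   # 5 units
next_gen_laggard = next_gen_standard / 2   # 50 units

# Same 2x ratio both times...
print(last_gen_standard / last_gen_laggard)    # 2.0
print(next_gen_standard / next_gen_laggard)    # 2.0
# ...but the absolute shortfall is ten times larger in the newer generation
print(last_gen_standard - last_gen_laggard)    # 5.0
print(next_gen_standard - next_gen_laggard)    # 50.0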

We're also getting close to a time when graphical differences aren't that noticeable, though.
 

CrystalShadow

don't upset the insane catgirl
Lightknight said:
PoolCleaningRobot said:
Lightknight said:
I agree with that. When it comes to 2 devices with about a "50%" (finger quotes here) difference in power, it's not really that much on paper, but the whole goal of consoles is to squeeze out as much performance as possible on these machines. Eventually that 50% will be huge. Moar reasons to love the PS4!
I don't think the difference will be huge


CrystalShadow said:
Lightknight said:
They're so similar architecturally that you can guess at their relative performance from the specifications alone. On that basis, the PS4 is almost certainly faster.

But probably only by a factor of 2-4 at most. And to a PC developer (And remember who Carmack is here...) that's nothing.

(Dealing with 10x performance gaps has been routine for years. Modern PC's even force 100x performance gaps to be an issue - which is pretty demanding, and seems to have led to a lot of lower performance systems being unable to run games...)

For a point of reference, the Wii was about 20 times less powerful than the 360 and PS3.
That gap is huge compared to what the PS4/Xbox One gap is likely to be even in the worst case scenario.

Hell, even the Wii U is unlikely to even get much past 5-6 times slower than the fastest of these systems at the most...
Which is why the claims that it can still compete with them aren't as crazy as some people make it sound...
(Remembering that a 10x performance gap was routinely handled by PC developers for quite a long period.)
As I stated in a previous post (#53) above to PoolCleaningRobot, consoles allow for optimizations that PCs simply can't compete with directly (they do it indirectly by allowing upgrading of the components over time). When developers know all the hardware that is in most of their consumers' hands, they can create efficiencies between those components and push them in a way that simply won't work on other machines assembled from unknown but equally powerful hardware.
That's all quite true, but it misses the point somewhat.
The abstractions that allow thousands of different PCs to be programmed using the same API create a huge amount of overhead.
However, when talking about two consoles that have the same CPU architecture, are using the same manufacturer for their CPU and GPU, and even come from the same family of processors...
Optimisation isn't going to mean much, because the optimisations are going to be much the same for both systems.

You could potentially do this with any computer. Come up with a standard set of hardware and software that is somewhat closed to alteration, and then developers can begin to pick away at its strengths and weaknesses until they get the most use out of the system. This is why Oblivion and Skyrim are so very different in quality on the same machines. I mean, frankly (and after the Skyrim patches to the PS3, for the reasons I also mentioned earlier), these games were playable on consoles that were significantly slower than the comparable minimum specs for the games. I mean, 1/4th the minimum RAM (or less, if you consider that the RAM was divided on the PS3) and 5-6 year old CPUs/GPUs. This is nothing to scoff at where advantages are concerned. I don't think the difference will be so severe at the end of this generation, but it will still be there.
That's hardly a huge surprise. The overhead on PC is huge. And you can't optimise it out without breaking compatibility between devices... For that matter, the OS architecture typically has features that you would need to bypass to even start to optimise a game.
Meanwhile, are you aware of how much RAM a typical PC operating system actually demands just for the OS? It's not that much of a surprise console versions get away with less. It's not as impressive as you'd think.
Yes, consoles do more with less, and can be optimised more. - But that's an argument about PC vs. console, NOT console vs. console.


As such, how can anyone expect to just divine the difference in processing power without taking that into account? The smallest of advantages could make a noticeable difference, and the PS4 is supposed to have a significant advantage comparatively. That isn't going to be immediately apparent but should become noticeable as the systems become older. I think Sony crippled the PS3 by partitioning the RAM and forcing developers to balance assets into various categories. The thing is, they split assets into categories on purpose. The CEO (a project lead when he said this) actually stated that the reason was they were afraid developers would unlock the full potential of the system. They should have been so lucky to have games that look like they do today back in 2006. I was quite pleased when they dropped the proprietary hardware crap.
Small advantages don't create huge differences that easily. Big structural differences can make a huge difference (the PS3 and 360 are built very differently in most regards), but that simply leads to different optimisations needing to be made for different systems. - When you build something in a manner that's optimal for one system, but then port it to a system that works differently, performance tends to suffer on the system it wasn't optimised for.
But... As I said, we're now talking about two consoles with nearly identical architecture. - Optimisations for both are going to be pretty similar.
As for John Carmack... I trust his guesses on this kind of stuff... If he says something like that, I believe him.
Do you know the kind of insane optimisations that were necessary to even get something like Doom working on the systems that existed in 1993? Or running Quake without 3D hardware to work with?
He's got decades of experience creating some of the most optimised code ever written to back up his opinions.
As for mine... Small changes don't magically create huge performance differences. Big performance differences that have to do with optimisation come from big structural differences. (The GameCube, for instance, has a radically different graphics architecture to the original Xbox. The Xbox could do effects a GameCube would struggle with - but the GameCube could trivially do things with textures an Xbox would choke on even attempting. - Those kinds of differences certainly show in heavily optimised games... But not generally in cross-platform titles, because those tend to be built to the lowest common denominator.)

As for the WiiU pacing alongside the major consoles: it's doubtful. You're right that it is closer to these machines than the Wii was to the PS3/360. But the difference is significant enough to require downscaling in AI and graphics. It'll be one of those things where you can definitely see the difference. The WiiU has a few other things against it:
Yes, that's quite true. But it's still going to be less of a challenge than it ever was trying to get a 360 or PS3 title running on a wii.

1. Proprietary hardware: It will now be the only system that is particularly difficult to program for and port to. Porting between pc, XBO and the ps4 will be remarkably easy thanks to x86 architecture. While this should mean more multi-platform games in general, it will make the WiiU the only one that requires special attention to both code for and to downscale large titles appropriately.
True enough. Although this does simply reinforce the lack of meaningful difference between the PS4 and Xbox One...
Although it should be remembered that modern consoles have much, much more in common than older ones.
While the Wii U still uses a different processor architecture (and has a lot of structural differences), its graphics hardware is ATI-derived, shader model 4 equivalent hardware, not hugely removed on any technical level from that in the PS4 and Xbox One.
Still, it will be something of an issue, as already demonstrated by launch titles ported from other systems which were clearly very badly optimised compared to the other systems they were on.
(The Wii U seems to have a design particularly heavily biased towards GPU rather than CPU loads...)

2. Sales: The WiiU sales are outright sad and only getting slower as per Nintendo's announcement last week. At this rate, it may not outsell the Dreamcast. The Dreamcast only sold 10.6 million units in about 2.5 years before it was discontinued. The WiiU has slowed dramatically and has only sold 3.61 million units since Nov 18th, 2012. 3.06 million of those units were sold in the first month (numbers released on Dec 12, 2012), but only 390k sold in the following three months (Mar 3, 2013) and then 160k for the next three (June 6, 2013). Sales like this would (and do) quickly lead developers to wonder whether or not it's worth their development time to port a game to the system. It will especially have difficulty attracting exclusive titles.
That's neither here nor there. The likelihood of the Wii U ending up in the situation the Dreamcast was in is incredibly low. Of course, anything could happen, but it's unlikely.
As for your conclusions here, if anything is going to suffer because of this it's actually the multi-platform releases.
Nintendo actually has some pretty hefty connections when it comes to exclusives, not to mention their own internal development teams tend to result in Nintendo systems in fact having a disproportionate number of exclusives.

Exclusives are easy, from a technical point of view though; They don't really factor into a discussion about hardware power, because by definition they never face the prospect of being ported, so comparative hardware strengths and weaknesses rarely matter unless something has to run reasonably well on multiple systems.

3. The disc and small HDDs: It still isn't known whether or not the WiiU disc can read dual layers up to the current 50GB standard. If not, this could lead to some serious issues midway down the road, since the small HDDs on both WiiU models are really not friendly. Also, DLC can quickly become an issue when storage is only a few GBs.
Depends on how you look at it. 25 vs. 50 GB on a disc is not a huge issue. The GameCube demonstrated that this tends to just result in multi-disc releases (or sometimes some reduction in asset use), and that was with a much bigger relative gap (roughly 1.5 GB vs. 9 GB).

The internal storage could be a bigger issue, but remember the 360 started life with hard disks being an optional extra, and the Wii U does in fact support external storage if the issue were truly critical. (You can hook up a 2 TB external hard disk to a Wii U right now, if you're inclined to; - the internal storage is not an absolute limit)
Not ideal, but hardly fatal.

Can the WiiU turn around and become a major competitor in this market like the Wii was and is? I don't think so. I think Nintendo systematically botched this console's launch and failed to sustain interest in it. They'd have to pull off something wildly impressive, and I don't think they can.
I wouldn't say they're doing well, but I wouldn't count them out just yet. The 3DS launch was almost as bad, and it's now quite popular.
But yes, there is a fairly high chance this will be one of their worst product launches in a very long time.

Likewise, being however many times faster or slower than something else doesn't necessarily translate across generations. What I mean is this: imagine that last generation's standard was 10 units per second. Being half as fast was a smaller disparity than in a generation whose standard is 100 units per second. The first would be 5 units behind, the second 50 units behind. Whatever those units are, a given multiple gains significant gravity as the baseline number of units increases. We'll have to see, though. Maybe it will be able to keep pace. From what I've seen, though, it's not much stronger than the 360. While the 360 is over 10x weaker than the XBO appears to be, I don't think it's 20x.
That depends on how you're measuring things. You're also forgetting that computing hardware, for all practical intents and purposes (especially 3D graphics hardware), shows the effects of diminishing returns. 10x the raw performance of a graphics chip doesn't necessarily result in a dramatic change in appearance.

For instance, going from a scene with 10,000 polygons to one with 100,000 is a 10x leap in complexity (and performance requirements). But it's not going to represent a 10 times more impressive image. Going from 100,000 to 1 million is less impressive than that, but is still a 10x improvement in performance. And going from 1 million to 10 million is getting to the point where a lot of the differences are incredibly subtle, but again represents a 10x improvement in performance.
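A rough way to see why each 10x step buys less: compare those polygon counts to the pixels actually available to display them (assuming a 1080p frame, which is my assumption here, not anything from the spec sheets):

# Average screen area available per polygon at various scene complexities
# (1080p frame assumed purely for illustration)
pixels_per_frame = 1920 * 1080   # roughly 2.07 million pixels

for polygons in (10_000, 100_000, 1_000_000, 10_000_000):
    print(polygons, round(pixels_per_frame / polygons, 2), "pixels per polygon")

# 10,000     -> ~207 pixels per polygon (big, obvious facets)
# 100,000    -> ~21 pixels per polygon
# 1,000,000  -> ~2 pixels per polygon
# 10,000,000 -> ~0.2 pixels per polygon (mostly sub-pixel detail)

Each 10x jump in polygon throughput buys a tenth as much visible screen area per polygon, which is roughly why the later jumps look so much less dramatic.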

Even though you are correct that if the base was 100 units, being half as fast would be 50 units, whereas if you start from 10 it would be 5, this neglects that raw power doesn't translate cleanly into improved performance.
To give a different example, I have a laptop and a desktop. The laptop has a 1.6 GHz dual-core processor, while the desktop has a 2.6 GHz quad-core processor. Basic maths suggests the desktop system is 3 times more powerful than the laptop. Yet for 90% of tasks you'll struggle to even notice there's a meaningful difference.
Meanwhile, the desktop system also has a GPU that's 20 times faster (according to benchmarks). This has noticeable effects for some extremely demanding games, and yet, in some cases where titles will run on both systems, the laptop does much better than should seem reasonable from that gap. (Like, Half-Life 2 runs at 40 fps on one and 120 on the other - and that's not even with any huge number of effects disabled.) - Raw performance calculations rarely have the effect the numbers suggest they should.

If anything, I would argue relative differences tell you much more than absolute differences will. The difference between a performance of 1 and 2 is much more impressive than that between 99 and 100. The absolute difference is identical, yet one will stand out like a sore thumb while the other will barely register... Of course, going from 1 to 2 is a doubling of performance, while the other is something like just over 1% more...

The problem with this new generation as a whole is it represents an unusually small leap in absolute raw power terms, at a time when we're already faced with a serious case of diminishing returns.
So not only is the apparent improvement not going to be huge, but even the raw improvement on a technical level is surprisingly small. (Going from PS1 to PS2 to PS3 each seems to have been in the region of 20-40 times as powerful per generation. This new generation seems to be hovering at about 10-15 times more powerful at the very most...)

If I pull some numbers out of thin air (educated guesses based on known information, but still ultimately made up, so take them with a grain of salt), you find something like this:
Wii: 1
Xbox 360: 20
PS3: 20-30 (its very difficult-to-work-with architecture confuses performance estimates)
Wii U: 30-60 (weak CPU; hard to get any meaningful estimates of GPU performance)
Xbox One: ~120
PS4: ~180
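Taking rough midpoints of those made-up numbers, just to see what the ratios look like (Python, and again these are guesses, not measurements):

# Made-up relative performance indices from the list above (rough midpoints used for the ranges)
index = {"Wii": 1, "Xbox 360": 20, "PS3": 25, "Wii U": 45, "Xbox One": 120, "PS4": 180}

print(index["PS4"] / index["Xbox One"])      # 1.5  - the gap everyone is arguing about
print(index["Xbox One"] / index["Xbox 360"]) # 6.0  - one generational leap
print(index["Xbox 360"] / index["Wii"])      # 20.0 - the gap the Wii had to live with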

Now, made up or not, even if the above numbers are about right, what does that mean in practice?
Well, put the best the Wii has to offer next to a PS3 or 360 title and, believe it or not, it can hold its own... superficially. Of course, if you look at it any deeper than superficially, it becomes immediately obvious that pushing a Wii to its absolute limits will just about get you to a place that the PS3/360 can more or less reach in their sleep. But the fact that anything even superficially comparable exists should be the first warning sign that raw performance alone isn't all it's cracked up to be.
The 'next gen' stuff meanwhile, so far doesn't look that great compared to the existing generation. It's way too early to tell of course, and again we end up comparing best case stuff on the old systems to things that are easy for the newer systems...

The Xbox One and PS4 however seem to have pretty small margins over their predecessors, especially when you consider what came before (And the Wii is already faster than anything in the generation before it.).
Xbox One and PS4 are very similar overall, making comparisons between them unusually easy.

But while that kind of performance gap seems huge (my made-up numbers, for instance, suggest the gap between the two is as large as the best case scenario for an entire Wii U - not a trivial gap in absolute terms...), it's still just not likely to mean much in practice.
At this point we are very heavily into the realms of diminishing returns.

As for the Wii U issue, well, only time can really tell. It seems to be stuck between generations in terms of performance. - And that's going to have some impact, but it's not as big a gap as it seems. And with a gap that size, ports are still quite feasible. That definitely wasn't the case for its predecessor. A Wii version of a cross-platform release had nothing in common with other versions of the 'same' game. And I very much doubt a repeat of that situation will arise again... (But it still might not get ports in the first place.)

Anyway, it's all just a bunch of random speculation. The Wii U is hard to predict in relation to this.
The PS4 and Xbox One are not. - They're equivalent for all practical purposes.
One is more powerful than the other, but the margin is so small and their designs so similar, it barely matters.

Anyway, I think I've spent enough time rambling incoherently. I may be an amateur game programmer who has been studying this kind of technical stuff for over 15 years, but that's no excuse for the mess I just wrote. XD

Anyway, Ignore my nonsense. But please, take Carmack seriously. The guy knows what he's on about.
 

Lightknight

Mugwamp Supreme
CrystalShadow said:
That's all quite true, but it misses the point somewhat.
The abstractions that allow thousands of different PCs to be programmed using the same API create a huge amount of overhead.
However, when talking about two consoles that have the same CPU architecture, are using the same manufacturer for their CPU and GPU, and even come from the same family of processors...
Optimisation isn't going to mean much, because the optimisations are going to be much the same for both systems.
Let's say the tests John ran weren't able to use most of the GPU or RAM because they didn't allow the CPU to offload most of the processing. The machines would look pretty darn similar if that's the case, since from what I hear the CPU is nearly identical, if not actually identical. We still don't know the actual CPU speeds, though, as neither party has announced them aside from an unconfirmed leak that was fairly accurate elsewhere.

The strength of both consoles lies not in the CPU but in everything else. As such, for any accurate comparison you HAVE to have software that is optimized to offload work in efficient ways, even though they are both x86. We have no idea what tests Carmack ran. Again, I generally trust his work, and even more so I'm going to trust his intention in saying that he hasn't really run them through the wringer. It means he was making a surface-level comparison. For consoles, optimization means so much more than a surface comparison can show.

Meanwhile, are you aware of how much RAM a typical PC operating system actually demands just for the OS? It's not that much of a surprise console versions get away with less. It's not as impressive as you'd think.
Yes, consoles do more with less, and can be optimised more. - But that's an argument about PC vs. console, NOT console vs. console.
It invalidates testing consoles the same way you'd test PCs. That's the point.

Small advantages don't create huge differences that easily.
It depends on what you're calling small differences. We've seen some pretty lofty estimates regarding how much more powerful one machine is than the other. 50% is the latest estimate. Microsoft hasn't even denied that; they responded to the estimates by saying they expect to double or triple their processing via cloud computing.

As for John Carmack... I trust his guesses on this kind of stuff... If he says something like that, I believe him.
Do you know the kind of insane optimisations that were necessary to even get something like Doom working on the systems that existed in 1993? Or running Quake without 3D hardware to work with?
He's got decades of experience creating some of the most optimised code ever written to back up his opinions.
As for mine... Small changes don't magically create huge performance differences. Big performance differences that have to do with optimisation come from big structural differences. (The GameCube, for instance, has a radically different graphics architecture to the original Xbox. The Xbox could do effects a GameCube would struggle with - but the GameCube could trivially do things with textures an Xbox would choke on even attempting. - Those kinds of differences certainly show in heavily optimised games... But not generally in cross-platform titles, because those tend to be built to the lowest common denominator.)
What you forget is that Carmack stated up front that he hasn't run benchmark testing on them. He likely didn't intend for anyone to run away with his quote as some kind of golden fact. But, him being the industry vet that he is, he was smart enough to preface his comment with a disclaimer that he hasn't done in-depth testing or benchmarking. If you trust Carmack you should take that preface into account as well.

The notion that anyone, even an industry veteran, could take full advantage of console optimizations before the consoles are even released to the public is... questionable. I do believe they'll look similar. I mean, look at the graphics at the end of the current generation. VERY impressive. I wouldn't expect graphics to get that much more complex before significant diminishing returns hit home, and so this generation may be more of an improvement in AI and physics, which is a pretty good area to improve and will help graphics anyway.

Yes, that's quite true. But it's still going to be less of a challenge than it ever was trying to get a 360 or PS3 title running on a wii.
Yes, and I don't know. Yes, it should be less of a challenge to fit them on the WiiU graphically because, as my first sentence stated, the WiiU doesn't look like it'll be as large of a gap as the Wii was compared to the 360/PS3. "I don't know" because I don't know how different the WiiU's proprietary hardware is from x86 environments compared to the Wii's difference from the 360/PS3. I imagine this past generation was an all-around nightmare for programmers who had to learn the basics of three different machines, with only the 360 being close to a PC.

True enough. Although this does simply reinforce the lack of meaningful difference between the PS4 and Xbox One...
I'm not sure why this is relevant. Non-proprietary hardware benefits absolutely everyone involved. The PS4 and the XBO moving to a standard architecture is a plus in both of their courts and a negative for anyone left behind. The hardware becomes cheaper to make, development resources drop SIGNIFICANTLY, and ports become a lot cleaner. Manufacturers, developers, customers. I can't think of a single downside, aside from these companies marching along to what may inevitably be the equivalent of Steam boxes in the living room that act as home servers. As far as I'm concerned, the future console makers have will mostly be in setting generational standards that devs can more easily work with, as well as making those standards cheaper for the consumer than buying a full-blown PC of comparable power would be. FYI, I own a powerful PC, a PS3, and a 360. I own the other systems as well, except for the WiiU, but I hardly consider my Wii to be a game machine so much as a really fun party machine. I mean, I'd consider it my Nintendo first-party game emulator before I'd call it a full-fledged gaming machine. Fun games, but very limited.


Although it should be remembered that modern consoles have much, much more in common than older ones.
While the Wii U still uses a different processor architecture (and has a lot of structural differences), its graphics hardware is ATI-derived, shader model 4 equivalent hardware, not hugely removed on any technical level from that in the PS4 and Xbox One.
Still, it will be something of an issue, as already demonstrated by launch titles ported from other systems which were clearly very badly optimised compared to the other systems they were on.
(The Wii U seems to have a design particularly heavily biased towards GPU rather than CPU loads...)
Indeed. This really is sad. I would like to say that hopefully this is because they are porting from the 360/PS3 versions, which are quite different architecturally, but these are all PC games too. I do wonder what the problem is, but we may not really know for another year or so. I'm hoping the publishers just saw the machine as a risk and so did not put as many resources into it as they could have.

That's neither here nor there. The likelihood of the Wii U ending up in the situation the Dreamcast was in is incredibly low. Of course, anything could happen, but it's unlikely.
How so? The numbers seem to put it at selling at an even slower rate than the Dreamcast did. Keep in mind that the Sega Saturn killed Sega's console run, not the Dreamcast. So saying that the WiiU appears to be heading that route doesn't mean that Nintendo itself is going to die. Though I do wonder if Nintendo might see more benefit in releasing their software on other consoles this gen. I mean, Mario Kart Wii sold over 34.3 million copies. Surely sales like that would be more beneficial to them than remaining exclusive on a console that has just barely sold more than a tenth of the number of copies that one game sold. Heck, a Mario game on PC or a non-Nintendo console? I'd buy it. It'd have to be done after the WiiU is a sure failure, and only if Nintendo isn't planning to relaunch a different console halfway into the generation. Frankly, though, unless Nintendo has another Wii-style breakthrough, they may make more money as a software company like Sega did.

But either way, I'm not sure how you can say the odds are incredibly low when, going by this last quarter's results, it's selling at around the same rate as the Dreamcast did, or even slower. After the 3 million units sold in the holiday quarter (already 2 million short of what they were forecasting), sales only increased by 12%, which is terrible for a new console. This past quarter it was 4%. Nintendo has admitted several of the mistakes that caused this and has provided no clear way out.
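Just to show my working on those percentages (Python, using the figures I quoted above; the quarter boundaries are approximate):

# Wii U sales figures quoted earlier, in millions of units
launch_month = 3.06      # through Dec 2012
q1_2013 = 0.39           # the following three months
q2_2013 = 0.16           # the three months after that

print(round(launch_month + q1_2013 + q2_2013, 2))       # 3.61 - the lifetime total
print(round(q1_2013 / launch_month * 100, 1))           # 12.7 - % growth in that quarter
print(round(q2_2013 / (launch_month + q1_2013) * 100, 1))  # 4.6 - % growth the next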

As for your conclusions here, if anything is going to suffer because of this it's actually the multi-platform releases.
Nintendo actually has some pretty hefty connections when it comes to exclusives, not to mention their own internal development teams tend to result in Nintendo systems in fact having a disproportionate number of exclusives.
Multi-platform releases are a major thing to miss out on. You're talking about most AAA titles that aren't from Nintendo. I'm not 100% sure what exclusives they had over the past decade that were good, weren't released later on the other systems, and weren't Nintendo-made (for example, No More Heroes was a great Wii exclusive title, then bam, released on other systems with more content). I've come to consider my purchase of Nintendo consoles as generally my Nintendo tax to be able to play their software and little else.

Exclusives are easy, from a technical point of view though; They don't really factor into a discussion about hardware power, because by definition they never face the prospect of being ported, so comparative hardware strengths and weaknesses rarely matter unless something has to run reasonably well on multiple systems.
Point #2, the one your comment immediately above is responding to, is about sales, not hardware strength. These poor sales are already hitting Nintendo:

http://www.theverge.com/2013/7/31/4574032/nintendo-earnings-q1-2013

Note that even games that already have WiiU ports developed, like Batman: Arkham Origins, are trimming out features like multiplayer. That announcement was in response to this. Don't get me wrong, there are very few games I play for multiplayer, but this is just a bad sign for them. The reason is that they don't expect the market to be large enough to warrant supporting it. The sales of the console ARE major. Poor sales sidestep the hardware mattering at all, especially when the hardware is proprietary. If there aren't customers on the other end, then you're not going to spend resources producing ports for it. Entire publishing companies have decided not to support the WiiU because of this, and the ones that have remained stalwart may only be doing so because work was already in progress or because they think it can turn itself around. Either way, they are taking a risk. Nintendo may need a significant price drop and a MAJOR title or two just to recapture interest.

Depends on how you look at it. 25 vs. 50 GB on a disc is not a huge issue. The GameCube demonstrated that this tends to just result in multi-disc releases (or sometimes some reduction in asset use), and that was with a much bigger relative gap (roughly 1.5 GB vs. 9 GB).

The internal storage could be a bigger issue, but remember the 360 started life with hard disks being an optional extra, and the Wii U does in fact support external storage if the issue were truly critical. (You can hook up a 2 TB external hard disk to a Wii U right now, if you're inclined to; - the internal storage is not an absolute limit)
Not ideal, but hardly fatal.
Multi-disc releases would work; I keep forgetting that bit despite having lived through it. Can the system play games from the 2 TB external drive? Can digital games be transferred readily back and forth? Keep in mind, we're instantly talking about non-standard setups the moment we're talking about external drives.

I wouldn't say they're doing well, but I wouldn't count them out just yet. The 3DS launch was almost as bad, and it's now quite popular.
But yes, there is a fairly high chance this will be one of their worst product launches in a very long time.
I assume they have enough assets stored away for such a rainy day. A failure as bad as this is shaping up to be may significantly change the way they decide to do business going forward if they really can't compete on the hardware front. I would be interested to see what they'd do if they became a software company (with the exception of their excellent handheld product line and software) in which the profit margin rose or fell according to what they published.

That depends on how you're measuring things. You're also forgetting that computing hardware, for all practical intents and purposes (especially 3D graphics hardware), shows the effects of diminishing returns. 10x the raw performance of a graphics chip doesn't necessarily result in a dramatic change in appearance.

For instance, going from a scene with 10,000 polygons to one with 100,000 is a 10x leap in complexity (and performance requirements). But it's not going to represent a 10 times more impressive image. Going from 100,000 to 1 million is less impressive than that, but is still a 10x improvement in performance. And going from 1 million to 10 million is getting to the point where a lot of the differences are incredibly subtle, but again represents a 10x improvement in performance.
Image complexity is one component of processing. As I've stated before, I anticipate that the largest difference we should see this generation will be in the physics and AI departments more so than a huge graphical leap like there was between the PS2/Xbox generation and the PS3/360 generation. Physics can require a heck of a lot more processing than image quality. We will still see a firming up of graphics, and the games should look a lot more attractive than in the previous generation, but we really are in an age where better graphics won't make that much of a difference, like you stated. That doesn't mean that demands on processing are any less significant. You're just talking about the paint on the hood while I'm talking about what's underneath.

Even though you are correct that if the base was 100 units, being half as fast would be 50 units, whereas if you start from 10 it would be 5, this neglects that raw power doesn't translate cleanly into improved performance.
To give a different example, I have a laptop and a desktop. The laptop has a 1.6 GHz dual-core processor, while the desktop has a 2.6 GHz quad-core processor. Basic maths suggests the desktop system is 3 times more powerful than the laptop. Yet for 90% of tasks you'll struggle to even notice there's a meaningful difference.
Meanwhile, the desktop system also has a GPU that's 20 times faster (according to benchmarks). This has noticeable effects for some extremely demanding games, and yet, in some cases where titles will run on both systems, the laptop does much better than should seem reasonable from that gap. (Like, Half-Life 2 runs at 40 fps on one and 120 on the other - and that's not even with any huge number of effects disabled.) - Raw performance calculations rarely have the effect the numbers suggest they should.
The difference in power makes a significant and clear difference in performance once the capacity of the weaker machine is reached. When you have to fit a game into a smaller box, it will be apparent here and there. In this case I'd expect a hit to graphics as well as under-the-hood performance. While I don't think the difference is as severe as between the Wii and current-gen systems, I still think it will be QUITE noticeable, all things considered, even if not entirely graphical.

Of course, going from 1 to 2 is a doubling of performance, while the other is something like just over 1% more...
Exactly. This is why it's unreliable to say that 5-6 times weaker won't be the problem that 10+ times weaker has traditionally been. If 10x weaker used to be the problematic threshold, then this generation that multiplier should be lower, because the overall bar has been raised so much.

The problem with this new generation as a whole is it represents an unusually small leap in absolute raw power terms,
While it is not the same kind of proportionate change as we've seen before, this is still several times as powerful as current-gen tech. What you've got to realise is that with each subsequent iteration you're seeing not just a machine that is multiple times more powerful than its predecessor, but many multiples more powerful than all of the predecessor's predecessors.
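To put rough numbers on that compounding (borrowing the ballpark multipliers already floated in this thread, so this is purely illustrative, not a spec comparison):

# Rough per-generation multipliers mentioned earlier in this thread (guesses, not official figures)
ps2_to_ps3 = 20    # low end of the 20-40x range mentioned above
ps3_to_ps4 = 10    # the "10-15x at the very most" estimate, taken conservatively

print(ps3_to_ps4)               # 10  - looks modest next to previous leaps...
print(ps2_to_ps3 * ps3_to_ps4)  # 200 - ...but that still puts the PS4 roughly 200x above the PS2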

The 'next gen' stuff meanwhile, so far doesn't look that great compared to the existing generation. It's way too early to tell of course, and again we end up comparing best case stuff on the old systems to things that are easy for the newer systems...
This is the story of EVERY console release. The graphics are slightly but noticeably better than the previous generation's, and things improve from there. We could hold up Skyrim, or the most advanced game on the 360, against whatever the most graphically advanced game available on the original Xbox was, and the difference would be striking.

The Xbox One and PS4 however seem to have pretty small margins over their predecessors, especially when you consider what came before (And the Wii is already faster than anything in the generation before it.).
Xbox One and PS4 are very similar overall, making comparisons between them unusually easy.
They are multiple times more powerful than their predecessors. The improvement was never going to keep being exponential forever. Eventually even doubling system power may not be viable over a span of 5 years, not without some significant technological breakthroughs in the meantime. Imagine the most powerful computer on the market: four Titans bridged together with all the latest fix'ns to back them up. Twice as powerful as that really means something right now.
 

Lightknight

Mugwamp Supreme
Just because I don't like people (Carmack in this instance) saying shit in public on topics they should know better about when they don't know the truth, I'm going to post these two links here to show how they contrast with his words:

http://www.edge-online.com/news/power-struggle-the-real-differences-between-ps4-and-xbox-one-performance/
http://www.nbcnews.com/technology/xbox-one-vs-playstation-4-how-do-they-measure-under-4B11215299

The Escapist already posted the bit with the unnamed developers saying that the power difference was clearly there. Those articles put numbers to it and interview individuals to get a better estimate.

Their word is that it is clearly noticeable: a 50% power difference in which the PS4 is the winner. Those articles go on to state that functionally it may not matter, because game developers will want their games to work on both systems and so will design with the XBO's parameters in mind. Ergo, the weakest link decides the quality of most games, and only games with PS4-specific optimizations will show the difference.

The only benefit of the doubt I'll give Carmack in this instance is that he clearly stated he hadn't run benchmarks on them. There's also no telling what kind of tests he was running. The XBO can perform better if the computations aren't texturing- or arithmetic logic unit (ALU)-based, which pretty much all modern games are. So how he came to this conclusion is beyond me.