Lightknight said:
PoolCleaningRobot said:
Lightknight said:
I agree with that. When it comes to 2 devices with about a "50%" (finger quotes here) difference in power, it's not really that much on paper, but the whole goal of consoles is to squeeze out as much performance as possible on these machines. Eventually that 50% will be huge. Moar reasons to love the ps4!
I don't think the difference will be huge
CrystalShadow said:
Lightknight said:
They're so similar architecturally that you can guess at their relative performance from the specifications alone. On that basis, the PS4 is almost certainly faster.
But probably only by a factor of 2-4 at most. And to a PC developer (And remember who Carmack is here...) that's nothing.
(Dealing with 10x performance gaps has been routine for years. Modern PCs even make 100x performance gaps an issue - which is pretty demanding, and seems to have left a lot of lower-performance systems unable to run games...)
For a point of reference, the Wii was about 20 times less powerful than the 360 and PS3.
That gap is huge compared to what the PS4/Xbox One gap is likely to be, even in the worst-case scenario.
Hell, even the Wii U is unlikely to be much more than 5-6 times slower than the fastest of these systems at the most...
Which is why the claims that it can still compete with them aren't as crazy as some people make them sound...
(Remembering that a 10x performance gap was routinely handled by PC developers for quite a long period.)
As I stated in a previous post (#53) above to PoolCleaningRobot, consoles allow for optimizations that PCs simply can't compete with directly (they do it indirectly by allowing components to be upgraded over time). When developers know all the hardware that is in most of their consumers' hands, they can create efficiencies between those components and push them in a way that simply won't work on other machines assembled from unknown but equally powerful hardware.
That's all quite true, but misses the point somewhat.
The abstractions that allow thousands of different PC's to be programmed using the same API create a huge amount of overhead.
However, when talking about 2 consoles that have the same CPU architecture, and are even using the same manufacturer for their CPU and GPU, which even come from the same family of processors...
Optimisation isn't going to mean much, because the optimisations are going to be much the same for both systems.
You could potentially do this with any computer. Come up with a standard set of hardware and software that is somewhat closed to alteration, and then developers can begin to pick away at its strengths and weaknesses until they get the most use out of the system. This is why Oblivion and Skyrim are so very different in quality on the same machines. I mean, frankly (and after the Skyrim patches to the ps3, for the reasons I also mentioned earlier), these games were playable on consoles that were significantly weaker than the games' comparable minimum PC specs. I mean, 1/4th the minimum RAM (or less, if you consider that the RAM was divided on the ps3) and 5-6 year old CPU/GPUs. This is nothing to scoff at where advantages are concerned. I don't think the difference will be so severe at the end of this generation, but it will still be there.
That's hardly a huge surprise. The overhead on PC is huge. And you can't optimise it out without breaking compatibility between devices... For that matter, the OS architecture typically has features that you would need to bypass to even start to optimise a game.
Meanwhile, are you aware of how much RAM a typical PC operating system actually demands just for the OS? It's not that much of a surprise that console versions get away with less. It's not as impressive as you'd think.
Yes, consoles do more with less, and can be optimised more. - But that's an argument about PC vs. console, NOT console vs. console.
As such, how can anyone expect to just divine the difference in processing power without taking that into account? The smallest of advantages could make a noticeable difference, and the ps4 is supposed to have a significant advantage comparatively. That isn't going to be immediately apparent but should become noticeable as the systems get older. I think Sony crippled the ps3 by partitioning the RAM and forcing developers to balance assets into various categories. The thing is, they split the assets into categories on purpose. The CEO (a project lead when he said this) actually stated that the reason was that they were afraid developers would unlock the full potential of the system. They should have been so lucky to have games that look like they do today back in 2006. I was quite pleased when they dropped the proprietary hardware crap.
Small advantages don't create huge differences that easily. Big structural differences can make a huge difference (the PS3 and 360 are built very differently in most regards), but that simply leads to different optimisations needing to be made for different systems. - When you build something in a manner that's optimal for one system, but then port it to a system that works differently, performance on the system it wasn't optimised for tends to suffer.
But... As I said, we're now talking about two consoles with nearly identical architecture. - Optimisations for both are going to be pretty similar.
As for John Carmack... I trust his guesses on this kind of stuff... If he says something like that, I believe him.
Do you know the kind of insane optimisations that were necessary to even get something like Doom working on the systems that existed in 1993? Or running Quake without 3D hardware to work with?
He's got decades of experience creating some of the most optimised code ever written to back up his opinions.
As for mine... Small changes don't magically create huge performance differences. Big performance differences that have to do with optimisation come from big structural differences. (The gamecube, for instance, has a radically different graphics architecture to the original Xbox. The Xbox could do effects a gamecube would struggle with - but the gamecube could trivially do things with textures an Xbox would choke on even attempting. - Those kinds of differences certainly show in heavily optimised games... But not generally in cross-platform titles, because those tend to be built to the lowest common denominator.)
As for the WiiU keeping pace with the major consoles: it's doubtful. You're right that it is closer to these machines than the Wii was to the ps3/360. But the difference is significant enough to require downscaling in AI and graphics. It'll be one of those things where you can definitely see the difference. The WiiU has a few other things going against it:
Yes, that's quite true. But it's still going to be less of a challenge than it ever was trying to get a 360 or PS3 title running on a Wii.
1. Proprietary hardware: It will now be the only system that is particularly difficult to program for and port to. Porting between PC, XBO and the ps4 will be remarkably easy thanks to shared x86 architecture. While this should mean more multi-platform games in general, it will make the WiiU the only one that requires special attention both to code for and to downscale large titles for appropriately.
True enough. Although this does simply reinforce the lack of meaningful difference between the PS4 and Xbox One...
Although it should be remembered that modern consoles have much, much, more in common than older ones.
While the Wii U still uses a different processor architecture (and has a lot of structural differences), its graphics hardware is ATI-derived, shader model 4-equivalent hardware, not hugely removed on any technical level from that in the PS4 and Xbox One.
Still, it will be something of an issue, as already demonstrated by launch titles ported from other systems which were clearly very badly optimised compared to the other systems they were on.
(Wii U seems to have a design particularly heavily biased towards GPU rather than CPU loads...)
2. Sales: The WiiU sales are outright sad and only getting slower as per Nintendo's announcement last week. At this rate, it may not outsell the Dreamcast. The Dreamcast only sold 10.6 million units in about 2.5 years before it was discontinued. The WiiU has slowed dramatically and has only sold 3.61 million units since Nov 18th, 2012. 3.06 million of those units were sold in the first month (numbers released on Dec 12, 2012), but only 390k sold in the following three months (Mar 3, 2013) and then 160k in the next three (June 6, 2013). Sales like this would quickly lead (and have led) developers to wonder whether or not it's worth their development time to port a game to the system. It will especially have difficulty attracting exclusive titles.
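For what it's worth, those quoted figures are internally consistent; a quick sanity check (figures as given above, in millions of units):

```python
# Wii U sales figures as quoted above, in millions of units
first_month = 3.06        # through Dec 12, 2012
next_quarter = 0.39       # through Mar 3, 2013
following_quarter = 0.16  # through June 6, 2013

total = first_month + next_quarter + following_quarter
print(f"Total: {total:.2f} million")  # 3.61, matching the quoted total

dreamcast_lifetime = 10.6  # million units over ~2.5 years
print(f"Share of Dreamcast lifetime sales so far: {total / dreamcast_lifetime:.0%}")
```

That second figure is the point: well under half of the Dreamcast's lifetime total, with sales decelerating sharply quarter over quarter.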
That's neither here nor there. The likelihood of the Wii U ending up in the situation the Dreamcast was in is incredibly low. Of course, anything could happen, but it's unlikely.
As for your conclusions here, if anything is going to suffer because of this it's actually the multi-platform releases.
Nintendo actually has some pretty hefty connections when it comes to exclusives, not to mention their own internal development teams, which tend to mean Nintendo systems in fact have a disproportionate number of exclusives.
Exclusives are easy from a technical point of view, though; they don't really factor into a discussion about hardware power, because by definition they never face the prospect of being ported, so comparative hardware strengths and weaknesses rarely matter unless something has to run reasonably well on multiple systems.
3. The Disc and small HDDs: It still isn't known whether or not the WiiU disc drive can read dual layers up to the current 50GB standard. If not, this could lead to some serious issues midway down the road, since the small HDDs on both WiiU models are really not friendly. Also, DLC can quickly become an issue when storage is only a few GB.
Depends on how you look at it. 25 vs. 50 GB on a disc is not a huge issue. The gamecube demonstrated that this tends to just result in multi-disc releases (or sometimes some reduction in asset use), and that was with a far bigger gap (roughly 1.5 vs. 9 GB).
The internal storage could be a bigger issue, but remember the 360 started life with hard disks as an optional extra, and the Wii U does in fact support external storage if the issue were truly critical. (You can hook up a 2 TB external hard disk to a Wii U right now if you're so inclined; the internal storage is not an absolute limit.)
Not ideal, but hardly fatal.
Can the WiiU turn around and become a major competitor in this market like the Wii was and is? I don't think so. I think Nintendo systematically failed this console's release and continued interest. They'd have to pull off something wildly impressive, and I don't think they can.
I wouldn't say they're doing well, but I wouldn't count them out just yet. The 3DS launch was almost as bad, and it's now quite popular.
But yes, there is a fairly high chance this will be one of their worst product launches in a very long time.
Likewise, being however many times faster or slower than something doesn't necessarily translate across generations. What I mean is this: imagine that last generation's standard was 10 units per second. Running at half speed was a smaller disparity than in a generation whose standard is 100 units per second. The first would be 5 units, the second would be 50 units. Whatever those units are, a given multiple gains significant gravity as the average number of them increases. We'll have to see though. Maybe it will be able to keep pace. From what I've seen, though, it's not much stronger than the 360. While the 360 is over 10x weaker than the XBO appears to be, I don't think it's 20x.
That depends on how you're measuring things. You're also forgetting that computing hardware, for all practical intents and purposes (especially 3D graphics hardware), shows the effects of diminishing returns. 10x the raw performance of a graphics chip doesn't necessarily result in a dramatic change in appearance.
For instance, going from a scene with 10,000 polygons to one with 100,000 is a 10x leap in complexity (and performance requirements). But it's not going to represent a 10 times more impressive image. Going from 100,000 to 1 million is less impressive than that, but is still a 10x improvement in performance. And going from 1 million to 10 million is getting to the point where a lot of the differences are incredibly subtle, but again represents a 10x improvement in performance.
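One way to see the shrinking payoff is a toy model where perceived detail grows with the logarithm of polygon count rather than linearly. (That logarithmic assumption is purely mine, for illustration; it just makes the pattern described above concrete.)

```python
import math

# Toy model: each step is a 10x jump in raw polygon count, but if perceived
# detail tracks log10(count), the *relative* perceived improvement shrinks
# with every step even though the raw multiplier never changes.
counts = [10_000, 100_000, 1_000_000, 10_000_000]
for prev, curr in zip(counts, counts[1:]):
    raw_gain = curr / prev                                # always 10x
    perceived_gain = math.log10(curr) / math.log10(prev)  # 1.25x, 1.20x, ~1.17x
    print(f"{prev:>10,} -> {curr:>10,}: raw {raw_gain:.0f}x, "
          f"perceived ~{perceived_gain:.2f}x")
```

The exact numbers don't matter; the shape does. Identical raw leaps buy progressively less visible improvement.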
Even though you are correct that, if the base was 100 units, half speed would be 50 units, whereas starting from 10 it would be 5, this neglects that raw power doesn't translate cleanly into improved performance.
To give a different example, I have a laptop, and a desktop. The laptop has a 1.6 ghz dual core processor, while the desktop has a 2.6 ghz quad core processor. Basic maths suggests the desktop system is 3 times more powerful than the laptop. Yet for 90% of tasks you'll struggle to even notice there's a meaningful difference.
Meanwhile, the desktop system also has a GPU that's 20 times faster (according to benchmarks). This has noticeable effects in some extremely demanding games, and yet, in some cases where titles will run on both systems, the laptop does much better than would seem reasonable from that gap. (Like, Half-Life 2 runs at 40 fps on one and 120 on the other - and that's not even with any huge number of effects disabled.) - Raw performance calculations rarely have the effect the numbers suggest they should.
If anything, I would argue relative differences tell you much more than absolute differences will. The difference between a performance of 1 and 2, is much more impressive than that between 99 and 100. The absolute difference is identical, yet one will stand out like a sore thumb, the other will barely register... Of course, going from 1 to 2 is a doubling of performance, while the other is something like just over 1% more...
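To put numbers on that point (same figures as in the sentence above, nothing more):

```python
def compare(old, new):
    """Return the absolute and relative change going from old to new."""
    return new - old, (new - old) / old

# Identical absolute gap (+1), wildly different relative gap
for old, new in [(1, 2), (99, 100)]:
    absolute, relative = compare(old, new)
    print(f"{old} -> {new}: absolute +{absolute}, relative +{relative:.1%}")
# 1 -> 2 is a +100% jump; 99 -> 100 is only about +1%
```

Which is exactly why relative differences are the more honest way to compare generations.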
The problem with this new generation as a whole is it represents an unusually small leap in absolute raw power terms, at a time when we're already faced with a serious case of diminishing returns.
So not only is the apparent improvement not going to be huge, but even the raw improvement on a technical level is surprisingly small. (From the PS1 to PS2 to PS3 seems to have been in the region of 20-40 times as powerful each generation. This new generation seems to be hovering at about 10-15 times more powerful at the very most...)
If I pull some numbers out of thin air, (Educated guesses based on known information, but still ultimately made up, so take them with a grain of salt), you find something like this:
Wii: 1
Xbox 360: 20
PS3: 20-30 (Its very difficult-to-work-with architecture confuses performance estimates)
Wii U: 30-60 (Weak CPU; hard to get any meaningful estimates of GPU performance)
Xbox One: ~120
PS4: ~180
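Taking those admittedly made-up figures at face value, the arithmetic behind the comparisons below looks like this. (I've collapsed the PS3 and Wii U ranges to single assumed midpoint-ish values purely for illustration, so treat the whole thing as a sketch, not a benchmark.)

```python
# The back-of-envelope figures from the list above (Wii = 1 baseline).
# PS3 and Wii U were given as ranges; single values assumed here.
perf = {"Wii": 1, "Xbox 360": 20, "PS3": 25,
        "Wii U": 45, "Xbox One": 120, "PS4": 180}

gap = perf["PS4"] - perf["Xbox One"]    # 60: about an entire best-case Wii U
ratio = perf["PS4"] / perf["Xbox One"]  # 1.5x: routine territory for PC devs
print(f"PS4 vs Xbox One: absolute gap {gap}, relative gap {ratio:.1f}x")
```

So the absolute gap looks dramatic, but the relative gap is tiny compared to what cross-platform developers handle all the time.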
Now, made up or not, even if the above numbers are about right, what does that mean in practice?
Well, put the best the Wii has to offer next to a PS3 or 360 title and, believe it or not, it can hold its own... Superficially. Of course, look at it any deeper than superficially and it becomes immediately obvious that pushing a Wii to its absolute limits will just about get you to a place the PS3/360 can more or less reach in their sleep. But that anything even superficially comparable exists should be the first warning sign that raw performance alone isn't all it's cracked up to be.
The 'next gen' stuff meanwhile, so far doesn't look that great compared to the existing generation. It's way too early to tell of course, and again we end up comparing best case stuff on the old systems to things that are easy for the newer systems...
The Xbox One and PS4, however, seem to have pretty small margins over their predecessors, especially when you consider what came before (and the Wii is already faster than anything in the generation before it).
Xbox One and PS4 are very similar overall, making comparisons between them unusually easy.
But that kind of performance gap, while it seems huge (my made-up numbers, for instance, suggest the gap between the two is as large as the best-case scenario for an entire Wii U system - not a trivial gap in absolute terms...), is still just not likely to mean much in practice.
At this point we are very heavily into the realms of diminishing returns.
As for the Wii U issue, well, only time can really tell. It seems to be stuck between generations in terms of performance. - And that's going to have some impact, but it's not as big a gap as it seems. And with a gap that size, ports are still quite feasible. That definitely wasn't the case for its predecessor. A Wii version of a cross-platform release had nothing in common with other versions of the 'same' game. And I very much doubt a repeat of that situation will arise again... (But it still might not get ports in the first place.)
Anyway, it's all just a bunch of random speculation. The Wii U is hard to predict in relation to this.
The PS4 and Xbox One are not. - They're equivalent for all practical purposes.
One is more powerful than the other, but the margin is so small and their designs so similar, it barely matters.
Anyway, I think I've spent enough time rambling incoherently. I may be an amateur game programmer who has been studying this kind of technical stuff for over 15 years, but that's no excuse for the mess I just wrote. XD
Anyway, Ignore my nonsense. But please, take Carmack seriously. The guy knows what he's on about.