My God, a system like that would beat out a PlayStation 5. If they're saying "A generation ahead of PC", they must surely mean "for the same price point", because PCs are way ahead of the hardware that is in the XBOX One/PlayStation 4. The exact same hardware and/or power that is in the PlayStation 4/XBOX One will run you about $600 right now, which is more than they're likely to sell these systems for. But that won't last long, and before too long you'll be able to afford this hardware for $300. For goodness' sake, Intel's Haswell series and the HD 8000 series are right around the corner.
Gorfias said:
He would have, but he already had two holes full, and still needed the remaining one to speak.
Grabehn said:
Oh look at the guy trying to get attention. I'm surprised he didn't talk about the Wii U to lick their boots too, since they said they were not going to develop for that platform and now they do.
Dexter111 said:
Anandtech had an interesting article on that: http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4
I clearly stated those specs ARE for the uninitiated, i.e., the things printed on the box and/or mass marketed, because that's what the XbO is.
DjinnFor said:
1. The fact that you think posting "tech specs" makes you look like one of the "initiated". Newsflash: the actual initiated know you're a poser. Come back to me when you've built a couple of rigs and have experienced firsthand how little those specs actually mean for actual performance.
The Lugz said:
I'll post a few simple tech specs for those uninitiated, and you can see the utter absurdity.
2. That you either a) didn't read the article or the press release at all, or b) didn't understand it. Apparently cutting-edge architecture design is equivalent to "flying cars" and "holodecks for less than £1k". I guess the idea that a company that sells electronics, and another that designs the most widely used operating system in the world, might know a thing or two about good hardware design escapes you. And let's not even talk about how an integrated, compact, stable platform mass-manufactured and sold at cost might have some room for cost-efficiency advantages over a rig built by combining standardized, generalized, premium-priced modules.
That's just the very definition of irony, my friend.
DjinnFor said:
Come back to me when you've built a couple of rigs and have experienced firsthand how little those specs actually mean for actual performance.
I agree. The pure specs usually don't measure up, and yes, it will perform better than the setup I mentioned, but you can't really say this dude has even a grain of truth in his statement that the consoles will outperform the highest of the high-end. We're a long way away from any console outperforming something like dual GTX Titans (which is a viable setup for super-high-end users).
Lightknight said:
snip
As it stands, there are clearly PCs more powerful. My discussion of optimization is just to explain why they're not low-end PC equivalents and are more mid-high than they'd be low-mid. My current home PC appears to be a bit better as well, maybe a lot better, but I'll wait and see.
AlwaysPractical said:
I agree. The pure specs usually don't measure up, and yes, it will perform better than the setup I mentioned, but you can't really say this dude has even a grain of truth in his statement that the consoles will outperform the highest of the high-end. We're a long way away from any console outperforming something like dual GTX Titans (which is a viable setup for super-high-end users).
Lightknight said:
snip
Adam Jensen said:
I thought that the GPU actually ships the image straight out now; it doesn't go back to the CPU. Well, at least not anymore. There's obviously the message to say 'I've sent the frame out', but the whole frame isn't copied back into system memory; it would simply take too long and take up too much RAM (imagine shipping 60 frames of 1920x1080 data into system RAM every second, it'd nom all the bandwidth). It may have been that way in the past, of course, when GPUs were less advanced and frames smaller in size (also, you'll have to forgive me, my knowledge of PC architectures doesn't extend much further back than 2008 or so, when I really started getting interested in it all after getting my first laptop), but at least nowadays I don't think that's the case.
The Comfy Chair said:
A standard PC setup handles it like this:
1. CPU explicitly copies the data to GPU memory
2. GPU completes the computation
3. CPU explicitly copies the result back into CPU memory
And here is how the PS4 unified architecture works:
1. CPU simply passes the pointer to the GPU
2. GPU completes the computation
3. CPU can read it instantly. There is no copying back to the CPU.
The only advantage the APU would have in that sense would be that the 'ok, the frame buffer can take another delivery of a frame to prepare' message would get there a bit faster.
Anandtech has an article which gives an overview of the graphics pipeline:
http://www.anandtech.com/show/6857/amd-stuttering-issues-driver-roadmap-fraps/2
Aside from the message back to the CPU to say 'moar information please', it's a one-way street.
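To make the two data flows from the lists quoted above concrete, here is a minimal sketch in CUDA. This is purely an illustration on PC hardware: the PS4's actual APIs aren't public, the kernel and function names below are made up, and CUDA's managed memory is only a rough analogue of the shared CPU/GPU pool on an APU. The first function mirrors the explicit copy-in/compute/copy-back model; the second lets the CPU and GPU touch the same allocation with no explicit copies.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Trivial kernel: double every element in place.
__global__ void scale(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

// Path 1: the traditional discrete-GPU flow.
// 1. CPU explicitly copies the data to GPU memory,
// 2. GPU completes the computation,
// 3. the result is explicitly copied back into CPU memory.
static void discrete_flow(float* host, int n) {
    float* dev = NULL;
    size_t bytes = n * sizeof(float);
    cudaMalloc((void**)&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice); // explicit copy in
    scale<<<(n + 255) / 256, 256>>>(dev, n);              // GPU computes
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost); // explicit copy back
    cudaFree(dev);
}

// Path 2: a unified-memory flow (rough PC analogue of the PS4's shared
// pool). The CPU just hands the GPU a pointer; once the kernel finishes,
// the CPU reads the same memory with no copy-back step.
static void unified_flow(int n) {
    float* shared = NULL;
    size_t bytes = n * sizeof(float);
    cudaMallocManaged((void**)&shared, bytes);
    for (int i = 0; i < n; ++i) shared[i] = (float)i;  // CPU writes
    scale<<<(n + 255) / 256, 256>>>(shared, n);        // GPU computes
    cudaDeviceSynchronize();                           // wait for the GPU
    printf("unified: shared[1] = %f\n", shared[1]);    // CPU reads directly
    cudaFree(shared);
}

int main(void) {
    const int n = 1024;
    float* host = (float*)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i) host[i] = (float)i;
    discrete_flow(host, n);
    printf("discrete: host[1] = %f\n", host[1]);
    free(host);
    unified_flow(n);
    return 0;
}

The point of the comparison is just where the copies go: on a discrete card both explicit copies cross the PCIe bus, while in the unified case the copy-back step simply disappears, which is the advantage the second list above is describing.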
If you actually go and read the piece that he wrote, rather than the bastardized version that Chalk has foisted on The Escapist to drum up nerd rage, that is precisely what he means.
Lightknight said:
2. This is the most likely: what he said was that the "electronics and an integrated system-on-a-chip (SoC)" architecture was a generation ahead of computers. Perhaps he's not talking about video cards or the CPU, but rather the way they're optimized. In which case it'd be correct.
If that's the case, then he was being entirely honest in that regard. Thanks for confirming.
Raesvelg said:
If you actually go and read the piece that he wrote, rather than the bastardized version that Chalk has foisted on The Escapist to drum up nerd rage, that is precisely what he means.
Lightknight said:
2. This is the most likely: what he said was that the "electronics and an integrated system-on-a-chip (SoC)" architecture was a generation ahead of computers. Perhaps he's not talking about video cards or the CPU, but rather the way they're optimized. In which case it'd be correct.
This whole affair is just cynical "journalists" selectively quoting someone to generate rage and, more importantly, page hits. I'd say that they should be ashamed of themselves, but I've long since come to the conclusion that they're not capable of the emotion.