igor2201 said:
@Lightknight:
You may not know this but AMD has been making very similar APUs without the gddr5 for use in tablets, so it really is underpowered all things considered. There are also other considerations to account for, like the fact that once again the psu is inside the console, meaning that running the APU under full load could have unintended consequences, such as excess heat.
That's not entirely true about the APU. From the articles I've read, it's basically an HD 7870 successfully implemented in silicon that may perform, at worst, like an HD 7850, which is NOT "underpowered" even now. It isn't a Titan, sure. But since when did anything but the top of the line qualify as "underpowered"? The CPU is indeed what you've been seeing in laptops and tablets, but the GPU is not. The entire point of this particular APU is that it's extremely bandwidth sensitive and scales well with higher-bandwidth RAM. So, once again, I've got to stress that while yes, the CPU is certainly nothing special in terms of processing power, that no longer matters as much as people think. It's still 8 cores and decent enough to handle the tasks it's actually supposed to. Please remember, it was the available RAM that the developers were using up, not the full CPU.
Here, this article did the best reverse engineering job on it I've seen.
http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs
So when you're talking about a chip that scales well with faster RAM, saying that it has appeared in laptops or tablets really isn't saying all that much, especially when developers aren't really writing software that hits the CPU with anything but basic process direction. There was a day when CPUs were the workhorse. Yeah, that was less than two decades ago. But that day has passed, with significantly cheaper and more specialized work to be found in RAM and GPUs. I think Sony made the right call on price/performance there.
But really, the final stone in the "hitting the roof" bucket is that the game doesn't run stably; it suffers from frame rate drops, sometimes really bad ones depending on what's happening on screen. And that's in a game that only populates its NPCs when the draw distance is cut down drastically, and that WAS built from the ground up with gen-8 PS4 exclusivity in mind and direct support from the engineers who designed the boards and specs. Though I don't really think we are at odds, and it has been fun talking with you.
I should mention that while I'm not an expert on hardware, I am part of development cycles, particularly QA engineering and testing. There are a lot of reasons why a game could fail to run in a stable state on a machine that should be capable of rendering the environment. As the available processing power increases, there's a significant temptation to let less-than-optimal code slide by in the development process. The opportunity for bugs increases as the complexity of the environment increases (think of Bethesda games that create massive worlds in minute detail; their games are glitchy because they are massive and complex, not because of any overt faults in the developers). There's generally a tradeoff between complexity and scale of environments, and once we get to the point where both are possible, the demands on QA and developer turnaround on bugs skyrocket. Especially on a new graphics engine.
The jump from being limited to less than 512 MB to having 4.5 GB alone is going to present significant overhaul issues too. The first engine you create isn't going to be perfect. The next one will get a lot better, and the last one will be darn near perfect.
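Just to put a number on that jump, here's a rough back-of-the-envelope (the 512 MB figure is the previous generation's total RAM, and 4.5 GB is what's reportedly available to PS4 games; treat both as approximations):

```python
# Rough scale of the generational memory jump discussed above.
ps3_game_ram_mb = 512          # prior-gen total RAM (approximate)
ps4_game_ram_mb = 4.5 * 1024   # ~4.5 GB reportedly available to PS4 games

ratio = ps4_game_ram_mb / ps3_game_ram_mb
print(f"Available RAM grew by roughly {ratio:.0f}x")  # roughly 9x
```

That's why I keep saying the first engines of this generation are going to leave a lot on the table: nobody tunes for an order-of-magnitude jump on the first try.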
This is the biggest difference between generations. The creation of new graphics and physics engines that are well suited for those environments. What's really exciting is that this will likely mean that Indie developers are going to start getting their hands on new tools as well (since they're usually about a generation behind unless Valve throws them a bone).
So the final stone, or whatever, is when the appropriate actions have been taken and it's still not enough: you have to scale back significant things rather than just reducing the physics processing (which, as I said, can easily scale across all available cores). We aren't close to that point. x86 isn't new, but creating dedicated engines for this much hardware is. Games that require more than 4 GB of RAM are only just starting to come out (Watch Dogs), and those aren't particularly optimized either, as we clearly saw.
@ everyone else:
All I am really saying is that NO ONE is in any position to claim Ubisoft is gimping anything with their new target of 900p@30fps.
If they can get it to run at that rate on the XBO, then they should be capable of at least rendering at 1080p on the PS4. We do know that the PS4 is 50% more powerful than the XBO, even before counting possible optimizations available to a console that has so much GDDR5 and has managed to reduce its latency enough to serve as both video RAM and regular ol' system RAM.
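The pixel math backs this up, by the way. This is just a sanity check using the standard 1600x900 and 1920x1080 resolutions, with everything else (frame rate, effects) assumed equal:

```python
# Compare the pixels pushed per frame at 900p vs 1080p.
pixels_900p = 1600 * 900     # 1,440,000 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1080p / pixels_900p
print(f"1080p pushes {ratio:.2f}x the pixels of 900p")  # 1.44x
# So a ~50% raw-GPU advantage more than covers the ~44% extra
# pixel load of stepping up from 900p to 1080p.
```

That's why "runs at 900p/30 on XBO" implies "should manage 1080p on PS4" if the extra power is actually being used.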
My guess is that Ubisoft built an x86 graphics engine with both consoles in mind to kick off the next gen and made minor modifications for each console's specific hardware. If that's the case, then they are really just getting started. You can't expect a company that releases its games on every available console to cater to one system so specifically right out of the gate. I think Ubisoft was telling the truth: creating a universe in an environment with this many more resources was something they had to work up towards. Again, in the console arena this is roughly a tenfold jump in resources. The next jump might be fivefold (barring huge breakthrough advances in computing), and that will also be huge, because even a tapering exponential curve is still a big jump when you're starting from large values.
But seeing as Ubisoft is basically EA's little brother when it comes to how they treat the customer, I'm not really opposed to people accusing them of actively deciding not to cater to the PS4 specifically just yet. That's the most likely explanation, since they didn't cite limitations of the consoles as the factor. I don't think that necessarily makes them bad. I just think they're making a calculated business decision to start out small, see what other companies do well in their graphics engines, and then implement the good processes themselves when they go more platform-specific. The game is still going to look great.