CrystalShadow said:
Laughing Man said:
I wonder if this is going to be related to the graphics card conflict that's been going on for a while. The Witcher 3 PC version had such issues.
Uh no, the Witcher 3 thing was an opportunity for AMD to sling as much mud as it could at Nvidia, for what amounted to nothing more than AMD being total and utter shit when it comes to releasing updated and timely drivers for new games. Instead of looking in-house and asking why community modders were able to fix their drivers and get things like HairWorks working, it was easier to blame CD Projekt Red and Nvidia with some made-up BS about being locked out of the code and not having access to the final game code.
Batman, on the other hand, is just a game that was built for consoles and has undergone a totally piss-poor PC conversion, hence the universally poor performance on all rigs irrespective of the hardware it is running on.
What a laughable state of affairs it is that games on PC are so poorly coded that the people who write graphics drivers basically have to ship a unique driver-side hack for every major game release just to get it to work properly.
It seems like a rather absurd situation, but it's true.
Game devs no longer know how to code in a way that is in any way optimal for the hardware (or, sometimes, even in a way that follows basic API requirements). And in the rush to fix this and make their hardware look better, the driver developers have introduced so many hacks and alternate code paths that even if you did kind of know how to do it right, you'd still mess it up, because you can't predict which particular byzantine combination of hacks and workarounds your code might trigger, and thus can't be in any way sure what will happen.
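To give a purely hypothetical flavour of the kind of thing I mean, here's a sketch of a pattern that is perfectly legal per the OpenGL spec but awful for the hardware, and exactly the sort of thing a driver ends up special-casing per game (the function and its context are made up, and it assumes an already-created GL context):

```cpp
// Hypothetical sketch: spec-legal OpenGL that is pessimal for the hardware.
// Assumes an existing OpenGL context; not tied to any real game.
#include <GL/gl.h>
#include <cstddef>
#include <cstdint>
#include <vector>

// Imagine this being called at the end of every frame, e.g. for colour picking.
void end_frame_readback(int width, int height)
{
    std::vector<std::uint8_t> pixels(static_cast<std::size_t>(width) * height * 4);

    // Synchronous read-back: the driver has to drain the entire GPU pipeline
    // before this call returns, so the CPU and GPU stop working in parallel.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels.data());

    // The friendlier way is an asynchronous read through a pixel buffer object,
    // collected a frame or two later. The driver can't rewrite this for you,
    // so instead it grows per-game heuristics and alternate code paths.
}
```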
http://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019
Just read that for a second, and tell me that sounds like a healthy situation?
Seems we really need Vulkan/Mantle/whatever... Though then you might just find out how bad at graphics programming many devs actually are...
To be fair to developers (and speaking as someone who works on the guts of a game engine every day), it's less "no longer know how" and more like "no one ever knew, and you cannot know it now".
It's been this way for a VERY long time now. Back with the advent of hardware acceleration, and then OpenGL and DirectX, the graphics APIs were designed to make it easier to get things to render the way you wanted. However, the hardware manufacturers - as per your linked story - pretty much just hacked things into the drivers, and there's really not much the API creators can do about that. They just call into the drivers and hope for the best.
It then gets compounded further when the APIs themselves have their own shenanigans - DirectX tends to be better about this than OpenGL, largely because DirectX will flatly deprecate stuff, while OpenGL has the millstone of legacy support around its neck - where you end up with two functions that do the same thing, but one of them has terrible performance on modern hardware and there is no documentation telling you this.
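A toy illustration of that, since it's easier to see than to describe. Both of these draw the same triangle and both compile happily against a compatibility-profile context (with the buffer functions loaded via something like GLEW); nothing in the headers or docs tells you the first one falls off a performance cliff on modern hardware:

```cpp
// Toy example only: two OpenGL paths that render the same triangle.
// Assumes a compatibility-profile context and a loader such as GLEW.
#include <GL/glew.h>

// Legacy immediate mode: one call per vertex, with the driver buffering it all
// behind your back. Harmless for one triangle, ruinous for a real scene.
void draw_triangle_immediate()
{
    glBegin(GL_TRIANGLES);
    glVertex3f(-0.5f, -0.5f, 0.0f);
    glVertex3f( 0.5f, -0.5f, 0.0f);
    glVertex3f( 0.0f,  0.5f, 0.0f);
    glEnd();
}

// Buffer-object path: vertex data lives in GPU memory (uploaded elsewhere),
// and one draw call submits the lot. Same API, same output, very different cost.
void draw_triangle_buffered(GLuint vbo) // VBO assumed to already hold 3 vertices
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, nullptr);
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
}
```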
The final layer, on top of the drivers and the API, is the developer's own code, and this depends on the engine chosen. Pre-existing engines tend to be fairly stable, but can be a huge undertaking to optimize further if you need to. Rolling your own engine ensures the code does exactly what you want it to do, but it's a huge time and money sink that most companies will avoid.
Compounding this further is when the developer picks an engine whose source is restricted or completely locked off, like Unity. This is why Unity has a reputation for shoddy performance: the developers don't have source access, so they can't fix anything that comes up during development, and they just have to hope it's good enough to work at launch.
So that's 3 layers of potential obfuscation that actively get in the way of good performance.
With DX12, Vulkan, and Metal (for OS X, as Apple's doing their own thing. Again.), developers get VERY explicit control over two of those layers. So in theory, the only reason something would perform like trash is because the developers are either using an existing engine that implements them poorly, or they wrote it themselves and it's terrible.
So even if we find out most graphics programmers are complete and utter garbage, at least we've mitigated two layers of the overall problem, and we can actually learn how to not be terrible going forward.
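For a sense of what "explicit control" means in practice, here's a minimal Vulkan sketch; it only creates an instance and lists the GPUs (you'd need the Vulkan SDK to build and link it, and a real engine does vastly more), but even at this trivial level nothing happens unless the application spells it out, where older APIs buried all of this inside the driver's context creation:

```cpp
// Minimal Vulkan sketch: instance creation and physical device enumeration.
// Build against the Vulkan SDK (link with the vulkan loader library).
#include <vulkan/vulkan.h>
#include <cstdint>
#include <cstdio>
#include <vector>

int main()
{
    // The application, not the driver, states who it is and which API version it targets.
    VkApplicationInfo app_info{};
    app_info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app_info.pApplicationName = "explicit-control-demo";
    app_info.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo create_info{};
    create_info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    create_info.pApplicationInfo = &app_info;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&create_info, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    // Enumerate the physical devices explicitly; nothing is picked on your behalf.
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> devices(count);
    vkEnumeratePhysicalDevices(instance, &count, devices.data());

    for (VkPhysicalDevice device : devices) {
        VkPhysicalDeviceProperties props{};
        vkGetPhysicalDeviceProperties(device, &props);
        std::printf("GPU: %s\n", props.deviceName);
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```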