A library developed by Nvidia belongs to Nvidia, and AMD doesn't run it as well? Stop the presses!
And AMD is once again making claims about Nvidia that they can't back up.
ShakerSilver said:
There are A LOT of details missing in this story, which makes it just look like AMD is name-calling.
It's not just AMD cards; anything that doesn't support Nvidia's GameWorks gets completely screwed over when it comes to optimization of GameWorks-developed features and titles. That includes all ATi cards, Intel integrated graphics, and even Nvidia cards older than the GTX 900 series. A GTX 770 outperforms the GTX 960 by simply being more powerful, but with GameWorks features activated, the 960 edges out the 770 because the 770 isn't properly optimized for those features.
To use GameWorks, devs make a deal with Nvidia that forces them to optimize these features only through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially stop supporting much of the PC market is that the deal also has Nvidia paying the devs and offering help with development. Basically, Nvidia pays devs to fully optimize games only for their cards. Hell, I doubt the devs even have a say in many of these cases, or see any of this money, as it's mostly just the suits making the money decisions and then telling the devs, "OK, you're going to use GameWorks because Nvidia paid us."
Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers for not upgrading right away. This is in complete contrast to AMD, who have made all their code open source and try to make it compatible with all hardware, even their competitors'.
Because they are.
Yes, AMD cards, Nvidia cards, everyone has problems with HairWorks, even the cards that do support GameWorks. And no, the 770 is NOT more powerful than the 960 in anything but theoretical power. Stating that simply shows a lack of understanding of hardware architecture.
Nvidia makes a new feature that is processed better on newer cards because older cards never had that technology in them. Obviously, the new cards are going to perform better with that feature. But of course there are going to be a lot of butthurt people who want new features to work on old hardware because they don't understand how technology works.
And all of this is over a feature so insignificant that most people disable it right away.
IamLEAM1983 said:
There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.
Yes, it's called the monitor refresh rate. Having more FPS than your monitor's refresh rate won't do you any good.
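To make that concrete, here's a minimal frame-cap sketch (purely illustrative Python; refresh_hz, render_frame, and present are stand-ins I made up, not any real engine's API). Any frame that finishes faster than the refresh interval just waits, because the display can't show it any sooner:

```python
import time

def run_capped(render_frame, present, refresh_hz=60):
    """Cap a game loop at the monitor's refresh rate.

    Frames rendered faster than 1/refresh_hz seconds simply wait;
    the display can't present them any earlier anyway.
    """
    frame_time = 1.0 / refresh_hz
    while True:
        start = time.perf_counter()
        render_frame()   # simulate + draw one frame (hypothetical callback)
        present()        # hand the frame to the display (hypothetical callback)
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)  # "extra" FPS becomes idle time
```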
The Lunatic said:
I find 30 FPS to be unplayable. I find it rather odd, really.
I understand that a lot of people play at 30. And therefore, it should be fine and playable, as most people seem to be able to do so.
But, when I try it, it looks so off, it's completely unresponsive and jarring to play.
It's a little frustrating, honestly.
When I was a poor kid I used to play games at as low as 15 FPS on severely outdated machines. It's playable, technically. It's not a good experience, though. Nowadays, if I can't do 60 I'd rather lower the settings; if I can't do 60 at minimum settings, I'm not even buying the game. But you can play games at very low framerates, even online shooters. It's not pleasant, but it's possible.
Adam Jensen said:
Nvidia crippled The Witcher 3 on anything other than the 970, 980 and Titan X. They used GameWorks to sell more 900 series cards. Nothing else can run the game on Ultra at 60 FPS.
False. CDPR used tessellation techniques that newer cards handle far better than older ones, which is what makes the older cards perform worse. Not all GPU capabilities improve equally from one generation to the next.
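For a rough sense of why tessellation-heavy effects punish weaker tessellation hardware, here's an illustrative back-of-the-envelope sketch (my own approximation, not CDPR's actual numbers): with uniform tessellation, the triangle count per patch grows roughly with the square of the tessellation factor.

```python
def approx_triangles(tess_factor: int) -> int:
    """Rough triangle count for one uniformly tessellated triangle patch.

    Uniform tessellation at factor N subdivides the patch into about N*N
    triangles (the exact count depends on the partitioning mode).
    """
    return tess_factor * tess_factor

# HairWorks reportedly defaults to a very high factor; forcing a lower one
# (as some AMD users did via driver overrides) cuts the geometry load hard.
for factor in (8, 16, 64):
    print(f"tess factor {factor:>2}: ~{approx_triangles(factor):>5} triangles per patch")
```

Going from factor 16 to 64 is roughly a 16x jump in geometry, which is exactly the kind of load where generations with stronger tessellation units pull ahead.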
Pinky said:
NVIDIA paid for Gameworks integration,
[Citation Needed]
deadish said:
Nvidia isn't called the "graphic mafia" for nothing.
Nvidia improves performance. The horrible mafia!
Jake Martinez said:
Honestly, it's pretty naive to think that Nvidia wouldn't do this as they did the exact same thing with Watchdogs.
The Watch Dogs claims were disproven many times, and yet some people still believe them.