AMD: Nvidia "Completely Sabotaged" Witcher 3 Performance on Our Cards

Steven Bogos

The Taco Man
Jan 17, 2013
9,354
0
0

AMD's Richard Huddy says Nvidia's new "HairWorks" technology completely sabotaged The Witcher 3's performance.

The eternal mudslinging between graphics card giants AMD and Nvidia continues today, with AMD claiming that Nvidia deliberately sabotaged the performance of the recently-released The Witcher 3 [http://www.escapistmagazine.com/tag/view/the%20witcher%203%20wild%20hunt?os=the+witcher+3+wild+hunt] on AMD cards.

"We've been working with CD Projekt Red from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned," AMD's chief gaming scientist Richard Huddy told Ars Technica [http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/1/].

"We were running well before that," he added. "It's wrecked our performance, almost as if it was put in to achieve that goal."

Additionally, despite Nvidia's claims that developers have free rein over its technologies' source code, Huddy says that this was not the case with CD Projekt Red and HairWorks. "I was [recently] in a call with one of the management at CD Projekt," said Huddy, "and what I heard then was that they didn't have access to the source code."

HairWorks, as the name would suggest, is a technology that allows games to simulate and render fur and hair (including Geralt's dynamically growing beard [http://www.polygon.com/2015/3/24/8282437/the-witcher-3-geralt-dynamic-beard-growth-over-time]), much like AMD's own TressFX did in Tomb Raider.

It's also not just AMD cards that are suffering performance drops from HairWorks, as reports are coming in that even Nvidia's GTX 980 drops from 87.4 FPS to 62.2 FPS with the feature turned on.
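For perspective, raw FPS deltas can understate what a feature actually costs the GPU; converting to per-frame render times makes it clearer. A quick sketch of that arithmetic, using the GTX 980 figures reported above (the conversion is just t = 1000 / fps, in milliseconds):

```python
# Convert reported FPS figures into per-frame render times to see
# the real cost of HairWorks. Numbers are the GTX 980 benchmarks
# quoted in the article (87.4 FPS off, 62.2 FPS on).
def frame_time_ms(fps):
    """Milliseconds the GPU spends on one frame at a given FPS."""
    return 1000.0 / fps

fps_off, fps_on = 87.4, 62.2
cost = frame_time_ms(fps_on) - frame_time_ms(fps_off)

print(f"HairWorks off: {frame_time_ms(fps_off):.1f} ms/frame")  # 11.4 ms
print(f"HairWorks on:  {frame_time_ms(fps_on):.1f} ms/frame")   # 16.1 ms
print(f"Feature cost:  {cost:.1f} ms per frame")                # 4.6 ms
```

In other words, a 25-FPS drop here means the card spends roughly 40% more time on every frame, which is why the same feature can push weaker cards from "smooth" to "slideshow".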

Source: Ars Technica [http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/2/]

 

CardinalPiggles

New member
Jun 24, 2010
3,226
0
0
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.
 

Remus

Reprogrammed Spambot
Nov 24, 2012
1,698
0
0
Sounds like a bad deal all around. Hair simulation has always been a sticking point with PC graphics and I don't see that changing anytime soon.
 

Pinky's Brain

New member
Mar 2, 2011
290
0
0
CardinalPiggles, they can't send in people to fix it ... which is what NVIDIA did with TressFX. The two situations are fundamentally different (which is not to say what NVIDIA is doing is inherently wrong).
 

Ragsnstitches

New member
Dec 2, 2009
1,871
0
0
I've got a 780, and even running HairWorks on Geralt only can knock my frames down to as low as 20fps. Considering I can run the game on ultra at 60fps (though it's smoother and less jittery when locked to 30 for me), that's a massive penalty for such a tiny bit of extra flash. Don't get me wrong, it's pretty nice to look at, and seeing some of the combat vids with it enabled shows how it really shines, but it's more of a novelty really.

Anyone who shelled out for a 970 or 980 thinking it would drastically affect the quality of the game must be feeling kind of burned on this. The difference is noticeable when it's on, but it's not significant enough to warrant such costs. Though I imagine playing this game at a smooth 60 FPS at 2K or 4K is a nice perk.

I'm pretty sure we won't see many games pushing that feature in the near future, considering the resources it demands. Maybe when 970- and 980-series cards become the norm this feature will have some real relevance, but as of now it's just for bragging rights and that's about it.

Game still looks gorgeous without it on.
 

ShakerSilver

Professional Procrastinator
Nov 13, 2009
885
0
0
There are A LOT of details missing in this story; as written, it just looks like AMD is name-calling.

It's not just AMD cards: anything that doesn't support Nvidia's GameWorks gets completely screwed over in titles built on GameWorks features. That includes all AMD/ATi cards, Intel integrated graphics, and even Nvidia cards older than the GTX 900 series. A GTX 770 normally outperforms a GTX 960 by being simply more powerful, but with GameWorks features activated the 960 edges out the 770, because the 770 isn't properly optimized for these features.

To use GameWorks, devs make a deal with Nvidia that forces them to optimize these features only through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially drop support for much of the PC market is that the deal also has Nvidia paying the devs and offering development help. Basically, Nvidia pays devs to fully optimize games only for their cards. Hell, I doubt the devs even have a say in many of these cases, or see any of this money; it's mostly just the suits making money decisions and then telling the devs "OK, you're going to use GameWorks because Nvidia paid us."

Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers for not upgrading right away. This is in complete contrast to AMD, who have made all their code open-source and try to make it compatible with all hardware, even the competition's.
 

RicoADF

Welcome back Commander
Jun 2, 2009
3,147
0
0
Sounds like a shitty system that doesn't work well for anyone, less sabotage and more failure for Nvidia.

ShakerSilver said:
This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.
That's what I like about AMD, their philosophy is far more consumer friendly and less greedy all around.
 

Baresark

New member
Dec 19, 2010
3,908
0
0
That is the second accusation this week... it's just starting to feel manufactured. I call bullshit on an optional feature fucking up your performance. It fucks up everyone's performance.
 

Imperioratorex Caprae

Henchgoat Emperor
May 15, 2010
5,499
0
0
What's really messed up about all this is that these two companies compete over stupid shit that should be left up to the developers. The cards themselves should render whatever, and the developers should have the tools to make their product work with ANY card with ALL the features enabled. Instead, NVIDIA has PhysX, and held up development on SLI for years after buying out 3dfx... I don't like NVIDIA very much, and have been an AMD guy for a long time, even before they bought into the GFX development area.
I don't necessarily need these hair features, but it's kind of sad that GPU devs are doing crap like this to take things away from end users (especially the ones who don't know tech very well). NVIDIA, like Intel, is overpriced for what you get; AMD has always offered quality for the price IMO, and I've always been happy with their technology, having owned both sides.
I've got my opinions and biases and I'm not afraid to say it. I owned NVIDIA before they were big, and even then I always had issues with their cards. ATI/AMD has, in my experience, had significantly fewer issues with their hardware. Though I may not get ALL the performance that NVIDIA gets, I'm also left with money in my pocket to put toward other things I want, just like buying an AMD processor over an overpriced Intel processor.
I've been building PCs since before there were 3D accelerator cards, and I've always found AMD to somehow work better in the long run, last longer, and leave me feeling I made the right choice as a consumer. The Intel processors I've bought have crapped out far quicker; they've either underperformed or I've had to RMA the chip (I've never RMA'd an AMD in my experience, and I've bought hundreds as a professional PC builder). My bias is prevalent, but it's not without experience. Same with NVIDIA cards: TNT cards had driver issues back in the day, though at the time ATI was hit or miss until AMD bought them out. There have been some driver issues with AMD cards, but they've been fixed relatively quickly, and I'm currently running the beta Catalyst drivers with no issues in any of my games. I'm also using the AMD Turbo function on my 8-core, which boosts the clock speed (watercooling is so awesome). Had it up to 7.2GHz, but I'm unsure how accurate the reporting is... the Core Temp program said it was operating at that speed, so I'm inclined to believe it.
 

Bat Vader

New member
Mar 11, 2009
4,996
0
0
RicoADF said:
Sounds like a shitty system that doesn't work well for anyone, less sabotage and more failure for Nvidia.

ShakerSilver said:
This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.
That's what I like about AMD, their philosophy is far more consumer friendly and less greedy all around.
Plus AMD doesn't tend to make their cards as expensive as hell. They're affordable and, in my opinion, just as good as what Nvidia offers. I'm running an XFX HD 7970 with The Witcher 3 on Ultra settings and the game runs great. I want to get two R9 290Xs later this year and crossfire them.
 

Alfador_VII

New member
Nov 2, 2009
1,326
0
0
AMD are probably right on this, but of course you can turn off HairWorks.

Basically, whenever you see a video card maker's logo on a game, someone's being a huge arsehole about it. AMD didn't get access to the final code until too close to release to make a proper driver for it, even without the Hair nonsense.

The same happens the other way round with games made in association with AMD.
 

WNxSajuukCor

New member
Oct 31, 2007
122
0
0
Anyone figure the feature is also a bit of future-proofing for DX12? You'd think this would be something DX12 was made for.
 

IamLEAM1983

Neloth's got swag.
Aug 22, 2011
2,581
0
0
CardinalPiggles said:
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.
This, plus the fact that the game is still perfectly playable with HairWorks enabled. 30 FPS is completely playable, despite what some more zealous PC enthusiasts than I might say. If I hit 60 with any game, I'm living the dream. There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.
 

Steven Bogos

The Taco Man
Jan 17, 2013
9,354
0
0
IamLEAM1983 said:
CardinalPiggles said:
You can turn it off. Same with TressFX.

Then there's also the fact that it ruins the framerate on Nvidia cards as well.

Go home AMD, you're drunk.
This, plus the fact that the game is still perfectly playable with HairWorks enabled. 30 FPS is completely playable, despite what some more zealous PC enthusiasts than I might say. If I hit 60 with any game, I'm living the dream. There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.
You're entitled to your opinion, but I completely disagree. 30 FPS is "playable", yes, but 15 FPS was "playable" back in the N64 days. I would much, much rather have a game run at 60 FPS in native resolution with graphical bells and whistles (like HairWorks) turned off. More than anything I value a smooth gaming experience, and the difference between 30 and 60 FPS is like night and day.

I do, however, agree that anything above 60 FPS isn't really necessary.
 

The Lunatic

Princess
Jun 3, 2010
2,291
0
0
I find 30 FPS to be unplayable. I find that rather odd, really.

I understand that a lot of people play at 30, so it should be fine and playable; most people seem to manage it.

But, when I try it, it looks so off, it's completely unresponsive and jarring to play.

It's a little frustrating, honestly.

Anyway, AMD blaming Nvidia for this? That seems a bit childish.

It's CD Projekt Red's product. It is up to them to ensure it works on everyone's systems as best as possible. If they choose to use software that, when tested, causes a significant decrease in performance for those on a certain manufacturer's graphics card, the buck absolutely stops with them.

If Nvidia spends money, time and effort on the development of new technologies, I see no reason they're required to release it for free.
 

Valok

New member
Nov 17, 2010
141
0
0
Considering my OC'd GTX 970 (i7 3770 @ 4.3GHz, blablabla, it should have been "enough") is having serious problems keeping 40 FPS (hair, shadows, some other stuff = off), I really don't understand what they're on about. Performance in this game (for those who care about 60 FPS, which should be a pretty fucking large group around these parts) is a problem all around right now.

Unless, of course, high-end AMD cards are getting sub-15 FPS. Is that happening?
 

TT Kairen

New member
Nov 10, 2011
178
0
0
My 780 Ti is running the game on Ultra with HairWorks off at 60fps, not 100% consistently, but most of the time. With HairWorks on, it goes to around 30. If AMD cards are having problems, that's kind of just a problem with AMD cards sucking. You get what you pay for.
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
The articles on this are poorly worded; they don't make it clear whether the active hair feature made things worse, or whether performance got shot the moment that API was added to the game. You need to be very clear on that when you make stuff like this known, otherwise you look quite foolish. Especially since AMD's own TressFX feature, when running, caused havoc on all but modern AMD cards.

But I have no trouble believing that the mere API addition nukes the opposition; it has been known for some time that GameWorks stuff only ever runs properly on new Nvidia cards. That drives sales very specifically to their latest models, which is evil genius right there; why wouldn't they do that, given the opportunity...
Better still, as they update GameWorks for newer games, they can keep pushing everyone else's stuff out of the market, including their own older cards.
 

lancar

New member
Aug 11, 2009
428
0
0
It's also kinda funny how GeForce Experience, when set to optimize The Witcher 3, automatically chooses to turn HairWorks off, even on my GeForce 780.