The HairWorks Debacle Explained

Shamus Young

New member
Jul 7, 2008
3,247
0
0
The HairWorks Debacle Explained

Who's to blame for the Witcher 3 HairWorks debacle? Shamus says there's plenty of blame to go around.

Read Full Article
 

CrazyBlaze

New member
Jul 12, 2011
945
0
0
Let's not forget that HairWorks barely worked even on Nvidia's own hardware at launch. It was such an FPS killer that people with GTX 980s were complaining about it.
 

Kenjitsuka

New member
Sep 10, 2009
3,051
0
0
Very good analysis, Shamus! And it did cripple NVIDIA cards too, just to a far lesser degree. I do believe the hit to AMD was intentional, but you can't deny the performance cost sucked all around for such a stupidly small feature...

Makes you wonder how much the hardware companies are now gonna PAY to have their middleware included, or WAY WORSE, to be the only middleware included, excluding the competition's software....... :(
 

Glodenox

Eternally tweaking things
Mar 29, 2011
13
0
0
Also interesting to add: TressFX Hair, on the other hand, is open source, which allowed Nvidia to optimize for it on their own cards as well. I can't see why any sane developer would choose the HairWorks solution unless there's a bag of money involved.
Note: at the time The Witcher 3 was in development, I don't think version 2.0 of TressFX was available yet, and that's the release that closed the quality gap between the two systems.
 

Tiamat666

Level 80 Legendary Postlord
Dec 4, 2007
1,012
0
0
CrazyBlaze said:
Let's not forget that HairWorks barely worked even on Nvidia's own hardware at launch. It was such an FPS killer that people with GTX 980s were complaining about it.
Indeed. I have a GTX 980 and have turned HairWorks off. I get 60 FPS most of the time even with HairWorks on, but that completely expendable feature isn't worth putting that kind of strain on my graphics card. With HairWorks the card is always hitting 80°C with the fan at 80 to 100%. Turn it off and the card runs at 60° to 70° and the fan stays quiet.

Really, I don't see what all the fuss is about. Just turn the darn thing off. The default hair looks good enough and also waves around in the wind.
 
Jan 12, 2012
2,114
0
0
I'm not a PC guy, nor am I a Witcher customer, but it's good to know that people like Shamus are keeping tabs on shady stuff like this before it spills over into something that directly affects me.
 

Codey

New member
Sep 9, 2009
12
0
0
From what I've read, CD Projekt Red worked closely with Nvidia for two years on integrating their technology. AMD had access to the game and the team that whole time, but largely didn't use it.

I don't remember exactly where I read it (possibly here on The Escapist), but according to CD Projekt Red, communication from AMD went silent for months before release. When AMD finally came to offer TressFX and help with it, it was too late. So it is their own fault.

I am a proponent of open source and information sharing, but I understand the desire to have an edge at something. Tomb Raider had TressFX and Nvidia [CENSORED]. Now it's AMD's turn.

So in conclusion, AMD sat on their [CENSORED] while Nvidia were forthcoming, and AMD's users got burned.
The fight between the two continues, with neither of them learning anything.
 

Ishigami

New member
Sep 1, 2011
830
0
0
I'm displeased by this piece.
You acknowledge that you have no evidence to suggest that Nvidia deliberately sabotaged performance on AMD graphics cards, yet you go on to declare Nvidia guilty of doing so.
You willfully ignore Nvidia's arguments for why the performance difference might actually be plausible (differences in tessellation performance), and the fact that HBAO+, another GameWorks feature, works fine [http://www.computerbase.de/2015-05/witcher-3-grafikkarten-test-benchmark-prozessoren-vergleich/3/].
You have overlooked that AMD, despite allegedly having worked for months [http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/] before release on optimizing the game, failed to ship the lower-tessellation workaround [http://wccftech.com/witcher-3-run-hairworks-amd-gpus-crippling-performance/] in a driver profile for The Witcher 3 to boost HairWorks performance. A fan found that workaround a mere two days after release. This also indicates that Nvidia might actually be telling the truth that Maxwell simply outperforms Hawaii when it comes to this particular tessellation task.
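For anyone wondering what that workaround actually buys you, here is a rough back-of-the-envelope sketch in Python. The numbers are assumptions for the sake of the example, not measurements: HairWorks was reported to request up to 64x tessellation, the fan workaround caps it at 16x or 8x in the Catalyst profile, and the guide-hair count is made up purely for illustration.

def tessellated_segments(guide_hairs: int, tess_factor: int) -> int:
    # Isoline tessellation multiplies each guide hair's segment count
    # roughly by the tessellation factor.
    return guide_hairs * tess_factor

GUIDE_HAIRS = 10_000  # hypothetical count, illustration only
full_load = tessellated_segments(GUIDE_HAIRS, 64)

for factor in (64, 16, 8):
    segments = tessellated_segments(GUIDE_HAIRS, factor)
    print(f"{factor:>2}x tessellation -> {segments:>8,} segments "
          f"({segments / full_load:.0%} of the 64x geometry load)")

The geometry the GPU has to generate and shade scales roughly linearly with the cap, which is why a single profile setting recovers most of the lost performance with little visible difference.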
I also dislike that I have to take your word for everything.
When you say HairWorks crippled AMD cards, a link to a benchmark would have been nice. Not that I doubt it.
You point out that CDPR didn't have access to the source code (again, a citation would have been nice) but fail to address that Nvidia does in fact offer source-code access licenses [http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy/2]. So it is not as if Nvidia locks developers out on purpose.
It would also have been nice if you had interviewed CDPR about the topic. You see, AMD claims the HairWorks code arrived two months [http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/] before release, yet there are videos of The Witcher 3 dating back to August 2014 [https://www.youtube.com/watch?v=5DvVxbmu3OU] demonstrating HairWorks.
Afaik Nvidia had been working with CDPR for two years to get HairWorks into the game. Expecting TressFX to make it in when it was offered two months prior to release is ridiculous; yes, it was too late.
And point 5 seems to be a lie, since AMD admitted to having worked with CDPR on The Witcher 3 since the beginning [http://arstechnica.co.uk/gaming/2015/05/amd-says-nvidias-gameworks-completely-sabotaged-witcher-3-performance/1/], so no, AMD was not out of the loop.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
Yeah, it's worth remembering (I've mentioned this before, but it bears repeating):
http://www.dsogaming.com/news/ex-nvidia-driver-developer-on-why-every-triple-a-games-ship-broken-multi-gpus/

Here's a quote from the forum post this article shows part of (the forum post itself is: http://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019)

The first lesson is: Nearly every game ships broken. We're talking major AAA titles from vendors who are everyday names in the industry. In some cases, we're talking about blatant violations of API rules - one D3D9 game never even called BeginFrame/EndFrame. Some are mistakes or oversights - one shipped bad shaders that heavily impacted performance on NV drivers. These things were day to day occurrences that went into a bug tracker. Then somebody would go in, find out what the game screwed up, and patch the driver to deal with it. There are lots of optional patches already in the driver that are simply toggled on or off as per-game settings, and then hacks that are more specific to games - up to and including total replacement of the shipping shaders with custom versions by the driver team. Ever wondered why nearly every major game release is accompanied by a matching driver release from AMD and/or NVIDIA? There you go.

The second lesson: The driver is gigantic. Think 1-2 million lines of code dealing with the hardware abstraction layers, plus another million per API supported. The backing function for Clear in D3D 9 was close to a thousand lines of just logic dealing with how exactly to respond to the command. It'd then call out to the correct function to actually modify the buffer in question. The level of complexity internally is enormous and winding, and even inside the driver code it can be tricky to work out how exactly you get to the fast-path behaviors. Additionally the APIs don't do a great job of matching the hardware, which means that even in the best cases the driver is covering up for a LOT of things you don't know about. There are many, many shadow operations and shadow copies of things down there.
It makes for fascinating reading, but it also shows just how absurdly broken the game development + graphics driver situation on PC actually is. I mean, just look at the first line: Nearly every game ships broken.

That sounds completely absurd, but apparently, that's normal, everyday life for the people that have to write graphics drivers.
Game devs ship broken games, the driver devs pick up the pieces.

Is it any wonder you get all kinds of weird vendor specific quirks in a situation like this?
 

UrinalDook

New member
Jan 7, 2013
198
0
0
I get that this is summarising the issue for an audience that is perhaps less up to speed on the intricacies of GPU politics, but it seems like you've missed or at least brushed over some of the key points in the debate. In the interests of disclosure, I'm an AMD GPU user so you can judge if you think there's an element of bias in this.

TressFX and Hairworks are competing technologies with a similar, if not outright identical function and are made by AMD and nVidia respectively, yes. However, the fact that TressFX is open source and Hairworks is not is a huge deal. There are basically three stages of optimisation for PC games: how the developer chooses to optimise their code broadly across all systems, how the GPU manufacturers are able to optimise the link between this code and their specific GPUs via drivers, and how the end user is able to optimise for their specific setup using in game options, .ini file tweaks and third party post-processing tools.

It's the middle step here that's important. nVidia are far, far better than AMD at releasing drivers. Often, they will release a new driver after a specific big release (eg. GTA V and, naturally, The Witcher 3) with the goal of further optimising the game for their cards, often only days after the game comes out. Full credit to them for that. TressFX, AMD's flowing hair tech, is open source. This means nVidia can use the source code for TressFX to optimise its use in a game for nVidia cards and release this optimisation in a driver. Ultimately, this means TressFX can be used on either brand of GPU with little difference in performance, as Tomb Raider showed.

nVidia, however, have not fully released the source for Hairworks. This means AMD are unable to release a driver to help optimise Hairworks functions on their cards. My point here is that it doesn't matter whether CDPR knew about performance differences between the two brands at launch; what matters is that they had to have known that if there were differences, there was not a damn thing AMD or AMD users could do to fix them. Likewise, they had to have known that, TressFX being open source, this would not have been an issue if they had used it instead (at least, not to anywhere near the same extent).

The end result of my argument and Shamus' is basically the same: a lot of the blame also needs to be laid at CDPR's door. But the way the article comes across is that, to appease everyone, CDPR would have had to include both Hairworks and TressFX in their engine, and that both AMD and nVidia are holding middleware to ransom at the expense of the consumer in an attempt to differentiate themselves. This is not the case. TressFX would have worked fine on either brand of card.

Shamus is bang on the money to be worried about GPU manufacturers shaping games to give their platform an edge, but I wanted to stress that - for the moment at least - this shady shit is the sole domain of nVidia.

Oh, and just for the record: Hairworks is but one of several features bundled under the nVidia Gameworks package. We can probably expect to see more of this shit in the future.
 

Hoplon

Jabbering Fool
Mar 31, 2010
1,840
0
0
Shamus Young said:
The HairWorks Debacle Explained

Who's to blame for the Witcher 3 HairWorks debacle? Shamus says there's plenty of blame to go around.

Read Full Article
I think there's a way simpler answer to this question: CD Projekt Red got money for going Nvidia-only; it's why the game gets bundled with their cards. That wasn't done for free. And it came with the caveat that they couldn't also include any tech AMD were using, other than the stuff common to both sets of cards.

Sounds sinister, I know, but I think it's just a fact of life as an independent studio: if someone totally legitimate offers you cash towards development, you take it.

Also, apparently TressFX is open source, so Nvidia not using it is them being sulky children, much like the Adaptive-Sync standard in DisplayPort 1.2a that AMD got on board with and Nvidia haven't, because they want everyone to use their much more expensive G-Sync solution that AMD can't use.
 

AT God

New member
Dec 24, 2008
564
0
0
I haven't played The Witcher 3 yet, but the only hair-rendering middleware I've ever dealt with was TressFX in Tomb Raider. And it might be worth noting that on my NVIDIA 680, TressFX killed performance in that game. I could run it on full detail at 60 FPS except when TressFX was on. Maybe a 980 is recommended for TressFX, but the fact that AMD's software crippled my NVIDIA card makes me think that neither side is actively sabotaging anyone. It's just that their various bits of software are so sketchy that each company has to make tweaks to get them to work on its own cards, and without access to the source code the opposing side has almost no chance of getting the competitor's version to work.

Again, I haven't played The Witcher 3, so I don't know if HairWorks is vital to the experience, but I had no problem playing Tomb Raider with TressFX off and didn't feel my experience suffered at all. If HairWorks were integral to the game working and not something that could be toggled off, I could understand the anger, but if it is optional, and assuming that having it off doesn't make the hair look notably awful, I don't mind. I would much rather have averagely rendered hair and 60 FPS than silly dynamic hair and a crap framerate.
 

Ishigami

New member
Sep 1, 2011
830
0
0
UrinalDook said:
Nvidia is under no obligation to grant AMD access to their source code. They prohibit any developer from disclosing the source to AMD even if that developer has access, and this is an absolutely reasonable stance.
However, there are licenses available [https://www.youtube.com/watch?v=aG2kIUerD4c&t=4m8s] that grant developers access to GameWorks source code, and Nvidia does not prevent developers from optimizing the code for AMD [http://www.youtube.com/watch?v=aG2kIUerD4c&t=36m10s] hardware.

Nvidia develops this middleware and offers it for free [https://developer.nvidia.com/gameworks-registered-developer-program] to any developer. Developing the middleware, however, is not free. Of course Nvidia wants to get its investment back by pushing the envelope of visual fidelity to create demand for more powerful cards.
This, however, is not a one-sided trade-off, since we, the customers, get more visual fidelity and developers get a reduced workload in return. If you don't want that, you can switch those advanced features off and enjoy the improved performance.

What you also forget in this "CDPR should have done both" argument is that Nvidia apparently went out of their way to get HairWorks into the game while AMD was twiddling their thumbs.
In the article I linked in my previous comment, CDPR said that AMD basically went silent on them for several months.
Other developers have described Nvidia staff as really passionate [http://www.extremetech.com/gaming/183411-gameworks-faq-amd-nvidia-and-game-developers-weigh-in-on-the-gameworks-controversy/3] about games in general and about optimizing games for their hardware. And I do believe this to be true.
In interviews [https://www.youtube.com/watch?v=aG2kIUerD4c], Nvidia staff said that they not only give generous support but sometimes send their own software engineers to developers [http://www.youtube.com/watch?v=aG2kIUerD4c&t=27m25s] to get the implementation done, at Nvidia's expense. AMD does not seem to do this, or does it on a much smaller scale. That's AMD's problem, not Nvidia's.
I assume HairWorks is in The Witcher 3 because Nvidia cared and actively helped CDPR, while TressFX is not there because AMD was doing nothing except offering some code, which afaik is free anyway, and on top of that only two months before release.
If it weren't for Nvidia, then probably no hair tech at all would be in the game, a state you can recreate yourself by disabling the feature.

Hoplon said:
Nvidia has been accused of paying developers off to use their tech several times in the past [http://www.xbitlabs.com/news/multimedia/display/20100311101148_Nvidia_Denies_Bribing_Game_Developers_for_Implementation_of_PhysX.html].
Afaik, thus far neither AMD nor any other accuser has provided any evidence to support the claim, and Nvidia has disputed the accusations.
What Nvidia has acknowledged, however, is that they actively help developers implement Nvidia technology, which could also be the reason Nvidia gets their tech into games.

I don't think comparing HairWorks and TressFX to G-Sync and FreeSync holds up very well either. G-Sync and FreeSync may produce similar results and are based on the same principle, but G-Sync apparently goes about creating that result differently than FreeSync, which uses the Adaptive-Sync standard.
Nvidia decided to tackle the problem on both ends (GPU and monitor) to ensure their vision of the experience. You will see, for example, that with FreeSync the usable frequency range differs between monitors. AMD says it works from 9 to 240 Hz, yet there are monitors which only allow FreeSync between 48 and 75 Hz, and some between 40 and 144 Hz.
You don't see this with G-Sync. There it is always 30 to 144 Hz.
Nvidia handles what happens when the frame rate drops out of this window [http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind] very differently than AMD does, and this is only possible because Nvidia has a frame buffer inside the panel.
G-Sync is also apparently better at eliminating ghosting. [https://www.youtube.com/watch?v=-ylLnT2yKyA]
All this is because Nvidia, thanks to the G-Sync module, has direct control over the display and AMD does not.
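To make the "drops out of this window" point concrete, here is a rough sketch of the general frame-multiplication idea. This is just the principle, not Nvidia's or AMD's actual driver or module logic, and the 30 to 144 Hz window is simply the G-Sync range quoted above.

def panel_refresh(content_fps: float, vrr_min: float, vrr_max: float) -> float:
    # Pick a refresh rate inside [vrr_min, vrr_max], repeating frames if needed.
    if content_fps >= vrr_min:
        # In range: the panel refresh simply tracks the frame rate (capped at max).
        return min(content_fps, vrr_max)
    # Below range: scan each frame several times so the refresh stays in the window.
    repeats = 2
    while content_fps * repeats < vrr_min:
        repeats += 1
    return min(content_fps * repeats, vrr_max)

for fps in (90, 40, 24, 12):
    print(f"{fps} FPS content -> drive the panel at {panel_refresh(fps, 30, 144):.0f} Hz")

Note that a narrow window like 48 to 75 Hz leaves a gap this simple multiplication cannot cover (anything between roughly 38 and 48 FPS has no multiple inside the window), which is part of why out-of-range behaviour varies so much between FreeSync monitors.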
 

Hoplon

Jabbering Fool
Mar 31, 2010
1,840
0
0
Ishigami said:
Nvidia has been accused of paying developers off to use their tech several times in the past [http://www.xbitlabs.com/news/multimedia/display/20100311101148_Nvidia_Denies_Bribing_Game_Developers_for_Implementation_of_PhysX.html].
Afaik, thus far neither AMD nor any other accuser has provided any evidence to support the claim, and Nvidia has disputed the accusations.
What Nvidia has acknowledged, however, is that they actively help developers implement Nvidia technology, which could also be the reason Nvidia gets their tech into games.

I don't think comparing HairWorks and TressFX to G-Sync and FreeSync holds up very well either. G-Sync and FreeSync may produce similar results and are based on the same principle, but G-Sync apparently goes about creating that result differently than FreeSync, which uses the Adaptive-Sync standard.
Nvidia decided to tackle the problem on both ends (GPU and monitor) to ensure their vision of the experience. You will see, for example, that with FreeSync the usable frequency range differs between monitors. AMD says it works from 9 to 240 Hz, yet there are monitors which only allow FreeSync between 48 and 75 Hz, and some between 40 and 144 Hz.
You don't see this with G-Sync. There it is always 30 to 144 Hz.
Nvidia handles what happens when the frame rate drops out of this window [http://www.pcper.com/reviews/Displays/AMD-FreeSync-First-Impressions-and-Technical-Discussion/Inside-and-Outside-VRR-Wind] very differently than AMD does, and this is only possible because Nvidia has a frame buffer inside the panel.
G-Sync is also apparently better at eliminating ghosting. [https://www.youtube.com/watch?v=-ylLnT2yKyA]
All this is because Nvidia, thanks to the G-Sync module, has direct control over the display and AMD does not.
I wasn't so much "accusing them" as thinking that everyone does it, or your logo doesn't go in there. Because who the fuck gives other companies free advertising? They both do it; it's why you get the annoying "works better on AMD" and "Nvidia, the way it's meant to be played" horse shit.

Honestly, from reading your links it sounds like LCDs are terrible at changing refresh rates and you need an expensive workaround or you get ghosting. The only comparison I was making was about the nature of the two things: one side open source, one side proprietary, which of course is still given away for free because otherwise no one uses it. See the history of Sony for many, many examples of proprietary format failures. And the only reason AMD isn't doing proprietary is because they're too broke.

I'm not pro or anti either of them; there's no point.
 

Ishigami

New member
Sep 1, 2011
830
0
0
I think it is less about money [http://www.dsogaming.com/news/slightly-mad-studios-boss-nvidia-have-not-paid-us-a-penny-amd-working-to-fix-performance-issues/] and more about marketing opportunities.
Being in either program comes with certain benefits. Nvidia and AMD seem to offer support free of charge; however, the level of support you get probably depends on how good a partner you are and on your product.
So I imagine it's more of a trade-off: AMD and Nvidia give you free middleware, support, documentation and some promotion via tech demonstrations etc., and the publishers and developers give AMD and Nvidia marketing opportunities and brand-building in return.
Can I be sure that no money is involved? No.
But I doubt that EA or DICE would need AMD's money to develop Battlefield 4, or Ubisoft some cash from Nvidia for the next Assassin's Creed. For all I know it could be the other way around as well...
 

Amir Kondori

New member
Apr 11, 2013
932
0
0
The thing that really sells the idea that this is deliberate sabotage is the huge performance hit HairWorks causes. Even on Nvidia cards you give up around 25% of your performance. For some flowing hair. Who wants to make that trade-off unless you're running a very beefy setup?

Of course Nvidia is happy to make that tradeoff if it means that the hit to AMD's performance is almost 50%.
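Just to put concrete numbers on that (the 60 FPS baseline is only an example, and the 25% and 50% figures are the rough hits quoted above):

# Illustrative arithmetic only; these are not benchmark results.
BASELINE_FPS = 60.0
hairworks_cost = {"Nvidia": 0.25, "AMD": 0.50}  # approximate fractional FPS loss

for vendor, loss in hairworks_cost.items():
    print(f"{vendor}: {BASELINE_FPS * (1 - loss):.0f} FPS with HairWorks on")

45 FPS is an annoyance; 30 FPS on a card that was comfortably doing 60 is the difference between leaving the feature on and turning it off.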

I agree that CDPR deserves a big portion of the blame, but they didn't get played; they sold out. Nvidia pays in something called "co-marketing", which can be worth hundreds of thousands of dollars or more in marketing for the game.

I wish this stuff would stop, and I currently run a GTX 980.
 

Lightknight

Mugwamp Supreme
Nov 26, 2008
4,860
0
0
You know, in real life when I walk, my hair doesn't flow unless there's a big breeze. It seems like these flowing-hair programs take realistic-looking hair and then throw it into the bottom of the uncanny valley. But maybe that's just me? Am I really the only person who sees the character's hair dancing all over the place as he walks into a bar and thinks it just looks silly and unrealistic? It's like the hair has no weight to it.