AMD: Nvidia "Completely Sabotaged" Witcher 3 Performance on Our Cards

Pinky's Brain

New member
Mar 2, 2011
290
0
0
rgrekejin said:
Did they, though? Last I checked, TressFX still brutalizes my GPU.
Sure, but it's not playing favourites any more ... it just murders everyone equally :)

A 780 basically ties with a 290, both with it on and off, same with a 760 and a 270.
 

Valkrex

Elder Dragon
Jan 6, 2013
303
0
0
A little hypocritical, AMD, given the TressFX crap you pulled in Tomb Raider.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Valkrex said:
A little hypocritical, AMD, given the TressFX crap you pulled in Tomb Raider.
Oh for fuck's sake. How many times do people need to say this? TressFX runs exactly the same on both AMD and Nvidia. It ran like shit on both before Crystal Dynamics patched it. And even if it didn't, it's open source. Nvidia is free to mess with it.
 

Zipa

batlh bIHeghjaj.
Dec 19, 2010
1,489
0
0
Interestingly, signs do seem to point to AMD not being entirely honest about this, though - HairWorks has been in The Witcher 3 a lot longer than they seem to claim.

http://www.forbes.com/sites/jasonevangelho/2015/05/21/amd-is-wrong-about-the-witcher-3-and-nvidias-hairworks/
 

Ishigami

New member
Sep 1, 2011
830
0
0
ShakerSilver said:
You mean CD Projekt optimizing it for AMD's hardware? Which they can't, because of the GameWorks program - they're only allowed to optimize GameWorks features for GameWorks-supported cards (Nvidia's GTX 900 series and Titan Black).
First of all, CDPR didn't have access to the source code - according to both CDPR themselves and AMD - so CDPR can't optimize anything in regards to any GameWorks stuff.
Secondly, using GameWorks doesn't automatically come with any exclusive deal for Nvidia. After all, GameWorks binaries etc. are mostly free to any developer. Just because you are using GameWorks binaries or tools doesn't mean you are suddenly working for Nvidia.
There are apparently licensing deals that prevent developers from disclosing the source code to AMD, but again, CDPR didn't have access to the source code in the first place, and from a business standpoint it makes sense, and is legal, to protect your source code from a competitor. This is a proprietary solution by Nvidia, after all.
Last but not least, if a developer does have access to the source code of the GameWorks stuff, Nvidia is not preventing them from optimizing that code for AMD as well: https://www.youtube.com/watch?v=aG2kIUerD4c&hd=1

ShakerSilver said:
For AMD to make it work well on their hardware they have to wait until after launch and throw together their own patch well after people notice how poorly those features perform on their hardware.
According to AMD themselves, they worked together with CDPR on optimizing the game for months before release.
HairWorks is not new to the game; it was demonstrated a long time ago. There are video demonstrations of HairWorks in The Witcher 3 dating back almost a year.
And I'm very sure that CDPR knew it performed even worse on AMD hardware than on Nvidia hardware.
AMD therefore had a long time to come up with the tessellation tweak to get the performance up. They didn't, and an unrelated person at Guru3D came up with it just a day after release...
Kind of sloppy that AMD missed that opportunity, don't you think?
 
Sep 14, 2009
9,073
0
0
Adam Jensen said:
Hey AMD users. Would you like to run Hairworks better than Nvidia users just for bragging rights? Here's how: http://gearnuke.com/witcher-3-run-hairworks-amd-nvidia-gpus-without-crippling-performance/

It actually works if anyone's interested.
Thanks for the tip. I've been running it smoothly on high~ultra (minus HairWorks), but this might be worth tinkering with as long as it doesn't mess with the FPS too much.


OT: it's honestly a bit funny that this thread is more tame than the Facebook comments. Sweet Jesus, the fanboys are showing their true colors so hard on there.

yay for anti-competition! *sarcasm*
 

Jake Martinez

New member
Apr 2, 2010
590
0
0
Pinky said:
CardinalPiggles, they can't send in people to fix it ... which is what NVIDIA did with TressFX. The two situations are fundamentally different (which is not to say what NVIDIA is doing is inherently wrong).
Honestly, it's pretty naive to think that Nvidia wouldn't do this, as they did the exact same thing with Watch Dogs.

It's a pretty simple gambit actually - a developer decides to use some of your technology for some rendering, but you don't allow them access to the source code, which means that they cannot identify or give feedback to other card manufacturers about what they can do to boost performance with the new technology. Usually it even starts out with lots of promises of access and "Oh sure you can look at the code" and all that, but about 6 months into the work when it's too late to back out all of a sudden there are "legal issues" that can't be resolved.

This isn't a new tactic, not even by a long shot; sound card manufacturers (notably Creative) used to be infamous for this kind of stuff. The only difference here is that Nvidia is being particularly aggressive with this strategy, and AMD's card sales have been suffering for a long time, so they are more likely to be audibly upset about this.

The goal here isn't to really get Nvidia to stop this practice (although I'm sure they'd like that), it's to make game developers be aware of, or take into account, the complaints of their potential customers that are running AMD graphics hardware.

As for Nvidia themselves, they are dickholes and have always been dickholes. They keep their hardware specifications completely closed, so it's impossible to create open-source drivers for their hardware without clean-room reverse engineering, and while I know most people here are Windoze users, people who use Linux pretty much hate them because they have put artificial limitations into their cards in regards to things like monitor support.

Frankly, I wish Intel and Qualcomm would get involved in the high-end graphics card business, because AMD is doing a shitty job of competing with Nvidia, and if Nvidia is allowed to keep doing stuff like this then eventually their already shitty anti-consumer practices are going to get even worse, since people won't have a legitimate choice of an alternative vendor for GPUs.
 

Nurb

Cynical bastard
Dec 9, 2008
3,078
0
0
The amount of money required to make games look their best has been getting more and more ridiculous every year.

It's unacceptable to spend a few hundred dollars and STILL have a severe performance hit when all settings are at their highest.
 

FoolKiller

New member
Feb 8, 2008
2,409
0
0
ShakerSilver said:
There are A LOT of details missing in this story; it makes it just look like AMD is name-calling.
Yeah. I've stopped caring about anything that Mr. Bogos writes because it's inflammatory, incorrect, and incomplete more often than not. I just skip over it and head to the sources and/or the comments to read about the topic if it interests me.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
A library developed by Nvidia belongs to Nvidia, and AMD doesn't run it as well? Stop the presses!

And AMD is once again making claims about Nvidia they can't back up.

ShakerSilver said:
There are A LOT of details missing in this story; it makes it just look like AMD is name-calling.

It's not just AMD cards - anything that doesn't support Nvidia's GameWorks is completely screwed over for optimization of GameWorks-developed features and titles. This includes all ATi cards, Intel integrated graphics, and even Nvidia cards older than their GTX 900 series. A GTX 770 outperforms the GTX 960 by simply being more powerful, but with GameWorks features activated, the 960 edges out the 770 because the 770 isn't properly optimized for these features.

To use GameWorks, devs make a deal with Nvidia, forcing them to only optimize these features through GameWorks and only for GameWorks-supported cards. The incentive for game makers to essentially stop supporting much of the PC market is that the deal also has Nvidia paying the devs and offering help in development. Basically, Nvidia pays devs to only fully optimize games for their cards. Hell, I doubt devs even have a say in many of these cases or see any of this money, as it's mostly just the suits making the money decisions, then telling the devs "OK, you're going to use GameWorks because Nvidia paid us".

Nvidia is making sure that "the way games are meant to be played" is through Nvidia, even if it means screwing their existing customers because they didn't want to upgrade right away. This is all in complete contrast to AMD who have made all their code open-source and try to make it compatible with all hardware, even their competition.
Because they are.

Yes, AMD cards, Nvidia cards, everyone has problems with HairWorks, even those cards that do support GameWorks. And no, the 770 is NOT more powerful than the 960 in anything but theoretical power. To state that simply shows your lack of understanding of hardware architecture.

Nvidia makes a new feature that is processed better on newer cards, because older cards never had that technology on them. Obviously, new cards are going to perform better for that feature. But of course there's going to be a lot of butthurt people who want new features to work on old hardware because they don't understand how technology works.

And all of this about a feature so insignificant most people disable it right away.


IamLEAM1983 said:
There's always a point where adding or subtracting values to your FPS counter does absolutely jack shit.
Yes, it's called the monitor refresh rate. Having more FPS than your monitor's refresh rate won't do you any good.
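
To make that concrete: a frame limiter literally just burns the time left over in each refresh interval, because a frame finished early can't be shown any earlier anyway. A minimal sketch (purely illustrative - render_frame() is a made-up stand-in and a fixed 60 Hz display is assumed, this is not from any actual engine):

#include <chrono>
#include <thread>

// Stand-in for whatever work the game does each frame (hypothetical).
void render_frame() {}

int main() {
    using clock = std::chrono::steady_clock;
    // ~1/60 s: the display cannot show frames more often than this anyway.
    const auto refresh_interval = std::chrono::microseconds(16667);

    auto next_deadline = clock::now();
    while (true) {
        render_frame();
        next_deadline += refresh_interval;
        // Sleep away the surplus; frames above the refresh rate would
        // never reach the screen.
        std::this_thread::sleep_until(next_deadline);
    }
}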

The Lunatic said:
I find 30FPS to be unplayable. I find it rather odd really.

I understand that a lot of people play at 30. And therefore, it should be fine and playable, as most people seem to be able to do so.

But, when I try it, it looks so off, it's completely unresponsive and jarring to play.

It's a little frustrating, honestly.
When I was a poor kid I used to play games at as low as 15 FPS on severely outdated machines. It's playable, technically. It's not a good experience, though. Nowadays if I can't do 60 I'd rather lower the settings. Can't do 60 at minimum settings - not even buying the game. But you can play games at a very low framerate, even online shooters. It's not pleasant, but it's possible.



Adam Jensen said:
Nvidia crippled The Witcher 3 on anything other than 970, 980 and Titan X. They used GameWorks to sell more 900 series of cards. Nothing else can run the game on Ultra with 60 fps.
False. CDPR used tessellation techniques that are better developed on those cards and less developed on older cards, making them perform worse. Not all GPU technologies improve equally.



Pinky said:
NVIDIA paid for Gameworks integration,
[Citation Needed]

deadish said:
Nvidia isn't called the "graphic mafia" for nothing.
Nvidia improves performance. The horrible mafia!

Jake Martinez said:
Honestly, it's pretty naive to think that Nvidia wouldn't do this, as they did the exact same thing with Watch Dogs.
The Watch Dogs claims were disproven many times, and yet some people still believe them.
 

Kungfu_Teddybear

Member
Legacy
Jan 17, 2010
2,714
0
1
You mean like your TressFX hair did to Nvidia cards in Tomb Raider, AMD?

AMD users can just turn it off, for God's sake. Hell, I'm using an Nvidia card and I still turned it off. It's just pointless eye candy.
 

Gethsemani_v1legacy

New member
Oct 1, 2009
2,552
0
0
Kungfu_Teddybear said:
You mean like your TressFX hair did to Nvidia cards in Tomb Raider, AMD?

AMD users can just turn it off, for God's sake. Hell, I'm using an Nvidia card and I still turned it off. It's just pointless eye candy.
Pointless, resource-intensive eye candy, if you please. Yeah, I also have an Nvidia card and I didn't even bother to turn it on. The basic hair effects are decent enough, apart from the weird fact that Geralt always seems to be standing next to a fan going at full speed, even indoors in places with no draft at all...
 

Pinky's Brain

New member
Mar 2, 2011
290
0
0
Strazdas said:
CDPR used tessellation techniques that are better developed on those cards and less developed on older cards, making them perform worse. Not all GPU technologies improve equally.
Tessellation has been such a huge disappointment. The promising part of tessellation has always been screen-space tessellation, where surfaces are diced up into ~X-pixel-sized tris after transformation of the control surface (a la micropolygons). Instead we have tessellation as a form of geometry compression, which almost all the time is not only unnecessary but actually counter-productive for performance.
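
To be concrete about what I mean by screen-space dicing, here's a rough sketch (not from any real renderer and certainly not HairWorks code - the projection, names and numbers are all made up for illustration): the tessellation factor for an edge is just its projected length in pixels divided by the target triangle size, so geometry that is far away or tiny on screen never gets needlessly diced up.

#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Crude perspective projection to pixel coordinates (assumes camera space,
// point in front of the camera).
static void project(const Vec3& p, float focal_px, float& sx, float& sy) {
    sx = focal_px * p.x / p.z;
    sy = focal_px * p.y / p.z;
}

// Edge tessellation factor: projected edge length divided by the desired
// triangle edge length in pixels, clamped to the usual hardware range [1, 64].
static float edge_tess_factor(const Vec3& a, const Vec3& b,
                              float focal_px, float target_px) {
    float ax, ay, bx, by;
    project(a, focal_px, ax, ay);
    project(b, focal_px, bx, by);
    const float len_px = std::hypot(bx - ax, by - ay);
    return std::clamp(len_px / target_px, 1.0f, 64.0f);
}

int main() {
    // Same 0.5-unit edge, near the camera and then 10x further away.
    Vec3 a{0.0f, 0.0f, 2.0f},  b{0.5f, 0.0f, 2.0f};
    Vec3 c{0.0f, 0.0f, 20.0f}, d{0.5f, 0.0f, 20.0f};
    std::printf("near: %.1f\n", edge_tess_factor(a, b, 1000.0f, 8.0f)); // high factor
    std::printf("far:  %.1f\n", edge_tess_factor(c, d, 1000.0f, 8.0f)); // ~10x lower
}

Do that per edge and triangle density tracks the screen instead of the asset; skip it and you pay for sub-pixel triangles whether or not anyone can see them, which is exactly what the driver-side tessellation cap people are using works around.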

Pinky said:
NVIDIA paid for Gameworks integration,
[Citation Needed]
You're right, I'm jumping to conclusions. I should have said NVIDIA paid CDPR and CDPR integrated Gameworks.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
Pinky said:
Strazdas said:
CDPR used tessellation techniques that are better developed on those cards and less developed on older cards, making them perform worse. Not all GPU technologies improve equally.
Tessellation has been such a huge disappointment. The promising part of tessellation has always been screen-space tessellation, where surfaces are diced up into ~X-pixel-sized tris after transformation of the control surface (a la micropolygons). Instead we have tessellation as a form of geometry compression, which almost all the time is not only unnecessary but actually counter-productive for performance.

Pinky said:
NVIDIA paid for Gameworks integration,
[Citation Needed]
You're right, I'm jumping to conclusions. I should have said NVIDIA paid CDPR and CDPR integrated Gameworks.
I agree with you on tessellation being a disappointment. I merely explained why the performance difference exists. Personally I prefer well-mapped objects to tessellation, but we have what we have.

I'm still calling for a citation on your clarification though, as there is no proof Nvidia ever paid any developer.
 

Steve the Pocket

New member
Mar 30, 2009
1,649
0
0
Jake Martinez said:
As for Nvidia themselves, they are dickholes and have always been dickholes. They keep their hardware specifications completely closed, so it's impossible to create open-source drivers for their hardware without clean-room reverse engineering, and while I know most people here are Windoze users, people who use Linux pretty much hate them because they have put artificial limitations into their cards in regards to things like monitor support.
They should try actually playing with an AMD card and see how much they appreciate getting something like 60-70% lower framerates than on Windows half the time. Nvidia is actively playing dirty; AMD is just incompetent. It's like US politics, come to think of it.