AMD: Nvidia "Completely Sabotaged" Witcher 3 Performance on Our Cards

lacktheknack

Je suis joined jewels.
Jan 19, 2009
19,316
0
0
And TressFX butchered Nvidia cards in Tomb Raider.

He who lives in a glass house...
 

EndlessSporadic

New member
May 20, 2009
276
0
0
TressFX is pretty bad by itself. It completely butchers performance on Nvidia cards as well. That's the nature of hair and water physics: they are extremely performance intensive and really only perform well when they're written close to the metal for a specific GPU. AMD's tech won't run well on an Nvidia card and vice versa.
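Just to give a rough picture of where that cost comes from, here's a toy CPU-side sketch of strand simulation. Nothing here is taken from TressFX or HairWorks (both run on the GPU); the structures and numbers are made up, but the shape of the work is the same: every strand gets integrated, constrained and collision-tested every single frame.

```cpp
// Toy strand simulation (Verlet integration) to show where the cost is.
// Purely illustrative; real hair tech does this in compute/tessellation
// stages on the GPU, with length constraints and collisions on top.
#include <cstddef>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct Strand {
    std::vector<Vec3> pos;      // current vertex positions
    std::vector<Vec3> prevPos;  // positions from the previous frame
};

void stepStrand(Strand& s, float dt, Vec3 gravity) {
    for (std::size_t i = 1; i < s.pos.size(); ++i) {   // vertex 0 is the fixed root
        Vec3 p = s.pos[i];
        // Verlet: new = 2*current - previous + acceleration * dt^2
        s.pos[i].x = 2*p.x - s.prevPos[i].x + gravity.x * dt * dt;
        s.pos[i].y = 2*p.y - s.prevPos[i].y + gravity.y * dt * dt;
        s.pos[i].z = 2*p.z - s.prevPos[i].z + gravity.z * dt * dt;
        s.prevPos[i] = p;
    }
    // A real simulation also enforces segment lengths and resolves
    // collisions here, usually over several iterations per frame.
}

int main() {
    // Tens of thousands of strands with a handful of vertices each is
    // roughly the ballpark this kind of tech works in (made-up figure).
    std::vector<Strand> hair(20000, Strand{std::vector<Vec3>(16), std::vector<Vec3>(16)});
    for (Strand& s : hair) stepStrand(s, 1.0f / 60.0f, Vec3{0.0f, -9.8f, 0.0f});
    std::printf("simulated %zu strands\n", hair.size());
    return 0;
}
```

Scale that by constraint and collision passes, per character, sixty times a second, and it's easy to see why a hair toggle can eat a big chunk of the frame.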

It's not like Nvidia is much better, but the things that are tanking performance on your GPUs are your shitty drivers, your bad hardware configurations, and your sub-par software technologies. Come back to us when you fix those first.
 

gigastar

Insert one-liner here.
Sep 13, 2010
4,419
0
0
Are we just going to pretend that AMD didn't introduce TressFX to do pretty much the exact same thing to Nvidia a few years ago?
 

Arina Love

GOT MOE?
Apr 8, 2010
1,061
0
0
HairWorks makes my GTX 970 run at a much lower frame rate than it should be pulling. So I call this all AMD PR bullshit.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
Nvidia crippled The Witcher 3 on anything other than 970, 980 and Titan X. They used GameWorks to sell more 900 series of cards. Nothing else can run the game on Ultra with 60 fps. I canceled my pre-order because of GameWorks, but ended up pre-ordering at the last minute to get the 20% discount for owning both previous titles. And you know what? The game runs like a dream on my R9 280x. I don't care about Hairworks. And other than that piece of tech, The Witcher 3 runs better on AMD. Nvidia users have been reporting some freezing and stuttering issues. Nothing like that on my end. Frame latency is also a dream. I can barely tell the difference between 60 fps and 40 fps. That's how good it runs on AMD (without hairworks).
 

Amaror

New member
Apr 15, 2011
1,509
0
0
Imperioratorex Caprae said:
What's really messed up about all this is that these two companies compete over stupid shit that should be left up to the developers. The cards themselves should render whatever, and the developers should have the tools to make their product work with ANY card with ALL the features enabled. Instead NVIDIA has PhysX, and held up development on SLI for years after buying out 3DFX... I don't like NVIDIA very much, and have been an AMD guy for a long time, even before they bought into the GFX development area.
I don't necessarily need these hair features, but it's kind of sad that GPU devs are doing crap like this to take things away from end users (especially the ones who don't know tech very well). NVIDIA, like Intel, is overpriced for what you get; AMD has always offered quality for the price IMO, and I've always been happy with their technology, and I've owned both sides.
I've got my opinions and bias and I'm not afraid to say it. I owned NVIDIA before they were big and even then always had issues with their cards. ATI/AMD has, in my experience, had significantly fewer issues with their hardware, and though I may not get ALL the performance that NVIDIA gets, I'm also left with money in my pocket to put toward other things I want, just like buying an AMD processor over an overpriced Intel processor.
I've been building PCs since before there were 3D accelerator cards, and I've always found AMD to somehow work better in the long run, last longer and just make me feel I made the right choice as a consumer. The iterations of Intel processors I've bought have crapped out far quicker, and have either underperformed or had to have the chip RMA'd (I've never RMA'd an AMD in my experience, and I've bought hundreds as a professional PC builder). My bias is prevalent, but it's not without experience. Same with NVIDIA cards: TNT cards had driver issues back in the day, but at the time ATI was hit or miss until AMD bought them out. There have been some driver issues with AMD cards, but they've been fixed relatively quickly, and I'm currently running the beta Catalyst drivers with no issues on any of my games, also using the AMD Turbo function on my 8-core, which boosts the clock speed (watercooling is so awesome). Had it up to 7.2GHz, but I'm unsure how accurate the reporting is... the Core Temp program said it was operating at that speed, so I'm inclined to believe it.
I don't think you know how GPUs work. The reason they are so much more powerful is that they're dedicated to very specific tasks. But that also means you can't just tell a GPU what to do the way you can a CPU. That's the reason all these GPU developers have special features like PhysX and such. These are things a GPU would not normally be able to do, because it isn't designed for that task, so to do them it has to be designed for them, right down at the hardware level. Which is also why Nvidia cards suck at doing stuff developed by AMD, and the other way around.
You can't just "leave it to the developer", because in that case we wouldn't have any of these special features.
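To make that concrete, here's roughly how a game ends up dealing with it: figure out what the card is actually good at and pick the matching path. This is just a sketch with made-up names, not code from any real engine or SDK:

```cpp
// Illustrative only: the struct, the query and the decision rule are invented.
#include <iostream>
#include <string>

struct GpuInfo {
    std::string vendor;     // "NVIDIA", "AMD", ...
    bool fastTessellation;  // HairWorks leans heavily on tessellation
    bool fastCompute;       // TressFX leans on compute shaders
};

enum class HairTech { None, HairWorks, TressFX };

// Pick whichever vendor effect the hardware is actually good at,
// and fall back to plain hair meshes otherwise.
HairTech pickHairTech(const GpuInfo& gpu) {
    if (gpu.vendor == "NVIDIA" && gpu.fastTessellation) return HairTech::HairWorks;
    if (gpu.fastCompute)                                return HairTech::TressFX;
    return HairTech::None;
}

int main() {
    // In a real engine this would come from DXGI/OpenGL/Vulkan queries.
    GpuInfo card{"AMD", /*fastTessellation=*/false, /*fastCompute=*/true};
    switch (pickHairTech(card)) {
        case HairTech::HairWorks: std::cout << "enable HairWorks\n"; break;
        case HairTech::TressFX:   std::cout << "enable TressFX\n";   break;
        default:                  std::cout << "plain hair meshes\n"; break;
    }
    return 0;
}
```

The special features only exist as separate code paths in the first place because each one leans on hardware strengths the other vendor doesn't prioritise.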
 

vallorn

Tunnel Open, Communication Open.
Nov 18, 2009
2,309
1
43
And people wonder why I stay behind the graphics card curve... This nonsense is why really, I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...

For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor and War Thunder... And now back to trolling low level players with the M10 gun carriage from across the map.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
Eh. This is par for the course.

Ever seen those 'best played with nvidia' things? (The way it's meant to be played or some such thing?)

That sounds like a good thing right?

Well, not really. It means the developer got a cash incentive and technical support from Nvidia.
That's all well and good of course, but Nvidia has a vested interest in ensuring that whatever that game implements kinda breaks AMD cards...

Now, the devs themselves don't actually like that, because, obviously, if half the gamers that might buy your game can't play it, because the code Nvidia handed you breaks the game on their systems, that's going to hurt you in the long run...

But, guess what? Nvidia likes it of course. Especially since gamers often fail to see what's really happened, and blame AMD for 'broken drivers', 'bad performance' and so on.

Now, that's not to say AMD is innocent. They certainly attempted to pull the same trick often enough (especially back in the ATI days). But they don't seem to have as much money to throw at it, so they often seem to get the short end of the stick.

There's also this: http://www.dsogaming.com/news/ex-nvidia-driver-developer-on-why-every-triple-a-games-ship-broken-multi-gpus/

There's more to that specific rant than the article is quoting, but what it comes down to is this:

Games don't work. They are, most of the time, released in a state that violates basic rules about how graphics cards actually work.
What happens is, Nvidia and AMD identify what game is running, then actively fix every mistake the game is making at the driver level before handing the fixed code to the GPU...

This happens partly by accident, but part of it is that DirectX and OpenGL are so abstract and convoluted it's almost impossible to code something that actually works correctly from the point of view of the actual 3d rendering hardware.

Which means... The drivers contain millions of lines of convoluted code to deal with a million different edge cases of things that just don't work right, and jiggling all the game code around into a state that's actually capable of running...

No wonder the open-source AMD linux drivers are so bad...
Driver writing is clearly not as simple as it sounds, and in fact sounds like the worst kind of convoluted nightmare you could possibly face as a programmer:
messy, inelegant code, full of special cases, multiple code paths, conditional code execution and all other kinds of seriously nasty stuff...

It's a small miracle any of it works at all if it's that bad...
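If you want a picture of what that per-game special-casing looks like conceptually, here's a completely made-up sketch (nobody outside Nvidia or AMD sees the real driver code, and every name below is invented): detect the executable, then apply a pile of hand-written workarounds before any of its work reaches the GPU.

```cpp
// Invented structures and names; only the shape of the idea is real.
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

struct Workaround {
    std::string description;
    std::function<void()> apply;   // would tweak internal driver state
};

using ProfileTable = std::unordered_map<std::string, std::vector<Workaround>>;

// A real driver ships enormous numbers of these per-game profiles,
// keyed on the executable, and keeps adding more with every big release.
ProfileTable buildProfiles() {
    ProfileTable profiles;
    profiles["witcher3.exe"].push_back({"clamp excessive tessellation factors", [] { /* ... */ }});
    profiles["witcher3.exe"].push_back({"drop redundant state changes",         [] { /* ... */ }});
    profiles["someothergame.exe"].push_back({"reorder badly-timed resource updates", [] { /* ... */ }});
    return profiles;
}

// Called when a 3D application starts: find its profile and quietly fix
// its mistakes before handing anything to the GPU.
void onProcessAttach(const std::string& exeName) {
    static const ProfileTable profiles = buildProfiles();
    auto it = profiles.find(exeName);
    if (it == profiles.end()) return;   // a rare well-behaved game
    for (const Workaround& w : it->second) {
        std::cout << "applying workaround: " << w.description << "\n";
        w.apply();
    }
}

int main() { onProcessAttach("witcher3.exe"); }
```

Multiply that table by every big release of the past decade and you get some idea of why the drivers run to millions of lines, and why an open-source team can't realistically keep up.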

Steven Bogos said:
The difference between 30 and 60 FPS is like night and day.

I do, however, agree that anything above 60 FPS isn't really necessary.
It's necessary for VR. Essential, even. 60fps is a bare minimum. As in, letting the framerate ever drop below 60fps is not a good idea.
But... VR is an edge case.

Sure, framerates (or rather, input-to-display latency, to get at the heart of the issue) in VR are not just nice to have but critical to keeping people from getting sick while playing. But, as should be clear, VR is very much a niche thing. (For now, anyway. Who can predict the future?)
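To put rough numbers on that (assuming the roughly 90 Hz target the upcoming PC headsets are aiming for, which is my assumption, not something from the article): frame time is just 1000 ms divided by the frame rate, and a headset can never show you anything newer than the last finished frame, so missing the budget feeds straight into motion-to-photon latency.

```cpp
// Back-of-the-envelope frame-time budgets for common refresh targets.
#include <cstdio>

int main() {
    const double rates[] = {30.0, 60.0, 90.0};   // typical monitor / VR targets
    for (double hz : rates) {
        double frameMs = 1000.0 / hz;            // time budget per rendered frame
        std::printf("%5.1f fps -> %5.1f ms per frame\n", hz, frameMs);
    }
    // Prints roughly: 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 90 fps -> 11.1 ms.
    // Miss the VR budget even once and the headset re-shows a stale frame,
    // which is exactly the judder that helps make people feel sick.
    return 0;
}
```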
 

martyrdrebel27

New member
Feb 16, 2009
1,320
0
0
This article and conversation are exactly why I was exclusively a console gamer for most of my life, and am primarily one now. Though this latest generation's shenanigans have almost pushed me off console gaming entirely. One or two more missteps and I'm OUT.
 

Charli

New member
Nov 23, 2008
3,445
0
0
ShakerSilver said:
There are A LOT of details missing in this story; it makes it just look like AMD is name-calling.
That's the impression I'm getting too; AMD is just coming off as rather sulky here. And my household has one Nvidia machine and one AMD, and I use both for different things, so I'm not in a position to be judgemental. But come on...
 

J Tyran

New member
Dec 15, 2011
2,407
0
0
AMD throwing a strop again... This excuse is weak and old. Even if it has some basis in truth, should developers and consumers have to be limited by hardware with a minority market share?

I'm no fanboy. I want AMD to be better, and the PC community needs them to be better, but the focus on low to mid range hardware and APUs has been coming back to bite them, and now they can barely afford any R&D. In all but one of the cards, the 300 series is just a re-badging of existing chips with HBM memory slapped on, chips first seen back in 2013...

That says it all, really. They have always been flaky with the drivers too.
 

Adam Jensen_v1legacy

I never asked for this
Sep 8, 2011
6,651
0
0
BloodRed Pixel said:
Wait, we are talking about a 29% FPS drop because of hair?

Ridiculously epic fail, I'd say.
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend a lot of time looking at other people's hair to justify the drop in frame rate on any graphics card ever. It's useless eye candy.
 

Hairless Mammoth

New member
Jan 23, 2013
1,595
0
0
vallorn said:
And people wonder why I stay behind the graphics card curve... This nonsense is why really, I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...

For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor and War Thunder... And now back to trolling low level players with the M10 gun carriage from across the map.
Yep, this is one reason why I'm still going strong on a GTX 465 and will probably jump to around your card's generation (or go with AMD if that camp is looking good at the time) in maybe a year. Nothing I really want to play has my 465 screaming in pain anyways.

I'll just let the two professional corporations make the 10-year-olds fighting the early '90s console wars look like the mature group.
Adam Jensen said:
It's useless eye candy.
But, it's the FUTURE! /sarcasm
 

laggyteabag

Scrolling through forums, instead of playing games
Legacy
Oct 25, 2009
3,355
1,042
118
UK
Gender
He/Him
If your card can't handle it, turn off the setting. HairWorks is what I consider to be "graphical fluff", and it should only be turned on after texture quality, shadow quality, model quality, etc. Not having HairWorks is like not having PhysX in Borderlands. Sure, it looks prettier with it on, but you aren't losing anything if it isn't.

As for AMD: I hope the 300 series is as good as Nvidia's 900 series. Otherwise, keep up. This train isn't stopping for anyone.
 

Lightspeaker

New member
Dec 31, 2011
934
0
0
vallorn said:
And people wonder why I stay behind the graphics card curve... This nonsense is why really, I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...

For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor and War Thunder... And now back to trolling low level players with the M10 gun carriage from across the map.
Hell, personally I'm running dual 680s in SLI and can still play damn near every game on Ultra settings. There are a few here and there now that I'm having to drop the quality a little bit for, but on the whole it's not an issue.

And given that I took a look at graphics cards this morning out of curiosity (I'm hunting for a new hard drive; my current one is almost full and I could use the extra space) and nearly inhaled my drink in shock at the prices, I doubt I'm going to rush out to upgrade.
 

Ishal

New member
Oct 30, 2012
1,177
0
0
Adam Jensen said:
Nvidia crippled The Witcher 3 on anything other than 970, 980 and Titan X. They used GameWorks to sell more 900 series of cards. Nothing else can run the game on Ultra with 60 fps. I canceled my pre-order because of GameWorks, but ended up pre-ordering at the last minute to get the 20% discount for owning both previous titles. And you know what? The game runs like a dream on my R9 280x. I don't care about Hairworks. And other than that piece of tech, The Witcher 3 runs better on AMD. Nvidia users have been reporting some freezing and stuttering issues. Nothing like that on my end. Frame latency is also a dream. I can barely tell the difference between 60 fps and 40 fps. That's how good it runs on AMD (without hairworks).
I'd just like to confirm that this is my situation with my R9 280 as well.

Getting stable frames so far, between 50 and 60. They might drop a little later, as I haven't gotten too far into the game. But so far so good.