I don't think you know how GPUs work. The reason they're so much more powerful is that they're dedicated to very specific tasks. But that also means you can't just tell a GPU what to do the way you can a CPU. That's why these GPU developers have special features like PhysX and such: these are things a GPU wouldn't normally be able to do, because it isn't designed for that task. So to do them, the card has to support them all the way down at the hardware level, which is also why Nvidia cards are bad at running stuff developed by AMD and the other way around. (I've sketched what I mean in the code box after the quote.)

Imperioratorex Caprae said:
What's really messed up about all this is that these two companies compete with stupid shit that should be left up to the developers. The cards themselves should render whatever, and the developers should have the tools to make their product work with ANY card with ALL the features enabled. Instead NVIDIA has PhysX, and held up development on SLI for years after buying out 3DFX... I don't like NVIDIA very much, and have been an AMD guy for a long time, even before they bought into the GFX development area.
I don't necessarily need these hair features, but it's kind of sad that GPU devs are doing crap like this to take things away from end users (especially the ones who don't know tech very well). NVIDIA, like Intel, is overpriced for what you get; AMD has always offered quality for the price IMO, and I've always been happy with their technology, and I've owned both sides.
I've got my opinions and bias and I'm not afraid to say it. I owned NVIDIA before they were big and even then always had issues with their cards. ATI/AMD has, in my experience, had significantly fewer issues with their hardware. I may not get ALL the performance NVIDIA offers, but I'm also left with money in my pocket to put toward other things I want, just like buying an AMD processor over an overpriced Intel processor.
I've been building PCs since before there were 3D accelerator cards, and I've always found AMD to somehow work better in the long run, last longer, and just make me feel I made the right choice as a consumer. The Intel processors I've bought have crapped out far quicker, and have either underperformed or had to be RMA'd (I've never RMA'd an AMD chip, and I've bought hundreds as a professional PC builder). My bias is obvious, but it's not without experience. Same with NVIDIA cards. TNT cards had driver issues back in the day, but at the time ATI was hit or miss too, until AMD bought them out. There have been some driver issues with AMD cards, but they've been fixed relatively quickly, and I'm currently running the beta Catalyst drivers with no issues in any of my games. I'm also using the AMD Turbo function on my 8-core, which boosts the clock speed (watercooling is so awesome). Had it up to 7.2GHz, but I'm unsure how accurate the reporting is... the Core Temp program said it was operating at that speed, so I'm inclined to believe it.
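To make the "you can't just tell a GPU what to do" point concrete, here's a minimal CUDA sketch (my own illustration, not from either vendor's SDK): you don't hand a GPU a sequential to-do list like a CPU; you launch thousands of identical threads over a data-parallel kernel. Anything that doesn't fit that mold is exactly where the dedicated hardware and vendor middleware come in.

[code]
// Minimal CUDA sketch: each thread handles exactly one array element.
#include <cstdio>
#include <cuda_runtime.h>

// y[i] = a * x[i] + y[i], computed by one thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // memory visible to CPU and GPU
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    // 256 threads per block; enough blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    printf("y[0] = %f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
[/code]

Note that the "program" is really just one tiny function stamped out a million times in parallel; that's the shape of work GPUs are built for.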
NVIDIA could and did go and fix it...

lacktheknack said:
And TressFX butchered Nvidia cards in Tomb Raider.
It's necessary for VR. Essential, even. 60fps is a bare minimum. As in, letting the framerate ever drop below 60fps is not a good idea. (Quick frame-budget math below the quote.)

Steven Bogos said:
The difference between 30 and 60 FPS is like night and day.
I do, however, agree that anything above 60 FPS isn't really necessary.
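Here's the frame-budget arithmetic behind "bare minimum" (my own illustration; the 90 Hz row is an assumption about typical VR headsets, not something from the article):

[code]
// A target refresh rate gives you a hard per-frame time budget.
#include <cstdio>

int main() {
    const double rates[] = {30.0, 60.0, 90.0};
    for (double hz : rates)
        printf("%5.0f fps -> %6.2f ms per frame\n", hz, 1000.0 / hz);
    // 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 90 fps -> 11.11 ms.
    // Miss the budget once and the display repeats a stale frame, which
    // reads as judder -- hence "never drop below" rather than "average 60".
    return 0;
}
[/code]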
That's the impression I'm getting too; AMD is just coming off as rather sulky here. And my household has one Nvidia machine and one AMD, so I'm not in a position to be judgemental; I use both for different things. But come on...

ShakerSilver said:
There are A LOT of details missing in this story; it makes it just look like AMD is name-calling.
And it's not even that big a deal. Geralt's hair looks amazing even without it. And it's not like you'll spend enough time looking at other people's hair to justify the drop in frame rate on any graphics card, ever. It's useless eye candy. (Some quick math on that 29% figure below the quote.)

BloodRed Pixel said:
Wait, we are talking about a 29% FPS drop because of hair?
Ridiculously epic fail, I'd say.
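For scale, here's what the quoted 29% works out to, assuming a 60 fps baseline (my assumption; the story doesn't state one):

[code]
// Back-of-the-envelope math for a 29% framerate drop.
#include <cstdio>

int main() {
    const double base_fps = 60.0;
    const double with_hair = base_fps * (1.0 - 0.29);  // 29% fewer frames
    printf("%.0f fps -> %.1f fps (%.2f ms -> %.2f ms per frame)\n",
           base_fps, with_hair, 1000.0 / base_fps, 1000.0 / with_hair);
    // 60 fps -> 42.6 fps, i.e. 16.67 ms -> 23.47 ms per frame -- for hair.
    return 0;
}
[/code]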
Yep, this is one reason why I'm still going strong on a GTX 465, and I'll probably jump to around your card's generation (or go with AMD if that camp is looking good at the time) in maybe a year. Nothing I really want to play has my 465 screaming in pain anyway.

vallorn said:
And people wonder why I stay behind the graphics card curve... This nonsense is why, really. I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...
For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor, and War Thunder... And now, back to trolling low-level players with the M10 gun carriage from across the map.
But, it's the FUTURE! /sarcasm

Adam Jensen said:
It's useless eye candy.
Hell, personally I'm running dual 680s in SLI and can still play damn near every game on Ultra settings. There are a few here and there now that I'm having to drop the quality a little bit for, but on the whole it's not an issue.

vallorn said:
And people wonder why I stay behind the graphics card curve... This nonsense is why, really. I'd rather not have to upgrade to a certain manufacturer's ultra-awesome-super-amazing-hyper$$ card just to play new releases...
For now I will stick with my GTX 750 and play TF2, Kerbal Space Program, Space Engineers, Dark Souls, Killing Floor, and War Thunder... And now, back to trolling low-level players with the M10 gun carriage from across the map.
Just would like to confirm this is my situation with my R9 280 as well.

Adam Jensen said:
Nvidia crippled The Witcher 3 on anything other than the 970, 980 and Titan X. They used GameWorks to sell more 900 series cards; nothing else can run the game on Ultra at 60 fps. I canceled my pre-order because of GameWorks, but ended up pre-ordering at the last minute to get the 20% discount for owning both previous titles. And you know what? The game runs like a dream on my R9 280x. I don't care about Hairworks, and other than that piece of tech, The Witcher 3 runs better on AMD. Nvidia users have been reporting some freezing and stuttering issues; nothing like that on my end. Frame latency is also a dream. I can barely tell the difference between 60 fps and 40 fps. That's how good it runs on AMD (without Hairworks).
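On the frame-latency point: perceived smoothness tracks frame-time consistency at least as much as average fps, which is why a steady 40 fps can feel close to 60. A toy sketch of the idea (my own made-up numbers, nothing measured on either vendor's cards):

[code]
// Two runs with the SAME average fps but very different frame pacing.
#include <cstdio>
#include <cmath>
#include <vector>

// Mean and standard deviation of frame times in milliseconds.
static void report(const char *label, const std::vector<double> &ms) {
    double sum = 0, sq = 0;
    for (double t : ms) sum += t;
    const double mean = sum / ms.size();
    for (double t : ms) sq += (t - mean) * (t - mean);
    printf("%s: avg %.1f ms (%.0f fps), jitter %.1f ms\n",
           label, mean, 1000.0 / mean, std::sqrt(sq / ms.size()));
}

int main() {
    std::vector<double> steady(60, 25.0);          // every frame takes 25 ms
    std::vector<double> jittery;                    // frames ping-pong 15/35 ms
    for (int i = 0; i < 60; ++i) jittery.push_back(i % 2 ? 35.0 : 15.0);
    report("steady ", steady);   // 40 fps, zero jitter -> feels smooth
    report("jittery", jittery);  // also 40 fps, 10 ms jitter -> feels stuttery
    return 0;
}
[/code]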
Did they, though? Last I checked, TressFX still brutalizes my GPU.

Pinky said:
NVIDIA could and did go and fix it...

lacktheknack said:
And TressFX butchered Nvidia cards in Tomb Raider.