ATI CARD ROUNDUP

Grumman

New member
Sep 11, 2008
254
0
0
Zhukov said:
Used to use an ATI.

Never going back.

If I had a dollar for every time I encountered gaming graphics troubles, went looking for a fix and found only, "Oh ATI GPUs/drivers just don't like that game, hopefully they'll patch it", then I would have been able to afford a replacement much sooner.
That does not mean what you think it means. It does not mean "ATI makes shitty drivers"; it means "developers make shitty games". If Nvidia or ATI needs to write a kludge into their drivers to get your game to work, you suck at your job.
 

Zhukov

The Laughing Arsehole
Dec 29, 2009
13,769
5
43
Grumman said:
Zhukov said:
Used to use an ATI.

Never going back.

If I had a dollar for every time I encountered gaming graphics troubles, went looking for a fix and found only, "Oh ATI GPUs/drivers just don't like that game, hopefully they'll patch it", then I would have been able to afford a replacement much sooner.
That does not mean what you think it means. It does not mean "ATI makes shitty drivers"; it means "developers make shitty games". If Nvidia or ATI needs to write a kludge into their drivers to get your game to work, you suck at your job.
I couldn't give a frozen turd for the reasons. I care about the outcome.

And the outcome of this scenario is that I am encountering problems when using an ATI GPU that I would not be encountering if I were using an Nvidia GPU.

After making the switch, I have yet to encounter the inverse situation.
 

Mirrorknight

New member
Jul 23, 2009
223
0
0
Oddly enough, despite everyone saying that ATIs are the ones that tend to run hot, I started with Nvidia and had two burn out on me within a year. I switched to ATI, and I haven't had a problem with the two I've gotten so far (knock on wood).
 

ND

New member
May 24, 2008
43
0
0
Seriously.

These latest comics from you remind me of an old quote I heard in a movie somewhere:

"It's not the size that counts, but the way you use it."
 

mrgerry123

Regular Member
Aug 28, 2011
56
0
11
I'm happy with my 280X. It runs quiet and gives me good 1080p performance. Nvidia doesn't really have a competing card at the same price with 3GB+ of VRAM, even though they trounce AMD with their higher-end stuff.
 

burny5555

New member
Nov 26, 2014
1
0
0
I gotta say, my old Nvidia gfx card gave me nothing but trouble: constant "Nvidia driver has crashed and recovered" nonsense. Now, with my ATI gfx card, I am pretty content.

I do notice that on some games it runs really hot, but that seems to be bad game optimization, as many other, heavier games run cool.
 

Spushkin

New member
Nov 2, 2011
75
0
0
IndieForever said:
When I started reading through this I initially assumed that this was the usual teenage-esque defense of 'whatever I happened to have purchased is obviously better than the other thing you bought.' Having seen the cards that people have had and decried as crap or praised as great, it's obviously older people. Very disappointing!

I had a Matrox Millennium. It lasted less than a year and stopped working. All Matrox cards are crap. What? That was just my own experience and some people may have got a decade out of them? No way! Whaddya mean they don't make them anymore...

The only thing this conversation proves is that quality control is really, really important in the electronics industry. Once you've sold someone a product that fails prematurely or doesn't live up to expectations, even though tens of thousands of others work just fine for a long, long time, you've lost them as a customer. And, with the reach of the internet, they can tell everyone that the 'Nvidia/AMD/Ati* xxxx' is utter rubbish and doesn't work properly. Oh, and those crappy Matrox cards. That will be right above the post that says how brilliant the 'Nvidia/AMD/Ati* xxxx' is and anyone who says otherwise is obviously having sexual relationships with someone in their family. Matrox cards will never be good though, because they're not made anymore and I bought a bad one.

* Delete according to how old you are!

It's been said already but worth repeating - these two companies leap-frog each other constantly but sometimes it's not clear where. AMD have pretty much bowed out of the single-card top-end race but you can't go wrong with one of their mid-range offerings. Nvidia peddle over-priced coin-miners/folding machines/computational devices as well as the single best bang-for-buck card on the market - the 970. Who cares if you're actually missing 500MB of VRAM - I build properly demanding games for a living and the news barely caught my attention.

When you're dealing with games shipped by the big publishers, who write their own engines, then some of what has been written above does come into play. Optimisation, bribes - oops, I mean technical advice and development assistance funding - all of that has been, can be, and will be an issue.

For everything else that is built on a non-proprietary engine, it doesn't matter. Pick your budget, do a quick google search and you will find the best hardware for your price-point in minutes. Sometimes the winner is AMD, sometimes it's Nvidia. If you find they are equally good (the horror!), simply pick AMD if you live in a colder climate, preferably with cheap electricity, and Nvidia if you live in a warmer climate.**

**That previous sentence may or may not apply depending on which range you pick.

See how silly it all is?

The trick, of course, is to do none of the above. Don't buy a current generation card. Do what we do here in the studio, which is to pick hardware we think will be mid-range in 18 months and buy last-gen from suppliers keen to get rid of old stock. My dev rig has two 780Tis in SLI and cost next to nothing because everyone wants the latest and greatest. I can work and test an SLI setup, disable one if the drivers throw a hissy fit, and it has yet to encounter anything it can't handle at max settings at 1440p. 4K is here, but it's not the modal setup.

On a different note, I've just bought a Volvo V70.

Feel free to post 'shut up dumbass, everyone knows Lamborghinis are faster like what I have and my super-model girlfriend's BMW is so much cooler. Stupid fag Volvo drivers.'

:)

Give this man a cigar, seriously.
 

JayRPG

New member
Oct 25, 2012
585
0
0
Just a little factual note for any fanboys in the thread: AMD's apparent inferiority is easily explained. Their research and development expenditure is at a ten-year low.

AMD spent a total of $238 million on R&D, and that covers all of their products.

Intel's was $3 billion for the same period. Admittedly that figure also includes ongoing costs for fabrication plants, so it's not an entirely accurate comparison, but what is an accurate comparison (and a worrying one) is Nvidia's R&D budget.

Nvidia spent $348 million ($110 million more than AMD) on R&D in the same period, and Nvidia develops fewer products overall than the entirety of AMD.

It's not hard to see why AMD's processors have barely changed in the last eight years (and why the underlying tech has changed even less in that time), and on the GPU front it isn't hard to see how AMD's only option is to put two GPUs on a single card to 'beat' Nvidia. They just don't spend enough on R&D, so they are always at least one step behind tech-wise.

I'm not trying to shit on AMD either; it is often amazing and/or impressive what they can do with the outdated tech, and they definitely have their place in the market, but at the end of the day they are still working on outdated tech while their competition pushes forward.
 

w00tage

New member
Feb 8, 2010
556
0
0
IndieForever said:
The trick, of course, is to do none of the above. Don't buy a current generation card. Do what we do here in the studio, which is to pick hardware we think will be mid-range in 18 months and buy last-gen from suppliers keen to get rid of old stock. My dev rig has two 780Tis in SLI and cost next to nothing because everyone wants the latest and greatest. I can work and test an SLI setup, disable one if the drivers throw a hissy fit, and it has yet to encounter anything it can't handle at max settings at 1440p. 4K is here, but it's not the modal setup.

:)
Nice strat. When I built machines or recommended purchases, I settled on gear about six months old, bought just after the release of the new tech. The issues had been identified and patched by then, the performance was at least 80% of what I could buy new, and the price was half to two-thirds of what it had been (which meant I could easily afford higher-performance gear than I would otherwise have bought).

Now that I've switched to laptops for a lot of reasons, the strat doesn't work as well, but I still get decent gear for a decent price (lower-end ASUS G750 atm; it's OK for games and sweet in all other ways).
 

Dissentient

New member
Aug 19, 2011
32
0
0
A joke about current generation AMD GPUs should have been about space heaters. The 280X beats the GTX 960 really hard for the price, but one of those things will raise your room temperature after an hour of gaming.
 
Sep 14, 2009
9,073
0
0
Even though I have a slight preference for AMD, I still laughed out loud at this.

Still, I'm getting what I pay for in my product at least, unlike Nvidia's 4 GB... oh, I mean, 3.5 GB ;-)
 

Samus Aaron

New member
Apr 3, 2010
364
0
0
I'm pretty sure food stamps aren't actually stamps anymore. They come in the form of EBT cards (Electronic Benefit Transfer cards), sorta like a credit card. So this comic isn't entirely accurate.

*The more you know*
 

Neferius

New member
Sep 1, 2010
361
0
0
'Scuse me, but last I checked, ATI... er, sorry, AMD cards still have the biggest processing power when it comes to high-end stuff.
https://litecoin.info/Mining_hardware_comparison

GOOD DAY SIRS!
*slams door in disgust on the way out*
 

nyysjan

New member
Mar 12, 2010
231
0
0
Used to use AMD cards almost exclusively (mainly due to price).
The last one I had was an R9 270X with 4 gigs of memory; it cost €200 or so and worked fine for a few months, then it broke down.
I replaced it with a GTX 960 with half the memory at a €50 higher price.
No noticeable improvements; the fps is mostly the same and so is the noise level.

Sure, if I wanted to spend as much on my graphics card as I did on the rest of the machine, I would probably do better with ATI cards (even if mostly because of the dev optimization), but when trying to stick to the ~€200 price range, it does not really matter that much.
 

Dhael

New member
Nov 29, 2008
36
0
0
I prefer AMD mostly because AMD aren't nearly as anal about what hardware they will work with. With Nvidia, the optimal setup is the ONLY setup allowed, while AMD works with any AMD card of the same architecture, so you can create a decent transitional rig. Don't have enough for that sweet dual 280X CrossFire rig, but already have a 260? Slap that 280X in there and it WILL successfully CrossFire. Yeah, the 280X won't be able to max out due to memory restrictions, but it will still be better than a 280X alone. Buy the second 280X when you have the money. And if you use Mantle, you can even reduce that problem.

You can't do that with Nvidia. You HAVE to trash your old card because Nvidia cards won't SLI between different models.

AMD's biggest problem is that they are garbage at making drivers, so it takes forever for the good drivers that can optimize the card to its fullest potential to come out.
 

Korolev

No Time Like the Present
Jul 4, 2008
1,853
0
0
GASP! An insult! Against my FAVOURITE GRAPHICS CARD/GPU MANUFACTURER! I am thrown into an instant, frothing rage! My vision has become crimson and my heart yearns for the blood of Nvidia users! But the rage is too much! My sympathetic nervous system has become overstimulated, so geared up I am to this fight, that, though my heart brims with passion and fervour, my body fails me, and I am left, shaking, nay trembling in my chair, with this sensation of implacable hatred!

You shall rue this day!
 

IndieForever

New member
Jul 4, 2011
85
0
0
Dhael said:
AMD works with any AMD card of the same architecture, so you can create a decent transitional rig.
That's a really good point which no one else has mentioned. At a personal level, I have to say I prefer the AMD open-everything philosophy but, from a developer's perspective, Nvidia offer tools which AMD do not. The final result appears to work equally well on both manufacturers' cards, but the tools will not run on AMD kit. From a consumer's point of view, AMD offers more... what's the word... flexibility.

----

At the end of the day it doesn't matter what you buy really - a search will steer you clear of the poor-value cards from either company and point you to the right ones if you have a niche use such as mining. I can assure you all from hardware surveys that SLI/X-fire is not a common setup, and neither is 4K gaming. At the point when you're thinking, hmmm... the AMD 9999 gets 130fps at 1920x1200 with shadows at medium but textures at max, and the Nvidia 999 gets 122fps at 1440p but the shadows are more... 'shadowy' and it has Texture-Sharp (tm) technology that blurs things to make them look sharper... just buy what you want :)

But please, for the love of 28nm architecture, be happy with what you have and the research you did and don't take personally someone saying theirs caught fire and murdered their cat, whilst simultaneously stealing the last beer from their fridge. It's not a reflection on your choice, it's a statement about their experience.

Our games will run just as well/poorly on your AMD chipset as your Nvidia one at roughly the same price-point.

Fuck those Matrox cards though. Apparently VESA drivers won't install anymore...