Kopikatsu said:
So, I'm going to go get a gaming PC. I'm tech-savvy enough to figure out what I need for everything except the graphics card. I don't know jack shit about graphics cards. So! I was wondering how an AMD Radeon HD 6670 would fare in the gaming world of today. Probably not too well, since it's fairly cheap ($99 as opposed to $720 for the 6990), but like...could anyone give examples of games that the 6670 is just good enough to play? (Like Hunted: The Demon's Forge, or Brink or something.)
I have no clue how the 6670 compares to the rest of the 6xxx series, but I can't imagine it's any worse than my graphics card. Don't let PC elitists scare you into buying "THE BEST!!!!11!one!!". Until games stop being developed for the consoles, anything better than a Radeon HD 2xxx is going to be able to run computer games, just with varying levels of beauty.
I have a Mobility Radeon 5850 (a two-year-old card) in my laptop, and I've been able to run every game released this year at 1600x900 on either high or ultra settings (even Crysis 2 with DX11 and the high-res texture pack; for The Witcher 2 I only need to turn ubersampling off). By next year I'll probably need to turn anti-aliasing down a bit, maybe drop to medium settings for some games, but PC hardware doesn't get outdated nearly as fast as people like to say anymore. I would still recommend something a bit better than the 6670, possibly a 6850, just so it lasts you slightly longer, if you can afford the extra ~$50 (depending on where you find it, of course... but I think the 67xx series is pretty similar to the 6670, so there wouldn't be much of a difference between the 6670 and, say, a 6750). But with a 6670 I can't imagine you'd have trouble running anything released this year, as long as you have a competent processor and a nice amount of RAM to match. Or if you decide to really splurge and get two graphics cards to run in Crossfire, you certainly wouldn't have problems. (Except with games that get all screwy when you have Crossfire enabled, of course.)
EDIT: The biggest issue when it comes to PC gaming nowadays is how effectively the developers optimized the PC version. The Witcher 2 will run on pretty old hardware, and still looks fairly good when it does, because it's a rather well-optimized game. Crysis (the first one) still struggles to run at full settings on hardware newer than the game itself, because the CryEngine it was built on is extremely powerful but horribly optimized (hence why everyone said it was built for some supercomputer from the future when it came out).
Unfortunately, a lot of games get shoddy ports, so it's wise to use hardware well above the recommended specs, because it's difficult to know whether something will actually run well until you personally test it. The first FEAR game, for instance (a PC exclusive when it was first released, later ported to the consoles), has such a bad engine that on some computers it severely cuts your framerate if you're running Windows 7 and have something as simple as a mouse plugged into a USB port.