Necrohydra said:
Thinking about it, besides the obvious reasons like increased pay, I wonder why coding experience isn't required of all game testers...
That is a surprisingly complicated question and the answer will necessarily vary from publisher to publisher. Part of it is money, part of it is politics, part of it is logistics.
In general, there are two types of testers: black box and white box. You're talking about white box testers (testers who actually break into the code, do more granular debugging, and actually fix 'easy' issues). Black box testers test from a player's perspective, and very few ever get deeper than poking through .ini files or game logs for specific information. Most testers in the games industry are strictly black box. There are a few good reasons for that, and I'll try to cover a few of them.
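To make the distinction concrete, here's a minimal sketch in Python. The "game code" (a damage calculator) and both test functions are invented for illustration; the point is just the difference in perspective between the two kinds of test.

```python
# Hypothetical stand-in for a piece of game code. The function name
# and damage rules here are invented for illustration.
def apply_damage(hp, damage, armor):
    """Reduce hp by damage mitigated by armor; hp never drops below 0."""
    mitigated = max(damage - armor, 0)
    return max(hp - mitigated, 0)

# Black box test: only observable behavior matters. Feed inputs,
# check outputs, no knowledge of how the function is written.
def test_black_box():
    assert apply_damage(hp=100, damage=30, armor=10) == 80
    assert apply_damage(hp=5, damage=50, armor=0) == 0  # never negative

# White box test: written with the source in view. It deliberately
# exercises the internal branch where armor fully absorbs the hit.
def test_white_box_armor_branch():
    assert apply_damage(hp=100, damage=10, armor=25) == 100

test_black_box()
test_white_box_armor_branch()
print("all tests passed")
```

A black box tester on a game team is doing the first kind of work against the running build itself, without ever seeing `apply_damage`.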
As noted in the article, Microsoft does look to promote testers with coding experience. But as you mention, cost is a major concern for making this a requirement. According to Game Developer Magazine's 2007 Salary Survey, Quality Assurance Testers with less than 3 years of experience earn an average of $39,063 (which feels a little high, honestly, and is definitely a piece of information I need to take to my next performance review...), while your average programmer makes somewhere in the neighborhood of $83,383. Your average Designer (the third lowest on the survey) earns around $63,649. (The true travesty in the survey is that Business and Marketing people are the highest compensated in the industry, but that's another subject entirely.)
As mentioned in the article, publishers do hire testers who can code. However, they're primarily involved in building testing tools and automation. They're also paid a hell of a lot more than your average black box tester, so there are usually only a small handful per game team (sometimes across an entire QA Department). Building and debugging these tools tends to take up a HUGE amount of time for these people, so actually getting their hands dirty and black box testing the product is virtually impossible. These Test Engineers are greatly appreciated (though usually silently) by the rank-and-file. Test automation usually kills a large amount of the drudgery in testing a title. Unfortunately, some tasks can't be completely automated, so you have to pay someone to sit around and watch things. Black box tester to the rescue.
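As a rough sketch of the kind of drudgery-killing tool a Test Engineer might build, here's a log scanner that flags known failure signatures in a build's output so nobody has to read thousands of lines by hand. The log format and failure patterns are invented for illustration:

```python
import re

# Signatures a QA team might consider automatic failures.
# (These particular patterns are hypothetical examples.)
FAILURE_PATTERNS = [
    re.compile(r"assert(ion)? failed", re.IGNORECASE),
    re.compile(r"out of memory", re.IGNORECASE),
    re.compile(r"unhandled exception", re.IGNORECASE),
]

def scan_log(lines):
    """Return (line_number, line) pairs matching a failure signature."""
    hits = []
    for num, line in enumerate(lines, start=1):
        if any(p.search(line) for p in FAILURE_PATTERNS):
            hits.append((num, line.rstrip()))
    return hits

sample_log = [
    "INFO  loading level 'docks'",
    "WARN  texture pool 87% full",
    "ERROR Assertion failed: navmesh node count > 0",
    "INFO  shutting down cleanly",
]
for num, line in scan_log(sample_log):
    print(f"line {num}: {line}")
```

A tool like this can run unattended against every nightly build; what it can't do is notice that an NPC is walking through a wall, which is where the black box tester comes in.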
With the growing complexity of console titles and the surge in MMO production (arguably some of the most complicated software systems on the planet), there is a HUGE need to test these things at runtime. The simple fact remains that you just don't know what's going to happen until you turn the key. If you can find me a coder who can produce 100% error-free code, 100% of the time, I'm going to start praying to every God known to man because the end times are nigh.
One of the other issues with hiring test engineers exclusively can be the structure of the publisher itself. If "testers" are fixing bugs, then why aren't they on the production team? It mostly comes down to money here, too, but Executive egos get involved and shit gets ugly. Again, it's complicated and will definitely vary from publisher to publisher.
Just one last thing: my experience in QA has been mostly positive. The hours can suck, the people can be even worse, but I'd never trade my 100+ hour weeks for anything. The war stories are what make things fun. I've gotten to work on a wide variety of products, and it's really pushed my ability to think critically, solve problems, manage stress, and build relationships with coworkers, along with a whole host of other useful life and job skills.
Not that anybody ever really cares about testers, but I like talking about this sort of stuff, so if people have questions, I'm more than willing to answer nonspecific stuff.
Anyway, I'm going to shut up now at the risk of just rewriting the article.
Edit: I lied about the shutting up thing. Another problem with testing is one of perception. A large segment of people (from Executives down to entry-level designers) think they can test a game, and that it involves little more than "playing" for 8 hours a day. It's really unfortunate that the perception still exists, especially considering there is such a vast difference between play testers and QA testers. While play testing (balance, flow, progression) is part of the QA experience, the job extends far beyond that. The sad thing is that, in my experience, developers (Producers, Leads, Execs) are less likely to listen to gameplay feedback from QA testers than they are from focus groups. There are reasons behind that, I'm sure. But when somebody who is familiar with every in and out of your product says something is screwy with an aspect of "subjective" gameplay, I would think the decision makers and the product owners would listen...