Honestly, every MMO that isn't WoW plays like a list of reasons why every MMO that isn't WoW sucks. Worse auction house, worse dungeon/raid content, worse class balance/viability, more bugs (and more game-breaking bugs), and most importantly, nowhere near the number or quality of updates. The $15 that goes to WoW almost never feels like it's being "wasted," because the content they put out every patch is top caliber. Raid quality is hit or miss, but usually only one raid per expac is a total bomb (ToGC in Wrath, Dragon Soul in Cata), with the rest being pretty good or outright amazing.
Also, as much as some dipshits rag on addons "ruining" the gaming experience, WoW is made far more playable because you *can* edit everything about it: the layout of your UI, timers, cooldown bars/icons, raid windows, threat charts, damage charts, addons that track who died to what mechanic so your raid can improve, etc. When a new MMO hits and the company says "no player-made mods are allowed," they're basically saying "our game is 100% perfect and nothing anyone can do could improve it even slightly." Since no game, particularly in the MMO market, is perfect, all that does is cap the potential quality of their game. Even recent titles like Guild Wars 2 (an awful grindfest with no endgame content aside from grinding karma/gold for legendaries) and ToR (a half-hearted attempt at a story-driven PvE experience, except that because it's an MMO the moral-choice system is even more out of place than usual, and BioWare hasn't put out a single content patch in over a year, so "stale" doesn't even begin to describe the gameplay) could massively benefit from player-made addons to improve their dodgy UI or functionality. Guild Wars 2's range-detection system is fucking awful, for example - a tiny bar at the bottom of a skill turns red or black depending on whether your client thinks you're in range, but the change from red to black (or vice versa) is hard to notice and tucked out of the way, whereas a player mod could easily dim the entire skill icon when you're out of range.
Basically, when it comes to the MMO market, millions of people are all playing one game for a reason. It's objectively less shit than all the competition, and when a competitor comes out with one new or interesting idea, Blizzard often plucks it and rolls it into the next big WoW patch. No other game comes close to the amount of money that's thrown at WoW, and it shows - when Cataclysm launched, the two original continents were redesigned with new quest chains and a complete visual overhaul, simply so flying mounts could be used in the vanilla zones. No other dev would have the balls to put that much money into an update of "old" content, and Blizzard did it because they knew how dated and shitty those zones were; despite being costly in both money and man-hours, it was worth it to improve the game. When a game like Rift or ToR has a shitty zone, you can pretty much guarantee it'll never get updated. And Guild Wars 2 is pretty much 90% shitty, samey zones with buggy karma hearts and dynamic events, no world continuity, and loading screens between every zone. That game seriously feels like it belongs firmly in the last decade, or even the 90s - WoW lets you fly across entire continents with no loading screens, so there's no real excuse for GW2 to have so many of them. Then again, GW2's PvE and PvP content is awful, unengaging, uninspired, and grindy, so all things considered, loading screens aren't even its biggest problem.