I actually think that a 30 FPS cap on console games isn't even a big deal. And that's not because I'm a PC gamer who doesn't care about the standards console players have to deal with; it's because I think those who game primarily on consoles have a different set of expectations.
When you're playing on your PC you (typically) have a precise keyboard and mouse setup, an HD display (standard PC monitors have been HD since before HD televisions became popular), and a more "involved" feel. You're generally not leaning back on a sofa across the room from the action; you're at a desk a couple of feet away from the display. You can see the pixels, you can sense the response to every minute mouse movement, and it's all a lot more precise.
With precision and scrutiny comes the demand for the best possible feedback. 30 FPS might cut it for console games, but in an FPS played on a PC, which generally involves a lot of camera movement, it's merely tolerable.
On the other hand, console games have traditionally had more of a disconnect between game and player. You're there with the controller specifically designed for the console, a controller that is sometimes needlessly cumbersome for many games (see the Wii and Wii U, or, for the most frustrating modern example, the Kinect). Consoles are also marketed for use by multiple people in the household, and are usually placed in the living area where others can spectate or join in. PCs (as their name implies) are for more specialised, personal use.
So while 60 FPS is objectively better and people should quit arguing that low frame rates are "cinematic", console games simply aren't scrutinised as closely. 30 FPS can be acceptable, and those used to the relatively imprecise controller inputs and the distance from the screen probably won't see much of a difference, especially when the game tastefully uses an effect like motion blur.
Besides, don't pretend like our PS1/PS2 games didn't chug when there were lots of things on screen.