The only times I'm actually able to spot the difference are in fast-paced action or racing games, and to be completely honest, I don't really care all that much. Sure, Tales of Zestiria looks a little smoother at 60fps, but I had no problem playing it at 30fps on my PS4. Besides, my laptop is too shitty to run the game at all, so whatever. And okay, sure, my laptop wouldn't be considered a gaming system by the members of the PC master race, but looking at stats collected through Steam and such, it's actually not that bad. I mean, it's bad, but the average PC is actually not that far removed from the average console, so it's not as if my laptop is absolutely terrible. Heck, according to Steam surveys, I'm far from the only one with a simple Intel HD Graphics chip!
And all those games with limited animation I play, like visual novels and turn-based RPGs, those don't really need 60fps. I might not even notice if they dropped below 20fps. I mean, it's mostly text appearing in text boxes, and depending on my settings it might appear instantly, so yeah... (Though it is much easier to get these games to run at 60fps, hah.)
In any case, what's most important to me is a stable frame rate. I'll take a steady 30fps over an unsteady 60fps any day. And yes, I'm totally okay with some graphical sacrifices to make that stable frame rate happen. In game development, there will always be compromises. Even if developers had an unlimited budget, an equal amount of talent, and no deadline, they'd still be dependent on the users. If people can't play it, they won't buy it. If compromises are unavoidable, and they always are, I hope developers focus on stability rather than making the pretty pictures a little prettier. Xenoblade X may run at 'only' 30fps, but it does so consistently despite the size of the world. It's a wonderful thing that was made possible with the power of smart compromises [https://www.youtube.com/watch?v=qaYfIdZ-_a0].
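(Side note for the technically inclined: a "steady 30fps" usually means the game caps itself rather than running as fast as it can. Here's a minimal sketch of what a fixed-timestep loop with a 30fps cap might look like; `update` and `render` are made-up stand-ins for whatever the engine actually does, and a real engine would handle missed frames far more gracefully.)

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-ins for the engine's real work.
void update(double dt) { /* advance the simulation by dt seconds */ (void)dt; }
void render() { /* draw the current frame */ }

int main() {
    using clock = std::chrono::steady_clock;
    // One frame every ~33.3ms = 30fps.
    const auto frame_budget = std::chrono::microseconds(33333);

    auto next_frame = clock::now() + frame_budget;
    while (true) {
        update(1.0 / 30.0); // fixed timestep: the simulation always advances 1/30s
        render();

        // Sleep away whatever is left of the budget, so every frame takes the
        // same wall-clock time: a steady 30fps instead of a fluctuating 40-60fps.
        std::this_thread::sleep_until(next_frame);
        next_frame += frame_budget;
    }
}
```

The point being: capping below what the hardware can sometimes hit is exactly the kind of smart compromise I'm talking about, since a frame that occasionally takes too long is far more noticeable than a consistently modest frame rate.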