Yeah, speaking as someone who has a pretty high-end computer and all the current-gen consoles, and who likes to max out graphics and FPS: I notice the difference in frame rates, and for twitchier games the smoother FPS feels nicer. But for pretty much anything else, I'll take 30 on my consoles just fine, and I'll take a consistent 30 over fluctuating between 50 and 70 any day of the week. Even in console shooters it's noticeable for me, but I get used to it fast and stop caring.
So yeah, sure, I notice it, but it doesn't really hurt my enjoyment, and I usually stop noticing it after playing for a couple minutes.
I'll take 60 FPS if I can get it, but before I upgraded my PC, in some genres I'd purposely crank the graphics up and run at around 30 FPS just to get shinier textures or a better draw distance. 60 FPS or higher is nice to have if you can get it, but I've never really considered it a massive deal if I get locked at 30.
Now, if a game comes out on PC and poor optimization locks it at 30 FPS despite my computer probably being able to run it at 90, then yeah, that sucks. But it's not really about the FPS being low; I just don't like having my options limited when I've sunk money into building a machine that can crush the upcoming holiday game list.