Okay, time to drag people through a quick brainstorm, based a little on guesswork and assumption, but it shouldn't be too far off the mark. (Feel free to check facts, though; I love corrections.)
Back in the days of olde, when film was a fresh new fancy medium, and for a substantial amount of time after that, movies were shot on long rolls of film wound onto reels. As you might imagine, this could get pretty expensive, and I would assume they used some sort of audience testing to find out how few frames they could get away with before the movie became too choppy. 24FPS seems like a decent number, if the flicker fusion threshold is around 17FPS (thanks GameSpot). Fewer frames would reduce the cost, since it would reduce the number of film rolls needed.
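To put some rough numbers on the "fewer frames, less film" idea, here's a back-of-envelope sketch. It assumes standard 4-perf 35mm stock at 16 frames per foot (a real figure, but the comparison frame rates are just for illustration):

```python
# Rough film-stock arithmetic, assuming standard 4-perf 35mm film,
# which runs 16 frames per foot. Frame rates chosen just for comparison.
FRAMES_PER_FOOT = 16

def feet_of_film(runtime_minutes, fps):
    """Feet of 35mm film needed for a given runtime and frame rate."""
    total_frames = runtime_minutes * 60 * fps
    return total_frames / FRAMES_PER_FOOT

for fps in (17, 24, 48, 60):
    print(f"{fps:>2} FPS -> {feet_of_film(120, fps):>6.0f} ft for a 2-hour film")
```

At 24FPS a two-hour film already eats about 10,800 feet of stock, so you can see why doubling the frame rate was a hard sell back then.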
Fast forward to the modern day: 24FPS has since become the convention, so it's what movie makers assume people are used to, and why mess with something people are used to? I don't know what resolution or FPS raw footage is shot at nowadays, but we most certainly have the technology to process large amounts of raw footage without having to limit the FPS to 24, unless the movies are shot in 50K (which I highly doubt).
I reckon the films are shot at 24FPS as well, though, to smooth the transition between raw footage and finished product. It seems the whole "save money on production" idea is still in the minds of movie makers, but that they keep applying it to the shooting of a film, in a time when the digital medium allows shooting hundreds of extra hours at the expense of some extra storage, is ridiculous. There shouldn't BE a need to resort to the old idea of 24FPS any more, because that was done to save money, and they don't use movie reels any more.
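For a sense of scale on the storage side, here's the same kind of back-of-envelope math. The bitrates are my own placeholder assumptions (real camera codecs vary wildly), so treat the exact figures as illustrative only:

```python
# Back-of-envelope storage math with assumed, hypothetical bitrates;
# real camera codecs vary wildly, so these numbers are placeholders.
def storage_gb(hours, mbit_per_s):
    """Gigabytes needed to record `hours` of footage at a given bitrate."""
    seconds = hours * 3600
    return seconds * mbit_per_s / 8 / 1000  # Mbit -> MB -> GB

print(f"Compressed HD (~50 Mbit/s): {storage_gb(1, 50):6.1f} GB/hour")
print(f"Raw 4K (~2000 Mbit/s):      {storage_gb(1, 2000):6.0f} GB/hour")
```

So it's more than "a few GBs" for raw footage, but the point stands: disk space is cheap compared to film stock, and it scales linearly with frame rate rather than costing you physical reels.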
If it is done to please the elitist or retro crowd, then we have basically failed. We've practically halted technological advancement because it doesn't suit a minority that thinks everything must stay the same; a crowd that thinks "that's how it was done, so that's how it should be done". Old-fashioned thinking for an old-fashioned world.
To get back on topic, I personally love high framerates: the smooth movement of high resolution images, where every detail is crisp and clear. I can see the problem with technical limitations, but like people in the thread keep saying, dressing up what could very well be a technical limitation as a deliberate stylistic choice is ridiculous, especially in a first-person shooter, where movement and precision are key. It has been tested multiple times: a higher framerate allows higher precision, because while the eye supposedly can't "see" beyond 30FPS (which is bullshit, I can easily see the difference between 30 and 60FPS), your hand "sees" a much higher FPS, allowing for much higher precision, especially in first-person shooters. Hand-to-eye coordination, so to speak. (I read an article on this a few months ago; I don't remember the wording they used.)
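The precision argument is easy to see in the frame-time arithmetic alone. Ignoring everything else in the input chain, the frame rate caps how stale the picture you're reacting to can be; a quick sketch:

```python
# Frame-time arithmetic: at a given FPS, the image on screen can be up to
# one full frame out of date, so higher FPS means fresher information to aim by.
def frame_time_ms(fps):
    """Duration of a single frame in milliseconds."""
    return 1000 / fps

for fps in (24, 30, 60, 120):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.1f} ms per frame")
```

Going from 30 to 60FPS halves that worst-case staleness from about 33 ms to about 17 ms, which is exactly the kind of margin that matters when you're tracking a moving target with your hand.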