Yes, because we all know that pre-release screenshots never get touched up and never show only the good parts of the game. I mean seriously, after the recent Watch Dogs fiasco, aren't you even one bit suspicious? Don't you realise they will reduce the graphical settings?
You're right about 24fps being chosen for technical reasons back in the day (although it has as much to do with sound as visuals). To clarify some other points in your post - films are almost always shot at their output frame-rate, which in the vast majority of cases is 24. Almost the only reason to shoot at higher (or lower) rates is to slow down (or speed up) action after the fact by running it at the output frame rate. There's a technology called Showscan which advocates shooting at 120 and downsampling to lower rates, but that's not actually been used in any films yet.
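The overcranking arithmetic behind that is simple: footage captured at one rate but played back at the output rate appears slowed by the ratio of the two. A minimal sketch (illustrative only, with made-up function names):

```python
def slowdown_factor(capture_fps: float, output_fps: float) -> float:
    """How many times slower the action appears once played at the output rate."""
    return capture_fps / output_fps

def playback_duration(shot_seconds: float, capture_fps: float, output_fps: float) -> float:
    """Screen time of a shot after its frames are run at the output rate."""
    frames_captured = shot_seconds * capture_fps
    return frames_captured / output_fps

# Shooting at 48fps for 24fps output gives 2x slow motion:
print(slowdown_factor(48, 24))        # 2.0
# A 3-second burst shot at 120fps fills 15 seconds of screen time at 24fps:
print(playback_duration(3, 120, 24))  # 15.0
```

Shooting below the output rate works the same way in reverse (a factor under 1, i.e. sped-up action).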
24fps is not chosen nowadays for technical reasons, nor is that ever quoted as a factor. Shooting at higher frame rates is easy, but people generally don't want to. Put simply, it is an artistic choice. Films look very different at different frame rates. The general consensus, as far as I'm aware, is that 24fps, compared to say 48, intentionally looks less realistic. Suspension of disbelief is much harder when your brain thinks you're looking at something real - the slight jitter of 24fps provides a buffer between you and the fictional world, which counterintuitively can aid immersion greatly. I certainly found The Hobbit at 48fps to look like a stage play, with obvious costumes, lighting and makeup, and I'm pretty sure that's because my brain was more convinced it was real.
If it is done to please the elitist or retro crowd, then we have basically failed. We've practically halted technological advancement because it doesn't suit a minority that thinks everything must stay the same - a crowd that thinks that because that's how it was done, that's how it should be done. Old-fashioned thinking for an old-fashioned world.
I don't think that's a minority when it comes to film-goers. Gamers, I've noticed, tend to prefer higher fps in films because that's what they're used to. But that doesn't mean it's the right choice. Besides, I could make the counter-argument that if you increase frame-rates in films, you're just doing it to satisfy a minority who think that because we can do it, we should do it.
But games are a very different medium, and the almost ubiquitous drive to make them more "cinematic" generally undermines efforts to find gaming's unique strengths as an art form. Lower frame rates in games are almost universally detrimental given the interactive nature of the medium, and although a genuine limitation-free decision to lock a lower frame rate is a totally valid artistic choice, I think jamming it into a game (almost certainly to cover up technical issues) which isn't specifically designed around it will most likely result in a poorer experience.
I'll say the same thing I always say when it comes to this: as long as the framerate is consistent, I couldn't give two shits about graphics or a high framerate. Yes, it's 2014 and this should be standard by now, but I've never enjoyed a game I liked less because it had a lower framerate or wasn't full HD. If I like the story, characters, gameplay and/or music, then I'm generally fine.
Really though, this recent trend of false advertising isn't doing anyone any good, and companies should stop with this crap.
This right here. Yeah, I can see a definite difference between 30fps and 60fps, but it doesn't bother me. As long as it's consistently either 30 or 60 and the graphics don't cause my eyes to bleed with overbloom, I'm good to go. It doesn't matter much for this game anyway; it looks like your typical third-person shooter except with a coat of Steampunk over it (and not even inspired Steampunk).
Yes, there are significantly better looking PC games.
But, that's not the point of a console.
If you told me I could spend $350 on a PS4 that looked as good as The Order does (yet to be confirmed, mind you), but with the limitation of running at a sluggish 30 FPS, then I guess that'd be fine. I mean, I'm a PC gamer, console games aren't really for me, but for that price and performance, I guess it's acceptable.
I'd recommend it to somebody who is less into games than I am and unwilling to put money into their hobby.
And that's ultimately what consoles are. They're PCs for the gaming hobbyist.
PCs are for enthusiasts. They're for people who care about how well it runs, people who want high resolutions and all the fancy graphics - not for the casual gamer who doesn't care if the game slows down occasionally.
And that's fine. It's a different standard for a less demanding audience. So there's not much point in holding it to the PC standard. Console games run at 30 FPS at lower resolutions, usually looking graphically inferior to their PC counterparts.
And by that measure, yes, The Order does meet the console standard. The issue here, however, is the reason they're giving for running the game that slowly.
How is The Witcher shit? Oh wait, consoles. They never got the full game, because The Witcher 1 was PC-exclusive, and the full experience needs your choices from the previous game.
Metro is shit? It's critically acclaimed. Both of them.
The only shitty game there is Hitman, and that's subjective, because it follows Silent Assassin's rules rather than Blood Money's.
And why would The Order be anything more than an ugly corridor shooter? It's not like we have Star Citizen or anything. Oh wait, we do, and it's more impressive than The Order - as is every other PC game out there.
As for optimization, you are so totally wrong it's hilarious. Better hardware is ALWAYS better hardware. To prove the point, here is an 8800 - better than the 7th-gen consoles, nearing 10 years old, and still going strong:
Here is another, with Batman running at 50 fps without fraps on.
That card is just as old as the Xbox and slightly better than the PS3. Still going strong, and still running at higher settings. You don't have to upgrade - that's a myth. Everything you just said is a myth, a common one that was debunked long ago.
It's nice to know people will take a game no one has seen in full and claim it's the best. It's also nice that you're making shit up now.
So next time, don't pull out "trump" cards that are hilariously outdated by 20 years. What next, are you going to say we code to the specific card? Because that practice is ancient too.
The Witcher is shit because it is about as aerodynamic as a bag of hammers. Incidentally, I played both, and while the story is intriguing, it plays like absolute shit on the PC.
Have you played Metro? The first was a pile of bugs with more bugs on them, who all had crabs. The second was a much better effort, but it also has its issues in the bugs/inconsistency department.
Yeah, it's a nice graphics card. I'd like to see the same videos with a CPU and RAM that are 10 years old.
I've never claimed anything was the best. But developers do not optimize their games for the 10,000 different PC configurations out there; they make their games for hardware that is no more than a few years old, whereas they optimize like hell for the consoles.
Oh wow, I spent the better part of an hour writing a reply, and then the forum went offline or something when I tried to post -.- I'll see if I can be bothered to write EVERYTHING all over again during the day.
I do hope this comes back to bite Ready at Dawn in the ass. They could crash and burn for all I care; with people at their head who are so obviously bullshitting and have no grasp at all of their medium, I think we could do with fewer companies like this. They're essentially dead weight on the industry.