The Great Framerate Debate

TiberiusEsuriens

New member
Jun 24, 2010
834
0
0
Charcharo said:
TiberiusEsuriens said:
I am all for making good use of the GPU, but you are wrong on part 1.
Most games DO NOT make good use of all the CPU can do. They barely use 2 cores, let alone 4 or 6 or 8.
Crysis 3 has great physics, equal to those of PhysX, running on the CPU without tanking the frame rate...

Also I disagree on the engine part.
STALKER Lost Alpha released a month ago (an early access build) and is a fan-made official/unofficial game. And you know what?
They finally fixed the 14-year-old STALKER engine. Now it uses ALL available cores to generally good effect and loads MASSIVE, multi-level and VERY detailed maps (that would make most AAA devs commit suicide). Here is what many parts of the game look like:
http://cloud-2.steampowered.com/ugc/468682266817903805/EADFB7A0B11FED887E6AFE3F591BF4E1AC7845B8/

*Though Shamus will ignore this, as always*

So even old engines can be made to rock.
Those are actually pretty solid screens. I'm not saying that old engines can't be made better, but that the important details are always thrown in as an afterthought.

I am curious how long development on the STALKER mod has been going. As you said, it's a very old engine, so the modders have had [up to] 14 years to make it sing. AAA studios can make their engines better, but as PC gamers love to point out, they simply don't try. Engines can be optimized after the fact, but it takes a LOT more work than if they had laid better groundwork, which leads studios like Ubi to keep extending Anvil while never putting in the effort to optimize. Optimizing an engine takes far more time than creating it, and after each yearly iteration/addition they would have to completely re-optimize all over again.

That's about 3-4x as much effort as the Crytek example, where one very stable engine lasts 5 years, giving them time to create a new, HIGHLY optimized, very stable engine that lasts another 5-10 years. The Anvil engine has only recently gained what CryEngine 'perfected' years ago, and many times it feels like it barely functions. It's a big reason why the Source engine is so old but still loved - it was built to last.
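
To make the "uses ALL available cores" point concrete, here's a toy Python sketch of my own - spreading per-frame work across every core instead of leaving most of them idle. The workload and function names are made up for illustration; this is not how X-Ray or any real engine is actually structured.

# Toy illustration: splitting one frame's worth of simulation work
# across every available core. Everything here is invented for the example.
import os
from concurrent.futures import ProcessPoolExecutor

def chunk_update(entities):
    # Stand-in for per-entity work (AI, physics, animation ticks).
    return [e * 1.016 for e in entities]

def split(seq, parts):
    size = (len(seq) + parts - 1) // parts
    return [seq[i:i + size] for i in range(0, len(seq), size)]

if __name__ == "__main__":
    entities = list(range(100_000))        # pretend game objects
    cores = os.cpu_count() or 4
    with ProcessPoolExecutor(max_workers=cores) as pool:
        results = pool.map(chunk_update, split(entities, cores))
    updated = [e for chunk in results for e in chunk]
    print(f"updated {len(updated)} entities across {cores} cores")

The hard part in a real engine is that frame work is rarely this neatly independent, which is exactly why so many engines still lean on one or two threads.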
 

Groenteman

New member
Mar 30, 2011
120
0
0
As someone rocking 1440p at 114Hz with the GPU brawn to back it, I find this discussion simply adorable!

*sips wine*

In all sincerity though, framerate is pretty noticeable. I immediately notice when my graphics driver decides to mess with me and switches back to 60Hz, and frame drops below 60 actively annoy me.

And even if it's not immediately noticeable, details add up. Bits of resolution, framerate, lighting, shading, post-processing, etc. add up to the difference between lifelike and a plastic-y, stuttering mess.

No, not everyone will immediately notice different framerates, but consoles are supposed to be family machines and should not be designed with only grandpa who refuses to wear his reading glasses in mind.
 

DrOswald

New member
Apr 22, 2011
1,443
0
0
TiberiusEsuriens said:
There is yet one more compounding factor to the debate: we have gotten to a point where increased resolutions and texture sizes give us increasingly diminishing returns (i.e., spending tons of memory and processing power for teeny-tiny boosts), and all people ever talk about is how important those tiny boosts are, while ignoring the truly groundbreaking GPU additions.

The biggest changes we will see in graphics engines in the near future are post-processing and rendering effects. This includes fancy lighting, reflections, smoke, particle effects, anti-aliasing, ambient occlusion, tessellation... terms that most gamers may have heard of but never knew what they meant (because no single effect is a GIGANTIC change). These are the features that set apart the true 'next-gen' game engines such as Crytek's CryEngine. These effects are also at the root of recent major gaming news such as the Watch_Dogs downgrade.

CryEngine 3 can give a medium-spec machine such a pretty picture not because of high resolution or fps, but because every single element mentioned above was considered in the engine's design from the beginning, giving the devs time to optimize rendering for them. Watch_Dogs got downgraded because the engine was designed to print a basic image to the screen at a solid fps. They then threw in some of these effects and cranked them up to 11 to see how pretty it could look at E3, only to realize months later that the engine was never designed to handle all of them at once. You can only optimize so much when you don't take these effects seriously into account from the start - when they aren't included in the ground floor. These features become a tiny afterthought when they should be integral, and not a single one of them is a game-breaker by itself.

The same happened with Assassin's Creed 4: it was built on the Screed 3 engine and had some pretty lights tacked on that it was never meant to handle (Screed 3 was based off of Screed 2, etc.). The result: turning god rays on in Black Flag tanks your fps. While Ubisoft is known for having terribly optimized games, this is a broader trend across most companies. Tomb Raider trumpeted TressFX for Lara's hair, but how did the game handle it? The result: turning TressFX on in TotalBiscuit's mega rig cut his fps by over half. The new inFamous ran at 30fps but still looked better than many games that can run at 60, because they dedicated early development to lighting and reflections. The game had some issues, but very few people complained about it not looking good.

CryEngine 3 is considered one of the truest "next gen" engines, and it was released FIVE YEARS AGO. Just sayin'.
One note: software is often created long before it can be practically used, especially in an area like graphics rendering. CryEngine 3 may have been released 5 years ago, but that says literally nothing about the kind of hardware required to run it - I am willing to bet that 5 years ago nothing readily available to consumers would have been able to run a CryEngine 3 game that took full advantage of the engine's features.

We have had ray tracing for years, but there is only one computer in the world, a $500,000 supercomputer at the University of Texas, that can run it in real time.

How "next gen" a piece of software is cannot be determined by when it was created, and when the engine was created often does not indicate what kind of power is required to run it.
 

Steve the Pocket

New member
Mar 30, 2009
1,649
0
0
Personally I think we should expect 1080p at minimum because it's two thousand goddamn thirteen and it's pretty ridiculous if these much-hyped systems can't even match the specs of the televisions we hook them up to, televisions that a lot of us bought before the previous generation came out. Can you imagine if a video game system from 1998 wasn't able to achieve 640p? They'd be laughed out of the industry.

I don't know where I stand on framerate. I know it's pretty distracting when TF2 dips below about 45, even if I'm just flying around scoping out a new map that I just downloaded, and I imagine it's not good for my reaction time at levels far above that. But my other games, even the newer ones that surely can't hit that level, look fine to me at whatever framerate they happen to run at (Source is the only engine where I have any clue how to check). They're all single-player, granted, and frankly if you're into twitch action multiplayer, the fact that console players are stuck using a gamepad is going to hurt their performance a lot more than framerate issues.
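
For some rough numbers on why 1080p isn't just a free checkbox, here's a quick back-of-the-envelope snippet of my own (nothing from the article, just arithmetic):

# Rough pixel-count comparison between common render resolutions.
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the work of 720p)")
# 1080p is ~2.25x the pixels of 720p, so the GPU has to shade more than
# twice as many pixels every single frame to match the TV's native spec.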
 

Alpha Maeko

Uh oh, better get Maeko!
Apr 14, 2010
573
0
0
If Call of Duty ran at 30 fps, it would make a significant impact on most players who are used to having smooth motion and responsive actions. For twitchers that pop off headshots between blinks, it would be a nightmare.
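
Back-of-the-envelope, and these are simplified numbers of my own rather than anything from a specific engine:

# Frame time and a crude worst-case input-to-display delay at various framerates.
# Assumes your input just missed a frame and then needs one full frame to show up;
# real pipelines add more delay on top of this.
for fps in (30, 60, 120):
    frame_ms = 1000.0 / fps
    worst_case_ms = 2 * frame_ms
    print(f"{fps} fps: {frame_ms:.1f} ms per frame, "
          f"~{worst_case_ms:.1f} ms worst-case input-to-screen")
# 33.3 ms per frame at 30 fps vs 16.7 ms at 60 fps -- half the framerate means
# the game shows you the world, and reacts to you, a frame-and-a-bit later.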
 

Rad Party God

Party like it's 2010!
Feb 23, 2010
3,560
0
0
*sigh* I'd like to jump into this lovely "framerate/resolution" debate, but at the end of the day... it's just meaningless.

I'm one of those people who CAN tell the difference between a 16-bit and a 32-bit picture (like the one in the article), I CAN tell the difference between 60 and 30 fps (anything above 60 flies over my head, I just don't notice it), and I CAN tell the difference between 720p and 1080p, and yet... I don't care about all that fluff, as long as the gameplay is smooth enough.
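
(If anyone wants the raw arithmetic behind the 16-bit vs 32-bit thing, here's a quick throwaway snippet of mine, nothing official:)

# Distinct colors available at common color depths.
# "32-bit" on screen is usually 24 bits of color plus 8 bits of alpha.
depths = {"16-bit (5-6-5)": 2**16, "24-bit color (within '32-bit')": 2**24}
for name, count in depths.items():
    print(f"{name}: {count:,} colors")
# 65,536 vs 16,777,216 colors -- the 16-bit image shows visible banding in
# smooth gradients (skies, fog), which is what you spot in comparison shots.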

I play Dark Souls unmodded (!), so it's 720p (which in reality is even lower than that) and 30 fps, but I still love the game regardless because of how smooth it plays. In my early days of PC gaming, I was stuck at 800x600 with framerates lower than 30 fps, and I still considered that playable enough.

It wasn't until the first F.E.A.R. that I started to learn the difference between framerates (thanks to its built-in benchmark). I found that anything above 15 fps was still pretty playable to me; 20 fps and up was heaven.

Heck, I still play some PS1/N64 games hooked up to my old CRT TV and they run perfectly fine! Freaking Ocarina of Time and Majora's Mask are stuck at 20 fps and I still love those games!

I could be here all day, but as I said, it's just meaningless to me at the end of the day.