Poll: 60fps vs 30fps? opinions?

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Patathatapon said:
SquallTheBlade said:
I don't remember saying it didn't affect casual play.
Sure, you didn't literally say this word for word but you said exactly the equivalent when you said that it only affects competitive play. Casual is seen as "not competitive", thus, according to what you said, it would not be affected.

Patathatapon said:
But apparently we can't, so 60fps can go suck a dick.
You've yet to prove that. As far as I can find, there is exactly one game that has 60 FPS, no local co-op, and some claimed link between the two, namely Halo 5. And the "link" between 60 FPS and the lack of local co-op is somebody saying that's the reason. If the guy said it's because of aliens, would you be here saying "I'm fine with aliens, but since they are the reason we can't have co-op, the aliens can go suck a dick"? Because it does sound like a rather lazy excuse.

Especially since, after actually looking it up, it seems Halo 5 co-op requires an Xbox Live Gold subscription. Which is, I am led to believe, paid. I somehow see a bigger correlation between that and no local co-op than between 60 FPS and no local co-op. Ah, but I must be imagining things. Sure, let's just blindly trust what anybody says online.

Speaking of, I happen to be a Nigerian prince and have some Rolexes and Viagra for sale. Very cheap.
 

Patathatapon

New member
Jul 30, 2011
225
0
0
DoPo said:
Patathatapon said:
SquallTheBlade said:
I don't remember saying it didn't affect casual play.
Sure, you didn't literally say this word for word but you said exactly the equivalent when you said that it only affects competitive play. Casual is seen as "not competitive", thus, according to what you said, it would not be affected.
If you want to get down to specifics then fine. I said that you can only argue it's NECESSARY in competitive play. Now, it's a nice thing to have, sure, but is it required? No. It's not required. Our eyes stop seeing the difference at 24-30, our mind stops seeing the difference around 300. Maybe we should all ***** about why more games don't have 120-300 fps instead? We obviously NEED it for casual/competitive play, even if we only end up with still images in Atari graphics by the end of the day.

DoPo said:
Patathatapon said:
But apparently we can't, so 60fps can go suck a dick.
You've yet to prove that. As far as I can find, there is exactly one game that has 60 FPS, no local co-op, and some claimed link between the two, namely Halo 5. And the "link" between 60 FPS and the lack of local co-op is somebody saying that's the reason. If the guy said it's because of aliens, would you be here saying "I'm fine with aliens, but since they are the reason we can't have co-op, the aliens can go suck a dick"? Because it does sound like a rather lazy excuse.

Especially since, after actually looking it up, it seems Halo 5 co-op requires an Xbox Live Gold subscription. Which is, I am led to believe, paid. I somehow see a bigger correlation between that and no local co-op than between 60 FPS and no local co-op. Ah, but I must be imagining things. Sure, let's just blindly trust what anybody says online.
True, someone online said it, and online multiplayer has always needed Xbox Live. Allegedly, 343 said they couldn't put it in due to the graphics, which makes sense. But all previous Halo games had local co-op (no clue on Halo 4, but I think it did).

A processor can only handle so much, and rendering two different screens at once at 60FPS simultaneously sounds like a lot of work, wouldn't you say? The Xbox One would likely combust from the effort! (Not literally, I don't need you complaining about that now).

I invite you to prove me wrong then. Give me another Xbox One/PS4 game with something similar to Halo's split-screen mode (not Mortal Kombat or Smash Bros., which use the same screen for all players) and then I will state that I am wrong and you win the argument. You can enjoy whatever that is worth.



Edited because I suck at splitting posts.
 

MysticSlayer

New member
Apr 14, 2013
2,405
0
0
Strazdas said:
Your link doesn't work, so thanks for taking the time to quote it. Gamasutra apparently has itself fallen for the misconception that responsiveness is about human response time; it is not. It is about the game's response time. Yes, humans can and do both see and feel when, after pressing a button, the game responds faster or slower, even in those amounts. In fact, the reason any monitor that has a response time above 5ms is considered "shit for gaming" is that the lag introduced by a monitor of even 6ms makes game controls uncomfortable, and the numbers you are talking about are 5 times higher than that. Of course purists will go for 1ms TN panels, but in general the 5ms ones are acceptable.
Actually, the article was trying to dispel that notion. It was pointing out that the issues with responsiveness are often introduced by bad coding practices, such as taking too long on certain points of logic, improperly ordering how the logic is carried out, and so on. Other factors would be bad animations (which may be where your issue is coming from) and similar artistic decisions.

Overall, the frames should be able to be drawn fast enough for most people never to pick up on it. I know that those who do a lot of competitive gaming and/or play a lot of difficult twitch games may train themselves to be faster, but that's going beyond what a vast majority of the population would pick up on.
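For reference, here's the raw frame-time arithmetic behind all of this (a toy calculation; the latency model is an assumption for illustration, not a measurement of any particular game, engine or monitor):

```python
# Frame-time arithmetic behind the responsiveness debate.
# The latency model below is a rough illustration, not a measurement
# of any specific game, engine or monitor.

def frame_time_ms(fps):
    """Time available to produce one frame, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

def button_to_screen_ms(fps, input_poll_ms=8, buffered_frames=1, panel_ms=5):
    """Crude input-to-photon estimate: input sampling + one frame of game
    logic/rendering + assumed display buffering + panel response."""
    return input_poll_ms + frame_time_ms(fps) * (1 + buffered_frames) + panel_ms

for fps in (30, 60):
    print(f"~{button_to_screen_ms(fps):.0f} ms button-to-screen at {fps} fps (toy model)")
```

Under those assumed pipeline stages, the 30-to-60 fps jump shaves a few tens of milliseconds off the button-to-screen delay, which is the range the argument is really about.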

As for your Stack Exchange link, it seems to support my statement above with "note that this and what you are talking about are not the same thing. The acceptable delay between making a motion with a mouse and seeing a result is almost certainly much smaller than the time it would take a human to react to something displayed on the screen with the mouse."
Are you talking about the response to the final comment?
 

Zio_IV

Not a Premium Member
Sep 17, 2011
178
0
0
Well, 60fps is objectively better than 30fps in every respect. There's no way having fewer frames per second improves a game on its own when compared to more. The reason I (and evidently many others in this thread) am still okay with lower fps is that it allows for better performance in other areas, but that in itself is not a point in 30's favor. That just means you're allowing a straight-up downgrade in one area for the sake of gaining improvements in others, and that's fine. Despite knowing all the benefits of framerate and how it actually works, I personally am still okay with playing games at lower framerates if it means having consistency in performance. As a number of people here have said themselves, "stable 30fps > inconsistent 60fps", and I for one am in agreement. The only thing that gets my goat is when people try arguing that there's an actual, tangible benefit to having fewer fps, because there isn't. You're just prioritizing other performance aspects over framerate, is all. Again, that's fine, and there's nothing wrong with having different priorities, but please don't say that it's "better".
(Please note I'm not necessarily addressing any one person in this thread, but just saying in general).

It's true that some game genres don't need the improvement in framerate, though higher will always be better even then. Having 60-120fps while I'm playing Civilization V doesn't have an impact on my ability to play the game, sure, but it will still be smoother than 30 regardless. For many games, however, higher fps doesn't just look smoother; those games genuinely benefit from the improved response time inherent to higher fps.

Just as an example: anyone who thinks a fraction of a second can't be consciously measured and accounted for has never played a fighting game at a competitive, arcade level. You're making multiple movements and decisions within the span of each second, so those games absolutely require that framerate. I will admit FGs are one of the more extreme examples, though. While a lot of game genres do require those fractional response times, FGs just happen to be a platform where that level of response is demanded of the player more constantly.

But anywho, yeah, more is better than less. Though there are reasons to allow for lower fps, one should never forget that you're sacrificing the better product for the sake of improving something else, rather than thinking lower fps provides any intrinsic benefits on its own.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
NPC009 said:
You're confusing specialised gameshops with retail as a whole. It's not the same thing. And do you have any idea how many people still have bad internet connections? I'm guessing you're from a European country where fiber is common, but a lot of consumers don't have great options. Heck, they may only have one shitty option. I heard that's pretty common in the US.
No need for that. Video game retail is dying all over. And yes, a small minority of people. Fiber is common across the entire world; the US is actually one of the worst countries when it comes to internet. Currently only Africa is worse.

Okay, so these people are rioting, but so quietly we don't even notice them? I'm not sure how that works.
No, some people are rioting. Most simply refuse to purchase inferior products.

When set at interlaced, a monitor only refreshes half the image. Instead of getting 50 new images a second, you're only really getting 25. I'm not entirely sure how many TVs actually supported 240p back then.
They were not set to interlaced, please read my post again. Also, in the 90s only severely outdated TVs were still on 240p. 360p was the standard TV broadcast resolution and that's what those TVs used. Of course, it was in 4:3 format back then.

That's the first time I heard about the curves being better. I don't know. Seems like a weak argument to me. Things like no input lag, native resolution (no scaling to mess things up) and such seem more important to me.
Then I suggest you look into it. It's about the viewing angles created by tube CRTs. But yes, the other things you mention are also important; however, they can also be achieved with a flat-screen CRT.

I looked up what ~$500 gets you at this point in time. I found this, alongside other lists, and this article (look at the $600 PC). Looking at what you can do with it, it seems to be on par with modern consoles, at best. Consoles and PCs are hard to compare anyway, because unlike a dedicated system, a PC always has an OS taking up room and power.
Here is a PC that will perform better than a console: http://pcpartpicker.com/p/DYDLMp Total: $456.71
The amount Windows 8/10 takes up while in the background of a game is so minuscule that it's not worth counting. This was a reasonable complaint around the turn of the century, when the OS did significantly slow down PCs, or when Vista launched with its horrible resource hunger, but that has not been the case for a long time.

That wasn't exactly my point. I'm a laptop user (work-related reasons, plus I like how it saves space on my desk at home). If I were to upgrade to a gaming PC now, I would have to invest in a proper display. I wouldn't want to use the TV in my living room as a monitor. Despite being fairly modest, it's still too big to use comfortably with a PC (unless that PC is purely used for watching series and playing games with a controller). Using a console with a PC display works a lot better, by the way.
The point is that you need a display for both a PC and a console, so when comparing them it is unfair to include the price of one and not the other. What your preferences are is another thing that does not come into the comparison. For example, I don't own a TV at all; I connect all my media to a monitor. Does that mean I would need to add the cost of a TV for a console but not for a PC? Of course not.

(700 dollars? I thought you were from Europe?)

I guess my estimate was almost spot-on, then. Look, decent framerate is relatively easy to obtain, as long as you're willing to sacrifice other things (lighting effects, draw distance, other fancy stuff). Well, and as long as the developer lets you - having a very minimal settings menu is a problem in some games. In any case, just saying you're getting a good number of fps isn't painting the whole picture. My example was extreme, but it is still the truth: my piece of low-tech shit can do 60fps. Just don't ask what games I'm playing and how I'm playing them.
I converted for easier comparison. Yes, I'm from Europe. The amount of money I've saved on games due to Steam sales, Humble Bundles and other sales has already cut half the price off that 700, though. When you constantly get to buy games at 80-85% off, it adds up quickly. I never sacrifice draw distance. I do sacrifice antialiasing, but consoles have no antialiasing to begin with, so it's hardly a sacrifice for comparison's sake. I never claimed that I max out all games; that was never my intention anyway. I do play at better quality than on consoles, though.

A set of Legos costs a few dozen bucks, a hundred if you want something big. A PC has hundreds of dollars worth of components. One might find that intimidating. Sure, in reality they're pretty easy to put together, but picking components (that involves a lot of research) and learning to put them together still requires time and commitment. Few people have fun with it. It's no wonder a lot of people would rather pay extra for a pre-built than take any risks. (Heck, I could have saved some money by building a PC for my parents, but it just wasn't worth the hassle. Ordering it online and having it sent to their place was much less time-consuming.)
There are many places on the internet, for example /r/buildmeapc on Reddit, where you can just give them a money target and what you will use it for, and they will pick parts that work together, if you are unable to watch a few YouTube videos and use sites like PCPartPicker that check for compatibility automatically. I can understand the intimidation. I felt it as well when I built my first PC, especially since I had saved that money for 2 years (I was 12 at the time) and if I fucked up I would probably be without a PC for another 2 years. I didn't fuck up, though; it turned out to be easier than expected. But yes, everyone is welcome to pay extra for others to do the job for them. That's the whole backbone of the services industry.

Keeping it running properly is kind of a big and important thing. If you're computer savvy, you'll probably avoid doing anything too stupid automatically, but a lot of people are not computer savvy. That's more or less the software side of things. The hardware side... Well, sometimes things go weird in the weirdest ways. My sister once had a laptop that would try to boot up, shut down midway, try to boot up again and so on. Took a while before I figured out it was trying to draw power from a near-dead battery instead of directly from the power supply. And then there's all the fun I've had with modems... Please, don't get me started on modems...
We are assuming you are using Windows here, since you yourself suggested it. It's pretty much idiot-proof by now, sadly at the expense of power users like me. It's really hard to fuck things up now unless you specifically go out of your way to find malware that does it, or are very, very unlucky, or stupid enough to open every email attachment. Yes, viruses exist, but modern antiviruses can basically be put in set-it-and-forget-it mode. Well, your sister must have murdered that battery, then. I've got my 6-year-old laptop sitting by my side as I type this, and while the battery only lasts 1 hour now, it's enough to boot up. Also, laptop troubleshooting 101 is to first try removing the battery, which forces the laptop to draw directly from the power supply. Most do that by default anyway when power is connected, but that wasn't the case in the past.

Yes, modems are a pain in the ass sometimes. When I had only one PC I skipped modems entirely and had a direct line connection to the PC. Best thing ever. However, sadly, since I now have 4 devices connected (two mine, two my cousins'), I need to have a router, which fucks up once every 3 months or so. A simple restart fixes it, though. But I remember the router I had back in 2003 was really annoying me.

Again, that doesn't mean much. I'm playing games released in 2015. Half of them look like 16-bit games, and the other half isn't much more impressive, but yeah, games from 2015... yay.
I guess I should have specified: I don't mean those small retro games. I enjoyed Cities: Skylines a lot, for example.

It seems to me that PC gaming is most fun if you're interested in the tech involved. I kinda get that putting your own system together and figuring out how to get the most out of it gives a sense of accomplishment. But, it's not what interests me, personally. It just seems like an exhausting rat race to me.
I would argue that all hobbies are more fun if you are interested in how the things they revolve around are made.

It seems to me that is more a result of many games being rushed out of the door - poorly optimised and filled with bugs (I heard the console versions of Just Cause 3 have some really fun memory leaks that end up adding whole minutes to loading times after just an hour of play). Nintendo has the least amount of power to work with, but they're releasing some of the best looking and most stable games this generation.

More raw power will not solve most of the issues you see on consoles. Actually, I think it might make developers more complacent, and that wouldn't be good for PC gamers either. Poorly programmed console games are not a good foundation for ports.
Here we agree: it's the stupid choices by developers that are more at fault than the lack of power. I don't agree with Nintendo games being the best-looking ones, though. You will probably argue aesthetics, and that's personal, so I won't go there. From a technical perspective, though, they are lagging behind.

DirectX 12 won't offer more raw power; it will offer better ways to utilize the power already there. Developers will of course have to specifically code their games to use these benefits. I think most will, though, because the limitations DirectX 12 is removing are ones that developers at AAA studios have been complaining about for a few years now.

It'd be great if it were that simple. Of course you need a good game, but that alone is no guarantee for success. Undertale was one of many quirky retro-style games released in 2015, but it was also one of the few fortunate enough to be noticed by the right people (and that didn't happen without at least some marketing). It wouldn't have exploded the way it did if it hadn't.

If you ever have a day to spare, go spelunking. Don't just stick to the depths of Steam, check sites such as rpgmaker.net as well. You'll be surprised by the gems hidden there. If you want a starting point: Off. If you liked Undertale, this RPG Maker game from 2008 may be to your liking.
Gamers are very passionate and love to share good games, so yes, often it is that simple. Note that being part of a small niche does not make a game great on its own. Thanks for the offers, but I will not be taking them. I personally do not like that style of game, but I can recognize a good one when it comes around. I'm glad there are people that enjoy them, but I personally don't.





Higgs303 said:
I was referring to film specifically. Cinematographers use all sorts of outdated filming methods and technology for artistic purposes, sometimes it is effective other times it is not. However, for 3D applications, I would agree that a lower framerate like 30 FPS has no conceivable aesthetic value compared to 60 FPS.
Ah, I can see your point, although I have seen this done well only extremely rarely.

Higgs303 said:
I am not so sure about that. For the most recent games, only the most expensive GPUs like the GTX 980 Ti, Titan X and R9 Fury X can hope to even approach the refresh rate of a 120Hz monitor at 1080p (they max out at about 100 FPS on average). For higher resolutions like 4K, the high end dual GPU configurations that are required to attain even 60FPS are well beyond the budget of most PC gamers. I don't think there are 4k monitors with more than a 75Hz refresh rate on the market right now. IMO, most PC gamers will choose to move on to higher resolutions well before a framerate higher than 60 FPS becomes the new standard. Nvidia's Pascal architecture and AMD's Arctic Islands architecture may pack enough punch to allow for reasonably priced single GPU builds that are capable of 4K/60FPS, but I don't see the standard moving beyond that for some time.
This is true, which is why the discussion isn't as open-and-shut as the 30 fps one. There is a 120Hz 4K monitor on the market, released last year. Personally, I think most of the discussion nowadays is about 1080p@120Hz vs 4K@60Hz. I'm in the camp of the first one, but I can understand the people who are in it for the second one.

MysticSlayer said:
Actually, the article was trying to dispel that notion. It was pointing out that the issues with responsiveness are often introduced by bad coding practices, such as taking too long on certain points of logic, improperly ordering how the logic is carried out, and so on. Other factors would be bad animations (which may be where your issue is coming from) and similar artistic decisions.

Overall, the frames should be able to be drawn fast enough for most people never to pick up on it. I know that those who do a lot of competitive gaming and/or play a lot of difficult twitch games may train themselves to be faster, but that's going beyond what a vast majority of the population would pick up on.

As for your Stack Exchange link, it seems to support my statement above with "note that this and what you are talking about are not the same thing. The acceptable delay between making a motion with a mouse and seeing a result is almost certainly much smaller than the time it would take a human to react to something displayed on the screen with the mouse."
Are you talking about the response to the final comment?
The article was trying to dispel a notion that gamers never had? Interesting; exactly the thing I'd expect from Gamasutra, really. They seem to be detached from actual gamers all the time. Though, like I said, the link didn't work, so I couldn't read the whole thing, only the parts you quoted. Yes, the logic in the game is important and should be taken into account; for example, see the tick rate problems in Battlefield 4.

As was shown by the 60Hz vs 120Hz experiment, even 60 fps is too low to meet the criterion of most people not being able to pick up on it, so that is clearly not the case with 30 fps either.

I'm talking about the most upvoted response in there, if I remember correctly.
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
NPC009 said:
When set at interlaced, a monitor only refreshes half the image. Instead of getting 50 new images a second, you're only really getting 25. I'm not entirely sure how many TVs actually supported 240p back then.
No, you get half the image quality at the same FPS; if the FPS went down to half, that is a separate limiting process not tied to interlacing.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Patathatapon said:
Our eyes stop seeing the difference at 24-30
[citation needed]

Because our eyes do see the difference even beyond that. As proof: otherwise we wouldn't have this thread. Among other things.

Patathatapon said:
Maybe we should all ***** about why more games don't have 120-300 fps instead?
Not "*****", as that implies unjust complaint, however, we could certainly be asking this more often.

Patathatapon said:
A processor can only handle so much, and rendering two different screens at once at 60FPS simultaneously sounds like a lot of work, wouldn't you say?
It depends on how they would go about it.

Patathatapon said:
I invite you to prove me wrong then.
I invite you to first prove that it is literally impossible even if the developers actually try to do it. It's that thing called burden of proof.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Strazdas said:
No need for that. Video game retail is dying all over. And yes, a small minority of people. Fiber is common across the entire world; the US is actually one of the worst countries when it comes to internet. Currently only Africa is worse.
Yeah, no [https://en.wikipedia.org/wiki/List_of_countries_by_Internet_connection_speeds].

No, some people are rioting. Most simply refuse to purchase inferior products.
So, where are these people rioting? Are they a big enough group to matter?

They were not set to interlaced, please read my post again. Also, in the 90s only severely outdated TVs were still on 240p. 360p was the standard TV broadcast resolution and that's what those TVs used. Of course, it was in 4:3 format back then.
Well, I admit I was playing on an old TV from the 70s for most of the 90s, so I might be remembering things wrong. Didn't really get up to date until ~1998, just in time to start reaping the benefits of 60Hz. Even then I was never ahead of the curve. On the other hand, I wasn't really behind either.

BTW wasn't PAL, the broadcasting standard, 576i?

Then I suggest you look into it. It's about the viewing angles created by tube CRTs. But yes, the other things you mention are also important; however, they can also be achieved with a flat-screen CRT.
Wait, I think I get it: the reason flat CRTs never come up with the people I speak to is that they're kinda ignored. Most people I know just grab nice TVs from the 80s and 90s (because those are cheapest), and those are generally curved. Hm, now that I think about it, my boyfriend did have a very nice flat 100Hz CRT. That one didn't cause any distortions. Wouldn't quality play a big role? There are shitty TVs in all shapes and sizes.

Here is a PC that will perform better than a console: http://pcpartpicker.com/p/DYDLMp Total: $456.71
The amount Windows 8/10 takes up while in the background of a game is so minuscule that it's not worth counting. This was a reasonable complaint around the turn of the century, when the OS did significantly slow down PCs, or when Vista launched with its horrible resource hunger, but that has not been the case for a long time.
I'm not sure the i3 is the right way to go for a gaming PC. Sure, the hyper-threading somewhat makes up for it being a dual-core and it's a good price, but wouldn't an i5 be a little more future-proof? And isn't 7200 RPM considered slow for a hard drive? It seems to me that this system is about getting as much for your money as possible, but doesn't exactly provide the most comfortable experience.

And yep, I know the dark days of Vista have passed, but OSes do take up a relatively big amount of resources. (Though I have to admit some modern consoles are a bit weird. IIRC the Wii U sets aside 1GB of its RAM for system use alone.)

The point is that you need a display for both a PC and a console, so when comparing them it is unfair to include the price of one and not the other. What your preferences are is another thing that does not come into the comparison. For example, I don't own a TV at all; I connect all my media to a monitor. Does that mean I would need to add the cost of a TV for a console but not for a PC? Of course not.
I think you only need to add the cost when you don't have a suitable screen. I don't have a monitor, so that's something I would have to buy if I decided to get a PC. If you're happy with your monitor, you don't need a TV, meaning a console is relatively cheap for you.

I converted for easier comparison. Yes, I'm from Europe. The amount of money I've saved on games due to Steam sales, Humble Bundles and other sales has already cut half the price off that 700, though. When you constantly get to buy games at 80-85% off, it adds up quickly. I never sacrifice draw distance. I do sacrifice antialiasing, but consoles have no antialiasing to begin with, so it's hardly a sacrifice for comparison's sake. I never claimed that I max out all games; that was never my intention anyway. I do play at better quality than on consoles, though.
Hey, consoles have sales, too. And they can be pretty good, actually. Just like on Steam and GoG, 70% isn't that unusual. Some games might even be cheaper on consoles than PC, depending on the sales. Since I've got a bit of everything, I like to compare :)

Consoles can do anti-aliasing, BTW. But just like with PC games, whether it's used varies from game to game. The main difference being, of course, that on consoles the developer decides, and on PC the consumer gets a say in it (which makes sense, because there's such a huge variety in hardware used).

There are many places on the internet, for example /r/buildmeapc on Reddit, where you can just give them a money target and what you will use it for, and they will pick parts that work together, if you are unable to watch a few YouTube videos and use sites like PCPartPicker that check for compatibility automatically. I can understand the intimidation. I felt it as well when I built my first PC, especially since I had saved that money for 2 years (I was 12 at the time) and if I fucked up I would probably be without a PC for another 2 years. I didn't fuck up, though; it turned out to be easier than expected. But yes, everyone is welcome to pay extra for others to do the job for them. That's the whole backbone of the services industry.
That involves trusting random people on the internet, though. Not everyone is eager to do so. If you aren't very computer savvy, just finding trustworthy sources can be a challenge.

(I learnt my way around PCs with outdated systems and spare parts - I would have pissed my pants if I'd had to fiddle with anything less expendable! New parts are still a little scary. Last time I decided to plop in some more RAM - literally three minutes of work - I had to take a deep breath before I dared to even start. Which was kind of silly, because finding the right RAM was actually the greater challenge.)

We are assuming you are using Windows here, since you yourself suggested it. It's pretty much idiot-proof by now, sadly at the expense of power users like me. It's really hard to fuck things up now unless you specifically go out of your way to find malware that does it, or are very, very unlucky, or stupid enough to open every email attachment. Yes, viruses exist, but modern antiviruses can basically be put in set-it-and-forget-it mode. Well, your sister must have murdered that battery, then. I've got my 6-year-old laptop sitting by my side as I type this, and while the battery only lasts 1 hour now, it's enough to boot up. Also, laptop troubleshooting 101 is to first try removing the battery, which forces the laptop to draw directly from the power supply. Most do that by default anyway when power is connected, but that wasn't the case in the past.
For compatibility reasons, Windows seems like the most obvious choice to me. I do kinda get the appeal of Linux, though. Lots of freedom to make it run the way that's best for your set-up.

I think that battery died simply because it was shitty and couldn't handle being recharged so often. Of course, the real culprit is the laptop (and the people who designed it) - the battery wouldn't have died so soon if the laptop hadn't constantly drawn power from it. My own laptop was nearly five years older at the time, and while the battery life had certainly degraded (in the end, I got 15 minutes out of it - at max), it hadn't been abused to death. We're talking early-to-mid 00s, by the way - things were a little worse back then.

Yes, modems are a pain in the ass sometimes. When I had only one PC I skipped modems entirely and had a direct line connection to the PC. Best thing ever. However, sadly, since I now have 4 devices connected (two mine, two my cousins'), I need to have a router, which fucks up once every 3 months or so. A simple restart fixes it, though. But I remember the router I had back in 2003 was really annoying me.
For a while, I had one that fucking hated Vista. Sadly, there was no easy fix for that piece of shit. With lots of googling I found other people with the exact same problems, and the solution turned out to be switching to WEP security. My DS didn't complain, but sheesh...

I guess I should have specified: I don't mean those small retro games. I enjoyed Cities: Skylines a lot, for example.
Looking at the recommended requirements and assuming you meet those, you're probably also meeting the minimum system requirements of many triple-A games. That's indeed not too bad.


I would argue that all hobbies are more fun if you are interested in how the things they revolve around are made.
That's true. But the way PC gaming and PC tech are intertwined seems a bit strange to me, as console gaming is pretty much all about the games. I mostly just really like games.

Here we agree: it's the stupid choices by developers that are more at fault than the lack of power. I don't agree with Nintendo games being the best-looking ones, though. You will probably argue aesthetics, and that's personal, so I won't go there. From a technical perspective, though, they are lagging behind.
It's no secret the Wii U is the weakest of the bunch (though it's actually not as bad as some people assume), but Nintendo does a great job of making you forget you're playing on relatively low-tech hardware. Their own games run incredibly well, and even something like Xenoblade X, in which a lot of sacrifices were made to create a huge, seamless world, draws you right in once you start playing. Hardware limitations are a lot more noticeable in the PS4 games I play, because half of them are just kinda poorly put together.

DirectX 12 won't offer more raw power; it will offer better ways to utilize the power already there. Developers will of course have to specifically code their games to use these benefits. I think most will, though, because the limitations DirectX 12 is removing are ones that developers at AAA studios have been complaining about for a few years now.
I meant the problems seen in console gaming (and gaming in general, really) can't be solved by just releasing more powerful consoles. The extra power can cover up only so much poor programming. The advantage PC gaming has here is that users can improve their own hardware where needed, and the most savvy ones can even put out mods to make the game run better. Still, this is users trying to fix problems that shouldn't be there, regardless of the hardware being developed for.

Gamers are very passionate and love to share good games, so yes, often it is that simple. Note that being part of a small niche does not make a game great on its own. Thanks for the offers, but I will not be taking them. I personally do not like that style of game, but I can recognize a good one when it comes around. I'm glad there are people that enjoy them, but I personally don't.
Finding good games can be tricky, though. Unless you're willing to sacrifice a lot of time and/or money, there are obvious limitations to the number of great games you can discover on your own. Most people have to rely on critics and Youtube personalities to find new games to play. And how do these people find games? Publishers often approach these people themselves (fortunately, this rarely goes beyond 'here's a download code, enjoy'). If a developer does not promote their game, it's unlikely it will be discovered by people who can properly introduce it to the world.

Look at something like Dark Souls. The series gained a small following with Demon's Souls, and the people who played that started getting hyped up for Dark Souls, with journalists pushing out previews and reviews. That game gained a bigger cult following, which just kept growing because more and more people gave it a shot and loved it. Now it's a big deal.

I got to contribute to a few small explosions myself (Virtue's Last Reward) and was caught up in a few blasts as well (Spec Ops: The Line).
 

Odbarc

Elite Member
Jun 30, 2010
1,155
0
41
My monitor allows for 144hz so I definitely prefer higher frame rates. Graphics are generally overrated.
 

Patathatapon

New member
Jul 30, 2011
225
0
0
DoPo said:
Patathatapon said:
Our eyes stop seeing the difference at 24-30
[citation needed]

Because our eyes do see the difference even beyond that. As proof: otherwise we wouldn't have this thread. Among other things.
This is what you call someone trying to nitpick the difference between the two. I just meant it seems smooth at around that range; films and games have run at 24/30 fps for a long time now. Apparently 16 was the minimum back in the day. Here, I'll even show the dumb article I found.

https://www.quora.com/Why-do-movies-look-smooth-at-24-fps-but-video-games-look-terrible-at-24-fps-Is-it-because-of-motion-blur

Is it credible? Hell if I know. You asked for a link, that's what I'm giving you.

DoPo said:
Patathatapon said:
A processor can only handle so much, and rendering two different screens at once at 60FPS simultaneously sounds like a lot of work, wouldn't you say?
It depends on how they would go about it.
So how would they go about it? That is the biggest cop-out answer I've seen. As I said, for something like Mortal Kombat or Smash Bros. (both are console games that can run at 60 FPS) it isn't too hard, because it's all just one screen and, for that matter, isn't THAT complex. Meanwhile, in a game like Halo 5, running two COMPLETELY different screens in first-person perspective is quite a bit already. That's not to mention high-end graphics, AI, or anything else, which is also a bit of a strain. Adding 60 FPS for each different screen is hard. The developers can't change the hardware on the Xbox One. If they made it so both players had to share a screen, sure, they could do that, but that would be stupid for a first-person shooter.

Patathatapon said:
I invite you to prove me wrong then.
I invite you to first prove that it is literally impossible even if the developers actually try to do it. It's that thing called burden of proof.

Well I'm not a developer so I can't prove it to you. All I have is what I hear. You just proved you can't prove me wrong, and it's impossible for me to prove myself right. So until I can be proven wrong, we're at a stalemate in this area, aren't we? Doesn't mean you win by default.


My last words on this, since I'm not posting again after this: just to piss everyone off, I might as well mention that neither I nor my older brother can tell a big difference between 30fps and 60fps. Side by side, maybe we could figure it out, but not just from the sight of one. Anyway, I'm not coming back, if you take that as a victory, then fine. Enjoy.
 

DoPo

"You're not cleared for that."
Jan 30, 2012
8,665
0
0
Patathatapon said:
So how would they go about it?
By reducing the amount of computing power needed to get 60 FPS. There are various ways to do that - various ways that are employed already. Lowering video quality is the obvious one, but there are also tricks that can be done in relation to screen space - since you're showing two views, you can show less in each and/or hide some details that would not be strictly needed. There should also be some lower-level optimisations that would just improve how the game performs.

And then, the implementation could simply mandate running at 30 FPS when in split screen. There isn't a technical reason why that cannot be done. Well, some people would complain, but not you.
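To put some (entirely made-up) numbers on that, here's roughly the kind of frame-budget math I mean; a sketch of the general idea, not anything from an actual engine:

```python
# Back-of-the-envelope frame budget for split-screen at 60 fps.
# All numbers are invented for illustration; they are not Halo's
# (or any real engine's) actual costs.

FRAME_BUDGET_MS = 1000.0 / 60   # ~16.7 ms to do everything, every frame
SHARED_COST_MS = 4.0            # assumed work done once per frame regardless of
                                # how many views there are (game logic, AI, audio, ...)

single_view_budget = FRAME_BUDGET_MS - SHARED_COST_MS
split_view_budget = (FRAME_BUDGET_MS - SHARED_COST_MS) / 2
print(f"render budget per view: {single_view_budget:.1f} ms solo, "
      f"{split_view_budget:.1f} ms in split screen")

# One common lever: render each split-screen view at reduced resolution.
full_pixels = 1920 * 1080
scale = 0.7                     # ~70% per axis is roughly half the pixels per view
print(f"pixels per view: {int(full_pixels * scale * scale):,} "
      f"instead of {full_pixels:,}")
```

Halving the per-view pixel count, cutting effects and dropping some detail is exactly the kind of lever developers already pull; whether it's enough for any specific game is their call, not a law of physics.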

Patathatapon said:
Adding 60 FPS for each different screen is hard. The developers can't change the hardware on the Xbox One.
They can change what computations they throw at said hardware, though.

Patathatapon said:
Well I'm not a developer so I can't prove it to you. All I have is what I hear.
You believe what you hear from one person. I am also one person, yet you don't seem to listen.

Patathatapon said:
You just proved you can't prove me wrong
No, what I just proved is that you don't know what you're talking about but you are adamant in your lack of understanding.

Patathatapon said:
and it's impossible for me to prove myself right. So until I can be proven wrong, we're at a stalemate in this area, aren't we?
That means you began your argument from a position of ignorance and somehow believe that your claims have any weight to them despite that.

Patathatapon said:
Doesn't mean you win by default.
It took me literally two minutes to find out that Borderlands: The Handsome Collection both runs at 60 FPS on Xbox One and has local split-screen co-op. I don't know if the co-op works at 60 FPS, but that's sufficient proof that your claim was wrong - you can have a game that runs at 60 FPS, and that game can have local co-op.

Patathatapon said:
Anyway, I'm not coming back, if you take that as a victory, then fine. Enjoy.
Cya. Feel free to go and put blind faith in somebody else, then form completely baseless beliefs around them, and come and share your misguidedness with us again.
 

Bad Jim

New member
Nov 1, 2010
1,763
0
0
Strazdas said:
No, DOOM had no frequency cap, though when it released most machines could not run it at 60 fps well.
Sorry, as much as I admire your heroic defense of high framerates, the Doom engine is indeed capped at 35fps. http://doomwiki.org/wiki/Uncapped_framerate

Try running it in DOSBox with output=OpenGL and you can see the framerate with Fraps.

Perhaps you have been fooled by source ports like ZDoom that remove this cap?
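For anyone curious, this is roughly what a fixed-tic cap looks like (a minimal sketch of the general idea, not the actual Doom source):

```python
# Minimal sketch of a fixed-tic game loop capped at 35 Hz, in the spirit of
# Doom's "tics". Illustration only; this is not code from the Doom engine.
import time

TICRATE = 35                 # world updates per second
TIC_SECONDS = 1.0 / TICRATE  # ~28.6 ms per tic

def run(update, render, duration_s=2.0):
    start = time.perf_counter()
    next_tic = start
    while time.perf_counter() - start < duration_s:
        update()                  # advance the game state by one tic
        render()                  # drawing once per tic is what caps the fps at 35
        next_tic += TIC_SECONDS
        remaining = next_tic - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining) # wait out the rest of the tic

run(update=lambda: None, render=lambda: None)
```

Source ports that "uncap" the framerate keep the 35 Hz world tics but draw extra interpolated frames in between, which is why they look smoother without changing the simulation.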
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
NPC009 said:
Yeah, no [https://en.wikipedia.org/wiki/List_of_countries_by_Internet_connection_speeds].
According to this list, 80% of the United States has an internet connection capable of downloading games. Note that the remaining 20% includes people that don't know or care about the internet, so the real number of people that have to rely on retail is minuscule.

So, where are these people rioting? Are they a big enough group to matter?
Big enough for Sony to get sued [http://www.eurogamer.net/articles/2014-08-07-sony-is-being-sued-for-killzone-failing-to-deliver-native-1080p], as well as EA [http://www.pcgamer.com/investors-sue-ea-over-quality-of-battlefield-4/]. Those people riot mostly on the internet. And yes, they are a big enough group, given that they are the majority of customers.

Wait, I think I get it: the reason flat CRTs never come up with the people I speak to is that they're kinda ignored. Most people I know just grab nice TVs from the 80s and 90s (because those are cheapest), and those are generally curved. Hm, now that I think about it, my boyfriend did have a very nice flat 100Hz CRT. That one didn't cause any distortions. Wouldn't quality play a big role? There are shitty TVs in all shapes and sizes.
Quality is most definitely a factor, both back then and now, though I cannot magically say whether the TV you had as a kid was good quality or not, so I avoided that factor. The thing with most games from that time is that they were designed in a way that would use the low-quality systems most homes had to their benefit, and quality mattered much more for things like image editing than for games, compared to nowadays, where image quality for gaming is more important. That being said, due to flat CRTs' late arrival on the TV scene, the curved ones do tend to be higher quality in general.

I'm not sure the i3 is the right way to go for a gaming PC. Sure, the hyper-threading somewhat makes up for it being a dual-core and it's a good price, but wouldn't an i5 be a little more future-proof? And isn't 7200 RPM considered slow for a hard drive? It seems to me that this system is about getting as much for your money as possible, but doesn't exactly provide the most comfortable experience.

And yep, I know the dark days of Vista have passed, but OSes do take up a relatively big amount of resources. (Though I have to admit some modern consoles are a bit weird. IIRC the Wii U sets aside 1GB of its RAM for system use alone.)
The vast majority of games nowadays are GPU-bound, and for a budget build an i3 will do just fine. While it is just a dual-core, you have to realize that Intel's cores have a much higher IPC (instructions per cycle), which makes up for it. In fact, that's the reason Intel CPUs are so popular for gaming: most games don't do multithreading well, so having fewer but stronger cores is beneficial. No, you likely won't max out games on an i3, but it will play games fine. Yes, if I was building for myself I'd pick an i5 (and I did), but that build was specifically designed to fit within 500 dollars. Also no, 7200 RPM is the standard, most popular (think 99% of desktops) hard drive speed. There is a slower 5400 RPM variant that is mostly used in laptops; anything above 7200 RPM (10,000 RPM and up) is server-grade hardware, not designed for regular users (but can be used if you can afford it).
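To illustrate the logic with a toy model (the chips and numbers below are hypothetical, purely to show the arithmetic, not real benchmarks):

```python
# Toy model of the "fewer, stronger cores" argument. The chips and numbers
# are hypothetical; this only illustrates the arithmetic, not real hardware.

def game_throughput(cores, clock_ghz, relative_ipc, game_threads):
    """Crude model: per-core speed = clock * IPC, and a game can only
    keep a limited number of threads busy."""
    usable_cores = min(cores, game_threads)
    return usable_cores * clock_ghz * relative_ipc

strong_dual_core = dict(cores=2, clock_ghz=3.7, relative_ipc=1.0)   # hypothetical i3-like chip
weak_octa_core   = dict(cores=8, clock_ghz=1.75, relative_ipc=0.6)  # hypothetical console-like chip

for threads in (2, 4, 8):
    a = game_throughput(**strong_dual_core, game_threads=threads)
    b = game_throughput(**weak_octa_core, game_threads=threads)
    print(f"{threads} busy threads: strong dual-core {a:.1f} vs weak octa-core {b:.1f}")
```

With only a couple of busy threads the stronger cores win easily; the weak eight-core chip only pulls ahead when a game actually scales across all of them, which is exactly the multithreading problem most games have.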

No, modern Windows takes a minuscule amount. If you are running a fullscreen app it even unloads the desktop manager, so you don't even have that taking memory. The usage you see is mostly third-party programs most people have (such as antivirus). Also, did you know that out of the 8 processor cores on the Xbox One, only 6 are available for games, because the other 2 are reserved for the OS? That's way more resource use than even Vista had. I'm not sure about the PS4, but I heard it also has system-reserved resources that are strangely big.

I think you only need to add the cost when you don't have a suitable screen. I don't have a monitor, so that's something I would have to buy if I decided to get a PC. If you're happy with your monitor, you don't need a TV, meaning a console is relatively cheap for you.
But your TV is a suitable screen for your PC! It's just your wish to have a monitor that makes it not a choice for you, same as in my case not having a TV. Especially since modern TVs and monitors are pretty much identical in design other than input choices. Well, and most monitors don't have internal speakers.


That involves trusting random people on the internet, though. Not everyone is eager to do so. If you aren't very computer savvy, just finding trustworthy sources can be a challenge.

(I learnt my way around PCs with outdated systems and spare parts - I would have pissed my pants if I'd had to fiddle with anything less expendable! New parts are still a little scary. Last time I decided to plop in some more RAM - literally three minutes of work - I had to take a deep breath before I dared to even start. Which was kind of silly, because finding the right RAM was actually the greater challenge.)
A paranoid person will stay paranoid because he is too afraid to get it cured. Like I said, just go to a site like pcpartpicker.com and it will show you whether the build is going to work or not; most people will link you to that anyway, so you don't even have to do any work.

For compatibility reasons, Windows seems like the most obvious choice to me. I do kinda get the appeal of Linux, though. Lots of freedom to make it run the way that's best for your set-up.

I think that battery died simply because it was shitty and couldn't handle being recharged so often. Of course, the real culprit is the laptop (and the people who designed it) - the battery wouldn't have died so soon if the laptop hadn't constantly drawn power from it. My own laptop was nearly five years older at the time, and while the battery life had certainly degraded (in the end, I got 15 minutes out of it - at max), it hadn't been abused to death. We're talking early-to-mid 00s, by the way - things were a little worse back then.
Yes, I agree with your point about Windows.

Yeah, the early 00s were a time when the laptop was still relatively obscure technology and a lot of corners were cut. I got the infamous DV9000, built in 2008. It's not really a laptop anymore; due to the many things that happened to it over the years, it's basically used as an independent second monitor or YouTube watcher now.

Looking at the recommended requirements and assuming you meet those, you're probably also meeting the minimum system requirements of many triple-A games. That's indeed not too bad.
I'm not aware of any game so far whose minimum requirements I don't meet. Even for the supposedly very demanding Just Cause 3, I'm just a tad above minimum specs.

It's no secret the Wii U is the weakest of the bunch (though it's actually not as bad as some people assume), but Nintendo does a great job of making you forget you're playing on relatively low-tech hardware. Their own games run incredibly well, and even something like Xenoblade X, in which a lot of sacrifices were made to create a huge, seamless world, draws you right in once you start playing. Hardware limitations are a lot more noticeable in the PS4 games I play, because half of them are just kinda poorly put together.
Yes, Nintendo sure knows how to polish their games and use the limited resources they have. It's probably why they are so loved; they don't have as many bad developers releasing games on their consoles. Personally it's not my cup of tea, but from a technical perspective their games are mostly well done.

I meant the problems seen in console gaming (and gaming in general, really) can't be solved by just releasing more powerful consoles. The extra power can cover up only so much poor programming. The advantage PC gaming has here is that users can improve their own hardware where needed, and the most savvy ones can even put out mods to make the game run better. Still, this is users trying to fix problems that shouldn't be there, regardless of the hardware being developed for.
Kinda. Realistically, simply having more power won't solve poor programming, but in some cases it can just power through it. For example, take GTA 4 - a complete disaster of a PC port that no video card existing at launch could run even half-decently. Now, 7 years later, GPUs are so powerful they can just brute-force through all the shitty programming done in there and still deliver a decent experience. So yes, the problem shouldn't exist in the first place, but since it does, and there is a way I can fix it for myself, well, then I'll fix it rather than play shit (though in many cases I simply won't even buy that game).

Finding good games can be tricky, though. Unless you're willing to sacrifice a lot of time and/or money, there are obvious limitations to the number of great games you can discover on your own. Most people have to rely on critics and Youtube personalities to find new games to play. And how do these people find games? Publishers often approach these people themselves (fortunately, this rarely goes beyond 'here's a download code, enjoy'). If a developer does not promote their game, it's unlikely it will be discovered by people who can properly introduce it to the world.
The only YouTube personality I follow is Jim Sterling, and yet I have more games in my backlog than I can get through. I guess I'm not picky enough... I actually discovered quite a few games reading this forum instead of through some critic. I almost never read reviews; I actually prefer reading wiki entries over reviews, because I'm very fact-based and reviews tend to be very opinion-based. There are many ways gamers learn about games. Heck, there was a time I was discovering new games and movies based on the most pirated items on one site. It's quite a good measure of popularity.

If I am watching a game's gameplay on YouTube, it becomes pretty easy for me to decide whether I want to buy the game or not. If I want to keep watching, I won't be buying it. If instead I want to stop watching and play it myself, then it has hooked me.

That's true. But the way PC gaming and PC tech are intertwined seems a bit strange to me, as console gaming is pretty much all about the games. I mostly just really like games.
The reason console gaming does not include the tech side is that not only is the tech locked up, it is flat-out illegal to mod it:
http://scitech.blogs.cnn.com/2009/08/05/student-arrested-for-modding-xbox-consoles/
http://www.wired.com/2009/08/game-console-jailbreaking-arrest/

This is the same practice that is ENCOURAGED on mobile phones, to get rid of the shitty pre-loaded crap they tend to be full of. It is the same as reinstalling your own version of Windows on a PC. Yet for consoles you will get arrested. No wonder there is no hardware scene; everyone in it would have to be in jail.

------------------------------------------------------------------------------


Patathatapon said:
https://www.quora.com/Why-do-movies-look-smooth-at-24-fps-but-video-games-look-terrible-at-24-fps-Is-it-because-of-motion-blur

Is it credible? Hell if I know. You asked for a link, that's what I'm giving you.
Emphasis mine. No, it's not. Games have motion blur, some of them very good quality motion blur. It does not reduce low-framerate problems, because they are not caused by a lack of blur.

Bad Jim said:
Sorry, as much as I admire your heroic defense of high framerates, the Doom engine is indeed capped at 35fps. http://doomwiki.org/wiki/Uncapped_framerate

Try running it in DOSBox with output=OpenGL and you can see the framerate with Fraps.

Perhaps you have been fooled by source ports like ZDoom that remove this cap?
Yeah, sorry about that. I may have mixed Doom up with Quake.
 

NPC009

Don't mind me, I'm just a NPC
Aug 23, 2010
802
0
0
Strazdas said:
NPC009 said:
Yeah, no [https://en.wikipedia.org/wiki/List_of_countries_by_Internet_connection_speeds].
According to this list, 80% of the United States has an internet connection capable of downloading games. Note that the remaining 20% includes people that don't know or care about the internet, so the real number of people that have to rely on retail is minuscule.
I would not recommend downloading games using a connection that'll give you a download rate of less than 1MB a minute (note that the speeds are listed in Mb), as many modern games are several GBs, even the ones on handhelds. It's easier to find stuff under 500MB on Steam and GoG, but even a small retro game might take over an hour to download at that kind of speed.
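Just to put some rough numbers on that (game sizes and connection speeds below are picked purely for illustration):

```python
# Quick download-time arithmetic. Game sizes and connection speeds are
# illustrative examples only.

def hours_to_download(size_gb, speed_mbps):
    size_megabits = size_gb * 1000 * 8        # GB -> megabits (decimal units)
    return size_megabits / speed_mbps / 3600  # seconds -> hours

for size_gb in (0.5, 20, 50):
    for speed_mbps in (1, 4, 25):
        h = hours_to_download(size_gb, speed_mbps)
        print(f"{size_gb:>4} GB at {speed_mbps:>2} Mbps: {h:5.1f} h")
```

Even a modest 20 GB game sits in the tens of hours on a 1 Mbps line, which is why "technically able to download games" and "reasonable to rely on downloads" aren't the same thing.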

Big enough for Sony to get sued [http://www.eurogamer.net/articles/2014-08-07-sony-is-being-sued-for-killzone-failing-to-deliver-native-1080p], as well as EA [http://www.pcgamer.com/investors-sue-ea-over-quality-of-battlefield-4/]. Those people riot mostly on the internet. And yes, they are a big enough group, given that they are the majority of customers.
Battlefield 4: Seems to go way beyond not being able to reach 60fps. Frame rate isn't even mentioned.
Killzone: Always nice when a company gets a kick in the butt for cheating at numbers, but it appears nothing came of this lawsuit. It also seems to be more about the resolution than the frame rate. (And I'm not surprised this went quiet, as the game did run at 1080p 60fps during multiplayer, but that 1080p simply wasn't fully native.)

BTW You don't need a majority to file a class action lawsuit.


The vast majority of games nowadays are GPU-bound, and for a budget build an i3 will do just fine. While it is just a dual-core, you have to realize that Intel's cores have a much higher IPC (instructions per cycle), which makes up for it. In fact, that's the reason Intel CPUs are so popular for gaming: most games don't do multithreading well, so having fewer but stronger cores is beneficial. No, you likely won't max out games on an i3, but it will play games fine. Yes, if I was building for myself I'd pick an i5 (and I did), but that build was specifically designed to fit within 500 dollars. Also no, 7200 RPM is the standard, most popular (think 99% of desktops) hard drive speed. There is a slower 5400 RPM variant that is mostly used in laptops; anything above 7200 RPM (10,000 RPM and up) is server-grade hardware, not designed for regular users (but can be used if you can afford it).
I've heard of the i5 and i7 being fairly popular because they offer a lot of bang for your buck, but this is the first time I've heard of the i3s being well-liked as well. I have to wonder how future-proof it is, though, especially in combination with other lower-end components. One advantage of consoles is that the system has a life cycle of at least five years (with more than a few systems reaching eight years or more); a PC becomes less cost-effective if it needs multiple upgrades during the same timeframe to be able to keep up.

As for the harddrive, the serious gamers around me swear solid state is the way to go, but I'll take your word when you say 7200 is perfectly acceptable. It sounds good enough for me, as well.

No, modern Windows takes a minuscule amount. If you are running a fullscreen app it even unloads the desktop manager, so you don't even have that taking memory. The usage you see is mostly third-party programs most people have (such as antivirus). Also, did you know that out of the 8 processor cores on the Xbox One, only 6 are available for games, because the other 2 are reserved for the OS? That's way more resource use than even Vista had. I'm not sure about the PS4, but I heard it also has system-reserved resources that are strangely big.
Yeah, the PS4 sets aside 3.5GB RAM (out of 8GB GDDR5 RAM) for system-related activities (which are mostly tied to the gaming experience, such as the share and multiplayer functions - I guess you could compare it to the Steam client), but a part of that is flexible and can be used by the game if developers choose to do so.

Good point about the antivirus and other third-party programs (Steam, Skype, what else?). Those would be a bigger drain nowadays. Not sure if that's really in favour of PC Gaming, though.

But your TV is a suitable screen for your PC! It's just your wish to have a monitor that makes it not a choice for you, same as in my case not having a TV. Especially since modern TVs and monitors are pretty much identical in design other than the input options. Well, and most monitors don't have built-in speakers.
I disagree. It's fine to hook up a computer to a TV to watch some movies, show a presentation or play a game with a controller, but a lot of PC stuff kinda requires you to sit at a desk unless you enjoy ergonomic nightmares. So unless you have a monitor-sized HD television to begin with, you need a proper monitor.


A paranoid person will stay paranoid because he is too afraid to get it cured. Like I said, just go to a site like pcpartpicker.com and it will show you whether the build is going to work fine or not; most people will link you to that anyway, so you don't even have to do any work.
Again, lots of people just aren't comfortable working with hundreds of dollars' worth of components. PCs are kinda like cars: while nearly everyone should be able to do basic things themselves, many prefer to leave it to the professionals.

Looking at the recommended requirements, and assuming you meet those, I take it you're also meeting the minimum system requirements for many triple-A games. That's indeed not too bad.
I'm not aware of any game so far whose minimum requirements I don't meet. Even for the supposedly very demanding Just Cause 3, I'm just a tad above the minimum specs.

That's one of the games I looked at, among others. It's a popular game and it's sure not to be the last of its kind, so I figure it's a decent foundation for a guideline.

Kinda. Realistically, simply having more power won't solve poor programming, but in some cases it can just power through it. For example, take GTA 4 - a complete disaster of a PC port that no video card existing at launch could run even half-decently. Now, 7 years later, GPUs are so powerful they can brute-force their way through all the shitty programming in there and still deliver a decent experience. So yes, the problem shouldn't exist in the first place, but since it does, and there is a way I can fix it for myself, well then I'll fix it rather than play shit (though in many cases I simply won't even buy that game).
Even if you have to wait several years to get to the point where it's fixable, haha? Well, okay, I'm not laughing too hard. It's a shame to see potential wasted, so better late than never, I say. The standards on consoles are higher, but if something does go wrong, users have very few options to fix it themselves.

The only YouTube personality I follow is Jim Sterling, and yet I have more games in my backlog than I can get through. I guess I'm not picky enough... I actually discovered quite a few games reading this forum instead of from critics. I almost never read reviews; I actually prefer reading wiki entries over reviews, because I'm very fact-based and reviews tend to be very opinion-based. There are many ways gamers learn about games. Heck, there was a time I was discovering new games and movies based on the most pirated items on one site. It's quite a good measure of popularity.

If I am watching a game's gameplay on YouTube, it becomes pretty easy for me to decide whether I want to buy the game or not. If I want to keep watching, I won't be buying it. If instead I want to stop watching and play it myself, then it has hooked me.
Forums work in the same way. People rarely discover games themselves; they hear about them elsewhere.

I like going through Steam during the sales and looking at weird things I can get for less than 2 bucks, but most games do have at least a handful of reviews and I do look at those before deciding whether or not to buy.



The reason console gaming does not include a tech scene is that not only is it locked-up tech, it is flat-out illegal to mod it:
http://scitech.blogs.cnn.com/2009/08/05/student-arrested-for-modding-xbox-consoles/
http://www.wired.com/2009/08/game-console-jailbreaking-arrest/

This is the same practice that is ENCOURAGED on mobile phones to get rid of the shitty pre-loaded crap they tend to be full of. This is the same as reinstalling your own version of Windows on a PC, yet for consoles you will get arrested. No wonder it does not have a hardware scene; everyone in it would have to be in jail.
It wasn't always illegal to fiddle around with the insides of consoles in certain ways, and even then it was never a big thing unless you count all those people who got their system modded just to play pirated games. Many modern systems do actually have a small homebrew culture attached to them, though it's more about doing new things with the existing hardware than changing it to be better. Someone got Windows 95 to boot on a 3DS recently, for instance. The PSP is a popular subject as well. And there's always someone trying to run Linux on a modern console. Oh, and you should see some of the fun ways they use Wii hardware at universities! I've seen everything from modified remotes to balance boards being used in stress tests in the psychology department. These people are usually left alone by console companies because they aren't breaking the law hard enough or simply aren't breaking it at all. Piracy, however... Yeah, they don't take too kindly to that.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
NPC009 said:
I would not recommend downloading games using a connection that'll give you a download rate of less than 1MB a minute (note that the speeds are listed in Mb), as many modern games are several GBs, even the ones on handheld. It's easier to find stuff under 500MB on Steam and GoG, but even a small retro game might take over an hour to download with that kind of speed.
Why not? I mean, sure, you will have to wait a few hours for the download to finish, but that's about it.

BTW You don't need a majority to file a class action lawsuit.
I never said you do; what I said is that the majority of customers are not happy when the game is poorly done.

I've heard of the i5 and i7 being fairly popular because they offer a lot of bang for your buck, but this is the first time I've heard of the i3s being well-liked as well. I have to wonder how future-proof it is, though, especially in combination with other lower-end components. One advantage of consoles is that the system has a life cycle of at least five years (with more than a few systems reaching eight years or more); a PC becomes less cost-effective if it needs multiple upgrades during the same timeframe to keep up.

As for the hard drive, the serious gamers around me swear solid state is the way to go, but I'll take your word for it when you say 7200rpm is perfectly acceptable. It sounds good enough for me, as well.
An i7 is overkill for gaming. You will find very few games where you will see any benefit over an i5, because your GPU will be the one limiting your performance most of the time. The GPU is more important than the CPU nowadays if you are building a gaming PC, as more and more calculations are offloaded to the GPU (which is good in a way, since it's easier to do those calculations there, but it obviously makes the GPU the bottleneck). More cores won't help you much because multithreading is rarely done in any efficient way in games (and don't expect it to get better any time soon - multithreading games is a nightmare for programmers, since games cannot be broken into easily predetermined calculations like, say, movie rendering).
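A minimal sketch of the dependency problem described above (toy code, not from any real engine): every game tick reads the state the previous tick produced, so ticks cannot simply be handed to different cores, whereas pre-simulated movie frames can be rendered independently.

[code]
# Toy example: why simulation ticks are hard to parallelise.

def tick(state, dt):
    # Physics/AI/input all read the state produced by the PREVIOUS tick.
    return {"t": state["t"] + dt,
            "x": state["x"] + state["v"] * dt,
            "v": state["v"]}

state = {"t": 0.0, "x": 0.0, "v": 1.0}
for _ in range(3):                      # inherently sequential: tick N+1 needs tick N
    state = tick(state, 1 / 60)
print(state)

# Offline rendering of frames that were already simulated has no such chain of
# dependencies, so each frame could in principle go to a different core or machine.
frames = [{"frame": i} for i in range(3)]
rendered = [f"rendered frame {f['frame']}" for f in frames]   # embarrassingly parallel
print(rendered)
[/code]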

Well, I do a major overhaul of my PC every 5-6 years, so it's future-proof enough for me. As far as any PC being future-proof: can it run the same games consoles run now? If yes, then it will continue to run the same games at the same performance. Console hardware does not improve, and games will still be designed with that hardware limit in mind, so if your PC exceeds it now it will continue to do so. A PC does not 'need' upgrades during the same timeframe. Many PC players do upgrade because they want to, but there is a difference here: their goal is to max out games, which is way beyond what consoles are capable of rendering. If you just want an equivalent experience, then you do not need to upgrade.

Well, I bought an SSD a few months ago and am very happy with it, but it is not a requirement for gaming. The vast majority of gamers still have HDDs (and like I said, 7200rpm is standard), so games tend to be made in such a way that they will work for these people. Yes, an SSD is great for loading times - we tend to joke that we got a new first-world problem: the game loads so fast we can't read the loading tips. But outside of world loading the actual performance is unchanged, because it's the GPU that's capping it again. Granted, it will be helpful in some extreme cases like the GTA 4 one I mentioned, where textures were not loading fast enough from an HDD, but that game is a complete mess and is not representative of gaming overall.

Good point about the antivirus and other third-party programs (Steam, Skype, what else?). Those would be a bigger drain nowadays. Not sure if that's really in favour of PC Gaming, though.
If it's any consolation, I found that the DRM clients - Steam, Origin, Uplay - are very light nowadays, and once they are done checking the legality at launch they mostly stay inactive. The rest - Skype, etc. - are optional. With antivirus, though, it's worth noting that some are far worse offenders than others. For example, Norton will significantly decrease your performance, while, say, Avast! (the one I use) has a very small impact. I am currently running Samsung Magician, GameCompanion 2.4, Fraps, Origin, Raptr, Steam, Skype, WhatPulse and Avast!, and am seeding a few mods via torrent (perfectly legal - helping the owner host them since I have a good internet connection). All of this has not created any noticeable impact on my gaming performance. And yes, I did try turning it all off; the difference was very much within the margin of error (1-2 fps or so). So at least in my setup, there is no worrisome impact from third-party apps.

Oh, and I also forgot to mention: I also use Shadowplay, which ALWAYS records my gaming footage with the ability to save the last 5 minutes (much like the Share feature on the PS4, but at 1080p@[email protected], so much higher quality). Since this feature is built into my GPU, there is no performance impact (it does freeze the game for a second when saving, though, because it has to dump multiple GB of data onto the hard drive at once and an HDD isn't that fast; this could be avoided if I recorded in lower quality, but that's an acceptable trade-off for me).
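As a rough sanity check on the "dump multiple GB of data at once" remark, here is the arithmetic for a "save the last 5 minutes" buffer; the 50 Mb/s bitrate is an assumption for illustration, since the exact recording bitrate isn't readable from the post above.

[code]
# Approximate size of a "last N minutes" recording buffer. The bitrate is assumed.

def buffer_size_gb(minutes, bitrate_mbps):
    """Size in GB of `minutes` of footage recorded at `bitrate_mbps` (megabits/s)."""
    megabits = minutes * 60 * bitrate_mbps
    return megabits / 8 / 1024           # megabits -> MB -> GB

size = buffer_size_gb(minutes=5, bitrate_mbps=50)   # assumed 50 Mb/s bitrate
print(f"~{size:.2f} GB to flush to disk")           # ~1.83 GB

# Writing a couple of GB in one burst is trivial for an SSD, but enough to make a
# 7200rpm hard drive hitch the game for a moment, matching the behaviour described.
[/code]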

I disagree. It's fine to hook up a computer to a TV to watch some movies, show a presentation or play a game with a controller, but a lot of PC stuff kinda requires you to sit at a desk unless you enjoy ergonomic nightmares. So unless you have a monitor-sized HD television to begin with, you need a proper monitor.
Or you could come up with multiple other ways of doing it, not the least of which is hooking up a controller to control your mouse, or simply having a wooden board that you can put on your lap and turn into a mini-table for the mouse and keyboard.

Again, lots of people just aren't comfortable working with hundreds of dollars' worth of components. PCs are kinda like cars: while nearly everyone should be able to do basic things themselves, many prefer to leave it to the professionals.
Funny thing here: there's actually been a massive problem where I live where young people do not even know how to check the air pressure in their cars and drive to a service shop to do it, even though everyone can do this for free at almost every gas station in 5 minutes.

Also, you are aware that basic car maintenance is taught in driving courses, and you're required to know it to pass the driver's exam and get a license? It is literally illegal to drive without knowing how to do those basic things. While there are obviously no such laws for computers, the comparison really isn't in your favor here.

Even if you have to wait several years to get to the point where it's fixable, haha? Well, okay, I'm not laughing too hard. It's a shame to see potential wasted, so better late than never, I say. The standards on consoles are higher, but if something does go wrong, users have very few options to fix it themselves.
GTA 4 is an extreme case that I used for illustration. However, there are more realistic cases, for example Assassin's Creed: Unity, where they designed it poorly (mostly runaway draw calls, as it turns out, proving that DX12 is sorely needed - but we didn't know this at launch). People with top-of-the-line 800-dollar GPUs could power through the problem and still play consistently, though. Now, I will likely never be one of those people, but I don't begrudge them for wanting to do so.

I like going through Steam during the sales and looking at weird things I can get for less than 2 bucks, but most games do have at least a handful of reviews and I do look at those before deciding whether or not to buy.
Steam reviews can be useful. I usually go straight to the negative ones, read the downsides of the game, and sometimes it ends up as "wait, that would actually be a positive for me". It actually saved me from buying a game once, when the reviews (pretty much all of them, consistently) said that the servers were shut down and the game had become abandonware that doesn't work.

And there's always someone trying to run Linux on a modern console.
You could very easily install Linux on a PS3. It was advertised as one of the features of the console at launch; however, a few years later one of the patches removed this capability. Sony claimed it was because the feature made piracy easier. All the people using PS3s for Linux computations just got fucked. The almighty pirates, taking away our abilities one feature at a time, eh?