Your Gaming PC Uses More Energy Than Three Refrigerators

Cowabungaa

New member
Feb 10, 2008
10,806
0
0
The last time I upgraded, my GPU in this case, I actually did start paying attention to power consumption. It makes perfect sense to me. It's sensible and moral to do your part so that we as a whole can become more energy efficient and environmentally conscious, and I don't see why gaming should be exempt from that.

As for this study, obviously its results are very specific. They're meant to be, so that the study isn't too broad and inaccurate. People seem to forget that a little here. That of course means that in real life the situation is a little different; there are more varieties of gaming PCs than just the ones used in the study, and not everyone games 7.2 hours a day, thankfully.

But that doesn't change anything about the conclusions drawn from it: high(er)-end PC gaming is a very power-hungry hobby. That's expensive and environmentally significant, something we tend to forget but should be aware of so we can do something about it.
 

008Zulu_v1legacy

New member
Sep 6, 2009
6,019
0
0
The power supply in my desktop rig is a beast: 850W. My video card requires a 650W supply, and my system is liquid cooled. I use my laptop for most of my gaming since it only draws 250W. With gaming, Internet and movies on my desktop, my electricity bill is about $90 AUD a month; on my laptop, it's about $50.
 

NoPants2win

New member
Dec 4, 2010
72
0
0
This is a weird study. They must know it was deceptive and would be used to make clickbait articles all over the internet. Why? What do they gain from this?

Power saving features are built into the operating system and the hardware so that the system doesn't waste power. It's standardized, and it's called ACPI. Power use for each component is easy to obtain from the manufacturers' websites and is used by enthusiasts when building a system. I know how much power is coming off each of my supply's 12V rails to the nearest amp. Pretending these things don't exist to make a clickbait-style study is just confusing.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
3,829
0
0
What exactly are they basing this on?
I'm one of those people that has a PC switched on most of the time, but I doubt it hits more than 300-400 watts at most at full power, and probably more like 100-150 at idle (which is most of the time).

Sure, with it switched on as much as I have it, that might reach 1.5 kWh in a day, but turn on some old-fashioned incandescent light bulbs and you'll beat that figure easily.
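A quick back-of-the-envelope for anyone who wants to sanity-check an estimate like that; the idle/load split below is an assumption loosely matching the wattages above, not anyone's measured usage:

```python
# Back-of-the-envelope daily energy for an always-on PC.
# The hours in each state are assumptions, not a measurement.
idle_watts, load_watts = 100, 300   # idle vs. full-power draw
idle_hours, load_hours = 12, 1      # assumed hours per day in each state
kwh_per_day = (idle_watts * idle_hours + load_watts * load_hours) / 1000
print(kwh_per_day)  # 1.5 kWh/day
```

With those assumed numbers you land right on the ~1.5 kWh/day figure.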

OK, so my PC isn't the fastest thing ever, but this is still a weird claim to make.

The cost of running my PC, a laptop, and a game console together is easily swamped by my electric heater though.
Heating is expensive!
 

Veylon

New member
Aug 15, 2008
1,626
0
0
My rig - including monitors and other powered peripherals - uses about 240 W. Yes, I have a meter that tells me this. I turn it off (with an actual cut-the-power switch) when not in use. If I were this hypothetical 7.2-hours-per-day gamer, this would total 631 kWh per year, or $50.48 at local rates. Since I'm not, it costs me less than that, so I'm not terribly concerned.
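The arithmetic above checks out; here's a minimal sketch of it, where the per-kWh rate is inferred from the quoted $50.48, not stated in the post:

```python
# Reproducing the 240 W / 7.2 h-per-day arithmetic above.
# The rate is an assumption inferred from the quoted yearly cost.
watts = 240
hours_per_day = 7.2
rate_per_kwh = 0.08  # assumed local rate in dollars per kWh
kwh_per_year = watts / 1000 * hours_per_day * 365
print(round(kwh_per_year))                    # 631 kWh
print(round(kwh_per_year * rate_per_kwh, 2))  # ~$50.46
```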

It's probably worth pointing out that just because a power supply could supply 1875 W doesn't mean it's doing that all the time. It's also worth pointing out that heat management and power management go hand in hand - parts that produce less heat also consume less energy. There's going to be an upper limit on PC power consumption if for no other reason than that the thing will become dangerously hot, both to its internal parts and to the immediate environment. Contrariwise, nobody much is going to bother making components super energy efficient so long as they aren't contributing to heat problems.
 

DrunkOnEstus

In the name of Harman...
May 11, 2012
1,712
0
0
I sense shenanigans here. Maybe they're talking about nitrogen-cooled triple Titan X setups or something, disabling every single feature in the BIOS that was designed to counteract this problem and running a 10,000,000,000-draw-call test to 100% all three of those GPUs, because my entire entertainment stand (an HDTV with my PC, PS3, and Wii) doesn't even touch that level of power usage.

*Tinfoil hat time* Perhaps Sony commissioned this totally scientific study to show how the decision to build a PC instead of buying a PS4 makes you a dirty smog generator that hates the planet. Maybe. It wouldn't be Microsoft, Windows and all that.
 

Smooth Operator

New member
Oct 5, 2010
8,162
0
0
Grade A potato science right there; if you wanted that bullshit to fly you should write articles for Cosmo, where people have no clue what the fuck you are talking about.

Let's just run down the number of issues:
- from their numbers they presume 7+ hours, absolutely every single day, on a computer that can actually consume 750W and does so for that entire duration
- first off, no one but a training esports player will log that amount of gameplay per year
- your computer needs to actually be capable of 750W+ to ever even achieve 750W consumption
- even if your PSU can do 750W+, you need components that will consume that amount: two or three GPUs minimum, or overclocking beyond the bounds of sanity
- and lastly, the game needs to burn all that processing power for the entire duration... newsflash, no fucking game does this. There are peak times where it will happen, but mostly they will run lower, so you would need to run several resource hogs to take up the slack, in which case your actual game will run like shit

The solutions are also beyond amazing... your plain old HDD consumes 5 bloody watts at peak performance, which will only ever happen if you are transferring several large file sections at once; that shit does not happen in games.
But you save 70%! Yeah, you went down to fucking 2W with an SSD, a fucking 3W saving only ever at peak hard drive use. Not to mention you probably can't afford an SSD-only system, so you will be running another backup HDD and in the end consume more.
Also, which birdbrain told people cooling reduces power consumption? It just fucking cools the system better; any gain you get from material resistance losses you will suck up with the added cooling system.
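For what it's worth, the SSD saving above really is tiny when you run the numbers; a minimal sketch, taking the wattages from the post and the 7.2 h/day figure from the study:

```python
# Sanity check on the HDD-vs-SSD saving: wattages from the post,
# 7.2 h/day taken from the study's gaming figure.
hdd_watts, ssd_watts = 5, 2
saved_watts = hdd_watts - ssd_watts              # 3 W, and only at peak drive use
saved_kwh_per_year = saved_watts * 7.2 * 365 / 1000
print(round(saved_kwh_per_year, 1))  # 7.9 kWh/year
```

A few kWh a year, i.e. pocket change next to what the CPU and GPU draw.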
 

Kaymish

The Morally Bankrupt Weasel
Sep 10, 2008
1,256
0
0
Buhahah, looks like my carbon footprint is still small, because most of my country is on hydro and geothermal power and my power company brags it's on 100% renewables. I'd better CrossFire another GPU just to suck some more juice, or maybe get a server-grade mobo with room for 16 CPUs.
 

alj

Master of Unlocking
Nov 20, 2009
335
0
0
Yet another article where they don't understand the technology.

750W is the MAXIMUM power draw. It's significantly less than that when idle or just on the desktop, and still nowhere near that when running a game.
 

wizzy555

New member
Oct 14, 2010
637
0
0
I'm not sure what people are complaining about the 750 Watts number. The paper acknowledges the different states of power consumption, in fact there's a pretty graph about it.

The fact is, yes, gaming PCs are going to use more power than consoles. Consoles are designed with specific hardware in mind for mass production; the heating and energy requirements are optimised carefully. Gaming PCs are designed so you can slap anything together and not have it explode.

Disclaimer: I am a pc gamer and measured it myself with an energy meter.
 

truckspond

New member
Oct 26, 2013
403
0
0
This is one of the reasons why I prefer to use laptops for gaming; they draw a lot less than a full gaming PC due to the requirement for energy efficiency to extend battery life.
 

Fanghawk

New member
Feb 17, 2011
3,861
0
0
Questioning the stats is completely reasonable, folks, but those of us basically saying "these extreme gamers aren't the majority!" should remember: according to the study, they'd still consume an insane amount of electricity. As in, 2.5% of PC gamers are consuming 20% of the electricity. That is insane. Even if the number of builds which could do so is limited, that's a major efficiency problem.
 

ExtraArm

New member
Aug 29, 2011
1
0
0
I've lurked here for... a time and never felt the need to actually post much; I've been content reading and laughing my arse off at these kinds of articles, but this time I can't believe the amount of bollocks this article contains.

Now to elaborate on what I mean, I'll put a few examples of what my apartment contains and operates basically every day:

My apartment is about 70sqm (760ish sqf for you imperials)

I have two computers, mine and my wife's, running approx. 4 hours a day in gaming use (more on weekends, sometimes up to 8). We generally leave the computers on for the rest of the evening (idling for about 3 hours or so, sometimes using the wife's one for streaming videos). Both builds have a GTX 970 GPU along with an i7 4790K processor (both overclocked slightly), 8 gigs of RAM, multiple hard drives (4 each) and surprisingly power-efficient flatscreen monitors.

Three rooms + kitchen (used daily, electric stove) + a sauna (used monthly, once or twice) + a fridge/freezer combo

A TV along with a Blu-ray player adorns the living room (TV used daily, perhaps 1-3 hours, depending on the mood), and each room contains fairly energy-efficient LED lights.

Now according to this article, I should be draining more than 2500 kWh a year, yes? Wrong. All of this, according to my latest electric bill, took exactly 1895 kWh last year (I was unemployed then, so we actually used the computers more).

The only bloody way you could drain that much is if you did nothing but game more than 12 hours a day, every day of the year, on a fairly high-end gaming system. The "science" behind this article is on the level of an Onion article. Complete shite.
 

kasperbbs

New member
Dec 27, 2009
1,855
0
0
valium said:
on the upside, I have a space heater during the winter.
True! I don't really care about a few extra bucks every month, but I would be happy if it didn't help heat my room up to 32°C during the summer.
 

Something Amyss

Aswyng and Amyss
Dec 3, 2008
24,759
0
0
Kenjitsuka said:
"while console gamers rake in the savings with 134 kw/h."

Yeah... what about reports that the new gen's always-on (the web) standby BS consumes a SHIT ton of power that is completely wasted, just to make it boot up once or twice every 24 hours 50% faster? Oh wait, those reports are a few months old and thus forgotten. Phew, bullet dodged there, article!
Except, you know, the study presented here covers standby and "off" time and apparently factors it in.

Were you just looking for a way to poke holes in the argument or defend PCs from consoles? Because criticising the article for not factoring in something that was included in the research (and took me only a few seconds to find) doesn't seem like a good way to go if you're looking for legit criticism.
alj said:
Yet another article where they don't understand the technology.

750W is the MAXIMUM power draw. It's significantly less than that when idle or just on the desktop, and still nowhere near that when running a game.
Did you read the actual report? If not, see above.

CrystalShadow said:
What exactly are they basing this on?
Check the article. They spell out their methodology pretty thoroughly.

I'm one of those people that has a PC switched on most of the time, but I doubt it hits more than 300-400 watts at most at full power, and probably more like 100-150 at idle (which is most of the time).
We're talking things like averages vs your specific case, however.

The cost of running my PC, a laptop, and a game console together is easily swamped by my electric heater though.
Heating is expensive!
That's why you just build your home around a couple of Titan GPUs. Two birds with one stone.

...seriously, I know people who turn off their heat in the winter when gaming not for better cooling, but so they don't cook to death.

ExtraArm said:
Only bloody way you could drain that much is if you did nothing else than gaming more than 12 hours a day, every day of the year, using a fairly high-end gaming system. The "Science" behind this article is on the level of an Onion article. Complete shite.
Hold on, let me see if I've got you right:

your numbers don't match the average kWh of a variety of hardware in a test that is not about your hardware specifically, therefore the science is worse than a satire newspaper's and there's no way anyone could draw these numbers? Is that what you're saying here? Because that seems like a really strange extrapolation from a single person's power consumption.
 

wizzy555

New member
Oct 14, 2010
637
0
0
BTW anyone curious how much their rig is using can buy a plug in power meter for like $15. Then you can pretty much do this sort of analysis yourself.
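Once you have a reading off such a meter, turning it into a yearly cost is one line of arithmetic; here's a minimal sketch, where the wattage, hours, and rate are placeholder assumptions for illustration:

```python
# Turning a plug-in meter reading into a yearly electricity cost.
# All three inputs below are placeholder assumptions.
def annual_cost(avg_watts, hours_per_day, rate_per_kwh):
    """Yearly cost in dollars for one device at a steady average draw."""
    return avg_watts / 1000 * hours_per_day * 365 * rate_per_kwh

print(round(annual_cost(240, 4.0, 0.15), 2))  # ~52.56 dollars
```

Swap in your own metered wattage and local rate to do the analysis yourself.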
 

CrazyCapnMorgan

Is not insane, just crazy >:)
Jan 5, 2011
2,742
0
0
crimson5pheonix said:
Admittedly, my electricity bill in the summer is triple what it is in the winter.
My cheapest electric bill this year was $10.44 for the month of July. Thank you, municipal electric! ^_^ Cheapest ever was $7.98.

And my usage was mainly my PS4 and my AC. The computer I used for news and....stuff.
 

Strazdas

Robots will replace your job
May 28, 2011
8,407
0
0
What builds did they use to test this? Have they actually measured consumption or simply added up the PSU power limits? 7.2 hours a day is certainly not an average time spent gaming (most people don't even do 20 hours in a week).

Also, no, you DON'T throw in web browsing, streaming video or idle time, because those are low-power states that consume far less energy. You only count actual full-load gaming here, and that consumption is incomparable with the other times.

Unlike game consoles, where energy use has been extensively studied, PCs have countless barriers to energy efficiency. Most gaming PCs are custom units, and individual components like CPUs won't list energy levels the way other electronics do. Thankfully, that means simply buying the right components could save you money while keeping your frame rates high. For example, solid state drives use 70 percent less energy than traditional hard drives, while installing an improved cooling system also reduces energy consumption.
Absolute nonsense. All parts that have any significant power usage list their load and maximum wattages. ALL OF THEM. In fact, it's the other electronics that often skimp on that information. Not to mention that you won't be using a part at maximum most of the time. In 99% of modern videogames, your CPU is not fully loaded and does not consume its rated power unless you are using a weak-CPU/fast-GPU combo.

Buying different components will not reduce your power usage; more powerful components will simply run below maximum to produce the same processing effect, using less power.

Solid state drive usage is irrelevant; hard drive usage is less than 1% of your PC's power draw. Over 90% is in the CPU and GPU; third place is your RAM.

A cooling system will NOT reduce your energy consumption; in fact, it will INCREASE it. It will increase it in two ways:
1. More coolers or water cooling require more power.
2. Components run optimally at certain temperatures; if you cool them below that, they will require more electricity to run optimally.

Which is great, because the only other alternative is convincing PC gamers that fast systems don't actually improve their gameplay experiences. And if that's the case, we'll be waiting a very long time.
As in - never. Fast systems improve gameplay experiences significantly for any properly coded game.

WouldYouKindly said:
I'm betting a lot of this can be solved by turning the PC off when you're not there. If you've got a solid state drive, this isn't even an annoyance because start up takes no time.
It cannot. Idle usage is very small. It is generally inadvisable to shut the computer off if you are leaving for just an hour, because the shutdown/startup procedure actually causes more wear and tear and power consumption than leaving it idle. Even the hard drives park and stop spinning if your system is truly idle.

wizzy555 said:
The paper also recommends activating the energy saving features on your hardware (if they aren't already on). This includes things like self-adjusting fan speeds. But this may involve poking around in the BIOS.
Self-adjusting fans are a noise-reduction feature, not a power-reduction one. Most fans, especially the stock ones (that most people use), are below 1W.


Vigormortis said:
* Their model for a "typical gaming PC day" had the PC off for only six hours a day, with an estimated 4.4 hours of game time. So while I can maybe see 4.4 hours of gaming a day (certainly not for anyone with a full-time job), only a complete moron would leave their PC on for 18 hours a day.

HOWEVER....they absolutely have a point. There's very little effort in making gaming PC components as energy efficient as possible, beyond the end-user's attempts. As an industry, we really must start pushing for new standards.

Here's hoping their study, as much as I might question it, brings some much needed attention to the issue.
According to Raptr statistics (an optional third-party gaming client that was bought by AMD and is now shipped with AMD drivers), the average gamer plays below 20 hours a week, so 4.4 hours a day sounds very unrealistic. I often end up in the "top 10-5%" by playing as little as 24 hours a week.

Actually, there has been a lot of effort put into making PC components more energy savvy. PC power consumption has barely changed in over a decade despite calculation power increasing a hundredfold. On top of that, the latest desktop line of CPUs from Intel, for example, puts its main focus on energy efficiency rather than raw power. There is a lot of push already.

Poodleboy said:
The statistics used here seem a bit odd. Firstly, do high end PCs really draw an average of 720W? As far as I can tell, the average rig draws around 350W when running full tilt.
You need a multi-GPU setup to draw 720W with ANY configuration. Single-GPU setups, no matter how elaborate, are never that hungry, and multi-GPU setups are below 10% of PC gamers.

Fanghawk said:
Questioning the stats is completely reasonable folks, but those of us basically saying "these extreme gamers aren't the majority!" should remember - according to the study they'd still consume an insane amount of electricity. As in, 2.5% of the PC gamers are consuming 20% of the electricity. That is insane. Even if the number of builds which could do so is limited, that's a major efficiency problem.
Irrelevant. Even if 2.5% of gamers consumed 20% of the electricity (which in itself is silly), their premise was false to begin with, which makes this irrelevant.