Questions about electrical usage of computers
This page is the companion to my article, "How much electricity do computers use?"
That page is more likely what you're looking for.
What do you think about the idea of petitioning Google to use a black background to save energy, or encouraging people to use Blackle, a black-background version of Google? — January 2008
It won't save squat. Instead we should be focusing on things that really matter, like heating, cooling, and lighting use — or even just sleeping or turning off our computers when we're not using them.
For those who haven't heard about the black background thing, the idea is that it takes more energy for a computer screen to show a white background than a black background. But there are three reasons that make this irrelevant:
- It's only true for obsolete CRT's, not modern LCD's. CRT's are the huge, clunky old-school monitors that are like TV's, which are already obsolete. By contrast, LCD's are what most people are using today, and a black background makes no energy difference on an LCD. An LCD uses the same amount of energy no matter what the background color. I just checked Best Buy, and they're only offering 2 different CRT's...and 41 LCD's! So for most people, the background color makes no difference at all. (And if you do have an old CRT, the best way to save energy with it isn't to use a black Google, it's to get rid of it and replace it with an LCD, and then sleep it when you're not using it.)
- Most people probably don't search Google from its home page anyway. Modern browsers and toolbars have a Google search box built right into the browser window, so you can search from any window without actually visiting the Google home page. I can't remember the last time I went to the Google home page. It's probably been years.
- Even for people with old CRT's who actually go to the Google home page, the savings is tiny. Even after all the international press that Blackle has received, they claimed to have saved only 425 kWh in the nearly one year they've been up. To put that into perspective, the typical American family uses 888 kWh in a month. So in nearly a year, Blackle has saved less than half of what a family uses in a month. This is just not significant.
Me, I use around 125 kWh/mo., saving 763 kWh vs. the typical household, or 217 kWh on a per-person basis. In two months, I alone save about as much as Blackle's whole international operation saves in nearly a year.
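If you want to check that arithmetic yourself, here's a minimal sketch. The 425 kWh, 888 kWh/mo., and 125 kWh/mo. figures come from the text above; the 2.6-people-per-household figure is my own assumption, not something from the article.

```python
# Back-of-the-envelope comparison of my savings vs. Blackle's claimed savings.
# Household size (2.6 people) is an assumption; the other numbers are quoted above.

blackle_savings_kwh = 425        # Blackle's claimed savings over nearly a year
household_kwh_per_month = 888    # typical American family
my_kwh_per_month = 125           # my own usage
people_per_household = 2.6       # assumption for the per-person comparison

household_savings = household_kwh_per_month - my_kwh_per_month
per_person_savings = household_kwh_per_month / people_per_household - my_kwh_per_month

print(f"My savings vs. a typical household: {household_savings} kWh/mo.")
print(f"My savings on a per-person basis:   {per_person_savings:.0f} kWh/mo.")
print(f"Months for me to match Blackle:     {blackle_savings_kwh / per_person_savings:.1f}")
```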
We can save tons of energy if we switch to CFL lightbulbs and address our heating and cooling use. Or we can chase the things that don't really matter, like using a black background for Google, and save nearly nothing at all.
You advise people to sleep their computers, but computers wear out faster when they're cycled on/off repeatedly, because the constant heating and cooling expands and shrinks the parts. You should read Google's report on hard disk drive failures (PDF) which proves this.
No, this is wrong. As I said quite clearly on the computers page:
- You won't wear out your computer prematurely by cycling it a few times a day. Modern computers are just not that fragile. Jonathan Koomey, a project scientist at the Lawrence Berkeley National Laboratory, agrees: "PCs are not hurt by turning them on and off a few times a day." (Wall St. Journal)
- Even if cycling computers reduced their lifespan (which it doesn't), your computer will still become obsolete and get replaced way before you wear it out.
- Running something all the time (which is what you're suggesting) certainly wears it out faster. Do you think your washing machine would last longer if you ran it 24/7? Your computer has moving parts too (in the hard drive), and it can't run forever.
As for the Google report, it says absolutely nothing about temperature changes from cycling a computer making it wear out quicker. Google's report covered drives that were on continuously; they didn't look at power cycling at all. As for temperature, they found that "[F]ailures do not increase when the average temperature increases. In fact, there's a clear trend showing that lower temperatures are associated with higher failure rates."
Google also found that the more a drive was used, the more likely it was to fail. (p. 5) That supports my point that running your equipment is what wears it out.
You advise people to sleep their computers when they're not using them, but what about using a computer's idle time to help with scientific research, like the projects listed at Grid Republic?
Any energy use has to be weighed against the benefit we expect to get from it. Personally, I don't feel the alleged gains from distributed computing projects are worth the energy it requires and the pollution it creates. Take a look at the projects: One involves searching for extraterrestrial life. Of course it would be kind of ironic if we actually found some aliens and our contact with them was necessarily brief because climate change wiped out life on our planet shortly after we found them. But the real irony is all the projects about studying climate change itself! Now, which is the better way to deal with climate change: Continue studying something that's already been studied to death, or stop causing it in the first place? I'll choose the latter.
What do you think of Shutdown Day, where people are encouraged to keep their computers off for a whole day? — Feb. 2007
It's an interesting idea, but it won't save much energy, nor is that even the point. From the organizers' website: "The purpose is to get people to think about how their lives have changed with the increasing use of the home computer, how society is changing and whether or not any good things are being lost because of this. It is obvious that computers are an extremely important and vital part of society these days."
For energy savings, I prefer that we save energy every day, not just one day a year. Shutting your computer off for one day saves only a piddling 1/365th (about 0.3%) of its annual use. As I always say, most people will save lots more energy by focusing on their heating, cooling, and lighting use rather than their computers. And with your computer, setting it to sleep after X minutes of idle time and swapping your old CRT monitor for an LCD one will drastically cut how much energy your computer system uses — way more than you'd get by turning your system off for just one day.
And of course, if you participate in Shutdown Day and then watch TV or drive a car instead, you're just trading one form of energy use for another.
All that said, if Shutdown Day appeals to you then by all means participate. Just don't fool yourself into thinking that you'll be saving lots of energy by doing so.
What do you think about the push for more energy-efficient computers? — Dec. 2006
It's promising, for sure, but a year later, what happened to that effort? If a reader has heard any more about the effort to make super-efficient PC's, I'd appreciate hearing about it.
The article doesn't put the numbers into a per-unit perspective, but they're significant: They're talking about saving a whopping 45 watts. That's about a one-third savings. Also note that just recently the EPA started granting the Energy Star label to energy-efficient computers, though those standards are not nearly as strict as the ones proposed by the article you mentioned.
And as always, I remind readers that you'll still save more energy by turning your old product off when you're not using it than you will by buying a new efficient model. Not running an unused device is still way more important than how efficient that device is.
The fact that you are using a computer and wasting energy promoting your ideas indicates that the 3 trees we have used today don't mean a thing to your well meaning agenda. You my friend are a well-meaning person who has their life in reverse and in fact waste more energy than you save! — Allen Ross, Rockport, Texas, May 2006
This misses the point. The idea isn't to get your electrical use down to zero, it's simply to reduce it. If you were fat and lost 10 of your 300 lbs., would you consider yourself a failure because you didn't lose all 300? Of course I use a computer. The fact that I employ the various energy-saving techniques described on this site means that I still use far less energy overall than most. When your monthly electrical consumption gets down to 150 kWh like mine, then feel free to write again. :)
My laptop computer runs for five hours on a charge. Does it take more electricity to recharge the battery when it's dead than it would have taken to run the computer off the power adapter for those five hours instead? — Barry Smith, Mar. 2005
Yes, a little bit, because transferring energy from one source to another is always less than 100% efficient. But the difference is not significant. It's not worth measuring.
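To see why the difference isn't worth worrying about, here's a rough sketch. The 20-watt draw, the 85% charging efficiency, and the 10¢/kWh rate are round-number assumptions of mine for illustration, not measurements.

```python
# Rough illustration: running off the adapter vs. charging the battery and
# running off the battery. All three constants below are assumed round numbers.

laptop_watts = 20       # assumed average draw while running
hours = 5               # runtime from the question
efficiency = 0.85       # assumed round-trip efficiency of charging/discharging
rate_per_kwh = 0.10     # assumed electricity price, $/kWh

direct_wh = laptop_watts * hours            # energy to run straight off the adapter
via_battery_wh = direct_wh / efficiency     # energy to charge the battery, then run on it
extra_wh = via_battery_wh - direct_wh

print(f"Extra energy per charge: {extra_wh:.0f} Wh "
      f"(about {extra_wh / 1000 * rate_per_kwh * 100:.2f} cents)")
```

Under those assumptions the penalty is a couple dozen watt-hours per charge, a fraction of a cent.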
Will a surge suppressor protect my computer against a lightning strike? — Anonymous, Jan. 2005
Your question really isn't about saving electricity but I'll let it slide this once.
The answer is no. No surge suppressor you can buy will protect against lightning, which can carry thousands or even millions of volts. But some surge suppressors come with a warranty that covers lightning damage, even though they can't protect against it, because the manufacturers are trying to get you to buy their product. They know the chance of lightning getting into your electrical lines is slim, so their risk in offering that kind of guarantee is small. If you don't already have insurance to cover your electronics equipment, then buying a surge suppressor that comes with a lightning guarantee is an easy way to insure yourself.
Whether or not you're insured (and whether with real insurance or a product guarantee), the only way to keep your equipment safe during a lightning storm is to physically unplug it from the wall. Turning it off isn't good enough, and turning off the power strip it's on isn't good enough — there's enough power in a lightning surge to jump the little gap that turns a switch off. It's up to you whether you want to go to the hassle of unplugging your electronics when there's a storm. Me, I usually leave everything plugged in and just make sure my computer data is backed up.
If I just turn off the monitor at the switch (the monitor button) with the computer still running will it save electricity? — Anonymous, Dec. 2004
Yes, but there's an easier and better way to do it. Go into your computer's Control Panel or Settings [in Vista, Start > Control Panel > System & Maintenance > Change when the Computer Sleeps], and set the screen to automatically turn off when you're not using it. In fact, you should set the computer to automatically sleep as well, so that it saves energy too.
The monitor uses electricity separately from the computer (about 35-80 watts), so turning the monitor off (or sleeping it) does save electricity, but if the computer is still running then it's still using whatever it was using before you messed with the monitor. When you turn off the monitor you might as well sleep the computer too, and save energy there as well.
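For a rough sense of what the monitor alone adds up to, here's a small sketch. The 35-80 watt range is from above; the four idle hours a day and the 10¢/kWh rate are just assumptions of mine.

```python
# Rough sketch: yearly savings from sleeping or turning off just the monitor.
# 35-80 W is the range quoted above; idle hours and price are assumptions.

idle_hours_per_day = 4      # assumed time the monitor would otherwise sit on, unused
rate_per_kwh = 0.10         # assumed electricity price, $/kWh

for monitor_watts in (35, 80):
    kwh_per_year = monitor_watts * idle_hours_per_day * 365 / 1000
    print(f"{monitor_watts} W monitor: {kwh_per_year:.0f} kWh/yr "
          f"(about ${kwh_per_year * rate_per_kwh:.2f}/yr)")
```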
I heard that laptops still use energy through their transformer, even when they are turned off. Is this true, is it best to unplug the laptop overnight? —R. Larsen, Guatemala, Nov. 2004
The transformer for my Apple PowerBook G4 doesn't use any electricity when the computer is off. I don't know about other models, but if they do it will only be about 1 or 2 watts. It's easy to check — if the transformer is warm, it's drawing electricity.
I am in 11th grade at Seattle Lutheran High School. Yesterday I asked my science teacher why he leaves his computer on at night and never turns it off. He said it was too much of a hassle to turn it on and off every day and that one person does not make a difference. I thought that was a bad attitude but i didn't tell him he was wrong because I did not have any proof. I found your website and i am going to give him some stats on keeping his computer on 24/7. What I found out is that by keeping his computer on 24/7, he wastes 499,200 kilowatts a year! My question is: could you tell me what other things those 499,200 kilowatts could be used for. Maybe those watts could be used to power tools during surgery or something like that. Also, he is into saving the environment, so how could saving those kilowatts be good for the environment. Thanks so much. —Maya Sears, Jan. 2004
Your science teacher at your Lutheran high school said it was too much of a hassle to turn his computer off at the end of the day? Some Lutheran he is. Anyway, good for you on trying to help save electricity! You are right, one person does make a difference. If you look on the front page of my site you'll see a link to a calculator which will show you how much pollution is generated to power your teacher's computer overnight. To use this calculator you'll need to first figure out how much electricity the computer uses, and I'm afraid your calculations are off. First, it looks like you calculated watt-hours, not kilowatt-hours; you have to divide watt-hours by 1000 to get kilowatt-hours. You might also have assumed the computer isn't in sleep mode overnight, although it probably is; sleep mode uses much less electricity.
Energy use varies a lot from computer to computer, and if the computer is in sleep mode then it uses even less energy. You can find out exactly how much the computer uses by using a watt-meter. You might also be able to find out on the manufacturer's website how much it uses in sleep mode. If we just assume the computer is in sleep mode and uses 25 watts, and that it's on for an extra 17 hours a day, that's 17 x 25 = 425 watt-hours per school day. If it's on 24 hours a day on the weekends, then that's 24 x 25 = 600 watt-hours. So for the whole week we have (5 x 425) + (2 x 600) = 3325 watt-hours per week. For a school year of 39 weeks, that's 39 x 3325 = 129,675 watt-hours. There are 1000 watt-hours in a kilowatt hour, so that's 129.7 kWh.
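Here's that same arithmetic as a short sketch you can tweak. It uses the same assumptions as the paragraph above: 25 watts in sleep mode and a 39-week school year.

```python
# Extra energy from leaving a school computer in sleep mode overnight and on
# weekends, using the same assumptions as the text (25 W sleep, 39-week year).

sleep_watts = 25
extra_hours_weekday = 17     # hours left on overnight, each school day
hours_weekend_day = 24       # left on all day Saturday and Sunday
school_weeks = 39

wh_per_week = (5 * extra_hours_weekday + 2 * hours_weekend_day) * sleep_watts
kwh_per_year = wh_per_week * school_weeks / 1000

print(f"{wh_per_week} Wh per week, or {kwh_per_year:.1f} kWh per school year")
```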
Using the calculator link, choose "Commercial" and keep putting in a smaller dollar amount until the kWh gets down to 130 kWh. Then you'll be able to see how much pollution is caused by the computer being on overnight.
As to what else that energy could be used for, the answer is "anything that uses electricity," but that's not the best question. The reason to save electricity is not so you can use it for something else, it's so you can decrease pollution and save money. Good luck!
My brother leaves the computer on 24/7. He told me that his friend told him that when a computer is idle it burns as much energy as a watch battery. I didn't believe this since a watch battery produces electricity. What do you think? — Chris O'Neil, July 2003
You can find this kind of info yourself on this site. Information about how much energy is used by computers and how much is stored in household batteries is on the "How do I find out how much electricity something uses?" page. But let's tackle your question here anyway in the interest of dispelling some myths.
First, you're right that comparing a computer to a battery is an invalid comparison, but for a different reason: a battery stores a fixed amount of energy, but the amount of energy used by a device increases the longer it's left on. In other words, the comparison fails to consider time. It's like saying "My cat eats as much cat food as is in that bag." Sure, but over what period of time? An hour, a week, a month? Similarly, a computer will definitely use as much energy as is in a battery; the question is how long does it take to do so?
Second, a computer that's "idle" is not the same as one that's in sleep mode or on standby. Sleep/standby put the computer into a low power state which saves electricity. If the computer is simply idle (not being used) it won't save any electricity at all.
So let's see how long it would take a computer in sleep mode to burn through the energy contained in a watch battery. According to Watch Batteries USA a typical button battery is 1.55V and 105mAh (0.105 amp-hours), and as we know from our earlier page "How do I find out how much electricity something uses?", we just multiply volts times amps to get watts. So 1.55V x 0.105Ah = 0.16 watt-hours. Also from our How do I find out... page we see that a sleeping computer uses about 22 watts. So a sleeping computer would use the energy contained in a watch battery in 0.0073 hours (0.16/22), or less than thirty seconds.
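Here's the same calculation as a sketch, using the battery figures from Watch Batteries USA quoted above and the 22-watt sleep figure from my other page.

```python
# How long a sleeping computer takes to use the energy stored in a watch battery.
# Battery specs and the 22-watt sleep figure are the ones quoted above.

battery_volts = 1.55
battery_amp_hours = 0.105
sleeping_computer_watts = 22

battery_wh = battery_volts * battery_amp_hours          # about 0.16 watt-hours
hours_to_drain = battery_wh / sleeping_computer_watts

print(f"Battery holds {battery_wh:.2f} Wh; a sleeping computer uses that "
      f"in about {hours_to_drain * 3600:.0f} seconds")
```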
I work for a company that has over 100 computers in its offices and production areas. One of our MIS people tells me that it is not a good idea to turn our computers off during the night and on weekends because it puts extra stress on them to turn them off and on, and that you are shortening the life of a computer and monitor by repeatedly starting it. Do you know if this is actually true? — Janet Smith, April 2003
Your MIS people are wrong. You're not going to wear out your computers prematurely by cycling them off/on overnight and on weekends. This is just an urban myth. Modern computers are simply not that fragile. (By the way, I did technical support at Apple Computer for five years, so I have a background in troubleshooting computer hardware.)
The useful life of a computer these days is only a few years anyway. The computer will become obsolete long before you wear it out, no matter how often you cycle it.
Something else to consider is that all devices eventually wear out after running for a long period of time. Keeping your computer on constantly means it's running three times longer than normal. This extra running time is at least as likely to wear out your computer as turning it off at night. For the monitor, it'll definitely wear out quicker by keeping it on rather than turning it off.
Even if turning a computer off once a day shortened its overall life by a few days, it wouldn't pay to keep it on all the time. Your hundred computers are costing your company several hundred to several thousand dollars a year in energy costs if they're not being turned off at night.
If a computer is used from 8:00 to 5:00 on weekdays, then it's not being used for 16 hours a day during the work week, plus 48 hours on the weekend. That's a total of 128 hours a week, or 6656 hours a year. At an estimated 25 watts in sleep mode, that's 6656 x 25 = 166,400 watt-hours per year, or 166.4 kilowatt-hours per year. At $0.10 a kilowatt-hour, that's $16.64/year.
That's for just one computer. For a hundred computers it would be $1,664/year. If the computers aren't in sleep mode, then at 150 watts it's more like $9,984/year.
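Here's the same math in a form you can adjust for your own numbers. The 25-watt (sleep) and 150-watt (awake) figures and the 10¢/kWh rate are the same estimates used above.

```python
# Yearly cost of leaving office computers on outside working hours,
# using the same estimates as the text (25 W sleeping, 150 W awake, $0.10/kWh).

computers = 100
off_hours_per_week = 16 * 5 + 48      # weeknights plus the whole weekend = 128 hours
weeks_per_year = 52
rate_per_kwh = 0.10

for label, watts in (("sleeping", 25), ("awake", 150)):
    kwh = off_hours_per_week * weeks_per_year * watts / 1000
    print(f"{label:8s}: {kwh:.1f} kWh/yr per computer, "
          f"${kwh * rate_per_kwh * computers:,.0f}/yr for {computers} computers")
```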
And then there are the hidden costs. Computers generate heat, and your company is paying a pretty penny for air conditioning to remove all the heat generated by computers that should be off when they're not being used. And if the AC doesn't run overnight and on weekends, running the computers in a hot environment will do more to shorten their lives than turning them off once a day.
Bottom line: Turn your computers off at night, and don't worry about it.
Here are some articles with more detailed information on this topic:
- How Stuff Works: Whether to turn off?
- Mean Time Between Failure vs. Mean Cycles Between Failure
- MonitorWorld: How can I maximize the life of my monitor?
Postscript: An MIS person told me he wants the computers kept on overnight at his facility so they can install software updates over the network. Updates installed during the day would slow down the computer while someone is trying to use it. But companies that do this pay a big penalty in electrical and cooling costs. For companies that must do this, there's probably software that puts the computer into standby (sleep) mode after installing the updates, or right away if there are no updates to install that night.
I have my computer on from about 9 AM to midnight. My question: Would I save electricity if I use sleep mode instead of just sitting with a screen saver going? — Louise Gullion, 2-02
Yes, that's what Sleep mode is for. A screen saver that puts images on the screen doesn't save any electricity, since the monitor and computer are still active. You save electricity only if the monitor isn't generating an image and the computer isn't thinking. Putting the computer to sleep puts both the monitor and computer into very-low-power mode.
We have a visitor in the house who chats on the Internet every single day from about 11:00 p.m. until 7:00 a.m....yes....8 hours straight every day.... My electricity bill doubled last month...could it possibly be from the extra use of the computer? Shall I kick him out? — Kathie Myers, 1-01
Your visitor didn't double your electric bill, unless your bill was about $3.13 a month before (s)he moved in.
A typical desktop computer uses about 65 watts, plus another 80 watts for a CRT monitor. The How much it costs section explains how to figure out how much you're paying for electrical use. For example, 145 watts of computer equipment times 8 hours a day times 30 days a month is 34,800 watt-hours, or 34.8 kilowatt-hours. If you're paying 9¢ a kilowatt-hour, your friend's computer use is costing you $3.13 a month.
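Here's that calculation spelled out so you can plug in your own numbers. The 65- and 80-watt figures and the 9¢/kWh rate are the same estimates as above.

```python
# Monthly cost of 8 hours a day of computer use, using the estimates above.

computer_watts = 65
crt_monitor_watts = 80
hours_per_day = 8
days_per_month = 30
rate_per_kwh = 0.09

kwh_per_month = (computer_watts + crt_monitor_watts) * hours_per_day * days_per_month / 1000
print(f"{kwh_per_month:.1f} kWh/mo., about ${kwh_per_month * rate_per_kwh:.2f}/mo.")
```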
Your bill might have gone up because of a rate increase. Did you check to see if the amount of electricity you used went up (kWh), or just the amount you paid? Energy is getting increasingly expensive, as the world is running out of cheap fossil fuels.
I've heard that when you start up a computer it uses the same amount of electricity as it would if left on and casually used for a week. I've also heard that PC's consume ridiculous amounts of power even on energy saving mode. Is it better to shut down the computer or use energy saving mode? I've heard that having a higher refresh rate and resolution on a monitor coupled with a higher performance computer (megahertz, etc.) uses more power. Is there a significant difference from desktop to desktop? — Y. Gonzalez, Miami FL, 2000
(1) It doesn't take any extra electricity to start up a computer. You can verify this by measuring the consumption, as described on this site.
(2) Energy-saving mode does save power. That's what it's for. The amount saved is listed in the specifications for your computer, or you can measure it using the methods described on this site.
(3) Shutting down obviously saves more electricity than energy-saving mode, but it's inconvenient to have to wait for your computer to start up after a shut down. Probably the best compromise is to shut your computer down at the end of the day and turn it back on the next day (and use energy saver throughout the day when you're not using it).
(4) Monitor resolution and refresh rate don't affect electricity use. Yes, faster processors use more power but you'll have to measure it yourself if you want to compare two different computers. The biggest difference is that the newer computers with flat-panel LCD monitors (like on the newest iMacs) use a lot less electricity than traditional CRT monitors.