Saving Electricity home
Michael Bluejay's home page | Contact
As seen in Newsweek, Forbes, NPR, the Christian Science Monitor, CNET, PC Magazine, InfoWorld, and everywhere else.

NOTE: I haven't updated the site in years and some information might be outdated.  I hope to update the content someday if I can find the time...



What's the electrical limit of an outlet, circuit, or panel?

Last update: March 2016

How many watts can a standard outlet deliver before it's overloaded? I ask this because occasionally when I'm using a lot of electronic appliances, electricity shuts off in parts of my home.  I have to switch the fuse in order to restore power. What am I doing wrong?   -- Mark L.

You're not overloading an outlet, you're overloading a circuit.  Each circuit usually supplies power to several outlets and lights.  For example, Circuit A might supply the four outlets in the master bedroom plus the ceiling light, Circuit B might supply all the power to the bathroom, and so on.  Each circuit is protected by a breaker or a fuse, which cuts the power when the circuit's total load gets too high.  So you don't really overload an individual outlet, you overload a whole circuit.

You can't tell which circuit an outlet's on just by looking at it.  The only way to tell is to plug something in, turn it on, and keep turning off breakers (or removing fuses) until the appliance turns off.  You can make a circuit map of all outlets and lights in your home this way.  Once you know which outlets are on the circuit that's being overloaded (and which are not), you can plug some of the offending appliances into outlets on different circuits.  That way the overloaded circuit won't have to try to supply so much power.

Also, if there are any lights on the overloaded circuit, replace them with LEDs or compact fluorescent bulbs, which use 70-90% less energy than normal bulbs.

So, to rephrase your question, how many watts can a circuit deliver before it's overloaded?  Most modern residential circuits are 15 or 20 amps, so we're looking at a max load of either (15A x 120V =) 1800 watts or (20A x 120V =) 2400 watts before the breaker trips.  The breaker will be labeled either 15 or 20.  I'm unfamiliar with old-style fuse-type circuits but I'm guessing they're also around 15 or 20 amps.

For continuous loads (anything on for more than three hours), the limit is 20% lower.  So for a 15-amp breaker, you can't draw more than 12 amps from the circuit for more than three hours, or 1440 watts (12A x 120V).  And what do you know, the wattage of a huge window-unit AC or a large electric space heater is...1440 watts. (source 1, source 2)
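The arithmetic above can be sketched in a few lines of Python (using the standard U.S. breaker sizes and voltage mentioned above):

```python
# Maximum circuit load: watts = amps x volts.
VOLTS = 120  # standard U.S. household voltage

for breaker_amps in (15, 20):
    max_watts = breaker_amps * VOLTS
    # Continuous loads (on 3+ hours) are limited to 80% of the rating.
    continuous_watts = breaker_amps * 0.8 * VOLTS
    print(f"{breaker_amps}A circuit: {max_watts}W max, "
          f"{continuous_watts:.0f}W continuous")
```

This prints 1800W/1440W for a 15-amp circuit and 2400W/1920W for a 20-amp circuit, matching the figures above.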

Some people are tempted to swap out a breaker with a larger one to keep it from tripping.  Don't.  Your home's wiring almost certainly isn't thick enough to handle a higher load. If you put more current through the wiring than it's capable of handling, it can heat up and burn your house down.  If you keep tripping a breaker, just plug some of the offending items into different circuits (or stop using so much electricity to begin with).

Thanks to Frank Ketchum for the reference to the National Electric Code.

I'm trying to determine how many amps I'm putting on a circuit so I don't overload it, but I'm having a hard time understanding the labels. For instance, my DSL modem adapter says "INPUT: 120V 60Hz 30W" and "OUTPUT: 12VAC 1.67A"  I understand how to convert watts to amps [Watts / Voltage = Amps], so it looks like in this case the input (30 watts or .25 amps) is less than output (200.4 watts or 1.67 amps).  But your site says that input is always higher than output. What am I missing? -- David H.

What you're missing is that the input is 120 volts but the output is only 12 volts. Electricity from the wall is AC, and is 120 volts. The adapter changes that to low-voltage DC, usually 3, 6, 9, or 12 volts. So the output is 12V x 1.67A = 20W, which is less than the 30W input. Output is always less than input, because the conversion process is inefficient.
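The key is to multiply volts by amps separately on each side of the adapter. A quick check in Python, using the numbers from the modem adapter's label:

```python
# Power = volts x amps, computed on each side of the adapter separately.
input_watts = 120 * 0.25   # wall side: 120V at 0.25A = 30W
output_watts = 12 * 1.67   # low-voltage side: 12V at 1.67A, about 20W
efficiency = output_watts / input_watts
print(f"input: {input_watts:.0f}W, output: {output_watts:.1f}W, "
      f"efficiency: {efficiency:.0%}")
```

Output is about 20W against a 30W input, i.e. roughly 67% efficiency; the rest is lost in the conversion, mostly as heat.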

I've heard that it can be a potential hazard to plug a power strip directly into another power strip and that you should rather plug the power strip directly into a wall outlet "only", is this true? Along those same lines, I also heard that you should only plug "one" item into an extension cord even if they have several plugs available. What are the facts?  -- Cincy W., Berkeley, CA

As with wall outlets, it's not the number of items you plug into a strip or extension cord that's the problem, it's the total amount of electricity they draw.  You could plug five power strips into the first strip, then plug in a whopping 25 clock radios into the five power strips, and you wouldn't have a problem.  However, plug just two space heaters into the same outlet or strip and you'd have a problem right away.  Don't exceed the amperage rating of the outlet, power strip, or extension cord.
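Here's that comparison as a small Python sketch. The wattages are illustrative assumptions (a clock radio at about 2W, a typical space heater at 1500W):

```python
# What matters is total draw, not the number of devices plugged in.
VOLTS = 120
STRIP_LIMIT_AMPS = 15

def total_amps(device_watts):
    """Total current drawn by a list of devices, in amps."""
    return sum(device_watts) / VOLTS

clock_radios = [2] * 25          # 25 clock radios at ~2W each (assumed)
space_heaters = [1500, 1500]     # two typical 1500W space heaters

print(round(total_amps(clock_radios), 2))   # ~0.42A: no problem
print(round(total_amps(space_heaters), 2))  # 25.0A: overloads the strip
```

Twenty-five small devices draw well under half an amp, while just two heaters blow past the strip's 15-amp rating.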

Another issue is that when you use power strips or extension cords, especially if you daisy-chain them, the capacity drops because the wiring inside them isn't as thick as the wiring in your walls.  When the wire is too thin and the electrical load is very high, the wire can heat up, melt, and start a fire.  That shouldn't be an issue if your appliances are low-draw, like clock radios, but if they're things like dehumidifiers or heaters then that's a problem.

That's probably the reason that UL (the group which evaluates the safety of electrical products) says that you shouldn't daisy-chain multiple power strips, and that you shouldn't plug a power strip into an extension cord.  It's easier to just say "never do that" than to explain in what circumstances it's okay and in what cases it's not.  They might have other reasons I don't know about, so of course if you flout their advice then you do so at your own risk.  Personally, though, I'm unconcerned when doing it myself as long as my devices are sufficiently low-draw.

Regarding a three-outlet extension cord, why would they put multiple outlets on the cord if you weren't supposed to use them?  As long as the total of the devices you're plugging in doesn't exceed the wattage or amperage of the cord (look for the label on the cord) or the circuit the cord is plugged into (whichever is lower), you should be fine.

My service panel is 125 amps, but the total of the individual breakers inside the panel is over 400 amps!  Is my panel overloaded, and is this dangerous?  I'm so freaked out!

Dude, take a chill pill, your panel is fine.  The 125-amp limit on the panel means that's the most your whole house can draw at once; it's unrelated to how many breakers you have.  You might have twenty-five 20-amp breakers (theoretically 500 amps total), but usually at any one time you'll be using just a few amps on several circuits and none on some others.  So, let's say you're using 3 amps on each of ten 20-amp circuits.  Your actual power draw is 10 x 3A = 30 amps, not 10 x 20A = 200 amps.
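The difference between the sum of the breaker ratings and the actual load can be sketched like this (the circuit counts and per-circuit draws are the hypothetical ones from the example above):

```python
# Sum of breaker ratings vs. actual simultaneous draw on a 125A panel.
breaker_ratings = [20] * 25            # twenty-five 20-amp breakers
actual_draws = [3] * 10 + [0] * 15     # 3A on ten circuits, nothing on the rest

print(sum(breaker_ratings))  # 500 -- meaningless as a load figure
print(sum(actual_draws))     # 30  -- well under the 125A panel limit
```

The 500-amp total is just the theoretical ceiling of every breaker maxed out at once, which never happens in normal use.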

If you draw more than the panel's 125A limit, the main breaker is supposed to trip and shut off power to the whole panel.

We have about 120 servers running in a computer data center. The specs on these say that the power supplies are "Auto-switching 100/240V AC power". Now, if I'm reading your site right, then the most these should draw would be 2 amps --however, we have had five plugged into a 15-amp power strip and the strip has tripped!  My question is, how can these computers be drawing (as they must be) more than 3 Amps each?   --Jessica P.

First off, the 15-amp rating is only for an instantaneous load. For a continuous load, it's likely about 20% less. So your 15-amp strip is really a 12-amp strip, if the equipment is on constantly.

Next, the 100/240V label means that the server can handle any input from 100V to 240V, so it will work with the voltage in any country. (The U.S. is 120V, Japan is 100V, and most other countries are 220-240V.) But your question wasn't about foreign use, so now that we've taken care of the 100/240V label let's move on.

Next, I don't see where you're getting that your servers draw a maximum of two amps.  That's unrelated to the 100-240V label.  If the maximum amperage isn't listed on the label, then the wattage will be, and you divide the watts by the volts (120V in the U.S.) to get the amps.
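That conversion is one line of Python. The 300W figure below is just a hypothetical nameplate wattage, not a spec for Jessica's servers:

```python
# Converting a nameplate wattage to amps: amps = watts / volts.
def nameplate_amps(watts, volts=120):
    """Current drawn at a given wattage on a U.S. 120V circuit."""
    return watts / volts

# Hypothetical server nameplate of 300W:
print(round(nameplate_amps(300), 2))  # 2.5
```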

So let's say you've checked the label one of those two ways and you know your five servers together should be drawing no more than 10 amps. Why would they trip your "15-amp" (really 12-amp) power strip? There are two possible answers:

The first possibility is that you have a faulty power strip. Try another one.

The second possibility is that the brief surge of current when the final server or two is switched on is enough to exceed the 15-amp rating of the strip. That startup surge is so brief that it uses a negligible amount of energy (you'll never see it on your electric bill), but it's sometimes enough to trip a power strip or circuit breaker.

If swapping out power strips doesn't work, I suggest getting a cheap watt-meter and measuring how much electricity each server is using. Either way, I'd be interested in hearing what you ultimately discover.


©1998-2018 Michael Bluejay, Inc. All Rights Reserved. Unauthorized reprinting is prohibited.
All advice is given in good faith. We're not responsible for any errors or omissions. Electricity can kill you; if you're not competent to work on your electrical wiring then hire a professional to do it.
Contact | Misquoting this Website | Privacy | Advertising | My home page

If you liked this site, you might like some of my other sites:

Guide to Household Batteries   Finding Cheap Airfare   How to Buy a House   Bicycle Safety   SEO 101: Getting good search engine rankings