Folks, I don’t mind telling you that this has been a hard article to write. So hard in fact that I’ve avoided writing it for years. Why? Because electricity is hard to explain, and most of the time the explanations leave you with more confusion than you started with.
Is electricity really like water?
Here’s the most common explanation of how electricity works, from HowStuffWorks:
A neat analogy to help understand these terms is a system of plumbing pipes. The voltage is equivalent to the water pressure, the current is equivalent to the flow rate, and the resistance is like the pipe size.
This is a great way to start a 9th grade term paper, but it doesn't leave you with any real deep understanding of what you need to know. It's not technically untrue, but it's not going to help you in the field. I'm going to try to give you some information that's going to actually let you use these terms with some comfort.
Generally, volts have to be constant. That's a useful thing to know about them, as opposed to thinking that they are like water pressure. Here in the US most outlets give you something close to 120 volts (people commonly say 110). Large appliances and commercial equipment work on 240 volts (commonly called 220), and things that plug into USB work on 5 volts.
If you have too much voltage you will burn out the equipment you have connected. If you don’t have enough, the equipment won’t work right. At the very least it will be flaky and it’s possible it won’t work at all.
When you see the lights dim during a brownout, they're not getting enough voltage. If a circuit breaker is failing in your home, you can get inconsistent voltage through it, and that can make things flicker on and off, and sometimes break.
Amps are a measure of current. Everyone says that as if it explains everything and you don’t need to ask any more questions.
It’s more important to know that amps aren’t usually constant. If a device needs more energy it will naturally draw more amperage. Things like turning an electric motor or heating up the air are hard to do (compared to something like running a phone or a watch) and so they take more amps.
If you can’t supply enough amps one of two things will happen. The most common thing is that your circuit breaker will flip. A circuit breaker is a special form of switch that flips by itself if a specific condition is met. That condition is simple: too many amps flowing through the line.
The other possibility is that mechanical equipment will fail. It's like going to exercise when you haven't eaten enough. Your body can't supply enough power for your muscles to work and so you get faint.
The danger of too many amps is that your wiring will melt, there will be sparks and there will be a fire. That, as they say, is bad.
I’m only mentioning ohms because HowStuffWorks does. Ohms are a measure of resistance which, in my experience, is almost impossible for humans to understand properly. All measures of electricity are tied together by Ohm's law, so that if you know how many volts and amps you have, you can calculate ohms. If you know how many volts and ohms you have, you can calculate amps, and so on.
You will generally not ever have to worry about ohms except when looking at the impedance of cables, which is a completely different thing.
Watts are like the holy grail of electricity knowledge. Watts are going to be the number you use when you’re really figuring out how much power something needs. Why? Because the formula for watts is simple.
Watts = Volts x Amps
So, a watt gives you a measure of how much real electricity is being used by a real thing. You probably first noticed watt measurements with light bulbs. Back before LEDs and all, the watt measurement told you how bright the bulb would be. Why? Because there was a direct measurement between how much power the bulb drew and its brightness. Today with LED and CF bulbs it’s hard to draw that same line. Luckily watts are used in a lot of other ways too.
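To make the formula concrete, here's a tiny Python sketch. The 5-volt, 1.2-amp charger rating is a made-up example, not a spec from the article; check the label on your own power supply.

```python
# Watts = Volts x Amps, using a hypothetical phone charger
# rated at 5 volts and 1.2 amps (made-up numbers for illustration)
volts = 5.0
amps = 1.2

watts = volts * amps
print(watts)  # 6.0
```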
Power supplies measure their output in watts. Sometimes you see amps used in this way as well, especially with mobile products. That works because standard USB is fixed at 5 volts, so the amp rating tells you the same thing. To get watts when looking at mobile devices, multiply amps by 5.
Watts are useful for another reason… you’re generally charged by the watt when you get your electric bill. Actually the measurement you see is kWh, meaning kilowatt-hours. A kilowatt-hour is a measure that says you used 1,000 watts of power for one hour. Or 4,000 watts for 15 minutes. Any combination that multiplies out to 1,000 watt-hours is one kWh.
And you’re billed by the kWh. Depending on where you live, you’re probably billed somewhere between a dime and 40 cents per kWh.
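Here's the kWh conversion from above as a quick Python check, using the same numbers (1,000 watts for an hour, 4,000 watts for 15 minutes):

```python
# Watt-hours = watts x hours; divide by 1,000 to get kWh
wh_one_hour = 1000 * 1        # 1,000 watts for 1 hour
wh_fifteen_min = 4000 * 0.25  # 4,000 watts for 15 minutes

print(wh_one_hour / 1000)     # 1.0 kWh
print(wh_fifteen_min / 1000)  # 1.0 kWh -- same amount of energy
```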
Here’s an example
So, if you want to know how much that phone is costing you to charge, or how much you’re paying to leave that light on, here’s the way you do that.
- Find the watts used by the device. It’s usually on the power supply.
- Figure out how long you keep it on. If it’s on all week, that’s 168 hours.
- Multiply watts by how many hours it’s on all month.
- Divide that number by 1,000 to get kilowatt-hours.
- Multiply by your rate per kWh to get the cost.
Your phone charger uses 6 watts while charging and practically nothing while not charging. So if you were charging all month, 24 hours a day for 30 days, that’s 6 x 24 x 30. In other words, 4,320 watt-hours (6 watts x 720 hours).
4,320 watt-hours = 4.32 kilowatt-hours, because you divide by 1,000. If you’re paying 15 cents per kilowatt-hour, your charger would cost you about 65 cents if you literally charged your phone constantly for a month. More likely, you use your charger about 2 hours a day, so 60 hours a month. That’s 360 watt-hours, 0.36 kilowatt-hours, or about 5.4 cents per month.
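The whole example can be checked with a few lines of Python, using the article's numbers (a 6-watt charger and a 15-cents-per-kWh rate):

```python
# Cost of running a 6-watt phone charger, per the example above
watts = 6
rate = 0.15  # dollars per kWh

# Worst case: charging 24 hours a day for 30 days
hours_constant = 24 * 30                        # 720 hours
kwh_constant = watts * hours_constant / 1000    # 4.32 kWh
print(round(kwh_constant * rate, 2))            # 0.65 dollars

# More realistic: about 2 hours a day
hours_realistic = 2 * 30                        # 60 hours
kwh_realistic = watts * hours_realistic / 1000  # 0.36 kWh
print(round(kwh_realistic * rate, 3))           # 0.054 dollars
```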
In other words, don’t worry about charging your phone.