110 vs 220 heater: is one more efficient?


Is there an energy efficiency difference between 110v space heaters and 220v space heaters?  One sub says that when turning electricity into heat there is no difference at this level: it takes the same amount of electrical energy to heat the room whether the voltage is 110 or 220.  I always thought 220 was more efficient.

(post #111992, reply #1 of 8)

I know 220 will cut the amperage in half on each breaker, but it's drawing off 2 breakers, so I think he's right. If it draws 1300 watts, it's still 1300 watts, just off 2 legs instead of one. I prefer 220 since I think the wires stay cooler with less draw on each leg. Just my 2 cents.

Headstrong, I'll take on anyone!

(post #111992, reply #4 of 8)

A good point. 120v loads can be configured, by chance or sloppiness, so that the loads between the two hot legs are highly unbalanced, with most of the load placed on one leg while the other sits nearly idle. That can trip the main breaker or cause the main connections, hot and neutral, to run hot.

240v loads are essentially self-balancing as they automatically draw equally from both hot legs.

A 100A service gives you 100A at 240v. Essentially 200A at 120v. Theoretically a highly unbalanced panel, one feeding from one leg only, could top out while providing only 100A at 120v.
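The capacity arithmetic above can be sketched in a few lines. This is just an illustration of the math, not a load-calculation tool; the 100A figure is the service rating from the post, and everything else is made up for the example:

```python
SERVICE = 100  # amps available on each hot leg of a 100A/240v service

def total_120v_capacity(fraction_on_leg_a):
    """Total 120v amps deliverable before the heavier leg hits its limit.

    fraction_on_leg_a: share of the 120v load wired to leg A (0.0 to 1.0).
    """
    heavier_share = max(fraction_on_leg_a, 1 - fraction_on_leg_a)
    return SERVICE / heavier_share

print(total_120v_capacity(0.5))  # balanced panel: 200 A at 120v
print(total_120v_capacity(1.0))  # everything on one leg: 100 A at 120v
```

So a perfectly balanced panel delivers the full "essentially 200A at 120v," while the worst-case unbalanced panel tops out at 100A, exactly as described.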

On a practical level, my strategy for preventing this is to stack the breakers in such a manner as to diminish the chances of an unbalanced load. I usually run my 15A breakers down one side, leave a few spaces for additions later, then stack the 20A under them. The 240v loads run down the opposite side, starting with the largest and most frequently used, typically any electric heat or the water heater, and going down to the less frequently used and smaller loads.

Even without looking at what each circuit is carrying this tends to keep the loads on each leg pretty close to one another. Of course a quick glance at what these loads are can help tune the system. Some of this is a matter of how the people living in the house behave.

(post #111992, reply #2 of 8)

The basic formula is volts X amps = watts. [Actually VA, but the heater is pure resistance so it doesn't matter.]  Herr Ohm passed a law a while ago which requires Volts = Amps X Resistance.  Violation is problematic. Algebraic manipulation [A = W/V] hints that for a fixed wattage, as volts go up, amps go down. Thus 220 volts would move 50% fewer amps than 110. Lower amps mean less voltage drop, which leaves more of the power to be burned up in your heater. Also you get to use smaller wire. Hence, 220 volts is very slightly more efficient on a theoretical basis, but not enough to show up on your electric bill. [The waste heat in the wires will also help heat your house.]
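A quick sketch of Peter's arithmetic. The 1300 W rating is borrowed from an earlier reply, and the wire resistance is a made-up illustrative value, not a real circuit's:

```python
P = 1300.0     # heater wattage (from an earlier reply)
R_WIRE = 0.1   # round-trip wire resistance in ohms (hypothetical)

for volts in (110.0, 220.0):
    amps = P / volts           # A = W / V: same wattage, fewer amps at 220
    drop = amps * R_WIRE       # V = A x R: voltage lost along the run
    print(f"{volts:.0f} V: {amps:.2f} A drawn, {drop:.2f} V dropped in the wire")
```

Half the amps at 220v, and correspondingly half the voltage drop over the same wire, which is the whole of the (tiny) theoretical advantage.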


~Peter


Why can't the French properly pronounce their own language?

(post #111992, reply #5 of 8)

Peter, you're going to trip people up hinting that as voltage goes up, current goes down. I know that you know that the 110v heater and the 220v heater are made with different resistances. If you took the 220v heater and applied 110v to it, it would draw half the current that it would at 220v.
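This is worth seeing in numbers. A sketch assuming both heaters are rated 1300 W (an illustrative figure, not from the thread's hardware): each element's resistance follows from R = V²/W, and running the 220v element at 110v halves the current and quarters the power.

```python
P_RATED = 1300.0  # rated wattage of each heater (illustrative)

for v_rated in (110.0, 220.0):
    r = v_rated ** 2 / P_RATED   # element resistance chosen by the designer
    print(f"{v_rated:.0f} V heater element: {r:.1f} ohms")

# Misapply 110 V to the 220 V element:
r_220 = 220.0 ** 2 / P_RATED
amps = 110.0 / r_220             # Ohm's law with the fixed 220 V resistance
print(f"220 V element on 110 V: {amps:.2f} A, {110.0 * amps:.0f} W")  # quarter power
```

Same element, half the voltage, half the current, one quarter the heat. That's why the heaters are built with different resistances in the first place.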

(post #111992, reply #3 of 8)

In practical terms, no.

On longer lines, a higher voltage allows either a smaller-gauge conductor, cheaper and lighter, to be used while keeping the voltage drop acceptable, or, possibly in combination with that, the conductors can stay heavier and the savings show up as lower voltage loss and better efficiency.

The first case is more advantageous for loads that use the lines lightly or intermittently, the second for lines feeding heavy loads that run for long periods.

For the runs and loads common to any normal-sized house, the savings tend to be small. Often too small to be accurately measured in the real world, and essentially too insignificant to worry about.

(post #111992, reply #6 of 8)

Watts are watts.  If you fix the wire size (you wouldn't), then a 220-volt unit puts a slightly larger fraction of the consumed energy out the business end and dissipates slightly less heat into the wall.


Install costs for 110-volt heaters will require larger wire for the same wattage but only a single-pole breaker.  Anything you can do with mass-marketed 12-gauge romex is cheaper than going to 10-gauge or larger.  And single-pole, 20-amp breakers are cheaper than 30-amp breakers or anything with 2 poles.


At most it is $6 difference in materials one way or the other and often much, much less of a difference.


If you can install ONE 220-volt unit instead of TWO 110-volt units (because they are available in larger wattages), that is a significant labor savings.  Let that be your guide.


David Thomas   Overlooking Cook Inlet in Kenai, Alaska

(post #111992, reply #7 of 8)

Sounds like they are right.


Watts is watts. It has to generally follow the laws of thermodynamics. It takes x amount of energy to do a job ... period. And while there may be slight differences in e.g. wire resistance (and the Btu's lost through warmer wires), the difference in your electric bill should be undetectable.

(post #111992, reply #8 of 8)

Watts is watts, whether at 2 volts or 240.

The advantage of 240V devices is that half as much current flows for a given power level, so smaller wire can be used, and there is less heat loss in the wiring itself (less by a factor of 4 if the wire size is not reduced).
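The factor-of-4 claim follows directly from I²R: halve the current through the same wire and the loss drops to a quarter. A check, using an illustrative 1500 W load and a made-up wire resistance:

```python
P = 1500.0     # load wattage (illustrative)
R_WIRE = 0.1   # fixed wire resistance in ohms (hypothetical, same wire both ways)

loss_120 = (P / 120.0) ** 2 * R_WIRE   # I^2 * R at 120 V
loss_240 = (P / 240.0) ** 2 * R_WIRE   # I^2 * R at 240 V
print(loss_120 / loss_240)             # → 4.0
```

Doubling the voltage halves the current, and the loss scales with the square of the current, hence exactly 4x.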

Note that heat loss in the wiring may or may not represent a loss of efficiency, depending on whether the wiring is inside or outside the area being heated. In most residential heating applications the wire is nominally inside the heated area, so wiring heat loss can be discounted.


Of all the preposterous assumptions of humanity over humanity, nothing exceeds most of the criticisms made on the habits of the poor by the well-housed, well-warmed, and well-fed.  --Herman Melville