
# A basic question (with different answers)

1. Junior Member
Join Date
Aug 2015
Location
Clute, Tx
Posts
10
Reputation


Hello all,

I have a question that I ask from time to time and I seem to always get different results.
My question is this... (This is just a scenario/example)
I have a load of 360W and my input voltage can be 120-240V.
At 120V, with a 360W load, my current is 3A.
At 240V, with a 360W load, my current is 1.5A.
Which is more efficient?
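
The arithmetic in the scenario is just I = P/V; a quick sketch confirming the numbers:

```python
# Sanity check of the scenario's numbers: for a fixed load power P,
# current I = P / V, so doubling the voltage halves the current.
P = 360.0  # load power in watts

for V in (120.0, 240.0):
    I = P / V
    print(f"{V:.0f} V -> {I:.2f} A")
# 120 V -> 3.00 A
# 240 V -> 1.50 A
```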

What I am usually told is 240V, because you don't lose as much to heat as you would with 120V.
I've been given this same answer by 10+ people, ranging from fresh out of college to 30+ years of experience.
My other answer had to do with how 240V works in standard U.S. homes and the way the sine waves never go below zero. (I might have worded this wrong.)
This was told to me by 2 people with 40+ years of experience.

My question to you all is: what's your answer? I'm still confused after thinking about this for over a year. Maybe you all can clear it up once and for all!

2. Both answers are essentially saying the same thing. The higher voltage is more efficient because, by using both legs of the split-phase service, it draws less current and needs smaller conductors than it would at the lower voltage. Furthermore, if you have a long run, the voltage drop will be less significant at the higher voltage.
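
The long-run point can be made concrete with a rough sketch (the wire gauge, run length, and resistance figure below are illustrative assumptions, not from the thread):

```python
# Voltage drop on a hypothetical 100 ft one-way run of 12 AWG copper
# (~1.588 ohm per 1000 ft at room temperature), carrying the same
# 360 W load at 120 V vs 240 V. The drop is I * R over the round trip.
R_PER_1000FT = 1.588            # ohms per 1000 ft, 12 AWG copper (approx.)
ONE_WAY_FT = 100.0
P = 360.0                       # load power in watts

R_loop = R_PER_1000FT * (2 * ONE_WAY_FT) / 1000.0   # out and back

for V in (120.0, 240.0):
    I = P / V
    drop = I * R_loop
    print(f"{V:.0f} V: I = {I:.2f} A, drop = {drop:.3f} V "
          f"({100 * drop / V:.2f}% of supply)")
```

With these assumed numbers the 240V run drops about a quarter of a percent of supply voltage versus roughly four times that at 120V, since both the current and the percentage base change.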

3. Member
Join Date
Apr 2014
Location
United States
Posts
42
Reputation
Originally Posted by ArchIsChompa
What I am usually told is 240V, because you don't lose as much to heat as you would with 120V.
Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly it shouldn't matter. I guess that's why it's more cost-effective to use the higher voltage.

4. Junior Member
Originally Posted by cjones09
Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly it shouldn't matter. I guess that's why it's more cost-effective to use the higher voltage.
This.

Originally Posted by SecondGen
Both answers are essentially saying the same thing. The higher voltage is more efficient because by using two phases it requires less current and smaller conductors than would be needed if you went with the lower voltage. Furthermore if you have a long run, the voltage drop will be less significant at the higher voltage.
It makes sense, yes. But then why don't we just use 480V for everything, seeing how it's more efficient than lower voltages with higher currents? Is it because the cost of insulation would make it kind of pointless, or would it be worth it in the long run?
I feel like 120V vs. 240V is a trade-off between the cost of copper (or whatever conductor you use) and the cost of insulation.

5. Actually, most modern industrial environments do use 480/277V for mechanical equipment and lighting. Many data centers utilize higher voltages to improve efficiency as well; Google has been using higher voltages in their servers since the early 2000s. In residential applications I am sure 120V is used for safety; 480/277 is a nasty voltage.

Originally Posted by ArchIsChompa
It makes sense, yes. But then why don't we just use 480V for everything, seeing how it's more efficient than lower voltages with higher currents? Is it because the cost of insulation would make it kind of pointless, or would it be worth it in the long run?
I feel like 120V vs. 240V is a trade-off between the cost of copper (or whatever conductor you use) and the cost of insulation.

6. Originally Posted by ArchIsChompa
Is it because the cost of insulation would make it kind of pointless or would it be worth it in the long run?
Most low voltage wire has 600V insulation so you can use the same wire for 480V that you also use for 120V and there is no cost difference.

7. Junior Member
Originally Posted by SecondGen
Actually most modern industrial environments do use 480/277V for mechanical equipment and lighting. Many data centers utilize higher voltages to improve efficiency as well. Google has been utilizing higher voltages in their servers since the early 2000's. In residential applications I am sure 120V is used for safety, 480/277 is a nasty voltage.
This is true. I never did take safety into account, but it makes a ton of sense. Thanks a lot, this actually feels like closure to me on this question. It's been eating away at me for quite some time.

Most low voltage wire has 600V insulation so you can use the same wire for 480V that you also use for 120V and there is no cost difference.
I also didn't take this into account. It seems I left a lot of variables out! But thank you, you two have finally put this question to rest for me!

8. Junior Member
Join Date
Oct 2015
Posts
1
Reputation
Originally Posted by cjones09
Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly it shouldn't matter. I guess that's why it's more cost-effective to use the higher voltage.
As current passes through the conductor, power is dissipated as heat in the conductor's resistance. These losses throughout the circuit are given by the equation P = I^2*R, and are referred to as I^2R losses.
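
A quick sketch of that comparison for the 360W scenario above (the conductor resistance R is a made-up illustrative value):

```python
# The I^2*R point in numbers: same conductor (same R), same 360 W load.
# Halving the current quarters the heat dissipated in the wiring.
R = 0.3  # hypothetical total conductor resistance in ohms

loss_120 = 3.0 ** 2 * R    # 3 A at 120 V
loss_240 = 1.5 ** 2 * R    # 1.5 A at 240 V

print(loss_120, loss_240)   # 2.7 W vs 0.675 W
print(loss_120 / loss_240)  # 4.0 -> a factor of four
```

Whatever R actually is, the ratio is fixed: doubling the voltage halves the current and so cuts the conductor heating by a factor of four.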

9. Junior Member
Join Date
Oct 2015
Posts
10
Reputation
This appears to have already been beaten to death, but I'll throw in my two cents anyway.

The load itself does not operate any more or less efficiently whether you run it at 120V, 240V, or any other voltage.

As was mentioned, the I^2*R losses become greater at the lower voltage because the current must increase to deliver the same wattage.

Why don't we use higher voltage on everything? Primarily it is a safety issue. The NEC specifically prohibits the use of 277V lighting in residences for the (seemingly) obvious reason that 277V will bite considerably harder than 120V.

Aside from safety, you eventually reach a point of diminishing returns: although I^2*R losses decrease with lower current (higher voltage for a given constant kVA load), insulation costs increase with voltage, and (reaching way out into the extremes) you begin to see higher steady-state capacitive charging current, greater corona loss, etc.
