
A basic question (with different answers)

    #1 ArchIsChompa
    Join Date: Aug 2015 | Location: Clute, Tx | Posts: 10

    Hello all,

    I have a question that I ask from time to time, and I always seem to get different answers.
    My question is this... (This is just a scenario/example)
    I have a load of 360W, and my input voltage can be either 120V or 240V.
    At 120V, with a 360W load, my current is 3A.
    At 240V, with a 360W load, my current is 1.5A.
    Which is more efficient?
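
    As a quick check of the arithmetic, here is a minimal Python sketch using I = P / V:

        # Current drawn by a fixed-wattage load at each supply voltage: I = P / V.
        P = 360.0  # load, watts

        for V in (120.0, 240.0):
            print(f"{V:.0f} V -> {P / V:.2f} A")

        # Output:
        # 120 V -> 3.00 A
        # 240 V -> 1.50 A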

    What I am usually told is 240V, because you don't lose as much to heat as you would at 120V.
    I was told this same answer by 10+ people, ranging from just out of college to 30+ years of experience.
    My other answer was about how 240V works in standard U.S. homes and the way the sine waves never go below zero. (I might have worded this wrong.)
    This was told to me by 2 people with 40+ years of experience.

    [Attachment: S-phase.JPG]

    My question to you all is: what is your answer? I am still confused after thinking about this for over a year. Maybe you all can clear this up once and for all!

    #2 SecondGen ("I push buttons." | NETA Level III Pro Subscriber)
    Join Date: Jan 2014 | Location: United States | Posts: 508
    Both answers are essentially saying the same thing. Delivering power at the higher voltage is more efficient because, by using both legs of the split-phase supply, it requires less current and smaller conductors than the lower voltage would. Furthermore, if you have a long run, the voltage drop will be less significant at the higher voltage.
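
    To put numbers on the voltage-drop point, here is a minimal Python sketch; the round-trip conductor resistance is an assumed value for illustration, not a figure from the thread:

        # Voltage drop in the same conductors when feeding the 360 W load
        # at each supply voltage. R_WIRE is an assumed round-trip resistance.
        P = 360.0      # load, watts
        R_WIRE = 0.2   # ohms, assumed for illustration

        for V in (120.0, 240.0):
            I = P / V          # load current
            drop = I * R_WIRE  # volts lost along the run
            print(f"{V:.0f} V: drop = {drop:.2f} V ({100 * drop / V:.3f}% of supply)")

        # 120 V: drop = 0.60 V (0.500% of supply)
        # 240 V: drop = 0.30 V (0.125% of supply)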

    #3 cjones09
    Join Date: Apr 2014 | Location: United States | Posts: 42
    Quote Originally Posted by ArchIsChompa View Post
    What I am usually told is 240V, because you don't lose as much to heat as you would at 120V.

    Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly, it shouldn't matter. I guess that's why it's more cost effective to use the higher voltage.

    #4 ArchIsChompa
    Join Date: Aug 2015 | Location: Clute, Tx | Posts: 10
    Quote Originally Posted by cjones09 View Post
    Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly, it shouldn't matter. I guess that's why it's more cost effective to use the higher voltage.

    This.

    Quote Originally Posted by SecondGen View Post
    Both answers are essentially saying the same thing. Delivering power at the higher voltage is more efficient because, by using both legs of the split-phase supply, it requires less current and smaller conductors than the lower voltage would. Furthermore, if you have a long run, the voltage drop will be less significant at the higher voltage.

    It makes sense, yes. But then why don't we just use 480V for everything, seeing as it's more efficient than lower voltages with higher currents? Is it because the cost of insulation would make it kind of pointless, or would it be worth it in the long run?
    I feel like 120V vs. 240V is a battle between the cost of copper (or whatever conductor you use) and the cost of insulation.

    #5 SecondGen ("I push buttons." | NETA Level III Pro Subscriber)
    Join Date: Jan 2014 | Location: United States | Posts: 508
    Quote Originally Posted by ArchIsChompa View Post
    It makes sense, yes. But then why don't we just use 480V for everything, seeing as it's more efficient than lower voltages with higher currents? Is it because the cost of insulation would make it kind of pointless, or would it be worth it in the long run?
    I feel like 120V vs. 240V is a battle between the cost of copper (or whatever conductor you use) and the cost of insulation.

    Actually, most modern industrial environments do use 480/277V for mechanical equipment and lighting. Many data centers utilize higher voltages to improve efficiency as well; Google has been utilizing higher voltages in their servers since the early 2000s. In residential applications I am sure 120V is used for safety. 480/277 is a nasty voltage.

    #6 madMAX (Seasoned Member)
    Join Date: Dec 2014 | Posts: 57
    Quote Originally Posted by ArchIsChompa View Post
    Is it because the cost of insulation would make it kind of pointless, or would it be worth it in the long run?

    Most low voltage wire has 600V insulation, so you can use the same wire for 480V that you use for 120V; there is no cost difference.

    #7 ArchIsChompa
    Join Date: Aug 2015 | Location: Clute, Tx | Posts: 10
    Quote Originally Posted by SecondGen View Post
    Actually, most modern industrial environments do use 480/277V for mechanical equipment and lighting. Many data centers utilize higher voltages to improve efficiency as well; Google has been utilizing higher voltages in their servers since the early 2000s. In residential applications I am sure 120V is used for safety. 480/277 is a nasty voltage.

    This is true. I never did take safety into account, but it makes a ton of sense. Thanks a lot; this actually feels like closure on this question. It's been eating away at me for quite some time.

    Quote Originally Posted by madMAX View Post
    Most low voltage wire has 600V insulation, so you can use the same wire for 480V that you use for 120V; there is no cost difference.

    I also didn't take this into account. It seems I left a lot of variables out, haha. But thank you! You two seem to have finally put this question to rest for me!

    #8 Josheau (Junior Member | Pro Subscriber)
    Join Date: Oct 2015 | Posts: 1
    Quote Originally Posted by cjones09 View Post
    Why would one voltage generate more heat than the other? 360W is 360W either way. If the conductor is sized properly, it shouldn't matter. I guess that's why it's more cost effective to use the higher voltage.

    As current passes through a conductor, power is lost because the current raises the temperature of the conductor. These heat losses throughout the circuit are calculated by the equation P = I^2 * R and are referred to as I^2R losses.
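
    To get a concrete feel for the I^2R relationship, here is a minimal Python sketch, assuming an illustrative 0.2-ohm conductor resistance (not a figure from the thread):

        # Conductor heat loss P_loss = I^2 * R for the same 360 W load.
        # Halving the current (by doubling the voltage) quarters the loss.
        P = 360.0  # load, watts
        R = 0.2    # conductor resistance, ohms (assumed)

        for V in (120.0, 240.0):
            I = P / V
            print(f"{V:.0f} V: I = {I:.2f} A, loss = {I**2 * R:.3f} W")

        # 120 V: I = 3.00 A, loss = 1.800 W
        # 240 V: I = 1.50 A, loss = 0.450 W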

    #9
    Join Date: Oct 2015 | Posts: 10
    This appears to have already been beaten to death, but I'll throw in my 2 pennies' worth anyway.

    The load does not operate any more or less efficiently whether you run it at 120V, 240V, or any other voltage.

    As was mentioned, the I^2*R losses become greater at lower voltage because the current increases to deliver the same wattage.

    Why don't we use higher voltage on everything? Primarily it is a safety issue. The NEC specifically prohibits the use of 277V lighting in residences for the (seemingly) obvious reason that 277V will bite considerably harder than 120V.

    Aside from safety, you eventually reach a point of diminishing returns: although I^2*R losses decrease with lower current (higher voltage for a given constant kVA load), insulation costs increase with higher voltage, and (reaching way out into the extremes) you will begin to see higher steady-state capacitive charging current, greater corona loss, etc.
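
    The resistive part of that diminishing-returns argument is easy to see numerically. Here is a minimal Python sketch (conductor resistance assumed for illustration; the capacitive charging and corona effects above are not modeled):

        # I^2 * R loss as a fraction of the 360 W load at successively higher
        # voltages: each doubling of voltage quarters the loss, so the absolute
        # savings from going still higher shrink quickly.
        P = 360.0
        R = 0.2  # assumed conductor resistance, ohms

        for V in (120.0, 240.0, 480.0):
            loss = (P / V) ** 2 * R
            print(f"{V:.0f} V: loss = {loss:.3f} W ({100 * loss / P:.3f}% of load)")

        # 120 V: loss = 1.800 W (0.500% of load)
        # 240 V: loss = 0.450 W (0.125% of load)
        # 480 V: loss = 0.113 W (0.031% of load)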
