Regarding voltage drop on branch circuits: according to code, a 5% voltage drop is allowable on branch circuits, but it is not clear whether that is under load or not. Is there anywhere I could look for clarification on that?
What the National Electrical Code (NEC) says about voltage drop, in its informational notes (which are recommendations, not requirements), is that the maximum recommended voltage drop across a branch circuit is 3% and the maximum recommended voltage drop across a feeder is 3%, but the combined drop across the feeder plus the branch circuit should be no more than 5%. That means, for example, you can have a 3% voltage drop on the feeder and 2% on the branch circuit, or 2% on the feeder and 3% on the branch circuit, or any combination like that, as long as the total doesn’t exceed 5%.
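To make the arithmetic concrete, here is a minimal sketch of that check in Python. The function name and structure are purely illustrative, not from any code or standard; it just encodes the 3%/3%/5% recommendations described above.

```python
# Illustrative helper encoding the NEC-recommended voltage drop limits:
# no more than 3% on the feeder, 3% on the branch circuit, and 5% combined.
# The function name is hypothetical, chosen for this example only.

def within_recommended_drop(feeder_pct, branch_pct):
    """Return True if the drops meet the recommended limits."""
    return (feeder_pct <= 3.0
            and branch_pct <= 3.0
            and feeder_pct + branch_pct <= 5.0)

print(within_recommended_drop(3.0, 2.0))  # True: 3% + 2% = 5%
print(within_recommended_drop(2.0, 3.0))  # True: 2% + 3% = 5%
print(within_recommended_drop(3.0, 3.0))  # False: 6% combined exceeds 5%
```

Any split that keeps each leg at or under 3% and the sum at or under 5% passes.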
As far as whether that’s under load: yes, it has to be under load for there to be any voltage drop. Remember, Ohm’s law says V = I x R (the voltage is the current times the resistance of the wire). That’s true whether the voltage V is a voltage drop or a voltage rise (from a power supply). So if the circuit is not under load, then the current is 0 amps and the voltage drop is 0 x R, which is 0 volts. I conducted a workshop at RZI Lighting in New Orleans yesterday, and we proved that this is true. We connected three 100-foot extension cords together and measured the voltage at the supply and at the end of the extension cords before we plugged in a 1000-watt PAR can. The voltage at the supply was 122.3 volts, and it was exactly the same at the end of the 300 feet of extension cord before we plugged in the PAR can. But when we plugged in the PAR can, we measured the voltage at the PAR can (using a two-fer) at 112.1 volts. So we measured a 10.2-volt drop across the cable.
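The workshop measurements above can be reproduced in a few lines of Python. This is only a sketch of the arithmetic; the cable resistance value used in the no-load line is the one derived later in the text.

```python
# No load: Ohm's law V = I * R with I = 0 gives zero drop, whatever R is.
r_cable = 1.2      # ohms, total resistance of the cord run (derived below)
i_no_load = 0      # amps: nothing plugged in, no current flows
print(i_no_load * r_cable)   # 0 volts of drop with no load

# Under load: the drop is the supply voltage minus the voltage at the load.
v_supply = 122.3   # volts, measured at the source
v_load = 112.1     # volts, measured at the PAR can through a two-fer
drop = v_supply - v_load
print(round(drop, 1))        # 10.2 volts across 300 ft of cord
```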
We could then have calculated the resistance of the wire in the extension cord using Ohm’s law. The lamp in the PAR can was rated 1000 watts at 120V, so at 122.3 volts it should draw about 8.5 amps. According to Ohm’s law, R = V/I. So R is 10.2V/8.5A, which is 1.2 ohms. The extension cord was marked 12/3, meaning it had three conductors (hot, neutral, and ground) that were #12 AWG wire. Since the current flows out on the hot and back on the neutral, the circuit path is 600 feet (300 feet of black wire plus 300 feet of white wire), so we can conclude that #12 wire has a resistance of 1.2 ohms per 600 feet, which is about 2 ohms per 1000 feet. According to Table 8, Conductor Properties, in the back of the 2011 NEC (I can’t find my 2014 edition at the moment), the resistance of #12 stranded wire is 1.98 ohms per 1000 feet at 75 degrees C, which is 167 degrees F. It was pretty hot in New Orleans yesterday and the warehouse was not air conditioned, so 167 degrees F sounds about right.
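The back-calculation above can be sketched as follows. It uses the approximate 8.5-amp figure from the text; the Table 8 value is quoted only for comparison.

```python
# Back-calculate the cord resistance from the measured drop via Ohm's law,
# then scale to ohms per 1000 ft for comparison with NEC Table 8.
drop = 10.2            # volts, measured across the cord under load
current = 8.5          # amps, approximate draw of the 1000 W / 120 V lamp
r_total = drop / current           # resistance of the 600 ft current path
r_per_1000ft = r_total / 600 * 1000

print(round(r_total, 1))       # 1.2 ohms for 600 ft of #12 wire
print(round(r_per_1000ft, 1))  # 2.0 ohms per 1000 ft
# NEC Table 8 lists #12 stranded wire at 1.98 ohms per 1000 ft at 75 C.
```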