Voltage drop is a real concern when powering a device over a length of wire. The voltage lost between the power supply and the load (device) can be significant and can actually prevent the device from working. Proper selection of wire gauge is needed to ensure that the device receives the voltage it requires to operate normally. We have provided a simple chart to calculate voltage drop per 100 feet of paired wire as a function of wire gauge and load current.
By matching load current (in AMPs) across the top of the chart with wire gauge (AWG) down the left side of the chart, one can determine voltage drop per 100 feet of paired wire run.
NOTE: A paired wire run represents the feed and return line to the load. Therefore, a 500 foot wire pair is equivalent to 1000 feet of total wire.
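The chart values can also be approximated from standard copper wire resistance tables. The sketch below is a rough stand-in for the chart, assuming solid copper at room temperature; the resistance figures are typical published AWG values, not taken from this document's chart, so small discrepancies are expected. Note how the paired run doubles the conductor length, per the NOTE above.

```python
# Approximate resistance of solid copper wire, in ohms per 1000 feet
# at 20 C (typical published AWG values; treat as approximations).
OHMS_PER_1000FT = {
    12: 1.588,
    14: 2.525,
    16: 4.016,
    18: 6.385,
    20: 10.15,
    22: 16.14,
}

def drop_per_100ft_paired(awg, amps):
    """Voltage drop per 100 feet of PAIRED wire.

    A paired run includes the feed and return lines, so 100 feet of
    paired wire is 200 feet of conductor.
    """
    ohms_per_ft = OHMS_PER_1000FT[awg] / 1000.0
    return amps * ohms_per_ft * 200  # Ohm's law: V = I * R

# 18 AWG at 1 AMP: about 1.28 V per 100 feet of paired wire,
# close to the chart's 1.27 V.
print(round(drop_per_100ft_paired(18, 1.0), 2))
```

This is only a sanity check on the chart; for real installations, use the chart values or the manufacturer's wire specifications.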
Given a load current of 1 AMP, and using 18 AWG wire, how much voltage drop can we expect at the load end for a 350 foot run of paired wire?
Using the chart, we match the row for 18 AWG with the column for 1 AMP and find a voltage drop of 1.27 volts per 100 feet. Dividing the paired wire length by 100 gives the factor by which to multiply the per-100-foot drop to determine total voltage drop: 350 feet divided by 100 equals 3.5. Multiplying 3.5 by 1.27 volts per 100 feet gives a total voltage drop of 4.445 volts for the 350 foot run.
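The scaling step above is a one-line calculation. A minimal sketch, where the per-100-foot figure is whatever value you read from the chart:

```python
def total_drop(drop_per_100ft, paired_run_ft):
    # Scale the chart's per-100-foot drop by the length of the paired run.
    return drop_per_100ft * (paired_run_ft / 100.0)

# Worked example from the text: 18 AWG at 1 AMP (1.27 V per 100 ft)
# over a 350 foot paired run.
print(round(total_drop(1.27, 350), 3))
```

Subtract the result from the supply voltage to see what actually reaches the device, and confirm it is within the device's operating range.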