An Often-Overlooked Constraint to Powering Instruments: Voltage Drop

LI-COR occasionally receives inquiries from users who are concerned that their instrument failed the first time it was fielded. When they tested the instrument in the lab, everything worked fine. After they connected it to the power supply in the field, however, the instrument would not start up. After checking that the connections were made with the correct polarity, that the fuses are intact, and that the power supply is outputting something nominally close to 12 VDC, the seemingly inevitable conclusion is that the instrument must have broken. This, however, is almost never the case.

After some discussion we generally find that the reported voltage was measured at the power supply terminals rather than at the instrument. Why does this matter? Shouldn’t the voltage be the same at the power supply and at the instrument?

Ideally it wouldn’t matter where the voltage was measured, because the only voltage drops in the system would be those across the instruments being powered. In practice, however, the passive components in the system (wiring, screw terminals, etc.) have some nonzero resistance, and thus some voltage drop across them. In a well-designed and well-maintained power distribution system these additional drops are negligibly small and not a concern. But when wire that is too thin is run over too long a distance, or when connections are corroded or poorly made, the drops can be large enough to pull the voltage reaching the instrument below the instrument’s minimum input.

The voltage drop (V_D) across any element in a circuit is the product of that element’s resistance (R) and the current (I) flowing through it, following Ohm’s Law:

$$V_D = I \, R \tag{1}$$
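As a purely illustrative example (the 0.1 Ω figure is arbitrary, not taken from any particular installation), a 4.8 A current flowing through wiring with a total resistance of 0.1 Ω produces a drop of

$$V_D = 4.8\ \mathrm{A} \times 0.1\ \Omega = 0.48\ \mathrm{V}$$

across that wiring before the power ever reaches the instrument.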

For a wire conductor the resistance is essentially a function of the material the wire is made of, its cross-sectional area (A), and its length (L):

$$R = \frac{\rho L}{A} \tag{2}$$

where ρ is the electrical resistivity of the conductor, a material-specific property. Note that it is the total length of wire between the supply and the instrument that matters when considering resistance, not just the length of a single lead: both the supply and return wires carry the load current and contribute to the total resistance.
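The following Python sketch combines Equations 1 and 2 for copper conductors. The resistivity value, the standard AWG diameter formula, and the helper names (awg_diameter_m, wire_resistance, voltage_drop) are assumptions made for illustration only; they are not part of any LI-COR software.

```python
# A minimal sketch combining Equations 1 and 2 for copper conductors.
# The resistivity value and the standard AWG diameter formula are
# approximations; actual cable resistance varies with temperature,
# stranding, and manufacturer.
import math

COPPER_RESISTIVITY = 1.68e-8  # ohm·m, annealed copper near 20 °C (approximate)


def awg_diameter_m(awg: int) -> float:
    """Diameter of a solid conductor of the given AWG size, in metres."""
    diameter_inches = 0.005 * 92 ** ((36 - awg) / 39)
    return diameter_inches * 0.0254


def wire_resistance(awg: int, cable_length_m: float) -> float:
    """Round-trip resistance (ohms) of a supply/return pair (Equation 2).

    cable_length_m is the one-way cable run; the conductor length is
    doubled because both the supply and return wires carry the current.
    """
    area_m2 = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return COPPER_RESISTIVITY * (2 * cable_length_m) / area_m2


def voltage_drop(awg: int, cable_length_m: float, current_a: float) -> float:
    """Voltage lost in the wiring (Equation 1: V_D = I * R)."""
    return current_a * wire_resistance(awg, cable_length_m)
```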

Let’s consider a practical example. Below is a figure showing the potential voltage drop for a variety of wire gauges typically used in DC circuits. The figure was generated assuming a 4.8 amp load and a range of wire lengths; this load is a fairly typical peak current draw for an LI-8150/LI-8100A Automated Soil CO2 Flux System. The system’s input voltage is specified as 10.5 to 30 VDC. The system will shut down when the supply voltage drops below 10.5 VDC, and will not restart until the supply exceeds 12.1 VDC.
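As a rough sketch of how curves like those in the figure can be generated, the loop below tabulates voltage drop using the voltage_drop helper above. The 4.8 A load comes from the example; the specific gauges and lengths listed are illustrative choices, not necessarily those used in the figure.

```python
# Tabulate voltage drop (in volts) versus one-way cable length at 4.8 A.
LOAD_A = 4.8                     # amps, peak draw from the example above
GAUGES = (14, 16, 18, 20, 22)    # illustrative wire gauges
LENGTHS_M = (1, 2, 5, 10, 20)    # one-way cable lengths, metres

print("length_m" + "".join(f"{g:>8} AWG" for g in GAUGES))
for length in LENGTHS_M:
    drops = "".join(f"{voltage_drop(g, length, LOAD_A):12.2f}" for g in GAUGES)
    print(f"{length:8}{drops}")
```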

In this example, if the instrument were connected to a battery providing 12.5 VDC through 5 m of 22 AWG wire, it would not be able to power on: the voltage drop would be about 2.47 V (19.8%), leaving only about 10 V at the instrument. Using 20 AWG wire would allow it to power on while the battery was fully charged, but the battery could discharge only slightly before the instrument shut down, and there would be no possibility of an automatic restart.
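This example can be checked with the helpers sketched above, treating the 5 m figure as the one-way cable run (so roughly 10 m of conductor in total); the exact numbers depend on the resistivity assumed.

```python
supply_v = 12.5  # battery voltage from the example

drop_22 = voltage_drop(22, 5.0, 4.8)   # roughly 2.5 V
drop_20 = voltage_drop(20, 5.0, 4.8)   # roughly 1.6 V

# 22 AWG: about 10.0 V reaches the instrument, below the 10.5 V minimum.
# 20 AWG: about 10.9 V reaches the instrument, above 10.5 V but well below
#         the 12.1 V needed for an automatic restart.
print(f"22 AWG: {drop_22:.2f} V drop, {supply_v - drop_22:.2f} V at instrument")
print(f"20 AWG: {drop_20:.2f} V drop, {supply_v - drop_20:.2f} V at instrument")
```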

In practice, this system ships from us with a 3 m power cable containing four 20 AWG conductors, two each for supply and return. Since cross-sectional area doubles for every decrease of three in AWG number, the paired conductors are equivalent to a single run of 17 AWG wire. This puts the potential voltage drop at around 4%.
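The shipped cable can be checked the same way, again with the approximate helpers above; the percentage here is taken relative to a nominal 12 V supply.

```python
# Two 20 AWG conductors in parallel have half the resistance of one,
# which is roughly equivalent to a single 17 AWG conductor.
r_pair = wire_resistance(20, 3.0) / 2   # ohms, 3 m cable with doubled-up conductors
drop_v = 4.8 * r_pair                   # Equation 1
print(f"{drop_v:.2f} V drop ({100 * drop_v / 12.0:.1f}% of a nominal 12 V supply)")
```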

Please contact LI-COR Technical Support if you need assistance assessing voltage drop using LI-COR instruments.