Large currents flowing in a length of cable can be measured using the voltage drop along the cable. This eliminates the need for a bulky shunt or an expensive magnetic measurement method. Accuracy is limited, however, by the +0.39%/°C tempco (temperature coefficient) of copper.

Temperature sensors can facilitate compensation, but they are point-measurement devices whose relevance over the length of a cable is questionable. Consider that a mere 2.5°C error, or difference from the actual cable temperature, introduces 1% of measurement error.
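The arithmetic behind that 1% figure is simple: the error is just the copper tempco times the temperature mismatch. A minimal sketch (function name is mine, not from the article):

```python
# Copper resistance tempco, per the article: +0.39 %/degC.
TEMPCO = 0.0039  # fractional resistance change per degC

def tempco_error_pct(temp_mismatch_c: float) -> float:
    """Measurement error (%) from a sensor-to-cable temperature mismatch."""
    return TEMPCO * temp_mismatch_c * 100

print(tempco_error_pct(2.5))  # 0.975 -- roughly 1% for a 2.5 degC mismatch
```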

If at least 10 mV is dropped at maximum current, you can readily measure it with modern zero-drift amplifiers (auto-zero, chopper, etc.). Their ultra-low offset enables accurate sensing of such low full-scale voltage drops.

What remains is what to do about the tempco. The solution proposed in this Design Idea takes advantage of the fact that high current cables are made up of many fine strands. The example here will be based on AWG 4 cable with 1,050 strands of AWG 34 wire.
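A quick sanity check on that construction, using standard AWG wire-table resistances (these per-meter values are from published wire tables, not from the article): 1,050 strands of AWG 34 in parallel should come out close to the resistance of solid AWG 4.

```python
# Standard wire-table values at 20 degC (assumed, not from the article):
R_AWG34_PER_M = 0.8565     # ohm/m, single AWG 34 strand
R_AWG4_PER_M = 0.0008152   # ohm/m, solid AWG 4

bundle = R_AWG34_PER_M / 1050   # 1,050 strands in parallel
print(round(bundle, 7))          # ~0.0008157 ohm/m, matching AWG 4 closely
```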

In Figure 1, the operational amplifier's non-inverting input senses the cable drop at the load end of the cable. The MOSFET is in the output/feedback path, which continues through the temperature-sensing strand (what would normally be a gain-setting resistor) and ends at the power supply. The circuit forces a drop across this gain-setting element that exactly equals the main cable drop. Here, of course, the gain-setting element is a single insulated strand (lacquered, like magnet wire) of 34-gauge wire embedded within a custom insulated cable assembly alongside the high-current cable.
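The reason this cancels the tempco can be sketched in a simplified model (variable names and the example resistance are mine): because the sense strand and the main cable are the same copper at the same temperature, forcing equal drops across them makes the feedback current a fixed ratio of the load current, I_load/N for N strands, regardless of temperature.

```python
TEMPCO = 0.0039  # copper tempco, per the article
N = 1050         # strands in the AWG 4 bundle

def sense_current(i_load, temp_c, r_cable_20=0.001):
    """Feedback current when the op amp forces V_strand = V_cable.

    r_cable_20 is an assumed bundle resistance at 20 degC; its value
    drops out of the result, as does temperature.
    """
    k = 1 + TEMPCO * (temp_c - 20)   # same factor for both conductors
    r_cable = r_cable_20 * k         # whole bundle (N strands in parallel)
    r_strand = N * r_cable_20 * k    # one strand is N times the bundle
    # Equal drops: i_sense * r_strand = i_load * r_cable
    return i_load * r_cable / r_strand  # = i_load / N, tempco cancels

print(sense_current(100.0, 20))  # 100/1050 A
print(sense_current(100.0, 80))  # identical: temperature has no effect
```

The key design choice is ratiometric: both conductors scale by the same factor k with temperature, so only the strand count N sets the current-sense gain.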
