Not only ordinary wires but also USB cables can cause voltages to droop. This is evident if you have ever used an extra-long USB cable and found that the device you were charging, such as a phone or tablet, took far longer than usual to charge. The reason is the excessive voltage drop across the long USB (Universal Serial Bus) cable.
Most chargers come with overcurrent protection: when the charging current exceeds a set limit, the charger reduces its output voltage to keep itself from overheating. Moreover, when the charging current is high, the cable resistance causes the voltage at the device end to droop, which lengthens the charging time considerably. Since resistance grows with cable length, the voltage drop is greater for a longer cable. Cable voltage droop therefore has a negative impact on the operation of the whole system.
The voltage the device actually receives while charging, and the time it takes to charge, are critical design parameters. When the system load sits at a distance from the output of the power supply and there is no remote sensing, the voltage seen by the load can be significantly lower than intended. Thin circuit-board traces, connector interfaces and cable resistance all contribute to the droop. The situation worsens at higher load currents, lowering the operating voltage at the load and possibly causing erratic circuit operation.
A typical USB cable uses four 24 AWG wires, each about a meter long, and its connectors have a contact resistance of about 30 milliohms per contact. Because the power path passes through four contacts (two at each end of the cable), the total contact resistance is 120 milliohms. The two one-meter 24 AWG power wires (VBUS and ground) add a further 166 milliohms, making the overall resistance of the USB cable about 286 milliohms.
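As a quick check, this resistance budget can be reproduced in a few lines of Python; the roughly 83 mΩ/m figure for 24 AWG copper and the 30 mΩ per contact are the approximate values quoted above:

```python
# Rough resistance budget for a 1 m USB power path.
WIRE_RESISTANCE_PER_M = 0.083   # ohms per metre, 24 AWG copper (approx.)
CABLE_LENGTH_M = 1.0
NUM_POWER_WIRES = 2             # VBUS and ground carry the charging current
CONTACT_RESISTANCE = 0.030      # ohms per contact (approx.)
NUM_CONTACTS = 4                # two contacts at each end of the cable

wire_r = NUM_POWER_WIRES * WIRE_RESISTANCE_PER_M * CABLE_LENGTH_M
contact_r = NUM_CONTACTS * CONTACT_RESISTANCE
total_r = wire_r + contact_r

print(f"wire: {wire_r*1e3:.0f} mOhm, contacts: {contact_r*1e3:.0f} mOhm, "
      f"total: {total_r*1e3:.0f} mOhm")   # ~166 + 120 = ~286 mOhm
```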
Typical converters are designed to supply a maximum output current of 2.1 A. At that current, the voltage drop across the cable is about 0.6 V, so the voltage at the far end of the cable falls to roughly 4.4 V for a converter set to 5 V. This is well below the minimum operating voltage of most 5 V loads and can lead to problems with high-current loads.
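The droop itself follows directly from Ohm's law; a minimal continuation of the sketch above, using the 286 mΩ total from the previous calculation:

```python
# Voltage seen at the device end of the cable at maximum charging current.
CABLE_RESISTANCE = 0.286   # ohms, total wire + contact resistance from above
SET_VOLTAGE = 5.0          # volts, converter output setting
MAX_CURRENT = 2.1          # amps, typical charger current limit

drop = MAX_CURRENT * CABLE_RESISTANCE   # ~0.6 V
v_load = SET_VOLTAGE - drop             # ~4.4 V

print(f"drop: {drop:.2f} V, voltage at load: {v_load:.2f} V")
```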
Designers overcome this voltage drop by raising the output voltage at the source. Instead of a fixed 5 V, the converter generates 5.6 V, which after the 0.6 V drop delivers the required 5 V to the load. They do this by monitoring the load current with a sense resistor placed in the path of the output current. A differential operational amplifier amplifies the voltage across the sense resistor, and this amplified voltage raises the converter's output voltage as the load increases.
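One way to picture the scheme: if the amplifier gain is chosen so that the added offset, gain × I_load × R_sense, equals the cable drop I_load × R_cable, the load-end voltage stays at the nominal 5 V at any current. The sketch below is illustrative only; the 10 mΩ sense resistor and the resulting gain of 28.6 are assumed values, not taken from the original text.

```python
# Illustrative droop compensation: the converter raises its output in
# proportion to the sensed load current so the cable drop is cancelled.
R_CABLE = 0.286    # ohms, cable + contact resistance (from above)
R_SENSE = 0.010    # ohms, assumed current-sense resistor
V_NOMINAL = 5.0    # volts, desired voltage at the load

# Choose the amplifier gain so that GAIN * R_SENSE == R_CABLE,
# i.e. the added offset exactly matches the cable drop.
GAIN = R_CABLE / R_SENSE   # 28.6 with these assumed values

def converter_output(i_load):
    """Converter output voltage with current-proportional compensation."""
    return V_NOMINAL + GAIN * (i_load * R_SENSE)

def load_voltage(i_load):
    """Voltage at the far end of the cable."""
    return converter_output(i_load) - i_load * R_CABLE

for i in (0.0, 0.5, 1.0, 2.1):
    print(f"I = {i:.1f} A: converter {converter_output(i):.2f} V, "
          f"load {load_voltage(i):.2f} V")
```

With these assumed values, at 2.1 A the converter output rises to the 5.6 V mentioned above while the load still sees 5 V.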
As the load current increases, so does the output voltage of the converter. At the end of the cable, however, the compensated voltage remains nearly constant, so the load sees a well-regulated supply.
Compensating the output prevents the voltage at the load from drooping. This avoids potential system issues such as power cycling, latch-up conditions or degraded system performance.