How USB Cables Affect Charging – Simple Test
I came across an interesting article today from Dr Gough, a tech nerd, and thought it was good enough to summarize here:
The USB specifications allow a port to supply anywhere from 100mA up to 1.5A, and up to 100W for USB Type-C, but the cable and connectors you actually plug in might not align with the power specifications of the product being designed and used. Most charging cables are typically rated for about 1.8A of current.
That 1.8A rating is based on safety limits for resistive heating of the cable and connectors. It is no guarantee that your +5V, 1.5A setup will actually deliver the maximum level of power to your device. The important point here is that the cable-and-connector rating simply deals with heat: it ensures nothing melts. Going a step further, most specs ensure nothing even gets noticeably warm to the human touch.
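To see why that rating is really a heat budget, here's a quick back-of-the-envelope sketch in Python. The resistance figure is an assumed, illustrative value, not something from the USB spec:

```python
# Rough sketch of why a current rating is really a heat budget.
# The 0.3-ohm resistance below is an assumed, illustrative value,
# not from any USB spec.
I = 1.8            # rated current, in amps
R = 0.3            # assumed cable + connector round-trip resistance, in ohms

P_heat = I**2 * R  # power dissipated as heat (P = I^2 * R), in watts
print(f"Heat dissipated in the cable: {P_heat:.2f} W")  # ~0.97 W
```

About a watt of heat spread along a cable is tolerable; push more current through the same resistance and the heating grows with the square of the current, which is why the rating exists.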
Every wire that’s not a superconductor has some finite resistance, and pushing current through that resistance dissipates power as heat. Ohm’s law says that E = IR, where E is voltage, I is current, and R is resistance. So when you put current through a wire, the current times the resistance gives you the voltage that will be “consumed” across that wire, power that turns into heat and thus never makes it to your phone.
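As a concrete (and entirely hypothetical) example of that voltage consumption, assume a cheap cable with 0.5 ohms of round-trip resistance carrying 1A from a 5V charger:

```python
# Ohm's-law sketch of the voltage "consumed" by a cable.
# The 0.5-ohm resistance is an assumed value for a cheap cable.
V_supply = 5.0   # charger output, in volts
I = 1.0          # charging current, in amps
R = 0.5          # assumed round-trip cable resistance, in ohms

V_drop = I * R                  # E = IR: voltage lost across the cable
V_at_phone = V_supply - V_drop  # what's left for the phone
P_wasted = V_drop * I           # that loss becomes heat (P = EI)

print(f"Voltage reaching the phone: {V_at_phone:.1f} V")  # 4.5 V
print(f"Power wasted as cable heat: {P_wasted:.2f} W")    # 0.50 W
```

Half a volt gone and half a watt warming your cable instead of charging your phone, from nothing but a cheap wire.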
I want to end this post on that Ohm’s-law point, because it’s the real takeaway here: the more inefficient (or cheap) a cable is, the warmer it will get. So if your iPhone cable is warm to the touch, it sucks. If the wire charging your power bank is warm, it sucks. Get a better cable. From what I can tell, none of those cheap cables you see on Amazon post a current rating, so word to the wise: trust your sense of touch!