Why do USB cables lose the ability to charge quickly over time?

fuzzybabybunny:
I've noticed this over the years. I'll get a brand-new USB cable and measure it charging my Galaxy S5 at a rate of 1500 mA. Over a year or so, the same cable, on the same phone, with the same charger and the same operating system, will only be able to put out 500 mA or so.

Doesn't repeated bending of the thin-gauge power wires in a USB cable eventually increase the cable's resistance, preventing it from carrying the same charging current as before?
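As a rough sanity check on that idea (the resistance figures below are illustrative assumptions, not measurements), here is what the voltage drop across a cable would look like if its round-trip resistance really did creep up:

```python
# Back-of-the-envelope sketch: voltage left at the phone after the
# cable's IR drop. Resistance values are assumed for illustration.

def cable_drop(current_a: float, round_trip_resistance_ohm: float,
               supply_v: float = 5.0) -> float:
    """Return the voltage remaining at the phone end of the cable."""
    return supply_v - current_a * round_trip_resistance_ohm

# A healthy cable: assume ~0.2 ohm round trip (VBUS + GND conductors).
print(cable_drop(1.5, 0.2))   # ~4.7 V at the phone -- fine
# A worn cable: assume resistance has crept up to ~1.0 ohm.
print(cable_drop(1.5, 1.0))   # ~3.5 V -- low enough that a charging IC would back off
```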

I've often wondered if it would be worth it to take apart the housing of a USB cable and solder on thicker or higher-quality wires so that the charge rate holds up despite all the physical handling.

Phone chargers generally do not communicate with the phone to negotiate what current is needed; that would add too much cost to the charger's circuitry. Until a few years ago there was also no standardized way for a charger to tell the phone how much current it can supply, so different manufacturers developed their own simple signaling schemes using the USB data lines, which are otherwise unused during charging.

Apple, for example, used two resistor voltage dividers to pull the data lines to defined voltages, telling the phone "this adapter can provide 100 mA / 500 mA / 1000 mA". I don't know Samsung's method, but I expect it is similar.
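As a sketch of the divider idea (the resistor values below are illustrative, not taken from any manufacturer's documentation):

```python
# Illustrative resistor dividers from the 5 V supply onto a data line.
# The exact values and resulting voltages are assumptions, shown only
# to make the signaling scheme concrete.

SUPPLY_V = 5.0

def divider_voltage(r_top_ohm: float, r_bottom_ohm: float) -> float:
    """Voltage a resistor divider from 5 V presents on a data line."""
    return SUPPLY_V * r_bottom_ohm / (r_top_ohm + r_bottom_ohm)

# e.g. 75k over 49.9k gives roughly 2.0 V, 43k over 49.9k roughly 2.7 V
print(round(divider_voltage(75_000, 49_900), 2))   # ~2.0 V
print(round(divider_voltage(43_000, 49_900), 2))   # ~2.69 V
```

The charger only needs four cheap resistors for this, which is why it was such a popular approach before a standard existed.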
The reason for charging problems usually does not lie with the power lines but with the data lines. The signaling voltages are easily distorted by corrosion, loose contact springs, etc. Once the signaling is disturbed, the phone falls back to some incorrect, lower current.
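To make that failure mode concrete, here is a toy sketch of how a phone might map measured data-line voltages to an allowed current. The lookup table and tolerance are invented for illustration and are not real device firmware:

```python
# Toy model: the phone reads D+ and D- voltages and picks a charge current.
# If the reading doesn't match any known signature (e.g. a corroded contact
# drags a line down), it falls back to a conservative 500 mA.

ADVERTISED_MA = {
    (2.0, 2.0): 500,    # (D+ volts, D- volts) -> advertised current, illustrative
    (2.0, 2.7): 1000,
    (2.7, 2.0): 2000,
}
TOLERANCE_V = 0.15      # how far off a reading may be and still match

def allowed_current_ma(d_plus_v: float, d_minus_v: float) -> int:
    for (dp, dm), ma in ADVERTISED_MA.items():
        if abs(d_plus_v - dp) <= TOLERANCE_V and abs(d_minus_v - dm) <= TOLERANCE_V:
            return ma
    return 500  # unknown signature -> conservative USB default

print(allowed_current_ma(2.70, 2.00))  # clean contacts: 2000 mA
print(allowed_current_ma(2.35, 2.00))  # distorted D+ voltage: falls back to 500 mA
```

A drop from 1500 mA to roughly 500 mA, as you observed, is exactly what this kind of fallback looks like.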