How to measure the quality of a USB charging cable?
If the only difference in charge time is the cable, there are two possibilities.
1) A specific USB charging cable might have resistors inside one of the connectors that set the D+ & D- lines to voltage levels telling your device how much current it should draw while charging. Although this is not common, I have seen cables like this.
2) USB cables from different manufacturers can use thinner or thicker wire for the Gnd & Vdd lines. Thinner wire causes a larger voltage drop, which tells your device to reduce the current it draws; thicker wire gives a lower voltage drop, so your device draws more current.
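To get a feel for how much wire gauge matters, here is a small sketch that estimates the round-trip resistance of the power pair from copper resistivity and wire diameter. The gauges, diameters, and the 1 m cable length are illustrative assumptions, not measurements of any particular cable.

```python
import math

COPPER_RESISTIVITY = 1.68e-8  # ohm-metres at roughly 20 degrees C

# Approximate bare-copper diameters (metres) for some common gauges
AWG_DIAMETER_M = {24: 0.511e-3, 26: 0.405e-3, 28: 0.321e-3}

def round_trip_resistance(awg, length_m):
    """Resistance of the Vdd + Gnd pair in series: charge current flows
    out on one conductor and back on the other, so use twice the length."""
    area = math.pi * (AWG_DIAMETER_M[awg] / 2) ** 2
    return COPPER_RESISTIVITY * (2 * length_m) / area

for awg in (24, 26, 28):
    r = round_trip_resistance(awg, 1.0)
    drop = r * 1.0  # voltage drop at an assumed 1 A charge current
    print(f"{awg} AWG, 1 m: {r:.3f} ohm round trip, {drop:.3f} V drop at 1 A")
```

A 1 m cable with 28 AWG power conductors loses several tenths of a volt at 1 A, which is plenty to make a charger-sensing device back off its current.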
In terms of measuring your particular cables, you need a meter that can resolve resistance in fractions of an ohm. Although I have such a meter, I would instead use an accurate current-limited power supply set for 500 mA, with a voltmeter connected right across its output terminals.
Short the Vdd & Gnd pins at one end of the cable and connect the Vdd & Gnd pins at the other end to your current-limited supply. If the current is set accurately to 500 mA, the cable's round-trip resistance is simply the measured voltage divided by 0.5 A. Do this for all of your USB cables.
I'm suggesting 500 mA because pretty much all USB devices can handle at least that much current. Some power supplies and devices can, of course, handle much more, but 500 mA already produces a voltage drop large enough to measure easily.
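The arithmetic for the test above is just Ohm's law; a minimal sketch, where the cable names and voltmeter readings are made-up examples:

```python
# With one end of the cable shorted and 500 mA forced through it,
# the voltage across the supply terminals divided by the current is
# the cable's round-trip resistance (R = V / I).
TEST_CURRENT_A = 0.5

# Hypothetical voltmeter readings for three cables under test
readings_v = {"flat cable": 0.09, "stock cable": 0.18, "thin cable": 0.31}

for name, v in sorted(readings_v.items(), key=lambda kv: kv[1]):
    r = v / TEST_CURRENT_A
    print(f"{name}: {v:.2f} V at 500 mA -> {r:.2f} ohm round trip")
```

The cable with the lowest reading is the one that will hold VBUS up best under a heavy charge current.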
I have some very nice flat USB charging cables that have very heavy wires for the two outside conductors (Gnd & Vdd) and a very thin twisted-pair in the middle for the D+ & D- conductors. They work very well at charging my phone - better than some of the other cables that I have.