How do I design a 2A or more power supply for my consumer USB devices?

The reason your Apple/Samsung devices do not draw more current is simple: there is additional signalling between the Apple/Samsung device and its dedicated power supply. This lets the two recognise each other and agree on a higher charging current. The charger sets certain voltages on the USB data lines, and the phone or tablet recognises these.

Your 20 A supply does not put these voltages on the data lines, so it does not "talk" to your Apple/Samsung devices. They therefore assume it is a normal "dumb" charger and do not draw more current than the USB standard allows, which is usually only 100 or 500 mA.

To charge an Apple device at the higher rate, put these voltages on the data lines:

For a desired current of 2,000 mA: D- at 2.0 V, D+ at 2.75 V.

A simple pair of resistor dividers on the data lines will set these voltages for Apple devices at 2 A.
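For a rough sense of the resistor values this takes, here is a small Python sketch; the 5 V rail and the 22 kΩ bottom resistor are my own illustrative choices, not figures from Apple:

```python
# Size the two bias dividers described above, assuming a 5 V rail.
# The 22 kOhm bottom resistor is an illustrative choice, not an Apple figure.

V_BUS = 5.0
R_BOTTOM = 22_000   # 22 kOhm

def top_resistor(v_target, r_bottom):
    """Top resistor needed so a divider from V_BUS sits at v_target."""
    return r_bottom * (V_BUS / v_target - 1)

def divider(r_top, r_bottom):
    """Voltage at the divider junction, driven from V_BUS."""
    return V_BUS * r_bottom / (r_top + r_bottom)

for name, target in (("D-", 2.0), ("D+", 2.75)):
    r_top = top_resistor(target, R_BOTTOM)
    print(f"{name}: target {target} V -> top resistor {r_top / 1e3:.0f} kOhm, "
          f"actual {divider(r_top, R_BOTTOM):.2f} V")
```

Conveniently, with a 22 k bottom resistor the ideal top values come out at 33 k and 18 k, both standard E24 parts.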

Also, the charge current is limited depending on how full the battery is. It will only be at its maximum when the battery is roughly 30 - 70 % charged (these numbers are just my guess). Charging at a high current is bad for the battery when it is nearly empty or almost full.

Sources:

Adafruit: The mysteries of Apple device charging

Voltaic: Choosing USB Pin Voltages


Here's a more complete article that lists the known proprietary resistor-divider D+/D- identification schemes for high-powered chargers. The gist is:

2.0V/2.0V – low power (500mA)

2.0V/2.7V – Apple iPhone (1000mA/5-watt)

2.7V/2.0V – Apple iPad (2100mA/10-watt)

2.7V/2.7V – 12-watt (2400mA, possibly used by Blackberry)

D+/D- shorted together – USB-IF BC 1.2 standard

1.2V/1.2V – Samsung devices

The Samsung values coincide with what's indicated on this schematic [original source], which uses a 10k/33k resistor divider in 2A Galaxy tablet chargers.
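To tie the list together, here is a small Python sketch that maps a measured D+/D- pair to one of the signatures above and sanity-checks that Samsung divider. The ±0.3 V matching window is my own assumption, and I am assuming the 33 k resistor forms the top half of the divider:

```python
# Classify a charger by its D+/D- bias voltages, using the signature list
# above. The +/-0.3 V matching window is an assumption for illustration;
# real detection ICs have their own comparator thresholds.

SIGNATURES = {          # keys are (D+, D-) in volts
    (2.0, 2.0): "low power (500 mA)",
    (2.0, 2.7): "Apple iPhone (1 A / 5 W)",
    (2.7, 2.0): "Apple iPad (2.1 A / 10 W)",
    (2.7, 2.7): "12 W (2.4 A)",
    (1.2, 1.2): "Samsung proprietary (2 A)",
}

def identify(d_plus, d_minus, window=0.3):
    for (dp, dm), label in SIGNATURES.items():
        if abs(d_plus - dp) <= window and abs(d_minus - dm) <= window:
            return label
    return "unknown (try BC 1.2 short detection)"

# Samsung tablet divider mentioned above: 33 k on top, 10 k on the bottom.
v_samsung = 5.0 * 10_000 / (10_000 + 33_000)
print(f"10k/33k divider from 5 V: {v_samsung:.2f} V")   # about 1.16 V
print(identify(v_samsung, v_samsung))                   # Samsung proprietary (2 A)
```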

Like I said in my other comments, there are also off-the-shelf chips that implement some of these, e.g. MAX14667, TPS2513, Microchip's USB2534 or CYUSB3324. The datasheet of the latter also provides some more details on Samsung devices:

Samsung devices follow multiple charging methods. Some Samsung devices (Samsung Galaxy Tablets) use a proprietary charging method in which the D+ and D- pins are biased to the same potential (~1.2 V). The Samsung Galaxy S series (S3, S4) devices follow the USB-IF BC v1.2 charging standard for DCP, CDP, and SDP mode of operations.

The USB-IF BC 1.2 has these requirements for DCP mode:

  • D+ and D– data lines are shorted together with a maximum series impedance of 200 Ω.

  • Must not cut off the power supply until the voltage drops below 2 V or the current exceeds 1.5 A.

  • The absolute maximum current draw allowed (but not required) is the limit of the USB 2.0 connector, up to 5 A.
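For the device side of this, the sketch below is a much-simplified illustration of how a phone could tell such a port apart from a normal data port: drive a small voltage onto one data line and see whether the short couples it onto the other. The threshold value and the measurement callbacks are placeholders; the real detection happens inside the phone's charger IC.

```python
# Simplified sketch of BC 1.2 charger detection from the portable device's
# point of view. The voltages are approximate spec thresholds and the two
# "measure" callbacks are placeholders for real analog front-end hardware.

V_DAT_REF = 0.4   # reference level for "the other line followed me"

def classify_port(measure_dm_while_driving_dp, measure_dp_while_driving_dm):
    # Primary detection: drive ~0.6 V onto D+ and watch D-.
    # A charging port (DCP or CDP) makes D- rise above V_DAT_REF;
    # a standard downstream port leaves it near ground.
    if measure_dm_while_driving_dp() <= V_DAT_REF:
        return "SDP: enumerate first, draw at most 100/500 mA"
    # Secondary detection: drive ~0.6 V onto D- and watch D+.
    # Only a DCP (D+/D- shorted) couples the voltage back again.
    if measure_dp_while_driving_dm() > V_DAT_REF:
        return "DCP: dedicated charger, at least 1.5 A available"
    return "CDP: charging downstream port, data plus high current"

# A dedicated charger with D+ and D- shorted echoes the drive voltage both ways:
print(classify_port(lambda: 0.6, lambda: 0.6))   # DCP
print(classify_port(lambda: 0.0, lambda: 0.0))   # SDP
```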

Also, the Cypress note says that the 2.7V/2.7V -> 2.4A signature is [also] used by Apple. A discussion on the TI forums in which TI employees chimed in (regarding their TPS2513A) indicates the same. A TI employee said:

Beginning with the iPad 3, a 42 Wh battery is used, and long charging time starts to become an issue. Comparing charging time from 0 % to 100 %, a 2.1 A charger takes 6 hrs and a 2.4 A charger takes 5 hrs 40 mins. So 20 mins quicker, but not much.

We believe the key reason Apple released the 2.4 A charger is a better user experience when charging and playing at the same time. When playing high-quality video games, like Infinity II, while charging, my iPad 3's battery percentage increases very slowly, e.g. 30 mins later it has only increased 2 percent, which drives me crazy.

With the 2.4 A charger the battery level increases faster; at least it feels normal, and I am okay with it.


The power supply is more than capable of outputting the maximum 2.1 amps, so why doesn't it do so, at least to the tablets?

The USB standard does not allow more than 500mA to be drawn from a standard USB 1 port. Until the device establishes communication with the USB host device it has no way to know how much current is available.

The USB standard actually requires devices to draw no more than 100mA before communicating with the host and requesting more power. This is important because a standard unpowered USB hub has only 500mA to work with: 100mA for itself, and 100mA for each of its ports. This means that an unpowered hub cannot, and should not, attempt to supply 500mA to a USB device.
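The arithmetic behind that budget is trivial but worth seeing; the 4-port hub below is just an example:

```python
# Power budget of a bus-powered (unpowered) USB 2.0 hub, as described above.
# A 4-port hub is assumed purely for the example.

UPSTREAM_BUDGET_MA = 500   # what the host port guarantees to the hub
HUB_OWN_DRAW_MA = 100      # the hub takes one 100 mA unit load for itself
PORTS = 4

per_port_ma = (UPSTREAM_BUDGET_MA - HUB_OWN_DRAW_MA) / PORTS
print(f"Per downstream port: {per_port_ma:.0f} mA")   # 100 mA, one unit load each
```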

The standard was designed this way to support a variety of usage scenarios.

Evidently only the Apple devices follow the standard here, consuming just 100mA prior to requesting more power.

The reality is that few USB ports are unable to supply 500mA without being asked. Many don't even bother to monitor current consumption or to shut off non-compliant USB devices. It's almost always safe to draw 500mA from a USB port without asking the host for the maximum power.

The newer USB specifications allow for higher power ports. Again, though, this must be requested to be compliant with the specification.

USB chargers are typically not intelligent, and don't implement a full USB host port. They use shortcuts, generally placing resistors on the D+ and D- lines to signal to the USB device that the charger can supply more power without an official request.

Further, some devices, such as the Apple iOS line, will also monitor the voltage provided, and scale back the current consumption based on voltage drop. For instance, if a charger reports that it can supply 2A, but the voltage doesn't stay at 5V, the iOS device will consume less than the maximum current. It will not charge below 4.5V, nor above 5.5V. So not only does the charger have to present the correct signals to indicate full current is available, it has to maintain good regulation at the maximum current draw.
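To illustrate that behaviour in code terms (the derating curve below is entirely made up, since Apple publishes nothing about it; only the 4.5 V to 5.5 V window comes from the paragraph above):

```python
# Illustration of the charging behaviour described above: no charging outside
# a 4.5-5.5 V window, and reduced current when the bus voltage sags.
# The linear derating is a made-up model, used only to show the idea.

def allowed_charge_current(v_bus, i_advertised=2.0):
    if v_bus < 4.5 or v_bus > 5.5:
        return 0.0                                # outside the window: no charging
    derating = min(1.0, (v_bus - 4.5) / 0.5)      # full current again by 5.0 V
    return i_advertised * derating

for v in (5.1, 4.9, 4.7, 4.4):
    print(f"Vbus = {v:.1f} V -> charge at {allowed_charge_current(v):.2f} A")
```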

Keep in mind that this is a safety feature. Not only does the charging device need to be able to supply the current, but the USB cable used needs to be able to carry it. It might not seem like a lot of current, but there are many very cheap thin USB cables on the market that will noticeably warm up with 2A flowing through their undersized conductors. Put that under a flammable pillow and let the heat build up, and you might find more than melted insulation.

Apple not only verifies the charger, but also the cable (using their proprietary chips inside the cable connector) so they can avoid liability for possible losses associated with dangerous chargers and wiring.

As long as you are using the cable that came with the device, though, you should have no issue with this aspect of it, and can focus on the charger signalling.

What do I need to do in my test setup to convince the devices to consume their maximum charging current?

The Apple scheme has been loosely adopted, or at least accepted, by others, and consists of placing specific voltage levels on the D- and D+ lines at a low current. Placing approximately 2.0V on the D- line and 2.75V on the D+ line signals that 2A (10W) is available for charging. This can be done with simple resistors:

[Schematic: two resistor dividers from the 5 V output, biasing D- to about 2.0 V and D+ to about 2.75 V. Schematic created using CircuitLab.]

If you follow this circuit in your setup, you should find that at least the Apple devices charge at 2 or more amps, and you may find your other devices also charge at this rate.
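As a final sanity check on the "simple resistors" approach, the standing current these dividers draw is tiny compared with the charge current. The values below are the same illustrative 22 kΩ-based pair used earlier on this page, not necessarily the exact values from the schematic:

```python
# Standing current and source impedance of the bias dividers, using the same
# illustrative 22 kOhm-based values as earlier on this page (not necessarily
# the exact values in the schematic above).

V_BUS = 5.0

def divider_stats(r_top, r_bottom):
    v_out = V_BUS * r_bottom / (r_top + r_bottom)
    i_bias = V_BUS / (r_top + r_bottom)              # current through the pair
    r_src = r_top * r_bottom / (r_top + r_bottom)    # Thevenin source impedance
    return v_out, i_bias, r_src

for name, r_top, r_bottom in (("D-", 33_000, 22_000), ("D+", 18_000, 22_000)):
    v, i, r = divider_stats(r_top, r_bottom)
    print(f"{name}: {v:.2f} V, {i * 1e3:.2f} mA standing current, "
          f"{r / 1e3:.1f} kOhm source impedance")
```

Roughly a tenth of a milliamp per divider is negligible next to a 2 A charge path, which is why this signalling method costs essentially nothing.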