# What is the point of a voltage divider if you can't drive anything with it?

Oh, but you can. You can drive a high-impedance input with it, including a buffer, which can in turn drive whatever you want. The more current you draw, the more the voltage will droop, so you just make sure to draw as little current as possible, so that the output is, for example, 99.9% of what the divider formula predicts.

The divider formula is simply an equation that holds true under certain ideal conditions. If you want to analyze it mathematically under real conditions, the equation gets complicated and case-specific, so it is often easier to arrange your real-world usage so that the equation's assumptions are approximated very closely.
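The effect of violating that assumption is easy to compute: a load across the bottom resistor appears in parallel with it, which shifts the output below the ideal value. A minimal sketch, using illustrative example values (12 V in, two 10 kΩ resistors, a 1 MΩ load):

```python
# Sketch of how loading shifts a divider's output.
# Example values (assumptions): Vin = 12 V, R1 = R2 = 10 kOhm,
# so the ideal (unloaded) output is 6 V.

def divider_out(vin, r1, r2, r_load=float("inf")):
    """Divider output with the load in parallel with R2."""
    if r_load == float("inf"):
        r2_eff = r2                      # no load: ideal divider
    else:
        r2_eff = r2 * r_load / (r2 + r_load)
    return vin * r2_eff / (r1 + r2_eff)

vin, r1, r2 = 12.0, 10e3, 10e3
ideal = divider_out(vin, r1, r2)         # 6.0 V, no load drawn
loaded = divider_out(vin, r1, r2, 1e6)   # 1 MOhm load: slight droop
```

With the 1 MΩ load the output droops only about half a percent below 6 V, which is why a high-impedance input (or an ideal voltmeter) barely disturbs the divider.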

You don't have to draw significant current to "use" a voltage. For example, if you want to measure the output voltage, which is a perfectly useful thing to do, you can just attach a voltmeter. And an ideal voltmeter draws no current at all.

If you wanted to drive something at a lower voltage than the input, you wouldn't use a voltage divider because that would be extremely wasteful; most of the energy would be lost in the resistors.

In a high-impedance amplifier, the currents are small in proportion to the voltages present, and in this case voltage dividers are commonly used both to prescale the overall gain of the amplifier's first stage and to vary its output level.

The effect you mention (finite current flow causing the voltages in the divider circuit to shift) is called loading, and it can be minimized even in low-impedance circuits through appropriate choices of the resistances in the divider.
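One way to see this choice at work: the divider's source (Thevenin) resistance is R1 ∥ R2, so scaling both resistors down while keeping their ratio fixed reduces the droop for the same load. A sketch with illustrative values (12 V in, equal resistors, a 100 kΩ load):

```python
# Sketch: loading error depends on the divider's source resistance
# (R1 in parallel with R2). Same ratio, lower resistances -> less droop.
# All component values here are illustrative assumptions.

def loaded_output(vin, r1, r2, r_load):
    r2_eff = r2 * r_load / (r2 + r_load)   # load in parallel with R2
    return vin * r2_eff / (r1 + r2_eff)

vin, r_load = 12.0, 100e3
for r in (100e3, 10e3, 1e3):               # ideal output is 6 V in each case
    v = loaded_output(vin, r, r, r_load)
    print(f"R1 = R2 = {r:>8.0f} Ohm -> {v:.3f} V")
```

With 100 kΩ resistors the 100 kΩ load pulls the output all the way down to 4 V, while with 1 kΩ resistors it stays within about half a percent of 6 V. The trade-off, as noted above, is that the lower-resistance divider burns far more power in the resistors themselves.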