Why is high input impedance good?

High input impedance is a good thing for a voltage input: if the input impedance is high compared to the source impedance, the voltage level will not drop much due to the voltage divider effect.

For example, say we have a \$10V\$ signal with \$1k\Omega\$ impedance.

If we connect this to a \$1M\Omega\$ input, the input voltage will be \$ 10V\cdot\frac{1M\Omega}{1M\Omega+1k\Omega} = 9.99V \$.

If we reduce the input impedance to \$10k\Omega\$, we get \$10V \cdot \frac{10k\Omega}{10k\Omega + 1k\Omega} = 9.09V\$

Reduce it to \$1k\Omega\$ and we get \$ 10V \cdot \frac{1k\Omega}{1k\Omega + 1k\Omega} = 5V\$

Hopefully you get the picture - generally an input impedance of at least 10 times the source impedance is a good idea to prevent significant loading.
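The divider calculations above can be sketched in a few lines of Python (the function name is just for illustration):

```python
def loaded_voltage(v_source, z_source, z_input):
    """Voltage seen at the input after the source/input voltage divider."""
    return v_source * z_input / (z_source + z_input)

# The examples above: a 10 V source with 1 kOhm output impedance
for z_in in (1e6, 10e3, 1e3):
    print(f"Zin = {z_in:>9.0f} ohm -> {loaded_voltage(10, 1e3, z_in):.2f} V")
# -> 9.99 V, 9.09 V, 5.00 V
```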

High input impedance is not always a good thing though, for example if you want to transfer as much power as possible then the source and load impedance should be equal. So in the above example the 1k input impedance would be the best choice.
For a current input a low input impedance (ideally zero) is desired, for example in a transimpedance (current to voltage) amplifier.


The "best" value of Impedance depends on the situation and application.

When a high input impedance is appropriate or needed, it is because it approximates an infinite impedance.

An input connected to a signal source acts as a voltage divider:
Vout = Vsignal x Zinput / (Zsource + Zinput)
To get no loading, either Zsource must be zero (a low- or zero-impedance output) and/or Zinput must be infinite.
"Suitably high" is the practical version of "infinite would be nice".

How large "suitably" is depends on the application.

AC mains has an impedance well under 1 ohm (usually). A test meter with 1000 ohms input impedance would draw about 100 mA !!!! from 110 VAC mains, but would only load it down by under 0.1 of a volt in the process. A test meter with 1 megohm input impedance would draw about 100 uA, which would be much more acceptable.
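A quick sketch of the meter-loading arithmetic above (the 0.5 ohm mains source impedance is an assumed value, consistent with "well under 1 ohm"):

```python
def meter_load(v_source, z_source, z_meter):
    """Current drawn by a meter and the voltage drop it causes at the source."""
    current = v_source / (z_source + z_meter)
    drop = current * z_source
    return current, drop

# 110 VAC mains with an assumed 0.5 ohm source impedance
for z_meter in (1e3, 1e6):
    i, v = meter_load(110, 0.5, z_meter)
    print(f"Zmeter = {z_meter:>8.0f} ohm: draws {i * 1e3:.2f} mA, drops {v * 1e3:.3f} mV")
```

The 1 kOhm meter draws roughly 110 mA but only drops about 55 mV, while the 1 megohm meter draws about 110 uA.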

For high impedance sources, "suitably" needs to be quite large.
A high impedance input places very little load on a signal that is applied to it.
It thus does not reduce its level (or not by much). A unity gain buffer usually has very high input impedance and is often used as the input stage of an amplifier chain. A pH probe, used for measuring the acidity or alkalinity of a solution, may have an output impedance of 10's to 100's of megohms. Its voltage level is a direct measure of pH, so anything that seeks to measure the voltage must try not to alter it in the process. A voltage measuring probe will effectively act like a voltage divider. The probe impedance needs to be >> the impedance of the circuit being measured if loading is not to occur.

A probe which is 256 times the impedance of a circuit being measured will cause 1 bit error in an 8 bit system.
A probe which is 4096 times the impedance of a circuit being measured will cause 1 bit error in a 12 bit system.

So to measure to within 1 part in 256 (1 bit in an 8 bit system) with a 1 megohm source impedance, you need a 256 megohm input impedance. For a 10 megohm source you need a 2.56 gigohm input impedance. And for a 100 megohm source you need ... !!!
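The rule of thumb above (input impedance = 2^N times the source impedance for 1 LSB of loading error in an N-bit system) can be sketched as:

```python
def required_input_impedance(z_source, bits):
    """Input impedance needed to keep divider loading error near 1 LSB
    of an N-bit system, using the 2**bits rule of thumb from the text."""
    return z_source * 2**bits

print(required_input_impedance(1e6, 8))    # 1 Mohm source, 8 bits -> 256 Mohm
print(required_input_impedance(10e6, 8))   # 10 Mohm source, 8 bits -> 2.56 Gohm
print(required_input_impedance(1e6, 12))   # 1 Mohm source, 12 bits -> 4.096 Gohm
```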

As per the formula above, for outputs, LOW impedance is good, with the ideal being zero impedance (a perfect voltage source).

Then there is the special case of matched impedances, where source and input impedance are the same. Half the signal power is dissipated in the INPUT (the load) and half in the source's output impedance (assuming an otherwise lossless connection), BUT there are no reflections due to impedance mismatch. A whole new subject for another time.


Infinite input impedance would allow one to feed any amount of voltage into a load without it absorbing any power. Zero input impedance would allow one to feed any amount of current into a load without it absorbing any power. In cases where one wants to sense voltage without absorbing power, infinite impedance is thus the ideal; conversely, if one wants to sense current, zero impedance is the ideal.

Although sometimes one wants a load that doesn't absorb any power, there are times one wants to feed power into the load. The amount of power fed into a load will be maximized when the input impedance of the load matches the output impedance of whatever is driving it. This situation does not imply maximal energy efficiency, however. Depending upon what's driving the load, a higher or lower input impedance may cause the driving device to waste more or less power internally.
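The maximum power transfer point can be demonstrated numerically; this sketch sweeps the load impedance around an assumed 1 kOhm source and shows the power peaking when the impedances match:

```python
def load_power(v_source, z_source, z_load):
    """Power delivered to the load by a source with a given output impedance."""
    current = v_source / (z_source + z_load)
    return current**2 * z_load

# Assumed example: 10 V source with 1 kOhm output impedance
z_source = 1e3
for z_load in (100, 500, 1e3, 2e3, 10e3):
    p = load_power(10, z_source, z_load)
    print(f"Zload = {z_load:>6.0f} ohm -> {p * 1e3:.2f} mW")
```

The power delivered is symmetric around the matched point: a load of half or double the source impedance receives the same (lower) power, while the matched 1 kOhm load receives the maximum, 25 mW here.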
