Why do some multimeters have discrete ranges for measurable values and some just measure based on signal type?

The ones that don't require you to set a range have a feature called "auto-ranging". It's a feature that makes the meter easier to use — especially in the field — but sometimes slower to settle on a reading. It adds a bit of complexity to the meter logic (and the user interface), which is why the cheapest meters don't have it.
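
As a rough illustration only: an auto-ranging meter's firmware typically takes a reading, checks whether it sits near the top of the current range or far below it, and switches ranges until the reading lands comfortably within one. A minimal sketch in Python, with made-up range values, thresholds, and a simulated `read_adc()` stand-in for the real hardware:

```python
# Hypothetical full-scale values (in volts) for an auto-ranging voltmeter,
# from most sensitive to least sensitive.
RANGES = [0.4, 4.0, 40.0, 400.0]

# The real meter would read its ADC here; this stand-in just clips a simulated
# input voltage to the selected range so the example runs on its own.
def read_adc(full_scale, true_input=1.234):
    return max(-full_scale, min(full_scale, true_input))

def auto_range_read():
    """Step through ranges until the reading sits comfortably within one.
    Each range change costs another measurement cycle, which is why
    auto-ranging meters can be slower to settle on a value."""
    i = len(RANGES) - 1                      # start on the least sensitive range
    while True:
        reading = read_adc(RANGES[i])
        if abs(reading) > RANGES[i] * 0.95 and i < len(RANGES) - 1:
            i += 1                           # overrange: pick a less sensitive range
        elif abs(reading) < RANGES[i] * 0.08 and i > 0:
            i -= 1                           # underrange: pick a more sensitive range
        else:
            return reading, RANGES[i]

print(auto_range_read())                     # -> (1.234, 4.0)
```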


Many meters work by converting an input into a small voltage or current and then measuring that. There are a number of approaches meters can use to control their input range, which may be used individually or combined:

  1. Use a constant ratio of input signal to measured signal, and vary the relationship between the measured signal and the display. For example, a meter could divide the input by 1000:1 and then display it scaled so that full scale corresponds to 400mV, 40mV, or 4mV at the measurement circuit (i.e. 400V, 40V, or 4V at the input). Such an approach used in isolation will generally limit a meter's useful dynamic range, since 0.1mV of noise at the measurement circuit would show up as 0.1V of noise in the displayed value. Not a problem if the signal being measured is 300V, but very bad if the signal of interest is only 0.3V.

  2. Feed the input through several different dividers, all of which are permanently connected to the measurement circuitry, and have the measurement circuitry select whichever input uses a divide ratio appropriate to the signal being measured. This approach could be combined with the above by having inputs with 1000:1 and 10:1 dividers and then displaying the result scaled to a full-scale voltage of 400mV or 40mV. Inputs over 4 volts would be read as above using the 1000:1 divider, but smaller inputs could use the 10:1 divider, reducing the effect of noise in the measurement circuitry 100-fold (see the sketch after this list).

  3. Change the ratio of input value to the voltage that appears at the measurement input, for example by switching the input divider itself.

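To make approaches #1 and #2 concrete, here is a rough sketch of how the firmware side might look. The divider ratios, full-scale values, and noise figure are the illustrative numbers from above rather than anything from a real meter, and the actual tap selection would of course happen in analog hardware:

```python
# Divider taps that are always connected to the input (approach #2) and the
# full-scale settings of the measurement circuit itself (approach #1).
# All values are illustrative only.
DIVIDER_RATIOS = [10, 1000]            # 10:1 and 1000:1 taps
MEAS_FULL_SCALES = [0.04, 0.4]         # 40mV and 400mV at the measurement circuit

def measure(divided_volts):
    """Stand-in for the measurement circuitry: whatever reaches it picks up
    about 0.1mV of noise, regardless of which tap it came from."""
    return divided_volts + 0.0001

def read_voltage(input_volts):
    # Enumerate the (tap, full-scale) combinations in order of increasing
    # input range and take the first one the signal fits in, so small
    # signals end up on the 10:1 tap where the noise matters 100x less.
    combos = sorted((ratio * fs, ratio) for ratio in DIVIDER_RATIOS
                    for fs in MEAS_FULL_SCALES)
    for input_full_scale, ratio in combos:
        if abs(input_volts) <= input_full_scale:
            measured = measure(input_volts / ratio)      # what the circuit sees
            return measured * ratio, input_full_scale    # scaled back for display
    raise OverflowError("input exceeds the highest range")

# 0.3V lands on the 10:1 tap, so the 0.1mV of noise appears as only 1mV
# referred to the input; on the 1000:1 tap it would have appeared as 0.1V.
print(read_voltage(0.3))      # roughly (0.301, 0.4)
print(read_voltage(300.0))    # roughly (300.1, 400.0)
```
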
Note that if a meter is only measuring voltage, approaches #1 and #2 are adequate when used together, since the meter will have little effect on the device or circuit being measured. When measuring resistance or current, however, the meter will need to affect the device under test, and there are trade-offs between accuracy and the amount of that effect. One could theoretically measure any resistance by putting one milliamp through it and measuring the voltage, but getting a measurement that's accurate to 1% with a one ohm resistor would require measuring voltage accurate to 0.01mV, which is rather a tall order. On the flip side, measuring a one meg resistor using a 1mA current would require feeding a watt into it, which would be difficult for a meter to do and might adversely affect the resistor being tested even if it could.

Adjusting the measurement range for resistance and current is therefore more easily accomplished by adjusting the effect of the measurement on the device under test so that it produces a voltage in a certain range, but doing that may be more difficult than merely selecting an input. When sensing resistance, having a current source that can be switched under processor control between e.g. 0.1mA and 10mA may not be difficult, but changing the sense resistor when measuring current will generally be impractical.
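
As a back-of-the-envelope illustration of the resistance case, here is a sketch with made-up selectable test currents and an assumed 4V full scale, showing how choosing the current keeps both the sensed voltage and the power in the part reasonable:

```python
# Illustrative, made-up test currents a processor-controlled current source
# might offer for resistance measurement.
TEST_CURRENTS = [10e-3, 0.1e-3, 1e-6]          # 10mA, 0.1mA, 1uA

def pick_test_current(r_ohms, v_full_scale=4.0):
    """Return the largest test current whose sense voltage fits the
    (assumed) 4V full scale, plus that voltage and the power dissipated
    in the resistor under test."""
    for i_test in TEST_CURRENTS:
        v_sense = i_test * r_ohms
        if v_sense <= v_full_scale:
            return i_test, v_sense, v_sense * i_test
    # Resistance too large for any available current: report over-range.
    raise OverflowError("resistance exceeds the meter's highest range")

# A 1 ohm resistor gets 10mA (10mV to measure, 0.1mW in the part);
# a 1 meg resistor gets 1uA (1V to measure, only 1uW in the part),
# instead of the 1000V and 1W that forcing 1mA through it would take.
print(pick_test_current(1))        # (0.01, 0.01, 0.0001)
print(pick_test_current(1e6))      # (1e-06, 1.0, 1e-06)
```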

A meter which uses a rotary switch to select input range can easily adjust how it measures things in whatever way is most convenient. A meter which selects ranges electronically may have more limited options.

Tags:

Multimeter