[Image: A digital multimeter]

A multimeter or multitester, also known as a volt/ohm meter or VOM, is an electronic measuring instrument that combines several measurement functions in one unit. A typical multimeter can measure voltage, current, and resistance. Multimeters may use analog or digital circuits: analog multimeters and digital multimeters (often abbreviated DMM or DVOM). Analog instruments are usually based on a microammeter whose pointer moves over a scale calibrated for all the different measurements that can be made; digital instruments usually display digits, but may display a bar whose length is proportional to the quantity measured.

A multimeter can be a hand-held device useful for basic fault finding and field service work or a bench instrument which can measure to a very high degree of accuracy. They can be used to troubleshoot electrical problems in a wide array of industrial and household devices such as electronic equipment, motor controls, domestic appliances, power supplies, and wiring systems.


Quantities measured

Contemporary multimeters can measure many quantities. The common ones are:

  • Voltage, in volts (DC and AC)
  • Current, in amperes (DC and AC)
  • Resistance, in ohms

Additionally, some multimeters measure other quantities, such as capacitance, frequency, and temperature.

Digital multimeters may also include circuits for:

  • Continuity; beeps when a circuit conducts.
  • Diode testing (measuring the forward voltage drop of diode and transistor junctions) and transistor testing (measuring current gain and other parameters).
  • Battery checking for simple 1.5-volt and 9-volt batteries, using a current-loaded voltage scale. Checking a battery on an ordinary DC voltage scale is less accurate, since it ignores the battery's internal resistance, which increases as the battery is depleted.



The resolution of a multimeter is often specified in "digits". For example, the term 5½ digits refers to the number of digits shown on a multimeter's display.

By convention, a half digit can display either a zero or a one, while a three-quarters digit can display a numeral higher than a one but not nine. Commonly, a three-quarters digit refers to a maximum value of 3 or 5. The fractional digit is always the most significant digit in the displayed value. A 5½ digit multimeter would have five full digits that display values from 0 to 9 and one half digit that could only display 0 or 1.[3] Such a meter could show positive or negative values from 0 to 199,999. A 3¾ digit meter can display a quantity from 0 to 3,999 or 5,999, depending on the manufacturer.

While a digital display can easily be extended in precision, the extra digits are of no value if not accompanied by care in the design and calibration of the analog portions of the multimeter. Meaningful high-resolution measurements require a good understanding of the instrument specifications, good control of the measurement conditions, and traceability of the calibration of the instrument.

Specifying "display counts" is another way to specify the resolution. Display counts give the largest number, or the largest number plus one (so the count number looks nicer) the multimeter's display can show, ignoring a decimal separator. For example, a 5½ digit multimeter can also be specified as a 199999 display count or 200000 display count multimeter. Often the display count is just called the count in multimeter specifications.


Resolution of analog multimeters is limited by the width of the scale pointer, vibration of the pointer, the accuracy of printing of scales, zero calibration, number of ranges, and errors due to non-horizontal use of the mechanical display. Accuracy of readings obtained is also often compromised by miscounting division markings, errors in mental arithmetic, parallax observation errors, and less than perfect eyesight. Mirrored scales and larger meter movements are used to improve resolution; two and a half to three digits equivalent resolution is usual (and is usually adequate for the limited precision needed for most measurements).

Resistance measurements, in particular, are of low precision due to the typical resistance measurement circuit which compresses the scale heavily at the higher resistance values. Inexpensive analog meters may have only a single resistance scale, seriously restricting the range of precise measurements. Typically an analog meter will have a panel adjustment to set the zero-ohms calibration of the meter, to compensate for the varying voltage of the meter battery.
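The scale compression can be sketched numerically: in a common series-type analog ohmmeter, pointer deflection is proportional to loop current, so it falls off as r_internal / (r_internal + r_x). The internal resistance value below is illustrative:

```python
# Why an analog ohms scale is compressed at high resistance: in a
# series-type ohmmeter, deflection is proportional to loop current,
# giving deflection = r_internal / (r_internal + r_x).

def deflection(r_internal, r_x):
    """Fraction of full-scale deflection for unknown resistance r_x."""
    return r_internal / (r_internal + r_x)

r_int = 1000.0  # illustrative half-scale (internal) resistance, in ohms
for r_x in (0, 1_000, 10_000, 100_000):
    print(f"{r_x:>7} ohms -> {deflection(r_int, r_x):.3f} of full scale")
```

A whole decade, from 10,000 to 100,000 ohms, moves the pointer through only about 8% of the scale, which is why high resistance values are hard to read precisely.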


Digital multimeters generally take measurements with accuracy superior to their analog counterparts. Standard analog multimeters measure with typically three percent accuracy,[4] though instruments of higher accuracy are made. Standard portable digital multimeters are specified to have an accuracy of typically 0.5% on the DC voltage ranges. Mainstream bench-top multimeters are available with specified accuracy of better than ±0.01%. Laboratory grade instruments can have accuracies of a few parts per million.[5]

Accuracy figures need to be interpreted with care. The accuracy of an analog instrument usually refers to full-scale deflection; a measurement of 10V on the 100V scale of a 3% meter is subject to an error of 3V, 30% of the reading. Digital meters usually specify accuracy as a percentage of reading plus a percentage of full-scale value, sometimes expressed in counts rather than percentage terms.
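A minimal sketch of such a specification, with all numbers illustrative rather than taken from any particular instrument:

```python
# Worst-case error bound for a digital meter specification quoted as
# "±(percent of reading + counts)". All numbers are illustrative.

def worst_case_error(reading, pct_of_reading, counts, count_value):
    """Error bound, in the same units as the reading.

    pct_of_reading: e.g. 0.5 for a 0.5% specification.
    counts:         least-significant-digit counts in the specification.
    count_value:    value of one count on the selected range.
    """
    return reading * pct_of_reading / 100 + counts * count_value

# A "0.5% + 2 counts" meter reading 10.00 V where one count is 0.01 V:
err = worst_case_error(10.00, 0.5, 2, 0.01)
print(f"±{err:.3f} V")  # ±0.070 V
```

Note how the counts term dominates near the bottom of a range, just as full-scale percentage error dominates low readings on an analog meter.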

The quoted accuracy is conventionally that of the lowest DC voltage (millivolt) range, and is known as the "basic DC volts accuracy" figure. Higher DC voltage ranges, current, resistance, AC, and other ranges will usually have worse accuracy than the basic DC volts figure. AC measurements only meet specified accuracy within a specified range of frequencies.

Manufacturers can provide calibration services so that new meters may be purchased with a certificate of calibration indicating the meter has been adjusted to standards traceable to, for example, the US National Institute of Standards and Technology (NIST), or other national standards laboratory.

Test equipment tends to drift out of calibration over time, and the specified accuracy cannot be relied upon indefinitely. For more expensive equipment, manufacturers and third parties provide calibration services so that older equipment may be recalibrated and recertified. The cost of such services is disproportionate for inexpensive equipment; however, extreme accuracy is not required for most routine testing. Multimeters used for critical measurements may be part of a metrology program to assure calibration.

Sensitivity and input impedance

When used for measuring voltage, the input impedance of the multimeter must be very high compared to the impedance of the circuit being measured; otherwise circuit operation may be changed, and the reading will also be inaccurate.

Meters with electronic amplifiers (all digital multimeters and some analog meters) have a fixed input impedance that is high enough not to disturb most circuits. This is often either one or ten megohms; the standardization of the input resistance allows the use of external high-resistance probes which form a voltage divider with the input resistance to extend voltage range up to tens of thousands of volts.
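The divider arithmetic can be sketched as follows; the resistor values are illustrative, chosen for a 100:1 ratio into a 10 megohm input:

```python
# How an external high-resistance probe forms a voltage divider with the
# meter's fixed input resistance. Resistor values are illustrative.

def divided_voltage(v_in, r_probe, r_input):
    """Voltage appearing at the meter's input terminals."""
    return v_in * r_input / (r_probe + r_input)

r_input = 10e6   # a common 10 megohm DMM input resistance
r_probe = 990e6  # probe resistor chosen for a 100:1 divider
print(divided_voltage(25_000, r_probe, r_input))  # 250.0
```

This only works because the input resistance is standardized; a meter with a different input resistance would change the ratio.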

Most analog multimeters of the moving-pointer type are unbuffered, and draw current from the circuit under test to deflect the meter pointer. The impedance of the meter varies depending on the basic sensitivity of the meter movement and the range which is selected. For example, a meter with a typical 20,000 ohms/volt sensitivity will have an input resistance of two million ohms on the 100 volt range (100 V * 20,000 ohms/volt = 2,000,000 ohms). On every range, at full scale voltage of the range, the full current required to deflect the meter movement is taken from the circuit under test. Lower sensitivity meter movements are acceptable for testing in circuits where source impedances are low compared to the meter impedance, for example, power circuits; these meters are more rugged mechanically. Some measurements in signal circuits require higher sensitivity movements so as not to load the circuit under test with the meter impedance.[6]
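The loading effect described above can be sketched with a worked example; the circuit values are illustrative:

```python
# Loading effect of an unbuffered analog meter. Its input resistance is
# sensitivity (ohms/volt) times the selected range; values illustrative.

def meter_resistance(ohms_per_volt, range_volts):
    return ohms_per_volt * range_volts

r_meter = meter_resistance(20_000, 100)  # 2,000,000 ohms on the 100 V range

# Measuring the midpoint of two 1 megohm resistors across a 100 V supply;
# unloaded, the midpoint sits at exactly 50 V.
r_top, r_bottom = 1e6, 1e6
r_loaded = 1 / (1 / r_bottom + 1 / r_meter)  # meter parallels the lower leg
v_reading = 100 * r_loaded / (r_top + r_loaded)
print(f"{v_reading:.1f} V instead of 50.0 V")  # 40.0 V instead of 50.0 V
```

A 20% reading error from loading alone, even with a perfectly accurate movement, which is why high-impedance signal circuits call for buffered or high-sensitivity meters.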

Sometimes sensitivity is confused with resolution of a meter, which is defined as the lowest voltage, current or resistance change that can change the observed reading.

For general-purpose digital multimeters, the lowest voltage range is typically several hundred millivolts AC or DC, but the lowest current range may be several hundred milliamperes, although instruments with greater current sensitivity are available. Measurement of low resistance requires lead resistance (measured by touching the test probes together) to be subtracted for best accuracy.

The upper end of multimeter measurement ranges varies considerably; measurements over perhaps 600 volts, 10 amperes, or 100 megohms may require a specialized test instrument.

Burden voltage

Any ammeter, including a multimeter in a current range, has a certain resistance. Most multimeters inherently measure voltage, and pass a current to be measured through a shunt resistance, measuring the voltage developed across it. The voltage drop is known as the burden voltage, specified in volts per ampere. The value can change depending on the range the meter selects, since different ranges usually use different shunt resistors.[7][8]

The burden voltage can be significant in low-voltage circuits. To check for its effect on accuracy and on external circuit operation the meter can be switched to different ranges; the current reading should be the same and circuit operation should not be affected if burden voltage is not a problem. If this voltage is significant it can be reduced (also reducing the inherent accuracy and precision of the measurement) by using a higher current range.
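A short sketch of the effect, with an illustrative shunt value and circuit:

```python
# Effect of ammeter burden voltage in a low-voltage loop. The shunt and
# circuit values are illustrative.

def measured_current(v_supply, r_circuit, r_shunt):
    """Current that actually flows once the meter's shunt is in the loop."""
    return v_supply / (r_circuit + r_shunt)

v, r = 3.3, 33.0                         # nominally 100 mA without the meter
i = measured_current(v, r, r_shunt=1.0)  # 1 ohm shunt on a milliamp range
print(f"{i * 1000:.1f} mA, burden voltage {i * 1.0:.3f} V")
```

Inserting the meter drops the current from a nominal 100 mA to about 97 mA; switching to a higher range (a smaller shunt) reduces the disturbance, at the cost of resolution.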

Alternating current sensing

Since the basic indicator system in either an analog or digital meter responds to DC only, a multimeter includes an AC to DC conversion circuit for making alternating current measurements. Basic meters utilize a rectifier circuit to measure the average or peak absolute value of the voltage, but are calibrated to show the calculated root mean square (RMS) value for a sinusoidal waveform; this will give correct readings for alternating current as used in power distribution. User guides for some such meters give correction factors for some simple non-sinusoidal waveforms, to allow the correct root mean square (RMS) equivalent value to be calculated. More expensive multimeters include an AC to DC converter that measures the true RMS value of the waveform within certain limits; the user manual for the meter may indicate the limits of the crest factor and frequency for which the meter calibration is valid. RMS sensing is necessary for measurements on non-sinusoidal periodic waveforms, such as found in audio signals and variable-frequency drives.
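The difference between the two sensing schemes can be sketched numerically. An average-responding meter scales the rectified mean by the sine-wave form factor, pi / (2·sqrt(2)) ≈ 1.111, which is only correct for sine waves:

```python
import math

# Why an average-responding, RMS-calibrated meter misreads non-sinusoidal
# waveforms: it scales the rectified mean by the sine-wave form factor.

FORM_FACTOR = math.pi / (2 * math.sqrt(2))  # ~1.111, correct for sines only

def average_responding_reading(samples):
    rectified_mean = sum(abs(s) for s in samples) / len(samples)
    return FORM_FACTOR * rectified_mean

def true_rms(samples):
    return math.sqrt(sum(s * s for s in samples) / len(samples))

n = 10_000
sine = [math.sin(2 * math.pi * k / n) for k in range(n)]
square = [1.0 if k < n // 2 else -1.0 for k in range(n)]

print(average_responding_reading(sine), true_rms(sine))      # both ~0.707
print(average_responding_reading(square), true_rms(square))  # ~1.111 vs 1.0
```

On a square wave the average-responding meter reads about 11% high; a true-RMS converter gets both waveforms right, within its crest factor and bandwidth limits.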



This article is licensed under the GNU Free Documentation License. It uses material from the article "multimeter".

Licensed under the Creative Commons Attribution Non-commercial 3.0 License