What Is an Analog Multimeter?

Jerry Morrison

An analog multimeter (AMM) is a multi-purpose electronic testing device. A typical unit can measure voltage, current, and resistance, and many models can also test frequency and signal power. The display of an analog multimeter usually relies on a microammeter that moves a physical pointer over a calibrated scale. The device may also be called a multitester or a volt-ohm meter (VOM).

A digital multimeter, an alternative to an analog multimeter.

The standard set of functions on an analog multimeter includes the measurement of voltage in volts, resistance in ohms, and current in amperes. Some models extend this to capacitance, conductance, inductance, and frequency. Switching between quantities is done with a selector switch on the device itself, along with various detachable probes. Each measurable quantity has its own calibrated scale in the display area, and a moving pointer indicates the resulting value of a measurement.
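Because one printed scale serves several switch-selected ranges, reading the meter means scaling the number under the pointer by the range in use. The following is a minimal sketch of that arithmetic; the function name and the 0-10 scale are illustrative assumptions, not part of any particular meter's documentation.

```python
# Hypothetical sketch: converting an analog meter's scale reading into a
# measurement. The value read off the printed face is multiplied by the
# ratio of the selected range to the printed scale's full-scale value.

def interpret_reading(scale_value: float, range_full_scale: float,
                      scale_full_scale: float = 10.0) -> float:
    """Scale a printed-face reading to the selected range.

    scale_value      -- number under the pointer on the printed scale
    range_full_scale -- full-scale value of the selected range (e.g. 250 V)
    scale_full_scale -- full-scale value of the printed scale (often 10 or 50)
    """
    return scale_value * (range_full_scale / scale_full_scale)

# Pointer at 6.2 on a 0-10 printed scale, switch set to the 250 V range:
print(interpret_reading(6.2, 250.0))  # 155.0 (volts)
```

This is the "math" a user sometimes has to do when the selected range has no dedicated printed scale of its own.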

Resistors are electrical devices that manage the flow of current through a circuit.

The accuracy of an analog multimeter depends on the circuitry of its design and on its use of a mechanical display. At the heart of the device is an electromechanical transducer that produces a rotary deflection of the pointer in response to an electrical current. The amount of deflection is read as a quantity from one of several calibrated scales over which the pointer travels. A standard, well-calibrated analog multimeter typically has a measurement accuracy of three percent. This figure applies only to the lower millivolt direct-current range and may not hold for higher voltages or alternating current.

Interpreting readings must take into account the quoted accuracy of the device relative to the calibration of a particular scale. Three percent accuracy on a scale of 100 units is three units, which amounts to an error of 10% or more of the reading anywhere in the first third of the scale. Accuracy is also limited by how well the smallest increment on a scale can be visually resolved: the width of the pointer, the accuracy of the scale markings, and vibration can all compromise a reading.
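The arithmetic behind that warning is simple: a percent-of-full-scale specification is a fixed number of units, so the error as a fraction of the actual reading grows as the reading falls. A small sketch, using the article's assumed figures of 3% on a 100-unit scale:

```python
# Sketch: how a percent-of-full-scale accuracy spec translates into
# worst-case error as a percentage of the actual reading.

def relative_error_pct(reading: float, full_scale: float,
                       accuracy_pct_fs: float = 3.0) -> float:
    """Worst-case error expressed as a percentage of the reading itself."""
    absolute_error = full_scale * accuracy_pct_fs / 100.0  # 3 units on a 100-unit scale
    return 100.0 * absolute_error / reading

for reading in (90, 50, 25):
    print(f"{reading:3d} units: +/-{relative_error_pct(reading, 100.0):.1f}% of reading")
# 90 units near full scale: about 3.3%; 25 units, deep in the lower
# third of the scale: 12% of the reading.
```

This is why good practice with an analog meter is to choose a range that puts the pointer in the upper portion of the scale.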

An analog multimeter excels at displaying a changing measurement in real time, far surpassing its digital counterpart. Movement of the pointer clearly indicates a positive result in tests where the mere presence of a quantity, rather than its exact value, is in question. The rate at which the pointer climbs toward its maximum also provides a rough indicator of currents higher than those for which the device was designed.


Discussion Comments


@David09 - I’ve used them too but I’m surprised that you don’t think the digital multimeters are much of an improvement over the analog meters.

The analog meters require that you dial a knob to a certain range and then compare with a range on the display. You may even need to do some math if that number is not on the scale.

With digital multimeters, you just attach the terminals to the components and read the LCD display. Nothing could be easier.


I learned how to use an analog multimeter years ago. I wasn’t an electrician, but I did build electronic circuits as a hobby and I used it to test resistance.

It was really easy to use and the digital units which came later didn’t do a whole lot to improve upon the basic functionality of the unit in my opinion.

I also used the analog multimeter as a battery tester, since it had the capacity to measure voltage. Of course now you can get dedicated battery testers, which will tell you if a battery is good or bad.

They don’t give precise measurements, they just spin the needle into a green area to indicate “good” or a red area to indicate “bad.”
