
## Difference Between Voltmeter and Multimeter


Both the voltmeter and the multimeter are instruments used in electronic and electrical measurement, covering most of the quantities of interest in electronic or electrical systems. Physicists, electronic engineers, electrical engineers and technicians use these instruments in their respective fields.

### Voltmeter

The unit “volt” is named in honor of Alessandro Volta. A voltmeter measures the potential at a point or the potential difference between two points. The classic voltmeter is a variation of the galvanometer: a high-value resistor placed in series with the galvanometer forms the basic instrument. Voltmeter ranges run from a few microvolts up to several kilovolts.

As described above, the basic voltmeter consists of a current-carrying coil placed inside a permanent magnetic field. The field produced by the coil interacts with the permanent field, producing a torque that rotates an indicator attached to the coil. The coil-and-indicator assembly is spring loaded, so the indicator returns to zero when no current flows, and the angle of deflection is proportional to the current in the coil. A digital voltmeter instead uses an analog-to-digital converter (ADC) to turn the measured voltage into a digital value; the incoming signal must first be amplified or attenuated to suit the instrument's selected range before it can be displayed.

The main shortcoming of voltmeters is their finite resistance. Ideally a voltmeter would have infinite impedance, meaning it draws no current from the circuit, but a real voltmeter must draw some current to produce the deflecting magnetic field. This loading effect can be minimized with amplifiers so that the disturbance to the circuit is negligible.
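The loading effect described above is just a voltage divider between the circuit's output resistance and the meter's input resistance. A minimal sketch, using assumed example values (5 V node, 10 kΩ source resistance, 200 kΩ meter):

```python
# Sketch of voltmeter loading: a real meter's finite resistance forms
# a voltage divider with the circuit and pulls the reading low.
# All component values are assumed example figures.

def measured_voltage(v_true, r_source, r_meter):
    """Reading when a meter of resistance r_meter loads a source whose
    Thevenin (output) resistance is r_source."""
    return v_true * r_meter / (r_source + r_meter)

# Measuring a 5 V node behind an assumed 10 kilohm source resistance:
near_ideal = measured_voltage(5.0, 10e3, 1e12)   # ~infinite impedance: reads ~5.00 V
real_meter = measured_voltage(5.0, 10e3, 200e3)  # 200 kilohm meter: reads ~4.76 V
print(near_ideal, real_meter)
```

The gap between the two readings (about a quarter of a volt here) is exactly the disturbance that a high-impedance amplifier front end is meant to remove.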

### Multimeter

The multimeter is essentially a collection of meters in one instrument, ranging from the old volt-ampere-ohm (VOM) meter to far more sophisticated devices. The word “multi” means several or many, and as the name suggests, it measures many quantities. Analog multimeters are built around a galvanometer (a current-carrying coil placed in an external magnetic field). Depending on how resistors are combined with it, the galvanometer can serve as a voltmeter, an ammeter or an ohmmeter (resistance meter). A dial on the face of the multimeter selects both the parameter and the range being measured: 0 to 200 mV, 0 to 20 V, 0 to 10 mA, 0 to 2000 ohms, and so on. Digital multimeters measure these parameters by different methods and also offer extra modes, such as diode and transistor testing.
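The resistor combinations behind the range dial follow simple formulas: a series resistor scales the movement for a voltage range, while a parallel (shunt) resistor diverts excess current for a current range. A sketch with an assumed 50 µA, 200 Ω movement:

```python
# Sketch (assumed example values): how resistor combinations turn one
# galvanometer movement into the meters a multimeter's dial selects.

I_FS = 50e-6    # full-scale deflection current, assumed 50 microamps
R_COIL = 200.0  # coil resistance in ohms, assumed

def voltmeter_series_resistor(v_range):
    """Series resistor so the movement reads full scale at v_range
    (e.g. the 0 to 20 V position): R_s = V / I_fs - R_coil."""
    return v_range / I_FS - R_COIL

def ammeter_shunt_resistor(i_range):
    """Parallel shunt resistor for a current range: the shunt carries
    all current beyond the coil's full-scale current."""
    return I_FS * R_COIL / (i_range - I_FS)

print(voltmeter_series_resistor(20.0))  # ~399.8 kilohms for the 0-20 V range
print(ammeter_shunt_resistor(10e-3))    # ~1.005 ohms for the 0-10 mA range
```

Each dial position simply switches a different precision resistor into one of these two arrangements around the same movement.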

### What’s the difference between Voltmeter and Multimeter?

A voltmeter measures only the potential difference between two points, whereas a multimeter measures voltage, current and resistance, and can also be used to test diodes and transistors. The voltmeter can therefore be considered a subset of the multimeter.
