Electrical measurement is the backbone of technology. By measuring electrical quantities, we can learn about the function and performance of machinery, digital electronics, and more. Electricity runs our society, and being able to measure and quantify it is as important now as ever.
In this article, we’ll explain what exactly electrical measurement is, its common units of measurement, and what kind of tools are utilized in that measurement.
What Are Electrical Measurements?
Electrical measurements are the different ways to quantify the qualities and behaviors of electricity, including voltage, resistance, current, frequency, and power. The term is an umbrella that covers all of the different types of measurements that can be gleaned from electricity.
What Are the Units of Measurement for Electricity?
There are several different standard measurement units for electricity that depend on what exactly is being measured; some of the most common units are volts, amps, watts, ohms, and hertz. (We’ll cover the measurements themselves in more detail in the section below.)
- Volt: Measures the voltage; abbreviated as V or E for electromotive force.
- Amp: Measures the magnitude of the electrical current, abbreviated as A or I.
- Watt: Measures the electrical power; abbreviated as W.
- Ohm: Measures the resistance to the flow of current as well as the impedance in AC; abbreviated as Ω.
- Hertz: Measures a current’s frequency; abbreviated as Hz.
Due to the nature of electricity and the large quantities that experts often deal with when measuring and studying it, many of these units are expressed with SI prefixes such as kilo and mega to indicate large quantities; this notation helps people avoid dealing with overly long numerals. For instance, it’s easier to work with the figure 10 kilowatts in an equation than 10,000 watts, or 1 megawatt rather than 1,000,000 watts.
However, these units can also be expressed in a way that signals minuscule, precise quantities with prefixes such as milli and micro. (One milliwatt is the same as 0.001 watt, while 1 microwatt is the same as 0.000001 watt.)
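The prefix arithmetic can be sketched in a few lines of Python; the `to_base_units` helper and its prefix table below are illustrative, not part of any standard library:

```python
# Illustrative helper: scale a prefixed electrical quantity to base units.
PREFIXES = {
    "mega": 1e6,    # 1 megawatt  = 1,000,000 watts
    "kilo": 1e3,    # 1 kilowatt  = 1,000 watts
    "": 1.0,        # no prefix: base unit
    "milli": 1e-3,  # 1 milliwatt = 0.001 watt
    "micro": 1e-6,  # 1 microwatt = 0.000001 watt
}

def to_base_units(value, prefix):
    """Convert a prefixed quantity (e.g. 10 kilowatts) to base units (watts)."""
    return value * PREFIXES[prefix]

print(to_base_units(10, "kilo"))   # 10 kilowatts -> 10000.0 watts
print(to_base_units(1, "mega"))    # 1 megawatt   -> 1000000.0 watts
print(to_base_units(1, "milli"))   # 1 milliwatt  -> 0.001 watt
```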
What Are the Six Most Common Electrical Measurements?
Now that we’ve covered some of the most common units of electrical measurement, let’s dig more into the measurements themselves. Some of the most common electrical measurements are:
- Voltage: Also known as electrical potential difference, electric pressure, or electric tension, voltage is the pressure needed to push electrons through a conducting loop. In other words, it’s the amount of force that drives electrical charge in a circuit. It’s governed by Ohm's law, which states that current across two points is directly proportional to voltage or that voltage equals current multiplied by resistance. The unit of measurement for voltage is volts (V).
- Current: Current is the amount of electrical charge flowing through a system. It’s similarly calculated through Ohm’s law, but in this case, current equals voltage divided by resistance. The unit of measurement for current is amperes (A).
- Resistance: Electrical resistance is a quantification of how much opposition there is to the current in whatever material(s) or system that a current flows through. It’s calculated by dividing voltage by current. The unit of measurement for resistance is ohms (Ω).
- Power: Power is the rate at which electrical energy is being used. It’s calculated by multiplying voltage by current. The unit of measurement for power is watts (W).
- Frequency: The number of times an alternating current (AC) cycles or changes direction per second is called electrical frequency. The unit of measurement for frequency is Hertz (Hz), where one Hz equals one cycle per second; so, if a current is operating at 60Hz, it’s cycling 60 times a second.
- Capacitance: The ability of a material, a component, or a system to collect and store an electrical charge is its capacitance. The unit of measurement for capacitance is the farad (F), and it can be calculated by dividing charge by voltage.
It’s important to know the different formulas that build these different measurements (see below) so it’s clear how these different qualities of electricity intersect with and relate to each other. However, in most cases, experts and scientists utilize instruments specifically designed to take electrical measurements to get the most accurate measurement possible.
| Measurement | Unit | Symbol | Equation |
| --- | --- | --- | --- |
| Voltage | Volt | V or E | V = I × R |
| Current | Ampere | A or I | I = V ÷ R |
| Resistance | Ohm | R or Ω | R = V ÷ I |
| Power | Watt | W or P | P = V × I |
| Frequency | Hertz | Hz | ƒ = 1 ÷ T (time to complete one cycle) |
| Capacitance | Farad | C | C = Q (charge) ÷ V |
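The relationships in the table can be written as plain functions; the sketch below uses illustrative input values, and the function names are this article’s own:

```python
# Minimal sketch of the formulas above, with illustrative values.

def voltage(current_amps, resistance_ohms):
    """Ohm's law: V = I * R."""
    return current_amps * resistance_ohms

def power_watts(volts, current_amps):
    """Electrical power: P = V * I."""
    return volts * current_amps

def frequency_hz(period_seconds):
    """Frequency: f = 1 / T."""
    return 1.0 / period_seconds

def capacitance_farads(charge_coulombs, volts):
    """Capacitance: C = Q / V."""
    return charge_coulombs / volts

print(voltage(2.0, 60.0))                # 2 A through 60 ohms -> 120.0 V
print(power_watts(120.0, 2.0))           # 120 V at 2 A -> 240.0 W
print(frequency_hz(1 / 60))              # a 1/60-second cycle -> 60 Hz
print(capacitance_farads(0.002, 120.0))  # 2 mC stored at 120 V
```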
What Are Electrical Measuring Instruments?
Electrical measuring instruments are the array of tools that people use to gather data on all of the different types of electrical behaviors and qualities. There are both analog and digital instruments with the major difference being how these tools display their measurements. In an analog device, the electricity running through the device physically moves a needle that’s overlaid on a clock-like display of numbers to show what the voltage, amperage, etc., is. In a digital device, though, the device itself converts that measurement directly to a number that is shown on a digital display screen.
What Are the Most Common Electrical Measuring Instruments?
There is a plethora of different types of instruments that can measure electricity. Some are designed to be multi-use and take several different types of electrical measurements, some are built to take just one type of measurement, and some are meant to take very precise electrical measurements. That said, these are some of the more common electrical measuring instruments.
Voltmeter
A voltmeter, as its name suggests, measures the voltage in a circuit. It operates by using two probes to measure the voltage between those two points.

Fluke FLK-V3000 FC Wireless AC Voltage Module
Ammeter
Ammeters measure the current in a circuit and are generally connected directly to the circuit so that they can measure the current in that part of a system.
Ohmmeter
To measure the resistance of a circuit, you would use an ohmmeter. It applies a known current to the circuit and then calculates the resistance by measuring the resulting voltage. Unlike a voltmeter or an ammeter, an ohmmeter should never be connected to a live circuit that has any electrical current running through it.
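The ohmmeter principle described above reduces to one line of arithmetic; the helper name and the test values below are hypothetical:

```python
# Sketch of the ohmmeter principle: drive a known test current through the
# component, measure the voltage that develops, and compute R = V / I.

def resistance_from_test(test_current_amps, measured_volts):
    """Resistance implied by a known test current and a measured voltage."""
    return measured_volts / test_current_amps

# A 1 mA test current producing a 4.7 V drop implies roughly 4700 ohms.
print(resistance_from_test(0.001, 4.7))
```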
Multimeter
The multimeter is the jack of all trades when it comes to electrical measurements. Some multimeters are more versatile than others, so the exact measurements a multimeter can take will depend on the model and what the manufacturer designed it to measure. However, most multimeters can at least measure voltage, current, and resistance, typically through the use of a probe. (Most people use multimeters primarily to take voltage measurements.) This versatility makes them extremely useful for field work, cutting down the need for three separate instruments.

Fluke 87V MAX True-rms Digital Multimeter
Oscilloscope
Oscilloscopes measure voltage over time. They display this measurement as a wave representing the electrical signal, with time usually represented on the X axis and voltage on the Y axis. Oscilloscopes are useful for pinpointing changes in an electrical circuit and troubleshooting potential problems.
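As a rough illustration of what an oscilloscope captures, the sketch below synthesizes one second of a 60 Hz waveform as voltage samples over time, then recovers its frequency from the spacing of rising zero crossings; the sample rate and signal values are assumptions for the example:

```python
import math

SAMPLE_RATE = 10_000  # samples per second (the scope's time base)
FREQ = 60             # Hz, the signal we synthesize
PEAK = 170            # volts, roughly the peak of a 120 V RMS mains waveform

# One second of "captured" voltage samples: voltage as a function of time.
samples = [PEAK * math.sin(2 * math.pi * FREQ * n / SAMPLE_RATE)
           for n in range(SAMPLE_RATE)]

# Find rising zero crossings (negative sample followed by a non-negative one);
# the spacing between consecutive crossings is one full period.
rising = [i for i in range(1, len(samples))
          if samples[i - 1] < 0 <= samples[i]]
period = (rising[1] - rising[0]) / SAMPLE_RATE

print(round(1 / period))  # -> 60
```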
Clamp Meter
A clamp meter, as the name suggests, utilizes a clamping mechanism (rather than a probe) to take its measurements. Though some people think of clamp meters as a type of multimeter, the two are similar but not the same. Where multimeters are used primarily to take voltage measurements, clamp meters are used primarily to collect current measurements.
Its clamping mechanism allows for measurement without needing to directly connect to the system, providing faster and safer results.

Fluke 325 True RMS Clamp Meter
Why Is Calibration Important for Electrical Measurements?
As is the case with any sort of measurement device, you need to regularly calibrate any instrument that takes electrical measurements. Without proper calibration, devices won’t provide accurate readings, which can lead to issues such as system overload and failure, unnecessary downtime, and even unsafe working environments. That’s because, particularly at the industrial level, many expensive and critical systems and pieces of technology are built to run on exact electrical parameters. Even a slight error in measurement can cascade into negative and even disastrous consequences.
For instance, voltage issues or fluctuations can shut down servers and corrupt data, while current abnormalities can cause devices to overheat or suffer irreparable damage. In the specific case of an electricity meter, inaccurate readings from poor calibration could be costing a homeowner or business significant money on their electric bills. Luckily, all of these problems can be avoided before they happen with accurate testing carried out using calibrated equipment.
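To put a number on the electricity-meter example, here’s a back-of-the-envelope sketch; the consumption, error, and rate figures are assumptions for illustration, not real tariff data:

```python
# Illustrative only: the monthly cost of an electricity meter reading 2% high.
TRUE_KWH = 1_000      # assumed actual monthly consumption, kWh
METER_ERROR = 0.02    # assumed calibration drift: meter reads 2% above true
RATE_PER_KWH = 0.15   # assumed utility rate, dollars per kWh

billed_kwh = TRUE_KWH * (1 + METER_ERROR)
overcharge = (billed_kwh - TRUE_KWH) * RATE_PER_KWH
print(f"${overcharge:.2f} overbilled per month")  # $3.00 overbilled per month
```

A few dollars a month sounds small, but the same 2% drift on an industrial feed, or compounded across a fleet of meters, scales accordingly.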
Electrical Measurement in Review
Electrical measurement is a broad discipline that covers a whole spectrum of electrical tests and characteristics. Between voltage, power, current, and resistance, it provides us with insight into how circuits are running and thus how well our technology is operating. Instruments like digital multimeters are vital to gathering these different measurements and must remain properly calibrated to provide safe and accurate results for all. Want to learn more about electrical measurement and how to handle it properly? Check out these related resources: