The term "RMS Voltage" is often thrown around when the subject of inverters is discussed but how many people actually understand what it is, what it means and what relevance it is to inverters?
In our usual style we will try to answer these questions in simple terms that someone not overly familiar with electronics or electricity can understand. The following explanation is far removed from the usual technical explanations seen in electronic or electrical text books, but it is perfectly valid and was chosen so as to be simple for the non-technically minded to understand. Some very basic electrical knowledge and some very basic maths is assumed.
Firstly "RMS" stands for "Root Mean Square". The relavance of this term will become apparent later.
Two very simple formulae that you should understand first are:
V=IR (Volts = Amps * Resistance)
and
P=VI (Power = Volts * Amps)
(I is the standard letter used to denote current)
So let's start with a very simple DC circuit using some very simple numbers......
A DC power source of 10 volts feeding into a load resistor of 1 Ohm
The first formula V=IR can be rearranged to I=V/R so we can put these numbers in and we get I=10/1=10 therefore a current of 10 Amps will exist in the load.
Dead simple.
The second formula (P=VI) will show the total power consumed by the load. In this case we will get P=10*10 = 100 Watts. So the load resistor will dissipate 100 watts as heat. That is where all the power goes, into heat.
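As a quick sanity check, here is the same arithmetic written out as a few lines of Python (the variable names are simply ours for illustration):

```python
# The 10 Volt / 1 Ohm example from above.
V = 10.0   # supply voltage in Volts
R = 1.0    # load resistance in Ohms

I = V / R  # Ohm's law rearranged: I = V / R  -> 10.0 Amps
P = V * I  # power: P = V * I                 -> 100.0 Watts

print(I, "Amps,", P, "Watts")
```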
Now then, if we leave this switched on for one hour, then the load will have produced 100 Watts of heat continuously for 1 hour. That is, obviously, an average power output of 100 Watts for the period of 1 hour.
If we leave it switched on for 30 minutes, then switched off for 30 minutes, then it will have produced 100 Watts for the first 30 minutes and 0 Watts for the second 30 minutes. Quite clearly this is an average of 50 Watts over the one hour period.
The power supply of 10 volts was switched on for 30 minutes, then switched off for 30 minutes. So clearly, over the 1 hour period, the average voltage was half this, ie 5 volts (think about it).
So we know that 10 Volts (average) produced 100 watts (average) and 5 volts (average) produced 50 watts (average) into the same load.
But if we try to calculate this using our two formulae of I=V/R and P=VI we find that the answer is different!
Let's try it. We have an average of 5 volts so the current I=5V/1 Ohm=5 Amps. And using P=VI we therefore get P=5Volts*5Amps=25 Watts. Yet we know from what we just did above that the average power was 50 watts not 25 Watts.
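Here is that (incorrect) calculation as a short Python sketch, using the same numbers:

```python
# Using the average voltage of 5 V in the ordinary formulae (the method that fails).
R = 1.0
V_avg = 5.0        # 10 V for half the time, 0 V for the other half

I = V_avg / R      # 5.0 Amps
P = V_avg * I      # 25.0 Watts -- yet the real average power was 50 Watts
print(P, "Watts")
```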
Clearly something is very wrong!
The problem is this. Had we reduced the DC voltage from 10 Volts to 5 Volts then this calculation would work correctly. However we didn't reduce the voltage, we switched it on for half the time, and off for half the time. This apparently gives us an average voltage of 5 Volts.
Look at the formulae again.......
V=IR and P=VI. Rearrange V=IR to I=V/R and it becomes clear that P=V²/R. Therefore power is proportional to the square of the voltage. So if we double the voltage (ie multiply by 2), the power will be increased by 2², ie by 4 times.
Conversely, if we halve the voltage (ie divide by 2) the power will be divided by 2², ie by 4.
And this is what we found above when we reduced the voltage from 10 volts to 5 volts. The power reduced from 100 watts to 25 watts.
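A tiny loop makes the square law easy to see; the 20 Volt figure is just an extra value of our own thrown in for comparison:

```python
# Power is proportional to the square of the voltage: P = V^2 / R.
R = 1.0
for V in (5.0, 10.0, 20.0):
    print(V, "Volts ->", V ** 2 / R, "Watts")
# 5 V -> 25 W (a quarter of 100 W), 10 V -> 100 W, 20 V -> 400 W (four times 100 W)
```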
So why didn't this work when we switched the 10 Volt supply on for 30 minutes, then off for 30 minutes?
It's because the power is proportional to the square of the voltage as opposed to being proportional to the voltage directly.
So the power is clearly (somehow) related to the average of the voltage (as switching it on and off obviously affects the average voltage and also affects the average power output) and is also related to the square of the voltage as doubling the voltage produces 4 times the power.
Well, let's try this. First square the voltage, so in our first example we get 10², ie 100, for the first 30 minutes, then 0², ie 0, for the next 30 minutes. The average (strictly speaking the arithmetic mean) of these squared values is 50. But we squared the voltages to start with, so let's now take the square root of this: √50 = 7.071 Volts.
Firstly take a note of what we did to the voltage above. First we squared it, then we took the arithmetic mean, then we took the square root of the result. So we end up with the square Root, of the Mean of the Square. Hence RMS. Root Mean Square.
And we ended up with the very odd-looking figure of 7.071 Volts.
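The three steps (square, mean, root) can be written directly in Python; the two-element list below simply represents "10 Volts for half the time, 0 Volts for the other half":

```python
# Root Mean Square: square each value, take the mean, then take the square root.
voltages = [10.0, 0.0]                          # on for half the time, off for half

squared = [v ** 2 for v in voltages]            # Square -> [100.0, 0.0]
mean_of_squares = sum(squared) / len(squared)   # Mean   -> 50.0
rms = mean_of_squares ** 0.5                    # Root   -> 7.0710678...

print(rms, "Volts RMS")
```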
However, back to our formulae (remember that 7.071 Volts is the RMS voltage of the 10 Volt supply switched on for half of the time and off for half of the time):
V=IR and P=VI
Rearranging V=IR to I=V/R, we get I=7.071 Volts/1 Ohm = 7.071 Amps.
And from P=VI we get P=7.071 Volts*7.071 Amps= 50 Watts. So it worked fine.
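And checking that result in Python:

```python
# Put the RMS value back into the ordinary formulae.
R = 1.0
V_rms = 7.0710678   # RMS of the 10 V on/off waveform

I = V_rms / R       # about 7.071 Amps
P = V_rms * I       # about 50 Watts -- matching the known average power
print(round(P), "Watts")
```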
This is what RMS Voltage is and means. It is the true, effective voltage of a waveform that is not steady. You have seen above how it is distinctly different from the average voltage.
Now what possible relevance is this to us?
Well, as it happens, most voltmeters and multimeters measure and display the average voltage not the RMS voltage. This can cause serious problems when measuring waveforms that are not steady DC.
In our example of the 10 Volt supply switched on for half the time and off for half the time, the meter would clearly show 10 Volts for 30 minutes, then 0 Volts for the next 30 minutes. However with a quickly changing waveform where the voltage switches on and off at, say, 50Hz, the meter would show 5 Volts, which is the average voltage, not the true RMS voltage of 7.071 Volts.
The shape of the waveform greatly affects the difference between the average voltage and the RMS voltage, so there is no simple conversion between the two. The only exceptions to this are a pure sine wave, where the RMS voltage is 1.11 times the average voltage (strictly, the average of the rectified waveform), and a perfect square wave, where the RMS and average voltages are the same. With any other waveshape the difference could be literally any number you care to think of.
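Those two special cases are easy to check numerically. The sketch below samples one cycle of a pure sine wave and a perfect square wave (the sample count is arbitrary) and compares the RMS with the rectified average that an averaging meter responds to:

```python
import math

N = 100_000                                     # samples over one complete cycle
sine   = [math.sin(2 * math.pi * n / N) for n in range(N)]
square = [1.0 if n < N // 2 else -1.0 for n in range(N)]

def rms(samples):
    return (sum(v * v for v in samples) / len(samples)) ** 0.5

def rectified_average(samples):
    # Average-reading meters respond to the average of the rectified (all-positive) waveform.
    return sum(abs(v) for v in samples) / len(samples)

print(rms(sine) / rectified_average(sine))      # about 1.11 for a pure sine wave
print(rms(square) / rectified_average(square))  # exactly 1.0 for a perfect square wave
```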
Special meters are available that measure RMS voltage, and they usually state on them "True RMS reading" or similar words. If the meter does not show this (or something similar including the acronym RMS) then it is an average-reading meter, which will not give the true voltage reading on a changing waveform. Average-reading meters only give correct readings on pure sine waves and steady DC. In fact, when measuring an AC signal they actually measure the average voltage, and the scale is set up to display this multiplied by 1.11, so the figure displayed is only the correct RMS voltage for a pure sine wave.
One way to calculate the RMS voltage of a waveform is to draw a graph of the waveform on graph paper, divide the waveform up into tiny segments, square each segment, take the average of these squared segments over one complete cycle, then take the square root of the result. This is, in fact, how modern, microprocessor based, measuring instruments do it.
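Here is that procedure as a small function, applied to a triangle wave purely because its RMS is not obvious at a glance (the triangle wave is our own choice of example, not something from the discussion above):

```python
def rms_of_cycle(samples):
    # Square each segment, average over one complete cycle, then take the square root.
    return (sum(v * v for v in samples) / len(samples)) ** 0.5

N = 100_000
# One cycle of a triangle wave with a peak of 10 Volts.
triangle = [10.0 * (2 * abs(2 * (n / N) - 1) - 1) for n in range(N)]

print(rms_of_cycle(triangle))   # about 5.77 Volts, ie 10 / sqrt(3)
```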
And to calculate the true power in watts (as opposed to VA) a similar procedure is followed: divide the voltage waveform up into segments, calculate the current in the load and hence the power (volts times amps) for each segment, then average the power over one complete cycle. And again this is how modern, microprocessor based, instruments perform power calculations. This also has the advantage of automatically taking into account the Power Factor, thereby showing the true power as opposed to the apparent power.
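As a sketch of that idea for a purely resistive load (so the power factor is 1), using a 230 Volt RMS sine wave sampled over one cycle; with a reactive load you would use the measured current samples instead of V/R, which is how the power factor gets accounted for automatically:

```python
import math

R = 1.0                                    # resistive load in Ohms (our example value)
N = 100_000
peak = 230.0 * math.sqrt(2)                # a 230 V RMS sine wave peaks at about 325 V
v = [peak * math.sin(2 * math.pi * n / N) for n in range(N)]

# Instantaneous power for each segment, then the average over one complete cycle.
p = [vi * (vi / R) for vi in v]            # P = V * I, with I = V / R for a resistor
true_power = sum(p) / len(p)

print(round(true_power), "Watts")          # about 52900 W, ie 230 squared divided by 1
```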
The main relevance in our field is that of modified sine wave inverters, where a normal meter will give grossly inaccurate readings which will always be much lower than the true RMS voltage.
A typical 230 Volt modified sine wave inverter could show anything between 150 and 230 Volts on a normal meter (depending upon the exact waveform).
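To illustrate (with a simple stepped model and duty figures of our own choosing, not any particular inverter), here is roughly why an average-reading meter under-reads a modified sine wave that genuinely measures 230 Volts RMS:

```python
# Model the modified sine wave as +Vpk, 0, -Vpk, 0, where "duty" is the
# fraction of each half cycle spent at the peak.
def average_meter_reading(v_rms_true=230.0, duty=0.5):
    v_peak = v_rms_true / duty ** 0.5      # peak needed to give this true RMS
    rectified_average = v_peak * duty      # what an averaging meter actually responds to
    return rectified_average * 1.11        # the meter's scale assumes a pure sine wave

for duty in (0.35, 0.5, 0.7):
    reading = average_meter_reading(duty=duty)
    print(f"duty {duty:.2f}: true RMS 230 V, meter shows about {reading:.0f} V")
# Prints roughly 151 V, 180 V and 214 V -- all well below the true 230 V RMS.
```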
So next time someone says "What is this RMS thingy all about" you can tell them.