Resolution and accuracy

These two specifications are often misunderstood. Occasionally they are deliberately distorted.

Resolution is effectively the smallest change a meter can display. For instance, a 4 digit voltmeter displaying a 12 volt system battery voltage could display, as its smallest change, 0.01 volts. It cannot display anything smaller than this because there simply aren't enough digits available. Meters often offer more resolution than is useful: sometimes pointless for the intended purpose, sometimes pointless for the specified accuracy.
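As a rough illustration (a small Python sketch of our own; the function name is purely illustrative), a fixed-digit display simply quantises every reading to a multiple of its last-digit step:

    def display_reading(volts, step=0.01):
        """Quantise a voltage to the smallest step the display can show."""
        return round(volts / step) * step

    # A 4 digit meter on a 12 volt system resolves to 0.01 volts:
    print(f"{display_reading(12.3456):.2f}")   # 12.35
    # Anything finer than one step simply cannot be shown:
    print(f"{display_reading(12.3449):.2f}")   # 12.34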

All measurements contain some error. Accuracy (or error) is usually specified as a percentage of the reading. So a meter with a specified accuracy of 1% (an accuracy easily sufficient for monitoring deep cycle batteries), displaying 12.50 volts, means the actual voltage is within 1% of that 12.50 volts, i.e. somewhere between 12.375 and 12.625 volts.

On top of this, digital instruments add a further error: that of the last digit. A typical specification would state something like 1% +/-1 LSD, meaning the error is 1% of the reading plus or minus 1 Least Significant Digit, i.e. one count of the last displayed digit. For the 12.50 volt display above, that is 0.125 volts plus 0.01 volts, so the actual voltage could be anywhere between 12.365 and 12.635 volts.

That is a range of 0.27 volts. So in reality, the last digit (representing 0.01 volts) is rather pointless on a meter with 1% accuracy.
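To make the arithmetic concrete, here is a minimal Python sketch (our own illustration, not anything from a real meter's firmware) that turns a displayed reading and a "1% +/-1 LSD" specification into the band of possible actual voltages:

    def accuracy_band(reading, percent, lsd=0.01):
        """Band of actual voltages consistent with a displayed reading.

        Total error is the percentage error on the reading plus one
        count of the least significant digit (LSD).
        """
        error = reading * percent / 100 + lsd
        return reading - error, reading + error

    low, high = accuracy_band(12.50, percent=1.0)   # 1% +/-1 LSD
    print(f"{low:.3f} to {high:.3f} volts")         # 12.365 to 12.635 volts
    print(f"span {high - low:.2f} volts")           # span 0.27 volts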

It can, however, be used to show trends: an increase of 0.02 volts will be displayed, but whether that increase is from, say, 12.40 to 12.42 volts or from 12.63 to 12.65 volts is anyone's guess. In many cases simply seeing the trend is sufficient, and the absolute accuracy then matters much less.
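A quick sketch of why trends survive, assuming (as is often roughly true over short periods) that the error behaves as a steady offset; the 0.11 volt figure here is hypothetical, chosen to sit within the 1% +/-1 LSD band:

    offset = 0.11                            # unknown fixed error
    actual_before, actual_after = 12.40, 12.42

    shown_before = actual_before + offset    # meter shows 12.51
    shown_after = actual_after + offset      # meter shows 12.53

    # Both absolute readings are wrong, but the 0.02 volt rise is preserved:
    print(f"displayed rise: {shown_after - shown_before:.2f} volts")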

Many voltmeters intended for deep cycle battery bank monitoring claim an accuracy of 0.3% +/-1 LSD and display to 0.01 volts. This is an extremely accurate measurement. Even so, an actual battery voltage of 12.50 volts could display anywhere from roughly 12.45 to 12.55 volts (0.3% is 0.0375 volts, plus one count of the last digit). So even with this level of accuracy the last digit is actually rather pointless from a technical standpoint.
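Running the same accuracy_band sketch from above with the 0.3% figure shows this:

    low, high = accuracy_band(12.50, percent=0.3)   # 0.3% +/-1 LSD
    print(f"{low:.4f} to {high:.4f} volts")         # 12.4525 to 12.5475 volts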

SmartGauge achieves this same basic accuracy of 0.3%. However, in SmartGauge the last digit is rounded to the nearest 0.05 volts for display purposes only (in software revision r1.05 and later; prior versions displayed to 0.01 volts). This achieves two things. It prevents a totally pointless display with the last digit bouncing all over the place due to interference from other equipment and genuine small fluctuations in battery voltage caused by intermittent loads. It also makes the display far easier to read.
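Rounding for display is the same quantisation as in the earlier sketch, just with a coarser step (again our own illustration, not SmartGauge's actual code); the 0.1 volt step for 24 volt systems is covered below:

    # Same display_reading() as before, with a coarser display-only step:
    print(f"{display_reading(12.48, step=0.05):.2f}")   # 12.50 on a 12 volt system
    print(f"{display_reading(25.13, step=0.1):.1f}")    # 25.1 on a 24 volt system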

With modern digital circuits it is almost trivial to reach any desired level of accuracy. Meters with an accuracy of 0.001% are relatively simple to achieve, and much higher accuracy is common in laboratory and calibration instruments. But what is the point for monitoring battery voltage? It does nothing more than produce a continually changing, confusing display.

Displaying battery voltage to a resolution of 0.05 volts for 12 volt systems and to 0.1 volt for 24 volt systems gives the most easily readable display. Any higher resolution just gives meaningless information that is of no help to the user. We are clearly not alone in this viewpoint, as a quick look at many battery voltage monitors will show.

Internally, SmartGauge resolves and calculates to 0.01 volts with a basic accuracy of 0.3%. It simply does not display to that resolution, for the sake of readability.

 
