
Resolution: "The smallest incremental change that an instrument can reliably measure and display".

 

NOT (necessarily) the last digit on the display. As an example, the XY table of my milling machine has a readout which "indicates" down to 1 micron, and I was told by the salesman that this was the resolution. But as soon as you move the table you realise that, because of internal software, the displayed value actually jumps in steps of 5 microns.

So the resolution, the smallest incremental change the instrument can actually show, is 5 microns. This has nothing to do with the accuracy or repeatability of the machine: that's more like 100 microns, especially under load, because of the clearance in the bearings. So, actually, I'd rather it looked like the second display below. Much easier to read...

 

But with everything else held constant, it can display a 5-micron change, so that's the resolution.

[Image: digital display]
[Image: better digital display]
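To make the distinction concrete, here is a minimal Python sketch (my own illustration, assuming a 5-micron internal step; it is not the machine's actual software) of a readout whose digits go down to 1 micron but which can only ever show multiples of the step size. The step size, not the last digit, is the resolution.

```python
def displayed_position(true_position_um: float, step_um: float = 5.0) -> float:
    """Simulate a readout that quantizes to a fixed internal step.

    The display shows 1 um digits, but every value it can actually
    show is a multiple of step_um, so step_um is the resolution.
    """
    return round(true_position_um / step_um) * step_um

# Sweep the table smoothly; the displayed value jumps in 5 um steps.
for true_um in range(0, 21):  # 0, 1, 2, ... 20 um of true travel
    print(f"true: {true_um:5.1f} um  ->  display: {displayed_position(true_um):6.1f} um")
```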

Accuracy: "The degree to which the result of a measurement conforms to the correct value".

 

Accuracy is made up of the resolution, the repeatability and the linearity of the system. It is often quoted as a % figure that holds only down to a certain % of the range (e.g. 1% of range, down to 5% of the range), which means that below 5% it will be worse...
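As a worked example (the numbers are hypothetical, purely to show the arithmetic), take an instrument with a 1000 g range quoted at 1% of range, specified down to 5% of range: the error band is a constant +/-10 g everywhere, which is a very different fraction of the reading at the bottom of the range than at the top.

```python
def error_band_g(range_g: float, accuracy_pct: float) -> float:
    """Absolute error band implied by a '% of range' accuracy figure."""
    return range_g * accuracy_pct / 100.0

range_g, accuracy_pct = 1000.0, 1.0
band = error_band_g(range_g, accuracy_pct)  # +/-10 g at every reading
# 50 g is the 5%-of-range floor of the spec; 20 g is below it.
for reading_g in (500.0, 100.0, 50.0, 20.0):
    print(f"reading {reading_g:6.1f} g: +/-{band:.0f} g = "
          f"+/-{100 * band / reading_g:.0f}% of reading")
```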

 

This is important if you are buying an instrument with a much greater range than you need (some EvFs, for instance). One of the factors affecting the overall accuracy may be electronic noise. If this noise is constant throughout the range, it is a much greater percentage of the signal at low forces. You may therefore find that, for mice, you are trying to measure in this unspecified, low-accuracy region.

So always look at the range as well as the accuracy.

[Image: constant noise source]
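A quick sketch of the same point, assuming a constant noise floor of +/-2 g (again a made-up figure): the absolute noise never changes, but as a percentage of the signal it explodes at the low forces you would actually use with mice.

```python
noise_g = 2.0  # hypothetical constant electronic noise floor

# The same absolute noise is a far larger fraction of a small signal.
for force_g in (1000.0, 100.0, 10.0, 2.0):
    print(f"signal {force_g:7.1f} g: noise is {100 * noise_g / force_g:5.1f}% of signal")
```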