This, to me, is the smallest incremental change that the instrument can detect and display. So if the digital readout can indicate up to 500gf and has 3 digits, its resolution is 1gf. If it has 4 digits, it can probably resolve 0.1gf. Probably?

Moving away from force measurement for a moment, here’s a fairly extreme example. It’s the digital readout from Topcat Metrology’s milling machine. It’s indicating the position of the table that holds the workpiece, in X and Y, relative to an origin. The units are mm, so you might think that it measures to 1 micron (0.001mm). And I was told exactly that by the sales rep. Very impressive.

No, it doesn’t. First of all, it steps up in fives, so the next increment from 25.470 is 25.475. So its resolution is 0.005mm, or 5 microns. If I turn the handle to move the table, I can see steps of 5 microns. But they are pretty irrelevant, because of the mechanical tolerances in the system. I’m not complaining, it’s a very good machine, but any number beyond the first decimal place is random as far as the real position of the table is concerned. I’d much rather it looked like this…

…because the accuracy and repeatability of the machine itself limit it to 0.1mm. That’s what it can detect and achieve when you are cutting a component.
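The gap between what the readout shows and what the machine can do is just rounding at two different step sizes. Here’s a minimal sketch of that idea, with an invented “true” table position (the positions and helper name are illustrative, not from the machine):

```python
def quantise(value_mm: float, step_mm: float) -> float:
    """Round a value to the nearest multiple of step_mm."""
    return round(value_mm / step_mm) * step_mm

true_position = 25.4721  # hypothetical true table position, in mm

# What the 5-micron readout displays vs what the 0.1mm machine can achieve
shown = quantise(true_position, 0.005)
meaningful = quantise(true_position, 0.1)

print(f"readout shows: {shown:.3f} mm")     # last digits look precise...
print(f"meaningful:    {meaningful:.1f} mm")  # ...but this is what counts
```

The readout happily prints three decimal places, but every digit past the first is below the machine’s accuracy, so it carries no information about where the table really is.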

So check that your force gauge really does count up in steps of 0.1gf. If it goes up in steps of 0.5gf, then that’s the resolution, and it wouldn’t be much use for mice, as every measurement would be rounded up or down… a lot!
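To see how much the step size matters, here’s a quick sketch comparing the same (made-up) readings on a gauge that counts in 0.1gf steps against one that counts in 0.5gf steps; the forces and helper name are illustrative only:

```python
def displayed(force_gf: float, step_gf: float) -> float:
    """What a gauge counting in step_gf increments would show."""
    return round(force_gf / step_gf) * step_gf

# Three hypothetical true forces, the sort of spread you might see
# between individual mouse buttons
for true_force in (58.12, 58.26, 58.38):
    fine = displayed(true_force, 0.1)    # 0.1gf-step gauge
    coarse = displayed(true_force, 0.5)  # 0.5gf-step gauge
    print(f"true {true_force:.2f}gf -> fine {fine:.1f}gf, coarse {coarse:.1f}gf")
```

The 0.5gf gauge collapses genuinely different forces onto the same reading, or splits near-identical ones across adjacent steps, which is exactly the rounding problem described above.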

And then check the rest of the numbers: a display increment of 0.1gf is of limited interest if the overall accuracy, or linearity, is 1gf.

Don’t be too impressed by the number of digits on the display!