Pretty obvious, you might think. Tells you the maximum that the instrument can measure, in this case force. Except that…
- Some instruments have multiple ranges (100gf, 50gf, and 20gf). If you only ever want to measure to 20gf, then beware of an instrument which goes up to 100gf, because the additional ranges are usually achieved by varying the electronic amplification, NOT by changing the way the basic transducer works. More amplification means more noise, and any other undesirable effects, such as non-linearity and dynamic errors, will also be amplified. It’s much better to buy an instrument designed for the force range you want to use. Think what happens when you zoom in with your phone to take a picture of something small and far away: the quality goes right down.
Ask yourself what range you really need. The instrument bought for a rat is not going to work very well with a mouse.
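A rough sketch of the amplification point above, using made-up numbers: if the electronic noise floor is a fixed fraction of full-scale output, then the same force read on a bigger range carries proportionally more noise. The 0.1%-of-full-scale noise figure here is purely illustrative, not from any real datasheet.

```python
# Hypothetical illustration: measuring the same 20 gf force on two ranges.
# Assumption: noise is a fixed fraction of full-scale output, so a bigger
# range means a bigger absolute noise floor for the same reading.

def noise_as_percent_of_reading(reading_gf, full_scale_gf, noise_pct_fs=0.1):
    """Noise floor expressed as a percentage of the actual reading."""
    noise_gf = full_scale_gf * noise_pct_fs / 100.0  # absolute noise in gf
    return 100.0 * noise_gf / reading_gf

print(noise_as_percent_of_reading(20, 100))  # 0.5 % of reading on a 100 gf range
print(noise_as_percent_of_reading(20, 20))   # 0.1 % of reading on a 20 gf range
```

Five times the range, five times the noise relative to the reading: the same penalty as the digital zoom on your phone.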
Definition: “The degree to which the result of a measurement conforms to the correct value”.
Here’s a hypothetical graph from the calibration of a “perfectly accurate” force measurement system with the true force on the X axis and the indicated value on the Y. There is no difference between the true reading and the indicated reading at any of the six calibration points (seven if you include zero). The calibration was probably achieved by clamping the instrument in series with another, higher grade force transducer (5-10 times “better”), applying a series of forces and then recording and comparing the two outputs. Or masses could be attached to the instrument to provide a series of known forces. Either way, it’s the manufacturer’s quality control check.
A high grade calibration would usually be done three times, to check the repeatability of the rig. For an even higher grade, the force transducer would be removed and replaced before the third run, to measure the reproducibility as well. Either way, the scatter should be small as these are very controlled conditions.
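The repeatability/reproducibility distinction above can be quantified as simple scatter statistics across runs. The readings below are invented for illustration; the point is only that refitting the transducer between runs typically adds variation on top of the rig's own repeatability.

```python
# Sketch with made-up data: scatter at one calibration point (true force 60 gf).
# Repeatability: three back-to-back runs with the rig left untouched.
# Reproducibility: the transducer is removed and refitted between runs.
import statistics

repeatability_runs = [60.02, 59.98, 60.01]    # rig untouched between runs
reproducibility_runs = [60.02, 59.95, 60.08]  # transducer refitted between runs

print(statistics.stdev(repeatability_runs))    # small scatter
print(statistics.stdev(reproducibility_runs))  # larger: refitting adds variation
```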
Here’s a calibration for a force transducer with a non-linear response in the form of an “S” curve. (Transducers often do have characteristics like this, but I’ve exaggerated to make it easier to see.) In the absence of any other errors, the accuracy of this device could be quoted as the maximum deviation from the theoretical straight line. At small forces, this would of course be a large percentage of the force being measured, so a lower limit is often stated: “Accuracy ±1% of reading, down to 5% of range”.
Think carefully about any measurements made in that bottom 5%.
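To see why that bottom 5% deserves caution, here is one common way to read a spec like “±1% of reading, down to 5% of range”: below 5% of range the error band stops shrinking with the reading, so the relative error grows. This interpretation and the numbers are assumptions for illustration, not taken from any particular datasheet.

```python
# Sketch: worst-case error band implied by a percent-of-reading spec
# with a floor at a percentage of range (interpretation assumed).

def error_band_gf(reading_gf, range_gf, pct_of_reading=1.0, floor_pct_of_range=5.0):
    """Worst-case error (in gf) for a given reading on a given range."""
    floor_gf = range_gf * floor_pct_of_range / 100.0  # 5 gf on a 100 gf range
    return max(reading_gf, floor_gf) * pct_of_reading / 100.0

# On a 100 gf range:
print(error_band_gf(50, 100))  # 0.5 gf, i.e. 1 % of the reading
print(error_band_gf(2, 100))   # 0.05 gf, i.e. 2.5 % of this small reading
```

At 2 gf the band is the same absolute size as at 5 gf, so as a fraction of the reading it has more than doubled.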