..... this morning he checked the same finger twice, one on each side of the finger, one side was 8.9 the other side was 9.2 .....
In science and engineering, when making precision measurements, they talk about
calibration. The subject is very esoteric but, in this case, very relevant. Additionally, although the [separate] Wikipedia page on
calibration curves only deals with analytical chemistry, calibration curves are used with a wide variety of instrumentation to improve the accuracy of a measurement.
What does this mean?
I recently bought a set of electronic scales. Say I had some very accurate test weights: I could gradually add weights to the scales and note down the results, then gradually remove the weights, again noting down the results. This would give me a calibration table for the instrument, and it might look something like this:
| Test Weight (oz) | % Error, weight increasing | % Error, weight decreasing |
| --- | --- | --- |
| 0 | 0 | 0 |
| 1 | -10 | 7 |
| 2 | -9 | -13 |
| 3 | -1 | 0 |
| 4 | 7 | -5 |
| 5 | 11 | -8 |
| 6 | -14 | 3 |
| 7 | -9 | 9 |
| 8 | 14 | -6 |
| 9 | 3 | -1 |
| 10 | 8 | -4 |
Full disclosure: a skilled statistician, physicist or engineer will tell you that particular curve is garbage. The reason is that, rather than calibrating a real instrument, I used a random number generator to get the errors. However, it does nicely illustrate the principle.
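For the curious, here is roughly how a table like the one above can be faked. This is a sketch, not what I literally ran: the ±15% error bound and the fixed seed are my assumptions for illustration, and a real calibration would record measured errors instead of random ones.

```python
import random

# Fake a calibration table: one random percentage error for the
# increasing pass and one for the decreasing pass at each test weight.
# The +/-15% bound and the seed are assumptions for illustration only.
random.seed(1)  # fixed seed so the run is repeatable

table = []
for weight in range(0, 11):  # 0..10 oz test weights
    # Zero weight reads zero by definition on these scales.
    err_up = 0 if weight == 0 else random.randint(-15, 15)
    err_down = 0 if weight == 0 else random.randint(-15, 15)
    table.append((weight, err_up, err_down))

for weight, up, down in table:
    print(f"{weight:>2} oz | {up:+4d}% | {down:+4d}%")
```

Running it prints an eleven-row table in the same shape as mine, with a different set of random errors each time you change the seed.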
The things to note are:
- Different test weights have different errors
- The error is different depending on whether you are adding or removing weights
As I say, the art of calibration is very esoteric and involves maths that will set your eyes rolling. With regard to a consumer-level BG meter, I believe the quoted accuracy is about 15%, but, as I hope you can see from my little demonstration, that quoted figure does not really tell the whole story.
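To show what a calibration curve is actually *for*: once you have one, you can use it in reverse, looking up the known error near a raw reading and interpolating to estimate the true value. The calibration points and the simple linear interpolation below are hypothetical, just to show the idea; real schemes are fancier (and would handle the increasing/decreasing difference too).

```python
# Hypothetical calibration points: (true weight in oz, measured error in oz).
# raw reading = true value + error, so corrected = raw - interpolated error.
cal_points = [(0, 0.0), (5, 0.55), (10, 0.92)]

def corrected(raw):
    """Linearly interpolate the error curve and subtract it from raw."""
    for (x0, e0), (x1, e1) in zip(cal_points, cal_points[1:]):
        if x0 <= raw <= x1:
            frac = (raw - x0) / (x1 - x0)
            return raw - (e0 + frac * (e1 - e0))
    return raw  # outside the calibrated range: no correction applied

print(corrected(7.5))  # a reading between two calibration points
```

The point is that the correction varies across the range, which is exactly why a single "±15%" figure hides so much.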
Edited the table because I stupidly copied the error at zero as 6%.