snip
The concepts being wrestled with here are "precision" and "accuracy".
Precision implies repeatable results to some number of decimal places,
plus or minus a stated uncertainty.
That is true.
Accuracy implies the correct answer in absolute terms.
Accuracy is related to how good (accurate) the data set is. For
example, "accurate within 3 meters" is not an absolute - it could
be dead on, or three meters off. Now if the phrase stated "accurate
to within 2.987654321 +/- .0000000001 meters" - that is precise - you
know you will always be within 2.987654321 +/- .0000000001 meters of
any mark, rather than somewhere within the accuracy range of
0 to 3 meters.
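
To make the distinction concrete, here is a quick Python sketch (my own
toy example, not taken from the post above or the IEEE page below). It
simulates repeated measurements of a known distance: the systematic bias
is what limits accuracy, and the random scatter is what limits precision.
The true distance, bias, and noise figures are made up for illustration.

# Separate accuracy (bias from the true value) from precision
# (spread of repeated readings). All numbers are invented.
import random
import statistics

TRUE_DISTANCE = 100.0   # meters, the "absolute" correct answer
BIAS = 2.5              # systematic offset -> limits accuracy
NOISE = 0.0001          # random scatter    -> limits precision

readings = [TRUE_DISTANCE + BIAS + random.gauss(0, NOISE)
            for _ in range(1000)]

mean_reading = statistics.mean(readings)
accuracy_error = mean_reading - TRUE_DISTANCE   # how far off, in absolute terms
precision_spread = statistics.stdev(readings)   # how repeatable the readings are

print(f"mean reading:     {mean_reading:.7f} m")
print(f"accuracy (bias):  {accuracy_error:+.7f} m")
print(f"precision (s.d.): {precision_spread:.7f} m")

A phrase like "accurate within 3 meters" only bounds the bias term; the
tiny +/- figure in the second phrase is a statement about the spread.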
snip
I work with precision measuring devices and find that these are slippery
concepts for most people. The sloppy day-to-day usage and the close
relationship between the two words do not make things any easier.
See:
http://www.ieee-uffc.org/freqcontrol...g/vigaccur.htm
for a nice intuitive explanation of the difference between accuracy and
precision.
Mark Browne