In everyday language, as well as in industry, two frequently used terms are accuracy and precision. They are usually associated with either measurement or manufacturing: What is the accuracy of this measurement? How precise is your manufacturing? Our first thought tends to be about a level of quality, something related to the advanced equipment being used. I did some research on these two terms. Let’s feel the difference.
Accuracy – from Latin accūrātus (“done with care”), perfect passive participle of accūrō (“take care of”); from ad- (“to, towards, at”) + cūrō (“take care”), from cūra (“care”).
While
Measurement accuracy according to the VIM is defined as: closeness of agreement between a measured quantity value and a true quantity value of a measurand.
However, the mathematical representation of accuracy is rather understood as an error. Measurement error according to the VIM is: measured quantity value minus a reference quantity value.
ΔE = I − CTV
• ΔE – measurement error
• I – indicated/estimated value of the measured quantity
• CTV – conventional true value, used as the reference quantity value. The conventional true value is always represented by the highest available reference standard.
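The error formula above can be sketched in a few lines of code. This is a minimal illustration with hypothetical numbers: the indicated value comes from the instrument under test, and the conventional true value from a higher-level reference standard.

```python
def measurement_error(indicated: float, conventional_true: float) -> float:
    """Return the measurement error ΔE = I - CTV."""
    return indicated - conventional_true

# Hypothetical example: a caliper indicates 10.02 mm,
# while the reference gauge block is 10.00 mm.
delta_e = measurement_error(10.02, 10.00)
print(f"ΔE = {delta_e:+.3f} mm")
```

A positive ΔE means the instrument reads high relative to the reference; a negative ΔE means it reads low.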
Now, since we never know the true quantity value (no instrument tells us what the true value of a measurand is), the only solution is to replace it with the conventional true value. As mentioned above, the conventional true value is the value represented by the highest available level of standard instrument. We obtain a measurement error and use it as a numerical representation of accuracy (or inaccuracy), which gives us the feeling of “done with care”.
The right way of expressing the quality of a measuring instrument and of a measurement is through the estimate of the measured quantity, or the estimated error, together with its measurement uncertainty.
The estimated value can be expressed as a single measurement equal to the indicated value, or through a statistical representation of repeated measurement values, such as the arithmetic mean. A dispersion is also attributed to it, such as the standard deviation with its level of probability, which in our case is called measurement uncertainty.
Result of measurement:
Ī ± U (k = 2) (Estimation ± Expanded Uncertainty)
Δ E ± U (k = 2) (Error of measurement ± Expanded Uncertainty)
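The result forms above can be sketched as follows. This is a simplified illustration with hypothetical readings, assuming the expanded uncertainty U is taken as k = 2 times the standard uncertainty of the mean (a type A evaluation only; a real uncertainty budget would include further components).

```python
import statistics

# Hypothetical repeated indications of the same measurand (mm)
readings = [10.01, 10.03, 10.02, 10.00, 10.02]

mean = statistics.mean(readings)       # estimate Ī (arithmetic mean)
s = statistics.stdev(readings)         # sample standard deviation
u = s / len(readings) ** 0.5           # standard uncertainty of the mean
U = 2 * u                              # expanded uncertainty, k = 2

print(f"Result: {mean:.3f} ± {U:.3f} mm (k = 2)")
```

With k = 2 the interval corresponds to a coverage probability of roughly 95 % for a normal distribution.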
We can apply a similar analogy to the term:
Precision
Mid 18th century: from French précision or Latin praecisio(n-), from praecidere (“cut off”); see precise.
While
Measurement precision according to the VIM is defined as: closeness of agreement between indications or measured quantity values obtained by replicate measurements on the same or similar objects under specified conditions.
Here we want to know: if we repeat the measurement a few times, how often will we get the same indication? Since every measurement is affected by random and systematic errors, the only way to see where we stand in terms of precision is to compute an estimate, such as the arithmetic mean, and look at its dispersion, expressed e.g. through the standard deviation or, better, through the measurement uncertainty. To see how much we need to “cut off” to get where we want to be, we ask: how precise are you? We can answer that we have done it with high precision and there is no need for further “cutting”.
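The distinction can be made concrete by comparing two hypothetical instruments against the same reference: one with a small dispersion but a systematic offset (precise, not accurate), and one centred on the reference but with a wider spread (accurate on average, less precise). All numbers below are invented for illustration.

```python
import statistics

reference = 10.00  # conventional true value from a higher-level standard (mm)

# Instrument A: tight spread, but offset from the reference
a = [10.11, 10.12, 10.11, 10.12, 10.11]
# Instrument B: centred on the reference, but wider spread
b = [9.95, 10.06, 9.98, 10.04, 9.97]

for name, readings in (("A", a), ("B", b)):
    mean = statistics.mean(readings)
    print(f"{name}: error = {mean - reference:+.3f} mm, "
          f"spread (s) = {statistics.stdev(readings):.3f} mm")
```

Instrument A would “cut off” little in repeatability but carries a bias; instrument B has almost no bias but scatters more, i.e. high accuracy of the mean with lower precision.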
References:
(1) https://en.wiktionary.org/wiki/accurate#Synonyms
(2) VIM – International Vocabulary of Metrology – Basic and General Concepts and Associated Terms