Is accuracy binary?

I'm writing a thesis in a scientific area, and in a draft I wrote that it was "not possible to tell which set of data is more accurate", meaning that it was not possible to tell which was closer to the true values of the thing being measured.

One of my supervisors objected to this language, claiming that either one is accurate or one is not - that it's no more possible to be more or less accurate than it is to be more or less pregnant.

For the purposes of this thesis I will simply avoid this phrasing, but out of curiosity: is he right?


As is very often the case with words, accurate has different and conflicting senses. AHDEL has

accurate adj.

  1. Conforming exactly to fact; errorless.
  2. Deviating only slightly or within acceptable limits from a standard.

...

and Collins (at the same source):

  1. faithfully representing or describing the truth
  2. showing a negligible or permissible deviation from a standard

...

For accuracy, ODO has:

accuracy NOUN

1 [mass noun] The quality or state of being correct or precise.

‘we have confidence in the accuracy of the statistics’

1.1 technical The degree to which the result of a measurement, calculation, or specification conforms to the correct value or a standard.

Your supervisor is wrong to dismiss out of hand the second definitions listed here (which would be understood as the intended, gradable, sense in the context you give). However, if your institution has a style guide proscribing this sense, you should stick to its preferences. But don't think of that as a universal requirement; a university one, perhaps.


No, if you understood your supervisor correctly, your supervisor is incorrect. Whenever you measure anything, you obtain a measurement result and an uncertainty of that measurement result. The lower the uncertainty, the higher the quality of the measurement. For a technical treatment of measurement uncertainty (including consideration of terms such as accuracy, precision, systematic error, random error, true value, etc.), see http://www.bipm.org/en/publications/guides/gum.html.

For the sake of your question, I will use accuracy instead of uncertainty. Imagine two people shooting at similarly oriented targets from a fixed distance. If one of them hits the bullseye every time and the other misses the target completely every time, then, without a doubt, the first person is a more accurate shot than the second. Moreover, there will be people with skill levels anywhere between these two extremes.

The same considerations apply to measurements. One would expect sophisticated techniques employed by skilled researchers to come much closer to identifying "true values" ("bullseyes") than crude techniques employed by unsophisticated researchers. Compare atomic clocks to sundials.

Addendum: The target-shooting example is often used to explain the difference between accuracy and precision. Also see accuracy and precision on Wikipedia. Thanks to @DavidRicherby for prompting this addendum.
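
To put rough numbers on the idea that accuracy is a matter of degree, here is a minimal sketch in Python. The readings and the "true value" are entirely hypothetical; the point is only that closeness to the true value (accuracy) and spread of the readings (precision) can each be quantified, and one set of measurements can be more or less accurate than another.

```python
import statistics

# Hypothetical example: two sets of repeated measurements of a quantity
# whose true value is taken to be 100.0 (arbitrary units).
TRUE_VALUE = 100.0

readings_a = [99.8, 100.1, 99.9, 100.2, 100.0]   # clustered near the bullseye
readings_b = [103.5, 96.2, 105.1, 94.8, 102.9]   # scattered and further off

def accuracy_error(readings, true_value):
    """Closeness of the mean reading to the true value (smaller = more accurate)."""
    return abs(statistics.mean(readings) - true_value)

def precision_spread(readings):
    """Spread of the readings about their own mean (smaller = more precise)."""
    return statistics.stdev(readings)

for name, readings in [("A", readings_a), ("B", readings_b)]:
    print(f"Set {name}: accuracy error = {accuracy_error(readings, TRUE_VALUE):.3f}, "
          f"precision spread = {precision_spread(readings):.3f}")
```

Running this shows set A with a much smaller accuracy error than set B, which is exactly the graded comparison ("more accurate") that the question is about.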


In industry the concept of manufacturing tolerance is very important; see, for example, this discussion of the subject. However, different products require different tolerances.

For example, a 5mm bolt intended for use in a jet engine would typically be made to much tighter tolerances than one intended for holding flat-pack furniture together, because the demands on the furniture bolt are much lower. The diameter of the furniture bolt could easily be 2% out either way and still be perfectly adequate, but that level of accuracy in a jet engine could be highly dangerous.

Similarly, the length of a 3-metre rolled steel joist (RSJ) will be specified differently depending on the application. If it is intended to span a gap resting on masonry pillars it will be specified as having a strict minimum length of 3000mm (a tolerance of -0%) but a maximum length of, perhaps, 3060mm (a tolerance of +2%). If it is to fit between two steel columns it would have a maximum tolerance of +0% and a minimum tolerance of, say, -1%. A joist rolled for a steel stockholder, however, might have a tolerance of 2% or more either way.
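
To make the arithmetic concrete, here is a small sketch of an asymmetric tolerance check, using the hypothetical joist figures from the paragraph above. The same measured length can be acceptably accurate for one specification and out of tolerance for another.

```python
def within_tolerance(measured, nominal, lower_pct, upper_pct):
    """Check whether a measured dimension lies within an asymmetric
    percentage tolerance band around a nominal dimension."""
    low = nominal * (1 + lower_pct / 100.0)
    high = nominal * (1 + upper_pct / 100.0)
    return low <= measured <= high

nominal_length = 3000.0   # mm, nominal length of the joist
measured = 3015.0         # mm, a joist rolled 15 mm over nominal

# Spanning masonry pillars: -0% / +2% band -> acceptable
print(within_tolerance(measured, nominal_length, 0.0, 2.0))    # True

# Fitting between steel columns: -1% / +0% band -> too long
print(within_tolerance(measured, nominal_length, -1.0, 0.0))   # False
```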

The point is that what counts as accurate in manufacturing can be very different depending on the product and the application. What is very accurate in one context can be highly inaccurate in another.

The same thing applies, perhaps surprisingly, in financial reporting. Bank accounts have to be accurate to the smallest unit of currency (pence in the UK, cents in the US or the euro zone), but management and government reports can be, and frequently are, prepared in terms of thousands or even millions of the largest unit (pounds, dollars and euros, for example). This does not mean that the reports are incorrect, far from it; it just means that they are prepared to a lower degree of accuracy because that is what is appropriate to their purpose.

On a related issue, interest and tax calculations often have to be rounded up or down because the 'accurate' calculation results in fractions of the smallest unit of currency.
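
As a quick illustration of that rounding, here is a sketch using Python's decimal module. The balance and interest rate are hypothetical; the point is that the exact calculation produces a fraction of a penny, which then has to be rounded to the smallest unit of currency.

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical figures: an account balance and an annual interest rate.
balance = Decimal("1234.56")       # pounds
annual_rate = Decimal("0.0175")    # 1.75% per year

# The exact daily interest is a fraction of a penny...
daily_interest = balance * annual_rate / Decimal(365)
print(daily_interest)              # 0.05919... (a long fraction of a pound)

# ...so it is rounded to the smallest unit of currency (a penny).
rounded = daily_interest.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(rounded)                     # 0.06
```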

Accuracy is certainly not a binary term; if it were, we would never get anything done. The aim in all fields where accuracy is important should be to be as accurate as necessary, that is, to work within the appropriate tolerances.


An interesting question.

Your original quote had the adjective "accurate", not the noun "accuracy". English adjectives are either "gradable" or "non-gradable". Compare "cold" with "married": you can be more "cold" or less "cold", but you either are or are not "married". Some adjectives are, purely by convention, normally considered "non-gradable", such as "stationary".

So your supervisor's point is not that the noun "accuracy" is binary, but that the adjective "accurate" is non-gradable.

I have to admit that, to me as a native speaker, "more accurate" sounds fine.

For the noun "accuracy", the Oxford Shorter Dictionary gives the following definition.

accuracy noun 1. The state of being accurate; precision, correctness. 2. The degree of refinement in measurement or specification, as given by the extent of conformity with a standard or true value.

Of course, you are thinking, quite correctly, of the second definition of the noun. However, for the adjective, it says the following.

accurate adjective 1. Of a thing or person: exact or correct, as the result of care. 2. (Obsolete) Executed with care. 3. Of a thing: in exact conformity with a standard or with truth.

So, from those definitions, it sounds like a non-gradable adjective. Your supervisor, although s/he disagrees with you and me, might be able to claim some authority for the proposition that you cannot say "more accurate".

Since this is a supervisor, and there appears to be at least some difference of opinion, perhaps you should compromise by rewording it to "it is not possible to tell which set of data has a higher level of accuracy".