
### Defining Relative Error

```
Date: 03/11/2004 at 11:48:45
From: Mark
Subject: Relative error v. Relative error?

Dear Dr. Math,

I am a Mathematics Editor at a small publishing company in NJ working
on a 9th grade standards-based book and am having trouble discerning
precisely what relative error is.  I have searched the Internet and
our own archives and have come up with two different definitions.

At most sites I have seen this definition:

Relative error is the absolute error divided by the "true"
measurement.

This is the definition with which I am the most familiar, and what I
felt relative error was.  Then I came across a different definition at
http://www.glencoe.com/sec/math/mac/mac01/course3/pdf/m_31107.pdf :

The relative error of a measurement is found by dividing the
greatest possible error by the measurement itself.

It strikes me that these are two very different formulas for finding
what seems like two very different types of relative error.  One is
more of a comparison of your error to the true value.  Meanwhile I'm
not sure what to make of the other.  Possibly it is a comparison
between the degree of error and the actual value.  Perhaps it should
be called the Greatest Possible Relative Error?

Any and all help discerning which definition is correct, and what
perhaps one might call the latter definition (if not the relative
error) would be greatly appreciated.

Thanks!

- Mark

```

```
Date: 03/12/2004 at 12:14:21
From: Doctor Peterson
Subject: Re: Relative error v. Relative error?

Hi, Mark.

As you suggest, they really are two different things entirely.

Your first definition, absolute error divided by true value, is a
measure of _accuracy_; it tells us how close a particular measurement
is to the correct value.  We can only determine this when we know the
true value.

Your second definition, "greatest possible error" divided by the
measured value, is a measure of _precision_, which depends only on
the measuring instrument itself, not on the actual value.  If we
assume the instrument is correctly calibrated, then it tells how far
off we might be due simply to the number of digits of accuracy or the
spacing of the marks.  If it is not calibrated, then precision is
irrelevant to accuracy, but still meaningful in its own right.

So what you've found is that the term "relative error" is used in two
very different (but easily confused) contexts.  We might call them
"actual relative error" and "possible relative error".

One of the references I gave before makes the same point:

Error in Measurements - Introduction to Chemical Sciences,
James A. Plambeck, Univ. of Alberta
http://www.ualberta.ca/~jplambec/che/p101/p01017.htm

The amount of error associated with a particular measurement may
be considered from the point of view of precision or the point
of view of accuracy.  The precision of a measurement expresses
the error, or deviation, of the measurement from the average of
a large number of measurements of the same quantity, while the
accuracy of a measured value expresses the deviation of the
measurement from the true value of the quantity.  Error is
considered from the point of view of accuracy when the true
value is known, but when the true value of a quantity is not
known precision must be used in place of accuracy.  It is
impossible to obtain accuracy if precision cannot be obtained,
but precision does not guarantee accuracy.  Any significant
systematic error (an error which, for some systematic or
determinate reason, influences the measurement in a known or
knowable way) may give results which are very precise--and
highly inaccurate.

Scientists often obtain the precision of a measurement not by
actually carrying out a large number of measurements but from
knowledge of the limitations of the apparatus used to carry out
the measurement procedure....these precisions can be obtained using
proper measuring techniques and are a measure of the deviation
expected in repetitive measurements.

So precision is what you can determine from the measurements alone
(how close they are to one another), or from the nature of the
instrument itself (how close a ruler's markings are, for example); in
this context, relative error indicates the spread of possible
measurements.  On the other hand, accuracy is based on the true value,
and in that context, relative error indicates how far the measurement
is from the true value.

Another source makes the same distinction using different terms,
"relative error" and "relative deviation":

Accuracy and Precision

http://king.prps.k12.ca.us/prhs/pasohigh/classes/Fairbank/public.www/homepage/physics/accpre.HTM

Accuracy is the degree to which a measurement agrees with an
accepted value for those measurements.  They can be evaluated in
absolute or relative terms.  The absolute error is the absolute
value of the difference between the accepted value and the
measurement.  This can be written as an equation as shown below.

Absolute error = |Observed - Accepted value|        Ea = |O - A|

This can be expressed as a percentage error also.  The percentage
error is the relative error.  It is expressed by the following
equation.

                 Absolute error                       Ea
Relative error = -------------- x 100%           Er = -- x 100
                 Accepted value                       A

Data can also be evaluated in terms of how many measurements
that are made in the same manner deviate from one another.  This
is known as precision and is evaluated in terms of absolute and
relative deviation.  Absolute deviation is the absolute value of
the difference between the mean or average value and the measured
value.  This is expressed below in the equation.

Absolute deviation = |Observed - Mean value|  Da = |O - M|

Another way to express the deviation or precision is as a
percentage.  This is the relative deviation and is expressed as
follows.

                     Average absolute dev               Da
Relative deviation = -------------------- x 100%   Dr = -- x 100
                         Mean value                     M

Note that this "relative deviation" is not quite your second
definition, because it deals with a set of actual measurements rather
than possible measurements; but it is closely related.  I would call
your second definition "relative uncertainty" if I had to give it its
own name.  Uncertainty is just a different type of "error".

And "relative uncertainty" turns out to be just what the following
nice glossary uses, so I'll close with this:

Definitions of Measurement Uncertainty Terms
http://www.physics.unc.edu/~deardorf/uncertainty/definitions.html

I won't quote from it, because you'll want to read through the whole
page to see both the careful definitions and the conflicts the author
found among different sources.

As you can see, all terms and concepts tend to be rather flexible,
adapting to different situations by changing their meaning slightly,
while retaining the essential concept.  That can be a little
confusing, but it is the way language works, even in math!  So a
textbook will never quite match real-world usage of these terms,
because you don't want to confuse kids with that messy reality.  A
little awareness of it may be good, however!

- Doctor Peterson, The Math Forum
http://mathforum.org/dr.math/
```
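For concreteness, the two senses of relative error that Doctor Peterson distinguishes can be sketched in Python. The function names and example numbers below are illustrative, not taken from the exchange above:

```python
def actual_relative_error(measured, true_value):
    """Accuracy: absolute error divided by the true value.
    Only computable when the true value is known."""
    return abs(measured - true_value) / true_value

def possible_relative_error(measured, smallest_division):
    """Precision: greatest possible error (half the smallest marked
    division of the instrument) divided by the measurement itself.
    Needs no knowledge of the true value."""
    greatest_possible_error = smallest_division / 2
    return greatest_possible_error / measured

# A length read as 12.4 cm on a ruler marked in 0.1 cm divisions,
# where the true length happens to be 12.5 cm:
print(actual_relative_error(12.4, 12.5))    # about 0.008 (0.8%)
print(possible_relative_error(12.4, 0.1))   # about 0.004 (0.4%)
```

Note that the first function needs the true value while the second needs only the instrument's resolution, which is exactly the accuracy/precision split described above.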
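The four formulas quoted from the Accuracy and Precision page (Ea, Er, Da, Dr) can likewise be sketched in Python. The function names are mine, and the sample data is invented to illustrate the "precise but inaccurate" case the Plambeck quote warns about:

```python
def absolute_error(observed, accepted):
    """Ea = |O - A|: accuracy in absolute terms."""
    return abs(observed - accepted)

def relative_error(observed, accepted):
    """Er = Ea / A x 100%: accuracy as a percentage."""
    return absolute_error(observed, accepted) / accepted * 100

def absolute_deviation(observed, mean):
    """Da = |O - M|: one measurement's deviation from the mean."""
    return abs(observed - mean)

def relative_deviation(observations):
    """Dr = (average Da) / M x 100%: precision as a percentage."""
    mean = sum(observations) / len(observations)
    avg_dev = sum(absolute_deviation(o, mean)
                  for o in observations) / len(observations)
    return avg_dev / mean * 100

# Three measurements that agree closely with one another (high
# precision) but sit well below an accepted value of 10.0 (low
# accuracy) -- precise yet inaccurate, e.g. due to systematic error:
data = [9.01, 9.02, 9.03]
print(relative_deviation(data))                     # small: high precision
print(relative_error(sum(data) / len(data), 10.0))  # large: low accuracy
```

Relative deviation needs only the measurements themselves, while relative error also needs the accepted value, which again mirrors the precision/accuracy distinction.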
Associated Topics:
High School Definitions
High School Statistics
Middle School Definitions
Middle School Statistics
Middle School Terms/Units of Measurement
