
Ask Dr. Math - Questions and Answers from our Archives
_____________________________________________

Fraction or Decimal?


Date: 06/23/98 at 15:49:36
From: Jessica Burton
Subject: Precision of fractions

Dr. Math,

Please settle a bet... 

In general, which is more precise, a fraction or a decimal (for 
instance, 1/3 vs. 0.33)?

Thanks,
Jessica


Date: 06/24/98 at 09:00:58
From: Doctor Jerry
Subject: Re: Precision of fractions

Hi Jessica,

I'm guessing that you are thinking of the decimal representation of 
fractions. For example, 1/4 = 0.25. In this case, both 1/4 and 0.25 
have equal precision. 

I wrote 1/4 = 0.25 because these two things represent exactly the same 
number. The fraction 1/3, however, is different: if you divide 1 by 3, 
you get 0.3333333..., and the threes never stop. If you divide 1 by 4, 
you get 0.25 and that's it. So, I can say that
1/3 = 0.3333.... (the dots mean that the 3 is repeated indefinitely)  
but 1/3 is not equal to 0.33. In this case, 1/3 is more precise. In 
fact, 1/3 - 0.33 = 1/3 - 33/100 = 1/300, which is the error you would 
commit if you were to use 0.33 in place of 1/3.
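
If you want to check this arithmetic on a computer, here is a minimal
sketch using Python's fractions module, which works with exact rational
numbers rather than rounded decimals:

    from fractions import Fraction

    # 1/3 - 0.33, computed as exact fractions
    error = Fraction(1, 3) - Fraction(33, 100)
    print(error)         # 1/300
    print(float(error))  # roughly 0.0033, the error from using 0.33 for 1/3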

- Doctor Jerry, The Math Forum
  http://mathforum.org/dr.math/   


Date: 06/24/98 at 11:55:03
From: Doctor Peterson
Subject: Re: Precision of fractions

Hi, Jessica -

This is a fascinating question, because it leads into some ideas worth 
thinking about.

My first answer is that fractions are unquestionably more precise, 
in at least two ways. First, any rational number can be exactly 
represented by a fraction (that's what a rational number is, in the 
first place), while most rational numbers can't be exactly represented 
by a decimal. 

Your example of 1/3 makes this very clear. It may take a huge 
numerator and denominator to represent some numbers, but even a simple 
little number like 1/3 can't be represented exactly by any number of 
decimal places (unless you use a notation to indicate a repeating 
decimal, in which case it is just as exact as a fraction). In fact, 
*most* rational numbers do not produce terminating decimals; only 
those whose denominators (in lowest terms) contain only factors of 2 
and 5 can be represented exactly by a finite decimal. So fractions mean 
exactly
what they say, while decimals are usually just approximations.
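
One way to test this claim for any particular fraction is to put it in
lowest terms and divide all the 2s and 5s out of the denominator; if
anything is left over, the decimal repeats. Here is a small Python
sketch of that idea (the function name is just for illustration):

    from fractions import Fraction

    def has_terminating_decimal(numerator, denominator):
        # Fraction() automatically reduces to lowest terms.
        d = Fraction(numerator, denominator).denominator
        for p in (2, 5):          # strip out every factor of 2 and 5
            while d % p == 0:
                d //= p
        return d == 1             # terminates only if nothing else is left

    print(has_terminating_decimal(1, 4))   # True:  1/4 = 0.25
    print(has_terminating_decimal(1, 3))   # False: 1/3 = 0.333...
    print(has_terminating_decimal(7, 40))  # True:  7/40 = 0.175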

Second, when you work with fractions, you don't lose any of that 
precision, as long as you are only adding, subtracting, multiplying, 
dividing, and taking (integer) powers. You have probably had the 
experience of doing a series of calculations on a calculator and 
finding that the answer was .99999998 when you expected 1.0; that's 
because a calculator can only store a limited number of decimal 
places, and calculations can increase the error caused by rounding 
until it becomes noticeable. With decimals, that is unavoidable, 
because you can never store all the digits; with fractions, it will 
only happen when the numerator or denominator gets too big to handle.
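
You can reproduce the same effect on a computer, which also stores a
decimal with only a limited number of digits. A minimal Python sketch,
contrasting rounded decimals with exact fractions:

    from fractions import Fraction

    # Ten tenths should be exactly 1, but the decimal version drifts.
    decimal_sum  = sum(0.1 for _ in range(10))
    fraction_sum = sum(Fraction(1, 10) for _ in range(10))

    print(decimal_sum)    # 0.9999999999999999 -- rounding error crept in
    print(fraction_sum)   # 1 -- exact, nothing was lost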

On the other hand, sometimes a number can be very precise, but not 
really accurate. How can that be? I can think of two cases where 
fractions are inaccurate. First, there is the mathematical problem of 
"real" numbers: not all numbers are rational. If you take the square 
root of 1/2, the result can't be represented by any fraction, so you 
would have to approximate it by some fraction, such as 29/41. Then 
your answer looks precise, but the precision is misleading, because it 
doesn't accurately represent the truth! In fact, since most real 
numbers are irrational, most numbers can't be represented accurately 
by a fraction!
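
To see how far off such a fraction is, you can compare it with a
decimal approximation of the square root. A rough Python sketch, using
the 29/41 mentioned above:

    import math
    from fractions import Fraction

    approx = Fraction(29, 41)        # a fraction standing in for sqrt(1/2)
    true_value = math.sqrt(0.5)      # itself only a decimal approximation

    print(float(approx))                    # 0.7073170731707317
    print(true_value)                       # 0.7071067811865476
    print(abs(float(approx) - true_value))  # about 0.0002 off, despite
                                            # looking perfectly precise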

Second, there is the scientific problem of "real" numbers: nothing we 
can measure in the real world is exact, so the precision of a fraction 
doesn't accurately represent our knowledge. If I measure something as 
1/2 inch, it may really be 1001/2000 inch. Again, the precision of my 
fraction is misleading. I don't really know that it is exactly 1/2 
inch; the fraction is just an approximation.

A benefit of decimals is that they provide an easy way to indicate how 
precise your measurement is. If I read the length off a ruler, I can 
say it's 0.5 inch; if I use a laser to measure it, I might say 0.50000 
inch, because I know that my measurement was no more than 0.000005 
away from the correct value. That way, the precision of my number 
reflects the accuracy of my measurement, and I am not implying more 
precision than I really have. To put it another way, decimals give me 
a way to control my level of precision, and in that way can be said to 
be more precise than fractions!
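
In a program, this amounts to choosing how many decimal places you
report. A tiny sketch of the idea, with a made-up measured value:

    measurement = 0.5000049       # what the instrument read (made up)

    print(f"{measurement:.1f}")   # 0.5     -- ruler-level precision
    print(f"{measurement:.5f}")   # 0.50000 -- laser-level precision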

To sum this up: Fractions are technically more precise, but either one 
is only as accurate as you make it; both can be used either as an 
approximation or as an exact value. Working with rational numbers, a 
decimal will usually be only an approximation; but with real numbers 
(in either sense), you usually can do no better than an approximation 
anyway.

You should have known you wouldn't get a simple yes or no answer that 
would settle your bet. You'll have to decide which of you is right, 
based on how you are defining precision!

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/   
    
Associated Topics:
Elementary Fractions
Elementary Measurement
Middle School Fractions
Middle School Measurement

Ask Dr. Math (TM)
© 1994-2013 The Math Forum
http://mathforum.org/dr.math/