Ask Dr. Math - Questions and Answers from our Archives

Was 1 Ever Considered to Be a Prime Number?

Date: 02/29/2004 at 01:17:19
From: Jim
Subject: 1 as a prime number

I learned that a prime number was one divisible by only itself and 1,
but my 4th grader says that per her book a prime requires 2 different
factors.  I note your Greek reference for 1 not being prime, which
would indicate that I'm wrong and there was no change in definition.  
However, Ray's New Higher Arithmetic (1880) states, "A prime number is
one that can be exactly divided by no other whole number but itself
and 1, as 1, 2, 3, 5, 7, 11, etc."  Can you tell me when this change
happened and why?



Date: 02/29/2004 at 17:39:00
From: Doctor Rob
Subject: Re: 1 as a prime number

Thanks for writing to Ask Dr. Math, Jim!

I believe the 1880 book you cited is wrong--1 has never been and will
never be considered a prime.
   
If you treated 1 as a prime, then the Fundamental Theorem of
Arithmetic, which describes unique factorization of numbers into
products of primes, would be false, or would have to be restated in
terms of "primes different from 1."  The same is true of many other
theorems of number theory and commutative algebra.  Rather than use
this phrase, it makes more sense to define primes so as not to
include 1.  Also, the multiplicative inverse of 1 (its reciprocal,
which is 1 itself) exists in the positive integers, which is true of
no other positive integer.  We call such numbers "units," and this
property sets the units apart from the primes.
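
To see the uniqueness failure concretely, here is a minimal sketch in
Python (the choice of 6 and the cap of four factors are just for
illustration):

    from itertools import combinations_with_replacement
    from math import prod

    def factorizations(n, primes, max_len):
        """Multisets drawn from `primes`, up to max_len factors,
        whose product is n."""
        return [c for k in range(1, max_len + 1)
                  for c in combinations_with_replacement(primes, k)
                  if prod(c) == n]

    # With the modern primes, 6 factors in exactly one way:
    print(factorizations(6, [2, 3, 5], 4))     # [(2, 3)]

    # If 1 counted as a prime, uniqueness would fail:
    # 6 = 2*3 = 1*2*3 = 1*1*2*3 = ...
    print(factorizations(6, [1, 2, 3, 5], 4))  # [(2, 3), (1, 2, 3), (1, 1, 2, 3)]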

Feel free to reply if I can help further with this question.

- Doctor Rob, The Math Forum
  http://mathforum.org/dr.math/ 



Date: 03/01/2004 at 09:18:44
From: Doctor Peterson
Subject: Re: 1 as a prime number

Hi, Jim.

I'm going to disagree slightly with what Dr. Rob told you: although
the definition of prime never SHOULD have included 1, and by the late
20th century it DIDN'T, this was not always recognized in the more
distant past.  This is discussed here:

    http://mathworld.wolfram.com/PrimeNumber.html 

  The number 1 is a special case which is considered neither prime
  nor composite (Wells 1986, p. 31). Although the number 1 used to
  be considered a prime (Goldbach 1742; Lehmer 1909; Lehmer 1914;
  Hardy and Wright 1979, p. 11; Gardner 1984, pp. 86-87; Sloane
  and Plouffe 1995, p. 33; Hardy 1999, p. 46), it requires special
  treatment in so many definitions and applications involving
  primes greater than or equal to 2 that it is usually placed into
  a class of its own. A good reason not to call 1 a prime number
  is that if 1 were prime, then the statement of the fundamental
  theorem of arithmetic would have to be modified since "in
  exactly one way" would be false because any n = n*1. In other
  words, unique factorization into a product of primes would fail
  if the primes included 1. A slightly less illuminating but
  mathematically correct reason is noted by Tietze (1965, p. 2),
  who states "Why is the number 1 made an exception? This is a
  problem that schoolboys often argue about, but since it is a
  question of definition, it is not arguable."

I'm assuming that the references from 1979 on, at least, are saying
that primes were formerly defined to include 1, rather than using that
definition themselves.  Texts are also not always careful about
definitions: your "divisible by only itself and 1" may well be
intended to imply that "itself" and "1" are different numbers, or the
question of whether 1 is a prime may simply not have been considered.
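
One way to see the difference between the two phrasings--the old
"divisible by only itself and 1" and your daughter's "exactly 2
different factors"--is to test them directly.  Here is a small Python
sketch (the function names are mine):

    def divisors(n):
        """Positive divisors of n."""
        return [d for d in range(1, n + 1) if n % d == 0]

    def prime_loose(n):
        """Old phrasing: divisible by no whole number except itself
        and 1.  For n = 1, "itself" and "1" coincide, so 1 passes."""
        return all(d in (1, n) for d in divisors(n))

    def prime_two_factors(n):
        """Schoolbook phrasing: exactly 2 different factors."""
        return len(divisors(n)) == 2

    print(prime_loose(1), prime_two_factors(1))   # True  False
    print(prime_loose(7), prime_two_factors(7))   # True  True
    print(prime_loose(9), prime_two_factors(9))   # False False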

Here is another discussion of this question that I found:

    http://mathforum.org/kb/message.jspa?messageID=1379707 

Read especially John Conway's contributions.  He points out that
mathematicians recognized the need to clarify the definition as
certain aspects of abstract algebra developed in the 1900's, giving
them a new perspective on the question, but that school texts, as
usual, were slow to adopt the corrected definition:

  The change gradually took place over this century [the 1900's],
  because it simplifies the statements of almost all theorems.
  If you count 1 as a prime, for example, numbers don't have
  unique factorizations into primes, because for example  6 = 1
  times 2 times 3 as well as 2 times 3.  It's a bit of a nuisance
  that Lehmer's 1914 "List of all prime numbers below 10 million"
  counts 1 as a prime.

  I think the development of number theory for other rings played
  a big part, because there one finds other "units" besides 1
  (for instance  +-1  and +-i in the Gaussian integers), and these
  units clearly behave in many ways that make them different from
  the primes.

  Other examples of the kind of thing that goes wrong if you count
  1 as a prime are arithmetical theorems like

    "If p,q,r,... are distinct primes, then the number of divisors
    of  p^a.q^b.r^c....   is   (a + 1)(b + 1)(c + 1)... ."

  Mathematicians this century [the 1900's] are generally much more
  careful about exceptional behavior of numbers like 0 and 1 than
  were their predecessors: we nowadays take care to adjust our
  statements so that our theorems are actually true.  It's easy to
  find lots of statements in 19th century books that are actually
  false with the definitions their authors used - one might well
  find the above one, for instance, in a work whose definitions
  allowed 1 to be a prime.  Nowadays, we no longer regard that as
  satisfactory.

  The changeover has been very gradual, and I'll bet there are
  publications from the last few years in which 1 is still counted
  as a prime--in other words, it's not yet complete.  In the
  1950s and 1960s, books that chose the new definition would
  always be careful to point out that they were doing so, and that
  most authors included 1 with the primes.

  The real thing that gets such a change accepted is when it gets
  into high-school textbooks.  I think that perhaps we must thank
  "the new math" movement, which for all its faults did get some
  of the terminology and conventions into the high schools that
  had hitherto only been used in the Universities.  School
  textbooks don't like to muddy the waters by explaining such
  things as variations in usage, so would tend to give just one
  definition.  My guess is that you'll find that schoolbooks of
  the 1950s defined primes so as to include 1, while those of the
  1970s explicitly excluded 1.

It sounds like your textbooks, and mine, might have used the old 
definition!
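
Conway's divisor-count example is easy to check for yourself.  Here is
a minimal Python sketch (n = 12 is just an arbitrary test case):

    def num_divisors(n):
        """Count the divisors of n directly."""
        return sum(1 for d in range(1, n + 1) if n % d == 0)

    def by_formula(exponents):
        """The theorem: for n = p^a * q^b * ..., the number of
        divisors is (a + 1)(b + 1)..."""
        result = 1
        for a in exponents:
            result *= a + 1
        return result

    # 12 = 2^2 * 3, so the theorem predicts (2+1)(1+1) = 6.  Correct:
    print(num_divisors(12), by_formula([2, 1]))   # 6 6

    # If 1 were a prime, 12 = 1 * 2^2 * 3 would also be a "prime
    # factorization," and the formula would wrongly give (1+1)(2+1)(1+1):
    print(by_formula([1, 2, 1]))                  # 12, not 6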

If you have any further questions, feel free to write back.

- Doctor Peterson, The Math Forum
  http://mathforum.org/dr.math/ 
Associated Topics:
High School History/Biography
Middle School History/Biography
Middle School Prime Numbers

© 1994-2013 The Math Forum
http://mathforum.org/dr.math/