Date: Dec 6, 2012 4:22 AM
Author: mueckenh@rz.fh-augsburg.de
Subject: Matheology § 172


Matheology § 172

Wallis in 1684 [?] accepted, without any great enthusiasm, the use of
Stevin's decimals. He still only considered finite decimal expansions
and realised that with these one can approximate numbers (which for
him were constructed from positive integers by addition, subtraction,
multiplication, division and taking nth roots) as closely as one
wishes. However, Wallis understood that there were proportions which
did not fall within this definition of number, such as those
associated with the area and circumference of a circle.

Real numbers became very much associated with magnitudes. No
definition was really thought necessary, and in fact mathematics was
considered the science of magnitudes. Euler, in his Complete
introduction to algebra (1771), wrote in the introduction:
"Mathematics, in general, is the science of quantity; or, the science
which investigates the means of measuring quantity." He also defined
the notion of quantity as that which can be continuously increased or
diminished, and thought of length, area, volume, mass, velocity, time,
etc. as different examples of quantity. All could be measured by
real numbers.

Cauchy, in Cours d'analyse (1821), did not worry too much about the
definition of the real numbers. He does say that a real number is the
limit of a sequence of rational numbers, but he is assuming here that
the real numbers are already known. Certainly this is not considered
by Cauchy to be a definition of a real number; rather it is simply a
statement of what he considers an "obvious" property. He says nothing
about the need for the sequence to be what we call today a Cauchy
sequence, yet this condition is necessary if one is to define
convergence of a sequence without assuming the existence of its limit.
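
(In modern notation, a gloss that is not part of the quoted article:
the usual definition of convergence refers to the limit L itself,

  \forall \varepsilon > 0 \;\exists N \;\forall n > N : \; |a_n - L| < \varepsilon ,

whereas the Cauchy criterion,

  \forall \varepsilon > 0 \;\exists N \;\forall m, n > N : \; |a_m - a_n| < \varepsilon ,

mentions no limit at all. Only the latter can be stated before the
real numbers are available, which is why it is needed if the reals are
to be defined as limits of rational sequences.)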

[J.J. O'Connor and E.F. Robertson: "The real numbers: Stevin to
Hilbert"]
http://www-history.mcs.st-and.ac.uk/HistTopics/Real_numbers_2.html

Regards, WM