Date: Apr 1, 2013 3:42 PM
Author: fom
Subject: Re: Mathematics and the Roots of Postmodern Thought

On 4/1/2013 12:18 PM, david petry wrote:
> On Monday, April 1, 2013 5:01:04 AM UTC-7, Jesse F. Hughes wrote:

>> david petry <> writes:
>>> Applied mathematicians know they have to produce something that is
>>> of use to the scientists, which does imply that they are taking
>>> falsifiability into consideration.


>> I still don't understand.
> That doesn't surprise me.

>> Can you give an example of some piece of mathematics that an applied
>> mathematician would choose to avoid, because it's not "falsifiable"?

> Cantorian set theory.

>> And can you tell me whether the axioms of, say, the theory of real
>> numbers are falsifiable?

> I don't know what you are referring to by "the axioms of the theory
> of real numbers".

Well, then.

That presents a problem. Does it not? You do not even
know what it is that you are criticizing.

>> Of course, if the theory of real numbers is not falsifiable, it would
>> seem you have a problem, right? Don't applied mathematicians (and
>> scientists!) use that theory regularly?

> The real numbers can be developed in the context of falsifiability,
> which should be obvious since scientists use real numbers.

On your account, they technically would not be using real numbers,
since every logical construction or definition of an "arithmetic
of real numbers" that does not rest on an unsubstantiated account
would involve an infinity.

> The real numbers that scientists use are finite precision real numbers,
> which can be thought of as rational numbers together with an error

> The theory of infinite precision real numbers can be developed as
> the limiting case when the error goes to zero.
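That "rational number together with an error" picture can be sketched as a toy interval arithmetic. This is my own illustration, not anything from your post: an `Approx` pair of a rational value and a rational error bound, with the usual bounds for how errors propagate under addition and multiplication.

```python
from fractions import Fraction

class Approx:
    """A finite-precision 'real': a rational value with a rational error bound."""

    def __init__(self, value, error):
        self.value = Fraction(value)
        self.error = Fraction(error)

    def __add__(self, other):
        # Error bounds simply add under addition.
        return Approx(self.value + other.value, self.error + other.error)

    def __mul__(self, other):
        # Standard product bound: |a|*e2 + |b|*e1 + e1*e2.
        err = (abs(self.value) * other.error
               + abs(other.value) * self.error
               + self.error * other.error)
        return Approx(self.value * other.value, err)

    def __repr__(self):
        return f"{self.value} (+/- {self.error})"

# A rational approximation to sqrt(2), accurate to within 10^-8:
x = Approx(Fraction(141421356, 10**8), Fraction(1, 10**8))
sq = x * x  # value is near 2; the error bound shrinks with the input error
```

"Infinite precision" then corresponds to letting the `error` field shrink past every positive rational, which is exactly where the question of what carries the limit arises.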

And, the definition that resolved criticisms of the "infinity"
required for the geometric representation of that limiting case
led to questions concerning continuity. This, in turn, led to
the set theory of Dedekind and Cantor that you find so reprehensible.

Here is a modern resolution not involving set theory:

You can find the discussion of how one cannot accept
a geometrically-based arithmetic in which

x^2 = 0

on page 2 (14 pages into the PDF).

So, which part of history shall we discard?

Shall it be Descartes' assumption that numbers
may be placed as coordinates on axes to label
the geometric points?

Or, shall it be Vieta's uniform treatment of
arithmetic monads and geometric magnitudes in
his development of algebra?

What does it mean for the error to "go to zero"
in your statement?

I have already been called a "parrot" and a
"deplorable slave" by your partner in criticism
for pointing out that the standard use of the
sign of equality involves the presumption of
identity.

It is clearly seen in comparison with the
choice above. My claim is that Leibniz' original
formulation of the principle of identity of
indiscernibles had been geometric rather than
"logical". This is seen in the statement:

> "What St. Thomas affirms on this point
> about angels or intelligences ('that
> here every individual is a lowest
> species') is true of all substances,
> provided one takes the specific
> difference in the way that geometers
> take it with regard to their figures."
> Leibniz

And, in contrast to the logicist position
on the sign of equality, I hold that this
is best compared with Cantor:

> "If m_1, m_2, ..., m_v, ... is any
> countable infinite set of elements
> of [the linear point manifold] M of
> such a nature that [for closed
> intervals given by a positive
> distance]:
> lim [m_(v+u), m_v] = 0 for v=oo
> then there is always one and only one
> element m of M such that
> lim [m, m_v] = 0 for v=oo"
> Cantor to Dedekind
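Cantor's condition can be illustrated numerically. Below is my own sketch, using the decimal truncations of sqrt(2) as a stand-in for one of his fundamental sequences: the mutual distances [m_(v+u), m_v] shrink toward zero, and the sequence determines one and only one element m, here sqrt(2).

```python
from fractions import Fraction
import math

def m(v):
    """The v-digit decimal truncation of sqrt(2): a Cauchy sequence of rationals."""
    return Fraction(math.isqrt(2 * 10 ** (2 * v)), 10 ** v)

# Cantor's hypothesis: distances between successive terms shrink toward 0.
gaps = [abs(m(v + 1) - m(v)) for v in range(1, 8)]

# His conclusion: exactly one element m of the continuum (here sqrt(2))
# satisfies lim [m, m_v] = 0.
dists = [abs(math.sqrt(2) - float(m(v))) for v in range(1, 8)]
```

The point of contention is the conclusion, not the hypothesis: the distances are finitary facts about rationals, but the unique limit element m is what requires the completed continuum.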

So, what do you mean when you say "go to zero"?

Is "bad arithmetic" scientific?

Or is "completed infinity" scientific?

To which version of "go to zero" would you insist
that "stupid mathematicians" and "elite philosophers"
make an ontological commitment?