Date: Apr 1, 2013 2:50 PM
Author: Jesse F. Hughes
Subject: Re: Mathematics and the Roots of Postmodern Thought

david petry <david_lawrence_petry@yahoo.com> writes:

> On Monday, April 1, 2013 5:01:04 AM UTC-7, Jesse F. Hughes wrote:
>
>> david petry <david_lawrence_petry@yahoo.com> writes:
>>
>> > Applied mathematicians know they have to produce something that is
>> > of use to the scientists, which does imply that they are taking
>> > falsifiability into consideration.
>
>> I still don't understand.
>
> That doesn't surprise me.
>
>> Can you give an example of some piece of mathematics that an applied
>> mathematician would choose to avoid, because it's not "falsifiable"?
>
> Cantorian set theory.


Aside from the fact that applied problems don't tend to require
infinite sets, what is your evidence that applied mathematicians avoid
Cantorian set theory because it is allegedly unfalsifiable?

>> And can you tell me whether the axioms of, say, the theory of real
>> numbers are falsifiable?
>
> I don't know what you are referring to by "the axioms of the theory
> of real numbers".


There are a number of different axiomatizations of R. Let's take the
first one Google provides:

https://en.wikipedia.org/wiki/Real_number#Axiomatic_approach

Are these axioms falsifiable? How do I tell?
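
For concreteness, that page characterizes R as a Dedekind-complete
ordered field, i.e. (sketching from the link above):

  1. (R, +, *, 0, 1) is a field;
  2. <= is a total order compatible with the operations:
     a <= b implies a + c <= b + c, and
     0 <= a and 0 <= b imply 0 <= a*b;
  3. (completeness) every nonempty subset of R with an upper bound
     has a least upper bound in R.

Axiom 3 quantifies over arbitrary subsets of R, so I'd especially
like to hear what observation could falsify *that* one.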

>> Of course, if the theory of real numbers is not falsifiable, it would
>> seem you have a problem, right? Don't applied mathematicians (and
>> scientists!) use that theory regularly?
>
> The real numbers can be developed in the context of falsifiability,
> which should be obvious since scientists use real numbers.


Can you show me a falsifiable set of axioms for R and some indication
that this is a set of axioms that applied mathematicians use (because
scientists wouldn't have it any other way)?

> The real numbers that scientists use are finite precision real
> numbers, which can be thought of as rational numbers together with
> an error estimate. The theory of infinite precision real numbers
> can be developed as the limiting case when the error goes to zero.


Oh? So real scientists do not believe in pi, but only in
approximations to pi? And also, real scientists do not use, oh,
sqrt(2)?

That is fascinating!
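
As far as I can tell, "a rational number together with an error
estimate" just describes interval arithmetic. Here is a minimal
sketch in Python; the representation and every name in it are my
own guesses at what you might mean, not anything you've specified:

from fractions import Fraction

class Approx:
    """A rational center with a rational error bound, i.e. the
    interval [center - err, center + err]."""

    def __init__(self, center, err=0):
        self.center = Fraction(center)
        self.err = Fraction(err)

    def __add__(self, other):
        # Error bounds add under addition.
        return Approx(self.center + other.center, self.err + other.err)

    def __mul__(self, other):
        # |xy - c1*c2| <= |c1|*e2 + |c2|*e1 + e1*e2.
        c1, c2 = self.center, other.center
        e1, e2 = self.err, other.err
        return Approx(c1 * c2, abs(c1) * e2 + abs(c2) * e1 + e1 * e2)

    def __repr__(self):
        return "%s +/- %s" % (self.center, self.err)

# There is no pi here, only a rational standing near where pi would be:
pi_ish = Approx(Fraction(355, 113), Fraction(1, 10**6))
print(pi_ish * pi_ish)

Note that nothing in that sketch ever produces pi or sqrt(2)
themselves. To recover them you need the limiting case where the
error goes to zero, and that limit is exactly the infinite-precision
object whose status is in dispute.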

Can you show me any published account (besides your own) which
indicates that scientists do what you say? Is there any scientist
ever who has actually published an exhortation to always treat pi as a
fiction and only use approximations to pi, because pi itself is part
of the unfalsifiable pseudoscience?

I'm eager to believe you, oh, golly I am. But it feels like you're
making it up.

--
"I said someday we'll love again then you'll know the score.
I've taught you everything I know and maybe even more.
'That's true,' she said. 'More than you ever will.'"
--"Just a Wave", Butch Hancock