On Sunday, 2 March 2014 07:06:44 UTC+2, Dan Christensen wrote: > You don't know much about proofs, do you, John Gabriel?
Infinitely more than you do.
> Which of your "axiom(s) of arithmetic" did you apply to derive the following statement?
> 1. A magnitude is the idea of size or extent. We can tell whether two magnitudes are equal or not. If we can tell they are not equal, then we know which is smaller or bigger, but we can't tell how much bigger or smaller. This is called qualitative measurement (without numbers).
The axiom is self-explanatory. However, when you start asking me about what bigger or smaller means, and telling me these need to be defined, a red flag appears in my superior mind. It is called the moron flag.
Bigger and smaller are very well defined. They are in the dictionary. There is no need to redefine them. Now, since I am dealing with humans (I must be, because even a computer couldn't be as stupid as you), I apply my theory of learning called Inferential Suspension of Knowledge Acquisition. This is how AI algorithms should be designed. I don't need to tell you step by step how to determine what is bigger or smaller. I would tell a computer how to do that. But you are not a computer, are you? :-)
So, how do humans learn what these words mean? Well, they usually learn the concepts at an early age, unless they have Down syndrome and need to read a dictionary later in life to learn the meaning.
Human learning: I see two similar objects. What do I do next? I compare them. What do I deduce by comparing them? They are either the same (=) or different (≠).
Computer (mimic) algorithm:
Which decimal representation is bigger, 3.14159 or -1?
a. If the signs are different, the number with the minus sign is smaller, hence the other number is bigger. STOP.
b. If both signs are positive, first compare the number of digits before the decimal point: the number with more digits is bigger. If the digit counts are equal, compare the numbers digit by digit, starting from the left. At the first differing digit, the number with the larger digit is bigger.
c. If both signs are negative, compare as in (b), but reversed: the number with the larger magnitude is smaller.
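The steps above can be sketched in C. This is a minimal illustration, not anyone's official algorithm: it assumes the inputs are plain integer strings like "314" or "-1" (no decimal point, no leading zeros), and it makes explicit the digit-count check that the digit-by-digit scan needs before it can work.

```c
#include <string.h>

/* Sketch of steps a-c for decimal strings.
   Assumes well-formed integer strings such as "-1" or "314";
   fractional parts and leading zeros are left out to keep it short. */
int bigger(const char *x, const char *y) {
    int negx = (x[0] == '-'), negy = (y[0] == '-');
    if (negx != negy)            /* step a: signs differ */
        return negy;             /* the one with the minus sign is smaller */
    const char *px = x + negx, *py = y + negy;
    size_t lx = strlen(px), ly = strlen(py);
    /* more digits means larger magnitude; equal lengths reduce to a
       left-to-right digit scan, which strcmp performs on digit strings */
    int cmp = (lx != ly) ? (int)(lx > ly) - (int)(lx < ly)
                         : strcmp(px, py);
    cmp = (cmp > 0) - (cmp < 0);
    if (negx) cmp = -cmp;        /* step c: for negatives, larger magnitude is smaller */
    return cmp > 0;              /* 1 if x is bigger, 0 otherwise */
}
```

Note that strcmp gives the correct answer only because the strings have equal length and the ASCII order of '0'..'9' matches numeric order.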
In machine code, a similar process is accomplished using logic operators such as XOR, AND, NOT, OR, etc.
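The bit-level version can be sketched with exactly those operators: XOR to detect differing sign bits, shifts and AND to extract individual bits. This sketch assumes 32-bit two's-complement integers; in that representation step c above simplifies, because two's complement already orders same-sign numbers the same way their bit patterns do.

```c
#include <stdint.h>

/* Sketch: compare two 32-bit two's-complement integers using only
   bitwise operators, mirroring the decimal steps a-c above. */
int is_smaller(int32_t a, int32_t b) {
    uint32_t ua = (uint32_t)a, ub = (uint32_t)b;
    uint32_t sa = ua >> 31, sb = ub >> 31;   /* sign bits */
    if (sa ^ sb)           /* signs differ: the negative one is smaller */
        return (int)sa;    /* sa == 1 means a is negative, hence smaller */
    /* same sign: scan the remaining bits from the most significant end,
       the binary analogue of comparing decimal digits from the left */
    for (int i = 30; i >= 0; i--) {
        uint32_t ba = (ua >> i) & 1, bb = (ub >> i) & 1;
        if (ba ^ bb)
            return (int)bb; /* b holds the 1-bit at the first difference,
                               so b is bigger and a is smaller */
    }
    return 0;              /* equal */
}
```

Real CPUs do this in one subtract-and-check-flags instruction, of course; the loop is only there to make the digit-by-digit analogy visible.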
You can't show me even ONE example of how the Peano axioms define bigger or smaller. And no, you can't use anything related to the successor function, because it assumes the well ordering of the natural numbers. I have repeatedly told you this, but you seem incapable of comprehending it.
In a word, your websites are *junk*. I don't mean to berate you personally, but frankly, there is no other way to tell you.
As for me, I am inclined to create computers that think like humans, not humans that function like computers. :-)
Whatever I imagine is real because whatever I imagine is well defined.