In article <firstname.lastname@example.org>, <email@example.com> wrote:
>> If you have a different definition of 0.999... from the usual one, it's
>> not surprising that you get different answers.
>I would just like to point out that I am not attempting to say that
>0.999... defined as a limit is not equal to 1. What I am amazed by is
>the degree to which the entity I am expressing, that is in no way a
>limit, cannot be accepted and understood in its own terms by others.
>They refuse to allow me the use of a decimal representation of
>0.999... unless I will strictly adhere to another mathematical
>convention, no matter how much I attempt to disown that convention and
>state that my proof is not trying to contradict the truths within the
>realm of that convention. It gets somewhat tiresome...
See what I said in the quoted paragraph above. You can make up whatever definition you like of 0.999..., but if it isn't the one used in mathematics then what's the point? You can disown the convention, but it still *is* the convention.
Do you agree that given the standard meaning of 0.999..., it's equal to 1?
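(For reference, the standard meaning is the limit of the partial sums of
a geometric series. A quick sketch of the textbook computation, in LaTeX
notation:

  0.999\ldots = \lim_{n\to\infty} \sum_{k=1}^{n} \frac{9}{10^k}
              = \lim_{n\to\infty} \left( 1 - 10^{-n} \right) = 1

since the nth partial sum of the series is exactly 1 - 10^{-n}, which
tends to 1. That's all the convention says; there's nothing more exotic
hiding in it.)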
--
Richard
-- "Consideration shall be given to the need for as many as 32
characters in some alphabets" - X3.4, 1963.