On 31 May 2007 05:04:58 -0700, email@example.com wrote:
>This has been a very educational evening for me. I would still like to
>point out that no one has refuted my proof itself or pointed to
>specific error or logical over-stepping it may contain.
Using the standard definition of the real numbers together with the standard definition of an infinite decimal as a limit, it's easy to prove that .999... _is_ equal to 1, hence it's automatic that your proof is flawed.
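To make the limit definition concrete, here is a small sketch in Python using exact rational arithmetic (the function name `partial_sum` is mine, just for illustration). The n-th partial sum of .999... is 9/10 + 9/100 + ... + 9/10^n, and the gap between it and 1 is exactly 10^(-n), which can be made smaller than any positive number; that is precisely what it means for the limit to equal 1.

```python
from fractions import Fraction

def partial_sum(n):
    # s_n = sum of 9/10^k for k = 1..n, i.e. 0.9, 0.99, 0.999, ...
    return sum(Fraction(9, 10**k) for k in range(1, n + 1))

for n in (1, 2, 5, 10):
    # The gap 1 - s_n is exactly 1/10^n; it shrinks past any bound.
    print(n, 1 - partial_sum(n))
```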
You're not entitled to change the standard definitions. You are allowed to define a new number system of your own, but then call it something else so as not to conflict with the existing standards.
However, by not accepting limits as numbers, you lose a lot of mathematics. The beauty of limits is that, in most ways, they're just as good as numbers. They can be added, subtracted, multiplied and divided (with the usual restriction about dividing by 0). Moreover they adhere to all standard laws of numbers (associative laws, commutative laws, distributive laws, etc), and they interact transparently with the rational numbers. The gain in terms of completeness of the number system far outweighs any initial aversion.
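Those arithmetic laws can be checked numerically. A sketch (sequence names `a` and `b` are my own choices): take a_n = 1 - 10^(-n), the partial sums of .999..., which tends to 1, and b_n = 1/2 + 1/n, which tends to 1/2. Their termwise sum and product land as close as you like to 1 + 1/2 and 1 * 1/2 respectively, exactly as the limit laws predict.

```python
from fractions import Fraction

def a(n):
    # a_n = 1 - 10^(-n), i.e. 0.9, 0.99, 0.999, ...; limit 1
    return 1 - Fraction(1, 10**n)

def b(n):
    # b_n = 1/2 + 1/n; limit 1/2
    return Fraction(1, 2) + Fraction(1, n)

n = 1000
# Termwise sum is near 3/2, termwise product near 1/2,
# matching lim(a_n) + lim(b_n) and lim(a_n) * lim(b_n).
print(a(n) + b(n))
print(a(n) * b(n))
```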