> > "bassam king karzeddin" <firstname.lastname@example.org> wrote in message
> > news:22019165.1180601903492.JavaMail.jakarta@nitrogen.mathforum.org...
> > > Re: What is wrong between decimal and fraction?
> > > Posted: May 28, 2007 6:01 PM
> > >
> > > Dear All
> >
> > Mr King.
> >
> > [...]
> > > Any positive real number (except one) is a unique product of prime
> > > numbers with each prime raised to a non-zero integer, and therefore
> > > has a unique decimal representation
> >
> > Factorisation is good.
> >
> > > Hence, the irrational numbers are all those numbers that have an
> > > endless decimal digital expansion in any number system, provided that
> > > their terminating digits are not all zero
> >
> > Why?
>
> From the early definition of the rational numbers, we can simply extend
> their concept, but with infinite integers, so the real number definition
> becomes a ratio of two finite or infinite coprime integers
>
> And this definition doesn't count (zero, one, infinity) as real numbers
> except by CONVENTION
>
> > > From this you can see now why (0.999...) is an irrational number even
> > > if we don't know its prime factorization, and therefore it can't be
> > > equal to one
> > >
> > > [...]
> >
> > I'm not sure what this is, but it's not a sound proof.
>
> In my opinion, the proof is straightforward from the definition only
>
> > --
> > Glen
>
> Regards
> B.Karzeddin
First, consider a number (N) with a finite number of digits, say (M), all of them nines (the string 999...9). By the unique factorization theorem, this number, like any positive integer, can be factored into primes in only one way
Second, add one to the previous number (N), that is (999...9 + 1); you get another number (N+1) with (M+1) digits and a COMPLETELY different prime factorization
Third, repeat the process for the integer (N) of the string (999...9) with (M+1) digits; this is the induction step of the proof, and you will find it always applies, so
You will always get two sets of prime factors, for (N) and for (N+1), which can never be considered equal,
BECAUSE ALL THEIR PRIME FACTORS ARE DIFFERENT,
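The disjoint-factors claim in the steps above is easy to check numerically for small M. A minimal Python sketch (the `factorize` helper is my own trial-division routine, not anything from the post; consecutive integers are always coprime, so their prime factor sets never overlap):

```python
from math import gcd

def factorize(n):
    """Return the prime factorization of n as a {prime: exponent} dict,
    found by simple trial division."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

for M in range(1, 6):
    N = 10**M - 1                      # the all-nines number 9, 99, 999, ...
    fN, fN1 = factorize(N), factorize(N + 1)
    # N and N+1 share no prime factor, since gcd(N, N+1) = 1
    print(M, fN, fN1, gcd(N, N + 1))
```

For example, 999 = 3^3 * 37 while 1000 = 2^3 * 5^3, with no prime in common.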
therefore their quotient (N/(N+1)) cannot equal one except by consideration of a limit, or by convention, as most of you have already learnt
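The finite-M part of this step can be illustrated with exact rational arithmetic; a short sketch using Python's `fractions.Fraction` (it shows only that each finite quotient falls short of one, which is the uncontested part of the argument):

```python
from fractions import Fraction

# For each finite M, the quotient N/(N+1) = (10^M - 1)/10^M is strictly
# less than 1; equality holds only in the limit as M grows without bound.
for M in (1, 3, 6, 12):
    N = 10**M - 1
    r = Fraction(N, N + 1)   # already in lowest terms: N and N+1 are coprime
    print(f"M={M}: {r} = {float(r)}")
```

Because the numerator and denominator are coprime, `Fraction` cannot reduce the quotient any further, so no finite term of the sequence equals 1.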
Isn't this a rigorous proof for such a SILLY problem?