Date: Oct 4, 2017 1:58 AM
Author: plutonium.archimedes@gmail.com
Subject: We instantly recognize 200/3 = 66+2/3, why do we fail with 2/3 = .66(+2/3) Re: reason why 1=.99999....99(+9/9)

 So, we look back at Math History and see that Newton realized there was a huge gulf of error in numbers when a division process is carried out but not completed.

Newton saw that you have a remainder, and you must deal with it sufficiently before you can say you have a legitimate number.

Newton saw this:

______
3| 200 = 66 + 2/3

And Newton thus saw that

______
3| 2.00 = .66 was insufficient

Could Newton accept .66666...... as the answer?

No, for that is still the process of dividing.
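
(A minimal sketch, assuming Python -- not part of the post or of Newton's own notation: integer division returns a quotient and a remainder, and putting the remainder back over the divisor closes the division exactly, giving 200/3 = 66 + 2/3.)

from fractions import Fraction

q, r = divmod(200, 3)                          # quotient 66, remainder 2
print(q, r)                                    # 66 2
print(Fraction(200, 3) == q + Fraction(r, 3))  # True: 200/3 is exactly 66 + 2/3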

So, what Newton needed as a LOGICAL Ending of division is that INDEX of adding a tad bit on the end, even though it is rightwards of the decimal point.

He needed something as LOGICALLY SATISFYING as
______
3| 200 = 66 + 2/3

That is logically satisfying, a denouement of the division process, an end to the division process.

So, Newton needed an index to the right of the decimal point and he wrote it as a (2/3).

So, 2/3 in decimal representation is properly written as .6666....66(+2/3), where the index of (+2/3) sits at the infinity borderline.
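
(The infinity borderline and the terminal (+2/3) index are the post's own notions; what can be checked mechanically is the finite-cutoff version: cut the division 2/3 off after any n digits and a remainder is always left over, and adding that remainder back as a fraction restores 2/3 exactly. A minimal sketch, assuming Python; the helper name complete_quotient is made up here purely for illustration.)

from fractions import Fraction

def complete_quotient(num, den, n):
    """Truncate num/den after n decimal digits; return the digits, the
    truncation, and the leftover tail so that truncation + tail == num/den."""
    scaled_q, r = divmod(num * 10**n, den)
    truncation = Fraction(scaled_q, 10**n)   # e.g. 0.66...6 with n sixes
    tail = Fraction(r, den * 10**n)          # the piece the truncation misses
    return scaled_q, truncation, tail

for n in (2, 5, 10):
    digits, t, tail = complete_quotient(2, 3, n)
    assert t + tail == Fraction(2, 3)        # exact at every finite cutoff
    print(f"0.{digits:0{n}d} + {tail} = 2/3")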

Now, where in math history did things go wrong, so that people thought 1/3 was .3333...., 2/3 was .66666....., and 1 was .99999.....? Where did things go wrong?

Well, a bit of history says that Descartes was the first to call the Reals "Reals," to distinguish them from the imaginaries. Imaginaries were what Cardano and others used to solve polynomials and quartic equations.

Descartes was 1596-1650, and Newton was 1642-1726, and Euler was 1707-1783, and Cauchy was 1789-1857, and Dedekind was 1831-1916, and Cantor was 1845-1918.

Now somewhere in that chain of the history of mathematics, the idea that 1 = .99999..... arose, and perhaps it was never fiercely debated. But we do know that by the 1990s the idea 1 = .99999.... was fiercely debated on the internet and is still hotly contested.

If Newton's Compleat Quotient had been developed, would the issue of 1=.9999.... never have arisen?

I firmly suspect so.

And, further, I believe that if someone had developed Newton's Compleat Quotient, then long before AP showed up, someone would have recognized that 1*10^604 was the Infinity Borderline.

So, where did math go astray from Newton to AP? By the time of Cantor, we can well see that 1=.999.... was well entrenched as math fakery. For if Cantor had realized that 1 = .9999...99(+9/9), it would have completely ruined, spoiled, his diagonal method. How are you going to alter an index? (Another aside: the diagonal method falls to pieces by All Possible Digit Arrangements.)

We can see that Cauchy's limit was just a screwball band-aid on .9999....