Date: Oct 5, 2017 3:24 AM
Subject: Newton called it Compleat Quotient, I called it the suffix, or remainder suffix Re: We instantly recognize 200/3 = 66+2/3, why do we fail with 2/3 written as .6666(+2/3)
Alright, I am delighted that Newton stumbled upon this idea of a remainder in a number. He called it a Compleat Quotient, which is highly appropriate. Of course Newton needed it for series work.
I need it because the Reals of Old Math are shoddily represented, getting people all fouled up with certain numbers like .33333333...... or .9999999......
Here is a post of last August where I called it the suffix. I think I will end up calling it the remainder suffix, or I just may use Newton's Compleat Quotient.
---- last August post ----
Date: Sat, 19 Aug 2017 16:13:57 -0700 (PDT)
Subject: Greatest mistake missed in math Decimal Number Representation, and why .9999... =1 was math professors downfall
From: Archimedes Plutonium <plutonium....@gmail.com>
Injection-Date: Sat, 19 Aug 2017 23:13:58 +0000
Now when I came to sci.math in August of 1993, one topic that seemed to come up endlessly was the issue of .9999.... = 1, where math professors the world over adamantly believed it and forced its acceptance on all who wanted to pass mathematics.
These goofballs of math logic even came up with silly proofs, such as this::
N = .9999....
multiply both sides by 10
10N = 9.99999.....
subtract N on both sides
10N - N = 9.99999.... - N
Replace N with N= .99999.... on the right side
10N - N = 9.99999.... - .99999.....
10N - N = 9
9N = 9
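The subtraction steps above can be checked with exact arithmetic on a finite truncation. This is a sketch of mine using Python's Fraction; the truncation length of 5 nines is my own illustrative choice, not anything from the argument itself:

```python
from fractions import Fraction

def nines(k):
    """The finite truncation 0.99...9 with k nines, as an exact fraction."""
    return Fraction(10**k - 1, 10**k)

N = nines(5)          # 0.99999
lhs = 10 * N - N      # subtract N on both sides of 10N = 9.99999
assert lhs == 9 * N
# the finite truncation falls short of 9 by exactly 9/10^5
assert 9 * N == 9 - Fraction(9, 10**5)
```

Every quantity here is a Fraction, so nothing is rounded; the shortfall term is exact.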
Now the huge flaw in that argument is that you subtract .9999... on the right side but do not subtract it on the left side; instead you subtract 1.
Another crazy argument touted as proof is this one::
1/3 = .333333......
multiply both sides by 3
1 = .99999.....
Another flawed proof, and vastly more subtle in its error. It is like the fake proof of Dandelin spheres in geometry, where the kook mathematician has a hidden fatal assumption. In the case of Dandelin, he assumes the section is an ellipse, and then pitifully goes about saying the spheres intersect the foci of the ellipse, when there never is an ellipse, but an oval.
In the above, the Fatal Hidden Assumption was 1/3 = .33333....
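That assumption can be probed with exact finite arithmetic. Below is a sketch of mine using Python's Fraction, where the cutoff of 6 threes is an illustrative parameter I chose:

```python
from fractions import Fraction

def threes(k):
    """The finite truncation 0.33...3 with k threes, as an exact fraction."""
    return Fraction(10**k - 1, 3 * 10**k)

k = 6
t = threes(k)
assert 3 * t == Fraction(10**k - 1, 10**k)   # 3 * 0.333333 = 0.999999 exactly
assert 3 * t != 1                            # no finite string of 3s times 3 gives 1
# the finite string falls short of 1/3 by exactly (1/3) * 10^-k
assert Fraction(1, 3) - t == Fraction(1, 3 * 10**k)
```

The last assertion is the shortfall in the k-th decimal place, which is where the carried-over remainder would sit.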
And that brings me to the topic at hand here. Old Math started with the Decimal Representation of Numbers way back in 1202, when Fibonacci introduced the decimal representation in his book Liber Abaci (Book of Calculation).
Decimal Representation of Numbers was one of the greatest, most fabulous discoveries in all of mathematics. It is what the paint brush was to painting, or the surf board was to surfing. But bringing that great discovery into mathematics was not without snags, without treacherous pitfalls. There was one huge treacherous pitfall that escaped every mathematician until now, and it is the very reason the most often-posted topic on sci.math in the past 24 years has been .9999.... = 1.
Why did math professors the world over buy into the crap of .99999.... = 1? Their defense was that they imagined a curve, a function of .99999...., with a definition of infinity as endlessness; in their illogical minds they imagined that .9999.... converges to 1.
What they should have done, instead, is focus on 1/3 to see if it makes sense for it to be .333333.....
That is what I did in those 24 years, kept focus on 1/3, and it paid off just recently. Now it seems almost a miracle that it took 24 years to realize just one tiny small bit of factual data. A fact that all of us educated in mathematics ran across in Grade School. Yet none of us had the logical mind to fix, to patch, the huge hole in math of its Decimal Number Representation.
This fact I speak of is this::
We are asked to divide 3 into 10,000 in Grade School
3 | 10,000 = 3,333
Now, if we handed that in to the teacher as our final answer, we missed something. We did not get the math correct, for we missed the remainder of 1.
Our answer to be perfectly math correct should have been 3,333+1/3
For we had that 1 carry over remainder, and so we should have tacked on a 1/3 to the 3,333
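The Grade School division above can be replayed with Python's built-in divmod, which hands back the quotient and the carry-over remainder together; turning the remainder into a fraction is a sketch of the tacking-on step:

```python
from fractions import Fraction

q, r = divmod(10_000, 3)       # long division of 3 into 10,000
assert (q, r) == (3333, 1)     # quotient 3,333 with remainder 1 carried over

# tack the remainder on as a fraction to get the complete quotient
complete = q + Fraction(r, 3)
assert complete * 3 == 10_000  # 3,333 + 1/3, times 3, restores 10,000 exactly
```

Multiplying the bare quotient 3,333 back by 3 gives only 9,999, which is the check that the remainder was dropped.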
That example is what was missed in all of math history from 1202 to the present day, and because it was missed, nearly every math professor was a goofball of reasoning when it came to the issue of .9999... = 1.
Because 1/3 is not .3333333......
3 | 10,000 = 3,333+1/3
Means that 1/3 as decimal is really .333333..33(1/3)
1/3 as decimal is not .3333333.....
The (1/3) in .333333..33(1/3) is a second decimal point, a recognition that a remainder exists and was carried over.
When we divide 10,000 by 3, the answer is not 3,333 but 3,333+1/3; that is true math.
And just because we have a carryover of 1/3 with whole numbers does not stop or prohibit us from a carryover in decimal fractions when 3 is divided into 1.0000....
So when we divide 3 into 1.0000... we must show a carryover of 1 and write it as a suffix as (1/3)
So that 3 divided into 1.00000.... is .33333..33(1/3), and it is not .33333....
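The long division carrying scheme described above can be sketched in code; the function name divide_with_suffix and the cutoff of 8 places are my own illustrative choices:

```python
from fractions import Fraction

def divide_with_suffix(num, den, k):
    """Long division of den into num, carried to k decimal places.
    Returns the digit list and the final carry-over remainder."""
    digits = []
    rem = num
    for _ in range(k):
        rem *= 10
        digits.append(rem // den)
        rem %= den
    return digits, rem

digits, rem = divide_with_suffix(1, 3, 8)
assert digits == [3] * 8 and rem == 1   # every step leaves remainder 1

# the finite decimal plus the carried remainder, in place value, is exactly 1/3
finite = Fraction(int("".join(map(str, digits))), 10**8)
assert finite + Fraction(rem, 3 * 10**8) == Fraction(1, 3)
```

However many places the division is carried, the remainder of 1 never goes away, which is the carryover the text says must be written down.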
What that fixing of a second decimal point causes in math is this: the argument that 1 = .99999.... becomes a total insane hallucination.
1 does equal .99999..99(3*(1/3)), for that equals .99999..99(1); when the 1 is added to the string of 9s preceding the suffix, the carry runs through them and generates 1.000000....
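The carry through the string of 9s can be checked exactly; this is a sketch of mine with Python's Fraction, where the length of 8 nines is an illustrative choice and the suffix is placed in the 8th decimal position:

```python
from fractions import Fraction

k = 8
nines = Fraction(10**k - 1, 10**k)                 # 0.99999999, eight 9s
suffix = 3 * Fraction(1, 3) * Fraction(1, 10**k)   # the (3*(1/3)) suffix, in place value
assert suffix == Fraction(1, 10**k)                # the suffix equals a (1) in the k-th place
assert nines + suffix == 1                         # adding it carries up through the 9s to 1.000...
```

Without the suffix term, the string of 9s alone stays short of 1 by exactly 1/10^k.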
So, in those 24 years, I smelled something goofy about 1 = .9999.... but could not put my finger directly on the foul smell. And in the past weeks of doing irrationals, sorting out algebraic from transcendental, I had to sort out the Rationals of .9999... and .3333.... along with irrationals like .1234567891011.....
In sorting out irrationals from rationals, I realized what the solution was for 1 = .99999.... From 1202 with Fibonacci onward, no-one in the math community had the logical brains to realize that Decimal Representation requires two decimal points, one of which is the suffix (1/3).