Date: Nov 17, 2012 10:57 AM
Author: Robert Hansen
Subject: Re: How teaching factors rather than multiplicand & multiplier confuses kids!


On Nov 16, 2012, at 5:49 PM, Joe Niederberger <niederberger@comcast.net> wrote:

> Robert Hansen says:
>> Can you tell me how you picture the students' minds unravelling these examples into a generic theory of continuity?
>
> Kirby replies:

>> I'm not necessarily looking for a "generic theory of continuity" as I don't imagine most adults have that either.
>
> I don't think most college math majors do either -- I'm curious, what does R. Hansen mean by "generic theory of continuity"?
>
> Joe N


I said that in the context of Kirby's post, where he was going on and on with physical examples of the use of the word "continuous", essentially a lesson in semantics. I meant "generic" in the sense of devoid of physics and semantics; in other words, specifically mathematical.

Functions are deterministic: for every x there is one y. This is sufficient and largely taken for granted in algebra. Right or wrong, the focus of algebra is not on the function, or even on a piece of the function, but on the value of the function at a particular x. As long as you are not dividing by zero, you are usually in the clear.
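For example, take something like f(x) = 1/(x - 2). Algebra will happily hand you f(3) = 1 or f(2.1) = 10; the only place it balks is x = 2, where the denominator is zero.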

Calculus, on the other hand, is about the function, or at least about intervals of the function. Determinism is still taken for granted, but now the question is how the function behaves over an interval or, more importantly, as we approach a point.
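A simple illustration, using the usual removable-hole example: let g(x) = (x^2 - 1)/(x - 1). At x = 1 the function is undefined, so algebra has nothing to say there. But on any interval leading up to 1, g(x) is just x + 1, and the values approach 2 as x approaches 1. Calculus asks about that approach, not about the value at the point.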

I distinctly remember our discussions of calculus (back in high school) beginning with a discussion of finding the slope of the tangent to a curve. This discussion began simply enough, with finding the slope of a secant of the curve using elementary algebra. However, as the discussion progressed to the difference quotient, with its denominator approaching zero, our algebra broke down. If there is one thing that algebra hates, it's a denominator of zero. To the rescue comes the notion of the limit. Instead of focusing on the point (as we do in algebra), we focus on how the function behaves as it approaches the point. Thus the seeds of the finer details of real numbers, of functions over those real numbers, and of the notion of continuity are sown. No, I do not believe in starting with epsilon-delta, but you should be heading there, to borrow a sentiment from Lin McMullin in the AP Calculus forum.
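To make that concrete, take the simplest curve worth differentiating, say f(x) = x^2. The slope of the secant through x and x + h is

    ((x + h)^2 - x^2) / h = (2xh + h^2) / h = 2x + h   (with h nonzero).

Algebra forbids setting h to zero, but the limit lets us ask what the slope approaches as h approaches zero, namely 2x, and that is the slope of the tangent.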

When you teach continuity as if it were an artifact of calculus, or worse, a result of calculus, rather than what it really is, the essential ingredient of calculus, you will miss all of that. And when I say essential ingredient, I mean seeing the continuity in the real numbers and in functions over those numbers. An example of continuity (not lifting a pencil) might be an opening or starting point for the discussion, but it clearly isn't the discussion. Without depth it isn't anything but a superficial opinion, although the "conceptualists" would like you to think that it is everything.
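For the record, the thing itself is short: f is continuous at a point a exactly when

    the limit of f(x), as x approaches a, equals f(a),

which quietly packs in three requirements: f(a) is defined, the limit exists, and the two agree. The pencil never leaving the paper is a picture of that statement, not a substitute for it.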

Bob Hansen