The Math Forum

Ask Dr. Math - Questions and Answers from our Archives

Understanding the Need for Limits

Date: 02/19/2001 at 14:23:45
From: Josh
Subject: Understanding Limits (Calculus)

I searched all over your site and couldn't find anything about 
understanding limits.

Basically I'm trying to find any extra help I can on understanding the 
concept of a limit. Most of the time I get the same vague description:

   "The value that f(x) approaches as x approaches some number."

I can see the need for a limit on things like:

     lim (x->0) [sin(x)/x]

because you need to know what the value "approaches," not what it 
actually equals, since at 0 the expression is undefined. But why do 
they give us problems like:

     lim (x->3) [x^2]

is this just for practice, or am I missing something?

Basically what I'm asking:

   1. Should the concept of a limit be applicable to all functions, or 
      is it really only useful for certain situations?

   2. Are there any other ways of describing a limit?

   3. Why is this concept so vague?

One more thing. The definition of a derivative is:

     lim (h->0) [f(x+h) - f(x)] / h

but I don't see the need for a limit here because all you do is plug 
in the value of 0 for h once you've reduced the function into its 
simplest form. So why the limit? Why not just say, "when h = 0"?

I really appreciate any time you spend on this. A service like this is 
very useful and informative.

Thank you.

Date: 02/19/2001 at 16:22:40
From: Doctor Peterson
Subject: Re: Understanding Limits (Calculus)

Hi, Josh. These are good questions, and too easily overlooked when 
this subject is taught too formally. I think it's important to have 
not only a good understanding of the formal definitions, but also a 
feel for how things work and what they are all about.

>But why do they give us problems like:
>     lim (x->3) [x^2]
>is this just for practice, or am I missing something?

Exactly: This is for practice with the concept of limits and the 
methods for proving them. But also, it introduces the concept of 
continuity. It is possible that such a function might NOT approach its 
value at 3, so you have to prove that it does. In case you haven't 
been introduced to continuity yet, a continuous function is one whose 
limit is in fact equal to its value; the fact that most familiar 
functions are continuous is the reason the concept seems a little 
silly to you.
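
In case a concrete picture helps, here is a small numerical sketch (my 
own illustration, not part of the original exchange) of what continuity 
at x = 3 means for f(x) = x^2: the outputs close in on f(3) = 9 from 
both sides.

```python
# Watch f(x) = x^2 approach its value at x = 3 from both sides.
def f(x):
    return x * x

for h in [0.1, 0.01, 0.001, 0.0001]:
    print(f"f(3 - {h}) = {f(3 - h):.8f}   f(3 + {h}) = {f(3 + h):.8f}")

# Both columns close in on f(3) = 9. Continuity at 3 means exactly
# this: the limit equals the value.
```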

>Basically what I'm asking:
>   1. Should the concept of a limit be applicable to all functions, 
>      or is it really only useful for certain situations?

As I suggested in my mention of continuity, the limit concept is 
applicable to all functions, but is only "interesting" for those 
peculiar functions that either are not continuous, or are not defined 
at some points.

>   2. Are there any other ways of describing a limit?

The general idea of "approaching," and the formal definition using 
delta and epsilon, are the two main ways to discuss limits. The latter 
is just a precise statement of the former. Actually, there's an even 
more formal and general (and therefore hard to follow) definition 
that takes it beyond functions of a single real variable, which 
generalizes the idea of a delta to "open balls" or "neighborhoods." If 
you didn't find "Formal Definition of a Limit" in our archives, you 
may want to search for it, because it attempts to demystify the formal 
definition.

>   3. Why is this concept so vague?

When calculus was first being developed, the concepts of 
differentiation and integration were far more vague, and needed 
careful definition before mathematicians could be comfortable working 
with them - there was no way to prove anything about such unformed 
concepts as "sliding two points on a curve together." The 
delta-epsilon definition was introduced precisely so that these 
concepts did not have to be so vague. I'm not sure exactly what about 
limits seems vague to you. I think probably you don't mean "vaguely 
defined," but something like "vaguely related to anything else," or 
"not clearly needed." I hope my answers to your other questions deal 
with that.

>The definition of a derivative is:
>     lim (h->0) [f(x+h) - f(x)] / h
>I don't see the need for a limit here because all you do is plug in 
>the value of 0 for h once you've reduced the function into its 
>simplest form. So why the limit? Why not just say, "when h = 0?"

Because "reducing a function to its simplest form" is not something we 
can do arbitrarily, or even define clearly in all cases. (You admitted 
that there is a need to use a limit for sin(x)/x, and that is exactly 
what you get if you work out the derivative of the sine at zero.) If 
we want a definition of the derivative, it has to apply to any 
differentiable function, not just to those we can work with easily. 
And in fact the limit concept clarifies exactly what you mean when you 
talk about the "simplest form," in a way that is precisely defined.

Think about the geometric model of the derivative as the slope of a 
curve. This makes it the limit of the slope of chords, as the 
endpoints move together. This can't be defined at all for h = 0; there 
is no chord in that case. So it has to be defined as a limit. The "h" 
form of the definition merely formalizes this definition; we can't 
drop the need for a limit just because it's now written in algebraic 
form.
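
To see the same point numerically, here is a small sketch (my own, not 
part of the original exchange): the difference quotient of sin at x = 0 
is sin(h)/h, the very expression you already agreed needs a limit, and 
no amount of algebra reduces it to a plain "plug in h = 0."

```python
import math

# The difference quotient [f(x + h) - f(x)] / h from the definition of
# the derivative. For f = sin and x = 0 it equals sin(h)/h, which is
# 0/0 at h = 0 but approaches 1 as h shrinks.
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

for h in [0.1, 0.01, 0.001, 1e-6]:
    print(f"h = {h:<8}  quotient = {diff_quotient(math.sin, 0.0, h):.10f}")
```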

Feel free to continue this conversation - your questions are 
definitely worth asking, and I'd like to keep talking until you're 
confident that you understand the concepts.

- Doctor Peterson, The Math Forum   

Date: 02/21/2001 at 14:29:38
From: Josh
Subject: Re: Understanding Limits (Calculus)

Let me start by thanking you for your response. I can't stress enough 
how great this service is!

Now, on to more questions. I have read the simplified formal 
definition of the limit and it certainly makes things more clear.

Let's see if I have this straight:

A limit is when you take a value for an independent variable and show 
that the function produces a value for the dependent variable that 
stays consistently close to some particular number, even if the 
independent variable's value is only close to its target value?

So if I put in a 3 and get out a 5, the limit exists if a 3.0001 
produces a number close to 5, and 2.9999 produces a number close to 5? 
Which is basically like saying that the value was approached from both 
sides?

Then you take this concept and use it to evaluate "interesting" 
functions where you need to know what value a function approaches but 
not what it equals at certain points. Or to show that a function does, 
or does not, remain continuous for all independent values.

Am I close?

I guess the hard part about limits is understanding what, or when, 
they are used. Our Calculus book covered them in the first couple of 
chapters, but then we haven't really seen them again since - other 
than in the definition of a derivative, and now we just use the 
shortcut rules rather than the formal definition. When do limits 
actually come up?

Thank you again. I hope I'm not coming off too thickheaded. I've 
always been the type to prefer understanding things thoroughly, not 
just superficially.

Thank you again for your time.

Date: 02/21/2001 at 15:20:36
From: Doctor Peterson
Subject: Re: Understanding Limits (Calculus)

Hi, Josh.

Never apologize for trying to understand something thoroughly! It's 
wonderful to see someone who isn't satisfied with the shallow stuff.

I think you've basically got it. Of course, you just gave examples, so 
what you said is far from complete. It wouldn't be enough just to show 
that the value at 3.00001 is close to the limit; you have to show that 
this is true no matter how you define "close." I forget just how the 
answer I pointed you to stated it, but here's how I explain the 
definition of a limit:

You don't believe I'm right about the limit, and challenge me to prove 
that the value of the function, f(x), stays close to 5 whenever x is 
close to 3. Since "close" is not well defined, you're not satisfied to 
let me decide how close is close enough; in order to convince you, I 
let you decide how close it has to be. You say, okay, I want you to 
make f(x) within 0.00000000001 of 5. I do some calculating and say, as 
long as x is within 0.000000000000000001 of 3, it will work. Then you 
say, well, maybe I should have picked a smaller number, and then you 
couldn't do it! But I turn around and show you that I have a formula I 
can use to pick my distance, which will work no matter how small your 
definition of "close" is. At that point you give up. I've convinced 
you. We don't need to play the game any more.

So the epsilon-delta proof says, no matter how close you consider 
close, there will be some range of x around the target value for which 
f(x) will be that close to the proposed limit.
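
The "game" can even be played by machine. Here is a sketch (my own 
illustration) for lim (x->3) x^2 = 9: whatever epsilon the challenger 
picks, the formula delta = min(1, epsilon/7) answers it, because when 
|x - 3| < 1 we have |x + 3| < 7, so |x^2 - 9| = |x - 3|*|x + 3| < 
7*|x - 3|.

```python
# The epsilon-delta game for lim (x->3) x^2 = 9. The challenger picks
# epsilon; this formula always produces a winning delta.
def delta_for(epsilon):
    return min(1.0, epsilon / 7.0)

for epsilon in [0.5, 1e-3, 1e-11]:
    delta = delta_for(epsilon)
    # Spot-check points just inside the delta window around 3.
    for x in [3 - 0.999 * delta, 3 + 0.999 * delta]:
        assert abs(x * x - 9) < epsilon
    print(f"epsilon = {epsilon:<8}  delta = {delta}")
```

A spot check is not a proof, of course - the inequality above is what 
settles it for every x in the window - but running it shows the game 
in action.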

You're right that limits are used mostly at the definition level in 
elementary calculus; once you've been convinced that the shortcuts 
give the correct derivative or integral, you can ignore them. Whenever 
you come across a new type of function, of course, you'll have to go 
back to the definition; and there are some types of problems where the 
limit is an essential part of the problem (such as infinite series, 
which involve a related sort of limit). But the main use of the limit, 
as far as you are concerned right now, is just to make it possible to 
do calculus with confidence that everything is working right behind 
the scenes. Calculus was done before the limit concept was clearly 
defined; but mathematicians had a queasy feeling that their 
foundations weren't quite stable until that work was done. Now that 
the foundation has been stabilized, you can just shut the basement 
door and not worry what's going on down there (most of the time). It's 
still good to understand it, though, so you'll know what to think 
about when an earthquake hits.

- Doctor Peterson, The Math Forum   

Date: 02/26/2001 at 12:53:36
From: Josh
Subject: Re: Understanding Limits (Calculus)

I can see where you're coming from on the definition you gave me. It 
makes perfect sense. All I need to do is get more comfortable with the 
mathematical proof side of the limit concept (the delta-epsilon one). 
I just need to let it sink in. Can I ask you a semi-personal question? 
At what point in your mathematics career did you first feel like you 
understood limits? Is this concept something that a first semester 
Calculus student should feel comfortable with? Or is it something that 
comes with time and more advanced knowledge?

I really liked your last description of a limit. That makes a lot more 
sense now.

Thank you again for your time.

If there is anything I can do to help your Web site, let me know.


Date: 02/26/2001 at 17:10:32
From: Doctor Peterson
Subject: Re: Understanding Limits (Calculus)

Hi, Josh.

I don't specifically recall being troubled by the limit concept when I 
first learned it (though it was a long time ago); but my 
understanding has certainly deepened with time and further courses. I 
suppose I've used it too much since then to be able to recall what 
that first exposure was like - just as you probably don't recall how 
amazing the concept of talking felt when you said your first word.

- Doctor Peterson, The Math Forum   
Associated Topics:
High School Calculus

Ask Dr. Math™
© 1994- The Math Forum at NCTM. All rights reserved.