


Limit Problem
Posted:
Jan 26, 2013 5:56 PM


I am having a problem following an example in my book. I understand the concept of a limit, but sometimes I get confused manipulating expressions with absolute values in them. Here is the problem:
Prove lim(x->c) 1/x = 1/c, c not equal to zero.
So 0 < |x - c| < delta implies |1/x - 1/c| < epsilon.
|1/x - 1/c| = |c - x| / |xc| = (1/|x|) * (1/|c|) * |x - c| < epsilon
The factor 1/|x| is troublesome if x is near zero, so we bound |x| to keep it away from zero.
So |c| = |c - x + x| <= |c - x| + |x|, and this implies |x| >= |c| - |x - c|.
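(To convince myself that this bound is right, I checked it on a few sample pairs; the specific values of c and x below are just ones I picked.)

```python
# The triangle inequality gives |c| = |(c - x) + x| <= |c - x| + |x|,
# which rearranges to |x| >= |c| - |x - c|.
for c, x in [(2.0, 1.7), (-3.0, -2.5), (0.5, 0.9), (4.0, -1.0)]:
    assert abs(c) <= abs(c - x) + abs(x)   # triangle inequality
    assert abs(x) >= abs(c) - abs(x - c)   # rearranged form used in the proof
print("bound holds for all samples")
```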
I think I understand everything up to this point, but not the next steps, which are
If we choose delta <= |c|/2, we succeed in making |x| >= |c|/2. Finally, if we also require delta <= (epsilon * c^2) / 2, then
(1/|x|) * (1/|c|) * |x - c| < [1 / (|c|/2)] * (1/|c|) * [(epsilon * c^2) / 2] = epsilon
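(As a sanity check, I also tried testing the book's choice delta = min(|c|/2, epsilon * c^2 / 2) numerically, by sampling points x with 0 < |x - c| < delta; the sample values of c and epsilon are my own.)

```python
import random

def check_delta(c, eps, trials=10000):
    # The book's choice of delta, combining both requirements:
    delta = min(abs(c) / 2, eps * c**2 / 2)
    for _ in range(trials):
        # Sample x with 0 < |x - c| < delta
        x = c + random.uniform(-delta, delta)
        if x == c:
            continue
        # Intermediate bound from the proof: |x| >= |c|/2
        assert abs(x) >= abs(c) / 2
        # The conclusion we want: |1/x - 1/c| < epsilon
        assert abs(1/x - 1/c) < eps

for c in (2.0, -3.0, 0.5):
    for eps in (0.1, 0.01):
        check_delta(c, eps)
print("all checks passed")
```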
How did they know to choose delta <= |c|/2?
How does that lead to |x| > |c|/2 implying 1/|x| < 1/(|c|/2)?
I did not sleep well last night, and I feel I must be missing something that would be obvious if my head were clearer. Thanks for any help.



