Convergence

From Math Images


This is a Helper Page for:
Taylor Series

A sequence is a finite or infinite ordered list of objects, called terms. A series is the sum of the terms in a sequence. This page will explore and clarify what it means for infinite sequences and series to converge.


Notation for Sequences and Series

Let us introduce some formal notation to accompany the definitions of sequence and series given above. The general form of an infinite sequence is a list of terms:

Eq. 1         A = (a_1, a_2, a_3 , a_4, \cdots).

We call the nth term of the sequence a_n.

We can write an infinite series corresponding to A. The series is also given in summation notation:

Eq. 2         s = a_1 + a_2 + a_3 + a_4 + \cdots = \sum ^{\infty} _{k=1} a_k.

We call the sum of the first n terms the nth partial sum, s_n:

Eq. 3         s_n = a_1 + a_2 + \cdots + a_n = \sum ^n _{k=1} a_k.

We can also think of a series in terms of its sequence of partial sums, a sequence in which the nth term is the nth partial sum of the series:

Eq. 4         S = (s_1, s_2, s_3, \cdots, s_n, \cdots).

In the above notation, lower-case variables represent series and their partial sums, while upper-case variables represent sequences. The sequence of partial sums is a sequence in which each term is a partial sum of a series. This can be confusing but will help us in defining convergence below.

It is legitimate to denote the first term of a sequence as a_0. This is a cosmetic decision and will not affect the result of whether the series converges or diverges, or what it converges to, although it can change some of the intermediary math and the formulas that we come up with. On this page, we will use the above convention for the first term of a sequence.

Convergence and Divergence

Generally speaking, a sequence or series converges if it "tends to" a single value as the value of n, the number of terms, increases. However, this general characterization takes on slightly different meaning depending on whether we are referring to a sequence or a series. In a convergent sequence, the term a_n approaches a single number for arbitrarily large n. In a convergent series, the sum of all of the terms up to a_n approaches a single number for arbitrarily large n. The value to which a series converges is also called its sum.

This page will focus on the convergence of series and sequences consisting of real numbers. The same principles are often applicable in other systems, like the complex numbers.

We will use the notations laid out above to define convergence more formally using limits.


The sequence A is said to be convergent if the following limit exists:

 \lim_{n \to \infty} a_n = L_A , where -\infty < L_A < \infty.

If this limit does not exist, then the sequence A is said to be divergent.

When a limit does not exist because the sequence is not bounded above (or below), as in the case that it increases (or decreases) without bound, we will sometimes say that the limit is \infty (or -\infty). This does not mean that the limit exists; we merely mean to distinguish this case from a limit that fails to exist because the sequence does not tend to a single number (for example, the sequence (1, 0, 1, 0, 1, ...)).


We said that the sums up to a_n must approach a single number for a series to converge. In other terms,

 \lim_{n \to \infty} (a_1 + a_2 + a_3 + \cdots + a_n) = \lim_{n \to \infty} \sum ^n _{k = 1} a_k = L_s , where -\infty < L_s < \infty.

Again, if the limit does not exist, then the series is said to be divergent. The sum of terms in the parentheses is the right-hand side of Eq. 3, and so is equal to s_n. Substituting, we get:

Eq. 5         \lim_{n \to \infty} s_n = L_s.

Recall that the nth term of the sequence of partial sums is the nth partial sum of the series. We have concluded, then, that we can think of the convergence of a series in terms of the convergence of its sequence of partial sums. How does this work?

Say we are given the infinite series:

s = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots

Some might recognize this as an infinite geometric series with first term a = 1 and common ratio r = 1/2. The series will have nth term a_n = (1/2)^{n-1}. We know, then, that the sum of the series is:

s = \frac{a}{1-r} = \frac{1}{1 - \frac{1}{2}} = 2.

The series converges to 2. Can we also use the criteria established above to find this sum? As we said, we want to examine the limit of the sequence of partial sums. We will do so using Eq. 5.

Figure B. The first 5 partial sums of the terms of a geometric series with common ratio 1/2. For each nth sum, the addition of the nth term is shown.

The first term of the sequence of partial sums is s_1 = a_1:

s_1 = 1.

The next term is s_2 = a_1 + a_2:

s_2 = 1 + \frac{1}{2} = \frac{3}{2}.

And so on:

s_3 = 1 + \frac{1}{2} + \frac{1}{4} = \frac{7}{4}
s_4 = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} = \frac{15}{8}
s_5 = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} = \frac{31}{16}

The denominator of each sum is a power of 2 (i.e., 2, 4, 8, 16, ...); the numerator is double that power of 2, minus 1 (i.e., 3, 7, 15, 31, ...). We can characterize this in general, in terms of n:

s_n = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots + \frac{1}{2^{n-1}} = \frac{2^n-1}{2^{n-1}}.

This can be verified with the formula for a finite geometric sum (or by induction): s_n = \frac{1 - (1/2)^n}{1 - 1/2} = 2 - \frac{1}{2^{n-1}} = \frac{2^n - 1}{2^{n-1}}.

Figure A. The first 10 partial sums of the terms of a geometric series with common ratio 1/2.

Now that we have an expression for s_n in terms of n, we can plug it into Eq. 5 and determine what the series converges to:

\begin{align}
\lim_{n \to \infty} \frac{2^n-1}{2^{n-1}} &=  \lim_{n \to \infty} \left( \frac{2^n}{2^{n-1}} - \frac{1}{2^{n-1}} \right) \\
&= \lim_{n \to \infty} \left( 2 - \frac{1}{2^{n-1}} \right)
\end{align}

Since the expression \frac{1}{2^{n-1}} goes to 0 as n increases without bound, the above limit is 2, confirming the result of the formula for the sum of an infinite geometric series. This result should make sense: looking at Figure A, we see the sums are bounded above and get close to 2 pretty quickly, but never actually seem to reach 2.
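To see this numerically, here is a minimal Python sketch (an illustration, not part of the original page) that computes the first several partial sums of this geometric series; the values approach 2, matching the limit found above.

# Partial sums of the geometric series 1 + 1/2 + 1/4 + ... (first term 1, common ratio 1/2)
terms = [(1 / 2)**(k - 1) for k in range(1, 21)]   # a_k = (1/2)^(k-1)

partial_sum = 0.0
for n, a in enumerate(terms, start=1):
    partial_sum += a
    if n <= 5 or n == 20:
        print(n, partial_sum)   # s_1 = 1.0, s_2 = 1.5, ..., s_20 is within about 2e-6 of 2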


In the last example, we were able to find the sum of the geometric series before considering the nth partial sum. This is not always possible, however, since most series will not come with a formula to quickly give us their sum. However, if we can find an expression for the nth partial sum in terms of n, then we can still figure out whether the series converges, and often we can figure out to what value it converges. Consider the following series:

s = \frac{1}{2} + \frac{1}{6} + \frac{1}{12} + \frac{1}{20} + \cdots + \frac{1}{n^2 + n} + \cdots.

This is not a geometric series. We don't have a formula for computing its sum. However, we can still look at the sequence of partial sums:

s_1 = \frac{1}{2}
s_2 = \frac{1}{2} + \frac{1}{6} = \frac{4}{6} = \frac{2}{3}
s_3 = \frac{1}{2} + \frac{1}{6} + \frac{1}{12} = \frac{9}{12} = \frac{3}{4}
s_4 = \frac{1}{2} + \frac{1}{6} + \frac{1}{12} + \frac{1}{20} = \frac{48}{60} = \frac{4}{5}

In general, then,

s_n = \frac{1}{2} + \frac{1}{6} + \frac{1}{12} + \frac{1}{20} + \cdots + \frac{1}{n^2 + n} = \frac{n}{n+1}.
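This closed form can be justified by a partial fraction decomposition of the nth term (a standard telescoping argument, included here for completeness). Since

\frac{1}{k^2 + k} = \frac{1}{k(k+1)} = \frac{1}{k} - \frac{1}{k+1},

the partial sums telescope:

s_n = \left( 1 - \frac{1}{2} \right) + \left( \frac{1}{2} - \frac{1}{3} \right) + \cdots + \left( \frac{1}{n} - \frac{1}{n+1} \right) = 1 - \frac{1}{n+1} = \frac{n}{n+1}.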

We now can observe the limit using Eq. 5:

s = \lim _{n \rightarrow \infty} \frac{n}{n+1} = 1.

Therefore, s converges to 1.



There is another "type" of convergence that we need to define before proceeding. Suppose that we have a series s = a_1 + a_2 + a_3 + \cdots. If the series

|a_1| + |a_2| + |a_3| + \cdots

converges, then s is said to be absolutely convergent; that is, the series consisting of the absolute values of each term of s converges.

Furthermore, we establish the following:

Theorem: If a series converges absolutely, then it converges. (A series that converges but does not converge absolutely is called conditionally convergent.)

This should make sense. If terms, for instance, alternate between positive and negative and become progressively smaller in absolute value (tending to 0 as n goes to infinity), then one might suppose that they will eventually "hone in" on a single value, especially if the terms would converge if they were all positive. We'll look at this in more detail in the alternating series test.

This seems like a trivial thing to observe, but it is important. As we'll see, some of the tests below (the comparison test and the limit test) require that all terms in a series be non-negative. This does not mean, necessarily, that these tests cannot be used to demonstrate that a series with some negative terms converges. If we can make each term positive and prove that the resulting series converges, then the original series is absolutely convergent and therefore convergent.

The alternating harmonic series is an interesting example of a convergent, but not absolutely convergent, series, which will be discussed later.

Convergence Tests

There is a problem with the above method for establishing convergence: we cannot write the nth partial sum for every series. What do we do in such cases? Finding the sum of a series may not always be possible, but we can at least figure out whether the series converges or diverges. Here we will lay out several tests to check whether a series converges or diverges.

We will pay particular attention to when these tests are inconclusive. A particular test will not necessarily tell us whether or not a series converges. Some will get us an answer more easily than others. Some are not actually convergence tests, but are divergence tests. It is important not to forget what each test is capable of proving.


Before proceeding, there is a phrase which we will find useful to define: A statement holds for sufficiently large n if there exists some N such that the statement holds for all n \geq N.

In other words, a statement may not hold for n = 3 or n = 5 or even n = 100, but it can still hold for "sufficiently large" n. When talking about convergence of infinite series, though, what we often care about is not whether a statement holds "for all" n but whether we can increase n enough so that it holds for all values of n that are larger than some value (N). This idea will come up repeatedly, especially in the proofs of the following convergence tests. Alternatively, we might simply say for large n, indicating that while the statement may not be true for all n, it will hold if we increase n enough.

nth-term Test

The nth-term Test can be used to quickly establish whether or not a series diverges.

Theorem: If the nth term of a series does not approach 0 as n approaches infinity, then the series diverges. In other words, if
\lim _{n \rightarrow \infty} a_n \neq 0,
then s diverges.

The nth-term test is inconclusive when the nth term does approach 0 as n approaches infinity. A series can "pass" the nth term test and still diverge, so passing the test does not prove convergence. The test's utility is in weeding out some divergent series, but not all.

Geometric series only converge when the common ratio is between -1 and 1. Consider the geometric series with first term 1 and common ratio -1:

s = 1 - 1 + 1 - 1 + 1 - \cdots.

The limit of the nth term as n goes to infinity does not exist (the terms oscillate between 1 and -1); in particular, it is not 0, so the series diverges.


However, the nth-term test is not always conclusive. Consider the harmonic series, which does not converge:

s = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots + \frac{1}{n} + \cdots.

We will examine the Harmonic series in greater detail below, but for now it will suffice to note that

\lim _{n \rightarrow \infty} a_n = \lim _{n \rightarrow \infty} \frac{1}{n} = 0,

even though the series diverges. We can use the nth-term test to find some divergent series, but it will not find all divergent series. Series whose nth term goes to 0 may be convergent or divergent; series whose nth term does not go to 0 are definitely divergent.
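As a rough numerical illustration (a minimal Python sketch, not part of the original page), we can inspect the nth terms of the two series above: the terms of 1 - 1 + 1 - \cdots never approach 0, while the terms of the harmonic series do, even though that series diverges.

# nth terms of two series, for increasingly large n
for n in (10, 11, 10000, 10001):
    oscillating_term = (-1)**(n - 1)   # nth term of 1 - 1 + 1 - ...; keeps jumping between 1 and -1
    harmonic_term = 1 / n              # nth term of the harmonic series; goes to 0
    print(n, oscillating_term, harmonic_term)
# The nth-term test shows the first series diverges; it says nothing about the second,
# which in fact diverges even though its nth term goes to 0.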



The nth-term test is fairly intuitive. Here we will prove it more rigorously by refining our understanding of convergence and limits.

Definition: A series converges if, for every \epsilon > 0, there exists some N such that, for all m > n > N, it holds that
|a_n + a_{n+1} + \cdots + a_m| < \epsilon.

In other words, for any positive value \epsilon (the degree of tolerance), no matter how small, we must be able to go far enough out in the series that the sum of any block of consecutive terms beyond that point has absolute value less than \epsilon. If we can do this, then the limit exists. If there is some \epsilon > 0 for which no such N can be found, then the series diverges. This level of specificity might seem unnecessary, but it will help us clarify this proof.


Consider a series and its corresponding sequence of its terms: s = a_1 + a_2 + a_3 + a_4 + \cdots and A = (a_1, a_2, a_3, a_4, \cdots).

We will assume that A converges to some nonzero value (this does not mean that A has all nonzero terms, just that it does not converge to 0) and show that s necessarily diverges.

For the purpose of this proof, we will denote the nth partial sum of s as L_n. Furthermore, note that, for all n, we can obtain any subsequent sum by adding the subsequent term:

L_{n+1} = L_n + a_{n+1}
L_{n+1} - L_n = a_{n+1}.

In order for s to converge, it must be possible to select a large enough value of n so that |L_{n+1} - L_n| can be made smaller than any positive number \epsilon. However, if we take the limit of the above equation, we get:

\lim _{n \rightarrow \infty} (L_{n+1} - L_n) = \lim _{n \rightarrow \infty} a_{n+1}.

Since we have assumed that

\left| \lim _{n \rightarrow \infty} a_{n+1} \right| > 0,

there exists some \epsilon such that

0 < \epsilon < \left| \lim _{n \rightarrow \infty} a_{n+1} \right|.

In other words, because the limit of the nth term is not 0, there exists some positive number \epsilon below which the difference between two successive partial sums, for arbitrarily large n, cannot fall. Therefore the series cannot converge.


Alternating Series Test

The alternating series test can be used to determine whether or not series with terms alternating between positive and negative converge.

Theorem: Consider a series with alternating terms:
s = a_1 - a_2 + a_3 - a_4 + a_5 - \cdots + (-1)^{n-1} a_n + \cdots.
(Note that, in the way that s is written here, a_k is positive for all k, and the alternation in sign is "handled" by the factors of -1 written "outside" the terms. We treat all of the a_k as positive and keep in mind that we are working with a series of alternating sign.)
If 0 < a_{n+1} < a_n for all sufficiently large n, and \lim_{n \rightarrow \infty} a_n = 0, then s converges.

As the nth-term test showed, having the nth term go to 0 does not necessarily prove that a series converges. However, if the series is an alternating series, then it does prove convergence. Why this exception?

If the terms alternate between positive and negative, and their absolute values get progressively smaller (tending to 0), then the partial sums should "hone in" on some number. Each positive term will "overshoot," and each negative term will "undershoot," but each term brings the sum closer to its limit, and it will be impossible for the sum to increase without bound.

One example of the alternating series test is the alternating harmonic series:

1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots.

As mentioned as an example of the nth-term test, the harmonic series diverges even though its nth term goes to 0. The nth term of the alternating harmonic series likewise goes to 0:

\lim _{n \rightarrow \infty} a_n = \lim _{n \rightarrow \infty} \frac{1}{n} = 0.

Likewise, for all n,

0 < a_{n+1} < a_n
0 < \frac{1}{n+1} < \frac{1}{n}.

So the alternating series test shows us that the alternating harmonic series converges. We will revisit this series later to see what it converges to.
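The "honing in" behavior described above can be seen numerically. Here is a minimal Python sketch (an illustration, not part of the original page) that prints the first several partial sums of the alternating harmonic series; they overshoot and undershoot while narrowing toward a single value, which we will identify below.

from fractions import Fraction

partial = Fraction(0)
for n in range(1, 9):
    partial += Fraction((-1)**(n - 1), n)   # add the nth term, (-1)^(n-1) / n
    print(n, partial, float(partial))
# Successive sums alternate above and below the eventual limit, and the gap shrinks with each term.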


Consider another series:

\frac{2}{2} - \frac{3}{4} + \frac{4}{6} - \frac{5}{8} + \frac{6}{10} - \cdots + (-1)^{n+1} \frac{n + 1}{2n} + \cdots

This is an alternating series. We see that, for all n,

0 < a_{n+1} < a_n
0 < \frac{n+2}{2n+2} < \frac{n+1}{2n}.

However, we must also observe the limit:

\lim _{n \rightarrow \infty} a_n = \lim _{n \rightarrow \infty} \frac{n+1}{2n} = \frac{1}{2}.

Since the nth term doesn't go to 0 as n goes to infinity (even though the absolute value of each term becomes smaller), the series diverges.





Comparison Test

The comparison test can be used to show either convergence or divergence. The comparison test, as the name implies, involves the comparison of one series, which is known beforehand to be either convergent or divergent, to the series whose convergence or divergence we are trying to ascertain. We call the chosen series the comparison series. A successful comparison depends on our choosing the right comparison series.

This test requires series with non-negative terms. Recall, however, the definition of absolute convergence given above. If we have a series with some negative terms, we can try taking the absolute value of each term. If the resulting series converges, then the original series with some negative terms also converges.

Theorem: Consider a series with non-negative terms:
s = a_1 + a_2 + a_3 + \cdots.
If there exists another convergent series with non-negative terms
t = b_1 + b_2 + b_3 + \cdots
such that a_n \leq b_n for all n, then s is also convergent.
Likewise, if there exists another divergent series with non-negative terms
t = b_1 + b_2 + b_3 + \cdots
such that a_n \geq b_n for all n, then s is also divergent.

More specifically, we can truncate s after any number N of terms and then compare only the terms from the Nth onward to another series. This works because the finitely many terms before the Nth have a finite sum. In other words, our comparison only must hold for sufficiently large n; that is, there must exist some N such that the comparison holds for all n greater than N.

The comparison test is inconclusive when:

  • the terms of s are greater than or equal to the corresponding terms of a convergent series.
  • the terms of s are less than or equal to the corresponding terms of a divergent series.


Geometric series are particularly useful when we are using the comparison test because it is easy to know when they do and do not converge, based on their common ratios. As such, if we find that each term of a given series is smaller than the corresponding term of some geometric series with common ratio between -1 and 1, then we know that the series converges. For example, consider the series:

s = \frac{1}{2} + \frac{1}{3} + \frac{1}{5} + \frac{1}{9} + \cdots + \frac{1}{2^{n-1} + 1} + \cdots.

Each term is smaller than the corresponding term in the geometric series:

t = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \frac{1}{32} + \cdots + \frac{1}{2^{n-1}} + \cdots.

Therefore, s converges.


Consider another series:

s = 1 + \frac{1}{\sqrt{2}} + \frac{1}{\sqrt{3}} + \frac{1}{\sqrt{4}} + \frac{1}{\sqrt{5}} + \cdots + \frac{1}{\sqrt{n}} + \cdots.

We need only compare s with the harmonic series (which, as previously mentioned, diverges) to see that it diverges also. Since

\frac{1}{n} \leq \frac{1}{\sqrt{n}} for all n \geq 1,

the series diverges.


Consider another series:

s = 1 + \frac{1}{4} + \frac{1}{9} + \frac{1}{16} + \frac{1}{25} + \cdots + \frac{1}{n^2} + \cdots.

Does this series converge or diverge? If we compare it to the harmonic series, which diverges, we find that each term is smaller than that of the harmonic series (inconclusive). If we compare it to a convergent geometric series with common ratio r = 1/2, we find that, beyond the first several terms, each term of s is larger than that of the geometric series (inconclusive). Choosing a geometric series with a common ratio closer to 1 would increase the value of N after which the geometric series' terms become smaller than those of s, but ultimately, comparison of s with any convergent geometric series will be inconclusive.

The above series s is what we call a p series with p = 2, and it does, in fact, converge (p series converge for all p > 1). (A p series is a series with nth term 1/n^p.) However, we will have to use other means to determine its convergence. We will revisit this series in later tests.



The proof of the comparison test is rather intuitive. Suppose we have two series with non-negative terms:

s = a_1 + a_2 + a_3 + \cdots
t = b_1 + b_2 + b_3 + \cdots

Suppose that we know that t converges to some value T. If every term of s is less than or equal to its corresponding term in t, then every partial sum of s is at most T. Since the partial sums of s are non-decreasing (the terms are non-negative) and bounded above by T, s must converge to some S \leq T.

Alternatively, suppose that we know that t diverges. If every term of s is greater than or equal to its corresponding term in t, then the partial sums of s are at least as large as those of t, which grow without bound, so s must diverge as well.


Let us also justify that we need only find a comparison series for sufficiently large n (for all n greater than some N). Suppose we have the series:

s = a_1 + a_2 + a_3 + \cdots.

Let t be a convergent series:

t = b_1 + b_2 + b_3 + \cdots.

Suppose, however, that a_1 \geq b_1, a_2 \geq b_2, a_3 \geq b_3, \cdots, a_{N-1} \geq b_{N-1}. Does this mean that the comparison test is inconclusive? Not necessarily.

The partial sum s_{N-1} = a_1 + a_2 + a_3 + \cdots + a_{N-1} is a finite sum. We can still, then, consider the infinite series beginning with the term a_N:

s - s_{N-1} = a_N + a_{N+1} + a_{N+2} + \cdots.

If each term of this series is less than or equal to its corresponding term in t, then the series s - s_{N-1} must converge, with a sum no greater than that of t. We can then add back the finite sum s_{N-1} and the total remains finite. Therefore, we only need to find a comparison series that works for sufficiently large n.



The comparison test is fairly basic. The tests provided later can generally be applied in a more straightforward way, but we will find that the theory behind each of the other tests ultimately relies on the comparison test. Watch for this pattern in the later proofs of tests like the ratio test and root test: these tests draw their power from essentially comparing a given series to a generalized geometric series, which eliminates the need for the "tester" to choose a comparison series.

Limit Test

The limit test is another form of comparison test. One chooses a comparison series whose nth term is known and whose convergence or divergence is already established.

Theorem: Let s = a_1 + a_2 + a_3 + \cdots be a series with non-negative terms whose convergence or divergence we are attempting to ascertain. Let t = b_1 + b_2 + b_3 + \cdots be the chosen comparison series with non-negative terms. Consider the limit:
Eq. 6         \lim _{n \rightarrow \infty} \frac{a_n}{b_n}
Based on the evaluation of the above limit and what we know about t, we may be able to determine something about s. In particular:
  • If the limit is nonzero and less than infinity, then the two series either both converge or both diverge. Since we chose t knowing whether it converges or diverges, we know that s does the same.
  • If the limit is 0 and t converges, then s converges as well.
  • If the limit is \infty and t diverges, then s diverges as well.

It follows that the limit test is inconclusive when:

  • The limit is 0 and t diverges.
  • The limit is \infty and t converges.


Our goal in using the limit test is to select a comparison series that seems like it would behave like the original series, s, for large n. We select a series that we know is either convergent or divergent. What does this mean?

Suppose we have the series

s = \sum _{n = 1} ^{\infty} \frac{n}{n^3 - 5n + 5}.

Does it converge? We know that when we take limits of fractions of polynomials, we can look at the leading terms, so we might suppose that s would behave similarly to a series with nth term 1 / n^2. Such a series (a p series with p = 2) is convergent.

However, we can't use the comparison test successfully because for sufficiently large n,

\frac{n}{n^3 - 5n + 5} > \frac{1}{n^2}.

For large n, the nth term of s is larger than the nth term of our convergent comparison series, so the comparison test is inconclusive. This is why we use the limit test.

\lim _{n \rightarrow \infty} \frac{\frac{n}{n^3 - 5n + 5}}{\frac{1}{n^2}} = \lim _{n \rightarrow \infty} \frac{n^3}{n^3 - 5n + 5}.

This limit is 1, which is nonzero and finite. Therefore, by the limit test, s converges along with the comparison series.

Consider the series

\sum _{n = 1} ^{\infty} \left( \frac{1 + n}{3n} \right) ^n.

We might guess that this series will behave like a series with nth term 1 / 3^n. So we compare:

\lim _{n \rightarrow \infty} \frac{\left( \frac{1 + n}{3n} \right) ^n}{\frac{1}{3^n}} = \lim _{n \rightarrow \infty} \left( \frac{1 + n}{3n} \right) ^n \cdot 3^n = \lim _{n \rightarrow \infty} \left( \frac{1 + n}{n} \right) ^n = \lim _{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right) ^n.

Since \lim _{n \rightarrow \infty} \left( 1 + \frac{1}{n} \right) ^n = e, a nonzero and finite constant, the two series either both converge or both diverge. The comparison series is a convergent geometric series (common ratio 1/3), so s is convergent.
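As a sanity check on the algebra above, here is a small Python sketch (an illustration, not part of the original page) that evaluates the ratio a_n / b_n for increasingly large n; the values approach e, a nonzero finite limit, consistent with the limit test.

import math

for n in (10, 100, 500):
    a_n = ((1 + n) / (3 * n))**n   # nth term of the series under investigation
    b_n = (1 / 3)**n               # nth term of the comparison geometric series
    print(n, a_n / b_n)            # approaches e as n grows
print(math.e)                      # e = 2.718281828...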



We must consider 4 cases, depending on whether the chosen series t is convergent or divergent and depending on the limit of Eq. 6.

The first case will be proven rigorously. The other cases are fairly similar, and so their proofs will be slightly abbreviated, but the logic behind them will follow that of Case 1.

Case 1: The limit of Eq. 6 exists and is nonzero, and t converges.

Since the limit exists, we have
\lim _{n \rightarrow \infty} \frac{a_n}{b_n} = L for some 0 < L < \infty. (L must be positive because both series have non-negative terms.)
There exists some N such that for all n greater than N,
\frac{a_n}{b_n} < L + 1.
In other words, since the limit of the ratio between the terms a_n and b_n as n goes to infinity is L, after a certain number (N) of terms, it must be true that the ratio is less than L + 1.
We then consider the series
(L + 1) t = (L + 1) b_1 + (L + 1) b_2 + (L + 1) b_3 + \cdots.
Since t is convergent, the above series is also convergent (it is just a scalar multiple of a convergent series). We also know that
\frac{a_n}{b_n} < L + 1
a_n < (L + 1)b_n for sufficiently large n.
Therefore, by the comparison test, s converges because each of its terms for n greater than N is smaller than the corresponding term of a convergent series (L + 1)t.

Case 2: The limit of Eq. 6 exists and is nonzero, and t diverges.

We can again say that Eq. 6 is equal to some number L. In this case, we will note that there exists some number N such that for all n greater than N
\frac{a_n}{b_n} > \frac{L}{2}
a_n > \frac{L}{2} b_n
The series (L/2)t must, as a scalar multiple of the divergent series t, diverge. Since, for sufficiently large n, each term of s is greater than the corresponding term of a divergent series, s diverges as well.

Case 3: The limit of Eq. 6 is 0, and t converges.

In this case, we find that the limit of Eq. 6 is 0. Then we know that, for sufficiently large n, it is true that
\frac{a_n}{b_n} < 1
a_n < b_n.
t converges, and for sufficiently large n each term of s is less than the corresponding term of t. By the comparison test, s converges.

Case 4: The limit of Eq. 6 is infinity, and t diverges.

In this case, the limit does not exist (or, we can say, is infinity). Then we know that, for sufficiently large n, it is true that
\frac{a_n}{b_n} > 1
a_n > b_n.
t diverges, and for sufficiently large n each term of s is greater than the corresponding term of t. By the comparison test, s diverges.


Ratio Test

The comparison test and the limit test both work for series with non-negative terms. What if a series has alternating terms? The comparison tests also can be somewhat unwieldy and tedious in the respect that they require us to choose a comparison series, which may or may not be what we are looking for. While the ratio test is not always conclusive, we can find out whether it will get us an answer pretty quickly, whereas when the comparison tests are inconclusive, it's not clear whether we should think of another comparison series or try another test altogether.

The ratio test, in essence, depends on the growth rate of a given series as n becomes arbitrarily large.

Theorem: Let s = a_1 + a_2 + a_3 + \cdots + a_n + a_{n+1} + \cdots be the series whose convergence or divergence we are trying to ascertain. We consider the limit:
Eq. 7         \lim _{n \rightarrow \infty} \left| \frac{a_{n+1}}{a_n} \right|.
We obtain a definite result in two possible cases:
  • If the above limit is less than 1, the series converges absolutely.
  • If the above limit is greater than 1, the series diverges.

The ratio test is inconclusive when the above limit is equal to 1.

As mentioned above, the ratio test relies on our knowledge of the growth rates of functions. The ratio between the terms a_{n+1} and a_n generally simplifies to a fraction containing different types of functions. As such, evaluating the limit of Eq. 7 requires knowledge of how different sorts of functions behave for large n. The growth rates of some common functions, from fastest to slowest, are:

  • factorial, e.g. n!
  • exponential, e.g. 3^n
  • polynomial, e.g. n^2 - 2n + 5
  • logarithmic, e.g. \ln(n)

This hierarchy does not necessarily hold for all n, but it will hold for all sufficiently large n. For instance, the growth of f(x) = 2^{0.00001x} might be much less than the growth of g(x) = \ln (10000x) for low values of x, but there must exist some sufficiently large x, past which the growth rate of f is greater than that of g (by sufficiently large, again, we mean that there is some X such that the growth rate of f is greater than that of g for all x > X). In other words, for large enough x,

\left| \frac{f(x+1)}{f(x)} \right| > \left| \frac{g(x+1)}{g(x)} \right|.

The following examples will illustrate how these growth rates play out in the application of the ratio test.


Consider a series

s = \sum _{k = 1} ^{\infty} \frac{k^2}{k!}.

Does it converge? The ratio test gives us a quick way to determine whether or not it does. Just compute the limit:

\lim _{n \rightarrow \infty} \frac{\frac{(n+1)^2}{(n+1)!}}{\frac{n^2}{n!}} = \lim _{n \rightarrow \infty} \frac{n^2 + 2n + 1}{n^2} \cdot \frac{n!}{(n+1)!} = \lim _{n \rightarrow \infty} \left(1 + \frac{2}{n} + \frac{1}{n^2} \right) \cdot \frac{1}{n+1}.

The expression in the parentheses goes to 1 as n goes to infinity, since the fractions with n in the denominator go to 0. The fraction outside the parentheses, 1 / (n + 1), goes to 0. Therefore the limit evaluates to 1 multiplied by 0; the limit is 0. Since the limit is less than 1, s converges.
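A quick numerical check (a minimal Python sketch, for illustration only) shows the ratio of successive terms shrinking toward 0 as n grows, which is what drives the conclusion above.

from math import factorial

def a(n):
    return n**2 / factorial(n)    # nth term of the series

for n in (5, 10, 20):
    print(n, a(n + 1) / a(n))     # ratio of successive terms; equals (1 + 1/n)^2 / (n + 1), which goes to 0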

Consider the series

s = \sum _{k = 1} ^{\infty} \frac{2^k}{k^3 + 1}.

We compute:

\lim _{n \rightarrow \infty} \frac{\frac{2^{n+1}}{(n+1)^3 + 1}}{\frac{2^n}{n^3 + 1}} = \lim _{n \rightarrow \infty} \frac{2^{n+1}}{2^n} \cdot \frac{n^3 +1}{(n+1)^3 + 1} = \lim _{n \rightarrow \infty} 2 \cdot \frac{n^3 +1}{(n+1)^3 + 1}.

We need not expand the fraction with the polynomials; we know that the leading terms will both be to the third power and will have coefficients 1, so we know the limit of the fraction as n goes to infinity will be 1. Therefore, the whole limit evaluates to 2. Since 2 is greater than 1, the series diverges.

We might have been able to anticipate this result by looking at growth rates, as described above. The nth term of s is 2^n / (n^3 + 1). There is exponential growth in the numerator and polynomial growth in the denominator. We should expect the terms to become very large for large n, since the numerator will grow faster than the denominator. And this is what the ratio test tells us! For arbitrarily large n, the ratio between two successive terms tends toward 2; this means that, for large n, the series begins to behave like a geometric series with common ratio 2, which certainly diverges!


The ratio test is powerful, but it does not always get us an answer. Consider the series with nth term 1 / n^2 (the p series with p = 2), which we noted above converges. We get the ratio:

\lim _{n \rightarrow \infty} \frac{\frac{1}{(n+1)^2}}{\frac{1}{n^2}} = \lim _{n \rightarrow \infty} \frac{n^2}{(n+1)^2} = \lim _{n \rightarrow \infty} \frac{n^2}{n^2 + 2n + 1}.

The above limit is 1, at which value the ratio test is inconclusive.



The ratio test essentially looks at the ratio between two successive terms for arbitrarily large n and determines whether a successful comparison could be made with a geometric series.

Let s = a_1 + a_2 + a_3 + \cdots + a_n + a_{n+1} + \cdots be the series whose convergence or divergence we are trying to ascertain.

Case 1: The limit of Eq. 7 is less than 1, so s converges absolutely.

Suppose we found that, for all n \geq N, it is true that
\left| \frac{a_{n+1}}{a_n} \right| < \frac{1}{k} where k > 1 (so 1 / k < 1).
We construct the following convergent geometric series:
Eq. 8         a_N + \frac{a_N}{k} + \frac{a_N}{k^2} + \frac{a_N}{k^3} + \frac{a_N}{k^4} + \cdots where k > 1 (so 1 / k < 1).
Since, in this case, Eq. 7 is less than 1, we can choose some k > 1 such that
\lim _{n \rightarrow \infty} \left| \frac{a_{n+1}}{a_n} \right| < \frac{1}{k}.
Since, for all sufficiently large n, the ratio of each successive pair of terms is less than 1 / k, we can successfully compare s with Eq. 8, so s converges also.

Case 2: The limit of Eq. 7 is greater than 1, so s diverges.

A similar method is employed to demonstrate that s diverges. Instead suppose that we find that, for all n \geq N,
\left| \frac{a_{n+1}}{a_n} \right| > k where k > 1.
We construct the following divergent geometric series:
Eq. 9         a_N + k a_N + k^2 a_N + k^3 a_N + k^4 a_N + \cdots.
Since, in this case, Eq. 7 is greater than 1, we can choose some k > 1 such that
\lim _{n \rightarrow \infty} \left| \frac{a_{n+1}}{a_n} \right| > k.
Since, for sufficiently large n, the ratio of each successive pair of terms is greater than k, we can successfully compare s with Eq. 9, so s diverges also.

Note that the ratio test, although it is not a "comparison test" in the respect that we must choose a comparison series, relies on the comparison test!


Root Test

The root test functions similarly to the ratio test.

Theorem: Let s = a_1 + a_2 + a_3 + \cdots + a_n + \cdots be the series whose convergence or divergence we are trying to ascertain. We consider the limit:
Eq. 10         \lim _{n \rightarrow \infty} |a_n|^{1/n}.
Based on this limit, we may reach a definite conclusion:
  • If the above limit evaluates to be less than 1, then s converges absolutely.
  • If the above limit evaluates to be greater than 1, then s diverges.

Like the ratio test, the root test is inconclusive when the above limit evaluates to 1.

The root test will give us the same result as the ratio test, but sometimes can be easier to apply, depending on how a_n is written (the root test is particularly helpful when the nth term is to the nth power). The following examples will illustrate such circumstances.


The nth root test is useful when we are given series like this:

s = \sum _{n = 1} ^{\infty} \left( \frac{2}{3 + n} \right) ^n = \left( \frac{2}{4} \right) + \left( \frac{2}{5} \right) ^2 + \left( \frac{2}{6} \right) ^3 + \cdots .

Since the nth term is to the nth power, the root test is particularly effective:

\lim _{n \rightarrow \infty} \left| \left( \frac{2}{3 + n} \right) ^n \right|^{1/n} = \lim _{n \rightarrow \infty} \frac{2}{3 + n}.

Since this expression goes to 0 as n goes to infinity, s converges.
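The same limit can be checked numerically with a short Python sketch (an illustration, not part of the original page): the nth root of the nth term works out to exactly 2 / (3 + n), which shrinks toward 0.

for n in (1, 10, 100):
    a_n = (2 / (3 + n))**n        # nth term of the series
    root = abs(a_n)**(1 / n)      # nth root examined by the root test
    print(n, root, 2 / (3 + n))   # the two values agree and shrink toward 0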



The root test is proved in a way similar to the ratio test.

Case 1: The limit of Eq. 10 is less than 1, so s converges absolutely.

We can choose some k > 1 such that
\lim _{n \rightarrow \infty} |a_n|^{1/n} < \frac{1}{k} < 1.
Then there exists some N such that, for all n \geq N,
|a_n| < \frac{1}{k^n} < 1.
So we can construct a convergent geometric series:
\frac{1}{k^N} + \frac{1}{k^{N+1}} + \frac{1}{k^{N+2}} + \frac{1}{k^{N+3}} + \cdots.
The above series will, by construction, compare successfully with the series
|a_N| + |a_{N+1}| + |a_{N+2}| + |a_{N+3}| + \cdots.
Therefore, s converges absolutely.


Case 2: The limit of Eq. 10 is greater than 1, so s diverges.

We can choose some k > 1 such that
1 < k < \lim _{n \rightarrow \infty} |a_n|^{1/n}.
Then there exists some N such that, for all n \geq N,
1 < k^n < |a_n|.
So we can construct a divergent geometric series:
k^N + k^{N+1} + k^{N+2} + k^{N+3} + \cdots.
The above series will, by construction, compare successfully with the series
|a_N| + |a_{N+1}| + |a_{N+2}| + |a_{N+3}| + \cdots.
Therefore, s diverges.

Note that, like the ratio test, the root test essentially "finds," based on the limit as n goes to infinity of the nth root of the nth term, whether there is a geometric series with which s can be successfully compared.


Integral Test

The integral test works for series whose terms are given by a function of n that is positive and monotonically decreasing on some interval [N, \infty).

Theorem: Suppose we have a series
s = a_1 + a_2 + a_3 + \cdots a_n + \cdots = \sum _{n = 1} ^{\infty} a_n.
The series converges if the improper integral
Eq. 11         \int _N ^{\infty} a_x dx
is finite. If the above integral diverges, then the series diverges as well.
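For example (a standard application, included here for illustration), the integral test settles the p series with p = 2, which eluded the comparison test above, and also shows that the harmonic series diverges. Both 1/x^2 and 1/x are positive and decreasing on [1, \infty), and

\int _1 ^{\infty} \frac{1}{x^2} dx = \lim _{b \rightarrow \infty} \left( 1 - \frac{1}{b} \right) = 1,

which is finite, so \sum _{n = 1} ^{\infty} \frac{1}{n^2} converges, while

\int _1 ^{\infty} \frac{1}{x} dx = \lim _{b \rightarrow \infty} \ln (b) = \infty,

so \sum _{n = 1} ^{\infty} \frac{1}{n} diverges.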

Power Series

A power series is a series with a variable to some power in each term. The general form of a power series is:

f(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 + \cdots.

(Note that we have used a_0 to denote the constant term, unlike the convention used throughout the page in which a_1 denotes the first term. This is chosen here so that the subscripts of the coefficients correspond with the exponential order of each term. It will not significantly impact our math.)

Power series can be very useful; Taylor series, which are a way of approximating infinitely differentiable functions, for instance, are a form of power series. Since the convergence of a series depends on the value of each term, the convergence of a power series often depends on the variable. As such we can use the rules of convergence to establish a radius of convergence, which determines the interval of values of the variable on which the series converges.

This is the particular advantage of the last few tests we developed: the ratio test, the root test, and the integral test. Because we don't have to select another comparison series, the expressions evaluated in the tests just depend on the nth term of the series we are investigating. This lets us look at the effect a variable (say, x) has on whether a series converges. For example, what if we look at the series:

f(x) = x + x^2 + x^3 + \cdots + x^n + \cdots.

This is a geometric series with common ratio x. It should converge for -1 < x < 1. Can we confirm this result? Using the ratio test (Eq. 7), we evaluate:

\lim _{n \rightarrow \infty} \left| \frac{x^{n+1}}{x^n} \right| = \lim _{n \rightarrow \infty} \left| x \right|.

This limit must be less than 1 for the series to converge absolutely. And we find that it is less than 1 for -1 < x < 1.


What about for more complicated series? Say we have:

f(x) = \sum _{n = 1} ^{\infty} \frac{(x+1)^n}{3^n} = \frac{x+1}{3} + \frac{(x+1)^2}{3^2} + \frac{(x+1)^3}{3^3} + \cdots.

Here we can easily apply the root test, since the whole expression for the nth term is to the nth power:

\lim _{n \rightarrow \infty} \left| \frac{(x+1)^n}{3^n} \right| ^ {1 / n} = \lim _{n \rightarrow \infty} \left| \left( \frac{x+1}{3} \right) ^n \right| ^ {1 / n} = \lim _{n \rightarrow \infty} \left| \frac{x+1}{3} \right|.

We have conveniently eliminated n from the limit expression, so the limit is simply \left| \frac{x+1}{3} \right|. We need this expression to be less than 1, so we set up the inequality:

\left| \frac{x+1}{3} \right| < 1.

Rearranging this, we obtain:

-4 < x < 2

as the interval of convergence. (The radius of convergence is 3, and the interval is centered at x = -1.)
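We can spot-check this interval numerically. The following minimal Python sketch (an illustration, not part of the original page) sums the first 200 terms of the series at a point inside the interval and at a point outside it; the partial sums settle down in the first case and blow up in the second.

def partial_sum(x, num_terms=200):
    # Sum the first num_terms terms of the series with nth term ((x + 1) / 3)^n.
    return sum(((x + 1) / 3)**n for n in range(1, num_terms + 1))

print(partial_sum(1.0))    # x = 1 lies inside (-4, 2): the partial sums settle near 2
print(partial_sum(2.5))    # x = 2.5 lies outside: the partial sums grow without bound (a huge number here)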


Consider the series:

f(x) = \sum _{n = 1} ^{\infty} \frac{x^{2n}}{n!}.

What is the radius of convergence for this series?

Again we use the ratio test:

\lim _{n \rightarrow \infty} \left| \frac{\frac{x^{2n + 2}}{(n+1)!}}{\frac{x^{2n}}{n!}} \right| = \lim _{n \rightarrow \infty} \left| \frac{x^{2n+2}}{x^{2n}} \cdot \frac{n!}{(n+1)!} \right| = \lim _{n \rightarrow \infty} \frac{x^2}{n+1}.

Since x^2 is a constant, regardless of how large x is, this expression goes to 0 as n goes to infinity. Therefore, the series converges for all x.

Harmonic Series

The Harmonic series provides us with an interesting challenge according to the tests we've established. The harmonic series is as follows:

s = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \cdots + \frac{1}{n} + \cdots

The harmonic series diverges. However, it's not obvious that it should, and it's not straightforward to demonstrate using the tests we have shown already.

The divergence of the harmonic series confounds our intuition. We might expect that, since the nth term 1/n goes to 0, the partial sums would eventually stop growing, but in fact they keep growing without bound, albeit slowly.


As mentioned previously, the nth-term test is inconclusive: as n goes to infinity, a_n goes to 0. What if we compare it to a geometric series with, say, common ratio r = 1/2?

1 + \frac{1}{2} + \frac{1}{4} + \cdots.

Although the first two terms are equal, the third term of the harmonic series is greater than the third term of the geometric series. We could try choosing a common ratio closer to 1, which would make the terms of our chosen geometric series shrink more slowly, but that would only delay the value of N after which the harmonic series' terms begin to exceed those of the geometric series.

What about the ratio test? We must consider the limit:

\lim _{n \rightarrow \infty} \frac{\frac{1}{n+1}}{\frac{1}{n}} = \lim _{n \rightarrow \infty} \frac{n}{n+1}.

This limit is 1, so even the ratio test is inconclusive.


The trick to proving that the harmonic series diverges lies, in fact, in the comparison test. We saw that comparison with a geometric series was not conclusive. Several mathematicians have nevertheless thought of other series which show that the harmonic series diverges. Here we will give just one example.

Consider the series:

Eq. X         1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{4} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \cdots.

This series may, at first, appear to have no pattern; some of the terms appear more than once, and the number of times that they appear is not consistent. However, the term 1/2 appears once, the term 1/4 appears twice, and the term 1/8 appears 4 times. With the exception of the first term 1, each term appears half as many times as the number in its denominator. We recognize that, if the series were continued, the ninth term would be 1/16 (and that term would repeat 8 times).

How does this help us prove the divergence of the harmonic series? Since we are using the comparison test, we need to know that Eq. X diverges if we are to show that the harmonic series diverges. We do this by slightly modifying the way we've written Eq. X:

1 + \left( \frac{1}{2} \right) + \left( \frac{1}{4} + \frac{1}{4} \right) + \left( \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} \right) + \cdots
1 + 1 \cdot \frac{1}{2} + 2 \cdot \frac{1}{4} + 4 \cdot \frac{1}{8} + \cdots
1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots

Each term \frac{1}{k} repeats k / 2 times, and \frac{k}{2} \cdot \frac{1}{k} = \frac{1}{2}, so we obtain a divergent series (by the nth-term test).

So our chosen series is divergent. We still must show that each of the terms of the harmonic series is greater than or equal to its corresponding term in Eq. X. Let's try putting the two series alongside each other:

1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{4} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \cdots
1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8} + \cdots

As we can see above, the first, second, fourth, and eighth terms are equal. The third, fifth, sixth, and seventh terms of the harmonic series are greater than the corresponding terms of Eq. X. By the comparison test, then, the harmonic series diverges.
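The grouping argument says, in effect, that each doubling of the number of terms adds at least 1/2 to the sum, so s_{2^k} \geq 1 + k/2. The following small Python sketch (an illustration, not part of the original page) confirms this lower bound and shows the slow, unbounded growth of the partial sums.

partial = 0.0
n = 0
for k in range(11):                   # look at the partial sums at n = 2^0, 2^1, ..., 2^10
    while n < 2**k:
        n += 1
        partial += 1 / n              # add the nth harmonic term
    print(2**k, partial, 1 + k / 2)   # the partial sum always meets or exceeds 1 + k/2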

Alternating Harmonic Series

The harmonic series should not be confused with the alternating harmonic series, which is, as its name implies, the harmonic series with terms alternating between positive and negative:

s = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} + \frac{1}{7} - \cdots

The alternating harmonic series converges. We can even find its sum. Previously, we have found the sum of certain series via Eq. 5, by simply observing the limit of the sequence of partial sums (see Eq. 4) as n goes to infinity. What happens if we try to do this for the alternating harmonic series? The sequence of partial sums is:

S = (s_1, s_2, s_3, s_4, s_5, s_6, s_7, \cdots)
S = (1, \frac{1}{2}, \frac{5}{6}, \frac{7}{12}, \frac{47}{60}, \frac{37}{60}, \frac{319}{420}, \cdots).

Based on the above, finding a general expression, in terms of n, for the nth partial sum seems neither promising nor likely. What can we do instead? We will use Taylor series. In particular, we will use the Taylor series for the natural logarithm, the derivation of which requires some calculus and background knowledge of Taylor series.

For the purposes of this page, we will just provide the Taylor series for the natural logarithm:

\ln (1 + x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots, for -1 < x \leq 1.

If we plug in x = 1, we get:

\ln (2) = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots.

The right-hand side is the alternating harmonic series. Since x = 1 is in the domain of the Taylor series for \ln(1 + x), the series converges to \ln (2).

As it turns out, \ln (2) is about equal to 0.6931. s_7 = 319/420 is about equal to 0.7595. Maybe if we went out several more terms in our sequence of partial sums, we would be able to guess that the alternating harmonic series converges to \ln (2), if we knew what we were looking for. The Taylor series confirms the answer much more easily, and its result is exact.
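A short Python sketch (for illustration only) makes this concrete: the partial sums of the alternating harmonic series drift toward math.log(2), although the convergence is slow.

import math

partial = 0.0
for n in range(1, 10001):
    partial += (-1)**(n - 1) / n      # nth term of the alternating harmonic series
    if n in (7, 100, 10000):
        print(n, partial)             # 0.7595..., then values ever closer to ln(2)
print(math.log(2))                    # 0.693147...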


We have shown that the alternating harmonic series is convergent but not absolutely convergent. Recall that, for a series to be absolutely convergent, the series consisting of the absolute value of each of its terms must converge. But the series consisting of the absolute values of the terms of the alternating harmonic series is the regular harmonic series, which diverges! So while absolute convergence proves convergence, not all convergent series converge absolutely.
