Convergence
From Math Images
In mathematics, one often encounters infinite series, and it is helpful to know if or when these converge. It may seem counterintuitive that an infinite number of nonzero terms can add up to a real number.
To understand this, we must first understand what it means for a sequence to converge. We will then use this understanding to illustrate what it means for a series to converge.
Basic Description
The general form of an infinite sequence is a list of terms:
$a_1, a_2, a_3, \ldots, a_n, \ldots$
Its sum series of the first $n$ terms (the $n$th partial sum) is:
$s_n = a_1 + a_2 + \cdots + a_n$
The sequence is said to be convergent if the following limit exists:
$\lim_{n \to \infty} a_n = L$, where $L$ is real.
If this limit doesn't exist, then the sequence is said to be divergent.
Now we consider an infinite series:
$\sum_{n=1}^{\infty} a_n = a_1 + a_2 + a_3 + \cdots$
If $\lim_{n \to \infty} a_n \neq 0$, then $\lim_{n \to \infty} s_n$ does not exist. The goal is to determine whether $\lim_{n \to \infty} s_n$ exists if $\lim_{n \to \infty} a_n = 0$.
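These definitions can be checked numerically. The sketch below (Python; the terms $a_n = 1/2^n$ are just an illustrative choice, not one fixed by the text) computes partial sums $s_n$ and shows them settling toward a finite value:

```python
# Terms of a sequence whose series converges: a_n = 1/2^n (illustrative choice).
def a(n):
    return 1 / 2 ** n

def partial_sum(n):
    """s_n = a_1 + a_2 + ... + a_n."""
    return sum(a(i) for i in range(1, n + 1))

# The partial sums settle toward a finite value (here, 1), so the series converges.
for n in (5, 10, 20, 40):
    print(n, partial_sum(n))
```

As $n$ grows, the printed sums creep closer and closer to 1 without ever overshooting it, which is exactly the picture of a convergent sum series.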
Examples of Series
As we can see in the definition, whether a sequence is convergent or not depends on its sum series. If the sequence is "summable" as $n$ goes to infinity, then it's convergent. If it's not, then it's divergent. Following are some examples of convergent and divergent sequences:
 $1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 2$ , convergent.
 $1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots = \frac{\pi}{4}$ , convergent.
 $1 - 2 + 3 - 4 + \cdots$ , divergent. Oscillates above and below 0 with increasing magnitudes.
 $1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots$ , divergent. Adds up to infinity.
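The behavior of all four series can be observed by computing their partial sums. In this Python sketch the term formulas are written to match the descriptions in the text (in particular, the terms chosen for Seq. 3 are one natural reading of "oscillates with increasing magnitudes"):

```python
import math

# nth terms of the four example series, following the descriptions above:
series = {
    "Seq. 1 (geometric)":   lambda n: 1 / 2 ** (n - 1),          # 1 + 1/2 + 1/4 + ...
    "Seq. 2 (Leibniz)":     lambda n: (-1) ** (n + 1) / (2 * n - 1),  # 1 - 1/3 + 1/5 - ...
    "Seq. 3 (oscillating)": lambda n: (-1) ** (n + 1) * n,       # 1 - 2 + 3 - 4 + ...
    "Seq. 4 (harmonic)":    lambda n: 1 / n,                      # 1 + 1/2 + 1/3 + ...
}

def partial_sum(term, n):
    return sum(term(i) for i in range(1, n + 1))

for name, term in series.items():
    print(name, [round(partial_sum(term, n), 4) for n in (10, 100, 1000)])
print("limits of the convergent ones:", 2, math.pi / 4)
```

The first two rows settle near 2 and $\pi/4 \approx 0.7854$; the third swings between ever larger positive and negative values; the fourth just keeps growing.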
Seq. 1 comes directly from the summation formula of geometric sequences. Seq. 2 is a famous summable sequence discovered by Leibniz. We are going to briefly explain these sequences in the following sections.
Seq. 3 and Seq. 4 are both divergent. However, there is one important difference between them. On one hand, Seq. 3 has terms growing to infinity in magnitude, so it's not surprising that it is not summable. On the other hand, Seq. 4 has terms going to zero, but they still have an infinitely large sum! This counterintuitive result was proved by Johann Bernoulli and Jacob Bernoulli in the 17th century. In fact, this sequence is so epic in the history of math that mathematicians gave it a special name: the harmonic series. See ^{[1]} for a proof of the divergence of the harmonic series.
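The divergence of the harmonic series can also be checked numerically. The classical grouping argument gives the bound $H_{2^k} \geq 1 + k/2$ for the partial sums, which the following Python sketch confirms for small $k$:

```python
def harmonic(n):
    """Partial sum H_n = 1 + 1/2 + ... + 1/n of the harmonic series."""
    return sum(1 / i for i in range(1, n + 1))

# Grouping the terms in blocks of doubling length shows H_{2^k} >= 1 + k/2,
# so the partial sums grow without bound even though the terms go to 0.
for k in range(1, 15):
    assert harmonic(2 ** k) >= 1 + k / 2

print(harmonic(2 ** 14))  # already past 10, and still (very slowly) climbing
```

The growth is extremely slow ($H_n$ grows like $\ln n$), which is part of why the divergence is so counterintuitive.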
By definition, divergent series are not summable, so if we talk about the "sum" of such a series, we may get ridiculous results. For example, look at the summation formula of geometric series:
$1 + r + r^2 + r^3 + \cdots = \frac{1}{1-r}$
This formula can be easily derived with a little algebraic manipulation, or by expanding the Maclaurin series of the right side. See ^{[2]} for a simple proof. However, what we want to show here is that this formula doesn't work for all values of $r$. For values between $-1$ and $1$, such as $1/2$, we get reasonable results like:
$1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = \frac{1}{1 - 1/2} = 2$
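For $r = 1/2$ the formula can be verified directly: the partial sums of the geometric series really do approach $1/(1 - r)$ (a quick Python check):

```python
r = 0.5
formula = 1 / (1 - r)    # closed form 1/(1-r), valid here since |r| < 1

# Partial sums 1 + r + r^2 + ... + r^(n-1) creep up toward the formula's value.
for n in (5, 10, 30):
    s = sum(r ** k for k in range(n))
    print(n, s, "->", formula)   # s approaches 2.0
```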
However, if the value of $r$ is larger than 1, such as 2, things start to get weird:
$1 + 2 + 4 + 8 + \cdots = \frac{1}{1 - 2} = -1$
How can we get a negative number by adding a bunch of positive integers? Well, if this case makes mathematicians uncomfortable, then they are going to be even more puzzled by the following one, in which $r = 3$:
$1 + 3 + 9 + 27 + \cdots = \frac{1}{1 - 3} = -\frac{1}{2}$
This is ridiculous: the sum of integers cannot possibly be a fraction. In fact, we are getting all these funny results because the last two series are divergent, so their sums are not defined. See the following images for a graphic representation of these series:
[Images: the three geometric sequences (blue) and their sum series (red)]
In the images above, the blue lines trace the geometric sequences, and the red lines trace their sum series. As we can see, the first sequence, with $r = 1/2$, does have a finite sum, since its sum series converges to a finite value as $n$ increases. However, the sum series of the other two sequences don't converge to anything. They never settle around a finite value. Thus the second and third sequences diverge, and their sums don't exist. Although we can still write down the summation formula in principle, the formula is meaningless, so it is no wonder we got those weird results.
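The divergent cases can be made concrete numerically as well. For $r = 2$ the partial sums are $1 + 2 + 4 + \cdots + 2^{n-1} = 2^n - 1$, which race toward infinity and never approach the formula's value of $1/(1-2) = -1$ (a Python sketch):

```python
r = 2
formula = 1 / (1 - r)   # the closed form gives -1, but it is meaningless here

for n in (5, 10, 20):
    s = sum(r ** k for k in range(n))   # equals 2^n - 1
    print(n, s, "vs formula:", formula)  # s runs away from -1, not toward it
```

The same happens for $r = 3$: the partial sums explode, so assigning them the value $-1/2$ is pure symbol-pushing with no limit behind it.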