
Ask Dr. Math - Questions and Answers from our Archives

Mean-Variance Ratio of the Poisson Distribution


Date: 03/27/2001 at 19:08:10
From: Jocelyn
Subject: Proof of mean/variance = 1 for Poisson

I would like to show that V(X)/E(X) = 1 for the Poisson distribution. 
In other words, the ratio of the variance to the mean equals one.

I know the formula for the Poisson distribution and I think that

     E(X) = x * f(x)

and

     V(X) = x^2 * f(x)

but I am not sure about that.

In order to show this ratio I need to do a derivative under a sum, 
which I am not sure how to do either.

Any help would be appreciated.
Thank you.


Date: 03/27/2001 at 21:58:43
From: Doctor Jordi
Subject: Re: Proof of mean/variance = 1 for Poisson

Hello, Jocelyn - thanks for writing to Ask Dr. Math.

Yes, you are on the right track for finding the mean and variance of 
the Poisson distribution. Doing the summations (with the help of a few 
clever tricks) will yield the desired results. I can show you how to 
do it.

First, let's make sure you and I agree. The probability density 
function of the Poisson distribution is given by:

                 [L^x]*[e^(-L)]
     p(X = x) = ----------------
                       x!

where I have used capital L to represent the parameter of the 
distribution. Traditionally, the Greek letter lambda is used for this 
parameter, but I will keep calling it L from now on.
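
As a quick aside (not part of the proof, and using my own arbitrary 
choice of L = 3.5): if you have Python handy, you can check numerically 
that these probabilities really do add up to 1. Truncating the infinite 
sum at x = 60 is enough here, since the remaining terms are negligibly 
small.

     import math

     L = 3.5                              # arbitrary example parameter
     p = lambda x: L**x * math.exp(-L) / math.factorial(x)

     # Add up p(x) for x = 0, 1, ..., 59; the tail beyond that is
     # negligible for this value of L.
     print(sum(p(x) for x in range(60)))  # prints a value very close to 1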

Finding E(X), the mean of the Poisson, is actually fairly simple. We 
go back to the definition of E(X) to see that we need to evaluate the 
following infinite sum:

             inf
             ---
             \         [L^x]*[e^(-L)]       
     E(X) =  /    x * ----------------
             ---            x!
             x=0

We will apply a standard procedure for summing up this series (or a 
very often-used trick, if you like; my professors like to say that "a 
twice-used trick is a procedure"). We will factor out whatever we can 
from under the sigma sign, trying to arrive at something like

     (Some Expression) * Infinite Sum of p(x)

And since we already know that p(x) is the density function of the 
Poisson, we know that it must sum up to 1. You will see the above 
procedure being applied again and again whenever you find expected 
values.

Let's do it. First we notice that when x = 0, the entire first term 
vanishes, so we can rewrite the expression as:

             inf
             ---
             \         [L^x]*[e^(-L)]       
     E(X) =  /        ----------------
             ---            (x-1)!
             x=1

where I have made the additional simplification of cancelling the 
factor x in the factorial (x! = x*(x-1)!) against the x we had in the 
numerator. Now the above looks *almost* the same as p(x), except that 
we have (x - 1)! instead of x!. No problem. Let us make a change of 
variable and choose y = x - 1. We'll see how everything turns out after 
we do that.
        
             inf            if y = x - 1, then x = y + 1
             ---            |
             \         [L^(y + 1)]*[e^(-L)]       
     E(X) =  /        -----------------------
             ---            y!
             y=0     
              |
              When x = 1, y = 0

Almost there - except that now we have an extra L that multiplies 
every term in the series. So, all we have to do is factor it out.

                inf
                ---
                \          [L^y]*[e^(-L)]
     E(X) = L*  /         -----------------
                ---              y!
                y=0

And there you have it. The resulting summation is nothing more than 
adding up all the values of the density function of the Poisson, and 
we know that by the definition of the density function, it must all 
add up to 1. This gives us the final very neat result:

     E(X) = L
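
If you would like to double-check this numerically before trusting the 
algebra, here is a small Python sketch (with an arbitrary choice of 
L = 3.5, and the infinite sum truncated at x = 60, where the tail is 
negligible). It should print a number very close to 3.5.

     import math

     L = 3.5                              # arbitrary example parameter
     p = lambda x: L**x * math.exp(-L) / math.factorial(x)

     # Truncated version of the series for E(X); the tail is negligible.
     mean = sum(x * p(x) for x in range(60))
     print(mean)                          # prints a value very close to 3.5, i.e. L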

The variance is found by very similar means, except that this time we 
employ another interesting little procedure (I would call it a trick, 
but since it is also used for finding the variance of a Binomial 
distribution, it's a procedure). Recall that

     Var(X) = E(X^2) - [E(X)]^2

Since we already know E(X), all we have to do now is find E(X^2) in 
order to obtain the variance. Unfortunately, summing the series for 
E(X^2) directly is awkward, so we instead do something else.

Notice that E[X(X - 1)] = E(X^2 - X) = E(X^2) - E(X). This alternate 
expression is easier to add up, because the factor x(x - 1) cancels 
neatly against the factorial in the denominator of the Poisson density. 
Finding it will give you an expression containing E(X^2), which can 
then be used for finding the variance. In other symbols,

     E[X(X - 1)] + E(X) - [E(X)]^2 = E(X^2) - E(X) + E(X) - [E(X)]^2
                                  = E(X^2) - [E(X)]^2 
                                  = Var(X)

So the only ingredient missing is E[X(X - 1)]. This I'll let you find 
for yourself. You need to add up the following infinite sum:

                    inf
                    ---
                    \                [L^x]*[e^(-L)]       
     E[X(X - 1)] =  /    x(x - 1) * ---------------
                    ---                   x!
                    x=0

Do it the same way I did it for E(X). Play around with the 
expressions, do a substitution, and try to arrive at an expression of 
the form (Something)*[Sum of p(x)]. I hope this gives you no major 
difficulties once you have fully understood how to do it for E(X).
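
In the meantime, if you want numerical reassurance, here is a small 
Python sketch (again just an illustration, with the arbitrary choice 
L = 3.5 and the sums truncated at x = 60) that evaluates E[X(X - 1)] 
numerically and plugs it into the identity above. It does not spoil the 
algebra for you, but it does show that the ratio you asked about comes 
out to 1.

     import math

     L = 3.5                              # arbitrary example parameter
     p = lambda x: L**x * math.exp(-L) / math.factorial(x)

     EX   = sum(x * p(x) for x in range(60))            # E(X), truncated
     EXX1 = sum(x * (x - 1) * p(x) for x in range(60))  # E[X(X - 1)], truncated

     # Var(X) = E[X(X - 1)] + E(X) - [E(X)]^2
     var = EXX1 + EX - EX**2
     print(var / EX)                      # prints a value very close to 1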

I can think of an entirely different standard method for finding the 
mean and variance of the Poisson distribution. It involves a 
theoretical device called moment generating functions, abbreviated 
mgf's. Since this method is very straightforward, provided that we 
have already derived the mgf of the Poisson, I'll explain it too.

The mgf of ANY distribution is given by the formula:

     mgf(t) = E[e^(tX)]

where E( ) denotes the expected value taken with respect to that 
distribution and X is the random variable. In particular, the mgf of 
the Poisson can be found by evaluating the above expected value (which 
again involves the fun of summing up an infinite series). In the end, 
one arrives at the following formula:

     mgf(t) = exp[(e^t - 1)L]

where exp[ ] is another way of writing e^( ) and L is again the 
parameter related to the Poisson distribution, same L as before.
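
If you would like to convince yourself of this formula numerically 
before deriving it, here is a small Python sketch that compares the 
truncated sum for E[e^(tX)] with exp[(e^t - 1)L] for one arbitrary pair 
of values, L = 3.5 and t = 0.7 (my own choices, purely for 
illustration). The two printed numbers should agree closely.

     import math

     L, t = 3.5, 0.7                      # arbitrary example values
     p = lambda x: L**x * math.exp(-L) / math.factorial(x)

     lhs = sum(math.exp(t * x) * p(x) for x in range(60))  # E[e^(tX)], truncated
     rhs = math.exp((math.exp(t) - 1) * L)                  # exp[(e^t - 1)L]
     print(lhs, rhs)                      # the two values agree closely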

Now, the beauty of the mgf is that it greatly simplifies calculations 
for obtaining means and variances. It is called a _moment_ generating 
function because, in a sense, it packs all of the moments of the 
distribution into one neat expression. The moments of a random variable 
are E(X), E(X^2), E(X^3), etc. Notice how the mean is the first 
moment, E(X), and the variance is the second moment minus the first 
moment squared, i.e.

     Var(X) = E(X^2) - [E(X)]^2

There is a standard result which says that if we find the kth 
derivative of the mgf and evaluate it at t = 0, we obtain the kth 
moment. This theorem can be obtained by expanding e^(tX) as a power 
series in t and taking expected values term by term, so that the mgf 
becomes an infinite sum whose coefficients involve the moments; 
differentiating term by term and evaluating at t = 0 then picks out 
the kth moment. Using this little theorem and the mgf of the Poisson, 
we find that:

 d mgf(t) |      d exp[(e^t - 1)L] |                           |
 -------- |    = ----------------- |    = exp[(e^t - 1)L]*Le^t |    = L
    dt    |t=0          dt         |t=0                        |t=0

It is reassuring to see that this agrees with our previous result.

Now we need to find the second derivative in order to get the second 
moment. We can then use the second moment to find the variance.

      d^2 mgf(t)  |        d exp[(e^t - 1)L]*Le^t  |
     ------------ |    =  ------------------------ |
         dt^2     |t=0                dt           |t=0

                                                                  |
            = exp[(e^t - 1)L] * (Le^t)^2 + exp[(e^t - 1)L] * Le^t |
                                                                  |t=0

            = L^2 + L

And just for an added touch of suspense, I'll let you figure out from 
here what the variance of a Poisson random variable is.
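
If you have a computer algebra system handy, you can also check both 
derivatives symbolically. Here is a short sketch using the Python 
library sympy (assuming it is installed; this is only a check, not part 
of the derivation). It prints the first and second moments we just 
found, and leaves the last step about the variance to you.

     import sympy

     t, L = sympy.symbols('t L', positive=True)
     mgf = sympy.exp((sympy.exp(t) - 1) * L)    # mgf of the Poisson, as above

     first  = sympy.diff(mgf, t).subs(t, 0)     # first moment,  E(X)
     second = sympy.diff(mgf, t, 2).subs(t, 0)  # second moment, E(X^2)

     print(sympy.simplify(first))               # prints L
     print(sympy.simplify(second))              # prints L**2 + L (perhaps as L*(L + 1))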

I hope you found this explanation useful and easy to understand. If 
you still need more clarification, or if you have other questions of 
any kind, please write back.

- Doctor Jordi, The Math Forum
  http://mathforum.org/dr.math/   
    
Associated Topics:
College Statistics


Ask Dr. Math™
© 1994-2013 The Math Forum
http://mathforum.org/dr.math/