Date: Dec 22, 2012 4:35 PM
Author: weschrist
Subject: Entropy of linear function?

Hi all,

Full disclosure, I'm a hydrogeologist... I play in the dirt and water. I love math and use it often, but I'm by no means an expert.

My question concerns Shannon entropy, H(X). As I understand it, it is basically a measure of the information content of a signal.

H(X) = -sum(p_i * log(p_i)), where p_i is the probability of getting value i.

A series consisting of a single repeated value (e.g., all 1's) has just one value, with p = 1, which gives H(X) = 0. For a series of N values, p_i = n_i/N, where n_i is the number of times value i occurs. I hope that's right, otherwise I'm in worse shape than I thought. A set of values in which no two values are the same has p_i = 1/N for every value, so H(X) = log(N), and the entropy grows as N increases.
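
Just to check my bookkeeping, here's a minimal Python sketch (my own illustration, and shannon_entropy is just what I named the helper) that estimates H from the empirical frequencies:

import math
from collections import Counter

def shannon_entropy(values):
    # Empirical Shannon entropy: H = sum over i of -(p_i * log2(p_i)), with p_i = n_i / N
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in Counter(values).values())

print(shannon_entropy([1, 1, 1, 1]))   # 0.0 bits -- constant series
print(shannon_entropy([1, 2, 3, 4]))   # 2.0 bits -- all values distinct, log2(4)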

So my question is: if you have a linearly increasing set of values, where every value is greater than the previous one, do you really have high entropy (high information content)? Or is there some assumption of stationarity, or a transformation to ensure stationarity (i.e., remove the linear trend and drop the entropy to zero)?
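
Here is roughly what I mean, again just a rough sketch with made-up numbers: the frequency-based estimate treats a perfectly predictable ramp as having maximal entropy, while first-differencing it (one crude way to remove the trend) collapses the entropy to zero.

import math
from collections import Counter

def shannon_entropy(values):
    # same frequency-based estimator as in the sketch above
    n = len(values)
    return sum(-(c / n) * math.log2(c / n) for c in Counter(values).values())

ramp = list(range(100))                          # 0, 1, 2, ..., 99: every value distinct
diffs = [b - a for a, b in zip(ramp, ramp[1:])]  # first differences: all 1's

print(shannon_entropy(ramp))    # log2(100) ~ 6.64 bits, the maximum for 100 samples
print(shannon_entropy(diffs))   # 0.0 bits once the linear trend is removed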

Any insight would be greatly appreciated.