The Math Forum





Topic: Entropy of linear function?
Replies: 0  



Posts: 1
From: Big Blue
Registered: 12/22/12
Entropy of linear function?
Posted: Dec 22, 2012 4:35 PM

Hi all,

Full disclosure, I'm a hydrogeologist... I play in the dirt and water. I love math and use it often, but I'm by no means an expert.

My question is about Shannon entropy, H(X). As I understand it, it is basically a measure of the information contained in a signal.

H(X) = -sum_i p_i * log(p_i), where p_i is the probability of observing value i

A constant series (e.g. all 1's) has a single value with probability 1, which gives H(X) = 0. For a series of N values, p_i = n_i / N, where n_i is the number of times value i occurs. I hope that's right, otherwise I'm in worse shape than I thought. A set of values where no two are the same has p_i = 1/N for every value, so H(X) = log(N), which grows as N increases.
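As a sanity check on those cases, here is a small Python sketch (the function name `shannon_entropy` is mine, not from any particular library) that estimates H(X) from the relative frequencies n_i / N:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Empirical Shannon entropy (base 2) of a sequence.

    Each p_i is estimated as n_i / N, the relative frequency
    with which value i occurs in the sequence.
    """
    n = len(values)
    counts = Counter(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy([1, 1, 1, 1]))  # constant series -> 0.0
print(shannon_entropy([1, 2, 3, 4]))  # all distinct, N=4 -> log2(4) = 2.0
```

With base-2 logs the all-distinct case gives exactly log2(N) bits, matching the p_i = 1/N reasoning above.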

So my question is: if you have a linearly increasing set of values, where every value is greater than the previous one, do you really have high entropy (high information content)? Or is there some assumption of stationarity, or a transformation to ensure stationarity (e.g. removing the linear trend, which would drop the entropy to zero)?
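To make the question concrete: a linear ramp of N distinct values has the maximal empirical entropy log2(N), but first-differencing (one standard way to remove a linear trend, used here only as an illustration) turns it into a constant series with zero entropy. A sketch, with names of my own choosing:

```python
import math
from collections import Counter

def shannon_entropy(values):
    # Empirical Shannon entropy (base 2) from relative frequencies.
    n = len(values)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(values).values())

ramp = list(range(8))                            # 0, 1, ..., 7: every value distinct
diffs = [b - a for a, b in zip(ramp, ramp[1:])]  # first differences: all 1's

print(shannon_entropy(ramp))   # log2(8) = 3.0 -- maximal for N = 8
print(shannon_entropy(diffs))  # 0.0 -- the ramp is fully predictable once detrended
```

This is exactly the tension in the question: the frequency-based estimate treats the ramp as maximally "surprising", while after detrending the same signal carries no surprise at all.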

Any insight would be greatly appreciated.



© The Math Forum at NCTM 1994-2018. All Rights Reserved.