[courtesy copy to the author] [cross-post and follow-up to "sci.stat.math"]
In article <email@example.com>, "Dimitris Agrafiotis" <firstname.lastname@example.org> wrote:
> I have a rather complicated integral to compute and I was wondering if
> Mathematica can help me do it. Unfortunately, I don't have access to
> this package. The integral is as follows:
>
>          +inf
> S = -integral(p(x) ln(p(x)))
>          -inf
This is the Shannon entropy of your distribution, isn't it?
Didn't I read somewhere that the entropy of the sum of independent random variables is equal to the sum of their individual entropies?
> where
>                 n
> p(x) = (1/a) * SUM(exp(-(x - c(i))^2 / b))
>                i=1
>
> and a, b, and c(i) are constants. p(x) is essentially a sum of
> n Gaussians, each centered at its own c(i). x and c(i) are
> d-dimensional vectors. Can the integral be computed/approximated,
> and if so, how many dimensions can we handle? How about the
> sample size, n (i.e. the number of Gaussians)? Can that be
> arbitrarily large?
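For what it's worth, that integral has no closed form for a general mixture, but it can be estimated by Monte Carlo: draw samples from p(x) itself and average -ln p(x). Here is a rough Python/NumPy sketch; the values of d, n, b and the centers c(i) are made up for illustration, and a is chosen as n*(pi*b)^(d/2) so that p integrates to 1 (each term is then a Gaussian with variance b/2 per coordinate):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants (not from the original post).
d, n, b = 2, 5, 1.0
c = rng.normal(size=(n, d)) * 3.0         # centers c(i)
a = n * (np.pi * b) ** (d / 2)            # normalization so p integrates to 1

def p(x):
    """Mixture density p(x) = (1/a) * sum_i exp(-|x - c(i)|^2 / b)."""
    sq = ((x[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)  # shape (m, n)
    return np.exp(-sq / b).sum(axis=1) / a

# Sample from the mixture: pick a component uniformly, then draw from
# the Gaussian N(c(i), (b/2) I).  Then S = -E[ln p(X)] is estimated
# by the sample average of -ln p(x_k).
m = 200_000
comp = rng.integers(n, size=m)
x = c[comp] + rng.normal(scale=np.sqrt(b / 2), size=(m, d))
S_hat = -np.mean(np.log(p(x)))
print(S_hat)
```

The cost per sample is O(n*d), so both n and d can be fairly large; the catch is only the usual Monte Carlo error of order 1/sqrt(m).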
The individual entropies can be calculated here, since these are all normal random variables; as far as I can recall, S(i) = cst + ln(std(i)). So, if the theorem on addition of entropies can be applied, the answer should be pretty simple. But I am not sure about the theorem... Can someone confirm?
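As a quick sanity check (my own sketch, not from the thread) of that formula: for a 1-D normal the differential entropy is S = (1/2) ln(2*pi*e*sigma^2) = (1/2) ln(2*pi*e) + ln(sigma), i.e. exactly "cst + ln(std)". The constant and a Monte Carlo estimate of -E[ln p(X)] can be compared numerically:

```python
import math
import numpy as np

def normal_entropy(sigma):
    """Closed-form differential entropy of N(0, sigma^2):
    0.5*ln(2*pi*e) + ln(sigma), i.e. 'cst + ln(std)'."""
    return 0.5 * math.log(2 * math.pi * math.e) + math.log(sigma)

# Monte Carlo cross-check: S = -E[ln p(X)] for X ~ N(0, sigma^2).
rng = np.random.default_rng(1)
sigma = 2.5
x = rng.normal(scale=sigma, size=500_000)
log_p = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)
S_mc = -log_p.mean()
print(normal_entropy(sigma), S_mc)
```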
"Every advantage has its drawbacks, and vice versa."
-------------------------------
J. Rouxel [Les Shadoks]
La vengeance du Marin