Date: Jul 4, 2013 11:06 AM
Author: ferreat
Subject: Maximum entropy

Hello,
I have a question about which distributions of random variables maximize the differential-entropy terms in a set of inequalities. It is well known that, under a variance (power) constraint, the Normal distribution maximizes the differential entropy. I have the following set of inequalities (also in the attached file):

T1 < I(V;Y1|U)
T2 < I(U;Y2)
T3 < I(X1,X2;Y3|V)
T4 < I(X1,X2;Y3)

where Y1 = X1 + N1, Y2 = a*X1 + N2, Y3 = b*X1 + X2 + N3, and N1, N2, N3 are Gaussian ~ N(0,1). The lower-case a and b are positive real numbers with a < b. U, V, X1 and X2 are random variables. I want to maximize the right-hand sides of these inequalities simultaneously. I know the following:
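As a sanity check of the "Gaussian maximizes differential entropy" fact quoted above, the closed-form entropies of a few unit-variance distributions can be compared directly (a minimal sketch; the particular families compared are my choice, not part of the problem):

```python
import math

# Closed-form differential entropies (in nats) of three unit-variance
# distributions; the Gaussian should come out largest.
var = 1.0
h_gauss   = 0.5 * math.log(2 * math.pi * math.e * var)    # N(0,1): 0.5*ln(2*pi*e)
h_uniform = math.log(math.sqrt(12.0 * var))               # Uniform on [-sqrt(3), sqrt(3)]: ln(width)
h_laplace = 1.0 + math.log(2.0 * math.sqrt(var / 2.0))    # Laplace with scale b = sqrt(var/2): 1 + ln(2b)

print(h_gauss, h_uniform, h_laplace)
```

With var = 1 this prints roughly 1.419, 1.242, 1.347, so the Gaussian is indeed on top.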

(i) For T4, h(Y3) is maximized when Y3 is Gaussian, which is achieved when X1 and X2 are (jointly) Gaussian.
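For what it's worth, with Gaussian inputs the T4 bound has a closed form: I(X1,X2;Y3) = h(Y3) - h(N3) = 0.5*ln Var(Y3), since N3 has unit variance. A small sketch (the powers P1, P2 and the correlation rho of X1 and X2 are assumed parameters, not given in the problem):

```python
import math

def gaussian_I_X1X2_Y3(b, P1, P2, rho):
    """T4's mutual information I(X1,X2;Y3) in nats, for jointly Gaussian inputs.

    Y3 = b*X1 + X2 + N3 with N3 ~ N(0,1); P1, P2 are the input powers and
    rho the correlation coefficient of X1 and X2 (assumed parameters).
    """
    var_Y3 = b**2 * P1 + P2 + 2.0 * b * rho * math.sqrt(P1 * P2) + 1.0
    return 0.5 * math.log(var_Y3)  # h(Y3) - h(N3), both Gaussian

print(gaussian_I_X1X2_Y3(2.0, 1.0, 1.0, 0.0))  # independent inputs
print(gaussian_I_X1X2_Y3(2.0, 1.0, 1.0, 0.5))  # positive correlation increases the bound
```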

(ii) T2 is maximized by making h(Y2) = h(a*X1 + N2) as large as possible. Then, by the Entropy Power Inequality (EPI), the term -h(a*X1 + N2|U) can be bounded, with equality when X1 conditioned on U is Gaussian.

(iii) T1 is maximized by making h(Y1|U) = h(X1 + N1|U) as large as possible. This is compatible with (ii): the EPI bound on -h(a*X1 + N2|U) from part (ii) is tight when Y1 conditioned on U is Gaussian (in agreement with the maximum entropy theorem).
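The EPI step used in (ii)/(iii) rests on exp(2h(X+N)) >= exp(2h(X)) + exp(2h(N)), with equality iff X and N are independent Gaussians. A rough numerical illustration (my own sketch: unit-variance N, a uniform X as the non-Gaussian example, and a plain Riemann sum for h(X+N)):

```python
import math

def epi_gap_gaussian(P):
    # X ~ N(0,P), N ~ N(0,1): both sides equal 2*pi*e*(P+1), so the gap is ~0.
    lhs = 2 * math.pi * math.e * (P + 1.0)
    rhs = 2 * math.pi * math.e * P + 2 * math.pi * math.e
    return lhs - rhs

def epi_gap_uniform(P, grid=20001, span=40.0):
    # X ~ Uniform[-a, a] with variance P (so a = sqrt(3P)), N ~ N(0,1).
    a = math.sqrt(3.0 * P)
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    f = lambda y: (Phi(y + a) - Phi(y - a)) / (2.0 * a)  # density of X + N
    dy = 2.0 * span / (grid - 1)
    h = 0.0
    for i in range(grid):                                # Riemann sum for h(X+N)
        p = f(-span + i * dy)
        if p > 1e-300:
            h -= p * math.log(p) * dy
    h_X = math.log(2.0 * a)                              # h of Uniform[-a, a]
    # exp(2h(X+N)) - (exp(2h(X)) + exp(2h(N))), with exp(2h(N)) = 2*pi*e
    return math.exp(2 * h) - (math.exp(2 * h_X) + 2 * math.pi * math.e)

print(epi_gap_gaussian(1.0))  # ~0: EPI is tight for Gaussians
print(epi_gap_uniform(1.0))   # positive: strict for non-Gaussian X
```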

The Question:

For T3, given assumptions (i)-(iii), can I conclude that a jointly Gaussian distribution maximizes h(Y3|V) = h(b*X1 + X2 + N3|V)?

My aim is to show that a jointly Gaussian distribution of U, V, X1 and X2 maximizes all four bounds simultaneously. I hope someone can help me out with this.