
2022-01-21

Joint Entropy closed-form analytical solution
The differential entropy of a single Gaussian random variable is
$H(X) = \frac{1}{2}\ln\left(2\pi e\,\sigma^2\right)$
What then is the closed-form analytical solution for joint entropy, H(X,Y)?
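
For reference, the single-variable formula can be checked numerically. The sketch below is only illustrative and assumes SciPy is available; the value of σ is an arbitrary choice, and the result is in nats since the formula uses the natural log.

```python
import numpy as np
from scipy.stats import norm

sigma = 1.5  # example standard deviation (arbitrary choice for illustration)

# Closed-form differential entropy of a univariate Gaussian: 1/2 * ln(2*pi*e*sigma^2)
h_closed_form = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

# SciPy's built-in differential entropy for the same distribution
h_scipy = norm(loc=0.0, scale=sigma).entropy()

print(h_closed_form, h_scipy)  # both are about 1.8244 nats for sigma = 1.5
```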


Let $(X, Y) \sim N(0, K)$, where
$K = \begin{bmatrix} \sigma^2 & \rho\sigma^2 \\ \rho\sigma^2 & \sigma^2 \end{bmatrix}$
Then the differential entropy of each marginal is
$h(X) = h(Y) = \frac{1}{2}\log\left(2\pi e\,\sigma^2\right)$
and joint entropy is
$h(X, Y) = \frac{1}{2}\log\left((2\pi e)^2\,|K|\right)$
$= \frac{1}{2}\log\left((2\pi e)^2\,\sigma^4\left(1 - \rho^2\right)\right)$
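
As a quick numerical sanity check of the joint-entropy formula, the sketch below compares the closed-form expression against SciPy's built-in multivariate normal entropy. It is a minimal illustration, not part of the derivation; the values of σ and ρ are arbitrary, and the logarithm is natural (entropy in nats).

```python
import numpy as np
from scipy.stats import multivariate_normal

sigma, rho = 1.0, 0.6  # example parameter values (arbitrary, for illustration)

# Covariance matrix K of the bivariate Gaussian
K = np.array([[sigma**2,       rho * sigma**2],
              [rho * sigma**2, sigma**2      ]])

# Closed-form joint differential entropy: 1/2 * log((2*pi*e)^2 * |K|)
h_joint_closed_form = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(K))

# SciPy's built-in differential entropy for the same bivariate normal
h_joint_scipy = multivariate_normal(mean=[0.0, 0.0], cov=K).entropy()

print(h_joint_closed_form, h_joint_scipy)  # agree to numerical precision
```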
