Joint Entropy closed-form analytical solution

gea3stwg

Answered question

2022-01-21

Joint Entropy closed-form analytical solution
The differential entropy of a single Gaussian random variable is
$$H(X) = \frac{1}{2}\ln\!\left(2\pi e \sigma^2\right).$$
What then is the closed-form analytical solution for the joint entropy $H(X,Y)$?

Answer & Explanation

ul2ph3ojc

Beginner · 2022-01-22 · Added 12 answers

Let $(X,Y) \sim \mathcal{N}(0, K)$, where
$$K = \begin{bmatrix} \sigma^2 & \rho\sigma^2 \\ \rho\sigma^2 & \sigma^2 \end{bmatrix}.$$
Then the marginal differential entropies are
$$h(X) = h(Y) = \frac{1}{2}\log\!\left(2\pi e \sigma^2\right),$$
and the joint entropy follows from the general multivariate Gaussian formula $h = \frac{1}{2}\log\!\left((2\pi e)^n |K|\right)$ with $n = 2$:
$$h(X,Y) = \frac{1}{2}\log\!\left((2\pi e)^2 |K|\right) = \frac{1}{2}\log\!\left((2\pi e)^2 \sigma^4 (1-\rho^2)\right).$$
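
As a quick sanity check, here is a minimal Python sketch (not part of the original answer; the values $\sigma^2 = 2$ and $\rho = 0.6$ are arbitrary examples) comparing the closed-form expression above with the determinant form and with SciPy's built-in entropy for a bivariate Gaussian, all in nats:

```python
import numpy as np
from scipy.stats import multivariate_normal

sigma2, rho = 2.0, 0.6  # example variance sigma^2 and correlation rho (arbitrary choices)

# Covariance matrix K of the bivariate Gaussian from the answer
K = np.array([[sigma2, rho * sigma2],
              [rho * sigma2, sigma2]])

# Closed-form joint entropy: 1/2 * log((2*pi*e)^2 * sigma^4 * (1 - rho^2))
h_closed_form = 0.5 * np.log((2 * np.pi * np.e) ** 2 * sigma2 ** 2 * (1 - rho ** 2))

# Same quantity via the determinant form: 1/2 * log((2*pi*e)^2 * |K|)
h_det_form = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(K))

# Cross-check against SciPy's multivariate normal entropy
h_scipy = multivariate_normal(mean=np.zeros(2), cov=K).entropy()

print(h_closed_form, h_det_form, h_scipy)  # all three values agree
```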
