
Continuous/probability version of arithmetic/geometric mean inequality
If one has a random variable $X$, described by a finite probability distribution with equally likely possible values $x_1,\dots,x_n$, then the inequality $E[\log X]\le \log(E[X])$ is a reformulation of the arithmetic/geometric mean inequality.
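To see why this is a reformulation (a short derivation, filling in the step the statement leaves implicit): with equal weights $1/n$,

$$E[\log X]=\frac{1}{n}\sum_{i=1}^{n}\log x_i=\log\!\left(\sqrt[n]{x_1\cdots x_n}\right),\qquad \log(E[X])=\log\!\left(\frac{x_1+\cdots+x_n}{n}\right),$$

and since $\log$ is strictly increasing, $E[\log X]\le\log(E[X])$ is equivalent to

$$\sqrt[n]{x_1\cdots x_n}\;\le\;\frac{x_1+\cdots+x_n}{n},$$

which is exactly the AM–GM inequality.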
Is there a necessary and sufficient condition for equality (perhaps that the distribution is concentrated at one point)?
Rayna Aguilar
Step 1
Let $X:\Omega\to\mathbb{R}$ be a random variable on a probability space $(\Omega,\mathcal{F},\mu)$ such that $X>0$ almost surely. Then we have the inequality $E[\log X]\le\log E[X]$, with equality if and only if $X$ is constant almost surely.
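A quick numerical sanity check of Step 1 (a minimal sketch, not part of the proof): for a finite set of equally likely positive values, compare $E[\log X]$ with $\log E[X]$, and confirm equality in the constant case.

```python
import math

def log_mean_comparison(xs):
    """Return (E[log X], log E[X]) for equally likely positive values xs."""
    n = len(xs)
    e_log = sum(math.log(x) for x in xs) / n   # E[log X] = log of geometric mean
    log_e = math.log(sum(xs) / n)              # log E[X] = log of arithmetic mean
    return e_log, log_e

# Non-constant distribution: strict inequality E[log X] < log E[X]
a, b = log_mean_comparison([1.0, 2.0, 4.0])
print(a < b)  # True

# Constant distribution (concentrated at one point): equality
c, d = log_mean_comparison([3.0, 3.0, 3.0])
print(abs(c - d) < 1e-12)  # True
```

For `[1.0, 2.0, 4.0]` the geometric mean is $2$ and the arithmetic mean is $7/3$, so the gap $\log(7/3)-\log 2>0$ illustrates the strict case.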
Step 2
This is a special case of Jensen's inequality, $E[\varphi(X)]\ge\varphi(E[X])$, which holds for convex functions $\varphi:\operatorname{supp}(X)\to\mathbb{R}$, where $\operatorname{supp}(X)$ is the smallest interval containing the essential support of $X$ (that is, the support of the pushforward measure induced on $\mathbb{R}$ by the random variable $X$). Here one takes $\varphi(x)=-\log x$ for $x\in(0,\infty)$.
The function $\varphi(x)=-\log x$ is strictly convex, so by the equality condition in Jensen's inequality, equality holds if and only if $X$ is constant almost surely.
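Spelling out the two claims in Step 2: strict convexity of $\varphi$ follows from the second-derivative test,

$$\varphi''(x)=\frac{d^2}{dx^2}\bigl(-\log x\bigr)=\frac{1}{x^2}>0\quad\text{for all }x\in(0,\infty),$$

and applying Jensen's inequality with this $\varphi$ gives

$$E[-\log X]\;\ge\;-\log E[X]\quad\Longleftrightarrow\quad E[\log X]\;\le\;\log E[X],$$

which is exactly the inequality in Step 1.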