# Probability and Variance: Bernoulli, Binomial, Negative Binomial

skystasvs 2022-09-13 Answered
I am taking a course in probability and I have trouble computing the variance of a random variable.
There are 2 cases we saw in class that I would like to understand:
First I will state the formulas we use for the expected value and the variance:
- If X is a random variable whose values are in $\mathbb{N}$, then
$\mathbb{E}(X)=\sum_{n\ge 0}\mathbb{P}(X>n)$
- If X is a random variable and $\mathbb{E}(X)$ exists, then the variance of X is
$\mathrm{Var}(X)=\mathbb{E}\left(\left(X-\mathbb{E}(X)\right)^{2}\right)$
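As a quick numerical sanity check of the two formulas above, here is a short Python sketch; the distribution Binomial(5, 0.3) is just an arbitrary example, not anything from the course:

```python
from math import comb

# Numerical sanity check of the tail-sum formula and the variance definition.
# Binomial(5, 0.3) is an arbitrary example distribution.
n, p = 5, 0.3
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

# E(X) computed directly and via the tail-sum formula E(X) = sum_{m>=0} P(X > m)
mean_direct = sum(k * pmf[k] for k in range(n + 1))
mean_tail = sum(sum(pmf[j] for j in range(m + 1, n + 1)) for m in range(n + 1))

# Var(X) = E((X - E(X))^2), straight from the definition
var_def = sum((k - mean_direct) ** 2 * pmf[k] for k in range(n + 1))

print(mean_direct, mean_tail, var_def)  # both means equal n*p = 1.5; variance is 1.05
```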
Now here are the examples I'd like to understand:
1. Binomial distribution: If X is a random variable that follows a binomial distribution with parameters n and p, then we can write $X=X_{1}+X_{2}+\cdots+X_{n}$, where the $X_{i}$'s are independent Bernoulli variables with parameter p. Then
$\mathrm{Var}(X)=np(1-p)$
2. Negative binomial distribution (Pascal law): If X is a random variable that follows a Pascal law with parameters r and p, then $X+r=X_{1}+\cdots+X_{r}$, where the $X_{i}$'s are independent geometric variables with parameter p. Then
$\mathrm{Var}(X)=\mathrm{Var}(X+r)=r\frac{1-p}{p^{2}}$
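Case 2 can also be checked by simulation. Below is a Monte Carlo sketch; the values r = 3 and p = 0.4 are arbitrary, and the `geometric` helper is an ad hoc sampler for a Geometric(p) variable that counts trials up to and including the first success:

```python
import random

# Monte Carlo sketch of case 2: X + r = X_1 + ... + X_r with the X_i i.i.d.
# Geometric(p), counting trials up to and including the first success.
# r = 3, p = 0.4 are arbitrary example values.
random.seed(0)
r, p, trials = 3, 0.4, 200_000

def geometric(p):
    """Number of Bernoulli(p) trials needed to see the first success."""
    count = 1
    while random.random() >= p:
        count += 1
    return count

# X itself counts only the failures before the r-th success, hence the "- r"
samples = [sum(geometric(p) for _ in range(r)) - r for _ in range(trials)]
mean = sum(samples) / trials
var = sum((x - mean) ** 2 for x in samples) / trials
print(mean, var)  # close to r*(1-p)/p = 4.5 and r*(1-p)/p**2 = 11.25
```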
Harper Brewer
Step 1
If X is Bernoulli with $P(\text{Success})=p$, then $E(X)=(0)(1-p)+(1)(p)=p$. Because $0^{2}=0$ and $1^{2}=1$, we also have $E(X^{2})=p$, so $\mathrm{Var}(X)=E(X^{2})-E(X)^{2}=p-p^{2}=p(1-p)$.
Binomial: For $Y\sim \mathrm{Binom}(n,p)$, we have
$E\left(Y\right)=E\left(\sum _{i=1}^{n}{X}_{i}\right)=\sum _{i=1}^{n}E\left({X}_{i}\right)=np.$
Direct proofs, with
$E(Y)=\sum_{i=0}^{n} i\binom{n}{i}p^{i}(1-p)^{n-i}=\sum_{i=1}^{n} i\binom{n}{i}p^{i}(1-p)^{n-i}=\cdots=np(1)=np,$
(using the change of index $j=i-1$) are given in many elementary texts.
Step 2
Similarly, using independence,
$Var\left(Y\right)=Var\left(\sum _{i=1}^{n}{X}_{i}\right)=\sum _{i=1}^{n}Var\left({X}_{i}\right)=np\left(1-p\right).$
Again, proofs that use much the same method as above to find $E(Y(Y-1))$, and then obtain $\mathrm{Var}(Y)$ from that and $E(Y)$, are given in many elementary texts.
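That $E(Y(Y-1))$ route can be illustrated numerically; here is a small sketch using the identity $\mathrm{Var}(Y)=E(Y(Y-1))+E(Y)-E(Y)^{2}$, with n = 6 and p = 0.25 as arbitrary example values:

```python
from math import comb

# Sketch of the E(Y(Y-1)) route for Y ~ Binom(n, p);
# n = 6, p = 0.25 are arbitrary example values.
n, p = 6, 0.25
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

ey = sum(k * pmf[k] for k in range(n + 1))              # E(Y) = n*p
eyy1 = sum(k * (k - 1) * pmf[k] for k in range(n + 1))  # E(Y(Y-1)) = n*(n-1)*p**2
var = eyy1 + ey - ey**2                                 # Var(Y) = E(Y(Y-1)) + E(Y) - E(Y)^2

print(ey, eyy1, var)  # 1.5, 1.875, and n*p*(1-p) = 1.125
```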
For the negative binomial, it seems intuitive that the average waiting time for the rth success should be r/p. A rigorous derivation of the expectation of a negative binomial random variable often uses some sort of trick involving differentiation of a sum. Probably the easiest route is to find the moment generating function and differentiate it to get the mean and variance.
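The MGF route can be carried out symbolically. Here is a sketch, assuming SymPy is available: X + r is a sum of r i.i.d. Geometric(p) variables, each with MGF $pe^{t}/(1-(1-p)e^{t})$, so the MGF of X + r is that expression raised to the power r, and Var(X) = Var(X + r) since r is a constant.

```python
import sympy as sp

# Sketch of the MGF route, assuming SymPy. X + r is a sum of r i.i.d.
# Geometric(p) variables, each with MGF p*exp(t)/(1 - (1-p)*exp(t)),
# so the MGF of X + r is that expression raised to the power r.
t, p, r = sp.symbols('t p r', positive=True)
M = (p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))) ** r

mean = sp.simplify(sp.diff(M, t).subs(t, 0))       # E(X + r), should equal r/p
second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # E((X + r)^2)
var = sp.simplify(second - mean**2)                # Var(X) = Var(X + r)

print(mean, var)  # should agree with r/p and r*(1-p)/p**2
```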
