I am taking a course in probability and I have trouble computing the variance of a random variable.

There are 2 cases we saw in class that I would like to understand:

First I will state the definitions of expected value and variance that we were given:

- If $X$ is a random variable whose values are in $\mathbb{N}$, then

$$\mathbb{E}(X)=\sum_{n\ge 0}\mathbb{P}(X>n)$$

- If $X$ is a random variable and $\mathbb{E}(X)$ exists, then the variance of $X$ is:

$$Var(X)=\mathbb{E}\big((X-\mathbb{E}(X))^2\big)$$

Now here are the examples I'd like to understand:

1. Binomial distribution: If $X$ is a random variable that follows a binomial distribution with parameters $n$ and $p$, then we can write $X = X_1 + X_2 + \dots + X_n$, where the $X_i$'s are independent Bernoulli variables with parameter $p$. Then

$$Var(X)=np(1-p)$$
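To convince myself the formula is plausible, I tried a quick Monte Carlo check in Python (the parameters $n=10$, $p=0.3$ and the sample size are just values I picked):

```python
import random
import statistics

# Simulate Binomial(n, p) as a sum of n independent Bernoulli(p) trials
random.seed(0)
n, p = 10, 0.3
samples = [sum(1 for _ in range(n) if random.random() < p)
           for _ in range(100_000)]

sample_var = statistics.pvariance(samples)
print(sample_var)  # should be close to n*p*(1-p) = 2.1
```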

2. Negative binomial distribution (Pascal law): If $X$ is a random variable that follows a Pascal law with parameters $r$ and $p$, then $X + r = X_1 + \dots + X_r$, where the $X_i$'s are independent geometric variables with parameter $p$. Then

$$Var(X)=Var(X+r)=r\,\frac{1-p}{p^{2}}$$
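I also checked this one numerically, writing $X$ as a sum of $r$ geometric variables minus $r$ (again, $r=5$ and $p=0.4$ are just values I picked):

```python
import random
import statistics

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    count = 1
    while random.random() >= p:
        count += 1
    return count

random.seed(0)
r, p = 5, 0.4
# X + r = sum of r independent geometric(p) variables, so X = (sum) - r;
# subtracting the constant r does not change the variance
samples = [sum(geometric(p) for _ in range(r)) - r for _ in range(100_000)]

sample_var = statistics.pvariance(samples)
print(sample_var)  # should be close to r*(1-p)/p**2 = 18.75
```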