How can you find P(X/(Y-X)<0) if X∼Geometric(p) and Y∼Bernoulli(p)

Let $X\sim \mathrm{Geometric}\left(p\right)$ and $Y\sim \mathrm{Bernoulli}\left(p\right)$ be independent random variables. I want to prove that $P\left(\frac{X}{Y-X}<0\right)=(p-1)^{2}(p+1)$.
Do I need the joint probability mass function for this or should it be proven in some other way?
Octavio Barr
Step 1
Since $Y$ is Bernoulli distributed, its probability mass function is $P\left(Y=0\right)=1-p$ and $P\left(Y=1\right)=p$. As Frank has helpfully commented, we can condition on $Y$. Note that the geometric variable here is taken to have support $\{0,1,2,\dots\}$, with $P\left(X=k\right)=p(1-p)^{k}$ for $k\ge 0$ (the number of failures before the first success); the sums below rely on this convention.
When $Y=0$, the ratio $\frac{X}{-X}$ equals $-1$ whenever $X>0$ (and is undefined at $X=0$), so $P\left(\frac{X}{Y-X}<0|Y=0\right)=P\left(X>0|Y=0\right)$.
When $Y=1$, the ratio $\frac{X}{1-X}$ is negative exactly when $X$ and $1-X$ have opposite signs, i.e. when $X>1$, so $P\left(\frac{X}{Y-X}<0|Y=1\right)=P\left(\frac{X}{1-X}<0|Y=1\right)=P\left(X>1|Y=1\right)$.
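As a quick numeric sanity check (my addition, not part of the original answer), the two tail probabilities just identified can be confirmed under the support convention $P(X=k)=p(1-p)^{k}$ for $k\ge 0$; the value $p=0.3$ is an arbitrary choice:

```python
# Tail probabilities of a geometric variable with support {0, 1, 2, ...},
# pmf P(X = k) = p * (1 - p)**k. The derivation uses
# P(X > 0) = 1 - p and P(X > 1) = (1 - p)**2.
p = 0.3
tail_gt_0 = sum(p * (1 - p)**k for k in range(1, 200))   # approximates P(X > 0)
tail_gt_1 = sum(p * (1 - p)**k for k in range(2, 200))   # approximates P(X > 1)
assert abs(tail_gt_0 - (1 - p)) < 1e-9
assert abs(tail_gt_1 - (1 - p)**2) < 1e-9
```

Truncating the series at 200 terms is harmless here, since the neglected mass is $(1-p)^{200}$, which is negligible for $p=0.3$.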
Step 2
Thus, we have the following (the third equality being a consequence of the independence of $X$ and $Y$):
$$\begin{aligned}
P\left(\frac{X}{Y-X}<0\right) &= P\left(\frac{X}{Y-X}<0|Y=0\right)P\left(Y=0\right)+P\left(\frac{X}{Y-X}<0|Y=1\right)P\left(Y=1\right)\\
&= P\left(X>0|Y=0\right)P\left(Y=0\right)+P\left(X>1|Y=1\right)P\left(Y=1\right)\\
&= P\left(X>0\right)P\left(Y=0\right)+P\left(X>1\right)P\left(Y=1\right)\\
&= \left(\sum_{k=1}^{\infty}p(1-p)^{k}\right)(1-p)+\left(\sum_{k=2}^{\infty}p(1-p)^{k}\right)p\\
&= (1-p)^{2}+p(1-p)^{2}\\
&= (p-1)^{2}(1+p)
\end{aligned}$$
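To double-check the closed form (my addition, not part of the original answer), a short Monte Carlo simulation can estimate $P\left(\frac{X}{Y-X}<0\right)$ directly; the sample size, seed, and $p=0.3$ are arbitrary choices, and samples where $Y-X=0$ (ratio undefined) simply do not count toward the event:

```python
import random

def simulate(p, n=200_000, seed=1):
    """Estimate P(X / (Y - X) < 0) for independent X ~ Geometric(p)
    (support {0, 1, 2, ...}: failures before the first success) and
    Y ~ Bernoulli(p)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = 0                          # count failures before first success
        while rng.random() >= p:
            x += 1
        y = 1 if rng.random() < p else 0
        if y - x != 0 and x / (y - x) < 0:
            hits += 1
    return hits / n

p = 0.3
estimate = simulate(p)
exact = (p - 1)**2 * (p + 1)           # claimed closed form
print(estimate, exact)
```

With 200,000 samples the standard error is roughly $0.001$, so the estimate should land within about $0.01$ of the exact value $0.637$ for $p=0.3$.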