wijii4 2022-09-17 Answered
Probability problem involving hyper-geometric distribution
n balls are chosen randomly and without replacement from an urn containing N white balls and M black balls. Give the probability mass function of the random variable X which counts the number of white balls chosen. Show that the expectation of X is $\frac{Nn}{M+N}$.
Hint: Do not use the hypergeometric distribution. Instead, write $X = X_1 + X_2 + \dots + X_N$, where $X_i$ equals 1 if the $i$th white ball was chosen.
Answers (1)

Adelaide Barr
Answered 2022-09-18 Author has 9 answers
Step 1
$P(X_i = 0) = \frac{N+M-1}{N+M} \cdot \frac{N+M-2}{N+M-1} \cdots \frac{N+M-n}{N+M-(n-1)}$, which, since the product telescopes (each numerator cancels the next denominator), simplifies to:
$P(X_i = 0) = \frac{N+M-n}{N+M}$
The way to think about this is the following. $X_i = 0$ denotes the event of not picking the $i$th white ball in any of the $n$ draws, so in those draws we are only allowed to pick the other balls. Therefore, on the first draw we may pick any of $N+M-1$ balls (anything except the $i$th white ball), and the probability of doing so is $\frac{N+M-1}{N+M}$.
On the second draw we may pick any of $N+M-2$ balls (we still cannot pick the $i$th white ball, and only $N+M-1$ balls remain in the urn after the first draw), and the probability of doing so is $\frac{N+M-2}{N+M-1}$.
Applying the same logic to all $n$ draws, we conclude that $P(X_i = 0) = \frac{N+M-n}{N+M}$.
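A quick numerical check of the telescoping product (a minimal sketch; the values of N, M and n below are arbitrary illustration choices, not from the post):

```python
from fractions import Fraction

# Arbitrary illustration values: N white balls, M black balls, n draws
N, M, n = 5, 7, 4

# P(X_i = 0): before draw j there are N+M-(j-1) balls left, and N+M-j of them
# are acceptable (anything except the i-th white ball).
prod = Fraction(1)
for j in range(1, n + 1):
    prod *= Fraction(N + M - j, N + M - (j - 1))

print(prod)                          # 2/3
print(Fraction(N + M - n, N + M))    # (N+M-n)/(N+M) = 8/12 = 2/3, same value
```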
Step 2
Now, we have $P(X_i = 1) = 1 - P(X_i = 0) = \frac{n}{M+N}$.
Finally, to get the expected value of X, first we notice that $E(X_i) = 1 \cdot P(X_i = 1) + 0 \cdot P(X_i = 0) = P(X_i = 1)$, since $X_i$ is an indicator variable. Then, by the linearity of the expected value,
$E(X) = E(X_1) + E(X_2) + \dots + E(X_N) = \sum_{i=1}^{N} E(X_i) = \frac{Nn}{N+M}$, which is the required result.
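As a sanity check, here is a small Monte Carlo sketch (the ball counts and trial count are arbitrary illustration values, not from the post). It estimates $E(X)$ and compares the empirical distribution of X with the hypergeometric pmf $P(X = k) = \binom{N}{k}\binom{M}{n-k} / \binom{N+M}{n}$, which is the mass function the question asks for:

```python
import random
from math import comb

# Arbitrary illustration values: N white balls, M black balls, n draws
N, M, n = 5, 7, 4
trials = 200_000

urn = ['w'] * N + ['b'] * M
counts = [0] * (n + 1)
total_white = 0
for _ in range(trials):
    drawn = random.sample(urn, n)   # sampling without replacement
    k = drawn.count('w')            # X = number of white balls drawn
    counts[k] += 1
    total_white += k

print("simulated   E(X):", total_white / trials)
print("theoretical E(X):", N * n / (N + M))

# Hypergeometric pmf: P(X = k) = C(N,k) C(M,n-k) / C(N+M,n)
for k in range(n + 1):
    pmf = comb(N, k) * comb(M, n - k) / comb(N + M, n)
    print(k, round(counts[k] / trials, 4), round(pmf, 4))
```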

You might be interested in

asked 2022-08-28
Probability, geometric distribution definitions
Consider an unfair die, where the probability of obtaining a 6 is $p \neq 1/6$. The die is thrown several times. Call T the random variable that counts the number of throws before a 6 appears for the first time. What is the distribution of T?
asked 2022-09-25
The probability of winning in a shootout, using a geometric random variable
Sir Lancelot and Sir Galahad are in a shootout, firing at each other at the same time in each round. The probability that Sir Lancelot hits Sir Galahad is 0.5 and the probability that Sir Galahad hits Sir Lancelot is 0.25. All shots are independent.
A. What is the probability that the shootout ends in the nth round?
B. Given that the shootout has not ended after k rounds, what is the probability that it ends within the next two rounds?
C. What is the probability that Sir Lancelot wins?
D. What is the probability that Sir Galahad wins?
asked 2022-07-22
A Diminishing Geometric Distribution
A standard geometric distribution can be interpreted as the number of Bernoulli trials required to get one success. But what if the probability of success diminishes by some factor with each failure?
Let p be the probability of success on the first trial and d be the diminishing factor applied after each failure, so that the probability of success on trial n is $p d^{n-1}$. I have been trying to calculate the expected value of this distribution without success. My main stumbling block is the probability of failing n times:
$P(n \text{ failures}) = \prod_{k=1}^{n} \left(1 - p d^{k-1}\right)$
Given that, the expected value would be
$E(X) = \sum_{n=1}^{\infty} n \, p d^{n-1} \, F(n-1)$, where $F(n-1)$ denotes the probability of failing the first $n-1$ trials.
Is there a way to get a closed form for this?
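Whatever the answer on a closed form, the series as written can be sanity-checked numerically; here is a minimal truncation sketch (p, d, and the cutoff below are arbitrary illustration values):

```python
# Truncated evaluation of E(X) = sum_{n>=1} n * p*d**(n-1) * F(n-1),
# where F(m) = prod_{k=1}^{m} (1 - p*d**(k-1)) is the probability of m initial failures.
p, d = 0.3, 0.9        # arbitrary illustration values
cutoff = 10_000        # truncation point for the infinite series

expected = 0.0
fail_prob = 1.0        # F(0) = empty product = 1
for n in range(1, cutoff + 1):
    success_here = p * d ** (n - 1)            # probability of success on trial n
    expected += n * success_here * fail_prob   # contribution of "first success at trial n"
    fail_prob *= 1 - success_here              # update to F(n) for the next term

print("truncated E(X) ≈", expected)
```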
asked 2022-08-16
A system used to read electric meters automatically requires the use of a 64-bit computer message. Occasionally, random interference causes a digit reversal, resulting in a transmission error. Assume that the probability of a digit reversal for each bit is 1/2000. Let X denote the number of transmission errors per 64-bit message sent. Is X geometric?
asked 2022-08-10
Expected value for geometric probability
Points $P = (X_P, Y_P)$ and $Q = (X_Q, Y_Q)$ were independently chosen from the square with vertices (−1,0), (0,−1), (1,0), (0,1) with geometric probability.
How does one find $E\,|X_P - X_Q|^2$?
How do you even define expected value here?
asked 2022-07-17
What is the expected volume of the simplex formed by $n+1$ points independently and uniformly distributed on $S^{n-1}$?
asked 2022-08-21
Geometric interpretation of multiplication of probabilities?
When dealing with an abstract probability space $\Omega$, consisting of atomic events with a measure $P : \Omega \to \mathbb{R}$ defined on them, it seems natural to immediately start imagining simple cases like this: $\Omega$ is some closed region in 2-D space partitioned into subregions representing the atomic events, the measure is the area, and the total area of $\Omega$ is 1.
In fact, this is what textbooks sometimes picture with the help of Venn diagrams or similar.
This intuition works fine for simple cases, especially with adding probabilities.
But when it comes to probability multiplication, I don't understand how to interpret it within this simple model. Is there a way? Is there any mental model for thinking about probability multiplication in simple geometric terms, beyond Lebesgue-measure terms?
