Let $X_1, X_2, \dots, X_n$ be a random sample from a $N(\theta_1, \theta_2)$ distribution. Find the uniformly minimum variance unbiased estimator of $3\theta_2^2$.

Aleah Avery

Answered question

2022-11-23

Let $X_1, X_2, \dots, X_n$ be a random sample from a $N(\theta_1, \theta_2)$ distribution. Find the uniformly minimum variance unbiased estimator of $3\theta_2^2$.

Answer & Explanation

Kristen Garza

Beginner · 2022-11-24 · Added 13 answers

Let $X_i \sim N(\mu, \sigma^2)$ i.i.d., writing $\theta_1 = \mu$ and $\theta_2 = \sigma^2$. We want to show that $\left(\sum_i X_i, \sum_i X_i^2\right)$ is complete for $(\mu, \sigma^2)$.
It is enough to show that $(\bar{X}, S)$ is complete, where $S = \sum_i (X_i - \bar{X})^2$. We know $\bar{X}$ and $S$ are independent, with $\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right)$ and $S \sim \mathrm{Gamma}\!\left(\frac{n-1}{2}, 2\sigma^2\right)$.
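These two distributional facts can be checked quickly by simulation. A minimal Python sketch (assuming numpy and scipy are available; the parameter values below are illustrative, not taken from the problem):

import numpy as np
from scipy import stats

# Illustrative values; mu, sigma2 and n are placeholders, not from the problem.
mu, sigma2, n, reps = 2.0, 3.0, 10, 20_000
rng = np.random.default_rng(0)

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
xbar = x.mean(axis=1)                        # sample means
s = ((x - xbar[:, None]) ** 2).sum(axis=1)   # S = sum_i (X_i - Xbar)^2

# Xbar should be N(mu, sigma2/n); S should be Gamma((n-1)/2, scale = 2*sigma2).
print(stats.kstest(xbar, "norm", args=(mu, np.sqrt(sigma2 / n))))
print(stats.kstest(s, "gamma", args=((n - 1) / 2, 0, 2 * sigma2)))
# Independence of Xbar and S shows up as (near-)zero sample correlation.
print(np.corrcoef(xbar, s)[0, 1])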
We must show that if, for every $(\mu, \sigma^2)$,
$$E\big[g(\bar{X}, S)\big] = 0, \quad \text{then} \quad P\big(g(\bar{X}, S) = 0\big) = 1.$$
So suppose that for all $(\mu, \sigma^2)$,
$$0 = E\big[g(\bar{X}, S)\big] = \int_0^{\infty} \int_{-\infty}^{+\infty} g(\bar{x}, s)\, f(\bar{x})\, f(s)\, d\bar{x}\, ds$$
$$= \frac{1}{\Gamma\!\left(\frac{n-1}{2}\right) (2\sigma^2)^{\frac{n-1}{2}}} \int_0^{\infty} \left( \int_{-\infty}^{+\infty} g(\bar{x}, s)\, f(\bar{x})\, s^{\frac{n-1}{2}-1} e^{-\frac{s}{2\sigma^2}}\, d\bar{x} \right) ds$$
$$= \frac{1}{\Gamma\!\left(\frac{n-1}{2}\right) (2\sigma^2)^{\frac{n-1}{2}}} \int_0^{\infty} \left( \int_{-\infty}^{+\infty} g(\bar{x}, s)\, f(\bar{x})\, s^{\frac{n-1}{2}-1}\, d\bar{x} \right) e^{-\frac{s}{2\sigma^2}}\, ds$$
$$= \frac{1}{\Gamma\!\left(\frac{n-1}{2}\right) (2\sigma^2)^{\frac{n-1}{2}}} \int_0^{\infty} h(s)\, e^{-\frac{s}{2\sigma^2}}\, ds, \qquad \text{where } h(s) = s^{\frac{n-1}{2}-1} \int_{-\infty}^{+\infty} g(\bar{x}, s)\, f(\bar{x})\, d\bar{x}.$$
The last expression is the Laplace transform of $h(s)$ evaluated at $t = \frac{1}{2\sigma^2}$, and it vanishes for every $t > 0$; this implies $h(s) = 0$ a.e.
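The uniqueness property being invoked here can be written out as follows (a standard fact about Laplace transforms, stated as a sketch):
$$\text{If } \int_0^{\infty} h(s)\, e^{-st}\, ds \ \text{ converges and equals } 0 \ \text{ for every } t \text{ in an open interval } (a, b) \subset (0, \infty), \ \text{ then } h(s) = 0 \ \text{ for almost every } s > 0.$$
Here the transform vanishes for all $t = \frac{1}{2\sigma^2} \in (0, \infty)$, so the hypothesis is satisfied.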
So, for almost every $s$ and for all $(\mu, \sigma^2)$,
$$0 = \int_{-\infty}^{+\infty} g(\bar{x}, s)\, f(\bar{x})\, d\bar{x}$$
$$= \int_{-\infty}^{+\infty} g(\bar{x}, s)\, \frac{1}{\sqrt{2\pi \sigma^2 / n}}\, e^{-\frac{n}{2\sigma^2}(\bar{x} - \mu)^2}\, d\bar{x}$$
$$= e^{-\frac{n\mu^2}{2\sigma^2}} \int_{-\infty}^{+\infty} \left( g(\bar{x}, s)\, \frac{1}{\sqrt{2\pi \sigma^2 / n}}\, e^{-\frac{n}{2\sigma^2}\bar{x}^2} \right) e^{\frac{n\mu}{\sigma^2}\bar{x}}\, d\bar{x}.$$
The remaining integral is a two-sided Laplace transform (in the variable $\frac{n\mu}{\sigma^2}$) of the bracketed function, and it vanishes for all $\mu$; since the Gaussian factor is strictly positive, this forces $g(\bar{x}, s) = 0$ a.e.
Hence $(\bar{X}, S)$, and equivalently $\left(\sum_i X_i, \sum_i X_i^2\right)$, is a complete sufficient statistic. Since $S/\theta_2 \sim \chi^2_{n-1}$, we have $E[S^2] = (n-1)(n+1)\theta_2^2$, so $\frac{3S^2}{n^2-1}$ is an unbiased function of the complete sufficient statistic. By the Lehmann–Scheffé theorem, $\frac{3S^2}{n^2-1}$ is the uniformly minimum variance unbiased estimator of $3\theta_2^2$.
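A minimal Monte Carlo sketch in Python (assuming numpy; the parameter values below are arbitrary illustrations) that checks the unbiasedness of $\frac{3S^2}{n^2-1}$ numerically:

import numpy as np

# Arbitrary illustrative values of theta1 (mean), theta2 (variance) and n.
theta1, theta2, n, reps = 1.0, 2.0, 8, 200_000
rng = np.random.default_rng(1)

x = rng.normal(theta1, np.sqrt(theta2), size=(reps, n))
s = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)  # S = sum_i (X_i - Xbar)^2

estimate = 3 * s**2 / (n**2 - 1)        # candidate UMVUE of 3*theta2^2
print(estimate.mean(), 3 * theta2**2)   # the two values should be close

Over many replications the Monte Carlo average of the estimator should agree with $3\theta_2^2$ up to simulation noise.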
