Assume a linear regression model: y_i = β_0 + β_1 x_{i1} + ... + β_p x_{ip} + ε_i. Prove that Cov(e, Ŷ) = 0, where e = the residual vector and Ŷ = the vector of fitted values.

anudoneddbv

Answered question

2022-07-15

Assume a linear regression model: y_i = β_0 + β_1 x_{i1} + ... + β_p x_{ip} + ε_i
Prove that Cov(e, Ŷ) = 0
where: e = the residual vector
Ŷ = the vector of fitted values
Use the fact that Xᵀe = 0.

Answer & Explanation

fairymischiefv9

Beginner · 2022-07-16 · Added 11 answers

You can write the fitted values as:
Ŷ = Xβ̂
Therefore, by the rules of Cov, you can take the constant matrix out (on the right-hand side it comes out transposed):
Cov(e, Ŷ) = Cov(e, Xβ̂) = Cov(e, β̂)Xᵀ
Now you can prove that:
Cov(e, β̂) = 0
Since:
Cov(e, β̂) = Cov(Y − Ŷ, β̂) = Cov(Y, β̂) − Cov(Ŷ, β̂)
1.
Cov(Y, β̂) = Cov(Y, (XᵀX)⁻¹XᵀY) = Cov(Y, Y) × ((XᵀX)⁻¹Xᵀ)ᵀ = σ²I · X(XᵀX)⁻¹ = σ²X(XᵀX)⁻¹
2.
Cov(Ŷ, β̂) = Cov(Xβ̂, β̂) = X[Cov(β̂, β̂)] = X[σ²(XᵀX)⁻¹] = σ²X(XᵀX)⁻¹
Therefore:
Cov(Y, β̂) − Cov(Ŷ, β̂) = σ²X(XᵀX)⁻¹ − σ²X(XᵀX)⁻¹ = 0
Hence Cov(e, Ŷ) = Cov(e, β̂)Xᵀ = 0.
Both equations rely on knowledge about the distribution β̂ ~ N(β, σ²(XᵀX)⁻¹) and that Ŷ = Xβ̂ = X(XᵀX)⁻¹XᵀY.
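As a sanity check (not part of the proof), you can fit OLS on simulated data and verify numerically that Xᵀe = 0 and hence eᵀŶ = 0, up to floating-point error. The data dimensions and variable names below are illustrative assumptions, not taken from the question:

```python
import numpy as np

# Simulate data from y = X beta + eps (assumed sizes for illustration)
rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])  # design with intercept
beta = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta + rng.normal(size=n)

# OLS: beta_hat = (X^T X)^{-1} X^T y, fitted values, residuals
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat
e = y - y_hat

# Residuals are orthogonal to every column of X, hence to Yhat = X beta_hat
print(np.abs(X.T @ e).max())  # essentially zero (floating-point noise)
print(abs(e @ y_hat))         # essentially zero as well
```

This checks the sample orthogonality eᵀŶ = 0, which is the exact finite-sample counterpart of the covariance identity proved above.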
