In a normal linear model (with intercept), show that if the residuals satisfy e_i = a + β x_i, then each residual is equal to zero.

vagnhestagn

Answered question

2022-09-07

In a normal linear model (with intercept), show that if the residuals satisfy e_i = a + β x_i, for i = 1, …, n, where x is a predictor in the model, then each residual is equal to zero.
How do I show this?

Answer & Explanation

Mckenna Friedman

Beginner · 2022-09-08 · Added 10 answers

Since your regression model has an intercept, we can assume the design matrix X for the regression has the form

X = [ 1  x_1^T
      1  x_2^T
      ⋮   ⋮
      1  x_n^T ]

Note that the residual must be orthogonal to every vector in the column space of X. This is because the predicted value is Ŷ = P_X Y (where P_X is the orthogonal projection matrix onto the column space of X), and hence the residual vector is e = Y − Ŷ = (I − P_X) Y. So for any vector c of the appropriate dimension,

c^T e = (c^T − c^T P_X) Y.

Now if c lies in the column space of X, then

P_X c = c,

or

c^T = c^T P_X^T = c^T P_X   since P_X is symmetric,

and it follows that for any c in the column space of X,

c^T e = 0.

Now, your condition e_i = a + β x_i means e = a·1 + β·x, so e itself lies in the column space of X and hence must be orthogonal to itself, i.e., e^T e = ‖e‖² = 0, i.e., e = 0.
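The orthogonality fact above can be checked numerically. A minimal sketch with numpy, using made-up example data (the variables x, y, and the coefficients are illustrative assumptions, not from the question): OLS residuals satisfy X^T e = 0, so the projection of e onto the column space of X is zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = rng.normal(size=n)
y = 1.5 + 2.0 * x + rng.normal(size=n)  # hypothetical data

# Design matrix with an intercept column, matching the X in the answer
X = np.column_stack([np.ones(n), x])

# OLS fit via least squares; residual vector e = Y - X beta_hat
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ beta_hat

# Residuals are orthogonal to every column of X: X^T e = 0
print(np.allclose(X.T @ e, 0))  # True

# The orthogonal projection matrix P_X onto col(X); projecting e
# onto col(X) gives the zero vector, so a residual vector that
# lies in col(X) must itself be zero.
P = X @ np.linalg.inv(X.T @ X) @ X.T
print(np.allclose(P @ e, 0))  # True
```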
dalllc

Beginner · 2022-09-09 · Added 1 answer

If our linear model is given by

y_i = w_0 + w_1 x_i + e_i,

we can substitute e_i = β_0 + β_1 x_i to obtain

y_i = w_0 + w_1 x_i + (β_0 + β_1 x_i)
y_i = [w_0 + β_0] + [w_1 + β_1] x_i
y_i = w̃_0 + w̃_1 x_i + ẽ_i,

in which ẽ_i = 0. Hence, since the error is not random but an exact linear function of x_i, the model absorbs it into the intercept and slope, and the residual term is ẽ_i = 0.
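The absorption argument above can also be demonstrated numerically. A small sketch with numpy (the data and the added coefficients 3.0 and 2.0 are hypothetical): adding a linear-in-x term to the response changes only the fitted coefficients, leaving the residuals unchanged, so a residual that is itself linear in x would already have been absorbed.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 40
x = rng.uniform(-1, 1, size=n)
y = 0.5 - 1.0 * x + rng.normal(scale=0.3, size=n)  # hypothetical data
X = np.column_stack([np.ones(n), x])

def ols_residuals(y):
    """Residuals from an OLS fit of y on an intercept and x."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

e = ols_residuals(y)

# Add a linear-in-x term (playing the role of beta_0 + beta_1 x_i)
# to the response: the intercept and slope absorb it, and the
# residuals are identical to before.
e2 = ols_residuals(y + (3.0 + 2.0 * x))
print(np.allclose(e, e2))  # True
```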
