g2esebyy7

Answered question

2022-04-25

Simple Linear Regression - Difference between predicting and estimating?
Here is what my notes say about estimation and prediction:
Estimating the conditional mean
"We need to estimate the conditional mean $\beta_0 + \beta_1 x_0$ at a value $x_0$, so we use $\hat{Y}_0 = \hat{\beta}_0 + \hat{\beta}_1 x_0$ as a natural estimator." Here we get
$$\hat{Y}_0 \sim N\bigl(\beta_0 + \beta_1 x_0,\ \sigma^2 h_{00}\bigr), \qquad \text{where}\quad h_{00} = \frac{1}{n} + \frac{(x_0 - \bar{x})^2}{(n-1)s_x^2},$$
with a confidence interval for $E(Y_0) = \beta_0 + \beta_1 x_0$ given by
$$\bigl(\hat{b}_0 + \hat{b}_1 x_0 - c\,s\sqrt{h_{00}},\ \ \hat{b}_0 + \hat{b}_1 x_0 + c\,s\sqrt{h_{00}}\bigr),$$
where $c = t_{n-2,\,1-\alpha/2}$. These results follow from the shape of the distribution of $\hat{Y}_0$ together with $E(\hat{Y}_0)$ and $\operatorname{var}(\hat{Y}_0)$.
Predicting observations
"We want to predict the observation $Y_0 = \beta_0 + \beta_1 x_0 + \epsilon_0$ at a value $x_0$." Here
$$E(\hat{Y}_0 - Y_0) = 0 \quad \text{and} \quad \operatorname{var}(\hat{Y}_0 - Y_0) = \sigma^2(1 + h_{00}).$$
Hence a prediction interval is of the form
$$\bigl(\hat{b}_0 + \hat{b}_1 x_0 - c\,s\sqrt{1 + h_{00}},\ \ \hat{b}_0 + \hat{b}_1 x_0 + c\,s\sqrt{1 + h_{00}}\bigr).$$
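To make the two intervals concrete, here is a small pure-Python sketch (toy data and values chosen for illustration, not from the original post) that computes $h_{00}$, the confidence interval for the conditional mean, and the prediction interval at the same $x_0$; the critical value $c = 2.262$ is the standard tabulated $t_{9,\,0.975}$ for $n = 11$.

```python
import math

# Toy data (hypothetical): n = 11 points roughly on the line y = 2 + 0.5x
x = [float(i) for i in range(11)]
y = [2.0 + 0.5 * xi + ((-1) ** i) * 0.3 for i, xi in enumerate(x)]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)          # equals (n-1) * s_x^2

# Least-squares estimates b1, b0
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

# Residual standard error s on n-2 degrees of freedom
resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
s = math.sqrt(sum(e ** 2 for e in resid) / (n - 2))

x0 = 5.0
h00 = 1 / n + (x0 - xbar) ** 2 / sxx             # leverage term at x0
yhat0 = b0 + b1 * x0

c = 2.262  # t_{n-2, 1-alpha/2} = t_{9, 0.975} for a 95% two-sided interval
ci = (yhat0 - c * s * math.sqrt(h00),     yhat0 + c * s * math.sqrt(h00))
pi = (yhat0 - c * s * math.sqrt(1 + h00), yhat0 + c * s * math.sqrt(1 + h00))
```

Because $\sqrt{1 + h_{00}} > \sqrt{h_{00}}$, the prediction interval `pi` always strictly contains the confidence interval `ci` at the same $x_0$.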

Answer & Explanation

Colonninisxi

Beginner · 2022-04-26 · Added 16 answers

You have to distinguish between estimating $E[Y \mid X = x_0]$ and predicting $Y(x_0)$.
For the former you have $\widehat{E[Y \mid X = x_0]} = \hat{\beta}_0 + \hat{\beta}_1 x_0$,
so that
$$\operatorname{Var}\bigl(\widehat{E[Y \mid X = x_0]}\bigr) = \sigma^2\left[\frac{1}{n} + \frac{(x_0 - \bar{x})^2}{(n-1)s_x^2}\right] = \sigma^2 h_{00},$$
while for the latter you have
$$\operatorname{Var}\bigl(\hat{Y}(x_0)\bigr) = \operatorname{Var}\bigl(\widehat{E[Y \mid X = x_0]}\bigr) + \operatorname{Var}(\epsilon_0) = \sigma^2(1 + h_{00}).$$
Namely, in the former case you are estimating the conditional mean of $Y$ at $x_0$, while in the latter you are predicting the value itself. The conditional mean "smooths out" the variance of the error term, since $E[Y \mid X = x] = \beta_0 + \beta_1 x$; predicting a new value $Y(x_0)$, however, must also account for the (estimated) variance of the error term, to capture the fluctuation around the conditional mean.
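The variance decomposition above can be checked empirically. The following sketch (hypothetical true parameters, chosen for illustration) repeatedly simulates a dataset, fits the line, and records the prediction error $\hat{Y}_0 - Y_0$ against a freshly drawn observation at $x_0$; its sample variance should come out close to $\sigma^2(1 + h_{00})$, not $\sigma^2 h_{00}$.

```python
import math
import random

random.seed(0)
beta0, beta1, sigma = 1.0, 2.0, 1.0              # hypothetical true parameters
x = [float(i) for i in range(10)]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
x0 = 3.0
h00 = 1 / n + (x0 - xbar) ** 2 / sxx

errs = []
for _ in range(5000):
    # Simulate one dataset from the true model and fit by least squares
    y = [beta0 + beta1 * xi + random.gauss(0, sigma) for xi in x]
    ybar = sum(y) / n
    b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    b0 = ybar - b1 * xbar
    yhat0 = b0 + b1 * x0                          # estimated conditional mean at x0
    # Draw a genuinely new observation at x0 and record the prediction error
    y0_new = beta0 + beta1 * x0 + random.gauss(0, sigma)
    errs.append(yhat0 - y0_new)

mean_err = sum(errs) / len(errs)
var_err = sum((e - mean_err) ** 2 for e in errs) / (len(errs) - 1)
# mean_err should be near 0; var_err near sigma^2 * (1 + h00)
```

The extra "$1$" in $\sigma^2(1 + h_{00})$ is exactly the variance of the new error term $\epsilon_0$, which the conditional-mean estimate does not carry.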
