The last lessons have spent a lot of time describing the slope and intercept terms (and their variances) of the one-variable sample regression function. We also know that for any particular value of the independent variable (call it X0), the predicted value of Y, call it Y0, is Y0 = B0 + B1X0. (This is sometimes called a "point prediction.")
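As a concrete illustration of a point prediction, here is a minimal sketch in Python. The data, the value X0 = 3.5, and all variable names are made up for illustration; they are not part of the assignment.

```python
import numpy as np

# Made-up sample data (purely illustrative).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

xbar, ybar = x.mean(), y.mean()

# OLS estimates of the sample regression function Y = B0 + B1*X.
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)  # slope
b0 = ybar - b1 * xbar                                           # intercept

# Point prediction at a particular value X0 of the independent variable.
x0 = 3.5
y0_hat = b0 + b1 * x0
print(round(y0_hat, 3))
```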
a) Prove that Y0 is an unbiased estimator of E[Y0|X0].
b) Derive the formula for the variance of Y0. Show at least two steps in this derivation.
Hint 1: You are looking for Var(Y0) = Var(B0 + B1X0). This is the variance of a sum of two random variables. What is the general formula for such a sum? (Go back to the week 2 lectures if you need a reminder.) Use that formula now.
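For reference, the week-2 identity the hint points to is the standard variance-of-a-sum formula (this is general probability algebra, not specific to regression):

\[
\operatorname{Var}(aU + bV) = a^2\operatorname{Var}(U) + b^2\operatorname{Var}(V) + 2ab\,\operatorname{Cov}(U, V),
\]

which, applied with U = B0, V = B1, a = 1, and b = X0 (a fixed number, not a random variable), gives

\[
\operatorname{Var}(B_0 + B_1 X_0) = \operatorname{Var}(B_0) + X_0^2\,\operatorname{Var}(B_1) + 2X_0\,\operatorname{Cov}(B_0, B_1).
\]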
Hint 2: If you did hint 1 correctly, you will see that you need the formula for cov(B0, B1). Take it on faith that this can be shown to be cov(B0, B1) = -Xbar var(B1). (You might find it interesting that, when Xbar is positive, the two estimators are negatively correlated: a steeper slope tends to imply a lower intercept, and vice versa.)
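If you don't want to take hint 2 entirely on faith, a quick Monte Carlo simulation can make it plausible. This is just a sketch, not part of the assignment: the design points, true coefficients, error standard deviation, and number of replications are all made-up choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 30, 20000
x = np.linspace(1, 10, n)          # fixed design points (made up)
xbar = x.mean()
sxx = np.sum((x - xbar) ** 2)
sigma = 2.0                        # error standard deviation (made up)

# Repeatedly draw new errors, re-fit OLS, and record the estimates.
b0s, b1s = [], []
for _ in range(reps):
    y = 1.0 + 0.5 * x + rng.normal(0, sigma, n)   # true line: Y = 1 + 0.5 X + e
    b1 = np.sum((x - xbar) * (y - y.mean())) / sxx
    b0 = y.mean() - b1 * xbar
    b0s.append(b0)
    b1s.append(b1)

# Compare the simulated cov(B0, B1) against -Xbar * var(B1).
cov_b0b1 = np.cov(b0s, b1s)[0, 1]
var_b1 = np.var(b1s, ddof=1)
print(cov_b0b1, -xbar * var_b1)    # the two numbers should be close
```

Note that the simulated covariance is negative here because Xbar is positive; with design points centered below zero, the sign would flip.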