Suppose time-series data has been generated according to the following process:
y_{t} = α + Φy_{t-1} + u_{t},  u_{t} = ε_{t} + ρε_{t-1},
where ε_{t} is independent white noise. Our main interest is consistent estimation of Φ from realizations of y_{t}.
1) Provide conditions for this process to be stationary.
2) From now on, assume the process is stationary. Will OLS generally provide consistent point estimates of Φ? Can you give conditions under which it will? Provide the asymptotic distribution of OLS under these assumptions.
3) ARMA processes are generally estimated by maximum likelihood (ML). Do you have enough information to set up the (marginal or conditional) likelihood function?
4) Show that
E[y_{t-j}(y_{t} - α - Φy_{t-1})] = 0, j ≥ 2,
holds.
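A sketch of the argument (assuming stationarity, so that y_{t-j} is a function of current and past shocks only):

```latex
E\bigl[y_{t-j}\,(y_{t} - \alpha - \Phi y_{t-1})\bigr]
  = E[y_{t-j}\,u_{t}]
  = E\bigl[y_{t-j}\,(\varepsilon_{t} + \rho\,\varepsilon_{t-1})\bigr] = 0,
  \qquad j \ge 2,
```

because under stationarity y_{t-j} depends only on {ε_{s} : s ≤ t-j}, and for j ≥ 2 both ε_{t} and ε_{t-1} are independent of that set and have mean zero.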
5) Use the moment conditions from problem 4 to derive a consistent GMM estimator of (α, Φ). Does it require knowledge of the distribution of ε_{t}? Note that we discussed asymptotic properties of GMM estimators assuming a fixed number of moment conditions.
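One concrete construction (a sketch, not the only option: it uses just the j = 2 condition together with a constant, which exactly identifies (α, Φ)) is to solve the sample moment conditions

```latex
\frac{1}{T}\sum_{t} \begin{pmatrix} 1 \\ y_{t-2} \end{pmatrix}
  \bigl(y_{t} - \alpha - \Phi y_{t-1}\bigr) = 0 .
```

This is a linear IV estimator with instrument y_{t-2}; it relies only on the white-noise property of ε_{t} (zero mean, finite variance, no serial correlation), not on its distribution.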
6) Given your answer to the previous problem, derive a lower bound on the variance of your estimator.
7) Suppose that you are uncertain about the model specification above. Moreover, you fear that u_{t} might have the structure
u_{t} = ε_{t} + ρε_{t-1} + δε_{t-2},
which is an MA(2) if δ ≠ 0. Can your estimation framework from above (problems 5 and 6) be used to test the null hypothesis that δ = 0? If yes, show how.
8) To convince yourself that your estimator is consistent, try it on simulated data (use Matlab or R, say). You can experiment with the number of moments to investigate the effect on the bias and the coverage of confidence intervals.
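The simulation exercise in problem 8 can be sketched as follows. This is a minimal Python sketch rather than Matlab/R; the parameter values (α = 0.5, Φ = 0.7, ρ = 0.4) and the choice of the single instrument y_{t-2} are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameter choices (assumptions, not given in the problem):
alpha, phi, rho = 0.5, 0.7, 0.4
T, burn = 5000, 500

# Simulate y_t = alpha + phi*y_{t-1} + u_t with MA(1) errors
# u_t = eps_t + rho*eps_{t-1}, eps_t standard normal white noise.
eps = rng.standard_normal(T + burn + 1)
u = eps[1:] + rho * eps[:-1]
y = np.zeros(T + burn)
for t in range(1, T + burn):
    y[t] = alpha + phi * y[t - 1] + u[t]
y = y[burn:]                                      # discard burn-in

# Just-identified GMM (linear IV) with instrument z_t = (1, y_{t-2})':
# solves (1/T) * sum_t z_t * (y_t - alpha - phi*y_{t-1}) = 0.
Y = y[2:]                                         # y_t
X = np.column_stack([np.ones(len(Y)), y[1:-1]])   # regressors (1, y_{t-1})
Z = np.column_stack([np.ones(len(Y)), y[:-2]])    # instruments (1, y_{t-2})
est = np.linalg.solve(Z.T @ X, Z.T @ Y)
print("GMM/IV:", est)    # should be close to (alpha, phi)

# OLS for comparison: inconsistent here, since y_{t-1} is
# correlated with the eps_{t-1} component of u_t.
ols = np.linalg.solve(X.T @ X, X.T @ Y)
print("OLS:   ", ols)    # phi estimate biased upward for rho > 0
```

Adding further lags y_{t-3}, y_{t-4}, … as instruments makes the system over-identified, in which case a weighting matrix (e.g. two-step GMM with a HAC variance estimate) is needed; varying the number of these moments is one way to carry out the bias/coverage experiment the problem asks for.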