A limit theory is developed for mildly explosive autoregression under both weakly and strongly dependent innovation errors. We find that the asymptotic behaviour of the sample moments is affected by the memory of the innovation process, both in the form of the limiting distribution and, in the case of long range dependence, in the rate of convergence. However, this effect is not present in the least squares regression theory, as it is cancelled out by the interaction between the sample moments. As a result, the Cauchy regression theory of Phillips and Magdalinos (2007a) is invariant to the dependence structure of the innovation sequence, even in the long memory case.
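For context, a minimal sketch of the mildly explosive autoregressive setup referenced in the abstract; the notation below is assumed for illustration rather than taken from the paper itself:

\[
  X_t = \rho_n X_{t-1} + u_t, \qquad \rho_n = 1 + \frac{c}{k_n}, \quad c > 0,
\]

where $k_n \to \infty$ with $k_n/n \to 0$, so the root approaches unity more slowly than in local-to-unity asymptotics. In the weakly dependent case, the Cauchy regression theory of Phillips and Magdalinos (2007a) gives, for the least squares estimator $\hat{\rho}_n$,

\[
  \frac{k_n \, \rho_n^{\,n}}{2c}\left(\hat{\rho}_n - \rho_n\right) \;\Rightarrow\; \mathcal{C},
\]

with $\mathcal{C}$ a standard Cauchy variate. The abstract's contribution is that this limit remains unchanged under strongly dependent (long memory) innovations, because the memory-induced distortions in the individual sample moments cancel in the least squares ratio.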
Tassos Magdalinos