In this paper we consider estimating the timing of a break in level and/or trend when the order of integration and autocorrelation properties of the data are unknown. For stationary innovations, break point estimation is commonly performed by minimizing the sum of squared residuals across all candidate break points, using a regression of the levels of the series on the assumed deterministic components. For unit root processes, the obvious modification is to use a first differenced version of the regression, while a further alternative in a stationary autoregressive setting is to consider a GLS-type quasi-differenced regression. Given uncertainty over which of these approaches to adopt in practice, we develop a hybrid break fraction estimator that selects from the levels-based estimator, the first-difference-based estimator, and a range of quasi-difference-based estimators, according to which achieves the global minimum sum of squared residuals. We establish the asymptotic properties of the estimators considered, and compare their performance in practically relevant sample sizes using simulation. We find that the new hybrid estimator has desirable asymptotic properties and performs very well in finite samples, providing a reliable approach to break date estimation without requiring decisions to be made regarding the autocorrelation properties of the data.
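A minimal sketch of the general idea described above may help fix intuition: for each candidate break date, regress the (possibly quasi-differenced) series on quasi-differenced deterministic regressors and record the sum of squared residuals (SSR), then select the break date attaining the smallest SSR; a grid of quasi-differencing parameters spans the levels case (rho = 0) and the first-difference case (rho = 1). This is an illustrative simplification under assumed notation, not the estimator defined in the paper; the function names (ssr_break_search, hybrid_break_search), the trimming fraction, and the rho grid are all assumptions made for the example.

```python
import numpy as np

def qd(x, rho):
    """GLS-type quasi-difference: keep x_1, then form x_t - rho*x_{t-1}."""
    x = np.asarray(x, dtype=float)
    return np.concatenate(([x[0]], x[1:] - rho * x[:-1]))

def ssr_break_search(y, rho=0.0, trim=0.15):
    """Minimise the SSR over candidate break dates in a regression of the
    quasi-differenced y on quasi-differenced (constant, trend, level-shift,
    trend-shift) regressors. rho=0 gives the levels (OLS) regression,
    rho=1 the first-difference regression."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    t = np.arange(1, T + 1, dtype=float)
    lo, hi = int(np.floor(trim * T)), int(np.ceil((1 - trim) * T))
    yq = qd(y, rho)
    best_tb, best_ssr = None, np.inf
    for tb in range(lo, hi):
        DU = (t > tb).astype(float)           # level-shift dummy
        DT = np.where(t > tb, t - tb, 0.0)    # broken-trend dummy
        X = np.column_stack([np.ones(T), t, DU, DT])
        Xq = np.column_stack([qd(col, rho) for col in X.T])
        beta, *_ = np.linalg.lstsq(Xq, yq, rcond=None)
        resid = yq - Xq @ beta
        ssr = float(resid @ resid)
        if ssr < best_ssr:
            best_tb, best_ssr = tb, ssr
    return best_tb, best_ssr

def hybrid_break_search(y, rhos=(0.0, 0.5, 0.8, 0.95, 1.0), trim=0.15):
    """Toy hybrid rule: run the search over a grid of quasi-differencing
    parameters and keep the break date with the smallest SSR overall."""
    results = [(rho, *ssr_break_search(y, rho, trim)) for rho in rhos]
    best_rho, best_tb, best_ssr = min(results, key=lambda r: r[2])
    return best_tb, best_rho, best_ssr

# Example: a unit root series with a trend break two-thirds of the way through.
rng = np.random.default_rng(0)
T = 200
u = np.cumsum(rng.standard_normal(T))                     # I(1) errors
tt = np.arange(1, T + 1)
y = 0.2 * tt + 0.5 * np.where(tt > 133, tt - 133, 0.0) + u
print(hybrid_break_search(y))
```

In practice the choice of trimming, the rho grid, and how SSRs are compared across transformations all matter for the properties the paper studies; the sketch simply picks the global minimum, mirroring the selection rule described in the abstract.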
David I. Harvey and Stephen J. Leybourne
School of Economics, University of Nottingham, University Park, Nottingham, NG7 2RD