Title: Inference in time series models using smoothed-clustered standard errors
Authors: Timothy Vogelsang - Michigan State University (United States) [presenting]
Seunghwa Rho - Emory University (United States)
Abstract: A long run variance estimator is proposed for conducting inference in time series regression models that combines the nonparametric kernel approach with a clustering approach. The basic idea is to divide the time periods into non-overlapping clusters. The long run variance estimator is constructed by first aggregating within clusters and then either kernel smoothing across clusters or applying the nonparametric series method to the cluster sums using the Type II discrete cosine transform. We develop asymptotic theory for test statistics based on these ``smoothed-clustered'' long run variance estimators. We derive asymptotic results both holding the number of clusters fixed and treating the number of clusters as increasing with the sample size. For kernel smoothing, these two asymptotic limits differ, whereas for the cosine series approach they coincide. When clustering before kernel smoothing, the ``fixed-number-of-clusters'' asymptotic approximation works well whether the number of clusters is small or large. Finite sample simulations suggest that the naive i.i.d. bootstrap mimics the fixed-number-of-clusters critical values. The simulations also suggest that clustering before kernel smoothing can reduce over-rejections caused by strong serial correlation, at some cost in power. When clustering is natural, it can reduce over-rejection problems and achieve small gains in power for the kernel approach. In contrast, the cosine series approach does not benefit from clustering.
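The ``aggregate within clusters, then smooth across clusters'' construction described in the abstract can be sketched in a few lines. The following is a minimal illustrative sketch, not the authors' implementation: the function names, the choice of the Bartlett kernel, the bandwidth convention (measured in clusters), and the normalizations are all assumptions made for illustration. The first function clusters the score series and kernel smooths the cluster sums; the second applies a Type II DCT basis to the cluster sums in the spirit of the cosine series approach.

```python
import numpy as np

def bartlett(x):
    # Bartlett kernel: 1 - |x| on [-1, 1], zero outside (an illustrative choice)
    return np.maximum(0.0, 1.0 - np.abs(x))

def smoothed_clustered_lrv(v, n_clusters, bandwidth):
    """Sketch of a 'cluster then kernel smooth' long run variance estimator.

    v          : (T,) array of scores (e.g. x_t * u_t from a regression)
    n_clusters : number G of non-overlapping clusters of consecutive periods
    bandwidth  : kernel bandwidth, measured in clusters (assumed convention)
    """
    T = len(v)
    # Step 1: split the T periods into G consecutive, non-overlapping clusters
    clusters = np.array_split(np.asarray(v, dtype=float), n_clusters)
    # Step 2: aggregate (sum) the scores within each cluster
    s = np.array([c.sum() for c in clusters])            # shape (G,)
    # Step 3: kernel smooth across clusters rather than across periods
    G = n_clusters
    lags = np.subtract.outer(np.arange(G), np.arange(G))
    w = bartlett(lags / bandwidth)
    return (s @ w @ s) / T

def cosine_clustered_lrv(v, n_clusters, K):
    """Sketch of the cluster-level cosine series (Type II DCT) variant,
    using K basis functions; normalization here is an assumption."""
    T = len(v)
    G = n_clusters
    clusters = np.array_split(np.asarray(v, dtype=float), G)
    s = np.array([c.sum() for c in clusters])
    g = np.arange(1, G + 1)
    lam = []
    for j in range(1, K + 1):
        # Type II DCT basis evaluated at cluster midpoints
        w_j = np.sqrt(2.0 / G) * np.sum(np.cos(np.pi * j * (g - 0.5) / G) * s)
        lam.append(w_j ** 2)
    # each cluster aggregates T/G periods, so rescale by G/T
    return (G / T) * np.mean(lam)
```

With a bandwidth of at most one cluster, the kernel estimator reduces to the sum of squared cluster sums divided by T, the cluster analogue of a no-smoothing variance estimate; both estimators are nonnegative by construction.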