CMStatistics 2018: View Submission
Title: An algebraic estimator for large spectral matrices
Authors: Matteo Barigozzi - Università di Bologna (Italy)
Matteo Farne - University of Bologna (Italy) [presenting]
Abstract: A method is presented to estimate a large $p$-dimensional spectral matrix, assuming that the data follow a dynamic factor model with a sparse residual. Specifically, we apply a nuclear norm plus $l_1$ norm heuristic to any kernel input estimate at each frequency. We assume that the latent eigenvalues scale as $p^{\alpha}$, $\alpha \in [0,1]$, and that the sparsity degree scales as $p^{\delta}$, with $\delta \leq \frac{1}{2}$ and $\delta \leq \alpha$. We prove that the algebraic recovery of the latent rank and the sparsity pattern is guaranteed if the smallest latent eigenvalue $\lambda_r$ and the minimum residual nonzero entry in absolute value, $\min_S$, are large enough across frequencies. The identifiability of the underlying matrix recovery problem requires the absolute convergence of the latent and residual filters, as well as a limited discrepancy across lags among the eigenvectors of the factor coefficient matrices and among the sparsity patterns of the residual coefficient matrices. The consistency of the input is derived via an appropriate weak dependence assumption on both factors and residuals. The recovery quality directly depends on the ratio $\frac{p^{\alpha}}{\sqrt{T}}$, where $T$ is the sample length, and $T$ is required to be of order $p^{3\delta}$ or larger. In a wide simulation study, we stress the crucial role of $\lambda_r$ and $\min_S$ across frequencies, highlighting the conditions under which our method fails.
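To illustrate the kind of decomposition the abstract refers to, the sketch below (Python/NumPy; the alternating scheme, thresholds, and toy data are illustrative assumptions, not the authors' estimator) splits a covariance-type input at a single frequency into a low-rank part, via singular-value thresholding (the proximal operator of the nuclear norm), and a sparse residual, via entrywise soft thresholding (the proximal operator of the $l_1$ norm).

```python
import numpy as np

def svt(M, tau):
    """Singular-value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, rho):
    """Entrywise soft thresholding: proximal operator of the l1 norm."""
    return np.sign(M) * np.maximum(np.abs(M) - rho, 0.0)

def low_rank_plus_sparse(Sigma, tau, rho, n_iter=50):
    """Alternating proximal sketch for Sigma ~ L (low rank) + S (sparse).

    This is a generic heuristic, not the specific algorithm of the abstract:
    tau penalizes singular values of L, rho penalizes entries of S.
    """
    L = np.zeros_like(Sigma)
    S = np.zeros_like(Sigma)
    for _ in range(n_iter):
        L = svt(Sigma - S, tau)   # low-rank update on the current residual
        S = soft(Sigma - L, rho)  # sparse update on what L leaves over
    return L, S

# Toy input: a rank-2 "factor" part plus a sparse (diagonal) residual,
# standing in for a kernel spectral estimate at one frequency.
rng = np.random.default_rng(0)
B = rng.normal(size=(20, 2))
Sigma = B @ B.T + np.diag(rng.uniform(0.5, 1.0, size=20))

L, S = low_rank_plus_sparse(Sigma, tau=2.0, rho=0.05)
```

With the thresholds chosen here, the two large latent eigenvalues survive the singular-value thresholding while the small residual eigenvalues are absorbed into the sparse part, mirroring the role the abstract assigns to $\lambda_r$ and $\min_S$: when those quantities are too small relative to the thresholds, the split between the two components breaks down.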