Title: Selecting the number of maximum autocorrelation factors
Authors: Souveek Halder - Australian National University (Australia) [presenting]
Abstract: Dimension reduction is a popular technique in statistics, used to transform high-dimensional data into a low-dimensional space. Principal Component Analysis (PCA) is one of the classical methods for dimension reduction. However, a major drawback of PCA is that it only provides the best linear approximation to a high-dimensional dataset. Maximum Autocorrelation Factors (MAF) was developed as an alternative to PCA by Switzer and Green in the 1980s. A significant challenge in using MAF for dimension reduction is determining how many factors to retain or, in other words, how much to reduce the dimension. In most cases the choice is made using an ad hoc method such as an autocorrelation scree plot or a bootstrap-based technique. However, these methods cannot be considered best in all situations. Thus, our focus is on tests that deal with eigenvalues, since computing MAF is equivalent to solving an eigenvalue/eigenfunction problem. We evaluate the performance of the Conditional Singular Value (CSV) test, a signal-to-noise-ratio test, the Pseudorank method, and a likelihood ratio test.
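Since the abstract notes that computing MAF reduces to an eigenvalue problem, a minimal sketch may help fix ideas. This is an illustrative implementation, not the authors' code: it assumes the common formulation in which MAF solves the generalized eigenproblem Sigma_d a = lambda Sigma a, where Sigma is the covariance of the series and Sigma_d the covariance of its lag-1 differences, and the function name `maf` is our own.

```python
import numpy as np
from scipy.linalg import eigh

def maf(X):
    """Maximum Autocorrelation Factors for a (T, p) multivariate series X.

    Solves the generalized eigenproblem  Sigma_d a = lambda Sigma a,
    where Sigma is the covariance of X and Sigma_d the covariance of
    the lag-1 differences. Small eigenvalues correspond to factors
    with high autocorrelation, so columns of the returned factor
    matrix are ordered from most to least autocorrelated.
    """
    Xc = X - X.mean(axis=0)                    # center the series
    Sigma = np.cov(Xc, rowvar=False)           # covariance of the data
    D = np.diff(Xc, axis=0)                    # lag-1 differences
    Sigma_d = np.cov(D, rowvar=False)          # covariance of differences
    # eigh returns eigenvalues in ascending order, so the first
    # eigenvector gives the smoothest (most autocorrelated) factor.
    eigvals, eigvecs = eigh(Sigma_d, Sigma)
    factors = Xc @ eigvecs
    return factors, eigvals
```

The eigenvalue spectrum returned here is exactly what the selection methods in the abstract (scree plot, CSV, pseudorank, likelihood ratio) would operate on when deciding how many factors to retain.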