CFE-CMStatistics 2020
Title: Optimal linear discriminators for the discrete choice model in growing dimensions
Authors: Debarghya Mukherjee - University of Michigan (United States) [presenting]
Moulinath Banerjee - University of Michigan (United States)
Yaacov Ritov - University of Michigan (United States)
Abstract: Manski's celebrated maximum score estimator for the discrete choice model, which is an optimal linear discriminator, has been the focus of much investigation in both the econometrics and statistics literature, yet its behavior when the dimension grows with the sample size remains largely unknown. We address this gap. Two regimes are considered: slow growth, where $p$ grows with $n$ but $p/n \to 0$, and fast growth, where $p \gg n$. In the binary response model, we recast Manski's score estimation as empirical risk minimization for a classification problem. We derive the $\ell_2$ rate of convergence of the score estimator under a transition condition expressed in terms of our margin parameter, which calibrates the difficulty of the estimation problem. We also establish upper and lower bounds on the minimax $\ell_2$ error in the binary choice model that differ by a logarithmic factor, and we construct a minimax-optimal estimator in the slow growth regime. Extensions to the general case, the multinomial response model, are also considered. Finally, we use a variety of learning algorithms to compute the maximum score estimator in growing dimensions.
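To illustrate the classification view of score estimation, the following is a minimal sketch in NumPy: the empirical score counts sign agreements between $x_i^\top \beta$ and the recoded response $2y_i - 1$, and is maximized here by a crude random search over unit-norm directions. The simulation setup, function names, and the random-search optimizer are illustrative assumptions for exposition; they are not the estimators or learning algorithms studied in the paper.

```python
import numpy as np

def manski_score(beta, X, y):
    # Empirical score: average agreement between sign(x_i' beta)
    # and the recoded response 2*y_i - 1 (a classification accuracy).
    return np.mean((2 * y - 1) * np.sign(X @ beta))

def max_score_random_search(X, y, n_draws=5000, seed=0):
    # Illustrative optimizer: score-maximizing direction among random
    # unit-norm candidates (||beta|| = 1 is the usual normalization,
    # since the score is scale-invariant).
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    draws = rng.standard_normal((n_draws, p))
    draws /= np.linalg.norm(draws, axis=1, keepdims=True)
    scores = np.array([manski_score(b, X, y) for b in draws])
    return draws[np.argmax(scores)]

# Simulated binary choice data: y_i = 1{x_i' beta0 + eps_i >= 0}
rng = np.random.default_rng(1)
n, p = 500, 3
beta0 = np.array([1.0, -1.0, 0.5])
beta0 /= np.linalg.norm(beta0)
X = rng.standard_normal((n, p))
y = (X @ beta0 + 0.2 * rng.logistic(size=n) >= 0).astype(int)
beta_hat = max_score_random_search(X, y)
```

Because the score is a step function of $\beta$, gradient methods do not apply directly; random search is used here purely to keep the sketch short, and it scales poorly as $p$ grows, which is precisely the regime the paper targets with other learning algorithms.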