Title: Efficient second-order optimization methods for machine learning
Authors: Fred Roosta - University of Queensland (Australia) [presenting]
Michael Mahoney - Stanford (United States)
Abstract: Unlike the scientific computing community, which has wholeheartedly embraced second-order optimization algorithms, the machine learning community has long nurtured a distaste for such methods, favoring first-order alternatives. We argue that this reluctance to employ curvature information can hinder the training procedure in a variety of ways. Specifically, in the context of non-convex machine learning problems, we demonstrate the theoretical properties as well as the empirical performance of a variety of efficient Newton-type algorithms. In the process, we highlight serious disadvantages of first-order methods and, in their light, showcase the practical advantages offered by second-order methods.
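The "efficient Newton-type algorithms" referenced above typically avoid forming the Hessian explicitly, instead solving the Newton system with an iterative solver driven by Hessian-vector products. The following is a minimal illustrative sketch (not the authors' exact method) of one such inexact Newton step, Newton-CG, applied to logistic regression using NumPy; all function names here are hypothetical.

```python
import numpy as np

def loss_grad_hvp(w, X, y):
    """Logistic loss, its gradient, and a Hessian-vector product closure.

    The Hessian H = X^T D X is never formed; only products H v are computed,
    which is the key to making Newton-type methods scale.
    """
    z = X @ w
    p = 1.0 / (1.0 + np.exp(-z))  # sigmoid probabilities
    n = len(y)
    loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
    grad = X.T @ (p - y) / n
    d = p * (1 - p) / n  # diagonal weights of the Gauss-Newton/Hessian matrix
    hvp = lambda v: X.T @ (d * (X @ v))  # matrix-free Hessian-vector product
    return loss, grad, hvp

def newton_cg_step(w, X, y, cg_iters=20, tol=1e-10):
    """One inexact Newton step: solve H p = -g approximately with CG."""
    loss, g, hvp = loss_grad_hvp(w, X, y)
    p = np.zeros_like(w)
    r = -g.copy()       # residual of H p + g = 0
    d = r.copy()        # CG search direction
    rs = r @ r
    for _ in range(cg_iters):
        Hd = hvp(d)
        alpha = rs / (d @ Hd + 1e-12)
        p += alpha * d
        r -= alpha * Hd
        rs_new = r @ r
        if rs_new < tol:
            break
        d = r + (rs_new / rs) * d
        rs = rs_new
    return w + p, loss

# Small synthetic binary classification problem.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = (X @ w_true > 0).astype(float)

w = np.zeros(5)
losses = []
for _ in range(5):
    w, loss = newton_cg_step(w, X, y)
    losses.append(loss)
```

A practical variant would add a line search or trust region to handle non-convexity, and subsample the data in the Hessian-vector product, which is the kind of efficiency the abstract alludes to.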