Title: A geometrical framework for covariance matrices and covariance operators in machine learning and applications
Authors: Minh Ha Quang - RIKEN (Japan) [presenting]
Abstract: Symmetric positive definite (SPD) matrices, in particular covariance matrices, play important roles in many areas of mathematics and statistics, with numerous applications in various fields, including machine learning, brain imaging, and computer vision. The set of SPD matrices is not a subspace of Euclidean space, and consequently algorithms utilizing the Euclidean metric tend to be suboptimal in practice. Much recent research has therefore focused on exploiting the intrinsic geometrical structures of SPD matrices, in particular the view of this set as a Riemannian manifold. We will present a survey of some of the recent developments in the generalization of the geometrical structures of finite-dimensional covariance matrices to those of infinite-dimensional covariance operators. Computationally, we focus on covariance operators in Reproducing Kernel Hilbert Spaces (RKHS). This direction exploits the power of kernel methods from machine learning in a geometrical framework, both mathematically and algorithmically. The theoretical formulation will be illustrated with applications in computer vision, which demonstrate both the power of kernel covariance operators and that of the algorithms based on their intrinsic geometry.
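To illustrate the kind of intrinsic geometry the abstract refers to, here is a minimal sketch comparing the Euclidean (Frobenius) distance with the Log-Euclidean Riemannian distance between two covariance matrices. This is an illustrative example, not the specific framework of the talk; the Log-Euclidean metric is one standard choice of Riemannian structure on SPD matrices, and the function names and the small regularization term are this sketch's own conventions.

```python
import numpy as np

def spd_log(M):
    # Matrix logarithm of a symmetric positive definite matrix
    # via its eigendecomposition: log(M) = V diag(log w) V^T.
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def euclidean_distance(A, B):
    # Flat (Frobenius) distance, ignoring the manifold structure.
    return np.linalg.norm(A - B, ord="fro")

def log_euclidean_distance(A, B):
    # Log-Euclidean Riemannian distance:
    # d(A, B) = || log(A) - log(B) ||_F.
    return np.linalg.norm(spd_log(A) - spd_log(B), ord="fro")

# Two sample covariance matrices, slightly regularized so they
# are strictly positive definite.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
Y = 2.0 * rng.standard_normal((100, 3))
A = np.cov(X, rowvar=False) + 1e-6 * np.eye(3)
B = np.cov(Y, rowvar=False) + 1e-6 * np.eye(3)

print("Euclidean distance:    ", euclidean_distance(A, B))
print("Log-Euclidean distance:", log_euclidean_distance(A, B))
```

The two distances generally disagree: the Log-Euclidean distance respects the curved geometry of the SPD cone (for example, it blows up as a matrix approaches singularity, where the Euclidean distance stays finite), which is one reason manifold-aware algorithms outperform their flat Euclidean counterparts on covariance data.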