Title: Convergence of gradient descent method for minimum error entropy principle
Authors: Ting Hu - Wuhan University, The Hong Kong Polytechnic University (China) [presenting]
Abstract: Information-theoretic learning refers to a framework of learning methods that uses entropies and divergences from information theory in place of conventional statistical descriptors such as variance and covariance. It has become an important research topic in signal processing and machine learning, as many algorithms have been developed within this framework and many application domains have been discovered. We study a kernel version of the minimum error entropy method that can be used to find nonlinear structures in data. We show that kernel minimum error entropy can be implemented by kernel-based gradient descent algorithms, with or without regularization.
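The abstract does not spell out the algorithm, but the minimum error entropy (MEE) principle is commonly implemented by maximizing a Parzen-window estimate of the error's information potential. The sketch below, a minimal illustration and not the authors' actual method, performs gradient ascent on that potential for a kernel model f(x) = sum_j alpha_j k(x_j, x); the function name, step size, and kernel widths are assumptions for the example.

```python
import numpy as np

def gaussian(t, sigma):
    # Gaussian Parzen window applied to pairwise error differences
    return np.exp(-t ** 2 / (2 * sigma ** 2))

def kernel_mee_gd(X, y, sigma=0.5, width=0.5, lr=0.2, steps=400, lam=0.0):
    """Illustrative kernel MEE by gradient ascent (not the paper's algorithm).

    Maximizes the empirical information potential
        V(alpha) = (1/n^2) * sum_ij G_sigma(e_i - e_j),  e = y - K @ alpha,
    which is equivalent to minimizing the Parzen estimate of Renyi's
    quadratic entropy of the error. `lam` adds optional shrinkage,
    standing in for the regularized variant mentioned in the abstract.
    """
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2 * width ** 2))          # Gaussian RBF Gram matrix
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(steps):
        e = y - K @ alpha                       # residuals of the kernel model
        d = e[:, None] - e[None, :]             # pairwise error differences
        W = d * gaussian(d, sigma) / sigma ** 2
        grad = (2.0 / n ** 2) * K @ W.sum(axis=1)   # dV/dalpha
        alpha += lr * (grad - lam * alpha)      # ascend V
    # MEE is shift-invariant in the error, so a bias is fixed afterwards
    b = np.mean(y - K @ alpha)
    return alpha, b
```

A typical use is one-dimensional regression, e.g. fitting y = sin(pi*x) on points in [-1, 1]; the learned coefficients define the predictor via the same Gram matrix, plus the bias b.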