Title: Mathematical foundations of learning with information theoretic criteria
Authors: Yunlong Feng - The State University of New York at Albany (United States)
Qiang Wu - Middle Tennessee State University (United States) [presenting]
Abstract: Learning with information theoretic criteria, namely the maximum correntropy criterion (MCC) and the minimum error entropy (MEE) criterion, has been shown to be successful in a variety of applications. These criteria are particularly powerful for handling data contaminated by outliers or heavy-tailed noise. To theoretically justify the effectiveness of these two information theoretic criteria, we studied the consistency of MCC- and MEE-based machine learning algorithms. We showed that, with appropriate parameter selection strategies, these algorithms can effectively learn both the mean regression function and the modal regression function. We also proved some no-free-lunch theorems which indicate that, in some scenarios, some of the information contained in the data must be sacrificed in order to achieve high prediction accuracy.
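To make the robustness claim concrete, below is a minimal sketch of MCC-based linear regression, fitted by half-quadratic optimization (iteratively reweighted least squares with Gaussian-kernel weights). The synthetic data, bandwidth `sigma`, and all function names are illustrative assumptions, not taken from the talk; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D data: y = 2x + 0.5 plus small noise, with 10% one-sided
# outliers (illustrative; parameters are assumptions, not from the talk).
n = 200
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + 0.5 + 0.1 * rng.normal(size=n)
out_idx = rng.choice(n, 20, replace=False)
y[out_idx] += 6.0                      # heavy one-sided contamination

X = np.column_stack([x, np.ones(n)])   # design matrix with intercept column


def mcc_fit(X, y, sigma=0.5, iters=50):
    """Linear fit under the maximum correntropy criterion:
    maximize (1/n) * sum_i exp(-r_i^2 / (2 sigma^2)) over residuals r_i,
    via half-quadratic optimization (reweighted least squares)."""
    w = np.linalg.lstsq(X, y, rcond=None)[0]      # warm start at OLS
    for _ in range(iters):
        r = y - X @ w
        wt = np.exp(-r**2 / (2.0 * sigma**2))     # Gaussian kernel weights
        A = X.T @ (wt[:, None] * X)               # weighted normal equations
        b = X.T @ (wt * y)
        w = np.linalg.solve(A, b)
    return w


w_ls = np.linalg.lstsq(X, y, rcond=None)[0]       # ordinary least squares
w_mcc = mcc_fit(X, y)
print("OLS  slope/intercept:", w_ls)
print("MCC  slope/intercept:", w_mcc)
```

The Gaussian kernel assigns near-zero weight to large residuals, so the one-sided outliers barely influence the MCC fit, while the least-squares intercept is visibly biased upward by them; the kernel bandwidth `sigma` plays the role of the parameter whose selection the abstract's consistency results address.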