Title: Moment distances for comparing high-entropy distributions with application in domain adaptation
Authors: Werner Zellinger - Johannes Kepler University Linz (Austria) [presenting]
Bernhard Moser - Software Competence Center Hagenberg (Austria)
Michael Zwick - Software Competence Center Hagenberg (Austria)
Edwin Lughofer - Johannes Kepler University Linz (Austria)
Thomas Natschlaeger - Software Competence Center Hagenberg (Austria)
Susanne Saminger-Platz - Johannes Kepler University Linz (Austria)
Abstract: Given two samples, the goal is to enforce similarity between the distributions of the sample representations in the latent space of a discriminative model. Standard approaches are based on minimizing probability metrics, e.g. the Wasserstein metric, the Maximum Mean Discrepancy, or f-divergences. However, moment distances that do not satisfy the identity of indiscernibles, i.e. pseudo-metrics, have also performed well in many practical tasks. The $L^1$-distance between two distributions that have finitely many moments in common can nevertheless be very large. The question is thus under which constraints on the distributions small values of moment distances imply distribution similarity. In this work we focus on distributions with approximately maximal differential entropy and finitely many constrained moments. For sequences of distributions in this class, fast $L^1$-convergence rates are obtained by means of moment distances. A new generalization error bound is proposed for unsupervised moment-based domain adaptation.
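To illustrate the kind of moment distance discussed above, the following is a minimal sketch of a moment-based pseudo-metric between two one-dimensional samples: the sum of absolute differences of the means and of the central moments up to a fixed order $K$. The function names and the choice of $K$ are illustrative assumptions, not the exact distance proposed in the work; note that two different distributions sharing their first $K$ moments yield distance zero, which is why such a distance is only a pseudo-metric.

```python
def central_moment(xs, k):
    """k-th central moment of a sample (plain Python, no dependencies)."""
    mu = sum(xs) / len(xs)
    return sum((x - mu) ** k for x in xs) / len(xs)

def moment_distance(xs, ys, K=3):
    """Illustrative moment-based pseudo-metric: compare sample means,
    then central moments of orders 2..K (K is an assumed default)."""
    d = abs(sum(xs) / len(xs) - sum(ys) / len(ys))
    for k in range(2, K + 1):
        d += abs(central_moment(xs, k) - central_moment(ys, k))
    return d
```

A sample compared with itself gives distance zero, and the distance is symmetric; minimizing such a quantity over latent representations is the basic mechanism behind moment-based domain adaptation.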