Title: Robust inference in mixture models via maximum mean discrepancy relaxations
Authors: Yordan Raykov - University of Nottingham (United Kingdom) [presenting]
Abstract: Bayesian inference delivers principled rules for learning from data and integrating out uncertainty. As data grows, the posterior concentrates around the likelihood, and the robustness properties embedded in the prior diminish. This problem is often approached through model-agnostic robust inference frameworks, such as generalised Bayesian inference. We propose a model-specific approach for inference of mixture-type densities that is robust to any potential likelihood misspecification; we call this the neighbourhood mixture model. We propose using the maximum mean discrepancy (MMD) to relax the assumptions on the component parameter(s) from being points to neighbourhoods. The proposed framework is shown to lead to superior maximum a posteriori point estimates across many practical tasks, such as clustering in the presence of model misspecification, learning the number of mixture components, and clustering single-cell data in the presence of varying sequencing depth.
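The abstract gives no formulas, so as a point of reference only, here is a minimal NumPy sketch of the (biased) empirical squared-MMD estimate between two samples with an RBF kernel; the kernel choice and bandwidth are assumptions for illustration, not details of the proposed neighbourhood mixture model:

```python
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    # Pairwise RBF kernel matrix: k(x, y) = exp(-||x - y||^2 / (2 * h^2))
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of squared MMD between samples x and y:
    # MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]
    kxx = rbf_kernel(x, x, bandwidth).mean()
    kyy = rbf_kernel(y, y, bandwidth).mean()
    kxy = rbf_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

# Samples from the same distribution yield a small MMD; samples from
# well-separated distributions yield a larger one.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(200, 2))
y = rng.normal(0.0, 1.0, size=(200, 2))   # same distribution as x
z = rng.normal(3.0, 1.0, size=(200, 2))   # shifted distribution
print(mmd2(x, y), mmd2(x, z))
```

A discrepancy of this kind is what allows a "point" parameter assumption to be relaxed to a neighbourhood: distributions within a small MMD ball of the assumed component are treated as acceptable, which is the intuition the abstract appeals to.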