Title: Learning and prediction via hierarchies of random measures in Bayesian nonparametrics
Authors: Igor Pruenster - Bocconi University (Italy) [presenting]
Abstract: The Hierarchical Dirichlet Process enjoyed huge success in the Machine Learning literature. It originated in the context of topic modeling as a powerful generalization of the ubiquitous Latent Dirichlet Allocation model that allows learning the number of topics from the data. The Hierarchical Dirichlet Process and more general Bayesian nonparametric models constructed as hierarchies of random probability measures can be naturally embedded within the framework of partial exchangeability, a neat probabilistic representation for multiple distinct yet related populations. Moreover, the discrete nature of these models yields ties across populations, resulting in a shrinkage property often described as "sharing of information" and crucial for the learning mechanism. We obtain distributional results, which include a characterization of the induced random partitions and a complete posterior representation: these are key to the derivation of effective sampling schemes. Further interesting extensions deal with tree structures and hierarchies of random measures. Illustrations concerning species sampling, survival analysis, network theory and topic modeling are provided.
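The sharing of clusters across populations described above is often illustrated via the Chinese Restaurant Franchise, the predictive scheme induced by the Hierarchical Dirichlet Process. Below is a minimal, hedged sketch of that scheme in Python; it is not the authors' derivation, just a standard simulation under assumed concentration parameters `gamma` (top level) and `alpha0` (group level), with hypothetical defaults.

```python
import random

def crf_sample(num_groups=3, n_per_group=50, gamma=1.0, alpha0=1.0, seed=0):
    """Simulate the Chinese Restaurant Franchise: each group (restaurant)
    seats customers at tables, and tables draw dishes from a shared
    top-level Chinese Restaurant Process. Because dishes (global atoms)
    are common to all restaurants, clusters are shared across groups --
    the "sharing of information" / shrinkage effect.
    Parameters gamma and alpha0 are illustrative concentration values."""
    rng = random.Random(seed)
    dish_counts = []            # number of tables serving each global dish
    group_dishes = []           # per group: the dish index of each customer
    for g in range(num_groups):
        tables = []             # customer count at each table in this group
        table_dish = []         # dish served at each table
        assignments = []
        for n in range(n_per_group):
            # sit at an existing table with prob. proportional to its size,
            # or open a new table with prob. proportional to alpha0
            weights = tables + [alpha0]
            t = rng.choices(range(len(weights)), weights=weights)[0]
            if t == len(tables):
                # new table: draw its dish from the top-level CRP,
                # choosing an existing dish proportionally to the number
                # of tables serving it, or a new dish proportionally to gamma
                dweights = dish_counts + [gamma]
                d = rng.choices(range(len(dweights)), weights=dweights)[0]
                if d == len(dish_counts):
                    dish_counts.append(0)
                dish_counts[d] += 1
                tables.append(0)
                table_dish.append(d)
            tables[t] += 1
            assignments.append(table_dish[t])
        group_dishes.append(assignments)
    return group_dishes, len(dish_counts)
```

Running `crf_sample()` returns, for each population, the sequence of global cluster (dish) labels plus the total number of clusters discovered; the same labels typically recur across groups, which is precisely the tie structure the abstract refers to.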