B1441
Title: Sequential inference for the Bayesian Mallows model
Authors: Anja Stein - Lancaster University (United Kingdom) [presenting]
David Leslie - Lancaster University (United Kingdom)
Arnoldo Frigessi - University of Oslo (Norway)
Abstract: Recently, a Bayesian inference approach has been developed for the Mallows model, a probability distribution over ranking data. The framework currently uses MCMC methods to learn and sample from the posterior distribution of the model. However, MCMC is computationally costly when data arrive sequentially, a scenario that arises in many settings, including internet applications where individuals express preferences over items and we wish to infer a consensus ranking to inform what to display to future customers, who, in turn, express their own preferences. With MCMC, each time new data arrive, the full MCMC must be re-run. Instead, we develop a Sequential Monte Carlo (SMC) method to sequentially update our inference under the Bayesian Mallows model. This allows us to fit the Bayesian Mallows model efficiently in scenarios where new observations are received over time, both as rankings from previously unobserved individuals and as updated rankings from existing individuals who have previously provided a (partial) ranking. We provide comparison results between the MCMC and SMC approaches for all scenarios considered, using existing MCMC and new SMC code in the BayesMallows R package.
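The core idea of the sequential update can be illustrated with a minimal sketch. The following Python fragment (not the authors' implementation, which lives in the BayesMallows R package; all names here are hypothetical) shows the SMC reweighting step for a Mallows model with fixed dispersion alpha and particles over the consensus ranking rho: when a new ranking arrives, each particle's weight is multiplied by the Mallows likelihood of that observation, and the normalising constant Z(alpha) cancels because it does not depend on rho. For clarity the particle set enumerates all permutations of four items; a real SMC sampler would use sampled particles plus resampling and move steps.

```python
import math
import random
from itertools import permutations

def kendall(r1, r2):
    """Kendall tau distance: count of item pairs ordered differently."""
    pos2 = {item: i for i, item in enumerate(r2)}
    d = 0
    for i in range(len(r1)):
        for j in range(i + 1, len(r1)):
            if pos2[r1[i]] > pos2[r1[j]]:
                d += 1
    return d

def reweight(particles, weights, obs, alpha):
    """SMC reweighting: multiply each particle's weight by the Mallows
    likelihood exp(-alpha * d(obs, rho)) of the newly arrived ranking.
    Z(alpha) is common to all particles, so it cancels on renormalisation."""
    new = [w * math.exp(-alpha * kendall(obs, rho))
           for rho, w in zip(particles, weights)]
    total = sum(new)
    return [w / total for w in new]

def ess(weights):
    """Effective sample size; a full SMC resamples when this drops."""
    return 1.0 / sum(w * w for w in weights)

def resample(particles, weights, rng):
    """Multinomial resampling; a full SMC would follow this with an
    MCMC move step to rejuvenate the duplicated particles."""
    return rng.choices(particles, weights=weights, k=len(particles))

# Toy run: exhaustive particles over rankings of 4 items (24 permutations).
items = (0, 1, 2, 3)
particles = list(permutations(items))
weights = [1.0 / len(particles)] * len(particles)
truth = (0, 1, 2, 3)

# Five new rankings arrive one at a time; each triggers only a cheap
# reweight, rather than a full re-run of MCMC on all data so far.
for _ in range(5):
    weights = reweight(particles, weights, truth, alpha=1.0)

# Posterior mode over the consensus ranking.
map_rho = particles[max(range(len(weights)), key=lambda i: weights[i])]
```

After the five updates the weights concentrate on the observed ranking, so `map_rho` recovers it; the ESS falls well below the particle count, which is the point at which `resample` (and a move step) would be invoked in a complete sampler.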