Title: Asymptotic normality of the maximum likelihood estimator for the latent block model
Authors: Vincent Brault - AgroParisTech (France)
Christine Keribin - INRIA - Paris-Saclay University (France) [presenting]
Mahendra Mariadassou - INRA (France)
Abstract: The Latent Block Model (LBM) is a probabilistic method based on a mixture model to simultaneously cluster the $d$ columns and $n$ rows of a data matrix. Maximum likelihood parameter estimation in the LBM is a difficult and multifaceted problem, as neither the likelihood nor the expectation of the conditional likelihood is numerically tractable. The standard EM algorithm must therefore be adapted, and various estimation strategies have been proposed and are now well understood empirically. So far, however, theoretical guarantees about their asymptotic behaviour remain sparse. We show here that, under some mild conditions on the parameter space, and in an asymptotic regime where $\log(d)/n$ and $\log(n)/d$ go to $0$ as $n$ and $d$ go to $+\infty$, (1) the maximum likelihood estimate of the complete model (with known labels) is consistent and (2) the log-likelihood ratios are equivalent under the complete and observed (with unknown labels) models. This equivalence allows us to transfer the asymptotic consistency (1) to the maximum likelihood estimate under the observed model.
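As an illustration of the model the abstract describes, the following sketch simulates binary data from a Bernoulli LBM and computes the block-wise MLE of the complete model (with known labels), which result (1) shows to be consistent. All parameter values (`pi`, `rho`, `alpha`, and the matrix dimensions) are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical LBM parameters: 2 row clusters, 3 column clusters.
n, d = 100, 80
pi = np.array([0.4, 0.6])            # row-cluster proportions
rho = np.array([0.3, 0.3, 0.4])      # column-cluster proportions
alpha = np.array([[0.9, 0.1, 0.5],
                  [0.2, 0.8, 0.3]])  # Bernoulli mean of each (row, column) block

z = rng.choice(len(pi), size=n, p=pi)    # latent row labels
w = rng.choice(len(rho), size=d, p=rho)  # latent column labels
X = rng.binomial(1, alpha[z][:, w])      # observed n x d binary data matrix

# Under the complete model (labels z, w known), the MLE of alpha is
# simply the empirical mean of each block.
alpha_hat = np.array([[X[z == k][:, w == l].mean()
                       for l in range(len(rho))]
                      for k in range(len(pi))])
```

In the observed model, `z` and `w` are unknown, which is what makes the likelihood intractable and motivates the equivalence argument (2) in the abstract.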