CMStatistics 2017: View Submission
Title: The Kullback-Leibler divergence in testing multivariate extreme value models
Authors: Sebastian Engelke - Ecole Polytechnique Federale de Lausanne (Switzerland)
Philippe Naveau - CNRS-IPSL (France)
Chen Zhou - Erasmus University Rotterdam (Netherlands) [presenting]
Abstract: Many effects of climate change are reflected in the frequency and severity of extreme events, that is, in the tail of the distribution. Detecting such changes requires a statistical methodology that can test for distributional changes in the largest observations in a sample. We propose a simple, non-parametric test that decides whether two multivariate distributions exhibit the same tail behavior. The test is based on the Kullback-Leibler divergence between exceedances over a high threshold of the two multivariate random vectors. We show that this divergence measure is closely related to the divergence between Bernoulli random variables. We study the properties of the test and further explore its effectiveness for finite sample sizes. As an application, we consider precipitation data and test whether the marginal tails and/or the extreme value dependence structure have changed over time.
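The connection to Bernoulli random variables can be illustrated with a minimal sketch: for a common high threshold, each sample yields a Bernoulli exceedance indicator, and the Kullback-Leibler divergence between the two Bernoulli laws serves as a simple tail-discrepancy statistic. This is an assumed, univariate simplification for illustration only, not the authors' full multivariate test; the function names and the Pareto toy data are hypothetical.

```python
import numpy as np

def bernoulli_kl(p, q, eps=1e-12):
    # KL divergence between Bernoulli(p) and Bernoulli(q),
    # clipped away from 0 and 1 for numerical stability.
    p = np.clip(p, eps, 1 - eps)
    q = np.clip(q, eps, 1 - eps)
    return p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q))

def tail_divergence(x, y, quantile=0.95):
    # Illustrative statistic (an assumption, not the paper's test):
    # compare the exceedance probabilities of two samples over a
    # common high threshold via the Bernoulli KL divergence.
    u = np.quantile(np.concatenate([x, y]), quantile)  # common threshold
    p = np.mean(x > u)  # exceedance frequency in sample 1
    q = np.mean(y > u)  # exceedance frequency in sample 2
    return bernoulli_kl(p, q)

# Toy data: two Pareto samples with different tail indices.
rng = np.random.default_rng(0)
x = rng.pareto(3.0, size=5000)  # heavier tail
y = rng.pareto(5.0, size=5000)  # lighter tail
stat = tail_divergence(x, y)
```

In this sketch, identical samples give a statistic of exactly zero, while samples with different tail weight push their exceedance frequencies apart and yield a positive divergence; the actual test additionally handles the multivariate dependence structure and the calibration of the null distribution.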