CMStatistics 2021
Title: Testing for more positive expectation dependence with application to model comparison
Authors: Michel Denuit - Université catholique de Louvain (Belgium)
Julien Trufin - Université Libre de Bruxelles (Belgium) [presenting]
Thomas Verdebout - Université Libre de Bruxelles (Belgium)
Abstract: Modern data science tools are effective at producing predictions that correlate strongly with responses. Model comparison can therefore be based on the strength of dependence between responses and their predictions, and positive expectation dependence turns out to be attractive in that respect. The talk proposes an effective testing procedure for this dependence concept and applies it to the comparison of two models. A simulation study evaluates the performance of the proposed testing procedure, and empirical illustrations using insurance loss data demonstrate the relevance of the approach for model selection in supervised learning. The most positively expectation dependent predictor can then be autocalibrated to obtain its balance-corrected version, which appears to be optimal with respect to Bregman, or forecast, dominance.
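Positive expectation dependence of a response Y on a predictor X means E[Y | X ≤ x] ≤ E[Y] for every threshold x, i.e. conditioning on small values of X can only lower the expected response. The following is a minimal empirical sketch of that quantity, assuming a simple plug-in estimator; the function name and the toy data are illustrative, and this is not the testing procedure proposed in the talk:

```python
import numpy as np

def expectation_dependence_curve(y, x):
    """Empirical analogue of E[Y] - E[Y | X <= x], evaluated at each
    observed value of X (sorted increasingly).  Nonnegative values across
    the whole range suggest positive expectation dependence of Y on X."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)
    y_sorted = y[order]
    # running mean of Y among observations with X <= x_(i)
    cond_means = np.cumsum(y_sorted) / np.arange(1, len(y) + 1)
    return y.mean() - cond_means

# Toy example: a perfectly monotone predictor, the strongest case.
x = np.arange(10.0)
y = x.copy()
curve = expectation_dependence_curve(y, x)
print(bool(np.all(curve >= 0)))  # → True
```

Under this sketch, two candidate predictors could be compared by checking which one keeps its curve nonnegative (and larger) over the range of thresholds; the talk's actual test formalizes such a comparison.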