Title: Beyond FDR: Towards simultaneous selective inference and post-hoc error control
Authors: Aaditya Ramdas - Carnegie Mellon University (United States) [presenting]
Eugene Katsevich - Stanford University (United States)
Abstract: The false discovery rate (FDR) is a popular error criterion for multiple testing, but it is not without its flaws. Indeed, (a) controlling the mean of the false discovery proportion (FDP) does not preclude large FDP variability, and (b) committing to an error level before observing the data limits its use in exploratory data analysis. We take a step towards addressing both drawbacks by proving uniform FDP bounds for a variety of existing FDR procedures. These bounds open up a middle ground between fully simultaneous inference (guarantees for all possible rejection sets) and fully selective inference (guarantees only for a single rejection set). They allow the scientist to "spot" one or more suitable rejection sets (select post hoc on the algorithm's trajectory) by picking data-dependent sizes or error levels after examining the entire path of estimated FDPs and the uniform upper band on the true FDP. This post-hoc mode of inference addresses both aforementioned drawbacks of FDR. Our bounds also apply to online FDR procedures, providing simultaneous high-probability uniform FDP bounds at arbitrary data-dependent query times for arbitrary online procedures. Finally, our analysis unifies existing martingale and empirical process viewpoints on FDR algorithms.
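To make the post-hoc workflow concrete, here is a minimal numerical sketch. It simulates p-values, walks the sorted-p-value (BH-type) path, computes an estimated FDP at every path point, and overlays a uniform high-probability upper band on the true FDP; the scientist then selects a rejection set after seeing the whole path. The specific multiplicative form of the band (with constant a(alpha) = log(1/alpha) / log(1 + log(1/alpha))) follows the style of the authors' related work, but the exact formula, the simulation setup, and the 0.2 target level here are illustrative assumptions, not the abstract's stated results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulation: 100 non-null hypotheses with very small
# p-values, 900 nulls with uniform p-values (assumed setup).
m = 1000
p = np.concatenate([rng.uniform(0, 0.001, 100), rng.uniform(0, 1, 900)])
p_sorted = np.sort(p)
k = np.arange(1, m + 1)

# Estimated FDP along the sorted-p-value path: rejecting the k smallest
# p-values gives the BH-type estimate FDPhat(k) = m * p_(k) / k.
fdp_hat = m * p_sorted / k

# Uniform high-probability upper band on the true FDP (assumed form):
# with probability >= 1 - alpha, simultaneously for all k,
#   FDP(k) <= a(alpha) * (1 + m * p_(k)) / k.
alpha = 0.05
a = np.log(1 / alpha) / np.log(1 + np.log(1 / alpha))
fdp_bar = a * (1 + m * p_sorted) / k

# Post-hoc selection: after examining the entire path, pick the largest
# rejection set whose uniform FDP bound stays below a target of 0.2
# (the target itself may be chosen in a data-dependent way).
valid = np.where(fdp_bar <= 0.2)[0]
k_star = int(valid[-1]) + 1 if valid.size else 0
print(k_star, round(float(fdp_bar[k_star - 1]), 3))
```

Because the band holds simultaneously over the whole path, the chosen size k_star (or the target error level) may depend on the observed FDP estimates without invalidating the guarantee, which is exactly the middle ground between simultaneous and selective inference described above.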