Title: Calibrated inference: Statistical inference that accounts for both sampling uncertainty and distributional uncertainty
Authors: Dominik Rothenhaeusler - Stanford University (United States) [presenting]
Yujin Jeong - Stanford University (United States)
Abstract: During data analysis, analysts often have to make seemingly arbitrary decisions. For example, during data pre-processing there are many defensible options for handling outliers or imputing missing data. Similarly, many different specifications and methods may be reasonable for addressing a given domain question. This is often seen as a hindrance to reliable inference, since conclusions can change depending on the analyst's choices. We argue that this situation is instead an opportunity: it can be used to construct confidence intervals that account not only for sampling uncertainty but also for a form of distributional uncertainty. Distributional uncertainty is closely related to other issues in data analysis, ranging from dependence between observations to selection bias and confounding. We demonstrate the utility of the approach on simulated and real-world data.
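The core idea above can be sketched in code. The following is a minimal illustration, not the paper's actual procedure: it assumes the analyst can enumerate a few defensible pre-processing specifications, re-estimates the target quantity under each, and widens the usual confidence interval by the between-specification variance. All data, specification functions, and variance-combination details here are hypothetical simplifications for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset with some outliers and missing values.
x = rng.normal(loc=1.0, scale=2.0, size=500)
x[:5] = 25.0          # planted outliers
x[5:15] = np.nan      # planted missing values

# Several defensible pre-processing specifications (an assumption for
# illustration; the paper's actual perturbation scheme may differ).
def spec_drop_missing(v):
    return v[~np.isnan(v)]

def spec_winsorize(v):
    v = v[~np.isnan(v)]
    lo, hi = np.quantile(v, [0.01, 0.99])
    return np.clip(v, lo, hi)

def spec_mean_impute(v):
    return np.where(np.isnan(v), np.nanmean(v), v)

specs = [spec_drop_missing, spec_winsorize, spec_mean_impute]
estimates = np.array([np.mean(s(x)) for s in specs])

# Point estimate: average over specifications.
est = estimates.mean()

# Sampling variance of the mean, under one representative specification.
clean = spec_drop_missing(x)
var_sampling = clean.var(ddof=1) / clean.size

# Distributional (specification) variance: spread across specifications.
var_spec = estimates.var(ddof=1)

# The widened interval combines both sources of uncertainty.
se = np.sqrt(var_sampling + var_spec)
ci = (est - 1.96 * se, est + 1.96 * se)
print(f"estimate={est:.3f}, CI=({ci[0]:.3f}, {ci[1]:.3f})")
```

The point of the sketch is the last few lines: the interval is never narrower than the sampling-only interval, and it grows exactly when reasonable analyst choices disagree.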