B0859
Title: Sensitivity analysis for violations of proximal identification assumptions
Authors: Raluca-Ioana Cobzaru - Massachusetts Institute of Technology (United States) [presenting]
Roy Welsch - Massachusetts Institute of Technology (United States)
Stan Finkelstein - MIT (United States)
Zach Shahn - IBM Research (United States)
Kenney Ng - MIT-IBM Watson AI Lab (United States)
Abstract: Causal inference from observational data often rests on the unverifiable assumption of no unmeasured confounding. Recently, proximal inference has been introduced to leverage negative control outcomes and exposures as proxies that adjust for bias from unmeasured confounding. However, some of the key assumptions that proximal inference relies on are themselves empirically untestable. Moreover, the impact of violations of these assumptions on the bias of effect estimates is not well understood. We derive bias formulas for proximal inference estimators under a linear structural equation model data-generating process. These results are a first step toward sensitivity analysis and quantitative bias analysis of proximal inference estimators. While limited to a particular family of data-generating processes, our results offer more general insight into the behavior of proximal inference estimators.
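To illustrate the setting the abstract describes, the sketch below simulates a simple linear structural equation model with an unmeasured confounder U, a negative control exposure Z (a treatment proxy), and a negative control outcome W (an outcome proxy). It then contrasts a naive regression with a linear proximal estimator implemented as two-stage least squares, instrumenting W with Z. All coefficients and the 2SLS formulation are illustrative assumptions for this hedged example, not the authors' specific model or estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Linear SEM data-generating process (assumed coefficients):
U = rng.normal(size=n)                 # unmeasured confounder
Z = U + rng.normal(size=n)             # negative control exposure: proxy for U, no direct effect on Y
W = U + rng.normal(size=n)             # negative control outcome: proxy for U, not affected by A
A = 0.8 * U + rng.normal(size=n)       # treatment, confounded by U
Y = 0.5 * A + U + rng.normal(size=n)   # outcome; true causal effect of A is 0.5

# Naive OLS of Y on A: biased because U is unmeasured.
X_naive = np.column_stack([np.ones(n), A])
naive_effect = np.linalg.lstsq(X_naive, Y, rcond=None)[0][1]

# Linear proximal estimator via 2SLS: regress Y on (A, W),
# using (1, A, Z) as instruments for the confounding proxy W.
instruments = np.column_stack([np.ones(n), A, Z])
W_hat = instruments @ np.linalg.lstsq(instruments, W, rcond=None)[0]  # first stage
X_prox = np.column_stack([np.ones(n), A, W_hat])
proximal_effect = np.linalg.lstsq(X_prox, Y, rcond=None)[0][1]        # second stage

print(f"naive OLS estimate:    {naive_effect:.3f}")     # biased upward by confounding
print(f"proximal 2SLS estimate: {proximal_effect:.3f}")  # close to the true 0.5
```

Under this data-generating process the proximal assumptions hold by construction; bias formulas such as those the abstract derives would quantify what happens when, for example, Z directly affects Y or W is affected by A.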