Title: Double-estimation-friendly inference in high-dimensional statistics
Authors: Rajen D Shah - University of Cambridge (United Kingdom) [presenting]
Peter Buehlmann - ETH Zurich (Switzerland)
Nicolai Meinshausen - ETH Zurich (Switzerland)
Abstract: A regression setting is considered where the interest is in inference concerning the relationship between a response $Y$ and a variable $X$, after controlling for a high-dimensional vector of additional covariates $Z$. In recent years, great advances have been made in developing methods for this important task, largely based on the debiased Lasso. However, these techniques typically require a sparse linear model of $Y$ on $(X, Z)$ to hold. We introduce a framework for inference that is valid if either a sparse linear model of $Y$ on $Z$, or one of $X$ on $Z$, holds. For example, in the latter setting, the methods allow for testing conditional independence of $X$ and $Y$ given the high-dimensional covariates $Z$, and for the construction of confidence intervals for the coefficient corresponding to $X$ in a model that is potentially only partially linear, where the contribution of $Z$ to the response is nonlinear. The framework can also be extended to encompass high-dimensional generalised linear models for either the relationship between $Y$ and $Z$ or that between $X$ and $Z$.
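The abstract does not spell out an estimator, but the setting it describes can be illustrated with a standard double-residual construction: regress $Y$ on $Z$ and $X$ on $Z$ with the Lasso, then base a test of conditional independence on the product of the two residual vectors. The sketch below is only an assumption-laden illustration of this general idea, not the authors' actual procedure; the helper `lasso_cd` is a minimal hand-rolled coordinate-descent Lasso introduced purely to keep the example self-contained.

```python
import numpy as np

def lasso_cd(A, b, lam, n_iter=200):
    """Minimal coordinate-descent Lasso (illustrative; assumes
    roughly standardised columns and a fixed penalty lam)."""
    n, p = A.shape
    beta = np.zeros(p)
    col_sq = (A ** 2).sum(axis=0)
    r = b - A @ beta
    for _ in range(n_iter):
        for j in range(p):
            r += A[:, j] * beta[j]              # remove coordinate j
            rho = A[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - n * lam, 0.0) / col_sq[j]
            r -= A[:, j] * beta[j]              # add updated coordinate back
    return beta

def residual_test(X, Y, Z, lam=0.1):
    """Test conditional independence of X and Y given Z via the mean
    of the product of Lasso residuals; under the null and suitable
    conditions the statistic is approximately N(0, 1)."""
    n = len(Y)
    eps = Y - Z @ lasso_cd(Z, Y, lam)   # residual of Y regressed on Z
    xi = X - Z @ lasso_cd(Z, X, lam)    # residual of X regressed on Z
    R = eps * xi
    return np.sqrt(n) * R.mean() / R.std()
```

The key property motivating the double-residual form is that the bias of one nuisance fit multiplies the bias of the other, so the statistic can remain approximately standard normal when only one of the two sparse linear models is well specified.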