View Submission - EcoSta2022

A0507
**Title:** Small tuning parameter selection for the debiased lasso
**Authors:** Akira Shinkyu - Shiga University (Japan) **[presenting]**

**Abstract:** The debiased Lasso has been proposed for statistical inference in high-dimensional linear regression models. It requires an estimate of a column vector of the precision matrix to correct the bias of the Lasso, and this estimate is usually obtained by the node-wise Lasso. It is common to set the order of the tuning parameter of the node-wise Lasso to $\sqrt{\log p/n}$. However, with this tuning parameter the debiased Lasso has a large bias when the column vector of the precision matrix is not sparse, so the number of nonzero coefficients must be as small as $o(\sqrt{n}/\log p)$ for asymptotic normality. Motivated by this issue, we show that by setting the order of the tuning parameter of the node-wise Lasso to $1/\sqrt{n}$, the bias of the debiased Lasso can be reduced further without making the variance diverge and without sparsity conditions on the column vector of the precision matrix. This implies that the debiased Lasso is asymptotically normal even if the number of nonzero coefficients is $o(\sqrt{n/\log p})$, although it may not be efficient. We also propose a tuning parameter selection procedure for the node-wise Lasso.
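The construction discussed above can be sketched in a few lines of code. The following is a minimal illustration, not the author's implementation: it fits an initial Lasso, runs a node-wise Lasso for one coordinate with a tuning parameter of order $1/\sqrt{n}$ (as the abstract proposes, in place of the usual $\sqrt{\log p/n}$), and debiases that coordinate. The simulated design, the constant in front of each rate, and the choice of coordinate are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Simulated high-dimensional regression (illustrative setup, not from the paper)
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Step 1: initial Lasso fit; tuning parameter of the usual order sqrt(log p / n)
lam = 2.0 * np.sqrt(np.log(p) / n)  # constant 2.0 is an arbitrary choice
b_init = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

# Step 2: node-wise Lasso for coordinate j, but with the *small* 1/sqrt(n) rate
j = 0
x_j = X[:, j]
X_rest = np.delete(X, j, axis=1)
lam_node = 1.0 / np.sqrt(n)  # the small tuning parameter discussed in the abstract
node = Lasso(alpha=lam_node, fit_intercept=False).fit(X_rest, x_j)
z_j = x_j - X_rest @ node.coef_  # residual acting as the precision-column estimate

# Step 3: one-step bias correction of the j-th Lasso coefficient
resid = y - X @ b_init
b_debiased_j = b_init[j] + (z_j @ resid) / (z_j @ x_j)
print(b_debiased_j)
```

With a smaller node-wise tuning parameter, the residual $z_j$ is closer to orthogonal to the remaining columns, which is what removes more of the bias; the trade-off analyzed in the abstract is that the variance of the resulting estimator must still be kept under control.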