Title
Bias-Aware Inference in Regularized Regression Models
Author(s)
Timothy B. Armstrong (Yale University)
Michal Kolesár (Princeton University)
Soonwoo Kwon (Yale University)
Abstract
We consider inference on a regression coefficient under a constraint on the magnitude of the control coefficients. We show that a class of estimators based on an auxiliary regularized regression of the regressor of interest on control variables exactly solves a tradeoff between worst-case bias and variance. We derive "bias-aware" confidence intervals (CIs) based on these estimators, which take into account possible bias when forming the critical value. We show that these estimators and CIs are near-optimal in finite samples for mean squared error and CI length. Our finite-sample results are based on an idealized setting with normal regression errors with known homoskedastic variance, and we provide conditions for asymptotic validity with unknown and possibly heteroskedastic error distribution. Focusing on the case where the constraint on the magnitude of control coefficients is based on an ℓp norm (p ≥ 1), we derive rates of convergence for optimal estimators and CIs under high-dimensional asymptotics that allow the number of regressors to increase more quickly than the number of observations.
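The "bias-aware" critical value mentioned in the abstract can be illustrated with a short sketch. This is not the authors' implementation; it is a minimal stdlib-only example of the standard construction, assuming the estimator is approximately N(θ + b, se²) with worst-case standardized bias |b|/se ≤ t: the critical value is the 1−α quantile of |N(t, 1)|, found here by bisection (the function name `bias_aware_cv` is illustrative).

```python
from statistics import NormalDist

def bias_aware_cv(t, alpha=0.05, tol=1e-8):
    """1-alpha quantile of |N(t, 1)|: the critical value for a CI
    that remains valid when the standardized bias is at most t >= 0."""
    nd = NormalDist()
    # P(|N(t,1)| <= c) = Phi(c - t) - Phi(-c - t), increasing in c.
    coverage = lambda c: nd.cdf(c - t) - nd.cdf(-c - t)
    lo, hi = 0.0, t + 10.0  # coverage(hi) is essentially 1
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if coverage(mid) < 1 - alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# With t = 0 (no bias) this reduces to the usual two-sided normal
# critical value (about 1.96 at the 5% level); larger t widens the CI.
# A bias-aware CI is then [theta_hat - cv * se, theta_hat + cv * se].
```

The bisection exploits that the coverage probability is monotone in c, so no numerical root-finding library is needed.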
Creation Date
2020-12
Section URL ID
Paper Number
2020-2
URL
https://www.princeton.edu/~mkolesar/papers/regularized_regression.pdf
File Function
JEL
C20
Keyword(s)
Regularized regression
Suppress
false
Series
13