B0800
Title: Randomization tests for assessing covariate balance when designing and analyzing matched datasets
Authors: Zach Branson - Carnegie Mellon University (United States) [presenting]
Abstract: Observational studies are often complicated by covariate imbalances among treatment groups, and matching methods alleviate this complication by finding subsets of treatment groups that exhibit covariate balance. Balance often serves as evidence that a matched dataset approximates a randomized experiment, but what kind of experiment does it approximate? We develop a randomization test to assess whether a matched dataset approximates a particular experimental design, such as complete randomization or block randomization. Our test can incorporate any design and allows for a graphical display that puts several designs on the same univariate scale, so researchers can pinpoint which design, if any, is most appropriate for a matched dataset. After researchers determine a design, we recommend a randomization-based analytical approach that can incorporate any design and treatment effect estimator. We find that our test can frequently detect violations of randomized assignment, and that matched datasets with high levels of balance tend to approximate balance-constrained designs like rerandomization, thereby allowing for precise causal analyses. However, a precise design should be assumed with caution, because doing so can harm inference if large biases remain due to residual covariate imbalances after matching. We also demonstrate how this approach can be used for instrumental variable analyses and regression discontinuity designs, all using our R package randChecks.
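As a rough illustration of the core idea only (this is not the randChecks API), the sketch below simulates the randomization distribution of a covariate-balance statistic, here the Mahalanobis distance between treated and control covariate means, under a hypothesized complete-randomization design and compares it to the observed balance. The inputs `X` and `z` and the function names are hypothetical placeholders.

```r
## Minimal sketch (base R, assumed inputs): randomization test of whether the
## covariate balance in a matched dataset is consistent with complete
## randomization. `X` is an n x k covariate matrix; `z` is a 0/1 treatment
## indicator for the matched units.

balance_stat <- function(z, X) {
  # Mahalanobis distance between treated and control covariate means
  d <- colMeans(X[z == 1, , drop = FALSE]) - colMeans(X[z == 0, , drop = FALSE])
  S <- cov(X) * (1 / sum(z == 1) + 1 / sum(z == 0))
  as.numeric(t(d) %*% solve(S) %*% d)
}

rand_test_complete <- function(z, X, n_perm = 1000) {
  obs <- balance_stat(z, X)
  # Draw assignments under the hypothesized design: complete randomization
  # with the same number of treated units as observed
  sims <- replicate(n_perm, balance_stat(sample(z), X))
  # p-value: how often the design produces imbalance at least as large
  # as what was observed in the matched dataset
  mean(sims >= obs)
}

## Usage with simulated data
set.seed(1)
n <- 100; k <- 3
X <- matrix(rnorm(n * k), n, k)
z <- rep(c(1, 0), each = n / 2)
rand_test_complete(z, X)
```

Other candidate designs (e.g., block randomization or rerandomization) would be assessed by replacing the assignment draw `sample(z)` with draws from that design's assignment mechanism; the abstract's graphical display compares such designs on a common univariate scale.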