Identification in a regression discontinuity (RD) design hinges on the discontinuity in the probability of treatment when a covariate (the assignment variable) exceeds a known threshold. If the assignment variable is measured with error, however, the discontinuity in the first-stage relationship between the probability of treatment and the observed, mismeasured assignment variable may disappear. Therefore, the presence of measurement error in the assignment variable poses a challenge to treatment effect identification. This paper provides sufficient conditions for identification when only the mismeasured assignment variable, the treatment status, and the outcome variable are observed. We prove identification separately for discrete and continuous assignment variables and study the properties of various estimation procedures. We illustrate the proposed methods in an empirical application, where we estimate Medicaid take-up and its crowd-out effect on private health insurance coverage.
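The smoothing mechanism described above can be illustrated with a minimal simulation (not from the paper; the data generating process, noise level, and bandwidth below are illustrative assumptions): in a sharp RD the treatment probability jumps by one at the threshold in the true assignment variable, but classical measurement error makes the same first-stage relationship nearly continuous in the observed variable.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.uniform(-1, 1, n)          # true assignment variable
d = (x >= 0).astype(float)         # sharp RD: treatment assigned above the threshold 0
w = x + rng.normal(0, 0.3, n)      # observed assignment variable with classical noise

def first_stage_jump(running, treat, h=0.05):
    """Difference in treatment probability just above vs. just below the threshold,
    estimated by local means within bandwidth h (a crude illustrative estimator)."""
    above = treat[(running >= 0) & (running < h)].mean()
    below = treat[(running < 0) & (running >= -h)].mean()
    return above - below

print(f"jump in true X:     {first_stage_jump(x, d):.2f}")  # exactly 1.00 by construction
print(f"jump in observed W: {first_stage_jump(w, d):.2f}")  # far below 1: the discontinuity is smoothed away
```

With the noise standard deviation used here, the observed first-stage "jump" shrinks to a small fraction of its true size, which is why standard RD estimators applied to the mismeasured variable fail to identify the treatment effect.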
Despite the widespread use of graphs in empirical research, little is known about readers’ ability to
process the statistical information they are meant to convey (“visual inference”). In this paper, we evaluate
several key aspects of visual inference in regression discontinuity (RD) designs by measuring how
well readers can identify discontinuities in graphs. First, we assess the effects of graphical representation
methods on visual inference, using randomized experiments crowdsourcing discontinuity classifications
with graphs produced from data generating processes calibrated on datasets from 11 published papers.
Second, we evaluate visual inference by both experts and non-experts and study experts’ ability to predict
our experimental results. We find that experts perform comparably to non-experts and partly anticipate
the effects of graphical methods. Third, we compare experts’ visual inference to commonly used econometric
procedures in RD designs and observe that visual inference achieves type I error rates similar to or lower than theirs. Fourth,
we conduct an eye-tracking study to further understand RD visual inference, but it does not reveal gaze
patterns that robustly predict successful inference. We also evaluate visual inference in the closely related regression kink design.