Despite the widespread use of graphs in empirical research, little is known about readers’ ability to
process the statistical information they are meant to convey (“visual inference”). In this paper, we evaluate
several key aspects of visual inference in regression discontinuity (RD) designs by measuring how
well readers can identify discontinuities in graphs. First, we assess the effects of graphical representation
methods on visual inference using randomized experiments that crowdsource discontinuity classifications
of graphs produced from data-generating processes calibrated to datasets from 11 published papers.
Second, we evaluate visual inference by both experts and non-experts and study experts’ ability to predict
our experimental results. We find that experts perform comparably to non-experts and partly anticipate
the effects of graphical methods. Third, we compare experts’ visual inference to commonly used econometric
procedures in RD designs and find that visual inference achieves similar or lower type I error rates. Fourth,
we conduct an eyetracking study to further understand RD visual inference, but it does not reveal gaze
patterns that robustly predict successful inference. We also evaluate visual inference in the closely related regression kink design.
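To make the setting concrete, the following sketch simulates a stylized RD data-generating process with a discontinuity at the cutoff and summarizes it as a binned scatter, the kind of graph readers classify in experiments like those described above. The linear functional form, cutoff, jump size, noise level, and bin count are all hypothetical illustration choices, not the calibrated processes from the paper.

```python
import numpy as np

def simulate_rd(n=2000, cutoff=0.0, jump=0.5, seed=0):
    """Simulate a hypothetical linear RD DGP with a jump at the cutoff."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, n)  # running variable
    # outcome: linear trend + discontinuity at the cutoff + noise
    y = 0.3 * x + jump * (x >= cutoff) + rng.normal(0, 0.3, n)
    return x, y

def binned_means(x, y, n_bins=20):
    """Bin the running variable and average the outcome within each bin,
    producing the points of a binned scatter plot."""
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    means = np.array([y[idx == b].mean() for b in range(n_bins)])
    return centers, means

x, y = simulate_rd()
centers, means = binned_means(x, y)

# Crude visual-style jump estimate: compare average outcomes in narrow
# windows on either side of the cutoff (bandwidth is another assumption).
bw = 0.2
est = y[(x >= 0) & (x < bw)].mean() - y[(x < 0) & (x > -bw)].mean()
```

A reader judging the resulting binned scatter is, informally, doing by eye what `est` does numerically: comparing outcome levels just left and just right of the cutoff.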