Optimal Data-Driven Regression Discontinuity Plots∗

Supplemental Appendix

Sebastian Calonico†   Matias D. Cattaneo‡   Rocio Titiunik§

November 25, 2015

Abstract. This supplemental appendix contains the proofs of our main theorems, additional methodological and technical results, detailed simulation evidence, and further empirical illustrations not included in the main paper to conserve space.

∗ Financial support from the National Science Foundation (SES 1357561) is gratefully acknowledged.
† Department of Economics, University of Miami.
‡ Department of Economics and Department of Statistics, University of Michigan.
§ Department of Political Science, University of Michigan.

Contents

1 Implied Weights in Optimal WIMSE Approach ............ 2
2 Proofs of Main Theorems ............ 3
  2.1 Lemma SA1 ............ 3
  2.2 Lemma SA2 ............ 5
  2.3 Lemma SA3 ............ 6
  2.4 Proof of Theorem 1 ............ 8
  2.5 Proof of Theorem 2 ............ 10
  2.6 Proof of Theorem 3 ............ 11
  2.7 Proof of Remark 1 ............ 11
  2.8 Proof of Theorem 4 and Remark 2 ............ 12
3 Data-Driven Implementations with Arbitrary w(x) ............ 12
  3.1 Evenly Spaced RD Plots ............ 13
  3.2 Quantile Spaced RD Plots ............ 16
4 Other Empirical Applications ............ 18
  4.1 U.S. Senate Data ............ 18
  4.2 Progresa/Oportunidades Data ............ 19
  4.3 Head Start Data ............ 20
  4.4 Summary of Results ............ 20
5 Complete Simulation Results ............ 28
6 Numerical Comparison of Partitioning Schemes ............ 48

1 Implied Weights in Optimal WIMSE Approach

Recall from the main paper that the optimal choices of number of bins based on a WIMSE can be written as
$$J_{\text{ES-}\omega,-,n} = \lceil \omega_{-}\, J_{\text{ES-}\mu,-,n} \rceil \qquad\text{and}\qquad J_{\text{ES-}\omega,+,n} = \lceil \omega_{+}\, J_{\text{ES-}\mu,+,n} \rceil,$$
where $J_{\text{ES-}\mu,-,n}$ and $J_{\text{ES-}\mu,+,n}$ denote the IMSE-optimal choices, $\omega_{-} = (\omega_{B,-}/\omega_{V,-})^{1/3}$, and $\omega_{+} = (\omega_{B,+}/\omega_{V,+})^{1/3}$. As discussed in the paper, this result may be used to justify ad hoc rescalings chosen by researchers when using the IMSE-optimal choices as a starting point. In particular, given a choice of rescaling factors $\omega_{-}$ and $\omega_{+}$, we have
$$(\omega_{V,-},\omega_{B,-}) = \left(\frac{1}{1+\omega_{-}^{3}},\ \frac{\omega_{-}^{3}}{1+\omega_{-}^{3}}\right) \qquad\text{and}\qquad (\omega_{V,+},\omega_{B,+}) = \left(\frac{1}{1+\omega_{+}^{3}},\ \frac{\omega_{+}^{3}}{1+\omega_{+}^{3}}\right),$$
which are the weights entering the WIMSE objective function that are compatible with such choices of rescaling constants for the IMSE-optimal number of bins.

To gain some intuition on the relative weights emerging from manual rescaling of the IMSE-optimal choice, we present the implied weights in the optimal WIMSE approach for different, common choices of the rescaling constant ω:

    ω      0.1    0.2    0.5    1      2      5      10
    ω_V    0.999  0.992  0.889  0.500  0.111  0.008  0.001
    ω_B    0.001  0.008  0.111  0.500  0.889  0.992  0.999

As expected, the larger ω, the smaller the weight on variance (ω_V) and the larger the weight on bias (ω_B) in the WIMSE objective function. Our software implementations in R and Stata compute these weights explicitly as part of the standard output; see Calonico, Cattaneo and Titiunik (2014a, 2015) for further details.
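For concreteness, the mapping from a rescaling constant ω to the implied WIMSE weights can be computed in one line. The following is an illustrative sketch of ours (the function name is hypothetical, not part of the companion R/Stata packages):

```python
def implied_weights(omega):
    """Implied WIMSE weights (omega_V, omega_B) for a rescaling constant omega.

    From omega = (omega_B / omega_V)**(1/3) together with the normalization
    omega_V + omega_B = 1, it follows that omega_V = 1 / (1 + omega**3).
    """
    omega_v = 1.0 / (1.0 + omega ** 3)
    return omega_v, 1.0 - omega_v

# Reproduce the table above: e.g., doubling the IMSE-optimal number of bins
# (omega = 2) corresponds to placing weight 8/9 on bias.
for omega in (0.1, 0.2, 0.5, 1, 2, 5, 10):
    wv, wb = implied_weights(omega)
    print(f"{omega:>4}  {wv:.3f}  {wb:.3f}")
```

The printed rows match the tabulated weights up to rounding.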

2 Proofs of Main Theorems

We state and prove results only for the treatment group (subscript "+") because for the control group the results and proofs are analogous. Here we only provide short, self-contained proofs of the main results presented in the paper. To this end, we first state three preliminary technical lemmas. We also offer short proofs of these lemmas, and provide references to the underlying results not reproduced here to conserve space. Recall that the lower and upper end points of $\mathcal{P}_{+,j}$ are denoted, respectively, by $p_{+,j-1}$ and $p_{+,j}$ for $j = 1, 2, \cdots, J_{+,n}$, which are nonrandom under ES partitioning and random under QS partitioning. Let $\bar p_{+,j} = (p_{+,j} + p_{+,j-1})/2$ be the midpoint of bin $\mathcal{P}_{+,j}$. Throughout the supplemental appendix, $C$ denotes an arbitrary positive, bounded constant taking different values in different places.
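To fix ideas, the two partitioning schemes can be constructed as follows. This is an illustrative sketch of ours (not the companion R/Stata implementation), where `x_plus` is a hypothetical sample of running-variable values above the cutoff:

```python
import numpy as np

def es_edges(x_lo, x_hi, J):
    """Evenly spaced (ES) bin edges: p_j = x_lo + j*(x_hi - x_lo)/J, nonrandom."""
    return np.linspace(x_lo, x_hi, J + 1)

def qs_edges(x_plus, J):
    """Quantile spaced (QS) bin edges: p_j = empirical quantile at j/J, random."""
    return np.quantile(x_plus, np.linspace(0.0, 1.0, J + 1))

x_plus = np.linspace(0.0, 1.0, 101) ** 2   # hypothetical sample on [0, 1]
es = es_edges(0.0, 1.0, 10)
qs = qs_edges(x_plus, 10)
```

ES edges are equally spaced by construction; QS edges adapt to the data, lying closer together where observations are dense.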

2.1 Lemma SA1

This lemma holds for any nonrandom partition $\mathcal{P}_{+,n}$ satisfying
$$\frac{C_1}{J_{+,n}} \le \min_{1\le j\le J_{+,n}} |p_{+,j}-p_{+,j-1}| \le \max_{1\le j\le J_{+,n}} |p_{+,j}-p_{+,j-1}| \le \frac{C_2}{J_{+,n}},$$
for fixed positive constants $C_1$ and $C_2$. In particular, it holds for $\mathcal{P}_{\text{ES},+,n}$. Note also that Lemma SA1(i) shows that $\mathbb{P}(N_{+,j}>0)\to 1$ uniformly in $j$, which guarantees that the estimators for the ES partitioning scheme are well behaved in large samples.

Lemma SA1. Let Assumption 1 hold. For $\mathcal{P}_{\text{ES},+,n}$, if $J_{+,n}\log(J_{+,n})/n \to 0$ and $J_{+,n}\to\infty$, then the following results hold.

(i) $\max_{1\le j\le J_{+,n}} |\mathbb{1}(N_{+,j}>0)-1| = o_P(1)$.

(ii) $\max_{1\le j\le J_{+,n}} |N_{+,j}/n - \mathbb{P}[X_i\in\mathcal{P}_{+,j}]| = o_P(J_{+,n}^{-1})$.

(iii) $\max_{1\le j\le J_{+,n}} \left|\frac{1}{n}\sum_{i=1}^{n} \mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\frac{X_i-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}} - \mathbb{E}\left[\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\frac{X_i-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}}\right]\right| = o_P(J_{+,n}^{-1})$.

(iv) $\max_{1\le j\le J_{+,n}} \left|\mathbb{E}\left[\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\frac{X_i-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}}\right]\right| = o(J_{+,n}^{-1})$.

Proof of Lemma SA1. The proof of this lemma is very similar to the results given in the supplemental appendix of Cattaneo and Farrell (2013). Part (i) follows by properties of the Binomial distribution and simple bounding arguments, under the assumptions imposed. For part (ii), note that $\mathbb{E}[\mathbb{1}(X_i\in\mathcal{P}_{+,j})] = \mathbb{P}[X_i\in\mathcal{P}_{+,j}] = O(J_{+,n}^{-1})$ and $C_1/J_{+,n} \le \mathbb{V}[\mathbb{1}(X_i\in\mathcal{P}_{+,j})] \le C_2/J_{+,n}$, uniformly in $j=1,2,\cdots,J_{+,n}$. For any $\varepsilon>0$, Bernstein's inequality gives
$$\mathbb{P}\left[J_{+,n}\max_{1\le j\le J_{+,n}}\left|\frac{N_{+,j}}{n}-\mathbb{P}[X_i\in\mathcal{P}_{+,j}]\right|>\varepsilon\right] \le J_{+,n}\max_{1\le j\le J_{+,n}}\mathbb{P}\left[\left|\sum_{i=1}^{n}\left(\mathbb{1}(X_i\in\mathcal{P}_{+,j})-\mathbb{P}[X_i\in\mathcal{P}_{+,j}]\right)\right|>\frac{n\varepsilon}{J_{+,n}}\right]$$
$$\le J_{+,n}\max_{1\le j\le J_{+,n}} 2\exp\left\{-\frac{n^2\varepsilon^2/J_{+,n}^2}{2\sum_{i=1}^{n}\mathbb{V}[\mathbb{1}(X_i\in\mathcal{P}_{+,j})]+2n\varepsilon/J_{+,n}}\right\} \le C\exp\left\{-\frac{Cn}{J_{+,n}}+\log(J_{+,n})\right\}\to 0,$$
provided that $J_{+,n}\log(J_{+,n})/n\to 0$. Part (iii) follows by similar arguments. Finally, to verify part (iv), a change of variables gives
$$\max_{1\le j\le J_{+,n}}\left|\mathbb{E}\left[\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\frac{X_i-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}}\right]\right| = \max_{1\le j\le J_{+,n}}\left|\int_{\bar x}^{x_u}\mathbb{1}_{\mathcal{P}_{+,j}}(x)\frac{x-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}}f(x)dx\right|$$
$$= \max_{1\le j\le J_{+,n}}(p_{+,j}-p_{+,j-1})\left|\int_{-1}^{1}u\,f(u(p_{+,j}-p_{+,j-1})+\bar p_{+,j})du\right| = \max_{1\le j\le J_{+,n}}\frac{x_u-\bar x}{J_{+,n}}\left|\int_{-1}^{1}u\,f(\bar p_{+,j})du+o(1)\right|,$$
and the result follows because $\int_{-1}^{1}u\,du = 0$.

2.2 Lemma SA2

This second lemma characterizes the properties of the random partitioning scheme based on quantile estimates. These results will be used when handling the partitioning scheme $\mathcal{P}_{\text{QS},+,n}$: recall that $p_{+,j} = \hat F_+^{-1}(j/J_{+,n})$ in this case, $j=1,2,\cdots,J_{+,n}$, and thus set $q_{+,j} = F_+^{-1}(j/J_{+,n})$ with $F_+^{-1}(y) = \inf\{x: F_+(x)\ge y\}$ and
$$F_+(x) = \frac{\mathbb{P}[X_i\le x,\, X_i\ge\bar x]}{\mathbb{P}[X_i\ge\bar x]} = F(x\,|\,X_i\ge\bar x).$$

Lemma SA2. Let Assumption 1 hold. For $\mathcal{P}_{\text{QS},+,n}$, if $J_{+,n}\log(J_{+,n})/n\to 0$ and $J_{+,n}/\log(n)\to\infty$, then the following results hold.

(i) $\max_{1\le j\le J_{+,n}}|N_{+,j}/N_+ - 1/J_{+,n}| = o_P(J_{+,n}^{-1})$.

(ii) $\max_{1\le j\le J_{+,n}}|p_{+,j}-p_{+,j-1}-(q_{+,j}-q_{+,j-1})| = o_P(J_{+,n}^{-1})$.

Proof of Lemma SA2. Because the sample size $N_+$ is random, we employ the following result: if $N_+\to_{as}\infty$ and $Z_n\to_{as}Z_\infty$, then $Z_{N_+}\to_{as}Z_\infty$. In our case, $N_+=\sum_{i=1}^{n}\mathbb{1}(X_i\ge\bar x)$ and thus $N_+/n\to_{as}P_+$. Hence, it suffices to treat $N_+\to\infty$ as nonrandom, but we need to prove the statements in an almost sure sense. The rest of the proof takes limits as $N_+\to\infty$.

Part (i) follows from properties of distribution function and quantile processes (e.g., Shorack and Wellner, 2009). Using continuity and boundedness of $f(x)$, we have
$$N_{+,j} = \sum_{i=1}^{n}\mathbb{1}\left(\hat F_+^{-1}\left(\frac{j-1}{J_{+,n}}\right)\le X_i<\hat F_+^{-1}\left(\frac{j}{J_{+,n}}\right)\right) = \left(N_+\hat F_+\left(\hat F_+^{-1}\left(\frac{j}{J_{+,n}}\right)\right)-N_+\hat F_+\left(\hat F_+^{-1}\left(\frac{j-1}{J_{+,n}}\right)\right)\right)\{1+o_{as}(1)\} = \frac{N_+}{J_{+,n}}\{1+o_{as}(1)\},$$
uniformly in $j=1,2,\cdots,J_{+,n}$, under the rate restrictions imposed. Similarly, part (ii) follows from properties of the modulus of continuity of the sample quantile process (e.g., Mason (1984) and Shorack and Wellner (2009, Chapter 14)). We have
$$\max_{1\le j\le J_{+,n}}|p_{+,j}-p_{+,j-1}-(q_{+,j}-q_{+,j-1})| = \max_{1\le j\le J_{+,n}}\left|\left(\hat F_+^{-1}\left(\frac{j}{J_{+,n}}\right)-\hat F_+^{-1}\left(\frac{j-1}{J_{+,n}}\right)\right)-\left(F_+^{-1}\left(\frac{j}{J_{+,n}}\right)-F_+^{-1}\left(\frac{j-1}{J_{+,n}}\right)\right)\right| = o_{as}(J_{+,n}^{-1}),$$
under the rate restrictions imposed.

2.3 Lemma SA3

Our final technical lemma gives the main convergence results for the spacings estimators used to construct data-driven choices of partition sizes. We employ the notation introduced in Section 5 of the main paper.

Lemma SA3. Let Assumption 1 hold, and set $\ell\in\mathbb{Z}_+$. If $Y_i(1)$ is continuously distributed and $g:[\bar x,x_u]\to\mathbb{R}_+$ is continuous, then the following results hold.

(i) $N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}\,g(\bar X_{+,(i)}) \to_P \ell!\,P_+^{\ell-1}\int_{\bar x}^{x_u}f(x)^{1-\ell}g(x)dx$.

(ii) $N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}(Y_{+,[i]}-Y_{+,[i-1]})^{2}\,g(\bar X_{+,(i)}) \to_P 2\,\ell!\,P_+^{\ell-1}\int_{\bar x}^{x_u}f(x)^{1-\ell}\sigma_+^{2}(x)g(x)dx$.

Proof of Lemma SA3. We prove the result assuming that $N_+$ is nonrandom, and thus limits are taken as $N_+\to\infty$. Set $U_i = F_+(X_{+,i})\sim\mathrm{Uniform}(0,1)$ and $U_{(i)} = F_+(X_{+,(i)})$, $i=1,\cdots,N_+$. Recall that $\{N_+(U_{(i)}-U_{(i-1)}): i=2,\cdots,N_+\} =_d \{E_i/\bar E: i=2,\cdots,N_+\}$, where $\{E_i: i=2,\cdots,N_+\}$ are i.i.d. random variables with $E_i\sim\mathrm{Exponential}(1)$ and $\bar E = \sum_{i=2}^{N_+}E_i/N_+$, and where $Z_1 =_d Z_2$ denotes that $Z_1$ and $Z_2$ have the same probability law. Set $\bar u_{n,i} = (i-1/2)/N_+$ and recall that $\max_{2\le i\le N_+}\sup_{U_{(i-1)}\le u\le U_{(i)}}|u-\bar u_{n,i}|\to_P 0$.

For part (i), using the above, $N_+^{-1}\sum_{i=2}^{N_+}E_i^{\ell}\to_P\mathbb{E}[E_i^{\ell}] = \ell!$, and uniform continuity of $g(\cdot)$ and $f(\cdot)$,
$$N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}g(\bar X_{+,(i)}) = \frac{1}{N_+}\sum_{i=2}^{N_+}(N_+(U_{(i)}-U_{(i-1)}))^{\ell}\frac{g(F_+^{-1}(\bar u_{n,i}))}{f_+(F_+^{-1}(\bar u_{n,i}))^{\ell}}\{1+o_P(1)\}$$
$$=_d \frac{1}{N_+}\sum_{i=2}^{N_+}\left(\frac{E_i}{\bar E}\right)^{\ell}\frac{g(F_+^{-1}(\bar u_{n,i}))}{f_+(F_+^{-1}(\bar u_{n,i}))^{\ell}}\{1+o_P(1)\} = \mathbb{E}[E_i^{\ell}]\frac{1}{N_+}\sum_{i=2}^{N_+}\frac{g(F_+^{-1}(\bar u_{n,i}))}{f_+(F_+^{-1}(\bar u_{n,i}))^{\ell}}\{1+o_P(1)\} \to_P \ell!\int_{0}^{1}\frac{g(F_+^{-1}(u))}{f_+(F_+^{-1}(u))^{\ell}}du,$$
and the result follows by change of variables and because $f_+(x) = f(x)\mathbb{1}(x\ge\bar x)/P_+$. This result implies, in particular, $\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}g(\bar X_{+,(i)}) = O_P(N_+^{1-\ell})$.

For part (ii), let $X_{(+)} = (X_{+,(1)},X_{+,(2)},\cdots,X_{+,(N_+)})$. Recall that $(Y_{+,[1]},Y_{+,[2]},\cdots,Y_{+,[N_+]})$ are independent conditional on $X_{(+)}$ and $\mathbb{E}[g(Y_{+,[i]})|X_{(+)}] = \mathbb{E}[g(Y_{+,[i]})|X_{+,(i)}] = G(X_{+,(i)})$ with $G(x) = \mathbb{E}[g(Y_{+,i})|X_{+,i}=x]$. Therefore, $\mathbb{E}[(Y_{+,[i]}-Y_{+,[i-1]})^2|X_{(+)}] = \sigma_+^2(X_{+,(i)})+\sigma_+^2(X_{+,(i-1)})+(\mathbb{E}[Y_{+,[i]}|X_{(+)}]-\mathbb{E}[Y_{+,[i-1]}|X_{(+)}])^2 = \sigma_+^2(X_{+,(i)})+\sigma_+^2(X_{+,(i-1)})+O_P(N_+^{-2})$, uniformly in $i$. This

gives
$$N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}(Y_{+,[i]}-Y_{+,[i-1]})^{2}g(\bar X_{+,(i)}) = T_1+T_2,$$
with
$$T_1 = N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}\left(\sigma_+^2(X_{+,(i)})+\sigma_+^2(X_{+,(i-1)})\right)g(\bar X_{+,(i)})+o_P(1),$$
$$T_2 = N_+^{\ell-1}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}\left((Y_{+,[i]}-Y_{+,[i-1]})^{2}-\mathbb{E}[(Y_{+,[i]}-Y_{+,[i-1]})^{2}|X_{(+)}]\right)g(\bar X_{+,(i)}).$$
Noting that $\sigma_+^2(X_{+,(i)})+\sigma_+^2(X_{+,(i-1)}) = 2\sigma_+^2(X_{+,(i)})\{1+o_P(1)\}$, uniformly in $i$, it follows that $T_1\to_P 2\,\ell!\,P_+^{\ell-1}\int_{\bar x}^{x_u}f(x)^{1-\ell}\sigma_+^2(x)g(x)dx$, as in part (i). Thus, it remains to show that $T_2\to_P 0$. To this end, first define $\tilde Y_i = (Y_{+,[i]}-Y_{+,[i-1]})^2-\mathbb{E}[(Y_{+,[i]}-Y_{+,[i-1]})^2|X_{(+)}]$, and note that $\mathbb{E}[\tilde Y_i\tilde Y_{i-s}|X_{(+)}] = 0$ whenever $s\ge 2$, which implies
$$\mathbb{V}[T_2|X_{(+)}] \le N_+^{2(\ell-1)}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{2\ell}\,\mathbb{V}[\tilde Y_i|X_{(+)}]\,g(\bar X_{+,(i)})^2$$
$$+\,2N_+^{2(\ell-1)}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^{\ell}(X_{+,(i-1)}-X_{+,(i-2)})^{\ell}\,\mathbb{E}[\tilde Y_i\tilde Y_{i-1}|X_{(+)}]\,g(\bar X_{+,(i)})g(\bar X_{+,(i-1)}) \le CN_+^{-1},$$
and the result follows by the dominated convergence theorem.

The random sample size case ($N_+ = \sum_{i=1}^{n}\mathbb{1}(X_i\ge\bar x)$) can be handled, for example, using the approach described in Aras et al. (1989) and references therein.
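Lemma SA3(i) can be illustrated numerically. The following is a simulation sketch of ours, for intuition only: for X uniform on [0,1] (so f = 1 and P_+ = 1), with ℓ = 2 and g = 1, the statistic N Σ (X_(i) − X_(i−1))² should be close to ℓ! = 2 in large samples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = np.sort(rng.random(n))          # order statistics of a Uniform(0,1) sample

# N^{l-1} * sum of (spacings)^l with l = 2 and g(x) = 1:
stat = n * np.sum(np.diff(x) ** 2)

# Limit from Lemma SA3(i): l! * P_+^{l-1} * integral of f(x)^{1-l} dx = 2! = 2.
print(stat)
```

The printed value fluctuates around 2 across seeds, with deviations of order n^{-1/2}.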

2.4 Proof of Theorem 1

For the variance part, we have
$$\mathbb{V}[\hat\mu_+(x;J_{+,n})|X_n] = \sum_{j=1}^{J_{+,n}}\frac{\mathbb{1}(N_{+,j}>0)\mathbb{1}_{\mathcal{P}_{+,j}}(x)}{N_{+,j}^2}\sum_{i=1}^{n}\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\sigma_+^2(X_i),$$
and using uniform continuity of $w(\cdot)$ and $\sigma_+^2(\cdot)$ on $[\bar x,x_u]$ and Lemma SA1, we obtain
$$\int_{\bar x}^{x_u}\mathbb{V}[\hat\mu_+(x;J_{+,n})|X_n]w(x)dx = \sum_{j=1}^{J_{+,n}}\frac{\mathbb{1}(N_{+,j}>0)}{N_{+,j}^2}\int_{\bar x}^{x_u}\mathbb{1}_{\mathcal{P}_{+,j}}(x)w(x)dx\sum_{i=1}^{n}\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\sigma_+^2(X_i)$$
$$= \sum_{j=1}^{J_{+,n}}\frac{\mathbb{1}(N_{+,j}>0)}{N_{+,j}}(p_{+,j}-p_{+,j-1})\sigma_+^2(\bar p_{+,j})w(\bar p_{+,j})\{1+o_P(1)\} = \frac{1}{n}\sum_{j=1}^{J_{+,n}}\frac{\sigma_+^2(\bar p_{+,j})w(\bar p_{+,j})}{f(\bar p_{+,j})}\{1+o_P(1)\},$$
because $\mathbb{P}[X_i\in\mathcal{P}_{+,j}] = \int_{p_{+,j-1}}^{p_{+,j}}f(x)dx = (p_{+,j}-p_{+,j-1})f(\bar p_{+,j})\{1+o(1)\}$ uniformly in $j$. Using properties of the Riemann integral it then follows that
$$\int_{\bar x}^{x_u}\mathbb{V}[\hat\mu_{\text{ES},+}(x;J_{+,n})|X_n]w(x)dx = \frac{J_{+,n}}{n}\frac{1}{x_u-\bar x}\sum_{j=1}^{J_{+,n}}(p_{+,j}-p_{+,j-1})\frac{\sigma_+^2(\bar p_{+,j})w(\bar p_{+,j})}{f(\bar p_{+,j})}\{1+o_P(1)\}$$
$$= \frac{J_{+,n}}{n}\frac{1}{x_u-\bar x}\int_{\bar x}^{x_u}\frac{\sigma_+^2(x)}{f(x)}w(x)dx\{1+o_P(1)\} = \frac{J_{+,n}}{n}V_{\text{ES},+}\{1+o_P(1)\},$$
because $p_{+,j+1}-p_{+,j} = (x_u-\bar x)/J_{+,n}$ for the evenly spaced partition.

Next, for the bias term, note that $\int_{\bar x}^{x_u}(\mathbb{E}[\hat\mu_+(x;J_{+,n})|X_n]-\mu_+(x))^2w(x)dx = T_1+T_2+T_3$ with

because p+,j+1 − p+,j = (xu − x ¯)/J+,n for the evenly spaced partition. Rx µ+ (x; Jn )|Xn ] − µ+ (x))2 w(x)dx = T1 + T2 + T3 with Next, for the bias term, note that x¯ u (E[ˆ xu

Z

Z

2

T1 (x) w(x)dx,

T1 =

xu

2

T2 (x) w(x)dx,

T2 =

xu

T1 (x)T2 (x)w(x)dx,

T3 = 2 x ¯

x ¯

x ¯

Z

J+,n

T1 (x) =

X

1P+,j (x)(1(N+,j > 0)µ+ (¯p+,j ) − µ+ (x)),

j=1 J+,n

T2 (x) =

X

1P+,j (x)

j=1

n 1(N+,j > 0) X

N+,j

!

1P+,j (Xi )(µ+ (Xi ) − µ+ (¯p+,j )) .

i=1

Using uniform continuity of µ+ (·) and w(·) on [¯ x, xu ] and Lemma SA1, we obtain J+,n Z xu 2 1 X  (1) T1 = µ+ (¯ p+,j ) w(¯ p+,j ) 1P+,j (x)(¯p+,j − x)2 dx{1 + oP (1)} 12 x ¯ j=1

J+,n

2  1 X (1) p+,j ){1 + oP (1)} (p+,j − p+,j−1 )3 µ+ (¯ p+,j ) w(¯ 12 j=1 Z 1 (xu − x ¯)2 xu  (1) 2 −2 = 2 µ+ (x) w(x)dx{1 + oP (1)} = J+,n BES,+ {1 + oP (1)}, 12 J+,n x ¯ =

because

Rb

2 a ((a + b)/2 − x) dx

= (b − a)3 /12 and p+,j+1 − p+,j = (xu − x ¯)/J+,n for the evenly spaced

−2 −2 partition. This implies that T1 = OP (J+,n ). Thus, to finish the proof, we show that T2 = oP (J+,n ) −2 and T3 = oP (J+,n ). For T2 , using uniform continuity of µ+ (·) and w(·) on [¯ x, xu ] and Lemma SA1

we have
$$|T_2| \le C\sum_{j=1}^{J_{+,n}}\frac{\mathbb{1}(N_{+,j}>0)}{J_{+,n}^2N_{+,j}^2/n^2}\left(\frac{1}{n}\sum_{i=1}^{n}\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\frac{X_i-\bar p_{+,j}}{p_{+,j}-p_{+,j-1}}\right)^2\{1+o_P(1)\} = o_P(J_{+,n}^{-2}),$$
while, for $T_3$, the Cauchy–Schwarz inequality implies $|T_3| \le 2\sqrt{T_1T_2} = O_P(J_{+,n}^{-1})o_P(J_{+,n}^{-1}) = o_P(J_{+,n}^{-2})$.

2.5 Proof of Theorem 2

Recall that $p_{+,j} = \hat F_+^{-1}(j/J_{+,n})$ and $q_{+,j} = F_+^{-1}(j/J_{+,n})$. If $J_{+,n}<N_+$, then $\mathbb{1}(N_{+,j}>0) = 1$, but now the partitioning scheme $\mathcal{P}_{\text{QS},+,n}$ is random. For the variance part, letting $\bar q_{+,j} = (q_{+,j}+q_{+,j-1})/2$, we have
$$\int_{\bar x}^{x_u}\mathbb{V}[\hat\mu_{\text{QS},+}(x;J_{+,n})|X_n]w(x)dx = \sum_{j=1}^{J_{+,n}}\frac{1}{N_{+,j}^2}\int_{\bar x}^{x_u}\mathbb{1}_{\mathcal{P}_{+,j}}(x)w(x)dx\sum_{i=1}^{n}\mathbb{1}_{\mathcal{P}_{+,j}}(X_i)\sigma_+^2(X_i)$$
$$= \frac{J_{+,n}}{N_+}\sum_{j=1}^{J_{+,n}}(p_{+,j}-p_{+,j-1})\sigma_+^2(\bar p_{+,j})w(\bar p_{+,j})\{1+o_P(1)\} = \frac{J_{+,n}}{N_+}\sum_{j=1}^{J_{+,n}}(q_{+,j}-q_{+,j-1})\sigma_+^2(\bar q_{+,j})w(\bar q_{+,j})\{1+o_P(1)\}$$
$$= \frac{J_{+,n}}{n}\frac{1}{P_+}\int_{\bar x}^{x_u}\sigma_+^2(x)w(x)dx\{1+o_P(1)\} = \frac{J_{+,n}}{n}V_{\text{QS},+}\{1+o_P(1)\},$$
using Lemma SA2 and properties of the Riemann integral.

For the bias part, using the previous results and proceeding as in the proof of Theorem 1,

$$\int_{\bar x}^{x_u}(\mathbb{E}[\hat\mu_{\text{QS},+}(x;J_{+,n})|X_n]-\mu_+(x))^2w(x)dx = \frac{1}{12}\sum_{j=1}^{J_{+,n}}(p_{+,j}-p_{+,j-1})^3\left(\mu_+^{(1)}(\bar p_{+,j})\right)^2w(\bar p_{+,j})\{1+o_P(1)\}$$
$$= \frac{1}{12}\sum_{j=1}^{J_{+,n}}(q_{+,j}-q_{+,j-1})^3\left(\mu_+^{(1)}(\bar q_{+,j})\right)^2w(\bar q_{+,j})\{1+o_P(1)\} = \frac{1}{J_{+,n}^2}\frac{P_+^2}{12}\int_{\bar x}^{x_u}\left(\frac{\mu_+^{(1)}(x)}{f(x)}\right)^2w(x)dx\{1+o_P(1)\} = J_{+,n}^{-2}B_{\text{QS},+}\{1+o_P(1)\},$$
because, for quantile spaced partitions, expanding $F_+^{-1}(u)$ around $\bar u = F_+(\bar q_{+,j})\in[(j-1)/J_{+,n},\,j/J_{+,n}]$,
$$q_{+,j}-q_{+,j-1} = F_+^{-1}\left(\frac{j}{J_{+,n}}\right)-F_+^{-1}\left(\frac{j-1}{J_{+,n}}\right) = \frac{1}{f_+(\bar q_{+,j})}\frac{1}{J_{+,n}}\{1+o(1)\},$$
uniformly in $j=1,2,\cdots,J_{+,n}$, where $f_+(x) = \partial F_+(x)/\partial x = f(x)\mathbb{1}(x\ge\bar x)/P_+$.
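The first-order expansion q_{+,j} − q_{+,j−1} ≈ 1/(f_+(q̄_{+,j}) J_{+,n}) can be checked numerically for any distribution with a known quantile function. A small sketch of ours, using the unit exponential (F⁻¹(u) = −log(1−u), f(x) = e^{−x}) in place of F_+:

```python
import math

J = 1000
q = [-math.log(1.0 - j / J) for j in range(J)]   # q_j = F^{-1}(j/J), j < J

# Ratio of the exact spacing to the approximation 1 / (f(q_bar) * J):
ratios = []
for j in range(1, 901):                           # stay away from the upper tail
    spacing = q[j] - q[j - 1]
    q_bar = 0.5 * (q[j] + q[j - 1])
    ratios.append(spacing * math.exp(-q_bar) * J)

worst = max(abs(r - 1.0) for r in ratios)
print(worst)   # the approximation error vanishes as J grows
```

Here the worst-case relative error over the bins considered is of much smaller order than 1/J, consistent with the uniform-in-j statement above.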

2.6 Proof of Theorem 3

Using Lemma SA3 with $\ell = 1$ and $g(x) = 1$,
$$\hat V_{\text{ES},+} = \frac{1}{x_u-\bar x}\frac{1}{2}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})(Y_{+,[i]}-Y_{+,[i-1]})^2 = \frac{1}{x_u-\bar x}\int_{\bar x}^{x_u}\sigma_+^2(x)dx+o_P(1),$$
which gives $\hat V_{\text{ES},+}\to_P V_{\text{ES},+}$. Next, note that for power series estimators, Newey (1997, Theorem 4) gives
$$\sup_{x\in[\bar x,x_u]}|\hat\mu_{+,k_n}^{(1)}(x)-\mu_+^{(1)}(x)|^2 = O_P(k_n^7/n+k_n^{-2S+8}) = o_P(1).$$
Using this uniform consistency result we have
$$\hat B_{\text{ES},+} = \frac{(x_u-\bar x)^2}{12n}\sum_{i=1}^{n}\mathbb{1}(X_i\ge\bar x)\left(\hat\mu_{+,k_n}^{(1)}(X_i)\right)^2 = \frac{(x_u-\bar x)^2}{12}\frac{1}{n}\sum_{i=1}^{n}\mathbb{1}(X_i\ge\bar x)\left(\mu_+^{(1)}(X_i)\right)^2+o_P(1)$$
$$= \frac{(x_u-\bar x)^2}{12}\int_{\bar x}^{x_u}\left(\mu_+^{(1)}(x)\right)^2w(x)dx+o_P(1),$$
which gives $\hat B_{\text{ES},+}\to_P B_{\text{ES},+}$. Putting the above together, consistency of all the data-driven selectors follows.
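The spacings estimator V̂_ES,+ used in this proof is simple to compute from the sorted running variable and its concomitants. A numerical sketch of ours (not the packaged implementation):

```python
import numpy as np

def v_hat_es(x_sorted, y_concomitant, x_lo, x_hi):
    """Spacings estimator (1/(x_hi - x_lo)) * (1/2) * sum_i (dX_i) * (dY_i)^2."""
    dx = np.diff(x_sorted)
    dy2 = np.diff(y_concomitant) ** 2
    return 0.5 * np.sum(dx * dy2) / (x_hi - x_lo)

# Toy data on [0, 1]: four spacings of 0.25, squared outcome differences of 1.
x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.array([1.0, 2.0, 1.0, 2.0, 1.0])
print(v_hat_es(x, y, 0.0, 1.0))   # 0.5
```

Adjacent outcome differences proxy for 2σ²(x) locally, which is why the factor 1/2 appears in front of the sum.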

2.7 Proof of Remark 1

Note that for power series estimators, Newey (1997, Theorem 4) gives
$$\sup_{x\in[\bar x,x_u]}|\hat\mu_{+,k_n,p}(x)-\mathbb{E}[Y_i(1)^p|X_i=x]|^2 = O_P(k_n^3/n+k_n^{-2S+2}) = o_P(1)$$
for $p = 1,2$, under the assumptions imposed, which implies
$$\sup_{x\in[\bar x,x_u]}|\hat\sigma_+^2(x)-\sigma_+^2(x)|^2 = O_P(k_n^3/n+k_n^{-2S+2}) = o_P(1).$$
Using this result, and Lemma SA3 with $\ell = 1$ and $g(x) = \sigma_+^2(x)$,
$$\check V_{\text{ES},+} = \frac{1}{x_u-\bar x}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\hat\sigma_{+,k_n}^2(\bar X_{+,(i)}) = \frac{1}{x_u-\bar x}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\sigma_+^2(\bar X_{+,(i)})+o_P(1) \to_P \frac{1}{x_u-\bar x}\int_{\bar x}^{x_u}\sigma_+^2(x)dx = V_{\text{ES},+}.$$
Combining this with Theorem SA1, the different consistency results follow.

2.8 Proof of Theorem 4 and Remark 2

Proceeding as in the proofs of Theorem 3 and Remark 1, the results are established using Lemma SA3, N+ /n →P P+ , and uniform consistency of power series estimators, as appropriate for each case.

3 Data-Driven Implementations with Arbitrary w(x)

In this section we provide data-driven implementations for all of our number-of-bins selectors when $w(x)$ is taken as given. As discussed in the main text, we estimate the unknown constants using ideas related to spacings estimators whenever possible, but we also discuss series (polynomial) nonparametric regression estimates for completeness (to handle the non-continuous outcome case).

Recall the notation introduced in the main paper related to order statistics and concomitants. For a collection of continuous random variables $\{(Z_i,W_i): i=1,2,\cdots,n\}$ we let $W_{(i)}$ be the $i$-th order statistic of $W_i$ and $Z_{[i]}$ its corresponding concomitant. That is, $W_{(1)}<W_{(2)}<\cdots<W_{(n)}$ and $(Z_{[i]},W_{(i)}) = (Z_j,W_j)$ for the index $j$ such that $W_j = W_{(i)}$, $i=1,2,\cdots,n$. Let $\{(Y_{-,i},X_{-,i}): i=1,2,\cdots,N_-\}$ and $\{(Y_{+,i},X_{+,i}): i=1,2,\cdots,N_+\}$ be the subsamples of control ($X_i<\bar x$) and treatment ($X_i\ge\bar x$) units, respectively. We also have:
$$\bar X_{-,(i)} = \frac{X_{-,(i)}+X_{-,(i-1)}}{2},\quad i=2,3,\cdots,N_-,\qquad \bar X_{+,(i)} = \frac{X_{+,(i)}+X_{+,(i-1)}}{2},\quad i=2,3,\cdots,N_+,$$
$$\hat\mu_{-,k}(x) = r_k(x)'\hat\beta_{-,k},\qquad \hat\mu_{+,k}(x) = r_k(x)'\hat\beta_{+,k},$$
and $r_k^{(1)}(x) = \partial r_k(x)/\partial x = (0,1,2x,3x^2,\cdots,kx^{k-1})'$.
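Order statistics, concomitants, and the midpoints X̄_(i) are straightforward to compute. A minimal sketch of ours (a notation-matching helper, not from the companion packages):

```python
import numpy as np

def sort_with_concomitants(x, y):
    """Return (X_(i), Y_[i], Xbar_(i)): the order statistics of x, their
    concomitants in y, and midpoints of consecutive order statistics."""
    order = np.argsort(x)
    x_sorted = x[order]           # X_(1) <= ... <= X_(N)
    y_concom = y[order]           # Y_[i] travels with its X_(i)
    midpoints = 0.5 * (x_sorted[1:] + x_sorted[:-1])
    return x_sorted, y_concom, midpoints

x = np.array([3.0, 1.0, 2.0])
y = np.array([30.0, 10.0, 20.0])
xs, yc, mid = sort_with_concomitants(x, y)
# xs = [1, 2, 3], yc = [10, 20, 30], mid = [1.5, 2.5]
```

Every estimator below is a weighted sum over these spacings, concomitant differences, and midpoints.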

3.1 Evenly Spaced RD Plots

For the case of ES RD plots with a generic weighting scheme $w(x)$, we propose the following estimators:
$$\hat V_{\text{ES},-} = \frac{1}{\bar x-x_l}\frac{n}{4}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})^2(Y_{-,[i]}-Y_{-,[i-1]})^2w(\bar X_{-,(i)}),\qquad\text{(SA-1)}$$
$$\hat B_{\text{ES},-} = \frac{(\bar x-x_l)^2}{12}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})\left(\hat\mu_{-,k}^{(1)}(\bar X_{-,(i)})\right)^2w(\bar X_{-,(i)}),\qquad\text{(SA-2)}$$
and
$$\hat V_{\text{ES},+} = \frac{1}{x_u-\bar x}\frac{n}{4}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^2(Y_{+,[i]}-Y_{+,[i-1]})^2w(\bar X_{+,(i)}),\qquad\text{(SA-3)}$$
$$\hat B_{\text{ES},+} = \frac{(x_u-\bar x)^2}{12}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\left(\hat\mu_{+,k}^{(1)}(\bar X_{+,(i)})\right)^2w(\bar X_{+,(i)}).\qquad\text{(SA-4)}$$
Thus, our proposed data-driven selectors for ES RD plots take the form:
$$\hat J_{\text{ES-}\mu,-,n} = \left\lceil\left(\frac{2\hat B_{\text{ES},-}}{\hat V_{\text{ES},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \hat J_{\text{ES-}\mu,+,n} = \left\lceil\left(\frac{2\hat B_{\text{ES},+}}{\hat V_{\text{ES},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-5)}$$
$$\hat J_{\text{ES-}\omega,-,n} = \left\lceil\omega_-\left(\frac{2\hat B_{\text{ES},-}}{\hat V_{\text{ES},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \hat J_{\text{ES-}\omega,+,n} = \left\lceil\omega_+\left(\frac{2\hat B_{\text{ES},+}}{\hat V_{\text{ES},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-6)}$$
$$\hat J_{\text{ES-}\vartheta,-,n} = \left\lceil\frac{\hat V_-}{\hat V_{\text{ES},-}}\frac{n}{\log(n)^2}\right\rceil \quad\text{and}\quad \hat J_{\text{ES-}\vartheta,+,n} = \left\lceil\frac{\hat V_+}{\hat V_{\text{ES},+}}\frac{n}{\log(n)^2}\right\rceil,\qquad\text{(SA-7)}$$

using the estimators in (SA-1)–(SA-4), and where $\hat V_-$ and $\hat V_+$ are consistent estimators of their population counterparts $V_-$ and $V_+$. The following theorem shows that, when the polynomial fits are viewed as nonparametric approximations with $k = k_n\to\infty$, the different number-of-bins selectors are consistent.

Theorem SA1. Suppose Assumption 1 holds with $S\ge 5$, $w:[x_l,x_u]\mapsto\mathbb{R}_+$ is continuous, and $Y_i(0)$ and $Y_i(1)$ are continuously distributed. If $k_n^7/n\to 0$ and $k_n\to\infty$, then
$$\frac{\hat J_{\text{ES-}\vartheta,-,n}}{J_{\text{ES-}\vartheta,-,n}}\to_P 1,\qquad \frac{\hat J_{\text{ES-}\omega,-,n}}{J_{\text{ES-}\omega,-,n}}\to_P 1,\qquad \frac{\hat J_{\text{ES-}\omega,+,n}}{J_{\text{ES-}\omega,+,n}}\to_P 1,\qquad \frac{\hat J_{\text{ES-}\vartheta,+,n}}{J_{\text{ES-}\vartheta,+,n}}\to_P 1,$$
provided that $\hat V_-\to_P V_-$ and $\hat V_+\to_P V_+$.

Proof of Theorem SA1. Using Lemma SA3 with $\ell = 2$ and $N_+/n\to_P P_+$,

provided that Vˆ− →P V− and Vˆ+ →P V+ . Proof of Theorem SA1. Using Lemma A3 with k = 2 and N+ /n →P P+ , N+

VˆES,+ =

1 nX ¯ +,(i) ) (X+,(i) − X+,(i−1) )2 (Y+,[i] − Y+,[i−1] )2 w(X xu − x ¯4 i=2 N+ N+ X

1 ¯ +,(i) ) + oP (1) (X+,(i) − X+,(i−1) )2 (Y+,[i] − Y+,[i−1] )2 w(X xu − x ¯ 4P+ i=2 Z xu 2 σ+ (x) 1 w(x)dx + oP (1), = xu − x ¯ x¯ f+ (x) =

which gives VˆES,+ →P VES,+ . Similarly, VˆES,− →P VES,− . (1)

(1)

Next, recall that for power series estimators $\sup_{x\in[\bar x,x_u]}|\hat\mu_{+,k_n}^{(1)}(x)-\mu_+^{(1)}(x)|^2 = O_P(k_n^7/n+k_n^{-2S+8}) = o_P(1)$. Using this uniform consistency result, and Lemma SA3 with $\ell = 1$, we have
$$\hat B_{\text{ES},+} = \frac{(x_u-\bar x)^2}{12}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\left(\hat\mu_{+,k_n}^{(1)}(\bar X_{+,(i)})\right)^2w(\bar X_{+,(i)})$$
$$= \frac{(x_u-\bar x)^2}{12}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\left(\mu_+^{(1)}(\bar X_{+,(i)})\right)^2w(\bar X_{+,(i)})+o_P(1) = \frac{(x_u-\bar x)^2}{12}\int_{\bar x}^{x_u}\left(\mu_+^{(1)}(x)\right)^2w(x)dx+o_P(1),$$
which gives $\hat B_{\text{ES},+}\to_P B_{\text{ES},+}$. Similarly, $\hat B_{\text{ES},-}\to_P B_{\text{ES},-}$.


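The selector formulas (SA-5)–(SA-7) reduce to a one-line computation once the constants are estimated. An illustrative sketch of ours (function names and inputs are hypothetical):

```python
import math

def j_imse(b_hat, v_hat, n):
    """IMSE-optimal number of bins: ceil((2*B/V)^(1/3) * n^(1/3)), as in (SA-5)."""
    return math.ceil((2.0 * b_hat / v_hat) ** (1.0 / 3.0) * n ** (1.0 / 3.0))

def j_wimse(b_hat, v_hat, n, omega):
    """WIMSE selector: the IMSE choice rescaled by omega, as in (SA-6)."""
    return math.ceil(omega * (2.0 * b_hat / v_hat) ** (1.0 / 3.0) * n ** (1.0 / 3.0))

print(j_imse(4.0, 1.0, 100))
```

With constants B̂ = 4, V̂ = 1 and n = 100, the cube-root rate gives roughly 2 · 100^{1/3} ≈ 9.3 bins before the ceiling.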

Recall that the special case $\omega_{V,-} = \omega_{V,+} = 1/2$ gives $\hat J_{\text{ES-}\mu,-,n} = \hat J_{\text{ES-}\omega,-,n}$ and $\hat J_{\text{ES-}\mu,+,n} = \hat J_{\text{ES-}\omega,+,n}$. Theorem SA1 therefore gives a formal justification for employing any of the selectors introduced in our paper for the number of bins in ES RD plots constructed with a known, arbitrary weight function $w(x)$; a particular choice being $w(x) = 1$.

As discussed in the main text, when $Y_i(0)$ and $Y_i(1)$ are not continuously distributed, the concomitant-based estimation method becomes invalid. In this case, we need to employ other, more standard nonparametric techniques. For example, assuming that $\mathbb{E}[Y_i(t)^2|X_i=x]$, $t = 0,1$, are twice continuously differentiable, we can use the following estimators:

$$\check V_{\text{ES},-} = \frac{1}{\bar x-x_l}\frac{n}{2}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})^2\hat\sigma_{-,k}^2(\bar X_{-,(i)})w(\bar X_{-,(i)}),$$
$$\check V_{\text{ES},+} = \frac{1}{x_u-\bar x}\frac{n}{2}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^2\hat\sigma_{+,k}^2(\bar X_{+,(i)})w(\bar X_{+,(i)}),$$
$$\hat\sigma_{-,k}^2(x) = \hat\mu_{-,k,2}(x)-(\hat\mu_{-,k,1}(x))^2,\qquad \hat\sigma_{+,k}^2(x) = \hat\mu_{+,k,2}(x)-(\hat\mu_{+,k,1}(x))^2,$$
where, for $k\in\mathbb{Z}_+$ and $p\in\mathbb{Z}_{++}$,
$$\hat\mu_{-,k,p}(x) = r_k(x)'\hat\beta_{-,k,p},\qquad \hat\beta_{-,k,p} = \arg\min_{\beta\in\mathbb{R}^{k+1}}\sum_{i=1}^{n}\mathbb{1}(X_i<\bar x)(Y_i^p-r_k(X_i)'\beta)^2,$$
$$\hat\mu_{+,k,p}(x) = r_k(x)'\hat\beta_{+,k,p},\qquad \hat\beta_{+,k,p} = \arg\min_{\beta\in\mathbb{R}^{k+1}}\sum_{i=1}^{n}\mathbb{1}(X_i\ge\bar x)(Y_i^p-r_k(X_i)'\beta)^2,$$

and note that $\hat\mu_{-,k}(x) = \hat\mu_{-,k,1}(x)$ and $\hat\mu_{+,k}(x) = \hat\mu_{+,k,1}(x)$ with our notation. From results for power series estimators,
$$\sup_{x\in[\bar x,x_u]}|\hat\mu_{+,k_n,p}(x)-\mathbb{E}[Y_i(1)^p|X_i=x]|^2 = O_P(k_n^3/n+k_n^{-2S+2}) = o_P(1)$$
for $p = 1,2$, under the assumptions imposed, which implies
$$\sup_{x\in[\bar x,x_u]}|\hat\sigma_+^2(x)-\sigma_+^2(x)|^2 = O_P(k_n^3/n+k_n^{-2S+2}) = o_P(1).$$

Therefore, Lemma SA3 with $\ell = 2$ and $N_+/n\to_P P_+$ give
$$\check V_{\text{ES},+} = \frac{1}{x_u-\bar x}\frac{n}{2}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^2\hat\sigma_{+,k_n}^2(\bar X_{+,(i)})w(\bar X_{+,(i)})$$
$$= \frac{1}{x_u-\bar x}\frac{1}{2P_+}N_+\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^2\sigma_+^2(\bar X_{+,(i)})w(\bar X_{+,(i)})+o_P(1) = \frac{1}{x_u-\bar x}\int_{\bar x}^{x_u}\frac{\sigma_+^2(x)}{f(x)}w(x)dx+o_P(1),$$
which gives $\check V_{\text{ES},+}\to_P V_{\text{ES},+}$. Similarly, $\check V_{\text{ES},-}\to_P V_{\text{ES},-}$. Combining these results with Theorem SA1, it can easily be shown that the following selectors are consistent for any continuous, arbitrary choice of $w(x)$:

$$\check J_{\text{ES-}\mu,-,n} = \left\lceil\left(\frac{2\hat B_{\text{ES},-}}{\check V_{\text{ES},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \check J_{\text{ES-}\mu,+,n} = \left\lceil\left(\frac{2\hat B_{\text{ES},+}}{\check V_{\text{ES},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-8)}$$
$$\check J_{\text{ES-}\omega,-,n} = \left\lceil\omega_-\left(\frac{2\hat B_{\text{ES},-}}{\check V_{\text{ES},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \check J_{\text{ES-}\omega,+,n} = \left\lceil\omega_+\left(\frac{2\hat B_{\text{ES},+}}{\check V_{\text{ES},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-9)}$$
$$\check J_{\text{ES-}\vartheta,-,n} = \left\lceil\frac{\hat V_-}{\check V_{\text{ES},-}}\frac{n}{\log(n)^2}\right\rceil \quad\text{and}\quad \check J_{\text{ES-}\vartheta,+,n} = \left\lceil\frac{\hat V_+}{\check V_{\text{ES},+}}\frac{n}{\log(n)^2}\right\rceil,\qquad\text{(SA-10)}$$
provided that $\hat V_-\to_P V_-$ and $\hat V_+\to_P V_+$.
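The series-based variance function σ̂²_{±,k}(x) = μ̂_{±,k,2}(x) − (μ̂_{±,k,1}(x))² can be sketched with ordinary polynomial least squares. A minimal illustration of ours, using `numpy.polyfit` as the global polynomial fit on synthetic data:

```python
import numpy as np

def sigma2_hat(x_data, y_data, k, x_eval):
    """Series estimate of the conditional variance: fit degree-k polynomials to
    Y and Y^2 by least squares, then take E_hat[Y^2|x] - (E_hat[Y|x])^2."""
    mu1 = np.polyval(np.polyfit(x_data, y_data, k), x_eval)
    mu2 = np.polyval(np.polyfit(x_data, y_data ** 2, k), x_eval)
    return mu2 - mu1 ** 2

# Noiseless check: if Y = 2 + 3X exactly, the estimated variance is ~0,
# since Y^2 = 4 + 12X + 9X^2 is itself a degree-2 polynomial.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x
print(sigma2_hat(x, y, 2, 0.5))
```

With noisy data the same construction estimates σ²(x) consistently as the degree grows, which is the role of k = k_n above.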

3.2 Quantile Spaced RD Plots

We discuss generic estimators for QS RD plots employing an arbitrary, known weighting function $w(x)$, paralleling the results given above for ES RD plots. The underlying estimators are:
$$\hat V_{\text{QS},-} = \frac{n}{2N_-}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})(Y_{-,[i]}-Y_{-,[i-1]})^2w(\bar X_{-,(i)}),\qquad\text{(SA-11)}$$
$$\hat B_{\text{QS},-} = \frac{N_-^2}{72}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})^3\left(\hat\mu_{-,k}^{(1)}(\bar X_{-,(i)})\right)^2w(\bar X_{-,(i)}),\qquad\text{(SA-12)}$$
and
$$\hat V_{\text{QS},+} = \frac{n}{2N_+}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})(Y_{+,[i]}-Y_{+,[i-1]})^2w(\bar X_{+,(i)}),\qquad\text{(SA-13)}$$
$$\hat B_{\text{QS},+} = \frac{N_+^2}{72}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})^3\left(\hat\mu_{+,k}^{(1)}(\bar X_{+,(i)})\right)^2w(\bar X_{+,(i)}).\qquad\text{(SA-14)}$$

Therefore, the resulting selectors for QS partitions take the form:
$$\hat J_{\text{QS-}\mu,-,n} = \left\lceil\left(\frac{2\hat B_{\text{QS},-}}{\hat V_{\text{QS},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \hat J_{\text{QS-}\mu,+,n} = \left\lceil\left(\frac{2\hat B_{\text{QS},+}}{\hat V_{\text{QS},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-15)}$$
$$\hat J_{\text{QS-}\omega,-,n} = \left\lceil\omega_-\left(\frac{2\hat B_{\text{QS},-}}{\hat V_{\text{QS},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \hat J_{\text{QS-}\omega,+,n} = \left\lceil\omega_+\left(\frac{2\hat B_{\text{QS},+}}{\hat V_{\text{QS},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-16)}$$
$$\hat J_{\text{QS-}\vartheta,-,n} = \left\lceil\frac{\hat V_-}{\hat V_{\text{QS},-}}\frac{n}{\log(n)^2}\right\rceil \quad\text{and}\quad \hat J_{\text{QS-}\vartheta,+,n} = \left\lceil\frac{\hat V_+}{\hat V_{\text{QS},+}}\frac{n}{\log(n)^2}\right\rceil,\qquad\text{(SA-17)}$$

using the estimators in (SA-11)–(SA-14), and appropriate consistent estimators $\hat V_-$ and $\hat V_+$. As in the case of Theorem SA1 for ES RD plots, the following theorem shows that these automatic partition-size selectors are consistent if the polynomial fits are viewed as nonparametric approximations with $k = k_n\to\infty$.

Theorem SA2. Suppose Assumption 1 holds with $S\ge 5$, $w:[x_l,x_u]\mapsto\mathbb{R}_+$ is continuous, and $Y_i(0)$ and $Y_i(1)$ are continuously distributed. If $k_n^7/n\to 0$ and $k_n\to\infty$, then
$$\frac{\hat J_{\text{QS-}\omega,-,n}}{J_{\text{QS-}\omega,-,n}}\to_P 1,\qquad \frac{\hat J_{\text{QS-}\vartheta,-,n}}{J_{\text{QS-}\vartheta,-,n}}\to_P 1,\qquad \frac{\hat J_{\text{QS-}\omega,+,n}}{J_{\text{QS-}\omega,+,n}}\to_P 1,\qquad \frac{\hat J_{\text{QS-}\vartheta,+,n}}{J_{\text{QS-}\vartheta,+,n}}\to_P 1,$$
provided that $\hat V_-\to_P V_-$ and $\hat V_+\to_P V_+$.

In practice, the choice $w(x) = 1$ is arguably the simplest one, but our results permit any continuous function $w(x)$. The proof of Theorem SA2 is very similar to the proof of Theorem SA1 given above, and hence is omitted here to conserve space.

Next, for the case of non-continuous potential outcomes $Y_i(0)$ and $Y_i(1)$, we use the series polynomial estimation approach already introduced. Assuming that $\mathbb{E}[Y_i(t)^2|X_i=x]$, $t = 0,1$, are twice continuously differentiable, we may use the following estimators:
$$\check V_{\text{QS},-} = \frac{n}{N_-}\sum_{i=2}^{N_-}(X_{-,(i)}-X_{-,(i-1)})\hat\sigma_{-,k}^2(\bar X_{-,(i)})w(\bar X_{-,(i)}),$$
$$\check V_{\text{QS},+} = \frac{n}{N_+}\sum_{i=2}^{N_+}(X_{+,(i)}-X_{+,(i-1)})\hat\sigma_{+,k}^2(\bar X_{+,(i)})w(\bar X_{+,(i)}),$$

where $\hat\sigma_{-,k}^2(x)$ and $\hat\sigma_{+,k}^2(x)$ are the polynomial approximations already discussed. The associated data-driven partition-size selectors are

$$\check J_{\text{QS-}\mu,-,n} = \left\lceil\left(\frac{2\hat B_{\text{QS},-}}{\check V_{\text{QS},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \check J_{\text{QS-}\mu,+,n} = \left\lceil\left(\frac{2\hat B_{\text{QS},+}}{\check V_{\text{QS},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-18)}$$
$$\check J_{\text{QS-}\omega,-,n} = \left\lceil\omega_-\left(\frac{2\hat B_{\text{QS},-}}{\check V_{\text{QS},-}}\right)^{1/3}n^{1/3}\right\rceil \quad\text{and}\quad \check J_{\text{QS-}\omega,+,n} = \left\lceil\omega_+\left(\frac{2\hat B_{\text{QS},+}}{\check V_{\text{QS},+}}\right)^{1/3}n^{1/3}\right\rceil,\qquad\text{(SA-19)}$$
$$\check J_{\text{QS-}\vartheta,-,n} = \left\lceil\frac{\hat V_-}{\check V_{\text{QS},-}}\frac{n}{\log(n)^2}\right\rceil \quad\text{and}\quad \check J_{\text{QS-}\vartheta,+,n} = \left\lceil\frac{\hat V_+}{\check V_{\text{QS},+}}\frac{n}{\log(n)^2}\right\rceil,\qquad\text{(SA-20)}$$
which are easily shown to be consistent in the sense of Theorem SA2, provided the conditions in that theorem hold.
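The mimicking-variance selectors in (SA-17) and (SA-20) share the same form and are equally simple to compute. An illustrative sketch of ours (names and inputs hypothetical):

```python
import math

def j_mimicking_variance(v_hat, v_partition_hat, n):
    """Mimicking-variance number of bins: ceil((V/V_partition) * n / log(n)^2)."""
    return math.ceil((v_hat / v_partition_hat) * n / math.log(n) ** 2)

print(j_mimicking_variance(1.0, 2.0, 1000))
```

Note the near-linear rate n/log(n)², which grows much faster than the cube-root IMSE rate; this is why the mimicking-variance choice delivers many more (smaller) bins and a denser "cloud of points".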

4 Other Empirical Applications

In this section we include three additional empirical applications to illustrate the performance of our proposed methods when applied to different real datasets. Software packages in R and Stata are described in Calonico, Cattaneo and Titiunik (2014a, 2015).

4.1 U.S. Senate Data

We employ an extract of the dataset constructed by Cattaneo et al. (2015), who study several measures of incumbency advantage in U.S. Senate elections for the period 1914–2010. In particular, we focus here on the RD effect of the Democratic party winning a U.S. Senate seat on the vote share obtained in the following election for that same seat. This empirical illustration is analogous to the one presented by Lee (2008) for U.S. House elections: the running variable is the state-level margin of victory of the Democratic party in an election for a Senate seat, the threshold is x̄ = 0, and the outcome is the vote share of the Democratic party in the following election for the same Senate seat in the state, which occurs six years later. The unit of observation is the state, and the dataset has a total of n = 1,297 complete state-year observations. Results are presented in Figures SA-1 and SA-2.

4.2 Progresa/Oportunidades Data

We illustrate the performance of our methods employing household data from Oportunidades (formerly known as Progresa), a well-known large-scale anti-poverty conditional cash transfer program in Mexico. This conditional cash transfer program targeted poor households in rural and urban areas in Mexico; it started in 1998 under the name Progresa in rural areas. The most important elements of the program are the nutrition, health, and education components. The nutrition component consists of a cash grant for all treated households and an additional supplement for households with young children and pregnant or lactating mothers. The educational grant is linked to regular attendance in school; it starts in the third grade of primary school and continues until the last grade of secondary school. The transfer constituted a significant contribution to the income of eligible families.

This social program is best known for its experimental design: treatment was initially randomly assigned at the locality level in rural areas. Progresa was expanded to urban areas in 2003. Unlike in the rural program, the allocation across treatment and control areas was not random; instead, the program was first offered in blocks with the highest density of poor households. In order to accurately target the program to poor households, in both rural and urban areas Mexican officials constructed a pre-intervention (baseline) household poverty index that determined each household's eligibility. Thus, Progresa/Oportunidades' eligibility assignment rule naturally leads to sharp (intention-to-treat) regression discontinuity designs. For additional details on data construction, empirical analysis, and related literature, see Calonico et al. (2014b, Section S.4).

Our empirical exercise investigates the program treatment effect on household non-food consumption expenditures two years after its implementation. In this application, Xi denotes the household's poverty index, x̄ = 0 denotes the centered cutoff for each RD design, and Yi denotes per capita non-food consumption. Our final database contains 691 control households (Xi < 0) and 2,118 intention-to-treat households (Xi ≥ 0) in the urban RD design (n = 2,809, Xi ∈ [−2.25, 4.11]). Results are presented in Figures SA-3 and SA-4.

4.3 Head Start Data

Head Start is a program of the United States Department of Health and Human Services that provides early childhood education, health, nutrition, and parent involvement services to low-income children and their families. It was established in 1965 as part of the War on Poverty, in order to foster stable family relationships, enhance children's physical and emotional well-being, and establish an environment in which children can develop cognitive skills. For each county, eligibility is based on the county's poverty rate, inducing a natural RD design; Ludwig and Miller (2007) use this to identify the program's effects on health and schooling. For each county i = 1, 2, ..., n, the forcing variable is the county's 1960 poverty rate, with treatment assignment given by Ti = 1(Xi ≥ x̄), where Xi represents the county's poverty rate in 1960 and x̄ is the fixed threshold level. The cutoff is set to the poverty rate of the 300th poorest county in 1960, which in this dataset is x̄ = 59.198. Here we consider as outcome variable the mortality rate per 100,000 children aged 5–9, for Head Start-related causes, over 1973–1983 (see Panel A, Figure IV in Ludwig and Miller (2007)). Results are presented in Figures SA-5 and SA-6.

4.4 Summary of Results

In all the empirical applications we considered, the data-driven selectors introduced in the main paper seemed to perform very well. The mimicking variance selector for the number of bins consistently delivered a disciplined "cloud of points", which appears to be substantially more useful than the scatter plot of the raw data. The IMSE-optimal choice of the number of bins also performed well, in all cases "tracing out" the estimated smooth polynomial regression fits. As for the implementations, spacings estimators performed on par with polynomial estimators in all the applications considered. Finally, it is worth noting that ES and QS RD plots do not necessarily deliver different numbers of bins. For example, in the Head Start dataset, the mimicking variance choices are essentially identical for both types of RD plots.
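The ES/QS distinction discussed above amounts to two different rules for choosing bin edges. A minimal sketch of the two partitioning schemes (ours, not the authors' implementation; function names are hypothetical):

```python
import numpy as np

def es_bin_edges(x, j):
    """Evenly-spaced (ES) partition: J bins of equal length."""
    x = np.asarray(x, dtype=float)
    return np.linspace(x.min(), x.max(), j + 1)

def qs_bin_edges(x, j):
    """Quantile-spaced (QS) partition: edges at empirical quantiles,
    so each of the J bins holds roughly the same number of observations."""
    return np.quantile(np.asarray(x, dtype=float), np.linspace(0.0, 1.0, j + 1))
```

When the forcing variable is close to uniformly distributed the two partitions nearly coincide, which is consistent with the ES and QS selectors choosing similar numbers of bins in some of the applications above.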

[Figure SA-1: Scatter Plot and Automatic Data-driven ES RD Plots for Senate Elections Data. Panels: (a)/(d) Scatter Plot of Raw Data (N− = 595; N+ = 702); (b) Mimicking Variance, Spacings (ĴES-ϑ,−,n = 15; ĴES-ϑ,+,n = 35); (c) IMSE-optimal, Spacings (ĴES-µ,−,n = 8; ĴES-µ,+,n = 9); (e) Mimicking Variance, Series (J̌ES-ϑ,−,n = 21; J̌ES-ϑ,+,n = 36); (f) IMSE-optimal, Series (J̌ES-µ,−,n = 8; J̌ES-µ,+,n = 9). Axis labels: Vote Share in Election at time t; Vote Share in Election at time t+1. Notes: (i) sample size is n = 1,297; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.]

[Figure SA-2: Scatter Plot and Automatic Data-driven QS RD Plots for Senate Elections Data. Panels: (a)/(d) Scatter Plot of Raw Data (N− = 595; N+ = 702); (b) Mimicking Variance, Spacings (ĴQS-ϑ,−,n = 28; ĴQS-ϑ,+,n = 49); (c) IMSE-optimal, Spacings (ĴQS-µ,−,n = 21; ĴQS-µ,+,n = 16); (e) Mimicking Variance, Series (J̌QS-ϑ,−,n = 29; J̌QS-ϑ,+,n = 50); (f) IMSE-optimal, Series (J̌QS-µ,−,n = 21; J̌QS-µ,+,n = 16). Axis labels: Vote Share in Election at time t; Vote Share in Election at time t+1. Notes: (i) sample size is n = 1,297; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.]

[Figure SA-3: Scatter Plot and Automatic Data-driven ES RD Plots for Progresa/Oportunidades (Urban Localities). Panels: (a)/(d) Scatter Plot of Raw Data (N− = 691; N+ = 2,118); (b) Mimicking Variance, Spacings (ĴES-ϑ,−,n = 69; ĴES-ϑ,+,n = 59); (c) IMSE-optimal, Spacings (ĴES-µ,−,n = 7; ĴES-µ,+,n = 9); (e) Mimicking Variance, Series (J̌ES-ϑ,−,n = 77; J̌ES-ϑ,+,n = 67); (f) IMSE-optimal, Series (J̌ES-µ,−,n = 7; J̌ES-µ,+,n = 9). Axis labels: Poverty Index; Per Capita Food Consumption. Notes: (i) sample size is n = 2,809; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.]









[Figure SA-4: Scatter Plot and Automatic Data-driven QS RD Plots for Progresa/Oportunidades (Urban Localities). Panels: (a)/(d) Scatter Plot of Raw Data (N− = 691; N+ = 2,118); (b) Mimicking Variance, Spacings (ĴQS-ϑ,−,n = 45; ĴQS-ϑ,+,n = 46); (c) IMSE-optimal, Spacings (ĴQS-µ,−,n = 36; ĴQS-µ,+,n = 14); (e) Mimicking Variance, Series (J̌QS-ϑ,−,n = 46; J̌QS-ϑ,+,n = 47); (f) IMSE-optimal, Series (J̌QS-µ,−,n = 36; J̌QS-µ,+,n = 14). Axis labels: Poverty Index; Per Capita Food Consumption. Notes: (i) sample size is n = 2,809; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.]

Figure SA-5: Scatter Plot and Automatic Data-driven ES RD Plots for Head Start Assistance.

[Figure omitted. Panels: (a)/(d) Scatter Plot of Raw Data (N− = 2,810 ; N+ = 294); (b) Mimicking Variance, Spacings (JˆES-ϑ,−,n = 45 ; JˆES-ϑ,+,n = 42); (c) IMSE-optimal, Spacings (JˆES-µ,−,n = 6 ; JˆES-µ,+,n = 6); (e) Mimicking Variance, Series (JˇES-ϑ,−,n = 39 ; JˇES-ϑ,+,n = 52); (f) IMSE-optimal, Series (JˇES-µ,−,n = 6 ; JˇES-µ,+,n = 6). Horizontal axis: Poverty Rate; vertical axis: Ages 5−9, Head Start−related causes, 1973−1983.]

Notes: (i) sample size is n = 3,104; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.

Figure SA-6: Scatter Plot and Automatic Data-driven QS RD Plots for Head Start Assistance.

[Figure omitted. Panels: (a)/(d) Scatter Plot of Raw Data (N− = 2,810 ; N+ = 294); (b) Mimicking Variance, Spacings (JˆQS-ϑ,−,n = 48 ; JˆQS-ϑ,+,n = 47); (c) IMSE-optimal, Spacings (JˆQS-µ,−,n = 31 ; JˆQS-µ,+,n = 6); (e) Mimicking Variance, Series (JˇQS-ϑ,−,n = 49 ; JˇQS-ϑ,+,n = 49); (f) IMSE-optimal, Series (JˇQS-µ,−,n = 31 ; JˇQS-µ,+,n = 6). Horizontal axis: Poverty Rate; vertical axis: Ages 5−9, Head Start−related causes, 1973−1983.]

Notes: (i) sample size is n = 3,104; (ii) N− and N+ denote the sample sizes for control and treatment units, respectively; (iii) solid blue lines depict 4th order polynomial fits using control and treated units separately.

5 Complete Simulation Results

We report results from a Monte Carlo experiment designed to study the finite-sample behavior of our proposed methods. We consider several data generating processes, which vary in the distribution of the running variable, the conditional variance, and the distribution of the unobserved error term in the regression function. Specifically, the data are generated as i.i.d. draws {(Yi, Xi)′ : i = 1, 2, ..., n} following

    Yi = µ(Xi) + εi,    Xi ∼ Fx,    εi ∼ σ(Xi)Fε,

where

    µ(x) = 0.48 + 1.27x + 7.18x^2 + 20.21x^3 + 21.54x^4 + 7.33x^5   if x < 0,
    µ(x) = 0.52 + 0.84x − 3.00x^2 + 7.99x^3 − 9.01x^4 + 3.56x^5     if x ≥ 0,

and Fx equals either 2B(p1, p2) − 1, with B(p1, p2) denoting a Beta distribution with parameters p1 and p2, or a mixture of two normal distributions with means µ1 and µ2, a common standard deviation of 1/4, and mixing weights ω1 and ω2. In addition, σ(x) equals either 1 (homoskedasticity) or exp(−|x|/2) (heteroskedasticity), and Fε is either N(0, 1) or (χ4 − 4)/√8. The functional form of µ(x) was obtained by fitting a 5th-order global polynomial, with different coefficients for control and treatment units, to the original data of Lee (2008), after discarding observations with a past margin of victory greater than 99 or less than −99 percentage points. Figure SA-7 plots the regression function µ(x) and the different choices for the density of Xi. Notice that some of these densities take on "low" values in some regions of the support of Xi, in some cases near the RD cutoff.

Our Monte Carlo experiment considers 16 models that combine different choices of Fx, σ(x), and Fε, as described in Table SA-1. For each model in Table SA-1, we set n = 5,000 and generate 5,000 simulations to compute the IMSE of both the ES and QS partitioning schemes for a range of possible numbers of bins, as well as at the proposed IMSE-optimal data-driven selectors. In each case considered, we also computed the mimicking variance selectors introduced in the paper, in both infeasible and data-driven versions.

All tables report results for both ES and QS RD plots, organized in two distinct panels. Panel A focuses on the IMSE of the different partitioning schemes in finite samples, as well as the performance of the associated IMSE-optimal data-driven selectors. All IMSEs are normalized by the IMSE evaluated at the optimal partition-size choice to avoid scaling issues. Panel B reports several features of the empirical (finite-sample) distribution of the different data-driven number-of-bins selectors introduced in this paper: (i) spacings-based selectors for ES RD plots, (ii) polynomial-based selectors for ES RD plots, (iii) spacings-based selectors for QS RD plots, and (iv) polynomial-based selectors for QS RD plots. Our Monte Carlo experiment is therefore designed to capture the finite-sample performance of Theorems 1 and 2 in terms of providing a good approximation to the IMSE (Panel A), and of Theorems 3 and 4 together with the other consistency results discussed in the remarks in the paper (Panel B).

The results of our simulation experiment are very encouraging. First, in all cases the IMSE is minimized at the corresponding IMSE-optimal number-of-bins choice derived in the paper, suggesting that Theorems 1 and 2 provide a good finite-sample approximation; indeed, the theoretical IMSE-optimal number of bins almost always coincides exactly with the simulated IMSE-optimal number of bins. Second, in all models our proposed data-driven implementations of the different number-of-bins selectors perform quite well: the summary statistics in Panel B of each table show that their finite-sample distributions are well centered and concentrated around the target population (optimal) choices, whether spacings estimators or polynomial estimators are used.
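The data generating process above can be simulated directly. Below is a minimal sketch (not the paper's replication code) covering the Beta-based designs of Table SA-1, Panel A; the function names and defaults are illustrative, and the mixture-of-normals designs of Panel B would simply swap the Beta draw for a two-component normal mixture with standard deviation 1/4.

```python
import numpy as np

def mu(x):
    # Piecewise 5th-order polynomial fitted to the Lee (2008) data (coefficients from the text).
    left = 0.48 + 1.27*x + 7.18*x**2 + 20.21*x**3 + 21.54*x**4 + 7.33*x**5
    right = 0.52 + 0.84*x - 3.00*x**2 + 7.99*x**3 - 9.01*x**4 + 3.56*x**5
    return np.where(x < 0, left, right)

def draw_sample(n, rng, p1=1.0, p2=1.0, heteroskedastic=False, chi2_errors=False):
    # Running variable: X = 2*Beta(p1, p2) - 1, supported on [-1, 1].
    x = 2.0 * rng.beta(p1, p2, size=n) - 1.0
    # Error distribution: N(0, 1), or the centered/scaled chi-square(4) from the text.
    if chi2_errors:
        eps = (rng.chisquare(4, size=n) - 4.0) / np.sqrt(8.0)
    else:
        eps = rng.standard_normal(n)
    # Conditional standard deviation: 1 (homoskedastic) or exp(-|x|/2).
    sigma = np.exp(-np.abs(x) / 2.0) if heteroskedastic else 1.0
    return x, mu(x) + sigma * eps

rng = np.random.default_rng(123)
x1, y1 = draw_sample(5000, rng)                                        # Model 1
x3, y3 = draw_sample(5000, rng, p1=0.2, p2=0.8, heteroskedastic=True)  # Model 3
```

Note that µ(x) jumps from 0.48 to 0.52 at the cutoff x = 0, so the simulated RD treatment effect is 0.04.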
In sum, our extensive simulation study indicates that the different data-driven number-of-bins selectors underlying the construction of the RD plots perform well in finite samples.
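Panel A of each table can in principle be reproduced by a brute-force loop over candidate partition sizes. The sketch below, under simplifying assumptions (Model 1's uniform running variable, uniform weighting, and a fine grid approximating the integral), estimates the IMSE of the ES binned-means estimator on the control side by Monte Carlo; all names are illustrative, not from the paper.

```python
import numpy as np

def mu(x):
    # Piecewise 5th-order polynomial from the simulation DGP (coefficients from the text).
    left = 0.48 + 1.27*x + 7.18*x**2 + 20.21*x**3 + 21.54*x**4 + 7.33*x**5
    right = 0.52 + 0.84*x - 3.00*x**2 + 7.99*x**3 - 9.01*x**4 + 3.56*x**5
    return np.where(x < 0, left, right)

def es_bin_means(x, y, J, lo, hi):
    # Evenly-spaced (ES) partition of [lo, hi] into J bins; returns edges and bin sample means.
    edges = np.linspace(lo, hi, J + 1)
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, J - 1)
    sums = np.bincount(idx, weights=y, minlength=J)
    counts = np.bincount(idx, minlength=J)
    return edges, sums / np.maximum(counts, 1)

def approx_imse_es(J, n_sims, n, rng, lo=-1.0, hi=0.0, grid_size=500):
    # Monte Carlo approximation of the IMSE of the ES binned-means fit on [lo, hi),
    # assuming X ~ Uniform[-1, 1] and homoskedastic N(0, 1) errors (Model 1).
    grid = np.linspace(lo, hi - 1e-9, grid_size)
    truth = mu(grid)
    total_sq_err = 0.0
    for _ in range(n_sims):
        x = rng.uniform(-1.0, 1.0, size=n)
        y = mu(x) + rng.standard_normal(n)
        mask = (x >= lo) & (x < hi)
        edges, means = es_bin_means(x[mask], y[mask], J, lo, hi)
        gidx = np.clip(np.searchsorted(edges, grid, side="right") - 1, 0, J - 1)
        total_sq_err += np.mean((means[gidx] - truth) ** 2)
    return total_sq_err / n_sims
```

Evaluating `approx_imse_es` over a grid of J and normalizing by the minimum value reproduces the shape of Panel A up to simulation noise; the QS analogue would replace the evenly spaced edges with empirical quantiles of the running variable.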


Figure SA-7: Data Generating Processes

[Figure omitted. Panels: (a) Regression function, µ(x), on x ∈ [−1, 1]; (b) Xi's density, 2B(p1, p2) − 1, for (p1, p2) ∈ {(1, 1), (1/2, 1/2), (4/5, 1/5), (1/5, 4/5)}; (c) Xi's density, mixture of normals, for µ = (−1/4, 1/4) with ω = (1/2, 1/2), and µ = (−1/2, 1/2) with ω = (1/2, 1/2), (4/5, 1/5), and (1/5, 4/5).]

Table SA-1: Data Generating Processes

Panel A: Models 1 to 8

Model   p1    p2    σ²(x)          Fε
1       1     1     1              N(0, 1)
2       0.5   0.5   1              N(0, 1)
3       0.2   0.8   exp(−|x|/2)    N(0, 1)
4       0.8   0.2   exp(−|x|/2)    N(0, 1)
5       1     1     1              (χ4 − 4)/√8
6       0.5   0.5   1              (χ4 − 4)/√8
7       0.2   0.8   exp(−|x|/2)    (χ4 − 4)/√8
8       0.8   0.2   exp(−|x|/2)    (χ4 − 4)/√8

Panel B: Models 9 to 16

Model   µ1      µ2     ω1    ω2    σ²(x)          Fε
9       −0.25   0.25   0.5   0.5   1              N(0, 1)
10      −0.5    0.5    0.5   0.5   1              N(0, 1)
11      −0.5    0.5    0.8   0.2   exp(−|x|/2)    N(0, 1)
12      −0.5    0.5    0.2   0.8   exp(−|x|/2)    N(0, 1)
13      −0.25   0.25   0.5   0.5   1              (χ4 − 4)/√8
14      −0.5    0.5    0.5   0.5   1              (χ4 − 4)/√8
15      −0.5    0.5    0.8   0.2   exp(−|x|/2)    (χ4 − 4)/√8
16      −0.5    0.5    0.2   0.8   exp(−|x|/2)    (χ4 − 4)/√8

Table SA-2: Simulation Results for Model 1

Panel A: IMSE for Grid of Number of Bins and Estimated Choices
(entries are IMSEES,·(J·,n)/IMSE∗ES,· and IMSEQS,·(J·,n)/IMSE∗QS,·)

ES, control         ES, treatment       QS, control         QS, treatment
J−,n   IMSE ratio   J+,n   IMSE ratio   J−,n   IMSE ratio   J+,n   IMSE ratio
20     1.047        11     1.148        20     1.047        11     1.148
21     1.027        12     1.081        21     1.027        12     1.081
22     1.013        13     1.039        22     1.013        13     1.039
23     1.005        14     1.014        23     1.005        14     1.014
24     1.000        15     1.002        24     1.000        15     1.002
25     1.000        16     1.000        25     1.000        16     1.000
26     1.003        17     1.006        26     1.003        17     1.006
27     1.008        18     1.017        27     1.008        18     1.017
28     1.016        19     1.033        28     1.016        19     1.033
29     1.025        20     1.053        29     1.025        20     1.053
30     1.036        21     1.076        30     1.036        21     1.076

Estimated choices (IMSE ratio at the data-driven selector):
JˆES-µ,−,n  1.033    JˆES-µ,+,n  0.9435   JˆQS-µ,−,n  1.072    JˆQS-µ,+,n  0.9351
JˇES-µ,−,n  1.034    JˇES-µ,+,n  0.9428   JˇQS-µ,−,n  1.073    JˇQS-µ,+,n  0.9347

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.           Estimator     Min.   1st Qu.  Median  Mean    3rd Qu.  Max.   Std. Dev.
JES-µ,−,n = 25      JˆES-µ,−,n    22     25       26      25.95   27       29     0.93
                    JˇES-µ,−,n    23     25       26      25.93   26       29     0.87
JES-ϑ,−,n = 118     JˆES-ϑ,−,n    105    116      120     119.6   123      139    5.05
                    JˇES-ϑ,−,n    110    117      119     119.3   121      131    2.72
JES-µ,+,n = 16      JˆES-µ,+,n    14     15       15      15.34   16       17     0.57
                    JˇES-µ,+,n    14     15       15      15.34   16       17     0.55
JES-ϑ,+,n = 116     JˆES-ϑ,+,n    103    113      117     116.7   120      139    4.71
                    JˇES-ϑ,+,n    107    115      117     116.7   118      128    2.65
JQS-µ,−,n = 25      JˆQS-µ,−,n    23     26       27      26.91   27       30     0.92
                    JˇQS-µ,−,n    23     26       27      26.89   27       30     0.90
JQS-ϑ,−,n = 118     JˆQS-ϑ,−,n    108    117      120     119.6   122      134    3.66
                    JˇQS-ϑ,−,n    110    117      119     119.3   121      131    2.71
JQS-µ,+,n = 16      JˆQS-µ,+,n    14     15       15      15.21   15       17     0.51
                    JˇQS-µ,+,n    14     15       15      15.21   15       17     0.50
JQS-ϑ,+,n = 116     JˆQS-ϑ,+,n    106    114      117     116.6   119      130    3.50
                    JˇQS-ϑ,+,n    107    115      117     116.7   118      128    2.65

Notes: (i) Population quantities: JES-µ,·,n and JQS-µ,·,n are the IMSE-optimal partition sizes for the ES and QS RD plots; JES-ϑ,·,n and JQS-ϑ,·,n are the mimicking variance partition sizes for the ES and QS RD plots; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) and IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) are the ES and QS IMSE functions evaluated at the optimal choices. (ii) Estimators: Jˆ denotes the spacings estimator and Jˇ the polynomial estimator of the corresponding population quantity.

Table SA-3: Simulation Results for Model 2

Panel A: IMSE for Grid of Number of Bins and Estimated Choices
(entries are IMSEES,·(J·,n)/IMSE∗ES,· and IMSEQS,·(J·,n)/IMSE∗QS,·)

ES, control         ES, treatment       QS, control         QS, treatment
J−,n   IMSE ratio   J+,n   IMSE ratio   J−,n   IMSE ratio   J+,n   IMSE ratio
26     1.032        11     1.157        19     1.047        13     1.086
27     1.019        12     1.088        20     1.026        14     1.045
28     1.010        13     1.043        21     1.012        15     1.018
29     1.004        14     1.017        22     1.004        16     1.004
30     1.001        15     1.003        23     1.000        17     0.998
31     1.000        16     1.000        24     1.000        18     1.000
32     1.001        17     1.004        25     1.004        19     1.007
33     1.004        18     1.015        26     1.010        20     1.019
34     1.009        19     1.030        27     1.019        21     1.035
35     1.015        20     1.050        28     1.029        22     1.054
36     1.022        21     1.072        29     1.042        23     1.075

Estimated choices (IMSE ratio at the data-driven selector):
JˆES-µ,−,n  1.086    JˆES-µ,+,n  0.9009   JˆQS-µ,−,n  0.9271   JˆQS-µ,+,n  0.9399
JˇES-µ,−,n  1.088    JˇES-µ,+,n  0.9005   JˇQS-µ,−,n  0.9292   JˇQS-µ,+,n  0.9394

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.           Estimator     Min.   1st Qu.  Median  Mean    3rd Qu.  Max.   Std. Dev.
JES-µ,−,n = 31      JˆES-µ,−,n    30     33       34      34.13   35       39     1.09
                    JˇES-µ,−,n    31     33       34      34.08   35       38     1.01
JES-ϑ,−,n = 114     JˆES-ϑ,−,n    98     112      115     115.1   118.2    134    5.18
                    JˇES-ϑ,−,n    104    112      114     114.5   117      126    3.05
JES-µ,+,n = 16      JˆES-µ,+,n    13     14       15      14.84   15       18     0.72
                    JˇES-µ,+,n    13     14       15      14.83   15       17     0.70
JES-ϑ,+,n = 118     JˆES-ϑ,+,n    102    116      120     120.3   124      145    5.63
                    JˇES-ϑ,+,n    110    118      120     120.2   122      133    3.22
JQS-µ,−,n = 24      JˆQS-µ,−,n    21     22       22      22.24   23       24     0.53
                    JˇQS-µ,−,n    21     22       22      22.2    22       24     0.50
JQS-ϑ,−,n = 114     JˆQS-ϑ,−,n    104    112      115     114.8   117      128    3.46
                    JˇQS-ϑ,−,n    106    113      114     114.4   116      124    2.56
JQS-µ,+,n = 18      JˆQS-µ,+,n    15     16       17      16.71   17       20     0.65
                    JˇQS-µ,+,n    15     16       17      16.72   17       20     0.65
JQS-ϑ,+,n = 118     JˆQS-ϑ,+,n    108    117      120     119.9   122      134    3.66
                    JˇQS-ϑ,+,n    109    118      120     119.9   122      132    2.81

Notes: (i) Population quantities: JES-µ,·,n and JQS-µ,·,n are the IMSE-optimal partition sizes for the ES and QS RD plots; JES-ϑ,·,n and JQS-ϑ,·,n are the mimicking variance partition sizes for the ES and QS RD plots; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) and IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) are the ES and QS IMSE functions evaluated at the optimal choices. (ii) Estimators: Jˆ denotes the spacings estimator and Jˇ the polynomial estimator of the corresponding population quantity.

Table SA-4: Simulation Results for Model 3

Panel A: IMSE for Grid of Number of Bins and Estimated Choices
(entries are IMSEES,·(J·,n)/IMSE∗ES,· and IMSEQS,·(J·,n)/IMSE∗QS,·)

ES, control         ES, treatment       QS, control         QS, treatment
J−,n   IMSE ratio   J+,n   IMSE ratio   J−,n   IMSE ratio   J+,n   IMSE ratio
49     1.008        8      1.279        40     1.010        8      1.265
50     1.005        9      1.149        41     1.006        9      1.139
51     1.002        10     1.071        42     1.002        10     1.064
52     1.001        11     1.027        43     1.000        11     1.023
53     1.000        12     1.005        44     1.000        12     1.003
54     1.000        13     1.000        45     1.000        13     1.000
55     1.001        14     1.007        46     1.001        14     1.008
56     1.002        15     1.022        47     1.003        15     1.025
57     1.004        16     1.044        48     1.006        16     1.048
58     1.006        17     1.071        49     1.010        17     1.076
59     1.009        18     1.102        50     1.014        18     1.108

Estimated choices (IMSE ratio at the data-driven selector):
JˆES-µ,−,n  1.09     JˆES-µ,+,n  0.9534   JˆQS-µ,−,n  0.869    JˆQS-µ,+,n  0.9628
JˇES-µ,−,n  1.097    JˇES-µ,+,n  0.9504   JˇQS-µ,−,n  0.872    JˇQS-µ,+,n  0.9609

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.           Estimator     Min.   1st Qu.  Median  Mean    3rd Qu.  Max.   Std. Dev.
JES-µ,−,n = 54      JˆES-µ,−,n    54     58       59      59.05   60       65     1.59
                    JˇES-µ,−,n    54     58       59      58.85   60       64     1.28
JES-ϑ,−,n = 112     JˆES-ϑ,−,n    90     108      112     112.1   116      138    6.65
                    JˇES-ϑ,−,n    99     108      111     110.9   114      127    4.08
JES-µ,+,n = 13      JˆES-µ,+,n    11     12       13      12.79   13       16     0.73
                    JˇES-µ,+,n    11     12       13      12.8    13       16     0.68
JES-ϑ,+,n = 149     JˆES-ϑ,+,n    111    140      147     147.6   155      193    10.94
                    JˇES-ϑ,+,n    125    143      148     147.8   152      174    6.47
JQS-µ,−,n = 45      JˆQS-µ,−,n    36     38       39      38.8    39       42     0.82
                    JˇQS-µ,−,n    36     38       39      38.72   39       42     0.78
JQS-ϑ,−,n = 155     JˆQS-ϑ,−,n    140    151      154     154.2   157      168    4.07
                    JˇQS-ϑ,−,n    142    151      153     153.3   155      165    3.12
JQS-µ,+,n = 13      JˆQS-µ,+,n    11     12       13      12.74   13       15     0.61
                    JˇQS-µ,+,n    11     12       13      12.76   13       15     0.59
JQS-ϑ,+,n = 149     JˆQS-ϑ,+,n    119    142      147     147.5   153      182    8.29
                    JˇQS-ϑ,+,n    125    143      147     147.8   152      174    6.47

Notes: (i) Population quantities: JES-µ,·,n and JQS-µ,·,n are the IMSE-optimal partition sizes for the ES and QS RD plots; JES-ϑ,·,n and JQS-ϑ,·,n are the mimicking variance partition sizes for the ES and QS RD plots; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) and IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) are the ES and QS IMSE functions evaluated at the optimal choices. (ii) Estimators: Jˆ denotes the spacings estimator and Jˇ the polynomial estimator of the corresponding population quantity.

Table SA-5: Simulation Results for Model 4

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       16    17    18    19    20    21    22    23    24    25    26
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.080 1.047 1.024 1.010 1.002 1.000 1.002 1.009 1.018 1.030 1.044
        at estimated choices:       JˆES-µ,−,n: 1.065;  JˇES-µ,−,n: 1.067

ES, +:  J+,n:                       19    20    21    22    23    24    25    26    27    28    29
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.059 1.035 1.018 1.008 1.002 1.000 1.002 1.006 1.014 1.023 1.034
        at estimated choices:       JˆES-µ,+,n: 0.8511;  JˇES-µ,+,n: 0.8504

QS, −:  J−,n:                       15    16    17    18    19    20    21    22    23    24    25
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.072 1.039 1.017 1.005 1.000 1.000 1.005 1.014 1.027 1.042 1.059
        at estimated choices:       JˆQS-µ,−,n: 0.9663;  JˇQS-µ,−,n: 0.9679

QS, +:  J+,n:                       30    31    32    33    34    35    36    37    38    39    40
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.025 1.015 1.008 1.003 1.001 1.000 1.001 1.003 1.007 1.011 1.017
        at estimated choices:       JˆQS-µ,+,n: 0.9004;  JˇQS-µ,+,n: 0.9003

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 21, JES-ϑ,−,n = 145;  JES-µ,+,n = 24, JES-ϑ,+,n = 102;  JQS-µ,−,n = 20, JQS-ϑ,−,n = 145;  JQS-µ,+,n = 35, JQS-ϑ,+,n = 141

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     19      22       23      22.86    24       28      1.04
JˇES-µ,−,n     19      22       23      22.83    23       26      0.91
JˆES-ϑ,−,n    106     141      148     148.3    156      201     11.48
JˇES-ϑ,−,n    125     143      147     147.6    152      179      6.59
JˆES-µ,+,n     17      20       21      20.91    22       27      1.33
JˇES-µ,+,n     17      20       21      20.91    22       27      1.30
JˆES-ϑ,+,n     82      99      103     103.6    108      130      6.29
JˇES-ϑ,+,n     90     101      103     103.5    106      119      3.95
JˆQS-µ,−,n     17      19       19      19.44    20       23      0.74
JˇQS-µ,−,n     17      19       19      19.43    20       22      0.70
JˆQS-ϑ,−,n    120     144      149     149.6    155      187      8.59
JˇQS-ϑ,−,n    126     145      149     149.1    153      181      6.60
JˆQS-µ,+,n     28      31       32      31.91    33       40      1.61
JˇQS-µ,+,n     28      31       32      31.92    33       40      1.61
JˆQS-ϑ,+,n    130     140      143     142.9    146      159      3.97
JˇQS-ϑ,+,n    131     141      143     142.9    145      155      3.25

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-6: Simulation Results for Model 5

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       41    42    43    44    45    46    47    48    49    50    51
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.013 1.008 1.004 1.002 1.000 1.000 1.001 1.002 1.004 1.007 1.011
        at estimated choices:       JˆES-µ,−,n: 1.095;  JˇES-µ,−,n: 1.099

ES, +:  J+,n:                       7     8     9     10    11    12    13    14    15    16    17
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.247 1.113 1.039 1.004 0.994 1.000 1.018 1.045 1.078 1.116 1.158
        at estimated choices:       JˆES-µ,+,n: 0.9544;  JˇES-µ,+,n: 0.9521

QS, −:  J−,n:                       30    31    32    33    34    35    36    37    38    39    40
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.016 1.008 1.003 1.000 0.999 1.000 1.002 1.006 1.011 1.017 1.024
        at estimated choices:       JˆQS-µ,−,n: 0.8966;  JˇQS-µ,−,n: 0.8977

QS, +:  J+,n:                       6     7     8     9     10    11    12    13    14    15    16
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.472 1.240 1.110 1.041 1.008 1.000 1.008 1.028 1.057 1.092 1.131
        at estimated choices:       JˆQS-µ,+,n: 0.9651;  JˇQS-µ,+,n: 0.9629

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 46, JES-ϑ,−,n = 109;  JES-µ,+,n = 12, JES-ϑ,+,n = 119;  JQS-µ,−,n = 35, JQS-ϑ,−,n = 109;  JQS-µ,+,n = 11, JQS-ϑ,+,n = 119

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     44      50       51      50.93    52       58      1.83
JˇES-µ,−,n     45      50       51      50.82    52       57      1.61
JˆES-ϑ,−,n     77     104      110     109.9    115      139      8.11
JˇES-ϑ,−,n     92     105      109     109.1    113      130      5.60
JˆES-µ,+,n      9      11       11      11.17    12       15      0.74
JˇES-µ,+,n      9      11       11      11.17    12       14      0.69
JˆES-ϑ,+,n     82     113      120     120.4    127      161     10.36
JˇES-ϑ,+,n    102     116      120     120.4    124      141      5.89
JˆQS-µ,−,n     28      30       31      31.02    32       35      0.86
JˇQS-µ,−,n     28      30       31      31       31       35      0.84
JˆQS-ϑ,−,n     99     107      109     109.4    111      120      2.75
JˇQS-ϑ,−,n    101     108      109     109.1    111      117      2.17
JˆQS-µ,+,n      9      11       11      11.06    11       13      0.59
JˇQS-µ,+,n      9      11       11      11.06    11       13      0.57
JˆQS-ϑ,+,n     99     115      119     119.8    124      149      6.86
JˇQS-ϑ,+,n    101     116      120     120.1    124      140      5.55

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-7: Simulation Results for Model 6

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       13    14    15    16    17    18    19    20    21    22    23
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.119 1.068 1.035 1.014 1.003 1.000 1.003 1.011 1.023 1.039 1.057
        at estimated choices:       JˆES-µ,−,n: 1.065;  JˇES-µ,−,n: 1.065

ES, +:  J+,n:                       16    17    18    19    20    21    22    23    24    25    26
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.069 1.039 1.018 1.006 1.000 1.000 1.004 1.012 1.022 1.035 1.051
        at estimated choices:       JˆES-µ,+,n: 0.8495;  JˇES-µ,+,n: 0.8493

QS, −:  J−,n:                       12    13    14    15    16    17    18    19    20    21    22
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.121 1.066 1.031 1.011 1.001 1.000 1.006 1.017 1.032 1.050 1.072
        at estimated choices:       JˆQS-µ,−,n: 1.008;  JˇQS-µ,−,n: 1.008

QS, +:  J+,n:                       22    23    24    25    26    27    28    29    30    31    32
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.044 1.026 1.014 1.005 1.001 1.000 1.002 1.005 1.011 1.019 1.029
        at estimated choices:       JˆQS-µ,+,n: 0.9261;  JˇQS-µ,+,n: 0.9264

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 18, JES-ϑ,−,n = 119;  JES-µ,+,n = 21, JES-ϑ,+,n = 102;  JQS-µ,−,n = 17, JQS-ϑ,−,n = 119;  JQS-µ,+,n = 27, JQS-ϑ,+,n = 102

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     16      19       20      19.71    20       24      1.23
JˇES-µ,−,n     16      19       20      19.69    20       24      1.17
JˆES-ϑ,−,n     87     113      120     120.2    127      165      9.88
JˇES-ϑ,−,n    102     116      120     119.7    124      145      5.92
JˆES-µ,+,n     13      17       18      18.14    19       25      1.71
JˇES-µ,+,n     14      17       18      18.13    19       26      1.69
JˆES-ϑ,+,n     75      97      102     102.4    108      137      7.90
JˇES-ϑ,+,n     82      98      102     102.2    106      124      5.77
JˆQS-µ,−,n     15      17       17      17.31    18       20      0.94
JˇQS-µ,−,n     15      17       17      17.31    18       20      0.92
JˆQS-ϑ,−,n     97     115      120     119.8    124      146      6.81
JˇQS-ϑ,−,n    104     116      119     119.6    123      142      5.43
JˆQS-µ,+,n     22      25       25      25.42    26       31      1.32
JˇQS-µ,+,n     22      25       25      25.42    26       31      1.31
JˆQS-ϑ,+,n     94     100      101     101.3    103      109      2.43
JˇQS-ϑ,+,n     96     100      101     101.2    102      109      1.85

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-8: Simulation Results for Model 7

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       49    50    51    52    53    54    55    56    57    58    59
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.008 1.005 1.002 1.001 1.000 1.000 1.001 1.002 1.004 1.006 1.009
        at estimated choices:       JˆES-µ,−,n: 1.097;  JˇES-µ,−,n: 1.104

ES, +:  J+,n:                       8     9     10    11    12    13    14    15    16    17    18
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.279 1.149 1.071 1.027 1.005 1.000 1.007 1.022 1.044 1.071 1.102
        at estimated choices:       JˆES-µ,+,n: 0.9335;  JˇES-µ,+,n: 0.9308

QS, −:  J−,n:                       40    41    42    43    44    45    46    47    48    49    50
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.010 1.006 1.002 1.000 1.000 1.000 1.001 1.003 1.006 1.010 1.014
        at estimated choices:       JˆQS-µ,−,n: 0.9043;  JˇQS-µ,−,n: 0.9079

QS, +:  J+,n:                       8     9     10    11    12    13    14    15    16    17    18
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.265 1.139 1.064 1.023 1.003 1.000 1.008 1.025 1.048 1.076 1.108
        at estimated choices:       JˆQS-µ,+,n: 0.9649;  JˇQS-µ,+,n: 0.9629

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 54, JES-ϑ,−,n = 113;  JES-µ,+,n = 13, JES-ϑ,+,n = 144;  JQS-µ,−,n = 45, JQS-ϑ,−,n = 156;  JQS-µ,+,n = 13, JQS-ϑ,+,n = 145

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     53      58       59      59.38    61       66      1.98
JˇES-µ,−,n     54      58       59      59.15    60       65      1.60
JˆES-ϑ,−,n     82     108      114     114.1    120      149      8.89
JˇES-ϑ,−,n     94     108      113     112.7    117      137      6.08
JˆES-µ,+,n     10      12       13      12.57    13       17      0.82
JˇES-µ,+,n     10      12       13      12.59    13       16      0.76
JˆES-ϑ,+,n    105     142      152     152.6    162      227     15.09
JˇES-ϑ,+,n    117     146      152     152.5    159      188      9.37
JˆQS-µ,−,n     38      40       40      40.33    41       44      0.84
JˇQS-µ,−,n     38      40       40      40.24    41       44      0.82
JˆQS-ϑ,−,n    138     153      156     156.6    160      177      4.77
JˇQS-ϑ,−,n    142     153      156     155.6    158      170      3.95
JˆQS-µ,+,n     11      12       13      12.7     13       16      0.69
JˇQS-µ,+,n     11      12       13      12.71    13       16      0.67
JˆQS-ϑ,+,n    112     143      150     150.8    158      208     11.11
JˇQS-ϑ,+,n    115     144      151     151.1    157      188      9.56

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-9: Simulation Results for Model 8

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       16    17    18    19    20    21    22    23    24    25    26
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.080 1.047 1.024 1.010 1.002 1.000 1.002 1.009 1.018 1.030 1.044
        at estimated choices:       JˆES-µ,−,n: 1.039;  JˇES-µ,−,n: 1.042

ES, +:  J+,n:                       19    20    21    22    23    24    25    26    27    28    29
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.059 1.035 1.018 1.008 1.002 1.000 1.002 1.006 1.014 1.023 1.034
        at estimated choices:       JˆES-µ,+,n: 0.8473;  JˇES-µ,+,n: 0.8474

QS, −:  J−,n:                       15    16    17    18    19    20    21    22    23    24    25
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.072 1.039 1.017 1.005 1.000 1.000 1.005 1.014 1.027 1.042 1.059
        at estimated choices:       JˆQS-µ,−,n: 1.019;  JˇQS-µ,−,n: 1.021

QS, +:  J+,n:                       30    31    32    33    34    35    36    37    38    39    40
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.025 1.015 1.008 1.003 1.001 1.000 1.001 1.003 1.007 1.011 1.017
        at estimated choices:       JˆQS-µ,+,n: 0.9442;  JˇQS-µ,+,n: 0.9443

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 21, JES-ϑ,−,n = 150;  JES-µ,+,n = 24, JES-ϑ,+,n = 102;  JQS-µ,−,n = 20, JQS-ϑ,−,n = 151;  JQS-µ,+,n = 35, JQS-ϑ,+,n = 142

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     18      22       22      22.36    23       26      1.16
JˇES-µ,−,n     18      22       22      22.32    23       26      1.04
JˆES-ϑ,−,n    103     139      147     147.9    156      207     12.86
JˇES-ϑ,−,n    121     142      146     146.7    151.2    175      7.34
JˆES-µ,+,n     16      20       21      20.85    22       26      1.47
JˇES-µ,+,n     16      20       21      20.84    22       26      1.40
JˆES-ϑ,+,n     68      94      100     100.5    107      140      9.47
JˇES-ϑ,+,n     77      95      100     100.1    105      128      6.87
JˆQS-µ,−,n     17      20       21      20.53    21       24      0.93
JˇQS-µ,−,n     17      20       20      20.5     21       24      0.89
JˆQS-ϑ,−,n    119     143      149     149      155      191      9.17
JˇQS-ϑ,−,n    123     144      148     148.4    153      176      7.31
JˆQS-µ,+,n     28      32       34      33.72    35       43      1.85
JˇQS-µ,+,n     29      32       34      33.72    35       43      1.84
JˆQS-ϑ,+,n    122     136      139     139.2    142      157      4.79
JˇQS-ϑ,+,n    123     136      139     139.2    142      154      4.18

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-10: Simulation Results for Model 9

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       15    16    17    18    19    20    21    22    23    24    25
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.088 1.051 1.026 1.010 1.002 1.000 1.003 1.010 1.020 1.034 1.049
        at estimated choices:       JˆES-µ,−,n: 0.9429;  JˇES-µ,−,n: 0.9447

ES, +:  J+,n:                       12    13    14    15    16    17    18    19    20    21    22
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.133 1.075 1.037 1.014 1.003 1.000 1.004 1.014 1.027 1.045 1.065
        at estimated choices:       JˆES-µ,+,n: 0.9666;  JˇES-µ,+,n: 0.9633

QS, −:  J−,n:                       61    62    63    64    65    66    67    68    69    70    71
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.006 1.004 1.002 1.001 1.000 1.000 1.000 1.001 1.002 1.004 1.006
        at estimated choices:       JˆQS-µ,−,n: 1.026;  JˇQS-µ,−,n: 1.027

QS, +:  J+,n:                       23    24    25    26    27    28    29    30    31    32    33
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.028 1.015 1.006 1.001 0.999 1.000 1.003 1.009 1.016 1.025 1.035
        at estimated choices:       JˆQS-µ,+,n: 0.71;  JˇQS-µ,+,n: 0.7095

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 20, JES-ϑ,−,n = 103;  JES-µ,+,n = 17, JES-ϑ,+,n = 96;  JQS-µ,−,n = 66, JQS-ϑ,−,n = 103;  JQS-µ,+,n = 28, JQS-ϑ,+,n = 96

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     16      19       19      19.22    20       23      0.94
JˇES-µ,−,n     17      19       19      19.19    20       23      0.86
JˆES-ϑ,−,n     71      97      103     103.1    109      132      8.83
JˇES-ϑ,−,n     83      99      102     102.4    106      123      5.77
JˆES-µ,+,n     14      16       17      16.81    17       20      0.82
JˇES-µ,+,n     15      16       17      16.83    17       20      0.77
JˆES-ϑ,+,n     69      92       96      96.25   101      120      7.06
JˇES-ϑ,+,n     77      93       97      96.48   100      114      4.64
JˆQS-µ,−,n     45      64       68      68.02    72       89      6.29
JˇQS-µ,−,n     45      64       68      68.01    72       89      6.26
JˆQS-ϑ,−,n     93     101      103     102.7    105      114      3.02
JˇQS-ϑ,−,n     95     101      103     102.6    104      112      2.18
JˆQS-µ,+,n     14      18       19      19.77    21       41      3.08
JˇQS-µ,+,n     14      18       19      19.77    21       41      3.08
JˆQS-ϑ,+,n     86      94       96      95.83    98      107      2.66
JˇQS-ϑ,+,n     89      95       96      95.91    97      103      1.86

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-11: Simulation Results for Model 10

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       17    18    19    20    21    22    23    24    25    26    27
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.064 1.036 1.018 1.006 1.001 1.000 1.003 1.010 1.020 1.032 1.046
        at estimated choices:       JˆES-µ,−,n: 1.047;  JˇES-µ,−,n: 1.049

ES, +:  J+,n:                       11    12    13    14    15    16    17    18    19    20    21
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.129 1.068 1.030 1.008 0.999 1.000 1.008 1.021 1.039 1.061 1.086
        at estimated choices:       JˆES-µ,+,n: 0.9967;  JˇES-µ,+,n: 0.995

QS, −:  J−,n:                       28    29    30    31    32    33    34    35    36    37    38
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.018 1.009 1.003 1.000 0.999 1.000 1.003 1.007 1.012 1.019 1.027
        at estimated choices:       JˆQS-µ,−,n: 1.044;  JˇQS-µ,−,n: 1.045

QS, +:  J+,n:                       13    14    15    16    17    18    19    20    21    22    23
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.110 1.062 1.030 1.011 1.002 1.000 1.004 1.013 1.027 1.043 1.062
        at estimated choices:       JˆQS-µ,+,n: 0.8817;  JˇQS-µ,+,n: 0.8811

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 22, JES-ϑ,−,n = 121;  JES-µ,+,n = 16, JES-ϑ,+,n = 111;  JQS-µ,−,n = 33, JQS-ϑ,−,n = 121;  JQS-µ,+,n = 18, JQS-ϑ,+,n = 111

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     20      22       23      22.89    23       26      0.81
JˇES-µ,−,n     20      22       23      22.86    23       26      0.75
JˆES-ϑ,−,n     91     106      110     109.8    113      131      5.35
JˇES-ϑ,−,n     99     107      109     109.3    111      120      2.97
JˆES-µ,+,n     14      15       16      15.66    16       18      0.54
JˇES-µ,+,n     14      15       16      15.68    16       17      0.51
JˆES-ϑ,+,n     78      94       97      97.45   101      116      4.68
JˇES-ϑ,+,n     89      96       98      97.57    99      107      2.62
JˆQS-µ,−,n     27      32       33      33.45    35       41      1.69
JˇQS-µ,−,n     28      32       33      33.43    35       41      1.67
JˆQS-ϑ,−,n     97     107      109     109.4    111      121      3.33
JˇQS-ϑ,−,n    101     107      109     109.2    111      120      2.46
JˆQS-µ,+,n     13      15       16      15.93    17       22      1.21
JˇQS-µ,+,n     13      15       16      15.93    17       22      1.20
JˆQS-ϑ,+,n     88      95       97      97.35    99      108      2.82
JˇQS-ϑ,+,n     90      96       97      97.4     99      105      1.98

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-12: Simulation Results for Model 11

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       25    26    27    28    29    30    31    32    33    34    35
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.026 1.014 1.006 1.002 1.000 1.000 1.003 1.007 1.013 1.021 1.029
        at estimated choices:       JˆES-µ,−,n: 1.036;  JˇES-µ,−,n: 1.041

ES, +:  J+,n:                       9     10    11    12    13    14    15    16    17    18    19
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.224 1.122 1.059 1.022 1.004 1.000 1.006 1.019 1.039 1.063 1.091
        at estimated choices:       JˆES-µ,+,n: 0.9962;  JˇES-µ,+,n: 0.9944

QS, −:  J−,n:                       40    41    42    43    44    45    46    47    48    49    50
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.008 1.004 1.001 1.000 0.999 1.000 1.002 1.004 1.007 1.011 1.016
        at estimated choices:       JˆQS-µ,−,n: 1.083;  JˇQS-µ,−,n: 1.085

QS, +:  J+,n:                       10    11    12    13    14    15    16    17    18    19    20
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.169 1.091 1.042 1.014 1.001 1.000 1.007 1.021 1.040 1.062 1.089
        at estimated choices:       JˆQS-µ,+,n: 0.9214;  JˇQS-µ,+,n: 0.9201

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 30, JES-ϑ,−,n = 150;  JES-µ,+,n = 14, JES-ϑ,+,n = 147;  JQS-µ,−,n = 45, JQS-ϑ,−,n = 153;  JQS-µ,+,n = 15, JQS-ϑ,+,n = 144

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     28      30       31      30.57    31       33      0.73
JˇES-µ,−,n     29      30       30      30.49    31       32      0.63
JˆES-ϑ,−,n    112     128      132     132      136      155      5.48
JˇES-ϑ,−,n    119     129      131     130.9    133      144      3.28
JˆES-µ,+,n     12      14       14      14.1     14       17      0.68
JˇES-µ,+,n     12      14       14      14.12    14       16      0.63
JˆES-ϑ,+,n     99     121      127     127      133      165      8.76
JˇES-ϑ,+,n    108     124      127     127      130      148      5.06
JˆQS-µ,−,n     42      46       47      47.28    48       52      1.45
JˇQS-µ,−,n     42      46       47      47.22    48       52      1.42
JˆQS-ϑ,−,n    120     130      133     132.9    135      146      3.52
JˇQS-ϑ,−,n    123     130      132     132.3    134      143      2.72
JˆQS-µ,+,n     11      13       14      13.75    14       18      0.93
JˇQS-µ,+,n     11      13       14      13.75    14       18      0.92
JˆQS-ϑ,+,n    103     119      123     123.5    127      147      6.05
JˇQS-ϑ,+,n    106     120      123     123.7    127      144      4.64

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-13: Simulation Results for Model 12

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       15    16    17    18    19    20    21    22    23    24    25
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.061 1.030 1.011 1.001 0.998 1.000 1.007 1.017 1.031 1.047 1.066
        at estimated choices:       JˆES-µ,−,n: 1.014;  JˇES-µ,−,n: 1.015

ES, +:  J+,n:                       16    17    18    19    20    21    22    23    24    25    26
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.075 1.043 1.021 1.008 1.001 1.000 1.003 1.010 1.020 1.033 1.047
        at estimated choices:       JˆES-µ,+,n: 0.9924;  JˇES-µ,+,n: 0.9926

QS, −:  J−,n:                       24    25    26    27    28    29    30    31    32    33    34
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.024 1.012 1.004 1.000 0.999 1.000 1.003 1.009 1.016 1.025 1.034
        at estimated choices:       JˆQS-µ,−,n: 1.097;  JˇQS-µ,−,n: 1.098

QS, +:  J+,n:                       21    22    23    24    25    26    27    28    29    30    31
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.034 1.018 1.007 1.001 0.999 1.000 1.004 1.010 1.018 1.028 1.040
        at estimated choices:       JˆQS-µ,+,n: 0.8544;  JˇQS-µ,+,n: 0.8545

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 20, JES-ϑ,−,n = 157;  JES-µ,+,n = 21, JES-ϑ,+,n = 134;  JQS-µ,−,n = 29, JQS-ϑ,−,n = 153;  JQS-µ,+,n = 26, JQS-ϑ,+,n = 135

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     17      19       19      19.48    20       23      0.86
JˇES-µ,−,n     17      19       19      19.46    20       22      0.77
JˆES-ϑ,−,n    108     136      143     143.3    150      189     10.43
JˇES-ϑ,−,n    124     138      142     142.5    147      164      5.94
JˆES-µ,+,n     19      20       21      20.81    21       22      0.54
JˇES-µ,+,n     20      21       21      20.81    21       22      0.47
JˆES-ϑ,+,n     94     108      111     111      114      130      4.85
JˇES-ϑ,+,n    100     109      111     110.8    113      120      2.80
JˆQS-µ,−,n     25      30       31      30.67    32       37      1.77
JˇQS-µ,−,n     25      30       31      30.65    32       37      1.73
JˆQS-ϑ,−,n    118     135      140     139.9    144      169      7.19
JˇQS-ϑ,−,n    122     136      139     139.7    143      160      5.49
JˆQS-µ,+,n     18      21       22      21.8     23       29      1.68
JˇQS-µ,+,n     18      21       22      21.8     23       29      1.66
JˆQS-ϑ,+,n    103     111      113     113      115      125      2.90
JˇQS-ϑ,+,n    106     111      113     113      114      122      2.19

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-14: Simulation Results for Model 13

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       15    16    17    18    19    20    21    22    23    24    25
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.088 1.051 1.026 1.010 1.002 1.000 1.003 1.010 1.020 1.034 1.049
        at estimated choices:       JˆES-µ,−,n: 0.95;  JˇES-µ,−,n: 0.9532

ES, +:  J+,n:                       12    13    14    15    16    17    18    19    20    21    22
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.133 1.075 1.037 1.014 1.003 1.000 1.004 1.014 1.028 1.045 1.066
        at estimated choices:       JˆES-µ,+,n: 0.9652;  JˇES-µ,+,n: 0.9578

QS, −:  J−,n:                       61    62    63    64    65    66    67    68    69    70    71
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.006 1.003 1.002 1.001 1.000 1.000 1.000 1.001 1.002 1.004 1.006
        at estimated choices:       JˆQS-µ,−,n: 1.092;  JˇQS-µ,−,n: 1.093

QS, +:  J+,n:                       23    24    25    26    27    28    29    30    31    32    33
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.028 1.014 1.006 1.001 0.999 1.000 1.003 1.009 1.016 1.025 1.035
        at estimated choices:       JˆQS-µ,+,n: 0.8257;  JˇQS-µ,+,n: 0.8247

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 20, JES-ϑ,−,n = 104;  JES-µ,+,n = 17, JES-ϑ,+,n = 96;  JQS-µ,−,n = 66, JQS-ϑ,−,n = 104;  JQS-µ,+,n = 28, JQS-ϑ,+,n = 96

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     15      19       19      19.4     20       24      1.02
JˇES-µ,−,n     16      19       19      19.36    20       23      0.92
JˆES-ϑ,−,n     55      97      106     104.8    113      143     11.59
JˇES-ϑ,−,n     64      98      104     103.9    110      135      8.93
JˆES-µ,+,n     13      16       17      16.84    17       21      1.00
JˇES-µ,+,n     14      16       17      16.87    17       20      0.88
JˆES-ϑ,+,n     46      88       96      94.64   102      126     10.72
JˇES-ϑ,+,n     57      90       96      95.13   100      117      7.54
JˆQS-µ,−,n     49      68       72      72.34    77      104      6.56
JˇQS-µ,−,n     49      68       72      72.33    77      105      6.53
JˆQS-ϑ,−,n     93     102      104     103.8    106      118      3.22
JˇQS-ϑ,−,n     96     102      104     103.7    105      114      2.47
JˆQS-µ,+,n     13      19       22      22.89    25       51      4.86
JˇQS-µ,+,n     13      19       22      22.9     25       51      4.86
JˆQS-ϑ,+,n     85      92       93      93.37    95      103      2.67
JˇQS-ϑ,+,n     88      92       93      93.49    95      102      1.96

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-15: Simulation Results for Model 14

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       17    18    19    20    21    22    23    24    25    26    27
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.064 1.036 1.017 1.006 1.001 1.000 1.003 1.010 1.020 1.032 1.046
        at estimated choices:       JˆES-µ,−,n: 1.059;  JˇES-µ,−,n: 1.061

ES, +:  J+,n:                       11    12    13    14    15    16    17    18    19    20    21
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.129 1.068 1.030 1.008 0.999 1.000 1.008 1.021 1.040 1.061 1.086
        at estimated choices:       JˆES-µ,+,n: 0.9816;  JˇES-µ,+,n: 0.98

QS, −:  J−,n:                       28    29    30    31    32    33    34    35    36    37    38
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.018 1.009 1.003 1.000 0.999 1.000 1.003 1.007 1.012 1.019 1.027
        at estimated choices:       JˆQS-µ,−,n: 1.172;  JˇQS-µ,−,n: 1.173

QS, +:  J+,n:                       13    14    15    16    17    18    19    20    21    22    23
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.110 1.061 1.030 1.011 1.002 1.000 1.004 1.014 1.027 1.043 1.062
        at estimated choices:       JˆQS-µ,+,n: 0.8606;  JˇQS-µ,+,n: 0.8602

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 22, JES-ϑ,−,n = 121;  JES-µ,+,n = 16, JES-ϑ,+,n = 111;  JQS-µ,−,n = 33, JQS-ϑ,−,n = 121;  JQS-µ,+,n = 18, JQS-ϑ,+,n = 111

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     20      23       23      23.06    24       26      0.86
JˇES-µ,−,n     20      23       23      23.03    24       26      0.77
JˆES-ϑ,−,n     77     105      109     109      113      131      6.57
JˇES-ϑ,−,n     92     106      108     108.5    111      125      3.58
JˆES-µ,+,n     14      15       15      15.43    16       17      0.59
JˇES-µ,+,n     14      15       15      15.43    16       17      0.54
JˆES-ϑ,+,n     75      95       99      98.67   102      119      5.67
JˇES-ϑ,+,n     85      97       99      98.73   101      110      3.31
JˆQS-µ,−,n     30      36       37      37.42    39       45      1.94
JˇQS-µ,−,n     30      36       37      37.4     39       44      1.92
JˆQS-ϑ,−,n     97     106      109     108.8    111      121      3.57
JˇQS-ϑ,−,n     98     107      109     108.6    110      119      2.77
JˆQS-µ,+,n     13      15       15      15.56    16       21      1.14
JˇQS-µ,+,n     13      15       15      15.56    16       21      1.13
JˆQS-ϑ,+,n     88      96       98      98.56   101      111      3.05
JˇQS-ϑ,+,n     91      97       98      98.59   100      108      2.30

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-16: Simulation Results for Model 15

Panel A: IMSE for Grid of Number of Bins and Estimated Choices

ES, −:  J−,n:                       25    26    27    28    29    30    31    32    33    34    35
        IMSEES,−(J−,n)/IMSE∗ES,−:   1.026 1.014 1.006 1.001 0.999 1.000 1.003 1.007 1.013 1.021 1.030
        at estimated choices:       JˆES-µ,−,n: 1.041;  JˇES-µ,−,n: 1.046

ES, +:  J+,n:                       9     10    11    12    13    14    15    16    17    18    19
        IMSEES,+(J+,n)/IMSE∗ES,+:   1.223 1.121 1.058 1.022 1.004 1.000 1.006 1.020 1.039 1.063 1.091
        at estimated choices:       JˆES-µ,+,n: 0.9839;  JˇES-µ,+,n: 0.9816

QS, −:  J−,n:                       40    41    42    43    44    45    46    47    48    49    50
        IMSEQS,−(J−,n)/IMSE∗QS,−:   1.008 1.004 1.001 1.000 0.999 1.000 1.002 1.004 1.007 1.011 1.016
        at estimated choices:       JˆQS-µ,−,n: 1.158;  JˇQS-µ,−,n: 1.161

QS, +:  J+,n:                       10    11    12    13    14    15    16    17    18    19    20
        IMSEQS,+(J+,n)/IMSE∗QS,+:   1.168 1.090 1.041 1.014 1.001 1.000 1.007 1.021 1.040 1.063 1.089
        at estimated choices:       JˆQS-µ,+,n: 0.9187;  JˇQS-µ,+,n: 0.9176

Panel B: Summary Statistics for the Estimated Number of Bins

Pop. Par.:  JES-µ,−,n = 30, JES-ϑ,−,n = 149;  JES-µ,+,n = 14, JES-ϑ,+,n = 140;  JQS-µ,−,n = 45, JQS-ϑ,−,n = 151;  JQS-µ,+,n = 15, JQS-ϑ,+,n = 137

              Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n     28      30       31      30.51    31       33      0.78
JˇES-µ,−,n     28      30       30      30.42    31       33      0.67
JˆES-ϑ,−,n    102     126      130     130      134      158      6.56
JˇES-ϑ,−,n    111     127      129     129      132      141      3.77
JˆES-µ,+,n     12      13       14      13.93    14       17      0.75
JˇES-µ,+,n     12      14       14      13.94    14       16      0.68
JˆES-ϑ,+,n     87     117      124     124.4    131      168     11.00
JˇES-ϑ,+,n    101     119      124     124.3    129      155      6.83
JˆQS-µ,−,n     44      49       50      50.34    51       56      1.63
JˇQS-µ,−,n     44      49       50      50.28    51       56      1.59
JˆQS-ϑ,−,n    120     129      131     131.3    134      144      3.61
JˇQS-ϑ,−,n    120     129      131     130.8    133      142      2.84
JˆQS-µ,+,n     11      13       14      13.65    14       19      1.10
JˇQS-µ,+,n     11      13       14      13.66    14       19      1.11
JˆQS-ϑ,+,n     98     115      120     120.4    125      152      7.05
JˇQS-ϑ,+,n    103     116      120     120.5    124      143      5.95

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for the ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for the ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for the QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for the QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: Jˆ·,·,n = spacings estimator and Jˇ·,·,n = polynomial estimator of the corresponding population quantity.

Table SA-17: Simulation Results for Model 16

Panel A: IMSE for Grid of Number of Bins and Estimated Choices
(each IMSE column reports IMSE(J)/IMSE∗ for the indicated scheme and side)

ES, control (−)       ES, treatment (+)     QS, control (−)       QS, treatment (+)
J−,n   IMSE ratio     J+,n   IMSE ratio     J−,n   IMSE ratio     J+,n   IMSE ratio
 15      1.059         16      1.073         24      1.023         21      1.033
 16      1.030         17      1.042         25      1.011         22      1.017
 17      1.011         18      1.021         26      1.004         23      1.007
 18      1.001         19      1.007         27      1.000         24      1.001
 19      0.998         20      1.001         28      0.999         25      0.999
 20      1.000         21      1.000         29      1.000         26      1.000
 21      1.007         22      1.003         30      1.004         27      1.004
 22      1.018         23      1.010         31      1.009         28      1.010
 23      1.031         24      1.021         32      1.016         29      1.019
 24      1.048         25      1.033         33      1.025         30      1.029
 25      1.066         26      1.048         34      1.035         31      1.041

IMSE ratio at the estimated choices:
JˆES-µ,−,n: 1.048     JˆES-µ,+,n: 0.9941    JˆQS-µ,−,n: 1.071     JˆQS-µ,+,n: 0.8365
JˇES-µ,−,n: 1.05      JˇES-µ,+,n: 0.9938    JˇQS-µ,−,n: 1.072     JˇQS-µ,+,n: 0.8365

Panel B: Summary Statistics for the Estimated Number of Bins

Population parameters: JES-µ,−,n = 20, JES-ϑ,−,n = 155; JES-µ,+,n = 21, JES-ϑ,+,n = 134;
JQS-µ,−,n = 29, JQS-ϑ,−,n = 151; JQS-µ,+,n = 26, JQS-ϑ,+,n = 136.

Estimator       Min.   1st Qu.  Median   Mean    3rd Qu.  Max.   Std. Dev.
JˆES-µ,−,n       17      20       20      20.09    21       24      0.92
JˇES-µ,−,n       17      20       20      20.05    21       23      0.80
JˆES-ϑ,−,n       94     132      139     138.9    146      179     11.15
JˇES-ϑ,−,n      116     134      138     138      142      164      6.18
JˆES-µ,+,n       19      20       21      20.74    21       23      0.66
JˇES-µ,+,n       19      20       21      20.74    21       23      0.58
JˆES-ϑ,+,n       85     108      112     112      116      132      6.23
JˇES-ϑ,+,n       98     109      112     111.9    114.2    127      4.09
JˆQS-µ,−,n       24      29       30      30.13    31       37      1.73
JˇQS-µ,−,n       24      29       30      30.11    31       37      1.70
JˆQS-ϑ,−,n      113     132      136     136.6    141      163      7.10
JˇQS-ϑ,−,n      117     132      136     136.3    140      161      5.74
JˆQS-µ,+,n       17      20       21      21.08    22       28      1.42
JˇQS-µ,+,n       17      20       21      21.07    22       28      1.41
JˆQS-ϑ,+,n      102     111      113     113      115      125      3.35
JˇQS-ϑ,+,n      104     111      113     113      115      125      2.74

Notes: (i) Population quantities: JES-µ,·,n = IMSE-optimal partition size for ES RD plot; JES-ϑ,·,n = mimicking-variance partition size for ES RD plot; JQS-µ,·,n = IMSE-optimal partition size for QS RD plot; JQS-ϑ,·,n = mimicking-variance partition size for QS RD plot; IMSE∗ES,· = IMSEES,·(JES-µ,·,n) = ES IMSE function evaluated at the optimal choice; IMSE∗QS,· = IMSEQS,·(JQS-µ,·,n) = QS IMSE function evaluated at the optimal choice. (ii) Estimators: JˆES-µ,·,n, JˆES-ϑ,·,n, JˆQS-µ,·,n, JˆQS-ϑ,·,n = spacings estimators of JES-µ,·,n, JES-ϑ,·,n, JQS-µ,·,n, JQS-ϑ,·,n, respectively; JˇES-µ,·,n, JˇES-ϑ,·,n, JˇQS-µ,·,n, JˇQS-ϑ,·,n = polynomial estimators of the same quantities.

6 Numerical Comparison of Partitioning Schemes

We proposed two alternative ways of constructing RD plots, one employing ES partitioning and the other employing QS partitioning. While developing a general theory for optimal partitioning scheme selection is beyond the scope of this paper, we can employ our IMSE expansions to compare the two partitioning schemes theoretically in order to assess their relative IMSE-optimality properties. Without loss of generality, we focus on the IMSE for the treatment group ("+" subindex). Assuming the regularity conditions imposed in the paper hold, we obtain (up to the ceiling operator for selecting the optimal partition sizes):

$$\mathrm{IMSE}_{\mathrm{ES},+}(J_{\mathrm{ES},+,n}) = \frac{3}{\sqrt[3]{4}}\, C_{\mathrm{ES},+}\, n^{-2/3}\,\{1 + o_{\mathbb{P}}(1)\},
\qquad
\mathrm{IMSE}_{\mathrm{QS},+}(J_{\mathrm{QS},+,n}) = \frac{3}{\sqrt[3]{4}}\, C_{\mathrm{QS},+}\, n^{-2/3}\,\{1 + o_{\mathbb{P}}(1)\},$$

where

$$C_{\mathrm{ES},+} = \left(\int_{\bar{x}}^{x_u} \left(\mu^{(1)}_{+}(x)\right)^{2} w(x)\,dx\right)^{1/3} \left(\int_{\bar{x}}^{x_u} \frac{\sigma^{2}_{+}(x)}{f(x)}\, w(x)\,dx\right)^{2/3},$$

$$C_{\mathrm{QS},+} = \left(\int_{\bar{x}}^{x_u} \left(\frac{\mu^{(1)}_{+}(x)}{f(x)}\right)^{2} w(x)\,dx\right)^{1/3} \left(\int_{\bar{x}}^{x_u} \sigma^{2}_{+}(x)\, w(x)\,dx\right)^{2/3}.$$
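For completeness, the leading constant in these expansions follows from the usual bias–variance tradeoff. As a sketch, assume (consistent with our IMSE expansions) that the IMSE of a partition with $J$ bins expands as $\mathrm{IMSE}(J) \approx B/J^{2} + V J/n$, with $B$ and $V$ the leading bias and variance constants; minimizing over $J$ then gives

$$\frac{d}{dJ}\left(\frac{B}{J^{2}} + \frac{VJ}{n}\right) = 0
\;\Longrightarrow\;
J^{*} = \left(\frac{2Bn}{V}\right)^{1/3},
\qquad
\mathrm{IMSE}(J^{*}) = \frac{3}{\sqrt[3]{4}}\,\left(B V^{2}\right)^{1/3} n^{-2/3},$$

so that each constant above takes the form $C_{\cdot,+} = B_{\cdot,+}^{1/3} V_{\cdot,+}^{2/3}$.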

Thus, in order to compare the performance of the partition-size selectors for ES and QS RD plots, we need to compare the two DGP constants CES,+ and CQS,+. When f(x) ∝ κ (i.e., the running variable is uniformly distributed), CES,+ = CQS,+, and therefore both partitioning schemes have equal (asymptotic) IMSE when the corresponding optimal partition size is used. Unfortunately, when the density f(x) is not constant on the support [xl, xu], it is not possible to obtain a unique ranking between IMSEES,+(JES,+,n) and IMSEQS,+(JQS,+,n). Heuristically, QS RD plots should perform better where the data are sparse, because the estimated quantile-spaced partition adapts to such regions, but we have been unable to provide a formal ranking along these lines. Nonetheless, in Table SA-18 we explore the ranking between the two partitioning schemes using the 16 data generating processes considered in our simulation study (Table SA-1). As expected, this
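The equality of the two constants under a uniform running variable, and the absence of a general ranking otherwise, can be checked numerically. The sketch below uses hypothetical choices of µ⁽¹⁾₊, σ²₊, and w on [x̄, xu] = [0, 1] (these specific functions are illustrative assumptions, not the DGPs of the simulation study) and evaluates both constants by trapezoidal integration:

```python
import numpy as np

# Numerical sketch: compare C_ES,+ and C_QS,+ for illustrative (hypothetical)
# choices of mu'_+, sigma^2_+, f, and w on [xbar, x_u] = [0, 1].
x = np.linspace(0.0, 1.0, 10_001)

def integrate(y):
    """Trapezoidal rule on the fixed grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

w = np.ones_like(x)            # weighting function w(x) = 1
mu1 = 2.0 * x                  # mu_+^{(1)}(x) for the assumed mu_+(x) = x^2
sig2 = 1.0 + 0.5 * x           # assumed heteroskedastic conditional variance

def c_es(f):
    # C_ES,+ = (int (mu')^2 w)^{1/3} * (int (sigma^2 / f) w)^{2/3}
    return integrate(mu1**2 * w) ** (1/3) * integrate(sig2 / f * w) ** (2/3)

def c_qs(f):
    # C_QS,+ = (int (mu'/f)^2 w)^{1/3} * (int sigma^2 w)^{2/3}
    return integrate((mu1 / f)**2 * w) ** (1/3) * integrate(sig2 * w) ** (2/3)

f_unif = np.ones_like(x)       # uniform running variable
print(np.isclose(c_es(f_unif), c_qs(f_unif)))   # constants coincide

f_lin = 0.5 + x                # non-uniform density (integrates to 1 on [0, 1])
print(c_es(f_lin), c_qs(f_lin))                 # constants differ
```

Swapping in other non-uniform densities (bounded away from zero on the support) shows that either constant can be the larger one, which is why no general ranking obtains.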


Table SA-18: Comparison of Partitioning Schemes

           BES,−/   VES,−/   IMSEES,−(JES-µ,−,n)/    BES,+/   VES,+/   IMSEES,+(JES-µ,+,n)/
           BQS,−    VQS,−    IMSEQS,−(JQS-µ,−,n)     BQS,+    VQS,+    IMSEQS,+(JQS-µ,+,n)
Model  1   1.000    1.000    1.000                   1.000    1.000    1.000
Model  2   2.290    1.000    1.319                   0.784    1.000    0.925
Model  3   2.466    1.389    1.682                   1.038    1.004    1.016
Model  4   1.258    1.004    1.084                   0.447    1.389    0.953
Model  5   2.466    1.000    1.352                   1.038    1.000    1.004
Model  6   1.258    1.000    1.081                   0.447    1.000    0.765
Model  7   2.466    1.389    1.682                   1.038    1.004    1.016
Model  8   1.258    1.004    1.084                   0.447    1.389    0.953
Model  9   0.028    1.000    0.303                   0.241    1.000    0.624
Model 10   0.309    1.000    0.677                   0.655    1.000    0.867
Model 11   0.301    1.015    0.677                   0.831    0.977    0.928
Model 12   0.309    0.977    0.666                   0.570    1.015    0.839
Model 13   0.028    1.000    0.303                   0.241    1.000    0.624
Model 14   0.309    1.000    0.677                   0.655    1.000    0.867
Model 15   0.301    1.015    0.677                   0.831    0.977    0.928
Model 16   0.309    0.977    0.666                   0.570    1.015    0.839

table shows that when f(x) is uniform both IMSEs are equal, while when f(x) is not uniform either IMSE may dominate the other. Which scheme dominates depends on the shape of the regression function (which differs between the control and treatment sides) and on the conditional heteroskedasticity of the underlying data generating process.
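Given the expansions above, the IMSE ratios reported in Table SA-18 should satisfy IMSE_ES/IMSE_QS = (B_ES/B_QS)^{1/3} (V_ES/V_QS)^{2/3}. A quick arithmetic check against a few tabulated rows (values taken directly from the table, control side) confirms this relationship:

```python
# Check: IMSE ratio = (B ratio)^(1/3) * (V ratio)^(2/3), using rows of Table SA-18.
rows = [
    # (B_ES,-/B_QS,-, V_ES,-/V_QS,-, tabulated IMSE ratio)
    (2.290, 1.000, 1.319),   # Model 2
    (2.466, 1.389, 1.682),   # Model 3
    (0.028, 1.000, 0.303),   # Model 9
    (0.301, 1.015, 0.677),   # Model 11
]
for b, v, imse in rows:
    implied = b ** (1 / 3) * v ** (2 / 3)
    print(f"implied={implied:.3f}  tabulated={imse:.3f}")
```

The implied and tabulated ratios agree up to rounding of the tabulated constants.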


References

Aras, G., Jammalamadaka, S. R., and Zhou, X. (1989), "Limit Distribution of Spacings Statistics when the Sample Size is Random," Statistics and Probability Letters, 8, 451–456.

Calonico, S., Cattaneo, M. D., and Titiunik, R. (2014a), "Robust Data-Driven Inference in the Regression-Discontinuity Design," Stata Journal, 14, 909–946.

——— (2014b), "Supplement to 'Robust Nonparametric Confidence Intervals for Regression-Discontinuity Designs'," Econometrica Supplemental Material, 82, http://dx.doi.org/10.3982/ECTA11757.

——— (2015), "rdrobust: An R Package for Robust Nonparametric Inference in Regression-Discontinuity Designs," R Journal, 7, 38–51.

Cattaneo, M. D., and Farrell, M. H. (2013), "Optimal Convergence Rates, Bahadur Representation, and Asymptotic Normality of Partitioning Estimators," Journal of Econometrics, 174, 127–143.

Cattaneo, M. D., Frandsen, B., and Titiunik, R. (2015), "Randomization Inference in the Regression Discontinuity Design: An Application to Party Advantages in the U.S. Senate," Journal of Causal Inference, 3, 1–24.

Lee, D. S. (2008), "Randomized Experiments from Non-random Selection in U.S. House Elections," Journal of Econometrics, 142, 675–697.

Ludwig, J., and Miller, D. L. (2007), "Does Head Start Improve Children's Life Chances? Evidence from a Regression Discontinuity Design," Quarterly Journal of Economics, 122, 159–208.

Mason, D. M. (1984), "A Strong Limit Theorem for the Oscillation Modulus of the Uniform Empirical Quantile Process," Stochastic Processes and their Applications, 17, 127–136.

Newey, W. K. (1997), "Convergence Rates and Asymptotic Normality for Series Estimators," Journal of Econometrics, 79, 147–168.

Shorack, G., and Wellner, J. (2009), Empirical Processes with Applications to Statistics, SIAM.

