The Housing and Educational Consequences of the School Choice Provisions of NCLB: Evidence from Charlotte, NC
Online Appendix

Stephen B. Billings, Eric J. Brunner and Stephen L. Ross*
This appendix provides additional details and analyses to supplement the information in Billings, Brunner, and Ross (BBR). Appendix A provides further details on the data, including: 1) how we match the HMDA data to the mortgage deeds data, 2) the distribution of failing block groups by pre-NCLB sale prices, and 3) the distribution of failing schools by the first year for which a school qualified as a failing school. Appendix B presents and discusses additional analyses that complement those found in BBR, including the results of three additional falsification tests and results based on alternative tercile specifications. Finally, Appendix C provides additional information on the generalizability of our results.
____________ * Billings: Department of Economics, University of North Carolina at Charlotte, 9201 University City Blvd, Charlotte, NC 28223,
[email protected]; Brunner: Department of Public Policy, University of Connecticut, 1800 Asylum Ave, Fourth Floor, West Hartford, CT 06117,
[email protected]; Ross: Department of Economics, University of Connecticut, 341 Mansfield Road, Unit 1063, Storrs, CT 06269-1063,
[email protected].
Appendix A Additional Data Details

1. Matching HMDA Data to Mortgage Deeds Data

The HMDA data provide information on a homebuyer’s income as well as information on the mortgage loan amount, the name of the mortgage lender, and the census tract of the purchased home. To merge the mortgage deeds data with the HMDA data, we first geocoded the address of each home in the mortgage deeds data to obtain the census tract within which each home was located. We then merged the two datasets based on: 1) mortgage loan amount, 2) mortgage lender name and 3) census tract. Based on this matching process, we were able to successfully match approximately 80 percent of the mortgage originations that occurred between 2004 and 2010.

To limit the influence of rental properties, we further restricted the matched data in three ways by excluding: 1) mortgages that are not described in HMDA as owner-occupied, 2) parcels for which the property address did not match the owner’s mailing address, and 3) mortgages with unreasonably high stated annual income for an owner-occupant, i.e., annual income larger than the mortgage amount. The first restriction leaves us with 52,666 parcel-level observations on homebuyer incomes, while the other two restrictions reduce the sample to 37,472 and 47,032 observations, respectively.

2. The Distribution of Failing Block Groups by Pre-NCLB Sale Prices

The number of failures in the highest price tercile is relatively low, and so our parameter of interest, the difference between the effect of failure in the highest and lowest price terciles, is based on 14 top-tercile block groups that are bisected by an attendance zone of a school with an AYP failure during our sample period, compared with 39 and 62 such block groups for the middle and lowest price terciles. Figure 1A illustrates the distribution of the 186 census block groups with failing schools by pre-NCLB sale prices.
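The matching and sample-restriction steps described in subsection 1 above can be sketched as follows. This is a minimal illustration on made-up records; all column names (`loan_amount`, `lender_name`, `census_tract`, and so on) and values are hypothetical stand-ins for the actual HMDA and deeds fields.

```python
import pandas as pd

# Hypothetical HMDA records (loan amounts and incomes in $000s).
hmda = pd.DataFrame({
    "loan_amount": [150, 200, 320],
    "lender_name": ["Bank A", "Bank B", "Bank A"],
    "census_tract": ["37119.01", "37119.02", "37119.01"],
    "owner_occupied": [True, True, False],
    "income": [60, 450, 90],
})

# Hypothetical geocoded deeds records; each parcel carries its census tract.
deeds = pd.DataFrame({
    "parcel_id": [1, 2, 3],
    "loan_amount": [150, 200, 320],
    "lender_name": ["Bank A", "Bank B", "Bank A"],
    "census_tract": ["37119.01", "37119.02", "37119.01"],
    "property_addr": ["1 Elm St", "2 Oak St", "3 Pine St"],
    "owner_mail_addr": ["1 Elm St", "9 Far Away Rd", "3 Pine St"],
})

# Merge on loan amount, lender name, and census tract.
matched = deeds.merge(hmda, on=["loan_amount", "lender_name", "census_tract"])

# Restriction 1: keep HMDA owner-occupied mortgages.
sample_1 = matched[matched["owner_occupied"]]
# Restriction 2: property address must match the owner's mailing address.
sample_2 = sample_1[sample_1["property_addr"] == sample_1["owner_mail_addr"]]
# Restriction 3: stated income must not exceed the mortgage amount.
sample_3 = sample_1[sample_1["income"] <= sample_1["loan_amount"]]
```

As in the text, restrictions 2 and 3 are each applied separately to the owner-occupied sample rather than cumulatively.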
The vertical lines in the figure correspond to the average census block group sales prices that separate all 352 block groups (block groups with and without failing schools) into the terciles shown in the top row of
Table 1 of BBR. 1 The distribution of block groups is quite skewed, and additional block groups cannot be incorporated into the top price subsample without adding block groups that have substantially lower price levels than the subsample as a whole. While the small number of block groups has no impact on the validity of our analyses, it may affect generalizability, which we discuss in Appendix C.

3. Distribution of Failing Schools by First Year of Failing School Status

We also observe that a substantial fraction of schools that fail to meet AYP fail immediately upon full implementation of AYP standards, as seen in Figure 2A. The top panel provides the distribution of our failing schools by the first year for which a school qualified as a failing school. The bottom panel provides the number of failing schools in each year. The first year of high stakes NCLB testing was 2002-2003, but the AYP standard in that initial year was set much lower in order to ease the transition into the new testing regime. The first year of high AYP standards was 2003-2004, and just under half of our twice-failing Title 1 schools are classified as failing to make AYP in 2004-2005. Note that one school, Westerly Hills Elementary, performed so poorly that it failed to meet AYP standards even under the lower 2002-2003 standards and thus is classified as failing to make AYP in 2003-2004. Westerly Hills was the only elementary school with less than 60% of students meeting AYP in 2001-2002.
1 As noted in BBR, the highest housing value block group that contains at least part of an attendance zone for a school that experienced two consecutive AYP failures had an average sales price during the pre-period of $389,217. All 352 block groups with average transaction prices lower than this amount are ordered by average price and divided into terciles. The remaining 15 block groups, which have higher average prices, are not used to create the terciles.
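The tercile construction described in this footnote can be sketched as follows. The block-group identifiers and all prices other than the $389,217 cutoff are made up for illustration.

```python
# Hypothetical average 1998-2002 sale prices (in $) by census block group.
prices = {
    "bg01": 95_000, "bg02": 120_000, "bg03": 150_000,
    "bg04": 180_000, "bg05": 230_000, "bg06": 310_000,
    "bg07": 389_217,  # highest-priced block group with a twice-failing school
    "bg08": 450_000,  # above the cutoff: excluded from the terciles
}

# Drop block groups priced above the highest failing-school block group.
cutoff = 389_217
eligible = sorted((p, bg) for bg, p in prices.items() if p <= cutoff)

# Split the ordered block groups into three equal-sized price terciles.
n = len(eligible)
tercile = {}
for rank, (_, bg) in enumerate(eligible):
    tercile[bg] = 1 + min(rank * 3 // n, 2)
```

In the actual data this yields the 352 tercile-assigned block groups, with the 15 higher-priced block groups set aside.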
Figure 1A Distribution of Mean Sale Price by Block Group: 1998-2002
Notes: This figure highlights the distribution of our 186 Census Block Groups with failing schools by average 1998-2002 sales price. Vertical lines indicate the average CBG sales prices that separate all 352 Census Block Groups (that are below the maximum sale price for which we observe a failing school) into the terciles listed in the top row of Table 1 of BBR.
Figure 2A Distribution of Failing Schools by First Year for which a School Qualified as a Failing School
Notes: The top figure shows the distribution of failing schools by the first year for which a school qualified as a failing school. The bottom figure shows the number of failing schools in each year.
Appendix B Additional Falsification Tests and Extensions

1. Additional Falsification Tests

The falsification tests reported in Table 9 of BBR are designed to rule out the possibility that our results are driven by systematic spatial variation in residential composition across the entire Charlotte-Mecklenburg school district. In this section we report the results of three further falsification tests.

The first falsification test involves treating schools that failed twice in a given year as if they failed AYP two years earlier. To implement this falsification test, we add two years of observations prior to the beginning of AYP testing and drop all observations following the actual AYP failure. Results are reported in Table 1A. Similar to our primary falsification test reported in Table 9 of BBR, columns 1 and 2 of Table 1A re-estimate the housing price and income models for the full sample of mortgages from Table 3 of BBR. Similarly, column 3 re-estimates the model for current residents attending a non-assigned school from Table 4 of BBR, while column 4 re-estimates the moved into neighborhood/block group model from Table 8 of BBR. As Table 1A reveals, we find no relationship between AYP failure and any of our outcomes for the third tercile. In fact, we find that housing prices are lower in the second tercile and incomes are lower in the third tercile after an AYP failure. The fact that we find negative effects on home prices and homebuyer incomes in these falsification tests is not surprising: these schools were adjusting to the redistricting that occurred after the end of court-ordered busing.
Given that families have time to move out of the attendance zones of struggling schools that are likely to fail AYP in the future, it is not surprising that prices and incomes would be falling in these attendance zones.

For the second falsification test, we drop any Title 1 school that experienced two failures and then assign schools as pseudo failing in a year if the school missed AYP in the same subject for the previous 2 years and is not a Title 1 school. Since non-Title 1 schools are not subject to the choice sanctions associated with NCLB, there should be no incentive for families to strategically move into the best neighborhoods in attendance zones of these schools. These results are reported in Table 2A, and again we find no relationship between AYP failure and any of our outcomes for the third tercile.
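A minimal sketch of the pseudo-failure assignment used in this second test is below; the school records and field names are hypothetical, not drawn from our data.

```python
# Hypothetical school-by-year AYP records: (school, year) -> Title 1 status
# plus the set of subjects in which the school missed AYP that year.
records = {
    ("Alpha Elem", 2005): {"title1": False, "missed": {"math"}},
    ("Alpha Elem", 2006): {"title1": False, "missed": {"math"}},
    ("Beta Elem", 2005): {"title1": True, "missed": {"reading"}},
    ("Beta Elem", 2006): {"title1": True, "missed": {"reading"}},
}

def pseudo_failing(school, year):
    """Pseudo-fail: a non-Title 1 school that missed AYP in the same
    subject in each of the previous two years."""
    prev = [records.get((school, year - k)) for k in (1, 2)]
    if any(r is None or r["title1"] for r in prev):
        return False
    # The same subject must be missed in both prior years.
    return bool(prev[0]["missed"] & prev[1]["missed"])
```

Because Title 1 status gates the NCLB choice sanctions, only the non-Title 1 pseudo-failures enter this falsification sample.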
In the final falsification test we drop any school that experienced two failures and treat schools that experienced a single failure as failing AYP in the year following that failure. Note that this falsification test requires us to utilize both Title 1 schools that fail AYP only once and non-Title 1 schools that fail AYP once, since virtually all Title 1 schools that fail once continue to fail throughout our sample period. 2 As shown in Table 3A, we find no evidence of higher housing prices, higher income, a higher likelihood of attending a non-assigned school, or a higher likelihood of being a new resident for the third tercile upon failure. We do find that the coefficient on the any fail indicator is positive and statistically significant at the 10 percent level in both column 3 (attend a non-assigned school) and column 4 (moved into home since last year). However, one might expect that an AYP failure would have direct effects on school choice decisions even without the advantages offered by increased priority in the school choice lottery. This possibility is a major reason why we have focused primarily on the differences across the difference-in-differences estimates.

As noted in Section V.D of BBR, the falsification tests reported in this appendix attempt to address variation over time, but are imperfect in part because AYP failure, and the declines in school quality that precede failure, can have direct effects on the observed outcomes. Therefore, we cannot entirely rule out the possibility that our primary results are driven by systematic variation over time that arises as schools and residential populations adjust to the new boundaries that were drawn shortly before the implementation of NCLB.
In Table 10 of BBR, we present results from a model where we interact AYP failure and the tercile interactions with whether the AYP failure occurred three or more years after the full implementation of AYP standards, and we find that all of our results are robust in terms of significance and magnitude when based only on the schools that have an immediate AYP failure. In Table 4A, we present the results of an additional model designed to examine whether our results are robust to the timing of AYP failures. Specifically, we interact the AYP failure indicator and the tercile interactions with years since the failure. This allows us to focus on the effects in the years immediately following the failure. Again, all results are robust in the level coefficients on AYP failure and the tercile interactions when our analysis is based on
2 Specifically, there are 32 Title 1 schools in our sample that failed to meet AYP for two consecutive years and were therefore subject to choice sanctions. Of those, only 4 failed in one or more consecutive years and then subsequently made AYP for more than one year in a row.
comparisons across time periods that are in close temporal proximity. Specifically, the interactions with years since fail are small and insignificant except for the housing price model. For housing price, these interactions are negative, suggesting that any housing price effects erode after four years. Most importantly, there is no evidence of a positive and significant interaction, which would be consistent with effect sizes getting larger as we compare years that are further apart and therefore potentially more different.

2. Alternative Tercile Specifications

The core results and falsification tests reported in BBR, and the three additional falsification tests reported in this appendix, are based on specifications that include an indicator for AYP failure and that indicator interacted with indicators for whether a census block group is in the 2nd or 3rd tercile of neighborhood quality. As a result, the coefficients on the tercile interactions represent the difference between the difference-in-differences estimates of the effect of failure to meet AYP for the lowest quality neighborhoods and failure to meet AYP for the other terciles of neighborhood quality. As noted in BBR, because the announcement of a school failure may have direct effects on our outcomes of interest that are unrelated to the benefit of higher priority in the school choice system, the simple DD estimate, given by the coefficient on the AYP failure indicator, may not be tied directly to the school choice mechanism. In contrast, because the coefficients on the tercile interactions represent the difference between the difference-in-differences estimates, they are less likely to be affected by any such direct effects of AYP failure. As a result, to isolate the causal effect of AYP failure on our outcomes of interest, we focus primarily on the coefficients on the tercile interactions. Nevertheless, it is instructive to also examine the overall effect of AYP failure in each tercile directly.
Obviously these effects are simply the sum of the estimated coefficient on the AYP failure indicator and the estimated coefficient on the tercile interactions in the tables presented in BBR and the three falsification tests reported in this appendix. However, in order to also obtain standard errors for those estimates, in this section we estimate models similar to our core models except we now interact the AYP failure indicator with all three housing price tercile indicators. Thus, the estimated coefficients reported in this section are now all difference-in-differences estimates for the respective terciles.
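The equivalence between the two parameterizations can be illustrated with simulated data. The sketch below, on made-up coefficients and with numpy only, is a bare cross-sectional OLS illustration of the interaction structure; the actual models in BBR include fixed effects and controls, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 9_000
tercile = rng.integers(1, 4, n)   # neighborhood price tercile (1-3)
fail = rng.integers(0, 2, n)      # AYP failure indicator

# Simulated per-tercile failure effects of 0.00, 0.05 and 0.25 (made up).
effect = np.choose(tercile - 1, [0.0, 0.05, 0.25])
y = effect * fail + rng.normal(0, 0.1, n)

# Alternative specification: interact failure with all three tercile
# indicators (plus tercile level effects), so each interaction coefficient
# is directly the difference-in-differences estimate for its own tercile.
X = np.column_stack([
    np.ones(n),
    tercile == 2,
    tercile == 3,
    (tercile == 1) & (fail == 1),
    (tercile == 2) & (fail == 1),
    (tercile == 3) & (fail == 1),
]).astype(float)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Here `beta[3:6]` recover the per-tercile effects, and `beta[5] - beta[3]` equals the top-tercile interaction coefficient from the base specification, which is why the two parameterizations carry the same information but different standard errors on the reported terms.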
Table 5A presents results similar to those reported in Table 3 of BBR, except we now use the alternative tercile specification where we interact the AYP failure indicator with all three housing price tercile indicators. A brief inspection of Table 5A reveals that the results from this alternative specification are quite similar to those reported in Table 3 of BBR. In Table 6A we present results similar to those reported in Tables 5 and 8 of BBR using the alternative tercile specification. The results reported in columns 1-3 of Table 6A are generally quite similar to the corresponding results reported in Table 5 of BBR. The one exception is that the estimated coefficient on the top tercile interaction in the attend magnet school specification (column 3) is slightly larger than the corresponding estimate in Table 5 of BBR and is now statistically significant. As shown in column 4 of Table 6A, we also obtain results similar to those reported in the bottom panel of Table 8 of BBR for the moved into home since last year specification.

Tables 7A, 8A, 9A and 10A present results from our falsification tests using the alternative tercile specification where we interact the AYP failure indicator with all three housing price tercile indicators. Specifically, Table 7A presents results similar to those reported in Table 9 of BBR, while Tables 8A, 9A and 10A present results similar to those reported in Tables 1A, 2A and 3A. The difference-in-differences estimates reported in Table 7A are generally similar to our main falsification results reported in Table 9 of BBR. Specifically, all of the estimated coefficients in Table 7A are small in magnitude and statistically insignificant, as one would expect if our core results have a causal interpretation. The results reported in Tables 8A, 9A and 10A are also similar to those reported in Tables 1A, 2A and 3A.
The one exception is the estimated coefficient on the highest tercile interaction in column 2 of Table 10A which is now positive and statistically significant. While the positive and statistically significant coefficient on the highest tercile indicator in Table 10A implies we fail the falsification test for the homebuyer income model in one of our falsification tests, we are not overly concerned by this finding for several reasons. First, as noted previously, because the announcement of a school failure may have direct effects on our outcomes of interest that are unrelated to the benefit of higher priority in the school choice system, the simple DD estimates reported in Table 10A may not be tied directly to the school choice mechanism. Again, this is why we have focused on the difference between the difference-in-differences estimates
and in those models we find that the estimated coefficient on the highest tercile interaction is statistically insignificant (see results in Table 3A). Second, note that relative to the estimated coefficient on the highest tercile interaction in our causal model reported in column 2 of Table 3 of BBR, the corresponding estimate on the highest tercile interaction in the falsification test reported in Table 10A is quite small in magnitude. Specifically, in Table 3 of BBR we find an estimated coefficient on the highest tercile interaction of 0.241. In contrast, the corresponding estimate in Table 10A is only 0.051, or roughly one-fifth as large in magnitude. Finally, we note that across the 32 falsification tests reported in Table 9 of BBR and Tables 1A-3A and 7A-10A of this appendix, only one of the estimated coefficients on the highest tercile interaction is positive and statistically significant.
3. Impact of AYP Failure on Student Achievement

Finally, in Table 11A we examine the effect of AYP failure on student test scores.
Columns 1 and 2 report results based on our original resident model given by equation (1), while columns 3 and 4 report results based on our current resident model given by equation (3). The specifications in the top panel include controls for race, gender and grade, while the specifications in the bottom panel replace student characteristics with student fixed effects. In general, the estimates reported in Table 11A are too noisy to be informative, as evidenced by the fact that all but one of the estimated coefficients reported in both the top and bottom panels are statistically insignificant. 3
3 The effect of switching to a higher quality non-assigned school on test scores is ambiguous. On the one hand, the ability to attend a better quality school could increase student performance. On the other hand, the disruptions associated with changing schools could decrease student performance. See Hanushek, Kain and Rivkin (2004) and Schwerdt and West (2013) for evidence on the impact of switching schools on student performance.
Table 1A Falsification Tests based on Years Prior to Failing Schools

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
Any Failing         -0.039       -0.086        0.011              0.031
                    (0.034)      (0.072)      (0.017)            (0.022)
NBHD T2 * Fail      -0.107**      0.115        0.000             -0.037
                    (0.046)      (0.091)      (0.027)            (0.031)
NBHD T3 * Fail       0.057       -0.181**      0.033             -0.032
                    (0.050)      (0.073)      (0.052)            (0.061)
Observations        152,994      46,805       334,331            316,772
Notes: Table presents a series of falsification tests for the core results presented in BBR. All specifications are based on falsification tests where we drop all observations after a school fails to meet AYP for two consecutive years (i.e. real failure) and then construct a placebo failing status indicator variable that takes the value of unity for school-years that are two years prior to the first year a school actually fails. For columns 1, 3 and 4 we extend our sample back to 2003 in order to obtain falsification years for the large number of schools that fail in 2005. For column 2, our data begins in 2004 and thus we cannot include earlier years. In order to include more prior years for the large number of schools that fail in 2005, we use just one year prior to the first year a school actually fails in column 2. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 2A Falsification Tests based on Non-Title 1 Failing Schools

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
Any Failing         -0.053       -0.128        0.022              0.051
                    (0.205)      (0.111)      (0.037)            (0.050)
NBHD T2 * Fail       0.076        0.064       -0.060             -0.029
                    (0.201)      (0.116)      (0.051)            (0.058)
NBHD T3 * Fail       0.072        0.184       -0.049             -0.046
                    (0.205)      (0.116)      (0.045)            (0.053)
Observations        134,283      47,112       250,155            249,773

Notes: Table presents a series of falsification tests for the core results presented in BBR. For each model given in the column headings, we drop all observations with any failing schools and then assign schools as pseudo failing in a year if the school missed AYP in the same subject for the previous 2 years and is not a Title 1 school. The row headings indicate pseudo fails and we limit pseudo fails to our three terciles of failing neighborhoods. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 3A Falsification Tests based on Single Year of Not Meeting AYP

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
Any Failing          0.029        0.032        0.043*             0.032*
                    (0.038)      (0.027)      (0.024)            (0.019)
NBHD T2 * Fail       0.017        0.005       -0.050*            -0.019
                    (0.057)      (0.036)      (0.026)            (0.022)
NBHD T3 * Fail      -0.034        0.019       -0.033             -0.023
                    (0.049)      (0.034)      (0.026)            (0.021)
Observations        134,283      47,112       250,155            249,773

Notes: Table presents a series of falsification tests for the core results presented in BBR. For each model given in the column headings, we drop all observations with any failing schools and then assign schools as pseudo failing in a year if the school missed AYP in the prior year, but made AYP 2 years prior. The row headings indicate pseudo fails and we limit pseudo fails to our three terciles of failing neighborhoods. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 4A Impact of Failing Designation on Housing Market, Mobility and Attendance at Non-Assigned School by Years since Initially a Failing School

                                    (1)          (2)          (3)                (4)
                                    Log Price    Log Income   Attend Non-        Moved Into Home
                                                              Assigned School    (since last year)
Any Failing                         -0.047       -0.049        0.043**            0.022
                                    (0.077)      (0.047)      (0.019)            (0.020)
NBHD T2 * Fail                       0.050        0.045       -0.030             -0.065*
                                    (0.088)      (0.070)      (0.038)            (0.034)
NBHD T3 * Fail                       0.280***     0.247***     0.232**            0.189**
                                    (0.100)      (0.059)      (0.091)            (0.088)
Any Failing * Years Since Fail      -0.004        0.011        0.020             -0.004
                                    (0.016)      (0.017)      (0.016)            (0.005)
NBHD T2 * Fail * Years Since Fail    0.099**     -0.033        0.026              0.019
                                    (0.040)      (0.027)      (0.030)            (0.014)
NBHD T3 * Fail * Years Since Fail   -0.072***    -0.023        0.040              0.013
                                    (0.027)      (0.021)      (0.028)            (0.049)
Observations                        157,955      52,666       306,651            306,142

Notes: All boundaries based on 2002-03 school year. Observations include 2004-2011 school years. All terciles of CBGs based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. Years since fail is a count variable for the number of years since a school first obtained failing status. All models include CBG by year fixed effects as well as fixed effects for each unique combination of assigned elementary, middle and high school in 2002-03, quarter by year fixed effects, and lagged average school test scores for assigned elementary, middle and high schools. The price model in column 1 also includes 47 indicators for unique structural attributes and measures of proximity to downtown Charlotte and the Interstate. Column 2 indicates our main mortgage income model. Columns 3 and 4 focus on current resident models with individual fixed effects. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10.
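The years-since-fail count variable used in Table 4A can be constructed as in this minimal sketch; the school names and failure years are hypothetical.

```python
# Hypothetical first year in which each school obtained failing status.
first_fail_year = {"Alpha Elem": 2005, "Gamma Middle": 2008}

def years_since_fail(school, year):
    """Count of years since a school first obtained failing status;
    zero before failure and for never-failing schools."""
    start = first_fail_year.get(school)
    if start is None or year < start:
        return 0
    return year - start
```

Interacting this count with the failure and tercile indicators is what lets the specification in Table 4A compare years in close temporal proximity to the failure.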
Table 5A Impact of Failing Designation on Housing Market Outcomes (Alternative Tercile Specification)

                    (1)          (2)          (3)          (4)
                    Log Price    Log Income   Log Income   Log Income
NBHD T1 * Fail      -0.054       -0.040       -0.047       -0.013
                    (0.055)      (0.045)      (0.069)      (0.033)
NBHD T2 * Fail       0.116*      -0.019       -0.041       -0.031
                    (0.064)      (0.048)      (0.066)      (0.047)
NBHD T3 * Fail       0.084*       0.202***     0.131**      0.185***
                    (0.049)      (0.052)      (0.060)      (0.052)
Observations        157,955      52,666       37,472       47,032

Notes: All boundaries based on 2002-03 school year. Observations include 2004-2011. All terciles of CBGs based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. The table presents results from standard difference-in-differences models, based on our preferred specification given by equation (1) in BBR, where we compare changes before and after failure to overall changes over time in non-failing locations. All specifications include CBG by year fixed effects as well as fixed effects for each unique combination of assigned elementary, middle and high school in 2002-03, quarter by year fixed effects, and lagged average school test scores for assigned elementary, middle and high schools. The price model in column 1 also includes 47 indicators for unique structural attributes and measures of proximity to downtown Charlotte and the Interstate. Column 2 indicates our main mortgage income model; column 3 removes parcels that may not be owner-occupied based on parcel records (mailing vs. physical address for ownership records); column 4 removes observations where mortgage income exceeds the amount of the mortgage. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10.
Table 6A Impact of Failing Designation on Attendance at Non-Assigned School Based on Current Residence 02-03 (Alternative Tercile Specification)

                    (1)              (2)                  (3)             (4)
                    Attend Non-      Attend Non-Assigned  Attend Magnet   Moved Into Home
                    Assigned School  Non-Magnet School    School          (since last year)
NBHD T1 * Fail       0.073***        0.054**              0.019*          0.015
                    (0.023)         (0.024)              (0.010)         (0.017)
NBHD T2 * Fail       0.068**         0.052*               0.016          -0.029**
                    (0.029)         (0.029)              (0.012)         (0.014)
NBHD T3 * Fail       0.377***        0.273***             0.104***        0.169***
                    (0.065)         (0.056)              (0.022)         (0.047)
Observations        306,651         306,651              306,651         306,142

Notes: All boundaries based on 2002-03 school year. Observations include 2004-2011 school years. Failing based on assigned school for a given year. All terciles of CBGs based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. All regressions include CBG by year fixed effects and assigned school fixed effects as well as lagged average school test scores for assigned elementary, middle and high schools. We only include students in grade 8 or lower since we have no failing high schools in our dataset. Column headings indicate dependent variables, which are dummies for attending non-assigned schools, non-assigned non-magnet schools, magnet schools and moved into home in the past year. All specifications include student fixed effects. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10.
Table 7A Falsification Tests based on Random Attendance Boundary Shifts (Alternative Tercile Specification)

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
NBHD T1 * Fail       0.0407       0.0181      -0.0053             0.0005
                    (0.0540)     (0.0364)    (0.0309)            (0.0130)
NBHD T2 * Fail       0.0059      -0.0083      -0.0085            -0.0103
                    (0.0279)     (0.0380)    (0.0133)            (0.0153)
NBHD T3 * Fail       0.0341      -0.0561       0.0032             0.0129
                    (0.0460)     (0.123)     (0.0487)            (0.0463)

Notes: This table presents a series of falsification tests for the core results in BBR using the alternative tercile specification. For each column, we estimate 100 regressions based on our original models and samples, but randomly shift our school attendance boundaries and treat homes and students as being assigned schools based on those new (pseudo) boundaries. We randomly shift attendance boundaries by between one and two times the average diameter of a CBG (3,590 feet) with a failing school in every direction. The random shifts are based on shifting the entire school district map of school attendance boundaries, and both the direction and the distance of the shift are randomly determined. Results are robust to different distances of boundary shifts (beyond 2 miles, we start to lose a number of CBGs due to boundaries falling outside the school district) as well as different directions. Cells indicate mean coefficients and standard deviations of those 100 regressions for the models based on column headings. Observations for each regression are smaller than in the main models and vary with each boundary shift due to the loss of some parcels when boundaries shift outside the school district boundaries. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes and columns 3 and 4 focus on current resident models with individual fixed effects.
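A highly simplified sketch of the boundary-shift procedure follows, with attendance zones reduced to axis-aligned rectangles; all geometry, names and the unit scale are made up, and real attendance zones are of course irregular polygons.

```python
import random

# Hypothetical attendance zones as rectangles: (xmin, ymin, xmax, ymax).
zones = {"school_A": (0, 0, 2, 2), "school_B": (2, 0, 4, 2)}
homes = {"home_1": (0.5, 1.0), "home_2": (2.5, 1.0)}

def assign(zones, point):
    """Return the zone containing the point, if any."""
    x, y = point
    for school, (x0, y0, x1, y1) in zones.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return school
    return None  # falls outside the (shifted) map and is dropped

def shifted_zones(zones, rng, lo=1.0, hi=2.0):
    """Shift the entire zone map by a random vector whose components are
    between lo and hi in magnitude, in a random direction."""
    dx = rng.uniform(lo, hi) * rng.choice([-1, 1])
    dy = rng.uniform(lo, hi) * rng.choice([-1, 1])
    return {s: (x0 + dx, y0 + dy, x1 + dx, y1 + dy)
            for s, (x0, y0, x1, y1) in zones.items()}

rng = random.Random(42)
pseudo = shifted_zones(zones, rng)
reassigned = {h: assign(pseudo, p) for h, p in homes.items()}
```

In the actual test this shift is applied to the full district map, the models are re-estimated on the pseudo assignments, and the exercise is repeated 100 times.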
Table 8A Falsification Tests based on Years Prior to Failing Schools (Alternative Tercile Specification)

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
NBHD T1 * Fail      -0.039       -0.086        0.011              0.031
                    (0.034)      (0.072)      (0.017)            (0.022)
NBHD T2 * Fail      -0.146***     0.028        0.011             -0.006
                    (0.039)      (0.061)      (0.021)            (0.022)
NBHD T3 * Fail       0.019       -0.267***     0.043             -0.001
                    (0.038)      (0.009)      (0.050)            (0.058)
Observations        152,994      46,805       334,331            316,772

Notes: Table presents falsification tests for core models presented in BBR using the alternative tercile specification. All specifications are based on falsification tests where we drop all observations after a school fails to meet AYP for two consecutive years (i.e. real failure) and then construct a placebo failing status indicator variable that takes the value of unity for school-years that are two years prior to the first year a school actually fails. For columns 1, 3 and 4 we extend our sample back to 2003 in order to obtain falsification years for the large number of schools that fail in 2005. For column 2, our data begins in 2004 and thus we cannot include earlier years. In order to include more prior years for the large number of schools that fail in 2005, we use just one year prior to the first year a school actually fails in column 2. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 9A Falsification Tests based on Non-Title 1 Failing Schools (Alternative Tercile Specification)

                    (1)          (2)          (3)                (4)
                    Log Price    Log Income   Attend Non-        Moved Into Home
                                              Assigned School    (since last year)
NBHD T1 * Fail      -0.053       -0.128        0.022              0.051
                    (0.205)      (0.111)      (0.037)            (0.050)
NBHD T2 * Fail       0.023       -0.064*      -0.037              0.021
                    (0.030)      (0.039)      (0.044)            (0.031)
NBHD T3 * Fail       0.019        0.055       -0.027              0.005
                    (0.019)      (0.036)      (0.030)            (0.020)
Observations        134,283      47,112       250,155            249,773

Notes: Table presents falsification tests for core models presented in BBR using the alternative tercile specification. For each model given in the column headings, we drop all observations with any failing schools and then assign schools as pseudo failing in a year if the school missed AYP in the same subject for the previous 2 years and is not a Title 1 school. The row headings indicate pseudo fails and we limit pseudo fails to our three terciles of failing neighborhoods. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 10A Falsification Tests based on Single Year of Not Meeting AYP (Alternative Tercile Specification)
                     (1)          (2)           (3)                  (4)
                     Log Price    Log Income    Attend Non-          Moved Into Home
                                                Assigned School      (since last year)
NBHD T1 * Fail       0.029        0.032          0.043*               0.032*
                    (0.038)      (0.027)        (0.024)              (0.019)
NBHD T2 * Fail       0.045        0.037         -0.007                0.012
                    (0.042)      (0.027)        (0.010)              (0.010)
NBHD T3 * Fail      -0.005        0.051**        0.010                0.009
                    (0.031)      (0.020)        (0.011)              (0.011)
Observations         134,283      47,112         250,155              249,773
Notes: Table presents a series of falsification tests for the core results presented in BBR using the alternative tercile specification. For each model given in the column headings, we drop all observations with any failing schools and then assign a school as pseudo failing in a year if the school missed AYP in the prior year but made AYP 2 years prior. The row headings indicate pseudo fails, and we limit pseudo fails to our three terciles of failing neighborhoods. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Columns 1 and 2 are based on the first two models presented in Table 3 of BBR for housing market outcomes, while columns 3 and 4 focus on current resident models with individual fixed effects.
Table 11A Impact of Failing Designation on Test Scores
                     (1)               (2)               (3)                (4)
                     Orig Residence    Orig Residence    Current Residence  Current Residence
                     Read Test Score   Math Test Score   Read Test Score    Math Test Score

Without student fixed effects:
Any Failing          0.034             0.028            -0.015             -0.043*
                    (0.023)           (0.023)           (0.018)            (0.024)
NBHD T2 * Fail      -0.038            -0.056             0.019              0.044
                    (0.033)           (0.035)           (0.026)            (0.032)
NBHD T3 * Fail       0.020             0.003            -0.000             -0.068
                    (0.073)           (0.065)           (0.088)            (0.095)

With student fixed effects:
Any Failing         -0.001            -0.001            -0.016             -0.037
                    (0.025)           (0.030)           (0.021)            (0.025)
NBHD T2 * Fail       0.004            -0.016             0.012              0.038
                    (0.030)           (0.037)           (0.029)            (0.032)
NBHD T3 * Fail       0.099             0.021             0.001             -0.049
                    (0.150)           (0.077)           (0.091)            (0.096)

Observations         239,180           239,959           241,695            242,478
Notes: All boundaries based on the 2002-03 school year. Observations include the 2004-2011 school years. All terciles of CBGs are based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. All regressions include CBG-by-year fixed effects and assigned school fixed effects, as well as lagged average school test scores for assigned elementary, middle and high schools. We include only students in grade 8 or lower since we have no failing high schools in our dataset. Dependent variables are test scores normalized to mean zero and standard deviation of one relative to state average test scores in a given year and grade. The specification in the top panel includes controls for race, gender and grade; the specification in the bottom panel replaces student characteristics with student fixed effects. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10.
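The test-score normalization described in the notes (mean zero, standard deviation one within each year-by-grade cell) can be sketched as follows; the column names are illustrative assumptions.

```python
import pandas as pd

# Illustrative: standardize raw scores within each year-by-grade cell,
# mimicking normalization relative to the statewide distribution.
scores = pd.DataFrame({
    "year":  [2004, 2004, 2004, 2004],
    "grade": [3, 3, 3, 3],
    "raw":   [340.0, 350.0, 360.0, 370.0],
})

grp = scores.groupby(["year", "grade"])["raw"]
scores["z"] = (scores["raw"] - grp.transform("mean")) / grp.transform("std")
```

After this transformation each year-grade cell has mean zero and unit standard deviation, so coefficients on the failing-designation indicators are read in standard-deviation units.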
Appendix C Generalizability and Implications

As noted in Appendix A, the number of failures in the highest price tercile is relatively low, and so our parameter of interest, which is the difference between the effect of failure in the highest and lowest price terciles, is based on a relatively small number of high-priced block groups that bisect an attendance zone of a school with an AYP failure. Although this has no impact on the validity of our analyses, it may affect generalizability. Consequently, in this section we present results from several further analyses designed to speak to the generalizability of our core results.

We begin by presenting results for our key housing transaction outcomes, similar to those reported in Table 3 of BBR, based on a simple difference-in-differences analysis that compares changes before and after an AYP failure to overall changes over time in non-failing locations. The results are reported in Table 12A. These results were obtained by simply dropping the block group-by-year fixed effects in equation (1) of BBR and are presented in order to illustrate the basic empirical patterns for housing prices and homebuyer income that arise in our data. Note that the results reported in Table 12A utilize all neighborhoods with failing schools to identify the effect of AYP failure on our outcomes of interest, while the results reported in Table 3 of BBR only utilize neighborhoods bisected by a new attendance zone boundary.[4] As in Table 3 of BBR, column 1 of Table 12A presents results where the dependent variable is the natural log of the sale price of residential homes, while columns 2, 3 and 4 present results where the dependent variable is the natural log of homebuyer income.[5] The results reported in Table 12A support the findings based on our preferred specification that includes block group-by-year fixed effects. Specifically, similar to the results shown in Table 3 of BBR, in column 1 of Table 12A the estimated coefficient on the interaction term between the indicator for failure to meet AYP and the highest neighborhood quality tercile indicator is positive and statistically significant. Thus, similar to our core results, we find that the highest quality neighborhoods within the attendance zones of schools that fail to meet AYP experience an increase in housing values post AYP failure. As shown in columns 2-4 of Table 12A, we also find that for the highest quality neighborhoods, homebuyer income increases with AYP failure. Again, these results are consistent with the results from our preferred specifications shown in Table 3 of BBR.

To further examine the generalizability of our results, in Table 13A we show how average student attributes differ depending on whether or not a block group is located within the attendance zone of a school that is ever classified as a Title 1 Choice School. Specifically, Table 13A shows average student attributes by block group neighborhood quality tercile and by whether or not the block group contains a school that ever failed to meet AYP standards. It is important to note that Table 13A is not a balancing test; rather, it is designed to provide context on the generalizability of our results. As expected, there are rather large differences in the characteristics of block groups that do and do not contain a school that ever failed to meet AYP standards. More significantly for our purposes, the differences are exaggerated for the highest quality neighborhoods (Tercile 3), especially for racial composition.[6] As columns 6 and 7 reveal, block groups that contain a failing school are 50 percent black on average, compared to the other top tercile neighborhoods where the percent black is only 16 percent. On the other hand, the block groups that make up the Title 1 Choice block groups in column 7 are quite heterogeneous. For example, while the average share black for the entire sample is over 0.43, for the block groups in column 7 the share black is greater than 0.30 in just 8 of the 19 block groups. The other 11 block groups have more modest black shares and are much more comparable to the top tercile block groups that do not contain a Title 1 Choice attendance zone.

To further examine racial differences between block groups that do and do not contain a school that ever failed to meet AYP standards, we used equation (1) in BBR to examine the racial composition of who was moving into the AYP failing attendance zones in these block groups. Specifically, in addition to providing information on a homebuyer's income, the HMDA data also provide information on a homebuyer's race and ethnicity. We therefore used our matched sample of mortgage deeds data and HMDA data to create indicators for whether a homebuyer was Black, Hispanic or Non-White and used these indicators as the dependent variables in equation (1). Results are reported in Table 14A and reveal no relationship between race or ethnicity and AYP failure, either overall or for the highest neighborhood quality tercile.

[4] The simple difference-in-differences estimates reported in Table 12A help speak to the generalizability of our results since the number of neighborhoods with failing schools is almost twice that of neighborhoods bisected by a failing school attendance boundary. However, we focus primarily on the results reported in Table 3 of BBR, which include block group-by-year fixed effects, since we are concerned that block groups with failing schools may experience different price and income trends than block groups with non-failing schools.

[5] Column 2 is for the sample of owner-occupied transactions from the HMDA data, column 3 restricts the sample by eliminating observations where mailing and physical addresses differ, and column 4 restricts the sample to mortgages with stated incomes that are no larger than the mortgage amount.

[6] These differences are not particularly surprising given that NCLB sanctions are only imposed on schools that both fail to meet AYP standards for two consecutive years and are Title 1 schools, where 75% or more of the students are eligible for federal lunch subsidies.
Table 12A Impact of Failing Designation on Housing Market Outcomes: Difference-in-Differences Model

                     (1)          (2)           (3)           (4)
                     Log Price    Log Income    Log Income    Log Income
Any Failing         -0.009       -0.010        -0.009        -0.007
                    (0.028)      (0.021)       (0.027)       (0.021)
NBHD T2 * Fail       0.027        0.009         0.025         0.004
                    (0.036)      (0.030)       (0.037)       (0.032)
NBHD T3 * Fail       0.103**      0.124***      0.116**       0.128***
                    (0.048)      (0.041)       (0.051)       (0.045)
Observations         157,955      52,666        37,472        47,032
Notes: Table presents results from standard difference-in-differences models in which we compare changes before and after failure to overall changes over time in non-failing locations. All boundaries based on the 2002-03 school year. Observations include 2004-2011. All terciles of CBGs are based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. All models include CBG-by-school-attendance-boundary fixed effects, quarter-by-year fixed effects, and lagged average school test scores for assigned elementary, middle and high schools. The price model in column 1 also includes 47 indicators for unique structural attributes and measures of proximity to downtown Charlotte and the Interstate. Column 2 presents our main mortgage income model; column 3 removes parcels that may not be owner-occupied based on parcel records (mailing vs. physical address in ownership records); column 4 removes observations where stated income exceeds the mortgage amount. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10.
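The interaction structure of the simple difference-in-differences specification in Table 12A (failure status interacted with neighborhood-quality tercile indicators, standard errors clustered by CBG) can be sketched on synthetic data. This is a stylized illustration only: the variable names are assumptions, and the full set of boundary, quarter-by-year, and test-score controls in the notes is omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic transactions: a positive log-price effect of AYP failure is
# built in for tercile-3 neighborhoods only.
rng = np.random.default_rng(0)
n = 600
sales = pd.DataFrame({
    "tercile": rng.integers(1, 4, n),   # neighborhood price tercile (1-3)
    "fail":    rng.integers(0, 2, n),   # AYP-failure indicator
    "cbg":     rng.integers(0, 30, n),  # census block group id (cluster)
})
sales["log_price"] = (
    12.0
    + 0.10 * sales["fail"] * (sales["tercile"] == 3)
    + 0.01 * rng.standard_normal(n)
)

# Failure interacted with tercile indicators; cluster by CBG.
fit = smf.ols("log_price ~ fail * C(tercile)", data=sales).fit(
    cov_type="cluster", cov_kwds={"groups": sales["cbg"]}
)
```

The coefficient on fail:C(tercile)[T.3] recovers the built-in top-tercile effect, mirroring how the NBHD T3 * Fail row in Table 12A is read: the differential price response to failure in the highest tercile relative to the lowest.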
Table 13A Means by Failing Designation
                                       (1)        (2)       (3)        (4)       (5)        (6)       (7)
                                       All        --Tercile 1--        --Tercile 2--        --Tercile 3--
Average Sales Price (1998-2002)
  range (defines terciles)                        $31,466-$93,877      $93,930-$149,790     $150,235-$389,217
CBG ever contains failing school       0.344      0         1          0         1          0         1
Student not attending Assigned
  Neighborhood School                  0.350      0.401     0.456      0.330     0.411      0.252     0.450
Student attends Magnet School          0.0662     0.0542    0.0746     0.0651    0.0731     0.0584    0.152
Student attends Non-assigned
  Non-Magnet School                    0.284      0.347     0.381      0.265     0.338      0.193     0.297
Live on SF Parcel                      0.733      0.645     0.652      0.767     0.570      0.856     0.495
Male                                   0.514      0.517     0.515      0.517     0.513      0.510     0.516
Black                                  0.433      0.648     0.754      0.410     0.486      0.157     0.499
Hispanic                               0.0958     0.133     0.101      0.0964    0.196      0.0473    0.0822
Reading test score                     0.00439    -0.167    -0.255     0.00312   -0.110     0.270     0.0119
Math test score                        0.00424    -0.189    -0.268     -0.00163  -0.112     0.287     -0.0383
School Grade                           3.709      3.669     3.725      3.678     3.575      3.775     3.525
Avg Elementary School Test Scores      -0.0433    -0.218    -0.469     -0.00685  -0.322     0.379     -0.447
Avg Middle School Test Scores          -0.0502    -0.169    -0.403     0.00547   -0.368     0.296     -0.382
Avg High School Test Scores            -0.418     -0.536    -0.584     -0.396    -0.531     -0.252    -0.478
Observations                           88,984     6,332     19,709     23,795    9,768      26,390    669
Notes: All variables based on the 2002-03 school year student population and individual covariates (except Avg Elementary/Middle/HS School Test Scores). Avg Elementary/Middle/HS School Test Scores are based on average reading and math test scores for all students assigned to that school. Each cell provides a mean for each tercile separately for students located in block groups that never contain a failing school (CBG ever contains failing school = 0) and students located in block groups that do contain a failing school (CBG ever contains failing school = 1). A few observations (1,258) are assigned to CBGs that did not have any property sales with which to characterize terciles but are included in the "All" column.
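The tercile construction used throughout these tables, averaging each CBG's sale prices over the 1998-2002 pre-NCLB window and splitting CBGs into price terciles, can be sketched as follows; the column names (cbg, price) are illustrative assumptions.

```python
import pandas as pd

# Illustrative pre-NCLB (1998-2002) transactions by census block group.
sales = pd.DataFrame({
    "cbg":   [1, 1, 2, 2, 3, 3, 4, 5, 6],
    "price": [60_000, 70_000, 100_000, 110_000, 200_000,
              220_000, 90_000, 140_000, 300_000],
})

# Average sale price per CBG, then split CBGs into price terciles.
avg_price = sales.groupby("cbg")["price"].mean()
tercile = pd.qcut(avg_price, 3, labels=[1, 2, 3])
```

Each CBG receives a single tercile label based on its pre-period average price, so later failure-year outcomes are always compared within fixed, pre-determined neighborhood-quality groups.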
Table 14A Impact of Failing Designation on Racial Composition of Homebuyers
                     (1)         (2)         (3)
                     Black       Hispanic    Non-White
Any Failing          0.011      -0.079      -0.054
                    (0.039)     (0.059)     (0.057)
NBHD T2 * Fail       0.017       0.079       0.074
                    (0.051)     (0.069)     (0.065)
NBHD T3 * Fail       0.055       0.017       0.062
                    (0.061)     (0.066)     (0.106)
Observations         50,039      50,039      50,039
Notes: All boundaries based on the 2002-03 school year. Observations include the 2004-2011 school years. All terciles of CBGs are based on average CBG housing prices for all transacted sales between 1998 and 2002. Terciles for CBG prices are restricted to CBGs where prices fall within the range of any CBG that contains a failing neighborhood. All models include CBG-by-year fixed effects, fixed effects for each unique combination of assigned elementary, middle and high school in 2002-03, quarter-by-year fixed effects, and lagged average school test scores for assigned elementary, middle and high schools. Standard errors clustered by CBG. *** p<0.01, ** p<0.05, * p<0.10. Dependent variables are indicators for the homebuyer's race as reported in the HMDA data, based on our sample of mortgages from column 2 of Table 3 of BBR. We have slightly fewer observations than in our main income results due to either missing or multiple racial categories for a mortgage applicant.