

The EDifier

February 7, 2017

School Improvement Grants: Why didn’t $7 billion change results for students?

Mathematica recently released a study of the federal School Improvement Grants (SIG) program. Its findings? Schools receiving the extra funds showed no significant improvement over similar schools that did not participate. With a price tag of $7 billion (yes, with a "b"), this strikes many as a waste of taxpayer dollars. Interestingly, the study also found no evidence that the SIG schools actually had significantly higher per-pupil expenditures than similar schools that didn't receive the grants, which may have contributed to the mediocre results.

SIG, administered by the states, awarded up to $2 million annually per school to 1,400 schools. The program began in the 2010-11 school year and continues through the end of the 2016-17 year. Starting in 2017-18, the new Every Student Succeeds Act (ESSA) will allow states to use up to seven percent of their Title I allotments to improve the bottom five percent of schools. States may choose to dole out funds via formula or competitive grants, but districts are the ones responsible for using evidence-based practices to improve schools.

Under the old SIG rules, the federal government required schools to choose one of four turnaround models:

[Figure: SIG 1 — the four SIG turnaround models]

The new report analyzed the transformation, turnaround, and restart models, and found no statistically significant effects for any of them. The authors did find positive, though not statistically significant, effects on math and reading scores for schools receiving the grant, but lower high school graduation rates. Critics of the report have noted that the statistical model chosen was not sensitive enough to detect small effects: the authors found mixed effects each year that many studies would have had the power to detect as significant, but that remain insignificant under this design. To put the magnitude of these effects in perspective, decreasing elementary class sizes by seven students produces an effect of about 0.2 standard deviations; urban charter schools, compared with their neighborhood schools, show effects of 0.01 in math and -0.01 in reading after one year (0.15 and 0.10 after four years). According to the Mathematica study, SIG's effects in 2012-13 were 0.01 standard deviations in math and 0.08 standard deviations in reading, along with a drop in the graduation rate (note that SIG had a positive impact on the graduation rate in 2011-12, which suggests these results are not statistically significant and could be zero). Not enough to conclude a positive effect, for sure, but not nothing, either.
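To see why effects of this size can fail to reach significance, a back-of-the-envelope power calculation helps. The sketch below (in Python, using purely hypothetical sample sizes — the study's actual design was more complex than a simple two-group comparison) computes the smallest true effect, in standard deviations, that a study of a given size could reliably detect:

```python
from statistics import NormalDist

def min_detectable_effect(n_treat, n_ctrl, alpha=0.05, power=0.80):
    """Smallest true effect (in standard deviation units) that a simple
    two-group comparison of this size would detect with the given power,
    assuming outcomes are standardized to unit variance."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value, two-sided test
    z_power = z.inv_cdf(power)          # quantile for desired power
    return (z_alpha + z_power) * (1 / n_treat + 1 / n_ctrl) ** 0.5

# Hypothetical example: 200 treated schools vs. 200 comparison schools.
print(round(min_detectable_effect(200, 200), 2))  # → 0.28
```

With a few hundred units per group, effects smaller than roughly a quarter of a standard deviation are simply below the detection threshold — so measured impacts of 0.01 to 0.08 standard deviations would register as "insignificant" even if real.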


I’ll offer a couple of my own thoughts (based on research, of course) on why SIG didn’t have the success that was hoped for:

1. The authors found no evidence that the grant funds actually increased per-pupil spending. In government-speak, the funds may have supplanted other funding streams instead of supplementing them, even though the law states that federal funds are supposed to supplement other funds spent. SIG schools spent about $245 more per student than similar non-SIG schools in 2011-12, and only $100 more in 2012-13 (again, not statistically significant, meaning we can't confidently say the difference isn't zero). Recent studies have shown that spending makes a difference in education, so this may help explain why we didn't see a difference here.

2. Students in many priority schools (the bottom five percent of schools), which are the ones that qualified for SIG grants, may have had the option to transfer to higher-performing schools. While the report doesn't address this, students with more involved parents and stronger academic records may have been more likely to take up that option, lowering the average scores of the schools they left behind. Students perform better when surrounded by higher-performing peers, so the lack of an overall effect could have been influenced by the loss of higher-achieving students.

3. Schools receiving SIG grants were high-poverty and high-minority. The average rate of students eligible for free or reduced-price lunch (FRL) in the study group was 83 percent, with non-white students making up 91 percent of the school populations (compared with about 50 percent FRL-eligible and 50 percent non-white in the overall school population). While the resources allocated through SIG to these schools should have made spending more equitable, the schools may have still struggled to recruit and retain experienced, qualified teachers, which is often a challenge for high-poverty, high-minority schools. Research is clear that integrated schools have better outcomes for students than segregated schools. Yet the reform strategies used under SIG (replacing school staff and/or converting to a charter school) did little to improve school integration.

Hopefully, states and districts will learn from these lessons and use school reforms that fundamentally change the practices of the school, not just a few personnel: increased funding, school integration, changes in instructional practices, meaningful teacher/principal mentoring and development, and/or wrap-around services for students in poverty or who have experienced trauma.





