

The EDifier

February 7, 2017

School Improvement Grants: Why didn’t $7 billion change results for students?

Mathematica recently released a study of the federal School Improvement Grants (SIG) program. Its findings? Schools receiving the extra funds showed no significant improvement over similar schools that did not participate. With a price tag of $7 billion (yes, with a “b”), this strikes many as a waste of taxpayer dollars. Interestingly, the study also found no evidence that SIG schools actually had significantly higher per-pupil expenditures than similar schools that didn’t receive the grants, which may have contributed to the disappointing results.

SIG, administered by the states, awarded up to $2 million annually to 1,400 schools. The program began in the 2010-11 school year and continues through the end of the 2016-17 year. Starting in 2017-18, the new Every Student Succeeds Act (ESSA) will allow states to use up to seven percent of their Title I allotments to improve the bottom five percent of schools. States may choose to dole out funds via formula or competitive grants, but districts are the ones responsible for using evidence-based practices to improve schools.

Under the old SIG rules, the federal government required schools to choose one of four turnaround models: transformation, turnaround, restart, or school closure.

The new report analyzed the transformation, turnaround, and restart models and found no statistically significant effects for any of them. The authors did find positive (but not statistically significant) effects on math and reading scores for schools receiving the grant, along with lower high school graduation rates. Critics of the report have noted that the statistical model chosen was not sensitive enough to detect small effects. The authors found mixed effects from year to year of a size that many studies would have the power to flag as significant, but given this study’s design they remain insignificant. To put the magnitude of these effects in perspective: the effect of decreasing elementary class sizes by seven students is about 0.2 standard deviations, and the effect of urban charter schools compared to their neighborhood schools after one year is 0.01 standard deviations in math and -0.01 in reading (0.15 and 0.10 after four years). According to the Mathematica study, the effects of SIG in 2012-13 were 0.01 standard deviations in math and 0.08 standard deviations in reading, with a drop in the graduation rate (note that SIG had a positive impact on the graduation rate in 2011-12; since neither estimate is statistically significant, the true effect could well be zero). Not enough to conclude a positive effect, for sure, but not nothing, either.
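To see why effects of this size are hard to distinguish from zero, here is a minimal sketch (not the study’s actual model) of an approximate power calculation for a two-group comparison; the per-group sample size below is hypothetical and chosen only for illustration.

```python
# A rough, illustrative power calculation for a two-sided, two-group mean
# comparison using a normal approximation. The sample size is hypothetical;
# Mathematica's actual analysis used a far more complex design.
from scipy.stats import norm

def approx_power(effect_size, n_per_group, alpha=0.05):
    """Approximate probability of detecting `effect_size` (in standard deviations)."""
    z_crit = norm.ppf(1 - alpha / 2)                  # two-sided critical value
    ncp = effect_size * (n_per_group / 2) ** 0.5      # noncentrality under the alternative
    return norm.cdf(ncp - z_crit) + norm.cdf(-ncp - z_crit)

# Effect sizes discussed above, with a hypothetical 200 schools per group
for d in (0.01, 0.08, 0.20):
    print(f"effect = {d:.2f} SD -> power ~ {approx_power(d, n_per_group=200):.2f}")
```

With effects in the 0.01 to 0.08 range, even a few hundred schools per group leaves the chance of detection well below conventional thresholds, which is consistent with the critics’ point about sensitivity.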

 


I’ll offer a few of my own thoughts (based on research, of course) on why SIG didn’t have the success that was hoped for:

1. The authors found no evidence that the grant funds actually increased per-pupil spending. In government-speak, the funds may have supplanted other funding streams instead of supplementing them, even though the law states that federal funds are supposed to supplement other spending. The authors found that SIG schools spent about $245 more per student than similar non-SIG schools in 2011-12, and only $100 more in 2012-13 (again, the results are not statistically significant, meaning we can’t confidently say the difference isn’t zero). Recent studies have shown that spending makes a difference in education, so this may help explain why we didn’t see a difference here.

2. Students in many priority schools (the bottom five percent of schools), which are the ones that qualified for SIG grants, may have had the option to transfer to higher-performing schools. While the report doesn’t address this, students with more involved parents and stronger academic records may have been more likely to take that option, lowering the average scores of the schools they left behind. Students perform better when surrounded by higher-performing peers, so the lack of an overall effect could have been influenced by the loss of higher-achieving students.

3. Schools receiving SIG grants were high-poverty and high-minority. The average share of students eligible for free or reduced-price lunch (FRL) in the study group was 83 percent, with non-white students making up 91 percent of the school populations (compared with an overall school population that is about 50 percent FRL-eligible and 50 percent non-white). While the resources allocated through SIG should have made spending more equitable, these schools may have still struggled to recruit and retain experienced, qualified teachers, which is often a challenge for high-poverty, high-minority schools. Research is clear that integrated schools have better outcomes for students than segregated schools. Yet the reform strategies used under SIG (replacing school staff and/or converting to a charter school) did little to improve school integration.

Hopefully, states and districts will learn from these lessons and use school reforms that fundamentally change the practices of the school, not just a few personnel: increased funding, school integration, changes in instructional practices, meaningful teacher/principal mentoring and development, and/or wrap-around services for students in poverty or who have experienced trauma.






December 7, 2016

PISA scores remain stagnant for U.S. students

The results of the latest PISA, the Program for International Student Assessment, are in, and as usual we have an interpretation of the highlights for you.

If you recall, PISA is designed to assess not just students’ academic knowledge but their ability to apply that knowledge. It is given to 15-year-olds across the globe every three years by the Paris-based Organisation for Economic Cooperation and Development (OECD); in the U.S., it is administered by the Department of Education’s National Center for Education Statistics (NCES) in coordination with the OECD. Each iteration of PISA has a different focus, and the 2015 version homed in on science, though it also tested math and reading proficiency among the roughly half-million teens who participated in this round. So, how did American students stack up?

In short, our performance was average in reading and science and below average in math compared to the 35 other OECD member countries. Specifically, the U.S. ranked 19th in science, 20th in reading, and 31st in math. But PISA was also administered in countries beyond the OECD, and among that total group of 70 countries and education systems (some regions of China are assessed as separate systems), U.S. teens ranked 25th in science, 22nd in reading, and 40th in math. Compared with 2012, scores were basically the same in science and reading but dropped 11 points in math.

[Chart: PISA science scores]

Before you get too upset over our less-than-stellar performance, though, there are a few things to take into account. First, scores overall have fluctuated in all three subjects. Some of the top performers, such as South Korea and Finland, have seen 20- to 30-point drops in math scores from 2003 to 2015, over the same period that the U.S. saw a 13-point drop. Are half of the countries really declining in performance, or could it be a change in the test, or a change in how the test corresponds with what and how material is taught in schools?

Second, the U.S. has seen a large set of reforms over the last several years, which have disrupted the education system. As with any system, a disruption may cause a temporary drop in performance before things eventually stabilize. Many teachers are still adjusting to teaching the Common Core Standards and/or Next Generation Science Standards; the 2008 recession caused shocks in funding levels that we’re still recovering from; and many school systems received waivers from No Child Left Behind that substantially changed state- and school-level policies. And, in case you want to blame Common Core for lower math scores, keep in mind that not all test-takers live in states that have adopted the Common Core, and even those who do may have learned under the new standards for only a year or two. Andreas Schleicher, who oversees the PISA test for the OECD, predicts that the Common Core Standards will eventually yield positive results for the U.S., but that we must be patient.

Demographics

Student scores are correlated to some degree with student poverty and the concentration of poverty in some schools. Students from disadvantaged backgrounds are 2.5 times more likely to perform poorly than advantaged students. U.S. schools where fewer than 25 percent of students are eligible for free or reduced-price lunch (about half of all students nationwide are eligible) would rank 2nd in science, 1st in reading, and 11th in math out of all 70 countries. At the other end of the spectrum, schools where at least 75 percent of students are eligible for free or reduced-price lunch would rank 44th in science, 42nd in reading, and 47th in math. Compared only to OECD countries, these high-poverty schools would beat only four countries in science, four in reading, and five in math.

Score differences for different races in the U.S. show similar disparities.

How individual student groups would rank compared to the 70 education systems tested:

Group        Science   Reading   Math
White        5th       4th       20th
Black        49th      44th      51st
Hispanic     40th      37th      44th
Asian        8th       2nd       20th
Mixed Race   19th      20th      38th

 

Equity

Despite the disparities in opportunity for low-income students, the share of low-income students who performed better than expected has increased by 12 percentage points since 2006, to 32 percent. The share of score variation attributable to poverty decreased from 17 percent in 2006 to 11 percent in 2015, meaning that poverty has become less of a determining factor in how a student performs.

Funding

America is one of the largest spenders on education, as we should be, given our high per capita income. Many have argued that we should be outscoring other nations based on our higher spending levels, but the reality is that high levels of childhood poverty and inequitable spending often counteract the money put into the system. For more on this, see our previous blog post.






January 22, 2016

CPE examines educational equity in new paper

It’s been over 60 years since the U.S. Supreme Court declared education “a right which must be made available to all on equal terms.” In ruling that separate was in fact not equal, Brown v. Board of Education forced federal, state and local governments to open public schools to all children in the community.

Yet integrating school buildings would prove to be just the first step in an ongoing journey toward educational equity in the nation. There remained – and still remain – structural and social barriers to making a world-class public education “available to all on equal terms.” In addition, our ideas about equity have evolved to encompass more than a guarantee that school doors will be open to every child.

CPE explores these issues and more in our latest paper, Educational Equity: What does it mean, how do we know when we reach it? Our hope is to provide a common vocabulary for school boards to help them start conversations in their communities and thereby bring the nation closer to fulfilling its promise of equal opportunity for all.

Filed under: Achievement Gaps, CPE, Demographics, equity, funding — NDillon @ 7:00 am





September 16, 2015

Budgets, data and honest conversation

Balancing school budgets in a time of shortfalls is a thankless job. Whatever gets cut will nonetheless have its champions, many of whom are willing to make their unhappiness known. Loudly. But one of the nation’s largest school districts is meeting this challenge with a new app that gives the community a channel for telling school leaders exactly what expenditures they want preserved. The hitch – users keep their preferred items only by eliminating others. In this way, the app delivers an object lesson in how tough these decisions really are.

Fairfax County school district in Virginia serves nearly 190,000 students with an annual budget of $2.6 billion. Despite the community’s affluence, enrollments are rising faster than revenues, and the district is facing a $50-100 million deficit. An earlier citizen task force was charged with recommending ways to close this gap. After reviewing the data, the task force suggested, among other things, eliminating high school sports and band. To say the proposal was not well received is to state the obvious. And the public howls and teeth-gnashing have yet to subside.

So what’s a broke district to do? Give the data to the community. Fairfax released this web-based budget tool to the public this week as a means to call the question: In order to keep [your priority here], what do we get rid of? Users are able to choose from more than 80 budget items to cut in seven categories: “school staffing and schedules,” “instructional programs,” “nonacademic programs,” “instructional support,” “other support,” “employee compensation” and “new or increased fees.”  Each item has a dollar figure attached and the goal is to reduce the budget by $50 million.

I happen to be a Fairfax resident so I was happy to test-drive this web tool. The first thing that struck me was the near absence of low-hanging fruit. All of the big ticket items hurt, mostly because the savings come from reduction in staff or valuable instruction time. Increase elementary class size by one student: $12.9 million. Reduce daily course offerings in high school from seven to six: $25 million. Reduce kindergarten from full-day to half-day: $39 million. Yikes! Given these choices, I could see why eliminating high school sports at nearly $9 million could start to look like a lesser evil.

On the other hand, items that seemed to do the least damage to the educational mission also saved a relative pittance. Raise student parking fees by $50: $300,000.  Reduce district cable TV offerings: $100,000. Increase community use fees: $70,000. Clearly, the nickel-and-dime strategy was not going to get me close to $50 million.
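For a sense of the arithmetic the tool forces on users, here is a minimal sketch of its core logic using roughly the items and dollar figures quoted above; the real Fairfax tool offers more than 80 items across seven categories, and the selection below is purely hypothetical.

```python
# A toy model of the budget tool: pick cuts, sum them, and check whether the
# total lands within 10 percent of the $50 million reduction target.
# Items and amounts are approximately the ones quoted in the post; the selection is hypothetical.
TARGET = 50_000_000
MARGIN = 0.10

cuts = {
    "Increase elementary class size by one student": 12_900_000,
    "Reduce high school daily courses from seven to six": 25_000_000,
    "Reduce kindergarten from full-day to half-day": 39_000_000,
    "Eliminate high school sports": 9_000_000,
    "Raise student parking fees by $50": 300_000,
    "Reduce district cable TV offerings": 100_000,
    "Increase community use fees": 70_000,
}

# One way to get close to the target while keeping high school sports
selected = [
    "Increase elementary class size by one student",
    "Reduce kindergarten from full-day to half-day",
    "Raise student parking fees by $50",
]

total = sum(cuts[item] for item in selected)
within_margin = abs(total - TARGET) <= MARGIN * TARGET
print(f"Total cuts: ${total:,} ({'within' if within_margin else 'outside'} 10% of the target)")
```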

In the end, I came within the 10 percent margin of hitting the target (while keeping high school sports) and I submitted my preferences. But I’ll be honest. They include some choices that I do not feel the least bit happy about. And that’s the point. In 2010, CPE published a report on the impact of the recession on school budgets across the country. The title, Cutting to the Bone, pretty much tells the story. The current Fairfax deficit represents only 2 percent of its yearly budget. But after years of cost-cutting, there’s no fat left to trim.

Clearly, if I were a school board member, I would want to know more about the impact of these programs and policies before making any final decisions. But presenting the data on their cost and what the dollars buy – as this tool does — is a really good way to educate the community about the challenge and engage them in an honest conversation about how they can best serve their students, especially when revenues run short. — Patte Barth

Filed under: Data, funding, Public education — Patte Barth @ 10:11 am





March 27, 2015

One in six chance you’ll get funding for child care

According to an issue report authored by the Office of the Assistant Secretary for Planning and Evaluation (ASPE), an agency of the Department of Health and Human Services (DHHS), federal child care subsidies were vastly underused in fiscal year 2011. The report found that of the children eligible under federal rules (14.3 million in 2011), 83 percent did not receive federal assistance. That translates into just shy of 12 million children (11.8 million) who did not receive financial support to attend child care. In terms of state assistance, the numbers and percentages are only slightly better. Of the 8.4 million children who were eligible to receive child care subsidies under state rules (which can be, and often are, more restrictive than the federal eligibility parameters), only 29 percent received them; in other words, 71 percent, or 5.96 million children, did not.
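As a quick back-of-the-envelope check on those figures (a sketch only; the exact counts come from the ASPE report):

```python
# Back-of-the-envelope check of the subsidy figures quoted above (FY 2011).
federal_eligible = 14_300_000          # children eligible under federal rules
federal_unserved_share = 0.83          # share not receiving federal assistance
print(f"Federal: about {federal_eligible * federal_unserved_share / 1e6:.1f} million unserved")
# -> about 11.9 million, in line with the "just shy of 12 million" figure above

state_eligible = 8_400_000             # children eligible under (often stricter) state rules
state_served_share = 0.29              # share actually receiving a subsidy
print(f"State: about {state_eligible * (1 - state_served_share) / 1e6:.2f} million unserved")
# -> about 5.96 million unserved (71 percent)
```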

The numbers continue to be striking. Here are some other trends reported in the ASPE brief. First, among children from families between 150 and 199 percent of the federal poverty level (for 2011), 96 percent were not served.

Another finding from the 2011 data is that the older the child, the less likely he or she was to receive a subsidy. Children ages 10 to 12 were more than four times as likely as children ages 0 to 5 to go without a child care subsidy, and 6- to 9-year-olds were half as likely as the youngest children to have received one (though still twice as likely as the 10- to 12-year-olds)!

An appendix to the report provides background information on this sample of children and their families. The table includes the number of families with parents employed 20 or more hours a month, which you can compare across age ranges. Looking at the total sample, 84 percent of all eligible families fell into the highest category of employment, yet only 1 in 5 of these working families received child care subsidies.

Although the 84 percent of eligible families who are working is not necessarily the same group as the families who did not receive any child care assistance, there is clearly a big disconnect somewhere in the system. One would suspect that the families who are working as much as possible are the ones who need child care, let alone financial assistance for it, the most. Moreover, children and families living in poverty already face enormous obstacles, and as we argue in our “Changing Demographics of the United States and their Schools” article, these children can especially benefit from programs such as preschool; participation can lead to fewer behavior problems and reduce the likelihood of school expulsion later in their academic career. This misalignment of need and services is unsettling and something we should continue to monitor. – David Ferrier





