One surefire way for education policy groups to get press is to release a state report card. Any kind of ranking is clickbait for news outlets. Plus, with a state-of-education report card you get a bonus man-bites-dog story when the grade-giving institution is the one being graded. Consequently, organizations of every stripe, from teachers’ unions to business-backed think tanks, have gotten into the act at one time or another. But readers should beware: when it comes to ranking states on education, a rose is not a rose is not a rose.
Three state report cards released over the winter show how widely the grades vary, even though they are all ostensibly evaluating the same thing – public education. The American Legislative Exchange Council published its Report Card on American Education in November. Just last week, the Network for Public Education released a 50 State Report Card. Both ALEC and NPE are advocacy organizations with clear, and contradictory, agendas. January saw the release of Education Week’s annual Quality Counts which, as the education publication of record, represents the Goldilocks in this bunch.
What, if anything, can we learn by looking at these three rankings collectively? On the one hand, there is little agreement among the organizations regarding which states are top performers: no state makes the top 10 in all three lists. Yet on the other hand, there is consensus that no state is perfect and that much more work needs to be done, since no state earned an ‘A.’
Obviously, these reports differ because they value different things. ALEC and NPE grade states on the education policies they like. ALEC, which advertises itself as supportive of “limited government, free markets and federalism,” awards high marks to states that promote choice and competition, such as allowing more charter schools, providing private school options with taxpayer support, and having few or no regulations on homeschooling. NPE emphasizes the “public” in public education and opposes privatization and so-called “corporate reforms” such as merit pay, alternative certification for teachers, and especially high-stakes testing. Policies that earned high grades from ALEC, therefore, got low grades from NPE and vice versa.
The two had one area of agreement, however, albeit by omission. The report cards say little (ALEC) or nothing (NPE) about actual performance. The result is that grades on both reports have no relationship to student learning.
To its credit, ALEC features a separate ranking of states’ NAEP scores for low-income students as a way to draw attention to student performance. In doing so, however, the authors also cast a light on how little ALEC’s preferred policies relate to achievement. For every Indiana, which earned ALEC’s top grade and produces high NAEP scores, there is a Hawaii, whose low-income kids ranked 6th on NAEP but earned an ALEC ‘D+.’ NPE isn’t any better. Despite the appearance of high-performing states like Massachusetts and Iowa in the NPE Top 10, it also awarded high-scoring Indiana an ‘F’ and Colorado a ‘D.’
In contrast to ALEC and NPE, Ed Week does not take positions on education policy. Its state report card focuses on K-12 achievement, school finance, and something it calls “chance for success” – demographic indicators related to student achievement, including poverty, parent education, and early education enrollment. With policy out of the equation, Ed Week’s grades in each domain track fairly consistently with the overall grade, suggesting that the indicators identified by the authors tell us at least something about the quality of education.
So which state gets bragging rights? If you want to use one of these report cards as fodder for your own particular brand of advocacy, then by all means go with ALEC or NPE – whichever one fits your views best. But if you really want to know how well different education policies work, you’d be better off consulting the research. You can start here, here and here.
As for ranking states by their education systems? Stick with Goldilocks.