

The EDifier

May 14, 2015

Proficiency Rates Differ Between State and National Tests

Large gaps in proficiency rates still exist between state and national tests, according to a new report by Achieve, Inc. It has been known for several years that more students reach the proficiency benchmark on their state assessment than on the National Assessment of Educational Progress (NAEP), and that gap remains today. In fact, proficiency rates on most state assessments are 30 percentage points higher than they are on NAEP. This means that if one of these states reported that 80 percent of its students reached the proficiency benchmark on the state assessment, then just 50 percent likely reached it on NAEP.

In some states the gap was even larger. In Georgia, for example, the difference was 60 percentage points in 4th grade reading, the largest difference in the country. In this case, 94 percent of 4th graders were deemed proficient on the Georgia state assessment while just 34 percent reached the proficiency level on NAEP. Georgia wasn't alone: Louisiana, Alaska, and Arkansas all had gaps of at least 50 percentage points. Similar results were found in 8th grade math as well.

However, some states showed small gaps, if any. In fact, in New York more students were deemed proficient on NAEP than on the state assessment in both 4th grade reading and 8th grade math. The report also singled out a dozen or so states whose proficiency rates on their state assessments were similar to those on NAEP, dubbing them the "Top Truth Tellers."

The results aren't entirely surprising. The Achieve report is based on results from the 2013-14 state assessments, when nearly all states were still using older tests. Most states will be giving new Common Core-aligned tests for the first time this year, which will likely lead to lower proficiency rates, as was seen in Kentucky and New York, states that have been administering Common Core-aligned assessments for a couple of years already. What will be interesting is how this analysis will look a year from now, when state scores are based on more rigorous Common Core-aligned assessments. My guess is that Common Core states will see their scores align more closely with NAEP, while those that don't adopt such assessments will still show significant gaps. The question remains: will there be more pushback in states with lower proficiency rates or in those with larger gaps? We will have to wait until next year to find out.—Jim Hull






November 20, 2014

Growing concerns about testing

A recent opinion piece in the Denver Post challenged the common claim that American public school students are tested too much. Recently, high school seniors in Colorado refused to take state assessments in science and social studies, arguing that the tests do not reflect what they have been taught.

But Alicia Caldwell, an editorial writer at the Post, notes that students in grades three through 12 spend only 1.4% of their time in school on testing, citing data from the Colorado Department of Education. Caldwell also points out that there was local input on these testing decisions: eight educators from the affected school districts served on the committee that adopted the social studies standards in 2009.

These standards were put into place because too many Colorado students were required to take remedial classes in college, classes for which they received no credit but still had to pay. In essence, students were paying for material they should have already mastered in high school. Finally, the author highlights the role of local districts, as "local districts are layering their own assessments on top of those required for the state, adding to total test time." This is a reminder that the amount of testing is the result of federal, state, and local policies. If parents or students, such as those in Colorado, are concerned about too much testing, then it is the responsibility of school boards and local governments to make their testing information transparent.

Colorado is not the only state where communities have voiced concerns about testing; Maryland has also engaged in the debate over the right amount of testing. Eighth-graders in Baltimore schools, for instance, spend 14 to 46 hours a year on standardized assessments. A school year amounts to approximately 1,000 instructional hours, so students are spending 1.4% to 4.6% of that time on testing. Expressed as a percentage, this level of testing does not seem as significant as some testing critics claim. In Anne Arundel County, students are tested 46 hours per year, and 33 of those hours come from locally mandated tests. This again demonstrates the role of local government and school board decisions in testing.
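For readers who want to reproduce those percentages, here is a minimal sketch of the arithmetic in Python. It simply assumes the post's approximation of a 1,000-hour school year; the hour figures are the ones cited above.

```python
# Approximate instructional hours in a school year, per the post
SCHOOL_YEAR_HOURS = 1000

def testing_share(testing_hours: float, year_hours: float = SCHOOL_YEAR_HOURS) -> float:
    """Return the percentage of instructional time spent on testing."""
    return 100 * testing_hours / year_hours

# Baltimore eighth-graders: 14 to 46 testing hours per year
print(f"{testing_share(14):.1f}%")  # 1.4%
print(f"{testing_share(46):.1f}%")  # 4.6%
```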

An upcoming brief from the Center for Public Education will examine these and other concerns about testing and explain what research has found on the subject. Stay tuned!






