

The EDifier

July 10, 2015

‘Proficient’ in the eye of the beholder

While we often talk about the American educational system, in truth we have 50 systems, each with the latitude to define its own academic standards. A newly published analysis by the National Center for Education Statistics (NCES) shows just how widely those expectations for student learning differ among states. Moreover, the findings suggest that most states could be aiming too low.

For the last ten years, NCES has conducted periodic statistical analyses that map student proficiency on state tests to those students’ performance on the National Assessment of Educational Progress (NAEP). This national assessment is administered in all states and is, by broad consensus, considered the gold standard, both in the richness of its content and in the quality of the assessment itself. As such, states whose students perform at about the same level on the state test as they do on NAEP can be considered to have high performance standards.
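To make the mapping concrete, here is a minimal sketch of the general idea, not NCES’s actual procedure or code: find the NAEP score that the same share of a state’s NAEP takers reaches or exceeds as was reported proficient on the state test, then compare that score to NAEP’s published cut scores. The NAEP sample and the 80 percent proficiency rate below are made up; only the grade 4 reading cut scores (208 for basic, 238 for proficient) are real.

```python
import numpy as np

def naep_equivalent(naep_scores, state_proficient_rate):
    """Return the NAEP score reached or exceeded by the same share of students
    as reached 'proficient' on the state test (an equipercentile-style mapping).

    naep_scores: NAEP scale scores for the state's sampled students
                 (hypothetical values here; real NAEP microdata are restricted).
    state_proficient_rate: share proficient on the state test, between 0 and 1.
    """
    # The (1 - rate) quantile is the score that 'rate' of students meet or beat.
    return np.quantile(naep_scores, 1 - state_proficient_rate)

# Hypothetical example: a state reports 80% of 4th graders proficient in reading.
rng = np.random.default_rng(0)
fake_naep = rng.normal(loc=220, scale=35, size=2000)   # made-up grade 4 reading scores
cut = naep_equivalent(fake_naep, 0.80)

NAEP_BASIC, NAEP_PROFICIENT = 208, 238                  # published grade 4 reading cut scores
print(f"NAEP equivalent of the state standard: {cut:.0f}")
print("Below NAEP-basic" if cut < NAEP_BASIC
      else "NAEP-basic range" if cut < NAEP_PROFICIENT
      else "NAEP-proficient or above")
```

With a lenient state standard, the computed NAEP equivalent tends to land below the basic cut score, which is exactly the pattern the report describes for many states.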

Among the findings:

  • Grade 4: Only two states (New York and Wisconsin) had state proficiency standards equivalent to NAEP-proficient in both reading and math; an additional three states (Massachusetts, North Carolina and Texas) were aligned with NAEP-basic in reading and NAEP-proficient in math. Four states (Alabama, Georgia, Idaho and Maryland) had proficiency levels aligned with NAEP-below basic. A whopping 22 states were in the NAEP-below basic range in reading.
  • Grade 8: Only New York’s proficiency levels aligned with NAEP-proficient in both reading and math, while North Carolina and Texas were within NAEP-basic in reading and NAEP-proficient in math. Five states (Alabama, Connecticut, Georgia, Idaho and Ohio) were in the below basic range in both subjects. Unlike grade 4, only three states’ grade 8 performance (DC, Indiana and Mississippi) was at the NAEP-below basic level in reading. The majority of states were within the NAEP-basic range in reading and math.

Alert readers will note, of course, that some high-performing states like Connecticut and Maryland had proficiency levels that aligned with NAEP’s lowest performance designation. The analysis is, to be sure, an imperfect comparison. Even so, the relationship between states’ alignment to NAEP-proficient and their relative performance is fairly consistent, as you can see in the chart featured below as well as in the full report.

Despite the study’s limitations, NCES provides important context for states to help them gauge the quality of their standards. According to the Atlantic, Peggy Carr, NCES’s acting commissioner, explained to reporters that NAEP-proficient is considered to be at a level that shows students are on track to be “college-ready.” The most recent administration showed that only 35 percent of the nation’s fourth-graders performed at proficient or above on NAEP-reading; about the same proportion of eighth-graders (36 percent) were proficient in math. Clearly, we have our work cut out for us in order to meet the goal of all graduates prepared for college and careers.

The NCES study was based on 2013 data, so it’s too early to see the impact of the Common Core standards and aligned assessments in those states that have adopted them. Several states that opted out of the standards, however, are also committed to the college- and career-ready agenda. NCES’s next iteration of this series should, therefore, give us more insight into how well we are advancing.

[Chart: NAEP map of state proficiency standards]

Filed under: Assessments, Common Core, standards — Patte Barth @ 3:42 pm





July 2, 2015

Testing, opt outs and equity

Spring heralds the return of many things – tulips, bare pavement, baseball, and for millions of public schoolkids, state tests. This year, however, the inevitable proved to be largely evitable. April tulips weren’t seen until late May. Much of the country experienced a white Easter. Major league games were snowed out. And tens of thousands of students just said “no” to being tested.

To be sure, the vast majority of students took their exams as expected. New York has by far the largest number of test refusers, yet an analysis by the New York Times estimates that even there only about 165,000 students, or roughly one out of every six, opted out of one or more tests in 2015. Colorado has also experienced higher-than-usual opt-out numbers, but 83 percent of seniors still took their exams this year.

Despite the small numbers nationwide, the opt-out movement is drawing attention to the test weariness that has been settling on many public school parents, teachers and students, even among those who don’t opt out. New Common Core tests seem to be adding to their anxiety. By making their frustrations visible, the test refuseniks are starting to influence testing policy and its place in school accountability, most notably in Congress, where proposed ESEA bills are currently under consideration.

So who are these opt-outers? The New York Times analysis found that the movement appears to be a mostly middle-class phenomenon. According to their calculations, poor districts in New York (Free & Reduced Price Lunch > 60%) had the fewest test refusers, followed by the wealthiest (FRPL < 5%). An April 2015 poll by Siena College provides some other clues by identifying racial differences in voter attitudes. While a 55 percent majority of white voters in the Empire State approved of opting out, only 44 percent of black and Latino voters did.

A 2015 survey from the Public Policy Institute of California identified similar racial differences in opinions about the Common Core. Substantial majorities of California Latinos, Asians and blacks expressed confidence that the new standards will “make students more college and career ready,” compared to less than half of white voters.

One probable reason for these racial and class differences is the role standards and assessments have played in educational equity over the last two decades. The 1994 re-authorization of ESEA laid the foundation for what would eventually become NCLB’s test-based accountability by calling on states to “establish a framework for comprehensive, standards-based education reform for all students.”  At that time, researchers and analysts were beginning to show that the achievement gap was not just a reflection of inequitable resources but also of unequal expectations. A 1994 study from the U.S. Department of Education’s Office of Research, for example, found that “students in high poverty schools … who received mostly A’s in English got about the same reading score [on NAEP] as did the ‘C’ and ‘D’ students in the most affluent schools.” In math, “the ‘A’ students in the high poverty schools most closely resembled the ‘D’ students in the most affluent schools.”  In 2001, NCLB would define further measures to correct these inequities by requiring state tests that would give the public a common, external measurement for gauging whether academic standards were being implemented equally between high- and low-poverty schools.

Indeed, the civil rights community has been among the most vocal supporters of standardized tests in accountability systems. Earlier this year, a coalition of 25 civil rights organizations led by the Leadership Conference on Civil and Human Rights released a statement of principles for ESEA reauthorization. Signatories included the NAACP, the National Council of La Raza, the National Congress of American Indians, and the National Disabilities Rights Network. Among other things, the principles call for retaining the annual testing requirements of NCLB. In May, twelve of these organizations issued another statement specifically criticizing the opt out movement, declaring:

[T]he anti-testing efforts that appear to be growing in states across the nation, like in Colorado and New York, would sabotage important data and rob us of the right to know how our students are faring. When parents ‘opt out’ of tests—even when out of protest for legitimate concerns—they’re not only making a choice for their own child, they’re inadvertently making a choice to undermine efforts to improve schools for every child.

The statement was not universally embraced. Notable civil rights leader Pedro Noguera, along with the Advancement Project’s Judith Browne Dianis and John Jackson of the Schott Foundation, took exception to what they consider to be a “high-stakes, over-tested climate” for disadvantaged students. Yet their objections are not so much to tests themselves as to how the information is used.

There is a growing consensus that the balance between assessment for improvement and assessment for accountability has become skewed toward high stakes – something many believe has a perverse effect on classroom practice. But like Mr. Noguera and his colleagues, many educators and experts also believe that standardized tests are not the problem; it’s the outsized role they have assumed in everything from instruction to teacher evaluation. The next few months promise to launch many federal and state conversations about what the proper role of state tests should be. Ideally, that role will serve ongoing improvement while assuring the public that all students are receiving the benefits of a solid public education.

Filed under: Achievement Gaps, Assessments, Common Core, equity, Testing — Patte Barth @ 1:10 pm





May 14, 2015

Proficiency Rates Differ Between State and National Tests

Large gaps in proficiency rates still exist between state and national tests, according to a new report by Achieve, Inc. It has been known for several years that more students reach the proficiency benchmark on their state assessment than on the National Assessment of Educational Progress (NAEP), and that gap remains today. In fact, proficiency rates on most state assessments are 30 percentage points higher than they are on NAEP. What this means is that if one of these states reported 80 percent of its students reaching the proficiency benchmark on the state assessment, then just 50 percent likely reached it on NAEP.

In some states the gap was even larger. In Georgia, for example, the difference was 60 percentage points in 4th grade reading, the largest difference in the country: 94 percent of 4th graders were deemed proficient on the Georgia state assessment while just 34 percent reached the proficiency level on NAEP. Georgia wasn’t alone. Louisiana, Alaska, and Arkansas all had gaps of at least 50 percentage points. Similar results were found in 8th grade math as well.
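The arithmetic behind these comparisons is simple enough to sketch. The Georgia figures below come from the post; the other states’ rates are hypothetical placeholders, since the exact numbers live in the Achieve report.

```python
# Percentage of 4th graders deemed proficient in reading: state test vs. NAEP.
# Georgia's figures are cited in the post; the others are hypothetical placeholders.
rates = {
    "Georgia": {"state": 94, "naep": 34},
    "State B": {"state": 85, "naep": 33},   # placeholder
    "State C": {"state": 70, "naep": 40},   # placeholder
}

# Gap = state proficiency rate minus NAEP proficiency rate, in percentage points.
gaps = {name: r["state"] - r["naep"] for name, r in rates.items()}
for name, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {gap} percentage-point gap")

widest = max(gaps, key=gaps.get)
print(f"Largest gap: {widest} ({gaps[widest]} points)")
```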

However, there were states with small gaps, if any. In fact, in New York more students were deemed proficient on NAEP than on the state assessment in both 4th grade reading and 8th grade math. The report also singled out a dozen or so states that had proficiency rates on their state assessments similar to those on NAEP, which the report dubbed the “Top Truth Tellers.”

The results aren’t entirely surprising. The Achieve report is based on results from the 2013-14 state assessments, when nearly all states were still using older tests. Most states will be giving new Common Core-aligned tests for the first time this year, which will likely lead to lower proficiency rates, as was seen in Kentucky and New York, states that have been administering Common Core-aligned assessments for a couple of years already. What will be interesting is how this analysis will look a year from now, when state scores are based on more rigorous Common Core-aligned assessments. I’m guessing the Common Core states will see their scores more closely aligned with NAEP, while states that don’t adopt such assessments will still have significant gaps. The question remains: will there be more pushback in states with lower proficiency rates or in those with larger gaps? I guess we will have to wait until next year to find out. —Jim Hull






May 4, 2015

ACT now, before time runs out!

In a recently released report, the testing company ACT once again sought to delve into the concept of career readiness (part of the now-common phrase “college and career readiness”): what it is, exactly, that so many students are expected to have and schools are expected to impart, and how best to measure it.

The brief report begins by explaining that college readiness and career readiness are often assumed to be measurable by the same assessments, but that there are several significant differences between the two and they are best measured separately. The confusion stems from misinterpretations of ACT’s 2006 Ready for College and Ready for Work report, whose intent was to highlight that students who choose to enter the workforce after high school benefit significantly from exposure to academically rigorous standards, just as students preparing for college do. Apparently, some took this to mean that because both college readiness and career readiness rest on the same foundational skills, the two constructs are the same.

The recent report explains that when defining and assessing one’s readiness to enter the workforce, there is a range of skill sets, from broad abilities that apply to numerous jobs to skills tied to a particular job. Accordingly, there are three levels of workplace readiness that follow this general-to-specific structure: work readiness, career readiness, and job readiness.

Work readiness is the most general form of academic readiness for the workplace: the skills that would prepare any high school graduate for postsecondary workforce training regardless of the intended career or occupation. Career readiness, more directed than work readiness, is the workplace readiness required for a specific group of careers. For example, whereas all graduates need foundational work readiness skills such as reading and math proficiency, the fields of health care and construction generally require different types of skills (the importance of knowing statistics or creating financial statements, say, may be ranked differently by construction and health care professions) regardless of which specific occupation within the field is chosen. The last, and most specific, form of workplace readiness is job readiness, which relates to the skill sets and competencies required or expected for a specific job or occupation.

Similar to our Defining a 21st Century Education report, the ACT report also includes a discussion as to whether including more than just academic skills is appropriate in assessing college and career readiness. In addition to core academic skills (such as math, science, and English/language arts), three other skill domains are elaborated: cross-cutting capabilities, which are higher-level thinking and social skills (e.g., critical thinking, problem-solving, cooperation); behavioral skills, such as the ability to work well in a team setting and manage stress; and navigation skills, such as goal orientation and self-knowledge of one’s abilities. ACT posits that unless these non-academic components are considered in assessment, the value of such skills and abilities will be ignored despite their recognized importance to the education, business, and industry communities. Certainly, an environment fostering these skills would benefit students by supporting a more comprehensive education. At the very least, it would be difficult to argue against wanting students to have such competencies. ACT concludes that it is currently researching how it can aid in examining this more “holistic approach” to career readiness. –David Ferrier






March 17, 2015

Math skills needed to climb the economic ladder


With all the headlines about students opting out of testing, it appears there is an assumption that test scores have no connection to a student’s future success. There is certainly room to debate how much testing students should be taking and what role test results should play in student, teacher, and school accountability, but it can’t be ignored that test scores do in fact matter. No, test results are not a perfect measure of a student’s actual knowledge and skills, but perfect shouldn’t be the enemy of the good. That is, test scores are a good measure of a student’s knowledge and skills, and the new Common Core tests appear to be an even more accurate measure than previous state assessments, which at best were good measures of basic skills.

But does it really matter how students perform on a test? Yes, especially for students from the most economically disadvantaged families. If they want to climb up the economic ladder, they had better perform well on their math tests. When I examined the math scores of 2004 seniors who took part in the Education Longitudinal Study (ELS), I found that among students from families in the bottom quartile of socioeconomic status (SES), the better they performed on the ELS math assessment, the more likely they were to move up the economic ladder. For example, just 5 percent of low-SES students who scored within the lowest quartile on the math assessment moved up to the highest quartile in SES by 2012. On the other hand, 36 percent of low-SES students who had scored within the top quartile on the math assessment climbed to the top of the SES ladder by 2012. Moreover, nearly half of low-SES students remained in the lowest SES quartile in 2012 if they also scored among the lowest quartile on the math assessment. Yet only 11 percent of low-SES students who scored among the top quartile on the math assessment remained low-SES in 2012.
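For readers curious about the shape of that calculation, here is a rough sketch. The variable names and records are hypothetical stand-ins for the ELS data files; the point is simply the cross-tabulation of initially low-SES students’ math-score quartile against their 2012 SES quartile.

```python
import pandas as pd

# Hypothetical stand-in for the ELS student file; real variable names differ.
df = pd.DataFrame({
    "ses_2004_quartile": [1, 1, 1, 1, 1, 1, 2, 3],   # 1 = lowest SES quartile in 2004
    "math_quartile":     [1, 1, 4, 4, 2, 3, 1, 4],   # quartile on the ELS math assessment
    "ses_2012_quartile": [1, 2, 4, 4, 1, 3, 1, 4],   # SES quartile in 2012
})

# Keep only students who started in the lowest SES quartile.
low_ses = df[df["ses_2004_quartile"] == 1]

# Share of initially low-SES students reaching the top SES quartile by 2012,
# broken out by how they scored on the math assessment.
reached_top = (
    low_ses.assign(top_2012=low_ses["ses_2012_quartile"] == 4)
           .groupby("math_quartile")["top_2012"]
           .mean()
)
print(reached_top)   # in the post: ~5% for bottom-quartile scorers vs. ~36% for top-quartile
```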

Taken together, these findings provide strong evidence that economically disadvantaged students can improve their chances of moving up the economic ladder by performing well on math tests. On the other hand, low performance on math tests will likely lead to continued economic challenges in their adult lives.

Of course, it is not simply improving test scores that enables economically disadvantaged students to move up the economic ladder; it is the skills the higher test scores represent. As CPE’s reports on getting into and succeeding in college showed, obtaining higher math skills leads to greater success in college. Furthermore, an upcoming CPE report will show that higher math skills also increase the chances that non-college enrollees will get a good job and contribute to society. So there is strong evidence that increasing a student’s math knowledge, as measured by standardized tests, gives economically disadvantaged students the tools they need to climb up the economic ladder. –Jim Hull

Filed under: Assessments, Testing — Jim Hull @ 11:32 am




