

The EDifier

May 14, 2015

Proficiency Rates Differ Between State and National Tests

Large gaps in proficiency rates still exist between state and national tests, according to a new report by Achieve, Inc. It has been known for several years that more students reach the proficiency benchmark on their state assessment than on the National Assessment of Educational Progress (NAEP), and that gap remains today. In fact, proficiency rates on most state assessments are 30 percentage points higher than they are on NAEP. What this means is that if one of these states reported that 80 percent of its students reached the proficiency benchmark on the state assessment, then just 50 percent likely reached it on NAEP.
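To make that arithmetic concrete, here is a minimal sketch in Python (the numbers are the hypothetical ones from the paragraph above, not figures taken from the report):

```python
# A 30-percentage-point state-vs-NAEP gap, as described in the Achieve report.
STATE_NAEP_GAP = 30

def expected_naep_rate(state_rate, gap=STATE_NAEP_GAP):
    """Estimate the NAEP proficiency rate implied by a state rate and a gap."""
    return max(state_rate - gap, 0)  # a rate can't drop below 0 percent

print(expected_naep_rate(80))  # -> 50, matching the example above
```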

In some states the gap was even larger. In Georgia, for example, the difference was 60 percentage points in 4th grade reading, the largest difference in the country: 94 percent of 4th graders were deemed proficient on the Georgia state assessment, while just 34 percent reached the proficiency level on NAEP. Georgia wasn’t alone. Louisiana, Alaska, and Arkansas all had gaps of at least 50 percentage points. Similar results were found in 8th grade math as well.

However, some states had small gaps, if any. In fact, in New York more students were deemed proficient on NAEP than on the state assessment in both 4th grade reading and 8th grade math. The report also singled out a dozen or so states whose proficiency rates on their state assessments were similar to their rates on NAEP, dubbing them the “Top Truth Tellers.”

The results aren’t entirely surprising. The Achieve report is based on results from the 2013-14 state assessments, when nearly all states were still using older tests. Most states will be giving new Common Core-aligned tests for the first time this year, which will likely lead to lower proficiency rates, as was seen in Kentucky and New York, states that have been administering Common Core-aligned assessments for a couple of years already. What will be interesting is how this analysis looks a year from now, when state scores are based on the more rigorous Common Core-aligned assessments. My guess is that Common Core states will see their scores align more closely with NAEP, while states that don’t administer such tests will still have significant gaps. The question remains: will there be more pushback in states with lower proficiency rates or in those with larger gaps? I guess we will have to wait until next year to find out.—Jim Hull






May 4, 2015

ACT now, before time runs out!

In a report released by ACT, the testing company once again sought to unpack the concept of career readiness (part of the now common phrase “college and career readiness”): to explain what it is that students are expected to have and schools are expected to impart, and how best to measure it.

The brief report begins by explaining that college readiness and career readiness are often assumed to be measurable by the same assessments; however, there are several significant differences between the two, and they are best measured separately. The confusion stems from misinterpretations of ACT’s 2006 Ready for College and Ready for Work report, whose intention was to highlight that students who choose to enter the workforce after high school benefit as much from exposure to academically rigorous standards as students preparing for college do. Apparently, some took this to mean that because the same foundational skills underlie both college readiness and career readiness, the two constructs are the same.

The recent report explains that readiness to enter the workforce spans a range of skill sets, from broad abilities that apply to numerous jobs to skills that are job-specific. Accordingly, there are three levels of workplace readiness that follow this general-to-specific structure: work readiness, career readiness, and job readiness.

Work readiness is the most general form of academic readiness for the workplace: the skills that prepare any high school graduate for postsecondary workforce training, regardless of the intended career or occupation. Career readiness, more directed than work readiness, is the readiness required for a specific group of careers. For example, whereas all graduates need foundational work readiness skills such as reading and math proficiency, the fields of health care and construction generally require different types of skills (for example, the importance of knowing statistics or creating financial statements may be ranked differently by construction and health care professions), regardless of which specific profession is chosen. The last, and most specific, form of workplace readiness is job readiness, which covers the skill sets and competencies required or expected for a specific job or occupation.

Similar to our Defining a 21st Century Education report, the ACT report also includes a discussion of whether it is appropriate to include more than just academic skills when assessing college and career readiness. In addition to core academic skills (such as math, science, and English/language arts), three other skill domains are elaborated: cross-cutting capabilities, the higher-level thinking and social skills (e.g., critical thinking, problem-solving, cooperation); behavioral skills, such as the ability to work well in a team and manage stress; and navigation skills, such as goal orientation and self-knowledge of one’s abilities. ACT posits that unless these non-academic components are considered in assessment, the value placed on such skills and abilities will be ignored despite their recognized importance to the education, business, and industry communities. Certainly, an environment fostering these skills would benefit students by supporting a more comprehensive education. At the very least, it would be difficult to argue against wanting students to have such competencies. ACT concludes that it is currently researching how it can help examine this more “holistic approach” to career readiness. –David Ferrier






March 17, 2015

Math skills needed to climb the economic ladder


With all the headlines about students opting out of testing, there appears to be an assumption that test scores have no connection to a student’s future success. There is certainly room to debate how much testing students should undergo and what role test results should play in student, teacher, and school accountability, but it can’t be ignored that test scores do in fact matter. No, test results are not a perfect measure of a student’s actual knowledge and skills, but perfect shouldn’t be the enemy of the good. That is, test scores are a good measure of a student’s knowledge and skills, and the new Common Core tests appear to be an even more accurate measure than previous state assessments, which at best were good measures of basic skills.

But does it really matter how students perform on a test? Yes, especially for students from the most economically disadvantaged families. If they want to climb the economic ladder, they had better perform well on their math tests. When I examined the math scores of 2004 seniors who took part in the Educational Longitudinal Study (ELS), I found that students from families in the bottom quartile of socioeconomic status (SES) were more likely to move up the economic ladder the better they performed on the ELS math assessment. For example, just 5 percent of low-SES students who scored within the lowest quartile on the math assessment moved up to the highest SES quartile by 2012. On the other hand, 36 percent of low-SES students who scored within the top quartile on the math assessment climbed to the top of the SES ladder by 2012. Moreover, nearly half of low-SES students who also scored in the lowest quartile on the math assessment remained in the lowest SES quartile in 2012, while only 11 percent of low-SES students who scored in the top quartile on the math assessment remained low-SES in 2012.
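Laying those four figures out side by side makes the contrast easier to see. This is only an illustrative sketch: the numbers are the ones quoted above, “nearly half” is entered as an approximate 50 percent, and the full ELS tabulation has more cells than this.

```python
# Outcomes by 2012 for 2004 seniors from the bottom SES quartile,
# broken out by their quartile on the ELS math assessment.
# Values are (percent reaching the top SES quartile,
#             percent remaining in the bottom SES quartile).
outcomes = {
    "bottom math quartile": (5, 50),   # "nearly half" stayed low-SES; 50 is approximate
    "top math quartile":    (36, 11),
}

for math_quartile, (moved_up, stayed_low) in outcomes.items():
    print(f"{math_quartile}: {moved_up}% reached the top SES quartile, "
          f"{stayed_low}% remained in the bottom quartile")
```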

Taken together, this provides strong evidence that economically disadvantaged students can improve their chances of moving up the economic ladder by performing well on math tests. On the other hand, low performance on math tests will likely lead to continued economic challenges in adulthood.

Of course, it is not simply improving test scores that enables economically disadvantaged students to move up the economic ladder; it is the skills the higher test scores represent. As CPE’s reports on getting into and succeeding in college showed, stronger math skills lead to greater success in college. Furthermore, an upcoming CPE report will show that stronger math skills also increase the chances that non-college enrollees will get a good job and contribute to society. So there is strong evidence that increasing a student’s math knowledge, as measured by standardized tests, gives economically disadvantaged students the tools they need to climb the economic ladder. –Jim Hull






November 20, 2014

Growing concerns about testing

A recent opinion piece in the Denver Post challenged the common claim that American public school students are being tested too much. Recently, high school seniors in Colorado refused to take state assessments in science and social studies, arguing that these assessments do not reflect what they have been taught.

But Alicia Caldwell, an editorial writer at the Post, notes that students from third to 12th grade spend only 1.4 percent of their time in school being tested, citing data from the Colorado Department of Education. Caldwell also points out that there was local input on these testing decisions, as eight educators from these school districts were placed on the committee that enacted the social studies standards in 2009.

These standards were put into place because Colorado students were required to take too many remedial classes in college, for which they received no credit but had to pay. In essence, Colorado students had to pay for classes they should have already passed in high school. Finally, the author highlights the role of local districts, as “local districts are layering their own assessments on top of those required for the state, adding to total test time.” This reminds us that the amount of testing is the result of federal, state, and local policies. If parents or students, such as those in Colorado, are complaining about too much testing, then it is the school board’s and local government’s responsibility to make their testing information transparent.

Colorado is not the only state where communities have voiced concerns about testing. Maryland has also engaged in the debate over the right amount of testing. Eighth-graders in Baltimore schools, for instance, spend 14 to 46 hours a year on standardized assessments. A school year amounts to approximately 1,000 instruction hours, so students are spending roughly 1.4 to 4.6 percent of that time on testing, as shown in the sketch below. When expressed as a percentage, this level of testing does not seem as significant as some testing critics claim it to be. In Anne Arundel County, students are tested 46 hours per year, and 33 of those hours come from locally mandated tests. This again demonstrates the role of local government and school board decisions in testing.
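A minimal sketch of that percentage arithmetic, assuming the roughly 1,000-hour school year cited above:

```python
INSTRUCTION_HOURS_PER_YEAR = 1000  # approximate school year, as cited above

def testing_share(test_hours, total_hours=INSTRUCTION_HOURS_PER_YEAR):
    """Testing time as a percent of total instruction time."""
    return 100 * test_hours / total_hours

for hours in (14, 46):
    print(f"{hours} hours -> {testing_share(hours):.1f}% of instruction time")
# 14 hours -> 1.4%; 46 hours -> 4.6%
```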

An upcoming brief from the Center for Public Education will examine these and other concerns about testing and explain what studies have found on the subject. Stay tuned!






October 7, 2014

More Students Taking Advanced Placement But College Readiness Remains Flat

In a departure from past releases, this year’s SAT results included results from the College Board’s two other testing programs, the PSAT/NMSQT and Advanced Placement (AP) exams, providing a more complete picture of student progress toward college readiness throughout high school.

This year’s picture provides evidence that more students, especially poor and minority students, are taking more rigorous courses such as Advanced Placement (AP), yet such improvements have not led to an increase in college-readiness rates. Unfortunately, it is not clear why, especially since Hispanic students, the nation’s fastest-growing population, account for a large portion of the increase in AP test-taking.

Although Hispanic students made tremendous strides on the AP, as a group, they were less likely to reach the college readiness benchmark on the SAT. While nearly 43 percent of the Class of 2014 who took the SAT reached the college readiness benchmark score of 1550, just under a quarter of Hispanic test-takers did so. Moreover, black students who took the SAT were even less likely to be considered ‘college ready,’ as just under 16 percent met or exceeded the college readiness threshold.


The Findings


College Readiness

  • Nearly half (43 percent) of test-takers met the SAT College-Ready Benchmark in 2014, unchanged from the year prior and slightly lower than in 2009 (44 percent).
    • The SAT College-Ready Benchmark represents a combined score of 1550 or higher. Students hitting this benchmark have a 65 percent chance of earning at least a B-minus grade point average in their freshman year courses.
  • Minority students are less likely to be college-ready.
    • Just 15.8 percent of black students and 23.4 percent of Hispanic students were college-ready, according to the SAT’s Benchmark.

Core Course Rigor

  • Three-quarters of SAT test-takers completed the recommended “core” college-preparatory curriculum, which is an increase from 70 percent in 2001.

Test Takers

  • Just over 1.67 million students from the Class of 2014 took the SAT sometime during high school, a 4 percent increase from 2013.
  • More minority students are taking the SAT.
    • Nearly half (48 percent) of test takers were minorities in 2014 compared to 46 percent just a year earlier.


Advanced Placement (AP)

  • In 2014, 22 percent of the nation’s 11th- and 12th-graders took at least one AP exam, nearly double the rate from just a decade ago, when 12 percent took an AP exam.
  • Even as more students took AP exams, passing rates improved as well. In 2004, just 8 percent of 11th- and 12th-graders passed an AP exam; that rate increased to 13 percent in 2014.
  • Hispanic students (19 percent) are taking AP courses at nearly the same rate as the overall national average (22 percent), yet black (13 percent) and Native American (12 percent) students are still less likely to take AP.
  • According to the College Board’s PSAT/NMSQT results, nearly 40 percent of PSAT/NMSQT test-takers had the potential to succeed in an AP course but never took an AP exam. However, such students may have taken other college-level courses, such as International Baccalaureate or Honors courses.




