

The EDifier

November 20, 2014

Growing concerns on testing

A recent opinion piece in the Denver Post challenged the common claim that American public school students are being tested too much. Recently, high school seniors in Colorado refused to take state assessments in science and social studies, arguing that these assessments do not reflect what they have been taught.

But Alicia Caldwell, an editorial writer at the Post, writes that students in grades three through 12 spend only 1.4 percent of their school time on testing, citing data from the Colorado Department of Education. Caldwell also points out that there was local input on these testing decisions: eight educators from these school districts served on the committee that adopted the social studies standards in 2009.

These standards were put into place because too many Colorado students were required to take remedial classes in college, for which they received no credit but still had to pay. In essence, students were paying for material they should have mastered in high school. Finally, the author highlights the role of local districts: “local districts are layering their own assessments on top of those required for the state, adding to total test time.” This reminds us that the amount of testing is the result of federal, state, and local policies. If parents or students, such as those in Colorado, are complaining about too much testing, then it is the school board’s and local government’s responsibility to make their testing information transparent.

Colorado is not the only state where communities have voiced concerns about testing. Maryland has also engaged in the debate over the right amount of testing. Eighth-graders in Baltimore schools, for instance, spend 14 to 46 hours a year on standardized assessments. A school year amounts to approximately 1,000 instruction hours, so students are spending 1.4 to 4.6 percent of their time on testing. Expressed as a percentage, this level of testing does not seem as significant as some testing critics claim it to be. In Anne Arundel County, students are tested 46 hours per year, and 33 of those hours come from locally mandated tests. This again demonstrates the role of local government and school board decisions in testing.

An upcoming brief from the Center for Public Education will examine these and other concerns on testing and explain what studies have found on the subject. Stay tuned!






October 7, 2014

More Students Taking Advanced Placement But College Readiness Remains Flat

In a departure from past releases, this year’s SAT results included results from the College Board’s two other testing programs— the PSAT/NMSQT and their Advanced Placement (AP) exams— providing a more complete picture of student progress towards college readiness throughout high school.

This year’s picture provides evidence that more students, especially poor and minority students, are taking more rigorous courses such as Advanced Placement (AP), yet these gains have not translated into higher college-readiness rates. Unfortunately, it is not clear why, especially since Hispanic students, the nation’s fastest-growing population, account for a large portion of the increase in AP test-taking.

Although Hispanic students made tremendous strides on the AP, as a group, they were less likely to reach the college readiness benchmark on the SAT. While nearly 43 percent of the Class of 2014 who took the SAT reached the college readiness benchmark score of 1550, just under a quarter of Hispanic test-takers did so. Moreover, black students who took the SAT were even less likely to be considered ‘college ready,’ as just under 16 percent met or exceeded the college readiness threshold.

 

The Findings

 

College Readiness

  • Nearly half (43 percent) of the test-takers met the SAT College-Ready Benchmark in 2014, which is unchanged from the year prior and slightly lower than in 2009 (44 percent).
    • The SAT College-Ready Benchmark is a combined score of 1550 or higher. Students hitting this benchmark have a 65 percent chance of earning a B-minus or better grade point average in their freshman-year courses.
  • Minority students are less likely to be college-ready.
    • Just 15.8 percent of black students and 23.4 percent of Hispanic students were college-ready, according to the SAT’s Benchmark.

Core Course Rigor

  • Three-quarters of SAT test-takers completed the recommended “core” college-preparatory curriculum, which is an increase from 70 percent in 2001.

Test Takers

  • Just over 1.67 million students from the Class of 2014 took the SAT at some point during high school, a 4 percent increase from 2013.
  • More minority students are taking the SAT.
    • Nearly half (48 percent) of test takers were minorities in 2014 compared to 46 percent just a year earlier.

 

Advanced Placement (AP)

  • In 2014, 22 percent of the nation’s 11th- and 12th-graders took at least one AP exam, nearly double the rate from a decade ago, when 12 percent took one.
  • Even as more students took AP exams, passing rates improved as well. In 2004, just 8 percent of 11th- and 12th-graders passed an AP exam; that rate increased to 13 percent in 2014.
  • Hispanic students (19 percent) are taking AP courses at nearly the same rate as the overall national average (22 percent), yet black (13 percent) and Native American (12 percent) students are still less likely to take AP.
  • According to the College Board’s PSAT/NMSQT results, nearly 40 percent of PSAT/NMSQT test-takers had the potential to succeed in an AP course but never took an AP exam. However, such students may have taken other college-level courses, such as International Baccalaureate or Honors programs.





August 20, 2014

ACT scores improved while college readiness flattened

According to ACT’s The Condition of College & Career Readiness 2014 report released today, after several years of flat overall ACT scores, scores dipped by two-tenths of a point between 2012 and 2013. This was likely due, at least in part, to the fact that ACT began including students who required accommodations to take the test, such as extra time. Such students, on average, score lower, so their inclusion may have pulled down last year’s results. However, the Class of 2014 took back some of those losses, posting a gain of one-tenth of a point while still including all test takers.

While overall scores improved in 2014, the percentage of students meeting ACT’s college readiness benchmarks remained flat after posting gains over the past several years. There were, however, some differences by subject: more 2014 graduates met the college readiness benchmark in science than in 2013, while fewer met the benchmark in math.

More positive results were found at the state level, where all eight states that have administered the ACT to all students for multiple years as part of their statewide assessment systems (Colorado, Illinois, Kentucky, Michigan, North Carolina, North Dakota, Tennessee, and Wyoming) scored higher in 2014 than in 2013. In fact, a handful of these states made fairly dramatic gains in just the past year.

On the surface, the results don’t show much change in how prepared our graduates are for life after high school. Overall scores increased while there was no change in how many graduates were deemed college-ready. Keep in mind that ACT scores change very little from year to year so it will take several years to determine if these results are the start of a trend or not.

What is clear is that overall scores and college readiness results have not suffered, even as we’ve seen a record number of students graduate from high school on time, and seen a dramatic increase in the number of students taking the ACT test and advancing to college. Of course, there is room for improvement but these results show that our nation’s high schools are indeed preparing more students for college than ever before.– Jim Hull

 

Key findings below

State Scores

  • Of the 33 states where at least 40 percent of graduates took the ACT:
    • Minnesota once again achieved the highest composite score with 22.9.
      • However, just 76 percent of Minnesota’s 2014 graduates took the ACT.
    • Graduates from Hawaii posted the lowest scores among states with a score of 18.2.
  • Of the 12 states where 100 percent of graduates took the ACT:
    • Utah had the highest score at 20.8, followed by Illinois (20.7) and Colorado (20.6).
    • North Carolina (18.9), Mississippi (19.0), and Louisiana (19.2) had the lowest scores out of this group.
    • Three states (Wyoming, Tennessee, and Kentucky) improved their scores by three-tenths of a point over the past year while Colorado, Michigan, and North Carolina improved their scores by two-tenths of a point.
      • Louisiana saw their scores drop by three-tenths of a point over the past year.

National Scores

  • The nation’s graduating Class of 2014 had an average composite score of 21.0, a one-tenth of a point increase from 2013. Scores had decreased by two-tenths of a point between 2012 and 2013, likely because ACT included scores from students who received special accommodations, such as extra time, for the first time in 2013. Such students typically score lower than those who do not receive accommodations.
    • At this score, an average high school graduate has about a 75 percent chance of getting admitted into a good college.*
  • Scores increased by two-tenths of a point in reading (21.3) and by one-tenth of a point in English (20.3) and science (20.8) between 2013 and 2014, while scores on the math test held at 20.9.
  • Scores for black and white students improved.
    • White graduates increased their scores by one-tenth of a point between 2013 and 2014 (22.2 to 22.3), although it was still a tenth of a point below their 2012 score.
    • The average black graduate score improved from 16.9 to 17.0 over the past year as well.
    • As for Hispanic graduates, their scores remained at 18.8 just as in 2013.

College Readiness

  • Twenty-six percent of 2014 high school graduates were college-ready in all four ACT subject tests (English, reading, math, and science), which is the same as in 2013 but a three percentage point increase since 2009.
    • Graduates who achieve these benchmarks are ready to succeed in first-year, credit-bearing college courses in the specific subjects ACT tests, according to ACT research. “Success” is defined as a 75% likelihood of earning a ‘C’ or better in the relevant course.
  • Little change in college readiness by subject.
    • The percentage of graduates reaching ACT’s college-ready benchmark in science increased by one percentage point from 2013 to 2014.
    • In math, the percentage of graduates deemed college-ready decreased by one percentage point.
    • In English and reading, there was no change in the percentage of graduates who were college-ready.

Core Course Rigor

  • Graduates who completed ACT’s recommended core curriculum were much more likely to be college-ready.
    • Two-thirds (67 percent) of graduates who completed at least four years of English courses were college-ready in English compared to 36 percent of those who did not. In reading, 46 percent of graduates who completed at least four years of English courses met ACT’s college-ready benchmarks for reading compared to 32 percent who did not.
    • There was a much greater disparity when it came to math and science.
      • For graduates who completed three or more years of math, nearly half (46 percent) were college-ready in math, compared to just 8 percent of those who did not.
      • For graduates who completed three or more years of science, nearly 41 percent were college-ready in science, compared to just 8 percent of those who did not.

Test Takers

  • About 57 percent of all 2014 high school graduates took the ACT, compared to 54 percent in 2013 and 45 percent in 2009.
  • More minority graduates are taking the ACT.
    • In 2014, nearly 28 percent of ACT test-takers were Hispanic or black, compared to 24 percent in 2010.
    • Furthermore, the percentage of test-takers who were white decreased between 2010 and 2014, from 62 percent to 56 percent.

For more information on how to use college entrance exam scores to evaluate your school, check out the Center’s Data First Web site.

* Data based on calculations from the Center for Public Education’s Chasing the College Acceptance Letter: Is it harder to get into college





May 7, 2014

U.S. 12th-graders make small gains on national assessment

Today, the National Center for Education Statistics (NCES) released the results of the 2013 National Assessment of Educational Progress (NAEP) in mathematics and reading for our nation’s 12th graders.  While the nation as a whole has seen significant improvements at the 4th and 8th grade levels, the same improvement has yet to show up at the end of high school. In neither math nor reading did scores significantly change from 2009—the last time 12th grade NAEP was administered. However, scores in math are higher than they were in 2005—the furthest back math scores can be compared. On the other hand, reading scores have remained relatively unchanged over the past decade and were slightly lower than in 1992—the first year the reading assessment was administered.

It is important to keep in mind that results for our 12th-graders depend on how many students remained in school. Unlike the 4th and 8th grades, where students are required to be in school, most 12th-graders have the option of dropping out. When high schools retain a larger proportion of students, that can affect the results. This may indeed be the case, as it was reported last week that our national graduation rate is at an all-time high of 80 percent, a significant improvement since 2006. So it is possible that scores would have been higher if graduation rates had remained near 70 percent, as they were for most of the 1990s and early 2000s.

Yet higher graduation rates can’t fully explain why 12th-grade scores have essentially flat-lined while they have accelerated in earlier grades, because scores have not changed much for most student groups. The exception is math, where Black, Hispanic, and Asian/Pacific Islander students made significant gains from 2005 to 2013 (5, 7, and 10 points, respectively), although none of that increase came after 2009. Most scores were relatively unchanged regardless of whether groups were defined by parents’ highest education level, gender, or performance level.

What is clear is that students who took more rigorous courses achieved the highest scores. Students who took calculus scored the equivalent of nearly four more years’ worth of learning than students whose highest math course was Algebra II or trigonometry, and nearly seven more years’ worth than students who never completed a course beyond Algebra I. In reading, students who say they discuss reading interpretations nearly every day scored the equivalent of nearly two years’ worth of learning above students who rarely discuss reading interpretations.

Last week’s news about our historic graduation rate is certainly worth celebrating. Schools have also made strides at enrolling more students in high-level courses. But today’s NAEP results show that much more work still needs to be done. Simply earning a high school diploma is not enough. Students need to succeed in rigorous courses in high school to gain the knowledge and skills needed for the 21st century labor market.– Jim Hull

 






May 6, 2014

More teachers think ‘just the right amount’ of time is spent testing


A recently released study from the Northwest Evaluation Association finds that both teachers and administrators view the amount of time spent on testing more favorably now than they did two years ago.  The Make Assessments Matter report found that compared to 2011, increasing numbers of teachers and administrators believe that “just the right amount” of time is spent on assessments. While the majority of teachers still think that too much time is spent on testing, there was a notable increase in the number of teachers who think the amount of time spent on assessments is appropriate.

In 2011, 28 percent of teachers thought students spent “just the right amount” of time preparing for and taking tests; by 2013 this number had increased to 38 percent. When it comes to how much of their own time they have to invest in assessments, 36 percent of teachers in 2011 believed they spend “just the right amount” of time preparing for and administering assessments to their students. By 2013, 42 percent of teachers surveyed believed the amount of time they spent preparing for and administering assessments was appropriate.

District administrators’ favorable views on the amount of time students spend testing increased even more than teachers’ over the same period. While in 2011 only 29 percent of administrators believed students were spending “just the right amount” of time on testing, by 2013, 48 percent believed the amount of time spent on testing was suitable. The increase regarding the time teachers spend on testing was slightly smaller for administrators, rising from 31 percent viewing it as “just right” in 2011 to 42 percent in 2013.

The study also reports on student experiences with assessments. Somewhat surprisingly, more than 90 percent of students agreed that assessments are either “very important” or “somewhat important” for a variety of purposes, including helping their teachers chart their progress, understanding what they’re learning, helping them get into college, and knowing whether they will be promoted to the next grade.

This seems to be a good start, but there is still work to be done, as 53 percent of teachers and 40 percent of administrators still think that students spend too much time preparing for and taking assessments. However, considering the political polarization and public scrutiny that has been following the early implementation and field testing of the Common Core State Standards, it appears that many teachers and administrators are actually happier with the current level of testing than they were two years ago. -Patricia Campbell

Filed under: Assessments,teachers,Testing — Patricia Campbell @ 3:58 pm




