

The EDifier

August 20, 2014

ACT scores improved while college readiness flattened

According to ACT’s The Condition of College & Career Readiness 2014 report released today, after several years of overall ACT scores remaining flat, scores dipped by two-tenths of a point between 2012 and 2013. This was likely due, at least in part, to the fact that ACT began including students who required accommodations to take the test, such as extra time. Such students, on average, typically score lower, so their inclusion may have pulled down last year’s results. However, the Class of 2014 regained some of that ground, posting a gain of one-tenth of a point while still including all test takers.

Unlike overall scores, which improved in 2014, the percentage of students meeting ACT’s college readiness benchmarks remained flat after posting gains over the past several years. There were, however, some differences by subject area: more 2014 graduates met the college readiness benchmark in science than in 2013, while fewer met the benchmark in math.

More positive results were found at the state level, where all eight states that have administered the ACT to all students for multiple years as part of their statewide assessment systems (Colorado, Illinois, Kentucky, Michigan, North Carolina, North Dakota, Tennessee, and Wyoming) scored higher in 2014 than in 2013. In fact, a handful of these states made fairly dramatic gains in just the past year.

On the surface, the results don’t show much change in how prepared our graduates are for life after high school: overall scores increased while there was no change in how many graduates were deemed college-ready. Keep in mind that ACT scores change very little from year to year, so it will take several years to determine whether these results are the start of a trend.

What is clear is that overall scores and college readiness results have not suffered, even as we’ve seen a record number of students graduate from high school on time and a dramatic increase in the number of students taking the ACT and advancing to college. Of course, there is room for improvement, but these results show that our nation’s high schools are indeed preparing more students for college than ever before. – Jim Hull


Key findings below

State Scores

  • Of the 33 states where at least 40 percent of graduates took the ACT:
    • Minnesota once again achieved the highest composite score with 22.9.
      • However, just 76 percent of Minnesota’s 2014 graduates took the ACT.
    • Graduates from Hawaii posted the lowest scores among states with a score of 18.2.
  • Of the 12 states where 100 percent of graduates took the ACT:
    • Utah had the highest score at 20.8, followed by Illinois (20.7) and Colorado (20.6).
    • North Carolina (18.9), Mississippi (19.0), and Louisiana (19.2) had the lowest scores out of this group.
    • Three states (Wyoming, Tennessee, and Kentucky) improved their scores by three-tenths of a point over the past year while Colorado, Michigan, and North Carolina improved their scores by two-tenths of a point.
      • Louisiana saw their scores drop by three-tenths of a point over the past year.

National Scores

  • The nation’s graduating Class of 2014 had an average composite score of 21.0, a one-tenth of a point increase from 2013. Scores had decreased by two-tenths of a point between 2012 and 2013, likely because 2013 was the first year ACT included scores from students who received special accommodations, such as extra time. Such students typically perform lower than those who do not receive accommodations.
    • At this score, an average high school graduate has about a 75 percent chance of getting admitted into a good college.*
  • Scores increased by two-tenths of a point in reading (21.3) and by one-tenth of a point in English (20.3) and science (20.8) between 2013 and 2014, while scores on the math test remained at 20.9.
  • Scores for black and white students improved.
    • White graduates increased their scores by one-tenth of a point between 2013 and 2014 (22.2 to 22.3), although it was still a tenth of a point below their 2012 score.
    • The average black graduate score improved from 16.9 to 17.0 over the past year as well.
    • As for Hispanic graduates, their scores remained at 18.8 just as in 2013.

College Readiness

  • Twenty-six percent of 2014 high school graduates were college-ready in all four ACT subject tests (English, reading, math, and science), which is the same as in 2013 but a three percentage point increase since 2009.
    • Graduates who achieve these benchmarks are ready to succeed in first-year, credit-bearing college courses in the specific subjects ACT tests, according to ACT research. “Success” is defined as a 75% likelihood of earning a ‘C’ or better in the relevant course.
  • Little change in college readiness by subject.
    • The percentage of graduates reaching ACT’s college-ready benchmark in science increased by one percentage point from 2013 to 2014.
    • In math, the percentage of graduates deemed college-ready decreased by one percentage point.
    • In English and reading, there was no change in the percentage of graduates who were college-ready.

Core Course Rigor

  • Graduates who completed ACT’s recommended core curriculum were much more likely to be college-ready.
    • Two-thirds (67 percent) of graduates who completed at least four years of English courses were college-ready in English compared to 36 percent of those who did not. In reading, 46 percent of graduates who completed at least four years of English courses met ACT’s college-ready benchmarks for reading compared to 32 percent who did not.
    • There was a much greater disparity when it came to math and science.
      • Of those graduates who completed three or more years’ worth of math, nearly half (46 percent) were college-ready in math, compared to just eight percent of those who did not.
      • Of those graduates who completed three or more years’ worth of science, 41 percent were college-ready in science, compared to just eight percent of those who did not.

Test Takers

  • About 57 percent of all 2014 high school graduates took the ACT, compared to 54 percent in 2013 and 45 percent in 2009.
  • More minority graduates are taking the ACT.
    • In 2014, nearly 28 percent of ACT test-takers were Hispanic or black, compared to 24 percent in 2010.
    • Furthermore, the percentage of test-takers who were white decreased between 2010 and 2014, from 62 percent to 56 percent.

For more information on how to use college entrance exam scores to evaluate your school, check out the Center’s Data First Web site.

* Data based on calculations from the Center for Public Education’s Chasing the College Acceptance Letter: Is it harder to get into college





May 7, 2014

U.S. 12th-graders make small gains on national assessment

Today, the National Center for Education Statistics (NCES) released the results of the 2013 National Assessment of Educational Progress (NAEP) in mathematics and reading for our nation’s 12th graders.  While the nation as a whole has seen significant improvements at the 4th and 8th grade levels, the same improvement has yet to show up at the end of high school. In neither math nor reading did scores significantly change from 2009—the last time 12th grade NAEP was administered. However, scores in math are higher than they were in 2005—the furthest back math scores can be compared. On the other hand, reading scores have remained relatively unchanged over the past decade and were slightly lower than in 1992—the first year the reading assessment was administered.

It is important to keep in mind that results for our 12th graders depend on how many students remained in school. Unlike at the 4th and 8th grades, where students are required to be in school, at the 12th grade level most students have the option of dropping out. When our high schools retain a larger proportion of students, it could affect the results. This may indeed be the case, as it was reported last week that our national graduation rate is at an all-time high of 80 percent, a significant improvement since 2006. So it is possible that scores would have been higher if graduation rates had remained near 70 percent, as they were for most of the 1990s and early 2000s.

Yet higher graduation rates can’t fully explain why scores at the 12th grade have essentially flat-lined while they have accelerated in earlier grades, because scores have not changed much for most student groups. The exception is math, where Black, Hispanic, and Asian/Pacific Islander students made significant gains from 2005 to 2013 (5, 7, and 10 points, respectively), although none of that increase came after 2009. Most scores were relatively unchanged whether students were grouped by parents’ highest level of education, gender, or performance level.

What is clear is that students who took more rigorous courses achieved the highest scores. Students who took calculus scored the equivalent of nearly four more years’ worth of learning than students whose highest math course was Algebra II or trigonometry, and nearly seven more years’ worth than students who never completed a course beyond Algebra I. In reading, students who reported discussing reading interpretations nearly every day achieved the equivalent of nearly two years’ worth of learning over students who rarely did so.

Last week’s news about our historic graduation rate is certainly worth celebrating. Schools have also made strides in enrolling more students in high-level courses. But today’s NAEP results show that much more work still needs to be done. Simply earning a high school diploma is not enough. Students need to succeed in rigorous courses in high school to gain the knowledge and skills needed for the 21st century labor market. – Jim Hull







May 6, 2014

More teachers think ‘just the right amount’ of time is spent testing


A recently released study from the Northwest Evaluation Association finds that both teachers and administrators view the amount of time spent on testing more favorably now than they did two years ago.  The Make Assessments Matter report found that compared to 2011, increasing numbers of teachers and administrators believe that “just the right amount” of time is spent on assessments. While the majority of teachers still think that too much time is spent on testing, there was a notable increase in the number of teachers who think the amount of time spent on assessments is appropriate.

In 2011, 28 percent of teachers thought students spent “just the right amount” of time preparing for and taking tests; by 2013 this number had increased to 38 percent. When it comes to how much of their own time they have to invest in assessments, 36 percent of teachers in 2011 believed they spend “just the right amount” of time preparing for and administering assessments to their students. By 2013, 42 percent of teachers surveyed believed the amount of time they spent preparing for and administering assessments was appropriate.

District administrators’ favorable views on the amount of time spent testing increased even more than teachers’ over the same period. While in 2011 only 29 percent of administrators believed students were spending “just the right amount” of time on testing, by 2013, 48 percent believed the amount of time spent on testing was suitable. The increase regarding the amount of time teachers spend on testing was slightly smaller among administrators, with 31 percent viewing it as “just right” in 2011, rising to 42 percent in 2013.

The study also reports on student experiences with assessments. Somewhat surprisingly, more than 90 percent of students agreed that assessments are either “very important” or “somewhat important” for a variety of purposes, including helping their teachers chart their progress, understanding what they’re learning, helping them get into college, and knowing whether they will be promoted to the next grade.

This seems to be a good start, but there is still work to be done, as 53 percent of teachers and 40 percent of administrators still think that students spend too much time preparing for and taking assessments. However, considering the political polarization and public scrutiny that has been following the early implementation and field testing of the Common Core State Standards, it appears that many teachers and administrators are actually happier with the current level of testing than they were two years ago. -Patricia Campbell

Filed under: Assessments,teachers,Testing — Patricia Campbell @ 3:58 pm





April 3, 2014

U.S. students score well on first PISA problem solving exam

Earlier this week, the Organization for Economic Cooperation and Development (OECD) released a report on the first-ever Program for International Student Assessment (PISA) problem solving exam. U.S. 15-year-olds who took the exam scored above average, but significantly lower than students in 10 of the other 44 countries and economies participating in the exam. Students in the U.S. performed on par with 15-year-olds in England, France, the Netherlands, Italy, Germany, and Norway, but still lagged behind highest-scoring Singapore and Korea, as well as students in several other nations.

American students’ scores on the problem solving portion of the PISA exam were quite a bit higher than their scores on the math, reading, and science portions of the exam, possibly implying that students in the U.S. are better at applying what they’ve learned to real life situations than they are at performing strictly academic tasks. That said, the nations that excelled on the problem solving portion of the exam, such as Korea and Singapore, also do well on the traditional academic sections of the PISA exam.

Right now these findings don’t mean much, as this is the first time a problem solving portion of the test has been administered and the sample size was very small (fewer than 1,300 students in the U.S. took the problem solving exam). However, it is good to see assessments moving beyond only measuring math and reading proficiency and at least attempting to measure the deeper learning skills needed to solve real-world problems. At this point we can’t say how well an assessment like this truly predicts problem solving ability, but it seems like a positive development that PISA is acknowledging that problem solving skills will be important for many of these students in their future jobs. At the very least, these results provide an interesting cross-sectional picture of problem solving skills throughout the world. If you’d like to try your hand at some of the problems, sample questions from the 2012 PISA problem solving exam can be found here and here.

Filed under: Assessments,International Comparisons — Patricia Campbell @ 4:08 pm





March 31, 2014

Common Core standards undergo field testing

Field testing for the Common Core-aligned Smarter Balanced Assessment Consortium (SBAC) and Partnership for Assessment of Readiness for College and Careers (PARCC) tests is now underway in many states and will continue to be implemented in others over the course of the next several weeks. More than four million students in grades 3-11 will be helping field test the new math and English language arts assessments. These field tests are an opportunity for students to experience the testing environment and get a sense of what they will be expected to know and be able to do for future Common Core-aligned assessments, but with no stakes attached at this time.

Conducting field tests a full year before these assessments will be used for any type of evaluation helps ensure that the new Common Core-aligned assessments are reliable, valid, and fair for all students taking them, and gives SBAC and PARCC time to address both content and structural issues that might pop up during field testing. This trial run also gives teachers and schools a chance to practice administering the test and an opportunity to work out any technical or procedural problems before the assessments begin next year. Additionally, the field tests will introduce students to a type of assessment that is different from what many are used to: one that emphasizes critical thinking and problem solving and focuses less on memorization and simply filling in the correct bubble.

Of course, these assessments are not perfect, but that is all the more reason this “test of the test” is important. Much of the controversy surrounding the implementation of the Common Core State Standards has involved how assessment would work and whether it would differ from the testing most states require now. This trial run allows thousands of teachers and millions of students across the country to become accustomed to the new system in what is essentially a “no stakes” testing environment. Will there be glitches? Of course, but the field tests allow time for these issues to be worked out before the actual test is administered next year. It might be an imperfect solution, but it is certainly a step in the right direction to ensure that CCSS-aligned assessments are the best they can be before they are administered in a high stakes environment.

The Alliance for Excellent Education has a helpful Common Core Field Test Q&A available with more information. For more on the Common Core State Standards, visit CPE’s Common Core Resource page.

-Patricia Campbell





