Today, the National Center for Education Statistics (NCES) released the results of the 2013 National Assessment of Educational Progress (NAEP) in mathematics and reading for our nation’s 12th graders. While the nation as a whole has seen significant improvements at the 4th and 8th grade levels, the same improvement has yet to show up at the end of high school. In neither math nor reading did scores significantly change from 2009—the last time 12th grade NAEP was administered. However, scores in math are higher than they were in 2005—the earliest year to which current math results can be compared. On the other hand, reading scores have remained relatively unchanged over the past decade and were slightly lower than in 1992—the first year the reading assessment was administered.
It is important to keep in mind that results for our 12th graders depend on how many students remain in school. Unlike at the 4th and 8th grades, where students are required to attend, most 12th graders have the option of dropping out. When high schools retain a larger proportion of students, the results can be affected. This may indeed be the case, as it was reported last week that our national graduation rate is at an all-time high of 80 percent—a significant improvement since 2006. So it is possible that scores would have been higher if graduation rates had remained near 70 percent, as they were for most of the 1990s and early 2000s.
Yet higher graduation rates can’t fully explain why scores at the 12th grade have essentially flat-lined while they have accelerated in earlier grades, because scores have not changed much for most student groups. The exception is math, where Black, Hispanic, and Asian/Pacific Islander students made significant gains from 2005 to 2013 (5, 7, and 10 points, respectively), although none of that increase came after 2009. Most scores were relatively unchanged whether groups were defined by parents’ highest education level, by gender, or by high or low performance.
What is clear is that students who took more rigorous courses achieved the highest scores. Students who took Calculus scored the equivalent of nearly 4 more years’ worth of learning than students whose highest math course was Algebra II or Trigonometry, and nearly 7 more years’ worth than students who never completed a course beyond Algebra I. In reading, students who say they discuss reading interpretations nearly every day achieved the equivalent of nearly two years’ worth of learning over students who rarely do so.
Last week’s news about our historic graduation rate is certainly worth celebrating. Schools have also made strides at enrolling more students in high-level courses. But today’s NAEP results show that much more work still needs to be done. Simply earning a high school diploma is not enough. Students need to succeed in rigorous courses in high school to gain the knowledge and skills needed for the 21st century labor market. – Jim Hull
A recently released study from the Northwest Evaluation Association finds that both teachers and administrators view the amount of time spent on testing more favorably now than they did two years ago. The Make Assessments Matter report found that compared to 2011, increasing numbers of teachers and administrators believe that “just the right amount” of time is spent on assessments. While the majority of teachers still think that too much time is spent on testing, there was a notable increase in the number of teachers who think the amount of time spent on assessments is appropriate.
In 2011, 28 percent of teachers thought students spent “just the right amount” of time preparing for and taking tests; by 2013 this number had increased to 38 percent. When it comes to how much of their own time they have to invest in assessments, 36 percent of teachers in 2011 believed they spend “just the right amount” of time preparing for and administering assessments to their students. By 2013, 42 percent of teachers surveyed believed the amount of time they spent preparing for and administering assessments was appropriate.
District administrators’ favorable views on the amount of time spent testing increased even more than those of teachers over the same period. While in 2011 only 29 percent of administrators believed students were spending “just the right amount” of time on testing, in 2013, 48 percent believed the amount of time spent on testing was suitable. The increase regarding the amount of time teachers spend on testing was slightly smaller among administrators, with 31 percent viewing it as “just right” in 2011, rising to 42 percent in 2013.
The study also reports on student experiences with assessments. Somewhat surprisingly, more than 90 percent of students agreed that assessments are either “very important” or “somewhat important” for a variety of purposes, including helping their teachers chart their progress, understanding what they’re learning, helping them get into college, and knowing whether they will be promoted to the next grade.
This seems to be a good start, but there is still work to be done, as 53 percent of teachers and 40 percent of administrators still think that students spend too much time preparing for and taking assessments. However, considering the political polarization and public scrutiny that have accompanied the early implementation and field testing of the Common Core State Standards, it appears that many teachers and administrators are actually happier with the current level of testing than they were two years ago. – Patricia Campbell
Earlier this week, the Organization for Economic Cooperation and Development (OECD) released a report on the first-ever Program for International Student Assessment (PISA) problem solving exam. 15-year-olds in the U.S. who took the exam scored above average, but significantly below students in 10 of the other 44 participating countries and economies. Students in the U.S. performed on par with 15-year-olds in England, France, the Netherlands, Italy, Germany, and Norway, but still lagged behind highest-scoring Singapore and Korea, as well as students in several other nations.
American students’ scores on the problem solving portion of the PISA exam were quite a bit higher than their scores on the math, reading, and science portions of the exam, possibly implying that students in the U.S. are better at applying what they’ve learned to real life situations than they are at performing strictly academic tasks. That said, the nations that excelled on the problem solving portion of the exam, such as Korea and Singapore, also do well on the traditional academic sections of the PISA exam.
Right now these findings don’t mean much, as this is the first time a problem solving portion of the test has been administered and the sample size was very small (fewer than 1,300 students in the U.S. took the problem solving exam). However, it is good to see assessments moving beyond only measuring math and reading proficiency and at least attempting to measure the deeper learning skills that are also needed to solve real-world problems. At this point we can’t say how well an assessment like this truly predicts problem-solving ability, but it seems like a positive development that PISA is acknowledging that problem solving skills will be important for many of these students in their future jobs. At the very least, these results provide an interesting cross-sectional picture of problem solving skills throughout the world. If you’d like to try your hand at some of the problems, sample questions from the 2012 PISA problem solving exam can be found here and here.
Field testing for the Common Core-aligned Smarter Balanced Assessment Consortium (SBAC) and Partnership for Assessment of Readiness for College and Careers (PARCC) tests is now underway in many states and will continue to be implemented in others over the course of the next several weeks. More than four million students in grades 3-11 will be helping field test the new math and English language arts assessments. These field tests are an opportunity for students to experience the testing environment and get a sense of what they will be expected to know and be able to do for future Common Core-aligned assessments, but with no stakes attached at this time.
Conducting field tests an entire year before these assessments will be used for any type of evaluation helps to ensure that the new Common Core-aligned assessments are reliable, valid, and fair for all students taking them, and gives SBAC and PARCC time to address both content and structural issues that might pop up during field testing. This trial run also gives teachers and schools a chance to practice administering the test and an opportunity to work out any technical or procedural problems before the assessments begin next year. Additionally, the field tests will introduce students to a type of assessment that is different from what many are used to: one that emphasizes critical thinking and problem solving and focuses less on memorization and simply filling in the correct bubble.
Of course, these assessments are not perfect, but that is all the more reason this “test of the test” is important. Much of the controversy surrounding the implementation of the Common Core State Standards has involved how assessment would work and whether it would be any different from the testing most states require now. This trial run allows thousands of teachers and millions of students across the country to become accustomed to the new system in what is essentially a “no stakes” testing environment. Will there be glitches? Of course – but the field tests allow time for these issues to be worked out before the actual test is administered next year. It might be an imperfect solution, but it is certainly a step in the right direction to ensure that CCSS-aligned assessments are the best they can be before they are administered in a high stakes environment.
The Alliance for Excellent Education has a helpful Common Core Field Test Q&A available with more information. For more on the Common Core State Standards, visit CPE’s Common Core Resource page.
Big changes coming for the SAT – What do they really mean?
On Wednesday, the College Board announced a major overhaul of the SAT in what will be the second revision of the college entrance exam in less than ten years. Substantial changes include:
- The test will again be scored out of 1600, and the penalty for guessing will be eliminated
- Some of the more obscure vocabulary words are being thrown out and replaced with words that are commonly used in the academic and professional worlds
- The essay portion of the test will now be optional and source-based, and students choosing to complete it will have 50 minutes, rather than 25, to do so
- Math questions will focus on three main areas: problem solving and data analysis, algebra, and real-world math related to the design, technology, and engineering fields
Perhaps the most substantial change is that the new test will be closely aligned with what high schools are teaching. It will require students to analyze nonfiction texts, build an argument using evidence, and apply math concepts to real life situations—all skills that are emphasized in the Common Core State Standards. The alignment between the new SAT and the CCSS is not surprising, as David Coleman, a key architect of the Common Core, now serves as President of the College Board. The goal of the redesign was to create an SAT that is more transparent, focused, and closely tied to the work students do in school every day. The College Board believes the test should move toward evidence-based thinking, reinforce the skills students should have already learned in high school, and move away from the test taking tips, tricks, and strategies that make the test prep industry so profitable and allow affluent students whose families can afford expensive tutors and intense coaching to “game” the SAT. The College Board is also partnering with Khan Academy to offer free online test preparation materials in an attempt to level the playing field for SAT-takers and curb exorbitant spending on test prep.
While the College Board’s goal of reducing inequality is certainly admirable, we have to ask – how much will these changes really matter? The SAT is becoming less and less relevant in college admissions decisions now that over 800 colleges and universities have “test optional” admissions policies. Even among students who are still required to submit test scores for college admissions, the SAT is declining in popularity. For the last two years more students have chosen the ACT over the SAT for their college admissions test (although this could change now that both tests focus on what students have learned in school). I am also relatively unconvinced that changing the test will rein in the culture of test prep hysteria among parents. This new SAT might be more difficult to “teach to” but that’s not going to stop affluent parents from purchasing every book, tutor, or service that might help their children gain an edge. Changing the test is not going to kill the test prep industry, as the College Board seems to hope it might.
The bright spot seems to be that the test is moving toward alignment with what students are actually learning in school. Since high school grades are routinely given more weight in college admissions, it just makes sense to test students on material that matches up with what they have learned, rather than arcane words they may never see again after SAT day. This realignment and the availability of free online prep materials are steps in the right direction, even if they don’t substantially change the culture of college test preparation. – Patricia Campbell