PARCC test results lower for computer-based tests

In school year 2014-2015, students took the Partnership for Assessment of Readiness for College and Careers (PARCC) exam on a pilot basis. The PARCC exam was created to align with the Common Core Standards and is among the few standardized assessments of how well school districts are teaching higher-level competencies.

On February 3, Education Week reported that results for students who took the computer-based version of the exam were significantly lower than results for students who took the traditional pencil-and-paper version. While the article states that the PARCC organization does not have a clear answer as to why this occurred, I will offer my own explanation based on my experience as a teacher of students who took this exam last year.

I taught high school history, and the largest discrepancy in results between students who took the computer versus the paper exam was at the high school level. Here is my theory for that discrepancy. Throughout students’ academic careers, we teachers teach them to “mark up” the text. This means that as they read books, articles, poems, primary sources, etc., students should have a pen or pencil and a highlighter in hand. There are many acronyms for how students should mark up a text. One is HACC: Highlight, Annotate, Circle unknown words, Comment. There are many others, but the idea is the same. Students are taught to summarize each paragraph in the margins and make note of key words. This helps students stay engaged with the reading, find main ideas, and think critically about what they are reading. It also makes it easier to go back and skim the text for the main ideas, and to remember what they read without re-reading.

Generally, students are required to mark up and annotate the text in this way, but, honestly, I still do this! And I would bet that many adults do too. Many people who need to read a long article for work print it out and read it with a pen in hand. It makes it easier to focus on what you are reading. Now imagine that someone is going to test you on that article. You will be even more careful to read it closely and write notes for yourself in the margins.

The point is that students are taught to do this when reading, especially when reading passages for exams with questions based on the passage. My own students had this drilled into them throughout their high school years. Sometime last year, we teachers learned that our school would be giving the pilot version of the PARCC exam to our students. During a teacher professional development day, we were asked to go to the PARCC website, learn about the test, and take a practice exam. I encourage you to go online and take it for yourself; this exam is hard! We were asked to analyze the questions and think about ways to change our own in-class exams to better align with PARCC. We were told that it would soon replace our state’s standardized exam.

One of the first things we all noticed was how long the reading passages are on the ELA portion of the test. It took a long time to read through them, and we all struggled to read them on a computer screen. I really wanted a printed version so I could write down my notes! The passages were long and detailed, and I felt that by the time I reached the questions I would have to re-read the whole passage to find the answer (or find the section where I could infer an answer). I knew the students would struggle with this, and I anticipated lower scores on this exam than on the state test. I was thankful that their scores wouldn’t actually count this year. But what happens when this becomes a high-stakes test?

As I anticipated, scores for students who took the computer-based exam were far lower than for those who took the traditional paper test. The Illinois State Board of Education found that, across all grades, 50% of students scored proficient on the paper-based PARCC exam, compared to only 32% of students who took the exam online. In Baltimore County, students who took the paper test scored almost 14 points higher than students of similar demographics who took the test on a computer.

The low scores themselves are a different story. Organizations will need to analyze the results of this major pilot and determine its validity. Students and teachers, if the test becomes mandatory, will have to adjust to the standards and testing format associated with it. The bigger story is that significant hardships come with taking a computer-based test.

My main concern is the reading passages. I don’t believe teachers should abandon the “mark it up” technique to accommodate computer-based testing, because learning how to annotate a text is valuable throughout people’s lives. I watched the students struggle to stare at the computer screen and focus on the words. Many used a finger on the screen to follow along with what they were reading. It was clearly frustrating for them not to be able to underline and make notes as they were used to doing.

Other concerns stem from the test being online. It requires internet access, enough computers for all students to test on, and students and teachers who are technologically savvy. When my school gave the test, it took several days, a lot of scheduling, and considerable disruption to get all students through it, given our limited number of computers. Certain rooms in the building have less reliable internet connections than others, and some students lost their connection while testing. Sometimes the system didn’t accept a student’s login or wouldn’t advance to the next page. There were no PARCC IT professionals in the building to fix these issues; instead, teachers who didn’t know the system any better than the students tried to help.

Not all students were ultimately able to take or finish the exam because of these issues. Thankfully, their results didn’t count toward graduation! There are also equity concerns between students who are familiar with computers and typing and those who have had little exposure to technology. As a teacher in an urban school, I can tell you that it was not uncommon to see students typing essays on their phones because they didn’t have a computer.

As a whole, I’m not surprised by the discrepancy in test scores, and I imagine that other teachers are not either. The Education Week article quotes PARCC’s Chief of Assessment as saying, “There is some evidence that, in part, the [score] differences we’re seeing may be explained by students’ familiarity with the computer-delivery system.” This vague statement only scratches the surface. I encourage those analyzing the cause of the discrepancy to talk to teachers and students. Also, ask yourselves how well you would do taking an exam entirely online, particularly one with long reading passages. –Breanna Higgins
