

The EDifier

December 7, 2016

PISA scores remain stagnant for U.S. students

The results of the latest PISA, or Program for International Student Assessment, are in, and as usual we have an interpretation of the highlights for you.

If you recall, PISA is designed to assess not just students’ academic knowledge but their ability to apply that knowledge. It is administered to 15-year-olds across the globe every three years by the U.S. Department of Education’s National Center for Education Statistics (NCES) in coordination with the Paris-based Organisation for Economic Co-operation and Development (OECD). Each iteration of PISA has a different focus, and the 2015 version homed in on science, though it also tested math and reading proficiency among the roughly half-million teens who participated in this round. So, how did American students stack up?

In short, our performance was average in reading and science and below average in math compared to the 35 other OECD member countries. Specifically, the U.S. ranked 19th in science, 20th in reading, and 31st in math. But PISA was also administered in countries beyond OECD members, and among that total group of 70 countries and education systems (some regions of China are assessed as separate systems), U.S. teens ranked 25th in science, 22nd in reading, and 40th in math. Since 2012, scores have stayed basically the same in science and reading but dropped 11 points in math.

[Figure: PISA science scores]

Before you get too upset over our less-than-stellar performance, though, there are a few things to take into account.  First, scores overall have fluctuated in all three subjects.  Some of the top performers, such as South Korea and Finland, have seen 20- to 30-point drops in math scores from 2003 to 2015, over the same period in which the U.S. saw a 13-point drop.  Are half of the countries really declining in performance, or could it be a change in the test, or a change in how well the test corresponds with what and how material is taught in schools?

Second, the U.S. has seen a large set of reforms over the last several years, which have disrupted the education system.  As with many systems, a disruption may cause a temporary drop in performance before things eventually stabilize.  Many teachers are still adjusting to teaching the Common Core Standards and/or the Next Generation Science Standards; the 2008 recession caused shocks in funding levels that we are still recovering from; and many school systems received waivers from No Child Left Behind that substantially changed state- and school-level policies.  And, in case you want to blame the Common Core for lower math scores, keep in mind that not all test-takers live in states that have adopted the Common Core, and even those who do may have learned under the new standards for only a year or two.  Andreas Schleicher, who oversees the PISA test for the OECD, predicts that the Common Core Standards will eventually yield positive results for the U.S., but that we must be patient.

Demographics

Student scores are correlated to some degree with student poverty and the concentration of poverty in some schools.  Students from disadvantaged backgrounds are 2.5 times more likely to perform poorly than advantaged students.  U.S. schools where fewer than 25 percent of students are eligible for free or reduced-price lunch (about half of all students nationwide are eligible) would rank 2nd in science, 1st in reading, and 11th in math out of all 70 countries.  At the other end of the spectrum, schools where at least 75 percent of students are eligible for free or reduced-price lunch would rank 44th in science, 42nd in reading, and 47th in math.  Compared only to OECD countries, these high-poverty schools would beat only four countries in science, four in reading, and five in math.

Score differences for different races in the U.S. show similar disparities.

How individual student groups would rank compared to the 70 education systems tested:

             Science   Reading   Math
White        5th       4th       20th
Black        49th      44th      51st
Hispanic     40th      37th      44th
Asian        8th       2nd       20th
Mixed Race   19th      20th      38th

 

Equity

Despite the disparities in opportunity for low-income students, the share of low-income students who performed better than expected has increased by 12 percentage points since 2006, to 32 percent.  The share of score variation attributable to poverty decreased from 17 percent in 2006 to 11 percent in 2015, meaning that poverty became less of a determining factor in how a student performed.
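For readers curious what “share of variation attributable to poverty” means in practice, here is a minimal sketch using invented data (not the actual PISA microdata): regress scores on a socioeconomic index and report the R-squared, i.e., the share of score variance that the index explains. The made-up slope and noise below are chosen only so the result lands near the 11 percent cited above.

```python
import numpy as np

# Invented data for 1,000 students -- illustrative only, not the PISA microdata.
rng = np.random.default_rng(0)
ses = rng.normal(0, 1, 1000)                       # socioeconomic status index
scores = 490 + 25 * ses + rng.normal(0, 70, 1000)  # scores partly driven by SES

# Simple linear regression of scores on the SES index.
slope, intercept = np.polyfit(ses, scores, 1)
predicted = intercept + slope * ses

# R-squared: the share of score variance "attributable" to SES.
ss_res = ((scores - predicted) ** 2).sum()
ss_tot = ((scores - scores.mean()) ** 2).sum()
print(f"Share of variation attributable to SES: {1 - ss_res / ss_tot:.0%}")
```

PISA’s published measure is built on a more elaborate index of student background, but the underlying idea is the same: how much of the spread in scores a measure of disadvantage can account for.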

Funding

America is one of the largest spenders on education, as we should be, given our high per capita income.  Many have argued that we should be outscoring other nations based on our higher spending levels, but the reality is that high levels of childhood poverty and inequitable spending often counteract the amount of money put into the system.  For more on this, see our previous blog post.






November 2, 2016

Thoughts on nuance and variance

As we approach the 2016 general election, I’ve heard public officials, family, and friends make very clear statements about which side of the aisle they support.  Yet I find it hard to believe that the average American falls in line 100% with either political party, or supports every word and tenet of a particular public policy.  We are nuanced people.  Very few issues are as black-and-white as we’d like them to be.  Here’s a guide to things to keep in mind when forming your stance on a particular issue, candidate, or political party, put in the context of educational issues.

  1. Most issues have an “it depends” clause.

With the onslaught of information available today, it makes sense that we want answers that are black-and-white.  The reality, though, is that there’s gray area in most policies and practices.  We also have to balance our ideological values with evidence.  Charter school proponents may believe that free-market values and choice, through vouchers and charter schools, will improve public schools, but I haven’t seen widespread evidence that choice in and of itself actually improves academic achievement or long-term outcomes in significant ways.  Yes, there are individual students who have benefited, but there are also individual students who have lost out.  Charter school opponents claim that taking away publicly elected oversight through school boards is detrimental to the public’s ability to provide free and quality education to all.  Yet the reality is that some public schools have dismal records, and charter or private schools have sometimes had success with the same students.  We have to acknowledge that we all want good things for our kids, and then use the evidence to figure out what that looks like without demonizing the other side.

  2. Most policies rely heavily on the quality of their implementation to be successful.

The Common Core seems to be a prime example of this.  Two-thirds of Americans support some sort of common standards across the country.  Yet barely half of Americans support the Common Core itself.  Support on both questions has dwindled significantly from about 90% in 2012.  Even presidential candidate Hillary Clinton has called the roll-out of the Common Core “disastrous,” despite supporting the standards overall.

[Figure: Trends in public support for common standards and the Common Core]

Source: http://educationnext.org/ten-year-trends-in-public-opinion-from-ednext-poll-2016-survey/

The standards were implemented quickly in many states, often without the curriculum materials or professional development to help teachers succeed in teaching them.  While support for the Common Core seems to be leveling off among teachers, who are most familiar with the standards, several states have repealed or are considering repealing them.  The new state standards written in South Carolina and Indiana are extremely similar to the Common Core, which suggests that it may not be the concept or content that people disagree with so much as how the standards were implemented and the ensuing political backlash.

 

  3. Statistics usually tell us about an average (the typical student), but variance is also important.

Charter schools are a prime example of this.  On average, they have student achievement outcomes similar to those of traditional public schools.  But there are charter schools that outperform their counterparts and schools that woefully underperform.  We have to think about those schools, too.

This is also clear in school segregation.  The average black student in the U.S. attends a school that is 49% black, 28% white, 17% Latino, 4% Asian, and 3% “Other,” but that doesn’t mean that every black student has this experience.  At the edges of the spectrum, 13% of U.S. public schools are over 90% black and Latino, while 33% of schools are less than 10% black and Latino.  To understand the reality, we need to look at the variety of students’ experiences (known in statistics-speak as “variance”), not just the average; a quick numerical sketch after this list illustrates the point.

  4. There’s always room for improvement. “Fixing” a policy may mean making adjustments, not abandoning it altogether.

Student assessments under No Child Left Behind (2001) resulted in a narrowing of the curriculum.  But we also learned more about disadvantaged student groups and have continued closing the achievement gap for students of color.  Should we throw out testing altogether? Some would say yes, but most Americans say no.  Graduation rates, college enrollment, and achievement scores have all increased since NCLB passed in 2001.  What we can do is improve student assessments.  Adjusting consequences for students, teachers, and schools could result in less narrowing of the curriculum and subjects taught.  Using more well-rounded tests that encourage creative and critical thinking would help teachers emphasize these skills in class.  Continued improvement in data use can help teachers and school administrators adjust their practices and policies to see continued student growth.  States have the power to make some of these changes under the new Every Student Succeeds Act without dismantling gains made under No Child Left Behind.
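As promised under point 3, here is a toy numerical sketch with invented school-level scores (not real charter or district data) showing how two groups of schools can share nearly the same average while differing wildly in spread, which is exactly why the average alone can mislead.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented school-level average scores: same mean, very different spread.
sector_a = rng.normal(loc=500, scale=15, size=200)  # tightly clustered schools
sector_b = rng.normal(loc=500, scale=60, size=200)  # same average, wide spread

for name, scores in [("Sector A", sector_a), ("Sector B", sector_b)]:
    print(f"{name}: mean = {scores.mean():.0f}, std dev = {scores.std():.0f}, "
          f"below 450: {(scores < 450).mean():.0%}, "
          f"above 550: {(scores > 550).mean():.0%}")
```

Both sectors report essentially the same average, but only the high-variance one contains a meaningful share of schools far above and far below it, i.e., the outperformers and underperformers we have to think about.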






February 23, 2016

Common Core’s happy days may be here again

Did a relationship ever sour so quickly as the one between the Common Core and public opinion? Back in 2010, when the college- and career-ready standards were shiny and new, leaders from business and higher education, as well as a certain U.S. Secretary of Education, praised their rigor, coherence, and attention to critical thinking. Within a year, 45 states and D.C. had rushed to adopt them as their own, a move a majority of teachers and parents viewed favorably.

Then, implementation happened. Many teachers felt rushed to produce results. Parents couldn’t understand their child’s homework. Their anxiety fed chatter on talk radio and social media that did the incredible: it united anti-corporate progressives and anti-government tea partiers in opposition to the new standards and the assessments that go with them. States once on board with the program began to bail in the face of angry constituents.

Recently, though, the mood appears to be shifting back into neutral. Presidential candidates deliver variations of the “repeal Common Core” line to applause, but the issue doesn’t seem to be gaining much traction in the race. The newly reauthorized ESEA deflates anti-Common Core messaging by explicitly forbidding the federal government from compelling or encouraging state adoption of any set of standards, including the Common Core.  After a flurry of state legislative proposals were introduced to undo the standards, only a handful were ever signed into law, and in some of those states, the replacements aren’t substantively different from the ones they tossed.

New studies related to the Common Core could prompt a wary public to give the standards a second look. In the first, a Harvard research team led by Thomas Kane surveyed a representative sample of teachers and principals in five Common Core states about implementation strategies. They were then able to match responses to student performance on the Core-aligned assessments, PARCC and Smarter Balanced.

According to their report, Teaching Higher: Educators’ perspective on Common Core implementation, three out of four teachers have “embraced the new standards” either “quite a bit” or “fully.” When asked how much of their classroom instruction changed, a similar proportion said it had by one half or more. Four in five math teachers say they have increased “emphasis on conceptual understanding” and “application of skills,” while an even higher proportion of English teachers reported assigning more writing “with use of evidence.” All are attributes emphasized in the standards.

The research team then related the survey results to students’ scores on the new assessments after controlling for demographics and prior achievement. While they did not find strategies of particular impact on English language arts, they did identify math practices that were associated with higher student scores: more professional development days; more classroom observations “with explicit feedback tied to the Common Core”; and the “inclusion of Common Core-aligned student outcomes in teacher evaluations.”

Casting light on such strategies is only worthwhile, however, if there is also evidence that the Common Core are good standards. Enter the Fordham Institute. The education think tank assembled a team of 40 experts in assessment and teaching to evaluate the quality of PARCC and Smarter Balanced. For comparison, they also examined the college-readiness-aligned ACT Aspire and the MCAS, the highly regarded Massachusetts state assessment. The grade 5 and 8 test forms were analyzed against criteria developed by the Council of Chief State School Officers for evaluating “high-quality assessments” that aim to assess college and career readiness.

The short version: all four tests scored highly for “depth,” that is, items that are “cognitively demanding.” PARCC and Smarter Balanced, however, edged out both ACT Aspire and MCAS on “content.” The researchers conducted an additional analysis against other assessments and found the Common Core-aligned tests also “call for greater emphasis on higher-order thinking skills than either NAEP or [the international] PISA, both of which are considered to be high-quality challenging assessments.”

Whether or not participating in national standards is a good idea is a decision that should rightfully be made by individual states. There are many legitimate political arguments for going either way, and each state will likely view it differently. But whether the Common Core standards – in full or in part – represent the expectations a state should have for all its students is an educational question that is worth considering on its own merits.

These early reports suggest that the new standards are higher and deeper than what states had before. Most teachers, although not all, have “embraced” them and are changing their instruction accordingly. We are learning anecdotally, too, that as parents see evidence of their child’s growth, they come around as supporters (see here and here).  What this means for the future is anyone’s guess. But for now it’s looking like the Common Core or something very much like them may be seeing happier days ahead. — Patte Barth

This entry first appeared on Huffington Post February 22, 2016.

 






February 3, 2016

PARCC test results lower for computer-based tests

In school year 2014-2015, students took the Partnership for Assessment of Readiness for College and Careers (PARCC) exam on a pilot basis. The PARCC exam was created to align with the Common Core Standards and is among the few standardized assessments that measure how well school districts are teaching higher-level competencies.

On February 3, Education Week reported that the results for students who took the computer-based version of the exam were significantly lower than the results for students who took the traditional pencil-and-paper version. While the article states that the PARCC organization does not have a clear answer for why this occurred, I will offer my own explanation based on my experience as a teacher of students who took the exam last year.

I taught high school history, and the largest discrepancy between students who took the computer exam and those who took the paper exam was at the high school level. Here is my theory for the discrepancy. Throughout students’ academic careers, we teachers teach them to “mark up” the text. This means that as they read books, articles, poems, primary sources, and so on, students should have a pen or pencil and a highlighter in hand. There are many acronyms for how students should mark up their text. One is HACC: Highlight, Annotate, Circle unknown words, Comment. There are many others, but the idea is the same. Students are taught to summarize each paragraph in the margins and make note of key words. This helps students stay engaged with the reading, find main ideas, and think critically about what they are reading. It also makes it easier to go back and skim the text for the main ideas and remember what they read without re-reading.

Students are generally required to mark up and annotate the text this way, but, honestly, I still do this! And I would bet that many adults do too. If you need to read a long article at work, it helps to print it out and read it with a pen in hand. It makes it easier to focus on what you are reading. Now imagine that someone is going to test you on that article. You will be even more motivated to read it carefully and write notes for yourself in the margins.

The point is that students are taught to do this when reading, especially when reading passages for exams when there will be questions based on the passage. My own students had this drilled into them throughout the high school years in which I taught them. Sometime last year, the teachers learned that our school would be giving the pilot version of the PARCC exam to our students. During a teacher professional development day, we were asked to go online to the PARCC website, learn about the test, and take a practice exam. I encourage you to go online and take it for yourself; this exam is hard! We were asked to analyze the questions and think about ways we could change our own in-class exams to better align with PARCC. We were told that it would soon replace our state’s standardized exam.

One of the first things we all noticed was how long the reading passages are for the ELA portion of the test. It took a long time to read through them, and we all struggled to read them on a computer screen. I really wanted a printed version to write my notes on! The passages were long and detailed, and I felt as though by the time I saw the questions I would have to re-read the whole passage to find the answer (or find the section from which I could infer an answer). I knew the students would struggle with this and anticipated lower scores on this exam than on the state test. I was thankful that their scores wouldn’t actually count this year. But what happens when this becomes a high-stakes test?

As I anticipated, the scores for students who took the computer-based exams were far lower than those for students who took the traditional paper test. The Illinois State Board of Education found that, across all grades, 50% of students scored proficient on the paper-based PARCC exam compared to only 32% of students who took the exam online. In Baltimore County, students who took the paper test scored almost 14 points higher than students of similar demographics who took the test on a computer.

The low scores on the test are a different story. Organizations will need to analyze the results of this major pilot and determine its validity, and students and teachers, if the test becomes mandatory, will have to adjust to the standards and testing format associated with it. The bigger story is that there are significant hardships that come with taking a computer-based test.

My main concern is the reading passages. I don’t believe teachers should abandon the “mark it up” technique to bend to computer-based testing because learning how to annotate a text is valuable throughout people’s lives. I saw the students struggle to stare at the computer screen and focus on the words. Many used their finger on the screen to follow along with what they were reading. It was clearly frustrating for them not to be able to underline and make notes like they were used to doing.

Another set of concerns stems from the fact that the test is online. It requires access to the internet, enough computers for students to test on, and students and teachers who are technologically savvy. When my school gave the test, it took several days and a lot of scheduling and disruption to get all students through it, given our limited number of computers. Certain rooms in the building have less reliable internet connections than others, and some students lost their connection while testing. Sometimes the system didn’t accept a student’s login or wouldn’t advance to the next page. There were no PARCC IT professionals in the building to fix these issues. Instead, teachers who didn’t know the system any better than the students tried to help.

Not all students were ultimately able to take or finish the exam because of these issues. Thankfully, their results didn’t count toward graduation! There are also equity concerns between students who are familiar with computers and typing and those who have had little exposure to technology. As a teacher in an urban school, I can tell you it was not uncommon to see students typing essays on their phones because they didn’t have a computer.

As a whole, I’m not surprised by the discrepancy in test scores, and I imagine that other teachers are not either. The Education Week article quotes PARCC’s Chief of Assessment as saying, “There is some evidence that, in part, the [score] differences we’re seeing may be explained by students’ familiarity with the computer-delivery system.” This vague statement only scratches the surface. I encourage those analyzing the cause of the discrepancy to talk to teachers and students. Also, ask yourselves how well you would do taking an exam completely online, particularly when there are long reading passages. –Breanna Higgins






November 11, 2015

More students are graduating but are they leaving high school prepared?

Last month the U.S. Department of Education released preliminary data showing the U.S. is on track to set yet another record for its on-time high school graduation rate. While a preliminary national rate was not provided, the data showed that at least 36 states increased their graduation rates over the previous year, when the national on-time rate reached an unprecedented 81 percent.

Another report, released yesterday by the Alliance for Excellent Education, America’s Promise Alliance, Civic Enterprises, and the Everyone Graduates Center, shows that the recent increase in on-time graduation has led the number of high school dropouts to fall from 1 million in 2008 to 750,000 in 2012. Over the same period, the number of so-called “dropout factories” (high schools that fail to graduate at least 60 percent of their students within four years) decreased from just over 1,800 to 1,040 schools. These are dramatic decreases in such a short amount of time by any measure. They are made even more impressive by the fact that between 2002 and 2008 the number of dropouts increased by over 25,000 while the number of dropout factories fell by less than 200.

More students may be graduating high school but does that necessarily mean more students are finishing high school with the skills they need to succeed in college or the workplace? This is the big question. If high schools are just handing out pieces of paper to any student who attends for four years, a higher graduation rate doesn’t mean much of anything. Yet, if more students are graduating college and career ready, then indeed the record graduation rate is something to celebrate.

Unfortunately, it isn’t possible to determine how many students are graduating college and career ready, at least at the national level. The reason is that each state sets its own requirements for obtaining a high school diploma. In fact, a number of states set different requirements for different types of high school diplomas. A recent report from Achieve found 93 diploma options across all 50 states and the District of Columbia for the Class of 2014. The report noted that only five states (Delaware, the District of Columbia, Georgia, Kentucky, and Tennessee) require their students to meet college- and career-ready standards in math and English Language Arts (ELA) to earn a high school diploma. That means these are the only states whose graduation rates also reflect the percentage of graduates who are college and career ready.

This doesn’t mean that other states don’t have college- and career-readiness requirements for earning a high school diploma. In fact, 26 other states offer at least one diploma aligned with college and career standards. However, these states also offer multiple diplomas, so students may still graduate high school without meeting college- and career-ready expectations by either opting out of those requirements or choosing not to opt in. Moreover, just nine of these states publicly report the percentage of students earning college- and career-ready aligned diplomas. So in only 14 states do we know what percentage of high school graduates finish high school ready for college or the workforce.

The lack of alignment between diploma requirements and college- and career-ready standards may lead some to conclude that the recent rise in graduation rates is due to a lowering of the bar for graduation. But that would be wrong. Achieve’s most recent annual Closing the Expectations Gap report shows that the bar for a high school diploma has been rising in most states, not falling. In fact, when Achieve first started examining high school graduation requirements in 2004, not a single state aligned its graduation requirements to college and career standards, and only Arkansas and Texas required students to pass an advanced algebra course to earn a high school diploma. Since then, a number of states have adopted similar requirements for a high school diploma.

The good news, then, is that graduation rates are not increasing simply because schools are handing out more diplomas, but because more students are meeting more rigorous graduation requirements. The bad news is that it is still unclear how many of those requirements are aligned with college and career standards. Knowing how many students complete high school college and career ready is vitally important for policymakers to make more informed decisions and ensure all students leave high school prepared for postsecondary success. – Jim Hull





