

The EDifier

March 7, 2013

John Stossel, funky charts and Simpson’s paradox

John Stossel was on Fox and Friends this morning to promote an upcoming show about public schools. Remember, this is the guy who gave us Stupid in America — his ABC documentary from a few years back about our allegedly failing schools. During his segment, he claimed that “America has tripled spending, but test scores haven’t improved.”  The culprits? Teachers unions, school boards and other unnamed bureaucrats. Viewers were then shown a graph that indeed featured a flat line representing test scores over 40 years (improvement 1 point) with a second line escalating to $149,000 over the same period. The source was given as NCES. This got my fact-checking synapses sparking.

While I could not find the exact graph they showed on TV, Stossel did post this rather snazzy display on his blog with the same data:

Go ahead and take a moment to admire the work of the Fox News graphics department. Ok, now let’s talk data. This chart shows scores for three subjects (math, reading and science) and dollar figures (the “cost of education”) from 1970 to 2010. While not noted, I’m assuming the data source is still NCES.

This may get a little wonky, but stay with me.  NCES reports trend data over four decades for only two tests:  the National Assessment of Educational Progress (NAEP) Long-Term Trends (LTT) and the SAT. NCES also has international test scores, but that data only goes back to the 1990s so that couldn’t be what Stossel used.  The SAT does not assess science, which leaves NAEP LTT as the only possibility. It’s not a perfect match. The last NAEP LTT administration was in 2008 although Stossel’s chart shows data to 2010. But I’m going to assume that he fudged a little on the timeframe because nothing else qualifies.

NAEP LTT is given to a representative sample of students at ages 9, 13 and 17. I’m also going to assume that his analysis is based on 17-year-olds because the data matches his in reading and comes closest in math (more on this later). Between 1971 and 2008, LTT reading scores for 17-year-olds were relatively flat, posting an increase of just 1 point (not 1% as shown on Stossel’s chart, but we’ll blame the designer for that common mistake). Here’s what it looks like:

Now let’s have some fun. Let’s look at the same test scores disaggregated by race and ethnicity:

Note that every group improved more than the overall score did: White 17-year-olds gained 2 points, while their Black and Hispanic classmates gained a whopping 25 and 17 points, respectively. This gives me a chance to talk about Simpson’s paradox. The paradox occurs when “a trend that appears in different groups of data disappears when these groups are combined, and the reverse trend appears for the aggregate data.” In this case, the overall trend for 17-year-olds is flat while each group gained, some by a lot. The reason is that the racial/ethnic distribution of the student population changed significantly between 1975 and 2008. Here is the distribution of the NAEP samples for the two years:

The proportion of Black and Hispanic 17-year-olds is larger, while the proportion of White students in 2008 is 25 percentage points lower than it was in 1975. Even though Black and Hispanic performance increased by a lot, those groups were still lower-performing than their White peers in 2008. Thus, all groups gained, but when their performance is combined, the overall trend is flat.
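The mechanics are easy to see with a little Python. The scores and sample shares below are illustrative approximations, not the official NAEP figures; the point is only that when every subgroup improves but the population shifts toward lower-scoring (though fast-improving) subgroups, the combined average can stay flat or even dip.

```python
# Illustrative (not actual NAEP) numbers demonstrating Simpson's paradox:
# every group's score rises, yet the weighted overall score falls,
# because the group weights shift toward lower-scoring groups.

scores_1975 = {"White": 293, "Black": 241, "Hispanic": 252}
scores_2008 = {"White": 295, "Black": 266, "Hispanic": 269}  # all groups up

weights_1975 = {"White": 0.84, "Black": 0.12, "Hispanic": 0.04}
weights_2008 = {"White": 0.59, "Black": 0.16, "Hispanic": 0.25}

def overall(scores, weights):
    """Population average = sum of group scores weighted by group share."""
    return sum(scores[g] * weights[g] for g in scores)

o75 = overall(scores_1975, weights_1975)  # ~285.1
o08 = overall(scores_2008, weights_2008)  # ~283.9

for g in scores_1975:
    assert scores_2008[g] > scores_1975[g]  # every subgroup improved...
print(round(o75, 1), round(o08, 1))         # ...yet the overall average dipped
```

Swap in the real NAEP sample shares and scores and the same arithmetic reproduces Stossel’s “flat line.”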

Clearly, no one would argue that an achievement gap, though narrowing, is acceptable and we can move on to other things. But it’s just as absurd to look at these gains and find evidence of failing schools, as Stossel does. And the absurdity doesn’t end there. Stossel, it turns out, is a master cherry-picker of data. Let’s look at the rest of NAEP Long-Term Trends:

  • Reading, 13-year-olds, 1971-2008: Overall scores +12; Black students +23; Hispanic +24.
  • Reading, 9-year-olds, 1971-2008: Overall +5; Black +21, Hispanic +10.
  • Mathematics, 17-year-olds, 1978 (first year tested)-2008: Overall +6, Black +19, Hispanic +17.
  • Mathematics, 13-year-olds, 1978-2008: Overall +17, Black +32, Hispanic +17.
  • Mathematics, 9-year-olds,  1978-2008: Overall +24, Black +32, Hispanic +30.

Notice a pattern? If one were to apply Stossel’s grossly oversimplified cost-to-scores analysis (and I’m not saying you should), you would have to conclude that our public schools are producing a return on our investment. Then again, how he got those cost figures is another topic for another day.

Filed under: Achievement Gaps,Data,Demographics,Public education — Tags: , , , — Patte Barth @ 2:46 pm





December 7, 2011

Urban districts making progress, but more work needed

Earlier today, the National Center for Education Statistics (NCES) released the fifth installment of the Trial Urban District Assessment (TUDA), which reports on the performance of fourth- and eighth-graders on NAEP reading and mathematics in participating urban districts. Overall, both math and reading results show our urban schools have made significant progress over the past decade, yet a long climb remains before they close the gap between themselves and our high-performing suburban districts.

There are some important takeaways from these results. First, the data over the past decade clearly shows that urban districts can and do improve student achievement. Second, change doesn’t happen overnight. Although the data shows that in some districts students are achieving nearly two years more of learning compared to their peers a decade earlier, those gains came from long, gradual improvement. It’s important to remember that our urban schools are on the right track. Let’s not derail their successes by trying to accelerate those gains without knowing what is making the gains possible.
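A note on the “years of learning” conversions used throughout this post: they rest on a common rule of thumb (an assumption on my part, not a figure stated in the TUDA report) of roughly 10 NAEP scale points per school year. A trivial sketch:

```python
# Rough conversion of NAEP scale-point gains to "years of learning."
# The ~10-points-per-year factor is a rule of thumb, not an official figure.

POINTS_PER_YEAR = 10.0

def points_to_years(points: float, points_per_year: float = POINTS_PER_YEAR) -> float:
    """Convert a NAEP scale-score gain into approximate years of learning."""
    return points / points_per_year

print(points_to_years(17))  # a 17-point gain: about 1.7 years
print(points_to_years(22))  # a 22-point gain: about 2.2 years
```
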

Below are some of the major findings from both the math and reading assessments.

Math

Fourth Grade

  • Atlanta (3 points), Austin (5 points), Baltimore City (3 points), and Philadelphia (4 points) were the only districts to significantly increase their scores from 2009 to 2011. During this same time period, scores for the nation increased by 1 point. 
  • Boston and Washington, D.C. made the greatest gains from 2003 to 2011, increasing their scores 17 points each. Such increases are roughly equivalent to a year and a half worth of learning.
    • During this time, Cleveland was the only participating district that did not improve its performance.
  • Austin (TX), Charlotte (NC), and Hillsborough (FL) were the only urban districts to score higher than the overall national average, while San Diego’s score was not significantly different from the national average. In 2009, Charlotte was the only district to score above the national average.
  • Eight urban districts scored higher than the average for students attending schools in large cities (cities with populations of 250,000 or more). This was up from seven districts in 2009.
  • The percent of students scoring at or above Proficient varied dramatically among urban districts, from 48 percent in Charlotte to just 3 percent in Detroit.
    • Only three districts increased the percent of students reaching the Proficient level since 2009, although seven of the nine districts that have participated since 2003 increased their percentages over that longer period.
    • Students at the Proficient level are able to “draw a line segment of a given length.”

Eighth Grade

  • Six districts significantly increased their scores from 2009 to 2011. This was up from just two districts that increased their scores between 2007 and 2009.
  • From 2003 to 2011, nine out of ten districts made significant gains in their performance, with Atlanta (22 points) and Boston (20 points) both making gains roughly equivalent to two years’ worth of additional learning.
    • Cleveland was the only district to not make significant progress during this time period.
  • Austin and Charlotte were the only districts to outscore the nation as a whole, while Boston and Hillsborough’s scores were not significantly different from the national average.
  • Six urban districts scored higher than the 2011 average for students attending schools in large cities, an increase from five districts in 2009.
  • The percent of students scoring at or above proficient varied just as it did at the fourth grade level. Austin had the highest percentage at 38 percent, while Detroit once again had the lowest percentage at just 4 percent.

Reading

Fourth Grade

  • None of the 18 districts that participated in both 2009 and 2011 saw any significant changes.  During this same time period scores for the nation remained flat.
  • Austin (TX), Charlotte (NC), Hillsborough County (FL), Jefferson County (KY), and Miami-Dade (FL) scored slightly higher than the overall national average.
  • Austin, Charlotte, Hillsborough County, Jefferson County, and Miami-Dade scored higher than the average for large cities (cities with populations of 250,000 or more).
  • The percent of students scoring at or above proficient varied dramatically among urban districts from 44 percent in Hillsborough County to just 7 percent in Detroit.
    • However, 45 percent more students in large cities were proficient in 2011 than in 2003.

Eighth Grade

  • Charlotte (6 points) was the only school district to significantly increase its score from 2009 to 2011. During this same time period, students nationally increased their scores 2 points.
  • Austin, Charlotte, Hillsborough County (FL), Jefferson County (KY), and Miami-Dade scored higher than the average for large cities.
    • A few districts had slight score decreases since the first year they participated.  The District of Columbia’s score decreased by 3 points since 2002, Fresno’s (CA) score decreased by 2 points since 2009, Miami-Dade’s score decreased by 1 point since 2009, and Milwaukee’s score decreased by 3 points since 2009.
  • Hillsborough County was the only district to outscore the nation as a whole, while Jefferson County’s and Miami-Dade’s scores were not significantly different from the national average.
  • The range of students scoring at or above proficient was wide just as it was at the fourth grade level. Charlotte had the highest percentage at 34 percent while Detroit once again had the lowest at just 5 percent.
    • However, overall 50 percent more students in large urban cities were proficient in 2011 compared to 2003.

 

— Jim Hull and Mandy Newport

 

 

Filed under: Achievement Gaps,Assessments,Data,NAEP,Report Summary — Tags: , , , , — rstandrie @ 5:48 pm





August 24, 2011

Comparing states to other countries: A fair comparison?

“Can U.S. compete if only 32 percent of students are proficient in math?” This was the headline over at MSNBC.com last week, in reference to a new report that compared each state’s math and reading performance to that of 65 other countries. Researchers at Harvard’s Program on Education Policy and Governance made the comparison by linking the results of the 2007 8th grade NAEP assessments, which all states participated in, to the 2009 PISA results, which 15-year-olds (mostly high school sophomores) in the U.S. and 64 other countries participated in. Although the same students did not take both assessments, and the assessments were taken in different years, the researchers claim that such a comparison is a fair representation of the Class of 2011’s math and reading performance.

I’ll admit I’m not an assessment expert or a statistician, but I am a little skeptical of how fair that comparison is. For one, reliably comparing different assessments is quite difficult. These researchers took a fairly simplistic approach: they looked at the percent of 8th graders achieving at NAEP’s Proficient level in math (32 percent), then identified the PISA score that the top-performing 32 percent of U.S. students reached on the 2009 PISA test. They then used that PISA score to represent NAEP ‘proficiency’ for other countries. The same comparison was done for reading.

A similar study being conducted by NCES, scheduled for release next year, takes a more comprehensive approach: it will compare NAEP to TIMSS results by having a sample of 8th grade students in each state take both the NAEP and TIMSS assessments in math and science. This will likely provide a more accurate comparison of how states truly compare to other countries, since, unlike the Harvard study, it uses results from two assessments taken by the same students in the same year.

I could go on about the possible limitations of making such linkages, but such arguments would be fairly technical and wouldn’t change the main finding from the Harvard study: too few U.S. students are proficient in math. As I stated in The Proficiency Debate: A guide to NAEP achievement levels, NAEP’s Proficient level is a fairly high, although reachable, standard, and is not the same as being “on grade level.” With that said, the data makes it quite obvious that a significant number of countries have a far greater proportion of their students attaining higher-level skills than the U.S. does, and that in some states very few students are acquiring these skills. As the Harvard report states, if all states were able to increase the proportion of students attaining these skills to Canada’s level, the U.S. could increase its GDP by nearly $1 trillion per year over the next 80 years. That sounds like a great argument for states to increase their investment in education instead of cutting funding, and a more promising way to address our nation’s deficit than simply cutting spending or raising taxes. — Jim Hull






August 11, 2011

How does your state’s standard compare?

Yesterday, the National Center for Education Statistics (NCES) at the U.S. Department of Education released a new report, Mapping State Proficiency Standards onto NAEP Scales: 2005-2009. The report enables states to compare the rigor of their standards for proficiency in fourth and eighth grades in both math and reading to that of other states. To do so, it places each state’s assessment cut-score for proficiency — the score which students must reach to be considered proficient — onto NAEP’s scoring scale using statistical mapping techniques. This means it shows where on NAEP’s scoring scale a student would fall if that student scored right at the state’s cut-score for proficiency on the state assessment.

Example: If a fourth grader in Vermont scored at the proficient cut-score on the Vermont state assessment, that score would correspond to a score of 214 on NAEP, which falls within NAEP’s Basic Achievement Level.

What did the report find?

  • Where states set their proficiency standards varies greatly.
    • The difference in scores between the states with the five highest and lowest standards is comparable to the difference in scores between NAEP’s Basic and Proficient levels.
    • The range of state standards is between 60 and 71 NAEP points, which equates to about six or seven years of learning. It is also more than twice the size of the Black/White achievement gap in 4th grade reading, which is 25 NAEP points.
  • Most states’ proficiency standards are at or below NAEP’s definition of Basic performance.
    • In grade 4 reading, 35 of 50 states set their standard for proficiency lower than NAEP’s cut-score for its Basic level. For grade 8 reading, 16 out of the 50 states did so.
    • In grade 4 math, seven of 50 states set their score for proficiency below the cut-score for NAEP’s Basic level, with 42 states setting their proficiency score within NAEP’s Basic level. One state—Massachusetts—set its proficiency score within NAEP’s Proficient level. Similar results were found at the 8th grade level.
  • The rigor of state standards increased in states that substantively changed their assessments between 2007 and 2009.
    • Across the 34 math and reading assessments that substantively changed between 2007 and 2009, in 21 cases the rigor of the standards increased.
    • In just 5 cases did the rigor of the state standards decrease.
  • Most state results show more positive changes in the proportion of students reaching proficiency than NAEP results.
    • The change in the percent of students reaching proficiency between 2007 and 2009 was more positive in 17 of 22 state assessments than on NAEP.

Keep in mind when reading the report that NAEP does not necessarily define proficiency the same way states do. NAEP defines Proficient as competency over challenging subject matter, not grade-level performance as states attempt to measure. It is also worth mentioning that no country, not even the highest-performing, would have 100 percent of its students reach NAEP’s Proficient level, and that some leading assessment experts have stated that proficiency for accountability purposes probably lies somewhere between NAEP’s Basic and Proficient levels.

Even with that in mind, the results should be a warning flag to many states, especially those that set their proficiency standard below NAEP’s Basic level. But this could be a moot point in the coming years, as most states have signed on to the Common Core State Standards, where the goal is college and career readiness, not proficiency as both state assessments and NAEP are currently set up to measure. In the meantime, states should still ensure they set their proficiency standards at a level where students demonstrate they have the skills necessary to get into college or get a good job after high school. — Jim Hull

For more information on how NAEP’s proficiency levels compare to states’, check out the Center for Public Education’s The proficiency debate: A guide to NAEP achievement levels.

The graph below from the report shows where your state’s cut-score for proficiency falls on the NAEP 8th grade math assessment score scale (other grades and subjects can be found on pages 10, 11, and 12 of the report). Scores above 299 fall in or above NAEP’s Proficient level, while scores above 262 but below 299 fall within NAEP’s Basic level.
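Using the two cut-points just mentioned, labeling a mapped state score is a one-liner. A minimal sketch (the 270 in the example is a hypothetical mapped score, not any particular state’s):

```python
# NAEP 8th grade math achievement-level cut-points cited in the post.
NAEP_BASIC = 262
NAEP_PROFICIENT = 299

def naep_level(mapped_score: float) -> str:
    """Label a state's mapped proficiency cut-score with its NAEP level."""
    if mapped_score >= NAEP_PROFICIENT:
        return "Proficient or above"
    if mapped_score >= NAEP_BASIC:
        return "Basic"
    return "Below Basic"

print(naep_level(270))  # a hypothetical mapped cut-score of 270 -> "Basic"
```
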






June 14, 2011

U.S. Students Lag in U.S. History

This morning the 2010 NAEP results in U.S. History for 4th, 8th, and 12th graders were released. The NAEP History assessment is designed to measure students’ knowledge of American history in the context of democracy, culture, technological and economic changes, and America’s changing world role. Results for 2010 were compared to results from previous assessments in 1994, 2001, and 2006. The report also examined the change in Advanced Placement U.S. History course-taking between 1990 and 2009, as well as the access minority students have to AP U.S. History courses in their high schools.

Results for 4th and 8th graders showed some positive signs, but results for 12th graders were disappointing. At the 4th and 8th grade levels, low-achieving and minority students made tremendous gains over the past decade to narrow achievement gaps. Similar results were not seen at the 12th grade level, where scores for almost all racial/ethnic groups and at all achievement levels remained relatively unchanged over the past decade and a half. However, a bright spot for our nation’s high schools is that more students, particularly minority students, have access to, and are taking, Advanced Placement U.S. History courses to better prepare for life after high school.

Here are some of the major findings from the report:

Fourth Grade

  • Overall scores were not significantly higher in 2010 (214) compared to 2006 (211).
    • However, scores were significantly higher than in 2001 (208) and 1994 (205). 
  • Minority students made substantial gains to narrow achievement gaps.
    • Since 1994, Black and Hispanic students increased their scores by 22 points and 23 points, respectively. This represents approximately two additional years’ worth of learning.
    • Much of the increase took place after 2001, with Black scores increasing by 13 points and Hispanic scores by 14.
    • Due to these gains, the Black-White and Hispanic-White gaps narrowed by 12 and 13 points, respectively.
  • The lowest-performing students made the greatest gains.
    • Since 1994, students scoring at the 10th percentile increased their scores by 22 points.
    • Since 2001 these students have increased their scores by 12 points.
  • Seventy-three percent of 4th graders scored at or above NAEP’s basic level.
    • Students scoring at or above this level should be able to identify and describe a few of the most familiar people, places, events, ideas, and documents in American history.  
  • Just 20 percent scored at or above NAEP’s proficient level.
    • Students scoring at this level should be able to identify, describe, and comment on the significance of many historical people, places, ideas, events and documents.

Eighth Grade

  • Overall scores were significantly higher in 2010 (266) compared to 2006 (263) as well as compared to 1994 (259).
  • Minority students made significant gains to narrow achievement gaps with White students.
    • Since 2001, Black and Hispanic students increased their scores by 10 and 12 points, respectively. This represents approximately an additional year’s worth of learning.
    • Due to these gains, the Black-White and Hispanic-White gaps narrowed by 5 and 7 points, respectively, since 2001.
  • The lowest-performing students made significant gains.
    • Since 2001, students scoring at the 10th percentile increased their scores by 11 points.
  • The percent of 8th graders scoring at or above NAEP’s basic level is increasing.
    • Since 2001, the percent of students scoring basic or above increased from 62 percent to 69 percent in 2010.
    • Students scoring at or above this level should also have a beginning understanding of the fundamental political ideas and institutions of American life and their historical origins.   
  • There was no change in the percent of 8th graders scoring at or above NAEP’s proficient level.
    • Just 17 percent of 8th graders scored at or above proficient in 2010, which is not significantly different from 2001.
    • Students at this level should be able to incorporate geographic, technological, and other considerations in their understanding of events and should have knowledge of significant political ideas and institutions.
  • The vast majority of 8th graders took U.S. History.
    • Eighty-four percent of 8th graders said they were taking U.S. history at the time of the assessment.

Twelfth Grade

  • Scores have remained relatively stagnant since 1994.
    • Scores have increased just 2 points since 1994 and did not significantly increase between 2001 and 2010.
    • Scores have remained stagnant across all achievement levels.
  • Achievement gaps remained relatively unchanged.
    • Scores for Black students have not changed significantly since 1994.
    • Scores for Hispanic students increased 8 points since 1994, but have not changed significantly between 2001 and 2010.
    • Furthermore, Black-White and Hispanic-White achievement gaps have remained relatively unchanged over the past 16 years.
  • There was no significant change in percent of 12th graders scoring at or above NAEP’s basic level.
    • Fewer than half (45 percent) of 12th graders scored at or above the basic level.
    • Students at this level should have a sense of continuity and change in history and be able to relate relevant experience from the past to their understanding of contemporary issues.
  • There was no significant change in the percent of 12th graders scoring at or above NAEP’s proficient level.
    • Just 12 percent of 12th graders scored at or above proficient in 2010, which is not significantly different from any other previous assessment.  
    • Students at this level should be able to communicate reasoned interpretations of past events, using historical evidence effectively to support their positions.  

High School History Course Taking

  • More students took AP U.S. history in 2009 (13 percent) than in 1990 (6 percent).
    • Four times as many Hispanic students took the course in 2009 (12 percent) as in 1990 (3 percent).
  • A greater number of minority students had access to U.S. History Advanced Placement (AP) courses in 2009 than in 1990.
    • In 1990, just 49 percent of Black students and 54 percent of Hispanic students attended a high school that offered an AP U.S. History course.
    • In 2009, the percentage increased to 83 percent and 91 percent for Black and Hispanic students, respectively.
    • On the other hand, only 75 percent of White students had access to an AP History course in 2009.
  • Students attending high-minority (defined as 50% or more of enrollment) high schools were more likely to have access to an AP U.S. History course than students attending low-minority schools.
    • While similar percentages of students in low-minority (43 percent) and high-minority (42 percent) schools had access to AP U.S. History in 1990, by 2010 students in high-minority high schools (90 percent) were more likely to have access to an AP U.S. History course than students in low-minority schools (66 percent).

For more information on NAEP, go to www.centerforpubliceducation.org and check out The proficiency debate: A guide to NAEP achievement levels. — Jim Hull

Filed under: Achievement Gaps,Course taking,High school,Middle school,Report Summary — Tags: , — Jim Hull @ 4:29 pm




