How does the U.S. compare?

Comparing the performance of the U.S. education system to that of other countries is anything but straightforward. Take math, for example. How well does the U.S. perform in math compared to other countries? You have probably heard something along the lines of:

U.S. students scored below the international average in math and ranked 25th out of 34 industrialized countries.

But you also may have heard:

U.S. students scored above the international average in math and performed as well as or better than all but five countries.

Each statement casts a dramatically different light on the state of our education system. Yet both are accurate. So what should you think about how U.S. students really compare internationally when the two pictures are so radically different?

First, you should check out the Center’s Guide to international assessments. There you’ll learn that not all international assessments measure the same knowledge and skills or test the same grade levels. You’ll also learn that U.S. performance is not compared to the same set of countries in each assessment. So comparing relative achievement with averages and rankings, as in the statements above, can be misleading if you don’t know which countries the U.S. is being compared to.

For example, the first of our statements above is taken from the 2009 PISA results, which compared the math literacy scores of U.S. 15-year-olds to those of 15-year-olds in 33 other industrialized countries. The second statement is taken from the 2007 TIMSS results, which compared the math scores of U.S. 8th graders to those of 8th graders in 46 other countries, many of them developing countries. As you can see, comparing the results is not exactly comparing apples to apples.

But such differences shouldn’t preclude you from making valid comparisons using the two assessments. As a matter of fact, not looking at both assessments would limit the knowledge you could gain. However, you need to do it correctly.

One such way is to compare the U.S. to similar countries that took part in each of the assessments. Comparing the so-called G8 countries (Canada, France, Germany, Italy, Japan, Russia, the United Kingdom, and the United States) is one way to do it. These countries have similar economies, and most take part in each of the international assessments, so they make for a valid comparison.

With this in mind, the National Center for Education Statistics (NCES) recently released a report comparing U.S. education to that of the other G8 countries. The report compares not only achievement levels but also enrollment, expenditures, and attainment. The good thing about this report is that it is as close to an apples-to-apples comparison as you can get when comparing education systems internationally.

So how’d we do? One of the big points that stood out to me was that less than half (47 percent) of U.S. 3- and 4-year-olds are in school, compared to over 80 percent in every other G8 country. In other words, students in the countries we compete with most directly economically are getting a significant head start on their education.

This is significant because the NCES report also shows that the U.S. is outranked by other G8 countries not only on assessments but also in rates of earning high school and college diplomas. Could it be that the U.S. is lagging, at least in part, because so few of our 3- and 4-year-olds are in pre-k? The CPE research on pre-k certainly shows the power of quality pre-k programs to increase both achievement and future attainment levels of all students, especially disadvantaged students.

Yet in these tough economic times, public funds for pre-k are diminishing. Are we going to abandon a strategy that we know works, and that other countries seem to be using successfully? – Jim Hull
