

The EDifier

January 31, 2014

Are elementary school parents more demanding than high school parents?

In a recent op-ed, Thomas Friedman asks whether we are falling behind* as a country in education because:

“…too many parents and too many kids just don’t take education seriously enough and don’t want to put in the work needed today to excel?”

Friedman asks this question in response to a recent speech in which Secretary Duncan stated "…I wished our biggest challenge here in the U.S. was too many parents demanding excellent schools," after telling a story in which South Korean President Lee told President Obama that his country's biggest education problem was that parents were too demanding. Secretary Duncan went on to quote Amanda Ripley, author of The Smartest Kids in the World: And How They Got There, stating:

“too many parents and too many kids just don’t take education seriously enough and don’t want to put the work needed today to really excel.”

Quite the bold argument, but are there any actual facts to back it up? To argue that U.S. parents and students are lazy, or at the very least complacent, there must be some compelling data behind the claim. So what 'evidence' does Friedman provide for his hypothesis? Letters from two (count them, two) veteran high school teachers who had evidently become disillusioned because they believed their students were being asked to do less, and in fact were doing less. As heartfelt and compelling as these letters are, they are still just the experience of two teachers, hardly a representative sample of teachers nationwide.

So the question must be asked: Are students doing less now, and if so, is it because less is expected of them? Of course, the answers to these questions are quite subjective. However, the teachers Friedman highlighted backed up their claims by noting that it is harder to get students to do homework now than ever before, so in essence they were using homework completion as a proxy for student effort. And if you look at the homework data collected along with the Long-Term Trend NAEP assessment for both 13- and 17-year-olds, it does appear students are doing less homework on average than they were a couple of decades ago, although parents of high school children taking four Advanced Placement classes may find this hard to believe.

So there is evidence to support the teachers' contention that students are doing less homework now than in previous years, but such evidence does not tell the complete story. Secretary Duncan was indirectly claiming that students in other countries, like South Korea, are outworking our students at least in part because their parents demand more. Yet if we once again use homework as a proxy for student effort, it is the South Korean parents who are less demanding. According to data from the 2011 Trends in International Mathematics and Science Study (TIMSS), 15 percent of U.S. 8th graders spent 3 or more hours per week doing homework, compared to just 2 percent of 8th graders in South Korea. South Korea is not an outlier, either: in Japan and Finland, both high-performing countries, the percentage is about the same. The other extreme holds as well, with 78 percent of South Korean 8th graders spending less than 45 minutes per week on homework, compared to just 43 percent of U.S. 8th graders.

Unfortunately, there isn't as much information on homework in high school, but South Korea is known for how much time its high school students spend on homework. Even if we take that as a given, does that mean South Korean parents suddenly demand more once their children hit high school? Do U.S. parents push their children through elementary school, then suddenly stop demanding such hard work when they enter high school? I don't think so.

While there is evidence that students are spending less time on homework, and it is probably true that U.S. high school students on average spend less time on homework than high school students in other countries, it doesn't necessarily mean our students are not working as hard or that less is expected of them. In fact, the assumption that our students are expected to do less is wrong. When you actually look at the data, you see that today's students are taking much more rigorous courses. For example, according to data from the Long-Term Trend NAEP, just 79 percent of 17-year-olds had taken Algebra in 1986, compared to 96 percent in 2012. Furthermore, 76 percent of 17-year-olds took Algebra II in 2012, compared to just 44 percent in 1986. The percentage of students taking Calculus has also dramatically increased, from just 7 percent in 1986 to 24 percent in 2012. Nor were such increases limited to math courses: similar gains were made in science, with many more students taking chemistry and physics now than in the 1980s.

While students may be spending less time on homework, they are taking more challenging courses. So the claim that our students aren't working as hard, or aren't expected to do as much, is not supported by actual evidence. In fact, our students are expected to do more and are doing more than ever before. Can we expect more? We sure can, but just because parents in a couple of high-performing countries make education the be-all and end-all of human existence doesn't mean that's how parents should act here. Parents should set high expectations for their children and for the local schools that educate them, but they should also let their children be children. – Jim Hull

*I'll take on the inaccurate assumption that the U.S. is falling behind other countries next week.






December 3, 2013

Disappointing results from latest international assessment

Results from the 2012 Program for International Student Assessment (PISA) were released today, comparing the reading, mathematics, and science literacy of 15-year-olds in 65 countries, including the United States. Unfortunately, the overall results were not positive for our nation's schools. In fact, the U.S. has failed to improve in any of the three subjects tested since 2000, the first year PISA was administered. Due to this lack of improvement, a greater number of countries outperformed the U.S. in all three subject areas in 2012 than did in 2009, the last time PISA was administered. In particular, in mathematics the U.S. was significantly outperformed by 29 countries in 2012, compared to 24 countries in 2009. Even in reading, where the U.S. has compared much more favorably, U.S. 15-year-olds were outperformed by 19 countries in 2012, compared to just 9 countries in 2009.

What the results indicate is that while U.S. performance remains relatively unchanged, other countries are leapfrogging the U.S. by making significant gains in reading, mathematics, and science just between 2009 and 2012. These include countries such as the United Kingdom, Ireland, Canada, Poland, and Australia, which all have among the highest child poverty rates in the world, and all of which outperformed the U.S. in mathematics. Certainly poverty impacts student achievement, but the U.S. can learn from these countries how to more successfully educate poor students. One bright point from the PISA results for the U.S. is that the achievement gap between high- and low-socioeconomic status (SES) students did narrow slightly between 2009 and 2012. However, even if every other country had an SES profile similar to the U.S.'s, U.S. performance would actually drop slightly relative to other countries, while the performance of many other countries would improve. This provides evidence that the mediocre U.S. performance is not simply due to demographics.
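The SES adjustment described above is essentially a direct standardization: recompute each country's average score using a common SES mix instead of its own. A minimal sketch of the idea, with invented group averages and shares (not actual PISA figures):

```python
# Direct standardization: recompute a country's average score as if it had a
# reference SES mix. All numbers below are invented for illustration only.

def standardized_mean(group_means, shares):
    """Weight each SES group's average score by the given population shares."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9  # shares must sum to 1
    return sum(group_means[g] * s for g, s in shares.items())

# Invented average scores by SES group for one hypothetical country:
group_means = {"low": 450, "mid": 490, "high": 540}

# The country's own SES mix vs. a U.S.-like reference mix with more low-SES students:
own_mix = {"low": 0.20, "mid": 0.50, "high": 0.30}
us_like_mix = {"low": 0.30, "mid": 0.50, "high": 0.20}

print(standardized_mean(group_means, own_mix))      # 497.0 (unadjusted)
print(standardized_mean(group_means, us_like_mix))  # 488.0 (adjusted downward)
```

With a U.S.-like mix that includes more low-SES students, this hypothetical country's adjusted average drops, illustrating how the same mechanism can move countries up or down once SES profiles are equalized.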

While the PISA results are disappointing, they are the exception rather than the rule when it comes to measuring U.S. performance. On other international assessments, such as TIMSS and PIRLS, the U.S. has made significant progress over the past decade or so. In fact, in math the U.S. is among the world leaders in gains between 1995 and 2011. The U.S. has also made significant gains on domestic assessments such as NAEP. And the estimated U.S. on-time graduation rate has improved from 67 percent in 2000 to 74 percent in 2010, which is nearly an all-time high. This makes the lack of improvement on PISA all the more surprising. We need to take a deeper look into the PISA data to find out why the gains the U.S. is making on other indicators are not showing up in PISA. Until we know the answer to that question, our ability to use the PISA results to improve our schools is limited.

The Findings

Mathematics Literacy

  • The U.S. score of 481 was significantly lower than the international average* of 496.
  • The U.S. was outperformed by 29 of 64 countries**.
    • Shanghai-China was the highest performing country (613) followed by Singapore (573), Hong-Kong-China (561), Chinese Taipei (560), and Korea (554).
    • The U.S. performed similarly to 9 countries including Norway, Italy, Russia, and Hungary.
    • The U.S. performed significantly better than 26 countries such as Israel (466), Greece (453), Mexico (413), and Brazil (391).
  • Scores for the U.S. have not improved.
    • Scores for the U.S. were similar between 2009 and 2012 as well as between 2000 and 2012.
    • Twenty-nine countries outperformed the U.S. in 2012 compared to 24 countries in 2009.
      • In 2009 Poland, Austria, Ireland, Czech Republic, and United Kingdom performed similarly to the U.S. but outperformed the U.S. in 2012.
  • The U.S. has fewer advanced students and more low performing students than most countries.
    • A smaller percentage of U.S. students (9 percent) scored within the top two PISA achievement levels than the international average (13 percent).
    • Twenty-seven countries had a higher percentage of high performing students. Shanghai-China led the world with more than half (55 percent) reaching these advanced levels followed by Singapore (40 percent), Chinese Taipei (37 percent), Hong Kong-China (34 percent), and Korea (31 percent).
    • The U.S. also had a larger proportion of low-performing students*** (26 percent) than the international average (23 percent), and 29 countries had a lower percentage of low-performing students than the U.S.

Science Literacy

  • The U.S. did not score significantly different from the international average of 501.
  • The U.S. was outperformed by 22 of 64 other countries.
    • Shanghai-China was the highest performing country (580) followed by Hong-Kong-China (555), Singapore (551), Japan (547), and Finland (545).
    • The U.S. performed similarly to 13 countries including France, Italy, Norway, and Croatia.
    • The U.S. performed significantly better than 29 countries such as Russia (486), Sweden (485), Mexico (415), and Brazil (405).
  • Scores for the U.S. have not improved.
    • Scores for the U.S. were basically unchanged between 2009 and 2012.
    • The 2012 scores were also similar to the scores in 2000.
    • Twenty-two countries outperformed the U.S. in 2012 compared to 18 countries in 2009.
      • In 2009 Poland, Ireland, and the Czech Republic performed similarly to the U.S. but outperformed the U.S. in 2012.
  • The U.S. has fewer advanced students and more low performing students than most countries.
    • Seven percent of U.S. students scored within the top two PISA achievement levels which is similar to the international average.
    • Seventeen countries had a higher percentage of high performing students than the U.S. Shanghai-China led the world with 27 percent of students reaching these advanced levels followed by Singapore (23 percent), Japan (18 percent), and Finland (17 percent).
    • Twenty-one countries had a lower percentage of low-performing students than the U.S. However, the U.S. had a proportion of low-performing students (18 percent) similar to the international average.

Reading Literacy

  • The U.S. did not score significantly different from the international average of 496.
  • The U.S. was outperformed by 19 of 64 other countries.
    • Just like in mathematics and science Shanghai-China was the highest performing country (570) followed by Hong-Kong-China (545), Singapore (542), Japan (538), and Korea (536).
    • The U.S. performed similarly to 12 countries including France, Italy, United Kingdom, and Israel.
    • The U.S. performed significantly better than 34 countries such as Russia (475), Greece (477), Mexico (424), and Brazil (410).
  • Scores for the U.S. have not improved.
    • Scores for the U.S. were basically unchanged between 2009 and 2012.
    • The 2012 scores were also similar to the scores in 2000.
    • Ten more countries outperformed the U.S. in 2012 than in 2009.
      • In 2009 Poland, Ireland, Estonia, Switzerland, and Germany performed similarly to the U.S. but outperformed the U.S. in 2012.
  • The U.S. has fewer advanced students and more low performing students than most countries.
    • Eight percent of U.S. students scored within the top two PISA achievement levels which is similar to the international average.
    • Fourteen countries had a significantly greater share of high performers with Shanghai-China leading the world with 25 percent followed by Singapore (21 percent), and Japan (18 percent).
    • The U.S. also had a proportion of low-performing students (17 percent) similar to the international average, although 14 countries had a higher percentage.

Demographics

  • The U.S. is not uniquely diverse.
    • The U.S. has about the same proportion of ‘disadvantaged’ students as the international average.
    • The U.S. has the 6th largest share of immigrant students.
    • When controlling for the socioeconomic status (SES) of students across countries the U.S. ranking would actually decline compared to other countries.

For more information about PISA and other international assessments of student achievement check out the Center’s More than a horse race: A guide to international tests of student achievement.

 

* The OECD average is used as the international average
** OECD used the term education systems instead of countries.
*** Students who scored below the 2nd PISA achievement level.






December 2, 2013

10 questions for understanding PISA results

The big day is almost upon us. Tomorrow the results from the 2012 Program for International Student Assessment (PISA) will be released. The rhetoric pertaining to the quality of our public schools is certainly going to be amplified tomorrow, with critics lamenting how the results show our public schools are in dire straits while others will argue the results are meaningless. To help you understand what the PISA results actually signify, the Center for Public Education has answered 10 key questions about what PISA actually measures and what the results mean for our public schools.

1. What is PISA?

The Program for International Student Assessment (PISA) is an assessment of reading, math, and science literacy given every three years to 15-year-old students in public and private schools in about 65 countries. The Organisation for Economic Co-operation and Development (OECD), an international organization, coordinates the development and administration of PISA worldwide, while the U.S. Department of Education's National Center for Education Statistics (NCES) conducts the assessments in the U.S.

Unlike most state assessments that measure how much knowledge a student has acquired, PISA is designed to measure how well students can apply their knowledge to real-world situations. To measure such skills, the test items on PISA are primarily “constructed response,” meaning the test-taker has to write their answers to the questions, and there are few multiple-choice items. U.S. students typically do not perform as well on open-ended, constructed response items. This is one reason many states are adopting new standards, including the new Common Core State Standards, which are intended to emphasize how well students can solve problems and think critically based on the concepts, topics and procedures they have learned.

2. Why are PISA results important?

PISA is one of the few tools we have to compare the outcomes of high school students internationally.  PISA provides valuable information on how prepared high school students are for postsecondary success whether in the workplace, career training, or higher education.

3. Is the U.S. ranking on PISA negatively impacted because unlike other countries the U.S. educates and tests all its students?

No, this used to be true several decades ago, but is no longer the case. Every industrialized country now educates all their students, including language minority, special needs and low-performing students. Every country that participates in PISA must adhere to strict sampling rules to ensure the country’s results are nationally representative of all 15-year-old students. Indeed, the decision to test secondary students at age 15 was made in part because young people at that age are still subject to compulsory schooling laws in most participating nations, which provides more assurance that PISA will capture the broadest sample.

4. Where does the U.S. really rank on PISA?

In 2009, 30 countries had higher mathematics scores than the U.S. but just 23 of these countries significantly outperformed the U.S. Because only a sample of each nation’s students participate in PISA, much like political polls, each country’s score has a margin of error. This means that the score is actually an estimate of how the country would perform if every 15-year-old took PISA. In science, 21 countries had higher scores than the U.S., but only 18 scored significantly higher; in reading, while 16 countries scored higher, just nine countries significantly outperformed the U.S.

OECD reports statistically significant differences in performance between nations, which is a more accurate way to look at PISA rankings than a straight listing of average scores.
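A rough sketch of the significance test behind those rankings: because each country's average is estimated from a sample, two averages only differ "significantly" when the gap is large relative to the combined sampling error. The simple z-test below, with invented scores and standard errors, is only an approximation of OECD's actual methodology (which uses plausible values and replicate weights):

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Return True when the gap between two estimated country means is
    statistically significant at roughly the 95 percent level."""
    gap = abs(mean_a - mean_b)
    se_gap = math.sqrt(se_a**2 + se_b**2)  # standard errors combine in quadrature
    return gap / se_gap > z_crit

# Invented numbers: a 6-point gap with typical PISA-sized standard errors (~3.6)
print(significantly_different(487, 3.6, 481, 3.6))  # False: gap is within sampling error
# A 19-point gap with the same standard errors is significant
print(significantly_different(500, 3.6, 481, 3.6))  # True
```

This is why a country a few points ahead of the U.S. in the raw score table may not "outperform" it in OECD's statistical sense.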

5. Does PISA measure the effectiveness of public school systems?

Not completely, for three reasons: 1) PISA results are representative of the performance of all 15-year-olds in participating countries including those  attending private schools; 2) PISA makes no attempt to isolate schools from outside factors such as poverty or high proportions of non-native language speakers that may have an impact on  performance —such factors are important to include in the mix when evaluating the effectiveness of each country’s schools; and 3) No single measure can incorporate every outcome we expect from our public schools. To gain a better perspective of the overall effectiveness of educational systems, you should consider multiple measures. NSBA’s Center for Public Education’s Data First Data Center is a good resource to get you started when examining public schools in the U.S.

6. How does the U.S. stack up on other international measures?

The U.S. fares much better on other international assessments. U.S. 4th and 8th graders performed among the top 10 countries in both math and science on the most recent Trends in International Mathematics and Science Study (TIMSS, 2011), which was administered in more than 60 countries. Moreover, only four countries outperformed U.S. 4th graders in reading on the 2011 Progress in International Reading Literacy Study (PIRLS). Finally, U.S. students led the world in civics in 1999, the last year CivEd was given. As of 2009, the nation's 15-year-old students did not compare as well on PISA, especially in math and science. However, the U.S. performed better in reading, scoring among the top 10.

7. Has the U.S. shown improvement on PISA?

The U.S. saw a slight improvement in math scores between 2006 and 2009. It wouldn't be surprising if such gains continued in 2012, as U.S. high school students continue to take more rigorous math courses. It is important to point out that the U.S. has demonstrated improvements on other measures since PISA was first given in 2000. U.S. 4th and 8th graders made among the greatest gains in math between 1995 and 2011 on TIMSS. The U.S. also made dramatic gains in on-time graduation rates, improving from 67 percent in 2000 to 75 percent in 2010, according to Education Week. Even on the National Assessment of Educational Progress (NAEP), U.S. 4th and 8th graders showed significant progress between 2000 and 2013, although high school students are not showing the same gains. The lack of progress on PISA appears to be the exception rather than the rule in terms of international comparisons.

8. How should the results be used?

We need to get beyond seeing PISA as a horse race, fixating on whether the U.S. finishes win, place, or show. Instead, we need to see the PISA results as an opportunity to assess whether best practices in teaching and learning in other countries could also work for secondary schools here in the U.S. For example, we should look at how much time other countries give teachers for professional development, how much they pay their teachers, how much time teachers spend in the classroom, how much flexibility exists at the local level, how special needs students are taught, and how much time students spend in school. Answers to these and other questions could be instructive for U.S. educators and policymakers. While PISA gives us an opportunity to learn from other countries, it is important to keep in mind that just because a high-performing or high-gaining country does something does not mean it will work in U.S. schools.

9. Does poverty affect the U.S. performance on PISA more than in other countries?

Many analysts observe that poverty has a greater impact on student performance in the U.S. than elsewhere. For one thing, the U.S. has the highest child poverty rate among industrialized countries. For another, students in the U.S. who live in poverty tend to have less access to the resources that research consistently shows impact student achievement, including highly effective teachers, rigorous curriculum, and high-quality pre-k programs. Yet poverty is just one of several factors that affect the standing of the U.S. When comparing the performance of top students around the world, where poverty is likely less of a factor, America's top students still do not compare well to their peers in other countries. For example, in 2009 the top students (those scoring in the top 10 percent) in 19 countries outperformed the top U.S. students in science on PISA.

10. Are PISA results a precursor of America’s future economic competitiveness?

Our high school graduates' preparation for postsecondary success certainly has some impact on the future economic competitiveness of the U.S. However, as stated in question 5, PISA is just one measure of high school students' college and career readiness. In addition, many factors besides K-12 schooling contribute to the economic competitiveness of the U.S. and every other country, including, for example, a country's monetary and fiscal policies. But for a country to maximize its economic output, it needs a well-educated society, which would lead to lower unemployment rates and less demand for government services. Stanford University economist Eric Hanushek estimates that if the U.S. had scored 50 points higher on PISA in 2000, GDP in 2015 would be 4.5 percent higher than currently projected. Such an increase is equivalent to the total expenditures on U.S. K-12 schools in 2015. Keep in mind, however, this does not mean that if the U.S. doesn't improve on PISA, GDP will decline when our current high school graduates enter the workforce. It does show, though, that education affects future economic outcomes.
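As a back-of-the-envelope illustration of the scale of Hanushek's estimate (the projected GDP figure below is a hypothetical placeholder, not a number from the source):

```python
# Back-of-the-envelope version of the Hanushek estimate cited above.
# The projected 2015 GDP is a hypothetical placeholder for illustration.
projected_gdp_2015 = 18.0e12  # assumed 2015 U.S. GDP projection, in dollars
uplift_rate = 0.045           # Hanushek's estimated 4.5 percent boost

gdp_uplift = projected_gdp_2015 * uplift_rate
print(f"Estimated uplift: ${gdp_uplift / 1e12:.2f} trillion")
```

With an assumed $18 trillion GDP, the 4.5 percent uplift works out to roughly $0.8 trillion, in the same ballpark as total annual U.S. K-12 spending, which is the comparison the post draws.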






November 21, 2013

Don’t ignore international assessments

The U.S. will once again see how our nation's high school students stack up against their peers in 65 other countries in reading, math, and science when the 2012 PISA results are released on December 3rd. PISA results typically garner a lot of attention because it is the only assessment that compares the knowledge and skills of high school students in nearly every industrialized nation in the world in reading, math, and science.

Unfortunately, the U.S. typically doesn't compare well to other countries on PISA, especially in math and science. In 2009, the last time PISA was administered, 23 countries outperformed the U.S. in math while 18 countries outperformed the U.S. in science. The U.S. fared better in reading, performing as well as or better than all but 8 countries. These results show there is plenty of room for improvement.

Critics often use these results to argue that our schools need to do a better job preparing our future workforce or risk economic disaster, while others argue that results from international assessments such as PISA are meaningless and should all but be ignored. I'll bet most of the rhetoric after the PISA results are released will fall within these two camps.

However, as I wrote in our Guide to International Assessments, we should get beyond such rhetoric and use the results to learn from other countries about what is working for their students. And not just from those countries that score higher than we do, either. We should also look at the countries that have made the greatest gains and examine what changes may have contributed to their newfound success. We should also look deeper into the data to determine which countries did a better job educating certain groups of students. For example, CPE delved deep into PIRLS, the international assessment of 4th grade reading, and found that language minority students perform as well in the U.S. as language minority students in other industrialized countries. Similar analyses should be conducted in other subjects and with other student groups, too, to gain a better understanding of what is working in schools around the globe.

While PISA results should not be used as the sole measure of the effectiveness of our schools, they are one tool that should not be ignored. PISA provides valuable information on how prepared our students will likely be for life after high school. But other information should be used as well, such as high school graduation rates, college persistence and graduation rates, and unemployment rates for recent graduates, to gain a broader perspective on how well our high schools are preparing our students. Just like PISA, none of these measures alone provides a complete picture of the quality of our public schools, but each provides valuable information that should not be ignored. – Jim Hull






September 26, 2013

Despite plateau in overall scores, minority students are more prepared for college

While the overall flat nature of the scores is nothing to celebrate, a closer look at the latest SAT data shows public schools are doing a better job preparing poor and minority students for college, according to the 2013 SAT Report on College Readiness released today.

Although scores for minority students have increased, it is important to point out that huge gaps remain between minority students and their white classmates. The results show that minority students are not completing the rigorous courses they need, not only to score higher on the SAT, but to get into and succeed in college.

Just as the ACT results showed last month, these results show schools need to double and even triple their efforts to make sure all students are adequately prepared for college-level work. To do so, high schools need to ensure that all students are taking the courses they need to succeed in college. Unfortunately, as CPE's latest report Out of Sync found, most states do not require, as a condition of high school graduation, the courses students need to succeed in college. As more graduates plan on enrolling in college, it is more important than ever that a high school diploma represent a student who is ready for higher education, whether at a two- or four-year institution. – Jim Hull

The Findings

National Scores

  • The nation’s graduating Class of 2013 had an average composite score of 1498, which is unchanged from 2012 (1500) but significantly lower than 2009 (1505).
    • At a score of 1498, an average high school graduate has about a 75 percent chance of getting admitted into a competitive four-year college.*
  • Scores remained unchanged in all three sections over the past year. Just as in 2012, scores were 496 in Critical Reading, 514 in Math, and 488 in Writing for 2013.   
  • Scores improved for most racial/ethnic groups.
    • The average combined Hispanic student score was 1354 in 2013, which is three points higher than in 2012 and nine points lower than in 2008.
    • The average black student score was 1278 in 2013, which is five points higher than in 2012 and two points lower than in 2008.
    • The average white student score was 1576 in 2013, which is two points lower than in 2012 and three points lower than in 2008.

College Readiness

  • Just under half (43 percent) of test-takers met the SAT College-Ready Benchmark in 2013, which is unchanged from the year prior and slightly lower than in 2009 (44 percent).
    • The SAT College-Ready Benchmark is a combined score of 1550 or higher. Students reaching this benchmark have a 65 percent chance of earning at least a B-minus grade point average in their freshman-year courses.
  • Minority students are less likely to be college ready.
    • Just 15.6 percent of black students and 23.5 percent of Hispanic students were college ready according to the SAT Benchmark.
    • However, both black and Hispanic students saw increases in reaching the SAT Benchmark from 2012 to 2013.

Core Course Rigor

  • Seventy-five percent of SAT test-takers completed the recommended “core” college-preparatory curriculum, which is an increase from 70 percent in 2001.
    • Just 66 percent of black students and 70 percent of Hispanic students completed the core curriculum, compared to 80 percent of white students.
    • However, both black and Hispanic students saw a one percentage point increase in core curriculum completion rates since 2012.
  • High school graduates who took AP or Honors courses in math or English scored significantly higher than students who simply completed four or more years' worth of courses in each subject, not only in the relevant subject area, but in all three SAT sections.

Test Takers

  • Just over 1.66 million students from the Class of 2013 took the SAT at some point during high school, which was a slight dip from 2012.
  • Slightly more minority students are taking the SAT.
    • In 2013, 17 percent of SAT test-takers were Hispanic which was the same as in 2012, but greater than the 12 percent in 2008.
    • Thirteen percent of SAT test-takers were black in 2013 which was the same as in 2012, but greater than the 11 percent in 2008.
    • The percentage of test-takers who were white continues to drop, from 57 percent in 2008 to 51 percent in 2012 to just 50 percent in 2013.
  • A greater number of students whose first language isn’t English are taking the SAT.
    • In 2013 13 percent of SAT test-takers’ first language was not English compared to 9 percent in 2008.
  • The vast majority (82 percent) of SAT test-takers want to earn at least a Bachelor’s degree, up from 75 percent a decade ago.

For more information on how to use college entrance exam scores to evaluate your school, check out the Center’s Data First Web site.





