Editor’s note: For years, there have been concerns about discordant trend lines for students in many states – rising, according to the state’s own standardized tests, but anemic according to the National Assessment of Educational Progress. In Florida (where we’re based), the lines tend to be in sync. But this post scrutinizes a state where that isn’t the case.
by Alan Bonsteel
The California Department of Education (CDE) just announced it must delay the release of its annual STAR testing results because at least some of the test questions were posted on Internet sites such as Twitter and Facebook. Way back in April, CDE found about 1,600 images on the Internet, though many were only of test booklets and student answer sheets rather than actual questions. After investigating, CDE traced the images to 12 schools around the state. It is now trying to assess how much damage the leaked questions (at least 36 to date) have caused so it can accurately report scores.
Almost all states do their own testing of their public school students, an obvious conflict of interest that almost always produces unrealistically rosy results. Tests tied to the inherently weak standards states have been allowed to adopt yield scores that do not correlate with what students actually need to know. Here in California, the STAR has made up nearly all of the misnamed "Academic Performance Index" since 1999. Although the high school exit exam has since been added, the addition of graduation and dropout data, called for in the legislation more than a decade ago, has yet again been delayed until next year.
From the outset, the STAR results soared, at least in part because of lax security, a failure to change or even rotate questions from year to year, and, consequently, some teaching to the actual questions on the test. By contrast, the two other standardized tests given throughout the United States, the National Assessment of Educational Progress (NAEP) and the SAT, have remained generally flat, each showing only very modest improvement over more than a decade. Both of these tests are given by independent testing authorities, and their security has been maintained.
California is not the worst when NAEP scores are analyzed; Iowa, Maine, and Oklahoma claim that dubious honor. Thanks to Education Next, you can find a map with results from 41 states here. California edged just above the median.
For years our group, California Parents for Educational Choice, has commented in the news media about the disconnect between the self-administered STAR and the two objective exams, with telling results.
Now the Education Next study confirms that we are not doing well enough in California, or nationally for that matter, to catch up with our international competition, or even with the states showing the best improvement. In a state that is becoming more and more dependent on knowledge-based industries (biotech, electronics, engineering), this is indeed bad news.
The study also reinforces what our group has been saying for a decade: the problem in education is not primarily funding. Just about as many high-spending states showed relatively small gains as showed large ones. There's not much correlation, and if you hamstring your education code with teacher-union-inspired legislation, as California has repeatedly done, the correlation declines even further. When you can't fire, or often even evaluate, a teacher, what do you expect?
Throughout the U.S., local school choice groups can make the same comparison between the NAEP and the SAT on the one hand and self-administered, usually non-secure state tests on the other. We can make the point that schools of choice, especially after adjusting for students' socio-economic status and the much lower per-student funding of charter schools, are in the aggregate outperforming traditional public schools by objective standards. This tends to be especially true after charter schools have matured for four to five years.