Editor’s note: For years, there have been concerns about discordant trend lines for students in many states – rising, according to the state’s own standardized tests, but anemic according to the National Assessment of Educational Progress. In Florida (where we’re based), the lines tend to be in sync. But this post scrutinizes a state where that isn’t the case.
by Alan Bonsteel
The California Department of Education (CDE) just announced it must delay the release of its annual STAR testing results because at least some of the test questions were posted on Internet sites such as Twitter and Facebook. Way back in April, CDE found about 1,600 images on the Internet, though many showed only test booklets and student answer sheets rather than actual questions. After investigating, CDE traced the images to 12 schools around the state. It’s now trying to determine how much damage the leaked questions, at least 36 to date, have caused so it can accurately report scores.
Almost all states do their own testing of their public school students, an obvious conflict of interest that almost always results in unrealistically rosy test results. Because states have been allowed to adopt inherently weak standards, the tests tied to those standards produce results that do not correlate with what students actually need to know. Here in California, the STAR has made up nearly all of the misnamed “Academic Performance Index” since 1999. Although the high school exit exam has been added, the addition of graduation and dropout data, called for in the legislation more than a decade ago, has yet again been delayed until next year.
From the outset, the STAR results soared, at least in part due to lax security, a failure to change or even rotate questions from year to year, and, consequently, some teaching to the actual questions on the test. By contrast, the two other standardized tests given throughout the United States, the NAEP, or National Assessment of Educational Progress, and the SAT, have remained generally flat, showing only very modest improvements over more than a decade in each case. Both of those tests are given by independent testing authorities, and their security has been maintained.
California is not the worst when NAEP scores are analyzed; Iowa, Maine, and Oklahoma have claimed that dubious honor. Thanks to Education Next, you can find a map with results from 41 states here. California edged just above the median.
For years our group, California Parents for Educational Choice, has commented in the news media about the disconnect between the self-administered STAR and the two objective exams, with telling results.
Editor's note: Here's another selection of "choice nuggets," a feature we started last week to keep some smaller but still blogworthy items from going to the compost heap. 
Are vouchers too popular, or not popular enough?
For years, school choice critics have posited that vouchers and tax-credit scholarships will open the floodgates for a mass exodus from public schools. So it was a bit of a shock last week to read Diane Ravitch belittling Louisiana’s new voucher program because, in her view, too few students had applied.
“Not exactly a stampede for the exits,” Ravitch wrote. “No big rush to enroll in the little church schools that are supposedly better than the public schools … ”
According to published reports, about 9,000 students applied for vouchers, not counting those already enrolled in the voucher program in New Orleans. Sounds like a lot of people to me. But if it’s obvious that only a small percentage of parents will opt for private schools (because, truth be told, most parents are satisfied with their public schools), then why are critics so upset? Doesn’t that undermine the argument that school choice is a Trojan horse for profiteers?
Ravitch ends her piece by suggesting Louisiana officials puffed up the application numbers. “As usual,” she concluded, “they were playing the media for headlines.”
Two days later, the Washington Post’s “Answer Sheet” blog ran Ravitch’s piece in full.
A tale of two reports
Two national reports released in the last week purported to offer some gauge of academic progress in Florida’s public schools. One relied on apparently undisclosed measures to determine that Florida’s educational ranking dropped from No. 35 to No. 42 in the past year. The other tracked nearly 20 years of scores from the National Assessment of Educational Progress to conclude that Florida students have made more progress than their peers in every state but one.
Guess which report got more play?