PolitiFact misreads Wisconsin voucher research

Speaking before the Milwaukee Rotary Club on May 6th, Wisconsin gubernatorial candidate Mary Burke claimed the state’s school voucher program “has no research that shows that it’s going to improve student learning.” PolitiFact Wisconsin examined some school choice research and noted Burke’s claim was an “overstatement,” but then rated it “Mostly True.”

PolitiFact must be grading on a curve.

There is no nuance about the word “no”; in this instance, it means “none” or “zero.” So why does PolitiFact cite three research papers that find academic gains attributable to vouchers and then give the “no research” claim a “Mostly True” rating?

To reach that conclusion, PolitiFact's researchers had to misread the evidence they evaluated, overvalue academic caution (ironically, while rating a politician's hyperbole) and exclude other supportive research. Let me explain by offering more detail about the research PolitiFact cited, and about the research it inexplicably overlooked.

The Public Policy Forum’s report found public school students scored higher on state assessments than private school voucher students. However, the report failed to control for income differences or to examine voucher students’ test scores before they received a voucher. It is possible voucher students are poorer, on average, or that they tended to score worse on state assessments even before receiving a voucher (as is the case here in Florida). Because of these flaws, the report cannot support any claims about the impact of vouchers on students. Yet PolitiFact overlooked the PPF study’s methodological weaknesses and gave it more weight than any of the other studies mentioned.

PolitiFact also cited a multi-year study by researchers at the University of Arkansas, which found statistically significant achievement growth in reading, though not in math, in the final year. Academic researchers tend to be cautious in their conclusions, and these researchers noted the achievement gains coincided with the implementation of high-stakes testing, which could offer an alternative explanation for the gains. But PolitiFact overstated that nuance so much it functionally ignored the positive finding.

PolitiFact does accurately cite two reports about vouchers in Milwaukee: a 2003 study by Caroline Hoxby, which found public school students saw test score gains when their schools faced competition from vouchers, and a 2008 study by researchers at the Federal Reserve, which found improved public school performance after the program’s 1998 expansion increased both the supply of participating private schools and the size of the scholarship.

However, PolitiFact researchers skipped other high-quality studies that also found learning gains:

Even setting aside these five additional research papers, PolitiFact still cited three studies supportive of vouchers. Burke’s claim that there is “no research” was not a mere “overstatement.” It was false.

One last note: The evidence suggests voucher students are experiencing slightly higher gains than public school students, even as those public school students post significant gains of their own. To call the evidence “weak,” as PolitiFact did, is like raising the bar in the middle of an Olympic high jump competition and claiming the jumper failed because he cleared it by an inch instead of a foot.



BY Patrick R. Gibbons

Patrick Gibbons is public affairs manager at Step Up for Students and a research fellow for the Friedman Foundation for Educational Choice. A former teacher, he lived in Las Vegas, Nev., for five years, where he worked as an education writer and researcher. He can be reached at (813) 498.1991 or by email at pgibbons@stepupforstudents.org. Follow Patrick on Twitter at @PatrickRGibbons and @redefinEDonline.