There have been state-commissioned evaluations of both the Milwaukee and Cleveland voucher programs. In Milwaukee, there were no appreciable academic gains at all. In Cleveland, gains were found in one subject, science, for voucher students attending well-established private schools, with declines in test scores for students attending newly opened private schools. Harvard University’s Paul Peterson, whose own methodology and objectivity have been criticized by many in the education research community, has challenged both sets of findings. In the case of Milwaukee, a third researcher, Princeton University economist Cecilia Rouse, has evaluated both the Peterson and Wisconsin studies, finding mixed results. Overall, there is no clear evidence that publicly funded vouchers have improved the educational performance of voucher students.
Milwaukee Parental Choice Program
In Milwaukee, the state commissioned the most thorough research yet conducted on vouchers. This five-year study, conducted by a University of Wisconsin – Madison team led by John Witte, found no appreciable academic gains in reading or math by students attending voucher schools.6 Attrition rates were high: 44% and 32% in the first two years.7 Reasons for leaving the program included application and fee problems, transportation, and the limitation on religious instruction (before the state Supreme Court allowed the expansion of the program in 1998).8 Among parents whose children remained in the program, satisfaction was high.9 Parents of children who remained in the voucher program had higher educational levels—a key determinant of a student’s academic achievement—than those who did not.10
Using Witte’s data, Paul Peterson’s team employed different assumptions and statistical techniques and claimed there was a statistically significant gain for voucher students in the third and fourth years of the program.11 This finding was disputed by many in the research community, who argued that by the third year the control and experimental groups were no longer comparable. An annual attrition rate of 30%—primarily students doing poorly in the voucher program—ensured that those who remained were an academically superior subset, not a random sample.12 Peterson’s methodology has also been criticized on other points, including his reliance on tiny samples—in some cases as few as 26 students—and his dismissal of group differences between voucher school students and the control group (non-choice public school students).13 Witte described Peterson’s reanalysis of the Milwaukee data as a “confusing, tortured effort to try to find any evidence that students enrolled in private schools … do better than any students in the Milwaukee Public Schools.”14 Peterson’s unconventional reporting of statistical significance tests has drawn fire from Professor Alex Molnar, then at the University of Wisconsin-Milwaukee, who described Peterson’s term “substantially significant” as an “important-sounding characterization with no precise research meaning.” The pro-voucher Wall Street Journal also criticized the Peterson study, writing that he had been “loose with his claims.”15 Still others have pointed to contradictions between Peterson’s claims of academic achievement and his own statistical data,16 as well as a lack of adequate controls for students’ prior academic achievement.
Additional criticisms cite Peterson’s failure to adequately address variables that affect a child’s success in school, including parental involvement and expectations, parental employment and marital status, family size and those families receiving Aid to Families with Dependent Children (AFDC) benefits.17
Princeton University’s Cecilia Rouse conducted a third analysis of the same data. Rouse found that the voucher students had made some small gains in math, and that “the [voucher] effects on the reading scores are as often negative as positive and are nearly always statistically indistinguishable from zero.”18 She found that positive effects on the math scores of voucher students were apparent only for the subgroup of students who were in the voucher program over a four-year period, and she noted that this finding does not account for the significant number of dropouts from the voucher program. Moreover, Rouse showed that Milwaukee public schools serving low-income populations that have small class sizes and receive additional state funding keep pace with voucher schools in math gains and substantially outpace them in reading.19 Rouse does not assert that this proves that small classes are the causal factor but calls for more investigation. Alex Molnar and Charles Achilles of Eastern Michigan University, however, do find that reducing class size is more effective than a voucher policy in helping at-risk students, a view supported by Princeton University economists Alan Krueger and Diane Whitmore.20 Referring to another study in which Peterson found large improvements in test scores for African American students and claimed these gains outpaced those observed in the Tennessee class size reduction experiment known as STAR (Student/Teacher Achievement Ratio), Molnar and Achilles fault Peterson for making an apples-to-oranges comparison. They found that when an apples-to-apples comparison is made between similar students, “For those [minority] students, the STAR effects were approximately double the total effects [of vouchers].”21
In 1995, Wisconsin lawmakers who support vouchers responded to the research, not by changing the voucher program, but by eliminating any further state-sponsored research into the educational results of vouchers. The most recent Wisconsin state audit of the voucher program found that “some hopes for the program—most notably, that it would increase participating pupils’ academic achievement—cannot be documented, largely because uniform testing is not required in participating schools.”22 Wisconsin taxpayers thus have absolutely no current information on whether vouchers are having any positive effects on education.
Cleveland Scholarship and Tutoring Grant Program
The state-sponsored study of the Cleveland voucher program, conducted by an Indiana University team led by Kim Metcalf, found that after two years there was no improvement in the overall test scores of students using vouchers in established private schools. On a subject-by-subject basis, voucher students showed gains only in science.23 Scores in the other four subjects (math, language arts, social studies, and reading) revealed no differences between voucher and public school students. A 2001 report published by Metcalf’s team found that the public school groups included in the study demonstrated greater learning gains in the subjects tested—language, reading, and math—over the two years of the study than either of the voucher student groups. While voucher students had higher total test scores soon after entering first grade, this advantage quickly began to erode. In fact, one group of public school students, the applicant/non-recipients, surpassed the 2-year voucher students (who scored 576) and had completely caught up with the 3-year voucher students by the end of second grade (scores of 583 each). And the non-applicant public school students (577) pulled slightly ahead of the 2-year voucher students and closed what had been an 11-point gap between themselves and the 3-year voucher students to only six points.24
Additionally, the clearest, most unequivocal finding from the official evaluations of the voucher program was that students who used vouchers to attend new private schools—those established specifically to serve voucher students—scored significantly lower than their peers, in both public schools and the more established voucher-supported private schools, on academic tests in all subjects.25 For example, Metcalf found that students who used vouchers at the new HOPE Academy schools scored below their peers in both public schools and the established private schools in all five subjects tested (science, math, language arts, social studies, and reading).26 Students using vouchers to attend older, established private schools performed on par with those attending public schools.27
Responding to Metcalf’s first year report, Paul Peterson and his research team claimed that findings from his 1997 evaluation of the two HOPE Academy voucher schools showed significant academic gains.28 He was hired to evaluate these schools by their founder David Brennan, a prominent voucher advocate.29 Peterson criticized the Indiana University study primarily for failing to include the HOPE Academy scores and for using second grade test scores taken prior to entry in voucher schools as a basis for comparison with third grade voucher scores.30
Metcalf responded with a strong article entitled “Advocacy in the Guise of Science.” In it, he suggested that the Peterson researchers “are strong supporters of vouchers and have done much to promote the implementation of voucher programs throughout the country. So it is possible that they are engaged in a deliberate effort to misrepresent the Cleveland data in order to influence educational policy.”31 Specifically, he responded that he did include the HOPE scores but put them into a different section because those students took a different test. With regard to the use of second grade tests, he points out that assessing first year results of an experiment without a baseline is “a little like trying to determine who won a basketball game by looking only at the points scored in the second half of the game.”32 The Ohio Legislative Committee on Education Oversight (LCEO), responsible for monitoring Cleveland’s voucher program, further discredited Peterson’s criticisms. The LCEO had appointed Peterson to its technical review committee of the Indiana study, and charged that Peterson released his critique to the Wall Street Journal, the Washington Post, and the World Wide Web because he didn’t like the results, even though the study’s methods “are viewed as appropriate and credible by [other] disinterested scholars.”33
Florida A+ Plan
In Florida, vouchers are available to students living in the attendance zone of any school designated by the state as “failing” in any two out of four years. As of the 2001-02 school year, the Florida program has been operating for three years, has included just ten elementary schools statewide, and has no documented research on its impact on achievement among voucher students. There has been an effort by researchers to assess the impact of the “voucher threat” in Florida on public school student test scores (discussed below).