In April 2012, KIPP responded to a press release for Is choice a panacea? An analysis of black secondary student attrition from KIPP, other private charters and urban districts, a peer-reviewed paper published in the Berkeley Review of Education (BRE) on African American secondary student attrition from charter schools across Texas. KIPP's response criticized the research on four main points, which I address below.
1. Vasquez Heilig relied on previous studies that claimed KIPP achieves results through high student attrition, while completely ignoring findings from the independent research group Mathematica that KIPP loses fewer black male students than neighboring district schools.
Notably, the suite of Mathematica reports and conference presentations focuses on middle schools only. Our analyses focus on secondary students, meaning both middle and high schools in grades 7-12 (see p. 160 of the article). Also, our study was published in BRE, a peer-reviewed journal with a blind review process. Calling Mathematica "independent" when KIPP is one of its paying "clients" is disingenuous.
2. The researchers used faulty methodology to draw false and inflammatory conclusions, directly contradicting Mathematica’s research—which found that KIPP produces significant and substantial academic gains for all students, including African-American students, that are not due to attrition.
I posit that it is inflammatory to accuse a study of faulty methodology without providing a valid basis for that claim. Actually, I don't believe KIPP read the paper. Why? Because we never made claims about the relationship between attrition and achievement. We simply analyzed publicly available attrition data that Texas law requires to be collected in PEIMS (the Lone Star State's school data repository). KIPP Houston's African American secondary student attrition problem sat in PEIMS for a decade, and it is still there for anyone to see, regardless of KIPP's spin.
3. Vasquez Heilig makes the inaccurate claim that KIPP receives $3,361 more in total revenue than the Houston Independent School District, and incorrectly infers that KIPP Houston spends more per pupil than the district. In reality, KIPP Houston, like all public charter schools in Texas, receive less per pupil funding than district schools and no public revenue for facilities. Excluding private funds raised to cover facilities costs, KIPP Houston spends less per student per year than HISD.
KIPP is incorrect. NEPC also thinks so here. It's hard to argue with publicly available data that they themselves are required by law to report. Per-student revenue for KIPP Austin ($17,286) and KIPP Houston ($13,488) relative to Austin ISD ($10,667) and Houston ISD ($10,127) is readily available online each year from the State of Texas. However, considering the current school finance debacle in Texas, where nearly $6 billion was cut from education by the last legislature, in retrospect I think KIPP should be applauded for spending more on education, as should other charters such as Making Waves.
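For concreteness, the gaps implied by these state-reported figures can be checked with simple arithmetic; note that the Houston gap is exactly the $3,361 figure that KIPP's third point disputes. A minimal sketch (revenue figures as cited above):

```python
# Per-student revenue figures cited above, as reported by the State of Texas.
revenue = {
    "KIPP Austin": 17286,
    "Austin ISD": 10667,
    "KIPP Houston": 13488,
    "Houston ISD": 10127,
}

# Gap between each KIPP region and its neighboring district.
austin_gap = revenue["KIPP Austin"] - revenue["Austin ISD"]
houston_gap = revenue["KIPP Houston"] - revenue["Houston ISD"]

print(f"Austin gap:  ${austin_gap:,} per student")   # $6,619
print(f"Houston gap: ${houston_gap:,} per student")  # $3,361
```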
4. The paper consists largely of repackaged findings from a study on KIPP released last year by Western Michigan University (WMU), whose conclusions were immediately discredited by researchers at the liberal Brookings Institute and other respected organizations.
The BRE paper uses different data sources (Texas PEIMS data versus WMU's federal Common Core of Data). Our study focuses specifically on African American students in all charters across the state of Texas, whereas the WMU study was a national analysis focused solely on KIPP campuses. Again, I don't think KIPP bothered to read the paper (but that never gets in the way of spin), because they stated,
In comparing KIPP’s student attrition with that of neighboring school districts, Vasquez Heilig and his colleagues relied on faulty assumptions. At the time of the study, not all of the cities in question had KIPP high schools, so the report’s conclusions are based on comparing attrition in charter middle schools to district middle and high schools—an apples to oranges comparison.
This is not a valid critique. We noted in the paper that only KIPP Houston had a decade's worth of grade 7-12 data (shown on p. 168), so the decadal KIPP attrition analysis was restricted to Houston (see Table 9 on p. 171).
In conclusion: Does 100% of 60% really add up to 100%? (See p. 172) That’s Enron math.
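The "Enron math" point is survivorship arithmetic: if only 60% of an entering cohort remains, a "100%" success rate among those who remain still covers only 60% of the students who started. A minimal sketch (the 100-student cohort is a hypothetical illustration; the 60% retention figure is the one discussed on p. 172):

```python
cohort = 100           # hypothetical entering cohort
remaining = 60         # 60% remain after attrition
graduates = remaining  # suppose every remaining student succeeds

rate_among_survivors = graduates / remaining  # 1.0 -> reported as "100%"
rate_of_entire_cohort = graduates / cohort    # 0.6 -> only 60%

print(f"{rate_among_survivors:.0%} of survivors, "
      f"but only {rate_of_entire_cohort:.0%} of the original cohort")
```

The headline rate and the cohort rate diverge by exactly the attrition share, which is the point of the question above.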
If you have any other questions, shoot me an email at email@example.com.