Education policy research, and the rhetoric emerging from that research, typically fail to represent the realities – the real distribution – of schooling across our nation. We focus extensively on urban schooling, often ignoring what might or might not work in the suburbs or rural areas. We focus on the development of reading and math/STEM skills, but far less on other content, knowledge, or skills. We focus on measuring teacher/teaching quality through estimates of student gains in reading and math, but often ignore the contextual factors that may influence teacher effectiveness, or other less commonly measured outcomes.
These foci aren’t necessarily problematic. After all, there’s only so much we can research at any given time. The problem lies in our desire – specifically in translation to policy recommendations – to broadly extrapolate the meaning of these findings. Sometimes, it’s a mere extrapolation problem, where researchers and well-meaning policymakers simply wish to project one finding onto an entirely different situation.
Other times, it’s a straight-up bait-and-switch, where self-interested actors or advocates point to one great success, then swap in a cheap imitation in their own policy recommendations/proposals. It’s like holding up Exeter or Andover as examples of great private schooling to advance a voucher argument, and then providing a voucher sufficient only for children to attend their local evangelical school housed in a double-wide on a vacant lot. [private school cost/spending data here]
Charter school research is, in my view, one of the most problematic areas of education policy research, especially in its translation to policy recommendations. The most extreme version of the pro-charter unregulated expansion argument goes something like this:
Look at the research on those massive gains created by KIPP schools, especially those studies done by Mathematica researchers, as well as the work of Dynarski and colleagues in Boston. And hey, look at Fryer’s stuff on NYC’s ‘no excuses’ charters! This is incredible. We must move forward with all deliberate speed to replicate this!
And how must we do that – well, we really need to take the lid off this movement – no caps. Increase subsidies. Have more authorizers available to fast-track the flood of new applications from these great providers! That’s it. Move… and move now! From Arizona, to Ohio, Michigan, Florida and beyond – authorizers will ensure accountability. The few… and let me emphasize that… few… bad actors will be shut down, and these amazing providers will flourish across our great nation!
Okay… so maybe I’m overdoing it a bit. Certainly some, even those at the Thomas B. Fordham Institute, have now admitted problems in the Ohio charter accountability process. And the folks at Brookings have raised questions about the quality of Arizona charter schools.
At least a handful of studies of high-profile charter operators have yielded substantive, positive results, at least with respect to growth on narrowly measured student achievement outcomes, and in some cases on college acceptance/matriculation. Of course, even these studies, like the Mathematica/KIPP studies or the Fryer studies, tend to totally ignore key features of the models that may be contributing to those outcomes – like money, smaller classes, more time, and teacher pay to support that time.
But here’s the bigger picture – in all of this time that we’ve been allowing and inducing charter school growth, while studying KIPPs and others to validate positive effects, we’ve paid far too little attention to the actual distribution of providers out there. Most charter schools aren’t KIPP (whether or not we like KIPP or their educational model/practices). And most charter schools across the country aren’t like NYC’s (or Boston’s) other major charter operators.
First, here’s a national map of charter school distribution based on the 2013 NCES Common Core of Data public school universe survey, with indicators from 2011-12 of the management type for those schools (data set for Wisconsin incomplete). Yeah… there are charter schools in major Northeastern/Mid-Atlantic cities from Boston to DC, but there are also one heck of a lot of charter schools in Florida, Ohio, Michigan, Arizona, and other states where charters have a far less stellar track record. There are also, as shown in Figure 1, a large number of charter schools presently operated by for-profit management companies (much more on the implications of this in a future post).
Figure 1. National Distribution of Charter Schools by Management Type
And even in the Northeast, most charter schools are NOT those heavily privately subsidized KIPP schools (though there are a few more “Uncommon” schools than elsewhere). Yeah, there are a few scattered around, and by 2015 KIPP’s own website indicates that they serve about 70k kids, but that would still only put them near the top five providers nationally against our 2011-12 figures (assuming no other operator grew since then, as KIPP did).
Who is out there? Well, Edison Schools is still there, along with White Hat, K12, and others with more questionable track records. And this (region) is as good as it gets!
In these maps, I’ve only labeled the management company for companies that managed at least 20 schools in our 2011-12 data. Management company data are linked from prior work by Gary Miron and colleagues (diligently matched/vetted by my doctoral student Mark Weber; the mapped version here is from a preliminary data set). Note that a common issue with mapping this type of data is that a school may show up at the corporate address rather than the actual school location.
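The data preparation described above amounts to a join between the CCD public school universe file and a management-company crosswalk. A minimal pandas sketch of that step, with hypothetical column names and toy records standing in for the actual CCD and Miron/Weber data:

```python
import pandas as pd

# Toy stand-ins for the 2011-12 CCD extract and the management-company
# crosswalk. Column names here are illustrative, not actual CCD field names.
ccd = pd.DataFrame({
    "ncessch": ["A1", "A2", "A3"],
    "school_name": ["Alpha Charter", "Beta Academy", "Gamma Prep"],
    "lat": [40.7, 41.5, 25.8],
    "lon": [-74.0, -81.7, -80.2],
})
managers = pd.DataFrame({
    "ncessch": ["A1", "A3"],
    "manager": ["KIPP", "Academica"],
    "profit_status": ["nonprofit EMO", "for-profit EMO"],
})

# Left join so schools with no identified manager are kept as independents
merged = ccd.merge(managers, on="ncessch", how="left")
merged["manager"] = merged["manager"].fillna("Independent")

# Caveat from the text: some records geocode to the corporate office rather
# than the school site, so lat/lon should be spot-checked before mapping.
print(merged[["school_name", "manager"]])
```

The left join matters: an inner join would silently drop every independent (non-EMO-managed) charter, which is exactly the part of the distribution this post argues gets overlooked.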
Figure 2. Atlantic Coast Schools
Figure 3. Southeast
Aside from a few KIPP schools, most southeastern charters are either managed by smaller firms, are independent, or are managed by White Hat. Florida is even more fun. Florida has a plethora of providers we don’t often hear about in academic research, but more often hear about in scandalous news headlines (Academica, Mavericks, Imagine [more later]).
Figure 4. Florida
Great Lakes states are dominated by providers like White Hat Management, Summit Academy, Concept Schools, and also Imagine Schools! No, these are not the Exeters and Andovers (or KIPPs) of charter schooling. What they are is the actual population of providers as they exist across these states. Again, there are a few KIPPs in there, but they are clearly overshadowed by other providers for whom we have far less rigorous academic analysis of their program/service quality.
Figure 5. Great Lakes
The KIPP presence in Texas and New Orleans is somewhat stronger, but again, KIPP schools exist amidst a multitude of other providers, including National Heritage Academies and Cosmos schools, among others.
Figure 6. Gulf Coast
Figure 7. Southwest
Figure 8. California
I’m certainly not saying these are all bad schools, or that any non-KIPP charter can’t be as good as a KIPP charter. That’s just silly. I’m also not saying that all KIPP schools are great, or that they are the only great charter schools. Yes, my own research has found KIPP schools, along with Uncommon Schools and other NYC charter chains, to be particularly well endowed financially.
What I am saying is that it is very problematic to adopt the logic –
KIPP = Charter School
KIPP = Awesome (by virtue of KIPP-iness alone)
Charter = Awesome (by association with KIPP-iness)
therefore unfettered expansion of charters is super totally awesome!
For most of America, this argument leads to a massive bait and switch.
Further, as I’ve explained in recent posts, it would be worth our time to step out of the charter vs. district school box and ask more broadly: what’s working? Where is it working? Why is it working? Can it work similarly elsewhere? And most importantly, what the heck do we mean by “working?”
We must collectively evaluate charter, district, and, for that matter, private schools across rural, urban, and suburban settings, with consideration for the uniqueness of regional, state, and local policy contexts. This is all doable. But periodically, we need to step back and assess what’s really going on.
Market share by manager (for managers enrolling over 10k students in 2011-12; all others aggregated to a single category; excludes Wisconsin data)
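The caption above describes a simple aggregation: total enrollment by management company, with managers below the 10k-student threshold lumped into a single category. A hedged sketch of how that might be computed in pandas, using toy numbers and hypothetical column names rather than the actual 2011-12 CCD figures:

```python
import pandas as pd

# Illustrative toy records, not real 2011-12 enrollment data
schools = pd.DataFrame({
    "state": ["OH", "OH", "FL", "NJ", "WI"],
    "manager": ["White Hat", "White Hat", "Academica", "KIPP", "Example Mgmt"],
    "enrollment": [9000, 4000, 15000, 2000, 5000],
})

# Exclude Wisconsin, as in the post's data set
schools = schools[schools["state"] != "WI"]

# Total enrollment per manager
totals = schools.groupby("manager")["enrollment"].sum()

# Managers below 10k students collapse into one "all others" category
big = totals[totals >= 10_000]
small = pd.Series({"All other managers": totals[totals < 10_000].sum()})
shares = pd.concat([big, small]) / totals.sum()

print(shares.sort_values(ascending=False))
```

Note that the threshold is applied to each manager's total across all of its schools, not to individual schools, so a chain of many small schools can still clear the 10k bar.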
Hmmm… seemed relevant (a bit over the top, but relevant):