MPR’s Unfortunate Sidestepping around Money Questions in the Charter CMO Report

Posted on November 6, 2011



Let me start by pointing out that Mathematica Policy Research, in my view, is an exceptional research organization. They have good people. They do good work and have done much to inform public policy in what I believe are positive ways. That’s why I found it so depressing when I started digging through the recent report on Charter CMOs – a report which, as framed, was intended to explore the differences in effectiveness, practices, and resources of charter schools operated by various Charter Management Organizations.

First, allow me to point out that I believe the “relative effectiveness of CMOs” is not necessarily the right question – though it does have particular policy relevance when framed that way. Rather, I believe the right questions at this point are not about charter versus non-charter, or KIPP versus Imagine or White Hat, but about what these schools are doing and whether we have evidence that it works (across a broad array of students and outcome measures). Then, once we get a better picture of what is working, and for that matter what is not, we also need to consider very carefully, and in detail, the cost structure of the alternatives – that is, if what they are doing really is alternative to (different from) what others are doing. Of course, it is relevant to a measured expansion strategy to know which management organizations have particularly effective strategies. But we only develop useful information on how to transfer successes beyond the charter network by understanding the costs and effects of the strategies themselves.

So, as I read through the Mathematica CMO study, I was curious to see how they addressed resource issues. What I found in terms of “money issues” were three graphs, each of which is pretty damn meaningless, and arguably well below Mathematica’s high quality research standards.

Here’s the first graph. It shows what I believe to be the average per-pupil spending of charter schools by CMO network, and it reveals a very wide range. Now, this one bugs me on a really basic level, because as far as I can tell, the authors didn’t even try to correct their spending measures for differences in regional costs. So, any CMO which operates more schools in lower cost labor markets will appear lower, and any CMO in higher cost labor markets will likely appear higher. In short, this graph tells us nothing at all.

Figure 1

Source: http://www.mathematica-mpr.com/publications/PDFs/Education/cmo_final.pdf

Rule #1: Money always needs to be evaluated in context. Actually, the easiest way to deal with regional or local corrections is simply to compare the expenditures to the average expenditures of other school types in the same labor market. That is, what percent above or below traditional public schools and/or private schools is charter spending among schools in the same labor market (one can use Core Based Statistical Areas as a proxy for labor market)? Notably, the tricky part here is figuring out the relevant spending components, such as equating traditional public school facilities, special education, and transportation costs with the cost responsibilities of charters. Alternatively, one can use something like the NCES Education Comparable Wage Index (though dated now) to adjust spending figures across labor markets.
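To make the point concrete, here is a minimal sketch (in Python, with entirely made-up numbers and hypothetical column names – not Mathematica’s or NCES’s actual data) of the kind of within-labor-market comparison I have in mind: express each charter’s per-pupil spending as a percent above or below the average of traditional public schools in the same CBSA.

```python
# Sketch: put charter spending in context by comparing it to the average
# traditional public school (TPS) spending in the same labor market (CBSA),
# rather than comparing raw dollars across regions. All data are invented.
import pandas as pd

schools = pd.DataFrame({
    "school":   ["Charter A", "Charter B", "TPS 1", "TPS 2", "TPS 3"],
    "type":     ["charter", "charter", "tps", "tps", "tps"],
    "cbsa":     ["NYC", "Houston", "NYC", "NYC", "Houston"],
    "pp_spend": [18500, 9800, 17000, 19000, 9000],   # per-pupil spending, $
})

# Average TPS spending in each CBSA (CBSA used as a labor-market proxy)
tps_avg = (schools[schools["type"] == "tps"]
           .groupby("cbsa")["pp_spend"].mean()
           .rename("tps_avg")
           .reset_index())

# Each charter's spending as a percent above/below its local TPS average
charters = schools[schools["type"] == "charter"].merge(tps_avg, on="cbsa")
charters["rel_spend_pct"] = 100 * (charters["pp_spend"] / charters["tps_avg"] - 1)

print(charters[["school", "cbsa", "pp_spend", "tps_avg", "rel_spend_pct"]])
```

The same relative-spending column could instead be built by deflating raw spending with a comparable wage index; the point is simply that a raw dollar figure with no local benchmark is uninterpretable.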

In their second figure, Mathematica compares reported IRS filing expenditures to public subsidy figures. But rather than bothering to dig up the public subsidy figures themselves, Mathematica relies on figures from a dated and highly suspect report – the Public Impact/Ball State report on charter school finances. I’ve written previously about the many problems with the data in this report. There’s really no reason Mathematica should have been relying on secondary reported data like these when it’s pretty damn easy to go to the primary source. Further, this graph doesn’t really tell us anything either.

Figure 2

Source: http://www.mathematica-mpr.com/publications/PDFs/Education/cmo_final.pdf

What do we really need and want to know? We need to know:

  1. Does it cost more, and if so how much more, to do the kinds of things the report identifies as practices of successful charter schools, such as running marginally smaller schools with smaller class sizes?
  2. What kind of wages are being paid to recruit and retain teachers who are working the extra hours and delivering the supposedly more successful models?
  3. How does the aggregate of these spending practices stack up against other types of schools in given local/regional economic contexts?

The financial analyses provided by Mathematica may as well not even be there. Actually, it would be a much better report if those graphs were simply dropped, because they are meaningless. They are also simply bad analyses, certainly well below the technical quality of research commonly produced by Mathematica.

Here are a few examples of what I’ve been finding on these questions, drawn from recent blog posts but part of a larger exploration of what we can learn from extant data on charter school resource allocation.

First, here are some data on KIPP school expenditures compared in context in NYC – that is, comparing the relevant school site expenditures (with a footnote on the odd additional spending embedded in KIPP Academy’s financial reports) within NYC. Here, it would appear that KIPP schools in certain zip codes in NYC may be significantly outspending traditional public schools serving the same grade ranges in the same zip codes (perhaps more consistently if we spread the KIPP Academy spending across the network, as I discuss in my report at the end of this post). The next step is to compare the underlying salary structures, class sizes, and other factors which explain (or are a result of) these spending differences. I’m not there yet with this analysis. More to come.

Figure 3
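For readers curious what the “spread the KIPP Academy spending across the network” adjustment might look like in practice, here is a rough sketch. The school names, dollar amounts, and the size of the embedded network-level spending are all hypothetical placeholders, not the actual reported values.

```python
# Sketch: reallocate network-level dollars reported on one school's filing
# across all network schools in proportion to enrollment, then recompute
# per-pupil spending. All figures below are invented for illustration.
import pandas as pd

kipp = pd.DataFrame({
    "school":     ["KIPP Academy", "KIPP School 2", "KIPP School 3"],
    "enrollment": [650, 400, 350],
    "site_spend": [14_000_000, 5_600_000, 4_900_000],  # total $, per filings
})

# Suppose a portion of KIPP Academy's reported spending is really
# network-level (the "odd additional spending" noted above).
network_level = 3_000_000                  # hypothetical embedded dollars
kipp.loc[0, "site_spend"] -= network_level

# Reallocate the network-level dollars by enrollment share
share = kipp["enrollment"] / kipp["enrollment"].sum()
kipp["adj_spend"] = kipp["site_spend"] + network_level * share
kipp["adj_pp_spend"] = kipp["adj_spend"] / kipp["enrollment"]

print(kipp[["school", "adj_pp_spend"]].round(0))
```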

Second, here’s how KIPP (and other charter) school spending per pupil compares in Houston, Texas, based only on the school site spending reports from the Texas Education Agency and not necessarily including additional CMO-level allocations (that analysis is in the works). Clearly, there’s some screwy stuff to be sorted out here as well. My point with these figures is merely to show how one can put spending in context and use more relevant numbers. Again, there are similar next steps to explore.

Figure 4

From a related recent post, here again are the class sizes and salary structure of Amistad Academy, a successful Achievement First school in New Haven, Connecticut. If there are two things that really drive the cost of operating any particular educational model, they are a) the quantity of staff needed to deliver the model, as measured in terms of class sizes (number of teachers), and b) the price that must be paid for each staff member in order to recruit and retain the kind of staff you want delivering that model.

Figure 5

Figure 6

These figures show that two strategies employed by Amistad are a) lower class sizes in the early grades and b) much higher teacher salaries across the entire range of experience (among the experience range held by Amistad teachers), but especially in the early to mid-career stages. These are potentially expensive strategies to replicate and/or maintain. But they may just be good strategies, and may actually be the most cost-effective approach. We’ll never know if we don’t actually take the time to study it. We may also find that these approaches become more expensive as we attempt to scale them up and put greater strain on local teacher labor markets (supply).
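Here is a back-of-the-envelope sketch of how those two drivers (staffing quantity and staffing price) translate into per-pupil cost. The class sizes, salaries, and benefits rate are illustrative assumptions, not Amistad’s actual figures.

```python
# Sketch: per-pupil teacher cost as a function of the two cost drivers named
# above: staffing quantity (1 / class size) and staffing price (salary plus
# benefits). Numbers are illustrative only.
def per_pupil_teacher_cost(class_size: float,
                           avg_salary: float,
                           benefits_rate: float = 0.30) -> float:
    """Teachers per pupil (1 / class size) times total compensation."""
    total_comp = avg_salary * (1 + benefits_rate)
    return total_comp / class_size

# A conventional staffing model vs. a hypothetical small-class, higher-salary model
baseline = per_pupil_teacher_cost(class_size=25, avg_salary=60_000)
model    = per_pupil_teacher_cost(class_size=20, avg_salary=72_000)

print(f"baseline: ${baseline:,.0f} per pupil")
print(f"smaller classes + higher pay: ${model:,.0f} per pupil")
print(f"added cost per pupil: ${model - baseline:,.0f}")
```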

Notably, I’ve been finding similar approaches to teacher compensation in the more recognized New Jersey charter schools. I have shown previously, and here it is again, that schools like TEAM Academy seem to be shooting for higher salaries than neighboring/host public districts. So too are schools like North Star Academy. But others (often less stellar [pun intended] charters) are not.

Figure 7

 

Now’s the time to get more serious about digging into the resource issues and providing useful information on the underlying cost structure of the educational models and strategies being used in successful charter networks, individual schools, or anywhere else for that matter.

Mathematica is far from alone in giving short shrift to these questions. Roland Fryer’s Houston Apollo 20 study provided only marginally less flimsy analysis of the costs associated with the “no excuses” model, and made unsupported assertions regarding the relationship of Apollo 20 costs to “no excuses” charter school costs (see http://www.houstonisd.org/HISDConnectEnglish/Images/Apollo/ApolloResults.pdf; the full paper provides only marginally more information regarding costs).

So, why do I care so much about this, and more importantly, why should anyone else? Well, as I explained in a previous post, there’s a lot of mythology out there about education policy solutions – like “no excuses” charter schools – that supposedly do more with less, that get better outcomes for less money. Most of the reports that pitch this angle simply never add up the money, and they fail to do any analysis of what it might cost to implement similar strategies at greater scale or in different contexts. Is it perhaps possible that most improvements will simply come at greater overall cost?

Here’s the other part that’s been bugging me. It has often been asserted that the way to fix public schools is to A) replace them with more charter schools and/or B) stop bothering with small class sizes and get rid of additional pay for things like increased experience.

As far as I can tell from the available data, Option A and Option B above may just involve diametrically opposed strategies. As far as I’ve seen in many large data sets, charter schools that we generally acknowledge as “successful” are trying to pay teachers well, and their teacher salaries are generally highly predictable as a function of experience (based on regression models of individual teacher data). That said, the shape of their salary schedules is often different from their hosts and surroundings – different in a way I find quite logical. Further, charters with additional resources seem to be leveraging those resources at least partly to keep class sizes down (certainly not in the 35 to 40 student range of many NYC or California public schools). Total staffing costs may still be lower, mainly because charter teachers and other staff still remain “newer.” But sustaining current wage premiums may be tricky as charter teachers stay on for longer periods.
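For what it’s worth, here is a minimal sketch of the kind of regression I’m referring to: individual teacher salary modeled as a function of experience. The data below are simulated, not actual charter payroll records; with real data, a high R-squared is what I mean by “highly predictable.”

```python
# Sketch: regress individual teacher salary on years of experience and check
# how much of the variation experience explains. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)
experience = rng.integers(0, 15, size=200)                      # years of experience
salary = 48_000 + 2_500 * experience + rng.normal(0, 2_000, 200)  # simulated salaries

# Simple OLS of salary on experience (design matrix with an intercept column)
X = np.column_stack([np.ones(len(experience)), experience])
coef, _, _, _ = np.linalg.lstsq(X, salary, rcond=None)
predicted = X @ coef
r2 = 1 - np.sum((salary - predicted) ** 2) / np.sum((salary - salary.mean()) ** 2)

print(f"intercept: ${coef[0]:,.0f}, return per year of experience: ${coef[1]:,.0f}")
print(f"R-squared: {r2:.2f}  (higher = salaries more predictable from experience)")
```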

Again, in my preliminary analyses, I’m seeing some emphasis in some cases on the early grades, which makes sense. What I’m not seeing is dramatically lower spending, with very large class sizes and flat (with respect to experience) but high teacher salaries (maximized within the budget constraint) – at least among high-flying charters. That is, I’m not seeing a complete disregard for class size reduction in order to achieve the wage premium. I’m seeing both/and, not either/or (and both/and is more expensive than either/or).

So, pundits are arguing, on the one hand, to expand “successful” charter schools that are pursuing rather traditional resource allocation strategies, while arguing, on the other, that public school resource allocation strategies are fatally flawed and entirely inefficient. They only get away with this argument because they fail to explore in any depth how successful charter schools allocate resources and the cost implications of those strategies. It’s time to start taking this next step!

See also:

From: Baker, B.D. & Ferris, R. (2011). Adding Up the Spending: Fiscal Disparities and Philanthropy among New York City Charter Schools. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/NYC-charter-disparities.

Posted in: Charter Schools