MPR’s Unfortunate Sidestepping around Money Questions in the Charter CMO Report


Let me start by pointing out that Mathematica Policy Research, in my view, is an exceptional research organization. They have good people. They do good work and have done much to inform public policy in what I believe are positive ways. That’s why I found it so depressing when I started digging through the recent report on Charter CMOs, a report which, as framed, was intended to explore the differences in effectiveness, practices, and resources of charter schools operated by various Charter Management Organizations.

First, allow me to point out that I believe that the “relative effectiveness of CMOs” is not necessarily the right question, though it does have particular policy relevance when framed that way. Rather, I believe that the right questions at this point are not about charter versus non-charter, or KIPP versus Imagine or White Hat, but about what these schools are doing and whether we have evidence that it works (across a broad array of students and outcome measures). Then, once we get a better picture of what is working… and for that matter… what is not, we also need to consider very carefully… and in detail… the cost structure of the alternatives, that is, whether what they are doing is really an alternative to (different from) what others are doing. Of course, it is relevant to a measured expansion strategy to know which management organizations have particularly effective strategies. But we only develop useful information on how to transfer successes beyond the charter network by understanding the costs and effects of the strategies themselves.

So, as I read through the Mathematica CMO study, I was curious to see how they addressed resource issues. What I found in terms of “money issues” were three graphs… each of which was pretty damn meaningless, and arguably well below Mathematica’s high-quality research standards.

Here’s the first graph. It shows what I believe to be the average per-pupil spending of charter schools by CMO network and shows a very wide range. Now, this one bugs me on a really basic level, because as far as I can tell, the authors didn’t even try to correct their spending measures for differences in regional costs. So, any CMO which operates more schools in lower-cost labor markets will appear to spend less, and any CMO operating in higher-cost labor markets will appear to spend more. In short, this graph really tells us nothing at all.

Figure 1

Source: http://www.mathematica-mpr.com/publications/PDFs/Education/cmo_final.pdf

Rule #1: Money always needs to be evaluated in context. Actually, the easiest way to deal with regional or local corrections is simply to compare charter expenditures to the average expenditures of other school types in the same labor market. That is, what percent above or below traditional public schools and/or private schools in the same labor market is charter spending (one can use Core Based Statistical Areas as a proxy for labor markets)? Notably, the tricky part here is figuring out the relevant spending components, such as equating traditional public school facilities, special education, and transportation costs with the cost responsibilities of charters. Alternatively, one can use something like the NCES Education Comparable Wage Index (though dated now) to adjust spending figures across labor markets.
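To make this concrete, here’s a minimal sketch of the kind of within-labor-market comparison I’m describing, written in Python with hypothetical school names, columns, and dollar figures (none of these are actual CMO or district numbers):

```python
import pandas as pd

# Hypothetical school-level data: sector, Core Based Statistical Area (CBSA)
# as a labor-market proxy, and per-pupil spending.
schools = pd.DataFrame({
    "school":   ["Charter A", "Charter B", "TPS 1", "TPS 2", "TPS 3"],
    "sector":   ["charter", "charter", "tps", "tps", "tps"],
    "cbsa":     ["35620", "35620", "35620", "35620", "35620"],
    "pp_spend": [16500, 14200, 15800, 17200, 16100],
})

# Average traditional public school (TPS) spending within each labor market
tps_avg = (schools[schools["sector"] == "tps"]
           .groupby("cbsa", as_index=False)["pp_spend"].mean()
           .rename(columns={"pp_spend": "tps_avg_pp_spend"}))

# Express each charter's spending as a percent of the TPS average in its own
# labor market, rather than comparing raw dollars across different markets.
charters = schools[schools["sector"] == "charter"].merge(tps_avg, on="cbsa")
charters["pct_of_tps_avg"] = 100 * charters["pp_spend"] / charters["tps_avg_pp_spend"]

print(charters[["school", "pp_spend", "tps_avg_pp_spend", "pct_of_tps_avg"]])
```

A fuller version would first net out the spending categories (facilities, transportation, special education) for which charters and districts are not equally responsible, or swap in a wage-index adjustment like the Comparable Wage Index where within-market comparison schools are sparse.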

In their second figure, Mathematica compares expenditures reported in IRS filings to public subsidy figures. But rather than bothering to dig up the public subsidy figures themselves, Mathematica relies on figures from a dated and highly suspect report, the Public Impact/Ball State report on charter school finances. I’ve written previously about the many problems with the data in this report. There’s really no reason Mathematica should have been relying on secondary reported data like these when it’s pretty damn easy to go to the primary source. Further, this graph doesn’t really tell us anything either.

Figure 2

Source: http://www.mathematica-mpr.com/publications/PDFs/Education/cmo_final.pdf

What do we really need and want to know? We need to know:

  1. Does it cost more, and if so how much more, to do the kinds of things the report identifies as practices of successful charter schools, such as running marginally smaller schools with smaller class sizes?
  2. What kind of wages are being paid to recruit and retain teachers who are working the extra hours and delivering the supposedly more successful models?
  3. How does the aggregate of these spending practices stack up against other types of schools in given local/regional economic contexts?

The financial analyses provided by Mathematica may as well not even be there. Actually, it would be a much better report if those graphs were just dropped, because they are meaningless. They are also simply bad analyses, analyses certainly well below the technical quality of research commonly produced by Mathematica.

Here are a few examples of what I’ve been finding on these questions, from recent blog posts, but part of a larger exploration of what we can learn from extant data on charter school resource allocation.

First, here’s some data on KIPP school expenditures compared in context in NYC. That is, comparing the relevant school-site expenditures (with a footnote on the odd additional spending embedded in KIPP Academy financial reports) within NYC. Here, it would appear that KIPP schools in certain zip codes in NYC may be significantly outspending traditional public schools serving the same grade ranges in the same zip codes (perhaps more consistently so if we spread the KIPP Academy spending across the network, as I discuss in my report at the end of this post). The next step here is to compare the underlying salary structures, class sizes, and other factors which explain (or are a result of) these spending differences. I’m not there yet with this analysis. More to come.

Figure 3

Second, here’s how KIPP (and other charter) school spending per pupil compares in Houston, Texas, based only on the school-site spending reports from the Texas Education Agency, and not necessarily including additional CMO-level allocations (that analysis is in the works). Clearly, there’s some screwy stuff to be sorted out here as well. My point with these figures is merely to show how one can put spending in context and use more relevant numbers. Again, there are similar next steps to explore.

Figure 4

From a related recent post, here again are the class sizes and salary structure of Amistad Academy, a successful Achievement First school in New Haven, Connecticut. If there are two things that really drive the cost of operating any particular educational model, it’s a) the quantity of staff needed to deliver the model, as can be measured in terms of class sizes (number of teachers), and b) the price that must be paid for each staff member in order to recruit and retain the kind of staff you want delivering that model.

Figure 5

Figure 6

These figures show that two strategies employed by Amistad are a) lower class sizes in the early grades and b) much higher teacher salaries across the entire range of experience held by Amistad teachers, but especially in the early to mid-career stages. These are potentially expensive strategies to replicate and/or maintain. But they may just be good strategies… and may actually be the most cost-effective approach. We’ll never know if we don’t actually take the time to study it. We may also find that these approaches become more expensive as we attempt to scale them up and put greater strain on local teacher labor markets (supply).
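To illustrate why these two levers dominate the cost of a model, here’s a back-of-the-envelope sketch in Python. The class sizes, salaries, and benefits rate are hypothetical round numbers for illustration only, not Amistad’s actual figures:

```python
def per_pupil_staffing_cost(class_size, avg_salary, benefits_rate=0.30):
    """Classroom staffing cost per pupil: (salary + benefits) for one teacher,
    spread across the number of students that teacher serves."""
    return avg_salary * (1 + benefits_rate) / class_size

# Hypothetical host-district configuration vs. a smaller-class,
# higher-salary configuration like the one described above.
district_style = per_pupil_staffing_cost(class_size=25, avg_salary=60000)
charter_style = per_pupil_staffing_cost(class_size=20, avg_salary=72000)

print(f"Larger classes, lower pay:   ${district_style:,.0f} per pupil")  # $3,120
print(f"Smaller classes, higher pay: ${charter_style:,.0f} per pupil")   # $4,680
print(f"Cost premium: {charter_style / district_style - 1:.0%}")         # 50%
```

The point is simply that the premium compounds: paying 20% more per teacher while serving 20% fewer students per teacher raises per-pupil staffing costs by roughly 50%, which is exactly why the sustainability and scalability questions matter.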

Notably, I’ve been finding similar approaches to teacher compensation among the more recognized New Jersey charter schools. I have shown previously, and here it is again, that schools like TEAM Academy seem to be shooting for higher salaries than neighboring/host public districts. So too are schools like North Star Academy. But others (often less stellar [pun intended] charters) are not.

Figure 7

 

Now’s the time to get more serious about digging into the resource issues and providing useful information on the underlying cost structure of the educational models and strategies being used in successful charter networks, individual schools or anywhere for that matter.

Mathematica is far from alone in giving short shrift to these questions. Roland Fryer’s Houston Apollo 20 study provided only marginally less flimsy analysis of the costs associated with the “no excuses” model, and made unsupported assertions regarding the relationship of Apollo 20 costs to “no excuses” charter school costs (see http://www.houstonisd.org/HISDConnectEnglish/Images/Apollo/ApolloResults.pdf; the full paper provides only marginally more information regarding costs).

So, why do I care so much about this… and more importantly… why should anyone else? Well, as I explained in a previous post, there’s a lot of mythology out there about education policy solutions, like no excuses charter schools, that can supposedly do more with less and get better outcomes for less money. Most of the reports that pitch this angle simply never add up the money. And they fail to do any analysis of what it might cost to implement similar strategies at greater scale or in different contexts. Is it perhaps possible that most improvements will simply come at greater overall cost?

Here’s the other part that’s been bugging me. It has often been asserted that the way to fix public schools is to A) replace them with more charter schools and B) stop bothering with small class sizes and get rid of additional pay for things like increased experience.

As far as I can tell from the available data, Option A and Option B above may just involve diametrically opposed strategies. As far as I’ve seen in many large data sets, charter schools that we generally acknowledge as “successful” are trying to pay teachers well, and their teacher salaries are generally highly predictable as a function of experience (based on regression models of individual teacher data). That said, the shape of their salary schedules is often different from their hosts and surroundings, different in a way I find quite logical. Further, charters with additional resources seem to be leveraging those resources at least partly to keep class sizes down (certainly not in the 35-to-40-student range of many NYC or CA public schools). Total staffing costs may still be lower, mainly because charter teachers and other staff still remain “newer.” But sustaining current wage premiums may be tricky as charter teachers stay on for longer periods.
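For readers curious what “highly predictable as a function of experience” means in practice, here’s a minimal sketch of the kind of model I have in mind, using made-up teacher-level data rather than the actual personnel files behind my analyses:

```python
import numpy as np

# Hypothetical teacher-level data: years of experience and annual salary
experience = np.array([0, 1, 2, 3, 5, 7, 10, 12, 15, 20], dtype=float)
salary = np.array([48000, 50500, 53000, 55500, 59500,
                   63000, 67500, 70000, 73000, 76000], dtype=float)

# Regress log salary on a quadratic in experience, a common specification
# that allows returns to experience to flatten at higher experience levels.
X = np.column_stack([np.ones_like(experience), experience, experience ** 2])
y = np.log(salary)
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)

predicted = X @ coefs
r_squared = 1 - ((y - predicted) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"Coefficients (intercept, exp, exp^2): {coefs.round(4)}")
print(f"R-squared: {r_squared:.3f}")  # near 1.0 means salary is highly
                                      # predictable from experience alone
```

A high R-squared from a model like this, estimated on a network’s own teachers, is what I mean by salaries that are quite traditional in structure, even when the shape (steeper early, flatter late) differs from the host district schedule.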

Again, in my preliminary analyses, I’m seeing some emphasis in some cases on the early grades, which makes sense. What I’m not seeing, at least among high-flying charters, is dramatically lower spending with very large class sizes and flat (with respect to experience) but high teacher salaries (maximized within the budget constraint). That is, I’m not seeing a complete disregard for class size reduction in order to achieve the wage premium. I’m seeing both/and, not either/or (and both/and is more expensive than either/or).

So, on the one hand, pundits are arguing to expand “successful” charter schools which are pursuing rather traditional resource allocation strategies, while arguing that public school resource allocation strategies are fatally flawed and entirely inefficient. They only get away with this argument because they fail to explore in any depth how successful charter schools allocate resources and the cost implications of those strategies. It’s time to start taking this next step!

See also:

From: Baker, B.D. & Ferris, R. (2011). Adding Up the Spending: Fiscal Disparities and Philanthropy among New York City Charter Schools. Boulder, CO: National Education Policy Center. Retrieved [date] from http://nepc.colorado.edu/publication/NYC-charter-disparities.

7 thoughts on “MPR’s Unfortunate Sidestepping around Money Questions in the Charter CMO Report”

  1. Another question I had about the report is why the district average is used as the reference point. If a charter is located in the poorest section of town in order to enroll “under served populations,” then isn’t the proper reference point the other schools in that zip code or surrounding zip codes? Such information is relatively easy to get and I am sure some low level staffer at Mathematica could have dug it up for all schools in a week or so.

    The reference group is critical in these studies because the public is made to believe that the charters are taking regular neighborhood kids and doing wonderful things with them. Using the district as a reference group makes it easy to have more poor kids, since the charters are in a poor neighborhood. But, if you read the report carefully, you will notice that the students enrolling in the charters have scores at or above the average FOR THE DISTRICT. In most places, this means the charters are enrolling kids with greater initial performance than the surrounding schools. The public then compares the charters to the surrounding schools (at the behest of the charter PR people) and thinks they do great. But, as you have pointed out, the APPROPRIATE reference group is the schools in the same or surrounding zip codes.

    We would get some dramatically different results with respect to “creaming” if Mathematica used a more appropriate reference group.

    Their treatment of disappearance seems problematic as well unless their growth models account for peer effects.

      1. Well, Mathematica clearly does not read Schoolfinance101. Or they chose to use the comparison group most favorable to charters. Ignorance, or bias for funders? Maybe there is a good explanation for using district averages as the comparison other than it’s a boatload less work, but I don’t see one.

      2. Certainly have to compare on grade level. Can make a clearer case in a very large city like NYC that comparisons should not be districtwide. But then you have to have a clear-cut rationale for the geographic space of comparison. The best option in my view is a weighted average of sending attendance areas… or some centroid-based, population-weighted comparison (for the resident population of the sending area). I find districtwide/district-average comparisons generally unhelpful, but also find it difficult to generate the necessary information for a justifiable alternative (a more narrow geographic focus).
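        One way to operationalize that weighted-average idea, sketched in Python with entirely hypothetical sending areas, enrollment shares, and spending figures:

```python
import pandas as pd

# Hypothetical sending attendance areas for one charter school: the share of
# the charter's students drawn from each area, and the average per-pupil
# spending of the traditional public schools serving that area.
sending = pd.DataFrame({
    "attendance_area": ["Area 1", "Area 2", "Area 3"],
    "enrollment_share": [0.50, 0.30, 0.20],
    "tps_pp_spend": [17500, 16200, 18900],
})

# Enrollment-weighted comparison benchmark, in place of a districtwide average
benchmark = (sending["enrollment_share"] * sending["tps_pp_spend"]).sum()
print(f"Weighted comparison spending: ${benchmark:,.0f} per pupil")
```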

  2. Look into the numbers behind the numbers, too. The chart claims that KIPP schools spend a higher percentage on students with disabilities, but on what type of services? Many of their charters do not take moderate to severely disabled students and tend to “fudge” numbers on how they provide what “type” of service.

    I’ve been collecting data for years on LAUSD charters regarding their lack of enrollment of students with disabilities and the lack of real services provided. If you dig deeper into the data, charters claim to provide such disability services as “Resource” and (in the high schools) “college readiness.” These “services” are not true IEP Transition services, but cheaper, general college-counseling-type stuff that they also use for all students. It is not targeted, because they are not taking those students who need intensive Transition Plans.

    As I’ve brought these shams to the attention of the LAUSD School Board over the years to no avail (many members bought and paid for by Eli Broad and his charter foundation interests), the charters have become more creative in playing with enrollment/service numbers to appear as if they were really starting to help the more moderate to severely disabled population. It’s not true. In fact, some service descriptions claimed by elementary and middle schools are specifically for high school Transition and should not even be recorded as such. They try to beef up their service numbers with inappropriate categories that they’re not utilizing as they should with the students who need them.

    We don’t even know if the teachers are really qualified to teach moderate to severely disabled students. I tried recently to get the credential of a teacher for a new charter middle school this summer, as a friend told me they were “really interested” in enrolling her son with high-functioning autism. In CA, a “level 2 moderate to severe special education credential” is needed. The state credentialing site listed 17 people with the teacher’s first and last name; I needed the middle initial. I called and asked the school site and was told it was “confidential” information, even though the teacher’s name and photo showed on their website. I have the right to know under NCLB, but was stonewalled.

    LAUSD has a department specifically devoted to keeping current on all regular school teacher credentials. I went to the main office asking for help. “Oh, we don’t keep track of charters, you need to go to their office.” So I did, and ran into the department head, who claimed that “each school is responsible for those records.” In other words, I was back where I started, because the LAUSD charter office has NO database and no idea what type of credentials (if any) a teacher has on any one of our 180 charter campuses. How can a family make an informed choice about a school when we don’t even know if a teacher is qualified?

    As it turned out, this particular teacher did not have the proper credential, and the school should not have been soliciting to enroll this student. We can assume that they were going to play the other game that charters play: enroll a student with disabilities at the beginning of the year for the block grant funding received with his/her ADA, then, come norming day, remove the student by “counseling out” (“we’re sorry, but we can’t provide the services your child needs here” is the usual ploy). The funds, unfortunately, do NOT follow the child, and the charter keeps the ADA for a student they’ve kicked to the curb. The student is then placed back into a regular public school (or nonpublic school) at a deficit to the district, since the district doesn’t have the student’s ADA. Borrowing from the general fund, yet again, for a student that a charter refused to take responsibility for.

    There is no compliance and no oversight, so we can guess that data is fudged, altered, or never checked. Charters are sucking scarce public education funds from regular schools, which are being overburdened with the cost of these moderate to severely disabled students as they’re sent back. Taking the “easy” kids: KIPP schools are top-heavy with SLI (Speech/Language Impairment) and SLD (Specific Learning Disability). There are either very few “token” students, or none at all, who are Blind, Deaf or Hard of Hearing, or who have Cerebral Palsy, Moderate to Severe Autism, or Intellectual Disability (formerly Mentally Retarded), as they require more costly and intensive services.

    Numbers mean nothing unless you know how to look at them. Due to public pressure, charters have lately become very creative at trying to appear compliant on disability enrollment, but in reality they are still practicing exclusionary and discriminatory enrollment against these students. Public schools take all children; charters do not.

    1. You raise very important points about the parsing of special education spending data in relation to specific populations (within disability classifications). Relevant, accessible data of the type needed to do the analyses you discuss are hard to come by. Indeed, I suspect that many charters which do show some level of disability concentration are actually showing elevated mild LD/BD cases and have very few if any more severe disabilities, which obviously come at higher cost. In some states like Pennsylvania the funding formula has incentivized such behavior by paying charter schools for children with any disability on the basis of the district average expenditure on serving children with disabilities. In PA, this has resulted in the establishment of some charters which focus on serving children with mild disabilities. This generates excess funding for those charters – because the district average expense includes more severe disabilities – and adversely affects the sending district (because that district is then left with a) less money and b) a more concentrated severe disability population). I’m not sure if they’ve corrected this issue in the past year or so.
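      To see how that incentive works, a simple hypothetical (the enrollment counts and dollar figures below are made up for illustration, not actual Pennsylvania amounts):

```python
# Hypothetical district special education population and per-pupil service costs
n_mild, cost_mild = 800, 8000        # mild LD/BD: lower-cost services
n_severe, cost_severe = 200, 30000   # severe disabilities: higher-cost services

district_avg = (n_mild * cost_mild + n_severe * cost_severe) / (n_mild + n_severe)
print(f"District average special ed expense: ${district_avg:,.0f}")  # $12,400

# A charter enrolling only mild-disability students is paid the district
# average for each of them, but incurs only the mild-service cost.
surplus = district_avg - cost_mild
print(f"Charter surplus per mild-disability pupil: ${surplus:,.0f}")  # $4,400
```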

      Clearly, funding formulas must be sensitive to the wide variety of costs associated with degrees of severity, and accountability and reporting – both fiscal and student reporting – must be sufficiently detailed to tease out these critical questions pertaining to equity/adequacy. The same is true for teacher credential/qualifications data.

      Policymakers and pundits seem to like to talk a big game about transparency and public accountability with respect to charters, but we’re certainly not seeing it in terms of publicly accessible data/information/documentation. And that only seems to be getting worse.

      On a related note, charter operators and advocates love to invoke claims of being “public schools” like any other, with all the duties and responsibilities of public schools. Yet when charter school operators, especially private managers and boards of directors, are legally challenged to provide open records access or open meetings access, or to comply with federal statutes pertaining to public institutions (state actors), the most frequent legal strategy seems to be for the charters, their boards, and their management companies to claim that they are not, in fact, state actors and thus not obligated to comply as such. Charters have lost many of these legal challenges in their own state courts, but have won a few big ones, including in federal court (9th Circuit, Caviness v. Horizon Learning). Preston Green of Penn State Law has an interesting forthcoming article on this topic and the implications for student rights.

      I suspect we’ll see increasing attempts among charters to withdraw further from public accountability/transparency, and I’m not sure I see states stepping in to tighten legislation in this regard. As such, I have little expectation that quality of or access to data that might be used for enforcing accountability will improve in the foreseeable future.

      1. I’ve had to put in very specific FERPA requests each year after the December numbers are reported (and after Norming Day figures are finalized) to the State Dept of Ed. I ask for these things:
        1. List of all charters, including California ID codes (CDS), address & contact info
        2. total enrollment of all charters
        3. enrollment by disability type

        Upon receiving that information I then must wait for the LAUSD presentation of the Special Education Local Plan where each school district must submit, school by school, a list of the type of services provided at each school site. Having the California ID code helps with this data because, many times, charters change their names to skirt past abuses. The code stays the same. See link here for California requirements: http://www.cde.ca.gov/sp/se/ds/lclpln.asp

        I’ve also discovered that charter operators with more than one school campus sometimes “share” a CDS code, which is technically illegal. One site, one address, one code is how all regular public schools are required to function, but somehow charters with different addresses and locations manage to share one code. This produces false data, because they could have one location with NO students with disabilities at all and another campus that serves all of their students with disabilities. By using one code for two sites (in some cases 3 or 4 in the past), a charter appears to be compliant and enrolling students with disabilities when in fact it might have all of them at one campus and none at the other.

        I’ve complained about this for several years as well but our charter-friendly board ignores the outright fraud perpetrated by these schools because they’ve had their campaigns financed by outside charter interests.

        Families should fully investigate how much special-interest charter funding is supporting decisions about public schools. Until Jerry Brown was elected Governor and cleaned house with the State Board of Education, most of those appointments were influenced by GOP and charter special interests. A majority of the former board members were involved with charters in some form or another, whether by running a charter organization or sitting on the board of one. None would recuse themselves from decisions over the past 8 years when they should have, and our laws have been created and written under the influence of charter foundation funding instead of by true academics and educators.

        I’m sure that one of our most helpful state educational data searches was dismantled by this board, as we previously were able to search for “enrollment by disability type” by individual school. Now we can only access that information by “district.” A family can’t determine whether a charter discriminates if they can’t view the data. When I questioned the dismantling of this particular item several years ago, I was told that the data submitted by the schools was “questionable” and “faulty.” Why then would they post all other “student” data submitted by schools (dropout/English Learners/enrollment/graduation)? Only the special education data has been eliminated by individual school. See link here: http://dq.cde.ca.gov/dataquest/

        Charter organizations have been so active and involved with the legislative end of things that families of students with disabilities can’t compete. The playing field has not been level for many years. The damage has been done and our leadership has no idea how difficult it is (and will only get worse) for our moderate to severely disabled students.
