Segregating Suburbia: A Princeton Story

Others around me have for some time been raising concerns about the emergence of boutique, suburban charter schools. Until now, I've largely brushed off those concerns, in part because I've questioned just how much sorting a charter school can actually achieve in a relatively homogeneous suburban area.

Suburbs have their own unique portfolio of schools. In any leafy suburb near a major metropolitan area, one might find a very fine local public school district; in certain regions of the country, a private Catholic school; and in many areas, an elite private independent day school or two – oft named "day school" or "country day school." These portfolios have been in existence, in some cases, for centuries. At some future point, I may discuss more extensively the public-private balance and the role that elite (and less elite) private schools play when embedded in otherwise affluent communities that also have relatively elite public school systems.

Rarely would one expect to find the charter school movement trying to infiltrate this environment, adding yet another element to the portfolio. And if and when it does, what niche do these schools try to fill? On the one hand, one might try to establish a charter that handles the "difficult" cases from the local school system – those who might not fit particularly well in the public system and who lack access to appropriate private schooling.

But I'm reminded… though I can't find the link right now… of a Palo Alto, CA charter school that had basically established itself as the equivalent of a publicly subsidized elite private school. [Found! By a commenter below] It's a rather clever financial model. If elite private school tuition is running at about $30k per year per child… and the per pupil cost of a quality private education program is about $32 to $35k… one could either pay that price, or gather a group of close friends and apply for a charter, under which each child might receive an allotment of $10 to $15k from the local district, with parents quietly agreeing to chip in the other $15k to achieve schooling of similar quality to the private option – at half the out-of-pocket price.
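Just to make the arithmetic explicit, here's a back-of-the-envelope sketch. The dollar figures are the rough assumptions from the paragraph above, not actual tuition or budget data.

```python
# Rough, assumed figures from the discussion above -- illustration only.
private_tuition    = 30_000   # elite private day school tuition per child
program_cost       = 32_500   # approximate per-pupil cost of such a program
district_allotment = 12_500   # midpoint of the assumed $10-15k charter pass-through
parent_chip_in     = 15_000   # quiet "voluntary" contribution per child

print(f"Out of pocket, private school:    ${private_tuition:,}")
print(f"Out of pocket, charter + chip-in: ${parent_chip_in:,}")
print(f"Shifted onto the district:        ${district_allotment:,} "
      f"({district_allotment / program_cost:.0%} of program cost)")
```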

Of course, there are many additional costs of getting that ball rolling, including finding and leasing space for the start-up years and running capital fundraising campaigns for future years. By establishing a charter school in this way, these parents couldn't officially exclude others from their school or compel private contributions within their "club"… but they sure could make any free rider, or other drain on the resources of their schooling model, feel uncomfortable enough to leave.

On the one hand, it might not be considered that problematic for a group of parents with "average children" in the local district to require (via establishing a charter school) that the district subsidize their quasi-private endeavor. I would argue that it becomes more problematic when an above-average-income group in the community, with relatively low-need children (by usual classifications), obligates the local public school district to subsidize their segregationist preferences – that is, when they ask those less well off than they are to subsidize their quasi-private school alternative.

But just how much sorting can a suburban charter school achieve anyway? And can a suburban charter school establish itself as a quasi-elite private school in a market where there are already several private schooling options? That is, would parents of advantaged children actually seek to establish a school that taxes those less well off than they are to subsidize their charter school, instead of paying full tuition at local private schools? Evidence from Princeton, New Jersey suggests that the answer to this question may in fact be yes!

Let’s take a look.

Here’s the lay of the land… from the broad viewpoint… with district housing values in the brown shading.

Princeton is the dark area in the middle of the picture, with very high average housing values. Princeton also is home to numerous… and I mean numerous… private independent day and boarding schools, many of which (along a single road) serve a large portion of school-aged children from Princeton and surrounding communities, and many of which have been around for a very long time. Princeton is also known for having an exceptionally strong local public district. To the south and west is Trenton, with high-poverty schools, including high-poverty charter schools (yellow stars). Notably, in Princeton the lowest-poverty "public" school is Princeton Charter School. Princeton Public Schools each have much higher rates of children qualified for free or reduced-price lunch.

Here’s a zoom in on Princeton:

While many of the triangles (private schools) in other parts of the state are preschools, etc., many in Princeton are actually relatively large, elite private day and boarding schools. Rather amazingly, Princeton Charter School appears – at least by its exclusion of low-income children – to be positioning itself as a publicly subsidized alternative to the elite private schools, and not as a more broadly accessible charter alternative.

Here’s a breakout of the details on the Princeton Charter population compared to the district:


And here’s the composition of the special education populations:


That is, PCS serves only the mildest, lowest-cost children with disabilities.

Put bluntly, these figures show that the parent population of Princeton Charter is obligating the parents of much less advantaged children, including parents of children with special education needs, to subsidize their preference for a school more like the private day schools along Great Road.

While I’m still not entirely sure what to make of this… it does concern me.

It also ought to raise questions for leaders of private school alternatives in these communities. On balance, I've never seen the charter school movement as much of a competitive threat to private independent day schools, as charters have often been primarily urban, serving minority populations and employing "no excuses" strategies that most parents in leafy suburbs would not find palatable for their own children.

Urban charter schools have arguably taken their toll on urban Catholic school enrollments, but that's another story. To the extent that state charter policies permit the type of school establishment and segregation going on in Princeton, though, more and more parents may find ways to organize quasi-private elite schools to serve their needs – effectively seeking taxpayer charity to support their country club preferences. This may indeed pose a threat to less financially well-endowed private schools.

In a twisted sort of way, it’s rather like asking your local public parks department to pay for your membership to the local private country club – thus reducing the quality of services to others who really don’t have access to the country club (even if it proclaims it’s open to all comers).

Much more to ponder here… but the numbers on Princeton Charter School certainly raise some serious red flags.

Note: In New Jersey and elsewhere, there are numerous other taxpayer subsidies that support private schooling, ranging from property tax exemptions and exemptions on charitable gifts, to textbook subsidies (loans from local districts) and transportation reimbursements. So, to an extent, all private schools and privately schooled children are receiving some level of subsidy at taxpayer expense. But, that level increases dramatically if/when the local public district is also required to hand over the full annual operating expense per child.

Friday Story Time: Deconstructing the Cycle of Reformy Awesomeness

Once upon a time, there was this totally awesome charter school in Newark, NJ. It was a charter school so awesome that its leaders and founders and all of their close friends decided they must share their miracle with the world in books on the reasons for their awesomeness, including being driven by data and teaching like a champion!

The school’s break-the-mold – beating the odds – disruptively innovative awesomeness was particularly important during this critical time of utter collapse of the American education system which had undoubtedly been caused by corrupt self-interested public school teachers (& their unions) who had been uniformly ill-trained by antiquated colleges and universities that themselves were corrupt and self-interested and generally in the business of selling worthless graduate degrees.

In fact, the undisputed awesomeness of this North Star Academy could, in theory, provide the foundation for a whole new approach to turning around the dreadful state of American education.

And thus came the Cycle of Reformy Awesomeness, which looks something like this:


Built on the foundation of awesomeness established by THE North Star Academy, and since teachers are the undisputed most important in-school factor determining student outcomes, the awesomeness of North Star could be attributed primarily to the quality of the teachers and the innovative practices they used in their data-driven classrooms!

Thus, by extension, we must establish new institutions of teacher preparation whereby these truly exceptional teachers (of 3 to 5 years' experience) are not only provided the opportunity to share their expertise with their own colleagues on a personal, collaborative level, but are made the instructors in a new graduate school of education (regardless of academic qualifications) – and are actually permitted to grant graduate degrees in education to their own colleagues.

This new approach of letting teachers in a school grant graduate degrees to their own work colleagues (and those in other network schools) could lead to rapid diffusion of excellence and would most certainly negate the corrupt perverse incentives pervasive throughout the current, adult oriented self-interested American higher education system! Disruptive innovation indeed!

And so their founders and disciples took their show on the road. They took their show to state departments of education to urge fast-tracked uncritical promotion of their cycle of awesomeness. They gained leverage on local boards of education in nearby school districts to promote diffusion of their awesomeness. And they set out to other state departments of education to share their insights on how to achieve awesomeness with drive by data… excuse me… being driven by data!

And driven by data they were… for example… absolutely all of the kids in their school passed that test in high school.

And there was much rejoicing.

And that one too:


 

And there was much rejoicing.

And they were only getting better, and better and better:


And there was much rejoicing.

And better:

The more they looked at their own data – well, really only one measure of their data – the more they patted themselves on the back, congratulated their own reformy awesomeness and shared it with the world. And the state!


And there was much rejoicing.

Yup… 100% graduation rate… which is totally unheard of for a high poverty, urban high school in dreadful Newark, NJ! [or at least for a school that happens to be located in the high poverty city of Newark].

A true miracle it was… is… and shall be. One that must be proliferated and shared widely.

But alas, the more they shared, the more they touted their awesomeness, the more it started to become apparent that all might not be quite so rosy in North Star land.

As it turned out, those kids in North Star really didn’t look so much like those others they were apparently so handily blowing out on state tests….


And there was complete freakin’ silence!

Somehow, this rapidly growing miracle school was managing to serve far fewer poor children than others (except a few other charter schools also claiming miracle status) around them.

And, they were serving hardly any children with disabilities and few or none with more severe disabilities.


And again there was complete freakin’ silence!

And if that was the case, was it really reasonable to attribute their awesomeness to the awesomeness of their own teachers – their innovative strategies… and the nuanced, deep understanding of being driven by data?

Actually, it is perhaps most befuddling, if not outright damning, that such non-trivial data could be so persistently ignored in a school that is so driven by data.

And there was complete freakin’ silence!

But alas, these were mere minor signals that all might not be as awesome as originally assumed.

It also turned out that of all the 5th graders who entered the halls of awesomeness, only about half ever made it to senior year – year after year after year after year… after year.


And for black boys in the school, far fewer than that:


And there was complete freakin’ silence!

And in any given year, children were being suspended from the school at an alarming rate.


Again… raising the question of how a school driven by data could rely so heavily on a single metric – test scores and pass rates derived from them – to proclaim their awesomeness, when in fact, things were looking somewhat less than awesome.

Could a school really be awesome if only the fewer than half who remain (or the 20% of black boys who remain) pass the test? Might it matter at least as much what happened to the other half who left?

Was it perhaps possible that the "no excuses" strategies endorsed as best practices, both in their school and in their training of each other, really weren't working so well… and weren't the strategies of true teaching champions… but rather created a hostile and oppressive environment driving their high attrition rate? Well… one really can't say, one way or the other…

Regardless of the cause, what possibly could such a school share with those traditional, supposedly failing public schools that lack a similar ability to send the majority of their children packing? Further, what possibly could the rather novice teachers in this school, charged with granting their own co-workers graduate credentials, share with experienced researchers and university faculty training the larger public school teacher workforce?

Alas the miracle was (is) crumbling.

But that miracle wasn’t just any ol’ miracle. Rather, it was the entire foundation for the reformy cycle of awesomeness! And without that foundation, the entire cycle comes crumbling down.


No miraculously awesome charter school [in fact, one might argue that any school with such attrition is an unqualified failure].

Thus no valid claim of miraculous teachers and teaching.

Thus no new secret sauce for teacher preparation.

All perpetrated with deceptive and in some cases downright fraudulent (100% graduation rate?) presentation of data.

And thus the search continues… for the next miracle… and the next great disruptive innovation to base on that miracle… whatever… wherever it may be.


Newark Charter Update: A few new graphs & musings

It's been a while since I've written anything about New Jersey charter schools, so I figured I'd throw a few new graphs and tables out there. In the not too distant past, I've explained:

  1. That Newark charter schools in particular persist in having an overall cream-skimming effect in Newark, creating a demographic advantage for themselves, ultimately to the detriment of the district.
  2. That while the NJ CREDO charter school effect study showed positive effects of charter enrollment on student outcomes specifically (and only) in Newark, the unique features of student sorting (read skimming) in Newark make it difficult to draw any reasonable conclusions about the effectiveness of actual practices of Newark Charters. Note that in  my most recent post, I re-explain the problem with asserting school effects, when a sizable component of the school effect may be a function of the children (peer group) served.
  3. In many earlier posts, I evaluated the extent to which average performance levels of Newark (and other NJ) charter schools were higher or lower than those of demographically similar schools, finding that charters were/are pretty much scattered.
  4. And I’ve raised questions about other data – including attrition rates – for some high flying NJ charters.

As an update, since past posts have only looked at NJ charter performance in terms of “levels” (shares of kids proficient, or not), let’s take a look at how Newark district and charter schools compare on the state’s new school level growth percentile measures. In theory, these measures should provide us a more reasonable measure of how much the schools contribute to year over year changes in student test scores. Of course, remember, that school effect is conflated with peer effect and with every other attribute of the yearly in and out of school lives of the kids attending each school.

And bear in mind that I’ve critiqued in great detail previously that New Jersey’s growth percentile scores appear to do a particularly crappy job at removing biases associated with student demographics, or with average performance levels of kids in a cohort.  To summarize prior findings:

  1. school average growth percentiles tend to be lower in schools with higher average rates of proficiency to begin with.
  2. school average growth percentiles tend to be lower in schools with higher shares of low income children.
  3. school average growth percentiles tend to be lower in schools with more non-proficient scoring special education students.

And each of these relationships was disturbingly strong. So, any analysis of the growth percentile data must be taken with a grain of salt.

So, pretending for a moment that the growth percentile data aren't complete garbage, let's take a look at the growth percentile data for Newark charter schools, alongside district schools.

Let’s start with a statewide look at charter school growth percentiles compared to district schools. In this figure, I’ve graphed the 7th grade ELA growth percentiles with respect to average school level proficiency rates, since the growth percentile data seem so heavily biased in this regard. As such, it seems most reasonable to try to account for this bias by comparing schools against those with the most similar current average proficiency rates.

Figure 1. Statewide Language Arts Growth with Respect to Average Proficiency (Grade 7)


Now, if we buy these growth percentiles as reasonable, then one conclusion might be that Robert Treat Academy is one of the worst schools in the state – if not the worst – at least in terms of its ability to contribute to test score gains. By contrast, Discovery Charter School totally rocks.
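For readers who want to reproduce this sort of comparison, here's a minimal sketch of the trendline-and-residual idea behind Figure 1. The file name and column names are hypothetical stand-ins, not actual NJDOE field names.

```python
# A rough sketch: fit a simple trendline of median SGP on proficiency rate,
# then measure how far each school sits above or below schools with similar
# current proficiency. File and column names are hypothetical.
import numpy as np
import pandas as pd

sgp = pd.read_csv("nj_gr7_ela_sgp.csv")  # one row per school (hypothetical)

slope, intercept = np.polyfit(sgp["proficiency_rate"], sgp["median_sgp"], 1)
sgp["above_trend"] = sgp["median_sgp"] - (intercept + slope * sgp["proficiency_rate"])

# Positive values: more growth than schools at similar proficiency levels.
print(sgp.sort_values("above_trend", ascending=False)
         [["school_name", "charter", "above_trend"]].head(10))
```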

Other charters to be explored in greater depth below, like TEAM Academy in Newark, fall in the "somewhat better than average" category (marginally above the trendline), and frequently cited standouts like North Star Academy fall somewhat higher (though still in the cloud, statewide).

So, let’s focus on Newark in particular.

Figure 2. Newark Language Arts Growth with Respect to Average Proficiency (Grade 5)


Figure 3. Newark Language Arts Growth with Respect to Average Proficiency (Grade 6)


Figure 4. Newark Language Arts Growth with Respect to Average Proficiency (Grade 7)


Figure 5. Newark Language Arts Growth with Respect to Average Proficiency (Grade 8)

In my earlier posts, it was typically schools like Treat, North Star, Gray and Greater Newark that rose to the top, with TEAM posting more average results – but all of these results were heavily mediated by demographic differences, with Treat and North Star hardly resembling district schools at all, and TEAM coming closer but still holding a demographic edge over district schools.

In these updated graphs, using the growth measures, one must begin to question the Robert Treat miracle especially. Yeah… they start high… and stay high on proficiency… but they appear to contribute little to achievement gains. Again, that is, if these measures really have any value at all. Gray is also hardly a standout… or actually it is a standout… but not in a good way.

TEAM continues to post solidly above average, but still in the non-superman (mere mortal) mix of district & charter schooling in Newark.

Remember, school gains are a function of all that goes on in the lives of kids assigned to each school, including in school and out of school stuff, including peer effect.

Let’s focus in on the contrast between TEAM and North Star for a bit. These are the two big ones in Newark now, and they’ve evolved over time toward providing K-12 programs. Here’s the most recent demographic data comparing income status and special education populations by classification, for NPS, TEAM and North Star.

Figure 6. Demographic data for NPS, TEAM and North Star (2012-13 enrollments & 2011-12 special education)


North Star especially continues to serve far fewer of the lowest income children. And, North Star continues to serve very few children with disabilities, and next to none with more severe disabilities. Similarly, in TEAM, most children with disabilities have only mild specific learning disabilities or speech/language impairment.

But this next piece remains the most interesting to me. I’ve not revisited attrition rates for some time, and now these schools are bigger and have a longer track record, so it’s hard to argue that the patterns we see over several cohorts, including the most recent several years, for schools serving over 1,000 children, are anomalies.  At this point, these data are becoming sufficiently stable and predictable to represent patterns of practice.

The next two tables map the changes in cohort size over time for cohorts of students attending TEAM and North Star. The major caveat of these tables is that if there are 80 5th graders one year and 80 6th graders the next, we don't necessarily know that they are the same 80 kids. Five may have left and been replaced by five new students. But taking on new students does pose some "risk" in terms of expected test scores, so some charters engage in less "backfilling" than others, and fewer still backfill enrollments in the upper grades.

Since the tests that influence SGPs are given in grades 5 – 8 (well, 3 – 8, but 5 – 8 is most relevant here), the extent to which kids drop off between grades 5 & 6, 6 & 7, and 7 & 8 – and who drops off between those grades – can, of course, affect the median measured gain (if kids who were more likely to show low gains leave, and thus aren't around for the next year of testing, and those more likely to show high gains stay, then median gains will shift upward from what they might otherwise have been).
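For what it's worth, here's a minimal sketch of the bookkeeping behind these cohort tables – tracing a fall enrollment count diagonally across grades and years. The numbers and column names are hypothetical, not the actual enrollment file layout.

```python
import pandas as pd

# Hypothetical grade-level fall enrollments for a single entering cohort,
# traced diagonally across years (grade 5 in 2005, grade 6 in 2006, ...).
cohort = pd.DataFrame({
    "year":  [2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012],
    "grade": [   5,    6,    7,    8,    9,   10,   11,   12],
    "count": [  80,   76,   72,   66,   60,   52,   45,   38],
})

start = cohort.loc[cohort["grade"] == 5, "count"].iloc[0]
cohort["pct_of_entering_grade5"] = 100 * cohort["count"] / start
print(cohort)

# Caveat from the text: 76 sixth graders need not be 76 of the original 80
# fifth graders; backfilling can mask individual-level attrition.
```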

First, let's look at TEAM.

Figure 7. TEAM Cohort Attrition Rates

Among tested grade ranges, with the exception of the most recent cohort, TEAM retains from the upper 80s to the low 90s, in percentage terms, of its 5th graders through 8th grade (with potential replacement involved). Any annual attrition may bias growth percentiles, as noted above, if potentially lower-gain students are more likely to leave. But without student level data, that's a bit hard to tell.

TEAM's grade 5 to 12 attrition is greater, dropping over 25% of kids per cohort. From 9 to 12, about 20% disappear.

But these figures are far more striking for North Star.

Figure 8. North Star Cohort Attrition Rates

Within tested grades, North Star matches TEAM in the most recent year, but in previous years North Star loses marginally more kids from grades 5 to 8, hanging mainly in the lower to mid 80s. So, if there is bias in who is leaving – if weaker, slower-gain students are more likely to leave – that may partially explain North Star's greater gains seen above. Further, as weaker students leave, the peer group composition changes, also having potential positive effects on growth for those who remain.

Now… the other portion of attrition here doesn’t presently affect the growth percentile scores, but it is indeed striking, and raises serious policy concerns about the larger role of a school like North Star in the Newark community.

From grade 5 to 12, North Star persistently finishes with fewer than half the number of students who started! As noted above, this is no anomaly at this point. It's a pattern, and a persistent one, over the four cohorts that have gone this far. I may choose to track this back further, but going back further brings us to smaller starting cohorts, increasing volatility.

Even from Grade 9 to 12, only about 65% persist.

Parsing these data a step further, let’s look specifically at attrition for Black boys at North Star.

Figure 9. Cohort Decline for Black Boys

I've flipped the direction of the years here… to be moving forward in the logical left-to-right direction. So, reorient yourself! For grade 5 to 12, North Star had only one cohort that approached retaining 50% (well… actually, 42%). In other years, grade 5 to 12 attrition was around 75% or greater for black boys. Grade 9 to 12 attrition was about 40% in the most recent two years, and much more than that previously for black boys. Of the 50 or so annual entrants at 5th grade to North Star prior to the recent doubling, only a handful would ever make it to 12th grade.

The concern here, of course, is what is happening to the rest of those students who leave, and what is the effect of this churn on surrounding schools – perhaps both charter and district schools who are absorbing these students who are so rapidly shed. [to the extent, if any, that exceptional middle school preparation at a school like North Star leads students to scholarship opportunities at elite private schools, or acceptance to highly selective magnet schools, this attrition may be less ugly than it looks]

Of course, this does lead one to question how North Star is able to report to the state a 100% graduation rate and a 0.3% dropout rate. Seems a bit suspect, eh?

Figure 10. What North Star reports as its dropout and graduation rates


Notably absent HERE, as well, is any mention of the fact that only a handful of kids actually stick around through grade 12.

So, is this data driven leadership, or little more than drive by data? Seems that they've missed a really, really critical issue. [if you lose more than half of your kids between grades 5 and 12, and even more than that for one of your target populations – black boys – that kind of diminishes the value of the outcomes created for the handful who stay, doesn't it? Not for the stayers individually, but certainly for the school as a whole.]

A few closing thoughts…

As I've mentioned on many previous occasions, it is issues such as this – as well as the demographic effects of charters, magnets and other schools that induce student sorting in the district – that must be carefully tracked and appropriately managed. Neither an actual public school, nor a school chartered to serve the public interest (with public resources), should be shielded from scrutiny.

If we are really serious about promoting a system of great schools (as opposed to a school system) which productively integrates charter and district schools, then we can no longer sit by and permit behavior by some that is, more likely than not, damaging to others in that same system. That's simply not how a "system of great schools" works, or how any well-functioning system – biological, ecological, economic, social or otherwise – works.

But sadly, those who most vociferously favor charter expansion as a key element of supposed "portfolio" models of schooling appear entirely uninterested in mitigating parasitic activity (that which achieves the parasite's goal at the expense of the host – parasitic rather than symbiotic). Rather, they fallaciously argue that an organism consisting entirely of potential parasites is, itself, the optimal form. That the good host is one that relinquishes? (WTF?) As if somehow the damaging effects of skimming and selective attrition might be lessened or cease to exist if the entirety of cities such as Newark were served only by charter schools. Such an assertion is not merely suspect, it's absurd.

So then, imagine, if you will, an entire district of North Stars? Or an entire district of schools striving to achieve the same public accolades as North Star? That would sure work well from a public policy standpoint. They'd be in constant bitter battle over who could get by with the fewest of the lowest income kids. Anyone who couldn't "cut it" in 5th or 6th grade, along with each and every child with a disability other than speech impairment, would be dumped out on the streets of Newark. Even after the rather significant front-end sorting, we'd be looking at 45% citywide graduation rates – actually, likely much lower than that, because some of the aspiring North Stars would have to take students even less likely to complete under their preferred model.

Yes, there would probably eventually be some “market segmentation” (a hearty mix of segregation, tracking & warehousing of kids with disabilities) – special schools for the kids brushed off to begin with – and special schools for those shed later on. But, under current accountability policies, those “special schools” would be closed and reconstituted every few years or so since they won’t be able to post the requisite gains. Sounds like one hell of a “system of great schools,” doesn’t it.

To the extent we avoid changing the incentive structure & accountability system, the tendency to act parasitically rather than symbiotically will dominate. The current system is driven by the need to post good numbers – good "reported" numbers. NJ has created a reporting system that allows North Star to post a 100% grad rate and a 0.3% dropout rate despite graduating fewer than 50% of its entering 5th graders.

What do they get for this? Broad awards, accolades from NJDOE… & the opportunity to run their own graduate school to train teachers in their stellar methods (that result in nearly every black boy leaving before graduation).

A major problem here is that the incentive structure, the accountability measures, and the system as it stands favor taking the parasitic path to results.

That said, in my view, it takes morally compromised leadership to rationalize taking this to the extent that North Star has. TEAM, for example, exists under the very same accountability structures. And while TEAM does its own share of skimming and shedding, it’s no North Star.

But I digress.

More to come – perhaps.


When Dummy Variables aren’t Smart Enough: More Comments on the NJ CREDO Study

This is a brief follow-up on the NJ CREDO study, which I wrote about last week when it was released. The major issues with that study were addressed in my previous post, but here I raise an additional, non-trivial issue that plagues much of our education policy research. The problems I raise today not only plague the CREDO study (largely through no real fault of their own… but they need to recognize the problem), but also plague many/most state and/or city level models of teacher and school effectiveness.

We're all likely guilty at some point in time or another – guilty of using dummy variables that just aren't precise enough to capture what it is that we are really trying to measure. We use these variables because, well, they are available, and often greater precision is not. But the stakes can be high if using these variables leads to misclassification/misidentification of schools for closure, of teachers to be dismissed, or of supposed policy solutions deserving greater investment/expansion.

So… what is a dummy variable? Well, a dummy variable is a simple 0/1 indicator – for example, classifying students as poor or non-poor using a single income cut-off and assigning the non-poor a value of "0" and the poor a value of "1." Clearly, we're losing much information when we take the entire range of income variation and lump it into two categories. And this can be consequential, as I've discussed on numerous previous occasions. For example, we might be estimating a teacher effectiveness model and comparing teachers who each have a class loaded with 1s and a few 0s. But there's likely a whole lot of variation across those classes full of 1s – variation between the classroom with large numbers of children from very low income, single parent & homeless families and the classroom where those 1s are marginally below the income threshold.

For those who've not really pondered this, consider that for 2011 NAEP 8th grade math performance in New Jersey, the gap between non-low-income and reduced-price lunch kids (the 185% of poverty income threshold) is about the same as the gap between free (the 130% income threshold) & reduced!
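To make the point concrete, here's a tiny illustration of what a single free/reduced dummy hides. The group shares and mean scores are invented for illustration (loosely mimicking the similar-sized gaps just described), not actual NAEP results.

```python
# Made-up group shares and mean scores, chosen only to mimic the pattern of
# similar-sized gaps between non-low-income, reduced-price and free-lunch groups.
import numpy as np
import pandas as pd

groups = pd.DataFrame({
    "income_group": ["non_low_income", "reduced_130_185", "free_under_130"],
    "share":        [0.45, 0.15, 0.40],
    "mean_score":   [295, 281, 267],
})

# The usual dummy collapses the bottom two rows into a single "1" category.
lunch1 = groups.iloc[1:]
dummy_mean = np.average(lunch1["mean_score"], weights=lunch1["share"])
print(f"Mean score for dummy = 1 (free OR reduced): {dummy_mean:.1f}")

# Two classrooms "full of 1s" can still differ by ~14 points on average,
# depending on their free vs. reduced mix -- variation the dummy cannot see.
```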


The NJ CREDO charter school comparison study is just one example. CREDO's method involves identifying matched students who attend charter schools and district schools based on a set of dummy variables. In their NJ study, the indicators included an indicator for special education status and an indicator for children qualified for free or reduced-price lunch (as far as one can tell from the rather sketchy explanation provided). If their dummy variables match, they are considered to be matched – empirically THE SAME. Or, as stated in the CREDO study:

…all candidates are identical to the individual charter school student on all observable characteristics, including prior academic achievement.

Technically correct – identical on the measures used – but identical? Not likely!

The study also matched on prior test score, which does help substantially in providing additional differentiation within these ill-defined categories. But it is important to understand that annual learning gains – as well as initial scores/starting points – are affected by a child's family income status. Lower income, among the low income, is associated with increased mobility (induced by housing instability). Quality of life during all those hours kids spend outside of school (including nutrition, health, sleep, etc.) affects children's ability to fully engage in their homework, and also likely affects summer learning/learning loss (access to summer opportunities varies by income, parental involvement, etc.). So – NO – it's not enough to control only for prior scores. Continued deprivation influences continued performance and performance growth. As such, this statement in the CREDO report is quite a stretch (but is typical, boilerplate language for such a study):

The use of prior academic achievement as a match factor encompasses all the unobservable characteristics of the student, such as true socioeconomic status, family background, motivation, and prior schooling.

Prior scores DO NOT capture persistent differences in unobservables that affect the ongoing conditions under which children live, which clearly affect their learning growth!

Now, one problem with the CREDO study is that we really don’t know which schools are involved in the study, so I’m unable here to compare the demographics of the schools actually included among charters with district schools. But, for illustrative purposes, here are a few figures that raise significant questions about the usefulness of matching charter students and district students on the basis  of “special education” as a single indicator, and “free AND reduced” lunch qualification as a single indicator.

First, here are the characteristics of special education populations in Newark district and charter schools.

As I noted in my previous post, nearly all special education students in Newark charter schools have mild specific learning disabilities, and the bulk of the rest have speech impairment. Yet students in district schools who may have received the same dummy variable coding are far more likely to have multiple disabilities, mental retardation, emotional disturbance, etc. It seems rather insufficient to code these groups with a single dummy variable… even if the classifications of the test-taker population were more similar than those of the total enrolled population (assuming many of the most severely disabled children were not in that test-taker sample?).

Now, here are the variations by income status – first for district and charter schools in the aggregate:


Here, charters in Newark as I’ve noted previously, generally have fewer low income students, but they have far fewer students below the 130% income threshold than they do between the 130% and 185% thresholds. It would be particularly interesting to be able to parse the blue regions even further as I suspect that charters serve an even smaller share of those below the 100% threshold.  Using a single dummy variable, any child in either the red or blue region was assigned a 1 and assumed to be the same (excuse me… “IDENTICAL?”). But, as it turns out, there is about twice the likelihood that the child with a 1 in a charter school was in a family between the 130% and 185% income thresholds. And that may matter quite a bit, as would additional differences within the blue region.

Here’s the distribution of free vs. reduced price lunch across NJ charter schools – among their free/reduced populations.


While less than 10% of the free/reduced population in NPS is in the upper income bracket, a handful of Newark charter schools – including high flyers like Greater Newark, Robert Treat and North Star – have 20% to 30% of their (relatively small) low income populations in the upper bracket of low income. That is, for the "matched child" who attended Treat, North Star or Greater Newark, there was a 2 to 3 times greater chance than for their "peer" in NPS that they were from the higher (low) income group.

Again… CREDO likely worked with the data they have. However, I do find inexcusable the repeated sloppy use of the term “poverty” to refer to children qualified for free or reduced price lunch, and the failure of the CREDO report to a) address any caveats regarding their use of these measures or b) provide any useful comparisons of the differences in overall demographic context between charter schools and district schools.

The Secrets to Charter School Success in Newark: Comments on the NJ CREDO Report

Today, with much fanfare, we finally got our New Jersey Charter School Report. The unsurprising findings of that report are that charter schools in Newark in particular seem to be providing students with greater average annual achievement gains than those of similar (matched) students attending district schools. Elsewhere around the state charter schools are pretty much average.

Link to report: http://credo.stanford.edu/pdfs/nj_state_report_2012_FINAL11272012.pdf

So then, the big question is: what exactly is behind the apparent success of Newark charter schools – or at least of enough of them to influence the analysis as a whole? Further, and perhaps more importantly, is there something about these schools that makes them successful that can be replicated?

The General Model

Allow me to start by pointing out that the CREDO study uses its usual approach – a reasonable one given data and system constraints – of identifying matched sets of students from feeder schools (or areas) who end up in district schools and in charter schools. CREDO then compares (estimates) the year to year test score gains of students in the charter and district schools.

The CREDO approach, while reasonable, simply can't sort out which component of student achievement gain is created by "school factors" (such as teacher quality, length of day/year, etc.) and which component is largely a function of concentrating non-low-income, non-ELL, non-disabled females in charter schools while concentrating the "others" in district schools.

School Effect = Controllable School Factors + Peer Group & Other Factors

In other words, we simply don’t know what component of the effect has to do with school quality issues that might be replicated and what component has to do with clustering kids together in a more advantaged peer group. Yes, the study controls for the students’ individual characteristics, but no, it cannot sort out whether the clustering of students with more or less advantaged peers affects their outcomes (which it certainly does). Lottery-based studies suffer the same problem, when lotteried in and lotteried out students end up in very different peer contexts. Yes, the sorting mechanism is random, but the placement is not. The peer selection effect may be exacerbated by selective attrition (shedding weaker and/or disruptive students over time). And Newark’s highest flying charter schools certainly have some issues with attrition.
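To see why this matters, consider the following toy simulation – emphatically not CREDO's model, and with entirely invented parameter values. Students are "matched" on their own characteristics, but the charter sector ends up with a more advantaged peer mix, and the matched comparison attributes the peer effect to the school.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Individual disadvantage indicator (1 = disadvantaged).
disadv = rng.binomial(1, 0.7, n)

# Less disadvantaged students are more likely to enroll in a charter,
# so the charter sector's peer group is more advantaged on average.
charter = rng.binomial(1, np.where(disadv == 1, 0.15, 0.35))

# Peer context = mean disadvantage of each student's sector.
peer_disadv = np.where(charter == 1,
                       disadv[charter == 1].mean(),
                       disadv[charter == 0].mean())

true_practice_effect = 0.0   # assume charters do nothing different in practice
gain = (-0.30 * disadv                 # individual effect of disadvantage
        - 0.50 * peer_disadv           # peer composition effect
        + true_practice_effect * charter
        + rng.normal(0, 1, n))         # everything else

# "Matched" comparison: same individual characteristics, different sectors.
est = np.mean([gain[(charter == 1) & (disadv == d)].mean()
               - gain[(charter == 0) & (disadv == d)].mean()
               for d in (0, 1)])
print(f"Estimated charter 'effect': {est:+.3f}  (true practice effect: 0.0)")
```

In this setup the matched comparison reports a positive charter "effect" even though, by construction, charter practices add nothing – the entire gap is peer composition.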

Given my numerous previous posts, I would suggest Figure 1 as the general model of the secrets of Newark Charter School success.

Figure 1. The General Model

Put simply, while resource use – additional time, compensation, etc. – may be part of the puzzle – the scalable part – the strong sorting patterns of students into charter and district schools clearly play some role – a substantial role – and one that constrains our ability to use “chartering” as a broad-based public policy solution.

One Part Segregation

Let's start by taking a look at the most recent available data on the segregation of students by disability status, free lunch status, gender and language proficiency. Now, the CREDO report is careful to point out that charter school enrollments match the demographics of their feeder schools – and uses this finding as an indication that charter schools therefore aren't cream-skimming. That's all well and good…. EXCEPT… that for some reason (actually, many reasons), charter schools themselves end up having far fewer of the lowest income students. See Figure 2.

Figure 2. % Free Lunch

Now, one technical quibble I have with the CREDO report is that it relies on the free/reduced-price lunch indicator to identify economic disadvantage (and then sloppily, throughout, refers to this as "poverty"). I have shown on numerous previous occasions that Newark charters tend to serve larger shares of the less poor children and smaller shares of the poorer children. So, it is quite likely that the CREDO matched groups of students actually include disproportionate shares of "reduced lunch" children in charters and "free lunch" children sorted into district schools. This is a non-trivial difference! [gaps between free lunch and reduced lunch students tend to be comparable to gaps between reduced lunch and non-qualified students.]

Here are the other sorting issues:

Figure 3. % ELL/LEP

Figure 4. % Female

 

Figure 5 shows that not only do charter schools in Newark tend to serve far fewer children with disabilities, they especially serve few or no students with more severe disabilities. In fact, they serve mainly students with Specific Learning Disabilities and Speech Language Impairment. Given the data in Figure 5, it is actually quite humorous – if not strangely disturbing – that the CREDO study attempted to parse the relative effectiveness of district and charter schools at producing outcomes for children with disabilities using only a single broad classification [student matching was based on a single classification, creating the possibility that children with speech language impairment in charters were being compared with children with mental retardation or autism in district schools. It is likely that most students who took the assessments were those with less severe disabilities in both cases.].

Figure 5. Special Education Distributions

Here are some related findings from (and links to) previous posts

Newark Charter Effects on NPS School Enrollments

New Jersey Charter School Special Education

Newark Charter School Attrition Rates

Here are just a few visuals of how free lunch shares and female student test-taker shares relate to general education proficiency rates on 8th grade math. Both are relatively strong determinants of cross-school proficiency. And with respect to both gender balance and free lunch balance, Newark charter schools are substantively different from their district school counterparts.

Figure 6: 8th Grade Math & % Free Lunch

Figure 7: 8th Grade Math & % Female

 

Now, these are performance level differences, which are not the same as the gain measures estimated in the CREDO study. But, I’ve chosen the 8th grade scores because that is when the charter scores tend to pull away from the district school scores (that is, these are the score levels at the tail end of achieving greater gains). But, the contexts of the gains for charter students are so substantially different from the contexts of achievement gains for district school students that scalability is highly questionable.

As I’ve said before – There just aren’t enough non-disabled, non-poor, fluent English speaking females in Newark to fully replicate district-wide the successes of the city’s highest flying charters.

One Part Compensation

Now, I’ve also written many posts which address the resource advantages and some resource allocation issues for high flying New York City charter schools, which a) also promote substantial student population segregation and b) have been shown in numerous studies to yield positive achievement gains.

I do not intend to imply by my above critique that peer group effect is necessarily the ONLY effect driving Newark Charter’s supposed success. The problem is that because high flying Newark Charters in particular serve such uncommon student populations we can never really sort out the peer group versus school quality effects.

It is certainly reasonable to assume that the additional time and effort spent with these students in some schools – even though they are a more advantaged (less disadvantaged) group – makes a difference.  No excuses charters in Newark like those in New York City tend to provide longer school days and longer school years, and importantly, they compensate their teachers for the additional time & effort. Here’s a simple chart of the average teacher compensation for early career teachers in NPS and Newark Charters. NPS teachers catch back up in later years, but as I’ve pointed out in numerous previous posts, a handful of Newark charters have adopted the reasonable (smart) competitive strategy of leveraging higher salaries and salary growth at the front end to improve teacher retention and recruitment.

Figure 9: Newark Teacher Compensation

Below is a more precise comparison that teases out the differences that aren’t so apparent in Figure 9. For Figure 10, I have used 3 years of data on teachers to estimate a regression model of teacher salaries as a function of experience, degree level and data year.
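For the curious, here's a hedged sketch of that kind of model. The file name and variable names (salary, exp, ma_degree, year, sector) are my assumptions for illustration, not the actual staffing file fields.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical teacher-level file pooling three years of staffing data.
staff = pd.read_csv("nj_teacher_staffing.csv")

# Salary as a function of experience (with a quadratic), degree level, data
# year and sector, letting the experience slope differ by sector so that
# early-career salary growth can be compared across NPS and charters.
model = smf.ols(
    "salary ~ exp + I(exp**2) + ma_degree + C(year) + C(sector) + exp:C(sector)",
    data=staff,
).fit()
print(model.summary())
```

Predicted salaries at, say, one to ten years of experience can then be compared across NPS and individual charters to produce a figure along the lines of Figure 10.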

Some of Newark's "high flying charters" [North Star, Gray, TEAM] tend to substantially outpace salaries of NPS teachers over the first ten years of a teaching career. Few of these schools have any teachers with more than 10 years of experience. Other Newark charter schools maintain at least relatively competitive salaries with NPS.

Now, a critical point here is that as I’ve shown above, teaching in many of these schools comes with the perk of working with a much more advantaged student population. As such,  it is conceivable that even a comparable wage provides recruitment advantage – given the student population difference. Clearly, a higher wage provides a significant recruitment advantage – though in the case of the highest paying school(s), the elevated salary comes with substantial additional obligations.

Figure 10. Modeled Teacher Salary Variation by Experience

Closing Thoughts

So, when all is said and done, this new "charter school" report, like many that have come before it, leaves us sadly unfulfilled, at least with respect to its potential to provide important policy insights. Most cynically, one might argue the main finding of the report is simply that cream-skimming works – it generates a solid peer effect that provides important academic advantages to a few – and serving a few is better than serving none at all (assuming the latter is really the alternative?). Keep it up! Don't worry 'bout the rest of those kids who get shuffled off into district schools. Quite honestly, given the huge, persistent differences in student populations between high flying Newark charters and district schools, and given the relative consistency of research on peer group effects, it would be shocking if the CREDO report had not found that Newark charters outperform district schools.

While it is likely that there exist some strategies employed by some charters (as well as some strategies employed by some district schools) that are working quite well – THE CREDO REPORT PROVIDES ABSOLUTELY NO INSIGHTS IN THIS REGARD. It's a classic "charter v. district" comparison – where it is assumed that "chartering" represents one set of educational/programmatic strategies and "districting" represents another – when in fact neither is true (see the scatter of dots in my plots above for the variation within each group!).

Parsing Charter School Disability Enrollments in PA and NJ

Here are a few quick figures that parse the disability classifications of children with disabilities served by charter schools in Pennsylvania and New Jersey.

Two previous posts set the stage for this comparison. In one, I explained how charter schools in the city of Newark, NJ – by taking on fewer low income students, far fewer LEP/ELL students, and very few children with disabilities other than those with the mildest/lowest cost disabilities (specific learning disability and speech/language impairment) – are leaving behind a much higher need, higher cost population for the district schools to serve.

Effects of Charter Enrollment on District Enrollment in Newark:

https://schoolfinance101.wordpress.com/2012/08/06/effects-of-charter-enrollment-on-newark-district-enrollment/

In another post, I walked through the financial implications of Pennsylvania’s special education funding formula and specifically the charter school special education funding formula on districts where large shares of low need disability students are siphoned off by charters and where high need disability students are left behind to be served by districts with depleted resources.

The Commonwealth Triple-screw:

https://schoolfinance101.wordpress.com/2012/06/05/the-commonwealth-triple-screw-special-education-funding-charter-school-payments-in-pennsylvania/

In short, under the Pennsylvania charter school funding formula, for each child classified as having a disability and choosing to attend a charter school, the sending district must pay the “average special education expenditure” of the district – regardless of the actual IEP needs of that student. So, there’s a strong financial incentive to serve large numbers of low need special education students in PA charters. But this, of course, leaves a mess behind for local districts, who then have a far higher need special education population and have lost substantial shares of their available funding (due to a completely arbitrary and wrongheaded calculation of the sending tuition rate).
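As a simplified illustration of that incentive – not the actual statutory calculation – suppose a district's average special education expenditure is $25,000 per classified child; the per-classification service costs below are hypothetical.

```python
# Hypothetical figures illustrating the incentive: the charter receives the
# district's *average* special education expenditure per classified child,
# regardless of what that child's IEP actually costs to implement.
avg_sped_tuition = 25_000

cost_to_serve = {
    "speech/language impairment":   6_000,   # low-cost classification
    "specific learning disability": 10_000,
    "multiple disabilities":        60_000,  # high-cost classification
}

for classification, cost in cost_to_serve.items():
    margin = avg_sped_tuition - cost
    print(f"{classification:30s} payment ${avg_sped_tuition:,}  "
          f"cost ${cost:,}  margin {margin:+,}")

# Low-cost classifications generate a surplus for the charter; high-cost ones
# a loss. The sending district, meanwhile, pays the same rate either way.
```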

This post merely provides a few more comprehensive follow up figures on the issue of higher versus lower need disability students and charter school enrollments.

First, in New Jersey, here's the statewide breakout of charter special education enrollments and market shares, based on data from 2010 (the same data used in the Newark post); a rough sketch of the share arithmetic follows the bullets:

  • In short, charter schools in NJ serve about 1.7% of the population.
  • They serve about 1.05% of the population of children with disabilities.
  • AND… they serve only about 0.23% of the population of children with disabilities other than Specific Learning Disability or Speech/Language Impairment!
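Here's how one might compute those three shares from a student-level enrollment file. The file name, column names and category codes are assumptions, not the actual NJDOE layout.

```python
import pandas as pd

enr = pd.read_csv("nj_enrollment_2010.csv")   # hypothetical student-level file
# assumed columns: "charter" (0/1) and "classification" ("none", "SLD", "SLI", ...)

has_iep = enr["classification"] != "none"
severe = has_iep & ~enr["classification"].isin(["SLD", "SLI"])

print(f"charter share, all students:            {enr['charter'].mean():.2%}")
print(f"charter share, students with IEPs:      {enr.loc[has_iep, 'charter'].mean():.2%}")
print(f"charter share, IEPs other than SLD/SLI: {enr.loc[severe, 'charter'].mean():.2%}")
```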

That’s a big deal! It’s a big deal because this leaves behind significant numbers of high need disability children to be served by districts. And, to the extent that charter expansion follows the same trend, this will lead to even greater concentration of children with disabilities in general in district schools and children with more severe disabilities in particular.

Here’s the average disability classification profile for NJ public districts and for NJ charter schools.

Now, on to Pennsylvania, where there exists a significant incentive for charter schools to boost their special education populations but to avoid serving children with more severe disabilities. Here are the counts for counties with at least 500 students in charter schools:

Here are the enrollment shares within counties:

And finally, here are the population shares served:

So, for example, in Philadelphia county, which is the city:

  • Charter schools serve 16.2% of the student population
  • Charter schools serve about 14.6% of the children with disabilities
  • BUT… charter schools serve only about 6.3% of children with disabilities other than SLD or SLI!

Even in those counties where charters serve a larger share of the county-wide total special education population, they only occasionally serve an equitable share of children with more severe disabilities (often in specialized schools).

In Delaware County, charters do serve a higher overall special education population share than districts in the county, but serve a much smaller share of non-SLI/SLD disabilities. And Chester-Upland in particular bears the fiscal brunt of this practice!

That said, clearly, PA charter schools are generally serving more comparable aggregate shares of children with disabilities than NJ charter schools and perhaps the financial incentive plays a role.

Again, a critical issue here is the nature of the population left behind in district schools.

These figures also dispel a common assertion of charter advocates/pundits who, when challenged as to why special education rates tend to be generally low in charters, often argue that it’s because the charters are implementing better early interventions and thus avoiding classifying children in marginal categories like “specific learning disability.” To begin with, there’s absolutely no evidence to support this claim. That aside, these figures show that in fact, many charters do seem to have plenty of students in these marginal categories. What they don’t have is students in the more severe disability categories such as mental retardation and traumatic brain injury and it is certainly unlikely that charter school early interventions are successfully preventing children from being later misclassified into these categories.

Statewide, of 724 children with TBI, only 7 were in charters. Of 21,987 children classified with mental retardation, only 396 (1.8%) were in charters. But about 4.1% of all enrollments were in charters.

I’ll admit… I am losing my patience on some of these issues. Excuse me for a moment while I vent. I’m losing my patience in large part because of the ridiculous responses/reactions I get every time I simply post some data either relating to charter school enrollments or finances.  I seem only to get a flood of ridiculous responses when I’m presenting information on Charter schools. Not when I criticize value-added estimates, or point out misuse of SGPs. Pretty much exclusively when I present data on charter schools.

It’s time to cut the crap and start digging into what’s really going on here, and how to move toward a system that best serves all of the children rather than ignoring and brushing aside these issues and pushing forward with what appears to be an emerging parasitic model.

Let's evaluate the incentives. And instead of protecting perverse, damaging financial incentives like those in PA simply because they drive more money to charters, let's do the right thing. Hey, it may be the case that charter allocations are otherwise too low, but raising them for the wrong reasons, with the wrong mechanism and with bad incentives is still wrong, wrong and bad.

It may also be the case that the data we are using for making comparisons – using the total of free and reduced lunch rather than parsing income categories, and comparing total special education rates instead of rates by classification – are encouraging charter operators to boost their enrollment subgroups by focusing on the margins. In which case, we need to make it absolutely clear, by increasing data reporting precision and availability, that serving kids just under the threshold (or in marginal categories) isn't enough. More fine-grained comparisons are necessary!

I've said before that I don't really believe that every school – every magnet school, every charter school, every traditional public school – can or should try to serve exactly the same population. I do believe there's room for specialization in the system. I also believe that many charters that "succeed," so to speak, do so because they've figured out how to serve their non-representative populations well. And many would likely fail miserably at trying to serve children with more severe disabilities (as many district schools have).

BUT… accepting that there’s room for some specialization within the system and some uneven distribution of students is a far cry from what is now emerging, as charter market shares increase significantly in some cities and in some zip codes. And that must stop!

New Jersey Charter Data Roundup: A look at the 2010-11 Report Cards

Here’s a quick run-down of the 2010-11 New Jersey School Report Card data on charter schools. No one else is putting out decent summaries of this stuff, so I feel obligated to revisit these data periodically. The data don’t change much from year to year, but those older blog posts get buried. So, here we go.

Let’s take a specific look at Newark, because that’s where most of the attention regarding high-flying charter performance has been paid.

Data sources:

1. NJDOE Report Card

2. NJDOE Enrollment File

3. NJDOE Directory File (for City location)

Percent Free Lunch

Percent ELL

Percent Female

Regression Model of Charter Performance

More explanation is provided below. But this regression model (raw output at the link below) is simply intended to compare the average proficiency rates, across all tests and grades, of charter schools to other schools in the same city with similar characteristics. The bottom line is that, as in previous similar regressions, there remains only a small, statistically non-significant difference in average overall proficiency. But the graphs that follow are perhaps more fun/interesting to explore.

CharterRegression

Now, for the following figures, the overall charter effect variable is removed, so that we can see how individual charter schools lie with respect to expected proficiency levels. The following figures compare schools to their predicted performance given each of the characteristics in the regression model. On the vertical axis is the standardized residual or the standard deviations above or below predicted performance. Along the horizontal axis is the percent free lunch of the schools, just so that we can see how they sort out by poverty concentration. Note that poverty concentration is already controlled for in the models. I begin with a few figures for select tests in Newark, and then present some statewide figures.
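For readers who want to see how such a comparison might be wired up, here is a minimal sketch of this kind of descriptive regression and the standardized residuals used in the figures. This is illustrative only, not the exact model behind the output linked above; the file and column names are hypothetical stand-ins.

```python
# Illustrative only: a descriptive regression with city fixed effects and
# demographic controls, plus standardized residuals for plotting.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

schools = pd.read_csv("nj_school_proficiency.csv")  # one row per school x test

# Average charter difference, controlling for demographics, city and test
full = smf.ols(
    "proficiency ~ charter + pct_free_lunch + pct_lep + pct_sped + pct_female"
    " + C(city) + C(test)",
    data=schools,
).fit()
print(full.summary())

# For the figures, drop the charter indicator and compute standardized
# residuals (standard deviations above/below predicted proficiency),
# then plot them against % free lunch.
no_charter = smf.ols(
    "proficiency ~ pct_free_lunch + pct_lep + pct_sped + pct_female"
    " + C(city) + C(test)",
    data=schools,
).fit()
schools["std_resid"] = no_charter.resid / no_charter.resid.std()
```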

Newark Schools over and under predicted performance

Statewide schools over and under predicted performance

On average, this statewide picture is actually pretty ugly. It would certainly be very hard to argue that charter school expansion across New Jersey has led to any substantive overall improvement in educational opportunities. Numerous charter schools are substantial underperformers. And overall, as the regression model indicates, net performance is essentially break-even.

Take home points

This analysis merely compares the average proficiency rates of schools with similar characteristics in the same city. It does not measure whether charters “add value” per se.  This isn’t really ideal from a research perspective, because it doesn’t attempt to sort out whether these schools are actually doing something that leads to higher performance.

To address this question we might try either of two strategies: estimating achievement gains across matched schools (or hypothetically matched schools/children), or a lottery-based analysis comparing kids lotteried in to those lotteried out who stay in their neighborhood schools.

But, I would argue that we still might not learn much of policy relevance for Newark from either of these approaches. Why?

Well, let’s consider the first approach – a matched-school analysis (or a virtual match based on individual students). Let’s say we specifically wanted to determine the effectiveness of schools like North Star, Robert Treat or Gray charter. The problem is that there really aren’t any “matched” schools or match-able kids – except perhaps those in magnet schools. A note on matching-based analyses: given that nearly all kids in a city like Newark qualify for free OR REDUCED lunch, matching would have to be done on the basis of free lunch alone. If not, substantial precision/accuracy is lost and the comparisons are invalid (see the sketch below).
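To make the free-lunch point concrete, here is a minimal sketch, with hypothetical file and column names, of matching each charter to the nearest district school on one poverty measure versus the other. In a city where nearly every school is close to 100% free or reduced lunch, matching on the combined measure separates nothing; matching on free lunch alone at least preserves some variation.

```python
# Illustrative only: nearest-neighbor matching of schools on a poverty measure.
# File and column names are hypothetical.
import pandas as pd

schools = pd.read_csv("newark_schools.csv")
charters = schools[schools["charter"] == 1]
district = schools[schools["charter"] == 0]

def nearest_match(row, pool, on):
    # District school closest to this charter on the chosen poverty measure.
    return pool.loc[(pool[on] - row[on]).abs().idxmin()]

# Matching on % free lunch preserves real differences across Newark schools...
matches_free = charters.apply(nearest_match, axis=1, pool=district, on="pct_free_lunch")
# ...while matching on % free OR reduced lunch (near 100% everywhere) does not.
matches_frl = charters.apply(nearest_match, axis=1, pool=district, on="pct_free_or_reduced")
```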

We might look outside of Newark for matched schools or students, but then other contextual factors might compromise the analysis quite substantially, and this might cut either for or against the charters.

Further, it appears that gender balance matters – not just a little – but a lot. Gordon McInnis tipped me off to this.  I hadn’t realized how big a deal it was in these schools.

Note that I’ve also left out attrition here: even if schools were matched on poverty rates, gender and ELL concentration, there might be substantive differences in which students leave, altering peer group composition over time (as weaker students leave). Again, it may be most relevant to compare Newark charters to Newark magnets (and/or the children who attend them), which are most similar to these Newark charters.

We could try to construct hypothetical or virtual matches based on similar individual children (to those in the charters) across the district, who may or may not actually attend school together. But therein lies the problem: most other similar kids left in district schools would be attending school in substantively different peer groups than those in charters like North Star, Gray or Treat.

AND if we did find an “effect” on student achievement growth, what the heck would it mean? And how would it inform our policy decisions?

Well, if we did, we would still have significant difficulty sorting out whether that effect has anything to do with school quality or with student peer group – quite possibly the largest in-school factor affecting achievement.

Alternatively, one could attempt a lottery-based analysis in which we look at the gains of kids lotteried into the charters versus kids lotteried out and left in their neighborhood schools. But in this case we would certainly have kids whose peer groups differ dramatically. Again, we could try to “correct” for that uneven distribution, but the fact is that we simply can’t fully correct for the substantial contextual differences across these schools. Too many Newark charters (and those in Jersey City and Hoboken) simply don’t come close to resembling the student composition of traditional public schools in the same area.

So who cares? Well, it matters a great deal for policy implications whether the effect is created by concentrating less poor, English speaking females in a given school or by actually providing substantively better curriculum/instruction.  The latter might be scalable but the FORMER IS NOT! There just aren’t enough non-poor girls in Newark to create (or expand) a whole bunch of these schools!

NJ Charter Data Round-up

Note: I will be making updates to this post in the coming days/weeks.

As we once again begin discussing & debating the appropriate role for Charter schools in New Jersey’s education reform “mix,” here’s a round-up on the New Jersey charter school numbers, in terms of demographic comparisons to all other public and charter schools in the same ‘city’ and proficiency rates (across all grades) compared to all others in the same ‘city.’

Key Findings:

Many NJ charter schools, especially those most often touted in the media as great success stories, continue to serve student populations that differ dramatically from populations of surrounding schools in the same city (see note *). These charters differ in terms of percentages of children who qualify for free lunch, percent classified as having disabilities, or percent with limited English language proficiency.

On average, given their demographics, NJ charter schools continue to have proficiency rates around where one would expect. Demographically advantaged charter schools have higher average proficiency than other schools around them. Demographically disadvantaged charter schools have lower average proficiency rates than others around them. Not tricky/heavy statistics here. Just a comparison of relative proficiency and relative demography.

When one estimates what I would call a “descriptive regression” model characterizing the differences in proficiency rates across district and charter schools in the same cities, one finds that, compared against schools of similar demography on the same grade-level and subject-area tests, charter proficiency rates, on average, are no different than those of their traditional public school counterparts. In this particular regression model, charters did have higher proficiency in Science (a charter x science interaction). More descriptive stuff to come when I get a chance. Not sure when that will be.

Note: The model includes a fixed effect for CITY location for each traditional public and charter school, such that each charter is compared against other schools in the same CITY.

But to be absolutely clear, this particular analysis misses the point entirely in two ways. First, it is merely descriptive of the average proficiency rates of charter and non-charter schools across tests, subjects and grades. It is not, by any means, a test of the comparative effectiveness of schools. Second, as I explain below, comparisons of charterness vs. non-charterness are not particularly helpful for informing policy.

Policy Perspectives:

Issue 1: The relevant policy question is not whether charters on average perform better than traditional public schools on average, and therefore whether we should simply replace more traditional public schools with more charter schools. The relevant questions are “what works? For whom? And under what circumstances?” Charter schools, traditional public schools and private schools all vary widely in quality and in their ability to serve different populations well. Some schools of each organizational type do well (at least for some kids) while others, quite bluntly, suck, no matter who they try to serve. Further, I’ve written previously about the arguments that charters or private schools “do more (than traditional public schools) with less money.” Rarely, however, are those money comparisons rigorously or accurately conducted. Oftentimes the assertion of “more with less” isn’t backed by any analysis at all of the “with less” part of the equation (and sometimes not the “more” part either). But these are the types of issues we need to be exploring, including specifically what the resource implications are of the models being offered by those “successful” schools, be they charters, traditional public schools, or other alternatives.

Issue 2: It may not be that the only appropriate role for charters in the mix is for all of them to try to serve the most representative population – a population mirroring that of the city as a whole or of their zip codes. But for those that don’t – for those that serve a niche – we need to recognize them as such, and we need to monitor the extent to which their demographic selection may have adverse effects on the system as a whole. We also need to recognize that their demographic difference may play a significant role in explaining either their apparent success or apparent failure. We should recognize, for example, that schools like Robert Treat or North Star Academy may be showing high outcomes, but are doing so largely as a function of serving very different populations than others around them. Further, there may be nothing wrong with that if they are truly doing well by the kids they serve. That may just be their appropriate niche. We just can’t pretend that this model of success can be spread citywide or statewide. And it may be inappropriate to encourage these schools to serve more representative populations. Perhaps they should stick with what they are good at. As a result, it may be more reasonable for charters like North Star or Robert Treat to establish similar niche schools in other New Jersey cities rather than pretending they can expand dramatically in the same cities and still maintain their current level of achievement.

Issue 3: We also need to remember that NJ’s large urban districts themselves operate a wide variety of schools and segment their own student populations at the secondary level through such options as magnet schools. Charters aren’t the only segmenting force. Charters including those that are demographically representative and those that aren’t have simply become a part of that mix. And we need to recognize where each fits into that mix and consider very seriously the implications for the system as a whole.

Issue 4: Finally, as I so often point out, policy perspectives and parental interests may differ sharply when it comes to “elite” charter schools. From a policy perspective, elite charter schools have limited implications for scalability (and for charters as a broad-based policy “solution”) because their benefits are derived from concentrating motivated, often less poor (non-disabled and fluent English speaking), self-selected students with the staying power to endure “no excuses” charter models. From a parental perspective, this public policy limitation often provides the strongest personal incentive to pursue a specific school for one’s own children. Again, it comes down to that “other” strongest in-school factor driving student success – peer effect. Peer effect is a limitation (a confounding factor) in public policy (unless we can find clever strategies to optimize peer distribution). But peer effect may be a legitimate quality indicator for parental choices.

Data notes:

As I’ve noted numerous times on this blog, my goal here is to access and report on publicly available data from widely recognized and/or official government sources. These are the most recent data of that type available. And here are the sources:

District and Charter School Location Information: http://nces.ed.gov/ccd/bat (2009-2010)

District special education classification rates: http://www.nj.gov/education/specialed/data/ADR/2010/classification/distclassification.xls

School level % LEP/ELL & % Free Lunch: http://www.nj.gov/education/data/enr/enr11/enr.zip

Combined Demographic Data: Charter Demographics 2011

*Note: City and zip code averages were constructed by summing all students, all free lunch students and all LEP/ELL students for all schools in each “city” and in each “zip code,” as identified by school location in the NCES Common Core of Data, and then dividing the citywide (or zip-code-wide) free lunch and LEP/ELL counts by citywide (or zip-code-wide) total enrollment, for traditional public schools and charters combined (that is, charters are part of the citywide, or zip-wide, average). For special education, to estimate the citywide (and zip) average for schools, the district overall rate was applied to district schools. This would not be an appropriate way to compare individual city schools to charter schools, since special education populations are not evenly distributed across city schools (or throughout a zip code), but it is a more reasonable approach for generating the citywide aggregates. Again, charters are included in citywide and zip-code-level averages.
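For clarity, here is a minimal sketch, with hypothetical column names, of the aggregation described in the note above: sum counts across all schools in a city (charters included), then divide by total enrollment.

```python
# Illustrative only: building citywide demographic aggregates from school-level
# counts. Column names are hypothetical.
import pandas as pd

schools = pd.read_csv("nj_schools_demographics.csv")

city = schools.groupby("city")[["enrollment", "free_lunch", "lep"]].sum()
city["pct_free_lunch"] = 100 * city["free_lunch"] / city["enrollment"]
city["pct_lep"] = 100 * city["lep"] / city["enrollment"]

# For special education, the district-level classification rate is applied to
# district schools before aggregating, per the note above.
```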

Digging for Consistent, Comprehensive Financial Data on New Jersey Charter Schools

I’ve commented in the past about the difficulties of obtaining reconcilable data on finances of New Jersey Charter Schools. What do I mean by reconcilable? Well, when I’m looking at financial data on charter schools in particular, I like to be able to see some relationship between expenditure and revenue data reported on IRS 990 filings (Tax returns of the non-profit boards/foundations/agencies that operate the charters) and state government (department of ed) reported expenditures and/or any annual financial report documents that might be required by charter authorizers. This really is an authorizer/accountability issue. A financial reporting requirement issue.

When I did my study of New York City charter schools last year, I was quite pleased to find a) annual financial reports for nearly all NYC charters housed by the State University of New York, b) IRS 990 filings for nearly all NYC charter schools, and c) a pretty strong relationship between the expenditures reported on one form and the expenditures reported on the other. Here are two graphs of those relationships – the first including the higher outliers (which is partly a reporting issue, with KIPP Academy embedding systemwide expenses, an issue consistent across both forms).

Example: NYC Charters

Here’s how it looks if I focus on those spending less than $20,000 per pupil:

Example: NYC Charters

So, in NYC, I have pretty solid information from both sources, well aligned but with some notable exceptions. Some of these exceptions were further reconciled, or at least changed positions, when we added in expenditures from affiliated foundations (Harlem Children’s Zone, HCZ in particular).

In New Jersey, financial data on charter schools seem to be improving, but remain sparse. For example, in my most recent search of IRS 990 filings through Guidestar, I was able to obtain the following distribution of institutions by most recent available filing year:

2010 (2009-10 school year) = 40

2009 (2008-09 school year) = 8

2005 = 1

2004 = 1

2003 = 3

1998 = 1

Yet the New Jersey Department of Education (NJDOE) reports data on 64 charter schools (63 with expenditure data). So I can still only easily access up-to-date IRS filings for about 2/3 of NJ charter schools. This, to me, is a concern, but it is a massive improvement over the past few years. I now actually have enough data from each source to check the relationship between the two, and where data are reported, that relationship is strong:

But we still have limited information on many NJ charter schools, and only a single source of data on which to rely. Indeed, it is the official state department of education data, but it’s always nice to be able to reconcile with other official data/filings/reports.

Note also that in NY, the points that fall in line fall right in line – on a straight line – with exactly reconcilable numbers. The NJ figures that are reported are getting better… mostly in line.
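For those who want to replicate this kind of reconciliation check, here is a minimal sketch, with hypothetical file and column names: merge IRS 990 program expenditures with NJDOE-reported spending and see how closely the two line up per pupil.

```python
# Illustrative only: reconciling IRS 990 expenditures with NJDOE-reported
# spending per pupil. File and column names are hypothetical.
import pandas as pd

irs = pd.read_csv("nj_charter_irs990.csv")    # school, year, program_expenditures
njdoe = pd.read_csv("nj_charter_njdoe.csv")   # school, year, spending_pp, enrollment

merged = irs.merge(njdoe, on=["school", "year"], how="inner")
merged["irs_pp"] = merged["program_expenditures"] / merged["enrollment"]

# Well-aligned data would show points near the 45-degree line and a
# correlation near 1 between the two per-pupil figures.
print(merged[["irs_pp", "spending_pp"]].corr())
```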

Here’s how the spending per pupil rankings play out in NJ using each source. First, the IRS 990 data:

Next the NJDOE spending guide data:

Note that things change a bit when we add in those cases where IRS 990s weren’t reported.

So, there are a lot of schools missing in that first graph, and adding the others in does change things a bit. But I’d like to see both forms of data readily available on an annual basis.

Among other things, these data reveal some striking differences in spending, which perhaps result at least partly from access to non-public funding, but also partly result from differences in host district funding. An important question here is whether these differences are driven systematically by differences in the needs of the student populations served by these particular schools.

That is, are the differences in spending across charters a predictable function of various student needs, such as concentrations of low income children, English language learners or children with disabilities?

Are the differences in spending across charters partially explained by regional differences in labor costs? (e.g. competitive wages for school employees such as teachers?)

That is, to what extent do these substantial differences in spending across charters enhance equity, as opposed to eroding it? And to what extent should we be concerned about the role of charters within the public system eroding equity? (e.g., are traditional resource equity concerns relevant when individuals and families choose less well resourced schools? Do more well resourced schools tend to have longer waiting lists? It makes for a fun legal question, as well as a moral/ethical one.)

I’ll explore these issues in a future post. For now, I’ve just been trying to get enough coverage of data on the financing of NJ charter schools in order to be able to conduct such analysis. And it has been very frustrating that such data are not readily available for all schools and easily reconcilable.

 

Misinformed charter punditry doesn’t help anyone (especially charters!)

Download slides of figures below: TEAM Academy Slides Oct 5 2011

Link to NCES Common Core Build a Table: http://nces.ed.gov/ccd/bat/

Link to Special Education Data (NJDOE): http://www.nj.gov/education/specialed/data/ADR/2010/classification/distclassification.xls

Link to School Report Card Download (NJDOE): http://education.state.nj.us/rc/rc10/database/RC10%20database.xls

Link to Enrollment Data 2010-11 (NJDOE):  http://www.nj.gov/education/data/enr/enr11/enr.zip

 

Misinformed charter punditry doesn’t help anyone. It doesn’t help the public to make more informed decisions either about choices for their own children or about policy preferences more generally. It also doesn’t help charter operators get their jobs done and it doesn’t help those working in traditional public schools focus on things that really matter.  This post is in direct response to the irresponsible and unjustified statement below from a recent editorial in the NJ Star Ledger:

The best of these schools, like the TEAM Academy in Newark, are miracles in our midst. With the same demographic mix of students as district schools, their kids are doing much better in basic skills. And they are doing it for less money, in a setting that is safe and orderly.

http://blog.nj.com/njv_editorial_page/2011/10/nj_sets_right_course_on_charte.html

Nearly every phrase in this statement is misleading or simply wrong. And that’s a shame. My apologies for being trapped in meetings yesterday and not having a chance to return calls on this topic. I might have been able to head this off.  Perhaps most disturbingly, this stuff really doesn’t help out TEAM Academy much either. Readers of my blog know that I often go after stories about the high flying Newark and Jersey City charters which, for the most part, stick out like sore thumbs when it comes to demographics and attrition. Readers also realize that it is not that I think these schools are doing a bad job. Rather, I think many are doing a great service. But, I am concerned that the media often deceives the public into believing that the “successes” of schools like North Star and Robert Treat can be scaled up to improve the entire system, which they cannot, because they simply do not serve students like those in the rest of the system.

My readers also know that I’ve generally left TEAM Academy alone here, for a few reasons. First, TEAM’s demographics are less extreme outliers than those of the other high flyers. Second, TEAM’s outcomes are more modest, but pretty good. Third, and perhaps this reveals preferential treatment on my part, the head of TEAM, Ryan Hill, has always been one for open and honest conversation on these very topics – perhaps because he understands fully that I’m not out to get him, or any other charter leader here. Rather, I’m out to paint a realistic picture of what’s going on.

So, here I’m going to paint a realistic picture of TEAM Academy. This is not criticism. It’s realism. And again, I do appreciate Ryan Hill’s efforts and TEAM’s role in the Newark community. That’s why I think the above statement is so irresponsible. It sets an inappropriate bar and casts TEAM in an inappropriate light. It’s not a miracle. It doesn’t serve the same population. It spends quite a bit (but spending is all relative) and pays its teachers particularly well.

First, here are the percentages of children qualified for free lunch within the TEAM zip code in Newark:

Here’s an updated graph of TEAM vs. all NPS schools districtwide, using % free lunch data from 2010-11 from the NJDOE enrollment files: http://www.state.nj.us/education/data/enr/enr11/stat_doc.htm


I have previously reported on special education data, which are sorely lacking in NJ at the school level. Suffice it to say that all official reports indicate lower special education enrollments at TEAM than district averages, but unofficial and district-provided school-site reports for Newark Public Schools vary widely. Here are the most recent classification data at the district level for Essex County districts and select Newark charters:

While TEAM has a much higher classification rate than other “high-flying” Newark charters, its total rate is still much lower than Newark Public Schools. Further, we have no information on the enrollment of children with severe disabilities.

Second, here are the cohort attrition rates for Newark charters. Indeed TEAM has lower attrition than some, but still shows significant attrition from year to year (old slide, so North Star is highlighted). We don’t know much about the nature of that attrition, nor can these data tell us about it.

Now on to resource issues. According to TEAM Academy’s IRS 990 form, the school spent in 2010:

Total Program Expenditures = $19,452,929

TEAM IRS 990

On 1,050 students

For a total per pupil of $18,527
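That per-pupil figure is simply the reported total divided by enrollment, as the quick check below shows.

```python
# Quick check of the per-pupil figure from TEAM's reported totals.
total_program_expenditures = 19_452_929
students = 1_050
print(round(total_program_expenditures / students))  # 18527
```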

It is important to understand that this figure may not be a full representation of what TEAM spends. It does not include additional expenditures on school activities by the national KIPP organization under which TEAM operates (which may include professional development, instructional materials, other gifts/stipends, etc.).

It is critically important to understand that this figure is not directly comparable to NPS total district budget per pupil for many reasons.  NJDOE data for making such comparisons are problematic in a number of ways, and newly revised data are no better than the older data.

This figure would need to be compared with an appropriate school-site expenditure figure for NPS schools serving similar grade levels and populations. For example, NPS district expenditures include the expenditures for transportation of charter students (which should be added to charter expense, not counted as host district expense). Further, one must acknowledge that since TEAM serves far fewer children with disabilities than the district, especially those with more severe disabilities, TEAM’s per-pupil costs are lower. Note that spending on children with disabilities often consumes about 25% of district budgets (to serve about 14 to 16% of children, on average).* Appropriate comparisons would include relevant facilities expenses (annualized) for both charter and host.* I wrote extensively about the complexities of making similar comparisons in NYC last winter: http://nepc.colorado.edu/publication/NYC-charter-disparities And I continue to work on this topic as it applies to NJ districts and charter schools.

But here is perhaps the most important point that can be made about resources…

There should be no shame in trying to spend enough money to actually provide a decent education!

It is twisted logic to assume otherwise! And the Star Ledger editorial ignorantly advances this twisted logic.

There’s no shame in doing more with more or even similar levels of resources (if that is indeed what’s happening).

Here are some insights into how TEAM spends. Many pundits these days talk about how we shouldn’t be throwing so much money at those already overpaid teachers.  Well, here’s how TEAM Academy’s salaries stack up against some nearby public districts and against some other charters. This is an unfinished analysis, based on actual individual teacher salaries from a statewide database.

TEAM has strategically, I would argue, put itself in a position to recruit top new teaching candidates on the front end and scaled up salaries to retain teachers who’ve made it past those rough first few years. Yes, TEAM is leveraging its resources to pay competitive wages (something not so hip and cool in today’s reformy rhetoric), which I would argue is a smart move. And, in the Newark context it’s not a difficult move because the NPS district salary schedule is so flat on the front end. It’s easy to beat. And relative salaries matter. Indeed, TEAM has placed more value on early-mid career than late career, but it’s not that TEAM reduces salaries for later career teachers, but rather that TEAM salaries climb earlier. As of now, TEAM doesn’t have many “senior” teachers, partly because it hasn’t been around that long.

Again, to summarize:

  • It’s not a miracle but it just may be a pretty good school.
  • It doesn’t serve the same population, but it serves a more similar population than many other high-flying charters do.
  • It spends quite a bit and pays its teachers particularly well, but structures that pay differently.

AND THERE’S ABSOLUTELY NOTHING WRONG WITH THAT. (even if it doesn’t make good news copy!)

So, that’s my “real” TEAM story – at least in data terms. I assume Ryan Hill can provide some insights from the trenches (perhaps while humming this catchy tune: http://www.youtube.com/watch?v=gQjFHxJ9IKs)!

*For example, special education costs per pupil within a district budget that spends $20,000 per pupil might be $5,000 per pupil, or 25% (based specifically on analysis of special education expenditures in Connecticut districts). In New York City, the Independent Budget Office (see my NEPC report on charter spending above) estimated occupancy costs for facilities at approximately $2,700 per pupil. That is to say, on balance, the differences in district special education costs (relative to charter special education costs) would typically more than offset differences in facilities costs per pupil, even assuming district schools have $0 facilities costs (which is an extreme, incorrect assumption).
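A quick worked version of that footnote, using only the figures cited there (the 25% special education share of a $20,000 per-pupil budget and the NYC IBO facilities estimate):

```python
# Worked version of the footnote's illustration, using only the cited figures.
district_budget_pp = 20_000
sped_share = 0.25
sped_cost_pp = district_budget_pp * sped_share   # $5,000 per pupil
charter_facilities_pp = 2_700                    # NYC IBO occupancy estimate

# Even if district schools had $0 facilities costs, the special education cost
# differential would more than offset the charter facilities cost per pupil.
print(sped_cost_pp - charter_facilities_pp)      # 2300.0
```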

DATA UPDATE – HERE ARE TEAM ACADEMY’S 2010 OUTCOMES IN PERSPECTIVE

The following graphs do a relatively simple comparison of proficiency rates by schoolwide % of children qualifying for free lunch. Two data issues are important to recognize here:

1) I’ve used schoolwide % free lunch here instead of test-taker % free or reduced lunch because, as I’ve explained numerous times before, the vast majority of Newark families fall below the 185% income threshold and qualify for at least reduced price lunch. As such, that measure captures little or no difference across schools. But there are differences, and those differences are captured by looking at the lower income threshold for free lunch.

2) Because charter schools, including TEAM, serve far fewer children with disabilities, and few or no children with severe disabilities, one must compare the proficiency rates of GENERAL test takers only. If, for example, a host district has 10% more kids with disabilities, and those kids are invariably non-proficient, that’s a 10 percentage point proficiency difference to begin with.

In these figures, I’m considering only low income concentrations with respect to outcomes. On that basis alone, TEAM is marginally above expectations a) overall, and b) on most grade level assessments. On the high school assessment, TEAM does somewhat better, but schools are pretty much scattered all over the place. It’s a solid school, but no miracles.