Why is it OK for Think Tanks to just make stuff up?


Something that has perplexed me for some time in my field of school finance is why it seems to be okay for policy advocates and “Think Tanks” to just make stuff up. For example, to just make up what level of funding would be appropriate for accomplishing any particular set of goals, or to just make up a figure for how much more a child with specific educational needs requires under state school finance policy. Just “making stuff up” seems particularly problematic for “Think Tanks,” which, as far as I can tell, should be producing information backed by at least some degree of … Thinking? Perhaps based on some of the more reasonable thinking of the field?

This topic comes to mind today because ConnCAN has just released a report (http://www.conncan.org/matriarch/documents/TheTab.pdf) on how to fix Connecticut school funding, which provides classic examples of just makin’ stuff up (page 25). The report begins with a few random charts and graphs showing the differences in funding between wealthy and poor Connecticut school districts and their state and local shares of funding. These analyses, while reasonably descriptive, are relatively meaningless because they are not anchored to any well-conceived or articulated explanation of “what should be.” Such a conception might be located here or even here (Chapters 13, 14 & 15 are particularly on target)!

The height of making stuff up in the report is the recommended policy solution to a problem that is never clearly articulated. There are problems in CT, but The Tab certainly doesn’t identify them!

The supposed ideal policy solution involves a pupil-based funding formula in which each pupil would receive at least $11,000 (made up), each child in poverty (no definition provided – just a few random ideas in a footnote) would receive an additional $3,000 (also made up), and each child with limited English proficiency would receive an additional $400 (yep… totally made up). There is minimal attempt in the report (http://www.conncan.org/matriarch/documents/TheTab.pdf) to explain why these figures are reasonable. They’re simply made up.
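For readers unfamiliar with how a weighted student funding formula works mechanically, here is a minimal sketch using the report’s figures. The weights ($11,000 base, $3,000 poverty, $400 ELL) are The Tab’s made-up numbers; the district enrollment counts below are hypothetical, purely for illustration. The point is that the arithmetic is trivial – the entire policy question is whether the weights themselves are defensible.

```python
# Minimal sketch of a weighted student funding formula, using The Tab's
# made-up weights. The enrollment counts below are hypothetical.

BASE_PER_PUPIL = 11_000   # base amount per pupil (The Tab's figure)
POVERTY_WEIGHT = 3_000    # additional amount per pupil in poverty
ELL_WEIGHT = 400          # additional amount per English language learner

def district_allocation(enrollment: int, poverty_count: int, ell_count: int) -> int:
    """Total formula allocation for one district under the proposed weights."""
    return (enrollment * BASE_PER_PUPIL
            + poverty_count * POVERTY_WEIGHT
            + ell_count * ELL_WEIGHT)

# Hypothetical district: 5,000 pupils, 2,000 of them in poverty, 500 ELL
total = district_allocation(5_000, 2_000, 500)
print(f"Total allocation: ${total:,}")               # Total allocation: $61,200,000
print(f"Average per pupil: ${total / 5_000:,.0f}")   # Average per pupil: $12,240
```

Nothing in that arithmetic tells you whether $11,000, $3,000, or $400 bears any relationship to what it actually costs to educate these children. That is precisely the part the report skips.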

The authors do provide some back-of-the-napkin explanations for the numbers they made up – based on those numbers being larger than the amounts typically allocated (not necessarily true). They write off the possibility that better numbers might be derived with nothing more than a general footnote reference to a chapter in the Handbook of Research on Education Finance and Policy by Bill Duncombe and John Yinger – a chapter which actually explains methods for deriving such estimates.

The authors of The Tab conclude: “Combined with federal funding that flows on the basis of poverty and (in some cases) the English Language Learner weight of an additional $400, the $3,000 poverty weight would enable districts and schools to devote considerable resources to meeting the needs of disadvantaged students.” I’m glad they are so confident in their “made up” numbers! I, however, am less so!

It would be one thing if there were no conceptual or methodological basis for figuring out which children require more resources or how much more they might actually need. Then, I guess, you might have to make stuff up. Even then, it might be reasonable to make at least some thoughtful attempt to explain why you made up the numbers you… well… made up. But alas, such thinking seems beyond the grasp of at least some “think tanks.” Guess what? There actually are some pretty good articles out there which attempt to distill the additional costs associated with specific poverty measures… like this one, by Bill Duncombe and John Yinger:

How much more does a disadvantaged student cost?

It’s not as if the title of this article somehow conceals its contents, is it? Nor is the journal in which it was published (Economics of Education Review) somehow tangential to the point at hand. This paper, prepared for the National Research Council, provides further insight into the additional costs associated with poverty and methods for estimating those costs.

Rather than even attempting to argue that these figures are grounded in anything, the authors of The Tab seem to push the point that it really doesn’t matter what these numbers are as long as the state allocates pupil-based funding. That’s the fix! That’s what matters… not how much funding, or whether the right kids get the right amounts. In fact, the reverse is true. The potential effectiveness, equity, and adequacy of any decentralized weighted funding system is highly contingent upon driving appropriate levels of funding, and appropriate funding differentials, across schools and districts!

I’ve critiqued the notion of pupil-based funding as a panacea here:

Review of Fund the Child: Bringing Equity, Autonomy and Portability to Ohio School Finance

Review of Shortchanging Disadvantaged Students: An Analysis of Intra-district Spending Patterns in Ohio

Review of Weighted Student Formula Yearbook 2009

Oh, and also here: http://epaa.asu.edu/epaa/v17n3/

Among other things, in each of these critiques of think-tank reports I question why it seems okay to just make up “weights” and cost figures when applying distribution formulas – whether for within-district or between-district distribution.

Just thinking… but not making stuff up!


3 Comments

  1. Maine’s Essential Programs and Services (EPS) formula was developed by the Center for Education Policy, Applied Research, & Evaluation at the University of Southern Maine. It has been in place for years now, and there are many of us who still don’t truly understand why the numbers used are what they are or how they are applied, particularly from school system to school system. Nowhere on the Department of Education website is there a clear explanation of the basic formula. And nowhere online can anyone find an “ED 279” (even as an example), which is the 4-page spreadsheet each school system receives detailing how its EPS funds were calculated.

    Add to this a bottom-line confusion over whether the EPS funding formula creates the base or the top, and I think the question of “making stuff up” applies here just as well as in any other state – even though it appears to have been thought through.

    Simple example: in the early years, each school system was allocated $408/student in the formula for “central office administration.” When the legislature decided to mandate school consolidation in 2007, it unilaterally cut that number in half (which created $36.5 million in savings that – miraculously – was exactly what was needed to plug the budget hole that year). There was no concurrent requirement that school systems cut their central office administration in half. There is no explanation other than that this number – and probably more – was simply “made up” to suit the purpose.

    1. You raise a fun and interesting point – that indeed many parameters in existing state school funding formulas are simply “made up.” Some are made up not only in magnitude, but even in concept – like the Kansas adjustment for the 16 districts with the highest-priced houses, intended to help them overcome their teacher recruitment difficulties. Of course, the reality is that these, of all Kansas districts, are the districts with the fewest problems recruiting and retaining teachers. Such parameters are sadly all too common in state school finance formulas.

      I’d like to hold policy think tanks and consultants to a slightly higher standard here. An analysis with recommendations for legislative action should be better than simply “made up,” even if the policies that ultimately get implemented are not. Those who write such analyses and make proposals should not be giving legislators even more reason to make stuff up, or act on made up stuff.

  2. Well, if your ideology and ideologically driven sugar daddies require you to maintain a certain level of consistency and certainty, then real numbers and models just screw that up. Consequently, it’s better to pull numbers out of your a**, er, HAT, than to draw on real data and well-tested models and formulae. (“Data? Real data? Perish the thought…”)

    Just sayin’… then again, I’m but a humble historian.
