Idiot of the Week (year) Award… The Cartel… Check this out!

See updated post on this topic: https://schoolfinance101.wordpress.com/2010/04/16/cartel-recap/

====

Okay… so I’m curious about The Cartel, a movie that documents the failures of New Jersey’s public education system… and the high costs of those failures. One might construct a reasonable statistical case for some of the problems facing New Jersey schools… but not documentary filmmaker Bob Bowdon in “The Cartel.” I’ve not seen it yet… but the film’s Facts and Figures page, here, includes some of the dumbest assertions I’ve seen in a long time:

http://thecartelmovie.com/

Go to the bottom of the page where this complete moron attempts to argue that states which spend more on education have lower SAT scores… that spending more leads to lower SAT scores.

[BOWDON APPEARS TO HAVE REMOVED THE TWO CHARTS WHICH ATTEMPT TO MAKE THIS ARGUMENT. SEE COMMENT BELOW]

QUOTES/LABELS FROM THE CHARTS INCLUDED:

“THE MORE A STATE SPENDS ON SCHOOLS THE LOWER ITS KIDS’ SAT SCORES”

“EVERY EXTRA $100 IN EDUCATION SPENDING LOWERS SAT SCORES BY 1 1/3 POINTS”

He kept this statement on his “The Deal” page: “With spending as high as $483,000 per classroom (confirmed by NJ Education Department records), New Jersey students fare only slightly better than the national average in reading and math, and rank 37th in average SAT scores.”

In fact, there may be a connection… that is, states that spend more, which happen to be in the northeast, also happen to have higher SAT participation rates, because northeastern colleges and universities use the SAT. 82% of New Jersey students take the SAT. The figure is 9% in Alabama and 4% in Mississippi, and students taking the SAT in those states tend to be the select few interested in attending competitive northeastern colleges. So, we’re comparing the top 4% of Mississippi students to 82% of NJ students.

Anyway… that absurdity aside, here’s a better picture of how state spending on schools relates to state average outcomes. The following four graphs show the relationship between predicted basic state and local revenue per pupil (controlling for sparsity, economies of scale, state poverty rates, ELL children and regional wage variation) and National Assessment of Educational Progress 2007 scores. Actually, somewhat to my own surprise, there is a reasonably positive relationship here. THAT SAID… I DO NOT ASSUME THIS TO BE A SIMPLE DIRECT CAUSAL RELATIONSHIP. There are many potentially interesting underlying stories that might be told here about regional differences in income, adult population education levels, tax policy structures, etc.
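The selection effect described above is easy to demonstrate with a toy simulation (entirely hypothetical numbers, not actual state data): give two “states” identical underlying score distributions, but have only the top few percent of students take the test in one of them. The low-participation state’s reported average comes out far higher, with zero difference in the students themselves.

```python
import random
import statistics

random.seed(1)

def state_avg(participation):
    """Simulate one hypothetical state: 10,000 students drawn from the
    SAME ability distribution; only the top `participation` fraction
    take the SAT (a crude stand-in for 'the select few' test-takers)."""
    scores = [random.gauss(1000, 200) for _ in range(10_000)]
    scores.sort(reverse=True)
    takers = scores[: int(participation * len(scores))]
    return statistics.mean(takers)

# Identical underlying distributions; only WHO takes the test differs.
low_part = state_avg(0.04)   # ~4% participation (a la Mississippi)
high_part = state_avg(0.82)  # ~82% participation (a la New Jersey)

print(f"4% participation state avg:  {low_part:.0f}")
print(f"82% participation state avg: {high_part:.0f}")
# The low-participation state's reported average is dramatically
# higher, even though the student populations are identical.
```

The point of the sketch: a raw state-level scatterplot of spending against SAT averages mostly picks up this participation artifact, not anything about what spending buys.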

Anyway… for me… this foolishness has reduced significantly any interest I may have had in actually seeing the movie.  Ignorant… juvenile… silly… I’m not even sure how to classify this attempt at a “brilliant revelation” from a scatterplot (FYI – I used to teach my 7th graders how to do this stuff… and draw appropriate inferences…not this kind of crap.)

I was initially pleased to see that the “facts and figures” page on the site actually had links to reasonable facts and figures and reports… rather than making them up off the cuff… (a topic I’ve written about with regard to teacher salaries, administrative salaries, Abbott spending and many other related topics – https://schoolfinance101.wordpress.com/2008/12/09/notes-from-a-school-finance-curmudgeon/).

Here’s the relationship between SAT participation rates and SAT combined scores.

[Scatterplot: state SAT participation rates vs. average combined SAT scores]

By the way… this graph I previously posted compares teacher salaries to those of other professions holding similar degree levels, at similar ages, over time in NJ. And these are hourly wage comparisons. Interestingly, teachers have fallen further and further behind over time.

https://schoolfinance101.wordpress.com/2009/01/14/those-darn-overpaid-nj-teachers-sucking-the-life-out-of-the-lagging-economy/

========

And here’s where NJ actually stands on corrected spending measures and standardized outcomes:

Do School Finance Reforms Make Any Difference?

Well… here’s what a number of reasonably strong empirical studies have shown…

•   David Card & A. Abigail Payne, School Finance Reform, The Distribution of School Spending, and the Distribution of Student Test Scores, 83 J. Pub. Econ. 49 (2002)
–    Using micro samples of SAT scores from this same period, we then test whether changes in spending inequality affect the gap in achievement between different family background groups. We find evidence that equalization of spending leads to a narrowing of test score outcomes across family background groups. (p. 49)
•    John Deke, A Study of the Impact of Public School Spending on Postsecondary Educational Attainment Using Statewide School District Financing in Kansas, 22 Econ. Educ. Rev. 275 (2003)
–    In this paper, I use a policy change in Kansas involving statewide school district refinancing to measure the impact of per-pupil spending on the probability that a student will choose to acquire more education. Using panel models that, if biased, are likely biased downward, I have a conservative estimate of the impact of a 20% increase in spending on the probability of going on to postsecondary education. The regression results show that such a spending increase raises that probability by approximately 5%.
•   Thomas Downes (2004) School Finance Reform and School Quality: Lessons from Vermont. In Yinger, J. (ed) Helping Children Left Behind: State Aid and the Pursuit of Educational Equity. Cambridge, MA: MIT Press.
–    All of the evidence cited in this paper supports the conclusion that Act 60 has dramatically reduced dispersion in education spending and has done this by weakening the link between spending and property wealth. Further, the regressions presented in this paper offer some evidence that student performance has become more equal in the post–Act 60 period. And no results support the conclusion that Act 60 has contributed to increased dispersion in performance. (p. 312)
•    Tom Downes, Jeffrey Zabel, Dana Ansel (2009) Incomplete Grade: Massachusetts Education Reform at 15. Boston, MA: MassINC.
–    The achievement gap notwithstanding, this research provides new evidence that the state’s investment has had a clear and significant impact, showing how education reform has been successful in raising the achievement of students in the previously low-spending districts. Quite simply, this comprehensive analysis documents that without Ed Reform the achievement gap would be larger than it is today. (p. 5)
•    Jonathan Guryan (2003) Does Money Matter? Estimates from Education Finance Reform in Massachusetts. Working Paper No. 8269. Cambridge, MA: National Bureau of Economic Research.
–    Using state aid formulas as instruments, I find that increases in per-pupil spending led to significant increases in math, reading, science, and social studies test scores for 4th- and 8th-grade students. The magnitudes imply a $1,000 increase in per-pupil spending leads to about a third to a half of a standard-deviation increase in average test scores. It is noted that the state aid driving the estimates is targeted to under-funded school districts, which may have atypical returns to additional expenditures.
•    Margaret Goertz & Michael Weiss (2008) Assessing Success in School Finance Litigation: The Case of New Jersey
–    State Assessments: The gap between Abbott districts and all other districts was reduced to 12 points by 2005 or 0.40 standard deviation units. The gap between the Abbott districts and the high wealth districts closed from 25 points to 15 points in 2005 (Figure 7). Performance in the low, middle, and high wealth districts essentially remained parallel during this time. (p. 17)
–    NAEP: The NAEP results confirm the changes we saw using state assessment data. NAEP scores in 4th grade reading and mathematics in Central Cities rose 19 and 20 points, respectively, between the mid-1990s and 2005, a rate that was faster than either the Urban Fringe or the state as a whole. (p. 20)

Should Think Tanks be Licensed to Think? (and when should a license be revoked?)

We all have our flaws. As social science researchers, writers, reviewers, etc. dealing with highly political and politicized topics, we all let our personal biases creep inappropriately into our work at times. In the world of education policy writing and research, there are many occasions where individuals and organizations are provided opportunities to selectively review and present data and summaries of existing literature in order to make one point or another or argue one side of an issue or another. There will always be reasonable differences of opinion on complex social science issues, both with regard to the methods that should be used to provide definitive answers, and the definiteness of the answers themselves.

All of that aside, there are, I believe, some really basic rules of thumb that should be followed and some offenses that should be considered so egregious that they warrant public rebuke and removal from the table around which public discourse occurs on key education policy topics.

My latest example of an offense so egregious comes from a review I completed today of a report from Lisa Snell of the Reason Foundation – the Weighted Student Formula Yearbook 2009. Reason Foundation is a Think Tank, so to speak, or at least fashions itself as one. There are many “Think Tanks” across the political spectrum, and these Think Tanks frequently chime in with well-publicized findings regarding education policy and finance issues. So, this is not new. That said, this particular report is simply so awful that I can’t shake it from my head.

To summarize, the report touts urban school reforms which involve decentralized management and school choice, two issues on which I personally do not have strong opinions one way or the other, from an ideological perspective (rather, my concern is that if one pursues a particular reform, it should be done using appropriate evidence, and should be done well – or as well as can be – and not half-baked). The report inappropriately casts the varied reforms presented as Weighted Student Formulas – a specific reform which involves using a student-need-focused funding formula to drive financial resources to schools – something which, if done well, could be quite a good idea.

So… I digress… what is so awful then about the Reason report? Why do I think their license to think should be revoked and their seat at the education policy debate table eliminated? In an effort to make their case that the reforms implemented in 14 cities and 1 state have all been highly successful, the authors go so far as to cite evidence of improved student outcomes – in one third of their 15 cases – that occurred in years prior to the implementation of the reforms. That is, the reforms somehow, amazingly, through space-time travel perhaps, had retroactive effects on student outcomes. I’m dumbfounded!

Here’s a section of my review:

=====

Most problematic is the fact that in five of the 15 cases discussed (one third of the cases), outcome successes mentioned actually occurred prior to the implementation of WSF or SBB/SBM (see Appendix A). For example, the report commends NYC for winning the 2007 Broad prize, the year before its Fair Student Funding policy was implemented. The report might arguably attribute this success to mayoral takeover, which began in 2002. But this is the WSF Yearbook, and even the expansive definition of WSF used in the report did not encompass mayoral takeovers.

Similarly, the report commends Hartford for raising test scores in 2008, the year before implementing WSF, and it commends Denver for making strong improvements between 2005 and 2008, whereas WSF was implemented in 2008-09. These successes lead one to question why these districts would want to implement WSF and risk undoing their prior achievements.

The most egregious claim of retroactive causation appears in the press release for the report:

The results from districts using student-based funding are promising. Prior to 2008, less than half of Hartford, Connecticut’s education money made it to the classroom. Now, over 70 percent makes it there. As a result, the district’s schools posted the largest gains, over three times the average increase, on the state’s Mastery Tests in 2007-08 (emphasis added).[i]

Yet, the report itself states that Hartford only began implementing WSF in 2008-09, and only expected to achieve the 70 percent target of available resources allocated to schools and classrooms by 2009-2010 (p. 61). It is difficult to conceive of any defense for Reason’s claims.

=====

Link to complete review: http://www.epicpolicy.org/files/TTR-Baker-Reason-WSF.pdf

I’ve personally come to refer to this offense as the Ouchi Retroactive Causation effect, since the first time I encountered such an absurd argument was in the work of William Ouchi who made much the same analytic gaffe in reference to the success of reforms in Houston. Scott Thomas and I, in our report on Weighted Funding for Hawaii noted: “… most obviously problematic is the authors’ touting of Houston’s success in improving performance through its WSF and decentralized governance, based on performance data from 1999 to 2001, when the formula was implemented (first year of phase-in) in 2001-02” (Baker and Thomas, 2006, p. 9).

http://www.hawaiireporter.com/file.aspx?Guid=1e930a29-3017-40fc-b58c-cf1513149ce7

http://www.hcps.k12.hi.us/STATE/COMM/DOEPRESS.NSF/a1d7af052e94dd120a2561f7000a037c/36dfa2492b206d2c0a2571be000450fd/$FILE/WSF-Baker.Thomas-2-07.28.06.pdf

For a more thorough critique of the work of Ouchi on this topic, see my article with Doug Elmer in Educational Policy (vol. 23 no. 1, 2009)

Look… we all make errors of logic and errors of analysis. So too do relatively safe and conscientious drivers get into an occasional auto accident. However, if a driver drives 15 times in one week, and drives drunk 5 times, that driver should lose his/her license. Claiming retroactive causation 5 times in 15 cases in a report which then proposes “best practices” for schools is comparably irresponsible. Arguably, the first or second offense is bad enough.

If nothing else, the Reason Foundation and the Eli and Edythe Broad Foundation which provided support for the Weighted Student Formula Yearbook should evaluate carefully whether such haphazard and arguably reckless thinking really advances the public policy agenda in any productive way. Whatever good elements there may be in the attempts of these districts to improve educational opportunities for the children they serve are severely undermined by the Weighted Student Funding Yearbook.


[i] http://www.reason.org/news/show/1007460.html