An Illustrative Case of the Numbskullery of Evaluating Teacher Preparation by Student Growth Scores

Posted on July 19, 2013



Assumption: A good teacher preparation program is one that produces teachers whose students achieve high test score gains.

Relay Graduate School of Education is housed in North Star Academy in Newark, and its course modules are largely provided by relatively inexperienced “champion” teachers within its own network (and from the school itself). The program is designed to train its own future teachers (and others in network) – and to actually credential them (and grant them graduate degrees) in the specific methods used in their school(s).

Put simply, Relay GSE uses relatively inexperienced teachers to grant degrees to their own new colleagues, where those colleagues may be required by the school to gain those credentials in order to retain employment. No conflict of interest here? But I digress. Back to the point.

Their modules, as shown on the Relay website, are, in their best light, little more than mindless professional development for classroom management and the reading of inspirational books by school founders, discussed with “champion” teachers. Hardly the stuff of legitimate graduate work, in any field. But again, I digress.

Relay GSE will likely place a significant number of its graduates in its own school (or in network).

North Star Academy has pretty good growth scores, by the (bogus) New Jersey growth metric.

Therefore, not only is North Star Academy totally awesome, but Relay GSE must be an outstanding teacher preparation institution! It’s just that simple. They must be offering that secret sauce of teaching pedagogy which we should all be looking to as a model. Right?

Setting aside that the New Jersey growth scores themselves are suspect, and that linking teacher preparation program effectiveness to such measures is an invalid endeavor to begin with, the current approach fails to recognize that North Star Academy actually retains less than 50% of any given 5th grade cohort through 12th grade, and far fewer than that for black boys. The school loses the vast majority of its black boys, and for the few students who remain, growth scores – likely influenced as much by the dwindling peer group composition of those left behind as by any “teacher” effects – are pretty good.
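The survivorship arithmetic here can be sketched with entirely made-up numbers (nothing below comes from North Star’s actual data; the growth scores and the attrition pattern are hypothetical, chosen only to show the mechanism):

```python
# Toy illustration (made-up numbers): how non-random attrition can inflate
# a school's measured growth even if no individual student's growth changes.

# Hypothetical growth scores for a 5th grade cohort of 50 students.
cohort = [40 + i for i in range(50)]   # scores 40..89

# Assume (for illustration only) that attrition is non-random: the 30
# lowest-growth students leave before testing in later grades.
survivors = sorted(cohort)[30:]        # the 20 highest-growth students remain

cohort_mean = sum(cohort) / len(cohort)
survivor_mean = sum(survivors) / len(survivors)

print(f"full-cohort mean growth:    {cohort_mean}")    # 64.5
print(f"survivors-only mean growth: {survivor_mean}")  # 79.5
```

The school’s reported “growth” jumps by 15 points without a single student learning any more – purely because the denominator shrank in a non-random way.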

But is a school really successful if 50 students enter 5th grade, a third are gone by 8th grade, and only a handful ever graduate?

Is this any indication of the quality of the teaching or pedagogy involved? I won’t go so far as to suggest that what I personally perceive as offensive, demeaning pedagogy is driving these attrition rates (okay… maybe I just did).

But, at the very least, I might argue that a school that loses over half its kids from grade 5 to 12 is a failing school, not an outstanding one. Whether that has any implications for labeling their teachers as “failing” and their preparation programs as “failing” is another question entirely.

It is quite simply completely and utterly ridiculous to suggest that Relay GSE is an outstanding graduate school of education as a function of measured test score gains of the few students who might stick around to take the tests in subsequent years.

No secret sauce here… just a boatload of bogus policy assumptions creating perverse incentives and taking our education system even further in the wrong direction.

Notably, this does not prove that it’s a bad or awful grad school of education either (see their videos, and read the reports here, for evidence of that).

My point here is that this particular case – or what it has the potential to be – is wonderfully (in a twisted way) illustrative of the numbskullery that pervades public education policy, from K-12 school accountability metrics to proposals for “improving” teacher preparation.

This foolishness must stop.
