The Endogeneity of the Equitable Distribution of Teachers: Or, why do the girls get all the good teachers?

Recently, the Center for American Progress (disclosure: I have a report coming out through them soon) released a report in which they boldly concluded, based on data on teacher ratings from Massachusetts and Louisiana, that teacher quality is woefully inequitably distributed across children by the income status of those children. As evidence of these inequities, …

An Update on New Jersey’s SGPs: Year 2 – Still not valid!

I have spent much time criticizing New Jersey’s Student Growth Percentile measures over the past few years, both conceptually and statistically. So why stop now? We have been told over and over again by the Commissioner and his minions that New Jersey’s SGPs fully take into account student backgrounds by accounting for each student’s initial …

The Value Added & Growth Score Train Wreck is Here

In case you hadn’t noticed, evidence is mounting of a massive value-added and growth score train wreck. I’ve pointed out previously on this blog that there exist some pretty substantial differences in the models and estimates of teacher and school effectiveness being developed in practice across states for actual use in rating, ranking, tenuring and …
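For a sense of what differences in estimates across models can look like in practice, here is a minimal sketch of how one might compare two models’ scores for the same set of teachers. The file and column names (teacher_value_added.csv, vam_score, median_sgp, and so on) are purely hypothetical placeholders; this illustrates the kind of check at issue, not any state’s actual data pipeline.

```python
# Sketch: quantifying how much two effectiveness models disagree about the
# same teachers. File and column names are hypothetical placeholders, not
# any state's actual export.
import pandas as pd
from scipy.stats import spearmanr

vam = pd.read_csv("teacher_value_added.csv")   # teacher_id, vam_score
sgp = pd.read_csv("teacher_median_sgp.csv")    # teacher_id, median_sgp

both = vam.merge(sgp, on="teacher_id")

# Rank correlation: 1.0 would mean the two models order teachers identically;
# values well below that mean ratings, rankings, and tenure decisions can
# hinge on which model a state happened to adopt.
rho, p = spearmanr(both["vam_score"], both["median_sgp"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")

# Share of teachers whose quintile placement changes between the two models.
both["vam_q"] = pd.qcut(both["vam_score"], 5, labels=False)
both["sgp_q"] = pd.qcut(both["median_sgp"], 5, labels=False)
print((both["vam_q"] != both["sgp_q"]).mean())
```

If the rank correlation is modest and many teachers shift quintiles, then which model a state adopts materially changes who gets rated effective.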

Rebutting (again) the Persistent Flow of Disinformation on VAM, SGP and Teacher Evaluation

This post is in response to testimony overheard from recent presentations to the New Jersey State Board of Education. For background and more thorough explanations of issues pertaining to the use of Value-added Models and Student Growth Percentiles, please see the following two sources: Baker, B.D., Oluwole, J., Green, P.C. III (2013) The legal consequences …

Deconstructing Disinformation on Student Growth Percentiles & Teacher Evaluation in New Jersey

CROSS-POSTED FROM: http://njedpolicy.wordpress.com/
Deconstructing Disinformation on Student Growth Percentiles & Teacher Evaluation in New Jersey (printable policy brief: SGP_Disinformation_BakerOluwole)
Bruce D. Baker, Rutgers University, Graduate School of Education; Joseph Oluwole, Montclair State University
Introduction: This brief addresses problems with, and disinformation about, New Jersey’s Student Growth Percentile (SGP) measures, which are proposed by New Jersey …

Friday AM Graphs: Just how biased are NJ’s Growth Percentile Measures (school level)?

New Jersey finally released the data set of its school-level growth percentile metrics. I’ve been harping on a few points on this blog this week. SGP data here: http://education.state.nj.us/pr/database.html Enrollment data here: http://www.nj.gov/education/data/enr/enr12/stat_doc.htm First, that the commissioner’s characterization that the growth percentiles necessarily fully take into account student background is a completely bogus and …
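As a rough illustration of the bias question raised here, the sketch below merges school-level SGPs with enrollment demographics and asks whether median growth percentiles track the share of low-income students. The file names and columns (nj_school_sgp.csv, median_sgp, pct_free_reduced_lunch, school_code) are placeholders, not the actual layouts of the state downloads linked above.

```python
# Sketch of a school-level bias check: do median SGPs track student poverty?
# File and column names are hypothetical placeholders, not the actual layout
# of the NJDOE SGP and enrollment downloads linked above.
import pandas as pd
import statsmodels.api as sm

sgp = pd.read_csv("nj_school_sgp.csv")          # school_code, median_sgp
enr = pd.read_csv("nj_school_enrollment.csv")   # school_code, pct_free_reduced_lunch

schools = sgp.merge(enr, on="school_code", how="inner")

# If the percentiles really "fully take into account" student background,
# this correlation should be near zero.
print(schools[["median_sgp", "pct_free_reduced_lunch"]].corr())

# Bivariate regression: slope in SGP points per percentage point of
# low-income enrollment.
X = sm.add_constant(schools["pct_free_reduced_lunch"])
print(sm.OLS(schools["median_sgp"], X).fit().summary())
```

A clearly nonzero slope would be evidence against the claim that the measures fully account for student background.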

Briefly Revisiting the Central Problem with SGPs (in the creator’s own words)

When I first criticized the use of SGPs for teacher evaluation in New Jersey, the creator of the Colorado Growth Model responded with the following statement: “Unfortunately Professor Baker conflates the data (i.e. the measure) with the use. A primary purpose in the development of the Colorado Growth Model (Student Growth Percentiles/SGPs) was to distinguish …”

On Misrepresenting (Gates) MET to Advance State Policy Agendas

In my previous post I chastised state officials for their blatant mischaracterization of metrics to be employed in teacher evaluation. This raised (in Twitter conversation) the issue of the frequent misrepresentation of findings from the Gates Foundation Measures of Effective Teaching Project (or MET). Policymakers frequently invoke the Gates MET findings as providing broad-based …

Who will be held responsible when state officials are factually wrong? On Statistics & Teacher Evaluation

While I fully understand that state education agencies are fast becoming propaganda machines, I’m increasingly concerned with how far this will go. Yes, under NCLB, state education agencies concocted completely wrongheaded school classification schemes that had little or nothing to do with actual school quality and, in rare cases, used those policies to enforce substantive …

Gates Still Doesn’t Get It! Trapped in a World of Circular Reasoning & Flawed Frameworks

Not much time for a thorough review of the most recent release of the Gates MET project, but here are my first-cut comments on the major problems with the report. The take-home argument of the report seems to be that their proposed teacher evaluation models are sufficiently reliable for prime-time use and …