I’m on the road all week — from DC to Oregon to Philadelphia to Oklahoma City — and everywhere I go people seem to be talking about the L.A. Times’ recent exposé on the city’s schoolteachers, and the extent to which individual teachers are either helping students learn — or holding them back.
The conversations are based on the Times’ decision to use value-added analysis, which rates teachers based on their students’ progress on standardized tests from year to year. Thickening the plot, the Times produced this report using seven years of data the school district had — but had never analyzed. As the paper explains: “Value-added analysis offers a rigorous approach. In essence, a student’s past performance on tests is used to project his or her future results. The difference between the prediction and the student’s actual performance after a year is the ‘value’ that the teacher has added or subtracted.”
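For readers who want to see the mechanics, the core idea the Times describes can be sketched in a few lines of code. This is a deliberately minimal illustration with invented student data and a simple one-variable regression — real value-added models are far more elaborate, controlling for many factors beyond a single prior-year score:

```python
# Minimal sketch of value-added analysis (hypothetical data).
# Each tuple: (prior-year test score, current-year score, teacher).
students = [
    (60, 66, "A"), (70, 75, "A"), (80, 84, "A"),
    (60, 58, "B"), (70, 67, "B"), (80, 79, "B"),
]

# Step 1: predict each student's current score from past performance,
# here with a simple ordinary-least-squares line: current = a + b * prior.
n = len(students)
xs = [s[0] for s in students]
ys = [s[1] for s in students]
mean_x = sum(xs) / n
mean_y = sum(ys) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
a = mean_y - b * mean_x

# Step 2: a teacher's "value added" is the average gap between students'
# actual scores and the scores the model predicted from prior performance.
value_added = {}
for prior, actual, teacher in students:
    residual = actual - (a + b * prior)
    value_added.setdefault(teacher, []).append(residual)
value_added = {t: sum(r) / len(r) for t, r in value_added.items()}

print(value_added)  # teacher A beats predictions; teacher B falls short
```

In this toy example, teacher A's students outperform their predicted scores (positive value added) and teacher B's underperform (negative), which is exactly the "value added or subtracted" logic the Times quote describes.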
Because the idea of value-added analysis, or VAA, seems to be everywhere in K-12 education discussions (it has been embraced by the Obama administration, and many of the field’s leading philanthropic entities, from Gates to Walton to Broad, are intrigued by the approach), I want to offer what I see as the good, the bad and the ugly of VAA — and of the Times’ decision to use VAA as the foundation of its landmark report: