This weekend the L.A. Times continued their excellent series on evaluating teachers by their value-added scores with an article highlighting teachers who are outperforming their peers and released the long-awaited database of all 6,000+ teacher scores. A few key takeaways:
Value-added is the worst form of teacher evaluation but it’s better than everything else.
This cannot be taken lightly. Value-added measures of teacher effectiveness are not all that great. The estimates for individual teachers carry real error, the scores vary more than we'd like from year to year, and too few teachers can be measured this way at all. What's more, because the estimates get stronger the more years of data we have, their accuracy is inversely related to their usefulness: a score becomes trustworthy only after we've accumulated years of data on a teacher, which is exactly when decisions about that teacher are least pressing.
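That tradeoff between years of data and reliability is easy to see in a toy simulation. All numbers below are invented for illustration, and this bears no resemblance to the actual model the Times used; the point is only that averaging several noisy one-year estimates tightens the result.

```python
import random

# Toy setup (assumed numbers): a teacher's "true" effect is fixed,
# but each year's estimate adds classroom-level noise.
random.seed(0)
true_effect = 2.0
noise_sd = 5.0  # single-year estimates are quite noisy

def avg_estimate(years):
    """Average `years` noisy one-year estimates of the same teacher."""
    return sum(true_effect + random.gauss(0, noise_sd) for _ in range(years)) / years

# Simulate 1,000 teachers evaluated on one year vs. five years of data.
one_year = [avg_estimate(1) for _ in range(1000)]
five_year = [avg_estimate(5) for _ in range(1000)]

def spread(xs):
    """Standard deviation: how far estimates stray from each other."""
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# The five-year averages cluster much more tightly around the true effect
# than the one-year estimates do.
```

Under these assumed numbers, the five-year spread shrinks by roughly the square root of the number of years averaged, which is why multi-year scores are more accurate but arrive later in a teacher's career.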
But, and this is an enormous caveat, everything else we currently use is worse. A teacher's years of experience, their education credentials, their certification status, the prestige of their college or their college GPA, even in-class observations. None of these measures does as good a job at predicting a student's academic growth as a teacher's value-added score. Yet we continue to use these poor proxies for quality at the same time that we have such passionate fights about measures of actual performance.
L.A. has it backwards.
From a separate article on the teacher response, we learn that:
Since The Times began publishing the series, L.A. Unified has moved swiftly to conduct its own value-added analysis and will give teachers their confidential score by October. The district has said the scores could be used to guide training for struggling teachers.
In addition, the district and the teachers union have agreed to begin negotiations on a new evaluation system.
This is just crazy. The Times used the data to conduct an analysis before the district did. The newspaper published the data before the district did anything with it. And now individual teachers are seeing their scores in public before they’ve even had a chance to view them privately. The district won’t even be prepared to give them their scores until October.
Have pity on the individual teachers for this public outing, but, at the same time, don't blame the Times for what they're doing. The teachers union has pressured the district not to use value-added measures in teacher performance evaluations, and only now are the two sides moving forward together. The district has been complicit for years, and then took the easy way out and gave the data to a newspaper. And, in an ironic twist of fate, the newspaper could publish the value-added results precisely because they were not part of teacher personnel files. Those are private and cannot be released publicly.
In contrast, Tennessee has been using a value-added model since the late 1980s, and every year since the mid-1990s every single eligible teacher has received a report on their results. When these results were first introduced, teachers were explicitly told their results would never be published in newspapers and that the data might be used in evaluations. In reality, the results had never really been used in evaluations until the state passed a law last January requiring the data to make up 35 percent of a teacher's evaluation. This bill, together with 100 percent teacher support for the state's Race to the Top application that included it, was a key reason the state won a $500 million grant in the first round.
Tennessee is a good comparison, because here is a place with longstanding, low-stakes use of the data. The data will now have much higher stakes attached, but Tennessee saw nothing like the acrimony now playing out in LA. That's because, to a large extent, LAUSD has sat on this information for so long without doing anything with it. Kudos to the intrepid reporter for digging it out and making a story of it, but the fact that it's been buried for so long and is only seeing the light of day in this manner has made it that much more controversial. LAUSD could've avoided all this headache by doing something with the data themselves years ago. That should've started with letting teachers see their own data. The teachers quoted in the Times articles and the 2,000+ teacher requests the newspaper has received since the story's release suggest that teachers do want to know how they perform on these measures.
Instead of a methodical process where teachers slowly become used to seeing their data and therefore comfortable with its use, LA now has a situation where many people are unfamiliar and uncomfortable with the data at the same time there’s suddenly pent-up demand from teachers, parents, and the public to see it.
Few teachers understand the methodology behind value-added.
This is related to the second point, but it's deserving of its own discussion. Check out some of the teacher comments on their own evaluations, and you start to see what I mean. There's widespread misunderstanding of how value-added works. Value-added accounts for how academically strong incoming students are. Value-added measures student academic growth in math and reading; it doesn't measure time spent learning these subjects or enjoyment of them. The measure does not have a ceiling effect; in fact, higher-scoring students actually had higher rates of growth (remember, this is controlled for, so no teacher is given extra credit for teaching gifted students). The scores are unrelated to Master's degrees, years of experience, or National Board certification.
This isn't a knock against these teachers. I picked these responses more or less at random, and they represent a general apprehension and misunderstanding about what value-added is and what it attempts to do. It's a perfectly understandable reaction given the circumstances, but it suggests that more work needs to be done explaining what value-added is and how it works. Hopefully the district can learn from its earlier mistakes and start working with teachers to understand the strengths and limitations of value-added data.
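For readers who want a concrete sense of the mechanics, here is a deliberately stripped-down sketch of the core idea, with made-up student data; the actual Times analysis used a far more sophisticated model with many more controls. The essence: predict each student's current score from their prior score, then credit each teacher with their students' average residual, so teachers of strong and weak incoming classes are judged on growth, not starting point.

```python
from collections import defaultdict

# Made-up data for illustration only: (teacher, prior_score, current_score).
students = [
    ("A", 50, 60), ("A", 70, 78), ("A", 90, 95),
    ("B", 50, 52), ("B", 70, 69), ("B", 90, 88),
]

# Fit a simple least-squares line: expected current score = a + b * prior.
n = len(students)
xs = [s[1] for s in students]
ys = [s[2] for s in students]
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# A teacher's "value added" here is how much their students out- or
# under-perform the prediction, on average.
residuals = defaultdict(list)
for teacher, prior, current in students:
    residuals[teacher].append(current - (a + b * prior))
value_added = {t: sum(r) / len(r) for t, r in residuals.items()}
# Teacher A's students beat the prediction (positive score); teacher B's
# fall short (negative score), even though both taught the same mix of
# incoming students.
```

Note that in this toy example both teachers taught students with identical prior scores, yet only the growth relative to expectation separates them, which is precisely the property that makes value-added different from looking at raw test scores.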