In the midst of the ongoing debate over how best to measure teacher effectiveness, Educational Testing Service (ETS) today announced the release of a presentation given by Stanford University Professor Emeritus Edward Haertel as the 14th William H. Angoff Memorial Lecture. Haertel’s lecture, “Reliability and Validity of Inferences about Teachers Based on Student Test Scores,” discusses the use of Value-Added Models (VAMs) as a major part of determining teacher effectiveness and analyzes why their usefulness for doing so has been seriously overstated.
Haertel’s lecture drew upon detailed analyses of VAM characteristics, including their validity, reliability, predictive power and potential effects in various situations, and concluded that they are not well suited for the job.
“Teacher VAM scores should absolutely not be included as a substantial factor with a fixed weight in consequential teacher personnel decisions,” says Haertel. “The information that they provide is simply not good enough to use as a major component for gauging a teacher’s effectiveness.”
Haertel’s presentation and the published lecture released today elaborate on this conclusion, explaining that VAMs, which are essentially a method of evaluating teachers by comparing their students’ test scores in a given school year with the scores from a previous one, cannot sufficiently isolate the many factors that influence a student’s education. Factors that can cause such discrepancies include a student’s out-of-school experiences, individual aptitudes, peer influences, previous education and school-specific academic climate.
“With the wealth of research that has been conducted over the years on the validity and usefulness of Value-Added Models in high-stakes teacher evaluations, Professor Haertel’s conclusion that no statistical manipulation of the data can result in an accurate and unbiased comparison of teachers working under extremely different conditions has strong empirical support,” says Richard Coley, ETS’s Executive Director of the Center for Research on Human Capital and Education.
Although VAMs have limited usefulness as a major factor in measuring teacher effectiveness, there are a number of instances in which both students and teachers can benefit from them, according to Haertel.
“VAMs may have a modest place in teacher evaluation systems, but only as an adjunct to other information used in a context where teachers and principals have genuine autonomy in their decisions about using and interpreting teacher effectiveness estimates in local contexts,” says Haertel.
Haertel’s original presentation and published lecture paper also discuss a number of ways to make VAMs moderately more accurate or useful, including the use of multiple factors, test scores from multiple years, and the presentation of clear and accurate information about any uncertainty regarding the causes of variation in student performance.
From: Educational Testing Service