The Empire Strikes Back. Feebly.

On June 2, Donald Wilms, President of the Chesterfield Education Association, responded in the Times-Dispatch to Bart Hinkle’s editorial of May 28.

Hinkle had made the point that the Virginia Education Association’s attempt to suppress truthful data on teacher effectiveness sought to keep “parents and taxpayers . . . in the dark about which teachers are doing a great job – and which ones aren’t.” 

Wilms sought to argue that the data should be kept secret.  He mostly demonstrated that the Chesterfield Education Association needs a better argument.

Wilms trotted out some emotional arm-waving about students who may come to school on test day after oversleeping and missing breakfast, after a fight, after losing a boyfriend or girlfriend, or the like.  He neglected to mention that the data he disdains are based on two (or more) successive years’ test scores: an outside event in the second year could lower a student’s measured progress, but the same event in the first year would increase the score difference and, thus, the progress measure.  (A student headed for a 420 who loses thirty points to a bad morning shows depressed growth if the bad morning falls in the second year, but inflated growth if it falls in the first.)

The nut of Wilms’ argument, however, was the kids who are at a systematic disadvantage:

[W]ould it be fair for schools with high-income households — where both parents are college-educated, where kids go to museums and take exotic vacations, where parents have two cars and are available to take kids to the public library — to compete with schools where kids live in single-parent households, where parents hold several low-wage jobs, with hours preventing them from being home to take kids to museums or public libraries, and incomes preventing them from any vacation at all? Heck, it would be unfair to compare a school in western Chesterfield with one in eastern Chesterfield, let alone to compare one of Chesterfield’s (or Henrico’s) wealthiest school communities with one of Richmond’s neediest school communities, don’t you think?

* * *

What these [SGP data] really illustrate is which teachers have the at-risk kids and which don’t.

On these emotional grounds, Wilms attacked the fairness of the “flawed rankings” of the Student Growth Percentile, the “SGP” (and, apparently, other measures of teaching effectiveness).  He disdained to discuss the actual data, which are to the contrary.

When it was considering adopting the SGP, VDOE provided data showing that test scores fall with increasing economic disadvantage (no surprise there) but that SGP percentiles do not:

[Graph: VDOE data showing SOL scores falling with increasing economic disadvantage while SGPs hold steady]

That is because the SGP, by design, compares low-performers only to other low-performers.  Unlike the SOL, it measures progress relative to others similarly situated.
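To see why the starting point washes out, here is a toy sketch of that peer-group idea in Python.  (The real SGP calculation is more elaborate, using quantile regression across prior scores; the dataframe and column names here are hypothetical.)

```python
import pandas as pd

def toy_growth_percentiles(df: pd.DataFrame, bin_width: int = 10) -> pd.DataFrame:
    """Toy illustration of the SGP idea: rank each student's 2014 score
    only against peers with a similar 2013 score.  (The real SGP uses
    quantile regression, but the peer-group principle is the same.)
    The columns 'sol_2013' and 'sol_2014' are hypothetical names."""
    out = df.copy()
    # Put students into peer bins of similar 2013 SOL scores.
    out["peer_bin"] = (out["sol_2013"] // bin_width) * bin_width
    # Percentile-rank each student's 2014 score within its bin, on a 0-99 scale.
    out["toy_sgp"] = (
        out.groupby("peer_bin")["sol_2014"]
           .rank(pct=True)
           .mul(99)
           .round()
           .astype(int)
    )
    return out
```

A student who starts at 320 is ranked only against other students who started near 320, so a low start is no handicap.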

Indeed, the data for Chesterfield County make a clear case that students who score poorly one year, for whatever reason, can show outstanding academic growth the next year.  Here, for instance, are the 2014 math SGP scores of Chesterfield students plotted against those same students’ 2013 SOL scores.

[Graph: 2014 math SGPs of Chesterfield students vs. the same students’ 2013 SOL scores]

Before drawing any conclusions, we need to refine the analysis slightly: Students who score pass/advanced (SOL 500 or above) for two years running do not receive an SGP (it’s hard to improve on “excellent”).  In a sense, the SGP penalizes teachers and schools with large numbers of very bright students!

The SOL scores ≥ 500 in the graph above represent students who scored advanced in 2013 but less than advanced in 2014.  The students who scored advanced in both ’13 and ’14 do not appear in this dataset at all, which biases the analysis.  So let’s look only at the students with 2013 SOLs < 500:

[Graph: 2014 math SGPs vs. 2013 SOL scores, students with 2013 SOLs < 500]

The result doesn’t change much.  The R² value of 0.14% tells us that the SGPs are essentially uncorrelated with the previous year’s SOL scores.  And because correlation is necessary, albeit not sufficient, to show causation, that near-zero correlation tells us the earlier score is not driving the growth measure.  Said otherwise: Chesterfield students with low SOL scores in the previous year, for whatever reason, show superior (or inferior) academic growth (as measured by the SGP) in the current year just as often as students who scored well in the previous year.
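For anyone who wants to check that number, the computation is short once the data are loaded.  A minimal sketch, assuming a CSV with the hypothetical column names sol_2013 and sgp_2014:

```python
import pandas as pd
from scipy import stats

# Hypothetical file and column names; the real data came from VDOE.
df = pd.read_csv("chesterfield_math.csv")

# Drop the advanced-both-years artifact: keep 2013 SOLs below 500.
subset = df[df["sol_2013"] < 500]

result = stats.linregress(subset["sol_2013"], subset["sgp_2014"])
print(f"R-squared = {result.rvalue ** 2:.4f}")  # ~0.0014 (0.14%) for the data above
```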

We can’t directly test Wilms’ statement about pitting a rich school against a poor one because VDOE (known here as the Virginia Department of Data Suppression) hasn’t released (and won’t release) the data.  So let’s go one better: Let’s use the data we have to compare some of the worst students in Chesterfield in terms of 2013 SOL with some of the best.

Using the math data, here is the first group, all at 50 points or more below the passing score of 400:

[Graph: 2014 math SGPs vs. 2013 SOL scores, students with 2013 SOLs of 350 or below]

As before, the 2014 SGP does not correlate with the 2013 SOL. 

Next, the 2013 SOLs between 450 (50 points above the passing score) and 499 (where we stop to avoid the advanced-student penalty):

[Graph: 2014 math SGPs vs. 2013 SOL scores, students with 2013 SOLs of 450 to 499]

Pretty much the same story: High or low SOL one year does not predict SGP growth in the next year.
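The same two-group cut is easy to script.  A sketch, reusing the hypothetical dataframe from the snippet above:

```python
# 2013 SOLs at least 50 points below passing (400), vs. well above
# passing but under the advanced cutoff of 500.
low = df[df["sol_2013"] <= 350]
high = df[df["sol_2013"].between(450, 499)]

for label, grp in (("low 2013 scorers", low), ("high 2013 scorers", high)):
    print(f"{label}: n = {len(grp)}, "
          f"mean 2014 SGP = {grp['sgp_2014'].mean():.1f}, "
          f"sd = {grp['sgp_2014'].std():.1f}")
```

If Wilms were right that the SGP merely flags the at-risk kids, the low group’s mean would sit well below the high group’s; it doesn’t.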

Said yet otherwise: Part of teaching in the public schools is dealing with kids who are disadvantaged; the SGP identifies teachers who do that well; the SOL may not.

BTW: The reading data tell the same story with slightly different numbers.  Let’s avoid some clutter and leave them out. 

It’s easy to understand why Wilms would prefer the current system.  In 2011 (the only year that VDOE slipped up and posted evaluation data), Chesterfield teachers were evaluated on seven criteria.  Six of those were inputs, i.e., factors relatively easy to measure but only distantly related to student learning.  On the only important measure, “Student Achievement and Academic Progress,” only twelve of 1,222 Chesterfield teachers (1%) were said to need improvement, and none was rated unsatisfactory.

[Table: 2011 Chesterfield teacher evaluation ratings by criterion]

But when we look at real performance data (the 2012 SGPs are the earliest that Brian Davison sued out of VDOE), we see student progress that was much worse than “unsatisfactory.”  For example, here is the math performance of Teacher No. 34785 (identifier anonymized by VDOE):

[Graph: annual average math SGPs of Teacher No. 34785, with 95% confidence intervals]

The yellow points are the annual SGP averages of that teacher’s math students.  The blue bars show the 95% confidence intervals. 

The Chesterfield math averages those years were 48.7, 49.2, and 48.2.
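A conventional way to compute such an interval is a t-interval on the mean of the teacher’s students’ SGPs.  A minimal sketch under that assumption (the scores in the example are made up):

```python
import numpy as np
from scipy import stats

def mean_sgp_with_ci(sgps, confidence=0.95):
    """Mean of one teacher's student SGPs for a year, with a t-based
    confidence interval.  (An assumption about the graphs' method.)"""
    sgps = np.asarray(sgps, dtype=float)
    mean = sgps.mean()
    half = stats.t.ppf((1 + confidence) / 2, df=len(sgps) - 1) * stats.sem(sgps)
    return mean, mean - half, mean + half

# Made-up SGPs for a hypothetical teacher-year:
print(mean_sgp_with_ci([12, 25, 8, 30, 17, 22, 9, 14]))
```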

Then we have Nos. 54317 and 86898:

[Graphs: annual average math SGPs of Teachers No. 54317 and No. 86898, with 95% confidence intervals]

To put some context on these numbers, here is the 2014 statewide distribution of math SGP averages by teacher.

[Graph: 2014 statewide distribution of average math SGPs by teacher]

The mean was 49.3; the estimated standard deviation, 16.3.  Assuming a roughly normal distribution, that puts about sixteen percent of the teachers below 33.0 (one standard deviation below the mean) and another sixteen percent above 65.6.  Likewise, two and a half percent were below 16.7 (two standard deviations down) and another 2.5% above 81.9.  I’ve marked those values on the 2014 Chesterfield distribution:

[Graph: 2014 Chesterfield distribution of average math SGPs by teacher, with the one- and two-standard-deviation marks]
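Those cutoffs are nothing fancier than the mean plus or minus one and two standard deviations.  A quick check of the arithmetic:

```python
mean, sd = 49.3, 16.3  # 2014 statewide mean and s.d. of teacher math SGP averages
for k in (1, 2):
    print(f"mean - {k}*sd = {mean - k*sd:.1f}; mean + {k}*sd = {mean + k*sd:.1f}")
# mean - 1*sd = 33.0; mean + 1*sd = 65.6
# mean - 2*sd = 16.7; mean + 2*sd = 81.9
```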

The problem here is not “flawed rankings.”  It is flawed teachers that the Chesterfield “Education” Association does not want the parents of Chesterfield County to know about.

BTW: The data also show that there are some really fine teachers in Chesterfield.  For example, in math we see Nos. 29893, 28974, and 81816:

[Graphs: annual average math SGPs of Teachers No. 29893, No. 28974, and No. 81816]

The Chesterfield and Virginia “Education” Associations don’t want you to know about these outstanding teachers, either.  They just want you to think that 99% are above average.

And VDOE is a co-conspirator.  Your tax dollars “at work.”