Lake Wobegon of Teachers?

Browsing through the VDOE Web pages, one finds the Teacher and Principal Evaluation Collection Results.

The only data there are for 2011.  Even that limited dataset, however, is sufficient to demonstrate that the “evaluation” process was ridiculous, if not  fraudulent.

The 2011 data show that all forty-six Richmond principals were “satisfactory.”  All our principals, it seems, were at or above average.  Never mind that Richmond’s reading SOL pass rate that year was 1.6 standard deviations below the division mean and its math score was 2.0 standard deviations below.  (Richmond is the gold square on the graphs.)

[Graphs: 2011 division reading and math SOL pass rates; Richmond is the gold square on each.]

The teacher data were more nuanced but similarly ridiculous.  Here is a summary.

                              CM     CS     EA     IM     KS     PA     PR   Total
EE = Exceeds Expectations    317    437    208    273    479    240    302    2256
ME = Meets Expectations      698    598    826    754    555    787    733    4951
NI = Needs Improvement        20      2      3     10      3      9      2      49
U  = Unsatisfactory            2      0      0      0      0      1      0       3
Total                       1037   1037   1037   1037   1037   1037   1037    7259

Columns: CM = Classroom Management/Positive Learning Environment; CS = Communication Skills; EA = Evaluation and Assessment; IM = Implements and Manages Instruction; KS = Knowledge of Subject; PA = Planning Activities; PR = Professional Responsibilities.

So we see that three of the 7,259 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  That is, the report says that only 0.72% of the items (52 of 7,259) in Richmond teachers’ evaluations showed some aspect of failure to meet or exceed expectations in 2011.
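For anyone who wants to check that arithmetic, here is a minimal Python sketch.  The counts are keyed in from the table above; the column abbreviations are mine.

```python
# Counts from the 2011 Richmond teacher evaluation summary above.
# Each list runs across the seven columns (CM, CS, EA, IM, KS, PA, PR).
ratings = {
    "EE": [317, 437, 208, 273, 479, 240, 302],
    "ME": [698, 598, 826, 754, 555, 787, 733],
    "NI": [20,   2,   3,  10,   3,   9,   2],
    "U":  [2,    0,   0,   0,   0,   1,   0],
}

total = sum(sum(row) for row in ratings.values())
below_expectations = sum(ratings["NI"]) + sum(ratings["U"])

print(total)                                        # 7259
print(below_expectations)                           # 52 (49 NI + 3 U)
print(f"{100 * below_expectations / total:.2f}%")   # 0.72%
```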

That is absurd in the abstract; in light of the available data it is baldly mendacious.

You may recall that the SGP (Student Growth Percentile) data (that Brian Davison had to pry from VDOE with a lawsuit) can measure teacher performance.  Unlike the SOL itself, the SGP data are not correlated with economic advantage or disadvantage.  So the “poor students” excuse doesn’t work as to the SGP.

We have SGP data for the following year, 2012.  Here, with caveats, are the reading data, starting with the distribution of teacher average SGPs (i.e., the average, by teacher, of the students’ SGPs).

[Histogram: statewide distribution of teacher average reading SGPs, 2012, with fitted Gaussian in orange.]

The orange line is a Gaussian distribution fitted to the data: Mean = 45.0; standard deviation = 10.8.
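For readers who want to reproduce that fit, here is a minimal sketch.  It assumes a student-level extract with one row per student; the file name and the teacher_id and sgp column names are hypothetical, not the layout of the actual VDOE release.

```python
import pandas as pd
from scipy import stats

# Hypothetical file and column names -- the real VDOE extract is laid out differently.
students = pd.read_csv("reading_sgp_2012.csv")   # one row per student: teacher_id, sgp

# Teacher average SGP: the average, by teacher, of the students' SGPs.
teacher_avg = students.groupby("teacher_id")["sgp"].mean()

# Fit a Gaussian to the teacher averages (the orange curve on the graph).
mean, sd = stats.norm.fit(teacher_avg)
print(f"mean = {mean:.1f}, sd = {sd:.1f}")       # the post reports 45.0 and 10.8
```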

Then here is the distribution of Richmond reading teachers’ average SGPs.

[Histogram: distribution of Richmond reading teachers’ average SGPs, 2012.]

Note the absence of very high-performing teachers and the plethora of low performers in Richmond.  One hundred nine of 205 Richmond reading teachers (53.2% v. 50% in a normal distribution) are below the state mean; sixteen (7.8% v. 2.5%) are more than two standard deviations below it, and fifty-two (25.4% v. 16%) are more than one standard deviation below.
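Here is a sketch of that counting, assuming richmond_avg holds the 205 Richmond teacher averages computed as above (a hypothetical variable name).  The “normal distribution” comparisons in the code come from the exact standard normal tail areas, about 15.9% beyond one standard deviation and 2.3% beyond two, versus the rounder 16% and 2.5% used in the text.

```python
from scipy import stats

MEAN, SD = 45.0, 10.8        # statewide reading fit from above

def tail_report(teacher_avgs, mean=MEAN, sd=SD):
    """Count teachers below the state mean and more than 1 or 2 SD below it."""
    n = len(teacher_avgs)
    for label, cutoff, expected in [
        ("below the mean",    mean,          0.5),
        ("> 1 SD below mean", mean - sd,     stats.norm.cdf(-1)),
        ("> 2 SD below mean", mean - 2 * sd, stats.norm.cdf(-2)),
    ]:
        count = sum(x < cutoff for x in teacher_avgs)
        print(f"{label}: {count}/{n} = {100 * count / n:.1f}% "
              f"(normal distribution: {100 * expected:.1f}%)")

# tail_report(richmond_avg)   # richmond_avg: Richmond reading teachers' average SGPs
```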

For math, the state distribution has a mean of 46.8 and a standard deviation of 14.6.

[Histogram: statewide distribution of teacher average math SGPs, 2012, with fitted Gaussian in orange.]

In contrast to the reading data, Richmond has some outstanding math teachers, but they are outnumbered by the underperformers.

[Histogram: distribution of Richmond math teachers’ average SGPs, 2012.]

Indeed, 111 of 193 Richmond math teachers (57.5%) are below the state mean; six (3.1%) are more than two standard deviations below it, and thirty-seven (19.2%) are more than one standard deviation below.

Yet, according to the evaluations from the previous year, Richmond’s teachers were just fine, thank you, in 99.3% of all measures.

Just as a reminder, the effect of a good or bad teacher can be dramatic.  Here, for instance, are the students’ 2014 reading SGPs for Richmond teacher #66858 (teacher identities are anonymized in the VDOE database).

[Chart: 2014 reading SGPs of the students of teacher #66858.]

And here, in contrast, are the students’ SGPs for teacher #68809.

[Chart: SGPs of the students of teacher #68809.]

Unfortunately, we have too many who are more like #68809 than #66858.

Richmond’s subpar teacher performance has only worsened recently, as reflected in our deteriorating SOL performance: for 2015, we are the sixth-worst division in the state in math and the second-worst in reading.

Of course, our principals and Superintendent have the power (and duty) to remedy these glaring defects.  Inadequate teacher performance that is not corrected reflects inadequate principal performance; inadequate principal performance that is not corrected reflects inadequate Superintendent performance.  We need to hold these public employees accountable when they don’t deal with the teachers who are harming Richmond’s schoolchildren.

But, then, it’s difficult to impose accountability when VDOE is hiding the relevant data.

Your tax dollars at work.

 

PS: You’ll notice that the internal “evaluations” deal with inputs, which are easy to measure (and to fudge), while the important quantities are outputs (how much our students learn), which are hard to measure.  VDOE is making it harder to know about outputs by abandoning the SGP and installing “progress tables” that measure progress but mostly ignore the lack of it.  Even so, we’ll have some measure of outputs, although VDOE doubtless will not release the data.

Seems to me City Council should demand, as a condition of all the money they are spending on RPS, an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

 

I’ll save VDOE’s feckless “improvements” in the evaluation process for another post.