Lake Wobegon of Teachers??

Browsing through the VDOE Web pages, one finds the Teacher and Principal Evaluation Collection Results.

The only data there are for 2011.  Even that limited dataset, however, is sufficient to demonstrate that the “evaluation” process was ridiculous, if not fraudulent.

The 2011 data show that all forty-six Richmond principals were “satisfactory.”  All our principals, it seems, were at or above average.  Never mind that Richmond’s reading SOL pass rate that year was 1.6 standard deviations below the division mean and its math score was 2.0 standard deviations low.  (Richmond is the gold square on the graphs.)

[Graphs: 2011 division reading and math SOL pass rates; Richmond is the gold square.]

The teacher data were more nuanced but similarly ridiculous.  Here is a summary.

Ratings: EE = Exceeds Expectations; ME = Meets Expectations; NI = Needs Improvement; U = Unsatisfactory.

Category                                              EE     ME    NI    U   Total
Classroom Management/Positive Learning Environment   317    698    20    2    1037
Communication Skills                                 437    598     2    0    1037
Evaluation and Assessment                            208    826     3    0    1037
Implements and Manages Instruction                   273    754    10    0    1037
Knowledge of Subject                                 479    555     3    0    1037
Planning Activities                                  240    787     9    1    1037
Professional Responsibilities                        302    733     2    0    1037
Total                                               2256   4951    49    3    7259

So we see that three of 7,259 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  That is, the report says that only 0.72% of the items in Richmond teachers’ evaluations showed some aspect of failure to meet or exceed expectations in 2011.
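The arithmetic behind that percentage is easy to check.  Here is a minimal sketch (not part of the original post) using the totals from the summary table above:

```python
# Totals from the 2011 Richmond teacher evaluation summary above.
ratings = {"Exceeds Expectations": 2256, "Meets Expectations": 4951,
           "Needs Improvement": 49, "Unsatisfactory": 3}

total = sum(ratings.values())
below = ratings["Needs Improvement"] + ratings["Unsatisfactory"]

print(total)                   # 7259 ratings in all
print(below)                   # 52 showed any shortfall
print(f"{below / total:.2%}")  # 0.72%
```

That is, 52 of 7,259 line items, or 0.72%, registered anything short of “meets expectations.”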

That is absurd in the abstract; in light of the available data it is baldly mendacious.

You may recall that the SGP data (that Brian Davison had to pry from VDOE with a lawsuit) can measure teacher performance.  Unlike the SOL itself, the SGP data are not correlated with economic advantage or disadvantage.  So the “poor students” excuse doesn’t work as to SGP.

We have SGP data for the following year, 2012.  Here, with caveats, are the reading data, starting with the distribution of teacher average SGPs (i.e., the average, by teacher, of the students’ SGPs).

[Graph: distribution of Virginia reading teachers’ average SGPs, with fitted Gaussian.]

The orange line is a Gaussian distribution fitted to the data: Mean = 45.0; standard deviation = 10.8.

Then here is the distribution of Richmond reading teachers’ average SGPs.

[Graph: distribution of Richmond reading teachers’ average SGPs.]

Note the absence of very high performing teachers and the plethora of low performers in Richmond.  One hundred nine of 205 Richmond reading teachers (53.2%, v. 50% in a normal distribution) are below the state mean; fifty-two (25.4%, v. 15.9% in a normal distribution) are more than one standard deviation below it; and sixteen (7.8%, v. 2.3%) are more than two standard deviations below it.
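The normal-curve benchmarks in that comparison come straight from the standard normal CDF.  A short sketch (not part of the original post; the one-tail fractions are about 15.9% beyond one standard deviation and about 2.3% beyond two):

```python
from math import erf, sqrt

def norm_cdf(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Expected fractions under a normal distribution:
print(f"{norm_cdf(0):.1%}")    # 50.0% below the mean
print(f"{norm_cdf(-1):.1%}")   # 15.9% more than 1 sd below
print(f"{norm_cdf(-2):.1%}")   # 2.3% more than 2 sd below

# Observed among Richmond's 205 reading teachers:
print(f"{109 / 205:.1%}")      # 53.2% below the state mean
print(f"{52 / 205:.1%}")       # 25.4% more than 1 sd below
print(f"{16 / 205:.1%}")       # 7.8% more than 2 sd below
```

The Richmond tail at two standard deviations is more than three times what a normal distribution centered on the state mean would predict.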

For math, the state distribution has a mean of 46.8 and a standard deviation of 14.6.

[Graph: distribution of Virginia math teachers’ average SGPs.]

In contrast to the reading data, Richmond has some outstanding math teachers, but they are outnumbered by underperformers.

[Graph: distribution of Richmond math teachers’ average SGPs.]

Indeed, 111 of 193 Richmond math teachers (57.5%) are below the state mean; thirty-seven (19.2%) are more than one standard deviation below it; and six (3.1%) are more than two standard deviations below it.

Yet, according to the evaluations from the previous year, Richmond’s teachers were just fine, thank you, in 99.3% of all measures.

Just as a reminder, the effect of a good or bad teacher can be dramatic.  Here for instance are the students’ 2014 reading SGPs for Richmond teacher #66858 (anonymized in the VDOE database).

[Graph: 2014 reading SGPs of students of Richmond teacher #66858.]

And here, in contrast, are the students’ SGPs for teacher #68809.

[Graph: 2014 reading SGPs of students of Richmond teacher #68809.]

Unfortunately, we have too many who are more like #68809 than #66858.

Richmond’s subpar teacher performance has only worsened recently, as reflected in the deteriorating SOL performance: For 2015, we are the sixth worst division in the state in math, second worst in reading.

Of course, our principals and Superintendent have the power (and duty) to remedy these glaring defects.  Inadequate teacher performance that is not corrected reflects inadequate principal performance; inadequate principal performance that is not corrected reflects inadequate Superintendent performance.  We need to hold these public employees accountable when they don’t deal with the teachers who are harming Richmond’s schoolchildren.

But, then, it’s difficult to impose accountability when VDOE is hiding the relevant data.

Your tax dollars at work.

 

PS: You’ll notice that the internal “evaluations” deal with inputs, which are easy to measure (and to fudge), while the important quantities are outputs (how much our students learn), which are hard to measure.  VDOE is making it harder to know about outputs by abandoning the SGP and installing “progress tables” that measure progress but mostly ignore the lack of it.  Even so, we’ll have some measure of outputs, though VDOE doubtless will not release the data.

Seems to me City Council should demand, as a condition of all the money they are spending on RPS, an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

 

I’ll save VDOE’s feckless “improvements” in the evaluation process for another post.

Where Are the Data

As a further look at the performance and underperformance of Richmond’s elementary schools, here is the range of 2015 pass rates on the reading tests.

[Graph: 2015 reading pass rates, Richmond elementary schools.]

Here we see Carver and Fairfield Court outperforming (we’ll deal with Munford below) while Woodville underperforms at an unconscionable level.  In the meantime, the charter school, Patrick Henry, is in the middle of the pack.

The math scores paint a similar picture except that Cary joins the outperformers and Patrick Henry sinks to the bottom third.

[Graph: 2015 math pass rates, Richmond elementary schools.]

The Fall membership data from VDOE tell us that Munford, the green point, is blessed with a large population of more affluent kids while the other leaders, blue with Carver to the left, are not. 

[Graph: 2015 reading pass rate v. percent economically disadvantaged.]

Woodville, with 79% economically disadvantaged students, is the orange point.

Here is the same graph for the math tests.  Cary joins the leaders as the left-hand blue point.

[Graph: 2015 math pass rate v. percent economically disadvantaged.]

For sure, the economic status of the students does not explain these data.

Here is the dataset.

School Name                                % ED   Reading   Math
Bellevue Elementary                         61%      64%     73%
Blackwell Elementary                        53%      53%     66%
Broad Rock Elementary                       70%      81%     83%
Chimborazo Elementary                       68%      50%     57%
E.S.H. Greene Elementary                    73%      55%     71%
Elizabeth D. Redd Elementary                68%      63%     68%
Fairfield Court Elementary                  84%      88%     90%
G.H. Reid Elementary                        65%      49%     51%
George Mason Elementary                     81%      43%     61%
George W. Carver Elementary                 71%      98%     97%
Ginter Park Elementary                      58%      63%     79%
J.B. Fisher Elementary                      42%      83%     90%
J.E.B. Stuart Elementary                    69%      72%     80%
J.L. Francis Elementary                     68%      59%     69%
John B. Cary Elementary                     69%      72%     93%
Linwood Holton Elementary                   29%      74%     69%
Mary Munford Elementary                     11%      90%     91%
Miles Jones Elementary                      70%      61%     70%
Oak Grove/Bellemeade Elementary             75%      41%     56%
Overby-Sheppard Elementary                  68%      48%     62%
Patrick Henry School Of Science And Arts    31%      67%     65%
Southampton Elementary                      54%      73%     75%
Swansboro Elementary                        69%      52%     51%
Westover Hills Elementary                   64%      53%     68%
William Fox Elementary                      16%      85%     82%
Woodville Elementary                        79%      29%     30%

The data do raise some questions:

  • Where is VDOE?  Where is their study that explains the over- and under- and mediocre-performance of these schools?  What are they doing to transmit that information to the other schools?
  • Carver and Fairfield and Cary (in math) are doing something right (or cheating extravagantly); what is it and why are the other schools not doing it?
  • Patrick Henry has absorbed a lot of money and energy but is not getting results.  What is wrong there?
  • Where are the Woodville parents?  Why are they not at the School Board every meeting to demand that RPS stop abusing their kids?
  • Where is VCU?  To date, their major “contributions” have been a study to validate the VGLA that, upon examination, is a whitewash and the hiring of Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  Perhaps they could do something constructive for a change.

Reversible “Progress”

The ratty, old steel guardrails on Riverside Drive did a fair job of standing the tests of time and drunks.  The new, wooden guardrail got (and failed) its first test about 2:00 Friday morning on the curve just west of the 42nd St. parking lot.

The folks who installed the railing did a nice job of setting it straight; our Friday morning visitor undid that straightness,

[Photo: 150419 004]

mostly by moving the posts.

[Photo: 150419 005]

[Photo: 150419 006]

If our City repairs the new guardrail with the same care that they bring to picking up the leaves on Riverside Drive, we are in for a deteriorating wooden eyesore.

[Photo: 150419 001]

Your tax dollars at “work.”