The third SGP data release by VDOE contains anonymized teacher IDs (but no data by school). These abbreviated data serve to emphasize the perversity of VDOE’s suppression of the teacher identities (and other data).
In Richmond, according to the database, 304 teachers taught reading in grades 4 to 8 in the 2012-2014 period. Of these, 74 taught the subject all three years. A graph of the average SGP performance of those 74 teachers is far too busy to convey much information, aside from showing the remarkable range of the average scores and the dip in 2013 caused by the former Superintendent’s failure to align the curriculum to the new SOL tests.
If we remove all but the top and bottom four or five 2014 scores, the graph is more informative.
The average shows the Richmond dip in 2013. Note that the State SOL scores dropped in 2013, because of the new tests, but the SGP did not: The SGP measures relative performance and the statewide SGP average is unaffected by the overall drop in SOL scores. Richmond’s SGP dropped relative to the statewide numbers, however, denoting underperformance here.
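The point that a statewide drop in raw scores leaves the SGP untouched can be illustrated with a toy calculation. Real SGPs condition each student’s growth on peers with similar score histories, but the key property is the same: a percentile rank depends only on ordering, so a uniform drop in raw scores changes no one’s rank. The numbers here are invented for illustration.

```python
def percentile_ranks(scores):
    # Rank each score against the full group, scaled 0-100.
    # Assumes distinct scores (a simplification for this sketch).
    ordered = sorted(scores)
    n = len(scores)
    return [100 * ordered.index(s) / (n - 1) for s in scores]

statewide_2012 = [400, 430, 460, 500, 540, 580, 600]  # invented raw SOL scores
statewide_2013 = [s - 60 for s in statewide_2012]     # harder test: every raw score drops

# Same ordering, so the percentile ranks are identical.
print(percentile_ranks(statewide_2012) == percentile_ranks(statewide_2013))  # True
```

Richmond’s drop, by contrast, shows up in the SGP precisely because Richmond’s students lost ground relative to the rest of the state, not just in raw points.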
Turning to specifics: Teacher No. 66858 (Gr 5 reading) started above average and improved dramatically. Teacher No. 74415 (Gr 4 reading) started below average and deteriorated dramatically.
The distribution of Teacher 66858’s 2014 SGP scores provides a more detailed picture of that teacher’s excellent job.
This teacher had only one student whose reading progress in 2014 was below average. The low end of the 95% confidence interval of the mean for these data is 76.2.
Teacher No. 74415’s distribution tells the opposite story: the high end of the 95% confidence interval for that teacher’s mean is 13.9. Notice that this teacher’s 2013 performance was not a whole lot better.
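For readers who want to check this kind of arithmetic, a 95% confidence interval for a class mean can be computed as below. The scores are invented, not taken from the VDOE release; the Student-t critical value for 23 degrees of freedom (a class of 24 SGPs, as above) is about 2.069.

```python
from math import sqrt
from statistics import mean, stdev

def ci95(scores, t_crit):
    """Two-sided 95% confidence interval for the mean.
    t_crit is the Student-t critical value for len(scores)-1 df."""
    n = len(scores)
    m = mean(scores)
    se = stdev(scores) / sqrt(n)  # standard error of the mean
    return m - t_crit * se, m + t_crit * se

# Invented SGPs for a class of 24 (t critical value for 23 df is about 2.069)
sgps = [88, 92, 75, 95, 81, 90, 99, 84, 77, 93, 86, 91,
        80, 97, 89, 74, 96, 83, 94, 85, 79, 98, 87, 45]
low, high = ci95(sgps, t_crit=2.069)
print(round(low, 1), round(high, 1))
```

A low end of 76.2 for one teacher and a high end of 13.9 for the other means the two intervals are nowhere near overlapping; the difference is not statistical noise.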
The principal who allowed 25 kids (we have SGPs for 24 of the 25) to be subjected to this educational malpractice in 2014 should have been fired. Yet VDOE deliberately makes it impossible for Richmond’s parents to know whether this situation has been corrected or whether, as is almost certain, another batch of kids is being similarly afflicted with this awful teacher.
The math data show a similarly diverse pattern, albeit without the 2013 drop: good and average teachers getting better; average and bad teachers getting worse; bad teachers staying bad.
It turns out that both of the reading teachers above also taught math to the same kids that they taught (or failed to teach, in the one case) reading.
No. 66858 turns out to be an excellent math teacher, albeit not as excellent as at reading.
Similarly, No. 74415 is a bad math teacher, but not as awful at math as at reading.
No real surprises here. Teaching math surely takes a somewhat different skill set than teaching reading, but we might expect a good reading teacher to be good at math as well, and a bad reading teacher to be similarly bad at math.
I could go on and on but the point already is clear: VDOE is concealing important data about the performance of our teachers and principals. Without these data, the public cannot know which of our principals are harboring educational malpractice.
Finally, Algebra I. Only nine Richmond teachers taught this subject in all three years, so the graph includes them all.
These data paint a picture of improvement, but see the caveats below.
Thirty-one Richmond students were afflicted with teacher No. 68640 but we have SGPs for only 13. The scores of those 13 do not paint a pretty picture.
This teacher improved from appalling to merely awful from 2013 to 2014 but still had only three students above average in 2014. It is tempting to conclude that yet another principal needs firing, but there are problems with the data.
The Algebra I data are skewed in at least two ways: The bright kids tend to pass it in middle school. The ones who can’t pass in high school contribute to our appalling dropout rate. Then we have the students who take Algebra but don’t get an SGP score because of the byzantine rules (see the class above with 31 students but only 13 SGPs).
And then, the students who have trouble in high school tend to retake (and retake and retake) the test in order to graduate. VDOE has let slip enough retake data to suggest that the retake scores are bogus.
For instance, here are the 2014 Algebra I retest counts in Richmond.
On the data that slipped out of VDOE, those retests increased the SOL scores of the students involved by an average of 24.6 points. One student improved his/her Algebra I score by 108 points.
The data are here. Note that the averages at the link include the retest score decreases and still show a net positive 12+ points.
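The distinction between the two averages (24.6 points among the students who improved vs. a 12+ point net) is easy to reproduce: the net figure keeps the students whose retest scores went down. The deltas below are invented to show the mechanics, not taken from the VDOE data.

```python
# Invented retest score changes (retest minus original), including decreases.
deltas = [108, 40, 31, 25, 12, 6, 2, -5, -10, -18]

improvers = [d for d in deltas if d > 0]
avg_improvement = sum(improvers) / len(improvers)  # average among gainers only
net_change = sum(deltas) / len(deltas)             # includes the decreases

print(round(avg_improvement, 1), round(net_change, 1))  # -> 32.0 19.1
```

Either way the math is computed, the retests inflate the scores.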
The summer-long retesting is another facet of VDOE’s deliberate concealment: They have enough SOL data to schedule graduations in May, but they do not release the SOLs until late August. There’s no telling what machinations take place in those three months; the data above suggest they are many and major.
So, we have VDOE manipulating the data in secret and, even more to the point, concealing data about the good and bad teachers in our schools.
Our tax dollars at “work.”