## SGP III – Rating the Teachers: Reading

The most recent VDOE database of Student Growth Percentiles contains (anonymous) teacher IDs.  This gives us a first peek at how well, and how badly, some of Richmond’s teachers are performing.

With all the earlier caveats, let’s start with the statewide distributions of teachers’ average SGP scores in reading and math.

Brian Davison points out that both distributions are reasonably symmetrical, suggesting that we do not have an unusually large number of teachers doing particularly well or poorly.  That said, no parent will want a child to be subjected to the reading teacher in the first percentile, the other teacher in the second, or the three in the eighth.

The math scores are more widely distributed, showing a larger number of excellent and a larger number of awful teachers.

Aside from targeting the lowest performers in both subjects, these data suggest that the need for retraining is greater in math than in reading.

The orange curve is a normal distribution fitted to the data by least squares.  The average and standard deviation of the fitted curve are shown at the base of the graph.
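For the curious, a fit like that can be sketched in a few lines.  Everything below is synthetic — the teacher averages are random numbers, not the VDOE data — but the mechanics are the same: histogram the averages, then least-squares fit a Gaussian to the bin counts.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
# Synthetic stand-ins for teacher-average SGPs (the real values are in the VDOE data).
teacher_means = rng.normal(loc=50, scale=10, size=500)

# Histogram the averages, then fit a normal curve to the bin counts.
counts, edges = np.histogram(teacher_means, bins=30)
centers = (edges[:-1] + edges[1:]) / 2

def gaussian(x, amplitude, mean, sd):
    return amplitude * np.exp(-((x - mean) ** 2) / (2 * sd ** 2))

params, _ = curve_fit(gaussian, centers, counts, p0=[counts.max(), 50, 10])
amplitude, mean, sd = params
print(f"fitted mean = {mean:.1f}, fitted SD = {abs(sd):.1f}")
```

With real data, the fitted mean and SD would land near the values printed at the base of each graph; any large disagreement between the fitted curve and the histogram is itself informative.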

The distribution of sixth grade reading teachers is close to the same.

The Richmond distributions conform to that pattern.  First, grade 5:

(Note that this average by teacher is slightly less than the average by student, above.)

As you see, this distribution is a bit wider than the statewide distribution.  That is, Richmond has relatively more excellent fifth grade reading teachers than the statewide average, and also relatively more who are not performing.  Five (of sixty-seven) Richmond teachers are more than two standard deviations above the state average; three are more than two standard deviations below.
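The two-standard-deviation screen is easy to reproduce.  The teacher averages below are invented stand-ins (the real ones are in the VDOE data), but the arithmetic is the same: convert each teacher's average to a z-score against the state mean and count the tails.

```python
import numpy as np

# Illustrative numbers only: an assumed state mean/SD and ten invented teacher averages.
state_mean, state_sd = 47.7, 10.0
teacher_means = np.array([72.1, 68.4, 27.0, 25.0, 51.2,
                          44.8, 69.9, 70.3, 71.5, 26.3])

z_scores = (teacher_means - state_mean) / state_sd
above = int(np.sum(z_scores > 2))   # more than 2 SD above the state average
below = int(np.sum(z_scores < -2))  # more than 2 SD below
print(above, "above,", below, "below")
```

Run against the actual teacher averages, this is all it takes to flag the outliers at both ends of a division's distribution.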

Those teachers at the low end need some work but, for the most part, Richmond’s fifth graders are in pretty good hands as to reading.

Only one of Richmond’s twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average.  Six (or seven, depending on the rounding) were more than two standard deviations below the state average and four were more than three standard deviations below.  The Richmond average is 1.5 standard deviations below the state average.

These data tell us that Richmond’s sixth grade reading teachers are not doing a bad job.  They are doing an appalling job.

Upon some reflection, the data also tell us two even more important things:

• The principals (and the Superintendent) now have a quantitative measure of teacher performance (at least as to reading and math).  If they don’t do something (soon!) about rewarding the excellent performers and retraining or firing the poor ones, we’ll know they need to be replaced themselves.
• VDOE is hiding the identities of these high- and low-performing teachers from the parents who pay them and whose kids are directly affected by teacher performance.  Apparently those bureaucrats think it would be intrusive for the parents of Virginia’s schoolchildren to know whether their kids are in the hands of excellent, average, or lousy teachers.  I think the term for that kind of inexcusable bureaucratic arrogance is “malfeasance.”

Tomorrow, the Math situation.  (Hint: It’s even worse.)

## New SGP Data II – Math by Division

Continuing the analysis of the third set of SGP data from VDOE (with the same caveats mentioned earlier), here are the 2014 division average math and Algebra I scores.

As before, Richmond is the yellow bar, Petersburg is red, Hampton is green, and Norfolk is blue.

As with the reading scores, Richmond is doing fairly well with math in grades 4 and 5.  Then comes Grade 6 math, where Richmond drops to next-to-last place:

Algebra I:

As with the reading scores, the Richmond math scores plummet when the students enter middle school.  Yet the state averages remain nearly flat (as they should; absent VDOE’s manipulations of the data, the state average would be exactly 50 on every test).

Something uniquely ugly happens in the sixth grade in Richmond.

I asked the formidable Carol Wolf:

What’s going on with the 6th grade?  Richmond’s [SGP] scores in both reading and math fall into a pit from fifth to 6th grade.  I’m looking at data that show that NO Richmond teacher for whom we have SGP data taught 6th grade reading two years in a row; ONE Richmond teacher for whom we have SGP data taught 6th grade math two years in a row.  Nobody taught either subject three years in a row.

Any ideas?

She replied:

I have asked several teachers what and why they think it is that Richmond’s 6th and 7th graders go from a fully accredited elementary school to being dumber than a sack of rocks when they hit middle school.

Their collective answer:  The elementary schools are cheating.

Could be.  The 8th Grade SGPs (which are below, but approaching, state average values) are based entirely on the change from previous years’ middle school SOL scores; the 7th Grade SGPs can reach one year back into elementary school; and the 6th Grade SGPs are based entirely on the change from students’ elementary school SOL histories.  If Richmond’s elementary SOL scores were artificially high while the middle school SOLs were normal, the 6th graders (and, to a lesser degree, the 7th graders) would be starting from artificially high SOLs, so their SGP scores would show abnormally little improvement.  That is, the SGP scores would be abnormally low in the sixth and, to a lesser degree, seventh grades.
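That mechanism can be checked in a toy simulation.  Everything below is invented — the score scale, the noise levels, and the 30-point "inflation" are assumptions, not Richmond data — but it shows the effect: a cohort whose prior-year scores were artificially inflated ends up with depressed growth percentiles even though its actual learning is normal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
ability = rng.normal(500, 50, n)
# First 1,000 students get a 30-point bonus on the PRIOR test only ("cheating").
inflation = np.where(np.arange(n) < 1000, 30.0, 0.0)
prior = ability + rng.normal(0, 20, n) + inflation   # inflated fifth-grade SOL
current = ability + rng.normal(0, 20, n)             # honest sixth-grade SOL

# Crude growth percentile: rank of the current score among students
# whose prior score fell in the same 10-point bin.
bins = (prior // 10).astype(int)
sgp = np.empty(n)
for b in np.unique(bins):
    idx = np.where(bins == b)[0]
    ranks = current[idx].argsort().argsort()         # 0-based ranks within the bin
    sgp[idx] = 100 * (ranks + 0.5) / len(idx)

print("mean SGP, inflated cohort:", round(sgp[:1000].mean(), 1))
print("mean SGP, everyone else:  ", round(sgp[1000:].mean(), 1))
```

The inflated cohort is compared against genuinely stronger peers, so its growth percentiles come out far below 50 while everyone else sits near 50 — exactly the pattern an honest sixth grade downstream of a dishonest fifth grade would produce.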

The 6th Grade teach-once-then-teach-something-else pattern would suggest that the new teachers get the sixth grade classes and that they get out as soon as they get any seniority.  That would be consistent with unusually low sixth grade SGP scores, whether the elementary SOLs were inflated or not.

Let’s label Carol’s suggestion as an hypothesis and try to think of an experiment to falsify it.

In any event, with the prophylactic effect of Richmond’s appalling dropout rate (and, probably, Richmond’s remarkable retest rate), the scores rebound for Algebra I.

Coming Attraction: SGP distributions by teacher.

## New SGP Data–Reading by Division

VDOE has provided Brian Davison a third set of SGP data.  This set includes (anonymized) teacher IDs.

As a reminder, in the words of VDOE:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

That last point is particularly important: The SOL is significantly affected by the student’s economic status; poorer students generally get lower scores.  The SGP, in contrast, measures improvement relative to similarly situated students; the SGP ranking is pretty much independent of economic status.  Thus, the SGP gives us a measure of quality that allows comparisons from division to division, school to school, and teacher to teacher.
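A toy version of that definition, with invented scores: take the students whose prior-year scores were similar, and ask where this student's current score ranks among them.

```python
import numpy as np

# Hypothetical current-year SOL scores for ten students who all scored
# about the same on last year's test (invented numbers, not VDOE data).
peer_scores = np.array([388, 402, 415, 420, 425, 431, 440, 452, 460, 475])
student_score = 440

# Growth percentile: the student's standing among those academic peers.
percentile = 100 * np.sum(peer_scores < student_score) / peer_scores.size
print(percentile)  # 60.0
```

Note that the student's absolute score never enters the comparison against the full population — only against peers with a similar starting point — which is why the SGP is largely insensitive to economic status.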

CAVEATS:

• VDOE has manipulated these data.  For example, they have suppressed the data in cases where the number of students (presumably in a class) is fewer than ten.  In contrast to earlier data sets, they have suppressed the no-retest scores for students showing a retest score.  They also have suppressed retest scores below passing (<400).   Those are the changes I know about or have found; there may well be others.   In any event, these data deviate from the expected mean and median of 50, probably because of the VDOE manipulations.  For example, the average of the 2014 sixth grade reading SGPs in this dataset is 47.7 and the distribution is not the expected flat line:

• VDOE gave us anonymized teacher IDs, but not the names of the teachers involved.  More on this outrage in a later post.
• The SGP is calculated from the difference between similarly sized SOL scores, which can lead to large relative errors in the SGP.  This suggests considerable caution in interpreting any individual student’s SGP score or the average (or distribution) for a teacher where the number of scores is small.
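That last caveat is ordinary error propagation.  A sketch, assuming a standard error of measurement of about 15 scaled-score points (an assumption for illustration, not a VDOE figure): the errors in the two scores add in quadrature, while the gain itself is small, so the relative error can exceed 100%.

```python
import math

# Two similarly sized SOL scaled scores with an assumed measurement error.
prior, current, sem = 430.0, 440.0, 15.0

gain = current - prior                     # the difference the SGP is built on
gain_error = math.sqrt(sem**2 + sem**2)    # errors add in quadrature
print(f"gain = {gain:.0f} ± {gain_error:.1f}")
print(f"relative error ≈ {gain_error / gain:.0%}")
```

A 10-point gain with a ±21-point uncertainty is mostly noise, which is why a single student's SGP, or a teacher average built on a handful of scores, deserves a skeptical eye.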

That said, here is the distribution of 2014 division average Reading SGPs.

On this graph, and those below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green.

Excel cannot legibly list all the divisions on this graph; the axis labels here and below omit every second division.  Thus, on the graph below, the blue bar (Norfolk) appears between Surry and Essex Counties.  That’s just an artifact of the program, not evidence that Norfolk has disappeared.

The graph above averages the results of reading tests in the five grades, 4-8 (the SOL score in Grade 3 is the starting point so there’s no third grade SGP).  It turns out there is a considerable variation from grade to grade.  Here, then, are the data for each grade, starting with Grade 4 Reading:

There you see Richmond doing quite well.  But wait.  Here are the data for the higher grades, starting with Grade 6 Reading:

These graphs show a distressing pattern that we will see repeated in the math graphs: Richmond does an average or better job in the elementary schools and a particularly awful job in the middle schools.  Note here the drop from a better-than-average 49 in the fifth grade to a third-from-the-bottom 33 in the sixth.

In summary, here are the 2014 Richmond reading averages by grade v. the state.

These data don’t tell us why our middle schools are doing such an awful job.  But they (in conjunction with the data by teacher – stay tuned for a later post) certainly tell us where the system is in dire need of repair.

Next post: Math SGPs by division.