New SGP Data–Reading by Division

VDOE has provided Brian Davison a third set of SGP data.  This set includes (anonymized) teacher IDs.

As a reminder, in the words of VDOE:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

That last point is particularly important: The SOL is significantly affected by the student’s economic status; poorer students generally get lower scores.  The SGP, in contrast, measures improvement relative to similarly situated students; the SGP ranking is pretty much independent of economic status.  Thus, the SGP gives us a measure of quality that allows comparisons from division to division, school to school, and teacher to teacher.
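
Before the caveats, a toy Python sketch may help make "progress relative to similar prior achievers" concrete: it ranks a student's current score only against students whose prior-year scores were close to his or her own.  The scores, the ±10-point "peer band," and the function name are invented for illustration; VDOE's actual SGPs come from a statistical model fit to prior test scores, not from this simple banding.

```python
import numpy as np

def toy_growth_percentile(prior_scores, current_scores, student_idx, band=10):
    """Toy growth percentile: rank one student's current score against
    students whose prior scores were within +/- `band` points.
    Illustration only -- not VDOE's actual methodology."""
    prior = prior_scores[student_idx]
    # "Academic peers": students who started from roughly the same place.
    peers = np.abs(prior_scores - prior) <= band
    peer_scores = current_scores[peers]
    # Percentile of this student's current score among those peers.
    return int(round(100 * (peer_scores < current_scores[student_idx]).mean()))

# Hypothetical scaled scores for a small group of students.
prior = np.array([380, 395, 400, 405, 410, 500, 510, 520])
current = np.array([400, 390, 430, 415, 405, 505, 540, 515])
print(toy_growth_percentile(prior, current, student_idx=2))  # peer-relative growth for student 2
```

Because each student is measured only against students who started from the same place, differences in starting point (and, largely, in economic status) wash out; that is what makes division-to-division and teacher-to-teacher comparisons meaningful.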

CAVEATS:

  • VDOE has manipulated these data.  For example, they have suppressed the data in cases where the number of students (presumably in a class) is fewer than ten.  In contrast to earlier data sets, they have suppressed the no-retest scores for students who show a retest score.  They also have suppressed retest scores below passing (<400).  (A sketch of these suppression rules follows this list.)  Those are the changes I know about or have found; there may well be others.  In any event, these data deviate from the expected mean and median of 50, probably because of the VDOE manipulations.  For example, the average of the 2014 sixth-grade reading SGPs in this dataset is 47.7, and the distribution is not the expected flat line:

[Graph: distribution of 2014 Grade 6 reading SGPs, averaging 47.7 rather than the expected flat line]

  • VDOE gave us anonymized teacher IDs, but not the names of the teachers involved.  More on this outrage in a later post.
  • The SGP is calculated from the difference between similarly sized SOL scores; subtracting two nearly equal numbers magnifies their measurement errors, so the relative error in the SGP can be large.  This counsels considerable caution in interpreting any individual student’s SGP or the average (or distribution) for a teacher where the number of scores is small.
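
To make the suppression rules in the first caveat concrete, here is a minimal pandas sketch of how one might re-apply them before averaging.  The column names (teacher_id, student_id, is_retest, sol_score, sgp, division), the filter order, and the DataFrame name are my assumptions; VDOE has not published its processing, so this only shows how removing records can pull a reported average away from 50, not how VDOE actually did it.

```python
import pandas as pd

def apply_vdoe_style_suppression(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative version of the suppression rules described above.
    Column names are assumptions, not the actual VDOE file layout."""
    out = df.copy()

    # 1. Drop records from classes with fewer than ten students.
    class_size = out.groupby("teacher_id")["student_id"].transform("nunique")
    out = out[class_size >= 10]

    # 2. Where a student has a retest, drop the original (no-retest) record.
    has_retest = out.groupby("student_id")["is_retest"].transform("any")
    out = out[~(has_retest & ~out["is_retest"])]

    # 3. Drop retest records with failing scaled scores (below 400).
    return out[~(out["is_retest"] & (out["sol_score"] < 400))]

# Division averages computed from what survives suppression; once enough
# records are removed, there is no reason for these to center on 50.
# division_means = apply_vdoe_style_suppression(scores).groupby("division")["sgp"].mean()
```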

That said, here is the distribution of 2014 division average Reading SGPs.

[Graph: distribution of 2014 division average Reading SGPs]

On this graph, and those below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green.

Excel cannot readably list all the divisions on these graphs, so the axis here and below labels only every other division.  Thus, on the graph below, the blue bar (Norfolk) appears between the labels for Surry and Essex Counties.  That’s just an artifact of the labeling, not evidence that Norfolk has disappeared.

The graph above averages the results of reading tests in the five grades, 4-8 (the Grade 3 SOL score is the starting point, so there is no third-grade SGP).  It turns out there is considerable variation from grade to grade.  Here, then, are the data for each grade, starting with Grade 4 Reading:

[Graph: 2014 Grade 4 Reading SGPs by division]

Grade 5 Reading:

[Graph: 2014 Grade 5 Reading SGPs by division]

There you see Richmond doing quite well.  But wait.  Here are the data for the higher grades, starting with Grade 6 Reading:

[Graph: 2014 Grade 6 Reading SGPs by division]

Grade 7 Reading:

[Graph: 2014 Grade 7 Reading SGPs by division]

Grade 8 Reading:

[Graph: 2014 Grade 8 Reading SGPs by division]

These graphs show a distressing pattern that we will see repeated in the math graphs: Richmond does an average or better job in the elementary schools and a particularly awful job in the middle schools.  Note here the drop from a better-than-average 49 in the fifth grade to a third-from-the-bottom 33 in the sixth.

In summary, here are the 2014 Richmond reading averages by grade v. the state.

[Graph: 2014 Richmond Reading SGP averages by grade vs. the state]

These data don’t tell us why our middle schools are doing such an awful job.  But they (in conjunction with the data by teacher – stay tuned for a later post) certainly tell us where the system is in dire need of repair.

 

Next post: Math SGPs by division.