New SGP Data II – Math by Division

Continuing the analysis of the third set of SGP data from VDOE (with the same caveats mentioned earlier), here are the 2014 division average math and Algebra I scores.

As before, Richmond is the yellow bar, Petersburg is red, Hampton is green, and Norfolk is blue.

Grade 4 math:

image

Grade 5 math:

image

As with the reading scores, Richmond is doing fairly well with math in grades 4 and 5.  Then comes Grade 6 math, where Richmond drops to next-to-last place:

image

Grade 7 math:

image

Grade 8 math:

image

Algebra I:

image

As with the reading scores, the Richmond math scores plummet when the students enter middle school.  Yet the state averages remain nearly flat (as they should, almost: if VDOE were not manipulating the data, the state average would be entirely flat at 50 on every test).

image

image
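The "flat at 50" expectation follows from the definition, not from any fact about the students: an SGP is a percentile rank, and percentile ranks computed over the whole state are evenly spread by construction.  A toy sketch (all numbers invented) makes the point:

```python
import random
from bisect import bisect_left

# A growth percentile is just a percentile rank.  Rank every student in the
# state against the whole state and the scores must spread evenly from
# 1 to 99, averaging about 50, no matter how the raw growth is shaped.
random.seed(1)
growth = [random.gauss(0, 25) for _ in range(9999)]  # any distribution at all
ranked = sorted(growth)

def sgp(x):
    below = bisect_left(ranked, x)  # count of students who grew less
    return max(1, min(99, round(100 * below / len(ranked))))

sgps = [sgp(g) for g in growth]
print(round(sum(sgps) / len(sgps), 1))  # lands at about 50
```

So any statewide average that sits away from 50 is telling us about the data handling, not about the teaching.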

Something uniquely ugly happens in the sixth grade in Richmond.

I asked the formidable Carol Wolf:

What’s going on with the 6th grade?  Richmond’s [SGP] scores in both reading and math fall into a pit from fifth to 6th grade.  I’m looking at data that show that NO Richmond teacher for whom we have SGP data taught 6th grade reading two years in a row; ONE Richmond teacher for whom we have SGP data taught 6th grade math two years in a row.  Nobody taught either subject three years in a row.

Any ideas?

She replied:

I have asked several teachers what and why they think it is that Richmond’s 6th and 7th graders go from a fully accredited elementary school to being dumber than a sack of rocks when they hit middle school.  

Their collective answer:  The elementary schools are cheating.

Could be.  The 8th Grade SGPs (which are below, but approaching, state average values) are based entirely on the change from previous years’ middle school SOL scores; the 7th Grade SGPs can reach one year into elementary school; and the 6th Grade SGPs are based entirely on the change from students’ SOL histories in elementary school.  If Richmond’s elementary SOL scores were artificially high and the middle school SOLs were merely normal, the 6th graders and, to a lesser degree, the 7th graders would be starting from artificially high SOLs, so their SGP scores would show abnormally little improvement.  That is, the SGP scores would be abnormally low in the sixth and, to a lesser degree, seventh grades.
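That mechanism is easy to see in a toy simulation (all numbers invented; this is a sketch of the argument, not of VDOE’s actual SGP model): an artificial bump in a division’s prior-year scores subtracts straight out of the apparent growth, so the division’s SGPs sink even though its students learned a perfectly normal amount.

```python
import random
from bisect import bisect_left

random.seed(2)

# Statewide reference group: apparent growth equals true growth.
state_growth = sorted(random.gauss(0, 25) for _ in range(9999))

def sgp(apparent_growth):
    below = bisect_left(state_growth, apparent_growth)
    return max(1, min(99, round(100 * below / len(state_growth))))

INFLATION = 20  # hypothetical points added to one division's elementary SOLs

# Both groups have the same true growth; the inflated prior score
# subtracts from the apparent growth of the second group.
honest = [sgp(random.gauss(0, 25)) for _ in range(500)]
inflated = [sgp(random.gauss(0, 25) - INFLATION) for _ in range(500)]

print(round(sum(honest) / len(honest)))      # near 50
print(round(sum(inflated) / len(inflated)))  # well below 50
```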

The 6th Grade teach-once-then-teach-something-else pattern would suggest that the new teachers get the sixth grade classes and that they get out as soon as they get any seniority.  That would be consistent with unusually low sixth grade SGP scores, whether the elementary SOLs were inflated or not.

Let’s label Carol’s suggestion as an hypothesis and try to think of an experiment to falsify it.

In any event, with the prophylactic effect of Richmond’s appalling dropout rate (and, probably, Richmond’s remarkable retest rate), the scores rebound for Algebra I.

image

 

Coming Attraction: SGP distributions by teacher.

New SGP Data – Reading by Division

VDOE has provided Brian Davison a third set of SGP data.  This set includes (anonymized) teacher IDs.

As a reminder, in the words of VDOE:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

That last point is particularly important: The SOL is significantly affected by the student’s economic status; poorer students generally get lower scores.  The SGP, in contrast, measures improvement relative to similarly situated students; the SGP ranking is pretty much independent of economic status.  Thus, the SGP gives us a measure of quality that allows comparisons from division to division, school to school, and teacher to teacher.
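A minimal sketch of the idea (invented numbers; the real SGP model uses conditional quantile regression, not the simple score-banding shown here): rank each student’s gain only against students who started from a similar score.

```python
import random

random.seed(0)
# Hypothetical (prior SOL, current SOL) pairs for a statewide cohort.
cohort = []
for _ in range(2000):
    prior = random.randint(350, 550)
    cohort.append((prior, prior + random.gauss(10, 30)))

def growth_percentile(prior, current, band=15):
    # Peers: students whose prior score was within `band` points.
    peer_gains = [c - p for p, c in cohort if abs(p - prior) <= band]
    below = sum(1 for g in peer_gains if g < current - prior)
    return max(1, min(99, round(100 * below / len(peer_gains))))

# A typical 10-point gain lands mid-pack among peers...
mid = growth_percentile(450, 460)
# ...while a 40-point gain ranks high even though 420 is still a low score.
high = growth_percentile(380, 420)
print(mid, high)
```

That last case is the point of the measure: a low-scoring student with a big gain gets a high SGP, which is why the SGP is largely decoupled from economic status.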

CAVEATS:

  • VDOE has manipulated these data.  For example, they have suppressed the data in cases where the number of students (presumably in a class) is fewer than ten.  In contrast to earlier data sets, they have suppressed the no-retest scores for students showing a retest score.  They also have suppressed retest scores below passing (<400).   Those are the changes I know about or have found; there may well be others.   In any event, these data deviate from the expected mean and median of 50, probably because of the VDOE manipulations.  For example, the average of the 2014 sixth grade reading SGPs in this dataset is 47.7 and the distribution is not the expected flat line:

image

  • VDOE gave us anonymized teacher IDs, but not the names of the teachers involved.  More on this outrage in a later post.
  • The SGP is calculated from the difference between two similarly sized SOL scores, and subtracting nearly equal numbers can leave a large relative error in the result.  This suggests considerable caution in interpreting any individual student’s SGP score or the average (or distribution) for a teacher where the number of scores is small.
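The 47.7 average is just what non-random suppression would be expected to produce.  A toy illustration (the suppression rule here is invented; the point is only that dropping any non-random slice of the records moves the mean off 50):

```python
import random

random.seed(3)

# Theoretical picture: SGPs roughly uniform on 1..99, mean near 50.
sgps = [random.randint(1, 99) for _ in range(10000)]
print(round(sum(sgps) / len(sgps), 1))  # about 50

# Invented suppression rule: drop a quarter of the records above 60
# (standing in for whatever slice VDOE's rules actually remove).
kept = [s for s in sgps if not (s > 60 and random.random() < 0.25)]
print(round(sum(kept) / len(kept), 1))  # drifts below 50
```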

That said, here is the distribution of 2014 division average Reading SGPs.

image

On this graph, and those below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green.

Excel is unable to readably list all the divisions on this graph; the list here and below excludes every 2d division.  Thus, on the graph below, the blue bar (Norfolk) appears between Surry and Essex Counties.  That’s just an artifact of the program, not evidence that Norfolk has disappeared.

The graph above averages the results of reading tests in the five grades, 4-8 (the Grade 3 SOL score is the starting point, so there is no third grade SGP).  It turns out there is considerable variation from grade to grade.  Here, then, are the data for each grade, starting with Grade 4 Reading:

image

Grade 5 Reading:

image

There you see Richmond doing quite well.  But wait.  Here are the data for the higher grades, starting with Grade 6 Reading:

image

Grade 7 Reading:

image

Grade 8 Reading:

image

These graphs show a distressing pattern that we will see repeated in the math graphs: Richmond does an average or better job in the elementary schools and a particularly awful job in the middle schools.  Note here the drop from a better-than-average 49 in the fifth grade to a third-from-the-bottom 33 in the sixth.

In summary, here are the 2014 Richmond reading averages by grade v. the state.

image

These data don’t tell us why our middle schools are doing such an awful job.  But they (in conjunction with the data by teacher – stay tuned for a later post) certainly tell us where the system is in dire need of repair.

 

Next post: Math SGPs by division.