“Adjusting” Accreditation Scores

The accreditation scores for 2017-2018, based on the 2017 testing, are up on the VDOE site.

In the distant past (2005), VDOE’s opaque accreditation process transformed 76.3 and 73.7 math scores at Jeter-Watson into “perfect scores” and embarrassed the Governor.

They now admit to manipulating the scores:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

(But don’t ask them for the remediation data.  They’ll tell you to go breathe ozone.)

They tell me the English rating is based on an average of the reading and writing pass rates.  I’m waiting for information on whether that is an average of the school averages or an average by students.
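The distinction matters: an average of the school averages counts a tiny school the same as a huge one, while an average by students does not.  A minimal sketch with invented numbers (not VDOE data) shows how far apart the two can land:

```python
# A sketch of the two averaging methods in question, on invented
# numbers (not VDOE data).

schools = [
    # (reading pass %, writing pass %, students tested) -- all invented
    (90.0, 85.0, 1200),
    (60.0, 55.0, 300),
]

# Average of the school averages: every school counts once.
school_avg = sum((r + w) / 2 for r, w, _ in schools) / len(schools)

# Average by students: each school weighted by its tested enrollment.
total = sum(n for _, _, n in schools)
student_avg = sum((r + w) / 2 * n for r, w, n in schools) / total

print(f"average of school averages: {school_avg:.1f}")  # 72.5
print(f"average by students:        {student_avg:.1f}")  # 81.5
```

In the meantime, let’s look at the math data.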

The accreditation “adjustments” to pass rates are not so dramatic these days (an average of 2.6 points on this year’s math tests), but they still have significant effects.

To start, here is a plot of the math “adjusted” rates (i.e., the rates used for accreditation), by school, vs. the actual pass rates.

[Figure: adjusted (accreditation) math pass rate vs. actual pass rate, by school]

In a world populated with honest data, all those points would lie on the red line.  As you see, some do, and a few lie below (hard to know how the adjustments would produce that result), but most of the data show accreditation scores that are “adjusted” to larger values.
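For anyone who wants to reproduce the plot, here is a rough sketch of how it can be built; the file and column names (“math_rates_2017.csv”, “pass_rate”, “adjusted_rate”) are my placeholders, not VDOE’s actual headers:

```python
# A sketch of how such a plot can be built, not VDOE's method.  The
# file and column names are placeholders for whatever the VDOE
# downloads actually contain.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("math_rates_2017.csv")  # hypothetical merged dataset

fig, ax = plt.subplots()
ax.scatter(df["pass_rate"], df["adjusted_rate"], s=8, alpha=0.4)
ax.plot([0, 100], [0, 100], color="red")  # honest data: adjusted == actual
ax.set_xlabel("Actual math pass rate (%)")
ax.set_ylabel("Adjusted (accreditation) rate (%)")
plt.show()
```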

NOTE: It took several hours to groom the dataset.  The database reported SOL pass rates for ten schools that had no entry in the accreditation list.  As well, VDOE reported accreditation data for 41 schools not in the SOL database, and listed another six in the accreditation data without any numbers.  Then there are four more schools that appear in both lists but are missing data in one or the other.  The data here are for the remaining 1,772 schools.
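For anyone repeating that grooming exercise, it boils down to an outer join between the two lists; a sketch, under assumed file and column names:

```python
# A sketch of the reconciliation as an outer join; file and column
# names are my assumptions, not the actual VDOE headers.
import pandas as pd

sol = pd.read_csv("sol_pass_rates.csv")    # hypothetical SOL database extract
accred = pd.read_csv("accreditation.csv")  # hypothetical accreditation list

merged = sol.merge(accred, on=["division", "school"],
                   how="outer", indicator=True)

sol_only = merged[merged["_merge"] == "left_only"]      # SOL rates, no accreditation report
accred_only = merged[merged["_merge"] == "right_only"]  # accreditation list only
usable = merged[merged["_merge"] == "both"].dropna(
    subset=["pass_rate", "adjusted_rate"])              # drop rows missing numbers

print(len(sol_only), len(accred_only), len(usable))
```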

If we plot the distribution of differences (i.e., adjusted score minus actual pass rate), we see that most of the adjustments are five points or fewer.

[Figure: distribution of adjusted minus actual math pass rates]

Rescaling the y-axis reveals that adjustments even in the 10-point range are not rare and that, in one case, the “adjustments” produced a 26-point gain.

[Figure: the same distribution, y-axis rescaled]
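The arithmetic behind that distribution (and the 2.6-point average mentioned above) is simple enough; here is a sketch on the same hypothetical dataset:

```python
# A sketch of the difference calculation and its histogram, using the
# same hypothetical file and column names as above.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("math_rates_2017.csv")  # hypothetical merged dataset
diff = df["adjusted_rate"] - df["pass_rate"]

print(f"mean adjustment: {diff.mean():.1f} points")
print(f"largest gain:    {diff.max():.0f} points")

diff.hist(bins=range(-5, 30))  # one-point bins across the observed range
plt.xlabel("Adjusted minus actual pass rate (points)")
plt.ylabel("Number of schools")
plt.show()
```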

The adjustments reduce the population of scores just below the 70% cutoff for accreditation and increase the population above that benchmark:

[Figure: distributions of actual and adjusted pass rates around the 70% accreditation cutoff, with fitted curves]

The (least squares) fitted curves show the shift in the average score.
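Another way to see the shift is to count heads on each side of the benchmark before and after the “adjustments”; again a sketch on the hypothetical dataset:

```python
# A sketch of the head count on each side of the 70% benchmark,
# before and after adjustment (same hypothetical columns).
import pandas as pd

df = pd.read_csv("math_rates_2017.csv")  # hypothetical merged dataset

below_actual = (df["pass_rate"] < 70).sum()
below_adjusted = (df["adjusted_rate"] < 70).sum()

print(f"schools under 70% on actual rates:  {below_actual}")
print(f"schools under 70% after adjustment: {below_adjusted}")
print(f"lifted over the benchmark:          {below_actual - below_adjusted}")
```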

A plot of the counts of adjusted rates minus the counts of actual rates, at each pass-rate value, emphasizes how the adjustments deplete the population below the 70% cutoff and increase it above.

[Figure: counts of adjusted rates minus counts of actual rates, by pass rate]

The outstanding example:  There are 37 schools with 69% math pass rates but only five with that adjusted rate.  The average adjusted rate for those 37 schools is 74%.
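That example, and the counts plot above, fall out of a few lines of the same sort; a sketch:

```python
# A sketch of the counts-difference calculation and the 69% check
# (same hypothetical columns).  Rounding to whole points mirrors the
# whole-number rates in the plots above.
import pandas as pd

df = pd.read_csv("math_rates_2017.csv")  # hypothetical merged dataset

actual_counts = df["pass_rate"].round().value_counts()
adjusted_counts = df["adjusted_rate"].round().value_counts()

# Positive where the adjustments added schools at a rate, negative
# where they removed them.
delta = adjusted_counts.subtract(actual_counts, fill_value=0)
print(delta.sort_index())

# The 69% example: schools at 69 on the actual rate, and where their
# adjusted rates ended up.
at_69 = df[df["pass_rate"].round() == 69]
print(len(at_69), at_69["adjusted_rate"].mean())
```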

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the rates: Economic Disadvantage.

And remember that the pass rates themselves have been fudged in some cases: See, e.g., this and this.

So “opaque” is insufficient to describe this process.  “Opaque and corrupt and unfair” comes closer.

Your tax dollars at “work.”