On Tuesday, the Governor announced a “10-Point Increase in Fully Accredited Schools.” As Jim Bacon quickly pointed out, some part of that increase must be due to the newly-allowed retakes that boosted pass rates by about four percent.
Then we have the “adjustments.” VDOE acknowledges that it fiddles the numbers:
Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.
That falls considerably short of earlier admissions. Indeed, we know that earlier “adjustments” converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.
In any case, the process is opaque. About all we can do is compare the “adjusted” pass rates with those reported in the SOL database (that already includes the 4% retake boost). I have a modest example here.
For the 1774 schools that appear in both databases (see below for the missing 49), the “adjustments” increase the math pass rates:
Excel is happy to fit curves to these data. For the fitted curves, the actual mean is 82.4, the “adjusted” mean is 84.6.
All this produced a nice increase in the number of schools that made the 70% cutoff:
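The comparison itself is simple arithmetic. A minimal Python sketch of it, using made-up pass rates for a handful of schools (the real figures live in the two VDOE spreadsheets), computes the two means and counts who clears the 70% accreditation line:

```python
# Hypothetical actual vs. "adjusted" math pass rates for five schools;
# the real numbers come from the SOL database and the accreditation list.
actual   = [68.2, 71.5, 82.4, 90.1, 66.9]
adjusted = [70.3, 74.0, 84.6, 91.5, 69.8]

mean_actual = sum(actual) / len(actual)
mean_adjusted = sum(adjusted) / len(adjusted)

# How many schools make the 70% accreditation cutoff, before and after.
CUTOFF = 70.0
made_it_actual = sum(1 for r in actual if r >= CUTOFF)
made_it_adjusted = sum(1 for r in adjusted if r >= CUTOFF)

print(mean_actual, mean_adjusted)        # the "adjusted" mean runs higher
print(made_it_actual, made_it_adjusted)  # and more schools clear 70%
```

With the real 1,774-school data the same calculation yields the 82.4 vs. 84.6 means above; the point is only that a two-point shift in the mean can push a meaningful number of schools over the cutoff.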
VDOE writes the tests; they can make them as hard or easy as they wish. Yet they indulge in this byzantine, opaque process. And then they brag about the fudged results.
Moreover, there’s a problem with the data.
In juxtaposing the Accreditation and SOL data, I had to make sure that the school names in both lists were aligned. In many cases they were not. So I spent a rainy afternoon yesterday getting the lists to match.
To accomplish that, I dealt with dozens of cases where the SOL database had a space after the school name but the accreditation list did not (ask Excel whether two strings are equal and it compares them exactly, trailing space and all). As well, I had to deal with cases such as a Norfolk school that was “Mary Calcott Elementary School” in one list and “Mary Calcott Elementary” in the other. Beyond those minor issues, I had to remove 48 schools that were in the accreditation list but not in the SOL database.
(You might notice that 1774+48=1822, which is one short of the 1823 reported by VDOE. I had to move these by hand and perhaps I messed up a cut-and-paste operation. I’m not sufficiently invested in this to spend another afternoon trying to figure out who’s missing.)
We are left to wonder how they calculated “adjusted” pass rates for these schools that apparently had no pass rates.
I also had to remove twelve schools from the SOL report that were not in the accreditation list.
At least the two Richmond schools here make some sense: Elkhardt and Thompson were combined into a single school this year. We are left to wonder why their pass rates were reported separately but they got accredited jointly,* and what happened to the accreditations of the other schools in this list.
As a more global matter, we are left to speculate why they fudge these data. And how they do it. And what other ways the data are screwed up.
Oh, and if one secret process for manipulating the data were not enough, we have another: the federal count of schools and divisions that met or failed to meet their Annual Measurable Objectives (aka “AMO’s,” of course). The only thing to be said for this further waste of taxpayer dollars is that it may be more honest: 51.5% of Virginia schools flunked.
*Actually, we know the answer, at least as to the latter: The combined Elkhardt-Thompson is a “new school,” so it got a bye on accreditation. The joint accreditation thus solved the problem of Thompson, which was denied accreditation last year.