“Adjusting” Accreditation Scores

The accreditation scores for 2017-2018, based on the 2017 testing, are up on the VDOE site.

In the distant past (2005), VDOE’s opaque accreditation process transformed 76.3 and 73.7 math scores at Jeter-Watson into “perfect scores” and embarrassed the Governor.

They now admit to manipulating the scores:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

(But don’t ask them for the remediation data.  They’ll tell you to go breathe ozone.)

They tell me the English rating is based on an average of the reading and writing pass rates.  I’m waiting for information on whether that is an average of the school averages or an average by students.  In the meantime, let’s look at the math data.

The accreditation “adjustments” to pass rates are not so dramatic these days, an average of 2.6 points on the math tests this year, but they still have significant effects.

To start, here is a plot of the math “adjusted” rates (i.e., the rates used for accreditation), by school, vs. the actual pass rates.


In a world populated with honest data, all those points would lie on the red line.  As you see, some do, and a few lie below (hard to know how the adjustments would produce that result), but most of the data show accreditation scores that are “adjusted” to larger values.

NOTE: It took several hours to groom the dataset.  There were ten schools for which the database reported SOL pass rates but the Accreditation list had no report.  As well, VDOE reported accreditation data for 41 schools not listed in the SOL database and listed another six in the accreditation data without any numbers.  Then there are another four schools that appear in both lists but are missing data in one or the other.  The data here are for the remaining 1,772 schools.
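For those keeping score at home, the bookkeeping behind that grooming is just set arithmetic. Here is a minimal Python sketch, with invented school names, of the three-way split between matched schools, SOL-only schools, and accreditation-only schools:

```python
# Sketch of the dataset reconciliation described in the note above, done in
# Python rather than Excel. The school names are invented; the point is the
# set arithmetic that separates matched schools from the strays.
def reconcile(sol_schools, accred_schools):
    """Split school names into (in both lists, SOL-only, accreditation-only)."""
    sol, accred = set(sol_schools), set(accred_schools)
    return sol & accred, sol - accred, accred - sol

sol_list = ["A Elementary", "B Middle", "C High"]
accred_list = ["A Elementary", "C High", "D Elementary"]

both, sol_only, accred_only = reconcile(sol_list, accred_list)
print(len(both), len(sol_only), len(accred_only))
```

With the real lists, the analysis proceeds only on the first group; the other two are the ten, 41, and assorted stragglers the note describes.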

If we plot the distribution of differences (i.e., adjusted score minus actual pass rate), we see that most of the adjustments are five points or fewer.


Rescaling the y-axis reveals that numbers even in the 10% range are not trivial and, in one case, the “adjustments” produced a 26-point gain.


The adjustments reduce the population of scores just below the 70% cutoff for accreditation and increase the population above that benchmark:


The (least squares) fitted curves show the shift in the average score.

A plot of counts of adjusted minus counts of actual pass rates emphasizes how the adjustments deplete the population below the 70% cutoff and increase it above.


The outstanding example:  There are 37 schools with 69% math pass rates but only five with that adjusted rate.  The average adjusted rate for those 37 schools is 74%.
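The mechanics are simple counting. Here is a toy Python sketch, with invented rates (not the VDOE data), of how per-school boosts thin the group just under the 70% cutoff:

```python
# Toy illustration (invented rates, not VDOE data) of how per-school
# "adjustments" deplete the population just below the accreditation cutoff.
CUTOFF = 70

actual   = [69, 69, 69, 68, 71, 66, 69, 72]
adjusted = [74, 70, 69, 73, 71, 66, 75, 74]  # each >= its actual rate

below_before = sum(1 for r in actual if r < CUTOFF)
below_after  = sum(1 for r in adjusted if r < CUTOFF)
print(below_before, below_after)
```

Run against the real data, the same counting produces the 69% example above: 37 schools at that actual rate, only five left there after adjustment.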

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the rates: Economic Disadvantage.

And remember that the pass rates themselves have been fudged in some cases: See, e.g., this and this.

So “opaque” is insufficient to describe this process.  “Opaque and corrupt and unfair” comes closer.

Your tax dollars at “work.”

Carver But Not Walker

The Feds have just named seven Virginia schools as National Blue Ribbon Schools.  They choose the schools based on performance on standardized tests or for “closing achievement gaps.”

This year’s list included Carver, selected as [search for Carver] an Exemplary High Performing School.  Indeed, Carver was the best performing elementary school in Richmond this year and the sixteenth best school in the state.

The list did not include Maggie Walker.  Indeed, the list never [search for Walker in all years] has included Maggie Walker.  Yet Walker was rated tenth best public high school in the nation in 2014 by the Daily Beast.

But, you see, Maggie Walker is a “program,” not a “school.”  So the scores go to high schools in the students’ home districts, even though those students do not attend those high schools.

Never mind that it has a school board and is accredited as a “school” and has a four-year program and grants diplomas to the 100% of its students who graduate.

Do you suppose the feds know that VDOE is lying to them about the SOL scores of our local high schools?  Do you care that VDOE brokered this corrupt deal so the local Superintendents would let their bright kids go to MLW without lowering the SOLs of the local high schools?  Do you wonder that VDOE manipulates other data (see this and this for some introductory examples; as soon as Bacon’s Rebellion gets its server back up, I’ll add some other links here)?

Your tax dollars at “work.”

Graduation and Not, 2016

VDOE posted the 2016 4-year cohort graduation data yesterday.  Their press release burbled on about the increase of the On-Time rate to over 91%. 

As we shall see, the On-Time rate is a fiction, created by VDOE to inflate the rate.  But first, some background.

  • The Standard Diploma requires twenty-two “standard credits” and six “verified credits” in specified subjects.  
  • The Advanced Studies Diploma requires twenty-four standard and nine verified credits.

These are the only diplomas recognized by the Feds for calculation of the federal graduation indicator.  VDOE counts three further diplomas toward its inflated “On-Time” graduation rate:

  • The Modified Standard Diploma is available to students “who have a disability and are unlikely to meet the credit requirements for a Standard Diploma.”  This diploma is being phased out in favor of “credit accommodations” that will allow students who would have pursued a Modified Standard Diploma to earn a Standard Diploma.  Those of us who have watched the wholesale institutional cheating via the VGLA may be forgiven for thinking that these accommodations will be a fertile field for schools and divisions to game the system.
  • The Special Diploma, now known by the new euphemism “Applied Studies Diploma,” “is available to students with disabilities who complete the requirements of their Individualized Education Program (IEP) and who do not meet the requirements for other diplomas.”
  • The General Achievement Diploma “is intended for individuals who are at least 18 years of age and not enrolled in public school or not otherwise meeting the compulsory school attendance requirements set forth in the Code of Virginia.”  This one does not amount to much in the stats: Richmond had none this year; Virginia, fewer than ten.

I have commented elsewhere on Richmond’s abuse of the process for identifying and testing kids with disabilities.

This year, the 4-year cohort On-Time rate for Virginia was 91.3%.  The federal graduation indicator, known here as the “actual” graduation rate, was 87.7%.

Here are the actual cohort rates.


That 18% difference between the state and Richmond rates is 266 members of the Richmond cohort of 1,476.  Doubtless too many of that 266 will wind up as people you would not want to meet in a dark alley.

Here are the same data, juxtaposed with the inflated “On-Time” rates.


The statewide difference between the “On Time” and federal rates was 3.6% or 3,439 students; the Richmond difference was 10.6%, 156 students.
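Those student counts are just percentage-of-cohort arithmetic. A quick check from the rounded figures in the text (so the results are approximate):

```python
# Reconstructing the Richmond cohort arithmetic from the rounded
# percentages in the text.
richmond_cohort = 1476

# ~18-point gap between the state and Richmond actual graduation rates
state_gap_students = round(0.18 * richmond_cohort)

# 10.6-point gap between Richmond's "On-Time" and federal rates
ontime_gap_students = round(0.106 * richmond_cohort)

print(state_gap_students, ontime_gap_students)
```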

An analysis by degree type shows Richmond’s shortage of advanced diplomas and excess of nonstandard degrees. 


The Richmond rate decreased this year.


Note:  The estimable Carol Wolf reminds me that all the Richmond graduation rates are bogus in that the Maggie Walker students are reported at high schools they do not attend.

Maggie What?

The Times-Dispatch reports this morning that US News & World Report ranks Community and Open 9th and 10th in Virginia based on college readiness.

First place in the state is Fairfax County’s TJ (actually located in Alexandria!), a Governor’s School.

Absent from the list is Maggie Walker, also a Governor’s School.

“What?” you say!  Maggie Walker is a public high school for high-ability students, issues diplomas to its graduates, is governed by a school board comprised of representatives from twelve local school systems, and is accredited as a “school.” 

But VDOE says it’s a “program,” not a “school.”  (While Governor’s School TJ in Alexandria is a “school.”)

AND, since MLW is not a school, the SOL scores of its students are reported to the high schools in their home districts [note: old document; today Pearson surely reports the scores directly], even though they do not attend those schools.

Do you suppose the feds know that VDOE is lying to them about the SOL scores of our local high schools?  Do you care that VDOE brokered this corrupt deal so the local Superintendents would let their bright kids go to MLW without lowering the SOLs of the local high schools?

BTW: The Daily Beast has better information: In ‘14 they ranked MLW #12 public school in the nation.

Why Does VDOE Use Biased Data to Accredit Our Schools?

VDOE has an elaborate scheme to accredit (or not accredit) Virginia’s schools.  The basis is SOL pass rates (plus, for high schools, the graduation rate that depends on passing at least six end-of-course SOL tests).

But we know that the SOL is influenced by economic status.  For example, here are the 2015 reading pass rates by division vs. the percentage of economically disadvantaged students in the division.

We’re not here to discuss whether this correlation suggests that more affluent families live in better school districts, whether their children are better prepared for school, whether their children have higher IQs, or whatever.  The point here is that more affluent kids will show better SOL scores than less affluent students.
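The strength of that relationship is easy to quantify. Here is a sketch, with invented division-level numbers standing in for the real ones, of the Pearson correlation such a plot displays:

```python
# Illustrative sketch (invented division-level numbers, not the 2015 data)
# of the negative correlation between pass rates and the percentage of
# economically disadvantaged students.
def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pct_disadvantaged = [15, 25, 35, 45, 55, 65, 75]
reading_pass_rate = [88, 84, 81, 78, 74, 70, 66]

r = pearson_r(pct_disadvantaged, reading_pass_rate)
print(round(r, 3))  # strongly negative: rates fall as disadvantage rises
```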

That’s only part of the problem with accreditation.  VDOE adjusts (I would say “manipulates”) the accreditation data in secret ways that mostly boost the scores.  In one case, that manipulation converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

So it’s no surprise that VDOE has not used, and now is abandoning, a measure of student progress that is insensitive to economic advantage or disadvantage and that might even be resistant to manipulation, the Student Growth Percentile (“SGP”).

VDOE says:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.
A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.
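The quoted definition is easier to see in code. Here is a deliberately simplified sketch with invented scores: real SGPs come from quantile regression, but bucketing students by prior score and ranking current scores within each bucket captures the idea of progress measured against similar-achieving peers:

```python
# Simplified illustration of the growth-percentile idea quoted above.
# Real SGPs use quantile regression; this bucketed version, with invented
# scores, just shows "rank against students with similar prior scores."
from collections import defaultdict

def growth_percentiles(students, bucket=50):
    """students: list of (prior_score, current_score) pairs.
    Returns one percentile per student, computed within prior-score buckets."""
    groups = defaultdict(list)
    for i, (prior, current) in enumerate(students):
        groups[prior // bucket].append((current, i))
    pct = [0.0] * len(students)
    for members in groups.values():
        members.sort()  # rank by current score within the bucket
        n = len(members)
        for rank, (_, i) in enumerate(members):
            pct[i] = 100.0 * rank / max(n - 1, 1)
    return pct

students = [(400, 420), (410, 480), (405, 450),   # similar prior scores
            (550, 560), (560, 600)]               # higher-prior group
print(growth_percentiles(students))
```

Note that a low-scoring student can still earn a high percentile, which is exactly the property that makes the measure insensitive to economic status.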

VDOE calculated SGPs in reading, math, and algebra for at least three years, ending in 2014. Then they abandoned the SGP for a new measure that looks to be coarser than the SGP. 

VDOE says that the new measure might be useful in the accreditation process because it allows “partial point[s] for growth,” i.e. another way to boost the scores.  There is no mention of sensitivity to economic disadvantage.

How about it, VDOE?  Does your dandy new measure of progress cancel the advantage of the more affluent students?  And if it does, will you use it to replace the SOL in the accreditation process?

Maggie What?

The Times-Dispatch this morning reports that Open and Community high schools have been rated among the top ten schools in Virginia.

That’s no surprise.  Except perhaps as to math and science, both schools do an outstanding job.



The surprise is that Maggie Walker did not make the list.

The MW Web site [School Profile page] tells us that the Daily Beast ranked Walker 12th best public high school in the nation on August 27, 2014.  Yet, if you go to the VDOE Web site you won’t even find SOL scores for Walker.

Ask VDOE about this and they’ll tell you something like what they told me:

Governor’s Schools are regional centers or programs and do not report membership [what you and I would call enrollment] to the Virginia Department of Education (VDOE). Students who attend these programs are included in the average daily membership (ADM) of the schools they otherwise would have attended in their home division. Only schools that report membership to VDOE are assigned a school number.

The assessments of students attending regional centers and programs are coded with the school numbers of the schools where these students are included in membership. This is a long-standing practice that pre-dates the Standards of Learning (SOL) program.

Note, however, that this is not true for Fairfax County’s Thomas Jefferson, the Governor’s School for Science and Technology in Northern Virginia.



In short, Maggie Walker, a four-year, full day, public high school that issues diplomas to its graduates, is not a “school.”  The SOL (and other) scores of the MW students are falsely reported at the high schools in those students’ home districts.  So, of course, if you look to the official SOL data, MW does not exist.

Do you wonder why I call VDOE the “State Department of Data Suppression and Manipulation”?

Accreditation and Not

VDOE has updated its 2015-16 Accreditation data.

I earlier discussed their byzantine, opaque process for accrediting Virginia’s public schools.

Well, some of those schools.  You won’t find a rating for Maggie Walker, for instance, because VDOE counts the scores of the MW students at high schools they don’t attend.  And that just scratches the surface of the “adjustments” that boosted the accreditation scores this year by 6.1%.

This year they made it even easier to avoid “Accreditation Denied” by relabeling some denied schools as “Partially Accredited: Approaching Benchmark-Pass Rate,” “Partially Accredited: Improving School-Pass Rate,” or “Partially Accredited: Reconstituted School,” among others.

Adjustments or not, relabeling or not, VDOE could not get entirely away from Richmond’s second-from-last place performance on the reading tests or its sixth-from-last place on the math tests.  The initial accreditation results showed Richmond with 37.8% accredited, v. 77.6% of the state’s schools and, more to the point here, with 15.6% “To Be Determined.”  VDOE now has acted on five of the seven Richmond TBDs, which bumps the Accreditation Denied rate from 4.4% to 8.9% and the Reconstituted rate from zero to 6.7%.  Here are the new data:




Note that one of the new schools is Elkhardt/Thompson; the relabeling converts Thompson’s earlier “Denied” rating into “New School.”  All told, 53% of the Richmond schools were warned or denied accreditation this year with two schools still TBD and another failed middle school, Thompson, hiding in the definitional weeds.

The keel of this sinking ship is the middle schools: King denied; Hill improving;  Binford, Brown, and Henderson reconstituted; Elkhardt/Thompson new and camouflaging the denied Thompson.  Only Franklin, which includes both middle and high school grades, is fully accredited.

Data are here.

Graduating (and Not)

The RT-D this morning reports that Virginia’s on-time graduation rate of 90.5% “tops” the national average of 82%.

The RT-D is mixing apples and pomegranates.  They are comparing national cohort data for 2014 with Virginia “on-time” data from 2015.

The Virginia “on-time” rate is a fiction, generated by VDOE to  inflate the actual rate.  The actual 4-year cohort Virginia rate in 2015 was 86.7%.

Even so, that’s Virginia.  This is Richmond.  The (awful) Richmond rate actually dropped this year.


The 2015 cohort also had 167 dropouts in Richmond, 11.8% of the cohort. 

The enrollment pattern by grade gives a more nuanced picture of the huge numbers of students Richmond loses to dropouts and to families who move to the much better schools in the Counties.


Lies, Damn Lies, and Accreditation “Adjustments”

On Tuesday, the Governor announced a “10-Point Increase in Fully Accredited Schools.”  As Jim Bacon quickly pointed out, some part of that increase must be due to the newly-allowed retakes that boosted pass rates by about four percent. 

Then we have the “adjustments.”  VDOE acknowledges that it fiddles the numbers:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

That falls considerably short of earlier admissions.  Indeed, we know that earlier “adjustments” converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

In any case, the process is opaque.  About all we can do is compare the “adjusted” pass rates with those reported in the SOL database (that already includes the 4% retake boost).  I have a modest example here.

For the 1774 schools that appear in both databases (see below for the missing 49), the “adjustments” increase the math pass rates:


Excel is happy to fit curves to these data.  For the fitted curves, the actual mean is 82.4, the “adjusted” mean is 84.6.
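Excel’s fitted curve here is just a normal density parameterized by the sample mean and standard deviation. A sketch with invented rates (the real reported means are 82.4 and 84.6, a 2.2-point shift; here a uniform 2-point boost shows the mechanics):

```python
# Sketch (invented pass rates, not the 1774-school data) of the curve fit
# Excel performs: estimate mean and standard deviation, then draw the
# corresponding normal density. A uniform boost shifts the curve rightward
# by exactly the mean adjustment.
import math

def fit_normal(samples):
    """Return the sample mean and (population) standard deviation."""
    n = len(samples)
    mu = sum(samples) / n
    sigma = (sum((x - mu) ** 2 for x in samples) / n) ** 0.5
    return mu, sigma

def normal_pdf(x, mu, sigma):
    """Density of the fitted normal curve at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

actual = [70, 75, 80, 82, 85, 88, 92]    # invented pass rates
adjusted = [r + 2 for r in actual]       # uniform 2-point "adjustment"

mu_act, sd_act = fit_normal(actual)
mu_adj, sd_adj = fit_normal(adjusted)
print(mu_adj - mu_act)  # the fitted curve shifts right by the mean adjustment
```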

All this produced a nice increase in the number of schools that made the 70% cutoff:


VDOE writes the tests; they can make them as hard or easy as they wish.  Yet they indulge in this byzantine, opaque process.  And then they brag about the fudged results.

Moreover, there’s a problem with the data.

Data Problem

In juxtaposing the Accreditation and SOL data, I had to make sure that the school names in both lists were aligned.  In many cases they were not.  So I spent a rainy afternoon yesterday getting the lists to match.

To accomplish that, I dealt with dozens of cases where the SOL database had a space after the school name but the accreditation list did not (Ask Excel to compare two strings and it really compares them).  As well, I had to deal with cases such as a Norfolk school that was “Mary Calcott Elementary School” in one list and “Mary Calcott Elementary” in the other.  Beyond those minor issues, I had to remove 48 schools that were in the accreditation list but not in the SOL database.
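The matching problems described above yield to a little name normalization. A sketch, using the Calcott example from the text:

```python
# Sketch of the name cleanup that rainy afternoon required. Excel's exact
# string comparison treats "Mary Calcott Elementary School " (trailing
# space) and "Mary Calcott Elementary" as different schools; trimming
# whitespace and the optional "School" suffix before comparing catches both.
def normalize(name: str) -> str:
    """Canonicalize a school name for list matching."""
    name = " ".join(name.split())          # collapse and trim whitespace
    if name.endswith(" School"):
        name = name[: -len(" School")]     # drop the optional suffix
    return name.casefold()                 # ignore case differences

assert normalize("Mary Calcott Elementary School ") == normalize("Mary Calcott Elementary")
```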


(You might notice that 1774+48=1822, which is one short of the 1823 reported by VDOE.  I had to move these by hand and perhaps I messed up a cut-and-paste operation.  I’m not sufficiently invested in this to spend another afternoon trying to figure out who’s missing.)

We are left to wonder how they calculated “adjusted” pass rates for these schools that apparently had no pass rates.

I also had to remove twelve schools from the SOL report that were not in the accreditation list.


At least the two Richmond schools here make some sense: Elkhardt and Thompson were combined into a single school this year.  We are left to wonder why their pass rates were reported separately but they got accredited jointly,* and what happened to the accreditations of the other schools in this list.

As a more global matter, we are left to speculate why they fudge these data.  And how they do it.  And what other ways the data are screwed up.

Oh, and if one secret process for manipulating the data were not enough, we have another: the federal count of schools and divisions that met or failed to meet their Annual Measurable Objectives (aka “AMO’s,” of course).  The only thing to be said for this further waste of taxpayer dollars is that it may be more honest: 51.5% of Virginia schools flunked.



*Actually, we know the answer, at least as to the latter: The combined Elkhardt-Thompson is a “new school,” so it got a bye on accreditation.  The joint accreditation thus solved the problem of Thompson, which was denied accreditation last year.

Lying by Telling the Truth

Monday, His Excellency Arne Duncan touted “a continuing upward trend in graduation rates.”  USDOE has a Press Release to the same effect.

What Duncan and USDOE neglected to mention was that the NAEP long-term data do not show improvements in reading or math scores of seventeen-year-old students.



Duncan is not dumb, so he must be deliberately overlooking the obvious conclusion, which is nothing to brag about:

Easier Grading = Higher Graduation Rate

Your tax dollars at “work.”