No SATisfaction

VDOE has posted the 2015 Virginia average SAT scores.  As of today (9/11/15), RPS has not. 

While I was looking for scores, I found a list of SAT scores for State universities in Virginia.  The site does not date those.  Given that the scores do not change a lot from year to year, I thought it might be interesting to juxtapose the university data with the 2014 Richmond scores.

Here, then, are the 25th and 75th percentile reading scores of students entering our public universities, along with the state and Richmond averages for college-bound students as reported by RPS:


Notice that this is an apples-and-oranges comparison.  That said, the state average for college-bound students is close to the 25th percentile scores of entering students at Mason, VMI, and Christopher Newport.  The Richmond average is fifty points below the 25th percentile at Longwood.

And here are the math scores:


Requiem for the VGLA

I have written at length about Richmond’s abuse of its students with disabilities in order to improve SOL scores.  The 2015 data help complete a post mortem on that outrage so here is one last set of data.

Starting in 2005, VDOE allowed the divisions to offer a home-brewed (and, most importantly, locally graded) alternative test, the VGLA, for students who could perform at grade level but whose disability interfered with taking the written SOL tests.

You might reasonably think that only a few kids in any division would need that accommodation.  In fact, the use of the VGLA mushroomed.  One teacher quoted her director as to the reason: “My dog could pass VGLA.”

In 2009, the Superintendent in Buchanan County admitted he had “encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets.”  Instead of firing that Superintendent for flagrant cheating and child abuse, VDOE merely required him to write a “Corrective Action Plan.”

Indeed, despite having a computer full of data showing abuse of the VGLA in Richmond and elsewhere, VDOE remained deliberately ignorant of the problem until the introduction of HB304 in the 2010 General Assembly, requiring a specific justification for every child taking the VGLA.  At that point, the State Superintendent became “concerned” and VDOE promulgated new math tests (2012) and reading tests (2013) that eliminated the VGLA except for some ESL students.

The new tests were tough; they reduced pass rates in most divisions. 


The disappearance of the VGLA also had a dramatic effect on the pass rates for students with disabilities; the effect in Richmond was even more dramatic.  The data here are pass-rate ratios: the red curve is the pass rate of Richmond students with disabilities divided by the state average for students with disabilities; the blue is the same ratio for students without disabilities.
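The ratio in these curves is just one pass rate divided by another.  A minimal Python sketch, using made-up pass rates rather than the actual VDOE numbers:

```python
# Ratio of a Richmond pass rate to the corresponding statewide pass rate.
# The rates below are illustrative placeholders, not VDOE data.

def performance_ratio(richmond_rate, state_rate):
    """Richmond pass rate divided by the statewide pass rate.

    Above 1.0 means Richmond outperformed the state average;
    below 1.0 means it underperformed.
    """
    return richmond_rate / state_rate

# Hypothetical VGLA-era pair: 52% in Richmond vs. 45% statewide
print(round(performance_ratio(52.0, 45.0), 2))  # 1.16 (outperforming)

# Hypothetical post-VGLA pair: 65% in Richmond vs. 78% statewide
print(round(performance_ratio(65.0, 78.0), 2))  # 0.83 (underperforming)
```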


Here we see Richmond’s students with disabilities outperforming their peers statewide(!), while Richmond’s students without disabilities underperformed.  Then came the new reading tests in 2013 and Richmond’s students with disabilities had the locally-graded VGLA snatched away and replaced by the same test everybody else was taking.  The Richmond students with disabilities suddenly underperformed their peers statewide by even more than Richmond’s students without disabilities were underperforming.

The math scores show the same effect.


The important, and shameful, outcome here is that no division Superintendent and nobody at VDOE went to jail or even was fired.  After the General Assembly put up a big Stop sign, VDOE merely changed the system.  And the bureaucrats in Buchanan and Richmond and, doubtless, elsewhere who were violating the public trust were left to think up new ways to abuse the children in their care.

And the kids who were harmed by this cynical and disgraceful episode were left to fend for themselves.

Your tax dollars at work.

Richmond Schools by Grade

The wonderful VDOE database can give us pass rates by test subject by grade.  For a single school (here, Westover Hills), the data look something like this:



And here is Lucille Brown:



But there’s no way to fit all the data on one graph, or even on a few.  So I have posted the spreadsheet.

After you open the spreadsheet, select the reading or math tab and follow the instructions there.  Please let me know if you have a problem: john {at} crankytaxpayer [dot] org.

The spreadsheet is here.

High Schools

Finally, here are the End of Course pass rates at Richmond’s high schools for the reading and math tests:



Notice the huge decrease in scores caused by the new reading tests in ‘13 and the appalling drop attending the new math tests in ‘12.

These data must be taken with a front end loader full of salt because of Richmond’s remarkably high retest rate that VDOE kept hidden until a judge made them cough up the 2014 SGP data.  Here, for example, are the 2014 algebra I retest counts in Richmond.

My analysis of those data showed that the retests increased the SOL scores of the students involved by an average of 24.6 points on a test where 400 was passing (i.e., by 6.1% of the passing score).
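The arithmetic behind that percentage is straightforward, using the numbers above:

```python
# The 6.1% figure is the average retest gain expressed as a share of
# the 400-point passing score.
average_gain = 24.6    # average SOL score increase from retesting
passing_score = 400    # minimum passing SOL score

boost_pct = 100 * average_gain / passing_score
print(f"{boost_pct:.2f}%")  # 6.15%, i.e., about 6.1% of the passing score
```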

This year, for the first time, retests were available for elementary and middle school students.  Even with that boost (about 4% according to VDOE), the Richmond middle school scores, and far too many elementary school scores, remained at appalling levels.

Richmond Middle Schools

Here are the average reading pass rates, by year, of the Richmond middle schools.


Recalling that the Accreditation Benchmark for reading is 75%, we see our highest scoring middle school five points below that mark.

Recalling further that Thompson was the only Richmond school denied accreditation last year, and noticing that Thompson almost looks good in comparison to King and Henderson, we see further evidence of VDOE’s wholesale cooking of the accreditation data.

The math data paint a similar picture.


The Accreditation Benchmark here is 70% and only Hill comes even close.  Note also that the “accreditation denied” Thompson has whipped the “accredited with warning” Elkhardt, Henderson, and King for all four years of the new math tests.

It should be a crime to subject any child to any of these schools.  It is an embarrassment.

Richmond Elementary Schools

Looking further into the SOL pass rates, here are the reading data for Richmond’s elementary schools by year:


That graph is much too complicated but it does contain all the data, plus the Richmond average.  Also, the graph does illuminate the disastrous effect of the new tests in 2013 and the relative abilities of some schools to weather the change (notably Carver(!), Munford, and Fox, and, with a different pattern, Fairfield) and the gross failures of others (notably Mason, Oak Grove, and Woodville).  Doubtless this tells us a lot about the principals of those schools, especially in light of the more challenging student populations at Carver and Fairfield.

Focusing on the high and low performers, plus our neighborhood school (with the State average for grades 3-5 added) gives a more readable picture.


The math scores paint a similar picture, except that the Big Hit from the new tests came a year earlier.



More on Economic Disadvantage vs. SOL Pass Rates

I earlier showed that Richmond’s dismal pass rates on the reading and math SOLs are not explained by Richmond’s large population of economically disadvantaged students.

Drilling further into the data, here are the twenty-eight divisions with the largest populations of economically disadvantaged students:


If we plot the pass rates vs. the % ED for these divisions, we obtain:



The gold points are Richmond; the red are Newport News, on the left, and Norfolk.  The high scorer there on reading is Norton; the high scorer on math is Highland.

To the point here, Richmond did not outscore any of these divisions on the reading tests and did better in math than only three: Brunswick, Martinsville, and Cumberland.  Said otherwise, all twenty-seven other divisions, with economically disadvantaged populations ranging from similar to much larger, are getting better pass rates in reading, and twenty-four of the twenty-seven are outpacing Richmond in math.

Poverty is not an excuse for the awful performance of our school system.

Lies, Damn Lies, and SOL Scores

Both VODE VDOE [blush] and the Governor are out with press releases bragging on the “significant progress” (or significant improvement) in SOL pass rates.  To their credit, both then acknowledge that the five point increases in the reading and math pass rates come in the face of four point score boosts from the newly installed retakes.

VDOE has not released data that would let us examine the retake boost in any detail.  Using their four-point number, the “significant progress,” 4.6% in reading and 5.2% in math, looks more like minimal progress: 0.6% in reading, 1.2% in math:
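The adjustment is simple subtraction; a quick check of those numbers:

```python
# Reported pass-rate gains, adjusted for the ~4-point retake boost
# that VDOE itself acknowledges.
reported_gain = {"reading": 4.6, "math": 5.2}
retake_boost = 4.0  # VDOE's estimate of the boost from the new retakes

adjusted = {subject: round(gain - retake_boost, 1)
            for subject, gain in reported_gain.items()}
print(adjusted)  # {'reading': 0.6, 'math': 1.2}
```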


As Omar might (not) have written:

The moving target moves; and having moved,

Moves on:  nor all thy piety nor wit

Shall lure it back to give an honest answer

Nor all thy tears wash away the bureaucrats’ obfuscation.

Anatomy of a Lousy Performance: SOLs by School

Here are the 2014 and 2015 Reading pass rates by Richmond school:




You may recall that the accreditation benchmark for reading is a 75% pass rate.  VDOE cooks the accreditation numbers so thoroughly that the 75% criterion may be interesting as a rule of thumb but it is meaningless as to which schools actually get accredited.  You’ll notice that none of the mainstream middle schools and far too few of the elementary schools made 75% this year.  Indeed, King went from unspeakably bad to worse, never mind anything to do with 75%.

For another, perhaps more useful, measure, the statewide average reading pass rate was 79.0%.

Franklin has both middle and high school grades so I’ve included it in both lists, although its scores can’t be directly compared to either.

Carver continued its spectacular performance, leading among the (only) six elementary schools to beat 75%. 

Next the math data.  Recall that the accreditation criterion is 70%.  The state average pass rate this year was 79.4%.




Notice the decreases at Open and Marshall, as well as the uniformly miserable pass rates of the middle schools.  Note the several elementary schools doing pretty well, led again by Carver. 

SOL v. Cost

Table 13 in the Superintendent’s Annual Report lists annual disbursements by division.  Unfortunately, we only have the 2014 data; the current data ordinarily don’t come out until the following Spring.

Deleting the facilities, debt, and contingency entries, and juxtaposing the resulting disbursement totals with the 2015 Reading SOL Pass rates, produces the following graph.


Richmond is the gold square.  The red diamonds are, from the left, Hampton, Newport News, and Norfolk.  Thus we see the comparable old, urban jurisdictions performing poorly at about average cost while Richmond’s reading performance is much worse at a much higher cost.

The datum up there at $11,127, 23% less expensive than Richmond, is West Point, with an 87.8% pass rate.

The R² value of 2.3% tells us that, among the Virginia school divisions, reading performance and cost per student are essentially uncorrelated.
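For readers who want to reproduce that sort of number, R² can be computed directly from the (cost, pass rate) pairs.  A minimal sketch, using made-up figures rather than the actual Table 13 and SOL data:

```python
# Coefficient of determination (R²) between per-student cost and pass rate.
# The five (cost, rate) pairs below are illustrative, not the real data.

def r_squared(xs, ys):
    """R² for a simple linear fit of ys on xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

costs = [9500, 10200, 11100, 12800, 14300]   # $ per student (made up)
rates = [78.0, 81.5, 74.0, 80.0, 76.5]       # reading pass rates (made up)

print(round(r_squared(costs, rates), 3))  # near zero: essentially uncorrelated
```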

The math data paint a similar picture.


The division pass rates again fail to correlate with expenditure. 

The point up top ($11,127, 89.0%) is West Point, again.

These data say, quite clearly, that Richmond’s education establishment should stop whining about money and start educating the City’s children.