Put ‘Em On a Bus to West Point

We have seen that school divisions with higher poverty rates score less well on the SOLs.  For example, looking at the 2015 reading pass rates by division v. the percentage of students classified as economically disadvantaged, we see:


We also have seen that spending more money per student does not correlate with higher pass rates.  For instance:


Note: The data here are 2014 disbursements (the 2015 data won’t be available until sometime this Spring), with disbursements for facilities, debt service, and contingency reserve removed.

To remove the effect of economic disadvantage, we use the fitted trendline for the first graph to normalize the scores (i.e., express the pass rates as percentages of the trendline rates).  That produces the following graph for the reading tests.


It turns out that even the normalized pass rates fail to correlate with expenditures per student.
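The normalization described above is just an ordinary least-squares trendline plus a ratio.  Here is a minimal Python sketch; the division data in it are made up for illustration, not the actual 2015 SOL numbers:

```python
# Normalize pass rates against the fitted pass-rate-vs.-poverty trendline.
# The division data below are hypothetical; the real analysis uses the 2015
# reading pass rates and VDOE's economically disadvantaged percentages.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def normalize(pct_ed, pass_rate, slope, intercept):
    """Express a pass rate as a percentage of the trendline rate."""
    predicted = slope * pct_ed + intercept
    return 100.0 * pass_rate / predicted

# Hypothetical divisions: (% economically disadvantaged, reading pass rate)
data = [(20, 85), (40, 78), (60, 72), (80, 65)]
slope, intercept = fit_line([ed for ed, _ in data], [rate for _, rate in data])

for pct_ed, rate in data:
    # Values near 100 sit on the trendline; above 100 means the division
    # outperforms what its poverty rate alone would predict.
    print(round(normalize(pct_ed, rate, slope, intercept), 1))
```

A normalized value of 100 means the division scored exactly what the trendline predicts for its poverty rate; that ratio is what gets plotted against spending.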

BTW: Richmond is the gold square on the graph; the red diamonds are, from the left, Hampton, Newport News, and Norfolk.  The yellow diamond outperforming divisions are, from the left, Norton, Wise, West Point, and Highland.

Here is the same graph for the math tests:


The gold square again is Richmond; the red diamonds again are Hampton, Newport News, and Norfolk.  The yellow diamonds are Wise, West Point, and Highland.

Again, with the average effect of poverty removed, something other than money explains the differences in the performance of the divisions.  Whatever that “something” is, Richmond does not know it, or at least does not practice it.

To me, these data suggest that we should put the Richmond Superintendent and School Board on a bus and send them to Wise and West Point for a week each.  And tell them to stop whining about money.  Whatever the solution to our (HUGE) problem may be, it does not come with a dollar sign.


The Joint Legislative Audit and Review Commission (JLARC) has just published its draft report “Efficiency and Effectiveness of K-12 Spending.”  Unfortunately, that report does not even look carefully at where Virginia is spending its educational dollars, much less answer the (much harder) question of what we are getting for that money.

The Mandates

The General Assembly gave JLARC two decently clear mandates.

SJR328 (2013): JLARC shall

study the efficiency and effectiveness of elementary and secondary school spending in Virginia.  [It] shall (i) study the efficiency and effectiveness of elementary and secondary school spending in Virginia, including evaluating the findings from School Efficiency Reviews and assessing the extent to which recommendations have been implemented; (ii) compare to other states how and to what extent Virginia funds elementary and secondary education; and (iii) identify opportunities to improve the quality of education students receive in consideration of the funds spent.

2014 Appropriation Act, Item 30 (at p. 62 of the link): JLARC to examine virtual instruction, to include “effectiveness of virtual schooling in terms of student academic achievement outcomes on assessment tests and course completion or graduation rates.”

The “Study”

The result is a 112-page draft that ignores the instructions of the General Assembly.

Of the nine recommendations, six talk about efficiency; half of those six deal with school buses; only one deals with anything that relates to education.  None tells us about the educational effectiveness of our school spending or how to improve it:

  1. Track teacher turnover.
  2. Provide facilities management expertise.
  3. Provide “guidance” regarding sharing information about facilities management best practices.
  4. Consider statewide contract for bus routing and monitoring software.
  5. Provide transportation management expertise.
  6. Assist with transportation management best practices.

As to virtual schooling, JLARC again avoids answering the question.  The three recommendations:

  1. Provide information about online schools.
  2. Estimate costs of online learning.
  3. Compare achievement of virtual v. physical schools.

That last one is particularly rich: JLARC is recommending that VDOE do what the General Assembly told JLARC to do.

Cranky’s Conclusion

This study is a wordy waste of money.  It does not answer the questions posed by the General Assembly.  Instead, it raises a new question: Why are we paying JLARC to not do what it’s been told to do?

A Reader’s Conclusion (added on 9/17)

A reader suggests an alternate (and more pertinent) conclusion: Why are we paying JLARC not to do what it’s been told to do, when we already are paying VDOE, which should be doing [what JLARC failed to do]?

New (Federal) College Data

USDOE has just posted a considerable trove of college data.

CAVEAT:  These data are mostly for students who received federal financial aid. 

  • “Average Annual Cost”: The average annual net price for federal financial aid recipients, after aid from the school, state, or federal government. For public schools, this is only the average cost for in-state students.
  • “Graduation Rate”: The graduation rate after six years for schools that award predominantly four-year degrees and after four years for all other schools. These rates are only for full-time students enrolled for the first time.
  • “Salary After Attending”: The median earnings of former students who received federal financial aid, at 10 years after entering the school.

My quick reading of the data does not disclose what fraction(s) of the student populations are represented here.

With that warning, here is a look at the Virginia public and not-for-profit colleges.  First the graduation rates:


The winners there are UVa in red, W&L in yellow, and W&M in green.

Next, the median salary ten years out:


W&L, in yellow, is the big winner here.

Finally, a bang/buck calculation, ((Salary * Graduation Rate) / Average Cost):


Colors, as before, are UVa in red, W&L in yellow.
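The bang/buck figure is simple enough to compute directly from the three College Scorecard measures.  A sketch, using hypothetical school figures rather than the actual USDOE values:

```python
# Bang/buck metric from the text: (salary * graduation rate) / average cost.
# The inputs below are placeholders, not real College Scorecard numbers.

def bang_per_buck(median_salary, grad_rate, avg_annual_cost):
    """Salary payoff, weighted by the odds of actually graduating,
    per dollar of average annual net cost.  Higher is better."""
    return median_salary * grad_rate / avg_annual_cost

# Hypothetical school: $60,000 median salary, 90% graduation rate,
# $15,000 average annual net cost.
print(bang_per_buck(60_000, 0.90, 15_000))  # 3.6
```

Weighting the salary by the graduation rate discounts a school’s payoff by the chance a student never finishes; dividing by net cost puts big and small price tags on the same scale.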

Here is the dataset, sorted by school name.


You might be interested in comparing these data with the results of the Brookings “value-added” study.

No SATisfaction

VDOE has posted the 2015 Virginia average SAT scores.  As of today (9/11/15), RPS has not. 

While I was looking for scores, I found a list of SAT scores for State universities in Virginia.  The site does not date those.  Given that the scores do not change a lot from year to year, I thought it might be interesting to juxtapose the university data with the 2014 Richmond scores.

Here, then, are the 25th and 75th percentile reading scores of students entering our public universities, along with the state and Richmond averages for college-bound students as reported by RPS:


Notice that this is an apples-and-oranges comparison.  That said, the state average for college-bound students is close to the 25th percentile scores of entering students at Mason, VMI, and Christopher Newport.  The Richmond average is fifty points below the 25th percentile at Longwood.

And here are the math scores:


Requiem for the VGLA

I have written at length about Richmond’s abuse of its students with disabilities in order to improve SOL scores.  The 2015 data help complete a post mortem on that outrage, so here is one last set of data.

Starting in 2005, VDOE allowed the divisions to offer a home-brewed (and, most importantly, locally graded) alternative test for students who could perform at grade level but whose disability interfered with taking the written SOL tests: the VGLA.

You might reasonably think that only a few kids in any division would need that accommodation.  In fact, the use of the VGLA mushroomed.  One teacher quoted her director’s explanation: “My dog could pass VGLA.”

In 2009, the Superintendent in Buchanan County admitted he had “encouraged the use of VGLA as a mechanism to assist schools in obtaining accreditation and in meeting AYP targets.”  Instead of firing that Superintendent for flagrant cheating and child abuse, VDOE merely required him to write a “Corrective Action Plan.”

Indeed, despite having a computer full of data showing abuse of the VGLA in Richmond and elsewhere, VDOE remained deliberately ignorant of the problem until the introduction of HB304 in the 2010 General Assembly, requiring a specific justification for every child taking the VGLA.  At that point, the State Superintendent became “concerned” and VDOE promulgated new math tests (2012) and reading tests (2013) that eliminated the VGLA except for some ESL students.

The new tests were tough; they reduced pass rates in most divisions. 


The disappearance of the VGLA also had a dramatic effect on the pass rates for students with disabilities.  The effect in Richmond was even more dramatic.  Data here are pass rates.  The red curve is Richmond students with disabilities divided by the state average of students with disabilities; the blue is the same ratio for students without disabilities.


Here we see Richmond’s students with disabilities outperforming their peers statewide(!), while Richmond’s students without disabilities underperformed.  Then came the new reading tests in 2013 and Richmond’s students with disabilities had the locally-graded VGLA snatched away and replaced by the same test everybody else was taking.  The Richmond students with disabilities suddenly underperformed their peers statewide by even more than Richmond’s students without disabilities were underperforming.

The math scores show the same effect.


The important, and shameful, outcome here is that no division Superintendent and nobody at VDOE went to jail or even was fired.  After the General Assembly put up a big Stop sign, VDOE merely changed the system.  And the bureaucrats in Buchanan and Richmond and, doubtless, elsewhere who were violating the public trust were left to think up new ways to abuse the children in their care.

And the kids who were harmed by this cynical and disgraceful episode were left to fend for themselves.

Your tax dollars at work.

Richmond Schools by Grade

The wonderful VDOE database can give us pass rates by test subject by grade.  For a single school, here Westover Hills, the data look something like this:



And here is Lucille Brown:



But there’s no way to fit all the data on one graph, or even on a few.  So I have posted the spreadsheet.

After you open the spreadsheet, select the reading or math tab and follow the instructions there.  Please let me know if you have a problem: john {at} crankytaxpayer [dot] org.

The spreadsheet is here.

High Schools

Finally, here are the End of Course pass rates at Richmond’s high schools for the reading and math tests:



Notice the huge decrease in scores caused by the new reading tests in ‘13 and the appalling drop attending the new math tests in ‘12.

These data must be taken with a front-end loader full of salt because of Richmond’s remarkably high retest rate, which VDOE kept hidden until a judge made them cough up the 2014 SGP data.  Here, for example, are the 2014 Algebra I retest counts in Richmond.


My analysis of those data showed that the retests increased the SOL scores of the students involved by an average of 24.6 points on a test where 400 was passing (i.e., by 6.1% of the passing score).
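For anyone checking that arithmetic, using only the figures quoted above:

```python
# Retest boost as a share of the SOL passing score.
avg_boost = 24.6      # average score increase from retesting
passing_score = 400   # passing cut score on the SOL scale

# 6.15 percent of the passing score, i.e., the roughly 6.1% in the text.
pct_of_passing = 100 * avg_boost / passing_score
print(pct_of_passing)
```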

This year, for the first time, retests were available for elementary and middle school students.  Even with that boost (about 4% according to VDOE), the Richmond middle school scores, and far too many elementary school scores, remained at appalling levels.

Richmond Middle Schools

Here are the average reading pass rates, by year, of the Richmond middle schools.


Recalling that the Accreditation Benchmark for reading is 75%, we see our highest scoring middle school five points below that mark.

Recalling further that Thompson was the only Richmond school denied accreditation last year, and noticing that Thompson almost looks good in comparison to King and Henderson, we see further evidence of VDOE’s wholesale cooking of the accreditation data.

The math data paint a similar picture.


The Accreditation Benchmark here is 70% and only Hill comes even close.  Note also that the “accreditation denied” Thompson has whipped the “accredited with warning” Elkhardt, Henderson, and King for all four years of the new math tests.

It should be a crime to subject any child to any of these schools.  It is an embarrassment.

Richmond Elementary Schools

Looking further into the SOL pass rates, here are the reading data for Richmond’s elementary schools by year:


That graph is much too complicated, but it does contain all the data, plus the Richmond average.  It also illuminates the disastrous effect of the new tests in 2013, the relative abilities of some schools to weather the change (notably Carver(!), Munford, and Fox, and, with a different pattern, Fairfield), and the gross failures of others (notably Mason, Oak Grove, and Woodville).  Doubtless this tells us a lot about the principals of those schools, especially in light of the more challenging student populations at Carver and Fairfield.

Focusing on the high and low performers, plus our neighborhood school (with the State average for grades 3-5 added) gives a more readable picture.


The math scores paint a similar picture, except that the Big Hit from the new tests came a year earlier.



More on Economic Disadvantage vs. SOL Pass Rates

I earlier showed that Richmond’s dismal pass rates on the reading and math SOLs are not explained by Richmond’s large population of economically disadvantaged students.

Drilling further into the data, here are the twenty-eight divisions with the largest populations of economically disadvantaged students:


If we plot the pass rates vs. the % ED for these divisions, we obtain:



The gold points are Richmond; the red are Newport News, on the left, and Norfolk.  The high performer there on reading is Norton; high score on math is Highland.

To the point here, Richmond did not outscore any of these divisions on the reading tests and did better in math than only three: Brunswick, Martinsville, and Cumberland.  Said otherwise, all the Virginia divisions with similar or much larger populations of economically disadvantaged students are getting better pass rates in reading, and twenty-four of the twenty-seven are outpacing Richmond in math.

Poverty is not an excuse for the awful performance of our school system.