The Times-Dispatch reports that “Richmond’s top officials spent their quarterly meeting again calling for more money for the city’s school system.” This abiding demand for more money ignores the more important question: What was RPS doing with the $335,290,809 it already had?
VDOE has some data on that.
The latest expenditure data are from 2017. (We are three months beyond the end of the 2018 session but VDOE won’t have the 2018 data until about the time we start seeing dandelions.)
These data show division expenditures for operations divided by the end-of-year average daily membership. A footnote to the spreadsheet tells us that “[o]perations include regular day school, school food services, summer school, adult education, pre-kindergarten, and other education, but do not include non-regular day school programs, non-local education agency … programs, debt service, or capital outlay additions.”
By that measure, Richmond is the 17th most expensive division per student:
Richmond is the yellow bar. The peer jurisdictions – Norfolk, Newport News, and Hampton – are the red bars.
We are spending $1,396 per kid more than the state average and $1,881 more than Norfolk. In terms of SOL scores, we get very little return for all that money.
Richmond is the gold square; the red diamonds are the peer jurisdictions, from the top Hampton, Norfolk, and Newport News. As a courtesy to my reader(s) there, the green circle is Lynchburg and the green diamond is Charles City.
The R-squared value for the least-squares fitted line tells us that the reading pass rate and the per-pupil expenditure are essentially uncorrelated.
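For anyone who wants to reproduce that kind of number: for a simple least-squares line, R-squared is just the square of the Pearson correlation. A minimal sketch in Python; the spending and pass-rate figures below are made up for illustration, not the actual VDOE data.

```python
# Illustrative R-squared calculation (hypothetical data, not VDOE's).
def r_squared(xs, ys):
    """Square of the Pearson correlation: the share of the variance in ys
    that a least-squares line on xs would explain."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy ** 2 / (sxx * syy)

# Hypothetical per-pupil spending ($) and division reading pass rates (%):
spending = [9000, 10000, 11000, 12000, 13000]
pass_rate = [78.0, 74.0, 80.0, 75.0, 79.0]

# An R-squared near zero says spending predicts almost none of the
# variance in pass rates.
result = r_squared(spending, pass_rate)
```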
The math data tell the same story.
“But wait!” you say. We know that the SOL is not a fair measure because “economically disadvantaged” students do not score as well as their more affluent peers.
Indeed. The Board of Education had a better measure, the Student Growth Percentile, that measured learning and did not correlate with poverty. They abandoned it, however, because it measured too well: It told us how well each teacher performed.
Denied the best data, all we can do is use the SOL numbers and a little algebra to offset the average effect of poverty.
On the reading tests in ‘17, the division percentage of “disadvantaged” students predicted about 35% of the variance of the division average SOL scores.
(You’ll notice that Richmond grossly underperformed even the fitted line.)
Let’s be generous and calculate an adjusted pass rate as if the correlation were 100%. (“Generous” for sure: Notice that some of the pass rates get pushed over 100%.)
The adjustment for Richmond’s 64.2% poverty rate boosts its rate nicely but only lifts its ranking to fourth from worst, up from second.
And, boosted or not, our scores are low and pricey:
The correction for math is slightly different (28.025% of the poverty rate v. 28.671% for reading) and the outcome is slightly worse (Richmond is third worst).
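The adjustment itself is just that bit of algebra. Here is a sketch of my reconstruction (not an official VDOE method), using the fitted-line slopes quoted above; the observed pass rate passed in is a placeholder, not an actual division score.

```python
# Poverty adjustment sketch. Slopes from the fitted lines in the post:
# each percentage point of "economically disadvantaged" enrollment
# predicts about 0.28671 points of reading pass rate (0.28025 for math).
READING_SLOPE = 0.28671
MATH_SLOPE = 0.28025

def adjusted_pass_rate(observed_rate, poverty_pct, slope):
    """Credit back the average effect of poverty, treating the
    correlation as if it were perfect."""
    return observed_rate + slope * poverty_pct

# Richmond's 64.2% poverty rate earns a reading boost of about 18.4 points.
reading_boost = READING_SLOPE * 64.2
```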
In short, poverty does not come close to explaining the awful performance of the Richmond Public Schools.
It would be good if our Leaders were to stop whining about wanting more money and start explaining why they get such lousy results with the very large amount of money they already are spending.