Richmond: More Money, Worse Schools

Our school Superintendent posted an op-ed in the Times-Dispatch complaining that:

Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

As set out in the previous post, Virginia’s high poverty divisions actually spend on average more per pupil than the more affluent divisions.  Richmond, the gold square on the graph, spends a lot more than average; indeed, it is the tenth biggest spender (of 132).

Table 12 in the (State) Superintendent’s Annual Report permits a look into the sources of those funds that the Richmond schools are spending.  As before, the most recent data in the state report are from 2017. 

The table breaks out division receipts by source:

  • State Sales and Use Tax (1-1/8% of the sales tax receipts);
  • State Funds (appropriated by the Generous Assembly);
  • Federal Funds (direct federal grants plus federal funds distributed through state agencies);
  • Local Funds (local appropriations);
  • Other Funds (private sources, food service receipts, transportation revenues, sale of assets & supplies, rebates and refunds, and receipts from other agencies); and
  • Loans, Bonds, etc. (Literary Fund loans, bonds, interest earned).

Of these, the local, state, and federal fund sources predominate.

Let’s start with a graph of the state and federal funds per student vs. the division percentage of economically disadvantaged (“ED”) students:


The immediate lesson here is that Superintendent Kamras is simply wrong about high-poverty schools being starved for outside funds: The sales tax funding is essentially flat v. % ED while the state appropriations and the federal funding increase with increasing % ED.

Richmond, the gold squares, is a bit low in terms of state funding, but that deficit is offset (and then some) by federal money.  Richmond’s sales tax funding, $1,029 per student, is hidden in the forest of other schools, almost exactly on the fitted line.

Only in local funding can we find any hint of support for the notion that divisions with more poor students receive less money.


Of course, it is no surprise that less affluent jurisdictions might provide fewer funds to their school systems.  For the most part, they have less money to spend on any governmental operation. 

(But when all the fund sources are added in, the spending on education increases with increasing populations of disadvantaged students: See the graph at the top of this post.)

Kamras’ own division, with a 64% ED population in its schools, nonetheless came up in 2017 with $2,249 in local funds per student more than that fitted line would predict. 

In summary, Richmond schools received LOTS of money in these categories:


[“Predicted” values here are calculated from the Richmond % ED and the fitted lines in the graphs above.]

So, when he says “The students who should be getting more are actually getting less,” our Superintendent is wrong.  And, even more to the point, Kamras’ own schools are enjoying much more than average financial support.

The Kamras op-ed is a bald attempt to excuse the ongoing failure of the Richmond public schools to educate Richmond’s schoolchildren.  For example, on the 2018 reading pass rates:

The excuse is contradicted by reality: Those Richmond schools are swimming in money.  Even more to the point, the performance of Virginia school divisions is unrelated to how much money they spend.  For example (here in terms of the per pupil expenditure for operations):


It would be helpful for our Superintendent to turn his energy to improving the performance of our schools and to stop misleading the public about the reasons those schools are so ineffective.

If At First You Don’t Succeed, Change the Subject

On Sunday, The Times-Dispatch published an op-ed in which Superintendent Kamras decried “institutional racism” and suggested that the first step to deal with it would be more money for divisions with higher poverty rates.  Such as Richmond, of course.

Kamras wrote:

According to the National Center on Education Statistics, Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

The estimable Jim Bacon pulled some data from the VDOE Web site to question that proposition.  The full dataset makes an even stronger case than the one Bacon argued.

The Superintendent’s Annual Report includes at Table 13 a list of division disbursements.  The latest data there are from 2017.  The VDOE Web site also sports a (very nice) database that provides “membership” data by race and by “disadvantage” (primarily free and reduced lunch numbers), inter alia.

Excel is happy to juxtapose the datasets on a graph.  Let’s start with economic disadvantage (“ED”).


Note:  The disbursement data here are division totals, less spending for facilities, debt service, and contingency reserve.

The fitted line suggests that divisions with more ED students spend more per student ($244 for a 10% increase in ED enrollment) but the 2.3% R-squared value tells us the two variables are very weakly correlated.
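The slope and R-squared reported here come from an ordinary least-squares fit; the computation can be sketched in a few lines of Python. The numbers below are made up for illustration (the real figures come from Table 13 and the VDOE membership database), chosen to have the same shape as the real data: a positive slope with a tiny R-squared.

```python
from statistics import mean

# Hypothetical division data: % ED enrollment vs. per-pupil disbursement.
# (Illustrative numbers only; the real figures are in VDOE Table 13.)
ed_pct = [20, 35, 45, 50, 55, 60, 70, 80]
per_pupil = [12100, 10900, 12600, 11200, 13000, 11400, 12800, 11900]

mx, my = mean(ed_pct), mean(per_pupil)
sxx = sum((x - mx) ** 2 for x in ed_pct)
sxy = sum((x - mx) * (y - my) for x, y in zip(ed_pct, per_pupil))
syy = sum((y - my) ** 2 for y in per_pupil)

slope = sxy / sxx                    # dollars per one-point increase in % ED
r_squared = sxy ** 2 / (sxx * syy)   # fraction of variance the line explains

print(f"${slope * 10:,.0f} per 10-point ED increase; R^2 = {r_squared:.1%}")
```

With these made-up numbers the slope is positive but the R-squared is only a few percent — the same story as the real data: a slight upward tilt that explains almost none of the division-to-division variation.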

In any case, there is no pattern here of deprivation of those divisions with large ED populations.  Quite the contrary, most of the Big Spenders are high poverty divisions. 

As well, we see Richmond, the gold square, spending lots of money.  Indeed, Richmond is the tenth biggest spender among the 132 divisions.


We can argue about the reasons for the lousy performance of Richmond’s public schools, but lack of money is not a candidate.

Of course, Kamras was talking about race, not poverty.  There is no need here to accept his undocumented melding of those two factors; the VDOE database also has the division membership by race.


Note: Bland and Highland are absent from this dataset, presumably because their black enrollments are small enough to trigger the suppression rules.

Again, the slope is in the wrong direction for the Kamras complaint ($275 increase per 10% increase in the black student population).  And this time the 7.7% R-squared value hints more strongly at a correlation.

The absence of Highland County from the list moves Richmond up to ninth place (of 130 divisions).


Bottom line: The VDOE data contradict Kamras’ claim that divisions with more poor or more black students are underfunded vis-à-vis the other divisions.

More fundamentally, Kamras’ jeremiad about funding overlooks the abundant data that show no relationship between division funding and SOL performance.  Money is not the problem in Richmond’s schools; lousy schools are the problem.

Looking at the study he cites, it appears that Kamras is complaining about the funds Virginia schools receive [Table 7] in an arbitrary grouping, not what each division spends.  Whatever that study may mean, it cannot contradict the Virginia data that show Richmond and other high-poverty (and high black student percentage) divisions spending about as much money as their more affluent peers.  And, for sure, those school systems can’t spend more than they receive.

Indeed, Kamras’ division is spending a lot of money and getting lousy results.  It would be helpful for our Superintendent to spend more energy improving the performance of his schools and less on misleading the public about the reason those schools are so awful.

Graduation Rates: Official Fiction

The estimable Carol Wolf sent me the link to an article reporting improving graduation rates of disabled students and asked whether that improvement was reflected in Virginia or Richmond.

We earlier saw that Virginia’s graduation rate has been increasing while the reading and math End of Course pass rates were falling.  That is, the Board of “Education” has its thumb on the statistical scale.  Per Carol’s inquiry, let’s take a closer look at the overall rates and delve into the rates for disabled students.

The current requirements for a “standard” diploma include six “verified” credits, two in English plus one each in math, a laboratory science, history & social science, and a student-selected subject.  To earn a verified credit, the student must both pass the course and pass the End of Course (“EOC”) SOL test “or a substitute assessment approved by the Board of Education.”

[Do you see the thumb there on the scale?]
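The requirement itself is simple enough to state as a check. A minimal sketch (the subjects and counts follow the text; the function and data shape are mine, not VDOE's):

```python
# Verified credits required for a "standard" diploma, per the text:
# two in English, one each in math, lab science, history & social
# science, and a student-selected subject -- six in all.
REQUIRED = {
    "English": 2,
    "Math": 1,
    "Lab Science": 1,
    "History & Social Science": 1,
    "Student-Selected": 1,
}

def meets_verified_credit_requirement(earned):
    """earned maps subject -> verified credits (course passed AND the
    EOC SOL test, or an approved substitute assessment, passed)."""
    return all(earned.get(subj, 0) >= n for subj, n in REQUIRED.items())

# A student one English verified credit short does not qualify:
student = {"English": 1, "Math": 1, "Lab Science": 1,
           "History & Social Science": 1, "Student-Selected": 1}
print(meets_verified_credit_requirement(student))  # False
```

The "substitute assessment" clause is the loophole: each substitution moves a student past the check without an EOC pass.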

To start, here are the reading EOC pass rates for the past five years.


Hmmm.  How might we explain those Richmond disabled numbers for 2014-16?  Friar Occam might suggest cheating.  In any case, this is not a picture of improving performance.

Then we have writing.


History & Social Science.




And science.


There are some bumps and squiggles there but the trends are clear: The state averages are fading and Richmond’s are plunging.

The five-subject average smooths out the variations.


That’s clear enough: The statewide averages have declined in the last two years; despite some gains in ‘15 and ‘16, those averages have declined overall since 2014.  The Richmond averages have plummeted.

Turning to diplomas: Our educrats report (and brag upon) an “on-time” graduation rate.  To get that rate they define “graduates” to include students who receive any of the following diplomas: Advanced Studies, Standard, Modified Standard, Special, and General Achievement.

To their credit, the federales do not count the substandard diplomas: The federal rate includes only advanced and standard diplomas.  To combat that bit of realism, Virginia two years ago redefined the Modified Standard Diploma by allowing “credit accommodations” to transform it, in most cases, into a Standard Diploma.
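The difference between the two rates is just which diploma types count in the numerator, and the effect of the redefinition follows directly. A sketch with hypothetical counts for a 1,000-student cohort (illustrative only; the real counts are in VDOE's cohort reports):

```python
cohort = 1000
# Hypothetical diploma counts (illustrative only).
diplomas = {
    "Advanced Studies": 450,
    "Standard": 350,
    "Modified Standard": 60,
    "Special": 25,
    "General Achievement": 15,
}

# Virginia's "on-time" rate counts every diploma type as a graduation.
on_time_rate = sum(diplomas.values()) / cohort

# The federal rate counts only advanced studies and standard diplomas.
federal_rate = (diplomas["Advanced Studies"] + diplomas["Standard"]) / cohort

# Reclassifying modified standard diplomas as standard (the "credit
# accommodations" change) raises the federal rate by fiat, with no
# change in what any student learned.
reclassified = {**diplomas,
                "Standard": diplomas["Standard"] + diplomas["Modified Standard"],
                "Modified Standard": 0}
new_federal = (reclassified["Advanced Studies"] + reclassified["Standard"]) / cohort

print(on_time_rate, federal_rate, new_federal)  # 0.9 0.8 0.86
```

Note that the on-time rate is untouched by the reclassification; only the federal rate moves.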

This had a nice effect statewide and a dramatic effect in Richmond.



With that background, let’s look at the four-year cohort graduation rates.


Those increases are enough to warm an educrat’s heart, at least until we notice that:

  • The pass rates don’t support the recent increases, and
  • That 2017 bump in the disabled rate (which boosts the overall rate in some measure) reflects 1,200 or more modified standard diplomas that were transformed into standard diplomas by fiat.

The redefinition gave Richmond a nice bump in 2017, but the overall rate resumed its decline in 2018.


So, yes, Carol.  The Virginia four-year cohort graduation rates rose, both for disabled students and for all students.  The rise was enhanced after 2016 by (even further) manipulated data.  The rise continued at a time the pass rates in the End of Course SOL tests were declining.  If you believe those improving numbers, I want to sell you some shares in a nice James River bridge.

Richmond’s declining numbers remind us that even bogus statistics can’t make Richmond’s public schools look like anything but a menace to the students.

Your tax dollars at “work.”

Postscript: It looks like inflated graduation rates are a national phenomenon.

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.


Note: Post updated from October, 2018 to correct the 2017 state rate.

The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7% decrease in the advanced studies rate.


But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader!  The 1.3 and 5 point boosts look to have been offset or partially offset by decreases in the End of Course pass rates. 

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.


Students must pass six EOC tests to graduate.  Thus, the decreases of the pass rates of those required courses must have lowered the graduation rates.  Then the VDOE data manipulation offset those graduation rate declines in some measure. 

That looks like a general explanation.  The specifics would require more detailed knowledge of which students passed or failed which courses, where in their four-year journey through high school they did so, and whether they graduated.  For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The Big Drops in the Richmond EOC pass rates, and the lesser drops in the state rates, mostly came in math and science.



Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests.  This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.

Punishing Poverty

Our Board of “Education” has engineered a reporting system
that rewards the more affluent divisions and penalizes the poorer ones.

We have seen that economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) in terms of pass rates on the SOL tests.  In 2018, that underperformance ranged from 17.43% on the History & Social Science tests to 21.82% on the writing tests.


This places schools and divisions with larger ED populations at an unfair disadvantage with respect to the SOL averages.

Here, to start, is a selection of reading pass rates.


I’ve selected these nine divisions because their ED populations are spaced at about 10% intervals.  (They are, from the left, Falls Church, Powhatan, Arlington, Gloucester, Bath, Dickenson, Roanoke City, Northampton, and Greensville.)  The orange diamonds are the division average reading pass rates of the Not ED students; the green triangles are the ED pass rates. 

The SOL average pass rates are the blue circles.  The divisions with smaller numbers of ED students to drag down that average have higher SOL numbers.  Indeed, the divisions with fewer than 51% ED (the statewide average) enjoy a boost; divisions with more than 51% suffer a penalty.

The extreme examples are Falls Church and Greensville County (which includes Emporia).

Falls Church had the lowest ED numbers in the state, 9%.  The reading pass rate for their Not ED students was 95%; for ED, it was 67% (a 28 point spread; no bragging rights there).  The average of the ED and Not ED averages was 81%, which would have been a fair measure of the performance of both groups.  The division SOL pass rate, however, was 92% because of the small number of EDs.  Falls Church thus enjoyed an eleven point SOL bonus for affluence.

Greensville County was the other end of the scale, 93% ED.  The pass rates were lower: 76% for Not ED, 59% for ED, a seventeen point spread with a 68% average.  But the reported SOL was 61%.  Greensville suffered a 7 point SOL penalty for poverty.
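The bonus and penalty are a direct consequence of the SOL average being weighted by enrollment. Using the figures from the text (the function is a sketch that assumes every tested student is either ED or Not ED; the small differences from the reported rates come from rounding the inputs):

```python
def sol_average(ed_share, ed_rate, not_ed_rate):
    """Division SOL pass rate as the enrollment-weighted average
    of the two groups' pass rates."""
    return ed_share * ed_rate + (1 - ed_share) * not_ed_rate

# Falls Church: 9% ED; pass rates 67 (ED) and 95 (Not ED) per the text.
fc_sol = sol_average(0.09, 67, 95)
# Greensville: 93% ED; pass rates 59 (ED) and 76 (Not ED).
gv_sol = sol_average(0.93, 59, 76)

print(round(fc_sol), round(gv_sol))  # 92 60
```

The eleven-point bonus is fc_sol minus the unweighted average (81); the roughly seven-point penalty is the unweighted average (67.5) minus gv_sol. The weighting rewards a small ED population and punishes a large one, regardless of how well either group is taught.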

For a more complete view of the situation, here is a graph of division pass rates, Not ED, ED, and SOL, all plotted v. the ED percentage.


The Falls Church points are enlarged and filled with purple; Greensville, maroon; Richmond, gold.

Let’s simplify the picture by looking just at the least squares fitted lines:


The statistics of the fitted lines tell three different stories:

  • ED: The fitted line shows a 0.43% decrease in the ED pass rate for a 10% increase in the ED population, but the correlation is minuscule.
  • Not ED: The slope is minus 1.35% for a 10% increase in ED and there is something of a correlation.  It looks like increasing the ED population is mildly related to a decrease in the Not ED rate, but not so much the ED (this kind of data can’t show whether the ED population increase causes part of the Not ED pass rate decrease).
  • The SOL pass rate decreases by 2.77% for a 10% increase in the ED population.  Thirty-five percent is a pretty good correlation and the reason is obvious: Divisions with larger ED populations have more ED (i.e., lower) pass rates included in the average.

The math data tell much the same story.


Here the average of the Falls Church ED/Not ED pass rates is 74% but the small ED population results in an SOL pass rate 13 points higher.  The Greensville SOL reflects a 3 point penalty vs. the ED/Not ED average, which is that small only because the ED and Not ED rates are only seven points apart.

Notice that the Falls Church ED rate again is far below the Not ED, here by 33%, which is double the state average difference.  Either those Falls Church schools are coasting with unusually bright Not ED students, or struggling with unusually low-performing ED students, or doing a poor teaching job with the ED group, or some combination of such factors.


The slopes again show how the SOL average penalizes divisions with large ED populations.  As well, the R-squared for the Not ED group suggests, even more strongly than in the reading data, that increasing the ED population is associated with an effect on the performance of the Not ED group.

We cannot infer from the data why the Board of “Education” would embrace this system that punishes poverty.  We can notice, however, that the system is unfair on its face. 

Indeed, the Board had a fair measure of learning, the SGP, that was independent of poverty.  But the Board abandoned that system on the flimsy excuse that it could not calculate the results until Summer School results were in.  In fact, they knew that when they started calculating the SGP.  As well, they were (and are) perfectly capable of calculating the SGPs in May for the students (and teachers) not involved in Summer School.  Waiting until August merely gives them a chance to camouflage some of the poor performances during the regular school year.

Your tax dollars at “work.”

Here, for the record, is a list of the divisions that received more than a 2% reading SOL boost from the ED/Not ED average in 2018:


And here are those that enjoyed a penalty of 2% or more:


Ah, well, we can make one inference:  If you’re that Board and you’re going to make an enemy, better the Superintendent in Greensville or Colonial Beach than the one in Falls Church or Loudoun.

Don’t Blame the Kids, Chapter MMXIX

The previous post shows (yet again) that lack of money does not explain Richmond’s awful SOL pass rates.

That post, however, left open the question whether Richmond’s relatively large population of economically disadvantaged (“ED”) students might explain Richmond’s poor performance, at least in part.  The VDOE database has the pass rate data to answer that question.

Let’s start by looking at the division average reading pass rates for both the ED and Not ED groups as functions of the division expenditure in excess of the Required Local Expenditure.


The least squares line fitted to the Not ED pass rates shows an increase of 1.15% for a 100% increase in the excess expenditure but the correlation is close to vanishingly small.  The line fitted to the ED pass rates in fact shows a decrease of 3.3% per hundred but still with a correlation that would not support a conclusion anywhere except, perhaps, in a sociology thesis.

These data are consistent with the earlier analysis that showed no benefit to ED or Not ED pass rates from increased per student day school expenditures, increased instructional salaries, or increased per student number of instructional positions.

In the Not ED data, Richmond is the larger diamond with the gold fill at a pass rate of 74.3%, fourth worst in the state.  The Richmond ED rate, the larger circle with, again, the gold fill, is 51.7%, second worst in Virginia.  In the previous post, Richmond’s overall SOL pass rate was 58.9%, third worst.

In short, large ED population or not, Richmond is doing a terrible job of teaching all its students.

For reference, the peer cities are in red fill, from the left Norfolk, Newport News, and Hampton.  Charles City is green; Lynchburg, blue.

Eight of the ten highest-priced divisions beat the state reading average for Not ED students (green fill in the table below); four beat the ED average.


Of the ten lowest excess divisions, six beat the average for Not ED students while nine beat the ED average.


As to the divisions of interest, here are their pass rates expressed as differences from the averages of the division pass rates.


The data for the other subjects are consistent with the conclusion that Richmond’s schools are ineffective, for both ED and Not ED students.



Notice that I had to expand the axis to capture Richmond’s tied-for-worst ED writing pass rate.







The next time RPS starts (well, continues) to whine about needing more money for instruction, please be sure to ask them exactly what that money would buy, given that more money in other divisions doesn’t seem to buy anything but more taxes.

More on Money and Learning

A helpful reader points out that the 2018 Required Local Effort report is up on the Web.

Va. Code § 22.1-97 requires an annual report showing whether each school division had provided the annual expenditure required by the Standards of Quality.  The report presents the data as “required local effort” (“RLE”) per school division, along with the actual expenditure, both in dollars and as the excess above the RLE.

Note: The report breaks out the data for all divisions but the SOL database consolidates Emporia with Greensville County, James City County with Williamsburg, and Fairfax City with the County.  I’ve calculated those three consolidated RLEs from the individual division data.

To start, here are the 2018 division reading pass rates plotted vs. the expenditures beyond the RLE.


The slope of the fitted line would suggest that the average reading pass rate increases by 1.17% for a 100% increase in the excess RLE.  The correlation, however, is minuscule.

Translating the statistics-speak into English: The division excess expenditure does not predict the reading pass rate.

Richmond is the gold square on that graph.  The peer cities are the red diamonds: From the left, Norfolk, Newport News, and Hampton.  Charles City is the green diamond; Lynchburg, the blue.

The Big Spenders, yellow circles on the graph, are:


For the most part, those divisions performed well (green shading for those > the average).

The Little Spenders, the red circles on the graph, are:


With the exception of Petersburg, those divisions also performed well, or close to it.  Notice that six of the ten are in Southwest Virginia and five of those performed quite well on the reading tests.

BTW: Williamsburg had the lowest excess, 3.69%, but that datum disappears here into the average with James City County.

Excel is happy to calculate and plot a Bang/Buck ratio by dividing the RLE excess into the pass rate.  The result:


The Bang/Buck correlates well with the reciprocal of the Excess; no surprise there.
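That "no surprise" is easy to see in code: pass rates cluster in a narrow band, so dividing a near-constant numerator by the excess gives a curve that is essentially k/x. A sketch with made-up divisions (illustrative only; the real figures are in the RLE report and the SOL database):

```python
# Hypothetical divisions: (% excess over RLE, reading pass rate).
# Pass rates span ~5 points while the excess spans 15x.
divisions = [(10, 78), (25, 80), (50, 79), (75, 82), (100, 81), (150, 83)]

for excess, pass_rate in divisions:
    bang_per_buck = pass_rate / excess   # pass-rate points per % of excess
    print(f"{excess:>4}% excess: {bang_per_buck:.2f}")
```

Because the numerator barely moves, the ratio falls steadily as the excess grows; the Bang/Buck curve is dominated by 1/excess rather than by anything the money bought.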

Looking just at the middle of the curve, we see the divisions of interest underperforming the fitted curve, Hampton only slightly.


The data for the other subjects tell much the same story (albeit with a hint of a correlation in the writing data).





All that said, we know that “economically disadvantaged” students underperform their more affluent peers and that Richmond’s large population of ED students puts RPS at a disadvantage in terms of raw SOL pass rates. 

Indeed.  Stay tuned for the next post.

No SATisfaction

Until yesterday, if you were to look on the RPS Homepage, under “About RPS,” you would have found a list of SAT scores that stopped in 2016.  The estimable Carol Wolf asked RPS when they planned to update the data; Ms. Hudacsko of RPS got things straightened out.

Let’s look at those updated data.  First the reading/writing numbers:


*Before 2016, the SAT had separate reading and writing scores; since then, they report a single “reading and writing” number.  The data above (and below) are those combined scores for 2016 and later but only the reading scores before then.

Of course, some schools did better than others.


Turning to math:



It’s not Good News that our best two high schools underperformed the state average on both tests.

For some context, here are data on this year’s admissions at Longwood


and at THE University.


To get some further context, I consulted Dr. Google and found a shortage of posted SAT scores for individual schools.  Fairfax is a notable exception (when you look at the numbers, below, you’ll see why).  Here are their 2018 totals (reading/writing + math), along with the same data for the Richmond high schools and for Richmond, Virginia, and the US.


Now, class: In 25,000 words or fewer, explain how that performance comports with the RPS Web page that says:


RPS Embarrassing Itself

The RPS Web site has a chart:


The chart tells us the information there is


No telling why they would post old data.

In any case, the “success” headline is a lie.  A quick run to the VDOE database measures the size of that lie: On the 2015 SOL tests, just announced in August, 2016, Richmond was lowest in the state in terms of the reading pass rate and second lowest in writing, history & social science, math, and science.




If that is “success,” I am Bill Gates.

Returning to the chart, we see this picture of RPS demographics.


Hmmm.  That 1.33% Asian slice is smaller than the 0.06% Hawaiian and much smaller than the 0.16% Am-Indian.  Let’s see what Excel does with the same numbers:


OK.  The graph is wrong but, unlike the “success” lie, there’s no benefit to it.  It just demonstrates incompetence in graphing.

Then we have arithmetic:


Thing is, if you add up the numbers, it’s 43 schools, not 44.

According to the Chart, RPS is spending more than a third of a billion dollars a year.  For that price, in one chart on their Web site they manage to tell a blatant lie, proffer a nonsense graph, and demonstrate that they have not mastered simple arithmetic.

Your tax dollars at “work.”