Extra Bucks, Negative Return

Looking at the Division disbursement data for 2018 (now that they are up on the VDOE Web site) one sees Richmond spending some $1,875 per student for things other than its day school operations ($15,697 vs. $13,821).

Note: Here and below, the “total” disbursements include everything but spending for facilities, debt service, and contingency reserve.  Specifically, they include, in addition to costs of the day school operation, spending for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”).

A graph of those data for the Virginia divisions suggests that Richmond (the gold square) is an outlier. 


Indeed, along with Staunton, Petersburg, and Bristol, Richmond leads the pack in spending for things other than day school.


Except for Charlottesville and Arlington (and Richmond, of course), the divisions with this kind of large difference are not the Big Spenders. 

A look at all the divisions shows that the difference does not scale with day school spending.


Richmond is the gold square; the red diamonds are the peer cities, from the bottom Hampton, Newport News, and Norfolk.  The purple diamond is Lynchburg; the large green diamond, Charles City.

More significantly, the divisions with more of this excess spending are not getting better SOL scores.  Quite the contrary:



The red, from the left, are Hampton, Newport News, and Norfolk.

But wait: We know that the raw SOL pass rate discriminates against divisions with large populations of economically disadvantaged (“ED”) students.  So let’s look at the performance of both the ED and Not ED groups.



That’s clear enough:  On average, the divisions that spend more money for things other than day school have lower SOL pass rates for both ED and Not ED students.  In the case of Richmond, those pass rates are grossly lower.

Notice also that the decrease in performance with increasing non-day school spending is larger for the Not ED students, particularly on the math tests. 

As always, the correlations here (where they are non-trivial) do not imply causation.  The other side of that coin, however, shows that increasing spending for things other than the day school operation is not related to improving the day school performance.

One more question for these data: How do those excess expenditures relate to the population of economically disadvantaged students?


The free/reduced price lunch cost should merely displace other lunch spending, but the School Breakfast Program costs should increase with the ED population.  And here the expenditure for things other than day school rises with the number of poorer students, with a decent correlation.  So, at least in a qualitative fashion, this makes sense.

That said, this extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

Closer to home, the Richmond spending remains a substantial outlier.  Richmond keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money (almost twice the average) on other things. 

If money were as important to performance as they say it is (albeit money is in fact irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.

Boosted Pass/Graduation Rates?

An earlier post included some interesting Richmond data.  For instance:



Students must pass those End of Course (“EOC”) tests to obtain “verified credits” toward the graduation requirements.

The state EOC numbers are high, compared to the 8th grade rates; the Richmond numbers are unbelievably high. 

There are (at least) two ways the pass rates could improve between middle and high school:

  • The high school dropouts leave behind a group of better performing students;
  • The EOC tests are easier or are scored less rigorously than the middle school tests.

We already know about Richmond’s shocking dropout rate so let’s do a thought experiment on the 2018 data:

  • Assume the cohort dropout rate of 20%;
  • Assume that none of the (soon to be) dropouts passed the 8th grade tests;
  • Assume that the non-dropouts in high school (from earlier years) passed the EOC tests at the same rate as the 2018 8th graders did the 2018 8th grade tests.

That is, assume the 8th grade pass rate is and has been the average of 80% of some higher number and 20% of zero; then assume the EOC pass rate will be equal to that higher number.  You’ll recognize that this is a very rough approximation but, in light of the reasonably constant state 8th grade and EOC and Richmond 8th grade numbers, not an outrageous one.

A little back-of-the-envelope arithmetic calculates a dropout-boosted EOC pass rate of 64.8% in reading vs. the actual 70.9%, and 52.1% in math vs. the actual 59.2%.
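That arithmetic is simple enough to sketch in code.  The 8th-grade inputs here (roughly 51.8% reading and 41.7% math) are back-figured from the results above and the 20% dropout assumption, not values pulled separately from VDOE:

```python
def dropout_boosted_rate(grade8_pass_rate: float, dropout_rate: float) -> float:
    """Thought-experiment EOC rate: the observed 8th-grade rate is assumed to
    mix (1 - dropout_rate) students passing at some higher rate with dropouts
    who all failed; only the non-dropouts then sit the EOC tests."""
    return grade8_pass_rate / (1.0 - dropout_rate)

# Richmond, 2018 (assumed 8th-grade rates consistent with the post's numbers):
reading = dropout_boosted_rate(51.8, 0.20)   # ~64.8, vs. the actual 70.9
math    = dropout_boosted_rate(41.7, 0.20)   # ~52.1, vs. the actual 59.2
print(round(reading, 1), round(math, 1))
```

With the state's much lower 5.50% cohort dropout rate, the same formula multiplies the 8th-grade rate by only about 1.06, which is why the state calculation gives less dramatic results.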


It looks like getting rid of those dropouts produces a nice boost.  No wonder Richmond is so casual about its horrendous dropout rate.

The state data are consistent with the same phenomenon but the lower dropout rate (5.50%) gives less dramatic results for the calculated EOC numbers.


Even so, these extreme assumptions are not nearly enough to explain the actual EOC pass rates. 

If dropouts don’t explain the entire pass rate differences, we are left with the Board of “Education” messing with the EOC tests to boost the graduation rates.  For sure, we know that the Board sets the cut scores [@ p.130].  A boost of about five points (huge, in SOL terms) in both subjects would explain the state data and would go a long way toward rationalizing the Richmond numbers.

Of course, this speculation doesn’t prove anything.  But we already know that such manipulation of the data would align with other data engineering that makes the schools (that the Board supervises) look better than they really are.  See this and this and this and this and this and this.  Also see this, suggesting that the Board cares more about the more affluent divisions.

We’ve no direct way to test the notion1 but these data certainly suggest a simple pattern: EOC pass rates are boosted at the state level by the dropouts and by VBOE’s manipulation of the SOL test scoring; the EOC scores in Richmond are similarly boosted by the rigged scoring and inflated even more because of the terrible dropout rate.

Your tax dollars at “work.”

1.  VDOE does have the information but don’t waste any time waiting for them to disclose it.

Richmond: More Money, Worse Schools, False Excuse: Update

An earlier post discussed our Superintendent’s false statement about financial support.  That post was based on the 2017 VDOE data, the latest funding data then available.  I’ve updated the post with the 2018 numbers that VDOE just posted.

Our school Superintendent wrote an op-ed for the Times-Dispatch complaining that:

Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

In fact, Virginia’s high poverty divisions (larger numbers of economically disadvantaged students) actually spend more per pupil on average than the more affluent divisions.

Richmond, the gold square on the graph, spends $2,219 more per student than the fitted line would predict; indeed, it is the fourteenth biggest spender (of 132).

Table 15 in the (State) Superintendent’s Annual Report permits a look into the sources of those funds that the Richmond schools are spending.  The table breaks out division receipts by source:

  • State Sales and Use Tax (1-1/8 % of the sales tax receipts);
  • State Funds (appropriated by the Generous Assembly);
  • Federal Funds (direct federal grants plus federal funds distributed through state agencies); and
  • Local Funds (local appropriations).

Let’s start with a graph of the per student spending of state and federal funds vs. the division percentage of economically disadvantaged (“ED”) students:

The immediate lesson here is that Superintendent Kamras is simply wrong about high-poverty schools being starved for outside funds: The sales tax funding is essentially flat v. % ED while the state appropriations and the federal funding increase with increasing % ED.  Indeed, the R-squared value on the state funds, 30%, implies a meaningful correlation; the federal value, 50%, is robust.

Richmond, the gold squares, is a bit low in terms of state funding, but that deficit is offset (and then some) by federal money.  Richmond’s sales tax funding, $1,071 per student, is hidden in the forest of other schools, almost exactly on the fitted line.

Only in local funding can we find any hint of support for the notion that divisions with more poor students receive less money.

Of course, it would be no surprise that less affluent jurisdictions might provide fewer funds to their school systems.  For the most part, they have less money to spend on any governmental operation.

Kamras’ own division, with a 61% ED population in its schools, nonetheless came up in 2018 with $1,806 in local funds per student more than that fitted line would predict.

As well, when all the fund sources are added in, the average spending on education increases with increasing populations of disadvantaged students.  The R-squared, however, tells us that expenditure and ED percentage are essentially uncorrelated.  See the graph at the top of this post.

In summary, Richmond schools received LOTS of money in these categories, much more than the average division:

[“Predicted” values here are calculated from the Richmond % ED and the fitted lines in the graphs above.  The sum of the predicted values is seven dollars less than the value calculated from the actual total, which is probably explained by the inclusion of tuition from other divisions in the total reported.]

So, when he says “The students who should be getting more are actually getting less,” our Superintendent is wrong.  And, even more to the point, Kamras’ own schools are enjoying much more than average financial support.

The Kamras op-ed is a misleading attempt to excuse the ongoing failure of the Richmond public schools to educate Richmond’s schoolchildren.  For example, on the 2018 reading pass rates:


The excuse is contradicted by reality: Those Richmond schools are swimming in money.  Even more to the point, the performance of Virginia school divisions is unrelated to how much money they spend.  (Indeed, the trend lines point in the other direction.)  For example:

Richmond again is the gold square, spending lots of money and getting lousy results.  For comparison, the peer cities are the red diamonds: from the left, Hampton, Norfolk, and Newport News, all spending much less than Richmond and getting much better outcomes.  As a courtesy to my readers there, the violet diamond is Lynchburg, the green, Charles City.

It would be helpful for our Superintendent to turn his energy to improving the performance of our schools and to stop misleading the public about the reasons those schools are so ineffective.

The Alternative and Dropouts

Richmond Alternative School serves “students with academic, attendance and behavior challenges.”  Their Web page says they “use positive norms and positive peer pressure from the group in order to maintain a positive learning culture.”

RPS hired Community Education Partners in 2004 to run this receptacle for disruptive students.  In 2013, Richmond took the school over, saving about $2 million per year.  They now have hired Camelot Education to run it. 

The original contractor was doing a decent job, given the tough clientele. 


RPS picked a bad year to take over the school; the new math tests had lowered scores statewide in 2012 and new tests in English and science did the same thing in 2013, albeit more in Richmond.  

Alternative didn’t bounce back until the new contractor was in charge.  For instance:



There is a wrinkle in those data, however: The state and Richmond averages include the elementary school pass rates while Alternative has only the middle and high school grades.  The Alternative trends in the graphs are accurate enough but the comparison to the state and Richmond numbers is not.

If we pull the data by grade, we are hindered somewhat by the VDOE suppression rule that omits data for groups of <10 students.  Even so, there is information here:



(“EOC” refers to the End of Course tests required to obtain “verified credits” toward the graduation requirements.)

There are some details missing but the Richmond Alternative jump in 2017 is clear enough.  Either the contractor is cheating prodigiously, the dropouts are having an exceptional effect on the scores of the remaining students, or it’s time for our School Board to admit its own incompetence and hire outsiders to run all the schools.  I lean toward that last explanation. 

In any event, something stinks here.

. . .

Maybe two somethings:  Those Richmond EOC numbers are suspiciously high.  Time for some more digging.

Looking Past the Donut

As you ramble on through Life, then,
Whatever be your goal,
Keep your eye upon the doughnut
And not upon the hole.

That may be good advice in life.  As to the performance of Richmond’s public schools, there’s a question whether the very large dropout hole has the perverse effect of enlarging the SOL donut.

We already know about the appalling condition of that donut, Richmond’s SOL performance.  Let’s further examine the hole, the kids who have dropped out and can’t further damage the pass rates.

The VDOE Web site has dropout statistics by division and by school.  I pulled down those by school for 2018 (the latest available) along with the Fall enrollment (“membership”) data for that year.

First the high schools.  The scales on the ordinates here are the same so we can compare the schools.



And the average of averages for the five schools:


It looks like Wythe, with some modest help from everybody but Armstrong, is driving Richmond’s anomalously high 9th grade dropout rate.

The selective high schools paint a much prettier picture.


No need for a graph for Franklin Military; their high school numbers all are zero.

The middle school numbers are much smaller, albeit several show troublesome trends.  Notice the expanded ordinate.


The middle school average of averages:


Richmond Alternative, the dumping ground for troublesome kids (esp. in middle school), requires expanded ordinates.


If Richmond were serious about dealing with dropouts (the numbers say they are not), these data suggest the places to start.

Attendance, Not

Having noticed Richmond’s atrocious dropout rate, I went digging on the VDOE Web site and found detailed dropout data.  I fired up Microsoft Access, pulled the 2018 dropout data, and set them beside the Fall enrollments for that year.

The result is ugly.


The state data show a regular increase with grade.  The astounding Richmond numbers show an unexpected maximum in the ninth grade.  Perhaps that is due to the notorious “ninth grade bump.”  


(Data are for all Virginia divisions.)

The principal argument for social promotion seems to be that holding the kid back is even less effective for learning than promoting that student.  The literature I’ve seen does not explain why that reason (or simple inertia or moving the problem on to the next level or whatever else might explain social promotion) stops working at the ninth grade.

In any event, Richmond’s ninth grade bump looks to survive Richmond’s ninth grade surge in dropouts.


If you have a testable hypothesis that explains all this, please do share it with me.

Turning back to the dropout rates, we see that, until the 12th grade and contrary to the state pattern, the rate among Richmond’s economically disadvantaged (“ED”) students is lower than among their more affluent peers (“Not ED”). 


This last set of data doesn’t illuminate the reasons for Richmond’s unusual (and unusually awful) dropout rates but it does suggest where we might start working on the problem: the ninth grade bump.

Attendance. And Not.

The 2018 Superintendent’s Annual Report has the dropout numbers.  Combining those data with the September 30 head count, we see that Richmond is the runaway leader in dropouts.


Those 517 dropouts in Richmond look less appalling when expressed as a percentage because most of the dropouts come in the later grades.  (Richmond’s 4-year cohort dropout rate in 2018 was 20.2%.)

It is perhaps more revealing to compare these percentages.


Or, in terms of a graph,


That Richmond rate is 3.99 times the state average.

Our Leaders, however, are shouting “Nolo Problemo!”  Last year our Generous Assembly gutted the mandatory attendance laws; Richmond is celebrating by firing its attendance officers.

Your tax dollars at “work.”

Is Marriage Good for Schools?

Having been enticed by the 2017 census data, I turned from the “female householder, no husband present” data to the “married couple families” numbers.

Notes: The census table does not include data for the two towns that have independent school divisions, Colonial Beach and West Point.  Presumably those data are included in the reports for the surrounding counties; I don’t have a way to correct for that (other than by leaving out the counties, which I didn’t do).  VDOE reports consolidated SOL data for Emporia/Greensville County, Fairfax City/County, and Williamsburg/James City County under the latter jurisdiction in each case; I’ve summed the census data for each pair and reported the joint average.

Let’s start with reading.


Not only did we get a two-digit R-squared, for a change, but one that is robust: Division reading SOL pass rates are strongly associated here (ρ = +0.68) with the percentage of married couple families.

Of course, the correlation does not tell us about causation.  Indeed, it is likely that other factors drive both sets of numbers. 

That caveat aside, we again see Richmond (the gold square) underperforming substantially while the peer jurisdictions (the red diamonds, from the left Norfolk, Newport News, and Hampton) are near or above the fitted line.  (As a courtesy to both of my readers, the purple diamond is Lynchburg, the green, Charles City.)

For a closer look at Richmond’s marital neighborhood, here is the same graph, expanded to show only the jurisdictions with <65% married couples.


The fitted line is for the entire dataset.

The math scores tell the same story (with Richmond underperforming even more dramatically).



As with the “female householder” dataset, the married couple SOL data also show suggestive correlations with both the economically disadvantaged (“ED”) and the more affluent (“Not ED”) students’ performance.





(In the second graph of each pair, only the ED points are labeled.  Look directly above those points – same % married – to find the corresponding Not ED points.)

Indeed, the fits to the Not ED group are substantially more robust than to the ED students, suggesting that marriage (or whatever underlying factors affect both marriage rates and SOL pass rates) has a stronger effect on the Not ED group.

As in the single parent case, these data tell us:

  1. The raw SOL pass rates punish a division for poverty and even more for families without married couples; and
  2. Even with the pass rates corrected for the effects of both larger populations of ED students and lower populations of married couples, the Richmond schools underperform.  Appallingly.

Needed: A Lot of Good Men

We have seen that “economically disadvantaged” (“ED”) students generally underperform their more affluent peers (“Not ED”) on the SOL tests. 

The natural consequence of that is that divisions with more ED students generally deliver lower pass rates.  Less obviously, as the percentage of ED students increases, the pass rates of the Not ED students decrease while those of the ED students are only slightly lowered.  For instance:


There is a lot of scatter in those data.  Indeed, the only robust correlation is in the all-student division average rates, where the effect of increasing numbers of ED students is to be expected.

Looking for other factors that might have more predictive effect, let’s turn to the census data showing numbers of “female householder, no husband present” families in the Virginia jurisdictions.  The latest data there are for 2017.

To start, here are the division average pass rates on the 2017 reading tests v. the no-husband percentages:


Notes: The census data do not include Colonial Beach and West Point, both of which have independent school districts; their census data probably are subsumed in reports for the counties that include those towns.  The gold square on the graph is Richmond; the red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk; the green is Charles City; purple, Lynchburg.

Whew!  This is much more dramatic: The ED graph (2018 data) has a slope of –2.8% per 10% increase in the ED population.  Here (2017 data) we see a slope of nearly –6.4% for a 10% increase in the no-husband population.  The fitted line of the ED SOL graph extrapolates to a pass rate of 63.3% at 100% ED; the no-husband graph extrapolates to a pass rate of 25.7% at 100% no-husband households.
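Those extrapolations are just the fitted lines evaluated at 100%.  The slopes and intercepts below are back-figured from the numbers in the text (e.g., an intercept near 89.7% makes 89.7 − 0.64 × 100 = 25.7); they are inferences, not values read from the VDOE data:

```python
def extrapolate(intercept: float, slope_per_pct: float, x_pct: float) -> float:
    """Evaluate a fitted line: pass_rate = intercept + slope * x."""
    return intercept + slope_per_pct * x_pct

# Slopes of -0.28 and -0.64 points per percent, inferred from the post
ed_at_100   = extrapolate(91.3, -0.28, 100)   # ED line: ~63.3
nh_at_100   = extrapolate(89.7, -0.64, 100)   # no-husband line: ~25.7
print(round(ed_at_100, 1), round(nh_at_100, 1))
```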

For a look at the divisions with a tougher job, here is the no-husband graph with only the high percentage divisions shown (the fitted line is for the entire dataset).


The math data tell much the same story.



Take a bow, ghost of Secretary Moynihan.

Taking a further step, here are the reading data broken out for the ED and the Not ED students.


The ED data extrapolate to 29.8% at 100% no-husband households; the Not ED extrapolation gives 50.2%.

Looking just at the >25% no-husband divisions:


Note: The second graph is too busy as it is so I’ve left the labels off the ED series.  You can identify the points there by looking for the labeled Not ED points directly above (i.e., same % no-husband).

Recalling that correlation does not imply causation, we cannot say that the missing husbands are the cause of these effects.  We can say that both ED and Not ED pass rates are generally lower in divisions with larger percentages of female parent homes and that the effect is nearly the same for both ED and Not ED students.  As well, the absence of the husband predicts over a fifth of the variance (statistic-speak for scatter from the fitted line).

Richmond, as usual, is underperforming. 

(Darn!  Another excuse that’s available to RPS only if they’re willing to lie.)

The math data sing a variation on the same theme.



We can draw at least two conclusions:

  1. As with the ED population, the relationship of SOL test results to the number of no-husband households shows that the raw SOL pass rate is a defective standard for measuring division academic performance.  Said more succinctly: The SOL measure punishes a division for poverty and, still more powerfully, for single parenthood in its student base.
  2. Even with the pass rates corrected for effects of the large ED and no-husband populations, Richmond schools underperform.  Atrociously.

What Funding Gap?

The estimable Jim Bacon points to a study of school spending by state.  He concludes: “In Virginia, districts that serve mostly black students spend about $200 more per student on average.”

Well, please recall what Mark Twain said about statistics.

Please also notice that the study was prepared by EdBuild, which advocates for school funding and is funded in part by the Gates Foundation.  Another study (with, obviously, a different viewpoint) asserts that of 2,625 political contributions by staff of Gates grantees, only 6 went to Republicans. 

In this context, we can wonder about the methodology of the EdBuild study: The study compared funding of “nonwhite” districts – those with more than 75% nonwhite students – with “white” districts – those with more than 75% white students.  The study does not explain the basis of the 75% criterion; it does not report the results of choosing other criteria; it does not mention local costs of living.

(Indeed, ± 75% is very close to ± 1.2 standard deviations; if there had been a statistical basis for the study, we might have expected to see a 68% or a 95% criterion.)
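That figure is easy to check: the middle 75% of a normal distribution lies within about ±1.15 standard deviations of the mean, which Python’s standard library can confirm directly:

```python
from statistics import NormalDist

# z such that the central 75% of a standard normal lies within +/- z:
# P(Z <= z) = 0.875 (12.5% in each tail)
z = NormalDist().inv_cdf(0.875)
print(round(z, 2))  # ~1.15
```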

Thus, it is hard to know exactly what the study shows, albeit it seems to give Virginia some modest bragging rights.

There is a Virginia data set that can shed some light on the matter.  The Superintendent’s Annual Report for 2017 (the latest available data) provides at Table 13 disbursement data for each division.  I’ve extracted the day school (school operations not including food, adult ed., pre-K, etc.) expenditure per student.  The Fall Membership Report database provides the 2017 enrollments for students of all races and for “white, not of Hispanic origin” students, inter alia.

If we graph the day school expenditure v. the percentage of nonwhite students, we obtain:


Of course, correlation does not imply causation (another thing the EdBuild study does not mention) but the absence of correlation does tell us to look elsewhere for causes.  Here, 93% of the variance (that’s statspeak for “scatter”) comes from factors other than the percentage of nonwhite students.

In any case, the least squares fit offers the same result as an eyeball examination: There’s LOTS of scatter but the divisions with larger nonwhite populations are not being punished.  So, modest (6.6%) bragging rights.

Looking at the data, we also see that the EdBuild average is being pulled up by the older, urban divisions with large nonwhite populations and higher costs of living (with the notable exceptions of Petersburg, which is notoriously impoverished, and Sussex, which is decidedly non-urban).


Among those divisions, the R-squared rises to 9%.


At the other end of the spectrum, the low expenditure divisions are mostly rural counties with relatively lower costs of living.


The Big Spenders here are Bath (38% of the budget is VEPCO money), Highland (43% of budget from property taxes), and Rappahannock (who knows?).

Interesting, perhaps.  It might also be interesting to look at the expenditures corrected for cost of living. 

In any case, no racial funding gap on the expenditures.

With all that said, it remains that, while school finances are important to the teachers and the schools’ bureaucrats, they are irrelevant to student performance among the Virginia divisions, e.g.,


Note: These are total disbursements, not just day school.  Richmond is the gold square; the peer cities Hampton, Newport News, and Norfolk, the red diamonds; Lynchburg, blue; Charles City, green.