Richmond Elementary Schools: Mind the Gap!

An earlier post showed that, among the school divisions, an increasing percentage of economically disadvantaged (“ED”) students correlated slightly with a decreasing pass rate of the more affluent (“Not ED”) students, and that the division average pass rate for ED students ran about 20 points below the Not ED average. Another post showed that two of Richmond’s high-scoring elementary schools were not getting high-scoring pass rates from their ED students.

Let’s take a more general look at the ED/Not ED performance of Richmond’s elementary schools.

To start, here are the 2019 reading pass rates of the Not ED students of Richmond elementary schools plotted against the percentage of ED students in the tested group. Fairfield Court and Miles Jones are missing from the graph (see below).


As with the divisions, the pass rate decreases with increasing % ED, here by about four points per 10% ED increase. The R-squared value of 29% indicates a modest correlation.
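The slope and R-squared quoted here come from an ordinary least-squares fit of pass rate against % ED. A minimal sketch of that calculation, using made-up school numbers (the actual data are in the tables below):

```python
# Hypothetical illustration of the fit described above: Not ED pass
# rates (y) regressed on percent ED in the tested group (x).
# All numbers here are invented for the sketch, not real school data.
import numpy as np

pct_ed = np.array([13, 25, 40, 55, 70, 85, 95], dtype=float)
pass_rate = np.array([92, 88, 80, 78, 70, 66, 60], dtype=float)

# Degree-1 least-squares fit: slope and intercept of the trend line.
slope, intercept = np.polyfit(pct_ed, pass_rate, 1)

# R-squared is the square of the correlation coefficient.
r_squared = np.corrcoef(pct_ed, pass_rate)[0, 1] ** 2

print(f"change per 10 points of %ED: {slope * 10:.1f}")
print(f"R-squared: {r_squared:.0%}")
```

With the real data, the same two numbers are the “four points per 10% ED” slope and the 29% R-squared quoted above.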

Things get more interesting when we look at the ED pass rates.


As expected, the pass rates are generally lower. The slope also flattens, to 1.5 points per 10% increase in ED population, while the R-squared decreases to 9%.

Notably, the surprisingly low ED pass rates at the more affluent schools contribute to that flatter slope.

Two schools with >70% ED (Cary and Obama) outperformed Munford (13% ED) and four outscored Fox (25% ED). Indeed, none of the five low-ED schools covered itself with glory in terms of ED performance.

Here are the data:



  • The “#N/A” entries for Jones and Fairfield Court indicate cases where VDOE did not report Not ED data, probably because their Not ED populations were small enough to trigger the VDOE suppression rules.
  • The VDOE database does not offer state average pass rates for elementary schools. The state numbers here are the average of the averages for each of grades 3-5. Given that the state enrollment is approximately flat across those grades, that should give a close estimate of the average over all elementary students.
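The claim that flat enrollment makes the average of grade averages a close estimate can be checked with a toy example (all numbers hypothetical):

```python
# Toy check of the averaging claim above: when grade enrollments are
# roughly equal, the unweighted mean of the per-grade pass rates is
# close to the enrollment-weighted mean over all students.
# Enrollments and pass rates below are hypothetical.
enrollments = [95_000, 96_000, 95_500]   # grades 3, 4, 5 (roughly flat)
pass_rates = [75.0, 78.0, 72.0]          # per-grade pass rates

avg_of_avgs = sum(pass_rates) / len(pass_rates)
weighted = sum(n * p for n, p in zip(enrollments, pass_rates)) / sum(enrollments)

print(round(avg_of_avgs, 2), round(weighted, 2))
```

The two means agree to within a small fraction of a point, which is why the average of the grade averages serves as the state estimate here.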

The Munford ED pass rate is the same 62% as the state average; Fox is 3 points lower. The state average ED population is 44%; Munford’s is 13% and Fox’s is 25%.

Let’s take this one step further: The fitted line in the ED graph, above, slopes down. In an ideal world, it would be exactly horizontal, indicating that the average performance of ED students was independent of the % ED in the tested group.

The statistics of the fitted line allow calculation of the difference between each school average and the fitted line, thus removing the average effect of the increasing ED percentage. The results, sorted by the difference:
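That difference-from-the-line calculation can be sketched as follows; the school names and numbers here are hypothetical stand-ins for the real table:

```python
# Sketch of the residual calculation described above: fit a line to the
# ED pass rates vs. %ED, then report each school's distance from that
# line, sorted best-first. Names and numbers are hypothetical.
import numpy as np

schools = ["School A", "School B", "School C", "School D", "School E"]
pct_ed = np.array([15.0, 30.0, 55.0, 75.0, 90.0])
ed_pass = np.array([55.0, 70.0, 62.0, 68.0, 50.0])

slope, intercept = np.polyfit(pct_ed, ed_pass, 1)

# Residual = actual pass rate minus the rate the fitted line predicts,
# which removes the average effect of the increasing ED percentage.
residuals = ed_pass - (slope * pct_ed + intercept)

for name, diff in sorted(zip(schools, residuals), key=lambda t: -t[1]):
    print(f"{name}: {diff:+.1f}")
```

A school above the line (positive difference) is doing better with its ED students than its % ED alone would predict; a school below the line is doing worse.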


Or, in terms of a graph:


For sure, some of our schools get lousy results. But others do much better. And some of the low-%ED schools don’t get good ED performance, even when they get very good Not ED pass rates. It would be useful to understand the reasons for those differences.

The math data show a similar pattern.


The slope here is shallower than for reading: three points per 10% ED vs. four. The R-squared is about 20%, vs. 30% for reading. But the conclusion remains the same: on average, the Not ED pass rates decline with increasing % ED students in the tested group. These data don’t tell us why.

The ED pass rates again show a lower slope and an R-squared value that indicates very little correlation.


Five schools with >70% ED outperformed Munford (a sixth tied) and six outdid Fox. Whatever the magic at Fox and Munford, it doesn’t seem to work for their ED students.

The data:


Munford’s ED math pass rate is four points below the state average; Fox’s is nine below.

In terms of ED pass rate differences from the fitted line:



Some Richmond schools are doing much better with their ED students than others. The literature suggests that the important variables are the socioeconomic status of the students and the effectiveness of the teachers. Here, we can wonder whether the large differences in ED performance might be related to the quality of the teaching.

For sure, the ED students at Munford and Fox don’t seem to be gaining any benefit from exposure to large populations of Not ED students.


The definition of economic disadvantage gives us, at best, a rough measure. Teacher performance, however, can be measured independently of ED. Unfortunately, our Board of Education abandoned the measure it had, the SGP, because it did provide an accurate measure of teacher performance.

In that situation, we can only wonder what’s going on. Or, perhaps, blunder along and hope for better.