We have seen that, on the 2017 data, division average SOL pass rates are not correlated with the numbers of teachers per student. There is a flaw in that analysis, however: Economically disadvantaged (“ED”) students score lower on the SOL, on average, than Not ED students. Thus, divisions with larger populations of ED students tend to have lower overall pass rates.

The VDOE database can break out the pass rates for both ED students and their more affluent peers, so let’s take a more nuanced look.

To start, here are the 2017 division average reading pass rates for both ED and Not ED students graphed vs. the number of teachers per thousand students. (The latest available teacher numbers are from 2017).

The slopes of the least squares fitted lines might suggest that more teachers in the division correlate with decreased pass rates of the Not ED students and slightly increased rates of the ED students. But the R-squared values tell us that the pass rates in both datasets are essentially uncorrelated with the teacher/student ratios.

In short, these data reach the same result as the overall pass rate data: Divisions with more teachers per student do not, on average, have better pass rates.

The largest exception to that generality, out there with 189 teachers per thousand and a 97.1% Not ED pass rate (and a much better than average ED rate), is Highland County. The ED superstar is West Point with 135 teachers per thousand and an ED pass rate of 85.2%, followed by Wise (115, 84.4) and Bath (159, 82.7).

To single out some other divisions: Richmond is the yellow squares. The peer cities are the red triangles: from the left, Newport News, Hampton, and Norfolk. Charles City is green; Lynchburg, blue.

Just looking at the graph, Richmond’s ED rate is farther from the fitted line than its Not ED rate. Indeed, Excel tells us that the Richmond Not ED average is 11.8 points below the all-divisions average. That is, Richmond’s Not ED students passed at a rate 11.8 points lower than the state average for Not ED students. The Richmond ED rate is 17.1 points below the state average for ED students. In short, Richmond’s Not ED performance is poor; its ED performance is half again worse.

Aside from the question of why Richmond’s ED scores are so low, these data falsify Richmond’s frequent excuse that it must deal with a large ED population: Richmond does a lousy job with its Not ED students, and an even worse one with the ED students. Whatever the cause of Richmond’s awful SOL performance, it infects the entire student population.

Next up, writing:

Pretty much the same story there as the reading data (but notice how Highland reverts toward the mean). Richmond’s ED rate is the lowest in the state.

The graphs for the other three subjects are chapters in the same book:

There is one interesting trend here: Richmond’s ED underperformance, relative to the Not ED students, is much smaller in science and in history/SS than in English, and is somewhat less in math. To quantify that, here are the Richmond differences from the division means for each group in each subject:
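The table is just subtraction: Richmond's average minus the all-divisions average, for each group in each subject. A sketch of that calculation follows; the pass rates are placeholders, except that the reading numbers are chosen to reproduce the 11.8- and 17.1-point gaps quoted above.

```python
# Placeholder pass rates (NOT the actual VDOE figures). The reading rows
# are set so the reading differences match the 11.8 and 17.1 points in
# the text; the other subjects are invented for illustration.
state_avg = {
    "reading": {"ED": 72.0, "Not ED": 89.0},
    "math":    {"ED": 73.0, "Not ED": 88.0},
    "science": {"ED": 74.0, "Not ED": 90.0},
}
richmond = {
    "reading": {"ED": 54.9, "Not ED": 77.2},
    "math":    {"ED": 60.0, "Not ED": 78.0},
    "science": {"ED": 65.0, "Not ED": 81.0},
}

# Difference table: Richmond minus the all-divisions average, in points.
diffs = {
    subject: {grp: round(richmond[subject][grp] - state_avg[subject][grp], 1)
              for grp in ("ED", "Not ED")}
    for subject in state_avg
}

for subject, d in diffs.items():
    print(f"{subject:10s} ED: {d['ED']:+.1f}   Not ED: {d['Not ED']:+.1f}")
```

A negative entry means Richmond trails the all-divisions average by that many points; comparing the ED column with the Not ED column, subject by subject, is what shows the relative ED underperformance shrinking outside of English.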

These data do not separate out any of the factors that can affect student performance, other than (roughly grouped) economic status; they do comport with the notion that Richmond has a lot of work to do, especially with its ED students.

To the bottom line: These data are consistent with the conclusion in the recent study that “the evidence suggests at best a small effect [of class size] on reading achievement. There is a negative, but statistically insignificant, effect on mathematics.”

So, when our School Board again says it needs more money, ask them what for. And if it’s for more teachers (indeed, whatever it’s for), ask them to prove that the money will improve learning.