More Salary $ ≠ Higher Pass Rate

We often hear that better pay leads to better teaching. VDOE has some data on that.

Table 19 in the Superintendent’s Annual Report lists, among other things, the average annual teacher salaries by division. The data there are averages for both teachers and principals. The very nice database on the VDOE site provides pass rates. The latest data in the Annual Report are from 2018, so the pass rates here are from 2018 as well.

For the reading tests, the data look like this:


“ED” on the chart refers to “economically disadvantaged” students. Statewide, ED students underperform their more affluent peers (“Not ED”) by about twenty points, so I’ve posted both sets of data.

The line fitted to the Not ED data suggests that a $10,000 increase in the average salary is associated with a 1.1 percentage point increase in the pass rate. The R-squared value of 2.3%, however, tells us that the two variables are essentially uncorrelated.

The ED data suggest a decrease in the pass rate of 3.1 percentage points per $10,000 increase in average salary. The R-squared of 9.2% indicates a slight correlation, but nothing to write a thesis about.
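For readers who want to check this kind of number themselves, here is a minimal sketch of the least-squares fit and R-squared calculation used above. The salary and pass-rate figures in it are made-up illustrative values, not the actual VDOE data.

```python
# Ordinary least squares of pass rate on average salary, plus R-squared.
# The data below are hypothetical, for illustration only.

def ols_fit(xs, ys):
    """Return slope, intercept, and R-squared for the fit y = slope*x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r_squared = 1 - ss_res / ss_tot
    return slope, intercept, r_squared

# Hypothetical divisions: (average salary in dollars, reading pass rate in %)
salaries = [48_000, 52_000, 55_000, 60_000, 65_000, 72_000]
pass_rates = [78.0, 81.5, 76.0, 83.0, 79.5, 82.0]

slope, intercept, r2 = ols_fit(salaries, pass_rates)
# Express the slope as percentage points per $10,000, as in the text.
print(f"{slope * 10_000:+.1f} points per $10,000, R-squared = {r2:.1%}")
```

A near-zero R-squared, as with the Not ED data, means the fitted line explains almost none of the division-to-division variation in pass rates.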

The fair conclusion is that division average reading pass rates are unrelated to average teacher salaries, except for a hint that ED rates may decrease slightly with increasing average salary.

The enlarged, square points are Richmond, paying a bit more than average and getting a lot less.

The math and science data tell much the same story.



For the record, here are the ten highest- and lowest-paying divisions:



Notice that high-paying and high-scoring Falls Church is not doing well at all with its ED students. Ditto most of those Big Spenders. Also notice that, except for Lexington and Halifax, the Little Spenders have ED pass rates within ten to fifteen points of their Not ED rates.

CAVEAT: These data are consistent with the notion that Virginia’s systems for evaluating educational outcomes and for setting teacher salaries are counterproductive:

Research dating back to the 1966 release of Equality of Educational Opportunity (the “Coleman Report”) shows that student performance is only weakly related to school quality. The report concluded that students’ socioeconomic background was a far more influential factor. However, among the various influences that schools and policymakers can control, teacher quality was found to account for a larger portion of the variation in student test scores than all other characteristics of a school, excluding the composition of the student body (so-called peer effects).

Yet, Virginia’s evaluation system punishes schools and divisions with larger numbers of “economically disadvantaged” students. Moreover, the counterproductive salary scales reward degrees and time in service, not teaching effectiveness:

Teachers’ education (degree) and experience levels are probably the most widely studied teacher attributes, both because they are easy to measure and because they are, in the vast majority of school systems, the sole determinants of teachers’ salaries. However, there appears to be only weak evidence that these characteristics consistently and positively influence student learning.

For another look at the relationship between educational inputs and outputs, see this study from the OECD.