Big Budgets Don’t Teach

There is an unfortunate tendency to measure school quality in terms of inputs, particularly money, although the important measure should be the output: how much the kids learn. The SOL provides a ready, but rough, measure of that output.

We have seen that neither division expenditure nor excess local financial effort nor average division teacher salary correlates with SOL pass rates.

But we know that SOL scores are sensitive to at least one other variable, the economic status of the students: Economically disadvantaged students score lower, on average.

For about four years, until last year, VDOE produced student growth percentile (“SGP”) data that are not dependent upon economic advantage or disadvantage. VDOE attempted to hide those data, but Brian Davison of Loudoun pried loose some of them by division.

I’ve taken the latest of those data, for 2014, and juxtaposed them with the teacher salary data for that year to see if there’s anything to learn there.
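If you'd like to replicate the juxtaposition, here is a minimal sketch in Python. The file and column names below are my placeholders, not the names in the actual dataset:

```python
# Minimal sketch: pair each division's 2014 SGP with its average
# teacher salary for that year. File and column names are placeholders.
import pandas as pd

sgp = pd.read_csv("sgp_by_division_2014.csv")             # Division, ReadingSGP, MathSGP
salaries = pd.read_csv("salaries_by_division_2014.csv")   # Division, AvgSalary

# Inner join on the division name: keep only divisions that
# appear in both tables.
merged = sgp.merge(salaries, on="Division", how="inner")
print(merged.head())
```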

To start, here are the division reading SGPs plotted vs. the division average teacher salary (the average covers regular K-12 education teachers, including art, music, physical education, technology, remedial, gifted, mathematics, reading, special education, and ESL teachers; it excludes teacher aides, guidance counselors, and librarians).

[Scatter plot: division reading SGP vs. division average teacher salary]

The R² of 7.1% tells us there is only a faint correlation between how much the teachers are paid and how much the students learn.
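For the curious, that R² comes from an ordinary least-squares fit of SGP on salary. A sketch, continuing from the merged table above (column names again my placeholders):

```python
# Sketch of the fit behind the chart: regress reading SGP on salary
# and report R-squared.
import matplotlib.pyplot as plt
from scipy.stats import linregress

x = merged["AvgSalary"]
y = merged["ReadingSGP"]

fit = linregress(x, y)
r_squared = fit.rvalue ** 2          # about 0.071 per the chart above

plt.scatter(x, y)
plt.plot(x, fit.intercept + fit.slope * x)   # trend line
plt.xlabel("Division average teacher salary ($)")
plt.ylabel("Division reading SGP")
plt.title(f"R² = {r_squared:.1%}")
plt.show()
```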

Even so, there is some interesting information in these numbers.

For a start, we see that Richmond (the gold square) and the peer jurisdictions (red circles; from the top, Hampton, Norfolk, and Newport News) are all getting better results than their raw SOL pass rates suggest.  Richmond still underperforms, but the peer jurisdictions are close to the middle of the pack, with Hampton just above the division median of 46.6.
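The comparisons to the median here and below are simple arithmetic; a sketch, using the same placeholder names:

```python
# Sketch: each division's distance from the division median
# reading SGP (46.6 in these data). Positive = above-median growth.
median_reading = merged["ReadingSGP"].median()

merged["VsMedian"] = merged["ReadingSGP"] - median_reading
print(merged.sort_values("VsMedian", ascending=False)
            .loc[:, ["Division", "ReadingSGP", "VsMedian"]])
```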

The Big Spenders here are the NoVa divisions: the divisions above $60,000 are, from the left, Loudoun, Prince William, Manassas City, Fairfax, Falls Church, Alexandria, and Arlington.  The outstanding performers, all but one with modest salaries, are Poquoson, Bland, Botetourt, Wise, and Falls Church.

The eye opener is the surrounding counties (the green circles; from the top, Charles City, Chesterfield, Henrico, and Hanover): Charles City is outperforming the others, while the rest are not producing student growth worth bragging about.  Of course, all are beating Richmond, but by much less than the raw pass rates would suggest.

The math data are less flattering to Richmond but similarly interesting.  Caveat: These data do not include algebra (the full dataset exceeded the maximum number of rows Excel will hold).

[Scatter plot: division math SGP vs. division average teacher salary]

The R² of 3% again indicates no significant correlation between the SGPs and the salaries.

Here we see Richmond, again the gold square, 5.5 percentiles below the math median of 48.4, compared to 3.1 points below the reading median.  In blue, Newport News and Hampton are above the median; Norfolk is 1.8 points below (and should be dropping, unless they are cheating).

The Counties (green circles; from the left, Charles City, Hanover, Chesterfield, and Henrico) are in the middle of the pack, except for Henrico, which is 1.8 points below Richmond.

The Big Spenders are getting less for their money here than with reading.

The outstanding performers are, from the top, Bristol, Buckingham, Bland (again), Botetourt (again), and Surry.  We probably should reserve judgment about Botetourt; they were caught cheating wholesale with the VGLA.

The dataset is posted here.

The bottom line: Spending more money on teacher salaries does not correlate with better student learning of reading or math.

But, then, we knew all along that the key to student performance is management, not money.