Budgeting to Waste Money

The 2016 RPS Budget is up.  It’s time to start trying to understand where our money is going.

For a first effort, let’s look at the budget by school compared to the September 2015 enrollment, or “membership,” as they call it.

Here is the budgeted expense per student for the Richmond elementary schools. 

image

The data come remarkably close to fitting a straight line (R2 = 66%), which suggests that there are important economies of scale.  Said otherwise, our little schools are more expensive per student than the larger ones.

All three of the smallest schools (from the left: Swansboro, Cary, and Bellevue) look anomalously pricey.  The slope of –$6.36 per student further suggests that combining those three schools would save about $3.6 million per year.
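Here is a back-of-the-envelope check on that savings figure.  It is a minimal sketch: only the slope comes from the fitted line; the intercept and the enrollments are hypothetical round numbers chosen for illustration, not the actual memberships in the table below.

```python
# Back-of-the-envelope check of the savings implied by the fitted line:
# cost per student = intercept + slope * enrollment, with slope = -$6.36.
# The intercept and the enrollments are hypothetical round numbers; the
# actual September 2015 memberships are in the table that follows.

SLOPE = -6.36         # dollars per student, per additional student of enrollment
INTERCEPT = 12_000.0  # hypothetical per-student cost at zero enrollment

def school_budget(enrollment):
    """Total budgeted cost of a school of this size under the fitted line."""
    return enrollment * (INTERCEPT + SLOPE * enrollment)

enrollments = [250, 300, 350]  # hypothetical sizes for the three smallest schools

separate = sum(school_budget(n) for n in enrollments)
combined = school_budget(sum(enrollments))

# The intercept cancels in the difference, so the estimate depends only on
# the slope and the enrollments.
print(f"Estimated annual savings from consolidation: ${separate - combined:,.0f}")
# -> about $3.4 million with these assumed enrollments, in the same ballpark
#    as the $3.6 million quoted above
```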

Here are the data:

image

We can examine the academic benefit of the smaller schools by looking at the 2015 SOL pass rates:

image

And plotting the pass rates vs. the budgeted expenditures per student tells us about the educational return on the money.

image

That’s clear enough: Neither smaller schools nor more money per student correlates significantly with better performance in the Richmond elementary schools.  Looks to me like any new schools should be quite large.

Turning to the middle schools, Binford is anomalously pricey.

image

As to the pass rates, there’s one wrinkle: The 2015 SOL data predate the Elkhardt/Thompson merger.  Rather than fiddle with the data (e.g., use the 2015 enrollments) I’ve left Elkhardt/Thompson off the following graphs.

image

Here, it looks like the smaller schools have some advantage, but a 9% correlation on the reading tests is nothing to bet money on and, as to math, 4% is even less so.

And, as to budget, we’re spending lots of money per student at little Binford, with precious little return.

image

Finally, the high schools.

image

Those expensive, little schools are doing very well.

image

But if we take selective Open and Community out of the mix, we see that decreasing size doesn’t improve the pass rate in our mainstream high schools.

image

As to cost, the expensive, selective, little schools do better as to reading, but not so much as to math.

image

If we again take Community and Open out of the mix, we (yet again) see that more money per student is not buying better performance.

image

Indeed, to a 20% correlation, more money is buying less math performance.  That’s driven by the large, inexpensive high school, Huguenot.

We’ve heard that Richmond purposely invests in “small schools.”  For sure, this year Norfolk has, on the average, 44% more students per school (and, dare I say it, vastly better SOL pass rates).

image

What we are getting for those small schools is high cost but no better teaching.

As the estimable Carol Wolf keeps pointing out, we have too many old, decrepit, little schools.  When we replace them, it would be a mistake to build new, shiny, little schools.

Spending More But Not Teaching More, II

If at first you don’t succeed, hope for some help.  In this case, after I posted the 2014 data thinking they were 2015, Steve Fuhrman emailed to say that VEA has the 2015 Required Local Effort (“RLE”) data posted.  Indeed, they do.  Indeed, so does VDOE.  Somehow I pulled up the previous year’s data last time I tried.

Turning to the data, we again have some caveats: The RLE data again report Emporia and Greensville County, Fairfax City and County, and Williamsburg and James City County separately, but the SOL data combine each of those pairs.  I’ve deleted Emporia, Greensville, Williamsburg, and James City.  The Fairfax RLE numbers are close (116 for the County, 103 for the City) and the County is so large that it should swamp any difference in pass rates, so I’ve used the County RLE excess and the County+City pass rates.

As well, the VEA data show Lee County with no numbers, marked “Resubmission Pending,” so I’ve deleted Lee County.

To the good, we have Accomack County this time.

With those changes, here are the 2015 division reading pass rates plotted vs. the 2015 Expenditure for Operations excess over the Required Local Effort.

image

As a thank you to Steve, the green diamond is Charles City County. 

The gold square is Richmond.  The red diamonds are the Richmond peers, from the left Norfolk, Newport News, and Hampton. 

The median excess RLE is 77%.  The two high-spending, high-scoring divisions are West Point (284% excess RLE) and Falls Church (196%).

The least squares fitted line suggests that doubling the RLE increases the pass rate by about 2% but the 1.7% R2 tells us that the pass rate and excess RLE are essentially uncorrelated.

Here are the data for the remaining subjects and the five subject average:

image

image

image

image

image

Finally, with the omissions noted above, here are the five subject data.  I made the upper case entries in the RLE list (e.g., Roanoke CO) to force it to sort the same way as the  SOL list.

image

Spending More But Not Teaching More

As we have seen, some Virginia school divisions spend a lot more than others but the spending does not correlate with SOL performance.  On the 2014 data, an analysis of the excess “expenditures and appropriations designated to meet [the] required local effort in support of the Standards of Quality” produced the same result.

VDOE now has published the 2015 RLE data.  The estimable Jim Bacon yesterday posted an initial look at those data, which prompted me to take a more detailed look.

Note: Jim Weigand points out that these are 2014 RLE data, NOT 2015.  Sigh.  If I hadn’t screwed this up, I’d be wondering at length why it takes VDOE over a year and a half to post the ‘14 data.

Details: Accomack is missing from the RLE report and is omitted here.  The RLE report shows separate data for Emporia and Greensville County, Fairfax City and County, and Williamsburg and James City County; the SOL data, however, combine those three pairs.  I have omitted Emporia, Greensville, Williamsburg, and JCC; Fairfax City and County have nearly the same RLE (about 129% excess each) and the County is large enough to swamp the City data in any case so I used the County RLE and City+County SOL pass rate. 

With those adjustments, here are the 2015 division reading pass rates vs. the 2015 local expenditures for operations above the RLE.

image

The fitted line suggests that tripling the RLE increases the pass rate by about 4% but the 1.8% R2 tells us that the pass rate and excess expenditure are essentially uncorrelated.

Richmond is the gold square; from the left, the red diamonds are Hampton, Norfolk, and Newport News.

The Big Spenders out there are Sussex (221% excess, 72% pass rate) and West Point (218%, 94%).

And here are the data for the other four tests and the five subject average:

image

image

image

image

image

Here, with the omissions noted above, are the five-subject data.  I made the lower case entries in the RLE list (e.g., FRANKLIN co) to force it to sort the same as the SOL list.

Division   Excess RLE   5-Subject Average
ALBEMARLE 140.25% 81.3%
ALEXANDRIA 183.58% 70.9%
ALLEGHANY 180.40% 76.9%
AMELIA 44.58% 77.8%
AMHERST 94.26% 79.2%
APPOMATTOX 15.34% 81.6%
ARLINGTON 193.87% 86.4%
AUGUSTA 77.48% 80.6%
BATH 118.81% 79.0%
BEDFORD 87.24% 79.8%
BLAND 38.08% 77.6%
BOTETOURT 132.86% 89.4%
BRISTOL 44.94% 80.5%
BRUNSWICK 17.81% 68.1%
BUCHANAN 73.69% 75.1%
BUCKINGHAM 37.03% 76.5%
BUENA VISTA 63.02% 68.7%
CAMPBELL 112.60% 80.9%
CAROLINE 36.65% 73.7%
CARROLL 102.29% 79.3%
CHARLES CITY 95.38% 75.8%
CHARLOTTE 34.84% 84.1%
CHARLOTTESVILLE 154.45% 76.0%
CHESAPEAKE 114.57% 85.5%
CHESTERFIELD 82.30% 82.7%
CLARKE 101.54% 81.3%
COLONIAL BEACH 64.95% 77.1%
COLONIAL HEIGHTS 171.97% 83.2%
COVINGTON 152.31% 75.7%
CRAIG 39.00% 82.4%
CULPEPER 60.11% 78.9%
CUMBERLAND 69.99% 68.8%
DANVILLE 88.81% 67.7%
DICKENSON 63.43% 77.6%
DINWIDDIE 70.52% 74.0%
ESSEX 49.29% 66.4%
FAIRFAX 127.86% 85.5%
FALLS CHURCH 170.65% 92.0%
FAUQUIER 112.23% 83.1%
FLOYD 45.87% 79.7%
FLUVANNA 65.86% 81.6%
FRANKLIN CITY 102.95% 75.1%
FRANKLIN co 64.35% 82.2%
FREDERICK 124.29% 78.9%
FREDERICKSBURG 134.11% 76.8%
GALAX 70.74% 77.9%
GILES 43.03% 80.5%
GLOUCESTER 98.21% 81.9%
GOOCHLAND 59.77% 85.4%
GRAYSON 38.09% 78.1%
GREENE 73.40% 77.7%
HALIFAX 34.36% 72.4%
HAMPTON 88.31% 74.3%
HANOVER 58.92% 86.6%
HARRISONBURG 102.25% 74.2%
HENRICO 69.50% 79.7%
HENRY 39.17% 78.4%
HIGHLAND 23.29% 78.3%
HOPEWELL 73.16% 69.4%
ISLE OF WIGHT 68.85% 84.5%
KING and QUEEN 73.63% 77.6%
KING GEORGE 53.80% 81.2%
KING WILLIAM 100.45% 83.6%
LANCASTER 77.00% 66.9%
LEE 9.93% 77.8%
LEXINGTON 52.26% 88.2%
LOUDOUN 138.33% 88.0%
LOUISA 69.69% 83.3%
LUNENBURG 24.07% 75.2%
LYNCHBURG 103.34% 68.9%
MADISON 136.20% 77.7%
MANASSAS 172.35% 75.1%
MANASSAS PARK 102.57% 74.3%
MARTINSVILLE 111.14% 62.2%
MATHEWS 58.44% 80.2%
MECKLENBURG 29.19% 75.9%
MIDDLESEX 35.60% 82.8%
MONTGOMERY 79.77% 83.1%
NELSON 101.51% 78.1%
NEW KENT 81.64% 85.2%
NEWPORT NEWS 110.30% 72.6%
NORFOLK 90.52% 72.3%
NORTHAMPTON 31.46% 69.4%
NORTHUMBERLAND 56.26% 76.9%
NORTON 47.40% 81.0%
NOTTOWAY 27.39% 74.1%
ORANGE 63.27% 80.3%
PAGE 64.59% 76.5%
PATRICK 11.02% 79.7%
PETERSBURG 44.37% 60.4%
PITTSYLVANIA 22.77% 82.9%
POQUOSON 97.70% 89.5%
PORTSMOUTH 85.72% 74.3%
POWHATAN 107.65% 84.6%
PRINCE EDWARD 95.62% 71.2%
PRINCE GEORGE 45.02% 81.2%
PRINCE WILLIAM 98.56% 82.3%
PULASKI 65.25% 77.5%
RADFORD 83.96% 76.6%
RAPPAHANNOCK 76.17% 80.6%
RICHMOND CITY 90.30% 61.2%
RICHMOND co 76.54% 80.3%
ROANOKE CITY 132.74% 75.9%
ROANOKE co 103.62% 87.7%
ROCKBRIDGE 82.02% 80.9%
ROCKINGHAM 138.87% 83.8%
RUSSELL 29.03% 80.9%
SALEM 142.76% 87.5%
SCOTT 13.33% 86.0%
SHENANDOAH 84.78% 77.4%
SMYTH 44.66% 77.2%
SOUTHAMPTON 68.22% 82.5%
SPOTSYLVANIA 120.89% 80.6%
STAFFORD 123.97% 84.8%
STAUNTON 88.40% 75.3%
SUFFOLK 66.41% 76.3%
SURRY 136.52% 77.5%
SUSSEX 221.04% 76.1%
TAZEWELL 8.84% 84.0%
VIRGINIA BEACH 120.91% 84.0%
WARREN 83.74% 78.1%
WASHINGTON 108.98% 83.6%
WAYNESBORO 120.38% 71.0%
WEST POINT 217.53% 94.5%
WESTMORELAND 54.40% 75.1%
WINCHESTER 134.37% 75.0%
WISE 101.67% 88.9%
WYTHE 64.87% 80.2%
YORK 80.51% 88.0%
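For anyone who wants to check the fitted line and the R2 quoted above, the calculation is easy to reproduce from this table.  Here is a minimal sketch; it assumes you have saved the table as a CSV with the column names shown (an assumed layout, not anything VDOE or VEA publishes in that form).

```python
# Reproduce the least-squares fit and R-squared from the table above.
# Assumes the table has been saved as "rle_vs_sol_2014.csv" with columns
# division, excess_rle_pct, pass_rate_pct (an assumed file name and layout).
import csv

from scipy.stats import linregress

xs, ys = [], []
with open("rle_vs_sol_2014.csv", newline="") as f:
    for row in csv.DictReader(f):
        xs.append(float(row["excess_rle_pct"]))  # e.g., 140.25 for Albemarle
        ys.append(float(row["pass_rate_pct"]))   # e.g., 81.3

fit = linregress(xs, ys)
print(f"slope: {fit.slope:.3f} pass-rate points per point of excess RLE")
print(f"R-squared: {fit.rvalue ** 2:.3f}")  # small (about 0.02), i.e., essentially uncorrelated
```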

Gains and Not So Much, V

I have discussed the notion that divisions with high pass rates have little room for improvement while those with lower pass rates are shooting at larger targets.  VDOE’s (bogus) accreditation process seems to recognize this.  I suggested that a better measure of progress is the overall pass rate change divided by the previous year’s failure rate, in order to measure the relative change in the failure rate.

Today we have those data for the Richmond schools.  I’ve left out Amelia St. Sp. Ed., which had an enormous (51%) drop in the pass rate.

To begin, the raw change in the five-subject average pass rates:

image

And then the pass rate change divided by the 2014 failure rate:

image

Here are the data.  Note that comparison of the high schools with the elementary and middle schools is biased this year by the new retest policy in grades 3-8 that boosted the pass rates by ca. 4%.

image

Accreditation and Not

VDOE has updated its 2015-16 Accreditation data.

I earlier discussed their byzantine, opaque process for accrediting Virginia’s public schools.

Well, some of those schools.  You won’t find a rating for Maggie Walker, for instance, because VDOE counts the scores of the MW students at high schools they don’t attend.  And that just scratches the surface of the “adjustments” that boosted the accreditation scores this year by 6.1%.

This year they made it even easier to avoid “Accreditation Denied” by relabeling some denied schools as “Partially Accredited: Approaching Benchmark-Pass Rate,” “Partially Accredited: Improving School-Pass Rate,” or “Partially Accredited: Reconstituted School,” among others.

Adjustments or not, relabeling or not, VDOE could not get entirely away from Richmond’s second-from-last place performance on the reading tests or its sixth-from-last place on the math tests.  The initial accreditation results showed Richmond with 37.8% accredited, v. 77.6% of the state’s schools and, more to the point here, with 15.6% “To Be Determined.”  VDOE now has acted on five of the seven Richmond TBDs, which bumps the Accreditation Denied rate from 4.4% to 8.9% and the Reconstituted rate from zero to 6.7%.  Here are the new data:

image

image

image

Note that one of the new schools is Elkhardt/Thompson; the relabeling converts Thompson’s earlier “Denied” rating into “New School.”  All told, 53% of the Richmond schools were warned or denied accreditation this year with two schools still TBD and another failed middle school, Thompson, hiding in the definitional weeds.

The keel of this sinking ship is the middle schools: King denied; Hill improving;  Binford, Brown, and Henderson reconstituted; Elkhardt/Thompson new and camouflaging the denied Thompson.  Only Franklin, which includes both middle and high school grades, is fully accredited.

Data are here.

Gains and Not So Much, IV

We have seen that Richmond has fallen far behind on the SOL pass rates:

image

For example, see the data here

image

and here.

image

image

More recently, I noticed that divisions with high pass rates have little room for improvement while those with lower pass rates are shooting at larger targets.  The VDOE (bogus) accreditation process seems to recognize this.  I suggested that a better measure of progress is the overall pass rate change divided by the previous year’s failure rate, which measures the relative decrease in the failure rate.

Added Note: Jim Weigand reminds me that the 2015 pass rates are boosted by about 4% by the new retest policy, albeit the Virginia Department of Data Suppression is not posting anything that would let us evaluate that effect on any particular division or school.

Here are those data for Richmond, broken out by race and economic disadvantage (“Y” indicates economically disadvantaged, “N” = not).  Let’s start with Reading.

image

image

image

Then Math:

image

image

image

Writing.  Note: No data for Richmond’s white, economically disadvantaged kids:

image

image

image

History & Social Science:

image

image

image

Science:

image

image

image

Finally, the five subject averages:

image

image

image

These data suggest that Richmond is doing a poorer job with its students who are not economically disadvantaged and that it is focusing on reading and math to the detriment of writing, history & social science, and science.

Gains and Not So Much, III

Brian Davison suggests that the increased number of retakes benefits the divisions with low pass rates, i.e., with lots of students who might be eligible for retakes.  But, since the State Department of Data Suppression does not tell us about retakes, we can’t know about that.

I’ll suggest there is a more subtle problem: If a division with a 90% overall pass rate increases that rate by 1%, it has cut its failure rate by 10%.  In contrast, a division with a 50% pass rate that increases its pass rate by 1% still leaves 49% of its students failing; it has cut its failure rate by only 2%.  To achieve a result equivalent to the first division’s, this division must increase its overall pass rate by 5%.  But then, it is shooting at a larger target:  A division with a high pass rate has little room for improvement; a division with a low pass rate has plenty of room for (and needs to make lots of) improvement.

The estimable Carol Wolf suggests that I use a simpler analogy: If your pass rate is 50%, you get fifty shots per hundred kids at improving it; if the pass rate is 90%, you get only ten.

That is, a fairer measure of progress is the overall pass rate change divided by the percentage of students who failed to pass the previous year.
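That measure is easy to compute.  Here is a minimal sketch; the example numbers are the hypothetical 90% and 50% divisions from above.

```python
def relative_failure_reduction(pass_prev, pass_curr):
    """Pass-rate change divided by the previous year's failure rate."""
    return (pass_curr - pass_prev) / (100.0 - pass_prev)

# The hypothetical divisions from the discussion above:
print(relative_failure_reduction(90, 91))  # 0.10 -> a 10% cut in the failure rate
print(relative_failure_reduction(50, 51))  # 0.02 -> only a 2% cut
print(relative_failure_reduction(50, 55))  # 0.10 -> the equivalent improvement
```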

Here are those data for Lynchburg:

image

image

image

Gains and Not So Much, II

Delving further into Jim Weigand’s report that his Super is bragging on the year-to-year SOL performance in Lynchburg: The problem is the absence of data on the new retest rate that raised pass rates by about 4% in 2015.  So let’s look at the 2014-2015  changes in pass rates v. the state average, which at least discounts the Lynchburg score gains by the statewide equal opportunity for grade inflation.
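The comparison itself is just a difference of differences.  Here is a minimal sketch with made-up numbers; the actual rates come from the VDOE SOL database.

```python
def gain_vs_state(div_2014, div_2015, state_2014, state_2015):
    """A division's 2014-to-2015 pass-rate change minus the statewide change.

    Since every division had the same opportunity for retest-driven gains,
    subtracting the statewide change at least discounts that effect.
    """
    return (div_2015 - div_2014) - (state_2015 - state_2014)

# Made-up illustration: a division gains 6 points while the state gains 4.
print(gain_vs_state(70.0, 76.0, 75.0, 79.0))  # 2.0 points better than the state
```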

Here, for a start, are the five-subject pass rate changes by race and economic disadvantage (“Y” = economically disadvantaged as reported in the VDOE database; “N” = not).

image

On the five-subject average, the Super gets bragging rights for the non-ED white kids, with a 1.4% score gain over the state average; otherwise he has some explaining to do, especially as to the ED white students, who improved by 3.9% less than the state average.

On the reading tests, the opportunity for bragging rights doubles by the addition of the black, economically disadvantaged students.  But the score gain by the white, ED population lags the state average by 2.5% and the black, not-ED by 2.9%.

image

Finally, on the math tests, the Lynchburg Super gets full bragging rights, especially as to the black, ED students.

image

Looks to me like Lynchburg needs to figure out what its math teachers are doing and spread some of that to the teaching of the other four subjects.

Gains and Not So Much

Jim Weigand of Lynchburg reports that his Super is bragging on the year-to-year SOL performance there.  Jim points out that we don’t know how much of that can be attributed to the new retesting regime that raised the grade 3-8 pass rate by about 4%.

I’ve asked Chuck Pyle whether retest data are available.  While waiting for his answer, I thought I’d take a glance at the raw year-to-year changes.

For a start, here are the five-subject average division pass rate changes from 2014 to ‘15:

image

Note that Excel can fit only so much information on the axis and, thus, gives names only for alternate divisions.  For example, Richmond is the unnamed yellow bar between Rappahannock and Richmond County.

The green bar is Lynchburg; the red bars are, from the top, Hampton, Newport News, and Norfolk.  The average gain is 3.9%.

Here are the same data, sorted by change in pass rate:

image

From this direction, Lynchburg looks pretty good, albeit we don’t know whether they had an unusual retest rate.

For completeness, here are the Reading and Math score increases (reading average = 5.4%, math = 6.3%):

image

image

On that last graph, Highland is off the scale at 27.8%.

I’ll bet you a #2 lead pencil that the State Department of Data Suppression does not have the retest data, which will leave us not knowing how much any of these changes represents actual improvement in the pass rate.

Lousy Schools, Not a Shortage of Money

I earlier showed that Richmond’s dismal pass rates on the reading and math SOLs are not explained by Richmond’s expense per student or its large population of economically disadvantaged students.  Indeed, even among the twenty-eight divisions with similar or higher populations of economically disadvantaged students, Richmond underperforms grossly.

It’s arctic outside this morning and I’m in no hurry to vacuum the house, so I thought I’d take a further look at the data.  The VDOE SOL database is glad to deliver the pass rates of the students who are not economically disadvantaged.  Here, then, are the five-subject division average pass rates of the non-ED student populations v. the 2014 division disbursements per student (VDOE won’t post the 2015 data until sometime this Spring), with disbursements for adult education, facilities, debt service, and contingency funds not included.
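For anyone who wants to repeat the exercise, the data handling amounts to stripping those categories out of the disbursements and dividing by enrollment.  Here is a minimal sketch; the file and column names are placeholders for whatever you export from the VDOE tables, not VDOE’s actual field names.

```python
# Minimal sketch of the data handling described above.  The file and column
# names are placeholders, not VDOE's actual field names.
import pandas as pd

disb = pd.read_csv("division_disbursements_2014.csv")  # one row per division
sol = pd.read_csv("non_ed_pass_rates_2015.csv")        # five-subject average, non-ED students

# Leave out adult education, facilities, debt service, and contingency funds.
excluded = ["adult_education", "facilities", "debt_service", "contingency"]
disb["operations"] = disb["total_disbursements"] - disb[excluded].sum(axis=1)
disb["per_student"] = disb["operations"] / disb["students"]

merged = sol.merge(disb[["division", "per_student"]], on="division")
print(merged.head())
```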

image

Richmond is the gold square; the red diamonds, from the left, are peer cities Hampton, Newport News, and Norfolk.

Excel was glad to fit a least-squares line to the data.  The slope suggests that increasing disbursements decreases the pass rates but the R2 shows that pass rates and disbursements are essentially uncorrelated.

One might think that a very large population of economically disadvantaged students would weigh on the performance of the less disadvantaged.  To look at that, here are the same data for the twenty-eight divisions with the highest populations of economically disadvantaged students.

image

The red diamonds are Newport News and Norfolk. 

As with the general populations, Richmond is spending a lot of money and getting lousy results.

Note that all but two of these twenty-eight divisions have larger ED populations than Richmond:

image

Indeed, Bristol, Colonial Beach, and Roanoke all managed pass rates >90% with ED percentages higher than Richmond’s.

As I’ve said, Richmond’s problem is lousy schools, not poverty or lack of money.

Here are the data:

image