Money Don’t Buy You Learning

It’s December. The Generous Assembly is about to return and the demands for more education funding [see Executive Summary at p.4] resound throughout the Commonwealth.

The data would suggest that these demands are misplaced.

VDOE won’t post the 2020 expenditure data until sometime this spring and there were no 2020 SOLs, so we’ll use the 2019 expenditure and SOL data. The expenditure numbers below are those for “day school operation” (the sum of Administration, Instruction, Attendance and Health Services, Pupil Transportation, and O&M spending). Student counts are the year-end average daily membership.

One wrinkle: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. We’ll avoid that issue here by looking at the rates for both groups.
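The composition effect is easy to see with a small, hypothetical example (the rates below are made up for illustration): two divisions whose ED and Not ED students pass at identical rates will report quite different division averages if their ED shares differ.

```python
# Hypothetical illustration of the composition effect: identical group
# pass rates, different ED shares, different division-average pass rates.
def division_average(ed_rate, not_ed_rate, ed_share):
    """Overall pass rate as an enrollment-weighted average of group rates."""
    return ed_rate * ed_share + not_ed_rate * (1 - ed_share)

# Both hypothetical divisions: ED students pass at 60%, Not ED at 80%.
affluent = division_average(60, 80, ed_share=0.20)  # 20% ED students
poorer = division_average(60, 80, ed_share=0.70)    # 70% ED students

# The poorer division reports a lower average (66 v. 76) even though
# each group of its students performs exactly as well.
print(round(affluent, 1), round(poorer, 1))
```

That is why comparing the group-specific rates, rather than the blended division averages, is the fairer measure.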

Here, then, are the division average reading pass rates for the two groups plotted v. the division day school expenditure per student.


Richmond is shown as the enlarged points with yellow fill. The red-filled points are, from the left, the peer cities Hampton, Norfolk, and Newport News.

The fitted lines suggest that performance of the Not ED students increases slightly with expenditure (about 2% per $10,000) while the ED scores decrease (ca. 4% per $10,000). The R-squared values, however, tell us there is only a minuscule correlation between the pass rates and the expenditures.
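For readers who want to check this kind of fit themselves, here is a minimal sketch of computing a least-squares line and its R-squared; the (spending, pass-rate) pairs are invented for illustration, not the VDOE data.

```python
import numpy as np

# Hypothetical (spending, pass-rate) pairs -- NOT the actual VDOE data.
spending = np.array([9000, 10500, 11000, 12500, 14000, 19000], dtype=float)
pass_rate = np.array([78.0, 83.0, 74.0, 81.0, 76.0, 80.0])

# Least-squares line: pass_rate ~ slope * spending + intercept
slope, intercept = np.polyfit(spending, pass_rate, 1)

# R-squared: the share of pass-rate variance explained by spending.
predicted = slope * spending + intercept
ss_res = np.sum((pass_rate - predicted) ** 2)
ss_tot = np.sum((pass_rate - pass_rate.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# A minuscule r_squared means spending explains almost none of the
# variance in pass rates, whatever slope the fitted line shows.
print(slope, r_squared)
```

The point of the sketch: a fitted line always has *some* slope; it is the R-squared that tells you whether the slope means anything.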

We can get a clearer view of the data for Richmond and the peer cities by expanding the axis to hide the Very Big Spenders.



Need I say it? Richmond is spending well above average money and obtaining lousy results for both groups of students.

The math data tell the same story: More Money doesn’t correlate with more learning; Richmond spends a lot and gets awful pass rates.



As a pleasant contrast to that bad news, the (locally created and run) Comprehensive Instructional Plan has produced remarkable gains in the Southwest:


They’ve told us how they achieved this:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

While the Generous Assembly is in town perhaps they will consider taking the school improvement budget that is wasted at VDOE and giving it to the CIP, where they know how to get results.

More Graduates, Less Learning

The estimable Jim Bacon notices the increased graduation rates this year and wonders how much of the increase reflects the waivers issued by the Superintendent.  We have some of the underlying data.

On May 26, the Governor issued Executive Order 51 that provides, in part:

Authorization for the heads of executive branch agencies, on behalf of their regulatory boards as appropriate, and with the concurrence of their Cabinet Secretary, to waive any state requirement or regulation. . . .  All waivers issued by agencies shall be posted on their websites.

The “guidance” on the VDOE Web Site provides, also in part:

The following graduation requirements are waived based on authority granted to the Superintendent of Public Instruction per Executive Order Fifty-One (2020):

  • Students currently enrolled in a course for which they need a verified credit in order to graduate;

  • Students who have previously been awarded standard credit, but have not earned the associated verified credit;

  • Students who have not completed the student-selected test;

  • Students who are currently enrolled in or have previously completed a course leading to a CTE credential necessary for a Standard Diploma but have not yet earned the credential;

  • Students who have not completed a United States and Virginia history course*;

  • Students who have not completed a fine or performing arts or career and technical education course*;

  • Students in the second of sequential courses*;

  • Students who have not completed an economics and personal finance course*.

VDOE does not set out any direct measure of the effect of these waivers but the history of the cohort graduation rates provides some clues.

But first, what rates shall we measure?

The VDOE announcement brags that the 4-year, “on time,” 2020 cohort graduation rate rose to 92.3% from 91.5% last year, “despite the closure of schools due to COVID-19 in March.”

At the threshold, we might notice that the “on time” graduation rate is manipulated by the inclusion of Modified Standard diplomas that are issued to students who “are unlikely to meet the credit requirements for a Standard Diploma.” The more honest numbers are the “federal” rates (counting only the advanced and standard diplomas), 89.9% this year and 88.7% last year.

Note added later on 10/3: Oops! In fact, there were fewer than ten Modified Standard diplomas this year. The Board already put its thumb on the scale as to that diploma; “The Modified Standard Diploma will not be an option for students with disabilities who enter the ninth grade for the first time beginning in 2013-2014. Credit accommodations allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.” The current jiggering is set out here: “Special education students and limited English students who have plans in place that allow them more time to graduate will be counted as graduates or non-graduates when they earn a diploma or otherwise exit high school.” Translated, that means that Special Ed and LEP students who don’t graduate in four years get counted in the next (or later) year’s cohort. Talk about win-win for boosting the graduation numbers.

Here is the recent history of the federal diploma rates.


The state rate was flat between ‘18 and ‘19; extrapolating that flat stretch suggests that the 1.2-point improvement in ‘20 was entirely artificial. The fitted line, in contrast, shows an increase of ca. 0.3 points per year as of 2019, suggesting that 3/4 of this year’s 1.2-point increase came from the waived requirements. Slice that any way you like: most of the increase reflects administrative fiat, not academic accomplishment.

Thus, of the 98,481 students in the cohort, it looks like somewhere between about 885 and 1180 received bogus diplomas.
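The arithmetic behind that range is just the cohort size times the share of the rate increase attributed to the waivers; a back-of-the-envelope sketch, not an official estimate:

```python
# Back-of-the-envelope estimate of waiver-inflated diplomas.
cohort = 98_481

# The federal graduation rate rose 1.2 points this year. The pre-2020
# trend suggests roughly 0.3 points of that was organic, leaving about
# 0.9 points attributable to the waivers; the flat '18-to-'19 reading
# attributes the full 1.2 points to them.
low_estimate = cohort * 0.009    # waivers account for 0.9 of the 1.2 points
high_estimate = cohort * 0.012   # waivers account for the full 1.2 points

print(round(low_estimate), round(high_estimate))  # 886 1182
```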

Closer to home, the Richmond data suggest that the waived requirements artificially reversed Richmond’s plunging graduation rates.

You and I get to wonder what kind of lying bureaucrat would brag that the artificial increase served “to ensure that students were not held back because being unable to take a Standards of Learning test or complete a required course” while ignoring the wholesale grant of degrees to students who would not have earned them.

The distribution of the 2020 division rates skews toward the low end with Richmond, the yellow datum, leading the skew.


Data for Covington, Lexington, and Highland Co. are absent. I’ve colored the 90% datum light blue to indicate the location of the state average.

All that said, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”).  The division average SOL pass rates for the two groups differ by about 20 points.

Similarly, the (inflated) state average graduation rates for Not ED and ED students differ this year by just 10%, so the division averages again reflect affluence as well as performance. To deal with that, let’s break out the data for those two groups.



Here we see that the state average increase this year came mostly from the ED population. This makes sense, given that the ED group graduates at a lower rate and would be helped more by a waiver of graduation requirements.

As to Richmond, this year’s jump came entirely from the ED average while the graduation rate of Not ED students actually dropped.

(If you can explain that Not ED drop in the face of waived requirements, or those other fluctuations in the Richmond rates, please relieve my confusion with an email to John{at}calaf{dot}org.)

When we separate out the ED and Not ED data, Richmond’s place among the divisions is more complicated.



Richmond is the gold bar in the first graph and is one of the four represented by the gold bar in the second. The light blue bars again are at the counts at the state averages, here 92 and 82.

As to Not ED students, Richmond’s 59% graduation rate (!) is third from the bottom. (Mercy! What can be going on in Covington, 12%, and Hopewell, 37%?)  As to ED students, Richmond’s 74% is too low, but not in the cellar.

These data do not begin to explain why Richmond’s Not ED graduation rate is so appallingly low, nor why it declined (yet again) this year.

Finally, here are the graduation rate changes from 2019 to 2020, sorted by decreasing ED differences.


It is far from obvious why any of those graduation rates should decrease in light of the waivers, much less how Goochland landed a 24.76% decrease in the ED rate.

“Equity” and the Performance of Virginia’s Black and White Students

The estimable Jim Bacon suggests that the Northam administration’s emphasis on “equity” and “restorative-justice” is keeping disorderly students in the classroom to the detriment of the other students. As well, he posits that behavior problems are more common among black students so the effect should be larger in divisions with larger black populations.

VDOE has some data that might speak to those issues.

Elementary and middle school students mostly take the same tests at the same time. High school, not so much. So let’s look at the data for the elementary and middle school grades.

First, the disorder. The Safe Schools Information Resource goes back to 2015. For grades 3-8, the statewide counts there of individual offenders as a percentage of their ethnic population are:


In absolute terms, the 2019 rate of individual offenders for black students, 16.4%, is 9.2% higher than the all-students rate of 7.3%, while the white students’ rate is 2.8% below it.

All three rates increased from 2015: All students by 0.7%, black by 0.8%, and white by 0.6%. That is consistent with the Bacon hypothesis and it confounds any notion that the government’s actions are reducing disorder in our schools. 

The picture in Richmond is less definitive.


Contrary to the state averages, all three of Richmond’s offender rates decreased after 2016: all students by 6.1%, black by 4.4%, and white by 2.2%.

But then there is Fairfax County.


(Oops! Corrected error in the x-axis, 6/26/20. Hat Tip WayneS.)

Overall, the Fairfax rate rose by 3.3% (61.5% of the 2015 rate); the black rate increased by 1.9% (35.3% of the ‘15 rate); the white rate, 0.73% (59.9% of the 2015 starting point).

It is tempting to assign the huge increases in the offender counts in Fairfax to the emphasis there on “restorative-justice (pdf)” and “equity.” The smaller increases statewide offer a lesser but still enticing temptation. No telling what those Richmond decreases mean, but with a 2019 offender rate of 17.5%, 2.4 times the state average of 7.3%, the only sensible inference is that it’s past time to move one’s family to one of the nearby counties.

For sure, something awful is going on in Richmond schools and something is getting worse statewide, much worse in Fairfax.

To look for the possibility of an “equity” effect on performance, let’s turn to the SOL pass rates.

The big two SOL tests are reading and math. First, the state average pass rates on the reading tests.


The large drop in 2013 came from new, tougher tests. Those new tests also exacerbated the black/white performance gap.

To the point here, these data show slightly improving pass rates for both black and white students in the first years after the new tests but faltering rates in 2019. During the same period, the gap between black and white students improved (decreased) by 4.0 points, a significant amount, albeit virtually all the improvement came during the recovery, such as it was, from the new tests.

On the math tests, the big drop came with new, tougher tests in 2012. More recently, the pass rates were faltering until they enjoyed a bounce in 2019 from newer, easier tests.  Also in the recent period the black/white gap was worsening slightly but improved a bit with the easier tests.

(There’s a template for making the entire school system look better: Just water down the tests every year.)


The black/white difference improved by 3.4 points from 2013 to 2019 with, again, most of the improvement coming before the “equity” campaign.

If the Governor has had any effect on pass rates, it’s not obvious here. So let’s turn to a couple of the more interesting divisions.

Richmond has one of the largest percentages of black students in the state.


(2019 data. Race group names abbreviated; Am. Indian and Pacific Islander groups both < 0.5% and omitted to simplify the graph).

Or, in terms of the black/white ratio,


The Richmond reading data tell a sorry story.


There’s no recent improvement in the Richmond rates; indeed, the (already appalling) rates for the black students have declined some in the last three years while the black/white gap worsened by 9.5 points. That gap now stands at 39.3, 188% of the state average gap.

On the math tests, the Richmond gap has deteriorated consistently since 2014, landing at 35.9% in 2019.


Turn now to Fairfax, a county that has been a leader in the restorative-justice (pdf) movement.

The racial distribution in the Fairfax schools is different from the state average and quite different from Richmond.



Both black and white students in Fairfax consistently beat the state averages for their groups but the black/white gaps closely track the state numbers.



These data suggest that the “equity” movement in Virginia’s schools has not improved pass rates on the elementary and middle school reading and math tests and has generally failed to halt ongoing deterioration of the black/white rate gaps.

The recent, general decline of these pass rates, except under the new, easier math tests, is consistent with increasing disorder in the classrooms but does not establish a causal relationship.

On these data, Bacon is batting at least .750:

  1. Keeping disorderly students in the classroom: Number of offenders is rising;
  2. To the detriment of the other students: Pass rates are falling;
  3. Behavior problems are more common among black students: Astronomically;
  4. The effect should be larger in divisions with larger black populations: Not so, at least on this very small sample of divisions.

Because of the very small number of divisions in the sample here, I need to look at the data for all the divisions. Stay tuned.

More on 2019 Graduation Rates

Having just looked in some detail at the dropout data, let’s turn to the graduation rates. These are 2019, 4-year cohort, on-time graduation data.

But first, some background:

According to the U.S. Census Bureau’s American Community Survey, the population of U.S. 18- through 24-year-olds not enrolled in school and without a high school diploma or General Educational Development, or GED, credential was 16.4 percent in 2009. Among 16- to 24-year-olds who were incarcerated during 2006-07, only 1 in 1,000 had a bachelor’s degree, while 6.3 percent were high school dropouts who didn’t have a GED. (Sum, Khatiwada, McLaughlin & Palma, 2009).

As to Virginia, here are the division average, on-time diploma rates for economically disadvantaged students (“ED”) plotted v. the rates of their more affluent peers (“Not ED”) (data are percentages).


Richmond’s disastrous performance aside, these data share with the dropout data a curious inversion: Given that ED students generally underperform their Not ED peers on the SOL tests (for example, see here), we might expect that the ED graduation rates would be lower than the Not ED. The state averages, ED 87.2 & Not ED 93.9, are consistent with that. But Richmond shows a higher ED than Not ED rate, 73.9 v. 65.9. And the fitted line, notwithstanding the relatively low R-squared value, suggests that on average Not ED rates below 84.6 are associated with higher ED than Not ED graduation rates.


Indeed, all the divisions above the gray line on the graph below exhibit that anomaly.


Trophy Offer: As with the dropout data, I’ll give a #2 lead pencil as a prize to anybody who can offer a (testable) hypothesis that explains this phenomenon. But before you heat your brain up on this, take a look at the post that will follow this one in a day or two.

Turning to the data by school, we see the details of Richmond’s win in the race to the bottom.


As well, the Richmond schools, other than Marshall and the three selective schools, show anomalously high ED graduation rates.


Of course, Richmond’s graduation rates are a direct reflection of the dropout rates.


The red diamonds are the peers, from the left Newport News, Hampton, and Norfolk.

More on 2019 Dropouts

The VDOE Web site has some Excel-friendly dropout data that are more granular than the more usual reports. In particular, these data include 4-year cohort dropout rates for both the economically disadvantaged (“ED”) students and their more affluent peers (“Not ED”).

First, some background:

According to the U.S. Census Bureau’s American Community Survey, the population of U.S. 18- through 24-year-olds not enrolled in school and without a high school diploma or General Educational Development, or GED, credential was 16.4 percent in 2009. Among 16- to 24-year-olds who were incarcerated during 2006-07, only 1 in 1,000 had a bachelor’s degree, while 6.3 percent were high school dropouts who didn’t have a GED. (Sum, Khatiwada, McLaughlin & Palma, 2009).

To start with the Virginia numbers, here are the 2019 division data, plotted as the ED dropout rate v. the Not ED rate (data are percentages).


Richmond, the gold square, is the “leader” here. The red diamonds are the peer jurisdictions, from the left Hampton, Newport News, and Norfolk. The blue diamond is the state average. Charles City is the green diamond at (0, 0) with no dropouts of either sort. Lynchburg is the purple diamond.

A handful of divisions (Halifax, King & Queen, West Point) are absent from this graph because VDOE reported no datum for one rate or the other (probably because of their suppression rules).

There is a curiosity here: We know that the Not ED students generally outperform their less affluent peers on the SOL tests. For example, see here. The state average dropout rates (4.15 Not ED, 8.17 ED) are consistent with that. But the slope of the fitted line here is less than one, indicating that the divisions with larger Not ED dropout rates have relatively lower (i.e., better) ED rates. Indeed, the least squares fit has an intercept of 5.5 (at a 0% Not ED rate, the fitted ED rate is 5.5% higher) and a slope of 0.83, so the two fitted rates converge as the Not ED rate rises; where they cross, and at higher Not ED dropout rates beyond that, the fitted average ED rates are lower than the Not ED.
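Reading the 5.5 as the fitted line’s intercept and the 0.83 as its slope (my reading of the fit described above; the actual coefficients may differ), the crossover point follows from simple algebra:

```python
# Where do the fitted ED and Not ED dropout rates cross?
# Assumed coefficients, read from the fitted line discussed in the text:
intercept = 5.5  # fitted ED rate at a 0% Not ED rate
slope = 0.83     # change in fitted ED rate per point of Not ED rate

# Fitted line: ED = intercept + slope * NotED.
# The rates are equal where intercept + slope * x = x,
# i.e. x = intercept / (1 - slope).
crossover = intercept / (1 - slope)

# Above this Not ED dropout rate, the fitted ED rate is the lower one.
print(round(crossover, 1))
```

With those assumed coefficients, the crossover sits around a 32% Not ED dropout rate, far above nearly every division; within the observed range, the fitted ED rate stays above the Not ED rate while the gap narrows.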


Trophy Offer: I’ll give a #2 lead pencil as a prize to anybody who can offer a (testable) hypothesis that explains this phenomenon. Send your ideas to john{at}calaf{dot}org.

The data by school show one egregious datum (Fairfax County Adult High, with an 85.5 Not ED rate) and a scattering of merely awful numbers, half of them from Richmond (gold squares, again).


Note: A number of schools are absent here because of the suppression rule.

Perhaps it would be useful to shrink the axes to eliminate that Fairfax point.


Focusing on the Richmond schools, we see:


Or, in terms of a table:


Notice that, aside from the three selective schools, all but Alternative have lower (better) ED than Not ED dropout rates (albeit Alternative is “selective” in that it serves “students with academic, attendance, and behavior challenges”).

In the meantime, Marshall and TJ have double the state average Not ED rates and the other four have astronomical rates for both groups.

Back to the Ed. Week article quoted above:

A 2008 review of the research on preventing dropouts by the U.S. Department of Education also identifies key components of effective programs. Besides data-based, early-warning systems, these strategies include: creating more personalized learning environments for students; providing extra support and academic enrichment for struggling students; assigning adult advocates to students deemed to be at risk of dropping out; and providing rigorous and relevant instruction to engage students in learning.

Middle Schools in Context

For the Richmond middle schools, the 2019 SOL pass rates range from encouraging to appalling, mostly the latter.

To start, here are the rates on the 6th grade reading test. “ED” indicates “economically disadvantaged” students. “Not ED” are the more affluent peers. I’ve highlighted the Richmond data: green for Not ED and red for ED.



  • Pass rates on the SOL tests generally are ca. 15 to 20 points lower for ED than for Not ED students. The reported SOL pass rates are averages over all students, so schools and divisions with larger proportions of ED students generally have lower reported pass rates. This reporting system thus unfairly rewards the more affluent schools and divisions and penalizes the poorer ones. By examining both the ED and Not ED results, we can avoid that problem.
  • Some of the very high and low %ED schools run into the suppression rule: VDOE suppresses the report where there are <10 students in a group. Data for those cases are missing from the database and, thus, from these graphs.

To focus on the Richmond schools, let’s hide the statewide data but keep the fitted lines that tell us about the state averages. And, FWIW, the purple line below is the nominal level for accreditation. But recall that VDOE “adjusts” the pass rates in order to accredit many schools that come nowhere near that threshold.


Notice the changed range on the x-axis.

Franklin, a selective school, is the winner here. Among the Not ED rates of the mainstream middle schools, Hill beats the state average, Binford flirts with it, and Brown is low (and below the accreditation threshold). Everything else, ED and Not ED, is below, mostly far below, the state average for ED students. Henderson is missing, courtesy of the suppression rule. Their ED pass rate is 48%. The Boushall ED and Not ED rates are inverted, as are the Elkhardt-Thompson, albeit less dramatically.

Turning to the 7th grade:



MLK is a victim of the suppression rule; the ED pass rate there is 25%. Franklin makes a less splendid (but still above average) showing. For Boushall, the usual ED/Not ED order is inverted again. Richmond, again, overpopulates the cellar.

Eighth grade reading:



Franklin is back in the saddle here. MLK rises above the suppression rule, but its news is hardly good. The Boushall Not ED rate again is below the ED; something unusual looks to be going on there.

The new math tests in 2019 raised pass rates statewide by 3.4% for Not ED students, 6.6% for ED. Notice the elevated fitted lines here. That dilution of the tests did little for Richmond.



MLK and Henderson are caught by the suppression rule. MLK’s 27% ED rate, Henderson’s 37%, and too much of the other Richmond data again help define the bottom of the barrel. The Boushall rates again are inverted.

Next, the seventh grade.



Franklin makes its worst showing here. Henderson and MLK again have too few Not ED students to get past the suppression rule; their ED rates are 22% and 14%. Binford takes a hit. The Boushall numbers again are inverted.

Eighth grade math:



Henderson is suppressed; its ED rate is 61%. The ED and Not ED rates are inverted at MLK and Elkhardt-Thompson. Perhaps this reflects the natural fluctuation to be expected with very small numbers of Not ED students at MLK (25 Not ED students tested) but that looks less likely at E-T (49 Not ED students).

The bottom line: Except for Franklin and for Not ED students at Hill and (mostly) at Binford, Richmond’s middle schools are a disgrace to the community and a threat to our children.

Improving Graduation Rates

How? By applying the lesson of the history of rampant cheating when the locals get to grade SOLs: 

Department of Education

State Board of Education

Regulations Establishing Standards for Accrediting Public Schools in Virginia [8 VAC 20 ‑ 131]


Amendments to permit students entering the ninth grade prior to the 2018-19 school year to be awarded locally-awarded verified credit in English and mathematics when certain Board of Education-established criteria are met.

General Information

Action Summary
The proposed amendments allow students that entered the ninth grade prior to the 2018-19 school year to be awarded locally-awarded verified credits in English and mathematics when certain Board of Education-established criteria are met…

And Then There Was Petersburg

The previous post showed the recent progress, particularly of the economically disadvantaged (“ED”) students, in Southwest Virginia (Region 7). For example, on the reading tests:


Contrast this with the abiding failure in Petersburg.


The math data tell much the same story.



Note added 11/7: A reader (the reader?) points out that PBurg has lots of ED students. Indeed, 76% this year, v. 61% in Region 7. The point remains: Region 7 has LOTS of ED students and manages to improve their pass rates; PBurg has more and utterly fails to improve.

In SW Virginia, the schools formed the Comprehensive Instructional Program, first implemented in 2015. The CIP now has expanded to some forty-one divisions.

As to Petersburg, the State Board of Education adopted a Corrective Action Plan in 2016. This was only the latest step in a process that began in 2004.

(“MOU” is bureaucratese for Memorandum of Understanding, which is the misleading term for an edict of the Board of Education.)

Here is an abbreviated comparison of the Region 7’s Comprehensive Instructional Program and the Board of Education’s Corrective Action Plan for Petersburg:


STOP! Please go back and read all of each of those lists so you can fully appreciate the fecklessness of the Board’s approach.

A modest proposal: Let’s expand the CIP statewide and shrink the Board of Education’s function to what they can usefully do: statistics and webinars.

Super Southwest Scores

The estimable Jim Bacon reports on the recent success of schools in Southwest Virginia. The schools there collaborated in an initiative that now has spread to encompass 41 divisions. A map of those divisions is here. (Click on the map to identify a division of interest.)

The VDOE Web pages have some data on the results. The graphs below are for the original area, “Region 7.”

Image of Virginia regions

The recent improvements in Region 7 SOL pass rates have been remarkable.



The numbers beneath those averages are even more remarkable.

To start, these schools have large proportions of what the state calls “economically disadvantaged” (here “ED”) students. Statewide, ED students pass the SOL tests at lower averages than their more affluent (“Not ED”) peers: in 2019, 21.97 points lower in reading, 16.79 points in math.

Region 7 has a lot of ED students. For example, on the reading tests:


Note: There is an anomaly in these numbers. Washington County goes from 62.3% ED in 2018 to 99.1% in 2019. This looks like a data error; I have asked VDOE about it. The 2019 average with Washington Co. excluded would be 56.0%. For now, it would be wise to ignore the Washington Co. numbers and to discount that 61% some.

The recent improvements in the Region 7 SOL averages are driven by improvements in the ED pass rates (that were better than the State averages even before the CIP collaboration).



At the risk of cluttering those graphs, here they are again with the SOL averages included.



Of course, the progress in the region has not been homogeneous. Here, to risk a spaghetti graph, are the Not ED reading data. (Excel runs out of colors, thus the duplicates. But, really, there are too many data in this graph.)


The yellow line running across the bottom is Buchanan Co. The blue pair at the top in recent years are Radford and Wise Co. The light green line that plunges from near the top in 2018 to dead last in 2019 is Washington Co., with 2019 probably a bad datum.

The ED data show everybody but Pulaski beating the state average.


That green winner in 2019 is Washington County (again probably a data problem).

For the record, the math data:



The Bacon post will tell you how they did this.