Economic Disadvantage v. More Teachers

We have seen that, on the 2017 data, division average SOL pass rates are not correlated with the number of teachers per student.  There is a flaw in that analysis, however:  Economically disadvantaged (“ED”) students score lower on the SOL, on average, than Not ED students.  Thus, divisions with larger populations of ED students tend to have lower overall pass rates.

The VDOE database can break out the pass rates for both ED students and their more affluent peers, so let’s take a more nuanced look.

To start, here are the 2017 division average reading pass rates for both ED and Not ED students graphed vs. the number of teachers per thousand students.  (The latest available teacher numbers are from 2017.)

The slopes of the least squares fitted lines might suggest that more teachers in the division correlate with decreased pass rates of the Not ED students and slightly increased rates of the ED students.  But the R-squared values tell us that the pass rates in both datasets are essentially uncorrelated with the teacher/student ratios.
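For readers who want to check such fits, here is a minimal sketch of a least-squares line and its R-squared in plain Python.  The division numbers below are made up for illustration; they are not the VDOE data.

```python
def fit_and_r2(xs, ys):
    """Least-squares line y = a*x + b, plus the R-squared of the fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx                      # slope
    b = my - a * mx                    # intercept
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1 - ss_res / ss_tot

# Hypothetical divisions: teachers per 1,000 students vs. pass rate.
teachers = [95, 105, 110, 120, 135, 160, 189]
rates = [78, 74, 80, 76, 79, 75, 83]
slope, intercept, r2 = fit_and_r2(teachers, rates)
# An R-squared near zero says the two variables are essentially uncorrelated,
# whatever the sign of the slope.
```

This is the same calculation Excel performs when you add a trendline and display the R-squared value.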

In short, these data reach the same result as the overall pass rate data: Divisions with more teachers per student do not, on average, have better pass rates.

The largest exception to that generality, out there with 189 teachers per thousand and a 97.1% Not ED pass rate (and a much better than average ED rate), is Highland County.  The ED superstar is West Point with 135 teachers per thousand and an ED pass rate of 85.2%, followed by Wise (115, 84.4) and Bath (159, 82.7).

To single out some other divisions: Richmond is the yellow squares.  The peer cities are the red triangles, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

Just looking at the graph, Richmond’s ED rate is farther from the fitted line than its Not ED rate.  Indeed, Excel tells us that the Richmond Not ED average is 11.8 points below the all-divisions average.  That is, Richmond’s Not ED students passed at a rate 11.8 points lower than the state average for Not ED students.  The Richmond ED rate is 17.1 points below the state average for ED students.  That is, Richmond’s Not ED performance is poor; its ED performance is half again worse.

Aside from the question of why its ED scores are so low, these data falsify Richmond’s frequent excuse that it must deal with a large ED population: Richmond does a lousy job with its Not ED students, and an even worse one with its ED students.  Whatever the cause of Richmond’s awful SOL performance, it infects the entire student population.

Next up, writing:

Pretty much the same story there as the reading data (but notice how Highland reverts toward the mean).  Richmond’s ED rate is the lowest in the state.

The graphs for the other three subjects are chapters in the same book:


There is one interesting trend here: Richmond’s ED underperformance, relative to the Not ED students, is much smaller in science and in history/SS than in English, and is somewhat less in math.  To quantify that, here are the Richmond differences from the division means for each group in each subject:

These data do not separate out any of the factors that can affect student performance, other than (roughly grouped) economic status; they do comport with the notion that Richmond has a lot of work to do, especially with its ED students.

To the bottom line: These data are consistent with the conclusion in the recent study that “the evidence suggests at best a small effect [of class size] on reading achievement. There is a negative, but statistically insignificant, effect on mathematics.”

So, when our School Board again says it needs more money, ask them what for.  And if it’s for more teachers (indeed, whatever it’s for), ask them to prove that the money will improve learning.

Larger Classes, Same Performance

The estimable Jim Bacon has posted on the class size/student performance issue.  He used a graph I produced showing division average math pass rates vs. the number of teachers per thousand students.

Oops!  I used 2016 pass rates and 2017 teacher/student ratios.  Using the 2017 pass rates changes the graph slightly but does not modify the conclusions.

Here are the correct data for math and the other four subjects as well.  Pass rates are from VDOE’s very nice (but very slow) database; teacher data are from the Superintendent’s annual report.

First, reading:

The negative slope might suggest that increasing the number of teachers is associated with lowered SOL pass rates but the R-squared value tells us that the two variables are essentially uncorrelated.

Richmond is the gold square on the graph.  The red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

The Big Hitter up there is Falls Church, with a 92.6% pass rate at 119.5 teachers/thousand.  Next in line are West Point (135.0, 91.0) and Poquoson (105.1, 90.7).

The data for the other four subjects tell much the same story.

More on “Accreditation” in Richmond

Under the new accreditation regulation, no Richmond school was denied accreditation this year.  No matter how badly a school has failed its students, the principal now can say his/her school is “accredited” (albeit 24 were accredited “with conditions”).

Three of the most egregious examples of that are here.  I’ve redone the graphs to present the entire list of schools that have acquired accreditation by fiat.  (Data: last year, this year.)

Note: Data here are for year tested; accreditation year is one year later.  Thus, the 2018 data in the graph are from the 2017-18 testing and establish (or would if they meant anything) accreditation status for 2018-19.


Note: The elementary schools don’t test for writing.


Then we have the schools that were “Accredited with Warning” last year but were saved from even that by the new system.



So much for “this new approach to accountability.”

Let’s Squander More Money!

The Times-Dispatch reports that “Richmond’s top officials spent their quarterly meeting again calling for more money for the city’s school system.”  This abiding demand for more money ignores the more important question:  What was RPS doing with the $335,290,809 it already had?

VDOE has some data on that.

The latest expenditure data are from 2017. (We are three months beyond the end of the 2018 session but VDOE won’t have the 2018 data until about the time we start seeing dandelions.)

These data show division expenditures for operations divided by the end-of-year average daily membership.  A  footnote to the spreadsheet tells us that “[o]perations include regular day school, school food services, summer school, adult education, pre-kindergarten, and other education, but do not include non-regular day school programs, non-local education agency … programs, debt service, or capital outlay additions.”

By that measure, Richmond is the 17th most expensive division per student:


Richmond is the yellow bar.  The peer jurisdictions – Norfolk, Newport News, and Hampton – are the red bars.

We are spending $1,396 per kid more than the state average and $1,881 more than Norfolk.  In terms of SOL scores, we get very little return for all that money.


Richmond is the gold square; the red diamonds are the peer jurisdictions, from the top Hampton, Norfolk, and Newport News.  As a courtesy to my reader(s) there, the green circle is Lynchburg and the green diamond is Charles City.

The R-squared value for the least squares fitted line tells us that the reading pass rate and the per pupil expenditure are not correlated.

The math data tell the same story.


“But wait!” you say.  We know that the SOL is not a fair measure because “economically disadvantaged” students do not score as well as their more affluent peers.

Indeed.  The Board of Education had a better measure, the Student Growth Percentile, that measured learning and did not correlate with poverty.  They abandoned it, however, because it measured too well: It told us how well each teacher performed.

Denied the best data, all we can do is use the SOL numbers and a little algebra to offset the average effect of poverty. 

On the reading tests in ‘17, the division percentage of “disadvantaged” students predicted about 35% of the variance of the division average SOL scores.


(You’ll notice that Richmond grossly underperformed even the fitted line.)

Let’s be generous, and calculate an adjusted pass rate as if the correlation were 100% (“Generous” for sure: Notice that some of the pass rates get pushed over 100%.)
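That adjustment is simple arithmetic: credit each division back the average poverty effect, i.e., the regression slope times its ED percentage.  A minimal sketch, using the slopes and Richmond’s ED percentage from the text; the 60% pass rate below is hypothetical, for illustration only.

```python
# Regression slopes from the text: points of pass rate per point of
# ED percentage (reading 28.671%, math 28.025%).
READING_SLOPE = 0.28671
MATH_SLOPE = 0.28025

def adjusted_rate(pass_rate, pct_ed, slope=READING_SLOPE):
    """Offset the average effect of poverty, as if the correlation were 100%."""
    return pass_rate + slope * pct_ed

# Richmond's 64.2% ED percentage is from the text; 60% is a made-up pass rate.
richmond_boost = READING_SLOPE * 64.2   # about 18.4 points
example = adjusted_rate(60.0, 64.2)
```

Note that nothing caps the result, which is why some adjusted rates get pushed over 100%.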


The adjustment for Richmond’s 64.2% poverty rate boosts its rate nicely but lifts its ranking only to fourth from worst, up from second.

And, boosted or not, our scores are low and pricey:


The correction for math is slightly different (28.025% of the poverty rate v. 28.671% for reading) and the outcome is slightly worse (Richmond is third worst).


In short, poverty does not come close to explaining the awful performance of the Richmond Public Schools.

It would be good if our Leaders were to stop whining about wanting more money and start explaining why they get such lousy results with the very large amount of money they already are spending.

We Beat Petersburg!

In Richmond, the only way to brag about the schools has been to say “We beat Petersburg!”  We can say it again this year.


For example, here are the division average reading pass rates.  The graphed data are rounded to the nearest whole number to allow calculation of the distribution; the data in the table are rounded to one decimal place.


The blue bar at 56 indicates that one division’s pass rate (it was Danville’s 56.25) rounded to 56.  The blue bar at 57 shows that one division rounded to that value; the red bar there tells us it was Petersburg.  Similarly, Richmond is the one division at 59.

The chart tells us that Richmond was 2.5 standard deviations below the division average, 77.0.  Petersburg was low by 2.8 standard deviations.
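Those “standard deviations below” figures are just z-scores.  A minimal sketch; the standard deviation itself is not given in the text, but the 7.2 used below is implied by Richmond’s numbers (77.0 − 59 = 18, and 18 / 2.5 = 7.2).

```python
def z_score(value, mean, sd):
    """How many standard deviations a value sits from the mean."""
    return (value - mean) / sd

# Division mean 77.0 and Richmond's rounded 59 are from the text;
# the 7.2 SD is inferred, not reported.
richmond_z = z_score(59, 77.0, 7.2)      # -2.5
# Petersburg's 56.25 gives roughly -2.9 on the same inferred SD,
# close to the 2.8 figure in the text.
petersburg_z = z_score(56.25, 77.0, 7.2)
```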

Here are the data for the other four subjects.





Finally, the average of the five averages.


Notwithstanding these dismal results, all the Richmond schools (except Carver, where they were caught cheating) are accredited.  But, then, so are all the Petersburg schools.

Meanwhile, the Richmond School Board is busy adopting a redundant free speech policy instead of telling us what they will do to improve the teaching.

Charter Middle School?

The Times-Dispatch has a story, “Richmond parents want a charter middle school.  It faces an uphill battle.” 

The snippet on the “Education” section of the Web page tells us the issue:

The power to approve a nascent effort to launch a public charter middle school in Richmond rests solely with officials wary of watching limited dollars needed for their own underperforming schools follow students elsewhere.

The data suggest that those “wary” School Board members and City Great Ones should think a little harder.

Let’s start with the distributions of fifth and sixth grade pass rates on the reading tests.


Here we see nine elementary schools with fifth grade pass rates that meet or beat the (former) 75% benchmark for accreditation on the reading tests: Munford, Fox, Southampton, Broad Rock, Stuart (now Obama), Holton, Patrick Henry, Cary, and Bellevue.  The only middle schools that make the same cut for the sixth grade are Alternative and Franklin Military. 

Relevant here, Richmond Alternative operates Spartan Academy, which “serves as a school to support students with academic, attendance and behavior challenges.”  Franklin is a different kind of specialty school; it does a decent job for selected students who elect “to experience a regular academic course of study while participating in a Junior Reserve Officer Training Program or Middle School Leadership Program.”

The highest-scoring non-specialty middle school on the graph is Binford, with a 69.5% pass rate, followed by Hill, 67.4%, and Brown, 62.6%.

The math data paint a similarly ugly picture.


Munford leads the fifth grade parade, followed by Cary, Fox, Fisher, Patrick Henry, Broad Rock, and Obama, all of which beat the nominal 70% benchmark for math. 

Franklin and Alternative 6th grade averages again beat the benchmark.  Next in line is Hill, at 65.2%, followed by Brown, 55.4%, and Binford, 54.1%.

(This is not to say that there aren’t serious problems in both elementary and middle schools.  Just look at those collections of pass rates below 50% in both subjects.)

To the point here, there are some Richmond elementary schools where a parent can send a kid while entertaining only the normal parental worries.  But come middle school, parents who can afford it have a good reason to opt for a County or private school.

And opt out they do: The Richmond enrollment plummets after the fifth grade.


Or, in terms of the raw enrollments:


(That ninth-grade “bump” is a national phenomenon that appears to reflect laissez-faire promotion policies in the lower grades.  Most of the drain in Richmond after the 9th grade is dropouts.)

So, the question for our city and school board is more nuanced than just watching dollars flow to a charter middle school:  Would having a decent middle school help stanch the current flow of middle school funds out of the Richmond school system?

College Graduation Rates v. SAT Scores

We have seen that the graduation rates of our 4-year public colleges correlate well with the SAT scores of their freshmen.  Let’s return to that subject with some more recent data.

The most recent SCHEV cohort graduation data are from the 2012-13 first-time, full-time freshmen.  SCHEV also has SAT data for entering freshmen that year.

Here, then, for our 4-year public colleges, are the 4-year graduation rates for that 2013 cohort plotted v. those median SAT scores (math + verbal).


The correlation is solid:  Entering SAT scores, on average, are an excellent predictor of that graduation rate (recalling, always, that the correlation does not prove causation).

Three schools considerably outperform the fitted line: James Madison and Longwood at +11 and VMI at +7.  The three largest underperformers are Old Dominion (-13), George Mason (-11), and VCU (-7).

Turning to the five-year rates:


The outperformers on the five-year rate are JMU (+12 from the fitted line), Longwood (+10), VMI (+7), and Radford (+4).  The underperformers are Old Dominion (-9), Norfolk State (-8), Wm. & Mary and George Mason (-5), and Mary Washington (-4). 

We might think that outstanding performance at the 4-year level would hinder performance at five years; JMU, VMI, and Longwood (and, to a lesser extent, Radford) all belie that notion.

The five-year rates are remarkably higher, even at THE University and W&M, with the increases generally larger at the schools with the lower rates.


Remember: The cohort data are for full-time, first-time freshmen.  We might think that these rates would not reflect the larger part-time populations at the urban universities; we might be wrong in that.

BTW: Looking just at graduation rates, we see there is one (private) school that beat even THE University (on the 4-year rate, UVa, 88%; W&L, 93%).


We could argue endlessly about the causes for these correlations and differences.  The better question, I suggest, is what all these schools might do to improve their graduation rates:  On average, 45% of those full-time, first-time freshmen entering 4-year public colleges did not graduate in four years; 28% did not make it in six.

In terms of people:  About 14,450 (“about” because of the roundoff of the rate data) of 32,112 first-time, full-time freshmen entering 4-year public universities in the fall of 2012 did not graduate in four years; about 8,990 did not make it in six years.
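A quick check of that arithmetic; the cohort size and the 45%/28% rates are from the text, and the “about” reflects the rounding of the published rates.

```python
cohort = 32_112   # first-time, full-time freshmen, fall 2012 (from the text)

no_grad_4yr = round(0.45 * cohort)   # 14,450: matches the "about 14,450"
no_grad_6yr = round(0.28 * cohort)   # 8,991: the text's "about 8,990"
```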

Score Distributions

Let’s look at the distributions of 2018 SOL pass rates.

Note added 10/22:  Aha!  I figured out how to graph both all the schools and the Richmond schools from the same pivot table.  So here are the redone graphs.

10/24: Oops.  Correcting an error in the way Excel handled the >50 and <50 entries for suppressed data.

First, the reading tests:


Next math.


Last, Science.


Accreditation: Progress by Fiat

How to Accredit a Failed (and Failing) School

This year, the SOL pass rates declined in the three subjects that underlie the accreditation process, both statewide and in Petersburg.


At the same time, the Board of Education proclaimed a statewide accreditation triumph.



  • One school that was “Accredited Pending Review of Alternative Accreditation Plan” in ‘18 is omitted from the graph. 
  • The VDOE Web page says 92% Accredited.  The data from their spreadsheet show 92.77%.  Even without their curious stance on roundoff (see below), that rounds to 93%.
  • See below for the demise in 2018 of the several “Partially Accredited” categories and the genesis that year of the “Accredited with Conditions” status.

We can gain some insight into the reason for this boom in accreditations atop a slump in pass rates by examining the Petersburg data.

The Petersburg Schools have been operating under Memoranda of Understanding since at least 2004.

The system was denied accreditation continuously from 2006 to 2017.  On the 2017 data, Stuart Elementary and Johns Middle were denied accreditation; A.P. Hill Elementary’s accreditation was withheld because the staff there got caught cheating.

This year, notwithstanding all that “help” from the state, all the Petersburg reading pass rates, except perhaps for Peabody/Johns, decreased.



  • Data here are from the SOL database for reading; they no longer test writing in the elementary grades.  Numbers in the School Quality Profiles can be different (see below).
  • Petersburg changed the names of three elementary schools this year:
    1. A.P. Hill became Cool Spring
    2. J.E.B. Stuart became Pleasants Lane
    3. R.E. Lee became Lakemont

Except at the high school, the math rates dropped as well.


The science rates rose at Walnut Hill and Pleasants Lane; they fell at the high school and tanked at Lakemont.


The Petersburg schools’ home page announced the resulting change in accreditation:


All Petersburg schools are accredited in the 2018-19 school year.

(To their credit, they also said that, except for Cool Spring and Walnut Hill, those accreditations were “with conditions.”)

The accreditation changes were dramatic: 


How shall we explain this situation?

Well, if you are feeling masochistic, you can read the new accreditation regulation and dig into the numbers buried in the School Quality Profiles.   Or you can read on here.

This year, there are three performance levels.

For English:

  1. Level 1: 75% current or 3-year average pass rate (boosted for “recovery,” growth, and English learners) or Level 2 prior year and decrease failure rate by 10%.
  2. Level 2: 66% current or 3-year average pass rate (boosted for retakes, growth, and English learners) or 50% prior year (boosted) pass rate improved by 10%. 
  3. Level 3: Everybody else, plus anybody at Level 2 for four years.

For math, the scheme is the same except the Level 1 benchmark is 70% and there is no boost for English learners.  For science, the math scheme applies except there is no boost for growth or recovery.
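A minimal sketch of how those benchmarks sort a school into levels, assuming (per the description above) that the better of the current-year and three-year-average boosted rates is tested.  The prior-year improvement paths and the four-years-at-Level-2 demotion are omitted for simplicity.

```python
def performance_level(best_rate, level1, level2=66):
    """Simplified level test on an already-boosted pass rate.

    'best_rate' should be the better of the current-year and 3-year-average
    (boosted) rates.  Omits the prior-year improvement paths and the
    four-years-at-Level-2 demotion described in the regulation.
    """
    if best_rate >= level1:
        return 1
    if best_rate >= level2:
        return 2
    return 3

def english_level(current, three_year_avg):
    return performance_level(max(current, three_year_avg), level1=75)

def math_or_science_level(current, three_year_avg):
    # Same scheme, but the Level 1 benchmark drops to 70.
    return performance_level(max(current, three_year_avg), level1=70)
```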

Here, then, are the English accreditation scores for the Petersburg schools.


The colors tell where the numbers came from:

  • Light green: The pass rate as reported by VDOE.
  • Brown: The percentage of students who flunked reading last year, took the remediation program, and passed this year.  They count twice(!).
  • Dark Green: The students who failed but showed “growth.”
  • Blue: English learners (who are counted only if they pass, but then count twice).
  • Red: The Level 1 benchmark.
  • Gray: The Level 2 benchmark.

There are some anomalies here:

  • VDOE reported a 51.3% pass rate for Cool Spring but their accreditation page says 55% (they round everything off). 
  • Pleasants Lane numbers similarly improved from 60.38 to 61.  (Formerly VDOE rounded up at 0.45; that does not explain this change). 
  • As well, the 71.49 at Walnut Hill became a 72 (perhaps because of the enhanced rounding up). 
  • The reading pass rate/accreditation numbers probably are not comparable for Johns and the high school because the writing scores get averaged in.

Even with all the Finagle factors, Vernon Johns did not come close, even to the new, diluted benchmark. 

Presumably Cool Spring could not enjoy “Growth” or “Recovery” boosts because the cheating canceled the 2017 scores.

But these enhanced pass rates are not the end of the matter.  There also is a potential boost from the three-year average: 


Poor Vernon Johns still looks to be beyond help.  (BTW: VDOE does not explain how they calculate the numbers to accommodate missing Johns data for ‘17 and the Peabody scores prior to the merger).  We need not examine the 2015 rules or the 50%-with-10%-gain options here: VDOE tells us Johns is Level 3 (also in math and science).

Despite an appalling performance in 2018 and no data in 2017 (cheating), Hill/Cool Spring makes the diluted 66% level (and nearly makes the 75% benchmark). 

A little arithmetic (nearly) shows where the Cool Spring 73 came from:  Here are the accreditation scores for Hill/Cool Spring for the last three years:


The actual three-year average for Reading is 48.  BUT if we just average the 2018 value with the (obviously cheating-enhanced) 2016 value, we get 72.  Close enough to 73 for VDOE, it seems.
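The arithmetic behind that “close enough” 72, assuming VDOE simply dropped the missing 2017 year from the average.  The 72 and 48 figures are from the text; the individual yearly scores are not given, only their averages.

```python
# Averaging just 2016 and 2018 gives 72, so those two years sum to 144.
sum_2016_2018 = 72 * 2

avg_dropping_2017 = sum_2016_2018 / 2          # 72: what VDOE appears to use
avg_counting_2017_as_zero = sum_2016_2018 / 3  # 48: the actual 3-year average
```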

Math is even more dramatic: 


But the three-year bonus takes care of Pleasants Lane and elevates Cool Spring to Level 1.


Again, the only way VDOE can get the Hill/Cool Spring numbers is by ignoring 2017.  No telling why they report 70 when the actual average was 69.

By any measure, Johns and Lakemont strike out and, indeed, VDOE reports them as Level 3.

The science scoring does not include the recovery et al. adjustments.  On the pass rates, only Walnut Hill makes either benchmark. 


The 3-year average does not save anybody.


On these numbers, Cool Spring’s enhanced average is 65, one point short of the Level 2 benchmark.

Another way to make Level 2 is to take a score of 50 or better and raise it by ten points.  That plainly does not work for Cool Spring, which started in ‘17 at zero (or in ‘16 at 83 in science and dropped to just over half that in ‘18).

Nonetheless VDOE reports Cool Spring as Level 2 for science.

These “levels” feed into the accreditation ratings:  A school is “Accredited” if all its school quality indicators are Level 1 or 2.  Any school with an indicator at Level 3 is “Accredited with Conditions.”  A school accredited with conditions “may” be denied accreditation if it fails to adopt and implement relevant corrective action plans “with fidelity.”  The regulation does not tell us what level of “fidelity” is sufficient.

Aside:  The question whether a school has implemented “with fidelity” does not “rest[] entirely upon . . . [an] examination” or constitute “approval by the Board of Education of a school division corrective action plan” so the due process requirements of the Administrative Process Act should apply.  If the Board of Education should ever try to deny accreditation somewhere, it will be interesting to see whether they comply with those requirements.

Another way to get accreditation this year is to make the grade under the previous regulation.  VDOE tells us no Petersburg school is in that category this year.

A last way to make full accreditation is to enjoy the running three-year accreditation exception from the statute.  On that subject, here are the Cool Spring data:


Cool Spring was accredited in 2015 and 2016 (certainly because they were cheating) but not in 2014 (and of course not in 2017, what with getting caught at the cheating).  So there’s no three-year run and the statute does not apply.

(But the data do make one wonder why VDOE did not bother to examine those obviously bogus 2015 and 2016 numbers and the resulting accreditation ratings.  Indeed, if not rewarded for that obvious cheating, Cool Spring would not enjoy being “accredited” this year.)

Thus, we have the curious situation where VDOE blesses the cheating at AP Hill/Cool Spring with bogus English and Math ratings and with a science rating that looks to be invented from whole cloth.

There are further “achievement gap” measures that provide useful information but let’s pass over those and look at the 2018 results for Petersburg.


As the Petersburg Web page said, everybody is accredited! 

None of the “conditions” schools, even Johns, which flunked on all the academic measures, can be denied accreditation unless it “fails to adopt and implement . . . corrective action plans with fidelity.”

So there you have it: The process rewards past cheating.  The old benchmarks at 75 and 70% are reduced to 66% (at least for four consecutive years) and apply to manipulated pass rates that can be nearly double the actual pass rates.  A school that misses those relaxed benchmarks, and does not benefit from one of the helpful exceptions, is accredited, albeit “with conditions.”   That school can be denied accreditation only if it does not, in effect, tell the Board of Education to go jump in the James.  Or, in the case of Petersburg, the Appomattox.

So, Petersburg gets to feel good about being “accredited.”  Never mind the increased numbers of kids who are not being educated.

98.2 million of your tax dollars at “work.”

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.


The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7% decrease in the advanced studies rate.


But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader!  The 1.3 and 5 point boosts look to have been offset or partially offset by decreases in the End of Course pass rates. 

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.


Students must pass six EOC tests to graduate.  Thus, the decreases of the pass rates of those required courses must have lowered the graduation rates.  Then the VDOE data manipulation offset those graduation rate declines in some measure. 

That looks like a general explanation.  The specific would require a more detailed knowledge of which students passed or failed which courses, and where in their four-year journey through high school, and whether they graduated.  For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The Big Drops in Richmond, and the lesser in state, EOC pass rates mostly came in math and science.



Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests.  This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.