Richmond Elementary Schools: Mind the Gap!

An earlier post showed that, among the school divisions, an increasing percentage of economically disadvantaged (“ED”) students was slightly correlated with a decreasing pass rate of the more affluent (“Not ED”) students and that the division average for ED students was ca. 20 points lower than for Not ED. Another post showed that two of Richmond’s high-scoring elementary schools were not getting high-scoring pass rates from their ED students.

Let’s take a more general look at the ED/Not ED performance of Richmond’s elementary schools.

To start, here are the 2019 reading pass rates of the Not ED students of Richmond elementary schools plotted against the percentage of ED students in the tested group. Fairfield Court and Miles Jones are missing from the graph (see below).


As with the divisions, the pass rate decreases with increasing % ED, here by about four points per 10% ED increase. The R-squared value of 29% indicates a modest correlation.
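The slope and R-squared reported here come from an ordinary least-squares fit of pass rate against %ED. A minimal sketch of that calculation, using made-up (pass rate, %ED) pairs rather than the actual VDOE numbers:

```python
import numpy as np

# Hypothetical (%ED, Not ED pass rate) pairs for illustration -- not VDOE data.
pct_ed = np.array([15.0, 25.0, 40.0, 55.0, 70.0, 85.0])
pass_rate = np.array([88.0, 84.0, 79.0, 74.0, 66.0, 62.0])

# Degree-1 polyfit returns (slope, intercept).
slope, intercept = np.polyfit(pct_ed, pass_rate, 1)

# R-squared: 1 minus (residual sum of squares / total sum of squares).
predicted = slope * pct_ed + intercept
ss_res = np.sum((pass_rate - predicted) ** 2)
ss_tot = np.sum((pass_rate - np.mean(pass_rate)) ** 2)
r_squared = 1 - ss_res / ss_tot

# Express the slope as points of pass rate per 10-point increase in %ED.
print(round(slope * 10, 2), round(r_squared, 3))
```

The same fit, run on the real school-level data, produces the roughly four-points-per-10%-ED slope and 29% R-squared cited above.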

Things get more interesting when we look at the ED pass rates.


As expected, the pass rates are generally lower. The slope drops to 1.5% for a 10% increase in ED population while the R-squared decreases to 9%.

Of interest here, the surprisingly low pass rates of the more affluent schools contribute to that lower slope.

Two schools with >70% ED (Cary and Obama) outperformed Munford (13% ED) and four outscored Fox (25% ED). Indeed, none of the five low-ED schools covered itself with glory in terms of ED performance.

Here are the data:



  • The “#N/A” entries for Jones and Fairfield Court indicate cases where VDOE did not report Not ED data, probably as a result of small ED populations and the VDOE suppression rules.
  • The VDOE database does not offer state average pass rates for elementary schools. The state numbers here are the average of the averages for each of grades 3-5. Given that the state enrollment is approximately flat across those grades, that should give a close estimate of the average over all elementary students.
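The averaging choice in that second note can be checked with a quick sketch: when enrollments are roughly flat across grades 3-5, the simple mean of the grade averages lands within a small fraction of a point of the enrollment-weighted, all-student average. The enrollment and pass-rate figures below are invented for illustration:

```python
# Hypothetical grade-level data; enrollments are nearly flat across grades.
grades = {
    3: {"enrolled": 95000, "pass_rate": 71.0},
    4: {"enrolled": 96000, "pass_rate": 74.0},
    5: {"enrolled": 94000, "pass_rate": 77.0},
}

# Average of the grade averages (what the post uses for the state figure).
simple_mean = sum(g["pass_rate"] for g in grades.values()) / len(grades)

# Enrollment-weighted average over all students (what VDOE does not publish).
total_weighted = sum(g["enrolled"] * g["pass_rate"] for g in grades.values())
total_enrolled = sum(g["enrolled"] for g in grades.values())
weighted_mean = total_weighted / total_enrolled

print(round(simple_mean, 2), round(weighted_mean, 2))
```

With near-equal enrollments the two numbers agree closely; the approximation would degrade only if one grade were much larger than the others.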

The Munford ED pass rate, 62%, matches the state average; Fox is 3 points lower. The state average ED population is 44%; Munford's is 13% and Fox's 25%.

Let’s take this one step further: The fitted line in the ED graph, above, slopes down. In an ideal world, it would be exactly horizontal, indicating that the average performance of ED students was independent of the % ED in the tested group.

The statistics of the fitted line allow calculation of the difference between each school average and the fitted line, thus removing the average effect of the increasing ED percentage. The results, sorted by the difference:
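That difference-from-the-line calculation is just the regression residual: each school's actual ED pass rate minus the rate the fitted line predicts for its %ED. A sketch, with hypothetical schools:

```python
import numpy as np

# Hypothetical (%ED, ED pass rate) pairs -- placeholders, not the actual schools.
pct_ed = np.array([13.0, 25.0, 72.0, 78.0, 85.0])
ed_pass = np.array([62.0, 59.0, 68.0, 55.0, 49.0])

slope, intercept = np.polyfit(pct_ed, ed_pass, 1)

# Residual = actual rate minus the rate the line predicts at that %ED.
residuals = ed_pass - (slope * pct_ed + intercept)

# Sort by residual, schools beating the line first.
order = np.argsort(residuals)[::-1]
for i in order:
    print(f"%ED {pct_ed[i]:5.1f}  difference from line {residuals[i]:+.1f}")
```

A positive residual means the school's ED students outperform what the division-wide trend would predict for a school with that ED share; a negative one means the opposite.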


Or, in terms of a graph:


For sure, some of our schools get lousy results. But others do much better. And some of the low-%ED schools don’t get good ED performance, even when they get very good Not ED pass rates. It would be useful to understand the reasons for those differences.

The math data show a similar pattern.


The slope here is down from the reading, three points per 10% ED vs. four. The R-squared is about 20%, v. 30% for reading. But the conclusion remains the same: On average, the Not ED pass rates decline with increasing % ED students in the tested group. These data don’t tell us why.

The ED pass rates again show a lower slope and an R-squared value that indicates very little correlation.


Five schools with >70% ED outperformed Munford (a sixth tied) and six outdid Fox. Whatever the magic at Fox and Munford, it doesn’t seem to work for their ED students.

The data:


Munford is four points below the state average; Fox is nine below.

In terms of ED pass rate differences from the fitted line:



Some Richmond schools are doing much better with their ED students than others. The literature suggests that the important variables are the socioeconomic status of the students and the effectiveness of the teachers. Here, we can wonder whether the large differences in ED performance might be related to the quality of the teaching.

For sure, the ED students at Munford and Fox don’t seem to be gaining any benefit from exposure to large populations of Not ED students.


The definition of economic disadvantage gives us, at best, a rough measure. Teacher performance, however, can be measured, independently of ED. Unfortunately, our Board of Education abandoned the measure they had, the SGP, because it did provide an accurate measure of teacher performance.

In that situation, we can only wonder what’s going on. Or, perhaps, blunder along and hope for better.

Picking the Unblemished Cherries

The headline in the Chronicle says, “SOL scores soar in Charles City.”

The text continues:

In Charles City, several areas saw massive gains in comparison to the previous year. Of the improvement areas, perfect pass rates were achieved in the areas of Algebra II and chemistry. Subject areas that saw a 20 or more point gain include grade seven English reading (64 to 85), World History I (59 to 79), World History II (54 to 92), grade three mathematics (53 to 81), grade seven mathematics (36 to 66), geometry (28 to 51), and the aforementioned Algebra II (34 to 100). . .

“When we got those scores in, it was absolute elation,” said Charles City superintendent of schools David Gaston. “Coming off last year, we didn’t do poorly but we knew we had areas to work on.”

To be sure, CCCo saw some nice pass rate gains this year. As well, there were some (unmentioned) decreases.


In the larger view, the overall trends do not justify the Superintendent’s “elation.”

Reading is indeed up this year. But over half of the gain was used to overcome the loss of 2018.


The six-year trend is +0.21% per year.

The trends in the other subject areas are less encouraging.

In math, this year’s gain was not sufficient to overcome the 2018 decrease, and the overall trend is down.


History and social science, more so.


The writing pass rate this year fell to a six-year low, wiping out the gains of the previous two years.


This year’s decrease in the science pass rate more than undid 2018’s increase.


The average of those five subject area pass rates increased by 1.2 points this year but the trend continued to decline (-1.4 points per year).


It takes selective blinders to see cause for “elation” in these data. A more measured reaction might be “some bright spots but large areas that continue to need our attention.”

Profligate Promotion

The conventional wisdom these days seems to be that both social promotion and retention in grade are ineffective. For example:

Studies indicate that retention negatively impacts students’ behavior, attitude, and attendance, but it is still practiced in schools around the country. Social promotion undermines students’ futures when they fail to develop critical study and job-related skills; however, it too is still practiced in many schools throughout the United States. These practices are ruining public education as we know it, and unless we innovate and find alternative strategies to replace them, the U.S. K-12 education system will continue to underperform.

BTW: That article says that both social promotion and retention in grade “are ruining public education.” A closer view might suggest that the prerequisite failure to learn is the problem, not the dilemma of retention v. promotion.

In any case, the data tell us that Richmond has elected for social promotion.

Table 7 in the Superintendent’s Annual Report gives the number of students repeating the same grade as in 2018. The 2018 Table 7 reports the 2018 Fall membership. The ratio gives the percentage of students held back. The SOL database offers the percentage of students failing the tests in each subject in each grade. With those data in hand, it is straightforward to produce a graph:
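The arithmetic described above is simple division; here it is spelled out with invented figures (the grade labels and numbers are placeholders, not actual RPS data):

```python
# Hypothetical inputs mirroring the three sources named in the text:
repeating_2019 = {"3": 40, "4": 25, "5": 15}          # Table 7: repeating same grade
membership_2018 = {"3": 1900, "4": 1850, "5": 1800}   # 2018 Table 7: fall membership
reading_fail_pct = {"3": 45.0, "4": 42.0, "5": 40.0}  # SOL database: % failing reading

# Percentage held back = repeaters / prior fall membership.
held_back_pct = {
    g: 100 * repeating_2019[g] / membership_2018[g] for g in repeating_2019
}

for g in sorted(held_back_pct):
    print(f"grade {g}: {held_back_pct[g]:.1f}% retained vs "
          f"{reading_fail_pct[g]:.1f}% failing the reading SOL")
```

When the retained share sits far below the failure rate, as in this toy data, the gap is the measure of social promotion the graphs display.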


Substituting the math failure rates for the reading rates gives a similar picture.


The other subjects are not tested in all grades 3-12. In that data desert, the 6th grade failure rate in writing probably says something about the fifth grade education in that subject.


2019 Dropouts

Richmond won the dropout race this year.

Here are the eleven divisions with the largest All Students 4-year cohort dropout rates for 2019:


That 24.4% is 372 students (of 1,523) who dropped out.

The peer city, Norfolk, is highlighted in red.

The “ED” column is the rate for economically disadvantaged students. Sorted by the ED rate, the list changes but Richmond’s position does not.


Just to cleanse the palate, here are the divisions with the lowest rates, sorted first by the all students rate, then by ED.



In terms of the subgroups of students, the rates of black, white, and homeless students are unacceptable; those of the English learners, Hispanics, and students with disabilities are astronomical.


The VDOE information sheet and FAQ don’t tell us what “anytime” means in this context.

Our Superintendent’s Plan, Dreams4RPS, lists ten priorities; the eighth is “Decrease chronic absenteeism-overall and for each subgroup.” There is no mention of how, by how much, or who is to be responsible for the decrease. None of the five “priorities” in the Plan mentions attendance, truancy, or absenteeism.

More Salary $ ≠ Higher Pass Rate

We often hear that better pay leads to better teaching. VDOE has some data on that.

Table 19 in the Superintendent’s Annual Report lists, among other things, the average annual teacher salaries by division. Data here are averages for both teachers and principals. The very nice database on the VDOE site provides pass rates. The latest data in the Annual Report are from 2018, so the pass rates here are from 2018 as well.

For the reading tests, the data look like this:


“ED” on the chart refers to the “economically disadvantaged” students. Statewide, ED students underperform their more affluent peers (“Not ED”) by about twenty points so I’ve posted both sets of data.

The fitted line to the Not ED data suggests that a $10,000 increase in the average salary is associated with a 1.1% increase in the pass rate. The R-squared value of 2.3%, however, tells us that the two variables are essentially uncorrelated.

The ED data suggest a decrease in the pass rate of 3.1% per $10,000 increase in average salary. The R-squared of 9.2% indicates a slight correlation, but nothing to write a thesis about.

The fair conclusion is that division average reading pass rates are unrelated to average teacher salaries, except for a hint that ED rates may decrease slightly with increasing average salary.

The enlarged square points are Richmond, which pays a bit more than average and gets a lot less.

The math and science data tell much the same story.



For the record, here are the ten highest- and lowest-paying divisions:



Notice that high-paying and high-scoring Falls Church is not doing well at all with its ED students. Ditto most of those Big Spenders. Also notice that, except for Lexington and Halifax, the Little Spenders have ED pass rates within ten to fifteen points of their Not ED rates.

CAVEAT: These data are consistent with the notions that Virginia’s systems for evaluating educational outcomes and for setting teacher salaries are counterproductive:

Research dating back to the 1966 release of Equality of Educational Opportunity (the “Coleman Report”) shows that student performance is only weakly related to school quality. The report concluded that students’ socioeconomic background was a far more influential factor. However, among the various influences that schools and policymakers can control, teacher quality was found to account for a larger portion of the variation in student test scores than all other characteristics of a school, excluding the composition of the student body (so-called peer effects).

Yet, Virginia’s evaluation system punishes schools and divisions with larger numbers of “economically disadvantaged” students. Moreover, the counterproductive salary scales reward degrees and time in service, not teaching effectiveness:

Teachers’ education (degree) and experience levels are probably the most widely studied teacher attributes, both because they are easy to measure and because they are, in the vast majority of school systems, the sole determinants of teachers’ salaries. However, there appears to be only weak evidence that these characteristics consistently and positively influence student learning.

For another look at the relationship between educational inputs and outputs, see this study from the OECD.

Rigged Grading

A large study by the OECD concludes: “[T]he socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors that are more easily amenable to policy makers, such as school resources and school policies.” That is consistent with the Virginia data that show economically disadvantaged (“ED”) students underperforming their more affluent peers (“Not ED”) by about twenty points on the SOL pass rates.

The SOL reporting system ignores this issue and punishes schools with larger populations of less affluent students. Four Richmond elementary schools illustrate the situation.

Fox and Munford serve relatively affluent neighborhoods; both schools turn in superb SOL results. Cary and Obama have much tougher clienteles and do not show as well. Nonetheless, Cary and Obama get better ED pass rates than do Fox and Munford.

Hard to believe, eh? Well, look at this. And notice how the SOL average is affected by the %ED.


The SOL average over all students punishes Cary and Obama for the relative poverty of the majority of their students, even though those students outscore the ED students at Fox and Munford. At the same time, Munford and Fox are rewarded for having small ED populations.

If we look at the averages of the ED and Not ED rates, a different pattern emerges.


In terms of those averages, all four schools cluster near the 75% accreditation benchmark. This calculation rewards Cary and Obama for the superior performance of their tougher-to-teach students and recognizes the inferior results at Fox and Munford.
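The contrast between the two calculations is easy to see in a toy example: the all-students SOL average weights each group's rate by its share of enrollment, while the ED/Not ED average weights the two groups equally. The numbers below are hypothetical, loosely patterned on the schools in the text:

```python
# name: (pct_ED, ED pass rate, Not ED pass rate) -- invented for illustration.
schools = {
    "Low-ED school":  (15.0, 60.0, 90.0),
    "High-ED school": (80.0, 68.0, 82.0),
}

results = {}
for name, (pct_ed, ed, not_ed) in schools.items():
    w = pct_ed / 100
    all_students = w * ed + (1 - w) * not_ed   # enrollment-weighted SOL average
    group_average = (ed + not_ed) / 2          # the ED/Not ED average
    results[name] = (all_students, group_average)
    print(f"{name}: all-students {all_students:.1f}, "
          f"ED/Not ED average {group_average:.1f}")
```

In this toy case the two schools have identical ED/Not ED averages (75), yet the all-students average rewards the low-ED school by nearly fifteen points, even though its ED students score lower.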

On the math tests, Cary escapes most of the SOL penalty by way of a very high ED pass rate but Obama confirms the point. The Fox and Munford SOLs again shine, despite relatively lower ED pass rates.


The ED/Not ED average again tells a more balanced story.


The Board of Education bases its accreditation calculation on the SOL average over all students in a school or division. Indeed, the public measure of school quality is that same average. That rewards the schools and divisions with fewer ED students, whether or not they get good results with those ED students, and penalizes schools and divisions with large ED populations, even when they get better than average results with their ED students.

A plot of division average reading pass rates and ED/Not ED averages vs. the %ED illuminates the difference between the SOL average pass rate and the ED/Not ED average.


The fitted line to the SOL scores (red) suggests that the SOLs decrease on average by about 2.5% for a 10% increase in the ED population. The R-squared of 23% suggests a modest correlation. Indeed, we have seen that the correlation derives almost entirely from the effect of the ED group’s generally lower scores.

The green points show the averages of the ED and Not ED pass rates. The slope of the fitted line is much lower, minus 0.7% for a 10% ED increase, and the 2.3% R-squared value denotes only a trace of correlation. Said otherwise, this average is almost entirely unrelated to the percentage of ED students in the division.

The math and science data tell the same story.



An ideal grading system would present a horizontal fitted line with an R-squared of zero. The ED/Not ED average comes close and, in any event, is vastly less unfair than the current system.

BTW: The Board of Education had an even better system, the SGP, that was uncorrelated with economic status. The Board abandoned that system because it provided an accurate measure of teacher performance.

So we are left with a reporting system that punishes schools and divisions that serve larger populations of poorer students.

If that is a fair system, I am Santa Claus.

2019 Graduation Rates

The four-year cohort graduation rates are up on the VDOE Web site.

The Board of Education brags on its (inflated) “On-Time” rate that counts the “Board of Education-approved” diplomas. The Federales, to their credit, count only the Advanced Studies and Standard diplomas.

This year, the Federal rate for the state slipped from 88.8% to 88.7%; the Richmond average fell from 67.6% to 64.9%.


On these data, there’s no telling how much of the Richmond decline is the result of correcting the transcript issues.

Breaking out the Standard and Advanced rates, the Advanced average fell from 52.0% to 51.5% while the Standard rate rose from 36.8% to 37.2%. The Richmond Advanced rate fell from 23.3% to 21.7%; the Standard, from 44.3% to 43.2%.


Here are the Richmond rates by school, sorted by the Federal rate.


And, to complete the picture, the rates for Richmond’s economically disadvantaged students (ca. 2/3 of the Richmond population).


Note: The zero Advanced rates here for Wythe and Franklin probably are artifacts of the VDOE suppression rules for groups of <10 students. The Franklin rate calculates as 29%; the Wythe rate could be as high as 5.5%.

“Help” for Petersburg II: Mirror of VBOE Incompetence

The Petersburg Public Schools have been laboring under the supervision of the Board of Education since at least 2004.

The results are appalling.

Here, to start, are this year’s accreditation calculations. (See this post for descriptions of the L1 and L2 benchmarks and the various score boosts.)




Or, in summary:


Petersburg reached this condition as the latest step in a march to abject failure. The decline was exacerbated in 2017 when the staff at AP Hill Elementary (now Cool Spring) were caught cheating.

And, please remember that the accreditation system is rigged to avoid this kind of failure.

The Board’s intervention at Petersburg goes back to a Memorandum of Understanding (“MOU”) in 2004:

Approaches of the last two MOUs illuminate the Board’s feckless approach to improving the Petersburg schools. In November, 2009 the Board took over the Petersburg system:

The VBOE and the VDOE will continue to assign a CAO . . .   The CAO will have administrative authority over processes, procedures, and strategies that are implemented in support of the MOU and funded by targeted federal and state funds with subsequent review and approval by the Petersburg City School Board.

Then, in April, 2016 the Board retreated to “coordinat[ion]” and “technical assistance”:

The Director of [Office of School Improvement] will coordinate with school division staff and other VDOE offices to develop a Corrective Action Plan for Petersburg Public Schools and to provide technical assistance in support of the MOU and Corrective Action Plan.

Notwithstanding this pusillanimity, the Board has the authority to compel Petersburg to fix its schools:

Va. Code § 22.1-8 provides: “The general supervision of the public school system shall be vested in the Board of Education.”

Va. Code § 22.1-253.13:8 provides:

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

Yet the Board has never sued Petersburg, or any other division, to compel compliance with the law. The reason is clear: The Board would have to tell the judge what the division must do to comply with the law but the Board DOES NOT KNOW HOW TO FIX BAD SCHOOLS [Sept. 21, 2016 video starting at 1:48].

I think it is past time for the Board members (and their staff at VDOE) to be directed to work that is better suited to their capabilities.

Effects of “Help” for Petersburg

The Board of “Education” has been “supervising” Petersburg since at least 2004.

If anything, all that “supervision” has let Petersburg’s performance continue to decay. In summary:

Notes:

  • Statewide, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by about 20 points in terms of the SOL pass rates. Yet, “the socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors.” This makes the SOL average an unfair measure for divisions, such as Petersburg, with large ED populations. Accordingly, the graphs here and below show the performance of both the ED and Not ED students.
  • The new math tests in 2012 and the new English tests in 2013 dropped pass rates statewide; the new math tests in 2019 raised pass rates statewide by 3.4% for Not ED students and 6.6% for ED.
  • The red lines are the nominal levels for accreditation. VDOE “adjusts” the pass rates in order to accredit some schools that come nowhere near those thresholds.

Turning to the individual Petersburg schools: The 2017 data for AP Hill, now Cool Spring, are missing because the school (the staff, not the kids) was caught cheating. The splendid numbers before and dismal performance since 2017 show much the same pattern as Richmond’s GW Carver.


To the point here, with the false aura of the cheating removed, the data show that the school is unable to prepare half its students to pass the SOLs.

Note in passing: Last year, the Board of Education accredited Cool Spring in English based on the three year average. Lacking 2017 data because of the cheating, the Board ignored 2017 and reached back to the (cheating enhanced) 2016 pass rate to compute an average that met the “Level 2” threshold.

Pleasants Lane, formerly JEB Stuart, showed some improvement in the math scores this year (as did the state average, with the help of the new, failure-averse scoring system), but the reading rates slid.


Lakemont, formerly RE Lee, painted a less rosy picture.


Walnut Hill, in contrast, stayed in the running for accreditation this year.


There are four middle schools in the database: Peabody, Vernon Johns Jr., Old Vernon Johns, and Vernon Johns. The relationship between these is not clear. However, only Vernon Johns has data after 2017 so we’ll go with that:


The high school’s students suffer from declining performance.


It’s hard to see any benefit here from sixteen years of Board of Education “supervision.” To the contrary, the ongoing, dismal failure of the Petersburg schools testifies to the abiding fecklessness of that Board.

Your tax dollars at “work” (this year, $1,939,750 for “school improvement”).

Middle and High School Accreditation Analysis

Here, to start, are the 2019 English Accreditation data for Richmond’s middle schools.


(For details regarding the benchmarks and adjustments, see the earlier post.)

Franklin, a selective school whose pass rates include results from both middle and high school students, made L1 on pass rate alone. Adjustments brought Hill to exactly the L1 benchmark. Binford and Brown adjusted to L2. None of the other four came close to L2.

The math data look a bit better.


Brown adjusted to within one point of the L2 benchmark (66), was 65 on the three-year average, and did not decrease its failure rate by 10% from a pass rate of at least 50%, but was awarded L2 anyhow. Even more entertaining, Elkhardt-Thompson adjusted to 62, was 55 on the three-year average, and failed to reduce its failure rate by 10%, but also wound up at L2. It looks like near misses count in horseshoes, hand grenades, nuclear warfare, and accreditation.
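Taking the post's description at face value, the Level 2 tests can be encoded roughly as follows. This is my reading of the rules as stated here, not the official VDOE specification, and the benchmark and sample numbers are illustrative:

```python
def meets_l2(adjusted_rate, three_year_avg, prior_fail, current_fail,
             prior_pass, benchmark=66.0):
    """Rough encoding of the L2 tests described in the post:
    adjusted pass rate at/above the benchmark, OR three-year average
    at/above it, OR a 10% (relative) cut in the failure rate starting
    from a pass rate of at least 50%."""
    if adjusted_rate >= benchmark or three_year_avg >= benchmark:
        return True
    if prior_pass >= 50.0 and current_fail <= 0.9 * prior_fail:
        return True
    return False

# A school like Brown as described: adjusted 65, three-year 65,
# no 10% failure-rate reduction -- fails every stated test.
print(meets_l2(65, 65, prior_fail=40, current_fail=38, prior_pass=60))
```

By these rules both Brown and Elkhardt-Thompson come out False, which is exactly why their L2 awards look like near misses being counted as hits.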

In science, four schools made L1 on the pass rate, the other four did not even come close to L2.


The overall picture:


Hill made L1 on all the major measures but wound up at L3 because of problems with achievement gaps of its black, poor, and disabled students (the standard says the rating is L3 if two or more student groups are at L3). Similarly, Brown, although credited with L2 on the English pass rate, flunked English because of achievement gaps (also the black, poor, and disabled groups).

Turning to the high schools.


The English pass rates were excellent at the selective schools and more than good enough at Marshall and TJ. Huguenot made L2. Wythe missed L2 on the pass rate but made L1(!) with a 75 on the three year average. Armstrong barely made 50%.

The math numbers were not so good.


Among the standard high schools, only Huguenot made L1 with the others languishing.

The science data painted a similar picture but with all the standard high schools below the L2 benchmark.


Marshall (65 this year and 54 on the three-year average) made L2 by reducing the failure rate by more than 10%. TJ (64 this year, 60 last year, 63 on the three-year average) got a mystery boost to L2.

At the bottom line, only the selective schools were accredited. None of the standard high schools made L2, despite the hidden help they all obtained by stealing the scores of Maggie Walker students.