Charter Middle School?

The Times-Dispatch has a story, “Richmond parents want a charter middle school.  It faces an uphill battle.” 

The snippet on the “Education” section of the Web page tells us the issue:

The power to approve a nascent effort to launch a public charter middle school in Richmond rests solely with officials wary of watching limited dollars needed for their own underperforming schools follow students elsewhere.

The data suggest that those “wary” School Board members and City Great Ones should think a little harder.

Let’s start with the distributions of fifth and sixth grade pass rates on the reading tests.


Here we see nine elementary schools with fifth grade pass rates that meet or beat the (former) 75% benchmark for accreditation on the reading tests: Munford, Fox, Southampton, Broad Rock, Stuart (now Obama), Holton, Patrick Henry, Cary, and Bellevue.  The only middle schools that make the same cut for the sixth grade are Alternative and Franklin Military. 

Relevant here, Richmond Alternative operates Spartan Academy, which “serves as a school to support students with academic, attendance and behavior challenges.”  Franklin is a different kind of specialty school; it does a decent job for selected students who elect “to experience a regular academic course of study while participating in a Junior Reserve Officer Training Program or Middle School Leadership Program.”

The highest-scoring non-specialty middle school on the graph is Binford, with a 69.5% pass rate, followed by Hill, 67.4%, and Brown, 62.6%.

The math data paint a similarly ugly picture.


Munford leads the fifth grade parade, followed by Cary, Fox, Fisher, Patrick Henry, Broad Rock, and Obama, all of which beat the nominal 70% benchmark for math. 

Franklin and Alternative sixth grade averages again beat the benchmark.  Next in line is Hill, at 65.2%, followed by Brown, 55.4%, and Binford, 54.1%.

(This is not to say that there aren’t serious problems in both elementary and middle schools.  Just look at those collections of pass rates below 50% in both subjects.)

To the point here, there are some Richmond elementary schools where a parent can send a kid while entertaining only the normal parental worries.  But come middle school, parents who can afford it have a good reason to opt for a County or private school.

And opt out they do: The Richmond enrollment plummets after the fifth grade.


Or, in terms of the raw enrollments:


(That ninth-grade “bump” is a national phenomenon that appears to reflect laissez-faire promotion policies in the lower grades.  Most of the drain in Richmond after the ninth grade is dropouts.)

So, the question for our city and school board is more nuanced than just watching dollars flow to a charter middle school:  Would having a decent middle school help stanch the current flow of middle school funds out of the Richmond school system?

College Graduation Rates v. SAT Scores

We have seen that the graduation rates of our 4-year public colleges correlate well with the SAT scores of their freshmen.  Let’s return to that subject with some more recent data.

The most recent SCHEV cohort graduation data are from the 2012-13 first-time, full-time freshmen.  SCHEV also has SAT data for entering freshmen that year.

Here, then, for our 4-year public colleges, are the 4-year graduation rates for that 2012-13 cohort plotted v. those median SAT scores (math + verbal).


The correlation is solid:  Entering SAT scores, on average, are an excellent predictor of that graduation rate (recalling, always, that the correlation does not prove causation).

Three schools considerably outperform the fitted line: James Madison and Longwood at +11 and VMI at +7.  The three largest underperformers are Old Dominion (-13), George Mason (-11), and VCU (-7).
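Those over- and underperformance numbers are just residuals from a least-squares fit. A minimal sketch of the method, using made-up (SAT, graduation rate) pairs rather than the actual SCHEV data:

```python
# Sketch: fit graduation rate vs. median SAT by least squares, then
# take residuals (actual minus fitted). Positive residual = the school
# outperforms the line. The data pairs below are hypothetical.
import numpy as np

sat = np.array([1000, 1060, 1110, 1190, 1240, 1330])
rate = np.array([30.0, 45.0, 52.0, 60.0, 71.0, 85.0])

slope, intercept = np.polyfit(sat, rate, 1)   # simple linear fit
fitted = slope * sat + intercept
residual = rate - fitted                       # distance above/below the line

for s, r, d in zip(sat, rate, residual):
    print(f"SAT {s}: actual {r:.0f}, fitted {slope*s+intercept:.0f}, residual {d:+.1f}")
```

By construction the residuals sum to zero, so every point above the line is balanced by points below it.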

Turning to the five-year rates:


The outperformers on the five-year rate are JMU (+12 from the fitted line), Longwood (+10), VMI (+7), and Radford (+4).  The underperformers are Old Dominion (-9), Norfolk State (-8), Wm. & Mary and George Mason (-5), and Mary Washington (-4). 

We might think that outstanding performance at the 4-year level would leave little room for further gains at five years; JMU, VMI, and Longwood (and, to a lesser extent, Radford) all belie that notion.

The five-year rates are remarkably higher, even at THE University and W&M, with the increases generally larger at the schools with the lower rates.


Remember: The cohort data are for full-time, first-time freshmen.  We might think that these rates would not reflect the larger part-time populations at the urban universities; we might be wrong in that.

BTW: Looking just at graduation rates, we see there is one (private) school that beat even THE University (on the 4-year rate, UVa, 88%; W&L, 93%).


We could argue endlessly about the causes for these correlations and differences.  The better question, I suggest, is what all these schools might do to improve their graduation rates:  On average, 45% of those full-time, first-time freshmen entering 4-year public colleges did not graduate in four years; 28% did not make it in six. 

In terms of people:  About 14,450 (“about” because of the roundoff of the rate data) of 32,112 first-time, full-time freshmen entering 4-year public universities in the fall of 2012 did not graduate in four years; about 8,990 did not make it in six years.
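That arithmetic is easy to check: apply the rounded statewide rates to the cohort count and round to the nearest ten.

```python
# Back-of-the-envelope check of the people numbers above.
cohort = 32112                            # first-time, full-time freshmen, fall 2012
no_degree_4yr = round(cohort * 0.45, -1)  # 45% did not finish in four years
no_degree_6yr = round(cohort * 0.28, -1)  # 28% did not finish in six

print(no_degree_4yr, no_degree_6yr)  # 14450.0 8990.0
```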

Score Distributions

Let’s look at the distributions of 2018 SOL pass rates.

Note added 10/22:  Aha!  I figured out how to graph both all the schools and the Richmond schools from the same pivot table.  So here are the redone graphs.

10/24: Oops.  Correcting an error in the way Excel handled the >50 and <50 entries for suppressed data.

First, the reading tests:


Next math.


Last, Science.


Accreditation: Progress by Fiat

How to Accredit a Failed (and Failing) School

This year, the SOL pass rates declined in the three subjects that underlie the accreditation process, both statewide and in Petersburg.


At the same time, the Board of Education proclaimed a statewide accreditation triumph.



  • One school that was “Accredited Pending Review of Alternative Accreditation Plan” in ‘18 is omitted from the graph. 
  • The VDOE Web page says 92% Accredited.  The data from their spreadsheet show 92.77%.  Even without their curious stance on roundoff (see below), that rounds to 93%.
  • See below for the demise in 2018 of the several “Partially Accredited” categories and the genesis that year of the “Accredited with Conditions” status.

We can gain some insight into the reason for this boom in accreditations atop a slump in pass rates by examining the Petersburg data.

The Petersburg Schools have been operating under Memoranda of Understanding since at least 2004.

The system was denied accreditation continuously from 2006 to 2017.  On the 2017 data, Stuart Elementary and Johns Middle were denied accreditation; A.P. Hill Elementary's accreditation was withheld because the staff there got caught cheating.

This year, notwithstanding all that “help” from the state, all the Petersburg reading pass rates, except perhaps for Peabody/Johns, decreased.



  • Data here are from the SOL database for reading; they no longer test writing in the elementary grades.  Numbers in the School Quality Profiles can be different (see below).
  • Petersburg changed the names of three elementary schools this year:
    1. A.P. Hill became Cool Spring
    2. J.E.B. Stuart became Pleasants Lane
    3. R.E. Lee became Lakemont

Except at the high school, the math rates dropped as well.


The science rates rose at Walnut Hill and Pleasants Lane; they fell at the high school and tanked at Lakemont.


The Petersburg schools’ home page announced the resulting change in accreditation:


All Petersburg schools are accredited in the 2018-19 school year.

(To their credit, they also said that, except for Cool Spring and Walnut Hill, those accreditations were “with conditions.”)

The accreditation changes were dramatic: 


How shall we explain this situation?

Well, if you are feeling masochistic, you can read the new accreditation regulation and dig into the numbers buried in the School Quality Profiles.   Or you can read on here.

This year, there are three performance levels.

For English:

  1. Level 1: 75% current or 3-year average pass rate (boosted for “recovery,” growth, and English learners), or Level 2 the prior year plus a 10% decrease in the failure rate.
  2. Level 2: 66% current or 3-year average pass rate (boosted for retakes, growth, and English learners), or a 50% prior-year (boosted) pass rate improved by 10%. 
  3. Level 3: Everybody else, plus anybody at Level 2 for four consecutive years.

For math, the scheme is the same except the Level 1 benchmark is 70% and there is no boost for English learners.  For science, the math scheme applies except there is no boost for growth or recovery.
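Stripped of the boosts and the failure-rate-improvement paths, the base logic reduces to a benchmark comparison. A sketch of just that base case (the “growth,” “recovery,” retake, and English-learner adjustments are omitted, so this is not the full regulation):

```python
# Base-case level logic: Level 1 benchmark is 75 for English, 70 for
# math and science; Level 2 is 66 for all three. Either the current
# rate or the three-year average can qualify. Boosts omitted.
LEVEL1 = {"english": 75, "math": 70, "science": 70}
LEVEL2 = 66

def performance_level(subject, current_rate, three_year_avg):
    best = max(current_rate, three_year_avg)  # either figure can qualify
    if best >= LEVEL1[subject]:
        return 1
    if best >= LEVEL2:
        return 2
    return 3

print(performance_level("english", 71.49, 60))  # Walnut Hill-style rate -> 2
```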

Here, then, are the English accreditation scores for the Petersburg schools.


The colors tell where the numbers came from:

  • Light green: The pass rate as reported by VDOE.
  • Brown: The percentage of students who flunked reading last year, took the remediation program, and passed this year.  They count twice(!).
  • Dark Green: The students who failed but showed “growth.”
  • Blue: English learners (who are counted only if they pass, but then count twice).
  • Red: The Level 1 benchmark.
  • Gray: The Level 2 benchmark.

There are some anomalies here:

  • VDOE reported a 51.3% pass rate for Cool Spring but their accreditation page says 55% (they round everything off). 
  • Pleasants Lane numbers similarly improved from 60.38 to 61.  (Formerly VDOE rounded up at 0.45; that does not explain this change). 
  • As well, the 71.49 at Walnut Hill became a 72 (perhaps because of the enhanced rounding up). 
  • The reading pass rate/accreditation numbers probably are not comparable for Johns and the high school because the writing scores get averaged in.

Even with all the Finagle factors, Vernon Johns did not come close, even to the new, diluted benchmark. 

Presumably Cool Spring could not enjoy “Growth” or “Recovery” boosts because the cheating canceled the 2017 scores.

But these enhanced pass rates are not the end of the matter.  There also is a potential boost from the three-year average: 


Poor Vernon Johns still looks to be beyond help.  (BTW: VDOE does not explain how they calculate the numbers to accommodate missing Johns data for ‘17 and the Peabody scores prior to the merger).  We need not examine the 2015 rules or the 50%-with-10%-gain options here: VDOE tells us Johns is Level 3 (also in math and science).

Despite an appalling performance in 2018 and no data in 2017 (cheating), Hill/Cool Spring makes the diluted 66% level (and nearly makes the 75% benchmark). 

A little arithmetic (nearly) shows where the Cool Spring 73 came from:  Here are the accreditation scores for Hill/Cool Spring for the last three years:


The actual three-year average for Reading is 48.  BUT if we just average the 2018 value with the (obviously cheating-enhanced) 2016 value, we get 72.  Close enough to 73 for VDOE, it seems.
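The arithmetic forces one conclusion: if three years average to 48 and two of them average to 72, the third year (2017) must contribute zero. A sketch with hypothetical 2016 and 2018 values, chosen only to reproduce the 48 and 72 in the text:

```python
# If the missing 2017 score counts as zero in the "three-year average"
# but is simply dropped in the two-year version, the reported numbers
# follow. The 2016/2018 values are hypothetical (they need only sum to 144).
scores = {2016: 90, 2017: None, 2018: 54}   # 2017 voided by the cheating

three_year = sum(v or 0 for v in scores.values()) / 3
two_year = sum(v for v in scores.values() if v is not None) / 2

print(three_year)  # 48.0
print(two_year)    # 72.0
```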

Math is even more dramatic: 


But the three-year bonus takes care of Pleasants Lane and elevates Cool Spring to Level 1.


Again, the only way VDOE can get the Hill/Cool Spring numbers is by ignoring 2017.  No telling why they report 70 when the actual average was 69.

By any measure, Johns and Lakemont strike out and, indeed, VDOE reports them as Level 3.

The science scoring does not include the recovery et al. adjustments.  On the pass rates, only Walnut Hill makes either benchmark. 


The 3-year average does not save anybody.


On these numbers, Cool Spring’s enhanced average is 65, one point short of the Level 2 benchmark.

Another way to make Level 2 is to take a score of 50 or better and raise it by ten points.  That plainly does not work for Cool Spring, which started in ‘17 at zero (or in ‘16 at 83 in science and dropped to just over half that in ‘18).

Nonetheless VDOE reports Cool Spring as Level 2 for science.

These “levels” feed into the accreditation ratings:  A school is “Accredited” if all its school quality indicators are Level 1 or 2.  Any school with an indicator at Level 3 is “Accredited with Conditions.”  A school accredited with conditions “may” be denied accreditation if it fails to adopt and implement relevant corrective action plans “with fidelity.”  The regulation does not tell us what level of “fidelity” is sufficient.

Aside:  The question whether a school has implemented “with fidelity” does not “rest[] entirely upon . . . [an] examination” or constitute “approval by the Board of Education of a school division corrective action plan” so the due process requirements of the Administrative Process Act should apply.  If the Board of Education should ever try to deny accreditation somewhere, it will be interesting to see whether they comply with those requirements.

Another way to get accreditation this year is to make the grade under the previous regulation.  VDOE tells us no Petersburg school is in that category this year.

A last way to make full accreditation is to enjoy the running three-year accreditation exception from the statute.  On that subject, here are the Cool Spring data:


Cool Spring was accredited in 2015 and 2016 (certainly because they were cheating) but not in 2014 (and of course not in 2017, what with getting caught at the cheating).  So there’s no three-year run and the statute does not apply.

(But the data do make one wonder why VDOE did not bother to examine those obviously bogus 2015 and 2016 numbers and the resulting accreditation ratings.  Indeed, if not rewarded for that obvious cheating, Cool Spring would not enjoy being “accredited” this year.)

Thus, we have the curious situation where VDOE blesses the cheating at AP Hill/Cool Spring with bogus English and Math ratings and with a science rating that looks to be invented from whole cloth.

There are further “achievement gap” measures that provide useful information but let’s pass over those and look at the 2018 results for Petersburg.


As the Petersburg Web page said, everybody is accredited! 

None of the “conditions” schools, even Johns, which flunked on all the academic measures, can be denied accreditation unless it “fails to adopt and implement . . . corrective action plans with fidelity.”

So there you have it: The process rewards past cheating.  The old benchmarks at 75 and 70% are reduced to 66% (at least for four consecutive years) and apply to manipulated pass rates that can be nearly double the actual pass rates.  A school that misses those relaxed benchmarks, and does not benefit from one of the helpful exceptions, is accredited, albeit “with conditions.”   That school can be denied accreditation only if it does not, in effect, tell the Board of Education to go jump in the James.  Or, in the case of Petersburg, the Appomattox.

So, Petersburg gets to feel good about being “accredited.”  Never mind the increased numbers of kids who are not being educated.

98.2 million of your tax dollars at “work.”

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.


The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7-point decrease in the advanced studies rate.


But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader!  The 1.3 and 5 point boosts look to have been offset or partially offset by decreases in the End of Course pass rates. 

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.


Students must pass six EOC tests to graduate.  Thus, the decreases in the pass rates on those required tests must have lowered the graduation rates.  Then the VDOE data manipulation offset those graduation rate declines in some measure. 

That looks like a general explanation.  The specific would require a more detailed knowledge of which students passed or failed which courses, and where in their four-year journey through high school, and whether they graduated.  For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The Big Drops in Richmond, and the lesser drops statewide, in the EOC pass rates mostly came in math and science.



Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests.  This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.

Graduation (and Not)

The 2018 four-year cohort graduation data are up on the VDOE Web site.

For a discussion of the ways VDOE manipulates the data, see this post.  To the point here, the state’s “on-time” rate boosts the numbers, especially for those divisions willing to misclassify students as handicapped.  Now we have an additional boost: “Credit Accommodations” now allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.

The data below are the “federal” graduation rate, the sum of standard and advanced studies diplomas divided by the cohort size.  Because of the Credit Accommodations, the numbers do not compare directly to data from prior years.  (Calculations from earlier data suggest that this manipulation boosts Richmond’s federal rate by about 5% and the state rate by about 1.3%)
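As defined above, the “federal” rate is just diplomas over cohort. A minimal sketch with hypothetical counts:

```python
# "Federal" graduation rate: (standard + advanced studies diplomas)
# divided by the cohort size, expressed as a percentage.
# The counts below are hypothetical, not actual VDOE data.
def federal_rate(standard, advanced, cohort):
    return 100.0 * (standard + advanced) / cohort

print(round(federal_rate(120, 60, 250), 1))  # 72.0
```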

The cohort data also include dropout numbers.

Let’s start with some division graduation and dropout rates.


By nearly a 2:1 ratio, those Richmond diplomas are standard, not advanced studies.


The Richmond numbers are driven by three of our general population high schools, with the other two tagging along.


We have the worst graduation rate in the state.


Also the worst dropout rate.


Finally, here are the lists of the worst schools in terms of graduation rate,


and dropout rate.  (The “#NA()” entries are cases where groups of <10 students invoked VDOE’s suppression rules.)


“Quality Indicators”

The new “school quality indicators” for Accreditation come in three levels (see Subsection F).  For the English tests, they are defined as:

Level One: Schools with a current year or three-year average rate of at least 75%, or schools that were at Level Two the prior year and decrease the failure rate by 10% or more from the prior year.

Level Two: Schools not meeting Level One performance with a current year or three-year average rate of at least 66%, or schools with a prior year rate of at least 50% that decrease the failure rate by 10% or more from the prior year. A school shall not receive a Level Two performance designation for more than four consecutive years.

Level Three: Schools not meeting Level One or Level Two performance.

Math and science are the same except the Level One benchmark is 70%.

The definitions for dropout rate, absenteeism, and graduation rate are similarly complicated.  The benchmarks, before the complications, are:

  • Dropouts: ≤6%;
  • Absenteeism: ≤15%; and
  • Graduation Index: ≥88%

See the regulation for the complications. 

Given that the entire process is built upon arbitrary criteria, let’s create an arbitrary grading scale for the three academic areas, along with dropouts, absenteeism, and graduation index:

  • L1: 100
  • L2: 50
  • L3: 0

With that we can average the results to calculate an overall Quality Indicator Score.
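The averaging is a straight mean over the six indicators (three subjects, dropouts, absenteeism, graduation index). A sketch with hypothetical levels:

```python
# Arbitrary grading scale from the text: Level 1 = 100, Level 2 = 50,
# Level 3 = 0; the Quality Indicator Score is the average over all
# indicators. The levels below are hypothetical.
POINTS = {1: 100, 2: 50, 3: 0}

def quality_score(levels):
    return sum(POINTS[lv] for lv in levels) / len(levels)

print(quality_score([1, 1, 2, 1, 1, 3]))  # (100+100+50+100+100+0)/6 = 75.0
```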

On that basis, 1,464 of 1,813 schools (80.8%) score 100%.  There are seventeen Richmond schools in that group. 


That “TS” represents a group too small to evaluate, they say.

The other twenty-six Richmond schools did less well:


If we turn to the bottom of the state list, as measured by this score, we see MLK in a three-way tie for last place, with five other Richmond schools in the 15-way tie for fourth from last:


But, by golly, they’re all accredited.

Quote Without Comment, CGCS Version

The estimable Carol Wolf points me to a June 2018 Review of the Finance and Business Operations of the Richmond Public Schools prepared by the Council of the Great City Schools.

First the Good News: Our new Superintendent requested the study.

The rest is Bad News.  You really should read the report to understand what a huge hole Mr. Kamras is trying to climb out of. 

Here, as a teaser, are some tidbits:

  • The team noted an absence of focus on successful student outcomes.
  • The team did not see any evidence that the district has developed an action plan to address the issues identified in the Memorandum of Understanding or has used the MOU as an opportunity to accelerate change. When the team asked for a copy of the district’s Corrective Action Plan, the team was told, “We are working on our CAP with the state now.” Additionally, a request for samples of the last six “required monthly updates” on steps taken to implement corrective action in the areas of operations and support services went unanswered.
  • There is a lack of communication channels up-and-down and side-to-side within and between departments. The team was told that . . . Departments work in silos with little communications between and among staff teams . . .
  • The team found few analytical tools and techniques, such as key performance indicators (KPIs), are used to measure and compare performance to increase effectiveness, achieve greater efficiencies, and set goals.
  • None of the interviewees could articulate a vision, mission, goals, objectives, or priorities of the administration.
  • Although employee performance evaluations are generally issued annually, assessments are not tied to goals or accountabilities.
  • Business plans with goals and objectives, benchmarks, accountabilities, timelines, deliverables, cost estimates, cost-benefit analysis, return on investment, and other analytics are generally not used or required. Performance metrics to drive programs and support projects and initiatives have not been developed.
  • The lack of a robust position control and management system has created frustration and finger-pointing between budget and human resources departments.
  • Purchase orders are required for all purchases, regardless of value.
  • Audit responses:


  • The team was unable to determine if any of the 27 recommendations from the August 2004 School Efficiency Review: City of Richmond Public Schools Division, conducted by the Commonwealth of Virginia – Office of the Secretary of Finance, were acted upon.
  • The team noted . . . [l]ittle recognition by most interviewed of how their specific role and function supported the classroom, students, or student achievement.  
  • The internal audit function is misaligned in that the current reporting relationship represents an internal control issue as the independence of the function has the potential to be compromised. . .  Further, the Internal Auditor is not included in the distribution of the external audit reports.
  • The district lacks a designated cybersecurity position to help prevent information breaches, equipment damage, overall network failures, and the potential for “hacking.”
  • The district’s Enterprise Resource Planning (ERP) legacy software system is antiquated (25+ years old), highly customized, and highly inefficient.
  • Annual building safety inspections are not taking place.

ALL Our Schools Are Above Average!

Note added Sept. 28.  Upon rereading the reg. I see I overstated the requirement for denial.  There is a four-year window built in at Level Two but once a school reaches Level Three, it is Accredited with Conditions.  There follows a regulatory minuet but the bottom line is that the school has to tell the Board to go jump in the James to get its accreditation denied.

The 2019 Accreditation Results (based on 2018 testing) are up.

Last year, nineteen (of forty-four) Richmond schools were denied accreditation.  This year, all Richmond – and, indeed, all Virginia – schools are accredited!

(Under the new, byzantine system – see below – nineteen Richmond schools are “Accredited” this year and twenty-four are “Accredited With Conditions.”  The “condition” is that each of the twenty-four must “develop and implement a corrective action plan.”  It will be at least four years before any Virginia school can be denied accreditation.  Only if it fails to negotiate and implement that plan “with fidelity” can the school be denied accreditation.)

If you thought that the elimination of all those denials of accreditation reflected performance improvements, you’d be wrong. 

In Richmond, the reading pass rate rose by 0.73 points this year, the science by 0.53; the others fell enough that the five-subject average fell by 1.73 points.


For reference:  The nominal benchmark for English is 75 (the blue line on the graph); for the other subjects, 70 (the orange line).

Statewide, all the averages fell.


The picture is even more interesting at some of the individual Richmond schools.  For example, we have Armstrong, the 12th worst school in Virginia in terms of the all-subjects pass rate:


In this mixed bag of gains and losses, only one pass rate broke 50%.

But Armstrong now is accredited, albeit “with conditions.”

MLK, the second worst school in the state this year on the all-subject average, paints an even more remarkable picture:  Its highest 2018 pass rate was 43.18 in History and Social Science.  Its lowest, a plunge to 16.67 in writing.


But, by golly, MLK now is accredited, also “with conditions.”

Then we have Woodville, the fourth worst school in Virginia, as measured by the all-subjects average:


No pass rate above 40% this year, but accredited, again “with conditions.”

Note: The “#N/A” is Excel’s way of saying that the State does not test writing in elementary school and it can’t graph the nothingness.

How did the Board of “Education” produce these bizarre results? 

It all goes back to Petersburg, which has been operating under Memoranda of Understanding since 2004 and which was denied accreditation for 2006.  And has been denied accreditation ever since.

The Board has demonstrated beyond doubt that it does not know how to fix Petersburg (and admitted as much, see the Sept. 21, 2016 video starting at 1:48).  Faced now with the ongoing Petersburg debacle and with accreditation disasters in Richmond and elsewhere, the Board punted: They adopted a new, emasculated accreditation regulation.

I commented on that regulation at the proposed stage, pointing out, inter alia, that the changes “make it almost impossible for a school to be denied accreditation.”

To read the entire regulation is to earn a PhD in masochism.  (A reporter for the Daily Press did read it; she survived, and wrote a nice summary.) 

Blessedly, the important parts are short:

  • If a school does not meet a standard and does not come close to meeting it (the regulation dilutes the 75%/70% benchmarks to 66%) for four consecutive years, it falls to performance Level Three.  8VAC20-131-380.E.2.
  • A school at Level Three must develop a “corrective action plan.”  8VAC20-131-400.D. 
  • If a school (or division) fails to adopt and implement a corrective action plan “with fidelity,” it can be denied accreditation.  Id.

In short, in order to lose accreditation, a school must foul up badly for four consecutive years and then tell the Board to go to hell.

That is not a problem, however; that is a feature.  The regulation imposes a sterile paperwork smokescreen to hide the Board’s incompetence as to the inadequate schools in Petersburg and Richmond (and elsewhere).  And, not at all beside the point, to make the Board and all those awful schools look better than they are in fact. 

Never mind the kids who are the victims of this perverse charade.

98.2 million dollars of our tax money at “work.”


Postscript:  Here is the Richmond list.


State Malfeasance at Carver (and Everywhere Else)

The Virginia Board of Education’s concern for the effect of institutional cheating on the Carver students only applies to a fraction of those students.  And it does not apply to students affected by cheating elsewhere.

Following its investigation of the institutional cheating at Richmond’s George Washington Carver Elementary School this spring, the Department of “Education” wrote a thirty-three-page report.

Perhaps the most devastating feature of the report was the analysis of the effect of the past cheating upon the Carver students who went on to middle school.  Two graphs show the impact reaching back to the cohort of students who were in the fifth grade at Carver in 2016:



As we might expect, kids who were not doing well at Carver were told they were wonderful; then they bombed out in middle school.  This has been going on since at least 2016, and probably since 2014.

Section V. of the report says that RPS “must” implement a set of seven “actions.”  The two actions relating to help for the affected students are:

1. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for any former GWC ES student entering middle school for the first time in 2018-2019 and 2) a plan to implement any additional instructional supports needed for these students.

2. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for GWC ES students entering the fourth and fifth grade in the 2018-2019 school year and 2) a plan to implement any additional instructional supports needed for these students.

So, Richmond must act to ameliorate the effect of the cheating upon students who were at Carver this year. 

But the Carver students in grade 5 last year and the students of the graphs, who were in grade 5 in 2016, must fend for themselves.

And, looking at the graphs, the cheating was rampant in 2015 and ramping up in 2014.  But the Board of “Education” is indifferent to the effect on the fifth graders of those years.

As well, the report shows an analogous impact on Carver students who transferred to other elementary schools.  But those students, also, must fend for themselves.

The Carver outrage was not the first such incident.  There was an institutional cheating scandal at Oak Grove in 2005.  There was another at Petersburg’s A.P. Hill in 2017. 

We now see that the Board of “Education” has had a simple tool for spotting and – if the analysis were known to be in use – for preventing such cheating.

But the Board has not used this simple analysis in the past and it shows no inclination toward a general deployment now.

It is hard to know what is most outrageous:

  • This official decision to ignore the effect of the cheating on so many students;
  • The failure of the Board of “Education” to have been conducting that cohort study every year to catch – and, even more to the point, to prevent – this kind of cheating; or
  • The failure of the Board of “Education” to institute an annual, general cohort study going forward, primarily to prevent another such abuse of Virginia schoolchildren.

Ah, well: $98.2 million of your tax money at “work.”