Quote Without Comment, CGCS Version

The estimable Carol Wolf points me to a June 2018 Review of the Finance and Business Operations of the Richmond Public Schools prepared by the Council of the Great City Schools.

First the Good News: Our new Superintendent requested the study.

The rest is Bad News.  You really should read the report to understand what a huge hole Mr. Kamras is trying to climb out of. 

Here, as a teaser, are some tidbits:

  • The team noted an absence of focus on successful student outcomes.
  • The team did not see any evidence that the district has developed an action plan to address the issues identified in the Memorandum of Understanding or has used the MOU as an opportunity to accelerate change. When the team asked for a copy of the district’s Corrective Action Plan, the team was told, “We are working on our CAP with the state now.” Additionally, a request for samples of the last six “required monthly updates” on steps taken to implement corrective action in the areas of operations and support services went unanswered.
  • There is a lack of communication channels up-and-down and side-to-side within and between departments. The team was told that . . . Departments work in silos with little communications between and among staff teams . . .
  • The team found few analytical tools and techniques, such as key performance indicators (KPIs), are used to measure and compare performance to increase effectiveness, achieve greater efficiencies, and set goals.
  • None of the interviewees could articulate a vision, mission, goals, objectives, or priorities of the administration.
  • Although employee performance evaluations are generally issued annually, assessments are not tied to goals or accountabilities.
  • Business plans with goals and objectives, benchmarks, accountabilities, timelines, deliverables, cost estimates, cost-benefit analysis, return on investment, and other analytics are generally not used or required. Performance metrics to drive programs and support projects and initiatives have not been developed.
  • The lack of a robust position control and management system has created frustration and finger-pointing between budget and human resources departments.
  • Purchase orders are required for all purchases, regardless of value.
  • Audit responses: [image of the audit-response table omitted]

  • The team was unable to determine if any of the 27 recommendations from the August 2004 School Efficiency Review: City of Richmond Public Schools Division, conducted by the Commonwealth of Virginia – Office of the Secretary of Finance, were acted upon.
  • The team noted . . . [l]ittle recognition by most interviewed of how their specific role and function supported the classroom, students, or student achievement.  
  • The internal audit function is misaligned in that the current reporting relationship represents an internal control issue as the independence of the function has the potential to be compromised. . .  Further, the Internal Auditor is not included in the distribution of the external audit reports.
  • The district lacks a designated cybersecurity position to help prevent information breaches, equipment damage, overall network failures, and the potential for “hacking.”
  • The district’s Enterprise Resource Planning (ERP) legacy software system is antiquated (25+ years old), highly customized, and highly inefficient.
  • Annual building safety inspections are not taking place.

ALL Our Schools Are Above Average!

Note added Sept. 28.  Upon rereading the regulation, I see I overstated the requirement for denial.  There is a four-year window built in at Level Two, but once a school reaches Level Three, it is Accredited with Conditions.  There follows a regulatory minuet, but the bottom line is that the school has to tell the Board to go jump in the James to get its accreditation denied.

The 2019 Accreditation Results (based on 2018 testing) are up.

Last year, nineteen (of forty-four) Richmond schools were denied accreditation.  This year, all Richmond – and, indeed, all Virginia – schools are accredited!

(Under the new, byzantine system – see below – nineteen Richmond schools are “Accredited” this year and twenty-four are “Accredited With Conditions.”  The “condition” is that each of the twenty-four must “develop and implement a corrective action plan.”  It will be at least four years before any Virginia school can be denied accreditation.  Only if it fails to negotiate and implement that plan “with fidelity” can the school be denied accreditation.)

If you thought that the elimination of all those denials of accreditation reflected performance improvements, you’d be wrong. 

In Richmond, the reading pass rate rose by 0.73 points this year and science by 0.53; the others fell enough that the five-subject average dropped by 1.73 points.


For reference:  The nominal benchmark for English is 75 (the blue line on the graph); for the other subjects, 70 (the orange line).

Statewide, all the averages fell.


The picture is even more interesting at some of the individual Richmond schools.  For example, we have Armstrong, the 12th worst school in Virginia in terms of the all-subjects pass rate:


In this mixed bag of gains and losses, only one pass rate broke 50%.

But Armstrong now is accredited, albeit “with conditions.”

MLK, the second worst school in the state this year on the all-subject average, paints an even more remarkable picture:  Its highest 2018 pass rate was 43.18 in History and Social Science.  Its lowest, a plunge to 16.67 in writing.


But, by golly, MLK now is accredited, also “with conditions.”

Then we have Woodville, the fourth worst school in Virginia, as measured by the all-subjects average:


No pass rate above 40% this year, but accredited, again “with conditions.”

Note: The “#N/A” is Excel’s way of saying that the State does not test writing in elementary school and it can’t graph the nothingness.

How did the Board of “Education” produce these bizarre results? 

It all goes back to Petersburg, which has been operating under Memoranda of Understanding since 2004 and which was denied accreditation for 2006.  And has been denied accreditation ever since.

The Board has demonstrated beyond doubt that it does not know how to fix Petersburg (and admitted as much, see the Sept. 21, 2016 video starting at 1:48).  Faced now with the ongoing Petersburg debacle and with accreditation disasters in Richmond and elsewhere, the Board punted: They adopted a new, emasculated accreditation regulation.

I commented on that regulation at the proposed stage, pointing out, inter alia, that the changes “make it almost impossible for a school to be denied accreditation.”

To read the entire regulation is to earn a PhD in masochism.  (A reporter for the Daily Press did read it; she survived, and wrote a nice summary.) 

Blessedly, the important parts are short:

  • If a school does not meet a standard and does not come close to meeting it (the regulation dilutes the 75%/70% benchmarks to 66%) for four consecutive years, it falls to performance Level Three.  8VAC20-131-380.E.2.
  • A school at Level Three must develop a “corrective action plan.”  8VAC20-131-400.D. 
  • If a school (or division) fails to adopt and implement a corrective action plan “with fidelity,” it can be denied accreditation.  Id.

In short, in order to lose accreditation, a school must foul up badly for four consecutive years and then tell the Board to go to hell.

That is not a problem, however; that is a feature.  The regulation imposes a sterile paperwork smokescreen to hide the Board’s incompetence as to the inadequate schools in Petersburg and Richmond (and elsewhere).  And, not at all beside the point, to make the Board and all those awful schools look better than they are in fact. 

Never mind the kids who are the victims of this perverse charade.

98.2 million dollars of our tax money at “work.”


Postscript:  Here is the Richmond list.


State Malfeasance at Carver (and Everywhere Else)

The Virginia Board of Education’s concern for the effect of institutional cheating on the Carver students only applies to a fraction of those students.  And it does not apply to students affected by cheating elsewhere.

Following its investigation of the institutional cheating at Richmond’s George Washington Carver Elementary School this spring, the Department of “Education” wrote a thirty-three page report.

Perhaps the most devastating feature of the report was the analysis of the effect of the past cheating upon the Carver students who went on to middle school.  Two graphs show the impact reaching back to the cohort of students who were in the fifth grade at Carver in 2016:



As we might expect, kids who were not doing well at Carver were told they were wonderful; then they bombed out in middle school.  This has been going on since at least 2016, and probably since 2014.

Section V. of the report says that RPS “must” implement a set of seven “actions.”  The two actions relating to help for the affected students are:

1. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for any former GWC ES student entering middle school for the first time in 2018-2019 and 2) a plan to implement any additional instructional supports needed for these students.

2. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for GWC ES students entering the fourth and fifth grade in the 2018-2019 school year and 2) a plan to implement any additional instructional supports needed for these students.

So, Richmond must act to ameliorate the effect of the cheating upon students who were at Carver this year. 

But the Carver students in grade 5 last year and the students of the graphs, who were in grade 5 in 2016, must fend for themselves.

And, looking at the graphs, the cheating was rampant in 2015 and ramping up in 2014.  But the Board of “Education” is indifferent to the effect on the fifth graders of those years.

As well, the report shows an analogous impact on Carver students who transferred to other elementary schools.  But those students, also, must fend for themselves.

The Carver outrage was not the first such incident.  There was an institutional cheating scandal at Oak Grove in 2005.  There was another at Petersburg’s A.P. Hill in 2017. 

We now see that the Board of “Education” has had a simple tool for spotting and – if the analysis were known to be in use – for preventing such cheating.

But the Board has not used this simple analysis in the past and it shows no inclination toward a general deployment now.

It is hard to know what is most outrageous:

  • This official decision to ignore the effect of the cheating on so many students;
  • The failure of the Board of “Education” to have been conducting that cohort study every year to catch – and, even more to the point, to prevent – this kind of cheating; or
  • The failure of the Board of “Education” to institute an annual, general cohort study going forward, primarily to prevent another such abuse of Virginia schoolchildren.

Ah, well: $98.2 million of your tax money at “work.”

Muzik Non Rilassante

Note added 9/20: Mr. Muzik emailed a response to this post.  I’ve appended it here.
Still later: Plus a note from Mr. Pyle of VDOE.

The Principal at Munford, Greg Muzik, posted a couple of comments to Carol Wolf’s repost of my Munford post.  Yesterday, I copied him on a note to Carol and Chuck Pyle of VDOE that included a spreadsheet analyzing his data and reexamining my own.  Mr. Muzik replied to the email.  I now respond:

Mr. Muzik,

I write in response to your email of yesterday to me with copies to Mrs. Wolf and Mr. Pyle (copy attached below).

Your email begins:

I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.

Notes: The parenthesis is not closed in the original. The “what you had put on our blog” is nonsense: I posted originally to my own blog; Mrs. Wolf reposted to hers; I have not “put” anything on any blog of yours. As set out below, my analysis manifestly was not restricted to “just grade 3 and 5.”

By “bowdlerized,” I mean the adjective form of the verb, bowdlerize:

Bowdlerize  verb . . .
2 : to modify by abridging, simplifying, or distorting in style or content . . .
censor, clean (up), expurgate, launder, red-pencil

The pdf you sent Mrs. Wolf contains eight pages. These report “CAT” data for Munford: For 2018, one page each for 3d grade reading and math and one each for 5th grade reading and math; and the same four datasets for 2017.

These data appear to be the basis for your objection to my remarks. Those remarks were, in relevant part:


It would be unusual to see scores in the 86’s for both lower grades and some 8 points higher in grade 5 in either subject, much less in both.  It must be beyond coincidence that, as well, the reading and math scores are the same at each grade level and when averaged, either by grade or by student.

Your response, posted in comments on Mrs. Wolf’s site, discussed at length the CAT pass rate differences at Munford between the third and fifth grades, with no mention of the fourth grade and without any principled explanation of the pattern of reading/math pass rates at all three grade levels in the official VDOE data containing the results of all the tests.

You thrashed that straw man after admitting “I am not sure where Butcher is pulling his data.” If you had attended either my post or Mrs. Wolf’s repost, you would have seen the link to the VDOE database (containing the scores I list above) in the second line of each post.

(On the subject of attention or the lack of it, I would point out that my name is “Butcher,” not “Bollard” as you call me at one point in your comment on the Wolf site.)

If you had wished to dispute my data, the principled approach would have been to download the VDOE data, check my analysis, point out any discrepancy, and suggest a data exchange to resolve any dispute. Instead, you used a partial dataset that is not available to the public to attack my use of the official data that I had sourced and that you admitted were foreign to you.

More to the point, your analysis, such as it was, focused on the aspect of the data that I called “unusual” and, aside from your analysis of the wrong, partial database, failed to respond to the more serious issue, the identical pass rates.

I found ten instances in the six-year period where a Richmond elementary school reported identical reading and math pass rates at grade 3, 4, or 5.


Exactly half of those cases were at Munford.

The presence of Carver in this list carries a particularly unfortunate implication. The presence of Oak Grove, where there was a 2005 cheating scandal, casts a weaker aura.

Friar Occam would counsel the same conclusion that I suggested:

Cheating (by the school, not the students), done systematically, could easily produce equal pass rates in the two subjects.  Coincidence is most unlikely here.  No other explanation presents itself.

I now would amend that to include another unlikely possibility, error by VDOE in processing the data. While possible, that also is not a likely explanation. I have been looking at their data for more than a decade. I recall only one instance where they made a mistake (it was obvious on the face of the data) and one other where they failed to demand that Richmond complete a faulty submission.

As you saw (or should have seen) from my earlier email, I have asked Mr. Pyle whether VDOE has an explanation for the discrepancy between your and VDOE’s pass rate data. The copies you provided, along with the discussion in your last email, give us that explanation: You chose to rely upon a limited set of data, not available to the public, that reports the results of CAT tests, not the full range of tests included in the VDOE pass rate data that I cited and used in my analysis.

To the more fundamental point, VDOE has already taken a first look at the Munford data: Mr. Pyle of VDOE wrote me on the 17th:

Not the final word, but I wanted you to know that we looked at the Mary Munford Elementary data. When you breakdown the grades 3-5 reading and math pass rates, the advanced/pass-proficient percentages are not identical. Given the number of students tested in each grade level, coincidence is not out of the realm of possibility.

Of course, coincidence is not out of the range of possibility; neither is VDOE error. Neither is probable, however, and coincidence is especially unlikely in light of the data above.
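VDOE’s “coincidence” claim can be put to a rough numerical test. The sketch below is purely illustrative; the class size (about 70 test-takers per grade) and the true pass probability (near 90%) are my assumptions, not VDOE figures. It estimates how often two independent subjects would produce identical pass counts in a single grade:

```python
import random

random.seed(0)

# Hypothetical parameters for illustration only: roughly 70 students tested
# per grade, with a true pass probability near 90% in each subject.
N_STUDENTS = 70
TRUE_RATE = 0.90
TRIALS = 20_000

def identical_rate_probability(n, p, trials):
    """Estimate how often reading and math pass counts come out identical
    by chance, treating each subject as an independent binomial draw."""
    hits = 0
    for _ in range(trials):
        reading_passes = sum(random.random() < p for _ in range(n))
        math_passes = sum(random.random() < p for _ in range(n))
        if reading_passes == math_passes:
            hits += 1
    return hits / trials

prob = identical_rate_probability(N_STUDENTS, TRUE_RATE, TRIALS)
print(f"Chance of identical pass counts in one grade: {prob:.3f}")
```

On these assumptions, a single-grade match turns up only about one time in ten; identical rates at every grade level, and in the averaged figures as well, would be correspondingly rarer.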

BTW: You have overlooked a feature of the State data that supports the notion that the Munford scores were not rigged. One giveaway to the Carver data was the sudden rise to best in the City. For (a startling) example:


In contrast, the Munford data, particularly in the third grade, show year-to-year variations, with an improving trend, all in a good but not spectacular range. For example:


If your people were cooking the data, these numbers could be expected to be more uniform and, especially, higher, contributing to a statewide all-subject average rank much better than Munford’s current #203 (of 1,714).

On another subject, you said, “Butcher seems to be complaining about our tax dollars used to support education.” Again, you have revealed your lack of attention to the material you are criticizing. My comment, “98.2 million dollars of our tax money at ‘work,’” was the sarcastic conclusion to a jeremiad about VDOE’s ability to easily spot cheating such as persisted at Carver for at least four years, and its abiding failure to do so.

Indeed, in the Carver case, VDOE demonstrated that it has a key to falsification of the cheating hypothesis as to Munford. Their analysis of the Carver situation comported with the obvious notion that students whose scores are artificially raised in elementary school will generally score much lower once they enter middle school. I already have asked Mr. Pyle whether VDOE intends to perform a similar cohort analysis of the Munford students who have recently entered middle school. I hope you will join me in asking that they conduct that study for all of our elementary schools.  Even though it cannot identify problems with this year’s grade 5 pass rates, that study can root out past problems and, more importantly, should serve as prophylaxis for next year’s testing.


The email:

I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.   The data sent is “raw” data that shows the actual number of students who took the test and who passed or failed.    The data from the state you get may be the adjusted pass rates.  This takes into account students who were “recovery” that can add to your pass rate, those who transferred into the school form another division after the first 30 days of school (Failing scores don’t count) and English Language learners whose scores are not counted when DOE adjusts the pass rates. It also does not include special Education students who take the Va. Alternative Assessment (VAAP).   That data is reported separately, but included in the state published pass rates.    As I told Ms. Wolf, the scores at Munford are at or just a little higher than what would be expected based on the population we serve.  While I would love to say the reason our students perform so well is they great teachers and principal, the results reflect what we see all over Virginia and the nation.

I did neglect to include a sheet for the Plain English Math Assessment that one student took that so the math and reading pass rates in 2018 were the same (4 students failed both math and reading our of the 71 who took the tests). but having the same number of kids passing or failing a test is not unusual  and does not mean much. In our case, 3 of the 4 students were special education students and this is a challenge for all schools,  and one of the gap groups.  In grade 5 reading and math go hand in hand as there is a great deal of reading students must do in the math assessment.  The pass rate has nothing to do with how student performed as passing scores range from 400 – 600.     A better way to look at grade level performance is the mean scaled score.  The mean scaled score for math was 510.5 and the mean scaled score for reading was 497.9.   So even students who passed had great variation in their scores (as expected).  

For schools, the raw data is what is used to look at instruction, addtional support services and programs to support students.  The adjusted pass rates are really just a school accountability process for accreditation.  For instruction use, we use the raw data.   

On Tue, Sep 18, 2018 at 9:54 AM, John Butcher <[deleted]> wrote:

Carol and Chuck,

Here . . . are the (obviously bowderlized) Pearson pdf that Carol received from Munford along with my spreadsheet that compares the Pearson data for Munford (4th grade numbers were absent from the Pearson data I received), my original Munford numbers extracted from a large dl from the VDOE site, and a fresh Munford dl from the VDOE site. Yellow headings denote my calculations.

The VDOE pass rates are identical in both downloads.  There is a clear disconnect between the Pearson and VDOE data. 


  • Do your data folks have an explanation?
  • Carol tells me that the Principal told her that the Pearson data include SGPs.  Does the FOIA still define “public record” to include data “prepared or owned by, or in the possession of a public body or its officers, employees or agents in the transaction of public business”?


Muzik email of 9/20:

1.  The data I provided  is the raw data from Pearson and was not altered data in any way.  This does not account for SOA adjustments that relate more to accreditation that student performance.  The Raw data is what drives instructional priories and programs to address remediation and student support.  

2.  I only addressed grade 3 and 5 because there seemed to be a concern about the SOL pass rate growth of 8 points from grade 3 – 5 which matches  the growth state wide from grade 3 – 5.  I provided several reasons why we see growth from grade 3 – 5

3.  It it not unusual   to have the same pass rates in reading and math in a specific grade level. In grade 5 last year, it just happened that there were 4 students who failed the SOL math and 4 who failed the reading.  This does not mean the results are the same as the range of failing is 0 – 400.   It is also not unusual  for students who fail the reading test to also fail the math especially if they are students with disabilities.  Many students who struggle with reading comprehension, vocabulary and fluency may also have difficulty with math fluency.   However, I reviewed the specific students scores and in grade 5.  Only one student failed both the reading and math SOL tests.  The rest of the students failed one and passed the other with scores ranging from a low of 301 to a high of 397.  A student with a score of 397 was one question away from passing!  

As already indicated, there is nothing unusual  about the SOL results at Munford. We are one of very few schools in the city that has very few children living in poverty.     Our student population reflects the neighborhood we serve.  Our scores reflect what is seen across the state based on the population of students.      All other outcome indicators at Munford have similar results such as the PALS, NWEA MAP and ARDT when we used it.   Mary Munford has outstanding teachers who provide high quality instruction to students and this does impact SOL results,  but may not be seen if simply looking at pass rates.  High quality instruction helps all students, but pass/fail  end of course testing does not  show student growth during the year.


Pyle email:

I’ll just add that I am not familiar with the form of the data schools receive directly from Pearson. The 2017-2018 pass rates reported by VDOE in August were calculated according to business rules under ESSA, just as pass rates were previously calculated according to business rules under NCLB. The adjusted accreditation pass rates we’ve reported for SOA purposes reflect the state Board of Education’s regulations and another set of business rules. Next week, we will report accreditation data aligned with the revised SOA.

Middle School Muddle

We have seen that Richmond’s middle schools have underperformed, particularly since the new math tests in 2012 and the new English tests in 2013.  For instance:



Or, in terms of the individual middle schools (here, the 6th grade pass rates):



Note: To achieve some level of clutter control, I’ve left out Chandler, which closed in 2010.  I’ve also omitted Elkhardt and Thompson, which merged to become Elkhardt-Thompson in 2016.

Looking just at the average pass rates for transitional grades 5 and 6 we see:


We know that at least some of the Richmond difference between the elementary and middle school grades can be explained by cheating in the elementary schools (by the schools, not by the students).  Doubtless the absence of the bogus Carver scores from the 2018 data helps explain Richmond’s Grade 5 drop that year.

The math data show much the same picture.


We can get a more nuanced view by comparing the 6th grade scores with the rates from the fifth grade a year earlier.  This gives a measure (not a cohort analysis, but the best we can do with the public data) of the progress of the class as it goes from elementary to middle school.

For example, Richmond’s 56.25 reading pass rate in 2018 is 13.36 points below the fifth grade’s 69.61 in 2017.  A graph gives us the history.
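The arithmetic behind that comparison can be sketched in a few lines of Python, using only the Richmond reading figures quoted just above (this is the simple year-over-year pairing, not a true cohort analysis):

```python
# Pair each year's grade-6 pass rate with the prior year's grade-5 rate,
# which tracks (roughly) the same class of students.
grade5_reading = {2017: 69.61}  # Richmond grade-5 reading pass rate
grade6_reading = {2018: 56.25}  # Richmond grade-6 reading pass rate

def transition_delta(year):
    """Grade-6 rate minus the same class's grade-5 rate a year earlier."""
    return round(grade6_reading[year] - grade5_reading[year - 1], 2)

print(transition_delta(2018))  # -13.36
```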


The dips in 2013 reflect the score drops with the new tests.  Then both Richmond and the state averages recovered, albeit Richmond to a much larger gap.  In an ideal world, both graphs would level off at zero.

To the point here, the state average these days shows a small drop in pass rate going from elementary to middle school while the Richmond decrease ranges from 9.6 to 16.7 points larger.

On the math tests, the state average 6th Grade rates are higher than the 5th, probably indicating that the new 6th grade test is the easier.  As well, the state recovery from the new tests was much more robust.


But Richmond remains 17.9 points down this year, even after the Carver cheating bonus has been removed.

There is little hope that the Board of “Education” will be looking into this.  That Board did not bother to look at the data that nailed the Carver cheating until after our Superintendent asked for an investigation.  That dereliction of their duty as to Carver was part of a general failure, probably ongoing, to even glance at the data, much less to investigate the discrepancy.

Thus, we’ll have to depend on how vigorously our new Superintendent roots out the other (alleged) elementary school cheating.  If he does his job, the 2019 data may give a clearer picture of whether the current underperformance of Richmond’s middle schools reflects middle school incompetence or widespread elementary school cheating (or, Heaven forefend it, both).

Mystery at Munford

Having seen too much of the tragedy of Richmond’s worst schools, I thought I’d turn for a moment to Richmond’s best elementary school (at least as measured by the SOL pass rates).

On the 2018 all-subject average, Mary Munford Elementary is #203 from the top in the state. 




The Munford pass rates this year were 89.45 in both reading and math.


Turning to the pass rates by grade and subject, we see:







That looks to be just fine until we notice the pattern.


It would be unusual to see scores in the 86’s for both lower grades and some 8 points higher in grade 5 in either subject, much less in both.  It must be beyond coincidence that, as well, the reading and math scores are the same at each grade level and when averaged, either by grade or by student.

These numbers suggest that something is going on here.  They don’t say what it might be.

My spreadsheet has the other Richmond schools, and it goes back to 2013.  When I asked it to find identical reading and math pass rates, the result looked like this for the 3d Grade this year:


The fourth grade was even more interesting:


Let’s collect the whole batch and summarize the cases of identical reading and math pass rates by grade in the time frame of these data.


If we run the same analysis by school (so the average is over all students, not by grade), the only hit is in 2018.


Oak Grove got caught cheating in 2005.  Carver has been cheating since about 2014.  Recently, Carver, Ginter Park and, perhaps, Fisher have reported anomalously high pass rates for their disabled students. 

Cheating (by the school, not the students), done systematically, could easily produce equal pass rates in the two subjects.  Coincidence is most unlikely here.  No other explanation presents itself.

This suggests that our Superintendent may have a much larger problem than just Carver.

As well, this serves to remind us of the underlying, pervasive problem: The Board of “Education” found lots of data (especially student-specific data that they won’t share with the public) that told of the cheating at Carver.  Most telling, the cohort data showed that students with splendid scores at Carver mostly failed when they got to middle school. 



(And recall that Hill is our best middle school, with a 2018 average reading pass rate of 72 and math, 69.)

The Board did not bother to look at those data, however, until after our Superintendent asked for an investigation.  We can be confident that this dereliction of their duty as to Carver is part of a general failure, probably ongoing, to even glance at those data for other schools.

98.2 million dollars of our tax money at “work.”


On the all subject average, Richmond’s Woodville Elementary School has the fourth worst pass rate in the state, 33.82.


Woodville’s reading average was 36.5; math was 40.0.  History was 34.3.  It was the 24.6 in science that brought the average down.

The third grade reading scores have been improving in recent years (although if a 44% pass rate qualifies as “improved,” it’s hard to think of what came before without shuddering).


The higher grades, not so much.



The pattern mostly repeated on the math tests, albeit with drooping pass rates in the fifth grade.




One can think of Woodville as a (very expensive) ad for the County schools.

The Board of “Education” Wants to Help Richmond the Way It Has Helped Petersburg

For years, the mantra to distract attention from Richmond’s failed schools has been, “We beat Petersburg.”

On the 2018 SOL pass rates, we can say it again as to all five subjects and the five subject average: Petersburg has the worst SOL pass rates in the state.  Richmond is only second worst.

Specifically: Here are the bottom 20 (or more) divisions in each subject.  I’ve highlighted the peer jurisdictions in red and, as a courtesy to my readers there (two is more than none!), I’ve also highlighted Charles City and Lynchburg.







Petersburg has been operating under Memoranda of Understanding (i.e., edicts of the Board of “Education”) since at least 2004.

As I have pointed out, the Board of “Education” is a paper tiger.  It has the power to sue to compel compliance with the Standards of Quality.  It has never done so.  It has, instead, persisted with a failed Memorandum of Understanding process that it knows does not work.

There is a simple explanation for this counterproductive behavior:

If it were to sue, the Board would have to tell the judge what Petersburg must do to fix the schools.  The Board cannot do this because it does not know (Sept. 21, 2016 video starting at 1:48) how to fix those schools.  That is, the Board knows it would be futile to sue (and even more embarrassing than its present failure).

So now the Board has brought the same disruptive, expensive, and futile process to Richmond with, in this first year, the inevitable absence of any measurable benefit to the students.

On the evidence of fourteen years of sterile (if not destructive) State supervision of Petersburg and a fruitless year of State supervision of Richmond, RPS would be wise to tell the Board of “Education” to take its Memorandum of Understanding and go away.

Poverty: The Bad Excuse For Bad Schools

The poverty excuse for poor school performance again rears its head.

The 2018 SOL data tell us, yet again, to look elsewhere for the causes of poor school performance.

Before we look at those new data, let’s clear away some underbrush:

  • It is beyond question that poor kids underperform on the SOL tests.  For example, on the 2018 state average pass rate data, the “economically disadvantaged” (here abbreviated “ED”) students underperform their more affluent peers by 21.7 points on the reading tests and 20.0 on the math:


  • Correlation, however, does not imply causation.  The cause of this underperformance may well be some factor that travels with poverty, not the economic disadvantage itself.
  • The data tell us that economic disadvantage has much lower correlation with SOL pass rates than do other factors.  See below.
  • Even with the flawed SOL yardstick, we can identify schools and divisions that perform well and that underperform.  Also see below.
  • The State Board of “Education” had, but has abandoned, a poverty-neutral measure of academic growth, the Student Growth Percentile.  The SGP data showed large variations in division (here and here) and teacher performance.
  • Poverty makes a perfect excuse for poor school performance because some portion of the population will always be less affluent than the citizenry in general.  And, for sure, the schools can’t fix poverty, so they like to blame that external factor for their own failures.
  • There are indications that even perfectly awful schools with large numbers of ED students can be improved.

Turning to the 2018 data, let’s start with the division pass rates on the reading tests vs. the percentage of the economically disadvantaged students.


A glance at the graph tells the same story as the statistics: As the ED percentage increases, the scores go down but there is a huge amount of scatter.  Some divisions greatly outscore the trendline and some grossly underperform.  Clearly, some factor(s) must have a much stronger relationship with SOL performance than the incidence of poverty in the student population.
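That trendline-plus-R-squared reading can be reproduced with a few lines of ordinary least squares.  A minimal sketch, with made-up (ED %, pass rate) pairs rather than the VDOE data:

```python
# Illustrative only: invented (ED percentage, pass rate) pairs,
# NOT the actual division data from the graph.
points = [(20, 88), (35, 82), (40, 85), (55, 74), (60, 79), (70, 68), (80, 70)]

n = len(points)
mean_x = sum(x for x, _ in points) / n
mean_y = sum(y for _, y in points) / n

# Least-squares slope and intercept of pass rate vs. ED percentage.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
         / sum((x - mean_x) ** 2 for x, _ in points))
intercept = mean_y - slope * mean_x

# R-squared: the share of pass-rate variance the line explains.
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in points)
ss_tot = sum((y - mean_y) ** 2 for _, y in points)
r_squared = 1 - ss_res / ss_tot

print(f"slope = {slope:.2f} points per 1% ED, R^2 = {r_squared:.2f}")
```

A downward-tilting line with a small R-squared is exactly the “huge amount of scatter” pattern: ED percentage pulls scores down on average but explains only a fraction of the division-to-division variation.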

Richmond is the gold square on the graph.  The peer cities are the red diamonds, from the left Hampton, Newport News, and Norfolk.

Richmond’s cry of “poverty” is falsified on these data: All but one of the 22 divisions with ED populations larger than Richmond’s outperformed Richmond.


The math data tell the same story.


Note: The ED percentages here and below are percentages of the students taking the tests in question, so they differ slightly between the two subjects.  For example, Richmond is 68.2% on the reading tests and 67.1% on the math.

Of course, division averages do not separate out the performance of the schools with larger or smaller ED populations.  So let’s take a look at the data by school.

Note: Some of the smaller schools are absent from these graphs because of the VDOE suppression rules as to small groups of students.

I’ve broken the data out by grade.  First, reading, in the elementary grades:

These are modest correlations, especially in the non-ED data, with both groups showing roughly the same decline with increasing ED percentage (between 1.6 and 2.0 points of pass rate per 10-point increase in the ED percentage).

On to middle school:

These are about the trends we might expect, but with somewhat better (albeit still modest) correlations.  One interesting difference: It looks like the effect of increasing ED percentage is about a third larger on the non-ED pass rates than on the ED pass rates.

I’ll spare you the math data by school.  They tell the same story but with lower scores and even more scatter.

The bottom line: At the school level, as at the division level, by far the largest effect on SOL pass rates relates to some factor(s) other than the relative number of economically disadvantaged students. 

And we know what one of the other factors (probably the most important one) is: teacher effectiveness.  For example, the SGP data showed for one year (I think it was 2014):

Only one of Richmond’s twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average.  Six (or seven, depending on the rounding) were more than two standard deviations below the state average and four were more than three standard deviations below.  The Richmond average is 1.5 standard deviations below the state average.
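In SGP terms, “standard deviations below the state average” is just a z-score on the teacher-level means.  A sketch with hypothetical numbers (the teacher SGPs and the standard deviation here are invented; only the state-average-near-50 figure comes from the post):

```python
# Assumed state figures: the post puts the state average near 50;
# the standard deviation is invented for illustration.
state_mean, state_sd = 50.0, 10.0

# Hypothetical teacher-average SGPs -- NOT the actual Richmond data.
teacher_sgps = [28, 55, 31, 24, 38, 42, 19]

z_scores = [(t - state_mean) / state_sd for t in teacher_sgps]
below_2sd = sum(1 for z in z_scores if z < -2)  # more than 2 SD below average
above_avg = sum(1 for z in z_scores if z > 0)   # better than the state average

print(f"{above_avg} of {len(teacher_sgps)} above average; "
      f"{below_2sd} more than 2 SD below")
```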

And the Richmond picture was even worse in 6th Grade math:


Note: State average there was 50.6.

Thus, not only is it futile to blame poverty for poor school performance, we know at least one place where we can improve learning: teacher performance.

More Teachers? More Schools? No More Learning?

We hear a lot about the benefits (and not) of smaller class size. 

The VDOE Web site has data on Fall division enrollments, numbers of teaching positions, and SOL performance, so let’s look at those numbers.  (The latest teacher numbers are from 2017, so I’ll use the other data from that year.)

To start, here are the division average reading pass rates plotted against the number of students per teaching position.


Notice the large range here, from 6.85 students per teaching position in Highland to 15.4 in Prince William.  The average of the division averages is 11.5.
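One caution about the “average of division averages”: it weights tiny Highland the same as huge Prince William, so it can differ noticeably from the pooled statewide students-per-teacher ratio.  A toy example with invented divisions:

```python
# Invented (enrollment, teaching positions) pairs -- illustration only.
divisions = {"Tiny": (800, 95), "Mid": (12000, 1100), "Big": (25000, 1700)}

ratios = {name: enr / teach for name, (enr, teach) in divisions.items()}

# Unweighted average of the division ratios (an "average of averages")...
avg_of_averages = sum(ratios.values()) / len(ratios)

# ...vs. the pooled ratio, which in effect weights by division size.
pooled = (sum(e for e, _ in divisions.values())
          / sum(t for _, t in divisions.values()))

print(f"average of averages: {avg_of_averages:.1f}; pooled: {pooled:.1f}")
```

Neither number is “wrong,” but when very small divisions sit at the extremes of the range, the unweighted average can drift from what a statewide ratio would show.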

The gold square is Richmond at 10.7 students per teaching position.  The red diamonds are the peer cities, from the left Norfolk, Hampton, and Newport News.

Recall that smaller classes mean smaller ratios, so look to the left for “better” in terms of class size.

The fitted line has a slight positive slope (a pass rate increase of 0.3 points for each increase of 1.0 in the students-per-teacher ratio), suggesting that smaller classes are associated with slightly lower pass rates.  But the R-squared value, 0.4%, tells us that the two variables are essentially uncorrelated.

On these data, it looks like those divisions that are hiring more teachers per student are not, on average, getting any reading benefit from the extra money.

The math data tell much the same story.


We’ve also heard that Richmond has more small schools than usual, on purpose. 

We can get a measure of average school size in a division by dividing enrollment by the number of principals.  The Richmond average is 514 students per principal, while the average of the division averages is 515.  The range, again, is surprisingly large, from 101 (Highland again) to 943.6 (Chesterfield).

The graphs do not suggest a benefit from Richmond’s average school size vis-à-vis the peers.

So, next time RPS tells me it wants to replace some of its ancient infrastructure with more small schools, I’ll tell ‘em they can pay my share in stock in a nice James River bridge.