Richmond High Schools

Having looked at the 2019 pass rates of Richmond’s elementary schools and middle schools, let’s turn to the high schools.

Note: The Board of Education has designed its SOL reporting to discriminate against Richmond and other divisions with large populations of economically disadvantaged (“ED”) students. Those students underperform their more affluent peers (“Not ED”) by about 20 points on average. As a result, the SOL averages for divisions such as Richmond (ca. 2/3 ED) are lowered relative to divisions with similar ED and Not ED pass rates but fewer ED students. Fortunately, the database provides both ED and Not ED pass rates.
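The arithmetic behind that distortion is worth making explicit. Here is a minimal sketch with invented numbers (the 1/3 and 2/3 ED shares and the 60/80 pass rates are hypothetical, chosen only to illustrate the weighting effect):

```python
def overall_average(ed_rate, not_ed_rate, ed_share):
    """Enrollment-weighted SOL average for a division."""
    return ed_rate * ed_share + not_ed_rate * (1 - ed_share)

# Two hypothetical divisions with IDENTICAL subgroup pass rates:
# ED students pass at 60%, Not ED at 80% in both.
# Division A: 1/3 ED (suburban profile); Division B: 2/3 ED (Richmond-like).
a = overall_average(60, 80, ed_share=1/3)
b = overall_average(60, 80, ed_share=2/3)

# Same outcomes per group, but B's average lands about 6.7 points lower
# purely because of its student mix.
print(round(a, 1), round(b, 1))  # 73.3 66.7
```

That is why the posts below compare the ED and Not ED rates separately rather than the averages.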

The End of Course (“EOC”) tests are primarily administered in the high schools. The standard diploma requires that the student pass the EOC tests in two English courses and one math course.

To start, here are the EOC reading pass rate averages by school for Not ED (more affluent) students.

image

All three of the selective schools aced these tests. Of the mainstream high schools, only TJ met the nominal benchmark for accreditation (75%). And notice, this is the result for the more affluent (and presumably higher-scoring) students.

The Not ED pass rates for the mainstream high schools were reduced by the loss of some better-performing students to the three selective schools. At the same time, the rates for those five schools were boosted some by the scores of Maggie Walker students; those are reported at the high schools in those students' home districts, not at Walker. VDOE does not report the magnitude of those effects.

Turning to the ED pass rates:

image

Again the selective high schools turned in superb numbers. Marshall was the pick of the mainstream high schools, 4.3 points below the nominal benchmark for accreditation; the other schools worked together to lower the Richmond average to 61.4.

Again, the selective schools skimmed some of the most capable ED students from the pool. That cannot have affected the division average. To the contrary, the Richmond average was boosted some by the rip-off of the Maggie Walker results.

Even so, the averages told a sad story about the state of our high schools.

image

The Not ED minus ED data showed a curious pattern.

image

The Richmond average difference was inflated in some measure by the Maggie Walker swindle. The selective schools showed the effect of attracting some of the more capable ED students. The mainstream high schools were all over the place. The TJ difference was as astounding in one direction as the Marshall difference in the other.

Except perhaps at Franklin, the numbers tested were not so small that a few high or low scores could have produced a large fluctuation in the pass rate.

image

Either the nonselective schools were showing wildly variable performance by their teachers or by their learners. Or both.

Turning to the math tests and the Not ED pass rates:

image

The nominal accreditation benchmark here is 70; only the selective schools met it.

As to the ED pass rates, none of the mainstream high schools came close to the benchmark. The Richmond average barely broke 50%.

image

There is a complication: The Richmond math averages include results from middle schools. Those offer advanced classes, including some in the high school math subjects, that allow some of their better students to get a jump on the graduation requirements. The entire picture looks like this:

image

image

In light of that selection process, it is no surprise that the middle schools outscored the high schools.

image

The state averages are subject to the same issue. Thus, the comparison with the overall Richmond averages should be a fair one, although it does not directly measure the average high school performance.

image

Richmond has a very large math problem.

As to the Richmond average, the Not ED/ED difference was within the realm of reason. Otherwise, another unexplained spectrum.

image

And, again, the numbers tested were not so small as to explain all the scatter.

image

On these data, the scatter in the Not ED/ED pass rate differences will remain a puzzle.

However, the message of the averages (and of the pass rates of too many schools) is clear.

First Look at Accreditation

The 2019 accreditation results are up.

Notes:

  • Please recall that these ratings are so thoroughly rigged as to be bogus.
  • VDOE calls them the “2019-2020 Accreditation Ratings” because they supposedly apply to the current school year, not the 2019 school year for which they (purportedly) measure performance.
  • In the interest of honest reporting, the tables below say “Flunked” where VDOE says “Accredited With Conditions.” That official euphemism allows everybody to say a school is “accredited” when it actually flunked, even by the current, rigged standards.

Here, then, are the school totals for the state, Richmond, three peer cities, and poor Petersburg:

image

Or, expressing the numbers as percentages of the totals:

image

Richmond is up this year as to the percentage of schools accredited, from 43.2% to 45.5% (one school difference), and also improved on the percentage flunking, from 54.5% to 50% (two schools).

Caveat: The table below shows that both of the schools “Accredited Pending Review of Alternative Plan” deserve to flunk.

Richmond had 2.3% of the schools in Virginia this year and 16.7% of the schools that flunked.
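The "one school" and "two schools" figures can be checked with a little arithmetic. A 44-school count for Richmond is my inference from the percentages (VDOE does not state it in the text above), but it is the only count that matches all four numbers:

```python
schools = 44  # inferred from the percentages; not an official figure

# Accredited: 19 -> 20 schools is 43.2% -> 45.5% (the "one school" gain).
assert round(19 / schools * 100, 1) == 43.2
assert round(20 / schools * 100, 1) == 45.5

# Flunked: 24 -> 22 schools is 54.5% -> 50% (the "two schools" improvement).
assert round(24 / schools * 100, 1) == 54.5
assert round(22 / schools * 100, 1) == 50.0

print("percentages consistent with a 44-school division")
```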

Finally, here is a summary of the “indicators” for the Richmond schools as to the basic subjects as well as truancy, graduation rate, and dropout rate.

image

A reader (THE reader?) tells me to say what L1 et al. mean. L1 means the school met the requirement, such as it is; L2 means it met a relaxed requirement; L3 is the laggards. It’s more complicated than that; details here.

In cases where a school is all L1 here but flunked, e.g., Albert Hill, you can dig into the complete table or the "profile" to see what drove the rating. Albert Hill, for instance, flunked on the achievement gaps.

Despite appalling pass rates this year, Fairfield Court was accredited on a three-year average (one that included two years of excellent numbers that almost certainly were based on cheating). The current school year should resolve that anomaly.

Richmond Middle Schools

Having looked at the 2019 pass rates of Richmond’s elementary schools, let’s turn to the middle schools.

Note: The Board of Education has designed its SOL reporting to discriminate against Richmond and other divisions with large populations of economically disadvantaged (“ED”) students. Those students underperform their more affluent peers (“Not ED”) by about 20 points on average. As a result, the SOL averages for divisions such as Richmond (ca. 2/3 ED) are lowered relative to divisions with similar ED and Not ED pass rates but fewer ED students. Fortunately, the database provides both ED and Not ED pass rates.

We’ll start with the 6th grade reading pass rates for the Not ED (more affluent) students.

image

Henderson is blank on this graph because its small number of Not ED students tested triggered the suppression rule (<10 Not ED students tested at this grade level). All we can say about that school is that nearly all the 6th Grade students are ED.

Franklin is unusual in that it has both middle and high school grades. I've included Franklin here because the by-grade data cut out any effect of the high school grades. All the same, Franklin is a selective school, so its numbers don't compare directly to those of the mainstream middle schools.

Aside from Franklin, only Hill beat the state average. Binford handily beat the nominal benchmark for English accreditation (75%). The Not ED students of the remaining schools failed at either bad or catastrophic rates.

As to the ED students, only Franklin broke 50% on the 6th grade reading tests.

image

Turning to the seventh grade:

image

This time MLK replaced Henderson in the clutches of the suppression rule. Elkhardt-Thompson and Boushall stayed in the race to the cellar, joined there by Henderson.

As to the ED students, Franklin led the pack. Otherwise, only Brown and Hill broke 50%.

image

The 8th grade data were another chorus of the same distressing song.

image

image

The math data were even more disturbing.

image

image

image

image

image

image

Turning to the school averages, we lose Franklin: The database will not give an average over just the middle school students there.

On the reading tests we see Not ED students did well at Hill and Binford.

image

None of the other schools made the nominal benchmark for accreditation.

No school did well with its ED students.

image

In terms of the Not ED/ED differences, Hill and Binford did a Munford: Despite excellent Not ED numbers, the ED pass rates were unusually low.

image

The other schools produced unusually small Not ED/ED differences and the Boushall numbers are anomalous, with the ED students outscoring the Not ED.

Turning to the Not ED math data, we see Hill did well again while Binford performed at the margin and the other schools languished in failure.

image

The ED math numbers were even worse than the reading, with no school breaking 50%.

image

The Not ED/ED differences were similar to those on the reading tests with Hill’s ED underperformance even more exaggerated, Binford’s less so, and Boushall again an anomaly.

image

In light of these data, it is no mystery why Richmond’s enrollment drops over 15% between the 5th and 6th grades.

https://calaf.org/wp-content/uploads/2019/03/image-21.png

Stay tuned for the high school numbers.

Richmond Elementary Schools

While we await the (fictional) accreditation numbers, it might be interesting to look at some of the 2019 Richmond pass rates in more detail.

The excellent – but very sloooow – VDOE database can provide data by subject and by test level. The database also breaks out pass rates for students who are economically disadvantaged (“ED”) and for their more affluent peers (“Not ED”).

The ED averages run about 20 points below the Not ED, so the SOL averages punish the schools with large ED populations. To avoid that distortion, we'll look here at the underlying ED/Not ED numbers.

To start, here are the Not ED pass rates on the 3d Grade Reading tests.

image

Those ten schools with no data are the victims of the suppression rule (no numbers released to the public if fewer than 10 students tested). About all this tells us is that those schools have very large ED populations.
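The suppression rule works roughly like this. This is a sketch of my understanding of the VDOE practice, not official code; the threshold of 10 comes from the rule described above, and the function name and example counts are invented:

```python
def reportable_rate(passed, tested, minimum=10):
    """Return a pass rate percentage only when enough students tested;
    otherwise return None, mimicking VDOE's suppression of small groups."""
    if tested < minimum:
        return None  # suppressed: no number released to the public
    return round(passed / tested * 100, 1)

# Hypothetical examples:
print(reportable_rate(45, 60))  # 75.0 -- enough students, so reported
print(reportable_rate(6, 8))    # None -- fewer than 10 tested, suppressed
```

The blank bars in these graphs correspond to the `None` case.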

Here we see Munford, Holton, Patrick Henry, and Fox beating the state average. Then there are five schools where half or more of the Not ED students cannot pass the 3d grade reading tests.

Turning to the 3d grade ED students, we see the state ED average is lower than the Not ED average by 24 points and the Richmond average, by 23.

image

On these data, there’s no telling how well Munford and Fox did with the ED students. In any case, they had too few to affect their averages much.

Patrick Henry’s and Holton’s ED students badly underperformed their Not ED peers.

At the bottom of the scale, we see sixteen (57% of twenty-eight) schools where half or more of the ED students flunked the 3d grade reading tests. Indeed, on average 57.82% of the Richmond ED students did not pass those tests.

The fourth grade data paint a similar picture, albeit with some interesting differences.

image

Do you believe that Chimborazo number? It could be that Chimborazo has an outstanding fourth grade English teacher; perhaps it has a few really bright kids; maybe it has a Carver-type operation. No telling from these data.

Patrick Henry and Holton again did well. 

Turning to the 4th Grade ED students:

image

As with the third grade, Holton did not do well. There were too few ED students to tell about Patrick Henry and, again, Munford.

Swansboro looks to have a very effective 4th grade English teacher.

But look at Chimborazo, second from the bottom. Those Not ED Chimborazo data look to be anomalous.

The fifth grade data again show a similar picture with, again, some interesting variations.

image

image

Holton turns in yet another high Not ED/low ED performance.

In this case we have ED data for all the schools and we see Fox, Southampton, Cary, and Patrick Henry beating Munford as to the ED pass rate, with Munford not quite three points above the state average. Of course, these data do not separate the effects of teaching, student ability, or home environment so they do not speak directly to the quality of those schools. That said, Munford does not get any bragging rights.

To the good, these ED numbers, while appalling, are not as awful as those for the earlier grades.

For an overall view that reduces the effects of the suppression rules, here are the school average reading pass rates.

image

image

Never mind the Munford/Fox empire: Cary, Obama, and Fisher are the ED stars here, with Redd and Francis also beating out Fox. Given that about 2/3 of Richmond students are ED, it’s interesting to see the duopoly thus dethroned.

These data also provide a nuance to moving students between Cary and either Fox or Munford. Cary gets better results with its own, majority ED students. The Munford and Fox ED numbers do not suggest that the Fox/Munford environments might convey any major benefits to those Cary ED students (for even more of that, see the math data below). Indeed, these data suggest it might be helpful to the Fox/Munford ED students if they were moved to Cary.

Turning to math (where a new, failure-averse scoring system improved pass rates statewide this year), the 3d Grade Not ED pass rates (where we have data) range from excellent to (mostly) discouraging.

image

The ED pass rates are heartbreakers in too many cases, but look at Cary, Obama, and Redd.

image

Nine schools have 3d grade pass rates below 50%, with Fox(!) nearly in that league.

Fourth Grade.

image

image

Notice Swansboro, which also had good reading numbers in the fourth grade.

Fifth grade. Notice Fox upping its game here.

image

image

Looking at the school-wide data, we see Munford and Fox still further down in the pack as to their ED students.

image

image

Notice that Ginter Park joins Fox and Munford with excellent Not ED and not so excellent ED pass rates.

One more look: Here are the Not ED/ED differences.

image

image

The several large differences raise the question whether those schools (which include Munford and Fox and some other high-performing schools) are serving their ED students well.

Then there are the anomalous cases, Greene and a few others, where the ED students passed at higher rates than the Not ED. Without more information it’s impossible to know what’s going on there but, for sure, something is out of whack.

———-

Note added on 9/17:

Here are two graphs that examine the Obama/Cary/Fox/Munford situation more directly. The red lines are the nominal accreditation thresholds.

image

image

Those math data are particularly dramatic: Obama and Cary are dealing with much more challenging situations and getting clearly better results with their ED students.

———-

Stay tuned for a look at the middle schools.

Superintendent’s Dilemma

The Petersburg experience – fifteen years of state “supervision” that has produced declining, basement-level SOL pass rates – raises interesting questions about Richmond and its recent embrace of that supervision.

I. Where we are

Richmond’s SOL performance has long been comparable to that of Petersburg and other failed divisions. In recent years, that performance has deteriorated.

Note: Economically disadvantaged ("ED") students pass the SOL at rates about 20 points lower than their more affluent peers ("Not ED"). Thus the SOL averages punish divisions with large ED populations, e.g., Richmond. The data below, then, analyze the ED and Not ED pass rates, not the averages.

Here are the Richmond data for the last six years, presented as pass rate differences from the State averages.

image

image

image

image

image

There are some ups but the overall pattern is down.

It’s disgraceful that any school system, much less the one in the state capital, would do this to its schoolchildren.

II. Memoranda of Understanding Have Failed to Help Petersburg

As set out in detail here and here, Petersburg has labored since 2004 under four different Memoranda of Understanding (“MOUs”) issued by the Board of Education. Those edicts have left Petersburg foundering with declining pass rates.

III. Richmond’s MOU Is An Exercise in Bureaucratic Busywork

The Richmond MOU is long on coordination and meetings and consultations and technical assistance. It is short on specific fixes for Richmond’s awful schools.

The tell is in what the MOU does not say: “If you haven’t fixed those schools by date x, we’ll sue you.” That is because the Board of Education does not know how to fix Richmond’s broken school system (Sept. 21, 2016 video, starting at 1:48). They don’t know what to tell a judge that Richmond should be made to do, so they don’t even contemplate exercising their authority to sue.

IV. Our Superintendent Faces a Tough Choice

In light of the Petersburg experience and the vacuity of the Richmond MOU, our Superintendent has to be wondering whether to squander his limited resources on MOU window decoration or direct those resources to fixing our schools.

Fearless Predictions:

  • If our Super follows the MOU course, he will join the parade of failed Richmond superintendents.
  • If he tells the Board of Education to fashion its MOU into a kite and go fly it, that Board will back down.

Open Question: Even if our Superintendent ignores the MOU, can he fix our schools?

Further “Progress Report” on Petersburg

Despite ongoing “supervision” by the Board of Education since at least 2004,
the SOL pass rates in Petersburg are declining vs. the state average.

The Board of Education has been actively “supervising” the Petersburg public schools since at least 2004.

https://calaf.org/wp-content/uploads/2019/08/image-42.png

(“MOU” is bureaucratese for “Memorandum of Understanding”).

Despite all that “supervision,” the Petersburg schools are marinating in failure.

Let’s analyze that situation in further detail.

Statewide, economically disadvantaged (“ED”) students pass the SOLs at rates about 20 points lower than their more affluent peers (“Not ED”). This makes the SOL average an unfair measure for divisions, such as Petersburg (and Richmond), that have large ED populations. So let’s look at the ED and Not ED pass rates, not the average SOL.

We have seen, for instance, the reading data for Petersburg and the state for 2014 to 2019:

image

It is perhaps more illuminating to take the state averages as benchmarks and look at the Petersburg differences.

image

Three things are apparent in these data:

  1. The overall pattern is decline.
  2. Petersburg’s ED students passed the reading tests at rates ranging from eleven to fifteen points lower than their peers, statewide. Petersburg’s Not ED students did much worse. There is no way to tell on these data whether this Not ED underperformance reflects the schools or the particular student mix in Petersburg.
  3. All these numbers would be disastrously bad, even without the decline.

The writing data paint a similar picture, but with even larger deficits for both groups.

image

History and Social Science: Another pattern of decline from bad to terrible.

image

Ditto, math:

image

Finally, science:

image

There’s no sign that anybody at the Board or Department of Education has been held accountable for this miserable failure.

That “Memorandum of Understanding” is mislabeled. It should be titled “Suicide Pact.”

Just Carver?

The RT-D this morning reports: “Frustrated with the fallout of a cheating ring at a Richmond elementary [Carver], members of the city’s School Board on Monday pushed for more support for the school.”

That’s overdue, of course. But what about Fairfield Court, where they almost certainly were cheating and where the scores are even lower? And while we’re counting, how about MLK, where there’s no indication of cheating, just abiding, appalling failure?

image

Not Just Carver?

We have seen the pattern at Carver: Astounding pass rates followed by appalling pass rates.

At Carver, the plunge came after some of the staff there got caught cheating. There has not been an investigation at Fairfield Court (at least not that we know of) but the similar pattern there suggests there should be one.

image image

Notes: “ED” indicates economically disadvantaged; “Not ED” denotes the more affluent peers. Both schools have very large percentages of ED students with, accordingly, low percentages of Not ED; the missing blue bars in the graphs are cases where the VDOE suppression rules blocked posting of the Not ED data. The red lines are the nominal accreditation benchmarks.

The other subjects tell much the same story.

image image

image  image

image image

It looks like there need to be some firings at Fairfield Court beyond the replacement of the principal this spring. In any case, there is something terribly wrong there, and the students, parents, and taxpayers are entitled to know what it is. Even more to the point, these data raise the question of what RPS will do to help the middle school students whose “education” at Fairfield Court left them unprepared for what came next.

Geography of Achievement (or Not)

On average, economically disadvantaged (“ED”) students pass the SOL tests at rates about 20 points lower than their more affluent peers (“Not ED”). Thus, the SOL averages punish the divisions with larger populations of ED students by averaging in larger numbers of lower scores.

To avoid that, we can look separately at the ED and Not ED pass rates.

To get a picture of the geography of the pass rates, I’ve turned to Excel’s “filled map” feature. To start, here is that map of the division average reading pass rates of Not ED students.

image

The orange county is Halifax, which was hit by the VDOE data suppression rule. The orange dot is Williamsburg, which the program does not include in the reported “Williamsburg-James City County.”

Here is the same map for the ED reading pass rates.

image

The colors make an interesting point: Compared to their peers, Not ED students score well ‘most everywhere; ED students, only in a few places, even though compared only to other ED students. Then there’s that interesting collection of high scores in SW Virginia for both groups.

The math data paint a rosier picture.

image

image

Richmond, of course, is that large, magenta blob sandwiched between two greener (but for the ED students, not very green) counties.