Good News for Transparency

The estimable Carol Wolf just sent me a copy of the order in Paul Goldman’s Freedom of Information Act suit for documents regarding the Coliseum boondoggle.  These include a stack that the City released to the Times-Disgrace but wanted over $2,000 to “review” before they would even consider giving them to Paul. (Can you spell “That Stinks!”?)

My handy ‘puter converted this from the pdf; any errors (notably the format) belong to that process, not to the Court:


In the Circuit Court of the City of Richmond, John Marshall Courts Building

Case No.: CL19-2591-6


On May 29, 2019, CAME Paul Goldman (“Petitioner”), pro se, and the City of Richmond (“Respondent”), by counsel, on Petitioner’s Petition for Mandamus (the “Petition”), in which Petitioner requested an order in mandamus compelling the Respondent to produce certain documents previously requested by Petitioner in letters dated March 21, 2019 and April 25, 2019, pursuant to the Virginia Freedom of Information Act, Virginia Code § 2.2-3700 et seq. (“FOIA” or the “Act”). The letters correspond with Count I and Count II of the Petition, respectively.

WHEREUPON, with the parties in apparent agreement that the documents covered in the Petition are “public records” subject to disclosure unless excluded under the Act, the hearing proceeded with Petitioner first offering an opening statement followed by that of the Respondent. Respondent presented several documents during its presentation and made argument concerning their meaning and legal import but did not move them into evidence. Throughout, Petitioner argued and objected to Respondent’s position, observing that under the Act, “the public body shall bear the burden of proof to establish an exclusion by a preponderance of the evidence.” Va. Code Ann. § 2.2-3713(E). This requirement extends to Respondent’s affidavit of a witness which purported to detail the reasons and circumstances that would support Respondent’s argument that the information sought in the Petition was subject to ongoing negotiation and thereby excluded from disclosure under Virginia Code § 2.2-3705.1(12). The Court stated that it was unwilling to accept statements by counsel for Respondent as evidence over objection. This addressed the information sought by Petitioner in Count I of the Petition.

As to the information sought in Count II, Petitioner contends that he now seeks only 2643 documents that Respondent has previously provided to the Richmond Times-Dispatch (the “RTD”), pursuant to a FOIA request. Respondent contends that before it can know what documents to disclose, if any, Respondent needs to examine each document to determine whether disclosure is proper. Further, Respondent, by counsel, stated that some documents were provided to the RTD inadvertently and may not be subject to disclosure. Respondent contends that the case of Tull v. Brown, 255 Va. 177, 494 S.E.2d 855 (1998) stands for the proposition that a public body’s prior disclosure of information does not waive its claim of exemption for the same information in a subsequent request, for the well-known reason that estoppel does not operate against the government. The Court finds, however, that the Tull case is distinguishable from the case at bar, and accordingly, not controlling for reasons stated on the record.


[It is ORDERED that Respondent] shall disclose the documents requested in the Petition upon payment by Petitioner of costs in the amount of $50.00 for disclosure under Count I and costs in the amount of $200.00 for disclosure under Count II, within ten (10) days of such payment. It is further ORDERED that Petitioner’s request for attorney’s fees is DENIED.

The Clerk is directed to forward a copy of this Order to all parties.

Exceptions are noted.

IT IS SO ORDERED.

ENTER: 6/7/19


Help for the Board of “Education”

Now that our Board of “Education” has demonstrated that it does not know how – more likely, does not wish – to inspect its database in order to detect official cheating to boost graduation rates, I thought I’d give them an easy place to start doing their job.

And give the Governor a list of places where the Board’s malfeasance probably has allowed school divisions to issue bogus diplomas wholesale.

Brief background:

  • The End of Course (“EOC”) SOL tests are the gateway to graduation; to earn a standard diploma, a student must pass two English EOC tests, one math, one history & social science, and one lab science.
  • On average, the pass rates of Economically Disadvantaged students (“ED”) on the EOC tests are about 17% lower than the rates of their more affluent (“Not ED”) peers; consistent with that, ED students graduate at about a 6% lower rate than Not ED students.
  • Richmond’s ED students in 2018 passed the reading and math EOC tests at a rate 13% lower than the Not ED students but, as a result of wholesale cheating (by the schools, not the students!), the ED cohort graduation rate was 7.5% higher than the Not ED rate.
  • The Board of “Education” did not notice this discrepancy (nor similar warnings in the data from earlier years).

Let’s start with a plot of the 2018 division average differences between the ED and Not ED graduation rates against the difference in EOC reading pass rates.
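The computation behind that plot is simple enough to sketch. In the Python below, the absolute pass and graduation rates are invented for illustration; only the ED-minus-Not ED gaps match the figures quoted in this post (state: -15.6 pass / -6.0 graduation; Richmond: -13 / +7.5).

```python
# Sketch of the gap computation behind the plot.  The absolute rates
# are invented; only the gaps are taken from the post.

def gaps(div):
    """Return (pass-rate gap, graduation-rate gap), each ED minus Not ED."""
    return (div["ed_pass"] - div["not_ed_pass"],
            div["ed_grad"] - div["not_ed_grad"])

def anomalous(div):
    """True when ED students pass at a LOWER rate but graduate at a
    HIGHER rate than their Not ED peers -- the anomalous zone of the plot."""
    pass_gap, grad_gap = gaps(div)
    return pass_gap < 0 and grad_gap > 0

state = {"ed_pass": 71.4, "not_ed_pass": 87.0,
         "ed_grad": 85.0, "not_ed_grad": 91.0}
richmond = {"ed_pass": 60.0, "not_ed_pass": 73.0,
            "ed_grad": 87.5, "not_ed_grad": 80.0}
```

With these illustrative numbers, `anomalous(richmond)` comes back true while the state average does not: lower pass rate, higher graduation rate.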


The state average is the red circle showing an ED pass rate 15.6% below the Not ED rate and an ED graduation rate 6.0% below.

Richmond, the gold square, is up in the anomalous zone, with an unexceptional pass rate difference but with an ED graduation rate 7.5% higher than the Not ED rate.  We now know that the reason for that anomaly is wholesale cheating.  By the schools.

As a courtesy to my readers there, the blue diamond on the graph is Lynchburg; the green, Charles City.

BTW: The points over to the right side of the Y-axis are divisions where the ED students passed at higher rates than the Not ED.  If the Board of “Education” elects to start doing its job, it should also look at those divisions to see whether they are messing with the classification of handicapped students or otherwise manipulating the data.

The math data paint a similar picture.


Let’s take a closer look at the divisions that joined Richmond in the group with anomalously high ED graduation rates.  (That’s not to say that some of these other data are not abnormal.  I’m just going for the obvious cases.)


The math data again tell much the same story.


Small populations can lead to large variability in averages so these data, standing alone, don’t say much about the smaller divisions.  But, for sure, the data wave a large, red flag over the ED graduation rates from Norfolk, Newport News, Va. Beach, Portsmouth, Arlington, Roanoke, and Lynchburg.

But don’t hold your breath waiting for the Board of “Education” to look behind those numbers.

High School Corruption and State Malfeasance

In 2016, VDOE conducted a course schedule audit at Armstrong as part of the work following denial of accreditation there.  VDOE reported discrepancies in course requirements and transcript accuracy, inter alia.  The 2017 follow-up audit “concluded that there was not sufficient evidence to determine that problems identified . . . had been resolved.”

In 2018, at the request of our Superintendent, VDOE expanded the audit to include all five mainstream high schools.  They found:

  • Bell schedules did not meet the number of hours required by the Standards of Accreditation.
  • Verified credits did not populate in the transcripts.
  • Attendance data was incorrect throughout the transcripts.
  • Some students received one credit for classes that should not have carried credit.
  • Some students received two credits for classes that should have carried one credit, such as Career and Technical Education (CTE) classes.
  • Credit was incorrectly given for what appear to be locally developed elective courses without evidence of approval by Richmond Public Schools Board.
  • Credit was incorrectly given for middle school courses ineligible for high school credit.
  • Course sequencing issues were identified.
  • Academic and Career Plans lacked meaningful content.

Translating that from the careful bureaucratese of VDOE: The schools were cheating wholesale to boost the graduation rates.  Indeed, the RT-D later reported the schools were:

Rubber-stamping student work. Choosing to use an alternative test instead of giving students the common state test. Putting students on individualized education programs to circumvent state graduation requirements.

As Scott Adams is fond of saying,

Wherever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. That’s a universal law of human awfulness. When humans CAN cheat, they do. Or at least enough of them do.

RPS seems to be determined to prove Adams right.  This latest outrage comes in the wake of the wholesale cheating at Carver.

To be clear: We’re talking here about cheating by the school staff, not by the kids.

In the past, this kind of thing has been manifest in the data.  An earlier post looked at the RPS pass rate averages and suggested cheating.  So I thought I’d take a (further) look at the graduation numbers.

Notes on the data:

  • Economically disadvantaged students (here, “ED”) underperform their more affluent peers (“Not ED”) on the SOL pass rate averages by somewhere around twenty points.  Where there are large ED populations, the overall SOL averages can be misleading, so we’ll look at the ED and Not ED pass rates separately.
  • The End of Course (“EOC”) tests are the gateway to obtaining “verified credits” toward the graduation requirements.  To graduate with a standard diploma, the student must pass two EOC tests in English, one in math, one in history & social sciences, and one in a lab science.
  • On the 2018 EOC tests, about 60% of the Richmond students tested were reported to be ED.


  • That’s an average.  The tested ED populations of our mainstream high schools vary considerably (the selective high schools all do very well, thank you, and the boot camp Richmond Alternative is a mess so there’s nothing to be learned here from their data).



  • The graduation rates here are the boosted rates the Board of “Education” uses to make their constituent schools look better than they really are.

With that out of the way, let’s turn to the cohort graduation rates of those five schools and the pass rates there on the EOC tests. 

First, on the reading tests:


The orange diamonds are the 2018 cohort graduation rates/2018 pass rates of the ED students at the five high schools.  The blue circles, the same data for the Not ED students.

The easy way to read the graph: Look at the slope from orange ED to blue Not ED: 

  • Up and to the right is the expected condition with the ED students both scoring below and graduating at lower rates than the Not ED. 
  • As the line approaches horizontal, the ED pass rate is lower but the graduation rate approaches the Not ED rate.  We can wonder what is going on.
  • When the line slopes down, the lower scoring ED students are graduating at a higher rate than the Not ED.  Think cheating.
  • When the line approaches or passes vertical, ED students are passing at about the same rate as the Not ED, and graduating at a higher rate.  Something is doubly rotten.
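That reading guide amounts to a small classifier. Here is a sketch; the “near vertical” threshold of 3 points is my own arbitrary choice, not anything from the data.

```python
def classify(ed_pass, ed_grad, not_ed_pass, not_ed_grad, near_vertical=3.0):
    """Classify the ED -> Not ED segment on a pass-rate (x) vs.
    graduation-rate (y) plot.  `near_vertical` is the pass-rate gap,
    in points, below which the segment counts as roughly vertical."""
    dx = not_ed_pass - ed_pass   # pass-rate gap
    dy = not_ed_grad - ed_grad   # graduation-rate gap
    if dy < 0:                   # ED students graduate at a HIGHER rate
        if abs(dx) <= near_vertical:
            return "doubly rotten"   # same scores, higher graduation
        return "think cheating"      # lower scores, higher graduation
    if dx > 0:
        return "expected"            # up and to the right
    return "wonder what is going on"
```

Feeding in the Richmond 2018 figures (ED passing 13 points lower but graduating 7.5 points higher) lands squarely in the “think cheating” bucket.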

Here, the Armstrong data look reasonable: The Not ED students outscored the ED by almost fifteen points and enjoyed an almost ten point better graduation rate.  We can wonder whether this is the result of the earlier audit there (see below).

In contrast, the ED students at both John Marshall and TJ graduated at higher rates than their Not ED peers while passing the EOC tests at 20+ and ~15 percent lower rates.  Something boosted those ED graduation rates.  Cheating is the obvious first candidate.

Wythe shows a more striking, anomalous pattern, with a (smaller than expected) 10% pass rate deficit and a twelve point higher graduation rate for the ED students.

Huguenot is anomalous in two respects: The pass rates of the two groups are almost identical and the ED students graduated at a rate almost 30% higher than the Not ED.  We can wonder whether the large ESL population there bears on this.

The math data tell the same stories (at dishearteningly lower pass rates).


Armstrong’s data again look reasonable.  Wythe and, especially, Marshall and TJ show higher ED graduation rates than Not ED, with lower pass rates.  Huguenot shows a much higher ED graduation rate at about the same pass rate.

To gain some context, let’s look at some history. 

First, as background, the state averages.



There is the norm: Slope up from orange to blue.  In terms of numbers, ED pass rates about twenty points below the Not ED and cohort graduation rates about seven points below.

Turning to Richmond, here is Armstrong on the reading EOC tests:


2014 and 2015 clearly suggest cheating.  2016 suggests it; 2017 says it again.  But 2018 looks reasonable. 

We might infer that the presence of VDOE in 2016 had a salutary effect at Armstrong.  A caveat: Because of the very small numbers of Not ED students at Armstrong, the Not ED pass and graduation rates can be expected to fluctuate considerably from year to year (as they do here).  These data fit the inference of cheating before ‘18 with a course correction then but there might well be a more benign explanation.

The math data at Armstrong also fit the inference of cheating, abated upon the appearance of VDOE.


The Marshall data are consistent with unabated cheating, albeit with higher pass and graduation rates than at Armstrong.



The Wythe data tell a similar story, with a suggestion of robust cheating in 2018.



Huguenot data suggest ongoing, rampant cheating.



On both tests the 2015 Huguenot numbers show ED and Not ED pass rates that are nearly identical, so the positive slope that year does not offer any solace.

Finally we have TJ, with anomalous but OK patterns in 2017 (albeit with unacceptably low ED pass rates) but otherwise showing data consistent with cheating.



————————– Rant Begins ————————-

Our Board of “Education” has created a haven for cheating high schools:

  • They don’t look at their mountain of graduation data until after they review a school that has been denied accreditation (if they even look then), and
  • They now have rigged the system so it is almost impossible for a school to be denied accreditation.

The Board has institutionalized Adams’ “small risk of getting caught.”

Why did VDOE ignore its mountain of data and do nothing in Richmond until (1) Armstrong was denied accreditation (under the old, now replaced, system), and (2) our Superintendent invited them in to the other high schools?  Why are they not looking for this pattern at high schools in other divisions? 

I think the answer is at least nonfeasance and probably malfeasance.  It’s past time to fire the Board and appoint people who will actually do the job.

————————– Rant Ends ————————-

It will be interesting to see how all this affects this year’s graduation rates.  Those data should be available in mid-August.  Stay tuned.

Oops: Corrected the link and list of required EOC courses.

Feeding and Failing the Kids

The VDOE data show each school division’s spending for the day school operation (administration, instruction, attendance & health, pupil transportation, and O&M) and then for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”).  The previous post discussed the very large expenditures beyond day school for Richmond and some other divisions; that spending correlated fairly well with the division percentage of economically disadvantaged (“ED”) students.

An astute reader asked whether the spending for food services showed a correlation, particularly in the light of the federal support from the National School Lunch Program and the School Breakfast Program for ED students. 

Indeed, there is a decent correlation.


Richmond is the gold square.  The red diamonds are the peer cities, from the left Hampton, Newport News, and Norfolk.  Lynchburg is purple; Charles City, green.  Richmond is $108 (18%) above the fitted line.

To look at those figures in context, here they are again (green this time), along with the total spending per student beyond the day school operation (“Excess”) and that total minus the food expenditure.


The excess shows a modest correlation while excess minus food shows very little; that correlation in the excess clearly comes from the food expenditure.

As with the overall Excess spending, the SOL pass rates show a modest negative correlation with the food service expenditure.



Again, as with the overall Excess spending, most of that negative correlation is driven by decreases in pass rates of the students who are not economically disadvantaged.



These data sing the same song: This extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

We earlier saw that increasing a division’s percentage of ED students is associated with a decrease in the pass rates of the Not ED students more than the ED; these data are consistent with that (but do not proffer an explanation).

Finally, we have the SOL performance vs. the Excess spending other than for food services:



There is a bit of a correlation here, but in the wrong direction. 

Closer to home, the Richmond spending remains a substantial outlier.  RPS is spending $856 per student more in non-food Excess than the average division; that $19.7 million is buying the reading and math deficits you see in these graphs. 

RPS keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money on things other than the day school operation. 

If money were as important to performance as they say it is (albeit money in fact is irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.

Extra Bucks, Negative Return

Looking at the Division disbursement data for 2018 (now that they are up on the VDOE Web site) one sees Richmond spending some $1,875 per student for things other than its day school operations ($15,697 vs. $13,821).

Note: Here and below, the “total” disbursements include everything but spending for facilities, debt service, and contingency reserve.  Specifically, they include, in addition to costs of the day school operation, spending for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”)

A graph of those data for the Virginia divisions suggests that Richmond (the gold square) is an outlier. 


Indeed, along with Staunton, Petersburg, and Bristol, Richmond leads the pack in spending for things other than day school.


Except for Charlottesville and Arlington (and Richmond, of course), the divisions with this kind of large difference are not the Big Spenders. 

A look at all the divisions shows that the difference does not scale with day school spending.


Richmond is the gold square; the red diamonds are the peer cities, from the bottom Hampton, Newport News, and Norfolk.  The purple diamond is Lynchburg; the large green diamond, Charles City.

More significantly, the divisions with more of this excess spending are not getting better SOL scores.  Quite the contrary:



The red, from the left, are Hampton, Newport News, and Norfolk.

But wait: We know that the raw SOL pass rate discriminates against divisions with large populations of economically disadvantaged (“ED”) students.  So let’s look at the performance of both the ED and Not ED groups.



That’s clear enough:  On average, the divisions that spend more money for things other than day school have lower SOL pass rates for both ED and Not ED students.  In the case of Richmond, those pass rates are grossly lower.

Notice also that the decrease in performance with increasing non-day school spending is larger for the Not ED students, particularly on the math tests. 

As always, the correlations here (where they are non-trivial) do not imply causation.  The other side of that coin, however, shows that increasing spending for things other than the day school operation is not related to improving the day school performance.

One more question for these data: How do those excess expenditures relate to the population of economically disadvantaged students?


The free/reduced price lunch cost should merely displace other lunch spending but the School Breakfast Program costs should increase with the ED population.  And here the expenditure for things other than day school rises with the number of poorer students, with a decent correlation.  So, at least in a qualitative fashion, this makes sense. 

That said, this extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

Closer to home, the Richmond spending remains a substantial outlier.  Richmond keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money (almost twice the average) on other things. 

If money were as important to performance as they say it is (albeit money in fact is irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.

Boosted Pass/Graduation Rates?

An earlier post included some interesting Richmond data.  For instance:



Students must pass those End of Course (“EOC”) tests to obtain “verified credits” toward the graduation requirements.

The state EOC numbers are high, compared to the 8th grade rates; the Richmond numbers are unbelievably high. 

There are (at least) two ways the pass rates could improve between middle and high school:

  • The high school dropouts leave behind a group of better performing students;
  • The EOC tests are easier or are scored less rigorously than the middle school tests.

We already know about Richmond’s shocking dropout rate so let’s do a thought experiment on the 2018 data:

  • Assume the cohort dropout rate of 20%;
  • Assume that none of the (soon to be) dropouts passed the 8th grade tests;
  • Assume that the non-dropouts in high school (from earlier years) passed the EOC tests at the same rate as the 2018 8th graders passed the 2018 8th grade tests.

That is, assume the 8th grade pass rate is and has been the average of 80% of some higher number and 20% of zero; then assume the EOC pass rate will be equal to that higher number.  You’ll recognize that this is a very rough approximation but, in light of the reasonably constant state 8th grade and EOC and Richmond 8th grade numbers, not an outrageous one.

A little back-of-the-envelope arithmetic calculates a dropout-boosted EOC pass rate of 64.8% in reading vs. the actual 70.9%, and 52.1% in math vs. the actual 59.2%. 
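That envelope is small enough to show in full. Under the assumptions above, if a fraction d of the cohort drops out and none of the dropouts would have passed, the surviving students’ EOC rate is the 8th grade rate divided by (1 − d). The 8th grade rates below (51.8 reading, 41.7 math) are back-calculated from the post’s 64.8% and 52.1% results, not taken directly from VDOE.

```python
def boosted_rate(eighth_grade_rate, dropout_rate):
    """EOC pass rate for the non-dropouts, assuming every dropout would
    have failed: the old rate scaled up by the shrunken denominator."""
    return eighth_grade_rate / (1.0 - dropout_rate)

# Richmond, 2018, with the assumed 20% cohort dropout rate.
reading_rate = boosted_rate(51.8, 0.20)   # ~64.8, vs. the actual 70.9
math_rate = boosted_rate(41.7, 0.20)      # ~52.1, vs. the actual 59.2
```

Even with these deliberately extreme assumptions, the calculated rates fall several points short of the actual EOC numbers.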


It looks like getting rid of those dropouts produces a nice boost.  No wonder Richmond is so casual about its horrendous dropout rate.

The state data are consistent with the same phenomenon but the lower dropout rate (5.50%) gives less dramatic results for the calculated EOC numbers.


Even so, these extreme assumptions are not nearly enough to explain the actual EOC pass rates. 

If dropouts don’t explain the entire pass rate differences, we are left with the Board of “Education” messing with the EOC tests to boost the graduation rates.  For sure, we know that the Board sets the cut scores [@ p.130].  A boost of about five points (huge, in SOL terms) in both subjects would explain the state data and would go a long way toward rationalizing the Richmond numbers.

Of course, this speculation doesn’t prove anything.  But we already know that such manipulation of the data would align with other data engineering that makes the schools (that the Board supervises) look better than they really are.  See this and this and this and this and this and this.  Also see this, suggesting that the Board cares more about the more affluent divisions.

We’ve no direct way to test the notion1 but these data certainly suggest a simple pattern: EOC pass rates are boosted at the state level by the dropouts and by VBOE’s manipulation of the SOL test scoring; the EOC scores in Richmond are similarly boosted by the rigged scoring and inflated even more because of the terrible dropout rate.

Your tax dollars at “work.”

1.  VDOE does have the information but don’t waste any time waiting for them to disclose it.

Richmond: More Money, Worse Schools, False Excuse: Update

An earlier post discussed our Superintendent’s false statement about financial support.  That post was based on the 2017 VDOE data, the latest funding data then available.  I’ve updated the post with the 2018 numbers that VDOE just posted.

Our school Superintendent wrote an op-ed for the Times-Dispatch complaining that:

Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

In fact, Virginia’s high poverty divisions (larger numbers of economically disadvantaged students) actually spend more per pupil on average than the more affluent divisions.

Richmond, the gold square on the graph, spends $2,219 more per student than the fitted line would predict; indeed, it is the fourteenth biggest spender (of 132).
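“Above the fitted line” just means a positive residual from the regression of spending on % ED. A minimal sketch: the slope and intercept below are invented (the real coefficients come from fitting the VDOE data), chosen so that the example reproduces the $2,219 figure; Richmond’s roughly 61% ED population is from later in this post.

```python
def residual(actual_spending, pct_ed, slope, intercept):
    """Dollars per student above (+) or below (-) the line fitted to
    per-student spending vs. percent ED across all divisions."""
    predicted = slope * pct_ed + intercept
    return actual_spending - predicted

# Invented coefficients, picked to reproduce the $2,219 excess.
excess = residual(13659, 61, 40.0, 9000.0)
```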

Table 15 in the (State) Superintendent’s Annual Report permits a look into the sources of those funds that the Richmond schools are spending.  The table breaks out division receipts by source:

  • State Sales and Use Tax (1-1/8 % of the sales tax receipts);
  • State Funds (appropriated by the Generous Assembly);
  • Federal Funds (direct federal grants plus federal funds distributed through state agencies); and
  • Local Funds (local appropriations).

Let’s start with a graph of the per student spending of state and federal funds vs. the division percentage of economically disadvantaged (“ED”) students:

The immediate lesson here is that Superintendent Kamras is simply wrong about high-poverty schools being starved for outside funds: The sales tax funding is essentially flat v. % ED while the state appropriations and the federal funding increase with increasing % ED.  Indeed, the R-squared value on the state funds, 30%, implies a meaningful correlation; the federal value, 50%, is robust.
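For readers who want to check numbers like that 30% and 50%: R-squared for a straight-line fit is just the squared correlation of the two columns. A sketch, with invented per-division figures standing in for the Table 15 data:

```python
# Minimal R^2 of a straight-line fit, computed by hand.
# The two series are illustrative stand-ins for per-division
# "% ED" and "funds per student" columns.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

pct_ed = [25, 35, 45, 55, 65]
funds = [4200, 5600, 4900, 5100, 6000]   # rises with % ED, noisily

print(round(r_squared(pct_ed, funds), 2))   # about 0.51
```

An R-squared in that neighborhood says the line explains about half the division-to-division variation, which is why the federal-funds correlation counts as robust.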

Richmond, the gold squares, is a bit low in terms of state funding, but that deficit is offset (and then some) by federal money.  Richmond’s sales tax funding, $1,071 per student, is hidden in the forest of other schools, almost exactly on the fitted line.

Only in local funding can we find any hint of support for the notion that divisions with more poor students receive less money.

Of course, it would be no surprise that less affluent jurisdictions might provide fewer funds to their school systems.  For the most part, they have less money to spend on any governmental operation.

Kamras’ own division, with a 61% ED population in its schools, nonetheless came up in 2018 with $1,806 in local funds per student more than that fitted line would predict.

As well, when all the fund sources are added in, the average spending on education increases with increasing populations of disadvantaged students.  The R-squared, however, tells us that expenditure and ED percentage are essentially uncorrelated.  See the graph at the top of this post.

In summary, Richmond schools received LOTS of money in these categories, much more than the average division:

[“Predicted” values here are calculated from the Richmond % ED and the fitted lines in the graphs above.  The sum of the predicted values is seven dollars less than the value calculated from the actual total, which is probably explained by the inclusion of tuition from other divisions in the total reported.]

So, when he says “The students who should be getting more are actually getting less,” our Superintendent is wrong.  And, even more to the point, Kamras’ own schools are enjoying much more than average financial support.

The Kamras op-ed is a misleading attempt to excuse the ongoing failure of the Richmond public schools to educate Richmond’s schoolchildren.  For example, on the 2018 reading pass rates:

The excuse is contradicted by reality: Those Richmond schools are swimming in money.  Even more to the point, the performance of Virginia school divisions is unrelated to how much money they spend.  (Indeed, the trend lines point in the other direction.)  For example:

Richmond again is the gold square, spending lots of money and getting lousy results.  For comparison, the peer cities are the red diamonds: from the left, Hampton, Norfolk, and Newport News, all spending much less than Richmond and getting much better outcomes.  As a courtesy to my readers there, the violet diamond is Lynchburg, the green, Charles City.

It would be helpful for our Superintendent to turn his energy to improving the performance of our schools and to stop misleading the public about the reasons those schools are so ineffective.

The Alternative and Dropouts

Richmond Alternative School serves “students with academic, attendance and behavior challenges.”  Their Web page says they “use positive norms and positive peer pressure from the group in order to maintain a positive learning culture.”

RPS hired Community Education Partners in 2004 to run this receptacle for disruptive students.  In 2013, Richmond took the school over, saving about $2 million per year.  They now have hired Camelot Education to run it. 

The original contractor was doing a decent job, given the tough clientele. 


RPS picked a bad year to take over the school; the new math tests had lowered scores statewide in 2012 and new tests in English and science did the same thing in 2013, albeit more in Richmond.  

Alternative didn’t bounce back until the new contractor was in charge.  For instance:



There is a wrinkle in those data, however: The state and Richmond averages include the elementary school pass rates while Alternative has only the middle and high school grades.  The Alternative trends in the graphs are accurate enough but the comparison to the state and Richmond numbers is not.

If we pull the data by grade, we are hindered somewhat by the VDOE suppression rule that omits data for groups of <10 students.  Even so, there is information here:



(“EOC” is the end of courses tests that are required to obtain “verified credits” toward the graduation requirements.)
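The suppression rule mentioned above is easy to mimic when assembling a table like this. A sketch, with invented per-grade counts; only the <10 threshold is from VDOE’s practice.

```python
SUPPRESSION_FLOOR = 10  # VDOE omits results for groups smaller than this

def pass_rate(passed, tested):
    """Percent passing, or None when the group is small enough
    that VDOE would suppress it."""
    if tested < SUPPRESSION_FLOOR:
        return None
    return 100.0 * passed / tested

# Invented per-grade (passed, tested) figures for one school:
by_grade = {"6": (42, 55), "7": (7, 9), "8": (51, 60), "EOC": (88, 120)}
rates = {g: pass_rate(p, n) for g, (p, n) in by_grade.items()}
# grade 7 comes back None (only 9 tested); the rest are real rates
```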

There are some details missing but the Richmond Alternative jump in 2017 is clear enough.  Either the contractor is cheating prodigiously, the dropouts are having an exceptional effect on the scores of the remaining students, or it’s time for our School Board to admit its own incompetence and hire outsiders to run all the schools.  I lean toward that last explanation. 

In any event, something stinks here.

. . .

Maybe two somethings:  Those Richmond EOC numbers are suspiciously high.  Time for some more digging.

Looking Past the Donut

As you ramble on through Life, then,
Whatever be your goal,
Keep your eye upon the doughnut
And not upon the hole.

That may be good advice in life.  As to the performance of Richmond’s public schools, there’s a question whether the very large dropout hole has the perverse effect of enlarging the SOL donut.

We already know about the appalling condition of that donut, Richmond’s SOL performance.  Let’s further examine the hole, the kids who have dropped out and can’t further damage the pass rates.

The VDOE Web site has dropout statistics by division and by school.  I pulled down those by school for 2018 (the latest available) along with the Fall enrollment (“membership”) data for that year.

First the high schools.  The scales on the ordinates here are the same so we can compare the schools.



And the average of averages for the five schools:


It looks like Wythe, with some modest help from everybody but Armstrong, is driving Richmond’s anomalously high 9th grade dropout rate.

The selective high schools paint a much prettier picture.


No need for a graph for Franklin Military; their high school numbers all are zero.

The middle school numbers are much smaller, albeit several show troublesome trends.  Notice the expanded ordinate.


The middle school average of averages:


Richmond Alternative, the dumping ground for troublesome kids (esp. in middle school), requires expanded ordinates.


If Richmond were serious about dealing with dropouts (the numbers say they are not), these data suggest the places to start.

Attendance, Not

Having noticed Richmond’s atrocious dropout rate, I went digging on the VDOE Web site and found detailed dropout data.  I fired up Microsoft Access, pulled the 2018 dropout data, and set them beside the Fall enrollments for that year.
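The Access query here amounts to joining the two tables on (school, grade) and dividing. A minimal Python equivalent, with invented school names and counts standing in for the VDOE downloads:

```python
# Join dropout counts to fall membership by (school, grade) and
# compute a dropout rate.  All names and numbers below are invented.

dropouts = {("Wythe", 9): 60, ("Wythe", 10): 25, ("Marshall", 9): 30}
membership = {("Wythe", 9): 400, ("Wythe", 10): 310, ("Marshall", 9): 350}

def dropout_rates(dropouts, membership):
    """Percent dropout for every (school, grade) present in both tables."""
    return {
        key: 100.0 * dropouts[key] / membership[key]
        for key in dropouts
        if key in membership and membership[key] > 0
    }

rates = dropout_rates(dropouts, membership)
# e.g. the invented Wythe grade 9: 60 / 400 = 15 percent
```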

The result is ugly.


The state data show a regular increase with grade.  The astounding Richmond numbers show an unexpected maximum in the ninth grade.  Perhaps that is due to the notorious “ninth grade bump.”  


(Data are for all Virginia divisions.)

The principal argument for social promotion seems to be that holding the kid back is even less effective for learning than promoting that student.  The literature I’ve seen does not explain why that reason (or simple inertia or moving the problem on to the next level or whatever else might explain social promotion) stops working at the ninth grade.

In any event, Richmond’s ninth grade bump looks to survive Richmond’s ninth grade surge in dropouts.


If you have a testable hypothesis that explains all this, please do share it with me.

Turning back to the dropout rates, we see that, until the 12th grade and contrary to the state pattern, the rate among Richmond’s economically disadvantaged (“ED”) students is lower than among their more affluent peers (“Not ED”). 


This last set of data doesn’t illuminate the reasons for Richmond’s unusual (and unusually awful) dropout rates but it does suggest where we might start working on the problem: the ninth grade bump.