Bigger Bucks Vanishing Into Failure

We have seen that, for the most part, Richmond pays its teachers less than average.


Let’s see if we can figure out where the money is going and what we’re getting for it.

Table 13 in the Superintendent’s Annual Report shows disbursements by division.  The data below are from the 2018 table, not including the spending for facilities, debt service, and contingency.  The Richmond total is $361,601,063.

(That’s just north of a third of a BILLION dollars.)

The table also gives the end-of-year Average Daily Membership (“ADM”).  Dividing the total disbursement by the ADM for Richmond, three peer cities, and the division average, gives us:



Breaking out that $2,887 per student excess by category gives us:


Note:  The table tells us that the preK spending includes Head Start; the “other” educational spending goes for enterprise operations, community service programs, and other matters that do not involve the delivery of instruction or related activities for K-12 students.

Here we see particularly large spending for attendance and health (6% of the $2,887 excess), food services (8%), pre-kindergarten (19%), and “other” educational programs (6%).  Those are the small change, however: 57% of the excess goes for instruction.
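The arithmetic behind that breakdown is simple enough to sketch. A minimal Python illustration, using the per-student excess from the post and the category shares quoted above (the five named categories account for 96% of the excess; the remainder sits in categories not called out):

```python
# Figure from the post: Richmond's per-student spending excess over the
# division average, ex-facilities, debt service, and contingency.
excess_per_student = 2_887

# Category shares of that excess, as quoted above.
shares = {
    "attendance & health": 0.06,
    "food services": 0.08,
    "pre-kindergarten": 0.19,
    "other educational": 0.06,
    "instruction": 0.57,
}

for category, share in shares.items():
    print(f"{category:20s} ${share * excess_per_student:,.0f} per student")

# The five named categories cover 96% of the excess.
print(f"named categories cover {sum(shares.values()):.0%} of the excess")
```

At 57%, instruction alone accounts for roughly $1,646 of the $2,887 per-student excess.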

The only way to get this unusually large expenditure for instruction while paying most teachers less than average is to have a lot of teachers or spend a lot of money on something other than teachers’ salaries.

Let’s look at that from two directions: First, the relationship, if any, between disbursements and SOL pass rates; second, the relationship, if any, between number of teachers and pass rates.

We know that, in terms of SOL pass rates, Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by ca. 15-20%.  This renders the division average pass rate unfair to those divisions with relatively large ED populations.  So let’s look at the data for both groups, not at the overall division averages:


Richmond is the gold points.  Whatever they’re spending all that money on, we’re getting precious little return.

The peer cities, Hampton, Newport News, and Norfolk, are the red points, left to right.  They are spending a lot less per student than Richmond and getting a lot more performance.

Lynchburg is blue; Charles City, light green.

The fitted lines suggest that pass rates for both groups decrease with increasing disbursements but the R-squared values are tiny for the Not ED group and right small for the ED.

The math data tell much the same story, but with R-squared values that suggest a bit more of a (negative) relationship between pass rate and spending.


In any case, the divisions that spend more per student do not get better SOL pass rates on average.

The Superintendent’s Table 19 gives us the number of instructional positions (classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals) per student.   Juxtaposing those data with the pass rates gives:


BTW: The peers have rearranged here.  From the left it’s Newport News, Hampton, and Norfolk.  Notice that Hampton and Norfolk have more instructional positions per student than we do, while spending less per student.

Go figure!  Richmond is spending lots of money on instruction, paying most of its teachers below-average salaries, and operating with one of the lowest instructional position/student ratios in the state.

And we’re getting precious little for it.

I’ll save the spending puzzle for another post.  First hypothesis: Richmond is paying a lot to people other than teachers.

In the meantime, these data tell us that division average reading pass rates increase with the instructional position/pupil ratio.  The R-squared for the ED students, 5.9%, says there’s a hint of a correlation (ρ = +0.24).  For the Not ED pass rates, we also see a positive slope but the R-squared is less than 1%.

The math data tell the same story but with even lower R-squared values.
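For a single-predictor fit like these, the correlation coefficient is just the signed square root of R-squared, which is where the ρ = +0.24 above comes from. A quick check:

```python
import math

def correlation_from_r2(r_squared, slope_sign=1.0):
    """For simple (one-predictor) linear regression, |rho| equals
    sqrt(R^2); the sign follows the fitted slope."""
    return math.copysign(math.sqrt(r_squared), slope_sign)

# ED reading fit above: R^2 = 5.9% with a positive slope.
print(round(correlation_from_r2(0.059), 2))   # 0.24
# Not ED: R^2 under 1%, so the correlation is negligible either way.
print(round(correlation_from_r2(0.009), 2))   # 0.09
```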


So, as to Richmond: More money, fewer instructional positions, lower teacher pay, awful pass rates.  Whatever the problem with Richmond’s schools may be, it’s not a shortage of money.

Pop Quiz

It’s Spring!  The data VDOE has had since October have made it into the Superintendent’s Annual Report.  In particular, we now have the salary data.

Please review the following excerpts from that report in preparation for the quiz:



Did you study carefully? If not, that’s OK. The quiz is open book.



  1. If you were fresh out of college with a teaching license and were thinking about living in the Richmond area, would you prefer an elementary school job with RPS or one of the Counties?
  2. If you had taught elementary grades in Richmond for a couple of years and had demonstrated that you were a capable teacher, would you be tempted to jump ship for one of the Counties?
  3. Will this salary structure, in combination with the tough clientele and awful politics in Richmond, do a reverse cull of the better teachers, tending to leave Richmond with the less effective ones?


Time’s up.  Turn in your papers.

Congratulations!  You’ve aced it!

Now, for your homework: Prepare for tomorrow’s quiz by learning to count to ten.

OOPS!  I flunked a bigger test:  A kind reader pointed out that I mixed up the labels on both graphs.  Now fixed.

It’s the Principal of the Thing

On April 26, the Free Press posted a story listing ten RPS schools where the principals were said to be being replaced.  There was an RT-D story about the same time; it has since been taken down, perhaps because of an error in the list.  Now the RT-D has a final list along with the names of the new principals.

The Grade 8 and EOC testing window this year had a last reporting date of March 29; the end of the latest test window for other tests was June 21.  RPS surely had the high school results but probably not the results for the lower grades before the list leaked.  No telling when the list was prepared but, below grade 8, it had to be before RPS had all the 2019 SOL results.

In any case, VDOE will have the summer results on July 19 but won’t release the final pass rates until mid-August, so we are stuck with the 2018 and earlier data.  Let’s see what those numbers tell us about the ten schools.

Here, as background (you’ll soon see why), are the reading pass rates at Carver.  The 2018 data are missing because the staff there got caught cheating.


The “ED” data are the pass rates of the “economically disadvantaged” students.  The “Not ED” points are the data for their more affluent peers.  The missing Not ED data reflect the VDOE suppression rule (no data when a group is <10 students).
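The suppression rule is mechanical and easy to mimic. A sketch (the threshold of 10 is the one stated above):

```python
def suppressed_rate(passers, testers, minimum=10):
    """Return a percentage pass rate, or None when the group falls
    below VDOE's reporting minimum (fewer than 10 students)."""
    if testers < minimum:
        return None  # suppressed, like the missing Not ED points
    return 100.0 * passers / testers

print(suppressed_rate(6, 8))    # None: fewer than 10 testers
print(suppressed_rate(45, 60))  # 75.0
```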

The “All” data are the school average.  Here, the very large percentage of ED students makes the ED and All rates nearly equal.

The yellow line is the nominal accreditation value, 75% for English, 70% for everything else.

To the point here,

  • Until the cheating was halted in ‘18, the pass rates for both groups were astronomical, and
  • The ED students were reported to pass at about the same rates as the Not ED, in contrast to the state average difference of some 15 to 20 points.

The math data from Carver tell much the same story.


Those data paint a portrait of cheating.  By the school staff, not by the kids.

As a contrast and to set an upper bound for reasonableness, here is Munford, the best-performing of Richmond’s elementary schools in terms of SOL pass rates.



In that light, let’s look at Fairfield Court, which is one of the ten on the list.



There are no ED data for 2014.  Whatever the reason, it clearly is not the suppression rule.

These data suggest (but, of course, do not prove) two things:

  1. They were cheating at Fairfield, just not as flagrantly as at Carver, and
  2. Before the 2018 testing, they got the Word that the State was becoming involved and let the numbers return to their actual level.

Of course, the 2018 score drop alone would be a reason to wonder whether it was time for a new principal.

Moving on to the rest of the list, here is Bellevue.



The ED decreases in ‘17 and ‘18 and the Not ED in ‘18 probably offer a complete explanation for the new principal.

Next Blackwell.



Ongoing, sorry performance could explain what happened there.

George Mason:



Sorry performance in reading and declining math scores.

Ginter Park:



Hard to know what to make of this: The Not ED rates are unusually close to the ED, with the exception of reading in the last two years; perhaps more to the point, the reading scores are low and neither subject shows ED improvement.

Next, Greene.



Declining performance, when increasing is what’s needed.

Last among the elementary schools, Overby-Sheppard.



Could be lack of improvement; the generally small gap between ED and Not ED may indicate a problem.

Turning to the middle schools, here as a marker is Hill, the best of a sorry lot (and NOT on the list).



(Notice the ED/Not ED gap that runs about twice the state average).

Henderson is the only middle school on the list.



(Notice the abnormally thin ED/Not ED difference.)

As to the principal, low pass rates and a lack of recent improvement might well explain the turnover.

Then we have the high schools.  First on the list, Wythe.



The reason for the turnover there is clear enough.  As well, the ED/Not ED difference is remarkably thin.

Next, Marshall.



Hmmm.  No smoking gun there but another thin ED/Not ED gap.

Next, TJ.



No obvious reason here for making the list.

Last, one of the two selective high schools, Community.



If there’s a problem here, it’s not in reading.  Nor in the improvement of the math rates.

Bottom line: Except for a few obvious cases, these data don’t tell us what brought in the new principals.  The data do tell us that all ten principals have big jobs to do.


Postscript: Our neighborhood school, where the principal was replaced last year.



Good News for Transparency

The estimable Carol Wolf just sent me a copy of the order in Paul Goldman’s Freedom of Information Act suit for documents regarding the Coliseum boondoggle.  These include a stack that the City released to the Times-Disgrace but wanted over $2,000 to “review” before they would even consider giving them to Paul. (Can you spell “That Stinks!”?)

My handy ‘puter converted this from the pdf; any errors (notably the format) belong to that process, not to the Court:


In the Circuit Court of the City of Richmond, John Marshall Courts Building

Case No.: CL19-2591-6


On May 29, 2019, CAME Paul Goldman (“Petitioner”), pro se, and the City of Richmond (“Respondent”), by counsel, on Petitioner’s Petition for Mandamus (the “Petition”), in which Petitioner requested an order in mandamus compelling the Respondent to produce certain documents previously requested by Petitioner in letters dated March 21, 2019 and April 25, 2019, pursuant to the Virginia Freedom of Information Act, Virginia Code § 2.2-3700 et seq. (“FOIA” or the “Act”). The letters correspond with Count I and Count II of the Petition, respectively.

WHEREUPON, with the parties in apparent agreement that the documents covered in the Petition are “public records” subject to disclosure unless excluded under the Act, the hearing proceeded with Petitioner first offering an opening statement followed by that of the Respondent. Respondent presented several documents during its presentation and made argument concerning their meaning and legal import but did not move them into evidence. Throughout, Petitioner argued and objected to Respondent’s position, observing that under the Act, “the public body shall bear the burden of proof to establish an exclusion by a preponderance of the evidence.” Va. Code Ann. § 2.2-3713(E). This requirement extends to Respondent’s affidavit of a witness which purported to detail the reasons and circumstances that would support Respondent’s argument that the information sought in the Petition was subject to ongoing negotiation and thereby excluded from disclosure under Virginia Code § 2.2-3705.1(12). The Court stated that it was unwilling to accept statements by counsel for Respondent as evidence over objection. This addressed the information sought by Petitioner in Count I of the Petition.

As to the information sought in Count II, Petitioner contends that he now seeks only 2643 documents that Respondent has previously provided to the Richmond Times Dispatch (the “RTD”), pursuant to a FOIA request. Respondent contends that before it can know what documents to disclose, if any, Respondent needs to examine each document to determine whether disclosure is proper. Further, Respondent, by counsel, stated that some documents were provided to the RTD inadvertently and may not be subject to disclosure. Respondent contends that the case of Tull v. Brown, 255 Va. 177, 494 S.E.2d 855 (1998) stands for the proposition that a public body’s prior disclosure of information does not waive its claim of exemption for the same information in a subsequent request, for the well-known reason that estoppel does not operate against the government. The Court finds, however, that the Tull case is distinguishable from the case at bar, and accordingly, not controlling for reasons stated on the record.


shall disclose the documents requested in the Petition upon payment by Petitioner of costs in the amount of $50.00 for disclosure under Count I and costs in the amount of $200.00 for disclosure under Count II, within ten (10) days of such payment. It is further ORDERED that Petitioner’s request for attorney’s fees is DENIED.

The Clerk is directed to forward a copy of this Order to all parties.

Exceptions are noted.

IT IS SO ORDERED.

ENTER: 6/7/19


Help for the Board of “Education”

Now that our Board of “Education” has demonstrated that it does not know how – more likely, does not wish – to inspect its database in order to detect official cheating to boost graduation rates, I thought I’d give them an easy place to start doing their job.

And give the Governor a list of places where the Board’s malfeasance probably has allowed school divisions to issue bogus diplomas wholesale.

Brief background:

  • The End of Course (“EOC”) SOL tests are the gateway to graduation; to earn a standard diploma, a student must pass two English EOC tests, one math, one history & social science, and one lab science.
  • On average, the pass rate of Economically Disadvantaged (“ED”) students on the EOC tests is about 17% less than the rate of their more affluent (“Not ED”) peers; consistent with that, ED students graduate at about a 6% lower rate than Not ED.
  • Richmond’s ED students in 2018 passed the reading and math EOC tests at a rate 13% lower than the Not ED students but, as a result of wholesale cheating (by the schools, not the students!), the ED cohort graduation rate was 7.5% higher than the Not ED rate.
  • The Board of “Education” did not notice this discrepancy (nor similar warnings in the data from earlier years).
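The screen the Board failed to run is not complicated. A hypothetical version in Python (the 5-point trigger is my assumption; the rate levels below are illustrative, chosen only to reproduce the gaps quoted above):

```python
def anomalous(ed_pass, not_ed_pass, ed_grad, not_ed_grad, trigger=-5.0):
    """Flag a division whose ED students pass well below their Not ED
    peers yet graduate at a higher rate."""
    pass_gap = ed_pass - not_ed_pass
    grad_gap = ed_grad - not_ed_grad
    return pass_gap < trigger and grad_gap > 0

# Richmond-2018-like pattern: ED passed ~13 points lower but
# graduated 7.5 points higher.
print(anomalous(57.0, 70.0, 82.5, 75.0))  # True
# State-average pattern: ED lower on both measures.
print(anomalous(63.0, 80.0, 84.0, 90.0))  # False
```

Run division by division against the data VDOE already holds, a check like this would have surfaced Richmond years ago.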

Let’s start with a plot of the 2018 division average differences between the ED and Not ED graduation rates against the difference in EOC reading pass rates.


The state average is the red circle showing an ED pass rate 15.6% below the Not ED rate and an ED graduation rate 6.0% below.

Richmond, the gold square, is up in the anomalous zone, with an unexceptional pass rate difference but with an ED graduation rate 7.5% higher than the Not ED rate.  We now know that the reason for that anomaly is wholesale cheating.  By the schools.

As a courtesy to my readers there, the blue diamond on the graph is Lynchburg; the green, Charles City.

BTW: The points over to the right side of the Y-axis are divisions where the ED students passed at higher rates than the Not ED.  If the Board of “Education” elects to start doing its job, it should also look at those divisions to see whether they are messing with the classification of handicapped students or otherwise manipulating the data.

The math data paint a similar picture.


Let’s take a closer look at the divisions that joined Richmond in the group with anomalously high ED graduation rates.  (That’s not to say that some of these other data are not abnormal.  I’m just going for the obvious cases.)


The math data again tell much the same story.


Small populations can lead to large variability in averages so these data, standing alone, don’t say much about the smaller divisions.  But, for sure, the data wave a large, red flag over the ED graduation rates from Norfolk, Newport News, Va. Beach, Portsmouth, Arlington, Roanoke, and Lynchburg.

But don’t hold your breath waiting for the Board of “Education” to look behind those numbers.

High School Corruption and State Malfeasance

In 2016, VDOE conducted a course schedule audit at Armstrong as part of the work following denial of accreditation there.  VDOE reported discrepancies in course requirements and transcript accuracy, inter alia.  The 2017 follow-up audit “concluded that there was not sufficient evidence to determine that problems identified . . . had been resolved.”

In 2018, at the request of our Superintendent, VDOE expanded the audit to include all five mainstream high schools.  They found:

  • Bell schedules did not meet the number of hours required by the Standards of Accreditation.
  • Verified credits did not populate in the transcripts.
  • Attendance data was incorrect throughout the transcripts.
  • Some students received one credit for classes that should not have carried credit.
  • Some students received two credits for classes that should have carried one credit, such as Career and Technical Education (CTE) classes.
  • Credit was incorrectly given for what appear to be locally developed elective courses without evidence of approval by Richmond Public Schools Board.
  • Credit was incorrectly given for middle school courses ineligible for high school credit.
  • Course sequencing issues were identified.
  • Academic and Career Plans lacked meaningful content.

Translating that from the careful bureaucratese of VDOE: The schools were cheating wholesale to boost the graduation rates.  Indeed, the RT-D later reported the schools were:

Rubber-stamping student work. Choosing to use an alternative test instead of giving students the common state test. Putting students on individualized education programs to circumvent state graduation requirements.

As Scott Adams is fond of saying,

Wherever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. That’s a universal law of human awfulness. When humans CAN cheat, they do. Or at least enough of them do.

RPS seems to be determined to prove Adams right.  This latest outrage comes in the wake of the wholesale cheating at Carver.

To be clear: We’re talking here about cheating by the school staff, not by the kids.

In the past, this kind of thing has been manifest in the data.  An earlier post looked at the RPS pass rate averages and suggested cheating.  So I thought I’d take a (further) look at the graduation numbers.

Notes on the data:

  • Economically disadvantaged students (here, “ED”) underperform their more affluent peers (“Not ED”) on the SOL pass rate averages by somewhere around twenty points.  Where there are large ED populations, the overall SOL averages can be misleading, so we’ll look at the ED and Not ED pass rates separately.
  • The End of Course (“EOC”) tests are the gateway to obtaining “verified credits” toward the graduation requirements.  To graduate with a standard diploma, the student must pass two EOC tests in English, one in math, one in history & social sciences, and one in a lab science.
  • On the 2018 EOC tests, about 60% of the Richmond students tested were reported to be ED.


  • That’s an average.  The tested ED populations of our mainstream high schools vary considerably (the selective high schools all do very well, thank you, and the boot camp Richmond Alternative is a mess so there’s nothing to be learned here from their data).



  • The graduation rates here are the boosted rates the Board of “Education” uses to make their constituent schools look better than they really are.
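The EOC gateway described in the notes can be sketched as a simple checker (the subject labels here are my own shorthand, not VDOE's):

```python
# Standard diploma verified-credit minimums, per the notes above:
# two English EOC passes, one math, one history & social science,
# one lab science.
REQUIRED = {"English": 2, "math": 1, "history": 1, "lab science": 1}

def meets_verified_credit_minimums(passed_eoc):
    """passed_eoc maps subject area -> number of EOC tests passed."""
    return all(passed_eoc.get(subject, 0) >= n
               for subject, n in REQUIRED.items())

print(meets_verified_credit_minimums(
    {"English": 2, "math": 1, "history": 1, "lab science": 1}))  # True
print(meets_verified_credit_minimums(
    {"English": 1, "math": 2, "history": 1, "lab science": 1}))  # False
```

The point of the sketch: an extra math pass cannot substitute for a missing English pass, which is why the EOC tests are a hard gateway.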

With that out of the way, let’s turn to the cohort graduation rates of those five schools and the pass rates there on the EOC tests. 

First, on the reading tests:


The orange diamonds are the 2018 cohort graduation rates/2018 pass rates of the ED students at the five high schools.  The blue circles, the same data for the Not ED students.

The easy way to read the graph: Look at the slope from orange ED to blue Not ED: 

  • Up and to the right is the expected condition with the ED students both scoring below and graduating at lower rates than the Not ED. 
  • As the line approaches horizontal, the ED pass rate is lower but the graduation rate approaches the Not ED rate.  We can wonder what is going on.
  • When the line slopes down, the lower scoring ED students are graduating at a higher rate than the Not ED.  Think cheating.
  • When the line approaches or passes vertical, ED students are passing at about the same rate as the Not ED, and graduating at a higher rate.  Something is doubly rotten.
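Those four cases amount to a rough classifier on the two gaps. A hypothetical sketch (the 3-point "near" tolerance is my assumption):

```python
def read_slope(ed_pass, ed_grad, not_ed_pass, not_ed_grad, near=3.0):
    """Classify the ED -> Not ED 'slope' per the four cases above."""
    pass_gap = not_ed_pass - ed_pass  # positive is the expected direction
    grad_gap = not_ed_grad - ed_grad
    if abs(pass_gap) <= near and grad_gap < 0:
        return "doubly rotten"           # near-vertical: similar scores, ED graduates higher
    if grad_gap < 0:
        return "think cheating"          # downward slope
    if abs(grad_gap) <= near:
        return "wonder what's going on"  # near-horizontal
    return "expected"                    # up and to the right

# Armstrong-like pattern: Not ED ~15 points higher pass, ~10 higher graduation.
print(read_slope(40, 65, 55, 75))   # expected
# Huguenot-like pattern: pass rates nearly identical, ED graduating ~30 higher.
print(read_slope(60, 90, 61, 60))   # doubly rotten
```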

Here, the Armstrong data look reasonable: The Not ED students outscored the ED by almost fifteen points and enjoyed an almost ten point better graduation rate.  We can wonder whether this is the result of the earlier audit there (see below).

In contrast, the ED students at both John Marshall and TJ graduated at higher rates than their Not ED peers while passing the EOC tests at 20+ and ~15 percent lower rates.  Something boosted those ED graduation rates.  Cheating is the obvious first candidate.

Wythe shows a more striking, anomalous pattern, with a (smaller than expected) 10% pass rate deficit and a twelve point higher graduation rate for the ED students.

Huguenot is anomalous in two respects: The pass rates of the two groups are almost identical and the ED students graduated at a rate almost 30% higher than the Not ED.  We can wonder whether the large ESL population there bears on this.

The math data tell the same stories (at dishearteningly lower pass rates).


Armstrong’s data again look reasonable.  Wythe and, especially, Marshall and TJ show higher ED graduation rates than Not ED, with lower pass rates.  Huguenot shows a much higher ED graduation rate at about the same pass rate.

To gain some context, let’s look at some history. 

First, as background, the state averages.



There is the norm: Slope up from orange to blue.  In terms of numbers, ED pass rates about twenty points below the Not ED and cohort graduation rates about seven points below.

Turning to Richmond, here is Armstrong on the reading EOC tests:


2014 and 2015 clearly suggest cheating.  2016 suggests it; 2017 says it again.  But 2018 looks reasonable. 

We might infer that the presence of VDOE in 2016 had a salutary effect at Armstrong.  A caveat: Because of the very small numbers of Not ED students at Armstrong, the Not ED pass and graduation rates can be expected to fluctuate considerably from year to year (as they do here).  These data fit the inference of cheating before ‘18 with a course correction then but there might well be a more benign explanation.

The math data at Armstrong also fit the inference of cheating, abated upon the appearance of VDOE.


The Marshall data are consistent with unabated cheating, albeit with higher pass and graduation rates than at Armstrong.



The Wythe data tell a similar story, with a suggestion of robust cheating in 2018.



Huguenot data suggest ongoing, rampant cheating.



On both tests the 2015 Huguenot numbers show ED and Not ED pass rates that are nearly identical, so the positive slope that year does not offer any solace.

Finally we have TJ, with anomalous but OK patterns in 2017 (albeit with unacceptably low ED pass rates) but otherwise showing data consistent with cheating.



————————– Rant Begins ————————-

Our Board of “Education” has created a haven for cheating high schools:

  • They don’t look at their mountain of graduation data until after they review a school that has been denied accreditation (if they even look then), and
  • They now have rigged the system so it is almost impossible for a school to be denied accreditation.

The Board has institutionalized Adams’ “small risk of getting caught.”

Why did VDOE ignore its mountain of data and do nothing in Richmond until (1) Armstrong was denied accreditation (under the old, now replaced, system), and (2) our Superintendent invited them in to the other high schools?  Why are they not looking for this pattern at high schools in other divisions? 

I think the answer is at least nonfeasance and probably malfeasance.  It’s past time to fire the Board and appoint people who will actually do the job.

————————– Rant Ends ————————-

It will be interesting to see how all this affects this year’s graduation rates.  Those data should be available in mid-August.  Stay tuned.

Oops: Corrected the link and list of required EOC courses.

Feeding and Failing the Kids

The VDOE data show each school division’s spending for the day school operation (administration, instruction, attendance & health, pupil transportation, and O&M) and then for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”).  The previous post discussed the very large expenditures beyond day school for Richmond and some other divisions; that spending correlated fairly well with the division percentage of economically disadvantaged (“ED”) students.

An astute reader asked whether the spending for food services showed a correlation, particularly in the light of the federal support from the National School Lunch Program and the School Breakfast Program for ED students. 

Indeed, there is a decent correlation.


Richmond is the gold square.  The red diamonds are the peer cities, from the left Hampton, Newport News, and Norfolk.  Lynchburg is purple; Charles City, green.  Richmond is $108 (18%) above the fitted line.

To look at those figures in context, here they are again (green this time), along with the total spending per student beyond the day school operation (“Excess”) and that total minus the food expenditure.


The excess shows a modest correlation while excess minus food shows very little; that correlation in the excess clearly comes from the food expenditure.

As with the overall Excess spending, the SOL pass rates show a modest negative correlation with the food service expenditure.



Again, as with the overall Excess spending, most of that negative correlation is driven by decreases in pass rates of the students who are not economically disadvantaged.



These data sing the same song: This extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

We earlier saw that increasing a division’s percentage of ED students is associated with a decrease in the pass rates of the Not ED students more than the ED; these data are consistent with that (but do not proffer an explanation).

Finally, we have the SOL performance vs. the Excess spending other than for food services:



There is a bit of a correlation here, but in the wrong direction. 

Closer to home, the Richmond spending remains a substantial outlier.  RPS is spending $856 per student more in non-food Excess than the average division; that $19.7 million is buying the reading and math deficits you see in these graphs. 

RPS keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money on things other than the day school operation. 

If money were as important to performance as they say it is (albeit money in fact is irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.

Extra Bucks, Negative Return

Looking at the Division disbursement data for 2018 (now that they are up on the VDOE Web site) one sees Richmond spending some $1,875 per student for things other than its day school operations ($15,697 vs. $13,821).

Note: Here and below, the “total” disbursements include everything but spending for facilities, debt service, and contingency reserve.  Specifically, they include, in addition to costs of the day school operation, spending for food services, summer school, adult education, pre-kindergarten, and “other” educational programs (“enterprise operations, community service programs, and other non-[division] programs”).

A graph of those data for the Virginia divisions suggests that Richmond (the gold square) is an outlier. 


Indeed, along with Staunton, Petersburg, and Bristol, Richmond leads the pack in spending for things other than day school.


Except for Charlottesville and Arlington (and Richmond, of course), the divisions with this kind of large difference are not the Big Spenders. 

A look at all the divisions shows that the difference does not scale with day school spending.


Richmond is the gold square; the red diamonds are the peer cities, from the bottom Hampton, Newport News, and Norfolk.  The purple diamond is Lynchburg; the large green diamond, Charles City.

More significantly, the divisions with more of this excess spending are not getting better SOL scores.  Quite the contrary:



The red, from the left, are Hampton, Newport News, and Norfolk.

But wait: We know that the raw SOL pass rate discriminates against divisions with large populations of economically disadvantaged (“ED”) students.  So let’s look at the performance of both the ED and Not ED groups.



That’s clear enough:  On average, the divisions that spend more money for things other than day school have lower SOL pass rates for both ED and Not ED students.  In the case of Richmond, those pass rates are grossly lower.

Notice also that the decrease in performance with increasing non-day school spending is larger for the Not ED students, particularly on the math tests. 

As always, the correlations here (where they are non-trivial) do not imply causation.  The other side of that coin, however, shows that increasing spending for things other than the day school operation is not related to improving the day school performance.

One more question for these data: How do those excess expenditures relate to the population of economically disadvantaged students?


The free/reduced price lunch cost should merely displace other lunch spending but the School Breakfast Program costs should increase with the ED population.  And here the expenditure for things other than day school rises with the number of poorer students, with a decent correlation.  So, at least in a qualitative fashion, this makes sense. 

That said, this extra spending is not associated with more learning.  We can wonder why it’s in the education budget.

Closer to home, the Richmond spending remains a substantial outlier.  Richmond keeps demonstrating that it does not know how to competently run its day school operation.  While they are thus failing to educate Richmond’s schoolchildren, they are spending a lot of money (almost twice the average) on other things. 

If money were as important to performance as they say it is (albeit money in fact is irrelevant to performance), you’d think they would redirect some of those dollars.  Go figure.

Boosted Pass/Graduation Rates?

An earlier post included some interesting Richmond data.  For instance:



Students must pass those End of Course (“EOC”) tests to obtain “verified credits” toward the graduation requirements.

The state EOC numbers are high, compared to the 8th grade rates; the Richmond numbers are unbelievably high. 

There are (at least) two ways the pass rates could improve between middle and high school:

  • The high school dropouts leave behind a group of better performing students;
  • The EOC tests are easier or are scored less rigorously than the middle school tests.

We already know about Richmond’s shocking dropout rate, so let’s do a thought experiment on the 2018 data:

  • Assume the cohort dropout rate of 20%;
  • Assume that none of the (soon to be) dropouts passed the 8th grade tests;
  • Assume that the non-dropouts in high school (from earlier years) passed the EOC tests at the same rate as the 2018 8th graders passed the 2018 8th grade tests.

That is, assume the 8th grade pass rate is and has been the weighted average of 80% of some higher number and 20% of zero; then assume the EOC pass rate equals that higher number.  You’ll recognize that this is a very rough approximation but, in light of the reasonably constant state 8th grade and EOC and Richmond 8th grade numbers, not an outrageous one.

A little back-of-the-envelope arithmetic gives a dropout-boosted EOC pass rate of 64.8% in reading vs. the actual 70.9%, and of 52.1% in math vs. the actual 59.2%. 
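The arithmetic behind this thought experiment fits in a few lines.  A minimal sketch follows; the 8th grade inputs (51.84% reading, 41.68% math) are back-calculated from the boosted figures above, not separately sourced:

```python
def boosted_rate(eighth_grade_rate, dropout_rate):
    """Dropout-boosted EOC pass rate under the post's assumptions:
    the 8th grade rate is a weighted average of the survivors' rate
    (weight 1 - dropout rate) and zero for the eventual dropouts,
    so the survivors' rate is the 8th grade rate divided by (1 - dropout rate)."""
    return eighth_grade_rate / (1.0 - dropout_rate)

# Richmond, 2018, with the 20% cohort dropout rate.
# Inputs are illustrative, recovered from the results quoted in the post.
print(round(boosted_rate(51.84, 0.20), 1))  # 64.8 (vs. 70.9 actual, reading)
print(round(boosted_rate(41.68, 0.20), 1))  # 52.1 (vs. 59.2 actual, math)
```

The same function with the state’s 5.50% dropout rate gives a divisor of 0.945 instead of 0.80, which is why the state boost is so much smaller.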


It looks like getting rid of those dropouts produces a nice boost.  No wonder Richmond is so casual about its horrendous dropout rate.

The state data are consistent with the same phenomenon, but the lower dropout rate (5.50%) gives less dramatic results for the calculated EOC numbers.


Even so, these extreme assumptions are not nearly enough to explain the actual EOC pass rates. 

If dropouts don’t explain the entire pass rate differences, we are left with the Board of “Education” messing with the EOC tests to boost the graduation rates.  For sure, we know that the Board sets the cut scores [@ p.130].  A boost of about five points (huge, in SOL terms) in both subjects would explain the state data and would go a long way toward rationalizing the Richmond numbers.

Of course, this speculation doesn’t prove anything.  But we already know that such manipulation of the data would align with other data engineering that makes the schools (that the Board supervises) look better than they really are.  See this and this and this and this and this and this.  Also see this, suggesting that the Board cares more about the more affluent divisions.

We’ve no direct way to test the notion [1], but these data certainly suggest a simple pattern: EOC pass rates are boosted at the state level by the dropouts and by VBOE’s manipulation of the SOL test scoring; the EOC scores in Richmond are similarly boosted by the rigged scoring and inflated even more by the terrible dropout rate.

Your tax dollars at “work.”

[1] VDOE does have the information, but don’t waste any time waiting for them to disclose it.

Richmond: More Money, Worse Schools, False Excuse: Update

An earlier post discussed our Superintendent’s false statement about financial support.  That post was based on the 2017 VDOE data, the latest funding data then available.  I’ve updated the post with the 2018 numbers that VDOE just posted.

Our school Superintendent wrote an op-ed for the Times-Dispatch complaining that:

Virginia’s highest poverty school divisions — which serve large percentages of children of color — receive 8.3 percent less in per-pupil funding than the state’s wealthiest districts. Put plainly: The students who should be getting more are actually getting less.

In fact, Virginia’s high poverty divisions (larger numbers of economically disadvantaged students) actually spend more per pupil on average than the more affluent divisions.

Richmond, the gold square on the graph, spends $2,219 more per student than the fitted line would predict; indeed, it is the fourteenth biggest spender (of 132).

Table 15 in the (State) Superintendent’s Annual Report permits a look into the sources of those funds that the Richmond schools are spending.  The table breaks out division receipts by source:

  • State Sales and Use Tax (1-1/8 % of the sales tax receipts);
  • State Funds (appropriated by the Generous Assembly);
  • Federal Funds (direct federal grants plus federal funds distributed through state agencies); and
  • Local Funds (local appropriations).

Let’s start with a graph of the per student spending of state and federal funds vs. the division percentage of economically disadvantaged (“ED”) students:

The immediate lesson here is that Superintendent Kamras is simply wrong about high-poverty schools being starved for outside funds: The sales tax funding is essentially flat vs. % ED, while the state appropriations and the federal funding increase with increasing % ED.  Indeed, the R-squared value on the state funds, 30%, implies a meaningful correlation; the federal value, 50%, is robust.
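Fitted lines and R-squared values like these can be reproduced with any least-squares routine.  A minimal sketch, using made-up (% ED, per-student funding) pairs rather than the actual VDOE Table 15 data:

```python
import numpy as np

# Hypothetical division data: % economically disadvantaged students
# and per-student state funds.  Illustrative numbers only.
pct_ed = np.array([20.0, 35.0, 45.0, 55.0, 61.0, 70.0])
dollars = np.array([3900.0, 4400.0, 4600.0, 5200.0, 5100.0, 5800.0])

# Degree-1 polyfit returns the coefficients highest degree first.
slope, intercept = np.polyfit(pct_ed, dollars, 1)
predicted = intercept + slope * pct_ed

# R-squared: share of the variance in dollars explained by the line.
ss_res = np.sum((dollars - predicted) ** 2)
ss_tot = np.sum((dollars - dollars.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

# A division's distance above or below the fitted line is its residual,
# e.g. for the 61% ED division in this made-up sample:
residual = dollars[4] - predicted[4]
```

The residual is the same quantity the post uses when it says a division spends some number of dollars “more per student than the fitted line would predict.”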

Richmond, the gold squares, is a bit low in terms of state funding, but that deficit is offset (and then some) by federal money.  Richmond’s sales tax funding, $1,071 per student, is hidden in the forest of other schools, almost exactly on the fitted line.

Only in local funding can we find any hint of support for the notion that divisions with more poor students receive less money.

Of course, it would be no surprise that less affluent jurisdictions might provide fewer funds to their school systems.  For the most part, they have less money to spend on any governmental operation.

Kamras’ own division, with a 61% ED population in its schools, nonetheless came up in 2018 with $1,806 in local funds per student more than that fitted line would predict.

As well, when all the fund sources are added in, the fitted line shows average spending on education increasing with increasing populations of disadvantaged students.  The R-squared, however, tells us that expenditure and ED percentage are essentially uncorrelated.  See the graph at the top of this post.

In summary, Richmond schools received LOTS of money in these categories, much more than the average division:

[“Predicted” values here are calculated from the Richmond % ED and the fitted lines in the graphs above.  The sum of the predicted values is seven dollars less than the value calculated from the actual total, which is probably explained by the inclusion of tuition from other divisions in the total reported.]

So, when he says “The students who should be getting more are actually getting less,” our Superintendent is wrong.  And, even more to the point, Kamras’ own schools are enjoying much more than average financial support.

The Kamras op-ed is a misleading attempt to excuse the ongoing failure of the Richmond public schools to educate Richmond’s schoolchildren.  For example, on the 2018 reading pass rates:

The excuse is contradicted by reality: Those Richmond schools are swimming in money.  Even more to the point, the performance of Virginia school divisions is unrelated to how much money they spend.  (Indeed, the trend lines point in the other direction.)  For example:

Richmond again is the gold square, spending lots of money and getting lousy results.  For comparison, the peer cities are the red diamonds: from the left, Hampton, Norfolk, and Newport News, all spending much less than Richmond and getting much better outcomes.  As a courtesy to my readers there, the violet diamond is Lynchburg, the green, Charles City.

It would be helpful for our Superintendent to turn his energy to improving the performance of our schools and to stop misleading the public about the reasons those schools are so ineffective.