Feckless “Supervision” of Petersburg

Despite fifteen years of “supervision” by the Board and Department of Education, the Petersburg schools marinate in failure.

Va. Code § 22.1-8 provides: “The general supervision of the public school system shall be vested in the Board of Education.”

Va. Code § 22.1-253.13:8 provides:

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

The agenda package for the October 2018 meeting of the Board of Education contains a summary of the “supervision” the Board has provided to Petersburg:


“MOU” is bureaucratese for “Memorandum of Understanding,” which in turn is bureaucratese for an edict to which the Board can point in order to claim it is doing something about lousy schools. In the Real World, the Board’s MOUs are feckless nonsense.

The 2019 SOL data now are out; they add a fifteenth year to the span of the Board’s sterile attempts to improve the Petersburg schools.

Aside: On average, Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by about 20%. The SOL average pass rate thus is lower for divisions with larger ED populations. To avoid that biased average, let’s look at the underlying averages for the ED and Not ED groups.
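A toy calculation (with hypothetical pass rates, assuming the roughly 20-point gap described above) shows how a larger ED share drags down a division’s blended average even when neither group performs differently:

```python
# Hypothetical pass rates illustrating the bias in the blended average:
# Not ED at 80%, ED at 60% -- the roughly 20-point statewide gap.
def overall_pass_rate(ed_share, ed_rate=60.0, not_ed_rate=80.0):
    """Blended division pass rate for a given share of ED test-takers."""
    return ed_share * ed_rate + (1.0 - ed_share) * not_ed_rate

low_ed_division = overall_pass_rate(0.30)   # 30% ED test-takers
high_ed_division = overall_pass_rate(0.65)  # 65% ED test-takers
```

With identical group performance, the high-ED division’s average comes out seven points lower, which is why the comparisons here treat the ED and Not ED groups separately.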

To start, here are the reading pass rates:


Until 2017, Petersburg’s Not ED pass rate was slightly above the state average ED rate. Then the Petersburg rate dropped.

Hint: Staff at Petersburg’s AP Hill Elementary got caught cheating at the end of 2017.

At all times on this graph, Petersburg’s stronger group was well below the nominal benchmark for accreditation in English, the red line on the graph. The ED average was flirting with the 50% level.

Both Petersburg groups slid this year: The 2019 Petersburg Not ED average was 5.3 points below the state ED average; the Petersburg ED average dropped to 49.8%. Said otherwise, 40.5% of Petersburg’s Not ED students and half the ED students flunked the 2019 reading SOL.

Petersburg’s “progress” on the writing tests has been even less edifying.


Likewise, History.


On the math tests, the raw data suggest that Petersburg improved a bit this year.


That impression fades when we notice that the 2019 math tests were new and the relaxed scoring boosted the state Not ED average by 3.4% and the ED by 6.6% over the 2018 pass rates. In light of those increases, Petersburg’s increases of 2.0 and 3.4%, respectively, are, in fact, decreases.
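To make that adjustment concrete, here is a minimal sketch using the gains quoted above (statewide boosts of 3.4 and 6.6 points; Petersburg gains of 2.0 and 3.4):

```python
# Subtract the statewide scoring boost from Petersburg's raw 2018-to-2019
# gains (all values in pass-rate points, as quoted in the text).
state_boost = {"Not ED": 3.4, "ED": 6.6}
petersburg_gain = {"Not ED": 2.0, "ED": 3.4}

adjusted_change = {group: round(petersburg_gain[group] - state_boost[group], 1)
                   for group in state_boost}
# Both groups come out negative: Petersburg lost ground relative to the state.
```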

Finally, science, where Petersburg again reached toward lower pass rates.


Despite fifteen years of “supervision” from the Board and Department of Education, Petersburg wallows in failure.  The Board has yet to sue any school division, much less Petersburg, under the authority of § 22.1-253.13:8.

Isn’t it long past time for the Board of Education and the senior bureaucrats at the Department to be directed to employment that is better suited to their talents?

2019 SOLs: Tops of the Lists

To cleanse the palate after that last post, here are the 25 Virginia schools with the highest pass rates in Reading and Math for the students who are and are not economically disadvantaged.





You’ll notice Fairfax County’s Thomas Jefferson High School for Science and Technology, a Governor’s School, at 100% on all four lists. You won’t see Maggie Walker anywhere in the database, although the pass rates there would also put it on all four of these lists. Walker also is a Governor’s School, with a four-year program; it issues diplomas to its graduates. But VDOE does not report the SOL scores of the Walker students; those scores go, instead, to artificially inflate the scores of the high schools in the home districts of the Walker students.

2019 v. 2018

Statewide, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by passing the SOL tests at about a 20% lower rate. Thus, the average SOL is an unfair measure that punishes the divisions with large ED populations (e.g., Richmond, which is about 2/3 ED). So we’ve been looking at the 2019 SOL data separately for the ED and Not ED populations, not for the misleading average.

In that vein, here are the 2019 Reading pass rates for the Not ED students in Richmond’s elementary schools (the blue bars). The yellow bars show the changes from 2018.


The change at Barack Obama is calculated from the 2018 datum at JEB Stuart.

The “0.0” entries at Fairfield Court and Miles Jones indicate cases where the Not ED population was so small (<10) that VDOE suppressed the data. One particularly happy note is the 20.6 point gain at my neighborhood school, Westover Hills.

The RT-D reports that Carver, after the cheating scandal, plummeted to second-worst in Richmond this year. The database, however, reports as to Carver, “There is [sic] no data for this report.”


(I’m glad my old Latin teacher did not live to see that assault on the language from the Department of “Education.”)

Turning to the pass rates of the ED students, we see (still without a Carver entry):


The Good News there is Swansboro and, again, Westover Hills.

The Bad News starts with Southampton, one of our better elementary schools, and Munford, our very best as measured by the SOL average. The very small ED population at Munford leaves the school with bragging rights on average but with some explaining to do regarding its plunging ED performance, especially v. Cary.

Turning to our awful middle schools, again on the reading tests for Not ED students, we see a twinkle of sunshine at Elkhardt Thompson but compounded disasters at Henderson, MLK, and Boushall.


As to the ED students, Elkhardt Thompson again shows a small gain while all the others slip further into the mire of failure.


I’ve included Franklin in the high school data although it serves both middle and high school grades. Franklin joins all the comprehensive high schools in losing ground on the reading tests as to its Not ED students.


As to the ED students, Franklin reversed course while Huguenot and TJ showed large drops.


Turning to the math data, and recalling that the new tests and scoring boosted the Not ED pass rate average by 3.4% and the ED by 6.6%, here are the Richmond elementary schools.


Notice the expanded axis to accommodate the large gains at Ginter Park, Overby-Sheppard, and Westover Hills (with some lesser, but still very nice gains elsewhere). We’ll fervently hope these gains are genuine. Please notice that the scale applies ONLY to the blue bars.

A number of our elementary schools beat the +6.6% average gain for ED students; a number did not. (Go, Westover Hills!)


The middle school data offer slivers of hope and buckets of despair.



For the high schools, the axis again expands to accommodate large gains (Franklin and Community); again, the numbers on that axis apply only to the blue bars.



Stay tuned for a look at the best and worst schools in the state.

The Race to the SOL Bottom

Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) on the SOL tests; statewide, the pass rate difference is about 20%, depending on the subject. As a result, the raw SOL punishes divisions, such as Richmond, with relatively large ED populations.

So, in this look at Richmond’s place in the race to be the worst school division, let’s consider both the Not ED and the ED pass rates.

First, Reading: Here are the ten divisions with the worst Not ED pass rates.


Here the Richmond Not ED pass rate fell by 2.2% but six other divisions managed to do worse. Petersburg, e.g., held its last-place position with a decrease of 5.0%. Overall, Richmond improved from third to sixth from the bottom.

Notes: The “Division Averages” there are averages of the division pass rates, not the average of all Virginia students taking the tests. Some of the Halifax County data are absent from the 2019 download; I’m told that all the Halifax reading data were ED. In any case, Halifax is missing here.

As to the ED students, Richmond held on at second from worst.


On the math tests (where a new, failure-averse scoring system improved pass rates statewide), the boost to Richmond’s Not ED rate was sufficiently feeble to drop the division from fifth to fourth worst.


As well, despite the ED increase, Richmond slipped from fourth to third worst.


For another view of Richmond’s performance, here is a graph of division reading pass rates v. the percentage of ED students taking the tests.


Richmond is the yellow points at 64.1% ED. The peer city, Norfolk, is the red points at 65.9% ED. (Remember, those are percentages in the tested population, not in the division overall.)

Richmond’s Not ED pass rate is 10.1 points, 1.6 standard deviations, below the fitted line. The ED rate is 18.2 points, 2.2 standard deviations, below the ED line.

An interesting aspect of this graph is the negative slope and nontrivial R-squared of the least squares fit to the Not ED pass rates while the R-squared value for the ED data is too small to imply any correlation. Said in English, it looks like the percentage of ED students shows a modest, negative correlation with the pass rates of their Not ED peers but not those of the ED students themselves.
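For readers who want to reproduce this kind of fit, here is a minimal pure-Python sketch on made-up division data (the x values are ED percentages of test-takers, the y values Not ED pass rates; none of these numbers come from VDOE):

```python
# Toy least-squares fit (pure Python) illustrating the fitted-line and
# residual-in-standard-deviations calculations. All numbers are made up.
from math import sqrt

x = [20.0, 30.0, 40.0, 50.0, 60.0, 70.0]  # % ED among tested students
y = [90.0, 84.0, 87.0, 80.0, 83.0, 76.0]  # hypothetical Not ED pass rates

n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
slope = (sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
         / sum((xi - x_bar) ** 2 for xi in x))
intercept = y_bar - slope * x_bar

residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
resid_sd = sqrt(sum(r * r for r in residuals) / n)
r_squared = 1 - (sum(r * r for r in residuals)
                 / sum((yi - y_bar) ** 2 for yi in y))

# A division at 64% ED with a 70% pass rate would sit this many residual
# standard deviations below the fitted line (negative = below the line):
sd_below = (70.0 - (slope * 64.0 + intercept)) / resid_sd
```

The same slope, R-squared, and residual-in-SDs numbers come straight out of Excel’s trendline, but the arithmetic above is all that is going on underneath.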

The math data tell much the same story.


In this case, Richmond’s Not ED pass rate is 2.5 standard deviations below the fitted line while the ED rate is 1.8 SDs below its line.

On both graphs, notice the peer jurisdiction outperforming Richmond notwithstanding ED populations in the same ballpark. Note also the divisions with ED populations larger than Richmond’s that, for the most part, manage to achieve higher pass rates.

2019 SOL Pass Rates

The 2019 SOL data are up on the VDOE Web page. Let’s see what they say about Richmond.

The raw SOL pass rates are misleading and unfair to Richmond and the other divisions with large populations of “economically disadvantaged” (“ED”) students. Overall, ED students pass the SOLs at a rate about 20% lower than their more affluent (“Not ED”) peers. Thus, the average pass rates are lowered for the divisions with more ED students. In Richmond, with about 2/3 ED students, the hit is particularly large.

So let’s look at both the ED and Not ED data.  To start, here are the reading pass rates by year for the state and for Richmond:


We’ll compare Richmond to the other divisions in the next post. For now, just notice that Richmond’s Not ED rate is below the 75% nominal accreditation level and 14.6 points below the state average while its ED rate is below 50%, 17.25 points below the Virginia average.

The average reading pass rates dropped this year, especially in Richmond.


The math data paint a more interesting picture. The Board changed the test this year and relaxed the scoring, presumably as part of an ongoing effort to reach Lake Wobegon status. That helped a lot statewide; in Richmond, not so much.



Notice that Richmond’s Not ED math pass rate this year was below the Virginia ED rate.

For the writing data, I had to change the y-axis range to accommodate the Richmond ED numbers.


The Richmond Not ED numbers improved this year; that improvement was swamped, however, by the continued slide of the ED pass rate.


The history and Social Science numbers continued a downward course.


Especially in Richmond.


The science pass rates nearly held steady this year.



It probably is too soon to hope that our new Superintendent might have started to improve Richmond’s dismal numbers. Any hope that he might at least have broken even, however, is now seen to have been futile.

2018 Richmond Crime Rates

The Virginia State Police publish an annual report on Crime in Virginia.  They count the “Type A” offenses reported per police unit:

Destruction/Damage/Vandalism of Property
Drug/Narcotic Offenses
Fraud Offenses
Gambling Offenses
Motor Vehicle Theft
Pornography/Obscene Material
Prostitution Offenses
Sex Offenses, Forcible & Nonforcible
Stolen Property Offenses
Weapon Law Violations

This year they classify offenses in three categories:


These data have their peculiarities. First, a counting problem: The published report [at p.8] for 2018 reports 89,701 “Crimes Against the Person,” while their (wonderfully complete but complex) database reports 105,004.  The report says (at page 8), “Several offenses may have occurred in one crime incident; therefore the total number of Group A offenses reported was 418,074.”  Yet the database total is 436,464.

Note: In one sense I’m being unfair. The data from the local sources change daily as after-the-fact information emerges and the database is updated to reflect those changes.  But the database has had six months to settle down, and the differences noted above are VERY large compared to the ordinary after-the-fact changes.

Whatever they are counting, the total should be the total. 

In any case, the numbers below are what they report in the database as of July 17-19, 2019.

They report the numbers by police agency, both the local force and, in most cases, the State Police.  For example, the Richmond Police Department shows 21,859 total offenses and the State Police show 233 in Richmond.  The report also includes data for the colleges (e.g., 1,269 for the VCU Campus Police), the Capitol Police (74 offenses), and state agencies such as the ABC Board (100 offenses).  Just as a statistical matter, the small jurisdictions produce some weird numbers because even a small number of crimes can produce a large change in the crime rate.  As well, the State Police report a significant fraction of the incidents in some small jurisdictions; for instance, in Craig County in 2018, the sheriff reported 26 offenses while the State Police reported 22.

The data below are for the jurisdiction’s local law enforcement agency only.

To start, here are the 2018 counts, expressed as Type A offense reports per 100 population vs. population.


Richmond is the gold square.  The red diamonds, from the left, are the peer jurisdictions, Hampton, Newport News, and Norfolk.  South Boston and Clifton Forge are omitted for lack of population data.

There is no particular reason to expect these data to fit a straight line but Excel is happy to fit one.  The slope suggests that the rate (per hundred population) increases by about 0.12 for a population increase of 100,000.  The R-squared, however, tells us that population explains less than half of 1% of the variance in the crime rate; i.e., overall crime rate (by this measure) does not correlate with population.

Here is the same graph, with the axis expanded to cut off the Big Guys (Fairfax, Va. Beach, Prince Wm., Loudoun, Chesterfield, and Henrico) in order to emphasize the distribution of the smaller jurisdictions.


The Top Twenty for offense rates all are cities.


And most of the Leaders there are little guys.

If we sort by population, we get:


Notice the large differences between the rates of the large cities and the large counties.

Just to cleanse the palate, here are the twenty jurisdictions with the lowest counts per hundred population (all are counties).


CAVEATS: These numbers tell us about overall crime rates but not about the environment faced by any particular citizen.  As well, the VSP emphasizes that, as we see above, population is not a good predictor of crime rate.  They list [at p.8] other factors:

1. Population density and degree of urbanization;
2. Population variations in composition and stability;
3. Economic conditions and employment availability;
4. Mores, cultural conditions, education, and religious characteristics;
5. Family cohesiveness;
6. Climate, including seasonal weather conditions;
7. Effective strength of the police force;
8. Standards governing appointments to the police force;
9. Attitudes and policies of the courts, prosecutors and corrections;
10. Citizen attitudes toward crime and police;
11. The administrative and investigative efficiency of police agencies and the organization and cooperation of adjoining and overlapping police jurisdictions;
12. Crime reporting practices of citizens.

Here are the rates for Richmond, the total for all jurisdictions (still of the local law enforcement agencies), and the peer cities for the most-reported offenses.


And some less common offenses:


And some still less common crimes:


The 2018 Richmond rate increased to 9.76 from 8.65 in 2017 while the state total continued a decline to just above half the Richmond rate.


As the graph at the top shows, the Type A total is driven by the property crime numbers.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see that the Richmond count of simple assaults rose a bit in ‘18, as did the drug offenses.


Note: This graph and those immediately below report the raw counts of offenses reported to the Richmond Police Dept., not the count per 100.  Throughout this period, the Richmond population has been just over 200,000, with very little change, so you can get close to the rates per 100 by dividing these numbers by 2,000.  For example, the quick number for drug offenses is just short of 1.0 per hundred while the actual rate is 0.90.
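As a sanity check on that shortcut, here is the arithmetic with illustrative numbers (the population and offense count below are hypothetical, not VSP figures):

```python
# Rate per 100 residents vs. the divide-by-2,000 shortcut in the note above.
population = 204_000      # hypothetical, near Richmond's ~200,000
drug_offenses = 1_836     # hypothetical offense count

rate_per_100 = drug_offenses / population * 100   # the exact rate
quick_estimate = drug_offenses / 2_000            # the shortcut
```

The shortcut overstates the rate slightly whenever the population exceeds 200,000, which matches the just-short-of-1.0 vs. 0.90 example in the note.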

The robbery numbers continued a long downward trend; aggravated assaults continued to rise.


The murder and kidnap counts dropped.  The decreases from the early 2000’s, both here and above, are remarkable.


For a list of the hot blocks for drugs in Richmond see this page.  And see this page for data showing a nice improvement in Forest Hill (with some recent slippage).

Much of Richmond’s plethora of crime is drug-related.

To complement the still outrageous crime rate, our schools are among the worst in the state and our public housing agency maintains a sanctuary for crime on its property.  To support all this dysfunction, we pay some of the highest taxes in the state.  Go figure.

Note: In the past, Mr. Westerberg of the VSP kindly furnished a copy of the data as an Excel spreadsheet, so I didn’t have to copy the numbers from the PDF report on the Web.  He now is enjoying his retirement and the VSP has put up a database with more numbers than one might want.

“No Tax Increase” Increases

Following last year’s increase in the meals tax (ca. $9.1 million per year), our Mayor asked Council for a nine-cent increase in the property tax and a fifty-cent-per-pack cigarette tax (ca. $24 million per year).  Having lost on the real estate increase this year, Mayor Stoney “did not rule out the idea of proposing a real estate tax increase” next year.

All that sounds like the danger is at least on hold until next year.

But wait!  Councilman Agelasto’s July newsletter paints on a larger canvas:

  • Property assessments will rise about 8% this year;
  • Gas rates will increase 3.5%;
  • The typical water bill will increase by $1.41;
  • The typical wastewater bill will increase by $2.39; and
  • The typical stormwater utility rate will increase by 4%.

The BLS tells us the Consumer Price Index for urban consumers rose 1.8%, May ‘18 to May ‘19.

Try not to think about that as you dodge the potholes.  And as you hear about RPS’s demand for more money while it already is wasting some fifty million dollars a year.

Where Have All the Dollars Gone?

RPS Wants More Money But Already Is Wasting At Least $52.6 Million a Year

Table 13 in the Superintendent’s Annual Report gives us the 2018 disbursements for each of 132 school divisions, along with the end of year Average Daily Membership (“ADM”).  If we calculate the total disbursements, leaving out facilities, debt service, and contingency reserve, we see that Richmond is the eleventh most expensive division per student.


Not only does RPS spend more per student than the average division, it spends still more than the school systems of the peer cities, Hampton, Newport News, and Norfolk.


Table 13 breaks out the expenditures in ten basic categories (as well as the three I’ve left out).  Here are those data for Richmond, expressed as differences from the per student division averages.


The $1,659 Richmond excess in the Instruction category accounts for 57.5% of the $2,887 total.

If we multiply these numbers by the Richmond ADM, we get the excess spending, here expressed in millions of dollars.


If Richmond paid higher than average salaries or had more teachers than average, it could help explain (perhaps even justify) some of that $38.2 million excess “Instruction” spending.

We have data on that:

  • The average instructional salary in Richmond (Table 19) was $53,138.87 while the division average was $58,677.08.  Multiplying the –$5,538.21 difference by Richmond’s 1,905.22 positions gives –$10.6 million.  Richmond is saving $10.6 million vs. the average division by its lower than average salaries.
  • Dividing Richmond’s 1,905.22 instructional positions by the 23,036.78 ADM gives 0.08270 teachers per student.  The state number is 0.08583.  The difference is a Richmond deficit of 0.00313 teachers per student, which at that $53K average salary is a saving of $3.8 million vs. the division average.
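The two bullet calculations can be checked step by step; this sketch simply reproduces the arithmetic from the figures quoted above (Tables 13 and 19):

```python
# Richmond vs. division-average instructional spending, in millions,
# using the Table 13 and Table 19 figures quoted in the text.
richmond_salary = 53_138.87     # average instructional salary (Table 19)
division_salary = 58_677.08     # division-average salary (Table 19)
richmond_positions = 1_905.22   # instructional positions
richmond_adm = 23_036.78        # end-of-year ADM (Table 13)
state_ratio = 0.08583           # state instructional positions per student

# Salary saving: the pay gap times the number of positions.
salary_saving_m = ((division_salary - richmond_salary)
                   * richmond_positions / 1e6)

# Staffing saving: the positions-per-student deficit across the whole ADM,
# priced at Richmond's average salary.
richmond_ratio = richmond_positions / richmond_adm   # ~0.0827
staffing_saving_m = ((state_ratio - richmond_ratio) * richmond_adm
                     * richmond_salary / 1e6)
```

Both results match the bullets: about $10.6 million saved on salaries and about $3.8 million saved on staffing, which is why those items cannot explain the excess instruction spending.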

Thus the number of teachers and average salary serve to exacerbate, not mitigate, Richmond’s excess spending for instruction (here, in millions of dollars):


The SOL pass rates give a measure of the return for that $52.6 million excess.

Note: As to SOLs, we have seen that Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by some 15 to 20 points, depending on the test.  This makes the division SOL average a biased measure that punishes the divisions with larger percentages of ED students.  So let’s look at the pass rates for both groups, not at the division average pass rates.


That’s clear enough: Division average pass rates do not correlate with the spending.  Richmond (the enlarged points with gold fill) spends a lot of money and gets wretched pass rates.  The peer cities (the red fill, from the left Hampton, Newport News, and Norfolk) do much better on much less money.

The math data tell the same story, albeit with an even more dismal showing by Richmond.


Curiously, with an R-squared of 7.9%, the ED data for math suggest a slight negative correlation between pass rates and spending.

In light of this, we can wonder why City Council this year gave RPS an extra $37 million of our money without any showing that the extra money would help anything and without any trace of accountability of anyone for the effect of the new money or the lack of effect of that excess $52.6 million in 2018.

Bigger Bucks Vanishing Into Failure

We have seen that, for the most part, Richmond pays its teachers less than average:


Let’s see if we can figure out where the money is going and what we’re getting for it.

Table 13 in the Superintendent’s Annual Report shows disbursements by division.  The data below are from the 2018 table, not including the spending for facilities, debt service, and contingency.  The Richmond total is $361,601,063.

(That’s just north of a third of a BILLION dollars.)

The table also gives the end-of-year Average Daily Membership (“ADM”).  Dividing the total disbursement by the ADM for Richmond, three peer cities, and the division average, gives us:
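The division itself is straightforward; as a sketch (the ADM figure here, 23,036.78, is the Table 13 number cited later in these posts, so treat it as an assumption at this point):

```python
# Per-student disbursement: total (less facilities, debt service, and
# contingency) divided by end-of-year Average Daily Membership.
richmond_disbursements = 361_601_063   # Richmond total, per the text
richmond_adm = 23_036.78               # end-of-year ADM (assumed figure)

per_student = richmond_disbursements / richmond_adm   # roughly $15,700
```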



Breaking out that $2,887 per student excess by category gives us:


Note:  The table tells us that the preK spending includes Head Start; the “other” educational spending goes for enterprise operations, community service programs, and other matters that do not involve the delivery of instruction or related activities for K-12 students.

Here we see particularly large spending for attendance and health (6% of the $2,887 excess), food services (8%), pre-kindergarten (19%), and “other” educational programs (6%).  Those are the small change, however: 57% of the excess goes for instruction.

The only way to get this unusually large expenditure for instruction while paying most teachers less than average is to have a lot of teachers or spend a lot of money on something other than teachers’ salaries.

Let’s look at that from two directions: First, the relationship, if any, between disbursements and SOL pass rates; second, the relationship, if any, between number of teachers and pass rates.

We know that, in terms of SOL pass rates, Virginia’s economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by ca. 15-20%.  This renders the division average pass rate unfair to those divisions with relatively large ED populations.  So let’s look at the data for both groups, not at the overall division averages:


Richmond is the gold points.  Whatever they’re spending all that money on, we’re getting precious little return.

The peer cities, Hampton, Newport News, and Norfolk are the red points, left to right.  They are spending a lot less per student than Richmond and getting a lot more performance.

Lynchburg is blue; Charles City, light green.

The fitted lines suggest that pass rates for both groups decrease with increasing disbursements but the R-squared values are tiny for the Not ED group and right small for the ED.

The math data tell much the same story, but with R-squared values that suggest a bit more of a (negative) relationship between pass rate and spending.


In any case, the divisions that spend more per student do not get better SOL pass rates on average.

The Superintendent’s Table 19 gives us the number of instructional positions (classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals) per student.   Juxtaposing those data with the pass rates gives:


BTW: The peers have rearranged here.  From the left it’s Newport News, Hampton, and Norfolk.  Notice that Hampton and Norfolk have more instructional positions per student than we do, while spending less per student.

Go figure!  Richmond is spending lots of money on instruction and not paying most of its teachers an average salary but operating with one of the lowest instructional position/student ratios in the state. 

And we’re getting precious little for it.

I’ll save the spending puzzle for another post.  First hypothesis: Richmond is paying a lot to people other than teachers.

In the meantime, these data tell us that division average reading pass rates increase with the instructional position/pupil ratio.  The R-squared for the ED students, 5.9%, says there’s a hint of a correlation (ρ = +0.24).  For the Not ED pass rates, we also see a positive slope but the R-squared is less than 1%.

The math data tell the same story but with even lower R-squared values.


So, as to Richmond: More money, fewer instructional positions, lower teacher pay, awful pass rates.  Whatever the problem with Richmond’s schools may be, it’s not a shortage of money.