Enthusiasm v. Data at Westover Hills

The RT-D’s Ned Oliver did a piece the other day about the flight from Richmond’s public schools.  As a contrast (an antidote?) to that, Oliver discussed Westover Hills Elementary School, where

Principal Virginia Loving said the parents who come to open houses to check out the school rarely ask about test scores, instead focusing on class size and extracurricular opportunities — two points where her school is strong.

Indeed, Oliver quotes a spokesperson for RPS for the proposition

I know that’s (Superintendent Jason) Kamras’ vision — to have all of our schools look like Westover Hills — look like the city.

Let’s take a look at those test scores that parents “rarely ask” about.  First, the reading pass rates.




Recall that VBOE installed new English and science tests in 2013 that lowered pass rates statewide.  The decreases were generally exaggerated in Richmond because our then-Superintendent failed to align the curricula to the new tests. 

At Westover Hills, we see 2013 reading decreases that more closely mirrored the state averages than did the Richmond decreases, presumably reflecting better preparation at the school. 

In 2014, the third grade enjoyed a reading renaissance.  Unfortunately, that lasted only the one year and was followed by a decline that persisted through 2017.

The fourth grade decline anticipated the 2013 drop and then continued to 2014.  The smaller gains in 2015 and 2016 were wiped out by a further plunge in 2017.

The fifth grade suffered a further drop in 2015, from which it has only partially recovered.

The reading scores for all three grades remain below the 2017 Richmond averages while, on the division average, Richmond was third worst in the state.  Those differences are appalling: Third Grade, 19 points below Richmond, 35 below the state average; Fourth Grade, 29 and 42 points down; Fifth Grade, 11 and 22 points.

The mathematics pass rates paint a similarly dismal picture.




The new tests here were in 2012.  In 2017, Richmond had the second lowest division pass rate in math. 

The 2017 Westover Hills deficits were:


Finally, science:


Before the new tests, Westover Hills was running well above the state average on the 5th grade science test (the only grade tested to the present).  In 2017, it was 18 points below Richmond, 33 below the state.

If our new Superintendent really wants “to have all of our schools look like Westover Hills,” he is prescribing a massive (but happy, it seems) failure that is far beyond even the current, appalling state of our schools.

Sex Offender at 4310 New Kent

The State Police Web site has the Sex Offender Registry required by Va. Code § 9.1-913.

The registry now includes a map.  Here is the portion that includes our neighborhood.


That closest entry to our neighborhood is Mr. Edwards at 4310 New Kent (You’ll have to fill in the number to prove you’re not a robot before the link will open).


The Registry doesn’t say that Mr. Edwards would be dangerous, just that he was convicted of aggravated sexual battery in 2010.  You get to draw your own inference as to risk to your family.

Counting Crimes

The Virginia State Police keep a database of crimes reported under the FBI’s Uniform Crime Reporting system.  They count the “Type A” offense reports by police unit:

     Arson
     Assault Offenses
     Bribery
     Burglary/Breaking & Entering
     Counterfeiting/Forgery
     Destruction/Damage/Vandalism of Property
     Drug/Narcotic Offenses
     Embezzlement
     Extortion/Blackmail
     Fraud Offenses
     Gambling Offenses
     Homicide Offenses
     Kidnapping/Abduction
     Larceny/Theft Offenses
     Motor Vehicle Theft
     Pornography/Obscene Material
     Prostitution Offenses
     Robbery
     Sex Offenses, Forcible & Nonforcible
     Stolen Property Offenses
     Weapon Law Violations

The 2016 version of the annual report, Crime in Virginia, is available on the VSP Web site.  Mr. Westerberg of the VSP has (again) kindly furnished a copy of the data as an Excel spreadsheet so I haven’t had to copy the numbers out of the pdf on the Web.

These data have their peculiarities.  VSP reports the number of incidents, not offenses, so a single incident can include multiple offenses.  Where an incident includes more than one offense, they report only the most serious.  Thus, an incident where an offender murders someone in the course of a burglary while carrying drugs is reported simply as a murder.

They report the numbers by police agency.  Thus, there are counts both for the Farmville Police and the Prince Edward Sheriff, despite their overlap in the Town.  They also list incidents reported to the State Police; for example, the Richmond Police Department shows 19,081 incident reports in 2016 and the State Police show 206 in Richmond that year. 

The report also includes data for the colleges, the Capitol Police, and state agencies such as the ABC Board. 

Finally, the small jurisdictions produce some weird statistics because even a small variation can produce a large change in the crime rate.  As well, in some small jurisdictions the State Police report a significant fraction of the incidents.  For instance, in Alleghany County in 2016, 215 incidents were reported to the sheriff and 122 to the State Police.

I produced the graphs and table below by leaving out the data for the State Police and State agencies.  I also omitted data for the jurisdictions with populations <10,000.

Here, then, are the 2016 data by jurisdiction, expressed as Type A offense reports per 100 population, plotted v. population.


The graph is distorted by the Big Guys, Fairfax and, to a much lesser extent, Virginia Beach, Prince William, Chesterfield, Loudoun, and Henrico.  If we expand the axis to shove those jurisdictions off the graph we get:


The correlation is zilch, suggesting that population is not related to the rate of offenses reported.

Richmond is the gold square here; the red diamonds are the peer jurisdictions, Hampton, Newport News, and Norfolk (left to right).
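That “zilch” correlation is the R-squared of a straight-line fit, the same statistic the trendlines in this post report.  For the curious, here is a minimal sketch of the computation in Python, using made-up jurisdiction numbers rather than the actual VSP data:

```python
from statistics import mean

# Hypothetical (made-up) data: population vs. Type A offense rate per 100.
pop  = [12_000, 45_000, 90_000, 230_000, 450_000, 1_100_000]
rate = [4.1, 2.8, 5.0, 3.6, 4.4, 3.2]

def r_squared(xs, ys):
    """R-squared of a simple linear fit = squared Pearson correlation."""
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

print(round(r_squared(pop, rate), 3))  # tiny: essentially no linear relation
```

With these toy numbers the R-squared comes out under 0.1; values that small mean the fitted line explains almost none of the variation.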

Richmond is sixth from the “top”:


(The VSP database truncates at about 25 characters, so we get, e.g., the Roanoke Police “Depar”.)

The Richmond rate is down this year, reflecting the statewide trend, albeit still nearly double the statewide rate.


The Type A total is driven by the property crime numbers: Typically the larceny, vandalism, and motor vehicle theft numbers will account for 2/3 of the Type A total.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see the number of simple assaults and drug incidents both dropped in Richmond in ‘16.


Note: This graph and those immediately below report the raw counts of offenses reported in Richmond, not the rate.  Throughout this period, the Richmond population has been near 200,000, without much change, so you can get close to the rates per 100 by dividing these numbers by two thousand.  Thus, the 1,579 drug incidents in Richmond in 2016 were 0.71 per hundred population; the approximation gives 0.79.
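The shortcut arithmetic in that note, spelled out (the implied population figure is back-computed from the quoted rates, not taken from the source):

```python
# With Richmond's population near 200,000, rate per 100 ≈ count / 2,000.
drug_incidents = 1_579
approx_rate = drug_incidents / 2_000
print(approx_rate)  # 0.7895 — the ≈0.79 approximation in the text

# The exact 0.71 per 100 quoted implies the population actually used:
implied_pop = drug_incidents / 0.71 * 100
print(round(implied_pop))  # ≈ 222,000
```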

The robbery, aggravated assault, and weapon law numbers did not change much this year albeit the weapon count remains up from the pre-2014 period.


The rape, “other” (nonforcible) sex crime, kidnapping, arson, and murder numbers all continued to bounce around by small amounts.  As with robbery and aggravated assault, the decreases from the early 2000’s are remarkable. 


To see the (decreasing) rates in Forest Hill (notwithstanding the unacceptable rate of car breakins), go here.

Lies, Damn Lies, and Poverty

Our former Superintendent was in the job from January, 2014 to June, 2017.  Under his supervision, Richmond’s (awful) SOL pass rates peaked in 2015 and then declined.


While he was with us, the Super joined in the popular sport of blaming funding and – to the point here – blaming the poorer students for Richmond’s gross underperformance.

I’ve pointed out elsewhere that RPS is wasting something like $55 million per year to obtain appalling pass rates.  They are fibbing about the need for more money (aside from the need to fix the buildings they have been neglecting for decades).

Today I want to look further into the effect of our large population of poorer (“economically disadvantaged”) students.

For sure, economically disadvantaged (“ED”) students underperform their more affluent peers by about 20%.  For example, on the state average data:


Thus, increasing the percentage of ED students can be expected to lower the average pass rates. 
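That dilution is just an enrollment-weighted average.  A toy illustration (the 80%/60% pass rates are made up to approximate the gap, not VDOE figures):

```python
def division_average(pct_ed, ed_rate=60, non_ed_rate=80):
    """Enrollment-weighted average pass rate, with pct_ed in percent."""
    return (pct_ed * ed_rate + (100 - pct_ed) * non_ed_rate) / 100

# Moving from 30% ED to 70% ED knocks 8 points off the division average
# even though neither group's own pass rate changed at all.
print(division_average(30))  # 74.0
print(division_average(70))  # 66.0
```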

The VDOE data permit a look beyond this dilution of the average pass rate to see if there is an effect of ED enrollments on the performance of each group.

To begin, here are the 2017 3d grade reading pass rates by division for the ED and non-ED students, plotted vs. the percentage of ED students taking the test at that grade level in the division.


(BTW: VDOE has a new front end to the SOL database.  It’s a bit easier to use than the old system but also slow.)

Note: Data are omitted here for the divisions with one or more missing data points (usually because of the [docx] suppression rule that applies for groups of fewer than ten students).

Here, as the percentage of ED students in the tested group increases, the scores drop somewhat for both the ED and non-ED groups, but the correlations are minuscule. 

We also see Richmond (the square points) underperforming considerably.

The data for the other two elementary grades show similar – well, mostly similar – patterns.



The Grade 4 ED data show a counter-intuitive increase with increasing ED population but the tiny R-squared value tells us that the two variables are essentially uncorrelated.  The decreasing scores of the non-ED groups show some larger, but still small, correlations with the ED population.

In the middle school data, Richmond drops from poor performance to appalling underperformance and the correlation of the non-ED scores with the %ED improves some.




Except for the third and sixth grades, the R-squared values tell us that the average scores are mildly correlated with the ED population percentage for the non-ED students but vanishingly so for the ED students themselves.

Richmond’s underperformance relative to the other divisions with similar ED populations ranges from moderate in the elementary grades to bottom-of-the-barrel in the middle school grades.

Beyond emphasizing the awful performance of Richmond’s awful middle schools, these data suggest that the major impact of larger ED populations is on the performance of the non-ED students.

The mathematics data tell much the same story.







The Richmond schools underperform again, disastrously so in the middle school grades.

The fitted lines go the wrong way in three cases for the ED scores but, again, with trivial correlations.  The effect on the non-ED students looks to be larger, and to enjoy better correlations, than with the reading data. 

These data hint at a hypothesis, particularly in the middle school grades: At the division level, the effect of large ED populations is mostly seen in the declining performance of the non-ED students.  The corollary: Placing ED students in schools with low ED populations would not significantly improve the performance of the ED students.

I’ll try to pry out the by-school data to see if the pattern persists there.

In any case, our former Superintendent and the other apologists for RPS are not dealing with the real world when they try to blame Richmond’s poor performance on the economic status of the students: Richmond’s schools, especially the middle schools, underperform grossly in comparison to other divisions with similar ED populations.

How about some truth in education: Let’s blame the schools, not the kids.

And You Thought Our Middle Schools Were Merely Bad

A kind reader points me to a piece in the NYT reporting on a Stanford study.  That study looks at progress from the third to eighth grades as a measure of educational quality.

As the Times reports, richer districts tend to have higher scores.  Here, for instance, are the third grade data for 2,000 large districts; the graph plots relative scores vs. district wealth:


The eighth grade numbers show a similar pattern but some different rankings:


And here is the pattern of changes from third to eighth grades:


As the graphs suggest, the story focuses on Chicago.

Overall, the study reports a poor correlation between growth rates and average third grade scores. 

At the bottom of the Times story is a link that will produce score change data for a specified district and nearby systems.  So of course I had to put in Richmond.


As you see, Richmond is at the bottom of the barrel. 

The first surprise here is how much better Petersburg does on this measure. 

We also might wonder whether the relatively low growth reported for some excellent systems, e.g., Hanover, reflects weakness at the eighth grade level or strength at the third. 

In the other direction, Richmond’s [lack of] progress from 3d to 8th grades derives from the awful performance of our middle schools relative to the elementary schools.


The Stanford data also are consistent with the notion that poverty does not excuse the appalling performance of Richmond’s public schools.

Summer of Theft from Motor Vehicle

The Police Dept. database now has the summer, 2017 data so I can update my last jeremiad regarding car breakins in our neighborhood.

The most common offense reported in Forest Hill is “Theft from Motor Vehicle.”  Since the start of the database in 2000, there have been 576 unique reports of those, 27% of the total.


I expect the actual number of such thefts is larger but we have no way to know how many of them go unreported.

I like to call those thefts “car breakins” but, in fact, most of the victims left their cars unlocked.  Much of the “Destruction Property/Private Property” reported reflects actual car breakins.

The data show a third summer of increase, following a period of relative tranquility.


The largest contributor to these numbers is the 4200 block of Riverside Drive, home of the “42d St. Parking Lot.”



There’s a lot of activity at the Nature Center as well.


The Good News is the long-term decrease in the numbers.


*2017 data through Nov. 24.

There is some history there.

The considerable decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and off season and they installed rocks to block parking in the part of the lot that is less visible from the street.

I attribute the recent increases to the increased use of the Park and the removal of the rocks last year. 

In any case, there are two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that most of the thefts are from the vehicles of park visitors, it’s past time for some LARGE signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest) and, especially, in the 42d St. parking lot to warn the visitors:

     Car Breakins Here! Lock your junk in your trunk.

Final Accreditation Update

At its October meeting, the Board of Education resolved 58 TBD cases by allowing the schools in question to be “Partially Accredited-Reconstituted.” 

Here is the resulting situation as to Richmond, Petersburg, and the state totals:


The categories there are my shorthand.  The long versions are here and here.  “Cheated” is my shorthand for “Accreditation Withheld-Board Decision.”  That decision followed the cheating scandal at AP Hill Elementary in Petersburg.

As of today, the table on the VDOE Web page has not been updated.  The data above are from the spreadsheet that apparently has been updated.

To focus just on Richmond:


The Good News is that the number of accredited schools is set to rise dramatically.  The Bad News is that the “improvement” need not reflect performance improvements, just a gross dilution of the standards.  Your tax dollars at “work.”

More Money for What?

The tug-of-war between the School Board and City Council over school funding enters a new era of speculation: Will the election results produce more funds for Richmond schools?

That overlooks the more fundamental question: What is the School Board doing with all the money it now is spending?

The most recent data from VDOE are from 2015-16.  Here from that year are the per-student disbursements by division for Richmond, three peer divisions, and the state average for school divisions.  I’ve left out the sums for facilities, debt service, and contingency reserve.


The largest single cost for schools is salaries, so it’s no surprise that Richmond’s excess of $3,051 per student mostly went to the instruction category.


Expressing the Richmond data as differences from the state number, we get:


Or, multiplying those differences by the Richmond enrollment of 21,826:


Multiplying the $3,051 Richmond total excess by the 21,826 enrollment gives the excess Richmond expenditure total: $66.6 million. 
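That total is straightforward to check:

```python
excess_per_student = 3_051   # Richmond minus state average, $ per student
enrollment = 21_826
excess_total = excess_per_student * enrollment
print(f"${excess_total:,}")  # $66,591,126, i.e. the $66.6 million above
```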

We can account for some of that. 

The 2016 average salary in Richmond was $51,263 vs. the state average of $56,320.  That is, Richmond saved $11.3 million vs. the state average by underpaying its teachers.
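The per-teacher gap behind that figure, and the teacher count it implies (the count is back-computed from the $11.3 million, not a number from the source):

```python
state_avg_salary = 56_320
richmond_avg_salary = 51_263
gap = state_avg_salary - richmond_avg_salary   # $5,057 per teacher per year

implied_teachers = 11_300_000 / gap
print(round(implied_teachers))  # about 2,230 teachers
```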

At the same time, Richmond has a lot more teachers than average:


At the average salary of $51,263, that comes to an extra cost of $18.5 million.

Combining those data leaves Richmond with an excess cost v. the division average of $59.4 million.


This overlooks the question of what those extra teachers are doing for the RPS students.  To judge from the SOL scores, it’s more a question of what they are doing to the students.

We often hear that special education and poverty are major contributors to Richmond’s higher costs.  It is difficult to come by data to quantify that.  The closest I have come is SchoolDataDirect, the now-defunct Standard & Poor’s data service that was supported by the Gates Foundation.

S&P calculated adjustments to spending on core operating activities to account for poverty, special education, and geographic factors.  On the 2006 data, here are their numbers:


The difference in adjustments is 5.8%; that is, on those (old) data, Richmond schools were 5.8% less expensive than average (in terms of “core” spending) because of those three factors. 

Based on that counter-intuitive result and the absence of current data, let’s pass on those factors.

One final item:  Richmond’s excess expenditure for O&M is $2.8 million.  Even if all that were justified (and given the sorry state of our old school buildings, it probably is), Richmond would be spending an excess $56.6 million.
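Pulling the running arithmetic together (all figures in $ millions, taken from the text above):

```python
excess_total   = 66.6  # total 2016 disbursement excess vs. state average
extra_teachers = 18.5  # cost of above-average teacher staffing
salary_savings = 11.3  # saved by paying below-average salaries
o_and_m        = 2.8   # excess operations & maintenance spending

# Staffing, net of the salary savings, explains only 7.2 of the excess...
after_staffing = excess_total - (extra_teachers - salary_savings)
print(round(after_staffing, 1))  # 59.4

# ...and treating the O&M excess as justified leaves the rest unexplained.
print(round(after_staffing - o_and_m, 1))  # 56.6
```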


After these adjustments we are left with an estimated excess in our 2016 school disbursements of some $56+ million compared to the state average (recall that this does not account for disbursements for facilities, debt, or contingency).  What we got for that money in 2016 was the second lowest math pass rate and the lowest reading pass rate among the Virginia school divisions.

Until the RPS can explain where all that money is going and what we are getting for it, there is no reason at all to increase the school budget.  To the contrary, these data make a case for cutting the RPS budget by at least $56 million.