The Worst of the Appalling

The earlier post shows that, despite the happy aura, Westover Hills Elementary School is miserably failing to educate its students. 

I singled that school out because of the story in the paper, not because it is the worst performer in town.  It merely is among the worst.

For example, here from the 2017 testing are the twenty-one worst 3d grade reading pass rates in Virginia.  For reference, the accreditation benchmark is 75%.

image

Westover Hills’ 39.3% ties for ninth from the worst.  Seven other Richmond elementary schools join Westover Hills in this cellar.

It’s a secret pleasure to see a school each from Fairfax and Henrico on this list.  That said, these data tell us that terrible schools are not unique to Richmond, but the concentration of them is.

Moving to the higher grades:

image

image

Notice that Westover Hills’ 5th grade performance floated above the bottom of the barrel.

The math picture is similarly awful.

image

image

image

The middle school numbers are no more encouraging, especially in light of MLK’s successful candidacy for worst in the state.

image

This 7th grade list expands to 22 schools in order to include Binford.  Of Richmond’s seven middle schools, all but Albert Hill made this expanded list.

image

Armstrong High makes the 8th grade list courtesy of the complicated SOL system that results in high schools administering the 8th grade test to some students.  Among the middle schools, MLK retains its “lead.”

image

Turning to the End of Course tests:

image

All of Richmond’s mainstream high schools made the bottom 23.  In a happy contrast, both of the more selective high schools aced the reading tests (Franklin, which is selective and has both middle- and high school grades, also did just fine, with a 93.9% pass rate).

image

To complete the picture, here are the math data.  The accreditation benchmark is 70%.

image

image

image

The only surprise in the middle school math data is the (slight) competition for MLK in the race to the bottom.

image

image

image

As to the mainstream high schools, Huguenot floats out of the cellar.  Franklin (which is selective) replaces Huguenot.

image

Neither of Richmond’s selective high schools appears near the top of the math EOC list.  Open High is no. 178 from the top with a 98.1% pass rate; Community is no. 611 at 67.9%, behind Huguenot at no. 579 with 72.1%.

Welcome to Richmond, Mr. Kamras.

Enthusiasm v. Data at Westover Hills

The RT-D’s Ned Oliver did a piece the other day about the flight from Richmond’s public schools.  As a contrast (an antidote?) to that, Oliver discussed Westover Hills Elementary School, where

Principal Virginia Loving said the parents who come to open houses to check out the school rarely ask about test scores, instead focusing on class size and extracurricular opportunities — two points where her school is strong.

Indeed, Oliver quotes a spokesperson for RPS for the proposition

I know that’s (Superintendent Jason) Kamras’ vision — to have all of our schools look like Westover Hills — look like the city.

Let’s take a look at those “rarely ask[ed] about” test scores.  First, the reading pass rates.

image

image

image

Recall that VBOE installed new English and science tests in 2013 that lowered pass rates statewide.  The decreases were generally exaggerated in Richmond because our then-Superintendent failed to align the curricula to the new tests. 

At Westover Hills, we see 2013 reading decreases that more closely mirrored the state averages than did the Richmond decreases, presumably reflecting better preparation at the school. 

In 2014, the third grade enjoyed a reading renaissance.  Unfortunately, that lasted only the one year and was followed by a decline that persisted through 2017.

The fourth grade decline anticipated the 2013 drop and then continued to 2014.  The smaller gains in 2015 and 2016 were wiped out by a further plunge in 2017.

The fifth grade suffered a further drop in 2015, from which it has only partially recovered.

The reading scores for all three grades remain below the 2017 Richmond averages while, on the division average, Richmond was third worst in the state.  Those differences are appalling: Third Grade, 19 points below Richmond, 35 below the state average; Fourth Grade, 29 and 42 points down; Fifth Grade, 11 and 22 points.

The mathematics pass rates paint a similarly dismal picture.

image

image

image

The new tests here were in 2012.  In 2017, Richmond had the second lowest division pass rate in math. 

The 2017 Westover Hills deficits were:

image

Finally, science:

image

Before the new tests, Westover Hills was running well above the state average on the 5th grade science test (the only grade tested through the present).  In 2017, it was 18 points below Richmond, 33 below the state.

If our new Superintendent really wants “to have all of our schools look like Westover Hills,” he is prescribing a massive (but happy, it seems) failure that is far beyond even the current, appalling state of our schools.

Sex Offender at 4310 New Kent

The State Police Web site has the Sex Offender Registry required by Va. Code § 9.1-913.

The registry now includes a map.  Here is the portion that includes our neighborhood.

image

The closest entry to our neighborhood is Mr. Edwards at 4310 New Kent (you’ll have to fill in the number to prove you’re not a robot before the link will open).

image

The Registry doesn’t say that Mr. Edwards would be dangerous, just that he was convicted of aggravated sexual battery in 2010.  You get to draw your own inference as to risk to your family.

Lies, Damn Lies, and Poverty

Our former Superintendent was in the job from January, 2014 to June, 2017.  Under his supervision, Richmond’s (awful) SOL pass rates peaked in 2015 and then declined.

image

While he was with us, the Super joined in the popular sport of blaming funding and – to the point here – blaming the poorer students for Richmond’s gross underperformance.

I’ve pointed out elsewhere that RPS is wasting something like $55 million per year to obtain appalling pass rates.  They are fibbing about the need for more money (aside from the need to fix the buildings they have been neglecting for decades).

Today I want to look further into the effect of our large population of poorer (“economically disadvantaged”) students.

For sure, economically disadvantaged (“ED”) students underperform their more affluent peers by about 20%.  For example, on the state average data:

image

Thus, increasing the percentage of ED students can be expected to lower the average pass rates. 
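To make that dilution concrete, here is a minimal sketch.  The 80% (non-ED) and 60% (ED) pass rates are placeholders standing in for the state averages in the table above, roughly matching the 20-point gap; only the mix of students changes.

```python
# Illustrative only: a larger ED share lowers the overall average pass rate
# even though neither group's own pass rate changes.
# The 80/60 rates are placeholders, not actual SOL data.

def overall_pass_rate(pct_ed, ed_rate=60.0, non_ed_rate=80.0):
    """Enrollment-weighted average of the two groups' pass rates."""
    return pct_ed * ed_rate + (1 - pct_ed) * non_ed_rate

for pct_ed in (0.2, 0.4, 0.6, 0.8):
    print(f"{pct_ed:.0%} ED -> overall pass rate {overall_pass_rate(pct_ed):.1f}%")
# 20% ED -> 76.0%, 40% -> 72.0%, 60% -> 68.0%, 80% -> 64.0%
```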

The VDOE data permit a look beyond this dilution of the average pass rate to see if there is an effect of ED enrollments on the performance of each group.

To begin, here are the 2017 3d grade reading pass rates by division for the ED and non-ED students, plotted vs. the percentage of ED students taking the test at that grade level in the division.

image

(BTW: VDOE has a new front end to the SOL database.  It’s a bit easier to use than the old system but also slow.)

Note: Data are omitted here for the divisions with one or more missing data points (usually because of the [docx] suppression rule that applies for groups of fewer than ten students).

Here, with increasing ED population in the tested group in the division, the scores drop some for both the ED and non-ED groups, but the correlations are minuscule.

We also see Richmond (the square points) underperforming considerably.
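For anyone who wants to reproduce the fitted lines and R-squared values in these plots, here is a rough sketch.  The file name and column names are hypothetical stand-ins for whatever the VDOE export actually provides.

```python
# Sketch only: division-level fits of pass rate vs. %ED for each group.
# File and column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("grade3_reading_2017_by_division.csv")

# Drop divisions with a suppressed group (the fewer-than-ten-students rule).
df = df.dropna(subset=["pct_ed_tested", "ed_pass_rate", "non_ed_pass_rate"])

for group in ("ed_pass_rate", "non_ed_pass_rate"):
    fit = stats.linregress(df["pct_ed_tested"], df[group])
    print(f"{group}: slope = {fit.slope:.3f}, R-squared = {fit.rvalue ** 2:.3f}")
```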

The data for the other two elementary grades show similar – well, mostly similar – patterns.

image

image

The Grade 4 ED data show a counter-intuitive increase with increasing ED population, but the tiny R-squared value tells us that the two variables are essentially uncorrelated.  The decreasing scores of the non-ED groups show somewhat larger, but still small, correlations with the ED population.

In the middle school data, Richmond drops from poor performance to appalling underperformance and the correlation of the non-ED scores with the %ED improves some.

image

image

image

Except for the third and sixth grades, the R-squared values tell us that the average scores are mildly correlated with the ED population percentage for the non-ED students but only vanishingly so for the ED students themselves.

Richmond’s underperformance relative to the other divisions with similar ED populations ranges from moderate in the elementary grades to bottom-of-the-barrel in the middle school grades.

Beyond emphasizing the awful performance of Richmond’s awful middle schools, these data suggest that the major impact of larger ED populations is on the performance of the non-ED students.

The mathematics data tell much the same story.

image

image

image

image

image

image

The Richmond schools underperform again, disastrously so in the middle school grades.

The fitted lines go the wrong way in three cases for the ED scores but, again, with trivial correlations.  The effect on the non-ED students looks to be larger, and to enjoy better correlations, than with the reading data. 

These data hint at an hypothesis, particularly in the middle school grades: At the division level, the effect of large ED populations is mostly seen in the declining performance of the non-ED students.  The corollary: Placing ED students in schools with low ED populations would not significantly improve the performance of the ED students.

I’ll try to pry out the by-school data to see if the pattern persists there.

In any case, our former Superintendent and the other apologists for RPS are not dealing with the real world when they try to blame Richmond’s poor performance on the economic status of the students: Richmond’s schools, especially the middle schools, underperform grossly in comparison to other divisions with similar ED populations.

How about some truth in education: Let’s blame the schools, not the kids.

And You Thought Our Middle Schools Were Merely Bad

A kind reader points me to a piece in the NYT reporting on a Stanford study.  That study looks at progress from the third to eighth grades as a measure of educational quality.

As the Times reports, richer districts tend to have higher scores.  Here, for instance, are the third grade data for 2,000 large districts; the graph plots relative scores vs. district wealth:

image

The eighth grade numbers show a similar pattern but some different rankings:

image

And here is the pattern of changes from third to eighth grades:

image

As the graphs suggest, the story focuses on Chicago.

Overall, the study reports a poor correlation between growth rates and average third grade scores. 
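As a crude illustration (this is not the Stanford methodology, just the gist of it), the growth measure and that correlation amount to something like the following; the file and column names are hypothetical.

```python
# Back-of-the-envelope version of the growth measure: change in average score
# (in grade-level equivalents) from grade 3 to grade 8, and its correlation
# with the grade 3 starting point.  File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("district_scores.csv")
df["growth"] = df["grade8_score"] - df["grade3_score"]

# The study reports this correlation is poor.
print(df["growth"].corr(df["grade3_score"]))
```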

At the bottom of the Times story is a link that will produce score change data for a specified district and nearby systems.  So of course I had to put in Richmond.

percentile

As you see, Richmond is at the bottom of the barrel. 

The first surprise here is how much better Petersburg does on this measure. 

We also might wonder whether the relatively low growth reported for some excellent systems, e.g., Hanover, reflects weakness at the eighth grade level or strength at the third. 

In the other direction, Richmond’s [lack of] progress from 3d to 8th grades derives from the awful performance of our middle schools relative to the elementary schools.

image

The Stanford data also are consistent with the notion that poverty does not excuse the appalling performance of Richmond’s public schools.

Summer of Theft from Motor Vehicle

The Police Dept. database now has the summer, 2017 data so I can update my last jeremiad regarding car breakins in our neighborhood.

The most common offense reported in Forest Hill is “Theft from Motor Vehicle.”  Since the start of the database in 2000, there have been 576 unique reports of those, 27% of the total.

image
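For anyone who wants to replicate the tally, here is a sketch.  It assumes an incident extract from the Police Department database with hypothetical column names, and it takes “summer” to mean June through August, which is my assumption, not the database’s.

```python
# Sketch only: count "Theft from Motor Vehicle" reports and the summer subset.
# File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("forest_hill_incidents.csv", parse_dates=["date"])
thefts = df[df["offense"] == "Theft from Motor Vehicle"]

print(len(thefts), f"({len(thefts) / len(df):.0%} of all reports)")

# Summer (June-August) counts by year.
summer = thefts[thefts["date"].dt.month.isin([6, 7, 8])]
print(summer.groupby(summer["date"].dt.year).size())
```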

I expect the actual number of such thefts is larger but we have no way to know how many of them go unreported.

I like to call those thefts “car breakins” but, in fact, most of the victims left their cars unlocked.  Much of the “Destruction Property/Private Property” reported reflects actual car breakins.

The data show a third summer of increase, following a period of relative tranquility.

image

The largest contributor to these numbers is the 4200 block of Riverside Drive, home of the “42d St. Parking Lot.”

http://calaf.org/wp-content/uploads/2017/04/image-57.png

image

There’s a lot of activity at the Nature Center as well.

image

The Good News is the long term decrease in the numbers.

image

*2017 data through Nov. 24.

There is some history there.

The considerable decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and in the off season, and they installed rocks to block parking in the part of the lot that is less visible from the street.


I attribute the recent increases to the increased use of the Park and the removal of the rocks last year. 

In any case, there are two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that most of the thefts are from the vehicles of park visitors, it’s past time for some LARGE signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest) and, especially, in the 42d St. parking lot to warn the visitors:

        Car Breakins Here! Lock your junk in your trunk.

Final Accreditation Update

At its October meeting, the Board of Education resolved 58 TBD cases by allowing the schools in question to be “Partially Accredited-Reconstituted.” 

Here is the resulting situation as to Richmond, Petersburg, and the state totals:

image

The categories there are my shorthand.  The long versions are here and here.  “Cheated” is my shorthand for “Accreditation Withheld-Board Decision.”  That decision followed the cheating scandal at AP Hill Elementary in Petersburg.

As of today, the table on the VDOE Web page has not been updated.  The data above are from the spreadsheet that apparently has been updated.

To focus just on Richmond:

image

The Good News is that the number of accredited schools is set to rise dramatically.  The Bad News is that the “improvement” need not reflect performance improvements, just a gross dilution of the standards.  Your tax dollars at “work.”

More Money for What?

The tug-of-war between the School Board and City Council over school funding enters a new era of speculation: Will the election results produce more funds for Richmond schools?

That overlooks the more fundamental question: What is the School Board doing with all the money it now is spending?

The most recent data from VDOE are from 2015-16.  Here from that year are the per-student disbursements by division for Richmond, three peer divisions, and the state average for school divisions.  I’ve left out the sums for facilities, debt service, and contingency reserve.

image

The largest single cost for schools is salaries, so it’s no surprise that Richmond’s excess of $3,051 per student mostly went to the instruction category.

image

Expressing the Richmond data as differences from the state number, we get:

image

Or, multiplying those differences by the Richmond enrollment of 21,826:

image

Multiplying the $3,051 Richmond total excess by the 21,826 enrollment gives the excess Richmond expenditure total: $66.6 million. 
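As a quick check on that arithmetic (the $3,051 and 21,826 figures are from the tables above):

```python
# Per-student excess times enrollment.
per_student_excess = 3_051   # Richmond disbursements minus state average, $ per student
enrollment = 21_826          # Richmond enrollment

print(f"${per_student_excess * enrollment:,}")   # $66,591,126 -- roughly $66.6 million
```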

We can account for some of that. 

The 2016 average salary in Richmond was $51,263 vs. the state average of $56,320, a difference of $5,057 per teacher.  Multiplied over Richmond’s teaching force, that means Richmond saved $11.3 million vs. the state average by underpaying its teachers.

At the same time, Richmond has a lot more teachers than average:

image

At the average salary of $51,263, that comes to an extra cost of $18.5 million.

Combining those data leaves Richmond with an excess cost v. the division average of $59.4 million.

image

This overlooks the question of what those extra teachers are doing for the RPS students.  To judge from the SOL scores, it’s more a question of what they are doing to the students.

We often hear that special education and poverty are major contributors to Richmond’s higher costs.  It is difficult to come by data to quantify that.  The closest I have come is SchoolDataDirect, the now-defunct Standard & Poor’s data service that was supported by the Gates Foundation.

S&P calculated adjustments to spending on core operating activities to account for poverty, special education, and geographic factors.  On the 2006 data, here are their numbers:

image

The difference in adjustments is 5.8%; that is, on those (old) data, Richmond schools were 5.8% less expensive than average (in terms of “core” spending) because of those three factors. 

Based on that counter-intuitive result and the absence of current data, let’s pass on those factors.

One final item:  Richmond’s excess expenditure for O&M is $2.8 million.  Even if all that were justified (and given the sorry state of our old school buildings, it probably is), Richmond would be spending an excess $56.6 million.
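Putting the whole chain of adjustments in one place, using only the dollar figures quoted above:

```python
# All figures in $ millions, taken from the text above.
total_excess       = 66.6   # Richmond disbursements above the state average rate
salary_savings     = 11.3   # saved by paying teachers below the state average salary
extra_teacher_cost = 18.5   # cost of the above-average number of teachers
o_and_m_excess     = 2.8    # excess operations & maintenance spending

remaining = total_excess + salary_savings - extra_teacher_cost
print(f"{remaining:.1f}")                     # 59.4
print(f"{remaining - o_and_m_excess:.1f}")    # 56.6 -- the unexplained excess
```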

image

After these adjustments, we are left with an estimated excess in our 2016 school disbursements of some $56+ million compared to the state average (recall that this does not account for disbursements for facilities, debt, or contingency).  What we got for that money in 2016 was the second lowest math pass rate and the lowest reading pass rate among the Virginia school divisions.

Until the RPS can explain where all that money is going and what we are getting for it, there is no reason at all to increase the school budget.  To the contrary, these data make a case for cutting the RPS budget by at least $56 million.