And You Thought Our Middle Schools Were Merely Bad

A kind reader points me to a piece in the NYT reporting on a Stanford study.  That study looks at progress from the third to eighth grades as a measure of educational quality.

As the Times reports, richer districts tend to have higher scores.  Here, for instance, are the third grade data for 2,000 large districts; the graph plots relative scores vs. district wealth:


The eighth grade numbers show a similar pattern but some different rankings:


And here is the pattern of changes from third to eighth grades:


As the graphs suggest, the story focuses on Chicago.

Overall, the study reports a poor correlation between growth rates and average third grade scores. 

At the bottom of the Times story is a link that will produce score change data for a specified district and nearby systems.  So of course I had to put in Richmond.


As you see, Richmond is at the bottom of the barrel. 

The first surprise here is how much better Petersburg does on this measure. 

We also might wonder whether the relatively low growth reported for some excellent systems, e.g., Hanover, reflects weakness at the eighth grade level or strength at the third. 

In the other direction, Richmond’s [lack of] progress from 3d to 8th grades derives from the awful performance of our middle schools relative to the elementary schools.


The Stanford data also are consistent with the notion that poverty does not excuse the appalling performance of Richmond’s public schools.

Summer of Theft from Motor Vehicle

The Police Dept. database now has the summer 2017 data, so I can update my last jeremiad regarding car breakins in our neighborhood.

The most common offense reported in Forest Hill is “Theft from Motor Vehicle.”  Since the start of the database in 2000, there have been 576 unique reports of those, 27% of the total.


I expect the actual number of such thefts is larger but we have no way to know how many of them go unreported.

I like to call those thefts “car breakins” but, in fact, most of the victims left their cars unlocked.  Much of the “Destruction Property/Private Property” reported reflects actual car breakins.

The data show a third summer of increase, following a period of relative tranquility.


The largest contributor to these numbers is the 4200 block of Riverside Drive, home of the “42d St. Parking Lot.”


There’s a lot of activity at the Nature Center as well.


The Good News is the long term decrease in the numbers.


*2017 data through Nov. 24.

There is some history there.

The considerable decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and off season and they installed rocks to block parking in the part of the lot that is less visible from the street.

I attribute the recent increases to the increased use of the Park and the removal of the rocks last year. 

In any case, there are two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that most of the thefts are from the vehicles of park visitors, it’s past time for some LARGE signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest) and, especially, in the 42d St. parking lot to warn the visitors: “Car Breakins Here! Lock your junk in your trunk.”

Final Accreditation Update

At its October meeting, the Board of Education resolved 58 TBD cases by allowing the schools in question to be “Partially Accredited-Reconstituted.” 

Here is the resulting situation as to Richmond, Petersburg, and the state totals:


The categories there are my shorthand.  The long versions are here and here.  “Cheated” is my shorthand for “Accreditation Withheld-Board Decision.”  That decision followed the cheating scandal at AP Hill Elementary in Petersburg.

As of today, the table on the VDOE Web page has not been updated.  The data above are from the spreadsheet that apparently has been updated.

To focus just on Richmond:


The Good News is that the number of accredited schools is set to rise dramatically.  The Bad News is that the “improvement” need not reflect performance improvements, just a gross dilution of the standards.  Your tax dollars at “work.”

More Money for What?

The tug-of-war between the School Board and City Council over school funding enters a new era of speculation: Will the election results produce more funds for Richmond schools?

That overlooks the more fundamental question: What is the School Board doing with all the money it now is spending?

The most recent data from VDOE are from 2015-16.  Here from that year are the per-student disbursements by division for Richmond, three peer divisions, and the state average for school divisions.  I’ve left out the sums for facilities, debt service, and contingency reserve.


The largest single cost for schools is salaries, so it’s no surprise that Richmond’s excess of $3,051 per student mostly went to the instruction category.


Expressing the Richmond data as differences from the state number, we get:


Or, multiplying those differences by the Richmond enrollment of 21,826:


Multiplying the $3,051 Richmond total excess by the 21,826 enrollment gives the excess Richmond expenditure total: $66.6 million. 

We can account for some of that. 

The 2016 average salary in Richmond was $51,263 vs. the state average of $56,320.  That is, Richmond saved $11.3 million vs. the state average by underpaying its teachers.

At the same time, Richmond has a lot more teachers than average:


At the average salary of $51,263, that comes to an extra cost of $18.5 million.

Combining those data leaves Richmond with an excess cost vs. the division average of $59.4 million.


This overlooks the question of what those extra teachers are doing for the RPS students.  To judge from the SOL scores, it’s more a question of what they are doing to the students.

We often hear that special education and poverty are major contributors to Richmond’s higher costs.  It is difficult to come by data to quantify that.  The closest I have come is SchoolDataDirect, the now defunct Standard & Poors data service that was supported by the Gates Foundation.

S&P calculated adjustments to spending on core operating activities to account for poverty, special education, and geographic factors.  On the 2006 data, here are their numbers:


The difference in adjustments is 5.8%; that is, on those (old) data, Richmond schools were 5.8% less expensive than average (in terms of “core” spending) because of those three factors. 

Based on that counter-intuitive result and the absence of current data, let’s pass on those factors.

One final item:  Richmond’s excess expenditure for O&M is $2.8 million.  Even if all that were justified (and given the sorry state of our old school buildings, it probably is), Richmond would be spending an excess $56.6 million.


After these adjustments we are left with an estimated excess in our 2016 school disbursements of some $56+ million compared to the state average (recall that this does not account for disbursements for facilities, debt, or contingency).  What we got for that money in 2016 was the second lowest math pass rate and the lowest reading pass rate among the Virginia school divisions.
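The arithmetic chain above can be checked with a short sketch; every figure in it is quoted from the text, nothing is assumed:

```python
# Sketch of the excess-cost arithmetic, using only figures quoted above.

enrollment = 21_826            # Richmond fall enrollment
excess_per_student = 3_051     # Richmond per-student disbursement excess vs. state average

# Total excess disbursement vs. the state average
total_excess = excess_per_student * enrollment
print(f"Total excess: ${total_excess / 1e6:.1f} million")      # ≈ $66.6 million

# $18.5M of that is explained by the extra teachers; Richmond also
# *saved* $11.3M by paying below-average salaries, which makes the
# unexplained excess larger, not smaller.
salary_savings = 11.3e6        # saved by underpaying vs. the state average salary
extra_teacher_cost = 18.5e6    # cost of the above-average teacher headcount

unexplained = total_excess - extra_teacher_cost + salary_savings
print(f"Unexplained excess: ${unexplained / 1e6:.1f} million")  # ≈ $59.4 million

# Treating the full O&M excess as justified:
om_excess = 2.8e6
print(f"After O&M: ${(unexplained - om_excess) / 1e6:.1f} million")  # ≈ $56.6 million
```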

Until the RPS can explain where all that money is going and what we are getting for it, there is no reason at all to increase the school budget.  To the contrary, these data make a case for cutting the RPS budget by at least $56 million.

The Board of “Education” Still Doesn’t Care About Truancy

We are reminded by a piece on NPR that you can’t teach students who don’t attend school. 

The General Assembly noticed that problem awhile back.  In 1999, they amended Code § 22.1-258 to install the currently-effective requirements for truancy responses:

  • Any unexcused absence: Contact with the parent;
  • 5 unexcused absences: Attendance Plan;
  • 6 unexcused absences: Conference with Parents;
  • 7 unexcused absences: Prosecute parents or file CHINS petition.

The Department of “Education” responded by requiring the divisions to report the number of students for whom a conference was scheduled and the aggregate daily attendance.  Notwithstanding its duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth,” the Department cheerfully ignored the other requirements of the statute.

That willful ignorance continued until, after a bumpy process that started in 2009, the Board adopted a truancy regulation, effective November 30, 2016, that requires reporting of

1. All excused and unexcused absences as defined in this chapter for each individual student shall be collected.
2. For each student with five unexcused absences, whether an attendance plan was developed, and if not, the reason.
3. For each student with six unexcused absences, whether an attendance conference was scheduled, and if not, the reason.
4. For each student with six unexcused absences, whether an attendance conference was actually held, and if not, the reason.
5. For each student with seven unexcused absences, whether a court referral was made or if proceedings against the parent or parents were initiated and, if not, the reason.

Beginning with the current school year, the Department will collect data on attendance plans, conferences, and court referrals.  We’ll have to wait until next year to see whether that covers the requirements of the regulation (much less the statute).  In the meantime, we’re stuck with the old count of 6-absence conferences.

CAVEAT: Take these data with a tub of salt.  Richmond ignored the statute for years (without provoking any action from the Board of “Education”).  The current data suggest that Richmond is trying harder; we’ll have to wait until next year for the supporting data to see whether they (and other divisions) have decided to obey the law.

So, understanding that these “truancy” numbers may be understated:

Dividing the reported number of six-absence conferences by the fall enrollment (“ADM”), we see the top divisions for (this measure of) truancy:


At least on these data, this is not just a big city problem.
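The measure itself is simple arithmetic; a minimal sketch (the numbers below are made up for illustration, not from any division’s report):

```python
# The truancy measure used above: reported six-absence conferences
# divided by fall enrollment (ADM).  Figures below are illustrative only.

def truancy_rate(six_absence_conferences: int, fall_enrollment: int) -> float:
    """Six-absence conferences as a fraction of fall membership."""
    return six_absence_conferences / fall_enrollment

# e.g. a division reporting 150 conferences with 10,000 students:
print(f"{truancy_rate(150, 10_000):.2%}")  # 1.50%
```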

For a nice contrast, here are the divisions reporting <0.5%.


There’s no telling which, if any, of those numbers we should believe.

Finally, here are the divisions for which there is NO REPORT and no indication of VDOE action to get one.


Your tax dollars at “work.”

Virginia Colleges: Get What You Pay For?

The USDOE has updated its College Scorecard, apparently with 2016 data.  Here is a summary for the Virginia colleges & universities.


That last column is my Bang per Buck ratio, which is the salary times graduation rate divided by cost per year.

CAVEAT: The graduation rate is for full-time, first-time students.  The earnings are only for Title IV-receiving students and do not include those enrolled in graduate school at the time of collection.
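The ratio is easy to compute for any school; here is a minimal sketch (the sample figures are hypothetical, not for any particular Virginia school):

```python
# "Bang per Buck" as defined above: salary times graduation rate,
# divided by cost per year.  Sample numbers are hypothetical.

def bang_per_buck(median_salary: float, grad_rate: float, annual_cost: float) -> float:
    """Salary x graduation rate / annual cost."""
    return median_salary * grad_rate / annual_cost

# e.g. a school with $48,000 median earnings, a 75% graduation rate,
# and a $20,000 annual cost:
print(bang_per_buck(48_000, 0.75, 20_000))  # 1.8
```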

Here are those data in graphs (Please forgive the excess digits after the decimal point in some numbers in the labels; Excel won’t let me set the number of significant figures for each element.  Thus, to get two significant figures in the slope of the first graph, I had to take a ridiculous five in the intercept and six in the R-squared.)




And here are the same graphs for just the Big Five (GMU, Tech, UVa, VCU, W&M):




Structures and SOLs

Paul Goldman’s referendum that would seek to place a school modernization plan in the city charter is pending before the voters. 

An email from Paul today invites us all to attend the School Board meeting tonight to hear about his “plan that will finally give all students their. . . right to a modern, clean, safe, 21st century education compatible School facility.”

It’s hard to know how anybody could oppose replacing the many old school buildings that have not been properly maintained.  Even so, we might wonder whether Paul’s focus on physical facilities, needed though they be, misses the more important point: Education.

We have some data on that.

The new Huguenot facility opened to students on January 5, 2015.  So we have 2-1/2 years to see whether the new digs have affected academics.

To start, here are the reading pass rates by year at Huguenot:


The dips in 2013 surely reflect the new, tougher tests instituted that year.  And we see some recovery in ‘14 and ‘15.  But, to the point here, since then none of the three groups for which VDOE posted data has shown any academic benefit from the new facilities.

Well, how about writing?


OK, how about math?


They started to recover nicely from the new tests of 2012 but then, again, started slipping back.

Maybe History?


Aha!  A small improvement this year among the black and white students.  But nothing to brag about.  And no bounce from the Hispanic students.

We’ll need more than that to infer a learning effect from the new digs.  Science is the last chance:


Well, whatever is going on at Huguenot, there’s no pattern of academic boosts from the new facilities.

These data don’t say there couldn’t be a new digs effect at some other school(s).  But they do support the notion that Paul’s focus might be on the second most important issue.

More on the Deceptive “On-Time” Graduation Rate

We have seen that the Board of “Education” has created an inflated, “On-Time” graduation rate to make the numbers look better.  This year that count inflated the state average cohort graduation rate by 2.8% and the Richmond average by 6.7% over the federal (advanced plus standard diploma) rate.  (Actually, by more than that; see below).

Today let’s look at the effect of that per high school.

To start, here is the On Time rate vs. the federal rate for the 296 high schools for which the 2017 4-Year Cohort Report includes both numbers.


The outlier at (13%, 16%) is Richmond Alternative School, the dumping ground for difficult students (boosted by 3%, as if that could make a silk purse out of a 13% graduation rate).  The other outlier, at (58%, 58%), is JM Langston Focus in Danville, which looks to be a similar outfit.  Let’s leave those two off and see what we get.


The blue line is the “truth in graduation rates” line; very few data points lie on it. 

The average boost looks to increase with decreasing federal rate.  Indeed, that is the case:


The Big Winners here are:


Doubtless the Richmond School Board is not annoyed by the 12.9% gift at Marshall or the 10.9% bonus at Armstrong.
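The “boost” is just the On-Time rate minus the federal rate, and sorting by it surfaces the big winners. A sketch with hypothetical school names and rates (not the actual cohort-report figures):

```python
# The On-Time "boost" per school: On-Time rate minus the federal
# (advanced + standard diploma) rate.  School names and rates below
# are hypothetical, for illustration only.

schools = {
    "School A": {"on_time": 85.0, "federal": 72.1},
    "School B": {"on_time": 90.0, "federal": 88.5},
    "School C": {"on_time": 78.0, "federal": 67.1},
}

boosts = {name: r["on_time"] - r["federal"] for name, r in schools.items()}

# Biggest winners first
for name, boost in sorted(boosts.items(), key=lambda kv: -kv[1]):
    print(f"{name}: +{boost:.1f} points")
```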

If this were not enough official deception, remember that the Federal rate already includes about a 5% finagle factor because they are starting to use “credit accommodations” that permit counting Modified Standard diplomas as “Standard.”  You can be sure they won’t tell you how large that hidden boost turns out to be.

Your tax dollars at “work.”

Dropout and Not

Early one morning I posted: “You might think that our awful dropout rate would serve to improve the graduation rate.” 

That’s backwards, of course.  If the kids drop out, they can’t graduate. 

BUT, a deeper dive into the data suggests that my backward notion may be about 33% correct.

Let’s start by plotting the dropout rate for the schools with graduating classes against the federal graduation rate.  Data are from the 2017 4-year cohort report for 295 schools, with the suppressed data for 19 schools (<10 students in one category or other) omitted.


This is pretty much the expected result: The schools with high graduation rates don’t have many dropouts.  Indeed, the 65% R-squared tells us there’s a good correlation here.

That school up top with the 23% dropout rate(!) is Richmond’s Huguenot High School (see below).  The other yellow squares are, from the top, Richmond’s Armstrong, Wythe, Marshall (on the left) and JT.
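The R-squared quoted above is just the squared Pearson correlation between the two rates. A minimal sketch with fabricated data points (the real ones come from the cohort report):

```python
# R-squared: square of the Pearson correlation between graduation
# rate and dropout rate.  The five points below are fabricated for
# illustration; they just mimic the rough inverse relation in the plot.

def r_squared(xs: list[float], ys: list[float]) -> float:
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

grad = [95.0, 90.0, 85.0, 75.0, 60.0]   # federal graduation rates
drop = [1.0, 2.5, 4.0, 9.0, 18.0]       # dropout rates

print(f"R-squared: {r_squared(grad, drop):.2f}")
```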

Turning to the advanced studies diplomas, we see much the same pattern but with more scatter.


Notice the low rates at the five Richmond schools.

That point at 100% advanced diplomas in fact is two schools: Richmond’s (selective) Community High and Fairfax’s (very selective) Thomas Jefferson Governor’s School. 

Richmond’s (selective) Franklin Military would also be at 100% but is missing here because of the suppression rules.

Richmond’s (also very selective) Maggie Walker is not on the graph because the State reports the results from there at high schools that the Maggie Walker students do not attend.

Last, when we turn to the standard diplomas, we get a result that makes my early-morning blunder look something like an insight:


The considerable scatter here is consonant with the low R-squared value but the pattern still is obvious: Schools with higher rates of standard diplomas (notably the Richmond schools) tend to have higher dropout rates.

Upon reflection, this makes some sense: Schools where more of the graduates hold standard diplomas than advanced are not doing as well overall as the schools that predominantly grant advanced diplomas. 

Indeed, Richmond is the Demon of Dropouts, with an 18% division average.  That average is driven by the 60% rate (70 of 116) at Richmond Alternative (the dumping ground for troublesome students) and counterbalanced by the 0% rates at the selective schools, Community, Franklin, and Open.

The rates at Richmond’s mainstream high schools disclose other problems:


“ED” is economically disadvantaged; “EL” is English learner; “#N/A” indicates suppression of the datum (small number of students). 

The appalling rate of Hispanic dropouts (notably at Huguenot and, to a lesser degree, at Jefferson and Wythe) also contributes to that 18% division average.  Also notice the high rates for disabled students.