The Alternative and Dropouts

Richmond Alternative School serves “students with academic, attendance and behavior challenges.”  Their Web page says they “use positive norms and positive peer pressure from the group in order to maintain a positive learning culture.”

RPS hired Community Education Partners in 2004 to run this receptacle for disruptive students.  In 2013, Richmond took the school over, saving about $2 million per year.  It has now hired Camelot Education to run the school.

The original contractor was doing a decent job, given the tough clientele. 


RPS picked a bad year to take over the school; the new math tests had lowered scores statewide in 2012 and new tests in English and science did the same thing in 2013, albeit more in Richmond.  

Alternative didn’t bounce back until the new contractor was in charge.  For instance:



There is a wrinkle in those data, however: The state and Richmond averages include the elementary school pass rates while Alternative has only the middle and high school grades.  The Alternative trends in the graphs are accurate enough but the comparison to the state and Richmond numbers is not.

If we pull the data by grade, we are hindered somewhat by the VDOE suppression rule that omits data for groups of <10 students.  Even so, there is information here:



(“EOC” is the end-of-course tests that are required to obtain “verified credits” toward the graduation requirements.)
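The suppression rule mentioned above is easy to picture in code.  Here is a minimal sketch, with made-up group sizes (none of these counts are VDOE data):

```python
# Sketch of VDOE-style suppression: groups with fewer than 10 tested
# students are omitted from reported pass rates.  The counts below
# are invented for illustration.

groups = {
    "Grade 6": {"tested": 42, "passed": 25},
    "Grade 7": {"tested": 8,  "passed": 3},   # <10 students: suppressed
    "Grade 8": {"tested": 31, "passed": 17},
}

def reportable_rates(groups, floor=10):
    """Return pass rates only for groups meeting the size floor."""
    return {
        name: round(100 * g["passed"] / g["tested"], 1)
        for name, g in groups.items()
        if g["tested"] >= floor
    }

print(reportable_rates(groups))  # Grade 7 disappears from the report
```

The point of the sketch: suppression hides exactly the small, volatile groups, which is why the by-grade data have gaps.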

There are some details missing but the Richmond Alternative jump in 2017 is clear enough.  Either the contractor is cheating prodigiously, the dropouts are having an exceptional effect on the scores of the remaining students, or it’s time for our School Board to admit its own incompetence and hire outsiders to run all the schools.  I lean toward that last explanation.

In any event, something stinks here.

. . .

Maybe two somethings:  Those Richmond EOC numbers are suspiciously high.  Time for some more digging.

Looking Past the Donut

As you ramble on through Life, then,
Whatever be your goal,
Keep your eye upon the doughnut
And not upon the hole.

That may be good advice in life.  As to the performance of Richmond’s public schools, there’s a question whether the very large dropout hole has the perverse effect of enlarging the SOL donut.

We already know about the appalling condition of that donut, Richmond’s SOL performance.  Let’s further examine the hole, the kids who have dropped out and can’t further damage the pass rates.

The VDOE Web site has dropout statistics by division and by school.  I pulled down those by school for 2018 (the latest available) along with the Fall enrollment (“membership”) data for that year.

First the high schools.  The scales on the ordinates here are the same so we can compare the schools.



And the average of averages for the five schools:


It looks like Wythe, with some modest help from everybody but Armstrong, is driving Richmond’s anomalously high 9th grade dropout rate.

The selective high schools paint a much prettier picture.


No need for a graph for Franklin Military; their high school numbers all are zero.

The middle school numbers are much smaller, albeit several show troublesome trends.  Notice the expanded ordinate.


The middle school average of averages:


Richmond Alternative, the dumping ground for troublesome kids (esp. in middle school), requires expanded ordinates.


If Richmond were serious about dealing with dropouts (the numbers say they are not), these data suggest the places to start.

Attendance, Not

Having noticed Richmond’s atrocious dropout rate, I went digging on the VDOE Web site and found detailed dropout data.  I fired up Microsoft Access, pulled the 2018 dropout data, and set them beside the Fall enrollments for that year.

The result is ugly.


The state data show a regular increase with grade.  The astounding Richmond numbers show an unexpected maximum in the ninth grade.  Perhaps that is due to the notorious “ninth grade bump.”  


(Data are for all Virginia divisions.)

The principal argument for social promotion seems to be that holding the kid back is even less effective for learning than promoting that student.  The literature I’ve seen does not explain why that reason (or simple inertia or moving the problem on to the next level or whatever else might explain social promotion) stops working at the ninth grade.

In any event, Richmond’s ninth grade bump looks to survive Richmond’s ninth grade surge in dropouts.


If you have a testable hypothesis that explains all this, please do share it with me.

Turning back to the dropout rates, we see that, until the 12th grade and contrary to the state pattern, the rate among Richmond’s economically disadvantaged (“ED”) students is lower than among their more affluent peers (“Not ED”). 


This last set of data doesn’t illuminate the reasons for Richmond’s unusual (and unusually awful) dropout rates but it does suggest where we might start working on the problem: the ninth grade bump.

Attendance. And Not.

The 2018 Superintendent’s Annual Report has the dropout numbers.  Combining those data with the September 30 head count, we see that Richmond is the runaway leader in dropouts.


Those 517 dropouts in Richmond look less appalling when expressed as a percentage because most of the dropouts come in the later grades.  (Richmond’s 4-year cohort dropout rate in 2018 was 20.2%.)

It is perhaps more revealing to compare these percentages.


Or, in terms of a graph,


That Richmond rate is 3.99 times the state average.
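That comparison is simple rate arithmetic: dropouts divided by the September 30 membership, then Richmond’s rate over the state’s.  In a sketch (the 517 dropouts are from the data above; the membership and state figures here are placeholders, not the actual VDOE numbers):

```python
# Dropout rate = dropouts / September 30 membership, as a percentage.
# Only the 517 dropout count comes from the post; the other figures
# below are hypothetical placeholders.

def dropout_rate(dropouts, membership):
    return 100 * dropouts / membership

richmond = dropout_rate(517, 24000)       # hypothetical membership
state = dropout_rate(10000, 1290000)      # hypothetical state totals

print(round(richmond / state, 2))         # ratio of the two rates
```

With the real VDOE counts in place of the placeholders, that ratio is the 3.99 quoted above.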

Our Leaders, however, are shouting “Nolo Problemo!”  Last year our Generous Assembly gutted the mandatory attendance laws; Richmond is celebrating by firing its attendance officers.

Your tax dollars at “work.”

Is Marriage Good for Schools?

Having been enticed by the 2017 census data, I turned from the “female householder, no husband present” data to the “married couple families” numbers.

Notes: The census table does not include data for the two towns that have independent school divisions, Colonial Beach and West Point.  Presumably those data are included in the reports for the surrounding counties; I don’t have a way to correct for that (other than by leaving out the counties, which I didn’t do).  VDOE reports consolidated SOL data for Emporia/Greensville County, Fairfax City/County, and Williamsburg/James City County under the latter jurisdiction in each case; I’ve summed the census data for each pair and reported the joint average.

Let’s start with reading.


Not only did we get a two-digit R-squared, for a change, but one that is robust: Division reading SOL pass rates are strongly associated here (ρ = +0.68) with the percentage of married couple families.
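For a fit against a single predictor, the R-squared is just the square of the correlation coefficient, so ρ = 0.68 corresponds to an R-squared of about 0.46.  A stdlib sketch of the computation, on invented data (these are not the census or SOL numbers):

```python
from math import sqrt

# With one predictor, R-squared = r**2: a correlation of 0.68 means
# about 46% of the variance in pass rates tracks the married-couple
# percentage.  The paired data below are invented for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

pct_married = [45, 50, 55, 60, 65, 70]
pass_rate   = [68, 74, 71, 79, 77, 84]

r = pearson_r(pct_married, pass_rate)
print(round(r, 2), round(r * r, 2))   # correlation and R-squared
```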

Of course, the correlation does not tell us about causation.  Indeed, it is likely that other factors drive both sets of numbers. 

That caveat aside, we again see Richmond (the gold square) underperforming substantially while the peer jurisdictions (the red diamonds, from the left Norfolk, Newport News, and Hampton) are near or above the fitted line.  (As a courtesy to both of my readers, the purple diamond is Lynchburg, the green, Charles City.)

For a closer look at Richmond’s marital neighborhood, here is the same graph, expanded to show only the jurisdictions with <65% married couples.


The fitted line is for the entire dataset.

The math scores tell the same story (with Richmond underperforming even more dramatically).



As with the “female householder” dataset, the married couple SOL data also show suggestive correlations with both the economically disadvantaged (“ED”) and the more affluent (“Not ED”) students’ performance.





(In the second graph of each pair, only the ED points are labeled.  Look directly above those points – same % married – to find the corresponding Not ED points.)

Indeed, the fits to the Not ED group are substantially more robust than those to the ED students, showing that marriage, or the underlying factors that affect both marriage rates and SOL pass rates, has its stronger effects among the Not ED students.

As in the single parent case, these data tell us:

  1. The raw SOL pass rates punish a division for poverty and even more for families without married couples; and
  2. Even with the pass rates corrected for the effects of both larger populations of ED students and lower populations of married couples, the Richmond schools underperform.  Appallingly.

Needed: A Lot of Good Men

We have seen that “economically disadvantaged” (“ED”) students generally underperform their more affluent peers (“Not ED”) on the SOL tests. 

The natural consequence of that is that divisions with more ED students generally deliver lower pass rates.  Less obviously, as the percentage of ED students increases, the pass rates of the Not ED students decrease while those of the ED students are only slightly lowered.  For instance:


There is a lot of scatter in those data.  Indeed, the only robust correlation is in the all-student division average rates, where the effect of increasing numbers of ED students is to be expected.

Looking for other factors that might have more predictive effect, let’s turn to the census data showing numbers of “female householder, no husband present” families in the Virginia jurisdictions.  The latest data there are for 2017.

To start, here are the division average pass rates on the 2017 reading tests v. the no-husband percentages:


Notes: The census data do not include Colonial Beach and West Point, both of which have independent school districts; their census data probably are subsumed in reports for the counties that include those towns.  The gold square on the graph is Richmond; the red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk; the green is Charles City; purple, Lynchburg.

Whew!  This is much more dramatic: The ED graph (2018 data) has a slope of –2.8% per 10% increase in the ED population.  Here (2017 data) we see a slope of nearly –6.4% per 10% increase in the no-husband population.  The fitted line of the ED SOL graph extrapolates to a pass rate of 63.3% at 100% ED; the no-husband graph extrapolates to a pass rate of 25.7% at 100% no-husband households.
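Those extrapolations are just the fitted lines evaluated at 100%.  To check the arithmetic (the intercepts below are back-computed from the stated slopes and 100% endpoints, not read off the regression output):

```python
# Evaluating the fitted lines at 100%.  Intercepts are back-computed
# from the slopes and 100% extrapolations quoted in the text.

def line(intercept, slope_per_point, x):
    return intercept + slope_per_point * x

# ED graph: slope of -2.8 per 10 points = -0.28 per point
ed_at_100 = line(91.3, -0.28, 100)

# No-husband graph: slope of -6.4 per 10 points = -0.64 per point
nh_at_100 = line(89.7, -0.64, 100)

print(round(ed_at_100, 1), round(nh_at_100, 1))  # 63.3 25.7
```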

For a look at the divisions with a tougher job, here is the no-husband graph with only the high percentage divisions shown (the fitted line is for the entire dataset).


The math data tell much the same story.



Take a bow, ghost of Secretary Moynihan.

Taking a further step, here are the reading data broken out for the ED and the Not ED students.


The ED data extrapolate to 29.8% at 100% no-husband households; the Not ED extrapolation gives 50.2%.

Looking just at the >25% no-husband divisions:


Note: The second graph is too busy as it is so I’ve left the labels off the ED series.  You can identify the points there by looking for the labeled Not ED points directly above (i.e., same % no-husband).

Recalling that correlation does not imply causation, we cannot say that the missing husbands are the cause of these effects.  We can say that both ED and Not ED pass rates are generally lower in divisions with larger percentages of female parent homes and that the effect is nearly the same for both ED and Not ED students.  As well, the absence of the husband predicts over a fifth of the variance (statistic-speak for scatter from the fitted line).

Richmond, as usual, is underperforming. 

(Darn!  Another excuse that’s available to RPS only if they’re willing to lie.)

The math data sing a variation on the same theme.



We can draw at least two conclusions:

  1. As with the ED population, the relationship of SOL test results to the number of no-husband households shows that the raw SOL pass rate is a defective standard for measuring division academic performance.  Said more succinctly: The SOL measure punishes a division for poverty and, still more powerfully, for single parenthood in its student base.
  2. Even with the pass rates corrected for effects of the large ED and no-husband populations, Richmond schools underperform.  Atrociously.

What Funding Gap?

The estimable Jim Bacon points to a study of school spending by state.  He concludes: “In Virginia, districts that serve mostly black students spend about $200 more per student on average.”

Well, please recall what Mark Twain said about statistics.

Please also notice that the study was prepared by EdBuild, which advocates for school funding and is funded in part by the Gates Foundation.  Another study (with, obviously, a different viewpoint) asserts that of 2,625 political contributions by staff of Gates grantees, only 6 went to Republicans. 

In this context, we can wonder about the methodology of the EdBuild study: The study compared funding of “nonwhite” districts – those with more than 75% nonwhite students – with “white” districts – those with more than 75% white students.  The study does not explain the basis of the 75% criterion; it does not report the results of choosing other criteria; it does not mention local costs of living.

(Indeed, ± 75% is very close to ± 1.2 standard deviations; if there had been a statistical basis for the study, we might have expected to see a 68% or a 95% criterion.)

Thus, it is hard to know exactly what the study shows, albeit it seems to give Virginia some modest bragging rights.

There is a Virginia data set that can shed some light on the matter.  The Superintendent’s Annual Report for 2017 (the latest available data) provides at Table 13 disbursement data for each division.  I’ve extracted the day school (school operations not including food, adult ed., pre-K, etc.) expenditure per student.  The Fall Membership Report database provides the 2017 enrollments for students of all races and for “white, not of Hispanic origin” students, inter alia.

If we graph the day school expenditure v. the percentage of nonwhite students, we obtain:


Of course, correlation does not imply causation (another thing the EdBuild study does not mention) but the absence of correlation does tell us to look elsewhere for causes.  Here, 93% of the variance (that’s statspeak for “scatter”) comes from factors other than the percentage of nonwhite students.

In any case, the least squares fit offers the same result as an eyeball examination: There’s LOTS of scatter, but the divisions with larger nonwhite populations are not being punished.  So, modest (6.6%) bragging rights.
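The variance bookkeeping here: a least-squares fit splits the total variance into a part the line explains (the R-squared) and a residual part, so an R-squared of 0.066 leaves about 93% to other factors.  A sketch of the computation, on invented expenditure data (not the Table 13 numbers):

```python
# Ordinary least squares with the R-squared computed as
# 1 - SS_residual / SS_total; the remainder is the "scatter"
# attributable to factors other than the predictor.
# The data below are invented for illustration.

def least_squares(xs, ys):
    """Return (intercept, slope, r_squared) for a simple OLS fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return intercept, slope, 1 - ss_res / ss_tot

pct_nonwhite = [5, 20, 35, 50, 65, 80, 95]       # invented
spend_per_k  = [10.2, 12.8, 9.9, 11.5, 13.0, 10.4, 12.1]  # invented, $K

_, _, r2 = least_squares(pct_nonwhite, spend_per_k)
print(f"R^2 = {r2:.3f}; unexplained = {100 * (1 - r2):.1f}%")
```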

Looking at the data we also see that the EdBuild average is being pulled up by the older, urban divisions with large nonwhite populations and with higher costs of living (with the notable exceptions of Petersburg, which has a reputation for being notoriously impoverished, and Sussex, which is decidedly non-urban).


Among those divisions, the R-squared rises to 9%.


At the other end of the spectrum, the low expenditure divisions are mostly rural counties with relatively lower costs of living.


The big spenders here are Bath (38% of the budget is VEPCO money), Highland (43% of budget from property taxes), and Rappahannock (who knows?).

Interesting, perhaps.  It might also be interesting to look at the expenditures corrected for cost of living. 

In any case, no racial funding gap on the expenditures.

With all that said, it remains that, while school finances are important to the teachers and the schools’ bureaucrats, they are irrelevant to student performance among the Virginia divisions, e.g.,

Note: These are total disbursements, not just day school.  Richmond is the gold square; the peer cities Hampton, Newport News, and Norfolk, the red diamonds; Lynchburg, blue; Charles City, green.

Serving the Numbers, Not the Students

With the aid of our General Assembly, Richmond has abandoned its truant students in order to improve its numbers.

We are reminded by a piece on NPR that you can’t teach students who don’t attend school. 

The General Assembly noticed that problem awhile back.  In 1999, they amended Code § 22.1-258 to install requirements for truancy responses:

  • Any unexcused absence: Contact with the parent;
  • 5 unexcused absences: Attendance Plan;
  • 6 unexcused absences: Conference with Parents;
  • 7 unexcused absences: Prosecute parents or file CHINS petition.
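The 1999 escalation ladder reads naturally as a lookup.  A minimal sketch (the function name and response strings are mine; the thresholds are the statute’s):

```python
# The 1999 Code § 22.1-258 truancy-response ladder as a lookup.
# Thresholds follow the statute; the wording of the responses is mine.

def truancy_response(unexcused_absences):
    if unexcused_absences >= 7:
        return "prosecute parents or file CHINS petition"
    if unexcused_absences == 6:
        return "conference with parents"
    if unexcused_absences == 5:
        return "attendance plan"
    if unexcused_absences >= 1:
        return "contact the parent"
    return "no action required"

print(truancy_response(5))
```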

That was massively unpopular with our public school bureaucracy.  The Board of “Education” responded by requiring the divisions to report the number of students for whom a conference was scheduled and the aggregate daily attendance.  Notwithstanding its duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth,” the Board cheerfully ignored the other requirements of the statute.

Richmond followed that lead.  After ten absences they sent the parents a letter.  They did very little else, even as their truancy rate exploded.

In 2018, the Generous Assembly amended § 22.1-258 to gut the enforcement mechanism: Now after ten unexcused absences, the attendance officer may prosecute the parents or file a CHINS petition.  The attendance officer is no longer responsible for the five- and six-absence plans and conferences.

Richmond is responding by firing all its seventeen attendance officers and replacing them with seven “attendance liaisons.”

(The statute still provides that “[w]here no attendance officer is appointed by the School Board, the division superintendent or his designee shall act as attendance officer.”  Presumably these “liaisons” now will be the superintendent-designated attendance officers.)

On the 2017 data, 4,998 Richmond students had ten or more unexcused absences.  That’s 294 per attendance officer. 

Who can think that the truancy situation will improve with “liaisons” who should have, on those data, 714 cases each?  But, of course, those “liaisons” don’t have to actually do anything so we might wonder why we’re paying even for seven.

(BTW: At the old limit of seven, there were 7,234 students with 7 or more unexcused absences in Richmond in 2017.  That would be 1,033 per “liaison” if they were actually dealing with truancy.)
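The caseload arithmetic above is straightforward division of the 2017 truancy counts by the staffing numbers:

```python
# Caseload arithmetic from the 2017 data: truant-student counts
# divided by 17 attendance officers or 7 planned "liaisons".

students_10_plus = 4998   # 10+ unexcused absences, 2017
students_7_plus = 7234    # 7+ unexcused absences, 2017

per_officer = students_10_plus / 17      # old staffing
per_liaison_10 = students_10_plus / 7    # new staffing, 10+ absences
per_liaison_7 = students_7_plus / 7      # new staffing, old 7+ limit

print(round(per_officer), round(per_liaison_10), round(per_liaison_7))
# 294 714 1033
```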

We don’t have to dig far to unearth the reasons for this deliberate disservice to schoolchildren: Students who are not in school can’t be taught.  Students who are truant frequently drop out.  Students who have dropped out cannot lower the SOL pass rates.  Indeed, if the division can get rid of these troublesome children in middle school, they won’t even count against the cohort graduation rate.

This is win/win for the schools and the Board of “Education.”  Never mind those inconvenient children.

Mendacious Excuse, II

Having sentenced myself to read the School Board’s no-longer-secret (but probably still illegal) budget, I moved on from Page 11 and was stopped at page 12 by a further false excuse for Richmond’s lousy (and very expensive) performance:

Special Education Students

Another factor for consideration in educating the students residing in the City of Richmond is that approximately 4,100 or 17.5% of our students qualify for special education services. The graph shown below represents the percentage of special education students benched against state-wide averages and surrounding districts; RPS = 17.5%, state average = 13.0%.


This graph is a step up from the one on p.11 that stopped at 2014: This one goes to 2018, albeit the database continues to 2019. 

As well, this page again calls Norfolk a “surrounding district.” 

More to the point, here is my graph for Richmond and the peer districts.


You may have noticed that the Board’s Richmond numbers and mine agree only for 2016.  Either the database has been heavily amended since the Board pulled its data or the Board has miscalculated.

Still more to the point, this appears to be another official embrace of a “Blame the Students” excuse for the School Board’s own failure. 

It is clear, of course, that disabled students underperform their more abled peers.  On the SOL pass rate, the state average difference ranges from just over thirty to over forty points, depending on the subject.


But Richmond magnifies that effect: Because of the awful schools, Richmond’s students, economically disadvantaged and not, grossly underperform their peers.  For example, on the reading tests Richmond’s disabled students underperformed eight of the ten divisions with larger disabled populations and Richmond’s non-disabled students underperformed all ten of those divisions (Richmond is the enlarged, gold points):


On the math tests, it was nine of ten and, again, all ten.


For the School Board to blame those disabled students for its own costly failures is a shameless lie.

Our Neighbor, Norfolk

Following the kerfuffle over our School Board’s (probably illegal) secret adoption of its 2020 budget, the Board released that budget.

I have a lot of reading to go, but was stopped by this on page 11 of the budget:

Free and Reduced Lunch Population

Free and reduced lunch population is a measure of poverty. As reflected in the Department of Education’s October 31, 2013 report, RPS ranked as the 9th highest free and reduced lunch population in the Commonwealth with 17,351 or over 74.25% of our students receiving subsidized meals under the Federal school lunch program. The graph shown below depicts Richmond’s status as compared to neighboring districts and the state average.


Norfolk is 92 miles away by car.  Some “neighboring district.”

Then we have the dates: The graph stops at 2014.  The “Program Statistics” page on the VDOE Web site has Free/Reduced Lunch data thru 2018-2019.

Even more to the point, VDOE has a more general measure of poverty, “economically disadvantaged.”

Economically Disadvantaged: A flag that identifies students as economically disadvantaged if they meet any one of the following: 1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) identified as either Migrant or experiencing Homelessness.
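That “any one of the following” definition is a simple boolean OR.  As a sketch (the field names here are mine, not VDOE’s):

```python
# The VDOE "economically disadvantaged" flag: a student qualifies by
# meeting ANY one of the four criteria.  Field names are hypothetical.

def is_economically_disadvantaged(student):
    return (student.get("free_reduced_meals", False)
            or student.get("tanf", False)
            or student.get("medicaid", False)
            or student.get("migrant_or_homeless", False))

print(is_economically_disadvantaged({"medicaid": True}))  # True
```

Note that the flag casts a wider net than free/reduced lunch alone, which is why it is the more general poverty measure.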

The VDOE Fall Membership database provides the economically disadvantaged populations through 2018-19 for the state and all divisions.  Here are those data for Richmond and the peer jurisdictions:


Do you suppose the School Board’s graph stopped at 2014 because they were just too lazy to update the graph?  Or, perhaps, because that was the year that showed Richmond’s largest free/reduced percentage?

Still more to the point, this appears to be an official embrace of the School Board’s “Blame the Students” excuse for its own failure. 

It is clear, of course, that economically disadvantaged students underperform their more affluent peers.  On the SOL pass rate, the state average difference is about 20 points, depending on the subject. 


But Richmond magnifies that effect: Because of the awful schools, Richmond’s students, economically disadvantaged and not, grossly underperform their peers.  For example, on the reading tests (Richmond is the enlarged, gold points):


Note added 2/28: That’s %ED in the tested group, not the division.

For the School Board to blame poverty for its own failures is a shameless lie.

It begins to look like reading this budget will be about as much fun as reviewing the performance of Richmond’s schools.