Nonfeasance, n.: Failure to Act

Last year, Richmond had the lowest average reading pass rate and the second lowest math pass rate in the state.  At present only seventeen of forty-five Richmond schools (38%) are fully accredited.

Seeing the handwriting on the wall, Richmond this year “requested” a “division-level academic review.”

§ 22.1-253.13:3.A
                                                      * * *
When the Board of Education determines through the school academic review process that the failure of schools within a division to achieve full accreditation status is related to division-level failure to implement the Standards of Quality or other division-level action or inaction, the Board may require a division-level academic review. After the conduct of such review and within the time specified by the Board of Education, each school board shall submit to the Board for approval a corrective action plan, consistent with criteria established by the Board setting forth specific actions and a schedule designed to ensure that schools within its school division achieve full accreditation status.

On November 17, 2016, the Board of Education approved the request.  The minutes, unfortunately, are silent as to the process going forward.  The agenda item for that meeting is more forthcoming:

A division-level Memorandum of Understanding and Corrective Action Plan are
expected to come before the Virginia Board of Education by June 22, 2017.

“Expected.”  There is no actual deadline here for submission of the Plan, much less for getting it executed, notwithstanding the statutory authorization for a time limit.

Indeed, the Plan is only “expected” the month after graduation this year.  There is no attempt here to have a plan submitted and approved and in effect for the 2018 school year.

But, then, we’ve already seen that the Board of “Education” does not know how to fix a broken school division; they admit it (Sept. 21, 2016 video, starting at 1:48).

Indeed, with our Superintendent leaving in June, it’s hard to know whether there can be any realistic Plan before 2019 in any case.

Never mind all those kids whom the awful Richmond schools are damaging: There’s no reason for hurry here.

More Teachers, Same SOLs

James Weigand raises the question whether the divisions with more teachers per student get better SOL pass rates.

To look at that question, I’ve prepared a juxtaposition of Table 19 and Table 1 from the 2016 Superintendent’s Annual Report with the pass rates from that year.

Table 19 gives us each division’s total “instructional positions,” defined by VDOE as “classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals.”  Table 1 gives the end of year enrollment (known in bureaucratese as average daily membership, ADM).

Here, to start, are the data for the reading tests.


The red diamonds, from the left, are the peer cities Newport News, Hampton, and Norfolk.  The gold square is Richmond; the blue diamond, Lynchburg; the green diamond, Charles City. 

If we fool around with the fitted curve, we can get the R-squared up to 1.2%, which is to say that the SOL performance does not correlate with the number of teachers per kid.
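For the curious, here is a minimal sketch of the R-squared computation behind a fitted line like this one.  The data points below are invented for illustration; the actual figure uses instructional positions (Table 19) per ADM (Table 1) against division pass rates.

```python
# Sketch of the R-squared calculation for a linear fit of pass rate
# vs. teachers per 100 students. Data are made up for illustration.
import numpy as np

teachers_per_100_students = np.array([11.2, 12.5, 13.1, 14.0, 15.3, 16.8])
pass_rate = np.array([78.0, 81.5, 74.0, 80.2, 76.9, 79.5])

slope, intercept = np.polyfit(teachers_per_100_students, pass_rate, 1)
predicted = slope * teachers_per_100_students + intercept
ss_residual = np.sum((pass_rate - predicted) ** 2)
ss_total = np.sum((pass_rate - pass_rate.mean()) ** 2)
r_squared = 1 - ss_residual / ss_total  # near zero => essentially no correlation
```

An R-squared of 1.2% means the fitted line explains about 1.2% of the variation in pass rates, which is to say, almost none of it.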


Here, then, are the data for the remaining subjects and the five-subject average, all with the linear curve fitted.






The closest thing to a correlation is History & Social Science at 6%; all the correlations, such as they are, are negative: Pass rates decrease with increasing numbers of teachers.

At the division level, 2016 SOL performance did not improve with increasing numbers of teachers per student.

Picking on the Little Guy?

Steve Fuhrmann of Charles City County points out that my analysis of division administrative employees may be misleading: CCCo was the only small division in the group I looked at.  My calculation showed Charles City with overall 60.2 administrators per 100 teachers vs. a state average of 54.3.


Steve sent along a spreadsheet that looks at the same data for the eleven divisions with between 500 and 900 students (excluding tiny Highland with only 209).  Those data show CCCo with an administrative staff that is smaller, relative to the teaching staff, than the average for that group:


(It’s comforting that he also got 60.2% for CCCo.)

That said, Steve joins me in wondering why the smaller divisions face larger administrative costs:

The administrative argument has usually been that smaller divisions are required to provide administrative services– especially for maintaining smaller facilities, complying with unfunded state GA mandates and submitting data to VDOE–and do not have large student populations across which to reduce average costs . . .
In comparison with the 11 smaller divisions, comparing administrative overhead with teaching positions, Charles City actually looks pretty good–except for average salaries where we compare poorly if we are to recruit and retain effective teachers for our students . . .  Norton and especially Lexington appear to be outliers–even though student transportation is the only service that can be assumed to be much lower in cities–and Colonial Beach and West Point, also more compact towns, do not show much lower administrative expenditures

Charles City actually has a lower student-to-teacher ratio and a relatively quite low total administrative-to-teacher burden.

(I wish I could figure out a way to actually test the assumption that smaller divisions really need relatively higher administrative service costs–I suspect it has more to do with conventional sharp divisions of labor expertise, rather than overall functional requirements.)

This also leaves open the question why Petersburg hires so many bureaucrats and turns in such lousy results.

Counting Bureaucrats

Table 18 to the 2016 Superintendent’s Annual Report details “Administrative, Service, and Support Personnel Positions by Function” for each division (as well as for Governor’s Schools and other specialized programs).  The items in the report clearly are budget categories but I’ve not been able to locate the definitions.

For the graphs below, I have extracted the total numbers of positions for Richmond, the State, and the peer cities of Hampton, Newport News, and Norfolk, as well as Charles City, Lynchburg, and poor Petersburg.  I’ve expressed those data as percentages of the end of year enrollments (“Average Daily Membership” or “ADM” in bureaucratese) and as percentages of the numbers of teaching positions.

To start: “Administrative.”  Here we see Richmond with fewer administrators per student and per teacher vs. the state average while the peer cities are near that average, but Petersburg and Charles City have lots of bureaucrats.  Fat lot of good that has done for Petersburg.



Next, “Technical and Clerical.”  Here, everybody is close to the average except for Hampton and, again, Petersburg.



Whatever “Instructional Support” may be, the peer cities are beating Richmond’s SOL performance with fewer positions; Richmond is doing poorly with well more than the state average number of positions, and Petersburg . . .



Richmond has a lot of “Other Professional” positions, whatever those are.  Charles City has a lot more.



Next, “Trades, Labor, Operative, and Service” positions.  One might think Richmond would want more of those to deal with its maintenance problems.



Finally, the totals.



Looks like Petersburg is spending a lot of money on non-teachers.  Trouble is, it’s hard to know how Petersburg might usefully redirect some of those salary dollars, given that even the State Board of “Education” doesn’t know how to fix Petersburg’s broken school system.

Richmond’s count is near average and generally in line with (smaller, in some respects, than) the peer cities.

Richmond Dropouts

The Superintendent’s Annual Report is up with, inter alia, the 2016 dropout counts.

Here are the data for Richmond, expressed as percentages of the Fall enrollments.


There is no entry for white males because of the suppression rules.  The white male count could be as high as nine, which would give a rate of 0.80%.

These are percentages of total enrollments, which in Richmond include the relatively larger elementary school numbers.

The more revealing data are the four-year cohort rates that show the dropouts as a percentage of the cohort entering high school four years earlier.  Unfortunately, that report uses a different group classification, except for the total.


As to the total, the cohort rate is 7.2 times the percentage of total enrollment.  That 9.9% is 146 kids, out of the 1,472 student cohort, whom the Richmond Public Schools utterly failed to educate. 
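The cohort arithmetic is easy to check:

```python
# Checking the cohort number quoted above: 146 dropouts out of the
# 1,472-student cohort entering high school four years earlier.
dropouts = 146
cohort = 1472
cohort_rate = 100 * dropouts / cohort  # about 9.9%
```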

That Richmond rate is nearly double the state average and consistently larger than the rates in the peer cities; indeed, the Richmond rate is even larger (by 36%) than poor Petersburg’s.


Richmond’s Underperformance

An earlier post looked at the division average pass rates of Economically Disadvantaged and Disabled students, as well as the All Students averages and the performance of the No Group students who are not members of any disadvantaged group (definitions are here). 

I’ve updated those graphs to show Richmond’s performance in each case.  These are 2016 data from VDOE’s excellent database.

Let’s start with reading:


Richmond is plotted as the square points, at 69.9% Economically Disadvantaged.

Given that the ED group (yellow on the graph) scores some 15 points below the All Students population (blue points), this large ED population in Richmond surely lowers the All Students pass rate.

Aside: The 39% R-Squared (R=0.62) for the All Students fitted line gives some confidence about the correlation of those scores with the percentage of ED students.  If we extrapolate that line to 100% ED, we get a 62.8% pass rate.  The fit of the ED population is tenuous (R-squared = 2.9%) but the extrapolated value of 63.9 is close enough to the extrapolated All Students value to suggest that the lower scoring ED students are a major influence in lowering the All Students average.
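The extrapolation in that aside is simple enough to sketch.  The slope and intercept below are illustrative values consistent with the figures quoted in this post (a fitted line falling roughly 2.8 points for each 10-point rise in %ED), not the exact regression coefficients:

```python
# Extrapolating the All Students fitted line to a hypothetical 100% ED
# division. Slope and intercept are illustrative stand-ins consistent
# with the numbers quoted in the text.
slope = -0.28       # pass-rate points per percentage point of ED
intercept = 90.8    # assumed intercept, chosen to match the quoted endpoint
extrapolated = intercept + slope * 100  # about 62.8
```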

Unfortunately, Richmond’s All Students pass rate is lower than that of all the other divisions with similar ED populations.  Indeed, the Richmond pass rate is lower than that of any other division and well below the All Students fitted line.  This tells us that Richmond’s problem is more than just the large ED population.

Further, the pass rate of Richmond’s Economically Disadvantaged group is third lowest in the state.  That is, Richmond ED students are underperforming their peers statewide.  Ditto the No Group students (green points on the graph).  And those low pass rates surely contribute to Richmond’s bottom-of-the-barrel All Students average.

Richmond’s Disabled performance falls in the bottom half of the divisions.  The divisions’ Disabled performance correlates modestly with the ED population (R-squared = 18%) but only slightly with the Disabled (R-squared = 1.9%).

Turning to the effect of increasing Disabled population: 


Notice that these latter trendlines suggest Disabled and ED pass rates that increase with increasing Disabled populations.  This might raise some question as to the accuracy (or potential for cheating) in the testing of the disabled students.  The R-squared values, however, tell us that the correlations are tiny.

To put some numbers on these groups, here are the Richmond and averaged division average reading pass rates for each group.


Richmond’s No Group students, who are not members of any disadvantaged group, underperform their peers statewide by about eight points; the Economically Disadvantaged students underperform by fourteen points; while the Disabled students also are low by almost eight points.

The math data paint much the same picture.



Here, Richmond’s underperformance is, except for the ED group, even more dramatic.


There are two possible explanations for these data: Either Richmond’s public school students are particularly slow learners or Richmond is doing an inferior job of educating all its students. 

We have the evidence of Carver Elementary School to buttress the inference that Friar Occam would suggest.  Carver serves a particularly tough clientele: The Carver population is drawn from a part of the inner city that includes an RRHA project.  Carver’s students were 89% Economically Disadvantaged in 2016, vs. 69.9% for the division.  Yet Carver breaks the Richmond pattern by turning in outstanding and division-leading results.

The next time our Superintendent starts talking about “challenges,” please listen carefully to hear whether he’s again blaming the students or whether he’s beginning to notice the generally substandard teaching in his school system.

Please Empty and Lock Your Car!

The warmer weather has brought the early Park traffic to our neighborhood so it’s time for the annual Jeremiad about Car Breakins (or, as the Police put it, “Theft from Motor Vehicle”).

First the geography:  In the Police database, the Forest Hill neighborhood runs from the park to Westover Hills Blvd. and from Forest Hill Ave. to the river.

As you can see, this does not include all of the Forest Hill Neighborhood Ass’n. area and does include some of the Westover Hills Ass’n. area.

Microsoft has a nice view of the area.

Here, for a start, is the count of the most frequent unique police reports for the neighborhood from the start of the database, Jan. 2000, through March 2017.


Theft from motor vehicle leads the parade with 27%.  If we counted the broken car windows (most of these thefts are from unlocked cars!), the total would run close to a third.

The Hot Blocks are led by 4200 Riverside, where fully half of the reports are theft from motor vehicle.


The Hot Spot in the 4200 block is the “42d St. Parking Lot” that is located in the park across from 4224 Riverside.


Our crime is seasonal, peaking in the summer with corresponding peaks in the car breakin rate.


The 4200 Riverside block shows much the same pattern.


Notice the considerable decreases after 2005, when Parks started locking the gates to the 42d St. lot at night and in the off season and installed the rocks in the part of the lot that is less visible from the street.

There are two items of Bad News here:

*** Rant Begins ***

By far the most common offense (27% in this time frame) is “Theft from Motor Vehicle.” That crime (and much of the property destruction that reflects broken car windows) would entirely disappear if we (and the people who visit in our neighborhood) would lock our stuff in the trunk or take it into the house. Indeed, our bad habits in this respect – and those of folks visiting the Park — have the unfortunate effect of chumming the neighborhood to attract criminals.

*** Rant Ends ***

The other Bad News is that Parks this year removed those unsightly rocks.  This almost certainly will lead to more car breakins in the lot.


I think it’s time for some signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest and around the Stone House) and, especially, in the 42d St. parking lot to warn the visitors:

Car Breakins Here! Lock your junk in your trunk.

Student Groups and SOLs

It is commonplace to expect that increasing the population of poorer (the official euphemism is “economically disadvantaged”) students lowers the test scores.  That certainly is the case among the Virginia school divisions.

The VDOE database reports the SOL scores for the ED group; also the disabled, limited English proficient (“LEP”), migrant, and homeless groups.  VDOE’s data definitions of the groups are here.

The migrant and homeless populations are relatively small; many divisions report zero or fewer than the ten-student data suppression cutoff.  The LEP populations are larger but VDOE still does not report data for many divisions.  Accordingly, the analysis below deals only with the ED and disabled groups, as well as all students and the students not in any of those defined groups (here, “no group”).  As well, VDOE does not provide a datum for the Lexington disabled enrollment (albeit they do show the pass rates) so I’ve omitted the Lexington data.

Let’s start with the reading pass rates by division, plotted v. the percentage of ED students.


As expected, the division average pass rates for all students (the blue diamonds) decrease with increasing percentage of ED students.  The fitted line shows a decent correlation (R-squared = 39%) and a slope decreasing 2.8% for a 10% increase in the ED population.

In contrast, the scores of the ED students themselves (yellow circles) show less than a third of that rate of decrease and a correlation approaching negligible (R-squared = 2.9%). 

Thus, the decrease in the all students average rate must come predominantly from the increasing proportion of lower-scoring ED students.  Indeed, a little arithmetic shows that the increasing proportion of lower-scoring ED students slightly overestimates the decrease. 


Note: The calculated line here was obtained from the two fitted lines.  Thus, the 80% calculated point was 20% of the All pass rate (blue line) + 80% of the ED rate (yellow line).
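As a sketch of that calculation (the two fitted-line values below are hypothetical stand-ins read off the chart only for illustration):

```python
# Sketch of the "calculated" point described in the note above, at a
# division that is 80% ED. The two line values are hypothetical; the
# real ones come from the blue (All) and yellow (ED) trendlines.
ed_fraction = 0.80
all_line_value = 68.4  # hypothetical blue-line value at 80% ED
ed_line_value = 63.5   # hypothetical yellow-line value at 80% ED
calculated_point = (1 - ed_fraction) * all_line_value + ed_fraction * ed_line_value
```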

The 18% R-squared for the disabled group suggests that the disabled pass rates are related in some measure to the population of ED students.  The VDOE data do not allow us to test the possibility that disabled students are found more often in the ED population.

The No Group population shows about half as much correlation as the disabled group, ca. 11%, but still some effect from increasing poverty. 

Given that the no group students are not members of the ED group (or any other), this (modest) effect of ED population on the no group pass rates cannot come from averaging in the lower ED scores.  We can speculate whether this No Group score decrease is the effect of increasing poverty in a division on the classroom environment, the teachers’ distraction to deal with the larger ED group, or something else.

Overall, these data are consistent with the notion that more poverty in the district will be associated with lower pass rates. 

Turning to the math tests:


This is the same story, told in slightly different numbers.  Increasing poverty lowers all the scores, but the scores of the poor students themselves do not seem to be lowered significantly by increasing the percentage of poor students.  The R-squared of the no group scores, however, rises to 15%.

Next, let’s look at the effect of increasing populations of students with disabilities.


First, notice that the largest disabled population is 21% (Craig County) while the largest ED population was 79% (Northampton County), so the scale here is much shorter.

The fitted lines suggest that the reading scores of the disabled and ED populations increase with increasing populations of disabled students but note the very small R-squared values. 

Of more interest is the behavior of the all student scores with increasing disabled populations.  At 20% disabled, the pass rate would be lowered by about 20% of the difference between the two groups, ca. 20%*40%, which would put the pass rate near 72%.  Using the intercept values for the group scores, the calculation produces 69%, hence the red Estimate line here:


If, instead of the intercepts, we use the fitted lines to calculate the 20% all students score, we still get 71%. 
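The back-of-the-envelope arithmetic runs like this; the 80-point baseline is an assumed round number for illustration, while the 40-point gap comes from the text:

```python
# With roughly a 40-point gap between the non-disabled and disabled
# pass rates, a 20% disabled population should pull the all-students
# average down by about 0.20 * 40 = 8 points.
baseline_rate = 80.0     # assumed non-disabled pass rate (illustrative)
group_gap = 40.0         # approximate gap between the two groups
disabled_share = 0.20
all_students_estimate = baseline_rate - disabled_share * group_gap  # near 72
```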

In short, increasing the disabled population does not decrease the all student scores as much as it should.

The 1.9% R-squared for the disabled data commands caution but the data certainly are consistent with the disabled scores being artificially enhanced in the larger disabled populations, perhaps by the VAAP.  Then, again, it may be that the districts with larger disabled populations have more effective programs to teach those populations.

The math tests show much the same pattern and the R-squared values all run less than 1%.


Here, courtesy of Excel, is the correlation matrix for the reading data (these numbers are R, not R-squared):


And here is the matrix for the math data.


The pass rates correlate fairly strongly with each other (the strongest being the 71% R-squared for the math, All/ED correlation; the weakest being the 26% for the math No Group/Disab. case).  The strong No Group to group correlations suggest that better teaching produces better results, both in the unimpaired and the impaired populations.
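For anyone who wants to build such a matrix without Excel, here is a sketch, with invented pass rates; it also shows the conversion from an R entry to the R-squared values quoted above (an R of about 0.84 squares to roughly 71%):

```python
# Computing a correlation matrix like the Excel one, and squaring an R
# entry to get R-squared. The pass rates are invented for illustration.
import numpy as np

all_students = np.array([82.0, 75.0, 80.0, 68.0, 77.0, 85.0])
ed_students  = np.array([70.0, 62.0, 69.0, 55.0, 64.0, 74.0])
no_group     = np.array([88.0, 83.0, 86.0, 76.0, 84.0, 90.0])

r_matrix = np.corrcoef([all_students, ed_students, no_group])  # entries are R
r_squared_all_ed = r_matrix[0, 1] ** 2  # square R to get R-squared
```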

Richmond Teachers’ Salaries

The estimable Carol Wolf points out that the distribution of teachers’ salaries may be interesting but, to the folks in Richmond, the Richmond salaries vs. those in the nearby and peer jurisdictions will be more interesting.

Here, then, are the latter, again from Table 19 in the 2016 Superintendent’s Annual Report.




The VDOE Table contains the following notes:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.

Does More Money for Schools Help Disadvantaged Students?

We earlier saw that larger expenditures per student did not correlate with better division average SOL pass rates.  The 2016 data for all students support the same conclusion.  For example, reading:


The data here are 2016 division pass rates and per student expenditures for operations.  Richmond is the gold square; from the left the red diamonds are Hampton, Newport News, and Norfolk; the green diamond is Charles City; the blue diamond is Lynchburg.

Let’s drill down to see how the division scores of disabled students vary with expenditures.


Here the fitted line shows the pass rate increasing with expenditure (0.7% per $1,000) but the 1.8% R-squared is only a hint at a correlation.

Let’s produce the same plot for the economically disadvantaged students and students with limited English proficiency.



The LEP analysis is complicated by the several divisions (Charles City among them) with fewer than ten LEP students, which leaves their data concealed from us by VDOE’s suppression rules.  Those divisions are absent from this graph.

Last, we come to those students who are not disabled, ED, or LEP, and not members of the much smaller groups not portrayed above, migrant or homeless.  Those students perform quite well, for the most part.


The nearest approach to a correlation here is the 1.8% R-squared for the disabled group.  This lack of correlation tells us that, absent some unrecognized and perverse other factor, division average performance for each of these groups is unrelated to the division expenditure per student.

The math data tell much the same story.






Here the R-squared rises to 5% for the LEP group and 2% for the ED but the correlation, such as it is, is negative:  More money associated with worse performance.

Data are posted here.

And here in Richmond:  As we saw earlier, by each measure RPS is spending a lot of money and getting poor results.