Counting Bureaucrats

Table 18 of the 2016 Superintendent’s Annual Report details “Administrative, Service, and Support Personnel Positions by Function” for each division (as well as for Governor’s Schools and other specialized programs).  The items in the report clearly are budget categories, but I’ve not been able to locate the definitions.

For the graphs below, I have extracted the total numbers of positions for Richmond, the State, and the peer cities of Hampton, Newport News, and Norfolk, as well as Charles City, Lynchburg, and poor Petersburg.  I’ve expressed those data as percentages of the end-of-year enrollments (“Average Daily Membership” or “ADM” in bureaucratese) and as percentages of the numbers of teaching positions.
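As a sketch, the normalization works like this (the numbers here are hypothetical stand-ins; the real figures come from Table 18 and the ADM report):

```python
# Express a division's position count as a percentage of enrollment
# (ADM) and of teaching positions.  The inputs below are hypothetical.
def positions_as_percentages(positions, adm, teachers):
    pct_of_adm = 100.0 * positions / adm
    pct_of_teachers = 100.0 * positions / teachers
    return pct_of_adm, pct_of_teachers

pct_adm, pct_teachers = positions_as_percentages(
    positions=150, adm=24000, teachers=1800)
# pct_adm is about 0.6% of enrollment; pct_teachers about 8.3% of teachers
```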

To start: “Administrative.”  Here we see Richmond with fewer administrators per student and per teacher than the state average, while the peer cities are near that average; Petersburg and Charles City, in contrast, have lots of bureaucrats.  Fat lot of good that has done for Petersburg.

image

image

Next, “Technical and Clerical.”  Here, everybody is close to the average except for Hampton and, again, Petersburg.

image

image

Whatever “Instructional Support” may be, the peer cities are beating Richmond’s SOL performance with fewer positions; Richmond is doing poorly with well more than the state average number of positions, and Petersburg . . .

image

image

Richmond has a lot of “Other Professional” positions, whatever those are.  Charles City has a lot more.

image

image

Next, “Trades, Labor, Operative, and Service” positions.  One might think Richmond would want more of those to deal with its maintenance problems.

image

image

Finally, the totals.

image

image

Looks like Petersburg is spending a lot of money on non-teachers.  Trouble is, it’s hard to know how Petersburg might usefully redirect some of those salary dollars, given that even the State Board of “Education” doesn’t know how to fix Petersburg’s broken school system.

Richmond’s count is near average and generally in line with (smaller, in some respects, than) the peer cities.

Richmond Dropouts

The Superintendent’s Annual Report is up with, inter alia, the 2016 dropout counts.

Here are the data for Richmond, expressed as percentages of the Fall enrollments.

image

There is no entry for white males because of the suppression rules.  The white male count could be as high as nine, which would give a rate of 0.80%.

These are percentages of total enrollments, which in Richmond include the relatively larger elementary school numbers.

The more revealing data are the four-year cohort rates that show the dropouts as a percentage of the cohort entering high school four years earlier.  Unfortunately, that report uses a different group classification, except for the total.

image

As to the total, the cohort rate is 7.2 times the percentage of total enrollment.  That 9.9% is 146 kids, out of the 1,472 student cohort, whom the Richmond Public Schools utterly failed to educate. 
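For readers who want to check the arithmetic, a minimal sketch:

```python
# Check of the cohort arithmetic above: 146 dropouts from a
# 1,472-student cohort.
dropouts = 146
cohort = 1472
cohort_rate = 100.0 * dropouts / cohort   # about 9.9%

# The "7.2 times" figure implies an enrollment-based rate of roughly:
enrollment_rate = cohort_rate / 7.2       # about 1.4% of total enrollment
```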

That Richmond rate is nearly double the state average and consistently larger than the rates in the peer cities; indeed, the Richmond rate is even larger (by 36%) than poor Petersburg’s.

image

Richmond’s Underperformance

An earlier post looked at the division average pass rates of Economically Disadvantaged and Disabled students, as well as the All Students averages and the performance of the No Group students who are not members of any disadvantaged group (definitions are here). 

I’ve updated those graphs to show Richmond’s performance in each case.  These are 2016 data from VDOE’s excellent database.

Let’s start with reading:

image

Richmond is the square points at 69.9% Economically Disadvantaged. 

Given that the ED group (yellow on the graph) scores some 15 points below the All Students population (blue points), this large ED population in Richmond surely lowers the All Students pass rate.

Aside: The 39% R-squared (R = 0.62) for the All Students fitted line gives some confidence in the correlation of those scores with the percentage of ED students.  If we extrapolate that line to 100% ED, we get a 62.8% pass rate.  The fit of the ED line is tenuous (R-squared = 2.9%), but the extrapolated value of 63.9% is close enough to the extrapolated All Students value to suggest that the lower-scoring ED students are a major influence in lowering the All Students average.
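The extrapolation in that aside is an ordinary least-squares fit evaluated at x = 100; a minimal sketch, with hypothetical division data standing in for the VDOE numbers:

```python
# Fit y = a + b*x by least squares and evaluate the line at x = 100
# (i.e., 100% ED).  The four (ED%, pass rate) points are hypothetical.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

pct_ed = [20, 40, 55, 70]
pass_rate = [85, 80, 76, 72]
a, b = fit_line(pct_ed, pass_rate)
extrapolated = a + b * 100   # pass rate the fitted line predicts at 100% ED
```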

Unfortunately, Richmond’s All Students pass rate is lower than that of all the other divisions with similar ED populations.  Indeed, the Richmond pass rate is lower than that of all the other divisions and well below the All Students fitted line.  This tells us that Richmond’s problem is more than just the large ED population.

Further, the pass rate of Richmond’s Economically Disadvantaged group is third lowest in the state.  That is, Richmond ED students are underperforming their peers statewide.  Ditto the No Group students (green points on the graph).  And those low pass rates surely contribute to Richmond’s bottom-of-the-barrel All Students average.

Richmond’s Disabled performance falls in the bottom half of the divisions.  The divisions’ Disabled performance correlates modestly with the ED population (R-squared = 18%) but only slightly with the Disabled population (R-squared = 1.9%).

Turning to the effect of increasing Disabled population: 

image

Notice that these latter trendlines suggest Disabled and ED pass rates that increase with increasing Disabled populations.  This might raise some question as to the accuracy (or potential for cheating) in the testing of the disabled students.  The R-squared values, however, tell us that the correlations are tiny.

To put some numbers on these groups, here are the Richmond and averaged division average reading pass rates for each group.

image

Richmond’s No Group students, who are not members of any disadvantaged group, underperform their peers statewide by about eight points; the Economically Disadvantaged students underperform by fourteen points; and the Disabled students also are low, by almost eight points.

The math data paint much the same picture.

image

image

Here, Richmond’s underperformance is, except for the ED group, even more dramatic.

image

There are two possible explanations for these data: Either Richmond’s public school students are particularly slow learners or Richmond is doing an inferior job of educating all its students. 

We have the evidence of Carver Elementary School to buttress the inference that Friar Occam would suggest.  Carver serves a particularly tough clientele: The Carver population is drawn from a part of the inner city that includes an RRHA project.  Carver’s students were 89% Economically Disadvantaged in 2016, vs. 69.9% for the division.  Yet Carver breaks the Richmond pattern by turning in outstanding and division-leading results.

The next time our Superintendent starts talking about “challenges,” please listen carefully to hear whether he’s again blaming the students or whether he’s beginning to notice the generally substandard teaching in his school system.

Please Empty and Lock Your Car!

The warmer weather has brought the early Park traffic to our neighborhood so it’s time for the annual Jeremiad about Car Breakins (or, as the Police put it, “Theft from Motor Vehicle”).

First the geography:  In the Police database, the Forest Hill neighborhood runs from the park to Westover Hills Blvd. and from Forest Hill Ave. to the river.

As you can see, this does not include all of the Forest Hill Neighborhood Ass’n. area and does include some of the Westover Hills Ass’n. area.

Microsoft has a nice view of the area.

Here, for a start, is the count of the most frequent unique police reports for the neighborhood from the start of the database (Jan. 2000) through March 2017.

image

Theft from motor vehicle leads the parade with 27%.  If we counted the broken car windows (most of these thefts are from unlocked cars!), the total would run close to a third.

The Hot Blocks are led by 4200 Riverside, where fully half of the reports are theft from motor vehicle.

image

The Hot Spot in the 4200 block is the “42d St. Parking Lot” that is located in the park across from 4224 Riverside.

image

Our crime is seasonal, peaking in the summer with corresponding peaks in the car breakin rate.

image

The 4200 Riverside block shows much the same pattern.

image

Notice the considerable decreases after 2005, when Parks started locking the gates to the 42d St. lot at night and off season and installed the rocks in the part of the lot that is less visible from the street.


There are two items of Bad News here:

*** Rant Begins ***

By far the most common offense (27% in this time frame) is “Theft from Motor Vehicle.” That crime (and much of the property destruction that reflects broken car windows) would entirely disappear if we (and the people who visit in our neighborhood) would lock our stuff in the trunk or take it into the house. Indeed, our bad habits in this respect – and those of folks visiting the Park — have the unfortunate effect of chumming the neighborhood to attract criminals.

*** Rant Ends ***

The other Bad News is that Parks this year removed those unsightly rocks.  This almost certainly will lead to more car breakins in the lot.

image

I think it’s time for some signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest and around the Stone House) and, especially, in the 42d St. parking lot to warn the visitors:

Car Breakins Here! Lock your junk in your trunk.

Student Groups and SOLs

It is commonplace to expect that increasing the population of poorer (the official euphemism is “economically disadvantaged”) students lowers the test scores.  That certainly is the case among the Virginia school divisions.

The VDOE database reports the SOL scores for the ED group; also the disabled, limited English proficient (“LEP”), migrant, and homeless groups.  VDOE’s data definitions of the groups are here.

The migrant and homeless populations are relatively small; many divisions report zero or fewer than the ten-student data suppression cutoff.  The LEP populations are larger but VDOE still does not report data for many divisions.  Accordingly, the analysis below deals only with the ED and disabled groups, as well as all students and the students not in any of those defined groups (here, “no group”).  As well, VDOE does not provide a datum for the Lexington disabled enrollment (albeit they do show the pass rates) so I’ve omitted the Lexington data.

Let’s start with the reading pass rates by division, plotted v. the percentage of ED students.

image

As expected, the division average pass rates for all students (the blue diamonds) decrease with increasing percentage of ED students.  The fitted line shows a decent correlation (R-squared = 39%) and a slope indicating a 2.8% decrease in pass rate for each 10% increase in the ED population.

In contrast, the scores of the ED students themselves (yellow circles) show less than a third of that rate of decrease and a correlation approaching negligible (R-squared = 2.9%). 

Thus, the decrease in the all students average rate must come predominantly from the increasing proportion of lower-scoring ED students.  Indeed, a little arithmetic shows the increasing proportion of lower scoring ED students slightly overestimates the decrease. 

image

Note: The calculated line here was obtained from the two fitted lines.  Thus, the 80% calculated point was 20% of the All pass rate (blue line) + 80% of the ED rate (yellow line).
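A sketch of that blend, with hypothetical intercepts and slopes for the two fitted lines (the real parameters come from the regressions above):

```python
# Blend the two fitted lines in proportion to the ED share, as the
# note describes: at 80% ED, 20% of one line + 80% of the other.
def line(intercept, slope, x):
    return intercept + slope * x

def calculated_point(pct_ed, blue_line, yellow_line):
    w = pct_ed / 100.0
    return (1 - w) * line(*blue_line, pct_ed) + w * line(*yellow_line, pct_ed)

# Hypothetical (intercept, slope per % ED) pairs for the two lines:
pt = calculated_point(80, blue_line=(90.0, -0.28), yellow_line=(68.0, -0.08))
```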

The 18% R-squared for the disabled group suggests that the disabled pass rates are related in some measure to the population of ED students.  The VDOE data do not allow us to test the possibility that disabled students are found more often in the ED population.

The No Group population shows about half as much correlation as the disabled group, ca. 11%, but still some effect from increasing poverty. 

Given that the no group students are not members of the ED group (or any other), this (modest) effect of ED population on the no group pass rates cannot come from averaging in the lower ED scores.  We can speculate whether this No Group score decrease is the effect of increasing poverty in a division on the classroom environment, the teachers’ distraction to deal with the larger ED group, or something else.

Overall, these data are consistent with the notion that more poverty in the district will be associated with lower pass rates. 

Turning to the math tests:

image

This is the same story, told in slightly different numbers.  Increasing poverty lowers all the scores, but the scores of the poor students themselves do not seem to be lowered significantly by increasing the percentage of poor students.  The R-squared of the no group scores, however, rises to 15%.

Next, let’s look at the effect of increasing populations of students with disabilities.

image

First, notice that the largest disabled population is 21% (Craig County) while the largest ED population was 79% (Northampton County), so the scale here is much shorter.

The fitted lines suggest that the reading scores of the disabled and ED populations increase with increasing populations of disabled students but note the very small R-squared values. 

Of more interest is the behavior of the all student scores with increasing disabled populations.  At 20% disabled, the pass rate would be lowered by about 20% of the difference between the two groups, ca. 20%*40%, which would put the pass rate near 72%.  Using the intercept values for the group scores, the calculation produces 69%, hence the red Estimate line here:

image

If, instead of the intercepts, we use the fitted lines to calculate the 20% all students score, we still get 71%. 
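The arithmetic behind that estimate can be sketched as follows (the 80% rate and 40-point gap are the approximate values the paragraph above works with):

```python
# At a 20% disabled share, the all-students rate is pulled down by
# roughly (share of disabled students) * (gap between the two groups).
other_rate = 80.0   # approximate pass rate of the non-disabled group, %
gap = 40.0          # approximate gap between the two groups, points
share = 0.20        # disabled share of the population
estimated_all = other_rate - share * gap   # near 72%
```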

In short, increasing the disabled population does not decrease the all student scores as much as it should.

The 1.9% R-squared for the disabled data commands caution but the data certainly are consistent with the disabled scores being artificially enhanced in the larger disabled populations, perhaps by the VAAP.  Then, again, it may be that the districts with larger disabled populations have more effective programs to teach those populations.

The math tests show much the same pattern and the R-squared values all run less than 1%.

image

Here, courtesy of Excel, is the correlation matrix for the reading data (these numbers are R, not R-squared):

image

And here is the matrix for the math data.

image

The pass rates correlate fairly strongly with each other (the strongest being the 71% R-squared for the math, All/ED correlation; the weakest being the 26% for the math No Group/Disab. case).  The strong No Group to group correlations suggest that better teaching produces better results, both in the unimpaired and the impaired populations.
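A correlation matrix like these can be reproduced from the raw pass rates with numpy (Excel's CORREL gives the same Pearson R values); the rates below are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical division pass rates for three groups.
all_students = [85, 80, 76, 72, 90]
ed = [72, 70, 66, 64, 78]
disabled = [55, 50, 48, 44, 60]

r = np.corrcoef([all_students, ed, disabled])  # Pearson R matrix
r_squared_all_ed = r[0, 1] ** 2                # square R to get R-squared
```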

Richmond Teachers’ Salaries

The estimable Carol Wolf points out that the distribution of teachers’ salaries may be interesting but, to the folks in Richmond, the Richmond salaries vs. those in the nearby and peer jurisdictions will be more interesting.

Here, then, are the latter, again from Table 19 in the 2016 Superintendent’s Annual Report.

image

image

image

The VDOE Table contains the following notes:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.


Does More Money for Schools Help Disadvantaged Students?

We earlier saw that larger expenditures per student did not correlate with better division average SOL pass rates.  The 2016 data for all students support the same conclusion.  For example, reading:

image

The data here are 2016 division pass rates and per student expenditures for operations.  Richmond is the gold square; from the left the red diamonds are Hampton, Newport News, and Norfolk; the green diamond is Charles City; the blue diamond is Lynchburg.

Let’s drill down to see how the division scores of disabled students vary with expenditures.

image

Here the fitted line shows the pass rate increasing with expenditure (0.7% per $1,000) but the 1.8% R-squared is only a hint at a correlation.

Let’s produce the same plot for the economically disadvantaged students and students with limited English proficiency.

image

image

The LEP analysis is complicated by the several divisions (Charles City among them) with fewer than ten LEP students, which leaves their data concealed from us by VDOE’s suppression rules.  Those divisions are absent from this graph.

Last, we come to those students who are not disabled, ED, or LEP, and not members of the much smaller groups not portrayed above, migrant or homeless.  Those students perform quite well, for the most part.

image

The nearest approach to a correlation here is the 1.8% R-squared for the disabled group.  This lack of correlation tells us that, absent some unrecognized and perverse other factor, division average performance for each of these groups is unrelated to the division expenditure per student.

The math data tell much the same story.

image

image

image

image

image

Here the R-squared rises to 5% for the LEP group and 2% for the ED but the correlation, such as it is, is negative:  More money associated with worse performance.

Data are posted here.

And here in Richmond:  As we saw earlier, by each measure RPS is spending a lot of money and getting poor results.

2016 Richmond Teacher Pay

Table 19 is up in the Superintendent’s Annual Report with the 2016 salary averages.

The statewide distribution graphs, below, show the count of divisions paying each salary, in $1,000 increments.  The Richmond average salary is marked in yellow on each graph; the state average is blue.

For elementary principals, Richmond’s $90,531 average was 0.40 standard deviation above the division average of $84,581. 

image

(To read the graph, look across the bottom for the average salary, rounded to the nearest $1,000, and up and down for the number of divisions.  Thus, one division paid $44,000.  Six divisions, one of which was Richmond, paid $91,000.  Four divisions paid the state average, $85,000.)
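The “standard deviations above the average” figures are simple z-scores; a sketch, with the standard deviation back-solved from the reported 0.40 (so treat it as approximate):

```python
# z = (value - mean) / standard deviation
def sds_from_mean(value, mean, sd):
    return (value - mean) / sd

# Elementary principals: Richmond vs. the division average.  The SD of
# $14,875 is inferred from the reported 0.40 figure, not taken from VDOE.
z = sds_from_mean(90531, 84581, 14875)   # about 0.40
```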

For secondary principals, Richmond’s $91,266 average was 0.10 standard deviation below the division average of $93,129.

image

For elementary Assistant Principals, Richmond’s $69,786 average was 0.17 standard deviation above the division average of $67,813.

image

For secondary Assistant Principals, Richmond’s $71,342 average was 0.20 standard deviation below the division average of $73,734.

image

For elementary teachers, Richmond’s $49,100 average was 0.19 standard deviation above the division average of $47,816.

image

For secondary teachers, Richmond’s $51,201 average was 0.08 standard deviation above the division average of $50,563.

image

Looks like we’re underpaying the leaders in our secondary schools.

Some details from the VDOE spreadsheet:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.

Jointly-operated school divisions (Fairfax City and Fairfax County; Emporia and Greensville County; and Williamsburg and James City County) report positions and average salaries on the annual school report of the fiscal agent division only. Fairfax County, Greensville County and Williamsburg are the fiscal agent divisions.

And a further note: The “division averages” reported above are the averages of the division averages in the VDOE spreadsheet.  VDOE reports the statewide averages; those generally are larger than the division averages, doubtless propelled by the large and very expensive NoVa divisions.

Underperformance by Group

We have seen that Richmond’s underperformance reaches the disabled, economically disadvantaged, and limited English proficiency subgroups.  The VDOE Web site provides data that allow a more fine-grained look at that problem.

To start, here is the distribution of 2016 SOL pass rates by division on the reading tests.  Here, and below, I’ve yellowed the bar at Richmond’s pass rate.

image

The disabled population shows a much different distribution, with Richmond again underperforming, but not so dramatically.

image

The economically disadvantaged group also shows lowered overall performance but not so much as the disabled group.  Richmond does not shine.

image

Next, the LEP group.

image

Note: Twenty-nine divisions have fewer than ten LEP students taking the reading tests.  VDOE does not report pass rates in those cases so those divisions are not included in this graph; similarly, twenty-six divisions are not reported in the LEP math graph below.  VDOE reports seven divisions with LEP averages of zero on both the reading and math tests; those also do not show on the graphs.

Finally, we have the students who are not in any VDOE group: Not disabled, not ED, not LEP, and not in the very small migrant and homeless groups not analyzed above.  Those no-group students test well on average.

image

But even this high-performing group performs badly in Richmond.   

(This despite RPS taking credit for the Maggie Walker students who do not attend any Richmond Public School.)

The math data paint much the same picture.

image

image

image

image

image

There you have it:  On average, Richmond can’t teach reading or math to any disadvantaged group or even to the high-performing, not-disadvantaged group.

But notice that Carver Elementary School, with a student population drawn from a part of the inner city that includes an RRHA project, breaks the pattern by turning in outstanding and division-leading results.

It’s not the students that are the problem here, folks.  It’s the schools.

Moreover, we get to wonder whether Richmond’s relatively better scores from its disabled population suggest that Richmond’s past abuse of its students to cheat on the SOLs continues in some measure.

More on Bang per Buck

To follow up on the previous post, I’ve calculated a five-subject bang/buck value for each division: (pass rate)*100,000/(expenditure per student). 

image

The green bar is West Point; the red bars from the left are Hampton, Newport News, and Norfolk; the blue is Lynchburg (thanks, James); and the yellow is Richmond.
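The figure of merit is simple enough to state as code (whether the pass rate goes in as 78 or 0.78 only rescales every division's value uniformly):

```python
# Bang per buck: (pass rate) * 100,000 / (expenditure per student).
def bang_per_buck(pass_rate, expenditure_per_student):
    return pass_rate * 100_000 / expenditure_per_student

# A hypothetical division passing 78% at $11,000 per student:
score = bang_per_buck(78, 11_000)   # roughly 709
```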

From those data, here are the top ten, bottom ten, average, and selected division values.

image

The ten at the bottom are the eight Big Spenders

image

and the two low scorers, Petersburg and Richmond.