Who Gets the $

Table 14 in the Superintendent’s Annual Report provides division totals for the 2016 distribution of $6.29 billion in state funds.  Table 1 lists the end of year enrollment (aka Average Daily Membership, “ADM”) for each division.

VDOE dispenses the funds on a per-ADM basis, but I have so far not found the formula for calculating that number.  The statutory categories are here.

In any case, we have the totals.

Here is a short list of divisions showing the total funds and funds per student for each.

image

Note: The average there is by division; the average by student (total funds/total ADM) is $5,065.  This already tells us that the (much more numerous) smaller divisions are getting more money per kid on average.  See below.
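To make the distinction between those two averages concrete, here is a minimal sketch of the arithmetic.  The division names are real but the dollar and ADM figures are placeholders, not the actual Table 14 and Table 1 values; only the calculation is the point.

```python
# Sketch: per-division average vs. per-student (ADM-weighted) average.
# The funding and ADM figures below are placeholders; the real analysis
# uses every division from Table 14 (state funds) and Table 1 (ADM).
funds = {"Richmond": 1.0e8, "Lynchburg": 4.0e7, "Charles City": 5.0e6}
adm = {"Richmond": 22000, "Lynchburg": 8000, "Charles City": 700}

per_division = sum(funds[d] / adm[d] for d in funds) / len(funds)
per_student = sum(funds.values()) / sum(adm.values())

print(f"Average of division rates: ${per_division:,.0f} per student")
print(f"Total funds / total ADM:   ${per_student:,.0f} per student")
# When the small divisions get more per kid, the first number exceeds the second.
```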

Next, the distribution of funding per student, in $100 increments. 

image

The inserted colors represent the locations of the divisions listed above; Richmond is gold; Lynchburg, violet; Charles City, green.  The red, from the left, are Richmond’s peer jurisdictions: Hampton, Norfolk, and Newport News.

Turning to the funds per student as a function of ADM, we get the following:

image

Same color codes as above.  The peers, from the left, are Hampton, Newport News, and Norfolk.

The fitted line has a slope of -$159 per 10,000 increase in ADM.  The R-squared of 6.9% tells us that funding correlates with ADM to some degree, but other factors predominate.
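For anyone who wants to reproduce the fit outside Excel, here is a minimal numpy sketch; the arrays are placeholders, not the actual ADM and funding numbers.

```python
import numpy as np

# Placeholders: ADM and state funds per student, one pair per division.
# The real values come from Table 1 and Table 14.
adm = np.array([700, 8000, 22000, 30000, 180000], dtype=float)
funds_per_student = np.array([7400, 5600, 5000, 4800, 2900], dtype=float)

slope, intercept = np.polyfit(adm, funds_per_student, 1)
predicted = slope * adm + intercept
ss_res = np.sum((funds_per_student - predicted) ** 2)
ss_tot = np.sum((funds_per_student - funds_per_student.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(f"Slope: ${slope * 10000:,.0f} per 10,000 ADM; R-squared: {r_squared:.1%}")
```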

Expanding the x-axis of that graph, we get:

image

That cluster of small divisions above the line suggests that smallness indeed helps to get more money; the smaller cluster below confirms that size is far from the only influence on the funding.

The two Big Winners here are Charlotte County, $9,036 per student, and Brunswick County, $8,500.  The Big Losers are Falls Church, $2,500, and Arlington, $2,600.

Here is the complete list.

image

Elementary, My Dear Weigand

In an earlier post, I said that James Weigand had asked about teachers per student and SOLs.  In fact, he asked about that in elementary school.

Table 17a in the 2016 Superintendent’s Annual Report gives us the pupil/teacher ratios by division for grades K-7.  That range does not match the tested grades at either end: grades 6 & 7 are middle school these days, and the SOL testing begins only in grade 3.  Thus the pupil/teacher data and the pass rates cover different grade ranges.

As well, the VDOE database will give the pass rate for each grade but not an average over grades 3-7; the best we can do is take the rates of each grade and average those.
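For what it’s worth, that averaging is just an unweighted mean of the grade-level rates; here is a minimal sketch, with placeholder pass rates standing in for one division’s grade 3-7 numbers from the VDOE database.

```python
# Unweighted average of grade-level pass rates for one division.
# VDOE reports each grade separately; without per-grade tester counts,
# a simple mean of the grade rates is the best available summary.
grade_pass_rates = {3: 68.0, 4: 71.0, 5: 70.0, 6: 66.0, 7: 69.0}  # placeholder values

division_average = sum(grade_pass_rates.values()) / len(grade_pass_rates)
print(f"Grades 3-7 average pass rate: {division_average:.1f}%")
```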

With those limitations, here are the reading data for the last year with available data, 2016.

image

Richmond is the gold square; Charles City, the green diamond; Lynchburg the blue diamond.  The red diamonds are our peer cities, Newport News, Hampton, and Norfolk, from the left.

Excel is happy to fit a line to the data but the R-squared tells us there is no correlation.

The math data paint much the same picture.

image

Given the constraints on the data, it would be rash to draw any conclusion beyond the absence of a correlation in these particular datasets.

Except, of course, for the obvious: Richmond by any measure is racing for last place.  With a nearly average number of teachers per student (Richmond is at 8.0%; the average is 8.2%), we get awful pass rates.

Nonfeasance, n.: Failure to Act

Last year, Richmond had the lowest average reading pass rate and the second lowest math pass rate in the state.  At present only seventeen of forty-five Richmond schools (38%) are fully accredited.

Seeing the handwriting on the wall, Richmond this year “requested” a “division-level academic review”:

§ 22.1-253.13:3.A
                                                      * * *
When the Board of Education determines through the school academic review process that the failure of schools within a division to achieve full accreditation status is related to division-level failure to implement the Standards of Quality or other division-level action or inaction, the Board may require a division-level academic review. After the conduct of such review and within the time specified by the Board of Education, each school board shall submit to the Board for approval a corrective action plan, consistent with criteria established by the Board setting forth specific actions and a schedule designed to ensure that schools within its school division achieve full accreditation status.

On November 17, 2016, the Board of Education approved the request.  The minute is unfortunately silent as to the process going forward.  The agenda item for that meeting is more forthcoming:

A division-level Memorandum of Understanding and Corrective Action Plan are
expected to come before the Virginia Board of Education by June 22, 2017.

“Expected.”  There is no actual deadline here for submission of the Plan, much less for getting it executed, notwithstanding the statutory authorization for a time limit.

Indeed, the Plan is only “expected” the month after graduation this year.  There is no attempt here to have a plan submitted and approved and in effect for the 2018 school year.

But, then, we’ve already seen that the Board of “Education” does not know how to fix a broken school division; they admit it (Sept. 21, 2016 video starting at 1:48).

Indeed, with our Superintendent leaving in June, it’s hard to know whether there can be any realistic Plan before 2019 in any case.

Never mind all those kids whom the awful Richmond schools are damaging: There’s no reason for hurry here.

More Teachers, Same SOLs

James Weigand raises the question whether the divisions with more teachers per student get better SOL pass rates.

To look at that question, I’ve prepared a juxtaposition of Table 19 and Table 1 from the 2016 Superintendent’s Annual Report with the pass rates from that year.

Table 19 gives us each division’s total “instructional positions,” defined by VDOE as “classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals.”  Table 1 gives the end of year enrollment (known in bureaucratese as average daily membership, ADM).

Here, to start, are the data for the reading tests.

image

The red diamonds, from the left, are the peer cities Newport News, Hampton, and Norfolk.  The gold square is Richmond; the blue diamond, Lynchburg; the green diamond, Charles City. 

If we fool around with the fitted curve, we can get the R-squared up to 1.2%, which is to say that the SOL performance does not correlate with the number of teachers per kid.

image
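For the curious, that “fooling around” can be reproduced outside Excel: fit polynomials of increasing degree and compute the R-squared for each.  The arrays below are placeholders, not the actual Table 19/Table 1 ratios or division pass rates.

```python
import numpy as np

# Placeholders: instructional positions per student (x) and reading pass
# rate (y), one pair per division.
x = np.array([0.075, 0.080, 0.082, 0.090, 0.095, 0.110])
y = np.array([78.0, 70.0, 80.0, 76.0, 82.0, 75.0])

def r_squared(x, y, degree):
    """Fit a polynomial of the given degree and return its R-squared."""
    coeffs = np.polyfit(x, y, degree)
    fitted = np.polyval(coeffs, x)
    ss_res = np.sum((y - fitted) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

for degree in (1, 2, 3):
    print(f"degree {degree}: R-squared = {r_squared(x, y, degree):.1%}")
```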

Here, then, are the data for the remaining subjects and the five-subject average, all with the linear curve fitted.

image

image

image

image

image

The closest thing to a correlation is History & Social Science at 6%; all the correlations, such as they are, are negative: pass rates decrease with increasing numbers of teachers per student.

At the division level, 2016 SOL performance did not improve with increasing numbers of teachers per student.

Picking on the Little Guy?

Steve Fuhrmann of Charles City County points out that my analysis of division administrative employees may be misleading: CCCo was the only small division in the group I looked at.  My calculation showed Charles City with an overall 60.2 administrators per 100 teachers vs. a state average of 54.3.

clip_image001

Steve sent along a spreadsheet that looks at the same data for the eleven divisions with between 500 and 900 students (excluding tiny Highland with only 209).  Those data show CCCo with a smaller administrative staff, relative to the teaching staff, than the average of that group:

image

(It’s comforting that he also got 60.2% for CCCo.)
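The ratio itself is simple arithmetic; here is a minimal sketch of the calculation, with placeholder counts standing in for the Table 18 and Table 19 position totals.

```python
# Administrative positions per 100 teachers for one division.
# The counts are placeholders, not the actual Table 18/Table 19 figures.
admin_positions = 45.0
teaching_positions = 75.0

per_100_teachers = 100 * admin_positions / teaching_positions
print(f"{per_100_teachers:.1f} administrative positions per 100 teachers")
```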

That said, Steve joins me in wondering why the smaller divisions face larger administrative costs:

The administrative argument has usually been that smaller divisions are required to provide administrative services– especially for maintaining smaller facilities, complying with unfunded state GA mandates and submitting data to VDOE–and do not have large student populations across which to reduce average costs . . .
In comparison with the 11 smaller divisions, comparing administrative overhead with teaching positions, Charles City actually looks pretty good–except for average salaries where we compare poorly if we are to recruit and retain effective teachers for our students . . .  Norton and especially Lexington appear to be outliers–even though student transportation is the only service that can be assumed to be much lower in cities–and Colonial Beach and West Point, also more compact towns, do not show much lower administrative expenditures


Charles City actually has a lower student-to-teacher ratio and a relatively quite low total administrative-to-teacher burden.


(I wish I could figure out a way to actually test the assumption that smaller divisions really need relatively higher administrative service costs–I suspect it has more to do with conventional sharp divisions of labor expertise, rather than overall functional requirements.)

This also leaves open the question why Petersburg hires so many bureaucrats and turns in such lousy results.

Counting Bureaucrats

Table 18 to the 2016 Superintendent’s Annual Report details “Administrative, Service, and Support Personnel Positions by Function” for each division (as well as for Governor’s Schools and other specialized programs).  The items in the report clearly are budget categories but I’ve not been able to locate the definitions.

For the graphs below, I have extracted the total numbers of positions for Richmond, the State, and the peer cities of Hampton, Newport News, and Norfolk, as well as Charles City, Lynchburg, and poor Petersburg.  I’ve expressed those data as percentages of the end of year enrollments (“Average Daily Membership” or “ADM” in bureaucratese) and as percentages of the numbers of teaching positions.
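Those percentages are simple ratios; here is a minimal sketch of the calculation, with placeholder counts standing in for one division’s Table 18, Table 19, and Table 1 values.

```python
# Placeholder counts for one division; the real numbers come from
# Table 18 (administrative/support positions), Table 19 (instructional
# positions), and Table 1 (end-of-year ADM).
positions = 180.0   # e.g., "Administrative" positions for the division
teachers = 1900.0   # instructional positions
adm = 23000.0       # end-of-year enrollment

pct_of_adm = 100 * positions / adm
pct_of_teachers = 100 * positions / teachers
print(f"{pct_of_adm:.2f}% of ADM, {pct_of_teachers:.1f}% of teaching positions")
```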

To start: “Administrative.”  Here we see Richmond with fewer administrators per student and per teacher vs. the state average while the peer cities are near that average, but Petersburg and Charles City have lots of bureaucrats.  Fat lot of good that has done for Petersburg.

image

image

Next, “Technical and Clerical.”  Here, everybody is close to the average except for Hampton and, again, Petersburg.

image

image

Whatever “Instructional Support” may be, the peer cities are beating Richmond’s SOL performance with fewer positions; Richmond is doing poorly with well more than the state average number of positions, and Petersburg . . .

image

image

Richmond has a lot of “Other Professional” positions, whatever those are.  Charles City has a lot more.

image

image

Next, “Trades, Labor, Operative, and Service” positions.  One might think Richmond would want more of those to deal with its maintenance problems.

image

image

Finally, the totals.

image

image

Looks like Petersburg is spending a lot of money on non-teachers.  Trouble is, it’s hard to know how Petersburg might usefully redirect some of those salary dollars, given that even the State Board of “Education” doesn’t know how to fix Petersburg’s broken school system.

Richmond’s count is near average and generally in line with (smaller, in some respects, than) the peer cities.

Richmond Dropouts

The Superintendent’s Annual Report is up with, inter alia, the 2016 dropout counts.

Here are the data for Richmond, expressed as percentages of the Fall enrollments.

image

There is no entry for white males because of the suppression rules.  The white male count could be as high as nine, which would give a rate of 0.80%.

These are percentages of total enrollments, which in Richmond include the relatively large elementary school numbers.

The more revealing data are the four-year cohort rates that show the dropouts as a percentage of the cohort entering high school four years earlier.  Unfortunately, that report uses a different group classification, except for the total.

image

As to the total, the cohort rate is 7.2 times the percentage of total enrollment.  That 9.9% is 146 kids, out of the 1,472-student cohort, whom the Richmond Public Schools utterly failed to educate.
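For anyone who wants to check the arithmetic, a minimal sketch; the enrollment-based rate is simply backed out of the 7.2x ratio rather than taken from the chart.

```python
# Richmond 2016 four-year cohort dropout arithmetic (figures from the post).
cohort_dropouts = 146
cohort_size = 1472
cohort_rate = 100 * cohort_dropouts / cohort_size     # ~9.9%

ratio = 7.2                                           # cohort rate vs. enrollment-based rate
implied_enrollment_rate = cohort_rate / ratio         # back-of-envelope check
print(f"Cohort rate: {cohort_rate:.1f}%")
print(f"Implied enrollment-based rate: {implied_enrollment_rate:.2f}%")
```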

That Richmond rate is nearly double the state average and consistently larger than the rates in the peer cities; indeed, the Richmond rate is even larger (by 36%) than poor Petersburg’s.

image

Richmond’s Underperformance

An earlier post looked at the division average pass rates of Economically Disadvantaged and Disabled students, as well as the All Students averages and the performance of the No Group students who are not members of any disadvantaged group (definitions are here). 

I’ve updated those graphs to show Richmond’s performance in each case.  These are 2016 data from VDOE’s excellent database.

Let’s start with reading:

image

Richmond is the square points at 69.9% Economically Disadvantaged. 

Given that the ED group (yellow on the graph) scores some 15 points below the All Students population (blue points), this large ED population in Richmond surely lowers the All Students pass rate.

Aside: The 39% R-Squared (R=0.62) for the All Students fitted line gives some confidence about the correlation of those scores with the percentage of ED students.  If we extrapolate that line to 100% ED, we get a 62.8% pass rate.  The fit of the ED population is tenuous (R-squared = 2.9%) but the extrapolated value of 63.9 is close enough to the extrapolated All Students value to suggest that the lower scoring ED students are a major influence in lowering the All Students average.
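A minimal sketch of that extrapolation, assuming the division ED percentages and All Students pass rates sit in two arrays; the values shown are placeholders, not the actual VDOE numbers.

```python
import numpy as np

# Placeholders: percent ED (x) and All Students reading pass rate (y),
# one pair per division; the real values come from the VDOE database.
pct_ed = np.array([15.0, 30.0, 45.0, 55.0, 70.0, 79.0])
pass_rate = np.array([86.0, 82.0, 79.0, 77.0, 72.0, 70.0])

slope, intercept = np.polyfit(pct_ed, pass_rate, 1)
at_100_pct_ed = slope * 100 + intercept
print(f"Extrapolated All Students pass rate at 100% ED: {at_100_pct_ed:.1f}%")
```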

Unfortunately, Richmond’s All Students pass rate is lower than those of all the other divisions with similar ED populations.  Indeed, the Richmond pass rate is lower than those of all the other divisions and well below the All Students fitted line.  This tells us that Richmond’s problem is more than just the large ED population.

Further, the pass rate of Richmond’s Economically Disadvantaged group is third lowest in the state.  That is, Richmond ED students are underperforming their peers statewide.  Ditto the No Group students (green points on the graph).  And those low pass rates surely contribute to Richmond’s bottom-of-the-barrel All Students average.

Richmond’s Disabled performance falls in the bottom half of the divisions.  The divisions’ Disabled performance correlates modestly with the ED population (R-squared = 18%) but only slightly with the Disabled population (R-squared = 1.9%).

Turning to the effect of increasing Disabled population: 

image

Notice that these latter trendlines suggest Disabled and ED pass rates that increase with increasing Disabled populations.  This might raise some question as to the accuracy (or potential for cheating) in the testing of the disabled students.  The R-squared values, however, tell us that the correlations are tiny.

To put some numbers on these groups, here are the Richmond and averaged division average reading pass rates for each group.

image

Richmond’s No Group students, who are not members of any disadvantaged group, underperform their peers statewide by about eight points; the Economically Disadvantaged students underperform by fourteen points; and the Disabled students also are low by almost eight points.

The math data paint much the same picture.

image

image

Here, Richmond’s underperformance is, except for the ED group, even more dramatic.

image

There are two possible explanations for these data: Either Richmond’s public school students are particularly slow learners or Richmond is doing an inferior job of educating all its students. 

We have the evidence of Carver Elementary School to buttress the inference that Friar Occam would suggest.  Carver serves a particularly tough clientele: The Carver population is drawn from a part of the inner city that includes an RRHA project.  Carver’s students were 89% Economically Disadvantaged in 2016, vs. 69.9% for the division.  Yet Carver breaks the Richmond pattern by turning in outstanding and division-leading results.

The next time our Superintendent starts talking about “challenges,” please listen carefully to hear whether he’s again blaming the students or whether he’s beginning to notice the generally substandard teaching in his school system.

Please Empty and Lock Your Car!

The warmer weather has brought the early Park traffic to our neighborhood so it’s time for the annual Jeremiad about Car Breakins (or, as the Police put it, “Theft from Motor Vehicle”).

First the geography:  In the Police database, the Forest Hill neighborhood runs from the park to Westover Hills Blvd. and from Forest Hill Ave. to the river.

As you can see, this does not include all of the Forest Hill Neighborhood Ass’n. area and does include some of the Westover Hills Ass’n. area.

Microsoft has a nice view of the area.

Here, for a start, are the counts of the most frequent police report categories for the neighborhood from the start of the database, Jan. 2000, through March 2017.

image

Theft from motor vehicle leads the parade with 27%.  If we counted the broken car windows (most of these thefts are from unlocked cars!), the total would run close to a third.
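For anyone who downloads the Police database export, that tally is a few lines of pandas.  The column name `offense` and the records shown are my assumptions about the export’s layout, not the actual data.

```python
import pandas as pd

# Placeholder records; a real run would load the Police database export,
# e.g. pd.read_csv("police_reports.csv"), filtered to the Forest Hill
# neighborhood and the Jan. 2000 - March 2017 date range.
reports = pd.DataFrame({"offense": [
    "Theft from Motor Vehicle", "Destruction of Property",
    "Theft from Motor Vehicle", "Burglary", "Theft from Motor Vehicle",
]})

counts = reports["offense"].value_counts()
shares = (100 * counts / len(reports)).round(1)
print(pd.DataFrame({"count": counts, "percent": shares}))
```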

The Hot Blocks are led by 4200 Riverside, where fully half of the reports are theft from motor vehicle.

image

The Hot Spot in the 4200 block is the “42d St. Parking Lot” that is located in the park across from 4224 Riverside.

image

Our crime is seasonal, peaking in the summer with corresponding peaks in the car breakin rate.

image

The 4200 Riverside block shows much the same pattern.

image

Notice the considerable decreases after 2005, when Parks started locking the gates to the 42d St. lot at night and off season and installed the rocks in the part of the lot that is less visible from the street.


There are two items of Bad News here:

*** Rant Begins ***

By far the most common offense (27% in this time frame) is “Theft from Motor Vehicle.” That crime (and much of the property destruction that reflects broken car windows) would entirely disappear if we (and the people who visit in our neighborhood) would lock our stuff in the trunk or take it into the house. Indeed, our bad habits in this respect – and those of folks visiting the Park — have the unfortunate effect of chumming the neighborhood to attract criminals.

*** Rant Ends ***

The other Bad News is that Parks this year removed those unsightly rocks.  This almost certainly will lead to more car breakins in the lot.

image

I think it’s time for some signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest and around the Stone House) and, especially, in the 42d St. parking lot to warn the visitors:

Car Breakins Here! Lock your junk in your trunk.

Student Groups and SOLs

It is commonplace to expect that increasing the population of poorer (the official euphemism is “economically disadvantaged”) students lowers the test scores.  That certainly is the case among the Virginia school divisions.

The VDOE database reports the SOL scores for the ED group; also the disabled, limited English proficient (“LEP”), migrant, and homeless groups.  VDOE’s data definitions of the groups are here.

The migrant and homeless populations are relatively small; many divisions report zero or fewer than the ten-student data suppression cutoff.  The LEP populations are larger but VDOE still does not report data for many divisions.  Accordingly, the analysis below deals only with the ED and disabled groups, as well as all students and the students not in any of those defined groups (here, “no group”).  As well, VDOE does not provide a datum for the Lexington disabled enrollment (albeit they do show the pass rates) so I’ve omitted the Lexington data.

Let’s start with the reading pass rates by division, plotted v. the percentage of ED students.

image

As expected, the division average pass rates for all students (the blue diamonds) decrease with increasing percentage of ED students.  The fitted line shows a decent correlation (R-squared = 39%) and a slope of about -2.8 points of pass rate per 10-point increase in the ED percentage.

In contrast, the scores of the ED students themselves (yellow circles) show less than a third of that rate of decrease and a correlation approaching negligible (R-squared = 2.9%). 

Thus, the decrease in the all students average rate must come predominantly from the increasing proportion of lower-scoring ED students.  Indeed, a little arithmetic shows that the increasing proportion of lower-scoring ED students slightly overestimates the decrease.

image

Note: The calculated line here was obtained from the two fitted lines.  Thus, the 80% calculated point was 20% of the All pass rate (blue line) + 80% of the ED rate (yellow line).
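A minimal sketch of that calculation, using the two fitted lines; the slopes and intercepts below are placeholders, not the actual Excel fit parameters.

```python
# Placeholder fit parameters; the real slopes/intercepts come from the
# Excel trendlines for the All Students (blue) and ED (yellow) series.
all_slope, all_intercept = -0.28, 90.0   # pass rate vs. % ED
ed_slope, ed_intercept = -0.08, 72.0

def calculated_rate(pct_ed):
    """Mixture estimate described in the note: at p% ED, weight the
    All Students line by (100 - p)% and the ED line by p%."""
    all_rate = all_slope * pct_ed + all_intercept
    ed_rate = ed_slope * pct_ed + ed_intercept
    return ((100 - pct_ed) * all_rate + pct_ed * ed_rate) / 100

for p in (20, 50, 80):
    print(f"{p}% ED: calculated pass rate {calculated_rate(p):.1f}%")
```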

The 18% R-squared for the disabled group suggests that the disabled pass rates are related in some measure to the population of ED students.  The VDOE data do not allow us to test the possibility that disabled students are found more often in the ED population.

The No Group population shows about half as much correlation as the disabled group, ca. 11%, but still some effect from increasing poverty. 

Given that the no group students are not members of the ED group (or any other), this (modest) effect of ED population on the no group pass rates cannot come from averaging in the lower ED scores.  We can speculate whether this No Group score decrease is the effect of increasing poverty in a division on the classroom environment, the teachers’ distraction to deal with the larger ED group, or something else.

Overall, these data are consistent with the notion that more poverty in the district will be associated with lower pass rates. 

Turning to the math tests:

image

This is the same story, told in slightly different numbers.  Increasing poverty lowers all the scores, but the scores of the poor students themselves do not seem to be lowered significantly by increasing the percentage of poor students.  The R-squared of the no group scores, however, rises to 15%.

Next, let’s look at the effect of increasing populations of students with disabilities.

image

First, notice that the largest disabled population is 21% (Craig County) while the largest ED population was 79% (Northampton County), so the scale here is much shorter.

The fitted lines suggest that the reading scores of the disabled and ED populations increase with increasing populations of disabled students but note the very small R-squared values. 

Of more interest is the behavior of the all student scores with increasing disabled populations.  At 20% disabled, the pass rate would be lowered by about 20% of the difference between the two groups’ rates, ca. 20% of a roughly 40-point gap, or about eight points, which would put the pass rate near 72%.  Using the intercept values for the group scores, the calculation produces 69%, hence the red Estimate line here:

image

If, instead of the intercepts, we use the fitted lines to calculate the 20% all students score, we still get 71%. 
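Here is a minimal sketch of both versions of that estimate (intercepts vs. fitted lines).  The fit parameters are placeholders, and treating the All Students line as the stand-in for the non-disabled score is my assumption; the actual trendline parameters would reproduce the 69% and 71% figures above.

```python
# Placeholder fit parameters for pass rate vs. % disabled; the real values
# come from the Excel trendlines for the All Students and Disabled series.
all_slope, all_intercept = 0.05, 79.0
disab_slope, disab_intercept = 0.30, 38.0

pct_disabled = 20.0

# Version 1: use the intercepts as the group scores.
est_intercepts = ((100 - pct_disabled) * all_intercept
                  + pct_disabled * disab_intercept) / 100

# Version 2: read both group scores off the fitted lines at 20% disabled.
all_at_20 = all_slope * pct_disabled + all_intercept
disab_at_20 = disab_slope * pct_disabled + disab_intercept
est_fitted = ((100 - pct_disabled) * all_at_20
              + pct_disabled * disab_at_20) / 100

print(f"Estimate from intercepts:   {est_intercepts:.0f}%")
print(f"Estimate from fitted lines: {est_fitted:.0f}%")
```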

In short, increasing the disabled population does not decrease the all student scores as much as it should.

The 1.9% R-squared for the disabled data commands caution, but the data certainly are consistent with the disabled scores being artificially enhanced in the larger disabled populations, perhaps by the VAAP.  Then again, it may be that the districts with larger disabled populations have more effective programs to teach those populations.

The math tests show much the same pattern and the R-squared values all run less than 1%.

image

Here, courtesy of Excel, is the correlation matrix for the reading data (these numbers are R, not R-squared):

clip_image001

And here is the matrix for the math data.

image
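The same matrices can be reproduced outside Excel; here is a minimal pandas sketch, with placeholder pass rates standing in for the four series from the VDOE database.

```python
import pandas as pd

# Placeholder pass rates, one row per division; the real columns hold the
# All Students, ED, Disabled, and No Group rates from the VDOE database.
df = pd.DataFrame({
    "All":      [86.0, 82.0, 79.0, 77.0, 72.0, 70.0],
    "ED":       [74.0, 71.0, 70.0, 68.0, 67.0, 66.0],
    "Disabled": [48.0, 45.0, 44.0, 40.0, 38.0, 36.0],
    "No Group": [90.0, 87.0, 85.0, 84.0, 80.0, 79.0],
})

corr = df.corr()             # Pearson R, as in the Excel matrices above
print(corr.round(2))
print((corr ** 2).round(2))  # square to get R-squared values like those quoted below
```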

The pass rates correlate fairly strongly with each other (the strongest being the 71% R-squared for the math, All/ED correlation; the weakest being the 26% for the math No Group/Disab. case).  The strong No Group to group correlations suggest that better teaching produces better results, both in the unimpaired and the impaired populations.