Student Groups and SOLs

It is commonplace to expect that a larger population of poorer (the official euphemism is “economically disadvantaged,” or “ED”) students will lower a division’s test scores.  That certainly is the case among the Virginia school divisions.

The VDOE database reports SOL scores for the ED group, as well as for the disabled, limited English proficient (“LEP”), migrant, and homeless groups.  VDOE’s data definitions of the groups are here.

The migrant and homeless populations are relatively small; many divisions report zero students or fall below the ten-student data suppression cutoff.  The LEP populations are larger, but VDOE still does not report data for many divisions.  Accordingly, the analysis below deals only with the ED and disabled groups, along with all students and the students not in any of those defined groups (here, “no group”).  As well, VDOE does not provide a figure for the Lexington disabled enrollment (although it does show the pass rates), so I’ve omitted the Lexington data.

Let’s start with the reading pass rates by division, plotted vs. the percentage of ED students.

image

As expected, the division average pass rates for all students (the blue diamonds) decrease with an increasing percentage of ED students.  The fitted line shows a decent correlation (R-squared = 39%) and a slope amounting to a 2.8-point decrease in the pass rate for each 10-point increase in the ED population.

In contrast, the scores of the ED students themselves (yellow circles) show less than a third of that rate of decrease, with a nearly negligible correlation (R-squared = 2.9%).
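
For readers who want to check such fits outside Excel, here is a minimal sketch of the calculation in Python.  The four data points are hypothetical placeholders, not the actual VDOE division data.

    # Minimal sketch: least-squares fit and R-squared for pass rate vs. % ED.
    # The sample points are hypothetical, not the actual VDOE data.
    import numpy as np

    pct_ed = np.array([20.0, 40.0, 55.0, 70.0])      # % ED students, by division
    pass_rate = np.array([85.0, 79.0, 76.0, 71.0])   # reading pass rate, %

    slope, intercept = np.polyfit(pct_ed, pass_rate, 1)

    predicted = slope * pct_ed + intercept
    ss_res = np.sum((pass_rate - predicted) ** 2)
    ss_tot = np.sum((pass_rate - np.mean(pass_rate)) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    # A slope of -0.28 corresponds to a 2.8-point drop per 10-point ED increase.
    print(f"slope = {slope:.2f} points per % ED; R-squared = {r_squared:.1%}")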

Thus, the decrease in the all-students average rate must come predominantly from the increasing proportion of lower-scoring ED students.  Indeed, a little arithmetic shows that mixing in the increasing proportion of lower-scoring ED students slightly overestimates the decrease.

image

Note: The calculated line here was obtained from the two fitted lines.  Thus, the calculated point at 80% ED was 20% of the All pass rate (blue line) plus 80% of the ED rate (yellow line).
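
In code, the calculation described in that note looks like this; the line coefficients below are placeholders, not the actual Excel fits.

    # Sketch of the "calculated" line described in the note above.
    # The coefficients are placeholders, not the actual Excel fits.
    def all_fit(pct_ed):   # blue line: All-students fitted pass rate
        return 91.0 - 0.28 * pct_ed

    def ed_fit(pct_ed):    # yellow line: ED fitted pass rate
        return 74.0 - 0.08 * pct_ed

    def calculated(pct_ed):
        # Weight the two fitted rates by the group shares; at 80% ED this
        # is 20% of the All rate plus 80% of the ED rate, as in the note.
        w = pct_ed / 100.0
        return (1.0 - w) * all_fit(pct_ed) + w * ed_fit(pct_ed)

    print(f"calculated point at 80% ED: {calculated(80.0):.1f}%")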

The 18% R-squared for the disabled group suggests that the disabled pass rates are related in some measure to the population of ED students.  The VDOE data do not allow us to test the possibility that disabled students are found more often in the ED population.

The No Group population shows about half the correlation of the disabled group, ca. 11% R-squared, but still some effect from increasing poverty.

Given that the no group students are not members of the ED group (or any other), this (modest) effect of ED population on the no group pass rates cannot come from averaging in the lower ED scores.  We can speculate whether this No Group score decrease reflects the effect of increasing poverty on the classroom environment, the distraction of teachers dealing with the larger ED group, or something else.

Overall, these data are consistent with the notion that more poverty in a division will be associated with lower pass rates.

Turning to the math tests:

image

This is the same story, told in slightly different numbers.  Increasing poverty lowers all the scores, but the scores of the poor students themselves do not seem to be lowered significantly by increasing the percentage of poor students.  The R-squared of the no group scores, however, rises to 15%.

Next, let’s look at the effect of increasing populations of students with disabilities.

image

First, notice that the largest disabled population is 21% (Craig County) while the largest ED population was 79% (Northampton County), so the scale here is much shorter.

The fitted lines suggest that the reading scores of the disabled and ED populations increase with increasing populations of disabled students, but note the very small R-squared values.

Of more interest is the behavior of the all-students scores with increasing disabled populations.  At 20% disabled, the all-students pass rate would be lowered by about 20% of the difference between the disabled and non-disabled rates, ca. 20% × 40 points, i.e., about 8 points, which would put the pass rate near 72%.  Using the intercept values for the group scores, the calculation produces 69%, hence the red Estimate line here:

image

If, instead of the intercepts, we use the fitted lines to calculate the 20% all students score, we still get 71%. 
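
For the record, here is that arithmetic as a sketch; the intercepts are placeholders chosen to match the roughly 40-point gap described above, not the actual fitted values.

    # Sketch of the 20%-disabled estimate.  The intercepts are placeholders,
    # not the actual fitted values.
    b_nondisabled = 80.0   # intercept of the non-disabled line, %
    b_disabled = 40.0      # intercept of the Disabled line, %

    p = 0.20               # 20% disabled population
    estimate = (1 - p) * b_nondisabled + p * b_disabled
    # 0.8 * 80 + 0.2 * 40 = 72.0; the actual intercepts produce the 69% above.
    print(f"estimated all-students rate at 20% disabled: {estimate:.1f}%")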

In short, increasing the disabled population does not decrease the all-students scores as much as the arithmetic says it should.

The 1.9% R-squared for the disabled data counsels caution, but the data certainly are consistent with the disabled scores being artificially enhanced in the larger disabled populations, perhaps by the VAAP.  Then again, it may be that the divisions with larger disabled populations have more effective programs to teach those populations.

The math tests show much the same pattern, and the R-squared values all run less than 1%.

image

Here, courtesy of Excel, is the correlation matrix for the reading data (these numbers are R, not R-squared):

image

And here is the matrix for the math data.

image

The pass rates correlate fairly strongly with each other (the strongest being the 71% R-squared for the math All/ED correlation; the weakest, the 26% for the math No Group/Disabled case).  The strong No Group-to-group correlations suggest that better teaching produces better results in both the unimpaired and the impaired populations.
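
For anyone who would rather not do this in Excel, a correlation matrix is a one-liner in Python; the data frame here is a hypothetical stand-in for the division pass rates.

    # Sketch: correlation matrix (R) and the squares (R-squared).
    # The data are hypothetical stand-ins for the division pass rates.
    import pandas as pd

    rates = pd.DataFrame({
        "All":      [85.0, 79.0, 76.0, 71.0, 68.0],
        "ED":       [74.0, 71.0, 70.0, 66.0, 63.0],
        "Disabled": [52.0, 48.0, 50.0, 43.0, 40.0],
        "No Group": [90.0, 86.0, 85.0, 81.0, 79.0],
    })

    r = rates.corr()      # Pearson R, as in Excel's CORREL
    r_squared = r ** 2    # square to compare with the R-squared values above
    print(r_squared.round(2))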

Richmond Teachers’ Salaries

The estimable Carol Wolf points out that the statewide distribution of teachers’ salaries may be interesting but, to the folks in Richmond, a comparison of the Richmond salaries with those in the nearby and peer jurisdictions will be more interesting.

Here, then, are the latter, again from Table 19 in the 2016 Superintendent’s Annual Report.

image

image

image

The VDOE Table contains the following notes:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.


Does More Money for Schools Help Disadvantaged Students?

We earlier saw that larger expenditures per student did not correlate with better division average SOL pass rates.  The 2016 data for all students support the same conclusion.  For example, reading:

image

The data here are 2016 division pass rates and per student expenditures for operations.  Richmond is the gold square; from the left the red diamonds are Hampton, Newport News, and Norfolk; the green diamond is Charles City; the blue diamond is Lynchburg.

Let’s drill down to see how the division scores of disabled students vary with expenditures.

image

Here the fitted line shows the pass rate increasing with expenditure (0.7% per $1,000), but the 1.8% R-squared is no more than a hint of a correlation.

Let’s produce the same plot for the economically disadvantaged students and students with limited English proficiency.

image

image

The LEP analysis is complicated by the several divisions (Charles City among them) with fewer than ten LEP students, which leaves their data concealed from us by VDOE’s suppression rules.  Those divisions are absent from this graph.
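
The practical effect is a filter on the data before plotting; here is a sketch (the column names and counts are hypothetical, not the actual VDOE layout):

    # Sketch: drop divisions whose LEP counts fall under VDOE's ten-student
    # suppression cutoff.  Column names and counts are hypothetical.
    import pandas as pd

    divisions = pd.DataFrame({
        "division":  ["Charles City", "Henrico", "Richmond"],
        "lep_count": [4, 1250, 980],        # hypothetical counts
        "lep_rate":  [None, 71.0, 58.0],    # pass rate suppressed when < 10
    })

    reportable = divisions[divisions["lep_count"] >= 10]
    print(reportable)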

Last, we come to those students who are not disabled, ED, or LEP, and not members of the much smaller groups not portrayed above, migrant and homeless.  Those students perform quite well, for the most part.

image

The nearest approach to a correlation here is the 1.8% R-squared for the disabled group.  This lack of correlation tells us that, absent some unrecognized and perverse other factor, division average performance for each of these groups is unrelated to the division expenditure per student.

The math data tell much the same story.

image

image

image

image

image

Here the R-squared rises to 5% for the LEP group and 2% for the ED, but the correlation, such as it is, is negative:  more money is associated with worse performance.

Data are posted here.

And here in Richmond:  As we saw earlier, by each measure RPS is spending a lot of money and getting poor results.

2016 Richmond Teacher Pay

Table 19 is up in the Superintendent’s Annual Report with the 2016 salary averages.

The statewide distribution graphs, below, show the count of divisions paying each average salary, in $1,000 increments.  The Richmond average salary is marked in yellow on each graph; the division average (see the note at the end of this post) is blue.

For elementary principals, Richmond’s $90,531 average was 0.40 standard deviation above the division average of $84,581. 

image

(To read the graph, look across the bottom for the average salary, rounded to the nearest $1,000, and up and down for the number of divisions.  Thus, one division paid $44,000.  Six divisions, one of which was Richmond, paid $91,000.  Four divisions paid the division average, $85,000.)
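
Both calculations, the $1,000 binning and the standard-deviation comparison, are simple; here is a sketch in which the salary list is hypothetical (only Richmond’s $90,531 comes from Table 19).

    # Sketch: bin division-average salaries to the nearest $1,000 and measure
    # Richmond's distance from the average in standard deviations.
    # The salary list is hypothetical; only Richmond's figure is from Table 19.
    import numpy as np

    salaries = np.array([44000, 68500, 77250, 84581, 90531, 91200, 98750])
    richmond = 90531.0

    bins = np.round(salaries / 1000.0) * 1000   # nearest-$1,000 histogram bins
    z = (richmond - salaries.mean()) / salaries.std(ddof=1)

    print(f"bins: {bins.astype(int)}")
    print(f"Richmond: {z:+.2f} standard deviations from the average")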

For secondary principals, Richmond’s $91,266 average was 0.10 standard deviation below the division average of $93,129.

image

For elementary assistant principals, Richmond’s $69,786 average was 0.17 standard deviation above the division average of $67,813.

image

For secondary assistant principals, Richmond’s $71,342 average was 0.20 standard deviation below the division average of $73,734.

image

For elementary teachers, Richmond’s $49,100 average was 0.19 standard deviation above the division average of $47,816.

image

For secondary teachers, Richmond’s $51,201 average was 0.08 standard deviation above the division average of $50,563.

image

Looks like we’re underpaying the leaders in our secondary schools.

Some details from the VDOE spreadsheet:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.

Jointly-operated school divisions (Fairfax City and Fairfax County; Emporia and Greensville County; and Williamsburg and James City County) report positions and average salaries on the annual school report of the fiscal agent division only. Fairfax County, Greensville County and Williamsburg are the fiscal agent divisions.

And a further note: The “division averages” reported above are the averages of the division averages in the VDOE spreadsheet.  VDOE also reports statewide averages; those generally are larger than the division averages, doubtless propelled by the large and very expensive NoVa divisions.
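
The distinction matters; a sketch with hypothetical figures shows how a large division pulls the weighted (statewide) average away from the simple average of division averages.

    # Sketch: average of division averages vs. enrollment-weighted statewide
    # average.  All figures are hypothetical.
    import numpy as np

    division_avg_salary = np.array([47000.0, 49000.0, 62000.0])
    teachers = np.array([400, 900, 12000])   # the third is a big NoVa division

    simple = division_avg_salary.mean()
    weighted = np.average(division_avg_salary, weights=teachers)

    # The big, expensive division pulls the weighted figure up.
    print(f"average of division averages: ${simple:,.0f}")
    print(f"statewide (weighted) average: ${weighted:,.0f}")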

Underperformance by Group

We have seen that Richmond’s underperformance reaches the disabled, economically disadvantaged, and limited English proficiency subgroups.  The VDOE Web site provides data that allow a more fine-grained look at that problem.

To start, here is the distribution of 2016 SOL pass rates by division on the reading tests.  Here, and below, I’ve yellowed the bar at Richmond’s pass rate.

image

The disabled population shows a much different distribution, with Richmond again underperforming, but not so dramatically.

image

The economically disadvantaged group also shows lowered overall performance but not so much as the disabled group.  Richmond does not shine.

image

Next, the LEP group.

image

Note: Twenty-nine divisions have fewer than ten LEP students taking the reading tests.  VDOE does not report pass rates in those cases so those divisions are not included in this graph; similarly, twenty-six divisions are not reported in the LEP math graph below.  VDOE reports seven divisions with LEP averages of zero on both the reading and math tests; those also do not show on the graphs.

Finally, we have the students who are not in any VDOE group: Not disabled, not ED, not LEP, and not in the very small migrant and homeless groups not analyzed above.  Those no-group students test well on average.

image

But even this high-performing group performs badly in Richmond.   

(This despite RPS taking credit for the Maggie Walker students who do not attend any Richmond Public School.)

The math data paint much the same picture.

image

image

image

image

image

There you have it:  On average, Richmond can’t teach reading or math to any disadvantaged group or even to the high-performing, not-disadvantaged group.

But notice that Carver Elementary School, with a student population drawn from a part of the inner city that includes an RRHA project, breaks the pattern by turning in outstanding and division-leading results.

It’s not the students that are the problem here, folks.  It’s the schools.

Moreover, we get to wonder whether Richmond’s relatively better scores from its disabled population suggest that Richmond’s past abuse of its students to cheat on the SOLs continues in some measure.

More on Bang per Buck

To follow up on the previous post, I’ve calculated a five-subject bang/buck value for each division: (pass rate)*100,000/(expenditure per student). 
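
In code, the calculation is a one-liner; the sample values below are the state averages and the West Point figures from the “Teaching, Not Treasure” post below.

    # Sketch of the bang/buck calculation:
    # (pass rate) * 100,000 / (expenditure per student).
    def bang_per_buck(pass_rate_pct, expenditure_per_student):
        return pass_rate_pct * 100_000 / expenditure_per_student

    print(bang_per_buck(77.0, 11745))   # state averages: ~655.6
    print(bang_per_buck(94.0, 11893))   # West Point: ~790.4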

image

The green bar is West Point; the red bars from the left are Hampton, Newport News, and Norfolk; the blue is Lynchburg (thanks, James); and the yellow is Richmond.

From those data, here are the top ten, bottom ten, average, and selected division values.

image

The ten at the bottom are the eight Big Spenders

image

and the two low scorers, Petersburg and Richmond.

Teaching, Not Treasure

Jim Weigand writes to say that Table 15 is up for the 2016 school year.

That table provides, inter alia, the divisions’ expenditures for operations per student.

The footnotes to the Table tell us:

Operations include regular day school, school food services, summer school, adult education, and other education, but do not include pre-kindergarten, non-regular day school programs, non-local education agency (LEA) programs, debt service, or capital outlay additions. Non-LEA programs include expenditures made by a school division for state-operated education programs (in hospitals, clinics, and detention homes) that are located within the school division and reimbursed with state funds.

State revenues for regional Alternative Education programs and Academic Governor’s Schools are allocated to divisions according to participation, rather than as paid to the fiscal agents for these programs.

The Average Daily Membership (ADM) calculated at the end of the school year includes the ADM of pupils served in the school division and the ADM of resident pupils for whom tuition is paid to another school division, regional special education program, or private school. It excludes Head Start, pre-kindergarten, junior kindergarten students, and students for whom the division receives tuition payments from another division or entity (i.e., out-of-state school division, Comprehensive Services Act, Interstate Compact Agreement).

At the same time, the helpful front end to the SOL database gives us the division average pass rates.

Uniting these two datasets gives a Bang per Buck picture. 

Let’s start with the reading data.

image

Richmond is the gold square.  The red diamonds are, from the left, the peer cities, Hampton, Newport News, and Norfolk.  The outperforming division, there at 94% and only $11,893, is West Point.  The state averages are $11,745 and 77%.

The Big Spenders are:

image

Excel is glad to provide a linear fit to the data and to announce that there is no correlation between the division pass rate and the expenditure per student.

Here are the graphs for the other subjects and the five subject average.  Notice that the scale of the ordinate changes from graph to graph.

image

image

image

image

image

Note: In a couple of cases, the Newport News point was hiding behind the datum for another division; in those cases I colored the covering point.

Looking at just the four peer cities (Hampton, Newport News, and Norfolk in red from the left, and Richmond in yellow) and the state average (in blue), we get:

image

image

image

image

image

image

Two obvious conclusions:

  • By division, more money does not correlate with better SOL performance.
  • Richmond is spending a lot and teaching very little.


—————————————–

I attempted to post the data table; the software choked (sigh!).  If you’d like a copy, email john [at] crankytaxpayer [dot] org.

$500,000: Price of VAAP Cheating?

A Patrick County jury this week awarded $500K to former Stuart Elementary principal Muriel Waldron. 

As the newspaper reports, the case involves alleged false statements in Waldron’s performance evaluation.  The meat of the case, however, was Waldron’s claim that she was removed as principal because she refused to cram kids into the VAAP (a SOL alternative for students with “significant cognitive disabilities”) to boost the school’s SOL scores.

The good taxpayers of Patrick County now can look forward to the possibility of (federal?) lawsuits by the parents of the kids who might claim their children were misclassified in order to cheat on the SOLs.

We have some data that may speak to the situation.  Here are the reading and math pass rates for the Patrick County elementary schools and the state by year.  According to the paper, Ms. Waldron was removed in 2015.  Note the remarkable improvements at Stuart in 2016 and at some other schools a year earlier.

image

image

Note: The school rates are for all grades; the state rates are averages by grade, which should be quite close to the average by students for the three grades.

Unfortunately, there is no indication that the State Department of Education (read: of Superintendent Protection) has done or will do anything to prevent this kind of abuse.  For example, we know that a Superintendent who admitted to packing handicapped students into a different test in order “to assist schools in obtaining accreditation” was merely required to write a “corrective action plan.”

Your tax dollars at “work.”

Truancy Postscript

One last look at the truancy data:

Last year’s 6-absence conference data from RPS for the elementary and middle schools don’t look to be useful; they contain too many reports that are obviously bogus.  The high school numbers, in contrast, are so large that they might even be accurate.

So, let’s look for a relationship between the high schools’ SOL performance and those conference numbers. 

Below, I’ve plotted the pass rates vs. the number of conferences expressed as a percentage of the fall enrollment (“ADM” or “Average Daily Membership”).  I’ve omitted the selective schools, Community, Franklin, and Open.

image

It’s reasonable to expect the SOL performance to decrease with increasing unexcused absences, and both datasets meet that expectation.  Indeed, the correlation is nontrivial for the math tests and fairly robust for the reading.
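
A sketch of the underlying calculation, with hypothetical school figures standing in for the RPS data:

    # Sketch: conferences as a percentage of fall membership, and the
    # correlation with pass rates.  School figures are hypothetical.
    import numpy as np

    adm = np.array([650, 900, 1100, 1300])         # fall membership (ADM)
    conferences = np.array([470, 420, 300, 260])   # 6-absence conferences
    reading = np.array([52.0, 58.0, 66.0, 70.0])   # reading pass rate, %

    conf_rate = 100.0 * conferences / adm
    r = np.corrcoef(conf_rate, reading)[0, 1]
    print(f"conference rate vs. reading: R = {r:.2f}, R-squared = {r**2:.1%}")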

Of course, correlation does not imply causation.  But these data (1) make sense, and (2) suggest that the 6-absence conference counts from these schools might be believable.

Here are the data:

image

One further inference:  Armstrong has only one attendance officer assigned to it and reports 602 conferences, i.e., 3.34 for each of the 180 days in a nominal school year.  If that attendance officer actually scheduled 602 conferences and had the parents and student show up for some number of them, and also prepared at least 602 of the prerequisite 5-absence attendance plans, we’ll have to wonder about the level of preparation.  As well, it makes sense that he wouldn’t have had time to take more than a few of the 7-absence cases to court.

But, then, the shortage of truancy officers, and the decreasing truancy budget, make it clear that Richmond’s gross violations of the truancy statute are deliberate.

That’s about as far as these data can take us.  It surely would be fine if RPS were more forthcoming (and if VBOE were actually doing its job of enforcing the attendance laws).

—————-

Notes for the interested reader:

  • Jim Bacon points out the UVa study that reports a 19.7% chronic absenteeism rate (defined in the report as ≥ 10% of school days) in Richmond in 2015.  Their data show the rate decreasing from first to fifth grade and then rising steeply through the later grades.  Their data also show chronically absent students underperforming their (chronically present?) peers considerably on the SOLs.
  • A VDOE Web page headlines the requirements of the new truancy regulation adopted by VBOE last June, although VDOE won’t collect the (badly needed) data until next year.


RPS Shooting Itself in the Truancy Foot

The ever helpful Ms. Lewis of RPS sent me the list of attendance officers and their assignments from 2016.

It turns out RPS had only eighteen of them to serve 47 schools and to deal with the 7,288 cases that state law required be taken to court.

For a more specific look at the problem, let’s look at the top of the list:

image

The assignments appear to be designed to spread the worst parts of the load.  In each case here, for example, the attendance officer had one easier assignment and one absolute horror.  Thus, Ms. Ponton had Fairfield Court, which reported 6-absence conferences for 21% of its students last year, and Woodville, which reported 43%.  Mr. Barnes, poor soul, had Bellevue, with too few to report, and Armstrong, with 72% (!).

These eighteen attendance officers managed to take only 226 cases to court (of the 7,288 required by law).  That’s only 12.6 cases per attendance officer (of the 405 required). 

But it could well be those folks were focused on the 5-absence attendance plans and 6-absence conferences that are prerequisite to the 7-absence court filing.  RPS data show 10,381 students with five unexcused absences that year, which comes to 577 plans required per attendance officer, and 8,502 students with six absences, which requires 472 conferences per officer.
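
The per-officer arithmetic, using the counts above:

    # Per-officer workload arithmetic; all counts are from the RPS data above.
    officers = 18
    court_required = 7288    # 7-absence cases required to be taken to court
    court_taken = 226
    plans = 10381            # students with five unexcused absences
    conferences = 8502       # students with six unexcused absences

    print(f"court cases required per officer: {court_required / officers:.0f}")  # ~405
    print(f"court cases taken per officer: {court_taken / officers:.1f}")        # ~12.6
    print(f"5-absence plans per officer: {plans / officers:.0f}")                # ~577
    print(f"6-absence conferences per officer: {conferences / officers:.0f}")    # ~472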

On these data, we can’t tell what kind of job these attendance officers are doing.  We can tell, however, that there are far too few of them.

The 2017 budget (pdf)  (the latest on the RPS Web page) exudes indifference to the truancy problem and to the state law on the subject.  Here, from that document, are three years’ allocations for “attendance services,” in millions of dollars:

image

Notice the decrease after 2016’s lawless debacle, outlined above and earlier.

For 2017, the budget shows 43 employees in the attendance services category; the attendance officers were to be paid $17.11 per hour. 

The statute provides that “[w]here no attendance officer is appointed by the school board, the division superintendent or his designee shall act as attendance officer.”  I read that to say that the Superintendent is individually responsible for Richmond’s gross violations of the truancy law.

Perhaps RPS could spend some of the $27 million they were wasting in the instructional program in 2015 to hire many more attendance officers.