Lies, Damn Lies, and Poverty

Our former Superintendent was in the job from January 2014 to June 2017.  Under his supervision, Richmond’s (awful) SOL pass rates peaked in 2015 and then declined.

image

While he was with us, the Super joined in the popular sport of blaming funding and – to the point here – blaming the poorer students for Richmond’s gross underperformance.

I’ve pointed out elsewhere that RPS is wasting something like $55 million per year to obtain appalling pass rates.  They are fibbing about the need for more money (aside from the need to fix the buildings they have been neglecting for decades).

Today I want to look further into the effect of our large population of poorer (“economically disadvantaged”) students.

For sure, economically disadvantaged (“ED”) students underperform their more affluent peers by about 20 points.  For example, on the state average data:

image

Thus, increasing the percentage of ED students can be expected to lower the average pass rates. 
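For the arithmetic-minded, here is that dilution effect in a few lines of Python.  The 60% and 80% pass rates are round numbers chosen to illustrate the roughly 20-point gap, not actual VDOE data:

```python
# Sketch of how a larger ED share dilutes a division's average pass rate.
# The 60/80 pass rates are illustrative round numbers, not VDOE data;
# they reflect the roughly 20-point ED/non-ED gap discussed above.

def division_average(pct_ed, ed_rate=60.0, non_ed_rate=80.0):
    """Weighted average pass rate for a division where pct_ed percent of testers are ED."""
    f = pct_ed / 100.0
    return f * ed_rate + (1 - f) * non_ed_rate

for pct in (20, 40, 60, 80):
    print(f"{pct}% ED -> division average {division_average(pct):.0f}%")
```

On those assumed rates, every ten-point increase in the ED share knocks two points off the division average, with no change at all in how either group actually performs.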

The VDOE data permit a look beyond this dilution of the average pass rate to see if there is an effect of ED enrollments on the performance of each group.

To begin, here are the 2017 3d grade reading pass rates by division for the ED and non-ED students, plotted vs. the percentage of ED students taking the test at that grade level in the division.

image

(BTW: VDOE has a new front end to the SOL database.  It’s a bit easier to use than the old system but also slow.)

Note: Data are omitted here for the divisions with one or more missing data points (usually because of the [docx] suppression rule that applies for groups of fewer than ten students).

Here, as the ED share of the tested group in a division increases, the scores drop somewhat for both the ED and non-ED groups, but the correlations are minuscule.
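For anybody who wants to reproduce these fits, here is a minimal sketch of the least-squares slope and R-squared calculation.  The four data points are invented placeholders; the real pass rates and %ED values come from the VDOE SOL database:

```python
# Sketch of the least-squares fit and R-squared computed for each group
# in the plots above.  The four data points are invented placeholders;
# the real pass rates and %ED come from the VDOE SOL database.
pct_ed    = [25.0, 40.0, 55.0, 70.0]   # % ED among testers, by division
pass_rate = [78.0, 74.0, 75.0, 70.0]   # pass rate for one group, by division

n      = len(pct_ed)
mean_x = sum(pct_ed) / n
mean_y = sum(pass_rate) / n

# Slope and intercept of the fitted line
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(pct_ed, pass_rate))
sxx = sum((x - mean_x) ** 2 for x in pct_ed)
slope     = sxy / sxx
intercept = mean_y - slope * mean_x

# R-squared: fraction of the variance explained by the fit
ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(pct_ed, pass_rate))
ss_tot = sum((y - mean_y) ** 2 for y in pass_rate)
r_squared = 1 - ss_res / ss_tot

print(f"slope = {slope:.3f} points per %ED, R-squared = {r_squared:.3f}")
```

A tiny R-squared, as in the graphs above, says the line explains almost none of the division-to-division variation, whatever direction its slope happens to point.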

We also see Richmond (the square points) underperforming considerably.

The data for the other two elementary grades show similar – well, mostly similar – patterns.

image

image

The Grade 4 ED data show a counter-intuitive increase with increasing ED population, but the tiny R-squared value tells us that the two variables are essentially uncorrelated.  The decreasing scores of the non-ED groups show somewhat larger, but still small, correlations with the ED population.

In the middle school data, Richmond drops from poor performance to appalling underperformance and the correlation of the non-ED scores with the %ED improves some.

image

image

image

Except for the third and sixth grades, the R-squared values tell us that the average scores are mildly correlated with the ED population percentage for the non-ED students but only vanishingly so for the ED students themselves.

Richmond’s underperformance relative to the other divisions with similar ED populations ranges from moderate in the elementary grades to bottom-of-the-barrel in the middle school grades.

Beyond emphasizing the awful performance of Richmond’s awful middle schools, these data suggest that the major impact of larger ED populations is on the performance of the non-ED students.

The mathematics data tell much the same story.

image

image

image

image

image

image

The Richmond schools underperform again, disastrously so in the middle school grades.

The fitted lines go the wrong way in three cases for the ED scores but, again, with trivial correlations.  The effect on the non-ED students looks to be larger, and to enjoy better correlations, than with the reading data. 

These data hint at an hypothesis, particularly in the middle school grades: At the division level, the effect of large ED populations is mostly seen in the declining performance of the non-ED students.  The corollary: Placing ED students in schools with low ED populations would not significantly improve the performance of the ED students.

I’ll try to pry out the by-school data to see if the pattern persists there.

In any case, our former Superintendent and the other apologists for RPS are not dealing with the real world when they try to blame Richmond’s poor performance on the economic status of the students: Richmond’s schools, especially the middle schools, underperform grossly in comparison to other divisions with similar ED populations.

How about some truth in education: Let’s blame the schools, not the kids.

And You Thought Our Middle Schools Were Merely Bad

A kind reader points me to a piece in the NYT reporting on a Stanford study.  That study looks at progress from the third to eighth grades as a measure of educational quality.

As the Times reports, richer districts tend to have higher scores.  Here, for instance, are the third grade data for 2,000 large districts; the graph plots relative scores vs. district wealth:

image

The eighth grade numbers show a similar pattern but some different rankings:

image

And here is the pattern of changes from third to eighth grades:

image

As the graphs suggest, the story focuses on Chicago.

Overall, the study reports a poor correlation between growth rates and average third grade scores. 

At the bottom of the Times story is a link that will produce score change data for a specified district and nearby systems.  So of course I had to put in Richmond.

percentile

As you see, Richmond is at the bottom of the barrel. 

The first surprise here is how much better Petersburg does on this measure. 

We also might wonder whether the relatively low growth reported for some excellent systems, e.g., Hanover, reflects weakness at the eighth grade level or strength at the third. 

In the other direction, Richmond’s [lack of] progress from 3d to 8th grades derives from the awful performance of our middle schools relative to the elementary schools.

image

The Stanford data also are consistent with the notion that poverty does not excuse the appalling performance of Richmond’s public schools.

Summer of Theft from Motor Vehicle

The Police Dept. database now has the summer 2017 data, so I can update my last jeremiad regarding car breakins in our neighborhood.

The most common offense reported in Forest Hill is “Theft from Motor Vehicle.”  Since the start of the database in 2000, there have been 576 unique reports of those, 27% of the total.
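As a quick check on that 27% figure: if 576 reports are 27% of the total, the implied total since 2000 is about 2,100 reports of all kinds:

```python
# Back-of-the-envelope check: 576 "Theft from Motor Vehicle" reports
# said to be 27% of all Forest Hill reports since 2000 implies a total
# of roughly 2,100 reports of all offenses.
theft_reports = 576
share = 0.27
total_reports = theft_reports / share
print(f"Implied total reports since 2000: about {total_reports:,.0f}")
```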

image

I expect the actual number of such thefts is larger but we have no way to know how many of them go unreported.

I like to call those thefts “car breakins” but, in fact, most of the victims left their cars unlocked.  Much of the “Destruction Property/Private Property” reported reflects actual car breakins.

The data show a third summer of increase, following a period of relative tranquility.

image

The largest contributor to these numbers is the 4200 block of Riverside Drive, home of the “42d St. Parking Lot.”

image

image

There’s a lot of activity at the Nature Center as well.

image

The Good News is the long term decrease in the numbers.

image

*2017 data through Nov. 24.

There is some history there.

The considerable decreases in the 4200 block came after Parks’ 2005 response to neighborhood complaints: They started locking the gates to the 42d St. lot at night and off season and they installed rocks to block parking in the part of the lot that is less visible from the street.


I attribute the recent increases to the increased use of the Park and the removal of the rocks last year. 

In any case, there are two lessons here:

  • Leaving stuff in the car, especially in an unlocked car, is an invitation to lose the stuff and to help chum the neighborhood for criminals; and
  • Given that most of the thefts are from the vehicles of park visitors, it’s past time for some LARGE signs in the 4200 block (and probably at the Nature Center and 4100 Hillcrest) and, especially, in the 42d St. parking lot to warn the visitors:

        Car Breakins Here! Lock your junk in your trunk.

Final Accreditation Update

At its October meeting, the Board of Education resolved 58 TBD cases by allowing the schools in question to be “Partially Accredited-Reconstituted.” 

Here is the resulting situation as to Richmond, Petersburg, and the state totals:

image

The categories there are my shorthand.  The long versions are here and here.  “Cheated” is my shorthand for “Accreditation Withheld-Board Decision.”  That decision followed the cheating scandal at AP Hill Elementary in Petersburg.

As of today, the table on the VDOE Web page has not been updated.  The data above are from the spreadsheet that apparently has been updated.

To focus just on Richmond:

image

The Good News is that the number of accredited schools is set to rise dramatically.  The Bad News is that the “improvement” need not reflect performance improvements, just a gross dilution of the standards.  Your tax dollars at “work.”

More Money for What?

The tug-of-war between the School Board and City Council over school funding enters a new era of speculation: Will the election results produce more funds for Richmond schools?

That overlooks the more fundamental question: What is the School Board doing with all the money it now is spending?

The most recent data from VDOE are from 2015-16.  Here from that year are the per-student disbursements by division for Richmond, three peer divisions, and the state average for school divisions.  I’ve left out the sums for facilities, debt service, and contingency reserve.

image

The largest single cost for schools is salaries, so it’s no surprise that Richmond’s excess of $3,051 per student mostly went to the instruction category.

image

Expressing the Richmond data as differences from the state number, we get:

image

Or, multiplying those differences by the Richmond enrollment of 21,826:

image

Multiplying the $3,051 Richmond total excess by the 21,826 enrollment gives the excess Richmond expenditure total: $66.6 million. 

We can account for some of that. 

The 2016 average salary in Richmond was $51,263 vs. the state average of $56,320.  That is, Richmond saved $11.3 million vs. the state average by underpaying its teachers.

At the same time, Richmond has a lot more teachers than average:

image

At the average salary of $51,263, that comes to an extra cost of $18.5 million.

Combining those data leaves Richmond with an excess cost vs. the division average of $59.4 million.

image

This overlooks the question of what those extra teachers are doing for the RPS students.  To judge from the SOL scores, it’s more a question of what they are doing to the students.

We often hear that special education and poverty are major contributors to Richmond’s higher costs.  It is difficult to come by data to quantify that.  The closest I have come is SchoolDataDirect, the now defunct Standard & Poors data service that was supported by the Gates Foundation.

S&P calculated adjustments to spending on core operating activities to account for poverty, special education, and geographic factors.  On the 2006 data, here are their numbers:

image

The difference in adjustments is 5.8%; that is, on those (old) data, Richmond schools were 5.8% less expensive than average (in terms of “core” spending) because of those three factors. 

Based on that counter-intuitive result and the absence of current data, let’s pass on those factors.

One final item:  Richmond’s excess expenditure for O&M is $2.8 million.  Even if all that were justified (and given the sorry state of our old school buildings, it probably is), Richmond would be spending an excess $56.6 million.

image

After these adjustments we are left with an estimated excess in our 2016 school disbursements of some $56 million compared to the state average (recall that this does not account for disbursements for facilities, debt, or contingency).  What we got for that money in 2016 was the second lowest math pass rate and the lowest reading pass rate among the Virginia school divisions.
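The arithmetic chain above can be checked in a few lines.  All dollar figures come from this post; small rounding differences are expected:

```python
# The accounting above, step by step.  All dollar figures are from the
# post; small rounding differences are expected.
enrollment       = 21_826
excess_per_pupil = 3_051                          # Richmond's excess disbursement per student
gross_excess     = excess_per_pupil * enrollment  # ~ $66.6 million

salary_savings     = 11.3e6   # saved by paying teachers below the state average
extra_teacher_cost = 18.5e6   # cost of Richmond's above-average teacher count

# The extra teachers explain part of the excess; the salary underpayment
# offsets part of that explanation:
unexplained = gross_excess - extra_teacher_cost + salary_savings   # ~ $59.4 million

om_excess = 2.8e6             # excess O&M, arguably justified by the old buildings
adjusted  = unexplained - om_excess                                # ~ $56.6 million

print(f"gross excess:        ${gross_excess / 1e6:.1f} million")
print(f"net of staffing:     ${unexplained / 1e6:.1f} million")
print(f"less O&M allowance:  ${adjusted / 1e6:.1f} million")
```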

Until the RPS can explain where all that money is going and what we are getting for it, there is no reason at all to increase the school budget.  To the contrary, these data make a case for cutting the RPS budget by at least $56 million.

The Board of “Education” Still Doesn’t Care About Truancy

We are reminded by a piece on NPR that you can’t teach students who don’t attend school. 

The General Assembly noticed that problem a while back.  In 1999, they amended Code § 22.1-258 to install the currently effective requirements for truancy responses:

  • Any unexcused absence: Contact with the parent;
  • 5 unexcused absences: Attendance Plan;
  • 6 unexcused absences: Conference with Parents;
  • 7 unexcused absences: Prosecute parents or file CHINS petition.
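For clarity, the statute’s escalation amounts to a simple lookup.  This is my paraphrase of the thresholds, not the statutory text:

```python
# Sketch of the escalating responses required by Code § 22.1-258
# (my paraphrase of the statute's thresholds, for illustration only).
def required_response(unexcused_absences: int) -> str:
    if unexcused_absences >= 7:
        return "Prosecute parents or file CHINS petition"
    if unexcused_absences == 6:
        return "Conference with parents"
    if unexcused_absences == 5:
        return "Attendance plan"
    if unexcused_absences >= 1:
        return "Contact the parent"
    return "No action required"

print(required_response(6))
```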

The Department of “Education” responded by requiring the divisions to report the number of students for whom a conference was scheduled and the aggregate daily attendance.  Notwithstanding its duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth,” the Department cheerfully ignored the other requirements of the statute.

That willful ignorance continued until, after a bumpy process that started in 2009, the Board adopted a truancy regulation, effective November 30, 2016, that requires reporting of

1. All excused and unexcused absences as defined in this chapter for each individual student shall be collected.
2. For each student with five unexcused absences, whether an attendance plan was developed, and if not, the reason.
3. For each student with six unexcused absences, whether an attendance conference was scheduled, and if not, the reason.
4. For each student with six unexcused absences, whether an attendance conference was actually held, and if not, the reason.
5. For each student with seven unexcused absences, whether a court referral was made or if proceedings against the parent or parents were initiated and, if not, the reason.

Beginning with the current school year, the Department will collect data on attendance plans, conferences, and court referrals.  We’ll have to wait until next year to see whether that covers the requirements of the regulation (much less the statute).  In the meantime, we’re stuck with the old count of 6-absence conferences.

CAVEAT: Take these data with a tub of salt.  Richmond ignored the statute for years (without provoking any action from the Board of “Education”).  The current data suggest that Richmond is trying harder; we’ll have to wait until next year for the supporting data to see whether they (and other divisions) have decided to obey the law.

So, understanding that these “truancy” numbers may be understated:

Dividing the reported number of six-absence conferences by the fall enrollment (“ADM”), we see the top divisions for (this measure of) truancy:

image

At least on these data, this is not just a big city problem.
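The measure itself is nothing fancy; here is the calculation, with invented counts standing in for the VDOE numbers:

```python
# The truancy measure used here: reported six-absence conferences
# divided by fall enrollment (ADM).  The counts below are invented
# placeholders, not actual VDOE figures for any division.
def truancy_rate(conferences: int, adm: int) -> float:
    """Six-absence conferences as a percentage of fall enrollment."""
    return 100.0 * conferences / adm

print(f"{truancy_rate(450, 21_826):.1f}%")  # hypothetical division
```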

For a nice contrast, here are the divisions reporting <0.5%.

image

There’s no telling which, if any, of those numbers we should believe.

Finally, here are the divisions for which there is NO REPORT and no indication of VDOE action to get one.

image

Your tax dollars at “work.”

Virginia Colleges: Get What You Pay For?

The USDOE has updated its College Scorecard, apparently with 2016 data.  Here is a summary for the Virginia colleges & universities.

image

That last column is my Bang per Buck ratio, which is the salary times graduation rate divided by cost per year.
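In code, the ratio is trivial.  The figures below are invented placeholders, not Scorecard data for any actual school:

```python
# My "Bang per Buck" ratio as defined above: salary times graduation
# rate, divided by cost per year.  The example figures are invented
# placeholders, not Scorecard data for any actual school.
def bang_per_buck(salary: float, grad_rate: float, annual_cost: float) -> float:
    return salary * grad_rate / annual_cost

# Hypothetical school: $50,000 salary, 70% graduation, $20,000/year cost
print(f"{bang_per_buck(50_000, 0.70, 20_000):.2f}")
```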

CAVEAT: The graduation rate is for full-time, first-time students.  The earnings are only for Title IV-receiving students and do not include those enrolled in graduate school at the time of collection.

Here are those data in graphs.  (Please forgive the excess digits after the decimal point in some numbers in the labels; Excel won’t let me set the number of significant figures for each element.  Thus, to get two significant figures in the slope of the first graph, I had to take a ridiculous five in the intercept and six in the R-squared.)

image

image

image

And here are the same graphs for just the Big Five (GMU, Tech, UVa, VCU, W&M):

image

image

image

Structures and SOLs

Paul Goldman’s referendum that would seek to place a school modernization plan in the city charter is pending before the voters. 

An email from Paul today invites us all to attend the School Board meeting tonight to hear about his “plan that will finally give all students their. . . right to a modern, clean, safe, 21st century education compatible School facility.”

It’s hard to know how anybody could oppose replacing the many old school buildings that have not been properly maintained.  Even so, we might wonder whether Paul’s focus on physical facilities, needed though they be, misses the more important point: Education.

We have some data on that.

The new Huguenot facility opened to students on January 5, 2015.  So we have 2-1/2 years to see whether the new digs have affected academics.

To start, here are the reading pass rates by year at Huguenot:

image

The dips in 2013 surely reflect the new, tougher tests instituted that year.  And we see some recovery in ‘14 and ‘15.  But, to the point here, since then none of the three groups for which VDOE posted data has shown any academic benefit from the new facilities.

Well, how about writing?

image

OK, how about math?

image

They started to recover nicely from the new tests of 2012 but then, again, started slipping back.

Maybe History?

image

Aha!  A small improvement this year among the black and white students.  But nothing to brag about.  And no bounce from the Hispanic students.

We’ll need more than that to infer a learning effect from the new digs.  Science is the last chance:

image

Well, whatever is going on at Huguenot, there’s no pattern of academic boosts from the new facilities.

These data don’t say there couldn’t be a new digs effect at some other school(s).  But they do support the notion that Paul’s focus might be on the second most important issue.