Are Buildings More Important Than Children?

Lurking behind the (entirely justified) outrage about the condition of Richmond’s school buildings is a much more important issue: What happens – more to the point, what doesn’t happen – inside those buildings.

The SOL results give us a measure of that larger problem.  Richmond last year had the second lowest division average reading pass rate in the Commonwealth

[Chart: division average reading pass rates]

and the third lowest math pass rate.

[Chart: division average math pass rates]

Of course, some schools did worse than the average.  In Richmond some schools, especially some middle schools, did much worse.  For example, here are the bottom ten Virginia schools in terms of sixth grade reading and math:

[Chart: bottom ten schools, grade 6 reading]

[Chart: bottom ten schools, grade 6 math]

More data are here showing, inter alia, that MLK Middle is the worst performing school in Virginia.

Surely we need better school buildings.  Before that, however, we need (much!) better schools. 

It’s a puzzle why we are expending so much energy and passion (and, perhaps, money) on the lesser problem.

2017 Richmond Crime Rates

The Virginia State Police publish an annual report on Crime in Virginia.  They count the “Type A” offenses reported per police unit:

Arson
Assault
Bribery
Burglary
Counterfeiting/Forgery
Destruction/Damage/Vandalism of Property
Drug/Narcotic Offenses
Embezzlement
Extortion/Blackmail
Fraud Offenses
Gambling Offenses
Homicide
Kidnapping/Abduction
Larceny/Theft
Motor Vehicle Theft
Pornography/Obscene Material
Prostitution Offenses
Robbery
Sex Offenses, Forcible & Nonforcible
Stolen Property Offenses
Weapon Law Violations

These data have their peculiarities.  The first obvious one: The totals reported by VSP are different, in most cases, from the sums of the individual offenses.  For example, for 2017 the VSP reports 19,270 offenses reported to the Richmond Police Dep’t, but the Richmond offenses listed in the same table as that 19,270 sum to 20,705, a 7.4% difference.  When I inquired about the difference, they responded:

There can be multiple offenses within an incident. If a murder, rape and robbery occur in one incident (one event), all offenses are counted under Incident Based Reporting. The old UCR Summary System used the hierarchy rule and counted only one offense per incident.

That certainly is true, but it does not explain the discrepancy: Whatever they are counting, the total should be the total.  (The table says it reports “offenses.”)  In any case, the numbers below are their totals.
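For the record, here is the arithmetic behind that 7.4% figure (a trivial check; both counts come from the VSP table):

```python
# Size of the discrepancy in the 2017 VSP table for Richmond.
reported_total = 19_270   # the "total" VSP reports for the Richmond Police Dep't
sum_of_offenses = 20_705  # the sum of the individual offense counts in that table

difference = sum_of_offenses - reported_total
print(f"{difference:,} offenses unaccounted for")         # 1,435
print(f"{difference / reported_total:.1%} of the total")  # 7.4%
```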

They report the numbers by police agency, both the local force and, in most cases, the State Police.  For example, the Richmond Police Department shows 19,270 incident reports and the State Police show 233 in Richmond.  The report also includes data for the colleges, the Capitol Police, and state agencies such as the ABC Board.  Finally, the small jurisdictions produce some weird statistics because even a small variation can produce a large change in the crime rate.  As well, the State Police report a significant fraction of the incidents in some small jurisdictions; for instance, in Craig County in 2017, the sheriff reported 22 incidents while the State Police reported 20.

I obtained the data below by leaving out the data for the State Police (9,709 offenses, 2.5% of the total) and State agencies (8,633 offenses, 2.2% of the total).  I also left out the jurisdictions with populations <10,000 (19,980 offenses, 5.1%).  That’s a total of 38,322, 9.7% of the 394,197 total offenses.
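For anyone who wants to reproduce that filtering, here is a minimal sketch; the file and column names are hypothetical stand-ins for whatever headers the VSP spreadsheet actually uses:

```python
import pandas as pd

# Hypothetical file and column names; the actual VSP spreadsheet differs.
df = pd.read_excel("crime_in_virginia_2017.xlsx")

keep = (
    (df["agency_type"] != "State Police")
    & (df["agency_type"] != "State Agency")
    & (df["population"] >= 10_000)
)
excluded = df.loc[~keep, "offenses"].sum()
total = df["offenses"].sum()
print(f"Excluded {excluded:,} of {total:,} offenses ({excluded / total:.1%})")
# Per the numbers above: 38,322 of 394,197, or 9.7%
```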

BTW: The VCU total (not included in Richmond’s total) was 1,207.

Here, then, are the remaining 2017 data (pdf), expressed as Type A offense reports per 100 population vs. population.

[Chart: 2017 Offenses v. Population]

Richmond is the gold square.  The red diamonds, from the left, are the peer jurisdictions of Hampton, Newport News, and Norfolk.

There is no particular reason to expect these data to fit a straight line, but Excel is happy to fit one.  The slope suggests that the rate (per hundred population) increases by about 0.15 for a population increase of 100,000.  The R², however, tells us that population explains less than 1% of the variance in the crime rate; i.e., overall crime rate (by this measure) does not correlate with jurisdiction size.
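Excel’s trend line is just an ordinary least-squares fit; here is the same calculation in Python, with hypothetical arrays standing in for the per-jurisdiction data:

```python
import numpy as np

# Hypothetical values; the real inputs are each jurisdiction's population
# and its Type A offense rate per 100 population from the VSP data.
population = np.array([15_000, 60_000, 220_000, 450_000, 1_100_000])
rate = np.array([4.2, 6.8, 8.65, 3.1, 3.9])

slope, intercept = np.polyfit(population, rate, 1)
r_squared = np.corrcoef(population, rate)[0, 1] ** 2
print(f"Rate change per 100,000 population: {slope * 100_000:+.2f}")
print(f"R-squared: {r_squared:.3f}")  # near zero: population explains little
```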

Here is the same graph, with the axis expanded to cut off the Big Guys (Fairfax, Va. Beach, Prince Wm., Chesterfield, Loudoun, and Henrico) in order to emphasize the distribution of the smaller jurisdictions.

Among the jurisdictions with populations >10,000, we are seventh in the state, with a rate 1.94 times the state average.

[Table: offense rates per 100 population, jurisdictions >10,000, sorted by rate]

(Blame the cut-off department names on the VSP database, which appears to truncate at 25 characters.)

Here are the totals for the eighteen largest jurisdictions, sorted by rate, with the grand total for all but the smallest (<10K) jurisdictions.

[Table: offense totals and rates, eighteen largest jurisdictions]

You’ll notice a dramatic difference between the large cities and the large counties.

CAVEATS: These numbers tell us about overall crime rates but not about the environment faced by any particular citizen.  As well, the VSP emphasizes that, as we see above, population is not a good predictor of crime rate.  They list other factors:

1. Population density and degree of urbanization;
2. Population variations in composition and stability;
3. Economic conditions and employment availability;
4. Mores, cultural conditions, education, and religious characteristics;
5. Family cohesiveness;
6. Climate, including seasonal weather conditions;
7. Effective strength of the police force;
8. Standards governing appointments to the police force;
9. Attitudes and policies of the courts, prosecutors and corrections;
10. Citizen attitudes toward crime and police;
11. The administrative and investigative efficiency of police agencies and the organization and cooperation of adjoining and overlapping police jurisdictions;
12. Crime reporting practices of citizens.

The 2017 Richmond rate increased slightly to 8.65 from 8.61 in 2016. 

The Type A total is driven by the property crime numbers: Typically the larceny, vandalism, and motor vehicle theft numbers will account for 2/3 of the Type A total.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see that the Richmond count of simple assaults dropped while the drug and weapon law numbers rose.

Note: This graph and those immediately below report the raw counts of offenses reported in Richmond, not the count per 100K.  Throughout this period, the Richmond population has been near 200,000, with very little change, so you can get close to the rates per 100 by dividing these numbers by 2,000.
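A one-line version of that approximation:

```python
# With a population near 200,000, Richmond contains about 2,000 "hundreds"
# of residents, so a raw count divided by 2,000 approximates the rate
# per 100 population.
raw_count = 1_000         # hypothetical count read off one of the graphs
print(raw_count / 2_000)  # 0.5 offenses per 100 population
```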

The robbery numbers continued a long downward trend; aggravated assaults rose slightly.

The “other” sex crimes (nonforcible) showed a jump, as did the murder count.  Kidnapping, rape, and arson enjoyed decreases.  The decreases from the early 2000’s, both here and above, are remarkable.

For a list of the hot blocks in Richmond see this page.  And see this page for data showing a nice improvement in Forest Hill.

Much of Richmond’s plethora of crime is drug-related.

To complement the still outrageous crime rate, our schools are among the worst in the state and our public housing agency maintains a sanctuary for crime on its property.  To support all this dysfunction, we pay some of the highest taxes in the state.  Go figure.

Note: Mr Westerberg of the VSP kindly furnished a copy of the data as an Excel spreadsheet, so I didn’t have to copy the numbers from the PDF report on the Web.

What’s the Alternative?

The RT-D this morning reports:

The Richmond School Board has extended its relationship with a for-profit company that has overseen academic gains at the city’s alternative school but has faced allegations of abuse elsewhere. The decision, made last week without notice, has raised concerns from residents who’d hoped to weigh in on the decision. The 6-1 vote to exercise a second-year option on the contract with Texas-based Camelot Education was not on the School Board’s original agenda, but added to it at the beginning of the meeting.

We have seen that Richmond’s receptacle for disruptive students, Richmond Alternative, was doing a remarkable job with a tough crowd until RPS took it over from the contractor in order to save $2 million per year.  Pass rates then plummeted. 

Richmond then reconsidered and in ‘16 handed the school over to another contractor.

In that context, let’s look at the “academic gains.”

[Chart: reading pass rates, Richmond Alternative vs. Richmond and state]

[Chart: math pass rates, Richmond Alternative vs. Richmond and state]

Note: The Richmond and state data include scores from the elementary grades; they may be helpful for trends (e.g., the drops with the new math tests in 2012 and the new reading tests in 2013) but are not directly comparable to the Alternative pass rates.

Perhaps the 2017 increases, 56.8 points in reading and 49.7 in math, represent “academic gains.”  Let’s hope so. 

As well, let’s hope our new Superintendent has been informed by the unusual gains at Carver and other schools and will look under the numbers.

For sure, the Department of “Education,” which investigated only one instance of cheating this year  – and then only when the local Superintendent called them in – will marinate in its delight with the increased pass rates and will look under those numbers – no matter how unlikely they may be – only when somebody else points out a problem.

Truant Teachers

If you thought Richmond’s absence rate (24% of students absent 15 days or more in 2016) was bad, wait until you see the teachers’ absence rate.

The Feds’ 2016 Civil Rights Data Collection includes data on the numbers of teachers absent from work more than ten days.  Here are Virginia’s division data, sorted by increasing rates.

[Charts: teacher absence rates by division, sorted by increasing rate]

The 68.4% Richmond rate (second from worst) for 2016 deteriorated from 56% (ninth from worst) in 2014.  

The 2016 division mean is 34.6% with a standard deviation of 15.6%.  That puts Richmond 2.2 standard deviations above the mean.
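The arithmetic, for anyone checking:

```python
mean, sd = 34.6, 15.6  # 2016 division mean and standard deviation (percent)
richmond = 68.4        # Richmond's 2016 teacher absence rate (percent)

z = (richmond - mean) / sd
print(f"{z:.1f} standard deviations above the mean")  # 2.2
```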

To get this 68.4% average absence rate, some Richmond schools had to have more than 68% of their teachers absent for more than ten days.  Fox and Chimborazo more than managed that: ALL their teachers were absent more than ten days.  Miles Jones only managed 95%.

[Chart: teacher absence rates by Richmond school]

As an interesting contrast, the Maggie Walker rate was 5.6%.

For sure, the schools – especially the elementary schools – are nurseries for respiratory diseases.  The eleven elementary schools at the bottom of this graph average around 50% (with the notable exception of Munford, at 24%).  These contrast with the six elementary schools at the top of the graph that average around 95%.  Aside from this strange dichotomy, sick elementary school students cannot explain why every Richmond school except Munford shows a higher rate than the 34% state average, with non-elementary Armstrong High at 80% and Brown Middle at 87%.

This shows all the symptoms of a horrendous management failure. 

We might expect that more teacher absences would be associated with reduced student performance on the SOL tests.  In Richmond, not so much.  Except, perhaps, for high school reading:

[Charts: teacher absence rate vs. SOL pass rate for Richmond elementary, middle, and high schools]

Note: Franklin Military has both middle and high school grades.  Lacking a clean way to separate those data, I’ve omitted Franklin from this analysis.

The data by division don’t even hint at any effect of larger teacher absence rates on SOL performance.

[Chart: division teacher absence rates vs. SOL pass rates]

It’s enough to make one suspect that it’s not the effective teachers who are running up all those absences.

We have another measure of the effect of these teacher absences:  The 2016 RPS budget includes $102.9 million for classroom instructional staff and $4.3 million for substitutes.  If RPS could cut the teacher absence rate by half (to about the state average), the savings could be used to raise teacher pay by about 2%.
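The arithmetic behind that estimate, assuming substitute costs scale in proportion to the absence rate:

```python
instructional = 102.9e6  # 2016 RPS budget for classroom instructional staff ($)
substitutes = 4.3e6      # 2016 RPS budget for substitutes ($)

savings = substitutes / 2  # if halving absences halved the substitute cost
print(f"Potential raise: {savings / instructional:.1%}")  # about 2%
```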

Your tax dollars at “work.”

Secrecy In School Contracting

The RT-D reports some dissatisfaction that the proposals for three new school buildings are not public.

Gnash your teeth all you like, fellow citizens: Your Generous Assembly has decreed that these documents NOT be made public:

D. Any competitive negotiation offeror, upon request, shall be afforded the opportunity to inspect proposal records within a reasonable time after the evaluation and negotiations of proposals are completed but prior to award, except in the event that the public body decides not to accept any of the proposals and to reopen the contract. Otherwise, proposal records shall be open to public inspection ONLY AFTER AWARD of the contract. (emphasis supplied)

Even so, public dissatisfaction over the secrecy is well founded: Our School Board has an extended history of incompetence and, probably, corruption in contracting.  For example:

Perhaps it is good news that at least two of the six members of the “Joint Construction Team” are from the City.  Then again, it was the City, not the School Board, that built the gym floor at Huguenot.

68.8% Absence Rate at Armstrong

Note: Specialty school reports corrected 6/20.

Continuing with the information from the 2016 Civil Rights Data Collection, here are the counts of absences of fifteen or more school days for the Richmond schools, expressed as percentages of the enrollments.

[Chart: students absent fifteen or more days, as a percentage of enrollment, by Richmond school]

Notice that the high schools (highlighted in gold) cluster at the high end of the list, followed by the middle schools (yellow).

Not included in that list are the specialty schools:

[Table: absence rates for the specialty schools]

Recall that Richmond Alternative is the dumping ground for kids whom the regular schools (mostly middle and high schools) can’t handle.  The 107.9% absence rate tells us two things: (1) enrollment varies throughout the year, and the reported figure was taken at a time when enrollment was even smaller than the count of students absent fifteen or more days; and (2) whatever the actual percentage is, it probably is obscenely large.
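To see how a rate can exceed 100%, consider a school with heavy turnover; the numbers here are invented for illustration:

```python
# The CRDC counts students absent 15+ days over the entire year but divides
# by an enrollment figure taken at a single point in time.  With heavy
# turnover, the year-long count can exceed the snapshot.
absent_15_plus = 270       # hypothetical: students absent 15+ days all year
enrollment_snapshot = 250  # hypothetical: enrollment on the reporting date

print(f"{absent_15_plus / enrollment_snapshot:.1%}")  # 108.0%
```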

The absence data for MLK Early Learning, Maymont Pre-K, and Governor’s (misspelled in the federal document) Career, showed “-9,” which is code for not applicable.  Presumably that means they don’t keep attendance records.

The presence of Munford and Fox at the top of the big list suggests that we look for a relationship between these absence data and the SOL pass rates.  When we do that, the elementary school data show the expected negative slopes with R-squared values of about 12%:

[Charts: absence rate vs. SOL pass rate, elementary schools]

The middle school data show stronger correlations.  Note: Franklin has both middle and high school grades; I’ve left it out.

[Charts: absence rate vs. SOL pass rate, middle schools]

The high school correlation (again, with Franklin omitted) is stronger still for math.

[Charts: absence rate vs. SOL pass rate, high schools]

Of course, the correlations don’t prove causation.  For instance, it may be that children who cannot perform well choose to avoid the place where they cannot perform well: school.  We can be confident, however, that whatever those children may learn when they are not in school, it is unlikely to improve their SOL performance.

Finally, these data complement the truancy data that also speak to Richmond’s egregious attendance problem and that illuminate our School Board’s wholesale violation of the state law that requires it to respond to that problem.

What Do They Learn When They’re Absent?

The estimable Carol Wolf points out that the feds’ 2016 Civil Rights Data Collection has been available since April.  Duh!

Some of the data from the earlier (2014) collection are here and here.

The main data file for 2016 is a 33 MB zip; it unzips to a 461 MB csv.  The xlsx version I’m working with is 637 MB.  Until today, I thought I had a large, fast computer.
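One way to tame a file that size is to load only the columns of interest.  A sketch, with hypothetical file and column names (the actual variable names are in the CRDC record layout):

```python
import pandas as pd

# Hypothetical names; consult the CRDC record layout for the real ones.
cols = ["LEA_STATE", "LEA_NAME", "SCH_NAME", "TOT_ABSENT_15PLUS"]
crdc = pd.read_csv("crdc_2015_16.csv", usecols=cols)
virginia = crdc[crdc["LEA_STATE"] == "VA"]
print(len(virginia), "Virginia school records")
```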

The feds report total absences (pdf at 6), whether excused or not, of 15 or more school days.  Here is the distribution of those for the Virginia public school divisions:

[Chart: distribution of 15+ day absence rates by division]

Here is the key to the color bars:

[Key to the color bars]

The range on the graph is from 0% (Danville) to 28.9% (Dickenson Co.).

Maggie Walker is not a division so it’s not in the graph; it’s in the table because the number is interesting, esp. in reference to Richmond, which is almost twice the state average.

For reference, Richmond’s 2017 unexcused absence data are here.

Stay tuned for the data by school.

Something In Those Mountains?

VPAP posted a map showing average acceptance rates at 4-year public colleges, by locality, for the 2014-2016 graduating classes.  The first glance shows high rates in Southwest Virginia and Charlotte County.

The VPAP map scale runs from zero to 100% while the actual numbers for 2016-7 run from 53.3% to 96.4%; the map uses only half of its scale and, thus, compresses a lot of the data.

I turned to SCHEV for the raw data.  I lack VPAP’s mapping capability so here is Excel’s distribution of the rates, with colored highlights on some localities of interest:
[Charts: distribution of acceptance rates, with localities of interest highlighted]

And here, for the SW Va. bragging rights, is the entire list (Charlotte Co., the non-SW entry on the map, is 9, i.e., 8th place, on this one-year-only list):

[Table: acceptance rates by locality, entire list]

Cheating: Ignoring the Obvious

In the summer of 2017, VDOE investigated an allegation of cheating at A.P. Hill Elementary School in Petersburg.  As a result, Petersburg fired “several” school employees and the Board of Education withheld accreditation of the school “due to testing irregularities.” (pdf at 137)

A cursory look at the SOL pass rates for A.P. Hill raises the question why it took a “tip” to cause VDOE to investigate.

Let’s start with the 3d grade reading pass rates for Hill and the state for both the disabled and non-disabled populations:

[Chart: grade 3 reading pass rates, A.P. Hill vs. state, disabled and non-disabled]

VDOE’s suppression rules hide all of Hill’s disabled pass rates but the one for 2015; that datum is well above the state average.  The non-disabled data tell the tale, however: Between 2013 and 2016, Hill went from a failing pass rate (the accreditation level for reading is 75) to matching the state average for two consecutive years.

The fourth grade data are even more remarkable.

[Chart: grade 4 reading pass rates, A.P. Hill vs. state]

The fifth grade numbers show both another tremendous increase in the non-disabled rate and one extraordinary datum for the disabled students.

[Chart: grade 5 reading pass rates, A.P. Hill vs. state]

The math data tell much the same story:

[Charts: math pass rates, A.P. Hill vs. state, grades 3 through 5]

An alert Superintendent in Petersburg or a competent VDOE would have taken a hard look at that school in 2015. 

There are at least two explanations for what actually happened:

  • “Alert Superintendent in Petersburg” and “competent VDOE” are oxymoronic (we know that is the case for VDOE); or
  • The Superintendent and/or VDOE were too happy with those unbelievable numbers to contemplate the obvious explanation.

Closer to home, we’ve seen the same behavior by both our (former) Superintendent and VDOE as to Carver Elementary.  Neither bothered to look behind the phenomenal increases in test scores there, e.g.,

[Chart: Carver Elementary pass rates]

Unfortunately, Carver is just the tip of an iceberg of uninvestigated, unbelievable pass rate increases in Richmond.  For example:

[Charts: pass rate increases at other Richmond schools]

Or see Ginter Park, where, as with Fairfield, the obvious VGLA abuse prior to the new tests in 2012 (math) and 2013 (reading) serves as a preview of what has happened more recently:

[Charts: Ginter Park reading and math pass rates]

That off-scale 2012 disabled datum is a 19% pass rate.  The plummet from the 100% pass rates reported in the previous two years is an example of what can happen when VDOE eliminates the test a school was using to cheat.  But, as you can see, the school found another way to improve the scores.

[Charts: Ginter Park pass rates, more recent years]

We’ll see whether our new Superintendent looks into the remarkable score increases at these and other schools.

ASIDE: We learn from the RT-D that our Superintendent is concerned about the colors of graduation robes and hats.  We can hope most fervently that the colors are a smokescreen to keep the press busy while he looks at the real, massive problems in Richmond’s public schools.

For sure, we’ll all attend our own funerals before VDOE undertakes a systematic look at this issue.

Your tax dollars at “work.”

Does Elementary School Cheating Make the Middle Schools Look Worse?

In terms of the 2017 average SOL pass rates, Richmond had the second worst schools in Virginia.  In that morass of awfulness, the middle schools stood out: Their performance was even worse than awful.  Indeed, in 2017 MLK had the worst SOL performance in the state.

Some time back I asked the formidable Carol Wolf why the Richmond scores fall into a pit between the fifth and sixth grades.  She said the teachers tell her the elementary schools are cheating. 

Now we hear that there has been an institutional cheating program at Richmond’s Carver Elementary School.  And the data suggest that other elementary schools have been doing the same thing, especially as to the disabled students. 

The SOL data by grade are consistent with this picture. 

Let’s start with the 2017 Reading pass rates.

[Chart: 2017 reading pass rates by grade, Richmond vs. state]

In the elementary grades, we see Richmond’s abled students scoring some fifteen or more points below their peers statewide while the disabled students were closing in on the state average for disabled students.  The Richmond pass rates, especially those of the disabled students, plummeted in middle school while the statewide numbers dropped much less.

[Chart: 2017 reading pass rates by grade, disabled vs. non-disabled]

Statewide, as in this case, the pass rates for disabled students have been running about thirty points below the rates for abled students.  At Carver and elsewhere, we have seen some disabled pass rates near, and sometimes better than, the state average non-disabled rates.

If, as seems probable, Carver’s and some other schools’ numbers are bogus, we would expect the pass rates of the affected students to have tumbled when those students entered the sixth grade and received unenhanced scores.  The data here are consistent with that picture, with, as expected, a larger drop for the disabled population.

Indeed, these data suggest a whole lot of score boosting in Richmond’s elementary schools.

The math data paint much the same picture.  Notice, also, the score decreases continuing into the seventh grade, both in Richmond and in the statewide data.

[Charts: 2017 math pass rates by grade, Richmond vs. state]

The 2016 data tell much the same story.

[Charts: 2016 reading and math pass rates by grade]

One departure here from the simple view: Richmond’s non-disabled math scores that year dropped more than the disabled scores did.  Perhaps there is another factor at work on the math testing.

In sum, we can view these data as one more reason to think Carol was right: It looks like they’re cheating in (at least some of) Richmond’s elementary schools.

BTW:  VDOE has the data to nail this issue.  If students entering middle school from some elementary schools show large score decreases while students from others do not, VDOE can identify the elementary schools that may be inflating their pass rates. 
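Here is a sketch of what that analysis might look like, assuming student-level records with an (anonymized) student ID, school, grade, and scale score; every file and column name here is hypothetical:

```python
import pandas as pd

scores = pd.read_csv("student_scores.csv")  # hypothetical student-level file

g5 = scores[scores["grade"] == 5][["student_id", "school", "scale_score"]]
g6 = scores[scores["grade"] == 6][["student_id", "scale_score"]]

merged = g5.merge(g6, on="student_id", suffixes=("_gr5", "_gr6"))
merged["drop"] = merged["scale_score_gr5"] - merged["scale_score_gr6"]

# Elementary schools whose students' scores fall the most on entering
# middle school would be the candidates for a closer look.
print(merged.groupby("school")["drop"].mean().sort_values(ascending=False))
```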

Silly me.  Of course VDOE has not performed, and will not perform, that analysis.  Their manifest interest is in high scores and increased graduation rates, not in education.

That leaves it up to our new Superintendent.  He also has the data and he has an interest in not having a cheating scandal on his watch.  We’ll see whether he follows up.