Are Buildings More Important Than Children?

Lurking behind the (entirely justified) outrage about the condition of Richmond’s school buildings is a much more important issue: What happens – more to the point, what doesn’t happen – inside those buildings.

The SOL results give us a measure of that larger problem.  Richmond last year had the second lowest division average reading pass rate in the Commonwealth

[Chart: 2017 division average reading pass rates]

and the third lowest math pass rate.

[Chart: 2017 division average math pass rates]

Of course, some schools did worse than the average.  In Richmond some schools, especially some middle schools, did much worse.  For example, here are the bottom ten Virginia schools in terms of sixth grade reading and math:

[Table: bottom ten Virginia schools, sixth grade reading]

[Table: bottom ten Virginia schools, sixth grade math]

More data are here showing, inter alia, that MLK Middle is the worst performing school in Virginia.

Surely we need better school buildings.  Before that, however, we need (much!) better schools. 

It’s a puzzle why we are expending so much energy and passion (and, perhaps, money) on the lesser problem.

2017 Richmond Crime Rates

The Virginia State Police publish an annual report on Crime in Virginia.  They count the “Type A” offenses reported per police unit:

Arson
Assault
Bribery
Burglary
Counterfeiting/Forgery
Destruction/Damage/Vandalism of Property
Drug/Narcotic Offenses
Embezzlement
Extortion/Blackmail
Fraud Offenses
Gambling Offenses
Homicide
Kidnapping/Abduction
Larceny/Theft
Motor Vehicle Theft
Pornography/Obscene Material
Prostitution Offenses
Robbery
Sex Offenses, Forcible & Nonforcible
Stolen Property Offenses
Weapon Law Violations

These data have their peculiarities.  The first obvious one: The totals reported by VSP are different, in most cases, from the sums of offenses.  For example, for 2017 the VSP reports 19,270 offenses reported to the Richmond Police Dep’t but the total of the Richmond offenses listed in the same table with that 19,270 is 20,705, a 7.4% difference.  When I inquired about the difference, they responded:

There can be multiple offenses within an incident. If a murder, rape and robbery occur in one incident (one event), all offenses are counted under Incident Based Reporting. The old UCR Summary System used the hierarchy rule and counted only one offense per incident.

That certainly is true, but it does not explain the discrepancy: Whatever they are counting, the total should be the total.  (The table says it reports “offenses.”)  In any case, the numbers below are their totals.

They report the numbers by police agency, both the local force and, in most cases, the State Police.  For example, the Richmond Police Department shows 19,270 incident reports and the State Police show 233 in Richmond.  The report also includes data for the colleges, the Capitol Police, and state agencies such as the ABC Board.  Finally, the small jurisdictions produce some weird statistics because even a small variation can produce a large change in the crime rate.  As well, the State Police report a significant fraction of the incidents in some small jurisdictions; for instance, in Craig County in 2017, the sheriff reported 22 incidents while the State Police reported 20.

I obtained the data below by leaving out the data for the State Police (9,709 offenses, 2.5% of the total) and State agencies (8,633 offenses, 2.2% of the total).  I also left out the jurisdictions with populations <10,000 (19,980 offenses, 5.1%).  That’s a total of 38,322, 9.7% of the 394,197 total offenses.
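
For anyone checking the arithmetic, here is a minimal sketch (Python) that reproduces those percentages from the VSP totals quoted above; the 7.4% Richmond discrepancy falls out the same way:

```python
# Totals from the 2017 VSP report, as quoted above.
total_offenses = 394_197

excluded = {
    "State Police": 9_709,
    "state agencies": 8_633,
    "jurisdictions under 10,000": 19_980,
}

for label, count in excluded.items():
    print(f"{label}: {count:,} ({count / total_offenses:.1%})")

dropped = sum(excluded.values())
print(f"total excluded: {dropped:,} ({dropped / total_offenses:.1%})")  # 38,322; 9.7%

# The Richmond discrepancy: sum of the listed offenses v. the reported total.
print(f"discrepancy: {(20_705 - 19_270) / 19_270:.1%}")  # 7.4%
```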

BTW: The VCU total (not included in Richmond’s total) was 1,207.

Here, then, are the remaining 2017 data (pdf), expressed as Type A offense reports per 100 population vs. population.

[Chart: 2017 Type A offenses per 100 population v. population]

Richmond is the gold square.  The red diamonds, from the left, are the peer jurisdictions of Hampton, Newport News, and Norfolk.

There is no particular reason to expect these data to fit a straight line but Excel is happy to fit one.  The slope suggests that the rate (per hundred population) increases by about 0.15 for a population increase of 100,000.  The R², however, tells us that population explains less than 1% of the variance in the crime rate; i.e., overall crime rate (by this measure) does not correlate with jurisdiction size.
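
For the curious: Excel's trend line is an ordinary least-squares fit, and its R² is the squared correlation coefficient.  Here is a minimal sketch of the same computation, with toy numbers rather than the real VSP figures:

```python
import numpy as np

# Toy numbers for illustration only -- NOT the VSP figures.
population   = np.array([25_000, 60_000, 120_000, 220_000, 450_000, 1_100_000])
rate_per_100 = np.array([  3.1,    5.8,     2.9,     8.6,     3.4,       2.7])

# Ordinary least-squares line, the same fit Excel's trend line performs.
slope, intercept = np.polyfit(population, rate_per_100, 1)

# R-squared: the fraction of the variance in the rate that population explains.
r_squared = np.corrcoef(population, rate_per_100)[0, 1] ** 2

print(f"rate change per 100,000 population: {slope * 100_000:+.2f}")
print(f"R-squared: {r_squared:.3f}")
```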

Here is the same graph, with the axis expanded to cut off the Big Guys (Fairfax, Va. Beach, Prince Wm., Chesterfield, Loudoun, and Henrico) in order to emphasize the distribution of the smaller jurisdictions.

Among the jurisdictions with populations >10,000, we are seventh highest in the state, with a rate 1.94 times the state average.

[Table: Type A offense rates per 100 population, by jurisdiction]

(Blame the cut-off department names on the VSP database, which appears to truncate at 25 characters.)

Here are the totals for the eighteen largest jurisdictions, sorted by rate, with the grand total for all but the smallest (<10K) jurisdictions.

[Table: Type A offense totals and rates, eighteen largest jurisdictions]

You’ll notice a dramatic difference between the large cities and the large counties.

CAVEATS: These numbers tell us about overall crime rates but not about the environment faced by any particular citizen.  As well, the VSP emphasizes that, as we see above, population is not a good predictor of crime rate.  They list other factors:

1. Population density and degree of urbanization;
2. Population variations in composition and stability;
3. Economic conditions and employment availability;
4. Mores, cultural conditions, education, and religious characteristics;
5. Family cohesiveness;
6. Climate, including seasonal weather conditions;
7. Effective strength of the police force;
8. Standards governing appointments to the police force;
9. Attitudes and policies of the courts, prosecutors and corrections;
10. Citizen attitudes toward crime and police;
11. The administrative and investigative efficiency of police agencies and the organization and cooperation of adjoining and overlapping police jurisdictions;
12. Crime reporting practices of citizens.

The 2017 Richmond rate increased slightly to 8.65 from 8.61 in 2016. 

The Type A total is driven by the property crime numbers: Typically the larceny, vandalism, and motor vehicle theft numbers will account for 2/3 of the Type A total.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see that the Richmond count of simple assaults dropped while the drug and weapon law numbers rose.

Note: This graph and those immediately below report the raw counts of offenses reported in Richmond, not the count per 100K.  Throughout this period, the Richmond population has been near 200,000, with very little change, so you can get close to the rates per 100 by dividing these numbers by 2,000.
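
A sketch of that rule of thumb, with a hypothetical count for illustration:

```python
population = 200_000   # approximate Richmond population over this period
raw_count = 5_000      # hypothetical offense count, for illustration only

rate_per_100 = raw_count / population * 100
print(rate_per_100)    # 2.5 -- the same answer as raw_count / 2_000
```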

The robbery numbers continued a long downward trend; aggravated assaults rose slightly.

The “other” sex crimes (nonforcible) showed a jump, as did the murder count.  Kidnapping, rape, and arson enjoyed decreases.  The decreases from the early 2000’s, both here and above, are remarkable.

For a list of the hot blocks in Richmond see this page.  And see this page for data showing a nice improvement in Forest Hill.

Much of Richmond’s plethora of crime is drug-related.

To complement the still outrageous crime rate, our schools are among the worst in the state and our public housing agency maintains a sanctuary for crime on its property.  To support all this dysfunction, we pay some of the highest taxes in the state.  Go figure.

Note: Mr Westerberg of the VSP kindly furnished a copy of the data as an Excel spreadsheet, so I didn’t have to copy the numbers from the PDF report on the Web.

What’s the Alternative?

The RT-D this morning reports:

The Richmond School Board has extended its relationship with a for-profit company that has overseen academic gains at the city’s alternative school but has faced allegations of abuse elsewhere. The decision, made last week without notice, has raised concerns from residents who’d hoped to weigh in on the decision. The 6-1 vote to exercise a second-year option on the contract with Texas-based Camelot Education was not on the School Board’s original agenda, but added to it at the beginning of the meeting.

We have seen that Richmond’s receptacle for disruptive students, Richmond Alternative, was doing a remarkable job with a tough crowd until RPS took it over from the contractor in order to save $2 million per year.  Pass rates then plummeted. 

Richmond then reconsidered and in ‘16 handed the school over to another contractor.

In that context, let’s look at the “academic gains.”

[Chart: Richmond Alternative reading pass rates by year]

[Chart: Richmond Alternative math pass rates by year]

Note: The Richmond and state data include scores from the elementary grades; they may be helpful for trends (e.g., the drops with the new math tests in 2012 and the new reading tests in 2013) but are not directly comparable to the Alternative pass rates.

Perhaps the 2017 increases, 56.8 points in reading and 49.7 in math, represent “academic gains.”  Let’s hope so. 

As well, let’s hope our new Superintendent has been informed by the unusual gains at Carver and other schools and will look under the numbers.

For sure, the Department of “Education,” which investigated only one instance of cheating this year  – and then only when the local Superintendent called them in – will marinate in its delight with the increased pass rates and will look under those numbers – no matter how unlikely they may be – only when somebody else points out a problem.

Truant Teachers

If you thought Richmond’s absence rate (24% of students absent 15 days or more in 2016) was bad, wait until you see the teachers’ absence rate.

The Feds’ 2016 Civil Rights Data Collection includes data on the numbers of teachers absent from work more than ten days.  Here are Virginia’s division data, sorted by increasing rates.

[Charts: percentage of teachers absent more than ten days, by division]

The 68.4% Richmond rate (second from worst) for 2016 deteriorated from 56% (ninth from worst) in 2014.  

The 2016 division mean is 34.6% with a standard deviation of 15.6%.  That puts Richmond 2.2 standard deviations above the mean.
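
That 2.2 is simply Richmond’s z-score; a minimal check from the numbers above:

```python
mean, sd, richmond = 34.6, 15.6, 68.4   # 2016 division mean, SD, and Richmond's rate

z = (richmond - mean) / sd
print(f"{z:.1f} standard deviations above the mean")   # -> 2.2
```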

To get this 68.4% average absence rate, some Richmond schools had to have more than 68% of their teachers absent for more than ten days.  Fox and Chimborazo more than managed that: ALL their teachers were absent more than ten days.  Miles Jones only managed 95%.

[Chart: percentage of teachers absent more than ten days, by Richmond school]

As an interesting contrast, the Maggie Walker rate was 5.6%.

For sure, the schools – especially the elementary schools – are nurseries for respiratory diseases.  The eleven elementary schools at the bottom of this graph average around 50% (with the notable exception of Munford, at 24%).  These contrast with the six elementary schools at the top of the graph that average around 95%.  Aside from this strange dichotomy, sick elementary school students cannot explain why every Richmond school except Munford shows a higher rate than the 34% state average, with non-elementary Armstrong High at 80% and Brown Middle at 87%.

This shows all the symptoms of a horrendous management failure.

We might expect that more teacher absences would be associated with reduced student performance on the SOL tests.  In Richmond, not so much.  Except, perhaps, for high school reading:

[Charts: teacher absence rate v. SOL pass rate for Richmond elementary, middle, and high schools]

Note: Franklin Military has both middle and high school grades.  Lacking a clean way to separate those data, I’ve omitted Franklin from this analysis.

The data by division don’t even hint at any effect of larger teacher absence rates on SOL performance.

[Chart: teacher absence rate v. SOL pass rate, by division]

It’s enough to make one suspect that it’s not the effective teachers who are running up all those absences.

We have another measure of the effect of these teacher absences:  The 2016 RPS budget includes $102.9 million for classroom instructional staff and $4.3 million for substitutes.  If RPS could cut the teacher absence rate by half (to about the state average), the savings could be used to raise teacher pay by about 2%.
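
The 2% figure follows directly from those budget numbers, on the assumption that substitute spending falls in proportion to the absence rate; a sketch:

```python
instructional = 102.9e6   # 2016 RPS budget: classroom instructional staff
substitutes = 4.3e6       # 2016 RPS budget: substitutes

# Assumption: halving the teacher absence rate halves substitute spending.
savings = substitutes / 2

print(f"potential raise: {savings / instructional:.1%}")   # -> about 2.1%
```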

Your tax dollars at “work.”

Secrecy In School Contracting

The RT-D reports some dissatisfaction that the proposals for three new school buildings are not public.

Gnash your teeth all you like, fellow citizens: Your Generous Assembly has decreed that these documents NOT be made public:

D. Any competitive negotiation offeror, upon request, shall be afforded the opportunity to inspect proposal records within a reasonable time after the evaluation and negotiations of proposals are completed but prior to award, except in the event that the public body decides not to accept any of the proposals and to reopen the contract. Otherwise, proposal records shall be open to public inspection only after award of the contract. (emphasis supplied)

Even so, public dissatisfaction over the secrecy is well founded: Our School Board has an extended history of incompetence and, probably, corruption in contracting.

Perhaps it is good news that at least two of the six members of the “Joint Construction Team” are from the City.  Then, again, it was the City, not the School Board, that built the gym floor at Huguenot.

68.8% Absence Rate at Armstrong

Note: Specialty school reports corrected 6/20.

Continuing with the information from the 2016 Civil Rights Data Collection, here are the counts of absences of fifteen or more school days for the Richmond schools, expressed as percentages of the enrollments.

[Chart: students absent fifteen or more days, as a percentage of enrollment, by Richmond school]

Notice that the high schools (highlighted in gold) cluster at the high end of the list, followed by the middle schools (yellow).

Not included in that list are the specialty schools:

[Table: absence rates at the Richmond specialty schools]

Recall that Richmond Alternative is the dumping ground for kids whom the regular schools (mostly middle and high schools) can’t handle.  The 107.9% absence rate tells us two things: (1) enrollment varies throughout the year, and the enrollment figure they reported was smaller than the count of students absent fifteen or more days; and (2) whatever the actual percentage is, it probably is obscenely large.

The absence data for MLK Early Learning, Maymont Pre-K, and Governor’s (misspelled in the federal document) Career showed “-9,” which is code for not applicable.  Presumably that means they don’t keep attendance records.

The presence of Munford and Fox at the top of the big list suggests that we look for a relationship between these absence data and the SOL pass rates.  When we do that, the elementary school data show the expected negative slopes with R-squared values of about 12%:

[Charts: student absence rate v. SOL pass rate, Richmond elementary schools]

The middle school data show stronger correlations.  Note: Franklin has both middle and high school grades; I’ve left it out.

[Charts: student absence rate v. SOL pass rate, Richmond middle schools]

The high school correlation (again, with Franklin omitted) is stronger still for math.

[Charts: student absence rate v. SOL pass rate, Richmond high schools]

Of course, the correlations don’t prove causation.  For instance, it may be that children who cannot perform well choose to avoid the place where they cannot perform well, school.  We can be confident, however, that whatever those children may learn when they are not in school, it is unlikely to improve their SOL performance.

Finally, these data complement the truancy data that also speak to Richmond’s egregious attendance problem and that illuminate our School Board’s wholesale violation of the state law that requires it to respond to that problem.

What Do They Learn When They’re Absent?

The estimable Carol Wolf points out that the feds’ 2016 Civil Rights Data Collection has been available since April.  Duh!

Some of the data from the earlier (2014) collection are here and here.

The main data file for 2016 is a 33 MB zip; it unzips to a 461 MB csv.  The xlsx version I’m working with is 637 MB.  Until today, I thought I had a large, fast computer.
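
For anyone else wrestling with that file: reading just the needed columns tames the memory problem.  Here is a sketch using Python’s pandas; the file and column names are illustrative guesses, not the CRDC’s actual headers:

```python
import pandas as pd

# Illustrative column names -- check the CRDC layout file for the real headers.
cols = ["LEA_STATE", "LEA_NAME", "SCH_NAME", "TOT_ABSENT_15"]

df = pd.read_csv(
    "crdc_2015-16.csv",                # hypothetical file name
    usecols=cols,                      # skip the thousands of other columns
    dtype={"TOT_ABSENT_15": "Int64"},  # nullable integer; the file uses -9 for N/A
)

virginia = df[df["LEA_STATE"] == "VA"]
```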

The feds report total absences (pdf at 6), whether excused or not, of 15 or more school days.  Here is the distribution of those for the Virginia public school divisions:

[Chart: distribution of 15+ day student absence rates across the Virginia divisions]

Here is the key to the color bars:

[Table: key to the color bars]

The range on the graph is from 0% (Danville) to 28.9% (Dickenson Co.).

Maggie Walker is not a division so it’s not in the graph; it’s in the table because the number is interesting, esp. in reference to Richmond, which is almost twice the state average.

For reference, Richmond’s 2017 unexcused absence data are here.

Stay tuned for the data by school.

Something In Those Mountains?

VPAP posted a map showing, by locality, the average acceptance rates to four-year public colleges for the 2014-2016 graduating classes.  The first glance shows high rates in Southwest Virginia and Charlotte County.

The VPAP map scale runs from zero to 100% while the actual numbers for 2016-17 run from 53.3% to 96.4%; the map uses only half of its scale and, thus, compresses a lot of the data.

I turned to SCHEV for the raw data.  I lack VPAP’s mapping capability so here is Excel’s distribution of the rates, with colored highlights on some localities of interest:
[Chart: distribution of acceptance rates, with localities of interest highlighted]

And here, for the SW Va. bragging rights, is the entire list (Charlotte Co., the non-SW entry on the map, is 9, i.e., 8th place, on this one-year-only list):

[Table: acceptance rates by locality, complete list]

Does Elementary School Cheating Make the Middle Schools Look Worse?

In terms of the 2017 average SOL pass rates, Richmond had the second worst schools in Virginia.  In that morass of awfulness, the middle schools stood out: Their performance was even worse than awful.  Indeed, in 2017 MLK had the worst SOL performance in the state.

Some time back I asked the formidable Carol Wolf why the Richmond scores fall into a pit between the fifth and sixth grades.  She said the teachers tell her the elementary schools are cheating. 

Now we hear that there has been an institutional cheating program at Richmond’s Carver Elementary School.  And the data suggest that other elementary schools have been doing the same thing, especially as to the disabled students. 

The SOL data by grade are consistent with this picture. 

Let’s start with the 2017 Reading pass rates.

[Chart: 2017 reading pass rates by grade, Richmond v. state, disabled and non-disabled]

In the elementary grades, we see Richmond’s abled students scoring some fifteen or more points below their peers statewide while the disabled students were closing in on the state average for disabled students.  The Richmond pass rates, especially those of the disabled students, plummeted in middle school while the statewide numbers dropped much less.

[Chart: 2017 reading pass rates, disabled v. non-disabled students]

Statewide, as in this case, the pass rates for disabled students have been running about thirty points below the rates for abled students.  At Carver and elsewhere, we have seen some disabled pass rates near, and sometimes better than, the state average non-disabled rates.

If, as seems probable, Carver’s and some other schools’ numbers are bogus, we would expect the pass rates of the affected students to have tumbled when those students entered the sixth grade and received unenhanced scores.  The data here are consistent with that picture, with, as expected, a larger drop for the disabled population.

Indeed, these data suggest a whole lot of score boosting in Richmond’s elementary schools.

The math data paint much the same picture.  Notice, also, the score decreases continuing into the seventh grade, both in Richmond and in the statewide data.

[Charts: 2017 math pass rates by grade, Richmond v. state, disabled and non-disabled]

The 2016 data tell much the same story.

[Charts: 2016 reading and math pass rates by grade, Richmond v. state, disabled and non-disabled]

One departure here from the simple view: Richmond’s non-disabled math scores that year dropped more than the disabled scores did.  Perhaps there is another factor at work on the math testing.

In sum, we can view these data as one more reason to think Carol was right: It looks like they’re cheating in (at least some of) Richmond’s elementary schools.

BTW:  VDOE has the data to nail this issue.  If students entering middle school from some elementary schools show large score decreases while students from others do not, VDOE can identify the elementary schools that may be inflating their pass rates. 
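
A sketch of what that analysis might look like, assuming student-level records that carry each sixth grader’s elementary school of origin (every file and column name here is hypothetical):

```python
import pandas as pd

# Hypothetical student-level file: one row per student, with grade 5 and
# grade 6 scaled scores and the elementary school of origin.
scores = pd.read_csv("student_scores.csv")

scores["drop"] = scores["grade5_score"] - scores["grade6_score"]

# Average grade-5-to-grade-6 score drop, by feeder elementary school.
by_feeder = (
    scores.groupby("elementary_school")["drop"]
          .agg(["mean", "count"])
          .sort_values("mean", ascending=False)
)

# Schools at the top of the list -- unusually large average drops -- are the
# candidates for inflated elementary pass rates.
print(by_feeder.head(10))
```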

Silly me.  Of course VDOE has not performed, and will not perform, that analysis.  Their manifest interest is in high scores and increased graduation rates, not in education.

That leaves it up to our new Superintendent.  He also has the data and he has an interest in not having a cheating scandal on his watch.  We’ll see whether he follows up.

Carver By the Numbers

The estimable Jim Bacon noticed that the disabled students at Carver outscored their abled peers on the reading tests in 2017.  He said:

While Carver students as a whole out-performed their peers in Richmond schools and state schools, those classified as disabled out-performed their peers by mind-blowing margins. Either Carver has cracked the code on teaching disabled students or… it has been aggressively manipulating test results.

A deeper dive into the numbers suggests that Bacon is being too kind in offering “crack[ing] the code” as an alternative.

First, a quick bit of history and a short timeline:

[Timeline: Carver Elementary]

Turning to the data, here are the third grade reading pass rates by year for Carver, Richmond, and the state.  “Y” indicates disabled; “N” indicates not disabled.

[Chart: third grade reading pass rates by year, Carver, Richmond, and the state]

The state data, blue, show the statewide drop with the new tests in 2013.  Throughout, the disabled students, the red points, generally underperformed their more abled peers, the yellow points, by about thirty points on the new tests.

The yellow lines are Richmond.  The disabled performance before 2013 reflects their cheating on the VGLA tests.  Since then, Richmond’s disabled students have generally scored below the state average by up to ten points while their more abled peers have been low by about fifteen. 

(In light of Richmond’s overall lousy performance, we can wonder whether those disabled numbers are artificially boosted.  But that is a question for another day.)

Then we have Carver, the green lines.  There is some year-to-year variation, as can be expected from a smaller population.  The non-disabled scores were low before the new principal and have been stratospheric since. 

The scores of the disabled Carver students have been spectacular.  Before 2013, it looks like Carver was abusing the VGLA, along with too many other Richmond schools.  The disabled scores plummeted in ‘13, with the abolition of the VGLA.

After the arrival of the new principal in 2012, Carver’s disabled students often outscored the non-disabled state averages and, in 2015, outscored even the non-disabled Carver students.

Friar Occam would tell us to select the simple explanation: Those Carver scores were “aggressively manipulated.”  That is, they were cheating at Carver, wholesale, both before and after 2012.

Next, fourth grade reading:

[Chart: fourth grade reading pass rates by year, Carver, Richmond, and the state]

The missing Carver disabled datum for 2013 probably represents a population small enough to trigger the VDOE suppression rules.  Otherwise, these data tell the same story as the third grade numbers (with a notably higher score by the disabled population in 2017).

Fifth grade reading:

[Chart: fifth grade reading pass rates by year, Carver, Richmond, and the state]

These data are a variation on the same theme, but with the stratospheric Carver scores persisting into 2017.

Turning to the math tests:

[Charts: third, fourth, and fifth grade math pass rates by year, Carver, Richmond, and the state]

Note the missing Carver data for 2012.

There are some interesting details here, notably the lower 2017 pass rates in some cases.  Jim Bacon posits “that something changed in the way the SOL tests were administered to make manipulation more difficult.”

The Big Picture is clear, however, imho: The people running Carver have been cheating, prodigiously.  Whatever their technique (I’m hearing tales they posted the answers on the blackboard), they have obtained spectacular pass rates for the non-disabled students and have achieved even better than equal opportunity score boosts among the disabled population. 

I don’t think we need to wait for the retesting for confirmation.  I think that jail would be too good for the staff at Carver.  Stocks for the Carver staff and a bushel of tomatoes for each Carver parent would be a good start.