Mixed News at Holton

Overall, Holton dropped a point on both the reading and math SOLs this year.  The data by subject and grade paint a more nuanced picture.

The reading data suggest a problem in the fourth grade but otherwise look pretty good.

image

image

image

For what it may be worth, I’ve asked Excel to fit a straight line to each dataset to suggest the trend in the pass rates.  In that sense, Holton is showing progress in reading, notwithstanding the 4th grade blip this year.

The math data also suggest an issue in the 4th grade.

image

image

image

The Good News is that Holton is flirting with the state average in reading.  The not-so-good news is that the school is handily beating the Richmond average in math, albeit at a rate still below the 70% accreditation benchmark.

Peak Performers

Here are the thirty schools with the highest average 2018 pass rates in each of the five subject areas, along with the highest average of the five pass rates.  Richmond schools are highlighted.

Note: Maggie Walker would surely be near the top of all these lists except that the Board of “Education” says Walker is a “program,” not a “school.”  That is their way of giving the Walker students’ scores to high schools in the students’ home neighborhoods that those students do not attend.  Unfortunately, this is just a small example of VBOE mendacity.

image

image

image

image

image

image

The Good, the Excellent, and the Awful

Having looked at Richmond’s peer jurisdictions, let’s turn to our neighbors.

Here, to start, are the reading pass rates by year for the state, some neighboring Counties, and poor Petersburg.

image

It’s instructive to look at the 2012-13 drops that followed the introduction of the new, tougher reading tests.

  • State       -14.2
  • Hanover     -9.6
  • Richmond    -29.1

As of this year, Richmond abided at 20.2 points below the state average.

In terms of raw counts, Richmond this year had 1.75% of the Virginia students taking the reading tests and 3.43% of the students who failed those tests.

image image

The math data paint a similar picture.

image

Likewise, the 2011-2012 score drops with the new math tests:

  • State       -18.0
  • Hanover     -13.6
  • Richmond    -28.4

On the 2018 math data, Richmond was 25.1 points below the state average.  It had 1.72% of the students taking the math tests and 3.63% of the students failing those tests.

image  image

The data for the other three subjects tell much the same story.

image

image

image

As you see, the adjacent Counties and Hanover continue to provide safe harbors for the Richmond parents who move there when their children approach school age.

I’ll devote a later post to Petersburg and the colossal incompetence of the Board of “Education” that has had Petersburg operating under its supervision since at least 2004.

For now, I’ll just point out that the state’s intervention in Richmond this year has paid no dividends for the children who suffer under the Richmond system’s incompetence. 

Division SOL by Year

Here are the division SOL pass rates by year, going back to the start of the database, for Richmond, the peer cities, and the state.  I’ve included Charles City and Lynchburg as a thank-you to my readers there.

First, reading:

image

The new, tougher English tests in 2013 produced those decreases.  Richmond was 9.7 points below the state average in ‘12 and fell to –24.6 in ‘13; as of this year, it remains 20.2 points low.

Next, math.

image

The new math tests came in 2012.  The new tests took Richmond from –10.5 to –20.9 v. the state.  Richmond recovered some but then languished to the current –25.1.

Next, the other three subject areas.

image

image

image

Finally, the average of the five averages:

image

On this average, Richmond is 23.4 points below the state this year.

2018 SOLs by School

Here are Richmond’s 2018 SOL Pass Rates by subject and school, along with the 2017 rates for reference.

First the elementary schools.

image

image

Recall that the (former) accreditation benchmarks were 75 in English and 70 for the other subjects.

Next the middle schools.  (Note the scale change for the first three graphs).

image

image

image

image

image

And the high schools.

image

Again, notice the scale change.  As well, the “0” entries here and below are error indicators, probably because the data have been suppressed.

image

image

image

image

And, finally, the combined schools.

image

image

image

image

image

Are Buildings More Important Than Children?

Lurking behind the (entirely justified) outrage about the condition of Richmond’s school buildings is a much more important issue: What happens – more to the point, what doesn’t happen – inside those buildings.

The SOL results give us a measure of that larger problem.  Richmond last year had the second lowest division average reading pass rate in the Commonwealth

image

and the third lowest math pass rate.

image

Of course, some schools did worse than the average.  In Richmond some schools, especially some middle schools, did much worse.  For example, here are the bottom ten Virginia schools in terms of sixth grade reading and math:

image

image

More data are here showing, inter alia, that MLK Middle is the worst performing school in Virginia.

Surely we need better school buildings.  Before that, however, we need (much!) better schools. 

It’s a puzzle why we are expending so much energy and passion (and, perhaps, money) on the lesser problem.

2017 Richmond Crime Rates

The Virginia State Police publish an annual report on Crime in Virginia.  They count the “Type A” offenses reported per police unit:

Arson
Assault
Bribery
Burglary
Counterfeiting/Forgery
Destruction/Damage/Vandalism of Property
Drug/Narcotic Offenses
Embezzlement
Extortion/Blackmail
Fraud Offenses
Gambling Offenses
Homicide
Kidnapping/Abduction
Larceny/Theft
Motor Vehicle Theft
Pornography/Obscene Material
Prostitution Offenses
Robbery
Sex Offenses, Forcible & Nonforcible
Stolen Property Offenses
Weapon Law Violations

These data have their peculiarities.  The first obvious one: In most cases, the totals reported by VSP differ from the sums of the individual offenses.  For example, for 2017 the VSP reports 19,270 offenses reported to the Richmond Police Dep’t, but the Richmond offenses listed in the same table sum to 20,705, a 7.4% difference.  When I inquired about the difference, they responded:

There can be multiple offenses within an incident. If a murder, rape and robbery occur in one incident (one event), all offenses are counted under Incident Based Reporting. The old UCR Summary System used the hierarchy rule and counted only one offense per incident.

That certainly is true, but it does not explain the discrepancy: Whatever they are counting, the total should be the total.  (The table says it reports “offenses.”)  In any case, the numbers below are their totals.
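The discrepancy arithmetic above is simple enough to check directly.  Here is a minimal sketch, using the two figures from the VSP table quoted in the text:

```python
# Sketch: reproduce the 7.4% discrepancy between the VSP's reported
# total and the sum of the listed Richmond offenses (figures from the text).
vsp_total = 19_270    # total offenses VSP reports for the Richmond Police Dep't
listed_sum = 20_705   # sum of the individual offenses in the same table

difference = listed_sum - vsp_total
pct = difference / vsp_total * 100
print(f"{difference} extra offenses, a {pct:.1f}% difference")
```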

They report the numbers by police agency, both the local force and, in most cases, the State Police.  For example, the Richmond Police Department shows 19,270 incident reports and the State Police show 233 in Richmond.  The report also includes data for the colleges, the Capitol Police, and state agencies such as the ABC Board.  Finally, the small jurisdictions produce some weird statistics because even a small variation can produce a large change in the crime rate.  As well, the State Police report a significant fraction of the incidents in some small jurisdictions; for instance, in Craig County in 2017, the sheriff reported 22 incidents while the State Police reported 20.

I obtained the data below by leaving out the data for the State Police (9,709 offenses, 2.5% of the total) and State agencies (8,633 offenses, 2.2% of the total).  I also left out the jurisdictions with populations <10,000 (19,980 offenses, 5.1%).  That’s a total of 38,322, 9.7% of the 394,197 total offenses.
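The exclusions described above can be tallied in a few lines; the counts are the ones given in the text:

```python
# Sketch: sum the excluded categories and express them as a share
# of the statewide Type A total (all figures from the text).
excluded = {
    "State Police": 9_709,
    "State agencies": 8_633,
    "jurisdictions under 10,000 population": 19_980,
}
total_offenses = 394_197

left_out = sum(excluded.values())
print(left_out)                            # 38322
print(f"{left_out / total_offenses:.1%}")  # 9.7%
```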

BTW: The VCU total (not included in Richmond’s total) was 1,207.

Here, then, are the remaining 2017 data (pdf), expressed as Type A offense reports per 100 population vs. population.

2017 Offenses v. Population

Richmond is the gold square.  The red diamonds, from the left, are the peer jurisdictions of Hampton, Newport News, and Norfolk.

There is no particular reason to expect these data to fit a straight line but Excel is happy to fit one.  The slope suggests that the rate (per hundred population) increases by about 0.15 for a population increase of 100,000.  The R², however, tells us that population explains less than 1% of the variance in the crime rate; i.e., overall crime rate (by this measure) does not correlate with jurisdiction size.

Here is the same graph, with the axis expanded to cut off the Big Guys (Fairfax, Va. Beach, Prince Wm., Chesterfield, Loudoun, and Henrico) in order to emphasize the distribution of the smaller jurisdictions.

Among the jurisdictions with populations >10,000, we are seventh in the state, with a rate 1.94 times the state average.

image

(Blame the cut off department names on the VSP database, which appears to truncate at 25 characters.)

Here are the totals for the eighteen largest jurisdictions, sorted by rate, with the grand total for all but the smallest (<10K) jurisdictions.

image

You’ll notice a dramatic difference between the large cities and the large counties.

CAVEATS: These numbers tell us about overall crime rates but not about the environment faced by any particular citizen.  As well, the VSP emphasizes that, as we see above, population is not a good predictor of crime rate.  They list other factors:

1. Population density and degree of urbanization;
2. Population variations in composition and stability;
3. Economic conditions and employment availability;
4. Mores, cultural conditions, education, and religious characteristics;
5. Family cohesiveness;
6. Climate, including seasonal weather conditions;
7. Effective strength of the police force;
8. Standards governing appointments to the police force;
9. Attitudes and policies of the courts, prosecutors and corrections;
10. Citizen attitudes toward crime and police;
11. The administrative and investigative efficiency of police agencies and the organization and cooperation of adjoining and overlapping police jurisdictions;
12. Crime reporting practices of citizens.

The 2017 Richmond rate increased slightly to 8.65 from 8.61 in 2016. 

The Type A total is driven by the property crime numbers: Typically the larceny, vandalism, and motor vehicle theft numbers will account for 2/3 of the Type A total.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see that the Richmond count of simple assaults dropped while the drug and weapon law numbers rose.

Note: This graph and those immediately below report the raw counts of offenses reported in Richmond, not the count per 100K.  Throughout this period, the Richmond population has been near 200,000, with very little change, so you can get close to the rates per 100 by dividing these numbers by 2,000.
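The divide-by-2,000 shortcut above is just the rate formula with the ~200,000 population plugged in.  A minimal sketch, using a hypothetical offense count for illustration:

```python
# Sketch: convert a raw offense count to a rate per 100 population,
# using the ~200,000 Richmond population figure from the text.
population = 200_000

def rate_per_100(raw_count: int) -> float:
    return raw_count / population * 100

# A hypothetical count of 17,300 offenses; both forms give the same rate.
print(rate_per_100(17_300))   # rate formula
print(17_300 / 2_000)         # the divide-by-2,000 shortcut
```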

The robbery numbers continued a long downward trend; aggravated assaults rose slightly.

The “other” sex crimes (nonforcible) showed a jump, as did the murder count.  Kidnapping, rape, and arson enjoyed decreases.  The decreases from the early 2000’s, both here and above, are remarkable.

For a list of the hot blocks in Richmond see this page.  And see this page for data showing a nice improvement in Forest Hill.

Much of Richmond’s plethora of crime is drug-related.

To complement the still outrageous crime rate, our schools are among the worst in the state and our public housing agency maintains a sanctuary for crime on its property.  To support all this dysfunction, we pay some of the highest taxes in the state.  Go figure.

Note: Mr Westerberg of the VSP kindly furnished a copy of the data as an Excel spreadsheet, so I didn’t have to copy the numbers from the PDF report on the Web.

What’s the Alternative?

The RT-D this morning reports:

The Richmond School Board has extended its relationship with a for-profit company that has overseen academic gains at the city’s alternative school but has faced allegations of abuse elsewhere. The decision, made last week without notice, has raised concerns from residents who’d hoped to weigh in on the decision. The 6-1 vote to exercise a second-year option on the contract with Texas-based Camelot Education was not on the School Board’s original agenda, but added to it at the beginning of the meeting.

We have seen that Richmond’s receptacle for disruptive students, Richmond Alternative, was doing a remarkable job with a tough crowd until RPS took it over from the contractor in order to save $2 million per year.  Pass rates then plummeted. 

Richmond then reconsidered and in ‘16 handed the school over to another contractor.

In that context, let’s look at the “academic gains.”

image

image

Note: The Richmond and state data include scores from the elementary grades; they may be helpful for trends (e.g., the drops with the new math tests in 2012 and the new reading tests in 2013) but are not directly comparable to the Alternative pass rates.

Perhaps the 2017 increases, 56.8 points in reading and 49.7 in math, represent “academic gains.”  Let’s hope so. 

As well, let’s hope our new Superintendent has been informed by the unusual gains at Carver and other schools and will look under the numbers.

For sure, the Department of “Education,” which investigated only one instance of cheating this year  – and then only when the local Superintendent called them in – will marinate in its delight with the increased pass rates and will look under those numbers – no matter how unlikely they may be – only when somebody else points out a problem.

Truant Teachers

If you thought Richmond’s absence rate (24% of students absent 15 days or more in 2016) was bad, wait until you see the teachers’ absence rate.

The Feds’ 2016 Civil Rights Data Collection includes data on the numbers of teachers absent from work more than ten days.  Here are Virginia’s division data, sorted by increasing rates.

image

image

The 68.4% Richmond rate (second from worst) for 2016 deteriorated from 56% (ninth from worst) in 2014.  

The 2016 division mean is 34.6% with a standard deviation of 15.6%.  That puts Richmond 2.2 standard deviations above the mean.
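The standard-deviation comparison is a one-line calculation; here it is as a sketch, using the figures from the text:

```python
# Sketch: how far above the division mean is Richmond's teacher
# absence rate, in standard deviations (figures from the text)?
mean = 34.6      # 2016 division mean absence rate, percent
std_dev = 15.6   # standard deviation, percent
richmond = 68.4  # Richmond's 2016 rate, percent

z = (richmond - mean) / std_dev
print(f"Richmond is {z:.1f} standard deviations above the mean")
```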

To get this 68.4% average absence rate, some Richmond schools had to have more than 68% of their teachers absent for more than ten days.  Fox and Chimborazo more than managed that: ALL their teachers were absent more than ten days.  Miles Jones only managed 95%.

image

As an interesting contrast, the Maggie Walker rate was 5.6%.

For sure, the schools – especially the elementary schools – are nurseries for respiratory diseases.  The eleven elementary schools at the bottom of this graph average around 50% (with the notable exception of Munford, at 24%).  These contrast with the six elementary schools at the top of the graph that average around 95%.  Aside from this strange dichotomy, sick elementary school students cannot explain why every Richmond school except Munford shows a higher rate than the 34% state average, with non-elementary Armstrong High at 80% and Brown Middle at 87%.

This shows all the symptoms of a horrendous management failure. 

We might expect that more teacher absences would be associated with reduced student performance on the SOL tests.  In Richmond, not so much.  Except, perhaps, for high school reading:

image

image

image

Note: Franklin Military has both middle and high school grades.  Lacking a clean way to separate those data, I’ve omitted Franklin from this analysis.

The data by division don’t even hint at any effect of larger teacher absence rates on SOL performance.

image

It’s enough to make one suspect that it’s not the effective teachers who are running up all those absences.

We have another measure of the effect of these teacher absences:  The 2016 RPS budget includes $102.9 million for classroom instructional staff and $4.3 million for substitutes.  If RPS could cut the teacher absence rate by half (to about the state average), the savings could be used to raise teacher pay by about 2%.
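The back-of-the-envelope estimate above works out as follows; a sketch using the two budget figures from the text:

```python
# Sketch: if halving teacher absences halved the substitute budget,
# what raise would the savings fund (budget figures from the text)?
instructional_staff = 102.9e6  # classroom instructional staff budget, dollars
substitutes = 4.3e6            # substitute budget, dollars

savings = substitutes / 2      # assume sub costs fall with absences
raise_pct = savings / instructional_staff * 100
print(f"${savings / 1e6:.2f} million saved, roughly a {raise_pct:.0f}% raise")
```

The assumption that substitute costs scale one-for-one with absences is, of course, a simplification.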

Your tax dollars at “work.”

Secrecy In School Contracting

The RT-D reports some dissatisfaction that the proposals for three new school buildings are not public.

Gnash your teeth all you like, fellow citizens: Your Generous Assembly has decreed that these documents NOT be made public:

D. Any competitive negotiation offeror, upon request, shall be afforded the opportunity to inspect proposal records within a reasonable time after the evaluation and negotiations of proposals are completed but prior to award, except in the event that the public body decides not to accept any of the proposals and to reopen the contract. Otherwise, proposal records shall be open to public inspection only after award of the contract. (emphasis supplied)

Even so, public dissatisfaction over the secrecy is well founded: Our School Board has an extended history of incompetence and, probably, corruption in contracting.  For example:

Perhaps it is good news that at least two of the six members of the “Joint Construction Team” are from the City.  Then, again, it was the City, not the School Board, that built the gym floor at Huguenot.