Dollars ≠ Scholars

VDOE has just posted Table 19 of the 2017 Superintendent’s Annual Report.  That table gives us, inter alia, the Average Annual Salaries for All Instructional Positions of each school division.

The spreadsheet tells us, “All Instructional Positions include classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals.”

Juxtaposing the Table 19 data against the 2017 SOL pass rates we obtain, for the reading tests:

image

The fitted line suggests that, among the Virginia divisions, a $10K higher average salary is associated with about a 2-point increase in the division average reading pass rate.  The R-squared, however, tells us that the two variables are only distantly related, leaving about 96% of the variance to be explained by other variables.
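
For readers who want to check this kind of claim themselves, here is a minimal sketch of how the fitted-line slope and R-squared are computed by ordinary least squares.  The salary and pass-rate numbers below are invented for illustration; they are not the Table 19 or SOL data.

```python
# Ordinary least-squares fit of pass rate vs. average salary.
# The data points are invented, NOT the actual Table 19 / SOL figures.

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    intercept = my - slope * mx
    # R-squared: the share of the variance in y explained by the fit
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

salaries = [43.0, 50.0, 55.0, 60.0, 68.0, 78.0]    # $K, invented
pass_rates = [88.0, 71.0, 80.0, 74.0, 83.0, 79.0]  # %, invented

slope, intercept, r2 = ols(salaries, pass_rates)
print(f"slope: {slope:.2f} points per $1K, R-squared: {r2:.2f}")
```

A tiny R-squared, as here, means the line's slope tells you almost nothing, whatever its sign.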

Richmond is the gold square on the graph.  The peer jurisdictions from the left are Norfolk, Hampton, and Newport News.  Charles City is green; Lynchburg is blue.

The graphs for the other subjects tell much the same story.

image

image

image

image

Here also is the average of the five averages.

image

That high-performing division at a mere $43,006 is West Point.  The high performer paying almost twice as much, $78,350, is Falls Church.

Well, we knew that money can’t buy you love.  Looks like it doesn’t buy better division pass rates either.

More Teachers ≠ More Graduates

It is Spring and VDOE has posted the teacher data it has had since September.  So let’s look at the relationship between those numbers and the graduation rates.

First, the “Federal” graduation rate, i.e., the sum of standard + advanced diplomas in the 4-year cohort divided by the cohort size.
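
That definition is simple arithmetic; a sketch with invented counts (not any division's actual numbers):

```python
# Federal (4-year cohort) graduation rate: standard + advanced diplomas
# divided by the cohort size.  Counts below are invented for illustration.

def federal_rate(standard, advanced, cohort):
    return 100.0 * (standard + advanced) / cohort

print(f"{federal_rate(900, 340, 1700):.1f}%")  # 72.9% for these invented counts
```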

image

The fitted line might suggest that hiring more teachers leads to lower graduation rates but the tiny R-squared value tells us the two variables are quite uncorrelated.

Richmond is the gold square there.  The red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

As we have seen, VDOE has inflated the federal graduation rate by using “credit accommodations” that permit counting Modified Standard Diplomas as “Standard.”  They do an even better job of cooking the data by defining an “On Time” rate; in 2017 that deception raised Richmond’s federal rate of 70.1% to a bogus 76.8%.

Here is the happier picture painted by this manipulation:

image

Again the two variables are not correlated.  The inflated numbers make Richmond (and the Board of “Education”) look a bit better but do no good at all for the Richmond students whom the system has failed to educate.

Interesting note: For Charles City, both rates are 92.6%.

More Teachers ≠ More Learning

The dogwoods are coming into bloom, so we can expect VDOE to start posting the data they have had since last September.  Indeed, they posted Table 17b, “Instructional Positions Per 1,000 ADM,” the other day.

A reader (the reader?) noticed and asked whether there is a relationship between those numbers and the SOL pass rates.  The short answer is “no.” 

The longer answer:

image

The fitted line might suggest that increasing the relative number of teachers is associated with decreasing pass rates but the minuscule R-squared value tells us that the two variables are essentially uncorrelated.

The gold square is Richmond; the red diamonds, from the left, are the peer cities, Newport News, Hampton, and Norfolk.  Lynchburg (home of the reader) is blue; Charles City, green.

Using the same system, here are the data for the other four subject areas.

image

image

image

image

And here is the five subject average.

image

I will resist the temptation to comment further about Richmond’s above-average number of teachers and its bottom-of-the-barrel pass rates.  (I earlier discussed Richmond’s very expensive failure rates and dismal graduation rates.)

The Worst of the Appalling

The earlier post shows that, despite the happy aura, Westover Hills Elementary School is miserably failing to educate its students. 

I singled that school out because of the story in the paper, not because it is the worst performer in town.  It merely is among the worst.

For example, here from the 2017 testing are the twenty-one worst 3d grade reading pass rates in Virginia.  For reference, the accreditation benchmark is 75%.

image

Westover Hills’ 39.3% ties for ninth from the worst.  Seven other Richmond elementary schools join Westover Hills in this cellar.

It’s a secret pleasure to see a school each from Fairfax and Henrico on this list.  That said, these data tell us that terrible schools are not unique to Richmond, but the concentration of them is.

Moving to the higher grades:

image

image

Notice that Westover Hills’ 5th grade performance floated above the bottom of the barrel.

The math picture is similarly awful.

image

image

image

The middle school numbers are no more encouraging, especially in light of MLK’s successful candidacy for worst in the state.

image

This 7th grade list expands to 22 schools in order to include Binford.  Of Richmond’s seven middle schools, all but Albert Hill made this expanded list.

image

Armstrong High makes the 8th grade list courtesy of the complicated SOL system that results in high schools administering the 8th grade test to some students.  Among the middle schools, MLK retains its “lead.”

image

Turning to the End of Course tests:

image

All of Richmond’s mainstream high schools made the bottom 23.  In a happy contrast, both of the more selective high schools aced the reading tests (Franklin, which is selective and has both middle- and high school grades, also did just fine, with a 93.9% pass rate).

image

To complete the picture, here are the math data.  The accreditation benchmark is 70%.

image

image

image

The only surprise in the middle school math data is the (slight) competition for MLK in the race to the bottom.

image

image

image

As to the mainstream high schools, Huguenot floats out of the cellar.  Franklin (which is selective) replaces Huguenot.

image

Neither of Richmond’s selective high schools appears near the top of the math EOC list.  Open High is no. 178 from the top with 98.1%; Community is no. 611 at 67.9%, behind Huguenot at no. 579 with 72.1%.

Welcome to Richmond, Mr. Kamras.

Enthusiasm v. Data at Westover Hills

The RT-D’s Ned Oliver did a piece the other day about the flight from Richmond’s public schools.  As a contrast (an antidote?) to that, Oliver discussed Westover Hills Elementary School, where

Principal Virginia Loving said the parents who come to open houses to check out the school rarely ask about test scores, instead focusing on class size and extracurricular opportunities — two points where her school is strong.

Indeed, Oliver quotes a spokesperson for RPS for the proposition

I know that’s (Superintendent Jason) Kamras’ vision — to have all of our schools look like Westover Hills — look like the city.

Let’s take a look at those “rarely ask[ed]” about test scores.  First, the reading pass rates.

image

image

image

Recall that VBOE installed new English and science tests in 2013 that lowered pass rates statewide.  The decreases were generally exaggerated in Richmond because our then-Superintendent failed to align the curricula to the new tests. 

At Westover Hills, we see 2013 reading decreases that more closely mirrored the state averages than did the Richmond decreases, presumably reflecting better preparation at the school. 

In 2014, the third grade enjoyed a reading renaissance.  Unfortunately, that lasted only the one year and was followed by a decline that persisted through 2017.

The fourth grade decline anticipated the 2013 drop and then continued to 2014.  The smaller gains in 2015 and 2016 were wiped out by a further plunge in 2017.

The fifth grade suffered a further drop in 2015, from which it has only partially recovered.

The reading scores for all three grades remain below the 2017 Richmond averages while, on the division average, Richmond was third worst in the state.  Those differences are appalling: Third Grade, 19 points below Richmond, 35 below the state average; Fourth Grade, 29 and 42 points down; Fifth Grade, 11 and 22 points.

The mathematics pass rates paint a similarly dismal picture.

image

image

image

The new tests here were in 2012.  In 2017, Richmond had the second lowest division pass rate in math. 

The 2017 Westover Hills deficits were:

image

Finally, science:

image

Before the new tests, Westover Hills was running well above the state average on the 5th grade science test (the only elementary grade tested in science).  In 2017, it was 18 points below Richmond, 33 below the state.

If our new Superintendent really wants “to have all of our schools look like Westover Hills,” he is prescribing a massive (but happy, it seems) failure that is far beyond even the current, appalling state of our schools.

Sex Offender at 4310 New Kent

The State Police Web site has the Sex Offender Registry required by Va. Code § 9.1-913.

The registry now includes a map.  Here is the portion that includes our neighborhood.

image

That closest entry to our neighborhood is Mr. Edwards at 4310 New Kent.  (You’ll have to fill in the number to prove you’re not a robot before the link will open.)

image

The Registry doesn’t say that Mr. Edwards would be dangerous, just that he was convicted of aggravated sexual battery in 2010.  You get to draw your own inference as to risk to your family.

Counting Crimes

The Virginia State Police keep a database of crimes reported under the FBI’s Uniform Crime Reporting system.  They count the “Type A” offense reports by police unit:

     Arson
     Assault
     Bribery
     Burglary
     Counterfeiting/Forgery
     Destruction/Damage/Vandalism of Property
     Drug/Narcotic Offenses
     Embezzlement
     Extortion/Blackmail
     Fraud Offenses
     Gambling Offenses
     Homicide
     Kidnapping/Abduction
     Larceny/Theft
     Motor Vehicle Theft
     Pornography/Obscene Material
     Prostitution Offenses
     Robbery
     Sex Offenses, Forcible & Nonforcible
     Stolen Property Offenses
     Weapon Law Violations

The 2016 version of the annual report, Crime in Virginia, is available on the VSP Web site.  Mr. Westerberg of the VSP has (again) kindly furnished a copy of the data as an Excel spreadsheet so I haven’t had to copy the numbers out of the pdf on the Web.

These data have their peculiarities.  VSP counts incidents, not offenses, so an incident that includes multiple offenses is counted only once.  Where an incident includes more than one offense, they report the worst.  Thus, for an incident where an offender murders someone in the course of a burglary while carrying drugs, the incident is reported as a murder.
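
That “report only the worst offense” rule can be sketched as follows.  The severity ranking here is invented for illustration; the real ordering is the FBI’s UCR hierarchy, not this one.

```python
# For each incident, count only the most serious offense.
# SEVERITY is an invented ranking (1 = worst); the actual ordering
# comes from the FBI's UCR hierarchy rule.

SEVERITY = {"Homicide": 1, "Robbery": 2, "Burglary": 3, "Drug/Narcotic": 4}

def reported_offense(offenses):
    """Return the single offense that gets counted for an incident."""
    return min(offenses, key=lambda o: SEVERITY[o])

# A murder committed during a burglary while carrying drugs
# is counted as one homicide incident:
incident = ["Burglary", "Homicide", "Drug/Narcotic"]
print(reported_offense(incident))  # Homicide
```

One consequence: the published burglary and drug counts understate those offenses whenever they occur inside a more serious incident.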

They report the numbers by police agency.  Thus, there are counts both for the Farmville Police and the Prince Edward Sheriff, despite their overlap in the Town.  They also list incidents reported to the State Police; for example, the Richmond Police Department shows 19,081 incident reports in 2016 and the State Police show 206 in Richmond that year. 

The report also includes data for the colleges, the Capitol Police, and state agencies such as the ABC Board. 

Finally, the small jurisdictions produce some weird statistics because even a small variation can produce a large change in the crime rate.  As well, in some small jurisdictions the State Police report a significant fraction of the incidents.  For instance, in Alleghany County in 2016, 215 incidents were reported to the sheriff and 122 to the State Police.

I produced the graphs and table below by leaving out the data for the State Police and State agencies.  I also omitted data for the jurisdictions with populations <10,000.

Here, then, are the 2016 data by jurisdiction, expressed as Type A offense reports per 100 population, plotted v. population.

image

The graph is distorted by the Big Guys, Fairfax and, to a much lesser extent, Virginia Beach, Prince William, Chesterfield, Loudoun, and Henrico.  If we expand the axis to shove those jurisdictions off the graph we get:

image

The correlation is zilch, suggesting that population is not related to the rate of offenses reported.

Richmond is the gold square here; the red diamonds are the peer jurisdictions, Hampton, Newport News, and Norfolk (left to right).

Richmond is sixth from the “top”:

image

(The VSP database truncates at about 25 characters, so we get, e.g., the Roanoke Police “Depar”.)

The Richmond rate is down this year, reflecting the statewide trend, albeit still nearly double the statewide rate.

image

The Type A total is driven by the property crime numbers: Typically the larceny, vandalism, and motor vehicle theft numbers will account for 2/3 of the Type A total.  To see how violent and drug crime are doing, we have to look underneath the totals.

When we do that, we see the number of simple assaults and drug incidents both dropped in Richmond in ‘16.

image

Note: This graph and those immediately below report the raw counts of offenses reported in Richmond, not the rate.  Throughout this period, the Richmond population has been near 200,000, without much change, so you can get close to the rates per 100 by dividing these numbers by two thousand.  Thus, the 1,579 drug incidents in Richmond in 2016 were 0.71 per hundred population; the approximation gives 0.79.
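
The divide-by-2,000 shortcut in that note, checked against the exact calculation.  The population figure below is an assumption, chosen to be consistent with the 0.71 rate quoted above, not an official Census number.

```python
# Approximate rate per 100 population: divide the raw count by 2,000.
# That shortcut is valid while the population is near 200,000.
# The 223,000 population is an assumption for illustration.

count = 1579            # drug incidents, Richmond, 2016
population = 223_000    # assumed; use the actual Census figure in practice

approx = count / 2_000            # the shortcut from the note
exact = count / population * 100  # exact rate per 100 population

print(f"approx: {approx:.2f}, exact: {exact:.2f}")
```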

The robbery, aggravated assault, and weapon law numbers did not change much this year albeit the weapon count remains up from the pre-2014 period.

image

The rape, “other” (nonforcible) sex crime, kidnapping, arson, and murder rates all continued to bounce around by small amounts.  As with robbery and aggravated assault, the decreases from the early 2000’s are remarkable. 

image

To see the (decreasing) rates in Forest Hill (notwithstanding the unacceptable rate of car breakins), go here.

Lies, Damn Lies, and Poverty

Our former Superintendent was in the job from January, 2014 to June, 2017.  Under his supervision, Richmond’s (awful) SOL pass rates peaked in 2015 and then declined.

image

While he was with us, the Super joined in the popular sport of blaming funding and – to the point here – blaming the poorer students for Richmond’s gross underperformance.

I’ve pointed out elsewhere that RPS is wasting something like $55 million per year to obtain appalling pass rates.  They are fibbing about the need for more money (aside from the need to fix the buildings they have been neglecting for decades).

Today I want to look further into the effect of our large population of poorer (“economically disadvantaged”) students.

For sure, economically disadvantaged (“ED”) students underperform their more affluent peers by about 20 points.  For example, on the state average data:

image

Thus, increasing the percentage of ED students can be expected to lower the average pass rates. 
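
That dilution is pure arithmetic.  With, say, ED students passing at 60% and non-ED students at 80% (invented rates, chosen to mimic the roughly 20-point gap), the division average falls as the ED share rises even if neither group’s performance changes at all:

```python
# Division average pass rate as a weighted mean of the two groups.
# The 60%/80% group rates are invented for illustration.

def division_average(ed_share, ed_rate=60.0, non_ed_rate=80.0):
    return ed_share * ed_rate + (1 - ed_share) * non_ed_rate

for share in (0.2, 0.4, 0.6, 0.8):
    print(f"{share:.0%} ED -> {division_average(share):.0f}% average")
```

So a lower division average, by itself, tells us nothing about how well either group is being taught; for that we need the group-level data below.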

The VDOE data permit a look beyond this dilution of the average pass rate to see if there is an effect of ED enrollments on the performance of each group.

To begin, here are the 2017 3d grade reading pass rates by division for the ED and non-ED students, plotted vs. the percentage of ED students taking the test at that grade level in the division.

image

(BTW: VDOE has a new front end to the SOL database.  It’s a bit easier to use than the old system but also slow.)

Note: Data are omitted here for the divisions with one or more missing data points (usually because of the suppression rule [docx] that applies to groups of fewer than ten students).
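
That filtering can be sketched like this.  The data layout, division names, and counts are invented for illustration; only the fewer-than-ten threshold comes from the suppression rule.

```python
# Keep a division only if no group falls under the suppression
# threshold (fewer than 10 tested students).  Data shape is assumed.

MIN_GROUP = 10  # VDOE suppression threshold

def usable(division):
    """True if both groups have enough tested students to be reported."""
    return all(division[g]["tested"] >= MIN_GROUP for g in ("ED", "non-ED"))

divisions = {
    "Division A": {"ED": {"tested": 120}, "non-ED": {"tested": 300}},
    "Division B": {"ED": {"tested": 7},   "non-ED": {"tested": 250}},
}
kept = [name for name, d in divisions.items() if usable(d)]
print(kept)  # ['Division A']
```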

Here, as the ED share of the tested group in the division increases, the scores drop somewhat for both the ED and non-ED groups, but the correlations are minuscule. 

We also see Richmond (the square points) underperforming considerably.

The data for the other two elementary grades show similar – well, mostly similar – patterns.

image

image

The Grade 4 ED data show a counter-intuitive increase with increasing ED population but the tiny R-squared value tells us that the two variables are essentially uncorrelated.  The decreasing scores of the non-ED groups show some larger, but still small, correlations with the ED population.

In the middle school data, Richmond drops from poor performance to appalling underperformance and the correlation of the non-ED scores with the %ED improves some.

image

image

image

Except for the third and sixth grades, the R-squared values tell us that the average scores are mildly correlated with the ED population percentage for the non-ED students but vanishingly so for the ED students themselves.

Richmond’s underperformance relative to the other divisions with similar ED populations ranges from moderate in the elementary grades to bottom-of-the-barrel in the middle school grades.

Beyond emphasizing the awful performance of Richmond’s awful middle schools, these data suggest that the major impact of larger ED populations is on the performance of the non-ED students.

The mathematics data tell much the same story.

image

image

image

image

image

image

The Richmond schools underperform again, disastrously so in the middle school grades.

The fitted lines go the wrong way in three cases for the ED scores but, again, with trivial correlations.  The effect on the non-ED students looks to be larger, and to enjoy better correlations, than with the reading data. 

These data hint at a hypothesis, particularly in the middle school grades: At the division level, the effect of large ED populations is mostly seen in the declining performance of the non-ED students.  The corollary: Placing ED students in schools with low ED populations would not significantly improve the performance of the ED students.

I’ll try to pry out the by-school data to see if the pattern persists there.

In any case, our former Superintendent and the other apologists for RPS are not dealing with the real world when they try to blame Richmond’s poor performance on the economic status of the students: Richmond’s schools, especially the middle schools, underperform grossly in comparison to other divisions with similar ED populations.

How about some truth in education: Let’s blame the schools, not the kids.