Dollars ≠ Scholars

VDOE has just posted Table 19 of the 2017 Superintendent’s Annual Report.  That table gives us, inter alia, the Average Annual Salaries for All Instructional Positions of each school division.

The spreadsheet tells us, “All Instructional Positions include classroom teachers, guidance counselors, librarians, technology instructors, principals, and assistant principals.”

Juxtaposing the Table 19 data against the 2017 SOL pass rates we obtain, for the reading tests:

image

The fitted line suggests that, among the Virginia divisions, $10K in higher average salary is associated with about a two-point increase in the division average reading pass rate.  The R-squared, however, tells us that the two variables are only distantly related: salary accounts for only about 4% of the variance, leaving roughly 96% to be explained by other variables.
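For anyone who cares to check the arithmetic, here is a minimal Python sketch of that fit.  The file name and column names are my own invention; the salaries come from Table 19 and the pass rates from the SOL database.

# Minimal sketch: ordinary least-squares fit of division reading pass rate vs. average salary.
# The CSV name and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("salary_vs_reading_2017.csv")          # columns: "salary", "pass_rate"
fit = stats.linregress(df["salary"], df["pass_rate"])

# The slope comes out in pass-rate points per dollar; scale it to points per $10,000.
print(f"Slope: {fit.slope * 10_000:.1f} points per $10K")

# R-squared: the fraction of the pass-rate variance that salary explains.
# A value near 0.04 leaves ~96% of the variance to other factors.
print(f"R-squared: {fit.rvalue ** 2:.3f}")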

Richmond is the gold square on the graph.  The peer jurisdictions from the left are Norfolk, Hampton, and Newport News.  Charles City is green; Lynchburg is blue.

The graphs for the other subjects tell much the same story.

image

image

image

image

And here is the average of the five subject averages.

image

That high-performing division at a mere $43,006 is West Point.  The high performer paying almost twice as much, $78,350, is Falls Church.

Well, we knew that money can’t buy you love.  Looks like it doesn’t buy better division pass rates either.

More Teachers ≠ More Graduates

It is Spring and VDOE has posted the teacher data it has had since September.  So let’s look at the relationship between those numbers and the graduation rates.

First, the “Federal” graduation rate, i.e., the sum of standard + advanced diplomas in the 4-year cohort divided by the cohort size.
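In code, that arithmetic is just this.  The counts below are invented for illustration; the real ones are in the VDOE cohort reports.

# "Federal" rate: (standard + advanced diplomas) / 4-year cohort, as described above.
# These counts are hypothetical, chosen only to illustrate the division.
standard_diplomas = 620
advanced_diplomas = 81
cohort_size = 1000

federal_rate = 100 * (standard_diplomas + advanced_diplomas) / cohort_size
print(f"Federal graduation rate: {federal_rate:.1f}%")   # 70.1%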

image

The fitted line might suggest that hiring more teachers leads to lower graduation rates, but the tiny R-squared value tells us the two variables are essentially uncorrelated.

Richmond is the gold square there.  The red diamonds are the peer cities, from the left Newport News, Hampton, and Norfolk.  Charles City is green; Lynchburg, blue.

As we have seen, VDOE has inflated the federal graduation rate by using “credit accommodations” that permit counting Modified Standard Diplomas as “Standard.”  They do an even better job of cooking the data by defining an “On Time” rate; in 2017 that deception raised Richmond’s federal rate of 70.1% to a bogus 76.8%.
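The arithmetic of that boost is simple: keep the same cohort in the denominator and sweep more diploma types into the numerator.  A hypothetical illustration follows; the counts are invented to mimic the 2017 Richmond percentages, not taken from the VDOE reports.

# Same denominator, fatter numerator. Counts are hypothetical, scaled to match
# the 70.1% federal and 76.8% "On Time" rates quoted above.
cohort_size = 1000
standard_plus_advanced = 701       # the federal numerator
extra_credentials_counted = 67     # additional diploma types the "On Time" rate sweeps in

federal_rate = 100 * standard_plus_advanced / cohort_size                                 # 70.1%
on_time_rate = 100 * (standard_plus_advanced + extra_credentials_counted) / cohort_size   # 76.8%
print(f"{federal_rate:.1f}% vs. {on_time_rate:.1f}%")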

Here is the happier picture painted by this manipulation:

image

Again the two variables are not correlated.  The inflated numbers make Richmond (and the Board of “Education”) look a bit better but do no good at all for the Richmond students whom the system has failed to educate.

Interesting note: For Charles City, both rates are 92.6%.

More Teachers ≠ More Learning

The dogwoods are coming into bloom, so we can expect VDOE to start posting the data they have had since last September.  Indeed, they posted Table 17b, “Instructional Positions Per 1,000 ADM,” the other day.

A reader (the reader?) noticed and asked whether there is a relationship between those numbers and the SOL pass rates.  The short answer is “no.” 

The longer answer:

image

The fitted line might suggest that increasing the relative number of teachers is associated with decreasing pass rates but the minuscule R-squared value tells us that the two variables are essentially uncorrelated.

The gold square is Richmond; the red diamonds, from the left, are the peer cities, Newport News, Hampton, and Norfolk.  Lynchburg (home of the reader) is blue; Charles City, green.

Using the same system, here are the data for the other four subject areas.

image

image

image

image

And here is the five subject average.

image

I will resist the temptation to comment further about Richmond’s above-average number of teachers and its bottom-of-the-barrel pass rates.  (I earlier discussed Richmond’s very expensive failure rates and dismal graduation rates.)

The Worst of the Appalling

The earlier post shows that, despite the happy aura, Westover Hills Elementary School is miserably failing to educate its students. 

I singled that school out because of the story in the paper, not because it is the worst performer in town.  It merely is among the worst.

For example, here from the 2017 testing are the twenty-one worst 3d grade reading pass rates in Virginia.  For reference, the accreditation benchmark is 75%.

image

Westover Hills’ 39.3% ties for ninth from the worst.  Seven other Richmond elementary schools join Westover Hills in this cellar.

It’s a secret pleasure to see a school each from Fairfax and Henrico on this list.  That said, these data tell us that terrible schools are not unique to Richmond, but the concentration of them is.

Moving to the higher grades:

image

image

Notice that Westover Hills’ 5th grade performance floated above the bottom of the barrel.

The math picture is similarly awful.

image

image

image

The middle school numbers are no more encouraging, especially in light of MLK’s successful candidacy for worst in the state.

image

This 7th grade list expands to 22 schools in order to include Binford.  Of Richmond’s seven middle schools, all but Albert Hill made this expanded list.

image

Armstrong High makes the 8th grade list courtesy of the complicated SOL system that results in high schools administering the 8th grade test to some students.  Among the middle schools, MLK retains its “lead.”

image

Turning to the End of Course tests:

image

All of Richmond’s mainstream high schools made the bottom 23.  In a happy contrast, both of the more selective high schools aced the reading tests (Franklin, which is selective and has both middle- and high school grades, also did just fine, with a 93.9% pass rate).

image

To complete the picture, here are the math data.  The accreditation benchmark is 70%.

image

image

image

The only surprise in the middle school math data is the (slight) competition for MLK in the race to the bottom.

image

image

image

As to the mainstream high schools, Huguenot floats out of the cellar.  Franklin (which is selective) replaces Huguenot.

image

Neither of Richmond’s selective high schools appears near the top of the math EOC list.  Open High is No. 178 from the top with a 98.1% pass rate; Community is No. 611 at 67.9%, behind Huguenot at No. 579 with 72.1%.

Welcome to Richmond, Mr. Kamras.

Enthusiasm v. Data at Westover Hills

The RT-D’s Ned Oliver did a piece the other day about the flight from Richmond’s public schools.  As a contrast (an antidote?) to that, Oliver discussed Westover Hills Elementary School, where

Principal Virginia Loving said the parents who come to open houses to check out the school rarely ask about test scores, instead focusing on class size and extracurricular opportunities — two points where her school is strong.

Indeed, Oliver quotes a spokesperson for RPS for the proposition

I know that’s (Superintendent Jason) Kamras’ vision — to have all of our schools look like Westover Hills — look like the city.

Let’s take a look at those “rarely ask[ed] about” test scores.  First, the reading pass rates.

image

image

image

Recall that VBOE installed new English and science tests in 2013 that lowered pass rates statewide.  The decreases were generally exaggerated in Richmond because our then-Superintendent failed to align the curricula to the new tests. 

At Westover Hills, we see 2013 reading decreases that more closely mirrored the state averages than did the Richmond decreases, presumably reflecting better preparation at the school. 

In 2014, the third grade enjoyed a reading renaissance.  Unfortunately, that lasted only the one year and was followed by a decline that persisted through 2017.

The fourth grade decline anticipated the 2013 drop and then continued to 2014.  The smaller gains in 2015 and 2016 were wiped out by a further plunge in 2017.

The fifth grade suffered a further drop in 2015, from which it has only partially recovered.

The reading scores for all three grades remain below the 2017 Richmond averages while, on the division average, Richmond was third worst in the state.  Those differences are appalling: Third Grade, 19 points below Richmond, 35 below the state average; Fourth Grade, 29 and 42 points down; Fifth Grade, 11 and 22 points.

The mathematics pass rates paint a similarly dismal picture.

image

image

image

The new tests here were in 2012.  In 2017, Richmond had the second lowest division pass rate in math. 

The 2017 Westover Hills deficits were:

image

Finally, science:

image

Before the new tests, Westover Hills was running well above the state average on the 5th grade science test (the only elementary grade tested in science).  In 2017, it was 18 points below Richmond, 33 below the state.

If our new Superintendent really wants “to have all of our schools look like Westover Hills,” he is prescribing a massive (but happy, it seems) failure that is far beyond even the current, appalling state of our schools.

Sex Offender at 4310 New Kent

The State Police Web site has the Sex Offender Registry required by Va. Code § 9.1-913.

The registry now includes a map.  Here is the portion that includes our neighborhood.

image

The closest entry to our neighborhood is Mr. Edwards at 4310 New Kent.  (You’ll have to fill in the number to prove you’re not a robot before the link will open.)

image

The Registry doesn’t say that Mr. Edwards would be dangerous, just that he was convicted of aggravated sexual battery in 2010.  You get to draw your own inference as to risk to your family.

Lies, Damn Lies, and Poverty

Our former Superintendent was in the job from January, 2014 to June, 2017.  Under his supervision, Richmond’s (awful) SOL pass rates peaked in 2015 and then declined.

image

While he was with us, the Super joined in the popular sport of blaming funding and – to the point here – blaming the poorer students for Richmond’s gross underperformance.

I’ve pointed out elsewhere that RPS is wasting something like $55 million per year to obtain appalling pass rates.  They are fibbing about the need for more money (aside from the need to fix the buildings they have been neglecting for decades).

Today I want to look further into the effect of our large population of poorer (“economically disadvantaged”) students.

For sure, economically disadvantaged (“ED”) students underperform their more affluent peers by about 20 points on the pass rates.  For example, on the state average data:

image

Thus, increasing the percentage of ED students can be expected to lower the average pass rates. 
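That dilution is nothing more than a weighted average.  A toy calculation: the 80/60 rates and the enrollment splits below are hypothetical, chosen only to mimic the roughly 20-point gap above.

# Weighted-average illustration: the overall rate falls as the ED share grows,
# even if neither group's own performance changes. All numbers are hypothetical.
def division_average(pct_ed, ed_rate=60.0, non_ed_rate=80.0):
    w = pct_ed / 100
    return w * ed_rate + (1 - w) * non_ed_rate

for pct_ed in (20, 40, 60, 80):
    print(f"{pct_ed}% ED -> overall pass rate {division_average(pct_ed):.0f}%")
# 20% ED -> 76%, 40% -> 72%, 60% -> 68%, 80% -> 64%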

The VDOE data permit a look beyond this dilution of the average pass rate to see if there is an effect of ED enrollments on the performance of each group.

To begin, here are the 2017 3d grade reading pass rates by division for the ED and non-ED students, plotted vs. the percentage of ED students taking the test at that grade level in the division.

image

(BTW: VDOE has a new front end to the SOL database.  It’s a bit easier to use than the old system but also slow.)

Note: Data are omitted here for divisions with one or more missing data points (usually because of the suppression rule [docx] that applies to groups of fewer than ten students).

Here, as the ED share of the tested group in a division increases, the scores drop somewhat for both the ED and non-ED groups, but the correlations are minuscule.

We also see Richmond (the square points) underperforming considerably.
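For the curious, here is a sketch of the calculation behind that graph.  The file and column names are hypothetical placeholders; the data come from the SOL database.

# Per-group fits of pass rate vs. %ED, one row per division.
# The CSV name and column names are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("grade3_reading_by_division_2017.csv")   # columns: pct_ed, ed_pass_rate, non_ed_pass_rate

# Drop divisions with any suppressed (missing) group, per the fewer-than-ten rule noted above.
df = df.dropna(subset=["pct_ed", "ed_pass_rate", "non_ed_pass_rate"])

for column in ("ed_pass_rate", "non_ed_pass_rate"):
    fit = stats.linregress(df["pct_ed"], df[column])
    print(f"{column}: slope {fit.slope:.2f} points per %ED, R-squared {fit.rvalue ** 2:.3f}")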

The data for the other two elementary grades show similar – well, mostly similar – patterns.

image

image

The Grade 4 ED data show a counter-intuitive increase with increasing ED population but the tiny R-squared value tells us that the two variables are essentially uncorrelated.  The decreasing scores of the non-ED groups show some larger, but still small, correlations with the ED population.

In the middle school data, Richmond drops from poor performance to appalling underperformance and the correlation of the non-ED scores with the %ED improves some.

image

image

image

Except for the third and sixth grades, the R-squared values tell us that the average scores are mildly correlated with the ED population percentage for the non-ED students but only vanishingly so for the ED students themselves.

Richmond’s underperformance relative to the other divisions with similar ED populations ranges from moderate in the elementary grades to bottom-of-the-barrel in the middle school grades.

Beyond emphasizing the awful performance of Richmond’s awful middle schools, these data suggest that the major impact of larger ED populations is on the performance of the non-ED students.

The mathematics data tell much the same story.

image

image

image

image

image

image

The Richmond schools underperform again, disastrously so in the middle school grades.

The fitted lines go the wrong way in three cases for the ED scores but, again, with trivial correlations.  The effect on the non-ED students looks to be larger, and to enjoy better correlations, than with the reading data. 

These data hint at an hypothesis, particularly in the middle school grades: At the division level, the effect of large ED populations is mostly seen in the declining performance of the non-ED students.  The corollary: Placing ED students in schools with low ED populations would not significantly improve the performance of the ED students.

I’ll try to pry out the by-school data to see if the pattern persists there.

In any case, our former Superintendent and the other apologists for RPS are not dealing with the real world when they try to blame Richmond’s poor performance on the economic status of the students: Richmond’s schools, especially the middle schools, underperform grossly in comparison to other divisions with similar ED populations.

How about some truth in education: Let’s blame the schools, not the kids.

And You Thought Our Middle Schools Were Merely Bad

A kind reader points me to a piece in the NYT reporting on a Stanford study.  That study looks at progress from the third to eighth grades as a measure of educational quality.
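As I understand the approach, the growth measure boils down to something like the sketch below.  The numbers are invented for illustration, and the study’s actual statistical machinery is far more elaborate.

# Rough sketch of a 3rd-to-8th-grade growth measure: follow a cohort's average
# score, expressed in grade-level equivalents, and ask how much it grows per year.
# Both scores below are hypothetical.
grade3_level_2012 = 2.1      # hypothetical: cohort tests at a 2.1 grade level in 3rd grade
grade8_level_2017 = 6.3      # hypothetical: same cohort at a 6.3 grade level in 8th grade

growth_per_year = (grade8_level_2017 - grade3_level_2012) / (8 - 3)
print(f"Growth: {growth_per_year:.2f} grade levels per year")   # 0.84, i.e., less than a year's progress per year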

As the Times reports, richer districts tend to have higher scores.  Here, for instance, are the third grade data for 2,000 large districts; the graph plots relative scores vs. district wealth:

image

The eighth grade numbers show a similar pattern but some different rankings:

image

And here is the pattern of changes from third to eighth grades:

image

As the graphs suggest, the story focuses on Chicago.

Overall, the study reports a poor correlation between growth rates and average third grade scores. 

At the bottom of the Times story is a link that will produce score change data for a specified district and nearby systems.  So of course I had to put in Richmond.

image

As you see, Richmond is at the bottom of the barrel. 

The first surprise here is how much better Petersburg does on this measure. 

We also might wonder whether the relatively low growth reported for some excellent systems, e.g., Hanover, reflects weakness at the eighth grade level or strength at the third. 

In the other direction, Richmond’s [lack of] progress from 3d to 8th grades derives from the awful performance of our middle schools relative to the elementary schools.

image

The Stanford data also are consistent with the notion that poverty does not excuse the appalling performance of Richmond’s public schools.