Woodville

On the all-subject average, Richmond’s Woodville Elementary School has the fourth worst pass rate in the state, 33.82%.

image

Woodville’s reading average was 36.5; math, 40.0; history, 34.3.  It was the 24.6 in science that dragged the average down.

The third grade reading scores have been improving in recent years (although if a 44% pass rate qualifies as “improved,” it’s hard to think of what came before without shuddering).

image

The higher grades, not so much.

image

image

The pattern mostly repeated on the math tests, albeit with drooping pass rates in the fifth grade.

image

image

image

One can think of Woodville as a (very expensive) ad for the County schools.


The Board of “Education” Wants to Help Richmond the Way It Has Helped Petersburg

For years, the mantra to distract attention from Richmond’s failed schools has been, “We beat Petersburg.”

On the 2018 SOL pass rates, we can say it again as to all five subjects and the five subject average: Petersburg has the worst SOL pass rates in the state.  Richmond is only second worst.

Specifically: Here are the bottom 20 (or more) divisions in each subject.  I’ve highlighted the peer jurisdictions in red and, as a courtesy to my readers there (two is more than none!), I’ve also highlighted Charles City and Lynchburg.

image

image

image

image

image

image

Petersburg has been operating under Memoranda of Understanding (i.e., edicts of the Board of “Education”) since at least 2004.

As I have pointed out, the Board of “Education” is a paper tiger.  It has the power to sue to compel compliance with the Standards of Quality.  It has never done so.  It has, instead, persisted with a failed Memorandum of Understanding process that it knows does not work.

There is a simple explanation for this counterproductive behavior:

If it were to sue, the Board would have to tell the judge what Petersburg must do to fix the schools.  The Board cannot do this because it does not know (Sept. 21, 2016 video starting at 1:48) how to fix those schools.  That is, the Board knows it would be futile to sue (and even more embarrassing than its present failure).

So now the Board has brought the same disruptive, expensive, and futile process to Richmond with, in this first year, the inevitable absence of any measurable benefit to the students.

On the evidence of fourteen years of sterile (if not destructive) State supervision of Petersburg and a fruitless year of State supervision of Richmond, RPS would be wise to tell the Board of “Education” to take its Memorandum of Understanding and go away.

Poverty: The Bad Excuse For Bad Schools

The poverty excuse for poor school performance again rears its head.

The 2018 SOL data tell us, yet again, to look elsewhere for the causes of poor school performance.

Before we look at those new data, let’s clear away some underbrush:

  • It is beyond question that poor kids underperform on the SOL tests.  For example, on the 2018 state average pass rate data, the “economically disadvantaged” (here abbreviated “ED”) students underperform their more affluent peers by 21.7 points on the reading tests and 20.0 on the math:

image

  • Correlation, however, does not imply causation.  The cause of this underperformance may well be something related to poverty, not the ED itself.
  • The data tell us that economic disadvantage has much lower correlation with SOL pass rates than do other factors.  See below.
  • Even with the flawed SOL yardstick, we can identify schools and divisions that perform well and that underperform.  Also see below.
  • The State Board of “Education” had, but has abandoned, a poverty-neutral measure of academic growth, the Student Growth Percentile.  The SGP data showed large variations in division (here and here) and teacher performance.
  • Poverty makes a perfect excuse for poor school performance because some portion of the population will always be less affluent than the citizenry in general.  And, for sure, the schools can’t fix poverty, so they like to blame that external factor for their own failures.
  • There are indications that even perfectly awful schools with large numbers of ED students can be improved.

Turning to the 2018 data, let’s start with the division pass rates on the reading tests vs. the percentage of the economically disadvantaged students.

image

A glance at the graph tells the same story as the statistics: As the ED percentage increases, the scores go down but there is a huge amount of scatter.  Some divisions greatly outscore the trendline and some grossly underperform.  Clearly, some factor(s) must have a much stronger relationship with SOL performance than the incidence of poverty in the student population.
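The scatter around the trendline can be made concrete: fit a least-squares line to the (ED percentage, pass rate) pairs and look at each division’s residual. A minimal sketch; the division names and numbers below are invented for illustration, not the VDOE data:

```python
import numpy as np

# Hypothetical divisions: (ED percentage, reading pass rate) -- not VDOE data
divisions = {
    "Alpha":   (30.0, 85.0),
    "Bravo":   (50.0, 80.0),   # outscores the trendline
    "Charlie": (50.0, 64.0),   # grossly underperforms it
    "Delta":   (70.0, 68.0),
}

ed = np.array([v[0] for v in divisions.values()])
rate = np.array([v[1] for v in divisions.values()])

# Least-squares trendline: pass rate as a linear function of ED percentage
slope, intercept = np.polyfit(ed, rate, 1)

# Residual = actual rate minus the trendline's prediction; positive
# residuals beat the trend, negative ones fall short of it
residuals = {name: r - (slope * e + intercept)
             for name, (e, r) in divisions.items()}
```

Ranking divisions by residual, rather than by raw pass rate, is what separates the over- and under-performers that the scatter plot shows.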

Richmond is the gold square on the graph.  The peer cities are the red diamonds, from the left Hampton, Newport News, and Norfolk.

Richmond’s cry of “poverty” is falsified on these data: All but one of the 22 divisions with ED populations larger than Richmond’s outperformed Richmond.

image

The math data tell the same story.

image

Note: The ED percentages here and below are percentages of the students taking the tests in question, so they differ slightly between the two subjects.  For example, Richmond is 68.2% ED on the reading tests and 67.1% on the math.

Of course, division averages do not separate out the performance of the schools with larger or smaller ED populations.  So let’s take a look at the data by school.

Note: Some of the smaller schools are absent from these graphs because of the VDOE suppression rules as to small groups of students.

I’ve broken the data out by grade.  First, reading, in the elementary grades:

image

image

image

These are modest correlations, especially in the non-ED data, with both groups showing roughly the same change with increasing ED percentage (between 1.6% and 2.0% decrease per 10% increase in ED).

On to middle school:

image

image

image

These are about the trends we might expect but with some better (albeit still modest) correlations.  One interesting difference:  It looks like the effect of increasing ED percentage is about a third larger on the non-ED than on the ED pass rates.

I’ll spare you the math data by school.  They tell the same story but with lower scores and even more scatter.

The bottom line: At the school level, as at the division level, by far the largest effect on SOL pass rates relates to some factor(s) other than the relative number of economically disadvantaged students. 

And we know what one of the other factors (probably the most important one) is: teacher effectiveness.  For example, the SGP data showed for one year (I think it was 2014):

Only one of Richmond’s twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average.  Six (or seven, depending on the rounding) were more than two standard deviations below the state average and four were more than three standard deviations below.  The Richmond average is 1.5 standard deviations below the state average.

And the Richmond picture was even worse in 6th Grade math:

http://calaf.org/wp-content/uploads/2015/03/image27.png

Note: State average there was 50.6.
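For readers who want to check the arithmetic, “standard deviations below the state average” is just a z-score. A sketch using the 50.6 state average from the note; the standard deviation and the teacher values are hypothetical, for illustration only:

```python
state_mean = 50.6   # state average SGP, per the note above
state_sd = 8.0      # hypothetical standard deviation, for illustration only

# Hypothetical teacher-average SGPs for one division
teachers = [52.1, 44.0, 33.5, 31.0, 25.8]

def z_score(x, mean, sd):
    """Standard deviations x sits above (+) or below (-) the mean."""
    return (x - mean) / sd

zs = [z_score(t, state_mean, state_sd) for t in teachers]

# Count the teachers more than two standard deviations below the state average
far_below = sum(1 for z in zs if z < -2)
```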

Thus, not only is it futile to blame poverty for poor school performance, we know at least one place where we can improve learning: teacher performance.

More Teachers? More Schools? No More Learning?

We hear a lot about the benefits (and not) of smaller class size. 

The VDOE Web site has data on fall division enrollments, numbers of teaching positions, and SOL performance, so let’s look at those numbers.  (The latest teacher numbers are from 2017, so I’ll use the other data from that year.)

To start, here are the division average reading pass rates plotted against the number of students per teaching position.

image

Notice the large range here, from 6.85 in Highland to 15.4 in Prince William.  The average of division averages is 11.5.

The gold square is Richmond at 10.7 students per teaching position.  The red diamonds are the peer cities, from the left Norfolk, Hampton, and Newport News.

Recall that smaller classes mean smaller ratios so look to the left for “better” in terms of class size.

The fitted line has a slight positive slope (0.3% pass rate increase for an increase of 1.0 in the student/teacher ratio), suggesting that smaller classes are associated with lower pass rates.  But the R-squared value, 0.4%, tells us that the two variables are essentially uncorrelated.
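The slope and the R-squared come from an ordinary least-squares fit; R-squared is the square of the correlation coefficient. A sketch of the computation on fabricated ratio/pass-rate pairs (not the VDOE data), chosen to give a small positive slope with essentially no correlation:

```python
import numpy as np

# Hypothetical (students per teaching position, reading pass rate) pairs
ratio = np.array([7.0, 9.5, 10.7, 11.5, 13.0, 15.4])
passrate = np.array([78.0, 74.0, 80.0, 75.0, 79.0, 77.0])

# Fitted line: pass rate = slope * ratio + intercept
slope, intercept = np.polyfit(ratio, passrate, 1)

# R-squared is the square of the Pearson correlation coefficient.
# A value near zero means the ratio explains almost none of the
# variation in pass rates, whatever the sign of the slope.
r = np.corrcoef(ratio, passrate)[0, 1]
r_squared = r ** 2
```

A nonzero slope with an R-squared near zero is exactly the pattern in the graph above: the line tilts, but the points ignore it.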

On these data, it looks like those divisions that are hiring more teachers per student are not, on average, getting any reading benefit from the extra money.

The math data tell much the same story.

image

We’ve also heard that Richmond has more small schools than usual, on purpose. 

We can get a measure of the number of schools in a division by counting the principals.  The Richmond average is 514 students per principal while the average of the division averages is 515.  The range, again, is surprisingly large, from 101, Highland again, to 943.6, Chesterfield.

The graphs do not suggest a benefit from Richmond’s average school size vis-à-vis the peers. 

image

image

So, next time RPS tells me it wants to replace some of its ancient infrastructure with other small schools, I’ll tell ‘em to take my share in stock in a nice James River bridge.

Third from the Bottom: Henderson

Richmond’s third-worst middle school, Henderson, was 15th from the lowest in the state this year in terms of the five-subject average SOL pass rate.

https://calaf.org/wp-content/uploads/2018/08/image-54.png

Over the past five years, Henderson has increased its math pass rate from 29.05 to 29.53 and its reading pass rate from 28.01 to 45.21.  Both pass rates declined this year.

image

The line fitted to the reading data extrapolates to the (former) reading benchmark in nine years; the math, 46 years.

Here are the data by subject and grade.

image

image

image

image

image

image

The reading scores are merely awful; the math, appalling.

2d From the Bottom: Elkhardt Thompson

In 2015, RPS merged two failing middle schools to form one failing middle school, ET. 

This year, ET was the 11th worst school in the state on the five-subject average.

https://calaf.org/wp-content/uploads/2018/08/image-54.png

ET got to that point by losing ground this year.

image

To the extent that the fitted lines mean much with only three data points, they speak of ongoing declines in the pass rates in both reading and math.

This year, the reading pass rate, 40.1, is 34.9 points below the 75% accreditation benchmark.  The math pass rate, 33.3, is 36.7 points below the 70% math benchmark.

Here are the data by grade and subject.

image

image

image

image

image

image

Looks like there is a particular problem in the 7th grade.  But, then, with a failure rate of 59.9% in reading and 66.7% in math, the entire school is a particular problem.

The Board of “Education” has the data to tell us how much that was exacerbated by cheating in the elementary schools that feed ET.  But, of course, they’re not talking.  Indeed, I’ll bet you a #2 lead pencil they don’t even look at those data.

Grade 3 Reading (and not)

The RT-D this morning has a piece on declining third grade reading SOL pass rates and the unpleasant implications of that for our students.

The VDOE database has numbers on the subject.  Here, to start, are the Grade 3 reading pass rate changes this year for Richmond, the state, and the individual Richmond elementary schools.

image

The 5.36% drop in Richmond certainly has been inflated by institutional cheating, or, more precisely, by the end of at least some of it.

Last year, Carver contributed a (pretty good) 79.75 pass rate to the 74.6 state average; this year, there is no Carver score because of the cheating.  It looks like Fairfield (and perhaps some other schools) got the Word and resumed honest testing this year, resulting in the huge drop at Fairfield (and, probably, some of the smaller drops elsewhere).
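The arithmetic behind one school’s “contribution” to an average is a weighted one: dropping a school’s results shifts the average by that school’s share of the test-takers. A sketch with invented counts (none of these are the real Carver or state numbers):

```python
# All numbers below are invented, for illustration only.
state_testers = 95000
state_passers = 70870            # 70870 / 95000 = 74.6% state average

school_testers = 160
school_passers = 128             # 128 / 160 = 80.0% at this school

# Average with the school's results removed: a better-than-average
# school was pulling the state average up, so dropping it lowers the rate
new_avg = (state_passers - school_passers) / (state_testers - school_testers)
```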

We’ll have to wait another year to start to sort that out.

In the meantime, let’s look further into the historical record.  We’ll start with this year’s big gainer (Swansboro) and loser (Fairfield) along with the Richmond and state averages.

image

We must hope that the improbable increase at Swansboro reflects a genuine improvement.

Next, Woodville (the fourth worst school in Virginia as measured by the 5-subject average) and Munford (Richmond’s best elementary school, #203 from the top school statewide on the 5-subject average).

image

Woodville has a long way to go but the last three years set a hopeful pattern.

Next, Westover Hills, our neighborhood school, and Patrick Henry, a nearby neighbor.

image

Finally, our new Superintendent’s neighborhood school, Holton, and a nearby neighbor to it, Ginter Park.

image

If you’re interested in the history of some other school, email me, john {at} calaf [dot] org, and I’ll ship you the spreadsheet.

Aside:  Notice that the state average this year is 2.6 points below the old benchmark for accreditation in English, 75%.  Seeing the difficulty of improving the schools’ performance (at least as to the older cities, the Board admits it doesn’t know how (Sept. 21, 2016 video starting at 1:48)), the Board of “Education” has changed the rules to make it much easier for a school to be accredited.  They can’t improve learning, so they have turned to fudging the numbers.

Trends at MLK

In terms of the five-subject average SOL pass rate, Richmond’s Martin Luther King Jr. Middle School is the second worst school in Virginia.

In terms of the school average, the reading pass rate this year increased 6.76 points to 31.6% (that is, 68.4% of the students flunked the test); the math average rose 0.51 to 21.9% (a 78.1% failure rate).

image

If we use the least squares fitted lines to extrapolate, the reading pass rate reaches the (former) 75% accreditation benchmark in 92 years (2110).  The fitted line to the math rate data has a negative slope; it never reaches the math benchmark, 70%.
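These extrapolations just solve the fitted line for the year it crosses the benchmark. A sketch with an invented pass-rate history (not the actual MLK data):

```python
import numpy as np

BENCHMARK = 75.0  # the former English accreditation benchmark

# Hypothetical (year, pass rate) history for one school -- not MLK's data
years = np.array([2014.0, 2015.0, 2016.0, 2017.0, 2018.0])
rates = np.array([26.0, 27.5, 28.0, 30.5, 31.6])

# Least-squares line: rate = slope * year + intercept
slope, intercept = np.polyfit(years, rates, 1)

def years_to_benchmark(slope, intercept, from_year, benchmark=BENCHMARK):
    """Years until the fitted line reaches the benchmark.

    A line with a non-positive slope never gets there, hence None.
    """
    if slope <= 0:
        return None
    return (benchmark - intercept) / slope - from_year

wait = years_to_benchmark(slope, intercept, 2018)
```

With a negative fitted slope, as on several of the MLK datasets, the function returns None: the line never reaches the benchmark.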

The sixth grade reading rate improved from 24.6 to 31.6. 

image

The extrapolation reaches the benchmark in 36 years.

The seventh grade reading pass rate rose to 32.7; the data extrapolate to reach 75 in 159 years (161 years if you don’t round up).

image

The fitted line for the eighth grade data has a negative slope; it never reaches the benchmark.  The 12.2 point increase this year still left a 69.9% failure rate.

image

On the sixth grade math tests, the scores dropped below 30% this year.  The line fitted to the data has a negative slope.

image

The seventh grade math pass rate increased by 2.65 points this year, but left an 89.7% failure rate.  The fitted line, yet again, has a negative slope.

image

The eighth grade math score, at 18.9 this year, is lower than last year’s 21.0, and the fitted line again has a negative (this time a large negative) slope.

image

This is a picture of profound failure. 

It may well be that these scores are being depressed by cheating in the elementary schools (by the schools, not the kids) but that merely moves the location of (part of) the problem: RPS is failing, miserably, and is damaging children it is supposed to be educating. 

What Happened at Fairfield Court?

Fairfield Court dropped off an SOL cliff this year, from a (suspiciously good) 76 in reading to a disastrous 38 and from 77 to 30 in math.

Data from the past suggest that the school (not the students, mind you, the school) may have been cheating with its disabled students.  You may think of anomalous disabled pass rates as the canary in the cheating machine.

The data this year show that the Fairfield disabled pass rates shared the plunge but that they remain unusually high vis-à-vis the non-disabled.

image

image

It’s an interesting puzzle why Carver continued cheating this year but Fairfield went (at least mostly) honest.  These data do not suggest an answer.

In any case, the Richmond disabled rates also remain suspiciously high vis-à-vis the non-disabled rates.

image

image

Of course, the State Board of “Education” will continue to ignore this issue.  We can hope our new Superintendent will not want this kind of thing to fester during his tenure.

Mixed News at Holton

Overall, Holton dropped a point on both the reading and math SOLs this year.  The data by subject and grade paint a more nuanced picture.

The reading data suggest a problem in the fourth grade but otherwise look pretty good.

image

image

image

For what it may be worth, I’ve asked Excel to fit a straight line to each dataset to suggest the trend in the pass rates.  In that sense, Holton is showing progress in reading, notwithstanding the 4th grade blip this year.

The math data also suggest an issue in the 4th grade.

image

image

image

The good news is that Holton is flirting with the state average in reading.  The not so good: the school is handily beating the Richmond average in math, albeit at a rate still below the 70% accreditation benchmark.