Punishing Poverty

Our Board of “Education” has engineered a reporting system
that rewards the more affluent divisions and penalizes the poorer ones.

We have seen that economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) in terms of pass rates on the SOL tests.  In 2018, that gap ranged from 17.43 points on the History & Social Science tests to 21.82 points on the writing tests.


This places schools and divisions with larger ED populations at an unfair disadvantage with respect to the SOL averages.

Here, to start, is a selection of reading pass rates.


I’ve selected these nine divisions because their ED populations are spaced at about 10% intervals.  (They are, from the left, Falls Church, Powhatan, Arlington, Gloucester, Bath, Dickenson, Roanoke City, Northampton, and Greensville.)  The orange diamonds are the division average reading pass rates of the Not ED students; the green triangles are the ED pass rates. 

The SOL average pass rates are the blue circles.  The divisions with smaller numbers of ED students to drag down that average have higher SOL numbers.  Indeed, the divisions with fewer than 51% ED (the statewide average) enjoy a boost; divisions with more than 51% suffer a penalty.

The extreme examples are Falls Church and Greensville County (which includes Emporia).

Falls Church had the lowest ED numbers in the state, 9%.  The reading pass rate for their Not ED students was 95%; for ED, it was 67% (a 28 point spread; no bragging rights there).  The average of the ED and Not ED averages was 81%, which would have been a fair measure of the performance of both groups.  The division SOL pass rate, however, was 92% because of the small number of EDs.  Falls Church thus enjoyed an eleven point SOL bonus for affluence.

Greensville County was the other end of the scale, 93% ED.  The pass rates were lower: 76% for Not ED, 59% for ED, a seventeen point spread with a 68% average.  But the reported SOL was 61%.  Greensville suffered a 7 point SOL penalty for poverty.
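The arithmetic behind those bonus/penalty figures can be sketched in a few lines of Python, using the Falls Church and Greensville numbers from the text.  The division SOL rate is approximated here as the enrollment-weighted average of the two group rates:

```python
def sol_rate(ed_share, ed_rate, not_ed_rate):
    """Division SOL pass rate: enrollment-weighted average of the two groups."""
    return ed_share * ed_rate + (1 - ed_share) * not_ed_rate

def boost(ed_share, ed_rate, not_ed_rate):
    """SOL 'bonus' (positive) or 'penalty' (negative) vs. the simple
    average of the ED and Not ED rates."""
    fair = (ed_rate + not_ed_rate) / 2   # unweighted average of the group rates
    return sol_rate(ed_share, ed_rate, not_ed_rate) - fair

# Falls Church: 9% ED, Not ED 95%, ED 67%
print(round(boost(0.09, 67, 95)))   # → 11 (the eleven point affluence bonus)

# Greensville: 93% ED, Not ED 76%, ED 59%
print(round(boost(0.93, 59, 76)))   # → -7 (the seven point poverty penalty)
```

The sign tracks the ED share: below the crossover, the weighted average leans toward the (higher) Not ED rate; above it, toward the (lower) ED rate.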

For a more complete view of the situation, here is a graph of division pass rates, Not ED, ED, and SOL, all plotted v. the ED percentage.


The Falls Church points are enlarged and filled with purple; Greensville, maroon; Richmond, gold.

Let’s simplify the picture by looking just at the least squares fitted lines:


The statistics of the fitted lines tell three different stories:

  • ED: The fitted line shows a 0.43% decrease in the ED pass rate for a 10% increase in the ED population, but the correlation is minuscule.
  • Not ED: The slope is minus 1.35% for a 10% increase in ED, and there is something of a correlation.  It looks like an increasing ED population is mildly related to a decrease in the Not ED rate, but much less so to the ED rate (this kind of data can’t show whether the ED population increase causes part of the Not ED pass rate decrease).
  • SOL: The SOL pass rate decreases by 2.77% for a 10% increase in the ED population.  The 35% R-squared is a pretty good correlation, and the reason is obvious: Divisions with larger ED populations have more ED (i.e., lower) pass rates included in the average.
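For readers who want to reproduce this kind of fit, here is a minimal ordinary least squares sketch.  The data points below are hypothetical, not the actual division numbers; only the method is the same:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit: returns (slope, intercept, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    r_squared = sxy ** 2 / (sxx * syy)   # fraction of variance "explained"
    return slope, intercept, r_squared

# Hypothetical (%ED, SOL pass rate) points, for illustration only
ed_pct = [9, 20, 35, 51, 70, 93]
sol    = [92, 88, 83, 79, 72, 61]

slope, intercept, r2 = fit_line(ed_pct, sol)
print(f"change per 10% ED increase: {slope * 10:.2f} points")
print(f"R-squared: {r2:.0%}")
```

Multiplying the per-1% slope by ten gives the “per 10% increase” figures quoted above.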

The math data tell much the same story.


Here the average of the Falls Church ED/Not ED pass rates is 74% but the small ED population results in an SOL pass rate 13 points higher.  The Greensville SOL reflects a 3 point penalty vs. the ED/Not ED average, a penalty that small only because the ED and Not ED rates are just seven points apart.

Notice that the Falls Church ED rate again is far below the Not ED, here by 33%, which is double the state average difference.  Either those Falls Church schools are coasting with unusually bright Not ED students, or struggling with unusually low-performing ED students, or doing a poor teaching job with the ED group, or some combination of such factors.


The slopes again show how the SOL average penalizes divisions with large ED populations.  As well, the R-squared for the Not ED group suggests, even more strongly than in the reading data, that increasing the ED population is associated with an effect on the performance of the Not ED group.

We cannot infer from the data why the Board of “Education” would embrace this system that punishes poverty.  We can notice, however, that the system is unfair on its face. 

Indeed, the Board had a fair measure of learning, the SGP, that was independent of poverty.  But the Board abandoned that system on the flimsy excuse that it could not calculate the results until Summer School results were in.  In fact, they knew that when they started calculating the SGP.  As well, they were (and are) perfectly capable of calculating the SGPs in May for the students (and teachers) not involved in Summer School.  Waiting until August merely gives them a chance to camouflage some of the poor performances during the regular school year.

Your tax dollars at “work.”

Here, for the record, is a list of the divisions that received more than a 2% reading SOL boost from the ED/Not ED average in 2018:


And here are those that enjoyed a penalty of 2% or more:


Ah, well, we can make one inference:  If you’re that Board and you’re going to make an enemy, better the Superintendent in Greensville or Colonial Beach than the one in Falls Church or Loudoun.

Economic Disadvantage and Schools’ Reading Performance

In terms of the state and division average SOL pass rates, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”).

For example, here are the 2018 state averages on the reading tests.

This makes the SOL average an unfair tool for measuring academic progress because of the invisible disadvantage the averages place on the schools with larger populations of ED students.

To start a dive underneath the overall averages, here are the 2018 school average reading pass rates for ED and Not ED students, plotted vs. the percentage of ED students in the school.

The blue diamonds are the school averages for the students who are Not ED.  The fitted line comports with the visual picture:  Pass rates of the Not ED students decline as the percentage of ED students increases.

The 24% R-squared value tells us that %ED predicts 24% of the variance in the pass rates.  Of course, that does not say that the population of ED students causes the score change, just that the two variables are related, if only loosely.

The orange triangles are the average pass rates of the ED students in those schools.  As we would expect, those numbers mostly are lower than the Not ED rates.  The fitted line shows a more modest negative slope and the R-squared value, 9%, tells us that the correlation between the ED pass rate and the ED percentage in the school is much less robust than the Not ED correlation.

These data comport with the earlier conclusion (again without telling us the underlying factors or mechanism): At the school level, averages for ED students are generally below the averages for Not ED students.

The data also suggest something else: Pass rates of the Not ED students correlate negatively, to a modest degree, with the percentage of ED students in the school.  The pass rates of the ED students, not so much.

This gets even more interesting if we overlay the data for the schools in a particular division.  Let’s start with the Big Guy, Fairfax.

The green dots are the Fairfax Not ED school averages; yellow, the ED.

The green fitted line for the Not ED students lies nearly atop the all schools line with nearly the same R-squared value.

However, in terms of the school averages the Fairfax ED students underperform the state average; as well, the slope is more negative (-2.7% for a 10% increase in ED population vs. -1.8% for the Not ED students).  Moreover, the R-squared values for the two groups are nearly equal and are large enough to suggest a modest correlation.

Why the FFax ED students underperform and why that underperformance increases with the ED population are questions the FFax school board really should address.  For sure, they have a problem there.

Well, that was fun, but distant.  How about Richmond?

Richmond has three schools with low ED populations; the Not ED students in those schools have OK pass rates but the ED students are a mixed bag.  For the most part, both the ED and Not ED groups perform poorly in the more numerous, high-ED schools, which pulls the fitted lines down.

Indeed, a 10% increase in the ED population is associated with a -5.5% change in the Not ED pass rate and -2.9% in the ED rate.  As well, the R-squared for the Not ED students is reaching toward a robust correlation.  Said in English: On average, Richmond schools with more ED students have lower pass rates, while the pass rates for the Not ED students tend to be lowered more than those for the ED students.

The lowest %ED school, Munford (16%), has a 92% Not ED pass rate and a better than average 77% ED rate.  Richmond Alternative, at 21% ED, has a respectable 87% Not ED rate (especially “respectable” given that it is Richmond’s academic boot camp) but an awful 36% rate for its ED students.  Fox, at 22% ED, has a fine Not ED pass rate, 95%, but a subpar 63% ED rate.

The yellow point at 48% ED, 100% pass rate, is Community High, a select school showing select school results. That yellow point sits atop a 100% green point.

The other Richmond schools whose Not ED students averaged >90% are Franklin (95%) and Cary and TJ (92% each).

The point at 89% ED shows a 29.6% ED pass rate, third worst in the state for ED students; it is the worst of our awful middle schools (and second worst overall in the state), MLK.

The four yellow points at 100% ED illustrate a minor anomaly in these data: The VDOE suppression rules blocked the head counts for the Not ED students at Greene, Fairfield Court, Southampton, and Woodville, so (1) there are no corresponding Not ED points, thus (2) those four ED points are a few percent farther to the right than they should be.  Count that as a bonus.  If those points were in the right places, the fitted line would be even steeper.

These data say, quite clearly, that Richmond has a problem, especially in its (many) schools with large ED populations.  (The estimable Jim Bacon would suggest that problem, at least in part, is student behavior.)

Richmond will continue to look bad at least until it figures out what is wrong here.  On the SOL averages, they look even worse than their (awful) performance would merit because of the large ED populations.  And, to the point of real life and not sterile numbers, Richmond’s schools are failing, miserably, in their purpose of delivering an education “to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.”  That failure is most stark in the case of the students who are already disadvantaged in economic terms.

For the record, here is the Richmond list.  The #DIV/0! and #N/A entries reflect suppressed data.

There are more insights to be had from these data.  Let’s start with the peer cities.

In Hampton, notice the relatively narrow range of ED percentages, the lower than average pass rates, and the steep fitted lines with non-trivial R-squared values.

Newport News data tell the same story but with much steeper slopes and stronger correlations.

Also Norfolk.

Whew!  That looks like a magnified version of Richmond’s ED issues.

Turning to the ‘burbs, these data rat out Hanover, which performs at the state average for its Not ED students but not so well with ED students, even at the lower ED populations.  Hanover gets good numbers on the statewide list of average pass rates, however, because of its low ED percentages.

Then we have Chesterfield, performing at average for both groups.

And Henrico, with notable correlations and major underperformance by both groups in the higher %ED schools.

Finally, Lynchburg, named for a relative of my paternal grandmother and, to the point here, a place where I have a reader.

Notice the milder correlations here.  Also the outstanding Not ED (95%) and not so outstanding ED pass rate (59%) at the high-ED school (Dearington Elementary).  Also the lowest ED pass rate, 47%, contrasting with an 83% Not ED rate (at Linkhorn Middle).

Bottom line: OK Not ED pass rates in L’Burg; not so good ED.

Next up: Math.

Grade 3 Reading (and not)

The RT-D this morning has a piece on declining third grade reading SOL pass rates and the unpleasant implications of that for our students.

The VDOE database has numbers on the subject.  Here, to start, are the Grade 3 reading pass rate changes this year for Richmond, the state, and the individual Richmond elementary schools.


The 5.36% drop in Richmond certainly has been inflated by institutional cheating, and the end of at least some of it. 

Last year, Carver contributed a (pretty good) 79.75% pass rate to the 74.6% state average; this year, there is no Carver score because of the cheating.  It looks like Fairfield (and perhaps some other schools) got the Word and resumed honest testing this year, resulting in the huge drop at Fairfield (and, probably, some of the smaller drops elsewhere).

We’ll have to wait another year to start to sort that out.

In the meantime, let’s look further into the historical record.  We’ll start with this year’s big gainer (Swansboro) and loser (Fairfield) along with the Richmond and state averages.


We must hope that the improbable increase at Swansboro reflects a genuine improvement.

Next, Woodville (the fourth worst school in Virginia as measured by the 5-subject average) and Munford (Richmond’s best elementary school, #203 from the top school statewide on the 5-subject average).


Woodville has a long way to go but the last three years set a hopeful pattern.

Next, Westover Hills, our neighborhood school, and Patrick Henry, a nearby neighbor.


Finally, our new Superintendent’s neighborhood school, Holton, and a nearby neighbor to it, Ginter Park.


If you’re interested in the history of some other school, email me, john {at} calaf [dot] org, and I’ll ship you the spreadsheet.

Aside:  Notice that the state average this year is 2.6 points below the old benchmark for accreditation in English, 75%.  Seeing the difficulty of improving the schools’ performance (at least as to the older cities, they admit they don’t know how (Sept. 21, 2016 video starting at 1:48)), the Board of “Education” has changed the rules to make it much easier for a school to be accredited.  They can’t improve learning, so they have turned to fudging the numbers.

Pell and The Gray Lady

The estimable Jim Bacon points to a piece in the Gotham City Times regarding college funding and Pell grants.

Jim points out that, according to the Times, UVa and Tech are 2d and 3d from the bottom of the Times’ top twenty “top public universities” in terms of the percentage of students receiving Pell grants.  As to some of those universities (Tech not so much; UVa not at all), the Times points to recent decreases in Pell percentages.

In fact, the Times has cherry picked the Pell data without showing any relationship to state-level college funding.  As to recent decreases in Pell numbers, the facts in context suggest otherwise.

For background, here are the 2016 median SAT verbal scores vs. Pell percentages of the Virginia 4-year public programs.


UVa and Tech are nationally ranked because they admit smart kids.  Given that smarts correlate strongly but negatively with Pell percentage throughout the Virginia 4-year programs (we can argue about the reasons, but that’s not the issue here), it’s no surprise that those fine schools have low Pell percentages.

Those low numbers are not a problem unless one thinks that these schools should dilute their brands by admitting less qualified students.

As to the alleged trend in Pell percentages with decreasing state support, the Times looks only at 2016 and 2012, and fails to demonstrate any relationship with state funding. 

A more general view of the Pell numbers is more revealing.  To that end, here are the Pell percentages of the average and three selected Virginia 4-year programs, by year:



The jumps in 2010 and 2011 are statewide, and suggest that the selective schools were affected proportionately by the increases in Pell funding (see below).

(I’ve included Mary & Bill here because it belongs in any 4-year ranking that includes THE UNIVERSITY and Tech).

More to the point, it’s hard to see any large decreases there.  Indeed, in light of the Pell funding that has been decreasing in recent years, the surprise is the absence of large recent decreases in Virginia Pell percentages at these schools.


The more interesting question here is why the poorer (and statistically less smart) kids graduate at lower rates, especially from the less selective schools. 


I’ll bet you a #2 lead pencil that it has more to do with the quality of K-12 education, esp. in our cities, and the support – both financial and academic – those students receive than with state appropriations.

2016 Richmond Teacher Pay

Table 19 is up in the Superintendent’s Annual Report with the 2016 salary averages.

The statewide distribution graphs, below, show the count of divisions paying each salary, in $1,000 increments.  The Richmond average salary is marked in yellow on each graph; the state average is blue.

For elementary principals, Richmond’s $90,531 average was 0.40 standard deviation above the division average of $84,581. 


(To read the graph, look across the bottom for average salary, rounded to the nearest $1,000, and up and down for the number of divisions.  Thus, one division paid $44,000.  Six divisions, one of which was Richmond, paid $91,000.  Four divisions paid the state average, $85,000.)
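Those “standard deviations above/below” figures are ordinary z-scores.  A minimal sketch; the standard deviation is not reported in the text, so the value below is back-figured from the reported 0.40 and is illustrative only:

```python
def z_score(value, mean, std_dev):
    """How many standard deviations a value lies above (+) or below (-) the mean."""
    return (value - mean) / std_dev

# Richmond elementary principals: $90,531 vs. the $84,581 average of
# division averages.  The ~$14,875 standard deviation is implied by the
# reported 0.40 figure, not taken from the VDOE data.
print(round(z_score(90531, 84581, 14875), 2))  # → 0.4
```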

For secondary principals, Richmond’s $91,266 average was 0.10 standard deviation below the division average of $93,129.


For Elementary Assistant Principals, Richmond’s $69,786 average was 0.17 standard deviation above the division average of $67,813.


For secondary Assistant Principals, Richmond’s $71,342 average was 0.20 standard deviation below the division average of $73,734.


For elementary teachers, Richmond’s $49,100 average was 0.19 standard deviation above the division average of $47,816.


For secondary teachers, Richmond’s $51,201 average was 0.08 standard deviation above the division average of $50,563.


Looks like we’re underpaying the leaders in our secondary schools.

Some details from the VDOE spreadsheet:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.

Jointly-operated school divisions (Fairfax City and Fairfax County; Emporia and Greensville County; and Williamsburg and James City County) report positions and average salaries on the annual school report of the fiscal agent division only. Fairfax County, Greensville County and Williamsburg are the fiscal agent divisions.

And a further note: The “division averages” reported above are the averages of the division averages in the VDOE spreadsheet.  VDOE reports the statewide averages; those generally are larger than the division averages, doubtless propelled by the large and very expensive NoVa divisions.
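The difference between the two kinds of average shows up in a small sketch with made-up divisions: the average of division averages counts every division once, while the statewide average weights each division by its head count, so a large, expensive division pulls it up:

```python
# Hypothetical divisions: (number of positions, average salary); illustration only
divisions = [(12000, 65000),   # a large, expensive NoVa-style division
             (400, 45000),
             (600, 47000)]

# Average of the division averages: every division counts once
avg_of_avgs = sum(salary for _, salary in divisions) / len(divisions)

# Statewide average: each division weighted by its head count
statewide = (sum(n * salary for n, salary in divisions)
             / sum(n for n, _ in divisions))

print(round(avg_of_avgs))  # → 52333
print(round(statewide))    # → 63554 (the big division dominates)
```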

Final(?) Accreditation Results

Just a week ago, VDOE announced that the accreditation data for Bellevue, Franklin, and Patrick Henry had been recalculated and that the three schools were fully accredited. 

So we got to redo the Richmond report.  Again.

The latest data are here.  The totals disagree slightly with the table on the VDOE Web site; I’ll bet you a #2 lead pencil that they updated the spreadsheet but not the table.

In any case, here are the data.  On the revised count, Richmond has 2.5% of the schools in Virginia but 17% of the 93 schools rated “Accreditation Denied” and 8.4% of the 333 schools that were not fully accredited.