Why are our schools better (in terms of SOL pass rates) at teaching History & Social Science than the other four subjects to economically disadvantaged students?

(Notice especially the differences.)

In terms of the state and division average SOL pass rates, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”).

For example, here are the 2018 state averages on the reading tests.

This makes the SOL average an unfair tool for measuring academic progress because of the invisible disadvantage the averages place on the schools with larger populations of ED students.

To start a dive underneath the overall averages, here are the 2018 school average reading pass rates for ED and Not ED students, plotted vs. the percentage of ED students in the school.

The blue diamonds are the school averages for the students who are Not ED. The fitted line comports with the visual picture: Pass rates of the Not ED students decline as the percentage of ED students increases.

The 24% R-squared value tells us that %ED predicts 24% of the variance in the pass rates. Of course, that does not say that the population of ED students causes the score change, just that the two variables are related, albeit at some distance.
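For readers who want to check the arithmetic: for a one-variable least-squares fit, R-squared is just the square of the Pearson correlation between the two variables. A minimal sketch with invented numbers (not the actual VDOE data):

```python
# R-squared for a simple least-squares fit equals the square of the
# Pearson correlation between the two variables.
def pearson_r(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented school data: percent ED in the school vs. Not ED pass rate.
pct_ed = [10, 20, 30, 40, 50, 60, 70, 80]
not_ed_pass = [95, 92, 94, 88, 90, 85, 86, 82]

r = pearson_r(pct_ed, not_ed_pass)
r_squared = r * r  # fraction of pass-rate variance associated with %ED
```

With real data of this shape, r comes out negative (pass rates fall as %ED rises) and r_squared gives the "percent of variance predicted" figure quoted above.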

The orange triangles are the average pass rates of the ED students in those schools. As we would expect, those numbers mostly are lower than the Not ED rates. The fitted line shows a more modest negative slope and the R-squared value, 9%, tells us that the correlation between the ED pass rate and the ED percentage in the school is much less robust than the Not ED correlation.

These data comport with the earlier conclusion (again without telling us the underlying factors or mechanism): At the school level, averages for ED students are generally below the averages for Not ED students.

The data also suggest something else: Pass rates of the Not ED students correlate negatively, to a modest degree, with the percentage of ED students in the school. For the ED students, the correlation is much weaker.

This gets even more interesting if we overlay the data for the schools in a particular division. Let’s start with the Big Guy, Fairfax.

The green dots are the Fairfax Not ED school averages; yellow, the ED.

The green fitted line for the Not ED students lies nearly atop the all-schools line, with nearly the same R-squared value.

However, in terms of the school averages the Fairfax ED students underperform the state average; as well, the slope is more negative (-2.7% for a 10% increase in ED population vs. -1.8% for the Not ED students). Moreover, the R-squared values for the two groups are nearly equal and are large enough to suggest a modest correlation.

Why the FFax ED students underperform and why that underperformance increases with the ED population are questions the FFax school board really should address. For sure, they have a problem there.

Well, that was fun, but distant. How about Richmond?

Richmond has three schools with low ED populations; the Not ED students in those schools have OK pass rates, but the ED students are a mixed bag. For the most part, both the ED and Not ED groups perform poorly in the more numerous, high-ED schools, which pulls the fitted lines down.

Indeed, a 10% increase in the ED population is associated with a -5.5% change in the Not ED pass rate and a -2.9% change in the ED rate. As well, the R-squared for the Not ED students approaches a robust correlation. Said in English: On average, Richmond schools with more ED students have lower pass rates, and the Not ED pass rates fall faster than the ED rates.
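The "per 10% of ED population" figures are simply the fitted slope scaled up by ten. A sketch with invented pass rates (not the actual Richmond data):

```python
# Ordinary least-squares slope: change in pass rate per 1 point of %ED.
def ols_slope(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Invented data shaped roughly like the Richmond Not ED points.
pct_ed = [20, 40, 60, 80]
not_ed_rate = [90, 80, 68, 57]

per_point = ols_slope(pct_ed, not_ed_rate)
per_ten_points = 10 * per_point  # the "per 10% of ED population" figure
```

Here the invented data happen to give about -5.55 per 10 points, close to the Richmond Not ED figure quoted above.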

The lowest %ED school, Munford (16%), has a 92% Not ED pass rate and a better than average 77% ED rate. Richmond Alternative, at 21% ED, has a respectable 87% Not ED rate (especially “respectable” given that it is Richmond’s academic boot camp) but an awful 36% rate for its ED students. Fox, at 22% ED, has a fine Not ED pass rate, 95%, but a subpar 63% ED rate.

The yellow point at 48% ED, 100% pass rate, is Community High, a select school showing select school results. That yellow point sits atop a 100% green point.

The other Richmond schools whose Not ED students averaged >90% are Franklin (95%) and Cary and TJ (92% each).

The point at 89% ED shows a 29.6% ED pass rate, third worst in the state for ED students; that school, MLK, is the worst of our awful middle schools (and second worst overall in the state).

The four yellow points at 100% ED illustrate a minor anomaly in these data: The VDOE suppression rules blocked the head counts for the Not ED students at Greene, Fairfield Court, Southampton, and Woodville, so (1) there are no corresponding Not ED points, thus (2) those four ED points are a few % farther to the right than they should be. Count that as a bonus. If those points were in the right places, the fitted line would be even steeper.
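The mechanics of that anomaly are easy to sketch. With invented head counts (the real ones are suppressed), removing the Not ED count from the denominator pushes the school to 100% ED:

```python
# Invented head counts illustrating the suppression anomaly: when VDOE
# suppresses the Not ED count, the only available denominator is the ED
# count itself, so the school plots at 100% ED.
def pct_ed(ed_count, not_ed_count):
    total = ed_count + (not_ed_count if not_ed_count is not None else 0)
    return 100 * ed_count / total

true_pct = pct_ed(180, 20)       # with the real head count: 90% ED
plotted_pct = pct_ed(180, None)  # with the count suppressed: 100% ED
```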

These data say, quite clearly, that Richmond has a problem, especially in its (many) schools with large ED populations. (The estimable Jim Bacon would suggest that problem, at least in part, is student behavior.)

Richmond will continue to look bad at least until it figures out what is wrong here. On the SOL averages, they look even worse than their (awful) performance would merit because of the large ED populations. And, to the point of real life and not sterile numbers, Richmond’s schools are failing, miserably, in their purpose of delivering an education “to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.” That failure is most stark in the case of the students who are already disadvantaged in economic terms.

For the record, here is the Richmond list. The #DIV/0! and #N/A entries reflect suppressed data.

There are more insights to be had from these data. Let’s start with the peer cities.

In Hampton, notice the relatively narrow range of ED percentages, the lower than average pass rates, and the steep fitted lines with non-trivial R-squared values.

Newport News data tell the same story but with much steeper slopes and stronger correlations.

Also Norfolk.

Whew! That looks like a magnified version of Richmond’s ED issues.

Turning to the ‘burbs, these data rat out Hanover, which performs at the state average for its Not ED students but not so well with ED students, even at the lower ED populations. Hanover gets good numbers on the statewide list of average pass rates, however, because of its low ED percentages.

Then we have Chesterfield, performing at average for both groups.

And Henrico, with notable correlations and major underperformance by both groups in the higher %ED schools.

Finally, Lynchburg, named for a relative of my paternal grandmother and, to the point here, a place where I have a reader.

Notice the milder correlations here. Also the outstanding Not ED (95%) and not so outstanding ED pass rate (59%) at the high-ED school (Dearington Elementary). Also the lowest ED pass rate, 47%, contrasting with an 83% Not ED rate (at Linkhorn Middle).

Bottom line: OK Not ED pass rates in L’Burg; not so good ED.

Next up: Math.

The RT-D this morning has a piece on declining third grade reading SOL pass rates and the unpleasant implications of that for our students.

The VDOE database has numbers on the subject. Here, to start, are the Grade 3 reading pass rate changes this year for Richmond, the state, and the individual Richmond elementary schools.

The 5.36% drop in Richmond certainly has been inflated by institutional cheating, or, more precisely, by the end of at least some of it.

Last year, Carver contributed a (pretty good) 79.75 pass rate to the 74.6 state average; this year, there is no Carver score because of the cheating. It looks like Fairfield (and perhaps some other schools) got the Word and resumed honest testing this year, resulting in the huge drop at Fairfield (and, probably, some of the smaller drops elsewhere).

We’ll have to wait another year to start to sort that out.

In the meantime, let’s look further into the historical record. We’ll start with this year’s big gainer (Swansboro) and loser (Fairfield) along with the Richmond and state averages.

We must hope that the improbable increase at Swansboro reflects a genuine improvement.

Next, Woodville (the fourth worst school in Virginia as measured by the 5-subject average) and Munford (Richmond’s best elementary school, #203 from the top school statewide on the 5-subject average).

Woodville has a long way to go but the last three years set a hopeful pattern.

Next, Westover Hills, our neighborhood school, and Patrick Henry, a nearby neighbor.

Finally, our new Superintendent’s neighborhood school, Holton, and a nearby neighbor to it, Ginter Park.

If you’re interested in the history of some other school, email me, john {at} calaf [dot] org, and I’ll ship you the spreadsheet.

__Aside:__ Notice that the __state__ average this year is 2.6 points below the old benchmark for accreditation in English, 75%. Seeing the difficulty in improving the schools’ performance (at least as to the older cities, they admit they don’t know how; see the Sept. 21, 2016 video starting at 1:48), the Board of “Education” has changed the rules to make it much easier for a school to be accredited. They can’t improve learning, so they have turned to fudging the numbers.

Wallethub has a ranking of US cities, based on their calculation of how well city officials manage and spend public funds by comparing the quality of services residents receive against the city’s total budget.

Richmond is No. 125 of 150.

Va. Beach is No. 17; Chesapeake is 38; Norfolk is 109.

DC is last.

The estimable Jim Bacon points to a piece in the *Gotham City Times* regarding college funding and Pell grants.

Jim points out that, according to the *Times*, UVa and Tech are 2d and 3d from the bottom of the *Times*’ top twenty “top public universities” in terms of percentage of Pell grants. As to some of those universities (Tech not so much; UVa not at all), the *Times* points to recent decreases in Pell percentages.

In fact, the *Times* has cherry picked the Pell data without showing any relationship to state-level college funding. As to recent decreases in Pell numbers, the facts in context suggest otherwise.

For background, here are the 2016 median SAT verbal scores vs. Pell percentages of the Virginia 4-year public programs.

UVa and Tech are nationally ranked because they admit smart kids. Given that smarts correlate strongly but __negatively__ with Pell percentage throughout the Virginia 4-year programs (we can argue about the reasons, but that’s not the issue here), it’s no surprise that those fine schools have low Pell percentages.

Those low numbers are not a problem unless one thinks that these schools should dilute their brands by admitting less qualified students.

As to the alleged trend in Pell percentages with decreasing state support, the *Times* looks only at 2016 and 2012, and fails to demonstrate any relationship with state funding.

A more general view of the Pell numbers is more revealing. To that end, here are the Pell percentages of the average and three selected Virginia 4-year programs, by year:

The jumps in 2010 and 2011 are statewide, and suggest that the selective schools were affected proportionately by the increases in Pell funding (see below).

(I’ve included Mary & Bill here because it belongs in any 4-year ranking that includes THE UNIVERSITY and Tech).

More to the point, it’s hard to see any large decreases there. Indeed, in light of the Pell funding that has been decreasing in recent years, the surprise is the __absence__ of large recent decreases in Virginia Pell percentages at these schools.

The more interesting question here is why the poorer (and statistically less smart) kids graduate at lower rates, especially from the less selective schools.

I’ll bet you a #2 lead pencil that it has more to do with the quality of K-12 education, esp. in our cities, and the support – both financial and academic – those students receive than with state appropriations.

Table 19 is up in the Superintendent’s Annual Report with the 2016 salary averages.

The statewide distribution graphs, below, show the count of divisions paying each salary, in $1,000 increments. The Richmond average salary is marked in yellow on each graph; the state average is blue.

For elementary principals, Richmond’s $90,531 average was 0.40 standard deviation above the division average of $84,581.

(To read the graph, look across the bottom for average salary, rounded to the nearest $1,000, and up and down for the number of divisions. Thus, one division paid $44,000. Six divisions, one of which was Richmond, paid $91,000. Four divisions paid the state average, $85,000.)

For secondary principals, Richmond’s $91,266 average was 0.10 standard deviation below the division average of $93,129.

For elementary assistant principals, Richmond’s $69,786 average was 0.17 standard deviation above the division average of $67,813.

For secondary assistant principals, Richmond’s $71,342 average was 0.20 standard deviation below the division average of $73,734.

For elementary teachers, Richmond’s $49,100 average was 0.19 standard deviation above the division average of $47,816.

For secondary teachers, Richmond’s $51,201 average was 0.08 standard deviation above the division average of $50,563.
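For readers who want to check the arithmetic: the "standard deviations above/below" figures are ordinary z-scores. A sketch using the elementary-principal numbers above (the standard deviation here is backed out from the quoted 0.40 figure, not taken from the VDOE spreadsheet):

```python
# A z-score measures how far a value sits from the mean, in units of
# standard deviation: (value - mean) / standard_deviation.
def z_score(value, mean, std_dev):
    return (value - mean) / std_dev

# Elementary principals: Richmond $90,531 vs. a division average of $84,581.
# The standard deviation is an assumption implied by the quoted 0.40 gap.
richmond_avg = 90_531
division_avg = 84_581
assumed_sd = 14_875

z = z_score(richmond_avg, division_avg, assumed_sd)  # 0.40
```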

Looks like we’re underpaying the leaders in our secondary schools.

Some details from the VDOE spreadsheet:

The average annual salaries for elementary and secondary teachers include supplemental salaries and wages (expenditure object 1620) as reported on the Annual School Report.

Teaching positions include classroom teachers, guidance counselors, librarians and technology instructors.

Jointly-operated school divisions (Fairfax City and Fairfax County; Emporia and Greensville County; and Williamsburg and James City County) report positions and average salaries on the annual school report of the fiscal agent division only. Fairfax County, Greensville County and Williamsburg are the fiscal agent divisions.

And a further note: The “division averages” reported above are the averages of the division averages in the VDOE spreadsheet. VDOE reports the statewide averages; those generally are larger than the division averages, doubtless propelled by the large and very expensive NoVa divisions.
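A toy example with invented numbers shows why the head-count-weighted statewide figure runs above the simple average of division averages when a few divisions are both very large and very expensive:

```python
# Invented numbers: one huge, high-paying division and two small ones.
divisions = [
    # (teacher_count, average_salary)
    (10_000, 60_000),  # a large, expensive NoVa-style division
    (500, 45_000),
    (500, 45_000),
]

# Simple average of the division averages (the figure used above).
avg_of_averages = sum(salary for _, salary in divisions) / len(divisions)

# Statewide average, weighted by head count (the figure VDOE reports).
statewide_avg = (sum(count * salary for count, salary in divisions)
                 / sum(count for count, _ in divisions))
# The big division pulls the statewide figure well above the simple average.
```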

Just a week ago, VDOE announced that the accreditation data for Bellevue, Franklin, and Patrick Henry had been recalculated and that the three schools were fully accredited.

So we got to redo the Richmond report. Again.

The latest data are here. The totals disagree slightly with the table on the VDOE Web site; I’ll bet you a #2 lead pencil that they updated the spreadsheet but not the table.

In any case, here are the data. On the revised count, Richmond has 2.5% of the schools in Virginia but 17% of the 93 schools rated “Accreditation Denied” and 8.4% of the 333 schools that were not fully accredited.
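A quick back-of-the-envelope check on those shares; the Richmond counts below are inferred from the quoted percentages, so treat them as approximate:

```python
# Statewide counts from the text; Richmond's counts are backed out of the
# quoted percentage shares.
denied_statewide = 93
not_fully_accredited_statewide = 333

richmond_denied = round(0.17 * denied_statewide)                  # ~16 schools
richmond_not_full = round(0.084 * not_fully_accredited_statewide) # ~28 schools
```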

Here are the End of Course pass rates by year for the Richmond high schools.

For reference, the accreditation levels are 75% for English, 70% otherwise; the mystery “adjustments” will raise most pass rates by a few points.

We might wonder how the large drops in pass rates this year in everything but reading can be reconciled with the graduation rate that nearly held constant at 70%.

We’ve been looking at the 6-year graduation rates of the 2011 freshman cohort in Virginia’s four-year colleges. SCHEV also provides SAT data for those students.

Here are the median SAT scores for the 2011 entering class, sorted by the schools’ graduation rates.

The graduation rates correlate strongly with those SAT scores.

Recall, please, that correlation does not imply causation.

————————————————————————————

BTW: This year’s medians at Longwood are 490 in both math and reading; the state averages are 513/516. The current Richmond scores are in a different league.