Richmond ‘17 SOL by School

A reader (the reader?) rightfully jumped on me about those complicated elementary school graphs.

Here, as a penance, are the 2017 Richmond elementary pass rates by school and subject.

[graphs: 2017 elementary pass rates by school, one per subject]

While we are at it, here are the middle schools.  Remember that the Franklin numbers include high school pass rates and that Franklin has a select population.

[graphs: 2017 middle school pass rates by school, one per subject]

And the high schools.  Please recall that Open and Community, as well as Franklin, have select populations and should not be compared directly with the five mainstream high schools.

[graphs: 2017 high school pass rates by school, one per subject]

How’s that for readable?  Not to mention distressing.

High School Lows

Turning to the high school results from the 2017 SOL data, let’s start with reading.

[graph: high school reading pass rates]

Franklin has both high school and middle school grades, so its numbers are not directly comparable.  Moreover, Franklin, Community, and Open have select student populations, so the important results here are those of the mainstream high schools: Armstrong, Wythe, Huguenot, Marshall, and TJ.

Of the Five, only Wythe (barely) made the 75% benchmark for accreditation in English.  Huguenot’s pass rate fell 19% this year; Armstrong’s, 15%.

Next, writing:

[graph: high school writing pass rates]

Of the Five, only TJ beat the writing benchmark.  Huguenot dropped by 17%; Armstrong, by 14%.

History & Social Sciences:

[graph: high school history & social science pass rates]

The History & SS benchmark is 70%.  Marshall and Huguenot met that requirement.

Marshall improved by 13% this year.  Armstrong’s pass rate rose 7%, but only to 48%.

Math:

[graph: high school math pass rates]

Even Community had problems with the math tests, and that was a nine point improvement from 2016.  None of the Five made the 70% benchmark.  Armstrong dropped by 14% (to 34%!); Wythe fell by 11%; Marshall, by 9%.

Science:

[graph: high school science pass rates]

None of the Five made the 70% science benchmark.  Armstrong fell 21% to a 39% pass rate; Marshall dropped by 18%; Huguenot, by 10%.

Five Subject Average:

[graph: high school five-subject average pass rates]

The average of the five pass rates shows 11% declines at Armstrong and Huguenot, a 5% drop at TJ, and a 2% decrease at Wythe.  Marshall rose by 1%. 

The 39% average at Armstrong is a disaster.  Wythe, Huguenot, Marshall, and TJ all did much better than Armstrong, but surely not well enough.

After the middle school numbers, especially MLK, even Armstrong looks pretty good.  But that happy aura dissipates once we notice that the high school pass rates are boosted by Richmond’s dropout rate (Richmond’s cohort dropout rate was 9.9% last year, vs. the 5.3% state average).
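The arithmetic of that boost is worth spelling out.  A toy sketch (the numbers here are invented, not Richmond’s):

```python
# Invented numbers (not Richmond data): a cohort of 100 students,
# 40 of whom would pass the EOC tests.

def pass_rate(passers, testers):
    """Pass rate in percent."""
    return 100 * passers / testers

everyone_tests = pass_rate(40, 100)   # 40.0% if everyone sits the tests

# If 10 of the likely-failers drop out before testing, the same
# 40 passers are divided by only 90 testers.
after_dropouts = pass_rate(40, 90)    # about 44.4%

# No student learned any more, but the reported pass rate rose.
```

That is, a school that loses its weakest students before test day posts a higher pass rate without teaching anyone anything.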

Middle School Shambles

VDOE posted the 2017 SOL data yesterday.  They are bad news for Richmond, particularly for the middle schools.

Here, for a start, are the middle school reading pass rates.

[graph: middle school reading pass rates]

I’ve included Franklin, which has middle school grades, but the numbers there are not directly comparable because the high school grades are included in the Franklin averages.

Notice that Elkhardt and Thompson disappeared in 2016, being merged into the new Elkhardt-Thompson.  That didn’t do anything for the pass rates but it did create a “new school” that can’t be denied accreditation for another two years.

Franklin aside, none of these schools made the accreditation cutoff of 75%.  Henderson, Boushall, Elkhardt-Thompson, and, especially, MLK all are disastrously below 50% (i.e., above 50% failure rates).

The writing scores improved this year, from appalling to merely terrible.

[graph: middle school writing pass rates]

Again, Franklin aside, no school made the accreditation cutoff.

The history & social sciences numbers are better: Three schools, other than Franklin, made the 70% benchmark.

[graph: middle school history & social science pass rates]

The math scores are another disaster, with only AP Hill (barely) making the 70% accreditation cutoff.

[graph: middle school math pass rates]

Two schools (other than Franklin) beat the science benchmark.

[graph: middle school science pass rates]

The average of the five subject pass rates shows three schools below 50% with Boushall barely above that and headed the wrong way.

[graph: middle school five-subject average pass rates]

I said “bad news” at the top of this post.  That is far too weak.  I’m not sure there are words that are acceptable in polite company that describe the magnitude of this assault on Richmond’s schoolchildren.

Bad News at Westover Hills

On the four subject average (reading, history & social sciences, math, and science; they’ve discontinued the writing test for the elementary grades), our neighborhood school dropped fourteen points this year, to fourth lowest among Richmond’s elementary schools.

[graph: elementary four-subject average pass rates]

That decline is the composite of lower scores in all four subject areas:

[graphs: Westover Hills pass rates in reading, history & social science, math, and science]


Elementary, My Dear Bedden

Here we have the SOL pass rates for Richmond’s elementary schools for 2011 to 2017.

For sure, these graphs are cluttered.  The only way to paint a complete picture is to include all the schools on one graph, although the resulting information density makes some schools difficult to follow.  For a nice picture of the recent results at any particular school, see the VDOE page here and put the school name in the search box.

To start: the reading tests.  The schools in the legend are sorted by the 2017 pass rate.

[graph: elementary reading pass rates, 2011-2017]

Nice improvement this year at Redd and Mason; nicer still at Stuart (24%!).  Problems at Carver and Francis.

The new English tests in 2013 hit most of the schools quite hard.  Some have recovered; many have not.

Our star performer, Carver, has slid (Dizzy Dean would have said “slud”) to fourth place.

History and Social Sciences did not have a new test to lower the scores but too many of our schools found a way to slide anyhow.

[graph: elementary history & social science pass rates, 2011-2017]

Big gains here at Stuart (28%), Patrick Henry (25%), and Ginter Park (21%).  Chimborazo, Swansboro, Cary, and Westover Hills all went the other way.

Next, math.  The new tests came in 2012; a number of schools suffered a further hit in 2013, perhaps because of fallout from the new English and science tests that year.

[graph: elementary math pass rates, 2011-2017]

Stuart was the big gainer here (after a similar loss the year earlier).  Carver, Francis, Westover Hills, and Bellevue led the losers.

Next, science.

[graph: elementary science pass rates, 2011-2017]

2013 was the year of the new, tougher tests.

I didn’t have the heart to expand the axis to include Woodville’s 82% failure rate in 2017.  But that was from a mere 3% drop in the pass rate: Westover Hills fell 24% and Oak Grove 20%, with Francis, Ginter Park, and Southampton all down more than 10%.

Finally, the four subject average.

[graph: elementary four-subject average pass rates, 2011-2017]

Stuart was the big gainer here at +17%, followed by Blackwell at 10%.  Westover Hills dropped 14%; Carver, 12%; Swansboro, 11%; and Francis, 10%.

For the view from thirty thousand feet, here are the averages of the elementary school pass rates:

[graphs: averages of the elementary school pass rates, one per subject]

The accreditation benchmark is 75 for English and 70 for the other subjects.  So we see that the average of the Richmond elementary schools not only declines in every subject but reading; it also flunks every subject but Hist & SS.

But if you think this is bad (and it is), wait for the middle school numbers, up next.

Bedden Blew It

Dana Bedden started in Richmond in January 2014.

We can ascribe the awful performance of the Richmond schools that year, and probably the next, to the previous Superintendent who had failed to align the curricula to the new math tests in 2012 and the new English and science tests in 2013.

After that, Bedden gets the credit.  Or the not credit.

The 2017 school year was Bedden’s third full (and last) year in the saddle.

We got a preview of his performance from the “division-level academic review” conducted by VDOE in the spring of 2017.   For the most part, the resulting report is a bloated batch of bureaucratic babble.  Nonetheless, some truth shines through.

On a scale of zero to three, where zero and one are failing, Richmond received a “0” for student “Outcomes” (no surprise there), a “1” for “Curriculum Alignment,” and a “3” for “Support for Instructional Leadership.”

So, after two+ years of Bedden, the curricula still were not aligned?  And student “outcomes” were zilch?  But the “instructional leadership” was fine?  Please!

Today VDOE released the 2017 SOL results. 

The database with the full (and not rounded) data is not available as I write this.  The summary spreadsheets are available here.

Those spreadsheets give us a measure of Bedden’s performance:

Reading: Second or third lowest pass rate (not counting the school for deaf & blind), up from last place last year but down two points.

[graph: division reading pass rates]

Writing: Bottom of the barrel, down from 2d worst last year, despite an eight point gain.

[graph: division writing pass rates]

History & Social Sciences: We beat poor Petersburg again with a constant 67% pass rate.

[graph: division history & social science pass rates]

Math: Last year we beat Lancaster; this year, Petersburg, with a pass rate that dropped four points.

[graph: division math pass rates]

Science: Next to last again, beating only Petersburg but with a five point drop in our pass rate.

[graph: division science pass rates]

Five Subject Average: Beat only Petersburg, up from last place last year; our pass rate dropped slightly, theirs more.

[graph: division five-subject average pass rates]

Note: These are averages of the five subjects’ pass rates, not of all tests.
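The distinction in that note matters because the two averages weight things differently.  A toy illustration (the counts are invented, not actual SOL data):

```python
# Invented counts (not actual SOL data) showing why "average of the five
# subjects' pass rates" differs from "pass rate over all tests" when the
# subjects have different numbers of tests.

subjects = {
    # subject: (tests passed, tests taken)
    "reading": (700, 1000),
    "writing": (300, 500),
    "history": (350, 500),
    "math":    (600, 1000),
    "science": (250, 500),
}

# Each subject weighted equally, as in the note above.
rates = [100 * passed / taken for passed, taken in subjects.values()]
subject_avg = sum(rates) / len(rates)                # 62.0

# Each *test* weighted equally: a different number, because the
# subjects with more tests count for more.
pooled = (100 * sum(p for p, _ in subjects.values())
          / sum(t for _, t in subjects.values()))    # about 62.9
```

With equal test counts the two would match; with unequal counts, the subject average can sit above or below the all-tests rate.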

It appears that the School Board did well to get rid of Superintendent Bedden. 

Trouble is, now we go part (most?) (all?) of the 2018 school year with an interim Super and the new Super will need at least a couple of years to get any traction.  In the meantime, Richmond’s schoolchildren continue to suffer in our awful schools.

Of course, it could be worse: We could be Petersburg, laboring under Memoranda of Understanding from the Board of Education (that doesn’t know how to fix their schools) for thirteen years and still competing to be the worst school division in Virginia.

I’ll post more Richmond data as they become available.

How ‘Bout Those Elementary Schools

While I have the data (and while trying to not remember that 2017 results will be out in a week or so), here is the performance of the Richmond Elementary schools on the reading tests, by year.

[graph: elementary reading pass rates by year]

The numbers here are averages of the pass rates of the three tested grades (3, 4, & 5).  Schools are sorted by 2016 pass rates.

The Big Dip in 2013 coincides with the new, tougher reading tests.  As you see, some schools were barely touched by the new tests; some were clobbered and have recovered; some were clobbered and remain that way.

The threshold for accreditation is 75%; only seven schools (of twenty-six) made that cutoff in 2016.  Six schools were below 50% with Woodville firmly in last place at 33%.

Next, math:

[graph: elementary math pass rates by year]

The schools again are sorted by the 2016 numbers so most of the color codes are changed from the reading graph.  (Sigh!)

The new math tests came in 2012.  Note the 2d round reaction in ‘13 at some schools.

The accreditation threshold here is 70.  Thirteen schools made that cut in ‘16; thirteen did not.  Four were below 50%.  Swansboro beat out Woodville for last place, 33% and 39%, respectively.

Stay tuned for the 2017 numbers that should give the final measure of Superintendent Bedden’s success or failure.  (The ‘17 data will tell us nothing about the Board of “Education” that has been busy adopting a “Memorandum of Understanding” instead of doing something useful to fix Richmond’s awful schools.  But, then, even they have noticed that they don’t know how to fix urban schools so perhaps that Mt. Everest of sterile MOU paperwork will keep them from more harmful meddling.)

Pell Plus

We have seen that, among Virginia’s public, 4-year college programs, the graduation rate of Pell grantees correlates strongly with the overall graduation rate of the school.  We also have seen that (federal) Pell and (state) Commonwealth award holders on average graduate at lower rates than students who receive no financial aid.  As well, the data show that students receiving other forms of support graduate, on average, at higher rates than students with no aid.

SCHEV has some more data on this subject. 

Here we look at the 4-, 5-, and 6-year cohort graduation rates of the 2010-11 cohort of first time in college students at our Big Hitter universities.  These data count graduations anywhere, not just at the starting institutions.

The All Students rates look like this:

[graph: graduation rates, all students]

The rates for students who do not receive any financial support are similar but mostly lower.

[graph: graduation rates, students with no financial support]

The difference between the two rates varies considerably from school to school (and emphasizes that averages do not tell the whole story).

[graph: all-students rate minus no-support rate]
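The “difference” figures in this post are simple school-by-school subtractions.  A minimal sketch, with invented rates rather than the SCHEV numbers:

```python
# Invented rates (not SCHEV data): subtract the no-support graduation
# rate from the all-students rate, school by school.

all_students = {"UVa": 93.0, "Tech": 84.0, "VCU": 62.0}
no_support   = {"UVa": 91.0, "Tech": 85.0, "VCU": 58.0}

gaps = {school: all_students[school] - no_support[school]
        for school in all_students}

# A positive gap means the aided students pull the overall rate up;
# a negative gap means they pull it down.
```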

The Pell grantees graduate at still lower rates (UVa excepted).

[graph: graduation rates, Pell grantees]

Or, in terms of the Pell rate vs. the no-support rate:

[graph: Pell rate minus no-support rate]

Here we see Pell grantees underperforming the no-support group except in the fifth and sixth years at THE University.  It seems that UVa’s selection process works even better for Pell students than for the students who pay their own way.  VCU is another story.

The other group we have seen underperform on average is the Commonwealth Award grantees.

[graph: Commonwealth Award rate vs. no-support rate]

UVa and W&M report no grantees (or at least fewer than ten; see the SCHEV suppression rules).  Tech and Mason outperform here; VCU does not.

The athletic awards show a much different pattern.

[graph: athletic award rate vs. no-support rate]

Those are large differences.  At six years, VCU graduated 13.7% more of its athletes than of its no-support students.  Tech, 18.4% fewer.

BTW: At five and six years, W&M graduated 100% of the supported athletes.  Consistent with its high overall rate, UVa graduated 90.2% at five years, 92% at six.

[graph: graduation rates, athletic award recipients]

Next, here are the results for the other grant programs whose grantees, averaged over Virginia’s 4-year programs, outperformed the no-support group:

Perkins (Federal program for students enrolled in career & technical ed. programs):

[graph: Perkins rate vs. no-support rate]

PLUS loans:

[graph: PLUS loan rate vs. no-support rate]

Subsidized and unsubsidized Stafford loans:

[graphs: subsidized and unsubsidized Stafford loan rates vs. no-support rate]

Tuition waiver programs (mostly for employees and families; some for “older” citizens):

[graph: tuition waiver rate vs. no-support rate]

SCHEV-operated VGAP awards:

[graph: VGAP rate vs. no-support rate]

Federal work-study:

[graph: work-study rate vs. no-support rate]

Notice the relatively high rate of 4-year degrees (and the widespread overperformance) among the work-study students. 

Doubtless there are multiple factors driving these results.  We might expect those to include the granting process, the selectivity of the school, the pool from which a school draws, and probably other factors.

Nonetheless these data suggest some tentative conclusions:

  • UVa and W&M are skimming the cream from the population of students who receive financial support;
  • As the earlier data suggested, the SCHEV-run VGAP awards are much more effective, in most cases, than the school-run Commonwealth awards;
  • Some schools run counter to the average underperformance of the Pell and Commonwealth grantees (e.g., UVa on Pell; Tech and Mason on Commonwealth); and
  • VCU’s relatively high graduation rate of athletes might suggest either careful selection and nurturing or corrupt grading practices.  It would be good to know which.

Diploma Inflation

The standard diploma requires five “verified credits” (i.e., passing the course plus passing the End of Course (EOC) SOLs or approved substitute tests) in English, math, laboratory science, and history & social science, plus one further verified credit, presumably in one of those areas.   The advanced diploma requires three further verified credits.
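As a rough sketch of those credit totals (ignoring the subject-by-subject distribution, which the regulations spell out in more detail):

```python
# Verified-credit totals as described above:
# 5 subject-area credits + 1 further credit = 6 for a standard diploma;
# 3 further credits on top of that = 9 for an advanced diploma.
# (The real rules also specify how the credits distribute across
# subjects; this sketch counts totals only.)

STANDARD = 5 + 1
ADVANCED = STANDARD + 3

def diploma_tier(verified_credits):
    """Classify by total verified credits only (a simplification)."""
    if verified_credits >= ADVANCED:
        return "advanced"
    if verified_credits >= STANDARD:
        return "standard"
    return "neither"
```

The point for what follows: every verified credit requires passing an EOC SOL (or an approved substitute), so diploma rates and EOC pass rates ought to move together.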

We have seen that the 4-year cohort graduation rate statewide has risen in recent years at the same time that the EOC SOL pass rates have generally declined.

[graph: statewide 4-year cohort graduation rates and EOC pass rates]

The five-year graduation rates run a bit higher but show the same pattern (except, of course, that the data end a year earlier).

[graph: statewide 5-year cohort graduation rates and EOC pass rates]

Well, that’s the state.  Let’s look at some divisions.  All the graphs below show the 4-year cohort rates.

Let’s start with Fairfax where, for sure, all the students are above average.

[graph: Fairfax graduation rates and EOC pass rates]

Hmmm. Excellent numbers but, again, declining pass rates and increasing diploma rates.  That’s easier to see if we just look at the subject average pass rate and the total diploma rate.

[graph: Fairfax average pass rate and total diploma rate]

Loudoun looks much the same.

[graphs: Loudoun graduation rates and pass rates]

Then we have Richmond and peers.  (Notice that the standard diploma rate runs higher than the advanced rate, contra the NoVa jurisdictions and the state averages.)

[graphs: graduation rates and pass rates, Richmond and peer divisions]

Finally, because I sometimes have readers there, Charles City and Lynchburg.

[graphs: graduation rates and pass rates, Charles City and Lynchburg]

There are some interesting differences here, particularly the vast differences between the NoVa jurisdictions and the old cities and the lower rates of advanced diplomas in those cities.

The variability of the small numbers in Charles City makes conclusions there problematic, but otherwise the underlying theme is constant: decreasing pass rates with increasing graduation rates.

These data surely allow VBOE to brag on the increasing graduation rates.  Whether they can brag that the kids are learning more is a question these data do not answer.

And, for sure, these data confound the notion that decreasing pass rates should lead to decreasing graduation rates.