Not Learning in Richmond

The SOL pass rates on the VDOE Report Card page gave a preview of the Bad News for 2017. 

The (very nice) front end to the SOL database now reaches the 2017 data, so we have some context.

NOTE: Corrected from the earlier post.  There was a sorting error in the VDOE database.  Thanks to the always helpful Chuck Pyle for getting it fixed, quickly.

To start, here are the division average reading pass rates back to 2010.

[Chart: division average reading pass rates, 2010–2017]

The dip in 2013 coincided with the new, tougher English and science tests.

I’ve included data for the peer jurisdictions, Hampton, Newport News, and Norfolk, as well as for Charles City and Lynchburg (where I’ve sometimes been known to have a reader) and the state average. 

As a measure of relative performance, here are the same data, expressed as differences between the division pass rates and the state average.

[Chart: reading pass rates as differences from the state average]
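For anyone who wants to reproduce that arithmetic, it is just one subtraction per year. Here is a minimal sketch in Python; the rates are made-up placeholders, not the actual VDOE numbers:

```python
# Express a division's pass rates as differences from the state average.
# All numbers here are invented placeholders, not actual VDOE data.
state_avg = {2015: 79, 2016: 80, 2017: 80}   # state reading pass rates (%)
division  = {2015: 68, 2016: 67, 2017: 65}   # hypothetical division rates (%)

relative = {year: division[year] - state_avg[year] for year in division}
print(relative)  # {2015: -11, 2016: -13, 2017: -15}: points below the state
```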

Next, writing:

[Chart: division average writing pass rates]

[Chart: writing pass rates as differences from the state average]

Next, History and Social Science:

[Chart: division average history & social science pass rates]

[Chart: history & social science pass rates as differences from the state average]

Math:

[Chart: division average math pass rates]

[Chart: math pass rates as differences from the state average]

The drop in 2012 coincides with the deployment of new, tougher math tests that year.

Science:

[Chart: division average science pass rates]

[Chart: science pass rates as differences from the state average]

Finally, the averages of the five subject area pass rates:

[Chart: division averages of the five subject pass rates]

[Chart: five-subject averages as differences from the state average]

It ought to be unconstitutional to do this to Richmond’s schoolchildren.

Bedden Blew It

Dana Bedden started in Richmond in January 2014.

We can ascribe the awful performance of the Richmond schools that year, and probably the next, to the previous Superintendent, who had failed to align the curricula to the new math tests in 2012 and the new English and science tests in 2013.

After that, Bedden gets the credit.  Or the not credit.

The 2017 school year was Bedden’s third full (and last) year in the saddle.

We got a preview of his performance from the “division-level academic review” conducted by VDOE in the spring of 2017.   For the most part, the resulting report is a bloated batch of bureaucratic babble.  Nonetheless, some truth shines through.

On a scale of zero to three, where zero and one are failing, Richmond received a “0” for student “Outcomes” (no surprise there), a “1” for “Curriculum Alignment,” and a “3” for “Support for Instructional Leadership.”

So, after two+ years of Bedden, the curricula still were not aligned?  And student “outcomes” were zilch?  But the “instructional leadership” was fine?  Please!

Today VDOE released the 2017 SOL results. 

The database with the full (and not rounded) data is not available as I write this.  The summary spreadsheets are available here.

Those spreadsheets give us a measure of Bedden’s performance:

Reading: Second or third lowest pass rate (not counting the school for deaf & blind), up from last place last year but down two points.

[Chart: 2017 reading pass rates by division]

Writing: Bottom of the barrel, down from 2d worst last year, despite an eight-point gain.

[Chart: 2017 writing pass rates by division]

History & Social Sciences: We beat poor Petersburg again with a constant 67% pass rate.

[Chart: 2017 history & social science pass rates by division]

Math: Last year we beat Lancaster; this year, Petersburg, with a pass rate that dropped four points.

[Chart: 2017 math pass rates by division]

Science: Next to last again, beating only Petersburg, but with a five-point drop in our pass rate.

[Chart: 2017 science pass rates by division]

Five Subject Average: Beat only Petersburg, up from last place last year; our pass rate dropped slightly, theirs more.

[Chart: 2017 five-subject average pass rates by division]

Note: These are averages of the five subjects’ pass rates, not of all tests.
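The distinction matters because the five-subject average weights each subject equally, while an average over all tests would weight each subject by the number of tests it contributes. A small sketch, with invented counts, shows the two can differ:

```python
# Five-subject average vs. per-test average; counts below are invented.
subjects = {               # subject: (tests passed, tests taken)
    "reading": (6000, 10000),
    "writing": (2000, 4000),
    "history": (4000, 6000),
    "math":    (5500, 10000),
    "science": (3000, 5000),
}

# Average of the five subjects' pass rates (each subject weighted equally).
five_subject_avg = sum(p / t for p, t in subjects.values()) / len(subjects)
# Pass rate over all tests (subjects weighted by test count).
per_test_avg = sum(p for p, _ in subjects.values()) / sum(t for _, t in subjects.values())

print(f"{five_subject_avg:.1%} vs. {per_test_avg:.1%}")  # 58.3% vs. 58.6%
```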

It appears that the School Board did well to get rid of Superintendent Bedden. 

Trouble is, now we go through part (most? all?) of the 2018 school year with an interim Super, and the new Super will need at least a couple of years to get any traction.  In the meantime, Richmond’s schoolchildren continue to suffer in our awful schools.

Of course, it could be worse: We could be Petersburg, laboring under Memoranda of Understanding from the Board of Education (that doesn’t know how to fix their schools) for thirteen years and still competing to be the worst school division in Virginia.

I’ll post more Richmond data as they become available.

How ‘Bout Those Elementary Schools

While I have the data (and while trying not to remember that the 2017 results will be out in a week or so), here is the performance of the Richmond elementary schools on the reading tests, by year.

[Chart: Richmond elementary schools, reading pass rates by year]

The numbers here are averages of the pass rates of the three tested grades (3, 4, & 5).  Schools are sorted by 2016 pass rates.

The Big Dip in 2013 coincides with the new, tougher reading tests.  As you see, some schools were barely touched by the new tests; some were clobbered and have recovered; some were clobbered and remain that way.

The threshold for accreditation is 75%; only seven schools (of twenty-six) made that cutoff in 2016.  Six schools were below 50% with Woodville firmly in last place at 33%.
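For readers who want to check the arithmetic, here is a minimal sketch of the grade-averaging and the benchmark count. The school names and rates are invented, and actual accreditation uses adjusted rates, so this mirrors only the simple comparison made here:

```python
# Average each school's three tested grades, then count schools at or
# above the 75% reading benchmark. All names and rates are invented.
schools = {
    "School A": [82, 78, 80],   # grade 3, 4, 5 reading pass rates (%)
    "School B": [60, 55, 58],
    "School C": [76, 74, 77],
}

averages = {name: sum(r) / len(r) for name, r in schools.items()}
made_cut = [name for name, avg in averages.items() if avg >= 75]

print(sorted(averages.items(), key=lambda kv: kv[1]))  # worst first
print(made_cut)  # ['School A', 'School C']
```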

Next, math:

[Chart: Richmond elementary schools, math pass rates by year]

The schools again are sorted by the 2016 numbers so most of the color codes are changed from the reading graph.  (Sigh!)

The new math tests came in 2012.  Note the 2d round reaction in ‘13 at some schools.

The accreditation threshold here is 70%.  Thirteen schools made that cut in ’16; thirteen did not.  Four were below 50%.  Swansboro beat out Woodville for last place, 33% and 39%, respectively.

Stay tuned for the 2017 numbers that should give the final measure of Superintendent Bedden’s success or failure.  (The ‘17 data will tell us nothing about the Board of “Education” that has been busy adopting a “Memorandum of Understanding” instead of doing something useful to fix Richmond’s awful schools.  But, then, even they have noticed that they don’t know how to fix urban schools so perhaps that Mt. Everest of sterile MOU paperwork will keep them from more harmful meddling.)

Carver!

The Board of Education just posted its list of 2017 Index of Performance Awards (based on 2016 data).

(They released the 2016 SOL data on August 16, 2016, and they had the numbers before then.  We might wonder why it took a year to figure out where to send the awards.)

The 145 schools receiving “Excellence” awards included Richmond’s Carver and Munford elementary schools and Open High.

Open aced the math SOLs last year and got a 97% pass rate on reading.  The average of the five subjects’ pass rates was 98.7%.  Sounds like “excellence” to me.

Munford has a rep in Richmond and Carver has shone under its new principal.  I thought I’d dig around in their numbers.

Among elementary schools, Carver was eighth in the state in reading in ‘16 with a 97.5% pass rate (average of the pass rates for the three grades tested).  Munford was in 103d place at 91.3%.  In math, Carver was 27th at 95.2%; Munford was 145th at 91.2%.

Those Carver numbers look really fine, especially when we expand the focus to include all the Richmond elementary schools (2016 pass rates; average of the averages for the three grades):

[Chart: 2016 reading pass rates, Richmond elementary schools]

[Chart: 2016 math pass rates, Richmond elementary schools]

Both schools showed the effects of the new math tests in 2012 and the new reading tests in 2013, Carver more so.  Both schools have since recovered, Carver to exceptional levels.

[Charts: Carver and Munford reading and math pass rates by year]

These data raise two questions:

  • What is Carver doing that Blackwell and Chimborazo and Swansboro and Woodville and too many other Richmond schools are not doing; and
  • Why are we looking all over for a new Superintendent when Carver is so close?

Pell Plus

We have seen that, among Virginia’s public, 4-year college programs, the graduation rate of Pell grantees correlates strongly with the overall graduation rate of the school.  We also have seen that (federal) Pell and (state) Commonwealth award holders on average graduate at lower rates than students who receive no financial aid.  As well, the data show that students receiving other forms of support graduate, on average, at higher rates than students with no aid.

SCHEV has some more data on this subject. 

Here we look at the 4-, 5-, and 6-year cohort graduation rates for the 2010-11 entering cohort of first-time-in-college students at our Big Hitter universities.  These data count graduations anywhere, not just at the starting institutions.
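The arithmetic behind those cohort rates is straightforward; here is a sketch of it with invented records (SCHEV, of course, computes the real rates from its own student-level data):

```python
# 4-, 5-, and 6-year graduation rates for one entering cohort.
# Each entry is a student's graduation year, or None for no degree;
# graduations anywhere count, per the SCHEV data.
entry_year = 2010
grad_years = [2014, 2014, 2015, 2016, None, 2015, None, 2014]  # invented

for window in (4, 5, 6):
    grads = sum(1 for y in grad_years
                if y is not None and y - entry_year <= window)
    print(f"{window}-year rate: {grads / len(grad_years):.1%}")
# 4-year: 37.5%, 5-year: 62.5%, 6-year: 75.0%
```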

The All Students rates look like this:

[Chart: 4-, 5-, and 6-year graduation rates, all students]

The rates for students who do not receive any financial support are similar but mostly lower.

[Chart: 4-, 5-, and 6-year graduation rates, students with no financial support]

The differences between the two rates vary considerably from school to school (and emphasize that averages do not tell the whole story).

[Chart: all-students rates minus no-support rates]

The Pell grantees graduate at still lower rates (UVa excepted).

[Chart: 4-, 5-, and 6-year graduation rates, Pell grantees]

Or, in terms of the Pell rate vs. the no-support rate:

[Chart: Pell rates minus no-support rates]

Here we see Pell grantees underperforming the no-support group except in the fifth and sixth years at THE University.  It seems that UVa’s selection process works even better for Pell students than for the students who pay their own way.  VCU is another story.

The other group we have seen underperform on average is the Commonwealth Award grantees.

[Chart: Commonwealth Award rates vs. no-support rates]

UVa and W&M report no grantees (or at least fewer than ten; see the SCHEV suppression rules).  Tech and Mason outperform here; VCU does not.

The athletic awards show a much different pattern.

[Chart: athletic award rates vs. no-support rates]

Those are large differences.  At six years, VCU’s graduation rate for athletes ran 13.7 points above its no-support rate; Tech’s ran 18.4 points below.

BTW: At five and six years, W&M graduated 100% of the supported athletes.  Consistent with its high overall rate, UVa graduated 90.2% at five years, 92% at six.

[Chart: graduation rates, athletic award holders]

Next, here are the results for the other aid programs whose recipients, averaged over Virginia’s 4-year programs, outperformed the no-support group:

Perkins (Federal program for students enrolled in career & technical ed. programs):

[Chart: Perkins rates vs. no-support rates]

PLUS loans:

[Chart: PLUS loan rates vs. no-support rates]

Subsidized and unsubsidized Stafford loans:

[Chart: subsidized Stafford rates vs. no-support rates]

[Chart: unsubsidized Stafford rates vs. no-support rates]

Tuition waiver programs (mostly for employees and families; some for “older” citizens):

[Chart: tuition waiver rates vs. no-support rates]

SCHEV-operated VGAP awards:

[Chart: VGAP rates vs. no-support rates]

Federal work-study:

[Chart: work-study rates vs. no-support rates]

Notice the relatively high rate of 4-year degrees (and the widespread overperformance) among the work-study students. 

Doubtless there are multiple factors driving these results.  We might expect those to include the granting process, the selectivity of the school, the pool from which a school draws, and probably others.

Nonetheless these data suggest some tentative conclusions:

  • UVa and W&M are skimming the cream from the population of students who receive financial support;
  • As the earlier data suggested, the SCHEV-run VGAP awards are much more effective, in most cases, than the school-run Commonwealth awards;
  • Some schools run counter to the average underperformance of the Pell and Commonwealth grantees (e.g., UVa on Pell; Tech and Mason on Commonwealth); and
  • VCU’s relatively high graduation rate of athletes might suggest either careful selection and nurturing or corrupt grading practices.  It would be good to know which.

Diploma Inflation

The standard diploma requires five “verified credits” (i.e., passing the course plus passing the End of Course (EOC) SOLs or approved substitute tests) in English, math, laboratory science, and history & social science, plus one further verified credit, presumably in one of those areas.   The advanced diploma requires three further verified credits.
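To make that counting rule concrete, here is a minimal sketch of the verified-credit arithmetic.  It is a simplified reading of the requirements as summarized above, not the full VDOE regulation:

```python
# Verified credit = course passed + EOC SOL (or approved substitute) passed.
# Simplified rule: standard = five core credits plus one more (six total);
# advanced = three further verified credits (nine total).
CORE = {"English", "math", "laboratory science", "history & social science"}

def diploma_type(verified_credits):
    """verified_credits: one subject-area string per verified credit earned."""
    core = sum(1 for subject in verified_credits if subject in CORE)
    total = len(verified_credits)
    if core >= 5 and total >= 9:
        return "advanced"
    if core >= 5 and total >= 6:
        return "standard"
    return "neither"

print(diploma_type(["English", "English", "math", "laboratory science",
                    "history & social science", "math"]))  # standard
```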

We have seen that the 4-year cohort graduation rate statewide has risen in recent years at the same time that the EOC SOL pass rates have generally declined.

[Chart: statewide EOC pass rates and 4-year cohort graduation rates by year]

The five-year graduation rates run a bit higher but show the same pattern (except, of course, that the data end a year earlier).

[Chart: statewide EOC pass rates and 5-year cohort graduation rates by year]
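One way to put a number on that divergence is to correlate the yearly subject-average pass rate with the diploma rate; falling pass rates against rising diploma rates produce a strongly negative correlation.  A sketch with placeholder values, not the actual state numbers:

```python
# Correlate yearly average EOC pass rates with diploma rates.
# Values are placeholders, not the actual statewide numbers.
from statistics import correlation  # Python 3.10+

pass_rate = [84.0, 81.0, 80.0, 79.0, 78.5]  # avg EOC pass rate (%), by year
grad_rate = [88.0, 89.0, 89.9, 90.5, 91.3]  # cohort diploma rate (%), by year

print(f"r = {correlation(pass_rate, grad_rate):+.2f}")  # near -1.0 here
```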

Well, that’s the state.  Let’s look at some divisions.  All the graphs below show the 4-year cohort rates.

Let’s start with Fairfax where, for sure, all the students are above average.

[Chart: Fairfax EOC pass rates and diploma rates by year]

Hmmm. Excellent numbers but, again, declining pass rates and increasing diploma rates.  That’s easier to see if we just look at the subject average pass rate and the total diploma rate.

[Chart: Fairfax subject average pass rate and total diploma rate]

Loudoun looks much the same.

[Chart: Loudoun EOC pass rates and diploma rates by year]

[Chart: Loudoun subject average pass rate and total diploma rate]

Then we have Richmond and its peers.  (Notice that the standard diploma rate runs higher than the advanced rate, contra the NoVa jurisdictions and the state averages.)

[Charts: EOC pass rates and diploma rates for Richmond, Hampton, Newport News, and Norfolk]

Finally, because I sometimes have readers there, Charles City and Lynchburg.

[Charts: EOC pass rates and diploma rates for Charles City and Lynchburg]

There are some interesting differences here, particularly the vast gap between the NoVa jurisdictions and the old cities, and the lower rates of advanced diplomas in those cities.

The variability of the small numbers in Charles City makes conclusions there problematic, but otherwise the underlying theme is constant: decreasing pass rates with increasing graduation rates.

These data surely allow VBOE to brag on the increasing graduation rates.  Whether they can brag that the kids are learning more is a question these data do not answer.

And, for sure, these data confound the notion that decreasing pass rates should lead to decreasing graduation rates.