Elementary, My Dear Bedden

Here we have the SOL pass rates for Richmond’s elementary schools for 2011 to 2017.

For sure, these graphs are cluttered.  The only way to paint a complete picture is to include all the schools on one graph, although the resulting information density makes some schools difficult to follow.  For a nice picture of the recent results at any particular school, see the VDOE page here and put the school name in the search box.

To start: the reading tests.  The schools in the legend are sorted by the 2017 pass rate.


Nice improvement this year at Redd and Mason; nicer still at Stuart (24%!).  Problems at Carver and Francis.

The new English tests in 2013 hit most of the schools quite hard.  Some have recovered; many have not.

Our star performer, Carver, has slid (Dizzy Dean would have said “slud”) to fourth place.

History and Social Sciences did not have a new test to lower the scores but too many of our schools found a way to slide anyhow.


Big gains here at Stuart (28%), Patrick Henry (25%), and Ginter Park (21%).  Chimborazo, Swansboro, Cary, and Westover Hills all went the other way.

Next, math.  The new tests came in 2012; a number of schools suffered a further hit in 2013, perhaps because of fallout from the new English and science tests that year.


Stuart was the big gainer here (after a similar loss the year before).  Carver, Francis, Westover Hills, and Bellevue led the losers.

Next, science.


2013 was the year of the new, tougher tests.

I didn’t have the heart to expand the axis to include Woodville’s 82% failure rate in 2017.  But that was from a mere 3% drop in the pass rate: Westover Hills fell 24% and Oak Grove 20%, with Francis, Ginter Park, and Southampton all down more than 10%.

Finally, the four subject average.


Stuart was the big gainer here at +17%, followed by Blackwell at 10%.  Westover Hills dropped 14%; Carver, 12%; Swansboro, 11%; and Francis, 10%.

For the view from thirty thousand feet, here are the averages of the elementary school pass rates:


The accreditation benchmark is 75 for English, 70 for the other subjects.  So we see that the average of the Richmond elementary schools not only declines in every subject but reading; it also flunks every subject but Hist & SS.
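For anyone who wants to check figures like these against the benchmarks, here is a minimal Python sketch of the test; the pass rates in it are invented for illustration, not the actual Richmond averages:

```python
# Accreditation benchmarks: 75 for English, 70 for the other subjects.
BENCHMARKS = {"English": 75, "Math": 70, "History & SS": 70, "Science": 70}

def flag_subjects(averages):
    """Return the subjects whose average pass rate falls below its benchmark."""
    return [subj for subj, rate in averages.items() if rate < BENCHMARKS[subj]]

# Invented numbers, for illustration only (not the actual Richmond averages).
example = {"English": 68.0, "Math": 66.5, "History & SS": 74.0, "Science": 63.0}
print(flag_subjects(example))  # ['English', 'Math', 'Science']
```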

But if you think this is bad (and it is), wait for the middle school numbers, up next.

Bedden Blew It

Dana Bedden started in Richmond in January, 2014.

We can ascribe the awful performance of the Richmond schools that year, and probably the next, to the previous Superintendent who had failed to align the curricula to the new math tests in 2012 and the new English and science tests in 2013.

After that, Bedden gets the credit.  Or the not credit.

The 2017 school year was Bedden’s third full (and last) year in the saddle.

We got a preview of his performance from the “division-level academic review” conducted by VDOE in the spring of 2017.   For the most part, the resulting report is a bloated batch of bureaucratic babble.  Nonetheless, some truth shines through.

On a scale of zero to three, where zero and one are failing, Richmond received a “0” for student “Outcomes” (no surprise there), a “1” for “Curriculum Alignment,” and a “3” for “Support for Instructional Leadership.”

So, after two+ years of Bedden, the curricula still were not aligned?  And student “outcomes” were zilch?  But the “instructional leadership” was fine?  Please!

Today VDOE released the 2017 SOL results. 

The database with the full (and not rounded) data is not available as I write this.  The summary spreadsheets are available here.

Those spreadsheets give us a measure of Bedden’s performance:

Reading: Second or third lowest pass rate (not counting the school for deaf & blind), up from last place last year but down two points.


Writing: Bottom of the barrel, down from 2d worst last year, despite an eight point gain.


History & Social Sciences: We beat poor Petersburg again with a constant 67% pass rate.


Math: Last year we beat Lancaster; this year, Petersburg, with a pass rate that dropped four points.


Science: Next to last again, beating only Petersburg but with a five point drop in our pass rate.


Five Subject Average: Beat only Petersburg, up from last place last year; our pass rate dropped slightly, theirs more.


Note: These are averages of the five subjects’ pass rates, not of all tests.
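That distinction matters when the subjects have different numbers of tests.  A quick Python sketch, with made-up numbers, shows how the two averages can diverge:

```python
# Two ways to average pass rates: across subjects (unweighted) vs. across
# all tests (weighted by how many tests each subject had).
# The counts below are invented purely to show that the two can differ.
subjects = {
    # subject: (tests passed, tests taken)
    "Reading": (800, 1000),
    "Writing": (90, 200),
}

subject_rates = [100 * p / t for p, t in subjects.values()]
avg_of_subjects = sum(subject_rates) / len(subject_rates)   # 62.5
passed = sum(p for p, _ in subjects.values())
taken = sum(t for _, t in subjects.values())
avg_of_all_tests = 100 * passed / taken                     # ~74.2

print(avg_of_subjects, avg_of_all_tests)
```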

It appears that the School Board did well to get rid of Superintendent Bedden. 

Trouble is, now we go part (most?) (all?) of the 2018 school year with an interim Super and the new Super will need at least a couple of years to get any traction.  In the meantime, Richmond’s schoolchildren continue to suffer in our awful schools.

Of course, it could be worse: We could be Petersburg, laboring under Memoranda of Understanding from the Board of Education (that doesn’t know how to fix their schools) for thirteen years and still competing to be the worst school division in Virginia.

I’ll post more Richmond data as they become available.

How ‘Bout Those Elementary Schools

While I have the data (and while trying not to remember that 2017 results will be out in a week or so), here is the performance of the Richmond Elementary schools on the reading tests, by year.


The numbers here are averages of the pass rates of the three tested grades (3, 4, & 5).  Schools are sorted by 2016 pass rates.
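For the record, the averaging and sorting here are simple enough; a Python sketch (with invented school names and rates, not the actual figures) looks like this:

```python
# Per-school reading rate = mean of the three tested grades (3, 4, 5),
# then sort the schools by that average.  All numbers are invented.
grade_rates = {
    "School A": {3: 80, 4: 84, 5: 88},
    "School B": {3: 50, 4: 55, 5: 45},
    "School C": {3: 70, 4: 72, 5: 74},
}

school_avg = {
    school: sum(rates.values()) / len(rates)
    for school, rates in grade_rates.items()
}
ranked = sorted(school_avg.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # School A first, School B last
```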

The Big Dip in 2013 coincides with the new, tougher reading tests.  As you see, some schools were barely touched by the new tests; some were clobbered and have recovered; some were clobbered and remain that way.

The threshold for accreditation is 75%; only seven schools (of twenty-six) made that cutoff in 2016.  Six schools were below 50% with Woodville firmly in last place at 33%.

Next, math:


The schools again are sorted by the 2016 numbers so most of the color codes are changed from the reading graph.  (Sigh!)

The new math tests came in 2012.  Note the 2d round reaction in ‘13 at some schools.

The accreditation threshold here is 70.  Thirteen schools made that cut in ‘16; thirteen did not.  Four were below 50%.  Swansboro beat out Woodville for last place, 33% and 39%, respectively.

Stay tuned for the 2017 numbers that should give the final measure of Superintendent Bedden’s success or failure.  (The ‘17 data will tell us nothing about the Board of “Education” that has been busy adopting a “Memorandum of Understanding” instead of doing something useful to fix Richmond’s awful schools.  But, then, even they have noticed that they don’t know how to fix urban schools so perhaps that Mt. Everest of sterile MOU paperwork will keep them from more harmful meddling.)

Pell Plus

We have seen that, among Virginia’s public, 4-year college programs, the graduation rate of Pell grantees correlates strongly with the overall graduation rate of the school.  We also have seen that (federal) Pell and (state) Commonwealth award holders on average graduate at lower rates than students who receive no financial aid.  As well, the data show that students receiving other forms of support graduate, on average, at higher rates than students with no aid.

SCHEV has some more data on this subject. 

Here we look at the 4-, 5-, and 6-year cohort graduation rates of the 2010-11 cohort of first time in college students at our Big Hitter universities.  These data count graduations anywhere, not just at the starting institutions.

The All Students rates look like this:


The rates for students who do not receive any financial support are similar but mostly lower.


The gaps between the two rates vary considerably from school to school (and emphasize that averages do not tell the whole story).


The Pell grantees graduate at still lower rates (UVa excepted).


Or, in terms of the Pell rate vs. the no-support rate:


Here we see Pell grantees underperforming the no-support group except in the fifth and sixth years at THE University.  It seems that UVa’s selection process works even better for Pell students than for the students who pay their own way.  VCU is another story.

The other group we have seen underperform on average is the Commonwealth Award grantees.


UVa and W&M report no grantees (or at least fewer than ten; see the SCHEV suppression rules).  Tech and Mason outperform here; VCU does not.

The athletic awards show a much different pattern.


Those are large differences.  At six years, VCU graduated 13.7% more of its athletes than of its no-support students.  Tech, 18.4% fewer.

BTW: At five and six years, W&M graduated 100% of the supported athletes.  Consistent with its high overall rate, UVa graduated 90.2% at five years, 92% at six.


Next, here are the results for the other grant programs whose grantees, averaged over Virginia’s 4-year programs, outperformed the no-support group:

Perkins (Federal program for students enrolled in career & technical ed. programs):


PLUS loans:


Subsidized and unsubsidized Stafford loans:



Tuition waiver programs (mostly for employees and families; some for “older” citizens):


SCHEV-operated VGAP awards:


Federal work-study:


Notice the relatively high rate of 4-year degrees (and the widespread overperformance) among the work-study students. 

Doubtless there are multiple factors driving these results.  We might expect those to include the granting process, the selectivity of the school, the pool from which a school draws, and probably other factors.

Nonetheless these data suggest some tentative conclusions:

  • UVa and W&M are skimming the cream from the population of students who receive financial support;
  • As the earlier data suggested, the SCHEV-run VGAP awards are much more effective, in most cases, than the school-run Commonwealth awards;
  • Some schools run counter to the average underperformance of the Pell and Commonwealth grantees (e.g., UVa on Pell; Tech and Mason on Commonwealth); and
  • VCU’s relatively high graduation rate of athletes might suggest either careful selection and nurturing or corrupt grading practices.  It would be good to know which.

Diploma Inflation

The standard diploma requires five “verified credits” (i.e., passing the course plus passing the End of Course (EOC) SOLs or approved substitute tests) in English, math, laboratory science, and history & social science, plus one further verified credit, presumably in one of those areas.   The advanced diploma requires three further verified credits.

We have seen that the 4-year cohort graduation rate statewide has risen in recent years at the same time that the EOC SOL pass rates have generally declined.


The five-year graduation rates run a bit higher but show the same pattern (except, of course, that the data end a year earlier).


Well, that’s the state.  Let’s look at some divisions.  All the graphs below show the 4-year cohort rates.

Let’s start with Fairfax where, for sure, all the students are above average.


Hmmm. Excellent numbers but, again, declining pass rates and increasing diploma rates.  That’s easier to see if we just look at the subject average pass rate and the total diploma rate.


Loudoun looks much the same.



Then we have Richmond and peers.  (Notice that the rate of standard diplomas is higher than the advanced rate, contra the NoVa jurisdictions and the state averages.)









Finally, because I sometimes have readers there, Charles City and Lynchburg.





There are some interesting contrasts here, particularly the vast gap between the NoVa jurisdictions and the old cities, and the lower rates of advanced diplomas in those cities.

The variability of the small numbers in Charles City makes conclusions there problematic but otherwise the underlying theme is constant: Decreasing pass rates with increasing graduation rates.

These data surely allow VBOE to brag on the increasing graduation rates.  Whether they can brag that the kids are learning more is a question these data do not answer.

And, for sure, these data confound the notion that decreasing pass rates should lead to decreasing graduation rates.

Diploma Inflation?

The estimable Jim Bacon the other day raised the question whether Virginia’s increasing graduation rates might be related to grade inflation.

We have some data on a nearly-related subject:  Recall that the standard diploma requires five “verified credits” (i.e., passing the course plus passing End of Course (“EOC”) SOLs or approved substitute tests) in English, math, laboratory science, and history & social science, plus (added note; Hat Tip, the illustrious Chuck Pyle) one in a subject of the student’s choice, presumably in one of those areas.  The advanced diploma requires three further verified credits.

The VDOE Web site has the Virginia average 4-year cohort graduation rates (back to 2008) along with the End of Course pass rates.  Here are the diploma rates along with the reading pass rates:


Hmmm.  Looks like the pass rate declined slowly through 2012, dropped a bit with the new (harder) tests in 2013, and remained about flat afterward.  During this time, the graduation rate with standard diplomas remained close to 35% while the rate of advanced diplomas rose from 44% to 52%, both with no dip in 2013.

Since Excel already has the data, let’s look at the correlations.
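For readers without Excel handy, the same CORREL number can be computed directly.  The series below are invented to mimic the pattern in the graphs (pass rates drifting down while the advanced-diploma rate rises), which produces a strongly negative correlation:

```python
# Pearson correlation, as Excel's CORREL computes it.
def correl(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

pass_rate = [94, 93, 92, 90, 90, 89]   # hypothetical EOC pass rates
adv_rate = [44, 46, 47, 49, 51, 52]    # hypothetical advanced-diploma rates
print(round(correl(pass_rate, adv_rate), 3))  # -0.975
```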



The writing tests tell much the same story.




That’s English.  How about math?


Here the new tests in 2012 had a large effect on the pass rate but the advanced diploma rate still rose unimpeded.  The (negative!) correlations are substantial:



Well, how about History & Social Science?




Finally, science.




Whatever is going on here – and the process is so byzantine that an outsider might despair to understand it – it is clear that the average graduation rate, especially of advanced diplomas, is not constrained by the EOC pass rates.  If anything, the graduation rates and pass rates are going in different directions.

Bacon mentions grade inflation.  This looks like diploma inflation somewhere outside the verified credit process.

Note added July 26: Here is a summary graph with all the data:


More Money for What?

Table 13 in the 2016 Superintendent’s Annual Report sets out the fiscal year’s disbursements by school division. 

The table reports day school disbursements (administration, instruction, attendance & health, pupil transportation, and O&M) plus disbursements for food service, summer school, adult education, pre-K, “other” educational, facilities, debt service, and contingency reserve.  See the footnotes to Table 13 for the details about these categories.  The numbers below omit facilities, debt service, and contingency reserve.

Taking the division totals, and dividing by the year-end Average Daily Membership (“ADM”), we see the following distribution:


Richmond is one of the three divisions at $15,500.

Note: Excel’s histogram analysis rounds up, so Richmond’s $15,052, Charles City’s $15,237, and Franklin City’s $15,317 all get reported at $15,500.
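That “rounding up” is just how Excel’s Histogram tool bins: each value goes to the first bin whose upper edge is at or above it.  A small Python sketch of the same rule:

```python
import math

# Excel's Histogram tool assigns each value to the first bin whose upper
# edge is at or above the value; with $500-wide bins, that "rounds up."
def bin_edge(value, width=500):
    return math.ceil(value / width) * width

for v in (15052, 15237, 15317):
    print(v, "->", bin_edge(v))  # all three report at 15500
```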

Looking at Richmond, the division average, and the peer cities of Hampton, Newport News, and Norfolk, we see:


Richmond’s $3,051 excess over the state average, times our ADM of 21,826 (not 1,253.51; oops!  Thanks, Jeremy!), gives an excess disbursement of $66.6 million.
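The arithmetic is easy to check:

```python
# Check the excess-disbursement arithmetic from the text:
# $3,051 per student over the state average, times 21,826 ADM.
excess_per_student = 3051
adm = 21826
total = excess_per_student * adm
print(f"${total:,} (about ${total / 1e6:.1f} million)")
# $66,591,126 (about $66.6 million)
```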

Plotting the division average reading pass rates v. the disbursements gives the following graph:


Richmond is the gold square.  The red diamonds, from the left, are Hampton, Newport News, and Norfolk.

The least squares fitted line suggests that pass rates decrease with increasing division disbursements, but the tiny R-squared tells us that the two variables are essentially uncorrelated.
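For the curious, here is a minimal Python sketch of the trendline computation (slope and R-squared, as Excel reports them).  The points are invented to show how a negative slope can coexist with an R-squared near zero:

```python
# Least-squares fit plus R-squared, as Excel's trendline reports them.
def fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, 1 - ss_res / ss_tot

spend = [9000, 10000, 11000, 12000, 13000, 14000]   # $/student (invented)
passing = [78, 71, 80, 69, 77, 72]                  # pass rates (invented)
slope, r2 = fit(spend, passing)
print(slope, r2)  # slope is negative, but R-squared is tiny
```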

Here is the same graph for the math SOLs:


The next time somebody starts claiming that Richmond needs more money, it might be useful to ask what they are doing with the excess $66.6 million they already are spending to achieve the lowest reading and second-lowest math pass rates in Virginia.


According to VBOE, 27 of Richmond’s 44 schools (61%) are not fully accredited.

This hardly comes as a surprise to VBOE: They’ve had the SOL data since last summer. 

As they report, last July our (soon to be ex-) Superintendent “indicated” that a division-level academic review was in order and in November VBOE voted (pdf @ p.365) to approve that review.  Yet the only progress to date is a “First Review of Division-Level Memorandum of Understanding for Richmond City Public Schools,” on the agenda for the June 22 VBOE meeting.

If VBOE had been sweating through a massive restructuring of the Richmond school system, this delay (while another year’s worth of kids are subject to damage by Richmond’s awful schools) might make sense.  Unfortunately, the only product of this foot-dragging is a meaningless bureaucratic mishmash – a “Memorandum of Understanding” that does nothing but create busywork and a “rough draft” plan that is not a plan.


In bureaucratese, “Memorandum of Understanding” is “MOU.”

You might reasonably ask what an MOU is. 

Well, whatever it is, it’s not a statute, regulation, or contract, or an enforcement order authorized by statute, so it’s not something that could be enforced. 

That said, let’s read this one and see what it does.  Or doesn’t do.

The MOU tells us in seven paragraphs that VBOE or VDOE will:

  1. coordinate . . . to provide technical assistance;
  2. meet with the RPS Board “President” (our School Board calls her its “Chair”);
  3. meet with the RPS Superintendent;
  4. provide “oversight over processes, procedures, and strategies” to include approval for expenditures of state or federal funds;
  5. “work closely” with RPS personnel and (redundantly) approve expenditures;
  6. (redundantly) provide “oversight over processes, procedures, and strategies;” and
  7. modify the MOU at will.

There we have six unquantifiable busywork provisions that are quite silent as to any result beyond a major opportunity for delaying financial processes.  That paragraph 7, however, says something: The MOU is not an agreement; it is some kind of (empty) bureaucratic fiat.

Well, perhaps RPS will be doing something useful.  Let’s see.

Richmond’s eleven paragraphs say they will:

  1. tell VBOE who the top three candidates are before they hire a Superintendent (more on this below);
  2. meet with VDOE;
  3. give VDOE approval of expenditures (If this were enforceable, it would create a huge bureaucratic burden for VDOE and give it control of RPS.  It is not enforceable.  It looks mostly to be a mechanism for delaying RPS decisions to spend money.);
  4. consult with VDOE about instruction and staff development;
  5. consult with VDOE about human resources et al.;
  6. create a corrective action plan (that they haven’t managed to create in the past year);
  7. cause the RPS Superintendent to keep the RPS Board updated;
  8. require RPS people to “participate” in technical assistance and professional development, as specified by VDOE;
  9. appear before VBOE to report;
  10. send the Board and Superintendent to professional development training; and
  11. permit a VDOE bureaucrat to sit on the RPS Board ex officio if RPS is not fully accredited after eight years(!).

Nope.  More busywork.

Paragraph 7 is particularly instructive: If the Super does not keep the Board updated, you’d think they’d fire the Super.  VBOE’s intrusion to this level suggests overwhelming busybodyness at VBOE.  Or sublime dysfunction in Richmond.  Or both.

Deeper Dive

Let’s take a more detailed look at the buns that enclose this nothingburger.

Paragraph 1 provides:

Should a vacancy occur in the position of Division Superintendent, the Richmond City School Board will provide the Superintendent of Public Instruction and the President of the Virginia Board of Education the names and credentials of its top three finalists to fill a vacancy of Division Superintendent or Interim Superintendent at least 5 business days prior to making an offer to the preferred candidate. The credentials of applicants must include experience in leading successful school and division turnaround efforts as evidenced by a multi-year trajectory of improved student achievement outcomes on the Virginia Standards of Learning tests or comparable state-mandated assessments in school divisions outside of Virginia.

The first clause is conditioned on a vacancy we know will occur on July 1, so we get to wonder why it is there.  Indeed, the sentence is quite clear without that clause, so the clause is doubly unnecessary.

(Are you beginning to notice a pattern of redundant nonsense here?)

This paragraph doesn’t tell us what VBOE might do if they object to a candidate and RPS hires her anyhow.  Indeed, there is nothing VBOE can do:

  • VBOE can enforce (presumably by injunction) Title 22.1, but the MOU is not part of that Title.
  • VBOE can sue to enforce the Standards of Quality, but those do not require a turnaround specialist as superintendent.

Presumably VBOE could sue to enforce its regulations regarding division superintendents (here and here), but those regulations are silent as to turnaround experience.  And the MOU is not a regulation or order, so the “turnaround” requirement there is meaningless.

Indeed, this is close kin to the demands for approval of RPS expenditures: Intrusive busywork that is not authorized by law.

Turning to the other end of the sandwich, Paragraph 11 in the Richmond section:

The Richmond City School Board will permit [a VDOE] selected representative to meet with the local board in an ex-officio, non-voting, member capacity should the division fail to have all of its schools Fully Accredited by the beginning of the 2025-2026 school year.

Aside from being quite unenforceable, this provision is ridiculous: If having a State bureaucrat sit on the RPS Board ex officio might accomplish anything, VBOE should demand it now.  The ongoing, outrageous harm to Richmond schoolkids is too high a price for delaying action at all, much less until 2025.

But, of course, eight years is a good number for VBOE: All of the members will have been replaced by then.

The Tell

Paragraph 11 is important for what it does not say: that Richmond is in gross violation of the Standards of Quality and that, if RPS doesn’t do what VBOE wants, VBOE will sue.

To the same end, the last section of the MOU, “Additional Consequences for Non-Compliance,” quotes a 2016 statute that authorizes VBOE to withhold payment of At-Risk Add-On funds if a local Board fails or refuses to satisfy VBOE in a division-level review.

That statute is part of the Acts of Assembly and is printed for the World in the Code; quoting it here does not add anything.  Presumably this is a threat. 

Probably an empty threat:  Withholding funds from a School Board that already is harming too many of the students in its charge would be counterproductive. 

More to the point, as paragraph 11 warned us, VBOE does not say, “If you haven’t fixed those schools by date x, we’ll sue you.”  That is because VBOE does not know how to fix Richmond’s broken school system (Sept. 21, 2016 video starting at 1:48).  They don’t know what to tell a judge that Richmond should be made to do, so they don’t even contemplate exercising their authority to sue.

Your tax dollars at “work.”

For Better Or Pell

We have seen that, among Virginia’s public, 4-year college programs, the graduation rate of Pell grantees correlates strongly with the overall graduation rate of the school.

SCHEV has some more data on this subject.  Here, for the public, four-year institutions, for the entering student year 2010-11, are the four-, five-, and six-year graduation rates at the college of entry of first time in college, full time, students with and without Pell grants, grouped by the number of credits attempted in the first term.


The green curves are for students without Pell grants.  Those attempting 15 to 17.5 credits in the first term (blue diamonds) showed a 58% 4-year rate, while the 5- and 6-year rates rose to 76% each.  Those attempting 18 or more credits (orange diamonds) achieved slightly lower rates while those attempting 12 to 14.5 credits (yellow diamonds) graduated at rates about ten percent lower.

The Pell grant population are the gray curves with the same codes for the data point colors. 


  • For all six groups, the rates did not increase much after the fifth year;
  • For both Pell and non-Pell, the students attempting 14.5 or fewer hours in their first terms graduated at about 10% lower rates than their peers who attempted 15 to 17.5 hour loads;
  • In both groups, students attempting still heavier loads — 18 hours or more — graduated less often than those attempting 15-17.5 but still more often than the 12-14.5 tranche. 

That last is only partially consistent with the notion that A students underestimate their abilities while D and F students overestimate theirs.  Something else is at work here.

In any case, the bottom line remains the same as in the earlier data: Pell grants are subsidizing a lot of failure.