Graduation Rates: Official Fiction

The estimable Carol Wolf sent me the link to an article reporting improving graduation rates for disabled students and asked whether that improvement was reflected in Virginia or Richmond.

We earlier saw that Virginia’s graduation rate has been increasing while the reading and math End of Course pass rates were falling.  That is, the Board of “Education” has its thumb on the statistical scale.  Per Carol’s inquiry, let’s take a closer look at the overall rates and delve into the rates for disabled students.

The current requirements for a “standard” diploma include six “verified” credits, two in English plus one each in math, a laboratory science, history & social science, and a student-selected subject.  To earn a verified credit, the student must both pass the course and pass the End of Course (“EOC”) SOL test “or a substitute assessment approved by the Board of Education.”

[Do you see the thumb there on the scale?]

To start, here are the reading EOC pass rates for the past five years.

image

Hmmm.  How might we explain those Richmond disabled numbers for 2014-16?  Friar Occam might suggest cheating.  In any case, this is not a picture of improving performance.

Then we have writing.

image

History & Social Science.

image

Math.

image

And science.

image

There are some bumps and squiggles there, but the trends are clear: the state averages are fading and the Richmond averages are plunging.

The five-subject average smooths out the variations.

image

That’s clear enough: despite some gains in ‘15 and ‘16, the statewide averages have declined in the last two years and are down overall since 2014.  The Richmond averages have plummeted.

Turning to diplomas: Our educrats report (and brag upon) an “on-time” graduation rate.  To get that rate they define “graduates” to include students who receive any of the following diplomas: Advanced Studies, Standard, Modified Standard, Special, and General Achievement.

To their credit, the federales do not count the substandard diplomas: The federal rate includes only advanced and standard diplomas.  To combat that bit of realism, Virginia two years ago redefined the Modified Standard Diploma by allowing “credit accommodations” to transform it, in most cases, into a Standard Diploma.
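To make the definitional game concrete, here is a minimal sketch of the two calculations.  The diploma counts below are invented for illustration; the real numbers live in VDOE’s cohort reports.

```python
# Hypothetical cohort; the counts are invented for illustration only.
cohort = {
    "Advanced Studies": 500,
    "Standard": 300,
    "Modified Standard": 60,
    "Special": 25,
    "General Achievement": 15,
    "No Diploma": 100,
}
total = sum(cohort.values())

# Virginia's "on-time" rate counts every diploma type as a graduate.
on_time = (total - cohort["No Diploma"]) / total

# The federal rate counts only Advanced Studies and Standard diplomas.
federal = (cohort["Advanced Studies"] + cohort["Standard"]) / total

# The redefinition: "credit accommodations" turn most Modified Standard
# diplomas into Standard diplomas, raising the federal rate with no
# change in what the students learned.
federal_redefined = federal + cohort["Modified Standard"] / total

print(f"on-time: {on_time:.1%}; federal: {federal:.1%}; "
      f"federal after redefinition: {federal_redefined:.1%}")
# on-time: 90.0%; federal: 80.0%; federal after redefinition: 86.0%
```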

The redefinition had a nice effect statewide and a dramatic effect in Richmond.

image

image

With that background, let’s look at the four-year cohort graduation rates.

image

Those increases are enough to warm an educrat’s heart, at least until we notice that:

  • The pass rates don’t support the recent increases, and
  • That 2017 bump in the disabled rate (which boosts the overall rate in some measure) reflects 1,200 or more modified standard diplomas that were transformed into standard diplomas by fiat.

The redefinition gave Richmond a nice bump in 2017, but the overall rate resumed its decline in 2018.

image

So, yes, Carol.  The Virginia four-year cohort graduation rates rose, both for disabled students and for all students.  The rise was enhanced after 2016 by (even further) manipulated data.  The rise continued at a time when the pass rates on the End of Course SOL tests were declining.  If you believe those improving numbers, I want to sell you some shares in a nice James River bridge.

Richmond’s declining numbers remind us that even bogus statistics can’t make Richmond’s public schools look like anything but a menace to the students.

Your tax dollars at “work.”

Postscript: It looks like inflated graduation rates are a national phenomenon.

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.

image

Note: Post updated from October, 2018 to correct the 2017 state rate.

The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7-point decrease in the advanced studies rate.

image

But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader!  The 1.3 and 5 point boosts look to have been offset or partially offset by decreases in the End of Course pass rates. 

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.

image

Students must pass six EOC tests (or approved substitutes) to graduate.  Thus, decreases in the pass rates on those required tests must have lowered the graduation rates.  The VDOE data manipulation then offset those declines in some measure.
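A toy model illustrates why per-test declines bite.  Treat the six tests as independent, with a uniform per-test pass rate; both are gross simplifications (retakes and substitute assessments raise the real odds considerably), so take the numbers as illustrative only:

```python
# Toy model only: probability of passing all six required EOC tests,
# assuming independence and a uniform per-test pass rate.
for p in (0.85, 0.80, 0.75):
    print(f"per-test rate {p:.0%} -> all six: {p ** 6:.1%}")
# per-test rate 85% -> all six: 37.7%
# per-test rate 80% -> all six: 26.2%
# per-test rate 75% -> all six: 17.8%
```

On this crude view, a five-point drop in the per-test pass rate knocks roughly ten points off the fraction of students clearing all six hurdles, which is the direction (if not the precise size) of the effect on graduation.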

That looks like a general explanation.  The specifics would require more detailed knowledge of which students passed or failed which tests, where in their four-year journey through high school they did so, and whether they graduated.  For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The big drops in the Richmond EOC pass rates, and the lesser drops in the state rates, came mostly in math and science.

image

image

Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests.  This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.

Don’t Blame the Kids, Chapter MMXIX

The previous post shows (yet again) that lack of money does not explain Richmond’s awful SOL pass rates.

That post, however, left open the question whether Richmond’s relatively large population of economically disadvantaged (“ED”) students might explain Richmond’s poor performance, at least in part.  The VDOE database has the pass rate data to answer that question.

Let’s start by looking at the division average reading pass rates for both the ED and Not ED groups as functions of the division expenditure in excess of the Required Local Expenditure.

image

The least squares line fitted to the Not ED pass rates shows an increase of 1.15% for a 100% increase in the excess expenditure, but the correlation is vanishingly small.  The line fitted to the ED pass rates in fact shows a decrease of 3.3% per hundred, but still with a correlation that would not support a conclusion anywhere except, perhaps, in a sociology thesis.
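For anyone who wants to check the fits, here is a minimal sketch of the calculation.  The file and column names are my own invention; substitute your export of the VDOE pass rates and the RLE report’s excess figures.

```python
import pandas as pd
from scipy.stats import linregress

# Hypothetical CSV: one row per division, with the excess expenditure
# (percent above RLE) and the ED / Not ED reading pass rates.
df = pd.read_csv("division_reading_2018.csv")

for col in ("not_ed_rate", "ed_rate"):
    fit = linregress(df["excess_pct"], df[col])
    # Slope is points of pass rate per point of excess expenditure;
    # times 100 gives the "per 100% increase" figure quoted above.
    print(f"{col}: {fit.slope * 100:+.2f} points per 100% excess, "
          f"R^2 = {fit.rvalue ** 2:.3f}")
```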

These data are consistent with the earlier analysis that showed no benefit to ED or Not ED pass rates from increased per student day school expenditures, increased instructional salaries, or increased per student number of instructional positions.

In the Not ED data, Richmond is the larger diamond with the gold fill, at a pass rate of 74.3%, fourth worst in the state.  The Richmond ED rate, the larger circle, again with the gold fill, is 51.7%, second worst in Virginia.  In the previous post, Richmond’s overall SOL pass rate was 58.9%, third worst.

In short, large ED population or not, Richmond is doing a terrible job of teaching all its students.

For reference, the peer cities are in red fill, from the left Norfolk, Newport News, and Hampton.  Charles City is green; Lynchburg, blue.

Eight of the ten highest-priced divisions beat the state reading average for Not ED students (green fill in the table below); four beat the ED average.

image

Of the ten lowest-excess divisions, six beat the average for Not ED students while nine beat the ED average.

image

As to the divisions of interest, here are their pass rates expressed as differences from the averages of the division pass rates.

image

The data for the other subjects are consistent with the conclusion that Richmond’s schools are ineffective, for both ED and Not ED students.

image

image

Notice that I had to expand the axis to capture Richmond’s tied-for-worst ED writing pass rate.

image

image

image

image

image

image

The next time RPS starts (well, continues) to whine about needing more money for instruction, please be sure to ask them exactly what that money would buy, given that more money in other divisions doesn’t seem to buy anything but more taxes.

More on Money and Learning

A helpful reader points out that the 2018 Required Local Effort report is up on the Web.

Va. Code § 22.1-97 requires an annual report showing whether each school division had provided the annual expenditure required by the Standards of Quality.  The report presents the data as “required local effort” (“RLE”) per school division, along with the actual expenditure, both in dollars and as the excess above the RLE.

Note: The report breaks out the data for all divisions, but the SOL database consolidates Emporia with Greensville County, James City County with Williamsburg, and Fairfax City with the County.  I’ve calculated those three consolidated RLEs from the individual division data.
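For the record, the consolidation is just dollar arithmetic: sum the required and actual expenditures for the paired divisions and recompute the percentage excess.  A sketch, with placeholder dollars rather than the report’s figures:

```python
# Combine two divisions' RLE figures the way the SOL database pairs them.
def consolidate(required_a, actual_a, required_b, actual_b):
    required = required_a + required_b
    actual = actual_a + actual_b
    excess_pct = (actual - required) / required * 100
    return required, actual, excess_pct

# e.g., James City County + Williamsburg (placeholder dollars only):
req, act, excess = consolidate(90e6, 120e6, 10e6, 10.4e6)
print(f"required ${req:,.0f}; actual ${act:,.0f}; excess {excess:.1f}%")
# required $100,000,000; actual $130,400,000; excess 30.4%
```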

To start, here are the 2018 division reading pass rates plotted vs. the expenditures beyond the RLE.

image

The slope of the fitted line would suggest that the average reading pass rate increases by 1.17% for a 100% increase in the excess over the RLE.  The correlation, however, is minuscule.

Translating the statistics-speak into English: The division excess expenditure does not predict the reading pass rate.

Richmond is the gold square on that graph.  The peer cities are the red diamonds: From the left, Norfolk, Newport News, and Hampton.  Charles City is the green diamond; Lynchburg, the blue.

The Big Spenders, yellow circles on the graph, are:

image

For the most part, those divisions performed well (green shading for those > the average).

The Little Spenders, the red circles on the graph, are:

image

With the exception of Petersburg, those divisions also performed well, or close to it.  Notice that six of the ten are in Southwest Virginia and five of those performed quite well on the reading tests.

BTW: Williamsburg had the lowest excess, 3.69%, but that datum disappears here into the average with James City County.

Excel is happy to calculate and plot a Bang/Buck ratio by dividing the RLE excess into the pass rate.  The result:

image

The Bang/Buck correlates well with the reciprocal of the Excess; no surprise there.
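That correlation is nearly mechanical: the pass rates vary over a narrow band while the excess varies by an order of magnitude or more, so rate/excess behaves almost exactly like a constant times 1/excess.  A sketch with synthetic numbers makes the point:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic divisions: pass rates in a narrow band, excess spread widely.
rate = rng.uniform(65, 90, size=130)     # reading pass rate, percent
excess = rng.uniform(5, 120, size=130)   # expenditure beyond RLE, percent

bang_per_buck = rate / excess
r = np.corrcoef(bang_per_buck, 1 / excess)[0, 1]
print(f"correlation of Bang/Buck with 1/excess: {r:.2f}")  # ~0.99
```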

Looking just at the middle of the curve, we see the divisions of interest underperforming the fitted curve, Hampton only slightly.

image

The data for the other subjects tell much the same story (albeit with a hint of a correlation in the writing data).

image

image

image

image

All that said, we know that “economically disadvantaged” students underperform their more affluent peers and that Richmond’s large population of ED students puts RPS at a disadvantage in terms of raw SOL pass rates. 

Indeed.  Stay tuned for the next post.

No SATisfaction

Until yesterday, if you were to look on the RPS Homepage, under “About RPS,” you would have found a list of SAT scores that stopped in 2016.  The estimable Carol Wolf asked RPS when they planned to update the data; Ms. Hudacsko of RPS got things straightened out.

Let’s look at those updated data.  First the reading/writing numbers:

image

*Before 2016, the SAT had separate reading and writing scores; since then, the College Board has reported a single “reading and writing” number.  The data above (and below) are the combined scores for 2016 and later but only the reading scores before then.

Of course, some schools did better than others.

image

Turning to math:

image

image

It’s not Good News that our best two high schools underperformed the state average on both tests.

For some context, here are data on this year’s admissions at Longwood

image

and at THE University.

image

To get some further context, I consulted Dr. Google and found a shortage of posted SAT scores for individual schools.  Fairfax is a notable exception (when you look at the numbers, below, you’ll see why).  Here are their 2018 totals (reading/writing + math), along with the same data for the Richmond high schools and for Richmond, Virginia, and the US.

image

Now, class: In 25,000 words or fewer, explain how that performance comports with the RPS Web page that says:

image

RPS Embarrassing Itself

The RPS Web site has a chart:

image

The chart tells us the information there is:

image

No telling why they would post old data.

In any case, the “success” headline is a lie.  A quick run to the VDOE database measures the size of that lie: On the 2015-16 SOL tests, just announced in August 2016, Richmond was lowest in the state in terms of the reading pass rate and second lowest in writing, history & social science, math, and science.

image  image image

image

image

If that is “success,” I am Bill Gates.

Returning to the chart, we see this picture of RPS demographics.

image

Hmmm.  That 1.33% Asian slice is drawn smaller than the 0.06% Hawaiian slice and much smaller than the 0.16% Am-Indian slice.  Let’s see what Excel does with the same numbers:

image

OK.  The graph is wrong but, unlike the “success” lie, there’s no benefit to it.  It just demonstrates incompetence in graphing.
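The check is simple geometry: a pie slice’s central angle is its percentage of 360°.  Using the three percentages quoted above:

```python
# A pie slice's central angle is its share of 360 degrees.
for label, pct in [("Asian", 1.33), ("Am-Indian", 0.16), ("Hawaiian", 0.06)]:
    print(f"{label}: {pct}% -> {pct / 100 * 360:.2f} degrees")
# Asian: 1.33% -> 4.79 degrees
# Am-Indian: 0.16% -> 0.58 degrees
# Hawaiian: 0.06% -> 0.22 degrees
```

The Asian slice should be more than twenty times the width of the Hawaiian slice; RPS drew it smaller.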

Then we have arithmetic:

image

Thing is, if you add up the numbers, it’s 43 schools, not 44.

According to the Chart, RPS is spending more than a third of a billion dollars a year.  For that price, in one chart on their Web site they manage to tell a blatant lie, proffer a nonsense graph, and demonstrate that they have not mastered simple arithmetic.

Your tax dollars at “work.”

Passing the Kid (and Passing the Problem On)

Table 7 in the 2018 Superintendent’s Annual Report gives us, among other things, a count of the public school students retained in grade from 2017.  The short answer: a few in the first two grades and then hardly any until high school.

image

Expressed as the percentage of students, the graph looks much the same:

image

This greasing of the academic skids until the student reaches high school, but only until then, surely is related to the “ninth grade bump” that looks to be a nationwide phenomenon.

image

The principal argument for social promotion seems to be that holding the kid back is even less effective for learning than promoting that student.  The literature I’ve seen does not explain why that reason (or simple inertia or moving the problem on to the next level or whatever else might explain social promotion) stops at the ninth grade.

Accreditation Euphemism

The School Quality Profiles on the VDOE Web site offer an interesting look under the hood of the new accreditation process.

There’s no central source of the numbers so one must pluck them out, subject by subject and school by school.  The English data for Richmond took a chunk of an afternoon.

Here are those English data.  First, the elementary schools:

image

The red line at 75% marks the nominal (“adjusted”) accreditation benchmark, “Level 1” under the regulation.  The purple marks the new, good-enough-for-four-years “Level 2” at 66%.  The various “adjustments” [subsection F.1] are color coded atop the blue pass rates.

As you see, only four schools made Level 1 directly; with the adjustments, add another eight.  Four schools were adjusted into Level 2; eight others did not make Level 2, even with the help of the adjustments. 

There are other adjustments, however.  For example, Miles Jones made Level 1 on the three-year average; Bellevue and Fairfield Court made Level 2 on that average.

(The Greene “Growth” and “English Language” boosts are remarkably large.  FWIW, the growth boost, while extravagant, looks to be in line with other low-performing schools.  The database tells us that 220 of the 303 Greene students taking the reading tests, about 73%, were English Learners; those EL students passed at a 28.18% rate while 65.59% of the non-EL students passed.  And, boost or no, Greene was accredited on the three-year average.)
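To keep the moving parts straight, here, as a sketch, is the leveling logic as I read the regulation: the benchmarks are the 75% and 66% described above, a school gets the benefit of its adjustments, and the three-year average offers an alternative route (the adjustment amounts themselves come from VDOE):

```python
# Sketch of the leveling logic as I read the new regulation: a school is
# judged on its adjusted current-year rate or its three-year average,
# whichever is better; Level 1 at 75%, Level 2 at 66%, else Level 3.
def level(pass_rate, adjustments=0.0, three_year_avg=0.0):
    best = max(pass_rate + adjustments, three_year_avg)
    if best >= 75:
        return "Level 1"
    if best >= 66:
        return "Level 2"
    return "Level 3"

# A school at 68% raw plus 8 points of adjustments makes Level 1;
# another at 60% raw makes Level 2 on a 67% three-year average.
print(level(68, adjustments=8))        # Level 1
print(level(60, three_year_avg=67))    # Level 2
```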

After everything, the elementary school accreditation data look like this (with math, science, and absenteeism shown along with English, but the “achievement” gaps and “Accredited on 2015 Standards” items omitted):

image

Miles Jones is anomalous there.  It looks fine on the items in this list but still is “with Conditions.”  Among the items not in the figure, Jones hit L3 on both math and English for its disabled students as well as math for black students and the math “achievement gap.”

Next the middle schools.

image

image

Hill looks fine on this figure but was L3 on the achievement gap and the performance of disadvantaged and disabled students in both English and math, as well as the math performance of its black students.

The VDOE spreadsheet shows Brown at L3 in English; the School Quality Profiles show L2.  I can’t reconcile the two.  In any case, there are some L3’s in the achievement gaps that would explain the “with Conditions” at Brown.

The high schools look good, at least until you look past the reading scores.

image

image

And the specialty schools look good, period.

image

image

Here are the 2017 and 2018 Richmond lists (omitting Amelia Street and Richmond Career).

image

Or, in summary (leaving Carver out):

image

The old accreditation process was riddled with “adjustments” that improved the scores and allowed more schools to be accredited.  Even so, one Virginia school in twenty was denied accreditation in 2017.  The new regulation solves that problem.  In addition to providing a spectrum of helpful “adjustments” to boost the scores, the regulation provides:

If a school is [in such bad shape that even the adjustments don’t save it] and the school or school division fails to adopt and implement school division or school corrective action plans with fidelity as specified by 8VAC20-131-400 D, it may be designated by the board as “Accreditation Denied” as provided in 8VAC20-131-400 D 4. [Emphasis supplied.]

In short, as long as a school makes nice to the Board of Education, it won’t be denied accreditation.

Looking at the data above, the conclusion is clear: “Accreditation Denied” has gone away; “Accredited with Conditions” is the new, more Superintendent-friendly euphemism that means roughly the same thing. 

Thus, if we read the new system for what it means, not what it says, it might even be taken as an improvement on the old, bogus system.  Let’s call it the new, bogus system.

But, then, “Board of Education” is itself a euphemism, if not an outright lie (for example, see this and this and this and this) so there’s no reason to expect honest reporting from their “accreditation” process. 

Even more to the point, that Board has demonstrated (and members have admitted [Sept. 21, 2016 video starting at 1:48]) that they don’t know how to fix awful schools such as those in Petersburg.  So the labels they apply to failing schools provide bureaucratic camouflage: Those schools, and our Board of “Education,” can put on a happy face while they persist in failing to educate far too many children.

Why You Might Want a Woman for Your Kid’s English Teacher

. . . or for your lawyer if she went to school in Virginia, but not if she went to school in Richmond:

image

On the other hand, it might be OK for your next Doc to be a male (but, again, not if he went to school in Richmond).

The End of Course pass rates also suggest a preference for the female lawyer or English teacher and a slight preference for the female doctor, if she didn’t go to school in Richmond.

image

You may remember that the large Richmond population of economically disadvantaged students imposes about a 5% penalty on the Richmond SOL pass rates.  Unfortunately, adding that much to these Richmond numbers does not affect the recommendations.