Lies, Damn Lies, and Graduation Rates

To follow up on this morning’s post, VDOE’s “on-time” graduation rate counts the

  • Modified Standard diploma (students with disabilities who “are unlikely to meet the credit requirements for a Standard Diploma”), the
  • Special diploma (a term not defined on the VDOE Web site, but one that appears to refer to the Applied Studies diploma, available to certain students with disabilities), and the
  • General Achievement diploma (none reported this year)

in addition to the standard and advanced diplomas. 

This gives a nice boost to the official “graduation rate,” especially for those divisions willing to misclassify students as handicapped in order to boost their SOL pass rates.  On the 2017 4-year cohort data, the “on-time” fiction boosted the statewide rate by 2.8% and the Richmond rate by 6.7% compared to the federal (advanced plus standard diploma) rate.
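For readers who want the arithmetic, here is a minimal sketch of the two calculations. The diploma counts are made up for illustration; the real counts are in VDOE's cohort reports.

```python
# Hypothetical diploma counts for a 4-year cohort (illustrative only).
cohort = 1000
diplomas = {
    "advanced": 450,
    "standard": 350,
    "modified_standard": 30,
    "special": 40,
    "general_achievement": 0,
}

# The federal rate counts only advanced and standard diplomas.
federal = (diplomas["advanced"] + diplomas["standard"]) / cohort

# VDOE's "on-time" rate counts everything in the list above.
on_time = sum(diplomas.values()) / cohort

print(f"federal: {federal:.1%}, on-time: {on_time:.1%}, boost: {on_time - federal:.1%}")
# federal: 80.0%, on-time: 87.0%, boost: 7.0%
```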

[Chart: 2017 “on-time” vs. federal graduation rates, Virginia and Richmond]

The boosts this year look to have mostly come from the special diplomas.

[Chart: this year’s graduation rate boosts by diploma type]

Actually, it’s worse than that.  Beginning with students entering the ninth grade in 2013-14 (i.e., this year’s 4-year cohort), there was supposed to be no modified standard diploma.  Instead, “Credit accommodations allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.” 

This change has three benefits for the education establishment:

  • The modified standard diploma students who formerly would not count toward the federal graduation rate now count,
  • The divisions have a new avenue – “credit accommodations” – for boosting the rate, and
  • The process is hidden from the public.

This explains the low Modified Standard rates this year.  Last year, those rates were 1.4% for the state and 5.6% for Richmond; this year, 0.1% and 0.5%. 

Looks like this year they successfully concealed about a 5% boost in the Richmond rate.

(The Modified Standard rate should be zero this year except that they get to fudge the cohort for students with disabilities.)

The only question here is why they did not similarly transform the “special” diplomas into standard diplomas so they could conceal the whole, sordid process.

Added Note:  If we were to figure that Richmond’s federal graduation rate was boosted this year by about 5% (and the state by about 1.3%) because of the transformation of modified standard diploma graduates into standard diploma graduates, Richmond’s 69.9% rate this year

[Chart: 2017 federal graduation rates by division]

would look more like a 65.  If that were the case, Richmond’s recent swoon

[Chart: Richmond graduation rate trend]

would look more like a slump (and the state wouldn’t look so hot either).

[Chart: state graduation rate trend]

2017 Cohort Data for Richmond Schools

Here are the 2017 4-year cohort dropout rates for those Richmond schools with a graduating class (except for Ric. Career Ed., which had too few students to report).

[Table: 2017 four-year cohort dropout rates, Richmond schools]

“Mercy,” you say.  “Where did that 18% overall rate come from?”

Well, in addition to the 84 dropouts at Huguenot, there were 70 (in a cohort of 116) at Richmond Alternative.
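The overall rate is just total dropouts over the total cohort. Here is a sketch of that weighted average; Alternative’s numbers are from above, and the other counts are assumptions chosen so the totals match the 249 dropouts and roughly 1,382-student cohort reported below.

```python
# Per-school (dropouts, cohort size). Alternative's numbers are from the
# post; the other cohort sizes are assumed, for illustration only.
schools = {
    "Huguenot":    (84, 380),   # cohort size assumed
    "Alternative": (70, 116),   # 60% dropout rate
    "Other RPS":   (95, 886),   # everything else, assumed
}

dropouts = sum(d for d, _ in schools.values())   # 249
cohort   = sum(c for _, c in schools.values())   # 1,382
print(f"overall dropout rate: {dropouts / cohort:.1%}")
# overall dropout rate: 18.0%
```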

(As to Alternative, can you spell “dumping ground for troublesome kids”?  Do you remember when it was improving nicely until RPS took it over and embarked on the path to awfulness?)

Here also are the diploma rates.

[Table: 2017 diploma rates, Richmond schools]

We’re No. 1 (In Dropouts)

The 2017 4-Year Cohort graduation and dropout rates are up on the VDOE Web site.

We have the highest dropout rate in the State.

[Table: 2017 dropout rates by division]

In terms of a graph, here is the Richmond dropout rate, along with those of the peer cities Hampton, Newport News, and Norfolk, as well as Charles City and Lynchburg (where I sometimes have a reader) and the state average.

[Chart: dropout rate trends, Richmond, peer divisions, and the state average]

VDOE did a press release to brag about the “on-time” graduation rate.  As with the Accreditation process, they have jiggered the numbers.  The federales only count real (i.e., advanced and standard) diplomas (see the list here) and the data allow us to calculate that uninflated number.

You might think that our awful dropout rate would serve to improve the graduation rate. (Oops!  That was written early in the morning before my old brain turned on.  Dropouts cannot improve the graduation rate; they can – and probably do – improve the SOL scores insofar as the dropouts probably were not great scholars.)  Be that as it may, in terms of the federal rate Richmond was last among the Virginia divisions.

[Chart: 2017 federal graduation rates by division]

Here are the Richmond diploma rates along with those of the divisions in the previous graph.

[Chart: 2017 diploma rates, Richmond and the divisions above]

Not only is our total graduation rate in the cellar, we achieved that rate with a preponderance of standard diplomas, in contrast to the average.

The statewide graduation rate has been rising in recent years, driven by increases in the rate of advanced diplomas.  Until the last four years, Richmond had seen increases in both rates.

[Chart: graduation rates by diploma type, state and Richmond]

If we fit least squares lines to the totals, we see the state rate increasing at 1.1% per year while the Richmond increase has been 1.6%.

[Chart: least-squares fits to the total graduation rates]

If these rates were to continue, the state average would hit 100% in 2028; Richmond would reach the current state average in that same year.  On the other hand, if the Richmond rate were to continue to change at the same rate as the last four years, it would never again rise above 70%.
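For anyone who wants to replicate the extrapolation, here is a sketch of the least-squares fit. The rate series below are placeholders, not the actual data; substitute the numbers from the VDOE cohort reports.

```python
import numpy as np

# Least-squares trend and extrapolation for the total graduation rates.
# Both series are assumed placeholders for illustration.
years = np.arange(2008, 2018)
state = np.array([82.1, 83.2, 85.5, 86.6, 87.5, 88.2, 89.0, 89.8, 90.5, 91.1])
richmond = np.array([61.0, 63.5, 66.0, 69.0, 72.0, 75.5, 77.8, 76.0, 74.9, 76.3])

for name, rates in (("state", state), ("Richmond", richmond)):
    slope, intercept = np.polyfit(years, rates, 1)
    year_at_100 = (100 - intercept) / slope  # where the fitted line hits 100%
    print(f"{name}: {slope:+.2f} points/year; fitted line reaches 100% in {year_at_100:.0f}")
```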

The 416 Richmond students who did not receive standard or advanced diplomas were 30.1% of the Richmond cohort.  Of those 416, 249 (60%) were dropouts.

Middle School Miasma?

In Richmond, the SOL pass rates drop precipitously between the fifth and sixth grades.

[Chart: Richmond SOL pass rates by grade]

I was chatting with a Chesterfield middle school teacher the other day.  He told me that, in his experience, it is a matter of family environment: The kids who go home to a milieu run by adults manage the transition to middle school (and puberty and the social upheavals of that period) much better than those who go home to a situation dominated by a peer group.

To the extent that economic disadvantage (“ED”), or the lack of it, correlates with those family environments, we have some data on that. 

Here, to start, are the 2017 Richmond and Virginia SOL pass rate changes from fifth to sixth grades.

[Charts: 2017 grade 5-to-6 SOL pass rate changes, Richmond and Virginia, ED and non-ED]

Hmmm.  It looks like there may be something there in the statewide averages on the reading tests but not the math.  In contrast, both reading and math scores drop in the sixth grade in Richmond, more so for the ED group.

We know that increasing ED correlates with decreasing overall division pass rates.  Could it be that increasing ED populations (Richmond was 64% ED in the 2017 school year) also pull down the score changes? 

Here are the 2017 Division reading pass rate changes from fifth to sixth grade, for both the ED and non-ED populations, plotted vs. the division % ED students.

[Chart: grade 5-to-6 reading pass rate change vs. division % ED]

The yellow points are Richmond.

The R-squared values of the fitted lines tell us that the division pass rate changes for both the ED and non-ED groups are essentially uncorrelated with the %ED.  Indeed, if there were a correlation, it would seem to falsify the hypothesis: The slope is positive, which implies increasing sixth grade pass rates relative to the fifth grade scores.  And, in fact, many divisions enjoy nice pass rate increases from fifth to sixth grades.
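Here is a sketch of the R-squared calculation, with placeholder division data, for anyone who wants to check the (non-)correlation against the real numbers:

```python
import numpy as np

def r_squared(x, y):
    """R^2 of the least-squares line of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    ss_res = np.sum(residuals**2)
    ss_tot = np.sum((y - y.mean())**2)
    return 1 - ss_res / ss_tot

# Illustrative division data: % ED vs. grade 5-to-6 pass-rate change.
# Both arrays are assumed; use the VDOE division data in practice.
pct_ed = np.array([20.0, 35.0, 45.0, 55.0, 64.0, 72.0, 80.0])
change = np.array([3.0, -1.5, 2.0, 4.5, -14.0, 1.0, 2.5])

print(f"R^2 = {r_squared(pct_ed, change):.3f}")  # near zero => no correlation
```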

The math scores tell the same story, albeit with more increased pass rates than decreased.

[Chart: grade 5-to-6 math pass rate change vs. division % ED]

To the extent that ED correlates with home environment, these data falsify the hypothesis.

As well, notice that most of the divisions with higher ED populations than Richmond enjoy score increases from grade 5 to 6 (and none of those divisions suffers a decrease as large as Richmond’s).

[Charts: grade 5-to-6 pass rate changes in divisions with higher % ED than Richmond]

Which leaves us to wonder whether those unusually large drops in Richmond are the product of cheating in the elementary schools or awful teaching in the middle schools.  Or both.

Ambiguous News in Richmond

The Times-Dispatch this morning reports that Richmond has (at last) decided to do something about its ongoing educational disaster.

The RT-D quotes interim Chief Academic Officer Victoria Oakley: “The bulk of our work needs to be with these partially warned and the schools that are in denied status to move them to full accreditation.”

We might hope they would revise that statement to “The bulk of our work needs to be with the schools that are in denied status to move them to full accreditation.”

If Richmond tackles the hard problem – the schools whose repeated failures have landed them in the accreditation basement – they will necessarily solve the easier problems at the same time.  If, instead, they attack the entire spectrum, there will be a temptation to focus on the easy wins.  That could impair the effort where it is most needed – in the failed schools where nearly half of Richmond’s students languish.

[Chart: share of Richmond students in fully accredited vs. unaccredited schools]

(2017 data here and here)

Please recall that they are dealing, in the worst case, with this appalling situation:

http://calaf.org/wp-content/uploads/2017/08/image-142.png
http://calaf.org/wp-content/uploads/2017/08/image-143.png

The (perhaps) good news in the story:

[The] work involves meeting with school principals to make individual plans and see where intervention or professional development is needed. Schools are creating progress plans that will be tracked monthly.

There are at least two ways to translate that bureaucratese:

  • If it means “hold our principals accountable for retraining or firing the ineffective teachers,” it may be the harbinger of a genuine effort to fix even our worst schools.
  • If it means more bloated plans, it sentences thousands more of Richmond’s schoolchildren to ineffective teaching on into an indefinite future.

Johnny Can’t Read But Greene Is Accredited Anyhow

This year, VDOE “adjusted” the ESH Greene Elementary reading SOL from 51 to 81, the math from 58 to 84, and pronounced that school “Fully Accredited.”  Indeed, Greene has been fully accredited for at least the past eight years.

Let’s take a look back and see if these helpful pass rate “adjustments” have a history.

For a start, English.

NOTE: In the period of the data here, they administered a fifth grade writing test from 2011 to 2013.  The “English” SOL is some kind of average of that with the Grade 3-5 reading scores.  The numbers below are the average of the grade 3-5 reading and grade 5 writing for the 2011-13 period.  Given that the writing score covers only one grade, the actual averages probably are closer to the reading numbers.
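To see why the one-grade writing test pulls the average only modestly, here is the arithmetic with assumed test counts: a per-test average sits much closer to the reading rate than a simple average of the two subject rates.

```python
# Reading was tested in grades 3-5; writing only in grade 5, so a
# per-test average weights reading roughly 3-to-1. All counts and
# rates here are assumed, for illustration only.
reading_pass, reading_tests = 62.0, 300   # grades 3-5
writing_pass, writing_tests = 48.0, 100   # grade 5 only

simple = (reading_pass + writing_pass) / 2
weighted = (reading_pass * reading_tests + writing_pass * writing_tests) \
           / (reading_tests + writing_tests)
print(f"simple average: {simple:.1f}, test-weighted average: {weighted:.1f}")
# simple average: 55.0, test-weighted average: 58.5
```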

[Chart: Greene reading and English pass rates]

For the purposes of the present analysis, the 2013-14 numbers are so appallingly low that those differences are beside the point.

Here, then, are the English pass rates and the “adjusted” values.

[Chart: Greene English pass rates, actual and “adjusted”]

The orange line is the 75% accreditation benchmark.

For the past five years, the “adjustments” have boosted Greene into accreditation.  This year the boost was thirty(!) points.

The math situation is slightly less egregious: The boost this year was only twenty-six points.

[Chart: Greene math pass rates, actual and “adjusted”]

Courtesy of this falsification by our “education” establishment, Greene has remained fully accredited to the present.

HOWEVER

This official mendacity gives Greene bragging rights while failing to teach nearly half its students to read or reckon.  Unfortunately, that’s only the surface problem.

More fundamentally:  Accreditation — or lack of it — is meaningless.

In theory, a school that loses accreditation is in violation of the Standards of Quality and the Board of “Education” has the authority to compel the local school board to bring that school into compliance.  In practice, the Board does not know how (Sept. 21, 2016 video starting at 1:48) to do that.

The poster child in this respect is Petersburg, which has been operating under Memoranda of Understanding with the Board since April 2004 and still can’t teach 40% of its schoolchildren how to read.

[Chart: Petersburg reading pass rates]

In light of

  • the fictitious accreditation numbers,
  • the Board’s ineffective “help,”
  • the Board’s total failure to exercise its authority, and
  • the Board’s admission that it does not know how to fix broken schools

the accreditation process is a shameful sham.

Your tax dollars at “work.”

—————————-

Added Note:

The “adjustment” here appears to lie with the “English Learner” or “Limited English Proficiency” population.  (The VDOE Web site seems to use those terms interchangeably.)

To start out, the local school gets to decide which students are LEP (or EL).  For the reasons you’ll see below, I’ll bet you a #2 lead pencil that every student – except for the bright ones – who knows how to pronounce “señor” gets classified as LEP.

The LEP population affects the scoring in two ways.

First, “LEP students who have attended school in the United States for less than 12 months may receive a one-time exemption for the SOL Reading test in grades 3 through 8.”

So we get a one-time SOL score boost for any new, LEP student.

Then, to the point here, “The scores of LEP students enrolled in Virginia public schools fewer than 11 semesters may be excluded from the accreditation rating calculations.”

So LEP students who start here in kindergarten don’t count against accreditation until they’ve had six years to learn our Mother Tongue.  And those who start in the first grade or later need not be taught English; they can’t hurt the accreditation rating. 
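A sketch of the arithmetic shows how powerful that exclusion can be. The counts here are hypothetical; the point is the mechanism.

```python
# Effect of excluding LEP scores from the accreditation calculation,
# per the rule quoted above (LEP students enrolled fewer than 11
# semesters may be excluded). All numbers are hypothetical.
tested = 200          # students who took the reading SOL
passed = 120          # 60% raw pass rate
lep_excludable = 60   # LEP students under the 11-semester threshold
lep_passed = 10       # of those, how many passed

raw_rate = passed / tested
accred_rate = (passed - lep_passed) / (tested - lep_excludable)
print(f"raw: {raw_rate:.0%}, accreditation: {accred_rate:.0%}")
# raw: 60%, accreditation: 79% -- a failing school clears the 75% benchmark
```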

Can’t you hear the mantra: “Teach the bright ones; forget the rest.”

Greene has a large population of “EL” students.  The Fall 2017 report shows:

[Table: Greene Fall 2017 EL enrollment]

For sure, the Greene data tell us many of those kids are not being taught English.  Or math, which ought to be the same in Spanish as in English.

So the accreditation system accomplishes three things at Greene:

  • It produces accreditation numbers that are unrelated to how well the school is teaching,
  • It tells the school to classify all but the brightest immigrant students as LEP, whether those kids are fluent in English or not, and
  • It encourages the school to forget about the LEP kids; they can’t affect the accreditation even if they don’t learn a thing.

Your tax dollars at “work.”

Accreditation Inflation

The initial accreditation results tell us that 18 of 44 Richmond schools were fully accredited this year.  There remain 26 schools that, even by the highly flexible standards of the Board of “Education,” have a problem.

A closer look at the data shows a situation that is worse – in a couple of cases, much worse – than even those numbers suggest.

ESH Greene Elementary

Greene is fully accredited this year, despite a sad history of decline.

[Chart: Greene pass rates by subject and year]

Recall that the accreditation benchmark is 75 for English, 70 for the other subjects.  Here we see Greene underwater in all five subjects.  Indeed, by any honest measure, Greene would have run out its four years of failure last year.

But all that was before the “adjustments” provided some remarkable enhancements.  Here are this year’s results:

[Table: Greene 2017 pass rates and accreditation scores]

49.5% of the Greene students flunked the reading SOL but the “adjustments” produced a score six points above the 75% benchmark.  (They stopped reporting writing in elementary school three years ago so we are spared the ambiguity in averaging the reading and writing pass rates to get to the English datum.)

Greene enjoyed similar, if smaller, “adjustments” in the other subjects; the school then “met” the benchmarks in all subjects except Science, where a ten point boost still left it thirteen points short.  The school remains fully accredited.  Never mind that it plainly is failing to properly serve its students.

There are good, old English words to describe this situation.  Most of them can’t be used in polite company.

Seven other Richmond schools were boosted into accreditation in one or more subjects.  Fortunately, none as dramatically as Greene.

Franklin Military Academy

Franklin has done quite well recently, except for a problem with math.

[Chart: Franklin pass rates by subject and year]

The “adjustments” this year camouflaged that problem.

[Chart: Franklin 2017 pass rates and accreditation scores]


Linwood Holton Elementary

Holton’s picture is similar to Franklin’s.

[Charts: Holton pass rate history and 2017 accreditation scores]


JL Francis Elementary

Francis has a persistent problem with reading; this year its SOLs dipped in math and science. 

[Chart: Francis pass rates by subject and year]

The “adjustments” this year cured the latter two problems.

[Chart: Francis 2017 pass rates and accreditation scores]


Miles Jones Elementary

In recent times, Jones has done well in history but has been borderline or below in reading, math, and science.

[Chart: Jones pass rates by subject and year]

The “adjustments” this year cured all those little problems.

[Chart: Jones 2017 pass rates and accreditation scores]


Southampton Elementary

Southampton had been improving but slid this year below the English and math benchmarks.

[Chart: Southampton pass rates by subject and year]

The “adjustments” fixed that.

[Chart: Southampton 2017 pass rates and accreditation scores]


Bellevue Elementary

Bellevue enjoyed the “adjustment” in math but not in reading.

[Charts: Bellevue pass rate history and 2017 accreditation scores]


Richmond Community High

Community usually does quite well.  This year, however, its math performance slipped below the benchmark.

[Chart: Community pass rates by subject and year]

The “adjustment” took care of that.

[Chart: Community 2017 pass rates and accreditation scores]


Rant

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the pass rates: Economic Disadvantage.  Indeed, they have abandoned (pdf at p.102) their measure of academic progress, the SGP, which is independent of economic status.

(Their excuse for dropping the SGP is, they say (pdf at p.102), that they can’t calculate it until all the data are in at the end of the summer.  I don’t think they were too stupid to know that when they started with the SGP; I think they are lying: The SGP gave intolerable amounts of accurate, comparable information about teacher/school/division performance.)

And then we have Petersburg, which has managed to remain without accreditation for at least fourteen years, despite all the helpful “adjustments.”  The Board of “Education” has at last figured out what to do about that: They are going to change the accreditation system to make the process still more byzantine and to make it easier for a school to remain accredited.

Your tax dollars at “work.”

Accreditation Boosts in Richmond

Having looked at the accreditation score boosts by division, let’s turn to the Richmond schools.

As a reminder: The accreditation scores start with the pass rates.  VDOE then applies “adjustments” for “remediation,” limited English proficiency, and transfer students.  The result of the adjustments almost always is a boost in the score.  Indeed, one division enjoyed a 26 point boost in its math score.

Unfortunately, VDOE does not release the data that underlie these fudge factors so there is no way to audit the process.  We can, however, look at the accreditation scores and compare them to the underlying pass rates.
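That comparison is straightforward to automate. Here is a sketch, assuming the pass rates and accreditation scores have been downloaded from VDOE into CSV files; the file and column names here are mine, not VDOE’s.

```python
import pandas as pd

# Compare accreditation scores to raw pass rates to measure the
# "adjustment" boost. File and column names are assumptions; the
# underlying data come from VDOE's SOL and accreditation downloads.
pass_rates = pd.read_csv("sol_pass_rates.csv")       # school, subject, pass_rate
accred = pd.read_csv("accreditation_scores.csv")     # school, subject, accred_score

merged = pass_rates.merge(accred, on=["school", "subject"])
merged["boost"] = merged["accred_score"] - merged["pass_rate"]

math = merged[merged["subject"] == "math"]
print(math.sort_values("boost", ascending=False).head(10))
print(f"average math boost: {math['boost'].mean():.1f} points")
```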

Here, then, are the 2017 math data for Richmond:

[Chart: 2017 math pass rates and accreditation scores, Richmond schools]

The red line is the 70% accreditation benchmark.  This year Bellevue, Greene, Franklin, Holton, Francis, Jones, Community, and Southampton were boosted into math accreditation.

The average math boost was 4%; the largest was 26% (!) at Greene.

Imagine that!  Greene had a 58% pass rate in math but managed to be fully accredited.  With the Good Fairy handing out treats like these, it’s a wonder that anybody fails to be fully accredited.

Scott Adams points out that “wherever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . .  When humans CAN cheat, they do.” 

For sure, we have seen Atlanta, Buchanan County, Oak Grove, Petersburg, and VDOE’s own history in that respect.

Viewed in that light, against the demand for better pass rates, and in the context of the large boosts that some scores enjoy, VDOE’s secret data manipulation is ill-advised, even if it is not corrupt.

————————–

P.S.:  I’m holding off on the English data until I can learn how they average the reading and writing pass rates.  If it’s by averaging the school averages for each subject, we’ll see a 30 point boost at Greene and a 31% jump at Armstrong.  Stay tuned.

It’s Performance, not Poverty

The popular sport in the Richmond “education” establishment has been to blame the kids for the awful performance of our schools.  We particularly hear about our less affluent (the official euphemism is “economically disadvantaged”) students.

We have some data on that.  Again.

Here are the average reading pass rates by grade of the economically disadvantaged (“ED”) and non-ED students in Richmond and the state.  “EOC” indicates the End of Course tests that generally must be passed to receive “verified credits” that count toward a diploma.

[Chart: reading pass rates by grade, ED and non-ED, Richmond and state]

Both in Richmond and on average, the ED group underperforms the non-ED group.

To the point here, the Richmond ED students underperform their peers in the state averages, as do the non-ED Richmond students.

We can calculate the differences between the Richmond groups and the state averages to measure that underperformance.

[Chart: Richmond minus state reading pass rate differences by grade]

Here we see Richmond’s ED students underperforming their peers by about 7% in elementary school while our non-ED students average some 9% below that group statewide.  In middle school the difference increases to roughly 19% for the non-ED students and 25% for the ED group.
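The calculation is just Richmond minus state, by grade and group. A sketch with placeholder numbers standing in for the VDOE data:

```python
# Underperformance by grade: Richmond pass rate minus state average,
# computed separately for ED and non-ED students. All values below
# are assumed placeholders showing the structure of the calculation.
grades = ["3", "4", "5", "6", "7", "8", "EOC"]
richmond = {"ED": [55, 57, 56, 40, 38, 41, 60], "non-ED": [70, 72, 71, 52, 50, 53, 74]}
state    = {"ED": [62, 63, 63, 62, 61, 62, 72], "non-ED": [80, 81, 80, 79, 78, 79, 88]}

for group in ("ED", "non-ED"):
    gaps = [r - s for r, s in zip(richmond[group], state[group])]
    print(group, [f"{g:+d}" for g in gaps])
```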

The math test results show a similar pattern. 

[Charts: math pass rates and Richmond-minus-state differences by grade]

These data tell us two things:

  • Richmond students, both ED and not, underperform their statewide peer groups on average; and
  • The average SOL performance of Richmond students, ED and not, deteriorates dramatically in middle school.

As I have demonstrated elsewhere, the large percentage of ED students in Richmond (64% in 2017) does not explain our low pass rates.  So we are left with (at least) two possible explanations: Either Richmond students are less capable on average than students statewide or our schools are less effective than average.

If Richmond’s students were just less capable, it would explain the low elementary school scores but not the drop in pass rates after the fifth grade.

The plummeting performance of our students when they reach middle school tells us there’s a (big!) problem with our middle schools.  And there’s every reason to think that the school system that has terrible middle schools might also have problems with its elementary schools.

To the same end, notice how the ED performance by grade tracks the non-ED performance in Richmond.  We saw the same thing last year:

[Charts: last year’s pass rates by grade, ED and non-ED]

Those parallel curves are fully consistent with the notion that performance variations by grade are driven by the teaching, not by the capabilities of the students.

As well, Friar Occam would suggest the simple explanation: Substandard elementary schools and awful middle schools in Richmond, not thousands of dunderheads (both poor and more affluent) attending those schools.

There’s an alternative that we might consider: I keep hearing that some of our elementary schools cheat on the SOLs.  We know for sure that’s happened in the past in Richmond and this year in Petersburg.

If, in fact, the Richmond elementary pass rates are boosted by cheating, that absolves the middle schools of blame for some or all of the decline in scores between the fifth and sixth grades, but it leaves the conclusion here intact; it merely moves the elementary schools toward the “awful” category occupied by our middle schools.

“Adjusting” Accreditation Scores

The accreditation scores for 2017-2018, based on the 2017 testing, are up on the VDOE site.

In the distant past (2005), VDOE’s opaque accreditation process transformed 76.3 and 73.7 math scores at Jeter-Watson into “perfect scores” and embarrassed the Governor.

They now admit to manipulating the scores:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

(But don’t ask them for the remediation data.  They’ll tell you to go breathe ozone.)

They tell me the English rating is based on an average of the reading and writing pass rates.  I’m waiting for information on whether that is an average of the school averages or an average by students.  In the meantime, let’s look at the math data.

The accreditation “adjustments” to pass rates are not so dramatic these days, an average of 2.6 points on the math tests this year, but they still have significant effects.

To start, here is a plot of the math “adjusted” rates (i.e., the rates used for accreditation), by school, vs. the actual pass rates.

[Chart: “adjusted” accreditation scores vs. actual math pass rates, by school]

In a world populated with honest data, all those points would lie on the red line.  As you see, some do, and a few lie below (hard to know how the adjustments would produce that result), but most of the data show accreditation scores that are “adjusted” to larger values.

NOTE: It took several hours to groom the dataset.  There were ten schools for which the database reported SOL pass rates but the Accreditation list had no report.  As well, VDOE reported accreditation data for 41 schools not listed in the SOL database and listed another six in the accreditation data without any numbers.  Then there are another four schools that appear in both lists but are missing data in one or the other.  The data here are for the remaining 1,772 schools.

If we plot the distribution of differences (i.e., adjusted score minus actual pass rate), we see that most of the adjustments are five points or fewer.

[Chart: distribution of adjustments]

Rescaling the y-axis reveals that adjustments even in the 10-point range are not rare and that, in one case, the “adjustments” produced a 26 point gain.

[Chart: distribution of adjustments, rescaled y-axis]

The adjustments reduce the population of scores just below the 70% cutoff for accreditation and increase the population above that benchmark:

[Chart: distributions of actual and adjusted scores near the 70% cutoff]

The (least squares) fitted curves show the shift in the average score.

A plot of counts of adjusted minus counts of actual pass rates emphasizes how the adjustments deplete the population below the 70% cutoff and increase it above.

[Chart: counts of adjusted minus counts of actual pass rates]

The outstanding example:  There are 37 schools with 69% math pass rates but only five with that adjusted rate.  The average adjusted rate for those 37 schools is 74%.
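Here is a sketch of that tally, again assuming the rates and scores are sitting in a CSV with one row per school; the file and column names are mine.

```python
import pandas as pd

# Tally schools around the 70% cutoff before and after adjustment.
# Assumes a frame with the raw math pass rate and the "adjusted"
# accreditation score per school (file and column names assumed).
df = pd.read_csv("math_rates_and_accreditation.csv")  # pass_rate, accred_score

print("below 70%, raw:     ", (df["pass_rate"] < 70).sum())
print("below 70%, adjusted:", (df["accred_score"] < 70).sum())

# The 69% example from the text: who got moved, and to where?
at_69 = df[df["pass_rate"].round() == 69]
print(len(at_69), "schools at a 69% raw rate; average adjusted score:",
      round(at_69["accred_score"].mean()))
```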

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the rates: Economic Disadvantage.

And remember that the pass rates themselves have been fudged in some cases: See, e.g., this and this.

So “opaque” is insufficient to describe this process.  “Opaque and corrupt and unfair” comes closer.

Your tax dollars at “work.”