Accreditation: Progress by Fiat

How to Accredit a Failed (and Failing) School

This year, the SOL pass rates declined in the three subjects that underlie the accreditation process, both statewide and in Petersburg.

image

At the same time, the Board of Education proclaimed a statewide accreditation triumph.

image

Notes:

  • One school that was “Accredited Pending Review of Alternative Accreditation Plan” in ‘18 is omitted from the graph. 
  • The VDOE Web page says 92% Accredited.  The data from their spreadsheet show 92.77%.  Even without their curious stance on roundoff (see below), that rounds to 93%.
  • See below for the demise in 2018 of the several “Partially Accredited” categories and the genesis that year of the “Accredited with Conditions” status.

We can gain some insight into the reason for this boom in accreditations atop a slump in pass rates by examining the Petersburg data.

The Petersburg Schools have been operating under Memoranda of Understanding since at least 2004:

http://calaf.org/wp-content/uploads/2016/11/image.png

The system was denied accreditation continuously from 2006 to 2017.  On the 2017 data, Stuart Elementary and Johns Middle were denied accreditation; AP Hill Elementary’s accreditation was withheld because the staff there got caught cheating.

This year, notwithstanding all that “help” from the state, all the Petersburg reading pass rates, except perhaps for Peabody/Johns, decreased.

image

Notes:

  • Data here are from the SOL database for reading; they no longer test writing in the elementary grades.  Numbers in the School Quality Profiles can be different (see below).
  • Petersburg changed the names of three elementary schools this year:
    1. A.P. Hill became Cool Spring
    2. J.E.B. Stuart became Pleasants Lane
    3. R.E. Lee became Lakemont

Except at the high school, the math rates dropped as well.

image

The science rates rose at Walnut Hill and Pleasants Lane; they fell at the high school and tanked at Lakemont.

image

The Petersburg schools’ home page announced the resulting change in accreditation:

Accredited

All Petersburg schools are accredited in the 2018-19 school year.

(To their credit, they also said that, except for Cool Spring and Walnut Hill, those accreditations were “with conditions.”)

The accreditation changes were dramatic: 

image

How shall we explain this situation?

Well, if you are feeling masochistic, you can read the new accreditation regulation and dig into the numbers buried in the School Quality Profiles.   Or you can read on here.

This year, there are three performance levels:

For English:

  1. Level 1: A 75% current or three-year average pass rate (boosted for “recovery,” growth, and English learners), or Level 2 status the prior year plus a 10% cut in the failure rate.
  2. Level 2: A 66% current or three-year average pass rate (boosted for retakes, growth, and English learners), or a prior-year (boosted) pass rate of at least 50% improved by 10%.
  3. Level 3: Everybody else, plus anybody who has already been at Level 2 for four consecutive years.

For math, the scheme is the same except the Level 1 benchmark is 70% and there is no boost for English learners.  For science, the math scheme applies except there is no boost for growth or recovery.
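
For readers who prefer the scheme in code, here is a minimal sketch of that three-level logic.  It is my own construction, not VDOE’s: the function and argument names are mine, the “boosts” are assumed to be already folded into the adjusted rate, and the improvement options use the “decrease the failure rate by 10% or more” reading set out under “Quality Indicators” below.

```python
# A minimal sketch of the three performance levels; my construction, not VDOE's.
# "adjusted_rate" is assumed to already include the recovery/growth/English-learner boosts.

def performance_level(subject, adjusted_rate, three_year_avg,
                      prior_year_rate, prior_level, years_at_level_2):
    """Return 1, 2, or 3 for a single school quality indicator."""
    benchmark_1 = 75 if subject == "english" else 70   # math and science use 70
    benchmark_2 = 66
    best = max(adjusted_rate, three_year_avg)

    # Improvement option: cut the failure rate by 10% or more from the prior year.
    failure_cut = (100 - adjusted_rate) <= 0.9 * (100 - prior_year_rate)

    if best >= benchmark_1 or (prior_level == 2 and failure_cut):
        return 1
    if ((best >= benchmark_2 or (prior_year_rate >= 50 and failure_cut))
            and years_at_level_2 < 4):                 # four-year cap at Level 2
        return 2
    return 3

# Hypothetical example: a 61 boosted rate with a 73 three-year average clears
# the diluted 66 benchmark and lands at Level 2.
print(performance_level("english", 61, 73,
                        prior_year_rate=55, prior_level=3, years_at_level_2=0))   # 2
```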

Here, then, are the English accreditation scores for the Petersburg schools.

image

The colors tell where the numbers came from:

  • Light green: The pass rate as reported by VDOE.
  • Brown: The percentage of students who flunked reading last year, took the remediation program, and passed this year.  They count twice(!).
  • Dark Green: The students who failed but showed “growth.”
  • Blue: English learners (who are counted only if they pass, but then count twice).
  • Red: The Level 1 benchmark.
  • Gray: The Level 2 benchmark.

There are some anomalies here:

  • VDOE reported a 51.3% pass rate for Cool Spring but their accreditation page says 55% (they round everything off). 
  • Pleasants Lane numbers similarly improved from 60.38 to 61.  (Formerly VDOE rounded up at 0.45; that does not explain this change). 
  • As well, the 71.49 at Walnut Hill became a 72 (perhaps because of the enhanced rounding up; see the rounding sketch after this list). 
  • The reading pass rate/accreditation numbers probably are not comparable for Johns and the high school because the writing scores get averaged in.
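
A small sketch (my construction, not VDOE’s procedure) shows why the old “round up at 0.45” rule could explain Walnut Hill’s 72 but not Pleasants Lane’s 61:

```python
# Ordinary rounding vs. the old "round up at 0.45" rule; my construction,
# not VDOE's actual procedure.
def round_up_at_45(rate):
    return int(rate) + 1 if rate - int(rate) >= 0.45 else int(rate)

for school, rate in [("Pleasants Lane", 60.38), ("Walnut Hill", 71.49)]:
    print(school, round(rate), round_up_at_45(rate))
# Pleasants Lane 60 60   -> neither rule gets to the reported 61
# Walnut Hill    71 72   -> the 0.45 rule does get to the reported 72
```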

Even with all the Finagle factors, Vernon Johns did not come close even to the new, diluted benchmark. 

Presumably Cool Spring could not enjoy “Growth” or “Recovery” boosts because the cheating canceled the 2017 scores.

But these enhanced pass rates are not the end of the matter.  There also is a potential boost from the three-year average: 

image

Poor Vernon Johns still looks to be beyond help.  (BTW: VDOE does not explain how they calculate the numbers to accommodate missing Johns data for ‘17 and the Peabody scores prior to the merger).  We need not examine the 2015 rules or the 50%-with-10%-gain options here: VDOE tells us Johns is Level 3 (also in math and science).

Despite an appalling performance in 2018 and no data in 2017 (cheating), Hill/Cool Spring makes the diluted 66% level (and nearly makes the 75% benchmark). 

A little arithmetic (nearly) shows where the Cool Spring 73 came from:  Here are the accreditation scores for Hill/Cool Spring for the last three years:

image

The actual three-year average for Reading is 48.  BUT if we just average the 2018 value with the (obviously cheating-enhanced) 2016 value, we get 72.  Close enough to 73 for VDOE, it seems.
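
To see the arithmetic, treat the canceled 2017 score as a zero in the three-year average.  The 2016 and 2018 values below are hypothetical stand-ins (the real ones are in the graph above); the point is the effect of dropping 2017:

```python
# Hypothetical 2016/2018 reading scores; 2017 was canceled because of the cheating.
scores = {2016: 86, 2017: 0, 2018: 58}                    # illustrative numbers only

three_year_average = sum(scores.values()) / 3             # (86 + 0 + 58) / 3 = 48
skip_2017_average = (scores[2016] + scores[2018]) / 2     # (86 + 58) / 2     = 72
print(three_year_average, skip_2017_average)              # 48.0 72.0
```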

Math is even more dramatic: 

image

But the three-year bonus takes care of Pleasants Lane and elevates Cool Spring to Level 1.

image

Again, the only way VDOE can get the Hill/Cool Spring numbers is by ignoring 2017.  No telling why they report 70 when the actual average was 69.

By any measure, Johns and Lakemont strike out and, indeed, VDOE reports them as Level 3.

The science scoring does not include the recovery et al. adjustments.  On the pass rates, only Walnut Hill makes either benchmark. 

image

The 3-year average does not save anybody.

image

On these numbers, Cool Spring’s enhanced average is 65, one point short of the Level 2 benchmark.

Another way to make Level 2 is to take a score of 50 or better and raise it by ten points.  That plainly does not work for Cool Spring, which started in ‘17 at zero (or in ‘16 at 83 in science and dropped to just over half that in ‘18).

Nonetheless VDOE reports Cool Spring as Level 2 for science.

These “levels” feed into the accreditation ratings:  A school is “Accredited” if all its school quality indicators are Level 1 or 2.  Any school with an indicator at Level 3 is “Accredited with Conditions.”  A school accredited with conditions “may” be denied accreditation if it fails to adopt and implement relevant corrective action plans “with fidelity.”  The regulation does not tell us what level of “fidelity” is sufficient.

Aside:  The question whether a school has implemented “with fidelity” does not “rest[] entirely upon . . . [an] examination” or constitute “approval by the Board of Education of a school division corrective action plan,” so the due process requirements of the Administrative Process Act should apply.  If the Board of Education should ever try to deny accreditation somewhere, it will be interesting to see whether they comply with those requirements.

Another way to get accreditation this year is to make the grade under the previous regulation.  VDOE tells us no Petersburg school is in that category this year.

A last way to make full accreditation is to enjoy the running three-year accreditation exception from the statute.  On that subject, here are the Cool Spring data:

image

Cool Spring was accredited in 2015 and 2016 (certainly because they were cheating) but not in 2014 (and of course not in 2017, what with getting caught at the cheating).  So there’s no three-year run and the statute does not apply.

(But the data do make one wonder why VDOE did not bother to examine those obviously bogus 2015 and 2016 numbers and the resulting accreditation ratings.  Indeed, if not rewarded for that obvious cheating, Cool Spring would not enjoy being “accredited” this year.)

Thus, we have the curious situation where VDOE blesses the cheating at AP Hill/Cool Spring with bogus English and Math ratings and with a science rating that looks to be invented from whole cloth.

There are further “achievement gap” measures that provide useful information but let’s pass over those and look at the 2018 results for Petersburg.

image

As the Petersburg Web page said, everybody is accredited! 

None of the “conditions” schools, even Johns, which flunked on all the academic measures, can be denied accreditation unless it “fails to adopt and implement . . . corrective action plans with fidelity.”

So there you have it: The process rewards past cheating.  The old benchmarks at 75 and 70% are reduced to 66% (at least for four consecutive years) and apply to manipulated pass rates that can be nearly double the actual pass rates.  A school that misses those relaxed benchmarks, and does not benefit from one of the helpful exceptions, is accredited, albeit “with conditions.”   That school can be denied accreditation only if it, in effect, tells the Board of Education to go jump in the James.  Or, in the case of Petersburg, the Appomattox.

So, Petersburg gets to feel good about being “accredited.”  Never mind the increased numbers of kids who are not being educated.

98.2 million of your tax dollars at “work.”

Boosting the Graduation Rate

As VDOE bragged, their (bogus) “On-Time” graduation rate rose this year.  They didn’t brag about the Richmond rate; it dropped.

Turns out, the (somewhat less bogus) “federal” rates show the same pattern.

image

The increase in the state rate was driven by an increase in the standard diploma rate.  The drop in the Richmond rate came from a 2.7% decrease in the advanced studies rate.

image

But you, Astute Reader, are still looking at that first graph and asking: “John!  You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points.  Where are those increases?”

Ah, what a pleasure to have an attentive reader!  The 1.3- and 5-point boosts look to have been offset, at least in part, by decreases in the End of Course pass rates. 

Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.

image

Students must pass six EOC tests to graduate.  Thus, the decreases in the pass rates on those required tests must have lowered the graduation rates.  Then the VDOE data manipulation offset those graduation rate declines in some measure. 

That looks like a general explanation.  The specifics would require more detailed knowledge of which students passed or failed which courses, where in their four-year journey through high school, and whether they graduated.  For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.

Of course, correlation is not causation and doubtless there are other factors in the mix here.  The floor is open for any more convincing suggestion.

BTW: The big drops in the Richmond EOC pass rates, and the lesser drops in the state rates, mostly came in math and science.

image

image

Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests.  This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.

Graduation (and Not)

The 2018 four-year cohort graduation data are up on the VDOE Web site.

For a discussion of the ways VDOE manipulates the data, see this post.  To the point here, the state’s “on-time” rate boosts the numbers, especially for those divisions willing to misclassify students as handicapped.  Now we have an additional boost: “Credit Accommodations” allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.

The data below are the “federal” graduation rate, the sum of standard and advanced studies diplomas divided by the cohort size.  Because of the Credit Accommodations, the numbers do not compare directly to data from prior years.  (Calculations from earlier data suggest that this manipulation boosts Richmond’s federal rate by about 5% and the state rate by about 1.3%.)
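
For clarity, here is that computation in a nutshell; the counts below are invented for illustration, not Richmond’s actual numbers:

```python
# "Federal" graduation rate: (standard + advanced studies diplomas) / cohort.
# The counts below are invented for illustration.
def federal_rate(standard_diplomas, advanced_diplomas, cohort_size):
    return 100 * (standard_diplomas + advanced_diplomas) / cohort_size

print(round(federal_rate(standard_diplomas=950, advanced_diplomas=450,
                         cohort_size=1600), 1))            # 87.5
```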

The cohort data also include dropout numbers.

Let’s start with some division graduation and dropout rates.

image

By nearly a 2:1 ratio, those Richmond diplomas are standard, not advanced studies.

image

The Richmond numbers are driven by three of our general population high schools, with the other two tagging along.

image

We have the worst graduation rate in the state.

image

Also the worst dropout rate.

image

Finally, here are the lists of the worst schools in terms of graduation rate,

image

and dropout rate.  (The “#NA()” entries are cases where groups of <10 students invoked VDOE’s suppression rules.)

image

“Quality Indicators”

The new “school quality indicators” for Accreditation come in three levels (see Subsection F).  For the English tests, they are defined as:

Level One: Schools with a current year or three-year average rate of at least 75%, or schools that were at Level Two the prior year and decrease the failure rate by 10% or more from the prior year.

Level Two: Schools not meeting Level One performance with a current year or three-year average rate of at least 66%, or schools with a prior year rate of at least 50% that decrease the failure rate by 10% or more from the prior year. A school shall not receive a Level Two performance designation for more than four consecutive years.

Level Three: Schools not meeting Level One or Level Two performance.

Math and science are the same except the Level One benchmark is 70%.

The definitions for dropout rate, absenteeism, and graduation rate are similarly complicated.  The benchmarks, before the complications, are:

  • Dropouts: ≤6%;
  • Absenteeism: ≤15%; and
  • Graduation Index: ≥88%

See the regulation for the complications. 

Given that the entire process is built upon arbitrary criteria, let’s create an arbitrary grading scale for the three academic areas, along with dropouts, absenteeism, and graduation index:

  • L1: 100
  • L2: 50
  • L3: 0

With that we can average the results to calculate an overall Quality Indicator Score.
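
In code, the scoring is as simple as it sounds (my own sketch; the example levels are hypothetical):

```python
# My arbitrary scale: Level 1 = 100, Level 2 = 50, Level 3 = 0, averaged over
# the six indicators (three academic areas, dropouts, absenteeism, graduation index).
SCALE = {1: 100, 2: 50, 3: 0}

def quality_indicator_score(levels):
    """levels: the Level (1, 2, or 3) for each of the six indicators."""
    return sum(SCALE[level] for level in levels) / len(levels)

# Hypothetical school: Level 1 on everything except math (Level 2) and absenteeism (Level 3).
print(quality_indicator_score([1, 2, 1, 1, 3, 1]))          # 75.0
```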

On that basis, 1,464 of 1,813 schools (80.8%) score 100%.  There are seventeen Richmond schools in that group. 

image

That “TS” represents a group too small to evaluate, they say.

The other twenty-six Richmond schools did less well:

image

If we turn to the bottom of the state list, as measured by this score, we see MLK in a three-way tie for last place, with five other Richmond schools in a 15-way tie for fourth from last:

image

But, by golly, they’re all accredited.

Quote Without Comment, CGCS Version

The estimable Carol Wolf points me to a June 2018 Review of the Finance and Business Operations of the Richmond Public Schools prepared by the Council of the Great City Schools.

First the Good News: Our new Superintendent requested the study.

The rest is Bad News.  You really should read the report to understand what a huge hole Mr. Kamras is trying to climb out of. 

Here, as a teaser, are some tidbits:

  • The team noted an absence of focus on successful student outcomes.
  • The team did not see any evidence that the district has developed an action plan to address the issues identified in the Memorandum of Understanding or has used the MOU as an opportunity to accelerate change. When the team asked for a copy of the district’s Corrective Action Plan, the team was told, “We are working on our CAP with the state now.” Additionally, a request for samples of the last six “required monthly updates” on steps taken to implement corrective action in the areas of operations and support services went unanswered.
  • There is a lack of communication channels up-and-down and side-to-side within and between departments. The team was told that . . . Departments work in silos with little communications between and among staff teams . . .
  • The team found few analytical tools and techniques, such as key performance indicators (KPIs), are used to measure and compare performance to increase effectiveness, achieve greater efficiencies, and set goals.
  • None of the interviewees could articulate a vision, mission, goals, objectives, or priorities of the administration.
  • Although employee performance evaluations are generally issued annually, assessments are not tied to goals or accountabilities.
  • Business plans with goals and objectives, benchmarks, accountabilities, timelines, deliverables, cost estimates, cost-benefit analysis, return on investment, and other analytics are generally not used or required. Performance metrics to drive programs and support projects and initiatives have not been developed.
  • The lack of a robust position control and management system has created frustration and finger-pointing between budget and human resources departments.
  • Purchase orders are required for all purchases, regardless of value.
  • Audit responses:

  • image

  • The team was unable to determine if any of the 27 recommendations from the August 2004 School Efficiency Review: City of Richmond Public Schools Division, conducted by the Commonwealth of Virginia – Office of the Secretary of Finance, were acted upon.
  • The team noted . . . [l]ittle recognition by most interviewed of how their specific role and function supported the classroom, students, or student achievement.  
  • The internal audit function is misaligned in that the current reporting relationship represents an internal control issue as the independence of the function has the potential to be compromised. . .  Further, the Internal Auditor is not included in the distribution of the external audit reports.
  • The district lacks a designated cybersecurity position to help prevent information breaches, equipment damage, overall network failures, and the potential for “hacking.”
  • The district’s Enterprise Resource Planning (ERP) legacy software system is antiquated (25+ years old), highly customized, and highly inefficient.
  • Annual building safety inspections are not taking place.

ALL Our Schools Are Above Average!

Note added Sept. 28.  Upon rereading the reg., I see I overstated the requirement for denial.  There is a four-year window built in at Level Two, but once a school reaches Level Three, it is Accredited with Conditions.  There follows a regulatory minuet, but the bottom line is that the school has to tell the Board to go jump in the James to get its accreditation denied.

The 2019 Accreditation Results (based on 2018 testing) are up.

Last year, nineteen (of forty-four) Richmond schools were denied accreditation.  This year, all Richmond – and, indeed, all Virginia – schools are accredited!

(Under the new, byzantine system – see below – nineteen Richmond schools are “Accredited” this year and twenty-four are “Accredited With Conditions.”  The “condition” is that each of the twenty-four must “develop and implement a corrective action plan.”  It will be at least four years before any Virginia school can be denied accreditation.  Only if it fails to negotiate and implement that plan “with fidelity” can the school be denied accreditation.)

If you thought that the elimination of all those denials of accreditation reflected performance improvements, you’d be wrong. 

In Richmond, the reading pass rate rose by 0.73 points this year and the science rate by 0.53; the others fell enough that the five-subject average dropped by 1.73 points.

image

For reference:  The nominal benchmark for English is 75 (the blue line on the graph); for the other subjects, 70 (the orange line).

Statewide, all the averages fell.

image

The picture is even more interesting at some of the individual Richmond schools.  For example, we have Armstrong, the 12th worst school in Virginia in terms of the all-subjects pass rate:

image

In this mixed bag of gains and losses, only one pass rate broke 50%.

But Armstrong now is accredited, albeit “with conditions.”

MLK, the second worst school in the state this year on the all-subject average, paints an even more remarkable picture:  Its highest 2018 pass rate was 43.18 in History and Social Science.  Its lowest, a plunge to 16.67 in writing.

image

But, by golly, MLK now is accredited, also “with conditions.”

Then we have Woodville, the fourth worst school in Virginia, as measured by the all-subjects average:

image

No pass rate above 40% this year, but accredited, again “with conditions.”

Note: The “#N/A” is Excel’s way of saying that the State does not test writing in elementary school and it can’t graph the nothingness.

How did the Board of “Education” produce these bizarre results? 

It all goes back to Petersburg, which has been operating under Memoranda of Understanding since 2004 and which was denied accreditation for 2006.  And has been denied accreditation ever since.

The Board has demonstrated beyond doubt that it does not know how to fix Petersburg (and admitted as much, see the Sept. 21, 2016 video starting at 1:48).  Faced now with the ongoing Petersburg debacle and with accreditation disasters in Richmond and elsewhere, the Board punted: They adopted a new, emasculated accreditation regulation.

I commented on that regulation at the proposed stage, pointing out, inter alia, that the changes “make it almost impossible for a school to be denied accreditation.”

To read the entire regulation is to earn a PhD in masochism.  (A reporter for the Daily Press did read it; she survived, and wrote a nice summary.) 

Blessedly, the important parts are short:

  • If a school does not meet a standard and does not come close to meeting it (the regulation dilutes the 75%/70% benchmarks to 66%) for four consecutive years, it falls to performance Level Three.  8VAC20-131-380.E.2.
  • A school at Level Three must develop a “corrective action plan.”  8VAC20-131-400.D. 
  • If a school (or division) fails to adopt and implement a corrective action plan “with fidelity,” it can be denied accreditation.  Id.

In short, in order to lose accreditation, a school must foul up badly for four consecutive years and then tell the Board to go to hell.
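
Reduced to a sketch (my caricature of the scheme, as tempered by the Sept. 28 note above; not the regulation’s text):

```python
# My caricature of the accreditation outcome, not the regulation's text.
def accreditation(indicator_levels, implemented_plan_with_fidelity=True):
    """indicator_levels: the Level (1, 2, or 3) for each school quality indicator."""
    if all(level in (1, 2) for level in indicator_levels):
        return "Accredited"
    # Any indicator at Level Three: conditions, plus a corrective action plan.
    if implemented_plan_with_fidelity:
        return "Accredited with Conditions"
    return "May be denied accreditation"    # only "may" -- the Board decides

# Hypothetical school: three Level 3 indicators, but a plan on file.
print(accreditation([3, 3, 3, 1, 2, 1]))    # Accredited with Conditions
```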

That is not a problem, however; that is a feature.  The regulation imposes a sterile paperwork smokescreen to hide the Board’s incompetence as to the inadequate schools in Petersburg and Richmond (and elsewhere).  And, not at all beside the point, to make the Board and all those awful schools look better than they are in fact. 

Never mind the kids who are the victims of this perverse charade.

98.2 million dollars of our tax money at “work.”

—————————-

Postscript:  Here is the Richmond list.

image

State Malfeasance at Carver (and Everywhere Else)

The Virginia Board of Education’s concern for the effect of institutional cheating on the Carver students only applies to a fraction of those students.  And it does not apply to students affected by cheating elsewhere.

Following its investigation of the institutional cheating at Richmond’s George Washington Carver Elementary School this spring, the Department of “Education” wrote a thirty-three-page report.

Perhaps the most devastating feature of the report was the analysis of the effect of the past cheating upon the Carver students who went on to middle school.  Two graphs show the impact reaching back to the cohort of students who were in the fifth grade at Carver in 2016:

image

image

As we might expect, kids who were not doing well at Carver were told they were wonderful; then they bombed out in middle school.  This has been going on since at least 2016, and probably since 2014.

Section V. of the report says that RPS “must” implement a set of seven “actions.”  The two actions relating to help for the affected students are:

1. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for any former GWC ES student entering middle school for the first time in 2018-2019 and 2) a plan to implement any additional instructional supports needed for these students.

2. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for GWC ES students entering the fourth and fifth grade in the 2018-2019 school year and 2) a plan to implement any additional instructional supports needed for these students.

So, Richmond must act to ameliorate the effect of the cheating upon students who were at Carver this year. 

But the Carver students in grade 5 last year and the students of the graphs, who were in grade 5 in 2016, must fend for themselves.

And, looking at the graphs, the cheating was rampant in 2015 and ramping up in 2014.  But the Board of “Education” is indifferent to the effect on the fifth graders of those years.

As well, the report shows an analogous impact on Carver students who transferred to other elementary schools.  But those students, also, must fend for themselves.

The Carver outrage was not the first such incident.  There was an institutional cheating scandal at Oak Grove in 2005.  There was another at Petersburg’s A.P. Hill in 2017. 

We now see that the Board of “Education” has had a simple tool for spotting and – if the analysis were known to be in use – for preventing such cheating.

But the Board has not used this simple analysis in the past and it shows no inclination toward a general deployment now.

It is hard to know what is most outrageous:

  • This official decision to ignore the effect of the cheating on so many students;
  • The failure of the Board of “Education” to have been conducting that cohort study every year to catch – and, even more to the point, to prevent – this kind of cheating; or
  • The failure of the Board of “Education” to institute an annual, general cohort study going forward, primarily to prevent another such abuse of Virginia schoolchildren.

Ah, well: $98.2 million of your tax money at “work.”

Muzik Non Rilassante

Note added 9/20: Mr. Muzik emailed a response to this post.  I’ve appended it here.
Still later: Plus a note from Mr. Pyle of VDOE.

The Principal at Munford, Greg Muzik, posted a couple of comments to Carol Wolf’s repost of my Munford post.  Yesterday, I copied him on a note to Carol and Chuck Pyle of VDOE that included a spreadsheet analyzing his data and reexamining my own.  Mr. Muzik replied to the email.  I now respond:

Mr. Muzik,

I write in response to your email of yesterday to me with copies to Mrs. Wolf and Mr. Pyle (copy attached below).

Your email begins:

I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.

Notes: The parenthesis is not closed in the original. The “what you had put on our blog” is nonsense: I posted originally to my own blog; Mrs. Wolf reposted to hers; I have not “put” anything on any blog of yours. As set out below, my analysis manifestly was not restricted to “just grade 3 and 5.”

By “bowdlerized,” I mean the adjective form of the verb, bowdlerize:

Bowdlerize  verb . . .
2 : to modify by abridging, simplifying, or distorting in style or content . . .
Synonyms
censor, clean (up), expurgate, launder, red-pencil

The pdf you sent Mrs. Wolf contains eight pages. These report “CAT” data for Munford: For 2018, one page each for 3rd grade reading and math and one each for 5th grade reading and math; and the same four datasets for 2017.

These data appear to be the basis for your objection to my remarks. Those remarks were, in relevant part:

image

It would be unusual to see scores in the 86’s for both lower grades and some 8 points higher in grade 5 in either subject, much less in both.  It must be beyond coincidence that, as well, the reading and math scores are the same at each grade level and when averaged, either by grade or by student.

Your response, posted in comments on Mrs. Wolf’s site, discussed at length the CAT pass rate differences at Munford between the third and fifth grades, with no mention of the fourth grade and without any principled explanation of the pattern of reading/math pass rates at all three grade levels in the official VDOE data containing the results of all the tests.

You thrashed that straw man after admitting “I am not sure where Butcher is pulling his data.” If you had attended either my post or Mrs. Wolf’s repost, you would have seen the link to the VDOE database (containing the scores I list above) in the second line of each post.

(On the subject of attention or the lack of it, I would point out that my name is “Butcher,” not “Bollard” as you call me at one point in your comment on the Wolf site.)

If you had wished to dispute my data, the principled approach would have been to download the VDOE data, check my analysis, point out any discrepancy, and suggest a data exchange to resolve any dispute. Instead, you used a partial dataset that is not available to the public to attack my use of the official data that I had sourced and that you admitted were foreign to you.

More to the point, your analysis, such as it was, focused on the aspect of the data that I called “unusual” and, aside from your analysis of the wrong, partial database, failed to respond to the more serious issue, the identical pass rates.

I found ten instances in the six-year period where a Richmond elementary school reported identical reading and math pass rates at grade 3, 4, or 5.

image

Exactly half of those cases were at Munford.

The presence of Carver in this list carries a particularly unfortunate implication. The presence of Oak Grove, where there was a 2005 cheating scandal, casts a weaker aura.

Friar Occam would counsel the same conclusion that I suggested:

Cheating (by the school, not the students), done systematically, could easily produce equal pass rates in the two subjects.  Coincidence is most unlikely here.  No other explanation presents itself.

I now would amend that to include another unlikely possibility, error by VDOE in processing the data. While possible, that also is not a likely explanation. I have been looking at their data for more than a decade. I recall only one instance where they made a mistake (it was obvious on the face of the data) and one other where they failed to demand that Richmond complete a faulty submission.

As you saw (or should have seen) from my earlier email, I have asked Mr. Pyle whether VDOE has an explanation for the discrepancy between your and VDOE’s pass rate data. The copies you provided, along with the discussion in your last email, give us that explanation: You chose to rely upon a limited set of data, not available to the public, that reports the results of CAT tests, not the full range of tests included in the VDOE pass rate data that I cited and used in my analysis.

To the more fundamental point, VDOE has already taken a first look at the Munford data: Mr. Pyle of VDOE wrote me on the 17th:

Not the final word, but I wanted you to know that we looked at the Mary Munford Elementary data. When you breakdown the grades 3-5 reading and math pass rates, the advanced/pass-proficient percentages are not identical. Given the number of students tested in each grade level, coincidence is not out of the realm of possibility.

Of course, coincidence is not out of the range of possibility; neither is VDOE error. Neither is probable, however, and coincidence is especially unlikely in light of the data above.

BTW: You have overlooked a feature of the State data that supports the notion that the Munford scores were not rigged. One giveaway to the Carver data was the sudden rise to best in the City. For (a startling) example:

image

In contrast, the Munford data, particularly in the third grade, show year-to-year variations, with an improving trend, all in a good but not spectacular range. For example:

image

If your people were cooking the data, these numbers could be expected to be more uniform and, especially, higher, contributing to a statewide all-subject average rank much better than Munford’s current #203 (of 1,714).

On another subject, you said, “Butcher seems to be complaining about our tax dollars used to support education.” Again, you have revealed your lack of attention to the material you are criticizing. My comment, “98.2 million dollars of our tax money at ‘work,’” was the sarcastic conclusion to a jeremiad about VDOE’s ability to easily spot cheating such as persisted at Carver for at least four years, and its abiding failure to do so.

Indeed, in the Carver case, VDOE demonstrated that it has a key to falsification of the cheating hypothesis as to Munford. Their analysis of the Carver situation comported with the obvious notion that students whose scores are artificially raised in elementary school will generally score much lower once they enter middle school. I already have asked Mr. Pyle whether VDOE intends to perform a similar cohort analysis of the Munford students who have recently entered middle school. I hope you will join me in asking that they conduct that study for all of our elementary schools.  Even though it cannot identify problems with this year’s grade 5 pass rates, that study can root out past problems and, more importantly, should serve as prophylaxis for next year’s testing.

————————————–

The email:

I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.   The data sent is “raw” data that shows the actual number of students who took the test and who passed or failed.    The data from the state you get may be the adjusted pass rates.  This takes into account students who were “recovery” that can add to your pass rate, those who transferred into the school form another division after the first 30 days of school (Failing scores don’t count) and English Language learners whose scores are not counted when DOE adjusts the pass rates. It also does not include special Education students who take the Va. Alternative Assessment (VAAP).   That data is reported separately, but included in the state published pass rates.    As I told Ms. Wolf, the scores at Munford are at or just a little higher than what would be expected based on the population we serve.  While I would love to say the reason our students perform so well is they great teachers and principal, the results reflect what we see all over Virginia and the nation.

I did neglect to include a sheet for the Plain English Math Assessment that one student took that so the math and reading pass rates in 2018 were the same (4 students failed both math and reading our of the 71 who took the tests). but having the same number of kids passing or failing a test is not unusual  and does not mean much. In our case, 3 of the 4 students were special education students and this is a challenge for all schools,  and one of the gap groups.  In grade 5 reading and math go hand in hand as there is a great deal of reading students must do in the math assessment.  The pass rate has nothing to do with how student performed as passing scores range from 400 – 600.     A better way to look at grade level performance is the mean scaled score.  The mean scaled score for math was 510.5 and the mean scaled score for reading was 497.9.   So even students who passed had great variation in their scores (as expected).  

For schools, the raw data is what is used to look at instruction, addtional support services and programs to support students.  The adjusted pass rates are really just a school accountability process for accreditation.  For instruction use, we use the raw data.   

On Tue, Sep 18, 2018 at 9:54 AM, John Butcher <[deleted]> wrote:

Carol and Chuck,

Here . . . are the (obviously bowderlized) Pearson pdf that Carol received from Munford along with my spreadsheet that compares the Pearson data for Munford (4th grade numbers were absent from the Pearson data I received), my original Munford numbers extracted from a large dl from the VDOE site, and a fresh Munford dl from the VDOE site. Yellow headings denote my calculations.

The VDOE pass rates are identical in both downloads.  There is a clear disconnect between the Pearson and VDOE data. 

[Chuck]:

  • Do your data folks have an explanation?
  • Carol tells me that the Principal told her that the Pearson data include SGPs.  Does the FOIA still define “public record” to include data “prepared or owned by, or in the possession of a public body or its officers, employees or agents in the transaction of public business”?

———————————-

Muzik email of 9/20:

1.  The data I provided  is the raw data from Pearson and was not altered data in any way.  This does not account for SOA adjustments that relate more to accreditation that student performance.  The Raw data is what drives instructional priories and programs to address remediation and student support.  

2.  I only addressed grade 3 and 5 because there seemed to be a concern about the SOL pass rate growth of 8 points from grade 3 – 5 which matches  the growth state wide from grade 3 – 5.  I provided several reasons why we see growth from grade 3 – 5

3.  It it not unusual   to have the same pass rates in reading and math in a specific grade level. In grade 5 last year, it just happened that there were 4 students who failed the SOL math and 4 who failed the reading.  This does not mean the results are the same as the range of failing is 0 – 400.   It is also not unusual  for students who fail the reading test to also fail the math especially if they are students with disabilities.  Many students who struggle with reading comprehension, vocabulary and fluency may also have difficulty with math fluency.   However, I reviewed the specific students scores and in grade 5.  Only one student failed both the reading and math SOL tests.  The rest of the students failed one and passed the other with scores ranging from a low of 301 to a high of 397.  A student with a score of 397 was one question away from passing!  

As already indicated, there is nothing unusual  about the SOL results at Munford. We are one of very few schools in the city that has very few children living in poverty.     Our student population reflects the neighborhood we serve.  Our scores reflect what is seen across the state based on the population of students.      All other outcome indicators at Munford have similar results such as the PALS, NWEA MAP and ARDT when we used it.   Mary Munford has outstanding teachers who provide high quality instruction to students and this does impact SOL results,  but may not be seen if simply looking at pass rates.  High quality instruction helps all students, but pass/fail  end of course testing does not  show student growth during the year.

——————

Pyle email:

I’ll just add that I am not familiar with the form of the data schools receive directly from Pearson. The 2017-2018 pass rates reported by VDOE in August were calculated according to business rules under ESSA, just as pass rates were previously calculated according to business rules under NCLB. The adjusted accreditation pass rates we’ve reported for SOA purposes reflect the state Board of Education’s regulations and another set of business rules. Next week, we will report accreditation data aligned with the revised SOA.

Middle School Muddle

We have seen that Richmond’s middle schools have underperformed, particularly since the new math tests in 2012 and the new English tests in 2013.  For instance:

image

image

Or, in terms of the individual middle schools (here, the 6th grade pass rates):

image

image

Note: To achieve some level of clutter control, I’ve left out Chandler, which closed in 2010.  I’ve also omitted Elkhardt and Thompson, which merged to become Elkhardt-Thompson in 2016.

Looking just at the average pass rates for transitional grades 5 and 6, we see:

image

We know that at least some of the Richmond difference between the elementary and middle school grades can be explained by cheating in the elementary schools (by the schools, not by the students).  Doubtless the absence of the bogus Carver scores from the 2018 data helps explain Richmond’s Grade 5 drop that year.

The math data show much the same picture.

image

We can get a more nuanced view by comparing the 6th grade scores with the rates from the fifth grade a year earlier.  This gives a measure (not a cohort analysis, but the best we can do with the public data) of the progress of the class as it goes from elementary to middle school.

For example, Richmond’s 56.25 reading pass rate in 2018 is 13.36 points below the fifth grade’s 69.61 in 2017.  A graph gives us the history.

image
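
The arithmetic behind each point on that graph is just this year’s grade 6 rate minus last year’s grade 5 rate; a sketch (mine) using the 2018 Richmond reading numbers quoted above:

```python
# Each point: this year's grade-6 pass rate minus last year's grade-5 rate
# for the (nominally) same class of students.
def cohort_change(grade6_this_year, grade5_last_year):
    return grade6_this_year - grade5_last_year

print(round(cohort_change(56.25, 69.61), 2))    # -13.36 for Richmond reading, 2018
```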

The dips in 2013 reflect the score drops with the new tests.  Then both Richmond and the state averages recovered, albeit Richmond to a much larger gap.  In an ideal world, both graphs would level off at zero.

To the point here, the state average these days shows a small drop in pass rate going from elementary to middle school while the Richmond decrease ranges from 9.6 to 16.7 points larger.

On the math tests, the state average 6th Grade rates are higher than the 5th, probably indicating that the new 6th grade test is the easier.  As well, the state recovery from the new tests was much more robust.

image

But Richmond remains 17.9 points down this year, even after the Carver cheating bonus has been removed.

There is little hope that the Board of “Education” will be looking into this.  That Board did not bother to look at the data that nailed the Carver cheating until after our Superintendent asked for an investigation.  That dereliction of their duty as to Carver was part of a general failure, probably ongoing, to even glance at the data, much less to investigate the discrepancy.

Thus, we’ll have to depend on how vigorously our new Superintendent roots out the other (alleged) elementary school cheating.  If he does his job, the 2019 data may give a clearer picture of whether the current underperformance of Richmond’s middle schools reflects middle school incompetence or widespread elementary school cheating (or, Heaven forefend it, both).