Let’s look at the distributions of 2018 SOL pass rates.
First, the state and then Richmond on the reading tests:
How to Accredit a Failed (and Failing) School
At the same time, the Board of Education proclaimed a statewide accreditation triumph.
We can gain some insight into the reason for this boom in accreditations atop a slump in pass rates by examining the Petersburg data.
The system was denied accreditation continuously from 2006 to 2017. On the 2017 data, Stuart Elementary and Johns Middle were denied accreditation; AP Hill Elementary's accreditation was withheld because the staff there got caught cheating.
This year, notwithstanding all that “help” from the state, all the Petersburg reading pass rates, except perhaps for Peabody/Johns, decreased.
Except at the high school, the math rates dropped as well.
The science rates rose at Walnut Hill and Pleasants Lane; they fell at the high school and tanked at Lakemont.
The Petersburg schools’ home page announced the resulting change in accreditation:
All Petersburg schools are accredited in the 2018-19 school year.
(To their credit, they also said that, except for Cool Spring and Walnut Hill, those accreditations were “with conditions.”)
The accreditation changes were dramatic:
How shall we explain this situation?
This year, there are three performance levels.
For math, the scheme is the same except the Level 1 benchmark is 70% and there is no boost for English learners. For science, the math scheme applies except there is no boost for growth or recovery.
Here, then, are the English accreditation scores for the Petersburg schools.
The colors tell where the numbers came from:
There are some anomalies here:
Even with all the Finagle factors, Vernon Johns did not come close even to the new, diluted benchmark.
Presumably Cool Spring could not enjoy “Growth” or “Recovery” boosts because the cheating canceled the 2017 scores.
But these enhanced pass rates are not the end of the matter. There also is a potential boost from the three-year average:
Poor Vernon Johns still looks to be beyond help. (BTW: VDOE does not explain how they calculate the numbers to accommodate missing Johns data for ‘17 and the Peabody scores prior to the merger). We need not examine the 2015 rules or the 50%-with-10%-gain options here: VDOE tells us Johns is Level 3 (also in math and science).
Despite an appalling performance in 2018 and no data in 2017 (cheating), Hill/Cool Spring makes the diluted 66% level (and nearly makes the 75% benchmark).
A little arithmetic (nearly) shows where the Cool Spring 73 came from. Here are the accreditation scores for Hill/Cool Spring for the last three years:
The actual three-year average for Reading is 48. BUT if we just average the 2018 value with the (obviously cheating-enhanced) 2016 value, we get 72. Close enough to 73 for VDOE, it seems.
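For the arithmetic-minded, here is a minimal sketch of what that averaging appears to do. The function names and the individual yearly values are mine; the only facts from the post are the two averages, 48 and 72, which force the 2016 and 2018 reading scores to sum to 144 once the canceled 2017 year is counted as zero (actual) or dropped entirely (VDOE, it seems).

```python
def reported_average(scores):
    """Average only the years that appear to be counted, skipping any
    year marked None (here 2017, when the cheating canceled the scores)."""
    kept = [s for s in scores.values() if s is not None]
    return sum(kept) / len(kept)

def actual_average(scores):
    """Average all three years, counting the canceled year as zero."""
    return sum(s or 0 for s in scores.values()) / len(scores)

# Hypothetical yearly values: the post gives only the two averages,
# which force the 2016 and 2018 scores to sum to 144.
reading = {2016: 96, 2017: None, 2018: 48}
print(actual_average(reading))    # 48.0 -- the actual three-year average
print(reported_average(reading))  # 72.0 -- "close enough to 73 for VDOE"
```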
Math is even more dramatic:
But the three-year bonus takes care of Pleasants Lane and elevates Cool Spring to Level 1.
Again, the only way VDOE can get the Hill/Cool Spring numbers is by ignoring 2017. No telling why they report 70 when the actual average was 69.
By any measure, Johns and Lakemont strike out and, indeed, VDOE reports them as Level 3.
The science scoring does not include the recovery et al. adjustments. On the pass rates, only Walnut Hill makes either benchmark.
The 3-year average does not save anybody.
On these numbers, Cool Spring’s enhanced average is 65, one point short of the Level 2 benchmark.
Another way to make Level 2 is to take a score of 50 or better and raise it by ten points. That plainly does not work for Cool Spring, which started in ‘17 at zero (or in ‘16 at 83 in science and dropped to just over half that in ‘18).
Nonetheless VDOE reports Cool Spring as Level 2 for science.
These “levels” feed into the accreditation ratings: A school is “Accredited” if all its school quality indicators are Level 1 or 2. Any school with an indicator at Level 3 is “Accredited with Conditions.” A school accredited with conditions “may” be denied accreditation if it fails to adopt and implement relevant corrective action plans “with fidelity.” The regulation does not tell us what level of “fidelity” is sufficient.
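The rating rule itself is simple enough to sketch in a few lines of illustrative Python. The function name is mine; the logic is straight from the regulation as summarized above (the eventual denial path, for a school that fails to implement its plan "with fidelity," is omitted):

```python
def accreditation_rating(indicator_levels):
    """'Accredited' if every school quality indicator is at Level 1 or 2;
    any indicator at Level 3 yields 'Accredited with Conditions'."""
    if all(level <= 2 for level in indicator_levels):
        return "Accredited"
    return "Accredited with Conditions"

accreditation_rating([1, 2, 2])  # 'Accredited'
accreditation_rating([1, 2, 3])  # 'Accredited with Conditions'
```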
Aside: The question whether a school has implemented “with fidelity” does not “rest entirely upon . . . [an] examination” or constitute “approval by the Board of Education of a school division corrective action plan” so the due process requirements of the Administrative Process Act should apply. If the Board of Education should ever try to deny accreditation somewhere, it will be interesting to see whether they comply with those requirements.
Another way to get accreditation this year is to make the grade under the previous regulation. VDOE tells us no Petersburg school is in that category this year.
A last way to make full accreditation is to enjoy the running three-year accreditation exception from the statute. On that subject, here are the Cool Spring data:
Cool Spring was accredited in 2015 and 2016 (certainly because they were cheating) but not in 2014 (and of course not in 2017, what with getting caught at the cheating). So there’s no three-year run and the statute does not apply.
(But the data do make one wonder why VDOE did not bother to examine those obviously bogus 2015 and 2016 numbers and the resulting accreditation ratings. Indeed, if not rewarded for that obvious cheating, Cool Spring would not enjoy being “accredited” this year.)
Thus, we have the curious situation where VDOE blesses the cheating at AP Hill/Cool Spring with bogus English and Math ratings and with a science rating that looks to be invented from whole cloth.
There are further “achievement gap” measures that provide useful information but let’s pass over those and look at the 2018 results for Petersburg.
As the Petersburg Web page said, everybody is accredited!
None of the “conditions” schools, even Johns, which flunked on all the academic measures, can be denied accreditation unless it “fails to adopt and implement . . . corrective action plans with fidelity.”
So there you have it: The process rewards past cheating. The old benchmarks at 75 and 70% are reduced to 66% (at least for four consecutive years) and apply to manipulated pass rates that can be nearly double the actual pass rates. A school that misses those relaxed benchmarks, and does not benefit from one of the helpful exceptions, is accredited, albeit “with conditions.” That school can be denied accreditation only if it does not, in effect, tell the Board of Education to go jump in the James. Or, in the case of Petersburg, the Appomattox.
So, Petersburg gets to feel good about being “accredited.” Never mind the increased numbers of kids who are not being educated.
98.2 million of your tax dollars at “work.”
The increase in the state rate was driven by an increase in the standard diploma rate. The drop in the Richmond rate came from a 2.7% decrease in the advanced studies rate.
But you, Astute Reader, are still looking at that first graph and asking: “John! You’ve been ranting about how VDOE’s manipulation improved the state rate by about 1.3 points for ‘17 and ‘18 and the Richmond rate by perhaps five points. Where are those increases?”
Ah, what a pleasure to have an attentive reader! The 1.3 and 5 point boosts look to have been offset or partially offset by decreases in the End of Course pass rates.
Turning to the data, here are the graduation rates again along with the averages of the five EOC subject area pass rates.
Students must pass six EOC tests to graduate. Thus, the decreases in the pass rates on those required courses must have lowered the graduation rates. Then the VDOE data manipulation offset those graduation rate declines in some measure.
That looks like a general explanation. The specifics would require a more detailed knowledge of which students passed or failed which courses, where in their four-year journey through high school, and whether they graduated. For sure, the drop in the Richmond pass rates is consistent with the absence of the five-point boost there.
Of course, correlation is not causation and doubtless there are other factors in the mix here. The floor is open for any more convincing suggestion.
BTW: The big drops in Richmond's EOC pass rates, and the lesser drops in the state's, mostly came in math and science.
Preview of Coming Attraction: The Board of “Education” has its next Finagle factor in place: Under the new accreditation regulation (subsection B.3), we now have “locally verified credits” for students who flunk the required SOL tests. This should ensure another nice increase in the state graduation rate, paid for by another not-so-nice decrease in student learning.
The 2018 four-year cohort graduation data are up on the VDOE Web site.
For a discussion of the ways VDOE manipulates the data, see this post. To the point here, the state’s “on-time” rate boosts the numbers, especially for those divisions willing to misclassify students as handicapped. Now we have an additional boost: “Credit Accommodations” now allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.
The data below are the “federal” graduation rate, the sum of standard and advanced studies diplomas divided by the cohort size. Because of the Credit Accommodations, the numbers do not compare directly to data from prior years. (Calculations from earlier data suggest that this manipulation boosts Richmond’s federal rate by about 5% and the state rate by about 1.3%.)
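In code form, the “federal” rate is just this (a sketch; the function name is mine, the formula is from the paragraph above):

```python
def federal_graduation_rate(standard, advanced, cohort):
    """Standard plus advanced studies diplomas, divided by the
    cohort size, expressed as a percentage."""
    return 100.0 * (standard + advanced) / cohort
```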
The cohort data also include dropout numbers.
Let’s start with some division graduation and dropout rates.
By nearly a 2:1 ratio, those Richmond diplomas are standard, not advanced studies.
The Richmond numbers are driven by three of our general population high schools, with the other two tagging along.
We have the worst graduation rate in the state.
Also the worst dropout rate.
Finally, here are the lists of the worst schools in terms of graduation rate,
and dropout rate. (The “#NA()” entries are cases where groups of <10 students invoked VDOE’s suppression rules.)
The new “school quality indicators” for Accreditation come in three levels (see Subsection F). For the English tests, they are defined as:
Level One: Schools with a current year or three-year average rate of at least 75%, or schools that were at Level Two the prior year and decrease the failure rate by 10% or more from the prior year.
Level Two: Schools not meeting Level One performance with a current year or three-year average rate of at least 66%, or schools with a prior year rate of at least 50% that decrease the failure rate by 10% or more from the prior year. A school shall not receive a Level Two performance designation for more than four consecutive years.
Level Three: Schools not meeting Level One or Level Two performance.
Math and science are the same except the Level One benchmark is 70%.
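For readers who prefer code to regulatory prose, here is an illustrative sketch of the level assignment. The function name is mine, and I read “decrease the failure rate by 10%” the way the examples above read it, as a ten-point gain in the pass rate; the four-consecutive-years cap on Level Two and the growth/recovery/English-learner boosts are omitted.

```python
def school_quality_level(rate, avg3, prior_rate, prior_level, benchmark=75):
    """Sketch of the Level One/Two/Three rules quoted above.
    benchmark is 75 for English, 70 for math and science."""
    best = max(rate, avg3)  # current year OR three-year average, whichever helps
    # One reading of "decrease the failure rate by 10% or more":
    ten_point_gain = rate >= prior_rate + 10
    if best >= benchmark or (prior_level == 2 and ten_point_gain):
        return 1
    if best >= 66 or (prior_rate >= 50 and ten_point_gain):
        return 2
    return 3
```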
The definitions for dropout rate, absenteeism, and graduation rate are similarly complicated. The benchmarks, before the complications, are:
See the regulation for the complications.
Given that the entire process is built upon arbitrary criteria, let’s create an arbitrary grading scale for the three academic areas, along with dropouts, absenteeism, and graduation index:
With that we can average the results to calculate an overall Quality Indicator Score.
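Something like the following, where the grade values are placeholders of mine (the actual scale is in the table above):

```python
# Placeholder grade scale -- substitute the values from the table above.
GRADE = {1: 100, 2: 50, 3: 0}

def quality_indicator_score(levels):
    """Average the graded indicators into one overall score.
    Indicators too small to evaluate ("TS") are simply skipped."""
    graded = [GRADE[lv] for lv in levels if lv in GRADE]
    return sum(graded) / len(graded)

quality_indicator_score([1, 1, 1, "TS", 1, 1])  # -> 100.0
```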
On that basis, 1,464 of 1,813 schools (80.8%) score 100%. There are seventeen Richmond schools in that group.
That “TS” represents a group too small to evaluate, they say.
The other twenty-six Richmond schools did less well:
If we turn to the bottom of the state list, as measured by this score, we see MLK in a three-way tie for last place, with five other Richmond schools in the 15-way tie for fourth from last:
But, by golly, they’re all accredited.
The estimable Carol Wolf points me to a June 2018 Review of the Finance and Business Operations of the Richmond Public Schools prepared by the Council of the Great City Schools.
First the Good News: Our new Superintendent requested the study.
The rest is Bad News. You really should read the report to understand what a huge hole Mr. Kamras is trying to climb out of.
Here, as a teaser, are some tidbits:
Note added Sept. 28. Upon rereading the reg., I see I overstated the requirement for denial. There is a four-year window built in at Level Two, but once a school reaches Level Three, it is Accredited with Conditions. There follows a regulatory minuet, but the bottom line is that the school has to tell the Board to go jump in the James to get its accreditation denied.
The 2019 Accreditation Results (based on 2018 testing) are up.
Last year, nineteen (of forty-four) Richmond schools were denied accreditation. This year, all Richmond – and, indeed, all Virginia – schools are accredited!
(Under the new, byzantine system – see below – nineteen Richmond schools are “Accredited” this year and twenty-four are “Accredited With Conditions.” The “condition” is that each of the twenty-four must “develop and implement a corrective action plan.”
It will be at least four years before any Virginia school can be denied accreditation. Only if it fails to negotiate and implement that plan “with fidelity” can the school be denied accreditation.)
If you thought that the elimination of all those denials of accreditation reflected performance improvements, you’d be wrong.
In Richmond, the reading pass rate rose by 0.73% this year, the science by 0.53; the others fell sufficiently that the five subject average fell by 1.73%.
Statewide, all the averages fell.
The picture is even more interesting at some of the individual Richmond schools. For example, we have Armstrong, the 12th worst school in Virginia in terms of the all-subjects pass rate:
In this mixed bag of gains and losses, only one pass rate broke 50%.
But Armstrong now is accredited, albeit “with conditions.”
MLK, the second worst school in the state this year on the all-subject average, paints an even more remarkable picture: Its highest 2018 pass rate was 43.18 in History and Social Science. Its lowest, a plunge to 16.67 in writing.
But, by golly, MLK now is accredited, also “with conditions.”
Then we have Woodville, the fourth worst school in Virginia, as measured by the all-subjects average:
No pass rate above 40% this year, but accredited, again “with conditions.”
Note: The “#N/A” is Excel’s way of saying that the State does not test writing in elementary school and it can’t graph the nothingness.
How did the Board of “Education” produce these bizarre results?
The Board has demonstrated beyond doubt that it does not know how to fix Petersburg (and admitted as much, see the Sept. 21, 2016 video starting at 1:48). Faced now with the ongoing Petersburg debacle and with accreditation disasters in Richmond and elsewhere, the Board punted: They adopted a new, emasculated accreditation regulation.
I commented on that regulation at the proposed stage, pointing out, inter alia, that the changes “make it almost impossible for a school to be denied accreditation.”
To read the entire regulation is to earn a PhD in masochism. (A reporter for the Daily Press did read it; she survived, and wrote a nice summary.)
Blessedly, the important parts are short:
In short, in order to lose accreditation, a school must foul up badly for four consecutive years and then tell the Board to go to hell.
That is not a problem, however; that is a feature. The regulation imposes a sterile paperwork smokescreen to hide the Board’s incompetence as to the inadequate schools in Petersburg and Richmond (and elsewhere). And, not at all beside the point, to make the Board and all those awful schools look better than they are in fact.
Never mind the kids who are the victims of this perverse charade.
98.2 million dollars of our tax money at “work.”
Postscript: Here is the Richmond list.
The Virginia Board of Education’s concern for the effect of institutional cheating on the Carver students only applies to a fraction of those students. And it does not apply to students affected by cheating elsewhere.
Following its investigation of the institutional cheating at Richmond’s George Washington Carver Elementary School this spring, the Department of “Education” wrote a thirty-three page report.
Perhaps the most devastating feature of the report was the analysis of the effect of the past cheating upon the Carver students who went on to middle school. Two graphs show the impact reaching back to the cohort of students who were in the fifth grade at Carver in 2016:
As we might expect, kids who were not doing well at Carver were told they were wonderful; then they bombed out in middle school. This has been going on since at least 2016, and probably since 2014.
Section V. of the report says that RPS “must” implement a set of seven “actions.” The two actions relating to help for the affected students are:
1. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for any former GWC ES student entering middle school for the first time in 2018-2019 and 2) a plan to implement any additional instructional supports needed for these students.
2. By Friday, September 21, 2018, RPS division-level staff will develop: 1) a plan to evaluate whether additional instructional supports are necessary for GWC ES students entering the fourth and fifth grade in the 2018-2019 school year and 2) a plan to implement any additional instructional supports needed for these students.
So, Richmond must act to ameliorate the effect of the cheating upon students who were at Carver this year.
But the Carver students in grade 5 last year and the students of the graphs, who were in grade 5 in 2016, must fend for themselves.
And, looking at the graphs, the cheating was rampant in 2015 and ramping up in 2014. But the Board of “Education” is indifferent to the effect on the fifth graders of those years.
As well, the report shows an analogous impact on Carver students who transferred to other elementary schools. But those students, also, must fend for themselves.
We now see that the Board of “Education” has had a simple tool for spotting and – if the analysis were known to be in use – for preventing such cheating.
But the Board has not used this simple analysis in the past and it shows no inclination toward a general deployment now.
It is hard to know what is most outrageous:
Ah, well: $98.2 million of your tax money at “work.”
Note added 9/20: Mr. Muzik emailed a response to this post. I’ve appended it here.
Still later: Plus a note from Mr. Pyle of VDOE.
The Principal at Munford, Greg Muzik, posted a couple of comments to Carol Wolf’s repost of my Munford post. Yesterday, I copied him on a note to Carol and Chuck Pyle of VDOE that included a spreadsheet analyzing his data and reexamining my own. Mr. Muzik replied to the email. I now respond:
I write in response to your email of yesterday to me with copies to Mrs. Wolf and Mr. Pyle (copy attached below).
Your email begins:
I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested.
Notes: The parenthesis is not closed in the original. The “what you had put on our blog” is nonsense: I posted originally to my own blog; Mrs. Wolf reposted to hers; I have not “put” anything on any blog of yours. As set out below, my analysis manifestly was not restricted to “just grade 3 and 5.”
By “bowdlerized,” I mean the adjective form of the verb, bowdlerize:
Bowdlerize verb . . .
2 : to modify by abridging, simplifying, or distorting in style or content . . .
censor, clean (up), expurgate, launder, red-pencil
The pdf you sent Mrs. Wolf contains eight pages. These report “CAT” data for Munford: for 2018, one page each for 3rd grade reading and math and one each for 5th grade reading and math; and the same four datasets for 2017.
These data appear to be the basis for your objection to my remarks. Those remarks were, in relevant part:
It would be unusual to see scores in the 86’s for both lower grades and some 8 points higher in grade 5 in either subject, much less in both. It must be beyond coincidence that, as well, the reading and math scores are the same at each grade level and when averaged, either by grade or by student.
Your response, posted in comments on Mrs. Wolf’s site, discussed at length the CAT pass rate differences at Munford between the third and fifth grades, with no mention of the fourth grade and without any principled explanation of the pattern of reading/math pass rates at all three grade levels in the official VDOE data containing the results of all the tests.
You thrashed that straw man after admitting “I am not sure where Butcher is pulling his data.” If you had attended to either my post or Mrs. Wolf’s repost, you would have seen the link to the VDOE database (containing the scores I list above) in the second line of each post.
(On the subject of attention or the lack of it, I would point out that my name is “Butcher,” not “Bollard” as you call me at one point in your comment on the Wolf site.)
If you had wished to dispute my data, the principled approach would have been to download the VDOE data, check my analysis, point out any discrepancy, and suggest a data exchange to resolve any dispute. Instead, you used a partial dataset that is not available to the public to attack my use of the official data that I had sourced and that you admitted were foreign to you.
More to the point, your analysis, such as it was, focused on the aspect of the data that I called “unusual” and, aside from your analysis of the wrong, partial database, failed to respond to the more serious issue, the identical pass rates.
I found ten instances in the six-year period where a Richmond elementary school reported identical reading and math pass rates at grade 3, 4, or 5.
Exactly half of those cases were at Munford.
Friar Occam would counsel the same conclusion that I suggested:
Cheating (by the school, not the students), done systematically, could easily produce equal pass rates in the two subjects. Coincidence is most unlikely here. No other explanation presents itself.
I now would amend that to include another unlikely possibility, error by VDOE in processing the data. While possible, that also is not a likely explanation. I have been looking at their data for more than a decade. I recall only one instance where they made a mistake (it was obvious on the face of the data) and one other where they failed to demand that Richmond complete a faulty submission.
As you saw (or should have seen) from my earlier email, I have asked Mr. Pyle whether VDOE has an explanation for the discrepancy between your and VDOE’s pass rate data. The copies you provided, along with the discussion in your last email, give us that explanation: You chose to rely upon a limited set of data, not available to the public, that reports the results of CAT tests, not the full range of tests included in the VDOE pass rate data that I cited and used in my analysis.
To the more fundamental point, VDOE has already taken a first look at the Munford data: Mr. Pyle of VDOE wrote me on the 17th:
Not the final word, but I wanted you to know that we looked at the Mary Munford Elementary data. When you breakdown the grades 3-5 reading and math pass rates, the advanced/pass-proficient percentages are not identical. Given the number of students tested in each grade level, coincidence is not out of the realm of possibility.
Of course, coincidence is not out of the range of possibility; neither is VDOE error. Neither is probable, however, and coincidence is especially unlikely in light of the data above.
BTW: You have overlooked a feature of the State data that supports the notion that the Munford scores were not rigged. One giveaway in the Carver data was the sudden rise to best in the City. For (a startling) example:
In contrast, the Munford data, particularly in the third grade, show year-to-year variations, with an improving trend, all in a good but not spectacular range. For example:
If your people were cooking the data, these numbers could be expected to be more uniform and, especially, higher, contributing to a statewide all-subject average rank much better than Munford’s current #203 (of 1,714).
On another subject, you said, “Butcher seems to be complaining about our tax dollars used to support education.” Again, you have revealed your lack of attention to the material you are criticizing. My comment, “98.2 million dollars of our tax money at ‘work,’” was the sarcastic conclusion to a jeremiad about VDOE’s ability to easily spot cheating such as persisted at Carver for at least four years, and its abiding failure to do so.
Indeed, in the Carver case, VDOE demonstrated that it has a key to falsification of the cheating hypothesis as to Munford. Their analysis of the Carver situation comported with the obvious notion that students whose scores are artificially raised in elementary school will generally score much lower once they enter middle school. I already have asked Mr. Pyle whether VDOE intends to perform a similar cohort analysis of the Munford students who have recently entered middle school. I hope you will join me in asking that they conduct that study for all of our elementary schools. Even though it cannot identify problems with this year’s grade 5 pass rates, that study can root out past problems and, more importantly, should serve as prophylaxis for next year’s testing.
I am not sure what you mean by bowderlized. I sent the data based on what you had put on our blog related to student pass rates (just grade 3 and 5 from 2017 and 2018 and what Carol W. had requested. The data sent is “raw” data that shows the actual number of students who took the test and who passed or failed. The data from the state you get may be the adjusted pass rates. This takes into account students who were “recovery” that can add to your pass rate, those who transferred into the school form another division after the first 30 days of school (Failing scores don’t count) and English Language learners whose scores are not counted when DOE adjusts the pass rates. It also does not include special Education students who take the Va. Alternative Assessment (VAAP). That data is reported separately, but included in the state published pass rates. As I told Ms. Wolf, the scores at Munford are at or just a little higher than what would be expected based on the population we serve. While I would love to say the reason our students perform so well is they great teachers and principal, the results reflect what we see all over Virginia and the nation.
I did neglect to include a sheet for the Plain English Math Assessment that one student took that so the math and reading pass rates in 2018 were the same (4 students failed both math and reading our of the 71 who took the tests). but having the same number of kids passing or failing a test is not unusual and does not mean much. In our case, 3 of the 4 students were special education students and this is a challenge for all schools, and one of the gap groups. In grade 5 reading and math go hand in hand as there is a great deal of reading students must do in the math assessment. The pass rate has nothing to do with how student performed as passing scores range from 400 – 600. A better way to look at grade level performance is the mean scaled score. The mean scaled score for math was 510.5 and the mean scaled score for reading was 497.9. So even students who passed had great variation in their scores (as expected).
For schools, the raw data is what is used to look at instruction, addtional support services and programs to support students. The adjusted pass rates are really just a school accountability process for accreditation. For instruction use, we use the raw data.
On Tue, Sep 18, 2018 at 9:54 AM, John Butcher <[deleted]> wrote:
Carol and Chuck,
Here . . . are the (obviously bowderlized) Pearson pdf that Carol received from Munford along with my spreadsheet that compares the Pearson data for Munford (4th grade numbers were absent from the Pearson data I received), my original Munford numbers extracted from a large dl from the VDOE site, and a fresh Munford dl from the VDOE site. Yellow headings denote my calculations.
The VDOE pass rates are identical in both downloads. There is a clear disconnect between the Pearson and VDOE data.
Muzik email of 9/20:
1. The data I provided is the raw data from Pearson and was not altered data in any way. This does not account for SOA adjustments that relate more to accreditation that student performance. The Raw data is what drives instructional priories and programs to address remediation and student support.
2. I only addressed grade 3 and 5 because there seemed to be a concern about the SOL pass rate growth of 8 points from grade 3 – 5 which matches the growth state wide from grade 3 – 5. I provided several reasons why we see growth from grade 3 – 5
3. It it not unusual to have the same pass rates in reading and math in a specific grade level. In grade 5 last year, it just happened that there were 4 students who failed the SOL math and 4 who failed the reading. This does not mean the results are the same as the range of failing is 0 – 400. It is also not unusual for students who fail the reading test to also fail the math especially if they are students with disabilities. Many students who struggle with reading comprehension, vocabulary and fluency may also have difficulty with math fluency. However, I reviewed the specific students scores and in grade 5. Only one student failed both the reading and math SOL tests. The rest of the students failed one and passed the other with scores ranging from a low of 301 to a high of 397. A student with a score of 397 was one question away from passing!
As already indicated, there is nothing unusual about the SOL results at Munford. We are one of very few schools in the city that has very few children living in poverty. Our student population reflects the neighborhood we serve. Our scores reflect what is seen across the state based on the population of students. All other outcome indicators at Munford have similar results such as the PALS, NWEA MAP and ARDT when we used it. Mary Munford has outstanding teachers who provide high quality instruction to students and this does impact SOL results, but may not be seen if simply looking at pass rates. High quality instruction helps all students, but pass/fail end of course testing does not show student growth during the year.
I’ll just add that I am not familiar with the form of the data schools receive directly from Pearson. The 2017-2018 pass rates reported by VDOE in August were calculated according to business rules under ESSA, just as pass rates were previously calculated according to business rules under NCLB. The adjusted accreditation pass rates we’ve reported for SOA purposes reflect the state Board of Education’s regulations and another set of business rules. Next week, we will report accreditation data aligned with the revised SOA.
Or, in terms of the individual middle schools (here, the 6th grade pass rates):
Looking just at the average pass rates for transitional grades 5 and 6 we see:
We know that at least some of the Richmond difference between the elementary and middle school grades can be explained by cheating in the elementary schools (by the schools, not by the students). Doubtless the absence of the bogus Carver scores from the 2018 data helps explain Richmond’s Grade 5 drop that year.
The math data show much the same picture.
We can get a more nuanced view by comparing the 6th grade scores with the rates from the fifth grade a year earlier. This gives a measure (not a cohort analysis, but the best we can do with the public data) of the progress of the class as it goes from elementary to middle school.
For example, Richmond’s 56.25 reading pass rate in 2018 is 13.36 points below the fifth grade’s 69.61 in 2017. A graph gives us the history.
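That arithmetic, in sketch form (function name mine; the numbers are from the sentence above):

```python
def class_progress(rate_6th, prior_rate_5th):
    """Change in a class's pass rate from grade 5 to grade 6 a year
    later -- a rough class-level measure, not a true cohort analysis."""
    return rate_6th - prior_rate_5th

class_progress(56.25, 69.61)   # about -13.36 (Richmond reading, 2018)
```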
The dips in 2013 reflect the score drops with the new tests. Then both Richmond and the state averages recovered, albeit Richmond to a much larger gap. In an ideal world, both graphs would level off at zero.
To the point here, the state average these days shows a small drop in pass rate going from elementary to middle school while the Richmond decrease ranges from 9.6 to 16.7 points larger.
On the math tests, the state average 6th Grade rates are higher than the 5th, probably indicating that the new 6th grade test is the easier. As well, the state recovery from the new tests was much more robust.
But Richmond remains 17.9 points down this year, even after the Carver cheating bonus has been removed.
There is little hope that the Board of “Education” will be looking into this. That Board did not bother to look at the data that nailed the Carver cheating until after our Superintendent asked for an investigation. That dereliction of their duty as to Carver was part of a general failure, probably ongoing, to even glance at the data, much less to investigate the discrepancy.
Thus, we’ll have to depend on how vigorously our new Superintendent roots out the other (alleged) elementary school cheating. If he does his job, the 2019 data may give a clearer picture of whether the current underperformance of Richmond’s middle schools reflects middle school incompetence or widespread elementary school cheating (or, Heaven forfend, both).