Rigged Grading

A large study by the OECD concludes: “[T]he socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors that are more easily amenable to policy makers, such as school resources and school policies.” That is consistent with the Virginia data that show economically disadvantaged (“ED”) students underperforming their more affluent peers (“Not ED”) by about twenty points on the SOL pass rates.

The SOL reporting system ignores this issue and punishes schools with larger populations of less affluent students. Four Richmond elementary schools illustrate the situation.

Fox and Munford serve relatively affluent neighborhoods; both schools turn in superb SOL results. Cary and Obama have much tougher clienteles and do not show as well. Nonetheless, Cary and Obama get better ED pass rates than do Fox and Munford.

Hard to believe, eh? Well, look at this. And notice how the SOL average is affected by the %ED.

image

The SOL average over all students punishes Cary and Obama for the relative poverty of the majority of their students, even though those students outscore the ED students at Fox and Munford. At the same time, Munford and Fox are rewarded for having small ED populations.

If we look at the averages of the ED and Not ED rates, a different pattern emerges.

image

In terms of those averages, all four schools cluster near the 75% accreditation benchmark. This calculation rewards Cary and Obama for the superior performance of their tougher-to-teach students and recognizes the inferior results at Fox and Munford.
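The arithmetic behind the two measures is worth making explicit. A minimal Python sketch, using hypothetical pass rates and ED percentages rather than the actual school numbers:

```python
# Two hypothetical schools with identical subgroup pass rates (ED 70%,
# Not ED 90%) but very different ED shares.
def overall_rate(ed_rate, not_ed_rate, pct_ed):
    """Pass rate over all students: a weighted average driven by %ED."""
    return ed_rate * pct_ed + not_ed_rate * (1 - pct_ed)

def subgroup_average(ed_rate, not_ed_rate):
    """Unweighted average of the two subgroup rates: independent of %ED."""
    return (ed_rate + not_ed_rate) / 2

high_ed = round(overall_rate(70, 90, 0.80), 1)   # 80% ED school -> 74.0
low_ed = round(overall_rate(70, 90, 0.15), 1)    # 15% ED school -> 87.0
print(high_ed, low_ed)
print(subgroup_average(70, 90))                  # 80.0 for both schools
```

With identical teaching results, the overall average hands the low-ED school a 13-point advantage; the subgroup average rates the two schools the same.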

On the math tests, Cary escapes most of the SOL penalty by way of a very high ED pass rate but Obama confirms the point. The Fox and Munford SOLs again shine, despite relatively lower ED pass rates.

image

The ED/Not ED average again tells a more balanced story.

image

The Board of Education bases its accreditation calculation on the SOL average over all students in a school or division. Indeed, the public measure of school quality is that same average. That rewards the schools and divisions with fewer ED students, whether or not they get good results with those ED students, and penalizes schools and divisions with large ED populations, even when they get better than average results with their ED students.

A plot of division average reading pass rates and ED/Not ED averages vs. the %ED illuminates the difference between the SOL average pass rate and the ED/Not ED average.

image

The fitted line to the SOL scores (red) suggests that the SOLs decrease on average by about 2.5% for a 10% increase in the ED population. The R-squared of 23% suggests a modest correlation. Indeed, we have seen that the correlation derives almost entirely from the effect of the ED group’s generally lower scores.

The green points show the averages of the ED and Not ED pass rates. The slope of the fitted line is much lower, minus 0.7% for a 10% ED increase, and the 2.3% R-squared value denotes only a trace of correlation. Said otherwise, this average is almost entirely unrelated to the percentage of ED students in the division.
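For readers who want to reproduce this kind of fit, here is a sketch in Python. The data are synthetic, generated only to mimic the red line's roughly 2.5-point drop per 10% ED; the real inputs would be the VDOE division-level pass rates.

```python
import numpy as np

# Synthetic division-level data for illustration (not the VDOE numbers):
# pass rates falling ~2.5 points per 10-point rise in %ED, plus noise.
rng = np.random.default_rng(0)
pct_ed = rng.uniform(10, 80, 132)                  # one point per division
pass_rate = 85 - 0.25 * pct_ed + rng.normal(0, 4, 132)

slope, intercept = np.polyfit(pct_ed, pass_rate, 1)
r_squared = np.corrcoef(pct_ed, pass_rate)[0, 1] ** 2

print(f"change per 10% ED: {10 * slope:.1f} points")
print(f"R-squared: {r_squared:.2f}")
```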

The math and science data tell the same story.

image

image

An ideal grading system would present a horizontal fitted line with an R-squared of zero. The ED/Not ED average comes close and, in any event, is vastly less unfair than the current system.

BTW: The Board of Education had an even better system, the SGP, that was uncorrelated with economic status. The Board abandoned that system because it provided an accurate measure of teacher performance.

So we are left with a reporting system that punishes schools and divisions that serve larger populations of poorer students.

If that is a fair system, I am Santa Claus.

2019 Graduation Rates

The four-year cohort graduation rates are up on the VDOE Web site.

The Board of Education brags on its (inflated) “On-Time” rate that counts the “Board of Education-approved” diplomas. The Federales, to their credit, count only the Advanced Studies and Standard diplomas.

This year, the Federal rate for the state slipped from 88.8% to 88.7%; the Richmond average fell from 67.6% to 64.9%.

image

On these data, there’s no telling how much of the Richmond decline is the result of correcting the transcript issues.

Breaking out the Standard and Advanced rates, the Advanced average fell from 52.0% to 51.5% while the Standard rate rose from 36.8% to 37.2%. The Richmond Advanced rate fell from 23.3% to 21.7%; the Standard, from 44.3% to 43.2%.

image

Here are the Richmond rates by school, sorted by the Federal rate.

image

And, to complete the picture, the rates for Richmond’s economically disadvantaged students (ca. 2/3 of the Richmond population).

image

Note: The zero Advanced rates here for Wythe and Franklin probably are artifacts of the VDOE suppression rules for groups of <10 students. The Franklin rate calculates as 29%; the Wythe rate could be as high as 5.5%.
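The bound comes straight from the suppression cutoff: a suppressed count is at most 9 students. A Python sketch; the cohort size here is hypothetical (VDOE publishes the actual cohort counts):

```python
def max_suppressed_rate(cohort_size, cutoff=10):
    """Largest diploma rate consistent with a count suppressed as <cutoff."""
    return 100 * (cutoff - 1) / cohort_size

# With a hypothetical cohort of 163 students, a suppressed Advanced count
# (at most 9 diplomas) caps the rate at about 5.5%.
print(round(max_suppressed_rate(163), 1))  # 5.5
```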

“Help” for Petersburg II: Mirror of VBOE Incompetence

The Petersburg Public Schools have been laboring under the supervision of the Board of Education since at least 2004.

The results are appalling.

Here, to start, are this year’s accreditation calculations. (See this post for descriptions of the L1 and L2 benchmarks and the various score boosts.)

image

image

image

Or, in summary:

image

Petersburg reached this condition as the latest step in a march to abject failure. The decline was exacerbated in 2017 when the staff at AP Hill Elementary (now Cool Spring) were caught cheating.

And, please remember that the accreditation system is rigged to avoid this kind of failure.

The Board’s intervention at Petersburg goes back to a Memorandum of Understanding (“MOU”) in 2004:

https://calaf.org/wp-content/uploads/2019/08/image-42.png

The last two MOUs illuminate the Board’s feckless approach to improving the Petersburg schools. In November 2009, the Board took over the Petersburg system:

The VBOE and the VDOE will continue to assign a CAO . . .   The CAO will have administrative authority over processes, procedures, and strategies that are implemented in support of the MOU and funded by targeted federal and state funds with subsequent review and approval by the Petersburg City School Board.

Then, in April 2016, the Board retreated to “coordinat[ion]” and “technical assistance”:

The Director of [Office of School Improvement] will coordinate with school division staff and other VDOE offices to develop a Corrective Action Plan for Petersburg Public Schools and to provide technical assistance in support of the MOU and Corrective Action Plan.

Notwithstanding this pusillanimity, the Board has the authority to compel Petersburg to fix its schools:

Va. Code § 22.1-8 provides: “The general supervision of the public school system shall be vested in the Board of Education.”

Va. Code § 22.1-253.13:8 provides:

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

Yet the Board has never sued Petersburg, or any other division, to compel compliance with the law. The reason is clear: The Board would have to tell the judge what the division must do to comply with the law but the Board DOES NOT KNOW HOW TO FIX BAD SCHOOLS [Sept. 21, 2016 video starting at 1:48].

I think it is past time for the Board members (and their staff at VDOE) to be directed to work that is better suited to their capabilities.

Effects of “Help” for Petersburg

The Board of “Education” has been “supervising” Petersburg since at least 2004.

https://calaf.org/wp-content/uploads/2019/08/image-42.png

If anything, all that “supervision” has let Petersburg’s performance continue to decay. In summary:

https://calaf.org/wp-content/uploads/2019/08/image-43.png https://calaf.org/wp-content/uploads/2019/08/image-46.png

Notes: Statewide, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by about 20 points on the SOL pass rates. Yet, “the socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors.” This makes the SOL average an unfair measure for divisions, such as Petersburg, with large ED populations. Accordingly, the graphs here and below show the performance of both the ED and Not ED students. The new math tests in 2012 and the new English tests in 2013 dropped pass rates statewide; the new math tests in 2019 raised pass rates statewide by 3.4% for Not ED students and 6.6% for ED. The red lines are the nominal levels for accreditation. VDOE “adjusts” the pass rates in order to accredit some schools that come nowhere near those thresholds.

Turning to the individual Petersburg schools: The 2017 data for AP Hill, now Cool Spring, are missing because the school (the staff, not the kids) was caught cheating. The splendid numbers before and dismal performance since 2017 show much the same pattern as Richmond’s GW Carver.

image image

To the point here, with the false aura of the cheating removed, the data show that the school is unable to prepare half its students to pass the SOLs.

Note in passing: Last year, the Board of Education accredited Cool Spring in English based on the three-year average. Lacking 2017 data because of the cheating, the Board ignored 2017 and reached back to the (cheating-enhanced) 2016 pass rate to compute an average that met the “Level 2” threshold.

Pleasants Lane, formerly JEB Stuart, showed some improvement in the math scores this year (as did the state average, with the help of the new, failure-averse scoring system), but the reading rates slid.

image image

Lakemont, formerly RE Lee, painted a less rosy picture.

image image

Walnut Hill, in contrast, stayed in the running for accreditation this year.

image image

There are four middle schools in the database: Peabody, Vernon Johns Jr., Old Vernon Johns, and Vernon Johns. The relationship between these is not clear. However, only Vernon Johns has data after 2017 so we’ll go with that:

image image

The high school’s students suffer from declining performance.

image image

It’s hard to see any benefit here from sixteen years of Board of Education “supervision.” To the contrary, the ongoing, dismal failure of the Petersburg schools testifies to the abiding fecklessness of that Board.

Your tax dollars at “work” (this year, $1,939,750 for “school improvement”).

Middle and High School Accreditation Analysis

Here, to start, are the 2019 English Accreditation data for Richmond’s middle schools.

image

(For details regarding the benchmarks and adjustments, see the earlier post.)

Franklin, a selective school whose pass rates include results from both middle and high school students, made L1 on pass rate alone. Adjustments brought Hill to exactly the L1 benchmark. Binford and Brown adjusted to L2. None of the other four came close to L2.

The math data look a bit better.

image

Brown adjusted to within one point of L2 (66), posted only 65 on the three-year average, and did not cut its failure rate by 10% from a pass rate of at least 50%, but was awarded L2 anyhow. Even more entertaining, Elkhardt-Thompson adjusted to 62, posted 55 on the three-year average, and also failed the failure-rate test, but likewise wound up at L2. It looks like near misses count in horseshoes, hand grenades, nuclear warfare, and accreditation.
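Here is my reading of those Level 2 escape hatches, sketched in Python. This is an interpretation of the rules as described above, not VDOE's actual calculation, and the failure-rate inputs in the example are hypothetical:

```python
def makes_l2(adjusted, three_year_avg, prior_rate, current_rate,
             l2_benchmark=66):
    """L2 if the adjusted rate or the three-year average meets the
    benchmark, or if the failure rate fell by at least 10% from a prior
    pass rate of at least 50% (my reading of the rule)."""
    if adjusted >= l2_benchmark or three_year_avg >= l2_benchmark:
        return True
    if prior_rate >= 50:
        prior_fail = 100 - prior_rate
        current_fail = 100 - current_rate
        return (prior_fail - current_fail) / prior_fail >= 0.10
    return False

# Brown as described: 65 adjusted, 65 three-year average, and no 10% cut
# in the failure rate (the prior/current rates here are made up).
print(makes_l2(65, 65, prior_rate=55, current_rate=58))  # False
```

By these tests Brown and Elkhardt-Thompson should have missed L2; yet both were awarded it.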

In science, four schools made L1 on the pass rate, the other four did not even come close to L2.

image

The overall picture:

image

Hill made L1 on all the major measures but wound up at L3 because of problems with achievement gaps of its black, poor, and disabled students (the standard says the rating is L3 if two or more student groups are at L3). Similarly, Brown, although credited with L2 on the English pass rate, flunked English because of achievement gaps (also the black, poor, and disabled groups).

Turning to the high schools.

image

The English pass rates were excellent at the selective schools and more than good enough at Marshall and TJ. Huguenot made L2. Wythe missed L2 on the pass rate but made L1(!) with a 75 on the three-year average. Armstrong barely made 50%.

The math numbers were not so good.

image

Among the standard high schools, only Huguenot made L1 with the others languishing.

The science data painted a similar picture but with all the standard high schools below the L2 benchmark.

image

Marshall (65 this year and 54 on the three-year average) made L2 by reducing the failure rate by >10%. TJ (64 this year, 60 last year, 63 on the three-year average) got a mystery boost to L2.

At the bottom line, only the selective schools were accredited. None of the standard high schools made L2, despite the hidden help they all obtained by stealing the scores of Maggie Walker students.

image

Elementary Accreditation, a Deeper Dive

Having seen the contours of the 2019 accreditation data for the Richmond schools, let’s take a look at the underlying numbers.

Here is the Big Picture for the English tests at the elementary schools:

image

The data are from the “School Quality Profiles.”

The green line marks the “Level 1” (at standard) benchmark. The gray is the “Level 2” (near standard) benchmark. Anything less is “Level 3” (flunking). A school at Level 1 or 2 is “Accredited.” Level 3 earns “Accredited with Conditions.” A school at Level 3 that “fails to adopt and implement school division or school corrective action plans with fidelity . . .” may be designated by the board as “Accreditation Denied.” (Translated: To be denied accreditation, a school or division must marinate in Level 3 and tell the Board of Education to go jump in the James.)

The blue bars are the average pass rates for each school.

Those pass rates are boosted by the magenta “Remediation Recovery” numbers. The orange boost is the percentage of students who, in VDOE shorthand [pick a school, scroll down to “Academic Achievement,” and click “show explanation”], “improved compared with prior performance on state English tests.” Similarly, the gold boost indicates the “[p]ercent of English-language learners passing state English-language proficiency tests or making progress toward English-language proficiency.”
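In other words, the adjusted rate is the raw pass rate plus the three boosts, compared against the benchmarks. A Python sketch; the L1 English benchmark (75) is from this post, the L2 value (66) is my assumption, and the example school's numbers are invented:

```python
def adjusted_english_rate(pass_rate, remediation_recovery, growth,
                          el_progress):
    """Adjusted English rate: raw pass rate plus the three score boosts."""
    return pass_rate + remediation_recovery + growth + el_progress

def level(adjusted, l1=75, l2=66):
    """Map an adjusted rate to an accreditation level."""
    if adjusted >= l1:
        return "Level 1"
    if adjusted >= l2:
        return "Level 2"
    return "Level 3"

rate = adjusted_english_rate(58, 4, 9, 5)  # hypothetical school: 76
print(level(rate))  # "Level 1", though only 58% of students passed
```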

Make what you will of that system; these are the 2019 results in English for Richmond’s elementary schools.

As to math, there is no EL boost but the Level 1 benchmark drops to 70%.

image

Note: These data are sorted by the reported total while the graphs show the sum of the parts. All the numbers from VDOE are rounded to whole percentages, producing roundoff errors

image image

and bumps in the math graph (which is sorted by the reported numbers).
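The bumps are exactly what whole-percent rounding produces. A small Python illustration with invented component values:

```python
# Each component is rounded to a whole percent before release, so the
# rounded parts need not sum to the rounded total.
parts = [58.4, 3.4, 8.4]                       # pass rate plus two boosts
reported_total = round(sum(parts))             # round(70.2) -> 70
sum_of_rounded = sum(round(p) for p in parts)  # 58 + 3 + 8 = 69

print(reported_total, sum_of_rounded)          # off by one
```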

For science, there are no adjustments so only the pass rates matter. The L1 benchmark again is 70%.

image

To summarize the accreditation results:

image

Looking at the data above, the conclusion is clear: “Accreditation Denied” has gone away; “Accredited with Conditions” is the new, more Superintendent-friendly euphemism that means much the same thing. 

Thus, if we read the new system for what it means, not what it says, it might even be taken as an improvement on the old, bogus system.  Let’s call it the new, bogus system.

But, then, “Board of Education” is itself a euphemism, if not an outright lie (for example, see this and this and this and this) so there’s no reason to expect honest reporting from their “accreditation” process. 

Even more to the point, that Board has demonstrated (and members have admitted [Sept. 21, 2016 video starting at 1:48]) that they don’t know how to fix awful schools such as those in Petersburg (and, of course, in Richmond).  So the labels they apply to failing schools provide bureaucratic camouflage: Those “Accredited with Conditions” schools and our Board of “Education,” can put on a happy face while they persist in failing to educate far too many children.

Richmond High Schools

Having looked at the 2019 pass rates of Richmond’s elementary schools and middle schools, let’s turn to the high schools.

Note: The Board of Education has designed its SOL reporting to discriminate against Richmond and other divisions with large populations of economically disadvantaged (“ED”) students. Those students underperform their more affluent peers (“Not ED”) by about 20 points on average. As a result, the SOL averages for divisions such as Richmond (ca. 2/3 ED) are lowered relative to divisions with similar ED and Not ED pass rates but fewer ED students. Fortunately, the database provides both ED and Not ED pass rates.

The End of Course (“EOC”) tests are primarily administered in the high schools. The standard diploma requires that the student pass the EOC tests in two English courses and one math course.

To start, here are the EOC reading pass rate averages by school for Not ED (more affluent) students.

image

All three of the selective schools aced these tests. Of the mainstream high schools, only TJ met the nominal benchmark for accreditation (75%). And notice, this is the result for the more affluent (and presumably higher-scoring) students.

The Not ED pass rates for the mainstream high schools were reduced by the loss of some better-performing students to the three selective schools. At the same time, the rates for those five schools were boosted some by the scores of Maggie Walker students; those are reported at the high schools in those students’ home districts, not at Walker. VDOE does not report the magnitude of those effects.

Turning to the ED pass rates:

image

Again the selective high schools turned in superb numbers. Marshall was the pick of the mainstream high schools, 4.3 points below the nominal benchmark for accreditation; the other schools worked together to lower the Richmond average to 61.4.

Again, the selective schools skimmed some of the most capable ED students from the pool. That cannot have affected the division average. To the contrary, the Richmond average was boosted some by the rip-off of the Maggie Walker results.

Even so, the averages told a sad story about the state of our high schools.

image

The Not ED minus ED data showed a curious pattern.

image

The Richmond average difference was inflated in some measure by the Maggie Walker swindle. The selective schools showed the effect of attracting some of the more capable ED students. The mainstream high schools were all over the place.  The TJ difference was as astounding in one direction as the Marshall in the other.

Except perhaps at Franklin, the numbers tested were not so small that a few high or low scores could have produced a large fluctuation in the pass rate.

image
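A rough way to check that claim: chance fluctuation in a pass rate runs about sqrt(p(1−p)/n). A Python sketch with a hypothetical number tested:

```python
import math

def pass_rate_se(p, n):
    """Approximate standard error of a pass rate, in percentage points."""
    return 100 * math.sqrt(p * (1 - p) / n)

# With ~100 students tested and a true pass rate near 60%, chance alone
# moves the observed rate by only about 5 points (one sigma).
print(round(pass_rate_se(0.60, 100), 1))  # 4.9
```

That is far less than the school-to-school spread in the table.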

The nonselective schools were showing wildly variable performance, whether from their teachers or from their learners. Or both.

Turning to the math tests and the Not ED pass rates:

image

The nominal accreditation benchmark here is 70; only the selective schools met it.

As to the ED pass rates, none of the mainstream high schools came close to the benchmark. The Richmond average barely broke 50%.

image

There is a complication: The Richmond math averages include results from middle schools. Those offer advanced classes, including some in the high school math subjects, that allow some of their better students to get a jump on the graduation requirements. The entire picture looks like this:

image

image

In light of that selection process, it is no surprise that the middle schools outscored the high schools.

image

The state averages are subject to the same issue. Thus, the comparison with the overall Richmond averages should be a fair one, though it does not directly measure the average high school performance.

image

Richmond has a very large math problem.

As to the Richmond average, the Not ED/ED difference was within the realm of reason. Otherwise, another unexplained spectrum.

image

And, again, the numbers tested were not so small as to explain all the scatter.

image

On these data, the scatter in the Not ED/ED pass rate differences will remain a puzzle.

However, the message of the averages (and of the pass rates of too many schools) is clear.

First Look at Accreditation

The 2019 accreditation results are up.

Notes:

  • Please recall that these ratings are so thoroughly rigged as to be bogus.
  • VDOE calls them the “2019-2020 Accreditation Ratings” because they supposedly apply to the current school year, not the 2019 school year for which they (purportedly) measure performance.
  • In the interest of honest reporting, the tables below say “Flunked” where VDOE says “Accredited With Conditions.” That official euphemism allows everybody to say a school is “accredited” when it actually flunked, even by the current, rigged standards.

Here, then, are the school totals for the state, Richmond, three peer cities, and poor Petersburg:

image

Or, expressing the numbers as percentages of the totals:

image

Richmond is up this year as to the percentage of schools accredited, from 43.2% to 45.5% (one school difference), and also improved on the percentage flunking, from 54.5% to 50% (two schools).

Caveat: The table below shows that both of the schools “Accredited Pending Review of Alternative Plan” deserve to flunk.

Richmond had 2.3% of the schools in Virginia this year and 16.7% of the schools that flunked.

Finally, here is a summary of the “indicators” for the Richmond schools as to the basic subjects as well as truancy, graduation rate, and dropout rate.

image

A reader (THE reader?) tells me to say what L1 et al. mean. L1 means the school met the requirement, such as it is; L2 means it met a relaxed requirement; L3 is the laggards. It’s more complicated than that; details here.

Where a school is all L1 here but flunked anyway, e.g., Albert Hill, you can dig into the complete table or the “profile” to see what drove the rating. Albert Hill is all L1 on this table but flunked on the achievement gaps.

Despite appalling pass rates this year, Fairfield Court was accredited on a three-year average (that included two years of excellent numbers that almost certainly were based on cheating). The current school year should resolve that anomaly.

Richmond Middle Schools

Having looked at the 2019 pass rates of Richmond’s elementary schools, let’s turn to the middle schools.

Note: The Board of Education has designed its SOL reporting to discriminate against Richmond and other divisions with large populations of economically disadvantaged (“ED”) students. Those students underperform their more affluent peers (“Not ED”) by about 20 points on average. As a result, the SOL averages for divisions such as Richmond (ca. 2/3 ED) are lowered relative to divisions with similar ED and Not ED pass rates but fewer ED students. Fortunately, the database provides both ED and Not ED pass rates.

We’ll start with the 6th grade reading pass rates for the Not ED (more affluent) students.

image

Henderson is blank on this graph because its small number of Not ED students tested triggered the suppression rule (<10 Not ED students tested at this grade level). All we can say about that school is that nearly all the 6th Grade students are ED.
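That last inference can be made precise: if fewer than 10 Not ED students were tested, at most 9 were, which bounds the ED share from below. A sketch with a hypothetical total tested (the database reports the actual totals):

```python
def min_ed_share(total_tested, cutoff=10):
    """Smallest %ED consistent with a suppressed (<cutoff) Not ED count."""
    return 100 * (total_tested - (cutoff - 1)) / total_tested

# If 150 students took the test and fewer than 10 were Not ED, at least
# 94% of the tested group was ED.
print(min_ed_share(150))  # 94.0
```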

Franklin is unusual in that it has both middle- and high school grades. I’ve included Franklin here because the by-grade data cut out any effect of the high school grades. All the same, Franklin is a selective school so its numbers don’t compare directly to the mainstream middle schools.

Aside from Franklin, only Hill beat the state average. Binford handily beat the nominal benchmark for English accreditation (75%). The Not ED students of the remaining schools failed at either bad or catastrophic rates.

As to the ED students, only Franklin broke 50% on the 6th grade reading tests.

image

Turning to the seventh grade:

image

This time MLK replaced Henderson in the clutches of the suppression rule. Elkhardt-Thompson and Boushall stayed in the race to the cellar, joined there by Henderson.

As to the ED students, Franklin led the pack. Otherwise, only Brown and Hill broke 50%.

image

The 8th grade data were another chorus of the same distressing song.

image

image

The math data were even more disturbing.

image

image

image

image

image

image

Turning to the school averages we lose Franklin: The database will not give an average over just the middle school students there.

On the reading tests we see Not ED students did well at Hill and Binford.

image

None of the other schools made the nominal benchmark for accreditation.

No school did well with its ED students.

image

In terms of the Not ED/ED differences, Hill and Binford did a Munford: Despite excellent Not ED numbers, the ED pass rates were unusually low.

image

The other schools produced unusually small Not ED/ED differences and the Boushall numbers are anomalous, with the ED students outscoring the Not ED.

Turning to the Not ED math data, we see Hill did well again while Binford performed at the margin and the other schools languished in failure.

image

The ED math numbers were even worse than the reading, with no school breaking 50%.

image

The Not ED/ED differences were similar to those on the reading tests with Hill’s ED underperformance even more exaggerated, Binford’s less so, and Boushall again an anomaly.

image

In light of these data, it is no mystery why Richmond’s enrollment drops over 15% between the 5th and 6th grades.

https://calaf.org/wp-content/uploads/2019/03/image-21.png

Stay tuned for the high school numbers.

Richmond Elementary Schools

While we await the (fictional) accreditation numbers, it might be interesting to look at some of the 2019 Richmond pass rates in more detail.

The excellent – but very sloooow – VDOE database can provide data by subject and by test level. The database also breaks out pass rates for students who are economically disadvantaged (“ED”) and for their more affluent peers (“Not ED”).

The ED averages run about 20 points below the Not ED, so the SOL averages punish the schools with large ED populations. To avoid that distortion, we’ll look here at the underlying ED/Not ED numbers.

To start, here are the Not ED pass rates on the 3d Grade Reading tests.

image

Those ten schools with no data are the victims of the suppression rule (no numbers released to the public if fewer than 10 students). About all this tells us is that those schools have very large ED populations.

Here we see Munford, Holton, Patrick Henry, and Fox beating the state average. Then there are five schools where half or more of the Not ED students cannot pass the 3d grade reading tests.

Turning to the 3d grade ED students, we see the state ED average is 24 points below the Not ED average, and the Richmond ED average, 23 points below.

image

On these data, there’s no telling how well Munford and Fox did with the ED students. In any case, they had too few to affect their averages much.

Patrick Henry’s and Holton’s ED students badly underperformed their Not ED peers.

At the bottom of the scale, we see sixteen (57% of twenty-eight) schools where half or more of the ED students flunked the 3d grade reading tests. Indeed, on average 57.82% of the Richmond ED students did not pass those tests.

The fourth grade data paint a similar picture, albeit with some interesting differences.

image

Do you believe that Chimborazo number? It could be that Chimborazo has an outstanding fourth grade English teacher; perhaps it has a few really bright kids; maybe it has a Carver type operation. No telling from these data.

Patrick Henry and Holton again did well. 

Turning to the 4th Grade ED students:

image

As with the third grade, Holton did not do well. There were too few ED students to tell about Patrick Henry and, again, Munford.

Swansboro looks to have a very effective 4th grade English teacher.

But look at Chimborazo, second from the bottom. Those Not ED Chimborazo data look to be anomalous.

The fifth grade data again show a similar picture with, again, some interesting variations.

image

image

Holton turns in yet another high Not ED/low ED performance.

In this case we have ED data for all the schools and we see Fox, Southampton, Cary, and Patrick Henry beating Munford as to the ED pass rate, with Munford not quite three points above the state average. Of course, these data do not separate the effects of teaching, student ability, or home environment so they do not speak directly to the quality of those schools. That said, Munford does not get any bragging rights.

To the good, these ED numbers, while appalling, are not as awful as those for the earlier grades.

For an overall view that reduces the effects of the suppression rules, here are the school average reading pass rates.

image

image

Never mind the Munford/Fox empire: Cary, Obama, and Fisher are the ED stars here, with Redd and Francis also beating out Fox. Given that about 2/3 of Richmond students are ED, it’s interesting to see the duopoly thus dethroned.

These data also add a nuance to proposals for moving students between Cary and either Fox or Munford. Cary gets better results with its own, majority-ED students. The Munford and Fox ED numbers do not suggest that the Fox/Munford environments might convey any major benefits to those Cary ED students (for even more of that, see the math data below). Indeed, these data suggest it might be helpful to the Fox/Munford ED students if they were moved to Cary.

Turning to math (where a new, failure-averse scoring system improved pass rates statewide this year), the 3d Grade Not ED pass rates (where we have data) range from excellent to (mostly) discouraging.

image

The ED pass rates are heartbreakers in too many cases, but look at Cary, Obama, and Redd.

image

Nine schools have 3d grade pass rates below 50%, with Fox(!) nearly in that league.

Fourth Grade.

image

image

Notice Swansboro, which also had good reading numbers in the fourth grade.

Fifth grade. Notice Fox upping its game here.

image

image

Looking at the school-wide data, we see Munford and Fox still further down in the pack as to their ED students.

image

image

Notice that Ginter Park joins Fox and Munford with excellent Not ED and not so excellent ED pass rates.

One more look: Here are the Not ED/ED differences.

image

image

The several large differences raise the question whether those schools (which include Munford and Fox and some other high-performing schools) are serving their ED students well.

Then there are the anomalous cases, Greene and a few others, where the ED students passed at higher rates than the Not ED. Without more information it’s impossible to know what’s going on there but, for sure, something is out of whack.

———-

Note added on 9/17:

Here are two graphs that examine the Obama/Cary/Fox/Munford situation more directly. The red lines are the nominal accreditation thresholds.

image

image

Those math data are particularly dramatic: Obama and Cary are dealing with much more challenging situations and getting clearly better results with their ED students.

———-

Stay tuned for a look at the middle schools.