Picking the Unblemished Cherries

The headline in the Chronicle says, “SOL scores soar in Charles City.”

The text continues:

In Charles City, several areas saw massive gains in comparison to the previous year. Of the improvement areas, perfect pass rates were achieved in the areas of Algebra II and chemistry. Subject areas that saw a 20 or more point gain include grade seven English reading (64 to 85), World History I (59 to 79), World History II (54 to 92), grade three mathematics (53 to 81), grade seven mathematics (36 to 66), geometry (28 to 51), and the aforementioned Algebra II (34 to 100). . .

“When we got those scores in, it was absolute elation,” said Charles City superintendent of schools David Gaston. “Coming off last year, we didn’t do poorly but we knew we had areas to work on.”

To be sure, CCCo saw some nice pass rate gains this year. There were also some (unmentioned) decreases.

image

In the larger view, the overall trends do not justify the Superintendent’s “elation.”

Reading is indeed up this year. But over half of the gain merely recovered the loss of 2018.

image

The six-year trend is +0.21% per year.
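A per-year trend like that is just the slope of a least-squares line fitted to the annual pass rates. A minimal sketch, with hypothetical pass rates standing in for the actual division data:

```python
# Estimate a per-year trend as the slope of a least-squares line.
# The pass rates below are hypothetical, not the actual division data.
years = [2014, 2015, 2016, 2017, 2018, 2019]
rates = [74.0, 74.5, 73.8, 75.0, 72.5, 76.0]  # reading pass rates (%)

n = len(years)
mean_x = sum(years) / n
mean_y = sum(rates) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, rates)) \
        / sum((x - mean_x) ** 2 for x in years)

print(f"Trend: {slope:+.2f} points per year")  # Trend: +0.15 points per year
```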

The trends in the other subject areas are less encouraging.

In math, this year’s gain was not sufficient to overcome the 2018 decrease, and the overall trend is down.

image

History and social science, more so.

image

The writing pass rate this year fell to a six-year low, wiping out the gains of the previous two years.

image

This year’s decrease in the science pass rate more than undid 2018’s increase.

image

The average of those five subject area pass rates increased by 1.2 points this year but the trend continued to decline (-1.4 points per year).

image

It takes selective blinders to see cause for “elation” in these data. A more measured reaction might be “some bright spots but large areas that continue to need our attention.”

Profligate Promotion

The conventional wisdom these days seems to be that both social promotion and retention in grade are ineffective. For example:

Studies indicate that retention negatively impacts students’ behavior, attitude, and attendance, but it is still practiced in schools around the country. Social promotion undermines students’ futures when they fail to develop critical study and job-related skills; however, it too is still practiced in many schools throughout the United States. These practices are ruining public education as we know it, and unless we innovate and find alternative strategies to replace them, the U.S. K-12 education system will continue to underperform.

BTW: That article says that both social promotion and retention in grade “are ruining public education.” A closer view might suggest that the prerequisite failure to learn is the problem, not the dilemma of retention v. promotion.

In any case, the data tell us that Richmond has elected for social promotion.

Table 7 in the Superintendent’s Annual Report gives the number of students repeating the same grade they attended in 2018. The 2018 Table 7 reports the 2018 fall membership. The ratio of the two gives the percentage of students held back in each grade. The SOL database offers the percentage of students failing the tests in each subject in each grade. With those data in hand, it is straightforward to produce a graph:

image
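The retention calculation is a ratio of two published counts. A sketch, with hypothetical counts standing in for the Table 7 and fall membership data:

```python
# Percent held back = students repeating a grade this fall
#                     / prior-year fall membership in that grade.
# All counts and pass rates here are hypothetical placeholders.
repeaters_2019 = {3: 12, 4: 9, 5: 7}            # "Table 7"-style repeater counts
membership_2018 = {3: 1800, 4: 1750, 5: 1700}   # prior-year fall membership

retention_pct = {g: 100 * repeaters_2019[g] / membership_2018[g]
                 for g in repeaters_2019}

# Compare with the SOL failure rate (100 minus the pass rate) in each grade.
sol_pass = {3: 53, 4: 60, 5: 62}                # hypothetical pass rates (%)
for g in sorted(retention_pct):
    print(f"grade {g}: {retention_pct[g]:.1f}% retained, "
          f"{100 - sol_pass[g]}% failed the SOL")
```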

Substituting the reading failure rates for the math rates gives a similar picture.

image

The other subjects are not tested in all of grades 3-12. In that data desert, the 6th-grade failure rate in writing probably says something about the fifth-grade education in that subject.

image

2019 Dropouts

Richmond won the dropout race this year.

Here are the eleven divisions with the largest All Students 4-year cohort dropout rates for 2019:

image

That 24.4% is 372 students (of 1,523) who dropped out.

The peer city, Norfolk, is highlighted in red.

The “ED” column is the rate for economically disadvantaged students. Sorted by the ED rate, the list changes but Richmond’s position does not.

image

Just to cleanse the palate, here are the divisions with the lowest rates, sorted first by the all students rate, then by ED.

image

image

In terms of the subgroups of students, the rates of black, white, and homeless students are unacceptable; those of the English learners, Hispanics, and students with disabilities are astronomical.

image

The VDOE information sheet and FAQ don’t tell us what “anytime” means in this context.

Our Superintendent’s Plan, Dreams4RPS, lists ten goals; the eighth is “Decrease chronic absenteeism-overall and for each subgroup.” There is no mention of how, by how much, or who is to be responsible for the decrease. None of the five “priorities” in the Plan mentions attendance, truancy, or absenteeism.

More Salary $ ≠ Higher Pass Rate

We often hear that better pay leads to better teaching. VDOE has some data on that.

Table 19 in the Superintendent’s Annual Report lists, among other things, the average annual teacher salaries by division. Data here are averages for both teachers and principals. The very nice database on the VDOE site provides pass rates. The latest data in the Annual Report are from 2018, so the pass rates here are from 2018 as well.

For the reading tests, the data look like this:

image

“ED” on the chart refers to the “economically disadvantaged” students. Statewide, ED students underperform their more affluent peers (“Not ED”) by about twenty points so I’ve posted both sets of data.

The fitted line to the Not ED data suggests that a $10,000 increase in the average salary is associated with a 1.1% increase in the pass rate. The R-squared value of 2.3%, however, tells us that the two variables are essentially uncorrelated.

The ED data suggest a decrease in the pass rate of 3.1% per $10,000 increase in average salary. The R-squared of 9.2% indicates a slight correlation, but nothing to write a thesis about.
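The slope and R-squared figures above come from an ordinary least-squares fit of division pass rates against average salaries. A minimal sketch of that computation in pure Python, with made-up division data standing in for the VDOE numbers:

```python
# Ordinary least-squares fit of pass rate against salary, in pure Python.
# The division data below are hypothetical placeholders, not VDOE numbers.
salaries = [45000, 50000, 55000, 60000, 65000, 70000, 75000]  # average salary ($)
rates = [82.0, 84.0, 81.0, 85.0, 83.0, 86.0, 84.0]            # pass rate (%)

n = len(salaries)
mx = sum(salaries) / n
my = sum(rates) / n
sxx = sum((x - mx) ** 2 for x in salaries)
syy = sum((y - my) ** 2 for y in rates)
sxy = sum((x - mx) * (y - my) for x, y in zip(salaries, rates))

slope = sxy / sxx                   # points of pass rate per dollar of salary
r_squared = sxy ** 2 / (sxx * syy)  # fraction of variance explained

print(f"{slope * 10000:+.1f} points per $10,000; R-squared {r_squared:.1%}")
```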

The fair conclusion is that division average reading pass rates are unrelated to average teacher salaries, except for a hint that ED rates may decrease slightly with increasing average salary.

The enlarged, square points are Richmond, paying a bit more than average and getting a lot less.

The math and science data tell much the same story.

image

image

For the record, here are the ten highest- and lowest-paying divisions:

image

image

Notice that high-paying and high-scoring Falls Church is not doing well at all with its ED students. Ditto most of those Big Spenders. Also notice that, except for Lexington and Halifax, the Little Spenders have ED pass rates within ten to fifteen points of their Not ED rates.

CAVEAT: These data are consistent with the notions that Virginia’s systems for evaluating educational outcomes and for setting teacher salaries are counterproductive:

Research dating back to the 1966 release of Equality of Educational Opportunity (the “Coleman Report”) shows that student performance is only weakly related to school quality. The report concluded that students’ socioeconomic background was a far more influential factor. However, among the various influences that schools and policymakers can control, teacher quality was found to account for a larger portion of the variation in student test scores than all other characteristics of a school, excluding the composition of the student body (so-called peer effects).

Yet, Virginia’s evaluation system punishes schools and divisions with larger numbers of “economically disadvantaged” students. Moreover, the counterproductive salary scales reward degrees and time in service, not teaching effectiveness:

Teachers’ education (degree) and experience levels are probably the most widely studied teacher attributes, both because they are easy to measure and because they are, in the vast majority of school systems, the sole determinants of teachers’ salaries. However, there appears to be only weak evidence that these characteristics consistently and positively influence student learning.

For another look at the relationship between educational inputs and outputs, see this study from the OECD.

Rigged Grading

A large study by the OECD concludes: “[T]he socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors that are more easily amenable to policy makers, such as school resources and school policies.” That is consistent with the Virginia data that show economically disadvantaged (“ED”) students underperforming their more affluent peers (“Not ED”) by about twenty points on the SOL pass rates.

The SOL reporting system ignores this issue and punishes schools with larger populations of less affluent students. Four Richmond elementary schools illustrate the situation.

Fox and Munford serve relatively affluent neighborhoods; both schools turn in superb SOL results. Cary and Obama have much tougher clienteles and do not show as well. Nonetheless, Cary and Obama get better ED pass rates than do Fox and Munford.

Hard to believe, eh? Well, look at this. And notice how the SOL average is affected by the %ED.

image

The SOL average over all students punishes Cary and Obama for the relative poverty of the majority of their students, even though those students outscore the ED students at Fox and Munford. At the same time, Munford and Fox are rewarded for having small ED populations.

If we look at the averages of the ED and Not ED rates, a different pattern emerges.

image

In terms of those averages, all four schools cluster near the 75% accreditation benchmark. This calculation rewards Cary and Obama for the superior performance of their tougher-to-teach students and recognizes the inferior results at Fox and Munford.
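The contrast between the two measures is just weighted vs. unweighted averaging. A sketch with hypothetical pass rates (not the actual school data):

```python
# The all-students SOL average weights each group by its share of enrollment,
# so a high %ED drags a school's average down even when its ED students
# outperform ED students elsewhere. All numbers here are hypothetical.
def all_students_avg(ed_rate, not_ed_rate, pct_ed):
    """Enrollment-weighted average: what the accreditation SOL average uses."""
    return ed_rate * pct_ed + not_ed_rate * (1 - pct_ed)

def group_avg(ed_rate, not_ed_rate):
    """Unweighted average of the two group pass rates."""
    return (ed_rate + not_ed_rate) / 2

# A "Cary-like" school: strong ED results, mostly ED enrollment.
print(f"{all_students_avg(72, 85, 0.85):.2f}")  # 73.95 -- punished by the weighting
# A "Fox-like" school: weaker ED results, few ED students.
print(f"{all_students_avg(60, 92, 0.15):.2f}")  # 87.20 -- rewarded by the weighting
# The unweighted group average flips the order.
print(group_avg(72, 85), group_avg(60, 92))     # 78.5 76.0
```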

On the math tests, Cary escapes most of the SOL penalty by way of a very high ED pass rate but Obama confirms the point. The Fox and Munford SOLs again shine, despite relatively lower ED pass rates.

image

The ED/Not ED average again tells a more balanced story.

image

The Board of Education bases its accreditation calculation on the SOL average over all students in a school or division. Indeed, the public measure of school quality is that same average. That rewards the schools and divisions with fewer ED students, whether or not they get good results with those ED students, and penalizes schools and divisions with large ED populations, even when they get better than average results with their ED students.

A plot of division average reading pass rates and ED/Not ED averages vs. the %ED illuminates the difference between the SOL average pass rate and the ED/Not ED average.

image

The fitted line to the SOL scores (red) suggests that the SOLs decrease on average by about 2.5% for a 10% increase in the ED population. The R-squared of 23% suggests a modest correlation. Indeed, we have seen that the correlation derives almost entirely from the effect of the ED group’s generally lower scores.

The green points show the averages of the ED and Not ED pass rates. The slope of the fitted line is much lower, minus 0.7% for a 10% ED increase, and the 2.3% R-squared value denotes only a trace of correlation. Said otherwise, this average is almost entirely unrelated to the percentage of ED students in the division.

The math and science data tell the same story.

image

image

An ideal grading system would present a horizontal fitted line with an R-squared of zero. The ED/Not ED average comes close and, in any event, is vastly less unfair than the current system.

BTW: The Board of Education had an even better system, the SGP, that was uncorrelated with economic status. The Board abandoned that system because it provided an accurate measure of teacher performance.

So we are left with a reporting system that punishes schools and divisions that serve larger populations of poorer students.

If that is a fair system, I am Santa Claus.

2019 Graduation Rates

The four-year cohort graduation rates are up on the VDOE Web site.

The Board of Education brags on its (inflated) “On-Time” rate that counts the “Board of Education-approved” diplomas. The Federales, to their credit, count only the Advanced Studies and Standard diplomas.

This year, the Federal rate for the state slipped from 88.8% to 88.7%; the Richmond average fell from 67.6% to 64.9%.

image

On these data, there’s no telling how much of the Richmond decline is the result of correcting the transcript issues.

Breaking out the Standard and Advanced rates: the state Advanced average fell from 52.0% to 51.5% while the Standard rate rose from 36.8% to 37.2%. The Richmond Advanced rate fell from 23.3% to 21.7%; the Standard, from 44.3% to 43.2%.

image

Here are the Richmond rates by school, sorted by the Federal rate.

image

And, to complete the picture, the rates for Richmond’s economically disadvantaged students (ca. 2/3 of the Richmond population).

image

Note: The zero Advanced rates here for Wythe and Franklin probably are artifacts of the VDOE suppression rules for groups of <10 students. The Franklin rate calculates as 29%; the Wythe rate could be as high as 5.5%.
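The Wythe bound follows from the suppression rule: a suppressed count can be anything from zero to nine students, so the suppressed rate can be as high as nine divided by the cohort size. A sketch; the cohort size used here is a hypothetical placeholder, not Wythe’s actual count:

```python
# Under the suppression rule, a group of fewer than 10 students is not
# reported, so a suppressed diploma count could be anywhere from 0 to 9.
# The cohort size below is a hypothetical placeholder, not Wythe's count.
def max_suppressed_rate(cohort_size, threshold=10):
    """Largest rate (%) a suppressed count could represent."""
    return 100 * (threshold - 1) / cohort_size

print(f"{max_suppressed_rate(164):.1f}%")  # 5.5% for a hypothetical cohort of 164
```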

“Help” for Petersburg II: Mirror of VBOE Incompetence

The Petersburg Public Schools have been laboring under the supervision of the Board of Education since at least 2004.

The results are appalling.

Here, to start, are this year’s accreditation calculations. (See this post for descriptions of the L1 and L2 benchmarks and the various score boosts.)

image

image

image

Or, in summary:

image

Petersburg reached this condition as the latest step in a march to abject failure. The decline was exacerbated in 2017 when the staff at AP Hill Elementary (now Cool Spring) were caught cheating.

And, please remember that the accreditation system is rigged to avoid this kind of failure.

The Board’s intervention at Petersburg goes back to a Memorandum of Understanding (“MOU”) in 2004:

https://calaf.org/wp-content/uploads/2019/08/image-42.png

The last two MOUs illuminate the Board’s feckless approach to improving the Petersburg schools. In November 2009, the Board took over the Petersburg system:

The VBOE and the VDOE will continue to assign a CAO . . .   The CAO will have administrative authority over processes, procedures, and strategies that are implemented in support of the MOU and funded by targeted federal and state funds with subsequent review and approval by the Petersburg City School Board.

Then, in April 2016, the Board retreated to “coordinat[ion]” and “technical assistance”:

The Director of [Office of School Improvement] will coordinate with school division staff and other VDOE offices to develop a Corrective Action Plan for Petersburg Public Schools and to provide technical assistance in support of the MOU and Corrective Action Plan.

Notwithstanding this pusillanimity, the Board has the authority to compel Petersburg to fix its schools:

Va. Code § 22.1-8 provides: “The general supervision of the public school system shall be vested in the Board of Education.”

Va. Code § 22.1-253.13:8 provides:

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

Yet the Board has never sued Petersburg, or any other division, to compel compliance with the law. The reason is clear: The Board would have to tell the judge what the division must do to comply with the law but the Board DOES NOT KNOW HOW TO FIX BAD SCHOOLS [Sept. 21, 2016 video starting at 1:48].

I think it is past time for the Board members (and their staff at VDOE) to be directed to work that is better suited to their capabilities.

Effects of “Help” for Petersburg

The Board of “Education” has been “supervising” Petersburg since at least 2004.

https://calaf.org/wp-content/uploads/2019/08/image-42.png

If anything, all that “supervision” has let Petersburg’s performance continue to decay. In summary:

https://calaf.org/wp-content/uploads/2019/08/image-43.png https://calaf.org/wp-content/uploads/2019/08/image-46.png

Notes: Statewide, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by about twenty points in terms of the SOL pass rates. Yet “the socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors.” This makes the SOL average an unfair measure for divisions, such as Petersburg, with large ED populations. Accordingly, the graphs here and below show the performance of both the ED and Not ED students. The new math tests in 2012 and the new English tests in 2013 dropped pass rates statewide; the new math tests in 2019 raised pass rates statewide by 3.4% for Not ED students and 6.6% for ED. The red lines are the nominal levels for accreditation. VDOE “adjusts” the pass rates in order to accredit some schools that come nowhere near those thresholds.

Turning to the individual Petersburg schools: The 2017 data for AP Hill, now Cool Spring, are missing because the school (the staff, not the kids) was caught cheating. The splendid numbers before and dismal performance since 2017 show much the same pattern as Richmond’s GW Carver.

image image

To the point here, with the false aura of the cheating removed, the data show that the school is unable to prepare half its students to pass the SOLs.

Note in passing: Last year, the Board of Education accredited Cool Spring in English based on the three year average. Lacking 2017 data because of the cheating, the Board ignored 2017 and reached back to the (cheating enhanced) 2016 pass rate to compute an average that met the “Level 2” threshold.
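The Level 2 maneuver described above is easy to reproduce: average the three most recent years for which data exist, skipping the voided year. A sketch with hypothetical pass rates:

```python
# Average the three most recent years that have data, skipping a year
# voided (here, by a cheating investigation). Rates are hypothetical.
def three_year_average(rates_by_year):
    years_with_data = sorted((y for y, r in rates_by_year.items() if r is not None),
                             reverse=True)
    recent = years_with_data[:3]
    return sum(rates_by_year[y] for y in recent) / len(recent)

rates = {2016: 88, 2017: None, 2018: 55, 2019: 58}  # 2017 voided
print(three_year_average(rates))  # reaches back to the inflated 2016 rate: 67.0
```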

Pleasants Lane, formerly JEB Stuart, showed some improvement in its math scores this year (as did the state average, with the help of the new, failure-averse scoring system), but the reading rates slid.

image image

Lakemont, formerly RE Lee, painted a less rosy picture.

image image

Walnut Hill, in contrast, stayed in the running for accreditation this year.

image image

There are four middle schools in the database: Peabody, Vernon Johns Jr., Old Vernon Johns, and Vernon Johns. The relationship among these is not clear. However, only Vernon Johns has data after 2017, so we’ll go with that:

image image

The high school’s students suffer from declining performance.

image image

It’s hard to see any benefit here from sixteen years of Board of Education “supervision.” To the contrary, the ongoing, dismal failure of the Petersburg schools testifies to the abiding fecklessness of that Board.

Your tax dollars at “work” (this year, $1,939,750 for “school improvement”).

Middle and High School Accreditation Analysis

Here, to start, are the 2019 English Accreditation data for Richmond’s middle schools.

image

(For details regarding the benchmarks and adjustments, see the earlier post.)

Franklin, a selective school whose pass rates include results from both middle and high school students, made L1 on pass rate alone. Adjustments brought Hill to exactly the L1 benchmark. Binford and Brown adjusted to L2. None of the other four came close to L2.

The math data look a bit better.

image

Brown adjusted to within one point of the L2 benchmark (66), was at 65 on the three-year average, and did not decrease its failure rate by 10% from a pass rate of at least 50%, yet was awarded L2 anyhow. Even more entertaining, Elkhardt-Thompson adjusted to 62, was at 55 on the three-year average, and likewise failed to reduce its failure rate by 10%, but also wound up at L2. It looks like near misses count in horseshoes, hand grenades, nuclear warfare, and accreditation.
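One plausible reading of that failure-rate test: starting from a pass rate of at least 50%, the school must cut its failure rate (100 minus the pass rate) by at least 10% relative to the prior year. A sketch, with hypothetical pass rates:

```python
# One reading of the rule: starting from a pass rate of at least 50%,
# the failure rate (100 - pass rate) must drop by at least 10% year over
# year, measured relative to the prior failure rate. Rates are hypothetical.
def meets_failure_reduction(prev_pass, curr_pass):
    if prev_pass < 50:
        return False
    prev_fail = 100 - prev_pass
    curr_fail = 100 - curr_pass
    return prev_fail > 0 and (prev_fail - curr_fail) / prev_fail >= 0.10

print(meets_failure_reduction(60, 65))  # True: 40 -> 35 is a 12.5% cut
print(meets_failure_reduction(60, 63))  # False: 40 -> 37 is only a 7.5% cut
```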

In science, four schools made L1 on the pass rate; the other four did not even come close to L2.

image

The overall picture:

image

Hill made L1 on all the major measures but wound up at L3 because of problems with achievement gaps of its black, poor, and disabled students (the standard says the rating is L3 if two or more student groups are at L3). Similarly, Brown, although credited with L2 on the English pass rate, flunked English because of achievement gaps (also the black, poor, and disabled groups).

Turning to the high schools.

image

The English pass rates were excellent at the selective schools and more than good enough at Marshall and TJ. Huguenot made L2. Wythe missed L2 on the pass rate but made L1(!) with a 75 on the three year average. Armstrong barely made 50%.

The math numbers were not so good.

image

Among the standard high schools, only Huguenot made L1 with the others languishing.

The science data painted a similar picture but with all the standard high schools below the L2 benchmark.

image

Marshall (65 this year and 54 on the three-year average) made L2 by reducing the failure rate by more than 10%. TJ (64 this year, 60 last year, 63 on the three-year average) got a mystery boost to L2.

At the bottom line, only the selective schools were accredited. None of the standard high schools made L2, despite the hidden help they all obtained by stealing the scores of Maggie Walker students.

image

Elementary Accreditation, a Deeper Dive

Having seen the contours of the 2019 accreditation data for the Richmond schools, let’s take a look at the underlying numbers.

Here is the Big Picture for the English tests at the elementary schools:

image

The data are from the “School Quality Profiles.”

The green line marks the “Level 1” (at standard) benchmark. The gray line marks the “Level 2” (near standard) benchmark. Anything less is “Level 3” (flunking). A school at Level 1 or 2 is “Accredited.” Level 3 earns “Accredited with Conditions.” A school at Level 3 that “fails to adopt and implement school division or school corrective action plans with fidelity . . .” may be designated by the board as “Accreditation Denied.” (Translated: To be denied accreditation, a school or division must marinate in Level 3 and tell the Board of Education to go jump in the James.)

The blue bars are the average pass rates for each school.

Those pass rates are boosted by the magenta “Remediation Recovery” numbers. The orange boost is the percentage of students who, in VDOE shorthand [pick a school, scroll down to “Academic Achievement,” and click “show explanation”], “improved compared with prior performance on state English tests.” Similarly, the gold boost indicates the “[p]ercent of English-language learners passing state English-language proficiency tests or making progress toward English-language proficiency.”

Make what you will of that system; these are the 2019 results in English for Richmond’s elementary schools.

As to math, there is no EL boost but the Level 1 benchmark drops to 70%.

image

Note: These data are sorted by the reported totals while the graph shows the sum of the parts. All the numbers from VDOE are rounded to whole percentages, producing roundoff errors

image image

and bumps in the math graph (which is sorted by the reported numbers).

For science, there are no adjustments so only the pass rates matter. The L1 benchmark again is 70%.

image

To summarize the accreditation results:

image

Looking at the data above, the conclusion is clear: “Accreditation Denied” has gone away; “Accredited with Conditions” is the new, more Superintendent-friendly euphemism that means much the same thing. 

Thus, if we read the new system for what it means, not what it says, it might even be taken as an improvement on the old, bogus system.  Let’s call it the new, bogus system.

But, then, “Board of Education” is itself a euphemism, if not an outright lie (for example, see this and this and this and this) so there’s no reason to expect honest reporting from their “accreditation” process. 

Even more to the point, that Board has demonstrated (and members have admitted [Sept. 21, 2016 video starting at 1:48]) that they don’t know how to fix awful schools such as those in Petersburg (and, of course, in Richmond). So the labels they apply to failing schools provide bureaucratic camouflage: Those “Accredited with Conditions” schools and our Board of “Education” can put on a happy face while they persist in failing to educate far too many children.