2019 Dropouts

Richmond won the dropout race this year.

Here are the eleven divisions with the largest All Students 4-year cohort dropout rates for 2019:

image

That 24.4% is 372 students (of 1,523) who dropped out.
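Just to show the arithmetic (a trivial check, using the figures quoted above):

```python
# Check of the quoted figure: 372 dropouts out of a 4-year cohort of 1,523.
dropouts, cohort = 372, 1523
print(f"{100 * dropouts / cohort:.1f}%")  # prints 24.4%
```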

The peer city, Norfolk, is highlighted in red.

The “ED” column is the rate for economically disadvantaged students. Sorted by the ED rate, the list changes but Richmond’s position does not.

image

Just to cleanse the palate, here are the divisions with the lowest rates, sorted first by the all students rate, then by ED.

image

image

In terms of the subgroups of students, the rates of black, white, and homeless students are unacceptable; those of the English learners, Hispanics, and students with disabilities are astronomical.

image

The VDOE information sheet and FAQ don’t tell us what “anytime” means in this context.

Our Superintendent’s Plan, Dreams4RPS, lists ten goals; the eighth is “Decrease chronic absenteeism-overall and for each subgroup.” There is no mention of how, by how much, or who is to be responsible for the decrease. None of the five “priorities” in the Plan mentions attendance, truancy, or absenteeism.

More Salary $ ≠ Higher Pass Rate

We often hear that better pay leads to better teaching. VDOE has some data on that.

Table 19 in the Superintendent’s Annual Report lists, among other things, the average annual teacher salaries by division. Data here are averages for both teachers and principals. The very nice database on the VDOE site provides pass rates. The latest data in the Annual Report are from 2018, so the pass rates here are from 2018 as well.

For the reading tests, the data look like this:

image

“ED” on the chart refers to the “economically disadvantaged” students. Statewide, ED students underperform their more affluent peers (“Not ED”) by about twenty points so I’ve posted both sets of data.

The fitted line to the Not ED data suggests that a $10,000 increase in the average salary is associated with a 1.1% increase in the pass rate. The R-squared value of 2.3%, however, tells us that the two variables are essentially uncorrelated.

The ED data suggest a decrease in the pass rate of 3.1% per $10,000 increase in average salary. The R-squared of 9.2% indicates a slight correlation, but nothing to write a thesis about.
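For anyone who wants to reproduce these fits, here is a minimal sketch. The arrays are hypothetical stand-ins, not the actual Table 19 and pass-rate data:

```python
import numpy as np

# Hypothetical stand-ins; the real inputs are the division average salaries
# (Table 19) and the 2018 division pass rates from the VDOE database.
salary = np.array([48000.0, 52000.0, 55000.0, 61000.0, 67000.0, 74000.0])
pass_rate = np.array([78.0, 81.0, 76.0, 80.0, 79.0, 82.0])

slope, intercept = np.polyfit(salary, pass_rate, 1)   # least-squares line
r_squared = np.corrcoef(salary, pass_rate)[0, 1] ** 2

# Express the slope as points of pass rate per $10,000 of average salary.
print(f"{slope * 10000:+.1f} points per $10k, R-squared = {r_squared:.1%}")
```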

The fair conclusion is that division average reading pass rates are unrelated to average teacher salaries, except for a hint that ED rates may decrease slightly with increasing average salary.

The enlarged, square points are Richmond, paying a bit more than average and getting a lot less.

The math and science data tell much the same story.

image

image

For the record, here are the ten highest- and ten lowest-paying divisions:

image

image

Notice that high-paying and high-scoring Falls Church is not doing well at all with its ED students. Ditto most of those Big Spenders. Also notice that, except for Lexington and Halifax, the Little Spenders have ED pass rates within ten to fifteen points of their Not ED rates.

CAVEAT: These data are consistent with the notion that Virginia’s systems for evaluating educational outcomes and for setting teacher salaries are counterproductive:

Research dating back to the 1966 release of Equality of Educational Opportunity (the “Coleman Report”) shows that student performance is only weakly related to school quality. The report concluded that students’ socioeconomic background was a far more influential factor. However, among the various influences that schools and policymakers can control, teacher quality was found to account for a larger portion of the variation in student test scores than all other characteristics of a school, excluding the composition of the student body (so-called peer effects).

Yet, Virginia’s evaluation system punishes schools and divisions with larger numbers of “economically disadvantaged” students. Moreover, the counterproductive salary scales reward degrees and time in service, not teaching effectiveness:

Teachers’ education (degree) and experience levels are probably the most widely studied teacher attributes, both because they are easy to measure and because they are, in the vast majority of school systems, the sole determinants of teachers’ salaries. However, there appears to be only weak evidence that these characteristics consistently and positively influence student learning.

For another look at the relationship between educational inputs and outputs, see this study from the OECD.

Rigged Grading

A large study by the OECD concludes: “[T]he socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors that are more easily amenable to policy makers, such as school resources and school policies.” That is consistent with the Virginia data that show economically disadvantaged (“ED”) students underperforming their more affluent peers (“Not ED”) by about twenty points on the SOL pass rates.

The SOL reporting system ignores this issue and punishes schools with larger populations of less affluent students. Four Richmond elementary schools illustrate the situation.

Fox and Munford serve relatively affluent neighborhoods; both schools turn in superb SOL results. Cary and Obama have much tougher clienteles and do not show as well. Nonetheless, Cary and Obama get better ED pass rates than do Fox and Munford.

Hard to believe, eh? Well, look at this. And notice how the SOL average is affected by the %ED.

image

The SOL average over all students punishes Cary and Obama for the relative poverty of the majority of their students, even though their ED students outscore the ED students at Fox and Munford. At the same time, Munford and Fox are rewarded for having small ED populations.

If we look at the averages of the ED and Not ED rates, a different pattern emerges.

image

In terms of those averages, all four schools cluster near the 75% accreditation benchmark. This calculation rewards Cary and Obama for the superior performance of their tougher-to-teach students and recognizes the inferior results at Fox and Munford.
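The mechanism is simple: the all-students rate is an average of the ED and Not ED rates weighted by the ED share of enrollment, while the ED/Not ED average weights the two groups equally. A minimal sketch with hypothetical numbers (not the actual school data):

```python
def overall(ed_rate, not_ed_rate, pct_ed):
    """All-students pass rate implied by the two subgroup rates."""
    return pct_ed * ed_rate + (1 - pct_ed) * not_ed_rate

def balanced(ed_rate, not_ed_rate):
    """Simple (unweighted) average of the two subgroup rates."""
    return (ed_rate + not_ed_rate) / 2

# A low-%ED school with weaker ED results still posts a high SOL average...
print(overall(60, 90, 0.15), balanced(60, 90))   # 85.5 vs 75.0
# ...while a high-%ED school with stronger ED results posts a lower one.
print(overall(70, 80, 0.85), balanced(70, 80))   # 71.5 vs 75.0
```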

On the math tests, Cary escapes most of the SOL penalty by way of a very high ED pass rate but Obama confirms the point. The Fox and Munford SOLs again shine, despite relatively lower ED pass rates.

image

The ED/Not ED average again tells a more balanced story.

image

The Board of Education bases its accreditation calculation on the SOL average over all students in a school or division. Indeed, the public measure of school quality is that same average. That rewards the schools and divisions with fewer ED students, whether or not they get good results with those ED students, and penalizes schools and divisions with large ED populations, even when they get better than average results with their ED students.

A plot of division average reading pass rates and ED/Not ED averages vs. the %ED illuminates the difference between the SOL average pass rate and the ED/Not ED average.

image

The fitted line to the SOL scores (red) suggests that the SOLs decrease on average by about 2.5% for a 10% increase in the ED population. The R-squared of 23% suggests a modest correlation. Indeed, we have seen that the correlation derives almost entirely from the effect of the ED group’s generally lower scores.

The green points show the averages of the ED and Not ED pass rates. The slope of the fitted line is much lower, minus 0.7% for a 10% ED increase, and the 2.3% R-squared value denotes only a trace of correlation. Said otherwise, this average is almost entirely unrelated to the percentage of ED students in the division.
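To see that the correlation is an artifact of composition, one can simulate divisions whose ED and Not ED rates do not depend on %ED at all. In this purely illustrative sketch (simulated data, not the VDOE numbers), the all-students average still correlates with %ED while the ED/Not ED average does not:

```python
import numpy as np

rng = np.random.default_rng(0)

# 100 simulated divisions: Not ED centered on 80, ED about 20 points
# lower, neither depending on the ED share of enrollment.
pct_ed = rng.uniform(0.1, 0.9, 100)
not_ed = 80 + rng.normal(0, 5, 100)
ed = 60 + rng.normal(0, 5, 100)

overall = pct_ed * ed + (1 - pct_ed) * not_ed   # all-students average
balanced = (ed + not_ed) / 2                    # ED/Not ED average

for name, y in [("all students", overall), ("ED/Not ED average", balanced)]:
    slope = np.polyfit(pct_ed, y, 1)[0]
    r2 = np.corrcoef(pct_ed, y)[0, 1] ** 2
    # The all-students line shows a slope near -2 points per 10% ED;
    # the ED/Not ED line shows a slope near zero and a negligible R-squared.
    print(f"{name}: {slope / 10:+.2f} points per 10% ED, R-squared = {r2:.0%}")
```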

The math and science data tell the same story.

image

image

An ideal grading system would present a horizontal fitted line with an R-squared of zero. The ED/Not ED average comes close and, in any event, is vastly less unfair than the current system.

BTW: The Board of Education had an even better system, the SGP (Student Growth Percentile), which was uncorrelated with economic status. The Board abandoned that system because it provided an accurate measure of teacher performance.

So we are left with a reporting system that punishes schools and divisions that serve larger populations of poorer students.

If that is a fair system, I am Santa Claus.

2019 Graduation Rates

The four-year cohort graduation rates are up on the VDOE Web site.

The Board of Education brags on its (inflated) “On-Time” rate that counts the “Board of Education-approved” diplomas. The Federales, to their credit, count only the Advanced Studies and Standard diplomas.

This year, the Federal rate for the state slipped from 88.8% to 88.7%; the Richmond average fell from 67.6% to 64.9%.

image

On these data, there’s no telling how much of the Richmond decline is the result of correcting the transcript issues.

Breaking out the Standard and Advanced rates, the state Advanced average fell from 52.0% to 51.5% while the Standard rate rose from 36.8% to 37.2%. The Richmond Advanced rate fell from 23.3% to 21.7%; the Standard, from 44.3% to 43.2%.

image

Here are the Richmond rates by school, sorted by the Federal rate.

image

And, to complete the picture, the rates for Richmond’s economically disadvantaged students (ca. 2/3 of the Richmond population).

image

Note: The zero Advanced rates here for Wythe and Franklin probably are artifacts of the VDOE suppression rules for groups of <10 students. The Franklin rate calculates as 29%; the Wythe rate could be as high as 5.5%.
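For what the suppression rule implies: if a count below 10 is suppressed, a reported zero could hide as many as 9 students, so the hidden rate is bounded by 9 divided by the cohort size. A quick sketch (the cohort size here is a hypothetical that reproduces the 5.5% bound, not the reported Wythe figure):

```python
def max_suppressed_rate(cohort_size, threshold=10):
    """Largest rate a suppressed (reported-as-zero) count could represent."""
    return 100 * (threshold - 1) / cohort_size

print(f"{max_suppressed_rate(164):.1f}%")  # 9/164 = ~5.5%
```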

“Help” for Petersburg II: Mirror of VBOE Incompetence

The Petersburg Public Schools have been laboring under the supervision of the Board of Education since at least 2004.

The results are appalling.

Here, to start, are this year’s accreditation calculations. (See this post for descriptions of the L1 and L2 benchmarks and the various score boosts.)

image

image

image

Or, in summary:

image

Petersburg reached this condition as the latest step in a march to abject failure. The decline was exacerbated in 2017 when the staff at AP Hill Elementary (now Cool Spring) were caught cheating.

And, please remember that the accreditation system is rigged to avoid this kind of failure.

The Board’s intervention at Petersburg goes back to a Memorandum of Understanding (“MOU”) in 2004:

https://calaf.org/wp-content/uploads/2019/08/image-42.png

The last two MOUs illuminate the Board’s feckless approach to improving the Petersburg schools. In November 2009, the Board took over the Petersburg system:

The VBOE and the VDOE will continue to assign a CAO . . . The CAO will have administrative authority over processes, procedures, and strategies that are implemented in support of the MOU and funded by targeted federal and state funds with subsequent review and approval by the Petersburg City School Board.

Then, in April 2016, the Board retreated to “coordinat[ion]” and “technical assistance”:

The Director of [Office of School Improvement] will coordinate with school division staff and other VDOE offices to develop a Corrective Action Plan for Petersburg Public Schools and to provide technical assistance in support of the MOU and Corrective Action Plan.

Notwithstanding this pusillanimity, the Board has the authority to compel Petersburg to fix its schools:

Va. Code § 22.1-8 provides: “The general supervision of the public school system shall be vested in the Board of Education.”

Va. Code § 22.1-253.13:8 provides:

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

Yet the Board has never sued Petersburg, or any other division, to compel compliance with the law. The reason is clear: The Board would have to tell the judge what the division must do to comply with the law, but the Board DOES NOT KNOW HOW TO FIX BAD SCHOOLS [Sept. 21, 2016 video starting at 1:48].

I think it is past time for the Board members (and their staff at VDOE) to be directed to work that is better suited to their capabilities.

Effects of “Help” for Petersburg

The Board of “Education” has been “supervising” Petersburg since at least 2004.

https://calaf.org/wp-content/uploads/2019/08/image-42.png

If anything, all that “supervision” has let Petersburg’s performance continue to decay. In summary:

https://calaf.org/wp-content/uploads/2019/08/image-43.png

https://calaf.org/wp-content/uploads/2019/08/image-46.png

Notes:

  • Statewide, economically disadvantaged (“ED”) students underperform their more affluent peers (“Not ED”) by about 20 points on the SOL pass rates. Yet “the socio-economic composition of schools explains far more of the differences in student performance between schools than do other school factors.” That makes the SOL average an unfair measure for divisions, such as Petersburg, with large ED populations. Accordingly, the graphs here and below show the performance of both the ED and Not ED students.
  • The new math tests in 2012 and the new English tests in 2013 dropped pass rates statewide; the new math tests in 2019 raised pass rates statewide by 3.4% for Not ED students and 6.6% for ED.
  • The red lines are the nominal levels for accreditation. VDOE “adjusts” the pass rates in order to accredit some schools that come nowhere near those thresholds.

Turning to the individual Petersburg schools: The 2017 data for AP Hill, now Cool Spring, are missing because the school (the staff, not the kids) was caught cheating. The splendid numbers before and dismal performance since 2017 show much the same pattern as Richmond’s GW Carver.

image image

To the point here, with the false aura of the cheating removed, the data show that the school is unable to prepare half its students to pass the SOLs.

Note in passing: Last year, the Board of Education accredited Cool Spring in English based on the three-year average. Lacking 2017 data because of the cheating, the Board ignored 2017 and reached back to the (cheating-enhanced) 2016 pass rate to compute an average that met the “Level 2” threshold.

Pleasants Lane, formerly JEB Stuart, showed some improvement in the math scores this year (as did the state average, with the help of the new, failure-averse scoring system), but the reading rates slid.

image image

Lakemont, formerly RE Lee, painted a less rosy picture.

image image

Walnut Hill, in contrast, stayed in the running for accreditation this year.

image image

There are four middle schools in the database: Peabody, Vernon Johns Jr., Old Vernon Johns, and Vernon Johns. The relationship between these is not clear. However, only Vernon Johns has data after 2017 so we’ll go with that:

image image

The high school’s students suffer from declining performance.

image image

It’s hard to see any benefit here from sixteen years of Board of Education “supervision.” To the contrary, the ongoing, dismal failure of the Petersburg schools testifies to the abiding fecklessness of that Board.

Your tax dollars at “work” (this year, $1,939,750 for “school improvement”).

Middle and High School Accreditation Analysis

Here, to start, are the 2019 English Accreditation data for Richmond’s middle schools.

image

(For details regarding the benchmarks and adjustments, see the earlier post.)

Franklin, a selective school whose pass rates include results from both middle and high school students, made L1 on pass rate alone. Adjustments brought Hill to exactly the L1 benchmark. Binford and Brown adjusted to L2. None of the other four came close to L2.

The math data look a bit better.

image

Brown adjusted to within one point of the L2 benchmark (66), was at 65 on the three-year average, and did not decrease its failure rate by 10% from a pass rate of at least 50%, but was awarded L2 anyhow. Even more entertaining, Elkhardt-Thompson adjusted to 62, was at 55 on the three-year average, and failed to reduce its failure rate by 10%, but also wound up at L2. It looks like near misses count in horseshoes, hand grenades, nuclear warfare, and accreditation.
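As best I can reconstruct them, the routes to L2 mentioned above are: meet the L2 benchmark on the adjusted rate, meet it on the three-year average, or cut the failure rate by 10% from a base pass rate of at least 50%. Here is a simplified sketch of that logic (the full regulation has more wrinkles; see the earlier post):

```python
def makes_level_2(adjusted_rate, three_year_avg, prior_rate, current_rate,
                  l2_benchmark):
    """Simplified test of the three L2 routes described in the text."""
    if adjusted_rate >= l2_benchmark or three_year_avg >= l2_benchmark:
        return True
    # Failure-rate route: base pass rate of at least 50 and a 10%
    # relative reduction in the failure rate.
    if prior_rate >= 50:
        return (100 - current_rate) <= 0.9 * (100 - prior_rate)
    return False

# Brown: adjusted 65, three-year 65, no 10% reduction (the prior and
# current rates here are hypothetical values that fail that test).
print(makes_level_2(65, 65, 55, 58, l2_benchmark=66))  # False, yet VDOE says L2
```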

In science, four schools made L1 on the pass rate; the other four did not even come close to L2.

image

The overall picture:

image

Hill made L1 on all the major measures but wound up at L3 because of problems with achievement gaps of its black, poor, and disabled students (the standard says the rating is L3 if two or more student groups are at L3). Similarly, Brown, although credited with L2 on the English pass rate, flunked English because of achievement gaps (also the black, poor, and disabled groups).

Turning to the high schools:

image

The English pass rates were excellent at the selective schools and more than good enough at Marshall and TJ. Huguenot made L2. Wythe missed L2 on the pass rate but made L1(!) with a 75 on the three-year average. Armstrong barely made 50%.

The math numbers were not so good.

image

Among the standard high schools, only Huguenot made L1, with the others languishing.

The science data painted a similar picture but with all the standard high schools below the L2 benchmark.

image

Marshall (65 this year and 54 on the three-year average) made L2 by reducing the failure rate by >10%.  TJ (64 this year, 60 last year, 63 on the 3-year average) got a mystery boost to L2.

At the bottom line, only the selective schools were accredited. None of the standard high schools made L2, despite the hidden help they all obtained by stealing the scores of Maggie Walker students.

image

Elementary Accreditation, a Deeper Dive

Having seen the contours of the 2019 accreditation data for the Richmond schools, let’s take a look at the underlying numbers.

Here is the Big Picture for the English tests at the elementary schools:

image

The data are from the “School Quality Profiles.”

The green line marks the “Level 1” (at standard) benchmark. The gray is the “Level 2” (near standard) benchmark. Anything less is “Level 3” (flunking). A school at Level 1 or 2 is “Accredited.” Level 3 earns “Accredited with Conditions.” A school at Level 3 that “fails to adopt and implement school division or school corrective action plans with fidelity . . .” may be designated by the board as “Accreditation Denied.” (Translated: To be denied accreditation, a school or division must marinate in Level 3 and tell the Board of Education to go jump in the James.)

The blue bars are the average pass rates for each school.

Those pass rates are boosted by the magenta “Remediation Recovery” numbers. The orange boost is the percentage of students who, in VDOE shorthand [pick a school, scroll down to “Academic Achievement,” and click “show explanation”] “improved compared with prior performance on state English tests.” Similarly, the gold boost indicates the “[p]ercent of English-language learners passing state English-language proficiency tests or making progress toward English-language proficiency.”

Make what you will of that system; these are the 2019 results in English for Richmond’s elementary schools.

As to math, there is no EL boost but the Level 1 benchmark drops to 70%.

image

Note: These data are sorted by the reported total while the graph shows the sum of the parts. All the numbers from VDOE are rounded to whole percentages, producing roundoff errors and bumps in the math graph (which is sorted by the reported numbers).

image image
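The bumps are just what rounding does to sums. A sketch with hypothetical components (pass rate plus the remediation, growth, and EL boosts):

```python
# VDOE reports each component rounded to a whole percent, so the rounded
# parts need not sum to the rounded total.
parts = [62.4, 3.4, 4.4, 1.6]             # hypothetical unrounded components
total = sum(parts)                        # 71.8, reported as 72
rounded_parts = [round(p) for p in parts] # 62 + 3 + 4 + 2 = 71
print(round(total), sum(rounded_parts))   # 72 vs 71: a one-point bump
```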

For science, there are no adjustments so only the pass rates matter. The L1 benchmark again is 70%.

image

To summarize the accreditation results:

image

Looking at the data above, the conclusion is clear: “Accreditation Denied” has gone away; “Accredited with Conditions” is the new, more Superintendent-friendly euphemism that means much the same thing. 

Thus, if we read the new system for what it means, not what it says, it might even be taken as an improvement on the old, bogus system.  Let’s call it the new, bogus system.

But, then, “Board of Education” is itself a euphemism, if not an outright lie (for example, see this and this and this and this) so there’s no reason to expect honest reporting from their “accreditation” process. 

Even more to the point, that Board has demonstrated (and members have admitted [Sept. 21, 2016 video starting at 1:48]) that they don’t know how to fix awful schools such as those in Petersburg (and, of course, in Richmond).  So the labels they apply to failing schools provide bureaucratic camouflage: Those “Accredited with Conditions” schools and our Board of “Education,” can put on a happy face while they persist in failing to educate far too many children.

Richmond High Schools

Having looked at the 2019 pass rates of Richmond’s elementary schools and middle schools, let’s turn to the high schools.

Note: The Board of Education has designed its SOL reporting to discriminate against Richmond and other divisions with large populations of economically disadvantaged (“ED”) students. Those students underperform their more affluent peers (“Not ED”) by about 20 points on average. As a result, the SOL averages for divisions such as Richmond (ca. 2/3 ED) are lowered relative to divisions with similar ED and Not ED pass rates but fewer ED students. Fortunately, the database provides both ED and Not ED pass rates.

The End of Course (“EOC”) tests are primarily administered in the high schools. The standard diploma requires that the student pass the EOC tests in two English courses and one math course.

To start, here are the EOC reading pass rate averages by school for Not ED (more affluent) students.

image

All three of the selective schools aced these tests. Of the mainstream high schools, only TJ met the nominal benchmark for accreditation (75%). And notice, this is the result for the more affluent (and presumably higher-scoring) students.

The Not ED pass rates for the mainstream high schools were reduced by the loss of some better-performing students to the three selective schools. At the same time, the rates for those five schools were boosted some by the scores of Maggie Walker students; those are reported at the high schools in those students’ home districts, not at Walker. VDOE does not report the magnitude of those effects.

Turning to the ED pass rates:

image

Again the selective high schools turned in superb numbers. Marshall was the pick of the mainstream high schools, 4.3 points below the nominal benchmark for accreditation; the other schools worked together to lower the Richmond average to 61.4%.

Again, the selective schools skimmed some of the most capable ED students from the pool. That cannot have affected the division average. To the contrary, the Richmond average was boosted some by the rip-off of the Maggie Walker results.

Even so, the averages told a sad story about the state of our high schools.

image

The Not ED minus ED data showed a curious pattern.

image

The Richmond average difference was inflated in some measure by the Maggie Walker swindle. The selective schools showed the effect of attracting some of the more capable ED students. The mainstream high schools were all over the place.  The TJ difference was as astounding in one direction as the Marshall in the other.

Except perhaps at Franklin, the numbers tested were not so small that a few high or low scores could have produced a large fluctuation in the pass rate.

image

The nonselective schools were showing wildly variable performance, either in their teachers or in their learners. Or both.

Turning to the math tests and the Not ED pass rates:

image

The nominal accreditation benchmark here is 70; only the selective schools met it.

As to the ED pass rates, none of the mainstream high schools came close to the benchmark. The Richmond average barely broke 50%.

image

There is a complication: The Richmond math averages include results from middle schools. Those offer advanced classes, including some in the high school math subjects, that allow some of their better students to get a jump on the graduation requirements. The entire picture looks like this:

image

image

In light of that selection process, it is no surprise that the middle schools outscored the high schools.

image

The state averages are subject to the same issue. Thus, the comparison with the overall Richmond averages should be a fair one, although it does not directly measure the average high school performance.

image

Richmond has a very large math problem.

As to the Richmond average, the Not ED/ED difference was within the realm of reason. Otherwise, the differences form another unexplained spectrum.

image

And, again, the numbers tested were not so small as to explain all the scatter.

image

On these data, the scatter in the Not ED/ED pass rate differences will remain a puzzle.

However, the message of the averages (and of the pass rates of too many schools) is clear.

First Look at Accreditation

The 2019 accreditation results are up.

Notes:

  • Please recall that these ratings are so thoroughly rigged as to be bogus.
  • VDOE calls them the “2019-2020 Accreditation Ratings” because they supposedly apply to the current school year, not the 2019 school year for which they (purportedly) measure performance.
  • In the interest of honest reporting, the tables below say “Flunked” where VDOE says “Accredited With Conditions.” That official euphemism allows everybody to say a school is “accredited” when it actually flunked, even by the current, rigged standards.

Here, then, are the school totals for the state, Richmond, three peer cities, and poor Petersburg:

image

Or, expressing the numbers as percentages of the totals:

image

Richmond is up this year as to the percentage of schools accredited, from 43.2% to 45.5% (one school difference), and also improved on the percentage flunking, from 54.5% to 50% (two schools).

Caveat: The table below shows that both of the schools “Accredited Pending Review of Alternative Plan” deserve to flunk.

Richmond had 2.3% of the schools in Virginia this year and 16.7% of the schools that flunked, about seven times its share.

Finally, here is a summary of the “indicators” for the Richmond schools as to the basic subjects as well as truancy, graduation rate, and dropout rate.

image

A reader (THE reader?) tells me to say what L1 et al. mean. L1 means the school met the requirement, such as it is; L2 means it met a relaxed requirement; L3 is the laggards. It’s more complicated than that; details here.

Where a school shows all L1 here but nonetheless flunked, e.g., Albert Hill, you can dig into the complete table or the “profile” to see what drove the rating. Albert Hill is all L1 on this table but flunked on its achievement gaps.

Despite appalling pass rates this year, Fairfield Court was accredited on a three-year average (one that included two years of excellent numbers that almost certainly were based on cheating). The current school year should resolve that anomaly.