It’s Performance, not Poverty

The popular sport in the Richmond “education” establishment has been to blame the kids for the awful performance of our schools.  We particularly hear about our less affluent (the official euphemism is “economically disadvantaged”) students.

We have some data on that.  Again.

Here are the average reading pass rates by grade of the economically disadvantaged (“ED”) and non-ED students in Richmond and the state.  “EOC” indicates the End of Course tests that generally must be passed to receive “verified credits” that count toward a diploma.

image

Both in Richmond and on average, the ED group underperforms the non-ED group.

To the point here, the Richmond ED students underperform their peers in the state averages, as do the non-ED Richmond students.

We can calculate the differences between the Richmond groups and the state averages to measure that underperformance.

image

Here we see Richmond’s ED students underperforming their peers by about 7 points in elementary school, while our non-ED students average some 9 points below that group statewide.  In middle school the difference increases to roughly 19 points for the non-ED students and 25 points for the ED group.
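
For those who want to check the arithmetic, here is a minimal Python sketch of that gap calculation.  The pass rates in it are illustrative placeholders, not the actual VDOE numbers.

```python
# Gap = Richmond pass rate minus state pass rate, per grade and per group.
# Illustrative numbers only -- substitute the actual VDOE pass rates.

richmond = {"ED":     {"3": 55, "4": 57, "5": 58, "6": 40, "7": 38, "8": 37},
            "non-ED": {"3": 70, "4": 72, "5": 71, "6": 55, "7": 53, "8": 52}}
state    = {"ED":     {"3": 62, "4": 64, "5": 65, "6": 63, "7": 62, "8": 61},
            "non-ED": {"3": 79, "4": 81, "5": 80, "6": 78, "7": 77, "8": 76}}

for group in ("ED", "non-ED"):
    gaps = {grade: richmond[group][grade] - state[group][grade]
            for grade in richmond[group]}
    print(group, gaps)
```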

The math test results show a similar pattern. 

image

image

These data tell us two things:

  • Richmond students, both ED and not, underperform their statewide peer groups on average; and
  • The average SOL performance of Richmond students, ED and not, deteriorates dramatically in middle school.

As I have demonstrated elsewhere, the large percentage of ED students in Richmond (64% in 2017) does not explain our low pass rates.  So we are left with (at least) two possible explanations: Either Richmond students are less capable on average than students statewide or our schools are less effective than average.
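
Before weighing those two, a quick arithmetic check shows why the ED share alone cannot carry the explanation.  A minimal sketch, assuming (purely for illustration) state peer-group pass rates of 66% for ED students and 84% for non-ED students; only the 64% ED share comes from the data:

```python
# The pass rate Richmond would post if its ED and non-ED students each
# performed at the state average for their group.  The two state rates
# below are illustrative assumptions, not actual VDOE numbers.

pct_ed = 0.64                # Richmond's ED share in 2017 (from the post)
state_ed_rate = 66.0         # illustrative state average, ED students
state_non_ed_rate = 84.0     # illustrative state average, non-ED students

expected = pct_ed * state_ed_rate + (1 - pct_ed) * state_non_ed_rate
print(f"Expected pass rate at state peer-group averages: {expected:.1f}%")
# A Richmond rate well below this blended figure is a shortfall that the
# ED percentage, by itself, cannot explain.
```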

If Richmond’s students were just less capable, it would explain the low elementary school scores but not the drop in pass rates after the fifth grade.

The plummeting performance of our students when they reach middle school tells us there’s a (big!) problem with our middle schools.  And there’s every reason to think that the school system that has terrible middle schools might also have problems with its elementary schools.

To the same end, notice how the ED performance by grade tracks the non-ED performance in Richmond.  We saw the same thing last year:

image

image

Those parallel curves are fully consistent with the notion that performance variations by grade are driven by the teaching, not by the capabilities of the students.

As well, Friar Occam would suggest the simple explanation: Substandard elementary schools and awful middle schools in Richmond, not thousands of dunderheads (both poor and more affluent) attending those schools.

There’s an alternative that we might consider: I keep hearing that some of our elementary schools cheat on the SOLs.  We know for sure that’s happened in the past in Richmond and this year in Petersburg.

If, in fact, the Richmond elementary pass rates are boosted by cheating, it absolves the middle schools of some or all of the decline in scores between the fifth and sixth grades, but it leaves the conclusion here intact; it merely moves the elementary schools toward the “awful” category of our middle schools.

2017 Accreditation Data

The 2017 accreditation data are up on the VDOE Web site.

(Just to keep us confused, they call these the accreditation ratings for the 2018 school year.  But, to be clear, the ratings are based, at least in theory, on the 2017 SOL pass rates.)

I have extracted the Richmond data and posted them here.

The change in Richmond is that Stuart, where the scores improved remarkably this year, now is fully accredited.

Here are the overall totals.  I have bowdlerized the wordy, softened classifications of failure.  For example, “Partially Accredited: Approaching Benchmark-Pass Rate”  is “Close” in my table.  See the VDOE Accreditation page for the long versions.

image

In sum:

image

Or, expressed as percentages of the numbers of schools:

image

image

image

That TBD in Richmond is AP Hill.  The VDOE spreadsheet reports a 72 at Hill in English, three points short of the criterion.  Stay tuned to see whether the secret “adjustments” can turn that into a 75 and give us at least one accredited middle school (not counting Franklin Military, which is fully accredited as to both its middle and high school grades).

Indeed, the reading pass rate at Hill this year was 70.7%, down 0.3 points from 2016; writing was 68.3%, up by 6 points.  It looks like those scores already have enjoyed some “adjustments” to get up to the reported 72.
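
As a rough check on that 72: the English accreditation rate is built from the reading and writing results, and even a straight average of Hill’s two reported rates falls short.  (VDOE’s actual combination rule and “adjustments” are not spelled out here, so the straight average below is only a sanity check, not the official formula.)

```python
# Sanity check on A.P. Hill's reported English accreditation rate of 72.
# VDOE's real formula weights and "adjusts" the component scores; a straight
# average is used here only to show the gap.

reading = 70.7
writing = 68.3
print(f"Straight average: {(reading + writing) / 2:.1f}")  # 69.5 -- short of 72
```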

Petersburg: Compounding Failure?

The headline in the Progress-Index tells us “McAuliffe touts Petersburg schools’ improvements” on a visit to Petersburg.

The “improvements” listed in the story? 

McAuliffe mentioned that the graduation rate for Petersburg High School increased last year, as well as SAT and ACT scores.

The untold story:

image

image

image

image

  • We can wonder about the lovely numbers at A.P. Hill in previous years; for sure there are no numbers this year because they were caught cheating.
  • Of the other three elementary schools, only Walnut Hill made the 75% accreditation mark for reading and the 70% cutoff for math this year.  The (failing) elementary reading scores at the other two look to be improving; the (also failing) math scores, not.
  • It’s difficult to reconcile an improving graduation rate and rising SAT and ACT scores with declining high school pass rates.
  • Peabody Middle remains lost in a miasma of failure; Johns Jr. looks to be trying to join it.
  • Petersburg achieved this record of failure after fourteen years of “help” from the Board of “Education.”

The Governor “wanted top-to-bottom reform.”  The reform the article describes, however, runs to nutrition programs and social services.

There is a lot of poverty in Petersburg.  In 2016-17, 61% of the students were classified as “economically disadvantaged.”  Nutrition and social services might well help (Were they not needed before now?) but they hardly amount to “top-to-bottom reform” of the failed school system. 

So it looks like any “reform” will have to be found in the 63 slick pages of the Plan.

There’s lots of feel-good in that document.  For example, at page 15 it says they will:

  • Guide and support teachers; and
  • Provide relevant and sophisticated support – including professional development opportunities, as well as ongoing coaching and collaboration – geared to [the] teachers’ and leaders’ needs.

But you’ll have to read to page 46 to find anything that even brushes up against “reform”: 

Measure of Success: All PCPS principals have adopted practices to collect and use evidence of teacher instructional performance.

If that means “hold the principals and teachers accountable,” it might indeed be a (partial) harbinger of reform.  If it means “continue using the existing, mendacious evaluation system,” there’s no hope for the students who are being damaged by the awful Petersburg schools.

And notice: They merely have to adopt practices.  They don’t actually have to hold anybody accountable for failure to do the job.

The (faint) promise of p.46 aside, the overwhelming problem here is the general lack of accountability, not only as to teachers and principals but on up the chain:

  • If Petersburg fails to meet the 2022 deadline of the Plan, it can say it did everything the State demanded.
  • The Board of “Education” has never been held accountable for its fourteen-year record of failure in Petersburg and there is no sign that the current Plan (or, more to the point, the Governor) will remedy that failure of leadership.
  • The Governor will be gone next year.

Your tax dollars at “work.”

Smaller Classes?

Table 17b in the 2016 Superintendent’s Annual Report (the latest available) includes data by division on the total numbers of “instructional positions” and students (“ADM”). 

The table contains data for principals and assistant principals, teachers (including technology instructors), teacher aides, guidance counselors, librarians, and district-wide instructors based on positions reported in school divisions’ Annual School Reports.  District-wide positions include Summer School, Adult Education, Pre-Kindergarten, and other non-regular day and non-LEA instructional positions.

The Average Daily Membership (ADM) shown in this table reflects all pupils (Pre-K through Post-graduate) served in the school division at the end of the year.

The (very nice) VDOE front end to the SOL database offers a plethora of data, including the division average SOL pass rates for 2016.

Combining those datasets for the reading tests gives the following:

image

The red diamonds are, from the left, the peer cities Newport News, Hampton, and Norfolk.  Richmond is the gold square.  The green diamond is Charles City; the blue diamond is Lynchburg.

The division average teacher/ADM ratio is 11.1%.  Richmond is 11.5%.  The average pass rate is 77.4%.

The fitted line suggests that the pass rate by division goes down as the Teacher/Pupil ratio increases, i.e., as the class size drops.  The R-squared, however, tells us that the two variables are only trivially correlated.

In short, by this measure smaller classes don’t perform any better.
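
For the curious, here is a minimal sketch of that calculation: fit a least-squares line of pass rate on teacher/ADM ratio and report the R-squared.  The division numbers below are illustrative stand-ins, not the actual Table 17b / SOL data.

```python
# Fitted line and R-squared for pass rate vs. teacher/ADM ratio.
# Illustrative division values -- substitute the merged Table 17b / SOL data.
import numpy as np

ratio = np.array([0.095, 0.100, 0.105, 0.110, 0.115, 0.120, 0.125, 0.130])  # teachers/ADM
passr = np.array([78.0,  74.0,  81.0,  76.0,  79.0,  72.0,  77.0,  75.0])   # reading pass rate

slope, intercept = np.polyfit(ratio, passr, 1)          # least-squares line
predicted = slope * ratio + intercept
ss_res = np.sum((passr - predicted) ** 2)               # residual sum of squares
ss_tot = np.sum((passr - passr.mean()) ** 2)            # total sum of squares
r_squared = 1 - ss_res / ss_tot

print(f"slope = {slope:.1f}")          # negative slope: higher ratio, lower pass rate
print(f"R-squared = {r_squared:.3f}")  # near zero: only trivially correlated
```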

The math data tell the same story.

image

The R-squared rises to 3.4%: Still not enough to bet on and, in any case, the correlation still goes in the wrong direction.

The average pass rate is 78.4%.

Finally, the average of the averages of the five subjects:

image

Unfortunately, the available data do not give us the teacher numbers by grade, so we can’t refine the analysis down to the grade and school level.  Even so, these data suggest that the key to improved learning lies somewhere other than in class size.

VGLA: Rest in Ignominy

NOTE: VDOE has scrubbed most of the VGLA materials from its Web site, breaking some of the links in my draft of this post.  As of the posting date, all the remaining links are good.  I have posted the 2017 VGLA Manual here as a partial remedy for this Orwellian information vacuum.

The Virginia Grade Level Alternative, aka the VGLA, “is an alternative assessment for the Standards of Learning (SOL) Reading tests in grades 3-8. The VGLA Reading assessment is available only to Limited English Proficient (LEP) students in grades 3-8 who meet the eligibility criteria” (emphasis in original).

More specifically: “LEP students who have attended school in the United States for less than 12 months may receive a one-time exemption for the SOL Reading test in grades 3 through 8.”

Prior to 2012 in math and 2013 in reading, the VGLA was locally graded and was widely abused as a mechanism to boost SOL scores.  Since then, VDOE (through a contractor) has audited some of the reading “collections of evidence” (ca. 20% of them, it seems) but the grading has remained local. 

The VDOE database can tell us about the performance of students in the VGLA and in the regular SOL tests.  Let’s start with the third grade reading tests.

image

The SOL scores dropped statewide in 2013 with the advent of the new, tougher English SOL tests.  Here we see the third grade scores later recovering to near the 75% accreditation benchmark.

The abuse of the VGLA before 2011 was so outrageous (as you can see here) that the General Assembly passed a law in 2010 to curtail it.

The Board of “Education,” reacting to the new law, restricted the VGLA to reading and LEP students beginning in 2013.  The scores, however, remained high, settling a bit this year to 95.4%. 

There are at least three possible explanations for these phenomenal, post-2013 pass rates:

  • Most of the LEP students are from Lake Wobegon;
  • The LEP tests are very easy; or
  • Schools have been misclassifying kids as LEP (or cheating in some other manner) in order to boost their pass rates.

The fourth and fifth grade numbers show much the same pattern.  Note: No data for the fourth grade for 2017; I haven’t found the reason. 

image

image

The middle school data show lower VGLA pass rates in the sixth grade and the high VGLA scores vanishing in the seventh and eighth grades.  Note: No data for the sixth and seventh grades for 2017.

image

image

image

Notice also the general absence of anomalous pass rates prior to 2013 in the middle school data.

The participation counts show that either there are many more 3d grade LEP students or, more likely, the schools have been using their one year of VGLA eligibility at the first opportunity.

image

These curves also show the 2011 effect of the 2010 statute requiring that the superintendent and the school board chairman certify every student taking the VGLA.  The drops in 2013 correspond with the new tests and the restriction of the reading VGLA to LEP students.

The fall enrollments show larger numbers of ESL students in the lower grades but less than the ca. 3:1 preponderance of third grade VGLA test takers.  I didn’t find data for the LEP populations.

image

The math VGLA data end at the advent of the new test in 2012.  The third grade data do not suggest abuse of the VGLA during the period of those data.

image

In the higher grades, some anomalous VGLA scores are evident, esp. in the middle school grades.

image

image

image

image

image

In contrast to the reading tests, here we see unusually high pass rates prior to the advent of the new tests in the middle school grades, but not so much in the elementary schools.

Plainly, there was a whole lot of cheating going on before 2011.  These data do not immediately suggest the reason for the relatively cheating-free areas.

Let’s turn to the anomalously high reading VGLA pass rates persisting in the elementary grades to the present. 

If the LEP tests were very easy in the elementary grades, one might think they would be very easy for the middle school grades.  At the same time, if elementary schools were cheating, we can wonder why the middle schools would not be doing the same thing.  For sure, it looks like they were in math, prior to the abolition of the math VGLA.

As to the mechanism for cheating: It doesn’t have to be misclassification.  As we have seen in Petersburg, the proctors can be helpful to the students taking the SOL tests.  In the case of the VGLA, it is particularly easy to be helpful because the data are collected throughout the year:

The VGLA Reading assessment provides the eligible LEP student opportunities throughout the school year to demonstrate his/her knowledge and achievement through a non-traditional mode of testing. The student creates work samples that demonstrate his/her knowledge and skills of the Reading SOL for his/her grade level of enrollment. These work samples are assembled as the Collection of Evidence (COE).

2016-2017 VGLA Implementation Manual at p.7.  See the note above.

I’m told that the technique here is to drill the kid on one item, test that item, and repeat as needed.

One thing for sure: The Board of “Education” had to know that the VGLA was being gamed before 2010 and they did exactly nothing until the General Assembly intervened. 

More recently, there’s no sign they did anything about the post-2013 inflated pass rates until this year when the feds made them drop the test.  That Board has to be corrupt or a bunch of very slow learners.

Looking at what they’ve done in response to the federal ban, in light of their earlier behavior, I vote for “corrupt.” 

Here is the Superintendent’s memo regarding the VGLA termination.

As a reminder, SOL testing may be discontinued for EL [“English Learner”] students who struggle with reading the test items after the student has responded to five items (not including the sample items). Students who answer at least five items will be counted as participants in the Reading test for purposes of federal accountability. The Standards of Accreditation (SOA) adjustment remains available for students who are currently classified as EL and have been enrolled in a Virginia public school for fewer than 11 semesters, as does the one-time exemption in Reading for recently arrived EL students in grades 3 through 8.

So, while we wait to see whether the locals can find another way to game the new system, we have an official window of opportunity:  As long as the proctor can count to five, “EL” students can duck the test but be counted toward the required participation rate(!).  Perhaps those kids can no longer be abused to boost the scores, but they don’t have to harm the scores either.

Your tax dollars at “work.”

Chumming the Neighborhood for Criminals

An observant neighbor called the police Labor Day afternoon because of the fellow opening cars in the 42d St. parking lot.

image

The cops arrived quickly, chased the fellow, and caught him.  Unfortunately (or, in a way, fortunately), nothing seemed to be missing from either of the cars he was seen to enter.

Stay tuned for news about that.

In the meantime, please recall that “Theft from Motor Vehicle” is the most commonly reported offense in our neighborhood and the one that is entirely preventable.

As an update to the earlier report, here are the high-crime blocks in Forest Hill for this year, up to Labor Day:

image

Or, as percentages of those totals:

image

As you see, the problem runs to the overflow parking on Hillcrest and, of course, to the 41st St. lot and nearby streets.

Much of the trouble is that park visitors leave stuff in their cars and leave the cars unlocked.  Sometimes they leave stuff in view and lock the car.

image

In either case, they are chumming our neighborhood for criminals.

Please join me in asking the Park folks and the City why there are no LARGE signs at the park entrances and in the parking areas to warn people of this problem. 

I’m told there is a camera in the 21st St. lot that has had a good effect.  What would you think about one in the 42d St. lot, another at 41st St., and another along the 4200 block of Riverside Dr.?

“Help” From the Hapless

Petersburg has been operating under Memoranda of Understanding (“MOUs”) issued by the Board of Education since at least 2004.  Let’s take a look at what that state supervision has accomplished.

To start, here are the Petersburg and state division average reading pass rates for the period of the VDOE database.

image

Here is the difference between those two curves.

image

As of this year, the state average was 4.6 points above the benchmark for accreditation in English; Petersburg was 14.9 points below.  Thirteen years of “help” (in fact, command and control) from the state didn’t help.

How about math?

image

image

The accreditation benchmark here is 70%.  The 2017 state average is 9.2 points above that; Petersburg is 17.8 points below.

Let’s not fill up the Internet with graphs from the other three subjects; we’ll go directly to the average of all five:

image

image

The Board of “Education” could sue Petersburg for this ongoing, massive violation of the Standards of Quality:

§ 22.1-253.13:8. Compliance.

* * *

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

They have not sued any division, not even Petersburg.

Why not? 

If they sued, the Board would have to tell the judge what the division must do to meet the Standards of Quality.  Yet Petersburg (it seems) has done everything the Board demanded for thirteen years now and it hasn’t worked.  Manifestly (and as they themselves admit), the Board does not know (Sept. 21, 2016 video starting at 1:48) how to fix these awful schools.

(I have not embraced the alternative explanation: Petersburg did not do what the Board required.  In that case also, the Board would have to know how to fix those schools in order to sue.) 

In the face of the Board’s primal incompetence, the Governor should long ago have fired the members for malfeasance.  This is a massive failure of the government to govern.

$104.6 million (this year) of your tax dollars at “work.”

Middle Schools by Grade and Year

Here are Richmond’s middle school reading and math pass rates by grade and by year.

AP Hill:

image

image

Binford:

image

image

Elkhardt/Thompson:

image

image

Franklin.  Please recall that Franklin has a selected population so its numbers are not directly comparable to the pass rates at the other schools.

image

image

Henderson:

image

image

Brown:

image

image

MLK:

image

image

Finally, Boushall:

image

image

SOL by School, 2016 and 2017

Here are the Richmond SOL pass rates by subject and by school for 2016 and 2017.  First the elementary schools.

BTW: The accreditation benchmark for English is 75; for all the other subjects, 70.

image

image

Next, the middle schools.  Please recall that Franklin has high school grades as well as middle, and has a select student population, so its numbers are not directly comparable to the other schools.

image

image

image

image

image

Last, the high schools.  Franklin, Community, and Open have select populations; Franklin also has middle school grades.  None of those schools’ performance is directly comparable to the five mainstream high schools.

image

image

image

image

image

2017 SOL v. Poverty

The excuse we often hear for Richmond’s poor performance on the SOL tests is poverty.

VDOE has data on that.  They define a student as “economically disadvantaged” if that student “1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness.”  They formerly had a handy database front end for enrollments; that looks to be under repair just now but they have the fall division enrollments in a pair of spreadsheets here.
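
That definition is just an OR over four conditions.  In sketch form (a hypothetical function, mirroring the VDOE wording):

```python
# Sketch of VDOE's "economically disadvantaged" flag: a student qualifies
# if ANY one of the four conditions holds.  Hypothetical function name.

def is_economically_disadvantaged(free_reduced_meals: bool,
                                  receives_tanf: bool,
                                  medicaid_eligible: bool,
                                  migrant_or_homeless: bool) -> bool:
    return (free_reduced_meals or receives_tanf
            or medicaid_eligible or migrant_or_homeless)

print(is_economically_disadvantaged(False, False, True, False))  # True
```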

Juxtaposing the 2017 Division pass rates with the ED percentage of the enrollment, we see the following for the reading subject area:

image

The R-squared of 35% tells us that 35% of the variance in the pass rates is predictable from the %ED.  That is, to a considerable degree, the pass rates and %ED are related.

(Remember that correlation does not imply causation, so these data don’t say that increasing the ED population causes some portion of the decline in the pass rate.)
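
In sketch form, an R-squared of 35% just means the squared correlation between %ED and pass rate is 0.35.  The division values below are illustrative, not the actual 2017 data:

```python
# R-squared as the share of pass-rate variance predictable from %ED.
# Illustrative division values -- the actual 2017 data give r**2 of about 0.35.
import numpy as np

pct_ed = np.array([0.25, 0.35, 0.45, 0.55, 0.61, 0.64, 0.70])
passr  = np.array([80.0, 88.0, 70.0, 82.0, 68.0, 78.0, 65.0])

r = np.corrcoef(pct_ed, passr)[0, 1]   # Pearson correlation
print(f"r = {r:.2f}, R-squared = {r*r:.2f}")
# Correlation, not causation: a high %ED predicts, but does not
# demonstrably cause, a lower pass rate.
```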

Statistics or no, the graph tells us that Richmond (the gold square) grossly underperformed the peer jurisdictions (red diamonds; from the left, Hampton, Newport News, and Norfolk) and, indeed, underperformed all the Virginia divisions except for Greensville County and Danville, higher poverty or not.

Our math performance was nearly as dismal, beating out only poor Petersburg and, to the rounded value, tying Danville.

image

Similarly, the five-subject average.

image

For sure, we have a lot of students with low family incomes.  For sure, those kids on average don’t perform as well as students from more affluent families.  But that does not come close to explaining, much less excusing, the awful performance of our schools.