Accreditation Inflation

The initial accreditation results tell us that 18 of 44 Richmond schools were fully accredited this year.  There remain 26 schools that, even by the highly flexible standards of the Board of “Education,” have a problem.

A closer look at the data shows a situation that is worse – in a couple of cases, much worse – than even those numbers suggest.

ESH Greene Elementary

Greene is fully accredited this year, despite a sad history of decline.

image

Recall that the accreditation benchmark is 75% for English and 70% for the other subjects.  Here we see Greene underwater in all five subjects.  Indeed, by any honest measure, Greene would have run out its four years of failure last year.

But all that was before the “adjustments” provided some remarkable enhancements.  Here are this year’s results:

image

Some 49.5% of the Greene students flunked the reading SOL, i.e., the actual pass rate was 50.5%, yet the “adjustments” produced a score six points above the 75% benchmark.  (They stopped reporting writing in elementary school three years ago, so we are spared the ambiguity of averaging the reading and writing pass rates to get the English datum.)

Greene enjoyed similar, if smaller, “adjustments” in the other subjects; the school then “met” the benchmarks in all subjects except science, where a ten-point boost still left it thirteen points short.  The school nonetheless remains fully accredited.  Never mind that it plainly is failing to serve its students properly.

There are good, old English words to describe this situation.  Most of them can’t be used in polite company.

Seven other Richmond schools were boosted into accreditation in one or more subjects.  Fortunately, none as dramatically as Greene.

Franklin Military Academy

Franklin has done quite well recently, except for a problem with math.

image

The “adjustments” this year camouflaged that problem.

image


Linwood Holton Elementary

Holton’s picture is similar to Franklin’s.

image

image


JL Francis Elementary

Francis has a persistent problem with reading; this year its SOLs dipped in math and science. 

image

The “adjustments” this year cured the latter two problems.

image


Miles Jones Elementary

In recent times, Jones has done well in history but has been borderline or below in reading, math, and science.

image

The “adjustments” this year cured all those little problems.

image


Southampton Elementary

Southampton had been improving but slid this year below the English and math benchmarks.

image

The “adjustments” fixed that.

image


Bellevue Elementary

Bellevue enjoyed the “adjustment” in math but not in reading.

image

image


Richmond Community High

Community usually does quite well.  This year, however, its math performance slipped below the benchmark.

image

The “adjustment” took care of that.

image


Rant

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the pass rates: Economic Disadvantage.  Indeed, they have abandoned (pdf at p.102) their measure of academic progress, the SGP, which is independent of economic status.

(Their excuse for dropping the SGP is, they say (pdf at p.102), that they can’t calculate it until all the data are in at the end of the summer.  I don’t think they were too stupid to know that when they started with the SGP; I think they are lying: The SGP gave intolerable amounts of accurate, comparable information about teacher/school/division performance.)

And then we have Petersburg, which has managed to remain unaccredited for at least fourteen years, despite all the helpful “adjustments.”  The Board of “Education” has at last figured out what to do about that: They are going to change the accreditation system to make the process still more byzantine and to make it easier for a school to remain accredited.

Your tax dollars at “work.”

Accreditation Boosts in Richmond

Having looked at the accreditation score boosts by division, let’s turn to the Richmond schools.

As a reminder: The accreditation scores start with the pass rates.  VDOE then applies “adjustments” for “remediation,” limited English proficiency, and transfer students.  The result of the adjustments almost always is a boost in the score.  Indeed, one school enjoyed a 26-point boost in its math score.

Unfortunately, VDOE does not release the data that underlie these fudge factors, so there is no way to audit the process.  We can, however, look at the accreditation scores and compare them to the underlying pass rates.

Here, then, are the 2017 math data for Richmond:

image

The red line is the 70% accreditation benchmark.  This year Bellevue, Greene, Franklin, Holton, Francis, Jones, Community, and Southampton were boosted into math accreditation.

The average math boost was 4 points; the largest was 26 points (!) at Greene.

Imagine that!  Greene had a 58% pass rate in math but managed to be fully accredited.  With the Good Fairy handing out treats like these, it’s a wonder that anybody fails to be fully accredited.

Scott Adams points out that “wherever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . .  When humans CAN cheat, they do.” 

For sure, we have seen Atlanta, Buchanan County, Oak Grove, Petersburg, and VDOE’s own history in that respect.

Viewed in that light, given the demand for better pass rates and the large boosts that some scores enjoy, VDOE’s secret data manipulation is ill-advised, even if it is not corrupt.

————————–

P.S.:  I’m holding off on the English data until I can learn how they average the reading and writing pass rates.  If it’s by averaging the school averages for each subject, we’ll see a 30-point boost at Greene and a 31-point jump at Armstrong.  Stay tuned.

It’s Performance, not Poverty

The popular sport in the Richmond “education” establishment has been to blame the kids for the awful performance of our schools.  We particularly hear about our less affluent (the official euphemism is “economically disadvantaged”) students.

We have some data on that.  Again.

Here are the average reading pass rates by grade of the economically disadvantaged (“ED”) and non-ED students in Richmond and the state.  “EOC” indicates the End of Course tests that generally must be passed to receive “verified credits” that count toward a diploma.

image

Both in Richmond and on average, the ED group underperforms the non-ED group.

To the point here, the Richmond ED students underperform their peers in the state averages, as do the non-ED Richmond students.

We can calculate the differences between the Richmond groups and the state averages to measure that underperformance.

image

Here we see Richmond’s ED students underperforming their peers by about 7 points in elementary school, while our non-ED students average some 9 points below their statewide group.  In middle school the gaps grow to roughly 19 points for the non-ED students and 25 points for the ED group.
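For readers who want to check the arithmetic, here is a minimal sketch of the gap calculation.  The pass rates in it are made-up placeholders, not the actual VDOE numbers; the point is only the Richmond-minus-state subtraction.

```python
# Minimal sketch: Richmond-minus-state pass-rate gaps by group and grade.
# All numbers below are illustrative placeholders, NOT the actual VDOE data.
rates = {
    "ED":     {"richmond": {"grade 5": 62.0, "grade 8": 45.0},
               "state":    {"grade 5": 69.0, "grade 8": 70.0}},
    "non-ED": {"richmond": {"grade 5": 70.0, "grade 8": 55.0},
               "state":    {"grade 5": 79.0, "grade 8": 74.0}},
}

for group, r in rates.items():
    for grade in r["richmond"]:
        gap = r["richmond"][grade] - r["state"][grade]  # negative = Richmond behind
        print(f"{group}, {grade}: {gap:+.1f} points vs. the state average")
```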

The math test results show a similar pattern. 

image

image

These data tell us two things:

  • Richmond students, both ED and not, underperform their statewide peer groups on average; and
  • The average SOL performance of Richmond students, ED and not, deteriorates dramatically in middle school.

As I have demonstrated elsewhere, the large percentage of ED students in Richmond (64% in 2017) does not explain our low pass rates.  So we are left with (at least) two possible explanations: Either Richmond students are less capable on average than students statewide or our schools are less effective than average.

If Richmond’s students were just less capable, it would explain the low elementary school scores but not the drop in pass rates after the fifth grade.

The plummeting performance of our students when they reach middle school tells us there’s a (big!) problem with our middle schools.  And there’s every reason to think that a school system with terrible middle schools might also have problems with its elementary schools.

To the same end, notice how the ED performance by grade tracks the non-ED performance in Richmond.  We saw the same thing last year:

image

image

Those parallel curves are fully consistent with the notion that performance variations by grade are driven by the teaching, not by the capabilities of the students.

As well, Friar Occam would suggest the simple explanation: Substandard elementary schools and awful middle schools in Richmond, not thousands of dunderheads (both poor and more affluent) attending those schools.

There’s an alternative that we might consider: I keep hearing that some of our elementary schools cheat on the SOLs.  We know for sure that’s happened in the past in Richmond and, this year, in Petersburg.

If, in fact, the Richmond elementary pass rates are boosted by cheating, it absolves the middle schools of some or all of the decline in scores between the fifth and sixth grades but it leaves the conclusion here intact; it merely moves the elementary schools toward the “awful” category of our middle schools.

“Adjusting” Accreditation Scores

The accreditation scores for 2017-2018, based on the 2017 testing, are up on the VDOE site.

In the distant past (2005), VDOE’s opaque accreditation process transformed 76.3 and 73.7 math scores at Jeter-Watson into “perfect scores” and embarrassed the Governor.

They now admit to manipulating the scores:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

(But don’t ask them for the remediation data.  They’ll tell you to go breathe ozone.)

They tell me the English rating is based on an average of the reading and writing pass rates.  I’m waiting for information on whether that is an average of the school averages or an average by students; the two conventions can give different answers, as the sketch below shows.
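To see why the question matters, here is a toy calculation, with made-up pass counts rather than any school’s actual numbers.  When many more students take the reading tests than the writing tests, the two conventions diverge.

```python
# Toy numbers only: a school where many more students take reading
# (grades 3-8) than writing (grades 5 and 8).
reading_pass, reading_taken = 450, 600   # 75.0% reading pass rate
writing_pass, writing_taken = 120, 200   # 60.0% writing pass rate

# Option A: average of the two subject averages (each subject weighted equally).
avg_of_averages = 100 * (reading_pass / reading_taken
                         + writing_pass / writing_taken) / 2

# Option B: average by students (each test taken weighted equally).
by_students = 100 * (reading_pass + writing_pass) / (reading_taken + writing_taken)

print(f"Average of school averages: {avg_of_averages:.1f}%")  # 67.5%
print(f"Average by students:        {by_students:.1f}%")      # 71.2%
```

In the meantime, let’s look at the math data.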

The accreditation “adjustments” to pass rates are not so dramatic these days, an average of 2.6 points on the math tests this year, but they still have significant effects.

To start, here is a plot of the math “adjusted” rates (i.e., the rates used for accreditation), by school, vs. the actual pass rates.

image

In a world populated with honest data, all those points would lie on the red line.  As you see, some do, and a few lie below (hard to know how the adjustments would produce that result), but most of the data show accreditation scores that are “adjusted” to larger values.

NOTE: It took several hours to groom the dataset.  There were ten schools for which the database reported SOL pass rates but the Accreditation list had no report.  As well, VDOE reported accreditation data for 41 schools not listed in the SOL database and listed another six in the accreditation data without any numbers.  Then there are another four schools that appear in both lists but are missing data in one or the other.  The data here are for the remaining 1,772 schools.
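For anyone who wants to repeat the exercise, the matching boils down to a merge and a subtraction.  This is a sketch only: the file names and column labels below are my placeholders, not VDOE’s actual download headers.

```python
import pandas as pd

# Placeholder file and column names; substitute the actual VDOE headers.
sol = pd.read_csv("sol_pass_rates.csv")           # school, math_pass_rate
accred = pd.read_csv("accreditation_scores.csv")  # school, math_accred_score

# Keep only schools present in both lists, with numbers in both columns.
merged = sol.merge(accred, on="school", how="inner")
merged = merged.dropna(subset=["math_pass_rate", "math_accred_score"])

# The "adjustment" is just the difference between the two scores.
merged["boost"] = merged["math_accred_score"] - merged["math_pass_rate"]

print(len(merged), "schools matched")
print("average boost:", round(merged["boost"].mean(), 1))
print("largest boost:", merged["boost"].max())
```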

If we plot the distribution of differences (i.e., adjusted score minus actual pass rate), we see that most of the adjustments are five points or fewer.

image

Rescaling the y-axis reveals a non-trivial number of adjustments in the ten-point range and, in one case, a 26-point gain.

image

The adjustments reduce the population of scores just below the 70% cutoff for accreditation and increase the population above that benchmark:

image

The (least squares) fitted curves show the shift in the average score.

A plot of counts of adjusted minus counts of actual pass rates emphasizes how the adjustments deplete the population below the 70% cutoff and increase it above.

image

The outstanding example:  There are 37 schools with 69% math pass rates but only five with that adjusted rate.  The average adjusted rate for those 37 schools is 74%.
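That query is a one-liner against the merged table from the sketch above (same placeholder column names):

```python
# Schools reporting a 69% actual math pass rate, and what they became.
at_69 = merged[merged["math_pass_rate"].round() == 69]
print(len(at_69), "schools with a 69% actual pass rate")
print("their average adjusted score:", round(at_69["math_accred_score"].mean(), 1))
print((merged["math_accred_score"].round() == 69).sum(),
      "schools left with a 69% adjusted score")
```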

VDOE writes the SOL tests.  They can boost the pass rates and accreditation rates simply by making the tests easier.  Yet they indulge in this opaque process to produce meaningless numbers that favor some schools over others.

Moreover, they do not adjust the scores for the one factor that they measure and that we know affects the rates: Economic Disadvantage.

And remember that the pass rates themselves have been fudged in some cases: See, e.g., this and this.

So “opaque” is insufficient to describe this process.  “Opaque and corrupt and unfair” comes closer.

Your tax dollars at “work.”

2017 Accreditation Data

The 2017 accreditation data are up on the VDOE Web site.

(Just to keep us confused, they call these the accreditation ratings for the 2018 school year.  But, to be clear, the ratings are based, at least in theory, on the 2017 SOL pass rates.)

I have extracted the Richmond data and posted them here.

The change in Richmond is that Stuart, where the scores improved remarkably this year, now is fully accredited.

Here are the overall totals.  I have bowdlerized the wordy, softened classifications of failure.  For example, “Partially Accredited: Approaching Benchmark-Pass Rate”  is “Close” in my table.  See the VDOE Accreditation page for the long versions.

image

In sum:

image

Or, expressed as percentages of the numbers of schools:

image

image

image

That TBD in Richmond is AP Hill.  The VDOE spreadsheet reports a 72 at Hill in English, three points short of the criterion.  Stay tuned to see whether the secret “adjustments” can turn that into a 75 and give us at least one accredited middle school (not counting Franklin Military, which is fully accredited as to both its middle and high school grades).

Indeed, the reading pass rate at Hill this year was 70.7%, down 0.3 points from 2016; writing was 68.3%, up by 6 points.  The simple average of those two rates is 69.5, two and a half points below the reported 72, so it looks like those scores already have enjoyed some “adjustments.”

Petersburg: Compounding Failure?

The headline in the Progress-Index tells us “McAuliffe touts Petersburg schools’ improvements” on a visit to Petersburg.

The “improvements” listed in the story? 

McAuliffe mentioned that the graduation rate for Petersburg High School increased last year, as well as SAT and ACT scores.

The untold story:

image

image

image

image

  • We can wonder about the lovely numbers at A.P. Hill in previous years; for sure there are no numbers this year because they were caught cheating.
  • Of the other three elementary schools, only Walnut Hill made the 75% accreditation mark for reading and the 70% cutoff for math this year.  The (failing) elementary reading scores at the other two look to be improving; the (also failing) math scores, not.
  • It’s difficult to reconcile an improving graduation rate and rising SAT and ACT scores with declining high school pass rates.
  • Peabody Middle remains lost in a miasma of failure; Johns Jr. looks to be trying to join it.
  • Petersburg achieved this record of failure after fourteen years of “help” from the Board of “Education.”

The Governor “wanted top-to-bottom reform.”  The reform the article describes, however, comes down to nutrition and social services.

There is a lot of poverty in Petersburg.  In 2016-17, 61% of the students were classified as “economically disadvantaged.”  Nutrition and social services might well help (Were they not needed before now?) but they hardly amount to “top-to-bottom reform” of the failed school system. 

So it looks like any “reform” will have to be found in the 63 slick pages of the Plan.

There’s lots of feel-good in that document.  For example, at page 15 it says they will:

  • Guide and support teachers; and
  • Provide relevant and sophisticated support – including professional development opportunities, as well as ongoing coaching and collaboration – geared to [the] teachers’ and leaders’ needs.

But you’ll have to read to page 46 to find anything that even brushes up against “reform”: 

Measure of Success: All PCPS principals have adopted practices to collect and use evidence of teacher instructional performance.

If that means “hold the principals and teachers accountable,” it might indeed be a (partial) harbinger of reform.  If it means “continue using the existing, mendacious evaluation system,” there’s no hope for the students who are being damaged by the awful Petersburg schools.

And notice: They merely have to adopt practices.  They don’t actually have to hold anybody accountable for failure to do the job.

The (faint) promise of p.46 aside, the overwhelming problem here is the general lack of accountability, not only as to teachers and principals but on up the chain:

  • If Petersburg fails to meet the 2022 deadline of the Plan, it can say it did everything the State demanded.
  • The Board of “Education” has never been held accountable for its fourteen-year record of failure in Petersburg and there is no sign that the current Plan (or, more to the point, the Governor) will remedy that failure of leadership.
  • The Governor will be gone next year.

Your tax dollars at “work.”

Smaller Classes?

Table 17b in the 2016 Superintendent’s Annual Report (the latest available) includes data by division on the total numbers of “instructional positions” and students (“ADM”). 

The table contains data for principals and assistant principals, teachers (including technology instructors), teacher aides, guidance counselors, librarians, and district-wide instructors based on positions reported in school divisions’ Annual School Reports.  District-wide positions include Summer School, Adult Education, Pre-Kindergarten, and other non-regular day and non-LEA instructional positions.

The Average Daily Membership (ADM) shown in this table reflects all pupils (Pre-K through Post-graduate) served in the school division at the end of the year.

The (very nice) VDOE front end to the SOL database offers a plethora of data, including the division average SOL pass rates for 2016.

Combining those datasets for the reading tests gives the following:

image

The red diamonds are, from the left, the peer cities Newport News, Hampton, and Norfolk.  Richmond is the gold square.  The green diamond is Charles City; the blue diamond is Lynchburg.

The division average teacher/ADM ratio is 11.1%.  Richmond is 11.5%.  The average pass rate is 77.4%.

The fitted line suggests that the pass rate by division goes down as the Teacher/Pupil ratio increases, i.e., as the class size drops.  The R-squared, however, tells us that the two variables are only trivially correlated.

In short, by this measure smaller classes don’t perform any better.
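For the curious, here is a sketch of that fit.  The ratios and pass rates below are fabricated stand-ins for the Table 17b and SOL-database figures; only the least-squares machinery is the point.

```python
import numpy as np

# Fabricated division-level stand-ins for the Table 17b / SOL figures.
teacher_ratio = np.array([0.095, 0.105, 0.111, 0.115, 0.123, 0.130])  # teachers per ADM
pass_rate = np.array([77.0, 79.5, 75.0, 78.5, 74.5, 78.0])            # division reading average

slope, intercept = np.polyfit(teacher_ratio, pass_rate, 1)  # least-squares line
r_squared = np.corrcoef(teacher_ratio, pass_rate)[0, 1] ** 2

print(f"fit: pass_rate = {slope:.0f} * ratio + {intercept:.0f}")
print(f"R-squared: {r_squared:.3f}")  # small value = only trivial correlation
```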

The math data tell the same story.

image

The R-squared rises to 3.4%: Still not enough to bet on and, in any case, the correlation still goes in the wrong direction.

The average pass rate is 78.4%.

Finally, the average of the averages of the five subjects:

image

Unfortunately the available data do not give us the teacher numbers by grade, so we can’t refine the analysis down to the grade and school level.  Even so, these data suggest that the key to improved learning lies somewhere other than in class size.

VGLA: Rest in Ignominy

NOTE: VDOE has scrubbed most of the VGLA materials from its Web site, breaking some of the links in my draft of this post.  As of the posting date, all the remaining links are good.  I have posted the 2017 VGLA Manual here as a partial remedy for this Orwellian information vacuum.

The Virginia Grade Level Alternative, aka the VGLA, “is an alternative assessment for the Standards of Learning (SOL) Reading tests in grades 3-8. The VGLA Reading assessment is available only to Limited English Proficient (LEP) students in grades 3-8 who meet the eligibility criteria” (emphasis in original).

More specifically: “LEP students who have attended school in the United States for less than 12 months may receive a one-time exemption for the SOL Reading test in grades 3 through 8.”

Prior to 2012 in math and 2013 in reading, the VGLA was locally graded and was widely abused as a mechanism to boost SOL scores.  Since then, VDOE (through a contractor) has audited some of the reading “collections of evidence” (ca. 20% of them, it seems) but the grading has remained local. 

The VDOE database can tell us about the performance of students in the VGLA and in the regular SOL tests.  Let’s start with the third grade reading tests.

image

The SOL scores dropped statewide in 2013 with the advent of the new, tougher English SOL tests.  Here we see the third grade scores later recovering to near the 75% accreditation benchmark.

The abuse of the VGLA before 2011 was so outrageous (as you can see here) that the General Assembly passed a law in 2010 to curtail it.

The Board of “Education,” reacting to the new law, restricted the VGLA to reading and LEP students beginning in 2013.  The scores, however, remained high, settling a bit this year to 95.4%. 

There are at least three possible explanations for these phenomenal, post-2013 pass rates:

  • Most of the LEP students are from Lake Wobegon;
  • The LEP tests are very easy; or
  • Schools have been misclassifying kids as LEP (or cheating in some other manner) in order to boost their pass rates.

The fourth and fifth grade numbers show much the same pattern.  Note: No data for the fourth grade for 2017; I haven’t found the reason. 

image

image

The middle school data show lower VGLA pass rates in the sixth grade and the high VGLA scores vanishing in the seventh and eighth grades.  Note: No data for the sixth and seventh grades for 2017.

image

image

image

Notice also the general absence of anomalous pass rates prior to 2013 in the middle school data.

The participation counts show that either there are many more third grade LEP students than older ones or, more likely, the schools have been using their one year of VGLA eligibility at the first opportunity.

image

These curves also show the 2011 effect of the 2010 statute requiring that the superintendent and the school board chairman certify every student taking the VGLA.  The drops in 2013 correspond with the new tests and the restriction of the reading VGLA to LEP students.

The fall enrollments show larger numbers of ESL students in the lower grades but less than the ca. 3:1 preponderance of third grade VGLA test takers.  I didn’t find data for the LEP populations.

image

The math VGLA data end at the advent of the new test in 2012.  The third grade data do not suggest abuse of the VGLA during the period of those data.

image

In the higher grades, some anomalous VGLA scores are evident, especially in the middle school grades.

image

image

image

image

image

In contrast to the reading tests, here we see unusually high pass rates prior to the advent of the new tests in middle school, but not so much in elementary school.

Plainly, there was a whole lot of cheating going on before 2011.  These data do not immediately suggest the reason for the relatively cheating-free areas.

Let’s turn to the anomalously high reading VGLA pass rates persisting in the elementary grades to the present. 

If the LEP tests were very easy in the elementary grades, one might think they would be very easy for the middle school grades.  At the same time, if elementary schools were cheating, we can wonder why the middle schools would not be doing the same thing.  For sure, it looks like they were in math, prior to the abolition of the math VGLA.

As to the mechanism for cheating: It doesn’t have to be misclassification.  As we have seen in Petersburg, the proctors can be helpful to the students taking the SOL tests.  In the case of the VGLA, it is particularly easy to be helpful because the data are collected throughout the year:

The VGLA Reading assessment provides the eligible LEP student opportunities throughout the school year to demonstrate his/her knowledge and achievement through a non-traditional mode of testing. The student creates work samples that demonstrate his/her knowledge and skills of the Reading SOL for his/her grade level of enrollment. These work samples are assembled as the Collection of Evidence (COE).

2016-2017 VGLA Implementation Manual at p.7.  See the note above.

I’m told that the technique here is to drill the kid on one item, test that item, and repeat as needed.

One thing for sure: The Board of “Education” had to know that the VGLA was being gamed before 2010 and they did exactly nothing until the General Assembly intervened. 

More recently, there’s no sign they did anything about the post-2013 inflated pass rates until this year when the feds made them drop the test.  That Board has to be corrupt or a bunch of very slow learners.

Looking at what they’ve done in response to the federal ban, in light of their earlier behavior, I vote for “corrupt.” 

Here is the Superintendent’s memo regarding the VGLA termination.

As a reminder, SOL testing may be discontinued for EL [“English Learner”] students who struggle with reading the test items after the student has responded to five items (not including the sample items). Students who answer at least five items will be counted as participants in the Reading test for purposes of federal accountability. The Standards of Accreditation (SOA) adjustment remains available for students who are currently classified as EL and have been enrolled in a Virginia public school for fewer than 11 semesters, as does the one-time exemption in Reading for recently arrived EL students in grades 3 through 8.

So, while we wait to see whether the locals can find another way to game the new system, we have an official window of opportunity:  As long as the proctor can count to five, “EL” students can duck the test but be counted toward the required participation rate(!).  Perhaps those kids can no longer be abused to boost the scores, but they don’t have to harm the scores either.

Your tax dollars at “work.”

Chumming the Neighborhood for Criminals

An observant neighbor called the police Labor Day afternoon because of the fellow opening cars in the 42d St. parking lot.

image

The cops arrived quickly, chased the fellow, and caught him.  Unfortunately (or, in a way, fortunately), nothing seemed to be missing from either of the cars he was seen to enter.

Stay tuned for news about that.

In the meantime, please recall that “Theft from Motor Vehicle” is the most commonly reported offense in our neighborhood and the one that is entirely preventable.

As an update to the earlier report, here are the high-crime blocks in Forest Hill for this year, up to Labor Day:

image

Or, as percentages of those totals:

image

As you see, the problem runs to the overflow parking on Hillcrest and, of course, to the 41st St. lot and nearby streets.

Much of the trouble is that park visitors leave stuff in their cars and leave the cars unlocked.  Sometimes they leave stuff in view and lock the car.

image

In either case, they are chumming our neighborhood for criminals.

Please join me in asking the Park folks and the City why there are no LARGE signs at the park entrances and in the parking areas to warn people of this problem. 

I’m told there is a camera in the 21st St. lot that has had a good effect.  What would you think about one in the 42d St. lot, another at 41st St., and another along the 4200 block of Riverside Dr.?

“Help” From the Hapless

Petersburg has been operating under Memoranda of Understanding (“MOUs”) issued by the Board of Education since at least 2004.  Let’s take a look at what that state supervision has accomplished.

To start, here are the Petersburg and state division average reading pass rates for the period of the VDOE database.

image

Here is the difference between those two curves.

image

As of this year, the state average was 4.6 points above the benchmark for accreditation in English; Petersburg was 14.9 points below.  Thirteen years of “help” (in fact, command and control) from the state didn’t help.

How about math?

image

image

The accreditation benchmark here is 70%.  The 2017 state average is 9.2 points above that; Petersburg is 17.8 points below.

Let’s not fill up the Internet with graphs from the other three subjects; we’ll go directly to the average of all five:

image

image

The Board of “Education” could sue Petersburg for this ongoing, massive violation of the Standards of Quality:

§ 22.1-253.13:8. Compliance.

* * *

The Board of Education shall have authority to seek school division compliance with the foregoing Standards of Quality. When the Board of Education determines that a school division has failed or refused, and continues to fail or refuse, to comply with any such Standard, the Board may petition the circuit court having jurisdiction in the school division to mandate or otherwise enforce compliance with such standard, including the development or implementation of any required corrective action plan that a local school board has failed or refused to develop or implement in a timely manner.

They have not sued any division, not even Petersburg.

Why not? 

If they sued, the Board would have to tell the judge what the division must do to meet the Standards of Quality.  Yet Petersburg (it seems) has done everything the Board demanded for thirteen years now and it hasn’t worked.  Manifestly (and as they themselves admit), the Board does not know (Sept. 21, 2016 video starting at 1:48) how to fix these awful schools.

(I have not embraced the alternative explanation: Petersburg did not do what the Board required.  In that case also, the Board would have to know how to fix those schools in order to sue.) 

In the face of the Board’s primal incompetence, the Governor should long ago have fired the members for malfeasance.  This is a massive failure of the government to govern.

$104.6 million (this year) of your tax dollars at “work.”