VAAP Claptrap

Virginia has four programs for testing students with “special needs.”  I have written at length about the abuse of one of them, the Virginia Grade Level Alternative (VGLA), to (1) classify students as handicapped who were not and (2) artificially boost pass rates on the required standardized tests.

Another of those programs, the Virginia Alternative Assessment Program (VAAP), offers a testing alternative for students with “significant cognitive disabilities.”  For reasons I’ll discuss in a later post, I’ve taken a look at the VAAP test scores.

The first thing that jumps out is that many divisions show nonzero VAAP pass rates but zero participation counts: for 2015, VDOE reports VAAP reading test participation counts for only twenty of the 132 divisions; it reports zero participation but nonzero pass rates for 113 divisions.

Here for those twenty divisions is a graph of the 2015 VAAP pass rate v. the SOL pass rate.

[Figure: 2015 VAAP v. SOL reading pass rates for the twenty divisions reporting VAAP participation counts]

Ideally, these data would lie on the red line, indicating that the average test performance was the same on both tests.  The least squares fit to the actual data, the dotted blue line, suggests that the results are fairly close to the ideal, and the R² indicates a modest correlation.

If we include all 113 divisions that report a VAAP pass rate, the picture changes.

[Figure: 2015 VAAP v. SOL reading pass rates for all 113 divisions reporting a VAAP pass rate]

The R² indicates only a minuscule correlation between the VAAP and SOL scores.  The slope of the fitted line suggests that the divisions with lower SOL scores (i.e., those in need of better scores) have relatively higher VAAP scores.
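For anyone who wants to check these fits, here is a minimal sketch of the calculation behind the fitted line and the R².  The file name and column layout are placeholders for however you arrange your own Build-A-Table download; they are not VDOE's actual export format.

```python
import numpy as np

# Hypothetical two-column file (SOL pass rate, VAAP pass rate), one row per
# division, with a header row.  The name and layout are assumptions, not
# VDOE's actual Build-A-Table format.
sol, vaap = np.loadtxt("vaap_vs_sol_2015.csv", delimiter=",", skiprows=1, unpack=True)

# Least squares line: vaap ≈ m * sol + b (the dotted blue line in the charts)
m, b = np.polyfit(sol, vaap, 1)

# R² is the square of the Pearson correlation between the two pass rates
r_squared = np.corrcoef(sol, vaap)[0, 1] ** 2

print(f"slope = {m:.3f}, intercept = {b:.1f}, R² = {r_squared:.3f}")
```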

If you don’t smell a rat here, read on.

The 2015 math tests present a similar picture.

[Figure: 2015 VAAP v. SOL math pass rates for the divisions reporting VAAP participation counts]

[Figure: 2015 VAAP v. SOL math pass rates for all divisions reporting a VAAP pass rate]

What’s with all those zero participation counts?

The count is important because it allows some insight into whether a division is using the alternative test to boost its scores, either by easy grading of the alternative test or by removing marginal performers from the SOL testing.  As well, the count tells us whether the division is above the 1% cap on alternative testing imposed by the feds at 34 CFR § 200.13(c)(2)(i).  And we know that divisions have used the alternative tests to cheat and that VDOE has let them get away with it.
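To see why the raw count matters for the 1% check, here is a toy calculation.  The counts are invented for the example; the point is that the check cannot be made at all when the participation count is suppressed to zero.

```python
# Toy check against the federal 1% cap on alternate assessments
# (34 CFR § 200.13(c)(2)(i)).  Both counts are invented for illustration.
vaap_tested = 130      # students the division assessed with the VAAP
all_tested = 10_000    # all students the division assessed in the subject

share = vaap_tested / all_tested
print(f"VAAP share of testers: {share:.2%} (cap is 1.00%)")
if share > 0.01:
    print("Above the 1% cap.")
```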

VDOE says it suppresses test participation counts of fewer than ten “to protect the identity of individual students.”  The actual suppression process is far more draconian (and opaque), and its purpose is far less clear than that.

[Image: VDOE's table of data suppression rules]

The first entry, upper left, is misleading; once you read the whole thing, you’ll see it means “if any individual count anywhere is <10, we suppress almost everything.”

The effect of all this suppression can be astonishing.  For example, here are the highest 2015 math VAAP pass rates below 100% for divisions where the participation is reported as zero (six other divisions report 100% pass rates along with zero participation):

[Table: highest 2015 math VAAP pass rates below 100% for divisions reporting zero participation]

Students come in integral units, not in decimal fractions.  The highest possible pass rate for 9 or fewer (integral) students, other than 100%, is 8/9, i.e., 88.88…%.  All of these pass rates are higher.  Thus, counts larger than 9 are being suppressed.

Indeed, the smallest whole-number ratios that produce the pass rates VDOE reports above are 11/12 (91.666…%; Goochland, Page, Powhatan, Staunton, and Sussex), 12/13 (92.308%; Poquoson and Westmoreland), and 14/15 (93.33…%; Frederick).

Of course, 11/12 = 22/24 = 33/36 . . .  So all we can tell for sure is that VDOE is suppressing numbers larger, and possibly much larger, than 9.

At the extreme, Stafford: the smallest integers that produce its 95.97% pass rate are 119/124(!).  So VDOE’s suppression rules report a zero for a VAAP participation that was at least 124 and could have been two or three (or more) times that.
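Here is a sketch of the arithmetic behind these minimum counts: for each reported pass rate, search for the smallest whole-number passed/tested pair that rounds to that rate.  It assumes the published rate is the true ratio rounded to two decimal places; change `decimals` if you prefer a different assumption.

```python
# For a reported pass rate, find the smallest passed/tested whole numbers
# that round to it.  Assumes the published rate is the true ratio rounded
# to `decimals` places -- an assumption about VDOE's rounding, not a fact.
def smallest_counts(reported_rate, decimals=2, max_tested=1000):
    target = round(reported_rate, decimals)
    for tested in range(1, max_tested + 1):
        for passed in range(tested + 1):
            if round(100 * passed / tested, decimals) == target:
                return passed, tested
    return None

# Rates from the table above, rounded to two places
for division, rate in [("Goochland", 91.67), ("Poquoson", 92.31),
                       ("Frederick", 93.33), ("Stafford", 95.97)]:
    passed, tested = smallest_counts(rate)
    print(f"{division}: at least {passed}/{tested}")
```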

In summary, here is that set, showing the minimum pass/participation integers:

[Table: minimum pass/participation integers for the divisions above]

There are no student identities available from the VDOE Build-A-Table.  So why is VDOE hiding all these participation data?  Could it be that the suppressed numbers would embarrass both VDOE and a number of school divisions?

Scott Adams, creator of Dilbert, argues (see also this) that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”

Given VDOE’s interest in high SOL scores, should we suspect that they are hiding something here?

In light of VDOE’s track record of hiding and manipulating data (see this and this and this and this and this and this), the answer is obvious.