Is Fairfax Above the Law?

Fairfax doesn’t like Virginia’s mandatory school attendance law, so it objects to the proposed regulation that would enforce that law.

Va. Code § 22.1-269 requires that the Board of Education “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth.” Notwithstanding that mandate, the Board still neither collects nor publishes data that would allow the public to assess its performance of that duty.

The Board did not publish a proposed truancy regulation until December 21, 2009. The history of that regulation is set forth on the Town Hall website.  In short, the regulation completed its fourth(!) public comment period on 12/2/2015; the proposed regulation still fails to discharge the Board’s duty and the Board has not yet acted to either adopt the proposed regulation or to repair it.

Yet, in the People’s Republic of Northern Virginia, the proposed regulation is defective insofar as it does attempt to require compliance with State law.

In comments dated 12/2/15, Dr. Karen Garza of Fairfax County Schools says:

[T]he draft regulations require the creation of an attendance plan after the fifth unexcused absence, as specified by Code.  But instead of allowing time for that plan’s implementation or for any sort of evaluation of a student’s progress toward addressing issues identified by the plan, the regulations (again in compliance with Code) require additional significant interventions after only a single additional unexcused absence.  One more unexcused absence beyond that potentially triggers a court petition and legal intervention.  Simply put, this timeline is fundamentally flawed and undermines the potential usefulness of the required attendance plan by giving it little to no time to actually work.  Unless the underlying Code constraints are addressed, regulatory changes will be of limited worth in truly moving student attendance policies in school divisions toward evidence-based best practices.

In short: “The State law is fundamentally flawed and we object to your regulation that would seek to enforce that law.”

The Fairfax schools’ Web page tells us that Dr. Garza is the Superintendent up there.  Her comments tell us that Dr. Garza needs to be fired and replaced with a Superintendent who is willing to apply Virginia law as it is, not as she wishes it to be.

 

PS to Superintendent Staples: In the likely event that Fairfax won’t fire her, you and your Board can.

The VGLA Cheating Monster Lives!

As we have just seen, VDOE is concealing participation rates for the VAAP in a manner that raises the question whether some divisions are abusing the VAAP process to boost their scores.

We earlier saw that VDOE ignored rampant abuse of the VGLA until the General Assembly waved a red flag.  VDOE’s response to the new law was to eliminate the VGLA, except for the reading tests in grades 3-8 for Limited English Proficient (LEP) students. 

Unfortunately, it appears that even those remaining VGLA tests are being abused.

The 2015 student achievement reports show VGLA pass rates for 23 divisions.  Fifteen of those 23 divisions (65%) show zero participation.  The average VGLA pass rate is fourteen percent higher than the average reading SOL pass rate.  More particularly:

image

Here we see the reading VGLA pass rates of those 23 divisions plotted vs. the SOL pass rates.  The red line shows the ideal: VGLA pass rate same as the SOL pass rate.  The dotted line is fitted to the actual data and the R2 shows only a modest correlation between the two scores.

There are two interesting features here:

  • The VGLA pass rates are remarkably higher than the SOL rates; and
  • The difference decreases as the division has less need to boost its scores, i.e., with increasing SOL pass rates.

Redrawing the graph we see:

image

The gold line shows the difference between the fitted and ideal lines.
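
For readers who want to check this sort of fit, here is a minimal numpy sketch.  The numbers below are placeholders just to make it run; the real inputs are the 23 divisions’ SOL and VGLA reading pass rates from the Build-A-Table export, and “boost_at” is simply my shorthand for the gold line (fitted minus ideal).

```python
import numpy as np

# Placeholder data, only so the sketch runs; substitute the actual SOL and
# VGLA reading pass rates for the 23 divisions from the Build-A-Table export.
rng = np.random.default_rng(0)
sol_pass = rng.uniform(55, 90, 23)                      # SOL pass rate, percent
vgla_pass = np.clip(sol_pass + rng.normal(14, 8, 23), 0, 100)

# Least-squares (dotted) line: vgla = slope * sol + intercept
slope, intercept = np.polyfit(sol_pass, vgla_pass, 1)

# R^2 for that fit
predicted = slope * sol_pass + intercept
ss_res = np.sum((vgla_pass - predicted) ** 2)
ss_tot = np.sum((vgla_pass - vgla_pass.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# The "gold line" of the second graph: fitted line minus the ideal
# (VGLA = SOL) line, i.e., the predicted boost at a given SOL pass rate.
def boost_at(sol):
    return (slope * sol + intercept) - sol

print(f"fit: VGLA = {slope:.2f} * SOL + {intercept:.1f}, R^2 = {r_squared:.2f}")
print(f"predicted boost at SOL = 60: {boost_at(60):.1f} points")
print(f"predicted boost at SOL = 85: {boost_at(85):.1f} points")
```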

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we saw wholesale in Atlanta and, earlier, in Virginia on the VGLA. 

And here we have VDOE again hiding data, with the remaining data consistent with cheating.  Should we suspect that VDOE is again hiding evidence and overlooking cheating?  In light of VDOE’s track record of hiding and manipulating data and ignoring wholesale cheating (see this and this and this and this and this and this), the answer is obvious.

————————————

Here are the data for the 23 divisions, sorted by VGLA/SOL ratio:

image

VAAP Claptrap

Virginia has four programs for testing students with “special needs.”  I have written at length about the abuse of one of them, the Virginia Grade Level Alternative (VGLA), to (1) classify students as handicapped when they were not and (2) artificially boost pass rates on the required standardized tests. 

Another of those programs, the Virginia Alternative Assessment Program (VAAP), offers a testing alternative for students with “significant cognitive disabilities.”  For reasons I’ll discuss in a later post, I’ve taken a look at the VAAP test scores.

The first thing that jumps out is that many divisions show nonzero pass rates on the VAAP but zero participation counts: For 2015, VDOE reports VAAP reading test participation data for only twenty (of 132) divisions; they report zero participation counts but nonzero pass rates for 113 divisions.

Here for those twenty divisions is a graph of the 2015 VAAP pass rate v. the SOL pass rate.

image

Ideally, these data would lie on the red line, indicating that the average test performance was the same on both tests.  The least squares fit to the actual data, the dotted blue line, suggests that the results are fairly close to the ideal, and the R2 indicates a modest correlation.

If we include all 113 divisions that report a VAAP pass rate, the picture changes.

image

The R2 indicates only minuscule correlation between the VAAP and SOL scores.  The slope of the fitted line suggests that the divisions with lower SOL scores (i.e., those in need of better scores) have relatively higher VAAP scores.

If you don’t smell a rat here, read on.

The 2015 math tests present a similar picture.

image

image

What’s with all those zero participation counts?

The count is important because it allows some insight into whether a division is using the alternative test to boost its scores, either by easy grading of the alternative test or by removing marginal performers from the SOL testing.  As well, the count tells us whether the division is above the 1% cap on VAAP testing allowed by the feds at 34 CFR § 200.13(c)(2)(i).  And we know that divisions have used the alternative tests to cheat and that VDOE has let them get away with it.

VDOE says it suppresses test participation counts of fewer than ten “to protect the identity of individual students.”  The actual suppression process is far more draconian (and opaque) than that, and its purpose is far less clear.

image

The first entry, upper left, is misleading; once you read the whole thing, you’ll see it means “if any individual count anywhere is <10, we suppress almost everything.” 
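
As best I can tell, the table describes a form of complementary suppression: one small cell blanks out the rest of its reporting group, since otherwise the small number could be recovered by subtraction from the totals.  Here is a purely illustrative sketch of that idea; the threshold and the whole-group behavior are my assumptions, not VDOE’s published algorithm.

```python
from typing import Dict, Optional

def suppress_group(counts: Dict[str, int], threshold: int = 10) -> Dict[str, Optional[int]]:
    """Illustrative complementary suppression: if any count in a reporting
    group falls below the threshold, blank the entire group, because the
    small cell could otherwise be recovered by subtracting the published
    counts from the totals.  Not VDOE's actual rule set."""
    if any(c < threshold for c in counts.values()):
        return {k: None for k in counts}
    return counts

# Example: one subgroup of 7 students wipes out every count in the group.
print(suppress_group({"All students": 130, "SWD": 7, "LEP": 19}))
# -> {'All students': None, 'SWD': None, 'LEP': None}
```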

The effect of all this suppression can be astonishing.  For example, here are the highest 2015 math VAAP pass rates <100% for divisions where the participation is reported as zero (in addition, six divisions report 100% pass rates and zero participation):

image

Students come in integral units, not in decimal fractions.  The highest possible pass rate for 9 or fewer (integral) students, other than 100%, is 8/9, i.e., 88.88…%.  All of these pass rates are higher.  Thus, counts larger than 9 are being suppressed.

Indeed, the smallest whole-number ratios that produce the pass rates VDOE reports above are 11/12 (91.666…%; Goochland, Page, Powhatan, Staunton, and Sussex), 12/13 (92.308%; Poquoson and Westmoreland), and 14/15 (93.33…%; Frederick). 

Of course, 11/12 = 22/24 = 33/36 . . .  So all we can tell for sure is that VDOE is suppressing numbers larger, and possibly much larger, than 9.

At the extreme, Stafford: the smallest integers that produce its 95.97% pass rate are 119/124(!).  So VDOE’s suppression rules report a zero for a VAAP participation that was at least 124 and could have been two or three (or more) times that.
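
These minimums can be recovered mechanically: for a reported pass rate, find the smallest participation count whose nearest integer numerator rounds back to that rate at its displayed precision.  A sketch of that back-of-the-envelope tool (mine, not anything VDOE publishes), assuming the rates are rounded to the digits shown:

```python
from decimal import Decimal

def min_counts(reported_rate: str, max_n: int = 2000):
    """Smallest (passed, tested) integers whose percentage rounds to the
    reported pass rate at its displayed precision; None if no count up
    to max_n can produce that rate."""
    rate = Decimal(reported_rate)
    places = -rate.as_tuple().exponent      # decimal places as displayed
    target = float(rate)
    for n in range(1, max_n + 1):
        p = round(target / 100 * n)         # nearest integer numerator for n
        if 0 <= p <= n and round(100 * p / n, places) == target:
            return p, n
    return None

# Assuming two-decimal rounding, these reproduce the figures above:
print(min_counts("91.67"))   # (11, 12)
print(min_counts("95.97"))   # (119, 124) -- Stafford
```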

In summary, here is that set, showing the minimum pass/participation integers:

image

There are no student identities available from the VDOE Build-A-Table.  So, why is VDOE hiding all these participation data?  Could it be that they are suppressing numbers that could embarrass both VDOE and a number of school divisions?

Scott Adams, creator of Dilbert, argues (see, also this) that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.” 

Given VDOE’s interest in high SOL scores, should we suspect that they are hiding something here?

In light of VDOE’s track record of hiding and manipulating data (see this and this and this and this and this and this), the answer is obvious.

AWOL on Truancy

In light of Richmond’s flagrant violations of the state law regarding truancy and their failure to post any truancy data this year, I filed a FOIA request for their 2015 data.  Their response, in short (the 2d column lists the number of students with the specified number of absences; the fourth column, the number of required responses):

image

As a reminder, the law requires:

  • 5 unexcused absences: Attendance Plan;
  • 6 unexcused absences: Conference with the Parents; and
  • 7 unexcused absences: Prosecution of the Parents or CHINS petition v. the student.

Richmond’s recent history (Note: They have been counting ten-absence cases and sending the Parents a letter):

image

So this year they are not even keeping records, except to show an appalling number of five-absence cases and a drop in the unlawfully minuscule number of CHINS petitions.

You might think that VDOE, which has the duty to “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth” would be doing something about this tragic defiance of the law.  But to think that you’d have to be unaware of VDOE’s ongoing disdain for performing that duty.

What I don’t understand is why somebody doesn’t mandamus the School Board.  Or sue them for damages (e.g., for a death by stabbing) caused by a frequent truant.

(Would-Be) Theft from Motor Vehicle

Yesterday afternoon Ms. Penelope looked out the window and saw a young fellow walking up Riverside Drive trying the driver’s side doors of the parked vehicles (warm day; lots of river visitors).  When she and I got out the front door, we saw the fellow with the door of a car open, right in front of our mailbox.

Never mind that some thoughtless fool parked so Mr. Worsham would have to get around the car to deliver our mail.  That thoughtless fool left his car unlocked, helping to chum for criminals in our quiet neighborhood.  Indeed, car break-ins (or, more often, theft from unlocked cars) continue to be the #1 crime in our block.

The would-be thief closed the door and went on down the block when he saw us watching him.  Then he worked his way back toward 42d St., got into a shiny, black car, and drove away.  He was gone when the cops got here.

For what it’s worth: thin, tan knit cap over ears, wife-beater shirt, orange underwear showing where the belt was halfway down his butt.

There’s not much we can do about the fools who park here and leave stuff in their unlocked vehicles.  But, as always, put your stuff in the trunk or in the house and lock your cars so you won’t contribute to, and suffer from, this problem.

Blarney à la Bedden, II

The Times-Dispatch yesterday reported on the Superintendent’s State of the Schools speech.

Bedden returned to the old excuses for Richmond’s awful performance: poverty, handicapped students, and students whose native language is not English.  I commented earlier on those bogus arguments: In short, those populations in Richmond are underperforming the state averages for the same populations, so we should be looking to inferior instruction, not the students, to explain our lousy scores.

Much of the speech seems to have been devoted to explaining the needs of RPS for more money.  Crucially absent was any discussion of what RPS is doing to reduce the $50 million of excess spending that does not seem to be helping our students.

The RT-D also quoted the Superintendent for five specific statements, none of which withstands close examination.

 

10 out of 28 of elementary schools met the state’s standards for full accreditation; up from 43 percent last year.

Just a week ago, VDOE updated its accreditation data.  Their spreadsheet shows eleven of twenty-seven Richmond elementary schools to be fully accredited if one counts Richmond Career Education and Employment as an elementary school; if we notice that the school is oriented to “employment for Richmond students ages 14-21” and do not count it as an elementary school, the fully accredited count is ten of twenty-six, not ten of twenty-eight.

image

In any case, ten of twenty-eight is 36%, ten of twenty-six is 38%, and eleven of twenty-seven is 41%, none of which is greater than 43%. 

 

Five out of seven middle schools are partially accredited

Franklin Military, which has both middle and high school classes, is fully accredited.  Hill is “Partially Accredited Improving School – Pass Rate,” i.e., it is not accredited and “[does] not qualify for a rating of Partially Accredited . . . but [is] making acceptable progress toward full accreditation.”  Binford, Henderson, Brown, and Boushall all are “Partially Accredited” only insofar as they are being reconstituted after being denied accreditation.  Elkhardt-Thompson is “new” and gets a pass, erasing the “Denied” ranking for Thompson last year.

image

The Superintendent gets a “True But Vastly Misleading” rating on this statement, whether we read it as “five of seven” or “five of eight.”

Six out of eight of the comprehensive and specialty high schools met the state’s standards for full accreditation, up from 37 percent last year.

Actually it was 29% last year, 20% if you count Franklin.

image

This year, it indeed is six of eight (if you count Franklin).

image

 

28 out of 45 schools posted gains in English scores for Standards of Learning tests, and 9 schools demonstrated double-digit gains. 

There are two English SOL test classes, Reading and Writing, but a single score for accreditation purposes.  Since the Super speaks of “English” scores, let’s look at the accreditation scores.

Thompson and Elkhardt had awful English scores last year (thirty-eight and forty-two, respectively) but the combined school has no English score reported this year.  That leaves forty-three scores reported, not forty-five.  Of those, twenty-four (not twenty-eight) improved, seven of them (not nine) by double digits; eighteen schools declined; and one remained unchanged.

image

Note that these accreditation scores have been “adjusted” in many cases to depart significantly from the actual pass rates.

 

33 out of 45 schools posted gains in math SOLs.

As to math, it’s twenty-eight up (not thirty-three), twelve of them by double digits; twelve down; and three with no change.

image

Just in case the Super is speaking of actual SOL math pass rates, the results are:

image

That’s thirty-two up, one the same, and twelve down.

Overall, those score increases did not move Richmond from its second-from-lowest place in the state for reading pass rate and improved our math pass rate only from fifth worst to sixth worst.  That’s not much to brag about.

image

image

In light of all this, I’d make four suggestions to the Superintendent:

  1. Stop blaming the kids for the awful instruction in your schools;
  2. Recheck the numbers your staff give you;
  3. Try telling the whole truth, good and bad, when you are bragging on your performance; and
  4. Spend more effort improving instruction and finding out where RPS is wasting the money it has (and talking about these!), and spend less time kvetching about the amount of money in the Mayor’s budget.

The estimable Carol Wolf makes two further suggestions:

  1. Take some credit for cleaning your administrative house, for making the budget more transparent, and for the money you have saved; and
  2. Think about other metrics, e.g., scholarships earned by seniors, student and staff accomplishments that don’t fit into the bureaucratic categories, and Franklin students who have served with distinction in the military.

Dollars But Not Scholars, 2015

Jim Weigand emails to say that VDOE has just reported the 2015 excess Required Local Effort (RLE).  The RLE is the local expenditure required by the Standards of Quality.

The 2014 data are here.

I have juxtaposed the excess local effort, i.e., the expenditure above the requirement expressed as a percentage of the RLE, with the SOL pass rates.

Notes: There are no RLE data for Lee County.  VDOE reports RLE data separately for Greensville County and Emporia and for both James City County and Williamsburg, but SOLs for the combined systems; I have omitted those data.  VDOE also reports RLE data separately for Fairfax County and City but SOLs for the combined system; because the county is so much larger, I have used the county RLE datum.

With those caveats, here are the reading data.

image

Richmond is the gold square.

The least squares fit suggests that doubling the RLE is associated with a 2.4% increase in the pass rate, but the R2 tells us that the pass rates and excess RLEs are essentially uncorrelated.
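
To spell out the “doubling” arithmetic (my reading, assuming the horizontal axis is excess spending expressed as a percent of RLE and the fit is linear): an excess of 100% means spending twice the required amount, so the per-doubling change is just the fitted slope times 100.

\[
\text{pass rate} \approx a + b \cdot (\text{excess RLE, \%}),
\qquad
\Delta_{\text{per doubling}} = 100\,b \approx 2.4 \text{ points}
\]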

The math data present essentially the same picture.

image

The high-price, high-score jurisdiction is West Point.  The second-highest-price jurisdiction, which scores not quite as well, is Falls Church.

SAT Update

RPS has just posted the 2015 SAT data.  Here are the reading scores by school back to 2010, along with the division averages and the Virginia averages.

image

And here are the math scores.

image

I’ve included the available data points for Maggie Walker (the 2010 data are from an RPS post; the 2014, from Jeff McGee at MLW); of course, MLW is not a Richmond public school, although VDOE reports the SOL scores of MLW students at the high schools (that those students do not attend) in those students’ home districts.

To provide some context, here are the 2014 (presumably; they were posted on 12/20/14) 25th and 75th percentile scores of the students admitted to Virginia public colleges, along with the 2014 Virginia and Richmond averages. 

image

image

Here is (part of) what the Web page has to say about what the percentiles mean:

Understanding these numbers is important when you plan how many colleges to apply to, and when you figure out which schools are a reach, a match, or a safety. If your scores are below the 25th percentile numbers, you should consider the school a reach. Note that this does not mean you won’t get in — remember that 25% of students who enroll have a score that is at or below that lower number.

For sure, averages v. percentiles is an apples and pomegranates comparison.  That said, the Virginia reading average is between the 25th percentiles at VMI and Christopher Newport; Richmond is 95 points lower than that state average; for math, the Virginia average is between the 25th percentiles at VCU and VMI while Richmond is 102 points lower.

Where Have All the Data Gone?

Is Richmond Hiding Its Egregious Truancy Problem?

Richmond has a longstanding and ugly problem with truancy.

image

image

Virginia law is perfectly clear as to what Richmond must do about truancy: Schools are required to notify the parents of any unexcused absence.  After the fifth such absence, “[t]he school principal or his designee or the attendance officer, the pupil, and the pupil’s parent shall jointly develop a plan to resolve the pupil’s nonattendance.  Such plan shall include documentation of the reasons for the pupil’s nonattendance.”  After a further (sixth) absence, the school must schedule an attendance conference with the parents.  After a further (seventh) absence, the school division must either prosecute the parents or file a Child in Need of Services/Supervision petition.

For years, Richmond largely ignored these requirements.  Indeed, Richmond only counted ten-absence truancies (three beyond the required filing of a court action) and publicly stated that, upon the tenth absence, it sent a letter to the parents. 

Following some publicity regarding its lawless behavior, Richmond began to schedule more of the required conferences but the number of court actions remained pitifully (and unlawfully) small.

image

Note those numbers, please: In 2014, Richmond was required by law to file somewhere between 2,254 and 3,864 court actions to deal with truant students; they filed only 291, i.e., 13% of 2,254 and thus less than 13% of the required number.

VDOE reports that Richmond held 6,946 conferences (!) in 2015.

On Feb. 24, I emailed our Superintendent and my district School Board member to inquire about the disappearance of the SAT and dropout data from the RPS Web site.  They have not favored me with a reply.  Today I noticed that the truancy data also are missing in action.  Looks like it’s time for a FOIA demand.

 

P.S.: To its credit, Richmond has at least published some truancy data.  The State Board of Education deliberately abides by its failure to even collect data by which it might perform its statutory duty to “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth.”