Dollars But Not Scholars, Yet Again

We have seen (here and here and here) that division expenditure does not correlate with division SOL pass rate.

Today we explore the relationship (if any) between average teacher salary and pass rate.

VDOE posts an annual report that includes the average classroom teacher salaries (regular K-12 education teachers plus art, music, physical education, technology, remedial, gifted, mathematics, reading, special education, and ESL teachers; not included in the calculation are teacher aides, guidance counselors, and librarians) by division and school.

Here, for a start, are the 2016 average teacher salaries of the highest and lowest and several selected divisions.

image

Richmond, it seems, is outspending both its peer older-city divisions and the neighboring counties.

Maggie Walker (despite not being a “school”) looks like a real bargain.

VDOE will have the 2016 SOL scores in time for graduations this month but they won’t post them until August or September.  So we’ll have to be satisfied with the 2015 pass rates.  Here are the averages of the division pass rates on the reading, writing, math, science, and history & social science tests.

image

Richmond is the gold square.  The red diamonds are, from the left, Hampton, Norfolk, and Newport News.  The green diamonds are, from the top, Hanover, Chesterfield, Henrico, and Charles City (partially obscured, just above Hampton).  Lynchburg is the blue diamond.

You can decide for yourself what kind of return Richmond is getting on our money.

As you see, the computer is glad to fit a curve to these data but the correlation is nil (R2 = 1.3%).
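For the curious, the R2 on these graphs comes from an ordinary least-squares fit, which is easy to reproduce. Here is a minimal Python sketch; the salary/pass-rate pairs are invented for illustration and are not the VDOE data:

```python
import numpy as np

def fit_line_r2(x, y):
    """Least-squares line y = m*x + b and the R-squared of the fit."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    m, b = np.polyfit(x, y, 1)              # slope and intercept
    residuals = y - (m * x + b)
    ss_res = np.sum(residuals ** 2)         # unexplained variation
    ss_tot = np.sum((y - y.mean()) ** 2)    # total variation
    return m, b, 1 - ss_res / ss_tot

# Invented (average salary in $1000s, average pass rate %) pairs
salaries = [44, 48, 51, 55, 60, 63]
rates = [78, 71, 83, 75, 80, 74]
m, b, r2 = fit_line_r2(salaries, rates)
print(f"slope = {m:.2f}, intercept = {b:.1f}, R2 = {r2:.1%}")
```

A tiny R2 (like the 1.3% here) says the fitted line explains almost none of the division-to-division variation in pass rates.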

Turning to the Richmond elementary schools, we see:

image

That 18% correlation looks to be driven in large part by expensive Munford (over at the right) and inexpensive, lousy scoring Woodville (bottom, left).  Note that the high scorer, Carver, is not all that expensive.

The state data still have not caught up with the Elkhardt/Thompson situation.  Here are the other middle schools:

image

R2 is only 3.2%.  The low point there is MLK.

As to the high schools, it looks like we have a 34% correlation with salary

image

until we take out Community and Open, which restricts the analysis to the general population high schools + Franklin Military.

image

The low score there is Armstrong.  The expensive school is Huguenot.

Here are the data.

image

Of course, SOL scores depend on the economic status of the students as well as upon the quality of the teaching.  VDOE has student growth percentile (“SGP”) data that are not dependent on economic status but they have been sequestering those results.  Brian Davison has pried loose some of the data by division.  We’ll see whether his recent court victory will make the SGP data available by school and by teacher.

Has Norfolk Joined the Cheaters Club?

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia on the VGLA.

Now we have the Virginian-Pilot seeing what looks like the smoke of cheating fires: In the course of reporting on attempts to learn whether failing students are being withdrawn from courses where the SOL is mandatory, the paper obtained enrollment data from all the Hampton Roads divisions except one.  Norfolk said it couldn’t retrieve the data.  In the face of an incoherent pushback from the Norfolk Superintendent, the Pilot stood by its story.

The state’s SOL pass rate data may speak to this situation.

As a first look, here are the averages of the pass rates for the five subjects reported, expressed as the differences between the division averages and the state average.

image

Norfolk was on a failing path and it stumbled badly on the new (non-VGLA) reading, writing, and science tests in 2013.  Then, mirabile dictu, it recovered dramatically.

Hmmm.  What about the pass rates for the individual subjects? 

The reading data show the hit from new tests and the subsequent recovery.

image

The math pass rates show the effect of the new tests in 2012 and an even more dramatic recovery.

image

The writing and science pass rates also show big hits from the new tests in 2013 and remarkable recoveries.

image

image

The history and social science data show a dismal pattern broken by a remarkable jump in 2015.

image

You get to draw your own conclusion from this.  I have one I’ll share: I’ll bet you a #2 lead pencil that the State Department of Data Suppression will not look beneath all this smoke to see if there is a bonfire of cheating.

More Money Down the RPS Rathole?

We dropped our subscription to the Richmond Times-Dispatch some time ago: The paper kept getting smaller and the local coverage more curtailed. 

Even so, the (excellent!) VPAP newsfeed discloses an ongoing kerfuffle over whether the Richmond Public Schools shall have more money.

Despite the recognition that the awful condition of some school buildings reflects inadequate maintenance, i.e., deliberate waste, there’s no talk of assuring that the new buildings will be properly maintained. 

Indeed, the money discussion has avoided the central problem: RPS is wasting money wholesale.

  • They spent an inordinate sum to paint handicap parking spaces and even more money to design them.
  • They violated Virginia law to give a $291,080 elevator design contract to a favored engineering firm.
  • It seems that every operation the City Auditor touches at RPS turns out to be wasting money (bottom of the page).
  • My own estimate suggests that something like $50 million per year is disappearing into the RPS budget with no discernible result.

Seems to me that the City should demand independent oversight of RPS spending before it even considers budgeting more money for the system.

As Promised Earlier Today

Subject: FOIA Request
From: John Butcher <[redacted]@verizon.net>
Date: 04/21/2016 01:31 PM

To: “Pyle, Charles (DOE)” Charles.Pyle@doe.virginia.gov

Mr. Pyle,

I am a Citizen of the Commonwealth and a resident of the City of Richmond at the address set out below.  Under the authority of the Virginia Freedom of Information Act, I request an opportunity to inspect and copy the following public records, as that term is defined at Va. Code § 2.2-3701, that are prepared, owned, or in the possession of the Department of Education:

•    All reading and math assessment scores for teachers in Richmond Public Schools, by teacher and by school, for school years 2000 through 2015, whether derived from student growth percentiles or other data.

•    Records setting forth the method or methods for calculating those assessment scores for each year.

If any record responsive to this request exists in electronic form, I request that you provide it by posting it to the Department’s web site or EMailing it to me at the return address above.

In the event the Department elects to withhold any public record responsive to this request, for each such record please:

•    Identify the record withheld by date, author, title, and summary or purpose of the record;

•    Identify all persons outside your Department to whom the record has been shown or to whom copies have been furnished; and

•    State specifically the statutory exemption under which the Department elects to withhold the record.

If you elect to charge me part or all of the actual cost incurred in accessing, duplicating, supplying, or searching for the requested records, please estimate the total charges beforehand.  If those total charges exceed $100, please notify me before you incur the costs.

Please contact me by telephone at the number below or by email at the address above if I can answer any question about this request.

I look forward to hearing from you as promptly as possible and in any event within the five work days provided by the Act.

John Butcher
[redacted]
Richmond, Virginia 23225
804.[redacted]

Piercing the Secrecy Barrier

As we have seen, the Virginia Department of Data Suppression doesn’t want you to know whether your kid is suffering under a lousy teacher or whether your principal is acting to retrain or fire that lousy teacher. 

The Department would have us believe that Virginia is the Lake Wobegon of Teachers: For example, in 99.28% of all respects, Richmond teachers are evaluated to be at or above average.  In fact, of course, we have some really lousy teachers.  Here in Richmond, in 2015, we had the sixth worst division pass rate in math and the second worst in reading.

Leading the charge against the state’s concealment of the facts we have Brian Davison of Loudoun, who earlier compelled the disclosure of the SGP data by division and by (anonymized) teacher. 

Through Brian’s efforts, we now know that “student growth percentiles have not been used as a teacher performance indicator by Loudoun County Public Schools.”  By the terms of a final order signed by Richmond Circuit Court Judge Melvin Hughes and entered on April 12, VDOE now must cough up the Loudoun assessment data by school and by teacher for the last five years and must pay Brian $35,000 toward his attorney’s fees.

This is a tremendous victory for transparency in public education (and a much-needed breach in the wall of secrecy at VDOE).  I have modified Brian’s Loudoun request (see Exhibit 5), changing the name to Richmond.  I hope you’ll do the same for your school division.

Is Fairfax Above the Law?

Fairfax doesn’t like Virginia’s mandatory school attendance law, so they object to the regulation that proposes to enforce that law.

Va. Code  § 22.1-269 requires that the Board of Education “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth.” Notwithstanding that mandate, the Board still neither collects nor publishes data that would allow the public to assess its performance of that duty.

The Board did not publish a proposed truancy regulation until December 21, 2009. The history of that regulation is set forth on the Town Hall website.  In short, the regulation completed its fourth(!) public comment period on 12/2/2015; the proposed regulation still fails to discharge the Board’s duty and the Board has not yet acted to either adopt the proposed regulation or to repair it.

Yet, in the People’s Republic of Northern Virginia, the proposed regulation is defective insofar as it does attempt to require compliance with State law.

In comments dated 12/2/15, Dr. Karen Garza of Fairfax County Schools says:

[T]he draft regulations require the creation of an attendance plan after the fifth unexcused absence, as specified by Code.  But instead of allowing time for that plan’s implementation or for any sort of evaluation of a student’s progress toward addressing issues identified by the plan, the regulations (again in compliance with Code) require additional significant interventions after only a single additional unexcused absence.  One more unexcused absence beyond that potentially triggers a court petition and legal intervention.  Simply put, this timeline is fundamentally flawed and undermines the potential usefulness of the required attendance plan by giving it little to no time to actually work.  Unless the underlying Code constraints are addressed, regulatory changes will be of limited worth in truly moving student attendance policies in school divisions toward evidence-based best practices.

In short: “The State law is fundamentally flawed and we object to your regulation that would seek to enforce that law.”

The Fairfax schools’ Web page tells us that Dr. Garza is the Superintendent up there.  Her comments tell us that Dr. Garza needs to be fired and replaced with a Superintendent who is willing to apply Virginia law as it is, not as she wishes it to be.

 

PS to Superintendent Staples: In the likely event that Fairfax won’t fire her, you and your Board can.

The VGLA Cheating Monster Lives!

As we have just seen, VDOE is concealing participation rates for the VAAP in a manner that raises the question whether some divisions are abusing the VAAP process to boost their scores.

We earlier saw that VDOE ignored rampant abuse of the VGLA until the General Assembly waved a red flag.  VDOE’s response to the new law was to eliminate the VGLA, except for the reading tests in grades 3-8 for Limited English Proficient (LEP) students. 

Unfortunately, it appears that even those remaining VGLA tests are being abused.

The 2015 student achievement reports show VGLA pass rates for 23 divisions.  Fifteen of those 23 divisions (65%) show zero participation.  The average VGLA score is fourteen percent higher than the reading SOL score.  More particularly:

image

Here we see the reading VGLA pass rates of those 23 divisions plotted vs. the SOL pass rates.  The red line shows the ideal: VGLA pass rate same as the SOL pass rate.  The dotted line is fitted to the actual data and the R2 shows only a modest correlation between the two scores.

There are two interesting features here:

  • The VGLA pass rates are remarkably higher than the SOL rates; and
  • The difference decreases as the division has less need to boost its scores, i.e., with increasing SOL pass rates.

Redrawing the graph we see:

image

The gold line shows the difference between the fitted and ideal lines.
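That gold line can be reproduced directly from the fit: it is just the fitted line minus the ideal (VGLA = SOL) line. A Python sketch with invented VGLA/SOL pairs, not the VDOE numbers, that mimic the pattern above:

```python
import numpy as np

# Invented (SOL, VGLA) pass-rate pairs, in percent -- illustration only
sol = np.array([55, 60, 65, 70, 75, 80], dtype=float)
vgla = np.array([82, 80, 85, 84, 88, 87], dtype=float)

m, b = np.polyfit(sol, vgla, 1)   # fitted line: VGLA = m*SOL + b
boost = (m * sol + b) - sol       # fitted minus ideal (VGLA = SOL)

for s, g in zip(sol, boost):
    print(f"SOL {s:.0f}%: fitted VGLA boost {g:+.1f} points")
```

When the fitted slope is less than 1, the boost shrinks as the SOL pass rate rises, which is exactly the second "interesting feature" noted above.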

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia earlier on the VGLA. 

And here we have VDOE again hiding data, with the remaining data consistent with cheating.  Should we suspect that VDOE is again hiding evidence and overlooking cheating?  In light of VDOE’s track record of hiding and manipulating data and ignoring wholesale cheating (see this and this and this and this and this and this), the answer is obvious.

————————————

Here are the data for the 23 divisions, sorted by VGLA/SOL ratio:

image

VAAP Claptrap

Virginia has four programs for testing students with “special needs.”  I have written at length about the abuse of one of them, the Virginia Grade Level Alternative (VGLA), to (1) classify as handicapped students who were not and (2) artificially boost pass rates on the required standardized tests. 

Another of those programs, the Virginia Alternative Assessment Program (VAAP), offers a testing alternative for students with “significant cognitive disabilities.”  For reasons I’ll discuss in a later post, I’ve taken a look at the VAAP test scores.

The first thing that jumps out is that many divisions show nonzero pass rates on the VAAP but zero participation counts: For 2015, VDOE reports VAAP reading test participation data for only twenty (of 132) divisions; they report zero participation counts but nonzero pass rates for 113 divisions.

Here for those twenty divisions is a graph of the 2015 VAAP pass rate v. the SOL pass rate.

image

Ideally, these data would lie on the red line, indicating that the average test performance was the same on both tests.  The least squares fit to the actual data, the dotted blue line, suggests that the results are fairly close to the ideal, and the R2 indicates a modest correlation.

If we include all 113 divisions that report a VAAP pass rate, the picture changes.

image

The R2 indicates only minuscule correlation between the VAAP and SOL scores.  The slope of the fitted line suggests that the divisions with lower SOL scores (i.e., those in need of better scores) have relatively higher VAAP scores.

If you don’t smell a rat here, read on.

The 2015 math tests present a similar picture.

image

image

What’s with all those zero participation counts?

The count is important because it allows some insight into whether a division is using the alternative test to boost its scores, either by easy grading of the alternative test or by removing marginal performers from the SOL testing.  As well, the count tells us whether the division is above the 1% cap on VAAP testing allowed by the feds at 34 CFR § 200.13(c)(2)(i).  And we know that divisions have used the alternative tests to cheat and that VDOE has let them get away with it.
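That 1% cap is simple arithmetic, which is why the suppressed counts matter: anyone with the numbers could run the check. A Python sketch of it (the division names and counts are invented; VDOE, of course, has the real ones):

```python
# Flag divisions whose alternative-assessment (VAAP) participation exceeds
# the 1% federal cap (34 CFR 200.13).  All counts here are hypothetical.
CAP = 0.01

divisions = {
    # name: (vaap_tested, total_tested)
    "Division A": (12, 1500),
    "Division B": (40, 2200),
    "Division C": (7, 900),
}

for name, (vaap, total) in divisions.items():
    share = vaap / total
    flag = "OVER CAP" if share > CAP else "ok"
    print(f"{name}: {share:.2%} {flag}")
```

With participation reported as zero, this check is impossible, which is rather the point.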

VDOE says it suppresses test participation counts of fewer than ten “to protect the identity of individual students.”  The actual suppression process is far more draconian (and opaque), and its purpose is far less clear than that.

image

The first entry, upper left, is misleading; once you read the whole thing you’ll see it means “if any individual count anywhere is <10, we suppress almost everything.”

The effect of all this suppression can be astonishing.  For example, here are the highest 2015 math VAAP pass rates <100% for divisions where the participation is reported as zero (as well, six divisions report 100% pass rates and zero participation):

image

Students come in integral units, not in decimal fractions.  The highest possible pass rate for 9 or fewer (integral) students, other than 100%, is 8/9, i.e., 88.88…%.  All of these pass rates are higher.  Thus, counts larger than 9 are being suppressed.

Indeed, the ratios of the smallest integers in the VDOE reported pass rates above are 11/12 (91.666…%; Goochland, Page, Powhatan, Staunton, and Sussex), 12/13 (92.308%; Poquoson and Westmoreland), and 14/15 (93.33…%, Frederick). 

Of course, 11/12 = 22/24 = 33/36 . . .   So all we can tell for sure is that VDOE is suppressing numbers larger, and possibly much larger, than 9.

At the extreme, Stafford: the smallest integers that produce a 95.97% pass rate are 119/124(!).  So VDOE’s suppression rules report a zero for a VAAP participation that was at least 124, and could have been two or three (or more) times that.
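This reverse arithmetic generalizes: a short search recovers the smallest pass/participation counts consistent with any reported rate. A Python sketch, assuming (as with Stafford's 95.97%) that the published rates are rounded to two decimal places:

```python
def min_counts(rate_pct, max_denominator=1000):
    """Smallest (passed, tested) pair whose percentage rounds to rate_pct.

    Assumes the published rate is rounded to two decimal places.
    Returns None if no pair is found up to max_denominator.
    """
    for tested in range(1, max_denominator + 1):
        # The only candidate numerator is the nearest integer
        passed = round(rate_pct / 100 * tested)
        if 0 <= passed <= tested and round(100 * passed / tested, 2) == rate_pct:
            return passed, tested
    return None

# Stafford's reported 95.97% requires at least 119 of 124 students:
print(min_counts(95.97))   # -> (119, 124)
# A reported 91.67% is first reachable at 11 of 12:
print(min_counts(91.67))   # -> (11, 12)
```

As the 11/12 = 22/24 = 33/36 example shows, these are floors, not counts: the true participation could be any multiple.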

In summary, here is that set, showing the minimum pass/participation integers:

image

There are no student identities available from the VDOE Build-A-Table.  So, why is VDOE hiding all these participation data?  Could it be that they are suppressing numbers that could embarrass both VDOE and a number of school divisions?

Scott Adams, creator of Dilbert, argues (see, also this) that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.” 

Given VDOE’s interest in high SOL scores, should we suspect that they are hiding something here?

In light of VDOE’s track record of hiding and manipulating data (see this and this and this and this and this and this), the answer is obvious.

AWOL on Truancy

In light of Richmond’s flagrant violations of the state law regarding truancy and their failure to post any truancy data this year, I filed a FOIA request for their 2015 data.  Their response, in short (the 2d column lists the number of students with the specified number of absences; the fourth column, the number of the required responses):

image

As a reminder, the law requires:

  • 5 unexcused absences: Attendance Plan;
  • 6 unexcused absences: Conference w the Parents; and
  • 7 unexcused absences: Prosecution of the Parents or CHINS petition v. the student.

Richmond’s recent history (Note: They have been counting ten-absence cases and sending the Parents a letter):

image

So this year they are not even keeping records, except to show an appalling number of five-absence cases and that the unlawfully minuscule number of CHINS petitions has dropped.

You might think that VDOE, which has the duty to “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth” would be doing something about this tragic defiance of the law.  But to think that you’d have to be unaware of VDOE’s ongoing disdain for performing that duty.

What I don’t understand is why somebody doesn’t mandamus the School Board.  Or sue them for damages (e.g., for a death by stabbing) caused by a frequent truant.

(Would be) Theft from Motor Vehicle

Yesterday afternoon Ms. Penelope looked out the window and saw a young fellow walking up Riverside Drive trying the driver’s side doors of the parked vehicles (warm day; lots of river visitors).  When she and I got out the front door, we saw the fellow with the door of a car open, right in front of our mailbox.

Never mind that some thoughtless fool parked so Mr. Worsham would have to get around the car to deliver our mail.  That thoughtless fool left his car unlocked, helping to chum for criminals in our quiet neighborhood.  Indeed, car break-ins (or, more often, thefts from unlocked cars) continue to be the #1 crime in our block.

The would-be thief closed the door and went on down the block when he saw us watching him.  Then he worked his way back toward 42d St., got into a shiny, black car, and drove away.  He was gone when the cops got here.

For what it’s worth: thin, tan knit cap over ears, wife-beater shirt, orange underwear showing where the belt was half way down his butt.

There’s not much we can do about the fools who park here and leave stuff in their unlocked vehicles.  But, as always, put your stuff in the trunk or in the house and lock your cars so you won’t contribute to, and suffer from, this problem.