Excuses, Excuses . . .

WaPo has a piece this morning on the Establishment’s reaction to Brian Davison’s suit, Davison v. Virginia Ed. Dep’t, No. CL14004321-00 (Richmond Cir. Ct., Petition for Mandamus, October 2, 2014), to require VDOE to release SGP data.

Two paragraphs in the WaPo piece capture most of the arguments against release:

Growth percentiles cannot accurately measure growth among the highest- and lowest-performing children, officials say, and they warn that in some cases student scores might be erroneously assigned to teachers who never actually taught them. In addition, they rely on consecutive years of test data that might not be widely available in schools serving transient populations.

And unlike value-added models used by other states, Virginia’s model does not attempt to control for the effects of poverty or other demographic characteristics. Critics of the growth percentiles say that disadvantages teachers who work with the neediest children.

Let’s take those one at a time:

Highest- and Lowest-Performers

Students who score “passing, advanced” two years in a row are not included in the SGP calculation.  That’s because they are doing just fine, thank you, and have precious little room to do better.  The SGP measures improvement.

As well, even among the students who are tested, the statistic has trouble at the extremes.  For example, in the 2014 8th grade reading results, we see the typical spikes at the 1st and 99th percentiles:

[Figure: distribution of 2014 8th grade reading SGPs, showing spikes at 1 and 99]

No teacher can complain about having a few extra 99s in the SGP average.  And, just maybe, that teacher is doing an outstanding job. 

Any teacher with a relatively large number of 1s is dealing with a tough group (so long as the rest of that teacher’s distribution is reasonable).  So the data will tell us to discount the average and look directly at the data (as we should be doing anyhow). 

What is the problem?

Students Counted for Wrong Teacher

The anonymous “officials” complain that the SGP might be inaccurate because students’ scores could be erroneously assigned to teachers who never taught those students.

This is not a criticism of the SGP.  This is a criticism of the local school officials who keep lousy records, and of VDOE, which lets them get away with it.

Transient Populations

The same anonymous “officials” complain that no SGPs are available for transient populations.  That is hardly a reason to suppress the SGP as a measure of the progress of the non-transient students.

Aside: National “Teacher Quality” Group

WaPo also quotes the National Council on Teacher Quality for the propositions that the SGP is “not 100% accurate” and is a “real invasion of privacy.”

What examination, pray, is “100% accurate”?  None.  This is just another attempt to turn the unachievable perfect into the enemy of the good.

As to privacy, it’s easy to understand how ineffective teachers, and their masters, would prefer to keep evidence of their performances secret.  But these are public employees, paid with tax money to educate our society’s children.  How can their preference for privacy outweigh the public’s interest in knowing the effectiveness of our teachers, schools, and school divisions?

2d Aside: Data Could Be Misinterpreted

We also have a Loudoun School Board member complaining that the SGP data “could be misinterpreted” if released.  What that official is not saying is that the public is more easily fooled if the School Board keeps that public uninformed.  And she is admitting that her School Board and the State Board of Education are unwilling or unable to educate the public in the proper use of this important tool.

Effect of Poverty

Aside from privacy, the arguments above are merely misleading attempts to say that the SGP is not perfect, so it must be abandoned entirely.  The “poor kids do worse” argument is a lie.

The SOL scores correlate fairly well with poverty: pass rates drop as the percentage of poor students rises.  VDOE uses the term “economic disadvantage,” and their data show a clear relationship (here on the reading SOLs for 2014):

[Figure: division reading pass rate v. percentage of economically disadvantaged students, 2014]

BTW: Richmond is the gold square on that graph: We have an unusually large percentage of economically disadvantaged students and even more unusually low reading scores.  The red diamonds are Hampton, Norfolk, and Newport News; the green is Charles City.

The SGP process, in contrast, compares progress among similarly situated students.  It produces scores that are largely independent of economic status.  VDOE itself quotes the Colorado data:

[Figure: Colorado data showing growth percentiles largely unrelated to poverty]
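To make concrete what “compares progress among similarly situated students” means, here is a minimal sketch in Python.  The name `toy_sgp` and the sample scores are hypothetical; the real SGP methodology (Betebenner’s model, implemented in the R “SGP” package) fits quantile regressions over one or more prior years rather than matching on an exact prior score.  The toy version simply ranks each student’s current score against peers who started from the same place, which is enough to show why the result is largely independent of where a student starts:

```python
from bisect import bisect_left, bisect_right
from collections import defaultdict

def toy_sgp(records):
    """records: iterable of (student_id, prior_score, current_score).

    Returns {student_id: growth percentile, 1-99}.  Each student's
    percentile is the mid-rank of their current score among students
    with the SAME prior score.  (Real SGP uses quantile regression
    over prior-year scores, not exact-score matching.)
    """
    peers = defaultdict(list)
    for _sid, prior, current in records:
        peers[prior].append(current)
    for scores in peers.values():
        scores.sort()

    sgp = {}
    for sid, prior, current in records:
        group = peers[prior]
        below = bisect_left(group, current)          # peers strictly below
        ties = bisect_right(group, current) - below  # peers tied (incl. self)
        pct = (below + 0.5 * ties) / len(group) * 100
        sgp[sid] = min(99, max(1, round(pct)))       # SGP is reported 1-99
    return sgp

# Hypothetical scores: low starters and high starters get
# the same spread of growth percentiles.
records = [
    ("a", 300, 320), ("b", 300, 340), ("c", 300, 360),  # began at 300
    ("d", 500, 505), ("e", 500, 520), ("f", 500, 540),  # began at 500
]
print(toy_sgp(records))
# {'a': 17, 'b': 50, 'c': 83, 'd': 17, 'e': 50, 'f': 83}
```

Note that student “d,” who began at 500, gets the same growth percentile as student “a,” who began at 300: the measure asks how much each student grew relative to comparable peers, not how high the student scored.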

Indeed, VDOE has contradicted this lie:

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

In short, it is the SOLs that provide a biased standard for comparison of teachers, schools, and divisions: they compare the scores of needy children to those of their affluent peers.  The SGP removes that bias and gives a measure of how much a teacher, school, or division is teaching its students, be they rich or poor.

Yet, VDOE, the Teachers Ass’n., and the Loudoun School Board all want to keep these important data secret.  What do you suppose they are trying to hide?