Dollars But Not Scholars, Yet Again

We have seen (here and here and here) that division expenditure does not correlate with division SOL pass rate.

Today we explore the relationship (if any) between average teacher salary and pass rate.

VDOE posts an annual report that includes the average classroom teacher salaries by division and school (regular K-12 education teachers plus art, music, physical education, technology, remedial, gifted, mathematics, reading, special education, and ESL teachers; teacher aides, guidance counselors, and librarians are not included in the calculation).

Here, for a start, are the 2016 average teacher salaries of the highest and lowest and several selected divisions.

image

Richmond, it seems, is outspending both its peers among the older city divisions and the neighboring counties.

Maggie Walker (despite not being a “school”) looks like a real bargain.

VDOE will have the 2016 SOL scores in time for graduations this month, but it won’t post them until August or September.  So we’ll have to be satisfied with the 2015 pass rates.  Here are the averages of the division pass rates on the reading, writing, math, science, and history & social science tests.

image

Richmond is the gold square.  The red diamonds are, from the left, Hampton, Norfolk, and Newport News.  The green diamonds are, from the top, Hanover, Chesterfield, Henrico, and Charles City (partially obscured, just above Hampton).  Lynchburg is the blue diamond.

You can decide for yourself what kind of return Richmond is getting on our money.

As you see, the computer is glad to fit a curve to these data, but the correlation is nil (R² = 1.3%).
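
For readers who want to check a fit like this themselves, here is a minimal sketch, assuming a hypothetical CSV of division average salaries and pass rates (the file and column names are illustrative, not VDOE’s):

```python
# Minimal sketch (not the analysis behind the graph above): fit a least-squares
# line to division average salary vs. average SOL pass rate and report R-squared.
# Assumes a hypothetical CSV with columns "division", "avg_salary", "pass_rate".
import numpy as np
import pandas as pd

df = pd.read_csv("division_salary_passrate.csv")
x = df["avg_salary"].to_numpy(dtype=float)
y = df["pass_rate"].to_numpy(dtype=float)

slope, intercept = np.polyfit(x, y, 1)       # least-squares straight line
predicted = slope * x + intercept
ss_res = np.sum((y - predicted) ** 2)        # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)         # total sum of squares
r_squared = 1 - ss_res / ss_tot

print(f"R^2 = {r_squared:.1%}")              # a value near 1% means the line explains almost nothing
```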

Turning to the Richmond elementary schools, we see:

image

That 18% correlation looks to be driven in large part by expensive Munford (over at the right) and inexpensive, lousy-scoring Woodville (bottom left).  Note that the high scorer, Carver, is not all that expensive.

The state data still have not caught up with the Elkhardt/Thompson situation.  Here are the other middle schools:

image

R² is only 3.2%.  The low point there is MLK.

As to the high schools, it looks like we have a 34% correlation with salary . . .

image

until we take out Community and Open, which restricts the analysis to the general population high schools + Franklin Military.

image

The low score there is Armstrong.  The expensive school is Huguenot.

Here are the data.

image

Of course, SOL scores depend on the economic status of the students as well as upon the quality of the teaching.  VDOE has student growth percentile (“SGP”) data that are not dependent on economic status, but it has been sequestering those results.  Brian Davison has pried loose some of the data by division.  We’ll see whether his recent court victory will make the SGP data available by school and by teacher.

Has Norfolk Joined the Cheaters Club?

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia on the VGLA.

Now we have the Virginian-Pilot seeing what looks like the smoke of cheating fires: In the course of reporting on attempts to learn whether failing students are being withdrawn from courses where the SOL is mandatory, the paper obtained enrollment data from all the Hampton Roads divisions except one.  Norfolk said it couldn’t retrieve the data.  In the face of an incoherent pushback from the Norfolk Superintendent, the Pilot stood by its story.

The state’s SOL pass rate data may speak to this situation.

As a first look, here are the averages of the pass rates for the five subjects reported, expressed as the differences between the division averages and the state average.

image

Norfolk was on a failing path and it stumbled badly on the new (non-VGLA) reading, writing, and science tests in 2013.  Then, mirabile dictu, it recovered dramatically.

Hmmm.  What about the pass rates for the individual subjects? 

The reading data show the hit from new tests and the subsequent recovery.

image

The math pass rates show the effect of the new tests in 2012 and an even more dramatic recovery.

image

The writing and science pass rates also show big hits from the new tests in 2013 and remarkable recoveries.

image

image

The history and social science data show a dismal pattern broken by a remarkable jump in 2015.

image

You get to draw your own conclusion from this.  I have one I’ll share: I’ll bet you a #2 lead pencil that the State Department of Data Suppression will not look beneath all this smoke to see if there is a bonfire of cheating.

As Promised Earlier Today

Subject: FOIA Request
From: John Butcher <[redacted]@verizon.net>
Date: 04/21/2016 01:31 PM

To: “Pyle, Charles (DOE)” Charles.Pyle@doe.virginia.gov

Mr. Pyle,

I am a Citizen of the Commonwealth and a resident of the City of Richmond at the address set out below.  Under the authority of the Virginia Freedom of Information Act, I request an opportunity to inspect and copy the following public records, as that term is defined at Va. Code § 2.2-3701, that are prepared, owned, or in the possession of the Department of Education:

•    All reading and math assessment scores for teachers in Richmond Public Schools, by teacher and by school, for school years 2000 through 2015, whether derived from student growth percentiles or other data.

•    Records setting forth the method or methods for calculating those assessment scores for each year.

If any record responsive to this request exists in electronic form, I request that you provide it by posting it to the Department’s web site or EMailing it to me at the return address above.

In the event the Department elects to withhold any public record responsive to this request, for each such record please:

•    Identify the record withheld by date, author, title, and summary or purpose of the record;

•    Identify all persons outside your Department to whom the record has been shown or to whom copies have been furnished; and

•    State specifically the statutory exemption under which the Department elects to withhold the record.

If you elect to charge me part or all of the actual cost incurred in accessing, duplicating, supplying, or searching for the requested records, please estimate the total charges beforehand.  If those total charges exceed $100, please notify me before you incur the costs.

Please contact me by telephone at the number below or by email at the address above if I can answer any question about this request.

I look forward to hearing from you as promptly as possible and in any event within the five work days provided by the Act.

John Butcher
[redacted]
Richmond, Virginia 23225
804.[redacted]

Piercing the Secrecy Barrier

As we have seen, the Virginia Department of Data Suppression doesn’t want you to know whether your kid is suffering under a lousy teacher or whether your principal is acting to retrain or fire that lousy teacher. 

The Department would have us believe that Virginia is the Lake Woebegon of Teachers: For example, in 99.28% of all respects, Richmond teachers are evaluated to be at or above average.  In fact, of course, we have some really lousy teachers.  Here in Richmond, in 2015, we had the sixth worst division pass rate in math and the second worst in reading.

Leading the charge against the state’s concealment of the facts we have Brian Davison of Loudoun, who earlier compelled the disclosure of the SGP data by division and by (anonymized) teacher. 

Through Brian’s efforts, we now know that “student growth percentiles have not been used as a teacher performance indicator by Loudoun County Public Schools.”  By the terms of a final order signed by Richmond Circuit Court Judge Melvin Hughes and entered on April 12, VDOE now must cough up the Loudoun assessment data by school and by teacher for the last five years and must pay Brian $35,000 toward his attorney’s fees.

This is a tremendous victory for transparency in public education (and a much-needed breach in the wall of secrecy at VDOE).  I have modified Brian’s Loudoun request (see Exhibit 5), changing the name to Richmond.  I hope you’ll do the same for your school division.

The VGLA Cheating Monster Lives!

As we have just seen, VDOE is concealing participation rates for the VAAP in a manner that raises the question whether some divisions are abusing the VAAP process to boost their scores.

We earlier saw that VDOE ignored rampant abuse of the VGLA until the General Assembly waved a red flag.  VDOE’s response to the new law was to eliminate the VGLA, except for the reading tests in grades 3-8 for Limited English Proficient (LEP) students. 

Unfortunately, it appears that even those remaining VGLA tests are being abused.

The 2015 student achievement reports show VGLA pass rates for 23 divisions.  Fifteen of those 23 divisions (65%) show zero participation.  The average VGLA score is fourteen percent higher than the reading SOL score.  More particularly:

image

Here we see the reading VGLA pass rates of those 23 divisions plotted vs. the SOL pass rates.  The red line shows the ideal: VGLA pass rate same as the SOL pass rate.  The dotted line is fitted to the actual data and the R² shows only a modest correlation between the two scores.

There are two interesting features here:

  • The VGLA pass rates are remarkably higher than the SOL rates; and
  • The difference decreases as the division has less need to boost its scores, i.e., with increasing SOL pass rates.

Redrawing the graph we see:

image

The gold line shows the difference between the fitted and ideal lines.
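
A minimal sketch of how a graph like this could be drawn, assuming a hypothetical CSV of the 23 divisions’ SOL and VGLA reading pass rates (file and column names are illustrative):

```python
# Sketch: scatter the divisions, draw the ideal 1:1 line, the fitted line,
# and the gap between them (the "gold line" in the graph above).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("vgla_vs_sol_2015.csv")     # hypothetical columns: division, sol_pass, vgla_pass
x = df["sol_pass"].to_numpy(dtype=float)
y = df["vgla_pass"].to_numpy(dtype=float)

slope, intercept = np.polyfit(x, y, 1)
grid = np.linspace(x.min(), x.max(), 100)
fitted = slope * grid + intercept

plt.scatter(x, y, label="divisions")
plt.plot(grid, grid, "r-", label="ideal: VGLA = SOL")
plt.plot(grid, fitted, "k--", label="fitted to the data")
plt.plot(grid, fitted - grid, color="gold", label="fitted minus ideal")
plt.xlabel("SOL reading pass rate (%)")
plt.ylabel("VGLA reading pass rate (%)")
plt.legend()
plt.show()
```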

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia earlier on the VGLA. 

And here we have VDOE again hiding data, with the remaining data consistent with cheating.  Should we suspect that VDOE is again hiding evidence and overlooking cheating?  In light of VDOE’s track record of hiding and manipulating data and ignoring wholesale cheating (see this and this and this and this and this and this), the answer is obvious.

————————————

Here are the data for the 23 divisions, sorted by VGLA/SOL ratio:

image

Gains and Not So Much, III

Brian Davison suggests that the increased number of retakes benefits the divisions with low pass rates, i.e., with lots of students who might be eligible for retakes.  But, since the State Department of Data Suppression does not tell us about retakes, we can’t know about that.

I’ll suggest there is a more subtle problem: If a division with a 90% overall pass rate increases that rate by 1%, it has decreased its failure rate by 10% (one point out of ten).  In contrast, a division with a 50% pass rate that increases its pass rate by 1% leaves 49% of its students failing; it has decreased its failure rate by only 2% (one point out of fifty).  To achieve a result equivalent to the first division’s, this division must increase its overall pass rate by 5%.  But then, it is shooting at a larger target:  A division with a high pass rate has little room for improvement; a division with a low pass rate has plenty of room for (and needs to make lots of) improvement.

The estimable Carol Wolf suggests that I use a simpler analogy: If your pass rate is 50%, you get fifty shots per hundred kids at improving it; if the pass rate is 90%, you get only ten.

That is, a fairer measure of progress is the overall pass rate change divided by the percentage of students who failed to pass the previous year.
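
In code, that measure is simply the pass-rate gain divided by the prior year’s failure rate; here is a small illustration using the numbers from the example above:

```python
# The "fairer" measure: what share of last year's failures were converted to passes.
def relative_gain(prev_pass: float, new_pass: float) -> float:
    """Pass-rate gain as a percentage of the prior year's failure rate."""
    room = 100.0 - prev_pass                 # percentage of students who failed last year
    return (new_pass - prev_pass) / room * 100.0

print(relative_gain(90.0, 91.0))   # 10.0 -> a 1-point gain closes 10% of the gap
print(relative_gain(50.0, 51.0))   #  2.0 -> the same 1-point gain closes only 2%
print(relative_gain(50.0, 55.0))   # 10.0 -> a 5-point gain is needed to do as well
```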

Here are those data for Lynchburg:

image

image

image

Gains and Not So Much, II

Delving further into Jim Weigand’s report that his Super is bragging on the year-to-year SOL performance in Lynchburg: The problem is the absence of data on the new retest rate that raised pass rates by about 4% in 2015.  So let’s look at the 2014-2015 changes in pass rates v. the state average, which at least discounts the Lynchburg score gains by the statewide equal opportunity for grade inflation.
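
Here is a sketch of that computation, assuming hypothetical files of division and statewide pass rates by subgroup (the file and column names are illustrative):

```python
# Sketch: subtract the statewide 2014-to-2015 change from each Lynchburg
# subgroup's change, so the statewide retest/grade-inflation effect is discounted.
import pandas as pd

division = pd.read_csv("lynchburg_subgroups.csv")   # columns: subgroup, pass_2014, pass_2015
state = pd.read_csv("state_subgroups.csv")          # same columns, statewide figures

merged = division.merge(state, on="subgroup", suffixes=("_div", "_state"))
merged["gain_vs_state"] = (
    (merged["pass_2015_div"] - merged["pass_2014_div"])
    - (merged["pass_2015_state"] - merged["pass_2014_state"])
)
print(merged[["subgroup", "gain_vs_state"]])
```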

Here, for a start, are the five-subject pass rate changes by race and economic disadvantage (“Y” = economically disadvantaged as reported in the VDOE database; “N” = not).

image

On the five-subject average, the Super gets bragging rights for the non-ED white kids, with a 1.4% score gain over the state average; otherwise he has some explaining to do, especially as to the ED white students, who improved by 3.9% less than the state average.

On the reading tests, the opportunity for bragging rights doubles by the addition of the black, economically disadvantaged students.  But the score gain by the white, ED population lags by 2.5% and the black, not-ED population by 2.9%.

image

Finally, on the math tests, the Lynchburg Super gets full bragging rights, especially as to the black, ED students.

image

Looks to me like Lynchburg needs to figure out what its math teachers are doing and spread some of that to the teaching of the other four subjects.

Lake Woebegon of Teachers??

Browsing through the VDOE Web pages, one finds the Teacher and Principal Evaluation Collection Results.

The only data there are for 2011.  Even that limited dataset, however, is sufficient to demonstrate that the “evaluation” process was ridiculous, if not fraudulent.

The 2011 data show that all forty-six Richmond principals were “satisfactory.”  All our principals, it seems, were at or above average.  Never mind that Richmond’s reading SOL pass rate that year was 1.6 standard deviations below the division mean and its math score was 2.0 standard deviations low.  (Richmond is the gold square on the graphs.)

imageimage

The teacher data were more nuanced but similarly ridiculous.  Here is a summary.

| Rating | Classroom Management / Positive Learning Environment | Communication Skills | Evaluation and Assessment | Implements and Manages Instruction | Knowledge of Subject | Planning Activities | Professional Responsibilities | Total |
|---|---|---|---|---|---|---|---|---|
| EE = Exceeds Expectations | 317 | 437 | 208 | 273 | 479 | 240 | 302 | 2256 |
| ME = Meets Expectations | 698 | 598 | 826 | 754 | 555 | 787 | 733 | 4951 |
| NI = Needs Improvement | 20 | 2 | 3 | 10 | 3 | 9 | 2 | 49 |
| U = Unsatisfactory | 2 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
| Total | 1037 | 1037 | 1037 | 1037 | 1037 | 1037 | 1037 | 7259 |

So we see that three of 7,259 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  That is, the report says that only 0.72% of the items in Richmond teachers’ evaluations showed some aspect of failure to meet or exceed expectations in 2011.
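
For the record, the arithmetic behind that figure:

```python
# The table's totals: 49 "needs improvement" plus 3 "unsatisfactory" ratings
# out of 7,259 items.
below_expectations = 49 + 3
total_ratings = 7259
print(f"{below_expectations / total_ratings:.2%}")   # 0.72%
```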

That is absurd in the abstract; in light of the available data it is baldly mendacious.

You may recall that the SGP data (that Brian Davison had to pry from VDOE with a lawsuit) can measure teacher performance.  Unlike the SOL itself, the SGP data are not correlated with economic advantage or disadvantage.  So the “poor students” excuse doesn’t work as to SGP.

We have SGP data for the following year, 2012.  Here, with caveats, are the reading data, starting with the distribution of teacher average SGPs (i.e., the average, by teacher, of the students’ SGPs).

image

The orange line is a Gaussian distribution fitted to the data: Mean = 45.0; standard deviation = 10.8.

Then here is the distribution of Richmond reading teachers’ average SGPs.

image

Note the absence of very high-performing teachers and the plethora of low performers in Richmond.  One hundred nine of 205 Richmond reading teachers (53.2%, v. 50% in a normal distribution) are below the state mean; sixteen (7.8%, v. 2.5% in a normal distribution) are more than two standard deviations below it, and fifty-two (25.4%, v. 16% in a normal distribution) are more than one standard deviation below it.
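
For those who want to check that comparison, here is a sketch using the fitted statewide distribution (mean 45.0, standard deviation 10.8) and the Richmond counts quoted above; it assumes scipy is available:

```python
# Compare Richmond's observed shares of low-scoring reading teachers with what
# a normal distribution centered on the statewide mean would predict.
from scipy.stats import norm

n_richmond_teachers = 205

observed = {
    "below the state mean": 109,
    "more than 1 SD below": 52,
    "more than 2 SD below": 16,
}
expected_pct = {
    "below the state mean": norm.cdf(0) * 100,    # 50%
    "more than 1 SD below": norm.cdf(-1) * 100,   # ~15.9%
    "more than 2 SD below": norm.cdf(-2) * 100,   # ~2.3%
}

for label, count in observed.items():
    print(f"{label}: {count / n_richmond_teachers:.1%} observed "
          f"vs. {expected_pct[label]:.1f}% expected")
```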

For math, the state distribution has a mean of 46.8 and a standard deviation of 14.6.

image

In contrast to the reading data, Richmond has some outstanding math teachers but their numbers are outweighed by underperforming teachers.

image

Indeed, 111 of 193 Richmond math teachers (57.5%) are below the state mean; six (3.1%) are more than two standard deviations below it, and thirty-seven (19.2%) are more than one standard deviation below it.

Yet, according to the evaluations from the previous year, Richmond’s teachers were just fine, thank you, in 99.3% of all measures.

Just as a reminder, the effect of a good or bad teacher can be dramatic.  Here, for instance, are the students’ 2014 reading SGPs for Richmond teacher #66858 (anonymized in the VDOE database).

image

And here, in contrast, are the students’ SGPs for teacher #68809.

image

Unfortunately, we have too many who are more like #68809 than #66858.

Richmond’s subpar teacher performance has only worsened recently, as reflected in the deteriorating SOL performance: For 2015, we are the sixth worst division in the state in math, second worst in reading.

Of course, our principals and Superintendent have the power (and duty) to remedy these glaring defects.  Inadequate teacher performance that is not corrected reflects inadequate principal performance; inadequate principal performance that is not corrected reflects inadequate Superintendent performance.  We need to hold these public employees accountable when they don’t deal with the teachers who are harming Richmond’s schoolchildren.

But, then, it’s difficult to impose accountability when VDOE is hiding the relevant data.

Your tax dollars at work.


PS: You’ll notice that the internal “evaluations” deal with inputs, which are easy to measure (and to fudge), while the important quantities are outputs (how much our students learn), which are hard to measure.  VDOE is making it harder to know about outputs by abandoning the SGP and installing “progress tables” that measure progress but mostly ignore the lack of it.  Even so, we’ll have some measure of outputs, albeit VDOE doubtless will not release the data.

Seems to me City Council should demand, as a condition of all the money they are spending on RPS, an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).


I’ll save VDOE’s feckless “improvements” in the evaluation process for another post.

The Voice of Reason

The Daily Press today took the side of the taxpayers who are paying our teachers:

Virginia residents pour billions into the public schools each year. Local taxpayers, such as those in Newport News and Hampton, spend millions more on education.

It’s fair to ask about the return we’re receiving on that massive annual investment. And data collected through statewide evaluative testing measures can help provide a more accurate picture of that.

We don’t want to embarrass hard-working teachers. Rather, we believe [these SGP data are] information to which the public is entitled and which it should have.

More Data that VDOE Suppresses

Brian Davison points me to the Arkansas Web site that reports, inter alia, grade inflation:

As required by Arkansas Code Annotated § 6-15-421, the Division of Public School Accountability in the Arkansas Department of Education provides this report of the percentage of students receiving a grade of “B” or above in the corresponding course who did not pass the end of course assessment on the first attempt.  The report also includes the name, address, and superintendent of any high school in which more than twenty percent (20%) of the students received a letter grade of “B” or above, but did not pass the end-of-course assessment on the first attempt.

Indeed, the schools with more than 20% are highlighted in yellow. 

Near the top of the list, we see Badger Academy, where 100% of the students received A or B grades but none passed the End of Course test on the first try.  Perhaps Badger Academy is a special-purpose school, such as the Ark. School for the Deaf (90%), but then we have Decatur High School (50%), Clinton High School (66.7%), and Siatech Little Rock Charter (83.3%).
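
Here is a sketch of how such a flag could be computed from student-level data; the file and column names are hypothetical, not the Arkansas department’s:

```python
# Arkansas-style grade-inflation flag: for each school, the percentage of
# students who earned a "B" or better in the course but failed the matching
# end-of-course test on the first attempt; schools over 20% get flagged.
import pandas as pd

df = pd.read_csv("course_grades_vs_eoc.csv")   # columns: school, course_grade, passed_eoc_first_try (1/0)

b_or_better = df[df["course_grade"].isin(["A", "B"])]
pct_failing = (
    b_or_better.groupby("school")["passed_eoc_first_try"]
    .apply(lambda s: 100.0 * (1 - s.mean()))   # percent of B-or-better students who failed
    .rename("pct_b_or_better_failing_eoc")
)
flagged = pct_failing[pct_failing > 20.0].sort_values(ascending=False)
print(flagged)
```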

In contrast, you can search the VDOE Web site for “grade inflation” and find only one paper that speaks of grade inflation in teacher evaluations and one on the same subject in principal evaluations.  Nothing at all about inflated student grades.

But, then, you wouldn’t really expect the State Department of Superintendent Protection and Data Suppression to report anything so useful.