Racial Smoke Screen in Lynchburg

Jim Weigand points out the News & Advance article reporting that the Lynchburg school gurus have concluded that students’ race is a greater indicator of challenge than poverty, albeit both factors “matter a lot.”

Ask the wrong question, get an irrelevant answer. 

The racial achievement gap is a fact of life, although the reasons remain a lively source of controversy.

The relevant question here is not whether there is a racial gap in Lynchburg or whether that gap exceeds the differences attributable to economic disparity; the question is whether Lynchburg’s students are learning as much as their racial and economic peer groups statewide.

Let’s start with the SGP data.  Those data, which are essentially uncorrelated with whether students are economically disadvantaged, tell us that Lynchburg’s schools are doing an awful job.  For example, the 2014 statewide distributions of SGPs by teacher show an average of 48 for the reading tests and 49.3 for math.

image

Lynchburg, in contrast, has a reading average of 40.2

image

and a math average of 37.2.

image

How would you like to have your kid in the hands of that Lynchburg math “teacher” who produced an average SGP of four?

(In light of the manifest utility of these data, do you wonder why the teachers’ association, which claims it works “for the betterment of public education,” thinks it would be terrible to publicly identify the good and bad teachers in Virginia’s public schools or why VDOE has had second thoughts and is abandoning the SGP?)

We don’t have SGP data by race (The Virginia Department of Data Suppression has those data but has not shared them).  Less usefully, the VDOE database can break out pass rates by race.  Data there show that, statewide, Asian students on the reading tests outperform white students, who outperform black students.  This holds both for students who are and for those who are not economically disadvantaged.

image

The Lynchburg pattern is somewhat different.

image

To illuminate the differences, we can calculate the ratios of the Lynchburg and State pass rates by race and economic status:

image

Here we see Lynchburg’s white students, economically disadvantaged and not, performing at about the same level as their peer groups statewide.  The black students who are not economically disadvantaged are underperforming the state average of similarly situated black students; Lynchburg’s economically disadvantaged black students are considerably underperforming their statewide peer group.  So, economic disadvantage or no, Lynchburg’s black students are passing the tests at a rate below the state average.
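
The ratio calculation here is straightforward: divide each Lynchburg pass rate by the statewide pass rate for the same race-and-economic-status group.  A minimal sketch, with hypothetical pass rates standing in for the actual data:

```python
# Sketch of the ratio calculation: divide each Lynchburg pass rate by the
# statewide pass rate for the same race/economic group.  The numbers here
# are hypothetical placeholders, not the actual VDOE data.

lynchburg = {
    ("White", "Not Disadvantaged"): 80.0,
    ("White", "Disadvantaged"): 65.0,
    ("Black", "Not Disadvantaged"): 60.0,
    ("Black", "Disadvantaged"): 45.0,
}
state = {
    ("White", "Not Disadvantaged"): 82.0,
    ("White", "Disadvantaged"): 66.0,
    ("Black", "Not Disadvantaged"): 70.0,
    ("Black", "Disadvantaged"): 58.0,
}

ratios = {group: lynchburg[group] / state[group] for group in lynchburg}

for group, ratio in sorted(ratios.items()):
    # A ratio below 1.0 means the Lynchburg group underperforms its
    # statewide peer group on that test.
    print(group, round(ratio, 2))
```

A ratio near 1.0 says the local group is keeping pace with its statewide peers; the further below 1.0, the larger the local shortfall.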

Here are the same data for the math tests:

image

image

image

So we see that Lynchburg’s white students, both economically disadvantaged and not, pass the reading tests at or slightly above the state average rates for their groups but underperform on the math tests.  Lynchburg’s black students, both economically disadvantaged and not, considerably underperform their peers statewide.

The important questions Lynchburg should be seeking to answer are why its black students underperform the state averages for their peers, economically disadvantaged or not, in both reading and math and why all Lynchburg groups but the economically disadvantaged Asian students (!) underperform in math.

Lies, Damn Lies, and SOL Scores

Both VDOE and the Governor are out with press releases bragging on the “significant progress” (or “significant improvement”) in SOL pass rates.  To their credit, both then acknowledge that the five-point increases in the reading and math pass rates come in the face of four-point score boosts from the newly installed retakes.

VDOE has not released data that would let us examine the retake boost in any detail.  Using their four-point number, the “significant progress,” 4.6% in reading and 5.2% in math, looks more like minimal progress: 0.6% in reading, 1.2% in math:

image
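
The arithmetic behind that adjustment is just a subtraction, but it is worth making explicit.  A minimal sketch, using the gains from the press releases and VDOE’s own four-point retake estimate:

```python
# Subtract VDOE's estimated four-point retake boost from the reported
# pass-rate gains to get the residual ("minimal") progress.
RETAKE_BOOST = 4.0  # points, per VDOE's own estimate

reported_gain = {"reading": 4.6, "math": 5.2}

adjusted_gain = {
    subject: round(gain - RETAKE_BOOST, 1)
    for subject, gain in reported_gain.items()
}

print(adjusted_gain)  # → {'reading': 0.6, 'math': 1.2}
```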

As Omar might (not) have written:

The moving target moves; and having moved,

Moves on:  nor all thy piety nor wit

Shall lure it back to give an honest answer

Nor all thy tears wash away the bureaucrats’ obfuscation.

Lynchburg SGP

My paternal grandmother was Angie Lynch, said to be a relative of John Lynch.  Angie was the second woman in the Oklahoma territory with an advanced degree.

I’ve maintained an affection for Lynchburg, especially in celebration of the US 460 bypass that makes travel to Roanoke a much lighter task.  So it was a particular sorrow when my earlier Lynchburg post got wiped.

In light of VDOE’s third data release (that includes data by teacher, but not by school), I thought I’d redo the post.

First, as a reminder, here are the statewide distributions of teacher average SGPs in reading and math.

image

image

Next the Lynchburg distributions.

image

image

image

Need I say it: These are not good numbers.

We have three years’ data so let’s look at the trends, restricting the graphs to those teachers who taught the subject for all three years.

image

There are too many reading teachers to make much sense of the graph (the table on the right is too small to even list them all).  Let’s take out all but the top and bottom few.

image

Here we see some average and below average teachers improving nicely (No. 66197 presents a happy picture) and others deteriorating severely (No. 69532 is an unfortunate counterbalance to No. 66197).  The citywide average by teacher (that includes all the teachers, including those who taught reading for only one or two years) is low and the lack of a trend does not suggest improvement.

Going directly to the bowdlerized dataset, the math data are more lively.

image

Of interest, we again see low-performing teachers whose performance deteriorated.  We also see a citywide average that bounced but then dropped back to subpar.

Only three Lynchburg teachers taught Algebra I all three years so the graph is much simpler.

image

None of the three improved over the period; quite the contrary.  The average is pulled down by the teachers, not shown, who taught fewer than all three years.  It starts above the state average but deteriorates into the unacceptable range populated by Lynchburg’s reading and math averages.

We also have detailed data by teacher, albeit VDOE won’t tell us who they are.  The high-performing teacher in this collection is No. 71485, who had only one 4th grade math student scoring below the statewide average.

image

In contrast, the best math SGP in the 4th grade class of Teacher No. 71819 was 23.

image

This teacher also had a 4th grade reading class.

image

The 25.7 average in that reading class is far from acceptable but it is far less dismal than the 4.4 average in this teacher’s math class.

For any reader inclined to doubt the fundamental notion that the SGP measures teacher performance, a glance at the eight students who took both reading and math from this teacher is instructive.

image

One student scored in the same Student Growth Percentile in both subjects; the other seven scored higher, some much higher, in reading.  Note especially student No. 7B89048849408, who scored in the first percentile with the benefit of this teacher’s math instruction but in the 70th on the reading test.

Unfortunately, this teacher is getting worse.

image

I could go on but I think these data make my points.  I’ll suggest five things:

  • Lynchburg has a problem with its school system.
  • No. 71819 is an awful math teacher.
  • No. 71819 is a very bad reading teacher.
  • Any principal who subjected schoolchildren to No. 71819 in 2015 should be fired.
  • The bureaucrats at VDOE who refuse to identify No. 71819, as well as that teacher’s principal, to the parents of Lynchburg are misusing the public funds that pay them and pay for the statewide testing.

Important SGP Data Suppressed by VDOE

The third SGP data release by VDOE contains anonymized teacher IDs (but no data by school).  These abbreviated data serve to emphasize the perversity of VDOE’s suppression of the teacher identities (and other data).

In Richmond, according to the database, 304 teachers taught reading in grades 4 to 8 in the 2012-2014 period.  Of these, 74 taught the subject all three years.  A graph of the average SGP performance of those 74 is far too busy to convey much information, aside from showing the remarkable range of the average scores and the dip in 2013 because of the former Superintendent’s failure to align the curriculum to the new SOL tests.

image

If we remove all but the top and bottom four or five 2014 scores, the graph is more informative.

image

The average shows the Richmond dip in 2013.  Note that the State SOL scores dropped in 2013, because of the new tests, but the SGP did not: The SGP measures relative performance and the statewide SGP average is unaffected by the overall drop in SOL scores.  Richmond’s SGP dropped relative to the statewide numbers, however, denoting underperformance here.

Turning to specifics: Teacher No. 66858 (Gr 5 reading) started above average and improved dramatically.  Teacher No. 74415 (Gr 4 reading) started below average and deteriorated dramatically. 

The distribution of Teacher 66858’s 2014 SGP scores provides a more detailed picture of that teacher’s excellent job.

image

image

This teacher had only one student whose reading progress in 2014 was below average.  The low end of the 95% confidence interval of the mean for these data is 76.2. 
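
For readers who want to check such intervals themselves, here is a minimal sketch of the confidence-interval arithmetic.  The scores below are hypothetical (the 76.2 figure above comes from the actual data), and a normal approximation stands in for the t-distribution quantile, which would be more exact for a small class:

```python
# Sketch of a 95% confidence interval for a class's mean SGP.
# Scores are hypothetical; a normal approximation is used in place of
# the t distribution (adequate for illustration, slightly narrow for
# small classes).
from math import sqrt
from statistics import NormalDist, mean, stdev

scores = [88, 92, 75, 81, 95, 69, 84, 90, 77, 86, 93, 80]  # hypothetical SGPs

n = len(scores)
m = mean(scores)
se = stdev(scores) / sqrt(n)      # standard error of the mean
z = NormalDist().inv_cdf(0.975)   # ~1.96 for a two-sided 95% interval

low, high = m - z * se, m + z * se
print(f"mean = {m:.1f}, 95% CI = ({low:.1f}, {high:.1f})")
```

When even the low end of such an interval sits far above 50, as with Teacher 66858, the above-average performance is not plausibly a fluke of sampling.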

In contrast,

image

image

The high end of the 95% confidence interval for the mean of this teacher is 13.9.  Notice that this teacher’s 2013 performance was not a whole lot better.

The principal who allowed 25 kids (we have SGPs for 24 of the 25) to be subjected to this educational malpractice in 2014 should have been fired.  Yet VDOE deliberately makes it impossible for Richmond’s parents to know whether this situation has been corrected or whether, as is almost certain, another batch of kids is being similarly afflicted with this awful teacher.

The math data show a similarly diverse pattern, albeit without the 2013 drop: good and average teachers getting better; average and bad teachers getting worse; bad teachers staying bad.

image

It turns out that both of the reading teachers above also taught math to the same kids that they taught (or failed to teach, in the one case) reading. 

No. 66858 turns out to be an excellent math teacher, albeit not as excellent as at reading.

image

image

Similarly, No. 74415 is a bad math teacher, but not as awful as at reading.

image

image

No real surprises here.  We would expect that, to some degree, teaching math might take a different skill set than teaching reading.  We also might expect that a good reading teacher would be good at teaching math, and a bad reading teacher similarly bad at math.

I could go on and on but the point already is clear: VDOE is concealing important data about the performance of our teachers and principals.  Without these data, the public cannot know which of our principals are harboring educational malpractice.

Finally, Algebra I.  Only nine Richmond teachers taught this subject in all three years, so the graph includes them all.

image

These data paint a picture of improvement, but see the caveats below.

Thirty-one Richmond students were afflicted with teacher No. 68640 but we have SGPs for only 13.  The scores of those 13 do not paint a pretty picture.

image

image

This teacher improved from appalling to awful from 2013 to 2014 but still had only three students above average in 2014.  It is tempting to think that this teacher demonstrates that yet another principal needs firing but there are problems with the data.

The Algebra I data are skewed in at least two ways: The bright kids tend to pass it in middle school, while the ones who can’t pass in high school contribute to our appalling dropout rate.  Then we have the students who take Algebra but don’t get an SGP score because of the byzantine rules (see the class above with 31 students but only 13 SGPs).

And then, the students who have trouble in high school tend to retake (and retake and retake) the test in order to graduate.  VDOE has let slip enough retake data to suggest that the retake scores are bogus.

For instance, here are the 2014 Algebra I retest counts in Richmond.

image

On the data that slipped out of VDOE, those retests increased the SOL scores of the students involved by an average of 24.6 points.  One student improved his/her Algebra I score by 108 points. 

The data are here.  Note that the averages at the link include the retest score decreases and still show a net positive 12+ points.

The summerlong retesting is another facet of VDOE’s deliberate concealment: They have enough SOL data to schedule graduations in May but they do not release the SOLs until late August.  No telling what machinations take place in those three months; the data above suggest many and major.

So, we have VDOE manipulating the data in secret and, even more to the point, concealing data about the good and bad teachers in our schools. 

Our tax dollars at “work.”

Why Publish Teacher Evaluations?

Regarding the State Department of Data Suppression (aka VDOE) and its attempts to conceal SGP data from the public, here is an interesting piece on teacher evaluations.  In particular:

The point of empowering parents isn’t to enable them to game the system. The point is to give the small minority of teachers who fall behind some useful feedback on what’s not working and some genuine incentive to fix it.

And, perhaps more to the point, to give the School Board some incentive to do something about those inadequate teachers.

New SGP Data–Reading by Division

VDOE has provided Brian Davison a third set of SGP data.  This set includes (anonymized) teacher IDs.

As a reminder, in the words of VDOE:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

That last point is particularly important: The SOL is significantly affected by the student’s economic status; poorer students generally get lower scores.  The SGP, in contrast, measures improvement relative to similarly situated students; the SGP ranking is pretty much independent of economic status.  Thus, the SGP gives us a measure of quality that allows comparisons from division to division, school to school, and teacher to teacher.
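
To make the concept concrete, here is a toy version of the growth-percentile idea: each student is ranked only against peers with similar prior scores.  The real SGP uses quantile regression over multiple prior years of scores; this rank-within-band sketch and its data are purely illustrative:

```python
# Toy illustration of the growth-percentile idea: compare each student
# only against students with a similar prior score, and report the
# percentile rank within that peer group.  Real SGPs use quantile
# regression; this simple version and all its data are hypothetical.
from collections import defaultdict

# (student_id, prior_score, current_score) -- all hypothetical
students = [
    ("A", 400, 430), ("B", 410, 410), ("C", 420, 445),
    ("D", 510, 505), ("E", 520, 540), ("F", 530, 495),
]

def band(prior, width=50):
    """Peer group: students whose prior scores fall in the same band."""
    return prior // width

groups = defaultdict(list)
for sid, prior, current in students:
    groups[band(prior)].append((sid, current))

sgp = {}
for members in groups.values():
    ranked = sorted(members, key=lambda t: t[1])  # order by current score
    n = len(ranked)
    for rank, (sid, _) in enumerate(ranked):
        # mid-rank percentile within the peer group, on a 0-100 scale
        sgp[sid] = round(100 * (rank + 0.5) / n)

print(sgp)
```

Note that students D, E, and F have much higher raw scores than A, B, and C, yet the growth percentiles in each band are the same: the measure rewards progress relative to similar starters, not absolute achievement.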

CAVEATS:

  • VDOE has manipulated these data.  For example, they have suppressed the data in cases where the number of students (presumably in a class) is fewer than ten.  In contrast to earlier data sets, they have suppressed the no-retest scores for students showing a retest score.  They also have suppressed retest scores below passing (<400).  Those are the changes I know about or have found; there may well be others.  In any event, these data deviate from the expected mean and median of 50, probably because of the VDOE manipulations.  For example, the average of the 2014 sixth grade reading SGPs in this dataset is 47.7 and the distribution is not the expected flat line:

image

  • VDOE gave us anonymized teacher IDs, but not the names of the teachers involved.  More on this outrage in a later post.
  • The SGP is calculated from the difference between similarly sized SOL scores, which can lead to large relative errors in the SGP.  This suggests considerable caution in interpreting any individual student’s SGP score or the average (or distribution) for a teacher where the number of scores is small.

That said, here is the distribution of 2014 division average Reading SGPs.

image

On this graph, and those below, Richmond is the yellow bar, Petersburg is red, Norfolk is blue, and Hampton is green.

Excel is unable to readably list all the divisions on this graph; the axis labels here and below omit every second division.  Thus, on the graph below, the blue bar (Norfolk) appears between Surry and Essex Counties.  That’s just an artifact of the program, not evidence that Norfolk has disappeared.

The graph above averages the results of reading tests in the five grades, 4-8 (the SOL score in Grade 3 is the starting point so there’s no third grade SGP).  It turns out there is a considerable variation from grade to grade.  Here, then, are the data for each grade, starting with Grade 4 Reading:

image

Grade 5 Reading:

image

There you see Richmond doing quite well.  But wait.  Here are the data for the higher grades, starting with Grade 6 Reading:

image

Grade 7 Reading:

image

Grade 8 Reading:

image

These graphs show a distressing pattern that we will see repeated in the math graphs: Richmond does an average or better job in the elementary schools and a particularly awful job in the middle schools.  Note here the drop from a better-than-average 49 in the fifth grade to a third-from-the-bottom 33 in the sixth.

In summary, here are the 2014 Richmond reading averages by grade v. the state.

image

These data don’t tell us why our middle schools are doing such an awful job.  But they (in conjunction with the data by teacher – stay tuned for a later post) certainly tell us where the system is in dire need of repair.


Next post: Math SGPs by division.