A Modest Proposal

SOL scores decrease with decreasing economic status of the family.  Thus, the Feds have required (see the SGP Primer) VDOE to compute a measure of learning, not family income.  VDOE selected the SGP.  VDOE now has three years of those SGP data that can be used to measure teacher effectiveness.

VDOE has a lawyer full of bogus excuses for not releasing the data with the teachers’ identities attached.  None of those would prevent use of the data to rate the effectiveness of the colleges that sent us those teachers.

Just think, VDOE now can measure how well each college’s graduates perform as fledgling teachers and how quickly they improve (or not) in the job.  In this time of increasing college costs, those data would be important for anyone considering a career in education.  And the data should help our school divisions make hiring decisions.

In addition, VDOE could assess the effectiveness of the teacher training at VCU, which is spending $90,000 a year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  Wouldn’t it be interesting to see whether that kind of “leadership” can produce capable teachers (even though it produced an educational disaster in Richmond)?


Lynchburg SGP

My paternal grandmother was Angie Lynch, said to be a relative of John Lynch.  Angie was the second woman in the Oklahoma Territory with an advanced degree.

I’ve maintained an affection for Lynchburg, especially in celebration of the US 460 bypass that makes travel to Roanoke a much lighter task.  So it was a particular sorrow when my earlier Lynchburg post got wiped.

In light of VDOE’s third data release (that includes data by teacher, but not by school), I thought I’d redo the post.

First, as a reminder, here are the statewide distributions of teacher average SGPs in reading and math.

image

image

Next the Lynchburg distributions.

image

image

image

Need I say it: These are not good numbers.

We have three years’ data, so let’s look at the trends, restricting the graphs to those teachers who taught the subject for all three years.

image

There are too many reading teachers to make much sense of the graph (the table on the right is too small to even list them all).  Let’s take out all but the top and bottom few.

image

Here we see some average and below average teachers improving nicely (No. 66197 presents a happy picture) and others deteriorating severely (No. 69532 is an unfortunate counterbalance to No. 66197).  The citywide average by teacher (which includes all the teachers, even those who taught reading for only one or two years) is low, and the lack of a trend does not suggest improvement.

Going directly to the bowdlerized dataset, we find the math data more lively.

image

Of interest, we again see low-performing teachers whose performance deteriorated.  We also see a citywide average that bounced but then dropped back to subpar.

Only three Lynchburg teachers taught Algebra I all three years, so the graph is much simpler.

image

None of the three improved over the period; quite the contrary.  The average is pulled down by the teachers, not shown, who taught fewer than all three years.  It starts above the state average but deteriorates into the unacceptable range populated by Lynchburg’s reading and math averages.

We also have detailed data by teacher, albeit VDOE won’t tell us who they are.  The high-performing teacher in this collection is No. 71485, who had only one 4th grade math student scoring below the statewide average.

image

In contrast, the best math SGP in the 4th grade class of Teacher No. 71819 was 23.

image

This teacher also had a 4th grade reading class.

image

The 25.7 average in that reading class is far from acceptable but it is far less dismal than the 4.4 average in this teacher’s math class.

For any reader inclined to overlook the fundamental notion that the SGP measures teacher performance, a glance at the eight students who took both reading and math from this teacher is instructive.

image

One student scored in the same Student Growth Percentile in both subjects; the other seven scored higher, some much higher, in reading.  Note especially student No. 7B89048849408, who scored in the first percentile with the benefit of this teacher’s math instruction but in the 70th on the reading test.

Unfortunately, this teacher is getting worse.

image

I could go on but I think these data make my points.  I’ll suggest five things:

  • Lynchburg has a problem with its school system.
  • No. 71819 is an awful math teacher.
  • No. 71819 is a very bad reading teacher.
  • Any principal who subjected schoolchildren to No. 71819 in 2015 should be fired.
  • The bureaucrats at VDOE who refuse to identify No. 71819, as well as that teacher’s principal, to the parents of Lynchburg are misusing the public funds that pay them and pay for the statewide testing.

Important SGP Data Suppressed by VDOE

The third SGP data release by VDOE contains anonymized teacher IDs (but no data by school).  These abbreviated data serve to emphasize the perversity of VDOE’s suppression of the teacher identities (and other data).

In Richmond, according to the database, 304 teachers taught reading in grades 4 to 8 in the 2012-2014 period.  Of these, 74 taught the subject all three years.  A graph of the average SGP performance of those 74 is far too busy to convey much information, aside from showing the remarkable range of the average scores and the dip in 2013 because of the former Superintendent’s failure to align the curriculum to the new SOL tests.

image
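For readers who want to reproduce the three-year filter (304 teachers, 74 of them for all three years), here is a minimal sketch in Python/pandas.  The column names are my guesses at the release’s layout, not VDOE’s actual headers.

```python
# Minimal sketch, assuming the VDOE release loads as a flat table with
# hypothetical columns "teacher_id", "year", "subject", and "sgp".
import pandas as pd

def three_year_teachers(df: pd.DataFrame, subject: str) -> pd.DataFrame:
    """Keep only rows for teachers who taught `subject` in all three years."""
    sub = df[df["subject"] == subject]
    years_per_teacher = sub.groupby("teacher_id")["year"].nunique()
    keep = years_per_teacher[years_per_teacher == 3].index
    return sub[sub["teacher_id"].isin(keep)]

# Average SGP by teacher and year for, e.g., the 74 three-year reading teachers:
# three_year_teachers(df, "reading").groupby(["teacher_id", "year"])["sgp"].mean()
```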

If we remove all but the top and bottom four or five 2014 scores, the graph is more informative.

image

The average shows the Richmond dip in 2013.  Note that the State SOL scores dropped in 2013, because of the new tests, but the SGP did not: The SGP measures relative performance and the statewide SGP average is unaffected by the overall drop in SOL scores.  Richmond’s SGP dropped relative to the statewide numbers, however, denoting underperformance here.
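A toy example (not VDOE’s actual computation) makes the point that a rank-based measure is blind to an across-the-board drop in raw scores:

```python
# Toy demonstration: shift every raw score down and the ranks -- hence the
# percentiles -- do not move.  Illustrative numbers only.
from scipy.stats import rankdata

scores_2012 = [400, 450, 500, 550, 600]
scores_2013 = [s - 75 for s in scores_2012]   # everyone drops 75 points

print(rankdata(scores_2012))  # [1. 2. 3. 4. 5.]
print(rankdata(scores_2013))  # identical: relative standing is unchanged
```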

Turning to specifics: Teacher No. 66858 (Gr 5 reading) started above average and improved dramatically.  Teacher No. 74415 (Gr 4 reading) started below average and deteriorated dramatically. 

The distribution of Teacher 66858’s 2014 SGP scores provides a more detailed picture of that teacher’s excellent job.

image

image

This teacher had only one student whose reading progress in 2014 was below average.  The low end of the 95% confidence interval of the mean for these data is 76.2. 
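For readers who want to check that sort of arithmetic, here is a sketch of the 95% confidence interval computation using Student’s t; the scores list is a placeholder, not this teacher’s actual data.

```python
# Sketch of a two-sided 95% confidence interval for a class-average SGP.
# `scores` is a placeholder list, not the real class data.
import numpy as np
from scipy import stats

def ci95(scores):
    scores = np.asarray(scores, dtype=float)
    sem = stats.sem(scores)                    # standard error of the mean
    return stats.t.interval(0.95, df=len(scores) - 1,
                            loc=scores.mean(), scale=sem)

# For a 23-student class averaging in the mid-80s, ci95(scores)[0] is the
# kind of 76.2 lower bound quoted above.
```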

In contrast,

image

image

The high end of the 95% confidence interval for the mean of this teacher is 13.9.  Notice that this teacher’s 2013 performance was not a whole lot better.

The principal who allowed 25 kids (we have SGPs for 24 of the 25) to be subjected to this educational malpractice in 2014 should have been fired.  Yet VDOE deliberately makes it impossible for Richmond’s parents to know whether this situation has been corrected or whether, as is almost certain, another batch of kids is being similarly afflicted with this awful teacher.

The math data show a similarly diverse pattern, albeit without the 2013 drop: good and average teachers getting better; average and bad teachers getting worse; bad teachers staying bad.

image

It turns out that both of the reading teachers above also taught math to the same kids they taught (or, in the one case, failed to teach) reading.

No. 66858 turns out to be an excellent math teacher, albeit not as excellent as at reading.

image

image

Similarly, # 74415 is a bad math teacher, but not as awful as at reading.

image

image

No real surprises here.  We would expect that, to some degree, teaching math takes a different skill set than teaching reading.  We also might expect that a teacher who is good at teaching reading would be good at teaching math, and that a bad reading teacher would be similarly bad at math.

I could go on and on but the point already is clear: VDOE is concealing important data about the performance of our teachers and principals.  Without these data, the public cannot know which of our principals are harboring educational malpractice.

Finally, Algebra I.  Only nine Richmond teachers taught this subject in all three years so the graph includes them all.

image

These data paint a picture of improvement, but see the caveats below.

Thirty-one Richmond students were afflicted with teacher No. 68640 but we have SGPs for only 13.  The scores of those 13 do not paint a pretty picture.

image

image

This teacher improved from appalling to awful from 2013 to 2014 but still had only three students above average in 2014.  It is tempting to think that this teacher demonstrates that yet another principal needs firing but there are problems with the data.

The Algebra I data are skewed in at least two ways: The bright kids tend to pass the course in middle school, while the ones who can’t pass it in high school contribute to our appalling dropout rate.  Then we have the students who take Algebra but don’t get an SGP score because of the byzantine rules (see the class above with 31 students but only 13 SGPs).

And then, the students who have trouble in high school tend to retake (and retake and retake) the test in order to graduate.  VDOE has let slip enough retake data to suggest that the retake scores are bogus.

For instance, here are the 2014 Algebra I retest counts in Richmond.

image

From the data that slipped out of VDOE, those retests increased the SOL scores of the students involved by an average of 24.6 points.  One student improved his/her Algebra I score by 108 points.

The data are here.  Note that the averages at the link include the retest score decreases and still show a net positive 12+ points.

The summerlong retesting is another facet of VDOE’s deliberate concealment: They have enough SOL data to schedule graduations in May but they do not release the SOLs until late August.  No telling what machinations take place in those three months; the data above suggest many and major.

So, we have VDOE manipulating the data in secret and, even more to the point, concealing data about the good and bad teachers in our schools. 

Our tax dollars at “work.”

Bang per Buck SGP Analysis

An earlier analysis showed that division SOL scores are essentially uncorrelated with division expenditure per student and that Richmond spends much more per student and obtains much lower SOLs than most other divisions.

The recent release of the Student Growth Percentile data opens a new path for analysis.  Unlike the SOL, the SGP is essentially uncorrelated with economic status.

We have the 2014 SGP data but VDOE has yet to post the expenditure data for the 2014 Superintendent’s Annual Report.  So we turn to the 2013 data (here with the facility costs, debt service, and contingency reserves removed from the calculation of total cost per student).

First, the division average reading SGP scores vs. the division average expenditure per student.

image

The fitted curve might suggest that the reading score increases with increasing expenditure but the R² of just under 1% tells us that the two variables are essentially uncorrelated.
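That R² comes straight from a least-squares straight-line fit.  A sketch, with made-up numbers rather than the actual division data:

```python
# R-squared of a straight-line fit, sketched with made-up numbers.
import numpy as np
from scipy import stats

spending = np.array([9000, 10000, 11000, 12000, 15000], dtype=float)
avg_sgp  = np.array([47, 52, 44, 50, 46], dtype=float)

fit = stats.linregress(spending, avg_sgp)
print(f"R^2 = {fit.rvalue ** 2:.1%}")   # a small R^2 means essentially uncorrelated
```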

Richmond is the gold square, spending above average sums per student and obtaining, as with the SOL, below average scores.  The red diamond to the left is Hampton, which Brian Davison already has identified as a division with a large population of economically disadvantaged students but a robust SGP, suggesting superior teaching there.

On the subject of robust, the two divisions with SGP averages >55 are Poquoson on the left and Charles City on the right.

The other red diamonds are, from the left, Newport News and Norfolk.  All three of the red diamond jurisdictions demonstrate that it is possible for an old, urban jurisdiction in Virginia to deliver average or superior education at near-average costs.

The math data show a similar picture as to Richmond albeit a less flattering one for Hampton:

image

Again, the SGP results are barely correlated with expenditures (R² = 2.7%).

The outperforming divisions, all with SGP averages >60, are, from the left, Bland, Botetourt, Buckingham, and Bristol.

I’ll venture a conclusion: Virginia schools, especially the expensive ones such as Richmond, don’t need more money; they need more of whatever Hampton and Poquoson and Charles City and Bland and Botetourt and Buckingham and Bristol are doing.


Why Publish Teacher Evaluations?

Regarding the State Department of Data Suppression (aka VDOE) and its attempts to conceal SGP data from the public, here is an interesting piece on teacher evaluations.  In particular:

The point of empowering parents isn’t to enable them to game the system. The point is to give the small minority of teachers who fall behind some useful feedback on what’s not working and some genuine incentive to fix it.

And, perhaps more to the point, to give the School Board some incentive to do something about those inadequate teachers.

VDOE Is Spending Your Money to Avoid Disclosing the Data You Paid For

Yesterday, VDOE sent its (very capable) lawyer to talk to Richmond Circuit Court Judge Melvin Hughes.

The lawyer told the judge that VDOE wanted a try-again on its loss to Brian Davison last year, in which the judge ordered VDOE to disclose the SGP data by teacher.  VDOE also wanted to bring along the Loudoun School Board, the Virginia School Boards Ass’n, the VEA, and the Virginia Superintendents’ Ass’n. to whine about how terrible it would be to publicly identify the good and bad teachers in Virginia’s public schools.

It looks like the judge divided the baby: He told the various Associations that they were not the affected teachers and they lacked “standing” to intervene in the suit.  He allowed the Loudoun School Board to join VDOE in trying to get him to change his mind about disclosing teachers’ identities and, indeed, about releasing the SGP records at all.

It will be up to Judge Hughes to decide the legal questions here.  In contrast, it is clear that VDOE is on the wrong side of the policy issue: They are using taxpayer money to resist disclosure to the taxpayers of data those taxpayers paid for.  Those data can tell the public which teachers, schools, and school divisions — all paid for by those taxpayers — are doing a good or poor job of public education.

We have a preliminary data release (actually the last of three) that contains anonymous teacher identifiers and that demonstrates the importance of these data.

As a reminder, the Student Growth Percentile measures how much a student has learned in a particular class in comparison to other, similarly situated students.  Importantly, the SGP, in contrast to the SOL, is generally unaffected by the wealth or poverty of the student’s family.  VDOE has been collecting these data, under a federal mandate, since 2011.

Let’s look at the 2014 Richmond fifth-grade reading data from VDOE’s latest (“16790”) release.  For a start, here is the distribution of student score averages by teacher across all the teachers of that subject in all of Richmond’s elementary schools.  The SGP percentiles are on the abscissa, the count of teachers with that average percentile is on the ordinate.

image

Here we see a close-to-normal distribution with a mean of 48 and a standard deviation of 13.  For comparison, the statewide distribution for this subject (also by teacher average) likewise averages 48, with a standard deviation of 11.

For the teacher at “85” on that graph (ID # 66858, average reading SGP of 84.52), the database reports 23 scores.  Of those 23 students, only one scored below the state average SGP.

image

At the other end of the graph, teacher # 66294 averaged only 16.6 but with only ten scores.  Let’s look at the next teacher up, # 68809, with 22 scores averaging 22.8:

image

Three kids in that class scored above the state average.  Two scored in the minimum percentile and four more were in the third percentile.

More specifically, the 95% confidence interval of this teacher’s 22.8 average is 11.  In terms of student progress, we can be confident that this teacher is in the bottom third, and probably the bottom quarter, statewide.  Clearly it’s time for some retraining and, if that doesn’t take, a replacement teacher.

Your fifth grader in Richmond might be stuck with this teacher.  But VDOE doesn’t want you to know how bad this teacher is.

Please recall that the SGP measures improvement in comparison to similarly situated students.  A student with high achievement (a high SOL score) last year who improves as much as the others with similar achievement only makes the 50th percentile.  A student with low achievement last year who improves as much this year as the other low achievers also makes the 50th percentile.  In short, the SGP does not penalize a teacher for having a bunch of low-performing students; it rewards or penalizes a teacher based on how much improvement that teacher achieves compared with similar students statewide.
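VDOE’s actual computation is a quantile-regression model, but a crude rank-within-cohort sketch (with hypothetical column names) captures the idea of comparing each student only with similarly situated peers:

```python
# Crude sketch of the growth-percentile idea: rank each student's score
# against peers with similar prior scores.  VDOE's real method is a
# quantile-regression model; the column names here are hypothetical.
import pandas as pd

def crude_growth_percentile(df: pd.DataFrame) -> pd.Series:
    cohort = pd.cut(df["prior_score"], bins=10)      # similar prior achievement
    return (df.groupby(cohort, observed=True)["score"]
              .rank(pct=True)                        # rank within cohort
              .mul(100)
              .round())                              # roughly a 1-100 percentile
```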

Which teacher do you think is doing a better job?  Which class would you want your kid to be in?

Why do you suppose VDOE doesn’t want you to know how well (or how badly) your kid’s teacher is doing?

Are you beginning to understand why I refer to VDOE as the State Department of Data Suppression?

Excuses, Excuses . . .

WaPo has a piece this morning on the Establishment’s reaction to Brian Davison’s suit, Davison v. Virginia Ed. Dep’t, No. CL14004321-00 (Richmond Cir. Ct., Petition for Mandamus, October 2, 2014), to require VDOE to release SGP data.

Two paragraphs in the WaPo piece capture most of the anti arguments:

Growth percentiles cannot accurately measure growth among the highest- and lowest-performing children, officials say, and they warn that in some cases student scores might be erroneously assigned to teachers who never actually taught them. In addition, they rely on consecutive years of test data that might not be widely available in schools serving transient populations.

And unlike value-added models used by other states, Virginia’s model does not attempt to control for the effects of poverty or other demographic characteristics. Critics of the growth percentiles say that disadvantages teachers who work with the neediest children.

Let’s take those one at a time:

Highest- and Lowest-Performers

Students who score “passing, advanced” two years in a row are not included in the SGP calculation.  That’s because they are doing just fine, thank you, and have precious little room to do better.  The SGP measures improvement.

As well, among the kids who are tested, the statistics have a problem at the extremes.  For example, on the 2014 8th grade reading results, we see the typical spikes at the 1% and 99% levels:

image

No teacher can complain about having a few extra 99s in the SGP average.  And, just maybe, that teacher is doing an outstanding job. 

Any teacher with a relatively large number of 1s is dealing with a tough group (so long as the rest of that teacher’s distribution is reasonable).  So the data will tell us to discount the average and look directly at the data (as we should be doing anyhow). 

What is the problem?

Students Counted for Wrong Teacher

The anonymous “officials” complain that the SGP might be inaccurate because students’ scores could be erroneously assigned to teachers who never taught those students.

This is not a criticism of the SGP.  This is a criticism of the local school officials who keep lousy records, and of VDOE, which lets them get away with it.

Transient Populations

The same anonymous “officials” complain that no SGPs are available for transient populations.  This hardly is a reason to suppress the SGP as a measure of progress of the non-transient students.

Aside: National “Teacher Quality” Group

WaPo also quotes the National Council on Teacher Quality for the propositions that the SGP is “not 100% accurate” and is a “real invasion of privacy.”

What examination, pray, is “100% accurate?”  None.  This is just another attempt to turn the unachievable perfect into the enemy of the good.

As to privacy, it’s easy to understand how ineffective teachers, and their masters, would prefer to keep evidence of their performances secret.  But these are public employees, paid with tax money to educate our society’s children.  How can their preference for privacy outweigh the public’s interest in knowing the effectiveness of our teachers, schools, and school divisions?

2d Aside: Data Could be Misinterpreted

We also have a Loudoun School Board member complaining that the SGP data “could be misinterpreted” if released.  What that official is not saying is that the public is more easily fooled if the School Board keeps that public uninformed.  And she is admitting that her School Board and the State Board of Education are unwilling or unable to educate the public in the proper use of this important tool.

Effect of Poverty

Aside from privacy, the arguments above are merely misleading attempts to say that the SGP is not perfect, so it must be abandoned entirely.  The “poor kids do worse” argument is a lie.

The SOL scores correlate fairly well with poverty.  VDOE uses the term “economic disadvantage” and their data show a clear relationship (here on the reading SOLs for 2014):

image

BTW: Richmond is the gold square on that graph: We have an unusually large percentage of economically disadvantaged students and even more unusually low reading scores.  The red diamonds are Hampton, Norfolk, and Newport News; the green is Charles City.

The SGP process, in contrast, compares progress among similarly situated students.  It produces scores that are largely independent of economic status.  VDOE itself quotes the Colorado data:

image

Indeed, VDOE has contradicted this lie:

A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

In short, it is the SOLs that provide a biased standard for comparison of teachers, schools, and divisions.  They compare the scores of needy children to those of their affluent peers.  The SGP removes that bias and gives a measure of how much a teacher, school, or division is teaching its students, be they rich or poor.

Yet, VDOE, the Teachers Ass’n., and the Loudoun School Board all want to keep these important data secret.  What do you suppose they are trying to hide?

SGP V – Bad Data?

Turning back to the question of the appalling performance of Richmond’s middle schools, especially in the sixth grade, let’s take a look at the SGP average by teacher by year.  Here, for a start, are the fifth grade data, first for reading and then for math:

image

image

These graphs make a couple of points:

  • In 41% of the cases, a teacher taught the subject for only one year of the three.  For the most part, however, the teachers in both subjects taught the same grade and subject for either two years (21% and 22% for reading and math, respectively) or all three years (38% and 37%).
  • The dip in the reading SGPs in 2013 corresponds to the new reading tests that year and Richmond’s failure to align its curriculum.  Otherwise, the year-to-year changes give a quick idea of the variability per teacher.  To put a number on that, the average standard deviation of the annual SGPs is 11 for both sets of data (a sketch of that computation follows this list); doubtless that number would be smaller if Richmond had not failed to prepare for the new reading test in 2013.
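Here is the computation behind that number, sketched with the same hypothetical column names as before:

```python
# Average, across teachers, of the standard deviation of each teacher's
# yearly mean SGP.  Columns "teacher_id", "year", "sgp" are hypothetical.
import pandas as pd

def average_yearly_sd(df: pd.DataFrame) -> float:
    yearly_means = df.groupby(["teacher_id", "year"])["sgp"].mean()
    sd_per_teacher = yearly_means.groupby("teacher_id").std()  # needs 2+ years
    return sd_per_teacher.mean()   # the "11" quoted in the second bullet
```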

The sixth grade data show an entirely different pattern.  Again, first for reading and then math:

image

image

First, of course, the SGPs are much lower than the fifth grade numbers.

Remarkably, only one teacher (#66291) taught one subject (math) at the 6th grade level for two years in a row.  At least that’s what this database says.  Thirty-seven teachers taught reading only one of the three years while five taught two of the three years.  For math, those numbers are twenty-four and six.  As the graphs show, none taught for all three.

The small number of teachers reported in 2013 looks strange.  It is strange.  See below.

The seventh grade patterns are similar, albeit the scores are a bit higher and we see a few more teachers reported to be repeating the same subject.  Again, reading and then math:

image

image

Finally, the eighth grade scores, again reading first:

image

image

Aside from the remarkable teacher turnover implied by these data, the number of Grade 6 to 8 reading and math teachers is unbelievably low in 2013 except, perhaps, for the 8th grade reading.  As well, the counts of SOL and SGP results in the VDOE SGP database for grades 6-8 in 2013 look to be anomalous:

image

image

We have nowhere to go to test the number of SGP reports.  In contrast, the (very nice) VDOE SOL database gives the participation counts for the same years and grades.  Here, then, are the Richmond numbers of SOL results in the SGP database, expressed as percentages of the participation numbers reported for the same tests.  First, reading:

image

Then math:

image

Let’s first deal with the ridiculous datum: The “#DIV/0!” for 2014 6th grade math is there because VDOE reported zero participation in that test:

image

Looking beyond that obvious error: VDOE attached a sheet to the earlier download, admitting that the SGP database suppressed data for classes of fewer than ten students and for students who transferred from one Virginia school to another during a school year.  They provided counts for the suppressed data by test, ranging from 1.6% to 7.3%, with an average of 4%.

Except for the 2012 Grade 6 math datum, where the SGP database reports SOL scores for 106% of the students who took the test, the numbers here are much lower than even a 7% suppression (low in elementary school, very low in middle school).  They are ridiculously low for grades 6-8 in 2013.

Doubtless that explains the remarkable teacher turnover reported for the SGP in grades 6-8 for 2013: Leave out the test data and the reports by teacher also will be left out.

We’ll have to try to get a correct SGP database and see how that affects the SGP analysis here and earlier.

Sigh.

SGP IV – Rating the Teachers: Math

We have seen that the new SGP dataset shows a precipitous drop in the quality of instruction in math in the sixth grade in the Richmond public schools.

http://calaf.org/wp-content/uploads/2015/03/image14.png

The by-teacher averages add some context to that situation.  Let’s start with the fifth grade math distribution for the state:

image

And for the City:

image

The Richmond average is slightly below the state average.  In both cases, the fairly large standard deviation reflects the relatively large numbers of very good and very poor performers.

Next the sixth grade math data, starting with the state distribution:

image

Then Richmond:

image

(Note that, with this relatively small dataset, the average SGP by student, 24.2, is not the same as the average by teacher, 20.9; with unequal class sizes, there is no reason the two should be the same.)
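The difference arises because the by-teacher average weights every class equally, regardless of size.  A toy illustration (not the Richmond data):

```python
# Toy illustration of by-student vs. by-teacher averaging.
import numpy as np

classes = {                 # hypothetical teacher -> student SGPs
    "A": [10] * 30,         # big class, low growth
    "B": [60] * 5,          # small class, high growth
}
all_scores = [s for scores in classes.values() for s in scores]

by_student = np.mean(all_scores)                                # ~17.1
by_teacher = np.mean([np.mean(s) for s in classes.values()])    # 35.0
```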

The Richmond average is 1.75 standard deviations below the state average.  Four of eighteen Richmond math teachers are more than two standard deviations below the state average.  Only one is above the state average.

The 7th grade math results are less awful but still unacceptable: 

image

The eighth grade math results are below average, but less horrible than the data for the previous two grades:

image

Of course, we’ve long known that the SOL scores of Richmond’s middle schools were a disaster. 

image_thumb9

The SOL scores (data again from 2014) decrease with increasing economic disadvantage. 

image

(Richmond is the gold square.)

In contrast, the SGP scores measure improvement compared to similarly situated students and are largely independent of family wealth.  The Colorado data are widely quoted on this point:

image_thumb4 image_thumb5

The SGP data confound the traditional excuse that too many of Richmond’s students are from economically disadvantaged homes.  Thus, Richmond no longer can blame the students for the miserable performance of its schools, especially the middle schools.  The problem is (some of) the teachers, (most of) the schools, and the administration, not the students. 

Rant Begins:

Offsetting that important progress in evaluating the schools, VDOE is refusing to release the identities of the under- (AND over-) performing teachers.  They claim that those records of academic improvement (or lack thereof) in particular classrooms in public schools, financed by public money, are “personnel records,” never mind that the personnel in question are not VDOE employees. 

Whether these data as to particular teachers are “personnel records” under the Freedom of Information Act is a question for the Judge.  In any case, VDOE has the discretion to disclose the records. 

At present, parents take their kids’ teachers willy-nilly.  VDOE now has data, in some cases, that would tell those parents whether those teachers are effective.  Yet VDOE thinks the “privacy” of those public employees is more important than informing the public about those employees’ performance.  VDOE’s refusal to share important data that were bought with taxpayer dollars is an abiding and outrageous insult to Virginia’s taxpayers.

Until someone comes up with a catchy acronym, I will call the gang of secretive bureaucrats at VDOE the State Department of Data Suppression.  The name may not be catchy but it surely is accurate. 

(And the abbreviation, SDDS, is a palindrome!)

Your tax dollars at “work.”

SGP III – Rating the Teachers: Reading

The most recent VDOE database of Student Growth Percentiles contains (anonymous) teacher IDs.  This gives us a first peek at how well, and how badly, some of Richmond’s teachers are performing.

With all the earlier caveats, let’s start with the statewide distributions of teachers’ average SGP scores in reading and math.

image

Brian Davison points out that both distributions are reasonably symmetrical, suggesting that we do not have an unusually large number of teachers doing particularly well or poorly.  That said, no parent will want a child to be subjected to the reading teacher in the first percentile, the other teacher in the second, or the three in the eighth.

The math scores are more widely distributed, showing a larger number of excellent and a larger number of awful teachers. 

image

Aside from targeting the lowest performers in both subjects, these data suggest that we need retraining in math more than in reading.

Turning to the data by grade, here is the distribution of fifth grade reading averages.

image

The orange curve is a least-squares fit of a normal distribution.  The average and standard deviation of the fitted curve are shown at the base of the graph.
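For anyone reproducing these graphs, here is a sketch of that fit with simulated data; the real input, of course, is the set of teacher averages from the VDOE release.

```python
# Least-squares fit of a normal curve to a histogram of teacher averages.
# The data here are simulated, not VDOE's.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)

rng = np.random.default_rng(0)
teacher_avgs = rng.normal(48, 11, size=500)          # simulated averages
counts, edges = np.histogram(teacher_avgs, bins=25)
centers = (edges[:-1] + edges[1:]) / 2

(amp, mu, sigma), _ = curve_fit(gaussian, centers, counts, p0=[50, 48, 11])
print(f"fitted mean {mu:.1f}, standard deviation {sigma:.1f}")
```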

The distribution of sixth grade reading teachers is close to the same.

image

We already have seen that the Richmond average reading SGP plunges from fifth to sixth grades.

image

The Richmond distributions conform to that pattern.  First, grade 5:

image

(Note that this average by teacher is slightly less than the average by student, above.)

As you see, this distribution is a bit wider than the statewide distribution.  That is, Richmond has relatively more excellent fifth grade reading teachers than the statewide average, and also relatively more who are not performing.  Five (of sixty-seven) Richmond teachers are more than two standard deviations above the state average; three are more than two standard deviations below.
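The two-standard-deviation comparison is simple arithmetic.  A sketch, using the statewide by-teacher figures quoted earlier (48 and 11 for this subject):

```python
# Count teachers more than two standard deviations from the state average.
# The defaults are the statewide by-teacher figures quoted earlier.
import numpy as np

def outlier_counts(teacher_avgs, state_mean=48.0, state_sd=11.0):
    z = (np.asarray(teacher_avgs, dtype=float) - state_mean) / state_sd
    return int((z > 2).sum()), int((z < -2).sum())   # (well above, well below)

# Per the text, Richmond's 67 grade-5 reading teachers yield (5, 3).
```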

Those teachers at the low end need some work but, for the most part, Richmond’s fifth graders are in pretty good hands as to reading.

Then we have grade 6 reading results:

image

Only one of Richmond’s twenty-one sixth grade reading teachers produced an average student improvement better than the state average; none was more than two standard deviations above the statewide average.  Six (or seven, depending on the rounding) were more than two standard deviations below the state average and four were more than three standard deviations below.  The Richmond average is 1.5 standard deviations below the state average.

These data tell us that Richmond’s sixth grade reading teachers are not doing a bad job.  They are doing an appalling job.

Upon some reflection, the data also tell us two even more important things:

  • The principals (and the Superintendent) now have a quantitative measure of teacher performance (at least as to reading and math).  If they don’t do something (soon!) about rewarding the excellent performers and retraining or firing the poor ones, we’ll know they need to be replaced themselves.
  • VDOE is hiding the identities of these high- and low-performing teachers from the parents who pay them and the teachers, and whose kids are directly affected by teacher performance.  Apparently those bureaucrats think it would be intrusive for the parents of Virginia’s schoolchildren to know whether their kids are in the hands of excellent, average, or lousy teachers.  I think the term for that kind of inexcusable bureaucratic arrogance is “malfeasance.”

Tomorrow, the Math situation.  (Hint: It’s even worse.)