Richmond Schools by Grade

The wonderful VDOE database can give us pass rates by test subject by grade.  For a single school (here, Westover Hills), the data look something like this:



And here is Lucille Brown:



But there’s no way to fit all the data on one graph, or even on a few.  So I have posted the spreadsheet.

After you open the spreadsheet, select the reading or math tab and follow the instructions there.  Please let me know if you have a problem: john {at} crankytaxpayer [dot] org.

The spreadsheet is here.

High Schools

Finally, here are the End of Course pass rates at Richmond’s high schools for the reading and math tests:



Notice the huge decrease in scores caused by the new reading tests in ‘13 and the appalling drop attending the new math tests in ‘12.

These data must be taken with a front-end loader full of salt because of Richmond’s remarkably high retest rate, which VDOE kept hidden until a judge made them cough up the 2014 SGP data.  Here, for example, are the 2014 Algebra I retest counts in Richmond.

My analysis of those data showed that the retests increased the SOL scores of the students involved by an average of 24.6 points on a test where 400 was passing (i.e., by 6.1% of the passing score).
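(For anyone checking the arithmetic, here is the whole computation; the 24.6-point gain is from my analysis, and 400 is the SOL passing cut.)

    # Express the average retest gain as a percentage of the passing score.
    average_gain = 24.6   # points, from my analysis of the 2014 data
    passing_score = 400   # SOL passing cut score

    pct = average_gain / passing_score * 100
    print(f"{pct:.2f}%")  # 6.15%, i.e., the 6.1% figure above, rounded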

This year, for the first time, retests were available for elementary and middle school students.  Even with that boost (about 4% according to VDOE), we have seen the Richmond middle school scores, and far too many elementary school scores, endure at appalling levels.

Richmond Middle Schools

Here are the average reading pass rates, by year, of the Richmond middle schools.


Recalling that the Accreditation Benchmark for reading is 75%, we see our highest-scoring middle school five points below that mark.

Recalling further that Thompson was the only Richmond school denied accreditation last year, and noticing that Thompson almost looks good in comparison to King and Henderson, we see further evidence of VDOE’s wholesale cooking of the accreditation data.

The math data paint a similar picture.


The Accreditation Benchmark here is 70% and only Hill comes even close.  Note also that the “accreditation denied” Thompson has whipped the “accredited with warning” Elkhardt, Henderson, and King for all four years of the new math tests.

It should be a crime to subject any child to any of these schools.  It is an embarrassment.

Richmond Elementary Schools

Looking further into the SOL pass rates, here are the reading data for Richmond’s elementary schools by year:


That graph is much too complicated, but it does contain all the data, plus the Richmond average.  Also, the graph does illuminate the disastrous effect of the new tests in 2013, the relative abilities of some schools to weather the change (notably Carver(!), Munford, and Fox, and, with a different pattern, Fairfield), and the gross failures of others (notably Mason, Oak Grove, and Woodville).  Doubtless this tells us a lot about the principals of those schools, especially in light of the more challenging student populations at Carver and Fairfield.

Focusing on the high and low performers, plus our neighborhood school (with the State average for grades 3-5 added), gives a more readable picture.


The math scores paint a similar picture, except that the Big Hit from the new tests came a year earlier.



VCU: Expanding Upon Incompetence

The other day, Jim Bacon published a piece on the Brookings study, Beyond College Rankings: A Value-Added Approach to Assessing Two- and Four-Year Schools.

Brookings looked at “economic value added.”  In essence, they measured “the difference between actual alumni outcomes (like salaries) and predicted outcomes for institutions with similar characteristics and students.”
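In other words, the value added is a residual: actual outcome minus the outcome predicted from the institution’s characteristics.  A minimal sketch of that idea, with a single made-up predictor and made-up salaries (Brookings’ actual model is far richer than this):

    import numpy as np

    # Made-up data: one institutional characteristic and actual alumni salaries.
    characteristic = np.array([1000.0, 1100.0, 1200.0, 1300.0])
    actual_salary = np.array([55000.0, 61000.0, 64000.0, 74000.0])

    # Predict the outcome from the characteristic with a least-squares fit.
    slope, intercept = np.polyfit(characteristic, actual_salary, 1)
    predicted_salary = slope * characteristic + intercept

    # "Value added" in the Brookings sense: actual minus predicted.
    print((actual_salary - predicted_salary).round(0))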

The Brookings data have something to tell us about the jobs our colleges and universities are doing.

Here, for a start, are the Brookings data for Virginia institutions.


If we graph those data, we see a clear trend.  Indeed, it makes sense that mid-career salary would correlate with occupational earnings.

There are two clearly anomalous data points: the University of Richmond (the yellow square) and Mary Washington (the pink circle) are well removed from the trend.


Richmond is the only “Baccalaureate College[]” in the list, although it offers advanced degrees in law, education, and other subjects.  Presumably Brookings looked only at the undergraduate program.  As to why Richmond graduates would have relatively high mid-career salaries but low earnings, go figure.

If we just look at the research universities, we see a clearer picture: Outstanding performance, except from VCU, and a clear trend.



Unfortunately, VCU seems devoted to its mediocre performance.  They are spending $90,000 per year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  If that kind of “leadership” is characteristic of VCU, it’s a miracle that their graduates’ earnings are not still lower.

Turning to the schools with master’s programs:



Here we see another fairly clear trend, with Mary Washington and Hampton showing anomalous mid-career salary outcomes.

My takeaway: Enjoy VCU basketball and send your kids to another university.


Note added Wednesday afternoon: The inimitable Carol Wolf pointed out that VSU [not VUU; Oops!] beat VCU as to both earnings and mid-career salary.

Indeed.  VSU in green, VCU in red:


In fact, it’s more dramatic than that.  VCU is a research university, while VSU is in the lower-scoring cohort of “Master’s Colleges and Universities.”  It would have been entirely unremarkable if VCU had significantly outscored VSU.  It is entirely remarkable that VSU whipped VCU in both rankings.

Reversible “Progress”

The ratty old steel guardrails on Riverside Drive did a fair job of standing the tests of time and drunks.  The new wooden guardrail got (and failed) its first test about 02:00 Friday morning on the curve just west of the 42d St. parking lot.

The folks who installed the railing did a nice job of setting it straight; our Friday morning visitor undid that straightness, mostly by moving the posts.

If our City repairs the new guardrail with the same care that they bring to picking up the leaves on Riverside Drive, we are in for a deteriorating wooden eyesore.


Your tax dollars at “work.”

“Educator” = “Criminal”??

The Wall Street Journal this morning headlined “Eleven Atlanta Educators Convicted in Cheating Scandal.”

The story reported that eleven of twelve former administrators, principals, and teachers were convicted of racketeering for their participation in a conspiracy to cheat on their students’ standardized tests.

Looks like the WSJ couldn’t think of a better word than “educator.”  They might have said “former public school personnel.”  “Criminal” would have been even more compact.

For sure, calling those people “educators” was a slam to all the decent folks who work in the public schools.

A Modest Proposal

SOL scores decrease with decreasing economic status of the family.  Thus, the Feds have required VDOE (select the SGP Primer link) to compute a measure of learning, not of family income.  VDOE selected the SGP.  VDOE now has three years of those SGP data that can be used to measure teacher effectiveness.

VDOE has a lawyer full of bogus excuses for not releasing the data with the teachers’ identities attached.  None of those would prevent use of the data to rate the effectiveness of the colleges that sent us those teachers.

Just think, VDOE now can measure how well each college’s graduates perform as fledgling teachers and how quickly they improve (or not) in the job.  In this time of increasing college costs, those data would be important for anyone considering a career in education.  And the data should help our school divisions make hiring decisions.

In addition, VDOE could assess the effectiveness of the teacher training at VCU, which is spending $90,000 a year of your and my tax money to hire Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  Wouldn’t it be interesting to see whether that kind of “leadership” can produce capable teachers (although it produced an educational disaster in Richmond)?


Lynchburg SGP

My paternal grandmother was Angie Lynch, said to be a relative of John Lynch.  Angie was the second woman in the Oklahoma Territory with an advanced degree.

I’ve maintained an affection for Lynchburg, especially in celebration of the US 460 bypass that makes travel to Roanoke a much lighter task.  So it was a particular sorrow when my earlier Lynchburg post got wiped.

In light of VDOE’s third data release (which includes data by teacher, but not by school), I thought I’d redo the post.

First, as a reminder, here are the statewide distributions of teacher average SGPs in reading and math.



Next, the Lynchburg distributions.




Need I say it: These are not good numbers.

We have three years’ data, so let’s look at the trends, restricting the graphs to those teachers who taught the subject for all three years.
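(For anyone who wants to reproduce that restriction from the VDOE release, here is a pandas sketch; the column names and numbers are mine, not VDOE’s.)

    import pandas as pd

    # Hypothetical layout: one row per teacher per year with an average SGP.
    df = pd.DataFrame({
        "teacher_id": [66197, 66197, 66197, 69532, 69532, 69532, 70001],
        "year":       [2012, 2013, 2014, 2012, 2013, 2014, 2014],
        "avg_sgp":    [48.0, 55.0, 62.0, 45.0, 38.0, 22.0, 51.0],
    })

    # Keep only teachers with a record in all three years.
    years_taught = df.groupby("teacher_id")["year"].nunique()
    all_three = years_taught[years_taught == 3].index
    trend = df[df["teacher_id"].isin(all_three)].pivot(
        index="year", columns="teacher_id", values="avg_sgp")
    print(trend)  # one column per teacher, ready to plot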


There are too many reading teachers to make much sense of the graph (the table on the right is too small to even list them all).  Let’s take out all but the top and bottom few.


Here we see some average and below-average teachers improving nicely (No. 66197 presents a happy picture) and others deteriorating severely (No. 69532 is an unfortunate counterbalance to No. 66197).  The citywide average by teacher (which includes those who taught reading for only one or two years) is low, and the lack of a trend does not suggest improvement.

Turning directly to the bowdlerized dataset, we find the math data more lively.


Of interest, we again see low-performing teachers whose performance deteriorated.  We also see a citywide average that bounced but then dropped back to subpar.

Only three Lynchburg teachers taught Algebra I all three years, so the graph is much simpler.


None of the three improved over the period; quite the contrary.  The average is pulled down by the teachers, not shown, who taught fewer than all three years.  It starts above the state average but deteriorates into the unacceptable range populated by Lynchburg’s reading and math averages.

We also have detailed data by teacher, although VDOE won’t tell us who they are.  The high-performing teacher in this collection is No. 71485, who had only one 4th grade math student scoring below the statewide average.


In contrast, the best math SGP in the 4th grade class of Teacher No. 71819 was 23.


This teacher also had a 4th grade reading class.


The 25.7 average in that reading class is far from acceptable, but it is far less dismal than the 4.4 average in this teacher’s math class.

For any reader inclined to overlook the fundamental notion that the SGP measures teacher performance, a glance at the eight students who took both reading and math from this teacher is instructive.


One student scored in the same Student Growth Percentile in both subjects; the other seven scored higher, some much higher, in reading.  Note especially student No. 7B89048849408, who scored in the first percentile with the benefit of this teacher’s math instruction but in the 70th on the reading test.

Unfortunately, this teacher is getting worse.


I could go on but I think these data make my points.  I’ll suggest five things:

  • Lynchburg has a problem with its school system.
  • No. 71819 is an awful math teacher.
  • No. 71819 is a very bad reading teacher.
  • Any principal who subjected schoolchildren to No. 71819 in 2015 should be fired.
  • The bureaucrats at VDOE who refuse to identify No. 71819, as well as that teacher’s principal, to the parents of Lynchburg are misusing the public funds that pay them and pay for the statewide testing.

Important SGP Data Suppressed by VDOE

The third SGP data release by VDOE contains anonymized teacher IDs (but no data by school).  These abbreviated data serve to emphasize the perversity of VDOE’s suppression of the teacher identities (and other data).

In Richmond, according to the database, 304 teachers taught reading in grades 4 to 8 in the 2012-2014 period.  Of these, 74 taught the subject all three years.  A graph of the average SGP performance of those 74 is far too busy to convey much information, aside from showing the remarkable range of the average scores and the dip in 2013 caused by the former Superintendent’s failure to align the curriculum to the new SOL tests.


If we remove all but the top and bottom four or five 2014 scores, the graph is more informative.


The average shows the Richmond dip in 2013.  Note that the State SOL scores dropped in 2013, because of the new tests, but the SGP did not: The SGP measures relative performance and the statewide SGP average is unaffected by the overall drop in SOL scores.  Richmond’s SGP dropped relative to the statewide numbers, however, denoting underperformance here.
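(To see why the statewide SGP average is immune to an across-the-board drop in scores, here is a toy demonstration; the score distributions are invented.)

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Invented statewide growth scores: everyone drops ~50 points in 2013.
    scores_2012 = rng.normal(430, 40, 10000)
    scores_2013 = rng.normal(380, 40, 10000)

    for year, scores in ((2012, scores_2012), (2013, scores_2013)):
        percentiles = stats.rankdata(scores) / len(scores) * 100
        print(year, round(percentiles.mean(), 1))  # ~50.0 both years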

Turning to specifics: Teacher No. 66858 (Gr 5 reading) started above average and improved dramatically.  Teacher No. 74415 (Gr 4 reading) started below average and deteriorated dramatically. 

The distribution of Teacher 66858’s 2014 SGP scores provides a more detailed picture of that teacher’s excellent job.



This teacher had only one student whose reading progress in 2014 was below average.  The low end of the 95% confidence interval of the mean for these data is 76.2. 

In contrast,



The high end of the 95% confidence interval for the mean of this teacher is 13.9.  Notice that this teacher’s 2013 performance was not a whole lot better.
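(Those interval numbers are easy to check; a t-based confidence interval of the mean takes a few lines.  The SGPs below are invented stand-ins; the real ones are in the release.)

    import numpy as np
    from scipy import stats

    sgps = np.array([85, 90, 72, 95, 88, 79, 91, 84, 77, 93])  # invented class SGPs

    mean = sgps.mean()
    sem = stats.sem(sgps)  # standard error of the mean
    low, high = stats.t.interval(0.95, df=len(sgps) - 1, loc=mean, scale=sem)
    print(f"mean {mean:.1f}, 95% CI [{low:.1f}, {high:.1f}]")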

The principal who allowed 25 kids (we have SGPs for 24 of the 25) to be subjected to this educational malpractice in 2014 should have been fired.  Yet VDOE deliberately makes it impossible for Richmond’s parents to know whether this situation has been corrected or whether, as is almost certain, another batch of kids is being similarly afflicted with this awful teacher.

The math data show a similarly diverse pattern, albeit without the 2013 drop: good and average teachers getting better; average and bad teachers getting worse; bad teachers staying bad.


It turns out that both of the reading teachers above also taught math to the same kids they taught (or, in the one case, failed to teach) reading.

No. 66858 turns out to be an excellent math teacher, albeit not as excellent as at reading.



Similarly, No. 74415 is a bad math teacher, but not as awful as at reading.



No real surprises here.  We would expect that, to some degree, teaching math might take a different skill set than teaching reading.  We also might expect that a good reading teacher would be good at math, and a bad reading teacher similarly bad at math.

I could go on and on but the point already is clear: VDOE is concealing important data about the performance of our teachers and principals.  Without these data, the public cannot know which of our principals are harboring educational malpractice.

Finally, Algebra I.  Only nine Richmond teachers taught this subject in all three years, so the graph includes them all.


These data paint a picture of improvement, but see the caveats below.

Thirty-one Richmond students were afflicted with teacher No. 68640 but we have SGPs for only 13.  The scores of those 13 do not paint a pretty picture.



This teacher improved from appalling to awful from 2013 to 2014 but still had only three students above average in 2014.  It is tempting to think that this teacher demonstrates that yet another principal needs firing but there are problems with the data.

The Algebra I data are skewed in at least two ways: The bright kids tend to pass it in middle school, while the ones who can’t pass in high school contribute to our appalling dropout rate.  Then we have the students who take Algebra I but don’t get an SGP score because of the byzantine rules (see the class above, with 31 students but only 13 SGPs).

And then, the students who have trouble in high school tend to retake (and retake and retake) the test in order to graduate.  VDOE has let slip enough retake data to suggest that the retake scores are bogus.

For instance, here are the 2014 Algebra I retest counts in Richmond.


On the data that slipped out of VDOE, those retests increased the SOL scores of the students involved by an average of 24.6 points.  One student improved his/her Algebra I score by 108 points. 

The data are here.  Note that the averages at the link include the retest score decreases and still show a net positive 12+ points.
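(The “net positive” figure is just the mean over all the changes, decreases included.  A sketch with invented changes, keeping the one real 108-point jump:)

    # Invented retest score changes; negative entries are retests that
    # lowered the score. The 108-point jump is the real one noted above.
    changes = [108, 12, 8, -21, 15, -6, 2, -22]

    net_average = sum(changes) / len(changes)
    print(round(net_average, 1))  # 12.0 -- positive despite the decreases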

The summerlong retesting is another facet of VDOE’s deliberate concealment: They have enough SOL data to schedule graduations in May but they do not release the SOLs until late August.  No telling what machinations take place in those three months; the data above suggest many and major.

So, we have VDOE manipulating the data in secret and, even more to the point, concealing data about the good and bad teachers in our schools. 

Our tax dollars at “work.”