Reedy Creek: If It Ain’t Broke, Spend Tax Money to Fix It

We have seen that the City plans to spend $1.27 million to control phosphorus and sediment in Reedy Creek that already are being controlled by the sediment traps at Forest Hill Park Lake.

The Reedy Creek Coalition has some stream monitoring data that show another side of this boondoggle.  The monitoring points are here:


Let’s rename those stations to something more descriptive: RC4 = Headwaters; RC3 = Upstream (of the project area); CB1 = Trib (as it enters the project area); RC1 = Downstream (of the project area).  With those labels here are the dissolved oxygen data:


The water quality standard for dissolved oxygen in this stream is 5.0 parts per million as a daily average.  That does not translate to a percentage unless we know the temperature.  The percentage data nonetheless tell us that the lowest dissolved oxygen levels are found in the headwaters and that the concentration improves as the water flows through the project area.

The DEQ monitoring has placed the upstream reaches of Reedy Creek on the 305(b) list of impaired waters for dissolved oxygen and pH violations [at 453-4].

So there you have it: The impaired waters are upstream.  Rather than fix the problem at the upstream source, the City is spending $1.27 million on the part of the stream that is helping to correct the impairment.  Their justification is that the project will remove pollutants that already are being removed at Forest Hill Park Lake. 

Your tax dollars at “work.”

Reedy Creek Boondoggle

The City is planning to waste your tax money on a “restoration” project that won’t do any good.

Reedy Creek rises near Chippenham Hospital; it flows through Forest Hill Park and into the James. 

The concrete channel built to control the flooding of and near Midlo Turnpike (esp. in front of the Evergreen plant) has led to high stormwater flows that erode the banks downstream.  The City now plans to spend $1.27 million of your and my tax money to create a new floodplain on the City property across Crutchfield St. from Wythe High School to reduce erosion there. 

The Reedy Creek Coalition lists five major reasons for opposing the project.  Their #4 should settle the issue: The money won’t do any good.

Apparently the DEQ bureaucrat who approved the City’s application for grant funds is a recent ComeHere.  Anybody who has lived in Richmond for more than a few years knows that the City already spent $1.4 million to restore the Forest Hill Park lake that is downstream of this project.  The renovation included a silt capture system, actually two forebays:

“A forebay is basically a hole. It’s a settlement hole where the silt will kind of build up. We will be able to clear it out with a Bobcat and haul it off and it will fill up again. So the process will be able to continue. But it will not affect the lake so that the citizens’ investment that they have in the lake will certainly be safeguarded,” said Richmond Parks deputy director Roslyn Johnson.

The City’s grant application to DEQ calculates that the new project will remove 150 lbs/yr of phosphorus and 98,736 lbs/yr of sediment.  The calculation entirely overlooks the two forebays that already are removing most of this and other sediment and the attached phosphorus.

So the City wants to spend $1.27 million to solve a problem that it already has solved.

And they want Chesapeake Bay TMDL credits for removing pollutants that are NOT entering the river.

Can you spell “boondoggle”?

Has Roanoke Joined the Cheaters Club?

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia on the VGLA.

Now I have a copy of a letter from a former Roanoke Latin teacher to the President of the VBOE, alleging wholesale cheating at one or more Roanoke schools.

If Adams is right, it would not be a surprise to find a fire beneath this smoke.  In any case it will be interesting to see whether VBOE, which is supposed to supervise the public school system, conducts an investigation.

Here is the letter:


Dr. Billy K. Cannaday, Jr.
Virginia Board of Education
P.O. Box 2120
Richmond, VA 23218
(804) 225-2924

Dear Dr. Cannaday:

I am writing you on behalf of over twenty former and current students including faculty at Hidden Valley High School in Roanoke County Public Schools (RCPS), who are extremely concerned about cheating on non-SOL testing on school-issued laptops, which has been a chronic problem since 2007.1 Unfortunately, cheating is not only a widespread problem at Hidden Valley High School, but throughout RCPS in grades 8-12.2

I taught Latin at Hidden Valley High School from 2011 to 2013. Despite informing the administration in November 2012 about cheating on school-issued laptops, nothing was ever resolved. Over ten of my former students informed me after graduating in June 2015 that cheating at the school actually had worsened in the past two years. Many of them described the cheating as “nuts,” “rampant,” and “out of control.” I informed Al Bedrosian of the Board of Supervisors in October 2015 and Fuzzy Minnix of the School Board in November 2015 about my concerns, but nothing was resolved. So I addressed the School Board on March 24, and again nothing was resolved except a vague promise by Jeff Terry, the Chief Information Officer, to update and secure Blackboard next fall (Gregory). I also addressed the Board of Supervisors on April 26 upon the invitation of Al Bedrosian, but unfortunately they do not have any oversight of the school district.

I believe that RCPS is in violation of Standard 7 (C) (3) of the Code of Virginia, which states that “the standards of student conduct and attendance and enforcement procedures [are] designed to provide that public education be conducted in an atmosphere free of disruption and threat to persons or property and supportive of individual rights” (§ 22.1-253.13:7).3 There is no question that RCPS currently has adequate “standards of student conduct” in place for academic integrity. According to Policy 7.11 or the Roanoke County Student Conduct Code, Rule 9 states that “students are expected to perform honestly on any assigned schoolwork or tests” (RCPS Current Policies SERIES 07: Students). Rule 9 (A) states that “students shall not cheat on a test or assigned schoolwork by giving, receiving, offering, and/or soliciting information” while Rule 9 (E) further states that they shall not “use technology for any unauthorized use” (RCPS Current Policies). Likewise, according to the Student Handbook of Hidden Valley High School for 2014-15 the honor code’s goal is “to maintain a high level of integrity, to strive honestly in all endeavors, and to perpetuate an atmosphere of trust between peers, students, and faculty” (2).4

Unfortunately, the central office of RCPS and the administration at Hidden Valley High School have total disregard for realistically enforcing these policies and rules when students take an online non-SOL test or quiz on school-issued laptops using Blackboard. It is extremely easy for a student to cheat without getting caught making the “enforcement procedures” in Standard 7 (C) (3) almost meaningless. The problem is that students have complete access to both their hard drives and the internet during an online test, and it is impossible for a dedicated teacher to watch fifteen or thirty laptop screens and also look for traditional cheating such as crib sheets and smartphones. Students can easily right click on Google, access the Snipping Tool, copy and paste answers, hide a cheat sheet, email passwords, etc. and most insidiously program a key to perform screen captures of an entire test or quiz to a Google server without the teacher ever knowing it. This testing environment is the direct opposite of state-mandated SOL testing which requires a lockdown browser and other needed software in order to prevent digital cheating.

Standard 7 (C) (3) clearly states that “public education be conducted in an atmosphere” “supportive of individual rights” (§ 22.1-253.13:7). RCPS has violated the “individual rights” of honest students who obey the rules or “standards of student conduct” (§ 22.1-253.13:7). The honest students are at a distinct disadvantage in competing against the dishonest ones in terms of lower GPAs, lower class ranking, and less academic awards, which also negatively impacts college admissions, scholarships and grants. There is a de facto system of academic apartheid between the honest students and the dishonest ones or cheaters in grades 8-12 throughout RCPS, thereby negligently allowing a non-level playing field and creating a negative “atmosphere” of learning. Like Major League Baseball players in the 1990s until 2005 during the steroid era, many honest students ask themselves if they should cheat in order to get ahead academically while the dishonest students never ask themselves this question. This is a moral dilemma every honest student faces during the academic year at Hidden Valley High School and all the other county schools in grades 8-12.

In addition, Standard 7 (C) (3) states that “public education be conducted in an atmosphere” “free of disruption” (§ 22.1-253.13:7). Not only is cheating both academically disruptive and morally wrong it also teaches bad “citizenship” by negative example for irresponsible and NOT “responsible participation in American society,” which is both a violation of the public trust and Standard 1 (C. 1) (e.) (Code of Virginia. § 22.1-253.13:1).5 RCPS should not be teaching its students to be emulating such notorious “cheats” as Lance Armstrong, Mark McGwire, Lenny Dykstra and Alex Rodriguez, not to mention Swiss banks, Mitsubishi and Volkswagen. Lastly, cheating certainly does not “foster public confidence” in RCPS, which is one of the five “accreditation standards” of the “public education system” in Virginia (“Regulations Establishing Standards for Accrediting Public Schools in Virginia” (8VAC20-131) 3).

RCPS has not been in compliance with both Standards 7 (C) (3) and 1 (C. 1) (e.) in grades 9-12 since 2007.6 When a student takes an online test or quiz on a school-issued laptop, the school district does not provide adequate “enforcement procedures” as described in Rules 9(A) and 9(E) in Policy 7.11 or the Roanoke County Student Conduct Code. Hidden Valley High School has also failed “to maintain a high level of integrity” and other ethical standards as described in the school’s honor code. However, the most egregious violation has been the noncompliance of RCPS with Standard 7 (C) (3), which states that “public education be conducted in an atmosphere” “supportive of individual rights.” This has repeatedly resulted in honest students being at a distinct disadvantage in competing against the dishonest ones in terms of lower GPAs, lower class ranking, and less academic awards negatively impacting college admissions, scholarships and grants. Consequently cheating has also allowed the teaching of very bad citizenship, which is a violation of Standard 1 (C. 1) (e.). There needs to be an immediate external investigation from Richmond in order to ascertain the status of the school district’s state accreditation, and determine who has been either responsible or complicit in this shameful and preventable academic misconduct. The students, parents and taxpayers in Roanoke County all deserve more integrity and better accountability from their public schools.


Robert Maronic



1. Smith wrote about cheating on non-SOL testing using school-issued laptops and Blackboard at Hidden Valley High School in May 2013: “In a miniature poll of Hidden Valley students, who’s [sic] identities will be kept anonymous, one 11th grader estimated that in a class of 25 students taking a Blackboard test, ten to fifteen would be cheating.  Another student, a 12th grader, believes that in the same situation, only two or three students would be cheating.  Whichever version is true, students are still cheating on tests.” Smith also wrote, “Most students and teachers agree that it is easier to cheat on a Blackboard test than on a paper test.  A 10th grade student said that it was easier to cheat on a Blackboard test because ‘you can switch windows while you are working on a test.’  An 11th grade student said that it is easier to cheat on a Blackboard test because of ‘search engines such as Google, Bing, and Yahoo.’ Some students have witnessed so much cheating that they have become numb to it.”

2. Cheating is truly a widespread problem throughout RCPS. I have talked with over twenty teachers, students and graduates from Cave Spring High School, William Byrd High School and Northside High School since 2011, and all their complaints about cheating on the school-issued laptops are identical to what I was told or experienced at Hidden Valley High School. I have listened to the complaints of one recent graduate of Glenvar High School, and would assume that cheating is just as prevalent there as the other four county high schools. Please also note that RCPS first issued laptops to all eighth graders during the 2015-16 academic year. All seventh graders will be issued laptops during the 2016-17 academic year according to what was discussed at the School Board meeting on March 24.

3. According to “Bill Tracking (Chapter 474 ) – 2008 Session Legislation,” Standard 7 (C) (3) was known as Standard 7 (B.1) (3) in 2007 and 2008 in the Code of Virginia.

4. An updated version of the Hidden Valley High School Student Handbook for the 2015-16 academic year is currently unavailable online.

5. According to the “Virginia Department of Education SOQ Compliance Detail Report [for] Roanoke County” submitted for the 2014-15 academic year, Standard 1 (C. 1) (e.) states, “Essential skills and concepts of citizenship, including knowledge of Virginia history and world and United States history, economics, government, foreign languages, and international cultures, health and physical education, environmental issues and geography necessary for responsible participation in American society and in the international community.”

6. RCPS first issued laptops to all eighth graders during the 2015-16 academic year. The school district has not been in compliance with both Standards 7 (C) (3) and 1 (C. 1) (e.) for eighth graders since August 2015.


Works Cited

“Bill Tracking (Chapter 474) – 2008 Session Legislation.” Bill Tracking – 2008 Session of the VA General Assembly. Web. 17 May 2016.

Code of Virginia. § 22.1-253.13:1. Standard 1. Instructional Programs Supporting the Standards of Learning and Other Educational Objectives. Web. 15 May 2016.

Code of Virginia. § 22.1-253.13:7. Standard 7. School Board Policies. Web. 14 May 2016.

Gregory, Sara. “Roanoke County School Board Approves 2 Percent Raise for Teachers.” Roanoke Times. 24 Mar. 2016. Web. 24 Mar. 2016.

Hidden Valley High School: Student Handbook 2014-2015. Roanoke County Public Schools. Web. 12 May 2016.

“RCPS Current Policies SERIES 07: Students.” Student Conduct Code: Policy 7.11. Roanoke County Public Schools, 13 Aug. 2015. Web. 12 May 2016. See Rule 9 – Integrity.

“Regulations Establishing Standards for Accrediting Public Schools in Virginia” (8VAC20-131). VA Department of Education, 19 Oct. 2015. Web. 12 May 2016. See p. 3 (8VAC20-131-10. Purpose).

Smith, Tanner. “Cheating Continues to Plague Acadmic [sic] Careers.” Titan Times. Hidden Valley High School [Roanoke], 2 May 2013. Web. 13 May 2016.

The Empire Strikes Back. Feebly.

On June 2, Donald Wilms, President of the Chesterfield Education Association, responded in the Times-Dispatch to Bart Hinkle’s editorial of May 28.

Hinkle had made the point that the Virginia Education Association’s attempt to suppress truthful data on teacher effectiveness sought to keep “parents and taxpayers . . . in the dark about which teachers are doing a great job – and which ones aren’t.” 

Wilms sought to argue that the data should be kept secret.  He mostly demonstrated that the Chesterfield Education Association needs a better argument.

Wilms brought on some emotional arm-waving about the students who may come to school on test day after oversleeping and missing breakfast; after a fight; after losing a boy-/girlfriend; or the like.  He neglected to mention that the data he disdains are based on two (or more) successive years’ test scores:  An outside event in the second year could lower a student’s measured progress but the same event in the first year could increase the score difference and, thus, the progress measure.
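The symmetry of that point is easy to miss, so here is a toy numeric sketch (all numbers invented for illustration):

```python
# Growth is measured as the change across two years' scores, so a one-time
# bad day cuts both ways: it depresses measured growth if it happens in
# year 2, but inflates measured growth if it happened in year 1.
true_score = 420        # what the student would score on a normal day (invented)
bad_day_penalty = 40    # invented one-time hit from oversleeping, a fight, etc.

normal_growth = true_score - true_score                          # baseline: 0
bad_day_in_year2 = (true_score - bad_day_penalty) - true_score   # growth looks worse
bad_day_in_year1 = true_score - (true_score - bad_day_penalty)   # growth looks better
print(normal_growth, bad_day_in_year2, bad_day_in_year1)         # 0 -40 40
```

Averaged over many students, those one-time shocks push the measure up about as often as they push it down.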

The nut of Wilms’ argument, however, was the kids who are at a systematic disadvantage:

[W]ould it be fair for schools with high-income households — where both parents are college-educated, where kids go to museums and take exotic vacations, where parents have two cars and are available to take kids to the public library — to compete with schools where kids live in single-parent households, where parents hold several low-wage jobs, with hours preventing them from being home to take kids to museums or public libraries, and incomes preventing them from any vacation at all? Heck, it would be unfair to compare a school in western Chesterfield with one in eastern Chesterfield, let alone to compare one of Chesterfield’s (or Henrico’s) wealthiest school communities with one of Richmond’s neediest school communities, don’t you think?

* * *

What these [SGP data] really illustrate is which teachers have the at-risk kids and which don’t.

On these emotional grounds, Wilms attacked the fairness of the “flawed rankings” of the Student Growth Percentile, the “SGP” (and, apparently, other measures of teaching effectiveness).  He disdained to discuss the actual data, which are to the contrary.

When it was considering adopting the SGP, VDOE provided data showing that test scores fall with increasing economic disadvantage (no surprise there) but that SGP percentiles do not:


That is because the SGP, by design, compares low-performers only to other low-performers.  Unlike the SOL, it measures progress relative to others similarly situated.
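To make the “compared only to similarly situated students” idea concrete, here is a toy sketch. The real SGP is computed with quantile regression on prior-year scores; the crude prior-score banding below is only an illustration of the principle, and every number in it is invented:

```python
import numpy as np

def sgp_like(prior, current, n_bands=10):
    """Toy growth percentile: rank each student's current score only
    against peers with similar prior scores.  The real SGP uses quantile
    regression, not crude banding; this just illustrates the principle."""
    prior = np.asarray(prior, dtype=float)
    current = np.asarray(current, dtype=float)
    # Split students into bands of peers by prior-year score.
    edges = np.quantile(prior, np.linspace(0, 1, n_bands + 1))
    bands = np.clip(np.searchsorted(edges, prior, side="right") - 1,
                    0, n_bands - 1)
    result = np.empty_like(current)
    for b in range(n_bands):
        mask = bands == b
        group = current[mask]
        # Percentile rank of each score within its own peer band.
        result[mask] = [100.0 * np.mean(group < x) for x in group]
    return result

# A low scorer who grows a lot ranks high against peers, not the whole state.
prior   = [300, 300, 300, 300, 300, 500, 500, 500, 500, 500]
current = [310, 320, 330, 340, 350, 510, 520, 530, 540, 550]
print(sgp_like(prior, current, n_bands=2))
```

Note that the top student in the low-prior band gets the same growth percentile as the top student in the high-prior band, even though their absolute scores are far apart.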

Indeed, the data for Chesterfield County make a clear case that students who score poorly one year, for whatever reason, can show outstanding academic growth the next year.  Here, for instance, are the 2014 math SGP scores of Chesterfield students plotted against those same students’ 2013 SOL scores.


Before drawing any conclusions, we need to refine the analysis slightly: Students who score pass/advanced (SOL 500 or above) for two years running do not receive an SGP (it’s hard to improve on “excellent”).  In a sense, the SGP penalizes teachers and schools with large numbers of very bright students!

The SOL scores ≥ 500 in the graph above represent students who scored advanced in 2013 but less than advanced in 2014.  The students who scored advanced in both ’13 and ’14 do not appear in this dataset at all, which biases the analysis.  So let’s look only at the students with 2013 SOLs < 500:


The result doesn’t change much.  The R² value of 0.14% tells us that the SGPs are essentially uncorrelated with the previous year’s SOL scores.  And since correlation is necessary to show causation, the prior-year scores cannot be driving the SGPs.  Said otherwise: Chesterfield students with low SOL scores in the previous year, for whatever reason, show superior (or inferior) academic growth (as measured by the SGP) in the current year just as often as students who scored well in the previous year.
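For readers who want to see what an R² that small looks like, here is a sketch with synthetic data. The actual student-level records are not reproduced here; the uniform distributions below are invented stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Invented stand-ins for the real records: if SGP is independent of
# prior achievement, the two series should be essentially uncorrelated.
sol_2013 = rng.uniform(250, 499, size=n)   # prior-year scaled scores
sgp_2014 = rng.uniform(1, 99, size=n)      # growth percentiles

r = np.corrcoef(sol_2013, sgp_2014)[0, 1]
print(f"R^2 = {100 * r**2:.2f}%")          # a small fraction of one percent
```

With independent series, R² hovers near 1/n, i.e., hundredths of a percent, which is the neighborhood the Chesterfield data occupy.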

We can’t directly test Wilms’ statement about pitting a rich school against a poor one because VDOE (known here as the Virginia Department of Data Suppression) hasn’t released (and won’t release) the data.  So let’s go one better: Let’s use the data we have to compare some of the worst students in Chesterfield in terms of 2013 SOL with some of the best.

Using the math data, here is the first group, all at 50 points or more below the passing score of 400:


As before, the 2014 SGP does not correlate with the 2013 SOL. 

Next, the 2013 SOLs between 450 (50 points above the passing score) and 499 (where we stop to avoid the advanced-student penalty).


Pretty much the same story: High or low SOL one year does not predict SGP growth in the next year.

Said yet otherwise: Part of teaching in the public schools is dealing with kids who are disadvantaged; the SGP identifies teachers who do that well; the SOL may not.

BTW: The reading data tell the same story with slightly different numbers.  Let’s avoid some clutter and leave them out. 

It’s easy to understand why Wilms would prefer the current system.  In 2011 (the only year that VDOE slipped up and posted evaluation data), Chesterfield teachers were evaluated on seven criteria.  Six of those were inputs, i.e., relatively easy to measure but only distantly related to student learning.  On the only important measure, “Student Achievement and Academic Progress,” only twelve of 1222 Chesterfield teachers (1%) were said to need improvement and none was unsatisfactory. 


But when we look at real performance data (2012 SGPs are the earliest that Brian Davison sued out of VDOE), we see student progress that was much worse than “unsatisfactory.”  For example here is the math performance of Teacher No. 34785 (identifier anonymized by VDOE):


The yellow points are the annual SGP averages of that teacher’s math students.  The blue bars show the 95% confidence intervals. 

The Chesterfield math averages those years were 48.7, 49.2, and 48.2.

Then we have Nos. 54317 and 86898:


To put some context on these numbers, here is the 2014 statewide distribution of math SGP averages by teacher.


The mean was 49.3; the estimated standard deviation, 16.3.  That is, about sixteen percent of the teachers were below 33.0; another sixteen percent, above 65.6.  Two and a half percent were below 16.7; another 2.5%, above 81.9. I’ve marked those values on the 2014 Chesterfield distribution:


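The tail percentages quoted above follow from the usual normal approximation (about 68% of values within one standard deviation of the mean, about 95% within two). A quick check of the cutoff arithmetic:

```python
# Normal-approximation tails for the 2014 teacher-average math SGPs:
# ~16% of teachers below mean - 1 SD, ~16% above mean + 1 SD,
# ~2.5% below mean - 2 SD, ~2.5% above mean + 2 SD.
mean, sd = 49.3, 16.3   # statewide figures quoted in the text

print(round(mean - sd, 1), round(mean + sd, 1))          # 33.0 65.6
print(round(mean - 2 * sd, 1), round(mean + 2 * sd, 1))  # 16.7 81.9
```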
The problem here is not “flawed rankings.”  It is flawed teachers that the Chesterfield “Education” Association does not want the parents of Chesterfield County to know about.

BTW: The data also show that there are some really fine teachers in Chesterfield.  For example, in math we see Nos. 29893, 28974, and 81816:


The Chesterfield and Virginia “Education” Associations don’t want you to know about these outstanding teachers, either.  They just want you to think that 99% are above average.

And VDOE is a co-conspirator.  Your tax dollars “at work.”

Yet More of What VEA and VDOE Are Trying to Hide

In the discussion of the SGP data that Brian Davison sued out of the State Department of Data Suppression, I’ve been focused on the awful teaching (e.g., here, here, and here) that the Virginia Department of “Education” and the Virginia “Education” Association have been attempting to conceal.  But their efforts to hide teacher performance data from the taxpayers who are paying the teachers have another outrageous effect: They suppress the identities of the many great teachers in Virginia’s public schools.

Having looked at the 43 teachers with the worst three-year average math SGPs, let’s turn to the 43 with the best averages:


The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average for each teacher.  The “Grand Total” row reports the statewide average of each column.

All but three of the 43 teachers at the bottom of the SGP list damaged Virginia schoolchildren for only one year; almost half of the teachers in the present list helped educate Virginia schoolchildren for more than one year.

Here are the Top Ten who taught all three of the years for which we have data.


Notice that all but two of these got better performances from their students in 2014 than in 2012.  And, of those two, No. 115415’s even 99 in 2012, paired with a much lower overall average, suggests only one student (of doubtful statistical significance) in the first year and substantial progress over the next two years.

The preponderance of NoVa suburbs in this list raises the question whether those divisions have better students, or better teachers, or some combination.  See below for some data suggesting that there is more learning, and presumably more teaching, in some of the more rural divisions.  In the meantime, here are the data for the Top Ten in graphical form.


Or, with the ordinate expanded:


The division averages provide some further insights.  Let’s start with the distribution of division mathematics averages for 2014.


As shown above, the mean math SGP in 2014 was 49.1.  The division mean was 48.3, with a standard deviation of 6.2.

The Top Ten divisions of 2014 did not include any of the NoVa suburbs.


Neither, for that matter, did the bottom ten.


Here is the entire 2014 math list, sorted by division (Colonial Beach, Craig, and Surry had no data).


When we turn to the division averages by year an interesting pattern emerges: The Top Ten in 2014 all improved from 2012.



Seems to me that a whole bunch of educators should be put on a bus to one of these divisions to find out how to increase student performance in math.

The Bottom Ten in 2014 mostly declined from 2012, except for Franklin and West Point, which improved; King and Queen, which improved slightly; and Galax, which only had data for 2014.



But the Virginia Department of “Education” and the Virginia “Education” Association don’t want you to see these data.  Apparently you are not qualified to know what you are getting (or not getting) for your tax dollar.

Still More of What VEA Wants to Hide

The SGP data that Brian Davison sued out of the State Department of Data Suppression last year showed (e.g., here and here) that we have some truly awful teachers in Virginia.  The lawsuit threatened by the Virginia “Education” Association demonstrates that it is more interested in protecting those bad teachers than in the effect of those teachers on Virginia’s schoolchildren.  And the VDOE’s own document, confirmed by the available data, admits that the teacher evaluation process has been worse than ineffective.

I’ve turned to the math data to provide some details.

A list of the very worst three-year teacher average math SGPs offers some Good News and some very Bad News. 

  • The Bad: Some truly awful teachers have been inflicted on Virginia’s schoolchildren, sometimes more than once. 
  • The Good: Most of those teachers from 2012 and 2013 were not there the following year.

Here are the worst 43 three-year math SGP averages by teacher, sorted by increasing average.


The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average SGP for each teacher.  The “Grand Total” row is the statewide average for each year and for the teacher averages.

We can’t tell from these data whether any of the teachers with only 2014 results stayed on to harm schoolchildren in 2015.  The yellow highlights identify the two teachers who came back after a failure to teach in an earlier year.

The red highlight indicates the teacher with the lowest average SGP among those who “taught” all three years.

Turning to the cases where the school division was so evil (or so intimidated by the VEA) that it subjected its students to more than two years’ exposure to the same awful teaching, here are the Top Ten, sorted by increasing three-year average:


No. 90763 is an outlier here: The three-year average of 17.1 comes from yearly averages of 71.0, 68.0, and 15.6.  The “.0s” are clues: This teacher likely had only one reported SGP in each of the first two years and a much larger class in 2014.  Indeed, the database shows 72 math SGPs in 2014.  If we add that year’s 15.625 average times 72 to the 71 and 68, and divide by 74, we get 17.08, the three-year average for this teacher.
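That back-of-the-envelope reconstruction can be checked directly; a minimal sketch, taking the class sizes inferred above as an assumption:

```python
# No. 90763's three-year average, assuming one reported SGP in each of
# 2012 (71) and 2013 (68) plus 72 students averaging 15.625 in 2014.
totals = 71.0 + 68.0 + 15.625 * 72   # sum of all 74 individual SGPs
count = 1 + 1 + 72
print(round(totals / count, 2))      # 17.08, matching the reported average
```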

That said, no telling what else unusual was going on there.

Turning to the other nine cases, we see:


Or, upon expanding the ordinate:


Only two of the nine showed improvement over the three-year period.  No. 88959 went from a 9.6 to a 28.3, which is still more than twenty percentiles below average.  No. 114261 improved, but only to 17.6.  We have no data regarding the guinea-pig students who were damaged in these experiments.

The other seven “teachers” all showed deteriorated performance over the period.  Note especially No. 54317, who went from bad to appalling (and was paid tax dollars to continue afflicting Chesterfield County schoolchildren).

There are as many as nine principals (depending on time frame in the job and whether it was one or two schools in Chesterfield) who should have been fired over these disasters.  I’ll bet you a #2 lead pencil they all got raises.

(And, for those of us in Richmond who look to the schools in the Counties as the Elysian Fields when compared to the barren wasteland of Richmond Public Schools, the presence of one Henrico and two Chesterfield teachers in this list comes as a shock.)

But, the Virginia “Education” Association and the State Department of Data Suppression don’t want you to know about this, especially if your kid is suffering under such an awful “teacher” and such a pusillanimous school system.

Your tax dollars at “work.”

More of What VEA Wants to Hide

Turning again to the SGP data that Brian Davison sued out of VDOE last year, let’s look at mathematics.

No. 40837, the teacher with the best 2014 math SGP average in Virginia, 95.2, had a fifth grade class in Fairfax.  Here are the data:


We can resist the temptation to dismiss this as the result of a class full of very bright students:  Students who scored in the advanced range two years running didn’t get an SGP (it’s hard to show growth from really-good-already).  Thus, the students reported here did not obtain superior scores in both 2013 and 2014; they improved radically in 2014, compared to the others in Virginia who had similar performances in 2013.

Of further interest, this teacher’s reading SGPs (average of 65.7) are above average (48.0) but much less spectacular:


We can think of all kinds of explanations for this pattern.  In the absence of other data, Friar Occam would tell us to look first at the simple one: This is a superior math teacher and an above average reading teacher.

At the other end of the spectrum, we have No. 76323 whose fifth grade math students in Richmond averaged an SGP of 4.0.


The Virginia “Education” Association has threatened to sue because it doesn’t want you to know about this teacher.  But you can bet that the Richmond School Board members, Superintendent, principals and teachers knew and that none of their kids was among the unfortunate 25 in No. 76323’s class.

The reading performance of this teacher was a world better, with an above-average mean of 57.7:


This suggests that there is a place for this teacher in Richmond, just not teaching math (absent some heavy-duty retraining).

The second worst 2014 math performance comes from fourth grade teacher No. 71819 in Lynchburg, with an average of 4.4.


This same teacher turned in a 25.7 in reading.


So, one awful performance and one clearly sub-par. 

Unfortunately, this teacher was getting worse:


You can again bet that no child of a School Board member, the Superintendent, or any Lynchburg principal or teacher was damaged by this teacher.

All the schoolchildren of Lynchburg (and their parents who pay the teachers) deserve to be protected from this teacher, and the too many others who are nearly as bad.  Another twenty-nine Lynchburg teachers averaged a math SGP of less than thirty in 2014; eight of those were less than twenty.


The 2014 reading performance in Lynchburg was less horrible: One teacher was below an average SGP of twenty, another six were between twenty and thirty. 

The SGP data from 2012 to 2014 tell us that Lynchburg’s average performance was sub-par and deteriorating.


Unfortunately, we can’t rely on the schools to deal with the ineffective teachers.  VDOE quotes a 2009 study for the proposition that

99 percent of teachers were rated as satisfactory when their schools used a satisfactory/unsatisfactory rating system; in schools that used an evaluation scale with a broader range of options, an overwhelming 94 percent of all teachers received one of the top two ratings.

The only Virginia data we have on performance evaluations for teachers are from 2011.  That year, 99.3% of the Lynchburg teachers were rated “proficient”; 0.1% (one of 731) were “unacceptable – recommend plan of assistance”; and 0.5% (four of 731) were “unacceptable – recommend non-renewal or dismissal.”  Looks like Lynchburg, like Richmond, thinks it is the Lake Wobegon of teachers.

Indeed, it looks like there was little thinking involved and the evaluation process was a bad joke.  I have a Freedom of Information Act request pending at VDOE to see whether their new process is any better (or whether they are part of the VEA conspiracy of secrecy).

The Virginia “Education” Association is threatening to sue VDOE, Brian Davison, and me because they don’t want you to know whether your kid is being subjected to a lousy teacher.  Their behavior demonstrates that their mission is protecting incompetent teachers, not advancing “education.”  That makes the very name of the Association an exercise in mendacity. 

As to Richmond (and, doubtless, Lynchburg) I said earlier:

Seems to me City Council should demand, as a condition of all the money they are spending on [the schools], an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of Principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

Look What VEA Wants to Hide in Richmond

It’s a strange state we live in.

The meetings of our legislators are open to the public; their work product goes in the newspaper and on the Internet. The public is free to evaluate their positions, express opinions, and hold them accountable by voting them in or out of office.

Virginia’s judges perform in open court. Their work product is public and subject to review by the appellate courts. Judicial Performance Evaluations based on feedback from attorneys and jurors go to the General Assembly, which has the power to fire judges, and to the public, which can fire members of the General Assembly.

In contrast, the evaluations of how much the students of any teacher in our public schools have learned (or not) are confidential.  The Virginia “Education” Association says that the public is too stupid (or biased or something) to properly evaluate those data.  The evaluation is left to the school systems, who are free to ignore bad teaching, and do so with gusto.  So the parents of Virginia are left without the information to evaluate their children’s teachers or to oversee the school divisions’ management of the inadequate teachers.

Brian Davison of Loudoun sued the Department of Education and punched a small hole in this conspiracy against Virginia’s schoolchildren.  So, now, the VEA has threatened to sue VDOE, Brian, and me, seeking court orders to prevent, among other things, Brian’s and my disseminating and commenting upon SGP and, perhaps, other data regarding teacher effectiveness (or lack thereof).

At the outset, this demonstrates that the VEA is too stupid to count to “one”: The First Amendment bars this attempted prior restraint of Brian’s and my truthful speech.  (Could it be that the manifest insecurity of the VEA’s lawyer stems from a recognition, however faint, of that stupidity?)

As well, the information already available provides a window into what VEA is trying to hide. 

For three or four years, VDOE calculated Student Growth Percentiles (“SGPs”).  The SGP compares each student’s progress to that of other students who were similarly situated on the previous year(s)’ assessments.  Each student’s score change is then reported as a percentile rank within that group, from 1 (bottom 1% of the group) to 99 (top 1%).
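The percentile-rank idea can be sketched in a few lines of Python.  This is only an illustration under a simplifying assumption (peers are students with an identical prior-year score); the real VDOE model uses quantile regression over multiple prior years, and the function and data below are hypothetical:

```python
from collections import defaultdict

def student_growth_percentiles(records):
    """Toy SGP: percentile rank (1-99) of each student's current score
    among peers with the same prior-year score.  Illustration only;
    the real model uses quantile regression over several prior years."""
    peers = defaultdict(list)
    for _, prior, current in records:
        peers[prior].append(current)
    sgps = {}
    for sid, prior, current in records:
        group = peers[prior]
        # percentage of the peer group scoring strictly below this student
        below = sum(1 for score in group if score < current)
        pct = round(100 * below / len(group))
        sgps[sid] = min(max(pct, 1), 99)  # clamp into the 1..99 range
    return sgps

# Four students with identical prior scores, ranked by this year's growth:
sgps = student_growth_percentiles([
    ("a", 400, 420), ("b", 400, 440), ("c", 400, 400), ("d", 400, 460),
])
```

Note that the student with no growth at all ("c") still lands at 1, not 0, since the scale runs from 1 to 99.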

The 2014 statewide distribution of average reading SGPs by teacher approaches the ideal normal distribution.


The orange curve fitted to the data shows an average of 48.0 with a standard deviation of 9.7.

The Richmond distribution that year leans toward the low end (no surprise there).


The fitted curve has a mean of 44.0 and a standard deviation of 11.4.
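Fitting those orange curves amounts to estimating a mean and standard deviation from the teacher averages.  A minimal sketch with simulated data (the actual teacher-level SGPs are not reproduced here):

```python
import random
import statistics

# Simulated teacher-average SGPs; stand-ins for the actual 2014
# teacher-level data, which are not reproduced here.
random.seed(2014)
teacher_avgs = [random.gauss(48.0, 9.7) for _ in range(2000)]

# The "fitted curve" is simply the normal distribution whose parameters
# are the sample's mean and standard deviation.
mu = statistics.fmean(teacher_avgs)
sigma = statistics.stdev(teacher_avgs)
```

With a couple of thousand teachers, the estimates land very close to the distribution’s true parameters, which is why the statewide curve looks so clean while a single division’s (smaller, skewed) sample does not.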

Indeed, we know that the actual data are worse: Richmond failed to report a bunch of its (awful) middle school data.  VDOE did nothing about that, of course.

The distribution of individual student reading SGPs in Richmond, again for 2014, also leans toward the low end. 


Since we know that students who have shown more progress than their peers get higher SGP scores, this is not good news for Richmond. 

Let’s turn to some specifics.  First some Good News.

Fifth grade teacher No. 74414 (anonymized identifier from VDOE), whose students averaged an SGP of 78, shows a much different distribution.


That teacher got even more splendid results in math (average = 93).


We could hope this teacher would be in line for a big raise and a task to mentor other teachers.

And we have to wonder why the VEA would want to hide this teacher’s name.

Then we have a large number of teachers near the middle of the pack.  For example, here is No. 76273 with SGPs for 21 fifth grade reading students and a 48 average.


This same teacher did much better in math, with an 81 average.


This is a fine math teacher who might benefit from some work on the teaching of reading, where his/her results are merely average.

The VEA says the adequacy of this teacher should be concealed from the parents of the students in his/her classroom because the information “can be used or misused to make prejudicial judgments about teacher performance.”

Then we have the teachers who are actively harming their students.  As one example, here is Richmond teacher No. 74415, with 25 fourth grade students averaging a reading SGP of 8:


Then we also have No. 75318, averaging 8 for 22 fourth grade reading students:


The parents of the affected students are not allowed to know who these teachers are.  Indeed, the Virginia “Education” Association would prohibit even my revealing that these teachers exist.

OFFER: I’ll bet you a #2 lead pencil that no child of an RPS teacher, principal, administrator, or School Board member was or will be in 74415’s or 75318’s class.  (But, of course, you are not important enough to have the information to avoid that hazard to your kid.)

Without information for the public to oversee the schools, we know nothing will be done about these and other ineffective teachers:  The assessment system is so pitiful that in 2011 Richmond teachers met or exceeded expectations in 99.28% of the measurements.

Yet VEA says, in effect, “Damn the students!  These teachers might be embarrassed if the parents knew enough to demand their retraining or replacement.”

On its Web site, VEA says:

The mission of the Virginia Education Association is to unite our members and local communities across the Commonwealth in fulfilling the promise of a high quality public education that successfully prepares every single student to realize his or her full potential. We believe this can be accomplished by advocating for students, education professionals, and support professionals.

As to the students who are suffering under inept VEA members and as to the whole notion of “high quality public education,” the threatened VEA suit confesses that this “mission” statement is a shameless lie.  Indeed, the honest name for the organization would be “Virginia Association for the Protection of Incompetent Teachers.”

Why Does VDOE Use Biased Data to Accredit Our Schools?

VDOE has an elaborate scheme to accredit (or not accredit) Virginia’s schools.  The basis is SOL pass rates (plus, for high schools, the graduation rate that depends on passing at least six end-of-course SOL tests).

But we know that the SOL is influenced by economic status.  For example, here are the 2015 reading pass rates by division vs. the percentage of economically disadvantaged students in the division.

We’re not here to discuss whether this correlation suggests that more affluent families live in better school divisions, whether their children are better prepared for school, whether their children have higher IQs, or whatever.  The point is simply that more affluent students will show better SOL scores than less affluent ones.

That’s only part of the problem with accreditation.  VDOE adjusts (I would say “manipulates”) the accreditation data in secret ways that mostly boost the scores.  In one case, that manipulation converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

So it’s no surprise that VDOE has not used, and now is abandoning, a measure of student progress that is insensitive to economic advantage or disadvantage and that might even be resistant to manipulation, the Student Growth Percentile (“SGP”).

VDOE says:

A student growth percentile expresses how much progress a student has made relative to the progress of students whose achievement was similar on previous assessments.
A student growth percentile complements a student’s SOL scaled score and gives his or her teacher, parents and principal a more complete picture of achievement and progress. A high growth percentile is an indicator of effective instruction, regardless of a student’s scaled score.

VDOE calculated SGPs in reading, math, and algebra for at least three years, ending in 2014.  Then it abandoned the SGP for a new measure that looks to be coarser.

VDOE says that the new measure might be useful in the accreditation process because it allows “partial point[s] for growth,” i.e. another way to boost the scores.  There is no mention of sensitivity to economic disadvantage.

How about it, VDOE?  Does your dandy new measure of progress cancel the advantage of the more affluent students?  And if it does, will you use it to replace the SOL in the accreditation process?

Kudos to the VaCU

I’ve been a customer at the Virginia Credit Union for some time, especially during the period a while back when their CD rates were competitive.

From time to time they offer free document shredding at their Boulders location.  We’ve been frustrated there in the past when the line of cars reached out onto Jahnke Rd.

Disdaining the lessons of the past, we drove there about 08:40 this morning, expecting to wait in line until the 09:00 opening.  We were surprised to find them open early, with two lines and four shredding trucks.  We left our bag of old documents and were gone in a few moments.  Indeed, when we drove past an hour or so later there still was no line visible from Boulders Pkwy.

All Glory to the VaCU, especially to the person there who decided to improve this helpful public service.

Now, if they’ll just bump their 60 month CD rate to match, or even come a bit closer to matching, their online competitors . . .