SAT Splat (Again)

RPS has posted the 2016 SAT scores

Caveat: The College Board has again jiggled the tests and the scoring, so the “new” (2016) scores are not directly comparable to the “old” scores.  They renamed the tests, too.

Further Caveat: RPS reports scores of “college bound” students.  That generally runs a few points higher than the average for all SAT takers.

That said, it looks like the new system left the Richmond and State averages about where they were.

image

image

As to the school averages, except for some slight bumps at Franklin, Huguenot, and Marshall on the reading test and at Marshall on math, and a drop in the Armstrong math average, the “new” scores don’t look much different from the “old” 2015 numbers.

image

image

The pairs of red points in the vicinity of 700 on the graphs are the only Maggie Walker data I have.  Of course, Walker is not a Richmond Public School, although Richmond included Walker scores in the Richmond average until they got caught.

For some context as to the RPS scores, the 25th percentile of the 2014 Longwood entering class was 470 reading, 460 math, well under the state averages.  As you see, Richmond’s two best high schools topped those numbers but the others fell far short.

Chronic Absence by Division

From a further dive into the Feds’ “Civil Rights Data Collection,” here is the distribution of Chronic Absence rates (fifteen or more school days, whether excused or not) by division.

 

image

Richmond is the yellow bar at 22%.  The red bars, from the left, are the peer jurisdictions Newport News, Norfolk, and Hampton.  Hampton is off scale on this graph at 68%(!); no telling whether that datum is in error.

Here are the data:

image image

I left out six divisions because their numbers looked to be erroneously low.  Here they are FWIW:

image

The Feds also included two Governor’s Schools:

image

And we have the (believable) attendance leader:

image

These federal data do not compare directly to the State’s attendance numbers.  The federal chronic data count absences of fifteen days or more; fifty days of absence count exactly the same as fifteen.  The state, in contrast, counts average attendance.
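The gap between the two measures is mostly definitional.  A minimal sketch of why they can diverge (the five students and the 180-day year below are hypothetical, for illustration only):

```python
# Why the federal and state measures differ: the federal "chronic" rate
# counts STUDENTS absent 15+ days; the state averages DAYS attended.
# Hypothetical 180-day school year, five students (days absent each):
days_absent = [0, 2, 15, 16, 50]
school_days = 180

# Federal-style measure: share of students at or above the 15-day threshold.
chronic_rate = sum(d >= 15 for d in days_absent) / len(days_absent)

# State-style measure: fraction of all enrolled days actually attended.
avg_attendance = 1 - sum(days_absent) / (len(days_absent) * school_days)

print(f"chronic absence rate: {chronic_rate:.0%}")   # 60%
print(f"average attendance:   {avg_attendance:.1%}")  # 90.8%
```

Note that the 50-day absentee moves the chronic rate exactly as much as the 15-day absentee does, while the attendance average barely registers either.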

Converting the state data to absence rates, we obtain this distribution:

image

Richmond again is the yellow bar; the red bars from the left are Hampton, Newport News, and Norfolk.  Whatever the incongruities between the two datasets, it’s clear that the Hampton datum is wrong in one of them.

More to the point, Richmond’s attendance is lousy by both measures and the State Board of “Education,” which is charged with the duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth,” does not even collect useful truancy data.  Rather than dealing with the manifest problem, it looks like the Board will adopt a feckless, unlawful, belated regulation on June 23.

What Do They Learn When They’re Not In School?

The Feds have an annual “Civil Rights Data Collection” that includes data on chronic absences (absent 15 or more school days, whether excused or not) by school.  They have mapped those data.

The latest dataset, 2013-14, runs 457 MB as a CSV file and 606 MB when imported into Excel.  My first venture into that thicket found some interesting data about Richmond.

Let’s begin with the elementary schools:

image

The Excellent News here is Patrick Henry.  Now, if they only had better SOLs to match.

The shockers here (beyond the astounding rates of absenteeism) are Carver and Fairfield.  Their awful rates of chronic absences contrast with their recent, splendid SOL scores.

image image

If we plot the elementary school pass rates vs. the chronic absence rates, we see:

image

The outperformer there, with an absence rate of 25% and pass rates above 80% on both tests, is Carver (squares on the graph).  Fairfield Court (diamonds on the graph), at 21% absences, outperforms on math at 79% but not so much on reading at 61%.

These data can’t tell us whether those two principals are superb educators or accomplished cheaters but it looks like one or the other.

If you thought the elementary school absence rates were awful, just wait:  The Division average is 22%, so something is pulling the average up.  That “something” is the middle and high schools.

The (much too high) middle school numbers look to be consistent with the pass rates there.

image

image

The two big underperformers are Elkhardt (squares) and Henderson (diamonds).

The high schools’ rates (including the astounding 60% chronic absence rate at Wythe) are much worse than the middle schools’ while the SOLs are generally better (courtesy, methinks, of the horrific dropout rate).

image

image

The overperformers with excellent attendance and pass rates are Open (4% absences, 98% reading, 91% math) and Community (5%, 100%, 82%).  Wythe, at 60% absences, nonetheless overperforms the mainline high schools in reading.  The underperformer on both tests, notwithstanding pretty good attendance, is Franklin (squares).  The other math underperformers are TJ (diamond) and Armstrong (triangle).

Note that Franklin has both middle and high school grades and I’ve included it in both the middle and high school datasets, although the numbers probably do not compare directly.

Looking at these data, notably Open and Community high schools, we can guesstimate that the excused absence rate is somewhere under 5%. The 22% Richmond average rate of fifteen-day absences thus translates to something like a 17% truancy rate.  Yet Richmond reports only a 10% rate of ten-day truancies.  There’s something fishy here.
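The guesstimate is simple arithmetic; here it is spelled out (the 5-point excused-absence ceiling is, as the text says, inferred from Open and Community, not measured):

```python
# Back-of-the-envelope: if excused absences account for at most ~5 points of
# the 22% chronic-absence rate, the implied truancy rate is ~17 points,
# well above the 10% ten-day truancy rate Richmond reports.
chronic_rate = 22      # percent of students absent 15+ days (federal data)
excused_ceiling = 5    # guesstimate from Open and Community high schools

implied_truancy = chronic_rate - excused_ceiling
print(f"implied truancy rate: ~{implied_truancy}%")  # ~17%
```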

Unfortunately, the Virginia Board of Education, which has the duty and authority “to see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth,” still does not even collect useful truancy data.  Instead of correcting that situation, it looks like the Board will vote to adopt a feckless, unlawful, belated draft regulation on June 23.

And the government compels us to pay for this shambles.

—————————————

Here are the data:

image

image

image

Reedy Creek: If It Ain’t Broke, Spend Tax Money to Fix It

We have seen that the City plans to spend $1.27 million to control phosphorus and sediment in Reedy Creek that already are being controlled by the sediment traps at Forest Hill Park Lake.

The Reedy Creek Coalition has some stream monitoring data that show another side of this boondoggle.  The monitoring points are here:

image

Let’s rename those stations to something more descriptive: RC4 = Headwaters; RC3 = Upstream (of the project area); CB1 = Trib (as it enters the project area); RC1 = Downstream (of the project area).  With those labels here are the dissolved oxygen data:

image

The water quality standard for dissolved oxygen in this stream is 5.0 parts per million as a daily average.  That does not translate to a percentage unless we know the temperature.  The percentage data nonetheless tell us that the lowest dissolved oxygen levels are found in the headwaters and that the concentration improves as the water flows through the project area.

The DEQ monitoring has placed the upstream reaches of Reedy Creek on the 305(b) list of impaired waters for dissolved oxygen and pH violations [at 453-4].

So there you have it: The impaired waters are upstream.  Rather than fix the problem at the upstream source, the City is spending $1.27 million on the part of the stream that is helping to correct the impairment.  Their justification is that the project will remove pollutants that already are being removed at Forest Hill Park Lake. 

Your tax dollars at “work.”

Reedy Creek Boondoggle

The City is planning to waste your tax money on a “restoration” project that won’t do any good.

Reedy Creek rises near Chippenham Hospital; it flows through Forest Hill park and into the James. 

The concrete channel built to control the flooding of and near Midlo Turnpike (esp. in front of the Evergreen plant) has led to high stormwater flows that erode the banks downstream.  The City now plans to spend $1.27 million of your and my tax money to create a new floodplain on the City property across Crutchfield St. from Wythe High School to reduce erosion there. 

The Reedy Creek Coalition lists five major reasons for opposing the project.  Their #4 should settle the issue: The money won’t do any good.

Apparently the DEQ bureaucrat who approved the City’s application for grant funds is a recent ComeHere.  Anybody who has lived in Richmond for more than a few years knows that the City already spent $1.4 million to restore the Forest Hill Park lake that is downstream of this project.  The renovation included a silt capture system, actually two forebays:

“A forebay is basically a hole. It’s a settlement hole where the silt will kind of build up. We will be able to clear it out with a Bobcat and haul it off and it will fill up again. So the process will be able to continue. But it will not affect the lake so that the citizens’ investment that they have in the lake will certainly be safeguarded,” said Richmond Parks deputy director Roslyn Johnson.

The City’s grant application to DEQ calculates that the new project will remove 150 lbs/yr of phosphorus and 98,736 lbs/yr of sediment.  The calculation entirely overlooks the two forebays that already are removing most of this and other sediment and the attached phosphorus.

So the City wants to spend $1.27 million to solve a problem that it already has solved.

And they want Chesapeake Bay TMDL credits for removing pollutants that are NOT entering the river.

Can you spell “boondoggle”?

Has Roanoke Joined the Cheaters Club?

I earlier quoted Scott Adams for the notion that “whenever you have large stakes, an opportunity for wrong-doing, and a small risk of getting caught, wrong-doing happens. . . .  When humans can cheat, they do.”  That certainly is what we’ve seen wholesale in Atlanta and in Virginia on the VGLA.

Now I have a copy of a letter from a former Roanoke Latin teacher to the President of the VBOE, alleging wholesale cheating at one or more Roanoke schools.

If Adams is right, it would not be a surprise to find a fire beneath this smoke.  In any case it will be interesting to see whether VBOE, which is supposed to supervise the public school system, conducts an investigation.

Here is the letter:

 

Dr. Billy K. Cannaday, Jr.
President
Virginia Board of Education
P.O. Box 2120
Richmond, VA 23218
(804) 225-2924
BOE@doe.virginia.gov

Dear Dr. Cannaday:

I am writing you on behalf of over twenty former and current students including faculty at Hidden Valley High School in Roanoke County Public Schools (RCPS), who are extremely concerned about cheating on non-SOL testing on school-issued laptops, which has been a chronic problem since 2007.1 Unfortunately, cheating is not only a widespread problem at Hidden Valley High School, but throughout RCPS in grades 8-12.2

I taught Latin at Hidden Valley High School from 2011 to 2013. Despite informing the administration in November 2012 about cheating on school-issued laptops, nothing was ever resolved. Over ten of my former students informed me after graduating in June 2015 that cheating at the school actually had worsened in the past two years. Many of them described the cheating as “nuts,” “rampant,” and “out of control.” I informed Al Bedrosian of the Board of Supervisors in October 2015 and Fuzzy Minnix of the School Board in November 2015 about my concerns, but nothing was resolved. So I addressed the School Board on March 24, and again nothing was resolved except a vague promise by Jeff Terry, the Chief Information Officer, to update and secure Blackboard next fall (Gregory). I also addressed the Board of Supervisors on April 26 upon the invitation of Al Bedrosian, but unfortunately they do not have any oversight of the school district.

I believe that RCPS is in violation of Standard 7 (C) (3) of the Code of Virginia, which states that “the standards of student conduct and attendance and enforcement procedures [are] designed to provide that public education be conducted in an atmosphere free of disruption and threat to persons or property and supportive of individual rights” (§ 22.1-253.13:7).3 There is no question that RCPS currently has adequate “standards of student conduct” in place for academic integrity. According to Policy 7.11 or the Roanoke County Student Conduct Code, Rule 9 states that “students are expected to perform honestly on any assigned schoolwork or tests” (RCPS Current Policies SERIES 07: Students). Rule 9 (A) states that “students shall not cheat on a test or assigned schoolwork by giving, receiving, offering, and/or soliciting information” while Rule 9 (E) further states that they shall not “use technology for any unauthorized use” (RCPS Current Policies). Likewise, according to the Student Handbook of Hidden Valley High School for 2014-15 the honor code’s goal is “to maintain a high level of integrity, to strive honestly in all endeavors, and to perpetuate an atmosphere of trust between peers, students, and faculty” (2).4

Unfortunately, the central office of RCPS and the administration at Hidden Valley High School have total disregard for realistically enforcing these policies and rules when students take an online non-SOL test or quiz on school-issued laptops using Blackboard. It is extremely easy for a student to cheat without getting caught making the “enforcement procedures” in Standard 7 (C) (3) almost meaningless. The problem is that students have complete access to both their hard drives and the internet during an online test, and it is impossible for a dedicated teacher to watch fifteen or thirty laptop screens and also look for traditional cheating such as crib sheets and smartphones. Students can easily right click on Google, access the Snipping Tool, copy and paste answers, hide a cheat sheet, email passwords, etc. and most insidiously program a key to perform screen captures of an entire test or quiz to a Google server without the teacher ever knowing it. This testing environment is the direct opposite of state-mandated SOL testing which requires a lockdown browser and other needed software in order to prevent digital cheating.

Standard 7 (C) (3) clearly states that “public education be conducted in an atmosphere” “supportive of individual rights” (§ 22.1-253.13:7). RCPS has violated the “individual rights” of honest students who obey the rules or “standards of student conduct” (§ 22.1-253.13:7). The honest students are at a distinct disadvantage in competing against the dishonest ones in terms of lower GPAs, lower class ranking, and less academic awards, which also negatively impacts college admissions, scholarships and grants. There is a de facto system of academic apartheid between the honest students and the dishonest ones or cheaters in grades 8-12 throughout RCPS, thereby negligently allowing a non-level playing field and creating a negative “atmosphere” of learning. Like Major League Baseball players in the 1990s until 2005 during the steroid era, many honest students ask themselves if they should cheat in order to get ahead academically while the dishonest students never ask themselves this question. This is a moral dilemma every honest student faces during the academic year at Hidden Valley High School and all the other county schools in grades 8-12.

In addition, Standard 7 (C) (3) states that “public education be conducted in an atmosphere” “free of disruption” (§ 22.1-253.13:7). Not only is cheating both academically disruptive and morally wrong it also teaches bad “citizenship” by negative example for irresponsible and NOT “responsible participation in American society,” which is both a violation of the public trust and Standard 1 (C. 1) (e.) (Code of Virginia. § 22.1-253.13:1).5 RCPS should not be teaching its students to be emulating such notorious “cheats” as Lance Armstrong, Mark McGwire, Lenny Dykstra and Alex Rodriguez, not to mention Swiss banks, Mitsubishi and Volkswagen. Lastly, cheating certainly does not “foster public confidence” in RCPS, which is one of the five “accreditation standards” of the “public education system” in Virginia (“Regulations Establishing Standards for Accrediting Public Schools in Virginia” (8VAC20-131) 3).
RCPS has not been in compliance with both Standards 7 (C) (3) and 1 (C. 1) (e.) in grades 9-12 since 2007.6 When a student takes an online test or quiz on a school-issued laptop, the school district does not provide adequate “enforcement procedures” as described in Rules 9(A) and 9(E) in Policy 7.11 or the Roanoke County Student Conduct Code. Hidden Valley High School has also failed “to maintain a high level of integrity” and other ethical standards as described in the school’s honor code. However, the most egregious violation has been the noncompliance of RCPS with Standard 7 (C) (3), which states that “public education be conducted in an atmosphere” “supportive of individual rights.” This has repeatedly resulted in honest students being at a distinct disadvantage in competing against the dishonest ones in terms of lower GPAs, lower class ranking, and less academic awards negatively impacting college admissions, scholarships and grants. Consequently cheating has also allowed the teaching of very bad citizenship, which is a violation of Standard 1 (C. 1) (e.). There needs to be an immediate external investigation from Richmond in order to ascertain the status of the school district’s state accreditation, and determine who has been either responsible or complicit in this shameful and preventable academic misconduct. The students, parents and taxpayers in Roanoke County all deserve more integrity and better accountability from their public schools.

Sincerely,

Robert Maronic

 

Notes

1. Smith wrote about cheating on non-SOL testing using school-issued laptops and Blackboard at Hidden Valley High School in May 2013: “In a miniature poll of Hidden Valley students, who’s [sic] identities will be kept anonymous, one 11th grader estimated that in a class of 25 students taking a Blackboard test, ten to fifteen would be cheating.  Another student, a 12th grader, believes that in the same situation, only two or three students would be cheating.  Whichever version is true, students are still cheating on tests.” Smith also wrote, “Most students and teachers agree that it is easier to cheat on a Blackboard test than on a paper test.  A 10th grade student said that it was easier to cheat on a Blackboard test because ‘you can switch windows while you are working on a test.’  An 11th grade student said that it is easier to cheat on a Blackboard test because of ‘search engines such as Google, Bing, and Yahoo.’ Some students have witnessed so much cheating that they have become numb to it.”

2. Cheating is truly a widespread problem throughout RCPS. I have talked with over twenty teachers, students and graduates from Cave Spring High School, William Byrd High School and Northside High School since 2011, and all their complaints about cheating on the school-issued laptops are identical to what I was told or experienced at Hidden Valley High School. I have listened to the complaints of one recent graduate of Glenvar High School, and would assume that cheating is just as prevalent there as the other four county high schools. Please also note that RCPS first issued laptops to all eighth graders during the 2015-16 academic year. All seventh graders will be issued laptops during the 2016-17 academic year according to what was discussed at the School Board meeting on March 24.

3. According to “Bill Tracking (Chapter 474 ) – 2008 Session Legislation,” Standard 7 (C) (3) was known as Standard 7 (B.1) (3) in 2007 and 2008 in the Code of Virginia.

4. An updated version of the Hidden Valley High School Student Handbook for the 2015-16 academic year is currently unavailable online.

5. According to the “Virginia Department of Education SOQ Compliance Detail Report [for] Roanoke County” submitted for the 2014-15 academic year, Standard 1 (C. 1) (e.) states, “Essential skills and concepts of citizenship, including knowledge of Virginia history and world and United States history, economics, government, foreign languages, and international cultures, health and physical education, environmental issues and geography necessary for responsible participation in American society and in the international community.”

6. RCPS first issued laptops to all eighth graders during the 2015-16 academic year. The school district has not been in compliance with both Standards 7 (C) (3) and 1 (C. 1) (e.) for eighth graders since August 2015.

 

Works Cited

“Bill Tracking (Chapter 474 ) – 2008 Session Legislation.” Bill Tracking – 2008 Session of the VA General
Assembly. Web. 17 May 2016.
<http://lis.virginia.gov/cgi-bin/legp604.exe?081+ful+CHAP0474>.

Code of Virginia. § 22.1-253.13:1. Standard 1. Instructional Programs Supporting the Standards of
Learning and Other Educational Objectives. Web. 15 May 2016.
http://law.lis.virginia.gov/vacode/title22.1/chapter13.2/section22.1-253.13:1/

Code of Virginia.
§ 22.1-253.13:7. Standard 7. School Board Policies. Web. 14 May 2016.
http://law.lis.virginia.gov/vacode/title22.1/chapter13.2/section22.1-253.13:7/

Gregory, Sara. “Roanoke County School Board Approves 2 Percent Raise for Teachers.” Roanoke Times.
24 Mar. 2016. Web. 24 Mar. 2016.
http://www.roanoke.com/news/education/roanoke_county/roanoke-county-school-board-approves-percent-raise-for-teachers/article_3e91756d-b9f9-5bef-83fd-60ed6e5f5bfd.html

Hidden Valley High School: Student Handbook 2014-2015. Roanoke County Public Schools. Web.
12 May 2016.
http://www.rcps.us/hvhs/documents/student_handbook.pdf

“RCPS Current Policies SERIES 07: Students.” Student Conduct Code: Policy 7.11. Roanoke County
Public Schools, 13 Aug. 2015. Web. 12 May 2016. See Rule 9 – Integrity.
http://www.boarddocs.com/vsba/roecnty/Board.nsf/goto?open&id=9XY22D71C86A

“Regulations Establishing Standards for Accrediting Public Schools in Virginia” (8VAC20-131). VA
Department of Education, 19 Oct. 2015. Web. 12 May 2016. See p. 3 (8VAC20-131-10. Purpose).
http://www.doe.virginia.gov/boe/accreditation/regulations_establishing_soa.pdf

Smith, Tanner. “Cheating Continues to Plague Acadmic [sic] Careers.” Titan Times. Hidden Valley High
School [Roanoke], 2 May 2013. Web. 13 May 2016.
<http://www.titantimes.org/news/2013/05/02/cheating-continues-to-plague-acadmic-careers/>

The Empire Strikes Back. Feebly.

On June 2, Donald Wilms, President of the Chesterfield Education Association, responded in the Times-Dispatch to Bart Hinkle’s editorial of May 28.

Hinkle had made the point that the Virginia Education Association’s attempt to suppress truthful data on teacher effectiveness sought to keep “parents and taxpayers . . . in the dark about which teachers are doing a great job – and which ones aren’t.” 

Wilms sought to argue that the data should be kept secret.  He mostly demonstrated that the Chesterfield Education Association needs a better argument.

Wilms brought on some emotional arm-waving about the students who may come to school on test day after oversleeping and missing breakfast; after a fight; after losing a boy-/girlfriend; or the like.  He neglected to mention that the data he disdains are based on two (or more) successive years’ test scores:  An outside event in the second year could lower a student’s measured progress but the same event in the first year could increase the score difference and, thus, the progress measure.

The nut of Wilms’ argument, however, was the kids who are at a systematic disadvantage:

[W]ould it be fair for schools with high-income households — where both parents are college-educated, where kids go to museums and take exotic vacations, where parents have two cars and are available to take kids to the public library — to compete with schools where kids live in single-parent households, where parents hold several low-wage jobs, with hours preventing them from being home to take kids to museums or public libraries, and incomes preventing them from any vacation at all? Heck, it would be unfair to compare a school in western Chesterfield with one in eastern Chesterfield, let alone to compare one of Chesterfield’s (or Henrico’s) wealthiest school communities with one of Richmond’s neediest school communities, don’t you think?

* * *

What these [SGP data] really illustrate is which teachers have the at-risk kids and which don’t.

On these emotional grounds, Wilms attacked the fairness of the “flawed rankings” of the Student Growth Percentile, the “SGP” (and, apparently, other measures of teaching effectiveness).  He disdained to discuss the actual data, which are to the contrary.

When it was considering adopting the SGP, VDOE provided data showing that test scores fall with increasing economic disadvantage (no surprise there) but that SGP percentiles do not:

image

That is because the SGP, by design, compares low-performers only to other low-performers.  Unlike the SOL, it measures progress relative to others similarly situated.

Indeed, the data for Chesterfield County make a clear case that students who score poorly one year, for whatever reason, can show outstanding academic growth the next year.  Here, for instance, are the 2014 math SGP scores of Chesterfield students plotted against those same students’ 2013 SOL scores.

image

Before drawing any conclusions, we need to refine the analysis slightly: Students who score pass/advanced (SOL 500 or above) for two years running do not receive an SGP (it’s hard to improve on “excellent”).  In a sense, the SGP penalizes teachers and schools with large numbers of very bright students!

The SOL scores ≥ 500 in the graph above represent students who scored advanced in 2013 but less than advanced in 2014.  The students who scored advanced in both ’13 and ’14 do not appear in this dataset at all, which biases the analysis.  So let’s look only at the students with 2013 SOLs < 500:

image

The result doesn’t change much.  The R² value of 0.14% tells us that the SGPs are quite uncorrelated with the previous year’s SOL scores.  Of course, correlation is necessary but not sufficient to show causation.  Said otherwise: Chesterfield students with low SOL scores in the previous year, for whatever reason, show superior (or inferior) academic growth (as measured by the SGP) in the current year just as often as students who scored well in the previous year.
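For readers who want to check that kind of number themselves: R² for a simple linear fit is just the squared Pearson correlation.  A minimal sketch, with synthetic data standing in for the Chesterfield file (which is an assumption; the real scores are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for the Chesterfield data: prior-year SOL scores and
# current-year SGPs drawn independently, i.e., with no real relationship.
sol_2013 = rng.uniform(250, 499, size=2000)
sgp_2014 = rng.uniform(1, 99, size=2000)

# R^2 for a simple linear regression equals the squared Pearson r.
r = np.corrcoef(sol_2013, sgp_2014)[0, 1]
r_squared = r ** 2
print(f"R^2 = {r_squared:.2%}")  # near zero: prior SOL doesn't predict SGP
```

An R² this close to zero means the prior-year score explains essentially none of the variation in the growth measure, which is the point of the graph.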

We can’t directly test Wilms’ statement about pitting a rich school against a poor one because VDOE (known here as the Virginia Department of Data Suppression) hasn’t released (and won’t release) the data.  So let’s go one better: Let’s use the data we have to compare some of the worst students in Chesterfield in terms of 2013 SOL with some of the best.

Using the math data, here is the first group, all at 50 points or more below the passing score of 400:

image

As before, the 2014 SGP does not correlate with the 2013 SOL. 

Next, the 2013 SOLs between 450 (50 points above the passing score) and 499 (where we stop to avoid the advanced-student penalty).

image

Pretty much the same story: High or low SOL one year does not predict SGP growth in the next year.

Said yet otherwise: Part of teaching in the public schools is dealing with kids who are disadvantaged; the SGP identifies teachers who do that well; the SOL may not.

BTW: The reading data tell the same story with slightly different numbers.  Let’s avoid some clutter and leave them out. 

It’s easy to understand why Wilms would prefer the current system.  In 2011 (the only year that VDOE slipped up and posted evaluation data), Chesterfield teachers were evaluated on seven criteria.  Six of those were inputs, i.e., relatively easy to measure but only distantly related to student learning.  On the only important measure, “Student Achievement and Academic Progress,” only twelve of 1222 Chesterfield teachers (1%) were said to need improvement and none was unsatisfactory. 

image

But when we look at real performance data (2012 SGPs are the earliest that Brian Davison sued out of VDOE), we see student progress that was much worse than “unsatisfactory.”  For example here is the math performance of Teacher No. 34785 (identifier anonymized by VDOE):

image

The yellow points are the annual SGP averages of that teacher’s math students.  The blue bars show the 95% confidence intervals. 
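Those intervals are presumably the conventional mean ± 1.96 standard errors (an assumption; the exact method behind the graphs isn’t stated).  A minimal sketch with a hypothetical class of student SGPs:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical student SGPs for one teacher's math class (made up).
sgps = [12, 25, 8, 31, 19, 22, 15, 27, 10, 18]

# Conventional 95% confidence interval for the class average:
# mean +/- 1.96 * (sample standard deviation / sqrt(n)).
m = mean(sgps)
se = stdev(sgps) / sqrt(len(sgps))
ci = (m - 1.96 * se, m + 1.96 * se)

print(f"mean {m:.1f}, 95% CI {ci[0]:.1f} to {ci[1]:.1f}")
```

A class average this far below the division means quoted next, with a confidence interval that doesn’t reach them, is what makes the graphs so damning.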

The Chesterfield math averages those years were 48.7, 49.2, and 48.2.

Then we have Nos. 54317 and 86898:

image  image

To put some context on these numbers, here is the 2014 statewide distribution of math SGP averages by teacher.

image

The mean was 49.3; the estimated standard deviation, 16.3.  That is, about sixteen percent of the teachers were below 33.0; another sixteen percent, above 65.6.  Two and a half percent were below 16.7; another 2.5%, above 81.9. I’ve marked those values on the 2014 Chesterfield distribution:

image
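The cutoffs in that paragraph are just the one- and two-sigma points of a normal curve.  Here is a quick check, assuming (as the text implicitly does) a roughly normal distribution of teacher averages:

```python
from statistics import NormalDist

# Statewide 2014 distribution of teacher math SGP averages, per the text.
dist = NormalDist(mu=49.3, sigma=16.3)

# One sigma: ~16% of teachers below 33.0, ~16% above 65.6.
below_1sd = dist.cdf(49.3 - 16.3) * 100
above_1sd = (1 - dist.cdf(49.3 + 16.3)) * 100

# Two sigma: ~2.3% below 16.7 (the text rounds to 2.5%).
below_2sd = dist.cdf(49.3 - 2 * 16.3) * 100

print(f"below 33.0: {below_1sd:.0f}%, above 65.6: {above_1sd:.0f}%")
print(f"below 16.7: {below_2sd:.1f}%")
```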

The problem here is not “flawed rankings.”  It is flawed teachers that the Chesterfield “Education” Association does not want the parents of Chesterfield County to know about.

BTW: The data also show that there are some really fine teachers in Chesterfield.  For example, in math we see Nos. 29893, 28974, and 81816:

image image image

The Chesterfield and Virginia “Education” Associations don’t want you to know about these outstanding teachers, either.  They just want you to think that 99% are above average.

And VDOE is a co-conspirator.  Your tax dollars “at work.”

Yet More of What VEA and VDOE Are Trying to Hide

In the discussion of the SGP data that Brian Davison sued out of the State Department of Data Suppression, I’ve been focused on the awful teaching (e.g., here, here, and here) that the Virginia Department of “Education” and the Virginia “Education” Association have been attempting to conceal.  But their efforts to hide teacher performance data from the taxpayers who are paying the teachers have another outrageous effect: They suppress the identities of the many great teachers in Virginia’s public schools.

Having looked at the 43 teachers with the worst three-year average math SGPs, let’s turn to the 43 with the best averages:

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three year average for each teacher.  The “Grand Total” row reports the statewide average of each column.
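That layout is evidently an Excel PivotTable.  The same table can be reproduced with pandas; the teacher IDs and SGP values below are made up for illustration:

```python
import pandas as pd

# Hypothetical student-level SGP records (teacher IDs and values invented).
df = pd.DataFrame({
    "teacher": ["A1", "A1", "B2", "B2", "B2", "C3"],
    "year":    [2012, 2014, 2012, 2013, 2014, 2014],
    "sgp":     [88, 92, 75, 80, 85, 95],
})

# margins=True adds the "Grand Total" row and column, as in the Excel original.
pivot = pd.pivot_table(df, values="sgp", index="teacher", columns="year",
                       aggfunc="mean", margins=True, margins_name="Grand Total")
print(pivot)
```

Each cell is a teacher's average SGP for that year; the margins give per-teacher and per-year averages, matching the “Grand Total” row and column described above.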

All but three of the 43 teachers at the bottom of the SGP list damaged Virginia schoolchildren for only one year; almost half of the teachers in the present list helped educate Virginia schoolchildren for more than one year.

Here are the Top Ten who taught all three of the years for which we have data.

image

Notice that all but two of these got better performances from their students in 2014 than in 2012.  And, of those two, No. 115415’s even 99 in 2012, combined with a much lower overall average, suggests only one student (of doubtful statistical significance) in the first year and substantial progress over the later two years.

The preponderance of NoVa suburbs in this list raises the question whether those divisions have better students, or better teachers, or some combination.  See below for some data suggesting that there is more learning, and presumably more teaching, in some more rural divisions.  In the meantime, here are the data for the Top Ten in graphical form.

image

Or, with the ordinate expanded:

image

The division averages provide some further insights.  Let’s start with the distribution of division mathematics averages for 2014.

image

As shown above, the mean math SGP in 2014 was 49.1.  The division mean was 48.3, with a standard deviation of 6.2.

The Top Ten divisions of 2014 did not include any of the NoVa suburbs.

image

Neither, for that matter, did the bottom ten.

image

Here is the entire 2014 math list, sorted by division (Colonial Beach, Craig, and Surry had no data).

wp352i5z

When we turn to the division averages by year, an interesting pattern emerges: The Top Ten in 2014 all improved from 2012.

image

image

Seems to me that a whole bunch of educators should be put on a bus to one of these divisions to find out how to increase student performance in math.

The Bottom Ten in 2014 mostly declined from 2012, except for Franklin and West Point, which improved; King and Queen, which improved slightly; and Galax, which only had data for 2014.

image

image

But the Virginia Department of “Education” and the Virginia “Education” Association don’t want you to see these data.  Apparently you are not qualified to know what you are getting (or not getting) for your tax dollar.

Still More of What VEA Wants to Hide

The SGP data that Brian Davison sued out of the State Department of Data Suppression last year showed (e.g., here and here) that we have some truly awful teachers in Virginia.  The lawsuit threatened by the Virginia “Education” Association demonstrates that it is more interested in protecting those bad teachers than in the effect of those teachers on Virginia’s schoolchildren.  And the VDOE’s own document, confirmed by the available data, admits that the teacher evaluation process has been worse than ineffective.

I’ve turned to the math data to provide some details.

A list of the very worst three-year teacher average math SGPs offers some Good News and some very Bad News. 

  • The Bad: Some truly awful teachers have been inflicted on Virginia’s schoolchildren, sometimes more than once. 
  • The Good: Most of those teachers from 2012 and 2013 were not there the following year.

Here are the worst 43 three-year math SGP averages by teacher, sorted by increasing average.

image

The “Row Labels” column contains the (anonymized) teacher IDs.  The “Grand Total” column is the three-year average SGP for each teacher.  The “Grand Total” row is the statewide average for each year and for the teacher averages.

We can’t tell from these data whether any of the teachers with only 2014 results stayed on to harm schoolchildren in 2015.  The yellow highlights identify the two teachers who came back after a failure to teach in an earlier year.

The red highlight indicates the teacher with the lowest average SGP among those who “taught” all three years.

Turning to the cases where the school division was so evil (or so intimidated by the VEA) that it subjected its students to more than two years’ exposure to the same awful teaching, here are the Top Ten, sorted by increasing three-year average:

image

No. 90763 is an outlier here: The three-year average of 17.1 comes from yearly averages of 71.0, 68.0, and 15.6.  The “.0s” are clues: This teacher likely had only one reported SGP in each of the first two years and a much larger class in 2014.  Indeed, the database shows 72 math SGPs in 2014.  If we add that year’s 15.625 average times 72 to the 71 and 68, and divide by 74, we get 17.08, the three-year average for this teacher.
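For those who want to check the arithmetic, here is that back-of-the-envelope calculation written out, using only the numbers quoted above (one SGP in each of 2012 and 2013; 72 SGPs averaging 15.625 in 2014):

```python
# Reconstruct No. 90763's three-year average from the yearly figures above.
scores_2012_2013 = [71, 68]        # apparently one reported SGP in each year
n_2014, avg_2014 = 72, 15.625      # 72 math SGPs in 2014, averaging 15.625

total = sum(scores_2012_2013) + n_2014 * avg_2014
count = len(scores_2012_2013) + n_2014
three_year_avg = total / count

print(round(three_year_avg, 2))    # → 17.08, matching the reported average
```

Note how the single-student years, despite their gaudy 71 and 68, barely move the average once the 72-student year is weighted in.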

That said, no telling what else unusual was going on there.

Turning to the other nine cases, we see:

image

Or, upon expanding the ordinate:

image

Only two of the nine showed improvement over the three-year period.  No. 88959 went from a 9.6 to a 28.3, which is still more than twenty percentiles below average.  No. 114261 improved, but only to 17.6.  We have no data regarding the guinea-pig students who were damaged in these experiments.

The other seven “teachers” all showed deteriorated performance over the period.  Note especially No. 54317, who went from bad to appalling (and was paid tax dollars to continue afflicting Chesterfield County schoolchildren).

There are as many as nine principals (depending on time frame in the job and whether it was one or two schools in Chesterfield) who should have been fired over these disasters.  I’ll bet you a #2 lead pencil they all got raises.

(And, for those of us in Richmond who look to the schools in the Counties as the Elysian Fields when compared to the barren wasteland of Richmond Public Schools, the presence of one Henrico and two Chesterfield teachers in this list comes as a shock.)

But, the Virginia “Education” Association and the State Department of Data Suppression don’t want you to know about this, especially if your kid is suffering under such an awful “teacher” and such a pusillanimous school system.

Your tax dollars at “work.”

More of What VEA Wants to Hide

Turning again to the SGP data that Brian Davison sued out of VDOE last year, let’s look at mathematics.

No. 40837, the teacher with the best 2014 math SGP average in Virginia, 95.2, had a fifth grade class in Fairfax.  Here are the data:

image

We can resist the temptation to dismiss this as the result of a class full of very bright students:  Students who scored in the “advanced proficient” range two years running didn’t get an SGP (it’s hard to show growth from really-good-already).  Thus, the students reported here did not obtain superior scores in both 2013 and 2014; they improved radically in 2014, compared to the others in Virginia who had similar performances in 2013.
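For readers new to SGPs, the core idea in that paragraph can be sketched in a few lines of code.  This is a toy illustration with hypothetical scores, not the actual VDOE/SAS methodology (the real calculation uses quantile regression over each student's score history); here each student's current score is simply percentile-ranked against “academic peers” with the same prior-year score:

```python
from bisect import bisect_left

def growth_percentiles(students):
    """Toy SGP sketch: percentile-rank each student's current score
    among students with the same prior-year score ("academic peers").
    `students` is a list of (prior_score, current_score) pairs."""
    by_prior = {}
    for prior, current in students:
        by_prior.setdefault(prior, []).append(current)
    sgps = []
    for prior, current in students:
        peers = sorted(by_prior[prior])
        below = bisect_left(peers, current)   # peers scoring below this student
        sgps.append(round(100 * below / len(peers)))
    return sgps

# Three students who all scored 400 last year, one who scored 500:
print(growth_percentiles([(400, 410), (400, 450), (400, 500), (500, 520)]))
# → [0, 33, 67, 0]
```

The point of the construction: a high SGP does not mean a high score, it means unusually large improvement relative to students who started from the same place.  That is why a class of already-bright students cannot, by itself, produce an average SGP of 95.2.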

Of further interest, this teacher’s reading SGPs (average of 65.7) are above average (48.0) but much less spectacular:

image

We can think of all kinds of explanations for this pattern.  In the absence of other data, Friar Occam would tell us to look first at the simple one: This is a superior math teacher and an above average reading teacher.

At the other end of the spectrum, we have No. 76323 whose fifth grade math students in Richmond averaged an SGP of 4.0.

image

The Virginia “Education” Association has threatened to sue because it doesn’t want you to know about this teacher.  But you can bet that the Richmond School Board members, Superintendent, principals and teachers knew and that none of their kids was among the unfortunate 25 in No. 76323’s class.

The reading performance of this teacher was a world better, with an above-average mean of 57.7:

image

This suggests that there is a place for this teacher in Richmond, just not teaching math (absent some heavy-duty retraining).

The second worst 2014 math performance comes from fourth grade teacher No. 71819 in Lynchburg, with an average of 4.4.

image

This same teacher turned in a 25.7 in reading.

image

So, one awful performance and one clearly sub-par. 

Unfortunately, this teacher was getting worse:

image

You can again bet that no child of a School Board member, the Superintendent, or any Lynchburg principal or teacher was damaged by this teacher.

All the schoolchildren of Lynchburg (and their parents who pay the teachers) deserve to be protected from this teacher, and the too many others who are nearly as bad.  Another twenty-nine Lynchburg teachers averaged a math SGP of less than thirty in 2014; eight of those were less than twenty.

image

The 2014 reading performance in Lynchburg was less horrible: One teacher was below an average SGP of twenty, another six were between twenty and thirty. 

The SGP data from 2012 to 2014 tell us that Lynchburg’s average performance was sub-par and deteriorating.

image

Unfortunately, we can’t rely on the schools to deal with the ineffective teachers.  VDOE quotes a 2009 study for the proposition that

99 percent of teachers were rated as satisfactory when their schools used a satisfactory/unsatisfactory rating system; in schools that used an evaluation scale with a broader range of options, an overwhelming 94 percent of all teachers received one of the top two ratings.

The only Virginia data we have on performance evaluations for teachers are from 2011.  That year 99.3% of the Lynchburg teachers were rated “proficient”; 0.1% (one of 731) were “unacceptable – recommend plan of assistance”; 0.5% (four of 731) were “unacceptable – recommend non-renewal or dismissal.”  Looks like Lynchburg, like Richmond, thinks it is the Lake Wobegon of teachers.

Indeed, it looks like there was little thinking involved and the evaluation process was a bad joke.  I have a Freedom of Information Act request pending at VDOE to see whether their new process is any better (or whether they are part of the VEA conspiracy of secrecy).

The Virginia “Education” Association is threatening to sue VDOE, Brian Davison, and me because they don’t want you to know whether your kid is being subjected to a lousy teacher.  Their behavior demonstrates that their mission is protecting incompetent teachers, not advancing “education.”  That makes the very name of the Association an exercise in mendacity. 

As to Richmond (and, doubtless, Lynchburg) I said earlier:

Seems to me City Council should demand, as a condition of all the money they are spending on [the schools], an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of Principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).