Middle School Mess, II

We have seen that Middle School and the onset of puberty have been marked by a slight drop in math pass rates statewide but not in reading.  In contrast,  the sixth grade brought plummeting pass rates in both subjects in the Richmond schools.

Those data mostly looked at the period before the new math tests in 2012 and the new reading tests in 2013.

Today let’s look at the third-graders of 2011 and the progress of their group up to the past school year.  To that end, here are the pass rates of the third graders in 2011, the fourth graders in 2012, and so on. 

Let’s start with the reading pass rates.


The new tests dropped Richmond’s fifth graders to 23 points below the state average, while the same group scored six points below the average in 2012 (under the old tests).

The same group dropped another ten points when they entered middle school in 2014; in the eighth grade that group remained 31 points below the state average.

The math scores are even more dramatic.


The group went from six to fourteen points below the state under the new tests in 2012, and dropped another three points in the fifth grade (2013).  Then came middle school: In the sixth grade the (mostly) same students fell to 43 points below the average.  In 2016 (eighth grade) the group remained 30 points down.

As before, we see two effects here: The new tests whacked the Richmond scores in both elementary and middle grades, especially in math.  The middle schools took students who had been performing below average in elementary school and devastated their performances, especially in math.

These data don’t tell us what is wrong with our middle schools; they do tell us that, whatever it is, it is horrible.

Gone. Forgotten. For Shame!

The 2016 4-year cohort dropout data are up at VDOE.  The only good thing about the Richmond datum, 9.9% dropouts, is that it’s less than last year’s 11.9%.

Here is the distribution of division dropout rates.


Richmond is the gold bar.  The red bars are, from the left, Petersburg, Norfolk, Hampton, and Newport News.  The blue, with a hat tip to Jim Weigand, is Lynchburg.  Charles City is an invisible 0% over at the right.

And here are those selected divisions and the state average.


That 9.9% in Richmond counts the 146 kids, out of a 1,472-student cohort, whom the Richmond schools utterly failed to educate.
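As a quick check on that 9.9% (a back-of-the-envelope sketch using the cohort counts quoted above):

```python
# Sketch: reproduce the reported 4-year cohort dropout rate for Richmond.
cohort = 1472    # students in the 2016 4-year cohort
dropouts = 146   # students who dropped out

rate = 100 * dropouts / cohort
print(round(rate, 1))  # → 9.9
```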

But see this on the subject of the students who did not drop out and were left to marinate in the incompetence of RPS. 

Teacher Truancy, II

The 2014 data from the Feds showed Richmond with the ninth worst Virginia division record of teacher absences >10 days, excluding days for professional development. 

The Richmond data by school ranged from surprisingly high to astronomically high:

The 2014 RPS budget did not break out Richmond’s expenditures for substitutes; the 2015 budget showed $101.8 million for “instr. class staff” and $4.104 million (3.9% of the “instr.” budget) for “n-substitute instr prof.”
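One reading of those budget lines reproduces the 3.9% figure: treat the “instr.” total as the sum of the two lines quoted above. That is my assumption about how the percentage was computed, not anything RPS publishes, but the arithmetic is a sketch worth showing:

```python
# Sketch: back out the 3.9% substitute share from the FY 2015 budget lines,
# ASSUMING the "instr." total is the sum of the two quoted lines
# (my assumption, not RPS's published arithmetic).
class_staff = 101.8   # $ millions, "instr. class staff"
substitutes = 4.104   # $ millions, substitute line

share = 100 * substitutes / (class_staff + substitutes)
print(round(share, 1))  # → 3.9
```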

Today I checked the 2017 adopted budget.  It shows:


The 2015 actual expenditure for substitutes came to 5.6%, well beyond the budgeted 3.9%. 

As well, the budget for FY 17 shows 4.2% for substitutes (going up, it seems).  

Of course, I’ve filed a FOIA request to see what our Superintendent has been doing to reduce this waste of taxpayer funds.

Teacher Non-Evaluation

The National Council on Teacher Quality has a state-by-state evaluation of teacher evaluations.  Their analysis flunks Virginia’s implementation of the (statutory) requirement for use of objective measures of student growth as part of the teacher evaluation system.


(Note citation to Vermont).

In fact, that understates the weakness of the Virginia system.

The Virginia requirement in Va. Code § 22.1-253.13:5 is

Evaluations shall include student academic progress as a significant component and an overall summative rating.

According to the Board of Education, “Student Academic Progress” supposedly accounts for 40% of the evaluation.  But the Board’s “performance indicators” dilute that beyond recognition:


Nowhere in “sets . . . goals,” “documents . . . progress,” “provides evidence,” or “uses . . . data” do the guidelines say that the teacher shall be evaluated based on how much the students learn. 

This is important because experience teaches us that teacher evaluations that are not firmly grounded in objective data are inflated and meaningless:

  • In the 2011 statewide report of teacher evaluations (the only such report), three of 7,257 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  All the rest were “meets” or “exceeds” expectations.
  • In twelve of the Richmond schools denied accreditation this year, four of 444 (0.9%) teachers were rated “unacceptable” and 32 (7.2%), “developing/needs improvement.”  That is, only 8.1% of the teachers in these schools that had failed accreditation for four years running were less than “proficient.”

For a more detailed analysis of the Board’s feckless evasion of the law, see this.

Can You Spell “Ripoff”?

The annual CPI increase from 2014 to 2015 was 0.12%; the mid-year increase from 2015 to 2016 was 1.07%; extrapolated from the first six months of this year, it will be 1.68% in 2016.  The average increase in mandatory non-educational and general fees at our state colleges and universities for the upcoming year is 4.2%, ranging from zero in the Community College System to 8% at Mary Washington.
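The 1.68% extrapolation is consistent with compounding a six-month price change over a full year. A sketch of that method, with a hypothetical half-year figure (the 0.836% below is chosen for illustration; it is not a BLS number, and I am assuming this is how the extrapolation was done):

```python
# Sketch: one common way to "extrapolate from the first six months" --
# compound the half-year fractional change over twelve months.
def annualize(half_year_change):
    """Compound a 6-month fractional change into a full-year rate."""
    return (1 + half_year_change) ** 2 - 1

# Hypothetical half-year change of 0.836% (illustrative only).
print(round(100 * annualize(0.00836), 2))  # → 1.68
```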


Details are in the SCHEV report here.

Reedy Creek Boondoggle

The City is planning to waste your tax money on a “restoration” project that won’t do any good.

Reedy Creek rises near Chippenham Hospital; it flows through Forest Hill park and into the James. 

The concrete channel built to control the flooding of and near Midlo Turnpike (esp. in front of the Evergreen plant) has led to high stormwater flows that erode the banks downstream.  The City now plans to spend $1.27 million of your and my tax money to create a new floodplain on the City property across Crutchfield St. from Wythe High School to reduce erosion there. 

The Reedy Creek Coalition lists five major reasons for opposing the project.  Their #4 should settle the issue: The money won’t do any good.

Apparently the DEQ bureaucrat who approved the City’s application for grant funds is a recent ComeHere.  Anybody who has lived in Richmond for more than a few years knows that the City already spent $1.4 million to restore the Forest Hill Park lake that is downstream of this project.  The renovation included a silt capture system, actually two forebays:

“A forebay is basically a hole. It’s a settlement hole where the silt will kind of build up. We will be able to clear it out with a Bobcat and haul it off and it will fill up again. So the process will be able to continue. But it will not affect the lake so that the citizens’ investment that they have in the lake will certainly be safeguarded,” said Richmond Parks deputy director Roslyn Johnson.

The City’s grant application to DEQ calculates that the new project will remove 150 lbs/yr of phosphorus and 98,736 lbs/yr of sediment.  The calculation entirely overlooks the two forebays that already are removing most of this and other sediment and the attached phosphorus.

So the City wants to spend $1.27 million to solve a problem that it already has solved.

And they want Chesapeake Bay TMDL credits for removing pollutants that are NOT entering the river.

Can you spell “boondoggle”?

SAT Update

RPS has just posted the 2015 SAT data.  Here are the reading scores by school back to 2010, along with the division averages and the Virginia averages.


And here are the math scores.


I’ve included the available data points for Maggie Walker (the 2010 data are from an RPS post; the 2014 data, from Jeff McGee at MLW).  Of course, MLW is not a Richmond public school, albeit VDOE reports the SOL scores of MLW students at the high schools (that those students do not attend) in those students’ home districts.

To provide some context, here are the 2014 (presumably; they were posted on 12/20/14) 25th and 75th percentile scores of the students admitted to Virginia public colleges, along with the 2014 Virginia and Richmond averages. 



Here is (part of) what the Web page has to say about what the percentiles mean:

Understanding these numbers is important when you plan how many colleges to apply to, and when you figure out which schools are a reach, a match, or a safety. If your scores are below the 25th percentile numbers, you should consider the school a reach. Note that this does not mean you won’t get in — remember that 25% of students who enroll have a score that is at or below that lower number.

For sure, averages v. percentiles is an apples and pomegranates comparison.  That said, the Virginia reading average is between the 25th percentiles at VMI and Christopher Newport; Richmond is 95 points lower than that state average.  For math, the Virginia average is between the 25th percentiles at VCU and VMI, while Richmond is 102 points lower.

Westover Hills Elementary: Glass Half Full?

The Winter edition of the Forest Hill Flyer (not yet posted to the Association Web site) had an interesting piece regarding the new (since 2011) Principal at Westover Hills, Virginia Loving, and her project to “make it a neighborhood school.” 

I’ll leave it to someone with more direct knowledge to assess the other effects of Ms. Loving’s outreach; I turn to the results on the VDOE Web site from the statewide testing program under the SOLs.

First, I should note that Principal Loving came in at a particularly difficult time: VDOE promulgated a new, tougher set of math tests in 2012 and reading tests in 2013 that clobbered the pass rates statewide (data here for all tested grades). 


Unfortunately, our former Superintendent did not prepare the Division for the new tests.  The lack of preparation exacerbated the score drop here.


You might also recall that Richmond’s elementary schools on average perform ten points or more below the state average (but stratospherically above our middle schools).


That said, the SOL performance at Westover Hills has been decidedly mixed.

On the math tests, WH was scoring near the (unacceptable) Richmond average before 2012.  The new tests hit even harder at WH than at the district average.  But WH recovered more quickly than the Richmond average (data for grades 3-5).


That is, under Principal Loving, WH took an unusually big hit from the new math tests but since has been showing signs of improvement.

I’ve included the data for Patrick Henry, which is nearby and might be viewed as a neighborhood school.  PH was hit even harder by the new tests and recovered only to about the (awful) Richmond average.

The reading scores reveal a more troubled situation.  WH again was performing at about the Richmond average.  After the new test plunge in 2013, Richmond scores improved but the WH scores continued to decline.  This is not good news for the school or for the children who attend it.  (Again, data for grades 3-5).


Patrick Henry started higher and dropped to the state average, but then it, too, continued to drop.

For what they may communicate, here are the combined (reading + math) pass rates.


Seems to me the neighborhood outreach could be more effective if the teaching, especially of reading, were improved. 

Of course, VDOE has been obtaining SGP data that would tell us which of the Westover Hills teachers are, or are not, effective, so Principal Loving (and the neighborhood) would have data that directly measure teacher performance.  Unfortunately, VDOE is concealing the data they already have and now is abandoning the SGP entirely.  This is fully consistent with VDOE’s actual function, which is to be the Department of Data Suppression.