Question for VDOE

I’ve been looking at enrollment patterns (posts will follow) and have emerged with questions about the problems that underlie the “ninth grade bump,” the larger enrollment in the ninth grade compared to both the later grades and the eighth grade.  So today I sent an email to the estimable Chuck Pyle at VDOE:

3/1/17

Via Email

Chuck,

The “ninth grade bump” looks to be a statewide phenomenon.

[Chart: statewide enrollment by grade, showing the ninth grade bump]

VDOE examines the far side of that bump with data on dropouts and graduation rates. The graduation rate now is part of the accreditation process.

The near side of the ninth grade bump looks to reflect students who are not prepared to do high school work or who otherwise have problems adapting to high school. What is VDOE doing to measure the influences that lead to this problem and to counter those influences?

John

Middle School Mess, II

We have seen that middle school and the onset of puberty have been marked statewide by a slight drop in math pass rates but not in reading.  In contrast, the sixth grade brought plummeting pass rates in both subjects in the Richmond schools.

Those data mostly looked at the period before the new math tests in 2012 and the new reading tests in 2013.

Today let’s look at the third-graders of 2011 and the progress of their group up to the past school year.  To that end, here are the pass rates of the third graders in 2011, the fourth graders in 2012, and so on. 

Let’s start with the reading pass rates.

[Chart: reading pass rates, 2011 third-grade cohort, Richmond vs. state average]

The new tests dropped Richmond’s fifth graders to 23 points below the state average in 2013, whereas the same group had scored only six points below the average in 2012 (under the old tests).

The same group dropped another ten points when they entered middle school in 2014; in the eighth grade that group remained 31 points below the state average.

The math scores are even more dramatic.

[Chart: math pass rates, 2011 third-grade cohort, Richmond vs. state average]

The group went from six to fourteen points below the state under the new tests in 2012, and dropped another three points in the fifth grade (2013).  Then came middle school: In the sixth grade the (mostly) same students fell to 43 points below the average.  In 2016 (eighth grade) the group remained 30 points down.

As before, we see two effects here: The new tests whacked the Richmond scores in both elementary and middle grades, especially in math.  The middle schools took students who had been performing below average in elementary school and devastated their performances, especially in math.

These data don’t tell us what is wrong with our middle schools; they do tell us that, whatever it is, it is horrible.

Gone. Forgotten. For Shame!

The 2016 4-year cohort dropout data are up at VDOE.  The only good thing about the Richmond datum, 9.9% dropouts, is that it’s less than last year’s 11.9%.

Here is the distribution of division dropout rates.

[Chart: distribution of division dropout rates]

Richmond is the gold bar.  The red bars are, from the left, Petersburg, Norfolk, Hampton, and Newport News.  The blue, with a hat tip to Jim Weigand, is Lynchburg.  Charles City is an invisible 0% over at the right.

And here are those selected divisions and the state average.

[Chart: dropout rates, selected divisions and the state average]

That 9.9% in Richmond counts the 146 kids, out of the 1,472 student cohort, whom the Richmond schools utterly failed to educate. 
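The arithmetic is easy to check against the cohort counts quoted above (a reader-side verification, not part of the VDOE report):

```python
# Dropout rate = dropouts / cohort size, as a percentage.
dropouts = 146   # Richmond students who dropped out, per VDOE
cohort = 1472    # size of the 4-year cohort

rate = 100 * dropouts / cohort
print(f"{rate:.1f}%")  # → 9.9%, matching the reported figure
```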

But see this on the subject of the students who did not drop out and were left to marinate in the incompetence of RPS. 

Teacher Truancy, II

The 2014 data from the Feds showed Richmond with the ninth-worst record among Virginia divisions for teachers absent more than ten days, excluding absences for professional development.

The Richmond data by school ranged from surprisingly high to astronomically high.

The 2014 RPS budget did not break out Richmond’s expenditures for substitutes; the 2015 budget showed $101.8 million for “instr. class staff” and $4.104 million (3.9% of the “instr.” budget) for “n-substitute instr prof.”

Today I checked the 2017 adopted budget.  It shows:

[Table: substitute expenditures from the 2017 adopted RPS budget]

The 2015 actual expenditure for substitutes came to 5.6%, well beyond the budgeted 3.9%. 
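Assuming both percentages are shares of the same instructional base (the budget excerpt does not give the actual dollar figure), the overrun works out like this; a rough sketch, not RPS’s own accounting:

```python
budgeted_share = 3.9  # percent of instructional budget, per the 2015 budget
actual_share = 5.6    # percent, per the 2015 actual expenditures

# Relative overrun: how far actual substitute spending exceeded the budgeted share.
overrun = 100 * (actual_share - budgeted_share) / budgeted_share
print(f"about {overrun:.0f}% over budget")  # → about 44% over budget
```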

As well, the budget for FY 17 shows 4.2% for substitutes (going up, it seems).  

Of course, I’ve filed a FOIA request to see what our Superintendent has been doing to reduce this waste of taxpayer funds.

Teacher Non-Evaluation

The National Council on Teacher Quality has a state-by-state evaluation of teacher evaluations.  Their analysis flunks Virginia’s implementation of the (statutory) requirement for use of objective measures of student growth as part of the teacher evaluation system.

[Image: NCTQ rating of Virginia’s teacher evaluation system]

(Note the citation to Vermont.)

In fact, that understates the weakness of the Virginia system.

The Virginia requirement in Va. Code § 22.1-253.13:5 is

Evaluations shall include student academic progress as a significant component and an overall summative rating.

According to the Board of Education, “Student Academic Progress” supposedly accounts for 40% of the evaluation.  But the Board’s “performance indicators” dilute that beyond recognition:

[Image: Board of Education “performance indicators” for Student Academic Progress]

Nowhere in “sets . . . goals,” “documents . . . progress,” “provides evidence,” or “uses . . . data” do the guidelines say that the teacher shall be evaluated based on how much the students learn. 

This is important because experience teaches us that teacher evaluations that are not firmly grounded in objective data are inflated and meaningless:

  • In the 2011 statewide report of teacher evaluations (the only such report), three of 7,257 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  All the rest were “meets” or “exceeds” expectations.
  • In twelve of the Richmond schools denied accreditation this year, four of 444 (0.9%) teachers were rated “unacceptable” and 32 (7.2%), “developing/needs improvement.”  That is, only 8.1% of the teachers in these schools that had failed accreditation for four years running were less than “proficient.”
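The Richmond percentages check out (again, just a reader-side verification of the arithmetic):

```python
teachers = 444      # teachers in the twelve denied-accreditation schools
unacceptable = 4
developing = 32     # "developing/needs improvement"

below_proficient = unacceptable + developing
print(f"{100 * unacceptable / teachers:.1f}%")      # → 0.9%
print(f"{100 * developing / teachers:.1f}%")        # → 7.2%
print(f"{100 * below_proficient / teachers:.1f}%")  # → 8.1%
```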

For a more detailed analysis of the Board’s feckless evasion of the law, see this.

Can You Spell “Ripoff”?

The annual CPI increase from 2014 to 2015 was 0.12%; the mid-year (June-to-June) increase from 2015 to 2016 was 1.07%; extrapolating from the first six months, the annual increase for 2016 will be about 1.68%.  The average increase in mandatory non-educational and general fees at our state colleges and universities for the upcoming year is 4.2%, ranging from zero in the Community College System to 8% at Mary Washington.

[Chart: mandatory non-E&G fee increases by institution]

Details are in the SCHEV report here.

Reedy Creek Boondoggle

The City is planning to waste your tax money on a “restoration” project that won’t do any good.

Reedy Creek rises near Chippenham Hospital; it flows through Forest Hill Park and into the James.

The concrete channel built to control the flooding of and near Midlo Turnpike (esp. in front of the Evergreen plant) has led to high stormwater flows that erode the banks downstream.  The City now plans to spend $1.27 million of your and my tax money to create a new floodplain on the City property across Crutchfield St. from Wythe High School to reduce erosion there. 

The Reedy Creek Coalition lists five major reasons for opposing the project.  Their #4 should settle the issue: The money won’t do any good.

Apparently the DEQ bureaucrat who approved the City’s application for grant funds is a recent ComeHere.  Anybody who has lived in Richmond for more than a few years knows that the City already spent $1.4 million to restore the Forest Hill Park lake that is downstream of this project.  The renovation included a silt capture system, actually two forebays:

“A forebay is basically a hole. It’s a settlement hole where the silt will kind of build up. We will be able to clear it out with a Bobcat and haul it off and it will fill up again. So the process will be able to continue. But it will not affect the lake so that the citizens’ investment that they have in the lake will certainly be safeguarded,” said Richmond Parks deputy director Roslyn Johnson.

The City’s grant application to DEQ calculates that the new project will remove 150 lbs/yr of phosphorus and 98,736 lbs/yr of sediment.  The calculation entirely overlooks the two forebays that already are removing most of this and other sediment and the attached phosphorus.

So the City wants to spend $1.27 million to solve a problem that it already has solved.

And they want Chesapeake Bay TMDL credits for removing pollutants that are NOT entering the river.

Can you spell “boondoggle”?

SAT Update

RPS has just posted the 2015 SAT data.  Here are the reading scores by school back to 2010, along with the division averages and the Virginia averages.

[Chart: SAT reading scores by school, 2010–2015]

And here are the math scores.

[Chart: SAT math scores by school, 2010–2015]

I’ve included the available data points for Maggie Walker (the 2010 data are from an RPS post; 2014, from Jeff McGee at MLW); of course, MLW is not a Richmond public school, although VDOE reports the SOL scores of MLW students at the high schools (that those students do not attend) in those students’ home districts.

To provide some context, here are the 2014 (presumably; they were posted on 12/20/14) 25th and 75th percentile scores of the students admitted to Virginia public colleges, along with the 2014 Virginia and Richmond averages. 

[Chart: 25th/75th percentile SAT reading scores at Virginia public colleges]

[Chart: 25th/75th percentile SAT math scores at Virginia public colleges]

Here is (part of) what the Web page has to say about what the percentiles mean:

Understanding these numbers is important when you plan how many colleges to apply to, and when you figure out which schools are a reach, a match, or a safety. If your scores are below the 25th percentile numbers, you should consider the school a reach. Note that this does not mean you won’t get in — remember that 25% of students who enroll have a score that is at or below that lower number.

For sure, averages v. percentiles is an apples and pomegranates comparison.  That said, the Virginia reading average is between the 25th percentiles at VMI and Christopher Newport; Richmond is 95 points lower than that state average; for math, the Virginia average is between the 25th percentiles at VCU and VMI while Richmond is 102 points lower.