Neither Advanced Nor Proficient

Brian Davison mentioned the other day that it can be useful to follow the Pass Advanced and Pass Proficient numbers that are elements of the total pass rates.

The Board of Education has set “cut scores” for the various SOL tests.  For instance, on the 5th grade math test, a student needs 31 of 50 items to score Pass Proficient and 45 of 50 for Pass Advanced.  The VDOE database reports the Advanced and Proficient pass rates down to the school and grade level.
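Those cut scores make the three reporting categories easy to express.  Here is a minimal sketch using the 5th grade math cuts mentioned above (other tests use different cut scores):

```python
def classify(raw_score, proficient_cut=31, advanced_cut=45):
    """Bucket a raw score using the 5th grade math cuts:
    45 of 50 for Pass Advanced, 31 of 50 for Pass Proficient."""
    if raw_score >= advanced_cut:
        return "Pass Advanced"
    if raw_score >= proficient_cut:
        return "Pass Proficient"
    return "Fail"

print(classify(46))  # Pass Advanced
print(classify(31))  # Pass Proficient
print(classify(30))  # Fail
```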

For my first venture into this thicket, I’ve pulled the reading and math subject area data by grade for Richmond and the state. 

Let’s start with reading.  Here are the 5th grade data:

image

To untangle this, start with the red data points, which are Pass Advanced rates for Richmond (yellow lines) and the state (blue lines).

Richmond fifth graders were performing within about ten points of the state average before the new tests in 2013 (and much closer in 2012).  The new tests in 2013 halved the Pass Advanced rate for the state and cut Richmond’s by two thirds.

The green points are the Pass Proficient data, again with yellow lines for Richmond and blue for the state.  Before 2013, Richmond was partially compensating for its lower Pass Advanced rate with a higher than state average Pass Proficient rate.  The new tests in ‘13 dropped the Richmond Proficient rate by almost a third but left the state rate essentially unchanged.

So, at the state level, we see the new test moving about half of the Pass Advanced students to lower tranches (presumably mostly to Pass Proficient) and moving about as many Pass Proficient students to failing as were added from the ranks of the former Pass Advanced.  In Richmond, however, the Pass Proficient rate dropped by almost 20%; that rate recovered a bit two years later but it remains below the state average.

The curves with no points are the total pass rates, blue for the state and yellow for Richmond. 

Why did our elementary schools (and middle and high schools, see below) get disproportionately clobbered by the new, tougher tests?  Almost certainly because Richmond’s then-Superintendent did not align the curricula to the new tests.

This handed our current Superintendent a blank check: Fix the curricula and get a bounce.  (Bedden started here in January 2014, so look for the Bedden Bounce in the ‘14-15 numbers.)  In these data, the Pass Proficient rate shows some bounce but the Pass Advanced rate does not.

Bounce or no, the new tests lowered Richmond’s overall pass rate, compared to the State.  Richmond’s fifth graders had been flirting with the state total pass rate until 2013; now, even after the Bedden Bounce, they are fourteen points behind.

Next the 8th grade reading data:

image

The new tests whacked the Pass Advanced rates both in Richmond and statewide.  The Pass Proficient rate for the state jumped under the new tests (looks like a chunk of the 40% decrease of the Advanced population wound up in Proficient) but in Richmond that rate fell.  Again, the Pass Proficient rate in Richmond enjoyed a bounce under the new Superintendent, but left us ten points below the 2012 level.  The Pass Advanced numbers rose slightly in ‘15. 

As to the totals, the new test clobbered Richmond’s (already dismal) rate vs. the state, leaving our eighth graders thirty points down.

The End of Course numbers show a larger 2013 hit to the Pass Advanced rate, both here and statewide.  The state Proficient rate jumped; the Richmond rate showed a smaller rise with no Bedden bump:

image

Our EOC average in 2016 was fourteen points below the state average.

Turning to math: The new tests came a year earlier, 2012.  Those tests degraded a close-to-average fifth grade performance.

image

Again, the Pass Proficient rate enjoyed a bounce in 2015.  The 2014 increases in both rates came after Bedden had been on the job for only six months, so the credit will have to go elsewhere.  As to the totals, after the Bedden Bump we still are nine points down.

Our awful middle schools blew the new math tests, big time.

image

After the Bedden Bump (Proficient only), our total pass rate still was thirty points below the state average.

The Pass Proficient rate in the high schools partially recovered from the new tests in 2014 and ‘15; the Pass Advanced rate, not so much.

image

In 2016 our total pass rate remained sixteen points below the state average.

A look at the averages by subject area emphasizes the magnitude of Richmond’s failure here, particularly in the Pass Proficient rates:

image

image

We were twenty points below the state average in reading in 2016, 23 points in math.

For the global view, here are the five-subject averages:

image

The 2016 Richmond average is 22 points below the state average.

Note: Jim Weigand points out that 2015 also was the first year that retakes were allowed for the elementary and middle school grades. VDOE does not release retake data; they did admit: “The 2014-2015 school year was the first during which students in grades 3-8 were allowed to retake SOL tests in reading, mathematics, science and history. On average, the performance of students on expedited retakes increased pass rates by about four points on each test.”

In light of that, we probably could discount the statewide average increases for 2015 by about 4% and we could wonder how much of the Richmond increases was Bedden Bounce and how much was retakes. 
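The discount itself is simple arithmetic.  A sketch, with made-up rates (the real 2014 and 2015 rates are in the linked data):

```python
def gain_net_of_retakes(rate_2015, rate_2014, retake_boost=4.0):
    """Estimate the part of a 2015 gain not attributable to expedited
    retakes, using VDOE's 'about four points' figure."""
    return (rate_2015 - rate_2014) - retake_boost

# Illustrative only: a six-point gain shrinks to about two points.
print(gain_net_of_retakes(79.0, 73.0))  # 2.0
```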

Here are the raw numbers:

image

Bottom line:  The modest gains from aligning the English and math curricula to the new tests mostly failed to repair the damage those tests did to Richmond’s pass rates.

That left RPS mired in failure: We have the lowest pass rates in the state in reading and the five-subject average; we have the second lowest in math, writing, history & social science, and science.  That tells us something crucial about our Superintendent and about the School Board that hired him.

I’ve posted these data to OneDrive, where you can look for more details.  The data by grade are here; the single and five-subject data are here.

How’s My School Doing?

I have posted to OneDrive a spreadsheet of SOL pass rates by year for the Richmond schools.  You can access it at this link.

To get data on a particular school, go to the drop down list at H16 and click the little funnel. 

image

Clear the school currently selected (the easy way is to clear the “(Select All)” button) and select the school you want to view.  The table and graph will change to display the data.  If you want the Richmond average, click “Select All.”

Warning: If you don’t clear the previous selection, you’ll get the average of both (or all) schools selected.  You can tell that has happened by the drop down list, which will show “(Multiple Items)” rather than the name of a particular school.

image

Note: The accreditation level for English is 75%; for all the others, 70%.  Those are nominal; in fact, VDOE applies adjustments through an opaque process that tends to raise the actual pass rates.

Here, as an example, is the graph for Carver:

image

And here, for contrast, is MLK:

image

“The” Is a Dirty Word in Richmond

Our City Throws Away Emails Without Any Notice and Then Pretends It’s Not Their Fault

Perhaps you’ve noticed that Richmond sought and received a grant to pay for removing pollutants from Reedy Creek, never mind that the sediment traps in Forest Hill Lake already are removing those pollutants.  And never mind that they plan to dig up the part of Reedy Creek that now is helping to improve water quality.

It gets more interesting: DPU’s Grace LeRose is co-author of a PowerPoint that touts “Integrated Watershed Management” and suggests that we “Apply $$ to get best environmental gain.”  Yet DPU has not studied the environmental gain available from the upstream portion of Reedy Creek where the City’s efforts have exacerbated the problem that their present effort will not solve.  Another LeRose PowerPoint discusses “Stakeholder Involvement” that apparently never occurred.

I was curious enough about all this to send them a Freedom of Information Act request for, inter alia:

  • All records that evaluate or comment upon alternatives to the [Reedy Creek] project;
  • All records that disclose, or discuss the actual or potential disclosure, to the Department of Environmental Quality that some portion of the sediment and/or phosphorus to be removed from Reedy Creek by the project now is removed by the sediment traps at Forest Hill Lake; and
  • All records that establish or comment upon the relationship of the Reedy Creek project and the goal of [the LeRose PowerPoint] in light of the existing sediment traps at Forest Hill Lake: Apply $$ to get best environmental gain.

No reply!

I’ve had problems in the past with the Richmond system blocking emails, so I forwarded the requests to Mr. Todd of IT, who has been helpful before with disappearing emails.

No reply.

At this point I should have sued them.  But lawsuits are disruptive and loaded with uncertainty.  Most annoying: I would have to pay income taxes on the attorneys’ fee award.

So I mailed a hard copy to the City Attorney.  Nine days later I got a helpful call from Dave Kearney, who has returned to the City Attorney’s office.  He assembled the request and the (LeRose PowerPoint) attachments and got them to DPU.

Next day I received an email with a pdf of a letter from Susan McKenney, also with the City Attorney.

If I were to deal with all the outrageous statements in the McKenney letter, I’d have to write a treatise.  So I’ll stick to the really weird one:

We believe the City’s email filtering appliance likely intercepted the email intended for Mr. Steidel (and any of your subsequent attempts to forward or resend that email to Mr. Steidel, Mr. Todd, or Mr. Jackson) due the the appearance of the terms (sic) “porn,” “jerk,” and “the” [in the email].

McKenney has to say “likely” because the same primitive spam filter that purges the emails without notice to anybody deletes its logs after seven days.

Then we have those offensive “terms.”  They are in the signature that I put on my Verizon account some time back to warn unwary readers about my propensity to send out links (offensive terms highlighted here):

HEADS UP: I don’t think I’ve been hacked and I post only links that work and don’t seem to be dangerous. Even so, DON’T CLICK ON ANY LINK IN THIS (or any other) EMAIL. It’s just not safe. If what I say looks interesting, and you don’t mind some risk, open your browser, type in the address of the Web site in question, and drill down to the page in question. (It’s no accident that the address of my blog, calaf.org, is quite short and easy to type.)

For example, I recently sent the link http://www.techspot.com/news/59754-watch-300-android-phones-tablets-play-beethoven-ode.html. If you are interested, you can open your browser and type in techspot.com/news. (Obviously you won’t intentionally go to anyplace in China or Russia or to anything related to porn or gambling. And you WILL check the spelling and avoid obvious traps such as “Goegle” or “tachspot”) As I write this, the Techspot 300 android phones post is listed on the /news page. Later on when it’s been replaced by newer news, you can click the search button there and search for “beethoven.” Or just Google “android phones beethoven.”

I know, I know. It’s a lot of trouble. But then, if you click a bad link and some jerk gets your logon data and your banking password and your latest love notes, you’ll wonder why you didn’t take the trouble.

So there you have it:

  • The City’s primitive and arbitrary spam filter blocks emails that contain offensive words such as “the”;
  • Any FOIA request that offends that filter gets deleted without notice to the sender or intended recipient;
  • Seven days later, they delete the logs, so they cannot know what they did or did not receive;
  • The statute requires the City to respond to FOIA requests within five working days of receipt;
  • The City could not arbitrarily trash an email if it had not received that email;
  • The City cannot respond to a request it has trashed without notice to anybody; and
  • Poor Ms. McKenney caught the hot potato and had to embarrass herself by writing a letter that seeks to defend the City’s stupidity.

I like to say that Richmond is the second most embarrassing jurisdiction in the East.  It looks like they are bucking for first place.

Richmond SOL vs. Poverty by Grade

We have seen that Richmond’s SOL scores mostly dropped this year and that neither expenditure nor the population of economically disadvantaged students explains Richmond’s worst or close-to-worst performance in the state.

To get a more detailed look at the matter, let’s take a look at SOL pass rates by economic disadvantage and grade. 

Here, to start, are Richmond’s and the state’s average pass rates in the reading subject area, broken out by grade for the ED and not ED populations.  (“EOC” is the end of course tests, required to obtain “verified” credits toward graduation.)

image

Hmmm.  ED students are scoring below their non-ED peers at both the state and Richmond levels.  Let’s examine the gaps by subtracting the Richmond pass rates from the state values for each group.

image

Two things jump out here:

First, Richmond’s awful middle schools somehow take kids who are performing within sight of the state average and drop their performance off a cliff.

Second, both the ED and non-ED populations underperform the state averages for their groups and grades by about the same amount.  Said otherwise, Richmond’s schools are underperforming at about the same level for both the ED and non-ED populations.

Next the math subject area.

image

image

Note the different scale here: As to middle school reading, both groups were 16 to 25 points behind their peers statewide; for math that range is 23 to 43 points. 

And we see our middle schools again doing a terrible job for our ED students but an even worse job for the non-ED population.

Finally, the five subject average.  (“CST” = Content Specific Test, used for elementary and middle school history tests.)

image

image

This paints a picture similar to the reading results: Low performance in elementary schools; awful performance in middle schools; roughly ten points of underperformance on the EOC tests; and Richmond ED and non-ED students underperforming the statewide averages for their groups by about the same amount.

This more finely grained view provides some interesting details but does not change the basic result: Economic disadvantage is a problem but it’s not the problem with our schools.  That problem is lousy schools (especially the middle schools).

Big Bucks; Pitiful Bang

In the past, I have demonstrated that Richmond was paying a lot of money for poor SOL results.  This year, in light of our Superintendent’s complaints about his customer base, I’ve demonstrated that our SOL scores are awful, even after correcting for the economic disadvantage of the student population.

In this latter light, I turn to this year’s bang per buck analysis.  Instead of using the raw SOL scores as with, e.g., last year’s reading data,

http://calaf.org/wp-content/uploads/2015/08/image8.png

I’ll adjust the scores for economic disadvantage by adding the decreases predicted by the % ED

image

to the actual pass rates to level the playing field. 

image

Note: The gold squares are Richmond; the red diamonds, from the left, are Hampton, Newport News, and Norfolk; the green diamond is Charles City County.

On the reading data, this produces an average adjusted score of 90.4, which is the intercept of the fitted line, i.e., the pass rate extrapolated to 0% ED.  Said otherwise, it expresses each division’s pass rate as the actual rate increased by the disadvantage posed by the division’s percentage of economically disadvantaged students.

You might notice that six divisions show corrected pass rates > 100%.  That is because they have pass rates that both are high and are considerably higher than their average ED would predict.  The rising tide floats all boats: The adjustment also raises Richmond from an actual 60% pass rate to an adjusted 79%.
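For readers who want to check the arithmetic, here is a sketch of that adjustment with invented numbers (the real fit uses the VDOE division data; the % ED values and pass rates below are illustrative only):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

pct_ed = [10.0, 30.0, 50.0, 70.0]   # hypothetical % ED by division
rates  = [95.0, 85.0, 75.0, 65.0]   # hypothetical pass rates

intercept, slope = fit_line(pct_ed, rates)

# "Level the playing field": add back the decrease the fit predicts
# from each division's % ED, i.e., extrapolate each rate to 0% ED.
adjusted = [r - slope * e for r, e in zip(rates, pct_ed)]
```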

As to cost, VDOE will not post the 2016 data until sometime this Spring so we’ll have to make do with 2015 data, here the disbursements per student.

On that basis, here are the 2016 division average reading pass rates, corrected for the ED of the division, plotted vs. the 2015 division disbursements per student.

image

The fitted line suggests a slight increase in score with disbursement but there is no correlation.  That is, spending more per student is not correlated with better pass rates (adjusted for ED, so the divisions with more poor students are not disadvantaged).

Richmond (the gold square) underperformed and overspent its peers, Hampton, Norfolk, and Newport News (the red diamonds, from the left), as well as our neighbor, Charles City County (the green diamond).

Three of those six divisions with adjusted pass rates > 100% (i.e., > 9.6 points above the 90.4 adjusted average) were spending < $12,000 per student; all spent less than Richmond’s $15,155.

image

The math data paint much the same picture.

image

Here the slope might suggest a negative correlation between pass rate and disbursement except that the R2 again tells us that there is no correlation.  Richmond again underperforms, expensively.

The five subject average tells the same story.

image

To state the obvious: Richmond schools don’t need more money; they need competent management.

The data are here.

2016 SOL v Poverty II

We have seen that, as expected, school division performance on the SOL is affected by the relative affluence of the students.  Indeed, on the 2016 Virginia data, it appears that the percentage of economically disadvantaged students explains about 25% of the variance in the division SOL pass rates.

image

Excel is glad to recast the data to express the division pass rates as differences from the least squares fit.  When we do that calculation for the reading subject, the graph of pass rate v. % ED

image

becomes a plot of (pass rate – calculated pass rate) v. % ED:

image

As before, Richmond is the gold square.  The red diamonds are, from the left, Hampton, Newport News, and Norfolk.  Charles City is the green diamond.

In this case, we see Richmond move from the lowest reading pass rate to the fifth from lowest rate after correction for the average (well, least squares fitted) influence of economic disadvantage.
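The recast that Excel performs is just actual minus predicted.  A sketch with invented numbers (the fit parameters and division figures here are hypothetical, not the VDOE data):

```python
def predicted(pct_ed, intercept=90.0, slope=-0.35):
    """Hypothetical least-squares fit of pass rate on % ED."""
    return intercept + slope * pct_ed

# (% ED, actual pass rate) for three invented divisions
divisions = {"A": (20.0, 85.0), "B": (60.0, 60.0), "C": (75.0, 70.0)}

# Residual = actual - fitted; negative means underperforming the trend.
residuals = {name: rate - predicted(ed)
             for name, (ed, rate) in divisions.items()}
```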

image

Note that all four of the lower performers have much lower ED percentages, with only Petersburg breaking 50%.  Among the divisions with larger ED populations, Richmond’s reading performance is uniquely awful.

Applying the same process to the math pass rates we see:

image

image

image

Here, only one of the high-ED divisions (Lancaster Co.) underperformed Richmond.

Finally, the five-subject averages, where we get bragging rights only as to poor Petersburg.

image

image

image

Please forgive me for saying it again: Economic disadvantage is a problem but it’s not the problem with our schools.  That problem is lousy schools.

Richmond SOL by School

The RT-D reports “Standards of Learning pass rates mostly hold steady across Richmond region, state.”

As to the state, that is true.  As to Richmond, not.

Let’s start with the reading subject area and Richmond’s elementary schools.  Here are the 2016 pass rates (note that 75%, after the “adjustments,” is passing for accreditation purposes).

image

The 64.4% average is not encouraging.  It improved from last year by 0.3%.

image

The accreditation benchmark for math, science, and history is 70%.

Our elementary schools’ History & Social Science pass rates showed a 77.2% average with a 2.5% decrease.

image

image

Patrick Henry was a particular disappointment here.

The math average of 68% was a 3.0% decrease from 2015.

image

image

The elementary schools averaged 70% for the science subject area, up 1.2% from 2015.

image

image

The four subject average for our elementary schools was 69.9%, down 1.0% from 2015.

image

image

Carver, with a tough clientele, continues to shine.

Our lousy middle schools turned in another awful reading performance, averaging 54.8%, an increase of 0.3% from 2015.

Notes: Franklin Military has both middle and high school grades; I’ve included it in both the middle and high school datasets although the pass rates do not compare directly with either the middle or high schools.  2016 is the first year for Elkhardt-Thompson so there are no year-to-year data.

image

image

The writing average was an appalling 39.3%, down 8% from 2015.

image

image

History & Social Science looks a lot better, but shows a 2.6% decrease.

image

image

The math pass rate plunged 6.3% to 47.5%.

image

image

Science dropped 3.3% to 60.7%.

image

image

The middle school five subject average dropped 4.0% to 54.8%.

image

image

As to Elkhardt-Thompson, here are the 2015 data for the separate schools and the 2016 rates for the combined school. 

image

The History & SS scores came up a bit to passing; all the rest are and were variously awful.

The high schools posted a mixed group of pass rates, dropping an average of 3.9% to 72.8%.

image

image

image

image

image

image

image

image

image

image

image

image

Looks like ongoing problems at Armstrong, Marshall, and Wythe.

Overall, these data reflect a frightful incompetence of our School Board and Superintendent.

2016 SOL v Poverty

The excuse we often hear for Richmond’s poor performance on the SOL tests is poverty.

VDOE has data on that.  They define a student as “economically disadvantaged” if that student “1) is eligible for Free/Reduced Meals, or 2) receives TANF, or 3) is eligible for Medicaid, or 4) [is] identified as either Migrant or experiencing Homelessness.”  Data (Fall enrollments) are here.

Juxtaposing the 2016 Division pass rates with the ED percentage of the enrollment, we see the following for the reading subject area:

image

With an R2 of 39%, it appears that ED is a considerable influence on the pass rate.
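That R2 is the share of the variance in the division pass rates that the fitted line explains.  A sketch of the calculation (the three data points are invented, not the 2016 figures):

```python
def r_squared(actual, fitted):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - f) ** 2 for a, f in zip(actual, fitted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

print(r_squared([80.0, 70.0, 60.0], [78.0, 71.0, 61.0]))  # 0.97
```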

Even so, Richmond (the gold square) underperformed not just the peer jurisdictions (red diamonds, from the left: Hampton, Newport News, and Norfolk) and not just the divisions with similar rates of poverty, but all of Virginia’s school divisions, higher poverty or not.

Richmond’s math performance was nearly as dismal, beating out only Lancaster County.

image

The five-subject pass rate also was perfectly dismal, notwithstanding the larger R2.

image

Anybody who uses poverty as an excuse for Richmond’s lousy SOL scores is an ignoramus or a liar.  Or both.

————–

Here are the data:

image

image

2016 SOLs by Year

As a second bite at the SOL data released today, here are pass rates by year for Richmond; the peer jurisdictions of Hampton, Newport News, and Norfolk; our neighbor, Charles City; and the state.

image

image

image

Here are the same data, expressed as the division pass rate minus the state pass rate.

image

image

image

Upon reflection, firing the School Board and Superintendent would be too kind.

Time for a New Superintendent (And a New School Board)

The SOLs are up today. 

“Wait a minute,” you say.  “They had those data in time to schedule graduations last May.  Why did they wait until now to publish them?”

Good question.  Their excuse is that the summer testing data are not available until now.  The reason (one suspects; they hide the data so we can’t actually know) is that they manipulate these data extensively and all that futzing takes time.  As well, one can wonder whether the summer scores are so unusual that they would be embarrassed by their disclosure.

The Richmond scores are a disaster.

Here are the pass rates of the bottom ten divisions in each subject area as well as the five-subject average:

image

image

image

image

image

image

You may recall that we were second from the bottom in reading last year and sixth from the bottom in math. 

More details will follow as I find time to process the data.