Middle School Mess

The VDOE database is glad to produce pass rates by grade. 

We must view these numbers with some caution.  The high school pass rates are boosted by inclusion of the Maggie Walker students who live in Richmond, albeit Walker is not a Richmond public school.  As well, the schools still are giving the notorious VGLA to LEP students in grades 3-8 (albeit it’s now graded by the state, not the local schools). 

Here, then, are the pass rates for the 2015 reading tests for Richmond and the State. 

image

And here are the pass rates on the math tests.

image

We can simplify the pictures by taking the difference between the Richmond and state pass rates.

image
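The "Richmond minus state" numbers are just a per-grade subtraction; a minimal sketch (the pass rates below are illustrative placeholders, not the actual VDOE figures):

```python
# Sketch: difference between division and state pass rates by grade.
# The numbers below are illustrative placeholders, not actual VDOE data.
richmond = {"3": 54, "4": 58, "5": 60, "6": 45, "7": 42, "8": 44}
state    = {"3": 75, "4": 77, "5": 79, "6": 76, "7": 78, "8": 77}

# Negative values mean Richmond underperforms the state average.
diff = {grade: richmond[grade] - state[grade] for grade in richmond}
for grade, d in sorted(diff.items()):
    print(f"Grade {grade}: {d:+d}")
```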

So we see the elementary schools underperforming, the middle schools failing their students miserably, and the high schools a very mixed bag.

For sure, if RPS elects to attack the worst of the awful, they’ll start with the middle schools and eleventh grade reading.

Blarney à la Bedden

Carol Wolf has posted our Superintendent’s 2015 State of the Schools speech. 

After the introduction, the speech embraces the old, sad, false excuses for Richmond’s miserable performance:

  • A large percentage of students ages 0-17 live in poverty, 
  • More than 3 out of 4 students qualify for free/reduced lunch, 
  • 19% of students (put another way, over 4,000) receiving special education services, and
  • The growing ESL population, which has risen from 5% in the early 2000s to approximately 12% today.

 

Poverty Is Not the Problem

We already have seen that, while SOL pass rates certainly decline with increased poverty,

image

Richmond is grossly underperforming the other Virginia divisions with similar or higher poverty rates.

image

image

(Richmond is the gold points in the graphs above.)

 

Students with Disabilities Are Not the Problem Either

To see about the disabled students, let’s start with the reading pass rates by year:

image

Here we see Richmond’s students without disabilities consistently performing far below the state average for students without disabilities.  Then we see Richmond’s students with disabilities outperforming the state average for students with disabilities until the new reading tests in 2013, when the Richmond scores plummeted even more than the state average. 

We can tease out the magnitude of these effects by plotting the Richmond minus state pass rates.

image

In short, Richmond has since at least 2005 done a consistently lousy job teaching reading to its non-disabled students; since the new reading tests, Richmond has done a consistently lousy job teaching reading to all its students.

The math scores show a similar picture, except that the big drop came a year earlier with the new tests in 2012.

image

image

You might well pause to wonder how Richmond’s students with disabilities outperformed their peers statewide until the advent of the new tests.  I think it was because Richmond was abusing its disabled students to cheat on the VGLA.

In any case, in light of the gross underperformance of Richmond’s students without disabilities, it is simply false to claim that the students with disabilities are the problem here.  The problem is lousy schools, period.

 

Nor Are LEP Students the Problem

Turning to limited English proficiency (“LEP”), here are the data.

image

image

The signal/noise ratio here is lower for the LEP students, perhaps because of the smaller sample sizes, but the general effect is clear: Since at least 2010, Richmond’s LEP students have been underperforming their peers statewide but, relative to their peers, they have been outperforming Richmond’s non-LEP students.  LEP students are not the problem here.

 

Richmond’s lousy schools are the problem

Thus, it is worse than misleading to blame Richmond’s poor, its disabled, or its LEP students for Richmond’s awful SOL performance.  Richmond’s awful schools are the problem here.

And I foolishly thought this Superintendent might know better than to wallow in these false excuses that blame the kids for the system’s failures.

 

Indeed, Bedden Blew His Best Chance

Superintendent Bedden had his chance to shine.  His predecessor failed to align the Richmond curricula with the new math and reading tests.  2015 was Bedden’s first full year with newly aligned curricula.  He did not shine: In 2015 his schools posted the second worst reading pass rate in the state and the sixth worst in math. 

image

image

Now he is making the old, false excuses. 

At least he still has a little room to sing the other pitiful, old refrain: “We’re doing better than Petersburg.”

image

image

. . . Garbage Out

I’ve already discussed VDOE’s byzantine, opaque process for “adjusting” pass rates to calculate accreditation status.  So, without further comment, and for whatever these numbers may mean, here is the distribution of Full accreditations by division.

image

Richmond is the yellow bar.  The red bars are, from the left, Petersburg, Norfolk, Newport News, and Hampton.  The blue bar is the state average.

VDOE invented several forms of “nearly pass” categories this year.  Here is a list of the categories that appear in this year’s database, along with the abbreviations I had to use to fit the table below on the page. 

Category                                               Abbreviation
Accreditation Denied                                   Denied
Conditionally Accredited (New Schools)                 New School
Fully Accredited                                       Full
Partially Accredited: Approaching Benchmark-GCI        Close, GCI
Partially Accredited: Approaching Benchmark-Pass Rate  Close, Pass Rate
Partially Accredited: Improving School-Pass Rate       Improving, Pass Rate
Partially Accredited: Warned School-Pass Rate          Warned, Improved Pass Rate
To Be Determined                                       TBD

The complete list is here and the press release explaining the new categories is here.
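The “% Full” column in the table below is simply the count of fully accredited schools divided by the division total, rounded to a whole percent.  For Richmond, for example, 17 Full out of 45 schools:

```python
# "% Full" = fully accredited schools / total schools, as a whole percent.
# Counts from the table: Richmond City had 17 Full out of 45 schools.
full, total = 17, 45
pct_full = round(100 * full / total)
print(f"{pct_full}%")  # 38%
```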

Finally, here is the table, sorted by division.

  Denied New School Full Close, GCI Close, Pass Rate Improving, Pass Rate Warned, Improved Pass Rate TBD Total % Full
Accomack County     8     2 1   11 73%
Albemarle County     19   1   6   26 73%
Alexandria City 1   12     2 1   16 75%
Alleghany County     4       1   5 80%
Amelia County     1   1 1     3 33%
Amherst County     7     1 2   10 70%
Appomattox County     4           4 100%
Arlington County   1 31           32 97%
Augusta County     14   1 1 4   20 70%
Bath County     2       1   3 67%
Bedford County     12       6 1 19 63%
Bland County     1   1       2 50%
Botetourt County     10       1   11 91%
Bristol City     4       2   6 67%
Brunswick County     2     1 2   5 40%
Buchanan County     7   1     1 9 78%
Buckingham County     1   1   2   4 25%
Buena Vista City             1 3 4 0%
Campbell County     12         1 13 92%
Caroline County     3       2   5 60%
Carroll County     7       2   9 78%
Charles City County     1       1   2 50%
Charlotte County     4       1   5 80%
Charlottesville City     9           9 100%
Chesapeake City     34   2 5 4   45 76%
Chesterfield County     52   1 3 5   61 85%
Clarke County     3     1     4 75%
Colonial Beach     2           2 100%
Colonial Heights City     5           5 100%
Covington City     2   1       3 67%
Craig County     2           2 100%
Culpeper County     7   1 1 1   10 70%
Cumberland County             3   3 0%
Danville City     3   1 1 5 1 11 27%
Dickenson County   2 1     1 1   5 20%
Dinwiddie County     5       1 1 7 71%
Essex County           1 1 1 3 0%
Fairfax County     177 1 2 3 9   192 92%
Falls Church City     4           4 100%
Fauquier County     17       2   19 89%
Floyd County     5           5 100%
Fluvanna County     5           5 100%
Franklin City     1         2 3 33%
Franklin County     16           16 100%
Frederick County     12   1 1 3 1 18 67%
Fredericksburg City     4           4 100%
Galax City     3           3 100%
Giles County     4       1   5 80%
Gloucester County   1 7           8 88%
Goochland County     5           5 100%
Grayson County     6       1   7 86%
Greene County     2   1   2   5 40%
Greensville County     1     1 2   4 25%
Halifax County     4     2 3   9 44%
Hampton City     12   4   8 5 29 41%
Hanover County     23           23 100%
Harrisonburg City     6     1 1   8 75%
Henrico County 1   45     3 17 1 67 67%
Henry County     11   2   1   14 79%
Highland County     1     1     2 50%
Hopewell City     1       4   5 20%
Isle of Wight County     8       1   9 89%
King and Queen County     3           3 100%
King George County     4     1     5 80%
King William County     4           4 100%
Lancaster County         1   2   3 0%
Lee County     8       2   10 80%
Lexington City     2           2 100%
Loudoun County   1 83   1   1   86 97%
Louisa County     5     1     6 83%
Lunenburg County     1       3   4 25%
Lynchburg City     3   1 1 8 3 16 19%
Madison County     2       2   4 50%
Manassas City     6   1 1     8 75%
Manassas Park City     3       1   4 75%
Martinsville City           1 3   4 0%
Mathews County     3           3 100%
Mecklenburg County     3   3   1 1 8 38%
Middlesex County     3           3 100%
Montgomery County     18       1   19 95%
Nelson County     3       1   4 75%
New Kent County     4           4 100%
Newport News City 3   15   2 6 9 3 38 39%
Norfolk City 4 1 17   2 4 10 7 45 38%
Northampton County 1   1     2     4 25%
Northumberland County     2   1       3 67%
Norton City     2           2 100%
Nottoway County     1     3 2   6 17%
Orange County     9           9 100%
Page County     4     1 3   8 50%
Patrick County     6     1     7 86%
Petersburg City 1   1       3 2 7 14%
Pittsylvania County     16     1 1   18 89%
Poquoson City     4           4 100%
Portsmouth City     11     2 4 2 19 58%
Powhatan County     6           6 100%
Prince Edward County     1       2   3 33%
Prince George County     6       2   8 75%
Prince William County   1 80   4 1 1 1 88 91%
Pulaski County     7       1   8 88%
Radford City     4           4 100%
Rappahannock County     2           2 100%
Richmond City 2 2 17     3 14 7 45 38%
Richmond County     2           2 100%
Roanoke City     15   1 3 5   24 63%
Roanoke County     26           26 100%
Rockbridge County     5       1   6 83%
Rockingham County     22   1       23 96%
Russell County     11       1   12 92%
Salem City     6           6 100%
Scott County     13           13 100%
Shenandoah County     5     1 3   9 56%
Smyth County     12       1   13 92%
Southampton County     5         1 6 83%
Spotsylvania County     27     1 1   29 93%
Stafford County     30           30 100%
State 13 9 1414 1 46 76 215 49 1823 78%
Staunton City     3     1   1 5 60%
Suffolk City     11     3 4 1 19 58%
Surry County     2     1     3 67%
Sussex County     2   1       3 67%
Tazewell County     15           15 100%
Virginia Beach City     73   3 1 3 2 82 89%
Warren County     6   1 1     8 75%
Washington County     14   1       15 93%
Waynesboro City     2       4   6 33%
West Point     3           3 100%
Westmoreland County     1     2 1   4 25%
Williamsburg-James City County     15           15 100%
Winchester City     5       1   6 83%
Wise County     12           12 100%
Wythe County     11       1   12 92%
York County     19           19 100%

Lies, Damn Lies, and Accreditation “Adjustments”

On Tuesday, the Governor announced a “10-Point Increase in Fully Accredited Schools.”  As Jim Bacon quickly pointed out, some part of that increase must be due to the newly-allowed retakes that boosted pass rates by about four percent. 

Then we have the “adjustments.”  VDOE acknowledges that it fiddles the numbers:

Accreditation ratings also reflect adjustments made for schools that successfully remediate students who initially fail reading or mathematics tests. Adjustments also may be made for students with limited English proficiency and for students who have recently transferred into a Virginia public school. All of these factors are taken into account in calculating pass rates in each subject area.

That falls considerably short of earlier admissions.  Indeed, we know that earlier “adjustments” converted a 76.3 and a 73.7 into “perfect scores” and embarrassed the Governor.

In any case, the process is opaque.  About all we can do is compare the “adjusted” pass rates with those reported in the SOL database (that already includes the 4% retake boost).  I have a modest example here.

For the 1774 schools that appear in both databases (see below for the missing 49), the “adjustments” increase the math pass rates:

image

Excel is happy to fit curves to these data.  For the fitted curves, the actual mean is 82.4, the “adjusted” mean is 84.6.

All this produced a nice increase in the number of schools that made the 70% cutoff:

image
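The comparison amounts to lining up each school’s actual SOL pass rate with its “adjusted” accreditation rate and counting how many cross the 70% line.  A sketch, with made-up rates for a handful of hypothetical schools (these pairs are illustrative, not the actual VDOE numbers):

```python
# Sketch: effect of "adjustments" on the 70% accreditation cutoff.
# The paired rates below are illustrative, not actual VDOE numbers.
schools = {
    "School A": (68.0, 71.5),   # (actual SOL rate, "adjusted" rate)
    "School B": (74.0, 76.0),
    "School C": (66.5, 69.0),
    "School D": (69.5, 72.0),
}

made_cutoff_actual = sum(1 for a, _ in schools.values() if a >= 70)
made_cutoff_adjusted = sum(1 for _, adj in schools.values() if adj >= 70)
mean_actual = sum(a for a, _ in schools.values()) / len(schools)
mean_adjusted = sum(adj for _, adj in schools.values()) / len(schools)

print(made_cutoff_actual, made_cutoff_adjusted)  # 1 3
print(round(mean_actual, 1), round(mean_adjusted, 1))
```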

VDOE writes the tests; they can make them as hard or easy as they wish.  Yet they indulge in this byzantine, opaque process.  And then they brag about the fudged results.

Moreover, there’s a problem with the data.

Data Problem

In juxtaposing the Accreditation and SOL data, I had to make sure that the school names in both lists were aligned.  In many cases they were not, so I spent a rainy afternoon yesterday getting the lists to match.

To accomplish that, I dealt with dozens of cases where the SOL database had a space after the school name but the accreditation list did not (ask Excel to compare two strings and it compares them literally, trailing spaces and all).  As well, I had to deal with cases such as a Norfolk school that was “Mary Calcott Elementary School” in one list and “Mary Calcott Elementary” in the other.  Beyond those minor issues, I had to remove 48 schools that were in the accreditation list but not in the SOL database.
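The trailing-space and “Elementary School” vs. “Elementary” mismatches can be handled by normalizing both lists before matching.  A sketch of that cleanup (the helper function is mine, not anything from VDOE; the sample names come from the text above):

```python
def normalize(name: str) -> str:
    """Collapse whitespace and drop a trailing ' School' so that
    'Mary Calcott Elementary School ' matches 'Mary Calcott Elementary'."""
    name = " ".join(name.split())          # strip and collapse whitespace
    if name.endswith(" School"):
        name = name[: -len(" School")]
    return name

# Sample entries; the SOL list has trailing spaces and a longer name form.
sol_list = ["Mary Calcott Elementary School ", "Woodville Elementary "]
accreditation_list = ["Mary Calcott Elementary", "Woodville Elementary"]

matched = {normalize(n) for n in sol_list} & {normalize(n) for n in accreditation_list}
print(sorted(matched))
```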

image

(You might notice that 1774+48=1822, which is one short of the 1823 reported by VDOE.  I had to move these by hand and perhaps I messed up a cut-and-paste operation.  I’m not sufficiently invested in this to spend another afternoon trying to figure out who’s missing.)

We are left to wonder how they calculated “adjusted” pass rates for these schools that apparently had no pass rates.

I also had to remove twelve schools from the SOL report that were not in the accreditation list.

image

At least the two Richmond schools here make some sense: Elkhardt and Thompson were combined into a single school this year.  We are left to wonder why their pass rates were reported separately but they got accredited jointly,* and what happened to the accreditations of the other schools in this list.

As a more global matter, we are left to speculate why they fudge these data.  And how they do it.  And what other ways the data are screwed up.

Oh, and if one secret process for manipulating the data were not enough, we have another: the federal count of schools and divisions that met or failed to meet their Annual Measurable Objectives (aka “AMO’s,” of course).  The only thing to be said for this further waste of taxpayer dollars is that it may be more honest: 51.5% of Virginia schools flunked.

image

 

*Actually, we know the answer, at least as to the latter: The combined Elkhardt-Thompson is a “new school,” so it got a bye on accreditation.  The joint accreditation thus solved the problem of Thompson, which was denied accreditation last year.

In the Accreditation Basement

VDOE has posted the 2016 Accreditation Ratings, based on the 2015 test scores.

I’ll have more to say later about VDOE’s manipulation of the Accreditation Ratings, to include the newly minted “junior is flunking but by less than before” ratings.  For now, here are the Richmond results.

image

image

It’s hard to know what all those “TBD” entries mean.  I’ll have a look at the pass rates and post them here, soon.  For sure, 38% fully accredited is not good news.

And, also for sure, Thompson was denied accreditation last year but the newly-minted Elkhardt-Thompson is getting a free pass.

Sex and the SOL

The VDOE database is glad to produce the SOL data by sex, as well as by economic disadvantage.

BTW: The database calls it “gender,” not “sex.”  Nobody who took German long enough to know that “das Mädchen” (the maiden) is neuter gender would make that mistake.

Here, to start, are the 2015 reading pass rates for the state, Charles City, Richmond, and three other old, urban jurisdictions.

image

Even in that forest of data, it’s clear that the girls outscore the boys. 

As well, the Richmond numbers are low, compared either to the state or to the peer jurisdictions.  But, then, Richmond had the second lowest pass rate in the state. 

The Charles City male/female data look to be anomalous. 

Taking this one step further, here are the female minus male pass rates by jurisdiction and economic disadvantage.

image

Here we see the female outperformance is larger in the economically disadvantaged populations, both statewide and in the urban jurisdictions. 

Something is different in Charles City.  I couldn’t guess what.

Finally, here are the differences between the pass rates of students who are not economically disadvantaged and those who are, broken out by sex.

image

The effect of economic disadvantage is smaller in Charles City and Richmond, consistent with my speculation (here and in an unpublished communication) that Charles City and Richmond may have been overclassifying kids as “economically disadvantaged” to increase their Title I funding. 

Consistent with the anomalous numbers above, Charles City reverses the usual difference by sex.

Here are the analogous graphs showing the math data.

image

image

image

These data are generally consistent with the reading results, except that the Charles City anomaly on the female minus male scores is larger.

Lying by Telling the Truth

Monday, His Excellency Arne Duncan touted “a continuing upward trend in graduation rates.”  USDOE has a Press Release to the same effect.

What Duncan and USDOE neglected to mention was that the NAEP long-term data do not show improvements in reading or math scores of seventeen-year-old students.

image

image

Duncan is not dumb, so he must be deliberately overlooking the obvious conclusion, which is nothing to brag about:

Easier Grading = Higher Graduation Rate

Your tax dollars at “work.”

Richmond Pass Rates by Race and Economic Disadvantage

Having examined Lynchburg’s SOL performance broken out by race and economic disadvantage, I thought I’d take a look at Richmond.

First the baseline: Here are the statewide averages for the reading tests.

image

It is no surprise that Virginia’s Asian students outperform the white students who in turn outperform the black students, nor that, within each racial group, the economically disadvantaged students underperform their more affluent peers. 

The question for the day, however, is Richmond’s performance in each of these categories.  Here, to start, are the Richmond pass rates on the reading tests.

image

To simplify the comparison, let’s take the ratio of the Richmond to the State pass rates for each group.

image
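The ratios are just each Richmond group’s pass rate divided by the corresponding state rate for that group; a minimal sketch (the rates below are placeholders chosen to illustrate the pattern, not the actual 2015 figures):

```python
# Ratio of division to state pass rate per (race, disadvantage) group.
# Illustrative placeholder rates, not actual 2015 VDOE data.
richmond = {("White", "Not ED"): 92, ("White", "ED"): 55, ("Black", "ED"): 60}
state    = {("White", "Not ED"): 91, ("White", "ED"): 68, ("Black", "ED"): 67}

# A ratio above 1.0 means the Richmond group beats its statewide peer group.
ratios = {g: richmond[g] / state[g] for g in richmond}
for group, r in ratios.items():
    print(group, round(r, 2))
```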

There are several notable features in this pattern of underperformance:

  • Only Richmond’s white students who are not economically disadvantaged managed to equal (actually, to slightly exceed) the state average pass rate for their peer group;
  • Richmond’s Asian students did not outperform; and
  • Although well below the state average for their peer group, Richmond’s economically disadvantaged black students outperformed Richmond’s economically disadvantaged white students, relative to their peer groups, and also outperformed Richmond’s black students who are not economically disadvantaged. 

The outperformance of the economically disadvantaged black students comes as a surprise.  Everything else being equal, the more affluent students would perform better.  Perhaps the Richmond data reflect an overclassification of students as “economically disadvantaged.”  We know that Richmond overclassified students as “disabled” in order to improve its scores.  The same approach to economic disadvantage would allow Richmond to collect more Title I money.

Whatever the reason, the numbers here certainly are anomalous.

Turning to the math tests:

image

image

image

Here,

  • All groups are underperforming the state average pass rates for their peer groups;
  • Richmond’s Asian students again are not outperforming, albeit they are doing better here than on the reading tests;
  • Relative to their respective peer groups, Richmond’s black students who are economically disadvantaged again outperform both those who are not so disadvantaged as well as the white, economically disadvantaged students.

To compare the reading and math data, let’s subtract the Richmond/State performance for reading from that for math:

image

Except for one group, the math score ratio is higher than the reading ratio.  That is, Richmond’s reading instruction is batting only .167 when compared to Richmond’s (already inferior) math instruction.  And, for sure, there is a larger problem with the reading instruction for our economically disadvantaged Asian students.

The Bottom Line: Richmond’s math instruction is bad and its reading instruction is even worse.  But, then, we already knew that.

Racial Smoke Screen in Lynchburg

Jim Weigand points out the News & Advance article reporting that the Lynchburg school gurus have concluded that students’ race is a greater indicator of challenge than poverty, albeit both factors “matter a lot.”

Ask the wrong question, get an irrelevant answer. 

The racial achievement gap is a fact of life, although the reasons remain a lively source of controversy.

The relevant question here is not whether there is a racial gap in Lynchburg or whether that gap exceeds the differences attributable to economic disparity; the question is whether Lynchburg’s students are learning as much as their racial and economic peer groups statewide.

Let’s start with the SGP data.  Those data, which are essentially uncorrelated with whether students are economically disadvantaged, tell us that Lynchburg’s schools are doing an awful job.  For example, the 2014 statewide distributions of SGPs by teacher show an average of 48 for the reading tests and 49.3 for math.

image

Lynchburg, in contrast, has a reading average of 40.2

image

and a math average of 37.2.

image

How would you like to have your kid in the hands of that Lynchburg math “teacher” who produced an average SGP of four?

(In light of the manifest utility of these data, do you wonder why the teachers’ association, which claims it works “for the betterment of public education,” thinks it would be terrible to publicly identify the good and bad teachers in Virginia’s public schools or why VDOE has had second thoughts and is abandoning the SGP?)

We don’t have SGP data by race (The Virginia Department of Data Suppression has those data but has not shared them).  Less usefully, the VDOE database can break out pass rates by race.  Data there show that, statewide, Asian students on the reading tests outperform white students, who outperform black students.  This holds both for students who are and for those who are not economically disadvantaged.

image

The Lynchburg pattern is somewhat different.

image

To illuminate the differences, we can calculate the ratios of the Lynchburg to the State pass rates by race and economic status:

image

Here we see Lynchburg’s white students, economically disadvantaged and not, performing at about the same level as their peer groups statewide.  The black students who are not economically disadvantaged are underperforming the state average of similarly situated black students; Lynchburg’s economically disadvantaged black students are considerably underperforming their statewide peer group.  So, economic disadvantage or no, Lynchburg’s black students are passing the tests at a rate below the state average.

Here are the same data for the math tests:

image

image

image

So we see that Lynchburg’s white students, both economically disadvantaged and not, who pass the reading tests at or slightly above the state average rates for their groups nonetheless underperform on the math tests.  Lynchburg’s black students, both economically disadvantaged and not, considerably underperform their peers statewide. 

The important questions Lynchburg should be seeking to answer are why its black students underperform the state averages for their peers, economically disadvantaged or not, in both reading and math, and why all Lynchburg groups but the economically disadvantaged Asian students (!) underperform in math.

Where Are the Data

As a further look at the performance and underperformance of Richmond’s elementary schools, here is the range of 2015 pass rates on the reading tests.

image

Here we see Carver and Fairfield Court outperforming (we’ll deal with Munford below) while Woodville underperforms at an unconscionable level.  In the meantime, the charter school, Patrick Henry, is in the middle of the pack.

The math scores paint a similar picture except that Cary joins the outperformers and Patrick Henry sinks to the bottom third.

image

The Fall membership data from VDOE tell us that Munford, the green point, is blessed with a large population of more affluent kids while the other leaders, blue with Carver to the left, are not. 

image

Woodville, with 79% economically disadvantaged students, is the orange point.

Here is the same graph for the math tests.  Cary joins the leaders as the left-hand blue point.

image

For sure, the economic status of the students does not explain these data.

Here is the dataset.

School Name                                % ED   Reading   Math
Bellevue Elementary                         61%       64%    73%
Blackwell Elementary                        53%       53%    66%
Broad Rock Elementary                       70%       81%    83%
Chimborazo Elementary                       68%       50%    57%
E.S.H. Greene Elementary                    73%       55%    71%
Elizabeth D. Redd Elementary                68%       63%    68%
Fairfield Court Elementary                  84%       88%    90%
G.H. Reid Elementary                        65%       49%    51%
George Mason Elementary                     81%       43%    61%
George W. Carver Elementary                 71%       98%    97%
Ginter Park Elementary                      58%       63%    79%
J.B. Fisher Elementary                      42%       83%    90%
J.E.B. Stuart Elementary                    69%       72%    80%
J.L. Francis Elementary                     68%       59%    69%
John B. Cary Elementary                     69%       72%    93%
Linwood Holton Elementary                   29%       74%    69%
Mary Munford Elementary                     11%       90%    91%
Miles Jones Elementary                      70%       61%    70%
Oak Grove/Bellemeade Elementary             75%       41%    56%
Overby-Sheppard Elementary                  68%       48%    62%
Patrick Henry School Of Science And Arts    31%       67%    65%
Southampton Elementary                      54%       73%    75%
Swansboro Elementary                        69%       52%    51%
Westover Hills Elementary                   64%       53%    68%
William Fox Elementary                      16%       85%    82%
Woodville Elementary                        79%       29%    30%

The data do raise some questions:

  • Where is VDOE?  Where is their study that explains the over- and under- and mediocre-performance of these schools?  What are they doing to transmit that information to the other schools?
  • Carver and Fairfield and Cary (in math) are doing something right (or cheating extravagantly); what is it and why are the other schools not doing it?
  • Patrick Henry has absorbed a lot of money and energy but is not getting results.  What is wrong there?
  • Where are the Woodville parents?  Why are they not at the School Board every meeting to demand that RPS stop abusing their kids?
  • Where is VCU?  To date, their major “contributions” have been a study to validate the VGLA that, upon examination, is a whitewash and the hiring of Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership.”  Perhaps they could do something constructive for a change.