Gains and Not So Much, II

Delving further into Jim Weigand’s report that his Super is bragging on the year-to-year SOL performance in Lynchburg: The problem is the absence of data on the new retesting regime, which raised pass rates statewide by about 4% in 2015.  So let’s look at the 2014-2015 changes in pass rates v. the state average, which at least discounts the Lynchburg score gains by the statewide equal opportunity for grade inflation.
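To make that discounting concrete, here is a minimal sketch of the arithmetic; the pass rates below are placeholders, not the actual Lynchburg or state figures.

```python
# Sketch: discounting a division's 2014-to-2015 pass-rate gain by the statewide gain.
# All numbers here are placeholders for illustration, not actual Lynchburg or state figures.
division_2014, division_2015 = 70.0, 75.0   # hypothetical division pass rates (%)
state_2014, state_2015 = 72.0, 75.9         # hypothetical statewide pass rates (%)

division_gain = division_2015 - division_2014   # 5.0 points
state_gain = state_2015 - state_2014            # 3.9 points
gain_vs_state = division_gain - state_gain      # +1.1 points relative to the statewide change

print(f"Gain relative to the state average: {gain_vs_state:+.1f} points")
```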

Here, for a start, are the five-subject pass rate changes by race and economic disadvantage (“Y” = economically disadvantaged as reported in the VDOE database; “N” = not).

image

On the five-subject average, the Super gets bragging rights for the non–ED white kids, with a 1.4% score gain over the state average; otherwise he has some explaining to do, especially as to the ED white students, who improved by 3.9% less than the state average.

On the reading tests, the opportunity for bragging rights doubles with the addition of the black, economically disadvantaged students.  But the score gain by the white, ED population lags by 2.5% and the black, not-ED by 2.9%.

image

Finally, on the math tests, the Lynchburg Super gets full bragging rights, especially as to the black, ED students.

image

Looks to me like Lynchburg needs to figure out what its math teachers are doing and spread some of that to the teaching of the other four subjects.

Gains and Not So Much

Jim Weigand of Lynchburg reports that his Super is bragging on the year-to-year SOL performance there.  Jim points out that we don’t know how much of that can be attributed to the new retesting regime that raised the grade 3-8 pass rate by about 4%.

I’ve asked Chuck Pyle whether retest data are available.  While waiting for his answer, I thought I’d take a glance at the raw year-to-year changes.

For a start, here are the five-subject average division pass rate changes from 2014 to ‘15:

image

Note that Excel can fit only so much information on the axis and, thus, gives names only for alternate divisions.  For example, Richmond is the unnamed yellow bar between Rappahannock and Richmond County.

The green bar is Lynchburg; the red bars are, from the top, Hampton, Newport News, and Norfolk.  The average gain is 3.9%.

Here are the same data, sorted by change in pass rate:

image

From this direction, Lynchburg looks pretty good, albeit we don’t know whether they had an unusual retest rate.

For completeness, here are the Reading and Math score increases (reading average = 5.4%, math, 6.3%):

image

image

On that last graph, Highland is off the scale at 27.8%.

I’ll bet you a #2 lead pencil that the State Department of Data Suppression does not have the retest data, which will leave us not knowing how much any of these changes represents actual improvement in the pass rate.

Lousy Schools, Not a Shortage of Money

I earlier showed that Richmond’s dismal pass rates on the reading and math SOLs are not explained by Richmond’s expense per student or its large population of economically disadvantaged students.  Indeed, even among the twenty-eight divisions with similar or higher populations of economically disadvantaged students, Richmond underperforms grossly.

It’s arctic outside this morning and I’m in no hurry to vacuum the house, so I thought I’d take a further look at the data.  The VDOE SOL database is glad to deliver the pass rates of the students who are not economically disadvantaged.  Here, then, are the five-subject division average pass rates of the non-ED student populations v. the 2014 division disbursements per student (VDOE won’t post the 2015 data until sometime this Spring), with disbursements for adult education, facilities, debt service, and contingency funds not included.

image

Richmond is the gold square; the red diamonds, from the left, are peer cities Hampton, Newport News, and Norfolk.

Excel was glad to fit a least-squares line to the data.  The slope suggests that increasing disbursements decreases the pass rates, but the R² shows that pass rates and disbursements are essentially uncorrelated.
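For anyone who wants to reproduce that fit outside Excel, here is a minimal sketch; the file name and column names are hypothetical stand-ins for an export of the VDOE disbursement and pass-rate table.

```python
# Sketch: least-squares fit of non-ED pass rate vs. 2014 disbursement per student.
# "divisions.csv" and its column names are hypothetical; substitute your own export of the VDOE data.
import pandas as pd
from scipy import stats

df = pd.read_csv("divisions.csv")
x = df["disbursement_per_student"]   # dollars per student, 2014
y = df["non_ed_pass_rate"]           # five-subject pass rate, non-ED students (%)

fit = stats.linregress(x, y)
print(f"slope: {fit.slope:.5f} pass-rate points per dollar")
print(f"R^2:   {fit.rvalue ** 2:.3f}")   # an R^2 near zero means essentially uncorrelated
```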

One might think that a very large population of economically disadvantaged students would weigh on the performance of the less disadvantaged.  To look at that, here are the same data for the twenty-eight divisions with the highest populations of economically disadvantaged students.

image

The red diamonds are Newport News and Norfolk. 

As with the general populations, Richmond is spending a lot of money and getting lousy results.

Note that all but two of these twenty-eight divisions have larger ED populations than Richmond:

image

Indeed, Bristol, Colonial Beach, and Roanoke all managed pass rates >90% with ED percentages higher than Richmond’s.

As I’ve said, Richmond’s problem is lousy schools, not poverty or lack of money.

Here are the data:

image

Data Storm at SCHEV

Looking again at the VDOE data on post-graduation outcomes, I noticed that their numbers come from SCHEV.

SCHEV turns out to have a blizzard of data.

As a first look, here are some of the completion rates at Virginia’s 4-year colleges and universities.  These data are for “all, first-time and transfer” students entering college in 2009 who completed their studies at any school.  For all students, the four-year rate is 58%; the five-year rate, 74.7%.  The data show women and “majority” (I think that means “white”) students outperforming the average, while males and students of color underperform.

image

Looking at the all-student data for the research universities, we see VCU underperforming considerably.

image

Of course, VCU is the school that hired Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership” (link now down; we can hope that means they are rid of her), so we can infer that academic rigor is not what is holding down their graduation rate.

Turning from the research universities, we see JMU and Longwood leading the pack with VSU and Norfolk State trailing.

image

As a further hint at the depth of data available on this site, here are the VCU and UVa data by sex and race (“SOC” = Students of Color; their term, not mine).

image

image

Stay tuned while I go dig for some Richmond numbers.

Poor Richmond

Jim Bacon points out a study from WalletHub that ranks “cities” by average credit scores of their residents.

We need to be a little bit careful here.  Virginia has a very specific definition of “city.”  The WalletHub data come from some other definition.  So we see WalletHub calling Glen Allen a “city,” while Virginia views it as part of Henrico; and we see Virginia calling Franklin City a “city,” while WalletHub apparently sees only Franklin County, not the city.

Reducing the universe to only those cities that appear on both lists, we see in the WalletHub data:

Rank  Percentile City  Credit
875 66 Alexandria, VA  687.1
1541 40 Bristol, VA  663.36
669 74 Charlottesville, VA  695.86
1563 39 Chesapeake, VA  662.6
2087 19 Danville, VA  644.31
737 71 Falls Church, VA  692.52
1367 47 Fredericksburg, VA  669.62
2179 15 Hampton, VA  640.3
1727 33 Harrisonburg, VA  656.72
2471 4 Hopewell, VA  621.44
1476 43 Lynchburg, VA  665.73
1506 41 Manassas, VA  665
2077 19 Martinsville, VA  644.75
2383 7 Newport News, VA  628.88
2454 5 Norfolk, VA  622.92
2544 1 Petersburg, VA  599.25
2486 3 Portsmouth, VA  618.69
2217 14 Richmond, VA  638.71
1404 45 Roanoke, VA  668.1
816 68 Salem, VA  689.42
1203 53 Staunton, VA  674.62
1837 29 Suffolk, VA  653.13
1392 46 Virginia Beach, VA  668.61
1420 45 Waynesboro, VA  667.68
1401 45 Winchester, VA  668.12

Here “Rank” is the average credit score rank among US cities, “Percentile” is the percentile ranking of those scores, and “Credit” is the average credit score.  Petersburg, at “1,” is in the lowest percentile.
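As a side note, the “Percentile” column appears to follow directly from the “Rank” column if one assumes roughly 2,570 ranked cities; that total is my inference from the table, not a WalletHub figure.  A minimal sketch of that (assumed) relationship:

```python
# Sketch: reproducing WalletHub's "Percentile" column from its "Rank" column.
# TOTAL_CITIES is inferred from the table itself (it makes the rounding work out), not published by WalletHub.
TOTAL_CITIES = 2572

def percentile_from_rank(rank: int, total: int = TOTAL_CITIES) -> int:
    """Percent of ranked cities with a worse (lower) average credit score; rank 1 is best."""
    return round(100 * (total - rank) / total)

for city, rank in [("Alexandria", 875), ("Richmond", 2217), ("Petersburg", 2544)]:
    print(city, percentile_from_rank(rank))   # 66, 14, 1 -- matching the table
```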

A graph will let us see how those credit scores align with SOL pass rates.  Here are the 2015 reading pass rates.

image

Unsurprisingly, the pass rates decrease with decreasing average credit scores, albeit the correlation tells us that some other factor or factors have a much larger effect than credit scores.

On the graph, Richmond is the gold square; the red diamonds are, from the left, Norfolk, Newport News, and Hampton. 

To the same effect, here are the math scores.

image

A better correlation but Richmond again is underperforming most of its peers.

Finally, with a hat tip to Jim Weigand, here are the five-subject averages.

image

If it were not for Petersburg,  Martinsville, and Danville, Richmond would be the sole underperformer among the poorer cities; as it is, we are just a gross underperformer. 

Even so, as we have seen before, poverty is not the reason for Richmond’s awful schools.

Here are the data:

City  Credit Reading Writing History & SS Math Science 5-Subjects
Alexandria  687.1 71% 70% 77% 69% 68% 71%
Bristol  663.36 77% 76% 88% 81% 79% 80%
Charlottesville  695.86 77% 72% 80% 77% 74% 76%
Chesapeake  662.6 81% 83% 91% 85% 87% 86%
Danville  644.31 66% 66% 74% 65% 68% 68%
Falls Church  692.52 92% 93% 96% 90% 90% 92%
Fredericksburg  669.62 76% 75% 80% 75% 77% 77%
Hampton  640.3 72% 67% 84% 74% 74% 74%
Harrisonburg  656.72 67% 67% 81% 76% 79% 74%
Hopewell  621.44 65% 60% 78% 72% 72% 69%
Lynchburg  665.73 67% 65% 80% 64% 68% 69%
Manassas  665 72% 72% 80% 78% 74% 75%
Martinsville  644.75 62% 65% 67% 59% 58% 62%
Newport News  628.88 68% 69% 81% 71% 74% 73%
Norfolk  622.92 67% 69% 80% 72% 74% 72%
Petersburg  599.25 58% 49% 72% 57% 66% 60%
Portsmouth  618.69 72% 67% 85% 73% 75% 74%
Richmond  638.71 59% 48% 72% 62% 66% 61%
Roanoke  668.1 72% 71% 82% 78% 77% 76%
Salem  689.42 86% 82% 92% 88% 91% 87%
Staunton  674.62 74% 72% 78% 73% 79% 75%
Suffolk  653.13 74% 71% 84% 75% 78% 76%
Virginia Beach  668.61 83% 80% 88% 84% 85% 84%
Waynesboro  667.68 71% 68% 76% 69% 71% 71%
Winchester  668.12 72% 73% 85% 74% 72% 75%
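To check the correlation claims above, here is a minimal sketch (assuming scipy is available) that fits least-squares lines of the reading, math, and five-subject pass rates against the average credit scores from the table; I have not re-transcribed the writing, history, or science columns.

```python
# Sketch: correlation of WalletHub average credit scores with 2015 SOL pass rates (data from the table above).
from scipy import stats

credit = [687.1, 663.36, 695.86, 662.6, 644.31, 692.52, 669.62, 640.3, 656.72, 621.44, 665.73, 665.0,
          644.75, 628.88, 622.92, 599.25, 618.69, 638.71, 668.1, 689.42, 674.62, 653.13, 668.61, 667.68, 668.12]
reading = [71, 77, 77, 81, 66, 92, 76, 72, 67, 65, 67, 72, 62, 68, 67, 58, 72, 59, 72, 86, 74, 74, 83, 71, 72]
math_rate = [69, 81, 77, 85, 65, 90, 75, 74, 76, 72, 64, 78, 59, 71, 72, 57, 73, 62, 78, 88, 73, 75, 84, 69, 74]
five_subject = [71, 80, 76, 86, 68, 92, 77, 74, 74, 69, 69, 75, 62, 73, 72, 60, 74, 61, 76, 87, 75, 76, 84, 71, 75]

for name, rates in [("reading", reading), ("math", math_rate), ("five-subject", five_subject)]:
    fit = stats.linregress(credit, rates)
    print(f"{name}: slope = {fit.slope:.2f} points per credit-score point, R^2 = {fit.rvalue ** 2:.2f}")
```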

Your tax dollars at “work.”

Disorder Disaster

The VDOE Web site tells us:

The Code of Virginia requires school divisions to submit data to VDOE on incidents of discipline, crime and violence (DCV). Current reports contain selected comparisons to prior years. DCV data are used also to complete federal reports required by the Gun-Free Schools Act of 1994 (GFSA) and the Individuals with Disabilities Education Act (IDEA). GFSA requires annual reporting of the number of students suspended or expelled statewide for possessing or bringing firearms on school property. IDEA contains requirements for reporting disciplinary actions involving students with disabilities.

VDOE reports those data (up through 2015) in a database, accessible on the Web.  A partial explanation of the data is here.

As to Richmond, those data are (pick your adjective) appalling/bad/calamitous/disheartening/egregious/frightful/ghastly/horrific/ignominious/lamentable/miserable/nightmarish/outrageous/pathetic.

The overall Richmond rate of individual student offenders as a percentage of the enrollment is more than twice the state rate.

image

The mainstream high schools look to be driving the Richmond rate.  Note the different scales on the ordinates.

Armstrong:  image

Wythe: image

Huguenot:  image

Marshall:  image

And, to a lesser extent, TJ:  image

The middle schools show a mixed picture, mostly of disorder:

A.P. Hill:  image

Binford:  image

Elkhardt:  image

Thompson:  image

Henderson:  image

Brown:  image

MLK:  image

And Boushall:  image

Given that these data reflect only the behavior that is reported, one can wonder what the underlying rates of unreported disorder must be.

Among the elementary schools, Fairfield Court shows a jump that precedes its recent bump in SOL scores.

image

image

Carver, in contrast, shows the opposite pattern.

image

image

Just for contrast, here are Cary

image

Fox  image

and Munford.  image

Teacher Evaluation: Fecklessness at the Board of “Education”

On April 28, 2011, the Board of Education adopted revised Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers.  The Guidelines provided for teacher evaluations in seven respects:

Performance Standard 1: Professional Knowledge
Performance Standard 2: Instructional Planning
Performance Standard 3: Instructional Delivery
Performance Standard 4: Assessment of/for Student Learning
Performance Standard 5: Learning Environment
Performance Standard 6: Professionalism
Performance Standard 7: Student Academic Progress

Just from this list, we can see that the Board was focused on process, not results.  If chefs were rated on a similar scale, six parts of the rating would deal with cleanliness of the kitchen, skill in chopping vegetables, chefly demeanor, and the like, with only one item in seven related to the quality of the cooking.

It gets worse.

The measures of “student academic progress” in Standard 7 are:

• Sets acceptable, measurable, and appropriate achievement goals for student learning progress based on baseline data.
• Documents the progress of each student throughout the year.
• Provides evidence that achievement goals have been met, including the state-provided growth measure when available as well as other measures of academic progress.
• Uses available performance outcome data to continually document and communicate student progress and develop interim learning targets.

Nowhere in “sets . . . goals,” “documents . . . progress,” “provides evidence,” or “uses . . . data” do the guidelines say that the teacher shall be evaluated based on how much the students learn.  In the kitchen analogy, the chef’s cooking is to be measured by goals, progress, evidence, and data, not by the taste and presentation of the food.

Apparently the General Assembly noticed this gross departure from the Board’s duty to “supervis[e] the public schools.”  Chapter 588 of the 2013 Acts of Assembly includes the following amendments to Code §22.1-253.13:5.B:

Consistent with the finding that leadership is essential for the advancement of public education in the Commonwealth, teacher, [administrator] principal, and superintendent evaluations shall be consistent with the performance [objectives] standards included in the Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers, [Administrators] Principals, and Superintendents. Evaluations shall include student academic progress as a significant component and an overall summative rating. Teacher evaluations shall include regular observation and evidence that instruction is aligned with the school’s curriculum. Evaluations shall include identification of areas of individual strengths and weaknesses and recommendations for appropriate professional activities. (stricken text in brackets; emphasis supplied)

Supposedly responding to that mandate, on July 23, 2015 the Board amended the Guidelines.  

The amended Guidelines contain the same seven standards.  Standard 7 gets amended only to replace “growth measure” with “progress data,” to reflect the Board’s abandonment of the rigorous SGPs for the new, not-much-penalty-for-failure, progress tables.

7.3 Provides evidence that achievement goals have been met, including the state-provided [growth measure] progress data when available as well as [other] multiple measures of student academic progress. (stricken text in brackets)

Even then, the teacher is not to be evaluated on achieving progress, but only for “provid[ing] evidence.” 

If that were not weak enough, the operative provision (Guidelines at 4) says:

The Code of Virginia requires that school boards’ procedures for evaluating teachers address student academic progress; how this requirement is met is the responsibility of local school boards. As prescribed by the Code of Virginia, each teacher must receive a summative evaluation rating. The Board’s Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers recommend weighting the first six standards equally at 10 percent each and that the seventh standard, student academic progress, account for 40 percent of the summative evaluation.

That provision renders the Guidelines both stupid and unlawful.

Stupid: The Guidelines recommend that the seventh standard account for 40% of the evaluation.  Yet Code §22.1-253.13:1.A tells us that

The General Assembly and the Board of Education believe that the fundamental goal of the public schools of the Commonwealth must be to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.

So the Board of “Education” says that 60% (actually much more, given the fluff in the seventh Standard) of the evaluation should turn on things other than how much the students learn.

Unlawful: The statute, quoted above, requires that teacher evaluations be “consistent” with the standards in the Guidelines.  Yet the Guidelines themselves tell us that they are mere recommendations and that the local school boards get to decide what is “consistent.”  So, in fact we get up to 132 different sets of guidelines.

 

Why do you suppose the Board of “Education” is so dedicated to serving incompetent teachers instead of the students whose parents are paying those teachers?

Lake Woebegon of Teachers??

Browsing through the VDOE Web pages, one finds the Teacher and Principal Evaluation Collection Results.

The only data there are for 2011.  Even that limited dataset, however, is sufficient to demonstrate that the “evaluation” process was ridiculous, if not  fraudulent.

The 2011 data show that all forty-six Richmond principals were “satisfactory.”  All our principals, it seems, were at or above average.  Never mind that Richmond’s reading SOL pass rate that year was 1.6 standard deviations below the division mean and its math score was 2.0 standard deviations low.  (Richmond is the gold square on the graphs.)

image

image
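For the record, “standard deviations below the division mean” is an ordinary z-score.  A minimal sketch of the calculation, with placeholder numbers rather than the actual 2011 division distribution:

```python
# Sketch: how far below the division mean a pass rate sits, measured in standard deviations (a z-score).
# The pass rates below are placeholders; the post's 1.6 and 2.0 figures come from the 2011 division data.
import statistics

division_pass_rates = [88, 91, 85, 79, 92, 83, 76, 90, 87, 81]   # hypothetical division pass rates (%)
richmond = 70.0                                                   # hypothetical Richmond pass rate (%)

mean = statistics.mean(division_pass_rates)
sd = statistics.stdev(division_pass_rates)        # sample standard deviation
z = (richmond - mean) / sd

print(f"Richmond is {abs(z):.1f} standard deviations {'below' if z < 0 else 'above'} the division mean")
```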

The teacher data were more nuanced but similarly ridiculous.  Here is a summary.

Key: EE = Exceeds Expectations; ME = Meets Expectations; NI = Needs Improvement; U = Unsatisfactory

Evaluation item                                       EE     ME    NI   U   Total
Classroom Management/Positive Learning Environment   317    698   20   2   1037
Communication Skills                                  437    598    2   0   1037
Evaluation and Assessment                             208    826    3   0   1037
Implements and Manages Instruction                    273    754   10   0   1037
Knowledge of Subject                                  479    555    3   0   1037
Planning Activities                                   240    787    9   1   1037
Professional Responsibilities                         302    733    2   0   1037
Total                                                2256   4951   49   3   7259

So we see that three of 7,259 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  That is, the report says that only 0.72% of the items in Richmond teachers’ evaluations showed some aspect of failure to meet or exceed expectations in 2011.

That is absurd in the abstract; in light of the available data it is baldly mendacious.

You may recall that the SGP data (that Brian Davison had to pry from VDOE with a lawsuit) can measure teacher performance.  Unlike the SOL itself, the SGP data are not correlated with economic advantage or disadvantage.  So the “poor students” excuse doesn’t work as to SGP.

We have SGP data for the following year, 2012.  Here, with caveats, are the reading data, starting with the distribution of teacher average SGPs (i.e., the average, by teacher, of the students’ SGPs).

image

The orange line is a Gaussian distribution fitted to the data: Mean = 45.0; standard deviation = 10.8.
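For anyone curious how such a fitted curve is produced, here is a minimal sketch; the file name and column name are hypothetical stand-ins for an export of the teacher-average SGPs.

```python
# Sketch: histogram of teacher-average SGPs with a fitted normal (Gaussian) curve overlaid.
# "teacher_avg_sgp.csv" and its column name are hypothetical; substitute your own export.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from scipy.stats import norm

sgps = pd.read_csv("teacher_avg_sgp.csv")["avg_reading_sgp"].dropna()

mu, sigma = sgps.mean(), sgps.std()               # the post reports ~45.0 and ~10.8 statewide for reading
counts, bin_edges, _ = plt.hist(sgps, bins=20, alpha=0.5, label="teacher averages")

# Scale the normal density to counts: density x number of teachers x bin width.
x = np.linspace(bin_edges[0], bin_edges[-1], 200)
bin_width = bin_edges[1] - bin_edges[0]
plt.plot(x, norm.pdf(x, mu, sigma) * len(sgps) * bin_width,
         label=f"normal fit: mean = {mu:.1f}, sd = {sigma:.1f}")

plt.xlabel("Teacher average SGP")
plt.ylabel("Number of teachers")
plt.legend()
plt.show()
```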

Then here is the distribution of Richmond reading teachers’ average SGPs.

image

Note the absence of very high-performing teachers and the plethora of low performers in Richmond.  One hundred nine of 205 Richmond reading teachers (53.2% v. 50% in a normal distribution) are below the state mean; sixteen (7.8% v. 2.5% in a normal distribution) are more than two standard deviations below, and fifty-two (25.4% v. 16% in a normal distribution) are more than one standard deviation below, the state mean.
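Those 50%, 2.5%, and 16% benchmarks are just the (rounded) tail probabilities of a normal distribution.  Here is a minimal sketch that checks the Richmond reading counts against them, using the figures quoted above.

```python
# Sketch: Richmond's share of low-SGP reading teachers vs. what a normal distribution predicts.
from scipy.stats import norm

n_richmond = 205
below_mean, below_1sd, below_2sd = 109, 52, 16    # Richmond reading-teacher counts, from the post

expected = {
    "below the state mean": norm.cdf(0),    # 50.0%
    "more than 1 SD below": norm.cdf(-1),   # ~15.9%
    "more than 2 SD below": norm.cdf(-2),   # ~2.3%
}
observed = {
    "below the state mean": below_mean / n_richmond,
    "more than 1 SD below": below_1sd / n_richmond,
    "more than 2 SD below": below_2sd / n_richmond,
}

for bucket in expected:
    print(f"{bucket}: Richmond {observed[bucket]:.1%} vs. normal {expected[bucket]:.1%}")
```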

For math, the state distribution has a mean of 46.8 and a standard deviation of 14.6.

image

In contrast to the reading data, Richmond has some outstanding math teachers but their numbers are outweighed by underperforming teachers.

image

Indeed, 111 of 193 Richmond math teachers (57.5%) are below the state mean; six (3.1%) are more than two standard deviations and thirty-seven (19.2%) are more than one standard deviation below the state mean.

Yet, according to the evaluations from the previous year, Richmond’s teachers were just fine, thank you, in 99.3% of all measures.

Just as a reminder, the effect of a good or bad teacher can be dramatic.  Here for instance are the students’ 2014 reading SGPs for Richmond teacher #66858 (anonymized in the VDOE database).

image

And here, in contrast, are the students’ SGPs for teacher #68809.

image

Unfortunately, we have too many who are more like #68809 than #66858.

Richmond’s subpar teacher performance has only worsened recently, as reflected in the deteriorating SOL performance: For 2015, we are the sixth worst division in the state in math, second worst in reading.

Of course, our principals and Superintendent have the power (and duty) to remedy these glaring defects.  Inadequate teacher performance that is not corrected reflects inadequate principal performance; inadequate principal performance that is not corrected reflects inadequate Superintendent performance.  We need to hold these public employees accountable when they don’t deal with the teachers who are harming Richmond’s schoolchildren.

But, then, it’s difficult to impose accountability when VDOE is hiding the relevant data.

Your tax dollars at work.

 

PS: You’ll notice that the internal “evaluations” deal with inputs, which are easy to measure (and to fudge), while the important quantities are outputs (how much our students learn), which are hard to measure.  VDOE is making it harder to know about outputs by abandoning the SGP and installing “progress tables” that measure progress but mostly ignore the lack of it.  Even so, we’ll have some measure of outputs, albeit VDOE doubtless will not release the data.

Seems to me City Council should demand, as a condition of all the money they are spending on RPS, an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).

 

I’ll save VDOE’s feckless “improvements” in the evaluation process for another post.

Graduating (and Not)

The RT-D this morning reports that Virginia’s on-time graduation rate of 90.5% “tops” the national average of 82%.

The RT-D is mixing apples and pomegranates.  They are comparing national cohort data for 2014 with Virginia “on-time” data from 2015.

The Virginia “on-time” rate is a fiction, generated by VDOE to  inflate the actual rate.  The actual 4-year cohort Virginia rate in 2015 was 86.7%.

Even so, that’s Virginia.  This is Richmond.  The (awful) Richmond rate actually dropped this year.

image

The 2015 cohort also had 167 dropouts in Richmond, 11.8% of the cohort. 

The enrollment pattern by grade gives a more nuanced picture of the huge numbers of students Richmond loses to dropouts and to families who move to the much better schools in the Counties.

image

Westover Hills Elementary: Glass Half Full?

The Winter edition of the Forest Hill Flyer (not yet posted to the Association Web site) had an interesting piece regarding the new (since 2011) Principal at Westover Hills, Virginia Loving, and her project to “make it a neighborhood school.” 

I’ll leave it to someone with more direct knowledge to assess the other effects of Ms. Loving’s outreach; I turn to the results on the VDOE Web site from the statewide testing program under the SOLs.

First, I should note that Principal Loving came in at a particularly difficult time: VDOE promulgated a new, tougher set of math tests in 2012 and reading tests in 2013 that clobbered the pass rates statewide (data here for all tested grades). 

image

Unfortunately, our former Superintendent did not prepare the Division for the new tests.  The lack of preparation exacerbated the score drop here.

image

You might also recall that Richmond’s elementary schools on average perform ten points or more below the state average (but stratospherically above our middle schools).

image

That said, the SOL performance at Westover Hills has been decidedly mixed.

On the math tests, WH was scoring near the (unacceptable) Richmond average before 2012.  The new tests hit even harder at WH than at the district average.  But WH recovered more quickly than the Richmond average (data for grades 3-5).

image

That is, under Principal Loving, WH took an unusually big hit from the new math tests but since has been showing signs of improvement.

I’ve included the data for Patrick Henry, which is nearby and might be viewed as a neighborhood school.  PH was hit even harder by the new tests and recovered only to about the (awful) Richmond average.

The reading scores reveal a more troubled situation.  WH again was performing at about the Richmond average.  After the new test plunge in 2013, Richmond scores improved but the WH scores continued to decline.  This  is not good news for the school or for the children who attend it.  (Again, data for grades 3-5).

image

Patrick Henry started higher and dropped to the state average, but then it, too, continued to drop.

For what they may communicate, here are the combined (reading + math) pass rates.

image

Seems to me the neighborhood outreach could be more effective if the teaching, especially of reading, were improved. 

Of course, VDOE has been obtaining SGP data that would tell us which of the Westover Hills teachers are, or are not, effective, so Principal Loving (and the neighborhood) would have data that directly measure teacher performance.  Unfortunately, VDOE is concealing the data they already have and now is abandoning the SGP entirely.  This is fully consistent with VDOE’s actual function, which is to be the Department of Data Suppression.