Accreditation and Not

VDOE has updated its 2015-16 Accreditation data.

I earlier discussed their byzantine, opaque process for accrediting Virginia’s public schools.

Well, some of those schools.  You won’t find a rating for Maggie Walker, for instance, because VDOE counts the scores of the MW students at high schools they don’t attend.  And that just scratches the surface of the “adjustments” that boosted the accreditation scores this year by 6.1%.

This year they made it even easier to avoid “Accreditation Denied” by relabeling some denied schools as “Partially Accredited: Approaching Benchmark-Pass Rate,” “Partially Accredited: Improving School-Pass Rate,” or “Partially Accredited: Reconstituted School,” among others.

Adjustments or not, relabeling or not, VDOE could not entirely hide Richmond’s second-from-last performance on the reading tests or its sixth-from-last on the math tests.  The initial accreditation results showed Richmond with 37.8% of schools accredited, v. 77.6% statewide and, more to the point here, with 15.6% “To Be Determined.”  VDOE now has acted on five of the seven Richmond TBDs, which bumps the Accreditation Denied rate from 4.4% to 8.9% and the Reconstituted rate from zero to 6.7%.  Here are the new data:

image

image

image

Note that one of the new schools is Elkhardt/Thompson; the relabeling converts Thompson’s earlier “Denied” rating into “New School.”  All told, 53% of the Richmond schools were warned or denied accreditation this year, with two schools still TBD and another failed middle school, Thompson, hiding in the definitional weeds.

The keel of this sinking ship is the middle schools: King denied; Hill improving; Binford, Brown, and Henderson reconstituted; Elkhardt/Thompson new and camouflaging the denied Thompson.  Only Franklin, which includes both middle and high school grades, is fully accredited.

Data are here.

Gains and Not So Much, IV

We have seen that Richmond has fallen far behind on the SOL pass rates:

image

For example, see the data here

image

and here.

image

image

More recently, I noticed that divisions with high pass rates have little room for improvement while those with lower pass rates are shooting at larger targets.  The VDOE (bogus) accreditation process seems to recognize this.  I suggested that a better measure of progress is the overall pass rate change divided by the previous year’s failure rate, which measures the decrease in the failure rate.
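
In symbols (my notation, not VDOE’s), with Pt the overall pass rate in year t, that measure is:

$$\text{relative gain} = \frac{P_t - P_{t-1}}{100 - P_{t-1}}$$

That is, the year-to-year change in the pass rate, taken as a fraction of the students who failed the year before.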

Added Note: Jim Weigand reminds me that the 2015 pass rates are boosted by about 4% by the new retest policy, albeit the Virginia Department of Data Suppression is not posting anything that would let us evaluate that effect on any particular division or school.

Here are those data for Richmond, broken out by race and economic disadvantage (“Y” indicates economically disadvantaged, “N” = not).  Let’s start with Reading.

image

image

image

Then Math:

image

image

image

Writing.  Note: No data for Richmond’s white, economically disadvantaged kids:

image

image

image

History & Social Science:

image

image

image

Science:

image

image

image

Finally, the five-subject averages:

image

image

image

These data suggest that Richmond is doing a poorer job with its students who are not economically disadvantaged and that it is focusing on reading and math to the detriment of writing, history & social science, and science.

Gains and Not So Much, III

Brian Davison suggests that the increased number of retakes benefits the divisions with low pass rates, i.e., with lots of students who might be eligible for retakes.  But, since the State Department of Data Suppression does not tell us about retakes, we can’t know about that.

I’ll suggest there is a more subtle problem: If a division with a 90% overall pass rate increases that rate by one point, it has cut its failure rate by 10% (one point out of ten).  In contrast, a division with a 50% pass rate that gains one point still leaves 49% of its students failing; it has cut its failure rate by only 2% (one point out of fifty).  To achieve a result equivalent to the first division’s, the second must raise its overall pass rate by five points.  But then, it is shooting at a larger target: A division with a high pass rate has little room for improvement; a division with a low pass rate has plenty of room for (and needs to make lots of) improvement.

The estimable Carol Wolf suggests that I use a simpler analogy: If your pass rate is 50%, you get fifty shots per hundred kids at improving it; if the pass rate is 90%, you get only ten.

That is, a fairer measure of progress is the overall pass rate change divided by the percentage of students who failed to pass the previous year.
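
For the arithmetically inclined, here is that measure as a minimal Python sketch (the pass rates are the hypothetical ones from the 90%/50% example above, not VDOE data):

```python
def failure_rate_reduction(prev_pass, new_pass):
    """Change in the pass rate as a fraction of last year's failure rate."""
    return (new_pass - prev_pass) / (100.0 - prev_pass)

# The hypothetical divisions from the example above:
print(failure_rate_reduction(90, 91))  # 0.10: a one-point gain cuts the failures by 10%
print(failure_rate_reduction(50, 51))  # 0.02: the same one-point gain cuts them by only 2%
print(failure_rate_reduction(50, 55))  # 0.10: it takes five points to match the first division
```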

Here are those data for Lynchburg:

image

image

image

Gains and Not So Much, II

Delving further into Jim Weigand’s report that his Super is bragging on the year-to-year SOL performance in Lynchburg: The problem is the absence of data on the retakes under the new retest policy that raised pass rates by about 4% in 2015.  So let’s look at the 2014-to-2015 changes in pass rates v. the state average, which at least discounts the Lynchburg gains by the statewide, equal-opportunity grade inflation.
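
That discounting is nothing fancier than a subtraction; a minimal sketch (the numbers are invented for illustration, not Lynchburg’s):

```python
def gain_vs_state(division_change, state_change):
    """Year-to-year pass rate change, discounted by the statewide change."""
    return division_change - state_change

# Invented numbers: a division that gained 5.3 points while the state gained 3.9
print(round(gain_vs_state(5.3, 3.9), 1))  # 1.4 points better than the state average
```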

Here, for a start, are the five-subject pass rate changes by race and economic disadvantage (“Y” = economically disadvantaged as reported in the VDOE database; “N” = not).

image

On the five-subject average, the Super gets bragging rights for the non-ED white kids, with a 1.4% score gain over the state average; otherwise he has some explaining to do, especially as to the ED white students, who improved by 3.9% less than the state average.

On the reading tests, the opportunity for bragging rights doubles with the addition of the black, economically disadvantaged students.  But the score gain by the white, ED population lags by 2.5% and the black, not-ED by 2.9%.

image

Finally, on the math tests, the Lynchburg Super gets full bragging rights, especially as to the black, ED students.

image

Looks to me like Lynchburg needs to figure out what its math teachers are doing and spread some of that to the teaching of the other four subjects.

Gains and Not So Much

Jim Weigand of Lynchburg reports that his Super is bragging on the year-to-year SOL performance there.  Jim points out that we don’t know how much of that can be attributed to the new retesting regime that raised the grade 3-8 pass rate by about 4%.

I’ve asked Chuck Pyle whether retest data are available.  While waiting for his answer, I thought I’d take a glance at the raw year-to-year changes.

For a start, here are the five-subject average division pass rate changes from 2014 to ‘15:

image

Note that Excel can fit only so many labels on the axis and thus names only alternate divisions.  For example, Richmond is the unnamed yellow bar between Rappahannock and Richmond County.

The green bar is Lynchburg; the red are, from the top, Hampton, Newport News, and Norfolk.  The average gain is 3.9%.

Here are the same data, sorted by change in pass rate:

image

From this direction, Lynchburg looks pretty good, albeit we don’t know whether they had an unusual retest rate.

For completeness, here are the Reading and Math score increases (reading average = 5.4%; math, 6.3%):

image

image

On that last graph, Highland is off the scale at 27.8%.

I’ll bet you a #2 lead pencil that the State Department of Data Suppression does not have the retest data, which will leave us not knowing how much any of these changes represents actual improvement in the pass rate.

Lousy Schools, Not a Shortage of Money

I earlier showed that Richmond’s dismal pass rates on the reading and math SOLs are not explained by Richmond’s expense per student or its large population of economically disadvantaged students.  Indeed, even among the twenty-eight divisions with similar or higher populations of economically disadvantaged students, Richmond underperforms grossly.

It’s arctic outside this morning and I’m in no hurry to vacuum the house, so I thought I’d take a further look at the data.  The VDOE SOL database is glad to deliver the pass rates of the students who are not economically disadvantaged.  Here, then, are the five-subject division average pass rates of the non-ED student populations v. the 2014 division disbursements per student (VDOE won’t post the 2015 data until sometime this Spring), with disbursements for adult education, facilities, debt service, and contingency funds not included.

image

Richmond is the gold square; the red diamonds, from the left, are peer cities Hampton, Newport News, and Norfolk.

Excel was glad to fit a least-squares line to the data.  The slope suggests that increasing disbursements decreases the pass rates, but the R² shows that pass rates and disbursements are essentially uncorrelated.
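
For those without Excel, the same fit takes a few lines of Python; a sketch with made-up numbers standing in for the VDOE columns:

```python
from scipy import stats

# Made-up placeholders for the VDOE columns, not the real data.
disbursements = [9500, 10200, 11100, 12800, 14300]  # 2014 disbursements per student, $
pass_rates = [85.5, 78.0, 74.2, 88.1, 76.9]         # non-ED five-subject averages, %

fit = stats.linregress(disbursements, pass_rates)
print(f"slope = {fit.slope:.5f} points per dollar")  # slightly negative, as in the graph
print(f"R^2 = {fit.rvalue ** 2:.3f}")                # near zero: essentially uncorrelated
```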

One might think that a very large population of economically disadvantaged students would weigh on the performance of the less disadvantaged.  To look at that, here are the same data for the twenty-eight divisions with the highest populations of economically disadvantaged students.

image

The red diamonds are Newport News and Norfolk. 

As with the general populations, Richmond is spending a lot of money and getting lousy results.

Note that all but two of these twenty-eight divisions have larger ED populations than Richmond:

image

Indeed, Bristol, Colonial Beach, and Roanoke all managed pass rates >90% with ED percentages higher than Richmond’s.

As I’ve said, Richmond’s problem is lousy schools, not poverty or lack of money.

Here are the data:

image

Data Storm at SCHEV

Looking again at the VDOE data on post-graduation outcomes, I noticed that their numbers come from SCHEV.

SCHEV turns out to have a blizzard of data.

As a first look, here are some of the completion rates at Virginia’s 4-year colleges and universities.  These data are for “all, first-time and transfer” students entering college in 2009 who completed their studies at any school.  For all students, the four-year rate is 58%; the five-year rate, 74.7%.  The data show women and “majority” (I think that means “white”) students outperforming the average, while males and students of color underperform.

image

Looking at the all student data for the research universities, we see VCU underperforming considerably.

image

Of course, VCU is the school that hired Richmond’s failed Superintendent as an Associate Professor in “Educational Leadership” (link now down; we can hope that means they are rid of her), so we can infer that academic rigor is not what is holding down their graduation rate.

Turning from the research universities, we see JMU and Longwood leading the pack with VSU and Norfolk State trailing.

image

As a further hint at the depth of data available on this site, here are the VCU and UVa data by sex and race (“SOC” = Students of Color; their term, not mine).

image

image

Stay tuned while I go dig for some Richmond numbers.

Poor Richmond

Jim Bacon points out a study from WalletHub that ranks “cities” by average credit scores of their residents.

We need to be a little bit careful here.  Virginia has a very specific definition of “city”; the WalletHub data come from some other definition.  Thus WalletHub calls Glen Allen a “city” while Virginia views it as part of Henrico, and Virginia counts Franklin a city while WalletHub apparently sees only Franklin County.

Reducing the universe to only those cities that appear on both lists, the WalletHub data show:

Rank  Percentile City  Credit
875 66 Alexandria, VA  687.1
1541 40 Bristol, VA  663.36
669 74 Charlottesville, VA  695.86
1563 39 Chesapeake, VA  662.6
2087 19 Danville, VA  644.31
737 71 Falls Church, VA  692.52
1367 47 Fredericksburg, VA  669.62
2179 15 Hampton, VA  640.3
1727 33 Harrisonburg, VA  656.72
2471 4 Hopewell, VA  621.44
1476 43 Lynchburg, VA  665.73
1506 41 Manassas, VA  665
2077 19 Martinsville, VA  644.75
2383 7 Newport News, VA  628.88
2454 5 Norfolk, VA  622.92
2544 1 Petersburg, VA  599.25
2486 3 Portsmouth, VA  618.69
2217 14 Richmond, VA  638.71
1404 45 Roanoke, VA  668.1
816 68 Salem, VA  689.42
1203 53 Staunton, VA  674.62
1837 29 Suffolk, VA  653.13
1392 46 Virginia Beach, VA  668.61
1420 45 Waynesboro, VA  667.68
1401 45 Winchester, VA  668.12

Here “Rank” is the average credit score rank among US cities, “Percentile” is the percentile ranking of those scores, and “Credit” is the average credit score.  Petersburg, at “1,” is in the lowest percentile.

A graph will let us see how those credit scores align with SOL pass rates.  Here are the 2015 reading pass rates.

image

Unsurprisingly, the pass rates decrease with decreasing average credit scores, albeit the correlation tells us that some other factor or factors have a much larger effect than credit scores.

On the graph, Richmond is the gold square; the red diamonds are, from the left, Norfolk, Newport News, and Hampton. 
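
Anyone who wants to check that claim can compute the correlation directly; a quick sketch using a handful of (credit, reading) pairs from the table at the bottom of this post:

```python
from scipy import stats

# A few (credit score, 2015 reading pass rate) pairs from the table below:
# Alexandria, Bristol, Charlottesville, Petersburg, Richmond, Salem
credit = [687.1, 663.36, 695.86, 599.25, 638.71, 689.42]
reading = [71, 77, 77, 58, 59, 86]

r, p = stats.pearsonr(credit, reading)
print(f"r = {r:.2f}, R^2 = {r * r:.2f}")  # positive, but well short of 1: credit scores are not the whole story
```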

To the same effect, here are the math scores.

image

A better correlation, but Richmond again underperforms most of its peers.

Finally, with a hat tip to Jim Weigand, here are the five-subject averages.

image

If it were not for Petersburg, Martinsville, and Danville, Richmond would be the sole underperformer among the poorer cities; as it is, we are just a gross underperformer.

Even so, as we have seen before, poverty is not the reason for Richmond’s awful schools.

Here are the data:

City  Credit Reading Writing History & SS Math Science 5-Subjects
Alexandria  687.1 71% 70% 77% 69% 68% 71%
Bristol  663.36 77% 76% 88% 81% 79% 80%
Charlottesville  695.86 77% 72% 80% 77% 74% 76%
Chesapeake  662.6 81% 83% 91% 85% 87% 86%
Danville  644.31 66% 66% 74% 65% 68% 68%
Falls Church  692.52 92% 93% 96% 90% 90% 92%
Fredericksburg  669.62 76% 75% 80% 75% 77% 77%
Hampton  640.3 72% 67% 84% 74% 74% 74%
Harrisonburg  656.72 67% 67% 81% 76% 79% 74%
Hopewell  621.44 65% 60% 78% 72% 72% 69%
Lynchburg  665.73 67% 65% 80% 64% 68% 69%
Manassas  665 72% 72% 80% 78% 74% 75%
Martinsville  644.75 62% 65% 67% 59% 58% 62%
Newport News  628.88 68% 69% 81% 71% 74% 73%
Norfolk  622.92 67% 69% 80% 72% 74% 72%
Petersburg  599.25 58% 49% 72% 57% 66% 60%
Portsmouth  618.69 72% 67% 85% 73% 75% 74%
Richmond  638.71 59% 48% 72% 62% 66% 61%
Roanoke  668.1 72% 71% 82% 78% 77% 76%
Salem  689.42 86% 82% 92% 88% 91% 87%
Staunton  674.62 74% 72% 78% 73% 79% 75%
Suffolk  653.13 74% 71% 84% 75% 78% 76%
Virginia Beach  668.61 83% 80% 88% 84% 85% 84%
Waynesboro  667.68 71% 68% 76% 69% 71% 71%
Winchester  668.12 72% 73% 85% 74% 72% 75%

Your tax dollars at “work.”

Disorder Disaster

The VDOE Web site tells us:

The Code of Virginia requires school divisions to submit data to VDOE on incidents of discipline, crime and violence (DCV). Current reports contain selected comparisons to prior years. DCV data are used also to complete federal reports required by the Gun-Free Schools Act of 1994 (GFSA) and the Individuals with Disabilities Education Act (IDEA). GFSA requires annual reporting of the number of students suspended or expelled statewide for possessing or bringing firearms on school property. IDEA contains requirements for reporting disciplinary actions involving students with disabilities.

VDOE reports those data (up through 2015) in a database, accessible on the Web.  A partial explanation of the data is here.

As to Richmond, those data are (pick your adjective) appalling/bad/calamitous/disheartening/egregious/frightful/ghastly/horrific/ignominious/lamentable/miserable/nightmarish/outrageous/pathetic.

The overall Richmond rate of individual student offenders as a percentage of the enrollment is more than twice the state rate.

image

The mainstream high schools look to be driving the Richmond rate.  Note the different scales on the ordinates.

Armstrong:  image

Wythe: image

Huguenot:  image

Marshall:  image

And, to a lesser extent, TJ:  image

The middle schools show a mixed picture, mostly of disorder:

A.P. Hill:  image

Binford:  image

Elkhardt:  image

Thompson:  image

Henderson:  image

Brown:  image

MLK:  image

And Boushall:  image

Given that these data reflect only the behavior that is reported, one can wonder what the underlying rates of unreported disorder must be.

Among the elementary schools, Fairfield Court shows a jump that precedes its recent bump in SOL scores.

image

image

Carver, meanwhile, shows the opposite pattern.

image

image

Just for contrast, here are Cary

image

Fox  image

and Munford.  image

Teacher Evaluation: Fecklessness at the Board of “Education”

On April 28, 2011, the Board of Education adopted revised Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers.  The Guidelines provided for teacher evaluations in seven respects:

         Performance Standard 1:  Professional Knowledge
         Performance Standard 2:  Instructional Planning
         Performance Standard 3:  Instructional Delivery
         Performance Standard 4:  Assessment of/for Student Learning
         Performance Standard 5:  Learning Environment
         Performance Standard 6:  Professionalism
         Performance Standard 7:  Student Academic Progress

Just from this list, we can see that the Board was focused on process, not results.  If chefs were rated on a similar scale, six parts of the rating would deal with cleanliness of the kitchen, skill in chopping vegetables, chefly demeanor, and the like, with only one item in seven related to the quality of the cooking.

It gets worse.

The measures of “student academic progress” in Standard 7 are:

• Sets acceptable, measurable, and appropriate achievement goals for student learning progress based on baseline data.
• Documents the progress of each student throughout the year.
• Provides evidence that achievement goals have been met, including the state-provided growth measure when available as well as other measures of academic progress.
• Uses available performance outcome data to continually document and communicate student progress and develop interim learning targets.

Nowhere in “sets . . . goals,” “documents . . . progress,” “provides evidence,” or “uses . . . data” do the guidelines say that the teacher shall be evaluated based on how much the students learn.  In the kitchen analogy, the chef’s cooking is to be measured by goals, progress, evidence, and data, not by the taste and presentation of the food.

Apparently the General Assembly noticed this gross departure from the Board’s duty to “supervis[e] the public schools.”  Chapter 588 of the 2013 Acts of Assembly includes the following amendments to Code §22.1-253.13:5.B (language stricken by the amendment shown in brackets):

Consistent with the finding that leadership is essential for the advancement of public education in the Commonwealth, teacher, [administrator] principal, and superintendent evaluations shall be consistent with the performance [objectives] standards included in the Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers, [Administrators] Principals, and Superintendents. Evaluations shall include student academic progress as a significant component and an overall summative rating. Teacher evaluations shall include regular observation and evidence that instruction is aligned with the school’s curriculum. Evaluations shall include identification of areas of individual strengths and weaknesses and recommendations for appropriate professional activities. (emphasis supplied)

Supposedly responding to that mandate, on July 23, 2015, the Board amended the Guidelines.

The amended Guidelines contain the same seven standards.  Standard 7 gets amended only to replace “growth measure” with “progress data,” reflecting the Board’s abandonment of the rigorous SGPs for the new, not-much-penalty-for-failure progress tables.

7.3  Provides evidence that achievement goals have been met, including the state-provided [growth measure] progress data when available as well as [other] multiple measures of student academic progress.

Even then, the teacher is not to be evaluated on achieving progress, but only for “provid[ing] evidence.” 

If that were not weak enough, the operative provision (Guidelines at 4) says:

The Code of Virginia requires that school boards’ procedures for evaluating teachers address student academic progress; how this requirement is met is the responsibility of local school boards. As prescribed by the Code of Virginia, each teacher must receive a summative evaluation rating. The Board’s Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers recommend weighting the first six standards equally at 10 percent each and that the seventh standard, student academic progress, account for 40 percent of the summative evaluation.
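
To see what that recommendation works out to, here is the arithmetic as a minimal sketch (the 1-to-4 rating scale is my assumption; the Guidelines prescribe no implementation):

```python
# The Board's recommended weights: 10% each for the six process standards,
# 40% for Standard 7 (student academic progress).
WEIGHTS = [0.10] * 6 + [0.40]

def summative_rating(standard_scores):
    """Weighted average of the seven standard ratings (assumed 1-4 scale)."""
    assert len(standard_scores) == 7
    return sum(w * s for w, s in zip(WEIGHTS, standard_scores))

# A teacher rated 4 (top) on every process standard and 1 (bottom) on student progress:
print(round(summative_rating([4, 4, 4, 4, 4, 4, 1]), 2))  # 2.8 -- process alone carries 60%
```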

That provision renders the Guidelines both stupid and unlawful.

Stupid: The Guidelines recommend that the seventh standard account for 40% of the evaluation.  Yet Code §22.1-253.13:1.A tells us that

The General Assembly and the Board of Education believe that the fundamental goal of the public schools of the Commonwealth must be to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.

So the Board of “Education” says that 60% (actually much more, given the fluff in the seventh Standard) of the evaluation should turn on things other than how much the students learn.

Unlawful: The statute, quoted above, requires that teacher evaluations be “consistent” with the standards in the Guidelines.  Yet the Guidelines themselves tell us that they are mere recommendations and that the local school boards get to decide what is “consistent.”  So, in fact, we get up to 132 different sets of guidelines.

 

Why do you suppose the Board of “Education” is so dedicated to serving incompetent teachers instead of the students whose parents are paying those teachers?