Teacher Evaluation: Fecklessness at the Board of “Education”

On April 28, 2011, the Board of Education adopted revised Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers.  The Guidelines provided for evaluating teachers on seven standards:

         Performance Standard 1: Professional Knowledge
         Performance Standard 2: Instructional Planning
         Performance Standard 3: Instructional Delivery
         Performance Standard 4: Assessment of/for Student Learning
         Performance Standard 5: Learning Environment
         Performance Standard 6: Professionalism
         Performance Standard 7: Student Academic Progress

Just from this list, we can see that the Board was focused on process, not results.  If chefs were rated on a similar scale, six parts of the rating would deal with cleanliness of the kitchen, skill in chopping vegetables, chefly demeanor, and the like, with only one item in seven related to the quality of the cooking.

It gets worse.

The measures of “student academic progress” in Standard 7 are:

• Sets acceptable, measurable, and appropriate achievement goals for student learning progress based on baseline data.
• Documents the progress of each student throughout the year.
• Provides evidence that achievement goals have been met, including the state-provided growth measure when available as well as other measures of academic progress.
• Uses available performance outcome data to continually document and communicate student progress and develop interim learning targets.

Nowhere in “sets . . . goals,” “documents . . . progress,” “provides evidence,” or “uses . . . data” do the guidelines say that the teacher shall be evaluated based on how much the students learn.  In the kitchen analogy, the chef’s cooking is to be measured by goals, progress, evidence, and data, not by the taste and presentation of the food.

Apparently the General Assembly noticed this gross departure from the Board’s duty to “supervis[e] the public schools.”  Chapter 588 of the 2013 Acts of Assembly includes the following amendments to Code §22.1-253.13:5.B:

Consistent with the finding that leadership is essential for the advancement of public education in the Commonwealth, teacher, [administrator] principal, and superintendent evaluations shall be consistent with the performance [objectives] standards included in the Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers, [Administrators] Principals, and Superintendents. Evaluations shall include student academic progress as a significant component and an overall summative rating. Teacher evaluations shall include regular observation and evidence that instruction is aligned with the school’s curriculum. Evaluations shall include identification of areas of individual strengths and weaknesses and recommendations for appropriate professional activities. (deleted language in brackets; emphasis supplied)

Supposedly responding to that mandate, on July 23, 2015, the Board amended the Guidelines.

The amended Guidelines contain the same seven standards.  Standard 7 is amended only to replace “growth measure” with “progress data,” reflecting the Board’s abandonment of the rigorous SGPs in favor of the new, not-much-penalty-for-failure progress tables:

7.3  Provides evidence that achievement goals have been met, including the state-provided [growth measure] progress data when available as well as [other] multiple measures of student academic progress.

Even then, the teacher is not to be evaluated on achieving progress, but only for “provid[ing] evidence.” 

If that were not weak enough, the operative provision (Guidelines at 4) says:

The Code of Virginia requires that school boards’ procedures for evaluating teachers address student academic progress; how this requirement is met is the responsibility of local school boards. As prescribed by the Code of Virginia, each teacher must receive a summative evaluation rating. The Board’s Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers recommend weighting the first six standards equally at 10 percent each and that the seventh standard, student academic progress, account for 40 percent of the summative evaluation.

That provision renders the Guidelines both stupid and unlawful.

Stupid: The Guidelines recommend that the seventh standard account for 40% of the evaluation.  Yet Code §22.1-253.13:1.A tells us that

The General Assembly and the Board of Education believe that the fundamental goal of the public schools of the Commonwealth must be to enable each student to develop the skills that are necessary for success in school, preparation for life, and reaching their full potential.

So the Board of “Education” says that 60% (actually much more, given the fluff in the seventh Standard) of the evaluation should turn on things other than how much the students learn.

Unlawful: The statute, quoted above, requires that teacher evaluations be “consistent” with the standards in the Guidelines.  Yet the Guidelines themselves tell us that they are mere recommendations and that the local school boards get to decide what is “consistent.”  So, in fact, we get up to 132 different sets of guidelines.


Why do you suppose the Board of “Education” is so dedicated to serving incompetent teachers instead of the students whose parents are paying those teachers?

Lake Wobegon of Teachers??

Browsing through the VDOE Web pages, one finds the Teacher and Principal Evaluation Collection Results.

The only data there are for 2011.  Even that limited dataset, however, is sufficient to demonstrate that the “evaluation” process was ridiculous, if not fraudulent.

The 2011 data show that all forty-six Richmond principals were “satisfactory.”  All our principals, it seems, were at or above average.  Never mind that Richmond’s reading SOL pass rate that year was 1.6 standard deviations below the division mean and its math score was 2.0 standard deviations low.  (Richmond is the gold square on the graphs.)

[Charts: distributions of 2011 division reading and math pass rates; Richmond is the gold square.]
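For anyone who wants to check that standard-deviation arithmetic, here is a minimal sketch.  The file and column names are placeholders of my own; the real numbers come from the VDOE division-level pass rates.

```python
import pandas as pd

# Placeholder file and column names; the real numbers come from the
# VDOE division-level pass rates for 2011 (one row per division).
df = pd.read_csv("division_pass_rates_2011.csv")

for subject in ("reading", "math"):
    mean, sd = df[subject].mean(), df[subject].std()
    richmond = df.loc[df["division"] == "Richmond City", subject].iloc[0]
    # z-score: how many SDs Richmond sits from the division mean.
    print(subject, round((richmond - mean) / sd, 1))  # -1.6 reading, -2.0 math
```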

The teacher data were more nuanced but similarly ridiculous.  Here is a summary.

                                                          EE     ME    NI    U   Total
Classroom Management/Positive Learning Environment       317    698    20    2    1037
Communication Skills                                     437    598     2    0    1037
Evaluation and Assessment                                208    826     3    0    1037
Implements and Manages Instruction                       273    754    10    0    1037
Knowledge of Subject                                     479    555     3    0    1037
Planning Activities                                      240    787     9    1    1037
Professional Responsibilities                            302    733     2    0    1037
Total                                                   2256   4951    49    3    7259

(EE = Exceeds Expectations; ME = Meets Expectations; NI = Needs Improvement; U = Unsatisfactory)

So we see that three of 7,259 ratings were “unsatisfactory” and forty-nine were “needs improvement.”  That is, the report says that only 0.72% of the items in Richmond teachers’ evaluations showed some aspect of failure to meet or exceed expectations in 2011.
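The 0.72% figure is straight arithmetic on that table:

```python
# From the 2011 Richmond summary above: ratings below "meets expectations."
unsatisfactory = 3
needs_improvement = 49
total_ratings = 7259

print(f"{(unsatisfactory + needs_improvement) / total_ratings:.2%}")  # 0.72%
```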

That is absurd in the abstract; in light of the available data it is baldly mendacious.

You may recall that the SGP data (that Brian Davison had to pry from VDOE with a lawsuit) can measure teacher performance.  Unlike the SOL itself, the SGP data are not correlated with economic advantage or disadvantage.  So the “poor students” excuse doesn’t work as to SGP.

We have SGP data for the following year, 2012.  Here, with caveats, are the reading data, starting with the distribution of teacher average SGPs (i.e., the average, by teacher, of the students’ SGPs).

[Chart: distribution of Virginia reading teachers’ average SGPs, 2012, with fitted Gaussian.]

The orange line is a Gaussian distribution fitted to the data: Mean = 45.0; standard deviation = 10.8.

Then here is the distribution of Richmond reading teachers’ average SGPs.

[Chart: distribution of Richmond reading teachers’ average SGPs, 2012.]

Note the absence of very high performing teachers and the plethora of low performers in Richmond.  One hundred nine of 205 Richmond reading teachers (53.2% v. 50% in a normal distribution) are below the state mean; sixteen (7.8% v. 2.5% in a normal distribution) are more than two standard deviations and fifty-two (25.4%, v. 16% in a normal distribution) are more than one standard deviation below the state mean.
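For the curious, here is a minimal sketch of that tail arithmetic.  The Richmond array below is a synthesized stand-in, since VDOE does not publish the teacher-level data:

```python
import numpy as np
from scipy.stats import norm

# Fitted state parameters from the chart above (reading, 2012).
STATE_MEAN, STATE_SD = 45.0, 10.8

# Stand-in for the 205 Richmond teacher-average SGPs; the real values
# come from the SGP database Brian Davison pried out of VDOE.
rng = np.random.default_rng(0)
richmond = rng.normal(41.0, 11.0, size=205)

for label, cutoff, expected in [
        ("below the state mean", STATE_MEAN, norm.cdf(0)),
        ("more than 1 SD below", STATE_MEAN - STATE_SD, norm.cdf(-1)),
        ("more than 2 SD below", STATE_MEAN - 2 * STATE_SD, norm.cdf(-2))]:
    print(f"{label}: {np.mean(richmond < cutoff):.1%} "
          f"(normal expectation: {expected:.1%})")
```

The same computation, with mean 46.8 and SD 14.6, gives the math comparison below.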

For math, the state distribution has a mean of 46.8 and a standard deviation of 14.6.

[Chart: distribution of Virginia math teachers’ average SGPs, 2012, with fitted Gaussian.]

In contrast to the reading data, Richmond has some outstanding math teachers, but they are outnumbered by the underperformers.

[Chart: distribution of Richmond math teachers’ average SGPs, 2012.]

Indeed, 111 of 193 Richmond math teachers (57.5%) are below the state mean; six (3.1%) are more than two standard deviations and thirty-seven (19.2%) are more than one standard deviation below the state mean.

Yet, according to the evaluations from the previous year, Richmond’s teachers were just fine, thank you, in 99.3% of all measures.

Just as a reminder, the effect of a good or bad teacher can be dramatic.  Here for instance are the students’ 2014 reading SGPs for Richmond teacher #66858 (anonymized in the VDOE database).

[Chart: 2014 reading SGPs of the students of Richmond teacher #66858.]

And here, in contrast, are the students’ SGPs for teacher #68809.

[Chart: 2014 reading SGPs of the students of Richmond teacher #68809.]

Unfortunately, we have too many who are more like #68809 than #66858.

Richmond’s subpar teacher performance has only worsened recently, as reflected in our deteriorating SOL performance: for 2015, we are the sixth worst division in the state in math and the second worst in reading.

Of course, our principals and Superintendent have the power (and duty) to remedy these glaring defects.  Inadequate teacher performance that is not corrected reflects inadequate principal performance; inadequate principal performance that is not corrected reflects inadequate Superintendent performance.  We need to hold these public employees accountable when they don’t deal with the teachers who are harming Richmond’s schoolchildren.

But, then, it’s difficult to impose accountability when VDOE is hiding the relevant data.

Your tax dollars at work.


PS: You’ll notice that the internal “evaluations” deal with inputs, which are easy to measure (and to fudge), while the important quantities are outputs (how much our students learn), which are hard to measure.  VDOE is making it harder to know about outputs by abandoning the SGP and installing “progress tables” that measure progress but mostly ignore the lack of it.  Even so, we’ll have some measure of outputs, although VDOE doubtless will not release the data.

Seems to me City Council should demand, as a condition of all the money they are spending on RPS, an audit of teacher performance (SGP in the past, progress tables going forward) with an analysis of principal and Superintendent actions regarding underperforming teachers.  For sure, if Council doesn’t demand those data, the rest of us will continue to be operated on the mushroom principle (keep ‘em in the dark, feed ‘em horse poop).


I’ll save VDOE’s feckless “improvements” in the evaluation process for another post.

Graduating (and Not)

The RT-D this morning reports that Virginia’s on-time graduation rate of 90.5% “tops” the national average of 82%.

The RT-D is mixing apples and pomegranates: it is comparing national cohort data for 2014 with Virginia “on-time” data from 2015.

The Virginia “on-time” rate is a fiction, generated by VDOE to inflate the actual rate.  The actual 4-year cohort rate for Virginia in 2015 was 86.7%.

Even so, that’s Virginia.  This is Richmond.  The (awful) Richmond rate actually dropped this year.

[Chart: Richmond graduation rate by year.]

The 2015 cohort also had 167 dropouts in Richmond, 11.8% of the cohort. 

The enrollment pattern by grade gives a more nuanced picture of the huge numbers of students Richmond loses to dropouts and to families who move to the much better schools in the Counties.

[Chart: Richmond enrollment by grade.]

Westover Hills Elementary: Glass Half Full?

The Winter edition of the Forest Hill Flyer (not yet posted to the Association Web site) had an interesting piece regarding the new (since 2011) Principal at Westover Hills, Virginia Loving, and her project to “make it a neighborhood school.” 

I’ll leave it to someone with more direct knowledge to assess the other effects of Ms. Loving’s outreach; I turn to the results on the VDOE Web site from the statewide testing program under the SOLs.

First, I should note that Principal Loving came in at a particularly difficult time: VDOE promulgated a new, tougher set of math tests in 2012 and reading tests in 2013 that clobbered the pass rates statewide (data here for all tested grades). 

[Chart: statewide pass rates by year, all tested grades, showing the drops from the new math and reading tests.]

Unfortunately, our former Superintendent did not prepare the Division for the new tests.  The lack of preparation exacerbated the score drop here.

[Chart: Richmond pass rates by year.]

You might also recall that Richmond’s elementary schools on average perform ten points or more below the state average (but stratospherically above our middle schools).

[Chart: average pass rates, Richmond elementary and middle schools v. the state average.]

That said, the SOL performance at Westover Hills has been decidedly mixed.

On the math tests, WH was scoring near the (unacceptable) Richmond average before 2012.  The new tests hit even harder at WH than at the division average.  But WH recovered more quickly than the Richmond average (data for grades 3-5).

[Chart: math pass rates, grades 3-5: Westover Hills, Patrick Henry, Richmond.]

That is, under Principal Loving, WH took an unusually big hit from the new math tests but since has been showing signs of improvement.

I’ve included the data for Patrick Henry, which is nearby and might be viewed as a neighborhood school.  PH was hit even harder by the new tests and recovered only to about the (awful) Richmond average.

The reading scores reveal a more troubled situation.  WH again was performing at about the Richmond average.  After the new-test plunge in 2013, Richmond scores improved but the WH scores continued to decline.  This is not good news for the school or for the children who attend it.  (Again, data for grades 3-5.)

[Chart: reading pass rates, grades 3-5: Westover Hills, Patrick Henry, Richmond.]

Patrick Henry started higher and dropped to the state average, but then it, too, continued to drop.

For what they may communicate, here are the combined (reading + math) pass rates.

[Chart: combined reading + math pass rates, grades 3-5.]

Seems to me the neighborhood outreach could be more effective if the teaching, especially of reading, were improved. 

Of course, VDOE has been obtaining SGP data that would tell us which of the Westover Hills teachers are, or are not, effective, so Principal Loving (and the neighborhood) would have data that directly measure teacher performance.  Unfortunately, VDOE is concealing the data it already has and now is abandoning the SGP entirely.  This is fully consistent with VDOE’s actual function, which is to be the Department of Data Suppression.

Board of Education Finally Acting on a Truancy Regulation

CODE § 22.1-269 requires that the Board of Education “see that the [mandatory attendance laws] are properly enforced throughout the Commonwealth.” Notwithstanding that mandate, the Board still neither collects nor publishes data that would allow the public to assess its performance of that duty.

In this enforcement vacuum, Richmond, for example, has been free to define “truancy” as ten unexcused absences and, instead of filing a petition for judicial action at seven absences, as § 22.1-258 requires, sending a letter after ten.

The Board did not publish a proposed truancy regulation until December 21, 2009. The history of that regulation is set forth on the Town Hall website. In short, the regulation now is in its fourth public comment period.

Despite six years of consideration and reconsideration, the regulation remains unlawful and ineffective.

I. The amended definitions of “excused absence” and “unexcused absence” render the regulation unlawful.

CODE § 22.1-254 contains the compulsory attendance provision of Virginia law:

Except as otherwise provided in this article, every parent, guardian, or other person in the Commonwealth having control or charge of any child who will have reached the fifth birthday on or before September 30 of any school year and who has not passed the eighteenth birthday shall, during the period of each year the public schools are in session and for the same number of days and hours per day as the public schools, send such child to a public school or to a private, denominational, or parochial school or have such child taught by a tutor or teacher of qualifications prescribed by the Board of Education and approved by the division superintendent, or provide for home instruction of such child as described in § 22.1-254.1 (emphasis supplied).

CODE § 22.1-98.B.1 provides: “The length of every school’s term in every school division shall be at least 180 teaching days or 990 teaching hours in any school year.” The statute sets out exceptions (e.g., severe weather); those exceptions do not include part day absences.

In short, the law requires attendance for the full school day and the full school year.

CODE § 22.1-269 provides:

The Board of Education shall have the authority and it shall be its duty to see that the provisions of [§§ 22.1-254 through -269.1] are properly enforced throughout the Commonwealth.

That is, this Board has the authority and duty to enforce § 22.1-254 and -258, not to create loopholes in those statutes. Yet, the Board’s proposed regulation would excuse an absence that is shorter than the full school day by even a moment and would except that absence from the mandatory enforcement procedures of §§ 22.1-258 et al.

Under this regulation, a student could march into school only during the last five minutes of class on each school day, or report for the first roll call and then leave for the day, and never be classified as truant under the amended regulation. Surely the General Assembly did not intend that absurd result.

The sole rationale proffered by staff for this unlawful exception is convenience. Yet the statute does not make an exception for the convenience of the school divisions or of the courts.

Perhaps the Board could make an exception for an occasional de minimis instance where a student is tardy or unavoidably misses a few minutes of class. But the wholesale exception of any absence less than a full class day, as now proposed, is baldly unlawful.

As pointed out in my earlier comments, Richmond has been violating § 22.1-258 wholesale. Full compliance with the law surely will be greatly inconvenient to Richmond and to any division similarly engaged in ignoring § 22.1-258. Anything less, however, would be contrary to the manifest will of the General Assembly, would subject the Board to suit over an unlawful regulation, and would leave the divisions exposed to actions for mandamus for failure to comply with the clear requirements of Virginia law.

Indeed, any division that might be overwhelmed by the requirement to obey this law will have the same recourse as any other public agency with inadequate resources to comply with the law: Seek more resources and, in the meantime, prioritize the workload and deal with as many cases as possible.

Staff point to the 2d and 3d paragraphs of § 22.1-258, which require notice to the parent “[w]henever any pupil fails to report to school on a regularly scheduled school day” and require an attendance plan after the fifth such failure. Read literally, these provisions would never invoke the enforcement mechanisms of § 22.1-258 so long as the student reported in at any time during the school day, regardless of whether the student then departed immediately. This narrow reading of the second and third paragraphs overlooks the fourth paragraph of § 22.1-258, which requires a conference “[i]f the pupil is absent” a sixth time and requires referral to court “[u]pon the next absence,” both without mentioning failure to “report.” Moreover, staff’s narrow reading of the failure-to-report language would lead to a regulation that is manifestly inconsistent with the General Assembly’s command that every student attend (i.e., report to) school for “at least 180 teaching days or 990 teaching hours.”

The Supreme Court’s Blake decision does not modify this duty. Blake holds that “send” in § 22.1-254 is sufficiently ambiguous that a parent cannot be prosecuted for her child’s tardiness. Nowhere, however, does the decision contradict the manifest purpose of the compulsory attendance statutes that, with very limited exceptions, every school age student shall attend school all of every school day.

Moreover, the Board has the authority (and duty) to rectify any ambiguity in the statute in order that the statute may serve its clear purpose.

Finally, the separate definitions of excused and unexcused absences are dangerous and unworkmanlike.

By setting out long definitions of both excused and unexcused absences, the Board invites lawyerly mining for ambiguity and conflicts between the two definitions. Moreover, in light of the statutory requirement for full attendance, the Board should set out careful and narrow standards for excused absences, and then define any other absence as “unexcused.”

II. Consistent with the Failure to Require Full Day Attendance, the Regulation Fails to Require Reporting of Part Day Absences.

The data collection portion of the Regulation, 8 VAC 20-730-30, is silent as to part-day absences.

III. The New § 8 VAC 20-730-20 Invites Up to 132 Separate Definitions of “Excused Absence.”

The new § 20-730-20 would have each school board provide “guidance” as to “what would constitute an excused absence.” In this, the Board unlawfully delegates its own authority and invites a spectrum of definitions that would emasculate the compulsory attendance laws and render the data collected under the regulation meaningless.

IV. 8 VAC 20-730-30.E Does Not Require Reasons for the Choice Between CHINS and Misdemeanors.

Upon the next absence after the scheduling of the six-absence conference, CODE § 22.1-258 requires either or both (1) filing of a complaint alleging the student is a child in need of supervision (CHINS petition), and (2) prosecution of the parent.

Subsection 30.E of the Regulation requires a report of whether a seventh absence leads to a complaint, but it fails to require the attendance officer or Superintendent to set out the reasons for choosing one course or the other.

Yet the choice must be driven by the facts of each case. For example, one of the division’s options under § 22.1-258 is to prosecute the parent under § 22.1-262. That latter statute authorizes prosecution for, inter alia, “refus[al] to participate in the development of the plan to resolve the student’s nonattendance or in the conference provided for in § 22.1-258.” Manifestly, if the division fails to prosecute a parent who refuses to participate, the attendance officer should be required to set out a principled reason for not prosecuting.

This failure to require transparency is fully consistent with the general absence of any requirement in the regulation for accountability. See the next item.

V. The Regulation Should Create a Clear Chain of Accountability.

The regulation fails to require a system of accountability so that the public, the Board, and the local school boards can measure the performance of a school system and its employees.

Richmond serves as an example, perhaps an extreme one, of the effect of this Board’s failure to obtain reliable truancy data and to enforce the requirements of § 22.1-258.

In an email dated May 22, 2012, Felicia Cosby of the City of Richmond Public Schools wrote: “As of March 22, Richmond Pubic [sic] Schools has sent 77 failure to send petitions–an increase from last year’s total submissions of 47.” Yet Richmond had 1,875 cases of ten or more unexcused absences during 2009. That amounts to less than 2.5% compliance with CODE § 22.1-258.

Note: The 2013 report from the Department of Education (the most recent report available as of November, 2015) shows 3,268 six-absence conferences in Richmond, 13.8% of the fall ADM of 23,649. In the absence of any further data from the state, it is impossible to know whether that astounding datum represents an improvement or not.

If, as at present, there is to be no clear chain of accountability and no expectation of consequences for poor performance and no State enforcement of the mandatory attendance laws and no useful information from the State, we must expect that Richmond, and surely other divisions, and the State will continue to fail the children in our schools. The Board should use this regulation as an opportunity to correct that dismal situation.

The Voice of Reason

The Daily Press today took the side of the taxpayers who are paying our teachers:

Virginia residents pour billions into the public schools each year. Local taxpayers, such as those in Newport News and Hampton, spend millions more on education.

It’s fair to ask about the return we’re receiving on that massive annual investment. And data collected through statewide evaluative testing measures can help provide a more accurate picture of that.

We don’t want to embarrass hard-working teachers. Rather, we believe [these SGP data are] information to which the public is entitled and which it should have.

More Data that VDOE Suppresses

Brian Davison points me to the Arkansas Web site that reports, inter alia, grade inflation:

As required by Arkansas Code Annotated § 6-15-421, the Division of Public School Accountability in the Arkansas Department of Education provides this report of the percentage of students receiving a grade of “B” or above in the corresponding course who did not pass the end of course assessment on the first attempt.  The report also includes the name, address, and superintendent of any high school in which more than twenty percent (20%) of the students received a letter grade of “B” or above, but did not pass the end-of-course assessment on the first attempt.

Indeed, the schools with more than 20% are highlighted in yellow. 

Near the top of the list, we see Badger Academy, where 100% of the students received A or B grades but none passed the End of Course test on the first try.  Perhaps Badger Academy is a special purpose school, such as the Ark. School for the Deaf (90%), but then we have Decatur High School (50%), Clinton High School (66.7%), and Siatech Little Rock Charter (83.3%).

In contrast, you can search the VDOE Web site for “grade inflation” and find only one paper that speaks of grade inflation in teacher evaluations and one on the same subject in principal evaluations.  Nothing at all about inflated student grades.

But, then, you wouldn’t really expect the State Department of Superintendent Protection and Data Suppression to report anything so useful.

Five Kinds of Failure

Following up on Jim Weigand’s five-subject SOL averages (average of the pass rates in Reading, Writing, History & SS, Math, and Science), I again unleashed Excel’s pivot table on the VDOE database.

As before, here are the five-subject averages for Charles City, Hampton, Newport News, Norfolk, and Richmond by year, along with the average of division averages (as opposed to the state average pass rates in the earlier post).

[Chart: five-subject average pass rates by year: Charles City, Hampton, Newport News, Norfolk, Richmond, and the average of division averages.]

And here are the same data, expressed as the difference between the Division average and the average of divisions.

[Chart: the same data, expressed as differences from the average of division averages.]

One further measure of Richmond’s dismal performance: Here are the rankings of the 132 division averages by year (132 is the lowest possible rank).

[Chart: division ranks by year (of 132).]
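For anyone who would rather reproduce the pivot in code than in Excel, here is a sketch in pandas.  The file layout and column names are my assumptions, not the actual VDOE export:

```python
import pandas as pd

FIVE = ["English: Reading", "English: Writing",
        "History and Social Science", "Mathematics", "Science"]

# Assumed layout: one row per division/year/subject with a pass rate.
df = pd.read_csv("vdoe_sol_pass_rates.csv")

# Five-subject average pass rate, by division and year.
avg = (df[df["subject"].isin(FIVE)]
       .pivot_table(index="division", columns="year",
                    values="pass_rate", aggfunc="mean"))

delta = avg - avg.mean()          # difference from the average of division averages
rank = avg.rank(ascending=False)  # 1 = best of the 132 divisions

print(delta.loc["Richmond City"])
print(rank.loc["Richmond City"])
```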

Your tax dollars at “work.”

Damaged in Transit Thru RPS

I finally got around to updating the SOL page on crankytaxpayer.org and was reminded that we now have outcome data beyond just the graduation and dropout rates.

These Federal data are part of USDOE’s reporting requirements under the American Recovery and Reinvestment Act.  Inter alia, Virginia must report the numbers of students in the cohort who, having graduated with a regular diploma, enter a public, private, or 2-year college or university (Institution of Higher Education, “IHE” in FederalSpeak) within sixteen months of graduation.  Here are the data for the 4-year cohort graduating in 2014, expressed as a percentage of the cohort:

[Chart: percentage of the 2014 4-year cohort enrolling in an IHE within sixteen months of graduation, Richmond v. state.]

I trust you got that: Even with Richmond’s reporting of Maggie Walker students at schools they don’t attend (you can be sure those MLW kids will graduate and do well afterward: average SAT scores in 2013 were 713 verbal, 692 math; average scholarship offer was $72,000 per student; SOL pass rate 100%), the diploma graduates of RPS are much less successful than the state norm at getting into public universities and community colleges. 

Then we have the 2012 high school graduates who enrolled in a Virginia IHE within sixteen months of graduation and who completed at least one year’s worth of college credit applicable to a degree within two years of enrollment in the IHE.

[Chart: percentage of 2012 graduates completing a year’s worth of college credit within two years of enrolling in an IHE, Richmond v. state.]

One can only conclude that Richmond is giving diplomas to a number of students who would not receive them in other divisions. 

Your tax dollars at “work.”

It’s Dismal Here in the Basement

Jim Weigand, who likes to look at the five-subject average pass rate (Reading, Writing, History & SS, Math, and Science), sends along Richmond’s rank on that scale by year:

[Chart: Richmond’s rank on the five-subject average, by year.]

Given that the data are readily available, I thought I’d also plot the actual five-subject average pass rates for Richmond and some comparable old, urban jurisdictions (and our neighbor Charles City County).

[Chart: five-subject average pass rates: Richmond and comparable divisions.]

Those data are perhaps more accessible when expressed as differences from the State average.

[Chart: the same data, as differences from the state average.]

We can thank our former Superintendent for the recent deterioration of our already substandard performance.