Teacher Truancy Costs, 2018 Version

RPS is hiring a lot of substitute teachers because, as measured by the 2018 CRDC, astounding numbers of the full time teachers are absent from work. The RPS budgets provide a measure of the costs.

In the 2017 adopted budget there was a line item in “General Fund Expenditures by Object Class” under “Other Compensation” for substitutes (at category 523): “N-SUBSTITUTE INSTR PROF.”

image

In the 2020 budget, that category had morphed to “N-INSTRUCTIONAL STAFF.”

image

Indeed, that category title changed every year from 2017 to 2020 (and in between the category number disappeared). No telling what’s going on there.

We can extract some useful information from this, however, because each budget contains data for the previous two years. The numbers reported under the new category titles in later budgets matched those reported under the older titles in the earlier budgets. That is, despite the name changes, this category was, and remained, for substitute instructors.

image

These budgets also include the “actual” expenditures for substitutes for the first of the three budget years: Those show RPS underestimating the substitute teacher costs by 33 to 42 percent in each of the four years analyzed above.

And, to the point of teacher truancy, if RPS could cut the actual substitute costs by half, they could boost the full time teaching salaries by about 3% without increasing the overall budget.

image
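A back-of-the-envelope sketch of that arithmetic (the dollar figures below are hypothetical stand-ins, not the actual RPS budget lines):

```python
# Hypothetical figures; the real numbers are in the RPS budget tables.
substitute_cost = 6_000_000    # assumed actual annual substitute spending
salary_base = 100_000_000      # assumed total full time teacher salaries

savings = substitute_cost / 2  # cut the substitute costs by half
boost = savings / salary_base  # fraction available for salary increases

print(f"Possible salary boost: {boost:.1%}")  # → Possible salary boost: 3.0%
```

With those assumed inputs, half the substitute spending comes to 3% of the salary base; the real percentage depends, of course, on the actual budget figures.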

Your tax dollars at “work.”

2018 CRDC: Teacher Certification

The importance of teacher quality should be clear, although there's room to debate the effectiveness of teacher certification. Nonetheless, certification is one of the few measures available to those of us who are taxed to pay those teachers.

Here, then, from the 2018 CRDC, are the Virginia public school division percentages of full time teachers not certified.

image

There are 32 divisions with 0%, i.e., all the teachers are certified. Richmond is the Big Winner with 22.5%, the gold bar. The red bars are, from the left, the peer cities Newport News (nearly hidden, 0.1%), Norfolk, and Hampton. The blue is the bar for Galax which, at 3.7%, is the closest division to the division average, 3.6%.

image

Next, the Richmond schools:

image

Richmond’s awful middle schools, pink, dominate the high end here, except for Hill at 17% (4.8 times the division average). The high schools, yellow, are spread out some, ranging from Community (invisible at 0%) to Wythe at 35%. The elementary schools, green, run from 4% to 31%. The blue bar is the Richmond average, 22.5%, which is 6.3 times the division average.

Here is the list.

image

Except for Community High and six of the specialty schools, all our schools had remarkable shortages of certified teachers. Perhaps that was related to the salaries: The Richmond average for teaching positions in ‘18 was $51,528 against a state average of $57,260. Then, again, these data don’t tell us how much of the salary difference reflected lower salaries paid to all those uncertified teachers.

Teacher Truancy, 2018 Version

We hear a lot about truancy and its malign effects. For example, here.

Courtesy of the Federales, we now have data on teacher absences that may be similarly damaging to students and that certainly are harder to justify.

The 2018 version of the biennial Civil Rights Data Collection, available since October, 2020, has data on both truancy and teacher absences. Let’s look at the teachers.

The feds count full time teacher absences >10 during the school year. Eleven days is 6.1% of a 180 day school year.

Here are the 2018 counts for the Virginia school divisions.

image

Richmond is the gold bar at 65%. The red bars, from the left, are the peer jurisdictions Norfolk, Hampton, and Newport News. The blue bar is superimposed on one of the two divisions at the state average, 38%.

image

Richmond is a bit improved from the 2016 report. That year we were second from worst at 68%; this year, fifth at 65%.

The distribution of division rates looks like this:

image

The tall, red bar marks the Richmond datum at 65%, 1.9 standard deviations above the mean.
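For anyone wanting to check that kind of number, here is a minimal sketch of the z-score calculation (the division rates below are invented, not the CRDC data):

```python
import statistics

# Hypothetical division teacher-absence percentages; the real data
# come from the 2018 CRDC.
rates = [28, 31, 35, 38, 38, 40, 42, 45, 48, 65]
richmond = 65

mean = statistics.mean(rates)
sd = statistics.stdev(rates)      # sample standard deviation
z = (richmond - mean) / sd        # distance from the mean, in SD units

print(f"Richmond is {z:.1f} standard deviations above the mean")
```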

The Richmond school data show only Community, Ginter Park, Greene, Munford, and Alternative below the state average.

image

Little kids are notorious for being petri dishes for germs so we could expect the elementary schools to run high. In fact, except for that remarkable 9% number at Alternative, Richmond’s elementary schools reach both ends of the Richmond spectrum.

clip_image001

You might think that the very few schools where the teachers mostly come to work must have unusually healthy students and teachers or excellent principals. I'll vote for the latter. Similarly, the very many schools where most of the teachers miss a lot of school would seem to have remarkably sick people or lousy leadership. There's no reason to expect a plethora of sick people when some schools clearly don't have many.

And, for sure, there is a leadership vacuum downtown and at the Board of “Education.”

In contrast, the middle schools run high while the high schools cluster toward the high-middle.

clip_image001[4]

image

BTW: Maggie Walker is not a Richmond Public School (although the SOL pass rates there are reported at the public high schools in the students’ home school zones) and the numbers there are in a different universe from even the best of the Richmond high schools.

image

In any case, it is clear that the RPS “leaders” downtown need to be directed to work that is better suited to their talents.

We might expect the SOL performance to decline with increasing teacher truancy. Here are the reading data for the Richmond elementary schools:

image

Note: Virginia’s economically disadvantaged (here “ED”) students pass the SOLs at rates some 17 to 22 points lower than their more affluent peers (here “Not ED”), depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.
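A tiny sketch shows how that mix drives the average (the group rates below are placeholders consistent with the roughly 20-point statewide gap):

```python
# Hypothetical group pass rates, consistent with the ~20 point gap.
ed_rate, not_ed_rate = 60, 80

def school_average(ed_share):
    """Enrollment-weighted average pass rate for a given ED share."""
    return ed_share * ed_rate + (1 - ed_share) * not_ed_rate

# Two schools with identical group performance but different mixes:
print(school_average(0.25))   # 25% ED → 75.0
print(school_average(0.75))   # 75% ED → 65.0
```

Same teaching, ten-point difference in the headline number, which is why the analysis here looks at the two groups separately.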

For the Richmond elementary schools, the ED reading rate drops by some 2.2% for a 10% increase in the teacher truancy, but the R-squared value tells us that the teacher absences explain only about 12% of the SOL variance. As to the Not ED students, any effect is smaller (ca. 0.9% for a 10% increase in teacher absences) and very nearly uncorrelated.

The math data give similar results.

image

The division data paint a similar picture.

image

image

In contrast to the national data, these Virginia numbers do not show a strong relationship between SOL performance and teacher truancy. They do suggest that the absent teachers may not be much more effective at teaching than the substitutes. And, in any case, the data spotlight a massive, and expensive, management failure, both in Richmond and statewide.

Commonwealth of Lake Woebegon

If you wanted to boost the pass rates of the SOLs, you’d have three choices (aside from the one perfected at Carver): Improve teaching, make the tests easier, or relax the scoring.

On the 2019 revision of the math tests, the Board of “Education” chose the last option: In five of six cases, they adopted cut scores below the level necessary to retain the rigor of the earlier tests. The results were predictable (and, of course, fed the false notion that student performance was improving).

image

The Board now has jiggered the English tests to the same end. The recommendation (teacher-driven; no pretense here of objectivity) was for every cut score to be lower (easier) than the level necessary to maintain the rigor of the tests.

clip_image001

The Board rejected the Superintendent’s slightly higher recommendations and adopted the committee’s numbers (video at 1:44:55; minutes are not yet available). This grade inflation will have the happy result of making the Board and the Superintendents and the English teachers and the students all look better, all without anybody having to break a sweat.

It will also make it impossible to measure the effect of the coronavirus on English performance.

This is not an anomaly, but rather part of an ongoing campaign to camouflage the fading performance of the Virginia public school system. However, unfortunately for the self-serving mendacity of the “education” establishment, the NAEP data for fourth grade reading

image

and eighth grade reading

image

give away the game.

Your tax dollars at “work.”

Richmond: The Excellent, the OK, and the Awful

While we wait to see how far the Board of Education will punt on the 2021 SOL testing, let’s look in some detail at the 2019 performance of Richmond’s schools (there having been no testing in 2020).

But first, some important background: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.

To start, here are the ED pass rate distributions of Virginia and Richmond schools on the reading tests.

image

The blue bars are the counts of Virginia schools with the indicated (rounded) pass rates. The red bars, with open middles so the state data can show through, are Richmond; the Richmond scale is on the right-hand axis.

The state data here (and even more in the next chart) are skewed toward the low end. That renders the usual measures of a distribution, the mean and standard deviation, less useful. The measure reported here is the median.
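A quick illustration of why the median is the better measure here (the pass rates below are invented, not the state data):

```python
import statistics

# A low-end skew: a few very low schools drag the mean down while
# the median stays near the bulk of the distribution.
pass_rates = [20, 35, 70, 72, 74, 75, 76, 78, 80, 82]

print("mean:  ", statistics.mean(pass_rates))    # → 66.2
print("median:", statistics.median(pass_rates))  # → 74.5
```

The mean lands well below where most schools actually sit; the median does not.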

The two Richmond schools that aced the reading tests are Open and Community. The next entry, at 86%, is the other selective school, Franklin. The best of the mainstream schools is Marshall at 71%. The only other school to beat the state median was Cary at 68%. The Richmond schools in the cellar are, from the bottom, Alternative, Fairfield Court, MLK, Carver, Woodville, Chimborazo, and Mason.

The Not ED data portray another disaster.

image

Community and Open again aced the tests. They are followed by Munford, Hill, Franklin, Fox, Alternative and, barely above the state median, Patrick Henry. At the other end, the largest failures are, from the left, Greene, MLK, Boushall, Woodville, Elkhardt-Thompson, and Henderson. Fairfield Court would surely be in that latter list but for the suppression rule (<10 Not ED students).

Turning to the math tests, the Richmond pass rates are even less encouraging:

image

The schools that beat the state median are, from the top, Open, Community, Cary, Franklin, and Redd. At the other end, the basement dwellers are, from the bottom, Alternative, MLK, Fairfield Court, Carver, Boushall, Wythe, and Henderson.

image

As to Not ED, Open, Community, Munford, Fox, and Ginter Park beat the state median. Boushall, MLK, Wythe, Greene, Henderson, Elkhardt-Thompson, and Blackwell all scored below 50%.

These data emphasize the huge spreads between Richmond’s best and worst schools as well as the stunning under-performance of flocks of Richmond’s students.

For the record, here are the data, sorted by decreasing averages of the four data points. The “#DIV/0!” entries are for cases where the student count was zero or, more likely, suppressed by VDOE because it was <10.

image

Region 7 Addendum

We have seen that the divisions in SW Virginia (“Region 7” in the VDOE system) formed their own organization, the Comprehensive Instructional Program (“CIP”), that brought nice improvements in student performance.

While we wait to see whether the Board of “Education” will punt on the 2021 SOL testing, I’ve been looking over the 2019 data (there being no tests in 2020). The data for Region 7 paint a lovely picture.

You may recall that, since undertaking the CIP, Region 7 has seen major improvements in the pass rates of its economically disadvantaged (“ED”) students.

image

image

They accomplished this with a large and increasing ED population.

image

To put the 2019 results in a more nuanced context, let’s start with the school average reading pass rates for the Not ED students.

image

The blue bars are the counts of Virginia schools with the indicated 2019 pass rates of Not ED students (rounded to the nearest whole number). Thus, one school (Fairfax County Adult High) turned in a 13% pass rate(!) and 102 schools had 88% rates. The red-bounded bars are Region 7, left open to allow the state numbers to show through. The Region 7 scale is on the right vertical axis. The lowest school there turned in a 69% while 11 schools had 91% rates. (Excel reports “multiple items” when you tell it to report data for more than one division; please read that term as “Region 7.”)

The usual statistical measures, mean and standard deviation, are of limited use with skewed distributions, so I show the medians here. Of course, as a distribution approaches “normal,” the median approaches the mean. In any case, these are medians of the school averages, not division medians.

If you think the Not ED pass rates for Region 7 schools are a pleasant bit of news, take a look at the ED numbers:

image

Here, the Region 7 median is ten points higher than the state. Or you might prefer to ignore those stats and just look at the lovely picture.

The math data similarly testify to the success of the CIP.

image

image

It is instructive to compare the (manifestly sensible) techniques used by the CIP with the resolutely ineffective bureaucratic nonsense imposed by the “education” establishment.

The CIP:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

The state, here the Petersburg Corrective Action Plan (for a division that the state has been attempting to repair, without success, since 2004):

image

I think it is past time to redirect the education bureaucracy to what it can do well, crunch numbers, and give the rest of its budget to the CIP.

More Money and Less Education in Richmond

On the subject of spending for schools, the VDOE Web Site has 2019 data for division income by source and fall enrollments of both economically disadvantaged (“ED”) students and their more affluent peers (“Not ED”).

The division income totals per student, plotted against the % of ED enrollment, looks like this:

image

Richmond is the enlarged, yellow point. The red points are, from the left, the peer cities Newport News, Hampton, and Norfolk.

The fitted line suggests that per student division income increases by $68 for a 10% increase in the ED percentage but the R-squared value tells us the variables are uncorrelated.

Richmond is in 15th place in this find-the-money derby.

image

In terms of local funding, Richmond again is above average but down in 27th place.

image

The R-squared value of 7% suggests a slight correlation between local funding and % ED, but in the decreasing direction.

The other funding sources present a more interesting picture.

image

State funding shows a modest correlation, R-squared = 22%, while the federal data exhibit a more robust R-squared of 42%. Funding in both categories increases with increasing % ED.  Richmond is well below the fitted curve for state funding, with part of that gap closed by Uncle Sam.

The Sales Tax funding is essentially flat.

Looking again at just the Big Winners:

clip_image001

Here we see that larger than average local taxes support the effort in every case while the State accounts for some of the excess in Highland and Sussex and the federales do so in Surry, Sussex, and Richmond.

Of course, once that money comes in the divisions spend it.  We earlier saw the expenditure data juxtaposed with Richmond’s lousy performance:

image

If only Richmond would stop whinging about money and start educating its schoolchildren.

Learning and Poverty

An earlier post showed the absence of a correlation between division day school expenditures and either the average SOL pass rate of economically disadvantaged (“ED”) students (mostly those who qualify for the free lunch program) or those of their more affluent peers (“Not ED”).  As well, the post dwelled on the remarkable progress of ED students under the Comprehensive Instructional Program, a bottom-up initiative that started in VDOE’s Region 7 (southwest Virginia).

Regarding that post, the estimable Dick Hall-Sizemore asks whether the relative numbers of ED and Not ED students make a difference as to SOL performance.

The short answer is NO as to reading and slightly as to math, but with the (modest) effect in a place where you might not expect it.

For the longer answer, here are the data:

image

The average reading rate for ED students increases 1.8% going from 0% to 100% ED students in the tested group. However, the R-squared of 0.1% tells us the two variables are not correlated.

The Not ED performance, in contrast, decreases by 1.4% for a 10% increase in the ED population. The R-squared value indicates that the ED population explains 11% of the variance in those pass rates.  Thus, to a modest extent, it seems that increasing the proportion of less affluent students is related to lowered performance of the more affluent students.

If you have a testable explanation for this result, please do share it with me.

Richmond is the enlarged points with yellow fill. The red fills are the peer cities, Hampton and Newport News on the left (Hampton above) and Norfolk on the right.

Notice particularly the several divisions with both ED percentages and reading pass rates higher than Richmond’s.

image

Indeed, the sole exception among that dozen divisions is the Petersburg Not ED datum.

The math data tell a similar story, but with a steeper decline in the Not ED pass rates (3.3% for a 10% ED population increase) and a more substantial R-squared value, 24%.

image

ED population also fails to explain the better performance of the Region 7 schools that started the (remarkably successful) Comprehensive Instructional Program.  Region 7 has considerably more than the average population of ED students.

image

Note: There is an anomaly in these numbers. Washington County goes from 62.3% ED in 2018 to 99.1% in 2019. This looks like a data error.  I have asked VDOE about it; they have not replied (presumably too busy with programs that don’t work). The 2019 average with Washington Co. excluded would be 56.0%. For now, it would be wise to ignore the Washington Co. numbers and to discount that 61% some.

As to the details:

image

image

Note: The points at 99% are Washington County.

It is instructive to compare the manifestly effective CIP approach in Region 7 with the resolutely ineffective VBOE “Plan” for dealing with the ongoing failure of the Petersburg system.

STOP! Please go back and read all of each of those lists so you can fully appreciate the fecklessness of the Board’s approach.

A modest proposal: Let’s expand the CIP statewide and shrink the Board of Education’s function to what they can usefully do: statistics and webinars.

Money Don’t Buy You Learning

It’s December. The Generous Assembly is about to return and the demands for more education funding [see Executive Summary at p.4] resound throughout the Commonwealth.

The data would suggest that these demands are misplaced.

VDOE won’t post the 2020 expenditure data until sometime this Spring and there were no 2020 SOLs, so we’ll use the 2019 expenditure and SOL data. The expenditure numbers below are those for “day school operation” (the sum of Administration, Instruction, Attendance and Health Services, Pupil Transportation, and O&M spending). Student counts are the year-end average daily membership.

One wrinkle: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. We’ll avoid that issue here by looking at the rates for both groups.

Here, then, are the division average reading pass rates for the two groups plotted v. the division day school expenditure per student.

image

Richmond is the enlarged points with yellow fill. The red-filled points are, from the left, the peer cities Hampton, Norfolk, and Newport News.

The fitted lines suggest that performance of the Not ED students increases slightly with expenditure (about 2% per $10,000) while the ED scores decrease (ca. 4% per $10,000). The R-squared values, however, tell us there is only a minuscule correlation between the pass rates and the expenditures.

We can get a clearer view of the data for Richmond and the peer cities by expanding the axis to hide the Very Big Spenders.

image

image

Need I say it? Richmond is spending well above average money and obtaining lousy results for both groups of students.

The math data tell the same story: More Money doesn’t correlate with more learning; Richmond spends a lot and gets awful pass rates.

image

image

As a pleasant contrast to that bad news, the (locally created and run) Comprehensive Instructional Plan has produced remarkable gains in the Southwest:

image

They’ve told us how they achieved this:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

While the Generous Assembly is in town perhaps they will consider taking the school improvement budget that is wasted at VDOE and giving it to the CIP, where they know how to get results.

More Graduates, Less Learning

The estimable Jim Bacon notices the increased graduation rates this year and wonders how much of the increase reflects the waivers issued by the Superintendent.  We have some of the underlying data.

On May 26, the Governor issued Executive Order 51 that provides, in part:

Authorization for the heads of executive branch agencies, on behalf of their regulatory boards as appropriate, and with the concurrence of their Cabinet Secretary, to waive any state requirement or regulation. . . .  All waivers issued by agencies shall be posted on their websites.

The “guidance” on the VDOE Web Site provides, also in part:

The following graduation requirements are waived based on authority granted to the Superintendent of Public Instruction per Executive Order Fifty-One (2020):

  • Students currently enrolled in a course for which they need a verified credit in order to graduate;

  • Students who have previously been awarded standard credit, but have not earned the associated verified credit;

  • Students who have not completed the student-selected test;

  • Students who are currently enrolled in or have previously completed a course leading to a CTE credential necessary for a Standard Diploma but have not yet earned the credential;

  • Students who have not completed a United States and Virginia history course*;

  • Students who have not completed a fine or performing arts or career and technical education course*;

  • Students in the second of sequential courses*;

  • Students who have not completed an economics and personal finance course*.

VDOE does not set out any direct measure of the effect of these waivers but the history of the cohort graduation rates provides some clues.

But first, what rates shall we measure?

The VDOE announcement brags that the 4-year, “on time,” 2020 cohort graduation rate rose to 92.3% from 91.5% last year, “despite the closure of schools due to COVID-19 in March.”

At the threshold, we might notice that the “on time” graduation rate is manipulated by the inclusion of Modified Standard diplomas that are issued to students who “are unlikely to meet the credit requirements for a Standard Diploma.” The more honest numbers are the “federal” rates (counting only the advanced and standard diplomas), 89.9% this year and 88.7% last year.

Note added later on 10/3: Oops! In fact, there were fewer than ten Modified Standard diplomas this year. The Board already put its thumb on the scale as to that diploma; “The Modified Standard Diploma will not be an option for students with disabilities who enter the ninth grade for the first time beginning in 2013-2014. Credit accommodations allow students with disabilities who previously would have pursued a Modified Standard Diploma to earn a Standard Diploma.” The current jiggering is set out here: “Special education students and limited English students who have plans in place that allow them more time to graduate will be counted as graduates or non-graduates when they earn a diploma or otherwise exit high school.” Translated, that means that Special Ed and LEP students who don’t graduate in four years get counted in the next (or later) year’s cohort. Talk about win-win for boosting the graduation numbers.

Here is the recent history of the federal diploma rates.

image

The state rate was flat between ‘18 and ‘19; extrapolating that flat trend suggests that the 1.2 point improvement in ‘20 was entirely artificial. The fitted line shows an increase of ca. 0.3 points per year as of 2019, suggesting that 3/4 of the 1.2 point increase this year came from the waived requirements. Slice that any way you like, most of the increase reflects administrative fiat, not academic accomplishment.

Thus, of the 98,481 students in the cohort, it looks like somewhere between about 885 and 1180 received bogus diplomas.
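That range is just the artificial 0.9 to 1.2 points applied to the cohort:

```python
# The artificial share of the graduation-rate increase (0.9 to 1.2
# points, per the trend analysis above) applied to the 2020 cohort.
cohort = 98_481

low_estimate = round(cohort * 0.009)    # 3/4 of the 1.2-point jump
high_estimate = round(cohort * 0.012)   # the whole 1.2-point jump

print(low_estimate, high_estimate)   # → 886 1182
```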

Closer to home, the Richmond data suggest that the waived requirements artificially reversed Richmond’s plunging graduation rates.

You and I get to wonder what kind of lying bureaucrat would brag that the artificial increase served “to ensure that students were not held back because being unable to take a Standards of Learning test or complete a required course” while ignoring the wholesale grant of degrees to students who would not have earned them.

The distribution of the 2020 division rates skews toward the low end with Richmond, the yellow datum, leading the skew.

image

Data for Covington, Lexington, and Highland Co. are absent. I’ve colored the 90% datum light blue to indicate the location of the state average.

All that said, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”).  The division average SOL pass rates for the two groups differ by about 20 points.

Similarly, the (inflated) state average graduation rates for Not ED and ED students differ this year by just 10%, so the division averages again reflect affluence as well as performance. To deal with that, let’s break out the data for those two groups.

image

image

Here we see that the state average increase this year came mostly from the ED population. This makes sense, given that the ED group graduates at a lower rate and would be helped more by a waiver of graduation requirements.

As to Richmond, this year’s jump came entirely from the ED average while the graduation rate of Not ED students actually dropped.

(If you can explain that Not ED drop in the face of waived requirements, or those other fluctuations in the Richmond rates, please relieve my confusion with an email to John{at}calaf{dot}org.)

When we separate out the ED and Not ED data, Richmond’s place among the divisions is more complicated.

image

image

Richmond is the gold bar in the first graph and is one of the four represented by the gold bar in the second. The light blue bars again are at the counts at the state averages, here 92 and 82.

As to Not ED students, Richmond’s 59% graduation rate (!) is third from the bottom. (Mercy! What can be going on in Covington, 12%, and Hopewell, 37%?)  As to ED students, Richmond’s 74% is too low, but not in the cellar.

These data do not begin to explain why Richmond’s Not ED graduation rate is so appallingly low, nor why it declined (yet again) this year.

Finally, here are the graduation rate changes from 2019 to 2020, sorted by decreasing ED differences.

image

It is far from obvious why any of those graduation rates should decrease in light of the waivers, much less how Goochland landed a 24.76% decrease in the ED rate.