Teachers: Licensed And Not

Note added on 2/7/21:

For some insight into this situation, see this later post.

Note added on 2/5/21:

I just received the following email (sent to me and the estimable Carol Wolf) from Michelle Hudacsko, the RPS Chief of Staff:

Hi Carol and John,

There are 4 teachers in RPS that are unlicensed.  I’m getting information on why.  The four teachers are at the following locations/content areas:

· Oak Grove – Kindergarten
· Miles Jones – 2nd grade
· Woodville – 4th grade
· Armstrong – Art

There are 32 teachers whose licensure application is with the State being processed.  There are 6 teachers hired in December/January and are getting paperwork in to us to submit to the State. 

We have about 2,100 teachers.  So if you count just the 4 unlicensed we are about 0% unlicensed (as I expected and as it should be!).  If you count the 38 who are just in a licensure processing queue (which is a bit delayed due to COVID), while I would argue they aren’t unlicensed from a qualifications perspective, just a paper perspective, we’d be at 2%.

To the extent these data from the RPS “Talent” office are complete and accurate, this bespeaks a big win for our Superintendent. More to the point, to the extent teacher qualifications are relevant to students’ learning, this indicates a big win for Richmond’s schoolchildren.
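For what it's worth, the percentages in the email check out. A quick sketch, using only the figures quoted above:

```python
# Figures as quoted in the Chief of Staff's email above.
total_teachers = 2100
unlicensed = 4
in_licensure_queue = 32 + 6      # applications with the State + new hires

strict_pct = 100 * unlicensed / total_teachers
broad_pct = 100 * (unlicensed + in_licensure_queue) / total_teachers

print(f"strictly unlicensed: {strict_pct:.2f}%")   # about 0.19% -- "about 0%"
print(f"including the queue: {broad_pct:.1f}%")    # 2.0%
```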

As well, if these data had been available on the RPS Web site, the post below would have been much more a celebration than a lamentation.

End of note.

There is room to argue whether the licensing system for public school teachers measures teaching quality. The National Comprehensive Center for Teacher Quality says “no”:

The No Child Left Behind (NCLB) Act mandates that all teachers should be highly qualified, and by the federal definition, most teachers now meet this requirement. However, it is increasingly clear that “highly qualified” – having the necessary qualifications and certifications – does not necessarily predict “highly effective” teaching – teaching that improves student learning.

In any case, the system is in effect and the 2018 Civil Rights Data Collection has the counts of full-time teachers who do, or do not, meet all state licensing/certification requirements.

To start, here are the percentages of unlicensed teachers in the Virginia public school divisions.


Richmond is the gold bar at 22.5%. The red bars, from the left, are the peer cities Newport News (barely visible at 0.1%), Norfolk, and Hampton. The blue bar is Galax which, at 3.7%, is closest to the state average of 3.6%.


Richmond is 6.3 times the division average.

Turning to the Richmond schools:


The yellow bars are the Richmond high schools (with Community invisible at 0% and counting Franklin as a high school); the pink are the middle schools; and the green, the elementary. The white bars are specialty schools (five of which are at 0%). Gold is the Richmond average; blue is the state division average.

Here is the list.


Turning to the relationship between the SOL reading pass rate and the percentage of unlicensed teachers:


The fitted curve suggests a seven-point decrease in the pass rate per 10-point increase in the percentage of unlicensed teachers, with just over a fifth of the variance in the pass rates explained by the unlicensed percentage.

The math data show a stronger relationship: an 8.6-point decrease per 10-point increase, with R-squared = 27%.
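For readers who want to reproduce this kind of fit, here is a least-squares sketch. The data points are invented for illustration (the real school-level figures are in the charts, not reproduced here), but the slope and R-squared are computed exactly as in these regressions:

```python
# Least-squares fit of pass rate vs. percent unlicensed teachers.
# The six data points below are invented purely for illustration.
pct_unlicensed = [0, 10, 20, 30, 40, 50]
pass_rate = [80, 72, 68, 55, 52, 45]

n = len(pct_unlicensed)
mean_x = sum(pct_unlicensed) / n
mean_y = sum(pass_rate) / n
sxx = sum((x - mean_x) ** 2 for x in pct_unlicensed)
sxy = sum((x - mean_x) * (y - mean_y)
          for x, y in zip(pct_unlicensed, pass_rate))

slope = sxy / sxx                 # pass-rate points per 1% unlicensed
intercept = mean_y - slope * mean_x

ss_res = sum((y - (slope * x + intercept)) ** 2
             for x, y in zip(pct_unlicensed, pass_rate))
ss_tot = sum((y - mean_y) ** 2 for y in pass_rate)
r_squared = 1 - ss_res / ss_tot   # fraction of variance explained

print(f"{10 * slope:.1f} points per 10-point rise; R^2 = {r_squared:.0%}")
```

These tidy invented points happen to give a slope near the seven-point figure above, but their R-squared is far higher than the roughly 22% in the actual data.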


Of course, these data do not imply causation so they cannot tell us whether larger numbers of unlicensed teachers tend to reduce the pass rates or whether schools with lower pass rates tend to hire larger numbers of unlicensed teachers.  But I’ll bet you a #2 lead pencil it’s the latter.

Indeed, it would be an interesting, and perhaps useful, experiment to fully staff up, say, Boushall (48% unlicensed, reading pass rate = 47%) or MLK (39%, 32%) with licensed teachers and see what happens.


Puzzle question: Can you explain why the Richmond percentage of unlicensed teachers is more strongly associated with the decrease of pass rates of students who are not economically disadvantaged (“Not ED”) than with the rates of those who are (“ED”)?



Your hypothesis needs to accommodate the division-level data that reverse that order, mostly by the weaker relationship with the Not ED pass rates.



Richmond is the enlarged, yellow points. The red are, from the left, Newport News, Norfolk, and Hampton.

2018 CRDC: First & Second Year Teachers

Continuing to dig into the 2018 Civil Rights Data Collection, here are the Virginia division percentages of first-year teachers.


Richmond is the gold bar at 18.3%. The blue bar is Buena Vista, at 5.8%, which also is the state average. Richmond is 3.2 times the state average.

The red bars are the peer cities, from the left Newport News (7%), Hampton (7.9%), and Norfolk (11%).

The second-year percentages of full time teachers paint quite a different picture.


The average is 5%. Four divisions straddle that value: Alleghany and Alexandria at 4.9%; Stafford and Montgomery at 5.1%. The blue bar is on the first of the 5.1% divisions.

The maximum datum there is Sussex, which is off scale at 40.7%; the county went from 9 first-year to 43 second-year (out of 103.24 total; I’d be interested to meet that 0.24 teacher).

Richmond drops to 3.1% (from the first-year value of 18.3%). The implication is heavy hiring of first-year teachers, most of whom quit or go to a nearby division after that year.

The peer cities from the left are Newport News (4.7%), Hampton (6.5%), and Norfolk (7.1%).

In the ordinary course, we might expect the second-year numbers to be a bit lower than the first-year to reflect teachers who move to another division or quit the profession. The actual picture is more complicated:


Note: These second-year data are from 2018, as are the first-year, so the difference is meaningful only to the extent that hiring was about the same in both 2017-8 and 2016-7.

Richmond here is second from the highest: 15.2% difference.

The state average is 0.8%, i.e., most of the first-year teachers hired the previous year stayed for a second year (or were replaced by new second-year hires). These data show some huge differences from that value, however.

At the negative end (i.e., divisions where there are more second- than first-year teachers), the leader is Sussex, off scale at –32%. Essex is next at a much less extravagant –6.3%. The hopeful explanation for these cases is large numbers of first year hires the previous year, most of whom stayed for a second year.

Indeed, many of these differences probably are the result of year-to-year variations in hiring patterns. For example, the Richmond data from the previous (2016) CRDC show large numbers for both first- and second-year teacher populations but a difference that, while large, was much closer to the state average than in 2018.


As a benchmark: If all teachers were hired in their first year and all retired after thirty, the first-year/second-year difference would be zero, with both populations 3.3%. The Richmond first-year numbers were 4.7 times that in 2016 and 5.5 times in 2018.
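The benchmark arithmetic, sketched with the figures above:

```python
# Benchmark from the post: if every teacher taught exactly 30 years,
# each experience cohort would be 1/30 of the workforce.
career_years = 30
steady_state_pct = 100 / career_years      # about 3.3% per cohort

richmond_first_year_2018 = 18.3            # 2018 CRDC figure, per the post
multiple = richmond_first_year_2018 / steady_state_pct

print(f"benchmark {steady_state_pct:.1f}%; Richmond is {multiple:.1f}x that")
# → benchmark 3.3%; Richmond is 5.5x that
```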

We could wish for more data, but we’re stuck with what the Feds have collected here. Those numbers nonetheless show a whole lot of first-year teachers in Richmond, and suggest a very large attrition rate of those first-year hires.

Teacher Truancy Costs, 2018 Version

RPS is hiring a lot of substitute teachers because, as measured by the 2018 CRDC, astounding numbers of the full time teachers are absent from work. The RPS budgets provide a measure of the costs.

In the 2017 adopted budget there was a line item in “General Fund Expenditures by Object Class” under “Other Compensation” for substitutes (at category 523): “N-SUBSTITUTE INSTR PROF.”


In the 2020 budget, that category had morphed to “N-INSTRUCTIONAL STAFF.”


Indeed, that category title changed every year from 2017 to 2020 (and in between the category number disappeared). No telling what’s going on there.

We can extract some useful information from this, however, because each budget contains data for the previous two years. The numbers reported under the new category titles in later budgets included those under the older titles in the earlier budgets. That is, despite the name changes, this category was, and remained, for substitute instructors.


These budgets also include the “actual” expenditures for substitutes for the first of the three budget years: Those show RPS underestimating the substitute teacher costs by 33 to 42 percent in each of the four years analyzed above.

And, to the point of teacher truancy, if RPS could cut the actual substitute costs by half, they could boost the full time teaching salaries by about 3% without increasing the overall budget.
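The dollar totals behind that 3% are not broken out above, so the sketch below uses hypothetical round numbers purely to show the logic:

```python
# Hypothetical round numbers -- the post does not give the exact totals;
# only the proportions matter for the argument.
substitute_cost = 6_000_000      # assumed annual "actual" substitute spending
teacher_payroll = 100_000_000    # assumed full-time teaching payroll

savings = substitute_cost / 2    # cut substitute costs in half
raise_pct = 100 * savings / teacher_payroll

print(f"possible across-the-board raise: {raise_pct:.1f}%")   # 3.0% here
```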


Your tax dollars at “work.”

2018 CRDC: Teacher Certification

The importance of teacher quality should be clear, but there's room to debate the effectiveness of teacher certification. Nonetheless, certification is one of the few measures available to those of us who are taxed to pay those teachers.

Here, then, from the 2018 CRDC, are the Virginia public school division percentages of full time teachers not certified.


There are 32 divisions with 0%, i.e., all the teachers are certified. Richmond is the Big Winner with 22.5%, the gold bar. The red are, from the left, the peer cities Newport News (nearly hidden, 0.1%), Norfolk, and Hampton. The blue is the bar for Galax which, at 3.7%, is the closest division to the division average, 3.6%.


Next, the Richmond schools:


Richmond’s awful middle schools, pink, dominate the high end here, except for Hill at 17% (4.8 times the division average). The high schools, yellow, are spread out some, ranging from Community (invisible at 0%) to Wythe at 35%. The elementary schools, green, run from 4% to 31%. The blue bar is the Richmond average, 22.5%, which is 6.3 times the division average.

Here is the list.


Except for Community High and six of the specialty schools, all our schools had remarkable shortages of certified teachers. Perhaps that was related to the salaries: The Richmond average for teaching positions in ‘18 was $51,528 against a state average of $57,260. Then again, these data don’t tell us how much of the salary difference reflected lower salaries paid to all those uncertified teachers.

Teacher Truancy, 2018 Version

We hear a lot about truancy and its malign effects. For example, here.

Courtesy of the Federales, we now have data on teacher absences that may be similarly damaging to students and that certainly are harder to justify.

The 2018 version of the biennial Civil Rights Data Collection, available since October, 2020, has data on both truancy and teacher absences. Let’s look at the teachers.

The feds count full-time teachers with more than ten absences during the school year. Eleven days is 6.1% of a 180-day school year.

Here are the 2018 counts for the Virginia school divisions.


Richmond is the gold bar at 65%. The red bars, from the left, are the peer jurisdictions Norfolk, Hampton, and Newport News. The blue bar is superimposed on one of the two divisions at the state average, 38%.


Richmond is a bit improved from the 2016 report. That year we were second from worst at 68%; this year, fifth at 65%.

The distribution of division rates looks like this:


The tall, red bar marks the Richmond datum at 65%, 1.9 standard deviations above the mean.

The Richmond school data show only Community, Ginter Park, Greene, Munford, and Alternative below the state average.


Little kids are notorious for being petri dishes for germs so we could expect the elementary schools to run high. In fact, except for that remarkable 9% number at Alternative, Richmond’s elementary schools reach both ends of the Richmond spectrum.


You might think that the very few schools where the teachers mostly come to work must have unusually healthy students and teachers or excellent principals. I’ll vote for the latter. Similarly, the very many schools where most of the teachers miss a lot of school would seem to have remarkably sick people or lousy leadership. There’s no reason to expect a plethora of sick people when some schools clearly don’t have many.

And, for sure, there is a leadership vacuum downtown and at the Board of “Education.”

In contrast, the middle schools run high while the high schools cluster toward the high-middle.



BTW: Maggie Walker is not a Richmond Public School (although the SOL pass rates there are reported at the public high schools in the students’ home school zones) and the numbers there are in a different universe from even the best of the Richmond high schools.


In any case, it is clear that the RPS “leaders” downtown need to be directed to work that is better suited to their talents.

We might expect the SOL performance to decline with increasing teacher truancy. Here are the reading data for the Richmond elementary schools:


Note: Virginia’s economically disadvantaged (here “ED”) students pass the SOLs at rates some 17 to 22 points lower than their more affluent peers (here “Not ED”), depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.

For the Richmond elementary schools, the ED reading rate drops by some 2.2% for a 10% increase in the teacher truancy but the R-squared value tells us that the teacher absences only explain about 12% of the SOL variance. As to the Not ED students, any effect is smaller (ca. 0.9% for a 10% increase in teacher absences) and very nearly uncorrelated.

The math data give similar results.


The division data paint a similar picture.



In contrast to the national data, these Virginia numbers do not show a strong relationship between SOL performance and teacher truancy. They do suggest that the absent teachers may not be much more effective at teaching than the substitutes. And, in any case, the data spotlight a massive, and expensive, management failure, both in Richmond and statewide.

Commonwealth of Lake Woebegon

If you wanted to boost the pass rates of the SOLs, you’d have three choices (aside from the one perfected at Carver): Improve teaching, make the tests easier, or relax the scoring.

On the 2019 revision of the math tests, the Board of “Education” chose the last option: They adopted cut scores in five of six cases that were less than the level necessary to retain the same level of rigor as the earlier tests. The results were predictable (and, of course, fed the false notion that student performance was improving).


The Board now has jiggered the English tests to the same end. The recommendation (teacher-driven; no pretense here of objectivity) was for every cut score to be lower (easier) than the level necessary to maintain the rigor of the tests.


The Board rejected the Superintendent’s slightly higher recommendations and adopted the committee’s numbers (video at 1:44:55; minutes are not yet available). This grade inflation will have the happy result of making the Board and the Superintendents and the English teachers and the students all look better, all without anybody having to break a sweat.

It will also make it impossible to measure the effect of the coronavirus on English performance.

This is not an anomaly, but rather part of an ongoing campaign to camouflage the fading performance of the Virginia public school system. However, unfortunately for the self-serving mendacity of the “education” establishment, the NAEP data for fourth grade reading


and eighth grade reading


give away the game.

Your tax dollars at “work.”

Richmond: The Excellent, the OK, and the Awful

While we wait to see how far the Board of Education will punt on the 2021 SOL testing, let’s look in some detail at the 2019 performance of Richmond’s schools (there having been no testing in 2020).

But first, some important background: Statewide, “economically disadvantaged” (here, “ED”) students underperform their more affluent peers (“Not ED”) by some 17 to 22 points, depending on the subject. Thus the school and division average pass rates depend both on student performance and the relative numbers of ED and Not ED students. This punishes the schools and divisions with large ED enrollments. We’ll avoid that issue here by looking at the rates for both groups.

To start, here are the ED pass rate distributions of Virginia and Richmond schools on the reading tests.


The blue bars are the counts of Virginia schools with the indicated (rounded) pass rates. The red bars, with open middles so the state data can show through, are Richmond; the Richmond scale is on the right-hand axis.

The state data here (and even more in the next chart) are skewed toward the low end. That renders the usual measures of a distribution, the mean and standard deviation, less useful. The measure reported here is the median.
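A toy example of why the median is the better summary for skewed data (the pass rates below are invented):

```python
import statistics

# Invented, left-skewed pass rates: a few very low schools drag the
# mean well below the bulk of the data, while the median stays put.
pass_rates = [88, 86, 85, 84, 83, 82, 80, 35, 20]

print(statistics.mean(pass_rates))     # about 71.4 -- pulled down by outliers
print(statistics.median(pass_rates))   # 83 -- representative of the bulk
```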

The two Richmond schools that aced the reading tests are Open and Community. The next entry, at 86%, is the other selective school, Franklin. The best of the mainstream schools is Marshall at 71%. The only other school to beat the state median was Cary at 68%. The Richmond schools in the cellar are, from the bottom, Alternative, Fairfield Court, MLK, Carver, Woodville, Chimborazo, and Mason.

The Not ED data portray another disaster.


Community and Open again aced the tests. They are followed by Munford, Hill, Franklin, Fox, Alternative and, barely above the state median, Patrick Henry. At the other end, the largest failures are, from the left, Greene, MLK, Boushall, Woodville, Elkhardt-Thompson, and Henderson. Fairfield Court would surely be in that latter list but for the suppression rule (<10 Not ED students).

Turning to the math tests, the Richmond pass rates are even less encouraging:


The schools that beat the state median are, from the top, Open, Community, Cary, Franklin, and Redd. At the other end, the basement dwellers are, from the bottom, Alternative, MLK, Fairfield Court, Carver, Boushall, Wythe, and Henderson.


As to Not ED, Open, Community, Munford, Fox, and Ginter Park beat the state median. Boushall, MLK, Wythe, Greene, Henderson, Elkhardt-Thompson, and Blackwell all scored below 50%.

These data emphasize the huge spreads between Richmond’s best and worst schools as well as the stunning under-performance of flocks of Richmond’s students.

For the record, here are the data, sorted by decreasing averages of the four data points. The “#DIV/0!” entries are for cases where the student count was zero or, more likely, suppressed by VDOE because it was <10.


Region 7 Addendum

We have seen that the divisions in SW Virginia (“Region 7” in the VDOE system) formed their own organization, the Comprehensive Instructional Program (“CIP”), that brought nice improvements in student performance.

While we wait to see whether the Board of “Education” will punt on the 2021 SOL testing, I’ve been looking over the 2019 data (there being no tests in 2020). The data for Region 7 paint a lovely picture.

You may recall that, since undertaking the CIP, Region 7 has seen major improvements in the pass rates of its economically disadvantaged (“ED”) students.


They accomplished this with a large and increasing ED population.


To put the 2019 results in a more nuanced context, let’s start with the school average reading pass rates for the Not ED students.


The blue bars are the counts of Virginia schools with the indicated 2019 pass rates of Not ED students (rounded to the nearest whole number). Thus, one school (Fairfax County Adult High) turned in a 13% pass rate(!) and 102 schools had 88% rates. The red-bounded bars are Region 7, left open to allow the state numbers to show through. The Region 7 scale is on the right vertical axis. The lowest school there turned in a 69% while 11 schools had 91% rates. (Excel reports “multiple items” when you tell it to report data for more than one division; please read that term as “Region 7.”)
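The binning behind these charts amounts to rounding each school's rate to the nearest whole number and counting schools at each rounded value. A sketch with invented rates:

```python
from collections import Counter

# Invented rates for illustration: round each school's pass rate to the
# nearest whole number, then count schools at each rounded value.
rates = [88.4, 87.6, 88.1, 69.2, 90.7, 91.3]
counts = Counter(round(r) for r in rates)

print(counts[88])   # 3 schools round to 88
print(counts[91])   # 2 schools round to 91
```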

The usual statistical measures, mean and standard deviation, are of limited use with skewed distributions, so I show the medians here. Of course, as a distribution approaches “normal,” the median approaches the mean. In any case, these are medians of the school averages, not division medians.

If you think the Not ED pass rates for Region 7 schools are a pleasant bit of news, take a look at the ED numbers:


Here, the Region 7 median is ten points higher than the state. Or you might prefer to ignore those stats and just look at the lovely picture.

The math data similarly testify to the success of the CIP.



It is instructive to compare the (manifestly sensible) techniques used by the CIP with the resolutely ineffective bureaucratic nonsense imposed by the “education” establishment.

The CIP:

  • Identify the good teachers,
  • Share their materials and techniques,
  • Measure what works,
  • Focus on core skills,
  • Set high expectations,
  • Bond with the students, and
  • Use the feckless VDOE only for what it actually can do well: crunch numbers.

The state, here the Petersburg Corrective Action Plan (for a division that the state has been attempting to repair, without success, since 2004):

I think it is past time to redirect the education bureaucracy to what it can do well, crunch numbers, and give the rest of its budget to the CIP.

More Money and Less Education in Richmond

On the subject of spending for schools, the VDOE Web Site has 2019 data for division income by source and fall enrollments of both economically disadvantaged (“ED”) students and their more affluent peers (“Not ED”).

The division income totals per student, plotted against the % of ED enrollment, look like this:


Richmond is the enlarged, yellow point. The red points are, from the left, the peer cities Newport News, Hampton, and Norfolk.

The fitted line suggests that per-student division income increases by $68 for a 10% increase in the ED percentage, but the R-squared value tells us the variables are uncorrelated.

Richmond is in 15th place in this find-the-money derby.


In terms of local funding, Richmond again is above average but down in 27th place.


The R-squared value of 7% suggests a slight correlation between local funding and % ED, but in the decreasing direction.

The other funding sources present a more interesting picture.


State funding shows a modest correlation, R-squared = 22%, while the federal data exhibit a more robust R-squared of 42%. Funding in both categories increases with increasing % ED.  Richmond is well below the fitted curve for state funding, with part of that gap closed by Uncle Sam.

The Sales Tax funding is essentially flat.

Looking again at just the Big Winners:


Here we see that larger than average local taxes support the effort in every case while the State accounts for some of the excess in Highland and Sussex and the federales do so in Surry, Sussex, and Richmond.

Of course, once that money comes in the divisions spend it.  We earlier saw the expenditure data juxtaposed with Richmond’s lousy performance:


If only Richmond would stop whinging about money and start educating its schoolchildren.

Learning and Poverty

An earlier post showed the absence of a correlation between division day school expenditures and either the average SOL pass rate of economically disadvantaged (“ED”) students (mostly those who qualify for the free lunch program) or those of their more affluent peers (“Not ED”).  As well, the post dwelled on the remarkable progress of ED students under the Comprehensive Instructional Program, a bottom-up initiative that started in VDOE’s Region 7 (southwest Virginia).

Regarding that post, the estimable Dick Hall-Sizemore asks whether the relative numbers of ED and Not ED students make a difference as to SOL performance.

The short answer is NO as to reading and slightly as to math, but with the (modest) effect in a place where you might not expect it.

For the longer answer, here are the data:


The average reading rate for ED students increases 1.8% going from 0% to 100% ED students in the tested group. However, the R-squared of 0.1% tells us the two variables are not correlated.

The Not ED performance, in contrast, decreases by 1.4% for a 10% increase in the ED population. The R-squared value indicates that the ED population explains 11% of the variance in those pass rates.  Thus, to a modest extent, it seems that increasing the proportion of less affluent students is related to lowered performance of the more affluent students.

If you have a testable explanation for this result, please do share it with me.

Richmond is the enlarged points with yellow fill. The red fills are the peer cities, Hampton and Newport News on the left (Hampton above) and Norfolk on the right.

Notice particularly the several divisions with both ED percentages and reading pass rates higher than Richmond’s.


Indeed, the sole exception among that dozen divisions is the Petersburg Not ED datum.

The math data tell a similar story, but with a steeper decline in the Not ED pass rates (3.3% for a 10% ED population increase) and a more substantial R-squared value, 24%.


ED population also fails to explain the better performance of the Region 7 schools that started the (remarkably successful) Comprehensive Instructional Program.  Region 7 has considerably more than the average population of ED students.


Note: There is an anomaly in these numbers. Washington County goes from 62.3% ED in 2018 to 99.1% in 2019. This looks like a data error.  I have asked VDOE about it; they have not replied (presumably too busy with programs that don’t work). The 2019 average with Washington Co. excluded would be 56.0%. For now, it would be wise to ignore the Washington Co. numbers and to discount that 61% some.

As to the details:



Note: The points at 99% are Washington County.

It is instructive to compare the manifestly effective CIP approach in Region 7 with the resolutely ineffective VBOE “Plan” for dealing with the ongoing failure of the Petersburg system.

STOP! Please go back and read all of each of those lists so you can fully appreciate the fecklessness of the Board’s approach.

A modest proposal: Let’s expand the CIP statewide and shrink the Board of Education’s function to what it can usefully do: statistics and webinars.